WorldWideScience

Sample records for interaction methodology applicable

  1. Watermark: An Application and Methodology for Interactive and Intelligent Decision Support for Groundwater Systems

    Pierce, S. A.; Wagner, K.; Schwartz, S.; Gentle, J. N., Jr.

    2016-12-01

    As critical water resources face the effects of historic drought, increased demand, and potential contamination, the need has never been greater to develop resources that effectively communicate conservation and protection across a broad audience and geographical area. The Watermark application and macro-analysis methodology merges topical analysis of a context-rich corpus of policy texts with multi-attributed solution sets from integrated models of water resources and other subsystems, such as mineral, food, energy, or environmental systems, to construct a scalable, robust, and reproducible approach for identifying links between policy and science knowledge bases. The Watermark application is an open-source, interactive workspace to support science-based visualization and decision making. Designed with generalization in mind, Watermark is a flexible platform that allows for data analysis and inclusion of large datasets, with an interactive front-end capable of connecting with other applications as well as advanced computing resources. In addition, the Watermark analysis methodology offers functionality that streamlines communication with non-technical users for policy, education, or engagement with groups around scientific topics of societal relevance. The technology stack for Watermark was selected with the goal of creating a robust and dynamic modular codebase that can be adjusted to fit many use cases and scale to support usage loads that range from simple data display to complex scientific simulation-based modelling and analytics. The methodology uses topical analysis and simulation-optimization to systematically analyze the policy and management realities of resource systems and explicitly connect the social and problem contexts with science-based and engineering knowledge from models. A case example demonstrates use in a complex groundwater resources management study, highlighting multi-criteria spatial decision making and uncertainty comparisons.
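    The record does not specify how the topical analysis is implemented; below is a hypothetical sketch of that kind of policy-corpus analysis using an off-the-shelf LDA topic model. The toy corpus, topic count, and choice of scikit-learn are illustrative assumptions, not the Watermark codebase.

```python
# Hypothetical topical analysis of a small policy corpus with LDA.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "groundwater conservation policy for drought management",
    "aquifer contamination monitoring and protection rules",
    "energy and water demand planning for municipalities",
]

vectorizer = CountVectorizer(stop_words="english")
X = vectorizer.fit_transform(docs)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(X)            # document-topic weights

terms = vectorizer.get_feature_names_out()
for k, comp in enumerate(lda.components_):   # top terms per topic
    top = comp.argsort()[-4:][::-1]
    print(f"topic {k}:", [terms[i] for i in top])
```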

  2. Development of a flow-structure interaction methodology applicable to a convertible car roof

    Knight, Jason J.

    2003-01-01

    The current research investigates the flow-induced deformation of a convertible vehicle roof using experimental and numerical methods. A computational methodology is developed that entails the coupling of a commercial Computational Fluid Dynamics (CFD) code with an in-house structural code. A model two-dimensional problem is first studied. The CFD code and a Source Panel Method (SPM) code are used to predict the pressure acting on the surface of a rigid roof of a scale model. Good agreement is found between the predicted pressure distribution and that obtained in a parallel wind-tunnel experimental programme. The validated computational modelling of the fluid flow is then used in a coupling strategy with a line-element structural model that incorporates the initial slackness of the flexible roof material. The computed flow-structure interaction yields stable solutions, the aerodynamically loaded flexible roof settling into static equilibrium. The effects of slackness and material properties on deformation and convergence are investigated using the coupled code. The three-dimensional problem is addressed by extending the two-dimensional structural solver to represent a surface by a matrix of line elements with constant tension along their length. This has been successfully coupled with the three-dimensional CFD flow-solution technique. Computed deformations show good agreement with the results of wind-tunnel experiments for the well-prescribed geometry. In both two- and three-dimensional computations, the flow-structure interaction is found to yield a static deformation to within a 1% difference in the displacement variable after three iterations between the fluid and structural codes. The same computational methodology is applied to a real-car application using a third-party structural solver. The methodology is shown to be robust even under conditions beyond those likely to be encountered, and the full methodology could be used as a design tool.
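    A minimal sketch of the iterative coupling idea follows; toy closed-form surrogates stand in for the CFD and structural codes, which are not public, and only the 1% convergence criterion is taken from the abstract.

```python
# Fixed-point fluid-structure coupling: alternate a toy "fluid" load and a
# toy "structural" response until the displacement changes by < 1%.

def fluid_pressure(displacement):
    # Toy surrogate for the CFD code: load relaxes as the roof deforms.
    return 100.0 / (1.0 + displacement)

def structural_displacement(pressure, stiffness=50.0):
    # Toy surrogate for the line-element structural model.
    return pressure / stiffness

d = 0.0
for it in range(1, 20):
    p = fluid_pressure(d)
    d_new = structural_displacement(p)
    rel = abs(d_new - d) / max(abs(d_new), 1e-12)
    print(f"iter {it}: displacement = {d_new:.4f} (rel. change {rel:.2%})")
    d = d_new
    if rel < 0.01:          # the 1% convergence criterion from the abstract
        break
```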

  3. AN INDUCTIVE, INTERACTIVE AND ADAPTIVE HYBRID PROBLEM-BASED LEARNING METHODOLOGY: APPLICATION TO STATISTICS

    ADA ZHENG

    2011-10-01

    We have developed an innovative hybrid problem-based learning (PBL) methodology. The methodology has the following distinctive features: (i) each complex question was decomposed into a set of coherent finer subquestions by following carefully designed criteria to maintain a delicate balance between guiding the students and inspiring them to think independently; this learning methodology enabled the students to solve the complex questions progressively in an inductive context. (ii) Facilitated by our web-based learning systems, the teacher was able to interact with the students intensively and could allocate more teaching time to providing tailor-made feedback for individual students; the students were actively engaged in the learning activities, stimulated by the intensive interaction. (iii) The answers submitted by the students could be automatically consolidated in the report of the Moodle system in real time; the teacher could adjust the teaching schedule and focus of the class to adapt to the learning progress of the students by analysing the automatically generated report and log files of the web-based learning system. As a result, the attendance rate of the students increased from about 50% to more than 90%, and the students' learning motivation was significantly enhanced.

  4. Methodology for interactive graphic simulator construction

    Milian S, Idalmis; Rodriguez M, Lazaro; Lopez V, Miguel A.

    1997-01-01

    PC-supported Interactive Graphic Simulators (IGS) have been successfully used in industrial training programs in many countries. This paper is intended to illustrate the general methodology applied by our research team for the construction of this kind of conceptual or small-scale simulator. The information and tools available to achieve this goal are also described. The applicability of the present methodology was confirmed with the construction of a set of IGS for nuclear power plant operator training programs in Cuba. One of them, relating to reactor kinetics, is shown and briefly described in this paper. (author). 11 refs., 3 figs

  5. Neural Networks Methodology and Applications

    Dreyfus, Gérard

    2005-01-01

    Neural networks represent a powerful data processing technique that has reached maturity and broad application. When clearly understood and appropriately used, they are a mandatory component in the toolbox of any engineer who wants to make the best use of the available data in order to build models, make predictions, mine data, recognize shapes or signals, etc. Ranging from theoretical foundations to real-life applications, this book is intended to provide engineers and researchers with clear methodologies for taking advantage of neural networks in industrial, financial or banking applications, many instances of which are presented in the book. For the benefit of readers wishing to gain deeper knowledge of the topics, the book features appendices that provide theoretical details for greater insight, and algorithmic details for efficient programming and implementation. The chapters have been written by experts and seamlessly edited to present a coherent and comprehensive, yet not redundant, practically-oriented...

  6. Analysis of damaged DNA/protein interactions: methodological optimizations and applications to DNA lesions induced by platinum anticancer drugs

    Bounaix Morand du Puch, Ch

    2010-10-01

    DNA lesions contribute to the alteration of DNA structure, thereby inhibiting essential cellular processes. Such alterations may be beneficial for chemotherapies, for example in the case of platinum anticancer agents. They generate bulky adducts that, if not repaired, ultimately cause apoptosis. A better understanding of the biological response to such molecules can be obtained through the study of proteins that directly interact with the lesions. These proteins constitute the DNA-lesion interactome. This thesis presents the development of tools aiming at extending the list of platinum adduct-associated proteins. Firstly, we designed a ligand-fishing system made of damaged plasmids immobilized onto magnetic beads. Three platinum drugs were selected for our study: cisplatin, oxaliplatin and satraplatin. Following exposure of the trap to nuclear extracts from HeLa cancer cells and identification of the retained proteins by proteomics, we obtained already known candidates (HMGB1, hUBF, the FACT complex) but also 29 new members of the platinated-DNA interactome. Among them, we noted the presence of PNUTS, TOX4 and WDR82, which associate to form the recently discovered PTW/PP complex. Their capture was then confirmed with a second model, the breast cancer cell line MDA-MB-231, and the biological consequences of such an interaction now need to be elucidated. Secondly, we adapted an SPRi biochip to the study of platinum-damaged DNA/protein interactions. The affinity of HMGB1 and the newly characterized TOX4 for adducts generated by our three platinum drugs could be validated thanks to the biochip. Finally, we used our tools, as well as analytical chemistry and biochemistry methods, to evaluate the role of DDB2 (a factor involved in the recognition of UV-induced lesions) in the repair of cisplatin adducts. Our experiments using MDA-MB-231 cells differentially expressing DDB2 showed that this protein is not responsible for the repair of platinum damage. Instead, it appears to act

  7. CIAU methodology and BEPU applications

    Petruzzi, A.; D'Auria, F.

    2009-01-01

    Best-estimate calculation results from complex thermal-hydraulic system codes (like RELAP5, CATHARE, ATHLET, TRACE, etc.) are affected by unavoidable approximations that are unpredictable without the use of computational tools that account for the various sources of uncertainty. Therefore the use of best-estimate codes within reactor technology, either for design or safety purposes, implies understanding and accepting the limitations and deficiencies of those codes. Uncertainties may have different origins, ranging from the approximation of the models, to the approximation of the numerical solution, to the lack of precision of the values adopted for boundary and initial conditions. The amount of uncertainty that affects a calculation may strongly depend upon the codes and the modeling techniques (i.e. the code users). A consistent and robust uncertainty methodology must be developed taking into consideration all the above aspects. The CIAU (Code with the capability of Internal Assessment of Uncertainty) and the UMAE (Uncertainty Methodology based on Accuracy Evaluation) methods have been developed by the University of Pisa (UNIPI) in the framework of long-lasting research activities started in the 1980s and involving several researchers. CIAU is extensively discussed in the available technical literature, Refs. [1, 2, 3, 4, 5, 6, 7], and tens of additional relevant papers that provide comprehensive details about the method can be found in the bibliography lists of the above references. Therefore, the present paper supplies only 'spot' information about CIAU and focuses mostly on applications to some cases of industrial interest. In particular, the application of CIAU to the OECD BEMUSE (Best Estimate Methods Uncertainty and Sensitivity Evaluation, [8, 9]) project is discussed, together with a critical comparison with other uncertainty methods (in relation to items like: sources of uncertainties, selection of the input parameters and quantification of

  8. PSA methodology development and application in Japan

    Kazuo Sato; Toshiaki Tobioka; Kiyoharu Abe

    1987-01-01

    The outlines of Japanese activities on the development and application of probabilistic safety assessment (PSA) methodologies are described. First, the activities on methodology development are described for system reliability analysis, operational data analysis, core melt accident analysis, environmental consequence analysis and seismic risk analysis. Then, methodology application examples from the regulatory side and the industry side are described. (author)

  9. Acoustic emission methodology and application

    Nazarchuk, Zinoviy; Serhiyenko, Oleh

    2017-01-01

    This monograph analyses in detail the physical aspects of elastic wave radiation during deformation or fracture of materials. It presents the methodological bases for the practical use of acoustic emission devices, and describes the results of theoretical and experimental research on the evaluation of the crack growth resistance of materials and the selection of useful AE signals. The efficiency of this methodology is shown through the diagnostics of industrial objects of various purposes. The authors obtained the results of the experimental research with the help of new methods and facilities.

  10. Developing educational hypermedia applications: a methodological approach

    Jose Miguel Nunes

    1996-01-01

    This paper proposes a hypermedia development methodology with the aim of integrating the work of educators, who are primarily responsible for the instructional design, with that of software experts, responsible for the software design and development. Hence, it is proposed that educators and programmers interact in an integrated and systematic manner, following a methodological approach.

  11. The GPT methodology. New fields of application

    Gandini, A.; Gomit, J.M.; Abramytchev, V.

    1996-01-01

    The GPT (Generalized Perturbation Theory) methodology is described, and a new application is discussed. The results obtained for a simple model (zero dimensions, six parameters of interest) show that the expressions obtained using the GPT methodology lead to results close to those obtained through direct calculations. The GPT methodology is useful for radioactive waste disposal problems. The potentiality of the method linked to the zero-dimension model can be extended to radionuclide migration problems with a spatial description. (K.A.)

  12. Current trends in Bayesian methodology with applications

    Upadhyay, Satyanshu K; Dey, Dipak K; Loganathan, Appaia

    2015-01-01

    Collecting Bayesian material scattered throughout the literature, Current Trends in Bayesian Methodology with Applications examines the latest methodological and applied aspects of Bayesian statistics. The book covers biostatistics, econometrics, reliability and risk analysis, spatial statistics, image analysis, shape analysis, Bayesian computation, clustering, uncertainty assessment, high-energy astrophysics, neural networking, fuzzy information, objective Bayesian methodologies, empirical Bayes methods, small area estimation, and many more topics. Each chapter is self-contained and focuses on

  13. Methodology and applications of eyetracking

    Arkadiusz Rajs

    2016-05-01

    Eyetracking offers great capabilities for controlling computer systems and for studying the usability of applications. In this paper we show the construction of an eyetracker and its range of applications. Key words: eyetracker, computer vision.

  14. Application of an allocation methodology

    Youngblood, R.

    1989-01-01

    This paper presents a method for allocating resources to elements of a system for the purpose of achieving prescribed levels of defense-in-depth at minimal cost. The method makes extensive use of logic modelling. An analysis of a simplified high-level waste repository is used as an illustrative application of the method. It is shown that it is possible to allocate quality control costs (or demonstrated performance) in an optimal way over elements of a conceptual design.

  15. Application of an allocation methodology

    Youngblood, R.; de Oliveira, L.F.S.

    1989-01-01

    This paper presents a method for allocating resources to elements of a system for the purpose of achieving prescribed levels of defense-in-depth at minimal cost. The method makes extensive use of logic modelling. An analysis of a simplified high-level waste repository is used as an illustrative application of the method. It is shown that it is possible to allocate quality control costs (or demonstrated performance) in an optimal way over elements of a conceptual design. 6 refs., 3 figs., 2 tabs

  16. Selection methodology for LWR safety programs and proposals. Volume 2. Methodology application

    Ritzman, R.L.; Husseiny, A.A.

    1980-08-01

    The results of work done to update and apply a methodology for selecting (prioritizing) LWR safety technology R and D programs are described. The methodology is based on multiattribute utility (MAU) theory. Application of the methodology to rank-order a group of specific R and D programs included development of a complete set of attribute utility functions, specification of individual attribute scaling constants, and refinement and use of an interactive computer program (MAUP) to process decision-maker inputs and generate overall (multiattribute) program utility values. The output results from several decision-makers are examined for consistency, and conclusions and recommendations regarding general use of the methodology are presented. 3 figures, 18 tables
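    For illustration, here is a minimal sketch of the additive MAU scoring that such a methodology rests on; the attributes, weights, and utility values below are invented, not the report's data.

```python
# Additive multiattribute utility: U = sum_i k_i * u_i(x_i),
# with scaling constants k_i and per-attribute utilities u_i in [0, 1].

weights = {"safety_benefit": 0.5, "cost": 0.3, "timeliness": 0.2}

programs = {
    "program A": {"safety_benefit": 0.9, "cost": 0.4, "timeliness": 0.7},
    "program B": {"safety_benefit": 0.6, "cost": 0.8, "timeliness": 0.5},
}

def overall_utility(utilities, weights):
    return sum(weights[a] * u for a, u in utilities.items())

ranked = sorted(programs, key=lambda p: overall_utility(programs[p], weights),
                reverse=True)
for p in ranked:    # rank-order the candidate R and D programs
    print(p, round(overall_utility(programs[p], weights), 3))
```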

  17. Proposed Methodology for Establishing Area of Applicability

    Broadhead, B.L.; Hopper, C.M.; Parks, C.V.

    1999-01-01

    This paper presents the application of sensitivity and uncertainty (S/U) analysis methodologies to the data validation tasks of a criticality safety computational study. The S/U methods presented are designed to provide a formal means of establishing the area (or range) of applicability for criticality safety data validation studies. The development of parameters that are analogous to the standard trending parameters forms the key to the technique. These parameters are the so-called D parameters, which represent the differences by energy group between S/U-generated sensitivity profiles, and the c parameters, which are the k-effective correlation coefficients; each gives information on the similarity between pairs of selected systems. The use of a Generalized Linear Least-Squares Methodology (GLLSM) tool is also described in this paper. These methods and guidelines are then applied to a sample validation for uranium systems with enrichments greater than 5 wt%.
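    A rough sketch of forming a single difference parameter from two group-wise sensitivity profiles follows; the specific norm and the numbers are assumptions for illustration, not the paper's definition.

```python
# "D" parameter as the summed group-wise difference between the
# sensitivity profiles of an application and a benchmark system.
import numpy as np

s_application = np.array([0.02, 0.15, 0.30, 0.10])  # sensitivity by group
s_benchmark   = np.array([0.03, 0.12, 0.28, 0.14])

D = np.sum(np.abs(s_application - s_benchmark))
print(f"D parameter (profile difference): {D:.3f}")  # small D => similar
```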

  18. Evolving Intelligent Systems Methodology and Applications

    Angelov, Plamen; Kasabov, Nik

    2010-01-01

    From theory to techniques, the first all-in-one resource for EIS. There is a clear demand in advanced process industries, defense, and Internet and communication (VoIP) applications for intelligent yet adaptive/evolving systems. Evolving Intelligent Systems is the first self-contained volume that covers this newly established concept in its entirety, from a systematic methodology to case studies to industrial applications. Featuring chapters written by leading world experts, it addresses the progress, trends, and major achievements in this emerging research field, with a strong emphasis on ...

  19. Simulation and Modeling Methodologies, Technologies and Applications

    Filipe, Joaquim; Kacprzyk, Janusz; Pina, Nuno

    2014-01-01

    This book includes extended and revised versions of a set of selected papers from the 2012 International Conference on Simulation and Modeling Methodologies, Technologies and Applications (SIMULTECH 2012), which was sponsored by the Institute for Systems and Technologies of Information, Control and Communication (INSTICC) and held in Rome, Italy. SIMULTECH 2012 was technically co-sponsored by the Society for Modeling & Simulation International (SCS), GDR I3, Lionphant Simulation, Simulation Team and IFIP, and held in cooperation with the AIS Special Interest Group on Modeling and Simulation (AIS SIGMAS) and the Movimento Italiano Modellazione e Simulazione (MIMOS).

  20. Mining software specifications methodologies and applications

    Lo, David

    2011-01-01

    An emerging topic in software engineering and data mining, specification mining tackles software maintenance and reliability issues that cost economies billions of dollars each year. The first unified reference on the subject, Mining Software Specifications: Methodologies and Applications describes recent approaches for mining specifications of software systems. Experts in the field illustrate how to apply state-of-the-art data mining and machine learning techniques to address software engineering concerns. In the first set of chapters, the book introduces a number of studies on mining finite

  1. MicroComputed Tomography: Methodology and Applications

    Stock, Stuart R.

    2009-01-01

    Due to the availability of commercial laboratory systems and the emergence of user facilities at synchrotron radiation sources, studies of microcomputed tomography (microCT) have increased exponentially. MicroComputed Tomography provides a complete introduction to the technology, describing how to use it effectively and understand its results. The first part of the book focuses on methodology, covering experimental methods, data analysis, and visualization approaches. The second part addresses various microCT applications, including porous solids, microstructural evolution, soft tissue studies, multimode studies, and indirect analyses. The author presents a sufficient amount of fundamental material so that those new to the field can develop an understanding of how to design their own microCT studies. One of the first full-length references dedicated to microCT, this book provides an accessible introduction to the field, supplemented with application examples and color images.

  2. PET/MRI. Methodology and clinical applications

    Carrio, Ignasi [Autonomous Univ. of Barcelona, Hospital Sant Pau (Spain). Dept. Medicina Nuclear; Ros, Pablo (ed.) [Univ. Hospitals Case, Medical Center, Cleveland, OH (United States). Dept. of Radiology

    2014-04-01

    Provides detailed information on the methodology and equipment of PET/MRI. Covers a wide range of clinical applications in oncology, cardiology, and neurology. Written by an international group of experts in MRI and PET. PET/MRI is an exciting novel diagnostic imaging modality that combines the precise anatomic and physiologic information provided by magnetic resonance imaging (MRI) with the molecular data obtained with positron emission tomography (PET). PET/MRI offers the promise of a simplified workflow, reduced radiation, whole-body imaging with superior soft-tissue contrast, and time-of-flight physiologic information. It has been described as the pathway to molecular imaging in medicine. In compiling this textbook, the editors have brought together a truly international group of experts in MRI and PET. The book is divided into two parts. The first part covers methodology and equipment and comprises chapters on basic molecular medicine, development of specific contrast agents, MR attenuation and validation, quantitative MRI and PET motion correction, and technical implications for both MRI and PET. The second part of the book focuses on clinical applications in oncology, cardiology, and neurology. Imaging of major neoplasms, including lymphomas and tumors of the breast, prostate, and head and neck, is covered in individual chapters. Further chapters address functional and metabolic cardiovascular examinations and major central nervous system applications such as brain tumors and dementias. Risks, safety aspects, and healthcare costs and impacts are also discussed. This book will be of interest to all radiologists and nuclear medicine physicians who wish to learn more about the latest developments in this important emerging imaging modality and its applications.

  3. Cleansing methodology of sites and its applications

    De Moura, Patrick; Dubot, Didier; Faure, Vincent; Attiogbe, Julien; Jeannee, Nicolas; Desnoyers, Yvon

    2009-01-01

    The Commissariat a l'Energie Atomique (CEA, the French Atomic Energy Commission) has, over the last 10 years, set up an innovative methodology aimed at characterizing radiological contamination. Its application relies on various tools, such as expertise vehicles with impressive detection performance (VEgAS) and a recently developed software platform called Kartotrak. A Geographic Information System tailored to radiological needs constitutes the heart of the platform; it is surrounded by several modules aimed at sampling optimization (Stratege), data analysis and geostatistical modeling (Krigeo), real-time monitoring (Kartotrak-RT) and validation of cleaning efficiency (Pescar). This paper presents the different tools, which provide exhaustive instruments for the follow-up of decontamination projects, from doubt removal to verification of the decontamination process. (authors)

  4. Methodological study of the diffusion of interacting cations through clays. Application: experimental tests and simulation of coupled chemistry-diffusion transport of alkaline ions through a synthetic bentonite

    Melkior, Th.

    2000-01-01

    The subject of this work relates to the project of underground disposal of radioactive wastes in deep geological formations. It concerns the study of the migration of radionuclides through clays. In these materials, the main transport mechanism under natural conditions is assumed to be diffusion. Therefore, diffusion experiments are conducted. With interacting solutes that present a strong affinity for the material, the duration of these tests would be too long for the range of concentrations of interest. An alternative is to determine, on one hand, the geochemical retention properties using batch tests on crushed rock samples and, on the other hand, to deduce the transport parameters from diffusion tests realised with a non-interacting tracer, tritiated water. These data are then used to simulate the migration of the reactive elements with a numerical code that can deal with coupled chemistry-diffusion equations. The validity of this approach is tested by comparing the numerical simulations with the results of diffusion experiments of cations through a clay. The subject is investigated for the diffusion of cesium, lithium and sodium through a compacted sodium bentonite. The diffusion tests are realised with the through-diffusion method. The comparison between the experimental results and the simulations shows that the latter tend to underestimate the propagation of the considered species. The differences could be attributed to surface diffusion and to a decrease in the accessibility of the fixation sites of the bentonite between the conditions of clay suspensions in batch tests and the situation of compacted samples. The influence of the experimental apparatus used during the diffusion tests on the measurement results has also been tested; it showed that the apparatus has to be taken into consideration when the experimental data are interpreted. A specific model has therefore been developed with the numerical code CASTEM 2000. (author)
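    As a hedged illustration of the through-diffusion setup described above, here is a minimal 1-D explicit finite-difference sketch for a non-interacting tracer; the geometry and the effective diffusion coefficient are invented values, not the thesis data.

```python
# 1-D through-diffusion: constant upstream concentration, zero downstream
# concentration, steady flux monitored at the outlet face.
import numpy as np

L, n = 0.01, 50                 # sample thickness 1 cm, 50 nodes
De = 1e-10                      # effective diffusion coefficient (m^2/s)
dx = L / (n - 1)
dt = 0.4 * dx**2 / De           # stable explicit time step
c = np.zeros(n)
c[0] = 1.0                      # upstream reservoir (normalized)

for step in range(100_000):     # march the explicit scheme to steady state
    c[1:-1] += De * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2])
    c[0], c[-1] = 1.0, 0.0      # fixed boundary reservoirs

flux_out = De * (c[-2] - c[-1]) / dx
print(f"steady outlet flux ~ {flux_out:.3e} (normalized units)")
```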

  5. A Global Sensitivity Analysis Methodology for Multi-physics Applications

    Tong, C H; Graziani, F R

    2007-02-02

    Experiments are conducted to draw inferences about an entire ensemble based on a selected number of observations. This applies to both physical experiments and computer experiments, the latter of which are performed by running the simulation models at different input configurations and analyzing the output responses. Computer experiments are instrumental in enabling model analyses such as uncertainty quantification and sensitivity analysis. This report focuses on a global sensitivity analysis methodology that relies on a divide-and-conquer strategy and uses intelligent computer experiments. The objective is to assess qualitatively and/or quantitatively how the variabilities of simulation output responses can be accounted for by input variabilities. We address global sensitivity analysis in three aspects: methodology, sampling/analysis strategies, and an implementation framework. The methodology consists of three major steps: (1) construct credible input ranges; (2) perform a parameter screening study; and (3) perform a quantitative sensitivity analysis on a reduced set of parameters. Once identified, research effort should be directed to the most sensitive parameters to reduce their uncertainty bounds. This process is repeated with tightened uncertainty bounds for the sensitive parameters until the output uncertainties become acceptable. To accommodate the needs of multi-physics applications, this methodology should be recursively applied to individual physics modules. The methodology is also distinguished by an efficient technique for computing parameter interactions. Details for each step will be given using simple examples. Numerical results on large scale multi-physics applications will be available in another report. Computational techniques targeted for this methodology have been implemented in a software package called PSUADE.
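    A toy version of the screening step (step 2) follows, in the spirit of elementary-effects methods; the stand-in model and input ranges are invented, and PSUADE itself is not used here.

```python
# One-at-a-time screening: sweep each input over its credible range at
# random base points and rank inputs by the mean induced output change.
import numpy as np

rng = np.random.default_rng(0)

def model(x):
    # Stand-in simulation: strong in x0, weak in x2, interaction x0*x1.
    return 4 * x[0] + 2 * x[1] + 0.1 * x[2] + 3 * x[0] * x[1]

ranges = [(0.0, 1.0), (0.0, 1.0), (0.0, 1.0)]
effects = []
for i, (lo, hi) in enumerate(ranges):
    deltas = []
    for _ in range(20):                      # average over random base points
        x = rng.uniform(0, 1, 3)
        x_lo, x_hi = x.copy(), x.copy()
        x_lo[i], x_hi[i] = lo, hi
        deltas.append(abs(model(x_hi) - model(x_lo)))
    effects.append(np.mean(deltas))

for i, e in enumerate(effects):
    print(f"x{i}: mean elementary effect {e:.2f}")   # large => sensitive
```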

  6. Single-molecule pull-down (SiMPull) for new-age biochemistry: methodology and biochemical applications of single-molecule pull-down (SiMPull) for probing biomolecular interactions in crude cell extracts.

    Aggarwal, Vasudha; Ha, Taekjip

    2014-11-01

    Macromolecular interactions play a central role in many biological processes. Protein-protein interactions have mostly been studied by co-immunoprecipitation, which cannot provide quantitative information on all possible molecular connections present in the complex. We review a new approach that allows cellular proteins and biomolecular complexes to be studied in real time at the single-molecule level. This technique is called single-molecule pull-down (SiMPull), because it integrates principles of conventional immunoprecipitation with powerful single-molecule fluorescence microscopy. SiMPull is used to count how many copies of each protein are present in the physiological complexes found in cytosol and membranes. Concurrently, it serves as a single-molecule biochemical tool to perform functional studies on the pulled-down proteins. In this review, we focus on the detailed methodology of SiMPull, its salient features and a wide range of biological applications in comparison with other biosensing tools.

  7. Application of agile methodologies in software development

    Jovanović Aca D.

    2016-01-01

    The paper presents the potential for the development of software using agile methodologies. Special consideration is devoted to the potential and advantages of using the Scrum methodology in the development of software, and to the relationship between the implementation of agile methodologies and software development projects.

  8. Application of Response Surface Methodology for Optimizing Oil Extraction Yield from Tropical Almond Seed

    Optimization of oil extraction yield from tropical almond seed by the use of response surface methodology (RSM).

  9. Application opportunities of agile methodology in service company management

    Barauskienė, Diana

    2017-01-01

    Application Opportunities of Agile Methodology in Service Company Management. The main purpose of this master's thesis is to identify which methods (or their modified versions) of the Agile methodology can be applied in service company management. The thesis consists of these parts: a scientific literature analysis; the author's research methodology (research methods, the author's research model, and the essential elements used in the research on the application of the Agile methodology); and the research itself (prelimina...

  10. Causal Meta-Analysis : Methodology and Applications

    Bax, L.J.

    2009-01-01

    Meta-analysis is a statistical method to summarize research data from multiple studies in a quantitative manner. This dissertation addresses a number of methodological topics in causal meta-analysis and reports the development and validation of meta-analysis software. In the first (methodological)

  11. Design Methodologies: Industrial and Educational Applications

    Tomiyama, T.; Gul, P.; Jin, Y.; Lutters, Diederick; Kind, Ch.; Kimura, F.

    2009-01-01

    The field of Design Theory and Methodology has a rich collection of research results that have been taught at educational institutions as well as applied in design practice. First, this keynote paper describes some methods to classify them. It then illustrates individual theories and methodologies.

  12. Issues in the global applications of methodology in forensic anthropology.

    Ubelaker, Douglas H

    2008-05-01

    The project and research reported in this collection of articles follow a long-term historical pattern in forensic anthropology in which new casework and applications reveal methodological issues that need to be addressed. Forensic anthropological analysis in the area of the former Yugoslavia led to questions regarding the applicability of methods developed from samples in other regions. The subsequently organized project reveals that such differences exist, and new methodology and data are presented to facilitate applications in the Balkan area. The effort illustrates how case applications and court testimony can stimulate research advances. The articles also serve as a model for the improvement of methodology available for global applications.

  13. Theoretical and Methodological Perspectives on Designing Video Studies of Interaction

    Anna-Lena Rostvall

    2005-12-01

    In this article the authors discuss the theoretical basis for the methodological decisions made during the course of a Swedish research project on interaction and learning. The purpose is to discuss how different theories are applied at separate levels of the study. The study is structured on three levels, with separate sets of research questions and theoretical concepts. The levels reflect a close-up description, a systematic analysis, and an interpretation of how teachers and students act and interact. The data consist of 12 hours of video-recorded and transcribed music lessons from high school and college. Through a multidisciplinary theoretical framework, the general understanding of teaching and learning in terms of interaction can be widened. The authors also present a software tool developed to facilitate the processes of transcription and analysis of the video data.

  14. Residual radioactive material guidelines: Methodology and applications

    Yu, C.; Yuan, Y.C.; Zielen, A.J.; Wallo, A. III.

    1989-01-01

    A methodology to calculate residual radioactive material guidelines was developed for the US Department of Energy (DOE). This methodology is coded in a menu-driven computer program, RESRAD, which can be run on IBM or IBM-compatible microcomputers. Seven pathways of exposure are considered: external radiation, inhalation, and ingestion of plant foods, meat, milk, aquatic foods, and water. The RESRAD code has been applied to several DOE sites to calculate soil cleanup guidelines. This experience has shown that the computer code is easy to use and very user-friendly. 3 refs., 8 figs

  15. Diatomic interaction potential theory applications

    Goodisman, Jerry

    2013-01-01

    Diatomic Interaction Potential Theory, Volume 2: Applications discusses the variety of applicable theoretical material and approaches in the calculations for diatomic systems in their ground states. The volume covers the descriptions and illustrations of modern calculations. Chapter I discusses the calculation of the interaction potential for large and small values of the internuclear distance R (separated and united atom limits). Chapter II covers the methods used for intermediate values of R, which in principle means any values of R. The Hartree-Fock and configuration interaction schemes ...

  16. Application of PRINCE2 Project Management Methodology

    Vaníčková Radka

    2017-09-01

    The methodology describes the principles of setting up a project under PRINCE2 project management. The main aim of the paper is to implement the PRINCE2 methodology for use in an enterprise in the service industry. A partial aim is to choose a supplier for the project from among new travel guides. The result of the project activity is a sight-seeing tour/service that is more attractive for customers in the tourism industry, and a possible source of new job opportunities. The added value of the article is the description of applying the principles, processes and topics of PRINCE2 project management so that they might be used in the field.

  17. Applicability of the Directed Graph Methodology

    Huszti, Jozsef [Institute of Isotope of the Hungarian Academy of Sciences, Budapest (Hungary); Nemeth, Andras [ESRI Hungary, Budapest (Hungary); Vincze, Arpad [Hungarian Atomic Energy Authority, Budapest (Hungary)

    2012-06-15

    Possible methods to construct, visualize and analyse the 'map' of the State's nuclear infrastructure based on different directed graph approaches are proposed. The transportation and the flow network models are described in detail. The use of the possible evaluation methodologies and the use of available software tools to construct and maintain the nuclear 'map' using pre-defined standard building blocks (nuclear facilities) are introduced and discussed.

  18. 'Training plan optimized design' methodology application to IBERDROLA - Power generation

    Gil, S.; Mendizabal, J.L.

    1996-01-01

    The trend in both Europe and the United States towards the understanding that no training plan may be considered suitable unless backed by the results of applying the S.A.T. (Systematic Approach to Training) methodology led TECNATOM, S.A. to apply the methodology through development of an application specific to the conditions of the Spanish working system. The requirement that the design of the training be coherent with the realities of the working environment is met by systematic application of the SAT methodology as part of the work analysis and job-based task analysis processes, which serve as a basis for the design of the training plans.

  19. Applications of response surface methodology approach to ...

    Jane

    2011-08-22

    ... germ tube growth of Puccinia coronata f.sp. avenae urediniospores ... and can provide incorrect conclusions in case of strong interactions among the ... resistance, grain yield, test weight, and seed weight in oat.

  20. Application of a methodology for retouching

    Ana Bailão

    2010-11-01

    Between November 2006 and January 2010, an investigation into retouching methodologies was carried out. The aim of this paper is to describe, in four steps, the retouching methodology for a contemporary painting. The four steps are: chromatic and formal study, considering the use of Gestalt theory and the phenomena of contrast and assimilation; selection of the technique; choice of the materials; and retouching practice.

  1. Analytical group decision making in natural resources: Methodology and application

    Schmoldt, D.L.; Peterson, D.L.

    2000-01-01

    Group decision making is becoming increasingly important in natural resource management and associated scientific applications, because multiple values are treated coincidentally in time and space, multiple resource specialists are needed, and multiple stakeholders must be included in the decision process. Decades of social science research on decision making in groups have provided insights into the impediments to effective group processes and into techniques that can be applied in a group context. Nevertheless, little integration and few applications of these results have occurred in resource management decision processes, where formal groups are integral, either directly or indirectly. A group decision-making methodology is introduced as an effective approach for temporary, formal groups (e.g., workshops). It combines the following three components: (1) brainstorming to generate ideas; (2) the analytic hierarchy process to produce judgments, manage conflict, enable consensus, and plan for implementation; and (3) a discussion template (straw document). Resulting numerical assessments of alternative decision priorities can be analyzed statistically to indicate where group member agreement occurs and where priority values are significantly different. An application of this group process to fire research program development in a workshop setting indicates that the process helps focus group deliberations; mitigates groupthink, nondecision, and social loafing pitfalls; encourages individual interaction; identifies irrational judgments; and provides a large amount of useful quantitative information about group preferences. This approach can help facilitate scientific assessments and other decision-making processes in resource management.
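    For concreteness, here is a small sketch of the analytic hierarchy process step of deriving priority weights from a pairwise comparison matrix; the judgments in the matrix are invented, not workshop data.

```python
# AHP: priorities from the principal eigenvector of a pairwise
# comparison matrix, plus Saaty's consistency index.
import numpy as np

# A[i, j] = how strongly criterion i is preferred over criterion j (1-9 scale)
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                  # principal eigenpair
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                 # normalized priorities

ci = (eigvals.real[k] - len(A)) / (len(A) - 1)   # consistency index
print("priorities:", np.round(w, 3), " CI:", round(ci, 3))
```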

  2. Application of human reliability analysis methodology of second generation

    Ruiz S, T. de J.; Nelson E, P. F.

    2009-10-01

    Human reliability analysis (HRA) is a very important part of probabilistic safety analysis. The main contribution of HRA in nuclear power plants is the identification and characterization of the issues that come together for an error to occur in the human tasks performed under normal operating conditions and in those performed after an abnormal event. Additionally, in the analysis of various accidents in history, the human component has been found to be a contributing factor in the cause. Because of the need to understand the forms and probability of human error, the 1960s saw the beginning of the collection of generic data, which resulted in the development of the first generation of HRA methodologies. Subsequently, methods were developed to include in their models additional performance shaping factors and the interactions between them. By the mid-1990s came what are considered the second-generation methodologies. Among these is the methodology A Technique for Human Event Analysis (ATHEANA). The application of this method to a generic human failure event is interesting because it includes in its modeling errors of commission, the quantification of additional deviations from the nominal scenario considered in the accident sequence of the probabilistic safety analysis and, for this event, the evaluation of dependency between actions. That is, the generic human failure event first required an independent evaluation of the two related human failure events, so the gathering of the new human error probabilities involved quantification of the nominal scenario and of the cases of significant deviations, considered for their potential impact on the analyzed human failure events. As in probabilistic safety analysis, from the analysis of the sequences the more specific factors with the highest contribution to the human error probabilities were extracted. (Author)

  3. Novel Biomaterials Methodology, Development and Application

    Traditionally, the use of carbohydrate-based wound dressings, including cotton, xerogels, charcoal cloth, alginates, chitosan and hydrogels, has afforded properties such as absorbency, ease of application and removal, bacterial protection, fluid balance, occlusion, and elasticity. Recent efforts in ...

  4. Ecodesign of cosmetic formulae: methodology and application.

    L'Haridon, J; Martz, P; Chenéble, J-C; Campion, J-F; Colombe, L

    2018-04-01

    This article describes an easy-to-use ecodesign methodology developed and applied since 2014 by the L'Oréal Group to improve the sustainable performance of its new products without any compromise on their cosmetic efficacy. Cosmetic products, after being used, are often discharged into the sewers and the aquatic compartment. This discharge is considered dispersive and continuous. Consistent progress in reducing the environmental impact of cosmetic products can be achieved by focusing upon three strategic indicators: biodegradability, the grey water footprint adapted for ecodesign (GWFE) and a global indicator complementary to these two endpoints. Biodegradability represents the key process in the removal of organic ingredients from the environment. GWFE is defined herein as the theoretical volume of natural freshwater required to dilute a cosmetic formula, after being used by the consumer, down to a concentration without any foreseeable toxic effects upon aquatic species. Finally, the complementary indicator highlights a possible alert on formula ingredients due to an unfavourable environmental profile based on hazard properties: for example, a Global Harmonization System/Classification, Labelling and Packaging (GHS/CLP) H410 classification or a potential very persistent and very bioaccumulative (vPvB) classification. The ecodesign of a new cosmetic product can be a challenge, as the cosmetic properties and quality of the new product should at least match the benchmark reference. As shown in the case studies described herein, new methodologies have been developed to maximize the biodegradability of cosmetic formulae, to minimize their GWFE and to limit the use of ingredients that present an unfavourable environmental profile, while reaching the highest standards in terms of cosmetic efficacy. By applying these methodologies, highly biodegradable products (≥ 95% based on ingredient composition) have been developed and marketed, with a low GWFE. This new
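    A toy reading of the GWFE definition above: a dilution volume per ingredient, as discharged mass over a no-effect concentration (PNEC). The masses, PNEC values, and the summation rule are assumptions; the published indicator may aggregate differently.

```python
# Per-use grey-water-footprint-style dilution volume, summed over the
# formula's discharged ingredients (all figures invented).

ingredients = {
    # name: (mass discharged per use in mg, PNEC in mg/L)
    "surfactant A": (500.0, 0.5),
    "polymer B": (200.0, 2.0),
    "preservative C": (5.0, 0.01),
}

gwfe_litres = sum(mass / pnec for mass, pnec in ingredients.values())
print(f"GWFE ~ {gwfe_litres:.0f} L of dilution water per use")
```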

  5. Emission computed tomography: methodology and applications

    Reivich, M.; Alavi, A.; Greenberg, J.; Fowler, J.; Christman, D.; Rosenquist, A.; Rintelmann, W.; Hand, P.; MacGregor, R.; Wolf, A.

    1980-01-01

    A technique for the determination of local cerebral glucose metabolism using positron emission computed tomography is described as an example of the development and use of this methodology for the study of these parameters in man. The method for the determination of local cerebral glucose metabolism utilizes 18F-2-fluoro-2-deoxyglucose ([18F]-FDG). In this method, [18F]-FDG is used as a tracer for the exchange of glucose between plasma and brain and its phosphorylation by hexokinase in the tissue. The labelled product of metabolism, [18F]-FDG phosphate, is essentially trapped in the tissue over the time course of the measurement. The studies demonstrate the potential usefulness of emission computed tomography for the measurement of various biochemical and physiological parameters in man. (Auth.)

  6. Robust PV Degradation Methodology and Application

    Jordan, Dirk [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Deline, Christopher A [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Kurtz, Sarah [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Kimball, Greg [SunPower; Anderson, Mike [SunPower

    2017-11-15

    The degradation rate plays an important role in predicting and assessing the long-term energy generation of PV systems. Many methods have been proposed for extracting the degradation rate from operational data of PV systems, but most of the published approaches are susceptible to bias due to inverter clipping, module soiling, temporary outages, seasonality, and sensor degradation. In this manuscript, we propose a methodology for determining PV degradation leveraging available modeled clear-sky irradiance data rather than site sensor data, and a robust year-over-year (YOY) rate calculation. We show the method to provide reliable degradation rate estimates even in the case of sensor drift, data shifts, and soiling. Compared with alternate methods, we demonstrate that the proposed method delivers the lowest uncertainty in degradation rate estimates for a fleet of 486 PV systems.
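    A compact sketch of the year-over-year calculation on synthetic data follows; the real implementation (cf. the authors' RdTools-style workflow) adds clear-sky modeling and filtering that are omitted here.

```python
# Robust YOY degradation: compare each daily performance index with the
# value one year earlier and take the median of the annual changes.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
idx = pd.date_range("2015-01-01", periods=3 * 365, freq="D")
true_rd = -0.005                                        # -0.5 %/yr
perf = (1 + true_rd) ** (np.arange(len(idx)) / 365.0)
perf *= 1 + 0.02 * rng.standard_normal(len(idx))        # measurement noise
series = pd.Series(perf, index=idx)

shifted = series.shift(365)                 # value one year earlier
yoy = (series / shifted - 1).dropna()       # fractional change per year
rd = np.median(yoy)                         # median resists outliers/shifts
print(f"estimated degradation rate: {rd:.2%}/yr")
```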

  7. The micro-habitat methodology. Application protocols

    Sabaton, C; Valentin, S; Souchon, Y

    1995-06-01

    A strong need has been felt for guidelines to help various entities in applying the micro-habitat methodology, particularly in impact studies on hydroelectric installations. CEMAGREF and Electricite de France have separately developed two protocols with five major steps: reconnaissance of the river; selection of representative units to be studied in greater depth; morpho-dynamic measurements at one or more rates of discharge and hydraulic modeling; coupling of hydraulic and biological models and calculation of habitat-quality scores for fish; and analysis of results. The two approaches give very comparable results and are differentiated essentially by the hydraulic model used. CEMAGREF uses a one-dimensional model requiring measurements at only one discharge rate. Electricite de France uses a simplified model based on measurements at several rates of discharge. This approach is possible when discharge can be controlled in the study area during data acquisition, as is generally the case downstream of hydroelectric installations. The micro-habitat methodology is now a fully operational tool with which to study changes in fish habitat quality in relation to varying discharge. It provides an element of assessment pertinent to the choice of the instream flow to be maintained downstream of a hydroelectric installation; this information is essential when the flow characteristics (velocity, depth) and the nature of the river bed are the preponderant factors governing habitat suitability for trout or salmon. The ultimate decision must nonetheless take into account any other potentially limiting factors for the biocenoses on the one hand, and the target water-use objectives on the other. In many cases, compromises must be found among different uses, different species and different stages in the fish development cycle. (Abstract Truncated)
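    A schematic of the habitat-scoring step in which hydraulic and biological models are coupled: a weighted usable area (WUA) sums cell areas weighted by suitability indices. The suitability curves and cell data below are invented placeholders, not the CEMAGREF/EDF models.

```python
# Weighted usable area at one discharge: area x suitability(depth) x
# suitability(velocity), summed over modeled cells.
def suitability(value, optimum, width):
    # Toy triangular suitability curve in [0, 1].
    return max(0.0, 1.0 - abs(value - optimum) / width)

cells = [
    # (area m^2, depth m, velocity m/s)
    (2.0, 0.40, 0.30),
    (2.0, 0.15, 0.80),
    (2.0, 0.60, 0.10),
]

wua = sum(area * suitability(d, 0.4, 0.4) * suitability(v, 0.3, 0.5)
          for area, d, v in cells)
print(f"WUA = {wua:.2f} m^2 at this discharge")
```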

  8. Hazard interactions and interaction networks (cascades) within multi-hazard methodologies

    Gill, Joel C.; Malamud, Bruce D.

    2016-08-01

    This paper combines research and commentary to reinforce the importance of integrating hazard interactions and interaction networks (cascades) into multi-hazard methodologies. We present a synthesis of the differences between multi-layer single-hazard approaches and multi-hazard approaches that integrate such interactions. This synthesis suggests that ignoring interactions between important environmental and anthropogenic processes could distort management priorities, increase vulnerability to other spatially relevant hazards or underestimate disaster risk. In this paper we proceed to present an enhanced multi-hazard framework through the following steps: (i) description and definition of three groups (natural hazards, anthropogenic processes and technological hazards/disasters) as relevant components of a multi-hazard environment, (ii) outlining of three types of interaction relationship (triggering, increased probability, and catalysis/impedance), and (iii) assessment of the importance of networks of interactions (cascades) through case study examples (based on the literature, field observations and semi-structured interviews). We further propose two visualisation frameworks to represent these networks of interactions: hazard interaction matrices and hazard/process flow diagrams. Our approach reinforces the importance of integrating interactions between different aspects of the Earth system, together with human activity, into enhanced multi-hazard methodologies. Multi-hazard approaches support the holistic assessment of hazard potential and consequently disaster risk. We conclude by describing three ways by which understanding networks of interactions contributes to the theoretical and practical understanding of hazards, disaster risk reduction and Earth system management. Understanding interactions and interaction networks helps us to better (i) model the observed reality of disaster events, (ii) constrain potential changes in physical and social vulnerability
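    A minimal illustration of encoding such a hazard interaction matrix and following a two-step cascade; the hazards and links are illustrative only, not the paper's case studies.

```python
# Hazard interaction matrix: rows are primary hazards, columns are
# secondary hazards they can trigger.
import pandas as pd

hazards = ["earthquake", "landslide", "flood"]
M = pd.DataFrame(0, index=hazards, columns=hazards)
M.loc["earthquake", "landslide"] = 1   # ground shaking triggers landslides
M.loc["landslide", "flood"] = 1        # landslide dams can cause flooding

print(M)
# Follow a simple two-step cascade from an initial event:
secondary = [h for h in hazards if M.loc["earthquake", h]]
tertiary = [h2 for h in secondary for h2 in hazards if M.loc[h, h2]]
print("earthquake ->", secondary, "->", tertiary)
```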

  9. New applications of partial residual methodology

    Uslu, V.R.

    1999-12-01

    The formulation of a problem of interest in the framework of a statistical analysis starts with collecting the data, choosing a model, and making certain assumptions, as described in the basic paradigm by Box (1980). This stage is called model building. Then comes the estimation stage: treating the formulation of the problem as true, one obtains estimates and makes tests and inferences. In the final stage, called diagnostic checking, one checks whether there are disagreements between the data and the fitted model, using diagnostic measures and diagnostic plots. It is well known that statistical methods perform best when all assumptions related to the methods are satisfied, but in practice the ideal case is rarely attained. Diagnostics are therefore becoming important, and so are diagnostic plots, because they provide an immediate assessment. Partial residual plots, the main interest of the present study, play a major role among the diagnostic plots in multiple regression analysis. In the statistical literature it is acknowledged that partial residual plots are more useful than ordinary residual plots in detecting outliers and nonconstant variance, and especially in discovering curvature. In this study we consider the partial residual methodology in statistical methods beyond multiple regression. We show that, for the same purposes as in multiple regression, the use of partial residual plots is possible particularly in autoregressive time series models, transfer function models, linear mixed models and ridge regression. (author)
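    A short sketch of how a partial residual plot is constructed for one predictor in a multiple regression (partial residuals for x_j are the residuals plus b_j x_j); the data here are simulated for illustration.

```python
# Partial residuals: e + b_j * x_j plotted against x_j reveals curvature
# in that predictor that ordinary residual plots can mask.
import numpy as np

rng = np.random.default_rng(2)
n = 200
x1 = rng.uniform(0, 3, n)
x2 = rng.uniform(0, 3, n)
y = 1.0 + 2.0 * x1 + x2**2 + rng.standard_normal(n)   # x2 enters nonlinearly

X = np.column_stack([np.ones(n), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)          # linear fit
resid = y - X @ beta

partial_x2 = resid + beta[2] * x2   # partial residuals for x2
# Plotting partial_x2 against x2 would show the quadratic curvature.
print("slope estimates:", np.round(beta, 2))
```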

  10. Applications of a constrained mechanics methodology in economics

    Janová, Jitka

    2011-11-01

    This paper presents instructive interdisciplinary applications of constrained mechanics calculus in economics on a level appropriate for undergraduate physics education. The aim of the paper is (i) to meet the demand for illustrative examples suitable for presenting the background of the highly expanding research field of econophysics even at the undergraduate level and (ii) to enable the students to gain a deeper understanding of the principles and methods routinely used in mechanics by looking at the well-known methodology from the different perspective of economics. Two constrained dynamic economic problems are presented using the economic terminology in an intuitive way. First, the Phillips model of the business cycle is presented as a system of forced oscillations and the general problem of two interacting economies is solved by the nonholonomic dynamics approach. Second, the Cass-Koopmans-Ramsey model of economic growth is solved as a variational problem with a velocity-dependent constraint using the vakonomic approach. The specifics of the solution interpretation in economics compared to mechanics is discussed in detail, a discussion of the nonholonomic and vakonomic approaches to constrained problems in mechanics and economics is provided and an economic interpretation of the Lagrange multipliers (possibly surprising for the students of physics) is carefully explained. This paper can be used by the undergraduate students of physics interested in interdisciplinary physics applications to gain an understanding of the current scientific approach to economics based on a physical background, or by university teachers as an attractive supplement to classical mechanics lessons.
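    As a hedged illustration of the first example, the Phillips business-cycle model in its forced-oscillation form can be integrated numerically; the coefficients and forcing below are invented for the sketch, not taken from the paper.

```python
# Phillips-style business cycle as a damped, forced oscillator:
# y'' + a*y' + b*y = f(t), with y the output gap.
import numpy as np
from scipy.integrate import solve_ivp

a, b = 0.8, 4.0                          # damping and stiffness analogues

def forcing(t):
    return 0.5 * np.sin(1.5 * t)         # exogenous demand shock

def rhs(t, s):
    y, v = s                             # output gap and its rate of change
    return [v, -a * v - b * y + forcing(t)]

sol = solve_ivp(rhs, (0, 40), [0.0, 0.0], max_step=0.05)
print(f"output gap at t=40: {sol.y[0, -1]:+.3f}")
```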

  13. An aspect-oriented methodology for designing secure applications

    Georg, Geri; Ray, Indrakshi; Anastasakis, Kyriakos; Bordbar, Behzad; Toahchoodee, Manachai; Houmb, S.H.

    We propose a methodology, based on aspect-oriented modeling (AOM), for incorporating security mechanisms in an application. The functionality of the application is described using the primary model and the attacks are specified using aspects. The attack aspect is composed with the primary model to

  14. Probabilistic risk assessment methodology for risk management and regulatory applications

    See Meng Wong; Kelly, D.L.; Riley, J.E.

    1997-01-01

    This paper discusses the development and potential applications of PRA methodology for risk management and regulatory applications in the U.S. nuclear industry. The new PRA methodology centers on the development of the time-dependent configuration risk profile for evaluating the effectiveness of operational risk management programs at U.S. nuclear power plants. Configuration-risk profiles have been used as risk-information tools for (1) a better understanding of the impact of daily operational activities on plant safety, and (2) proactive planning of operational activities to manage risk. Trial applications of the methodology were undertaken to demonstrate that configuration-risk profiles can be developed routinely, and can be useful for various industry and regulatory applications. Lessons learned include a better understanding of the issues and characteristics of PRA models available to industry, and identification of the attributes and pitfalls in the development of risk profiles
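
    A minimal sketch of how a time-dependent configuration risk profile can be aggregated into an incremental risk figure; the frequencies and outage windows below are invented for illustration, not plant PRA values:

        # Baseline core damage frequency per year, all equipment available (assumed).
        baseline_cdf = 2.0e-5

        # Out-of-service windows: (start_day, end_day, cdf_while_in_configuration).
        outages = [(10, 12, 8.0e-5), (45, 46, 3.0e-4)]

        def cdf_on_day(day):
            """Configuration-specific core damage frequency on a given day."""
            for start, end, cdf in outages:
                if start <= day < end:
                    return cdf
            return baseline_cdf

        # Incremental core damage probability over a 90-day window: integrate
        # the excess risk above baseline over time (days converted to years).
        icdp = sum((cdf_on_day(d) - baseline_cdf) / 365.0 for d in range(90))
        print(f"incremental core damage probability: {icdp:.2e}")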

  15. Methodology and applications for organizational safety culture

    Sakaue, Takeharu; Makino, Maomi

    2004-01-01

    The mission of our activity is to make the 'guidance of safety culture for understanding and evaluations' more useful and more substantial by clarifying the position of safety culture within the evaluation of quality management. This was pointed out last year in 'Discussion on how to implement safety culture sufficiently and possible recommendation', following the falsification issue at TEPCO (Tokyo Electric Power Company). We have been developing a safety culture evaluation structured around three elements: a safety culture evaluation support tool (SCET), an organizational reliability model (ORM), and a system for safety. This paper mainly describes the organizational reliability model (ORM) and its applications, as well as outlining the system for safety culture within quality management. (author)

  16. Designing Interactive Applications to Support Novel Activities

    Hyowon Lee

    2013-01-01

    R&D in media-related technologies including multimedia, information retrieval, computer vision, and the semantic web is experimenting with a variety of computational tools that, if sufficiently matured, could support many novel activities that are not practiced today. Interactive technology demonstration systems, produced typically at the end of such projects, show great potential for taking advantage of technological possibilities. These demo systems or “demonstrators” are, even if crude or far-fetched, a significant manifestation of the technologists’ visions in transforming emerging technologies into novel usage scenarios and applications. In this paper, we reflect on design processes and crucial design decisions made while designing some successful, web-based interactive demonstrators developed by the authors. We identify methodological issues in applying today’s requirement-driven usability engineering method to designing this type of novel application and call for a clearer distinction between designing mainstream applications and designing novel applications. More solution-oriented approaches leveraging design thinking are required, and more pragmatic evaluation criteria are needed to assess the role of a system in exploiting technological possibilities to provoke further brainstorming and discussion. Such an approach will support a more efficient channelling of the technology-to-application transformation, which is becoming increasingly crucial in today’s context of rich technological possibilities.

  17. Application and licensing requirements of the Framatome ANP RLBLOCA methodology

    Martin, R.P.; Dunn, B.M.

    2004-01-01

    The Framatome ANP Realistic Large-Break LOCA methodology (FANP RLBLOCA) is an analysis approach approved by the US NRC for supporting the licensing basis of 3- and 4-loop Westinghouse PWRs and CE 2x4 PWRs. It was developed consistent with the NRC's Code Scaling, Applicability, and Uncertainty (CSAU) methodology for performing best-estimate large-break LOCAs. The CSAU methodology consists of three key elements, with the second and third elements addressing uncertainty identification and application. Unique to the CSAU methodology is the use of engineering judgment and the Process Identification and Ranking Table (PIRT), defined in the first element, to lay the groundwork for achieving the ultimate goal of quantifying the total uncertainty in predicted measures of interest associated with the large-break LOCA. It is the PIRT that not only directs the methodology development, but also directs the methodology review. While the FANP RLBLOCA methodology was generically approved, a plant-specific application is customized in two ways, addressing how the unique plant characterization 1) is translated to code input and 2) relates to the unique methodology licensing requirements. Related to the former, plants are required by 10 CFR 50.36 to define a technical specification limiting condition for operation based on the following criteria: 1. Installed instrumentation that is used in the control room to detect, and indicate, a significant abnormal degradation of the reactor coolant pressure boundary. 2. A process variable, design feature, or operating restriction that is an initial condition of a design basis accident or transient analysis that either assumes the failure of or presents a challenge to the integrity of a fission product barrier. 3. A structure, system, or component that is part of the primary success path and which functions or actuates to mitigate a design basis accident or transient that either assumes the failure of or presents a challenge to the integrity of a

  18. Soils Activity Mobility Study: Methodology and Application

    None, None

    2014-09-29

    and labor- and data-intensive methods. For the watersheds analyzed in this report using the Level 1 PSIAC method, the risk of erosion is low. The field reconnaissance surveys of these watersheds confirm the conclusion that the sediment yield of undisturbed areas at the NNSS would be low. The climate, geology, soils, ground cover, land use, and runoff potential are similar among these watersheds. There are no well-defined ephemeral channels except at the Smoky and Plutonium Valley sites. Topography seems to have the strongest influence on sediment yields, as sediment yields are higher on the steeper hill slopes. Lack of measured sediment yield data at the NNSS does not allow for a direct evaluation of the yield estimates by the PSIAC method. Level 2 MUSLE estimates in all the analyzed watersheds except Shasta are a small percentage of the estimates from PSIAC because MUSLE is not inclusive of channel erosion. This indicates that channel erosion dominates the total sediment yield in these watersheds. Annual sediment yields for these watersheds are estimated using the CHAN-SEDI and CHAN-SEDII channel sediment transport models. Both transport models give similar results and exceed the estimates obtained from PSIAC and MUSLE. It is recommended that the total watershed sediment yield of watersheds at the NNSS with flow channels be obtained by adding the washload estimate (rill and inter-rill erosion) from MUSLE to that obtained from channel transport models (bed load and suspended sediment). PSIAC will give comparable results if factor scores for channel erosion are revised towards the high erosion level. Application of the Level 3 process-based models to estimate sediment yields at the NNSS cannot be recommended at this time. Increased model complexity alone will not improve the certainty of the sediment yield estimates. Models must be calibrated against measured data before model results are accepted as certain. Because no measurements of sediment yields at the NNSS are
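
    A minimal sketch of a Level 2 washload estimate via MUSLE in Williams' familiar single-event form; all factor values below are illustrative assumptions, not the report's data:

        def musle_sediment_yield(runoff_volume_m3, peak_flow_m3s, K, LS, C, P):
            """Single-event sediment yield (metric tonnes) from the Modified
            Universal Soil Loss Equation, Y = 11.8 * (Q * q_p)**0.56 * K * LS * C * P
            (Williams' formulation; note MUSLE covers rill and inter-rill erosion
            only, not channel erosion)."""
            return 11.8 * (runoff_volume_m3 * peak_flow_m3s) ** 0.56 * K * LS * C * P

        # Hypothetical arid-watershed event, loosely in the spirit of the report:
        print(musle_sediment_yield(runoff_volume_m3=5000.0, peak_flow_m3s=2.5,
                                   K=0.3, LS=1.2, C=0.2, P=1.0))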

  19. User Interaction Modeling and Profile Extraction in Interactive Systems: A Groupware Application Case Study

    Tîrnăucă, Cristina; Duque, Rafael; Montaña, José L.

    2017-01-01

    A relevant goal in human–computer interaction is to produce applications that are easy to use and well-adjusted to their users’ needs. To address this problem, it is important to know how users interact with the system. This work constitutes a methodological contribution capable of identifying the context of use in which users perform interactions with a groupware application (synchronous or asynchronous) and provides, using machine learning techniques, generative models of how users behave. Additionally, these models are transformed into a text that describes in natural language the main characteristics of the users’ interaction with the system. PMID:28726762
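
    A minimal sketch of one way to learn a generative model of user behaviour from interaction logs, here a first-order Markov chain; the event names and sessions are invented for illustration:

        from collections import defaultdict

        # Toy interaction logs: each session is an ordered list of user actions.
        sessions = [
            ["login", "open_doc", "edit", "edit", "save", "logout"],
            ["login", "open_doc", "comment", "save", "logout"],
        ]

        # Count observed transitions between consecutive actions.
        counts = defaultdict(lambda: defaultdict(int))
        for s in sessions:
            for a, b in zip(s, s[1:]):
                counts[a][b] += 1

        # Row-normalise counts into transition probabilities P(next | current).
        model = {a: {b: n / sum(nxt.values()) for b, n in nxt.items()}
                 for a, nxt in counts.items()}
        print(model["open_doc"])   # e.g. {'edit': 0.5, 'comment': 0.5}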

  20. Stable isotope methodology and its application to nutrition and gastroenterology

    Klein, P.D.; Hachey, D.L.; Wong, W.W.; Abrams, S.A.

    1993-01-01

    This report describes the activities of the Stable Isotope Laboratory in its function as a core resource facility for stable isotope applications in human nutrition research. Three aspects are covered: Training of visitors, assessment of new instrumentation, and development of new methodology. The research achievements of the laboratory are indicated in the publications that appeared during this period. (author). 23 refs

  1. SINGULAR SPECTRUM ANALYSIS: METHODOLOGY AND APPLICATION TO ECONOMICS DATA

    Hossein HASSANI; Anatoly ZHIGLJAVSKY

    2009-01-01

    This paper describes the methodology of singular spectrum analysis (SSA) and demonstrates that it is a powerful method of time series analysis and forecasting, particularly for economic time series. The authors consider the application of SSA to the analysis and forecasting of the Iranian national accounts data as provided by the Central Bank of the Islamic Republic of Iran.
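
    A minimal sketch of basic SSA (embedding, SVD, diagonal averaging) on a synthetic series; the window length and data are illustrative, not the Iranian accounts series:

        import numpy as np

        # Synthetic series: an oscillation plus a slow trend.
        x = np.sin(np.linspace(0, 8 * np.pi, 200)) + 0.05 * np.arange(200)
        L = 50                                     # window length
        K = len(x) - L + 1
        X = np.column_stack([x[i:i + L] for i in range(K)])   # trajectory matrix

        U, s, Vt = np.linalg.svd(X, full_matrices=False)      # decomposition step

        # Reconstruct from the r leading eigentriples by diagonal averaging:
        # each anti-diagonal of the rank-r matrix corresponds to one time index.
        r = 2
        Xr = (U[:, :r] * s[:r]) @ Vt[:r, :]
        recon = np.array([np.mean(Xr[::-1, :].diagonal(k))
                          for k in range(-L + 1, K)])
        # recon now holds the extracted signal component of x.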

  2. Towards an MDA-based development methodology for distributed applications

    van Sinderen, Marten J.; Gavras, A.; Belaunde, M.; Ferreira Pires, Luis; Andrade Almeida, João

    2004-01-01

    This paper proposes a development methodology for distributed applications based on the principles and concepts of the Model-Driven Architecture (MDA). The paper identifies phases and activities of an MDA-based development trajectory, and defines the roles and products of each activity in accordance

  3. Interfacing system LOCA risk assessment: Methodology and application

    Galyean, W.J.; Schroeher, J.A.; Hanson, D.J.

    1991-01-01

    The United States Nuclear Regulatory Commission (NRC) is sponsoring a research program to develop an improved understanding of the human factors, hardware, and accident consequence issues that dominate the risk from an Interfacing Systems Loss-of-Coolant Accident (ISLOCA) at a nuclear power plant. To accomplish this program, a methodology has been developed for estimating the core damage frequency and risk associated with an ISLOCA. The steps of the methodology are described, with emphasis on the one step that is unique: estimation of the probability of rupture of the low-pressure systems. A trial application of the methodology was made for a Pressurized Water Reactor (PWR). The results are believed to be plant specific and indicate that human errors during startup and shutdown could be significant contributors to ISLOCA risk at the plant evaluated. 10 refs

  4. Hybrid probabilistic and possibilistic safety assessment. Methodology and application

    Kato, Kazuyuki; Amano, Osamu; Ueda, Hiroyoshi; Ikeda, Takao; Yoshida, Hideji; Takase, Hiroyasu

    2002-01-01

    This paper presents a unified methodology to handle variability and ignorance by using probabilistic and possibilistic techniques, respectively. The methodology has been applied to the safety assessment of geological disposal of high-level radioactive waste. Uncertainties associated with scenarios, models and parameters were defined in terms of fuzzy membership functions derived through a series of interviews with the experts, while variability was formulated by means of probability density functions (pdfs) based on available data sets. The exercise demonstrated the applicability of the new methodology and, in particular, its advantage in quantifying uncertainties based on expert opinion and in providing information on the dependence of assessment results on the level of conservatism. In addition, it was shown that sensitivity analysis can identify key parameters contributing to uncertainties associated with the results of the overall assessment. The information mentioned above can be utilized to support decision-making and to guide the process of disposal system development and optimization of protection against potential exposure. (author)
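
    A minimal sketch of one hybrid propagation step, combining Monte Carlo sampling of a variable (aleatory) parameter with alpha-cut interval propagation of a fuzzy (epistemic) parameter; the toy model and all numbers are assumptions, not the paper's assessment model:

        import numpy as np

        rng = np.random.default_rng(1)

        def f(k, v):
            # Toy assessment model; monotone in k over the support, so
            # propagating the alpha-cut endpoints brackets the result.
            return k * v

        # Triangular fuzzy number for k with support [1, 3] and mode 2:
        # its alpha-cut is the interval [1 + alpha, 3 - alpha].
        alpha = 0.5
        k_lo, k_hi = 1.0 + alpha, 3.0 - alpha

        # Variability of v described by a pdf (here lognormal), sampled by MC.
        v_samples = rng.lognormal(mean=0.0, sigma=0.5, size=10_000)

        # Each random sample maps the alpha-cut to a random interval; the
        # interval bounds yield lower/upper probability curves for the output.
        lower = f(k_lo, v_samples)
        upper = f(k_hi, v_samples)
        print(np.percentile(lower, 95), np.percentile(upper, 95))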

  5. Digraph matrix analysis applications to systems interactions

    Alesso, H.P.; Altenbach, T.; Lappa, D.; Kimura, C.; Sacks, I.J.; Ashmore, B.C.; Fromme, D.; Smith, C.F.; Williams, W.

    1984-01-01

    Complex events such as Three Mile Island-2, Browns Ferry-3 and Crystal River-3 have demonstrated that previously unidentified system interdependencies can be important to safety. A major aspect of these events was dependent faults (common cause/mode failures). The term systems interactions has been introduced by the Nuclear Regulatory Commission (NRC) to identify the concepts of spatial and functional coupling of systems which can lead to system interdependencies. Spatial coupling refers to dependencies resulting from a shared environmental condition; functional coupling refers both to dependencies resulting from components shared between safety and/or support systems, and to dependencies involving human actions. The NRC is currently developing guidelines to search for and evaluate adverse systems interactions at light water reactors. One approach utilizes graph-theoretical methods and is called digraph matrix analysis (DMA). This methodology has been specifically tuned to the systems interaction problem. The objective of this paper is to present results from two DMA applications and to contrast them with the results from more traditional fault tree approaches
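
    A minimal sketch of the graph-theoretic core of DMA, reachability over a system dependency digraph; the component names and edges below are invented for illustration:

        # Dependency digraph: an edge (a, b) means failure of a propagates to b.
        nodes = ["diesel_gen", "bus_A", "pump_1", "room_cooling", "pump_2"]
        edges = {("diesel_gen", "bus_A"), ("bus_A", "pump_1"),
                 ("room_cooling", "pump_1"), ("room_cooling", "pump_2")}

        n = len(nodes)
        idx = {name: i for i, name in enumerate(nodes)}
        reach = [[(a, b) in edges for b in nodes] for a in nodes]

        # Warshall's algorithm: transitive closure of the adjacency matrix.
        for k in range(n):
            for i in range(n):
                for j in range(n):
                    reach[i][j] = reach[i][j] or (reach[i][k] and reach[k][j])

        # Nodes whose failure reaches both redundant pumps flag a potential
        # systems interaction (shared dependency / common-cause coupling).
        common = [a for a in nodes
                  if reach[idx[a]][idx["pump_1"]] and reach[idx[a]][idx["pump_2"]]]
        print(common)   # ['room_cooling']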

  6. Training in radionuclide methodology and applications in biomedical area

    Signoretta, C.

    1998-01-01

    Full text: Training in the field of radionuclide methodology and applications in the biomedical area is important to ensure that radionuclides are duly used without risk to patients or to the technicians manipulating them. The National Atomic Energy Commission (CNEA) has, since its creation, given training courses at different technical levels to those working in science and technology. The Course on Radionuclide Methodology and Applications is the most continuous, varied and requested within CNEA. This is a basic course given mainly to professionals in biochemistry and medicine. Its goal is to provide both the theoretical and the practical knowledge needed for the use and application of radionuclides, bearing in mind radiological safety regulations. Teaching is carried out by personnel from CNEA and the Nuclear Regulatory Authority (ARN). In addition, a course for Technicians in Nuclear Medicine is given, supplying knowledge in this field as well as the expertise and practice needed to assist a responsible medical doctor. These courses comprise radionuclide methodology, anatomy, physiology, instrumentation and practical applications in Nuclear Medicine. Statistics concerning these courses are given. (author)

  7. Applications of mixed-methods methodology in clinical pharmacy research.

    Hadi, Muhammad Abdul; Closs, S José

    2016-06-01

    Introduction: Mixed-methods methodology, as the name suggests, refers to the mixing of elements of both qualitative and quantitative methodologies in a single study. In the past decade, mixed-methods methodology has gained popularity among healthcare researchers as it promises to bring together the strengths of both qualitative and quantitative approaches. Methodology: A number of mixed-methods designs are available in the literature; the four most commonly used in healthcare research are the convergent parallel design, the embedded design, the exploratory design, and the explanatory design. Each has its own unique advantages, challenges and procedures, and selection of a particular design should be guided by the research question. Guidance on designing, conducting and reporting mixed-methods research is available in the literature, so it is advisable to adhere to this to ensure methodological rigour. When to use it: The approach is best suited when the research questions require triangulating findings from different methodologies to explain a single phenomenon; clarifying the results of one method using another method; informing the design of one method based on the findings of another method; developing a scale/questionnaire; or answering different research questions within a single study. Two case studies are presented to illustrate possible applications of mixed-methods methodology. Limitations: Possessing the necessary knowledge and skills to undertake qualitative and quantitative data collection, analysis, interpretation and integration remains the biggest challenge for researchers conducting mixed-methods studies. Sequential study designs are often time consuming, being in two (or more) phases, whereas concurrent study designs may require more than one data collector to collect both qualitative and quantitative data at the same time.

  8. A comparison of usability methods for testing interactive health technologies: methodological aspects and empirical evidence.

    Jaspers, Monique W M

    2009-05-01

    Usability evaluation is now widely recognized as critical to the success of interactive health care applications. However, the broad range of usability inspection and testing methods available may make it difficult to decide on a usability assessment plan. To guide novices in the human-computer interaction field, we provide an overview of the methodological and empirical research available on the three usability inspection and testing methods most often used. We describe two 'expert-based' and one 'user-based' usability method: (1) the heuristic evaluation, (2) the cognitive walkthrough, and (3) the think aloud. All three usability evaluation methods are applied in laboratory settings. Heuristic evaluation is a relatively efficient usability evaluation method with a high benefit-cost ratio, but it requires high skill and usability experience of the evaluators to produce reliable results. The cognitive walkthrough is a more structured approach than the heuristic evaluation, with a stronger focus on the learnability of a computer application. Major drawbacks of the cognitive walkthrough are the required level of detail of the task and user background descriptions for an adequate application of the latest version of the technique. The think aloud is a very direct method for gaining deep insight into the problems end users encounter in interaction with a system, but data analysis is extensive and requires a high level of expertise both in cognitive ergonomics and in the computer system application domain. Each of the three usability evaluation methods has shown its usefulness and has its own advantages and disadvantages; no single method has revealed significant results indicating that it is singularly effective in all circumstances. A combination of different techniques that complement one another should preferably be used, as their collective application will be more powerful than any applied in isolation. Innovative mobile and automated solutions to support end-user testing have

  9. Methodological developments and applications of neutron activation analysis

    Kucera, J.

    2007-01-01

    The paper reviews the author's experience acquired and achievements made in methodological developments of neutron activation analysis (NAA) of mostly biological materials. These involve epithermal neutron activation analysis, radiochemical neutron activation analysis using both single- and multi-element separation procedures, use of various counting modes, and the development and use of the self-verification principle. The role of NAA in the detection of analytical errors is discussed and examples of applications of the procedures developed are given. (author)

  10. PIXE methodology of rare earth element analysis and its applications

    Ma Xinpei

    1992-01-01

    The Proton Induced X-ray Emission (PIXE) methodology of rare earth element (REE) analysis is discussed, including the significance of REE analysis, the principle of PIXE applied to REEs, the selection of characteristic X-rays for lanthanide-series elements, the deconvolution of highly overlapped PIXE spectra, and the minimum detection limit (MDL) of REEs. Some practical applications are presented, and the specifics of PIXE analysis of high-purity REE chemicals are discussed. (author)
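
    A minimal sketch of deconvolving two overlapped peaks by least-squares Gaussian fitting; the energies, widths and counts below are invented for illustration, not PIXE data:

        import numpy as np
        from scipy.optimize import curve_fit

        def two_gauss(E, a1, mu1, a2, mu2, sigma, bkg):
            # Two Gaussian lines of shared width on a flat background.
            g = lambda a, mu: a * np.exp(-0.5 * ((E - mu) / sigma) ** 2)
            return g(a1, mu1) + g(a2, mu2) + bkg

        E = np.linspace(4.5, 6.5, 400)                        # energy axis, keV
        rng = np.random.default_rng(2)
        truth = two_gauss(E, 900, 5.23, 400, 5.43, 0.08, 30)  # overlapped lines
        counts = rng.poisson(truth).astype(float)             # counting noise

        p0 = (800, 5.2, 300, 5.45, 0.1, 20)                   # initial guesses
        popt, _ = curve_fit(two_gauss, E, counts, p0=p0)
        print(popt[:4])   # fitted amplitudes and positions of the two lines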

  11. Interaction between core analysis methodology and nuclear design: some PWR examples

    Rothleder, B.M.; Eich, W.J.

    1982-01-01

    The interaction between core analysis methodology and nuclear design is exemplified by PSEUDAX, a major improvement related to the Advanced Recycle Methodology Program (ARMP) computer code system, still undergoing development by the Electric Power Research Institute. The mechanism of this interaction is explored by relating several specific nuclear design changes to the demands placed by these changes on the ARMP system, and by examining how these demands are met, first within the standard ARMP methodology and then through augmentation of the standard methodology by the development of PSEUDAX

  12. Application of Bow-tie methodology to improve patient safety.

    Abdi, Zhaleh; Ravaghi, Hamid; Abbasi, Mohsen; Delgoshaei, Bahram; Esfandiari, Somayeh

    2016-05-09

    Purpose - The purpose of this paper is to apply Bow-tie methodology, a proactive risk assessment technique based on a systemic approach, for prospective analysis of the risks threatening patient safety in the intensive care unit (ICU). Design/methodology/approach - Bow-tie methodology was used by a multidisciplinary team in the ICU to manage clinical risks threatening patient safety. The Bow-tie analysis was conducted on incidents related to high-alert medications, ventilator-associated pneumonia, catheter-related bloodstream infection, urinary tract infection, and unwanted extubation. Findings - In total, 48 potential adverse events were analysed. The causal factors were identified and classified into relevant categories. The number and effectiveness of existing preventive and protective barriers were examined for each potential adverse event. The adverse events were evaluated according to the risk criteria, and a set of interventions was proposed with the aim of improving the existing barriers or implementing new ones. A number of recommendations were implemented in the ICU, considering their feasibility. Originality/value - The application of Bow-tie methodology led to practical recommendations to eliminate or control the hazards identified. It also contributed to a better understanding of the hazard prevention and protection required for safe operations in clinical settings.

  13. Risk-Informed Assessment Methodology Development and Application

    Sung Goo Chi; Seok Jeong Park; Chul Jin Choi; Ritterbusch, S.E.; Jacob, M.C.

    2002-01-01

    Westinghouse Electric Company (WEC) has been working with Korea Power Engineering Company (KOPEC) on a US Department of Energy (DOE) sponsored Nuclear Energy Research Initiative (NERI) project through a collaborative agreement established for the domestic NERI program. The project deals with Risk-Informed Assessment (RIA) of regulatory and design requirements of future nuclear power plants. An objective of the RIA project is to develop a risk-informed design process, which focuses on identifying and incorporating advanced features into future nuclear power plants (NPPs) that would meet risk goals in a cost-effective manner. The RIA design methodology is proposed to accomplish this objective. This paper discusses the development of this methodology and demonstrates its application in the design of plant systems for future NPPs. Advanced conceptual plant systems consisting of an advanced Emergency Core Cooling System (ECCS) and Emergency Feedwater System (EFWS) for a NPP were developed and the risk-informed design process was exercised to demonstrate the viability and feasibility of the RIA design methodology. Best estimate Loss-of-Coolant Accident (LOCA) analyses were performed to validate the PSA success criteria for the NPP. The results of the analyses show that the PSA success criteria can be met using the advanced conceptual systems and that the RIA design methodology is a viable and appropriate means of designing key features of risk-significant NPP systems. (authors)

  14. Calibration Modeling Methodology to Optimize Performance for Low Range Applications

    McCollum, Raymond A.; Commo, Sean A.; Parker, Peter A.

    2010-01-01

    Calibration is a vital process in characterizing the performance of an instrument in an application environment, and it seeks to obtain acceptable accuracy over the entire design range. Often, project requirements specify a maximum total measurement uncertainty expressed as a percent of full scale. However, in some applications we seek enhanced performance at the low end of the range, so expressing the accuracy as a percent of reading should be considered as a modeling strategy. For example, it is common to want to use a force balance in multiple facilities or regimes, often well below its designed full-scale capacity. This paper presents a general statistical methodology for optimizing calibration mathematical models based on a percent-of-reading accuracy requirement, which has broad application in all types of transducer applications where low-range performance is required. A case study illustrates the proposed methodology for the Mars Entry Atmospheric Data System, which employs seven strain-gage based pressure transducers mounted on the heatshield of the Mars Science Laboratory mission.
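
    A minimal sketch of a percent-of-reading calibration fit via weighted least squares with 1/y^2 weights; the calibration data below are synthetic assumptions, not the case study's:

        import numpy as np

        # Synthetic calibration data with gain error, offset and noise that
        # grows with the reading (all numbers assumed).
        rng = np.random.default_rng(4)
        applied = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 50.0, 100.0])
        reading = 1.02 * applied + 0.05 + rng.normal(scale=0.01 * applied + 0.005)

        # Weighted least squares with weights ~ 1/y^2 minimises *relative*
        # residuals, i.e. a percent-of-reading rather than percent-of-full-scale
        # criterion for the calibration model reading = b0 + b1*applied.
        A = np.column_stack([np.ones_like(applied), applied])
        w = np.sqrt(1.0 / reading ** 2)
        beta, *_ = np.linalg.lstsq(A * w[:, None], reading * w, rcond=None)

        fit = A @ beta
        print(np.max(np.abs(fit - reading) / reading) * 100, "% of reading")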

  15. Reversible logic synthesis methodologies with application to quantum computing

    Taha, Saleem Mohammed Ridha

    2016-01-01

    This book opens the door to a new, interesting and ambitious world of reversible and quantum computing research. It presents the state of the art required to travel around that world safely. Top world universities, companies and government institutions are in a race to develop new methodologies, algorithms and circuits for reversible logic, quantum logic, reversible and quantum computing, and nano-technologies. In this book, twelve reversible logic synthesis methodologies are presented for the first time in a single work, together with some new proposals. The sequential reversible logic circuitry is also discussed for the first time in book form. Reversible logic plays an important role in quantum computing: any progress in the domain of reversible logic can be directly applied to quantum logic. One of the goals of this book is to show the application of reversible logic in quantum computing. A new implementation of wavelet and multiwavelet transforms using quantum computing is performed for this purpose. Rese...

  16. A gamma heating calculation methodology for research reactor application

    Lee, Y.K.; David, J.C.; Carcreff, H.

    2001-01-01

    Gamma heating is an important issue in research reactor operation and fuel safety. Heat deposition in irradiation targets and temperature distribution in irradiation facility should be determined so as to obtain the optimal irradiation conditions. This paper presents a recently developed gamma heating calculation methodology and its application on the research reactors. Based on the TRIPOLI-4 Monte Carlo code under the continuous-energy option, this new calculation methodology was validated against calorimetric measurements realized within a large ex-core irradiation facility of the 70 MWth OSIRIS materials testing reactor (MTR). The contributions from prompt fission neutrons, prompt fission γ-rays, capture γ-rays and inelastic γ-rays to heat deposition were evaluated by a coupled (n, γ) transport calculation. The fission product decay γ-rays were also considered but the activation γ-rays were neglected in this study. (author)

  17. Application of transient analysis methodology to heat exchanger performance monitoring

    Rampall, I.; Soler, A.I.; Singh, K.P.; Scott, B.H.

    1994-01-01

    A transient testing technique is developed to evaluate the thermal performance of industrial-scale heat exchangers. A Galerkin-based numerical method with a choice of spectral basis elements to account for spatial temperature variations in heat exchangers is developed to solve the transient heat exchanger model equations. Testing a heat exchanger in the transient state may be the only viable alternative where conventional steady-state testing procedures are impossible or infeasible. For example, this methodology is particularly suited to the determination of fouling levels in component cooling water system heat exchangers in nuclear power plants. The heat load on these so-called component coolers under steady-state conditions is too small to permit meaningful testing. An adequate heat load develops immediately after a reactor shutdown, when the exchanger inlet temperatures are highly time-dependent. The application of the analysis methodology is illustrated herein with reference to in-situ transient testing carried out at a nuclear power plant. The method, however, is applicable to any transient testing application

  18. Customer Interaction in Software Development: A Comparison of Software Methodologies Deployed in Namibian Software Firms

    Iyawa, GE

    2016-01-01

    within the Namibian context. An implication for software project managers and software developers is that customer interaction should be properly managed to ensure that the software methodologies for improving software development processes...

  19. LOADS INTERACTION DOMAINS METHODOLOGY FOR THE DESIGN OF STEEL GREENHOUSE STRUCTURES

    Sergio Castellano

    2007-03-01

    The aim of this research is to develop a design methodology which correlates the main structural design parameters of greenhouses, whose production is characterised by high levels of standardization, such as the height of the gutter or the distance between frames, with the actions on the greenhouse. The methodology, based on the use of charts and abacuses, permits a clear and direct interpretation of the structural response to design load combinations and allows the design of structural improvements aimed at optimizing the benefit (structural strength)/cost ratio. The study of structural interaction domains allowed a clear and direct interpretation of the structural response to design load combinations. The diagrams highlight not only whether the structure fulfils the standard requirements but also the safety levels with respect to design load combinations, and show the structural designer how to operate in order to optimize the structural response to standard requirements, achieving the best benefit (structural safety)/cost ratio. The methodology was developed based on the criteria assigned by EN 13031 for two different kinds of greenhouse structures: an arched greenhouse with a plastic film covering and a duo-pitched roof greenhouse covered with rigid plastic membranes. Structural interaction domains for the arched greenhouse showed a better capability of the structure to resist vertical loads than horizontal ones. Moreover, the climatic load distribution on the structure assigned by EN 13031 is such that the combination of climatic actions is less dangerous for the structure than their individual application. The duo-pitched roof steel greenhouse interaction domains likewise showed a better capability of the structure to resist vertical loads than horizontal ones, and that, in any case, the serviceability limit state analysis is stricter than the ULS one. The shape of the structural domains highlighted that the combination of actions is more dangerous for the

  20. Fast underdetermined BSS architecture design methodology for real time applications.

    Mopuri, Suresh; Reddy, P Sreenivasa; Acharyya, Amit; Naik, Ganesh R

    2015-01-01

    In this paper, we propose a high-speed architecture design methodology for the Under-determined Blind Source Separation (UBSS) algorithm using our recently proposed high-speed Discrete Hilbert Transform (DHT), targeting real-time applications. In the UBSS algorithm, unlike typical BSS, the number of sensors is less than the number of sources, which is of more interest in real-time applications. The DHT architecture has been implemented based on a sub-matrix multiplication method to compute an M-point DHT, which uses the N-point architecture recursively, where M is an integer multiple of N. The DHT architecture and the state-of-the-art architecture are coded in VHDL for a 16-bit word length, and ASIC implementation is carried out using UMC 90-nm technology @ VDD = 1 V and @ 1 MHz clock frequency. The proposed architecture implementation and experimental comparison results show that the DHT design is two times faster than the state-of-the-art architecture.
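
    A functional reference sketch of a discrete Hilbert transform via the FFT (analytic-signal method); this approximates only the input/output behaviour, not the paper's sub-matrix hardware architecture:

        import numpy as np

        def hilbert_fft(x):
            """Discrete Hilbert transform of a real sequence via the FFT:
            zero the negative frequencies, double the positive ones, and take
            the imaginary part of the resulting analytic signal."""
            N = len(x)
            Xf = np.fft.fft(x)
            h = np.zeros(N)
            h[0] = 1.0
            if N % 2 == 0:
                h[N // 2] = 1.0
                h[1:N // 2] = 2.0
            else:
                h[1:(N + 1) // 2] = 2.0
            return np.fft.ifft(Xf * h).imag

        # Sanity check: the Hilbert transform of cos is sin.
        n = np.arange(256)
        x = np.cos(2 * np.pi * 5 * n / 256)
        print(np.allclose(hilbert_fft(x), np.sin(2 * np.pi * 5 * n / 256), atol=1e-10))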

  1. An interactive boundary layer modelling methodology for aerodynamic flows

    Smith, L

    2013-01-01

    [Nomenclature fragment: c chord length (m); CD dissipation coefficient; Cf skin friction coefficient; d original grid position (m); f body force component (N); h height (m); H shape factor; H* energy thickness shape factor; H** density thickness; J Jacobian; L ...] ... integral relations are used for laminar and turbulent flows, which eliminate the direct link between the profile shape and the pressure gradient, making them suitable for flow with strong interaction. The resulting two equations read: dθ/dξ + (H + 2 - Me^2) (θ/Ue) dUe/dξ = Cf/2, ...
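
    A minimal sketch marching the momentum integral relation reconstructed above; the edge-velocity distribution, the crude laminar closure and the initial condition are assumptions for illustration only:

        import numpy as np

        # March d(theta)/d(xi) = Cf/2 - (H + 2 - Me^2) * (theta/Ue) * dUe/dxi.
        nu = 1.5e-5                          # kinematic viscosity, m^2/s
        xi = np.linspace(0.01, 1.0, 500)     # surface coordinate, m
        Ue = 10.0 * (1.0 - 0.2 * xi)         # decelerating edge velocity (assumed)
        dUe = np.gradient(Ue, xi)
        H, Me = 2.6, 0.0                     # flat-plate-like shape factor, low speed

        theta = np.empty_like(xi)
        theta[0] = 1e-4                      # initial momentum thickness, m
        for k in range(len(xi) - 1):
            Cf = 0.664 / np.sqrt(Ue[k] * xi[k] / nu)   # Blasius-type estimate
            rhs = 0.5 * Cf - (H + 2.0 - Me ** 2) * (theta[k] / Ue[k]) * dUe[k]
            theta[k + 1] = theta[k] + (xi[k + 1] - xi[k]) * rhs
        # theta(xi) approximates the momentum-thickness growth along the surface.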

  2. Design verification methodology for a solenoid valve for industrial applications

    Park, Chang Dae; Lim, Byung Ju; Chun, Kyung Yul

    2015-01-01

    Solenoid-operated valves (SOVs) are widely used in many applications due to their fast dynamic response, cost effectiveness, and less contamination-sensitive characteristics. In this paper, we provide a convenient design verification method for SOVs to design engineers who otherwise depend on their experience and on experiments during the design and development of an SOV. First, we summarize a detailed procedure for designing SOVs for industrial applications: all of the design constraints are defined in the first step of the design, and the detailed design procedure is then presented based on design experience as well as various physical and electromagnetic relationships. Secondly, we suggest a verification method for this design using theoretical relationships, which enables optimal design of an SOV from the point of view of the safety factor of the design attraction force. Lastly, experimental performance tests using several prototypes manufactured with this design method show that the suggested design verification methodology is appropriate for designing new solenoid models. We believe that this verification process is novel and useful for saving time and expense during the development of SOVs, because verification tests with manufactured specimens may be partly substituted by this verification methodology.
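
    A minimal first-pass check of the attraction-force safety factor using the standard magnetic-circuit approximation F = (N*I)^2 * mu0 * A / (2 * g^2); all numbers are illustrative assumptions, and the formula neglects leakage, fringing and core reluctance:

        from math import pi

        mu0 = 4.0e-7 * pi          # vacuum permeability, H/m
        N, I = 800, 0.5            # turns, coil current (A)
        A = pi * 0.006 ** 2        # pole-face area for a 6 mm radius plunger, m^2
        g = 0.002                  # working air gap, m

        F = (N * I) ** 2 * mu0 * A / (2.0 * g ** 2)
        F_required = 2.0           # spring + pressure load to overcome, N (assumed)
        print(f"attraction force {F:.2f} N, safety factor {F / F_required:.2f}")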

  3. [Nursing methodology applied to patients with pressure ulcers. Clinical report].

    Galvez Romero, Carmen

    2014-05-01

    The application of functional patterns lets us make a systematic and premeditated nursing assessment, with which we obtain a great deal of relevant patient data in an organized way, making it easier to analyze. In our case, we use Marjory Gordon's functional health patterns and the NANDA (North American Nursing Diagnosis Association), NOC (Nursing Outcomes Classification) and NIC (Nursing Intervention Classification) taxonomies. The overall objective of this paper is to present the experience of implementation and development of nursing methodology in the care of patients with pressure ulcers. The article reports the case of a 52-year-old female who presented necrosis of the phalanxes in upper and lower limbs and suffered amputations after being hospitalized in an Intensive Care Unit. She was discharged with pressure ulcers on both heels. GENERAL ASSESSMENT: The nursing theory known as "Gordon's functional health patterns" was implemented and the affected patterns were identified. The second pattern (Nutritional-Metabolic) was taken as the reference, since it was the pattern which altered the rest. EVOLUTION OF THE PATIENT: The patient had a favourable evolution, improving in all the altered patterns. The infection symptoms disappeared and the pressure ulcers on both heels healed completely. The application of nursing methodology to the care of patients with pressure ulcers, using clinical practice guidelines, standardized procedures and rating scales of assessment, improves the evaluation of results and the performance of nurses.

  4. Exploring Methodologies and Indicators for Cross-disciplinary Applications

    Bernknopf, R.; Pearlman, J.

    2015-12-01

    Assessing the impact and benefit of geospatial information is a multidisciplinary task that draws on social, economic and environmental knowledge to formulate indicators and methods. There are use cases that couple geospatial information with the social sciences, including economics, psychology and sociology. Benefit-cost analysis is an empirical approach that uses money as an indicator for decision making; it is a traditional basis for a use case and has been applied to geospatial information among other areas. A newer use case that applies indicators is meta-regression analysis, which is used to evaluate transfers of socioeconomic benefits from different geographic regions within a unifying statistical approach. In this technique, qualitative and quantitative variables are indicators, which provide a weighted average of the value of the nonmarket good or resource over a large region. The expected willingness to pay for the nonmarket good can then be applied to a specific region. A third use case is the application of decision support systems and tools, which have been used for forecasting agricultural prices and for the analysis of hazard policies. However, new methods for integrating these disciplines into use cases, and an avenue to instruct the development of operational applications of geospatial information, are needed. Experience in one case may not be broadly transferable to other uses and applications if multiple disciplines are involved. To move forward, more use cases are needed and, especially, applications in the private sector. Applications are being examined across a multidisciplinary community for good examples that would be instructive in meeting the challenge. This presentation will look at the results of an investigation into directions in the broader application of use cases to teach the methodologies and the use of indicators across fields of interest.

  5. Application of decision-making methodology to certificate-of-need applications for CT scanners

    Gottinger, H.W.; Shapiro, P.

    1985-01-01

    This paper describes a case study and application of decision-making methodology to two competing Certificate of Need (CON) applications for CT body scanners. We demonstrate the use of decision-making methodology by evaluating the CON applications. Explicit value judgements reflecting the monetary equivalent of the different categories of benefit are introduced to facilitate this comparison. The difference between the benefits (measured in monetary terms) and costs is called the net social value. Any alternative with positive net social value is judged economically justifiable, and the alternative with the greatest net social value is judged the most attractive. (orig.)

  6. Application of SADT and ARIS methodologies for modeling and management of business processes of information systems

    O. V. Fedorova

    2018-01-01

    The article is devoted to the application of the SADT and ARIS methodologies for modeling and managing the business processes of information systems. The relevance of this article is beyond doubt, because the design of information system architecture, based on a thorough system analysis of the subject area, is of paramount importance for the development of information systems in general. The authors conducted serious work analyzing the application of the SADT and ARIS methodologies for modeling and managing the business processes of information systems. The analysis was carried out both in terms of modeling business processes (notation and application of the CASE-tool) and in terms of business process management. The first point of view reflects the interaction of the business analyst and the programmer in the development of the information system. The second point of view concerns the interaction of the business analyst and the customer. The SADT methodology is the basis of many modern methodologies for modeling business processes. Using the methodologies of the IDEF family, it is possible to efficiently display and analyze activity models of a wide range of complex information systems in various aspects. The CASE-tool ARIS is a suite of tools for the analysis and modeling of an organization's activities. The methodical basis of ARIS is a set of different modeling methods that reflect different views on the system under study. The authors' conclusions are fully justified. The results of the work can be useful for specialists in the field of modeling the business processes of information systems. In addition, the article can serve as an orientation when working on the constituent elements of curricula for students specializing in information and management fields, supporting an update of the content and structure of disciplines on modeling the architecture of information systems and organization management using models.

  7. Methodology for neural networks prototyping. Application to traffic control

    Belegan, I.C.

    1998-07-01

    The work described in this report was carried out in the context of the European project ASTORIA (Advanced Simulation Toolbox for Real-World Industrial Application in Passenger Management and Adaptive Control), and concerns the development of an advanced toolbox for complex transportation systems. Our work focused on the methodology for prototyping a set of neural networks corresponding to specific strategies for traffic control and congestion management. The tool used for prototyping is SNNS (Stuttgart Neural Network Simulator), developed at the University of Stuttgart, Institute for Parallel and Distributed High Performance Systems, and the real data from the field were provided by ZELT. This report is structured into six parts. The introduction gives some insights into traffic control and its approaches. The second chapter discusses the various existing control strategies. The third chapter is an introduction to the field of neural networks. Data analysis and pre-processing are described in the fourth chapter. In the fifth chapter, the methodology for prototyping the neural networks is presented. Finally, conclusions and further work are presented. (author) 14 refs.

  8. Application of theoretical and methodological components of nursing care

    Rosa del Socorro Morales-Aguilar

    2016-12-01

    Introduction: the theoretical and methodological components are the proper expertise of nursing, referring to the models, theories, care process, taxonomy of nursing diagnoses, nursing intervention classification system and outcomes classification system which ground nursing care in professional practice. Methodology: research was performed on Google Scholar, reviewing the databases Scielo, Ciberindex, Index Enfermería, Dialnet, Redalyc and Medline, identifying 70 articles published between 2005-2015 and selecting 52 of them. The keywords used were: nurse care, nursing diagnostic, classification, nursing theory, in Spanish and Portuguese. Results: in training, students receive knowledge of the nursing process, NANDA International, the classification of interventions, nursing outcomes and theoretical components. The theories of Dorothea Orem, Callista Roy, Nola Pender, Virginia Henderson, Florence Nightingale and Betty Neuman are applied. The application of the nursing process is limited, and low familiarity with the international taxonomies among nursing professionals in the care area is noticed. Conclusions: the challenge for nursing is to continue consolidating its scientific knowledge and to close the gap between theory and practice.

  9. An overall methodology for reliability prediction of mechatronic systems design with industrial application

    Habchi, Georges; Barthod, Christine

    2016-01-01

    We propose in this paper an overall ten-step methodology dedicated to the analysis and quantification of reliability during the design phase of a mechatronic system, considered as a complex system. The ten steps of the methodology are detailed according to the downward side of the V-development cycle usually used for the design of complex systems. Two complementary phases of analysis cover the ten steps: qualitative analysis and quantitative analysis. The qualitative phase analyzes the functional and dysfunctional behavior of the system and then determines its different failure modes and degradation states, based on external and internal functional analysis, organic and physical implementation, and dependencies between components, with consideration of customer specifications and the mission profile. The quantitative phase is used to calculate the reliability of the system and its components, based on the qualitative behavior patterns and considering data gathering and processing and reliability targets. A systemic approach is used to calculate the reliability of the system, taking into account the different technologies of a mechatronic system (mechanics, electronics, electrical, etc.), dependencies and interactions between components, and external influencing factors. To validate the methodology, the ten steps are applied to an industrial system, the smart actuator of Pack'Aero Company. - Highlights: • A ten-step methodology for reliability prediction of mechatronic systems design. • Qualitative and quantitative analysis for reliability evaluation using PN and RBD. • A dependency matrix proposal, based on the collateral and functional interactions. • Models consider mission profile, deterioration, interactions and influencing factors. • Application and validation of the methodology on the “Smart Actuator” of PACK’AERO.
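
    A minimal sketch of the quantitative step using a reliability block diagram (RBD) of exponential components, as mentioned in the highlights; the failure rates and the series/parallel layout are assumptions, not Pack'Aero data:

        import numpy as np

        def R_exp(lam, t):
            # Component reliability with constant failure rate lam (per hour).
            return np.exp(-lam * t)

        def R_series(blocks):
            # Series structure: every block must survive.
            return np.prod(blocks, axis=0)

        def R_parallel(blocks):
            # Parallel (redundant) structure: at least one block survives.
            return 1.0 - np.prod(1.0 - np.asarray(blocks), axis=0)

        t = 8760.0                                          # one year, hours
        controller = R_exp(2e-6, t)
        sensor = R_parallel([R_exp(5e-6, t), R_exp(5e-6, t)])   # duplex sensors
        motor = R_exp(1e-5, t)
        print(R_series([controller, sensor, motor]))        # system reliability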

  10. AN AUTOMATIC AND METHODOLOGICAL APPROACH FOR ACCESSIBLE WEB APPLICATIONS

    Lourdes Moreno

    2007-06-01

    Semantic Web approaches try to achieve interoperability and communication among technologies and organizations. Nevertheless, it is sometimes forgotten that the Web must be useful for every user, so it is necessary to include tools and techniques that make the Semantic Web accessible. Accessibility and usability are two concepts usually treated together in web application development, yet their meanings are different. Usability concerns making use easy, whereas accessibility refers to the possibility of access. For the former, there are many well-proven approaches in real cases. The accessibility field, however, requires deeper research to make access feasible for disabled people, and also for novice non-disabled people, given the cost of automating and maintaining accessible applications. In this paper, we propose an architecture to achieve accessibility in web environments that deals with the WAI accessibility standard and the Universal Design paradigm. This architecture aims to control accessibility throughout the web application development life-cycle, following a methodology that starts from a semantic conceptual model and leans on description languages and controlled vocabularies.

  11. Modern methodology and applications in spatial-temporal modeling

    Matsui, Tomoko

    2015-01-01

    This book provides a modern introductory tutorial on specialized methodological and applied aspects of spatial and temporal modeling. The areas covered involve a range of topics which reflect the diversity of this domain of research across a number of quantitative disciplines. For instance, the first chapter deals with non-parametric Bayesian inference via a recently developed framework known as kernel mean embedding which has had a significant influence in machine learning disciplines. The second chapter takes up non-parametric statistical methods for spatial field reconstruction and exceedance probability estimation based on Gaussian process-based models in the context of wireless sensor network data. The third chapter presents signal-processing methods applied to acoustic mood analysis based on music signal analysis. The fourth chapter covers models that are applicable to time series modeling in the domain of speech and language processing. This includes aspects of factor analysis, independent component an...

  12. VaR Methodology Application for Banking Currency Portfolios

    Daniel Armeanu

    2007-02-01

    VaR has become the standard measure that financial analysts use to quantify market risk. VaR measures have many applications, such as in risk management, in evaluating the performance of risk takers, and for regulatory requirements; hence it is very important to develop methodologies that provide accurate estimates. In particular, the Basel Committee on Banking Supervision at the Bank for International Settlements requires financial institutions such as banks and investment firms to meet capital requirements based on VaR estimates. In this paper we determine VaR for a banking currency portfolio, respecting the rules of the National Bank of Romania regarding VaR reporting.
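
    A minimal sketch of parametric (variance-covariance) VaR for a two-currency position; the returns, weights and quantile below are illustrative assumptions, not the National Bank of Romania's reporting inputs:

        import numpy as np

        # Simulated daily currency returns standing in for historical data.
        rng = np.random.default_rng(3)
        returns = rng.multivariate_normal(mean=[0.0, 0.0],
                                          cov=[[1.0e-4, 4.0e-5],
                                               [4.0e-5, 2.25e-4]], size=500)

        value = 1_000_000.0                     # portfolio value, EUR
        w = np.array([0.6, 0.4])                # exposure weights per currency
        sigma_p = np.sqrt(w @ np.cov(returns.T) @ w)   # daily portfolio volatility
        z99 = 2.326                             # one-sided 99% normal quantile
        print(f"1-day 99% VaR: {z99 * sigma_p * value:,.0f} EUR")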

  13. Design Requirements for Communication-Intensive Interactive Applications

    Bolchini, Davide; Garzotto, Franca; Paolini, Paolo

    Online interactive applications call for new requirements paradigms to capture the growing complexity of computer-mediated communication. Crafting successful interactive applications (such as websites and multimedia) involves modeling the requirements for the user experience, including those leading to content design, usable information architecture and interaction, in profound coordination with the communication goals of all stakeholders involved, ranging from persuasion to social engagement to calls for action. To face this grand challenge, we propose a methodology for modeling communication requirements and provide a set of operational conceptual tools to be used in complex projects with multiple stakeholders. Through examples from real-life projects and lessons learned from direct experience, we draw on the concepts of brand, value, communication goals, and information and persuasion requirements to systematically guide analysts in mastering the multifaceted connections of these elements as drivers of successful communication designs.

  14. Modelling Safe Interface Interactions in Web Applications

    Brambilla, Marco; Cabot, Jordi; Grossniklaus, Michael

    Current Web applications embed sophisticated user interfaces and business logic. The original interaction paradigm of the Web based on static content pages that are browsed by hyperlinks is, therefore, not valid anymore. In this paper, we advocate a paradigm shift for browsers and Web applications, that improves the management of user interaction and browsing history. Pages are replaced by States as basic navigation nodes, and Back/Forward navigation along the browsing history is replaced by a full-fledged interactive application paradigm, supporting transactions at the interface level and featuring Undo/Redo capabilities. This new paradigm offers a safer and more precise interaction model, protecting the user from unexpected behaviours of the applications and the browser.

  15. Complex basis functions for molecular resonances: Methodology and applications

    White, Alec; McCurdy, C. William; Head-Gordon, Martin

    The computation of positions and widths of metastable electronic states is a challenge for molecular electronic structure theory because, in addition to the difficulty of the many-body problem, such states obey scattering boundary conditions. These resonances cannot be addressed with naïve application of traditional bound state electronic structure theory. Non-Hermitian electronic structure methods employing complex basis functions are one way that we may rigorously treat resonances within the framework of traditional electronic structure theory. In this talk, I will discuss our recent work in this area including the methodological extension from single determinant SCF-based approaches to highly correlated levels of wavefunction-based theory such as equation of motion coupled cluster and many-body perturbation theory. These approaches provide a hierarchy of theoretical methods for the computation of positions and widths of molecular resonances. Within this framework, we may also examine properties of resonances including the dependence of these parameters on molecular geometry. Some applications of these methods to temporary anions and dianions will also be discussed.
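
    In such non-Hermitian approaches a resonance appears as a complex eigenvalue E = E_r - iGamma/2 of a complex-symmetric Hamiltonian matrix. The sketch below uses a purely illustrative 2x2 matrix (not a real molecular Hamiltonian) to show how position and width are read off, assuming the matrix already incorporates the complex basis:

```python
import numpy as np

# Illustrative complex-symmetric Hamiltonian in some small basis (hartree);
# the entries are invented for demonstration, not taken from a real system.
H = np.array([[0.50 - 0.002j, 0.01],
              [0.01,          0.70 - 0.050j]])

# Complex-symmetric matrices are diagonalized with eig (not eigh).
eigvals, _ = np.linalg.eig(H)

for E in eigvals:
    position = E.real          # resonance position E_r
    width = -2.0 * E.imag      # width Gamma, since E = E_r - i*Gamma/2
    print(f"E_r = {position:.4f}, Gamma = {width:.4f}")
```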

  16. Spatial Development Modeling Methodology Application Possibilities in Vilnius

    Lina Panavaitė

    2017-05-01

    Full Text Available In order to control the continued development of high-rise buildings and their irreversible visual impact on the overall silhouette of the city, the great cities of the world have introduced new methodological principles into their spatial development models. These methodologies and spatial planning guidelines focus not only on the controlled development of high-rise buildings, but on the spatial modelling of the whole city, defining the main development criteria and estimating possible consequences. Vilnius is no exception; however, the re-establishment of Lithuania's independence set off an uncontrolled urbanization process, so most of the city's development regulations emerged as a consequence of unmanaged legalization of investors' expectations. The importance of a consistent urban fabric, as well as the conservation and representation of the city's most important objects, gained attention only when an actual threat emerged of overshadowing them with new architecture, alongside unmanaged urbanization in the city center and land-use-driven urban sprawl in suburbia. Current Vilnius spatial planning documents clearly define the urban structure and key development principles; however, the definitions are relatively abstract, producing uniform building coverage requirements for territories with distinct qualities and simplified planar designs which do not meet quality standards. The overall quality of urban architecture is not regulated. The article deals with current spatial modeling methods, their individual parts, principles, the criteria for quality assessment and their applicability in Vilnius. The text contains an outline of possible building coverage regulations and impact assessment criteria for new development. The article contains a compendium of requirements for high-quality spatial planning and building design.

  17. Drug-targeting methodologies with applications: A review

    Kleinstreuer, Clement; Feng, Yu; Childress, Emily

    2014-01-01

    Targeted drug delivery to solid tumors is a very active research area, focusing mainly on improved drug formulation and associated best delivery methods/devices. Drug-targeting has the potential to greatly improve drug-delivery efficacy, reduce side effects, and lower the treatment costs. However, the vast majority of drug-targeting studies assume that the drug-particles are already at the target site or at least in its direct vicinity. In this review, drug-delivery methodologies, drug types and drug-delivery devices are discussed with examples in two major application areas: (1) inhaled drug-aerosol delivery into human lung-airways; and (2) intravascular drug-delivery for solid tumor targeting. The major problem addressed is how to efficiently deliver the drug-particles from the entry/infusion point to the target site. So far, most experimental results are based on animal studies. Concerning pulmonary drug delivery, the focus is on the pros and cons of three inhaler types, i.e., pressurized metered dose inhaler, dry powder inhaler and nebulizer, in addition to drug-aerosol formulations. Computational fluid-particle dynamics techniques and the underlying methodology for a smart inhaler system are discussed as well. Concerning intravascular drug-delivery for solid tumor targeting, passive and active targeting are reviewed as well as direct drug-targeting, using optimal delivery of radioactive microspheres to liver tumors as an example. The review concludes with suggestions for future work, considering both pulmonary drug targeting and direct drug delivery to solid tumors in the vascular system. PMID:25516850

  18. Application of Agent Methodology in Healthcare Information Systems

    Reem Abdalla

    2017-02-01

    Full Text Available This paper presents a case study describing the features and phases of two agent methodologies: Gaia, a methodology for agent-oriented analysis and design, and Tropos, a detailed agent-oriented software engineering methodology. We explore each methodology's ability to present solutions for small problems, and we attempt to discover whether each methodology is in fact understandable and usable. In addition, during the analysis of each methodology and the relationships among its models, we collected notes on the advantages and weaknesses of these methodologies. The Guardian Angel: Patient-Centered Health Information System (GA: PCHIS), a personal system to help track, manage, and interpret the subject's health history and give advice to both patient and provider, is used as the case study throughout the paper.

  19. Multimodal interaction in image and video applications

    Sappa, Angel D

    2013-01-01

    Traditional Pattern Recognition (PR) and Computer Vision (CV) technologies have mainly focused on full automation, even though full automation often proves elusive or unnatural in many applications, where the technology is expected to assist rather than replace the human agents. However, not all problems can be solved automatically; in some applications, human interaction is the only way to tackle them. Recently, multimodal human interaction has become an important field of increasing interest in the research community. Advanced man-machine interfaces with high cognitive capabilities are a hot research topic that aims at solving challenging problems in image and video applications. Indeed, the idea of interactive computer systems was already proposed in the early stages of computer science. Nowadays, the ubiquity of image sensors together with ever-increasing computing performance has opened new and challenging opportunities for research in multimodal human interaction. This book aims to show how existi...

  20. Electron-molecule interactions and their applications

    Christophorou, L G

    1984-01-01

    Electron-Molecule Interactions and Their Applications, Volume 2 provides a balanced and comprehensive account of electron-molecule interactions in dilute and dense gases and liquid media. This book consists of six chapters. Chapter 1 deals with electron transfer reactions, while Chapter 2 discusses electron-molecular positive-ion recombination. The electron motion in high-pressure gases and electron-molecule interactions from single- to multiple-collision conditions are discussed in Chapter 3. In Chapter 4, knowledge on electron-molecule interactions in gases is linked to that on similar proc

  1. FPGA Design Methodologies Applicable to Nuclear Power Plants

    Kwong, Yongil; Jeong, Choongheui

    2013-01-01

    In order to solve the above problem, NPPs in some countries such as the US, Canada and Japan have already applied FPGA-based equipment, which has the following advantages: it is easier to verify its performance, because only HDL code is needed to configure logic circuits, without other software, compared to microprocessor-based equipment; it is much cheaper than an ASIC in small quantities; its logic circuits are reconfigurable; it has enough resources, such as logic blocks and memory blocks, to implement I&C functions; multiple functions can be implemented in a single FPGA chip; it is stronger with respect to cyber security than microprocessor-based equipment, because its configuration cannot be changed by external access; it is simple to replace with a new one when it becomes obsolete; and its power consumption is lower. However, FPGA-based equipment does not have only merits; there are some issues with its application to NPPs. First of all, there is much less experience in applying it to NPPs than to other industries, and international standards or guidelines are also very few. There are only a small number of FPGA platforms for I&C systems. Finally, specific guidelines on FPGA design are required because the design has both hardware and software characteristics. In order to handle the above issues, KINS (Korea Institute of Nuclear Safety) built a test platform last year and has developed regulatory guidelines for FPGA application in NPPs. I&C systems of NPPs have been increasingly using FPGA-based equipment as an alternative to microprocessor-based equipment, which is not simple to evaluate for safety due to its complexity. This paper explains the FPGA design flow and design guidelines. These methodologies can be used as guidelines on FPGA verification for the safety of I&C systems.

  2. Application of System Dynamics Methodology in Population Analysis

    August Turina

    2009-09-01

    Full Text Available The goal of this work is to present the application of system dynamics and systems thinking, as well as the advantages and possible shortcomings of this analytic approach, in order to improve the analysis of complex systems such as populations and, thereby, to monitor more effectively the underlying causes of migrations. This methodology has long been present in interdisciplinary scientific circles, but its scientific contribution has not been sufficiently applied in analytical practice in Croatia. Namely, the major part of system analysis is focused on detailed complexity rather than on dynamic complexity. Generally, the science of complexity deals with emergence, innovation, learning and adaptation. Complexity is viewed according to the number of system components, or through a number of combinations that must be continually analyzed in order to understand the system and consequently make adequate decisions. Simulations containing thousands of variables and complex arrays of details distract overall attention from the basic causal patterns and key inter-relations emerging and prevailing within an analyzed population. Systems thinking offers a holistic and integral perspective for observation of the world.
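
    At its core, a system dynamics population model reduces to stocks and flows. The minimal sketch below (all rates and the starting population are invented for illustration, not taken from the article) tracks one stock, population, driven by births, deaths, and net migration; feedback loops could be added by making the rates depend on the stock itself:

```python
# Minimal stock-and-flow sketch of a population system (rates hypothetical):
# one stock (population) and three flows (births, deaths, net migration).
def simulate_population(p0, birth_rate, death_rate, net_migration, years):
    population = [p0]
    for _ in range(years):
        p = population[-1]
        births = birth_rate * p
        deaths = death_rate * p
        # Net migration modeled as an exogenous absolute flow per year.
        population.append(p + births - deaths + net_migration)
    return population

trajectory = simulate_population(
    p0=4_000_000, birth_rate=0.009, death_rate=0.013,
    net_migration=-15_000, years=30,
)
print(f"Population after 30 years: {trajectory[-1]:,.0f}")
```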

  3. Artificial Intelligence Methodologies and Their Application to Diabetes.

    Rigla, Mercedes; García-Sáez, Gema; Pons, Belén; Hernando, Maria Elena

    2018-03-01

    In the past decade diabetes management has been transformed by the addition of continuous glucose monitoring and insulin pump data. More recently, a wide variety of functions and physiologic variables, such as heart rate, hours of sleep, number of steps walked and movement, have become available through wristbands or watches. New data, such as hydration, geolocation, and barometric pressure, will be incorporated in the future. All these parameters, when analyzed, can be helpful for patients' and doctors' decision support. Similar new scenarios have appeared in most medical fields, in such a way that in recent years there has been increased interest in the development and application of the methods of artificial intelligence (AI) for decision support and knowledge acquisition. Multidisciplinary research teams integrating computer engineers and doctors are more and more frequent, mirroring the need for cooperation in this new topic. AI, as a science, can be defined as the ability to make computers do things that would require intelligence if done by humans. Increasingly, diabetes-related journals have been incorporating publications focused on AI tools applied to diabetes. In summary, diabetes management scenarios have undergone a deep transformation that forces diabetologists to incorporate skills from new areas. This recently needed knowledge includes AI tools, which have become part of diabetes health care. The aim of this article is to explain in an easy and plain way the most used AI methodologies, to promote the involvement of health care providers, doctors and nurses, in this field.

  4. PROLIFERATION RESISTANCE AND PHYSICAL PROTECTION WORKING GROUP: METHODOLOGY AND APPLICATIONS

    Bari R. A.; Whitlock, J.; Therios, I.U.; Peterson, P.F.

    2012-11-14

    We summarize the technical progress and accomplishments of the evaluation methodology for proliferation resistance and physical protection (PR and PP) of Generation IV nuclear energy systems. The results of evaluations performed with the methodology are intended for three types of users: system designers, program policy makers, and external stakeholders. The PR and PP Working Group developed the methodology through a series of demonstration and case studies. Over the past few years, various national and international groups have applied the methodology to nuclear energy system designs as well as to developing approaches to advanced safeguards.

  5. Proliferation resistance and physical protection working group: methodology and applications

    Bari, Robert A.; Whitlock, Jeremy J.; Therios, Ike U.; Peterson, P.F.

    2012-01-01

    We summarize the technical progress and accomplishments of the evaluation methodology for proliferation resistance and physical protection (PR and PP) of Generation IV nuclear energy systems. The results of evaluations performed with the methodology are intended for three types of users: system designers, program policy makers, and external stakeholders. The PR and PP Working Group developed the methodology through a series of demonstration and case studies. Over the past few years, various national and international groups have applied the methodology to nuclear energy system designs as well as to developing approaches to advanced safeguards.

  6. Nucleon-nucleon interactions via Lattice QCD: Methodology. HAL QCD approach to extract hadronic interactions in lattice QCD

    Aoki, Sinya

    2013-07-01

    We review the potential method in lattice QCD, which has recently been proposed to extract nucleon-nucleon interactions via numerical simulations. We focus on the methodology of this approach by emphasizing the strategy of the potential method, the theoretical foundation behind it, and special numerical techniques. We compare the potential method with the standard finite volume method in lattice QCD, in order to make the pros and cons of the approach clear. We also present several numerical results for nucleon-nucleon potentials.

  7. Application of systematic review methodology to the field of nutrition.

    Lichtenstein, Alice H; Yetley, Elizabeth A; Lau, Joseph

    2008-12-01

    Systematic reviews represent a rigorous and transparent approach to synthesizing scientific evidence that minimizes bias. They evolved within the medical community to support development of clinical and public health practice guidelines, set research agendas, and formulate scientific consensus statements. The use of systematic reviews for nutrition-related topics is more recent. Systematic reviews provide independently conducted comprehensive and objective assessments of available information addressing precise questions. This approach to summarizing available data is a useful tool for identifying the state of the science, including knowledge gaps and associated research needs, supporting development of science-based recommendations and guidelines, and serving as the foundation for updates as new data emerge. Our objective is to describe the steps for performing systematic reviews and highlight areas unique to the discipline of nutrition that are important to consider in data assessment. The steps involved in generating systematic reviews include identifying staffing and planning for outside expert input, forming a research team, developing an analytic framework, developing and refining research questions, defining eligibility criteria, identifying search terms, screening abstracts according to eligibility criteria, retrieving articles for evaluation, constructing evidence and summary tables, assessing methodological quality and applicability, and synthesizing results including performing meta-analysis, if appropriate. Unique and at times challenging, nutrition-related considerations include baseline nutrient exposure, nutrient status, bioequivalence of bioactive compounds, bioavailability, multiple and interrelated biological functions, undefined nature of some interventions, and uncertainties in intake assessment. Systematic reviews are a valuable and independent component of decision-making processes by groups responsible for developing science-based recommendations.

  8. Methodology of development and students' perceptions of a psychiatry educational smartphone application.

    Zhang, Melvyn W B; Ho, Cyrus S H; Ho, Roger C M

    2014-01-01

    The use of smartphones and smartphone applications has become markedly more prevalent in the past decade. Previous research has highlighted the lack of critical appraisal of new applications, and has described a method of creating an application using just an Internet browser and a text editor, although this does not eliminate the challenges clinicians face. Moreover, even though smartphone application usage and acceptance are high, it is common knowledge that developing smartphone applications catered to clinicians' needs, and helpful for their daily educational activities, would cost them and their centers a great deal. The objectives of the current research are thus to highlight a cost-effective methodology for developing interactive educational smartphone applications, and to determine whether medical students are receptive towards having smartphone applications, along with their perspectives on the contents within. In this study, we elaborate how the Mastering Psychiatry online portal and web-based mobile application were developed using HTML5 as the core programming language. The online portal and web-based application were launched in July 2012 and usage data were obtained. Subsequently, a native application was developed, funded by an educational grant, and students were recruited after their end-of-posting clinical examination to fill in a survey questionnaire on their perspectives. Our initial analytical results show that, since inception, the online portal has received a total of 15,803 views, with a total of 2,109 copies of the online textbook downloaded. As for the online videos, 5,895 viewers watched the training videos from start to finish, and 722 users accessed the mobile textbook application. A total of 185 students participated in the perspectives survey, with the majority having positive perspectives about the

  9. Analysis of Feedback processes in Online Group Interaction: a methodological model

    Anna Espasa

    2013-06-01

    Full Text Available The aim of this article is to present a methodological model for analyzing students' group interaction to improve their essays in online learning environments based on asynchronous, written communication. In these environments, teacher and student scaffolds for discussion are essential to promote interaction, and one of these scaffolds can be feedback. Research on feedback processes has predominantly focused on feedback design rather than on how students utilize feedback to improve learning. This methodological model fills this gap, contributing to the analysis of how feedback processes are implemented while students discuss collaboratively in the specific case of writing assignments. A review of different methodological models was carried out to define a framework adjusted to the analysis of the relationship between written, asynchronous group interaction and students' activity and the changes incorporated into the final text. The proposed model includes the following dimensions: (1) student participation, (2) nature of student learning, and (3) quality of student learning. The main contribution of this article is to present the methodological model and also to ascertain the model's operativity regarding how students incorporate such feedback into their essays.

  10. Analysis of damaged DNA / proteins interactions: Methodological optimizations and applications to DNA lesions induced by platinum anticancer drugs; Analyse des interactions ADN lese / proteines: Optimisations methodologiques et applications aux dommages de l'ADN engendres par les derives du platine

    Bounaix Morand du Puch, Ch

    2010-10-15

    DNA lesions contribute to the alteration of DNA structure, thereby inhibiting essential cellular processes. Such alterations may be beneficial for chemotherapies, for example in the case of platinum anticancer agents. They generate bulky adducts that, if not repaired, ultimately cause apoptosis. A better understanding of the biological response to such molecules can be obtained through the study of proteins that directly interact with the lesions. These proteins constitute the DNA lesions interactome. This thesis presents the development of tools aiming at extending the list of platinum adduct-associated proteins. Firstly, we designed a ligand fishing system made of damaged plasmids immobilized onto magnetic beads. Three platinum drugs were selected for our study: cisplatin, oxaliplatin and satraplatin. Following exposure of the trap to nuclear extracts from HeLa cancer cells and identification of the retained proteins by proteomics, we obtained already known candidates (HMGB1, hUBF, the FACT complex) but also 29 new members of the platinated-DNA interactome. Among them, we noted the presence of PNUTS, TOX4 and WDR82, which associate to form the recently discovered PTW/PP complex. Their capture was then confirmed with a second model, namely the breast cancer cell line MDA MB 231, and the biological consequences of such an interaction now need to be elucidated. Secondly, we adapted an SPRi biochip to the study of platinum-damaged DNA/protein interactions. The affinity of HMGB1 and the newly characterized TOX4 for adducts generated by our three platinum drugs could be validated thanks to the biochip. Finally, we used our tools, as well as analytical chemistry and biochemistry methods, to evaluate the role of DDB2 (a factor involved in the recognition of UV-induced lesions) in the repair of cisplatin adducts. Our experiments using MDA MB 231 cells differentially expressing DDB2 showed that this protein is not responsible for the repair of platinum damage. Instead, it appears to act

  11. Validation of seismic soil structure interaction (SSI) methodology for a UK PWR nuclear power station

    Llambias, J.M.

    1993-01-01

    The seismic loading information for use in the seismic design of equipment and minor structures within a nuclear power plant is determined from a dynamic response analysis of the building in which they are located. This dynamic response analysis needs to capture the global response of both the building structure and the adjacent soil and is commonly referred to as a soil structure interaction (SSI) analysis. NNC have developed a simple and cost-effective methodology for the seismic SSI analysis of buildings in a PWR nuclear power station at a UK soft site. This paper outlines the NNC methodology and describes the approach adopted for its validation

  12. Interactive computer graphics applications for compressible aerodynamics

    Benson, Thomas J.

    1994-01-01

    Three computer applications have been developed to solve inviscid compressible fluids problems using interactive computer graphics. The first application is a compressible flow calculator which solves for isentropic flow, normal shocks, and oblique shocks or centered expansions produced by two dimensional ramps. The second application couples the solutions generated by the first application to a more graphical presentation of the results to produce a desk top simulator of three compressible flow problems: 1) flow past a single compression ramp; 2) flow past two ramps in series; and 3) flow past two opposed ramps. The third application extends the results of the second to produce a design tool which solves for the flow through supersonic external or mixed compression inlets. The applications were originally developed to run on SGI or IBM workstations running GL graphics. They are currently being extended to solve additional types of flow problems and modified to operate on any X-based workstation.
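
    The first application's core, normal-shock computation, follows directly from the standard perfect-gas relations (gamma = 1.4 for air). The sketch below is a generic gas-dynamics illustration, not the tool's actual source code:

```python
import math

GAMMA = 1.4  # ratio of specific heats for air

def normal_shock(m1, g=GAMMA):
    """Downstream Mach number and static ratios across a normal shock."""
    if m1 <= 1.0:
        raise ValueError("Normal shock requires supersonic upstream flow")
    m2 = math.sqrt((1 + 0.5 * (g - 1) * m1**2) / (g * m1**2 - 0.5 * (g - 1)))
    p_ratio = 1 + 2 * g / (g + 1) * (m1**2 - 1)            # p2 / p1
    rho_ratio = ((g + 1) * m1**2) / ((g - 1) * m1**2 + 2)  # rho2 / rho1
    t_ratio = p_ratio / rho_ratio                          # T2 / T1 (ideal gas)
    return m2, p_ratio, rho_ratio, t_ratio

m2, p21, r21, t21 = normal_shock(2.0)
print(f"M2 = {m2:.3f}, p2/p1 = {p21:.3f}, "
      f"rho2/rho1 = {r21:.3f}, T2/T1 = {t21:.3f}")
```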

  13. Systems selection methodology for civil nuclear power applications

    Scarborough, J.

    1988-01-01

    A methodology for evaluation and selection of a preferred Advanced Small or Medium Power Reactor (SMPR) for commercial electric power generation is discussed, and an illustrative example is presented with five US Advanced SMPR power plants. The evaluation procedure was developed from a methodology for ranking small, advanced nuclear power plant designs under development by the US Department of Energy (DOE) and Department of Defense (DOD). The methodology involves establishing numerical probability distributions for each of fifteen evaluation criteria for each Advanced SMPR plant. A resultant single probability distribution with its associated numerical mean value is then developed for each Advanced SMPR plant by Monte Carlo sampling techniques in order that each plant may be ranked with an associated statement of certainty. The selection methodology is intended as a screening procedure for commercial offerings to preclude detailed technical and commercial assessments from being conducted for those offerings which do not meet the initial screening criteria
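
    The ranking scheme described in the abstract reduces to sampling each criterion's probability distribution and aggregating the draws. Below is a hedged sketch with invented plants, criterion weights, and triangular score distributions (none taken from the paper, which uses fifteen criteria rather than three):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical (low, mode, high) score distributions for 3 criteria per
# candidate plant; a real evaluation would define all fifteen criteria.
plants = {
    "Plant A": [(6, 8, 9), (5, 7, 9), (4, 6, 8)],
    "Plant B": [(5, 6, 8), (6, 8, 9), (5, 7, 9)],
}
weights = np.array([0.5, 0.3, 0.2])  # illustrative criterion weights

def sample_overall(criteria, n=100_000):
    scores = np.column_stack(
        [rng.triangular(lo, mode, hi, size=n) for lo, mode, hi in criteria]
    )
    return scores @ weights  # weighted overall score per Monte Carlo trial

results = {name: sample_overall(c) for name, c in plants.items()}
for name, dist in results.items():
    print(f"{name}: mean = {dist.mean():.2f}, 5th-95th pct = "
          f"{np.percentile(dist, 5):.2f}-{np.percentile(dist, 95):.2f}")
# P(A beats B) gives the 'associated statement of certainty' for the ranking.
p = (results["Plant A"] > results["Plant B"]).mean()
print(f"P(Plant A ranks above Plant B) = {p:.2f}")
```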

  14. Application of precursor methodology in initiating frequency estimates

    Kohut, P.; Fitzpatrick, R.G.

    1991-01-01

    The precursor methodology developed in recent years provides a consistent technique to identify important accident sequence precursors. It relies on operational events (extracting information from actual experience) and infers core damage scenarios based on expected safety system responses. The ranking or categorization of each precursor is determined by considering the full spectrum of potential core damage sequences. The methodology estimates the frequency of severe core damage based on the approach suggested by Apostolakis and Mosleh, which may lead to a potential overestimation of the severe-accident sequence frequency due to the inherent dependencies between the safety systems and the initiating events. The methodology is an encompassing attempt to incorporate most of the operating information available from nuclear power plants and is an attractive tool from the point of view of risk management. In this paper, a further extension of this methodology is discussed with regard to the treatment of initiating frequency of the accident sequences

  15. Systems selection methodology for civil nuclear power applications

    Scarborough, J.C.

    1987-01-01

    A methodology for evaluation and selection of a preferred Advanced Small or Medium Power Reactor (SMPR) for commercial electric power generation is discussed, and an illustrative example is presented with five U.S. Advanced SMPR power plants. The evaluation procedure was developed from a methodology for ranking small, advanced nuclear power plant designs under development by the U.S. Department of Energy (DOE) and Department of Defense (DOD). The methodology involves establishing numerical probability distributions for each of fifteen evaluation criteria for each Advanced SMPR plant. A resultant single probability distribution with its associated numerical mean value is then developed for each Advanced SMPR plant by Monte Carlo sampling techniques in order that each plant may be ranked with an associated statement of certainty. The selection methodology is intended as a screening procedure for commercial offerings to preclude detailed technical and commercial assessments from being conducted for those offerings which do not meet the initial screening criteria. (author)

  16. Methodology for the collection and application of information on food ...

    S Blignaut

    ISSN 0378-5254. Journal of Family Ecology and Consumer Sciences, Vol 26, No 2, 1998, p. 89. Food preference therefore indicates an individual's personal motivation ...

  17. Methodology of Neural Design: Applications in Microwave Engineering

    Z. Raida

    2006-06-01

    Full Text Available In the paper, an original methodology for the automatic creation of neural models of microwave structures is proposed and verified. Following the methodology, neural models of prescribed accuracy are built within minimal CPU time. The validity of the proposed methodology is verified by developing neural models of selected microwave structures. The functionality of the neural models is verified in a design task: a neural model is joined with a genetic algorithm to find the global minimum of a formulated objective function. The objective function is minimized using different versions of genetic algorithms and their mutual combinations. The verified methodology for the automated creation of accurate neural models of microwave structures, and their association with global optimization routines, are the most important original features of the paper.
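
    The pairing of a neural surrogate with a genetic optimizer can be sketched compactly. The toy below is not the paper's method: the "simulation" is an invented analytic function, the surrogate is a one-hidden-layer network trained extreme-learning-machine style (random hidden weights, least-squares output layer) for brevity, and the genetic algorithm is a bare-bones selection/mutation loop:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for an expensive EM simulation of a microwave structure:
# maps a 2-parameter design vector to a scalar response (purely illustrative).
def em_simulation(x):
    return np.sin(3 * x[..., 0]) + (x[..., 1] - 0.5) ** 2

# --- 1. Train a one-hidden-layer neural surrogate on sampled designs ---
X = rng.uniform(0, 1, size=(200, 2))
y = em_simulation(X)
W = rng.normal(size=(2, 50))                  # random hidden-layer weights
b = rng.normal(size=50)
H = np.tanh(X @ W + b)                        # hidden activations
beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # least-squares output weights

def surrogate(x):
    return np.tanh(x @ W + b) @ beta          # fast neural approximation

# --- 2. Genetic algorithm minimizes the surrogate, not the slow simulator ---
pop = rng.uniform(0, 1, size=(40, 2))
for _ in range(100):
    fitness = surrogate(pop)
    parents = pop[np.argsort(fitness)[:20]]              # selection (elitism)
    children = parents[rng.integers(0, 20, 20)] \
        + rng.normal(0, 0.05, (20, 2))                   # mutation
    pop = np.vstack([parents, np.clip(children, 0, 1)])

best = pop[np.argmin(surrogate(pop))]
print("best design:", best, "surrogate value:", float(surrogate(best)))
```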

  18. Physical protection evaluation methodology program development and application

    Seo, Janghoon; Yoo, Hosik [Korea Institute of Nuclear Non-proliferation and Control, Daejeon (Korea, Republic of)

    2015-10-15

    It is essential to develop a reliable physical protection evaluation methodology for applying physical protection concepts at the design stage. The methodology can be used to assess weak points and improve performance not only at the design stage but also for nuclear facilities in operation. Analyzing the physical protection properties of nuclear facilities is not a trivial task, since there are many interconnected factors affecting overall performance. Therefore, several international projects have been organized to develop a systematic physical protection evaluation methodology. The INPRO (International Project on Innovative Nuclear Reactors and Fuel Cycles) and GIF PRPP (Generation IV International Forum Proliferation Resistance and Physical Protection) methodologies are among the most well-known evaluation methodologies. INPRO adopts a checklist type of questionnaire and has a strong point in analyzing the overall characteristics of facilities in a qualitative way. The COMPRE program has been developed to help general users apply the COMPRE methodology to nuclear facilities. In this work, the development of the COMPRE program and a case study of a hypothetical nuclear facility are presented. The case study shows that the COMPRE PP methodology can be a useful tool to assess the overall physical protection performance of nuclear facilities. To obtain meaningful results from the COMPRE PP methodology, detailed information and comprehensive analysis are required. In particular, it is not trivial to calculate reliable values for PPSE (Physical Protection System Effectiveness) and C (Consequence), while it is relatively straightforward to evaluate LI (Legislative and Institutional framework), MC (Material Control) and HR (Human Resources). To obtain a reliable PPSE value, comprehensive information about the physical protection system, vital area analysis and realistic threat scenario assessment are required. Like

  19. Physical protection evaluation methodology program development and application

    Seo, Janghoon; Yoo, Hosik

    2015-01-01

    It is essential to develop a reliable physical protection evaluation methodology for applying physical protection concepts at the design stage. The methodology can be used to assess weak points and improve performance not only at the design stage but also for nuclear facilities in operation. Analyzing the physical protection properties of nuclear facilities is not a trivial task, since there are many interconnected factors affecting overall performance. Therefore, several international projects have been organized to develop a systematic physical protection evaluation methodology. The INPRO (International Project on Innovative Nuclear Reactors and Fuel Cycles) and GIF PRPP (Generation IV International Forum Proliferation Resistance and Physical Protection) methodologies are among the most well-known evaluation methodologies. INPRO adopts a checklist type of questionnaire and has a strong point in analyzing the overall characteristics of facilities in a qualitative way. The COMPRE program has been developed to help general users apply the COMPRE methodology to nuclear facilities. In this work, the development of the COMPRE program and a case study of a hypothetical nuclear facility are presented. The case study shows that the COMPRE PP methodology can be a useful tool to assess the overall physical protection performance of nuclear facilities. To obtain meaningful results from the COMPRE PP methodology, detailed information and comprehensive analysis are required. In particular, it is not trivial to calculate reliable values for PPSE (Physical Protection System Effectiveness) and C (Consequence), while it is relatively straightforward to evaluate LI (Legislative and Institutional framework), MC (Material Control) and HR (Human Resources). To obtain a reliable PPSE value, comprehensive information about the physical protection system, vital area analysis and realistic threat scenario assessment are required. Like

  20. Application of theoretical and methodological components of nursing care

    Rosa del Socorro Morales-Aguilar; Gloria Elena Lastre-Amell; Alba Cecilia Pardo-Vásquez

    2016-01-01

    Introduction: the theoretical and methodological components constitute the proper expertise of nursing; they refer to models, theories, the care process, the taxonomy of nursing diagnoses, the nursing intervention classification system, and the outcomes classification system, which ground nursing care in professional practice. Methodology: research was performed on Google Scholar, reviewing the databases of Scielo, Ciberindex, Index Enfermería, Dialnet, Redalyc and Medline, identifying 70 published articles b...

  1. Flux Measurements in Trees: Methodological Approach and Application to Vineyards

    Francesca De Lorenzi

    2008-03-01

    Full Text Available In this paper a review of two sap flow methods for measuring transpiration in vineyards is presented. The objective of this work is to examine the potential of detecting transpiration in trees in response to environmental stresses, particularly high concentrations of ozone (O3) in the troposphere. The methods described are the stem heat balance and the thermal dissipation probe; advantages and disadvantages of each method are detailed. Applications of both techniques are shown in two large commercial vineyards in Southern Italy (Apulia and Sicily) subject to a semi-arid climate. Sap flow techniques measure transpiration at the plant scale, and an upscaling procedure is necessary to calculate transpiration at the whole-stand level. Here a general technique to link the value of transpiration at the plant level to the canopy value is presented, based on experimental relationships between transpiration and biometric characteristics of the trees. In both vineyards, transpiration measured by sap flow methods compares well with evapotranspiration measured by micrometeorological techniques at the canopy scale. Moreover, the soil evaporation component has been quantified. In conclusion, comments are given on the suitability of sap flow methods for studying the interactions between trees and ozone.

  2. Laser-plasma interactions and applications

    Neely, David; Bingham, Robert; Jaroszynski, Dino

    2013-01-01

    Laser-Plasma Interactions and Applications covers the fundamental and applied aspects of high power laser-plasma physics. With an internationally renowned team of authors, the book broadens the knowledge of young researchers working in high power laser-plasma science by providing them with a thorough pedagogical grounding in the interaction of laser radiation with matter, laser-plasma accelerators, and inertial confinement fusion. The text is organised such that the theoretical foundations of the subject are discussed first, in Part I. In Part II, topics in the area of high energy density physics are covered. Parts III and IV deal with the applications to inertial confinement fusion and as a driver of particle and radiation sources, respectively. Finally, Part V describes the principal diagnostic, targetry, and computational approaches used in the field. This book is designed to give students a thorough foundation in the fundamental physics of laser-plasma interactions. It will also provide readers with knowl...

  3. Applicability and methodology of determining sustainable yield in groundwater systems

    Kalf, Frans R. P.; Woolley, Donald R.

    2005-03-01

    There is currently a need for a review of the definition and methodology of determining sustainable yield. The reasons are: (1) current definitions and concepts are ambiguous and non-physically based so cannot be used for quantitative application, (2) there is a need to eliminate varying interpretations and misinterpretations and provide a sound basis for application, (3) the notion that all groundwater systems either are or can be made to be sustainable is invalid, (4) often there are an excessive number of factors bound up in the definition that are not easily quantifiable, (5) there is often confusion between production facility optimal yield and basin sustainable yield, (6) in many semi-arid and arid environments groundwater systems cannot be sensibly developed using a sustained yield policy particularly where ecological constraints are applied. Derivation of sustainable yield using conservation of mass principles leads to expressions for basin sustainable, partial (non-sustainable) mining and total (non-sustainable) mining yields that can be readily determined using numerical modelling methods and selected on the basis of applied constraints. For some cases there has to be recognition that the groundwater resource is not renewable and its use cannot therefore be sustainable. In these cases, its destiny should be the best equitable use.

  4. New quickest transient detection methodology. Nuclear engineering applications

    Wang, Xin; Jevremovic, Tatjana; Tsoukalas, Lefteri H.

    2003-01-01

    A new intelligent systems methodology for quickest online transient detection is presented. Based on information that includes, but is not limited to, statistical features, the energy of frequency components, and wavelet coefficients, the new methodology decides whether a transient has emerged. A fuzzy system makes the final decision; its membership functions are obtained by artificial neural networks and adjusted in an online manner. Comparisons are performed with conventional methods for transient detection using simulated and plant data. The proposed methodology could be useful in power plant operations, diagnostics and maintenance activities. It is also considered as a design tool for quick design modifications in a virtual design environment aimed at next generation University Research and Training Reactors (URTRs). (The virtual design environment is pursued as part of the Big-10 Consortium under the new Innovations in Nuclear Infrastructure and Education (INIE) program sponsored by the US Department of Energy.) (author)
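
    Setting the fuzzy decision layer aside, the core of online transient detection is computing running features over a sliding window and flagging when they cross a tuned threshold. The sketch below is a hedged illustration on synthetic data; the signal, window length, and threshold are all invented, and a crisp threshold stands in for the paper's fuzzy decision:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic plant signal: steady noise, then a ramp transient at t = 600.
signal = rng.normal(0.0, 0.1, 1000)
signal[600:] += np.linspace(0, 1.5, 400)

WINDOW = 50
ENERGY_THRESHOLD = 5.0  # would be tuned (or fuzzified) from plant data

def detect_transient(x, window=WINDOW, threshold=ENERGY_THRESHOLD):
    for t in range(window, len(x)):
        segment = x[t - window:t]
        energy = np.sum(segment**2)   # one simple feature; the paper also
        variance = np.var(segment)    # uses frequency/wavelet features
        if energy > threshold:
            return t, energy, variance
    return None

hit = detect_transient(signal)
if hit:
    t, energy, variance = hit
    print(f"transient flagged at sample {t} (window energy = {energy:.2f})")
```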

  5. The interaction of antibodies with lipid membranes unraveled by fluorescence methodologies

    Figueira, Tiago N.; Veiga, Ana Salomé; Castanho, Miguel A. R. B.

    2014-12-01

    The interest and investment in antibody therapies has reached an overwhelming scale in the last decade. Yet, little attention has been paid by the scientific community to unraveling important interactions of antibodies with biological structures other than their respective epitopes. Lipid membranes are particularly relevant in this regard, as they set the stage for protein-protein recognition, a concept potentially inclusive of antibody-antigen recognition. Fluorescence techniques allow experimental monitoring of protein partition between aqueous and lipid phases, deciphering events of adsorption, insertion and diffusion. This review focuses on the available fluorescence spectroscopy methodologies directed to the study of antibody-membrane interactions.

  6. Application of integrated fuzzy VIKOR & AHP methodology to contractor ranking

    Mohamad Rahim Ramezaniyan

    2012-08-01

    Full Text Available Contractor selection is a critical activity which plays an important role in the overall success of any construction project. The implementation of fuzzy multiple-criteria decision analysis (MCDA) in selecting contractors has the advantage of rendering subjective and implicit decision making more objective and transparent. An additional merit of fuzzy MCDA is the ability to accommodate both quantitative and qualitative information. In this paper, an integrated VIKOR-AHP methodology is proposed to make a selection among alternative contractors in an Iranian construction industry project. In the proposed methodology, the weights of the selection criteria are determined by fuzzy pairwise comparison matrices of the AHP.

  7. Software representation methodology for agile application development: An architectural approach

    Alejandro Paolo Daza Corredor

    2016-06-01

    Full Text Available The generation of Web applications involves the execution of repetitive tasks: determining information structures, generating different types of components, and finally carrying out deployment and tuning. In many applications of this type, identical components are generated from application to application. Current trends in software engineering such as MDE, MDA and MDD aim to automate the generation of applications by structuring a model and applying transformations to obtain the application. This document proposes an architectural foundation that facilitates the generation of these applications, relying on model-driven architecture without ignoring the existence and relevance of the existing trends mentioned in this summary.

  8. Model evaluation methodology applicable to environmental assessment models

    Shaeffer, D.L.

    1979-08-01

    A model evaluation methodology is presented to provide a systematic framework within which the adequacy of environmental assessment models might be examined. The necessity for such a tool is motivated by the widespread use of models for predicting the environmental consequences of various human activities and by the reliance on these model predictions for deciding whether a particular activity requires the deployment of costly control measures. Consequently, the uncertainty associated with prediction must be established for the use of such models. The methodology presented here consists of six major tasks: model examination, algorithm examination, data evaluation, sensitivity analyses, validation studies, and code comparison. This methodology is presented in the form of a flowchart to show the logical interrelatedness of the various tasks. Emphasis has been placed on identifying those parameters which are most important in determining the predictive outputs of a model. Importance has been attached to the process of collecting quality data. A method has been developed for analyzing multiplicative chain models when the input parameters are statistically independent and lognormally distributed. Latin hypercube sampling has been offered as a promising candidate for doing sensitivity analyses. Several different ways of viewing the validity of a model have been presented. Criteria are presented for selecting models for environmental assessment purposes
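
    One concrete piece of the abstract can be checked in a few lines: if a model is a multiplicative chain of independent lognormal inputs, the output is itself lognormal with summed log-parameters, and Latin hypercube sampling confirms this quickly. The sketch below is illustrative, with invented parameters rather than any from the report:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)

def latin_hypercube_lognormal(mu, sigma, n):
    """Stratified (Latin hypercube) draws from lognormal(mu, sigma)."""
    # One sample per equal-probability stratum, randomly placed and shuffled.
    u = (np.arange(n) + rng.uniform(size=n)) / n
    rng.shuffle(u)
    return np.exp(mu + sigma * norm.ppf(u))

# Multiplicative chain Y = X1 * X2 * X3 with invented lognormal parameters.
params = [(0.0, 0.5), (1.0, 0.3), (-0.5, 0.4)]
n = 10_000
y = np.ones(n)
for mu, sigma in params:
    y *= latin_hypercube_lognormal(mu, sigma, n)

# Analytic result: ln Y ~ Normal(sum of mus, sqrt of summed variances).
mu_y = sum(m for m, _ in params)
sd_y = np.sqrt(sum(s**2 for _, s in params))
print(f"sampled ln-mean {np.log(y).mean():.3f} vs analytic {mu_y:.3f}")
print(f"sampled ln-std  {np.log(y).std():.3f} vs analytic {sd_y:.3f}")
```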

  9. Meta-Analytical Studies in Transport Economics. Methodology and Applications

    Brons, M.R.E.

    2006-05-18

    Vast increases in the external costs of transport in the late twentieth century have caused national and international governmental bodies to worry about the sustainability of their transport systems. In this thesis we use meta-analysis as a research method to study various topics in transport economics that are relevant for sustainable transport policymaking. Meta-analysis is a research methodology that is based on the quantitative summarisation of a body of previously documented empirical evidence. In several fields of economics, meta-analysis has become a well-accepted research tool. Despite the appeal of the meta-analytical approach, there are methodological difficulties that need to be acknowledged. We study a specific methodological problem which is common in meta-analysis in economics, viz., within-study dependence caused by multiple sampling techniques. By means of Monte Carlo analysis we investigate the effect of such dependence on the performance of various multivariate estimators. In the applied part of the thesis we use and develop meta-analytical techniques to study the empirical variation in indicators of the price sensitivity of demand for aviation transport, the price sensitivity of demand for gasoline, the efficiency of urban public transport and the valuation of the external costs of noise from rail transport. We focus on the estimation of mean values for these indicators and on the identification of the impact of conditioning factors.

  10. Security Testing in Agile Web Application Development - A Case Study Using the EAST Methodology

    Erdogan, Gencer

    2010-01-01

    There is a need for improved security testing methodologies specialized for Web applications and their agile development environment. The number of web application vulnerabilities is drastically increasing, while security testing tends to be given a low priority. In this paper, we analyze and compare Agile Security Testing with two other common methodologies for Web application security testing, and then present an extension of this methodology. We present a case study showing how our Extended Agile Security Testing (EAST) performs compared to a more ad hoc approach used within an organization. Our working hypothesis is that the detection of vulnerabilities in Web applications will be significantly more efficient when using a structured security testing methodology specialized for Web applications, compared to existing ad hoc ways of performing security tests. Our results show a clear indication that our hypothesis is on the right track.

  11. Application of probabilistic risk assessment methodology to fusion

    Piet, S.J.

    1985-07-01

    Probabilistic Risk Assessment (PRA) tools are applied to general fusion issues in a systematic way, generally qualitatively. The potential value of PRA to general fusion safety and economic issues is discussed. Several important design insights result: possible fault interactions must be minimized (decouple fault conditions), inherently safe designs must include provision for passively handling loss of site power and loss of coolant conditions, the reliability of the vacuum boundary appears vital to maximizing facility availability and minimizing safety risk, and economic analyses appear to be incomplete without consideration of potential availability loss from forced outages. A modification to PRA formalism is introduced, called the fault interaction matrix. The fault interaction matrix contains information concerning what initial fault condition could lead to other fault conditions and with what frequency. Thus, the fault interaction matrix represents a way to present and measure the degree to which a designer has decoupled possible fault conditions in his design
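
    The fault interaction matrix can be viewed as an adjacency matrix over fault conditions, with entries giving the frequency at which one fault induces another. The sketch below is a hedged illustration; the fault names, rates, and the single propagation step are invented, not taken from the paper:

```python
import numpy as np

faults = ["loss of site power", "loss of coolant", "vacuum boundary breach"]

# F[i, j] = annual frequency with which fault i induces fault j (invented).
F = np.array([
    [0.0, 0.02, 0.001],
    [0.0, 0.0,  0.01 ],
    [0.0, 0.0,  0.0  ],
])

initiators = np.array([0.5, 0.1, 0.01])  # stand-alone initiating frequencies

# First-order induced contribution: initiators propagated once through F.
induced = initiators @ F
for name, base, extra in zip(faults, initiators, induced):
    print(f"{name}: initiating {base:.3f}/yr, induced {extra:.4f}/yr")
# A strongly decoupled design drives the off-diagonal entries of F to zero.
```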

  12. Interrogating discourse: the application of Foucault's methodological discussion to specific inquiry.

    Fadyl, Joanna K; Nicholls, David A; McPherson, Kathryn M

    2013-09-01

    Discourse analysis following the work of Michel Foucault has become a valuable methodology in the critical analysis of a broad range of topics relating to health. However, it can be a daunting task, in that there seems to be both a huge number of possible approaches to carrying out this type of project, and an abundance of different, often conflicting, opinions about what counts as 'Foucauldian'. This article takes the position that methodological design should be informed by ongoing discussion and applied as appropriate to a particular area of inquiry. The discussion given offers an interpretation and application of Foucault's methodological principles, integrating a reading of Foucault with applications of his work by other authors, showing how this is then applied to interrogate the practice of vocational rehabilitation. It is intended as a contribution to methodological discussion in this area, offering an interpretation of various methodological elements described by Foucault, alongside specific application of these aspects.

  13. Quality control methodology for high-throughput protein-protein interaction screening.

    Vazquez, Alexei; Rual, Jean-François; Venkatesan, Kavitha

    2011-01-01

    Protein-protein interactions are key to many aspects of the cell, including its cytoskeletal structure, the signaling processes in which it is involved, or its metabolism. Failure to form protein complexes or signaling cascades may sometimes translate into pathologic conditions such as cancer or neurodegenerative diseases. The set of all protein interactions between the proteins encoded by an organism constitutes its protein interaction network, representing a scaffold for biological function. Knowing the protein interaction network of an organism, combined with other sources of biological information, can unravel fundamental biological circuits and may help better understand the molecular basics of human diseases. The protein interaction network of an organism can be mapped by combining data obtained from both low-throughput screens, i.e., "one gene at a time" experiments and high-throughput screens, i.e., screens designed to interrogate large sets of proteins at once. In either case, quality controls are required to deal with the inherent imperfect nature of experimental assays. In this chapter, we discuss experimental and statistical methodologies to quantify error rates in high-throughput protein-protein interactions screens.

  14. Application of code scaling applicability and uncertainty methodology to the large break loss of coolant

    Young, M.Y.; Bajorek, S.M.; Nissley, M.E.

    1998-01-01

    In the late 1980s, after completion of an extensive research program, the United States Nuclear Regulatory Commission (USNRC) amended its regulations (10CFR50.46) to allow the use of realistic physical models to analyze the loss of coolant accident (LOCA) in light water reactors. Prior to this time, the evaluation of this accident was subject to a prescriptive set of rules (appendix K of the regulations) requiring conservative models and assumptions to be applied simultaneously, leading to very pessimistic estimates of the impact of this accident on the reactor core. The rule change therefore promised to provide significant benefits to owners of power reactors, allowing them to increase output. In response to the rule change, a method called code scaling, applicability and uncertainty (CSAU) was developed to apply realistic methods, while properly taking into account data uncertainty, uncertainty in physical modeling and plant variability. The method was claimed to be structured, traceable, and practical, but was met with some criticism when first demonstrated. In 1996, the USNRC approved a methodology, based on CSAU, developed by a group led by Westinghouse. The lessons learned in this application of CSAU will be summarized. Some of the issues raised concerning the validity and completeness of the CSAU methodology will also be discussed. (orig.)

  15. Tracking and sensor data fusion methodological framework and selected applications

    Koch, Wolfgang

    2013-01-01

    Sensor Data Fusion is the process of combining incomplete and imperfect pieces of mutually complementary sensor information in such a way that a better understanding of an underlying real-world phenomenon is achieved. Typically, this insight is either unobtainable otherwise or a fusion result exceeds what can be produced from a single sensor output in accuracy, reliability, or cost. This book provides an introduction to Sensor Data Fusion, as an information technology as well as a branch of engineering science and informatics. Part I presents a coherent methodological framework, thus providing th

  16. Improved FTA methodology and application to subsea pipeline reliability design.

    Lin, Jing; Yuan, Yongbo; Zhang, Mingyuan

    2014-01-01

    An innovative logic tree, Failure Expansion Tree (FET), is proposed in this paper, which improves on traditional Fault Tree Analysis (FTA). It describes a different thinking approach for risk factor identification and reliability risk assessment. By providing a more comprehensive and objective methodology, the rather subjective nature of FTA node discovery is significantly reduced and the resulting mathematical calculations for quantitative analysis are greatly simplified. Applied to the Useful Life phase of a subsea pipeline engineering project, the approach provides a more structured analysis by constructing a tree following the laws of physics and geometry. Resulting improvements are summarized in comparison table form.
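
    For contrast with the FET idea, conventional fault tree quantification reduces to combining independent basic-event probabilities through AND/OR gates. The sketch below (gate structure and probabilities invented, not the paper's pipeline case) shows the kind of calculation the authors aim to simplify:

```python
# Top event: pipeline leak. Invented two-level fault tree with independent
# basic events; OR gates use the complement rule, AND gates multiply.
def p_or(*ps):
    q = 1.0
    for p in ps:
        q *= (1.0 - p)
    return 1.0 - q

def p_and(*ps):
    q = 1.0
    for p in ps:
        q *= p
    return q

corrosion   = p_and(0.05, 0.40)  # coating failure AND cathodic protection loss
third_party = 0.002              # anchor strike (basic event)
weld_defect = p_and(0.01, 0.10)  # defect present AND missed in inspection

leak = p_or(corrosion, third_party, weld_defect)
print(f"P(top event: leak) = {leak:.4f}")
```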

  17. Intravenous dipyridamole thallium-201 SPECT imaging methodology, applications, and interpretations

    Rockett, J.F.; Magill, H.L.; Loveless, V.S.; Murray, G.L.

    1990-01-01

    Dipyridamole Tl-201 imaging is an ideal alternative to exercise Tl-201 scintigraphy in patients who are unwilling or unable to perform maximum exercise stress. The use of intravenous dipyridamole, alone or in combination with exercise, has not been approved for clinical practice by the Food and Drug Administration. Once approval is granted, the test will become a widely used and important component of the cardiac work-up. The indications, methodology, side effects, and utility of dipyridamole cardiac imaging in the clinical setting are discussed and a variety of examples presented. 59 references.

  18. Development and application of a hybrid transport methodology for active interrogation systems

    Royston, K.; Walters, W.; Haghighat, A. [Nuclear Engineering Program, Department of Mechanical Engineering, Virginia Tech., 900 N Glebe Rd., Arlington, VA 22203 (United States); Yi, C.; Sjoden, G. [Nuclear and Radiological Engineering, Georgia Tech, 801 Ferst Drive, Atlanta, GA 30332 (United States)

    2013-07-01

    A hybrid Monte Carlo and deterministic methodology has been developed for application to active interrogation systems. The methodology consists of four steps: i) neutron flux distribution due to neutron source transport and subcritical multiplication; ii) generation of the gamma source distribution from (n, γ) interactions; iii) determination of the gamma current at a detector window; iv) detection of gammas by the detector. This paper discusses the theory and results of the first three steps for the case of a cargo container with a sphere of HEU in third-density water cargo. To complete the first step, a response-function formulation has been developed to calculate the subcritical multiplication and neutron flux distribution. Response coefficients are pre-calculated using the MCNP5 Monte Carlo code. The second step uses the calculated neutron flux distribution and Bugle-96 (n, γ) cross sections to find the resulting gamma source distribution. In the third step the gamma source distribution is coupled with a pre-calculated adjoint function to determine the gamma current at a detector window. The AIMS (Active Interrogation for Monitoring Special-Nuclear-Materials) software has been written to output the gamma current for a source-detector assembly scanning across a cargo container using the pre-calculated values and taking significantly less time than a reference MCNP5 calculation. (authors)

  19. Application of Six Sigma methodology to a diagnostic imaging process.

    Taner, Mehmet Tolga; Sezen, Bulent; Atwat, Kamal M

    2012-01-01

    This paper aims to apply the Six Sigma methodology to improve workflow by eliminating the causes of failure in the medical imaging department of a private Turkish hospital. Implementation of the define, measure, analyse, improve and control (DMAIC) improvement cycle, workflow charts, fishbone diagrams and Pareto charts was employed, together with rigorous data collection in the department. The identification of root causes of repeat sessions and delays was followed by failure mode and effect analysis, hazard analysis and decision tree analysis. The most frequent causes of failure were malfunction of the RIS/PACS system and improper positioning of patients. Subsequent to extensive training of professionals, the sigma level was increased from 3.5 to 4.2. The data were collected over only four months. Six Sigma's data measurement and process improvement methodology is the impetus for health care organisations to rethink their workflow and reduce malpractice. It involves measuring, recording and reporting data on a regular basis, which enables the administration to monitor workflow continuously. The improvements in the workflow under study, made by determining the failures and potential risks associated with radiologic care, will have a positive impact on society in terms of patient safety. By eliminating repeat examinations, the risk of exposure to additional radiation was also minimised. This paper supports the need to apply Six Sigma and presents an evaluation of the process in an imaging department.
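
For context on the reported improvement from 3.5 to 4.2, the sketch below shows the usual conversion from a defect rate (defects per million opportunities, DPMO) to a short-term sigma level with the conventional 1.5-sigma shift; the defect counts are hypothetical, chosen so the result lands near the pre-improvement 3.5.

```python
from statistics import NormalDist

def sigma_level(defects, opportunities, shift=1.5):
    """Convert a defect rate to a short-term sigma level using the
    conventional 1.5-sigma long-term shift."""
    dpmo = defects / opportunities * 1_000_000
    return NormalDist().inv_cdf(1 - dpmo / 1_000_000) + shift

# Hypothetical: 22,750 defects per million opportunities ~ 3.5 sigma.
print(round(sigma_level(22_750, 1_000_000), 2))  # 3.5
```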

  20. Application of the adjoint function methodology for neutron fluence determination

    Haghighat, A.; Nanayakkara, B.; Livingston, J.; Mahgerefteh, M.; Luoma, J.

    1991-01-01

    In previous studies, the neutron fluence at a reactor pressure vessel has been estimated based on consolidation of transport theory calculations and experimental data obtained from in-vessel capsules and/or cavity dosimeters. Normally, a forward neutron transport calculation is performed for each fuel cycle and the neutron fluxes are integrated over the reactor operating time to estimate the neutron fluence. Such calculations are performed for a geometrical model which is composed of one-eighth (0 to 45 deg) of the reactor core and its surroundings, i.e., core barrel, thermal shield, downcomer, reactor vessel, cavity region, concrete wall, and instrumentation well. Because the model is large, transport theory calculations generally require a significant amount of computer memory and time; hence, more efficient methodologies such as the adjoint transport approach have been proposed. These studies, however, do not address the sensitivity studies needed for adjoint function calculations. Here, the adjoint methodology has been employed to estimate the activity of a cavity dosimeter and that of an in-vessel capsule. A sensitivity study has been performed on the mesh distribution used in and around the cavity dosimeter and the in-vessel capsule. Further, since a major portion of the detector response is due to neutrons originating in the peripheral fuel assemblies, a study on the use of a smaller calculational model has been performed.

  1. Methodology for the declassification of impacted buildings. Application of the MARSSIM technology

    Vico, A.M.; Álvarez, A.; Gómez, J.M.; Quiñones, J.

    2015-01-01

    This work describes the material measurement methodology used to assure the absence of contamination in impacted buildings due to processes related to the first part of the nuclear fuel cycle performed at the former Junta de Energía Nuclear (JEN), currently Centro de Investigaciones Energéticas, Medioambientales y Tecnológicas (CIEMAT). The first part of the work covers the identification and quantification of natural isotopes and their proportions in the studied surfaces through different analytical techniques. The experimental study involved selecting the proper equipment to carry out the field measurements and characterizing the uranium isotopes and their immediate descendants. According to European Union recommendations and the specifications established for CIEMAT by the CSN (Consejo de Seguridad Nuclear, the Spanish regulatory authority), surface activity reference levels have been established, which allow deciding whether a surface can be classified as conventional. In order to make decisions about compliance with the established clearance criteria, the MARSSIM methodology is applied using the results obtained from field measurements (impacted and non-impacted surfaces).

  2. Methodology of developing a smartphone application for crisis research and its clinical application.

    Zhang, Melvyn W B; Ho, Cyrus S H; Fang, Pan; Lu, Yanxia; Ho, Roger C M

    2014-01-01

    Recent advancements in Internet-based technologies have resulted in the growth of a sub-specialized field termed "infodemiology" and "infoveillance". Infoveillance refers to the collation of infodemiology measures for the purpose of surveillance and trending. Previous research has only demonstrated the research potential of the Web 2.0 medium for collating data in crisis situations. The objective of the current study is to demonstrate a methodology for implementing a smartphone-based application for the dissemination and collation of information during a crisis situation. The Haze smartphone application was developed using an online application builder with HTML5 as the core programming language. A five-phase development method was adopted, comprising a) formulation of user requirements, b) system design, c) system development, d) system evaluation and e) system application and implementation. The smartphone application was deployed during a one-week period via a self-sponsored Facebook post and via direct dissemination of the web links by email. A total of 298 respondents took part in the survey within the application. Most of them were between 20 and 29 years old and had a university education. More individuals preferred the option of accessing and providing feedback to a survey on physical and psychological wellbeing via direct access to a Web-based questionnaire. In addition, the participants reported a mean of 4.03 physical symptoms (SD 2.6). The total Impact of Event Scale-Revised (IES-R) score was 18.47 (SD 11.69), which indicated that the study population did experience psychological stress but not posttraumatic stress disorder. The perceived dangerous Pollutant Standards Index (PSI) level and the number of physical symptoms were associated with higher IES-R scores. The smartphone application could thus potentially be used to acquire research data in a crisis situation. However, it is crucial for future research to further...

  3. Methodological Note: Neurofeedback: A Comprehensive Review on System Design, Methodology and Clinical Applications

    Hengameh Marzbani

    2016-04-01

    Full Text Available Neurofeedback is a kind of biofeedback which teaches subjects self-control of brain functions by measuring brain waves and providing a feedback signal. Neurofeedback usually provides audio and/or video feedback. Positive or negative feedback is produced for desirable or undesirable brain activities, respectively. In this review, we provide clinical and technical information about the following issues: (1) various neurofeedback treatment protocols, i.e. alpha, beta, alpha/theta, delta, gamma, and theta; (2) different EEG electrode placements, i.e. standard recording channels in the frontal, temporal, central, and occipital lobes; (3) electrode montages (unipolar, bipolar); (4) types of neurofeedback, i.e. frequency, power, slow cortical potential, functional magnetic resonance imaging, and so on; (5) clinical applications of neurofeedback, i.e. treatment of attention deficit hyperactivity disorder, anxiety, depression, epilepsy, insomnia, drug addiction, schizophrenia, learning disabilities, dyslexia and dyscalculia, and autistic spectrum disorders, as well as other applications such as pain management and the improvement of musical and athletic performance; and (6) neurofeedback software. To date, many studies have been conducted on neurofeedback therapy and its effectiveness in the treatment of many diseases. Neurofeedback, like other treatments, has its own pros and cons. Although it is a non-invasive procedure, its validity has been questioned in terms of conclusive scientific evidence. For example, it is expensive and time-consuming, and its benefits are not long-lasting. Also, it might take months to show the desired improvements. Nevertheless, neurofeedback is known as a complementary and alternative treatment of many brain dysfunctions. However, current research does not support conclusive results about its efficacy.

  4. Applications of the TSUNAMI sensitivity and uncertainty analysis methodology

    Rearden, Bradley T.; Hopper, Calvin M.; Elam, Karla R.; Goluoglu, Sedat; Parks, Cecil V.

    2003-01-01

    The TSUNAMI sensitivity and uncertainty analysis tools under development for the SCALE code system have recently been applied in four criticality safety studies. TSUNAMI is used to identify applicable benchmark experiments for criticality code validation, assist in the design of new critical experiments for a particular need, reevaluate previously computed computational biases, and assess the validation coverage and propose a penalty for noncoverage for a specific application. (author)
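
A minimal sketch of the kind of sensitivity coefficient such tools compute, S = (dk/k)/(dσ/σ), here obtained by central finite difference on an assumed one-group toy expression for k rather than a transport calculation.

```python
def k_eff(sigma_capture):
    # Assumed one-group toy model: k = nu*Sigma_f / (Sigma_a_other + capture).
    nu_sigma_f, sigma_a_other = 2.0, 0.5
    return nu_sigma_f / (sigma_a_other + sigma_capture)

sigma0, h = 0.3, 1e-6
dk = (k_eff(sigma0 + h) - k_eff(sigma0 - h)) / (2 * h)   # central difference
S = dk * sigma0 / k_eff(sigma0)                          # relative sensitivity
print(S)  # analytic value: -sigma0/(0.5 + sigma0) = -0.375
```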

  5. Case study application of the IAEA safeguards assessment methodology to a mixed oxide fuel fabrication facility

    Swartz, J.; McDaniel, T.

    1981-01-01

    Science Applications, Inc. has prepared a case study illustrating the application of an assessment methodology to an international system for safeguarding mixed oxide (MOX) fuel fabrication facilities. This study is the second in a series of case studies which support an effort by the International Atomic Energy Agency (IAEA) and an international Consultant Group to develop a methodology for assessing the effectiveness of IAEA safeguards. 3 refs

  6. Application of a Methodology to calculate logistical cost

    Joaquín Mock-Díaz

    2017-12-01

    Full Text Available At present, the business environment is becoming ever more aggressive and unstable. For that reason, companies are forced to improve their management on a regular basis, to increase their economic efficiency and effectiveness, and to perform better. Within this context, the objective of this research is to apply a methodology for determining logistical costs in a service-providing company, which allows assessing the behavior of such costs during the year 2016. A financial assessment of the logistical activities showed a high opportunity cost, an element mainly dependent on inventory turnover. For the purposes of this study, several scientific methods were used: the historical-logical method, to analyze the historical evolution of logistics; and the analysis-synthesis method, to gather the elements and main ideas that characterize it.

  7. Radiation monitoring methodologies and their applications at BARC site

    Divkar, J.K.; Chatterjee, M.K.; Patra, R.P; Morali, S.; Singh, Rajvir

    2016-01-01

    Radiation monitoring methodology can be planned for various objectives during normal as well as emergency situations. During a radiological emergency, radiation monitoring data provide the information required for management of the abnormal situation. In order to assess the possible consequences accurately and to implement adequate measures, the emergency management authorities should have a well-prepared monitoring strategy in readiness. Fixed monitoring is useful for analyzing the behavior of a nuclear plant site and developing a holistic model for it; mobile monitoring is useful for quick impact assessment and will be the backbone of emergency response, particularly when the fixed monitoring system is unavailable due to a natural disaster such as a flood, earthquake or tsunami.

  8. Tutorials on emerging methodologies and applications in operations research

    2005-01-01

    Operations Research emerged as a quantitative approach to problem-solving in World War II. Its founders, who were physicists, mathematicians, and engineers, quickly found peace-time uses for this new field. Moreover, we can say that Operations Research (OR) was born in the same incubator as computer science, and through the years, it has spawned many new disciplines, including systems engineering, health care management, and transportation science. Fundamentally, Operations Research crosses discipline domains to seek solutions on a range of problems and benefits diverse disciplines from finance to bioengineering. Many disciplines routinely use OR methods. Many scientific researchers, engineers, and others will find the methodological presentations in this book useful and helpful in their problem-solving efforts. OR’s strengths are modeling, analysis, and algorithm design. It provides a quantitative foundation for a broad spectrum of problems, from economics to medicine, from environmental control to sports,...

  9. Fluid-structure interaction and biomedical applications

    Galdi, Giovanni; Nečasová, Šárka

    2014-01-01

    This book presents, in a methodical way, updated and comprehensive descriptions and analyses of some of the most relevant problems in the context of fluid-structure interaction (FSI). Generally speaking, FSI is among the most popular and intriguing problems in applied sciences and includes industrial as well as biological applications. Various fundamental aspects of FSI are addressed from different perspectives, with a focus on biomedical applications. More specifically, the book presents a mathematical analysis of basic questions like the well-posedness of the relevant initial and boundary value problems, as well as the modeling and the numerical simulation of a number of fundamental phenomena related to human biology. These latter research topics include blood flow in arteries and veins, blood coagulation and speech modeling. We believe that the variety of the topics discussed, along with the different approaches used to address and solve the corresponding problems, will help readers to develop a more holis...

  10. Antiproton-matter interactions in antiproton applications

    Morgan, David L., Jr.

    1990-01-01

    By virtue of the highly energetic particles released when they annihilate in matter, antiprotons have a variety of potentially important applications. Among others, these include remote 3-D density and composition imaging of the human body and also of thick, dense materials, cancer therapy, and spacecraft propulsion. Except for spacecraft propulsion, the required numbers of low energy antiprotons can be produced, stored, and transported through reliance on current or near term technology. Paramount to these applications and to fundamental research involving antiprotons is knowledge of how antiprotons interact with matter. The basic annihilation process is fairly well understood, but the antiproton annihilation and energy loss rates in matter depend in complex ways on a number of atomic processes. The rates, and the corresponding cross sections, were measured or are accurately predictable only for limited combinations of antiproton kinetic energy and material species.

  11. Application of TRIZ Methodology in Diffusion Welding System Optimization

    Ravinder Reddy, N.; Satyanarayana, V. V.; Prashanthi, M.; Suguna, N.

    2017-12-01

    Welding is widely used for joining metals in manufacturing. In recent years, the diffusion welding method has significantly increased achievable weld quality. Nevertheless, research on and application of diffusion welding are still at a comparatively early stage, so relevant information on welding design, such as fixturing, parameter selection and integrated design, is lacking for the joining of thick and thin materials with or without interlayers. This article combines innovative methods in the application of diffusion welding design, guided by the theory of inventive problem solving (TRIZ), which will help to decrease trial-and-error and failure risks in the welding process. It aims to provide welding design personnel with innovative design ideas for research and practical application.

  12. A comparison of usability methods for testing interactive health technologies: Methodological aspects and empirical evidence

    Jaspers, Monique W. M.

    2009-01-01

    OBJECTIVE: Usability evaluation is now widely recognized as critical to the success of interactive health care applications. However, the broad range of usability inspection and testing methods available may make it difficult to decide on a usability assessment plan. To guide novices in the...

  13. Methodology to explore emergent behaviours of the interactions between water resources and ecosystem under a pluralistic approach

    García-Santos, Glenda; Madruga de Brito, Mariana; Höllermann, Britta; Taft, Linda; Almoradie, Adrian; Evers, Mariele

    2018-06-01

    Understanding the interactions between water resources and their social dimensions is crucial for effective and sustainable water management. The identification of sensitive control variables and feedback loops of a specific human-hydro-scape can enhance knowledge about the potential factors and/or agents leading to the current situation of water resources and ecosystems, which in turn supports decision-making toward desirable futures. Our study presents the utility of a system dynamics modeling approach for water management and decision-making in the case of a forest ecosystem at risk of wildfires. We use the pluralistic water research concept to explore different scenarios and simulate the emergent behaviour of water interception and net precipitation after a wildfire in a forest ecosystem. Through a case study, we illustrate the applicability of this new methodology.
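
A minimal sketch of the interception arithmetic behind such a model: net precipitation is gross precipitation minus canopy interception, and a wildfire that reduces canopy cover raises net precipitation. The linear interception coefficient and cover values are assumed for illustration.

```python
def net_precip(gross_mm, canopy_cover, k=0.25):
    """Net precipitation after canopy interception (assumed linear model)."""
    interception = k * canopy_cover * gross_mm
    return gross_mm - interception

print(net_precip(100.0, canopy_cover=0.8))  # pre-fire:  80.0 mm reaches ground
print(net_precip(100.0, canopy_cover=0.2))  # post-fire: 95.0 mm reaches ground
```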

  14. Applications of a Constrained Mechanics Methodology in Economics

    Janova, Jitka

    2011-01-01

    This paper presents instructive interdisciplinary applications of constrained mechanics calculus in economics on a level appropriate for undergraduate physics education. The aim of the paper is (i) to meet the demand for illustrative examples suitable for presenting the background of the highly expanding research field of econophysics even at the…

  15. Analytical group decision making in natural resources: methodology and application

    Daniel L. Schmoldt; David L. Peterson

    2000-01-01

    Group decision making is becoming increasingly important in natural resource management and associated scientific applications, because multiple values are treated coincidentally in time and space, multiple resource specialists are needed, and multiple stakeholders must be included in the decision process. Decades of social science research on decision making in groups...

  16. Applications of neuroscience in criminal law: legal and methodological issues.

    Meixner, John B

    2015-01-01

    The use of neuroscience in criminal law applications is an increasingly discussed topic among legal and psychological scholars. Over the past 5 years, several prominent federal criminal cases have referenced neuroscience studies and made admissibility determinations regarding neuroscience evidence. Despite this growth, the field is exceptionally young, and no one knows for sure how significant a contribution neuroscience will make to criminal law. This article focuses on three major subfields: (1) neuroscience-based credibility assessment, which seeks to detect lies or knowledge associated with a crime; (2) application of neuroscience to aid in assessments of brain capacity for culpability, especially among adolescents; and (3) neuroscience-based prediction of future recidivism. The article briefly reviews these fields as applied to criminal law and makes recommendations for future research, calling for the increased use of individual-level data and increased realism in laboratory studies.

  17. New approaches in intelligent control techniques, methodologies and applications

    Kountchev, Roumen

    2016-01-01

    This volume introduces new approaches in the intelligent control area from the viewpoints of both theory and application. It consists of eleven contributions by prominent authors from all over the world and an introductory chapter. This volume is strongly connected to another volume entitled "New Approaches in Intelligent Image Analysis" (Eds. Roumen Kountchev and Kazumi Nakamatsu). The chapters of this volume are self-contained and include a summary, conclusion and future work. Some of the chapters introduce specific case studies of various intelligent control systems and others focus on intelligent-theory-based control techniques with applications. A notable feature of this volume is that three chapters deal with intelligent control based on paraconsistent logics.

  18. A Brief overview of neutron activation analyses methodology and applications

    Ali, M.A.

    2000-01-01

    The primary objective of this talk is to present our new facility for Neutron Activation Analysis (NAA) to the scientific and industrial communities and show its possibilities. The talk covers the following main items: an overview of neutron activation analysis; the special interest of fast mono-energetic neutrons; the NAA method and its sensitivities; recent scientific and industrial applications using NAA; and an illustrative example measured using our facility. What is NAA? It is a sensitive analytical technique useful for performing both qualitative and quantitative multi-element analyses of samples. Worldwide application of NAA is widespread; it is estimated that several tens of thousands of samples undergo analysis each year, from almost every conceivable field of scientific or technical interest. Why NAA? For many elements and applications, NAA offers sensitivities that are sometimes superior to those attainable by other methods, on the order of the nanogram level, and it is accurate and reliable; NAA is generally recognized as the referee method of choice when new procedures are being developed or when other methods yield results that do not agree. However, activation analysis at En = 14 MeV is limited by a few factors: the low value of the flux, the low cross-sections of threshold reactions, the short irradiation time due to finite target life, and interfering reactions and gamma-ray spectral interference.
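
A minimal sketch of the activation arithmetic underlying NAA: the induced activity A = Nσφ(1 − e^(−λ t_irr)) saturates as the irradiation time grows against the product's half-life. All values are assumed for illustration, not facility parameters.

```python
import math

N = 1e20           # target atoms in the sample
sigma = 1e-24      # reaction cross section [cm^2] (1 barn)
phi = 1e9          # 14 MeV neutron flux [n/cm^2/s]
half_life = 600.0  # product half-life [s]
lam = math.log(2) / half_life

def activity(t_irr):
    """Induced activity [Bq] after irradiation time t_irr [s]."""
    return N * sigma * phi * (1.0 - math.exp(-lam * t_irr))

print(activity(1800.0))  # ~8.75e4 Bq after a 30-minute irradiation
```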

  19. Moessbauer spectroscopy: Methodology and some applications to magnetic materials

    Sundqvist, T.

    1986-01-01

    The development of a new computer program for the analysis of Moessbauer spectra that allows the user to make a detailed simulation of a measured spectrum is described. The program includes several novel computational algorithms as well as extensive treatment of experimental side effects. Data collection instrumentation has been improved by the development of computer-based data acquisition units. Replacing traditional multichannel analyzers, these computer-controlled units provide increased flexibility, improved capacity and ease of data handling. The systems designed range from a simple Apple II interface to a high-performance, self-contained, computer-controlled unit. The computerized spectrometers feature two independent channels, allowing for acquisition of the spectrum of interest and of a simultaneous calibration spectrum, as well as software-controlled frequency of operation. Moessbauer spectroscopy is applied to amorphous Fe-based alloys to study the correlations among hyperfine interactions and to study the crystallization behaviour of these alloys. Special attention has been paid to the quadrupole interaction in the amorphous phases. Careful data analysis, making use of the above-mentioned program, is used in an attempt to determine the complex magnetic structures found in various iron phosphides. The usefulness of the Ni-61 isotope for Moessbauer spectroscopy has also been investigated. (author)

  20. Studying human-automation interactions: methodological lessons learned from the human-centred automation experiments 1997-2001

    Massaiu, Salvatore; Skjerve, Ann Britt Miberg; Skraaning, Gyrd Jr.; Strand, Stine; Waeroe, Irene

    2004-04-01

    This report documents the methodological lessons learned from the Human Centred Automation (HCA) programme, both in terms of the psychometric evaluation of the measurement techniques developed for the study of human-automation interaction and in terms of the application of advanced statistical methods for the analysis of experiments. The psychometric evaluation is based on data from the four experiments performed within the HCA programme. The result is a single-source reference text of measurement instruments for the study of human-automation interaction, part of which were specifically developed by the programme. The application of advanced statistical techniques is exemplified by additional analyses performed on the IPSN-HCA experiment of 1998. Special importance is given to the statistical technique Structural Equation Modeling, for the possibility it offers to advance, and empirically test, comprehensive explanations of human-automation interactions. The additional analyses of the IPSN-HCA experiment investigated how the operators formed judgments about their own performance. The issue is of substantive interest for human-automation interaction research because the operators' over- or underestimation of their own performance could be seen as a symptom of human-machine mismatch, and a potential latent failure. These analyses concluded that it is the interplay between (1) the level of automation, (2) the nature of the task, (3) the level of scenario complexity, and (4) the level of trust in the automatic system that determines the operators' bias in performance self-estimation. A structural model that expresses the interplay of all these factors was empirically evaluated and found able to provide a concise and elegant explanation of the intricate pattern of relationships between the identified factors. (Author)

  1. Advances in Artificial Neural Networks – Methodological Development and Application

    Yanbo Huang

    2009-08-01

    Full Text Available Artificial neural networks as a major soft-computing technology have been extensively studied and applied during the last three decades. Research on backpropagation training algorithms for multilayer perceptron networks has spurred development of other neural network training algorithms for other networks such as radial basis function, recurrent network, feedback network, and unsupervised Kohonen self-organizing network. These networks, especially the multilayer perceptron network with a backpropagation training algorithm, have gained recognition in research and applications in various scientific and engineering areas. In order to accelerate the training process and overcome data over-fitting, research has been conducted to improve the backpropagation algorithm. Further, artificial neural networks have been integrated with other advanced methods such as fuzzy logic and wavelet analysis, to enhance the ability of data interpretation and modeling and to avoid subjectivity in the operation of the training algorithm. In recent years, support vector machines have emerged as a set of high-performance supervised generalized linear classifiers in parallel with artificial neural networks. A review on development history of artificial neural networks is presented and the standard architectures and algorithms of artificial neural networks are described. Furthermore, advanced artificial neural networks will be introduced with support vector machines, and limitations of ANNs will be identified. The future of artificial neural network development in tandem with support vector machines will be discussed in conjunction with further applications to food science and engineering, soil and water relationship for crop management, and decision support for precision agriculture. Along with the network structures and training algorithms, the applications of artificial neural networks will be reviewed as well, especially in the fields of agricultural and biological
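
A minimal sketch of the backpropagation idea the review surveys, reduced to a single logistic neuron trained by gradient descent on an assumed toy dataset (the OR function); real multilayer perceptrons chain this gradient computation through hidden layers.

```python
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 1.0])          # OR truth table

rng = np.random.default_rng(0)
w, b, lr = rng.standard_normal(2), 0.0, 0.5

for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid activation
    grad_z = (p - y) / len(y)               # cross-entropy gradient wrt logits
    w -= lr * (X.T @ grad_z)                # gradient descent updates
    b -= lr * grad_z.sum()

print(np.round(p))  # [0. 1. 1. 1.]
```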

  2. Big and complex data analysis methodologies and applications

    2017-01-01

    This volume conveys some of the surprises, puzzles and success stories in high-dimensional and complex data analysis and related fields. Its peer-reviewed contributions showcase recent advances in variable selection, estimation and prediction strategies for a host of useful models, as well as essential new developments in the field. The continued and rapid advancement of modern technology now allows scientists to collect data of increasingly unprecedented size and complexity. Examples include epigenomic data, genomic data, proteomic data, high-resolution image data, high-frequency financial data, functional and longitudinal data, and network data. Simultaneous variable selection and estimation is one of the key statistical problems involved in analyzing such big and complex data. The purpose of this book is to stimulate research and foster interaction between researchers in the area of high-dimensional data analysis. More concretely, its goals are to: 1) highlight and expand the breadth of existing methods in...

  3. RF power harvesting: a review on designing methodologies and applications

    Tran, Le-Giang; Cha, Hyouk-Kyu; Park, Woo-Tae

    2017-12-01

    Wireless power transmission was conceptualized nearly a century ago. Certain achievements made to date have made power harvesting a reality, capable of providing alternative sources of energy. This review provides a summary of radio frequency (RF) power harvesting technologies in order to serve as a guide for the design of RF energy harvesting units. Since energy harvesting circuits are designed to operate with relatively small voltages and currents, they rely on state-of-the-art electrical technology for obtaining high efficiency. Thus, comprehensive analysis and discussions of various designs and their tradeoffs are included. Finally, recent applications of RF power harvesting are outlined.
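
A minimal sketch of the link-budget arithmetic behind RF harvester design: received power from the Friis transmission equation, then DC output through an assumed rectifier efficiency. Frequency, gains, distance and efficiency are illustrative values, not figures from the review.

```python
import math

def friis_rx_power_w(p_tx_w, g_tx, g_rx, freq_hz, dist_m):
    """Received RF power from the Friis free-space equation."""
    lam = 3e8 / freq_hz
    return p_tx_w * g_tx * g_rx * (lam / (4 * math.pi * dist_m)) ** 2

p_rx = friis_rx_power_w(1.0, 6.0, 2.0, 915e6, 5.0)  # 1 W source, 915 MHz, 5 m
p_dc = 0.4 * p_rx                                   # assumed 40% RF-DC efficiency
print(f"{p_rx * 1e6:.0f} uW received, {p_dc * 1e6:.0f} uW DC")  # ~327 / ~131
```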

  4. Minimal cut-set methodology for artificial intelligence applications

    Weisbin, C.R.; de Saussure, G.; Barhen, J.; Oblow, E.M.; White, J.C.

    1984-01-01

    This paper reviews minimal cut-set theory and illustrates its application with an example. The minimal cut-set approach uses disjunctive normal form in Boolean algebra and various Boolean operators to simplify very complicated tree structures composed of AND/OR gates. The simplification process is automated and performed off-line using existing computer codes to implement the Boolean reduction on the finite but large tree structure. With this approach, on-line expert diagnostic systems whose response time is critical could determine directly whether a goal is achievable by comparing the actual system state to a concisely stored set of preprocessed critical state elements.
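
A minimal sketch of the simplification described above: expand an AND/OR tree into its cut sets (disjunctive normal form) and then apply the Boolean absorption law X + X·Y = X to keep only the minimal ones. The toy tree is an assumed example, not one from the paper.

```python
from itertools import product

def cut_sets(node):
    """Cut sets of a gate tree; a node is a basic-event name or
    ('AND'|'OR', children)."""
    if isinstance(node, str):
        return {frozenset([node])}
    op, children = node
    child_sets = [cut_sets(c) for c in children]
    if op == 'OR':                       # union of the children's cut sets
        return set().union(*child_sets)
    result = set()                       # AND: cross-combine children's cut sets
    for combo in product(*child_sets):
        result.add(frozenset().union(*combo))
    return result

def minimal(sets):
    """Drop non-minimal sets (absorption: X + X.Y = X)."""
    return {s for s in sets if not any(t < s for t in sets)}

top = ('AND', [('OR', ['A', 'B']), ('OR', ['A', 'C'])])
print(minimal(cut_sets(top)))  # {A} and {B, C}
```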

  5. Energy minimization in medical image analysis: Methodologies and applications.

    Zhao, Feng; Xie, Xianghua

    2016-02-01

    Energy minimization is of particular interest in medical image analysis. In the past two decades, a variety of optimization schemes have been developed. In this paper, we present a comprehensive survey of the state-of-the-art optimization approaches. These algorithms are mainly classified into two categories: continuous method and discrete method. The former includes Newton-Raphson method, gradient descent method, conjugate gradient method, proximal gradient method, coordinate descent method, and genetic algorithm-based method, while the latter covers graph cuts method, belief propagation method, tree-reweighted message passing method, linear programming method, maximum margin learning method, simulated annealing method, and iterated conditional modes method. We also discuss the minimal surface method, primal-dual method, and the multi-objective optimization method. In addition, we review several comparative studies that evaluate the performance of different minimization techniques in terms of accuracy, efficiency, or complexity. These optimization techniques are widely used in many medical applications, for example, image segmentation, registration, reconstruction, motion tracking, and compressed sensing. We thus give an overview on those applications as well. Copyright © 2015 John Wiley & Sons, Ltd.
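
A minimal sketch of the continuous branch of the survey: gradient descent on a toy 1-D denoising energy E(u) = ||u − f||² + λ||∇u||², with a periodic boundary for brevity. The signal, λ and step size are assumed for illustration.

```python
import numpy as np

def denoise(f, lam=2.0, step=0.05, iters=1000):
    """Gradient descent on E(u) = |u - f|^2 + lam * |grad u|^2."""
    u = f.copy()
    for _ in range(iters):
        lap = np.roll(u, 1) - 2 * u + np.roll(u, -1)  # discrete Laplacian
        grad_E = 2 * (u - f) - 2 * lam * lap          # dE/du
        u -= step * grad_E
    return u

rng = np.random.default_rng(1)
f = np.sin(np.linspace(0, np.pi, 64)) + 0.3 * rng.standard_normal(64)
u = denoise(f)   # u is a smoothed version of the noisy signal f
```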

  6. Benefit-Risk Monitoring of Vaccines Using an Interactive Dashboard: A Methodological Proposal from the ADVANCE Project.

    Bollaerts, Kaatje; De Smedt, Tom; Donegan, Katherine; Titievsky, Lina; Bauchau, Vincent

    2018-03-26

    New vaccines are launched based on their benefit-risk (B/R) profile anticipated from clinical development. Proactive post-marketing surveillance is necessary to assess whether the vaccination uptake and the B/R profile are as expected and, ultimately, whether further public health or regulatory actions are needed. There are several, typically not integrated, facets of post-marketing vaccine surveillance: the surveillance of vaccination coverage, vaccine safety, effectiveness and impact. With this work, we aim to assess the feasibility and added value of using an interactive dashboard as a potential methodology for near real-time monitoring of vaccine coverage and pre-specified health benefits and risks of vaccines. We developed a web application with an interactive dashboard for B/R monitoring. The dashboard is demonstrated using simulated electronic healthcare record data mimicking the introduction of rotavirus vaccination in the UK. The interactive dashboard allows end users to select certain parameters, including expected vaccine effectiveness, age groups, and time periods and allows calculation of the incremental net health benefit (INHB) as well as the incremental benefit-risk ratio (IBRR) for different sets of preference weights. We assessed the potential added value of the dashboard by user testing amongst a range of stakeholders experienced in the post-marketing monitoring of vaccines. The dashboard was successfully implemented and demonstrated. The feedback from the potential end users was generally positive, although reluctance to using composite B/R measures was expressed. The use of interactive dashboards for B/R monitoring is promising and received support from various stakeholders. In future research, the use of such an interactive dashboard will be further tested with real-life data as opposed to simulated data.
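
A minimal sketch of the two composite measures the dashboard exposes, under assumed definitions (the project's exact formulas may differ): an incremental net health benefit that weighs averted benefit events against excess risk events, and the plain incremental benefit-risk ratio. The event counts are hypothetical.

```python
def inhb(d_benefit, d_risk, w_b=1.0, w_r=1.0):
    """Incremental net health benefit under preference weights (assumed form)."""
    return w_b * d_benefit - w_r * d_risk

def ibrr(d_benefit, d_risk):
    """Incremental benefit-risk ratio (assumed form)."""
    return d_benefit / d_risk

# Hypothetical rates per 100,000 vaccinated: 250 averted hospitalisations
# vs 2 excess adverse events, with each risk event weighted 10x.
print(inhb(250, 2, w_b=1.0, w_r=10.0))  # 230.0
print(ibrr(250, 2))                     # 125.0
```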

  7. Latent Trait Theory Applications to Test Item Bias Methodology. Research Memorandum No. 1.

    Osterlind, Steven J.; Martois, John S.

    This study discusses latent trait theory applications to test item bias methodology. A real data set is used in describing the rationale and application of the Rasch probabilistic model item calibrations across various ethnic group populations. A high school graduation proficiency test covering reading comprehension, writing mechanics, and…
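
For readers unfamiliar with the model, the sketch below evaluates the Rasch probability that a person of ability θ answers an item of difficulty b correctly, P = 1 / (1 + e^(−(θ − b))); the ability and difficulty values are assumed for illustration.

```python
import math

def rasch_p(theta, b):
    """Rasch model: probability of a correct response."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

print(rasch_p(theta=1.0, b=0.2))  # ~0.69
```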

  8. A theoretical and methodological proposal for the descriptive assessment of therapeutic interactions.

    Froján-Parga, María Xesús; Ruiz-Sancho, Elena M; Calero-Elvira, Ana

    2016-01-01

    The goal of this study is to show the development of a strategy for the descriptive assessment of the therapeutic interaction. We developed an observational methodology to analyze the dialogues that took place during 92 sessions conducted in a psychological center in Madrid, Spain, in which 19 adults were treated for various psychological problems by 9 behavioral therapists. A system was developed to codify the vocal behavior of both the therapists and the clients; the software The Observer XT was used for recording. Therapeutic interactions were analyzed using sequential analysis. Three main sequences synthesize the therapist-client interaction: first, an utterance by the client preceded by a therapist's verbalization, specifically a question (discriminative morphology), and followed by an expression of approval (reinforcement morphology); second, verbalizations of failure or discomfort uttered by the client, followed most often by verbalizations of disapproval (punishing morphology) uttered by the therapist; and third, verbalizations uttered by the client that are discriminated by the therapist after an in-depth explanation and followed by different therapist utterances (expressions of approval, technical information, etc.), depending on how the client responds. The results of this study provide a starting point for the study of the functional sequences that form the basis of therapeutic change.

  9. Seismic hazard analysis. Application of methodology, results, and sensitivity studies

    Bernreuter, D.L.

    1981-10-01

    As part of the Site Specific Spectra Project, this report seeks to identify the sources of and minimize uncertainty in estimates of seismic hazards in the Eastern United States. Findings are being used by the Nuclear Regulatory Commission to develop a synthesis among various methods that can be used in evaluating seismic hazard at the various plants in the Eastern United States. In this volume, one of a five-volume series, we discuss the application of the probabilistic approach using expert opinion. The seismic hazard is developed at nine sites in the Central and Northeastern United States, and both individual experts' and synthesis results are obtained. We also discuss and evaluate the ground motion models used to develop the seismic hazard at the various sites, analyzing extensive sensitivity studies to determine the important parameters and the significance of uncertainty in them. Comparisons are made between probabilistic and real spectra for a number of Eastern earthquakes. The uncertainty in the real spectra is examined as a function of the key earthquake source parameters. In our opinion, the single most important conclusion of this study is that the use of expert opinion to supplement the sparse data available on Eastern United States earthquakes is a viable approach for determining estimated seismic hazard in this region of the country. (author)

  10. Application of Six Sigma methodology to a cataract surgery unit.

    Taner, Mehmet Tolga

    2013-01-01

    The article's aim is to focus on the application of Six Sigma to minimise intraoperative and post-operative complication rates in a Turkish public hospital cataract surgery unit. Implementing define-measure-analyse-improve-control (DMAIC) involves process mapping, fishbone diagrams and rigorous data collection. Failure mode and effect analysis (FMEA), Pareto diagrams, control charts and process capability analysis are applied to redress the root causes of cataract surgery failures. Inefficient skills of assistant surgeons and technicians, low quality of the IOLs used, wrong IOL placement, unsystematic sterilisation of surgery rooms and devices, and an unprioritised network system are found to be the critical drivers of intraoperative and post-operative complications. The sigma level was increased from 2.60 to 3.75 subsequent to extensive training of assistant surgeons, ophthalmologists and technicians, better-quality IOLs, systematic sterilisation and air filtering, and the implementation of a more sophisticated network system. This article shows that Six Sigma measurement and process improvement can become the impetus for cataract unit staff to rethink their processes and reduce malpractice. Measuring, recording and reporting data regularly helps them to continuously monitor their overall process and deliver safer treatments. This is the first Six Sigma ophthalmology study in Turkey.

  11. Student satisfaction and loyalty in Denmark: Application of EPSI methodology.

    Shahsavar, Tina; Sudzina, Frantisek

    2017-01-01

    Monitoring and managing customers' satisfaction are key features to benefit from today's competitive environment. In higher education context, only a few studies are available on satisfaction and loyalty of the main customers who are the students, which signifies the need to investigate the field more thoroughly. The aim of this research is to measure the strength of determinants of students' satisfaction and the importance of antecedents in students' satisfaction and loyalty in Denmark. Our research model is the modification of European Performance Satisfaction Index (EPSI), which takes the university's image direct effects on students' expectations into account from students' perspective. The structural equation model of student satisfaction and loyalty has been evaluated using partial least square path modelling. Our findings confirm that the EPSI framework is applicable on student satisfaction and loyalty among Danish universities. We show that all the relationships among variables of the research model are significant except the relationship between quality of software and students' loyalty. Results further verify the significance of antecedents in students' satisfaction and loyalty at Danish universities; the university image and student satisfaction are the antecedents of student loyalty with a significant direct effect, while perceived value, quality of hardware, quality of software, expectations, and university image are antecedents of student satisfaction. Eventually, our findings may be of an inspiration to maintain and improve students' experiences during their study at the university. Dedicating resources to identified important factors from students' perception enable universities to attract more students, make them highly satisfied and loyal.

  13. Strategic environmental assessment methodologies--applications within the energy sector

    Finnveden, Goeran; Nilsson, Maans; Johansson, Jessica; Persson, Aasa; Moberg, Aasa; Carlsson, Tomas

    2003-01-01

    Strategic Environmental Assessment (SEA) is a procedural tool and within the framework of SEA, several different types of analytical tools can be used in the assessment. Several analytical tools are presented and their relation to SEA is discussed including methods for future studies, Life Cycle Assessment, Risk Assessment, Economic Valuation and Multi-Attribute Approaches. A framework for the integration of some analytical tools in the SEA process is suggested. It is noted that the available analytical tools primarily cover some types of environmental impacts related to emissions of pollutants. Tools covering impacts on ecosystems and landscapes are more limited. The relation between application and choice of analytical tools is discussed. It is suggested that SEAs used to support a choice between different alternatives require more quantitative methods, whereas SEAs used to identify critical aspects and suggest mitigation strategies can suffice with more qualitative methods. The possible and desired degree of site-specificity in the assessment can also influence the choice of methods. It is also suggested that values and world views can be of importance for judging whether different types of tools and results are meaningful and useful. Since values and world views differ between different stakeholders, consultation and understanding are important to ensure credibility and relevance

  14. Dissipative NEGF methodology to treat short range Coulomb interaction: Current through a 1D nanostructure.

    Martinez, Antonio; Barker, John R; Di Prieto, Riccardo

    2018-06-13

    A methodology describing Coulomb blockade in the Non-equilibrium Green Function formalism is presented. We carried out ballistic and dissipative simulations through a 1D quantum dot using an Einstein phonon model. Inelastic phonons with different energies have been considered. The methodology incorporates the short-range Coulomb interaction between two electrons through the use of a two-particle Green's function. Unlike previous work, the quantum dot has spatial resolution, i.e. it is not just parameterized by the energy level and coupling constants of the dot. Our method intends to describe the effect of electron localization while maintaining an open boundary or extended wave function. The formalism conserves the current through the nanostructure. A simple 1D model is used to explain the increase of mobility in semi-crystalline polymers as a function of the electron concentration. The mechanism suggested is based on the lifting of energy levels into the transmission window as a result of the local electron-electron repulsion inside a crystalline domain. The results are aligned with recent experimental findings. Finally, as a proof of concept, we present a simulation of a low-temperature resonant structure showing the stability diagram in the Coulomb blockade regime. © 2018 IOP Publishing Ltd.

  15. GO-FLOW methodology. Basic concept and integrated analysis framework for its applications

    Matsuoka, Takeshi

    2010-01-01

    The GO-FLOW methodology is a success-oriented system analysis technique capable of evaluating a large system with complex operational sequences. Recently, an integrated analysis framework for GO-FLOW has been developed for the safety evaluation of elevator systems by the Ministry of Land, Infrastructure, Transport and Tourism of the Japanese Government. This paper describes (a) an overview of the GO-FLOW methodology, (b) the procedure for treating a phased mission problem, (c) common cause failure analysis, (d) uncertainty analysis, and (e) the integrated analysis framework. The GO-FLOW methodology is a valuable and useful tool for system reliability analysis and has a wide range of applications. (author)
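
A minimal sketch of the phased-mission idea in item (b): if each phase must succeed given the previous ones and the phase reliabilities are treated as independent (an assumption made here for brevity), the mission reliability is their product.

```python
# Assumed phase reliabilities for an illustrative three-phase mission.
phase_reliability = {'standby': 0.999, 'startup': 0.99, 'operation': 0.95}

mission_r = 1.0
for phase, r in phase_reliability.items():
    mission_r *= r
print(round(mission_r, 4))  # 0.9396
```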

  16. METHODOLOGY FOR FORMING MUTUALLY BENEFICIAL NETWORK INTERACTION BETWEEN SMALL CITIES AND DISTRICT CENTRES

    Nikolay A. Ivanov

    2017-01-01

    Full Text Available Abstract. Objectives: The aim of the study is to develop a methodology for networking between small towns and regional centres on the basis of developing areas of mutual benefit. It is important to assess the possibility of cooperation between small towns, regional centres and local self-government bodies using the example of individual territorial entities of Russia, in the context of the formation and strengthening of networks and support for territorial development. Methods: Systemic and functional methodical approaches were taken. The modelling of socio-economic processes provides a visual representation of the direction of positive changes for small towns and regional centres of selected Subjects of the Russian Federation. Results: Specific examples of cooperation between small towns and district centres are revealed in several areas, including education, trade and public catering, and tourist and recreational activities. The supporting role of subsystems, including management, regulatory activity, transport and logistics, is described. Schemes by which mutually beneficial network interaction is formed are characterised in terms of the specific advantages accruing to each network subject. The economic benefits of realising interaction between small cities and regional centres are discussed. The methodology is based on assessing the access of cities to commutation, on which basis contemporary regional and city networks are formed. Conclusion: On the basis of the conducted study, a list of areas for mutually beneficial networking between small towns and district centres has been identified, allowing appropriate changes to be made in regional economic policies in terms of programmes aimed at the development of regions and small towns, including those suffering from economic depression.

  17. Performance specification methodology: introduction and application to displays

    Hopper, Darrel G.

    1998-09-01

    Acquisition reform is based on the notion that DoD must rely on the commercial marketplace insofar as possible rather than solely looking inward to a military marketplace to meet its needs. This reform forces a fundamental change in the way DoD conducts business, including a heavy reliance on private sector models of change. The key to greater reliance on the commercial marketplace is the performance specification (PS). This paper introduces some PS concepts and a PS classification principle to help bring structure to the analysis of risk (cost, schedule, capability) in weapons system development and to the management of opportunities for affordable ownership (maintain/increase capability via technology insertion, reduce cost) in this new paradigm. The DoD shift toward commercial components is nowhere better exemplified than in displays. Displays are the quintessential dual-use technology and are used herein to exemplify these PS concepts and this principle. The advent of flat panel displays as a successful technology is setting off an epochal shift in cockpits and other military applications. Displays are installed in every DoD weapon system and are thus representative of a range of technologies where issues and concerns throughout industry and government have been raised regarding the increased DoD reliance on the commercial marketplace. Performance specifications require metrics: the overall metrics of 'information-thrust' with units of Mb/s and 'specific info-thrust' with units of Mb/s/kg are introduced to analyze the value of a display to the warfighter and its affordability to the taxpayer.
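
A minimal sketch of the display metrics named above, under an assumed reading of 'information-thrust' as pixel count × bits per pixel × refresh rate (the paper may define it differently); the panel parameters and mass are illustrative.

```python
def info_thrust_mbps(width_px, height_px, bits_per_px, refresh_hz):
    """'Information-thrust' in Mb/s under the assumed definition."""
    return width_px * height_px * bits_per_px * refresh_hz / 1e6

thrust = info_thrust_mbps(1280, 1024, 24, 60)  # ~1887 Mb/s
specific = thrust / 3.5                        # assumed 3.5 kg panel -> Mb/s/kg
print(round(thrust), round(specific))
```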

  18. Electrification of particulate entrained fluid flows-Mechanisms, applications, and numerical methodology

    Wei, Wei; Gu, Zhaolin

    2015-10-01

    Charge distribution and mechanical behaviors of the liquid surface can be predicted by using this method. The methodology combining a particle charging model with Computational Fluid Dynamics (CFD) and the Discrete Element Method (DEM) is applicable to the study of particle charging/charged processes in gas-solid two-phase flows; the influencing factors of particle charging, such as gas-particle interaction, contact force, contact area, and various velocities, are described systematically. This review explores a clear understanding of particulate charging and provides theoretical references to control and utilize the charging/charged particulate-entrained fluid system.

  1. Nonlinear Time Domain Seismic Soil-Structure Interaction (SSI) Deep Soil Site Methodology Development

    Spears, Robert Edward; Coleman, Justin Leigh

    2015-01-01

    Currently the Department of Energy (DOE) and the nuclear industry perform seismic soil-structure interaction (SSI) analysis using equivalent-linear numerical analysis tools. For lower levels of ground motion, these tools should produce reasonable in-structure response values for the evaluation of existing and new facilities. For larger levels of ground motion, these tools likely overestimate the in-structure response (and therefore structural demand), since they do not consider geometric nonlinearities (such as gaping and sliding between the soil and structure) and are limited in their ability to model nonlinear soil behavior. The current equivalent-linear SSI (SASSI) analysis approach either joins the soil and structure together in both tension and compression or releases the soil from the structure for both tension and compression. It also makes linear approximations for material nonlinearities and generalizes energy absorption with viscous damping. This produces the potential for inaccurately establishing where the structural concerns exist and/or inaccurately establishing the amplitude of the in-structure responses. Seismic hazard curves at nuclear facilities have continued to increase over the years as more information has been developed on seismic sources (i.e. faults), additional information has been gathered on seismic events, and additional research has been performed to determine local site effects. Seismic hazard curves are used to develop the design basis earthquakes (DBEs) that are used to evaluate nuclear facility response. As the seismic hazard curves increase, the input ground motions (DBEs) used to numerically evaluate nuclear facility response increase, causing larger in-structure responses. As ground motions increase, so does the importance of including nonlinear effects in numerical SSI models. To include material nonlinearity in the soil and geometric nonlinearity using contact (gaping and sliding), it is necessary to develop a nonlinear time domain methodology. This...
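
A minimal sketch of the geometric nonlinearity (gaping) that the report says equivalent-linear tools cannot represent: a single-degree-of-freedom mass against a soil spring that resists only while the gap is closed, integrated in the time domain with semi-implicit Euler. All parameters are assumed.

```python
m, k, dt = 1.0e3, 4.0e6, 1.0e-4   # mass [kg], soil stiffness [N/m], step [s]
u, v = 0.01, 0.0                  # initial penetration [m], velocity [m/s]

for _ in range(2000):
    f_soil = -k * u if u > 0.0 else 0.0  # spring acts only in contact
    v += (f_soil / m) * dt               # semi-implicit (symplectic) Euler
    u += v * dt

print(u, v)  # the mass separates and coasts away once the gap opens
```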

  2. Quantum-Mechanics Methodologies in Drug Discovery: Applications of Docking and Scoring in Lead Optimization.

    Crespo, Alejandro; Rodriguez-Granillo, Agustina; Lim, Victoria T

    2017-01-01

    The development and application of quantum mechanics (QM) methodologies in computer-aided drug design have flourished in the last 10 years. Despite the natural advantage of QM methods of predicting binding affinities with a higher level of theory than methods based on molecular mechanics (MM), there are only a few examples where diverse sets of protein-ligand targets have been evaluated simultaneously. In this work, we review recent advances in QM docking and scoring for those cases in which a systematic analysis has been performed. In addition, we introduce and validate a simplified QM/MM expression to compute protein-ligand binding energies. Overall, QM-based scoring functions are generally better at predicting ligand affinities than those based on classical mechanics. However, the agreement between experimental activities and calculated binding energies is highly dependent on the specific chemical series considered. The advantage of more accurate QM methods is evident in cases where charge transfer and polarization effects are important, for example when metals are involved in the binding process or when dispersion forces play a significant role, as in the case of hydrophobic or stacking interactions.
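
    The paper's own simplified expression is not reproduced in this record; as a hedged illustration of the general idea, a subtractive (ONIOM-style) QM/MM energy and a binding-energy difference can be sketched as follows, with every energy value a placeholder for what an external QM or MM engine would supply:

      # Subtractive QM/MM sketch: the ligand region is treated at QM level,
      # everything else at MM level. Numbers are hypothetical placeholders.
      def qmmm_energy(e_mm_total, e_mm_region, e_qm_region):
          """Subtractive scheme: E = E_MM(all) - E_MM(region) + E_QM(region)."""
          return e_mm_total - e_mm_region + e_qm_region

      # Hypothetical single-point energies (hartree) from external engines:
      E_complex = qmmm_energy(-152.10, -0.62, -310.45)
      E_protein = -151.30        # receptor alone, MM level
      E_ligand  = -310.40        # ligand alone, QM level

      dE_bind = E_complex - (E_protein + E_ligand)
      print(f"estimated binding energy: {dE_bind:.3f} hartree")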

  3. Boolean modeling in systems biology: an overview of methodology and applications

    Wang, Rui-Sheng; Albert, Réka; Saadatpour, Assieh

    2012-01-01

    Mathematical modeling of biological processes provides deep insights into complex cellular systems. While quantitative and continuous models such as differential equations have been widely used, their use is obstructed in systems wherein the knowledge of mechanistic details and kinetic parameters is scarce. On the other hand, a wealth of molecular level qualitative data on individual components and interactions can be obtained from the experimental literature and high-throughput technologies, making qualitative approaches such as Boolean network modeling extremely useful. In this paper, we build on our research to provide a methodology overview of Boolean modeling in systems biology, including Boolean dynamic modeling of cellular networks, attractor analysis of Boolean dynamic models, as well as inferring biological regulatory mechanisms from high-throughput data using Boolean models. We finally demonstrate how Boolean models can be applied to perform the structural analysis of cellular networks. This overview aims to acquaint life science researchers with the basic steps of Boolean modeling and its applications in several areas of systems biology. (paper)
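
    As a concrete illustration of the basic steps described above (synchronous Boolean updating and attractor identification), the following toy three-node regulatory network, which is not from the paper, can be analysed exhaustively:

      # Synchronous Boolean dynamic model of a toy three-node network;
      # attractors are found by enumerating all 2^3 initial states.
      from itertools import product

      def update(state):
          a, b, c = state
          return (
              not c,       # A is inhibited by C
              a,           # B is activated by A
              a and b,     # C requires both A and B
          )

      attractors = set()
      for s0 in product([False, True], repeat=3):
          seen, s = {}, s0
          while s not in seen:          # iterate until a state repeats
              seen[s] = len(seen)
              s = update(s)
          cycle_start = seen[s]         # states at index >= cycle_start form the attractor
          cycle = tuple(sorted(k for k, i in seen.items() if i >= cycle_start))
          attractors.add(cycle)

      print(f"{len(attractors)} attractor(s):", attractors)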

  4. Methodological application so as to obtain digital elevation models DEM in wetland areas

    Quintero, Deiby A; Montoya V, Diana M; Betancur, Teresita

    2009-01-01

    In order to understand hydrological systems and describe the flow processes that occur among their components, it is essential to have a physiographic description that includes morphometric and relief characteristics. When local studies are performed, the basic cartography available, at best at 1:25,000 scale, tends not to meet the requirements for representing the water dynamics that characterize the interactions between streams, aquifers and lenticular water bodies in flat zones, particularly where wetlands are located in the ancient flood plains of rivers. A lack of financial resources is the principal obstacle to acquiring information that is current and sufficient for the scale of the project. The geomorphologic conditions of flat relief zones are a good alternative for the construction of new data. Using the available basic cartography and the new data, it is possible to obtain DEMs that are improved and consistent with the dynamics of surface and groundwater flows in the hydrological system. To accomplish this, one must use spatial modeling tools coupled with a Geographic Information System (GIS). This article presents a methodological application for the region surrounding the catchment of the Cienaga Colombia wetland in the Bajo Cauca region of Antioquia.
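
    The article's specific spatial-modeling algorithm is not given in this record; one common GIS choice for densifying sparse elevation data in flat terrain, shown here purely as an assumed illustration, is inverse-distance weighting:

      import numpy as np

      # Inverse-distance-weighted (IDW) interpolation of scattered elevation
      # points onto a query location -- a common DEM-building technique; the
      # article does not specify which method it actually used.
      def idw(x, y, pts, z, power=2.0):
          d = np.hypot(pts[:, 0] - x, pts[:, 1] - y)
          if np.any(d < 1e-9):               # query coincides with a sample
              return float(z[d.argmin()])
          w = 1.0 / d**power
          return float(np.sum(w * z) / np.sum(w))

      pts = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
      z = np.array([50.2, 50.8, 49.9, 50.5])  # hypothetical elevations [m]
      print("interpolated elevation:", idw(50.0, 50.0, pts, z))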

  5. Application of the integrated safety assessment methodology to the protection of electric systems

    Hortal, Javier; Izquierdo, Jose M.

    1996-01-01

    The generalization of classical techniques for risk assessment incorporating dynamic effects is the main objective of the Integrated Safety Assessment Methodology, as a practical implementation of Protection Theory. Transient stability, contingency analysis and protection setpoint verification in electric power systems are particularly appropriate domains of application, since the coupling of reliability and dynamic analysis in the protection assessment process is increasingly demanded. Suitable techniques for dynamic simulation of sequences of switching events in power systems are derived from the use of quasi-linear equation solution algorithms. The application of the methodology is illustrated, step by step, with a simple but representative example.

  6. A methodology to establish a database to study gene environment interactions for childhood asthma

    McCormick Jonathan

    2010-12-01

    Background: Gene-environment interactions are likely to explain some of the heterogeneity in childhood asthma. Here, we describe the methodology and experiences in establishing a database for childhood asthma designed to study gene-environment interactions (PAGES - Paediatric Asthma Gene Environment Study). Methods: Children with asthma and under the care of a respiratory paediatrician are being recruited from 15 hospitals between 2008 and 2011. An asthma questionnaire is completed and returned by post. At a routine clinic visit saliva is collected for DNA extraction. Detailed phenotyping in a proportion of children includes spirometry, bronchodilator response (BDR), skin prick reactivity, exhaled nitric oxide and salivary cotinine. Dietary and quality of life questionnaires are completed. Data are entered onto a purpose-built database. Results: To date 1045 children have been invited to participate and data collected in 501 (48%). The mean (SD) age of participants is 8.6 (3.9) years, 57% male. DNA has been collected from 436 children. Spirometry has been obtained in 172 children, with mean (SD) % predicted FEV1 of 97% (15) and median (IQR) BDR of 5% (2, 9). There were differences in age, socioeconomic status, severity and %FEV1 between the different centres (p≤0.024). Reasons for non-participation included parents not having time to take part, children not attending clinics and, in a small proportion, refusal to take part. Conclusions: It is feasible to establish a national database to study gene-environment interactions within an asthmatic paediatric population; there are barriers to participation and some different characteristics in individuals recruited from different centres. Recruitment to our study continues and is anticipated to extend current understanding of asthma heterogeneity.

  7. Go-flow: a reliability analysis methodology applicable to piping system

    Matsuoka, T.; Kobayashi, M.

    1985-01-01

    Since the completion of the Reactor Safety Study, the use of probabilistic risk assessment techniques has become more widespread in the nuclear community. Several analytical methods are used for the reliability analysis of nuclear power plants. The GO methodology is one of these methods. Using the GO methodology, the authors performed a reliability analysis of the emergency decay heat removal system of the nuclear ship Mutsu, in order to examine its applicability to piping systems. Through this analysis, the authors identified some disadvantages of the GO methodology. In the GO methodology, a signal represents a single off-to-on or on-to-off change; GO therefore finds the time point at which the state of a system changes, but cannot treat a system whose state changes as off-on-off. Several computer runs are also required to obtain the time-dependent failure probability of a system. In order to overcome these disadvantages, the authors propose a new analytical methodology: GO-FLOW. In GO-FLOW, the modeling method (chart) and the calculation procedure are similar to those in the GO methodology, but the meaning of signals and time points, and the definitions of operators, are essentially different. In the paper, the GO-FLOW methodology is explained and two examples of analysis by GO-FLOW are given.
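
    For flavour only, the following sketch treats a "signal" as the probability that flow exists at each discrete time point and propagates it through two illustrative operators; the operator set and numbering of the actual GO-FLOW methodology are not reproduced here:

      import numpy as np

      # GO-FLOW-style signal propagation: a signal is P(flow exists) at each
      # time point, transformed by operators. Illustrative operators only.
      time_points = 4
      source = np.ones(time_points)               # flow always available upstream

      def two_state_component(signal, p_success):
          # Component that passes the signal with probability p_success.
          return signal * p_success

      def or_gate(sig_a, sig_b):
          # Union of two (assumed independent) flow paths.
          return 1.0 - (1.0 - sig_a) * (1.0 - sig_b)

      pump_a = two_state_component(source, 0.95)
      pump_b = two_state_component(source, 0.90)
      header = or_gate(pump_a, pump_b)            # redundant pumps feed one header
      valve  = two_state_component(header, 0.99)

      print("P(flow) at each time point:", valve)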

  8. The Application Strategy of Iterative Solution Methodology to Matrix Equations in Hydraulic Solver Package, SPACE

    Na, Y. W.; Park, C. E.; Lee, S. Y.

    2009-01-01

    As a part of the Ministry of Knowledge Economy (MKE) project, 'Development of safety analysis codes for nuclear power plants', KOPEC has been developing the hydraulic solver code package applicable to the safety analyses of nuclear power plants (NPPs). The matrices of the hydraulic solver are usually sparse and may be asymmetric. In the earlier stage of this project, the typical direct matrix solver packages MA48 and MA28 were tested as matrix solvers for the hydraulic solver code, SPACE. The selection was based on the reasonably reliable performance experience with their former version, MA18, in the RELAP computer code. In the later stage of this project, iterative methodologies have been tested in the SPACE code. Among the candidate iterative solution methodologies tested so far, the biconjugate gradient stabilized method (BiCGSTAB) has shown the best performance in the applicability test and in the application to the SPACE code. Notwithstanding the merits of the direct solver packages, there are good reasons to pursue iterative solution methodologies: the algorithms are much simpler and easier to handle, and the potential problems related to their robustness have been resolved by applying preconditioning methods adjusted and modified as appropriate for the application in the SPACE code. The conjugate gradient method was introduced in detail by Shewchuk, Golub and Saad in the mid-1990s. Its application to nuclear engineering in Korea started about the same time and is still ongoing, with quite a few examples of application to neutronics; in addition, Yang introduced a conjugate gradient method programmed in the C++ language. The purpose of this study is to assess the performance and behavior of the iterative solution methodology compared to those of the direct solution methodology, which is still preferred for its robustness and reliability.
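
    The SPACE matrices themselves are not available here; as an assumed stand-in, the following sketch applies BiCGSTAB with an incomplete-LU preconditioner to a sparse nonsymmetric system using SciPy:

      import numpy as np
      import scipy.sparse as sp
      import scipy.sparse.linalg as spla

      # BiCGSTAB on a sparse, nonsymmetric tridiagonal system with an ILU
      # preconditioner -- the class of solver strategy the abstract describes.
      n = 1000
      A = sp.diags([-1.0, 4.0, -2.0], [-1, 0, 1], shape=(n, n), format="csc")
      b = np.ones(n)

      ilu = spla.spilu(A)                         # incomplete LU factorization
      M = spla.LinearOperator((n, n), ilu.solve)  # preconditioner M ~ A^-1

      x, info = spla.bicgstab(A, b, M=M)
      print("converged" if info == 0 else f"info={info}",
            "residual:", np.linalg.norm(b - A @ x))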

  9. Application of new design methodologies to very high-temperature metallic components of the HTTR

    Hada, Kazuhiko; Ohkubo, Minoru; Baba, Osamu

    1991-01-01

    The high-temperature piping and the helium-to-helium intermediate heat exchanger of the High-Temperature Engineering Test Reactor (HTTR) are designed to operate at very high temperatures of about 900 °C, among the class 1 components of the HTTR. At such high temperatures, the mechanical strength of heat-resistant metallic materials is very low and the thermal expansion of structural members is large. Therefore, innovative design methodologies are needed to reduce both the mechanical and thermal loads acting on these components. In the HTTR, design methodologies which separate the heat-resisting function from the pressure-retaining function, and allow the members to expand freely, are applied to reduce pressure and thermal loads. Since the applicability of these design methodologies must be verified, the Japan Atomic Energy Research Institute (JAERI) has been performing many design and research works on their verification. The details of the design methodologies and their verification are given in this paper. (orig.)

  10. An Application of the Methodology for Assessment of the Sustainability of Air Transport System

    Janic, Milan

    2003-01-01

    An assessment and operationalization of the concept of a sustainable air transport system is recognized as an important but complex research, operational and policy task. In the scope of academic efforts to properly address the problem, this paper aims to assess the sustainability of the air transport system. In particular, the paper describes a methodology for the assessment of sustainability and its potential application. The methodology consists of indicator systems, which relate to the operational, economic, social and environmental dimensions of the air transport system's performance. The particular indicator systems are relevant for particular actors, such as users (air travellers), air transport operators, aerospace manufacturers, local communities, governmental authorities at different levels (local, national, international), international air transport associations, pressure groups and the public. In the scope of application of the methodology, specific cases are selected to estimate the particular indicators, and thus to assess the system's sustainability under given conditions.

  11. Optimization Of Methodological Support Of Application Tax Benefits In Regions: Practice Of Perm Region

    Alexandr Ivanovich Tatarkin

    2015-03-01

    In the article, the problem of methodological support for regional tax benefits is reviewed. The method of tax benefit assessment adopted in Perm Region was chosen as the object of analysis because the relatively long period of application of the benefits has allowed a sufficient statistical base to be built. The article investigates the reliability of the budgetary, economic, investment and social effectiveness assessments of applying benefits based on the Method, and formulates suggestions for its improvement.

  12. Prognostics and health management design for rotary machinery systems—Reviews, methodology and applications

    Lee, Jay; Wu, Fangji; Zhao, Wenyu; Ghaffari, Masoud; Liao, Linxia; Siegel, David

    2014-01-01

    Much research has been conducted in prognostics and health management (PHM), an emerging field in mechanical engineering that is gaining interest from both academia and industry. Most of these efforts have been in the area of machinery PHM, resulting in the development of many algorithms for this particular application. The majority of these algorithms concentrate on applications involving common rotary machinery components, such as bearings and gears. Knowledge of this prior work is a necessity for any future research efforts to be conducted; however, there has not been a comprehensive overview that details previous and on-going efforts in PHM. In addition, a systematic method for developing and deploying a PHM system has yet to be established. Such a method would enable rapid customization and integration of PHM systems for diverse applications. To address these gaps, this paper provides a comprehensive review of the PHM field, followed by an introduction of a systematic PHM design methodology, 5S methodology, for converting data to prognostics information. This methodology includes procedures for identifying critical components, as well as tools for selecting the most appropriate algorithms for specific applications. Visualization tools are presented for displaying prognostics information in an appropriate fashion for quick and accurate decision making. Industrial case studies are included in this paper to show how this methodology can help in the design of an effective PHM system.
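
    As a small, assumed illustration of one step such a design methodology must cover (raw data, to health indicator, to decision), consider:

      import numpy as np

      # Turn a raw vibration record into simple health features and check
      # them against an alert threshold. Signal, baseline and threshold are
      # invented for illustration, not taken from the paper.
      rng = np.random.default_rng(0)
      fs, seconds = 1000, 1
      t = np.arange(fs * seconds) / fs
      signal = np.sin(2 * np.pi * 60 * t) + 0.1 * rng.standard_normal(t.size)

      rms = np.sqrt(np.mean(signal**2))           # classic bearing health feature
      kurtosis = np.mean((signal - signal.mean())**4) / signal.var()**2

      baseline_rms, alert_factor = 0.71, 1.5      # hypothetical healthy baseline
      print(f"RMS={rms:.3f}, kurtosis={kurtosis:.2f}",
            "-> ALERT" if rms > alert_factor * baseline_rms else "-> healthy")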

  13. Application of Resource Description Framework to Personalise Learning: Systematic Review and Methodology

    Jevsikova, Tatjana; Berniukevicius, Andrius; Kurilovas, Eugenijus

    2017-01-01

    The paper is aimed to present a methodology of learning personalisation based on applying Resource Description Framework (RDF) standard model. Research results are two-fold: first, the results of systematic literature review on Linked Data, RDF "subject-predicate-object" triples, and Web Ontology Language (OWL) application in education…
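
    The RDF "subject-predicate-object" model at the core of the approach can be sketched as follows; the vocabulary below (ex:prefersStyle and so on) is invented for illustration and is not the paper's ontology:

      from rdflib import Graph, Literal, Namespace, URIRef
      from rdflib.namespace import RDF

      # A learner profile expressed as RDF triples with rdflib.
      EX = Namespace("http://example.org/learning#")
      g = Graph()
      g.bind("ex", EX)

      learner = URIRef("http://example.org/learners/42")
      g.add((learner, RDF.type, EX.Learner))
      g.add((learner, EX.prefersStyle, Literal("visual")))
      g.add((learner, EX.recommendedResource, URIRef("http://example.org/lo/101")))

      for s, p, o in g:          # each stored triple
          print(s, p, o)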

  14. Radiological safety methodology in radioactive tracer applications for hydrodynamics and environmental studies

    Suarez, R.; Badano, A.; Dellepere, A.; Artucio, G.; Bertolotti, A.

    1995-01-01

    The use of radioactive tracer techniques to monitor sewage disposal contamination in the Montevideo estuary and at Carrasco beach has been studied for the Nuclear Technology National Direction. Simulation with hydrodynamic models was introduced as the working methodology. Radiological safety in the application of radioactive materials to environmental studies is also evaluated, mainly in the conclusions and recommendations of this report.

  15. The Expanded Application of Forensic Science and Law Enforcement Methodologies in Army Counterintelligence

    2017-09-01

    enforcement (LE) capabilities during the investigation of criminal offenses has become commonplace in the U.S. criminal justice system. These... system, and would likely need to go to their local Army CID or military police...

  16. CHARACTERIZATION OF SMALL AND MEDIUM ENTERPRISES (SMES OF POMERANIAN REGION IN SIX SIGMA METHODOLOGY APPLICATION

    2011-12-01

    Background: Six Sigma relates to product characteristics and to the parameters of the actions needed to obtain those products. It is also a multi-step, cyclic process aimed at improvements leading to a global standard, close to perfection. There is growing interest in the Six Sigma methodology among smaller organizations, but there are still too few publications presenting its use in the sector of small and medium enterprises, especially with good empirical results. It was already noticed during preliminary research that only a small part of the companies from this sector in the Pomeranian region use elements of this methodology. Methods: The companies were divided into groups by the type of their activities as well as by employment size. Questionnaires were sent to 150 randomly selected organizations in two steps and were addressed to senior managers. The questionnaire asked for basic information about the company, the level of knowledge and practical application of the Six Sigma methodology, opinions about improvements of processes occurring in the company, and opinions about training in the Six Sigma methodology. Results: The following hypotheses were proposed, statistically verified, and answered: the lack of adequate knowledge of the Six Sigma methodology in SMEs limits the possibility to effectively monitor and improve processes - accepted; the use of the statistical tools of the Six Sigma methodology requires broad action to popularize this knowledge among national SMEs - accepted; the level of awareness of the importance, as well as practical use, of the Six Sigma methodology in manufacturing SMEs is higher than in SMEs providing services - rejected, the levels are equal; the level of knowledge and use of the Six Sigma methodology in medium manufacturing companies is significantly higher than in small manufacturing companies - accepted.

  17. Spectroscopic and Dynamic Applications of Laser - Interactions.

    Quesada, Mark Alejandro

    1987-05-01

    Five different studies of laser-molecule interactions are conducted in this thesis. In part one, the first observation of Autler-Townes splitting in molecules is discussed and used to measure vibronic transition moments between excited electronic states. The effect was observed in the two-color, four-photon ionization of hydrogen via the resonant levels E,F(v = 6, J = 1) and D(v = 2, J = 2). Calculations gave good fits to the observed spectra, yielding a vibronic transition moment of 2.0 +/- 0.5 a.u. between the above excited states. In part two, a method for extracting the alignment parameters of a molecular angular momentum distribution using laser-induced fluorescence is presented. The treatment is applicable to the common case of cylindrically symmetric orientation distributions in the high-J limit. Four different combinations of rotational branches in the LIF absorption-emission process are examined. Computer algebra programs are used to generate simple analytical expressions which account for the influence of saturation on the determination of alignment parameters. In part three, the application of MPI-optogalvanic spectroscopy to the molecule 1,4-diazabicyclo[2.2.2]octane (DABCO) at various levels in a methane/air flame environment is described. The method employs a burner design that permits access to the preheated and primary reaction zones of the flame for laser probing. Hot bands arising from two-photon resonant (X₁′ → A₁′) transitions are measured and the intramolecular vibrational potentials for the ground and first excited state are determined. In part four, the relaxation of DABCO's ν₁₃ torsional mode in helium–DABCO and argon–DABCO supersonic jets, under low expansion conditions, is discussed. Modeling of the relaxation using the linear Landau-Teller relaxation equation is undertaken with various attempts to incorporate the effects of velocity slip. The relaxation rate is found to be independent of slip and the cross section dependent on the inverse of

  18. Methodology to explore interactions between the water system and society in order to identify adaptation strategies

    Offermans, A. G. E.; Haasnoot, M.

    2009-04-01

    Development of sustainable water management strategies involves analysing current and future vulnerability, identification of adaptation possibilities, effect analysis and evaluation of the strategies under different possible futures. Recent studies on water management have often followed the pressure-effect chain and compared the state of social, economic and ecological functions of water systems in one or two future situations with the current situation. The future is, however, more complex and dynamic. Water management faces major challenges in coping with future uncertainties in both the water system and the social system. Uncertainties in the water system relate to (changes in) drivers and pressures and their effects on the state, like the effects of climate change on discharges. Uncertainties in the social world relate to changing perceptions, objectives and demands concerning water (management), which are often related to the aforementioned changes in the physical environment. The methodology presented here comprises the 'Perspectives method', derived from Cultural Theory: a method for analyzing and classifying social responses to social and natural states and pressures. The method will be used for scenario analysis and to identify social responses, including changes in perspectives and management strategies. The scenarios and responses will be integrated within a rapid assessment tool. The purpose of the tool is to provide users with insight into the interaction of the social and physical systems and to identify robust water management strategies by analysing their effectiveness under different possible futures on the physical, social and socio-economic system. This method allows for a mutual interaction between the physical and social system. We present the theoretical background of the perspectives method as well as a historical overview of perspective changes in the Dutch Meuse area to show how social and physical systems interrelate.

  19. Connecting traces : Understanding client-server interactions in Ajax applications

    Matthijssen, N.; Zaidman, A.; Storey, M.; Bull, I.; Van Deursen, A.

    2010-01-01

    Ajax-enabled web applications are a new breed of highly interactive, highly dynamic web applications. Although Ajax allows developers to create rich web applications, Ajax applications can be difficult to comprehend and thus to maintain. For this reason, we have created FireDetective, a tool that

  20. Environmental and sanitary evaluation of electro-nuclear sites: methodological research and application to prospective scenarios

    2004-12-01

    In the framework of the radioactive waste disposal studies required by the French law of 1991, an exchange forum constituted by ANDRA, CEA, COGEMA, EdF, Framatome-ANP and IRSN carried out an environmental and health evaluation of the different radioactive waste management options. This report presents the six studied scenarios, the proposed methodology, its application to the six scenarios, and the analysis of the results, which showed the effectiveness of the different recycling options in limiting the impacts of the electronuclear cycle; a technical conclusion illustrates possible improvements of the methodology. (A.L.B.)

  1. Problem solving environment for distributed interactive applications

    Rycerz, K.; Bubak, M.; Sloot, P.; Getov, V.; Gorlatch, S.; Bubak, M.; Priol, T.

    2008-01-01

    Interactive Problem Solving Environments (PSEs) offer an integrated approach for constructing and running complex systems, such as distributed simulation systems. To achieve efficient execution of High Level Architecture (HLA)-based distributed interactive simulations on the Grid, we introduce a PSE

  2. Case-Crossover Analysis of Air Pollution Health Effects: A Systematic Review of Methodology and Application

    Carracedo-Martínez, Eduardo; Taracido, Margarita; Tobias, Aurelio; Saez, Marc; Figueiras, Adolfo

    2010-01-01

    Background Case-crossover is one of the most used designs for analyzing the health-related effects of air pollution. Nevertheless, no one has reviewed its application and methodology in this context. Objective We conducted a systematic review of case-crossover (CCO) designs used to study the relationship between air pollution and morbidity and mortality, from the standpoint of methodology and application. Data sources and extraction A search was made of the MEDLINE and EMBASE databases. Reports were classified as methodologic or applied. From the latter, the following information was extracted: author, study location, year, type of population (general or patients), dependent variable(s), independent variable(s), type of CCO design, and whether effect modification was analyzed for variables at the individual level. Data synthesis The review covered 105 reports that fulfilled the inclusion criteria. Of these, 24 addressed methodological aspects, and the remainder involved the design’s application. In the methodological reports, the designs that yielded the best results in simulation were symmetric bidirectional CCO and time-stratified CCO. Furthermore, we observed an increase across time in the use of certain CCO designs, mainly symmetric bidirectional and time-stratified CCO. The dependent variables most frequently analyzed were those relating to hospital morbidity; the pollutants most often studied were those linked to particulate matter. Among the CCO-application reports, 13.6% studied effect modification for variables at the individual level. Conclusions The use of CCO designs has undergone considerable growth; the most widely used designs were those that yielded better results in simulation studies: symmetric bidirectional and time-stratified CCO. However, the advantages of CCO as a method of analysis of variables at the individual level are put to little use. PMID:20356818
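
    To make the best-performing design concrete, here is a small assumed sketch of time-stratified referent selection (each event day matched to all other same-weekday days in its calendar month); the dates are invented:

      import pandas as pd

      # Time-stratified case-crossover referent selection: for each event
      # day, the referents are the other days of the same year-month that
      # share its weekday.
      events = pd.to_datetime(["2009-07-03", "2009-07-15"])
      days = pd.date_range("2009-07-01", "2009-07-31", freq="D")

      for event in events:
          referents = [d for d in days
                       if d.month == event.month and d.year == event.year
                       and d.dayofweek == event.dayofweek and d != event]
          print(event.date(), "-> referents:", [str(d.date()) for d in referents])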

  3. An Evaluation Methodology Development and Application Process for Severe Accident Safety Issue Resolution

    Robert P. Martin

    2012-01-01

    A general evaluation methodology development and application process (EMDAP) paradigm is described for the resolution of severe accident safety issues. For the broader objective of complete and comprehensive design validation, severe accident safety issues are resolved by demonstrating comprehensive severe-accident-related engineering through applicable testing programs, process studies demonstrating certain deterministic elements, probabilistic risk assessment, and severe accident management guidelines. The basic framework described in this paper extends the top-down, bottom-up strategy described in U.S. Nuclear Regulatory Commission Regulatory Guide 1.203 to severe accident evaluations addressing U.S. NRC expectations for plant design certification applications.

  4. Development of Methodologies, Metrics, and Tools for Investigating Human-Robot Interaction in Space Robotics

    Ezer, Neta; Zumbado, Jennifer Rochlis; Sandor, Aniko; Boyer, Jennifer

    2011-01-01

    Human-robot systems are expected to have a central role in future space exploration missions that extend beyond low-earth orbit [1]. As part of a directed research project funded by NASA's Human Research Program (HRP), researchers at the Johnson Space Center have started to use a variety of techniques, including literature reviews, case studies, knowledge capture, field studies, and experiments to understand critical human-robot interaction (HRI) variables for current and future systems. Activities accomplished to date include observations of the International Space Station's Special Purpose Dexterous Manipulator (SPDM), Robonaut, and Space Exploration Vehicle (SEV), as well as interviews with robotics trainers, robot operators, and developers of gesture interfaces. A survey of methods and metrics used in HRI was completed to identify those most applicable to space robotics. These methods and metrics included techniques and tools associated with task performance, the quantification of human-robot interactions and communication, usability, human workload, and situation awareness. The need for more research in areas such as natural interfaces, compensations for loss of signal and poor video quality, psycho-physiological feedback, and common HRI testbeds was identified. The initial findings from these activities and planned future research are discussed.

  5. New applications of the interaction between diols and boronic acids

    Duval, F.L.

    2015-01-01

    Florine Duval - New applications of the interaction between diols and boronic acids – Summary

    Chapter 1 introduces the theory and known applications of the interaction between boronic acids and diols, and explains the context of this thesis. Diagnosis of

  6. Application of 'Process management' methodology in providing financial services of PE 'Post Serbia'

    Kujačić Momčilo D.

    2014-01-01

    The paper describes the application of the 'Process management' methodology to the provision of financial services in the post office counter hall. An overview of the methodology is given, as one of the most commonly used qualitative methodologies, and those Process management techniques are described that can better meet user needs and market demands and more effectively resist the current competition in the postal service market. One of the main problems pointed out is the long waiting time in the counter hall during the provision of financial services, which leads to the formation of queues and thus to customer dissatisfaction. Accordingly, the paper outlines the steps that should be taken when providing financial services in a postal network unit, optimizing the time users wait in line and increasing the satisfaction of all participants in the process.
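
    The paper applies Process management qualitatively; as an assumed quantitative companion, an M/M/c queueing check of counter-hall waiting times (with invented arrival and service rates) could look like:

      from math import factorial

      # Erlang-C formula for an M/M/c queue: probability of waiting and the
      # mean time spent waiting in the queue.
      def erlang_c(lam, mu, c):
          """lam: arrival rate, mu: service rate per counter, c: counters."""
          a = lam / mu                            # offered load (Erlangs)
          rho = a / c
          assert rho < 1, "queue is unstable"
          top = a**c / (factorial(c) * (1 - rho))
          p_wait = top / (sum(a**k / factorial(k) for k in range(c)) + top)
          return p_wait, p_wait / (c * mu - lam)  # P(wait), mean wait Wq [s]

      p, wq = erlang_c(lam=60/3600, mu=1/150, c=3)  # 60 customers/h, 150 s service
      print(f"P(wait)={p:.2f}, mean wait={wq/60:.1f} min")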

  7. Assessment methodology applicable to safe decommissioning of Romanian VVR-S research reactor

    Baniu, O.; Vladescu, G.; Vidican, D.; Penescu, M.

    2002-01-01

    The paper contains the results of research performed by CITON specialists on the assessment methodology intended to be applied to the safe decommissioning of research reactors, developed taking into account the specific conditions of the Romanian VVR-S Research Reactor. The Romanian VVR-S Research Reactor is an old reactor (1957) and its Decommissioning Plan is under study. The main topics of the paper are as follows: the safety approach to the decommissioning of nuclear facilities and the applicable safety principles; the main steps of the proposed assessment methodology; the generic content of the Decommissioning Plan and the main decommissioning activities, with a discussion of the proposed Decommissioning Plan for the Romanian research reactor; the safety risks which may occur during decommissioning activities, covering normal decommissioning operations, fault conditions, and internal and external hazards; and the typical development of a scenario, the Features, Events and Processes list, exposure pathways and the calculation methodology. (author)

  8. Applications of a methodology for the analysis of learning trends in nuclear power plants

    Cho, Hang Youn; Choi, Sung Nam; Yun, Won Yong

    1995-01-01

    A methodology is applied to identify learning trends related to the safety and availability of U.S. commercial nuclear power plants. The application is intended to aid in reducing the likelihood of human errors. To ensure that the methodology can be easily adapted to various classification schemes for operating data, a data bank classified by the Transient Analysis Classification and Evaluation (TRACE) scheme was selected. Significance criteria for human-initiated events affecting the systems and for events caused by human deficiencies were used. Clustering analysis was used to identify the learning trend in multi-dimensional histograms. A computer code based on the K-Means algorithm was developed and applied to find the learning period in which error rates decrease monotonically with plant age.
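
    A minimal sketch of this kind of analysis, with synthetic error-rate data and scikit-learn's K-Means standing in for the authors' code:

      import numpy as np
      from sklearn.cluster import KMeans

      # Cluster yearly human-error rates by plant age to separate an early
      # "learning" period from a mature one (data are synthetic).
      age = np.arange(1, 16)                          # plant age in years
      rate = 8.0 * np.exp(-0.3 * age) \
             + np.random.default_rng(1).normal(0, 0.2, 15)
      X = np.column_stack([age, rate])

      labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
      for k in range(2):
          yrs = age[labels == k]
          print(f"cluster {k}: years {yrs.min()}-{yrs.max()}, "
                f"mean rate {rate[labels == k].mean():.2f}")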

  9. Application of Master Curve Methodology for Structural Integrity Assessments of Nuclear Components

    Sattari-Far, Iradj [Det Norske Veritas, Stockholm (Sweden); Wallin, Kim [VTT, Esbo (Finland)

    2005-10-15

    The objective was to perform an in-depth investigation of the Master Curve methodology and, based on this method, to develop a procedure for fracture assessments of nuclear components. The project has sufficiently illustrated the capabilities of the Master Curve methodology for fracture assessments of nuclear components. Within the scope of this work, the theoretical background of the methodology and its validation on small and large specimens have been studied and presented to a sufficient extent, as well as the correlations between Charpy-V data and the Master Curve T₀ reference temperature in the evaluation of fracture toughness. The work gives a comprehensive report of the background theory and the different applications of the Master Curve methodology. The main results of the work have shown that cleavage fracture toughness exhibits a large amount of statistical scatter in the transition region, is specimen-size dependent, and should be treated statistically rather than deterministically. The Master Curve methodology is able to make use of statistical data in a consistent way. Furthermore, the Master Curve methodology provides a more precise prediction of the fracture toughness of embrittled materials in comparison with the ASME K_IC reference curve, which often gives over-conservative results. The procedure suggested in this study, concerning the application of the Master Curve method in fracture assessments of ferritic steels in the transition and lower-shelf regions, is valid for the temperature range T₀ − 50 °C ≤ T ≤ T₀ + 50 °C. If only approximate information is required, the Master Curve may well be extrapolated outside this temperature range. The suggested procedure is also illustrated with some examples.
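
    For reference, the median Master Curve for a standard 1T specimen, as standardised in ASTM E1921, takes the form

      K_{Jc(\mathrm{med})} = 30 + 70\,\exp\bigl[0.019\,(T - T_{0})\bigr] \qquad \mathrm{MPa\sqrt{m}},\; T\ \mathrm{in}\ ^{\circ}\mathrm{C},

    so T₀ is simply the temperature at which the median toughness of a 1T specimen equals 100 MPa√m.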

  10. Hepatic transporter drug-drug interactions: an evaluation of approaches and methodologies.

    Williamson, Beth; Riley, Robert J

    2017-12-01

    Drug-drug interactions (DDIs) continue to account for 5% of hospital admissions and therefore remain a major regulatory concern. Effective, quantitative prediction of DDIs will reduce unexpected clinical findings and encourage projects to front-load DDI investigations rather than concentrating on risk management ('manage the baggage') later in drug development. A key challenge in DDI prediction is the discrepancy between reported models. Areas covered: The current synopsis focuses on four recent influential publications on hepatic drug transporter DDIs using static models that tackle interactions with individual transporters and in combination with other drug transporters and metabolising enzymes. These models vary in their assumptions (including input parameters), transparency, reproducibility and complexity. In this review, these facets are compared and contrasted, with recommendations made as to their application. Expert opinion: Over the past decade, static models have evolved from simple [I]/Ki models to incorporate victim and perpetrator disposition mechanisms, including the absorption rate constant, the fraction of the drug metabolised/eliminated and/or clearance concepts. Nonetheless, models that comprise additional parameters and complexity do not necessarily outperform simpler models with fewer inputs. Further, consideration of the property space to exploit some drug target classes has also highlighted the fine balance required between front-loading and back-loading studies to design out or 'manage the baggage'.
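
    The simplest static relationship referred to above can be sketched as follows (a hedged illustration with hypothetical numbers, where fm is the fraction of victim clearance through the inhibited pathway):

      # Basic static DDI model: predicted fold-change in victim-drug AUC when
      # an inhibitor at concentration I (with inhibition constant Ki) acts on
      # a pathway carrying fraction fm of total clearance.
      def auc_ratio(I, ki, fm=1.0):
          """[I]/Ki static model extended with a pathway fraction fm."""
          return 1.0 / (fm / (1.0 + I / ki) + (1.0 - fm))

      print(auc_ratio(I=1.0, ki=0.5, fm=1.0))   # whole clearance inhibited -> 3.0
      print(auc_ratio(I=1.0, ki=0.5, fm=0.6))   # partial pathway -> ~1.67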

  11. Towards more sustainable management of European food waste: Methodological approach and numerical application.

    Manfredi, Simone; Cristobal, Jorge

    2016-09-01

    Responding to the latest policy needs, the work presented in this article aims at developing a life-cycle based framework methodology to quantitatively evaluate the environmental and economic sustainability of European food waste management options. The methodology is structured into six steps aimed at defining the boundaries and scope of the evaluation, evaluating environmental and economic impacts, and identifying the best performing options. The methodology is able to accommodate additional assessment criteria, for example the social dimension of sustainability, thus moving towards a comprehensive sustainability assessment framework. A numerical case study is also developed to provide an example of the application of the proposed methodology to an average European context. Different options for food waste treatment are compared, including landfilling, composting, anaerobic digestion and incineration. The environmental dimension is evaluated with the software EASETECH, while the economic assessment is conducted on the basis of different indicators expressing the costs associated with food waste management. Results show that the proposed methodology allows for a straightforward identification of the most sustainable options for food waste and thus can provide factual support to decision/policy making. However, it was also observed that results markedly depend on a number of user-defined assumptions, for example the choice of indicators to express environmental and economic performance. © The Author(s) 2016.

  12. Application of FIVE methodology in probabilistic risk assessment (PRA) of fire events

    Lopez Garcia, F.J.; Suarez Alonso, J.; Fiolamengual, M.J.

    1993-01-01

    This paper reflects the experience acquired during the evaluation and updating of the fire analysis within the Cofrentes NPP PRA. It determines which points are the least precise, either because of their greater uncertainty or because of their excessive conservatism, as well as the subtasks which have involved a larger workload and could be simplified. These aspects are compared with the steps followed in the FIVE methodology (Fire Vulnerability Evaluation Methodology) to assess whether its application would optimize the task by making it more systematic and realistic and by reducing uncertainties. On the one hand, the FIVE methodology does not have sufficient scope to carry out a quantitative risk evaluation, but it can easily be complemented - without detriment to its systematic nature - by quantifying core damage in significant areas. On the other hand, certain issues, such as the definition of the fire-growth software program to be used, are still not fully closed. Nevertheless, the conclusions derived from this assessment are satisfactory, since it is considered that this methodology would serve to unify the criteria and data of fire-induced risk analyses, providing a progressive screening method which would considerably simplify the task. (author)

  13. Application of the accident management information needs methodology to a severe accident sequence

    Ward, L.W.; Hanson, D.J.; Nelson, W.R.; Solberg, D.E.

    1989-01-01

    The U.S. Nuclear Regulatory Commission (NRC) is conducting an Accident Management Research Program that emphasizes the application of severe accident research results to enhance the capability of plant operating personnel to effectively manage severe accidents. A methodology to identify and assess the information needs of the operating staff of a nuclear power plant during a severe accident has been developed as part of the research program designed to resolve this issue. The methodology identifies: the information needs of plant personnel during a wide range of accident conditions; the existing plant measurements capable of supplying these information needs, and what, if any, minor additions to instrument and display systems would enhance the capability to manage accidents; known limitations on the capability of these measurements to function properly under the conditions present during a wide range of severe accidents; and areas in which the information systems could mislead plant personnel. This paper presents an application of this methodology to a severe accident sequence to demonstrate its use in identifying the information which is available for management of the event. The methodology has been applied to a severe accident sequence in a Pressurized Water Reactor with a large dry containment. An examination of the capability of the existing measurements was then performed to determine whether the information needs can be supplied.

  14. Application of NASA Kennedy Space Center system assurance analysis methodology to nuclear power plant systems designs

    Page, D.W.

    1985-01-01

    The Kennedy Space Center (KSC) entered into an agreement with the Nuclear Regulatory Commission (NRC) to conduct a study to demonstrate the feasibility and practicality of applying the KSC System Assurance Analysis (SAA) methodology to nuclear power plant systems designs. In joint meetings of KSC and Duke Power personnel, an agreement was made to select two Catawba systems, the Containment Spray System and the Residual Heat Removal System, for the analyses. Duke Power provided KSC with a full set of Final Safety Analysis Reports as well as schematics for the two systems. During Phase I of the study the reliability analyses of the SAA were performed. During Phase II the hazard analyses were performed. The final product of Phase II is a handbook for implementing the SAA methodology in nuclear power plant systems designs. The purpose of this paper is to describe the SAA methodology as it applies to nuclear power plant systems designs and to discuss the feasibility of its application. The conclusion is drawn that nuclear power plant systems and aerospace ground support systems are similar in complexity and design and share common safety and reliability goals, and that the SAA methodology is readily adaptable to nuclear power plant designs because of its practical application of existing and well-known safety and reliability analytical techniques tied to an effective management information system.

  15. A dynamic systems engineering methodology research study. Phase 2: Evaluating methodologies, tools, and techniques for applicability to NASA's systems projects

    Paul, Arthur S.; Gill, Tepper L.; Maclin, Arlene P.

    1989-01-01

    A study of NASA's Systems Management Policy (SMP) concluded that the primary methodology being used by the Mission Operations and Data Systems Directorate and its subordinate, the Networks Division, is very effective. Still some unmet needs were identified. This study involved evaluating methodologies, tools, and techniques with the potential for resolving the previously identified deficiencies. Six preselected methodologies being used by other organizations with similar development problems were studied. The study revealed a wide range of significant differences in structure. Each system had some strengths but none will satisfy all of the needs of the Networks Division. Areas for improvement of the methodology being used by the Networks Division are listed with recommendations for specific action.

  16. Tornado missile simulation and design methodology. Volume 1: simulation methodology, design applications, and TORMIS computer code. Final report

    Twisdale, L.A.; Dunn, W.L.

    1981-08-01

    A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments. Sensitivity analyses have been performed on both the individual models and the integrated methodology, and risk has been assessed for a hypothetical nuclear power plant design case study
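
    In the same Monte Carlo spirit (though with invented distributions and parameters, not those of TORMIS), a toy impact-probability estimate looks like:

      import numpy as np

      # Sample tornado events, propagate simple missile-flight outcomes, and
      # estimate the probability of at least one impact on a target area.
      rng = np.random.default_rng(42)
      n_events = 100_000
      target_area = 500.0                     # m^2, hypothetical structure

      strike = rng.random(n_events) < 1e-3    # P(tornado strikes the site)
      n_missiles = rng.poisson(20, n_events)  # debris launched per striking event
      p_hit = 1.0 - np.exp(-n_missiles * target_area / 1.0e6)  # per-event hit prob.

      p_impact = np.mean(strike * p_hit)      # P(strike AND at least one impact)
      print(f"estimated annual impact probability: {p_impact:.2e}")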

  17. 2014 International Conference on Simulation and Modeling Methodologies, Technologies and Applications

    Ören, Tuncer; Kacprzyk, Janusz; Filipe, Joaquim

    2015-01-01

    The present book includes a set of selected extended papers from the 4th International Conference on Simulation and Modeling Methodologies, Technologies and Applications (SIMULTECH 2014), held in Vienna, Austria, from 28 to 30 August 2014. The conference brought together researchers, engineers and practitioners interested in methodologies and applications of modeling and simulation. New and innovative solutions are reported in this book. SIMULTECH 2014 received 167 submissions, from 45 countries, in all continents. After a double blind paper review performed by the Program Committee, 23% were accepted as full papers and thus selected for oral presentation. Additional papers were accepted as short papers and posters. A further selection was made after the Conference, based also on the assessment of presentation quality and audience interest, so that this book includes the extended and revised versions of the very best papers of SIMULTECH 2014. Commitment to high quality standards is a major concern of SIMULTEC...

  18. Methodology and application of surrogate plant PRA analysis to the Rancho Seco Power Plant: Final report

    Gore, B.F.; Huenefeld, J.C.

    1987-07-01

    This report presents the development and the first application of generic probabilistic risk assessment (PRA) information for identifying systems and components important to public risk at nuclear power plants lacking plant-specific PRAs. A methodology is presented for using the results of PRAs for similar (surrogate) plants, along with plant-specific information about the plant of interest and the surrogate plants, to infer important failure modes for systems of the plant of interest. This methodology, and the rationale on which it is based, is presented in the context of its application to the Rancho Seco plant. The Rancho Seco plant has been analyzed using PRA information from two surrogate plants. This analysis has been used to guide development of considerable plant-specific information about Rancho Seco systems and components important to minimizing public risk, which is also presented herein

  19. 5th International Conference on Simulation and Modeling Methodologies, Technologies and Applications

    Kacprzyk, Janusz; Ören, Tuncer; Filipe, Joaquim

    2016-01-01

    The present book includes a set of selected extended papers from the 5th International Conference on Simulation and Modeling Methodologies, Technologies and Applications (SIMULTECH 2015), held in Colmar, France, from 21 to 23 July 2015. The conference brought together researchers, engineers and practitioners interested in methodologies and applications of modeling and simulation. New and innovative solutions are reported in this book. SIMULTECH 2015 received 102 submissions, from 36 countries, in all continents. After a double blind paper review performed by the Program Committee, 19% were accepted as full papers and thus selected for oral presentation. Additional papers were accepted as short papers and posters. A further selection was made after the Conference, based also on the assessment of presentation quality and audience interest, so that this book includes the extended and revised versions of the very best papers of SIMULTECH 2015. Commitment to high quality standards is a major concern of SIMULTECH t...

  20. Managing mixed fisheries in the European western waters: application of Fcube methodology

    Iriondo, Ane; García, Dorleta; Santurtún, Marina

    2012-01-01

    Fisheries management is moving towards ecosystem-based management instead of traditional single-species based advice. To progress towards an ecosystem approach, a new methodology called 'Fleet and Fisheries Forecast' (Fcube) has been proposed. In the application of the method, a precise initial ... the lowest. In this analysis, Western Waters fleet management results show consistency between stocks and their respective TACs. The study highlights that it is possible to deliver advice within the context of mixed fisheries using the Fcube method.

  1. Methodological study of the diffusion of interacting cations through clays. Application: experimental tests and simulation of coupled chemistry-diffusion transport of alkaline ions through a synthetic bentonite

    Melkior, Th

    2000-07-01

    The subject of this work relates to the project of underground disposal of radioactive waste in deep geological formations. It concerns the study of the migration of radionuclides through clays. In these materials, the main transport mechanism under natural conditions is assumed to be diffusion, and diffusion experiments are therefore conducted. With interacting solutes which present a strong affinity for the material, the duration of these tests would be too long for the range of concentrations of interest. An alternative is to determine, on the one hand, the geochemical retention properties using batch tests on crushed rock samples and, on the other hand, to deduce the transport parameters from diffusion tests realised with a non-interacting tracer, tritiated water. These data are then used to simulate the migration of the reactive elements with a numerical code that can handle coupled chemistry-diffusion equations. The validity of this approach is tested by comparing the numerical simulations with the results of diffusion experiments of cations through a clay. The subject is investigated for the diffusion of cesium, lithium and sodium through a compacted sodium bentonite. The diffusion tests are realised with the through-diffusion method. The comparison between the experimental results and the simulations shows that the latter tend to underestimate the propagation of the considered species. The differences could be attributed to surface diffusion and to a decrease in the accessibility of the fixation sites of the bentonite between the conditions of clay suspensions in batch tests and the situation of compacted samples. The influence of the experimental apparatus used during the diffusion tests on the measurement results has also been tested; it showed that the apparatus has to be taken into consideration when the experimental data are interpreted. A specific model has therefore been developed with the numerical code CASTEM 2000. (author)
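
    The link between batch-derived sorption and diffusion that this approach relies on is commonly written Da = De / (ε + ρd·Kd); a minimal sketch with invented parameter values:

      # Apparent diffusivity from an effective diffusivity (measured with a
      # non-sorbing tracer such as tritiated water) and a batch-test Kd.
      def apparent_diffusivity(De, eps, rho_d, Kd):
          """De: effective diffusivity [m2/s]; eps: porosity [-];
          rho_d: dry density [kg/m3]; Kd: distribution coefficient [m3/kg]."""
          return De / (eps + rho_d * Kd)

      Da_hto = apparent_diffusivity(De=2e-10, eps=0.4, rho_d=1600.0, Kd=0.0)
      Da_cs  = apparent_diffusivity(De=2e-10, eps=0.4, rho_d=1600.0, Kd=0.05)
      print(f"HTO: {Da_hto:.1e} m2/s, Cs (sorbing): {Da_cs:.1e} m2/s")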

  2. Application of project management methodology in design management of nuclear safety related structure

    Chen Mao

    2004-01-01

    This paper focuses on the application of project management methodology to the design management of Nuclear Safety Related Structures (NSRS), considering the design management features of their civil construction. Based on experience from the management of several projects, the project management triangle is proposed for use in this management, so as to properly handle the position of the design interface within the project. Some other management methods are also proposed.

  3. Methodology and application of 13C breath test in gastroenterology practice

    Yan Weili; Jiang Yibin

    2002-01-01

    The ¹³C breath test has been widely used in research on nutrition, pharmacology and gastroenterology for properties such as its safety and non-invasiveness. The author describes the principle and methodology of the ¹³C breath test and its application to the detection of Helicobacter pylori infection of the stomach and of small bowel bacterial overgrowth, and to the measurement of gastric emptying, pancreatic exocrine function and liver function with various substrates.
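
    The usual read-out of such a test is a delta-over-baseline (DOB) value; the following sketch (with invented breath ratios and the commonly quoted 4 per-mil cut-off) shows the arithmetic:

      # Delta-over-baseline for a 13C breath test: 13C/12C ratios are
      # expressed relative to the PDB standard, and the rise over the
      # pre-dose baseline indicates substrate metabolism.
      R_PDB = 0.0112372                          # 13C/12C of the PDB standard

      def delta13C(r_sample):
          return (r_sample / R_PDB - 1.0) * 1000.0   # per mil

      baseline, post_dose = 0.0109500, 0.0110100     # hypothetical breath ratios
      dob = delta13C(post_dose) - delta13C(baseline)
      print(f"DOB = {dob:.1f} per mil", "-> positive test" if dob > 4.0 else "")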

  4. Methodologies and applications for critical infrastructure protection: State-of-the-art

    Yusta, Jose M.; Correa, Gabriel J.; Lacal-Arantegui, Roberto

    2011-01-01

    This work provides an update of the state of the art on energy security relating to critical infrastructure protection. For this purpose, the survey is based upon the conceptual view of OECD countries, and specifically on EU Directive 114/08/EC on the identification and designation of European critical infrastructures and on the 2009 US National Infrastructure Protection Plan. The review discusses the different definitions of energy security, critical infrastructure and key resources, and presents some of the experiences of countries considered international references on the subject, including some information-sharing issues. In addition, the paper carries out a complete review of current methodologies, software applications and modelling techniques around critical infrastructure protection in accordance with their functionality in a risk management framework. The study of threats and vulnerabilities in critical infrastructure systems shows two important trends in methodologies and modelling. A first trend relates to the identification of methods, techniques, tools and diagrams to describe the current state of infrastructure. The other trend captures the dynamic behaviour of infrastructure systems by means of simulation techniques, including system dynamics, Monte Carlo simulation, multi-agent systems, etc. - Highlights: We examine critical infrastructure protection experiences, systems and applications. Some international experiences are reviewed, including the EU EPCIP Plan and the US NIPP programme. We discuss current methodologies and applications in critical infrastructure protection, with emphasis on electric networks.

  5. Interaction Analysis: Theory, Research and Application.

    Amidon, Edmund J., Ed.; Hough, John J., Ed.

    This volume of selected readings developed for students and practitioners at various levels of sophistication is intended to be representative of work done to date on interaction analysis. The contents include journal articles, papers read at professional meetings, abstracts of doctoral dissertations, and selections from larger monographs, plus 12…

  6. Interactions in Generalized Linear Models: Theoretical Issues and an Application to Personal Vote-Earning Attributes

    Tsung-han Tsai

    2013-05-01

    There is some confusion in political science, and the social sciences in general, about the meaning and interpretation of interaction effects in models with non-interval, non-normal outcome variables. Often these terms are casually thrown into a model specification without observing that their presence fundamentally changes the interpretation of the resulting coefficients. This article explains the conditional nature of reported coefficients in models with interactions, defining the necessarily different interpretation required by generalized linear models. Methodological issues are illustrated with an application to voter information structured by electoral systems and resulting legislative behavior and democratic representation in comparative politics.
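
    The conditional-interpretation point can be made concrete with a small numeric sketch (coefficients invented, not from the article): in a logit model with an interaction term, the effect of one variable on the probability scale shifts with the other variable and equals neither the main-effect nor the interaction coefficient.

```python
# Hedged illustration: in a logit model the "interaction effect" on the
# probability scale is conditional on covariate values. Invented coefficients.
import numpy as np

b0, b1, b2, b12 = -1.0, 0.8, 0.5, 0.4   # intercept, x1, x2, x1*x2

def p(x1, x2):
    """Predicted probability from the logit model."""
    eta = b0 + b1 * x1 + b2 * x2 + b12 * x1 * x2
    return 1.0 / (1.0 + np.exp(-eta))

# Effect of switching x1 from 0 to 1, evaluated at different values of x2:
for x2 in (0.0, 1.0, 2.0):
    effect = p(1, x2) - p(0, x2)
    print(f"x2 = {x2:.0f}: change in Pr from x1 = {effect:.3f}")
# The change is not constant and equals neither b1 nor b12: on the
# probability scale the interaction is conditional on all covariates.
```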

  7. Application of the Spanish methodological approach for biosphere assessment to a generic high-level waste disposal site

    Agueero, A.; Pinedo, P.; Simon, I.; Cancio, D.; Moraleda, M.; Trueba, C.; Perez-Sanchez, D.

    2008-01-01

    A methodological approach which includes conceptual developments, methodological aspects and software tools has been developed in the Spanish context, based on the BIOMASS 'Reference Biospheres Methodology'. The biosphere assessments have to be undertaken with the aim of demonstrating compliance with principles and regulations established to limit the possible radiological impact of radioactive waste disposals on human health and on the environment, and to ensure that future generations will not be exposed to higher radiation levels than those that would be acceptable today. The biosphere in the context of high-level waste disposal is defined as the collection of various radionuclide transfer pathways that may result in releases into the surface environment, transport within and between the biosphere receptors, exposure of humans and biota, and the doses/risks associated with such exposures. The assessments need to take into account the complexity of the biosphere, the nature of the radionuclides released and the long timescales considered. It is also necessary to make assumptions related to the habits and lifestyle of the exposed population, human activities in the long term and possible modifications of the biosphere. A summary of the Spanish methodological approach for biosphere assessment is presented here, as well as its application in a Spanish generic case study. A reference scenario has been developed based on current conditions at a site located in Central-West Spain, to indicate the potential impact to the actual population. In addition, environmental change has been considered qualitatively through the use of interaction matrices and transition diagrams. Unit source terms of 36Cl, 79Se, 99Tc, 129I, 135Cs, 226Ra, 231Pa, 238U, 237Np and 239Pu have been taken. Two exposure groups of infants and adults have been chosen for dose calculations. Results are presented and their robustness is evaluated through the use of uncertainty and sensitivity analyses.

  8. Application of the Spanish methodological approach for biosphere assessment to a generic high-level waste disposal site.

    Agüero, A; Pinedo, P; Simón, I; Cancio, D; Moraleda, M; Trueba, C; Pérez-Sánchez, D

    2008-09-15

    A methodological approach which includes conceptual developments, methodological aspects and software tools has been developed in the Spanish context, based on the BIOMASS "Reference Biospheres Methodology". The biosphere assessments have to be undertaken with the aim of demonstrating compliance with principles and regulations established to limit the possible radiological impact of radioactive waste disposals on human health and on the environment, and to ensure that future generations will not be exposed to higher radiation levels than those that would be acceptable today. The biosphere in the context of high-level waste disposal is defined as the collection of various radionuclide transfer pathways that may result in releases into the surface environment, transport within and between the biosphere receptors, exposure of humans and biota, and the doses/risks associated with such exposures. The assessments need to take into account the complexity of the biosphere, the nature of the radionuclides released and the long timescales considered. It is also necessary to make assumptions related to the habits and lifestyle of the exposed population, human activities in the long term and possible modifications of the biosphere. A summary of the Spanish methodological approach for biosphere assessment is presented here, as well as its application in a Spanish generic case study. A reference scenario has been developed based on current conditions at a site located in Central-West Spain, to indicate the potential impact to the actual population. In addition, environmental change has been considered qualitatively through the use of interaction matrices and transition diagrams. Unit source terms of (36)Cl, (79)Se, (99)Tc, (129)I, (135)Cs, (226)Ra, (231)Pa, (238)U, (237)Np and (239)Pu have been taken. Two exposure groups of infants and adults have been chosen for dose calculations. Results are presented and their robustness is evaluated through the use of uncertainty and sensitivity analyses.

  9. Radar-acoustic interaction for IFF applications

    Saffold, James A.; Williamson, Frank R.; Ahuja, Krishan; Stein, Lawrence R.; Muller, Marjorie

    1998-08-01

    This paper describes the results of an internal development program (IDP) No. 97-1 conducted from August 1 to October 1, 1996 at the Georgia Tech Research Institute. The IDP program was implemented to establish theoretical relationships and verify the interaction between X-band radar waves and ultrasonic acoustics. Low-cost, off-the-shelf components were used for the verification in order to illustrate the cost-savings potential of developing and utilizing these systems. The measured data were used to calibrate the developed models of the phenomenology and to support extrapolation for radar systems that can exploit these interactions. One such exploitation is soldier-identification IFF and radar taggant concepts. The described IDP program provided the phenomenological data which is being used to extrapolate concept system performance based on technological limitations and battlefield conditions for low-cost IFF and taggant configurations.

  10. New Gogny interaction suitable for astrophysical applications

    Gonzalez-Boquera, C.; Centelles, M.; Viñas, X.; Robledo, L. M.

    2018-04-01

    The D1 family of parametrizations of the Gogny interaction commonly suffers from a rather soft neutron matter equation of state that leads to maximum masses of neutron stars well below the observational value of two solar masses. We propose a reparametrization scheme that preserves the good properties of the Gogny force but allows one to tune the density dependence of the symmetry energy, which, in turn, modifies the predictions for the maximum stellar mass. The scheme works well for D1M, and leads to a new parameter set, dubbed D1M*. In the neutron-star domain, D1M* predicts a maximum mass of two solar masses and global properties of the star in harmony with those obtained with the SLy4 Skyrme interaction. By means of a set of selected calculations in finite nuclei, we check that D1M* performs comparably well to D1M in several aspects of nuclear structure.

  11. Application of an environmental remediation methodology: theory vs. practice reflections and two Belgian case studies - 59184

    Blommaert, W.; Mannaerts, K.; Pepin, S.; Dehandschutter, B.

    2012-01-01

    As in many countries, polluted industrial sites also exist in Belgium. Although the contamination is purely chemical in most cases, the sites may also contain a radioactive component. For chemically contaminated sites, extensive regulations and methodologies were already developed and applied by the different regional authorities. However, essentially because radioactivity is a federal competence, there was also a need to develop a legal federal framework (including an ER-methodology [1]) for the remediation of radioactively contaminated sites. Most of the so-called radioactively contaminated sites exhibit mixed contamination (chemical and radiological), and hence the development of such a methodology had to be in line with the existing (regional) ones concerning chemical contamination. Since each authority has its own responsibilities with regard to the type of contamination, finding the best solution satisfying all involved parties is complicated and time-consuming. To overcome these difficulties, the legal framework and methodology - including the necessary involvement of the stakeholders and the delineation of each party's responsibilities - has to be transparent, clear and unambiguous. Once such a methodology is developed and approved, its application is expected to be more or less easy, logical and straightforward. But is this really true? The aim of this document is to investigate the impact of factors such as the type of radioactive contamination - levels of contamination, related to NORM activity or not, homogeneous or heterogeneous, the differences in licensing procedures, etc. - on the application of the developed methodology, and what the consequences could be in the long run for the remediation process. Two existing case studies in Belgium will be presented ([2]). The first case deals with a historical radium-contaminated site, the second one with a phosphate processing facility still in operation, both with (very) low

  12. PFEM application in fluid structure interaction problems

    Celigueta Jordana, Miguel Ángel; Larese De Tetto, Antonia; Latorre, Salvador

    2008-01-01

    In the current paper the Particle Finite Element Method (PFEM), an innovative numerical method for solving a wide spectrum of problems involving the interaction of fluids and structures, is briefly presented. Many examples of the use of the PFEM with GiD support are shown. The GiD framework provides a useful pre- and post-processor for the specific features of the method. Its advantages and shortcomings are pointed out in the present work.

  13. Methodological issues in detecting gene-gene interactions in breast cancer susceptibility: a population-based study in Ontario

    Onay Venus

    2007-08-01

    Background: There is growing evidence that gene-gene interactions are ubiquitous in determining the susceptibility to common human diseases. The investigation of such gene-gene interactions presents new statistical challenges for studies with relatively small sample sizes, as the number of potential interactions in the genome can be large. Breast cancer provides a useful paradigm to study genetically complex diseases because commonly occurring single nucleotide polymorphisms (SNPs) may additively or synergistically disturb the system-wide communication of the cellular processes leading to cancer development. Methods: In this study, we systematically studied SNP-SNP interactions among 19 SNPs from 18 key genes involved in major cancer pathways in a sample of 398 breast cancer cases and 372 controls from Ontario. We discuss the methodological issues associated with the detection of SNP-SNP interactions in this dataset by applying and comparing three commonly used methods: the logistic regression model, classification and regression trees (CART), and the multifactor dimensionality reduction (MDR) method. Results: Our analyses show evidence for several simple (two-way) and complex (multi-way) SNP-SNP interactions associated with breast cancer. For example, all three methods identified XPD-[Lys751Gln]*IL10-[G(-1082)A] as the most significant two-way interaction. CART and MDR identified the same critical SNPs participating in complex interactions. Our results suggest that the use of multiple statistical approaches (or an integrated approach), rather than a single methodology, could be the best strategy to elucidate complex gene interactions that have generally very different patterns. Conclusion: The strategy used here has the potential to identify complex biological relationships among breast cancer genes and processes. This will lead to the discovery of novel biological information, which will improve breast cancer risk management.
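
    As a hedged illustration of the first of the three methods (not the paper's code or data), the sketch below runs a 1-degree-of-freedom likelihood-ratio test for a SNP-SNP interaction term in a logistic regression on simulated genotypes.

```python
# Minimal sketch: likelihood-ratio test for a SNP-SNP interaction in logistic
# regression, on simulated genotype data (all effect sizes invented).
import numpy as np
import statsmodels.api as sm
from scipy.stats import chi2

rng = np.random.default_rng(0)
n = 770                                   # roughly the Ontario sample size
snp1 = rng.integers(0, 3, n)              # genotypes coded 0/1/2
snp2 = rng.integers(0, 3, n)
eta = -0.5 + 0.2*snp1 + 0.1*snp2 + 0.3*snp1*snp2   # invented effects
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-eta))).astype(float)

X_main = sm.add_constant(np.column_stack([snp1, snp2]))
X_full = sm.add_constant(np.column_stack([snp1, snp2, snp1 * snp2]))
m0 = sm.Logit(y, X_main).fit(disp=0)      # main effects only
m1 = sm.Logit(y, X_full).fit(disp=0)      # adds the interaction term
lr = 2.0 * (m1.llf - m0.llf)              # 1-df likelihood-ratio statistic
print(f"LR = {lr:.2f}, p = {chi2.sf(lr, df=1):.4g}")
```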

  14. Motivating Students for Project-based Learning for Application of Research Methodology Skills.

    Tiwari, Ranjana; Arya, Raj Kumar; Bansal, Manoj

    2017-12-01

    Project-based learning (PBL) is motivational for students learning research methodology skills. It is a way to engage students and give them ownership over their own learning. The aim of this study is to use PBL for the application of research methodology skills for better learning, by encouraging an all-inclusive approach to teaching and learning rather than an individualized, tailored approach. The present study was carried out with MBBS 6th- and 7th-semester students of community medicine. Students and faculty were sensitized to PBL and the components of research methodology skills. The students worked in small groups. They were asked to fill in the student feedback questionnaire and the faculty were asked to fill in the faculty feedback questionnaire; both questionnaires were assessed on a 5-point Likert scale. After the projects were submitted, document analysis was done. A total of 99 students of the 6th and 7th semesters participated in PBL. About 90.91% of students agreed that PBL should be continued with subsequent batches, 73.74% felt satisfied and motivated with PBL, and 76.77% felt that they would be able to use research methodology in the near future. PBL requires considerable knowledge, effort, persistence, and self-regulation on the part of the students. They need to devise plans, gather information, and evaluate both their findings and their approach. The facilitator plays a critical role in the process by shaping opportunities for learning, guiding students' thinking, and helping them construct new understanding.

  15. Production methodologies of polymeric and hydrogel particles for drug delivery applications.

    Lima, Ana Catarina; Sher, Praveen; Mano, João F

    2012-02-01

    Polymeric particles are ideal vehicles for controlled delivery applications due to their ability to encapsulate a variety of substances, namely low- and high-molecular mass therapeutics, antigens or DNA. Micro- and nanoscale spherical materials have been developed as carriers for therapies, using appropriate methodologies, in order to achieve prolonged and controlled drug administration. This paper reviews the methodologies used for the production of polymeric micro/nanoparticles. Emulsions, phase separation, spray drying, ionic gelation, polyelectrolyte complexation and supercritical fluid precipitation are all widely used processes for polymeric micro/nanoencapsulation. This paper also discusses the recent developments and patents reported in this field. Other less conventional methodologies are also described, such as the use of superhydrophobic substrates to produce hydrogel and polymeric particulate biomaterials. Polymeric drug delivery systems have gained increased importance due to the need for improving the efficiency and versatility of existing therapies. This allows the development of innovative concepts that could create more efficient systems, which in turn may address many healthcare needs worldwide. The existing methods to produce polymeric release systems have some critical drawbacks, which compromise the efficiency of these techniques. Improvements and the development of new methodologies could be achieved by using multidisciplinary approaches and tools taken from other subjects, including nanotechnologies, biomimetics, tissue engineering, polymer science or microfluidics.

  16. Methodology for biosphere analysis in high level waste disposal. Application to the Mediterranean system

    Pinedo, P.; Simon, I.; Aguero, A.; Cancio, D.

    2000-01-01

    For several years CIEMAT has been developing for ENRESA a conceptual approach and tools to support the modelling of the migration and accumulation of radionuclides within the biosphere once those radionuclides are released or reach one or more parts of the biosphere (atmosphere, water bodies or soils). The model development also includes evaluation of radiological impacts arising from the resulting distribution of radionuclides in the biosphere. At the time the methodology was proposed, the level of development of its different aspects was quite heterogeneous: while aspects of radionuclide transport modelling were already well developed in theoretical and practical terms, other aspects, like the procedure for conceptual model development and the description of biosphere systems representative of the long term, needed further development. The developments have been performed in parallel to international projects in which there was and is active participation, mainly the BIOsphere Models Validation Study (BIOMOVS II) international project, within which the so-called Reference Biosphere Methodology was developed, and the International Atomic Energy Agency (IAEA) Programme on BIOsphere Modelling and ASSessment methods (BIOMASS), which is under development at present. The methodology takes account of these international developments. The purpose of the work summarised herein is the application of the methodology to the 1997 performance assessment (PA) exercise made by ENRESA, using from it the general and particular information about the assessment context, the source term, and the geo-biosphere interface data. (author)

  17. Prototype application of best estimate and uncertainty safety analysis methodology to large LOCA analysis

    Luxat, J.C.; Huget, R.G.

    2001-01-01

    Development of a methodology to perform best estimate and uncertainty nuclear safety analysis has been underway at Ontario Power Generation for the past two and one half years. A key driver for the methodology development, and one of the major challenges faced, is the need to re-establish demonstrated safety margins that have progressively been undermined through excessive and compounding conservatism in deterministic analyses. The major focus of the prototyping applications was to quantify the safety margins that exist at the probable range of high power operating conditions, rather than the highly improbable operating states associated with Limit of the Envelope (LOE) assumptions. In LOE, all parameters of significance to the consequences of a postulated accident are assumed to simultaneously deviate to their limiting values. Another equally important objective of the prototyping was to demonstrate the feasibility of conducting safety analysis as an incremental analysis activity, as opposed to a major re-analysis activity. The prototype analysis solely employed prior analyses of Bruce B large break LOCA events - no new computer simulations were undertaken. This is a significant and novel feature of the prototyping work. This methodology framework has been applied to a postulated large break LOCA in a Bruce generating unit on a prototype basis. This paper presents results of the application. (author)
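
    For context, best-estimate-plus-uncertainty frameworks of this kind commonly size the number of code runs with Wilks' nonparametric tolerance-limit formula; the sketch below (standard statistics, not something stated in this paper) reproduces the familiar 59-run result for a one-sided 95%/95% statement.

```python
# Wilks' first-order, one-sided tolerance limit: the smallest N such that
# 1 - gamma**N >= beta, i.e. the max of N runs bounds the gamma-quantile of
# the output with confidence beta. Standard background, not from this paper.
def wilks_runs(gamma=0.95, beta=0.95):
    N = 1
    while 1.0 - gamma**N < beta:
        N += 1
    return N

print(wilks_runs())   # -> 59 runs for a 95%/95% one-sided statement
```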

  18. Drift design methodology and preliminary application for the Yucca Mountain Site Characterization Project

    Hardy, M.P.; Bauer, S.J.

    1991-12-01

    Excavation stability in an underground nuclear waste repository is required during construction, emplacement, retrieval (if required), and closure phases to ensure worker health and safety, and to prevent development of potential pathways for radionuclide migration in the post-closure period. Stable excavations are developed by appropriate excavation procedures, design of the room shape, design and installation of rock support reinforcement systems, and implementation of appropriate monitoring and maintenance programs. In addition to the loads imposed by the in situ stress field, the repository drifts will be impacted by thermal loads developed after waste emplacement and, periodically, by seismic loads from naturally occurring earthquakes and underground nuclear events. A priori evaluation of stability is required for design of the ground support system, to confirm that the thermal loads are reasonable, and to support the license application process. In this report, a design methodology for assessing drift stability is presented. This is based on site conditions, together with empirical and analytical methods. Analytical numerical methods are emphasized at this time because empirical data are unavailable for excavations in welded tuff either at elevated temperatures or under seismic loads. The analytical methodology incorporates analysis of rock masses that are systematically jointed, randomly jointed, and sparsely jointed. In situ thermal and seismic loads are considered. Methods of evaluating the analytical results and estimating ground support requirements for the full range of expected ground conditions are outlined. The results of a preliminary application of the methodology using the limited available data are presented. 26 figs., 55 tabs

  19. A general centroid determination methodology, with application to multilayer dielectric structures and thermally stimulated current measurements

    Miller, S.L.; Fleetwood, D.M.; McWhorter, P.J.; Reber, R.A. Jr.; Murray, J.R.

    1993-01-01

    A general methodology is developed to experimentally characterize the spatial distribution of occupied traps in dielectric films on a semiconductor. The effects of parasitics such as leakage, charge transport through more than one interface, and interface trap charge are quantitatively addressed. Charge transport with contributions from multiple charge species is rigorously treated. The methodology is independent of the charge transport mechanism(s), and is directly applicable to multilayer dielectric structures. The centroid capacitance, rather than the centroid itself, is introduced as the fundamental quantity that permits the generic analysis of multilayer structures. In particular, the form of many equations describing stacked dielectric structures becomes independent of the number of layers comprising the stack if they are expressed in terms of the centroid capacitance and/or the flatband voltage. The experimental methodology is illustrated with an application using thermally stimulated current (TSC) measurements. The centroid of changes (via thermal emission) in the amount of trapped charge was determined for two different samples of a triple-layer dielectric structure. A direct consequence of the TSC analyses is the rigorous proof that changes in interface trap charge can contribute, though typically not significantly, to thermally stimulated current
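
    A hedged numerical sketch of the centroid-capacitance idea follows, reduced here to the textbook single-layer MOS relation rather than the paper's multilayer formalism (all values invented): with trapped charge Q per unit area at centroid depth x from the gate, the flatband shift is dVfb = -Q/Cx, with centroid capacitance Cx = eps_ox/x.

```python
# Hedged sketch (textbook single-layer relation, not the paper's multilayer
# formalism): recover the centroid from a measured flatband shift and the
# change in trapped charge (e.g. from integrated TSC). Invented values.
EPS_OX = 3.45e-11          # F/m, SiO2 permittivity (3.9 * eps0)
t_ox = 50e-9               # dielectric thickness (m), assumed

dQ = -4.1e-4               # change in trapped charge, C/m2 (assumed)
dVfb = 0.30                # measured flatband-voltage shift (V), assumed

Cx = -dQ / dVfb            # centroid capacitance per unit area (F/m2)
x_bar = EPS_OX / Cx        # centroid depth measured from the gate (m)
print(f"Cx = {Cx:.3e} F/m2, centroid = {x_bar*1e9:.1f} nm "
      f"({x_bar/t_ox:.0%} of the dielectric thickness)")
```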

  20. A methodology for developing strategic municipal solid waste management plans with an application in Greece.

    Economopoulos, A P

    2010-11-01

    A rational approach for developing optimal municipal solid waste (MSW) management plans comprises the strategic and the detailed planning phases. The present paper focuses on the former, the objective of which is to screen management alternatives so as to select the ones that are able to fulfil all legal and other management requirements with reasonable cost. The analysis considers the transportation, treatment and final disposal of the commingled wastes that remain after the application of material-recovery-at-source programmes and comprises 10 elements, four of which are region-dependent and the remaining ones application-dependent. These elements and their inter-dependencies are described and the entire methodology is applied to Greece. The application considers the existing regional plans and shows that they are incompatible with the existing EU Directives, as well as overly expensive. To address this problem, a new plan is developed in accordance with the rational planning principles of the present methodology. The comparative evaluation of the above alternatives shows that the existing regional plans, in addition to being incompatible with the applicable EU Directives, require 4.3 to 4.8 times (3.7 to 4.4 billion €) higher capital investment and their annual cost is at least 2.1 to 2.3 times (590 to 735 million € year(-1)) higher in comparison with the new national plan.

  1. Ion-materials interactions and their application

    Whitlow, H.J.

    1998-01-01

    The interaction of energetic ions and other charged particles with solid matter leads to a wealth of physical processes. This thesis comprises a collection of papers and an introductory commentary, which explore some aspects of how these interactions may be used for: (i) characterisation of thin surface layers of material, (ii) characterisation of energetic charged particles, and (iii) modification of materials by ion bombardment. In (i), Elastic Recoil Detection using a detector system for measurement of Time of Flight and kinetic energy of recoiling target atoms has been developed as a quantitative method for elemental depth profiling of thin (0.5-1 μm) surface layers. This method has been applied to the study of reactions of metal/III-V structures, which are of importance for the semiconductor industry. (ii) MeV-ion - materials interactions have been used as the basis for developing Si p-i-n detectors for the CHICSi programme, which will undertake experimental studies of heavy ion collisions at intermediate energies. This involved development and testing of extremely thin (10-12 μm) Si ΔE detectors for characterising light- and intermediate-mass charged particles as well as calibration of Si p-i-n detectors and their susceptibility to radiation damage. (iii) Nuclear Reaction Analysis (NRA) with resonant nuclear reactions has been used to study modification of material with ion beams. In the first study, the accumulation of fluorine in BF2+ ion-implanted WSi2 solid diffusion sources was investigated. The second study investigated whether there was a correlation between photoluminescence and segregation of hydrogen to buried heterojunctions in plasma-etched III-V quantum-well structures. The ion bombardment in this case was during etching in an Ar+CH4 plasma using an Electron Cyclotron Resonance (ECR) source. (author)
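
    The ToF-E principle behind such recoil spectrometers reduces to elementary kinematics: with kinetic energy E and flight time t over a path L, the recoil mass is m = 2E(t/L)^2. The sketch below (generic numbers, not the thesis' actual detector geometry) shows the arithmetic.

```python
# Small sketch of the ToF-E principle: a recoil's mass follows from its
# measured kinetic energy and time of flight, m = 2*E*(t/L)**2.
E_MEV_TO_J = 1.602e-13     # MeV -> J
U_KG = 1.6605e-27          # atomic mass unit in kg

def recoil_mass_u(E_mev, t_ns, L_m=0.5):   # flight path assumed 0.5 m
    E = E_mev * E_MEV_TO_J
    v = L_m / (t_ns * 1e-9)                # velocity from time of flight
    return 2.0 * E / v**2 / U_KG           # E = m v^2 / 2  ->  m = 2E/v^2

# e.g. a ~28 u recoil (Si) at 10 MeV flies 0.5 m in about 60 ns:
print(f"{recoil_mass_u(10.0, 60.0):.1f} u")
```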

  2. ADDIE Model Application Promoting Interactive Multimedia

    Baharuddin, B.

    2018-02-01

    This paper presents the benefits of interactive learning media in a vocational high school, developed by the Research and Development (R&D) method. Questionnaires, documentation, and instrument tests were used to obtain data, which were analyzed by descriptive statistics. The results show that students' competence reaches up to 80.00%, and the subject-matter aspects of the content score up to 90.00%. The learning outcomes average is 85. This type of media fulfils the proposed objective and can enhance learning outcomes.

  3. Methodology of Integration for Competitive Technical Intelligence with Blue Ocean Strategy: Application to an exotic fruit

    Marisela Rodríguez Salvador

    2011-12-01

    This article presents a new methodology that integrates Competitive Technical Intelligence with Blue Ocean Strategy. We explore new business niches taking advantage of the synergy that both areas offer, developing a model based on cyclic interactions through a process organised in two stages: understanding the opportunity, from idea formulation to decision making, and strategic development. The validity of our approach (first stage) was observed in the evaluation of an exotic fruit, Anacardium occidentale, in the south of the State of Veracruz, Mexico, with the support of the university ITESM, Campus Monterrey. We identified critical factors for success, opportunities and threats. The results confirm the attractiveness of this crop.

  4. A methodology for the design and testing of atmospheric boundary layer models for wind energy applications

    J. Sanz Rodrigo

    2017-02-01

    The GEWEX Atmospheric Boundary Layer Studies (GABLS) 1, 2 and 3 are used to develop a methodology for the design and testing of Reynolds-averaged Navier–Stokes (RANS) atmospheric boundary layer (ABL) models for wind energy applications. The first two GABLS cases are based on idealized boundary conditions and are suitable for verification purposes by comparing with results from higher-fidelity models based on large-eddy simulation. Results from three single-column RANS models, of 1st, 1.5th and 2nd turbulence closure order, show high consistency in predicting the mean flow. The third GABLS case is suitable for the study of these ABL models under realistic forcing, such that validation versus observations from the Cabauw meteorological tower is possible. The case consists of a diurnal cycle that leads to a nocturnal low-level jet and addresses fundamental questions related to the definition of the large-scale forcing, the interaction of the ABL with the surface and the evaluation of model results with observations. The simulations are evaluated in terms of surface-layer fluxes and wind energy quantities of interest: rotor equivalent wind speed, hub-height wind direction, wind speed shear and wind direction veer. The characterization of mesoscale forcing is based on spatially and temporally averaged momentum budget terms from Weather Research and Forecasting (WRF) simulations. These mesoscale tendencies are used to drive single-column models, which were verified previously in the first two GABLS cases, to first demonstrate that they can produce wind profile characteristics similar to the WRF simulations even though the physics are more simplified. The added value of incorporating different forcing mechanisms into microscale models is quantified by systematically removing forcing terms in the momentum and heat equations. This mesoscale-to-microscale modeling approach is affected, to a large extent, by the input uncertainties of the mesoscale
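
    The single-column idea can be sketched schematically (a toy model, not the GABLS setup or any of the three RANS codes above): the horizontal momentum equations combine Coriolis/geostrophic forcing, a prescribed mesoscale tendency, and first-order K-theory vertical mixing.

```python
# Schematic single-column ABL step (toy sketch, invented parameters):
# du/dt =  f (v - vg) + mesoscale tendency + d/dz (K du/dz)
# dv/dt = -f (u - ug) + mesoscale tendency + d/dz (K dv/dz)
import numpy as np

nz, dz, dt = 40, 25.0, 30.0        # 1 km column, 30 s step (assumed)
f = 1.0e-4                         # Coriolis parameter (s^-1)
ug, vg = 8.0, 0.0                  # geostrophic wind (m/s), assumed constant
adv_u = np.zeros(nz)               # mesoscale momentum tendencies, e.g. from
adv_v = np.zeros(nz)               # WRF budget terms (zero in this toy case)
K = np.full(nz - 1, 5.0)           # eddy diffusivity on interfaces (m2/s)

u = np.full(nz, ug); v = np.zeros(nz)
u[0] = v[0] = 0.0                  # no-slip surface

def mix(w):
    """Divergence of the K-theory flux, interior points only."""
    flux = K * (w[1:] - w[:-1]) / dz
    out = np.zeros_like(w)
    out[1:-1] = (flux[1:] - flux[:-1]) / dz
    return out

for _ in range(1440):              # integrate 12 h
    du = f * (v - vg) + adv_u + mix(u)
    dv = -f * (u - ug) + adv_v + mix(v)
    u, v = u + dt * du, v + dt * dv
    u[0] = v[0] = 0.0

print(f"100 m wind speed: {np.hypot(u[4], v[4]):.1f} m/s")
```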

  5. Designing Interaction Spaces for Rich Internet Applications with UML

    Dolog, Peter; Stage, Jan

    2007-01-01

    In this paper, we propose a new method for designing rich internet applications. The design process uses results from an object-oriented analysis and employs interaction spaces as the basic abstraction mechanism. State diagrams are employed as refinements of interaction spaces and task models...

  6. Application of the HGPT methodology to reactor operation problems with a nodal mixed method

    Baudron, A.M.; Bruna, G.B.; Gandini, A.; Lautard, J.J.; Monti, S.; Pizzigati, G.

    1998-01-01

    The heuristically based generalized perturbation theory (HGPT), to first and higher order, applied to the neutron field of a reactor system, is discussed in relation to quasistatic problems. This methodology is of particular interest in reactor operation. In this application it may allow an on-line appraisal of the main physical responses of the reactor system when subject to alterations relevant to normal system exploitation, e.g. control rod movements and/or soluble boron concentration changes introduced, for instance, to compensate power level variations following electrical network demands. In this paper, after describing the main features of the theory, its implementation into the diffusion, 3D mixed dual nodal code MINOS of the SAPHYR system is presented. The results from a small-scale investigation performed on a simplified PWR system corroborate the validity of the methodology proposed.

  7. Assessment of ISLOCA risk: Methodology and application to a Westinghouse four-loop ice condenser plant

    Kelly, D.L.; Auflick, J.L.; Haney, L.N. [EG and G Idaho, Inc., Idaho Falls, ID (United States)

    1992-04-01

    Inter-system loss-of-coolant accidents (ISLOCAs) have been identified as important contributors to offsite risk for some nuclear power plants. A methodology has been developed for identifying and evaluating plant-specific hardware designs, human factors issues, and accident consequence factors relevant to the estimation of ISLOCA core damage frequency and risk. This report presents a detailed description of the application of this analysis methodology to a Westinghouse four-loop ice condenser plant. This document also includes appendices A through I which provide: System descriptions; ISLOCA event trees; human reliability analysis; thermal hydraulic analysis; core uncovery timing calculations; calculation of system rupture probability; ISLOCA consequences analysis; uncertainty analysis; and component failure analysis.

  9. The Role of Couples’ Interactions in Application of Communication Skills

    منصوره‌السادات صادقی; محمدعلی مظاهری; فرشته موتابی

    2014-01-01

    The aim of this research was to predict the role of couples' interactions in the application of communication skills, based on observing their positive and negative interactions. A sample of 31 couples [adapted (15) and maladapted (16)] living in Tehran was selected via convenience sampling. The couples' interactions were videotaped through a designed scenario including problem solving, decision making, and reviewing a conversation about a shared pleasant event. Participants also completed...

  10. Femtosecond laser-matter interaction theory, experiments and applications

    Gamaly, Eugene G

    2011-01-01

    Basics of Ultra-Short Laser-Solid Interactions; Subtle Atomic Motion Preceding a Phase Transition: Birth, Life and Death of Phonons; Ultra-Fast Disordering by fs-Lasers: Superheating Prior to Entropy Catastrophe; Ablation of Solids; Ultra-Short Laser-Matter Interaction Confined Inside a Bulk of Transparent Solid; Applications of Ultra-Short Laser-Matter Interactions; Concluding Remarks.

  11. Application of the Biosphere Assessment Methodology to the ENRESA, 1997 Performance and Safety Assessment

    Pinedo, P.; Simon, I.; Aguero, A.

    1998-01-01

    For several years CIEMAT has been developing for ENRESA knowledge and tools to support the modelling of the migration and accumulation of radionuclides within the biosphere once those radionuclides are released or reach one or more parts of the biosphere (atmosphere, water bodies or soils). The model development also includes evaluation of radiological impacts arising from the resulting distribution of radionuclides in the biosphere. In 1996, a methodology to analyse the biosphere in this context was proposed to ENRESA. The level of development of the different aspects proposed within the methodology was quite heterogeneous and, while aspects of radionuclide transport modelling were already well developed in theoretical and practical terms, other aspects, like the procedure for conceptual model development and the description of biosphere systems representative of the long term, needed further development. At present, the International Atomic Energy Agency (IAEA) Programme on Biosphere Modelling and Assessment (BIOMASS), in collaboration with several national organizations, ENRESA and CIEMAT among them, is working to complete and augment the Reference Biosphere Methodology and to produce some practical descriptions of reference systems. The overall purpose of this document is to apply the methodology, taking account of on-going developments in biosphere modelling, to the last performance assessment (PA) exercise made by ENRESA (ENRESA, 1997), using from it the general and particular information about the assessment context, radionuclide information, geosphere and geo-biosphere interface data. There are three particular objectives to this work: (a) to determine the practicability of the methodology in an application to a realistic assessment situation; (b) to compare and contrast previous biosphere modelling in HLW PA; and (c) to test software development related to data management and modelling. (Author) 42 refs

  12. Application of Haddon’s matrix in qualitative research methodology: an experience in burns epidemiology

    Deljavan R

    2012-07-01

    Reza Deljavan (Injury Epidemiology and Prevention Research Center, Tabriz University of Medical Sciences, Tabriz, Iran), Homayoun Sadeghi-Bazargani (Neuroscience Research Center, Department of Statistics and Epidemiology, Tabriz University of Medical Sciences, Tabriz, Iran; Public Health Department, Karolinska Institute, Stockholm, Sweden), Nasrin Fouladi (Ardabil University of Medical Sciences, Ardabil, Iran), Shahnam Arshi (Shahid Beheshti University of Medical Sciences, Tehran, Iran), Reza Mohammadi (Public Health Department, Karolinska Institute, Stockholm, Sweden). Background: Little has been done to investigate the application of injury-specific qualitative research methods in the field of burn injuries. The aim of this study was to use an analytical tool (Haddon's matrix) through qualitative research methods to better understand people's perceptions about burn injuries. Methods: This study applied Haddon's matrix as a framework and an analytical tool for a qualitative research methodology in burn research. Both child and adult burn injury victims were enrolled in a qualitative study conducted using focus group discussion. Haddon's matrix was used to develop an interview guide and was also used through the analysis phase. Results: The main analysis clusters were the pre-event level/human (including risky behaviors, belief and cultural factors, and knowledge and education), pre-event level/object, pre-event phase/environment, and the event and post-event phase (including fire control, emergency scald and burn wound management, traditional remedies, medical consultation, and severity indicators). This research gave rise to results that are possibly useful both for future injury research and for designing burn injury prevention plans. Conclusion: Haddon's matrix is applicable in a qualitative research methodology at both the data collection and data analysis phases. The study using Haddon's matrix through a qualitative research methodology yielded substantially rich information regarding burn injuries

  13. Application of low-cost methodologies for mobile phone app development.

    Zhang, Melvyn; Cheow, Enquan; Ho, Cyrus Sh; Ng, Beng Yeong; Ho, Roger; Cheok, Christopher Cheng Soon

    2014-12-09

    The usage of mobile phones and mobile phone apps in the recent decade has indeed become more prevalent. Previous research has highlighted a method of using just the Internet browser and a text editor to create an app, but this does not eliminate the challenges faced by clinicians. More recently, two methodologies of app development have been shared, but there have been no disclosures pertaining to the costs involved. In addition, limitations such as the distribution and dissemination of the apps have not been addressed. The aims of this research article are to: (1) highlight a low-cost methodology that clinicians without technical knowledge could use to develop educational apps; (2) clarify the respective costs involved in the process of development; (3) illustrate how limitations pertaining to dissemination could be addressed; and (4) report initial utilization data of the apps and share initial users' self-rated perceptions of the apps. In this study, we present two techniques for creating a mobile app using two well-established online mobile app building websites. The costs of development are specified and the methodology for dissemination of the apps is shared. The application of the low-cost methodologies in the creation of the "Mastering Psychiatry" app for undergraduates and the "Déjà vu" app for postgraduates is discussed. A questionnaire survey was administered to undergraduate students collating their perceptions of the app. For the Mastering Psychiatry app, a cumulative total of 722 users have used the mobile app since inception, based on our analytics. For the Déjà vu app, there has been a cumulative total of 154 downloads since inception. The utilization data demonstrated the receptiveness towards these apps, and this is reinforced by the positive perceptions undergraduate students (n=185) had towards the low-cost self-developed apps. This is one of the few studies that have demonstrated the low

  14. Application of code scaling, applicability and uncertainty methodology to large break LOCA analysis of two loop PWR

    Mavko, B.; Stritar, A.; Prosek, A.

    1993-01-01

    In NED 119, No. 1 (May 1990), a series of six papers published by a Technical Program Group presented a new methodology for the safety evaluation of emergency core cooling systems in nuclear power plants. This paper describes the application of that new methodology to the LB LOCA analysis of the two-loop Westinghouse power plant. Results of the original work were used wherever possible, so that the analysis was finished in less than one man-year of work. Steam generator plugging level and safety injection flow rate were used as additional uncertainty parameters, which had not been used in the original work. The computer code RELAP5/MOD2 was used. The response surface was generated by regression analysis and by the artificial-neural-network-like Optimal Statistical Estimator method. Results were also compared to the analytical calculation. (orig.)

  15. Cognitive Sensitivity in Sibling Interactions: Development of the Construct and Comparison of Two Coding Methodologies

    Prime, Heather; Perlman, Michal; Tackett, Jennifer L.; Jenkins, Jennifer M.

    2014-01-01

    Research Findings: The goal of this study was to develop a construct of sibling cognitive sensitivity, which describes the extent to which children take their siblings' knowledge and cognitive abilities into account when working toward a joint goal. In addition, the study compared 2 coding methodologies for measuring the construct: a thin…

  16. Affective Interface Adaptations in the Musickiosk Interactive Entertainment Application

    Malatesta, L.; Raouzaiou, A.; Pearce, L.; Karpouzis, K.

    The current work presents the affective interface adaptations in the Musickiosk application. Adaptive interaction poses several open questions since there is no unique way of mapping affective factors of user behaviour to the output of the system. Musickiosk uses a non-contact interface and implicit interaction through emotional affect rather than explicit interaction, where a gesture, sound or other input directly maps to an output behaviour, as in traditional entertainment applications. The PAD (pleasure-arousal-dominance) model is used for characterizing the different affective states and emotions.

  17. Drug-Target Interactions: Prediction Methods and Applications.

    Anusuya, Shanmugam; Kesherwani, Manish; Priya, K Vishnu; Vimala, Antonydhason; Shanmugam, Gnanendra; Velmurugan, Devadasan; Gromiha, M Michael

    2018-01-01

    Identifying the interactions between drugs and target proteins is a key step in drug discovery. This not only aids to understand the disease mechanism, but also helps to identify unexpected therapeutic activity or adverse side effects of drugs. Hence, drug-target interaction prediction becomes an essential tool in the field of drug repurposing. The availability of heterogeneous biological data on known drug-target interactions enabled many researchers to develop various computational methods to decipher unknown drug-target interactions. This review provides an overview on these computational methods for predicting drug-target interactions along with available webservers and databases for drug-target interactions. Further, the applicability of drug-target interactions in various diseases for identifying lead compounds has been outlined.
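
    One common computational idea in this literature can be sketched as follows (a generic similarity-based scheme, not a specific method from the review; all numbers invented): score an unknown drug-target pair by the chemical similarity of the drug to drugs already known to interact with that target.

```python
# Hedged sketch of nearest-neighbour drug-target interaction scoring.
import numpy as np

# Toy data: 4 drugs x 3 targets known-interaction matrix, plus a drug-drug
# chemical similarity matrix (both invented for illustration).
Y = np.array([[1, 0, 0],
              [1, 1, 0],
              [0, 0, 1],
              [0, 1, 0]], dtype=float)
S = np.array([[1.0, 0.8, 0.1, 0.3],
              [0.8, 1.0, 0.2, 0.5],
              [0.1, 0.2, 1.0, 0.1],
              [0.3, 0.5, 0.1, 1.0]])

def score(drug, target):
    """Max similarity to any drug with a known interaction on this target."""
    known = Y[:, target] > 0
    known[drug] = False               # never use the pair being predicted
    return S[drug, known].max() if known.any() else 0.0

print(f"drug 2 - target 1: {score(2, 1):.2f}")   # weak evidence (0.20)
print(f"drug 0 - target 1: {score(0, 1):.2f}")   # supported by similar drug 1 (0.80)
```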

  18. An efficient hysteresis modeling methodology and its implementation in field computation applications

    Adly, A.A., E-mail: adlyamr@gmail.com [Electrical Power and Machines Dept., Faculty of Engineering, Cairo University, Giza 12613 (Egypt); Abd-El-Hafiz, S.K. [Engineering Mathematics Department, Faculty of Engineering, Cairo University, Giza 12613 (Egypt)

    2017-07-15

    Highlights: • An approach to simulate hysteresis while taking shape anisotropy into consideration. • Utilizing the ensemble of triangular sub-region hysteresis models in field computation. • A novel tool capable of carrying out field computation while keeping track of hysteresis losses. • The approach may be extended to 3D tetrahedral sub-volumes. Abstract: Field computation in media exhibiting hysteresis is crucial to a variety of applications such as magnetic recording processes and accurate determination of core losses in power devices. Recently, Hopfield neural networks (HNN) have been successfully configured to construct scalar and vector hysteresis models. This paper presents an efficient hysteresis modeling methodology and its implementation in field computation applications. The methodology is based on the application of the integral equation approach on discretized triangular magnetic sub-regions. Within every triangular sub-region, hysteresis properties are realized using a 3-node HNN. Details of the approach and sample computation results are given in the paper.

  19. Application of a methodology based on the Theory of Constraints in the sector of tourism services

    Reyner Pérez Campdesuñer

    2017-04-01

    Purpose: The research aimed at implementing the theory of constraints under the operating conditions of a hotel, which differ from those of the traditional processes to which this method has been applied, owing to the great heterogeneity of resources needed to meet customer demand. Design/methodology/approach: To achieve this purpose, a method of generating conversion equations was developed that allowed all the resources of the organization under study to be expressed as a function of the number of customers to serve, facilitating comparison between the different resources and the demand estimated through traditional forecasting techniques; these features were integrated into the classical methodology of the theory of constraints, as shown in the sketch below. Findings: The application of tools designed for hospitality organizations demonstrated the applicability of the theory of constraints to entities under conditions different from the usual ones, developed a set of conversion equations for the different resources facilitating comparison with demand, and consequently improved the levels of efficiency and effectiveness of the organization. Originality/value: The originality of the research lies in the application of the theory of constraints under conditions very different from the usual ones, covering 100% of the processes and resources in hospitality organizations.
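
    The conversion-equation idea reads naturally as code (invented figures, not the case-study data): express every resource as units consumed per customer served, convert capacities into a maximum number of customers per resource, and take the minimum as the system constraint.

```python
# Minimal sketch of conversion equations + constraint identification.
resources = {
    # name: (available units per day, units consumed per customer per day)
    "rooms":        (180, 0.50),    # double occupancy -> 0.5 room/customer
    "restaurant":   (420, 1.20),    # meal covers
    "housekeeping": (300, 0.75),    # staff-hours
    "front_desk":   (150, 0.20),    # staff-hours
}
demand = 450                        # forecast customers per day (assumed)

# Conversion equation per resource: max customers = capacity / use-per-customer
limits = {name: cap / use for name, (cap, use) in resources.items()}
constraint = min(limits, key=limits.get)

print({name: round(v) for name, v in limits.items()})
print(f"System constraint: {constraint} "
      f"({limits[constraint]:.0f} customers < demand of {demand})")
```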

  20. Defense nuclear energy systems selection methodology for civil nuclear power applications

    Scarborough, J.C.

    1986-01-01

    A methodology developed to select a preferred nuclear power system for a US Department of Defense (DOD) application has been used to evaluate preferred nuclear power systems for a remote island community in Southeast Asia. The plant would provide ∼10 MW of electric power, possibly low-temperature process heat for the local community, and would supplement existing island diesel electric capacity. The nuclear power system evaluation procedure evolved from a disciplined methodology for ranking ten nuclear power designs under joint development by the US Department of Energy (DOE) and DOD. These included six designs proposed by industry for the Secure Military Power Plant Program (now termed the Multimegawatt Terrestrial Reactor Program), the SP-100 Program, the North Warning System Program, and the Modular Advanced High-Temperature Gas-Cooled Reactor (HTGR) and Liquid-Metal Reactor (LMR) programs. The 15 evaluation criteria established for the civil application were generally similar to those developed and used for the defense energy systems evaluation, except that the weighting factor applied to each individual criterion differed. The criteria and their weighting (importance) functions for the civil application are described.
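
    A selection methodology of this type implies a weighted-criteria ranking; the sketch below is a generic illustration (criteria, weights and scores all invented, not the study's 15 criteria or actual weighting functions).

```python
# Generic weighted-sum ranking sketch for candidate power systems.
criteria = {"safety": 0.30, "cost": 0.25, "maturity": 0.20,
            "logistics": 0.15, "licensability": 0.10}   # weights sum to 1.0

candidates = {                      # score 1-10 against each criterion
    "Modular HTGR":  {"safety": 9, "cost": 5, "maturity": 7,
                      "logistics": 6, "licensability": 7},
    "LMR":           {"safety": 8, "cost": 4, "maturity": 6,
                      "logistics": 5, "licensability": 6},
    "SP-100 deriv.": {"safety": 7, "cost": 6, "maturity": 4,
                      "logistics": 8, "licensability": 5},
}

def total(scores):
    """Weighted sum of criterion scores."""
    return sum(weight * scores[c] for c, weight in criteria.items())

for name, scores in sorted(candidates.items(), key=lambda kv: -total(kv[1])):
    print(f"{name:14s} {total(scores):.2f}")
```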

  1. Interactive data visualization foundations, techniques, and applications

    Ward, Matthew; Keim, Daniel

    2010-01-01

    Visualization is the process of representing data, information, and knowledge in a visual form to support the tasks of exploration, confirmation, presentation, and understanding. This book is designed as a textbook for students, researchers, analysts, professionals, and designers of visualization techniques, tools, and systems. It covers the full spectrum of the field, including mathematical and analytical aspects, ranging from its foundations to human visual perception; from coded algorithms for different types of data, information and tasks to the design and evaluation of new visualization techniques. Sample programs are provided as starting points for building one's own visualization tools. Numerous data sets have been made available that highlight different application areas and allow readers to evaluate the strengths and weaknesses of different visualization methods. Exercises, programming projects, and related readings are given for each chapter. The book concludes with an examination of several existin...

  2. Attack Methodology Analysis: Emerging Trends in Computer-Based Attack Methodologies and Their Applicability to Control System Networks

    Bri Rolston

    2005-06-01

    Threat characterization is a key component in evaluating the threat faced by control systems. Without a thorough understanding of the threat faced by critical infrastructure networks, adequate resources cannot be allocated or directed effectively to the defense of these systems. Traditional methods of threat analysis focus on identifying the capabilities and motivations of a specific attacker, assessing the value the adversary would place on targeted systems, and deploying defenses according to the threat posed by the potential adversary. But too many effective exploits and tools exist, easily accessible to anyone with an Internet connection, minimal technical skills, and a significantly reduced motivational threshold, for the field of potential adversaries to be narrowed effectively. Understanding how hackers evaluate new IT security research and incorporate significant new ideas into their own tools provides a means of anticipating how IT systems are most likely to be attacked in the future. This research, Attack Methodology Analysis (AMA), could supply pertinent information on how to detect and stop new types of attacks. Since the exploit methodologies and attack vectors developed in the general Information Technology (IT) arena can be converted for use against control system environments, assessing areas in which cutting-edge exploit development and remediation techniques are occurring can provide significant intelligence for control system network exploitation and defense, and a means of assessing threat without identifying specific capabilities of individual opponents. Attack Methodology Analysis begins with the study of what exploit technology and attack methodologies are being developed in the Information Technology (IT) security research community, within the black- and white-hat communities. Once a solid understanding of the cutting-edge security research is established, emerging trends in attack methodology can be identified and the gap between

  3. Mobile devices and computing cloud resources allocation for interactive applications

    Krawczyk Henryk

    2017-06-01

    Using mobile devices such as smartphones or iPads for various interactive applications is currently very common. In the case of complex applications, e.g. chess games, the capabilities of these devices are insufficient to run the application in real time. One of the solutions is to use cloud computing. However, there is an optimization problem of allocating mobile device and cloud resources. An iterative heuristic algorithm for application distribution is proposed. The algorithm minimizes the energy cost of application execution under a constrained execution time.
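
    A toy version of the allocation problem (not the paper's algorithm; all figures invented) can be written as a greedy heuristic: offload the tasks with the largest device-energy savings to the cloud until the response-time deadline is met.

```python
# Toy sketch: place each task on the phone or in the cloud so device energy
# falls while total response time meets a deadline. Invented figures.
tasks = {  # name: (local_time_s, local_energy_J, cloud_time_s, transfer_energy_J)
    "ui":       (0.1, 0.5, 0.1, 0.4),
    "search_1": (2.0, 6.0, 0.8, 1.0),
    "search_2": (2.5, 7.5, 0.9, 1.2),
    "eval":     (1.0, 3.0, 0.5, 0.8),
}
deadline = 3.0
placement = {name: "local" for name in tasks}

def totals():
    """Total response time and device energy under the current placement."""
    time = energy = 0.0
    for name, (lt, le, ct, te) in tasks.items():
        cloud = placement[name] == "cloud"
        time += ct if cloud else lt
        energy += te if cloud else le
    return time, energy

# Greedy: offload in order of device-energy savings until the deadline holds.
for name in sorted(tasks, key=lambda n: tasks[n][1] - tasks[n][3], reverse=True):
    if totals()[0] <= deadline:
        break
    placement[name] = "cloud"

t, e = totals()
print(placement, f"time={t:.1f}s, device energy={e:.1f}J")
```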

  4. Methodology for digital radiography simulation using the Monte Carlo code MCNPX for industrial applications

    Souza, E.M.; Correa, S.C.A.; Silva, A.X.; Lopes, R.T.; Oliveira, D.F.

    2008-01-01

    This work presents a methodology for digital radiography simulation for industrial applications using the MCNPX radiography tally. In order to perform the simulation, the energy-dependent response of a BaFBr imaging plate detector was modeled and introduced in the MCNPX radiography tally input. In addition, a post-processing program was used to convert the MCNPX radiography tally output into 16-bit digital images. Simulated and experimental images of a steel pipe containing corrosion alveoli and stress corrosion cracking were compared, and the results showed good agreement between both images
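
    The final post-processing step described, converting tally values into a 16-bit image, amounts to a linear rescale; the sketch below uses a random array as a stand-in for the MCNPX radiography tally output (the actual conversion program is not published with the abstract).

```python
# Hedged sketch: map (simulated) radiography tally values onto the full
# 16-bit grey-level range, as the post-processing step describes.
import numpy as np

tally = np.random.default_rng(1).random((256, 256)) * 1e-4  # fake tally (a.u.)

def to_uint16(a):
    """Linear rescale of tally values to 0..65535 grey levels."""
    span = a.max() - a.min()
    a = (a - a.min()) / (span if span > 0 else 1.0)
    return np.round(a * 65535).astype(np.uint16)

img = to_uint16(tally)
print(img.dtype, img.min(), img.max())   # uint16 0 65535
```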

  5. SystemVerilog assertions and functional coverage guide to language, methodology and applications

    Mehta, Ashok B

    2013-01-01

    This book provides a hands-on, application-oriented guide to the language and methodology of both SystemVerilog Assertions and SystemVerilog Functional Coverage. Readers will benefit from the step-by-step approach to functional hardware verification, which will enable them to uncover hidden and hard-to-find bugs, point directly to the source of the bug, provide a clean and easy way to model complex timing checks, and objectively answer the question 'have we functionally verified everything'. Written by a professional end-user of both SystemVerilog Assertions and SystemVerilog Functional Coverage.

  6. Application of NASA Kennedy Space Center System Assurance Analysis methodology to nuclear power plant systems designs

    Page, D.W.

    1985-01-01

    In May of 1982, the Kennedy Space Center (KSC) entered into an agreement with the NRC to conduct a study to demonstrate the feasibility and practicality of applying the KSC System Assurance Analysis (SAA) methodology to nuclear power plant systems designs. North Carolina's Duke Power Company expressed an interest in the study and proposed the nuclear power facility at CATAWBA for the basis of the study. In joint meetings of KSC and Duke Power personnel, an agreement was made to select two CATAWBA systems, the Containment Spray System and the Residual Heat Removal System, for the analyses. Duke Power provided KSC with a full set of Final Safety Analysis Reports (FSAR) as well as schematics for the two systems. During Phase I of the study the reliability analyses of the SAA were performed. During Phase II the hazard analyses were performed. The final product of Phase II is a handbook for implementing the SAA methodology into nuclear power plant systems designs. The purpose of this paper is to describe the SAA methodology as it applies to nuclear power plant systems designs and to discuss the feasibility of its application. (orig./HP)

  7. Vedic division methodology for high-speed very large scale integration applications

    Prabir Saha

    2014-02-01

    Full Text Available Transistor-level implementation of a division methodology using ancient Vedic mathematics is reported in this Letter. The potential of the 'Dhvajanka' (on top of the flag) formula was adopted from Vedic mathematics to implement such a divider for practical very large scale integration applications. The division methodology works with half of the divisor bits instead of the actual divisor, together with subtraction and a small amount of multiplication. Propagation delay and dynamic power consumption of the divider circuitry were minimised significantly by stage reduction through the Vedic division methodology. The functionality of the division algorithm was checked, and performance parameters such as propagation delay and dynamic power consumption were calculated through SPICE (Spectre) simulation with 90 nm complementary metal oxide semiconductor technology. The propagation delay of the resulting (32 ÷ 16)-bit divider circuitry was only ∼300 ns, and it consumed ∼32.5 mW for a layout area of 17.39 mm². By combining Boolean arithmetic with ancient Vedic mathematics, a substantial number of iterations was eliminated, yielding ∼47, ∼38 and ∼34% reductions in delay and ∼34, ∼21 and ∼18% reductions in power compared with the most widely used architectures (e.g. digit-recurrence, Newton–Raphson, Goldschmidt).

  8. Calculation of t8/5 by response surface methodology for electric arc welding applications

    Meseguer-Valdenebro José Luis

    2014-01-01

    Full Text Available One of the greatest difficulties traditionally found in stainless steel constructions has been the execution of welded parts in them. At the present time, the available technology allows us to use arc welding processes for that application without any disadvantage. Response surface methodology is used to optimise a process in which the variables that take part in it are not related to each other by a mathematical law; therefore, an empirical model must be formulated. With this methodology the optimisation of one selected variable may be done. In this work, the cooling time from 800 to 500 °C, t8/5, after a TIG welding operation is modelled by the response surface method. The arc power, the welding velocity and the thermal efficiency factor are considered as the variables that influence the t8/5 value. Different cooling times, t8/5, for different combinations of values of the variables were previously determined by a numerical method. The input values for the variables were established experimentally. The results indicate that response surface methodology may be considered a valid technique for these purposes.
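
    The fitting step behind such a response surface can be sketched as an ordinary least-squares fit of a full quadratic model; the data below are synthetic stand-ins, not the paper's welding measurements:

        import numpy as np

        rng = np.random.default_rng(0)
        P = rng.uniform(2.0, 5.0, 40)      # arc power, kW
        v = rng.uniform(2.0, 8.0, 40)      # welding speed, mm/s
        k = rng.uniform(0.6, 0.9, 40)      # thermal efficiency factor
        t85 = 4.0 * k * P / v + rng.normal(0, 0.05, 40)   # stand-in cooling times

        # Full second-order model: intercept, linear, interaction and squared terms.
        X = np.column_stack([np.ones_like(P), P, v, k,
                             P * v, P * k, v * k, P**2, v**2, k**2])
        beta, *_ = np.linalg.lstsq(X, t85, rcond=None)
        pred = X @ beta
        r2 = 1 - ((t85 - pred)**2).sum() / ((t85 - t85.mean())**2).sum()
        print(f"R^2 of the fitted surface: {r2:.3f}")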

  9. Water and Carbon Footprint of Wine: Methodology Review and Application to a Case Study

    Sara Rinaldi

    2016-07-01

    Full Text Available Life cycle assessments (LCAs) play a strategic role in improving the environmental performance of a company and in supporting successful marketing communication. The high impact of the food industry on natural resources, in terms of water consumption and greenhouse gas emissions, has focused the attention of consumers and producers on environmentally sustainable products. This work presents a comprehensive approach for the joint evaluation of the carbon (CF) and water (WF) footprints of the wine industry from a cradle-to-grave perspective. The LCA analysis is carried out following the requirements of international standards (ISO/TS 14067 and ISO 14046). A complete review of the water footprint methodology is presented, and guidelines for all the phases of the evaluation procedure are provided, including acquisition and validation of input data, allocation, application of analytic models, and interpretation of the results. The strength of this approach is the implementation of a side-by-side CF vs. WF assessment, based on the same system boundaries, functional unit, and input data, that allows a reliable comparison between the two indicators. In particular, a revised methodology is presented for the evaluation of the grey water component. The methodology was applied to a white and a red wine produced in the same company. A comparison between the two products is presented for each LCA phase along with literature results for similar wines.

  10. 3rd International Conference on Simulation and Modeling Methodologies, Technologies and Applications

    Koziel, Slawomir; Kacprzyk, Janusz; Leifsson, Leifur; Ören, Tuncer

    2015-01-01

    This book includes extended and revised versions of a set of selected papers from the 3rd International Conference on Simulation and Modeling Methodologies, Technologies and Applications (SIMULTECH 2013), which was co-organized by Reykjavik University (RU) and sponsored by the Institute for Systems and Technologies of Information, Control and Communication (INSTICC). SIMULTECH 2013 was held in cooperation with the ACM SIGSIM - Special Interest Group (SIG) on SImulation and Modeling (SIM), Movimento Italiano Modellazione e Simulazione (MIMOS) and the AIS Special Interest Group on Modeling and Simulation (AIS SIGMAS), and technically co-sponsored by the Society for Modeling & Simulation International (SCS), Liophant Simulation, Simulation Team and the International Federation for Information Processing (IFIP). These proceedings bring together researchers, engineers, applied mathematicians and practitioners working on advances and applications in the field of system simulation.

  11. Probabilistic risk assessment methodology application to Indian pressurised heavy water reactors

    Babar, A.K.; Grover, R.B.; Mehra, V.K.; Gangwal, D.K.; Chakraborty, G.

    1987-01-01

    Probabilistic risk assessment in the context of nuclear power plants is associated with models that predict the offsite radiological releases resulting from reactor accidents. Level 1 PRA deals with the identification of accident sequences relevant to the design of a system and also with their quantitative estimation. It is characterised by event tree and fault tree analysis. The initiating events applicable to pressurised heavy water reactors have been considered, and the dominating initiating events essential for detailed studies are identified in this paper. Reliability analysis and the associated problems encountered during the case studies are mentioned briefly. It is imperative to validate the failure data used for analysis; a Bayesian technique has been employed for this purpose, and a brief account is included herein. A few important observations, e.g. effects of the presence of the moderator, made during the application of the probabilistic risk assessment methodology are also discussed. (author)
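
    The Bayesian validation of failure data mentioned above is, in its simplest conjugate form, a one-line update; the prior and the plant evidence below are invented for illustration:

        # Gamma prior on a failure rate (per hour) updated with Poisson evidence.
        alpha0, beta0 = 2.0, 4000.0        # prior: mean rate 2/4000 = 5e-4 per hour
        failures, hours = 3, 12000.0       # hypothetical observed plant record

        alpha1, beta1 = alpha0 + failures, beta0 + hours
        print("posterior mean failure rate:", alpha1 / beta1, "per hour")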

  12. Application of the BEPU methodology to assess fuel performance in dry storage

    Feria, F.; Herranz, L.E.

    2017-01-01

    Highlights: • Application of the BEPU methodology to estimate the cladding stress in dry storage. • The predicted stress is notably affected by the irradiation history. • Improvements in FGR modelling would significantly enhance the stress estimates. • The prediction uncertainty should not be disregarded when assessing clad integrity. - Abstract: The stress to which fuel cladding is subjected in dry storage is the driving force of the main postulated degradation mechanisms (i.e., embrittlement due to radial hydride reorientation, and creep). Therefore, a sound assessment is mandatory to reliably predict fuel performance under the conditions prevailing in dry storage. Best-estimate calculations can be conducted with fuel rod thermo-mechanical codes. The precision of the predictions depends on the uncertainties affecting the way the stress is calculated, so by using uncertainty analysis an upper bound of the stress can be determined and compared to the safety limits set. The present work shows the application of the BEPU (Best Estimate Plus Uncertainty) methodology in this field. Concretely, radial hydride reorientation has been assessed based on stress predictions under challenging thermal conditions (400 °C) and a stress limit of 90 MPa. The computational tools used are FRAPCON-3xt (best estimate) and Dakota (uncertainty analysis). The methodology has been applied to a typical PWR fuel rod highly irradiated (65 GWd/tU) under different power histories. The study performed allows concluding that neither the power history nor the prediction uncertainty should be disregarded when fuel rod integrity is evaluated in dry storage. On a probabilistic basis, a burnup of 60 GWd/tU is found to be an acceptable threshold even under the most challenging irradiation conditions considered.
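
    The Wilks-type sample sizing that underlies such BEPU uncertainty analyses can be written down directly; this is the standard first-order, one-sided formula, not a detail taken from the paper:

        # Smallest N with 1 - coverage**N >= confidence: the maximum of N runs
        # then bounds the 'coverage' quantile with probability 'confidence'.
        def wilks_n(coverage=0.95, confidence=0.95):
            n = 1
            while 1 - coverage**n < confidence:
                n += 1
            return n

        print(wilks_n())   # 59 code runs for the classic 95%/95% statement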

  13. Evaluation of Structural Robustness against Column Loss: Methodology and Application to RC Frame Buildings.

    Bao, Yihai; Main, Joseph A; Noh, Sam-Young

    2017-08-01

    A computational methodology is presented for evaluating structural robustness against column loss. The methodology is illustrated through application to reinforced concrete (RC) frame buildings, using a reduced-order modeling approach for three-dimensional RC framing systems that includes the floor slabs. Comparisons with high-fidelity finite-element model results are presented to verify the approach. Pushdown analyses of prototype buildings under column loss scenarios are performed using the reduced-order modeling approach, and an energy-based procedure is employed to account for the dynamic effects associated with sudden column loss. Results obtained using the energy-based approach are found to be in good agreement with results from direct dynamic analysis of sudden column loss. A metric for structural robustness is proposed, calculated by normalizing the ultimate capacities of the structural system under sudden column loss by the applicable service-level gravity loading and by evaluating the minimum value of this normalized ultimate capacity over all column removal scenarios. The procedure is applied to two prototype 10-story RC buildings, one employing intermediate moment frames (IMFs) and the other employing special moment frames (SMFs). The SMF building, with its more stringent seismic design and detailing, is found to have greater robustness.
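
    A toy rendering of the proposed metric, with invented capacities (expressed as multiples of the service-level gravity load):

        # Normalized ultimate capacity per sudden column-loss scenario; the
        # robustness index is the minimum over all scenarios.
        ultimate_capacity = {"corner": 1.8, "edge": 2.4, "interior": 2.9}
        service_load = 1.0

        robustness = min(c / service_load for c in ultimate_capacity.values())
        worst = min(ultimate_capacity, key=ultimate_capacity.get)
        print(f"robustness index = {robustness:.2f} (governed by {worst} column loss)")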

  14. Assessment of brain perfusion with MRI: methodology and application to acute stroke

    Grandin, C.B.

    2003-01-01

    We review the methodology of brain perfusion measurements with MRI and their application to acute stroke, with particular emphasis on the work awarded by the 6th Lucien Appel Prize for Neuroradiology. The application of the indicator dilution theory to the dynamic susceptibility-weighted bolus-tracking method is explained, as is the approach to obtaining quantitative measurements of cerebral blood flow (CBF) and volume (CBV). Our contribution to methodological developments, such as CBV measurement with the frequency-shifted burst sequence, development of the PRESTO sequence, comparison of different deconvolution methods and of spin- and gradient-echo sequences, and the validation of MRI measurements against positron emission tomography is summarised. The pathophysiology of brain ischaemia and the role of neuroimaging in the setting of acute stroke are reviewed, with an introduction to the concepts of ischaemic penumbra and diffusion/perfusion mismatch. Our work on the determination of absolute CBF and CBV thresholds for predicting the area of infarct growth, identification of the best perfusion parameters (relative or absolute) for predicting the area of infarct growth and the role of MR angiography is also summarised. We conclude that MRI is a very powerful way to assess brain perfusion and that its use might help in selecting patients who will benefit most from treatment such as thrombolysis. (orig.)
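
    The quantitative core of bolus tracking, deconvolving the tissue curve with the arterial input function, can be sketched on synthetic curves (all numbers below are invented; clinical pipelines add noise handling and calibration):

        import numpy as np

        dt = 1.0
        t = np.arange(0, 60, dt)
        aif = (t / 5)**2 * np.exp(-t / 5)        # gamma-variate arterial input
        R = np.exp(-t / 8.0)                     # true residue function (MTT = 8 s)
        cbf_true = 0.01
        tissue = cbf_true * dt * np.convolve(aif, R)[:len(t)]

        # Lower-triangular convolution matrix of the AIF, inverted by truncated SVD.
        A = dt * np.array([[aif[i - j] if i >= j else 0.0 for j in range(len(t))]
                           for i in range(len(t))])
        U, s, Vt = np.linalg.svd(A)
        s_inv = np.where(s > 1e-3 * s.max(), 1.0 / s, 0.0)   # truncation vs. noise
        k = Vt.T @ (s_inv * (U.T @ tissue))                  # k(t) = CBF * R(t)
        print("recovered CBF:", k.max())                     # close to cbf_true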

  15. APPLICATION OF METHODOLOGY OF STRATEGIC PLANNING IN DEVELOPING NATIONAL PROGRAMMES ON DEVELOPMENT

    Inna NOVAK

    2015-07-01

    Full Text Available Actuality: The main purpose of strategic planning is that the long-term interests of sustainable development of a market economy require the use of effective measures of state regulation of economic and social processes. Objective: The aim of the article is to analyze the development of strategic planning methodology and the practical experience of its application in the design of national development programs. Methods: The following research methods were used in writing the article: analysis and synthesis, and target-oriented and monographic methods. Results: In Ukraine, strategies for the development of branches, regions, cities, etc. are being developed at the level of state and local government authorities, but given the lack of state funding, a unified investment strategy for the country has not been developed. After analyzing the development of strategic planning methodology and examples of its application in the design of state development programs, we identified the need to develop an investment strategy of the state (sectors, regions, etc.), as its defined directions and guidelines will increase the level of investment in the country and support the national strategy "Ukraine-2020".

  16. Predicting protein complexes from weighted protein-protein interaction graphs with a novel unsupervised methodology: Evolutionary enhanced Markov clustering.

    Theofilatos, Konstantinos; Pavlopoulou, Niki; Papasavvas, Christoforos; Likothanassis, Spiros; Dimitrakopoulos, Christos; Georgopoulos, Efstratios; Moschopoulos, Charalampos; Mavroudi, Seferina

    2015-03-01

    Proteins are considered to be the most important individual components of biological systems and they combine to form physical protein complexes which are responsible for certain molecular functions. Despite the wide availability of protein-protein interaction (PPI) information, not much information is available about protein complexes. Experimental methods are limited in terms of time, efficiency, cost and performance constraints. Existing computational methods have provided encouraging preliminary results, but they face certain disadvantages as they require parameter tuning, some of them cannot handle weighted PPI data and others do not allow a protein to participate in more than one protein complex. In the present paper, we propose a new fully unsupervised methodology for predicting protein complexes from weighted PPI graphs. The proposed methodology is called evolutionary enhanced Markov clustering (EE-MC) and it is a hybrid combination of an adaptive evolutionary algorithm and a state-of-the-art clustering algorithm named enhanced Markov clustering. EE-MC was compared with state-of-the-art methodologies when applied to datasets from the human and the yeast Saccharomyces cerevisiae organisms. Using publicly available datasets, EE-MC outperformed existing methodologies (in some datasets the separation metric was increased by 10-20%). Moreover, when applied to new human datasets its performance was encouraging in the prediction of protein complexes which consist of proteins with high functional similarity. Specifically, 5737 protein complexes were predicted and 72.58% of them are enriched for at least one gene ontology (GO) function term. EE-MC is by design able to overcome intrinsic limitations of existing methodologies such as their inability to handle weighted PPI networks, their constraint to assign every protein to exactly one cluster and the difficulties they face concerning parameter tuning. This fact was experimentally validated and moreover, new
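
    For orientation, the Markov-clustering core that EE-MC enhances (shown here without the evolutionary parameter adaptation) reduces to an expand/inflate loop on a column-stochastic matrix built from the weighted PPI graph:

        import numpy as np

        def mcl(adj, inflation=2.0, iters=50):
            m = adj + np.eye(len(adj))        # self-loops stabilize the random walk
            m = m / m.sum(axis=0)             # column-stochastic transition matrix
            for _ in range(iters):
                m = m @ m                     # expansion: two-step random walk
                m = m ** inflation            # inflation: favour strong edges
                m = m / m.sum(axis=0)
            rows = (frozenset(np.nonzero(r > 1e-6)[0]) for r in m if r.max() > 1e-6)
            return [set(c) for c in set(rows)]

        adj = np.array([[0, .9, .8,  0,  0],
                        [.9, 0, .7,  0,  0],
                        [.8, .7, 0, .1,  0],
                        [0,  0, .1,  0, .9],
                        [0,  0,  0, .9,  0]], float)
        print(mcl(adj))    # two complexes: {0, 1, 2} and {3, 4}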

  17. International Conference on Intelligent and Interactive Systems and Applications

    Patnaik, Srikanta; Yu, Zhengtao

    2017-01-01

    This book provides the latest research findings and developments in the field of interactive intelligent systems, addressing diverse areas such as autonomous systems, Internet and cloud computing, pattern recognition and vision systems, mobile computing and intelligent networking, and e-enabled systems. It gathers selected papers from the International Conference on Intelligent and Interactive Systems and Applications (IISA2016) held on June 25–26, 2016 in Shanghai, China. Interactive intelligent systems are among the most important multi-disciplinary research and development domains of artificial intelligence, human–computer interaction, machine learning and new Internet-based technologies. Accordingly, these systems embrace a considerable number of application areas such as autonomous systems, expert systems, mobile systems, recommender systems, knowledge-based and semantic web-based systems, virtual communication environments, and decision support systems, to name a few. To date, research on interactiv...

  18. The Role of Couples’ Interactions in Application of Communication Skills

    منصوره‌السادات صادقی

    2014-02-01

    Full Text Available The aim of this research was to predict the role of couples' interactions in the application of communication skills, based on observing their positive and negative interactions. A sample of 31 couples [adapted (15) and maladapted (16)] living in Tehran was selected via accessible sampling. The couples' interactions were videotaped through a designed scenario including problem solving, decision making, and a review conversation about a shared pleasant event. Participants also completed the Marital Adjustment Test (MAT). The couples' interactions were coded using the Iranian Couples Interaction Coding System (ICICS). Results revealed that the more positive the interactions between couples, the more use of "problem solving" and "negotiation" skills was observed, whereas an increase in the couples' negative interactions was associated with less application of these skills. It was also revealed that while the couples' negative interactions significantly predicted use of the "decision making" skill, couples with more positive interactions were in the group able to hold a review conversation about a shared positive event. In addition, results indicated that couples who were more able to apply the problem solving, negotiation, decision making, and review-conversation skills also had higher scores on marital adjustment. The destructive role of the couples' negative interactions in intensifying negative emotions between them is discussed; it appears that couples are then unwilling to help each other resolve conflicts. As interactions become more negative, a negative filter develops and memory becomes biased towards negative events at the expense of shared positive events.

  19. Incorporating stakeholders' preferences for ex ante evaluation of energy and climate policy interactions. Development of a Multi Criteria Analysis weighting methodology

    Grafakos, S.; Zevgolis, D.; Oikonomou, V.

    2008-03-01

    Evaluation of energy and climate policy interactions is a complex issue which has not been addressed systematically. Multi Criteria Decision Analysis (MCDA) evaluation processes have been applied widely to different policy and decision cases, as they have the ability to cope with high complexity by structuring and analyzing the policy problem in a transparent and systematic way. Criteria weight elicitation techniques are developed within the framework of MCDA to integrate stakeholders' preferential information in the decision making and evaluation process. There are various methods to determine criteria weights, which can be used in different ways for different policy evaluation purposes. During decision making, policy makers and relevant stakeholders implicitly or explicitly express the relative importance of the evaluation criteria by assigning weighting factors to them. More particularly, climate change policy problems lack a simple, transparent and structured way to incorporate stakeholders' views and values. In order to incorporate stakeholders' weighting preferences into an ex ante evaluation of the interaction of climate change and energy policy instruments, an integrative, constructive weighting methodology has been developed. This paper presents the main characteristics of the evaluation of energy and climate policy interactions, the reasoning behind the development of the weighting tool, its main theoretical and functional characteristics, and the results of its application to obtain and incorporate stakeholders' preferences on energy and climate change policy evaluation criteria. The weighting method that has been elaborated and applied to derive stakeholders' preferences for criteria weights is a combination of pairwise comparison and ratio importance weighting methods. It initially introduces the stakeholders to the evaluation process through a warming-up holistic approach for ranking the criteria and then requires them to express their ratio relative importance
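
    A stripped-down version of the ratio-importance step (the paper's interactive, two-stage elicitation is richer than this): weights derived from a pairwise comparison matrix via row geometric means:

        import numpy as np

        criteria = ["cost efficiency", "CO2 reduction", "political feasibility"]
        # pairwise[i][j]: how many times criterion i is judged more important than j
        pairwise = np.array([[1.0, 1/2, 3.0],
                             [2.0, 1.0, 4.0],
                             [1/3, 1/4, 1.0]])

        gm = pairwise.prod(axis=1) ** (1.0 / len(criteria))
        weights = gm / gm.sum()
        for c, w in zip(criteria, weights):
            print(f"{c}: {w:.3f}")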

  20. Theory and Applications of Weakly Interacting Markov Processes

    2018-02-03

    between a node and its neighbor is inversely proportional to the total number of neighbors of that node. Such stochastic systems arise in many different... jumps and models with simultaneous jumps that arise in applications. (1.ii.d) Uniform in Time Interacting Particle Approximations for Nonlinear... problems. (1.iv.a) Diffusion Approximations for Controlled Weakly Interacting Large Finite State Systems with Simultaneous Jumps [25]. We consider a rate

  1. Application of response surface methodology to optimize uranium biological leaching at high pulp density

    Fatemi, Faezeh; Arabieh, Masoud; Jahani, Samaneh

    2016-01-01

    The aim of the present study was to carry out uranium bioleaching via optimization of the leaching process using response surface methodology. For this purpose, the native Acidithiobacillus sp. was adapted to different pulp densities, followed by an optimization process carried out at a high pulp density. Response surface methodology based on a Box-Behnken design was used to optimize the uranium bioleaching. The effects of six key parameters on the bioleaching efficiency were investigated. The process was modeled with a mathematical equation including not only first- and second-order terms but also probable interaction effects between each pair of factors. The results showed that the extraction efficiency of uranium dropped from 100% at pulp densities of 2.5, 5, 7.5 and 10% to 68% at a pulp density of 12.5%. Using RSM, the optimum conditions for uranium bioleaching (12.5% (w/v)) were identified as pH = 1.96, temperature = 30.90 °C, stirring speed = 158 rpm, 15.7% inoculum, FeSO₄·7H₂O concentration of 13.83 g/L and (NH₄)₂SO₄ concentration of 3.22 g/L, which achieved 83% uranium extraction efficiency. A uranium bioleaching experiment using the optimized parameters showed 81% uranium extraction over 15 d. The obtained results reveal that RSM is reliable and appropriate for optimization of the parameters involved in the uranium bioleaching process.

  2. Application of response surface methodology to optimize uranium biological leaching at high pulp density

    Fatemi, Faezeh; Arabieh, Masoud; Jahani, Samaneh [NSTRI, Tehran (Iran, Islamic Republic of). Nuclear Fuel Cycle Research School

    2016-08-01

    The aim of the present study was to carry out uranium bioleaching via optimization of the leaching process using response surface methodology. For this purpose, the native Acidithiobacillus sp. was adapted to different pulp densities, followed by an optimization process carried out at a high pulp density. Response surface methodology based on a Box-Behnken design was used to optimize the uranium bioleaching. The effects of six key parameters on the bioleaching efficiency were investigated. The process was modeled with a mathematical equation including not only first- and second-order terms but also probable interaction effects between each pair of factors. The results showed that the extraction efficiency of uranium dropped from 100% at pulp densities of 2.5, 5, 7.5 and 10% to 68% at a pulp density of 12.5%. Using RSM, the optimum conditions for uranium bioleaching (12.5% (w/v)) were identified as pH = 1.96, temperature = 30.90 °C, stirring speed = 158 rpm, 15.7% inoculum, FeSO₄·7H₂O concentration of 13.83 g/L and (NH₄)₂SO₄ concentration of 3.22 g/L, which achieved 83% uranium extraction efficiency. A uranium bioleaching experiment using the optimized parameters showed 81% uranium extraction over 15 d. The obtained results reveal that RSM is reliable and appropriate for optimization of the parameters involved in the uranium bioleaching process.

  3. Quantum mechanical reactive scattering theory for simple chemical reactions: Recent developments in methodology and applications

    Miller, W.H.

    1989-08-01

    It has recently been discovered that the S-matrix version of the Kohn variational principle is free of the 'Kohn anomalies' that have plagued other versions and prevented its general use. This has made a major contribution to heavy-particle reactive (and also to electron-atom/molecule) scattering, which involves non-local (i.e., exchange) interactions that prevent solution of the coupled channel equations by propagation methods. This paper reviews the methodology briefly and presents a sample of integral and differential cross sections that have been obtained for the H + H₂ → H₂ + H and D + H₂ → HD + H reactions in the high energy region (up to 1.2 eV translational energy) relevant to resonance structures reported in recent experiments. 35 refs., 11 figs

  4. An Innovative Structural Mode Selection Methodology: Application for the X-33 Launch Vehicle Finite Element Model

    Hidalgo, Homero, Jr.

    2000-01-01

    An innovative methodology for determining structural target mode selection and mode selection based on a specific criterion is presented. An effective approach to single out modes which interact with specific locations on a structure has been developed for the X-33 Launch Vehicle Finite Element Model (FEM). We present a Root-Sum-Square (RSS) displacement method that computes the resultant modal displacement for each mode at selected degrees of freedom (DOF) and sorts the modes to locate those with the highest values. This method was used to determine the modes which most influenced specific locations/points on the X-33 flight vehicle, such as avionics control components, aero-surface control actuators, propellant valve and engine points, for use in flight control stability analysis and flight POGO stability analysis. Additionally, the modal RSS method allows primary or global target vehicle modes to be identified in an accurate and efficient manner.
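
    The RSS ranking itself is compact; a sketch with a random mode-shape matrix standing in for the X-33 FEM output:

        import numpy as np

        rng = np.random.default_rng(1)
        phi = rng.normal(size=(20, 300))      # 20 mode shapes over 300 DOFs (toy data)
        selected = [10, 11, 12, 45, 46, 47]   # e.g. x/y/z DOFs at two actuator points

        # Root-sum-square resultant displacement of each mode at the selected DOFs.
        rss = np.sqrt((phi[:, selected] ** 2).sum(axis=1))
        ranking = np.argsort(rss)[::-1]
        print("modes most influential at the selected points:", ranking[:5])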

  5. Estimating the cost of delaying a nuclear power plant: methodology and application

    Hill, L.J.; Tepel, R.C.; Van Dyke, J.W.

    1985-01-01

    This paper presents an analysis of an actual 24-month nuclear power plant licensing delay under alternate assumptions about regulatory practice, sources of replacement power, and the cost of the plant. The analysis focuses on both the delay period and periods subsequent to the delay. The methodology utilized to simulate the impacts involved the recursive interaction of a generation-costing program to estimate fuel-replacement costs and a financial regulatory model to concomitantly determine the impact on the utility, its ratepayers, and security issues. The results indicate that a licensing delay has an adverse impact on the utility's internal generation of funds and financial indicators used to evaluate financial soundness. The direction of impact on electricity rates is contingent on the source of fuel used for replacement power. 5 references, 5 tables

  6. Methods of dichotic listening as a research methodology for hemispheric interaction.

    Kovyazina M.S.

    2014-07-01

    Full Text Available Experimental data were obtained from a dichotic listening test administered to patients with unilateral brain lesions and corpus callosum pathology (agenesis, cysts, degenerative changes, etc.). Efficiency index analysis shows that interhemispheric interaction in the audioverbal sphere depends to a greater extent on the state of the right hemisphere. The dichotic listening technique is not an informative means of studying hemispheric interaction, since it does not allow a clear distinction between hemispheric symptoms and symptoms of corpus callosum pathology. Thus, disturbances of interhemispheric relations caused by disorders of the corpus callosum and of the cerebral hemispheres depend more on right-hemisphere activity.

  7. Outbreak!: Teaching Clinical and Diagnostic Microbiology Methodologies with an Interactive Online Game

    Clark, Sherri; Smith, Geoffrey Battle

    2004-01-01

    Outbreak! is an online, interactive educational game that helps students and teachers learn and evaluate clinical microbiology skills. When the game was used in introductory microbiology laboratories, qualitative evaluation by students showed very positive responses and increased learning. Outbreak! allows students to design diagnostic tests and…

  8. Regulation of dopamine transporter function by protein-protein interactions: new discoveries and methodological challenges

    Eriksen, Jacob; Jørgensen, Trine Nygaard; Gether, Ulrik

    2010-01-01

    -synaptic neurons. This has led to the identification of a plethora of different kinases, receptors and scaffolding proteins that interact with DAT and hereby either modulate the catalytic activity of the transporter or regulate its trafficking and degradation. Several new tools for studying DAT regulation in live...

  9. Interdepartmental interaction model on the extracurricular activities of students in the city of Surgut in the quality management system of the municipal state institution "Information and Methodological Center"

    Loseva E. A.

    2018-01-01

    In this article the author considers an interdepartmental interaction model for the extracurricular activities of students within a quality management system. The topic is examined using the example of the municipal state institution "Information and Methodological Center".

  10. Application of a power plant simplification methodology: The example of the condensate feedwater system

    Seong, P.H.; Manno, V.P.; Golay, M.W.

    1988-01-01

    A novel framework for the systematic simplification of power plant design is described, with a focus on its application to the optimization of condensate feedwater system (CFWS) design. The evolution of design complexity of CFWS is reviewed with emphasis upon the underlying optimization process. A new evaluation methodology which includes explicit accounting of human as well as mechanical effects upon system availability is described. The unifying figure of merit for an operating system is taken to be net electricity production cost. The evaluation methodology is applied to the comparative analysis of three designs. In the illustrative examples, the results illustrate how inclusion in the evaluation of explicit availability-related costs leads to optimal configurations. These are different from those of current system design practices in that thermodynamic efficiency and capital cost optimization are not overemphasized. Rather, a more complete set of design-dependent variables is taken into account, and other important variables which remain neglected in current practices are identified. A critique of the new optimization approach and a discussion of future work areas, including improved human performance modeling and different optimization constraints, are provided. (orig.)

  11. Assessment of ISLOCA risk: Methodology and application to a Babcock and Wilcox nuclear power plant

    Galyean, W.J.; Gertman, D.I.

    1992-04-01

    This report presents information essential to understanding the risk associated with inter-system loss-of-coolant accidents (ISLOCAs). The methodology developed and presented in the report provides a state-of-the-art method for identifying and evaluating plant-specific hardware design, human performance issues, and accident consequence factors relevant to the prediction of ISLOCA risk. The ISLOCA methodology was developed and then applied to a Babcock and Wilcox (B&W) nuclear power plant. The results from this application are described in detail. For this particular B&W reference plant, the assessment indicated that the probability of a severe ISLOCA is approximately 2.2E-06/reactor-year. This document, Volume 3, provides Appendices A-H of the report. Topics are: historical experience related to ISLOCA events; component failure rates; reference B&W plant system descriptions; reference B&W plant ISLOCA event trees; human reliability analysis for the B&W ISLOCA probabilistic risk assessment; thermal-hydraulic calculations; bounding core uncovery time calculations; and system rupture probability.

  12. Application of machine learning methodology for PET-based definition of lung cancer

    Kerhet, A.; Small, C.; Quon, H.; Riauka, T.; Schrader, L.; Greiner, R.; Yee, D.; McEwan, A.; Roa, W.

    2010-01-01

    We applied a learning methodology framework to assist in the threshold-based segmentation of non-small-cell lung cancer (NSCLC) tumours in positron-emission tomography–computed tomography (PET–CT) imaging for use in radiotherapy planning. Gated and standard free-breathing studies of two patients were independently analysed (four studies in total). Each study had a PET–CT and a treatment-planning CT image. The reference gross tumour volume (GTV) was identified by two experienced radiation oncologists who also determined reference standardized uptake value (SUV) thresholds that most closely approximated the GTV contour on each slice. A set of uptake distribution-related attributes was calculated for each PET slice. A machine learning algorithm was trained on a subset of the PET slices to cope with slice-to-slice variation in the optimal SUV threshold: that is, to predict the most appropriate SUV threshold from the calculated attributes for each slice. The algorithm's performance was evaluated using the remainder of the PET slices. A high degree of geometric similarity was achieved between the areas outlined by the predicted and the reference SUV thresholds (Jaccard index exceeding 0.82). No significant difference was found between the gated and the free-breathing results in the same patient. In this preliminary work, we demonstrated the potential applicability of a machine learning methodology as an auxiliary tool for radiation treatment planning in NSCLC. PMID:20179802
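
    The geometric-similarity score quoted above is the Jaccard index of the two segmentation masks; a self-contained check with toy masks:

        import numpy as np

        def jaccard(a, b):
            a, b = np.asarray(a, bool), np.asarray(b, bool)
            union = np.logical_or(a, b).sum()
            return np.logical_and(a, b).sum() / union if union else 1.0

        ref = np.zeros((64, 64), bool);  ref[20:40, 20:40] = True    # reference GTV slice
        pred = np.zeros((64, 64), bool); pred[22:40, 21:41] = True   # predicted-threshold area
        print(f"Jaccard index: {jaccard(ref, pred):.2f}")            # ~0.82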

  13. An update on technical and methodological aspects for cardiac PET applications

    PRESOTTO, Luca; BUSNARDO, Elena; GIANOLLI, Luigi; BETTINARDI, Valentino

    2016-01-01

    Positron emission tomography (PET) is indicated for a large number of cardiac diseases: perfusion and viability studies are commonly used to evaluate coronary artery disease; PET can also be used to assess sarcoidosis and endocarditis, as well as to investigate amyloidosis. Furthermore, a hot topic for research is plaque characterization. Most of these studies are technically very challenging. High count rates and short acquisition times characterize perfusion scans, while very small targets have to be imaged in inflammation/infection and plaque examinations. Furthermore, cardiac PET suffers from respiratory and cardiac motion blur. Each type of study has specific requirements from the technical and methodological point of view; thus, PET systems with overall high performance are required. Furthermore, in the era of hybrid PET/computed tomography (CT) and PET/magnetic resonance imaging (MRI) systems, the combination of complementary functional and anatomical information can be used to improve diagnosis and prognosis. Moreover, PET images can be qualitatively and quantitatively improved by exploiting information from the other modality, using advanced algorithms. In this review we report the latest technological and methodological innovations for PET cardiac applications, with particular reference to the state of the art of hybrid PET/CT and PET/MRI.

  14. Application of REPAS Methodology to Assess the Reliability of Passive Safety Systems

    Franco Pierro

    2009-01-01

    Full Text Available The paper deals with the presentation of the Reliability Evaluation of Passive Safety System (REPAS) methodology developed by the University of Pisa. The general objective of REPAS is to characterize in an analytical way the performance of a passive system in order to increase confidence in its operation and to compare the performances of active and passive systems and the performances of different passive systems. REPAS can be used in the design of passive safety systems to assess their goodness and to optimize their costs. It may also provide numerical values that can be used in more complex safety assessment studies, and it can be seen as a support to Probabilistic Safety Analysis studies. With regard to this, some examples of the application of the methodology are reported in the paper. A best-estimate thermal-hydraulic code, RELAP5, has been used to support the analyses and to model the selected systems. Probability distributions have been assigned to the uncertain input parameters through engineering judgment. The Monte Carlo method has been used to propagate uncertainties, and Wilks' formula has been taken into account to select the sample size. Failure criteria are defined in terms of non-fulfilment of the defined design targets.

  15. Vulnerability or Sensitivity to the Environment? Methodological Issues, Trends, and Recommendations in Gene–Environment Interactions Research in Human Behavior

    Caroline Leighton

    2017-06-01

    Full Text Available Research on the potential role of gene–environment interactions (GxE) in explaining vulnerability to psychopathology in humans has witnessed a shift from a diathesis-stress perspective to differential susceptibility approaches. This paper critically reviews methodological issues and trends in this body of research. Databases were screened for studies of GxE in the prediction of personality traits, behavior, and mental health disorders in humans published between January 2002 and January 2015. In total, 315 papers were included. Results showed that 34 candidate genes have been included in GxE studies. Independent of the type of environment studied (early or recent life events, positive or negative environments), about 67–83% of studies have reported significant GxE interactions, which is consistent with a social susceptibility model. The percentage of positive results does not seem to differ depending on the gene studied, although publication bias might be involved. However, the number of positive findings differs depending on the population studied (i.e., young adults vs. older adults). Methodological considerations limit the ability to draw strong conclusions, particularly as almost 90% (n = 283/315) of published papers are based on samples from North America and Europe, and about 70% of published studies (219/315) are based on samples that were also used in other reports. At the same time, there are clear indications of methodological improvements over time, as is shown by a significant increase in longitudinal and experimental studies as well as improved minimum genotyping. Recommendations for future research, such as minimum quality assessment of genes and environmental factors, specifying the theoretical models guiding the study, and taking into account cultural, ethnic, and lifetime perspectives, are formulated.

  16. Process monitoring for intelligent manufacturing processes - Methodology and application to Robot Assisted Polishing

    Pilny, Lukas

    Process monitoring provides important information on the product, process and manufacturing system during part manufacturing. Such information can be used for process optimization and detection of undesired processing conditions to initiate timely actions for avoidance of defects, thereby improving quality assurance. This thesis is aimed at a systematic development of process monitoring solutions, constituting a key element of intelligent manufacturing systems towards zero defect manufacturing. A methodological approach of general applicability is presented in this concern. The approach consists of six consecutive steps for identification of product Vital Quality Characteristics (VQCs) and Key Process Variables (KPVs), selection and characterization of sensors, optimization of sensor placement, validation of the monitoring solutions, and definition of the reference manufacturing performance...

  17. Social Life Cycle Assessment as a Management Tool: Methodology for Application in Tourism

    Roberto Merli

    2013-08-01

    Full Text Available As is widely known, sustainability is an important factor in competition, increasing the added value of a company in terms of image and credibility. However, it is important that sustainability assessments are effectively addressed from a global perspective. Therefore, life cycle tools are adopted to evaluate environmental and social impacts. Among these, of particular significance is the Social Life Cycle Assessment (SLCA), which, although at an early stage of development, seems to have extremely promising methodological features. For this reason, it seemed interesting to propose a first application to the tourism sector, which can be studied in terms of social sustainability data better than other sectors. The particular characteristics of service delivery lend themselves more readily than those of other sectors to the development of data related to social sustainability. In this paper the results of a case study carried out using social accounting and business management tools are shown.

  18. A statistical methodology for the estimation of extreme wave conditions for offshore renewable applications

    Larsén, Xiaoli Guo; Kalogeri, Christina; Galanis, George

    2015-01-01

    and post-process outputs from a high resolution numerical wave modeling system for extreme wave estimation based on the significant wave height. This approach is demonstrated through the data analysis at a relatively deep water site, FINO 1, as well as a relatively shallow water area, the coastal site Horns Rev, which is located in the North Sea, west of Denmark. The post-processing targets correcting the modeled time series of the significant wave height, in order to match the statistics of the corresponding measurements, including not only the conventional parameters such as the mean and standard... as a characteristic index of extreme wave conditions. The results from the proposed methodology seem to be in good agreement with the measurements at both the relatively deep, open water and the shallow, coastal water sites, providing a potentially useful tool for offshore renewable energy applications. © 2015...

  19. Methodology for validating technical tools to assess customer Demand Response: Application to a commercial customer

    Alcazar-Ortega, Manuel; Escriva-Escriva, Guillermo; Segura-Heras, Isidoro

    2011-01-01

    The authors present a methodology, which is demonstrated with some applications to the commercial sector, to validate a Demand Response (DR) evaluation method previously developed and applied to a wide range of industrial and commercial segments, whose flexibility was evaluated by modeling. DR is playing an increasingly important role in the framework of electricity systems management for the effective integration of other distributed energy resources. Consequently, customers must identify what they are using the energy for in order to use their flexible loads for management purposes. Modeling tools are used to predict the impact of flexibility on the behavior of customers, but this result needs to be validated, since both customers and grid operators have to be confident in these flexibility predictions. An easy-to-use two-step method to achieve this goal is presented in this paper.

  20. The application of life cycle assessment to integrated solid waste management. Pt. 1: Methodology

    Clift, R.; Doig, A.; Finnveden, G.

    2000-07-01

    Integrated Waste Management is one of the holistic approaches to environmental and resource management which are emerging from applying the concept of sustainable development. Assessment of waste management options requires application of Life Cycle Assessment (LCA). This paper summarizes the methodology for applying LCA to Integrated Waste Management of Municipal Solid Wastes (MSW) developed for and now used by the UK Environment Agency, including recent developments in international fora. Particular attention is devoted to system definition leading to rational and clear compilation of the Life Cycle Inventory, with appropriate 'credit' for recovering materials and/or energy from the waste. LCA of waste management is best seen as a way of structuring information to help decision processes. (Author)

  1. Selection of phage-displayed accessible recombinant targeted antibodies (SPARTA): methodology and applications.

    D'Angelo, Sara; Staquicini, Fernanda I; Ferrara, Fortunato; Staquicini, Daniela I; Sharma, Geetanjali; Tarleton, Christy A; Nguyen, Huynh; Naranjo, Leslie A; Sidman, Richard L; Arap, Wadih; Bradbury, Andrew Rm; Pasqualini, Renata

    2018-05-03

    We developed a potentially novel and robust antibody discovery methodology, termed selection of phage-displayed accessible recombinant targeted antibodies (SPARTA). This combines an in vitro screening step of a naive human antibody library against known tumor targets, with in vivo selections based on tumor-homing capabilities of a preenriched antibody pool. This unique approach overcomes several rate-limiting challenges to generate human antibodies amenable to rapid translation into medical applications. As a proof of concept, we evaluated SPARTA on 2 well-established tumor cell surface targets, EphA5 and GRP78. We evaluated antibodies that showed tumor-targeting selectivity as a representative panel of antibody-drug conjugates (ADCs) and were highly efficacious. Our results validate a discovery platform to identify and validate monoclonal antibodies with favorable tumor-targeting attributes. This approach may also extend to other diseases with known cell surface targets and affected tissues easily isolated for in vivo selection.

  2. Improving life cycle assessment methodology for the application of decision support

    Herrmann, Ivan Tengbjerg

    for the application of decision support and evaluation of uncertainty in LCA. From a decision maker's (DM's) point of view there are at least three main "illness" factors influencing the quality of the information that the DM uses for making decisions. The factors are not independent of each other, but it seems... refrain from making a decision based on an LCA and thus support a decision on other parameters than the LCA environmental parameters. Conversely, it may in some decision support contexts be acceptable to base a decision on highly uncertain information. This all depends on the specific decision support... the different steps. A deterioration of the quality in each step is likely to accumulate through the statistical value chain in terms of increased uncertainty and bias. Ultimately this can make final decision support problematic. The "Law of large numbers" (LLN) is the methodological tool/probability theory...

  3. Designing interaction behaviour in service-oriented enterprise application integration

    Dirgahayu, T.; Quartel, Dick; van Sinderen, Marten J.

    In this paper we present an approach for designing interaction behaviour in service-oriented enterprise application integration. The approach enables business analysts to actively participate in the design of an integration solution. In this way, we expect that the solution meets its integration

  4. Designing interactive ambient multimedia applications : requirements and implementation challenges

    Obrenovic, Z.; Nack, F.-M.; Hardman, H.L.

    2006-01-01

    Ambient intelligence opens new possibilities for interactive multimedia, leading towards applications where the selection, generation and playback of multimedia content can be directed and influenced by multiple users in an ambient sensor network. In this paper, we derive the basic requirements for

  5. Development of the CPXSD Methodology for Generation of Fine-Group Libraries for Shielding Applications

    Alpan, F. Arzu; Haghighat, Alireza

    2005-01-01

    Multigroup cross sections are one of the major factors that cause uncertainties in the results of deterministic transport calculations. Thus, it is important to prepare effective cross-section libraries that include an appropriate group structure and are based on an appropriate spectrum. There are several multigroup cross-section libraries available for particular applications. For example, the 47-neutron, 20-gamma group BUGLE library that is derived from the 199-neutron, 42-gamma group VITAMIN-B6 library is widely used for light water reactor (LWR) shielding and pressure vessel dosimetry applications. However, there is no publicly available methodology that can construct problem-dependent libraries. Thus, the authors have developed the Contributon and Point-wise Cross Section Driven (CPXSD) methodology for constructing effective fine- and broad-group structures. In this paper, new fine-group structures were constructed using CPXSD, and new fine-group cross-section libraries were generated. The 450-group LIB450 and 589-group LIB589 libraries were developed for neutrons sensitive to the fast and thermal energy ranges, respectively, for LWR shielding problems. Compared to a VITAMIN-B6-like library, the new fine-group library developed for fast neutron dosimetry calculations resulted in closer agreement with the continuous-energy predictions. For example, for the fast neutron cavity dosimetry, an ∼4% improvement was observed for the ²³⁷Np(n,f) reaction rate. For the thermal neutron ¹H(n,γ) reaction, a maximum improvement of ∼14% was observed in the reaction rate at the mid-downcomer position.

  6. Applications of a surveillance and diagnostics methodology using neutron noise from a pressurized-water reactor

    Wood, R.T.; Miller, L.F.; Perez, R.B.

    1992-01-01

    Two applications of a noise diagnostic methodology were performed with ex-core neutron detector data from a pressurized-water reactor (PWR). A feedback dynamics model of the neutron power spectral density was derived from a low-order whole-plant physical model made stochastic with the Langevin technique. From a functional fit to plant data, the response of the dynamic system to changes in important physical parameters was evaluated by a direct sensitivity analysis. In addition, changes in monitored spectra were related to changes in physical parameters, and detection thresholds using common surveillance discriminants were determined. A resonance model was developed from perturbation theory to give the ex-core neutron detector response for small in-core mechanical motions in terms of a pole-strength factor, a resonance asymmetry (or skewness) factor, a vibration damping factor, and a frequency of vibration. The mechanical motion parameters for several resonances were determined by a functional fit of the model to plant data taken at various times during a fuel cycle and were tracked to determine trends that indicated vibrational changes of reactor internals. In addition, the resonance model gave the ability to separate the resonant components of the power spectral density after the parameters had been identified. As a result, the behavior of several vibration peaks was monitored over a fuel cycle. The noise diagnostic methodology illustrated by these applications can be used in monitoring the condition of the reactor system. Early detection of degraded mechanical components or undesirable operating conditions by using such surveillance and diagnostic techniques would enhance plant safety. 15 refs., 6 figs., 1 tab

  7. [Binding interaction of harpagoside and bovine serum albumin: spectroscopic methodologies and molecular docking].

    Cao, Tuan-Wu; Huang, Wen-Bing; Shi, Jian-Wei; He, Wei

    2018-03-01

    Scrophularia ningpoensis has exhibited a variety of biological activities and been used as a pharmaceutical product for the treatment of inflammatory ailments, rheumatoid arthritis, osteoarthritis and so on. Harpagoside (HAR) is considered the main bioactive compound in this plant. Serum albumin has important physiological roles in the transportation, distribution and metabolism of many endogenous and exogenous substances in the body. It is therefore of great significance to study the interaction mechanism between HAR and bovine serum albumin (BSA). The mechanism of interaction between HAR and BSA was investigated using 2D and 3D fluorescence, synchronous fluorescence, ultraviolet spectroscopy and molecular docking. According to the analysis of the fluorescence spectra, HAR strongly quenches the fluorescence of BSA through a static quenching process, as indicated by the decrease in the quenching constant with increasing temperature. The magnitude of the binding constants (KA) was more than 1×10⁵ L·mol⁻¹, and the number of binding sites (n) was approximately 1. The thermodynamic parameters were calculated through analysis of the fluorescence data with the Stern-Volmer and Van't Hoff equations. The calculated enthalpy change (ΔH) and entropy change (ΔS) implied that the main interaction forces between HAR and BSA were van der Waals forces and hydrogen bonding. The negative values of the free energy (ΔG) demonstrated that the binding of HAR with BSA is a spontaneous and exothermic process. The binding distance (r) between HAR and BSA was calculated to be about 2.80 nm based on Förster's theory of non-radiative energy transfer, which indicated that energy is likely to be transferred from BSA to HAR. Both synchronous and 3D fluorescence spectroscopy clearly revealed that the microenvironment and conformation of BSA changed during the binding interaction between HAR and BSA. The molecular docking analysis revealed HAR is more inclined to BSA and human serum albumin
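
    The quenching analysis referenced above rests on the Stern-Volmer relation F0/F = 1 + K_SV[Q]; a sketch on synthetic intensities (the paper's measured values are not reproduced):

        import numpy as np

        Q = np.array([0.0, 2e-6, 4e-6, 6e-6, 8e-6])      # quencher (HAR) conc., mol/L
        F = np.array([100.0, 83.4, 71.5, 62.6, 55.6])    # fluorescence intensities

        ratio = F[0] / F[1:] - 1.0                        # F0/F - 1 vs. [Q]
        slope, intercept = np.polyfit(Q[1:], ratio, 1)
        print(f"K_SV ~ {slope:.2e} L/mol")                # ~1e5 for this synthetic set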

  8. Simple knowledge-based descriptors to predict protein-ligand interactions. Methodology and validation

    Nissink, J. Willem M.; Verdonk, Marcel L.; Klebe, Gerhard

    2000-11-01

    A new type of shape descriptor is proposed to describe the spatial orientation of non-covalent interactions. It is built from simple, anisotropic Gaussian contributions that are parameterised by 10 adjustable values. The descriptors have been used to fit propensity distributions derived from scatter data stored in the IsoStar database. This database holds composite pictures of possible interaction geometries between a common central group and various interacting moieties, as extracted from small-molecule crystal structures. These distributions can be related to probabilities for the occurrence of certain interaction geometries among different functional groups. A fitting procedure is described that generates the descriptors in a fully automated way. For this purpose, we apply a similarity index that is tailored to the problem, the Split Hodgkin Index. It accounts for the similarity in regions of either high or low propensity in a separate way. Although dependent on the division into these two subregions, the index is robust and performs better than the regular Hodgkin index. The reliability and coverage of the fitted descriptors were assessed using SuperStar. SuperStar usually operates on the raw IsoStar data to calculate propensity distributions, e.g., for a binding site in a protein. For our purpose we modified the code to have it operate on our descriptors instead. This resulted in a substantial reduction in calculation time (factor of five to eight) compared to the original implementation. A validation procedure was performed on a set of 130 protein-ligand complexes, using four representative interacting probes to map the properties of the various binding sites: ammonium nitrogen, alcohol oxygen, carbonyl oxygen, and methyl carbon. The predicted 'hot spots' for the binding of these probes were compared to the actual arrangement of ligand atoms in experimentally determined protein-ligand complexes. Results indicate that the version of SuperStar that applies to
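
    The Hodgkin index mentioned above has a closed form for two sampled property fields A and B, H = 2·Σ(A·B) / (Σ A² + Σ B²); the split variant applies it separately to the high- and low-propensity subregions:

        import numpy as np

        def hodgkin(a, b):
            return 2 * np.dot(a, b) / (np.dot(a, a) + np.dot(b, b))

        a = np.array([0.9, 0.7, 0.1, 0.0, 0.3])   # propensities on a sample grid
        b = np.array([0.8, 0.6, 0.2, 0.1, 0.2])   # fitted-descriptor values
        print(f"Hodgkin index: {hodgkin(a, b):.3f}")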

  9. Marked Object Recognition Multitouch Screen Printed Touchpad for Interactive Applications.

    Nunes, Jivago Serrado; Castro, Nelson; Gonçalves, Sergio; Pereira, Nélson; Correia, Vitor; Lanceros-Mendez, Senentxu

    2017-12-01

    The market for interactive platforms is rapidly growing, and touchscreens have been incorporated in an increasing number of devices. Thus, the area of smart objects and devices is strongly increasing by adding interactive touch and multimedia content, leading to new uses and capabilities. In this work, a flexible screen-printed sensor matrix is fabricated based on silver ink on a polyethylene terephthalate (PET) substrate. Diamond-shaped capacitive electrodes coupled with conventional capacitive reading electronics enable fabrication of a highly functional capacitive touchpad, and also allow for the identification of marked objects. For the latter, the capacitive signatures are identified by intersecting points and the distances between them. Thus, this work demonstrates the applicability of a low-cost method using royalty-free geometries and technologies for the development of flexible multitouch touchpads for the implementation of interactive and object recognition applications.
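
    The object-identification step described above (capacitive signatures from intersecting points and the distances between them) can be sketched as a translation-invariant matching of sorted pairwise distances. The marker geometry, tolerance and matching rule are illustrative assumptions, not the authors' implementation:

      import itertools, math

      def signature(points):
          # Sorted pairwise distances between a marker's contact points (mm)
          return sorted(math.dist(p, q) for p, q in itertools.combinations(points, 2))

      def match(points, library, tol=2.0):
          # Return the best-matching marker ID within tolerance, or None
          sig = signature(points)
          best, best_err = None, float("inf")
          for marker_id, ref in library.items():
              if len(ref) == len(sig):
                  err = max(abs(a - b) for a, b in zip(sig, ref))
                  if err < tol and err < best_err:
                      best, best_err = marker_id, err
          return best

      library = {"token_A": signature([(0, 0), (30, 0), (0, 40)])}  # hypothetical marker
      print(match([(5, 5), (35, 5), (5, 45)], library))             # -> token_A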

  10. Marked Object Recognition Multitouch Screen Printed Touchpad for Interactive Applications

    Jivago Serrado Nunes

    2017-12-01

    Full Text Available The market for interactive platforms is rapidly growing, and touchscreens have been incorporated in an increasing number of devices. Thus, the area of smart objects and devices is strongly increasing by adding interactive touch and multimedia content, leading to new uses and capabilities. In this work, a flexible screen-printed sensor matrix is fabricated based on silver ink on a polyethylene terephthalate (PET) substrate. Diamond-shaped capacitive electrodes coupled with conventional capacitive reading electronics enable fabrication of a highly functional capacitive touchpad, and also allow for the identification of marked objects. For the latter, the capacitive signatures are identified by intersecting points and the distances between them. Thus, this work demonstrates the applicability of a low-cost method using royalty-free geometries and technologies for the development of flexible multitouch touchpads for the implementation of interactive and object recognition applications.

  11. A non-linear reduced order methodology applicable to boiling water reactor stability analysis

    Prill, Dennis Paul

    2013-01-01

    Thermal-hydraulic coupling between power, flow rate and density, intensified by neutronics feedback, is the main driver of boiling water reactor (BWR) stability behavior. High-power low-flow conditions in connection with unfavorable power distributions can lead the BWR system into unstable regions where power oscillations can be triggered. This important threat to operational safety requires careful analysis for proper understanding. Analyzing an exhaustive parameter space of the non-linear BWR system becomes feasible with methodologies based on reduced order models (ROMs), saving computational cost and improving physical understanding. Presently, within reactor dynamics, no general and automatic derivation of high-dimensional ROMs from detailed BWR models is available. In this thesis a systematic, self-contained model order reduction (MOR) technique is derived which is applicable to several classes of dynamical problems, and in particular to BWRs of any degree of detail. Expert knowledge can be given by operational, experimental or numerical transient data and is transferred into an optimal basis function representation. The methodology is mostly automated and provides the framework for the reduction of various systems of any level of complexity. Little effort is necessary to attain a reduced version within this in-house code, which is based on coupling with sophisticated commercial software. The methodology reduces a complex system in a grid-free manner to a small system able to capture even non-linear dynamics. It is based on an optimal choice of basis functions given by the so-called proper orthogonal decomposition (POD). The steps required to achieve reliable and numerically stable ROMs are given by a distinct calibration road-map. In validation and verification steps, a wide spectrum of representative test examples is systematically studied with regard to a later BWR application. The first example is non-linear and has a dispersive character
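
    The core POD step can be condensed to a few lines: collect transient snapshots, take an SVD, and keep the leading modes as the reduced basis. The snapshot matrix below is a random placeholder; in the thesis setting it would hold transient data from a detailed BWR model:

      import numpy as np

      rng = np.random.default_rng(0)
      X = rng.standard_normal((2000, 200))   # 2000 DOFs x 200 snapshots (placeholder data)

      U, s, _ = np.linalg.svd(X, full_matrices=False)
      energy = np.cumsum(s**2) / np.sum(s**2)
      r = int(np.searchsorted(energy, 0.999)) + 1   # modes capturing 99.9% of the energy
      Phi = U[:, :r]                                # POD basis (optimal in the L2 sense)

      # Reduction/reconstruction of one state; a Galerkin projection of the governing
      # equations onto Phi would yield the actual ROM dynamics
      a = Phi.T @ X[:, 0]
      print(r, np.linalg.norm(X[:, 0] - Phi @ a) / np.linalg.norm(X[:, 0]))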

  12. Development and application of a methodology for identifying and characterising scenarios

    Billington, D.; Bailey, L.

    1998-01-01

    interval along each timeline. This report presents illustrative examples of the application of the above methodology to achieve this aim. The results of risk calculations and assigned weights are plotted on a 'weight-risk diagram', which is used to judge the relative significance of the different variant scenarios in relation to the base scenario and the regulatory risk target. The application of this methodology is consistent with a staged approach to performance assessment, in which effort is focused initially on scoping calculations of conditional risk. Only those variant scenarios giving a higher conditional risk than the base scenario are subject to more detailed evaluation, including the assignment of an appropriate weight. From the limited trialling that has been undertaken, the indications are that a tractable approach, consistent with the objectives of comprehensiveness, traceability and clarity, has been achieved. (author)

  13. [Proposed difficult airway teaching methodology. Presentation of an interactive fresh frozen cadaver model].

    Catalá Bauset, J C; de Andres Ibañez, J A; Valverde Navarro, A; Martinez Soriano, F

    2014-04-01

    The aim of this paper is to present a methodology based on the use of fresh-frozen cadavers for training in airway management, and to evaluate the degree of satisfaction among the physicians trained. Six fresh-frozen cadavers and 14 workstations were prepared, where participants were trained in the different skills needed for airway management. The details of the preparation of the cadavers are described. The level of satisfaction of the participants was determined using a 5-point Likert rating scale at each of the 14 stations, along with an overall assessment of the course and its clinical usefulness. The mean overall evaluations of the course and its usefulness were 4.75 and 4.9 out of 5, respectively. All parts of the course were rated above 4 out of 5. The high level of satisfaction with the course remained homogeneous across the 2 editions analysed. Overall satisfaction with the course was not determined by any single one of its parts. The fresh cadaver model for training physicians in airway management techniques is satisfactory to participants, with a realism that approaches the live patient. Copyright © 2013 Sociedad Española de Anestesiología, Reanimación y Terapéutica del Dolor. Published by Elsevier España. All rights reserved.

  14. Canine neuroanatomy: Development of a 3D reconstruction and interactive application for undergraduate veterinary education.

    Raffan, Hazel; Guevar, Julien; Poyade, Matthieu; Rea, Paul M

    2017-01-01

    Current methods used to communicate and present the complex arrangement of vasculature related to the brain and spinal cord are limited in undergraduate veterinary neuroanatomy training. Traditionally it is taught with 2-dimensional (2D) diagrams, photographs and medical imaging scans, which show a fixed viewpoint. 2D representations of 3-dimensional (3D) objects, however, lead to a loss of spatial information, which can present problems when translating this to the patient. Computer-assisted learning packages with interactive 3D anatomical models have become established in medical training, yet equivalent resources are scarce in veterinary education. For this reason, we set out to develop a workflow methodology for creating an interactive model depicting the vasculature of the canine brain that could be used in undergraduate education. Using MR images of a dog and several commonly available software programs, we show how combining image editing, segmentation, surface generation, 3D modeling and texturing can result in the creation of a fully interactive application for veterinary training. In addition to clearly identifying a workflow methodology for the creation of this dataset, we have also demonstrated how an interactive tutorial and self-assessment tool can be incorporated into it. In conclusion, we present a workflow which has been successful in developing a 3D reconstruction of the canine brain and associated vasculature through segmentation, surface generation and post-processing of readily available medical imaging data. The reconstructed model was implemented into an interactive application for veterinary education designed to target the problems associated with learning neuroanatomy, primarily the inability to visualise complex spatial arrangements from 2D resources. The lack of similar resources in this field suggests this workflow is original within a veterinary context. There is great potential to explore this method, and introduce

  15. Video over DSL with LDGM Codes for Interactive Applications

    Laith Al-Jobouri

    2016-05-01

    Full Text Available Digital Subscriber Line (DSL) network access is subject to error bursts, which, for interactive video, can introduce unacceptable latencies if video packets need to be re-sent. If the video packets are protected against errors with Forward Error Correction (FEC), calculation of the application-layer channel codes themselves may also introduce additional latency. This paper proposes Low-Density Generator Matrix (LDGM) codes rather than other popular codes because they are more suitable for interactive video streaming, not only for their computational simplicity but also for their licensing advantage. The paper demonstrates that a reduction of up to 4 dB in video distortion is achievable with LDGM Application Layer (AL) FEC. In addition, an extension to the LDGM scheme is demonstrated, which works by rearranging the columns of the parity check matrix so as to make it even more resilient to burst errors. Telemedicine and video conferencing are typical target applications.

  16. Application of Binomial Model and Market Asset Declaimer Methodology for Valuation of Abandon and Expand Options. The Case Study

    Paweł Mielcarz

    2007-06-01

    Full Text Available The article presents a case study of the valuation of real options included in an investment project. The main goal of the article is to present the calculation and methodological issues of applying a real option valuation methodology. To this end, the binomial model and the Market Asset Declaimer methodology are used. The project presented in the article concerns the introduction of a radio station to a new market. It includes two valuable real options: to abandon the project and to expand.

  17. Algae-bacteria interactions: Evolution, ecology and emerging applications.

    Ramanan, Rishiram; Kim, Byung-Hyuk; Cho, Dae-Hyun; Oh, Hee-Mock; Kim, Hee-Sik

    2016-01-01

    Algae and bacteria have coexisted ever since the early stages of evolution. This coevolution has revolutionized life on earth in many aspects. Algae and bacteria together influence ecosystems as varied as deep seas and lichens, and represent all conceivable modes of interaction - from mutualism to parasitism. Several studies have shown that algae and bacteria synergistically affect each other's physiology and metabolism, a classic case being the algae-roseobacter interaction. These interactions are ubiquitous and define the primary productivity in most ecosystems. In recent years, algae have received much attention for industrial exploitation, but their interaction with bacteria is often considered contamination during commercialization. A few recent studies have shown that bacteria not only enhance algal growth but also help in flocculation, both essential processes in algal biotechnology. Hence, there is a need to understand these interactions from an evolutionary and ecological standpoint, and to integrate this understanding for industrial use. Here we reflect on the diversity of such relationships and their associated mechanisms, as well as the habitats that they mutually influence. This review also outlines the role of these interactions in key evolutionary events such as endosymbiosis, besides their ecological role in biogeochemical cycles. Finally, we focus on extending such studies on algal-bacterial interactions to various environmental and biotechnological applications. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  18. Methodology of Segment Management Reporting on the Profitability of Agricultural Holding Interaction with Customers

    Aleksandra Vasilyevna Glushchenko

    2015-12-01

    Full Text Available The state program for agricultural development and the regulation of markets for agricultural products, raw materials and food, under the food embargo on West European suppliers, is aimed at revitalizing holding structures. The main purpose of agricultural holdings is to ensure food safety and to maximize consolidated profit in resource-limited settings. The heterogeneous nature of customer needs, which leads to differing performance of the agricultural holding's interaction with them, has an impact on the organization and conduct of accounting and requires aggregated and relevant information about the profitability of relationships with groups of customers and a long-term strategy for developing the holding's interaction with them; hence there is a need for research and development of methodical bases for the formation of segment management reporting that meets the needs of modern practice. The purpose of this study is to develop a method of forming segment management reporting on the profitability of the agricultural holding's interaction with customers. As part of the research, the authors used different scientific methods, such as analysis, synthesis, observation, grouping of data and logical synthesis. The article discusses the necessity of segmenting agricultural holding customers by the criterion of "cooperation profitability". The basic problem of generating information about the cost of trading in the accounting information system of agricultural holdings is dealt with; a method of forming segment management reporting based on the results of ABC analysis, including a calculation algorithm for functional trade costs (Activity-Based Costing), is developed; a rank order of agroholding customers is suggested in accordance with the calculated interval limits: Segment A - "highly profitable customers," B - "problem customers" and C - "low-profit customers"; a set of registers and management accounting

  19. Application of extended statistical combination of uncertainties methodology for digital nuclear power plants

    In, Wang Ki; Uh, Keun Sun; Chul, Kim Heui [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-02-01

    A technically more direct statistical combination of uncertainties methodology, extended SCU (XSCU), was applied to statistically combine the uncertainties associated with the DNBR alarm setpoint and the DNBR trip setpoint of digital nuclear power plants. The modified SCU (MSCU) methodology is currently used as the USNRC-approved design methodology to perform the same function. In this report, the MSCU and XSCU methodologies were compared in terms of the total uncertainties and the net margins to the DNBR alarm and trip setpoints. The MSCU methodology resulted in small total penalties due to significantly negative biases, which are quite large. However, the XSCU methodology gave virtually unbiased total uncertainties. The net margins to the DNBR alarm and trip setpoints by the MSCU methodology agree with those by the XSCU methodology within statistical variations. (Author) 12 refs., 17 figs., 5 tabs.

  20. SystemVerilog assertions and functional coverage guide to language, methodology and applications

    Mehta, Ashok B

    2016-01-01

    This book provides a hands-on, application-oriented guide to the language and methodology of both SystemVerilog Assertions and SystemVerilog Functional Coverage. Readers will benefit from the step-by-step approach to functional hardware verification using SystemVerilog Assertions and Functional Coverage, which will enable them to uncover hidden and hard-to-find bugs, point directly to the source of a bug, provide a clean and easy way to model complex timing checks, and objectively answer the question 'have we functionally verified everything?'. Written by a professional end-user of ASIC/SoC/CPU and FPGA design and verification, this book explains each concept with easy-to-understand examples, simulation logs and applications derived from real projects. Readers will be empowered to tackle the modeling of complex checkers for functional verification, thereby drastically reducing their time to design and debug. This updated second edition addresses the latest functional set released in IEEE-1800 (2012) L...

  1. Demonstration of a software design and statistical analysis methodology with application to patient outcomes data sets.

    Mayo, Charles; Conners, Steve; Warren, Christopher; Miller, Robert; Court, Laurence; Popple, Richard

    2013-11-01

    With the emergence of clinical outcomes databases as tools utilized routinely within institutions comes the need for software tools to support automated statistical analysis of these large data sets and intrainstitutional exchange from independent federated databases to support data pooling. In this paper, the authors present a design approach and analysis methodology that address both issues. A software application was constructed to automate the analysis of patient outcomes data using a wide range of statistical metrics, by combining the use of C#.Net and R code. The accuracy and speed of the code were evaluated using benchmark data sets. The approach provides the data needed to evaluate combinations of statistical measurements for their ability to identify patterns of interest in the data. Through application of the tools to a benchmark data set for dose-response threshold and to SBRT lung data sets, an algorithm was developed that uses receiver operator characteristic curves to identify a threshold value and combines the use of contingency tables, Fisher exact tests, Welch t-tests, and Kolmogorov-Smirnov tests to filter the large data set to identify values demonstrating dose-response. Kullback-Leibler divergences were used to provide additional confirmation. The work demonstrates the viability of the design approach and the software tool for the analysis of large data sets.
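
    A condensed sketch of the filtering idea (a receiver-operator-characteristic threshold, then agreement across several tests) is given below. The data are synthetic, and the Youden-J threshold rule is an assumption standing in for the paper's ROC step:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      dose = rng.uniform(0, 60, 200)                 # hypothetical dose metric per patient
      event = (dose + rng.normal(0, 10, 200)) > 35   # hypothetical outcome flag

      # Scan candidate thresholds for maximum Youden's J = sensitivity + specificity - 1
      cuts = np.linspace(dose.min(), dose.max(), 101)
      J = [(event & (dose > c)).sum() / event.sum()
           + (~event & (dose <= c)).sum() / (~event).sum() - 1 for c in cuts]
      thr = cuts[int(np.argmax(J))]

      # Keep the variable only if several tests agree on an association
      above, below = event[dose > thr], event[dose <= thr]
      table = [[above.sum(), (~above).sum()], [below.sum(), (~below).sum()]]
      p_fisher = stats.fisher_exact(table)[1]
      p_welch = stats.ttest_ind(dose[event], dose[~event], equal_var=False).pvalue
      p_ks = stats.ks_2samp(dose[event], dose[~event]).pvalue
      print(thr, p_fisher, p_welch, p_ks)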

  2. Application of Response Surface Methodology to Optimize Malachite Green Removal by Cl-nZVI Nanocomposites

    Farshid Ghorbani

    2017-09-01

    Full Text Available Disposal of effluents containing dyes into natural ecosystems poses serious threats to both the environment and its aquatic life. Malachite green (MG) is a basic dye that has extensive industrial applications, especially in aquaculture, throughout the world. This study reports on the application of the central composite design (CCD) under the response surface methodology (RSM) for the optimization of MG adsorption from aqueous solutions using clinoptilolite nano-zerovalent iron (Cl-nZVI) nanocomposites. The sorbent structures produced are characterized by means of scanning electron microscopy (SEM), energy-dispersive X-ray spectroscopy (EDS), and vibrating sample magnetometry (VSM). The effects of different parameters, including pH, initial MG concentration, and sorbent dosage, on the removal efficiency (R) of MG were studied to find the optimum operating conditions. For this purpose, a total of 20 sets of experiments were designed with the Design-Expert 7.0 software and the values of removal efficiency were used as the input response to the software. The optimum pH, initial MG concentration, and sorbent dosage were found to be 5.6, 49.21 mg·L⁻¹, and 1.43 g·L⁻¹, respectively. A high MG removal efficiency (57.90%) was obtained with the optimal process parameters. Moreover, a desirability value of 0.963 was obtained for the optimization process.
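
    The CCD/RSM workflow (fit a full quadratic model to the designed runs, then search the fitted surface for the optimum) can be sketched as below. The design points and responses are invented for illustration and do not reproduce the study's measurements:

      import numpy as np
      from itertools import combinations

      # Hypothetical runs: columns are pH, initial MG conc. (mg/L), sorbent dose (g/L)
      X = np.array([[4, 30, 1.0], [7, 30, 1.0], [4, 70, 1.0], [7, 70, 1.0],
                    [4, 30, 2.0], [7, 30, 2.0], [4, 70, 2.0], [7, 70, 2.0],
                    [5.5, 50, 1.5], [5.5, 50, 1.5], [3, 50, 1.5], [8, 50, 1.5],
                    [5.5, 20, 1.5], [5.5, 80, 1.5], [5.5, 50, 0.5], [5.5, 50, 2.5]])
      y = np.array([41, 45, 38, 44, 50, 55, 47, 54, 58, 57, 35, 46, 52, 43, 40, 53.])

      def design(M):
          # Full quadratic model: intercept, linear, squared and two-way interaction terms
          cols = [np.ones(len(M))] + [M[:, i] for i in range(3)] \
               + [M[:, i]**2 for i in range(3)] \
               + [M[:, i] * M[:, j] for i, j in combinations(range(3), 2)]
          return np.column_stack(cols)

      beta, *_ = np.linalg.lstsq(design(X), y, rcond=None)

      # Grid search of the fitted surface for the predicted optimum removal efficiency
      grid = np.array(np.meshgrid(np.linspace(3, 8, 26), np.linspace(20, 80, 31),
                                  np.linspace(0.5, 2.5, 21))).reshape(3, -1).T
      pred = design(grid) @ beta
      print(grid[np.argmax(pred)], pred.max())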

  3. Development of a new damage function model for power plants: Methodology and applications

    Levy, J.I.; Hammitt, J.K.; Yanagisawa, Y.; Spengler, J.D.

    1999-01-01

    Recent models have estimated the environmental impacts of power plants, but differences in assumptions and analytical methodologies have led to diverging findings. In this paper, the authors present a new damage function model that synthesizes previous efforts and refines components that have been associated with variations in impact estimates. Their model focuses on end-use emissions and quantifies the direct human health impacts of criteria air pollutants. To compare their model to previous efforts and to evaluate potential policy applications, the authors assess the impacts of an oil and natural gas-fueled cogeneration power plant in Boston, MA. Impacts under baseline assumptions are estimated to be $0.007/kWh of electricity, $0.23/klb of steam, and $0.004/ton-h of chilled water (representing 2-9% of the market value of outputs). Impacts are largely related to ozone (48%) and particulate matter (42%). Addition of upstream emissions and nonpublic health impacts increases externalities by as much as 50%. Sensitivity analyses demonstrate the importance of plant siting, meteorological conditions, epidemiological assumptions, and the monetary value placed on premature mortality, as well as the potential influence of global warming. Comparative analyses demonstrate that their model provides reasonable impact estimates and would therefore be applicable in a broad range of policy settings

  4. Methodology development for dosimetry of 90Sr + 90Y beta therapy applicators

    Coelho, T.S.; Yoriyaz, H.; Fernandes, M.A.R.

    2009-01-01

    The ⁹⁰Sr+⁹⁰Y applicators, used in beta therapy for the prevention of keloids and pterygium, are imported, and their dosimetric features are only illustrated by the manufacturers. The demanding clinical routine of medical physicists does not allow procedures to confirm these parameters. This work presents the development of a methodology for the dosimetry of two ⁹⁰Sr+⁹⁰Y beta therapy applicators of the Amersham brand. The Monte Carlo code MCNP4C was used for the simulation of the percentage depth dose curves. The experimental measurements of the radiation attenuation were made with a mini-extrapolation chamber. The results of the experimental measurements were compared with the simulated values. Both the theoretical and the experimental percentage depth dose curves presented similar behavior, which may validate the use of MCNP4C for these simulations, strengthening the usage of this method in dosimetry procedures for these beta radiation sources. (author)

  5. Exciton-plasmon coupling interactions: from principle to applications

    Cao, En; Lin, Weihua; Sun, Mengtao; Liang, Wenjie; Song, Yuzhi

    2018-01-01

    The interaction of exciton-plasmon coupling and the conversion of exciton-plasmon-photon have been widely investigated experimentally and theoretically. In this review, we introduce the exciton-plasmon interaction from basic principles to applications. There are two kinds of exciton-plasmon coupling, which demonstrate different optical properties. Strong exciton-plasmon coupling results in two new mixed states of light and matter separated energetically by a Rabi splitting that exhibits a characteristic anticrossing behavior of the exciton-LSP energy tuning. In weak coupling, by contrast (as in surface-enhanced Raman scattering, surface plasmon (SP)-enhanced absorption, enhanced fluorescence, or fluorescence quenching), there is no perturbation between the wave functions. SP resonance (SPR) arises from the collective oscillation induced by the electromagnetic field of light and can be used for investigating the interaction between light and matter beyond the diffraction limit. The study of the interaction between SPR and excitons has drawn wide attention since its discovery, not only due to its contribution to deepening and broadening the understanding of SPR but also due to its applications in light-emitting diodes, solar cells, low-threshold lasers, biomedical detection, quantum information processing, and so on.
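
    The anticrossing behavior described above is commonly summarized by the two-coupled-oscillator model (a standard textbook relation, not a formula quoted from this review). For exciton and plasmon energies with detuning δ = E_ex - E_pl and coupling strength g, the upper and lower polariton branches are

      \[
        E_{\pm} = \frac{E_{\mathrm{ex}} + E_{\mathrm{pl}}}{2} \pm \sqrt{g^2 + \frac{\delta^2}{4}},
      \]

    so the branches never cross and their minimum separation, the Rabi splitting, equals 2g at zero detuning.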

  6. Methodological Interactionism : Theory and Application to the Firm and to the Building of Trust

    Nooteboom, B.

    2007-01-01

    Recent insights from the ‘embodied cognition’ perspective in cognitive science, supported by neural research, provide a basis for a ‘methodological interactionism’ that transcends both the methodological individualism of economics and the methodological collectivism of (some) sociology, and is

  7. Response surface methodology investigation into the interactions between arsenic and humic acid in water during the coagulation process.

    Watson, Malcolm Alexander; Tubić, Aleksandra; Agbaba, Jasmina; Nikić, Jasmina; Maletić, Snežana; Molnar Jazić, Jelena; Dalmacija, Božo

    2016-07-15

    Interactions between arsenic and natural organic matter (NOM) are key limiting factors during the optimisation of drinking water treatment when significant amounts of both must be removed. This work uses Response Surface Methodology (RSM) to investigate how they interact during their simultaneous removal by iron chloride coagulation, using humic acid (HA) as a model NOM substance. Using a three-factor Box-Behnken experimental design, As and HA removals were modelled, as well as a combined removal response. ANOVA results showed the significance of the coagulant dose for all three responses. At high initial arsenic concentrations (200 μg/l), As removal was significantly hindered by the presence of HA. In contrast, the HA removal response was found to be largely independent of the initial As concentration, with the optimum coagulant dose increasing at increasing HA concentrations. The combined response was similar to the HA removal response, and the interactions evident are most interesting in terms of optimising treatment processes during the preparation of drinking water, highlighting the importance of utilizing RSM for such investigations. The combined response model was successfully validated with two different groundwaters used for drinking water supply in the Republic of Serbia, showing excellent agreement under similar experimental conditions. Copyright © 2016 Elsevier B.V. All rights reserved.

  8. Application of fault tree methodology to modeling of the AP1000 plant digital reactor protection system

    Teolis, D.S.; Zarewczynski, S.A.; Detar, H.L.

    2012-01-01

    The reactor trip system (RTS) and engineered safety features actuation system (ESFAS) in nuclear power plants utilize instrumentation and control (I&C) to provide automatic protection against unsafe and improper reactor operation during steady-state and transient power operations. During normal operating conditions, various plant parameters are continuously monitored to assure that the plant is operating in a safe state. In response to deviations of these parameters from pre-determined set points, the protection system will initiate the actions required to maintain the reactor in a safe state. These actions may include shutting down the reactor by opening the reactor trip breakers and actuating safety equipment based on the situation. The RTS and ESFAS are represented in probabilistic risk assessments (PRAs) to reflect the impact of their contribution to core damage frequency (CDF). The reactor protection systems (RPS) in existing nuclear power plants are generally analog based, and there is general consensus within the PRA community on fault tree modeling of these systems. In new plants, such as the AP1000 plant, the RPS is based on digital technology. Digital systems are more complex combinations of hardware components and software. This combination of complex hardware and software can result in the presence of faults and failure modes unique to a digital RPS. The United States Nuclear Regulatory Commission (NRC) is currently performing research on the development of probabilistic models for digital systems for inclusion in PRAs; however, no consensus methodology exists at this time. Westinghouse is currently updating the AP1000 plant PRA to support initial operation of plants currently under construction in the United States. The digital RPS is modeled using fault tree methodology similar to that used for analog based systems. This paper presents high level descriptions of a typical analog based RPS and of the AP1000 plant digital RPS. Application of current fault

  9. A tsunami PSA methodology and application for NPP site in Korea

    Kim, Min Kyu; Choi, In-Kil

    2012-01-01

    Highlights: ► A methodology of tsunami PSA was developed in this study. ► The tsunami return period was evaluated by an empirical method using historical tsunami records and tidal gauge records. ► A procedure for tsunami fragility analysis was established and target equipment and structures for the tsunami fragility assessment were selected. ► A sample fragility calculation was performed for equipment in a nuclear power plant. ► The accident sequence of a tsunami event is developed according to the tsunami run-up and drawdown, and the tsunami-induced core damage frequency (CDF) is determined. - Abstract: A methodology of tsunami PSA was developed in this study. A tsunami PSA consists of tsunami hazard analysis, tsunami fragility analysis and system analysis. In tsunami hazard analysis, evaluation of the tsunami return period is the major task. For the evaluation of the tsunami return period, numerical analysis and empirical methods can be applied. In this study, the tsunami return period was evaluated by an empirical method using historical tsunami records and tidal gauge records. To perform a tsunami fragility analysis, a procedure was established and the target equipment and structures for the fragility assessment were selected. A sample fragility calculation was performed for equipment in a nuclear power plant. In the system analysis, the accident sequence of a tsunami event is developed according to the tsunami run-up and drawdown, and the tsunami-induced core damage frequency (CDF) is determined. For application to a real nuclear power plant, the Ulchin 5&6 NPP, located on the east coast of the Korean peninsula, was selected. Through this study, the whole tsunami PSA working procedure was established and an example calculation was performed for a real nuclear power plant in Korea. However, for more accurate tsunami PSA results, further research is needed on the evaluation of hydrodynamic forces, the effect of

  10. Application of Genetic Algorithm methodologies in fuel bundle burnup optimization of Pressurized Heavy Water Reactor

    Jayalal, M.L.; Ramachandran, Suja; Rathakrishnan, S.; Satya Murty, S.A.V.; Sai Baba, M.

    2015-01-01

    Highlights: • We study and compare Genetic Algorithms (GA) in the fuel bundle burnup optimization of an Indian Pressurized Heavy Water Reactor (PHWR) of 220 MWe. • Two Genetic Algorithm methodologies, namely Penalty Functions based GA and Multi Objective GA, are considered. • For the selected problem, Multi Objective GA performs better than Penalty Functions based GA. • In the present study, Multi Objective GA outperforms Penalty Functions based GA in convergence speed and diversity of solutions. - Abstract: This study presents the application and comparison of GA techniques in a nuclear reactor environment. The nuclear fuel management optimization problem selected for the study aims at arriving at appropriate reference discharge burnup values for the two burnup zones of a 220 MWe Pressurized Heavy Water Reactor (PHWR) core. Two Genetic Algorithm methodologies, namely Penalty Functions based GA and Multi Objective GA, are applied in this study. The study reveals that, for the selected problem of PHWR fuel bundle burnup optimization, Multi Objective GA is more suitable than Penalty Functions based GA in the two aspects considered: it produces more diverse feasible solutions, and it converges faster, i.e. it is capable of generating more feasible solutions from earlier generations. It is observed that for the selected problem, Multi Objective GA is 25.0% faster than Penalty Functions based GA with respect to CPU time for generating 80% of the population with feasible solutions. When the average computational time over a fixed number of generations is considered, Penalty Functions based GA is 44.5% faster than Multi Objective GA. In the overall performance, the convergence speed of Multi Objective GA surpasses the computational time advantage of Penalty Functions based GA. The ability of Multi Objective GA to produce more diverse feasible solutions is a desired feature of the problem selected, that helps the
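
    The penalty-function formulation compared in the study can be illustrated with a toy sketch: an infeasible candidate keeps its objective value minus a weighted constraint violation, whereas a multi-objective GA would instead treat the violation as a second objective and rank candidates by Pareto dominance. The two-variable problem, limits and loop below are invented for illustration only:

      import random

      LIMIT = 15.0   # hypothetical coupled constraint on the two zone burnups

      def objective(b):               # maximize total burnup (toy proxy)
          return b[0] + b[1]

      def violation(b):               # > 0 when the constraint is broken
          return max(0.0, b[0] + 0.8 * b[1] - LIMIT)

      def fitness_penalty(b, w=10.0): # penalty-function fitness
          return objective(b) - w * violation(b)

      pop = [[random.uniform(5, 12), random.uniform(5, 12)] for _ in range(40)]
      for _ in range(100):            # crude evolution loop: select, recombine, mutate
          pop.sort(key=fitness_penalty, reverse=True)
          parents = pop[:20]
          children = [[(a[i] + b[i]) / 2 + random.gauss(0, 0.1) for i in range(2)]
                      for a, b in zip(parents, parents[1:] + parents[:1])]
          pop = parents + children
      best = max(pop, key=fitness_penalty)
      print(best, violation(best))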

  11. Assessment of critical minerals: Updated application of an early-warning screening methodology

    McCullough, Erin A.; Nassar, Nedal

    2017-01-01

    Increasing reliance on non-renewable mineral resources reinforces the need for identifying potential supply constraints before they occur. The US National Science and Technology Council recently released a report that outlines a methodology for screening potentially critical minerals based on three indicators: supply risk (R), production growth (G), and market dynamics (M). This early-warning screening was initially applied to 78 minerals across the years 1996 to 2013 and identified a subset of minerals as “potentially critical” based on the geometric average of these indicators—designated as criticality potential (C). In this study, the screening methodology has been updated to include data for 2014, as well as to incorporate revisions and modifications to the data, where applicable. Overall, C declined in 2014 for the majority of minerals examined largely due to decreases in production concentration and price volatility. However, the results vary considerably across minerals, with some minerals, such as gallium, recording increases for all three indicators. In addition to assessing magnitudinal changes, this analysis also examines the significance of the change relative to historical variation for each mineral. For example, although mined nickel’s R declined modestly in 2014 in comparison to that of other minerals, it was by far the largest annual change recorded for mined nickel across all years examined and is attributable to Indonesia’s ban on the export of unprocessed minerals. Based on the 2014 results, 20 minerals with the highest C values have been identified for further study including the rare earths, gallium, germanium, rhodium, tantalum, and tungsten.
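
    The criticality-potential aggregation named above is simply a geometric mean of the three indicators; a minimal sketch, assuming R, G and M have already been scaled to a common positive range (the values are hypothetical):

      def criticality_potential(R, G, M):
          # Geometric average of supply risk, production growth and market dynamics
          return (R * G * M) ** (1.0 / 3.0)

      print(round(criticality_potential(0.62, 0.48, 0.55), 3))  # one mineral-year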

  12. Kinetic Monte Carlo simulations for transient thermal fields: Computational methodology and application to the submicrosecond laser processes in implanted silicon.

    Fisicaro, G; Pelaz, L; Lopez, P; La Magna, A

    2012-09-01

    Pulsed laser irradiation of damaged solids promotes ultrafast nonequilibrium kinetics, on the submicrosecond scale, leading to microscopic modifications of the material state. Reliable theoretical predictions of this evolution can be achieved only by simulating particle interactions in the presence of large and transient gradients of the thermal field. We propose a kinetic Monte Carlo (KMC) method for the simulation of damaged systems in the extremely far-from-equilibrium conditions caused by laser irradiation. The reference systems are nonideal crystals containing point defect excesses, an order of magnitude larger than the equilibrium density, due to a pre-irradiation ion implantation process. The thermal problem, including possible melting, is solved within the phase-field methodology, and the numerical solutions for the space- and time-dependent thermal field are then dynamically coupled to the KMC code. The formalism, implementation, and related tests of our computational code are discussed in detail. As an application example we analyze the evolution of the defect system caused by P ion implantation in Si under nanosecond pulsed irradiation. The simulation results suggest a significant annihilation of the implantation damage, which can be well controlled by the laser fluence.
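
    The coupling idea (event rates continually re-evaluated against a transient temperature field) can be sketched with a fixed-step KMC loop. The Gaussian pulse, prefactors and activation energies are illustrative stand-ins for the phase-field thermal solution and calibrated defect parameters; the step must keep rate*DT well below 1:

      import math, random

      K_B = 8.617e-5    # Boltzmann constant, eV/K
      DT = 1e-10        # 0.1 ns step

      def arrhenius(nu0, Ea, T):
          # Rate of a thermally activated event at temperature T
          return nu0 * math.exp(-Ea / (K_B * T))

      def temperature(t):
          # Stand-in for the phase-field solution: a ~50 ns Gaussian pulse
          return 300.0 + 1300.0 * math.exp(-((t - 50e-9) / 20e-9) ** 2)

      # (name, prefactor 1/s, activation energy eV): illustrative defect events
      events = [("hop", 1e13, 1.1), ("annihilate", 1e13, 1.4)]

      counts, t = {"hop": 0, "annihilate": 0}, 0.0
      while t < 200e-9:
          T = temperature(t)
          for name, nu0, Ea in events:
              if random.random() < arrhenius(nu0, Ea, T) * DT:
                  counts[name] += 1
          t += DT
      print(counts)   # events cluster around the hot part of the pulse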

  13. Bioinspired methodology for preparing magnetic responsive chitosan beads to be integrated in a tubular bioreactor for biomedical applications.

    Song, Wenlong; Oliveira, Mariana B; Sher, Praveen; Gil, Sara; Nóbrega, J Miguel; Mano, João F

    2013-08-01

    Magnetic responsive chitosan beads were prepared using a methodology inspired by the rolling of water droplets over lotus leaves. Liquid precursors containing chitosan and magnetic microparticles were dispensed in the form of spherical droplets and crosslinked with genipin over synthetic superhydrophobic surfaces. Scanning electron microscopy, histology and micro-computed tomography were employed to characterize the structure of the prepared composite beads and the inner distribution of the magnetic particles. Cellular metabolic activity tests showed that fibroblast-like cells (L929 cell line) can adhere and proliferate on the prepared chitosan beads. We hypothesize that such spherical biomaterials could be integrated in a new concept of tubular bioreactor. The magnetic beads can be immobilized by an external magnetic field at specific positions and may be transported along the bioreactor by the drag of the culture medium flow. The system behavior was also studied through numerical modeling, which allowed the identification of the relative importance of the main parameters, and led to the conclusion that the distance between carrier beads plays a major role in their interaction with the culture medium and, consequently, in the overall system performance. In an up-scaled version of this bioreactor, the herein presented system may comprise different chambers in serial or parallel configurations. This constitutes a simple way of preparing magnetic responsive beads combined with a new design of bioreactor, which may find application in biomedicine and biotechnology, including in cell expansion for tissue engineering or for the production of therapeutic proteins to be used in cell therapies.

  14. Bioinspired methodology for preparing magnetic responsive chitosan beads to be integrated in a tubular bioreactor for biomedical applications

    Song, Wenlong; Oliveira, Mariana B; Sher, Praveen; Gil, Sara; Mano, João F; Nóbrega, J Miguel

    2013-01-01

    Magnetic responsive chitosan beads were prepared using a methodology inspired by the rolling of water droplets over lotus leaves. Liquid precursors containing chitosan and magnetic microparticles were dispensed in the form of spherical droplets and crosslinked with genipin over synthetic superhydrophobic surfaces. Scanning electron microscopy, histology and micro-computed tomography were employed to characterize the structure of the prepared composite beads and the inner distribution of the magnetic particles. Cellular metabolic activity tests showed that fibroblast-like cells (L929 cell line) can adhere and proliferate on the prepared chitosan beads. We hypothesize that such spherical biomaterials could be integrated in a new concept of tubular bioreactor. The magnetic beads can be immobilized by an external magnetic field at specific positions and may be transported along the bioreactor by the drag of the culture medium flow. The system behavior was also studied through numerical modeling, which allowed the identification of the relative importance of the main parameters, and led to the conclusion that the distance between carrier beads plays a major role in their interaction with the culture medium and, consequently, in the overall system performance. In an up-scaled version of this bioreactor, the herein presented system may comprise different chambers in serial or parallel configurations. This constitutes a simple way of preparing magnetic responsive beads combined with a new design of bioreactor, which may find application in biomedicine and biotechnology, including in cell expansion for tissue engineering or for the production of therapeutic proteins to be used in cell therapies.

  15. A systematic methodology to extend the applicability of a bioconversion model for the simulation of various co-digestion scenarios

    Kovalovszki, Adam; Alvarado-Morales, Merlin; Fotidis, Ioannis

    2017-01-01

    Detailed simulation of anaerobic digestion (AD) requires complex mathematical models and the optimization of numerous model parameters. By performing a systematic methodology and identifying parameters with the highest impact on process variables in a well-established AD model, its applicability was extended to various co-digestion scenarios. More specifically, the application of the step-by-step methodology led to the estimation of a general and reduced set of parameters for the simulation of scenarios where either manure or wastewater were co-digested with different organic substrates. Validation … experimental data quite well, indicating that it offers a reliable reference point for future simulations of anaerobic co-digestion scenarios…

  16. The applicability of the Centeno, Chaudhary and Lopez repair time standard methodology in a rail maintenance environment

    Rommelspacher, Karl Otto

    2015-11-01

    Full Text Available The establishment of labour standards within a production environment has become common practice, and is receiving growing recognition in the maintenance environment. However, the application of labour standards in a transit maintenance organisation has received limited attention. Centeno, Chaudhary and Lopez have developed a repair time standard methodology that has been applied in the transit bus maintenance facilities of three agencies in central Florida in the USA. An investigation into the applicability of this methodology in a rail maintenance environment in South Africa forms the basis for this study.

  17. ASAM - The international programme on application of safety assessment methodologies for near surface radioactive waste disposal facilities

    Batandjieva, B.

    2002-01-01

    The IAEA has launched a new Co-ordinated Research Project (CRP) on Application of Safety Assessment Methodologies for Near Surface Waste Disposal Facilities (ASAM). The CRP will focus on the practical application of the safety assessment methodology, developed under the ISAM programme, for different purposes, such as developing design concepts, licensing, upgrading existing repositories, reassessment of operating disposal facilities. The overall aim of the programme is to assist safety assessors, regulators and other specialists involved in the development and review of safety assessment for near surface disposal facilities in order to achieve transparent, traceable and defendable evaluation of safety of these facilities. (author)

  18. Light & Skin Interactions Simulations for Computer Graphics Applications

    Baranoski, Gladimir V G

    2010-01-01

    Light and Skin Interactions immerses you in one of the most fascinating application areas of computer graphics: appearance simulation. The book first illuminates the fundamental biophysical processes that affect skin appearance, and reviews seminal related works aimed at applications in the life and health sciences. It then examines four exemplary modeling approaches as well as definitive algorithms that can be used to generate realistic images depicting skin appearance. A companion site also includes complete code and data sources for the BioSpec model, which is considered to be the

  19. Application of Response Surface Methodology in Development of Sirolimus Liposomes Prepared by Thin Film Hydration Technique

    Saeed Ghanbarzadeh

    2013-04-01

    Full Text Available Introduction: The present investigation aimed to optimize the formulation process of sirolimus liposomes prepared by the thin film hydration method. Methods: In this study, a 3² factorial design was used to investigate the influence of two independent variables in the preparation of sirolimus liposomes. The dipalmitoylphosphatidylcholine (DPPC)/cholesterol (Chol) and dioleoyl phosphoethanolamine (DOPE)/DPPC molar ratios were selected as the independent variables. Particle size (PS) and encapsulation efficiency (EE%) were selected as the dependent variables. To separate the un-encapsulated drug, the dialysis method was used. Drug analysis was performed with a validated RP-HPLC method. Results: Using response surface methodology and based on the coefficient values obtained for the independent variables in the regression equations, it was clear that the DPPC/Chol molar ratio was the major contributing variable in particle size and EE%. The use of a statistical approach allowed us to see individual and/or interaction effects of influencing parameters in order to obtain liposomes with desired properties and to determine the optimum experimental conditions that lead to the enhancement of characteristics. In the prediction of PS and EE% values, the average percent errors were found to be 3.59% and 4.09%, respectively. These values are sufficiently low to confirm the high predictive power of the model. Conclusion: Experimental results show that the observed responses were in close agreement with the predicted values, demonstrating the reliability of the optimization procedure in predicting PS and EE% in sirolimus liposome preparation.

  20. Application of Response Surface Methodology in Extraction of Bioactive Component from Palm Leaves (Elaeis guineensis)

    Nur Afiqah Arham

    2013-10-01

    Full Text Available The hydroxyl groups of polyphenols are capable of acting as reducing agents in reduction reactions. The effects of drying temperature, extraction temperature and extraction duration were evaluated using a central composite design consisting of 20 experimental runs. Response surface methodology (RSM) was used to estimate the optimum parameters for extracting polyphenols from palm leaves. The correspondence analysis of the results yielded a quadratic model which can be used to find the optimum conditions of the extraction process. The optimum drying temperature, extraction temperature and extraction duration are 70°C, 70°C and 10 minutes, respectively. Total polyphenols were determined by application of the Folin-Ciocalteu micro method, and the extract was found to contain 8 mg GAE/g dry palm leaves at the optimum conditions. Doi: 10.12777/ijse.5.2.95-100

  1. Using ECPs for Interactive Applications in Virtual Cinematography

    Wu, Hui-Yin; Li, Tsai-Yen; Christie, Marc

    2017-01-01

    This paper introduces an interactive application of our previous work on the Patterns language as a creative assistant for editing cameras in 3D virtual environments. Patterns is a vocabulary inspired by professional film practice and textbook terminology. The vocabulary allows one to define recurrent stylistic constraints on a sequence of shots, which we term "embedded constraint patterns" (ECPs). In our previous work, we proposed a solver that allows ...

  2. Animal health surveillance applications: The interaction of science and management.

    Willeberg, Preben

    2012-08-01

    Animal health surveillance is an ever-evolving activity, since health- and risk-related policy and management decisions need to be backed by the best available scientific evidence and methodology. International organizations, trade partners, politicians, media and the public expect fast, understandable, up-to-date presentation and valid interpretation of animal disease data to support and document proper animal health management - in crises as well as in routine control applications. The delivery and application of surveillance information need to be further developed and optimized, and epidemiologists, risk managers, administrators and policy makers need to work together in order to secure progress. Promising new developments in areas such as risk-based surveillance, spatial presentation and analysis, and genomic epidemiology will be mentioned. Limitations and areas in need of further progress will be underlined, such as the general lack of a wide and open exchange of international animal disease surveillance data. During my career of more than 30 years as a professor of Veterinary Epidemiology, I had the good fortune of working in challenging environments with eminent colleagues in different countries on a variety of animal health surveillance issues. My career change from professor to Chief Veterinary Officer (CVO) - "from science to application" - was driven by my desire to see for myself whether and how well epidemiology would actually work to solve real-life problems, as I had been telling my students for years that it would. Fortunately, it worked for me! The job of a CVO is not that different from that of a professor of Veterinary Epidemiology; the underlying professional principles are the same. Every day I had to work from science, and base decisions and discussions on documented evidence - although sometimes the evidence was incomplete or data were simply lacking. A basic understanding of surveillance methodology is very useful for a CVO, since it provides

  3. Applicability of contact angle techniques used in the analysis of contact lenses, part 1: comparative methodologies.

    Campbell, Darren; Carnell, Sarah Maria; Eden, Russell John

    2013-05-01

    Contact angle, as a representative measure of surface wettability, is often employed to interpret contact lens surface properties. The literature is often contradictory and can lead to confusion. This literature review is part of a series regarding the analysis of hydrogel contact lenses using contact angle techniques. Here we present an overview of contact angle terminology, methodology, and analysis. Having discussed this background material, subsequent parts of the series will discuss the analysis of contact lens contact angles and evaluate differences in published laboratory results. The concepts of contact angle, wettability and wetting are presented as an introduction. Contact angle hysteresis is outlined, highlighting the advantages of dynamic analytical techniques over static methods. The surface free energy of a material illustrates how contact angle analysis is capable of providing supplementary surface characterization. Although single values are able to distinguish individual material differences, surface free energy and dynamic methods provide an improved understanding of material behavior. The frequently used sessile drop, captive bubble, and Wilhelmy plate techniques are discussed. Their use as both dynamic and static methods, along with the advantages and disadvantages of each technique, is explained. No single contact angle technique fully characterizes the wettability of a material surface, and the application of complementary methods allows increased characterization. At present, there is no ISO standard method designed for soft materials. It is important that each contact angle technique has a standard protocol, as small protocol differences between laboratories often contribute to a variety of published data that are not easily comparable.
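
    For reference, the static contact angle discussed here is governed by Young's equation, and hysteresis is the gap between the advancing and receding angles (standard relations, not quoted from the review):

      \[
        \gamma_{\mathrm{sv}} = \gamma_{\mathrm{sl}} + \gamma_{\mathrm{lv}}\cos\theta,
        \qquad H = \theta_{\mathrm{adv}} - \theta_{\mathrm{rec}},
      \]

    where the γ terms are the solid-vapor, solid-liquid and liquid-vapor interfacial tensions; surface free energy estimates follow by combining measured angles for several probe liquids.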

  4. Applications of a damage tolerance analysis methodology in aircraft design and production

    Woodward, M. R.; Owens, S. D.; Law, G. E.; Mignery, L. A.

    1992-01-01

    Objectives of customer-mandated aircraft structural integrity initiatives in design are to guide material selection, to incorporate fracture-resistant concepts in the design, and to utilize damage-tolerance-based allowables and planned inspection procedures necessary to enhance the safety and reliability of manned flight vehicles. However, validated fracture analysis tools for composite structures are needed to accomplish these objectives in a timely and economical manner. This paper briefly describes the development, validation, and application of a damage tolerance methodology for composite airframe structures. A closed-form analysis code, entitled SUBLAM, was developed to predict the critical biaxial strain state necessary to cause sublaminate buckling-induced delamination extension in an impact-damaged composite laminate. An embedded elliptical delamination separating a thin sublaminate from a thick parent laminate is modelled. Predicted failure strains were correlated against a variety of experimental data that included results from compression-after-impact coupon and element tests. An integrated analysis package was developed to predict damage tolerance based margin-of-safety (MS) using NASTRAN generated loads and element information. Damage tolerance aspects of new concepts are quickly and cost-effectively determined without the need for excessive testing.

  5. Partial least squares path modeling basic concepts, methodological issues and applications

    Noonan, Richard

    2017-01-01

    This edited book presents the recent developments in partial least squares-path modeling (PLS-PM) and provides a comprehensive overview of the current state of the most advanced research related to PLS-PM. The first section of this book emphasizes the basic concepts and extensions of the PLS-PM method. The second section discusses the methodological issues that are the focus of the recent development of the PLS-PM method. The third part discusses the real world application of the PLS-PM method in various disciplines. The contributions from expert authors in the field of PLS focus on topics such as the factor-based PLS-PM, the perfect match between a model and a mode, quantile composite-based path modeling (QC-PM), ordinal consistent partial least squares (OrdPLSc), non-symmetrical composite-based path modeling (NSCPM), modern view for mediation analysis in PLS-PM, a multi-method approach for identifying and treating unobserved heterogeneity, multigroup analysis (PLS-MGA), the assessment of the common method b...

  6. APPLICATION OF THE CP METHODOLOGY IN REDUCTION OF WASTE IN THE PROCESSING OF TOBACCO COMPANIES

    André Luiz Emmel Silva

    2015-01-01

    Full Text Available The production, marketing and processing of tobacco form the economic base of the municipalities of Vale do Rio Pardo/RS. Although it is the raw material for various products, production in this region is intended almost exclusively for cigarettes. Dominated by a few large multinationals, this market moves imposing financial values, with tobacco accounting for much of the cost of production. This paper therefore seeks to demonstrate the efficiency of applying the Cleaner Production (CP) methodology to reduce tobacco waste in tobacco processing and cigarette manufacturing companies. The analysis was conducted as a case study, with visits to learn the production process, identification of the points of waste, measurements, and the development of a set of measures to minimize these losses. The Cleaner Production method was chosen because it is a relatively new concept and has shown good results in the companies where it has been applied. Through the measurements, the main points of loss were identified, an analysis was performed applying the concepts of CP, and a set of measures was proposed to reduce losses. As a result, a reduction of 83% in the rate of tobacco waste in the production process was achieved. It was concluded that CP, within the tobacco processing industry, was efficient, impacting directly on production costs, rationalizing the use of raw materials and reducing the total volume of waste generated.

  7. Application of Direct Assessment Approaches and Methodologies to Cathodically Protected Nuclear Waste Transfer Lines

    Dahl, Megan M.; Pikas, Joseph; Edgemon, Glenn L.; Philo, Sarah

    2013-01-01

    The U.S. Department of Energy's (DOE) Hanford Site is responsible for the safe storage, retrieval, treatment, and disposal of approximately 54 million gallons (204 million liters) of radioactive waste generated since the site's inception in 1943. Today, the major structures involved in waste management at Hanford include 149 carbon-steel single-shell tanks, 28 carbon-steel double-shell tanks, plus a network of buried metallic transfer lines and ancillary systems (pits, vaults, catch tanks, etc.) required to store, retrieve, and transfer waste within the tank farm system. Many of the waste management systems at Hanford are still in use today. In response to uncertainties regarding the structural integrity of these systems, an independent, comprehensive integrity assessment of the Hanford Site piping system was performed. It was found that regulators do not require the cathodically protected pipelines located within the Hanford Site to be assessed by External Corrosion Direct Assessment (ECDA) or any other method used to ensure integrity. However, a case study is presented discussing the application of the direct assessment process on pipelines in such a nuclear environment. Assessment methodology and assessment results are contained herein. An approach is described for the monitoring, integration of outside data, and analysis of this information in order to identify whether coating deterioration accompanied by external corrosion is a threat for these waste transfer lines.

  8. The Combined ASTER MODIS Emissivity over Land (CAMEL) Part 1: Methodology and High Spectral Resolution Application

    E. Eva Borbas

    2018-04-01

    As part of a National Aeronautics and Space Administration (NASA) MEaSUREs (Making Earth System Data Records for Use in Research Environments) Land Surface Temperature and Emissivity project, the Space Science and Engineering Center (UW-Madison) and the NASA Jet Propulsion Laboratory (JPL) developed a global monthly mean emissivity Earth System Data Record (ESDR). This new Combined ASTER (Advanced Spaceborne Thermal Emission and Reflection Radiometer) and MODIS (Moderate Resolution Imaging Spectroradiometer) Emissivity over Land (CAMEL) ESDR was produced by merging two current state-of-the-art emissivity datasets: the UW-Madison MODIS Infrared emissivity dataset (UW BF) and the JPL ASTER Global Emissivity Dataset Version 4 (GEDv4). The dataset includes monthly global records of emissivity and related uncertainties at 13 hinge points between 3.6-14.3 µm, as well as principal component analysis (PCA) coefficients at 5-km resolution for the years 2000 through 2016. A high spectral resolution (HSR) algorithm is provided for HSR applications. This paper describes the 13 hinge-point combination methodology and the high spectral resolution algorithm, and reports the current status of the dataset.
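
    As an illustration of the hinge-point-plus-PCA design, the following generic sketch shows how a high spectral resolution emissivity spectrum can be rebuilt from a mean spectrum, a principal-component basis and per-pixel coefficients. Every array name, shape and value is an invented placeholder; this is not the CAMEL code or its data.

    ```python
    import numpy as np

    # Generic PCA reconstruction sketch (illustrative assumptions throughout):
    # an HSR emissivity spectrum = mean spectrum + weighted sum of principal
    # components, with the weights stored per pixel in the data record.
    rng = np.random.default_rng(0)
    n_channels, n_pcs = 417, 9                  # hypothetical HSR grid / basis size
    mean_spectrum = np.full(n_channels, 0.97)   # placeholder mean emissivity
    pcs = rng.normal(0.0, 0.01, (n_pcs, n_channels))  # stand-in PC basis
    coeffs = rng.normal(0.0, 1.0, n_pcs)        # stand-in per-pixel coefficients

    hsr = mean_spectrum + coeffs @ pcs          # reconstructed spectrum
    hsr = np.clip(hsr, 0.0, 1.0)                # keep emissivity physically plausible
    print(hsr[:5])
    ```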

  9. Pursuit of new methodology on risk communication - Research assistance program by open application

    Konoa, N.; Takeshima, K.

    2004-01-01

    In the latter half of the 1990s a series of incidents occurred in Japan, such as the MOX fuel inspection data falsification, the Monju fast breeder reactor sodium leakage accident, and the Tokai nuclear fuel plant (JCO) criticality accident. Existing measures based on nuclear technology were thought unable to cope well with these incidents, and both administration agencies and the nuclear industry keenly felt the need for countermeasures drawing on new methodologies from the cultural and social sciences. Above all, techniques such as risk communication, which convey the impact of an incident correctly and convincingly to residents and the mass media and prevent harm due to rumor, are clearly indispensable. Against this background, the Japanese NISA (Nuclear and Industrial Safety Agency) initiated in FY2002 a new open-application research project in the field of cultural and social sciences, with risk communication as one of its principal subjects of study. Six risk communication studies are currently in progress. The project was transferred from NISA to JNES (Incorporated Administrative Agency Japan Nuclear Energy Safety Organization) in FY2004. This paper shows the overall structure of the project and outlines the running studies. (author)

  10. Application of the Coastal Hazard Wheel methodology for coastal multi-hazard assessment and management in the state of Djibouti

    Appelquist, Lars Rosendahl; Balstrøm, Thomas

    2014-01-01

    This paper presents the application of a new methodology for coastal multi-hazard assessment and management in a changing global climate on the state of Djibouti. The methodology, termed the Coastal Hazard Wheel (CHW), is developed for worldwide application and is based on a specially designed coastal classification system that incorporates the main static and dynamic parameters determining the characteristics of a coastal environment. The methodology provides information on the hazards of ecosystem disruption, gradual inundation, salt water intrusion, erosion and flooding, and can be used to support management decisions at local, regional and national level in areas with limited access to geophysical data. The assessment for Djibouti applies a geographic information system (GIS) to develop a range of national hazard maps along with relevant hazard statistics, and showcases the procedure...

  11. Optimization of a High Temperature PEMFC micro-CHP System by Formulation and Application of a Process Integration Methodology

    Arsalis, Alexandros; Nielsen, Mads Pagh; Kær, Søren Knudsen

    2013-01-01

    A 1 kWe micro combined heat and power (CHP) system based on high temperature proton exchange membrane fuel cell (PEMFC) technology is modeled and optimized by formulation and application of a process integration methodology. The system can provide heat and electricity for a single-family household...

  12. Application of the methodology of safety probabilistic analysis to the modelling of the emergency feedwater system of the Juragua nuclear power plant

    Troncoso, M.; Oliva, G.

    1993-01-01

    This work illustrates the application of the methodology developed in the framework of the national plan of safety probabilistic analysis (APS) to the emergency feedwater system, for failures involving small LOCAs and loss of the external electrical supply at the nuclear power plant. The facilities provided by the ARCON code for modelling the systems, and the code's documentation, are also described.

  13. Comprehensive Psychopathological Assessment Based on the Association for Methodology and Documentation in Psychiatry (AMDP) System: Development, Methodological Foundation, Application in Clinical Routine, and Research

    Stieglitz, Rolf-Dieter; Haug, Achim; Fähndrich, Erdmann; Rösler, Michael; Trabert, Wolfgang

    2017-01-01

    The documentation of psychopathology is core to the clinical practice of the psychiatrist and clinical psychologist. However, both in initial as well as further training and specialization in their fields, this particular aspect of their work receives only scant attention. Yet, for the past 50 years, the Association for Methodology and Documentation in Psychiatry (AMDP) System has been in existence and available as a tool serving precisely this purpose: offering a systematic introduction to the terminology and documentation of psychopathology. Its development was motivated by the need for an assessment procedure for the reliable documentation of the effectiveness of newly developed psychopharmacological substances. Subsequently, the AMDP-System began to be applied in investigations of a number of methodological issues in psychiatry (e.g., the frequency and specificity of particular symptoms, the comparison of rating scales). The System then became increasingly important in clinical practice as well and, today, represents the most widely used instrument for the documentation of psychopathology in the German-speaking countries of Europe. This paper offers an overview of the AMDP-System, its origins, design, and functionality. After an initial account of the history and development of the AMDP-System, the discussion focuses in turn on the System's underlying methodological principles, the transfer of clinical skills and competencies in its practical application, and its use in research and clinical practice. Finally, potential future areas of development in relation to the AMDP-System are explored. PMID:28439242

  14. Application of response surface methodology for optimizing transesterification of Moringa oleifera oil: Biodiesel production

    Rashid, Umer; Anwar, Farooq; Ashraf, Muhammad; Saleem, Muhammad; Yusup, Suzana

    2011-01-01

    Highlights: → Biodiesel production from Moringa oil (MO) has been optimized for the first time using RSM. → RSM-optimized reaction conditions gave a high Moringa oil methyl esters (MOMEs) yield (94.3%). → Fuel properties of the MOMEs satisfied the ASTM D6751 and EN 14214 specifications. → The present RSM model can be useful for predicting optimum biodiesel yield from other oils. - Abstract: Response surface methodology (RSM), with central composite rotatable design (CCRD), was used to explore optimum conditions for the transesterification of Moringa oleifera oil. The effects of four variables were appraised: reaction temperature (25-65 °C), reaction time (20-90 min), methanol/oil molar ratio (3:1-12:1) and catalyst concentration (0.25-1.25 wt.% KOH). The quadratic terms of methanol/oil molar ratio, catalyst concentration and reaction time, as well as the interaction terms of methanol/oil molar ratio with reaction temperature and with catalyst concentration, and of reaction time with catalyst concentration, exhibited significant effects on the yield of Moringa oil methyl esters (MOMEs)/biodiesel (p < 0.0001 and p < 0.05, respectively). Transesterification under the optimum conditions ascertained by RSM (6.4:1 methanol/oil molar ratio, 0.80% catalyst concentration, 55 °C reaction temperature and 71.08 min reaction time) offered a 94.30% MOMEs yield. The observed and predicted values of MOMEs yield showed a linear relationship. GLC analysis of the MOMEs revealed oleic acid methyl ester, with a contribution of 73.22%, as the principal component. Other methyl esters detected were of palmitic, stearic, behenic and arachidic acids. The thermal stability of the MOMEs produced was evaluated by thermogravimetric analysis. The fuel properties of the MOMEs, such as density, kinematic viscosity, lubricity, oxidative stability, higher heating value, cetane number and cloud point, were found to be within the ASTM D6751 and EN 14214 biodiesel standards.
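
    As a generic illustration of the RSM workflow described above (synthetic data, not the paper's), the sketch below fits a second-order model to two coded factors by least squares and solves for the stationary point of the fitted surface, which is how optimum settings such as the reported 6.4:1 molar ratio and 0.80% catalyst loading are located.

    ```python
    import numpy as np

    # Toy response surface: y = 94 - 3*(x1-0.3)^2 - 2*(x2+0.1)^2 + noise,
    # so the true optimum sits at coded levels (0.3, -0.1).
    rng = np.random.default_rng(1)
    x1, x2 = rng.uniform(-1, 1, 40), rng.uniform(-1, 1, 40)
    y = 94 - 3*(x1 - 0.3)**2 - 2*(x2 + 0.1)**2 + rng.normal(0, 0.2, 40)

    # Full second-order model: b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
    X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1*x2])
    b, *_ = np.linalg.lstsq(X, y, rcond=None)

    # Stationary point: solve [[2*b11, b12], [b12, 2*b22]] @ xs = -[b1, b2]
    A = np.array([[2*b[3], b[5]], [b[5], 2*b[4]]])
    xs = np.linalg.solve(A, -b[1:3])
    print("stationary point (coded units):", xs.round(3))   # ~ [0.3, -0.1]
    ```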

  15. Electrocoagulation and nanofiltration integrated process application in purification of bilge water using response surface methodology.

    Akarsu, Ceyhun; Ozay, Yasin; Dizge, Nadir; Elif Gulsen, H; Ates, Hasan; Gozmen, Belgin; Turabik, Meral

    Marine pollution has been considered an increasing problem because of the day-by-day increase in sea transportation. Therefore, a large volume of bilge water, which contains petroleum, oil and hydrocarbons in high concentrations, is generated by all types of ships. In this study, treatment of bilge water by an electrocoagulation/electroflotation and nanofiltration integrated process is investigated as a function of voltage, time, and initial pH, with aluminum electrodes as both anode and cathode. Moreover, a commercial NF270 flat-sheet membrane was used for further purification. A Box-Behnken design combined with response surface methodology was used to study the response pattern and determine the optimum conditions for maximum chemical oxygen demand (COD) removal and minimum metal ion content of the bilge water. Three independent variables, namely voltage (5-15 V), initial pH (4.5-8.0) and time (30-90 min), were transformed to coded values. The COD removal percentage, UV absorbance at 254 nm, pH value (after treatment), and concentrations of metal ions (Ti, As, Cu, Cr, Zn, Sr, Mo) were obtained as responses. Analysis of variance results showed that all the models were significant except for Zn (P > 0.05), because the calculated F values for these models were less than the critical F value for the considered probability (P = 0.05). The obtained R² and adjusted R² values signified the correlation between the experimental data and predicted responses: except for the model of Zn concentration after treatment, the high R² values showed the goodness of fit of the models. While an increase in the applied voltage showed negative effects, increases in time and pH showed positive effects on COD removal efficiency; the most effective linear term was found to be time. A positive sign of the interaction coefficients of the voltage-time and pH-time terms indicated a synergistic effect on COD removal efficiency, whereas the interaction between voltage and pH showed an antagonistic effect.

  16. Combining user logging with eye tracking for interactive and dynamic applications.

    Ooms, Kristien; Coltekin, Arzu; De Maeyer, Philippe; Dupont, Lien; Fabrikant, Sara; Incoul, Annelies; Kuhn, Matthias; Slabbinck, Hendrik; Vansteenkiste, Pieter; Van der Haegen, Lise

    2015-12-01

    User evaluations of interactive and dynamic applications face various challenges related to the active nature of these displays. For example, users can often zoom and pan on digital products, and these interactions cause changes in the extent and/or level of detail of the stimulus. Therefore, in eye tracking studies, when a user's gaze is at a particular screen position (gaze position) over a period of time, the information contained in this particular position may have changed. Such digital activities are commonplace in modern life, yet it has been difficult to automatically compare the changing information at the viewed position, especially across many participants. Existing solutions typically involve tedious and time-consuming manual work. In this article, we propose a methodology that can overcome this problem. By combining eye tracking with user logging (mouse and keyboard actions) for cartographic products, we are able to accurately reference screen coordinates to geographic coordinates. This referencing approach allows researchers to know which geographic object (location or attribute) corresponds to the gaze coordinates at all times. We tested the proposed approach through two case studies, and discuss the advantages and disadvantages of the applied methodology. Furthermore, the applicability of the proposed approach is discussed with respect to other fields of research that use eye tracking, namely marketing, sports and movement sciences, and experimental psychology. From these case studies and discussions, we conclude that combining eye tracking and user-logging data is an essential step forward in efficiently studying user behavior with interactive and static stimuli in multiple research fields.
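
    A minimal sketch of the referencing idea: logged pan/zoom events give the map viewport at each instant, so a gaze sample in screen pixels can be converted to geographic coordinates. The simple linear, unprojected viewport model and all names and numbers below are assumptions for illustration, not the authors' implementation.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Viewport:
        """Map extent on screen, reconstructed from logged pan/zoom events."""
        lon_min: float
        lon_max: float
        lat_min: float
        lat_max: float
        width_px: int
        height_px: int

    def gaze_to_geo(x_px: float, y_px: float, vp: Viewport) -> tuple[float, float]:
        """Map a gaze sample (pixels, origin top-left) to (lon, lat)."""
        lon = vp.lon_min + (x_px / vp.width_px) * (vp.lon_max - vp.lon_min)
        lat = vp.lat_max - (y_px / vp.height_px) * (vp.lat_max - vp.lat_min)
        return lon, lat

    vp = Viewport(3.0, 4.0, 50.5, 51.5, 1920, 1080)  # viewport after a zoom event
    print(gaze_to_geo(960, 540, vp))                 # -> (3.5, 51.0)
    ```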

  17. On investment, uncertainty, and strategic interaction with applications in energy markets

    Murto, P.

    2003-01-01

    The thesis presents dynamic models of investment under uncertainty with a focus on strategic interaction and energy market applications. The uncertainty is modelled using stochastic processes as state variables. The specific questions analyzed include the effect of technology- and revenue-related uncertainties on the optimal timing of investment, irreversibility in the choice between alternative investment projects with different degrees of uncertainty, and the effect of strategic interaction on the initiation of discrete investment projects, on the abandonment of a project, and on incremental capacity investments. The main methodological feature is the incorporation of game-theoretic concepts in the theory of investment. It is argued that such an approach is often desirable in terms of real applications, because many industries are characterized by both uncertainty and strategic interaction between firms. Besides extending the theory of investment, this line of work may be seen as an extension of the theory of industrial organization towards a direction that views market stability as one of the factors explaining the rational behaviour of firms. (orig.)
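
    For context on the kind of result such models build on, the sketch below computes the classic McDonald-Siegel investment threshold for a project whose value follows a geometric Brownian motion. This is the standard real-options benchmark rather than the thesis's own model, and it illustrates one of its themes: higher uncertainty raises the value at which investing becomes optimal.

    ```python
    import math

    def investment_threshold(r: float, delta: float, sigma: float, I: float) -> float:
        """Value V* at which immediate investment of cost I becomes optimal
        when dV = (r - delta) V dt + sigma V dW (risk-neutral dynamics)."""
        s2 = sigma ** 2
        a = 0.5 - (r - delta) / s2
        beta = a + math.sqrt(a ** 2 + 2 * r / s2)   # positive root of the ODE
        return beta / (beta - 1.0) * I              # invest when V first hits V*

    # Waiting has option value: the threshold grows with volatility.
    for sigma in (0.1, 0.2, 0.4):
        v_star = investment_threshold(r=0.05, delta=0.04, sigma=sigma, I=1.0)
        print(f"sigma = {sigma:.1f}: V* = {v_star:.2f}")
    ```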

  18. Analysis of Software Development Methodologies to Build Safety Software Applications for the SATEX-II: A Mexican Experimental Satellite

    Aguilar Cisneros, Jorge; Vargas Martinez, Hector; Pedroza Melendez, Alejandro; Alonso Arevalo, Miguel

    2013-09-01

    Mexico is a country where experience in building software for satellite applications is just beginning. This is a delicate situation because in the near future we will need to develop software for the SATEX-II (Mexican Experimental Satellite). SATEX-II is a project of SOMECyTA (the Mexican Society of Aerospace Science and Technology). We have experience applying software development methodologies, such as TSP (Team Software Process) and SCRUM, in other areas. We analyzed these methodologies and concluded that they can be applied to develop software for the SATEX-II, supported by the ESA PSS-05-0 standard, in particular ESA PSS-05-11. Our analysis focused on the main characteristics of each methodology and how these methodologies could be used with the ESA PSS-05-0 standards. Our outcomes may, in general, be used by teams that need to build small satellites; in particular, they will be used when we build the on-board software applications for the SATEX-II.

  19. Application of validity theory and methodology to patient-reported outcome measures (PROMs): building an argument for validity.

    Hawkins, Melanie; Elsworth, Gerald R; Osborne, Richard H

    2018-07-01

    Data from subjective patient-reported outcome measures (PROMs) are now being used in the health sector to make or support decisions about individuals, groups and populations. Contemporary validity theorists define validity not as a statistical property of the test but as the extent to which empirical evidence supports the interpretation of test scores for an intended use. However, validity testing theory and methodology are rarely evident in the PROM validation literature. Application of this theory and methodology would provide structure for comprehensive validation planning to support improved PROM development and sound arguments for the validity of PROM score interpretation and use in each new context. This paper proposes the application of contemporary validity theory and methodology to PROM validity testing. The validity testing principles will be applied to a hypothetical case study with a focus on the interpretation and use of scores from a translated PROM that measures health literacy (the Health Literacy Questionnaire or HLQ). Although robust psychometric properties of a PROM are a pre-condition to its use, a PROM's validity lies in the sound argument that a network of empirical evidence supports the intended interpretation and use of PROM scores for decision making in a particular context. The health sector is yet to apply contemporary theory and methodology to PROM development and validation. The theoretical and methodological processes in this paper are offered as an advancement of the theory and practice of PROM validity testing in the health sector.

  1. Geographical targeting of poverty alleviation programs : methodology and applications in rural India

    Bigman, D.; Srinivasan, P.V.

    2002-01-01

    The paper presents a methodology for mapping poverty within national borders at the level of relatively small geographical areas and illustrates this methodology for India. Poverty alleviation programs in India are presently targeted only at the level of the state. All states include, however, many

  2. Development and application of a deterministic-realistic hybrid methodology for LOCA licensing analysis

    Liang, Thomas K.S.; Chou, Ling-Yao; Zhang, Zhongwei; Hsueh, Hsiang-Yu; Lee, Min

    2011-01-01

    Highlights: → A new LOCA licensing methodology (DRHM, deterministic-realistic hybrid methodology) was developed. → DRHM involves conservative Appendix K physical models and statistical treatment of plant status uncertainties. → DRHM can generate 50-100 K of PCT margin compared to a traditional Appendix K methodology. - Abstract: It is well recognized that a realistic LOCA analysis with uncertainty quantification can generate greater safety margin than classical conservative LOCA analysis using Appendix K evaluation models. The associated margin can be more than 200 K. To quantify uncertainty in BELOCA analysis, two kinds of uncertainties generally need to be identified and quantified: model uncertainties and plant status uncertainties. In particular, it takes a huge effort to systematically quantify the individual model uncertainties of a best-estimate LOCA code such as RELAP5 or TRAC. Instead of applying a full-range BELOCA methodology to cover both model and plant status uncertainties, a deterministic-realistic hybrid methodology (DRHM) was developed to support LOCA licensing analysis. In the DRHM, Appendix K deterministic evaluation models are adopted to ensure model conservatism, while the CSAU methodology is applied to quantify the effect of plant status uncertainty on the PCT calculation. Overall, the DRHM can generate about 80-100 K of margin on PCT compared to an Appendix K bounding-state LOCA analysis.
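
    The statistical treatment of plant status uncertainties in such licensing analyses commonly rests on nonparametric (Wilks-type) tolerance limits; whether DRHM uses exactly this device is an assumption made here for illustration. The sketch shows the standard arithmetic behind the widely quoted 59-run 95/95 criterion.

    ```python
    import math

    def wilks_sample_size(gamma: float = 0.95, beta: float = 0.95) -> int:
        """Smallest n such that the maximum of n random PCT runs bounds the
        beta-quantile with confidence gamma (first order): 1 - beta**n >= gamma."""
        return math.ceil(math.log(1.0 - gamma) / math.log(beta))

    # -> 59: run 59 sampled cases and take the maximum PCT as the 95/95 bound.
    print(wilks_sample_size())
    ```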

  3. Scaling up methodology for CO2 emissions in ICT applications in traffic and transport in Europe

    Mans, D.; Jonkers, E.; Giannelos, I.; Palanciuc, D.

    2013-01-01

    The Amitran project aims to define a reference methodology for evaluating the effects of ICT measures in traffic and transport on energy efficiency and consequently CO2 emissions. This methodology can be used as a reference by future projects and will address different modes for both passenger and

  4. THE COMPETITIVENESS OF THE SOUTH AFRICAN AND AUSTRALIAN FLOWER INDUSTRIES: An application of three methodologies.

    van Rooyen, I.M.; Kirsten, Johann F.; van Rooyen, C.J.; Collins, Ray

    2001-01-01

    Competitiveness is defined to include both comparative and competitive advantage. Three different methodologies are applied in the analysis of the flower industries of South Africa and Australia: "Determinants of competitive advantage" methodology of Michael Porter (1990) describes the factors influencing competitive advantage; "Revealed comparative advantage" states the relative importance of flower trade in each country; and the "Policy Analyses Matrix" calculates the comparative advantage ...

  5. Determining Faculty and Student Views: Applications of Q Methodology in Higher Education

    Ramlo, Susan

    2012-01-01

    William Stephenson specifically developed Q methodology, or Q, as a means of measuring subjectivity. Q has been used to determine perspectives/views in a wide variety of fields from marketing research to political science but less frequently in education. In higher education, the author has used Q methodology to determine views about a variety of…

  6. Development of the methodology for application of revised source term to operating nuclear power plants in Korea

    Kang, M.S.; Kang, P.; Kang, C.S.; Moon, J.H.

    2004-01-01

    Considering the current trend of applying the revised source term proposed by NUREG-1465 to nuclear power plants in the U.S., it is expected that the revised source term will be applied to Korean operating nuclear power plants in the near future, even though the exact time cannot be estimated. To meet future technical demands, it is necessary to prepare the technical system, including the related regulatory requirements, in advance. This research therefore develops methodologies for applying the revised source term to operating nuclear power plants in Korea. Several principles were established for developing the application methodologies. First, it is not necessary to modify the existing regulations on source terms (i.e., no back-fitting of operating nuclear plants is required). Second, if a pertinent margin of safety is guaranteed, the revised source term suggested by NUREG-1465 may be applied in full. Finally, parts of the revised source term could be selected for application based on technical feasibility. As the results of this research, several methodologies for applying the revised source term to Korean operating nuclear power plants have been developed, including: 1) selective (or limited) application, using only some of the characteristics of the revised source term, such as the release timing of fission products and the chemical form of radio-iodine; and 2) full application, using all the characteristics of the revised source term. The developed methodologies were applied to Ulchin units 3 and 4 and their application feasibility was reviewed. The results of this research can be used as a manual for establishing plans and procedures for applying the revised source term to domestic nuclear plants from the utility's viewpoint, or as a technical basis for revising the related regulations from the regulatory body's viewpoint. The application of revised source term to operating nuclear

  7. Applications in soil-structure interactions. Final report, June 1979

    Jhaveri, D.P.

    1979-01-01

    The complex phenomenon of soil-structure interaction was assessed. Relationships between the characteristics of the earthquake ground motions, the local soil and geologic conditions, and the response of structures to the ground motions were studied. (I) The use of the explicit finite-difference method to study linear elastic soil-structure interaction is described. A linear two-dimensional study of different conditions that influence the dynamic compliance and scattering properties of foundations is presented. (II) The FLUSH computer code was used to compute the soil-structure interaction during SIMQUAKE 1B, an experimental underground blast excitation of a 1/12-scale model of a nuclear containment structure. Evaluation was performed using transient excitation applied to a finite-difference grid. Dynamic foundation properties were studied. Results indicate that the orientation and location of the source relative to the site, and the wave environment at the site, may be important parameters to be considered. Differences between the computed and experimentally recorded responses are indicated, and reasons for the discrepancy are suggested. (III) A case study is presented that examined structural and ground response data tabulated and catalogued from tests at the Nevada Test Site for applicability to the soil-structure interaction questions of interest. A description, methods, and evaluation of data on soil-structure interaction from forced vibration tests are presented. A two-dimensional finite-difference grid representing a relatively rigid structure resting on uniform ground was analyzed and monitored. Fourier spectra of the monitored time histories were also evaluated and are presented. Results show clear evidence of soil-structure interaction and significant agreement with theory. 128 figures, 18 tables

  8. An Investigation to Resolve the Interaction Between Fuel Cell, Power Conditioning System and Application Loads

    Sudip K. Mazumder

    2005-12-31

    Development of high-performance and durable solid-oxide fuel cells (SOFCs) and SOFC power-generating systems requires knowledge of the feedback effects from the power-conditioning electronics and from application-electrical-power circuits that may pass through or excite the power-electronics subsystem (PES). Therefore, it is important to develop analytical models and methodologies which can be used to investigate and mitigate the effects of the electrical feedbacks from the PES and the application loads (ALs) on the reliability and performance of SOFC systems for stationary and non-stationary applications. However, any such attempt to resolve the electrical impacts of the PES on the SOFC would be incomplete unless one utilizes a comprehensive analysis, which takes into account the interactions of the SOFC, PES, balance-of-plant system (BOPS), and ALs as a whole. SOFCs respond quickly to changes in load and exhibit high part- and full-load efficiencies due to their rapid electrochemistry; this is not true of the thermal and mechanical time constants of the BOPS, where load-following time constants are typically several orders of magnitude higher. This dichotomy can affect the lifetime and durability of SOFCs and limit the applicability of SOFC systems for load-varying stationary and transportation applications. Furthermore, without validated analytical models and investigative design and optimization methodologies, realizations of cost-effective, reliable, and optimal PESs (and power-management controls), in particular, and SOFC systems, in general, are difficult. On the whole, the research effort can lead to (a) cost-constrained optimal PES design for high-performance SOFCs and high energy efficiency and power density, (b) effective SOFC power-system design, analyses, and optimization, and (c) controllers and modulation schemes for mitigation of electrical impacts, wider stability margins, and enhanced system efficiency.

  9. Methodology and application of combined watershed and ground-water models in Kansas

    Sophocleous, M.; Perkins, S.P.

    2000-01-01

    Increased irrigation in Kansas and other regions during the last several decades has caused serious water depletion, making the development of comprehensive strategies and tools to resolve such problems increasingly important. This paper makes the case for an intermediate-complexity, quasi-distributed, comprehensive, large-watershed model, which falls between the fully distributed, physically based hydrological modeling systems of the type of the SHE model and the lumped, conceptual rainfall-runoff modeling systems of the type of the Stanford watershed model. This is achieved by integrating the quasi-distributed watershed model SWAT with the fully distributed ground-water model MODFLOW. The advantages of this approach are the appreciably smaller input data requirements and the use of readily available data (compared to fully distributed, physically based models), the statistical handling of watershed heterogeneities by employing the hydrologic-response-unit concept, and the significantly increased flexibility in handling stream-aquifer interactions, distributed well withdrawals, and multiple land uses. The mechanics of integrating the component watershed and ground-water models are outlined, and three real-world management applications of the integrated model from Kansas are briefly presented. Three different aspects of the integrated model are emphasized: (1) management applications of a Decision Support System for the integrated model (Rattlesnake Creek subbasin); (2) alternative conceptual models of spatial heterogeneity related to the presence or absence of an underlying aquifer with shallow or deep water table (Lower Republican River basin); and (3) the general nature of the integrated model linkage, by employing a watershed simulator other than SWAT (Wet Walnut Creek basin). These applications demonstrate the practicality and versatility of this relatively simple and conceptually clear approach, making public acceptance of the integrated watershed modeling
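
    One of the quantities an integrated watershed/ground-water model must pass between its components is the stream-aquifer exchange flux. Below is a generic conductance-based formulation of that exchange, in the spirit of (but not copied from) the river-package concept used in ground-water codes; the function name and numbers are hypothetical.

    ```python
    def stream_aquifer_flux(head_aquifer: float, stage_stream: float,
                            conductance: float) -> float:
        """Positive flux: aquifer discharges to the stream (baseflow);
        negative flux: the stream leaks to the aquifer."""
        return conductance * (head_aquifer - stage_stream)

    # e.g., head 101.2 m, stage 100.5 m, reach conductance 250 m^2/day
    print(stream_aquifer_flux(101.2, 100.5, 250.0), "m^3/day to the stream")
    ```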

  10. Energetics and Defect Interactions of Complex Oxides for Energy Applications

    Solomon, Jonathan Michael

    The goal of this dissertation is to employ computational methods to gain greater insights into the energetics and defect interactions of complex oxides that are relevant for today's energy challenges. To achieve this goal, the development of novel computational methodologies is required to handle complex systems, including systems containing nearly 650 ions and systems with tens of thousands of possible atomic configurations. The systems investigated in this dissertation are aliovalently doped lanthanum orthophosphate (LaPO4), due to its potential application as a proton-conducting electrolyte for intermediate temperature fuel cells, and aliovalently doped uranium dioxide (UO2), due to its importance in nuclear fuel performance and disposal. First we undertake density-functional-theory (DFT) calculations on the relative energetics of pyrophosphate defects and protons in LaPO4, including their binding with divalent dopant cations. In particular, for supercell calculations with 1.85 mol% Sr doping, we calculate the dopant-binding energy for pyrophosphate defects to be 0.37 eV, which is comparable to the value of 0.34 eV calculated for proton-dopant binding energies in the same system. These results establish that dopant-defect interactions further stabilize proton incorporation, with the hydration enthalpies when the dopants are nearest and furthest from the protons and pyrophosphate defects being -1.66 eV and -1.37 eV, respectively. Even though our calculations show that dopant binding enhances the enthalpic favorability of proton incorporation, they also suggest that such binding is likely to substantially lower the kinetic rate of hydrolysis of pyrophosphate defects. We then shift our focus to solid solutions of fluorite-structured UO2 with trivalent rare earth fission product cations (M3+ = Y, La), using a combination of ionic pair potential and DFT based methods. Calculated enthalpies of formation with respect to constituent oxides show higher

  11. Conceptual approaches and methodological proposals for the study of interactions between environment and health: application to a research program on American trypanosomiasis

    Cristina Romaña

    2003-08-01

    The authors describe the program's conceptual and methodological basis and highlight the role of eco-epidemiology for studying the structure and function of natural and anthropogenic foci of infection. Modeling spatial and temporal dynamics can help predict and monitor such tropical diseases.

  12. [Needs assessment to improve the applicability and methodological quality of a German S3 guideline].

    Burckhardt, Marion; Hoffmann, Cristina; Nink-Grebe, Brigitte; Sänger, Sylvia

    2018-04-01

    Clinical practice guidelines can change the practice in healthcare only if their recommendations are implemented in a comprehensive way. The German S3 guideline "Local Therapy of Chronic Wounds in Patients with Peripheral Vascular Disease, Chronic Venous Insufficiency, and Diabetes" will be updated in 2017. The emphasis here is on the guideline's validity, user-friendliness and implementation into practice. Therefore, the aim was to identify the improvements required in regard to the guideline's methods and content presentation. The methodological approach used was the critical appraisal of the guideline according to established quality criteria and an additional stakeholder survey. Both were conducted between August and November 2016. The guideline and its related documents were reviewed independently by two researchers according to the criteria of the "Appraisal of Guidelines for Research and Evaluation" (AGREE-II). Published reviews and peer reviews by external experts and organisations were also taken into account. For the stakeholder survey, a questionnaire with open questions was distributed by e-mail and via the Internet to health professionals and organisations involved in the care of patients with leg ulcers in Germany. The questions were aimed at amendments and new topics based on the stakeholders' experience in inpatient and outpatient care. In addition, the survey focused on gathering suggestions to improve the applicability of the guideline. Suggested new topics and amendments were summarised thematically. The stakeholders' suggestions to improve the applicability, the results of the critical appraisal and the relevant aspects of the external reviews were then summarised according to the AGREE-II domains and presented in a cause and effect diagram. 17 questionnaires (out of 864 sent out by e-mail) were returned. Due to high practice relevance, the stakeholders suggested an expansion of the inclusion criteria to patients with infected wounds and

  13. Perceived Properties of Parameterised Music for Interactive Applications

    Jan Berg

    2006-04-01

    Traditional implementations of sound and music in interactive contexts have their limitations. One way to overcome these and to expand the possibilities of music is to handle the music in a parameterised form. To better understand the properties of the musical parameters resulting from parameterisation, two experiments were carried out. The first experiment investigated selected parameters' capability to change the music; the second examined how the parameters can contribute to expressing emotions. From these experiments, it is concluded that users without musical training perform differently from musicians on some of the parameters. There is also a clear association between the parameters and the expressed basic emotions. The paper concludes with observations on how parameterisation might be used in interactive applications.

  14. Examples for application and diagnostics in plasma-powder interaction

    Kersten, H; Wiese, R; Thieme, G; Froehlich, M; Kopitov, A; Bojic, D; Scholze, F; Neumann, H; Quaas, M; Wulff, H; Hippler, R

    2003-01-01

    Low-pressure plasmas offer a unique possibility of confinement, control and fine tailoring of particle properties. Hence, dusty plasmas have grown into a vast field and new applications of plasma-processed dust particles are emerging. There is demand for particles with special properties and for particle-seeded composite materials. For example, the stability of luminophore particles could be improved by coating with protective Al2O3 films, which are deposited by a PECVD process using a metal-organic precursor gas. Alternatively, the interaction between plasma and injected micro-disperse powder particles can also be used as a diagnostic tool for the study of plasma surface processes. Two examples are provided: the interaction of micro-sized SiO2 grains confined in a radio-frequency plasma with an external ion beam, and the effect of a dc-magnetron discharge on confined particles during deposition.

  15. The Application of Best Estimate and Uncertainty Analysis Methodology to Large LOCA Power Pulse in a CANDU 6 Reactor

    Abdul-Razzak, A.; Zhang, J.; Sills, H.E.; Flatt, L.; Jenkins, D.; Wallace, D.J.; Popov, N.

    2002-01-01

    The paper briefly describes a best estimate plus uncertainty analysis (BE+UA) methodology and presents its prototyping application to the power pulse phase of a limiting large loss-of-coolant accident (LOCA) for a CANDU 6 reactor fuelled with CANFLEX® fuel. The methodology is consistent with and builds on world practice. The analysis is divided into two phases to focus on the dominant parameters for each phase and to allow for the consideration of all identified highly ranked parameters in the statistical analysis and response surface fits for margin parameters. The objective of this analysis is to quantify improvements in predicted safety margins under best estimate conditions. (authors)

  16. Application of a new methodology for coastal multi-hazard-assessment and management on the state of Karnataka, India

    Appelquist, Lars Rosendahl; Balstrom, Thomas

    2015-01-01

    This paper presents the application of a new methodology for coastal multi-hazard assessment and management under a changing global climate on the state of Karnataka, India. The recently published methodology, termed the Coastal Hazard Wheel (CHW), is designed for local, regional and national hazard assessment... at a scale relevant for regional planning purposes. It uses a GIS approach to develop regional and sub-regional hazard maps as well as to produce relevant hazard risk data, and includes a discussion of uncertainties, limitations and management perspectives. The hazard assessment shows that 61 percent...

  17. The methodology of root cause analysis for equipment failure and its application at Guangdong nuclear power stations

    Gao Ligang; Lu Qunxian

    2004-01-01

    The methodology of Equipment Failure Root Cause Analysis (RCA) is described; as a systematic analysis methodology, it comprises nine steps. Its process is explained through real examples, and six precautions in applying RCA are pointed out. The paper also summarizes the experience of applying RCA at the Daya Bay Nuclear Power Station and emphasizes seven key factors for RCA success, mainly concerning organization, objectives, analysts, analysis techniques, the external technical support system, and the development and monitoring of corrective actions. (authors)

  18. Methodological application of Location of service Public Bike. Service MUyBICI of Murcia

    LiÑan Ruiz, R.J.; Berenguer Sempere, F.J.; Vera Lopez, J.A.; Pabon Dueñas, A.B.; Merino Cordoba, S.

    2016-07-01

    The use of non-motorized means of transport such as the bicycle brings many benefits to the user and to the city: reduced costs and better health for the former, and decreased environmental pollution for the latter. Finding the optimal locations for the public bike stations aims to attract both regular and potential users and to make switching modes feasible without restrictions, while helping to shift demand towards sustainable modes of transport, with special attention to cycling and public bike loans. The methodology is applied in the municipality of Murcia (Spain) on the occasion of the opening of its public bicycle system MUyBICI, which will have 60 stations with a total of 1,200 anchor posts and will put 600 public bicycles into circulation. The selection criteria considered for the optimal location of the stations were the existing network of bike paths, the roads used by all users of the public highway, a description of trips, and a database with different land uses and socioeconomic data by transport zone. This paper presents an analysis model and its application for the optimal design of station locations for the Murcia MUyBICI service. Specifically, it defines the best locations to attract a larger number of users, in order to change the modal split of the municipality by increasing the number of MUyBICI users. This work was carried out under the direction of the Bicycle Office of Murcia, part of the ALEM (Local Agency for Energy and Environment) service under the Department of Environment of the City of Murcia. (Author)
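
    The siting problem described above can be cast as a facility-location (p-median) problem. The sketch below, with invented demand points, weights and candidate sites, shows a greedy heuristic that places stations to minimize demand-weighted travel distance; the actual study's criteria and method are richer than this.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    demand = rng.uniform(0, 10, (200, 2))     # trip-origin points (invented)
    weights = rng.uniform(1, 5, 200)          # demand weight per point
    candidates = rng.uniform(0, 10, (40, 2))  # feasible station sites
    p = 6                                     # number of stations to place

    dist = np.linalg.norm(demand[:, None, :] - candidates[None, :, :], axis=2)
    chosen = []                               # indices of selected sites
    best = np.full(len(demand), np.inf)       # distance to nearest chosen site
    for _ in range(p):
        # total weighted cost if candidate j were added next
        cost = (weights[:, None] * np.minimum(best[:, None], dist)).sum(axis=0)
        cost[chosen] = np.inf                 # never re-pick a chosen site
        j = int(np.argmin(cost))
        chosen.append(j)
        best = np.minimum(best, dist[:, j])
    print("chosen candidate indices:", chosen)
    ```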

  19. Application of a methodology for the development and validation of reliable process control software

    Ramamoorthy, C.V.; Mok, Y.R.; Bastani, F.B.; Chin, G.

    1980-01-01

    The necessity of a good methodology for the development of reliable software, especially with respect to the final software validation and testing activities, is discussed. A formal specification development and validation methodology is proposed. This methodology has been applied to the development and validation of pilot software incorporating typical features of critical software for nuclear power plant safety protection. The main features of the approach include the use of a formal specification language and the independent development of two sets of specifications. 1 ref

  20. Application of six sigma DMAIC methodology to reduce service resolution time in a service organization

    Virender Narula

    2017-11-01

    The popularity of Six Sigma as a means of improving quality has grown exponentially in recent years. It is a proven methodology for achieving breakthrough improvement in process performance that generates significant savings for the bottom line of an organization. This paper illustrates how the Six Sigma methodology may be used to improve service processes. The purpose of this paper is to develop Six Sigma DMAIC methodologies that help service organizations look into their processes. In addition, it demonstrates the vital linkages between process improvement and process variation. The study identifies critical process parameters and suggests a team structure for Six Sigma projects in service operations.
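
    As a concrete illustration of the arithmetic behind Six Sigma process measurement (with invented numbers, not data from the study), the sketch below converts a defect count, here service requests missing their resolution-time target, into DPMO and a sigma level using the conventional 1.5-sigma long-term shift.

    ```python
    from statistics import NormalDist

    def sigma_level(defects: int, units: int, opportunities_per_unit: int = 1) -> float:
        """Defects per million opportunities (DPMO) mapped to a sigma level,
        including the conventional 1.5-sigma long-term shift."""
        dpmo = defects / (units * opportunities_per_unit) * 1_000_000
        return NormalDist().inv_cdf(1 - dpmo / 1_000_000) + 1.5

    # e.g., 120 of 4,000 service requests missed the resolution-time target
    print(round(sigma_level(defects=120, units=4000), 2))   # -> 3.38
    ```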

  1. Application of a Bayesian model for the quantification of the European methodology for qualification of non-destructive testing

    Gandossi, Luca; Simola, Kaisa; Shepherd, Barrie

    2010-01-01

    The European methodology for qualification of non-destructive testing is a well-established approach adopted by nuclear utilities in many European countries. According to this methodology, qualification is based on a combination of technical justification and practical trials. The methodology is qualitative in nature, and it does not give explicit guidance on how the evidence from the technical justification and results from trials should be weighted. A Bayesian model for the quantification process was presented in a previous paper, proposing a way to combine the 'soft' evidence contained in a technical justification with the 'hard' evidence obtained from practical trials. This paper describes the results of a pilot study in which such a Bayesian model was applied to two realistic Qualification Dossiers by experienced NDT qualification specialists. At the end of the study, recommendations were made and a set of guidelines was developed for the application of the Bayesian model.
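
    The weighting of 'soft' and 'hard' evidence can be illustrated with a generic Beta-binomial sketch: the technical justification is encoded as a prior on the probability of detection (POD), and the practical-trial outcomes update it. This shows the idea only; the paper's actual Bayesian model and parameter choices may differ.

    ```python
    import math
    import numpy as np

    a, b = 8.0, 2.0    # prior from the technical justification: POD around 0.8
    k, n = 18, 20      # practical trials: 18 of 20 seeded flaws detected
    a_post, b_post = a + k, b + (n - k)        # conjugate Beta posterior

    print(f"posterior mean POD = {a_post / (a_post + b_post):.3f}")   # ~0.867

    # P(POD > 0.90 | evidence): numerically integrate the Beta posterior pdf.
    log_B = math.lgamma(a_post) + math.lgamma(b_post) - math.lgamma(a_post + b_post)
    x = np.linspace(0.90, 1.0 - 1e-9, 5001)
    pdf = np.exp((a_post - 1) * np.log(x) + (b_post - 1) * np.log(1 - x) - log_B)
    prob = float(np.sum((pdf[1:] + pdf[:-1]) / 2 * np.diff(x)))
    print(f"P(POD > 0.90) = {prob:.3f}")
    ```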

  2. Soviet-designed pressurized water reactor symptomatic emergency operating instruction analytical procedure: approach, methodology, development and application

    Beelman, R.J.

    1999-01-01

    A symptom approach to the analytical validation of symptom-based EOPs includes: (1) Identification of critical safety functions to the maintenance of fission product barrier integrity; (2) Identification of the symptoms which manifest an impending challenge to critical safety function maintenance; (3) Development of a symptomatic methodology to delineate bounding plant transient response modes; (4) Specification of bounding scenarios; (5) Development of a systematic calculational approach consistent with the objectives of the methodology; (6) Performance of thermal-hydraulic computer code calculations implementing the analytical methodology; (7) Interpretation of the analytical results on the basis of information available to the operator; (8) Application of the results to the validation of the proposed operator actions; (9) Production of a technical basis document justifying the proposed operator actions. (author)

  3. Interaction systems design and the protocol- and middleware-centred paradigms in distributed application development

    Andrade Almeida, João; van Sinderen, Marten J.; Quartel, Dick; Ferreira Pires, Luis

    2003-01-01

    This paper aims at demonstrating the benefits and importance of interaction systems design in the development of distributed applications. We position interaction systems design with respect to two paradigms that have influenced the design of distributed applications: the middleware-centred and the protocol-centred paradigm. We argue that interaction systems that support application-level interactions should be explicitly designed, using the externally observable behaviour of the interaction ...

  4. The audit of social administration (AGSC). Methodological proposal for its application in cooperative companies

    Leonardo Ojeda Mesa

    2014-06-01

    This article explains the bases of a proposed methodology that facilitates the execution of social administration audits (AGS) in cooperative companies, with the objective of evaluating the social management that they carry out.

  5. Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data

    Tekwe, C. D.; Carroll, R. J.; Dabney, A. R.

    2012-01-01

    positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon

  6. Efficient Substrate Noise Coupling Verification and Failure Analysis Methodology for Smart Power ICs in Automotive Applications

    Moursy , Yasser; Zou , Hao; Khalil , Raouf; Iskander , Ramy; Tisserand , Pierre; Ton , Dieu-My; Pasetti , Giuseppe; Louërat , Marie-Minerve

    2016-01-01

    This paper presents a methodology to analyze substrate noise coupling and reduce its effects in smart power integrated circuits. The methodology considers the propagation of minority carriers in the substrate; hence, it models the lateral bipolar junction transistors that are layout dependent and are not modeled in conventional substrate extraction tools. It allows the designer to simulate substrate currents and check their effects on circuit functionality. The...

  7. Application of a Resilience Framework to Military Installations: A Methodology for Energy Resilience Business Case Decisions

    Judson, N.; Pina, A.L.; Dydek, E.V.; Van Broekhoven, S.B.; Castillo, A.S.

    2016-09-01

    ...align to a disruption or an associated downtime impacting mission performance. Reliability metrics and models were also used throughout the study.

  8. Application of quality improvement analytic methodology in emergency medicine research: A comparative evaluation.

    Harries, Bruce; Filiatrault, Lyne; Abu-Laban, Riyad B

    2018-05-30

    Quality improvement (QI) analytic methodology is rarely encountered in the emergency medicine literature. We sought to comparatively apply QI design and analysis techniques to an existing data set, and discuss these techniques as an alternative to standard research methodology for evaluating a change in a process of care. We used data from a previously published randomized controlled trial on triage-nurse initiated radiography using the Ottawa ankle rules (OAR). QI analytic tools were applied to the data set from this study and evaluated comparatively against the original standard research methodology. The original study concluded that triage nurse-initiated radiographs led to a statistically significant decrease in mean emergency department length of stay. Using QI analytic methodology, we applied control charts and interpreted the results using established methods that preserved the time sequence of the data. This analysis found a compelling signal of a positive treatment effect that would have been identified after the enrolment of 58% of the original study sample, and in the 6th month of this 11-month study. Our comparative analysis demonstrates some of the potential benefits of QI analytic methodology. We found that had this approach been used in the original study, insights regarding the benefits of nurse-initiated radiography using the OAR would have been achieved earlier, and thus potentially at a lower cost. In situations where the overarching aim is to accelerate implementation of practice improvement to benefit future patients, we believe that increased consideration should be given to the use of QI analytic methodology.
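
    A minimal sketch of one QI tool the authors applied, an individuals (XmR) control chart, follows; the monthly length-of-stay values and limits are invented for illustration. Baseline months set the control limits, and later points falling outside them signal a process shift.

    ```python
    import numpy as np

    los = np.array([112, 108, 115, 110, 109, 96, 94, 92, 95, 90, 93, 91],
                   dtype=float)               # monthly mean ED length of stay (min)

    baseline = los[:5]                        # months before the practice change
    mr = np.abs(np.diff(baseline))            # moving ranges of the baseline
    center = baseline.mean()
    sigma_hat = mr.mean() / 1.128             # d2 constant for subgroups of 2
    ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

    for month, value in enumerate(los, start=1):
        flag = "signal" if not lcl <= value <= ucl else ""
        print(f"month {month:2d}: {value:6.1f}  {flag}")
    ```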

  9. Application of a new methodology to evaluate Dnb limits based on statistical propagation of uncertainties

    Machado, Marcio Dornellas

    1998-09-01

    One of the most important thermal-hydraulic safety parameters is the DNBR (departure from nucleate boiling ratio). The current methodology in use at Eletronuclear to determine DNBR is extremely conservative and may result in penalties to the reactor power due to an increased plugging level of steam generator tubes. This work uses a new methodology to evaluate DNBR, named mini-RTDP. The standard methodology (STDP) currently in use establishes a design limit value which cannot be surpassed. This limit value is determined taking into account the uncertainties of the empirical correlation used in the COBRA IIIC/MIT code, modified for Angra 1 conditions. The correlation used is Westinghouse's W-3, and the minimum DNBR (MDNBR) value cannot be less than 1.3. The new methodology reduces the excessive level of conservatism associated with the parameters used in the DNBR calculation, which take their most unfavorable values in the STDP methodology, by using their best-estimate values. The final goal is to obtain a new DNBR design limit which will provide a margin gain due to the more realistic parameter values used in the methodology. (author)

  10. Methodological issues concerning the application of reliable laser particle sizing in soils

    de Mascellis, R.; Impagliazzo, A.; Basile, A.; Minieri, L.; Orefice, N.; Terribile, F.

    2009-04-01

    During the past decade, the evolution of technologies has enabled laser diffraction (LD) to become a widespread means of measuring particle size distribution (PSD), replacing sedimentation and sieve analysis in many scientific fields, mainly due to its versatility, fast measurement and high reproducibility. Despite these developments, the soil science community has been quite reluctant to replace the good old sedimentation techniques (ST), possibly because of (i) the large complexity of the soil matrix, which induces different types of artefacts (aggregates, deflocculation dynamics, etc.), (ii) the difficulty of relating LD results to results obtained through sedimentation techniques, and (iii) the limited size range of most LD equipment. More recently, LD granulometry has been slowly gaining appreciation in soil science, also because of innovations including an enlarged dynamic size range (0.01-2000 µm) and the ability to implement more powerful algorithms (e.g., Mie theory). Furthermore, LD PSDs can be successfully used in the application of physically based pedo-transfer functions (e.g., the Arya and Paris model) for investigating soil hydraulic properties, due to the direct determination of PSD in terms of volume percentage rather than mass percentage, thus eliminating the need to adopt the rough approximation of a single value for soil particle density in the prediction process. Most of the recent LD work in soil science deals with comparisons with sedimentation techniques and shows a general overestimation of the silt fraction together with a general underestimation of the clay fraction; these well-known results must be related to the different physical principles behind the two techniques. Despite these efforts, it is surprising that little if any work is devoted to more basic methodological issues related to the high sensitivity of LD to the quantity and quality of the soil samples. Our work aims to

  11. A Probabilistic Tsunami Hazard Assessment Methodology and Its Application to Crescent City, CA

    Gonzalez, F. I.; Leveque, R. J.; Waagan, K.; Adams, L.; Lin, G.

    2012-12-01

    A PTHA methodology, based in large part on Probabilistic Seismic Hazard Assessment methods (e.g., Cornell, 1968; SSHAC, 1997; Geist and Parsons, 2005), was previously applied to Seaside, OR (Gonzalez et al., 2009). This initial version of the method has been updated to include: a revised method to estimate tidal uncertainty; an improved method for generating stochastic realizations to estimate slip distribution uncertainty (Mai and Beroza, 2002; Blair et al., 2011); additional near-field sources in the Cascadia Subduction Zone, based on the work of Goldfinger et al. (2012); and far-field sources in Japan, based on information updated since the 11 March 2011 Tohoku tsunami (Japan Earthquake Research Committee, 2011). The GeoClaw tsunami model (Berger et al., 2011) is used to simulate generation, propagation and inundation. We will discuss this revised PTHA methodology and the results of its application to Crescent City, CA. Berger, M.J., D.L. George, R.J. LeVeque, and K.T. Mandli (2011): The GeoClaw software for depth-averaged flows with adaptive refinement, Adv. Water Res. 34, pp. 1195-1206. Blair, J.L., McCrory, P.A., Oppenheimer, D.H., and Waldhauser, F. (2011): A geo-referenced 3D model of the Juan de Fuca Slab and associated seismicity: U.S. Geological Survey Data Series 633, v.1.0, available at http://pubs.usgs.gov/ds/633/. Cornell, C.A. (1968): Engineering seismic risk analysis, Bull. Seismol. Soc. Am., 58, 1583-1606. Geist, E.L., and T. Parsons (2005): Probabilistic analysis of tsunami hazards, Nat. Hazards, 37(3), 277-314. Goldfinger, C., Nelson, C.H., Morey, A.E., Johnson, J.E., Patton, J.R., Karabanov, E., Gutiérrez-Pastor, J., Eriksson, A.T., Gràcia, E., Dunhill, G., Enkin, R.J., Dallimore, A., and Vallier, T. (2012): Turbidite event history—Methods and implications for Holocene paleoseismicity of the Cascadia subduction zone: U.S. Geological Survey Professional Paper 1661-F, 170 p. (available at http://pubs.usgs.gov/pp/pp1661f/). González, F
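
    The probabilistic core of a PTHA can be sketched with the standard Poissonian aggregation over sources; the source rates and conditional exceedance probabilities below are invented placeholders, not values from the Crescent City study.

```python
# Minimal sketch of the hazard-aggregation step in a PTHA
# (illustrative numbers only).
import numpy as np

# Annual occurrence rates of tsunamigenic sources (near- and far-field).
rates = np.array([1 / 500.0, 1 / 1000.0, 1 / 300.0])
# P(inundation exceeds a threshold at the site | event), e.g. estimated
# from hydrodynamic runs over stochastic slip realizations and tides.
p_exceed = np.array([0.30, 0.10, 0.02])

# Poissonian combination: annual probability that at least one event
# produces an exceedance at the site.
annual_rate = np.sum(rates * p_exceed)
p_annual = 1.0 - np.exp(-annual_rate)
print(f"annual exceedance probability: {p_annual:.2e}")
print(f"~{1.0 / annual_rate:.0f}-year return period")
```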

  12. Quantifying reactor safety margins: Application of code scaling, applicability, and uncertainty evaluation methodology to a large-break, loss-of-coolant accident

    Boyack, B.; Duffey, R.; Wilson, G.; Griffith, P.; Lellouche, G.; Levy, S.; Rohatgi, U.; Wulff, W.; Zuber, N.

    1989-12-01

    The US Nuclear Regulatory Commission (NRC) has issued a revised rule for loss-of-coolant accident/emergency core cooling system (ECCS) analysis of light water reactors to allow the use of best-estimate computer codes in safety analysis as an option. A key feature of this option requires the licensee to quantify the uncertainty of the calculations and include that uncertainty when comparing the calculated results with acceptance limits provided in 10 CFR Part 50. To support the revised ECCS rule and illustrate its application, the NRC and its contractors and consultants have developed and demonstrated an uncertainty evaluation methodology called code scaling, applicability, and uncertainty (CSAU). The CSAU methodology and an example application described in this report demonstrate that uncertainties in complex phenomena can be quantified. The methodology is structured, traceable, and practical, as is needed in the regulatory arena. The methodology is systematic and comprehensive as it addresses and integrates the scenario, experiments, code, and plant to resolve questions concerned with: (a) code capability to scale-up processes from test facility to full-scale nuclear power plants; (b) code applicability to safety studies of a postulated accident scenario in a specified nuclear power plant; and (c) quantifying uncertainties of calculated results. 127 refs., 55 figs., 40 tabs

  13. Methodological developments of low field MRI: Elastography, MRI-ultrasound interaction and dynamic nuclear polarization

    Madelin, Guillaume

    2005-01-01

    This thesis deals with two aspects of low field (0.2 T) Magnetic Resonance Imaging (MRI): the search for new contrasts arising from the interaction between Nuclear Magnetic Resonance (NMR) and acoustics (elastography, spin-phonon interaction) and the enhancement of the signal-to-noise ratio by Dynamic Nuclear Polarization (DNP). Magnetic Resonance Elastography (MRE) makes it possible to assess some viscoelastic properties of tissues by visualizing the propagation of low frequency acoustic strain waves. A review of MRE is given, as well as a study on local measurement of the acoustic absorption coefficient. The next part is dedicated to the MRI-ultrasound interaction. First, the ultrasonic transducer was calibrated for power and acoustic field by comparing two methods: the radiation force method (balance method) and laser interferometry. Then, we tried to modify the T1 contrast of tissues by spin-phonon interaction through the application of ultrasound at the resonance frequency at 0.2 T, which is about 8.25 MHz. No modification of T1 contrast was obtained, but the acoustic streaming phenomenon was observed in liquids. MRI visualization of this streaming could make it possible to calibrate transducers as well as to assess some mechanical properties of viscous fluids. The goal of the last part was to set up DNP experiments at 0.2 T in order to enhance the NMR signal. This double resonance method is based on the polarization transfer from unpaired electrons of free radicals to the surrounding protons of water. This transfer occurs by cross relaxation during the saturation of an electronic transition using Electron Paramagnetic Resonance (EPR). Two EPR cavities operating at 5.43 GHz were tested on oxo-TEMPO free radicals (nitroxide). An enhancement of the NMR signal by a factor of 30 was obtained during these preliminary experiments. (author)

  14. Analysis and application of opinion model with multiple topic interactions.

    Xiong, Fei; Liu, Yun; Wang, Liang; Wang, Ximeng

    2017-08-01

    To reveal heterogeneous behaviors of opinion evolution in different scenarios, we propose an opinion model with topic interactions. Individual opinions and topic features are represented by a multidimensional vector. We measure an agent's action towards a specific topic by the product of opinion and topic feature. When pairs of agents interact for a topic, their actions are introduced to opinion updates with bounded confidence. Simulation results show that a transition from a disordered state to a consensus state occurs at a critical point of the tolerance threshold, which depends on the opinion dimension. The critical point increases as the dimension of opinions increases. Multiple topics promote opinion interactions and lead to the formation of macroscopic opinion clusters. In addition, more topics accelerate the evolutionary process and weaken the effect of network topology. We use two sets of large-scale real data to evaluate the model, and the results prove its effectiveness in characterizing a real evolutionary process. Our model achieves high performance in individual action prediction and even outperforms state-of-the-art methods. Meanwhile, our model has much smaller computational complexity. This paper provides a demonstration for possible practical applications of theoretical opinion dynamics.
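
    The dynamics described above lend themselves to a compact simulation: opinions and topics as vectors, an agent's action as the opinion-topic product, and a bounded-confidence update on actions. The sketch below loosely follows that description; the parameter values, update rule details, and cluster-counting heuristic are illustrative choices, not the authors'.

```python
# Minimal bounded-confidence opinion model with topic interactions.
import numpy as np

rng = np.random.default_rng(0)
n_agents, dim, steps = 100, 3, 20_000
eps, mu = 0.4, 0.3                       # tolerance threshold, step size

opinions = rng.uniform(-1, 1, (n_agents, dim))

for _ in range(steps):
    i, j = rng.choice(n_agents, size=2, replace=False)
    topic = rng.uniform(0, 1, dim)       # feature vector of discussed topic
    topic /= np.linalg.norm(topic)
    a_i, a_j = opinions[i] @ topic, opinions[j] @ topic  # actions on topic
    if abs(a_i - a_j) < eps:             # bounded confidence on actions
        # Move each opinion toward the other along the topic direction.
        opinions[i] += mu * (a_j - a_i) * topic
        opinions[j] += mu * (a_i - a_j) * topic

# Crude cluster count: round actions on a probe topic.
probe = np.ones(dim) / np.sqrt(dim)
print("distinct opinion clusters:",
      len(np.unique(np.round(opinions @ probe, 1))))
```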

  15. Analyzing the Feasibility of Using Secure Application Integration Methodology (SAIM) for Integrating DON Enterprise Resource Planning (ERP) Applications

    Marin, Ramon

    2004-01-01

    ...) would provide useful information about a beneficial methodology. SAIM is analyzed, by accessing its step by step directions, for suitability in the integration of the Enterprise Resource Planning (ERP...

  16. Application of the Coastal Hazard Wheel methodology for coastal multi-hazard assessment and management in the state of Djibouti

    Lars Rosendahl Appelquist

    2014-01-01

    Full Text Available This paper presents the application of a new methodology for coastal multi-hazard assessment and management in a changing global climate to the state of Djibouti. The methodology, termed the Coastal Hazard Wheel (CHW), is developed for worldwide application and is based on a specially designed coastal classification system that incorporates the main static and dynamic parameters determining the characteristics of a coastal environment. The methodology provides information on the hazards of ecosystem disruption, gradual inundation, salt water intrusion, erosion and flooding, and can be used to support management decisions at local, regional and national level in areas with limited access to geophysical data. The assessment for Djibouti applies a geographic information system (GIS) to develop a range of national hazard maps along with relevant hazard statistics, and showcases the procedure for applying the CHW methodology to national hazard assessments. The assessment shows that the coastline of Djibouti is characterized by extensive stretches with high or very high hazards of ecosystem disruption, mainly related to coral reefs and mangrove forests, while large sections along the coastlines of especially northern and southern Djibouti have high hazard levels for gradual inundation. The hazard of salt water intrusion is moderate along most of Djibouti's coastline, although groundwater availability is considered to be very sensitive to human ground water extraction. High or very high erosion hazards are associated with Djibouti's sedimentary plains, estuaries and river mouths, while very high flooding hazards are associated with the dry river mouths.
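
    The classification-to-hazard mapping at the heart of a CHW-style assessment can be sketched as a simple lookup from a coastal segment's class to graded hazard levels. The categories and levels below are invented placeholders, not the published CHW tables.

```python
# Toy sketch of a Coastal Hazard Wheel-style lookup for one segment.
CHW_TABLE = {
    # (geological layout, wave exposure, vegetation): hazard levels
    ("sedimentary plain", "exposed", "none"):
        {"erosion": "very high", "flooding": "very high"},
    ("sedimentary plain", "moderate", "mangrove"):
        {"erosion": "moderate", "flooding": "high",
         "ecosystem disruption": "very high"},
    ("rocky coast", "exposed", "none"):
        {"erosion": "low", "flooding": "low"},
}

segment = ("sedimentary plain", "moderate", "mangrove")
for hazard, level in CHW_TABLE[segment].items():
    print(f"{hazard}: {level}")
```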

  17. Multiscale Computational Fluid Dynamics: Methodology and Application to PECVD of Thin Film Solar Cells

    Marquis Crose

    2017-02-01

    Full Text Available This work focuses on the development of a multiscale computational fluid dynamics (CFD) simulation framework with application to plasma-enhanced chemical vapor deposition (PECVD) of thin film solar cells. A macroscopic CFD model is proposed which is capable of accurately reproducing plasma chemistry and transport phenomena within a 2D axisymmetric reactor geometry. Additionally, the complex interactions that take place on the surface of a-Si:H thin films are coupled with the CFD simulation using a novel kinetic Monte Carlo scheme which describes the thin film growth, leading to a multiscale CFD model. Due to the significant computational challenges imposed by this multiscale CFD model, a parallel computation strategy is presented which allows for reduced processing time via the discretization of both the gas-phase mesh and the microscopic thin film growth processes. Finally, the multiscale CFD model has been applied to the PECVD process at industrially relevant operating conditions, revealing non-uniformities greater than 20% in the growth rate of amorphous silicon films across the radius of the wafer.
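
    The microscale half of such a coupling is typically a rejection-free kinetic Monte Carlo loop; a generic sketch of one such loop is shown below. The event list, rates, and state variable are placeholders, not the paper's a-Si:H surface chemistry.

```python
# Minimal n-fold (Gillespie-style) kinetic Monte Carlo loop for a toy
# film-growth microscale model.
import numpy as np

rng = np.random.default_rng(9)

def kmc_run(n_steps):
    t = 0.0
    coverage = 0
    for _ in range(n_steps):
        # Event rates given the current surface state (toy values).
        rates = np.array([
            1.0e3,                      # radical adsorption
            5.0e1 * max(coverage, 1),   # hydrogen abstraction
            1.0e0 * max(coverage, 1),   # surface migration
        ])
        total = rates.sum()
        # Pick an event with probability proportional to its rate.
        event = rng.choice(len(rates), p=rates / total)
        if event == 0:
            coverage += 1
        # Advance time by an exponentially distributed increment.
        t += rng.exponential(1.0 / total)
    return t, coverage

t, cov = kmc_run(10_000)
print(f"simulated {t * 1e3:.2f} ms, {cov} adsorbed particles")
```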

  18. A methodology for the characterization and diagnosis of cognitive impairments-Application to specific language impairment.

    Oliva, Jesús; Serrano, J Ignacio; del Castillo, M Dolores; Iglesias, Angel

    2014-06-01

    The diagnosis of mental disorders is in most cases very difficult because of the high heterogeneity of, and overlap between, associated cognitive impairments. Furthermore, early and individualized diagnosis is crucial. In this paper, we propose a methodology to support the individualized characterization and diagnosis of cognitive impairments. The methodology can also be used as a test platform for existing theories on the causes of the impairments. We use computational cognitive modeling to gather information on the cognitive mechanisms underlying normal and impaired behavior. We then use this information to feed machine-learning algorithms to individually characterize the impairment and to differentiate between normal and impaired behavior. We apply the methodology to the particular case of specific language impairment (SLI) in Spanish-speaking children. The proposed methodology begins by defining a task in which normal individuals and individuals with impairment present behavioral differences. Next, we build a computational cognitive model of that task and individualize it: we build a cognitive model for each participant and optimize its parameter values to fit that participant's behavior. Finally, we use the optimized parameter values to feed different machine learning algorithms. The methodology was applied to an existing database of 48 Spanish-speaking children (24 normal and 24 SLI children), using clustering techniques for the characterization and different classifier techniques for the diagnosis. The characterization results show three well-differentiated groups that can be associated with the three main theories on SLI. Using a leave-one-subject-out testing methodology, all the classifiers except the decision tree (DT) produced sensitivity, specificity and area-under-curve values above 90%, reaching 100% in some cases. The results show that our methodology is able to find relevant information on the underlying cognitive mechanisms and to use it appropriately to provide better
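
    The diagnosis stage can be sketched as leave-one-subject-out classification on the fitted model parameters. The snippet below uses synthetic stand-in data and assumes scikit-learn is available; the feature values and classifier choice are illustrative, not the paper's.

```python
# Leave-one-subject-out classification on (synthetic) cognitive-model
# parameters for a 24 + 24 cohort.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut

rng = np.random.default_rng(1)
# 48 children x 3 optimized model parameters (synthetic stand-in).
X = np.vstack([rng.normal(0.0, 1.0, (24, 3)),    # typically developing
               rng.normal(1.0, 1.0, (24, 3))])   # SLI
y = np.array([0] * 24 + [1] * 24)

hits = 0
for train, test in LeaveOneOut().split(X):
    clf = LogisticRegression().fit(X[train], y[train])
    hits += clf.predict(X[test])[0] == y[test][0]
print(f"leave-one-subject-out accuracy: {hits / len(y):.2%}")
```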

  19. Development of a cross-section methodology and a real-time core model for VVER-1000 simulator application

    Georgieva, Emiliya Lyudmilova

    2016-06-06

    The novel academic contributions are summarized as follows. A) A cross-section modelling methodology and a cycle-specific cross-section update procedure are developed to meet the fidelity requirements applicable to cycle-specific reactor core simulation, as well as particular customer needs and practices supporting VVER-1000 operation and safety. B) A real-time version of the Nodal Expansion Method code is developed and implemented in the Kozloduy Unit 6 full-scope replica control room simulator.

  20. Development of an accident sequence precursor methodology and its application to significant accident precursors

    Jang, Seung Hyun; Park, Sung Hyun; Jae, Moo Sung [Dept. of of Nuclear Engineering, Hanyang University, Seoul (Korea, Republic of)

    2017-03-15

    The systematic management of plant risk is crucial for enhancing the safety of nuclear power plants and for designing new nuclear power plants. Accident sequence precursor (ASP) analysis may be able to provide the risk significance of operational experience by using probabilistic risk assessment to evaluate an operational event quantitatively in terms of its impact on core damage. In this study, an ASP methodology for two operation modes, full power and low power/shutdown, has been developed and applied to significant accident precursors that may occur during the operation of nuclear power plants. Two operational events, loss of feedwater and steam generator tube rupture, are identified as ASPs. Therefore, the ASP methodology developed in this study may contribute to identifying plant risk significance as well as to enhancing the safety of nuclear power plants by applying this methodology systematically.

  1. Methodology to assess the radiological sensitivity of soils: Application to Spanish soils

    Trueba Alonso, C.

    2005-01-01

    A methodology, based on standard physical and chemical soil properties, has been developed to estimate the radiological sensitivity of soils to 137Cs and 90Sr contamination. In this framework, the soil radiological sensitivity is defined as the soil's capability to mobilise or to retain these radionuclides. The purpose of this methodology is to assess, in terms of radiological sensitivity indexes, the behaviour of 137Cs and 90Sr in soils and their fluxes to man, considering two exposure pathways: external irradiation exposure and internal exposure from ingestion. The methodology is applied to the great variety of soil types found in Spain, where the soil profile is the reference unit for the assessment. The results for these soil types show that their basic soil properties are the key to categorising the radiological sensitivity according to the risks considered. The final categorisation makes it possible to identify especially sensitive soils and improves radiological impact assessment predictions. (Author)

  2. Application of Response Surface Methodology in Optimizing a Three Echelon Inventory System

    Seyed Hossein Razavi Hajiagha

    2014-01-01

    Full Text Available Inventory control is an important subject in supply chain management. In this paper, a three-echelon production, distribution and inventory system composed of one producer, two wholesalers and a set of retailers is considered. Customers' demands follow a compound Poisson process and the inventory policy is a kind of continuous review (R, Q) policy. Regarding the standard cost structure of an inventory model, the cost function of the system has been approximated using Response Surface Methodology as a combination of designed experiments, simulation, regression analysis and optimization. The proposed methodology can be applied as a novel method for optimizing the inventory policy of supply chains. The joint optimization of the inventory parameters, including the reorder point and the batch order size, is another advantage of the proposed methodology.
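
    The RSM loop described above (designed experiments, simulation, regression, optimization) can be sketched compactly: evaluate a noisy simulated cost over a design grid in (R, Q), fit a second-order surface, and minimize the fit. The cost function, grid, and bounds below are a toy stand-in for the paper's simulation model, and scipy is assumed to be available.

```python
# Response-surface sketch: quadratic fit of simulated costs, then
# minimization of the fitted surface.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(7)

def simulated_cost(r, q):
    """Toy inventory cost: holding + ordering + reorder-point penalty."""
    return 0.5 * q + 800.0 / q + 0.02 * (r - 40.0) ** 2 + rng.normal(0.0, 1.0)

# Factorial-style design around a center point in (R, Q).
R, Q = np.meshgrid(np.linspace(20, 60, 5), np.linspace(20, 80, 5))
r, q = R.ravel(), Q.ravel()
y = np.array([simulated_cost(ri, qi) for ri, qi in zip(r, q)])

# Fit a second-order response surface by least squares.
A = np.column_stack([np.ones_like(r), r, q, r * q, r ** 2, q ** 2])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)

def surface(x):
    r_, q_ = x
    return beta @ np.array([1.0, r_, q_, r_ * q_, r_ ** 2, q_ ** 2])

opt = minimize(surface, x0=[40.0, 50.0], method="L-BFGS-B",
               bounds=[(20, 60), (20, 80)])
print(f"fitted optimum: R* = {opt.x[0]:.1f}, Q* = {opt.x[1]:.1f}")
```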

  3. Fire risk analysis for nuclear power plants: Methodological developments and applications

    Kazarians, M.; Apostolakis, G.; Siu, N.O.

    1985-01-01

    A methodology to quantify the risk from fires in nuclear power plants is described. This methodology combines engineering judgment, statistical evidence, fire phenomenology, and plant system analysis. It can be divided into two major parts: (1) fire scenario identification and quantification, and (2) analysis of the impact on plant safety. This article primarily concentrates on the first part. Statistical analysis of fire occurrence data is used to establish the likelihood of ignition. The temporal behaviors of the two competing phenomena, fire propagation and fire detection and suppression, are studied and their characteristic times are compared. Severity measures are used to further specialize the frequency of the fire scenario. The methodology is applied to a switchgear room of a nuclear power plant

  4. Clearance of surface-contaminated objects from the controlled area of a nuclear facility. Application of the SUDOQU methodology

    Russo, F.; Mommaert, C. [Bel V, Brussels (Belgium); Dillen, T. van [National Institute for Public Health and the Environment (RIVM), Bilthoven (Netherlands)

    2018-01-15

    The lack of clearly defined surface-clearance levels in the Belgian regulations led Bel V to start a collaboration with the Dutch National Institute for Public Health and the Environment (RIVM) to evaluate the applicability of the SUDOQU methodology for the derivation of nuclide-specific surface-clearance criteria for objects released from nuclear facilities. SUDOQU is a methodology for the dose assessment of exposure to a surface-contaminated object, with the innovative assumption of a time-dependent surface activity whose evolution is influenced by removal and deposition mechanisms. In this work, calculations were performed to evaluate the annual effective dose resulting from the use of a typical office item, e.g. a bookcase. Preliminary results allow an understanding of the interdependencies between the model's underlying mechanisms and show a strong sensitivity to the main input parameters. The results were benchmarked against those from a model described in Radiation Protection 101 to investigate the impact of the model's main assumptions. The results of the two models were in good agreement. The SUDOQU methodology appears to be a flexible and powerful tool, suitable for the proposed application. Therefore, the project will be extended to more generic study cases, to eventually develop surface-clearance levels applicable to objects leaving nuclear facilities.

  5. An application of Six Sigma methodology to reduce the engine-overheating problem in an automotive company

    Antony, J. [Glasgow Caledonian University (United Kingdom). Six Sigma and Process Improvement Research Centre; Kumar, M. [Glasgow Caledonian University (United Kingdom). Division of Management; Tiwari, M.K. [National Institute of Foundry and Forge Technology, Ranchi (India). Department of Manufacturing Engineering

    2005-08-15

    Six Sigma is a systematic methodology for continuous process quality improvement and for achieving operational excellence. The overstatement that often accompanies the presentation and adoption of Six Sigma in industry can lead to unrealistic expectations as to what Six Sigma is truly capable of achieving. This paper deals with the application of a Six Sigma-based methodology to eliminating an engine-overheating problem in an automotive company. The DMAIC (define-measure-analyse-improve-control) approach has been followed here to solve an underlying problem of reducing process variation and the associated high defect rate. This paper explores how a foundry can use a systematic and disciplined approach to move towards the goal of Six Sigma quality level. The application of the Six Sigma methodology resulted in a reduction in the jamming problem encountered in the cylinder head and increased the process capability from 0.49 to 1.28. The application of DMAIC has had a significant financial impact (savings of over US$110,000 per annum) on the bottom line of the company. (author)
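
    The capability gain reported above can be read through the standard process-capability indices; a minimal worked illustration follows. The specification limits and process sigmas are invented so that the two Cp values land near the paper's 0.49 and 1.28, not taken from the study.

```python
# Standard process-capability indices (illustrative numbers).
def cp(usl, lsl, sigma):
    """Potential capability: spec width over six process sigmas."""
    return (usl - lsl) / (6 * sigma)

def cpk(usl, lsl, mu, sigma):
    """Actual capability: accounts for an off-center process mean."""
    return min(usl - mu, mu - lsl) / (3 * sigma)

# Before improvement: wide variation -> Cp ~ 0.49.
print(f"before: Cp = {cp(usl=10.0, lsl=4.0, sigma=2.04):.2f}")
# After DMAIC variation reduction -> Cp ~ 1.28.
print(f"after:  Cp = {cp(usl=10.0, lsl=4.0, sigma=0.78):.2f}")
# Off-center example: same spread, mean shifted toward the upper limit.
print(f"shifted: Cpk = {cpk(usl=10.0, lsl=4.0, mu=8.0, sigma=0.78):.2f}")
```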

  6. Optimization of Xylanase Production through Response Surface Methodology by Fusarium sp. BVKT R2 Isolated from forest soil and its applications in saccharification

    Ramanjaneyulu Golla

    2016-09-01

    Full Text Available Xylanases are hydrolytic enzymes with wide applications in several industries such as biofuels, paper and pulp, deinking, food and feed. The present study was aimed at identifying high-yield xylanase-producing fungi from natural resources. The two highest xylanase-producing fungal isolates, Q12 and L1, were picked from a collection of 450 fungal cultures for the utilization of xylan. These fungal isolates were identified, based on ITS gene sequencing analysis, as Fusarium sp. BVKT R2 (KT119615) and Fusarium strain BRR R6 (KT119619), respectively, with construction of phylogenetic trees. Fusarium sp. BVKT R2 was further optimized for maximum xylanase production, and the interaction effects between variables on the production of xylanase were studied through response surface methodology. The optimal conditions for maximal production of xylanase were sorbitol 1.5%, yeast extract 1.5%, pH 5.0, temperature 32.5 °C and agitation 175 rpm. Under optimal conditions, the yield of xylanase production by Fusarium sp. BVKT R2 was as high as 4560 U/ml in submerged fermentation (SmF). Incubation of different lignocellulosic biomasses with crude enzyme of Fusarium sp. BVKT R2 at 37 °C for 72 h could achieve about 45% saccharification. The results suggest that Fusarium sp. BVKT R2 has potential applications in the saccharification of biomass. Key words: Fusarium sp., optimization, response surface methodology, saccharification, submerged fermentation, xylanase

  7. Application of a new methodology to the multicycle analysis for the Laguna Verde NPP in Mexico

    Cortes C, Carlos C.

    1997-01-01

    This paper describes the improvements made to the physical and economic methodologies for multicycle analysis of the Boiling Water Reactors of the Laguna Verde NPP in Mexico, based on commercial codes and in-house developed computational tools. With these changes in the methodology, three feasible scenarios are generated for the operation of Laguna Verde Nuclear Power Plant Unit 2 with 12-, 18- and 24-month cycles. The physical and economic results obtained are shown. Furthermore, the effect of replacement power is included in the economic evaluation. (author). 11 refs., 3 figs., 7 tabs

  8. Putting Foucault to work: an approach to the practical application of Foucault's methodological imperatives

    DAVID A. NICHOLLS

    2009-01-01

    Full Text Available This paper presents an overview of the methodological approach taken in a recently completed Foucauldian discourse analysis of physiotherapy practice. In keeping with other approaches common to postmodern research, this paper resists the temptation to define a proper or 'correct' interpretation of Foucault's methodological oeuvre, preferring instead to apply a range of Foucauldian propositions to examples drawn directly from the thesis. In the paper I elaborate on the blended archaeological and genealogical approach I took and unpack some of the key imperatives, principles and rules I grappled with in completing the thesis.

  9. Application of the Integrated Safety Assessment methodology to safety margins. Dynamic Event Trees, Damage Domains and Risk Assessment

    Ibánez, L.; Hortal, J.; Queral, C.; Gómez-Magán, J.; Sánchez-Perea, M.; Fernández, I.; Meléndez, E.; Expósito, A.; Izquierdo, J.M.; Gil, J.; Marrao, H.; Villalba-Jabonero, E.

    2016-01-01

    The Integrated Safety Assessment (ISA) methodology, developed by the Consejo de Seguridad Nuclear, has been applied to an analysis of the Zion NPP for sequences with loss of the Component Cooling Water System (CCWS). The ISA methodology starts from the unfolding of the Dynamic Event Tree (DET). Results from this first step allow an assessment of the sequence delineation of standard Probabilistic Safety Analysis results. For some sequences of interest in the outlined DET, ISA then identifies the Damage Domain (DD). This is the region of uncertain times and/or parameters where a safety limit is exceeded, which indicates the occurrence of a certain damage situation. This paper illustrates the application of this concept with sequences simulated with MAAP and with TRACE. From the simulation results of sequence transients belonging to the DD and the time-density probability distributions of the manual actions and of the occurrence of stochastic phenomena, ISA integrates the proposed dynamic reliability equations to obtain the sequence contribution to the global Damage Exceedance Frequency (DEF). Reported results show a slight increase in the DEF for the sequences investigated following a power uprate from 100% to 110%. This demonstrates the potential use of the method to help in the assessment of design modifications. - Highlights: • This paper illustrates an application of the ISA methodology to safety margins. • Dynamic Event Trees are a useful tool for verifying the standard PSA Event Trees. • The ISA methodology takes into account the uncertainties in human action times. • The ISA methodology shows the Damage Exceedance Frequency increase in power uprates.
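
    The Damage Domain integration step admits a simple Monte Carlo reading: sample the uncertain operator action time from its time-density distribution, check whether the transient ends inside the damage region, and weight by the sequence frequency. The times, distribution, and frequency below are invented for illustration, not values from the Zion analysis.

```python
# Sketch of a sequence's contribution to the Damage Exceedance
# Frequency (DEF) via Monte Carlo over an uncertain action time.
import numpy as np

rng = np.random.default_rng(13)
N = 100_000

seq_frequency = 1.0e-4   # /yr, frequency of the analysed sequence (toy)
t_available = 35.0       # min; acting later than this exceeds the limit

# Time-density of the manual recovery action (lognormal assumption).
action_time = rng.lognormal(mean=np.log(25.0), sigma=0.4, size=N)

p_damage = np.mean(action_time > t_available)   # fraction inside the DD
print(f"P(damage | sequence) = {p_damage:.3f}")
print(f"DEF contribution     = {seq_frequency * p_damage:.2e} /yr")
```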

  10. Application of Lean Healthcare methodology in a urology department of a tertiary hospital as a tool for improving efficiency.

    Boronat, F; Budia, A; Broseta, E; Ruiz-Cerdá, J L; Vivas-Consuelo, D

    To describe the application of the Lean methodology as a method for continuously improving the efficiency of a urology department in a tertiary hospital. The implementation of the Lean Healthcare methodology in a urology department was conducted in 3 phases: 1) team training and improvement of feedback among the practitioners, 2) management by process and superspecialisation and 3) improvement of indicators (continuous improvement). The indicators were obtained from the hospital's information systems. The main source of information was the Balanced Scorecard for health systems management (CUIDISS). The comparison with other urology departments at the autonomous-community and national level was performed through the same platform with the help of the hospital's records department (IASIST). A baseline was established with the indicators obtained in 2011 for the comparative analysis of the results after implementing the Lean Healthcare methodology. The implementation of this methodology translated into high practitioner satisfaction and improved quality indicators, reaching a risk-adjusted complication index (RACI) of 0.59 and a risk-adjusted mortality rate (RAMR) of 0.24 in 4 years. A value of 0.61 was reached with the efficiency indicator (risk-adjusted length of stay [RALOS] index), with a savings of 2869 stays compared with national benchmarking (IASIST). The risk-adjusted readmissions index (RARI) was the only indicator above the standard, with a value of 1.36, but with progressive annual improvement. The Lean methodology can be effectively applied to a urology department of a tertiary hospital to improve efficiency, obtaining significant and continuous improvements in all its indicators, as well as practitioner satisfaction. Team training, management by process, continuous improvement and delegation of responsibilities have been shown to be the fundamental pillars of this methodology. Copyright © 2017 AEU. Published by Elsevier España, S.L.U. All rights reserved.

  11. SIMMER-III applications to fuel-coolant interactions

    Morita, K.; Kondo, Sa.; Tobita, Y.; Brear, D.J. [Power Reactor and Nuclear Fuel Development Corp., Oarai, Ibaraki (Japan). Oarai Engineering Center

    1998-01-01

    The main purpose of the SIMMER-III code is to provide numerical simulation of the complex multiphase, multicomponent flow problems essential to investigating core disruptive accidents in liquid-metal fast reactors (LMFRs). However, the code is designed to be sufficiently flexible to be applied to a variety of multiphase flows in addition to LMFR safety issues. In the present study, some typical experiments relating to fuel-coolant interactions (FCIs) have been analyzed with SIMMER-III to demonstrate that the code is applicable to such complex and highly transient multiphase flow situations. It is shown that SIMMER-III can reproduce the premixing phase in both water and sodium systems as well as the propagation of a steam explosion. It is thus demonstrated that the code is capable of simulating the integral multiphase thermal-hydraulic problems involved in FCI experiments. (author)

  12. 76 FR 21036 - Application of the Prevailing Wage Methodology in the H-2B Program

    2011-04-14

    ... Department to "promulgate new rules concerning the calculation of the prevailing wage rate in the H-2B... wage methodology set forth in this Rule applies only to wages paid for work performed on or after...: Notice. SUMMARY: On January 19, 2011, the Department of Labor (Department) published a final rule, Wage...

  13. A general improved methodology to forecasting future oil production: Application to the UK and Norway

    Fiévet, L.; Forró, Z.; Cauwels, P.; Sornette, D.

    2015-01-01

    We present a new Monte Carlo methodology to forecast the crude oil production of Norway and the UK based on a two-step process: (i) the nonlinear extrapolation of the current/past performances of individual oil fields and (ii) a stochastic model of the frequency of future oil field discoveries. Compared with the standard methodology, which tends to underestimate remaining oil reserves, our method gives a better description of future oil production, as validated by our back-tests starting in 2008. Specifically, we predict remaining reserves extractable until 2030 to be 5.7 ± 0.3 billion barrels for Norway and 3.0 ± 0.3 billion barrels for the UK, which are respectively 45% and 66% above the predictions using an extrapolation of aggregate production. - Highlights: • Two-step methodology to forecast a country's oil production. • Nonlinear extrapolation of the performance of individual fields. • Stochastic model of the frequency of future discoveries. • Back-test of the methodology starting in 2008. • Improvement upon standard extrapolation of aggregate production
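
    The two-step structure can be sketched as: extrapolate each known field's decline, then add a stochastic stream of future discoveries and repeat many times. The decline form (simple exponential rather than the paper's nonlinear extrapolation), rates, field sizes, and discovery frequency below are all invented for illustration.

```python
# Two-step Monte Carlo sketch: per-field decline extrapolation plus
# stochastic future discoveries (all numbers illustrative).
import numpy as np

rng = np.random.default_rng(3)
YEARS = 2030 - 2015

def field_production(initial_rate, decline):
    """Exponential decline extrapolation for one field (barrels/yr)."""
    t = np.arange(YEARS)
    return initial_rate * np.exp(-decline * t)

known_fields = [(80e6, 0.10), (50e6, 0.08), (30e6, 0.12)]  # (rate, decline)

totals = []
for _ in range(5_000):
    total = sum(field_production(r0, d).sum() for r0, d in known_fields)
    # Stochastic discoveries: Poisson count, lognormal recoverable size.
    n_new = rng.poisson(0.8 * YEARS)          # ~0.8 discoveries per year
    total += rng.lognormal(mean=16.0, sigma=1.0, size=n_new).sum()
    totals.append(total)

mean, sd = np.mean(totals), np.std(totals)
print(f"remaining reserves to 2030: {mean/1e9:.1f} +/- {sd/1e9:.1f} Gbbl")
```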

  14. Implementation and training methodology of subcritical reactors neutronic calculations triggered by external neutron source and applications

    Carluccio, Thiago

    2011-01-01

    This work investigated calculation methodologies for subcritical source-driven reactors, such as the Accelerator Driven Subcritical Reactor (ADSR) and the Fusion Driven Subcritical Reactor (FDSR). Intense R&D has been carried out on these subcritical concepts, mainly due to the possibilities they offer for transmutation of Minor Actinides (MA) and Long Lived Fission Products (LLFP). In this work, particular emphasis has been given to: (1) complementing and improving the calculation methodology with neutronic transmutation and decay capabilities and implementing it computationally; (2) using this methodology in the Coordinated Research Project (CRP) of the International Atomic Energy Agency on Analytical and Experimental Benchmark Analysis of ADS and in the Collaborative Work on Use of Low Enriched Uranium in ADS, especially in reproducing the experimental results of the Yalina Booster subcritical assembly and studying a subcritical core of the IPEN/MB-01 reactor; (3) comparing calculations with different nuclear data libraries for integral parameters, such as k_eff and k_src, differential distributions, such as spectra and fluxes, and nuclide inventories; and (4) applying the developed methodology in a study that may help future choices about dedicated transmutation systems. The following tools have been used in this work: MCNP (Monte Carlo N-Particle transport code), MCB (an enhanced version of MCNP that allows burnup calculation) and NJOY to process nuclear data from evaluated nuclear data files. (author)
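
    The integral parameters compared above relate directly to how strongly a source-driven subcritical core amplifies its external source; a short worked illustration of the standard relation M = 1 / (1 - k_src) follows (the k values are illustrative).

```python
# Total neutron multiplication of a source-driven subcritical core.
def multiplication(k_src):
    """M = 1 / (1 - k_src); valid only for subcritical systems."""
    assert k_src < 1.0, "relation only holds for k_src < 1"
    return 1.0 / (1.0 - k_src)

for k in (0.95, 0.97, 0.98):
    print(f"k_src = {k:.2f} -> M = {multiplication(k):.0f}")
```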

  15. A social network perspective on teacher collaboration in schools: Theory, methodology, and applications

    Moolenaar, Nienke

    2012-01-01

    An emerging trend in educational research is the use of social network theory and methodology to understand how teacher collaboration can support or constrain teaching, learning, and educational change. This article provides a critical synthesis of the educational literature on school social networks.

  16. Q and you: The application of Q methodology in recreation research

    Whitney. Ward

    2010-01-01

    Researchers have used various qualitative and quantitative methods to deal with subjectivity in studying people's recreation experiences. Q methodology has been the most effective approach for analyzing both qualitative and quantitative aspects of experience, including attitudes or perceptions. The method is composed of two main components--Q sorting and Q factor...

  17. An extensible six-step methodology to automatically generate fuzzy DSSs for diagnostic applications.

    d'Acierno, Antonio; Esposito, Massimo; De Pietro, Giuseppe

    2013-01-01

    The diagnosis of many diseases can often be formulated as a decision problem; uncertainty affects these problems, so many computerized Diagnostic Decision Support Systems (in the following, DDSSs) have been developed to aid the physician in interpreting clinical data and thus to improve the quality of the whole process. Fuzzy logic, a well-established attempt at the formalization and mechanization of human capabilities in reasoning and deciding with noisy information, can be profitably used. Recently, we informally proposed a general methodology to automatically build DDSSs on top of fuzzy knowledge extracted from data. Here we carefully refine and formalize our methodology, which includes six stages, where the first three stages work with crisp rules, whereas the last three operate on fuzzy models. Its strength relies on its generality and modularity, since it supports the integration of alternative techniques in each of its stages. The methodology is designed and implemented in the form of a modular and portable software architecture according to a component-based approach. The architecture is described in depth, and a summary inspection of the main components in terms of UML diagrams is outlined as well. A first implementation of the architecture has been realized in Java following the object-oriented paradigm and used to instantiate a DDSS example aimed at accurately diagnosing breast masses as a proof of concept. The results prove the feasibility of the whole methodology implemented in terms of the proposed architecture.
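
    A minimal sketch of the kind of fuzzy rule a generated DDSS evaluates: fuzzify crisp clinical inputs with membership functions, then fire a rule with a t-norm. The variables, membership functions, and rule below are invented for illustration and are not taken from the paper's extracted knowledge base.

```python
# Toy fuzzy-rule evaluation for a diagnostic decision support sketch.
def triangular(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def high_risk_activation(mass_size_mm, margin_irregularity):
    # Fuzzify crisp clinical inputs.
    size_large = triangular(mass_size_mm, 10, 30, 50)
    margin_irregular = triangular(margin_irregularity, 0.3, 0.7, 1.0)
    # Rule: IF size is large AND margin is irregular THEN risk is high
    # (min as the AND t-norm).
    return min(size_large, margin_irregular)

print(f"high-risk rule activation: {high_risk_activation(28.0, 0.65):.2f}")
```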

  18. Application Development Methodology Appropriateness: An Exploratory Case Study Bridging the Gap between Framework Characteristics and Selection

    Williams, Lawrence H., Jr.

    2013-01-01

    This qualitative study analyzed the experiences of twenty software developers. The research showed that all software development methodologies are distinct from each other. While some, such as waterfall, focus on traditional, plan-driven approaches, others facilitate ambiguity and uncertainty by allowing software requirements and design to evolve…

  19. Simplified life cycle assessment models: methodological framework and applications to energy pathways

    Padey, Pierryves

    2013-01-01

    The energy transition debate is a key issue for today and the coming years. One of the challenges is to limit the environmental impacts of electricity production. Decision support tools that are sufficiently accurate, simple to use, account for environmental aspects and inform future energy choices must be implemented. However, the environmental assessment of energy pathways is complex, and it requires a two-level characterization. The 'energy pathway' is the first level and corresponds to the distribution of environmental impacts across the pathway, used to compare overall pathways. The 'system pathway' is the second level and compares the environmental impacts of systems within each pathway. We have devised a generic methodology covering both characterization levels by estimating the environmental profiles of energy pathways while allowing a simple comparison of the environmental impacts of their systems. This methodology is based on the definition of a parameterized Life Cycle Assessment model and considers, through a Global Sensitivity Analysis, the environmental impacts of a large sample of systems representative of an energy pathway. As a second step, the methodology defines simplified models based on the few key parameters identified as inducing the largest variability in the energy pathway's environmental impacts. These models assess the systems' environmental impacts in a simple way, avoiding any complex LCAs. The reduction methodology has been applied to the onshore wind power pathway in Europe and the photovoltaic pathway in France. (author)
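
    The reduction idea can be sketched end to end: sample a parameterized impact model over a fleet of systems, rank parameters by influence, and regress the impact on the dominant few. The parameters, toy impact function, and correlation-based ranking below are invented placeholders (a crude stand-in for a full Sobol analysis), not the thesis's wind-power model.

```python
# Sketch of building a simplified LCA model from a parameterized one.
import numpy as np

rng = np.random.default_rng(5)
N = 2_000

# Hypothetical wind-turbine parameters sampled across a pathway.
capacity_factor = rng.uniform(0.15, 0.45, N)
lifetime_yr = rng.uniform(15, 30, N)
mass_per_mw = rng.uniform(80, 160, N)        # t/MW, drives embodied impacts

def impact_gco2_per_kwh(cf, life, mass):
    embodied = mass * 1.5e6                  # gCO2 per MW installed (toy)
    produced_kwh = cf * 8760e3 * life        # kWh per MW over lifetime
    return embodied / produced_kwh

y = impact_gco2_per_kwh(capacity_factor, lifetime_yr, mass_per_mw)

# Rank parameters by squared correlation with the impact.
for name, x in [("capacity_factor", capacity_factor),
                ("lifetime_yr", lifetime_yr),
                ("mass_per_mw", mass_per_mw)]:
    print(name, round(np.corrcoef(x, y)[0, 1] ** 2, 2))

# Simplified model: regress log-impact on the two dominant parameters.
X = np.column_stack([np.ones(N), np.log(capacity_factor), np.log(lifetime_yr)])
coef, *_ = np.linalg.lstsq(X, np.log(y), rcond=None)
print("simplified model coefficients:", np.round(coef, 2))
```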

  20. Application of a statistical methodology for the comprehension of corrosion phenomena on Phenix spent fuel pins

    Pantera, L.

    1992-11-01

    The maximum burnup of Phenix fuel elements is strongly conditioned by the internal corrosion of the steel cladding. This thesis is part of a new study program on the corrosion phenomena. Based on the results of an experimental program carried out during the years 1980-1990, its objective is to use a statistical methodology to gain a better understanding of the corrosion phenomena.

  1. Advanced software development workstation: Object-oriented methodologies and applications for flight planning and mission operations

    Izygon, Michel

    1993-01-01

    This report summarizes the work accomplished during the past nine months to help three different organizations, involved in Flight Planning and Mission Operations systems, transition to Object-Oriented Technology by adopting one of the currently most widely used Object-Oriented analysis and design methodologies.

  2. Methodology for estimating biomass energy potential and its application to Colombia

    Gonzalez-Salazar, Miguel Angel; Morini, Mirko; Pinelli, Michele; Spina, Pier Ruggero; Venturini, Mauro; Finkenrath, Matthias; Poganietz, Witold-Roger

    2014-01-01

    Highlights: • Methodology to estimate the biomass energy potential and its uncertainty at a country level. • Harmonization of approaches and assumptions in existing assessment studies. • The theoretical and technical biomass energy potentials in Colombia are estimated for 2010. - Abstract: This paper presents a methodology to estimate the biomass energy potential and its associated uncertainty at a country level when the quality and availability of data are limited. The current biomass energy potential in Colombia is assessed following the proposed methodology, and the results are compared to existing assessment studies. The proposed methodology is a bottom-up, resource-focused approach with statistical analysis that uses a Monte Carlo algorithm to stochastically estimate the theoretical and technical biomass energy potentials. The paper also includes a proposed approach to quantify uncertainty, combining a probabilistic propagation of uncertainty, a sensitivity analysis and a set of disaggregated sub-models to estimate the reliability of predictions and reduce the associated uncertainty. Results predict a theoretical energy potential of 0.744 EJ and a technical potential of 0.059 EJ in 2010, the latter accounting for 1.2% of the annual primary energy production (4.93 EJ)
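
    The Monte Carlo step of a resource-focused estimate can be sketched as: theoretical potential = sum over residue streams of production × residue-to-product ratio × heating value, with the uncertain factors sampled. The crop list, ranges, and values below are placeholders, not the study's Colombian inventory.

```python
# Monte Carlo sketch of a theoretical biomass energy potential.
import numpy as np

rng = np.random.default_rng(11)
N = 20_000

# crop: (production Mt/yr, residue-to-product ratio range, LHV MJ/kg range)
crops = {
    "sugarcane": (38.0, (0.2, 0.4), (14.0, 18.0)),
    "palm oil":  (4.0,  (1.0, 1.4), (15.0, 19.0)),
    "rice":      (2.5,  (1.2, 1.6), (13.0, 16.0)),
}

totals = np.zeros(N)
for production, (r_lo, r_hi), (lhv_lo, lhv_hi) in crops.values():
    ratio = rng.uniform(r_lo, r_hi, N)       # residue-to-product ratio
    lhv = rng.uniform(lhv_lo, lhv_hi, N)     # lower heating value
    totals += production * 1e9 * ratio * lhv * 1e6   # kg * MJ/kg * J/MJ

mean_ej = totals.mean() / 1e18
p5, p95 = np.percentile(totals, [5, 95]) / 1e18
print(f"theoretical potential: {mean_ej:.2f} EJ/yr (90% CI {p5:.2f}-{p95:.2f})")
```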

  3. Supporting secure programming in web applications through interactive static analysis

    Zhu, Jun; Xie, Jing; Lipford, Heather Richter; Chu, Bill

    2013-01-01

    Many security incidents are caused by software developers’ failure to adhere to secure programming practices. Static analysis tools have been used to detect software vulnerabilities. However, their wide usage by developers is limited by the special training required to write rules customized to application-specific logic. Our approach is interactive static analysis, to integrate static analysis into Integrated Development Environment (IDE) and provide in-situ secure programming support to help developers prevent vulnerabilities during code construction. No additional training is required nor are there any assumptions on ways programs are built. Our work is motivated in part by the observation that many vulnerabilities are introduced due to failure to practice secure programming by knowledgeable developers. We implemented a prototype interactive static analysis tool as a plug-in for Java in Eclipse. Our technical evaluation of our prototype detected multiple zero-day vulnerabilities in a large open source project. Our evaluations also suggest that false positives may be limited to a very small class of use cases. PMID:25685513

  4. Supporting secure programming in web applications through interactive static analysis

    Jun Zhu

    2014-07-01

    Full Text Available Many security incidents are caused by software developers’ failure to adhere to secure programming practices. Static analysis tools have been used to detect software vulnerabilities. However, their wide usage by developers is limited by the special training required to write rules customized to application-specific logic. Our approach is interactive static analysis, to integrate static analysis into Integrated Development Environment (IDE and provide in-situ secure programming support to help developers prevent vulnerabilities during code construction. No additional training is required nor are there any assumptions on ways programs are built. Our work is motivated in part by the observation that many vulnerabilities are introduced due to failure to practice secure programming by knowledgeable developers. We implemented a prototype interactive static analysis tool as a plug-in for Java in Eclipse. Our technical evaluation of our prototype detected multiple zero-day vulnerabilities in a large open source project. Our evaluations also suggest that false positives may be limited to a very small class of use cases.

  5. Interactive visualization and analysis of multimodal datasets for surgical applications.

    Kirmizibayrak, Can; Yim, Yeny; Wakid, Mike; Hahn, James

    2012-12-01

    Surgeons use information from multiple sources when making surgical decisions. These include volumetric datasets (such as CT, PET, MRI, and their variants), 2D datasets (such as endoscopic videos), and vector-valued datasets (such as computer simulations). Presenting all the information to the user in an effective manner is a challenging problem. In this paper, we present a visualization approach that displays the information from various sources in a single coherent view. The system allows the user to explore and manipulate volumetric datasets, display analysis of dataset values in local regions, combine 2D and 3D imaging modalities and display results of vector-based computer simulations. Several interaction methods are discussed: in addition to traditional interfaces including mouse and trackers, gesture-based natural interaction methods are shown to control these visualizations with real-time performance. An example of a medical application (medialization laryngoplasty) is presented to demonstrate how the combination of different modalities can be used in a surgical setting with our approach.

  6. Supporting secure programming in web applications through interactive static analysis.

    Zhu, Jun; Xie, Jing; Lipford, Heather Richter; Chu, Bill

    2014-07-01

    Many security incidents are caused by software developers' failure to adhere to secure programming practices. Static analysis tools have been used to detect software vulnerabilities. However, their wide usage by developers is limited by the special training required to write rules customized to application-specific logic. Our approach is interactive static analysis, to integrate static analysis into Integrated Development Environment (IDE) and provide in-situ secure programming support to help developers prevent vulnerabilities during code construction. No additional training is required nor are there any assumptions on ways programs are built. Our work is motivated in part by the observation that many vulnerabilities are introduced due to failure to practice secure programming by knowledgeable developers. We implemented a prototype interactive static analysis tool as a plug-in for Java in Eclipse. Our technical evaluation of our prototype detected multiple zero-day vulnerabilities in a large open source project. Our evaluations also suggest that false positives may be limited to a very small class of use cases.

  7. Methodology for the biosphere evaluation during the RRAA management. Application for the Mediterranean system

    Pinedo, P.; Simon, I.; Aguero, A.

    1998-01-01

    For several years CIEMAT has been developing for ENRESA the knowledge and tools to support the modelling of the migration and accumulation of radionuclides within the biosphere once those radionuclides are released or reach one or more parts of the biosphere (atmosphere, water, etc.), and of the impacts arising from the resulting distribution of radionuclides in the biosphere. In 1996, a Methodology to analyse the biosphere in the context of high level waste repositories was proposed to ENRESA, in which the issues mentioned above were considered and treated. The level of development of the different aspects proposed within the Methodology was quite heterogeneous: while aspects of radionuclide transport modelling were already well developed in theoretical and practical terms, other aspects, like the procedure for conceptual model development using the RES matrix and the description of biosphere systems representative of the long term, needed further development. These methodological developments have proceeded in parallel with similar international developments in which there was, and is, active participation: the BIOMOVS II international project, finalized in 1996, where the so-called Reference Biosphere Methodology was developed, and the International Atomic Energy Agency (IAEA) Programme on Biosphere Modelling and Assessment (BIOMASS), under way at present in collaboration with several national organizations, ENRESA and CIEMAT among them. The work described here takes account of these international developments. The overall purpose of this work is to apply the Methodology to the last performance assessment (PA) exercise made by ENRESA, using from it the general and particular information about the assessment context, the source term, and the geo-biosphere interface data. (Author) 6 refs

  8. Application of CASMO-4/MICROBURN-B2 methodology to mixed cores with Westinghouse Optima2 fuel

    Hsiao, Ming Yuan; Wheeler, John K.; Hoz, Carlos de la [Nuclear Fuels, Warrenville (United States)

    2008-10-15

    The first application of CASMO-4/MICROBURN-B2 methodology to Westinghouse SVEA-96 Optima2 reload cycle is described in this paper. The first Westinghouse Optima2 reload cycle in the U.S. is Exelon's Quad Cities Unit 2 Cycle 19 (Q2C19). The core contains fresh Optima2 fuel and once burned and twice burned GE14 fuel. Although the licensing analyses for the reload cycle are performed by Westinghouse with Westinghouse methodology, the core is monitored with AREVA's POWERPLEX-III core monitoring system that is based on the CASMO-4/MICROBURN-B2 (C4/B2) methodology. This necessitates the development of a core model based on the C4/B2 methodology for both reload design and operational support purposes. In addition, as expected, there are many differences between the two vendors' methodologies; they differ not only in modeling some of the physical details of the Optima2 bundles but also in the modeling capability of the computer codes. In order to have high confidence that the online core monitoring results during the cycle startup and operation will comply with the Technical Specifications requirements (e.g., thermal limits, shutdown margins), the reload core design generated by Westinghouse design methodology was confirmed by the C4/B2 model. The C4/B2 model also assures that timely operational support during the cycle can be provided. Since this is the first application of C4/B2 methodology to an Optima2 reload in the US, many issues in the lattice design, bundle design, and reload core design phases were encountered. Many modeling issues have to be considered in order to develop a successful C4/B2 core model for the Optima2/GE14 mixed core. Some of the modeling details and concerns and their resolutions are described. The Q2C19 design was successfully completed and the 2 year cycle successfully started up in April 2006 and shut down in March 2008. Some of the operating results are also presented.

  9. Application of CASMO-4/MICROBURN-B2 methodology to mixed cores with Westinghouse Optima2 fuel

    Hsiao, Ming Yuan; Wheeler, John K.; Hoz, Carlos de la [Nuclear Fuels, Warrenville (United States)

    2008-10-15

    The first application of CASMO-4/MICROBURN-B2 methodology to Westinghouse SVEA-96 Optima2 reload cycle is described in this paper. The first Westinghouse Optima2 reload cycle in the U.S. is Exelon's Quad Cities Unit 2 Cycle 19 (Q2C19). The core contains fresh Optima2 fuel and once burned and twice burned GE14 fuel. Although the licensing analyses for the reload cycle are performed by Westinghouse with Westinghouse methodology, the core is monitored with AREVA's POWERPLEX-III core monitoring system that is based on the CASMO-4/MICROBURN-B2 (C4/B2) methodology. This necessitates the development of a core model based on the C4/B2 methodology for both reload design and operational support purposes. In addition, as expected, there are many differences between the two vendors' methodologies; they differ not only in modeling some of the physical details of the Optima2 bundles but also in the modeling capability of the computer codes. In order to have high confidence that the online core monitoring results during the cycle startup and operation will comply with the Technical Specifications requirements (e.g., thermal limits, shutdown margins), the reload core design generated by Westinghouse design methodology was confirmed by the C4/B2 model. The C4/B2 model also assures that timely operational support during the cycle can be provided. Since this is the first application of C4/B2 methodology to an Optima2 reload in the US, many issues in the lattice design, bundle design, and reload core design phases were encountered. Many modeling issues have to be considered in order to develop a successful C4/B2 core model for the Optima2/GE14 mixed core. Some of the modeling details and concerns and their resolutions are described. The Q2C19 design was successfully completed and the 2 year cycle successfully started up in April 2006 and shut down in March 2008. Some of the operating results are also presented.

  10. Application of CASMO-4/MICROBURN-B2 methodology to mixed cores with Westinghouse Optima2 fuel

    Hsiao, Ming Yuan; Wheeler, John K.; Hoz, Carlos de la

    2008-01-01

    The first application of CASMO-4/MICROBURN-B2 methodology to Westinghouse SVEA-96 Optima2 reload cycle is described in this paper. The first Westinghouse Optima2 reload cycle in the U.S. is Exelon's Quad Cities Unit 2 Cycle 19 (Q2C19). The core contains fresh Optima2 fuel and once burned and twice burned GE14 fuel. Although the licensing analyses for the reload cycle are performed by Westinghouse with Westinghouse methodology, the core is monitored with AREVA's POWERPLEX-III core monitoring system that is based on the CASMO-4/MICROBURN-B2 (C4/B2) methodology. This necessitates the development of a core model based on the C4/B2 methodology for both reload design and operational support purposes. In addition, as expected, there are many differences between the two vendors' methodologies; they differ not only in modeling some of the physical details of the Optima2 bundles but also in the modeling capability of the computer codes. In order to have high confidence that the online core monitoring results during the cycle startup and operation will comply with the Technical Specifications requirements (e.g., thermal limits, shutdown margins), the reload core design generated by Westinghouse design methodology was confirmed by the C4/B2 model. The C4/B2 model also assures that timely operational support during the cycle can be provided. Since this is the first application of C4/B2 methodology to an Optima2 reload in the US, many issues in the lattice design, bundle design, and reload core design phases were encountered. Many modeling issues have to be considered in order to develop a successful C4/B2 core model for the Optima2/GE14 mixed core. Some of the modeling details and concerns and their resolutions are described. The Q2C19 design was successfully completed and the 2 year cycle successfully started up in April 2006 and shut down in March 2008. Some of the operating results are also presented

  11. Development and validation of a new turbocharger simulation methodology for marine two stroke diesel engine modelling and diagnostic applications

    Sakellaridis, Nikolaos F.; Raptotasios, Spyridon I.; Antonopoulos, Antonis K.; Mavropoulos, Georgios C.; Hountalas, Dimitrios T.

    2015-01-01

    Engine cycle simulation models are increasingly used in diesel engine simulation and diagnostic applications, reducing experimental effort. Turbocharger simulation plays an important role in a model's ability to accurately predict engine performance and emissions. The present work describes the development of a complete engine simulation model for marine diesel engines based on a new methodology for turbocharger modelling that utilizes physically based meanline models for the compressor and turbine. Simulation accuracy is evaluated against engine bench measurements. The methodology was developed to overcome the problem of the limited availability of experimental maps for the compressor and turbine, often encountered in large marine diesel engine simulation and diagnostic studies. Data from the engine bench are used to calibrate the models, as well as to estimate the turbocharger shaft mechanical efficiency. The closed cycle and gas exchange are modelled using an existing multizone thermodynamic model. The proposed methodology is applied to a two-stroke marine diesel engine, and its evaluation is based on the comparison of predictions against measured engine data. The model's ability to predict engine response to load variation is demonstrated for both turbocharger performance and closed cycle parameters, as well as NOx emission trends, making it an effective tool for both engine diagnostic and optimization studies. - Highlights: • Marine two-stroke diesel engine simulation model. • Turbine and compressor simulation using physical meanline models. • Methodology to derive T/C component efficiency and T/C shaft mechanical efficiency. • Extensive validation of predictions against experimental data.
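
    A back-of-the-envelope version of the shaft power balance used to infer mechanical efficiency from bench data can be written with the standard compressor and turbine work relations. These are textbook relations rather than the paper's meanline models, and all operating-point numbers are invented.

```python
# Turbocharger shaft power balance (illustrative operating point).
def compressor_power(m_dot, t_in, pr, eta_c, cp=1005.0, gamma=1.4):
    """Shaft power absorbed by the compressor [W]."""
    return m_dot * cp * t_in * (pr ** ((gamma - 1) / gamma) - 1) / eta_c

def turbine_power(m_dot, t_in, er, eta_t, cp=1100.0, gamma=1.33):
    """Shaft power delivered by the turbine [W] for expansion ratio er."""
    return m_dot * cp * t_in * eta_t * (1 - er ** (-(gamma - 1) / gamma))

p_c = compressor_power(m_dot=8.0, t_in=300.0, pr=3.5, eta_c=0.80)
p_t = turbine_power(m_dot=8.2, t_in=750.0, er=3.2, eta_t=0.82)
# At steady state, P_compressor = eta_mech * P_turbine.
print(f"estimated mechanical efficiency: {p_c / p_t:.2f}")
```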

  12. Nanoparticle decoration with surfactants: Molecular interactions, assembly, and applications

    Heinz, Hendrik; Pramanik, Chandrani; Heinz, Ozge; Ding, Yifu; Mishra, Ratan K.; Marchon, Delphine; Flatt, Robert J.; Estrela-Lopis, Irina; Llop, Jordi; Moya, Sergio; Ziolo, Ronald F.

    2017-02-01

    Nanostructures of diverse chemical nature are used as biomarkers, therapeutics, catalysts, and structural reinforcements. The decoration with surfactants has a long history and is essential to introduce specific functions. The definition of surfactants in this review is very broad, following its lexical meaning "surface active agents", and therefore includes traditional alkyl modifiers, biological ligands, polymers, and other surface active molecules. The review systematically covers covalent and non-covalent interactions of such surfactants with various types of nanomaterials, including metals, oxides, layered materials, and polymers as well as their applications. The major themes are (i) molecular recognition and noncovalent assembly mechanisms of surfactants on the nanoparticle and nanocrystal surfaces, (ii) covalent grafting techniques and multi-step surface modification, (iii) dispersion properties and surface reactions, (iv) the use of surfactants to influence crystal growth, as well as (v) the incorporation of biorecognition and other material-targeting functionality. For the diverse materials classes, similarities and differences in surfactant assembly, function, as well as materials performance in specific applications are described in a comparative way. Major factors that lead to differentiation are the surface energy, surface chemistry and pH sensitivity, as well as the degree of surface regularity and defects in the nanoparticle cores and in the surfactant shell. The review covers a broad range of surface modifications and applications in biological recognition and therapeutics, sensors, nanomaterials for catalysis, energy conversion and storage, the dispersion properties of nanoparticles in structural composites and cement, as well as purification systems and classical detergents. Design principles for surfactants to optimize the performance of specific nanostructures are discussed. The review concludes with challenges and opportunities.

  13. Quantifying reactor safety margins: Part 1: An overview of the code scaling, applicability, and uncertainty evaluation methodology

    Boyack, B.E.; Duffey, R.B.; Griffith, P.

    1988-01-01

    In August 1988, the Nuclear Regulatory Commission (NRC) approved the final version of a revised rule on the acceptance of emergency core cooling systems (ECCS) entitled ''Emergency Core Cooling System; Revisions to Acceptance Criteria.'' The revised rule states that an alternate ECCS performance analysis, based on best-estimate methods, may be used to provide more realistic estimates of plant safety margins, provided the licensee quantifies the uncertainty of the estimates and includes that uncertainty when comparing the calculated results with prescribed acceptance limits. To support the revised ECCS rule, the NRC and its contractors and consultants have developed and demonstrated a method called the Code Scaling, Applicability, and Uncertainty (CSAU) evaluation methodology. It is an auditable, traceable, and practical method for combining quantitative analyses and expert opinions to arrive at computed values of uncertainty. This paper provides an overview of the CSAU evaluation methodology and its application to a postulated cold-leg, large-break loss-of-coolant accident in a Westinghouse four-loop pressurized water reactor with 17 × 17 fuel. The code selected for this demonstration of the CSAU methodology was TRAC-PF1/MOD1, Version 14.3. 23 refs., 5 figs., 1 tab

  14. Methodology for accident analyses of fusion breeder blankets and its application to helium-cooled pebble bed blanket

    Panayotov, Dobromir; Grief, Andrew; Merrill, Brad J.; Humrickhouse, Paul; Trow, Martin; Dillistone, Michael; Murgatroyd, Julian T.; Owen, Simon; Poitevin, Yves; Peers, Karen; Lyons, Alex; Heaton, Adam; Scott, Richard

    2016-01-01

    Graphical abstract: - Highlights: • Test Blanket Systems (TBS) DEMO breeding blankets (BB) safety demonstration. • Comprehensive methodology for fusion breeding blanket accident analysis that addresses the specificity of the breeding blanket designs, materials, and phenomena. • Development of accident analysis specifications (AAS) via the use of phenomena identification and ranking tables (PIRT). • PIRT application to identify required physical models for BB accident analysis, code assessment and selection. • Development of MELCOR and RELAP5 codes TBS models. • Qualification of the models via comparison with finite element calculations, code-to-code comparisons, and sensitivity studies. - Abstract: ‘Fusion for Energy’ (F4E) is designing, developing, and implementing the European Helium-Cooled Lead-Lithium (HCLL) and Helium-Cooled Pebble-Bed (HCPB) Test Blanket Systems (TBSs) for ITER (Nuclear Facility INB-174). Safety demonstration is an essential element for the integration of these TBSs into ITER and accident analysis is one of its critical components. A systematic approach to accident analysis has been developed under the F4E contract on TBS safety analyses. F4E technical requirements, together with Amec Foster Wheeler and INL efforts, have resulted in a comprehensive methodology for fusion breeding blanket accident analysis that addresses the specificity of the breeding blanket designs, materials, and phenomena while remaining consistent with the approach already applied to ITER accident analyses. The methodology phases are illustrated in the paper by its application to the EU HCPB TBS using both MELCOR and RELAP5 codes.

  15. Development and application of a statistical methodology to evaluate the predictive accuracy of building energy baseline models

    Granderson, Jessica [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States). Energy Technologies Area Div.; Price, Phillip N. [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States). Energy Technologies Area Div.

    2014-03-01

    This paper documents the development and application of a general statistical methodology to assess the accuracy of baseline energy models, focusing on its application to Measurement and Verification (M&V) of whole-building energy savings. The methodology complements the principles addressed in resources such as ASHRAE Guideline 14 and the International Performance Measurement and Verification Protocol. It requires fitting a baseline model to data from a "training period" and using the model to predict total electricity consumption during a subsequent "prediction period." We illustrate the methodology by evaluating five baseline models using data from 29 buildings. The training period and prediction period were varied, and model predictions of daily, weekly, and monthly energy consumption were compared to meter data to determine model accuracy. Several metrics were used to characterize the accuracy of the predictions, and in some cases the best-performing model as judged by one metric was not the best performer when judged by another metric.
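
    To make the accuracy-metric step concrete, here is a minimal sketch in the spirit of ASHRAE Guideline 14: fit a stand-in baseline model on a training period, predict a later period, and compute normalized mean bias error (NMBE) and CV(RMSE). The temperature-regression baseline and all numbers are illustrative assumptions, not the five models or the 29-building dataset from the paper.

```python
import numpy as np

# Illustrative baseline-model accuracy metrics (NMBE and CV(RMSE)).

def fit_baseline(T_train, y_train):
    """Least-squares fit of daily energy vs. outdoor temperature."""
    X = np.column_stack([np.ones_like(T_train), T_train])
    beta, *_ = np.linalg.lstsq(X, y_train, rcond=None)
    return beta

def predict(beta, T):
    return beta[0] + beta[1] * T

def accuracy_metrics(y_true, y_pred):
    resid = y_true - y_pred
    nmbe = resid.mean() / y_true.mean() * 100.0               # bias, %
    cv_rmse = np.sqrt((resid ** 2).mean()) / y_true.mean() * 100.0
    return nmbe, cv_rmse

rng = np.random.default_rng(0)
T = rng.uniform(0, 30, 365)                  # daily mean temperature, degC
y = 120 + 4.0 * T + rng.normal(0, 10, 365)   # synthetic daily kWh
beta = fit_baseline(T[:180], y[:180])        # "training period"
nmbe, cv = accuracy_metrics(y[180:], predict(beta, T[180:]))  # "prediction period"
print(f"NMBE = {nmbe:.1f}%, CV(RMSE) = {cv:.1f}%")
```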

  16. Methodology for accident analyses of fusion breeder blankets and its application to helium-cooled pebble bed blanket

    Panayotov, Dobromir, E-mail: dobromir.panayotov@f4e.europa.eu [Fusion for Energy (F4E), Josep Pla, 2, Torres Diagonal Litoral B3, Barcelona E-08019 (Spain); Grief, Andrew [Amec Foster Wheeler, Booths Park, Chelford Road, Knutsford WA16 8QZ, Cheshire (United Kingdom); Merrill, Brad J.; Humrickhouse, Paul [Idaho National Laboratory, PO Box 1625, Idaho Falls, ID (United States); Trow, Martin; Dillistone, Michael; Murgatroyd, Julian T.; Owen, Simon [Amec Foster Wheeler, Booths Park, Chelford Road, Knutsford WA16 8QZ, Cheshire (United Kingdom); Poitevin, Yves [Fusion for Energy (F4E), Josep Pla, 2, Torres Diagonal Litoral B3, Barcelona E-08019 (Spain); Peers, Karen; Lyons, Alex; Heaton, Adam; Scott, Richard [Amec Foster Wheeler, Booths Park, Chelford Road, Knutsford WA16 8QZ, Cheshire (United Kingdom)

    2016-11-01

    Graphical abstract: - Highlights: • Test Blanket Systems (TBS) DEMO breeding blankets (BB) safety demonstration. • Comprehensive methodology for fusion breeding blanket accident analysis that addresses the specificity of the breeding blanket designs, materials, and phenomena. • Development of accident analysis specifications (AAS) via the use of phenomena identification and ranking tables (PIRT). • PIRT application to identify required physical models for BB accident analysis, code assessment and selection. • Development of MELCOR and RELAP5 codes TBS models. • Qualification of the models via comparison with finite element calculations, code-to-code comparisons, and sensitivity studies. - Abstract: ‘Fusion for Energy’ (F4E) is designing, developing, and implementing the European Helium-Cooled Lead-Lithium (HCLL) and Helium-Cooled Pebble-Bed (HCPB) Test Blanket Systems (TBSs) for ITER (Nuclear Facility INB-174). Safety demonstration is an essential element for the integration of these TBSs into ITER and accident analysis is one of its critical components. A systematic approach to accident analysis has been developed under the F4E contract on TBS safety analyses. F4E technical requirements, together with Amec Foster Wheeler and INL efforts, have resulted in a comprehensive methodology for fusion breeding blanket accident analysis that addresses the specificity of the breeding blanket designs, materials, and phenomena while remaining consistent with the approach already applied to ITER accident analyses. The methodology phases are illustrated in the paper by its application to the EU HCPB TBS using both MELCOR and RELAP5 codes.

  17. AN INVESTIGATION TO RESOLVE THE INTERACTION BETWEEN FUEL CELL, POWER CONDITIONING SYSTEM AND APPLICATION LOADS

    Sudip K. Mazumder; Chuck McKintyre; Dan Herbison; Doug Nelson; Comas Haynes; Michael von Spakovsky; Joseph Hartvigsen; S. Elangovan

    2003-11-03

    Solid-Oxide Fuel Cell (SOFC) stacks respond quickly to changes in load and exhibit high part- and full-load efficiencies due to their rapid electrochemistry. However, this is not true for the thermal, mechanical, and chemical balance-of-plant subsystem (BOPS), where load-following time constants are typically several orders of magnitude higher. This dichotomy diminishes the reliability and performance of the electrode as load demand increases. Because these unwanted phenomena are not well understood, SOFC manufacturers use conservative schemes (such as delayed load-following to compensate for slow BOPS response, or expensive inductor filtering) to control stack responses to load variations. This limits the applicability of SOFC systems for load-varying stationary and transportation applications from a cost standpoint. Thus, a need exists for the synthesis of component- and system-level models of SOFC power-conditioning systems and the development of methodologies for investigating the system-interaction issues (which reduce the lifetime and efficiency of a SOFC) and optimizing the responses of each subsystem, leading to optimal designs of power-conditioning electronics and optimal control strategies, which mitigate the electrical-feedback effects. Equally important are ''multiresolution'' finite-element modeling and simulation studies, which can predict the impact of changes in system-level variables (e.g., current ripple and load transients) on the local current densities, voltages, and temperatures (parameters that are very difficult or cumbersome, if not impossible, to obtain) within a SOFC cell. Toward that end, for Phase I of this project, sponsored by the U.S. DOE (NETL), we investigate the interactions among the fuel cell, power-conditioning system, and application loads and their effects on SOFC reliability (durability) and performance. A number of methodologies have been used in Phase I to develop the steady-state and transient

  18. Project Management Methodology for the Development of M-Learning Web Based Applications

    Adrian VISOIU

    2010-01-01

    M-learning web-based applications are a particular case of web applications designed to be operated from mobile devices; their purpose is to implement learning activities. Project management of such applications takes their identified peculiarities into account. M-learning web-based application characteristics are identified. M-learning functionality covers the needs of an educational process. Development is described taking into account the mobile web and its influence on the analysis, design, construction and testing phases. Activities building up a work breakdown structure for the development of m-learning web-based applications are presented. Project monitoring and control techniques are proposed. Resources required for projects are discussed.

  19. Other best-estimate code and methodology applications in addition to licensing

    Tanarro, A.

    1999-01-01

    Along with their applications for licensing purposes, best-estimate thermal-hydraulic codes allow for a wide scope of additional uses and applications in which results that are as realistic as possible are necessary. Although many of these applications have been successfully developed, the use of best-estimate codes for purposes other than those associated with licensing processes is not well known among the nuclear community. This paper describes some of these applications, briefly outlining their most significant and specific features. (Author)

  20. Application of Response Surface Methodology (RSM) for Optimization of Operating Parameters and Performance Evaluation of Cooling Tower Cold Water Temperature

    Ramkumar RAMAKRISHNAN

    2012-01-01

    The performance of a cooling tower was analyzed under various operating parameters to find the minimum cold water temperature. In this study, optimization of the operating parameters was investigated. An experimental design was carried out based on a central composite design (CCD) with response surface methodology (RSM). This paper presents the optimum operating parameters and the minimum cold water temperature obtained using the RSM method. The RSM was used to evaluate the effects of the operating variables and their interactions towards the attainment of their optimum conditions. Based on the analysis, air flow, hot water temperature and packing height had highly significant effects on cold water temperature. The optimum operating parameters were predicted using the RSM method and confirmed through experiment.
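
    A minimal sketch of the RSM step described above: fit a second-order polynomial (main effects, squares, and two-factor interactions) to designed runs and pick the best fitted setting. The three factor names follow the abstract, but the runs and coefficients below are synthetic placeholders, not the paper's measurements.

```python
import numpy as np
from itertools import combinations

# Fit a quadratic response surface to designed runs (synthetic data).

def quadratic_design_matrix(X):
    """Columns: 1, x_i, x_i^2, and all two-factor interactions x_i*x_j."""
    n, k = X.shape
    cols = [np.ones(n)] + [X[:, i] for i in range(k)] \
         + [X[:, i] ** 2 for i in range(k)] \
         + [X[:, i] * X[:, j] for i, j in combinations(range(k), 2)]
    return np.column_stack(cols)

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(20, 3))   # coded air flow, hot water T, packing height
y = 29 - 1.2 * X[:, 0] - 0.8 * X[:, 2] + 0.9 * X[:, 0] ** 2 \
    + 0.3 * X[:, 0] * X[:, 1] + rng.normal(0, 0.1, 20)   # cold water temp, degC

A = quadratic_design_matrix(X)
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
best = X[np.argmin(A @ coef)]          # best fitted run, in coded units
print("fitted minimum cold water temperature at coded factors:", np.round(best, 2))
```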

  1. Distributed computing methodology for training neural networks in an image-guided diagnostic application.

    Plagianakos, V P; Magoulas, G D; Vrahatis, M N

    2006-03-01

    Distributed computing is a process through which a set of computers connected by a network is used collectively to solve a single problem. In this paper, we propose a distributed computing methodology for training neural networks for the detection of lesions in colonoscopy. Our approach is based on partitioning the training set across multiple processors using a parallel virtual machine. In this way, interconnected computers of varied architectures can be used for the distributed evaluation of the error function and gradient values, and, thus, training neural networks utilizing various learning methods. The proposed methodology has large granularity and low synchronization, and has been implemented and tested. Our results indicate that the parallel virtual machine implementation of the training algorithms developed leads to considerable speedup, especially when large network architectures and training sets are used.
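
    A compact sketch of the data-partitioning idea, with Python multiprocessing standing in for the parallel virtual machine and a linear least-squares "network" standing in for the neural models: each worker returns the error and gradient for its shard, and the master sums them to take a training step.

```python
import numpy as np
from multiprocessing import Pool

# Distributed evaluation of the error function and gradient across shards.

def shard_gradient(args):
    """Squared-error loss and gradient for one shard of the training set."""
    w, X, y = args
    err = X @ w - y
    return X.T @ err, 0.5 * float(err @ err)

def distributed_step(w, shards, pool, n_total, lr=0.5):
    results = pool.map(shard_gradient, [(w, X, y) for X, y in shards])
    grad = sum(g for g, _ in results) / n_total   # mean gradient over all shards
    loss = sum(l for _, l in results) / n_total
    return w - lr * grad, loss

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X, y = rng.normal(size=(4000, 10)), rng.normal(size=4000)
    shards = [(X[i::4], y[i::4]) for i in range(4)]   # 4 "processors"
    w = np.zeros(10)
    with Pool(4) as pool:
        for _ in range(50):
            w, loss = distributed_step(w, shards, pool, len(y))
    print("final mean loss:", round(loss, 4))
```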

  2. Application of GO methodology in reliability analysis of offsite power supply of Daya Bay NPP

    Shen Zupei; Li Xiaodong; Huang Xiangrui

    2003-01-01

    The author applies the GO methodology to the reliability analysis of the offsite power supply system of the Daya Bay NPP. Direct quantitative calculation formulas for the steady-state reliability of a system with shared signals, and dynamic calculation formulas of the state probability for a unit with two states, are derived. A method for solving the fault event sets of the system is also presented, and all the fault event sets of the offsite power supply system and their failure probabilities are obtained. The restoration reliability of the offsite power supply system after a stability failure of the power grid is also calculated. The results show that the GO methodology is simple and useful for the steady-state and dynamic reliability analysis of repairable systems

  3. Application of the accident management information needs methodology to a severe accident sequence

    Ward, L.W.; Hanson, D.J.; Nelson, W.R.; Solberg, D.E.

    1989-01-01

    The U.S. Nuclear Regulatory Commission is conducting an accident management research program that emphasizes the use of severe accident research to enhance the ability of plant operating personnel to effectively manage severe accidents. Hence, it is necessary to ensure that plant instrumentation and information systems adequately provide the information needed by the operating staff during accident conditions. A methodology to identify and assess the information needs of the operating staff of a nuclear power plant during a severe accident has been developed. The methodology identifies (a) the information needs of the plant personnel during a wide range of accident conditions, (b) the existing plant measurements capable of supplying these information needs and minor additions to instrument and display systems that would enhance management capabilities, (c) measurement capabilities and limitations during severe accident conditions, and (d) areas in which the information systems could mislead plant personnel

  4. Application of the accident management information needs methodology to a severe accident sequence

    Ward, L.W.; Hanson, D.J.; Nelson, W.R. (Idaho National Engineering Laboratory, Idaho Falls (USA)); Solberg, D.E. (Nuclear Regulatory Commission, Washington, DC (USA))

    1989-11-01

    The U.S. Nuclear Regulatory Commission is conducting an accident management research program that emphasizes the use of severe accident research to enhance the ability of plant operating personnel to effectively manage severe accidents. Hence, it is necessary to ensure that plant instrumentation and information systems adequately provide the information needed by the operating staff during accident conditions. A methodology to identify and assess the information needs of the operating staff of a nuclear power plant during a severe accident has been developed. The methodology identifies (a) the information needs of the plant personnel during a wide range of accident conditions, (b) the existing plant measurements capable of supplying these information needs and minor additions to instrument and display systems that would enhance management capabilities, (c) measurement capabilities and limitations during severe accident conditions, and (d) areas in which the information systems could mislead plant personnel.

  5. Systematic Review of the Application of Lean and Six Sigma Quality Improvement Methodologies in Radiology.

    Amaratunga, Thelina; Dobranowski, Julian

    2016-09-01

    Preventable yet clinically significant rates of medical error remain systemic, while health care spending is at a historic high. Industry-based quality improvement (QI) methodologies show potential for utility in health care and radiology because they use an empirical approach to reduce variability and improve workflow. The aim of this review was to systematically assess the literature with regard to the use and efficacy of Lean and Six Sigma (the most popular of the industrial QI methodologies) within radiology. MEDLINE, the Allied & Complementary Medicine Database, Embase Classic + Embase, Health and Psychosocial Instruments, and the Ovid HealthStar database, alongside the Cochrane Library databases, were searched in June 2015. Empirical studies in peer-reviewed journals were included if they assessed the use of Lean, Six Sigma, or Lean Six Sigma with regard to their ability to improve a variety of quality metrics in a radiology-centered clinical setting. Of the 278 articles returned, 23 studies were suitable for inclusion. Of these, 10 assessed Six Sigma, 7 assessed Lean, and 6 assessed Lean Six Sigma. The diverse range of measured outcomes can be organized into 7 common aims: cost savings, reducing appointment wait time, reducing in-department wait time, increasing patient volume, reducing cycle time, reducing defects, and increasing staff and patient safety and satisfaction. All of the included studies demonstrated improvements across a variety of outcomes. However, there were high rates of systematic bias and imprecision as per the Grading of Recommendations Assessment, Development and Evaluation guidelines. Lean and Six Sigma QI methodologies have the potential to reduce error and costs and improve quality within radiology. However, there is a pressing need to conduct high-quality studies in order to realize the true potential of these QI methodologies in health care and radiology. Recommendations on how to improve the quality of the literature are proposed.

  6. Prediction Methodology for Proton Single Event Burnout: Application to a STRIPFET Device

    Siconolfi, Sara; Oser, Pascal; Spiezia, Giovanni; Hubert, Guillaume; David, Jean-Pierre

    2015-01-01

    This paper presents a single event burnout (SEB) sensitivity characterization for power MOSFETs, independent of testing, through a prediction model derived from TCAD analysis and knowledge of the device topology. The methodology is applied to a STRIPFET device and compared to proton data obtained at PSI, showing good agreement in the order of magnitude of the proton SEB cross section, and thus validating the prediction model as an alternative device characterization with respect to SEB.

  7. Multi-objective and multi-physics optimization methodology for SFR core: application to CFV concept

    Fabbris, Olivier

    2014-01-01

    Nuclear reactor core design is a highly multidisciplinary task in which neutronics, thermal-hydraulics, fuel thermo-mechanics and the fuel cycle are involved. The problem is moreover multi-objective (several performance measures) and highly dimensional (several tens of design parameters). As the reference deterministic calculation codes for core characterization require significant computing resources, the classical design method is not well suited to investigating and optimizing new innovative core concepts. To cope with these difficulties, a new methodology has been developed in this thesis. Our work is based on the development and validation of simplified neutronics and thermal-hydraulics calculation schemes allowing the full characterization of a sodium-cooled fast reactor core regarding both neutronics performance and behavior during thermal-hydraulic dimensioning transients. The developed methodology uses surrogate models (or meta-models) able to replace the neutronics and thermal-hydraulics calculation chain. Advanced mathematical methods for the design of experiments and the building and validation of meta-models allow this calculation chain to be substituted by regression models with high prediction capability. The methodology is applied over a very large design space to a challenging core called CFV (French acronym for low void effect core) with a large gain on the sodium void effect. Global sensitivity analysis identifies the design parameters significant for the core design and its behavior during unprotected transients, which can lead to severe accidents. Multi-objective optimizations lead to alternative core configurations with significantly improved performances. Validation results demonstrate the relevance of the methodology at the pre-design stage of a sodium-cooled fast reactor core. (author) [fr
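
    As a generic illustration of the meta-model idea (not the thesis's specific neutronics or thermal-hydraulics chain), the sketch below trains a Gaussian-process surrogate on a small design of experiments over a stand-in "expensive" function, then queries it cheaply with an error estimate.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Replace an expensive calculation chain by a GP surrogate (illustrative).

def expensive_simulator(x):
    """Stand-in for a coupled neutronics/thermal-hydraulics run."""
    return np.sin(3 * x[:, 0]) + 0.5 * x[:, 1] ** 2

rng = np.random.default_rng(2)
X_train = rng.uniform(-1, 1, size=(40, 2))     # design of experiments
y_train = expensive_simulator(X_train)

gp = GaussianProcessRegressor(ConstantKernel() * RBF(length_scale=0.5),
                              normalize_y=True).fit(X_train, y_train)

X_new = rng.uniform(-1, 1, size=(5, 2))
mean, std = gp.predict(X_new, return_std=True)  # fast prediction + error bar
print(np.c_[mean, std, expensive_simulator(X_new)])
```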

  8. The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology: Capabilities and Applications

    Evers, Ken H.; Bachert, Robert F.

    1987-01-01

    The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology has been formulated and applied over a five-year period. It has proven to be a unique, integrated approach utilizing a top-down, structured technique to define and document the system of interest; a knowledge engineering technique to collect and organize system descriptive information; a rapid prototyping technique to perform preliminary system performance analysis; and a sophisticated simulation technique to perform in-depth system performance analysis.

  9. Adjustment methodology for preliminary study on the distribution of bone tissue boron. Potential therapeutic applications

    Brandizzi, D; Dagrosa, A; Carpano, M.; Olivera, M. S.; Nievas, S; Cabrini, R.L.

    2013-01-01

    Boron is an element with an affinity for bone tissue and is considered relevant to bone health. Boron compounds are used in Boron Neutron Capture Therapy (BNCT) in the form of sodium borocaptate (BSH) and boronophenylalanine (BPA). The results of clinical trials to date are encouraging but not conclusive. At the experimental level, some groups have applied BNCT to osteosarcomas. We present preliminary methodological adjustments for determining the presence of boron in bone. (author)

  10. APPLICATION OF LOT QUALITY ASSURANCE SAMPLING FOR ASSESSING DISEASE CONTROL PROGRAMMES - EXAMINATION OF SOME METHODOLOGICAL ISSUES

    T. R. RAMESH RAO

    2011-01-01

    Lot Quality Assurance Sampling (LQAS), a statistical tool from industrial settings, has been in use since 1980 for the monitoring and evaluation of programs on disease control, immunization status among children, and health workers' performance in health systems. While conducting LQAS in the field, there are occasions when, even after due care in design, practical and methodological issues must be addressed before it can be recommended for implementation and intervention. LQAS is applied under the assumpti...

  11. Methodology on quantification of sonication duration for safe application of MR guided focused ultrasound for liver tumour ablation.

    Mihcin, Senay; Karakitsios, Ioannis; Le, Nhan; Strehlow, Jan; Demedts, Daniel; Schwenke, Michael; Haase, Sabrina; Preusser, Tobias; Melzer, Andreas

    2017-12-01

    Magnetic Resonance guided Focused Ultrasound (MRgFUS) for liver tumour ablation is a challenging task due to motion caused by breathing and occlusion by the ribcage between the transducer and the tumour. To overcome these challenges, a novel system for liver tumour ablation during free breathing has been designed. The novel TRANS-FUSIMO Treatment System (TTS, EU FP7) interacts with a Magnetic Resonance (MR) scanner and a focused ultrasound transducer to sonicate a moving target in the liver. To meet the requirements of ISO 13485, a quality management system for medical device design, the system needs to be tested for certain process parameters. The duration of sonication, and the delay after the sonication button is activated, are among the parameters that need to be quantified for efficient and safe ablation of tumour tissue. A novel methodology was developed to quantify these process parameters. A computerised scope was programmed in LabVIEW to collect data via a hydrophone; the coordinates of a fiber-optic sensor assembly, embedded in a degassed water tank via a sensor assembly holder, were fed into the TRANS-FUSIMO treatment software via Magnetic Resonance Imaging (MRI) so as to sonicate the tip of the sensor, which was synchronised with the clock of the scope. The sonications were executed at 50 W, 100 W, and 150 W for 10 s, thirty times by two independent operators, to quantify the actual sonication duration and the delay after the emergency stop. The deviation of the system from the predefined specifications was calculated. Student's t-test was used to investigate user dependency. The duration of sonication and the delay after the sonication were quantified successfully with the developed method. TTS can sonicate with a maximum deviation of 0.16 s (SD 0.32) from the planned duration and with a delay of 14 ms (SD 0.14) for the emergency stop. Student's t-tests indicate that the results do not depend on the operators (p > .05). The evidence obtained via this

  12. Application of the risk-informed methodology for APR1400 P-T limits curve

    Kim, K.; Namgung, I. [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2014-07-01

    A reactor pressure vessel (RPV) in a nuclear power plant has operational limits on pressure and temperature to prevent the potential rapid propagation of cracks due to brittle fracture, expressed as a pressure-temperature limits curve (P-T limits curve). Appendix G of the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code, Section XI, provides deterministic procedures to develop the P-T limits curve for the reactor pressure vessel. Recently, an alternative risk-informed methodology has been added to the ASME Code; risk-informed means that insights from a probabilistic risk assessment can be taken into account. This alternative methodology provides a simple procedure to develop risk-informed P-T limits for heat-up, cool-down, and hydrostatic test events. The risk-informed P-T limits curve is known to provide more operational flexibility, particularly for reactor pressure vessels with relatively high irradiation levels and radiation-sensitive materials. In this paper, we developed both a deterministic and a risk-informed P-T limits curve for an APR1400 reactor using Appendix G of the ASME Code, Section XI, and compare the results in terms of additional operational margin. (author)
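
    For orientation, the sketch below shows the shape of a deterministic P-T limit calculation: an Appendix G-style lower-bound K_Ic reference curve, inverted through a deliberately simplified flaw and hoop-stress model to obtain an allowable pressure at each temperature. The flaw geometry factor, safety factor, and vessel dimensions are textbook placeholders, not the APR1400 inputs or the full Appendix G procedure.

```python
import numpy as np

# Simplified deterministic P-T limit illustration (not the full procedure).

def k_ic(T, RT_NDT):
    """Appendix G-style lower-bound fracture toughness, ksi*sqrt(in)."""
    return 33.2 + 20.734 * np.exp(0.02 * (T - RT_NDT))

def allowable_pressure(T, RT_NDT, R_i=90.0, t=8.0, a=2.0, SF=2.0):
    """Pressure (ksi) keeping SF * K_I(membrane) below K_Ic for a postulated
    quarter-thickness surface flaw of depth a (in); crude K_I = 1.1*sigma*sqrt(pi*a)."""
    sigma_allow = k_ic(T, RT_NDT) / (SF * 1.1 * np.sqrt(np.pi * a))
    return sigma_allow * t / R_i               # thin-shell hoop-stress inversion

T = np.linspace(70, 300, 5)                    # coolant temperature, degF
print(np.c_[T, allowable_pressure(T, RT_NDT=100.0)])
```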

  13. A combined stochastic feedforward and feedback control design methodology with application to autoland design

    Halyo, Nesim

    1987-01-01

    A combined stochastic feedforward and feedback control design methodology was developed. The objective of the feedforward control law is to track the commanded trajectory, whereas the feedback control law tries to maintain the plant state near the desired trajectory in the presence of disturbances and uncertainties about the plant. The feedforward control law design is formulated as a stochastic optimization problem and is embedded into the stochastic output feedback problem, where the plant contains unstable and uncontrollable modes. An algorithm to compute the optimal feedforward is developed. In this approach, the use of error integral feedback, dynamic compensation, and control rate command structures is an integral part of the methodology. An incremental implementation is recommended. Results on the eigenvalues of the implemented versus designed control laws are presented. The stochastic feedforward/feedback control methodology is used to design a digital automatic landing system for the ATOPS Research Vehicle, a Boeing 737-100 aircraft. The system control modes include localizer and glideslope capture and track, and flare to touchdown. Results of a detailed nonlinear simulation of the digital control laws, actuator systems, and aircraft aerodynamics are presented.
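
    A toy sketch of the combined structure (not the paper's stochastic formulation or the 737-100 model): a feedforward input computed from the commanded trajectory plus an LQR feedback correction around it. The plant matrices and weights below are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import solve_discrete_are

# Feedforward (trajectory tracking) + feedback (disturbance rejection) sketch.

A = np.array([[1.0, 0.1], [0.0, 0.98]])   # toy discrete-time plant
B = np.array([[0.0], [0.1]])
Q, R = np.diag([10.0, 1.0]), np.array([[1.0]])

P = solve_discrete_are(A, B, Q, R)
K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)   # discrete LQR feedback gain

x = np.array([0.0, 0.0])
for k in range(50):
    x_cmd = np.array([np.sin(0.1 * k), 0.1 * np.cos(0.1 * k)])          # command
    x_cmd_next = np.array([np.sin(0.1 * (k + 1)), 0.1 * np.cos(0.1 * (k + 1))])
    # feedforward: least-squares input that moves the command forward one step
    u_ff, *_ = np.linalg.lstsq(B, x_cmd_next - A @ x_cmd, rcond=None)
    u = u_ff - K @ (x - x_cmd)                   # add feedback correction
    x = A @ x + B @ u
print("final tracking error:", np.round(x - x_cmd_next, 3))
```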

  14. Methodology Development and Applications of Proliferation Resistance and Physical Protection Evaluation

    Bari, R.A.; Peterson, P.F.; Therios, I.U.; Whitlock, J.J.

    2010-01-01

    We present an overview of the program on the evaluation methodology for proliferation resistance and physical protection (PR and PP) of advanced nuclear energy systems (NESs) sponsored by the Generation IV International Forum (GIF). For a proposed NES design, the methodology defines a set of challenges, analyzes system response to these challenges, and assesses outcomes. The challenges to the NES are the threats posed by potential actors (proliferant States or sub-national adversaries). The characteristics of Generation IV systems, both technical and institutional, are used to evaluate the response of the system and to determine its resistance against proliferation threats and robustness against sabotage and terrorism threats. The outcomes of the system response are expressed in terms of a set of measures, which are the high-level PR and PP characteristics of the NES. The methodology is organized to allow evaluations to be performed at the earliest stages of system design and to become more detailed and more representative as the design progresses. It can thus be used to enable a program in safeguards by design or to enhance the conceptual design process of an NES with regard to intrinsic features for PR and PP.

  15. Development and Application of Urban Landslide Vulnerability Assessment Methodology Reflecting Social and Economic Variables

    Yoonkyung Park

    2016-01-01

    An urban landslide vulnerability assessment methodology is proposed with a major focus on urban social and economic aspects. The proposed methodology was developed based on the landslide susceptibility maps that the Korean Forest Service uses to identify landslide source areas. First, debris flows are propagated to urban areas from such source areas by Flow-R (flow path assessment of gravitational hazards at a regional scale), and then urban vulnerability is assessed in two categories: physical and socioeconomic. The physical vulnerability is related to buildings that can be impacted by a landslide event. This study considered two common building structure types, reinforced-concrete frame and non-reinforced-concrete frame, to assess the physical vulnerability. The socioeconomic vulnerability is considered a function of the resistance levels of the vulnerable people, the trigger factors for secondary damage, and the preparedness level of the local government. An index-based model is developed to evaluate the loss of life and indirect damage under landslides as well as the resilience against disasters. To illustrate the validity of the proposed methodology, physical and socioeconomic vulnerability levels are analyzed for Seoul, Korea, using the suggested approach. The general trend found in this study indicates that areas with higher population density and weaker fiscal conditions located downstream of mountainous areas are more vulnerable than areas in the opposite conditions.
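
    A toy version of an index-based vulnerability model of the kind described: min-max normalize each indicator, orient it so that larger means more vulnerable, and combine with weights. The indicator names, weights, and district values are assumptions for illustration, not the study's actual index.

```python
import numpy as np

# Weighted index-based vulnerability score over synthetic districts.

indicators = {
    "population_density": np.array([12000.0, 4000.0, 8000.0]),
    "elderly_fraction":   np.array([0.18, 0.10, 0.14]),
    "fiscal_capacity":    np.array([0.4, 0.9, 0.6]),   # higher = more resilient
}
weights = {"population_density": 0.4, "elderly_fraction": 0.3, "fiscal_capacity": 0.3}

def minmax(x):
    """Scale an indicator to [0, 1]."""
    return (x - x.min()) / (x.max() - x.min())

score = (weights["population_density"] * minmax(indicators["population_density"])
       + weights["elderly_fraction"] * minmax(indicators["elderly_fraction"])
       + weights["fiscal_capacity"] * (1.0 - minmax(indicators["fiscal_capacity"])))
print("vulnerability index by district:", np.round(score, 2))
```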

  16. Influence of activated carbon characteristics on toluene and hexane adsorption: Application of surface response methodology

    Izquierdo, Mª Teresa; de Yuso, Alicia Martínez; Valenciano, Raquel; Rubio, Begoña; Pino, Mª Rosa

    2013-01-01

    The objective of this study was to evaluate the adsorption capacity of toluene and hexane over activated carbons prepared according to an experimental design, considering as variables the activation temperature, the impregnation ratio and the activation time. The response surface methodology was applied to optimize the adsorption capacity of the carbons with regard to the preparation conditions that determine the physicochemical characteristics of the activated carbons. The preparation methodology produced activated carbons with surface areas and micropore volumes as high as 1128 m2/g and 0.52 cm3/g, respectively. Moreover, the activated carbons exhibit mesoporosity, with the percentage of microporosity ranging from 64.6% to 89.1%. The surface chemistry was characterized by TPD, FTIR and acid-base titration; the different techniques yielded different values of surface groups because of the limitations of each technique, but similar trends were obtained for the activated carbons studied. The exhaustive characterization of the activated carbons shows that the measured surface area does not explain the adsorption capacity for either toluene or n-hexane. On the other hand, the surface chemistry does not explain the adsorption results either. A compromise between physical and chemical characteristics can be obtained from the appropriate activation conditions, and the response surface methodology gives the optimal activated carbon to maximize adsorption capacity. A low activation temperature and an intermediate impregnation ratio lead to high toluene and n-hexane adsorption capacities depending on the activation time, which is a determining factor in maximizing toluene adsorption.

  17. High-frequency measurements of aeolian saltation flux: Field-based methodology and applications

    Martin, Raleigh L.; Kok, Jasper F.; Hugenholtz, Chris H.; Barchyn, Thomas E.; Chamecki, Marcelo; Ellis, Jean T.

    2018-02-01

    Aeolian transport of sand and dust is driven by turbulent winds that fluctuate over a broad range of temporal and spatial scales. However, commonly used aeolian transport models do not explicitly account for such fluctuations, likely contributing to substantial discrepancies between models and measurements. Underlying this problem is the absence of accurate sand flux measurements at the short time scales at which wind speed fluctuates. Here, we draw on extensive field measurements of aeolian saltation to develop a methodology for generating high-frequency (up to 25 Hz) time series of total (vertically-integrated) saltation flux, namely by calibrating high-frequency (HF) particle counts to low-frequency (LF) flux measurements. The methodology follows four steps: (1) fit exponential curves to vertical profiles of saltation flux from LF saltation traps, (2) determine empirical calibration factors through comparison of LF exponential fits to HF number counts over concurrent time intervals, (3) apply these calibration factors to subsamples of the saltation count time series to obtain HF height-specific saltation fluxes, and (4) aggregate the calibrated HF height-specific saltation fluxes into estimates of total saltation fluxes. When coupled to high-frequency measurements of wind velocity, this methodology offers new opportunities for understanding how aeolian saltation dynamics respond to variability in driving winds over time scales from tens of milliseconds to days.
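
    The four steps lend themselves to a compact numerical sketch: fit the exponential flux profile to low-frequency trap data, derive a calibration factor for the high-frequency counter, scale the counts, and integrate vertically. Heights, count rates, and fluxes below are synthetic placeholders, not the field data.

```python
import numpy as np
from scipy.optimize import curve_fit

# Calibrating HF particle counts to LF trap fluxes (synthetic example).

q_profile = lambda z, q0, zq: q0 * np.exp(-z / zq)        # step 1: profile model

z_traps = np.array([0.05, 0.10, 0.20, 0.40])              # LF trap heights, m
q_traps = np.array([8.0, 5.0, 2.1, 0.35])                 # LF fluxes, g/m^2/s
(q0, zq), _ = curve_fit(q_profile, z_traps, q_traps, p0=(10.0, 0.1))

z_hf = 0.08                                               # HF sensor height, m
n_hf = np.array([120.0, 90.0, 150.0, 60.0])               # mean counts/s per interval
cal = q_profile(z_hf, q0, zq) / n_hf.mean()               # step 2: flux per count/s

rng = np.random.default_rng(3)
counts_25hz = rng.poisson(n_hf.mean() / 25.0, size=250)   # 10 s of 25 Hz counts
q_hf = cal * counts_25hz * 25.0                           # step 3: HF flux at z_hf
Q_total = q_hf / q_profile(z_hf, q0, zq) * (q0 * zq)      # step 4: integrate profile
print(f"zq = {zq:.3f} m, mean total flux = {Q_total.mean():.2f} g/m/s")
```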

  18. Methodology for Computational Fluid Dynamic Validation for Medical Use: Application to Intracranial Aneurysm.

    Paliwal, Nikhil; Damiano, Robert J; Varble, Nicole A; Tutino, Vincent M; Dou, Zhongwang; Siddiqui, Adnan H; Meng, Hui

    2017-12-01

    Computational fluid dynamics (CFD) is a promising tool to aid in clinical diagnoses of cardiovascular diseases. However, it uses assumptions that simplify the complexities of the real cardiovascular flow. Given the high stakes in the clinical setting, it is critical to calculate the effect of these assumptions on the CFD simulation results. However, existing CFD validation approaches do not quantify the error in the simulation results due to the CFD solver's modeling assumptions. Instead, they directly compare CFD simulation results against validation data. Thus, to quantify the accuracy of a CFD solver, we developed a validation methodology that calculates the CFD model error (arising from modeling assumptions). Our methodology identifies independent error sources in CFD and validation experiments, and calculates the model error by parsing out other sources of error inherent in simulation and experiments. To demonstrate the method, we simulated the flow field of a patient-specific intracranial aneurysm (IA) in the commercial CFD software star-ccm+. Particle image velocimetry (PIV) provided validation datasets for the flow field on two orthogonal planes. The average model error in the star-ccm+ solver was 5.63 ± 5.49% along the intersecting validation line of the orthogonal planes. Furthermore, we demonstrated that our validation method is superior to existing validation approaches by applying three representative existing validation techniques to our CFD and experimental dataset and comparing the validation results. Our validation methodology offers a streamlined workflow to extract the "true" accuracy of a CFD solver.
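
    The parsing-out idea can be illustrated with standard validation bookkeeping (in the spirit of ASME V&V 20, used here as a stand-in for the paper's own derivation): the simulation-experiment comparison error mixes the model error with numerical, input, and measurement uncertainties, which are assumed independent below. All numbers are illustrative.

```python
import numpy as np

# Isolating model error from other error sources at one validation point.

S = 0.42          # simulated velocity, m/s
D = 0.45          # PIV-measured velocity, m/s
u_num = 0.008     # numerical (discretization) uncertainty, m/s
u_input = 0.010   # geometry/boundary-condition uncertainty, m/s
u_D = 0.012       # PIV measurement uncertainty, m/s

E = S - D                                          # comparison error
u_val = np.sqrt(u_num**2 + u_input**2 + u_D**2)    # non-model uncertainty
print(f"comparison error E = {E:+.3f} m/s")
print(f"model error lies in [{E - u_val:+.3f}, {E + u_val:+.3f}] m/s (approx.)")
```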

  19. Application of fault tree methodology in the risk analysis of complex systems

    Vasconcelos, V. de.

    1984-01-01

    This study describes the fault tree methodology and applies it to the risk assessment of complex facilities. In the description of the methodology, an attempt has been made to provide all the pertinent basic information, pointing out its most important aspects such as fault tree construction, evaluation techniques, and their use in the risk and reliability assessment of a system. In view of their importance, topics like common mode failures, human errors, the data bases used in the calculations, and the uncertainty evaluation of the results are discussed separately, each one in its own chapter. For the purpose of applying the methodology, it was necessary to implement computer codes normally used for this kind of analysis. The computer codes PREP, KITT and SAMPLE, written in FORTRAN IV, were chosen due to their availability and to the fact that they have been used in important studies in the nuclear area, such as WASH-1400. With these codes, the probability of occurrence of excessive pressure in the main system of the component test loop - CTC, of CDTN, was evaluated. (Author) [pt
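
    A minimal sketch of how a fault tree is quantified once its minimal cut sets are known, using the min-cut upper bound that kinetic tree codes such as KITT also evaluate. The basic events, probabilities, and tree below are illustrative, not the CTC loop model.

```python
from math import prod

# Top-event probability from minimal cut sets (illustrative tree).

basic_p = {"pump_fails": 3e-3, "valve_stuck": 1e-3,
           "sensor_bias": 5e-4, "operator_err": 1e-2}

# top event = (pump_fails AND operator_err) OR (valve_stuck AND sensor_bias)
cut_sets = [("pump_fails", "operator_err"), ("valve_stuck", "sensor_bias")]

p_cut = [prod(basic_p[e] for e in cs) for cs in cut_sets]
upper_bound = 1 - prod(1 - p for p in p_cut)   # min-cut upper bound
print(f"P(top event) <= {upper_bound:.3e}")
```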

  20. Level II Probabilistic Safety Analysis Methodology for the Application to GEN-IV Sodium-cooled Fast Reactor

    Park, S. Y.; Kim, T. W.; Han, S. H.; Jeong, H. Y.

    2010-03-01

    The Korea Atomic Energy Research Institute (KAERI) has been developing liquid metal reactor (LMR) design technologies under a National Nuclear R&D Program. Nevertheless, there is no domestic experience with probabilistic safety assessment (PSA) for a fast reactor with metal fuel. Therefore, the objective of this study is to establish risk assessment methodologies for the reference design of the GEN-IV sodium-cooled fast reactor (SFR). The applicability of the PSA methodology of the U.S. NRC and the PRISM plant to the domestic GEN-IV SFR has been studied. The study contains a plant damage state analysis, a containment event tree analysis, and a source-term release category binning process.

  1. Methodology for the economic evaluation of the application of wind energy and photovoltaic farms in seawater desalination

    Cisneros Ramirez, Cesar A.

    2007-01-01

    The methodology presented allows a preliminary evaluation of the cost of desalinated seawater ($/m3) for a system not connected to the grid, fed with renewable energy (wind and a photovoltaic farm) or with an electric generator. Production capacities are limited to 100 m3/d. The desalination plant can be fed by a single energy source or by more than one of them, the latter constituting a hybrid supply system. In all cases the need for energy storage by means of batteries was considered, except when the supply was from an electric generator. An annex presents a table with the results of applying the methodology
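
    An illustrative levelized-cost-of-water calculation of the kind such a preliminary evaluation performs: annualize the capital cost with a capital recovery factor, add yearly O&M, and divide by the annual water production. Every number below is a placeholder assumption, not a value from the paper.

```python
# Levelized cost of water for a small off-grid desalination plant (sketch).

def crf(i, n):
    """Capital recovery factor for discount rate i over n years."""
    return i * (1 + i) ** n / ((1 + i) ** n - 1)

capex = 250_000.0        # plant + wind/PV + batteries, $
opex = 12_000.0          # yearly O&M, $
capacity = 50.0          # m3/day
availability = 0.90      # fraction of the year actually producing

annual_m3 = capacity * 365 * availability
lcow = (capex * crf(0.08, 20) + opex) / annual_m3
print(f"cost of water ~ {lcow:.2f} $/m3")
```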

  2. A methodology for accident analysis of fusion breeder blankets and its application to helium-cooled lead–lithium blanket

    Panayotov, Dobromir; Poitevin, Yves; Grief, Andrew; Trow, Martin; Dillistone, Michael

    2016-01-01

    'Fusion for Energy' (F4E) is designing, developing, and implementing the European Helium-Cooled Lead-Lithium (HCLL) and Helium-Cooled Pebble-Bed (HCPB) Test Blanket Systems (TBSs) for ITER (Nuclear Facility INB-174). Safety demonstration is an essential element for the integration of these TBSs into ITER and accident analysis is one of its critical components. A systematic approach to accident analysis has been developed under the F4E contract on TBS safety analyses. F4E technical requirements, together with Amec Foster Wheeler and INL efforts, have resulted in a comprehensive methodology for fusion breeding blanket accident analysis that addresses the specificity of the breeding blanket designs, materials, and phenomena while remaining consistent with the approach already applied to ITER accident analyses. Furthermore, the methodology phases are illustrated in the paper by its application to the EU HCLL TBS using both MELCOR and RELAP5 codes.

  3. Definition of a methodology for the management of geological heritage. An application to the Azores archipelago (Portugal)

    Lima, Eva; Nunes, João; Brilha, José; Calado, Helena

    2013-04-01

    The conservation of geological heritage requires the support of appropriate policies, which should result from the integration of nature conservation, environmental and land-use planning, and environmental education perspectives. There are several papers on inventory methodologies for geological heritage and its scientific, educational and tourism uses (e.g. Cendrero, 2000; Lago et al., 2000; Brilha, 2005; Carcavilla et al., 2007). However, management methodologies for geological heritage are still poorly developed. They should be included in environmental and land-use planning and nature conservation policies in order to support a holistic approach to natural heritage. This gap is explained by the fact that geoconservation is a young geoscience still in need of more basic scientific research, like any other geoscience (Henriques et al., 2011). It is necessary to establish protocols and mechanisms for the conservation and management of geological heritage. This is a complex type of management because it needs to address not only the fragile natural features to be preserved but also legal, economic, cultural, educational and recreational aspects. In addition, a management methodology should ensure geosite conservation, local development and the dissemination of the geological heritage (Carcavilla et al., 2007). This work is part of a PhD project aiming to fill this gap in the geoconservation domain, specifically by establishing an appropriate methodology for the management of geological heritage, taking into account the natural diversity of geosites and the variety of natural and anthropic threats. The proposed methodology will be applied to the geological heritage of the Azores archipelago, whose management acquires particular importance and urgency after the decision of the Regional Government to create the Azores Geopark and its application to the European and Global Geoparks Networks. Acknowledgment This work is

  4. Safety Assessment Methodologies and Their Application in Development of Near Surface Waste Disposal Facilities--ASAM Project

    Batandjieva, B.; Metcalf, P.

    2003-01-01

    The safety of near surface disposal facilities is a primary focus and objective of stakeholders involved in the radioactive waste management of low and intermediate level waste, and safety assessment is an important tool contributing to the evaluation and demonstration of the overall safety of these facilities. It plays a significant role in the different stages of development of these facilities (site characterization, design, operation, closure), especially for facilities for which a safety assessment has not been performed, safety has not yet been demonstrated, or the future has not been decided. Safety assessments also create the basis for the safety arguments presented to nuclear regulators, the public and other interested parties in respect of the safety of existing facilities, the measures to upgrade existing facilities, and the development of new facilities. The International Atomic Energy Agency (IAEA) has initiated a number of coordinated research projects in the field of development and improvement of approaches and methodologies for the safety assessment of near surface disposal facilities, such as the NSARS (Near Surface Radioactive Waste Disposal Safety Assessment Reliability Study) and ISAM (Improvement of Safety Assessment Methodologies for Near Surface Disposal Facilities) projects. These projects were very successful and showed that there is a need to promote the consistent application of safety assessment methodologies and to explore approaches to the regulatory review of safety assessments and safety cases in order to make safety-related decisions. These objectives form the basis of the IAEA follow-up coordinated research project ASAM (Application of Safety Assessment Methodologies for Near Surface Disposal Facilities), which will commence in November 2002 and continue for a period of three years

  5. Achieving 95% probability level using best estimate codes and the code scaling, applicability and uncertainty (CSAU) methodology

    Wilson, G.E.; Boyack, B.E.; Duffey, R.B.; Griffith, P.; Katsma, K.R.; Lellouche, G.S.; Rohatgi, U.S.; Wulff, W.; Zuber, N.

    1988-01-01

    Issuance of a revised rule for loss-of-coolant accident/emergency core cooling system (LOCA/ECCS) analysis of light water reactors will allow the use of best estimate (BE) computer codes in safety analysis, with uncertainty analysis. This paper describes a systematic methodology, CSAU (Code Scaling, Applicability and Uncertainty), which will provide uncertainty bounds in a cost-effective, auditable, rational and practical manner. 8 figs., 2 tabs
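
    One widely used way to turn a limited number of best-estimate code runs into a statement at the 95% probability level is the order-statistics (Wilks) argument; the sketch below computes the familiar run count. This illustrates the statistical idea only, offered as background rather than as the CSAU procedure itself (the CSAU demonstration relied on response-surface techniques).

```python
# Smallest number of code runs whose maximum bounds the 95th percentile
# with 95% confidence (one-sided, first-order Wilks formula).

def wilks_sample_size(coverage=0.95, confidence=0.95):
    """Smallest n with 1 - coverage**n >= confidence."""
    n = 1
    while 1.0 - coverage ** n < confidence:
        n += 1
    return n

print(wilks_sample_size())   # -> 59 runs for a one-sided 95/95 bound
```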

  6. History, applications, methodological issues and perspectives for the use of environmental DNA (eDNA) in marine and freshwater environments.

    Díaz-Ferguson, Edgardo E; Moyer, Gregory R

    2014-12-01

    Genetic material (short DNA fragments) left behind by species in nonliving components of the environment (e.g. soil, sediment, or water) is defined as environmental DNA (eDNA). This DNA was previously described as particulate DNA and has been used to detect and describe microbial communities in marine sediments since the mid-1980s and phytoplankton communities in the water column since the early 1990s. More recently, eDNA has been used to monitor invasive or endangered vertebrate and invertebrate species. While there is a steady increase in the applicability of eDNA as a monitoring tool, a variety of eDNA applications are emerging in fields such as forensics, population and community ecology, and taxonomy. This review provides scientists with an understanding of the methods underlying eDNA detection as well as applications, key methodological considerations, and emerging areas of interest for its use in the ecology and conservation of freshwater and marine environments.

  7. Parametric Model for Astrophysical Proton-Proton Interactions and Applications

    Karlsson, Niklas [KTH Royal Institute of Technology, Stockholm (Sweden)

    2007-01-01

    Observations of gamma-rays have been made from celestial sources such as active galaxies, gamma-ray bursts and supernova remnants as well as the Galactic ridge. The study of gamma rays can provide information about production mechanisms and cosmic-ray acceleration. In the high-energy regime, one of the dominant mechanisms for gamma-ray production is the decay of neutral pions produced in interactions of ultra-relativistic cosmic-ray nuclei and interstellar matter. Presented here is a parametric model for calculations of inclusive cross sections and transverse momentum distributions for secondary particles--gamma rays, $e^\pm$, $\nu_e$, $\bar{\nu}_e$, $\nu_\mu$ and $\bar{\nu}_\mu$--produced in proton-proton interactions. This parametric model is based on the proton-proton interaction model proposed by Kamae et al.; it includes the diffraction dissociation process, Feynman-scaling violation and the logarithmically rising inelastic proton-proton cross section. To improve fidelity to experimental data at lower energies, two baryon resonance excitation processes were added: one representing the Δ(1232) and the other multiple resonances with masses around 1600 MeV/c2. The model predicts the power-law spectral index for all secondary particles to be about 0.05 lower in absolute value than that of the incident proton, and their inclusive cross sections to be larger than those predicted by previous models based on the Feynman-scaling hypothesis. The applications of the presented model in astrophysics are plentiful. It has been implemented into the Galprop code to calculate the contribution due to pion decays in the Galactic plane. The model has also been used to estimate the cosmic-ray flux in the Large Magellanic Cloud based on HI, CO and gamma-ray observations. The transverse momentum distributions enable calculations when the proton distribution is anisotropic. It is shown that the gamma-ray spectrum and flux due to a

  8. THE PROPOSED METHODOLOGIES FOR THE SIX SIGMA METHOD AND TQM STRATEGY AS WELL AS THEIR APPLICATION IN PRACTICE IN MACEDONIA

    Elizabeta Mitreva

    2014-05-01

    This paper presents proposed methodologies for the Six Sigma method and the TQM strategy as well as their application in practice in Macedonia. Although the philosophy of total quality management (TQM) is deeply embedded in many industries and business areas of European and other countries, it is insufficiently known and present in our country and other developing countries. The same applies to the Six Sigma approach to reducing the dispersion of a process, which is present in only a small fraction of Macedonian companies. The results of the implementation have shown that the application of the Six Sigma approach is not about the number of defects per million opportunities as such, but about the systematic and systemic lowering of process dispersion. The operation and effect of implementing the Six Sigma method engages experts who receive a salary depending on the success of the Six Sigma program. On the other hand, the results of the application of the TQM methodology within Macedonian companies will depend on the commitment of all employees and their motivation.

  9. Quantitative evaluation of geodiversity: development of methodological procedures with application to territorial management

    Forte, J.; Brilha, J.; Pereira, D.; Nolasco, M.

    2012-04-01

    Although geodiversity is considered the setting for biodiversity, there is still a huge gap in the social recognition of these two concepts. The concept of geodiversity, less developed, is now making its own way as a robust and fundamental idea concerning the abiotic component of nature. From a conservationist point of view, the lack of broader knowledge concerning the type and spatial variation of geodiversity, as well as its relationship with biodiversity, makes the protection and management of natural or semi-natural areas incomplete. There is a growing need to understand the patterns of geodiversity in different landscapes and to translate this knowledge into territorial management in a practical and effective way. This kind of management can also represent an important tool for the development of sustainable tourism, particularly geotourism, which can bring benefits not only for the environment but also for social and economic purposes. The quantification of geodiversity is an important step in this process, but still few researchers are investing in the development of a proper methodology. The assessment methodologies published so far are mainly focused on the evaluation of geomorphological elements, sometimes complemented with information about lithology, soils, hydrology, morphometric variables, climatic surfaces and geosites. This results in very dissimilar areas at very different spatial scales, showing the complexity of the task and the need for further research. The current work aims at the development of an effective methodology for the assessment of as many elements of geodiversity as possible (rocks, minerals, fossils, landforms, soils), based on GIS routines. The main determinant factor for the quantitative assessment is scale, but other factors are also very important, such as the existence of suitable spatial data with a sufficient degree of detail. It is expected to attain the proper procedures in order to assess geodiversity

  10. Application of Taguchi methodology to improve the functional quality of a mechanical device

    Regeai, Awatef Omar

    2005-01-01

    Manufacturing and quality control are recognized branches of engineering management. Special attention has been paid to improving tools and methods for the purpose of improving product quality and finding solutions for any obstacles and/or problems during the production process. Taguchi methodology is one of the most powerful techniques for improving product and manufacturing process quality at low cost. It is a strategic and practical method that aims to assist managers and industrial engineers in tackling manufacturing quality problems in a systematic and structured manner. The potential benefit of Taguchi methodology lies in its ease of use, its emphasis on reducing variability to give more economical products, and hence its accessibility to the engineering fraternity for solving real-life quality problems. This study applies Taguchi methodology to improve the functional quality of a locally made chain gear by a proposed heat treatment process. The hardness of steel is generally a function not of its composition alone, but rather of its heat treatment. The study investigates the effects of various heat treatment parameters, including ramp rate of heating, normalizing holding time, normalizing temperature, annealing holding time, annealing temperature, hardening holding time, hardening temperature, quenching media, tempering temperature and tempering holding time, upon the hardness, which is a measure of resistance to plastic deformation. Both the analysis of means (ANOM) and the signal-to-noise (S/N) ratio have been used to determine the optimal condition of the process. A significant improvement of the functional quality characteristic (hardness) by more than 32% was obtained. The scanning electron microscopy technique was used in this study to obtain visual evidence of the quality and continuous improvement of the heat treated samples. (author)
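
    A small sketch of the ANOM/S-N machinery Taguchi studies rely on, using the larger-the-better S/N ratio appropriate for hardness. The tiny two-factor orthogonal array and the hardness readings are illustrative, not the chain-gear experiment's ten-factor design or its data.

```python
import numpy as np

# Larger-the-better S/N ratio and analysis of means over a tiny design.

def sn_larger_is_better(y):
    """S/N = -10 log10( mean(1/y^2) ) for replicated readings y."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y ** 2))

# rows: runs of a small orthogonal array; columns: 2 factors at 2 levels
design = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
hardness = [[52, 54], [55, 56], [60, 61], [57, 58]]   # HRC, 2 replicates/run

sn = np.array([sn_larger_is_better(y) for y in hardness])
for f in range(design.shape[1]):
    means = [sn[design[:, f] == lvl].mean() for lvl in (0, 1)]
    print(f"factor {f}: mean S/N at levels 0/1 = {means[0]:.2f} / {means[1]:.2f}")
```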

  11. Direct potable reuse microbial risk assessment methodology: Sensitivity analysis and application to State log credit allocations.

    Soller, Jeffrey A; Eftim, Sorina E; Nappier, Sharon P

    2018-01-01

    Understanding pathogen risks is a critically important consideration in the design of water treatment, particularly for potable reuse projects. As an extension to our published microbial risk assessment methodology to estimate infection risks associated with Direct Potable Reuse (DPR) treatment train unit process combinations, herein, we (1) provide an updated compilation of pathogen density data in raw wastewater and dose-response models; (2) conduct a series of sensitivity analyses to consider potential risk implications using updated data; (3) evaluate the risks associated with log credit allocations in the United States; and (4) identify reference pathogen reductions needed to consistently meet currently applied benchmark risk levels. Sensitivity analyses illustrated changes in cumulative annual risk estimates, the significance of which depends on the pathogen group driving the risk for a given treatment train. For example, updates to norovirus (NoV) raw wastewater values and use of a NoV dose-response approach, capturing the full range of uncertainty, increased risks associated with one of the treatment trains evaluated, but not the other. Additionally, compared to traditional log-credit allocation approaches, our results indicate that the risk methodology provides more nuanced information about how consistently public health benchmarks are achieved. Our results indicate that viruses need to be reduced by 14 logs or more to consistently achieve currently applied benchmark levels of protection associated with DPR. The refined methodology, updated model inputs, and log credit allocation comparisons will be useful to regulators considering DPR projects and design engineers as they consider which unit treatment processes should be employed for particular projects. Published by Elsevier Ltd.
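
    A minimal sketch of the kind of Monte Carlo risk calculation such a methodology performs is shown below: sampled raw-wastewater pathogen densities are attenuated by the treatment train's log reduction, converted to a dose, and passed through a dose-response model to yield annual infection risks. All parameter values are placeholders, not the paper's inputs.

```python
# Monte Carlo QMRA sketch: raw-wastewater density -> treatment log reduction
# -> ingested dose -> exponential dose-response -> annual infection risk.
# Every number below is an illustrative assumption.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
log_reduction = 14.0    # aggregate virus log credits for the treatment train
volume_l = 2.0          # assumed daily drinking-water ingestion (L)
k = 0.1                 # assumed exponential dose-response parameter

# Lognormal raw-wastewater density (organisms/L), illustrative parameters.
raw = rng.lognormal(mean=np.log(1e6), sigma=1.0, size=n)
dose = raw * 10**(-log_reduction) * volume_l
p_daily = 1.0 - np.exp(-k * dose)          # daily probability of infection
# Annual risk if that daily risk persisted for a whole year.
p_annual = 1.0 - (1.0 - p_daily)**365
print(f"median annual risk: {np.median(p_annual):.2e}")
print(f"95th percentile:    {np.percentile(p_annual, 95):.2e}")
```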

  12. Investigation of Super Learner Methodology on HIV-1 Small Sample: Application on Jaguar Trial Data.

    Houssaïni, Allal; Assoumou, Lambert; Marcelin, Anne Geneviève; Molina, Jean Michel; Calvez, Vincent; Flandre, Philippe

    2012-01-01

    Background. Many statistical models have been tested to predict phenotypic or virological response from genotypic data. A statistical framework called Super Learner has been introduced either to compare different methods/learners (discrete Super Learner) or to combine them in a Super Learner prediction method. Methods. The Jaguar trial is used to apply the Super Learner framework. The Jaguar study is an "add-on" trial comparing the efficacy of adding didanosine to an on-going failing regimen. Our aim was also to investigate the impact of different cross-validation strategies and different loss functions. Four different splits between training and validation sets were tested with two loss functions. Six statistical methods were compared. We assessed performance by evaluating R² values and accuracy by calculating the rates of patients correctly classified. Results. Our results indicated that the more recent Super Learner methodology of building a new predictor based on a weighted combination of different methods/learners provided good performance. A simple linear model provided results similar to those of this new predictor. Slight discrepancies arise between the two loss functions investigated, and also between results based on cross-validated risks and results from the full dataset. The Super Learner methodology and the linear model correctly classified around 80% of patients. The difference between the lowest and highest rates is around 10 percent. The number of mutations retained by the different learners also varies from 1 to 41. Conclusions. The more recent Super Learner methodology combining the predictions of many learners provided good performance on our small dataset.
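
    The weighted-combination idea can be sketched compactly: gather cross-validated predictions from each candidate learner and solve for non-negative weights that minimize the cross-validated squared-error loss. The learners, data and non-negative least squares step below are illustrative choices, not the exact estimators used on the Jaguar data.

```python
# Compact Super Learner sketch: cross-validated predictions from several
# learners are combined with non-negative weights summing to one. Learners
# and data are illustrative stand-ins, not the Jaguar-trial models.
import numpy as np
from scipy.optimize import nnls
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_predict
from sklearn.neighbors import KNeighborsRegressor
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=100, n_features=10, noise=10.0, random_state=0)
learners = [LinearRegression(),
            KNeighborsRegressor(n_neighbors=5),
            DecisionTreeRegressor(max_depth=3)]

# Column k holds the 10-fold cross-validated predictions of learner k.
Z = np.column_stack([cross_val_predict(m, X, y, cv=10) for m in learners])

w, _ = nnls(Z, y)          # non-negative least squares on the CV predictions
w = w / w.sum()            # normalize so the weights form a convex combination
print("Super Learner weights:", np.round(w, 3))

# Discrete Super Learner: just pick the single learner with lowest CV risk.
cv_risk = ((Z - y[:, None])**2).mean(axis=0)
print("best single learner index:", int(np.argmin(cv_risk)))
```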

  13. Methodology for the application of probabilistic safety analysis to the cobalt therapy units in Cuba

    Vilaragut Llanes, Juan Jose; Ferro Fernandez, Ruben; Troncoso Fleitas, Mayra; Lozano Lima, Berta; De la Fuente Puch, Andres; Perez Reyes, Yolanda; Dumenigo Gonzalez, Cruz

    2001-01-01

    In the present work, the main elements considered in applying Probabilistic Safety Analysis to the safety evaluation of the cobalt therapy units in Cuba are discussed. As part of the results of the first stage of the study, the methodological guide is presented; it is being used within an IAEA research contract currently carried out by the authors from the CNSN, together with other specialists from the Ministry of Public Health (MINSAP)

  14. Applicability of risk-informed criticality methodology to spent fuel repositories

    Mays, C.; Thomas, D.A.; Favet, D.

    2000-01-01

    An important objective of geologic disposal is keeping the fissionable material in a condition so that a self-sustaining nuclear chain reaction (criticality) is highly unlikely. This objective supports the overall performance objective of any repository, which is to protect the health and safety of the public by limiting radiological exposure. This paper describes a risk-informed, performance-based methodology, which combines deterministic and probabilistic approaches for evaluating the criticality potential of high-level waste and spent nuclear fuel after the repository is sealed and permanently closed (postclosure). (authors)

  15. Application of ASSET methodology and operational experience feedback of NPPs in China

    Lan, Ziyong [The National Nuclear Safety Administration, Beijing (China)

    1997-10-01

    The introduction of the ASSET methodology to China started in March 1992, when three experts from the IAEA held the ASSET Seminar in Wuhan, China. Three years later, an IAEA seminar on the ASSET Method and Operational Experience Feedback was held in Beijing on 20-24 March 1995. Another ASSET seminar, on Self-Assessment and Operational Experience Feedback, was held at the Guangdong NPP site on 2-6 December 1996. The NNSA and the GNPP hosted the seminar; 2 IAEA experts and 55 participants from the NPPs, research institutes, the regulatory body (NNSA) and its regional offices attended. 3 figs, 5 tabs.

  16. Methodology and Applications in Non-linear Model-based Geostatistics

    Christensen, Ole Fredslund

    that are approximately Gaussian. Parameter estimation and prediction for the transformed Gaussian model is studied. In some cases a transformation cannot possibly render the data Gaussian. A methodology for analysing such data was introduced by Diggle, Tawn and Moyeed (1998): the generalised linear spatial model… priors for Bayesian inference is discussed. Procedures for parameter estimation and prediction are studied. Theoretical properties of Markov chain Monte Carlo algorithms are investigated, and different algorithms are compared. In addition, the thesis contains a manual for an R package, geoRglm, which…

  17. Application of a systematic methodology for sustainable carbon dioxide utilization process design

    Plaza, Cristina Calvera; Frauzem, Rebecca; Gani, Rafiqul

    than carbon capture and storage. To achieve this, a methodology is developed to design sustainable carbon dioxide utilization processes. First, information on the possible utilization alternatives is collected, including the economic potential of the process and the carbon dioxide emissions… emission are desired in order to reduce the carbon dioxide emissions. Using this estimated preliminary evaluation, the top processes, with the most negative carbon dioxide emissions, are investigated by rigorous detailed simulation to evaluate the net carbon dioxide emissions. Once the base case design…

  18. THEORY AND METHODOLOGY OF ICT APPLICATION INTO A SOCIAL STUDY IN ABROAD AND UKRAINE: GENERAL APPROACHES

    Olena O. Hrytsenchuk

    2011-02-01

    The article presents an analysis of the theoretical and methodological foundations of the implementation of information and communication technology (ICT) in secondary education, particularly in the social sciences, in Western Europe, the USA and Ukraine today. Materials and documents of the UNDP, the Council of Europe and the Organisation for Economic Co-operation and Development (OECD), as well as documents from the legal and regulatory framework of education and the school curricula and programs of foreign countries and Ukraine, were researched. Approaches to ICT use in the social-study subject areas of secondary school are outlined; the article covers some directions of national education strategies for using ICT in Western Europe, the USA and Ukraine and the prospects for their development.

  19. Electrodeposition of Iridium Oxide by Cyclic Voltammetry: Application of Response Surface Methodology

    Kakooei Saeid

    2014-07-01

    The effects of scan rate, temperature, and number of cycles on the thickness of IrOx coatings electrodeposited on a stainless steel substrate by cyclic voltammetry were investigated within a statistical framework. A central composite design, combined with response surface methodology, was used to study the electrodeposition conditions. All fabricated electrodes were characterized using electrochemical methods. Field emission scanning electron microscopy and energy-dispersive X-ray spectroscopy were performed for IrOx film characterization. Results showed that the scan rate significantly affects the thickness of the electrodeposited layer. Also, the number of cycles has a greater effect than temperature on the IrOx thickness.
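
    For illustration, the snippet below generates a rotatable central composite design in coded units for three factors (as in the study: scan rate, temperature, number of cycles) and fits a full quadratic response surface to a synthetic thickness response. The run count, axial distance and response values are assumptions, not the paper's experimental data.

```python
# Illustrative central composite design for three factors in coded units,
# plus a quadratic response-surface fit; the thickness response is synthetic.
from itertools import product

import numpy as np

alpha = 1.682                                   # rotatable CCD for 3 factors
factorial = np.array(list(product([-1, 1], repeat=3)), dtype=float)
axial = np.vstack([a * np.eye(3)[i] for i in range(3) for a in (-alpha, alpha)])
center = np.zeros((6, 3))
X = np.vstack([factorial, axial, center])       # 8 + 6 + 6 = 20 runs

rng = np.random.default_rng(5)
# Synthetic thickness: strong scan-rate effect (x0) with mild curvature.
y = 50 + 8*X[:, 0] + 3*X[:, 1] + 2*X[:, 2] - 4*X[:, 0]**2 + rng.normal(0, 1, len(X))

# Full quadratic model matrix: intercept, linear, squared, two-way interactions.
M = np.column_stack([np.ones(len(X)), X, X**2,
                     X[:, 0]*X[:, 1], X[:, 0]*X[:, 2], X[:, 1]*X[:, 2]])
coef, *_ = np.linalg.lstsq(M, y, rcond=None)
print("linear effects (scan rate, temperature, cycles):", np.round(coef[1:4], 2))
```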

  20. Application of frequency domain line edge roughness characterization methodology in lithography

    Sun, Lei; Wang, Wenhui; Beique, Genevieve; Wood, Obert; Kim, Ryoung-Han

    2015-03-01

    A frequency-domain 3-sigma LER characterization methodology combining the standard deviation and power spectral density (PSD) methods is proposed. In the new method, the standard deviation is calculated in the frequency domain instead of the spatial domain as in the conventional method. The power spectrum of the LER is divided into three regions: low-frequency (LF), middle-frequency (MF) and high-frequency (HF) regions. The frequency regions are defined on the basis of visual comparisons of the process. Three standard deviation numbers are used to characterize the LER in the three frequency regions. Pattern wiggling can be detected quantitatively with a wiggling factor, which is also proposed in this paper.
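
    A sketch of the proposed frequency-domain decomposition: compute the PSD of a line-edge profile and, using Parseval's theorem, report one 3-sigma value per frequency band. The band boundaries below are arbitrary placeholders; in the paper they are derived from visual comparisons of the process.

```python
# Band-limited 3-sigma LER from the one-sided power spectrum of an edge
# profile. Band limits and the toy edge are illustrative assumptions.
import numpy as np

def band_sigmas(edge: np.ndarray, dx: float, f_lo: float, f_hi: float):
    """Return 3-sigma LER for the LF/MF/HF bands via Parseval's theorem."""
    n = len(edge)
    e = edge - edge.mean()
    power = np.abs(np.fft.rfft(e))**2
    power[1:] *= 2.0                 # fold in the conjugate-symmetric half
    if n % 2 == 0:
        power[-1] /= 2.0             # the Nyquist bin appears only once
    f = np.fft.rfftfreq(n, d=dx)
    out = []
    for lo, hi in [(0.0, f_lo), (f_lo, f_hi), (f_hi, np.inf)]:
        band = (f >= lo) & (f < hi)
        # Parseval: variance = power.sum()/n**2, so sigma_band = sqrt(sum)/n.
        out.append(3.0 * np.sqrt(power[band].sum()) / n)
    return out

rng = np.random.default_rng(1)
edge = np.cumsum(rng.normal(0.0, 0.3, 2048))     # toy rough edge (nm), dx = 1 nm
lf, mf, hf = band_sigmas(edge, dx=1.0, f_lo=0.005, f_hi=0.05)
print(f"3-sigma LER: LF={lf:.2f}  MF={mf:.2f}  HF={hf:.2f} nm")
```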

  1. Application of ASSET methodology and operational experience feedback of NPPs in China

    Ziyong Lan

    1997-01-01

    The introduction of the ASSET methodology to China started in March 1992, when three experts from the IAEA held the ASSET Seminar in Wuhan, China. Three years later, an IAEA seminar on the ASSET Method and Operational Experience Feedback was held in Beijing on 20-24 March 1995. Another ASSET seminar, on Self-Assessment and Operational Experience Feedback, was held at the Guangdong NPP site on 2-6 December 1996. The NNSA and the GNPP hosted the seminar; 2 IAEA experts and 55 participants from the NPPs, research institutes, the regulatory body (NNSA) and its regional offices attended. 3 figs, 5 tabs

  2. Eliciting and communicating expert judgments: Methodology and application to nuclear safety

    Winterfeldt, D. von

    1989-01-01

    The most ambitious, and certainly the most extensive, formal expert judgment process was the elicitation of numerous events and uncertain quantities for safety issues in five nuclear power plants in the U.S. The general methodology for formal expert elicitation is described. An overview of the expert elicitation process of NUREG-1150 is provided, and the elicitation of probabilities for the interfacing-systems loss-of-coolant accident (ISL) in PWRs is given as an example of this elicitation process. Some lessons learned from this study are presented. (DG)

  3. A new synthetic methodology for the preparation of biocompatible and organo-soluble barbituric- and thiobarbituric acid based chitosan derivatives for biomedical applications

    Shahzad, Sohail [Interdisciplinary Research Center in Biomedical Materials, COMSATS Institute of Information Technology, Lahore 54000 (Pakistan); Department of Chemistry, The Islamia University of Bahawalpur, Bahawalpur 63100 (Pakistan); Shahzadi, Lubna [Interdisciplinary Research Center in Biomedical Materials, COMSATS Institute of Information Technology, Lahore 54000 (Pakistan); Mahmood, Nasir [Department of Allied Health Sciences and Chemical Pathology, Department of Human Genetics and Molecular Biology, University of Health Sciences, Lahore (Pakistan); Siddiqi, Saadat Anwar [Interdisciplinary Research Center in Biomedical Materials, COMSATS Institute of Information Technology, Lahore 54000 (Pakistan); Rauf, Abdul [Department of Chemistry, The Islamia University of Bahawalpur, Bahawalpur 63100 (Pakistan); Manzoor, Faisal; Chaudhry, Aqif Anwar [Interdisciplinary Research Center in Biomedical Materials, COMSATS Institute of Information Technology, Lahore 54000 (Pakistan); Rehman, Ihtesham ur [Interdisciplinary Research Center in Biomedical Materials, COMSATS Institute of Information Technology, Lahore 54000 (Pakistan); Department of Materials Science and Engineering, The Kroto Research Institute, The University of Sheffield, North Campus, Broad Lane, Sheffield, S3 7HQ (United Kingdom); Yar, Muhammad, E-mail: drmyar@ciitlahore.edu.pk [Interdisciplinary Research Center in Biomedical Materials, COMSATS Institute of Information Technology, Lahore 54000 (Pakistan)

    2016-09-01

    Chitosan's poor solubility, especially in organic solvents, limits its use with other organo-soluble polymers; however, such combinations are highly desirable for tailoring properties to specific biomedical applications. This paper describes the development of a new synthetic methodology for the synthesis of organo-soluble chitosan derivatives. These derivatives were synthesized from chitosan (CS), triethyl orthoformate and barbituric or thiobarbituric acid in the presence of 2-butanol. The chemical interactions and new functional motifs in the synthesized CS derivatives were evaluated by FTIR, DSC/TGA, UV/Vis, XRD and ¹H NMR spectroscopy. A cytotoxicity investigation of these materials was performed by a cell culture method using the VERO cell line, and all the synthesized derivatives were found to be non-toxic. Solubility analysis showed that these derivatives were readily soluble in organic solvents including DMSO and DMF. Their potential for use with commercially available organo-soluble polymers was demonstrated by electrospinning; the synthesized derivatives in combination with polycaprolactone delivered nanofibrous membranes. - Highlights: • Development of a new synthetic methodology • Synthesis of organo-soluble chitosan (CS) derivatives • VERO cell proliferation • Nanofibrous membranes from the synthesized chitosan derivatives and polycaprolactone.

  4. Application of spectral distributions in effective interaction theory

    Chang, B.D.

    1980-01-01

    The calculation of observable quantities in a large many-particle space is very complicated and often impractical. In effective interaction theory, to simplify the calculation, the full many-particle space is truncated to a small, manageable model space and the operators associated with the observables are renormalized to accommodate the truncation effects. The operator that has been most extensively studied for renormalization is the Hamiltonian. The renormalized Hamiltonian, often called the effective Hamiltonian, can be defined such that it gives not only the eigenvalues but also the projections of the full-space (true) eigenfunctions onto the model space. These projected wave functions then provide a convenient basis for the renormalization of other operators. The usual framework for renormalization is perturbation theory. Unfortunately, the conventional perturbation series for effective Hamiltonians have problems with convergence, and their high-order terms (especially fourth order or higher) are also difficult to calculate. The characteristics of spectral distributions can be helpful in determining the model space and calculating the effective Hamiltonian. In this talk, applications of spectral distributions are discussed in the following areas: (1) truncation of many-particle spaces by selection of configurations; (2) orthogonal polynomial expansions for the effective Hamiltonian; and (3) establishing new criteria for the effective Hamiltonian

  5. Mineral surface–organic matter interactions: basics and applications

    Valdrè, G; Moro, D; Ulian, G

    2012-01-01

    The ability to control the binding of biological and organic molecules to a crystal surface is central to several fields, for example biotechnology, catalysis, molecular microarrays, biosensor preparation and the environmental sciences. The nano-morphology and nanostructure at the surface may have physico-chemical properties that are very different from those of the underlying mineral substrate. Recent developments in scanning probe microscopy (SPM) have widened the spectrum of investigations that can be performed at the nanometric level on the surface of minerals. They range from the study of physical properties, such as surface potential, topological determination of the electric field and Brønsted–Lowry site distributions, to chemical and spectroscopic analysis in air, in liquid or in gaseous environments. After an introduction to SPM modes of operation and new SPM-based technological developments, we present recent examples of applications in the study of interactions between organic matter and mineral surfaces, and report on the advances in knowledge that have been made through the use of scanning probe microscopy.

  6. Response surface methodology approach for structural reliability analysis: An outline of typical applications performed at CEC-JRC, Ispra

    Lucia, A.C.

    1982-01-01

    The paper presents the main results of the work carried out at JRC-Ispra on specific problems posed by applying the response surface methodology to the exploration of structural and nuclear reactor safety codes. Several relevant studies have been completed: assessment of structural behaviour under seismic events; determination of the probability of coherent blockage in LWR fuel elements due to a LOCA; analysis of ATWS consequences in PWR reactors by means of the ALMOD code; and analysis of the first wall of an experimental fusion reactor by means of the Bersafe code. (orig.)

  7. Improving timeliness and efficiency in the referral process for safety net providers: application of the Lean Six Sigma methodology.

    Deckard, Gloria J; Borkowski, Nancy; Diaz, Deisell; Sanchez, Carlos; Boisette, Serge A

    2010-01-01

    Designated primary care clinics largely serve low-income and uninsured patients who present a disproportionate number of chronic illnesses and face great difficulty in obtaining the medical care they need, particularly the access to specialty physicians. With limited capacity for providing specialty care, these primary care clinics generally refer patients to safety net hospitals' specialty ambulatory care clinics. A large public safety net health system successfully improved the effectiveness and efficiency of the specialty clinic referral process through application of Lean Six Sigma, an advanced process-improvement methodology and set of tools driven by statistics and engineering concepts.

  8. Application of virtual distances methodology to laser tracker verification with an indexed metrology platform

    Acero, R; Pueo, M; Santolaria, J; Aguilar, J J; Brau, A

    2015-01-01

    High-range measuring equipment like laser trackers needs large-dimension calibrated reference artifacts for calibration and verification procedures. In this paper, a new verification procedure for portable coordinate measuring instruments, based on the generation and evaluation of virtual distances with an indexed metrology platform, is developed. This methodology enables the definition of an unlimited number of reference distances without materializing them in a physical gauge. The generation of the virtual points and the reference lengths derived from them relies on the concept of the indexed metrology platform and on knowing the relative position and orientation of its upper and lower platforms with high accuracy. The measuring instrument, together with the indexed metrology platform, remains still while the virtual mesh rotates around them. As a first step, the virtual distances technique is applied to a laser tracker in this work. The experimental verification procedure of the laser tracker with virtual distances is simulated and then compared with the conventional verification procedure of the laser tracker with the indexed metrology platform. The results obtained in terms of the volumetric performance of the laser tracker proved the suitability of the virtual distances methodology for calibration and verification procedures for portable coordinate measuring instruments, broadening the possibilities for the definition of reference distances in these procedures. (paper)
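
    Geometrically, the idea can be sketched as follows: a single physical target, observed from several calibrated orientations of the platform, maps to a mesh of virtual points whose mutual distances are exactly known and can serve as reference lengths. The six-position rotation about a single axis below is an illustrative simplification of the platform's calibrated motion.

```python
# Virtual-distances sketch: one target seen from k indexed orientations maps
# to k virtual points; their pairwise distances are computable references.
# The 6-position rotation about z is an assumption for illustration.
import numpy as np

def rot_z(theta: float) -> np.ndarray:
    """Rotation matrix about the z axis by angle theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

target = np.array([1200.0, 300.0, 150.0])   # mm, in the platform frame
angles = np.deg2rad(np.arange(0, 360, 60))  # six indexed positions

# Each calibrated platform orientation generates one virtual point.
virtual = np.array([rot_z(a) @ target for a in angles])

# Reference distances: all pairwise lengths of the virtual mesh.
for i in range(len(virtual)):
    for j in range(i + 1, len(virtual)):
        d = np.linalg.norm(virtual[i] - virtual[j])
        print(f"virtual distance {i}-{j}: {d:.3f} mm")
```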

  9. Application of virtual distances methodology to laser tracker verification with an indexed metrology platform

    Acero, R.; Santolaria, J.; Pueo, M.; Aguilar, J. J.; Brau, A.

    2015-11-01

    High-range measuring equipment like laser trackers needs large-dimension calibrated reference artifacts for calibration and verification procedures. In this paper, a new verification procedure for portable coordinate measuring instruments, based on the generation and evaluation of virtual distances with an indexed metrology platform, is developed. This methodology enables the definition of an unlimited number of reference distances without materializing them in a physical gauge. The generation of the virtual points and the reference lengths derived from them relies on the concept of the indexed metrology platform and on knowing the relative position and orientation of its upper and lower platforms with high accuracy. The measuring instrument, together with the indexed metrology platform, remains still while the virtual mesh rotates around them. As a first step, the virtual distances technique is applied to a laser tracker in this work. The experimental verification procedure of the laser tracker with virtual distances is simulated and then compared with the conventional verification procedure of the laser tracker with the indexed metrology platform. The results obtained in terms of the volumetric performance of the laser tracker proved the suitability of the virtual distances methodology for calibration and verification procedures for portable coordinate measuring instruments, broadening the possibilities for the definition of reference distances in these procedures.

  10. Nuclear Forensics: A Methodology Applicable to Nuclear Security and to Non-Proliferation

    Mayer, K; Wallenius, M; Luetzenkirchen, K; Galy, J; Varga, Z; Erdmann, N; Buda, R; Kratz, J-V; Trautmann, N; Fifield, K

    2011-01-01

    Nuclear Security aims at the prevention and detection of, and response to, theft, sabotage, unauthorized access, illegal transfer or other malicious acts involving nuclear material. Nuclear Forensics is a key element of nuclear security. Nuclear Forensics is defined as a methodology that aims at re-establishing the history of nuclear material of unknown origin. It is based on indicators that arise from known relationships between material characteristics and process history. Thus, nuclear forensics analysis includes the characterization of the material and correlation with production history. To this end, we can make use of parameters such as the isotopic composition of the nuclear material and accompanying elements, chemical impurities, macroscopic appearance and microstructure of the material. In the present paper, we discuss the opportunities for attribution of nuclear material offered by nuclear forensics as well as its limitations. Particular attention will be given to the role of nuclear reactions. Such reactions include the radioactive decay of the nuclear material, but also reactions with neutrons. When uranium (of natural composition) is exposed to neutrons, plutonium is formed, as well as 236U. We will illustrate the methodology using the example of a piece of uranium metal that dates back to the German nuclear program in the 1940s. A combination of different analytical techniques and model calculations enables a nuclear forensics interpretation, thus correlating the material characteristics with the production history.

  11. Nuclear Forensics: A Methodology Applicable to Nuclear Security and to Non-Proliferation

    Mayer, K.; Wallenius, M.; Lützenkirchen, K.; Galy, J.; Varga, Z.; Erdmann, N.; Buda, R.; Kratz, J.-V.; Trautmann, N.; Fifield, K.

    2011-09-01

    Nuclear Security aims at the prevention and detection of, and response to, theft, sabotage, unauthorized access, illegal transfer or other malicious acts involving nuclear material. Nuclear Forensics is a key element of nuclear security. Nuclear Forensics is defined as a methodology that aims at re-establishing the history of nuclear material of unknown origin. It is based on indicators that arise from known relationships between material characteristics and process history. Thus, nuclear forensics analysis includes the characterization of the material and correlation with production history. To this end, we can make use of parameters such as the isotopic composition of the nuclear material and accompanying elements, chemical impurities, macroscopic appearance and microstructure of the material. In the present paper, we discuss the opportunities for attribution of nuclear material offered by nuclear forensics as well as its limitations. Particular attention will be given to the role of nuclear reactions. Such reactions include the radioactive decay of the nuclear material, but also reactions with neutrons. When uranium (of natural composition) is exposed to neutrons, plutonium is formed, as well as 236U. We will illustrate the methodology using the example of a piece of uranium metal that dates back to the German nuclear program in the 1940s. A combination of different analytical techniques and model calculations enables a nuclear forensics interpretation, thus correlating the material characteristics with the production history.

  12. Regional Energy Demand Responses To Climate Change. Methodology And Application To The Commonwealth Of Massachusetts

    Amato, A.D.; Ruth, M.; Kirshen, P.; Horwitz, J.

    2005-01-01

    Climate is a major determinant of energy demand. Changes in climate may alter energy demand as well as energy demand patterns. This study investigates the implications of climate change for energy demand under the hypothesis that impacts are scale dependent due to region-specific climatic variables, infrastructure, socioeconomic, and energy use profiles. In this analysis we explore regional energy demand responses to climate change by assessing temperature-sensitive energy demand in the Commonwealth of Massachusetts. The study employs a two-step estimation and modeling procedure. The first step evaluates the historic temperature sensitivity of residential and commercial demand for electricity and heating fuels, using a degree-day methodology. We find that when controlling for socioeconomic factors, degree-day variables have significant explanatory power in describing historic changes in residential and commercial energy demands. In the second step, we assess potential future energy demand responses to scenarios of climate change. Model results are based on alternative climate scenarios that were specifically derived for the region on the basis of local climatological data, coupled with regional information from available global climate models. We find notable changes with respect to overall energy consumption by, and energy mix of the residential and commercial sectors in the region. On the basis of our findings, we identify several methodological issues relevant to the development of climate change impact assessments of energy demand
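
    As a rough sketch of the first estimation step, the snippet below builds heating and cooling degree-days from daily mean temperatures and regresses a demand series on them by ordinary least squares. The base temperature, the synthetic data and the monthly aggregation are illustrative assumptions, not the study's actual inputs.

```python
# Degree-day regression sketch: HDD/CDD built from daily mean temperatures,
# then demand = a + b*HDD + c*CDD fitted by OLS. All data are synthetic.
import numpy as np

base = 18.3  # assumed base temperature, degrees C (about 65 F)
rng = np.random.default_rng(7)
daily_t = 12.0 + 10.0 * np.sin(np.linspace(0, 2 * np.pi, 365)) + rng.normal(0, 2, 365)

hdd = np.maximum(base - daily_t, 0.0)        # heating degree-days, one per day
cdd = np.maximum(daily_t - base, 0.0)        # cooling degree-days

# Aggregate to 12 pseudo-months and fit the linear demand model.
months = np.array_split(np.arange(365), 12)
HDD = np.array([hdd[m].sum() for m in months])
CDD = np.array([cdd[m].sum() for m in months])
demand = 100.0 + 0.8 * HDD + 1.5 * CDD + rng.normal(0, 20, 12)  # synthetic

X = np.column_stack([np.ones(12), HDD, CDD])
coef, *_ = np.linalg.lstsq(X, demand, rcond=None)
print(f"intercept={coef[0]:.1f}, HDD sensitivity={coef[1]:.2f}, "
      f"CDD sensitivity={coef[2]:.2f}")
```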

  13. IoT-Based Information System for Healthcare Application: Design Methodology Approach

    Damian Dziak

    2017-06-01

    Over the last few decades, life expectancy has increased significantly. However, elderly people who live on their own often need assistance due to mobility difficulties, symptoms of dementia or other health problems. In such cases, an autonomous supporting system may be helpful. This paper proposes an Internet of Things (IoT)-based information system for indoor and outdoor use. Since the survey of related works indicated a lack of methodological approaches to the design process, a Design Methodology (DM), which approaches the design target from the perspective of the stakeholders, contracting authorities and potential users, is introduced. The implemented solution applies a three-axial accelerometer and magnetometer, Pedestrian Dead Reckoning (PDR), thresholding and a decision-tree algorithm. This architecture enables the localization of a monitored person within four room-zones; furthermore, it identifies falls and the activities of lying, standing, sitting and walking. Based on the identified activities, the system classifies current activities as normal, suspicious or dangerous, which is used to notify the healthcare staff about possible problems. Real-life scenarios validated the high robustness of the proposed solution. Moreover, the test results satisfied both stakeholders and future users and ensured further cooperation on the project.
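
    The thresholding step for fall detection can be sketched simply: compute the acceleration magnitude and flag a fall when a free-fall dip is followed within a short window by an impact spike. The thresholds and window below are illustrative guesses, not the tuned values of the implemented system, which additionally combines PDR and decision trees.

```python
# Threshold-based fall detection sketch on accelerometer magnitude.
# FREE_FALL, IMPACT and WINDOW are illustrative, untuned constants.
import numpy as np

G = 9.81
FREE_FALL = 0.5 * G      # magnitude well below 1 g suggests free fall
IMPACT = 2.5 * G         # magnitude well above 1 g suggests an impact
WINDOW = 50              # samples allowed between dip and spike (~0.5 s @ 100 Hz)

def detect_fall(ax, ay, az) -> bool:
    """Flag a fall when a free-fall dip is followed shortly by an impact spike."""
    mag = np.sqrt(np.asarray(ax)**2 + np.asarray(ay)**2 + np.asarray(az)**2)
    dips = np.flatnonzero(mag < FREE_FALL)
    spikes = np.flatnonzero(mag > IMPACT)
    return any(np.any((spikes > d) & (spikes <= d + WINDOW)) for d in dips)

# Toy trace: standing at 1 g, a short free-fall dip, then an impact spike.
trace = np.full(300, G)
trace[100:110] = 0.2 * G
trace[130] = 3.0 * G
print(detect_fall(trace, np.zeros(300), np.zeros(300)))   # True
```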

  14. A methodology for optimisation of countermeasures for animal products after a nuclear accident and its application

    Hwang, Won Tae; Cho, Gyuseong; Han, Moon Hee

    1999-01-01

    A methodology for the optimisation of the countermeasures associated with the contamination of animal products was designed based on cost-benefit analysis. Results are discussed for a hypothetical deposition of radionuclides on 15 August, when pastures are fully developed under Korean agricultural conditions. A dynamic food chain model, DYNACON, was used to evaluate the effectiveness of the countermeasures in reducing the ingestion dose. The countermeasures considered were: (1) a ban on food consumption; and (2) the substitution of clean fodder. Both are effective in reducing the ingestion dose and are simple and easy to carry out in the first year after deposition. The net benefit of the countermeasures was quantitatively estimated in terms of avertable doses and monetary costs. The benefit depends on a variety of factors, such as the radionuclide concentrations on the ground and the starting time and duration of the countermeasures. It is clear that a fast reaction after deposition is important in maximising the cost-effectiveness of the countermeasures. In most cases, the substitution of clean fodder is more cost-effective than a ban on food consumption. The methodology used in this study may serve as a basis for rapid decision-making on the introduction of countermeasures relating to the contamination of animal products after a nuclear accident
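
    The cost-benefit logic described can be made concrete with a small calculation: the net benefit of each countermeasure is the monetized avertable collective dose minus the countermeasure's cost, and the option with the largest net benefit is preferred. The alpha value, doses and costs below are invented placeholders, not values from the study.

```python
# Stylized cost-benefit comparison of countermeasures: net benefit =
# monetized avertable dose - cost. All numbers are placeholders.
alpha = 50_000.0   # assumed monetary value of unit collective dose ($/person-Sv)

def net_benefit(averted_dose_person_sv: float, cost: float) -> float:
    """Monetized avertable collective dose minus the countermeasure cost."""
    return alpha * averted_dose_person_sv - cost

options = {
    "food consumption ban": net_benefit(averted_dose_person_sv=12.0, cost=450_000.0),
    "clean fodder":         net_benefit(averted_dose_person_sv=10.5, cost=180_000.0),
    "no action":            0.0,
}
best = max(options, key=options.get)
print({k: f"${v:,.0f}" for k, v in options.items()}, "->", best)
```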

  15. Regional Energy Demand Responses To Climate Change. Methodology And Application To The Commonwealth Of Massachusetts

    Amato, A.D.; Ruth, M. [Environmental Policy Program, School of Public Policy, University of Maryland, 3139 Van Munching Hall, College Park, MD (United States); Kirshen, P. [Department of Civil and Environmental Engineering, Tufts University, Anderson Hall, Medford, MA (United States); Horwitz, J. [Climatological Database Consultant, Binary Systems Software, Newton, MA (United States)

    2005-07-01

    Climate is a major determinant of energy demand. Changes in climate may alter energy demand as well as energy demand patterns. This study investigates the implications of climate change for energy demand under the hypothesis that impacts are scale dependent due to region-specific climatic variables, infrastructure, socioeconomic, and energy use profiles. In this analysis we explore regional energy demand responses to climate change by assessing temperature-sensitive energy demand in the Commonwealth of Massachusetts. The study employs a two-step estimation and modeling procedure. The first step evaluates the historic temperature sensitivity of residential and commercial demand for electricity and heating fuels, using a degree-day methodology. We find that when controlling for socioeconomic factors, degree-day variables have significant explanatory power in describing historic changes in residential and commercial energy demands. In the second step, we assess potential future energy demand responses to scenarios of climate change. Model results are based on alternative climate scenarios that were specifically derived for the region on the basis of local climatological data, coupled with regional information from available global climate models. We find notable changes with respect to overall energy consumption by, and energy mix of the residential and commercial sectors in the region. On the basis of our findings, we identify several methodological issues relevant to the development of climate change impact assessments of energy demand.

  16. Headspace mass spectrometry methodology: application to oil spill identification in soils

    Perez Pavon, J.L.; Garcia Pinto, C.; Moreno Cordero, B. [Universidad de Salamanca, Departamento de Quimica Analitica, Nutricion y Bromatologia, Facultad de Ciencias Quimicas, Salamanca (Spain); Guerrero Pena, A. [Universidad de Salamanca, Departamento de Quimica Analitica, Nutricion y Bromatologia, Facultad de Ciencias Quimicas, Salamanca (Spain); Laboratorio de Suelos, Plantas y Aguas, Campus Tabasco, Colegio de Postgraduados, Cardenas, Tabasco (Mexico)

    2008-05-15

    In the present work we report the results obtained with a methodology based on direct coupling of a headspace generator to a mass spectrometer for the identification of different types of petroleum crudes in polluted soils. With no prior treatment, the samples are subjected to the headspace generation process and the volatiles generated are introduced directly into the mass spectrometer, thereby obtaining a fingerprint of volatiles in the sample analysed. The mass spectrum corresponding to the mass/charge ratios (m/z) contains the information related to the composition of the headspace and is used as the analytical signal for the characterization of the samples. The signals obtained for the different samples were treated by chemometric techniques to obtain the desired information. The main advantage of the proposed methodology is that no prior chromatographic separation and no sample manipulation are required. The method is rapid, simple and, in view of the results, highly promising for the implementation of a new approach for oil spill identification in soils. (orig.)
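
    To illustrate the kind of chemometric treatment mentioned, the sketch below treats each sample's m/z intensity vector as a fingerprint and projects the samples with principal component analysis, along which crude types can separate. The data matrix is randomly generated stand-in data, and PCA is only one of several chemometric techniques that could serve here.

```python
# PCA on headspace-MS fingerprints: each row is one soil sample's m/z
# intensity vector. The spectra below are random stand-ins, not real data.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
# 30 soil samples x 150 m/z channels; two contamination types with an offset.
group = np.repeat([0, 1], 15)
spectra = rng.normal(0, 1, (30, 150)) + group[:, None] * np.linspace(0, 2, 150)

scores = PCA(n_components=2).fit_transform(spectra)
for g in (0, 1):
    centroid = scores[group == g].mean(axis=0)
    print(f"crude type {g}: PC-space centroid = {np.round(centroid, 2)}")
```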

  17. Reliability Centered Maintenance (RCM) Methodology and Application to the Shutdown Cooling System for APR-1400 Reactors

    Faragalla, Mohamed M.; Emmanuel, Efenji; Alhammadi, Ibrahim; Awwal, Arigi M.; Lee, Yong Kwan [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2016-10-15

    The Shutdown Cooling System (SCS) is a safety-related system used in conjunction with the Main Steam and Main or Auxiliary Feedwater Systems to reduce the temperature of the Reactor Coolant System (RCS) in post-shutdown periods from the hot shutdown operating temperature to the refueling temperature. In this paper, the RCM methodology is applied to the SCS. The RCM analysis is performed based on an evaluation of Failure Modes, Effects and Criticality Analysis (FMECA) at the component, system and plant levels. Logic Tree Analysis (LTA) is used to determine the optimum maintenance tasks. The main objectives of RCM are safety, preservation of system function, cost-effective maintenance of the plant components, and increased reliability and availability. The RCM methodology is useful for improving equipment reliability by strengthening the management of equipment condition, and it leads to a significant decrease in the number of periodic maintenance tasks, an extended maintenance cycle, a longer useful life of equipment, and a decrease in overall maintenance cost. It also focuses on the safety of the system by assigning a criticality index to the various components and selecting maintenance activities based on the risk of failure involved. Therefore, it can be said that RCM introduces a maintenance plan designed for maximum safety in an economical manner, making the system more reliable. For the SCP, increasing the number of condition-monitoring tasks will improve the availability of the SCP. It is recommended to reduce the number of periodic maintenance activities.

  18. The SIMRAND methodology: Theory and application for the simulation of research and development projects

    Miles, R. F., Jr.

    1986-01-01

    A research and development (R&D) project often involves a number of decisions concerning which subset of systems or tasks should be undertaken to achieve the goal of the R&D project. To help in this decision making, SIMRAND (SIMulation of Research ANd Development projects) is a methodology for the selection of the optimal subset of systems or tasks to be undertaken on an R&D project. Using alternative networks, the SIMRAND methodology models the alternative subsets of systems or tasks under consideration. Each path through an alternative network represents one way of satisfying the project goals. Equations are developed that relate the system or task variables to the measure of preference. Uncertainty is incorporated by treating the variables of the equations probabilistically as random variables, with cumulative distribution functions assessed by technical experts. Analytical techniques of probability theory are used to reduce the complexity of the alternative networks. Cardinal utility functions over the measure of preference are assessed for the decision makers. A run of the SIMRAND I computer program combines, in a Monte Carlo simulation model, the network structure, the equations, the cumulative distribution functions, and the utility functions.
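
    A toy rendering of the simulation step may help: each alternative network is scored by Monte Carlo sampling of its task variables from expert-assessed distributions and averaging a cardinal utility over the measure of preference. The triangular distributions and exponential utility below are illustrative assumptions, not SIMRAND's actual inputs.

```python
# SIMRAND-style sketch: score alternative task networks by expected utility,
# sampling task outcomes from assessed distributions. All inputs are invented.
import numpy as np

rng = np.random.default_rng(11)

def utility(x: np.ndarray) -> np.ndarray:
    """Concave (risk-averse) cardinal utility over the preference measure."""
    return 1.0 - np.exp(-x / 100.0)

def simulate(tasks, n: int = 50_000) -> float:
    """Monte Carlo expected utility of one alternative (sum of its tasks)."""
    total = sum(rng.triangular(lo, mode, hi, n) for lo, mode, hi in tasks)
    return utility(total).mean()

# Two alternative paths; (low, mode, high) triangular CDFs per task, as
# hypothetically assessed by technical experts. Preference = total performance.
alt_A = [(40, 60, 90), (30, 50, 100)]       # two moderate tasks
alt_B = [(20, 80, 140)]                     # one riskier task
print(f"E[u] alternative A: {simulate(alt_A):.4f}")
print(f"E[u] alternative B: {simulate(alt_B):.4f}")
```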

  19. A Probabilistic Analysis Methodology and Its Application to A Spent Fuel Pool System

    Kim, Hyowon; Jae, Moosung [Hanyang Univ., Seoul (Korea, Republic of); Ryu, Ho G. [Daedeok R and D Center, Daejeon (Korea, Republic of)

    2013-05-15

    A similar accident occurred at the 2nd unit of the PAKS nuclear power station in Hungary on 10 April 2003, when insufficient cooling caused the spent fuel to burn up or partly melt. Many previous studies have analyzed and measured the risk of spent fuel damage. In the 1980s, conditions changed with the development of high-density storage racks and new information concerning the possibility of cladding fires in drained spent fuel pools, and the US NRC assessed the spent fuel pool risk under Generic Issue 82. In the 1990s, under US NRC sponsorship, a risk assessment of the spent fuel pool at the Susquehanna Steam Electric Station (SSES) was performed, and the office for Analysis and Evaluation of Operational Data (AEOD) was organized to accumulate reliability data. A methodology for assessing the risk associated with the spent fuel pool facility has been developed and is applied to the reference plant. It is shown that the methodology developed in this study might contribute to assessing these kinds of SFP facilities. In this probabilistic risk analysis, the LINV initiating event occurs most frequently, and the most dominant cut-sets include human errors. The results of this analysis might contribute to identifying weaknesses in the preventive and mitigating systems of the SFP facility.

  20. A technology-assessment methodology for electric utility planning: With application to nuclear power plant decommissioning

    Lough, W.T.

    1987-01-01

    Electric utilities and public service commissions have not taken full advantage of the many proven methodologies and techniques available for evaluating complex technological issues. In addition, the evaluations performed are deficient in their use of (1) methods for evaluating public attitudes and (2) formal methods of analysis for decision making. These oversights are substantiated through an examination of the literature relevant to electric utility planning. The assessment process known as technology assessment (TA) is proposed, and a TA model is developed for routine use in planning by electric utilities and state regulatory commissions. Techniques to facilitate public participation and techniques to aid decision making are integral to the proposed model and are described in detail. Criteria are provided for selecting an appropriate technique on a case-by-case basis. The TA model proved to be an effective methodology for evaluating technological issues associated with electric utility planning, such as decommissioning nuclear power plants. Through the use of the nominal group technique, the attitudes of a group of residential ratepayers were successfully identified and included in the decision-making process