WorldWideScience

Sample records for computing methodologies

  1. Computer Network Operations Methodology

    Science.gov (United States)

    2004-03-01

    means of their computer information systems. Disrupt - This type of attack focuses on disrupting as “attackers might surreptitiously reprogram enemy...by reprogramming the computers that control distribution within the power grid. A disruption attack introduces disorder and inhibits the effective...between commanders. The use of methodologies is widespread and done subconsciously to assist individuals in decision making. The processes that

  2. Methodology of Implementation of Computer Forensics

    OpenAIRE

    Gelev, Saso; Golubovski, Roman; Hristov, Risto; Nikolov, Elenior

    2013-01-01

    Compared to other sciences, computer forensics (digital forensics) is a relatively young discipline. It was established in 1999 and it has been an irreplaceable tool in sanctioning cybercrime ever since. Good knowledge of computer forensics can be really helpful in uncovering a committed crime. Not adhering to the methodology of computer forensics, however, makes the obtained evidence invalid/irrelevant and as such it cannot be used in legal proceedings. This paper is to explain the methodolo...

  3. A Design Methodology for Computer Security Testing

    OpenAIRE

    Ramilli, Marco

    2013-01-01

    The field of "computer security" is often considered something in between Art and Science. This is partly due to the lack of widely agreed and standardized methodologies to evaluate the degree of the security of a system. This dissertation intends to contribute to this area by investigating the most common security testing strategies applied nowadays and by proposing an enhanced methodology that may be effectively applied to different threat scenarios with the same degree of effectiveness. ...

  4. A methodology for performing computer security reviews

    International Nuclear Information System (INIS)

    Hunteman, W.J.

    1991-01-01

    DOE Order 5637.1, "Classified Computer Security," requires regular reviews of the computer security activities for an ADP system and for a site. Based on experiences gained in the Los Alamos computer security program through interactions with DOE facilities, we have developed a methodology to aid a site or security officer in performing a comprehensive computer security review. The methodology is designed to aid a reviewer in defining goals of the review (e.g., preparation for inspection), determining security requirements based on DOE policies, determining threats/vulnerabilities based on DOE and local threat guidance, and identifying critical system components to be reviewed. Application of the methodology will result in review procedures and checklists oriented to the review goals, the target system, and DOE policy requirements. The review methodology can be used to prepare for an audit or inspection and as a periodic self-check tool to determine the status of the computer security program for a site or specific ADP system.

  5. A methodology for performing computer security reviews

    International Nuclear Information System (INIS)

    Hunteman, W.J.

    1991-01-01

    This paper reports on DOE Order 5637.1, Classified Computer Security, which requires regular reviews of the computer security activities for an ADP system and for a site. Based on experiences gained in the Los Alamos computer security program through interactions with DOE facilities, the authors have developed a methodology to aid a site or security officer in performing a comprehensive computer security review. The methodology is designed to aid a reviewer in defining goals of the review (e.g., preparation for inspection), determining security requirements based on DOE policies, determining threats/vulnerabilities based on DOE and local threat guidance, and identifying critical system components to be reviewed. Application of the methodology will result in review procedures and checklists oriented to the review goals, the target system, and DOE policy requirements. The review methodology can be used to prepare for an audit or inspection and as a periodic self-check tool to determine the status of the computer security program for a site or specific ADP system

  6. Emission computed tomography: methodology and applications

    International Nuclear Information System (INIS)

    Reivich, M.; Alavi, A.; Greenberg, J.; Fowler, J.; Christman, D.; Rosenquist, A.; Rintelmann, W.; Hand, P.; MacGregor, R.; Wolf, A.

    1980-01-01

    A technique for the determination of local cerebral glucose metabolism using positron emission computed tomography is described as an example of the development of the use of this methodology for the study of these parameters in man. The method for the determination of local cerebral glucose metabolism utilizes ¹⁸F-2-fluoro-2-deoxyglucose ([¹⁸F]-FDG). In this method [¹⁸F]-FDG is used as a tracer for the exchange of glucose between plasma and brain and its phosphorylation by hexokinase in the tissue. The labelled product of metabolism, [¹⁸F]-FDG phosphate, is essentially trapped in the tissue over the time course of the measurement. The studies demonstrate the potential usefulness of emission computed tomography for the measurement of various biochemical and physiological parameters in man. (Auth.)

  7. MicroComputed Tomography: Methodology and Applications

    International Nuclear Information System (INIS)

    Stock, Stuart R.

    2009-01-01

    Due to the availability of commercial laboratory systems and the emergence of user facilities at synchrotron radiation sources, studies of microcomputed tomography or microCT have increased exponentially. MicroComputed Tomography provides a complete introduction to the technology, describing how to use it effectively and understand its results. The first part of the book focuses on methodology, covering experimental methods, data analysis, and visualization approaches. The second part addresses various microCT applications, including porous solids, microstructural evolution, soft tissue studies, multimode studies, and indirect analyses. The author presents a sufficient amount of fundamental material so that those new to the field can develop an understanding of how to design their own microCT studies. One of the first full-length references dedicated to microCT, this book provides an accessible introduction to the field, supplemented with application examples and color images.

  8. Air quality estimation by computational intelligence methodologies

    Directory of Open Access Journals (Sweden)

    Ćirić Ivan T.

    2012-01-01

    The subject of this study is to compare different computational intelligence methodologies based on artificial neural networks used for forecasting an air quality parameter - the emission of CO2 - in the city of Niš. Firstly, the inputs of the CO2 emission estimator are analyzed and their measurement is explained. It is known that traffic is the single largest emitter of CO2 in Europe. Therefore, a proper treatment of this component of pollution is very important for precise estimation of emission levels. With this in mind, measurements of traffic frequency and CO2 concentration were carried out at critical intersections in the city, as well as monitoring of vehicle directions at the crossroads. Finally, based on experimental data, different soft computing estimators were developed, such as a feed-forward neural network, a recurrent neural network, and a hybrid neuro-fuzzy estimator of CO2 emission levels. Test data for some characteristic cases, presented at the end of the paper, show good agreement of the developed estimator outputs with experimental data. The presented results are a true indicator of the implemented methods' usability. [Project of the Ministry of Science of the Republic of Serbia, no. III42008-2/2011: Evaluation of Energy Performances and Indoor Environment Quality of Educational Buildings in Serbia with Impact to Health, and no. TR35016/2011: Research of MHD Flows around the Bodies, in the Tip Clearances and Channels and Application in the MHD Pumps Development]

  9. Thermal sensation prediction by soft computing methodology.

    Science.gov (United States)

    Jović, Srđan; Arsić, Nebojša; Vilimonović, Jovana; Petković, Dalibor

    2016-12-01

    Thermal comfort in open urban areas depends on many factors from an environmental point of view. Therefore there is a need to fulfill demands for suitable thermal comfort during urban planning and design. Thermal comfort can be modeled based on climatic parameters and other factors. These factors vary throughout the year and the day, so there is a need to establish an algorithm for thermal comfort prediction according to the input variables. The prediction results could be used for planning the times of usage of urban areas. Since this is a highly nonlinear task, a soft computing methodology was applied in this investigation in order to predict thermal comfort. The main goal was to apply an extreme learning machine (ELM) for forecasting physiological equivalent temperature (PET) values. Temperature, pressure, wind speed and irradiance were used as inputs. The prediction results are compared with some benchmark models. Based on the results, ELM can be used effectively in forecasting of PET. Copyright © 2016 Elsevier Ltd. All rights reserved.
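
    The abstract gives no implementation details; as an illustration only, the sketch below trains a minimal extreme learning machine regressor in the spirit described (fixed random hidden layer, least-squares output weights) on synthetic stand-in data for the four stated inputs. The toy PET relation and all numbers are hypothetical.

```python
import numpy as np

def elm_fit(X, y, n_hidden=50, reg=1e-3, seed=0):
    """Train a basic extreme learning machine (ELM) regressor: hidden-layer
    weights are random and fixed, only the output weights are solved for."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights
    b = rng.normal(size=n_hidden)                 # random hidden biases
    H = np.tanh(X @ W + b)                        # hidden-layer activations
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ y)
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Hypothetical inputs: temperature (deg C), pressure (hPa), wind speed (m/s), irradiance (W/m^2)
rng = np.random.default_rng(1)
X_train = rng.uniform([0, 990, 0, 0], [35, 1030, 10, 1000], size=(200, 4))
pet_train = 0.8 * X_train[:, 0] - 2.0 * X_train[:, 2] + 0.01 * X_train[:, 3]  # toy PET values

W, b, beta = elm_fit(X_train, pet_train)
print(elm_predict(X_train[:5], W, b, beta))
```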

  10. Computer Presentation Programs and Teaching Research Methodologies

    OpenAIRE

    Motamedi, Vahid

    2015-01-01

    Supplementing traditional chalk and board instruction with computer delivery has been viewed positively by students, who have reported increased understanding and more interaction with the instructor when computer presentations are used in the classroom. Some problems contributing to student errors while taking class notes, such as mistranscription of numbers on the board and the instructor's handwriting, can be resolved by careful construction of computer presentations. The use of computer pres...

  11. Computer Presentation Programs and Teaching Research Methodologies

    Directory of Open Access Journals (Sweden)

    Vahid Motamedi

    2015-05-01

    Supplementing traditional chalk and board instruction with computer delivery has been viewed positively by students, who have reported increased understanding and more interaction with the instructor when computer presentations are used in the classroom. Some problems contributing to student errors while taking class notes, such as mistranscription of numbers on the board and the instructor's handwriting, can be resolved by careful construction of computer presentations. The use of computer presentation programs promises to increase the effectiveness of learning by making content more readily available, by reducing the cost and effort of producing quality content, and by allowing content to be more easily shared. This paper describes how these problems can be overcome by using presentation packages for instruction.

  12. Methodological testing: Are fast quantum computers illusions?

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Steven [Tachyon Design Automation, San Francisco, CA (United States)]

    2013-07-01

    Popularity of the idea for computers constructed from the principles of QM started with Feynman's 'Lectures On Computation', but he called the idea crazy and dependent on statistical mechanics. In 1987, Feynman published a paper in 'Quantum Implications - Essays in Honor of David Bohm' on negative probabilities which he said gave him cultural shock. The problem with imagined fast quantum computers (QC) is that speed requires both statistical behavior and truth of the mathematical formalism. The Swedish Royal Academy 2012 Nobel Prize in physics press release touted the discovery of methods to control "individual quantum systems", to "measure and control very fragile quantum states" which enables "first steps towards building a new type of super fast computer based on quantum physics." A number of examples where widely accepted mathematical descriptions have turned out to be problematic are examined: problems with the use of oracles in P=NP computational complexity, Paul Finsler's proof of the continuum hypothesis, and Turing's Enigma code breaking versus William Tutte's Colossus. I view QC research as faith in computational oracles with wished-for properties. Arthur Fine's interpretation in 'The Shaky Game' of Einstein's skepticism toward QM is discussed. If Einstein's reality as space-time curvature is correct, then space-time computers will be the next type of super fast computer.

  13. New design methods for computer aided architectural design methodology teaching

    NARCIS (Netherlands)

    Achten, H.H.

    2003-01-01

    Architects and architectural students are exploring new ways of design using Computer Aided Architectural Design software. This exploration is seldom backed up from a design methodological viewpoint. In this paper, a design methodological framework for reflection on innovative design processes by

  14. Reversible logic synthesis methodologies with application to quantum computing

    CERN Document Server

    Taha, Saleem Mohammed Ridha

    2016-01-01

    This book opens the door to a new, interesting and ambitious world of reversible and quantum computing research. It presents the state of the art required to travel around that world safely. Top world universities, companies and government institutions are in a race to develop new methodologies, algorithms and circuits on reversible logic, quantum logic, reversible and quantum computing and nano-technologies. In this book, twelve reversible logic synthesis methodologies are presented for the first time in a single work, with some new proposals. Also, the sequential reversible logic circuitries are discussed for the first time in a book. Reversible logic plays an important role in quantum computing. Any progress in the domain of reversible logic can be directly applied to quantum logic. One of the goals of this book is to show the application of reversible logic in quantum computing. A new implementation of wavelet and multiwavelet transforms using quantum computing is performed for this purpose. Rese...

  15. Spent fuel management fee methodology and computer code user's manual

    International Nuclear Information System (INIS)

    Engel, R.L.; White, M.K.

    1982-01-01

    The methodology and computer model described here were developed to analyze the cash flows for the federal government taking title to and managing spent nuclear fuel. The methodology has been used by the US Department of Energy (DOE) to estimate the spent fuel disposal fee that will provide full cost recovery. Although the methodology was designed to analyze interim storage followed by spent fuel disposal, it could be used to calculate a fee for reprocessing spent fuel and disposing of the waste. The methodology consists of two phases. The first phase estimates government expenditures for spent fuel management. The second phase determines the fees that will result in revenues such that the government attains full cost recovery assuming various revenue collection philosophies. These two phases are discussed in detail in subsequent sections of this report. Each of the two phases constitutes a computer module, called SPADE (SPent fuel Analysis and Disposal Economics) and FEAN (FEe ANalysis), respectively.

  16. SHIPBUILDING PRODUCTION PROCESS DESIGN METHODOLOGY USING COMPUTER SIMULATION

    OpenAIRE

    Marko Hadjina; Nikša Fafandjel; Tin Matulja

    2015-01-01

    In this research a shipbuilding production process design methodology, using computer simulation, is suggested. The suggested methodology is expected to provide a better and more efficient tool for the design of complex shipbuilding production processes. In the first part of this research, existing practice for production process design in shipbuilding was discussed and its shortcomings and problems were emphasized. In continuation, the discrete event simulation modelling method, as the basis of the sugge...

  17. Methodology of modeling and measuring computer architectures for plasma simulations

    Science.gov (United States)

    Wang, L. P. T.

    1977-01-01

    A brief introduction to plasma simulation using computers and the difficulties on currently available computers is given. Through the use of an analyzing and measuring methodology - SARA, the control flow and data flow of a particle simulation model REM2-1/2D are exemplified. After recursive refinements the total execution time may be greatly shortened and a fully parallel data flow can be obtained. From this data flow, a matched computer architecture or organization could be configured to achieve the computation bound of an application problem. A sequential type simulation model, an array/pipeline type simulation model, and a fully parallel simulation model of a code REM2-1/2D are proposed and analyzed. This methodology can be applied to other application problems which have implicitly parallel nature.

  18. The Methodology of Expert Audit in the Cloud Computing System

    Directory of Open Access Journals (Sweden)

    Irina Vladimirovna Mashkina

    2013-12-01

    The problem of information security audit in a cloud computing system is discussed. The methodology of the expert audit is described; it allows one to estimate not only the value of the information security risk level, but also the operative value of the information security risk level. Fuzzy cognitive maps and an artificial neural network are used to solve this problem.

  19. FOREWORD: Computational methodologies for designing materials

    Science.gov (United States)

    Rahman, Talat S.

    2009-02-01

    It would be fair to say that in the past few decades, theory and computer modeling have played a major role in elucidating the microscopic factors that dictate the properties of functional novel materials. Together with advances in experimental techniques, theoretical methods are becoming increasingly capable of predicting properties of materials at different length scales, thereby bringing in sight the long-sought goal of designing material properties according to need. Advances in computer technology and their availability at a reasonable cost around the world have made it all the more urgent to disseminate what is now known about these modern computational techniques. In this special issue on computational methodologies for materials by design we have tried to solicit articles from authors whose works collectively represent the microcosm of developments in the area. This turned out to be a difficult task for a variety of reasons, not the least of which is space limitation in this special issue. Nevertheless, we gathered twenty articles that represent some of the important directions in which theory and modeling are proceeding in the general effort to capture the ability to produce materials by design. The majority of papers presented here focus on technique developments that are expected to uncover further the fundamental processes responsible for material properties, and for their growth modes and morphological evolutions. As for material properties, some of the articles here address the challenges that continue to emerge from attempts at accurate descriptions of magnetic properties, of electronically excited states, and of sparse matter, all of which demand new looks at density functional theory (DFT). I should hasten to add that much of the success in accurate computational modeling of materials emanates from the remarkable predictive power of DFT, without which we would not be able to place the subject on firm theoretical grounds. As we know and will also

  20. Methodology for characterizing modeling and discretization uncertainties in computational simulation

    Energy Technology Data Exchange (ETDEWEB)

    Alvin, Kenneth F.; Oberkampf, William L.; Rutherford, Brian M.; Diegert, Kathleen V.

    2000-03-01

    This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.
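
    The abstract does not spell out the discretization-error estimator used; purely as an illustration of one standard ingredient in such analyses, the sketch below applies Richardson extrapolation to hypothetical grid-convergence data. The technique choice and all numbers are assumptions, not taken from the report.

```python
import math

def richardson_error(f_coarse, f_medium, f_fine, r):
    """Estimate the observed order of accuracy and the discretization error
    from solutions on three grids refined by a constant ratio r."""
    p = math.log(abs((f_coarse - f_medium) / (f_medium - f_fine))) / math.log(r)
    err_fine = (f_fine - f_medium) / (r**p - 1.0)   # error estimate on the finest grid
    f_exact = f_fine + err_fine                     # extrapolated "grid-free" value
    return p, err_fine, f_exact

# Hypothetical grid-convergence data (coarse -> fine), refinement ratio 2
p, err, f_ext = richardson_error(0.9716, 0.9821, 0.9850, r=2.0)
print(f"observed order ~ {p:.2f}, error on fine grid ~ {err:.2e}, extrapolated ~ {f_ext:.4f}")
```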

  1. Statistical Methodologies to Integrate Experimental and Computational Research

    Science.gov (United States)

    Parker, P. A.; Johnson, R. T.; Montgomery, D. C.

    2008-01-01

    Development of advanced algorithms for simulating engine flow paths requires the integration of fundamental experiments with the validation of enhanced mathematical models. In this paper, we provide an overview of statistical methods to strategically and efficiently conduct experiments and computational model refinement. Moreover, the integration of experimental and computational research efforts is emphasized. With a statistical engineering perspective, scientific and engineering expertise is combined with statistical sciences to gain deeper insights into experimental phenomena and code development performance, supporting the overall research objectives. The particular statistical methods discussed are design of experiments, response surface methodology, and uncertainty analysis and planning. Their application is illustrated with a coaxial free jet experiment and a turbulence model refinement investigation. Our goal is to provide an overview, focusing on concepts rather than practice, to demonstrate the benefits of using statistical methods in research and development, thereby encouraging their broader and more systematic application.

  2. New computational methodology for large 3D neutron transport problems

    International Nuclear Information System (INIS)

    Dahmani, M.; Roy, R.; Koclas, J.

    2004-01-01

    We present a new computational methodology, based on 3D characteristics method, dedicated to solve very large 3D problems without spatial homogenization. In order to eliminate the input/output problems occurring when solving these large problems, we set up a new computing scheme that requires more CPU resources than the usual one, based on sweeps over large tracking files. The huge capacity of storage needed in some problems and the related I/O queries needed by the characteristics solver are replaced by on-the-fly recalculation of tracks at each iteration step. Using this technique, large 3D problems are no longer I/O-bound, and distributed CPU resources can be efficiently used. (authors)

  3. MoPCoM Methodology: Focus on Models of Computation

    Science.gov (United States)

    Koudri, Ali; Champeau, Joël; Le Lann, Jean-Christophe; Leilde, Vincent

    Today, developments of Real Time Embedded Systems have to face new challenges. On the one hand, Time-To-Market constraints require a reliable development process allowing quick design space exploration. On the other hand, rapidly developing technology, as stated by Moore's law, requires techniques to handle the resulting productivity gap. In a previous paper, we have presented our Model Based Engineering methodology addressing those issues. In this paper, we make a focus on Models of Computation design and analysis. We illustrate our approach on a Cognitive Radio System development implemented on an FPGA. This work is part of the MoPCoM research project gathering academic and industrial organizations (http://www.mopcom.fr).

  4. Development of Fuzzy Logic and Soft Computing Methodologies

    Science.gov (United States)

    Zadeh, L. A.; Yager, R.

    1999-01-01

    Our earlier research on computing with words (CW) has led to a new direction in fuzzy logic which points to a major enlargement of the role of natural languages in information processing, decision analysis and control. This direction is based on the methodology of computing with words and embodies a new theory which is referred to as the computational theory of perceptions (CTP). An important feature of this theory is that it can be added to any existing theory - especially to probability theory, decision analysis, and control - and enhance the ability of the theory to deal with real-world problems in which the decision-relevant information is a mixture of measurements and perceptions. The new direction is centered on an old concept - the concept of a perception - a concept which plays a central role in human cognition. The ability to reason with perceptions - perceptions of time, distance, force, direction, shape, intent, likelihood, truth and other attributes of physical and mental objects - underlies the remarkable human capability to perform a wide variety of physical and mental tasks without any measurements and any computations. Everyday examples of such tasks are parking a car, driving in city traffic, cooking a meal, playing golf and summarizing a story. Perceptions are intrinsically imprecise. Imprecision of perceptions reflects the finite ability of sensory organs and ultimately, the brain, to resolve detail and store information. More concretely, perceptions are both fuzzy and granular, or, for short, f-granular. Perceptions are f-granular in the sense that: (a) the boundaries of perceived classes are not sharply defined; and (b) the elements of classes are grouped into granules, with a granule being a clump of elements drawn together by indistinguishability, similarity, proximity or functionality. F-granularity of perceptions may be viewed as a human way of achieving data compression. In large measure, scientific progress has been, and continues to be

  5. Beam standardization and dosimetric methodology in computed tomography

    International Nuclear Information System (INIS)

    Maia, Ana Figueiredo

    2005-01-01

    Special ionization chambers, named pencil ionization chambers, are used in dosimetric procedures in computed tomography (CT) beams. In this work, an extensive study of pencil ionization chambers was performed, as a contribution to the accuracy of the dosimetric procedures in CT beams. The international scientific community has recently been discussing the need to establish a specific calibration procedure for CT ionization chambers, since these chambers present special characteristics that differentiate them from other ionization chambers used in diagnostic radiology beams. In this work, an adequate calibration procedure for pencil ionization chambers was established at the Calibration Laboratory of the Instituto de Pesquisas Energéticas e Nucleares, in accordance with the most recent international recommendations. Two calibration methodologies were tested and analyzed by comparative studies. Moreover, a new extended-length parallel-plate ionization chamber, with a transversal section very similar to that of pencil ionization chambers, was developed. The operational characteristics of this chamber were determined and the results obtained showed that its behaviour is adequate as a reference system in CT standard beams. Two other studies were performed during this work, both using CT ionization chambers. The first study was about the performance of a pencil ionization chamber in standard radiation beams of several types and energies, and the results showed that this chamber presents satisfactory behaviour in other radiation qualities, such as those of diagnostic radiology, mammography and radiotherapy. In the second study, a tandem system for verification of half-value layer variations in CT equipment, using a pencil ionization chamber, was developed. Because of the X-ray tube rotation, the determination of half-value layers in computed tomography equipment is not an easy task, and it is usually not performed within quality control programs. (author)

  6. Development of a computational methodology for internal dose calculations

    International Nuclear Information System (INIS)

    Yoriyaz, Helio

    2000-01-01

    A new approach for calculating internal dose estimates was developed through the use of a more realistic computational model of the human body and a more precise tool for the radiation transport simulation. The present technique shows the capability to build a patient-specific phantom from tomography data (a voxel-based phantom) for the simulation of radiation transport and energy deposition using Monte Carlo methods such as in the MCNP-4B code. In order to utilize the segmented human anatomy as a computational model for the simulation of radiation transport, an interface program, SCMS, was developed to build the geometric configurations for the phantom through the use of tomographic images. This procedure allows the calculation not only of average dose values but also of the spatial distribution of dose in regions of interest. With the present methodology, absorbed fractions for photons and electrons in various organs of the Zubal segmented phantom were calculated and compared to those reported for the mathematical phantoms of Snyder and Cristy-Eckerman. Although the differences in organ geometry between the phantoms are quite evident, the results in general show small discrepancies; in some cases, however, considerable discrepancies were found due to two major causes: differences in the organ masses between the phantoms and the occurrence of organ overlap in the Zubal segmented phantom, which is not considered in the mathematical phantom. This effect was quite evident for organ cross-irradiation from electrons. The determination of spatial dose distributions demonstrated the possibility of evaluating more detailed dose data than those obtained with conventional methods, which will give important information for clinical analysis in therapeutic procedures and in radiobiological studies of the human body. (author)

  7. Computer-Aided Methodology for Syndromic Strabismus Diagnosis.

    Science.gov (United States)

    Sousa de Almeida, João Dallyson; Silva, Aristófanes Corrêa; Teixeira, Jorge Antonio Meireles; Paiva, Anselmo Cardoso; Gattass, Marcelo

    2015-08-01

    Strabismus is a pathology that affects approximately 4 % of the population, causing aesthetic problems reversible at any age and irreversible sensory alterations that modify the vision mechanism. The Hirschberg test is one type of examination for detecting this pathology. Computer-aided detection/diagnosis is being used with relative success to aid health professionals. Nevertheless, the routine use of high-tech devices for aiding ophthalmological diagnosis and therapy is not a reality within the subspecialty of strabismus. Thus, this work presents a methodology to aid in diagnosis of syndromic strabismus through digital imaging. Two hundred images belonging to 40 patients previously diagnosed by a specialist were tested. The method was demonstrated to be 88 % accurate in identifying esotropias (ET), 100 % for exotropias (XT), 80.33 % for hypertropias (HT), and 83.33 % for hypotropias (HoT). The overall average error was 5.6Δ and 3.83Δ for horizontal and vertical deviations, respectively, against the measures presented by the specialist.

  8. Computational optimization of biodiesel combustion using response surface methodology

    Directory of Open Access Journals (Sweden)

    Ganji Prabhakara Rao

    2017-01-01

    The present work focuses on optimization of the biodiesel combustion phenomena through a parametric approach using response surface methodology. Physical properties of biodiesel play a vital role in accurate simulations of the fuel spray, atomization, combustion, and emission formation processes. Typically, methyl-based biodiesel consists of five main types of esters in its composition: methyl palmitate, methyl oleate, methyl stearate, methyl linoleate, and methyl linolenate. Based on the amounts of methyl esters present, the properties of pongamia biodiesel and its blends were estimated. The CONVERGE™ computational fluid dynamics software was used to simulate the fuel spray, turbulence and combustion phenomena. The simulation responses, such as indicated specific fuel consumption, NOx, and soot, were analyzed using design of experiments. Regression equations were developed for each of these responses. The optimum parameters were found to be: compression ratio – 16.75, start of injection – 21.9° before top dead center, and exhaust gas re-circulation – 10.94%. Results have been compared with the baseline case.
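
    The regression equations themselves are not reproduced in the abstract; the sketch below only illustrates the generic response-surface step, fitting a second-order model to hypothetical simulation responses over compression ratio, start of injection and EGR, and then locating the minimum-ISFC setting on the fitted surface. All design points and response values are invented for illustration.

```python
import numpy as np
from itertools import combinations

def quadratic_design_matrix(X):
    """Full second-order model terms: intercept, x_i, x_i^2, x_i*x_j."""
    cols = [np.ones(len(X))]
    cols += [X[:, i] for i in range(X.shape[1])]
    cols += [X[:, i] ** 2 for i in range(X.shape[1])]
    cols += [X[:, i] * X[:, j] for i, j in combinations(range(X.shape[1]), 2)]
    return np.column_stack(cols)

# Hypothetical design points: compression ratio, SOI (deg bTDC), EGR (%)
X = np.array([[15.0, 19.0,  5.0], [15.0, 25.0, 15.0], [18.0, 19.0, 15.0],
              [18.0, 25.0,  5.0], [16.5, 22.0, 10.0], [16.5, 22.0, 12.0],
              [15.5, 21.0,  8.0], [17.5, 23.0, 12.0], [16.0, 20.0, 11.0],
              [17.0, 24.0,  9.0], [16.5, 19.5, 13.0], [15.8, 23.5,  7.0]])
isfc = np.array([215., 221., 219., 218., 210., 211., 213., 214., 212., 216., 215., 214.])

beta, *_ = np.linalg.lstsq(quadratic_design_matrix(X), isfc, rcond=None)

# Grid-search the fitted surface for the minimum-ISFC setting
grid = np.array(np.meshgrid(np.linspace(15, 18, 31), np.linspace(19, 25, 31),
                            np.linspace(5, 15, 31))).reshape(3, -1).T
pred = quadratic_design_matrix(grid) @ beta
print("predicted optimum (CR, SOI, EGR):", grid[np.argmin(pred)])
```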

  9. A computational methodology for formulating gasoline surrogate fuels with accurate physical and chemical kinetic properties

    KAUST Repository

    Ahmed, Ahfaz; Goteng, Gokop; Shankar, Vijai; Al-Qurashi, Khalid; Roberts, William L.; Sarathy, Mani

    2015-01-01

    simpler molecular composition that represent real fuel behavior in one or more aspects are needed to enable repeatable experimental and computational combustion investigations. This study presents a novel computational methodology for formulating

  10. Teaching and Learning Methodologies Supported by ICT Applied in Computer Science

    Science.gov (United States)

    Capacho, Jose

    2016-01-01

    The main objective of this paper is to show a set of new methodologies applied in the teaching of Computer Science using ICT. The methodologies are framed in the conceptual basis of the following sciences: Psychology, Education and Computer Science. The theoretical framework of the research is supported by Behavioral Theory, Gestalt Theory.…

  11. Tornado missile simulation and design methodology. Volume 1: simulation methodology, design applications, and TORMIS computer code. Final report

    International Nuclear Information System (INIS)

    Twisdale, L.A.; Dunn, W.L.

    1981-08-01

    A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments. Sensitivity analyses have been performed on both the individual models and the integrated methodology, and risk has been assessed for a hypothetical nuclear power plant design case study
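
    TORMIS itself is not reproduced here; the sketch below is only a heavily simplified illustration of the Monte Carlo structure described (sample tornado occurrences, lofted missiles, impacts and damage over many simulated years), with all rates and probabilities hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_year(n_targets=3, lam_tornado=0.02, missiles_per_tornado=200,
                  p_hit=1.5e-4, p_damage_given_hit=0.05):
    """One Monte Carlo year: sample tornadoes striking the site, missiles lofted,
    hits on safety-related targets, and damaging impacts (all rates hypothetical)."""
    damage = False
    for _ in range(rng.poisson(lam_tornado)):            # tornadoes striking the site this year
        n_missiles = rng.poisson(missiles_per_tornado)   # missiles injected into the wind field
        hits = rng.binomial(n_missiles, p_hit * n_targets)
        if hits and rng.random() < 1 - (1 - p_damage_given_hit) ** hits:
            damage = True
    return damage

years = 200_000
p_damage = sum(simulate_year() for _ in range(years)) / years
print(f"estimated annual damage probability ~ {p_damage:.2e}")
```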

  12. A Simulation-Based Soft Error Estimation Methodology for Computer Systems

    OpenAIRE

    Sugihara, Makoto; Ishihara, Tohru; Hashimoto, Koji; Muroyama, Masanori

    2006-01-01

    This paper proposes a simulation-based soft error estimation methodology for computer systems. Accumulating soft error rates (SERs) of all memories in a computer system results in pessimistic soft error estimation. This is because memory cells are used spatially and temporally and not all soft errors in them make the computer system faulty. Our soft-error estimation methodology considers the locations and the timings of soft errors occurring at every level of memory hierarchy and estimates th...

  13. A Computer-Based System Integrating Instruction and Information Retrieval: A Description of Some Methodological Considerations.

    Science.gov (United States)

    Selig, Judith A.; And Others

    This report, summarizing the activities of the Vision Information Center (VIC) in the field of computer-assisted instruction from December, 1966 to August, 1967, describes the methodology used to load a large body of information--a programmed text on basic ophthalmology--onto a computer for subsequent information retrieval and computer-assisted…

  14. A Comparison of the Methodological Quality of Articles in Computer Science Education Journals and Conference Proceedings

    Science.gov (United States)

    Randolph, Justus J.; Julnes, George; Bednarik, Roman; Sutinen, Erkki

    2007-01-01

    In this study we empirically investigate the claim that articles published in computer science education journals are more methodologically sound than articles published in computer science education conference proceedings. A random sample of 352 articles was selected from those articles published in major computer science education forums between…

  15. Computational Methodologies for Developing Structure–Morphology–Performance Relationships in Organic Solar Cells: A Protocol Review

    KAUST Repository

    Do, Khanh; Ravva, Mahesh Kumar; Wang, Tonghui; Bredas, Jean-Luc

    2016-01-01

    We outline a step-by-step protocol that incorporates a number of theoretical and computational methodologies to evaluate the structural and electronic properties of pi-conjugated semiconducting materials in the condensed phase. Our focus

  16. Methodological Potential of Computer Experiment in Teaching Mathematics at University

    Science.gov (United States)

    Lin, Kequan; Sokolova, Anna Nikolaevna; Vlasova, Vera K.

    2017-01-01

    The study is relevant due to the opportunity of increasing the efficiency of teaching mathematics at university through the integration into this process of computer experiments conducted by students with the use of IT. The problem of the research is defined by a contradiction between the great potential of the mathematical experiment for motivating and…

  17. Fuzzy Clustering based Methodology for Multidimensional Data Analysis in Computational Forensic Domain

    OpenAIRE

    Kilian Stoffel; Paul Cotofrei; Dong Han

    2012-01-01

    As an interdisciplinary domain requiring advanced and innovative methodologies, the computational forensics domain is characterized by data that are simultaneously large-scale and uncertain, multidimensional and approximate. Forensic domain experts, trained to discover hidden patterns in crime data, are limited in their analysis without the assistance of a computational intelligence approach. In this paper, a methodology and an automatic procedure, based on fuzzy set theory and designed to infer precis...

  18. Fault-tolerant clock synchronization validation methodology. [in computer systems

    Science.gov (United States)

    Butler, Ricky W.; Palumbo, Daniel L.; Johnson, Sally C.

    1987-01-01

    A validation method for the synchronization subsystem of a fault-tolerant computer system is presented. The high reliability requirement of flight-crucial systems precludes the use of most traditional validation methods. The method presented utilizes formal design proof to uncover design and coding errors and experimentation to validate the assumptions of the design proof. The experimental method is described and illustrated by validating the clock synchronization system of the Software Implemented Fault Tolerance computer. The design proof of the algorithm includes a theorem that defines the maximum skew between any two nonfaulty clocks in the system in terms of specific system parameters. Most of these parameters are deterministic. One crucial parameter is the upper bound on the clock read error, which is stochastic. The probability that this upper bound is exceeded is calculated from data obtained by the measurement of system parameters. This probability is then included in a detailed reliability analysis of the system.
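
    As a hedged illustration of the final step described, estimating from measured data the probability that the clock read error exceeds its assumed upper bound, the sketch below combines an empirical exceedance fraction with a simple exponential tail fit; neither the statistical model nor the numbers are taken from the SIFT validation itself.

```python
import numpy as np

def exceedance_probability(samples, bound):
    """Estimate P(read error > bound) from measured clock-read errors.

    Uses the empirical fraction plus a simple exponential tail fit for the
    case where no sample exceeds the bound (both choices are illustrative,
    not the model used in the SIFT validation)."""
    samples = np.asarray(samples, dtype=float)
    empirical = np.mean(samples > bound)
    lam = 1.0 / samples.mean()                 # exponential fit by moments
    tail = np.exp(-lam * bound)
    return empirical, tail

# Hypothetical clock-read-error measurements (microseconds) and design bound
rng = np.random.default_rng(7)
errors = rng.exponential(scale=2.0, size=5000)
emp, tail = exceedance_probability(errors, bound=20.0)
print(f"empirical: {emp:.1e}, exponential-tail estimate: {tail:.1e}")
```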

  19. Computer Aided Methodology for Simultaneous Synthesis, Design & Analysis of Chemical Products-Processes

    DEFF Research Database (Denmark)

    d'Anterroches, Loïc; Gani, Rafiqul

    2006-01-01

    A new combined methodology for computer aided molecular design and process flowsheet design is presented. The methodology is based on the group contribution approach for prediction of molecular properties and design of molecules. Using the same principles, process groups have been developed together with their corresponding flowsheet property models. To represent the process flowsheets in the same way as molecules, a unique but simple notation system has been developed. The methodology has been converted into a prototype software, which has been tested with several case studies covering a wide range of problems. In this paper, only the computer aided flowsheet design related features are presented.

  20. TEACHING AND LEARNING METHODOLOGIES SUPPORTED BY ICT APPLIED IN COMPUTER SCIENCE

    Directory of Open Access Journals (Sweden)

    Jose CAPACHO

    2016-04-01

    The main objective of this paper is to show a set of new methodologies applied in the teaching of Computer Science using ICT. The methodologies are framed in the conceptual basis of the following sciences: Psychology, Education and Computer Science. The theoretical framework of the research is supported by Behavioral Theory, Gestalt Theory, Genetic-Cognitive Psychology Theory and Dialectics Psychology. Based on the theoretical framework, the following methodologies were developed: Game Theory, Constructivist Approach, Personalized Teaching, Problem Solving, Cooperative-Collaborative Learning, and Learning Projects using ICT. These methodologies were applied to the teaching-learning process during the Algorithms and Complexity (A&C) course, which belongs to the area of Computer Science. The course develops the concepts of Computers, Complexity and Intractability, Recurrence Equations, Divide and Conquer, Greedy Algorithms, Dynamic Programming, Shortest Path Problem and Graph Theory. The main value of the research is the theoretical support of the methodologies and their application supported by ICT using learning objects. The aforementioned course was built on the Blackboard platform, evaluating the operation of the methodologies. The results of the evaluation are presented for each of them, showing the learning outcomes achieved by students, which verifies that the methodologies are functional.

  1. Design and analysis of sustainable computer mouse using design for disassembly methodology

    Science.gov (United States)

    Roni Sahroni, Taufik; Fitri Sukarman, Ahmad; Agung Mahardini, Karunia

    2017-12-01

    This paper presents the design and analysis of a computer mouse using the Design for Disassembly methodology. The existing computer mouse model consists of a number of unnecessary parts that increase the assembly and disassembly time in production. The objective of this project is to design a new computer mouse based on the Design for Disassembly (DFD) methodology. The methodology of this paper proceeds from sketch generation through concept selection and concept scoring. Based on the design screening, design concept B was selected for further analysis. A new computer mouse design using a fastening system is proposed. Furthermore, three materials, ABS, polycarbonate, and high-density PE, were considered to determine the environmental impact category. A sustainability analysis was conducted using SolidWorks software. As a result, high-density PE gives the lowest amounts in the environmental impact categories with a high maximum stress value.

  2. Problems and Issues in Using Computer- Based Support Tools to Enhance 'Soft' Systems Methodologies

    Directory of Open Access Journals (Sweden)

    Mark Stansfield

    2001-11-01

    This paper explores the issue of whether computer-based support tools can enhance the use of 'soft' systems methodologies as applied to real-world problem situations. Although work has been carried out by a number of researchers in applying computer-based technology to concepts and methodologies relating to 'soft' systems thinking, such as Soft Systems Methodology (SSM), such attempts appear to be still in their infancy and have not been applied widely to real-world problem situations. This paper will highlight some of the problems that may be encountered in attempting to develop computer-based support tools for 'soft' systems methodologies. Particular attention will be paid to an attempt by the author to develop a computer-based support tool for a particular 'soft' systems method of inquiry known as the Appreciative Inquiry Method, which is based upon Vickers' notion of 'appreciation' (Vickers, 1965) and Checkland's SSM (Checkland, 1981). The final part of the paper will explore some of the lessons learnt from developing and applying the computer-based support tool to a real-world problem situation, as well as considering the feasibility of developing computer-based support tools for 'soft' systems methodologies. This paper will put forward the point that a mixture of manual and computer-based tools should be employed to allow a methodology to be used in an unconstrained manner, but the benefits provided by computer-based technology should be utilised in supporting and enhancing the more mundane and structured tasks.

  3. A fast reactor transient analysis methodology for personal computers

    International Nuclear Information System (INIS)

    Ott, K.O.

    1993-01-01

    A simplified model for a liquid-metal-cooled reactor (LMR) transient analysis, in which point kinetics as well as lumped descriptions of the heat transfer equations in all components are applied, is converted from a differential into an integral formulation. All 30 differential balance equations are implicitly solved in terms of convolution integrals. The prompt jump approximation is applied as the strong negative feedback effectively keeps the net reactivity well below prompt critical. After implicit finite differencing of the convolution integrals, the kinetics equation assumes a new form, i.e., the quadratic dynamics equation. In this integral formulation, the initial value problem of typical LMR transients can be solved with large time steps (initially 1 s, later up to 256 s). This then makes transient problems amenable to treatment on a personal computer. The resulting mathematical model forms the basis for the GW-BASIC LMR transient calculation (LTC) program. The LTC program has also been converted to QuickBASIC. The running time for a 10-h overpower transient is then ∼40 to 10 s, depending on the hardware version (286, 386, or 486 with math coprocessors).

  4. Distributed computing methodology for training neural networks in an image-guided diagnostic application.

    Science.gov (United States)

    Plagianakos, V P; Magoulas, G D; Vrahatis, M N

    2006-03-01

    Distributed computing is a process through which a set of computers connected by a network is used collectively to solve a single problem. In this paper, we propose a distributed computing methodology for training neural networks for the detection of lesions in colonoscopy. Our approach is based on partitioning the training set across multiple processors using a parallel virtual machine. In this way, interconnected computers of varied architectures can be used for the distributed evaluation of the error function and gradient values, and, thus, training neural networks utilizing various learning methods. The proposed methodology has large granularity and low synchronization, and has been implemented and tested. Our results indicate that the parallel virtual machine implementation of the training algorithms developed leads to considerable speedup, especially when large network architectures and training sets are used.
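
    A minimal sketch of the data-parallel idea described, partitioning the training set and evaluating the error function and gradient on each partition in parallel before summing, is given below using Python's multiprocessing module in place of a parallel virtual machine; the one-layer network, random data and learning rate are illustrative assumptions.

```python
import numpy as np
from multiprocessing import Pool

def partial_grad(args):
    """Error and gradient of a one-layer tanh network (MSE loss) on one data partition."""
    w, X, y = args
    pred = np.tanh(X @ w)
    err = pred - y
    grad = X.T @ (err * (1 - pred**2))
    return 0.5 * np.sum(err**2), grad

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X, y = rng.normal(size=(10_000, 8)), rng.normal(size=10_000)
    w = np.zeros(8)

    with Pool(processes=4) as pool:                    # stand-in for the PVM worker pool
        for _ in range(50):                            # simple batch gradient descent
            chunks = [(w, Xc, yc) for Xc, yc in zip(np.array_split(X, 4),
                                                    np.array_split(y, 4))]
            results = pool.map(partial_grad, chunks)   # distributed error/gradient evaluation
            loss = sum(r[0] for r in results)
            grad = sum(r[1] for r in results)
            w -= 1e-4 * grad
        print(f"final training loss: {loss:.2f}")
```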

  5. A New Methodology for Fuel Mass Computation of an operating Aircraft

    Directory of Open Access Journals (Sweden)

    M Souli

    2016-03-01

    The paper presents a new computational methodology for an accurate computation of the fuel mass inside an aircraft wing during flight. The computation is carried out using hydrodynamic equations, classically known as the Navier-Stokes equations in the CFD community. For this purpose, computational software is developed that computes the fuel mass inside the tank based on experimental data from pressure gauges inserted in the fuel tank. In current practice, and for safety reasons, an optical fiber sensor is used for fluid level detection. The optical system consists of an optically controlled acoustic transceiver which measures the fuel level inside each compartment of the fuel tank. The system computes the fuel volume inside the tank and needs the density to compute the total fuel mass; with the optical sensor technique, a density measurement inside the tank is therefore required. The method developed in the paper requires pressure measurements in each tank compartment; the density is then computed from these measurements under hydrostatic assumptions. The methodology is tested using a fuel tank provided by Airbus for a time-history refueling process.
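
    As an illustration of the hydrostatic step described, the sketch below derives a per-compartment density from two pressure readings separated by a known height and combines it with the sensed compartment volume to obtain the total fuel mass; the compartment geometry and readings are invented for the example.

```python
G = 9.80665  # standard gravity, m/s^2

def density_from_pressures(p_lower, p_upper, dz):
    """Hydrostatic estimate of fuel density from two pressure gauges
    separated by a vertical distance dz (Pa, Pa, m -> kg/m^3)."""
    return (p_lower - p_upper) / (G * dz)

def fuel_mass(compartments):
    """Total fuel mass from per-compartment volume (m^3, e.g. from the
    optical level sensors) and gauge pressure pairs."""
    total = 0.0
    for c in compartments:
        rho = density_from_pressures(c["p_lower"], c["p_upper"], c["dz"])
        total += rho * c["volume"]
    return total

# Hypothetical two-compartment wing tank
compartments = [
    {"p_lower": 104_300.0, "p_upper": 101_325.0, "dz": 0.38, "volume": 1.9},
    {"p_lower": 103_900.0, "p_upper": 101_325.0, "dz": 0.33, "volume": 1.4},
]
print(f"estimated fuel mass: {fuel_mass(compartments):.1f} kg")
```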

  6. Safety of High Speed Ground Transportation Systems : Analytical Methodology for Safety Validation of Computer Controlled Subsystems : Volume 2. Development of a Safety Validation Methodology

    Science.gov (United States)

    1995-01-01

    This report describes the development of a methodology designed to assure that a sufficiently high level of safety is achieved and maintained in computer-based systems which perform safety critical functions in high-speed rail or magnetic levitation ...

  7. Analytical methodology for safety validation of computer controlled subsystems. Volume 1 : state-of-the-art and assessment of safety verification/validation methodologies

    Science.gov (United States)

    1995-09-01

    This report describes the development of a methodology designed to assure that a sufficiently high level of safety is achieved and maintained in computer-based systems which perform safety critical functions in high-speed rail or magnetic levitation ...

  8. Computational Methodologies for Developing Structure–Morphology–Performance Relationships in Organic Solar Cells: A Protocol Review

    KAUST Repository

    Do, Khanh

    2016-09-08

    We outline a step-by-step protocol that incorporates a number of theoretical and computational methodologies to evaluate the structural and electronic properties of pi-conjugated semiconducting materials in the condensed phase. Our focus is on methodologies appropriate for the characterization, at the molecular level, of the morphology in blend systems consisting of an electron donor and electron acceptor, of importance for understanding the performance properties of bulk-heterojunction organic solar cells. The protocol is formulated as an introductory manual for investigators who aim to study the bulk-heterojunction morphology in molecular detail, thereby facilitating the development of structure-morphology-property relationships when used in tandem with experimental results.

  9. Calculation and evaluation methodology of the flawed pipe and the compute program development

    International Nuclear Information System (INIS)

    Liu Chang; Qian Hao; Yao Weida; Liang Xingyun

    2013-01-01

    Background: For a pressurized pipe, a crack will grow gradually under alternating load even when the load is less than the fatigue strength limit. Purpose: Both the calculation and the evaluation methodology for a flawed pipe detected during in-service inspection are elaborated here, based on Elastic Plastic Fracture Mechanics (EPFM) criteria. Methods: In the computation, the interaction of flaw depth and length has been considered, and a computer program has been developed in Visual C++. Results: The fluctuating load of the Reactor Coolant System transients, the initial flaw shape, and the initial flaw orientation are all accounted for here. Conclusions: The calculation and evaluation methodology presented here is an important basis for deciding whether continued operation is acceptable. (authors)

  10. An alternative methodology for planning computer class where teaching means are used

    Directory of Open Access Journals (Sweden)

    Maria del Carmen Carrillo Hernández

    2016-06-01

    The subject Informatics II is taught in the fourth year of the teacher-training programme in Informática - Labor Education. One of its objectives is to develop in students the skills that allow them to plan and structure, independently, originally and creatively, the computer class in which the use of computer teaching means is the guiding element through which students acquire knowledge. Limitations in this regard have been identified in professional practice. With the aim of contributing to the development of these skills, the authors of this article propose an alternative methodology that will guide teachers of this subject in leading the teaching-learning process so that the goals set out in the syllabus are met.

  11. An efficient hysteresis modeling methodology and its implementation in field computation applications

    Energy Technology Data Exchange (ETDEWEB)

    Adly, A.A., E-mail: adlyamr@gmail.com [Electrical Power and Machines Dept., Faculty of Engineering, Cairo University, Giza 12613 (Egypt)]; Abd-El-Hafiz, S.K. [Engineering Mathematics Department, Faculty of Engineering, Cairo University, Giza 12613 (Egypt)]

    2017-07-15

    Highlights: • An approach to simulate hysteresis while taking shape anisotropy into consideration. • Utilizing the ensemble of triangular sub-region hysteresis models in field computation. • A novel tool capable of carrying out field computation while keeping track of hysteresis losses. • The approach may be extended to 3D tetrahedral sub-volumes. - Abstract: Field computation in media exhibiting hysteresis is crucial to a variety of applications such as magnetic recording processes and accurate determination of core losses in power devices. Recently, Hopfield neural networks (HNN) have been successfully configured to construct scalar and vector hysteresis models. This paper presents an efficient hysteresis modeling methodology and its implementation in field computation applications. The methodology is based on the application of the integral equation approach on discretized triangular magnetic sub-regions. Within every triangular sub-region, hysteresis properties are realized using a 3-node HNN. Details of the approach and sample computation results are given in the paper.

  12. A methodology for the design of experiments in computational intelligence with multiple regression models.

    Science.gov (United States)

    Fernandez-Lozano, Carlos; Gestal, Marcos; Munteanu, Cristian R; Dorado, Julian; Pazos, Alejandro

    2016-01-01

    The design of experiments and the validation of the results achieved with them are vital in any research study. This paper focuses on the use of different Machine Learning approaches for regression tasks in the field of Computational Intelligence and especially on a correct comparison between the different results provided by different methods, as those techniques are complex systems that require further study to be fully understood. A methodology commonly accepted in Computational Intelligence is implemented in an R package called RRegrs. This package includes ten simple and complex regression models to carry out predictive modeling using Machine Learning and well-known regression algorithms. The framework for experimental design presented herein is evaluated and validated against RRegrs. Our results are different for three out of five state-of-the-art simple datasets and it can be stated that the selection of the best model according to our proposal is statistically significant and relevant. It is of relevance to use a statistical approach to indicate whether the differences are statistically significant when using this kind of algorithm. Furthermore, our results with three real complex datasets report different best models than with the previously published methodology. Our final goal is to provide a complete methodology for the use of different steps in order to compare the results obtained in Computational Intelligence problems, as well as in other fields, such as bioinformatics, cheminformatics, etc., given that our proposal is open and modifiable.

  13. A methodology for the design of experiments in computational intelligence with multiple regression models

    Directory of Open Access Journals (Sweden)

    Carlos Fernandez-Lozano

    2016-12-01

    Full Text Available The design of experiments and the validation of the results achieved with them are vital in any research study. This paper focuses on the use of different Machine Learning approaches for regression tasks in the field of Computational Intelligence, and especially on a correct comparison between the results provided by different methods, as those techniques are complex systems that require further study to be fully understood. A methodology commonly accepted in Computational Intelligence is implemented in an R package called RRegrs. This package includes ten simple and complex regression models to carry out predictive modeling using Machine Learning and well-known regression algorithms. The framework for experimental design presented herein is evaluated and validated against RRegrs. Our results differ for three out of five state-of-the-art simple datasets, and the selection of the best model according to our proposal is statistically significant and relevant. It is important to use a statistical approach to indicate whether the differences are statistically significant when using this kind of algorithm. Furthermore, our results with three real complex datasets report different best models than the previously published methodology. Our final goal is to provide a complete methodology covering the different steps needed to compare the results obtained in Computational Intelligence problems, as well as in other fields such as bioinformatics, cheminformatics, etc., given that our proposal is open and modifiable.

  14. FPGA hardware acceleration for high performance neutron transport computation based on agent methodology - 318

    International Nuclear Information System (INIS)

    Shanjie, Xiao; Tatjana, Jevremovic

    2010-01-01

    The accurate, detailed and 3D neutron transport analysis for Gen-IV reactors is still time-consuming regardless of the advanced computational hardware available in developed countries. This paper introduces a new concept in addressing the computational time while preserving the detailed and accurate modeling; a specifically designed FPGA co-processor accelerates the robust AGENT methodology for complex reactor geometries. For the first time this approach is applied to accelerate the neutronics analysis. The AGENT methodology solves the neutron transport equation using the method of characteristics. The AGENT methodology performance was carefully analyzed before the hardware design based on the FPGA co-processor was adopted. The most time-consuming kernel part is then transplanted into the FPGA co-processor. The FPGA co-processor is designed with a data-flow-driven, non-von Neumann architecture and has much higher efficiency than the conventional computer architecture. Details of the FPGA co-processor design are introduced and the design is benchmarked using two different examples. The advanced chip architecture helps the FPGA co-processor obtain a speedup of more than 20 times with its working frequency much lower than the CPU frequency. (authors)

  15. A Practical, Robust Methodology for Acquiring New Observation Data Using Computationally Expensive Groundwater Models

    Science.gov (United States)

    Siade, Adam J.; Hall, Joel; Karelse, Robert N.

    2017-11-01

    Regional groundwater flow models play an important role in decision making regarding water resources; however, the uncertainty embedded in model parameters and model assumptions can significantly hinder the reliability of model predictions. One way to reduce this uncertainty is to collect new observation data from the field. However, determining where and when to obtain such data is not straightforward. A number of data-worth and experimental design strategies have been developed for this purpose; however, these studies often ignore issues related to real-world groundwater models such as computational expense, existing observation data, and high parameter dimensionality. In this study, we propose a methodology, based on existing methods and software, to efficiently conduct such analyses for large-scale, complex regional groundwater flow systems for which there is a wealth of available observation data. The method utilizes the well-established D-optimality criterion and the minimax criterion for robust sampling strategies. The so-called Null-Space Monte Carlo method is used to reduce the computational burden associated with uncertainty quantification, and a heuristic methodology based on the greedy algorithm is proposed for developing robust designs with subsets of the posterior parameter samples. The proposed methodology is tested on a synthetic regional groundwater model, and subsequently applied to an existing, complex, regional groundwater system in the Perth region of Western Australia. The results indicate that robust designs can be obtained efficiently, within reasonable computational resources, for making regional decisions regarding groundwater level sampling.
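
    A minimal sketch of the greedy, D-optimality-driven selection of new observations described above, assuming a random stand-in sensitivity (Jacobian) matrix in place of the groundwater model's actual sensitivities; it is not the authors' PEST/Null-Space Monte Carlo workflow.

```python
import numpy as np

rng = np.random.default_rng(1)
n_candidates, n_params = 50, 8
# Stand-in sensitivity matrix: row i = d(prediction at candidate location i)/d(parameters).
# In the study these sensitivities would come from the groundwater model, not random numbers.
J = rng.normal(size=(n_candidates, n_params))

def log_det_fim(rows, prior_weight=1e-3):
    """log-determinant of the (regularized) Fisher information for a set of observations."""
    A = J[rows]
    fim = A.T @ A + prior_weight * np.eye(n_params)
    return np.linalg.slogdet(fim)[1]

def greedy_d_optimal(n_new):
    """Pick observation locations one at a time, each maximizing the D-optimality gain."""
    chosen = []
    for _ in range(n_new):
        remaining = [i for i in range(n_candidates) if i not in chosen]
        gains = {i: log_det_fim(chosen + [i]) for i in remaining}
        chosen.append(max(gains, key=gains.get))
    return chosen

design = greedy_d_optimal(5)
print("selected candidate locations:", design)
print("log det FIM of design:", round(log_det_fim(design), 3))
```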

  16. Just-in-Time Compilation-Inspired Methodology for Parallelization of Compute Intensive Java Code

    Directory of Open Access Journals (Sweden)

    GHULAM MUSTAFA

    2017-01-01

    Full Text Available Compute intensive programs generally consume a significant fraction of execution time in a small amount of repetitive code. Such repetitive code is commonly known as hotspot code. We observed that compute intensive hotspots often possess exploitable loop level parallelism. A JIT (Just-in-Time) compiler profiles a running program to identify its hotspots. Hotspots are then translated into native code for efficient execution. Using a similar approach, we propose a methodology to identify hotspots and exploit their parallelization potential on multicore systems. The proposed methodology selects and parallelizes each DOALL loop that is either contained in a hotspot method or calls a hotspot method. The methodology could be integrated in the front-end of a JIT compiler to parallelize sequential code just before native translation. However, compilation to native code is out of the scope of this work. As a case study, we analyze eighteen JGF (Java Grande Forum) benchmarks to determine the parallelization potential of hotspots. Eight benchmarks demonstrate a speedup of up to 7.6x on an 8-core system.
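
    The record targets DOALL loops inside a Java JIT front end; the sketch below only illustrates the general pattern in Python: profile the program to locate the hotspot, then distribute its independent loop iterations across cores. The kernel and iteration counts are hypothetical.

```python
import cProfile
import pstats
from concurrent.futures import ProcessPoolExecutor

def heavy_iteration(i):
    # Stand-in for one independent (DOALL) loop iteration of a compute kernel.
    return sum((i * k) % 97 for k in range(200_000))

def hotspot_serial(n):
    return [heavy_iteration(i) for i in range(n)]

def hotspot_parallel(n, workers=8):
    # Iterations carry no loop-carried dependence, so they can run concurrently.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(heavy_iteration, range(n)))

if __name__ == "__main__":
    # 1) Profile the program to locate the hotspot (analogue of JIT hotspot detection).
    cProfile.run("hotspot_serial(32)", "prof.out")
    pstats.Stats("prof.out").sort_stats("cumulative").print_stats(3)
    # 2) Re-run the hotspot loop with its iterations distributed over cores.
    assert hotspot_serial(32) == hotspot_parallel(32)
```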

  17. Just-in-time compilation-inspired methodology for parallelization of compute intensive java code

    International Nuclear Information System (INIS)

    Mustafa, G.; Ghani, M.U.

    2017-01-01

    Compute intensive programs generally consume a significant fraction of execution time in a small amount of repetitive code. Such repetitive code is commonly known as hotspot code. We observed that compute intensive hotspots often possess exploitable loop level parallelism. A JIT (Just-in-Time) compiler profiles a running program to identify its hotspots. Hotspots are then translated into native code for efficient execution. Using a similar approach, we propose a methodology to identify hotspots and exploit their parallelization potential on multicore systems. The proposed methodology selects and parallelizes each DOALL loop that is either contained in a hotspot method or calls a hotspot method. The methodology could be integrated in the front-end of a JIT compiler to parallelize sequential code just before native translation. However, compilation to native code is out of the scope of this work. As a case study, we analyze eighteen JGF (Java Grande Forum) benchmarks to determine the parallelization potential of hotspots. Eight benchmarks demonstrate a speedup of up to 7.6x on an 8-core system. (author)

  18. Remedial Action Assessment System: A computer-based methodology for conducting feasibility studies

    International Nuclear Information System (INIS)

    White, M.K.; Buelt, J.L.; Stottlemyre, J.A.

    1991-02-01

    Because of the complexity and number of waste sites facing the US Department of Energy (DOE) for potential cleanup, DOE is supporting the development of a computer-based methodology to streamline the remedial investigation/feasibility study process. The Remedial Action Assessment System (RAAS) can be used for screening, linking, and evaluating established technology processes in support of conducting feasibility studies. It is also intended to do the same in support of corrective measures studies. The user interface employs menus, windows, help features, and graphical information while RAAS is in operation. Object-oriented programming is used to link unit processes into sets of compatible processes that form appropriate remedial alternatives. Once the remedial alternatives are formed, the RAAS methodology can evaluate them in terms of effectiveness, implementability, and cost. RAAS will access a user-selected risk assessment code to determine the reduction of risk after remedial action by each recommended alternative. The methodology will also help determine the implementability of the remedial alternatives at a site and access cost estimating tools to provide estimates of capital, operating, and maintenance costs. This paper presents the characteristics of two RAAS prototypes currently being developed: the RAAS Technology Information System, which accesses graphical, tabular and textual information about technologies, and the main RAAS methodology, which screens, links, and evaluates remedial technologies. 4 refs., 3 figs., 1 tab

  19. Verification of a hybrid adjoint methodology in Titan for single photon emission computed tomography - 316

    International Nuclear Information System (INIS)

    Royston, K.; Haghighat, A.; Yi, C.

    2010-01-01

    The hybrid deterministic transport code TITAN is being applied to a Single Photon Emission Computed Tomography (SPECT) simulation of a myocardial perfusion study. The TITAN code's hybrid methodology allows the use of a discrete ordinates solver in the phantom region and a characteristics method solver in the collimator region. Currently we seek to validate the adjoint methodology in TITAN for this application using a SPECT model that has been created in the MCNP5 Monte Carlo code. The TITAN methodology was examined based on the response of a single voxel detector placed in front of the heart, with and without collimation. For the case without collimation, the TITAN response for the single voxel-sized detector had a -9.96% difference relative to the MCNP5 response. To simulate collimation, the adjoint source was specified in directions located within the collimator acceptance angle. For a single collimator hole with a diameter matching the voxel dimension, a difference of -0.22% was observed. Comparisons to groupings of smaller collimator holes of two different sizes resulted in relative differences of 0.60% and 0.12%. The number of adjoint source directions within an acceptance angle was increased and showed no significant change in accuracy. Our results indicate that the hybrid adjoint methodology of TITAN yields accurate solutions more than a factor of two faster than MCNP5. (authors)

  20. Computationally based methodology for reengineering the high-level waste planning process at SRS

    International Nuclear Information System (INIS)

    Paul, P.K.; Gregory, M.V.; Wells, M.N.

    1997-01-01

    The Savannah River Site (SRS) has started processing its legacy of 34 million gallons of high-level radioactive waste into its final disposable form. The SRS high-level waste (HLW) complex consists of 51 waste storage tanks, 3 evaporators, 6 waste treatment operations, and 2 waste disposal facilities. It is estimated that processing wastes to clean up all tanks will take 30+ yr of operation. Integrating all the highly interactive facility operations through the entire life cycle in an optimal fashion, while meeting all the budgetary, regulatory, and operational constraints and priorities, is a complex and challenging planning task. The waste complex operating plan for the entire time span is periodically published as an SRS report. A computationally based integrated methodology has been developed that has streamlined the planning process while showing how to run the operations at economically and operationally optimal conditions. The integrated computational model replaced a host of disconnected spreadsheet calculations and the analysts' trial-and-error solutions using various scenario choices. This paper presents the important features of the integrated computational methodology and highlights the parameters that are core components of the planning process.

  1. A computational methodology for formulating gasoline surrogate fuels with accurate physical and chemical kinetic properties

    KAUST Repository

    Ahmed, Ahfaz

    2015-03-01

    Gasoline is the most widely used fuel for light duty automobile transportation, but its molecular complexity makes it intractable to study the fundamental combustion properties experimentally and computationally. Therefore, surrogate fuels with a simpler molecular composition that represent real fuel behavior in one or more aspects are needed to enable repeatable experimental and computational combustion investigations. This study presents a novel computational methodology for formulating surrogates for FACE (fuels for advanced combustion engines) gasolines A and C by combining regression modeling with physical and chemical kinetics simulations. The computational methodology integrates simulation tools executed across different software platforms. Initially, the palette of surrogate species and carbon types for the target fuels was determined from a detailed hydrocarbon analysis (DHA). A regression algorithm implemented in MATLAB was linked to REFPROP for simulation of distillation curves and calculation of physical properties of surrogate compositions. The MATLAB code generates a surrogate composition at each iteration, which is then used to automatically generate CHEMKIN input files that are submitted to homogeneous batch reactor simulations for prediction of research octane number (RON). The regression algorithm determines the optimal surrogate composition to match the fuel properties of FACE A and C gasoline, specifically hydrogen/carbon (H/C) ratio, density, distillation characteristics, carbon types, and RON. The optimal surrogate fuel compositions obtained using the present computational approach were compared to the real fuel properties, as well as with surrogate compositions available in the literature. Experiments were conducted within a Cooperative Fuels Research (CFR) engine operating under controlled autoignition (CAI) mode to compare the formulated surrogates against the real fuels. Carbon monoxide measurements indicated that the proposed surrogates
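
    The record's regression algorithm drives REFPROP and CHEMKIN simulations; the sketch below reduces the idea to its core: choose mole fractions of a small species palette so that blended properties match target values under a constrained least-squares fit. The palette, the target values, and the linear-by-mole blending (RON in particular does not blend linearly) are illustrative simplifications.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative palette (n-heptane, iso-octane, toluene) with approximate property rows:
# H/C ratio, density (kg/m3), research octane number.  Real values would come from
# REFPROP and kinetic simulations as in the record.
props = np.array([
    [2.29, 684.0,   0.0],   # n-heptane
    [2.25, 692.0, 100.0],   # iso-octane
    [1.14, 867.0, 120.0],   # toluene
])
target = np.array([2.10, 720.0, 92.0])   # hypothetical FACE-like targets
scale = np.array([1.0, 100.0, 10.0])     # bring the properties to comparable magnitudes

def mismatch(x):
    blended = props.T @ x                 # linear-by-mole blending assumption
    return np.sum(((blended - target) / scale) ** 2)

cons = ({"type": "eq", "fun": lambda x: x.sum() - 1.0},)   # mole fractions sum to 1
res = minimize(mismatch, x0=np.ones(3) / 3, bounds=[(0, 1)] * 3, constraints=cons)
print("surrogate mole fractions:", np.round(res.x, 3))
print("blended properties      :", np.round(props.T @ res.x, 2))
```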

  2. Smart learning objects for smart education in computer science theory, methodology and robot-based implementation

    CERN Document Server

    Stuikys, Vytautas

    2015-01-01

    This monograph presents the challenges, vision and context to design smart learning objects (SLOs) through Computer Science (CS) education modelling and feature model transformations. It presents the latest research on meta-programming-based generative learning objects (those with advanced features are treated as SLOs) and the use of educational robots in teaching CS topics. The introduced methodology includes the overall processes to develop SLOs and a smart educational environment (SEE) and integrates both into the real education setting to provide teaching in CS using constructivist approaches.

  3. Computational Vibrational Spectroscopy of glycine in aqueous solution - Fundamental considerations towards feasible methodologies

    Science.gov (United States)

    Lutz, Oliver M. D.; Messner, Christoph B.; Hofer, Thomas S.; Canaval, Lorenz R.; Bonn, Guenther K.; Huck, Christian W.

    2014-05-01

    In this work, the mid-infrared spectrum of aqueous glycine is predicted by a number of computational approaches. Velocity autocorrelation functions are applied to ab initio QMCF-MD and QM/MM-MD simulations in order to obtain IR power spectra. Furthermore, continuum solvation model augmented geometry optimizations are studied by anharmonic calculations relying on the PT2-VSCF and the VPT2 formalism. In this context, the potential based EFP hydration technique is discussed and the importance of a Monte Carlo search in conjunction with PT2-VSCF calculations is critically assessed. All results are directly compared to newly recorded experimental FT-IR spectroscopic data, elucidating the qualities of the respective methodology. Moreover, the computational approaches are discussed regarding their usefulness for the interpretation of experimental spectra.
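
    A minimal sketch of the velocity-autocorrelation route to an IR-like power spectrum mentioned above, using a synthetic two-mode "velocity" signal in place of a QMCF-MD or QM/MM-MD trajectory; the time step, mode frequencies, and windowing choice are illustrative.

```python
import numpy as np

dt_fs = 0.5                                   # MD time step in femtoseconds
t = np.arange(0, 2000) * dt_fs
# Synthetic "velocity" with two vibrational modes (stand-in for an MD trajectory).
v = np.cos(2 * np.pi * 0.05 * t) + 0.5 * np.cos(2 * np.pi * 0.10 * t)
v += 0.1 * np.random.default_rng(0).normal(size=t.size)

def power_spectrum(vel, dt):
    """IR-like power spectrum as the Fourier transform of the velocity autocorrelation."""
    n = vel.size
    acf = np.correlate(vel - vel.mean(), vel - vel.mean(), mode="full")[n - 1:]
    acf /= acf[0]                                    # normalized VACF
    window = np.hanning(2 * n)[n:]                   # damp the noisy ACF tail
    spectrum = np.abs(np.fft.rfft(acf * window))
    freq = np.fft.rfftfreq(n, d=dt)                  # in 1/fs
    return freq, spectrum

freq, spec = power_spectrum(v, dt_fs)
peak = freq[np.argmax(spec)]
print("dominant frequency:", peak, "1/fs")
print("as a wavenumber   :", peak / 2.99792458e-5, "cm^-1")   # divide by c in cm/fs
```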

  4. A methodology to develop computational phantoms with adjustable posture for WBC calibration

    Science.gov (United States)

    Ferreira Fonseca, T. C.; Bogaerts, R.; Hunt, John; Vanhavere, F.

    2014-11-01

    A Whole Body Counter (WBC) is a facility used to routinely assess the internal contamination of exposed workers, especially in the case of radiation release accidents. The calibration of the counting device is usually done using anthropomorphic physical phantoms representing the human body. Because constructing representative physical phantoms is challenging, virtual calibration has been introduced. The use of computational phantoms and the Monte Carlo method to simulate radiation transport has been demonstrated to be a worthy alternative. In this study we introduce a methodology developed for the creation of realistic computational voxel phantoms with adjustable posture for WBC calibration. The methodology makes use of different software packages to enable the creation and modification of computational voxel phantoms. This allows voxel phantoms to be developed on demand for the calibration of different WBC configurations, which in turn helps to study the major source of uncertainty associated with the in vivo measurement routine: the difference between the calibration phantoms and the real persons being counted. The use of realistic computational phantoms also helps the optimization of the counting measurement. Open source codes such as the MakeHuman and Blender software packages have been used for the creation and modelling of 3D humanoid characters based on polygonal mesh surfaces. In addition, an in-house software tool was developed to convert the binary 3D voxel grid into an MCNPX input file. This paper summarizes the development of a library of phantoms of the human body that uses two basic phantoms called MaMP and FeMP (Male and Female Mesh Phantoms) to create a set of male and female phantoms that vary both in height and in weight. Two sets of MaMP and FeMP phantoms were developed and used for efficiency calibration of two different WBC set-ups: the Doel NPP WBC laboratory and the AGM laboratory of SCK-CEN in Mol, Belgium.
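
    The record converts a binary 3D voxel grid into an MCNPX input file with an in-house tool. The sketch below shows only the generic part of such a conversion, flattening a toy voxel phantom into run-length-encoded material entries, and deliberately does not attempt real MCNPX lattice-card syntax.

```python
import numpy as np

def run_length_encode(material_ids):
    """Compress a flattened voxel material array into (value, repeat) pairs,
    the compact form typically used for lattice fill entries in transport inputs."""
    runs = []
    current, count = material_ids[0], 1
    for m in material_ids[1:]:
        if m == current:
            count += 1
        else:
            runs.append((current, count))
            current, count = m, 1
    runs.append((current, count))
    return runs

# Toy phantom: 0 = air, 1 = soft tissue, 2 = bone, on a small grid.
phantom = np.zeros((4, 4, 4), dtype=int)
phantom[1:3, 1:3, :] = 1
phantom[2, 2, 1:3] = 2

runs = run_length_encode(phantom.ravel(order="C"))
print(f"{phantom.size} voxels compressed to {len(runs)} runs")
print(runs[:6])
```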

  5. Software development methodology for computer based I&C systems of prototype fast breeder reactor

    Energy Technology Data Exchange (ETDEWEB)

    Manimaran, M., E-mail: maran@igcar.gov.in; Shanmugam, A.; Parimalam, P.; Murali, N.; Satya Murty, S.A.V.

    2015-10-15

    Highlights: • Software development methodology adopted for computer based I&C systems of PFBR is detailed. • Constraints imposed as part of the software requirements and coding phases are elaborated. • Compliance to safety and security requirements is described. • Usage of CASE (Computer Aided Software Engineering) tools during the software design, analysis and testing phases is explained. - Abstract: Prototype Fast Breeder Reactor (PFBR) is a sodium-cooled reactor which is in an advanced stage of construction in Kalpakkam, India. Versa Module Europa bus-based Real Time Computer (RTC) systems are deployed for Instrumentation & Control of PFBR. RTC systems have to perform safety functions within the stipulated time, which calls for highly dependable software. Hence, a well defined software development methodology is adopted for RTC systems, starting from the requirement capture phase till the final validation of the software product. The V-model is used for software development. The IEC 60880 standard and the AERB SG D-25 guideline are followed at each phase of software development. Requirements documents and design documents are prepared as per IEEE standards. Defensive programming strategies are followed for software development using the C language. Verification and validation (V&V) of documents and software are carried out at each phase by an independent V&V committee. Computer aided software engineering tools are used for software modelling, checking for MISRA C compliance and carrying out static and dynamic analysis. Various software metrics such as cyclomatic complexity, nesting depth and comment-to-code ratio are checked. Test cases are generated using equivalence class partitioning, boundary value analysis and cause and effect graphing techniques. System integration testing is carried out wherein functional and performance requirements of the system are monitored.
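
    The record generates test cases with equivalence class partitioning and boundary value analysis; the sketch below applies both techniques to a hypothetical integer parameter with a 0..4095 valid range (the range and the notion of a "setpoint register" are assumptions for illustration).

```python
def boundary_values(lo, hi, step=1):
    """Classic boundary value analysis points for a parameter valid in [lo, hi]:
    just below, at, and just above each boundary, plus a nominal mid value."""
    return sorted({lo - step, lo, lo + step, (lo + hi) // 2, hi - step, hi, hi + step})

def equivalence_classes(lo, hi):
    """One representative per equivalence class: below range, in range, above range."""
    return {"invalid_low": lo - 10, "valid": (lo + hi) // 2, "invalid_high": hi + 10}

# Hypothetical requirement: a setpoint register accepts integer values 0..4095.
print("boundary value tests   :", boundary_values(0, 4095))
print("equivalence class tests:", equivalence_classes(0, 4095))
```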

  6. Software development methodology for computer based I&C systems of prototype fast breeder reactor

    International Nuclear Information System (INIS)

    Manimaran, M.; Shanmugam, A.; Parimalam, P.; Murali, N.; Satya Murty, S.A.V.

    2015-01-01

    Highlights: • Software development methodology adopted for computer based I&C systems of PFBR is detailed. • Constraints imposed as part of the software requirements and coding phases are elaborated. • Compliance to safety and security requirements is described. • Usage of CASE (Computer Aided Software Engineering) tools during the software design, analysis and testing phases is explained. - Abstract: Prototype Fast Breeder Reactor (PFBR) is a sodium-cooled reactor which is in an advanced stage of construction in Kalpakkam, India. Versa Module Europa bus-based Real Time Computer (RTC) systems are deployed for Instrumentation & Control of PFBR. RTC systems have to perform safety functions within the stipulated time, which calls for highly dependable software. Hence, a well defined software development methodology is adopted for RTC systems, starting from the requirement capture phase till the final validation of the software product. The V-model is used for software development. The IEC 60880 standard and the AERB SG D-25 guideline are followed at each phase of software development. Requirements documents and design documents are prepared as per IEEE standards. Defensive programming strategies are followed for software development using the C language. Verification and validation (V&V) of documents and software are carried out at each phase by an independent V&V committee. Computer aided software engineering tools are used for software modelling, checking for MISRA C compliance and carrying out static and dynamic analysis. Various software metrics such as cyclomatic complexity, nesting depth and comment-to-code ratio are checked. Test cases are generated using equivalence class partitioning, boundary value analysis and cause and effect graphing techniques. System integration testing is carried out wherein functional and performance requirements of the system are monitored.

  7. A methodology to develop computational phantoms with adjustable posture for WBC calibration

    International Nuclear Information System (INIS)

    Fonseca, T C Ferreira; Vanhavere, F; Bogaerts, R; Hunt, John

    2014-01-01

    A Whole Body Counter (WBC) is a facility used to routinely assess the internal contamination of exposed workers, especially in the case of radiation release accidents. The calibration of the counting device is usually done using anthropomorphic physical phantoms representing the human body. Because constructing representative physical phantoms is challenging, virtual calibration has been introduced. The use of computational phantoms and the Monte Carlo method to simulate radiation transport has been demonstrated to be a worthy alternative. In this study we introduce a methodology developed for the creation of realistic computational voxel phantoms with adjustable posture for WBC calibration. The methodology makes use of different software packages to enable the creation and modification of computational voxel phantoms. This allows voxel phantoms to be developed on demand for the calibration of different WBC configurations, which in turn helps to study the major source of uncertainty associated with the in vivo measurement routine: the difference between the calibration phantoms and the real persons being counted. The use of realistic computational phantoms also helps the optimization of the counting measurement. Open source codes such as the MakeHuman and Blender software packages have been used for the creation and modelling of 3D humanoid characters based on polygonal mesh surfaces. In addition, an in-house software tool was developed to convert the binary 3D voxel grid into an MCNPX input file. This paper summarizes the development of a library of phantoms of the human body that uses two basic phantoms called MaMP and FeMP (Male and Female Mesh Phantoms) to create a set of male and female phantoms that vary both in height and in weight. Two sets of MaMP and FeMP phantoms were developed and used for efficiency calibration of two different WBC set-ups: the Doel NPP WBC laboratory and the AGM laboratory of SCK-CEN in Mol, Belgium. (paper)

  8. Wind turbine power coefficient estimation by soft computing methodologies: Comparative study

    International Nuclear Information System (INIS)

    Shamshirband, Shahaboddin; Petković, Dalibor; Saboohi, Hadi; Anuar, Nor Badrul; Inayat, Irum; Akib, Shatirah; Ćojbašić, Žarko; Nikolić, Vlastimir; Mat Kiah, Miss Laiha; Gani, Abdullah

    2014-01-01

    Highlights: • Variable speed operation of wind turbines to increase power generation. • Changeability and fluctuation of wind have to be accounted for. • To build an effective prediction model of the wind turbine power coefficient. • The impact of the variation in the blade pitch angle and tip speed ratio. • Support vector regression applied as the predictive methodology. - Abstract: Wind energy has become a strong contender to traditional fossil fuel energy, particularly with the successful operation of multi-megawatt sized wind turbines. However, reasonable wind speed is not adequately sustainable everywhere to build an economical wind farm. In wind energy conversion systems, one of the operational problems is the changeability and fluctuation of wind. In most cases, wind speed can fluctuate rapidly. Hence, the quality of produced energy becomes an important problem in wind energy conversion plants. Several control techniques have been applied to improve the quality of power generated from wind turbines. In this study, the polynomial and radial basis function (RBF) kernels are applied as the kernel function of support vector regression (SVR) to estimate the optimal power coefficient value of the wind turbines. Instead of minimizing the observed training error, SVR_poly and SVR_rbf attempt to minimize the generalization error bound so as to achieve generalized performance. The experimental results show that an improvement in predictive accuracy and capability of generalization can be achieved by the SVR approach compared to other soft computing methodologies
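
    A minimal scikit-learn sketch of fitting SVR with polynomial and RBF kernels to a power-coefficient-like surface over tip speed ratio and blade pitch angle; the Cp(lambda, beta) expression is a common textbook approximation, not the turbine data used in the record.

```python
import numpy as np
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

def cp(lam, beta):
    """A common textbook approximation of the power coefficient Cp(lambda, beta)."""
    lam_i = 1.0 / (1.0 / (lam + 0.08 * beta) - 0.035 / (beta ** 3 + 1.0))
    return 0.5176 * (116.0 / lam_i - 0.4 * beta - 5.0) * np.exp(-21.0 / lam_i) + 0.0068 * lam

rng = np.random.default_rng(0)
lam = rng.uniform(2.0, 13.0, 2000)        # tip speed ratio
beta = rng.uniform(0.0, 15.0, 2000)       # blade pitch angle (deg)
X = np.column_stack([lam, beta])
y = cp(lam, beta)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
for name, kwargs in [("SVR_poly", dict(kernel="poly", degree=3, C=10.0, epsilon=0.01)),
                     ("SVR_rbf", dict(kernel="rbf", C=10.0, epsilon=0.01))]:
    model = make_pipeline(StandardScaler(), SVR(**kwargs))
    model.fit(X_tr, y_tr)
    print(name, "R2 =", round(r2_score(y_te, model.predict(X_te)), 3))
```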

  9. A proposed methodology for computational fluid dynamics code verification, calibration, and validation

    Science.gov (United States)

    Aeschliman, D. P.; Oberkampf, W. L.; Blottner, F. G.

    Verification, calibration, and validation (VCV) of Computational Fluid Dynamics (CFD) codes is an essential element of the code development process. The exact manner in which code VCV activities are planned and conducted, however, is critically important. It is suggested that the way in which code validation, in particular, is often conducted, namely by comparison to published experimental data obtained for other purposes, is in general difficult and unsatisfactory, and that a different approach is required. This paper describes a proposed methodology for CFD code VCV that meets the technical requirements and is philosophically consistent with code development needs. The proposed methodology stresses teamwork and cooperation between code developers and experimentalists throughout the VCV process, and takes advantage of certain synergisms between CFD and experiment. A novel approach to uncertainty analysis is described which can both distinguish between and quantify various types of experimental error, and whose attributes are used to help define an appropriate experimental design for code VCV experiments. The methodology is demonstrated with an example of laminar, hypersonic, near-perfect-gas, 3-dimensional flow over a sliced sphere/cone of varying geometrical complexity.

  10. Methodology for Computational Fluid Dynamic Validation for Medical Use: Application to Intracranial Aneurysm.

    Science.gov (United States)

    Paliwal, Nikhil; Damiano, Robert J; Varble, Nicole A; Tutino, Vincent M; Dou, Zhongwang; Siddiqui, Adnan H; Meng, Hui

    2017-12-01

    Computational fluid dynamics (CFD) is a promising tool to aid in clinical diagnoses of cardiovascular diseases. However, it uses assumptions that simplify the complexities of the real cardiovascular flow. Due to the high stakes in the clinical setting, it is critical to calculate the effect of these assumptions on the CFD simulation results. However, existing CFD validation approaches do not quantify error in the simulation results due to the CFD solver's modeling assumptions. Instead, they directly compare CFD simulation results against validation data. Thus, to quantify the accuracy of a CFD solver, we developed a validation methodology that calculates the CFD model error (arising from modeling assumptions). Our methodology identifies independent error sources in CFD and validation experiments, and calculates the model error by parsing out other sources of error inherent in simulation and experiments. To demonstrate the method, we simulated the flow field of a patient-specific intracranial aneurysm (IA) in the commercial CFD software STAR-CCM+. Particle image velocimetry (PIV) provided validation datasets for the flow field on two orthogonal planes. The average model error in the STAR-CCM+ solver was 5.63 ± 5.49% along the intersecting validation line of the orthogonal planes. Furthermore, we demonstrated that our validation method is superior to existing validation approaches by applying three representative existing validation techniques to our CFD and experimental dataset, and comparing the validation results. Our validation methodology offers a streamlined workflow to extract the "true" accuracy of a CFD solver.

  11. A shape and mesh adaptive computational methodology for gamma ray dose from volumetric sources

    International Nuclear Information System (INIS)

    Mirza, N.M.; Ali, B.; Mirza, S.M.; Tufail, M.; Ahmad, N.

    1991-01-01

    Indoor external exposure to the population is dominated by gamma rays emitted from the walls and the floor of a room. A shape and mesh size adaptive flux calculational approach has been developed for a typical wall source. Parametric studies of the effect of mesh size on flux calculations have been performed. The optimum value of the mesh size is found to depend strongly on the distance from the source, the permissible limits on uncertainty in flux predictions, and the computer Central Processing Unit time. To test the computations, a typical wall source was reduced to a point, a line and an infinite volume source having finite thickness, and the computed flux values were compared with values from the corresponding analytical expressions for these sources. Results indicate that the errors under optimum conditions remain less than 6% for the fluxes calculated from this approach when compared with the analytical values for the point and the line source approximations. Also, when the wall is simulated as an infinite volume source having finite thickness, the errors in the computed-to-analytical flux ratios remain large for smaller wall dimensions. However, the errors become less than 10% when the wall dimensions are greater than ten mean free paths for 3 MeV gamma rays. Also, specific dose rates from this methodology remain within 15% of the values obtained by the Monte Carlo method. (author)
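
    A minimal sketch of the point-kernel idea behind the record's comparisons: the uncollided flux from a meshed wall source is summed cell by cell and compared with the analytic point-source expression at near and far detector positions. The attenuation coefficient, wall dimensions, and mesh are illustrative, and wall self-absorption and buildup are neglected.

```python
import numpy as np

MU_AIR = 4.0e-5          # illustrative linear attenuation coefficient of air, ~3 MeV (1/cm)
S_TOTAL = 1.0e6          # total source strength (photons/s), illustrative

def wall_flux(detector, wall_size=(300.0, 250.0), mesh=(30, 25)):
    """Uncollided flux from a uniformly emitting wall in the x-y plane, computed by
    summing point kernels over mesh cells (self-absorption in the wall neglected)."""
    nx, ny = mesh
    xs = (np.arange(nx) + 0.5) * wall_size[0] / nx - wall_size[0] / 2
    ys = (np.arange(ny) + 0.5) * wall_size[1] / ny - wall_size[1] / 2
    s_cell = S_TOTAL / (nx * ny)
    flux = 0.0
    for x in xs:
        for y in ys:
            r = np.linalg.norm(detector - np.array([x, y, 0.0]))
            flux += s_cell * np.exp(-MU_AIR * r) / (4 * np.pi * r ** 2)
    return flux

def point_flux(detector):
    """Analytical flux when the whole wall is collapsed to a point at its centre."""
    r = np.linalg.norm(detector)
    return S_TOTAL * np.exp(-MU_AIR * r) / (4 * np.pi * r ** 2)

# Far from the wall the point approximation is accurate; close to it, the ratio departs from 1.
for z in (3000.0, 100.0):
    d = np.array([0.0, 0.0, z])
    print(f"z = {z:6.0f} cm   meshed/point flux ratio = {wall_flux(d) / point_flux(d):.3f}")
```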

  12. Computational implementation of a systems prioritization methodology for the Waste Isolation Pilot Plant: A preliminary example

    Energy Technology Data Exchange (ETDEWEB)

    Helton, J.C. [Arizona State Univ., Tempe, AZ (United States). Dept. of Mathematics; Anderson, D.R. [Sandia National Labs., Albuquerque, NM (United States). WIPP Performance Assessments Departments; Baker, B.L. [Technadyne Engineering Consultants, Albuquerque, NM (United States)] [and others]

    1996-04-01

    A systems prioritization methodology (SPM) is under development to provide guidance to the US DOE on experimental programs and design modifications to be supported in the development of a successful licensing application for the Waste Isolation Pilot Plant (WIPP) for the geologic disposal of transuranic (TRU) waste. The purpose of the SPM is to determine the probabilities that the implementation of different combinations of experimental programs and design modifications, referred to as activity sets, will lead to compliance. Appropriate tradeoffs between compliance probability, implementation cost and implementation time can then be made in the selection of the activity set to be supported in the development of a licensing application. Descriptions are given for the conceptual structure of the SPM and the manner in which this structure determines the computational implementation of an example SPM application. Due to the sophisticated structure of the SPM and the computational demands of many of its components, the overall computational structure must be organized carefully to provide the compliance probabilities for the large number of activity sets under consideration at an acceptable computational cost. Conceptually, the determination of each compliance probability is equivalent to a large numerical integration problem. 96 refs., 31 figs., 36 tabs.
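
    Since each compliance probability is conceptually a large numerical integration over uncertain parameters, a simple Monte Carlo estimate of such a probability looks like the sketch below; the parameter distributions, the performance measure, and the compliance limit are entirely hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)

def performance_measure(params):
    """Stand-in for the (expensive) performance-assessment model: maps sampled
    uncertain parameters to a normalized cumulative release."""
    permeability, solubility, retardation = params
    return permeability * solubility / retardation

def compliance_probability(n_samples=100_000, limit=1.0):
    """P(compliance) = integral of an indicator function over the parameter
    distribution, estimated here by simple Monte Carlo sampling."""
    perm = rng.lognormal(mean=-1.0, sigma=0.5, size=n_samples)
    sol = rng.lognormal(mean=0.0, sigma=0.3, size=n_samples)
    ret = rng.uniform(1.0, 10.0, size=n_samples)
    releases = performance_measure((perm, sol, ret))
    return np.mean(releases <= limit)

print("estimated compliance probability:", compliance_probability())
```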

  13. Computational implementation of a systems prioritization methodology for the Waste Isolation Pilot Plant: A preliminary example

    International Nuclear Information System (INIS)

    Helton, J.C.

    1996-04-01

    A systems prioritization methodology (SPM) is under development to provide guidance to the US DOE on experimental programs and design modifications to be supported in the development of a successful licensing application for the Waste Isolation Pilot Plant (WIPP) for the geologic disposal of transuranic (TRU) waste. The purpose of the SPM is to determine the probabilities that the implementation of different combinations of experimental programs and design modifications, referred to as activity sets, will lead to compliance. Appropriate tradeoffs between compliance probability, implementation cost and implementation time can then be made in the selection of the activity set to be supported in the development of a licensing application. Descriptions are given for the conceptual structure of the SPM and the manner in which this structure determines the computational implementation of an example SPM application. Due to the sophisticated structure of the SPM and the computational demands of many of its components, the overall computational structure must be organized carefully to provide the compliance probabilities for the large number of activity sets under consideration at an acceptable computational cost. Conceptually, the determination of each compliance probability is equivalent to a large numerical integration problem. 96 refs., 31 figs., 36 tabs

  14. A new methodology for the computer-aided construction of fault trees

    International Nuclear Information System (INIS)

    Salem, S.L.; Apostolakis, G.E.; Okrent, D.

    1977-01-01

    A methodology for systematically constructing fault trees for general complex systems is developed. A means of modeling component behaviour via decision tables is presented, and a procedure for constructing and editing fault trees, either manually or by computer, is developed. The techniques employed result in a complete fault tree in standard form. In order to demonstrate the methodology, the computer program CAT was developed and is used to construct trees for a nuclear system. By analyzing and comparing these fault trees, several conclusions are reached. First, such an approach can be used to produce fault trees that accurately describe system behaviour. Second, multiple trees can be rapidly produced by defining various TOP events, including system success. Finally, the accuracy and utility of such trees is shown to depend upon the careful development of the decision table models by the analyst, and of the overall system definition itself. Thus the method is seen to be a tool for assisting in the work of fault tree construction rather than a replacement for the careful work of the fault tree analyst. (author)
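
    A toy analogue of the decision-table-driven construction described above: each intermediate event is expanded through a small decision table (an OR over AND rows) until only basic events remain. The component events and tables are invented for illustration and do not reproduce the CAT program's modeling detail.

```python
# Toy decision tables: each event is caused by combinations of lower-level events
# and/or failure modes (an OR over AND rows).
DECISION_TABLES = {
    "no_flow_at_outlet": [["pump_stopped"], ["valve_closed"]],
    "pump_stopped":      [["loss_of_power"], ["pump_mechanical_failure"]],
    "valve_closed":      [["valve_stuck_closed"], ["spurious_close_signal"]],
}

def build_fault_tree(event):
    """Recursively expand an event via the decision tables; events with no table
    are treated as basic events (leaves of the tree)."""
    rows = DECISION_TABLES.get(event)
    if rows is None:
        return {"event": event, "type": "basic"}
    return {
        "event": event,
        "type": "OR",                       # one satisfied row is enough
        "causes": [
            {"type": "AND", "causes": [build_fault_tree(e) for e in row]}
            for row in rows
        ],
    }

def print_tree(node, depth=0):
    label = node.get("event", node["type"])
    print("  " * depth + f"{label} [{node['type']}]")
    for child in node.get("causes", []):
        print_tree(child, depth + 1)

print_tree(build_fault_tree("no_flow_at_outlet"))
```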

  15. Determination of phase diagrams via computer simulation: methodology and applications to water, electrolytes and proteins

    International Nuclear Information System (INIS)

    Vega, C; Sanz, E; Abascal, J L F; Noya, E G

    2008-01-01

    In this review we focus on the determination of phase diagrams by computer simulation, with particular attention to the fluid-solid and solid-solid equilibria. The methodology to compute the free energy of solid phases will be discussed. In particular, the Einstein crystal and Einstein molecule methodologies are described in a comprehensive way. It is shown that both methodologies yield the same free energies and that free energies of solid phases present noticeable finite size effects. In fact, this is the case for hard spheres in the solid phase. Finite size corrections can be introduced, although in an approximate way, to correct for the dependence of the free energy on the size of the system. The computation of free energies of solid phases can be extended to molecular fluids. The procedure to compute free energies of solid phases of water (ices) will be described in detail. The free energies of ices Ih, II, III, IV, V, VI, VII, VIII, IX, XI and XII will be presented for the SPC/E and TIP4P models of water. Initial coexistence points leading to the determination of the phase diagram of water for these two models will be provided. Other methods to estimate the melting point of a solid, such as the direct fluid-solid coexistence or simulations of the free surface of the solid, will be discussed. It will be shown that the melting points of ice Ih for several water models, obtained from free energy calculations, direct coexistence simulations and free surface simulations agree within their statistical uncertainty. Phase diagram calculations can indeed help to improve potential models of molecular fluids. For instance, for water, the potential model TIP4P/2005 can be regarded as an improved version of TIP4P. Here we will review some recent work on the phase diagram of the simplest ionic model, the restricted primitive model. Although originally devised to describe ionic liquids, the model is becoming quite popular to describe the behavior of charged colloids

  16. Evolution of teaching and evaluation methodologies: The experience in the computer programming course at the Universidad Nacional de Colombia

    Directory of Open Access Journals (Sweden)

    Jonatan Gomez Perdomo

    2014-05-01

    Full Text Available In this paper, we present the evolution of a computer-programming course at the Universidad Nacional de Colombia (UNAL). The teaching methodology has evolved from a linear and non-standardized methodology to a flexible, non-linear and student-centered methodology. Our methodology uses an e-learning platform that supports the learning process by offering students and professors custom navigation between the content and material in an interactive way (book chapters, exercises, videos). Moreover, the platform is open access, and approximately 900 students from the university take this course each term. Our evaluation methodology has also evolved, from static evaluations based on paper tests to an online process based on computer adaptive testing (CAT), which chooses the questions to ask a student and assigns the student a grade according to the student's ability.
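
    A minimal sketch of the computer adaptive testing idea mentioned above: the next question is the one whose difficulty is closest to the current ability estimate, and the estimate is updated with each answer. The item bank, update rule, and simulated student are illustrative; a real CAT engine would use an item response theory model with maximum-likelihood ability estimation.

```python
import math
import random

QUESTION_BANK = [{"id": i, "difficulty": d}
                 for i, d in enumerate([-2.0, -1.5, -1.0, -0.5, 0.0, 0.5, 1.0, 1.5, 2.0])]

def adaptive_test(answer_fn, n_items=5):
    """Ask the item closest in difficulty to the current ability estimate,
    then nudge the estimate up or down depending on the result."""
    ability, asked = 0.0, set()
    for _ in range(n_items):
        candidates = [q for q in QUESTION_BANK if q["id"] not in asked]
        item = min(candidates, key=lambda q: abs(q["difficulty"] - ability))
        asked.add(item["id"])
        ability += 0.5 if answer_fn(item) else -0.5     # crude update for illustration
    return ability

# Simulated student with a true ability of +1.0 answering via a logistic response model.
random.seed(0)
def student(q):
    return random.random() < 1.0 / (1.0 + math.exp(q["difficulty"] - 1.0))

print("estimated ability (maps to a grade):", adaptive_test(student))
```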

  17. Handbook of research on P2P and grid systems for service-oriented computing : models, methodologies, and applications

    NARCIS (Netherlands)

    Antonopoulos, N.; Exarchakos, G.; Li, Maozhen; Liotta, A.

    2010-01-01

    Introduction: Service-oriented computing is a popular design methodology for large scale business computing systems. A significant number of companies aim to reap the benefit of cost reduction by realizing B2B and B2C processes on large-scale SOA-compliant software system platforms. Peer-to-Peer

  18. Cerebral methodology based computing to estimate real phenomena from large-scale nuclear simulation

    International Nuclear Information System (INIS)

    Suzuki, Yoshio

    2011-01-01

    Our final goal is to estimate real phenomena from large-scale nuclear simulations by using computing processes. Large-scale simulations mean that they include scale variety and physical complexity, so that corresponding experiments and/or theories do not exist. In the nuclear field, it is indispensable to estimate real phenomena from simulations in order to improve the safety and security of nuclear power plants. Here, the analysis of uncertainty included in simulations is needed to reveal the sensitivity of uncertainty due to randomness, to reduce the uncertainty due to lack of knowledge, and to establish a degree of certainty through verification and validation (V and V) and uncertainty quantification (UQ) processes. To realize this, we propose 'Cerebral Methodology based Computing (CMC)' as computing processes with deductive and inductive approaches, by reference to human reasoning processes. Our idea is to execute deductive and inductive simulations in correspondence with deductive and inductive approaches. We have established a prototype system and applied it to a thermal displacement analysis of a nuclear power plant. The result shows that our idea is effective in reducing the uncertainty and establishing a degree of certainty. (author)

  19. Computational Evolutionary Methodology for Knowledge Discovery and Forecasting in Epidemiology and Medicine

    International Nuclear Information System (INIS)

    Rao, Dhananjai M.; Chernyakhovsky, Alexander; Rao, Victoria

    2008-01-01

    Humanity is facing an increasing number of highly virulent and communicable diseases such as avian influenza. Researchers believe that avian influenza has potential to evolve into one of the deadliest pandemics. Combating these diseases requires in-depth knowledge of their epidemiology. An effective methodology for discovering epidemiological knowledge is to utilize a descriptive, evolutionary, ecological model and use bio-simulations to study and analyze it. These types of bio-simulations fall under the category of computational evolutionary methods because the individual entities participating in the simulation are permitted to evolve in a natural manner by reacting to changes in the simulated ecosystem. This work describes the application of the aforementioned methodology to discover epidemiological knowledge about avian influenza using a novel eco-modeling and bio-simulation environment called SEARUMS. The mathematical principles underlying SEARUMS, its design, and the procedure for using SEARUMS are discussed. The bio-simulations and multi-faceted case studies conducted using SEARUMS elucidate its ability to pinpoint timelines, epicenters, and socio-economic impacts of avian influenza. This knowledge is invaluable for proactive deployment of countermeasures in order to minimize negative socioeconomic impacts, combat the disease, and avert a pandemic

  20. Attack Methodology Analysis: Emerging Trends in Computer-Based Attack Methodologies and Their Applicability to Control System Networks

    Energy Technology Data Exchange (ETDEWEB)

    Bri Rolston

    2005-06-01

    Threat characterization is a key component in evaluating the threat faced by control systems. Without a thorough understanding of the threat faced by critical infrastructure networks, adequate resources cannot be allocated or directed effectively to the defense of these systems. Traditional methods of threat analysis focus on identifying the capabilities and motivations of a specific attacker, assessing the value the adversary would place on targeted systems, and deploying defenses according to the threat posed by the potential adversary. However, too many effective exploits and tools exist, easily accessible to anyone with an Internet connection, minimal technical skills, and a significantly reduced motivational threshold, for the field of potential adversaries to be narrowed effectively. Understanding how hackers evaluate new IT security research and incorporate significant new ideas into their own tools provides a means of anticipating how IT systems are most likely to be attacked in the future. This research, Attack Methodology Analysis (AMA), could supply pertinent information on how to detect and stop new types of attacks. Since the exploit methodologies and attack vectors developed in the general Information Technology (IT) arena can be converted for use against control system environments, assessing areas in which cutting-edge exploit development and remediation techniques are occurring can provide significant intelligence for control system network exploitation and defense, and a means of assessing threat without identifying specific capabilities of individual opponents. Attack Methodology Analysis begins with the study of the exploit technology and attack methodologies being developed in the Information Technology (IT) security research community, within the black and white hat communities. Once a solid understanding of the cutting-edge security research is established, emerging trends in attack methodology can be identified and the gap between

  1. Remedial action assessment system (RAAS) - A computer-based methodology for conducting feasibility studies

    International Nuclear Information System (INIS)

    Buelt, J.L.; Stottlemyre, J.A.; White, M.K.

    1991-01-01

    Because of the great complexity and number of potential waste sites facing the US Department of Energy (DOE) for potential cleanup, the DOE is supporting the development of a computer-based methodology to streamline the remedial investigation/feasibility study process required for DOE operable units. DOE operable units are generally more complex in nature because of the existence of multiple waste sites within many of the operable units and the presence of mixed radioactive and hazardous chemical wastes. Consequently, Pacific Northwest Laboratory (PNL) is developing the Remedial Action Assessment System (RAAS), which is aimed at screening, linking, and evaluating established technology process options in support of conducting feasibility studies under the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA). It is also intended to do the same in support of corrective measures studies required by the Resource Conservation and Recovery Act (RCRA). One of the greatest attributes of the RAAS project is that the computer interface with the user is being designed to be friendly, intuitive, and interactive. Consequently, the user interface employs menus, windows, help features, and graphical information while RAAS is in operation. During operation, each technology process option is represented by an "object" module. Object-oriented programming is then used to link these unit processes into remedial alternatives. In this way, various object modules representing technology process options can communicate so that a linked set of compatible processes forms an appropriate remedial alternative. Once the remedial alternatives are formed, they can be evaluated in terms of effectiveness, implementability, and cost

  2. Temperature-based estimation of global solar radiation using soft computing methodologies

    Science.gov (United States)

    Mohammadi, Kasra; Shamshirband, Shahaboddin; Danesh, Amir Seyed; Abdullah, Mohd Shahidan; Zamani, Mazdak

    2016-07-01

    Precise knowledge of solar radiation is essential in different technological and scientific applications of solar energy. Temperature-based estimation of global solar radiation is appealing owing to the broad availability of measured air temperatures. In this study, the potentials of soft computing techniques are evaluated to estimate daily horizontal global solar radiation (DHGSR) from measured maximum, minimum, and average air temperatures (T_max, T_min, and T_avg) in an Iranian city. For this purpose, a comparative evaluation between three methodologies of adaptive neuro-fuzzy inference system (ANFIS), radial basis function support vector regression (SVR-rbf), and polynomial basis function support vector regression (SVR-poly) is performed. Five combinations of T_max, T_min, and T_avg are served as inputs to develop the ANFIS, SVR-rbf, and SVR-poly models. The attained results show that all the ANFIS, SVR-rbf, and SVR-poly models provide favorable accuracy. For all techniques, the higher accuracies are achieved by models (5) using T_max - T_min and T_max as inputs. According to the statistical results, SVR-rbf outperforms SVR-poly and ANFIS. For SVR-rbf (5), the mean absolute bias error, root mean square error, and correlation coefficient are 1.1931 MJ/m2, 2.0716 MJ/m2, and 0.9380, respectively. The survey results confirm that SVR-rbf can be used efficiently to estimate DHGSR from air temperatures.

  3. An appraisal of wind speed distribution prediction by soft computing methodologies: A comparative study

    International Nuclear Information System (INIS)

    Petković, Dalibor; Shamshirband, Shahaboddin; Anuar, Nor Badrul; Saboohi, Hadi; Abdul Wahab, Ainuddin Wahid; Protić, Milan; Zalnezhad, Erfan; Mirhashemi, Seyed Mohammad Amin

    2014-01-01

    Highlights: • Probabilistic distribution functions of wind speed. • Two-parameter Weibull probability distribution. • To build an effective prediction model of the distribution of wind speed. • Support vector regression applied as a probability function for wind speed. - Abstract: The probabilistic distribution of wind speed is among the more significant wind characteristics in examining wind energy potential and the performance of wind energy conversion systems. When the wind speed probability distribution is known, the wind energy distribution can be easily obtained. Therefore, the probability distribution of wind speed is a very important piece of information required in assessing wind energy potential. For this reason, a large number of studies have been conducted concerning the use of a variety of probability density functions to describe wind speed frequency distributions. Although the two-parameter Weibull distribution is a widely used and accepted method, fitting the function is very challenging. In this study, the polynomial and radial basis functions (RBF) are applied as the kernel function of support vector regression (SVR) to estimate the two parameters of the Weibull distribution function according to previously established analytical methods. Rather than minimizing the observed training error, SVR_poly and SVR_rbf attempt to minimize the generalization error bound, so as to achieve generalized performance. According to the experimental results, enhanced predictive accuracy and capability of generalization can be achieved using the SVR approach compared to other soft computing methodologies
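
    For reference, the two-parameter Weibull density referred to above, with shape k and scale c, together with the standard moment relations often used when fitting it to measured wind speeds (textbook expressions, not taken from the record):

```latex
f(v) = \frac{k}{c}\left(\frac{v}{c}\right)^{k-1}
       \exp\!\left[-\left(\frac{v}{c}\right)^{k}\right], \qquad v \ge 0,\; k, c > 0,
\qquad
\bar{v} = c\,\Gamma\!\left(1 + \tfrac{1}{k}\right), \qquad
\sigma^{2} = c^{2}\left[\Gamma\!\left(1 + \tfrac{2}{k}\right)
           - \Gamma^{2}\!\left(1 + \tfrac{1}{k}\right)\right].
```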

  4. Methodologies Related to Computational models in View of Developing Anti-Alzheimer Drugs: An Overview.

    Science.gov (United States)

    Baheti, Kirtee; Kale, Mayura Ajay

    2018-04-17

    Over the last two decades, there has been an increasing focus on development strategies related to anti-Alzheimer's drug research. This may be attributed to the fact that the causes of most Alzheimer's cases are still largely unknown, except for a few cases where genetic differences have been identified. With the progress of the disease, the symptoms involve intellectual deterioration, memory impairment, abnormal personality and behavioural patterns, confusion, aggression, mood swings and irritability. Current therapies available for this disease give only symptomatic relief and do not focus on manipulations of biomolecular processes. Nearly all the therapies to treat Alzheimer's disease target the amyloid cascade, which is considered to be important in AD pathogenesis. New drug regimens are not able to keep pace with the ever-increasing understanding of dementia at the molecular level. In view of these aggravated problems, we put forth molecular modeling as a drug discovery approach for developing novel drugs to treat Alzheimer's disease. The disease is incurable and it gets worse as it advances, finally causing death. Due to this, the design of drugs to treat this disease has become an utmost priority for research. One of the most important emerging technologies applied for this has been computer-assisted drug design (CADD). It is a research tool that employs large-scale computing strategies in an attempt to develop a model receptor site which can be used for designing an anti-Alzheimer drug. Various models of amyloid-based calcium channels have been computationally optimized. Docking and de novo evolution are used to design the compounds. These are further subjected to absorption, distribution, metabolism, excretion and toxicity (ADMET) studies to finally bring about active compounds that are able to cross the BBB. Many novel compounds have been designed which might be promising ones for the treatment of AD. The present review describes the research

  5. Methodology and computer program for applying improved, inelastic ERR for the design of mine layouts on planar reefs.

    CSIR Research Space (South Africa)

    Spottiswoode, SM

    2002-08-01

    Full Text Available and the visco-plastic models of Napier and Malan (1997) and Malan (2002). Methodologies and a computer program (MINF) are developed during this project that write synthetic catalogues of seismic events to simulate the rock response to mining...

  6. Analytical and computational methodology to assess the over pressures generated by a potential catastrophic failure of a cryogenic pressure vessel

    Energy Technology Data Exchange (ETDEWEB)

    Zamora, I.; Fradera, J.; Jaskiewicz, F.; Lopez, D.; Hermosa, B.; Aleman, A.; Izquierdo, J.; Buskop, J.

    2014-07-01

    Idom has participated in the risk evaluation of Safety Important Class (SIC) structures due to over pressures generated by a catastrophic failure of a cryogenic pressure vessel at ITER plant site. The evaluation implements both analytical and computational methodologies achieving consistent and robust results. (Author)

  7. Analytical and computational methodology to assess the over pressures generated by a potential catastrophic failure of a cryogenic pressure vessel

    International Nuclear Information System (INIS)

    Zamora, I.; Fradera, J.; Jaskiewicz, F.; Lopez, D.; Hermosa, B.; Aleman, A.; Izquierdo, J.; Buskop, J.

    2014-01-01

    Idom has participated in the risk evaluation of Safety Important Class (SIC) structures due to over pressures generated by a catastrophic failure of a cryogenic pressure vessel at ITER plant site. The evaluation implements both analytical and computational methodologies achieving consistent and robust results. (Author)

  8. Computer assisted diagnosis in renal nuclear medicine: rationale, methodology and interpretative criteria for diuretic renography

    Science.gov (United States)

    Taylor, Andrew T; Garcia, Ernest V

    2014-01-01

    diuretic renography, this review offers a window into the rationale, methodology and broader applications of computer assisted diagnosis in medical imaging. PMID:24484751

  9. Computer modelling of the UK wind energy resource. Phase 2. Application of the methodology

    Energy Technology Data Exchange (ETDEWEB)

    Burch, S F; Makari, M; Newton, K; Ravenscroft, F; Whittaker, J

    1993-12-31

    This report presents the results of the second phase of a programme to estimate the UK wind energy resource. The overall objective of the programme is to provide quantitative resource estimates using a mesoscale (resolution about 1km) numerical model for the prediction of wind flow over complex terrain, in conjunction with digitised terrain data and wind data from surface meteorological stations. A network of suitable meteorological stations has been established and long term wind data obtained. Digitised terrain data for the whole UK were obtained, and wind flow modelling using the NOABL computer program has been performed. Maps of extractable wind power have been derived for various assumptions about wind turbine characteristics. Validation of the methodology indicates that the results are internally consistent, and in good agreement with available comparison data. Existing isovent maps, based on standard meteorological data which take no account of terrain effects, indicate that 10m annual mean wind speeds vary between about 4.5 and 7 m/s over the UK with only a few coastal areas over 6 m/s. The present study indicates that 28% of the UK land area had speeds over 6 m/s, with many hill sites having 10m speeds over 10 m/s. It is concluded that these 'first order' resource estimates represent a substantial improvement over the presently available 'zero order' estimates. The results will be useful for broad resource studies and initial site screening. Detailed resource evaluation for local sites will require more detailed local modelling or ideally long term field measurements. (12 figures, 14 tables, 21 references). (Author)

  10. ANALYSIS OF EFFECTIVENESS OF METHODOLOGICAL SYSTEM FOR PROBABILITY AND STOCHASTIC PROCESSES COMPUTER-BASED LEARNING FOR PRE-SERVICE ENGINEERS

    Directory of Open Access Journals (Sweden)

    E. Chumak

    2015-04-01

    The author substantiates that only methodological training systems for mathematical disciplines that implement information and communication technologies (ICT) can meet the requirements of the modern educational paradigm and make it possible to increase educational efficiency. For this reason, the paper underlines the necessity of developing a methodology for computer-based learning of the theory of probability and stochastic processes for pre-service engineers. The results of an experimental study analysing the efficiency of this methodological system are presented. The analysis includes three main stages: ascertaining, searching and forming. The key criteria of the efficiency of the designed methodological system are the level of students' probabilistic and stochastic skills and their learning motivation. The effect of implementing the methodological system on the level of students' IT literacy is shown, and the expanding range of ICT-related objectives pursued by students is described. The levels of learning motivation at the ascertaining and forming stages of the experiment are analysed, and the level of intrinsic learning motivation of the pre-service engineers is determined at these stages. For this purpose, a methodology for testing students' learning motivation in the chosen specialty is presented. An increase in the intrinsic learning motivation of the experimental group students (E group) relative to the control group students (C group) is demonstrated.

  11. Methodology of Computer-Aided Design of Variable Guide Vanes of Aircraft Engines

    Science.gov (United States)

    Falaleev, Sergei V.; Melentjev, Vladimir S.; Gvozdev, Alexander S.

    2016-01-01

    The paper presents a methodology which helps to avoid a great amount of costly experimental research. This methodology includes thermo-gas dynamic design of an engine and its mounts, the profiling of compressor flow path and cascade design of guide vanes. Employing a method elaborated by Howell, we provide a theoretical solution to the task of…

  12. A methodology for sunlight urban planning: a computer-based solar and sky vault obstruction analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pereira, Fernando Oscar Ruttkay; Silva, Carlos Alejandro Nome [Federal Univ. of Santa Catarina (UFSC), Dept. of Architecture and Urbanism, Florianopolis, SC (Brazil); Turkienikz, Benamy [Federal Univ. of Rio Grande do Sul (UFRGS), Faculty of Architecture, Porto Alegre, RS (Brazil)

    2001-07-01

    The main purpose of the present study is to describe a planning methodology to improve the quality of the built environment based on the rational control of solar radiation and the view of the sky vault. The main criterion used to control the access and obstruction of solar radiation was the concept of desirability and undesirability of solar radiation. A case study implementing the proposed methodology is developed. Although it needs further development to find its way into regulations and practical applications, the methodology has shown strong potential to deal with an aspect of urban planning that would otherwise be almost impossible to address. (Author)

  13. A New Screening Methodology for Improved Oil Recovery Processes Using Soft-Computing Techniques

    Science.gov (United States)

    Parada, Claudia; Ertekin, Turgay

    2010-05-01

    The first stage of production of any oil reservoir involves oil displacement by natural drive mechanisms such as solution gas drive, gas cap drive and gravity drainage. Typically, improved oil recovery (IOR) methods are applied to oil reservoirs that have been depleted naturally. In more recent years, IOR techniques are applied to reservoirs even before their natural energy drive is exhausted by primary depletion. Descriptive screening criteria for IOR methods are used to select the appropriate recovery technique according to the fluid and rock properties. This methodology helps in assessing the most suitable recovery process for field deployment of a candidate reservoir. However, the already published screening guidelines neither provide information about the expected reservoir performance nor suggest a set of project design parameters, which can be used towards the optimization of the process. In this study, artificial neural networks (ANN) are used to build a high-performance neuro-simulation tool for screening different improved oil recovery techniques: miscible injection (CO2 and N2), waterflooding and steam injection processes. The simulation tool consists of proxy models that implement a multilayer cascade feedforward back propagation network algorithm. The tool is intended to narrow the ranges of possible scenarios to be modeled using conventional simulation, reducing the extensive time and energy spent in dynamic reservoir modeling. A commercial reservoir simulator is used to generate the data to train and validate the artificial neural networks. The proxy models are built considering four different well patterns with different well operating conditions as the field design parameters. Different expert systems are developed for each well pattern. The screening networks predict oil production rate and cumulative oil production profiles for a given set of rock and fluid properties, and design parameters. The results of this study show that the networks are
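
    As a loose illustration of the proxy-model idea described above (a feedforward network trained on simulator output to map rock and fluid properties plus design parameters to a production response), the sketch below uses scikit-learn's MLPRegressor on synthetic data. The feature set, network size and data are invented placeholders; the authors report a multilayer cascade feedforward back-propagation network trained on commercial-simulator runs, which is not reproduced here.

```python
# Illustrative screening proxy: a small feedforward network maps reservoir
# properties and a design parameter to a cumulative-oil-like response.
# Synthetic data stands in for the commercial-simulator runs used in the study.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.uniform(0.05, 0.35, n),    # porosity (fraction), placeholder range
    rng.uniform(1.0, 1000.0, n),   # permeability (mD), placeholder range
    rng.uniform(0.5, 50.0, n),     # oil viscosity (cP), placeholder range
    rng.uniform(100.0, 5000.0, n), # injection rate (bbl/day), placeholder range
])
# Synthetic "cumulative oil" response standing in for simulator output
y = X[:, 0] * np.log10(X[:, 1]) * np.sqrt(X[:, 3]) / (1.0 + X[:, 2])

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
proxy = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=5000, random_state=0),
)
proxy.fit(X_train, y_train)
print("proxy R^2 on held-out runs:", proxy.score(X_test, y_test))
```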

  14. Development of methodology and computer programs for the ground response spectrum and the probabilistic seismic hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Joon Kyoung [Semyung Univ., Research Institute of Industrial Science and Technol , Jecheon (Korea, Republic of)

    1996-12-15

    The objective of this study is to investigate and develop methodologies and corresponding computer codes, compatible with the domestic seismological and geological environments, for estimating the ground response spectrum and the probabilistic seismic hazard. Using the PSHA computer program, the cumulative probability distribution functions (CPDF) and probability distribution functions (PDF) of the annual exceedance have been investigated for the analysis of the uncertainty space of the annual probability at ten seismic hazard levels of interest (0.1 g to 0.99 g). The cumulative probability functions and probability functions of the annual exceedance have also been compared with the corresponding results obtained from different input parameter spaces.
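
    For orientation, hazard levels of the kind quoted above are typically reported as annual probabilities of exceedance derived from an exceedance rate curve under a Poisson occurrence assumption. The sketch below shows that conversion; the rate curve is an invented placeholder, not output of the PSHA code developed in the study.

```python
# Annual probability of exceedance from an assumed hazard (rate) curve,
# using the standard Poisson relation P = 1 - exp(-rate * t) with t = 1 yr.
import numpy as np

pga_levels = np.linspace(0.1, 0.99, 10)          # g, ten hazard levels of interest
annual_rates = 1e-2 * np.exp(-4.0 * pga_levels)  # placeholder exceedance rates (1/yr)

annual_prob = 1.0 - np.exp(-annual_rates * 1.0)  # annual exceedance probability
for a, p in zip(pga_levels, annual_prob):
    print(f"PGA >= {a:.2f} g : annual exceedance probability = {p:.3e}")
```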

  15. Oil Well Blowout 3D computational modeling: review of methodology and environmental requirements

    OpenAIRE

    Pedro Mello Paiva; Alexandre Nunes Barreto; Jader Lugon Junior; Leticia Ferraço de Campos

    2016-01-01

    This literature review aims to present the different methodologies used in the three-dimensional modeling of the dispersion of hydrocarbons originating from an oil well blowout. It presents the concepts of coastal environmental sensitivity and vulnerability, their importance for prioritizing the most vulnerable areas in case of contingency, and the relevant legislation. We also discuss some limitations of the methodology currently used in environmental studies of oil drift, which considers a simplification of the spill at the surface, even in the well blowout scenario.

  16. Evaluation of dose from kV cone-beam computed tomography during radiotherapy: a comparison of methodologies

    Science.gov (United States)

    Buckley, J.; Wilkinson, D.; Malaroda, A.; Metcalfe, P.

    2017-01-01

    Three alternative methodologies to the Computed Tomography Dose Index for the evaluation of Cone-Beam Computed Tomography dose are compared: the Cone-Beam Dose Index (CBDI), the methodology recommended in IAEA Human Health Report No. 5, and the methodology recommended by AAPM Task Group 111. The protocols were evaluated for Pelvis and Thorax scan modes on Varian® On-Board Imager (OBI) and Truebeam kV XI imaging systems. The weighted planar average dose was highest for the AAPM methodology across all scans, with the CBDI being the second highest overall. For the XI system, decreases of 17.96% and 1.14% from the TG-111 protocol to the IAEA and CBDI protocols, respectively, were observed in Pelvis mode, and decreases of 18.15% and 13.10% in Thorax mode. For the OBI system, the corresponding variations were 16.46% and 7.14% in Pelvis mode, and 15.93% relative to the CBDI protocol in Thorax mode.
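
    For reference, a weighted planar average dose of the kind compared above is conventionally formed from one central and several peripheral chamber readings with a one-third / two-thirds weighting, as in CTDIw-style indices. The sketch below assumes that convention; the dose values are placeholders, not the study's measurements.

```python
# Conventional 1/3-centre + 2/3-periphery weighting used by CTDI-style indices.
# The chamber readings below are invented placeholders.
def weighted_planar_average(center_mgy, peripheral_mgy):
    """Weighted planar average dose (mGy) in a cylindrical phantom."""
    periphery = sum(peripheral_mgy) / len(peripheral_mgy)
    return center_mgy / 3.0 + 2.0 * periphery / 3.0

# One central and four peripheral positions, in mGy
print(weighted_planar_average(18.0, [24.0, 23.5, 24.2, 23.8]))
```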

  17. Establishing a standard calibration methodology for MOSFET detectors in computed tomography dosimetry

    International Nuclear Information System (INIS)

    Brady, S. L.; Kaufman, R. A.

    2012-01-01

    Purpose: The use of metal-oxide-semiconductor field-effect transistor (MOSFET) detectors for patient dosimetry has increased by ∼25% since 2005. Despite this increase, no standard calibration methodology has been identified nor calibration uncertainty quantified for the use of MOSFET dosimetry in CT. This work compares three MOSFET calibration methodologies proposed in the literature, and additionally investigates questions relating to optimal time for signal equilibration and exposure levels for maximum calibration precision. Methods: The calibration methodologies tested were (1) free in-air (FIA) with radiographic x-ray tube, (2) FIA with stationary CT x-ray tube, and (3) within scatter phantom with rotational CT x-ray tube. Each calibration was performed at absorbed dose levels of 10, 23, and 35 mGy. Times of 0 min or 5 min were investigated for signal equilibration before or after signal read out. Results: Calibration precision was measured to be better than 5%–7%, 3%–5%, and 2%–4% for the 10, 23, and 35 mGy respective dose levels, and independent of calibration methodology. No correlation was demonstrated for precision and signal equilibration time when allowing 5 min before or after signal read out. Differences in average calibration coefficients were demonstrated between the FIA with CT calibration methodology 26.7 ± 1.1 mV cGy⁻¹ versus the CT scatter phantom 29.2 ± 1.0 mV cGy⁻¹ and FIA with x-ray 29.9 ± 1.1 mV cGy⁻¹ methodologies. A decrease in MOSFET sensitivity was seen at an average change in read out voltage of ∼3000 mV. Conclusions: The best measured calibration precision was obtained by exposing the MOSFET detectors to 23 mGy. No signal equilibration time is necessary to improve calibration precision. A significant difference between calibration outcomes was demonstrated for FIA with CT compared to the other two methodologies. If the FIA with a CT calibration methodology was used to create calibration coefficients for the

  18. Establishing a standard calibration methodology for MOSFET detectors in computed tomography dosimetry.

    Science.gov (United States)

    Brady, S L; Kaufman, R A

    2012-06-01

    The use of metal-oxide-semiconductor field-effect transistor (MOSFET) detectors for patient dosimetry has increased by ~25% since 2005. Despite this increase, no standard calibration methodology has been identified nor calibration uncertainty quantified for the use of MOSFET dosimetry in CT. This work compares three MOSFET calibration methodologies proposed in the literature, and additionally investigates questions relating to optimal time for signal equilibration and exposure levels for maximum calibration precision. The calibration methodologies tested were (1) free in-air (FIA) with radiographic x-ray tube, (2) FIA with stationary CT x-ray tube, and (3) within scatter phantom with rotational CT x-ray tube. Each calibration was performed at absorbed dose levels of 10, 23, and 35 mGy. Times of 0 min or 5 min were investigated for signal equilibration before or after signal read out. Calibration precision was measured to be better than 5%-7%, 3%-5%, and 2%-4% for the 10, 23, and 35 mGy respective dose levels, and independent of calibration methodology. No correlation was demonstrated for precision and signal equilibration time when allowing 5 min before or after signal read out. Differences in average calibration coefficients were demonstrated between the FIA with CT calibration methodology 26.7 ± 1.1 mV cGy(-1) versus the CT scatter phantom 29.2 ± 1.0 mV cGy(-1) and FIA with x-ray 29.9 ± 1.1 mV cGy(-1) methodologies. A decrease in MOSFET sensitivity was seen at an average change in read out voltage of ~3000 mV. The best measured calibration precision was obtained by exposing the MOSFET detectors to 23 mGy. No signal equilibration time is necessary to improve calibration precision. A significant difference between calibration outcomes was demonstrated for FIA with CT compared to the other two methodologies. If the FIA with a CT calibration methodology was used to create calibration coefficients for the eventual use for phantom dosimetry, a measurement error ~12
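
    For illustration, the sketch below turns repeated MOSFET readings at a known delivered dose into a calibration coefficient (mV per cGy) and a precision figure (coefficient of variation), the two quantities compared across methodologies above. The voltage readings and dose level are invented placeholders, not data from the study.

```python
# Calibration coefficient (mV per cGy) and its precision from repeated exposures.
# The readings below are invented placeholders at the 23 mGy (2.3 cGy) level.
import statistics

delivered_dose_cgy = 2.3
readings_mv = [61.2, 62.0, 60.8, 61.7, 61.5]   # MOSFET threshold-voltage shifts

coeffs = [mv / delivered_dose_cgy for mv in readings_mv]
mean_coeff = statistics.mean(coeffs)
precision = statistics.stdev(coeffs) / mean_coeff * 100.0  # coefficient of variation, %

print(f"calibration coefficient = {mean_coeff:.1f} mV/cGy, precision = {precision:.1f}%")
```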

  19. General design methodology applied to the research domain of physical programming for computer illiterate

    CSIR Research Space (South Africa)

    Smith, Andrew C

    2011-09-01

    The authors discuss the application of the 'general design methodology' in the context of a physical computing project. The aim of the project was to design and develop physical objects that could serve as metaphors for computer programming elements...

  20. Development of a methodology to generate materials constant for the FLARE-G computer code

    International Nuclear Information System (INIS)

    Martinez, A.S.; Rosier, C.J.; Schirru, R.; Silva, F.C. da; Thome Filho, Z.D.

    1983-01-01

    A calculation methodology for determining the parametrization constants of the multiplication factor and the migration area is presented. These physical parameters are required in the solution of the diffusion equation by the nodal method, and they represent the appropriate form of the macrogroup constants from the cell calculation. An automatic system was developed to generate the parametrization constants. (E.G.)

  1. Computational intelligence as a platform for data collection methodology in management science

    DEFF Research Database (Denmark)

    Jespersen, Kristina Risom

    2006-01-01

    With the increased focus in management science on how to collect data close to the real world of managers, agent-based simulations have interesting prospects for the design of business applications aimed at the collection of data. As a new generation of data collection methodologies, this chapter discusses and presents a behavioral simulation founded in the agent-based simulation life cycle and supported by Web technology. With agent-based modeling, the complexity of the method is increased without limiting the research, because the technological support makes it possible to exploit the advantages of a questionnaire, an experimental design, a role-play and a scenario, thus gaining the synergy effect of these methodologies. At the end of the chapter, an example of a simulation is presented for researchers and practitioners to study.

  2. Development of the fire PSA methodology and the fire analysis computer code system

    International Nuclear Information System (INIS)

    Katsunori, Ogura; Tomomichi, Ito; Tsuyoshi, Uchida; Yusuke, Kasagawa

    2009-01-01

    A fire PSA methodology has been developed and applied to NPPs in Japan for power operation and LPSD states. The CDFs from the preliminary fire PSA for power operation were higher than those for internal events. A fire propagation analysis code system (CFAST/FDS Network) is being developed and verified through the OECD PRISME Project. Extension of the scope to the LPSD state is planned in order to establish the risk level. To quantify the fire risk level precisely, enhancements of the methodology are planned: verification and validation of the phenomenological fire propagation analysis code (CFAST/FDS Network) in the context of fire PSA, and application of the 'Electric Circuit Analysis' of NUREG/CR-6850 and related tests in order to quantify the hot-short effect precisely. Development of a seismic-induced fire PSA method, integrating the existing seismic PSA and fire PSA methods, is ongoing. The fire PSA will be applied to review the validity of fire prevention and mitigation measures.

  3. An integrated impact assessment and weighting methodology: evaluation of the environmental consequences of computer display technology substitution.

    Science.gov (United States)

    Zhou, Xiaoying; Schoenung, Julie M

    2007-04-01

    Computer display technology is currently in a state of transition, as the traditional technology of cathode ray tubes is being replaced by liquid crystal display flat-panel technology. Technology substitution and process innovation require the evaluation of the trade-offs among environmental impact, cost, and engineering performance attributes. General impact assessment methodologies, decision analysis and management tools, and optimization methods commonly used in engineering cannot efficiently address the issues involved in such an evaluation. The conventional Life Cycle Assessment (LCA) process often generates results that can be subject to multiple interpretations, although the advantages of the LCA concept and framework are widely recognized. In the present work, the LCA concept is integrated with Quality Function Deployment (QFD), a popular industrial quality management tool, which is used as the framework for the development of our integrated model. The problem of weighting is addressed by using pairwise comparison of stakeholder preferences. Thus, this paper presents a new integrated analytical approach, Integrated Industrial Ecology Function Deployment (I2-EFD), to assess the environmental behavior of alternative technologies in correlation with their performance and economic characteristics. Computer display technology is used as the case study to further develop our methodology through the modification and integration of various quality management tools (e.g., process mapping, prioritization matrix) and statistical methods (e.g., multi-attribute analysis, cluster analysis). Life cycle thinking provides the foundation for our methodology, as we utilize a published LCA report, which stopped at the characterization step, as our starting point. Further, we evaluate the validity and feasibility of our methodology by considering uncertainty and conducting sensitivity analysis.

  4. Computer-assisted detection (CAD) methodology for early detection of response to pharmaceutical therapy in tuberculosis patients

    Science.gov (United States)

    Lieberman, Robert; Kwong, Heston; Liu, Brent; Huang, H. K.

    2009-02-01

    The chest x-ray radiological features of tuberculosis patients are well documented, and the radiological features that change in response to successful pharmaceutical therapy can be followed with longitudinal studies over time. The patients can also be classified as either responsive or resistant to pharmaceutical therapy based on clinical improvement. We have retrospectively collected time series chest x-ray images of 200 patients diagnosed with tuberculosis receiving the standard pharmaceutical treatment. Computer algorithms can be created to utilize image texture features to assess the temporal changes in the chest x-rays of the tuberculosis patients. This methodology provides a framework for a computer-assisted detection (CAD) system that may provide physicians with the ability to detect poor treatment response earlier in pharmaceutical therapy. Early detection allows physicians to respond with more timely treatment alternatives and improved outcomes. Such a system has the potential to increase treatment efficacy for millions of patients each year.

  5. A Methodology to Reduce the Computational Effort in the Evaluation of the Lightning Performance of Distribution Networks

    Directory of Open Access Journals (Sweden)

    Ilaria Bendato

    2016-11-01

    The estimation of the lightning performance of a power distribution network is of great importance for designing its protection system against lightning. An accurate evaluation of the number of lightning events that can create dangerous overvoltages requires a huge computational effort, as it implies the adoption of a Monte Carlo procedure. Such a procedure consists of generating many different random lightning events and calculating the corresponding overvoltages. The paper proposes a methodology to deal with the problem in two computationally efficient ways: (i) finding the minimum number of Monte Carlo runs that lead to reliable results; and (ii) setting up a procedure that bypasses the lightning field-to-line coupling problem for each Monte Carlo run. The proposed approach is shown to provide results consistent with existing approaches while exhibiting superior Central Processing Unit (CPU) time performance.
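
    As a hedged sketch of point (i), a common way to decide the minimum number of Monte Carlo runs is to keep generating random events until the relative standard error of the estimated probability of a dangerous overvoltage falls below a target. The event model, thresholds and constants below are placeholders, not the procedure of the paper.

```python
# Adaptive stopping rule for a Monte Carlo overvoltage count: sample until
# the relative standard error of the estimated probability is small enough.
import math
import random

random.seed(1)
TARGET_REL_ERR = 0.05
MIN_RUNS = 1000

def dangerous_overvoltage() -> bool:
    # Placeholder for one random lightning event plus field-to-line coupling run
    return random.random() < 0.03

hits, runs = 0, 0
while True:
    hits += dangerous_overvoltage()
    runs += 1
    if runs >= MIN_RUNS and hits > 0:
        p = hits / runs
        rel_err = math.sqrt((1.0 - p) / (p * runs))  # rel. std. error of a proportion
        if rel_err <= TARGET_REL_ERR:
            break

print(f"runs needed: {runs}, estimated probability: {hits / runs:.4f}")
```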

  6. Holistic Development of Computer Engineering Curricula Using Y-Chart Methodology

    Science.gov (United States)

    Rashid, Muhammad; Tasadduq, Imran A.

    2014-01-01

    The exponential growth of advancing technologies is pushing curriculum designers in computer engineering (CpE) education to compress more and more content into the typical 4-year program, without necessarily paying much attention to the cohesiveness of those contents. The result has been highly fragmented curricula consisting of various…

  7. Methodological Advances in Political Gaming: The One-Person Computer Interactive, Quasi-Rigid Rule Game.

    Science.gov (United States)

    Shubik, Martin

    The main problem in computer gaming research is the initial decision of choosing the type of gaming method to be used. Free-form games lead to exciting open-ended confrontations that generate much information. However, they do not easily lend themselves to analysis because they generate far too much information and their results are seldom…

  8. A Methodological Study of a Computer-Managed Instructional Program in High School Physics.

    Science.gov (United States)

    Denton, Jon James

    The purpose of this study was to develop and evaluate an instructional model which utilized the computer to produce individually prescribed instructional guides in physics at the secondary school level. The sample consisted of three classes. Of these, two were randomly selected to serve as the treatment groups, e.g., individualized instruction and…

  9. Oil Well Blowout 3D computational modeling: review of methodology and environmental requirements

    Directory of Open Access Journals (Sweden)

    Pedro Mello Paiva

    2016-12-01

    This literature review aims to present the different methodologies used in the three-dimensional modeling of the dispersion of hydrocarbons originating from an oil well blowout. It presents the concepts of coastal environmental sensitivity and vulnerability, their importance for prioritizing the most vulnerable areas in case of contingency, and the relevant legislation. We also discuss some limitations of the methodology currently used in environmental studies of oil drift, which considers a simplification of the spill at the surface, even in the well blowout scenario. Efforts to better understand the behavior of oil and gas in the water column and the three-dimensional modeling of the trajectory gained strength after the Deepwater Horizon spill in the Gulf of Mexico in 2010. The data collected and the observations made during the accident were widely used for adjustment of the models, incorporating various factors related to hydrodynamic forcing and weathering processes to which the hydrocarbons are subjected during subsurface leaks. The difficulties prove to be even more challenging in the case of blowouts in deep waters, where the uncertainties are still larger. The studies addressed different variables to adjust oil and gas dispersion models along the upward trajectory. Factors that exert strong influences include: the speed of the subsurface currents; gas separation from the main plume; hydrate formation; dissolution of oil and gas droplets; variations in droplet diameter; intrusion of the droplets at intermediate depths; biodegradation; and appropriate parametrization of the density, salinity and temperature profiles of the water column.

  10. PECULIARITIES OF USING THE METHODOLOGY DISTANCE LEARNING OF THE SUBJECT «ENGINEERING AND COMPUTER GRAPHICS» FOR STUDENTS STUDYING BY CORRESPONDENCE

    OpenAIRE

    Olena V. Slobodianiuk

    2010-01-01

    A large part of the distance course for the subject «Engineering and Computer Graphics» (ECG) placed on the Internet looks like an electronic manual. However, the distance training process has a complicated structure and combines not only the study of theoretical material but also collaboration between students and a teacher and work in groups. A methodology for distance learning of ECG is proposed. This methodology was developed and researched at the faculty for engineering and computer graphics of Vinnitsa National Te...

  11. Computational methodology of sodium-water reaction phenomenon in steam generator of sodium-cooled fast reactor

    International Nuclear Information System (INIS)

    Takata, Takashi; Yamaguchi, Akira; Uchibori, Akihiro; Ohshima, Hiroyuki

    2009-01-01

    A new computational methodology for the sodium-water reaction (SWR), which occurs in a steam generator of a liquid-sodium-cooled fast reactor when a heat transfer tube in the steam generator fails, has been developed considering multidimensional and multiphysics thermal hydraulics. Two kinds of reaction models are proposed in accordance with the phase of sodium acting as a reactant. One is the surface reaction model, in which water vapor reacts directly with liquid sodium at the interface between the liquid sodium and the water vapor. The reaction heat leads to vigorous evaporation of liquid sodium, resulting in a reaction of gas-phase sodium; this is described by the gas-phase reaction model. These two models are coupled with a multidimensional, multicomponent-gas, multiphase thermal hydraulics simulation method with compressibility (named the 'SERAPHIM' code). Using the present methodology, a numerical investigation of the SWR under a pin-bundle configuration (a benchmark analysis of the SWAT-1R experiment) has been carried out. As a result, a maximum gas temperature of approximately 1,300 deg C is predicted stably, which lies within the range of previous experimental observations. It is also demonstrated that the maximum mass-weighted average temperature in the analysis agrees reasonably well with the experimental result measured by thermocouples. The present methodology is promising for establishing theoretical and mechanistic modeling of secondary failure propagation of heat transfer tubes due to, for example, overheating rupture and wastage. (author)

  12. Computing elastic‐rebound‐motivated earthquake probabilities in unsegmented fault models: a new methodology supported by physics‐based simulators

    Science.gov (United States)

    Field, Edward H.

    2015-01-01

    A methodology is presented for computing elastic‐rebound‐based probabilities in an unsegmented fault or fault system, which involves computing along‐fault averages of renewal‐model parameters. The approach is less biased and more self‐consistent than a logical extension of that applied most recently for multisegment ruptures in California. It also enables the application of magnitude‐dependent aperiodicity values, which the previous approach does not. Monte Carlo simulations are used to analyze long‐term system behavior, which is generally found to be consistent with that of physics‐based earthquake simulators. Results cast doubt that recurrence‐interval distributions at points on faults look anything like traditionally applied renewal models, a fact that should be considered when interpreting paleoseismic data. We avoid such assumptions by changing the "probability of what" question (from offset at a point to the occurrence of a rupture, assuming it is the next event to occur). The new methodology is simple, although not perfect in terms of recovering long‐term rates in Monte Carlo simulations. It represents a reasonable, improved way to represent first‐order elastic‐rebound predictability, assuming it is there in the first place, and for a system that clearly exhibits other unmodeled complexities, such as aftershock triggering.
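
    A minimal sketch of the elastic-rebound (renewal-model) probability calculation discussed above, assuming a lognormal recurrence distribution as a stand-in for whichever renewal model is applied; the mean recurrence interval, aperiodicity, elapsed time and forecast window are placeholder values, and the along-fault averaging the paper proposes is not shown.

```python
# Conditional probability that the next rupture occurs within the forecast
# window, given the time elapsed since the last event, for a lognormal
# renewal model. All parameter values are illustrative placeholders.
import numpy as np
from scipy.stats import lognorm

mean_ri = 200.0      # mean recurrence interval, years
aperiodicity = 0.3   # coefficient of variation
t_elapsed = 150.0    # years since last rupture
dt = 30.0            # forecast window, years

# Lognormal parameterized so that mean = mean_ri and CV = aperiodicity
sigma = np.sqrt(np.log(1.0 + aperiodicity**2))
mu = np.log(mean_ri) - 0.5 * sigma**2
dist = lognorm(s=sigma, scale=np.exp(mu))

p_cond = (dist.cdf(t_elapsed + dt) - dist.cdf(t_elapsed)) / dist.sf(t_elapsed)
print(f"conditional probability in next {dt:.0f} yr: {p_cond:.3f}")
```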

  13. A Methodological Framework for Software Safety in Safety Critical Computer Systems

    OpenAIRE

    P. V. Srinivas Acharyulu; P. Seetharamaiah

    2012-01-01

    Software safety must deal with the principles of safety management, safety engineering and software engineering for developing safety-critical computer systems, with the target of making the system safe, risk-free and fail-safe, in addition to providing a clear differentiation for assessing and evaluating the risk, in line with the principles of software risk management. Problem statement: prevailing software quality models and standards have not succeeded in adequately addressing the software safety...

  14. Multiscale Computational Fluid Dynamics: Methodology and Application to PECVD of Thin Film Solar Cells

    Directory of Open Access Journals (Sweden)

    Marquis Crose

    2017-02-01

    This work focuses on the development of a multiscale computational fluid dynamics (CFD) simulation framework with application to plasma-enhanced chemical vapor deposition of thin film solar cells. A macroscopic CFD model is proposed which is capable of accurately reproducing plasma chemistry and transport phenomena within a 2D axisymmetric reactor geometry. Additionally, the complex interactions that take place on the surface of a-Si:H thin films are coupled with the CFD simulation using a novel kinetic Monte Carlo scheme which describes the thin film growth, leading to a multiscale CFD model. Due to the significant computational challenges imposed by this multiscale CFD model, a parallel computation strategy is presented which allows for reduced processing time via the discretization of both the gas-phase mesh and the microscopic thin film growth processes. Finally, the multiscale CFD model has been applied to the PECVD process at industrially relevant operating conditions, revealing non-uniformities greater than 20% in the growth rate of amorphous silicon films across the radius of the wafer.

  15. HRP's Healthcare Spin-Offs Through Computational Modeling and Simulation Practice Methodologies

    Science.gov (United States)

    Mulugeta, Lealem; Walton, Marlei; Nelson, Emily; Peng, Grace; Morrison, Tina; Erdemir, Ahmet; Myers, Jerry

    2014-01-01

    Spaceflight missions expose astronauts to novel operational and environmental conditions that pose health risks that are currently not well understood, and perhaps unanticipated. Furthermore, given the limited number of humans that have flown in long duration missions and beyond low Earth orbit, the amount of research and clinical data necessary to predict and mitigate these health and performance risks is limited. Consequently, NASA's Human Research Program (HRP) conducts research and develops advanced methods and tools to predict, assess, and mitigate potential hazards to the health of astronauts. In this light, NASA has explored the possibility of leveraging computational modeling since the 1970s as a means to elucidate the physiologic risks of spaceflight and develop countermeasures. Since that time, substantial progress has been realized in this arena through a number of HRP-funded activities such as the Digital Astronaut Project (DAP) and the Integrated Medical Model (IMM). Much of this success can be attributed to HRP's endeavor to establish rigorous verification, validation, and credibility (VV&C) processes that ensure computational models and simulations (M&S) are sufficiently credible to address issues within their intended scope. This presentation summarizes HRP's activities in credibility of modeling and simulation, in particular through its outreach to the community of modeling and simulation practitioners. METHODS: The HRP requires that all M&S that can have a moderate to high impact on crew health or mission success be vetted in accordance with the NASA Standard for Models and Simulations, NASA-STD-7009 (7009) [5]. As this standard mostly focuses on engineering systems, the IMM and DAP have invested substantial efforts to adapt the processes established in this standard for their application to biological M&S, which is more prevalent in human health and performance (HHP) and space biomedical research and operations [6,7]. These methods have also generated

  16. Computational Efficient Upscaling Methodology for Predicting Thermal Conductivity of Nuclear Waste forms

    International Nuclear Information System (INIS)

    Li, Dongsheng; Sun, Xin; Khaleel, Mohammad A.

    2011-01-01

    This study evaluated different upscaling methods for predicting the thermal conductivity of loaded nuclear waste form, a heterogeneous material system, and compared their efficiency and accuracy. Thermal conductivity in loaded nuclear waste form is an important property for the waste form Integrated Performance and Safety Code (IPSC). The effective thermal conductivity, obtained from microstructure information and the local thermal conductivity of the different components, is critical in predicting the life and performance of the waste form during storage, because the heat generated during storage is directly related to thermal conductivity, which in turn determines the mechanical deformation behavior, corrosion resistance and aging performance. Several methods, including the Taylor model, Sachs model, self-consistent model, and statistical upscaling models, were developed and implemented. In the absence of experimental data, predictions from the finite element method (FEM) were used as the reference to determine the accuracy of the different upscaling models. Micrographs from different loadings of nuclear waste were used in the prediction of thermal conductivity. The results demonstrated that, in terms of efficiency, the bounding models (Taylor and Sachs) are better than the self-consistent model, the statistical upscaling method and FEM. Balancing computational resources and accuracy, statistical upscaling is a computationally efficient method for predicting the effective thermal conductivity of nuclear waste form.
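
    In the conductivity setting, Taylor- and Sachs-type bounding models of the kind compared above reduce to volume-weighted arithmetic (parallel) and harmonic (series) averages of the phase conductivities. The sketch below illustrates that reduction under that assumption; the phase conductivities and volume fractions are placeholders, not values from the study.

```python
# Arithmetic (Taylor/parallel) and harmonic (Sachs/series) volume averages,
# the two bound-type estimates of effective thermal conductivity.
def taylor_bound(k, f):
    """Upper-bound estimate: volume-weighted arithmetic mean."""
    return sum(ki * fi for ki, fi in zip(k, f))

def sachs_bound(k, f):
    """Lower-bound estimate: volume-weighted harmonic mean."""
    return 1.0 / sum(fi / ki for ki, fi in zip(k, f))

k_phase = [1.2, 15.0]   # W/(m K): glass matrix and loaded waste phase (placeholders)
vol_frac = [0.7, 0.3]

print("Taylor (upper) bound:", taylor_bound(k_phase, vol_frac))
print("Sachs  (lower) bound:", sachs_bound(k_phase, vol_frac))
```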

  17. The mechanical design of a transfemoral prosthesis using computational tools and design methodology

    Directory of Open Access Journals (Sweden)

    John Sánchez Otero

    2012-09-01

    Artificial limb replacement with lower limb prostheses has been widely reported in the current scientific literature. There are many lower limb prosthetic designs, ranging from a single-axis knee mechanism to complex mechanisms involving microcontrollers, made from many materials ranging from lightweight, high specific strength ones (e.g., carbon fibre) to traditional forms (e.g., stainless steel). However, the challenge is to design prostheses whose movement resembles the human body's natural movement as closely as possible. Advances in prosthetics have enabled many amputees to return to their everyday activities; however, such prostheses are expensive, some costing as much as $60,000. Many of the affected population in Colombia have scarce economic resources; there is therefore a need to develop affordable functional prostheses. The Universidad del Norte's Materials, Processes and Design Research Group and the Robotics and Intelligent Systems Group have been working on this line of research to develop modular prostheses which can be adjusted to each patient's requirements. This research represents an initial methodological approach to developing a prosthesis in which software tools have been used (the finite element method) with a criteria relationship matrix for selecting the best alternative while considering different aspects such as modularity, cost, stiffness and weight.

  18. Numerical methodologies for investigation of moderate-velocity flow using a hybrid computational fluid dynamics - molecular dynamics simulation approach

    International Nuclear Information System (INIS)

    Ko, Soon Heum; Kim, Na Yong; Nikitopoulos, Dimitris E.; Moldovan, Dorel; Jha, Shantenu

    2014-01-01

    Numerical approaches are presented to minimize the statistical errors inherently present due to finite sampling and the presence of thermal fluctuations in the molecular region of a hybrid computational fluid dynamics (CFD) - molecular dynamics (MD) flow solution. Near the fluid-solid interface the hybrid CFD-MD simulation approach provides a more accurate solution, especially in the presence of significant molecular-level phenomena, than the traditional continuum-based simulation techniques. It also involves less computational cost than the pure particle-based MD. Despite these advantages, the hybrid CFD-MD methodology has been applied mostly in flow studies at high velocities, mainly because of the higher statistical errors associated with low velocities. As an alternative to the costly increase of the size of the MD region to decrease statistical errors, we investigate a few numerical approaches that reduce sampling noise of the solution at moderate velocities. These methods are based on sampling of multiple simulation replicas and linear regression of multiple spatial/temporal samples. We discuss the advantages and disadvantages of each technique in the perspective of solution accuracy and computational cost.

  19. Optimal dose reduction in computed tomography methodologies predicted from real-time dosimetry

    Science.gov (United States)

    Tien, Christopher Jason

    Over the past two decades, computed tomography (CT) has become an increasingly common and useful medical imaging technique. CT is a noninvasive imaging modality with three-dimensional volumetric viewing abilities, all in sub-millimeter resolution. Recent national scrutiny on radiation dose from medical exams has spearheaded an initiative to reduce dose in CT. This work concentrates on dose reduction of individual exams through two recently-innovated dose reduction techniques: organ dose modulation (ODM) and tube current modulation (TCM). ODM and TCM tailor the phase and amplitude of x-ray current, respectively, used by the CT scanner during the scan. These techniques are unique because they can be used to achieve patient dose reduction without any appreciable loss in image quality. This work details the development of the tools and methods featuring real-time dosimetry which were used to provide pioneering measurements of ODM or TCM in dose reduction for CT.

  20. Measurements of air kerma index in computed tomography: a comparison among methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Alonso, T. C.; Mourao, A. P.; Da Silva, T. A., E-mail: alonso@cdtn.br [Universidade Federal de Minas Gerais, Programa de Ciencia y Tecnicas Nucleares, Av. Pres. Antonio Carlos 6627, Pampulha, 31270-901 Belo Horizonte, Minas Gerais (Brazil)

    2016-10-15

    Computed tomography (CT) has become the most important and widely used technique for diagnostic purposes. As CT exams impart high doses to patients in comparison with other radiological techniques, reliable dosimetry is required. Dosimetry in CT is done in terms of the air kerma index, in air or in a phantom, measured by a pencil ionization chamber during a single X-ray tube rotation. In this work, a comparison among CT dosimetric quantities measured by an UNFORS pencil ionization chamber, MTS-N RADOS thermoluminescent dosimeters and GAFCHROMIC XR-CT radiochromic film was performed. The three dosimetric systems were properly calibrated in X-ray reference radiations at a calibration laboratory. CT dosimetric quantities were measured in a GE Medical Systems Bright Speed CT scanner using a PMMA trunk phantom, and a comparison among the three dosimetric techniques was made. (Author)

  1. Measurements of air kerma index in computed tomography: a comparison among methodologies

    International Nuclear Information System (INIS)

    Alonso, T. C.; Mourao, A. P.; Da Silva, T. A.

    2016-10-01

    Computed tomography (CT) has become the most important and widely used technique for diagnostic purposes. As CT exams impart high doses to patients in comparison with other radiological techniques, reliable dosimetry is required. Dosimetry in CT is done in terms of the air kerma index, in air or in a phantom, measured by a pencil ionization chamber during a single X-ray tube rotation. In this work, a comparison among CT dosimetric quantities measured by an UNFORS pencil ionization chamber, MTS-N RADOS thermoluminescent dosimeters and GAFCHROMIC XR-CT radiochromic film was performed. The three dosimetric systems were properly calibrated in X-ray reference radiations at a calibration laboratory. CT dosimetric quantities were measured in a GE Medical Systems Bright Speed CT scanner using a PMMA trunk phantom, and a comparison among the three dosimetric techniques was made. (Author)

  2. Selection of meteorological parameters affecting rainfall estimation using neuro-fuzzy computing methodology

    Science.gov (United States)

    Hashim, Roslan; Roy, Chandrabhushan; Motamedi, Shervin; Shamshirband, Shahaboddin; Petković, Dalibor; Gocic, Milan; Lee, Siew Cheng

    2016-05-01

    Rainfall is a complex atmospheric process that varies over time and space. Researchers have used various empirical and numerical methods to enhance estimation of rainfall intensity. In this study, we developed a novel prediction model, with an emphasis on accuracy, to identify the most significant meteorological parameters affecting rainfall. For this, we used five input parameters: wet day frequency (dwet), vapor pressure (e̅a), maximum and minimum air temperatures (Tmax and Tmin), and cloud cover (cc). The data were obtained from the Indian Meteorological Department for the city of Patna, Bihar, India. Further, a type of soft-computing method, known as the adaptive neuro-fuzzy inference system (ANFIS), was applied to the available data. In this respect, the observation data from 1901 to 2000 were employed for testing, validating, and estimating monthly rainfall via the simulated model. In addition, the ANFIS process for variable selection was implemented to detect the predominant variables affecting rainfall prediction. Finally, the performance of the model was compared to other soft-computing approaches, including the artificial neural network (ANN), support vector machine (SVM), extreme learning machine (ELM), and genetic programming (GP). The results revealed that ANN, ELM, ANFIS, SVM, and GP had R2 values of 0.9531, 0.9572, 0.9764, 0.9525, and 0.9526, respectively. Therefore, we conclude that ANFIS is the best of these methods for predicting monthly rainfall. Moreover, dwet was found to be the most influential parameter for rainfall prediction and the best predictor of accuracy. This study also identified the sets of two and three meteorological parameters that give the best predictions.

  3. BIGHORN Computational Fluid Dynamics Theory, Methodology, and Code Verification & Validation Benchmark Problems

    Energy Technology Data Exchange (ETDEWEB)

    Xia, Yidong [Idaho National Lab. (INL), Idaho Falls, ID (United States); Andrs, David [Idaho National Lab. (INL), Idaho Falls, ID (United States); Martineau, Richard Charles [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-08-01

    This document presents the theoretical background for a hybrid finite-element / finite-volume fluid flow solver, namely BIGHORN, based on the Multiphysics Object Oriented Simulation Environment (MOOSE) computational framework developed at the Idaho National Laboratory (INL). An overview of the numerical methods used in BIGHORN is given, followed by a presentation of the formulation details. The document begins with the governing equations for compressible fluid flow, with an outline of the requisite constitutive relations. A second-order finite volume method used for solving the compressible fluid flow problems is presented next. A Pressure-Corrected Implicit Continuous-fluid Eulerian (PCICE) formulation for time integration is also presented. The multi-fluid formulation is still being developed; although it is not yet complete, BIGHORN has been designed to handle multi-fluid problems. Due to the flexibility of the underlying MOOSE framework, BIGHORN is quite extensible and can accommodate both multi-species and multi-phase formulations. This document also presents a suite of verification & validation benchmark test problems for BIGHORN. The intent of this suite of problems is to provide baseline comparison data that demonstrate the performance of the BIGHORN solution methods on problems that vary in complexity from laminar to turbulent flows. Wherever possible, some form of solution verification has been attempted to identify sensitivities in the solution methods and suggest best practices when using BIGHORN.

  4. Computational benefits using artificial intelligent methodologies for the solution of an environmental design problem: saltwater intrusion.

    Science.gov (United States)

    Papadopoulou, Maria P; Nikolos, Ioannis K; Karatzas, George P

    2010-01-01

    Artificial Neural Networks (ANNs) comprise a powerful tool to approximate the complicated behavior and response of physical systems allowing considerable reduction in computation time during time-consuming optimization runs. In this work, a Radial Basis Function Artificial Neural Network (RBFN) is combined with a Differential Evolution (DE) algorithm to solve a water resources management problem, using an optimization procedure. The objective of the optimization scheme is to cover the daily water demand on the coastal aquifer east of the city of Heraklion, Crete, without reducing the subsurface water quality due to seawater intrusion. The RBFN is utilized as an on-line surrogate model to approximate the behavior of the aquifer and to replace some of the costly evaluations of an accurate numerical simulation model which solves the subsurface water flow differential equations. The RBFN is used as a local approximation model in such a way as to maintain the robustness of the DE algorithm. The results of this procedure are compared to the corresponding results obtained by using the Simplex method and by using the DE procedure without the surrogate model. As it is demonstrated, the use of the surrogate model accelerates the convergence of the DE optimization procedure and additionally provides a better solution at the same number of exact evaluations, compared to the original DE algorithm.
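
    A minimal sketch of the surrogate-assisted idea described above, assuming SciPy is available: a radial basis function model is fitted to a small set of expensive model evaluations, and a differential evolution optimizer then works on the cheap surrogate. The two-variable test function stands in for the numerical flow/transport simulation of the aquifer; the bounds and sample sizes are illustrative, and the paper's on-line surrogate updating and management constraints are not reproduced.

```python
# Surrogate-assisted search: fit an RBF model to a few expensive evaluations,
# then optimize the surrogate with differential evolution.
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.optimize import differential_evolution

def expensive_simulation(x):
    """Placeholder for the costly subsurface flow/transport simulation."""
    return (x[0] - 1.5) ** 2 + (x[1] + 0.5) ** 2 + 0.1 * np.sin(5 * x[0])

rng = np.random.default_rng(0)
bounds = [(-3.0, 3.0), (-3.0, 3.0)]

# Sample the expensive model at a small design of experiments
X = rng.uniform([b[0] for b in bounds], [b[1] for b in bounds], size=(40, 2))
y = np.array([expensive_simulation(x) for x in X])

surrogate = RBFInterpolator(X, y, kernel="thin_plate_spline")

# Optimize on the surrogate instead of the expensive model
result = differential_evolution(
    lambda x: float(surrogate(x.reshape(1, -1))[0]), bounds, seed=0)
print("surrogate optimum:", result.x, "surrogate value:", result.fun)
print("true value at that point:", expensive_simulation(result.x))
```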

  5. Computer Class Role Playing Games, an innovative teaching methodology based on STEM and ICT: first experimental results

    Science.gov (United States)

    Maraffi, S.

    2016-12-01

    Context/Purpose: We experimented with a new teaching and learning technology: a Computer Class Role Playing Game (RPG) that delivers educational activity in classrooms through an interactive game. This approach is new; there are some experiences with educational games, but they are mainly individual rather than class-based. Gaming all together as a class, with a single goal for the whole class, enhances peer collaboration, cooperative problem solving and friendship. Methods: To perform the research, we tested the games in several classes of different grades, collecting dedicated questionnaires from teachers and pupils. Results: The experimental results were outstanding: RPG, our interactive activity, exceeded by 50% the overall satisfaction obtained with traditional lessons or PowerPoint-supported teaching. Interpretation: The appreciation of the RPG was in agreement with the class-level outcome identified by the teacher after the experimentation. Our work received excellent feedback from teachers regarding the efficacy of this new teaching methodology and the achieved results. Using a new methodology closer to the student's point of view improves the innovation and creative capacities of learners, and it supports the new role of the teacher as the learners' "coach". Conclusion: This paper presents the first experimental results on the application of this new technology, based on a computer game which projects on a wall in the class an adventure lived by the students. The plots of the adventures are designed for deeper learning of Science, Technology, Engineering, Mathematics (STEM) and Social Sciences & Humanities (SSH). The participation of the pupils is based on interaction with the game through their own tablets or smartphones. The game is based on a mixed reality learning environment, giving the students the feeling of being IN the adventure.

  6. A methodology for direct quantification of over-ranging length in helical computed tomography with real-time dosimetry.

    Science.gov (United States)

    Tien, Christopher J; Winslow, James F; Hintenlang, David E

    2011-01-31

    In helical computed tomography (CT), reconstruction information from volumes adjacent to the clinical volume of interest (VOI) is required for proper reconstruction. Previous studies have relied upon either operator console readings or indirect extrapolation of measurements in order to determine the over-ranging length of a scan. This paper presents a methodology for the direct quantification of over-ranging dose contributions using real-time dosimetry. A Siemens SOMATOM Sensation 16 multislice helical CT scanner is used with a novel real-time "point" fiber-optic dosimeter system with 10 ms temporal resolution to measure over-ranging length, which is also expressed in dose-length-product (DLP). Film was used to benchmark the exact length of over-ranging. Over-ranging length varied from 4.38 cm at a pitch of 0.5 to 6.72 cm at a pitch of 1.5, which corresponds to a DLP of 131 to 202 mGy-cm. The dose-extrapolation method of Van der Molen et al. yielded results within 3%, while the console reading method of Tzedakis et al. yielded consistently larger over-ranging lengths. From film measurements, it was determined that Tzedakis et al. overestimated over-ranging lengths by one-half of the beam collimation width. Over-ranging length measured as a function of reconstruction slice thickness produced two linear regions, similar to previous publications. Over-ranging is quantified with both absolute length and DLP, and contributes about 60 mGy-cm or about 10% of the DLP for a routine abdominal scan. This paper presents a direct physical measurement of over-ranging length within 10% of previous methodologies. Current uncertainties are less than 1%, in comparison with 5% in other methodologies. Clinical implementation can be simplified by using only one dosimeter if codependence with console readings is acceptable, with an uncertainty of 1.1%. This methodology will be applied to different vendors, models, and postprocessing methods, which have been shown to produce over-ranging lengths
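
    A minimal sketch of how a real-time dose-rate trace can be turned into an over-ranging length and its DLP contribution, under simple assumptions (constant table speed, a known beam-on time for the programmed scan length, and a placeholder CTDIvol). The numbers and the synthetic trace are illustrative, not the measurements reported above; only the 10 ms sampling interval mirrors the quoted dosimeter resolution.

```python
# Over-ranging length and extra DLP from a sampled beam-on trace.
import numpy as np

dt = 0.010                  # s, real-time dosimeter sampling interval
table_speed = 2.4           # cm/s (placeholder: collimation * pitch / rotation time)
programmed_time = 8.0       # s of beam-on required for the planned scan length
ctdi_vol = 15.0             # mGy, placeholder volume dose index for the protocol

# Synthetic beam-on trace: 9.5 s of exposure (so 1.5 s of over-ranging)
t = np.arange(0.0, 12.0, dt)
dose_rate = np.where(t < 9.5, 20.0, 0.0)    # mGy/s at the dosimeter, placeholder

beam_on_time = float(np.sum(dose_rate > 0)) * dt
overrange_length = max(beam_on_time - programmed_time, 0.0) * table_speed
overrange_dlp = ctdi_vol * overrange_length

print(f"over-ranging length: {overrange_length:.2f} cm, "
      f"extra DLP: {overrange_dlp:.1f} mGy*cm")
```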

  7. Computational model and performance optimization methodology of a compact design heat exchanger used as an IHX in HTGR

    International Nuclear Information System (INIS)

    De la Torre V, R.; Francois L, J. L.

    2017-09-01

    The intermediate heat exchangers (IHX) used in high-temperature gas-cooled reactors (HTGR) operate under complex conditions, characterized by temperatures above 1073 K. Conventional shell-and-tube designs have shown disadvantages with respect to compact designs. In this work, computational models of a compact heat exchanger design, the printed circuit heat exchanger, were built under the IHX conditions of an HTGR installation. In these models, a detailed three-dimensional geometry, corresponding to one transfer unit of the heat exchanger, was considered. Computational fluid dynamics techniques and finite element methods were used to study the thermo-hydraulic and mechanical behavior of the equipment, respectively. The properties of the materials were defined as functions of temperature. The thermo-hydraulic results obtained were applied as operating conditions in the structural calculations. A methodology was developed based on the analysis of capital and operating costs, which combines the heat transfer, the pressure drop and the mechanical behavior of the structure into a single optimization variable. By analyzing the experimental results of other authors, a relationship was obtained between the operating time of the equipment and the maximum stress in the structure, which was used in the model. The results show that the design that yields the highest thermal efficiency differs from the one with the lowest total annual cost. (Author)

  8. Exploring methodological frameworks for a mental task-based near-infrared spectroscopy brain-computer interface.

    Science.gov (United States)

    Weyand, Sabine; Takehara-Nishiuchi, Kaori; Chau, Tom

    2015-10-30

    Near-infrared spectroscopy (NIRS) brain-computer interfaces (BCIs) enable users to interact with their environment using only cognitive activities. This paper presents the results of a comparison of four methodological frameworks used to select a pair of tasks to control a binary NIRS-BCI; specifically, three novel personalized task paradigms and the state-of-the-art prescribed task framework were explored. Three types of personalized task selection approaches were compared, including: user-selected mental tasks using weighted slope scores (WS-scores), user-selected mental tasks using pair-wise accuracy rankings (PWAR), and researcher-selected mental tasks using PWAR. These paradigms, along with the state-of-the-art prescribed mental task framework, where mental tasks are selected based on the most commonly used tasks in literature, were tested by ten able-bodied participants who took part in five NIRS-BCI sessions. The frameworks were compared in terms of their accuracy, perceived ease-of-use, computational time, user preference, and length of training. Most notably, researcher-selected personalized tasks resulted in significantly higher accuracies, while user-selected personalized tasks resulted in significantly higher perceived ease-of-use. It was also concluded that PWAR minimized the amount of data that needed to be collected; while, WS-scores maximized user satisfaction and minimized computational time. In comparison to the state-of-the-art prescribed mental tasks, our findings show that overall, personalized tasks appear to be superior to prescribed tasks with respect to accuracy and perceived ease-of-use. The deployment of personalized rather than prescribed mental tasks ought to be considered and further investigated in future NIRS-BCI studies. Copyright © 2015 Elsevier B.V. All rights reserved.

  9. Kinetic Monte Carlo simulations for transient thermal fields: Computational methodology and application to the submicrosecond laser processes in implanted silicon.

    Science.gov (United States)

    Fisicaro, G; Pelaz, L; Lopez, P; La Magna, A

    2012-09-01

    Pulsed laser irradiation of damaged solids promotes ultrafast nonequilibrium kinetics, on the submicrosecond scale, leading to microscopic modifications of the material state. Reliable theoretical predictions of this evolution can be achieved only by simulating particle interactions in the presence of large and transient gradients of the thermal field. We propose a kinetic Monte Carlo (KMC) method for the simulation of damaged systems in the extremely far-from-equilibrium conditions caused by the laser irradiation. The reference systems are nonideal crystals containing point defect excesses, an order of magnitude larger than the equilibrium density, due to a preirradiation ion implantation process. The thermal problem and, where it occurs, the melting problem are solved within the phase-field methodology, and the numerical solutions for the space- and time-dependent thermal field are then dynamically coupled to the KMC code. The formalism, implementation, and related tests of our computational code are discussed in detail. As an application example we analyze the evolution of the defect system caused by P ion implantation in Si under nanosecond pulsed irradiation. The simulation results suggest a significant annihilation of the implantation damage, which can be well controlled by the laser fluence.
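
    A minimal sketch of a rejection-free KMC step whose Arrhenius rates are evaluated at a time-dependent temperature supplied externally, which is the essential coupling described above. Here a simple exponential cooling law stands in for the phase-field thermal solution, and the event list, attempt frequencies and activation energies are placeholders, not the defect physics used in the study.

```python
# Residence-time (rejection-free) KMC step with Arrhenius rates evaluated
# at the current, externally supplied temperature.
import math
import random

random.seed(0)
KB = 8.617e-5               # Boltzmann constant, eV/K

def temperature(t):
    """Placeholder for the phase-field thermal solution T(t)."""
    return 300.0 + 1200.0 * math.exp(-t / 1e-7)   # fast transient, K

events = [                   # (name, attempt frequency 1/s, activation energy eV)
    ("vacancy_hop", 1e13, 0.45),
    ("interstitial_hop", 1e13, 0.30),
    ("recombination", 1e12, 0.20),
]

t = 0.0
for _ in range(5):
    T = temperature(t)
    rates = [nu * math.exp(-ea / (KB * T)) for _, nu, ea in events]
    total = sum(rates)
    # advance the clock and pick an event proportionally to its rate
    t += -math.log(random.random()) / total
    r, pick = random.random() * total, 0
    while r > rates[pick]:
        r -= rates[pick]
        pick += 1
    print(f"t = {t:.3e} s, T = {T:.0f} K, event = {events[pick][0]}")
```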

  10. Methodology for sensitivity analysis, approximate analysis, and design optimization in CFD for multidisciplinary applications. [computational fluid dynamics

    Science.gov (United States)

    Taylor, Arthur C., III; Hou, Gene W.

    1992-01-01

    Fundamental equations of aerodynamic sensitivity analysis and approximate analysis for the two dimensional thin layer Navier-Stokes equations are reviewed, and special boundary condition considerations necessary to apply these equations to isolated lifting airfoils on 'C' and 'O' meshes are discussed in detail. An efficient strategy which is based on the finite element method and an elastic membrane representation of the computational domain is successfully tested, which circumvents the costly 'brute force' method of obtaining grid sensitivity derivatives, and is also useful in mesh regeneration. The issue of turbulence modeling is addressed in a preliminary study. Aerodynamic shape sensitivity derivatives are efficiently calculated, and their accuracy is validated on two viscous test problems, including: (1) internal flow through a double throat nozzle, and (2) external flow over a NACA 4-digit airfoil. An automated aerodynamic design optimization strategy is outlined which includes the use of a design optimization program, an aerodynamic flow analysis code, an aerodynamic sensitivity and approximate analysis code, and a mesh regeneration and grid sensitivity analysis code. Application of the optimization methodology to the two test problems in each case resulted in a new design having a significantly improved performance in the aerodynamic response of interest.
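
    As a toy illustration of checking an analytic sensitivity derivative against the 'brute force' finite-difference approach mentioned above, the following sketch uses a stand-in scalar response rather than a Navier-Stokes solution.

        # Tiny sketch of validating an analytic shape-sensitivity derivative against a
        # "brute force" finite difference, the kind of check the abstract describes.
        # The response function is a stand-in, not an aerodynamic flow solve.
        import numpy as np


        def response(shape_param):
            """Placeholder aerodynamic response (e.g. drag) of one shape parameter."""
            return shape_param ** 2 + np.sin(3.0 * shape_param)


        def analytic_sensitivity(shape_param):
            return 2.0 * shape_param + 3.0 * np.cos(3.0 * shape_param)


        p, h = 0.4, 1e-6
        fd = (response(p + h) - response(p - h)) / (2 * h)      # central difference
        print(f"analytic {analytic_sensitivity(p):.6f} vs finite difference {fd:.6f}")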

  11. Effects of megavoltage computed tomographic scan methodology on setup verification and adaptive dose calculation in helical TomoTherapy.

    Science.gov (United States)

    Zhu, Jian; Bai, Tong; Gu, Jiabing; Sun, Ziwen; Wei, Yumei; Li, Baosheng; Yin, Yong

    2018-04-27

    To evaluate the effect of pretreatment megavoltage computed tomographic (MVCT) scan methodology on setup verification and adaptive dose calculation in helical TomoTherapy. Anthropomorphic heterogeneous chest and pelvic phantoms were planned with virtual targets by the TomoTherapy Physicist Station and were scanned with the TomoTherapy megavoltage image-guided radiotherapy (IGRT) system using six groups of options: three acquisition pitches (APs) of 'fine', 'normal' and 'coarse', each combined with two corresponding reconstruction intervals (RIs). To mimic patient setup variations, each phantom was manually shifted 5 mm in each of the three orthogonal directions. The effect of the MVCT scan options was analyzed in terms of image quality (CT number and noise), adaptive dose calculation deviations and positional correction variations. MVCT scanning time with the 'fine' pitch was approximately twice that of 'normal' and more than three times that of 'coarse', and was not affected by the choice of RI. MVCT with different APs delivered almost identical CT numbers and image noise in the seven selected regions of various densities. DVH curves from adaptive dose calculations with serial MVCT images acquired at different pitches overlapped, and there were no significant differences in the p values for intercept and slope of the emulated spinal cord (p = 0.761 & 0.277), heart (p = 0.984 & 0.978), lungs (p = 0.992 & 0.980), soft tissue (p = 0.319 & 0.951) and bony structures (p = 0.960 & 0.929) between the finest and the coarsest MVCT series. Furthermore, gamma index analysis showed that, compared with the dose distribution calculated on the 'fine' MVCT, only 0.2% or 1.1% of the points analyzed on the 'normal' or 'coarse' MVCT failed the defined gamma criterion. On the chest phantom, all registration errors larger than 1 mm appeared along the superior-inferior axis, which could not be avoided even with the smallest AP and RI
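
    The gamma-index comparison quoted in the results can be illustrated with the simplified one-dimensional sketch below; real TomoTherapy gamma analysis is three-dimensional and runs in the planning system, so the criterion, grid and dose profiles here are placeholders.

        # Simplified 1-D gamma-index sketch (3%/3 mm style criterion) of the kind of
        # comparison quoted in the abstract; doses and grids below are synthetic.
        import numpy as np


        def gamma_pass_rate(dose_ref, dose_eval, spacing_mm, dd=0.03, dta_mm=3.0):
            """Fraction of reference points with gamma <= 1 against the evaluated dose."""
            x = np.arange(len(dose_ref)) * spacing_mm
            norm = dose_ref.max()
            gammas = []
            for xi, di in zip(x, dose_ref):
                dose_term = (dose_eval - di) / (dd * norm)
                dist_term = (x - xi) / dta_mm
                gammas.append(np.sqrt(dose_term ** 2 + dist_term ** 2).min())
            return float(np.mean(np.array(gammas) <= 1.0))


        grid = np.linspace(0, 100, 201)                        # 0.5 mm spacing
        reference = np.exp(-((grid - 50) / 15) ** 2)           # synthetic dose profile
        evaluated = np.exp(-((grid - 50.6) / 15) ** 2) * 1.01  # slightly shifted/scaled
        print(f"gamma pass rate: {gamma_pass_rate(reference, evaluated, 0.5):.3f}")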

  12. Development of multidisciplinary practical lessons through research-action methodology in the faculties of computer science and educational psychology

    OpenAIRE

    Pertegal-Felices, María Luisa; Navarro Soria, Ignasi; Jimeno-Morenilla, Antonio; Gil, David

    2010-01-01

    Computer science studies have a strong multidisciplinary vocation; most graduates do their professional work outside a purely computing environment, in collaboration with professionals from many different areas. However, the training offered in computer science studies lacks that multidisciplinary dimension, focusing more on purely technical aspects. The campus, a place where studies of very different nature exist side by side, may constitute an excellent basis for conducting multidisciplinary trainin...

  13. Implications of the Integration of Computing Methodologies into Conventional Marketing Research upon the Quality of Students' Understanding of the Concept

    Science.gov (United States)

    Ayman, Umut; Serim, Mehmet Cenk

    2004-01-01

    It has been an ongoing concern among academicians teaching social sciences to develop a better methodology to ease students' understanding. Since verbal emphasis is at the core of the concepts within such disciplines, it has been observed that the adequate or desired level of conceptual understanding required for students to transform the theories…

  14. Implementing the flipped classroom methodology to the subject "Applied computing" of the chemical engineering degree at the University of Barcelona

    Directory of Open Access Journals (Sweden)

    Montserrat Iborra

    2017-06-01

    Full Text Available This work focuses on the implementation, development, documentation, analysis and assessment of the flipped classroom methodology, by means of a just-in-time teaching strategy, in a pilot group (one of six) of the subject “Applied Computing” of the Chemical Engineering Undergraduate Degree of the University of Barcelona. The results show that this technique promotes self-learning, autonomy and time management, as well as an increase in the effectiveness of classroom hours.

  15. A Design Science Research Methodology for Developing a Computer-Aided Assessment Approach Using Method Marking Concept

    Science.gov (United States)

    Genemo, Hussein; Miah, Shah Jahan; McAndrew, Alasdair

    2016-01-01

    Assessment has been defined as an authentic method that plays an important role in evaluating students' learning attitude in acquiring lifelong knowledge. Traditional methods of assessment including the Computer-Aided Assessment (CAA) for mathematics show limited ability to assess students' full work unless multi-step questions are sub-divided…

  16. A new decomposition-based computer-aided molecular/mixture design methodology for the design of optimal solvents and solvent mixtures

    DEFF Research Database (Denmark)

    Karunanithi, A.T.; Achenie, L.E.K.; Gani, Rafiqul

    2005-01-01

    This paper presents a novel computer-aided molecular/mixture design (CAMD) methodology for the design of optimal solvents and solvent mixtures. The molecular/mixture design problem is formulated as a mixed integer nonlinear programming (MINLP) model in which a performance objective is to be optimized subject to structural, property, and process constraints. The general molecular/mixture design problem is divided into two parts. For optimal single-compound design, the first part is solved. For mixture design, the single-compound design is first carried out to identify candidates and then the second part is solved to determine the optimal mixture. The decomposition of the CAMD MINLP model into relatively easy to solve subproblems is essentially a partitioning of the constraints from the original set. This approach is illustrated through two case studies. The first case study involves...

  17. Methodology and computational framework used for the US Department of Energy Environmental Restoration and Waste Management Programmatic Environmental Impact Statement accident analysis

    International Nuclear Information System (INIS)

    Mueller, C.; Roglans-Ribas, J.; Folga, S.; Huttenga, A.; Jackson, R.; TenBrook, W.; Russell, J.

    1994-01-01

    A methodology, computational framework, and integrated PC-based database have been developed to assess the risks of facility accidents in support of the US Department of Energy (DOE) Environmental Restoration and Waste Management Programmatic Environmental Impact Statement. The methodology includes the following interrelated elements: (1) screening of storage and treatment processes and related waste inventories to determine risk-dominant facilities across the DOE complex, (2) development and frequency estimation of the risk-dominant sequences of accidents, and (3) determination of the evolution of and final compositions of radiological or chemically hazardous source terms predicted to be released as a function of the storage inventory or treatment process throughput. The computational framework automates these elements to provide source term input for the second part of the analysis which includes (1) development or integration of existing site-specific demographics and meteorological data and calculation of attendant unit-risk factors and (2) assessment of the radiological or toxicological consequences of accident releases to the general public and to the occupational work force

  18. Analysis of self-reported versus biomarker based smoking prevalence: methodology to compute corrected smoking prevalence rates.

    Science.gov (United States)

    Jain, Ram B

    2017-07-01

    Prevalence of smoking is needed to estimate the need for future public health resources. The aim was to compute and compare smoking prevalence rates using self-reported smoking status, two serum cotinine (SCOT) based biomarker methods, and one urinary 4-(methylnitrosamino)-1-(3-pyridyl)-1-butanol (NNAL) based biomarker method, and then to use these estimates to develop correction factors applicable to self-reported prevalences to arrive at corrected smoking prevalence rates. Data from the National Health and Nutrition Examination Survey (NHANES) for 2007-2012 for those aged ≥20 years (N = 16826) were used. The self-reported prevalence rate for the total population was 21.6% when computed as the weighted number of self-reported smokers divided by the weighted number of all participants, and 24% when computed as the weighted number of self-reported smokers divided by the weighted number of self-reported smokers and nonsmokers. The corrected prevalence rate was found to be 25.8%. A 1% underestimate in smoking prevalence is equivalent to failing to identify 2.2 million smokers in the US in a given year. This underestimation, if not corrected, could lead to a serious gap in the public health services available and needed to provide adequate preventive and corrective treatment to smokers.
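
    The weighted prevalence definitions being compared reduce to a few lines of arithmetic; the sketch below uses invented weights and smoking statuses purely to show the computation, not NHANES data.

        # Toy illustration of the weighted prevalence definitions compared in the
        # abstract; survey weights and statuses are invented, and the correction
        # factor is shown schematically rather than derived from NHANES data.
        import numpy as np

        rng = np.random.default_rng(42)
        n = 1000
        weights = rng.uniform(0.5, 3.0, n)                     # survey sampling weights
        self_report = rng.random(n) < 0.20                     # self-reported smokers
        biomarker = self_report | (rng.random(n) < 0.05)       # biomarker-confirmed

        # Weighted self-reported prevalence: weighted smokers / weighted participants.
        p_self = weights[self_report].sum() / weights.sum()

        # Biomarker-based prevalence computed the same way.
        p_bio = weights[biomarker].sum() / weights.sum()

        # A multiplicative correction factor that could be applied to self-reports.
        correction = p_bio / p_self
        print(f"self-reported {p_self:.3f}, biomarker {p_bio:.3f}, "
              f"correction factor {correction:.2f}")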

  19. Computer aided vertebral visualization and analysis: a methodology using the sand rat, a small animal model of disc degeneration

    Directory of Open Access Journals (Sweden)

    Hanley Edward N

    2003-03-01

    Full Text Available Abstract Background The purpose of this study is to present an automated system that analyzes digitized x-ray images of small animal spines, identifying the effects of disc degeneration. The age-related disc and spine degeneration that occurs in the sand rat (Psammomys obesus) has previously been documented radiologically; selected representative radiographs with age-related changes were used here to develop computer-assisted vertebral visualization/analysis techniques. The techniques presented here have the potential to produce quantitative algorithms that create more accurate and informative measurements in a time-efficient manner. Methods Signal and image processing techniques were applied to digitized spine x-ray images: the spine was segmented, and its orientation and curvature were determined. The image was segmented based on orientation changes of the spine; edge detection was performed to define vertebral boundaries. Once vertebrae were identified, a number of measures were introduced and calculated to retrieve information on vertebral separation/orientation and sclerosis. Results A method is described which produces computer-generated quantitative measurements of vertebrae and disc spaces. Six sand rat spine radiographs illustrate applications of this technique. Results showed that this method can successfully automate the calculation and analysis of vertebral length, vertebral spacing and vertebral angle, and can score sclerosis. The techniques also provide a quantitative means to explore the relation between age and vertebral shape. Conclusions This method provides a computationally efficient system to analyze spinal changes during aging. The techniques can be used to automate the quantitative processing of vertebral radiographic images and may be applicable to human and other animal radiologic models of the aging/degenerating spine.
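
    A rough sketch of an edge-based vertebral measurement step of the kind described (boundary detection followed by spacing measurements) is shown below using scikit-image; the file name, thresholds and grouping rule are assumptions rather than the authors' algorithm.

        # Rough sketch of an edge-based vertebral measurement pipeline in the spirit
        # of the abstract, using scikit-image. The file name is a placeholder and the
        # "vertebral spacing" here is only the distance between detected edge rows.
        import numpy as np
        from skimage import filters, io
        from skimage.feature import canny

        image = io.imread("sand_rat_spine.png", as_gray=True)   # hypothetical radiograph

        # Smooth, then detect vertebral boundaries as strong edges.
        smoothed = filters.gaussian(image, sigma=2)
        edges = canny(smoothed, sigma=2)

        # Rows whose edge density exceeds a threshold are treated as vertebral borders.
        edge_profile = edges.sum(axis=1)
        border_rows = np.where(edge_profile > 0.5 * edge_profile.max())[0]

        # Group adjacent rows into distinct borders and report spacing in pixels.
        breaks = np.where(np.diff(border_rows) > 5)[0]
        borders = [seg.mean() for seg in np.split(border_rows, breaks + 1)]
        print("estimated vertebral borders (rows):", np.round(borders, 1))
        print("spacings (pixels):", np.round(np.diff(borders), 1))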

  20. Strategies and methodologies to develop techniques for computer-assisted analysis of gas phase formation during altitude decompression

    Science.gov (United States)

    Powell, Michael R.; Hall, W. A.

    1993-01-01

    It would be of operational significance if one possessed a device that would indicate the presence of gas phase formation in the body during hypobaric decompression. Automated analysis of Doppler gas bubble signals has been attempted for two decades but with generally unfavorable results, except with surgically implanted transducers. Recently, efforts have intensified with the introduction of low-cost computer programs. Current NASA work is directed towards the development of a computer-assisted method specifically targeted to EVA, and we are most interested in Spencer Grade 4. We note that Spencer Doppler Grades 1 to 3 show increases in the FFT sonogram and spectrogram in the amplitude domain, and the frequency content is sometimes increased over that created by the normal blood flow envelope. The amplitude perturbations are of very short duration, in both systole and diastole and at random temporal positions. Grade 4 is characteristic in the amplitude domain but with modest increases in the FFT sonogram and spectral frequency power from 2 to 4 kHz over the whole cardiac cycle. Heart valve motion appears to display characteristic signals: (1) the demodulated Doppler signal amplitude is considerably above the Doppler-shifted blood flow signal (even Grade 4); and (2) demodulated Doppler frequency shifts are considerably greater (often several kHz) than the upper edge of the blood flow envelope. Knowledge of these facts will aid in the construction of a real-time, computer-assisted discriminator to eliminate cardiac motion artifacts. There could also exist perturbations in the following: (1) modifications of the pattern of blood flow in accordance with Poiseuille's Law, (2) flow changes with a change in the Reynolds number, (3) an increase in the pulsatility index, and/or (4) diminished diastolic flow or 'runoff.' Doppler ultrasound devices have been constructed with a three-transducer array and a pulsed frequency generator.
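
    A hedged sketch of the discrimination idea, flagging short broadband events in the 2-4 kHz band while rejecting high-amplitude low-frequency valve signatures, is given below on a synthetic signal; the thresholds and frequency bands are illustrative only.

        # Illustrative sketch of flagging candidate bubble events versus valve motion
        # in a Doppler recording using a spectrogram. The signal is synthetic and the
        # thresholds are arbitrary examples, not a validated discriminator.
        import numpy as np
        from scipy.signal import spectrogram

        fs = 20_000                                  # sampling rate, Hz
        t = np.arange(0, 2.0, 1 / fs)
        blood_flow = 0.3 * np.sin(2 * np.pi * 800 * t)              # baseline flow
        valve = 2.0 * np.sin(2 * np.pi * 300 * t) * (np.abs(t % 1.0 - 0.1) < 0.02)
        bubble = 0.8 * np.sin(2 * np.pi * 3000 * t) * (np.abs(t - 1.3) < 0.01)
        noise = 0.05 * np.random.default_rng(0).normal(size=t.size)
        doppler = blood_flow + valve + bubble + noise

        f, times, Sxx = spectrogram(doppler, fs=fs, nperseg=512)

        # Valve artifacts: strong power concentrated below ~500 Hz.
        valve_power = Sxx[f < 500].sum(axis=0)
        # Candidate bubble events: elevated power in the 2-4 kHz band.
        bubble_power = Sxx[(f >= 2000) & (f <= 4000)].sum(axis=0)

        for i, ts in enumerate(times):
            if (bubble_power[i] > 10 * bubble_power.mean()
                    and valve_power[i] < 5 * valve_power.mean()):
                print(f"possible bubble signal near t = {ts:.2f} s")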

  1. Computer vision system approach in colour measurements of foods: Part II. validation of methodology with real foods

    Directory of Open Access Journals (Sweden)

    Fatih TARLAK

    2016-01-01

    Full Text Available Abstract The colour of food is one of the most important factors affecting consumers’ purchasing decisions. Although there are many colour spaces, the most widely used colour space in the food industry is the L*a*b* colour space. Conventionally, the colour of foods is analysed with a colorimeter that measures small and non-representative areas of the food, and the measurements usually vary depending on the point where the measurement is taken. This has led to the development of alternative colour analysis techniques. In this work, a simple and alternative method to measure the colour of foods, known as the “computer vision system”, is presented and justified. With the aid of the computer vision system, foods that are homogeneous and uniform in colour and shape can be classified with regard to their colours in a fast, inexpensive and simple way. This system can also be used to distinguish defective items from non-defective ones. Quality parameters of meat and dairy products can be monitored without the physical contact that causes contamination during sampling.
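
    A minimal computer vision colour measurement in this spirit can be written with scikit-image, as sketched below; the image file, region of interest and the absence of explicit illumination calibration are simplifying assumptions.

        # Minimal computer-vision colour measurement sketch: average CIELAB values
        # over a region of a photograph. The file name and region are placeholders;
        # a real system also needs careful illumination and colour calibration.
        from skimage import color, io

        image = io.imread("food_sample.jpg")              # hypothetical RGB photograph
        lab = color.rgb2lab(image[:, :, :3])              # convert to L*a*b*

        region = lab[100:300, 200:400]                    # region of interest (pixels)
        L, a, b = region[..., 0].mean(), region[..., 1].mean(), region[..., 2].mean()
        print(f"mean colour of region: L* = {L:.1f}, a* = {a:.1f}, b* = {b:.1f}")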

  2. Proposal of a methodology for computing damages from flexo-rotative fatigue considering the theory of the acting average stresses

    International Nuclear Information System (INIS)

    Tanius Rodrigues Mansur; Alvaro Alvarenga Junior; Joao Mario Andrade Pinto; Wellington Antonio Soares; Ernani Sales Palma

    2005-01-01

    The useful life of metallic structures is often governed by fatigue processes caused by vibrations or by the application of dynamic loads, periodic or not. The amplitude of the alternating stress applied to a structural component can often vary during its useful life. In this situation S-N-P curves cannot be used directly, because they are generated on the basis of alternating stresses of constant amplitude. Several theories, some deterministic and some probabilistic, have been developed over the years to give the component designer a more efficient and correct tool for approaching the problem; they are known as theories of damage accumulation. The damage phenomenon represents the generation of surface discontinuities caused by micro-cracks or by volumetric cavities [Lemaitre and Chaboche, 1985]. In continuum mechanics, the surface density of micro-defects (micro-cracks or cavities) inside the shear plane of the representative volume element of the sample is the variable used to quantify the damage. A methodology for evaluating damage accumulation using acting average stresses is proposed in this paper. Results obtained for a 50% failure probability are presented and compared to values obtained using the theories of Palmgren-Miner, Henry, Corten-Dolan, Marine, Manson, and Knee-point. (authors)
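
    As a baseline for the damage accumulation theories listed above, the sketch below applies the linear Palmgren-Miner rule against a Basquin-type S-N curve; the S-N constants and load spectrum are illustrative values, not the paper's data.

        # Sketch of linear (Palmgren-Miner) damage accumulation against a Basquin-type
        # S-N curve. The constants and load spectrum are illustrative values only.
        def cycles_to_failure(stress_amplitude, sf=900.0, b=-0.1):
            """Basquin relation: S = sf * (2N)^b, solved for N."""
            return 0.5 * (stress_amplitude / sf) ** (1.0 / b)


        def miner_damage(load_blocks):
            """Sum n_i / N_i over blocks of (stress amplitude [MPa], applied cycles)."""
            return sum(n / cycles_to_failure(s) for s, n in load_blocks)


        spectrum = [(400.0, 500), (300.0, 5_000), (200.0, 50_000)]   # (MPa, cycles)
        damage = miner_damage(spectrum)
        print(f"accumulated damage D = {damage:.2f} "
              f"({'failure predicted' if damage >= 1.0 else 'below failure criterion'})")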

  3. Compression of Born ratio for fluorescence molecular tomography/x-ray computed tomography hybrid imaging: methodology and in vivo validation.

    Science.gov (United States)

    Mohajerani, Pouyan; Ntziachristos, Vasilis

    2013-07-01

    The 360° rotation geometry of the hybrid fluorescence molecular tomography/x-ray computed tomography modality allows for the acquisition of very large datasets, which pose numerical limitations on the reconstruction. We propose a compression method that takes advantage of the correlation of the Born-normalized signal among sources in spatially formed clusters to reduce the size of the system model. The proposed method has been validated using an ex vivo study and an in vivo study of a nude mouse with a subcutaneous 4T1 tumor, with and without the inclusion of a priori anatomical information. Compression rates of up to two orders of magnitude with minimal distortion of the reconstruction have been demonstrated, resulting in a large reduction in weight matrix size and reconstruction time.
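
    A very simplified illustration of the underlying idea, exploiting spatial correlation among sources to shrink a source-indexed system matrix, is sketched below by clustering source positions and averaging the corresponding rows; this is not the paper's transform, and all dimensions are synthetic.

        # Very simplified illustration of compressing a source-indexed weight matrix
        # by averaging rows whose sources fall in the same spatial cluster. Data and
        # dimensions are synthetic; the paper's actual transform is not reproduced.
        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(7)
        n_sources, n_voxels = 360, 5000
        source_positions = rng.uniform(0, 30, size=(n_sources, 3))     # mm
        weight_matrix = rng.normal(size=(n_sources, n_voxels))         # system model

        n_clusters = 36                                                # ~10x compression
        labels = KMeans(n_clusters=n_clusters, n_init=10,
                        random_state=0).fit_predict(source_positions)

        compressed = np.vstack([weight_matrix[labels == k].mean(axis=0)
                                for k in range(n_clusters)])
        print(f"weight matrix reduced from {weight_matrix.shape} to {compressed.shape}")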

  4. Logging into the Field—Methodological Reflections on Ethnographic Research in a Pluri-Local and Computer-Mediated Field

    Directory of Open Access Journals (Sweden)

    Heike Mónika Greschke

    2007-09-01

    Full Text Available This article aims to introduce an ethnic group inhabiting a common virtual space in the World Wide Web (WWW, while being physically located in different socio-geographical contexts. Potentially global in its geographical extent, this social formation is constituted by means of interrelating virtual-global dimensions with physically grounded parts of the actors' lifeworlds. In addition, the communities' social life relies on specific communicative practices joining mediated forms of communication with co-presence based encounters. Ethnographic research in a pluri-local and computer-mediated field poses a set of problems which demand thorough reflection as well as a search for creative solutions. How can the boundaries of the field be determined? What does "being there" signify in such a case? Is it possible to enter the field while sitting at my own desk, just by visiting the respective site in the WWW, simply observing the communication going on without even being noticed by the subjects in the field? Or does "being in the field" imply that I ought to turn into a member of the studied community? Am I supposed to effectively live with the others for a while? And then, what can "living together" actually mean in that case? Will I learn enough about the field simply by participating in its virtual activities? Or do I have to account for the physically grounded dimensions of the actors' lifeworlds, as well? Ethnographic research in a pluri-local and computer-mediated field in practice raises a lot of questions regarding the ways of entering the field and being in the field. Some of them will be discussed in this paper by means of reflecting research experiences gained in the context of a recently concluded case study. URN: urn:nbn:de:0114-fqs0703321

  5. Automation of the computational programs and codes used in the methodology of neutronic and thermohydraulic calculation for the IEA-R1 nuclear reactor

    International Nuclear Information System (INIS)

    Stefani, Giovanni Laranjo de

    2009-01-01

    This work describes the development of a computational program for executing the various programs of the neutronic and thermal-hydraulic calculation methodology for the IEA-R1 reactor (Sao Paulo, Brazil), making the process more practical and safe and turning the handling of each program's output data into an automatic process. This reactor is largely used for the production of radioisotopes for medical use, material irradiation, personnel training and also for basic research. For these purposes it is necessary to change its core configuration in order to adapt the reactor to different uses. The work transforms various existing programs into subroutines of a main program, i.e., a program which calls each of them automatically when necessary, and creates further programs for manipulating the output data, thereby making the process practical.
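
    Conceptually, the driver described here chains existing codes and collects their outputs; a generic sketch of such a driver is shown below, with placeholder program names and input files rather than the actual IEA-R1 codes.

        # Conceptual sketch of a driver that runs a chain of legacy calculation codes
        # in order and collects each one's output. Program names, input files and the
        # directory layout are placeholders, not the actual IEA-R1 codes.
        import subprocess
        from pathlib import Path

        PIPELINE = [
            ("cross_sections",   ["./gen_xs", "core_config.inp"]),
            ("neutronics",       ["./neutronics", "xs_library.dat"]),
            ("thermohydraulics", ["./thermo", "power_map.dat"]),
        ]


        def run_pipeline(workdir="run_001"):
            out_dir = Path(workdir)
            out_dir.mkdir(exist_ok=True)
            for name, command in PIPELINE:
                result = subprocess.run(command, capture_output=True, text=True)
                (out_dir / f"{name}.log").write_text(result.stdout + result.stderr)
                if result.returncode != 0:
                    raise RuntimeError(f"step '{name}' failed, see {name}.log")
            print(f"all steps finished; logs collected in {out_dir}/")


        if __name__ == "__main__":
            run_pipeline()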

  6. Methodology for the computational simulation of the components in photovoltaic systems; Desarrollo de herramientas para la prediccion del comportamiento de sistemas fotovoltaicos

    Energy Technology Data Exchange (ETDEWEB)

    Galimberti, P.; Arcuri, G.; Manno, R.; Fasulo, A. J.

    2004-07-01

    This work presents a methodology for the computational simulation of the components that comprise photovoltaic systems, in order to study the behavior of each component and its relevance to the operation of the whole system, which would allow decisions to be made in the selection and improvement of these components. As a result of the simulation, files with values of the different variables which characterize the behaviour of the components are obtained. Different kinds of plots can be drawn, showing the information in summarized form. Finally, the results are discussed by comparison with measured data for the city of Rio Cuarto in Argentina (33.1° South latitude), and some advantages of the proposed method are mentioned. (Author)

  7. Development of effect assessment methodology for the deployment of fast reactor cycle system with dynamic computable general equilibrium model

    International Nuclear Information System (INIS)

    Shiotani, Hiroki; Ono, Kiyoshi

    2009-01-01

    The Global Trade Analysis Project (GTAP) model is a widely used computable general equilibrium (CGE) model developed by Purdue University. Although the GTAP-E, an energy-environmental version of the GTAP model, is useful for surveying the energy-economy-environment-trade linkage in economic policy analysis, it does not have a decomposed model of the electricity sector and its analyses are comparative-static. In this study, a recursive dynamic CGE model with a detailed electricity technology bundle, including nuclear power generation with FR, was developed based on the GTAP-E to evaluate the long-term socioeconomic effects of FR deployment. Capital stock changes caused by international investments and some dynamic constraints of FR deployment and operation (e.g., load-following capability and plutonium mass balance) were incorporated in the analyses. The long-term socioeconomic effects resulting from the deployment of economically competitive FRs with innovative technologies can thus be assessed; the cumulative effect of FR deployment on GDP calculated using this model amounted to over 40 trillion yen in Japan and 400 trillion yen worldwide, several times more than the effect calculated using the conventional cost-benefit analysis tool, because of ripple effects and energy substitutions, among others. (author)

  8. Human computer interactions in next-generation of aircraft smart navigation management systems: task analysis and architecture under an agent-oriented methodological approach.

    Science.gov (United States)

    Canino-Rodríguez, José M; García-Herrero, Jesús; Besada-Portas, Juan; Ravelo-García, Antonio G; Travieso-González, Carlos; Alonso-Hernández, Jesús B

    2015-03-04

    The limited efficiency of current air traffic systems will require a next generation of Smart Air Traffic System (SATS) that relies on current technological advances. This challenge means a transition toward a new navigation and air-traffic procedures paradigm, where pilots and air traffic controllers perform and coordinate their activities according to new roles and technological supports. The design of new Human-Computer Interactions (HCI) for performing these activities is a key element of SATS. However, efforts to develop such tools need to be informed by a parallel characterization of hypothetical air traffic scenarios compatible with current ones. This paper focuses on airborne HCI within SATS, where cockpit inputs come from aircraft navigation systems, the surrounding traffic situation, controllers' indications, etc. The HCI is thus intended to enhance situation awareness and decision-making in the pilot's cockpit. This work considers SATS as a large-scale distributed system subject to uncertainty in a dynamic environment. Therefore, a multi-agent systems based approach is well suited for modeling such an environment. We demonstrate that current methodologies for designing multi-agent systems are a useful tool to characterize HCI. We specifically illustrate how the selected methodological approach provides enough guidelines to obtain a cockpit HCI design that complies with future SATS specifications.

  9. Human Computer Interactions in Next-Generation of Aircraft Smart Navigation Management Systems: Task Analysis and Architecture under an Agent-Oriented Methodological Approach

    Science.gov (United States)

    Canino-Rodríguez, José M.; García-Herrero, Jesús; Besada-Portas, Juan; Ravelo-García, Antonio G.; Travieso-González, Carlos; Alonso-Hernández, Jesús B.

    2015-01-01

    The limited efficiency of current air traffic systems will require a next generation of Smart Air Traffic System (SATS) that relies on current technological advances. This challenge means a transition toward a new navigation and air-traffic procedures paradigm, where pilots and air traffic controllers perform and coordinate their activities according to new roles and technological supports. The design of new Human-Computer Interactions (HCI) for performing these activities is a key element of SATS. However, efforts to develop such tools need to be informed by a parallel characterization of hypothetical air traffic scenarios compatible with current ones. This paper focuses on airborne HCI within SATS, where cockpit inputs come from aircraft navigation systems, the surrounding traffic situation, controllers’ indications, etc. The HCI is thus intended to enhance situation awareness and decision-making in the pilot's cockpit. This work considers SATS as a system distributed on a large scale with uncertainty in a dynamic environment. Therefore, a multi-agent systems based approach is well suited for modeling such an environment. We demonstrate that current methodologies for designing multi-agent systems are a useful tool to characterize HCI. We specifically illustrate how the selected methodological approach provides enough guidelines to obtain a cockpit HCI design that complies with future SATS specifications. PMID:25746092

  10. Human Computer Interactions in Next-Generation of Aircraft Smart Navigation Management Systems: Task Analysis and Architecture under an Agent-Oriented Methodological Approach

    Directory of Open Access Journals (Sweden)

    José M. Canino-Rodríguez

    2015-03-01

    Full Text Available The limited efficiency of current air traffic systems will require a next generation of Smart Air Traffic System (SATS) that relies on current technological advances. This challenge means a transition toward a new navigation and air-traffic procedures paradigm, where pilots and air traffic controllers perform and coordinate their activities according to new roles and technological supports. The design of new Human-Computer Interactions (HCI) for performing these activities is a key element of SATS. However, efforts to develop such tools need to be informed by a parallel characterization of hypothetical air traffic scenarios compatible with current ones. This paper focuses on airborne HCI within SATS, where cockpit inputs come from aircraft navigation systems, the surrounding traffic situation, controllers’ indications, etc. The HCI is thus intended to enhance situation awareness and decision-making in the pilot's cockpit. This work considers SATS as a system distributed on a large scale with uncertainty in a dynamic environment. Therefore, a multi-agent systems based approach is well suited for modeling such an environment. We demonstrate that current methodologies for designing multi-agent systems are a useful tool to characterize HCI. We specifically illustrate how the selected methodological approach provides enough guidelines to obtain a cockpit HCI design that complies with future SATS specifications.

  11. A DFT-Based Computational-Experimental Methodology for Synthetic Chemistry: Example of Application to the Catalytic Opening of Epoxides by Titanocene.

    Science.gov (United States)

    Jaraíz, Martín; Enríquez, Lourdes; Pinacho, Ruth; Rubio, José E; Lesarri, Alberto; López-Pérez, José L

    2017-04-07

    A novel DFT-based Reaction Kinetics (DFT-RK) simulation approach, employed in combination with real-time data from reaction monitoring instrumentation (like UV-vis, FTIR, Raman, and 2D NMR benchtop spectrometers), is shown to provide a detailed methodology for the analysis and design of complex synthetic chemistry schemes. As an example, it is applied to the opening of epoxides by titanocene in THF, a catalytic system with abundant experimental data available. Through a DFT-RK analysis of real-time IR data, we have developed a comprehensive mechanistic model that opens new perspectives to understand previous experiments. Although derived specifically from the opening of epoxides, the prediction capabilities of the model, built on elementary reactions, together with its practical side (reaction kinetics simulations of real experimental conditions) make it a useful simulation tool for the design of new experiments, as well as for the conception and development of improved versions of the reagents. From the perspective of the methodology employed, because both the computational (DFT-RK) and the experimental (spectroscopic data) components can follow the time evolution of several species simultaneously, it is expected to provide a helpful tool for the study of complex systems in synthetic chemistry.
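
    The reaction-kinetics half of a DFT-RK style analysis amounts to integrating rate equations and comparing the predicted traces with time-resolved spectroscopic data; the sketch below does this for an assumed two-step first-order scheme with invented rate constants and synthetic data.

        # Hedged sketch of the "reaction kinetics" half of a DFT-RK style analysis:
        # integrate rate equations for an assumed two-step scheme and compare the
        # predicted trace with (synthetic) time-resolved spectroscopic data. The
        # mechanism and rate constants are illustrative, not the paper's fitted model.
        import numpy as np
        from scipy.integrate import solve_ivp

        k1, k2 = 0.8, 0.25     # placeholder rate constants (1/s), e.g. from DFT barriers


        def rhs(t, y):
            """A -> B -> C with first-order steps."""
            a, b, c = y
            return [-k1 * a, k1 * a - k2 * b, k2 * b]


        t_obs = np.linspace(0, 20, 50)                  # times of spectroscopic samples
        sol = solve_ivp(rhs, (0, 20), [1.0, 0.0, 0.0], t_eval=t_obs)

        # Synthetic "measured" intermediate concentration with noise, for comparison.
        rng = np.random.default_rng(3)
        measured_b = sol.y[1] + rng.normal(scale=0.02, size=t_obs.size)
        rms = np.sqrt(np.mean((sol.y[1] - measured_b) ** 2))
        print(f"RMS deviation between model and trace: {rms:.3f}")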

  12. Potential and limitations of X-Ray micro-computed tomography in arthropod neuroanatomy: A methodological and comparative survey

    Science.gov (United States)

    Sombke, Andy; Lipke, Elisabeth; Michalik, Peter; Uhl, Gabriele; Harzsch, Steffen

    2015-01-01

    Classical histology or immunohistochemistry combined with fluorescence or confocal laser scanning microscopy are common techniques in arthropod neuroanatomy, and these methods often require time-consuming and difficult dissections and sample preparations. Moreover, these methods are prone to artifacts due to compression and distortion of tissues, which often result in information loss and especially affect the spatial relationships of the examined parts of the nervous system in their natural anatomical context. Noninvasive approaches such as X-ray micro-computed tomography (micro-CT) can overcome such limitations and have been shown to be a valuable tool for understanding and visualizing internal anatomy and structural complexity. Nevertheless, knowledge about the potential of this method for analyzing the anatomy and organization of nervous systems, especially of taxa with smaller body size (e.g., many arthropods), is limited. This study set out to analyze the brains of selected arthropods with micro-CT, and to compare these results with available histological and immunohistochemical data. Specifically, we explored the influence of different sample preparation procedures. Our study shows that micro-CT is highly suitable for analyzing arthropod neuroarchitecture in situ and allows specific neuropils to be distinguished within the brain to extract quantitative data such as neuropil volumes. Moreover, data acquisition is considerably faster compared with many classical histological techniques. Thus, we conclude that micro-CT is highly suitable for targeting neuroanatomy, as it reduces the risk of artifacts and is faster than classical techniques. J. Comp. Neurol. 523:1281–1295, 2015. © 2015 Wiley Periodicals, Inc. PMID:25728683

  13. Methodologies, languages and tools

    International Nuclear Information System (INIS)

    Amako, Katsuya

    1994-01-01

    This is a summary of the "Methodologies, Languages and Tools" session at the CHEP'94 conference. All the contributions on methodologies and languages are relevant to the object-oriented approach. Other topics presented are related to various software tools in the down-sized computing environment

  14. Multidimensional Space-Time Methodology for Development of Planetary and Space Sciences, S-T Data Management and S-T Computational Tomography

    Science.gov (United States)

    Andonov, Zdravko

    This R&D represents an innovative multidimensional 6D-N(6n)D Space-Time (S-T) Methodology, 6D-6nD Coordinate Systems, 6D Equations, and a new 6D strategy and technology for the development of Planetary and Space Sciences, S-T Data Management and S-T Computational Tomography... The Methodology is relevant for brand-new RS Microwave Satellites and Computational Tomography Systems development, aimed at defending sustainable Earth, Moon, and Sun System evolution. Especially important are innovations for monitoring and protection of the strategic trilateral system H-OH-H2O (Hydrogen, Hydroxyl and Water), corresponding to RS VHRS (Very High Resolution Systems) of 1.420-1.657-22.089 GHz microwaves... One of the greatest paradoxes and challenges of world science is the "transformation" of the J. L. Lagrange 4D Space-Time (S-T) System to the H. Minkowski 4D S-T System (O-X,Y,Z,icT) for Einstein's "Theory of Relativity". As a global result, contemporary advanced space sciences lack an adequate 4D-6D Space-Time Coordinate System and a 6D advanced cosmos strategy and methodology for multidimensional and multitemporal Space-Time Data Management and Tomography... That is one of the top current S-T problems. Discovery of a simple and optimal nD S-T Methodology is extremely important for all universities' space sciences education programs, for advances in space research and especially for all young space scientists' R&D. The top ten 21st-century challenges ahead of Planetary and Space Sciences, Space Data Management and Computational Space Tomography, important for the successful development of young scientist generations, are the following: 1. R&D of W. R. Hamilton's general idea for transforming all space sciences to time sciences, beginning with the 6D eikonal for 6D anisotropic media and velocities; development of IERS Earth and Space Systems (VLBI, LLR, GPS, SLR, DORIS, etc.) for planetary-space data management and computational planetary and space tomography. 2. R&D of S. W. Hawking's paradigm for 2D

  15. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  16. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  17. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies, ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS-specific tests to be included in Service Availa...

  18. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity for discussing the impact of, and addressing issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  19. An integrated methodological approach to the computer-assisted gas chromatographic screening of basic drugs in biological fluids using nitrogen selective detection.

    Science.gov (United States)

    Dugal, R; Massé, R; Sanchez, G; Bertrand, M J

    1980-01-01

    This paper presents the methodological aspects of a computerized system for the gas-chromatographic screening and primary identification of central nervous system stimulants and narcotic analgesics (including some of their respective metabolites) extracted from urine. The operating conditions of a selective nitrogen detector for optimized analytical functions are discussed, particularly the effect of carrier and fuel gas on the detector's sensitivity to nitrogen-containing molecules and discriminating performance toward biological matrix interferences. Application of simple extraction techniques, combined with rapid derivatization procedures, computer data acquisition, and reduction of chromatographic data are presented. Results show that this system approach allows for the screening of several drugs and their metabolites in a short amount of time. The reliability and stability of the system have been tested by analyzing several thousand samples for doping control at major international sporting events and for monitoring drug intake in addicts participating in a rehabilitation program. Results indicate that these techniques can be used and adapted to many different analytical toxicology situations.

  20. Methodology of the design of an integrated telecommunications and computer network in a control information system for artillery battalion fire support

    Directory of Open Access Journals (Sweden)

    Slobodan M. Miletić

    2012-04-01

    Full Text Available A Command Information System (CIS) in a broader sense can be defined as a set of hardware and software solutions by which one achieves real-time integration of organizational structures, doctrine, technical and technological systems and facilities, information flows and processes, for efficient and rational decision-making and functioning. The time distribution and quality of information directly affect the implementation of the decision-making process and the criteria for evaluating the effectiveness of the system, in which the most important role is played by an integrated telecommunications and computer network (ITCN), dimensioned to the spatial distribution of tactical combat units and connecting all the elements in a communications unit. The aim is to establish a design methodology for the ITCN: to conduct the necessary analysis and extract all the elements required for modeling, map them to the elements of the network infrastructure, and then analyze them from the perspective of telecommunications standards and the parameters of the layers of the OSI network model. A relevant way to verify the designed ITCN model is the development of a simulation model with which adequate results can be obtained. Conclusions on compliance with tactical combat and tactical communication requirements are drawn on the basis of these results.

  1. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  2. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  3. COMPUTING

    CERN Multimedia

    M. Kasemann, P. McBride; edited by M-C. Sawley with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini and M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  4. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  5. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  6. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  7. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  8. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  9. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  10. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  11. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  12. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  13. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort during the last period was focused on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing a full-scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  14. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking, the roles of a Computing Run Coordinator and of regular computing shifts, monitoring the services and infrastructure as well as interfacing to the data operations tasks, are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  15. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization and conversion to RAW format, after which the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  16. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity has been lower as the Run 1 samples are being completed and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and on improvements in data access and in the flexibility of using resources. Operations Office Data processing was slow in the second half of 2013, with only the legacy re-reconstruction pass of 2011 data being processed at the sites. Figure 1: MC production and processing was more in demand, with a peak of over 750 million GEN-SIM events in a single month. Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week, with peaks close to 1.2 PB. Figure 3: The volume of data moved between CMS sites in the last six months. The tape utilisation was a focus for the operation teams, with frequent deletion campaigns moving deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, and we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently. Operations Office Figure 2: Number of events per month, for 2012. Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign, with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in the data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and the success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. The GlideInWMS components are now also installed and deployed at CERN, adding to the GlideInWMS factory located in the US. There is a new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  19. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well-functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been instrumental in site commissioning, increasing the number of sites that are available to participate in CSA07 and ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a four-fold increase in throughput with respect to the LCG Resource Broker was observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  20. On methodology

    DEFF Research Database (Denmark)

    Cheesman, Robin; Faraone, Roque

    2002-01-01

    This is an English version of the methodology chapter in the authors' book "El caso Berríos: Estudio sobre información errónea, desinformación y manipulación de la opinión pública".

  1. Steganography: LSB Methodology

    Science.gov (United States)

    2012-08-02

    Progress report on steganography using the LSB (least significant bit) methodology. In computer science, steganography is the science of hiding information within other data. The report refers to the paper by J. Fridrich, M. Gojan and R. Du, "Reliable detection of LSB steganography in grayscale and color images" (in J. Dittmann, K. Nahrstedt, and P. Wohlmacher, editors, Proceedings of the ACM, Special...).
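
    As general background to the LSB technique named in this record (and not as a reconstruction of the report's own methodology), the following minimal Python sketch hides and recovers a short byte string in the least significant bits of a synthetic 8-bit grayscale image; the image, message and function names are illustrative placeholders.

```python
import numpy as np

def lsb_embed(cover: np.ndarray, message: bytes) -> np.ndarray:
    """Hide `message` in the least significant bits of a uint8 image."""
    bits = np.unpackbits(np.frombuffer(message, dtype=np.uint8))
    flat = cover.flatten()
    if bits.size > flat.size:
        raise ValueError("message too long for this cover image")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits   # overwrite only the LSB
    return flat.reshape(cover.shape)

def lsb_extract(stego: np.ndarray, n_bytes: int) -> bytes:
    """Recover `n_bytes` hidden by lsb_embed."""
    bits = stego.flatten()[:n_bytes * 8] & 1
    return np.packbits(bits).tobytes()

cover = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)  # stand-in grayscale image
stego = lsb_embed(cover, b"hidden")
assert lsb_extract(stego, 6) == b"hidden"
```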

  2. Methodological guidelines

    International Nuclear Information System (INIS)

    Halsnaes, K.; Callaway, J.M.; Meyer, H.J.

    1999-01-01

    The guideline document establishes a general overview of the main components of climate change mitigation assessment. This includes an outline of key economic concepts, scenario structure, common assumptions, modelling tools and country study assumptions. The guidelines are supported by Handbook Reports that contain more detailed specifications of calculation standards, input assumptions and available tools. The major objectives of the project have been to provide a methodology, an implementing framework and a reporting system which countries can follow in meeting their future reporting obligations under the FCCC and for GEF enabling activities. The project builds upon the methodology development and application in the UNEP National Abatement Costing Studies (UNEP, 1994a). The various elements provide countries with a road map for conducting climate change mitigation studies and submitting national reports as required by the FCCC. (au) 121 refs

  3. Methodological guidelines

    Energy Technology Data Exchange (ETDEWEB)

    Halsnaes, K.; Callaway, J.M.; Meyer, H.J.

    1999-04-01

    The guideline document establishes a general overview of the main components of climate change mitigation assessment. This includes an outline of key economic concepts, scenario structure, common assumptions, modelling tools and country study assumptions. The guidelines are supported by Handbook Reports that contain more detailed specifications of calculation standards, input assumptions and available tools. The major objectives of the project have been to provide a methodology, an implementing framework and a reporting system which countries can follow in meeting their future reporting obligations under the FCCC and for GEF enabling activities. The project builds upon the methodology development and application in the UNEP National Abatement Costing Studies (UNEP, 1994a). The various elements provide countries with a road map for conducting climate change mitigation studies and submitting national reports as required by the FCCC. (au) 121 refs.

  4. Computational thermal analysis of cylindrical fin design parameters and a new methodology for defining fin structure in LED automobile headlamp cooling applications

    International Nuclear Information System (INIS)

    Sökmen, Kemal Furkan; Yürüklü, Emrah; Yamankaradeniz, Nurettin

    2016-01-01

    Highlights: • In the study, cooling of LED headlamps in automobiles is investigated. • The study is based on free convection cooling of the LED module. • Besides free convection, a Monte Carlo model is used as the radiation model. • A new algorithm is presented for designing the optimum fin structure. • The suggested algorithm for optimum design is verified by various simulations. - Abstract: In this study, the effects of fin design, fin material, and free and forced convection on junction temperature in automotive headlamp cooling applications of LED lights are researched using ANSYS CFX 14 software. Furthermore, a new methodology is presented for defining the optimum cylindrical fin structure within the given limits. To measure the performance of the methodology, analyses are carried out for various ambient temperatures (25 °C, 50 °C and 80 °C) and different LED power dissipations (0.5 W, 0.75 W, 1 W and 1.25 W). Then, analyses are repeated at different heat transfer coefficients and with different fin materials in order to calculate the LED junction temperature and to see whether the fin structure proposed by the methodology is appropriate for staying below the given safety temperature limit. As a result, the suggested method has always proposed proper fin structures with optimum characteristics for the given LED designs. As another result, for safe junction temperature ranges, it is seen that for all LED power dissipations, adding an aluminum or copper plate behind the printed circuit board is sufficient at low ambient temperatures. As the ambient temperature increases, especially in high-powered LED lights, the addition of aluminum is not sufficient and fin usage becomes essential. A high heat transfer coefficient and the use of copper fins affect the junction temperature positively.
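
    The study checks whether a candidate fin structure keeps the LED junction temperature below a safety limit for the ambient temperatures and power dissipations listed above, using full CFD analyses. As a much cruder illustration of that screening step (not the paper's method), the sketch below uses a one-dimensional thermal-resistance estimate; the resistance values and safety limit are invented placeholders, only the ambient temperatures and powers are taken from the abstract.

```python
# Hypothetical junction-to-ambient thermal resistances in K/W (placeholders).
R_JUNCTION_TO_BOARD = 12.0   # LED package + solder + PCB
R_BOARD_TO_AMBIENT = 25.0    # heat sink / fin structure under free convection
T_JUNCTION_MAX = 125.0       # assumed safety limit, deg C

def junction_temperature(t_ambient_c: float, power_w: float) -> float:
    """1-D steady-state estimate: T_j = T_amb + P * (R_jb + R_ba)."""
    return t_ambient_c + power_w * (R_JUNCTION_TO_BOARD + R_BOARD_TO_AMBIENT)

for t_amb in (25.0, 50.0, 80.0):          # ambient temperatures from the abstract
    for p in (0.5, 0.75, 1.0, 1.25):      # LED power dissipations from the abstract
        t_j = junction_temperature(t_amb, p)
        status = "OK" if t_j <= T_JUNCTION_MAX else "needs a larger fin"
        print(f"T_amb={t_amb:5.1f} C  P={p:4.2f} W  ->  T_j={t_j:6.1f} C  ({status})")
```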

  5. A Methodological Study Evaluating a Pretutorial Computer-Compiled Instructional Program in High School Physics Instruction Initiated from Student-Teacher Selected Instructional Objectives. Final Report.

    Science.gov (United States)

    Leonard, B. Charles; Denton, Jon J.

    A study sought to develop and evaluate an instructional model which utilized the computer to produce individually prescribed instructional guides to account for the idiosyncratic variations among students in physics classes at the secondary school level. The students in the treatment groups were oriented toward the practices of selecting…

  6. Uncertainty analysis of minimum vessel liquid inventory during a small-break LOCA in a B&W Plant: An application of the CSAU methodology using the RELAP5/MOD3 computer code

    International Nuclear Information System (INIS)

    Ortiz, M.G.; Ghan, L.S.

    1992-12-01

    The Nuclear Regulatory Commission (NRC) revised the emergency core cooling system licensing rule to allow the use of best estimate computer codes, provided the uncertainty of the calculations is quantified and used in the licensing and regulation process. The NRC developed a generic methodology called Code Scaling, Applicability, and Uncertainty (CSAU) to evaluate best estimate code uncertainties. The objective of this work was to adapt and demonstrate the CSAU methodology for a small-break loss-of-coolant accident (SBLOCA) in a Pressurized Water Reactor of the Babcock & Wilcox Company lowered-loop design, using RELAP5/MOD3 as the simulation tool. The CSAU methodology was successfully demonstrated for the new set of variants defined in this project (scenario, plant design, code). However, the robustness of the reactor design to this SBLOCA scenario limits the applicability of the specific results to other plants or scenarios. Several aspects of the code were not exercised because the conditions of the transient never reached sufficient severity. The plant operator proved to be a determining factor in the course of the transient scenario, and steps were taken to include the operator in the model, simulation, and analyses

  7. MIRD methodology

    International Nuclear Information System (INIS)

    Rojo, Ana M.; Gomez Parada, Ines

    2004-01-01

    The MIRD (Medical Internal Radiation Dose) system was established by the Society of Nuclear Medicine of the USA in 1960 to assist the medical community in the estimation of the dose to organs and tissues due to the incorporation of radioactive materials. Since then, the 'MIRD Dose Estimate Reports' (numbers 1 to 12) and 'Pamphlets', of great utility for dose calculations, have been published. The MIRD system was planned essentially for the calculation of doses received by patients during nuclear medicine diagnostic procedures. The MIRD methodology for absorbed dose calculations in different tissues is explained
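
    In the MIRD schema, the absorbed dose to a target organ is the sum, over source organs, of the cumulated activity in each source multiplied by the corresponding S value (mean dose to the target per unit cumulated activity in the source). The sketch below only illustrates that sum with invented cumulated activities and S values; real calculations use the tabulated MIRD S values for a given radionuclide and phantom.

```python
# D(target) = sum over sources of A_tilde(source) * S(target <- source)
# Units here: A_tilde in MBq*s, S in mGy per (MBq*s); all values are placeholders.
cumulated_activity = {"liver": 4.2e4, "kidneys": 1.1e4, "remainder": 2.5e4}
s_value_to_target = {"liver": 3.0e-5, "kidneys": 8.0e-6, "remainder": 1.5e-6}  # target: liver

dose_mgy = sum(cumulated_activity[src] * s_value_to_target[src]
               for src in cumulated_activity)
print(f"Absorbed dose to target organ: {dose_mgy:.2f} mGy")
```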

  8. PSA methodology

    Energy Technology Data Exchange (ETDEWEB)

    Magne, L

    1997-12-31

    The purpose of this text is first to ask a certain number of questions on the methods related to PSAs. Notably we will explore the positioning of the French methodological approach - as applied in the EPS 1300 and EPS 900 PSAs - compared to other approaches (Part One). This reflection leads to a more general reflection: what contents, for what PSA? This is why, in Part Two, we will try to offer a framework for defining the criteria a PSA should satisfy to meet the clearly identified needs. Finally, Part Three will quickly summarize the questions approached in the first two parts, as an introduction to the debate. 15 refs.

  9. PSA methodology

    International Nuclear Information System (INIS)

    Magne, L.

    1996-01-01

    The purpose of this text is first to ask a certain number of questions on the methods related to PSAs. Notably we will explore the positioning of the French methodological approach - as applied in the EPS 1300 and EPS 900 PSAs - compared to other approaches (Part One). This reflection leads to a more general reflection: what contents, for what PSA? This is why, in Part Two, we will try to offer a framework for defining the criteria a PSA should satisfy to meet the clearly identified needs. Finally, Part Three will quickly summarize the questions approached in the first two parts, as an introduction to the debate. 15 refs

  10. Vibration computer programs E13101, E13102, E13104, and E13112 and application to the NERVA program. Project 187: Methodology documentation

    Science.gov (United States)

    Mironenko, G.

    1972-01-01

    Programs for the analyses of the free or forced, undamped vibrations of one or two elastically coupled lumped-parameter systems are presented. Bearing nonlinearities, casing and rotor distributed mass and elasticity, rotor imbalance, forcing functions, gyroscopic moments, rotary inertia, and shear and flexural deformations are all included in the system dynamics analysis. All bearings have nonlinear load-displacement characteristics, so the solution is achieved by iteration. Rotor imbalances allowed by such considerations as pilot tolerances and runouts, as well as bearing clearances (allowing conical or cylindrical whirl), determine the forcing function magnitudes. The computer programs first obtain a solution wherein the bearings are treated as linear springs of given spring rates. Then, based upon the computed bearing reactions, new spring rates are predicted and another solution of the modified system is made. The iteration is continued until the changes to bearing spring rates and bearing reactions become negligibly small.
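
    The abstract outlines a fixed-point iteration: bearings are first modelled as linear springs, reactions are computed, new spring rates are predicted from the nonlinear load-displacement law, and the system is re-solved until the changes become negligible. A minimal single-degree-of-freedom sketch of that loop is shown below with an invented Hertzian-type bearing law; the actual programs solve coupled rotor/casing systems with many more effects.

```python
# Single-DOF illustration of the bearing-stiffness iteration described above (placeholder values).
APPLIED_LOAD = 5.0e3        # N, static load on the bearing
BEARING_COEFF = 1.0e9       # N/m^1.5, invented nonlinear load-deflection coefficient
TOL = 1.0e-6

def bearing_load(deflection: float) -> float:
    """Hertzian-type nonlinear law: F = C * x**1.5 (placeholder model)."""
    return BEARING_COEFF * deflection ** 1.5

k = 1.0e8                   # initial guess for the linearized spring rate, N/m
for iteration in range(100):
    deflection = APPLIED_LOAD / k            # solve the linearized system
    reaction = bearing_load(deflection)      # reaction from the nonlinear law
    k_new = reaction / deflection            # equivalent (secant) spring rate
    if abs(k_new - k) / k < TOL:
        break
    k = k_new

print(f"converged after {iteration} iterations: k = {k_new:.3e} N/m, "
      f"deflection = {deflection * 1e6:.1f} um")
```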

  11. Methodology for a computer-aided design of shrink fits that considers the roughness and form defects of the manufacturing process

    Energy Technology Data Exchange (ETDEWEB)

    Boutoutaou, H. [Universit M' hamed Bougara, Boumerdes (Algeria); Fontaine, J. F. [Universit de Bourgogne, Auxerre (France)

    2015-05-15

    The use of the interference fit (shrink fit) assembly technique is increasing nowadays. However, design methodologies for such assemblies employ older models based on restrictive assumptions regarding the shape and surface quality of the assembled parts. Finite element software is frequently used to overcome this drawback. Such software allows an approach that is close to the physical reality of the parts (complex form) of an assembly. However, the surfaces are generally considered to be perfect cylinders with a constant tightening. The proposed approach allows both form defects and roughness to be introduced into the design, and thus the manufacturing process is considered. The form defect is integrated directly into the mesh, whereas roughness is accounted for by a homogenized finite element at the interface. An application of the approach is presented to determine specifications that consider the manufacturing process (honing, grinding, and turning). This simple method can be easily applied in industrial design offices.
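
    For context on the "older models based on restrictive assumptions" mentioned above: the classical design calculation treats both parts as perfect, smooth cylinders and derives the contact pressure from the radial interference using the thick-walled-cylinder (Lamé) relations. The sketch below implements that textbook baseline with placeholder dimensions and materials; it deliberately ignores the form defects and roughness that the proposed methodology adds.

```python
def lame_contact_pressure(radial_interference, a, b, c,
                          e_inner, nu_inner, e_outer, nu_outer):
    """Contact pressure of an interference fit between two thick-walled cylinders.

    a: inner radius of the shaft (0 for a solid shaft), b: nominal contact radius,
    c: outer radius of the hub; all radii in metres, moduli in Pa.
    """
    compliance = (b / e_outer) * ((c**2 + b**2) / (c**2 - b**2) + nu_outer) \
               + (b / e_inner) * ((b**2 + a**2) / (b**2 - a**2) - nu_inner)
    return radial_interference / compliance

# Placeholder steel-on-steel example: 40 mm solid shaft, 80 mm hub, 20 um radial interference.
p = lame_contact_pressure(radial_interference=20e-6, a=0.0, b=0.020, c=0.040,
                          e_inner=210e9, nu_inner=0.3, e_outer=210e9, nu_outer=0.3)
print(f"contact pressure ~ {p / 1e6:.1f} MPa")
```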

  12. Methodology for a computer-aided design of shrink fits that considers the roughness and form defects of the manufacturing process

    International Nuclear Information System (INIS)

    Boutoutaou, H.; Fontaine, J. F.

    2015-01-01

    The use of the interference fit (shrink fit) assembly technique is increasing nowadays. However, design methodologies for such assemblies employ older models based on restrictive assumptions regarding the shape and surface quality of the assembled parts. Finite element software is frequently used to overcome this drawback. Such software allows an approach that is close to the physical reality of the parts (complex form) of an assembly. However, the surfaces are generally considered to be perfect cylinders with a constant tightening. The proposed approach allows both form defects and roughness to be introduced into the design, and thus the manufacturing process is considered. The form defect is integrated directly into the mesh, whereas roughness is accounted for by a homogenized finite element at the interface. An application of the approach is presented to determine specifications that consider the manufacturing process (honing, grinding, and turning). This simple method can be easily applied in industrial design offices.

  13. Improving Energy Conservation Using Six Sigma Methodology at Faculty of Computer and Mathematical Sciences (FSKM), Universiti Teknologi Mara (UiTM), Shah Alam

    OpenAIRE

    Nur Hidayah binti Mohd Razali; Wan Mohamad Asyraf

    2014-01-01

    Electrical consumption is increasing rapidly in Malaysia due to the sustenance of a modern economy way of living. Recently, the Vice Chancellor of University Technology MARA, Tan Sri Dato' Professor Ir Dr Sahol Hamid Abu Bakar has shown a great deal of concern regarding the high electrical energy consumption in UiTM's main campus in Shah Alam. This study seeks to evaluate the factors that contribute to high electrical energy consumption in the Faculty of Computer and Mathematical Sciences (FS...

  14. Examination of tapered plastic multimode fiber-based sensor performance with silver coating for different concentrations of calcium hypochlorite by soft computing methodologies--a comparative study.

    Science.gov (United States)

    Zakaria, Rozalina; Sheng, Ong Yong; Wern, Kam; Shamshirband, Shahaboddin; Wahab, Ainuddin Wahid Abdul; Petković, Dalibor; Saboohi, Hadi

    2014-05-01

    A soft computing methodology study has been applied to tapered plastic multimode sensors. This study basically used tapered plastic multimode fiber [polymethyl methacrylate (PMMA)] optics as a sensor. The tapered PMMA fiber was fabricated using an etching method involving deionized water and acetone to achieve a waist diameter and length of 0.45 and 10 mm, respectively. In addition, a tapered PMMA probe coated with a silver film was fabricated and demonstrated using a calcium hypochlorite (G70) solution. The working mechanism of such a device is based on the observed increase in the transmission of the sensor when it is immersed in solutions of higher concentration. As the concentration was varied from 0 to 6 ppm, the output voltage of the sensor increased linearly. The silver film coating increased the sensitivity of the proposed sensor because of the effective cladding refractive index, which increases with the coating and thus allows more light to be transmitted through the tapered fiber. In this study, the polynomial and radial basis function (RBF) kernels were applied in support vector regression (SVR) to estimate and predict the output voltage response of the sensors with and without the silver film according to experimental tests. Instead of minimizing the observed training error, SVR_poly and SVR_rbf were used in an attempt to minimize the generalization error bound so as to achieve generalized performance. An adaptive neuro-fuzzy inference system (ANFIS) approach was also investigated for comparison. The experimental results showed that improvements in the predictive accuracy and capacity for generalization can be achieved by the SVR_poly approach in comparison to the SVR_rbf methodology. The same testing errors were found for the SVR_poly approach and the ANFIS approach.
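
    The study compares support vector regression with polynomial and RBF kernels (and ANFIS) for predicting the sensor output voltage from the calcium hypochlorite concentration. A minimal sketch of such a comparison using scikit-learn is shown below; the synthetic data merely mimic the reported roughly linear voltage increase over 0 to 6 ppm and are not the experimental measurements, and the hyperparameters are arbitrary.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
conc = np.linspace(0.0, 6.0, 40).reshape(-1, 1)                   # ppm, placeholder grid
voltage = 0.10 * conc.ravel() + 1.0 + rng.normal(0, 0.01, 40)      # synthetic, roughly linear response

svr_poly = SVR(kernel="poly", degree=2, C=10.0).fit(conc, voltage)  # SVR_poly
svr_rbf = SVR(kernel="rbf", C=10.0, gamma="scale").fit(conc, voltage)  # SVR_rbf

for name, model in (("SVR_poly", svr_poly), ("SVR_rbf", svr_rbf)):
    mse = mean_squared_error(voltage, model.predict(conc))
    print(f"{name}: training MSE = {mse:.5f}")
```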

  15. A simulation model for visitors’ thermal comfort at urban public squares using non-probabilistic binary-linear classifier through soft-computing methodologies

    International Nuclear Information System (INIS)

    Kariminia, Shahab; Shamshirband, Shahaboddin; Hashim, Roslan; Saberi, Ahmadreza; Petković, Dalibor; Roy, Chandrabhushan; Motamedi, Shervin

    2016-01-01

    Outdoor life in cities is declining because of recent rapid urbanisation carried out without considering climate-responsive urban design concepts. Such inadvertent climatic modifications at the indoor level have imposed considerable demand on urban energy resources. It is important to provide a comfortable ambient climate at open urban squares, and researchers need to predict the comfortable conditions at such outdoor squares. The main objective of this study is to predict the visitors' outdoor comfort indices by using a developed computational model termed SVM-WAVELET (Support Vector Machines combined with a Discrete Wavelet Transform algorithm). For data collection, the field study was conducted in downtown Isfahan, Iran (51°41′ E, 32°37′ N), which has hot and arid summers. Based on different environmental elements, four separate locations were monitored across two public squares. Meteorological data were measured while simultaneously surveying the visitors' thermal sensations. According to the subjects' thermal feeling and their characteristics, their level of comfort was estimated. Further, the adapted computational model was used to estimate the visitors’ thermal sensations in terms of thermal comfort indices. The SVM-WAVELET results indicate that the R² values for the input parameters, including Thermal Sensation, PMV (the predicted mean vote), PET (physiologically equivalent temperature), SET (standard effective temperature) and Tmrt, were estimated at 0.482, 0.943, 0.988, 0.969 and 0.840, respectively. - Highlights: • To explore the visitors' thermal sensation at urban public squares. • This article introduces findings of outdoor comfort prediction. • The developed SVM-WAVELET soft-computing technique was used. • SVM-WAVELET estimation results are more reliable and accurate.
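
    The SVM-WAVELET model couples a discrete wavelet transform of the inputs with support vector regression. As a schematic illustration of that coupling (not the authors' implementation), the sketch below decomposes each measured record with PyWavelets and feeds the wavelet coefficients to an SVR; the signals, target values, wavelet choice and hyperparameters are synthetic placeholders.

```python
import numpy as np
import pywt
from sklearn.svm import SVR

rng = np.random.default_rng(1)

def wavelet_features(signal, wavelet="db4", level=2):
    """Concatenate DWT approximation and detail coefficients into one feature vector."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    return np.concatenate(coeffs)

# Synthetic stand-ins: 50 "visitors", each with a 32-sample microclimate record,
# and a thermal-sensation score to be predicted.
signals = rng.normal(size=(50, 32))
sensation = signals.mean(axis=1) + 0.1 * rng.normal(size=50)

X = np.array([wavelet_features(s) for s in signals])
model = SVR(kernel="rbf", C=10.0).fit(X, sensation)
print("R^2 on training data:", round(model.score(X, sensation), 3))
```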

  16. Computation of interactive effects and optimization of process parameters for alkaline lipase production by mutant strain of Pseudomonas aeruginosa using response surface methodology

    Directory of Open Access Journals (Sweden)

    Deepali Bisht

    2013-01-01

    Full Text Available Alkaline lipase production by a mutant strain of Pseudomonas aeruginosa MTCC 10,055 was optimized in shake flask batch fermentation using response surface methodology. An empirical model was developed through a Box-Behnken experimental design to describe the relationship among the tested variables (pH, temperature, castor oil, starch and triton-X-100). The second-order quadratic model determined the optimum conditions as castor oil, 1.77 mL.L-1; starch, 15.0 g.L-1; triton-X-100, 0.93 mL.L-1; incubation temperature, 34.12 °C and pH 8.1, resulting in maximum alkaline lipase production (3142.57 U.mL-1). The quadratic model was in satisfactory agreement with the experimental data, as evidenced by a high coefficient of determination (R²) value (0.9987). The RSM facilitated the analysis and interpretation of the experimental data to ascertain the optimum conditions of the variables for the process and recognized the contribution of individual variables to the response under optimal conditions. Hence the Box-Behnken approach could fruitfully be applied for process optimization.
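
    Response surface methodology of the kind used here fits a second-order (quadratic) polynomial in the coded factors to the responses measured at the Box-Behnken design points, and the fitted surface is then used to locate the optimum settings. The sketch below shows only that fitting step for a generic three-factor case with synthetic data; it does not use the study's five factors or measurements.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)

# Coded (-1, 0, +1) Box-Behnken design for three factors: 12 edge midpoints plus 3 centre points.
design = np.array([[a, b, 0] for a in (-1, 1) for b in (-1, 1)] +
                  [[a, 0, c] for a in (-1, 1) for c in (-1, 1)] +
                  [[0, b, c] for b in (-1, 1) for c in (-1, 1)] +
                  [[0, 0, 0]] * 3, dtype=float)

# Synthetic response with a maximum inside the region (placeholder for the measured activity).
y = 3000 - 200 * (design[:, 0] - 0.3)**2 - 150 * (design[:, 1] + 0.2)**2 \
    - 100 * design[:, 2]**2 + rng.normal(0, 5, len(design))

quad = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(quad.fit_transform(design), y)
print("R^2 of the quadratic model:", round(model.score(quad.transform(design), y), 4))
```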

  17. Computation of interactive effects and optimization of process parameters for alkaline lipase production by mutant strain of Pseudomonas aeruginosa using response surface methodology

    Science.gov (United States)

    Bisht, Deepali; Yadav, Santosh Kumar; Darmwal, Nandan Singh

    2013-01-01

    Alkaline lipase production by a mutant strain of Pseudomonas aeruginosa MTCC 10,055 was optimized in shake flask batch fermentation using response surface methodology. An empirical model was developed through a Box-Behnken experimental design to describe the relationship among the tested variables (pH, temperature, castor oil, starch and triton-X-100). The second-order quadratic model determined the optimum conditions as castor oil, 1.77 mL.L−1; starch, 15.0 g.L−1; triton-X-100, 0.93 mL.L−1; incubation temperature, 34.12 °C and pH 8.1, resulting in maximum alkaline lipase production (3142.57 U.mL−1). The quadratic model was in satisfactory agreement with the experimental data, as evidenced by a high coefficient of determination (R2) value (0.9987). The RSM facilitated the analysis and interpretation of the experimental data to ascertain the optimum conditions of the variables for the process and recognized the contribution of individual variables to the response under optimal conditions. Hence the Box-Behnken approach could fruitfully be applied for process optimization. PMID:24159311

  18. Computational methodology for the Oak Ridge Research Reactor (ORR) and Bulk Shielding Reactor (BSR): cross-section and validation. Volume 1

    International Nuclear Information System (INIS)

    Miller, L.F.; Williams, M.L.

    1986-03-01

    A neutronics library suitable for low-enrichment uranium (LEU) and high-enrichment uranium (HEU) fueled cores for both the Oak Ridge Research Reactor (ORR) and the Bulk Shielding Reactor (BSR) is documented herein. The library is obtained from version V of the Evaluated Nuclear Data File (ENDF/B-V) and contains 223 nuclides weighted over a variety of region-dependent neutron spectra. Self-shielding and zone-weighting effects are incorporated with 227-group calculations for several reactor-core configurations. Libraries are archived for both transport and diffusion theory seven-group calculations. Complete listings of processing details are included so that libraries with different specifications can be easily obtained. Results from validation calculations indicate that the neutronics libraries obtained from this effort are suitable for neutronics computations for the ORR and BSR. 12 refs., 5 figs., 15 tabs
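
    Weighting a fine-group library (here 227 groups) down to a broad-group library (here seven groups) is done by flux-weighting: each broad-group cross section is the average of the fine-group cross sections, weighted by the region-dependent spectrum, over the fine groups that fall in that broad group. The sketch below illustrates that collapse with random placeholder data and a trivial group mapping, not with the ENDF/B-V values or group boundaries used in the report.

```python
import numpy as np

rng = np.random.default_rng(3)

N_FINE, N_BROAD = 227, 7
sigma_fine = rng.uniform(0.1, 10.0, N_FINE)    # placeholder fine-group cross sections (barn)
flux_fine = rng.uniform(0.0, 1.0, N_FINE)      # placeholder region-dependent weighting spectrum
# Map each fine group to a broad group (a simple contiguous split, for illustration only).
broad_index = np.arange(N_FINE) * N_BROAD // N_FINE

sigma_broad = np.array([
    np.sum(flux_fine[broad_index == g] * sigma_fine[broad_index == g])
    / np.sum(flux_fine[broad_index == g])
    for g in range(N_BROAD)
])
print("collapsed 7-group cross sections:", np.round(sigma_broad, 3))
```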

  19. A Nexus between Theory and Experiment: Non-Empirical Quantum Mechanical Computational Methodology Applied to Cucurbit[n]uril⋅Guest Binding Interactions.

    Science.gov (United States)

    Hostaš, Jiří; Sigwalt, David; Šekutor, Marina; Ajani, Haresh; Dubecký, Matúš; Řezáč, Jan; Zavalij, Peter Y; Cao, Liping; Wohlschlager, Christian; Mlinarić-Majerski, Kata; Isaacs, Lyle; Glaser, Robert; Hobza, Pavel

    2016-11-21

    A training set of eleven X-ray structures determined for biomimetic complexes between cucurbit[n]uril (CB[7 or 8]) hosts and adamantane-/diamantane ammonium/aminium guests was studied with DFT-D3 quantum mechanical computational methods to afford ΔG_calcd binding energies. A novel feature of this work is that the fidelity of the BLYP-D3/def2-TZVPP choice of DFT functional was proven by comparison with more accurate methods. For the first time, the CB[n]⋅guest complex binding energy subcomponents [for example, ΔE_dispersion, ΔE_electrostatic, ΔG_solvation, binding entropy (-TΔS), and induced-fit E_deformation(host), E_deformation(guest)] were calculated. Only a few weeks of computation time per complex were required by using this protocol. The deformation (stiffness) and solvation properties (with emphasis on cavity desolvation) of the isolated cucurbit[n]uril (n=5, 6, 7, 8) host molecules were also explored by means of the DFT-D3 method. A high ρ² = 0.84 correlation coefficient between ΔG_exptl and ΔG_calcd was achieved without any scaling of the calculated terms (at 298 K). This linear dependence was utilized for ΔG_calcd predictions of new complexes. The nature of binding, including the role of high-energy water molecules, was also studied. The utility of introducing tethered [-(CH2)nNH3]+ amino loops attached to N,N-dimethyl-adamantane-1-amine and N,N,N',N'-tetramethyl diamantane-4,9-diamine skeletons (both from an experimental and a theoretical perspective) is presented here as a promising tool for the achievement of new ultra-high binding guests for CB[7] hosts. Predictions of not yet measured equilibrium constants are presented herein. © 2016 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
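
    The reported correlation between calculated and experimental binding free energies is what lets a new ΔG_calcd be turned into a prediction of the measurable free energy and hence an equilibrium constant. The sketch below shows that generic calibration-and-prediction step with a linear fit; the eleven data points and the new complex are invented placeholders, not the values from the paper (which reports that no scaling of the calculated terms was required).

```python
import numpy as np

# Placeholder training data: calculated vs experimental binding free energies (kcal/mol).
dg_calc = np.array([-12.1, -14.5, -9.8, -16.2, -11.0, -13.3, -15.8, -10.4, -17.1, -12.9, -14.0])
dg_expt = np.array([-11.5, -13.9, -9.1, -15.0, -10.6, -12.8, -14.9, -9.9, -16.3, -12.2, -13.4])

slope, intercept = np.polyfit(dg_calc, dg_expt, 1)
r2 = np.corrcoef(dg_calc, dg_expt)[0, 1] ** 2
print(f"linear fit: dG_expt = {slope:.2f} * dG_calc + {intercept:.2f}, r^2 = {r2:.2f}")

# Predict the free energy and equilibrium constant for a hypothetical new complex.
RT = 0.593  # kcal/mol at 298 K
dg_new_calc = -18.0
dg_new_pred = slope * dg_new_calc + intercept
print(f"predicted dG = {dg_new_pred:.1f} kcal/mol, K_a ~ {np.exp(-dg_new_pred / RT):.2e}")
```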

  20. Respiratory-Gated Positron Emission Tomography and Breath-Hold Computed Tomography Coupling to Reduce the Influence of Respiratory Motion: Methodology and Feasibility

    International Nuclear Information System (INIS)

    Daouk, J.; Fin, L.; Bailly, P.; Meyer, M.E.

    2009-01-01

    Background: Respiratory motion causes uptake in positron emission tomography (PET) images of chest and abdominal structures to be blurred and reduced in intensity. Purpose: To compare two respiratory-gated PET binning methods (based on frequency and amplitude analyses of the respiratory signal) and to propose a 'BH-based' method based on an additional breath-hold computed tomography (CT) acquisition. Material and Methods: Respiratory-gated PET consists in list-mode (LM) acquisition with simultaneous respiratory signal recording. A phantom study featured rectilinear movement of a 0.5-ml sphere filled with 18F-fluorodeoxyglucose (18F-FDG) solution, placed in a radioactive background (sphere-to-background contrast 6:1). Two patients were also examined. Three figures of merit were calculated: the target-to-background ratio profile (TBRP) in the axial direction through the uptake (i.e., the sphere or lesion), full-width-at-half-maximum (FWHM) values, and maximized standard uptake values (SUVmax). Results: In the phantom study, the peak TBRP was 0.9 for non-gated volume, 1.83 for BH-based volume, and varied between 1.13 and 1.73 for Freq-based volumes and between 1.34 and 1.66 for Amp-based volumes. A reference volume (REF-static) was also acquired for the phantom (in a static, 'expiratory' state), with a peak TBRP at 1.88. TBRPs were computed for patient data, with higher peak values for all gated volumes than for non-gated volumes. Conclusion: Respiratory-gated PET acquisition reduces the blurring effect and increases image contrast. However, Freq-based and Amp-based volumes are still influenced by inappropriate attenuation correction and misregistration of mobile lesions on CT images. The proposed BH-based method both reduces motion artifacts and improves PET-CT registration
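
    Amplitude-based binning of the kind compared here sorts each list-mode event into a gate according to the value of the respiratory signal at the event time, whereas frequency (phase) based binning uses the position within the breathing cycle. A toy sketch of the amplitude-based sorting step is shown below with a synthetic respiratory trace and synthetic event times; it ignores everything PET-specific (attenuation correction, reconstruction) and is not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic respiratory signal sampled at 25 Hz and synthetic list-mode event time stamps.
t = np.arange(0, 120, 0.04)                        # 2 minutes of signal
resp = np.sin(2 * np.pi * t / 4.5) + 0.05 * rng.normal(size=t.size)
event_times = rng.uniform(0, 120, size=20000)      # placeholder event times (s)

N_GATES = 6
amp_at_event = np.interp(event_times, t, resp)     # respiratory amplitude at each event
edges = np.quantile(resp, np.linspace(0, 1, N_GATES + 1))   # equal-count amplitude gates
gate = np.clip(np.searchsorted(edges, amp_at_event, side="right") - 1, 0, N_GATES - 1)

for g in range(N_GATES):
    print(f"gate {g}: {np.sum(gate == g)} events")
```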

  1. A computational methodology for a micro launcher engine test bench using a combined linear static and dynamic in frequency response analysis

    Directory of Open Access Journals (Sweden)

    Ion DIMA

    2017-03-01

    Full Text Available This article aims to provide a quick methodology to determine the critical values of the forces, displacements and stresses as a function of frequency, under a combined linear static (101 Solution - Linear Static) and dynamic load in frequency response (108 Solution - Frequency Response, Direct Method), applied to a micro launcher engine test bench, using NASTRAN 400 Solution - Implicit Nonlinear. NASTRAN/PATRAN software is used. Practically, in the PATRAN preprocessor one has to define a linear or nonlinear static load at step 1 and a dynamic (time-dependent) frequency-response load at step 2. In Analyze the following options are chosen: for Solution Type, Implicit Nonlinear Solution (SOL 400) is selected; for Subcases, Static Load and Transient Dynamic are chosen; and for Subcase Select, the two cases, static and dynamic, are selected. The NASTRAN solver will overlap the results from the static analysis with the dynamic analysis. The running time is reduced by a factor of three when using the Krylov solver. The NASTRAN SYSTEM (387) = -1 instruction is used in order to activate the Krylov option. Also, in Analysis the OP2 Output Format shall be selected, meaning that in the bdf NASTRAN input file the PARAM POST 1 instruction shall be written. The structural damping can be defined in two different ways: either at the material card or using the PARAM, G, 0.05 instruction (in this example a damping coefficient of 5% was used). The SDAMPING instruction paired with TABDMP1 works only for dynamic in frequency response, modal method, or in direct method with viscoelastic material, not for dynamic in frequency response, direct method (DFREQ), with linear elastic material. The Direct method (DFREQ) used in this example is more accurate. A set of boundary conditions constrained in translation was used and defined at the base of the test bench.

  2. Computer aided product design

    DEFF Research Database (Denmark)

    Constantinou, Leonidas; Bagherpour, Khosrow; Gani, Rafiqul

    1996-01-01

    A general methodology for Computer Aided Product Design (CAPD) with specified property constraints which is capable of solving a large range of problems is presented. The methodology employs the group contribution approach, generates acyclic, cyclic and aromatic compounds of various degrees......-liquid equilibria (LLE), solid-liquid equilibria (SLE) and gas solubility. Finally, a computer program based on the extended methodology has been developed and the results from five case studies highlighting various features of the methodology are presented....

  3. Testing methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Bender, M.A.

    1990-01-01

    Several methodologies are available for screening human populations for exposure to ionizing radiation. Of these, aberration frequency determined in peripheral blood lymphocytes is the best developed. Individual exposures to large doses can easily be quantitated, and population exposures to occupational levels can be detected. However, determination of exposures to the very low doses anticipated from a low-level radioactive waste disposal site is more problematical. Aberrations occur spontaneously, without known cause. Exposure to radiation induces no new or novel types, but only increases their frequency. The limitations of chromosomal aberration dosimetry for detecting low-level radiation exposures lie mainly in the statistical "signal to noise" problem, the distribution of aberrations among cells and among individuals, and the possible induction of aberrations by other environmental, occupational or medical exposures. However, certain features of the human peripheral lymphocyte-chromosomal aberration system make it useful in screening for certain types of exposures. Future technical developments may make chromosomal aberration dosimetry more useful for low-level radiation exposures. Other methods, measuring gene mutations or even minute changes at the DNA level, while presently less well developed techniques, may eventually become even more practical and sensitive assays for human radiation exposure. 15 refs.

  4. Testing methodologies

    International Nuclear Information System (INIS)

    Bender, M.A.

    1990-01-01

    Several methodologies are available for screening human populations for exposure to ionizing radiation. Of these, aberration frequency determined in peripheral blood lymphocytes is the best developed. Individual exposures to large doses can easily be quantitated, and population exposures to occupational levels can be detected. However, determination of exposures to the very low doses anticipated from a low-level radioactive waste disposal site is more problematical. Aberrations occur spontaneously, without known cause. Exposure to radiation induces no new or novel types, but only increases their frequency. The limitations of chromosomal aberration dosimetry for detecting low-level radiation exposures lie mainly in the statistical "signal to noise" problem, the distribution of aberrations among cells and among individuals, and the possible induction of aberrations by other environmental, occupational or medical exposures. However, certain features of the human peripheral lymphocyte-chromosomal aberration system make it useful in screening for certain types of exposures. Future technical developments may make chromosomal aberration dosimetry more useful for low-level radiation exposures. Other methods, measuring gene mutations or even minute changes at the DNA level, while presently less well developed techniques, may eventually become even more practical and sensitive assays for human radiation exposure. 15 refs

  5. Tax on Incomes Obtained from Romania by Non-resident Taxpayers – Computation Methodology and the Fiscal Impact upon the Corporate Tax

    Directory of Open Access Journals (Sweden)

    Cristina IOVU

    2017-04-01

    Full Text Available Withholding tax for non-resident taxpayers is a tax due to the Romanian state budget by taxpayers, and the expense for the amount related to this tax has a fiscal impact upon the fiscal result and, therefore, upon the corporate tax due within a fiscal period. Most of the time, in practice, a series of questions arise, such as: which entity has the obligation to compute and pay the tax, the income payer or the one collecting it; is the transaction taxable from the point of view of the tax on non-resident taxpayers’ income, as a function of the nature and object of the transaction; which tax rate is owed, in case the operation has a taxable feature; and which fiscal treatment is applicable in case of expenses incurred in the accounting records with the amount of the tax owed on non-resident taxpayers’ income. For this reason, in practice there are several approaches which could generate fiscal risks, related to the fiscal treatment applicable in case of different types of transactions concluded with non-resident taxpayers, depending on the nature and scope of the respective transaction.

  6. Methodological inaccuracies in clinical aortic valve severity assessment: insights from computational fluid dynamic modeling of CT-derived aortic valve anatomy

    Science.gov (United States)

    Traeger, Brad; Srivatsa, Sanjay S.; Beussman, Kevin M.; Wang, Yechun; Suzen, Yildirim B.; Rybicki, Frank J.; Mazur, Wojciech; Miszalski-Jamka, Tomasz

    2016-04-01

    Aortic stenosis is the most common valvular heart disease. Assessing the valve's contribution to the total ventricular load is essential for the aging population. A CT scan for one patient was used to create an in vivo tricuspid aortic valve geometry, which was assessed with computational fluid dynamics (CFD). CFD simulated the pressure, velocity, and flow rate, which were used to assess the Gorlin formula and the continuity equation, the current clinical diagnostic standards. The results demonstrate an underestimation of the anatomic orifice area (AOA) by the Gorlin formula and an overestimation of AOA by the continuity equation using peak velocities, as would be measured clinically by Doppler echocardiography. As a result, we suggest that the Gorlin formula is unable to achieve the intended estimation of AOA and largely underestimates AOA at the critical low-flow states present in heart failure. The disparity in the use of echocardiography with the continuity equation is due to the variation in velocity profile between the outflow tract and the valve orifice. Comparison of time-averaged orifice areas by Gorlin and continuity with instantaneous orifice areas by planimetry can mask the errors of these methods, which is a result of the assumption that the blood flow is inviscid.
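
    For reference, the two clinical estimates examined in this study are the Gorlin formula, which derives an effective orifice area from the transvalvular flow and the mean pressure gradient, and the Doppler continuity equation, which equates the stroke volume through the outflow tract with that through the valve. The sketch below shows both calculations with placeholder hemodynamic values; the constants are the conventional clinical ones, not results from the paper.

```python
import math

def gorlin_ava(cardiac_output_ml_min, heart_rate, systolic_ejection_period_s, mean_gradient_mmhg):
    """Gorlin formula for aortic valve area (cm^2), using the conventional 44.3 constant."""
    flow = cardiac_output_ml_min / (heart_rate * systolic_ejection_period_s)  # mL/s during ejection
    return flow / (44.3 * math.sqrt(mean_gradient_mmhg))

def continuity_ava(lvot_diameter_cm, vti_lvot_cm, vti_valve_cm):
    """Doppler continuity equation: AVA = CSA_LVOT * VTI_LVOT / VTI_valve."""
    csa_lvot = math.pi * (lvot_diameter_cm / 2.0) ** 2
    return csa_lvot * vti_lvot_cm / vti_valve_cm

# Placeholder hemodynamics for a moderately stenotic valve.
print(f"Gorlin AVA:     {gorlin_ava(5000.0, 70, 0.33, 30.0):.2f} cm^2")
print(f"Continuity AVA: {continuity_ava(2.0, 20.0, 50.0):.2f} cm^2")
```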

  7. Verification and Validation (V&V) Methodologies for Multiphase Turbulent and Explosive Flows. V&V Case Studies of Computer Simulations from Los Alamos National Laboratory GMFIX codes

    Science.gov (United States)

    Dartevelle, S.

    2006-12-01

    Large-scale volcanic eruptions are inherently hazardous events, hence cannot be described by detailed and accurate in situ measurements; hence, volcanic explosive phenomenology is inadequately constrained in terms of initial and inflow conditions. Consequently, little to no real-time data exist to Verify and Validate computer codes developed to model these geophysical events as a whole. However, code Verification and Validation remains a necessary step, particularly when volcanologists use numerical data for mitigation of volcanic hazards as more often performed nowadays. The Verification and Validation (V&V) process formally assesses the level of 'credibility' of numerical results produced within a range of specific applications. The first step, Verification, is 'the process of determining that a model implementation accurately represents the conceptual description of the model', which requires either exact analytical solutions or highly accurate simplified experimental data. The second step, Validation, is 'the process of determining the degree to which a model is an accurate representation of the real world', which requires complex experimental data of the 'real world' physics. The Verification step is rather simple to formally achieve, while, in the 'real world' explosive volcanism context, the second step, Validation, is about impossible. Hence, instead of validating computer code against the whole large-scale unconstrained volcanic phenomenology, we rather suggest to focus on the key physics which control these volcanic clouds, viz., momentum-driven supersonic jets and multiphase turbulence. We propose to compare numerical results against a set of simple but well-constrained analog experiments, which uniquely and unambiguously represent these two key-phenomenology separately. Herewith, we use GMFIX (Geophysical Multiphase Flow with Interphase eXchange, v1.62), a set of multiphase- CFD FORTRAN codes, which have been recently redeveloped to meet the strict

  8. Orthopaedic measurements with computed radiography: Methodological development, accuracy, and radiation dose with special reference to the weight-bearing lower extremity and the dislocating patella

    International Nuclear Information System (INIS)

    Sanfridsson, J.

    2001-01-01

    The overall aim of this study was to develop and evaluate a measurement system for computed radiography and Picture Archiving and Communication Systems, permitting measurements of long distances and angles in and between related images. The developed measurement system, which was based on the QUESTOR Precision Radiography system, was applied to the weight-bearing knee with special reference to the dislocating patella. The QPR system modified for CR fulfilled the criteria for measuring the weight-bearing knee. The special measuring assistance tools that were developed were important for the implementation of CR and PACS, particularly in workstations programmed for musculoskeletal radiology. The energy imparted to the patient was reduced by 98% at the lowest exposure of the CR-system, compared with our conventional analogue method, without loss of diagnostic accuracy. The CR technique creates a possibility, to an extent not previously feasible, to differentiate the exposure parameters (and thus minimise the radiation dose to the patient) by carefully considering the purpose of the examination. A radiographic method for measuring the rotation of the femur and the tibia, and the patellar translation was developed and applied to healthy volunteers. The introduced patellar variables have yielded new insights into the complex sequence of motions between the femur, tibia, and patella. The patients with a dislocating patella were subdivided into one 'clean' group of spontaneous dislocations and one group with various traumas in the history, which thus resulted in two groups with distinct radiographic differences. The Q-angle was decreased in knees that had suffered dislocations, and the traditional surgical treatment with a further reduction of the Q-angle must be challenged. The use of clinical measurements of the Q-angle was not an optimal way to evaluate the mechanical alignment in the patellofemoral joint under physiological conditions. In this study, we have proved

  9. Orthopaedic measurements with computed radiography: Methodological development, accuracy, and radiation dose with special reference to the weight-bearing lower extremity and the dislocating patella

    Energy Technology Data Exchange (ETDEWEB)

    Sanfridsson, J. [Univ. Hospital, Lund (Sweden). Dept. of Diagnostic Radiology

    2001-03-01

    The overall aim of this study was to develop and evaluate a measurement system for computed radiography and Picture Archiving and Communication Systems, permitting measurements of long distances and angles in and between related images. The developed measurement system, which was based on the QUESTOR Precision Radiography system, was applied to the weight-bearing knee with special reference to the dislocating patella. The QPR system modified for CR fulfilled the criteria for measuring the weight-bearing knee. The special measuring assistance tools that were developed were important for the implementation of CR and PACS, particularly in workstations programmed for musculoskeletal radiology. The energy imparted to the patient was reduced by 98% at the lowest exposure of the CR-system, compared with our conventional analogue method, without loss of diagnostic accuracy. The CR technique creates a possibility, to an extent not previously feasible, to differentiate the exposure parameters (and thus minimise the radiation dose to the patient) by carefully considering the purpose of the examination. A radiographic method for measuring the rotation of the femur and the tibia, and the patellar translation was developed and applied to healthy volunteers. The introduced patellar variables have yielded new insights into the complex sequence of motions between the femur, tibia, and patella. The patients with a dislocating patella were subdivided into one 'clean' group of spontaneous dislocations and one group with various traumas in the history, which thus resulted in two groups with distinct radiographic differences. The Q-angle was decreased in knees that had suffered dislocations, and the traditional surgical treatment with a further reduction of the Q-angle must be challenged. The use of clinical measurements of the Q-angle was not an optimal way to evaluate the mechanical alignment in the patellofemoral joint under physiological conditions. In this study, we have

  10. Radiotracer methodology

    International Nuclear Information System (INIS)

    Eng, R.R.

    1988-01-01

    In 1923, George Hevesy demonstrated the distribution of radioactive lead in the horsebean plant. This early demonstration of the potential use of radiotracers in biology was reinforced when J.G. Hamilton and colleagues used iodine-131 for diagnostic purposes in patients. Then in 1950 Cassen et al. designed the first scintillation counter for measuring radioiodine in the body, using calcium tungstate crystals coupled to a photomultiplier tube. This was followed by the development of the Anger camera, which permitted visualization of radiotracer distribution in biological systems. From these significant early discoveries to the present, many advances have been made. They include the discovery and production of many useful radioisotopes; the formulation of these radioisotopes into useful radiotracers; the advent of first- , and second-, and third-generation instrumentation for monitoring in vitro and in vivo distributions of new radiotracers; and the application of this knowledge to allow us to better understand physiological processes and treat disease states. Radiotracer techniques are integral to numerous techniques described in this volume. Autoradiography, nuclear scintigraphy, positron emission tomography, and single-photon emission computed tomography (SPECT) are all dependent on an understanding of radiotracer techniques to properly utilize these probe devices

  11. Computational model and performance optimization methodology of a compact design heat exchanger used as an IHX in HTGR; Modelo computacional y metodologia de optimizacion del funcionamiento de un intercambiador de calor de diseno compacto empleado como IHX en HTGR

    Energy Technology Data Exchange (ETDEWEB)

    De la Torre V, R.; Francois L, J. L., E-mail: delatorrevaldes@gmail.com [UNAM, Facultad de Ingenieria, Departamento de Sistemas Energeticos, Ciudad Universitaria, Circuito Exterior s/n, 04510 Ciudad de Mexico (Mexico)

    2017-09-15

    The intermediate heat exchangers (IHX) used in high-temperature gas-cooled reactors (HTGR) face complex operating conditions, characterized by temperatures higher than 1073 K. Conventional shell-and-tube designs have shown disadvantages with respect to compact designs. In this work, computational models of a compact heat exchanger design, the printed circuit heat exchanger, were built under IHX conditions in an HTGR installation. In these models, a detailed three-dimensional geometry was considered, corresponding to one transfer unit of the heat exchanger. Computational fluid dynamics techniques and finite element methods were used to study the thermo-hydraulic and mechanical behavior of the equipment, respectively. The properties of the materials were defined as functions of temperature. The thermo-hydraulic results obtained were imposed as operating conditions in the structural calculations. A methodology was developed based on the analysis of capital and operating costs, which combines the heat transfer, the pressure drop and the mechanical behavior of the structure in a single optimization variable. By analyzing the experimental results of other authors, a relationship was obtained between the operation time of the equipment and the maximum stress in the structure, which was used in the model. The results show that the design giving the highest thermal efficiency differs from the one with the lowest total cost per year. (Author)

  12. Efficient Computational Research Protocol to Survey Free Energy Surface for Solution Chemical Reaction in the QM/MM Framework: The FEG-ER Methodology and Its Application to Isomerization Reaction of Glycine in Aqueous Solution.

    Science.gov (United States)

    Takenaka, Norio; Kitamura, Yukichi; Nagaoka, Masataka

    2016-03-03

    In solution chemical reactions, we often need to consider a multidimensional free energy (FE) surface (FES) which is analogous to a Born-Oppenheimer potential energy surface. To survey the FES, an efficient computational research protocol is proposed within the QM/MM framework: (i) we first obtain the stable states (or transition states) involved by optimizing their structures on the FES in a stepwise fashion, finally using the free energy gradient (FEG) method, and then (ii) we directly obtain the FE differences among any arbitrary states on the FES, efficiently by employing the QM/MM method with energy representation (ER), i.e., the QM/MM-ER method. To validate the calculation accuracy and efficiency, we applied the above FEG-ER methodology to a typical isomerization reaction of glycine in aqueous solution, and reproduced quite satisfactorily the experimental value of the reaction FE. Further, it was found that the structural relaxation of the solute in the QM/MM force field is not negligible for a correct estimation of the FES. We believe that the present research protocol should become prevalent as a computational strategy and will play a promising and important role in solution chemistry toward solution reaction ergodography.

  13. Computational Literacy

    DEFF Research Database (Denmark)

    Chongtay, Rocio; Robering, Klaus

    2016-01-01

    In recent years, there has been a growing interest in and recognition of the importance of Computational Literacy, a skill generally considered to be necessary for success in the 21st century. While much research has concentrated on requirements, tools, and teaching methodologies for the acquisition of Computational Literacy at basic educational levels, focus on higher levels of education has been much less prominent. The present paper considers the case of courses for higher education programs within the Humanities. A model is proposed which conceives of Computational Literacy as a layered...

  14. Software engineering methodologies and tools

    Science.gov (United States)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that productivity of producing software has only increased one to two percent a year in the last thirty years. Ironically, the computer and its software have contributed significantly to the industry-wide productivity, but computer professionals have done a poor job of using the computer to do their job. Engineering disciplines and methodologies are now emerging supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for the general evaluation of computer assisted software engineering (CASE) tools from actual installation of and experimentation with some specific tools.

  15. Massive computation methodology for reactor operation (MACRO)

    International Nuclear Information System (INIS)

    Gustavsson, Cecilia; Pomp, Stephan; Sjoestrand, Henrik; Wallin, Gustav; Oesterlund, Michael; Koning, Arjan; Rochman, Dimitri; Bejmer, Klaes-Hakan; Henriksson, Hans

    2010-01-01

    Today, nuclear data libraries do not handle uncertainties from nuclear data in a consistent manner and the reactor codes do not request uncertainties in nuclear data input. Thus, the output from these codes has unknown uncertainties. The plan is to use a method proposed by Koning and Rochman to investigate the propagation of nuclear data uncertainties into reactor physics codes and macroscopic parameters. A project (acronym MACRO) has started at Uppsala University in collaboration with A. Koning and with financial support from Vattenfall AB and the Swedish Research Council within the GENIUS (Generation IV research in universities of Sweden) project. In the proposed method the uncertainties in nuclear model parameters will be derived from theoretical considerations and from comparisons of nuclear model results with experimental cross-section data. Given the probability distribution of the model parameters, a large set of random, complete ENDF-formatted nuclear data libraries will be created using the TALYS code. The generated nuclear data libraries will then be used in neutron transport codes to obtain macroscopic reactor parameters. For this, models of reactor systems with proper geometry and elements will be used. This will be done for all data libraries, and the variation of the final results will be regarded as a systematic uncertainty in the investigated reactor parameter. The understanding of these systematic uncertainties is especially important for the design and intercomparison of new reactor concepts, i.e., Generation IV; applications to the optimization of current-generation reactors are also envisaged. (authors)
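
    The propagation scheme described in this record (random, complete nuclear data libraries fed one by one to a transport code, with the spread of the results taken as the nuclear-data-induced uncertainty) reduces in outline to a simple Monte Carlo loop. The sketch below is illustrative only and assumes a hypothetical `run_transport(library)` wrapper returning a macroscopic result such as k-eff; it is not part of the MACRO project code.

```python
import statistics

def propagate_nuclear_data_uncertainty(random_libraries, run_transport):
    """Run the same reactor model once per random ENDF-formatted library and
    take the spread of the macroscopic result (e.g. k-eff) as the
    nuclear-data-induced systematic uncertainty."""
    results = [run_transport(library) for library in random_libraries]
    return statistics.mean(results), statistics.stdev(results)
```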

  16. Massive computation methodology for reactor operation (MACRO)

    Energy Technology Data Exchange (ETDEWEB)

    Gustavsson, Cecilia; Pomp, Stephan; Sjoestrand, Henrik; Wallin, Gustav; Oesterlund, Michael [Division of applied nuclear physics, Department of physics and astronomy, Uppsala University, Laegerhyddsvaegen 1, 751 20 Uppsala (Sweden); Koning, Arjan; Rochman, Dimitri [Nuclear Research and consultancy Group (NRG) Westerduinweg 3, Petten (Netherlands); Bejmer, Klaes-Hakan [Vattenfall Nuclear Fuel AB, Jaemtlandsgatan 99, Vaellingby (Sweden); Henriksson, Hans [Vattenfall Research and Development AB, Jaemtlandsgatan 99, Vaellingby (Sweden)

    2010-07-01

    Today, nuclear data libraries do not handle uncertainties from nuclear data in a consistent manner and the reactor codes do not request uncertainties in nuclear data input. Thus, the output from these codes has unknown uncertainties. The plan is to use a method proposed by Koning and Rochman to investigate the propagation of nuclear data uncertainties into reactor physics codes and macroscopic parameters. A project (acronym MACRO) has started at Uppsala University in collaboration with A. Koning and with financial support from Vattenfall AB and the Swedish Research Council within the GENIUS (Generation IV research in universities of Sweden) project. In the proposed method the uncertainties in nuclear model parameters will be derived from theoretical considerations and from comparisons of nuclear model results with experimental cross-section data. Given the probability distribution of the model parameters, a large set of random, complete ENDF-formatted nuclear data libraries will be created using the TALYS code. The generated nuclear data libraries will then be used in neutron transport codes to obtain macroscopic reactor parameters. For this, models of reactor systems with proper geometry and elements will be used. This will be done for all data libraries, and the variation of the final results will be regarded as a systematic uncertainty in the investigated reactor parameter. The understanding of these systematic uncertainties is especially important for the design and intercomparison of new reactor concepts, i.e., Generation IV; applications to the optimization of current-generation reactors are also envisaged. (authors)

  17. Zooplankton Methodology, Collection & Identification - A field manual

    Digital Repository Service at National Institute of Oceanography (India)

    Goswami, S.C.

    and productivity would largely depend upon the use of correct methodology which involves collection of samples, fixation, preservation, analysis and computation of data. The detailed procedures on all these aspects are given in this manual....

  18. Methodology for Validating Building Energy Analysis Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Judkoff, R.; Wortman, D.; O' Doherty, B.; Burch, J.

    2008-04-01

    The objective of this report was to develop a validation methodology for building energy analysis simulations, collect high-quality, unambiguous empirical data for validation, and apply the validation methodology to the DOE-2.1, BLAST-2MRT, BLAST-3.0, DEROB-3, DEROB-4, and SUNCAT 2.4 computer programs. This report covers background information, literature survey, validation methodology, comparative studies, analytical verification, empirical validation, comparative evaluation of codes, and conclusions.

  19. Current trends in Bayesian methodology with applications

    CERN Document Server

    Upadhyay, Satyanshu K; Dey, Dipak K; Loganathan, Appaia

    2015-01-01

    Collecting Bayesian material scattered throughout the literature, Current Trends in Bayesian Methodology with Applications examines the latest methodological and applied aspects of Bayesian statistics. The book covers biostatistics, econometrics, reliability and risk analysis, spatial statistics, image analysis, shape analysis, Bayesian computation, clustering, uncertainty assessment, high-energy astrophysics, neural networking, fuzzy information, objective Bayesian methodologies, empirical Bayes methods, small area estimation, and many more topics.Each chapter is self-contained and focuses on

  20. Clinical trial methodology

    National Research Council Canada - National Science Library

    Peace, Karl E; Chen, Ding-Geng

    2011-01-01

    ... in the pharmaceutical industry, Clinical trial methodology emphasizes the importance of statistical thinking in clinical research and presents the methodology as a key component of clinical research...

  1. Solar Energy Potential Assessment on Rooftops and Facades in Large Built Environments Based on LiDAR Data, Image Processing, and Cloud Computing. Methodological Background, Application, and Validation in Geneva (Solar Cadaster

    Directory of Open Access Journals (Sweden)

    Gilles Desthieux

    2018-03-01

    The paper presents the core methodology for assessing solar radiation and energy production on building rooftops and vertical facades (still rarely considered) of the inner city. This integrated tool is based on the use of LiDAR and 2D and 3D cadastral data. Together with solar radiation and astronomical models, it calculates the global irradiance for a set of points located on roofs, ground, and facades. Although the tool treats roofs, ground, and facades simultaneously, different shadow-casting methods are applied. Shadow casting on rooftops is based on image-processing techniques. The assessment on facades, on the other hand, first requires creating and interpolating points along the facades and then applying a point-by-point shadow-casting routine. The paper is structured in five parts: (i) state of the art on the use of 3D GIS and automated processes in assessing solar radiation in the built environment; (ii) overview of the methodological framework used in the paper; (iii) detailed presentation of the proposed method for solar modeling and shadow casting, in particular an innovative approach for modeling the sky view factor (SVF); (iv) demonstration of the solar model through applications to Geneva’s building roofs (solar cadaster) and facades; (v) validation of the solar model at selected locations in Geneva, focusing on two distinct comparisons: solar model versus fisheye catchments on partially inclined surfaces (roof component), and solar model versus the photovoltaic simulation tool PVSyst on vertical surfaces (facades). Concerning the roof component, the validation results highlight the sensitivity of the SVF model to the density of light sources on the sky vault. The low-density sky model with 145 light sources gives satisfying results, especially when processing solar cadasters over large urban areas, thus saving computation time. In the case of building facades, introducing weighting factor
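
    As a hedged, generic illustration of the sky view factor idea discussed in this record (a discretized sky vault of light sources, each tested for obstruction), the sketch below computes an SVF as a weighted fraction of visible sky directions. The obstruction test `is_obstructed` stands in for the paper's image-processing shadow-casting step, and the altitude weighting is an assumption, not the paper's formulation.

```python
import math

def sky_view_factor(point, sky_directions, is_obstructed):
    """Approximate the SVF at a point from a discretized sky vault
    (e.g. 145 light sources given as (azimuth, altitude) in radians):
    weighted share of directions with an unobstructed view of the sky."""
    visible = total = 0.0
    for azimuth, altitude in sky_directions:
        weight = math.sin(altitude)   # simple altitude weighting (an assumption)
        total += weight
        if not is_obstructed(point, azimuth, altitude):
            visible += weight
    return visible / total
```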

  2. Probabilistic methodology for turbine missile risk analysis

    International Nuclear Information System (INIS)

    Twisdale, L.A.; Dunn, W.L.; Frank, R.A.

    1984-01-01

    A methodology has been developed for estimation of the probabilities of turbine-generated missile damage to nuclear power plant structures and systems. Mathematical models of the missile generation, transport, and impact events have been developed and sequenced to form an integrated turbine missile simulation methodology. Probabilistic Monte Carlo techniques are used to estimate the plant impact and damage probabilities. The methodology has been coded in the TURMIS computer code to facilitate numerical analysis and plant-specific turbine missile probability assessments. Sensitivity analyses have been performed on both the individual models and the integrated methodology, and probabilities have been estimated for a hypothetical nuclear power plant case study. (orig.)
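
    Not from the TURMIS code itself, but the Monte Carlo sequencing this record describes (missile generation, transport, impact, damage) can be pictured as sampling each event in turn and counting damaging outcomes. The event samplers below are hypothetical placeholders.

```python
import random

def turbine_missile_damage_probability(n_trials, p_generation, sample_trajectory,
                                       hits_plant, damages_target):
    """Monte Carlo estimate of the damage probability by sampling the
    generation, transport and impact events in sequence."""
    damaging = 0
    for _ in range(n_trials):
        if random.random() > p_generation:   # no missile generated in this trial
            continue
        trajectory = sample_trajectory()     # transport (ejection angle, velocity, ...)
        if hits_plant(trajectory) and damages_target(trajectory):
            damaging += 1
    return damaging / n_trials
```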

  3. Artificial Intelligence Techniques and Methodology

    OpenAIRE

    Carbonell, Jaime G.; Sleeman, Derek

    1982-01-01

    Two closely related aspects of artificial intelligence that have received comparatively little attention in the recent literature are research methodology, and the analysis of computational techniques that span multiple application areas. We believe both issues to be increasingly significant as Artificial Intelligence matures into a science and spins off major application efforts. It is imperative to analyze the repertoire of AI methods with respect to past experience, utility in new domains,...

  4. The digital computer

    CERN Document Server

    Parton, K C

    2014-01-01

    The Digital Computer focuses on the principles, methodologies, and applications of the digital computer. The publication takes a look at the basic concepts involved in using a digital computer, simple autocode examples, and examples of working advanced design programs. Discussions focus on transformer design synthesis program, machine design analysis program, solution of standard quadratic equations, harmonic analysis, elementary wage calculation, and scientific calculations. The manuscript then examines commercial and automatic programming, how computers work, and the components of a computer

  5. EVALUATION OF THE GRAI INTEGRATED METHODOLOGY AND THE IMAGIM SUPPORTWARE

    Directory of Open Access Journals (Sweden)

    J.M.C. Reid

    2012-01-01

    This paper describes the GRAI Integrated Methodology and identifies the need for computer tools to support enterprise modelling, design and integration. The IMAGIM tool is then evaluated in terms of its ability to support the GRAI Integrated Methodology. The GRAI Integrated Methodology is an Enterprise Integration methodology developed to support the design of CIM systems. It consists of the GRAI model and a structured approach. The latest addition to the methodology is the IMAGIM software tool, developed by the GRAI research group for the specific purpose of supporting the methodology.

  6. Computing Religion

    DEFF Research Database (Denmark)

    Nielbo, Kristoffer Laigaard; Braxton, Donald M.; Upal, Afzal

    2012-01-01

    The computational approach has become an invaluable tool in many fields that are directly relevant to research in religious phenomena. Yet the use of computational tools is almost absent in the study of religion. Given that religion is a cluster of interrelated phenomena and that research concerning these phenomena should strive for multilevel analysis, this article argues that the computational approach offers new methodological and theoretical opportunities to the study of religion. We argue that the computational approach offers 1) an intermediary step between any theoretical construct and its targeted empirical space and 2) a new kind of data which allows the researcher to observe abstract constructs, estimate likely outcomes, and optimize empirical designs. Because sophisticated multilevel research is a collaborative project, we also seek to introduce to scholars of religion some...

  7. Designing with computational intelligence

    CERN Document Server

    Lopes, Heitor; Mourelle, Luiza

    2017-01-01

    This book discusses a number of real-world applications of computational intelligence approaches. Using various examples, it demonstrates that computational intelligence has become a consolidated methodology for automatically creating new competitive solutions to complex real-world problems. It also presents a concise and efficient synthesis of different systems using computationally intelligent techniques.

  8. Roadmap to greener computing

    CERN Document Server

    Nguemaleu, Raoul-Abelin Choumin

    2014-01-01

    A concise and accessible introduction to green computing and green IT, this book addresses how computer science and the computer infrastructure affect the environment and presents the main challenges in making computing more environmentally friendly. The authors review the methodologies, designs, frameworks, and software development tools that can be used in computer science to reduce energy consumption and still compute efficiently. They also focus on Computer Aided Design (CAD) and describe what design engineers and CAD software applications can do to support new streamlined business directi

  9. A dose to curie conversion methodology

    International Nuclear Information System (INIS)

    Stowe, P.A.

    1987-01-01

    Development of the computer code RadCAT (Radioactive waste Classification And Tracking) has led to a simple dose-rate-to-curie-content conversion methodology for containers with internally distributed radioactive material. It was determined early on that, if possible, the computerized dose rate to curie evaluation model employed in RadCAT should yield the same results as the hand method specified in plant procedures. A review of current industry practices indicated that two distinct types of computational methodologies are presently in use. The most common methods are computer-based calculations utilizing complex mathematical models established for specific container geometries. This type of evaluation is tedious, however, and does not lend itself to repetition by hand. The second type consists of simplified expressions that sacrifice accuracy for ease of computation and generally overestimate container curie content. To meet the aforementioned criterion, current computer-based models were deemed unacceptably complex and hand computational methods too inaccurate for serious consideration. The contact dose rate/curie content analysis methodology presented herein provides an equation that is easy to use in hand calculations yet yields accuracy equivalent to other computer-based computations
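
    The record does not give its conversion equation, so as a hedged illustration of the general dose-rate-to-activity idea only, a point-source approximation inverts D = Γ·A/r² for the activity. The geometry and the gamma-ray constant here are assumptions, not the RadCAT model.

```python
def activity_from_dose_rate(dose_rate, distance, gamma_constant):
    """Point-source approximation: invert D = Gamma * A / r**2 for the activity A.
    Units must be consistent, e.g. D in mR/h, r in m, Gamma in mR*m**2/(h*Ci)."""
    return dose_rate * distance ** 2 / gamma_constant
```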

  10. Computer Assisted Instruction

    Science.gov (United States)

    Higgins, Paul

    1976-01-01

    Methodology for developing a computer assisted instruction (CAI) lesson (scripting, programing, and testing) is reviewed. A project done by Informatics Education Ltd. (IEL) for the Department of National Defense (DND) is used as an example. (JT)

  11. Update of Part 61 impacts analysis methodology

    International Nuclear Information System (INIS)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    The US Nuclear Regulatory Commission is expanding the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of costs and impacts of disposal of waste that exceeds Class C concentrations. The project includes updating the computer codes that comprise the methodology, reviewing and updating data assumptions on waste streams and disposal technologies, and calculation of costs for small as well as large disposal facilities. This paper outlines work done to date on this project

  12. Update of Part 61 impacts analysis methodology

    International Nuclear Information System (INIS)

    Oztunali, O.I.; Roles, G.W.; US Nuclear Regulatory Commission, Washington, DC 20555)

    1985-01-01

    The US Nuclear Regulatory Commission is expanding the impacts analysis methodology used during the development of the 10 CFR Part 61 regulation to allow improved consideration of costs and impacts of disposal of waste that exceeds Class C concentrations. The project includes updating the computer codes that comprise the methodology, reviewing and updating data assumptions on waste streams and disposal technologies, and calculation of costs for small as well as large disposal facilities. This paper outlines work done to date on this project

  13. Research Methodology in Global Strategy Research

    DEFF Research Database (Denmark)

    Cuervo-Cazurra, Alvaro; Mudambi, Ram; Pedersen, Torben

    2017-01-01

    We review advances in research methodology used in global strategy research and provide suggestions on how researchers can improve their analyses and arguments. Methodological advances in the extraction of information, such as computer-aided text analysis, and in the analysis of datasets, such as differences-in-differences and propensity score matching, have helped deal with challenges (e.g., endogeneity and causality) that bedeviled earlier studies and resulted in conflicting findings. These methodological advances need to be considered as tools that complement theoretical arguments and well-explained logics and mechanisms so that researchers can provide better and more relevant recommendations to managers designing the global strategies of their organizations.

  14. Residual radioactive material guidelines: Methodology and applications

    International Nuclear Information System (INIS)

    Yu, C.; Yuan, Y.C.; Zielen, A.J.; Wallo, A. III.

    1989-01-01

    A methodology to calculate residual radioactive material guidelines was developed for the US Department of Energy (DOE). This methodology is coded in a menu-driven computer program, RESRAD, which can be run on IBM or IBM-compatible microcomputers. Seven pathways of exposure are considered: external radiation, inhalation, and ingestion of plant foods, meat, milk, aquatic foods, and water. The RESRAD code has been applied to several DOE sites to calculate soil cleanup guidelines. This experience has shown that the computer code is easy to use and very user-friendly. 3 refs., 8 figs

  15. Overview of the ISAM safety assessment methodology

    International Nuclear Information System (INIS)

    Simeonov, G.

    2003-01-01

    The ISAM safety assessment methodology consists of the following key components: specification of the assessment context; description of the disposal system; development and justification of scenarios; formulation and implementation of models; running of computer codes; and analysis and presentation of results. Common issues run through two or more of these assessment components, including the use of methodological and computer tools, the collation and use of data, the need to address various sources of uncertainty, and the building of confidence in the individual components as well as in the overall assessment. The importance of the iterative nature of the assessment should be recognised

  16. Scenario development methodologies

    International Nuclear Information System (INIS)

    Eng, T.; Hudson, J.; Stephansson, O.

    1994-11-01

    In the period 1981-1994, SKB has studied several methodologies to systematize and visualize all the features, events and processes (FEPs) that can influence a repository for radioactive waste in the future. All the work performed is based on the terminology and basic findings in the joint SKI/SKB work on scenario development presented in the SKB Technical Report 89-35. The methodologies studied are a) Event tree analysis, b) Influence diagrams and c) Rock Engineering Systems (RES) matrices. Each one of the methodologies is explained in this report as well as examples of applications. One chapter is devoted to a comparison between the two most promising methodologies, namely: Influence diagrams and the RES methodology. In conclusion a combination of parts of the Influence diagram and the RES methodology is likely to be a promising approach. 26 refs

  17. Reliability Centered Maintenance - Methodologies

    Science.gov (United States)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  18. Simulation enabled safeguards assessment methodology

    International Nuclear Information System (INIS)

    Bean, Robert; Bjornard, Trond; Larson, Tom

    2007-01-01

    It is expected that nuclear energy will be a significant component of future supplies. New facilities, operating under a strengthened international nonproliferation regime will be needed. There is good reason to believe virtual engineering applied to the facility design, as well as to the safeguards system design will reduce total project cost and improve efficiency in the design cycle. Simulation Enabled Safeguards Assessment MEthodology has been developed as a software package to provide this capability for nuclear reprocessing facilities. The software architecture is specifically designed for distributed computing, collaborative design efforts, and modular construction to allow step improvements in functionality. Drag and drop wire-frame construction allows the user to select the desired components from a component warehouse, render the system for 3D visualization, and, linked to a set of physics libraries and/or computational codes, conduct process evaluations of the system they have designed. (authors)

  19. Simulation Enabled Safeguards Assessment Methodology

    International Nuclear Information System (INIS)

    Robert Bean; Trond Bjornard; Thomas Larson

    2007-01-01

    It is expected that nuclear energy will be a significant component of future supplies. New facilities, operating under a strengthened international nonproliferation regime will be needed. There is good reason to believe virtual engineering applied to the facility design, as well as to the safeguards system design will reduce total project cost and improve efficiency in the design cycle. Simulation Enabled Safeguards Assessment Methodology (SESAME) has been developed as a software package to provide this capability for nuclear reprocessing facilities. The software architecture is specifically designed for distributed computing, collaborative design efforts, and modular construction to allow step improvements in functionality. Drag and drop wireframe construction allows the user to select the desired components from a component warehouse, render the system for 3D visualization, and, linked to a set of physics libraries and/or computational codes, conduct process evaluations of the system they have designed

  20. Introduction to LCA Methodology

    DEFF Research Database (Denmark)

    Hauschild, Michael Z.

    2018-01-01

    In order to offer the reader an overview of the LCA methodology in preparation for the more detailed description of its different phases, a brief introduction is given to the methodological framework according to the ISO 14040 standard and to the main elements of each of its phases. Emphasis

  1. Archetype modeling methodology.

    Science.gov (United States)

    Moner, David; Maldonado, José Alberto; Robles, Montserrat

    2018-03-01

    Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although many experiences of using archetypes have been reported in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to develop quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the involved participants and tools. It also includes the description of the possible strategies to organize the modeling process. The proposed methodology is inspired by existing best practices of CIMs, software and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. The application of the methodology provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can also be a reference for CIM development using any other formalism. Copyright © 2018 Elsevier Inc. All rights reserved.

  2. Menopause and Methodological Doubt

    Science.gov (United States)

    Spence, Sheila

    2005-01-01

    Menopause and methodological doubt begins by making a tongue-in-cheek comparison between Descartes' methodological doubt and the self-doubt that can arise around menopause. A hermeneutic approach is taken in which Cartesian dualism and its implications for the way women are viewed in society are examined, both through the experiences of women…

  3. VEM: Virtual Enterprise Methodology

    DEFF Research Database (Denmark)

    Tølle, Martin; Vesterager, Johan

    2003-01-01

    This chapter presents a virtual enterprise methodology (VEM) that outlines activities to consider when setting up and managing virtual enterprises (VEs). As a methodology the VEM helps companies to ask the right questions when preparing for and setting up an enterprise network, which works...

  4. Data Centric Development Methodology

    Science.gov (United States)

    Khoury, Fadi E.

    2012-01-01

    Data centric applications, an important effort of software development in large organizations, have mostly adopted a software methodology, such as waterfall or the Rational Unified Process, as the framework for their development. These methodologies can work for structural, procedural, or object-oriented applications, but fail to capture…

  5. The Methodology of Magpies

    Science.gov (United States)

    Carter, Susan

    2014-01-01

    Arts/Humanities researchers frequently do not explain methodology overtly; instead, they "perform" it through their use of language, textual and historic cross-reference, and theory. Here, methodologies from literary studies are shown to add to Higher Education (HE) an exegetical and critically pluralist approach. This includes…

  6. Computers in writing instruction

    NARCIS (Netherlands)

    Schwartz, Helen J.; van der Geest, Thea; Smit-Kreuzen, Marlies

    1992-01-01

    For computers to be useful in writing instruction, innovations should be valuable for students and feasible for teachers to implement. Research findings yield contradictory results in measuring the effects of different uses of computers in writing, in part because of the methodological complexity of

  7. Design of formulated products: a systematic methodology

    DEFF Research Database (Denmark)

    Conte, Elisa; Gani, Rafiqul; Ng, K.M.

    2011-01-01

    /or verifies a specified set through a sequence of predefined activities (work-flow). Stage-2 and stage-3 (not presented here) deal with the planning and execution of experiments, for product validation. Four case studies have been developed to test the methodology. The computer-aided design (stage-1...

  8. Basic Project Management Methodologies for Survey Researchers.

    Science.gov (United States)

    Beach, Robert H.

    To be effective, project management requires a heavy dependence on the document, list, and computational capability of a computerized environment. Now that microcomputers are readily available, only the rediscovery of classic project management methodology is required for improved resource allocation in small research projects. This paper provides…

  9. The fractional scaling methodology (FSM) Part 1. methodology development

    International Nuclear Information System (INIS)

    Novak Zuber; Ivan Catton; Upendra S Rohatgi; Wolfgang Wulff

    2005-01-01

    Full text of publication follows: A quantitative methodology is developed, based on the concepts of hierarchy and synthesis, to integrate and organize information and data. The methodology uses scaling to synthesize experimental data and analytical results, and to provide quantitative criteria for evaluating the effects of various design and operating parameters that influence processes in a complex system such as a nuclear power plant or a related test facility. Synthesis and scaling are performed on three hierarchical levels: the process, component and system levels. Scaling on the process level determines the effect of a selected process on a particular state variable during a selected scenario. At the component level the scaling determines the effects various processes have on a state variable, and it ranks the processes in order of importance by the magnitude of the fractional change they cause in that state variable. At the system level the scaling determines the governing processes and corresponding components, ranking these in order of importance according to their effect on the fractional change of system-wide state variables. The scaling methodology reveals on all levels the fractional change of state variables and is therefore called the Fractional Scaling Methodology (FSM). FSM synthesizes process parameters and assigns to each thermohydraulic process a dimensionless effect metric Ω = ωt, i.e., the product of the specific rate of fractional change ω and the characteristic time t. The rate of fractional change ω is the ratio of the process transport rate over the content of a preserved quantity in a component. The effect metric Ω quantifies the contribution of the process to the fractional change of a state variable in a given component. Ordering of the component effect metrics provides the hierarchy of processes in a component, then in all components and the system. FSM separates quantitatively dominant from minor processes and components and
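
    Restated compactly, with the same notation as the abstract, the FSM effect metric is:

```latex
\omega \;=\; \frac{\text{process transport rate}}{\text{content of the preserved quantity in the component}},
\qquad
\Omega \;=\; \omega\, t ,
```

    and processes within a component are ranked by the magnitude of Ω.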

  10. Design Methodology - Design Synthesis

    DEFF Research Database (Denmark)

    Andreasen, Mogens Myrup

    2003-01-01

    Design Methodology is part of our practice and our knowledge about designing, and it has been strongly supported by the establishment and work of a design research community. The aim of this article is to broaden the reader's view of designing and Design Methodology. This is done by sketching the development of Design Methodology through time and by sketching some important approaches and methods. The development is mainly driven by changing industrial conditions, by the growth of IT support for designing, but also by the growth of insight into designing created by design researchers. Design Methodology shall be seen as our understanding of how to design; it is an early (emerging in the late 1960s) and original articulation of teachable and learnable methodics. The insight is based upon two sources: the nature of the designed artefacts and the nature of human designing. Today...

  11. GPS system simulation methodology

    Science.gov (United States)

    Ewing, Thomas F.

    1993-01-01

    The following topics are presented: background; Global Positioning System (GPS) methodology overview; the graphical user interface (GUI); current models; application to space nuclear power/propulsion; and interfacing requirements. The discussion is presented in vugraph form.

  12. Hazard classification methodology

    International Nuclear Information System (INIS)

    Brereton, S.J.

    1996-01-01

    This document outlines the hazard classification methodology used to determine the hazard classification of the NIF LTAB, OAB, and the support facilities on the basis of radionuclides and chemicals. The hazard classification determines the safety analysis requirements for a facility

  13. Nonlinear Image Denoising Methodologies

    National Research Council Canada - National Science Library

    Yufang, Bao

    2002-01-01

    In this thesis, we propose a theoretical as well as practical framework for combining geometric prior information with a statistical/probabilistic methodology in the investigation of a denoising problem...

  14. Clinical trial methodology

    National Research Council Canada - National Science Library

    Peace, Karl E; Chen, Ding-Geng

    2011-01-01

    "Now viewed as its own scientific discipline, clinical trial methodology encompasses the methods required for the protection of participants in a clinical trial and the methods necessary to provide...

  15. Methodology of sustainability accounting

    Directory of Open Access Journals (Sweden)

    O.H. Sokil

    2017-03-01

    Modern challenges to the theory and methodology of accounting are addressed through the formation and implementation of new concepts whose purpose is to meet the needs of users for both standard and unique information. The development of a methodology for sustainability accounting is a key aspect of the management of an economic entity. The purpose of the article is to form the methodological bases of accounting for sustainable development and to determine its goals, objectives, object, subject, methods, functions and key aspects. The author analyzes the theoretical bases of the definition and considers the components of the traditional accounting methodology. A generalized structural diagram of the methodology for sustainable development accounting is offered in the article. The complex of methods and principles of sustainable development accounting, covering both standard and non-standard provisions, has been systematized. The new system of theoretical and methodological provisions of accounting for sustainable development is justified in the context of determining its purpose, objective, subject, object, methods, functions and key aspects.

  16. Survey of Dynamic PSA Methodologies

    International Nuclear Information System (INIS)

    Lee, Hansul; Kim, Hyeonmin; Heo, Gyunyoung; Kim, Taewan

    2015-01-01

    Event Tree (ET)/Fault Tree (FT) analysis is a core methodology in Probabilistic Safety Assessment (PSA) for Nuclear Power Plants (NPPs). The ET/FT methodology has the advantage that users can easily learn and model with it, and it enables better communication between engineers engaged in the same field. However, conventional methodologies have difficulty coping with dynamic behavior (e.g. operation mode changes or sequence-dependent failures) and with integrated situations of mechanical failure and human error. Meanwhile, new possibilities for improved PSA are emerging by virtue of the dramatic development of digital hardware, software, information technology, and data analysis. More specifically, the computing environment has greatly improved compared to the past, so risk analysis can now be conducted with the large amount of data actually available. One method that can take advantage of these technological developments is dynamic PSA, in which conventional ET/FT can have time- and condition-dependent behaviors in accident scenarios. In this paper, we investigated the various enabling techniques for dynamic PSA. Even though its history and academic achievements are substantial, it has attracted less interest from an industrial and regulatory viewpoint. The authors expect this survey to contribute to a better understanding of dynamic PSA in terms of algorithms, practice, and applicability. In the paper, an overview of dynamic PSA was conducted. Most of the methodologies share similar concepts. Among them, DDET appears to be a backbone for most of the methodologies since it can be applied to large problems. The common characteristics sharing the concept of DDET are as follows: • both deterministic and stochastic approaches • improves the identification of PSA success criteria • helps to limit detrimental effects of sequence binning (normally adopted in PSA) • helps to avoid defining non-optimal success criteria that may distort the risk • framework for comprehensively considering

  17. Survey of Dynamic PSA Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hansul; Kim, Hyeonmin; Heo, Gyunyoung [Kyung Hee University, Yongin (Korea, Republic of); Kim, Taewan [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2015-05-15

    Event Tree (ET)/Fault Tree (FT) analysis is a core methodology in Probabilistic Safety Assessment (PSA) for Nuclear Power Plants (NPPs). The ET/FT methodology has the advantage that users can easily learn and model with it, and it enables better communication between engineers engaged in the same field. However, conventional methodologies have difficulty coping with dynamic behavior (e.g. operation mode changes or sequence-dependent failures) and with integrated situations of mechanical failure and human error. Meanwhile, new possibilities for improved PSA are emerging by virtue of the dramatic development of digital hardware, software, information technology, and data analysis. More specifically, the computing environment has greatly improved compared to the past, so risk analysis can now be conducted with the large amount of data actually available. One method that can take advantage of these technological developments is dynamic PSA, in which conventional ET/FT can have time- and condition-dependent behaviors in accident scenarios. In this paper, we investigated the various enabling techniques for dynamic PSA. Even though its history and academic achievements are substantial, it has attracted less interest from an industrial and regulatory viewpoint. The authors expect this survey to contribute to a better understanding of dynamic PSA in terms of algorithms, practice, and applicability. In the paper, an overview of dynamic PSA was conducted. Most of the methodologies share similar concepts. Among them, DDET appears to be a backbone for most of the methodologies since it can be applied to large problems. The common characteristics sharing the concept of DDET are as follows: • both deterministic and stochastic approaches • improves the identification of PSA success criteria • helps to limit detrimental effects of sequence binning (normally adopted in PSA) • helps to avoid defining non-optimal success criteria that may distort the risk • framework for comprehensively considering

  18. Regional Shelter Analysis Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Dillon, Michael B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dennison, Deborah [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kane, Jave [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Walker, Hoyt [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Miller, Paul [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-08-01

    The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
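
    As a hedged sketch of the combination step described in this record (building protection factors weighted by where people are), and not LLNL's actual implementation: exposure in each location is reduced by 1/PF, so a population-weighted effective protection factor can be formed as below. The location names and numbers in the usage comment are illustrative only.

```python
def regional_protection_factor(population_fractions, protection_factors):
    """Effective regional protection factor: the effective transmission of
    fallout dose is the population-weighted mean of 1/PF over locations."""
    transmission = sum(fraction / protection_factors[location]
                       for location, fraction in population_fractions.items())
    return 1.0 / transmission

# Illustrative numbers only:
# regional_protection_factor({"light frame": 0.6, "basement": 0.3, "outdoors": 0.1},
#                            {"light frame": 3.0, "basement": 10.0, "outdoors": 1.0})
```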

  19. Optimized planning methodologies of ASON implementation

    Science.gov (United States)

    Zhou, Michael M.; Tamil, Lakshman S.

    2005-02-01

    Advanced network planning concerns effective network-resource allocation for a dynamic and open business environment. Planning methodologies for ASON implementation, based on qualitative analysis and mathematical modeling, are presented in this paper. The methodology includes methods for rationalizing technology and architecture, building network and nodal models, and developing dynamic programming for multi-period deployment. The multi-layered nodal architecture proposed here can accommodate various nodal configurations for a multi-plane optical network, and the network modeling presented here computes the required network elements for optimizing resource allocation.

  20. Methodology for evaluation of diagnostic performance

    International Nuclear Information System (INIS)

    Metz, C.E.

    1992-01-01

    Effort in this project during the past year has focused on the development, refinement, and distribution of computer software that will allow current Receiver Operating Characteristic (ROC) methodology to be used conveniently and reliably by investigators in a variety of evaluation tasks in diagnostic medicine; and on the development of new ROC methodology that will broaden the spectrum of evaluation tasks and/or experimental settings to which the fundamental approach can be applied. Progress has been limited by the amount of financial support made available to the project

  1. Simplified methodology for Angra 1 containment analysis

    International Nuclear Information System (INIS)

    Neves Conti, T. das; Souza, A.L. de; Sabundjian, G.

    1991-08-01

    A simplified analysis methodology was developed to simulate a Large Break Loss of Coolant Accident in the Angra 1 Nuclear Power Station. Using the RELAP5/MOD1, RELAP4/MOD5 and CONTEMPT-LT codes, the time variation of pressure and temperature in the containment was analysed. The data obtained were compared with the Angra 1 Final Safety Analysis Report and with those calculated by a detailed model. The results obtained with this new methodology, together with its small computational simulation time, were satisfactory for a preliminary evaluation of the Angra 1 global parameters. (author)

  2. Methodology for the hybrid solution of systems of differential equations

    International Nuclear Information System (INIS)

    Larrinaga, E.F.; Lopez, M.A.

    1993-01-01

    This work presents a general methodology for solving systems of differential equations on hybrid computers. Based on this methodology, a mathematical model was elaborated that offers wide possibilities for recording and handling the results, using the hybrid IBM-VIDAC 1224 system available at the ISCTN. The work also presents the results obtained when simulating a simple model of a nuclear reactor, which was used to validate the results of the computational model

  3. The policy trail methodology

    DEFF Research Database (Denmark)

    Holford, John; Larson, Anne; Melo, Susana

    In recent years, the “policy trail” has been proposed as a methodology appropriate to the shifting and fluid governance of lifelong learning in the late modern world (Holford et al. 2013, Holford et al. 2013, Cort 2014). The contemporary environment is marked by multi-level governance (global… …of ‘policy trail’, arguing that it can overcome ‘methodological nationalism’ and link structure and agency in research on the ‘European educational space’. The ‘trail’ metaphor, she suggests, captures the intentionality and the erratic character of policy. The trail connects sites and brings about change, but – although policy may be intended to be linear, with specific outcomes – policy often has to bend, and sometimes meets insurmountable obstacles. This symposium outlines and develops the methodology, but also reports on research undertaken within a major FP7 project (LLLIght’in’Europe, 2012-15) which made use…

  4. Changing methodologies in TESOL

    CERN Document Server

    Spiro, Jane

    2013-01-01

    Covering core topics from vocabulary and grammar to teaching writing, speaking and listening, this textbook shows you how to link research to practice in TESOL methodology. It emphasises how current understandings have impacted on the language classroom worldwide and investigates the meaning of 'methods' and 'methodology' and the importance of these for the teacher, as well as the underlying assumptions and beliefs teachers bring to bear in their practice. By introducing you to language teaching approaches, you will explore the way these are influenced by developments in our understanding of l

  5. Creativity in phenomenological methodology

    DEFF Research Database (Denmark)

    Dreyer, Pia; Martinsen, Bente; Norlyk, Annelise

    2014-01-01

    Nursing research is often concerned with lived experiences in human life using phenomenological and hermeneutic approaches. These empirical studies may use different creative expressions and art-forms to describe and enhance an embodied and personalised understanding of lived experiences. Drawing on the methodologies of van Manen, Dahlberg, Lindseth & Norberg, the aim of this paper is to argue that the increased focus on creativity and arts in research methodology is valuable for gaining a deeper insight into lived experiences. We illustrate this point through examples from empirical nursing studies, and discuss how such approaches may support a respectful renewal of phenomenological research traditions in nursing research.

  6. Review and evaluation of paleohydrologic methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Foley, M.G.; Zimmerman, D.A.; Doesburg, J.M.; Thorne, P.D.

    1982-12-01

    A literature review was conducted to identify methodologies that could be used to interpret paleohydrologic environments. Paleohydrology is the study of past hydrologic systems or of the past behavior of an existing hydrologic system. The purpose of the review was to evaluate how well these methodologies could be applied to the siting of low-level radioactive waste facilities. The computer literature search queried five bibliographical data bases containing over five million citations of technical journals, books, conference papers, and reports. Two data-base searches (United States Geological Survey - USGS) and a manual search were also conducted. The methodologies were examined for data requirements and sensitivity limits. Paleohydrologic interpretations are uncertain because of the effects of time on hydrologic and geologic systems and because of the complexity of fluvial systems. Paleoflow determinations appear in many cases to be order-of-magnitude estimates. However, the methodologies identified in this report mitigate this uncertainty when used collectively as well as independently. That is, the data from individual methodologies can be compared or combined to corroborate hydrologic predictions. In this manner, paleohydrologic methodologies are viable tools to assist in evaluating the likely future hydrology of low-level radioactive waste sites.

  7. Review and evaluation of paleohydrologic methodologies

    International Nuclear Information System (INIS)

    Foley, M.G.; Zimmerman, D.A.; Doesburg, J.M.; Thorne, P.D.

    1982-12-01

    A literature review was conducted to identify methodologies that could be used to interpret paleohydrologic environments. Paleohydrology is the study of past hydrologic systems or of the past behavior of an existing hydrologic system. The purpose of the review was to evaluate how well these methodologies could be applied to the siting of low-level radioactive waste facilities. The computer literature search queried five bibliographical data bases containing over five million citations of technical journals, books, conference papers, and reports. Two data-base searches (United States Geological Survey - USGS) and a manual search were also conducted. The methodologies were examined for data requirements and sensitivity limits. Paleohydrologic interpretations are uncertain because of the effects of time on hydrologic and geologic systems and because of the complexity of fluvial systems. Paleoflow determinations appear in many cases to be order-of-magnitude estimates. However, the methodologies identified in this report mitigate this uncertainty when used collectively as well as independently. That is, the data from individual methodologies can be compared or combined to corroborate hydrologic predictions. In this manner, paleohydrologic methodologies are viable tools to assist in evaluating the likely future hydrology of low-level radioactive waste sites

  8. A performance assessment methodology for low-level waste facilities

    International Nuclear Information System (INIS)

    Kozak, M.W.; Chu, M.S.Y.; Mattingly, P.A.

    1990-07-01

    A performance assessment methodology has been developed for use by the US Nuclear Regulatory Commission in evaluating license applications for low-level waste disposal facilities. This report provides a summary of background reports on the development of the methodology and an overview of the models and codes selected for the methodology. The overview includes discussions of the philosophy and structure of the methodology and a sequential procedure for applying the methodology. Discussions are provided of models and associated assumptions that are appropriate for each phase of the methodology, the goals of each phase, data required to implement the models, significant sources of uncertainty associated with each phase, and the computer codes used to implement the appropriate models. In addition, a sample demonstration of the methodology is presented for a simple conceptual model. 64 refs., 12 figs., 15 tabs

  9. SCI Hazard Report Methodology

    Science.gov (United States)

    Mitchell, Michael S.

    2010-01-01

    This slide presentation reviews the methodology in creating a Source Control Item (SCI) Hazard Report (HR). The SCI HR provides a system safety risk assessment for the following Ares I Upper Stage Production Contract (USPC) components (1) Pyro Separation Systems (2) Main Propulsion System (3) Reaction and Roll Control Systems (4) Thrust Vector Control System and (5) Ullage Settling Motor System components.

  10. A Functional HAZOP Methodology

    DEFF Research Database (Denmark)

    Liin, Netta; Lind, Morten; Jensen, Niels

    2010-01-01

    A HAZOP methodology is presented where a functional plant model assists in a goal oriented decomposition of the plant purpose into the means of achieving the purpose. This approach leads to nodes with simple functions from which the selection of process and deviation variables follow directly...

  11. Complicating Methodological Transparency

    Science.gov (United States)

    Bridges-Rhoads, Sarah; Van Cleave, Jessica; Hughes, Hilary E.

    2016-01-01

    A historical indicator of the quality, validity, and rigor of qualitative research has been the documentation and disclosure of the behind-the-scenes work of the researcher. In this paper, we use what we call "methodological data" as a tool to complicate the possibility and desirability of such transparency. Specifically, we draw on our…

  12. Methodological Advances in Dea

    NARCIS (Netherlands)

    L. Cherchye (Laurens); G.T. Post (Thierry)

    2001-01-01

    We survey the methodological advances in DEA over the last 25 years and discuss the necessary conditions for a sound empirical application. We hope this survey will contribute to the further dissemination of DEA, the knowledge of its relative strengths and weaknesses, and the tools

  13. NUSAM Methodology for Assessment.

    Energy Technology Data Exchange (ETDEWEB)

    Leach, Janice [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Snell, Mark K. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-07-01

    This document provides a methodology for the performance-based assessment of security systems designed for the protection of nuclear and radiological materials and the processes that produce and/or involve them. It is intended for use with both relatively simple installations and with highly regulated complex sites with demanding security requirements.

  14. MIRD methodology. Part 1

    International Nuclear Information System (INIS)

    Rojo, Ana M.

    2004-01-01

    This lecture develops the MIRD (Medical Internal Radiation Dose) methodology for the evaluation of the internal dose due to the administration of radiopharmaceuticals. In this first part, the basic concepts and the main equations are presented. The ICRP Dosimetric System is also explained. (author)
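
    For orientation, the core relation of the MIRD formalism (standard notation, general to the formalism rather than specific to this lecture) links the mean absorbed dose in a target region to the accumulated activity in each source region through the S values:

```latex
\bar{D}(r_T) \;=\; \sum_{r_S} \tilde{A}(r_S)\, S(r_T \leftarrow r_S),
\qquad
\tilde{A}(r_S) \;=\; \int_0^{\infty} A(r_S, t)\,\mathrm{d}t .
```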

  15. Response Surface Methodology

    NARCIS (Netherlands)

    Kleijnen, Jack P.C.

    2014-01-01

    Abstract: This chapter first summarizes Response Surface Methodology (RSM), which started with Box and Wilson’s article in 1951 on RSM for real, non-simulated systems. RSM is a stepwise heuristic that uses first-order polynomials to approximate the response surface locally. An estimated polynomial
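
    A minimal numerical sketch of the local first-order step summarized above (fit a first-order polynomial by least squares, then move in the estimated steepest-ascent direction); this is illustrative only, not code from the chapter.

```python
import numpy as np

def first_order_rsm_direction(X, y):
    """Fit the local first-order model y ~ b0 + b1*x1 + ... + bk*xk by ordinary
    least squares and return the estimated steepest-ascent direction (unit vector)."""
    A = np.column_stack([np.ones(len(X)), np.asarray(X, dtype=float)])
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(y, dtype=float), rcond=None)
    gradient = coeffs[1:]                       # first-order coefficients
    return gradient / np.linalg.norm(gradient)  # direction of steepest ascent
```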

  16. MIRD methodology. Part 2

    International Nuclear Information System (INIS)

    Gomez Parada, Ines

    2004-01-01

    This paper develops the MIRD (Medical Internal Radiation Dose) methodology for the evaluation of the internal dose due to the administration of radiopharmaceuticals. In this second part, different methods for the calculation of the accumulated activity are presented, together with the effective half life definition. Different forms of Retention Activity curves are also shown. (author)
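
    For the mono-exponential retention case (a standard special case, shown here for illustration rather than as the specific curves treated in the lecture), the effective half-life combines the physical and biological half-lives and fixes the accumulated activity:

```latex
\frac{1}{T_{\mathrm{eff}}} \;=\; \frac{1}{T_{\mathrm{phys}}} + \frac{1}{T_{\mathrm{biol}}},
\qquad
\tilde{A} \;=\; \int_0^{\infty} A_0\, e^{-\lambda_{\mathrm{eff}} t}\,\mathrm{d}t
\;=\; \frac{A_0\, T_{\mathrm{eff}}}{\ln 2} \;\approx\; 1.443\, A_0\, T_{\mathrm{eff}} .
```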

  17. A software methodology for compiling quantum programs

    Science.gov (United States)

    Häner, Thomas; Steiger, Damian S.; Svore, Krysta; Troyer, Matthias

    2018-04-01

    Quantum computers promise to transform our notions of computation by offering a completely new paradigm. To achieve scalable quantum computation, optimizing compilers and a corresponding software design flow will be essential. We present a software architecture for compiling quantum programs from a high-level language program to hardware-specific instructions. We describe the necessary layers of abstraction and their differences and similarities to classical layers of a computer-aided design flow. For each layer of the stack, we discuss the underlying methods for compilation and optimization. Our software methodology facilitates more rapid innovation among quantum algorithm designers, quantum hardware engineers, and experimentalists. It enables scalable compilation of complex quantum algorithms and can be targeted to any specific quantum hardware implementation.

  18. Selection methodology for LWR safety programs and proposals. Volume 2. Methodology application

    International Nuclear Information System (INIS)

    Ritzman, R.L.; Husseiny, A.A.

    1980-08-01

    The results of work done to update and apply a methodology for selecting (prioritizing) LWR safety technology R and D programs are described. The methodology is based on multiattribute utility (MAU) theory. Application of the methodology to rank-order a group of specific R and D programs included development of a complete set of attribute utility functions, specification of individual attribute scaling constants, and refinement and use of an interactive computer program (MAUP) to process decision-maker inputs and generate overall (multiattribute) program utility values. The output results from several decision-makers are examined for consistency and conclusions and recommendations regarding general use of the methodology are presented. 3 figures, 18 tables
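
    The core MAU aggregation step can be illustrated with a weighted additive form, the simplest case of multiattribute utility. The attributes, scaling constants and single-attribute utilities below are hypothetical and do not reproduce the report's actual functions or weights.

```python
# Minimal sketch of a multiattribute utility (MAU) ranking with a weighted
# additive form; attributes, weights and scores are hypothetical.
attributes = ["safety_benefit", "cost", "timeliness"]
weights = {"safety_benefit": 0.6, "cost": 0.25, "timeliness": 0.15}  # scaling constants, sum to 1

# Single-attribute utilities u_i in [0, 1] for each candidate R&D program.
programs = {
    "Program A": {"safety_benefit": 0.9, "cost": 0.4, "timeliness": 0.7},
    "Program B": {"safety_benefit": 0.6, "cost": 0.8, "timeliness": 0.9},
}

def overall_utility(scores):
    """Weighted additive multiattribute utility."""
    return sum(weights[a] * scores[a] for a in attributes)

ranked = sorted(programs, key=lambda p: overall_utility(programs[p]), reverse=True)
for p in ranked:
    print(p, round(overall_utility(programs[p]), 3))
```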

  19. Gamma ray auto absorption correction evaluation methodology

    International Nuclear Information System (INIS)

    Gugiu, Daniela; Roth, Csaba; Ghinescu, Alecse

    2010-01-01

    Neutron activation analysis (NAA) is a well-established nuclear technique, suited to investigating the microstructural or elemental composition, and can be applied to studies of a large variety of samples. Work with large samples involves, besides the development of large irradiation devices with well-known neutron field characteristics, knowledge of perturbing phenomena and adequate evaluation of correction factors such as neutron self-shielding, extended source correction, and gamma ray auto absorption. The objective of the work presented in this paper is to validate an appropriate methodology for gamma ray auto absorption correction evaluation for large inhomogeneous samples. For this purpose a benchmark experiment has been defined - a simple gamma ray transmission experiment, easy to reproduce. The gamma ray attenuation in pottery samples has been measured and computed using the MCNP5 code. The results show a good agreement between the computed and measured values, proving that the proposed methodology is able to evaluate the correction factors. (authors)
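
    A hedged sketch of the kind of correction factor involved: for a uniformly active slab viewed through its thickness, the average self-absorption follows directly from the Beer-Lambert law. The attenuation coefficient and thickness below are hypothetical, and the paper's own evaluation relies on the measured/MCNP5 benchmark rather than this idealised geometry.

```python
# Minimal sketch of a gamma-ray self-absorption correction for a slab sample,
# using the Beer-Lambert law; mu and thickness are hypothetical values.
import math

mu = 0.2   # linear attenuation coefficient at the gamma energy, 1/cm (hypothetical)
t = 2.0    # sample thickness along the detector axis, cm

# Average self-absorption factor for a uniformly active slab viewed through
# its thickness: f = (1 - exp(-mu*t)) / (mu*t).
f_abs = (1.0 - math.exp(-mu * t)) / (mu * t)
correction = 1.0 / f_abs   # multiply the measured count rate by this factor

print(f"self-absorption factor = {f_abs:.3f}")
print(f"correction to apply    = {correction:.3f}")
```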

  20. Cross-section methodology in SIMMER

    International Nuclear Information System (INIS)

    Soran, P.D.

    1975-11-01

    The cross-section methodology incorporated in the SIMMER code is described. The data base for all cross sections is the ENDF/B system, with various processing computer codes used to group-collapse and modify the group constants used in SIMMER. Either infinitely dilute cross sections or the Bondarenko formalism can be used in SIMMER. Presently only a microscopic treatment is considered, but preliminary macroscopic algorithms have been investigated.

  1. Implied Volatility Surface: Construction Methodologies and Characteristics

    OpenAIRE

    Cristian Homescu

    2011-01-01

    The implied volatility surface (IVS) is a fundamental building block in computational finance. We provide a survey of methodologies for constructing such surfaces. We also discuss various topics which can influence the successful construction of IVS in practice: arbitrage-free conditions in both strike and time, how to perform extrapolation outside the core region, choice of calibrating functional and selection of numerical optimization algorithms, volatility surface dynamics and asymptotics.

  2. Cross-section methodology in SIMMER

    International Nuclear Information System (INIS)

    Soran, P.D.

    1976-05-01

    The cross-section methodology incorporated in the SIMMER code is described. The data base for all cross sections is the ENDF/B system, with various processing computer codes used to group-collapse and modify the group constants used in SIMMER. Either infinitely dilute cross sections or the Bondarenko formalism can be used in SIMMER. Presently only a microscopic treatment is considered, but preliminary macroscopic algorithms have been investigated.

  3. Soft Systems Methodology

    Science.gov (United States)

    Checkland, Peter; Poulter, John

    Soft systems methodology (SSM) is an approach for tackling problematical, messy situations of all kinds. It is an action-oriented process of inquiry into problematic situations in which users learn their way from finding out about the situation, to taking action to improve it. The learning emerges via an organised process in which the situation is explored using a set of models of purposeful action (each built to encapsulate a single worldview) as intellectual devices, or tools, to inform and structure discussion about a situation and how it might be improved. This paper, written by the original developer Peter Checkland and practitioner John Poulter, gives a clear and concise account of the approach that covers SSM's specific techniques, the learning cycle process of the methodology and the craft skills which practitioners develop. This concise but theoretically robust account nevertheless includes the fundamental concepts, techniques, core tenets described through a wide range of settings.

  4. Transparent Guideline Methodology Needed

    DEFF Research Database (Denmark)

    Lidal, Ingeborg; Norén, Camilla; Mäkelä, Marjukka

    2013-01-01

    As part of learning at the Nordic Workshop of Evidence-based Medicine, we have read with interest the practice guidelines for central venous access, published in your Journal in 2012.1 We appraised the quality of this guideline using the checklist developed by The Evidence-Based Medicine Working Group.2 Similar criteria for guideline quality have been suggested elsewhere.3 Our conclusion was that this much needed guideline is currently unclear about several aspects of the methodology used in developing the recommendations. This means potential users cannot be certain that the recommendations are based on best currently available evidence. Our concerns are in two main categories: the rigor of development, including methodology of searching, evaluating, and combining the evidence; and editorial independence, including funding and possible conflicts of interest.

  5. Web survey methodology

    CERN Document Server

    Callegaro, Mario; Vehovar, Asja

    2015-01-01

    Web Survey Methodology guides the reader through the past fifteen years of research in web survey methodology. It both provides practical guidance on the latest techniques for collecting valid and reliable data and offers a comprehensive overview of research issues. Core topics from preparation to questionnaire design, recruitment testing to analysis and survey software are all covered in a systematic and insightful way. The reader will be exposed to key concepts and key findings in the literature, covering measurement, non-response, adjustments, paradata, and cost issues. The book also discusses the hottest research topics in survey research today, such as internet panels, virtual interviewing, mobile surveys and the integration with passive measurements, e-social sciences, mixed modes and business intelligence. The book is intended for students, practitioners, and researchers in fields such as survey and market research, psychological research, official statistics and customer satisfaction research.

  6. Recent computational chemistry

    International Nuclear Information System (INIS)

    Onishi, Taku

    2015-01-01

    Thanks to earlier developments in quantum theory and calculation methods, we can now investigate quantum phenomena in real materials and molecules and design functional materials by computation. Since limits and open problems remain in theory, cooperation between theory and computation is becoming more important in order to clarify unknown quantum mechanisms and to discover more efficient functional materials; this is expected to become the next-generation standard. Finally, our theoretical methodology for boundary solids is introduced.

  7. Recent computational chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Onishi, Taku [Department of Chemistry for Materials, and The Center of Ultimate Technology on nano-Electronics, Mie University (Japan); Center for Theoretical and Computational Chemistry, Department of Chemistry, University of Oslo (Norway)

    2015-12-31

    Thanks to earlier developments in quantum theory and calculation methods, we can now investigate quantum phenomena in real materials and molecules and design functional materials by computation. Since limits and open problems remain in theory, cooperation between theory and computation is becoming more important in order to clarify unknown quantum mechanisms and to discover more efficient functional materials; this is expected to become the next-generation standard. Finally, our theoretical methodology for boundary solids is introduced.

  8. Soil Radiological Characterisation Methodology

    International Nuclear Information System (INIS)

    Attiogbe, Julien; Aubonnet, Emilie; De Maquille, Laurence; De Moura, Patrick; Desnoyers, Yvon; Dubot, Didier; Feret, Bruno; Fichet, Pascal; Granier, Guy; Iooss, Bertrand; Nokhamzon, Jean-Guy; Ollivier Dehaye, Catherine; Pillette-Cousin, Lucien; Savary, Alain

    2014-12-01

    This report presents the general methodology and best-practice approaches which combine proven existing techniques for sampling and characterisation to assess the contamination of soils prior to remediation. It is based on feedback from projects conducted by the main French nuclear stakeholders involved in the field of remediation and dismantling (EDF, CEA, AREVA and IRSN). The application of this methodology will enable project managers to obtain the elements necessary for drawing up the files associated with remediation operations, as required by the regulatory authorities. It is applicable to each of the steps necessary for piloting remediation work-sites, depending on the objectives targeted (release into the public domain, re-use, etc.). The main part describes the applied statistical methodology, with exploratory analysis and variogram data and identification of singular points and their location. The results obtained permit a mapping to be produced that identifies the contaminated surface and subsurface areas. It maps out the path for radiological site characterisation, from the initial investigations based on historical and functional analysis through to verification that the remediation objectives have been met. An example application follows, drawn from the feedback of the remediation of a contaminated site at the Fontenay-aux-Roses facility. It is supplemented by a glossary of the main terms used in the field, taken from different publications and international standards. This technical report supports the ISO Standard ISO/TC 85/SC 5 N 18557 'Sampling and characterisation principles for soils, buildings and infrastructures contaminated by radionuclides for remediation purposes'. (authors) [fr]

  9. K-Means Subject Matter Expert Refined Topic Model Methodology

    Science.gov (United States)

    2017-01-01

    computing environment the Visual Basic for Applications (VBA) programming language presents the option as our programming language of choice. We propose...background, or access to other computational programming environments, to build topic models from free text datasets using a familiar Excel based...environment that restricts access to other software based text analytic tools. Opportunities to deploy developmental versions of the methodology and

  10. CIAU methodology and BEPU applications

    International Nuclear Information System (INIS)

    Petruzzi, A.; D'Auria, F.

    2009-01-01

    Best-Estimate calculation results from complex thermal-hydraulic system codes (like Relap5, Cathare, Athlet, Trace, etc.) are affected by unavoidable approximations that are unpredictable without the use of computational tools that account for the various sources of uncertainty. Therefore the use of best-estimate codes within the reactor technology, either for design or safety purposes, implies understanding and accepting the limitations and the deficiencies of those codes. Uncertainties may have different origins, ranging from the approximation of the models, to the approximation of the numerical solution, and to the lack of precision of the values adopted for boundary and initial conditions. The amount of uncertainty that affects a calculation may strongly depend upon the codes and the modeling techniques (i.e. the code's users). A consistent and robust uncertainty methodology must be developed taking into consideration all the above aspects. The CIAU (Code with the capability of Internal Assessment of Uncertainty) and the UMAE (Uncertainty Methodology based on Accuracy Evaluation) methods have been developed by the University of Pisa (UNIPI) in the framework of long-lasting research activities started in the 1980s and involving several researchers. CIAU is extensively discussed in the available technical literature, Refs. [1, 2, 3, 4, 5, 6, 7], and tens of additional relevant papers, which provide comprehensive details about the method, can be found in the bibliography lists of the above references. Therefore, the present paper supplies only 'spot-information' about CIAU and focuses mostly on applications to some cases of industrial interest. In particular, the application of CIAU to the OECD BEMUSE (Best Estimate Methods Uncertainty and Sensitivity Evaluation, [8, 9]) project is discussed and a critical comparison with respect to other uncertainty methods (in relation to items like: sources of uncertainties, selection of the input parameters and quantification of

  11. Methodology applied to develop the DHIE: applied methodology

    CSIR Research Space (South Africa)

    Herselman, Marlien

    2016-12-01

    Full Text Available This section will address the methodology that was applied to develop the South African Digital Health Innovation Ecosystem (DHIE). Each chapter under Section B represents a specific phase in the methodology....

  12. Granular computing: perspectives and challenges.

    Science.gov (United States)

    Yao, JingTao; Vasilakos, Athanasios V; Pedrycz, Witold

    2013-12-01

    Granular computing, as a new and rapidly growing paradigm of information processing, has attracted many researchers and practitioners. Granular computing is an umbrella term to cover any theories, methodologies, techniques, and tools that make use of information granules in complex problem solving. The aim of this paper is to review foundations and schools of research and to elaborate on current developments in granular computing research. We first review some basic notions of granular computing. Classification and descriptions of various schools of research in granular computing are given. We also present and identify some research directions in granular computing.

  13. Case Study Research Methodology

    Directory of Open Access Journals (Sweden)

    Mark Widdowson

    2011-01-01

    Full Text Available Commenting on the lack of case studies published in modern psychotherapy publications, the author reviews the strengths of case study methodology and responds to common criticisms, before providing a summary of types of case studies including clinical, experimental and naturalistic. Suggestions are included for developing systematic case studies and brief descriptions are given of a range of research resources relating to outcome and process measures. Examples of a pragmatic case study design and a hermeneutic single-case efficacy design are given and the paper concludes with some ethical considerations and an exhortation to the TA community to engage more widely in case study research.

  14. Sample size methodology

    CERN Document Server

    Desu, M M

    2012-01-01

    One of the most important problems in designing an experiment or a survey is sample size determination and this book presents the currently available methodology. It includes both random sampling from standard probability distributions and from finite populations. Also discussed is sample size determination for estimating parameters in a Bayesian setting by considering the posterior distribution of the parameter and specifying the necessary requirements. The determination of the sample size is considered for ranking and selection problems as well as for the design of clinical trials. Appropria
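
    As a minimal example of the kind of calculation the book systematizes, the sketch below gives the classical sample size for estimating a mean to within a margin of error E at confidence 1 - alpha under a normal model with an assumed standard deviation; the numbers are hypothetical.

```python
# Minimal sketch of a classical sample-size calculation for estimating a mean
# to within a margin of error E at confidence level 1 - alpha (normal model).
import math
from statistics import NormalDist

sigma = 15.0   # assumed population standard deviation (hypothetical)
E = 2.0        # desired half-width of the confidence interval
alpha = 0.05   # 95% confidence

z = NormalDist().inv_cdf(1 - alpha / 2)
n = math.ceil((z * sigma / E) ** 2)
print("required sample size:", n)   # ceil((1.96 * 15 / 2)^2) = 217
```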

  15. Geotechnical site assessment methodology

    International Nuclear Information System (INIS)

    Tunbridge, L.W.; Richards, L.R.

    1985-09-01

    The reports comprising this volume concern the research conducted on geotechnical site assessment methodology at the Carwynnen test mine in granites in Cornwall, with particular reference to the effect of structures imposed by discontinuities on the engineering behaviour of rock masses. The topics covered are: in-situ stress measurements using (a) the hydraulic fracturing method, or (b) the US Bureau of Mines deformation probe; scanline discontinuity survey - coding form and instructions, and data; applicability of geostatistical estimation methods to scalar rock properties; comments on in-situ stress at the Carwynnen test mine and the state of stress in the British Isles. (U.K.)

  16. Microphysics evolution and methodology

    International Nuclear Information System (INIS)

    Dionisio, J.S.

    1990-01-01

    A few general features of microphysics evolution and their relationship with microphysics methodology are briefly surveyed. Several pluri-disciplinary and interdisciplinary aspects of microphysics research are also discussed in the present scientific context. The need for an equilibrium between individual tendencies and collective constraints required by team work, already formulated thirty years ago by Frederic Joliot, is particularly stressed in the present conjuncture of Nuclear Research favouring very large team projects and discouraging individual initiatives. The increasing importance of the science of science (due to its multiple social, economic and ecological aspects) and the stronger competition between national and international tendencies of scientific (and technical) cooperation are also discussed. (author)

  17. MIRD methodology; Metodologia MIRD

    Energy Technology Data Exchange (ETDEWEB)

    Rojo, Ana M [Autoridad Regulatoria Nuclear, Buenos Aires (Argentina); Gomez Parada, Ines [Sociedad Argentina de Radioproteccion, Buenos Aires (Argentina)

    2004-07-01

    The MIRD (Medical Internal Radiation Dose) system was established by the Society of Nuclear Medicine of USA in 1960 to assist the medical community in the estimation of the dose in organs and tissues due to the incorporation of radioactive materials. Since then, 'MIRD Dose Estimate Report' (from the 1 to 12) and 'Pamphlets', of great utility for the dose calculations, were published. The MIRD system was planned essentially for the calculation of doses received by the patients during nuclear medicine diagnostic procedures. The MIRD methodology for the absorbed doses calculations in different tissues is explained.

  18. Beam optimization: improving methodology

    International Nuclear Information System (INIS)

    Quinteiro, Guillermo F.

    2004-01-01

    Different optimization techniques commonly used in biology and food technology allow a systematic and complete analysis of response functions. In spite of the great interest in medical and nuclear physics in the problem of optimizing mixed beams, little attention has been given to sophisticated mathematical tools. Indeed, many techniques are perfectly suited to the typical problem of beam optimization. This article is intended as a guide to the use of two methods, namely Response Surface Methodology and Simplex, which are expected to speed up the optimization process and, at the same time, give more insight into the relationships among the dependent variables controlling the response.
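
    The Simplex method referred to here is the Nelder-Mead direct search. The sketch below applies it to a hypothetical smooth two-beam response function (not the article's dose model) simply to show the mechanics of the approach.

```python
# Minimal sketch of maximizing a hypothetical two-component beam response with
# the Nelder-Mead simplex method; the response function is invented for
# illustration only.
import numpy as np
from scipy.optimize import minimize

def negative_response(weights):
    """Hypothetical smooth response to the two beam weights (negated for minimization)."""
    w1, w2 = weights
    return -(10.0 * w1 + 8.0 * w2 - 1.5 * w1**2 - 1.0 * w2**2 - 0.5 * w1 * w2)

result = minimize(negative_response, x0=np.array([1.0, 1.0]), method="Nelder-Mead")
print("optimal beam weights:", result.x)
print("maximum response    :", -result.fun)
```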

  19. Literacy research methodologies

    CERN Document Server

    Duke, Nell K

    2012-01-01

    The definitive reference on literacy research methods, this book serves as a key resource for researchers and as a text in graduate-level courses. Distinguished scholars clearly describe established and emerging methodologies, discuss the types of questions and claims for which each is best suited, identify standards of quality, and present exemplary studies that illustrate the approaches at their best. The book demonstrates how each mode of inquiry can yield unique insights into literacy learning and teaching and how the methods can work together to move the field forward.   New to This Edition

  20. Methodology of site studies

    International Nuclear Information System (INIS)

    Caries, J.C.; Hugon, J.; Grauby, A.

    1980-01-01

    This methodology consists in an essentially dynamic, estimated and follow-up analysis of the impact of discharges on all the environment compartments, whether natural or not, that play a part in the protection of man and his environment. It applies at two levels, to wit: the choice of site, or the detailed study of the site selected. Two examples of its application will be developed, namely: at the choice of site level in the case of marine sites, and of the detailed study level of the chosen site in that of a riverside site [fr

  1. Alternative pricing methodologies

    International Nuclear Information System (INIS)

    Anon.

    1991-01-01

    With the increased interest in competitive market forces and growing recognition of the deficiencies in current practices, FERC and others are exploring alternatives to embedded cost pricing. A number of these alternatives are discussed in this chapter. Marketplace pricing, discussed briefly here, is the subject of the next chapter. Obviously, the pricing formula may combine several of these methodologies. One utility of which the authors are aware is seeking a price equal to the sum of embedded costs, opportunity costs, line losses, value of service, FERC's percentage adder formula and a contract service charge

  2. Data structures, computer graphics, and pattern recognition

    CERN Document Server

    Klinger, A; Kunii, T L

    1977-01-01

    Data Structures, Computer Graphics, and Pattern Recognition focuses on the computer graphics and pattern recognition applications of data structures methodology. This book presents design related principles and research aspects of the computer graphics, system design, data management, and pattern recognition tasks. The topics include the data structure design, concise structuring of geometric data for computer aided design, and data structures for pattern recognition algorithms. The survey of data structures for computer graphics systems, application of relational data structures in computer graphics

  3. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear-reactor-safety research program is described and compared with other methodologies established for performing uncertainty analyses

  4. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear reactor safety research program is described and compared with other methodologies established for performing uncertainty analyses

  5. RHIC Data Correlation Methodology

    International Nuclear Information System (INIS)

    Michnoff, R.; D'Ottavio, T.; Hoff, L.; MacKay, W.; Satogata, T.

    1999-01-01

    A requirement for RHIC data plotting software and physics analysis is the correlation of data from all accelerator data gathering systems. Data correlation provides the capability for a user to request a plot of multiple data channels vs. time, and to make meaningful time-correlated data comparisons. The task of data correlation for RHIC requires careful consideration because data acquisition triggers are generated from various asynchronous sources including events from the RHIC Event Link, events from the two Beam Sync Links, and other unrelated clocks. In order to correlate data from asynchronous acquisition systems a common time reference is required. The RHIC data correlation methodology will allow all RHIC data to be converted to a common wall clock time, while still preserving native acquisition trigger information. A data correlation task force team, composed of the authors of this paper, has been formed to develop data correlation design details and provide guidelines for software developers. The overall data correlation methodology will be presented in this paper
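
    A heavily simplified sketch of the underlying idea: each acquisition system's native trigger timestamps are mapped onto a common wall-clock time through a per-system clock model, while the native trigger information is preserved. The systems, clock parameters and records below are hypothetical and do not reflect the actual RHIC implementation.

```python
# Minimal sketch of correlating data from asynchronous acquisition systems by
# mapping native trigger times onto a common wall-clock time; the clock models
# and records are hypothetical.
records = [
    # (system, native trigger count, measured value)
    ("event_link", 1200, 3.4),
    ("beam_sync_blue", 87, 3.6),
    ("beam_sync_yellow", 91, 3.5),
]

# Per-system conversion: wall_clock = epoch + native_count * tick_period (seconds).
clock_model = {
    "event_link":       {"epoch": 1_000_000.000, "tick": 0.010},
    "beam_sync_blue":   {"epoch": 1_000_000.120, "tick": 0.130},
    "beam_sync_yellow": {"epoch": 1_000_000.095, "tick": 0.128},
}

correlated = []
for system, count, value in records:
    m = clock_model[system]
    wall_clock = m["epoch"] + count * m["tick"]
    correlated.append((wall_clock, system, count, value))  # native trigger info preserved

for row in sorted(correlated):   # time-ordered on the common wall clock
    print(row)
```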

  6. Intelligent systems engineering methodology

    Science.gov (United States)

    Fouse, Scott

    1990-01-01

    An added challenge for the designers of large scale systems such as Space Station Freedom is the appropriate incorporation of intelligent system technology (artificial intelligence, expert systems, knowledge-based systems, etc.) into their requirements and design. This presentation will describe a view of systems engineering which successfully addresses several aspects of this complex problem: design of large scale systems, design with requirements that are so complex they only completely unfold during the development of a baseline system and even then continue to evolve throughout the system's life cycle, design that involves the incorporation of new technologies, and design and development that takes place with many players in a distributed manner yet can be easily integrated to meet a single view of the requirements. The first generation of this methodology was developed and evolved jointly by ISX and the Lockheed Aeronautical Systems Company over the past five years on the Defense Advanced Research Projects Agency/Air Force Pilot's Associate Program, one of the largest, most complex, and most successful intelligent systems constructed to date. As the methodology has evolved it has also been applied successfully to a number of other projects. Some of the lessons learned from this experience may be applicable to Freedom.

  7. SMART performance analysis methodology

    International Nuclear Information System (INIS)

    Lim, H. S.; Kim, H. C.; Lee, D. J.

    2001-04-01

    To ensure the required and desired operation over the plant lifetime, the performance analysis for the SMART NSSS design is done by means of the specified analysis methodologies for the performance-related design basis events (PRDBEs). A PRDBE is an occurrence (event) that shall be accommodated in the design of the plant and whose consequence would be no more severe than the normal service effects on the plant equipment. The performance analysis methodology, which systematizes the methods and procedures to analyze the PRDBEs, is as follows. Based on the operation mode suited to the characteristics of the SMART NSSS, the corresponding PRDBEs and the allowable range of process parameters for these events are deduced. With the developed control logic for each operation mode, the system thermal-hydraulics are analyzed for the chosen PRDBEs using the system analysis code. In particular, because the system characteristics of SMART differ from those of existing commercial nuclear power plants, the operation modes, PRDBEs, control logic, and analysis code should be consistent with the SMART design. This report presents the categories of PRDBEs chosen for each operation mode, the transitions among them, and the acceptance criteria for each PRDBE. It also includes the analysis methods and procedures for each PRDBE and the concept of the control logic for each operation mode. Therefore this report, in which the overall details for SMART performance analysis are specified based on the current SMART design, can be utilized as a guide for the detailed performance analysis.

  8. Relative Hazard Calculation Methodology

    International Nuclear Information System (INIS)

    DL Strenge; MK White; RD Stenner; WB Andrews

    1999-01-01

    The methodology presented in this document was developed to provide a means of calculating the RH ratios to use in developing useful graphic illustrations. The RH equation, as presented in this methodology, is primarily a collection of key factors relevant to understanding the hazards and risks associated with projected risk management activities. The RH equation has the potential for much broader application than generating risk profiles. For example, it can be used to compare one risk management activity with another, instead of just comparing it to a fixed baseline as was done for the risk profiles. If the appropriate source term data are available, it could be used in its non-ratio form to estimate absolute values of the associated hazards. These estimated values of hazard could then be examined to help understand which risk management activities are addressing the higher hazard conditions at a site. Graphics could be generated from these absolute hazard values to compare high-hazard conditions. If the RH equation is used in this manner, care must be taken to specifically define and qualify the estimated absolute hazard values (e.g., identify which factors were considered and which ones tended to drive the hazard estimation)

  9. Insights into PRA methodologies

    International Nuclear Information System (INIS)

    Gallagher, D.; Lofgren, E.; Atefi, B.; Liner, R.; Blond, R.; Amico, P.

    1984-08-01

    Probabilistic Risk Assessments (PRAs) for six nuclear power plants were examined to gain insight into how the choice of analytical methods can affect the results of PRAs. The PRA scope considered was limited to internally initiated accident sequences through core melt. For twenty methodological topic areas, a baseline or minimal methodology was specified. The choice of methods for each topic in the six PRAs was characterized in terms of the incremental level of effort above the baseline. A higher level of effort generally reflects a higher level of detail or a higher degree of sophistication in the analytical approach to a particular topic area. The impact on results was measured in terms of how additional effort beyond the baseline level changed the relative importance and ordering of dominant accident sequences compared to what would have been observed had methods corresponding to the baseline level of effort been employed. This measure of impact is a more useful indicator of how methods affect perceptions of plant vulnerabilities than changes in core melt frequency would be. However, the change in core melt frequency was used as a secondary measure of impact for nine topics where availability of information permitted. Results are presented primarily in the form of effort-impact matrices for each of the twenty topic areas. A suggested effort-impact profile for future PRAs is presented

  10. Scrum methodology in banking environment

    OpenAIRE

    Strihová, Barbora

    2015-01-01

    Bachelor thesis "Scrum methodology in banking environment" is focused on one of the agile methodologies, Scrum, and describes its use in a banking environment. Its main goal is to introduce the Scrum methodology and, through a case study, outline a real project placed in a bank and focused on software development, address the problems of the project, propose solutions to the addressed problems and identify anomalies of Scrum in software development constrained by the banking environment.

  11. Experimental Economics: Some Methodological Notes

    OpenAIRE

    Fiore, Annamaria

    2009-01-01

    The aim of this work is to present, in a self-contained paper, some methodological aspects as they are received in the current experimental literature. The purpose has been to make a critical review of some very influential papers dealing with methodological issues. In other words, the idea is to have a single paper where people first approaching experimental economics can find (some of) the most important methodological issues summarised. In particular, the focus is on some methodological prac...

  12. Software life cycle methodologies and environments

    Science.gov (United States)

    Fridge, Ernest

    1991-01-01

    Products of this project will significantly improve the quality and productivity of Space Station Freedom Program software processes by: improving software reliability and safety; and broadening the range of problems that can be solved with computational solutions. The project brings in Computer Aided Software Engineering (CASE) technology for: environments such as the Engineering Script Language/Parts Composition System (ESL/PCS) application generator, an Intelligent User Interface for cost avoidance in setting up operational computer runs, a Framework programmable platform for defining process and software development work flow control, a process for bringing CASE technology into an organization's culture, and the CLIPS/CLIPS Ada language for developing expert systems; and methodologies such as a method for developing fault tolerant, distributed systems and a method for developing systems for common sense reasoning and for solving expert systems problems when only approximate truths are known.

  13. Geotechnical site assessment methodology

    International Nuclear Information System (INIS)

    Tunbridge, L.W.; Richards, L.R.

    1985-09-01

    A final report summarizing the research conducted on geotechnical site assessment methodology at the Carwynnen test mine in Cornwall. The geological setting of the test site in the Cornubian granite batholith is described. The effect of structure imposed by discontinuities on the engineering behaviour of rock masses is discussed and the scanline survey method of obtaining data on discontinuities in the rock mass is described. The applicability of some methods of statistical analysis for discontinuity data is reviewed. The requirement for remote geophysical methods of characterizing the mass is discussed and experiments using seismic and ultrasonic velocity measurements are reported. Methods of determining the in-situ stresses are described and the final results of a programme of in-situ stress measurements using the overcoring and hydrofracture methods are reported. (author)

  14. UNCOMMON SENSORY METHODOLOGIES

    Directory of Open Access Journals (Sweden)

    Vladimír Vietoris

    2015-02-01

    Full Text Available Sensory science is a young but rapidly developing field of the food industry. Currently, great emphasis is placed on developing rapid data-collection techniques, the difference between consumers and trained panels is blurring, and the role of sensory methodologists is to prepare evaluation procedures by which a lay panel (consumers) can achieve the same results as a trained panel. There are several conventional methods of sensory evaluation of food (ISO standards), but more and more sensory laboratories are developing methodologies that are less strict in the selection of evaluators, whose mechanism is easily understandable and whose results are easily interpretable. This paper deals with a mapping of marginal methods used in sensory evaluation of food (new types of profiles, CATA, TDS, napping).

  15. Safety class methodology

    International Nuclear Information System (INIS)

    Donner, E.B.; Low, J.M.; Lux, C.R.

    1992-01-01

    DOE Order 6430.1A, General Design Criteria (GDC), requires that DOE facilities be evaluated with respect to ''safety class items.'' Although the GDC defines safety class items, it does not provide a methodology for selecting safety class items. The methodology described in this paper was developed to assure that Safety Class Items at the Savannah River Site (SRS) are selected in a consistent and technically defensible manner. Safety class items are those in the highest of four categories determined to be of special importance to nuclear safety and merit appropriately higher-quality design, fabrication, and industrial test standards and codes. The identification of safety class items is approached using a cascading strategy that begins at the 'safety function' level (i.e., a cooling function, ventilation function, etc.) and proceeds down to the system, component, or structure level. Thus, the items that are required to support a safety function are SCIs. The basic steps in this procedure apply to the determination of SCIs for both new project activities and for operating facilities. The GDC lists six characteristics of SCIs to be considered as a starting point for safety item classification. They are as follows: 1. Those items whose failure would produce exposure consequences that would exceed the guidelines in Section 1300-1.4, ''Guidance on Limiting Exposure of the Public,'' at the site boundary or nearest point of public access. 2. Those items required to maintain operating parameters within the safety limits specified in the Operational Safety Requirements during normal operations and anticipated operational occurrences. 3. Those items required for nuclear criticality safety. 4. Those items required to monitor the release of radioactive material to the environment during and after a Design Basis Accident. 5. Those items required to achieve and maintain the facility in a safe shutdown condition. 6. Those items that control Safety Class Items listed above.

  16. Development of seismic PSA methodology at JAERI

    International Nuclear Information System (INIS)

    Muramatsu, K.; Ebisawa, K.; Matsumoto, K.; Oikawa, T.; Kondo, M.

    1995-01-01

    The Japan Atomic Energy Research Institute (JAERI) is developing a methodology for seismic probabilistic safety assessment (PSA) of nuclear power plants, aiming at providing a set of procedures, computer codes and data suitable for performing seismic PSA in Japan. In order to demonstrate the usefulness of JAERI's methodology and to obtain better understanding on the controlling factors of the results of seismic PSAs, a seismic PSA for a BWR is in progress. In the course of this PSA, various improvements were made on the methodology. In the area of the hazard analysis, the application of the current method to the model plant site is being carried out. In the area of response analysis, the response factor method was modified to consider the non-linear response effect of the building. As for the capacity evaluation of components, since capacity data for PSA in Japan are very scarce, capacities of selected components used in Japan were evaluated. In the systems analysis, the improvement of the SECOM2 code was made to perform importance analysis and sensitivity analysis for the effect of correlation of responses and correlation of capacities. This paper summarizes the recent progress of the seismic PSA research at JAERI with emphasis on the evaluation of component capacity and the methodology improvement of systems reliability analysis. (author)
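
    At the core of any seismic PSA, the site hazard curve is convolved with a fragility (conditional failure probability) to give an annual failure frequency. The sketch below shows that convolution for a hypothetical power-law hazard curve and lognormal fragility; it is illustrative only and is not JAERI's methodology or data.

```python
# Minimal sketch of the hazard-fragility convolution at the heart of seismic
# PSA: annual failure frequency = integral of |dH/da| * P_fail(a) da.
# The hazard curve and fragility parameters below are hypothetical.
import numpy as np
from scipy.stats import lognorm

# Hazard curve H(a): annual frequency of exceeding peak ground acceleration a (in g).
def hazard(a):
    return 1.0e-4 * (a / 0.3) ** (-2.5)

# Lognormal fragility: median capacity Am = 0.9 g, composite log-std beta = 0.4.
Am, beta = 0.9, 0.4
def p_fail(a):
    return lognorm.cdf(a, s=beta, scale=Am)

a = np.linspace(0.05, 3.0, 2000)
dH_da = np.gradient(hazard(a), a)              # negative slope of the hazard curve
annual_failure_freq = np.trapz(-dH_da * p_fail(a), a)
print(f"annual failure frequency ~ {annual_failure_freq:.2e} /yr")
```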

  17. Situating methodology within qualitative research.

    Science.gov (United States)

    Kramer-Kile, Marnie L

    2012-01-01

    Qualitative nurse researchers are required to make deliberate and sometimes complex methodological decisions about their work. Methodology in qualitative research is a comprehensive approach in which theory (ideas) and method (doing) are brought into close alignment. It can be difficult, at times, to understand the concept of methodology. The purpose of this research column is to: (1) define qualitative methodology; (2) illuminate the relationship between epistemology, ontology and methodology; (3) explicate the connection between theory and method in qualitative research design; and (4) highlight relevant examples of methodological decisions made within cardiovascular nursing research. Although there is no "one set way" to do qualitative research, all qualitative researchers should account for the choices they make throughout the research process and articulate their methodological decision-making along the way.

  18. Computation: A New Open Access Journal of Computational Chemistry, Computational Biology and Computational Engineering

    Directory of Open Access Journals (Sweden)

    Karlheinz Schwarz

    2013-09-01

    Full Text Available Computation (ISSN 2079-3197; http://www.mdpi.com/journal/computation) is an international scientific open access journal focusing on fundamental work in the field of computational science and engineering. Computational science has become essential in many research areas by contributing to solving complex problems in fundamental science all the way to engineering. The very broad range of application domains suggests structuring this journal into three sections, which are briefly characterized below. In each section a further focusing will be provided by occasionally organizing special issues on topics of high interests, collecting papers on fundamental work in the field. More applied papers should be submitted to their corresponding specialist journals. To help us achieve our goal with this journal, we have an excellent editorial board to advise us on the exciting current and future trends in computation from methodology to application. We very much look forward to hearing all about the research going on across the world. [...

  19. Proposed Methodology for Establishing Area of Applicability

    International Nuclear Information System (INIS)

    Broadhead, B.L.; Hopper, C.M.; Parks, C.V.

    1999-01-01

    This paper presents the application of sensitivity and uncertainty (S/U) analysis methodologies to the data validation tasks of a criticality safety computational study. The S/U methods presented are designed to provide a formal means of establishing the area (or range) of applicability for criticality safety data validation studies. The development of parameters that are analogous to the standard trending parameters forms the key to the technique. These parameters are the so-called D parameters, which represent the differences by energy group of S/U-generated sensitivity profiles, and c parameters, which are the k correlation coefficients, each of which gives information relative to the similarity between pairs of selected systems. The use of a Generalized Linear Least-Squares Methodology (GLLSM) tool is also described in this paper. These methods and guidelines are also applied to a sample validation for uranium systems with enrichments greater than 5 wt %.
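
    A simplified stand-in for the similarity assessment described here: comparing two systems' energy-group-wise sensitivity profiles with a summed difference and a correlation coefficient. The actual D and c parameters additionally fold in cross-section covariance data; the profiles below are hypothetical.

```python
# Minimal sketch of comparing two systems' energy-group-wise sensitivity
# profiles with a summed difference and a correlation coefficient. This is a
# simplified stand-in for the S/U-based D and c parameters; the profiles are
# hypothetical.
import numpy as np

# k-eff sensitivities of an application and a benchmark in 8 energy groups.
application = np.array([0.02, 0.05, 0.11, 0.20, 0.18, 0.09, 0.04, 0.01])
benchmark   = np.array([0.03, 0.06, 0.10, 0.17, 0.16, 0.10, 0.05, 0.02])

# Group-wise differences (analogous in spirit to the "D" parameters).
D = np.abs(application - benchmark).sum()

# Similarity index: correlation of the two profiles (1.0 = fully similar).
c = np.corrcoef(application, benchmark)[0, 1]

print(f"summed profile difference D ~ {D:.3f}")
print(f"profile correlation c       ~ {c:.3f}")
```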

  20. Bayesian methodology for reliability model acceptance

    International Nuclear Information System (INIS)

    Zhang Ruoxue; Mahadevan, Sankaran

    2003-01-01

    This paper develops a methodology to assess the validity of a reliability computation model using the concept of Bayesian hypothesis testing, by comparing the model prediction and experimental observation, when there is only one computational model available to evaluate system behavior. Time-independent and time-dependent problems are investigated, with consideration of both cases: with and without statistical uncertainty in the model. The case of time-independent failure probability prediction with no statistical uncertainty is a straightforward application of Bayesian hypothesis testing. However, for the life prediction (time-dependent reliability) problem, a new methodology is developed in this paper to make the same Bayesian hypothesis testing concept applicable. With the existence of statistical uncertainty in the model, in addition to the application of a predictor estimator of the Bayes factor, the uncertainty in the Bayes factor is explicitly quantified through treating it as a random variable and calculating the probability that it exceeds a specified value. The developed method provides a rational criterion to decision-makers for the acceptance or rejection of the computational model
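
    The acceptance decision rests on a Bayes factor comparing a "model is valid" hypothesis with an alternative, given the observed deviation between prediction and experiment. The sketch below computes such a factor for a simple normal measurement-error model; the prediction, observation and standard deviations are hypothetical, and the formulation is a simplification of the paper's treatment.

```python
# Minimal sketch of model acceptance via a Bayes factor for a normal
# measurement-error model; all numbers are hypothetical.
from statistics import NormalDist

prediction = 0.95      # model-predicted response
observation = 0.91     # experimentally observed value
sigma_obs = 0.03       # measurement standard deviation

# H0: model is valid, deviation ~ N(0, sigma_obs)
# H1: model is biased, deviation ~ N(0, sqrt(sigma_obs^2 + sigma_bias^2))
sigma_bias = 0.10
d = observation - prediction

def normal_pdf(x, sd):
    return NormalDist(0.0, sd).pdf(x)

bayes_factor = normal_pdf(d, sigma_obs) / normal_pdf(d, (sigma_obs**2 + sigma_bias**2) ** 0.5)
print(f"Bayes factor B01 ~ {bayes_factor:.2f}")
print("accept model" if bayes_factor > 1.0 else "evidence against model")
```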

  1. Modeling, methodologies and tools for molecular and nano-scale communications

    CERN Document Server

    Nakano, Tadashi; Moore, Michael

    2017-01-01

    (Preliminary) The book presents the state of the art in the emerging field of molecular and nanoscale communication. It gives special attention to fundamental models, and advanced methodologies and tools used in the field. It covers a wide range of applications, e.g. nanomedicine, nanorobot communication, bioremediation and environmental management. It addresses advanced graduate students, academics and professionals working at the forefront in their fields and at the interfaces between different areas of research, such as engineering, computer science, biology and nanotechnology.

  2. Scenario aggregation and analysis via Mean-Shift Methodology

    International Nuclear Information System (INIS)

    Mandelli, D.; Yilmaz, A.; Metzroth, K.; Aldemir, T.; Denning, R.

    2010-01-01

    A new generation of dynamic methodologies is being developed for nuclear reactor probabilistic risk assessment (PRA) which explicitly account for the time element in modeling the probabilistic system evolution and use numerical simulation tools to account for possible dependencies between failure events. The dynamic event tree (DET) approach is one of these methodologies. One challenge with dynamic PRA methodologies is the large amount of data they produce which may be difficult to analyze without appropriate software tools. The concept of 'data mining' is well known in the computer science community and several methodologies have been developed in order to extract useful information from a dataset with a large number of records. Using the dataset generated by the DET analysis of the reactor vessel auxiliary cooling system (RVACS) of an ABR-1000 for an aircraft crash recovery scenario and the Mean-Shift Methodology for data mining, it is shown how clusters of transients with common characteristics can be identified and classified. (authors)
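
    The clustering step can be illustrated with a small, numpy-only flat-kernel mean-shift: each point is iteratively shifted to the mean of the points within its bandwidth window, and points that converge to the same mode form one cluster. The scenario "feature vectors" below are synthetic and are not the RVACS/DET dataset.

```python
# Minimal numpy-only sketch of flat-kernel mean-shift clustering applied to
# synthetic scenario feature vectors (e.g. peak temperature, time of peak).
import numpy as np

rng = np.random.default_rng(0)
# Two artificial groups of transients in a 2-D feature space.
scenarios = np.vstack([rng.normal([600.0, 2.0], [15.0, 0.2], (30, 2)),
                       rng.normal([900.0, 5.0], [20.0, 0.3], (30, 2))])

def mean_shift(X, bandwidth, n_iter=50):
    modes = X.copy()
    for _ in range(n_iter):
        for i, x in enumerate(modes):
            in_window = X[np.linalg.norm(X - x, axis=1) < bandwidth]
            modes[i] = in_window.mean(axis=0)   # shift each point to its window mean
    return modes

modes = mean_shift(scenarios, bandwidth=100.0)
# Points converging to (nearly) the same mode belong to the same scenario cluster.
clusters = np.unique(np.round(modes, 1), axis=0)
print("number of clusters found:", len(clusters))
print(clusters)
```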

  3. A Generic Methodology for Superstructure Optimization of Different Processing Networks

    DEFF Research Database (Denmark)

    Bertran, Maria-Ona; Frauzem, Rebecca; Zhang, Lei

    2016-01-01

    In this paper, we propose a generic computer-aided methodology for synthesis of different processing networks using superstructure optimization. The methodology can handle different network optimization problems of various application fields. It integrates databases with a common data architecture......, a generic model to represent the processing steps, and appropriate optimization tools. A special software interface has been created to automate the steps in the methodology workflow, allow the transfer of data between tools and obtain the mathematical representation of the problem as required...

  4. Computer Refurbishment

    International Nuclear Information System (INIS)

    Ichiyen, Norman; Chan, Dominic; Thompson, Paul

    2004-01-01

    The major activity for the 18-month refurbishment outage at the Point Lepreau Generating Station is the replacement of all 380 fuel channel and calandria tube assemblies and the lower portion of connecting feeder pipes. New Brunswick Power would also take advantage of this outage to conduct a number of repairs, replacements, inspections and upgrades (such as rewinding or replacing the generator, replacement of shutdown system trip computers, replacement of certain valves and expansion joints, inspection of systems not normally accessible, etc.). This would allow for an additional 25 to 30 years of operation. Among the systems to be replaced are the PDCs for both shutdown systems. Assessments have been completed for both the SDS1 and SDS2 PDCs, and it has been decided to replace the SDS2 PDCs with the same hardware and software approach that has been used successfully for the Wolsong 2, 3, and 4 and the Qinshan 1 and 2 SDS2 PDCs. For SDS1, it has been decided to use the same software development methodology that was used successfully for the Wolsong and Qinshan called the I A and to use a new hardware platform in order to ensure successful operation for the 25-30 year station operating life. The selected supplier is Triconex, which uses a triple modular redundant architecture that will enhance the robustness/fault tolerance of the design with respect to equipment failures.

  5. Methodological Problems of Nanotechnoscience

    Science.gov (United States)

    Gorokhov, V. G.

    Recently, we have reported on the definitions of nanotechnology as a new type of NanoTechnoScience and on the nanotheory as a cluster of the different natural and engineering theories. Nanotechnology is not only a new type of scientific-engineering discipline, but it also evolves in a “nonclassical” way. Nanoontology, or the nano scientific world view, serves as a methodological orientation for choosing the theoretical means and methods for solving scientific and engineering problems. This allows one to change from one explanation and scientific world view to another without any problems. Thus, nanotechnology is both a field of scientific knowledge and a sphere of engineering activity; in other words, NanoTechnoScience is similar to Systems Engineering, understood as the analysis and design of large-scale, complex man/machine systems, but applied to micro- and nanosystems. Nano systems engineering, like macro systems engineering, includes not only systems design but also complex research. The design orientation changes the priorities of this complex research and the relation to knowledge, not only to “the knowledge about something”, but also to knowledge as a means of activity: from the beginning, control and restructuring of matter at the nano-scale is a necessary element of nanoscience.

  6. Methodological themes and variations

    International Nuclear Information System (INIS)

    Tetlock, P.E.

    1989-01-01

    This paper reports on the tangible progress that has been made in clarifying the underlying processes that affect both the likelihood of war in general and of nuclear war in particular. It also illustrates how difficult it is to make progress in this area. Nonetheless, what has been achieved should not be minimized. We have learned a good deal on both the theoretical and the methodological fronts and, perhaps, most important, we have learned a good deal about the limits of our knowledge. Knowledge of our ignorance---especially in a policy domain where confident, even glib, causal assertions are so common---can be a major contribution in itself. The most important service the behavioral and social sciences can currently provide to the policy making community may well be to make thoughtful skepticism respectable: to sensitize those who make key decisions to the uncertainty surrounding our understanding of international conflict and to the numerous qualifications that now need to be attached to simple causal theories concerning the origins of war

  7. Engineering radioecology: Methodological considerations

    International Nuclear Information System (INIS)

    Nechaev, A.F.; Projaev, V.V.; Sobolev, I.A.; Dmitriev, S.A.

    1995-01-01

    The term ''radioecology'' has been widely recognized in scientific and technical societies. At the same time, this scientific school (radioecology) does not have a precise/generally acknowledged structure, unified methodical basis, fixed subjects of investigation, etc. In other words, radioecology is a vast, important but rather amorphous conglomerate of various ideas, amalgamated mostly by their involvement in biospheric effects of ionizing radiation and some conceptual stereotypes. This paradox was acceptable up to a certain time. However, with the termination of the Cold War and because of remarkable political changes in the world, it has become possible to convert the problem of environmental restoration from the scientific sphere in particularly practical terms. Already the first steps clearly showed an imperfection of existing technologies, managerial and regulatory schemes; lack of qualified specialists, relevant methods and techniques; uncertainties in methodology of decision-making, etc. Thus, building up (or maybe, structuring) of special scientific and technological basis, which the authors call ''engineering radioecology'', seems to be an important task. In this paper they endeavored to substantiate the last thesis and to suggest some preliminary ideas concerning the subject matter of engineering radioecology

  8. Methodologies for local development in smart society

    Directory of Open Access Journals (Sweden)

    Lorena BĂTĂGAN

    2012-07-01

    Full Text Available All the digital devices connected through the Internet are producing a large quantity of data. All this information can be turned into knowledge, because we now have the computational power and advanced analytics to make sense of it. With this knowledge, cities could reduce costs, cut waste, and improve efficiency, productivity and quality of life for their citizens. Efficient/smart cities are characterized by the greater importance given to the environment, resources, globalization and sustainable development. This paper presents a study of the methodologies for urban development, which has become a central element of our society.

  9. Methodology for coupling computational fluid dynamics and integral transport neutronics

    International Nuclear Information System (INIS)

    Thomas, J. W.; Zhong, Z.; Sofu, T.; Downar, T. J.

    2004-01-01

    The CFD code STAR-CD was coupled to the integral transport code DeCART in order to provide high-fidelity, full physics reactor simulations. An interface program was developed to perform the tasks of mapping the STAR-CD mesh to the DeCART mesh, managing all communication between STAR-CD and DeCART, and monitoring the convergence of the coupled calculations. The interface software was validated by comparing coupled calculation results with those obtained using an independently developed interface program. An investigation into the convergence characteristics of coupled calculations was performed using several test models on a multiprocessor LINUX cluster. The results indicate that the optimal convergence of the coupled field calculation depends on several factors, to include the tolerance of the STAR-CD solution and the number of DeCART transport sweeps performed before exchanging data between codes. Results for a 3D, multi-assembly PWR problem on 12 PEs of the LINUX cluster indicate the best performance is achieved when the STAR-CD tolerance and number of DeCART transport sweeps are chosen such that the two fields converge at approximately the same rate. (authors)
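
    The outer coupling can be pictured as a Picard-style iteration: the thermal-hydraulics solver and the transport solver exchange fields on a mapped mesh until both converge. In the sketch below the two solver calls are simple placeholders standing in for STAR-CD and DeCART, and the feedback coefficients are invented for illustration.

```python
# Minimal sketch of an outer (Picard-style) coupling loop between a
# thermal-hydraulics solver and a neutron-transport solver. The two "solver"
# functions are placeholders; the physics is purely illustrative.
import numpy as np

def thermal_hydraulics_step(power):
    """Placeholder CFD update: coolant/fuel temperature responds to power."""
    return 500.0 + 0.2 * power                   # K

def neutronics_step(temperature):
    """Placeholder transport update: power drops as temperature rises (feedback)."""
    return 1000.0 - 0.5 * (temperature - 500.0)  # W per mesh cell

power = np.full(10, 1000.0)   # initial guess on the common (mapped) mesh
for outer_it in range(1, 51):
    temperature = thermal_hydraulics_step(power)  # field passed CFD -> neutronics
    new_power = neutronics_step(temperature)      # field passed neutronics -> CFD
    residual = np.max(np.abs(new_power - power)) / np.max(np.abs(new_power))
    power = new_power
    print(f"outer iteration {outer_it}: relative change = {residual:.2e}")
    if residual < 1e-6:   # convergence monitoring of the coupled fields
        break
```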

  10. Prototype Methodology for Designing and Developing Computer-Assisted Instruction

    Science.gov (United States)

    1986-08-01

    contains essential information and is not set off with commas. For example: The lawn mower that is broken is in the garage. Use "which" whenever the...phrase that follows contains supplementary or incidental information. "Which" clauses are set off by a pair of commas. For example: The lawn mower, which...is broken, is in the garage. If the lawn mower that is broken is in the garage, whereas the lawn mower that is working is in the yard, then the

  11. Toward a computer-aided methodology for discourse analysis ...

    African Journals Online (AJOL)

    aided methods to discourse analysis”. This project aims to develop an e-learning environment dedicated to documenting, evaluating and teaching the use of corpus linguistic tools suitable for interpretative text analysis. Even though its roots are in ...

  12. Methodological Reflections: Designing and Understanding Computer-Supported Collaborative Learning

    Science.gov (United States)

    Hamalainen, Raija

    2012-01-01

    Learning involves more than just a small group of participants, which makes designing and managing collaborative learning processes in higher education a challenging task. As a result, emerging concerns in current research have pointed increasingly to teacher orchestrated learning processes in naturalistic learning settings. In line with this…

  13. Notes on Computational Methodology and Tools of Thermoelectric Energy Systems

    DEFF Research Database (Denmark)

    Chen, Min; Bach, Inger Palsgaard; Rosendahl, Lasse

    2007-01-01

    The SPICE model allows the concurrent simulation of thermoelectric devices and application electric sub-models. It is an important step to implement the thermoelectric modeling at the system level. In this paper, temperature dependent material properties in the SPICE model, temperature and heat...... flow obtained by the code ANSYS Multiphysics and SPICE (Simulation Program with Integrated Circuit Emphasis), as well as some notes on the 3-D extension of the SPICE model are introduced....

  14. Toward a computer-aided methodology for discourse analysis

    African Journals Online (AJOL)

    aided methods to discourse ... Multilingual Concordancer, NLTK (Natural Language Tool Kit), Simple .... "media discourse" or "learner discourse". ..... The pair 'active euthanasia' occurs seven times in the text used for this .... New York: Pantheon.

  15. Dosimetric methodology of the ICRP

    International Nuclear Information System (INIS)

    Eckerman, K.F.

    1994-01-01

    Establishment of guidance for the protection of workers and members of the public from radiation exposures necessitates estimation of the radiation dose to tissues of the body at risk. The dosimetric methodology formulated by the International Commission on Radiological Protection (ICRP) is intended to be responsive to this need. While developed for radiation protection, elements of the methodology are often applied in addressing other radiation issues; e.g., risk assessment. This chapter provides an overview of the methodology, discusses its recent extension to age-dependent considerations, and illustrates specific aspects of the methodology through a number of numerical examples
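
    To illustrate one core element of that methodology, the equivalent-dose and effective-dose bookkeeping, here is a small sketch; the weighting factors are an abridged, illustrative subset in the spirit of ICRP recommendations rather than authoritative values, and the organ doses are invented.

    ```python
    # Effective dose E = sum_T w_T * H_T, with equivalent dose H_T = sum_R w_R * D_{T,R}.
    radiation_weights = {"photon": 1.0, "electron": 1.0, "alpha": 20.0}   # w_R (illustrative)
    tissue_weights = {"lung": 0.12, "stomach": 0.12, "liver": 0.04, "thyroid": 0.04}  # w_T subset

    # Hypothetical absorbed doses D_{T,R} in gray, per tissue and radiation type.
    absorbed_dose = {
        "lung":    {"photon": 2e-4, "alpha": 1e-5},
        "stomach": {"photon": 1e-4},
        "liver":   {"photon": 5e-5, "electron": 2e-5},
        "thyroid": {"photon": 3e-4},
    }

    equivalent_dose = {
        tissue: sum(radiation_weights[r] * d for r, d in doses.items())  # H_T in sievert
        for tissue, doses in absorbed_dose.items()
    }
    effective_dose = sum(tissue_weights[t] * h for t, h in equivalent_dose.items())
    print(f"Effective dose: {effective_dose:.2e} Sv")
    ```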

  16. Transmission pricing: paradigms and methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Shirmohammadi, Dariush [Pacific Gas and Electric Co., San Francisco, CA (United States); Vieira Filho, Xisto; Gorenstin, Boris [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil); Pereira, Mario V.P. [Power System Research, Rio de Janeiro, RJ (Brazil)

    1994-12-31

    In this paper we describe the principles of several paradigms and methodologies for pricing transmission services. The paper outlines some of the main characteristics of these paradigms and methodologies such as where they may be used for best results. Due to their popularity, power flow based MW-mile and short run marginal cost pricing methodologies will be covered in some detail. We conclude the paper with examples of the application of these two pricing methodologies for pricing transmission services in Brazil. (author) 25 refs., 2 tabs.
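
    A toy sketch of the power-flow-based MW-mile idea mentioned above: each transaction is charged in proportion to the megawatt flows it causes on each line, weighted by line length and unit cost. The network, flow increments and costs below are invented for illustration, and real implementations obtain the flow increments from DC power-flow sensitivity studies rather than taking them as given.

    ```python
    # Lines: (length_km, annual_cost_per_MW_km). Flow increments |dF| in MW caused
    # by each transaction on each line (would come from a DC power-flow study).
    lines = {"L1": (120.0, 50.0), "L2": (80.0, 50.0), "L3": (200.0, 40.0)}
    flow_increment = {
        "transaction_A": {"L1": 30.0, "L2": 10.0, "L3": 5.0},
        "transaction_B": {"L1": 5.0,  "L2": 25.0, "L3": 40.0},
    }

    def mw_mile_usage(usage):
        """MW-mile usage measure: sum over lines of |dF| * length * unit cost."""
        return sum(flow * lines[l][0] * lines[l][1] for l, flow in usage.items())

    usage = {t: mw_mile_usage(u) for t, u in flow_increment.items()}
    total_network_cost = 500_000.0  # annual cost to be recovered (illustrative)
    total_usage = sum(usage.values())
    for t, u in usage.items():
        print(f"{t}: allocated charge = {u / total_usage * total_network_cost:,.0f} per year")
    ```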

  17. Analytical methodology for nuclear safeguards

    International Nuclear Information System (INIS)

    Ramakumar, K.L.

    2011-01-01

    This paper attempts to briefly describe the available analytical methodologies and also to highlight some of the challenges and expectations from a nuclear material accounting and control (NUMAC) point of view

  18. Optical Computing

    OpenAIRE

    Woods, Damien; Naughton, Thomas J.

    2008-01-01

    We consider optical computers that encode data using images and compute by transforming such images. We give an overview of a number of such optical computing architectures, including descriptions of the type of hardware commonly used in optical computing, as well as some of the computational efficiencies of optical devices. We go on to discuss optical computing from the point of view of computational complexity theory, with the aim of putting some old, and some very recent, re...

  19. Country report: a methodology

    International Nuclear Information System (INIS)

    Colin, A.

    2013-01-01

    This paper describes a methodology which could be applicable to establishing a country report. In the framework of nuclear non-proliferation appraisal and IAEA safeguards implementation, it is important to be able to assess the potential existence of undeclared nuclear materials and activities as well as undeclared facilities in the country under review. In our view, a country report should aim at providing detailed information on nuclear-related activities for each country examined taken 'as a whole', such as nuclear development, scientific and technical capabilities, etc. In order to study a specific country, we need to know whether there is already an operating civil nuclear programme or not. In the first case, we have to check carefully whether nuclear material could be diverted, whether declared facilities are misused, or whether undeclared facilities are operated and undeclared activities conducted with the aim of manufacturing a nuclear weapon. In the second case, we should pay attention to the development of a civil nuclear project. A country report is based on a wide span of information (most of the time coming from open sources but sometimes also from confidential or private ones). Therefore, it is important to carefully check the nature and the credibility (reliability?) of these sources through cross-check examination. Eventually, it is necessary to merge information from different sources and apply an expertise filter. We have at our disposal a number of powerful tools to help us assess, understand and evaluate the situation (cartography, imagery, bibliometry, etc.). These tools allow us to draw the best conclusions possible. The paper is followed by the slides of the presentation. (author)

  20. Microbiological Methodology in Astrobiology

    Science.gov (United States)

    Abyzov, S. S.; Gerasimenko, L. M.; Hoover, R. B.; Mitskevich, I. N.; Mulyukin, A. L.; Poglazova, M. N.; Rozanov, A. Y.

    2005-01-01

    Searching for life in astromaterials to be delivered from future missions to extraterrestrial bodies is undoubtedly related to studies of the properties and signatures of living microbial cells and microfossils on Earth. Antarctic glacier and Earth permafrost habitats, where living microbial cells have preserved viability for millennia by entering the anabiotic state, are often regarded as model terrestrial analogs of Martian polar subsurface layers. For future findings of viable microorganisms in samples from extraterrestrial objects, it is important to use a combined methodology that includes classical microbiological methods, plating onto nutrient media, direct epifluorescence and electron microscopy examinations, detection of the elemental composition of cells, radiolabeling techniques, and PCR and FISH methods. Of great importance is to ensure the authenticity of microorganisms (if any) in studied samples and to standardize the protocols used, in order to minimize the risk of external contamination. Although convincing evidence of extraterrestrial microbial life may well come from the discovery of living cells in astromaterials, biomorphs and microfossils must also be regarded as a target in the search for evidence of life, bearing in mind the scenario that living microorganisms were not preserved and underwent mineralization. Under laboratory conditions, the processes that accompany the fossilization of cyanobacteria were reconstructed, and artificially produced cyanobacterial stromatolites resemble in their morphological properties those found in natural Earth habitats. Regarding the vital importance of distinguishing between biogenic and abiogenic signatures and between living and fossil microorganisms in analyzed samples, it is worthwhile to use some previously developed approaches based on electron microscopy examinations and analysis of the elemental composition of biomorphs in situ and comparison with the analogous data obtained for laboratory microbial cultures and

  1. Kaupapa Maori Methodology: Trusting the Methodology through Thick and Thin

    Science.gov (United States)

    Hiha, Anne Aroha

    2016-01-01

    Kaupapa Maori is thoroughly theorised in academia in Aotearoa and those wishing to use it as their research methodology can find support through the writing of a number of Maori academics. What is not so well articulated, is the experiential voice of those who have used Kaupapa Maori as research methodology. My identity as a Maori woman…

  2. Conducting Computer Security Assessments at Nuclear Facilities

    International Nuclear Information System (INIS)

    2016-06-01

    Computer security is increasingly recognized as a key component in nuclear security. As technology advances, it is anticipated that computer and computing systems will be used to an even greater degree in all aspects of plant operations including safety and security systems. A rigorous and comprehensive assessment process can assist in strengthening the effectiveness of the computer security programme. This publication outlines a methodology for conducting computer security assessments at nuclear facilities. The methodology can likewise be easily adapted to provide assessments at facilities with other radioactive materials

  3. Methodology of shielding calculation for nuclear reactors

    International Nuclear Information System (INIS)

    Maiorino, J.R.; Mendonca, A.G.; Otto, A.C.; Yamaguchi, Mitsuo

    1982-01-01

    A calculation methodology is described that couples a series of computer codes into a chain capable of calculating neutron and gamma radiation transport for deep-penetration problems typical of nuclear reactor shielding. The calculation chain begins with the generation of multigroup constants for neutrons and gamma rays by the AMPX system, coupled to the ENDF/B-IV data library, continues with the transport calculation of these radiations by the ANISN, DOT 3.5 and MORSE computer codes, and ends with the calculation of absorbed and/or equivalent doses by the SPACETRAN code. As an example of the calculation method, results from benchmark no. 6 of the Shielding Benchmark Problems (ORNL-RSIC-25), namely Neutron and Secondary Gamma Ray Fluence Transmitted through a Slab of Borated Polyethylene, are presented. (Author) [pt

  4. A Global Sensitivity Analysis Methodology for Multi-physics Applications

    Energy Technology Data Exchange (ETDEWEB)

    Tong, C H; Graziani, F R

    2007-02-02

    Experiments are conducted to draw inferences about an entire ensemble based on a selected number of observations. This applies to both physical experiments as well as computer experiments, the latter of which are performed by running the simulation models at different input configurations and analyzing the output responses. Computer experiments are instrumental in enabling model analyses such as uncertainty quantification and sensitivity analysis. This report focuses on a global sensitivity analysis methodology that relies on a divide-and-conquer strategy and uses intelligent computer experiments. The objective is to assess qualitatively and/or quantitatively how the variabilities of simulation output responses can be accounted for by input variabilities. We address global sensitivity analysis in three aspects: methodology, sampling/analysis strategies, and an implementation framework. The methodology consists of three major steps: (1) construct credible input ranges; (2) perform a parameter screening study; and (3) perform a quantitative sensitivity analysis on a reduced set of parameters. Once identified, research effort should be directed to the most sensitive parameters to reduce their uncertainty bounds. This process is repeated with tightened uncertainty bounds for the sensitive parameters until the output uncertainties become acceptable. To accommodate the needs of multi-physics application, this methodology should be recursively applied to individual physics modules. The methodology is also distinguished by an efficient technique for computing parameter interactions. Details for each step will be given using simple examples. Numerical results on large scale multi-physics applications will be available in another report. Computational techniques targeted for this methodology have been implemented in a software package called PSUADE.
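
    A small sketch of the screening step in such a workflow (step 2 above), using Morris-style elementary effects to rank the inputs of a stand-in model before any quantitative sensitivity study; the model, parameter ranges and trajectory count are all illustrative, and this is not the PSUADE implementation.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def model(x):
        """Stand-in simulation response; x holds 4 inputs scaled to [0, 1]."""
        return np.sin(np.pi * x[0]) + 5.0 * x[2] ** 2 + 0.1 * x[1] + 0.01 * x[3]

    def elementary_effects(model, n_inputs=4, n_trajectories=50, delta=0.1):
        """Mean absolute elementary effect (mu*) per input, a screening measure."""
        mu_star = np.zeros(n_inputs)
        for _ in range(n_trajectories):
            x = rng.uniform(0.0, 1.0 - delta, size=n_inputs)
            base = model(x)
            for i in range(n_inputs):
                x_pert = x.copy()
                x_pert[i] += delta
                mu_star[i] += abs(model(x_pert) - base) / delta
        return mu_star / n_trajectories

    scores = elementary_effects(model)
    ranking = np.argsort(scores)[::-1]
    print("screening scores:", np.round(scores, 3))
    print("inputs ranked by influence:", ranking)
    ```

    Only the inputs that survive this screening would then be carried into the more expensive quantitative sensitivity analysis.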

  5. Analytical and empirical mathematics with computers

    International Nuclear Information System (INIS)

    Wolfram, S.

    1986-01-01

    In this presentation, some of the practical methodological and theoretical implications of computation for the mathematical sciences are discussed. Computers are becoming an increasingly significant tool for research in the mathematical sciences. This paper discusses some of the fundamental ways in which computers have and can be used to do mathematics

  6. A methodology for social experimentation

    DEFF Research Database (Denmark)

    Ravn, Ib

    A methodology is outlined whereby one may improve the performance of a social system to the satisfaction of its stakeholders, that is, facilitate desirable social and organizational transformations.

  7. Workshops as a Research Methodology

    Science.gov (United States)

    Ørngreen, Rikke; Levinsen, Karin

    2017-01-01

    This paper contributes to knowledge on workshops as a research methodology, and specifically on how such workshops pertain to e-learning. A literature review illustrated that workshops are discussed according to three different perspectives: workshops as a means, workshops as practice, and workshops as a research methodology. Focusing primarily on…

  8. Methodological Pluralism and Narrative Inquiry

    Science.gov (United States)

    Michie, Michael

    2013-01-01

    This paper considers how the integral theory model of Nancy Davis and Laurie Callihan might be enacted using a different qualitative methodology, in this case the narrative methodology. The focus of narrative research is shown to be on "what meaning is being made" rather than "what is happening here" (quadrant 2 rather than…

  9. Building ASIPS the Mescal methodology

    CERN Document Server

    Gries, Matthias

    2006-01-01

    A number of system designers use ASIPs rather than ASICs to implement their system solutions. This book gives a comprehensive methodology for the design of these application-specific instruction processors (ASIPs). It includes demonstrations of applications of the methodologies using the Tipi research framework.

  10. A methodology for software documentation

    OpenAIRE

    Torres Júnior, Roberto Dias; Ahlert, Hubert

    2000-01-01

    With the growing complexity of window-based software and the use of object-oriented techniques, software development is getting more complex than ever. Based on that, this article intends to present a methodology for software documentation and to analyze our experience of how this methodology can aid software maintenance

  11. Probabilistic risk assessment methodology

    International Nuclear Information System (INIS)

    Shinaishin, M.A.

    1988-06-01

    The objective of this work is to provide the tools necessary for clear identification of: the purpose of a Probabilistic Risk Study, the bounds and depth of the study, the proper modeling techniques to be used, the failure modes contributing to the analysis, the classical and Bayesian approaches for manipulating data necessary for quantification, ways for treating uncertainties, and available computer codes that may be used in performing such probabilistic analysis. In addition, it provides the means for measuring the importance of a safety feature to maintaining a level of risk at a Nuclear Power Plant and the worth of optimizing a safety system in risk reduction. In applying these techniques so that they accommodate our national resources and needs it was felt that emphasis should be put on the system reliability analysis level of PRA. Objectives of such studies could include: comparing systems' designs of the various vendors in the bidding stage, and performing grid reliability and human performance analysis using national specific data. (author)
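
    To illustrate the system-reliability level of PRA that the report emphasizes, here is a small sketch quantifying a top event from minimal cut sets with the rare-event approximation, plus a Fussell-Vesely importance measure; the cut sets and basic-event probabilities are invented for illustration only.

    ```python
    from math import prod

    # Basic event failure probabilities (illustrative values).
    p = {"pump_A": 1e-3, "pump_B": 1e-3, "valve_C": 5e-4, "power": 1e-4}

    # Minimal cut sets of a hypothetical top event "loss of cooling".
    cut_sets = [("pump_A", "pump_B"), ("valve_C",), ("power",)]

    def cut_set_prob(cs):
        return prod(p[e] for e in cs)

    # Rare-event approximation: P(top) ~ sum of cut-set probabilities.
    p_top = sum(cut_set_prob(cs) for cs in cut_sets)

    # Fussell-Vesely importance: fraction of top-event probability involving event e.
    def fussell_vesely(event):
        return sum(cut_set_prob(cs) for cs in cut_sets if event in cs) / p_top

    print(f"P(top) = {p_top:.2e}")
    for event in p:
        print(f"  FV importance of {event}: {fussell_vesely(event):.3f}")
    ```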

  12. Probabilistic risk assessment methodology

    Energy Technology Data Exchange (ETDEWEB)

    Shinaishin, M A

    1988-06-15

    The objective of this work is to provide the tools necessary for clear identification of: the purpose of a Probabilistic Risk Study, the bounds and depth of the study, the proper modeling techniques to be used, the failure modes contributing to the analysis, the classical and Bayesian approaches for manipulating data necessary for quantification, ways for treating uncertainties, and available computer codes that may be used in performing such probabilistic analysis. In addition, it provides the means for measuring the importance of a safety feature to maintaining a level of risk at a Nuclear Power Plant and the worth of optimizing a safety system in risk reduction. In applying these techniques so that they accommodate our national resources and needs it was felt that emphasis should be put on the system reliability analysis level of PRA. Objectives of such studies could include: comparing systems' designs of the various vendors in the bidding stage, and performing grid reliability and human performance analysis using national specific data. (author)

  13. Computer group

    International Nuclear Information System (INIS)

    Bauer, H.; Black, I.; Heusler, A.; Hoeptner, G.; Krafft, F.; Lang, R.; Moellenkamp, R.; Mueller, W.; Mueller, W.F.; Schati, C.; Schmidt, A.; Schwind, D.; Weber, G.

    1983-01-01

    The computer group has been reorganized to take charge of the general-purpose computers DEC10 and VAX and of the computer network (Dataswitch, DECnet, IBM connections to GSI and IPP, preparation for Datex-P). (orig.)

  14. Computer Engineers.

    Science.gov (United States)

    Moncarz, Roger

    2000-01-01

    Looks at computer engineers and describes their job, employment outlook, earnings, and training and qualifications. Provides a list of resources related to computer engineering careers and the computer industry. (JOW)

  15. Computer Music

    Science.gov (United States)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  16. Advanced Methodologies for NASA Science Missions

    Science.gov (United States)

    Hurlburt, N. E.; Feigelson, E.; Mentzel, C.

    2017-12-01

    Most of NASA's commitment to computational space science involves the organization and processing of Big Data from space-based satellites, and the calculations of advanced physical models based on these datasets. But considerable thought is also needed on what computations are needed. The science questions addressed by space data are so diverse and complex that traditional analysis procedures are often inadequate. The knowledge and skills of the statistician, applied mathematician, and algorithmic computer scientist must be incorporated into programs that currently emphasize engineering and physical science. NASA's culture and administrative mechanisms take full cognizance that major advances in space science are driven by improvements in instrumentation. But it is less well recognized that new instruments and science questions give rise to new challenges in the treatment of satellite data after it is telemetered to the ground. These issues might be divided into two stages: data reduction through software pipelines developed within NASA mission centers; and science analysis that is performed by hundreds of space scientists dispersed through NASA, U.S. universities, and abroad. Both stages benefit from the latest statistical and computational methods; in some cases, the science result is completely inaccessible using traditional procedures. This paper will review the current state of NASA and present example applications using modern methodologies.

  17. AEGIS methodology and a perspective from AEGIS methodology demonstrations

    International Nuclear Information System (INIS)

    Dove, F.H.

    1981-03-01

    Objectives of AEGIS (Assessment of Effectiveness of Geologic Isolation Systems) are to develop the capabilities needed to assess the post-closure safety of waste isolation in geologic formation; demonstrate these capabilities on reference sites; apply the assessment methodology to assist the NWTS program in site selection, waste package and repository design; and perform repository site analyses for the licensing needs of NWTS. This paper summarizes the AEGIS methodology, the experience gained from methodology demonstrations, and provides an overview in the following areas: estimation of the response of a repository to perturbing geologic and hydrologic events; estimation of the transport of radionuclides from a repository to man; and assessment of uncertainties

  18. New methodology for fast prediction of wheel wear evolution

    Science.gov (United States)

    Apezetxea, I. S.; Perez, X.; Casanueva, C.; Alonso, A.

    2017-07-01

    In railway applications, wear prediction at the wheel-rail interface is a fundamental matter for studying problems such as wheel lifespan and the evolution of vehicle dynamic characteristics over time. However, one of the principal drawbacks of the existing methodologies for calculating wear evolution is their computational cost. This paper proposes a new wear prediction methodology with a reduced computational cost. The methodology is based on two main steps: the first is the substitution of the calculations over the whole network by the calculation of the contact conditions at certain characteristic points, from whose results the wheel wear evolution can be inferred. The second is the substitution of the dynamic calculation (time integration calculations) by a quasi-static calculation (the solution of the quasi-static situation of the vehicle at a certain point, which is equivalent to neglecting the acceleration terms in the dynamic equations). These simplifications allow a significant reduction of computational cost to be obtained while maintaining an acceptable level of accuracy (errors of the order of 5-10%). Several case studies are analysed along the paper with the objective of assessing the proposed methodology. The results obtained in the case studies allow concluding that the proposed methodology is valid for an arbitrary vehicle running through an arbitrary track layout.
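
    A minimal sketch of the kind of wear-evolution loop such a methodology accelerates, using an Archard-type wear law evaluated at a single characteristic contact point with invented quasi-static contact conditions; the coefficients and the contact model are illustrative stand-ins, not the paper's method.

    ```python
    # Archard-type wear update at one characteristic wheel-rail contact point.
    k_wear = 1e-7          # dimensionless wear coefficient (illustrative)
    hardness = 2.8e9       # material hardness, Pa (illustrative)
    contact_area = 1.0e-4  # nominal contact patch area, m^2 (illustrative)
    km_per_step = 10_000.0

    def quasi_static_contact(wear_depth_mm):
        """Stand-in for the quasi-static vehicle calculation: normal force (N) and
        sliding fraction at the characteristic point, drifting as the profile wears."""
        normal_force = 80_000.0 * (1.0 + 0.02 * wear_depth_mm)
        sliding_fraction = 0.002 * (1.0 + 0.05 * wear_depth_mm)
        return normal_force, sliding_fraction

    wear_depth_mm = 0.0
    for step in range(10):  # 10 blocks of 10,000 km of running
        N, s = quasi_static_contact(wear_depth_mm)
        sliding_distance = s * km_per_step * 1000.0              # metres slid per block
        worn_volume = k_wear * N * sliding_distance / hardness   # Archard law, m^3
        wear_depth_mm += worn_volume / contact_area * 1000.0     # spread over the patch
        print(f"{(step + 1) * km_per_step:>8.0f} km: wear depth = {wear_depth_mm:.3f} mm")
    ```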

  19. Natural Computing in Computational Finance Volume 4

    CERN Document Server

    O’Neill, Michael; Maringer, Dietmar

    2012-01-01

    This book follows on from Natural Computing in Computational Finance  Volumes I, II and III.   As in the previous volumes of this series, the  book consists of a series of  chapters each of  which was selected following a rigorous, peer-reviewed, selection process.  The chapters illustrate the application of a range of cutting-edge natural  computing and agent-based methodologies in computational finance and economics.  The applications explored include  option model calibration, financial trend reversal detection, enhanced indexation, algorithmic trading,  corporate payout determination and agent-based modeling of liquidity costs, and trade strategy adaptation.  While describing cutting edge applications, the chapters are  written so that they are accessible to a wide audience. Hence, they should be of interest  to academics, students and practitioners in the fields of computational finance and  economics.  

  20. A large-scale computer facility for computational aerodynamics

    International Nuclear Information System (INIS)

    Bailey, F.R.; Balhaus, W.F.

    1985-01-01

    The combination of computer system technology and numerical modeling has advanced to the point that computational aerodynamics has emerged as an essential element in aerospace vehicle design methodology. To provide for further advances in modeling of aerodynamic flow fields, NASA has initiated at the Ames Research Center the Numerical Aerodynamic Simulation (NAS) Program. The objective of the Program is to develop a leading-edge, large-scale computer facility, and make it available to NASA, DoD, other Government agencies, industry and universities as a necessary element in ensuring continuing leadership in computational aerodynamics and related disciplines. The Program will establish an initial operational capability in 1986 and systematically enhance that capability by incorporating evolving improvements in state-of-the-art computer system technologies as required to maintain a leadership role. This paper briefly reviews the present and future requirements for computational aerodynamics and discusses the Numerical Aerodynamic Simulation Program objectives, computational goals, and implementation plans

  1. Optimal (Solvent) Mixture Design through a Decomposition Based CAMD methodology

    DEFF Research Database (Denmark)

    Achenie, L.; Karunanithi, Arunprakash T.; Gani, Rafiqul

    2004-01-01

    Computer Aided Molecular/Mixture design (CAMD) is one of the most promising techniques for solvent design and selection. A decomposition based CAMD methodology has been formulated where the mixture design problem is solved as a series of molecular and mixture design sub-problems. This approach is...

  2. A Java-Web-Based-Learning Methodology, Case Study ...

    African Journals Online (AJOL)

    A Java-Web-Based-Learning Methodology, Case Study : Waterborne diseases. The recent advances in web technologies have opened new opportunities for computer-based-education. One can learn independently of time and place constraints, and have instantaneous access to relevant updated material at minimal cost.

  3. Research Methodologies Explored for a Paradigm Shift in University Teaching.

    Science.gov (United States)

    Venter, I. M.; Blignaut, R. J.; Stoltz, D.

    2001-01-01

    Innovative teaching methods such as collaborative learning, teamwork, and mind maps were introduced to teach computer science and statistics courses at a South African university. Soft systems methodology was adapted and used to manage the research process of evaluating the effectiveness of the teaching methods. This research method provided proof…

  4. Methodology for reactor core physics analysis - part 2

    International Nuclear Information System (INIS)

    Ponzoni Filho, P.; Fernandes, V.B.; Lima Bezerra, J. de; Santos, T.I.C.

    1992-12-01

    The computer codes used for reactor core physics analysis are described. The modifications introduced in the public codes and the technical basis for the codes developed by the FURNAS utility are justified. An evaluation of the impact of these modifications on the parameter involved in qualifying the methodology is included. (F.E.). 5 ref, 7 figs, 5 tabs

  5. Non-invasive cardiac imaging. Spectrum, methodology, indication and interpretation

    International Nuclear Information System (INIS)

    Schaefers, Michael; Flachskampf, Frank; Sechtem, Udo; Achenbach, Stephan; Krause, Bernd J.; Schwaiger, Markus; Breithardt, Guenter

    2008-01-01

    The book contains 13 contributions covering the following chapters: (1) methodology: echocardiography, NMR imaging, nuclear medicine, computed tomography; (2) clinical protocols: contraction, cardiac valve function, perfusion and perfusion reserve, vitality, coronary imaging, transmitters, receptors, enzymes; (3) clinic: coronary heart diseases, non-ischemic heart diseases. The appendix contains two contributions on future developments and certification/standardization

  6. Methodology is more than research design and technology.

    Science.gov (United States)

    Proctor, Robert W

    2005-05-01

    The Society for Computers in Psychology has been at the forefront of disseminating information about advances in computer technology and their applications for psychologists. Although technological advances, as well as clean research designs, are key contributors to progress in psychological research, the justification of methodological rules for interpreting data and making theory choices is at least as important. Historically, methodological beliefs and practices have been justified through intuition and logic, an approach known as foundationism. However, naturalism, a modern approach in the philosophy of science inspired by the work of Thomas S. Kuhn, indicates that all aspects of scientific practice, including its methodology, should be evaluated empirically. This article examines implications of the naturalistic approach for psychological research methods in general and for the current debate that is often framed as one of qualitative versus quantitative methods.

  7. Prescriptive Training Courseware: IS-Design Methodology

    Directory of Open Access Journals (Sweden)

    Elspeth McKay

    2018-03-01

    Information systems (IS) research is found in many diverse communities. This paper explores the human dimension of human-computer interaction (HCI) to present IS-design practice in the light of courseware development. Assumptions are made that online courseware provides the perfect solution for maintaining a knowledgeable, well skilled workforce. However, empirical investigations into the effectiveness of information technology (IT)-induced training solutions are scarce. Contemporary research concentrates on information communications technology (ICT) training tools without considering their effectiveness. This paper offers a prescriptive IS-design methodology for managing the requirements for efficient and effective courseware development. To develop the methodology, we examined the main instructional design (ID) factors that affect the design of IT-induced training programs. We also examined the tension between maintaining a well-skilled workforce and effective instructional systems design (ISD) practice by probing the current ID models used by courseware developers since 1990. An empirical research project, which utilized this IS-design methodology, investigated the effectiveness of using IT to train government employees in introductory ethics; this was a study that operationalized the interactive effect of cognitive preference and instructional format on training performance outcomes. The data was analysed using Rasch item response theory (IRT), which models the discrimination of people’s performance relative to each other’s performance and the test-items’ difficulty relative to each test-item on the same logit scale. The findings revealed that IS training solutions developed using this IS-design methodology can be adapted to provide trainees with their preferred instructional mode and facilitate cost effective eTraining outcomes.

  8. Intelligent computing systems emerging application areas

    CERN Document Server

    Virvou, Maria; Jain, Lakhmi

    2016-01-01

    This book at hand explores emerging scientific and technological areas in which Intelligent Computing Systems provide efficient solutions and, thus, may play a role in the years to come. It demonstrates how Intelligent Computing Systems make use of computational methodologies that mimic nature-inspired processes to address real world problems of high complexity for which exact mathematical solutions, based on physical and statistical modelling, are intractable. Common intelligent computational methodologies are presented including artificial neural networks, evolutionary computation, genetic algorithms, artificial immune systems, fuzzy logic, swarm intelligence, artificial life, virtual worlds and hybrid methodologies based on combinations of the previous. The book will be useful to researchers, practitioners and graduate students dealing with mathematically-intractable problems. It is intended for both the expert/researcher in the field of Intelligent Computing Systems, as well as for the general reader in t...

  9. A programming approach to computability

    CERN Document Server

    Kfoury, A J; Arbib, Michael A

    1982-01-01

    Computability theory is at the heart of theoretical computer science. Yet, ironically, many of its basic results were discovered by mathematical logicians prior to the development of the first stored-program computer. As a result, many texts on computability theory strike today's computer science students as far removed from their concerns. To remedy this, we base our approach to computability on the language of while-programs, a lean subset of PASCAL, and postpone consideration of such classic models as Turing machines, string-rewriting systems, and μ-recursive functions till the final chapter. Moreover, we balance the presentation of unsolvability results such as the unsolvability of the Halting Problem with a presentation of the positive results of modern programming methodology, including the use of proof rules, and the denotational semantics of programs. Computer science seeks to provide a scientific basis for the study of information processing, the solution of problems by algorithms, and the design ...

  10. An Innovative Fuzzy-Logic-Based Methodology for Trend Identification

    International Nuclear Information System (INIS)

    Wang Xin; Tsoukalas, Lefteri H.; Wei, Thomas Y.C.; Reifman, Jaques

    2001-01-01

    A new fuzzy-logic-based methodology for on-line signal trend identification is introduced. The methodology may be used for detecting the onset of nuclear power plant (NPP) transients at the earliest possible time and could be of great benefit to diagnostic, maintenance, and performance-monitoring programs. Although signal trend identification is complicated by the presence of noise, fuzzy methods can help capture important features of on-line signals, integrate the information included in these features, and classify incoming NPP signals into increasing, decreasing, and steady-state trend categories. A computer program named PROTREN is developed and tested for the purpose of verifying this methodology using NPP and simulation data. The results indicate that the new fuzzy-logic-based methodology is capable of detecting transients accurately, it identifies trends reliably and does not misinterpret a steady-state signal as a transient one
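
    A toy sketch of fuzzy trend classification in the spirit described above: the recent slope of a noisy signal is fuzzified with triangular membership functions and assigned to decreasing, steady-state or increasing categories. The membership breakpoints, window length and test signal are invented, and this is not the PROTREN algorithm.

    ```python
    import numpy as np

    def triangular(x, a, b, c):
        """Triangular membership function peaking at b, zero outside [a, c]."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def classify_trend(signal, window=20, dt=1.0):
        """Fuzzify the least-squares slope of the last `window` samples."""
        y = np.asarray(signal[-window:])
        t = np.arange(len(y)) * dt
        slope = np.polyfit(t, y, 1)[0]
        memberships = {
            "decreasing":   triangular(slope, -1e9, -0.05, -0.005),
            "steady-state": triangular(slope, -0.02, 0.0, 0.02),
            "increasing":   triangular(slope, 0.005, 0.05, 1e9),
        }
        return max(memberships, key=memberships.get), memberships

    # Noisy rising signal: should be classified as "increasing".
    rng = np.random.default_rng(1)
    signal = 0.03 * np.arange(100) + rng.normal(0.0, 0.1, 100)
    label, degrees = classify_trend(signal)
    print(label, {k: round(v, 2) for k, v in degrees.items()})
    ```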

  11. Bridging Minds: A Mixed Methodology to Assess Networked Flow.

    Science.gov (United States)

    Galimberti, Carlo; Chirico, Alice; Brivio, Eleonora; Mazzoni, Elvis; Riva, Giuseppe; Milani, Luca; Gaggioli, Andrea

    2015-01-01

    The main goal of this contribution is to present a methodological framework to study Networked Flow, a bio-psycho-social theory of collective creativity applying it on creative processes occurring via a computer network. First, we draw on the definition of Networked Flow to identify the key methodological requirements of this model. Next, we present the rationale of a mixed methodology, which aims at combining qualitative, quantitative and structural analysis of group dynamics to obtain a rich longitudinal dataset. We argue that this integrated strategy holds potential for describing the complex dynamics of creative collaboration, by linking the experiential features of collaborative experience (flow, social presence), with the structural features of collaboration dynamics (network indexes) and the collaboration outcome (the creative product). Finally, we report on our experience with using this methodology in blended collaboration settings (including both face-to-face and virtual meetings), to identify open issues and provide future research directions.

  12. The analysis of RWAP(Rod Withdrawal at Power) using the KEPRI methodology

    International Nuclear Information System (INIS)

    Yang, C. K.; Kim, Y. H.

    2001-01-01

    KEPRI developed a new methodology based on RASP (Reactor Analysis Support Package). In this paper, the analysis of the RWAP (Rod Withdrawal at Power) accident, which can result in reactivity and power distribution anomalies, was performed using the KEPRI methodology. The calculation describes the RWAP transient and documents the analysis, including the computer code modeling assumptions and input parameters used in the analysis. To validate the new methodology, the results of the calculation were compared with the FSAR. Compared with the FSAR, the results of the calculation using the KEPRI methodology are similar, and the results of the sensitivity study of postulated parameters were similar to those of the existing methodology

  13. Analog computing

    CERN Document Server

    Ulmann, Bernd

    2013-01-01

    This book is a comprehensive introduction to analog computing. As most textbooks about this powerful computing paradigm date back to the 1960s and 1970s, it fills a void and forges a bridge from the early days of analog computing to future applications. The idea of analog computing is not new. In fact, this computing paradigm is nearly forgotten, although it offers a path to both high-speed and low-power computing, which are in even more demand now than they were back in the heyday of electronic analog computers.

  14. Computational composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.; Redström, Johan

    2007-01-01

    Computational composite is introduced as a new type of composite material. Arguing that this is not just a metaphorical maneuver, we provide an analysis of computational technology as material in design, which shows how computers share important characteristics with other materials used in design and architecture. We argue that the notion of computational composites provides a precise understanding of the computer as material, and of how computations need to be combined with other materials to come to expression as material. Besides working as an analysis of computers from a designer’s point of view, the notion of computational composites may also provide a link for computer science and human-computer interaction to an increasingly rapid development and use of new materials in design and architecture.

  15. Una plataforma de evaluación automática con una metodología efectiva para la enseñanza/aprendizaje en programación de computadores An automatic evaluation platform with an effective methodology for teaching/learning computer programming

    Directory of Open Access Journals (Sweden)

    Jorge López Reguera

    2011-08-01

    Learning to program computers is a difficult process for novice students and a challenge to the methodologies employed by teachers. This article presents an automatic evaluation platform that supports the teaching/learning process in introductory computer programming courses for engineering students at the Universidad de Concepción. The platform uses mechanisms that combine static/dynamic analysis and apply online comprehension/analysis evaluation, enabling personalized feedback to students. This work presents the criteria used to create didactic sequences of problems and a way to apply them effectively through automatic evaluation. The results obtained after six years of application show that automatic evaluation positively affects students' motivation, performance and self-efficacy. The methodology used to study the effectiveness of the platform includes qualitative and quantitative analysis. The qualitative aspects are drawn from observation of student behaviour during the learning process, while the quantitative analysis is based on the student data recorded by the platform. In order to accumulate quantitative information, we applied different experiments based on control groups and experimental groups of students.

  16. Methodological practicalities in analytical generalization

    DEFF Research Database (Denmark)

    Halkier, Bente

    2011-01-01

    In this article, I argue that the existing literature on qualitative methodologies tends to discuss analytical generalization at a relatively abstract and general theoretical level. It is, however, not particularly straightforward to “translate” such abstract epistemological principles into more operative methodological strategies for producing analytical generalizations in research practices. Thus, the aim of the article is to contribute to the discussions among qualitatively working researchers about generalizing by way of exemplifying some of the methodological practicalities in analytical generalization. Theoretically, the argumentation in the article is based on practice theory. The main part of the article describes three different examples of ways of generalizing on the basis of the same qualitative data material. There is a particular focus on describing the methodological strategies.

  17. Nanotoxicology materials, methodologies, and assessments

    CERN Document Server

    Durán, Nelson; Alves, Oswaldo L; Zucolotto, Valtencir

    2014-01-01

    This book begins with a detailed introduction to engineered nanostructures, followed by a section on methodologies used in research on cytotoxicity and genotoxicity, and concluding with evidence for the cyto- and genotoxicity of specific nanoparticles.

  18. Reflective Methodology: The Beginning Teacher

    Science.gov (United States)

    Templeton, Ronald K.; Siefert, Thomas E.

    1970-01-01

    Offers a variety of specific techniques which will help the beginning teacher to implement reflective methodology and create an inquiry-centered classroom atmosphere, at the same time meeting the many more pressing demands of first-year teaching. (JES)

  19. Quantum Computing

    OpenAIRE

    Scarani, Valerio

    1998-01-01

    The aim of this thesis was to explain what quantum computing is. The information for the thesis was gathered from books, scientific publications, and news articles. The analysis of the information revealed that quantum computing can be broken down into three areas: theories behind quantum computing explaining the structure of a quantum computer, known quantum algorithms, and the actual physical realizations of a quantum computer. The thesis reveals that moving from classical memor...

  20. Methodologies used in Project Management

    OpenAIRE

    UNGUREANU, Adrian; UNGUREANU, Anca

    2014-01-01

    Undoubtedly, a methodology properly defined and strictly followed for project management provides a firm guarantee that the work will be done on time, on budget and according to specifications. A project management methodology, in simple terms, is a “must-have” to avoid failure and reduce risks, because it is one of the critical success factors, alongside the basic skills of the management team. It is the simple way to guide the team through the design and execution phases, processes and tasks throughout...

  1. Methodology for ranking restoration options

    DEFF Research Database (Denmark)

    Jensen, Per Hedemann

    1999-01-01

    techniques as a function of contamination and site characteristics. The project includes analyses of existing remediation methodologies and contaminated sites, and is structured in the following steps: characterisation of relevant contaminated sites; identification and characterisation of relevant restoration techniques; assessment of the radiological impact; development and application of a selection methodology for restoration options; and formulation of generic conclusions and development of a manual. The project is intended to apply to situations in which sites with nuclear installations have been contaminated

  2. Organizational information assets classification model and security architecture methodology

    Directory of Open Access Journals (Sweden)

    Mostafa Tamtaji

    2015-12-01

    Today's organizations are exposed to a huge diversity of information and information assets that are produced in different systems such as KMS, financial and accounting systems, official and industrial automation systems and so on, and protection of this information is necessary. Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released. Several benefits of this model cause organizations to have a strong tendency toward implementing cloud computing. Maintaining and managing information security is the main challenge in developing and accepting this model. In this paper, at first, according to the "design science research methodology" and compatible with the "design process in information systems research", a complete categorization of organizational assets, including 355 different types of information assets in 7 groups and 3 levels, is presented so that managers are able to plan corresponding security controls according to the importance of each group. Then, to direct the organization in architecting its information security in a cloud computing environment, an appropriate methodology is presented. The presented cloud computing security architecture, the resulting proposed methodology, and the presented classification model were discussed and verified according to the Delphi method and expert comments.

  3. Methodical Approaches to Teaching of Computer Modeling in Computer Science Course

    Science.gov (United States)

    Rakhimzhanova, B. Lyazzat; Issabayeva, N. Darazha; Khakimova, Tiyshtik; Bolyskhanova, J. Madina

    2015-01-01

    The purpose of this study was to justify the technique for forming representations of modeling methodology in computer science lessons. The necessity of studying computer modeling lies in the fact that current trends toward strengthening the general educational and worldview functions of computer science call for additional research on the…

  4. Intelligence for embedded systems a methodological approach

    CERN Document Server

    Alippi, Cesare

    2014-01-01

    Addressing current issues of which any engineer or computer scientist should be aware, this monograph is a response to the need to adopt a new computational paradigm as the methodological basis for designing pervasive embedded systems with sensor capabilities. The requirements of this paradigm are to control complexity, to limit cost and energy consumption, and to provide adaptation and cognition abilities allowing the embedded system to interact proactively with the real world. The quest for such intelligence requires the formalization of a new generation of intelligent systems able to exploit advances in digital architectures and in sensing technologies. The book sheds light on the theory behind intelligence for embedded systems with specific focus on: robustness (the robustness of a computational flow and its evaluation); intelligence (how to mimic the adaptation and cognition abilities of the human brain); and the capacity to learn in non-stationary and evolv...

  5. Computational biomechanics for medicine imaging, modeling and computing

    CERN Document Server

    Doyle, Barry; Wittek, Adam; Nielsen, Poul; Miller, Karol

    2016-01-01

    The Computational Biomechanics for Medicine titles provide an opportunity for specialists in computational biomechanics to present their latest methodologies and advancements. This volume comprises eighteen of the newest approaches and applications of computational biomechanics, from researchers in Australia, New Zealand, USA, UK, Switzerland, Scotland, France and Russia. Some of the interesting topics discussed are: tailored computational models; traumatic brain injury; soft-tissue mechanics; medical image analysis; and clinically-relevant simulations. One of the greatest challenges facing the computational engineering community is to extend the success of computational mechanics to fields outside traditional engineering, in particular to biology, the biomedical sciences, and medicine. We hope the research presented within this book series will contribute to overcoming this grand challenge.

  6. Integral Design Methodology of Photocatalytic Reactors for Air Pollution Remediation

    Directory of Open Access Journals (Sweden)

    Claudio Passalía

    2017-06-01

    An integral reactor design methodology was developed to address the optimal design of photocatalytic wall reactors to be used in air pollution control. For a target pollutant to be eliminated from an air stream, the proposed methodology is initiated with a mechanistically derived reaction rate. The determination of intrinsic kinetic parameters is associated with the use of a simple geometry laboratory scale reactor, operation under kinetic control and a uniform incident radiation flux, which allows computing the local superficial rate of photon absorption. Thus, a simple model can describe the mass balance and a solution may be obtained. The kinetic parameters may be estimated by the combination of the mathematical model and the experimental results. The validated intrinsic kinetics obtained may be directly used in the scaling-up of any reactor configuration and size. The bench scale reactor may require the use of complex computational software to obtain the fields of velocity, radiation absorption and species concentration. The complete methodology was successfully applied to the elimination of airborne formaldehyde. The kinetic parameters were determined in a flat plate reactor, whilst a bench scale corrugated wall reactor was used to illustrate the scaling-up methodology. In addition, an optimal folding angle of the corrugated reactor was found using computational fluid dynamics tools.
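
    As a simplified illustration of the kinetic-parameter estimation step, the sketch below fits an apparent first-order rate constant to invented conversion data from a lab-scale reactor; the real methodology uses a mechanistically derived rate expression and the computed local rate of photon absorption, both omitted here.

    ```python
    import numpy as np

    # Invented lab data: residence time (s) vs outlet/inlet formaldehyde concentration ratio.
    residence_time = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
    c_ratio = np.array([0.82, 0.66, 0.55, 0.44, 0.37])

    # For apparent first-order disappearance, -ln(C/C0) = k_app * tau,
    # so k_app follows from a least-squares fit through the origin.
    y = -np.log(c_ratio)
    k_app = float(np.sum(residence_time * y) / np.sum(residence_time ** 2))

    predicted = np.exp(-k_app * residence_time)
    rmse = float(np.sqrt(np.mean((predicted - c_ratio) ** 2)))
    print(f"apparent rate constant k_app = {k_app:.3f} 1/s, fit RMSE = {rmse:.3f}")
    ```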

  7. [Methodological problems in the use of information technologies in physical education].

    Science.gov (United States)

    Martirosov, E G; Zaĭtseva, G A

    2000-01-01

    The paper considers methodological problems in the use of computer technologies in physical education by applying diagnostic and consulting systems, educational and educational-and-training process automation systems, and control and self-control programmes for athletes and others.

  8. MARC - the NRPB methodology for assessing radiological consequences of accidental releases of activity

    International Nuclear Information System (INIS)

    Clarke, R.H.; Kelly, G.N.

    1981-12-01

    The National Radiological Protection Board has developed a methodology for the assessment of the public health related consequences of accidental releases of radionuclides from nuclear facilities. The methodology consists of a suite of computer programs which predict the transfer of activity from the point of release to the atmosphere through to the population. The suite of programs is entitled MARC; Methodology for Assessing Radiological Consequences. This report describes the overall framework and philosophy utilised within MARC. (author)
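
    To give a flavour of the atmospheric-transfer step in such consequence-assessment suites, here is a small sketch of a textbook Gaussian plume ground-level air-concentration calculation for a continuous release; the dispersion parameters and release rate are illustrative, and this is not the dispersion model actually implemented in MARC.

    ```python
    import numpy as np

    def ground_level_concentration(Q, u, y, sigma_y, sigma_z, H):
        """Textbook Gaussian plume at ground level (z = 0) with ground reflection.
        Q: release rate (Bq/s), u: wind speed (m/s), H: effective release height (m).
        sigma_y and sigma_z would normally come from stability-class curves evaluated
        at the receptor's downwind distance."""
        lateral = np.exp(-y**2 / (2.0 * sigma_y**2))
        vertical = 2.0 * np.exp(-H**2 / (2.0 * sigma_z**2))  # reflection term at z = 0
        return Q / (2.0 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

    # Illustrative values: 1e9 Bq/s release, 5 m/s wind, on-axis receptor about 1 km downwind.
    conc = ground_level_concentration(Q=1e9, u=5.0, y=0.0, sigma_y=80.0, sigma_z=40.0, H=30.0)
    print(f"air concentration ~ {conc:.2e} Bq/m^3")
    ```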

  9. Computational Medicine

    DEFF Research Database (Denmark)

    Nygaard, Jens Vinge

    2017-01-01

    The Health Technology Program at Aarhus University applies computational biology to investigate the heterogeneity of tumours.

  10. Grid Computing

    Indian Academy of Sciences (India)

    A computing grid interconnects resources such as high performance computers, scientific databases, and computer controlled scientific instruments of cooperating organizations, each of which is autonomous. It precedes and is quite different from cloud computing, which provides computing resources by vendors to customers ...

  11. Green Computing

    Directory of Open Access Journals (Sweden)

    K. Shalini

    2013-01-01

    Green computing is all about using computers in a smarter and eco-friendly way. It is the environmentally responsible use of computers and related resources, which includes the implementation of energy-efficient central processing units, servers and peripherals as well as reduced resource consumption and proper disposal of electronic waste. Computers certainly make up a large part of many people's lives and traditionally are extremely damaging to the environment. Manufacturers of computers and their parts have been espousing the green cause to help protect the environment from computers and electronic waste in any way they can. Research continues into key areas such as making the use of computers as energy-efficient as possible, and designing algorithms and systems for efficiency-related computer technologies.

  12. 2015 Plan. Project 1: methodology and planning process of the Brazilian electric sector expansion

    International Nuclear Information System (INIS)

    1993-10-01

    The planning process of the Brazilian electric sector expansion, its normative aspects, instruments, main agents and planning cycles are described. The methodology of expansion planning is shown, with the interactions of the several study areas, the electric power market and the computer models used. Forecasts of the methodology's evolution are also presented. (C.G.C.)

  13. What the Current System Development Trends tell us about Systems Development Methodologies: Toward explaining SSDAM, Agile and IDEF0 Methodologies

    Directory of Open Access Journals (Sweden)

    Abdulla F. Ally

    2015-03-01

    Systems integration, customization and the component-based development approach are receiving increasing attention. This trend focuses research attention on systems development methodologies as well. The availability of systems development tools, rapid change in technologies, the evolution of mobile computing and the growth of cloud computing have necessitated a move toward systems integration and customization rather than developing systems from scratch. This tendency encourages component-based development and discourages the traditional systems development approach. The paper presents and evaluates the SSADM, IDEF0 and Agile systems development methodologies. More specifically, it examines how well they fit, or do not fit, into the current competitive market of systems development. From this perspective, it is anticipated that despite its popularity, the SSADM methodology is becoming obsolete, while the Agile and IDEF0 methodologies are still gaining acceptance in the current competitive market of systems development. The present study is likely to enrich our understanding of systems development methodology concepts and draw attention to where the current trends in systems development are heading.

  14. Modernising educational programmes in ICT based on the Tuning methodology

    Directory of Open Access Journals (Sweden)

    Alexander Bedny

    2014-07-01

    Full Text Available An analysis is presented of the experience of modernising undergraduate educational programs using the TUNING methodology, based on the example of the area of studies “Fundamental computer science and information technology” (FCSIT implemented at Lobachevsky State University of Nizhni Novgorod (Russia. The algorithm for reforming curricula for the subject area of information technology in accordance with the TUNING methodology is explained. A comparison is drawn between the existing Russian and European standards in the area of ICT education, including the European e-Competence Framework, with the focus on relevant competences. Some guidelines for the preparation of educational programmes are also provided.

  15. Methodology for economic evaluation of software development projects

    International Nuclear Information System (INIS)

    Witte, D.M.

    1990-01-01

    Many oil and gas exploration and production companies develop computer software in-house or with contract programmers to support their exploration activities. Software development projects compete for funding with exploration and development projects, though most companies lack valid comparison measures for the two types of projects. This paper presents a methodology of pro forma cash flow analysis for software development proposals intended for internal use. This methodology, based on estimates of development and support costs, exploration benefits, and probability of successful development and implementation, can be used to compare proposed software development projects directly with competing exploration proposals
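
    A minimal sketch of the kind of risk-weighted pro forma cash-flow comparison the methodology describes, with invented costs, benefits, success probability and discount rate; a real evaluation would of course use project-specific estimates.

    ```python
    # Expected net present value of an in-house software development proposal.
    discount_rate = 0.10
    dev_cost = 400_000.0          # year-0 development cost (illustrative)
    annual_support = 50_000.0     # yearly support cost if deployed (illustrative)
    annual_benefit = 250_000.0    # yearly exploration benefit if deployed (illustrative)
    p_success = 0.7               # probability of successful development and adoption
    years = 5

    def npv(cash_flows, rate):
        """Discount a list of (year, amount) cash flows to present value."""
        return sum(amount / (1.0 + rate) ** year for year, amount in cash_flows)

    success_flows = [(0, -dev_cost)] + [
        (t, annual_benefit - annual_support) for t in range(1, years + 1)
    ]
    failure_flows = [(0, -dev_cost)]  # only the sunk cost if the project fails

    expected_npv = (p_success * npv(success_flows, discount_rate)
                    + (1.0 - p_success) * npv(failure_flows, discount_rate))
    print(f"Expected NPV over {years} years: {expected_npv:,.0f}")
    ```

    The same figure computed for a competing exploration proposal would then provide the direct comparison the paper aims at.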

  16. Simplified methodology for analysis of Angra-1 containing

    International Nuclear Information System (INIS)

    Neves Conti, T. das; Souza, A.L. de; Sabundjian, G.

    1988-01-01

    A simplified methodology of analysis was developed to simulate a Large Break Loss of Coolant Accident in the Angra 1 Nuclear Power Station. Using the RELAP5/MOD1, RELAP4/MOD5 and CONTEMPT-LT codes, the time variation of pressure and temperature in the containment was analysed. The data obtained were compared with the Angra 1 Final Safety Analysis Report and with those calculated by a detailed model. The results obtained with this new methodology, such as the short computational simulation time, were satisfactory for a preliminary evaluation of the Angra 1 global parameters. (author) [pt

  17. Quantum computers and quantum computations

    International Nuclear Information System (INIS)

    Valiev, Kamil' A

    2005-01-01

    This review outlines the principles of operation of quantum computers and their elements. The theory of ideal computers that do not interact with the environment and are immune to quantum decohering processes is presented. Decohering processes in quantum computers are investigated. The review considers methods for correcting quantum computing errors arising from the decoherence of the state of the quantum computer, as well as possible methods for the suppression of the decohering processes. A brief enumeration of proposed quantum computer realizations concludes the review. (reviews of topical problems)

  18. Quantum Computing for Computer Architects

    CERN Document Server

    Metodi, Tzvetan

    2011-01-01

    Quantum computers can (in theory) solve certain problems far faster than a classical computer running any known classical algorithm. While existing technologies for building quantum computers are in their infancy, it is not too early to consider their scalability and reliability in the context of the design of large-scale quantum computers. To architect such systems, one must understand what it takes to design and model a balanced, fault-tolerant quantum computer architecture. The goal of this lecture is to provide architectural abstractions for the design of a quantum computer and to explore

  19. Pervasive Computing

    NARCIS (Netherlands)

    Silvis-Cividjian, N.

    This book provides a concise introduction to Pervasive Computing, otherwise known as Internet of Things (IoT) and Ubiquitous Computing (Ubicomp) which addresses the seamless integration of computing systems within everyday objects. By introducing the core topics and exploring assistive pervasive

  20. Computational vision

    CERN Document Server

    Wechsler, Harry

    1990-01-01

    The book is suitable for advanced courses in computer vision and image processing. In addition to providing an overall view of computational vision, it contains extensive material on topics that are not usually covered in computer vision texts (including parallel distributed processing and neural networks) and considers many real applications.

  1. Spatial Computation

    Science.gov (United States)

    2003-12-01

    Computation and today’s microprocessors with the approach to operating system architecture, and the controversy between microkernels and monolithic kernels...Both Spatial Computation and microkernels break away a relatively monolithic architecture into individual lightweight pieces, well specialized...for their particular functionality. Spatial Computation removes global signals and control, in the same way microkernels remove the global address

  2. Managerial Methodology in Public Institutions

    Directory of Open Access Journals (Sweden)

    Ion VERBONCU

    2010-10-01

    Full Text Available One of the most important ways of making public institutions more efficient is applying managerial methodology, embodied in the promotion of management tools and modern, sophisticated methodologies, as well as in the design/redesign and maintenance of the management process and its components. Their implementation bears the imprint of the constructive and functional particularities of public institutions, decentralized and devolved, and, of course, of the expertise of these organizations' managers. Managerial methodology is addressed through three important instruments: diagnosis, management by objectives and the scoreboard. Its presence in the performance management process should be mandatory, given its favorable influence on managerial and economic performance and on the rigour of the managers' approach.

  3. Blanket safety by GEMSAFE methodology

    International Nuclear Information System (INIS)

    Sawada, Tetsuo; Saito, Masaki

    2001-01-01

    General Methodology of Safety Analysis and Evaluation for Fusion Energy Systems (GEMSAFE) has been applied to a number of fusion system designs, such as the R-tokamak, the Fusion Experimental Reactor (FER), and the International Thermonuclear Experimental Reactor (ITER) designs in both the Conceptual Design Activities (CDA) and Engineering Design Activities (EDA) stages. Though the major objective of GEMSAFE is the reasonable selection of design basis events (DBEs), it is also useful for elucidating related safety functions as well as the requirements needed to ensure safety. In this paper, we apply the methodology to fusion systems with future tritium breeding blankets and clarify which points of the system should be of concern from the standpoint of ensuring safety. In this context, we have obtained five DBEs that are related to the blanket system. We have also clarified the safety functions required to prevent accident propagation initiated by those blanket-specific DBEs. The outline of the methodology is also reviewed. (author)

  4. The NLC Software Requirements Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Shoaee, Hamid

    2002-08-20

    We describe the software requirements and development methodology developed for the NLC control system. Given the longevity of that project, and the likely geographical distribution of the collaborating engineers, the planned requirements management process is somewhat more formal than the norm in high energy physics projects. The short term goals of the requirements process are to accurately estimate costs, to decompose the problem, and to determine likely technologies. The long term goal is to enable a smooth transition from high level functional requirements to specific subsystem and component requirements for individual programmers, and to support distributed development. The methodology covers both ends of that life cycle. It covers both the analytical and documentary tools for software engineering, and project management support. This paper introduces the methodology, which is fully described in [1].

  5. Methodology, theoretical framework and scholarly significance: An ...

    African Journals Online (AJOL)

    Methodology, theoretical framework and scholarly significance: An overview ... AFRICAN JOURNALS ONLINE (AJOL) · Journals · Advanced Search ... Keywords: Legal Research, Methodology, Theory, Pedagogy, Legal Training, Scholarship ...

  6. Methodological Guidelines for Advertising Research

    DEFF Research Database (Denmark)

    Rossiter, John R.; Percy, Larry

    2017-01-01

    In this article, highly experienced advertising academics and advertising research consultants John R. Rossiter and Larry Percy present and discuss what they believe to be the seven most important methodological guidelines that need to be implemented to improve the practice of advertising research....... Their focus is on methodology, defined as first choosing a suitable theoretical framework to guide the research study and then identifying the advertising responses that need to be studied. Measurement of those responses is covered elsewhere in this special issue in the article by Bergkvist and Langner. Most...

  7. Acoustic emission methodology and application

    CERN Document Server

    Nazarchuk, Zinoviy; Serhiyenko, Oleh

    2017-01-01

    This monograph analyses in detail the physical aspects of elastic wave radiation during deformation or fracture of materials. It presents the methodological bases for the practical use of acoustic emission devices and describes the results of theoretical and experimental research on evaluating the crack growth resistance of materials and on selecting the useful AE signals. The efficiency of this methodology is shown through the diagnostics of industrial objects of various purposes. The authors present results of experimental research obtained with the help of new methods and facilities.

  8. An LWR design decision Methodology

    International Nuclear Information System (INIS)

    Leahy, T.J.; Rees, D.C.; Young, J.

    1982-01-01

    While all parties involved in nuclear plant regulation endeavor to make decisions which optimize the considerations of plant safety and financial impacts, these decisions are generally made without the benefit of a systematic and rigorous approach to the questions confronting the decision makers. A Design Decision Methodology has been developed which provides such a systematic approach. By employing this methodology, which makes use of currently accepted probabilistic risk assessment techniques and cost estimation, informed decisions may be made against a background of comparisons between the relative levels of safety and costs associated with various design alternatives

  9. Qualitative methodology in developmental psychology

    DEFF Research Database (Denmark)

    Demuth, Carolin; Mey, Günter

    2015-01-01

    Qualitative methodology presently is gaining increasing recognition in developmental psychology. Although the founders of developmental psychology to a large extent already used qualitative procedures, the field was long dominated by a (post) positivistic quantitative paradigm. The increasing recognition ... in qualitative research offers a promising avenue to advance the field in this direction.

  10. Qualitative Methodology in Unfamiliar Cultures

    DEFF Research Database (Denmark)

    Svensson, Christian Franklin

    2014-01-01

    This case study discusses qualitative fieldwork in Malaysia. The trends in higher education led to investigating how and why young Indians and Chinese in Malaysia are using the university to pursue a life strategy. Given the importance of field context in designing and analysing research based on a qualitative methodology, conscious reflection on research design and objectivity is important when doing fieldwork. This case study discusses such reflections. Emphasis throughout is given to applied qualitative methodology and its contributions to the social sciences, in particular having to do...

  11. Observational methodology in sport sciences

    Directory of Open Access Journals (Sweden)

    M. Teresa Anguera

    2013-11-01

    Full Text Available This paper reviews the conceptual framework, the key literature and the methods (observation tools, such as category systems and field formats, and coding software, etc. that should be followed when conducting research from the perspective of observational methodology. The observational designs used by the authors’ research group over the last twenty years are discussed, and the procedures for analysing data and assessing their quality are described. Mention is also made of the latest methodological trends in this field, such as the use of mixed methods.

  12. Reflections on Design Methodology Research

    DEFF Research Database (Denmark)

    2011-01-01

    We shall reflect on the results of Design Methodology research and their impact on design practice. In the past 50 years the number of researchers in the field has expanded enormously – as has the number of publications. During the same period design practice and its products have changed...... and produced are also now far more complex and distributed, putting designers under ever increasing pressure. We shall address the question: Are the results of Design Methodology research appropriate and are they delivering the expected results in design practice? In our attempt to answer this question we...

  13. Methodologies for rapid evaluation of seismic demand levels in nuclear power plant structures

    International Nuclear Information System (INIS)

    Manrique, M.; Asfura, A.; Mukhim, G.

    1990-01-01

    A methodology for rapid assessment of both acceleration spectral peak and 'zero period acceleration' (ZPA) values for virtually any major structure in a nuclear power plant is presented. The methodology is based on spectral peak and ZPA amplification factors, developed from regression analyses of an analytical database. The developed amplification factors are applied to the plant's design ground spectrum to obtain amplified response parameters. A practical application of the methodology is presented. This paper also presents a methodology for calculating acceleration response spectrum curves at any number of desired damping ratios directly from a single known damping ratio spectrum. The methodology presented is particularly useful and directly applicable to older vintage nuclear power plant facilities (such as those affected by USI A-46). The methodology is based on principles of random vibration theory. The methodology has been implemented in a computer program (SPECGEN). SPECGEN results are compared with results obtained from time history analyses. (orig.)
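
    As an editorial aside, the following Python sketch illustrates the general idea of rescaling a known 5%-damped spectrum to other damping ratios; it uses the generic Eurocode-style correction factor eta = sqrt(10/(5+xi)) purely for illustration, not the random-vibration-theory formulation implemented in SPECGEN, and the spectrum values are assumed.

        # Illustrative sketch only: scaling an assumed 5%-damped response spectrum to
        # other damping ratios with a generic correction factor, eta = sqrt(10/(5+xi)).
        # This is NOT the SPECGEN random-vibration formulation described in the paper.
        import numpy as np

        def damping_correction(xi_percent):
            """Generic spectral scaling factor relative to 5% damping (xi in percent)."""
            return np.sqrt(10.0 / (5.0 + xi_percent))

        freqs = np.array([1.0, 2.5, 5.0, 10.0, 33.0])      # Hz (assumed grid)
        sa_5pct = np.array([0.8, 1.2, 1.0, 0.6, 0.3])      # g, assumed 5%-damped spectrum
        for xi in (2.0, 7.0, 10.0):
            print(f"{xi:4.1f}% damping:", np.round(sa_5pct * damping_correction(xi), 3))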

  14. Parallel computations

    CERN Document Server

    1982-01-01

    Parallel Computations focuses on parallel computation, with emphasis on algorithms used in a variety of numerical and physical applications and for many different types of parallel computers. Topics covered range from vectorization of fast Fourier transforms (FFTs) and of the incomplete Cholesky conjugate gradient (ICCG) algorithm on the Cray-1 to calculation of table lookups and piecewise functions. Single tridiagonal linear systems and vectorized computation of reactive flow are also discussed.Comprised of 13 chapters, this volume begins by classifying parallel computers and describing techn

  15. Human Computation

    CERN Multimedia

    CERN. Geneva

    2008-01-01

    What if people could play computer games and accomplish work without even realizing it? What if billions of people collaborated to solve important problems for humanity or generate training data for computers? My work aims at a general paradigm for doing exactly that: utilizing human processing power to solve computational problems in a distributed manner. In particular, I focus on harnessing human time and energy for addressing problems that computers cannot yet solve. Although computers have advanced dramatically in many respects over the last 50 years, they still do not possess the basic conceptual intelligence or perceptual capabilities...

  16. Quantum computation

    International Nuclear Information System (INIS)

    Deutsch, D.

    1992-01-01

    As computers become ever more complex, they inevitably become smaller. This leads to a need for components which are fabricated and operate on increasingly smaller size scales. Quantum theory is already taken into account in microelectronics design. This article explores how quantum theory will need to be incorporated into the design of future computers in order for their components to function. Computation tasks which depend on quantum effects will become possible. Physicists may have to reconsider their perspective on computation in the light of understanding developed in connection with universal quantum computers. (UK)

  17. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of types, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  18. Computer sciences

    Science.gov (United States)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  19. Methodologic frontiers in environmental epidemiology.

    OpenAIRE

    Rothman, K J

    1993-01-01

    Environmental epidemiology comprises the epidemiologic study of those environmental factors that are outside the immediate control of the individual. Exposures of interest to environmental epidemiologists include air pollution, water pollution, occupational exposure to physical and chemical agents, as well as psychosocial elements of environmental concern. The main methodologic problem in environmental epidemiology is exposure assessment, a problem that extends through all of epidemiologic re...

  20. IMSF: Infinite Methodology Set Framework

    Science.gov (United States)

    Ota, Martin; Jelínek, Ivan

    Software development is usually an integration task in an enterprise environment - few software applications work autonomously now. It is usually a collaboration of heterogeneous and unstable teams. One serious problem is the lack of resources, a popular result being outsourcing, ‘body shopping’, and indirectly team and team member fluctuation. Outsourced sub-deliveries easily become black boxes with no clear development method used, which has a negative impact on supportability. Such environments then often face the problems of quality assurance and enterprise know-how management. The methodology used is one of the key factors. Each methodology was created as a generalization of a number of solved projects, and each methodology is thus more or less connected with a set of task types. When the task type is not suitable, it causes problems that usually result in an undocumented ad-hoc solution. This was the motivation behind formalizing a simple process for collaborative software engineering. The Infinite Methodology Set Framework (IMSF) defines the ICT business process of adaptive use of methods for classified types of tasks. The article introduces IMSF and briefly comments on its meta-model.

  1. Unattended Monitoring System Design Methodology

    International Nuclear Information System (INIS)

    Drayer, D.D.; DeLand, S.M.; Harmon, C.D.; Matter, J.C.; Martinez, R.L.; Smith, J.D.

    1999-01-01

    A methodology for designing Unattended Monitoring Systems starting at a systems level has been developed at Sandia National Laboratories. This proven methodology provides a template that describes the process for selecting and applying appropriate technologies to meet unattended system requirements, as well as providing a framework for development of both training courses and workshops associated with unattended monitoring. The design and implementation of unattended monitoring systems is generally intended to respond to some form of policy based requirements resulting from international agreements or domestic regulations. Once the monitoring requirements are established, a review of the associated process and its related facilities enables identification of strategic monitoring locations and development of a conceptual system design. The detailed design effort results in the definition of detection components as well as the supporting communications network and data management scheme. The data analysis then enables a coherent display of the knowledge generated during the monitoring effort. The resultant knowledge is then compared to the original system objectives to ensure that the design adequately addresses the fundamental principles stated in the policy agreements. Implementation of this design methodology will ensure that comprehensive unattended monitoring system designs provide appropriate answers to those critical questions imposed by specific agreements or regulations. This paper describes the main features of the methodology and discusses how it can be applied in real world situations

  2. Environmental Zoning: Some methodological implications

    NARCIS (Netherlands)

    Ike, Paul; Voogd, Henk

    1991-01-01

    The purpose of this article is to discuss some methodological problems of environmental zoning. The principle of environmental zoning will be elaborated. In addition an overview is given of a number of approaches that have been followed in practice to arrive at an integral judgement. Finally some

  3. A methodology for string resolution

    International Nuclear Information System (INIS)

    Karonis, N.T.

    1992-11-01

    In this paper we present a methodology, not a tool. We present this methodology with the intent that it be adopted, on a case by case basis, by each of the existing tools in EPICS. In presenting this methodology, we describe each of its two components in detail and conclude with an example depicting how the methodology can be used across a pair of tools. The task of any control system is to provide access to the various components of the machine being controlled, for example, the Advanced Photon Source (APS). By access, we mean the ability to monitor the machine's status (reading) as well as the ability to explicitly change its status (writing). The Experimental Physics and Industrial Control System (EPICS) is a set of tools, designed to act in concert, that allows one to construct a control system. EPICS provides the ability to construct a control system that allows reading and writing access to the machine. It does this through the notion of databases. Each of the components of the APS that is accessed by the control system is represented in EPICS by a set of named database records. Once this abstraction is made, from physical device to named database records, the process of monitoring and changing the state of that device becomes the simple process of reading and writing information from and to its associated named records
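
    As an editorial illustration of the named-record abstraction described above (a simplified Python sketch, not EPICS code), the following fragment shows how monitoring and control reduce to reading and writing named database records; the record name used is hypothetical.

        # Minimal sketch of the "named database record" abstraction: once a device is
        # represented by named records, monitoring and control become reads and writes.
        class Record:
            def __init__(self, name, value=0.0):
                self.name, self.value = name, value

        class Database:
            def __init__(self):
                self._records = {}
            def add(self, record):
                self._records[record.name] = record
            def read(self, name):            # monitor the machine's status
                return self._records[name].value
            def write(self, name, value):    # explicitly change the machine's status
                self._records[name].value = value

        db = Database()
        db.add(Record("APS:S1:QUAD1:CURRENT", 120.5))   # hypothetical record name
        db.write("APS:S1:QUAD1:CURRENT", 121.0)
        print(db.read("APS:S1:QUAD1:CURRENT"))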

  4. Counting stem cells : methodological constraints

    NARCIS (Netherlands)

    Bystrykh, Leonid V.; Verovskaya, Evgenia; Zwart, Erik; Broekhuis, Mathilde; de Haan, Gerald

    The number of stem cells contributing to hematopoiesis has been a matter of debate. Many studies use retroviral tagging of stem cells to measure clonal contribution. Here we argue that methodological factors can impact such clonal analyses. Whereas early studies had low resolution, leading to

  5. Test reactor risk assessment methodology

    International Nuclear Information System (INIS)

    Jennings, R.H.; Rawlins, J.K.; Stewart, M.E.

    1976-04-01

    A methodology has been developed for the identification of accident initiating events and the fault modeling of systems, including common mode identification, as these methods are applied in overall test reactor risk assessment. The methods are exemplified by a determination of risks due to a loss of primary coolant flow in the Engineering Test Reactor

  6. METHODOLOGICAL ELEMENTS OF SITUATIONAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Tetyana KOVALCHUK

    2016-07-01

    Full Text Available The article deals with the investigation of the theoretical and methodological principles of situational analysis. The necessity of situational analysis in modern conditions is substantiated, and the notion of “situational analysis” is defined. We conclude that situational analysis is a continuous, systematic study whose purpose is to identify the signs of a dangerous situation, to evaluate such signs comprehensively as they are influenced by a system of objective and subjective factors, to search for motivated, targeted actions that eliminate the adverse effects of the system's exposure to the situation now and in the future, and to develop the managerial actions needed to bring the system back to normal. A methodological approach to situational analysis is developed, its goal is substantiated, and the expediency of its diagnostic, evaluative and search functions is demonstrated. The basic methodological elements of situational analysis are grounded. The substantiation of these principal methodological elements will enable the analyst to develop adaptive methods that take into account the peculiar features of a unique object, namely a situation that has emerged in a complex system, to diagnose such a situation and subject it to systematic, in-depth analysis, to identify risks and opportunities, and to make timely management decisions as required by a particular period.

  7. 16 Offsetting deficit conceptualisations: methodological ...

    African Journals Online (AJOL)

    uses the concepts of literacy practices and knowledge recontextualisation to ... 1996, 2000) theory of 'knowledge recontextualisation' in the development of curricula .... cognitive, social and cultural abilities needed to fit in and thrive in the HE learning .... this argument, that a methodology and analytic framework that seeks to ...

  8. Methodology for the case studies

    NARCIS (Netherlands)

    Smits, M.J.W.; Woltjer, G.B.

    2017-01-01

    This document is about the methodology and selection of the case studies. It is meant as a guideline for the case studies, and together with the other reports in this work package can be a source of information for policy officers, interest groups and researchers evaluating or performing impact

  9. Safety at Work : Research Methodology

    NARCIS (Netherlands)

    Beurden, van K. (Karin); Boer, de J. (Johannes); Brinks, G. (Ger); Goering-Zaburnenko, T. (Tatiana); Houten, van Y. (Ynze); Teeuw, W. (Wouter)

    2012-01-01

    In this document, we provide the methodological background for the Safety at Work project. This document combines several project deliverables as defined in the overall project plan: validation techniques and methods (D5.1.1), performance indicators for safety at work (D5.1.2), personal protection

  10. Computer programming and computer systems

    CERN Document Server

    Hassitt, Anthony

    1966-01-01

    Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. This text discusses the automatic electronic digital computers, symbolic language, Reverse Polish Notation, and Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, general-purpose input routine, and efficient use of memory are also elaborated. This publication is inten

  11. Mapping the Most Significant Computer Hacking Events to a Temporal Computer Attack Model

    OpenAIRE

    Heerden , Renier ,; Pieterse , Heloise; Irwin , Barry

    2012-01-01

    Part 4: Section 3: ICT for Peace and War; International audience; This paper presents eight of the most significant computer hacking events (also known as computer attacks). These events were selected because of their unique impact, methodology, or other properties. A temporal computer attack model is presented that can be used to model computer based attacks. This model consists of the following stages: Target Identification, Reconnaissance, Attack, and Post-Attack Reconnaissance. The...

  12. A methodology for radiological accidents analysis in industrial gamma radiography

    International Nuclear Information System (INIS)

    Silva, F.C.A. da.

    1990-01-01

    A critical review of 34 published severe radiological accidents in industrial gamma radiography, which happened in 15 countries from 1960 to 1988, was performed. The most frequent causes, consequences and dose estimation methods were analysed, aiming to establish better procedures for radiation safety and accident analysis. The objective of this work is to elaborate a methodology for analysing radiological accidents in industrial gamma radiography. The suggested methodology will enable professionals to determine the true causes of the event and to estimate the dose with good certainty. The technical analytical tree, recommended by the International Atomic Energy Agency for radiation protection and nuclear safety programmes, was adopted in the elaboration of the suggested methodology. The viability of using the Electron Gamma Shower 4 (EGS4) computer code system to calculate the absorbed dose in radiological accidents in industrial gamma radiography, mainly in 192Ir radioactive source handling situations, was also studied. (author)
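
    As an editorial illustration (not the EGS4-based approach studied in the paper), a first-order dose reconstruction for a source-handling accident can be sketched with a point-source, inverse-square model; the dose-rate constant and scenario values below are placeholders, not reference data.

        # Illustrative sketch only: point-source, inverse-square dose estimate for a
        # gamma radiography source-handling scenario. The dose-rate constant and all
        # scenario values are assumed placeholders; real analyses use reference data.

        def dose_msv(activity_gbq, gamma_constant, distance_m, time_h):
            """Dose (mSv) ~ Gamma * A * t / d^2, with Gamma in mSv*m^2/(GBq*h)."""
            return gamma_constant * activity_gbq * time_h / distance_m ** 2

        print(dose_msv(activity_gbq=1000.0,   # ~1 TBq radiography source (assumed)
                       gamma_constant=0.13,   # assumed illustrative value
                       distance_m=0.5,
                       time_h=0.05))          # about 3 minutes of handling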

  13. Environmental impact statement analysis: dose methodology

    International Nuclear Information System (INIS)

    Mueller, M.A.; Strenge, D.L.; Napier, B.A.

    1981-01-01

    Standardized sections and methodologies are being developed for use in environmental impact statements (EIS) for activities to be conducted on the Hanford Reservation. Five areas for standardization have been identified: routine operations dose methodologies, accident dose methodology, Hanford Site description, health effects methodology, and socioeconomic environment for Hanford waste management activities

  14. A Critique of Methodological Dualism in Education

    Science.gov (United States)

    Yang, Jeong A.; Yoo, Jae-Bong

    2018-01-01

    This paper aims to critically examine the paradigm of methodological dualism and explore whether methodologies in social science currently are appropriate for educational research. There are two primary methodologies for studying education: quantitative and qualitative methods. This is what we mean by "methodological dualism". Is…

  15. Feminist Methodologies and Engineering Education Research

    Science.gov (United States)

    Beddoes, Kacey

    2013-01-01

    This paper introduces feminist methodologies in the context of engineering education research. It builds upon other recent methodology articles in engineering education journals and presents feminist research methodologies as a concrete engineering education setting in which to explore the connections between epistemology, methodology and theory.…

  16. Organic Computing

    CERN Document Server

    Würtz, Rolf P

    2008-01-01

    Organic Computing is a research field emerging around the conviction that problems of organization in complex systems in computer science, telecommunications, neurobiology, molecular biology, ethology, and possibly even sociology can be tackled scientifically in a unified way. From the computer science point of view, the apparent ease in which living systems solve computationally difficult problems makes it inevitable to adopt strategies observed in nature for creating information processing machinery. In this book, the major ideas behind Organic Computing are delineated, together with a sparse sample of computational projects undertaken in this new field. Biological metaphors include evolution, neural networks, gene-regulatory networks, networks of brain modules, hormone system, insect swarms, and ant colonies. Applications are as diverse as system design, optimization, artificial growth, task allocation, clustering, routing, face recognition, and sign language understanding.

  17. Solving computationally expensive engineering problems

    CERN Document Server

    Leifsson, Leifur; Yang, Xin-She

    2014-01-01

    Computational complexity is a serious bottleneck for the design process in virtually any engineering area. While migration from prototyping and experimental-based design validation to verification using computer simulation models is inevitable and has a number of advantages, high computational costs of accurate, high-fidelity simulations can be a major issue that slows down the development of computer-aided design methodologies, particularly those exploiting automated design improvement procedures, e.g., numerical optimization. The continuous increase of available computational resources does not always translate into shortening of the design cycle because of the growing demand for higher accuracy and necessity to simulate larger and more complex systems. Accurate simulation of a single design of a given system may be as long as several hours, days or even weeks, which often makes design automation using conventional methods impractical or even prohibitive. Additional problems include numerical noise often pr...

  18. Computational biomechanics

    International Nuclear Information System (INIS)

    Ethier, C.R.

    2004-01-01

    Computational biomechanics is a fast-growing field that integrates modern biological techniques and computer modelling to solve problems of medical and biological interest. Modelling of blood flow in the large arteries is the best-known application of computational biomechanics, but there are many others. Described here is work being carried out in the laboratory on the modelling of blood flow in the coronary arteries and on the transport of viral particles in the eye. (author)

  19. An intelligent design methodology for nuclear power systems

    International Nuclear Information System (INIS)

    Nassersharif, B.; Martin, R.P.; Portal, M.G.; Gaeta, M.J.

    1989-01-01

    The goal of this investigation is to research possible methodologies for automating the design of nuclear power facilities specifically; the work is, however, relevant to all thermal power systems. The strategy of this research has been to concentrate on individual areas of the thermal design process, investigate the procedures performed, develop methodology to emulate that behavior, and prototype it in the form of a computer program. The design process has been generalized as follows: problem definition, design definition, component selection procedure, optimization and engineering analysis, testing, and final design, with the problem definition defining constraints that are applied to the selection procedure as well as to the design definition. The result of this research is a prototype computer program applying an original procedure for the selection of the best set of real components that would be used in constructing a system with the desired performance characteristics. The mathematical model used for the selection procedure is possibility theory
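
    As an editorial illustration of how possibility theory can drive a component selection step (a sketch under assumed membership values, not the prototype program described in the paper), the fragment below ranks candidate components by the minimum of their membership degrees across the design criteria.

        # Illustrative sketch only: rank candidates by a possibility measure, taken here
        # as the minimum membership degree over all design criteria (min-combination rule).
        # Component names, criteria and membership values are hypothetical.

        def possibility(memberships):
            """Degree to which a candidate satisfies all criteria simultaneously."""
            return min(memberships.values())

        candidates = {
            "pump_A": {"flow": 0.9, "head": 0.7, "cost": 0.6},
            "pump_B": {"flow": 0.8, "head": 0.9, "cost": 0.4},
            "pump_C": {"flow": 0.6, "head": 0.8, "cost": 0.9},
        }
        best = max(candidates, key=lambda c: possibility(candidates[c]))
        print(best, possibility(candidates[best]))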

  20. New methodologies for living material imaging. Compilation of summaries

    International Nuclear Information System (INIS)

    Barabino, Gabriele; Beaurepaire, Emmanuel; Betrouni, Nacim; Montagnat, Johan; Moonen, Chrit; Olivo-Marin, Jean-Christophe; Paul-Gilloteaux, Perrine; Tillement, Olivier; Barbier, Emmanuel; Beuf, Olivier; Chamot, Christophe; Clarysse, Patrick; Coll, Jean-Luc; Dojat, Michel; Lartizien, Carole; Peyrin, Francoise; Ratiney, Helene; Texier-Nogues, Isabelle; Usson, Yves; Vial, Jean-Claude; Gaillard, Sophie; Aubry, Jean-Francois; Barillot, Christian; Betrouni, Nacim; Beloeil, Jean-Claude; Bernard, Monique; Bridal, Lori; Coll, Jean-Luc; Cozzone, Patrick; Cuenod, Charles-Andre; Darrasse, Luc; Franconi, Jean-Michel; Frapart, Yves-Michel; Grenier, Nicolas; Guilloteau, Denis; Laniece, Philippe; Guilloteau, Denis; Laniece, Philippe; Lethimonnier, Franck; Moonen, Chrit; Pain, Frederic; Patat, Frederic; Tanter, Mickael; Trebossen, Regine; Van Beers, Bernard; Visvikis, Dimitris; Buvat, Irene; Carrault, Guy; Frouin, Frederique; Kouame, Denis; Meste, Olivier; Peyrin, Francoise; Brasse, David; Buvat, Irene; Dauvergne, Denis; Haddad, Ferid; Menard, Laurent; Ouadi, Ali; Olivo-Marin, Jean-Christophe; Pansu, Robert; Peyrieras, Nadine; Salamero, Jean; Usson, Yves; Werts, Martin; Beaurepaire, Emmanuel; Blanchoin, Laurent; Boltze, Frederic; Cavalli, Giacomo; Choquet, Daniel; Coppey, Maite; Dahan, Maxime; Dieterlen, Alain; Ducommun, Bernard; Favard, Cyril; Fort, Emmanuel; Gadal, Olivier; Heliot, Laurent; Hofflack, Bernard; Kervrann, Charles; Langowski, Jorg; LeBivic, Andre; Leveque-Fort, Sandrine; Matthews, Cedric; Monneret, Serge; Mordon, Serge; Mely, Yves

    2012-12-01

    Living material imaging, which is essential to medical diagnosis and therapy methods as well as to fundamental and applied biology, is necessarily pluri-disciplinary, at the intersection of physics, (bio)chemistry and pharmacy, and requires mathematical and computer processing of signals and images. Image processing techniques may be applied at different levels (molecular, cellular or tissue level) or using various modes (optics, X rays, NMR, PET, US). This conference therefore presents recent methodological developments addressing the study of living material. The programme of the conference started with a plenary session (multimode non-linear microscopy of tissues and embryonic morphogenesis) followed by 6 sessions whose titles are: (1) new microscopies applied to living materials, (2) agents for molecular and functional imaging, (3) recent developments in methodologies and instrumentation, (4) image processing methods and techniques, (5) image-aided diagnosis, therapy and medical surveillance, (6) heterogeneous databases and distributed computations

  1. Waste Package Component Design Methodology Report

    International Nuclear Information System (INIS)

    D.C. Mecham

    2004-01-01

    This Executive Summary provides an overview of the methodology being used by the Yucca Mountain Project (YMP) to design waste packages and ancillary components. This summary information is intended for readers with general interest, but also provides technical readers a general framework surrounding a variety of technical details provided in the main body of the report. The purpose of this report is to document and ensure that appropriate design methods are used in the design of waste packages and ancillary components (the drip shields and emplacement pallets). The methodology includes identification of the necessary design inputs, justification of design assumptions, and use of appropriate analysis methods and computational tools. This design work is subject to the "Quality Assurance Requirements and Description". The document is primarily intended for internal use and technical guidance for a variety of design activities. It is recognized that a wide audience, including project management, the U.S. Department of Energy (DOE), the U.S. Nuclear Regulatory Commission, and others, is interested in the design methods at various levels of detail, and the report therefore covers a wide range of topics at varying levels of detail. Due to the preliminary nature of the design, readers can expect to encounter varied levels of detail in the body of the report. It is expected that technical information used as input to design documents will be verified and taken from the latest versions of the reference sources given herein. This revision of the methodology report has evolved with changes in the waste package, drip shield, and emplacement pallet designs over many years and may be further revised as the design is finalized. Different components and analyses are at different stages of development. Some parts of the report are detailed, while other, less detailed parts are likely to undergo further refinement. The design methodology is intended to provide designs that satisfy the safety and operational

  2. Waste Package Component Design Methodology Report

    Energy Technology Data Exchange (ETDEWEB)

    D.C. Mecham

    2004-07-12

    This Executive Summary provides an overview of the methodology being used by the Yucca Mountain Project (YMP) to design waste packages and ancillary components. This summary information is intended for readers with general interest, but also provides technical readers a general framework surrounding a variety of technical details provided in the main body of the report. The purpose of this report is to document and ensure that appropriate design methods are used in the design of waste packages and ancillary components (the drip shields and emplacement pallets). The methodology includes identification of the necessary design inputs, justification of design assumptions, and use of appropriate analysis methods and computational tools. This design work is subject to the "Quality Assurance Requirements and Description". The document is primarily intended for internal use and technical guidance for a variety of design activities. It is recognized that a wide audience, including project management, the U.S. Department of Energy (DOE), the U.S. Nuclear Regulatory Commission, and others, is interested in the design methods at various levels of detail, and the report therefore covers a wide range of topics at varying levels of detail. Due to the preliminary nature of the design, readers can expect to encounter varied levels of detail in the body of the report. It is expected that technical information used as input to design documents will be verified and taken from the latest versions of the reference sources given herein. This revision of the methodology report has evolved with changes in the waste package, drip shield, and emplacement pallet designs over many years and may be further revised as the design is finalized. Different components and analyses are at different stages of development. Some parts of the report are detailed, while other, less detailed parts are likely to undergo further refinement. The design methodology is intended to provide designs that satisfy the safety

  3. Computational Composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.

    to understand the computer as a material like any other material we would use for design, like wood, aluminum, or plastic. That as soon as the computer forms a composition with other materials it becomes just as approachable and inspiring as other smart materials. I present a series of investigations of what...... Computational Composite, and Telltale). Through the investigations, I show how the computer can be understood as a material and how it partakes in a new strand of materials whose expressions come to be in context. I uncover some of their essential material properties and potential expressions. I develop a way...

  4. Spintronics-based computing

    CERN Document Server

    Prenat, Guillaume

    2015-01-01

    This book provides a comprehensive introduction to spintronics-based computing for the next generation of ultra-low power/highly reliable logic, which is widely considered a promising candidate to replace conventional, pure CMOS-based logic. It will cover aspects from device to system-level, including magnetic memory cells, device modeling, hybrid circuit structure, design methodology, CAD tools, and technological integration methods. This book is accessible to a variety of readers and little or no background in magnetism and spin electronics are required to understand its content.  The multidisciplinary team of expert authors from circuits, devices, computer architecture, CAD and system design reveal to readers the potential of spintronics nanodevices to reduce power consumption, improve reliability and enable new functionality.  .

  5. Interaction between core analysis methodology and nuclear design: some PWR examples

    International Nuclear Information System (INIS)

    Rothleder, B.M.; Eich, W.J.

    1982-01-01

    The interaction between core analysis methodology and nuclear design is exemplified by PSEUDAX, a major improvement related to the Advanced Recycle Methodology Program (ARMP) computer code system, still undergoing development by the Electric Power Research Institute. The mechanism of this interaction is explored by relating several specific nuclear design changes to the demands placed by these changes on the ARMP system, and by examining the meeting of these demands, first within the standard ARMP methodology and then through augmentation of the standard methodology by the development of PSEUDAX

  6. Taipower's transient analysis methodology for pressurized water reactors

    International Nuclear Information System (INIS)

    Huang, Pinghue

    1998-01-01

    The methodology presented in this paper is a part of 'Taipower's Reload Design and Transient Analysis Methodologies for Light Water Reactors' developed by the Taiwan Power Company (TPC) and the Institute of Nuclear Energy Research. This methodology utilizes four computer codes developed or sponsored by the Electric Power Research Institute: the system transient analysis code RETRAN-02, the core thermal-hydraulic analysis code COBRA-IIIC, the three-dimensional spatial kinetics code ARROTTA, and the fuel rod evaluation code FREY. Each of the computer codes was extensively validated. Analysis methods and modeling techniques were conservatively established for each application through a systematic evaluation assisted by sensitivity studies. The qualification results and analysis methods were documented in detail in TPC topical reports. The topical reports for COBRA-IIIC, ARROTTA and FREY have been reviewed and approved by the Atomic Energy Council (AEC). TPC's in-house transient methodology has been successfully applied to provide valuable support for many operational issues and plant improvements for TPC's Maanshan Units 1 and 2. Major applications include the removal of the resistance temperature detector bypass system, the relaxation of the hot-full-power moderator temperature coefficient design criteria imposed by the ROCAEC due to a concern about Anticipated Transient Without Scram, the reduction of the boron injection tank concentration and the elimination of the heat tracing, and the reduction of reactor coolant system flow. (author)

  7. Integrated structure/control design - Present methodology and future opportunities

    Science.gov (United States)

    Weisshaar, T. A.; Newsom, J. R.; Zeiler, T. A.; Gilbert, M. G.

    1986-01-01

    Attention is given to current methodology applied to the integration of the optimal design process for structures and controls. Multilevel linear decomposition techniques proved to be most effective in organizing the computational efforts necessary for ISCD (integrated structures and control design) tasks. With the development of large orbiting space structures and actively controlled, high performance aircraft, there will be more situations in which this concept can be applied.

  8. CONDOS methodology for evaluation of radiation exposure from consumer products

    International Nuclear Information System (INIS)

    O'Donnell, F.R.

    1979-01-01

    The CONDOS methodology is a tool for estimating radiation doses to man from exposures to radionuclides incorporated in consumer products. It consists of two parts: (1) an outline, checklist, and selected data for modeling the life span of a product or the material from which it is made; and (2) a computer code that uses the life-span model to calculate radiation doses to exposed individuals and population groups
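
    As an editorial illustration of the kind of calculation such a life-span model feeds (a toy Python sketch with assumed pathway data, not the CONDOS code itself), the fragment below sums individual and collective doses over the exposure pathways in a product's life span.

        # Illustrative sketch only: sum doses over assumed life-span exposure pathways.
        # Pathway names, dose rates, exposure times and population sizes are hypothetical.

        pathways = [
            # (phase, dose rate to an individual [mSv/h], hours/year, persons exposed)
            ("distribution/transport", 2e-4, 40.0, 50),
            ("normal use in the home", 5e-5, 2000.0, 10000),
            ("disposal",               1e-4, 10.0, 20),
        ]

        individual = {phase: rate * hours for phase, rate, hours, _ in pathways}
        collective = sum(rate * hours * persons for _, rate, hours, persons in pathways)

        for phase, dose in individual.items():
            print(f"{phase}: {dose:.3f} mSv/yr per exposed individual")
        print(f"collective dose: {collective:.1f} person-mSv/yr")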

  9. Soft system methodology and decision making in community planning system

    OpenAIRE

    Křupka, Jiří; Kašparová, Miloslava; Jirava, Pavel; Mandys, Jan; Ferynová, Lenka; Duplinský, Josef

    2013-01-01

    A model of community planning was defined in this paper. The model was designed for the city of Pardubice and works with real questionnaire research data sets in its evaluation phase. Questionnaires were submitted to be filled in by users, providers and sponsors of social services. Checkland's soft system methodology was used when creating the model. Soft computing methods and decision trees were also used to create the model. The model was implemented in the data mining tool IBM SPSS Modeler 14.

  10. Methodologies of Uncertainty Propagation Calculation

    International Nuclear Information System (INIS)

    Chojnacki, Eric

    2002-01-01

    After recalling the theoretical principles and the practical difficulties of the methodologies of uncertainty propagation calculation, the author discussed how to propagate input uncertainties. He said there were two kinds of input uncertainty: variability (uncertainty due to heterogeneity) and lack of knowledge (uncertainty due to ignorance). It is therefore necessary to use two different propagation methods. He demonstrated this in a simple example, which he generalised, treating the variability uncertainty with probability theory and the lack-of-knowledge uncertainty with fuzzy theory. He cautioned, however, against the systematic use of probability theory, which may lead to unjustifiable and illegitimate precise answers. Mr Chojnacki's conclusions were that the importance of distinguishing variability from lack of knowledge increases as the problem becomes more complex in terms of the number of parameters or time steps, and that it is necessary to develop uncertainty propagation methodologies combining probability theory and fuzzy theory
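
    As an editorial illustration of the two propagation modes discussed above (a toy sketch, not the author's worked example), the fragment below propagates variability by Monte Carlo sampling and lack of knowledge by simple interval arithmetic (one alpha-cut of a fuzzy number) through the model y = a * b; all distributions and bounds are assumed.

        # Illustrative sketch only: two propagation modes through a toy model y = a * b.
        # Variability in a -> Monte Carlo sampling; lack of knowledge in b -> interval
        # (a single alpha-cut of a fuzzy number). All numbers are assumed.
        import random

        def model(a, b):
            return a * b

        # Variability: probabilistic sampling of a ~ N(2.0, 0.2)
        samples = [model(random.gauss(2.0, 0.2), 1.0) for _ in range(10000)]
        print("probabilistic mean ~", sum(samples) / len(samples))

        # Lack of knowledge: b only known to lie in [0.8, 1.2]
        b_low, b_high = 0.8, 1.2
        print("interval result:", (model(2.0, b_low), model(2.0, b_high)))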

  11. Multicriteria methodology for decision aiding

    CERN Document Server

    Roy, Bernard

    1996-01-01

    This is the first comprehensive book to present, in English, the multicriteria methodology for decision aiding. In the foreword the distinctive features and main ideas of the European School of MCDA are outlined. The twelve chapters are essentially expository in nature, but scholarly in treatment. Some questions, which are too often neglected in the literature on decision theory, such as how is a decision made, who are the actors, what is a decision aiding model, how to define the set of alternatives, are discussed. Examples are used throughout the book to illustrate the various concepts. Ways to model the consequences of each alternative and to build criteria taking into account the inevitable imprecisions, uncertainties and indeterminations are described and illustrated. The three classical operational approaches of MCDA: synthesis in one criterion (including MAUT), synthesis by outranking relations, and interactive local judgements, are studied. This methodology tries to be a theoretical or intellectual framework dire...

  12. AGR core safety assessment methodologies

    International Nuclear Information System (INIS)

    McLachlan, N.; Reed, J.; Metcalfe, M.P.

    1996-01-01

    To demonstrate the safety of its gas-cooled graphite-moderated AGR reactors, nuclear safety assessments of the cores are based upon a methodology which demonstrates no component failures, geometrical stability of the structure and material properties bounded by a database. All AGRs continue to meet these three criteria. However, predictions of future core behaviour indicate that the safety case methodology will eventually need to be modified to deal with new phenomena. A new approach to the safety assessment of the cores is currently under development, which can take account of these factors while at the same time providing the same level of protection for the cores. This approach will be based on the functionality of the core: unhindered movement of control rods, continued adequate cooling of the fuel and the core, continued ability to charge and discharge fuel. (author). 5 figs

  13. Methodological update in Medicina Intensiva.

    Science.gov (United States)

    García Garmendia, J L

    2018-04-01

    Research in the critically ill is complex due to the heterogeneity of patients, the difficulty of achieving representative sample sizes and the number of variables simultaneously involved. However, the quantity and quality of records is high, as is the relevance of the variables used, such as survival. Methodological tools have evolved, offering new perspectives and analysis models that allow relevant information to be extracted from the data that accompany the critically ill patient. The need for training in methodology and the interpretation of results is an important challenge for intensivists who wish to keep up to date with research developments and clinical advances in Intensive Medicine. Copyright © 2017 Elsevier España, S.L.U. y SEMNIM. All rights reserved.

  14. Methodology for building confidence measures

    Science.gov (United States)

    Bramson, Aaron L.

    2004-04-01

    This paper presents a generalized methodology for propagating known or estimated levels of individual source document truth reliability to determine the confidence level of a combined output. Initial document certainty levels are augmented by (i) combining the reliability measures of multiple sources, (ii) incorporating the truth reinforcement of related elements, and (iii) incorporating the importance of the individual elements for determining the probability of truth for the whole. The result is a measure of confidence in system output based on establishing links among the truth values of inputs. This methodology was developed for application to a multi-component situation awareness tool under development at the Air Force Research Laboratory in Rome, New York. Determining how improvements in data quality and the variety of documents collected affect the probability of a correct situational detection helps optimize the performance of the tool overall.
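
    As an editorial illustration of one way such propagation can be arranged (a sketch under assumed combination rules, not the algorithm of the AFRL tool), the fragment below combines the reliabilities of independent sources reporting the same element and then weights elements by their importance to produce an overall output confidence.

        # Illustrative sketch only: combine independent source reliabilities per element,
        # then weight elements by importance. Combination rules and numbers are assumed.

        def combine_sources(reliabilities):
            """P(at least one independent source reporting the element is correct)."""
            p_all_wrong = 1.0
            for r in reliabilities:
                p_all_wrong *= (1.0 - r)
            return 1.0 - p_all_wrong

        def output_confidence(elements):
            """elements: {name: (importance weight, [source reliabilities])}."""
            total = sum(weight for weight, _ in elements.values())
            return sum(weight * combine_sources(rels)
                       for weight, rels in elements.values()) / total

        elements = {"location": (0.5, [0.7, 0.6]),
                    "identity": (0.3, [0.8]),
                    "intent":   (0.2, [0.5, 0.4])}
        print(round(output_confidence(elements), 3))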

  15. Mo(ve)ment-methodology

    DEFF Research Database (Denmark)

    Mørck, Line Lerche; Christian Celosse-Andersen, Martin

    2018-01-01

    This paper describes the theoretical basis for and development of a moment-movement research methodology, based on an integration of critical psychological practice research and critical ethnographic social practice theory. Central theoretical conceptualizations, such as human agency, life conditions and identity formation, are discussed in relation to criminological theories of gang desistance. The paper illustrates how the mo(ve)ment methodology was applied in a study of comprehensive processes of identity (re)formation and gang exit processes. This study was conducted with Martin, a former... This is a moment which captures Martin’s complex and ambiguous feelings of conflictual concerns, frustration, anger, and a new feeling of insecurity in his masculinity, as well as engagement and a sense of deep meaningfulness as he becomes a more reflective academic. All these conflicting feelings also give...

  16. Design methodology of Dutch banknotes

    Science.gov (United States)

    de Heij, Hans A. M.

    2000-04-01

    Since the introduction of a design methodology for Dutch banknotes, the quality of Dutch paper currency has improved in more than one way. The methodology in question provides for (i) a design policy, which helps fix clear objectives; (ii) design management, to ensure smooth cooperation between the graphic designer, printer, papermaker and central bank; (iii) a programme of requirements, a banknote development guideline for all parties involved. This systematic approach enables an objective selection of design proposals, including security features. Furthermore, the project manager obtains regular feedback from the public by conducting market surveys. Each new design of a Netherlands Guilder banknote issued by the Nederlandsche Bank over the past 50 years has been an improvement on its predecessor in terms of value recognition, security and durability.

  17. Workshops as a Research Methodology

    DEFF Research Database (Denmark)

    Ørngreen, Rikke; Levinsen, Karin Tweddell

    2017-01-01

    This paper contributes to knowledge on workshops as a research methodology, and specifically on how such workshops pertain to e-learning. A literature review illustrated that workshops are discussed according to three different perspectives: workshops as a means, workshops as practice, and workshops as a research methodology. Focusing primarily on the latter, this paper presents five studies on upper secondary and higher education teachers’ professional development and on teaching and learning through video conferencing. Through analysis and discussion of these studies’ findings, we argue that workshops provide a platform that can aid researchers in identifying and exploring relevant factors in a given domain by providing means for understanding complex work and knowledge processes that are supported by technology (for example, e-learning). The approach supports identifying factors...

  18. Methodological challenges and lessons learned

    DEFF Research Database (Denmark)

    Nielsen, Poul Erik; Gustafsson, Jessica

    2017-01-01

    Taking as its point of departure three recently conducted empirical studies, the aim of this article is to theoretically and empirically discuss the methodological challenges of studying the interrelations between media and social reality, and to critically reflect on the methodologies used in the studies. By deconstructing the studies, the article draws attention to the fact that different methods are able to grasp different elements of social reality. Moreover, by analysing the power relations at play, the article demonstrates that the interplay between interviewer and interviewee, and how both parties fit into present power structures, greatly influences the narratives that are co-produced during interviews. The article thus concludes that in order to fully understand complex phenomena it is not enough to use a mixture of methods; the makeup of the research team is also imperative, as a diverse team...

  19. Microservices Validation: Methodology and Implementation

    OpenAIRE

    Savchenko, D.; Radchenko, G.

    2015-01-01

    With the wide spread of cloud computing, the question of how to architect, design and implement cloud applications has become pressing. The microservice model describes the design and development of loosely coupled cloud applications when computing resources are provided on the basis of automated IaaS and PaaS cloud platforms. Such applications consist of hundreds and thousands of service instances, so automated validation and testing of cloud applications developed on the basis of microservic...
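
    As a concrete and purely illustrative example of what an automated check against a running service instance can look like, the sketch below probes a health endpoint and verifies a minimal response contract. The endpoint URL and the expected fields are hypothetical and are not taken from the authors' framework.

```python
# Minimal sketch of an automated microservice check; not the authors' framework.
import json
import urllib.request

def validate_service(base_url):
    """Probe a (hypothetical) /health endpoint and verify the response contract."""
    with urllib.request.urlopen(f"{base_url}/health", timeout=5) as resp:
        if resp.status != 200:
            raise AssertionError("service did not report healthy status")
        payload = json.loads(resp.read())
    # Contract check: the fields below are assumed for illustration only.
    assert payload.get("status") == "ok"
    assert "version" in payload
    return payload

if __name__ == "__main__":
    print(validate_service("http://localhost:8080"))
```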

  20. Implementation impacts of PRL methodology

    International Nuclear Information System (INIS)

    Caudill, J.A.; Krupa, J.F.; Meadors, R.E.; Odum, J.V.; Rodrigues, G.C.

    1993-02-01

    This report responds to a DOE-SR request to evaluate the impacts from implementation of the proposed Plutonium Recovery Limit (PRL) methodology. The PRL Methodology is based on cost minimization for decisions to discard or recover plutonium contained in scrap, residues, and other plutonium bearing materials. Implementation of the PRL methodology may result in decisions to declare as waste certain plutonium bearing materials originally considered to be a recoverable plutonium product. Such decisions may have regulatory impacts, because any material declared to be waste would immediately be subject to provisions of the Resource Conservation and Recovery Act (RCRA). The decision to discard these materials will have impacts on waste storage, treatment, and disposal facilities. Current plans for the de-inventory of plutonium processing facilities have identified certain materials as candidates for discard based upon the economic considerations associated with extending the operating schedules for recovery of the contained plutonium versus potential waste disposal costs. This report evaluates the impacts of discarding those materials as proposed by the F Area De-Inventory Plan and compares the De-Inventory Plan assessments with conclusions from application of the PRL. The impact analysis was performed for those materials proposed as potential candidates for discard by the De-Inventory Plan. The De-Inventory Plan identified 433 items, containing approximately 1% of the current SRS Pu-239 inventory, as not appropriate for recovery as the site moves to complete the mission of F-Canyon and FB-Line. The materials were entered into storage awaiting recovery as product under the Department's previous Economic Discard Limit (EDL) methodology which valued plutonium at its incremental cost of production in reactors. An application of Departmental PRLs to the subject 433 items revealed that approximately 40% of them would continue to be potentially recoverable as product plutonium
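
    The core of the PRL decision is a cost comparison, which the toy sketch below makes explicit: recover material when the net cost of recovery is lower than the cost of discarding it as waste. The function, parameters and numbers are hypothetical placeholders and do not reproduce the actual PRL cost terms.

```python
# Highly simplified recover-vs-discard decision; all figures are hypothetical.
def discard_or_recover(recovery_cost, disposal_cost, recovered_value=0.0):
    """Return 'recover' when recovery is the cheaper path, else 'discard'."""
    net_recovery_cost = recovery_cost - recovered_value
    return "recover" if net_recovery_cost < disposal_cost else "discard"

print(discard_or_recover(recovery_cost=120.0, disposal_cost=90.0))                        # discard
print(discard_or_recover(recovery_cost=120.0, disposal_cost=90.0, recovered_value=50.0))  # recover
```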

  1. ISE System Development Methodology Manual

    Energy Technology Data Exchange (ETDEWEB)

    Hayhoe, G.F.

    1992-02-17

    The Information Systems Engineering (ISE) System Development Methodology Manual (SDM) is a framework of life cycle management guidelines that provide ISE personnel with direction, organization, consistency, and improved communication when developing and maintaining systems. These guidelines were designed to allow ISE to build and deliver Total Quality products, and to meet the goals and requirements of the US Department of Energy (DOE), Westinghouse Savannah River Company, and Westinghouse Electric Corporation.

  2. Environmental Testing Methodology in Biometrics

    OpenAIRE

    Fernández Saavedra, Belén; Sánchez Reíllo, Raúl; Alonso Moreno, Raúl; Miguel Hurtado, Óscar

    2010-01-01

    8-page document + 5-slide presentation.-- Contributed to: 1st International Biometric Performance Conference (IBPC 2010, NIST, Gaithersburg, MD, US, Mar 1-5, 2010). Recently, biometrics has been used in many security systems, and these systems can be located in different environments. As many experts claim and previous works have demonstrated, environmental conditions influence biometric performance. Nevertheless, there is currently no specific methodology for testing this influence...

  3. Soft systems methodology: other voices

    OpenAIRE

    Holwell, Sue

    2000-01-01

    This issue of Systemic Practice and Action Research, celebrating the work of Peter Checkland on the particular nature and development of soft systems methodology (SSM), would not have happened unless that work was seen by others as being important. No significant contribution to thinking happens without a secondary literature developing. Not surprisingly, many commentaries have accompanied the ongoing development of SSM. Some of these are insightful, some full of errors, and some include both...

  4. Systems engineering agile design methodologies

    CERN Document Server

    Crowder, James A

    2013-01-01

    This book examines the paradigm of the engineering design process. The authors discuss agile systems and engineering design. The book captures the entire design process (function bases), context, and requirements to effect real reuse. It provides a methodology for an engineering design process foundation for modern and future systems design. This book captures design patterns with context for actual Systems Engineering Design Reuse and contains a new paradigm in Design Knowledge Management.

  5. Methodological remarks on contraction theory

    DEFF Research Database (Denmark)

    Jouffroy, Jerome; Slotine, Jean-Jacques E.

    Because contraction analysis stems from a differential and incremental framework, the nature and methodology of contraction-based proofs are significantly different from those of their Lyapunov-based counterparts. This paper specifically studies this issue, and illustrates it by revisiting some classical examples traditionally addressed using Lyapunov theory. Even in these cases, contraction tools can often yield significantly simplified analysis. The examples include adaptive control, robotics, and a proof of convergence of the deterministic Extended Kalman Filter.
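
    For readers unfamiliar with the framework, the standard (identity-metric) form of the contraction condition is sketched below; this is the textbook statement of the result, not a formula quoted from the paper.

```latex
% For \dot{x} = f(x,t): if the symmetric part of the Jacobian is uniformly
% negative definite, i.e. for some \beta > 0
\[
\tfrac{1}{2}\!\left(\frac{\partial f}{\partial x} + \frac{\partial f}{\partial x}^{\!\top}\right) \preceq -\beta I ,
\]
% then any two trajectories converge to each other exponentially,
\[
\|\delta x(t)\| \le \|\delta x(0)\|\, e^{-\beta t},
\]
% and the same conclusion holds more generally in a suitable metric M(x,t) \succ 0.
```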

  6. Developing Foucault's Discourse Analytic Methodology

    Directory of Open Access Journals (Sweden)

    Rainer Diaz-Bone

    2006-01-01

    Full Text Available A methodological position for a FOUCAULTian discourse analysis is presented. A sequence of analytical steps is introduced and an illustrating example is offered. It is emphasized that discourse analysis has to discover the system-level of discursive rules and the deeper structure of the discursive formation; otherwise the analysis will remain unfinished. Michel FOUCAULT's work is theoretically grounded in French structuralism and (so-called) post-structuralism. In this paper, post-structuralism is not conceived as a means for overcoming structuralism, but as a way of critically continuing the structural perspective. In this way, discursive structures can be related to discursive practices and the concept of structure can be disclosed (e.g. to inter-discourse or DERRIDA's concept of structurality). In this way, the structural methodology is continued and radicalized, but not given up. In this paper, FOUCAULT's theory is combined with the works of Michel PÊCHEUX and (especially for the sociology of knowledge and the sociology of culture) Pierre BOURDIEU. The practice of discourse analysis is theoretically grounded. This practice can be conceived as a reflexive coupling of deconstruction and reconstruction in the material to be analyzed. This methodology can therefore be characterized as a reconstructive qualitative methodology. At the end of the article, forms of discourse analysis are criticized that do not intend to recover the system level of discursive rules and that do not intend to discover the deeper structure of the discursive formation (i.e. episteme, socio-episteme). These forms are merely commentaries of discourses (not their analyses); they remain phenomenological and are therefore pre-structuralist. URN: urn:nbn:de:0114-fqs060168

  7. Energy Efficiency Indicators Methodology Booklet

    Energy Technology Data Exchange (ETDEWEB)

    Sathaye, Jayant; Price, Lynn; McNeil, Michael; de la rue du Can, Stephane

    2010-05-01

    This Methodology Booklet provides a comprehensive review and guiding principles for constructing energy efficiency indicators, with illustrative examples of application to individual countries. It reviews work done by international agencies and national governments in constructing meaningful energy efficiency indicators that help policy makers to assess changes in energy efficiency over time. Building on past OECD experience and best practices, and the knowledge of these countries' institutions, relevant sources of information to construct an energy indicator database are identified. A framework based on levels of hierarchy of indicators -- spanning from aggregate, macro-level to disaggregated, end-use-level metrics -- is presented to help shape the understanding of assessing energy efficiency. For each sector of activity -- industry, commercial, residential, agriculture and transport -- indicators are presented and recommendations to distinguish the different factors affecting energy use are highlighted. The booklet specifically addresses issues that are relevant to developing indicators where activity is a major factor driving energy demand. A companion spreadsheet tool is available upon request.
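
    The most basic indicator in such a hierarchy relates energy use to a measure of activity. The toy sketch below computes a sectoral energy-intensity indicator; the sectors, units and figures are invented for illustration and are not taken from the booklet.

```python
# Toy sectoral energy-intensity indicator (energy use per unit of activity).
# All numbers are made up for illustration.
sectors = {
    # sector: (energy use in PJ, activity level, activity unit)
    "industry":    (1200.0, 350.0, "billion USD value added"),
    "residential": (800.0,   25.0, "million dwellings"),
    "transport":   (950.0,  480.0, "billion passenger-km"),
}

for name, (energy, activity, unit) in sectors.items():
    intensity = energy / activity
    print(f"{name:12s} {intensity:8.2f} PJ per {unit}")
```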

  8. Methodology for technical risk assessment

    International Nuclear Information System (INIS)

    Waganer, L.M.; Zuckerman, D.S.

    1983-01-01

    A methodology has been developed for and applied to the assessment of the technical risks associated with an evolving technology. This methodology, originally developed for fusion by K. W. Billman and F. R. Scott at EPRI, has been applied to assess the technical risk of a fuel system for a fusion reactor. Technical risk is defined as the risk that a particular technology or component which is currently under development will not achieve a set of required technical specifications (i.e. probability of failure). The individual steps in the technical risk assessment are summarized. The first step in this methodology is to clearly and completely quantify the technical requirements for the particular system being examined. The next step is to identify and define subsystems and various options which appear capable of achieving the required technical performance. The subsystem options are then characterized regarding subsystem functions, interface requirements with the subsystems and systems, important components, developmental obstacles and technical limitations. Key technical subsystem performance parameters are identified which directly or indirectly relate to the system technical specifications. Past, existing and future technical performance data from subsystem experts are obtained by using a Bayesian Interrogation technique. The input data is solicited in the form of probability functions. Thus the output performance of the system is expressed as probability functions
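
    A minimal way to turn elicited probability functions into a technical risk figure is a Monte Carlo roll-up: sample subsystem performance from the elicited distributions, combine, and count how often the system requirement is missed. The sketch below illustrates this idea only; the subsystems, distributions, combination rule and threshold are hypothetical and are not taken from the assessment described above.

```python
# Monte Carlo sketch of technical risk = P(system misses its required spec).
# Subsystem models, combination rule and threshold are hypothetical.
import random

def sample_system_performance():
    pumping    = random.triangular(0.80, 1.00, 0.95)  # elicited efficiency
    separation = random.triangular(0.70, 0.98, 0.90)  # elicited efficiency
    return pumping * separation                       # assumed combination rule

def technical_risk(requirement=0.75, n=100_000):
    misses = sum(sample_system_performance() < requirement for _ in range(n))
    return misses / n

print(f"P(failure to meet spec) ~= {technical_risk():.3f}")
```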

  9. Methodology for astronaut reconditioning research.

    Science.gov (United States)

    Beard, David J; Cook, Jonathan A

    2017-01-01

    Space medicine offers some unique challenges, especially in terms of research methodology. A specific challenge for astronaut reconditioning involves identifying which aspects of terrestrial research methodology hold and which require modification. This paper reviews this area and presents appropriate solutions where possible. It is concluded that spaceflight rehabilitation research should remain question/problem driven and is broadly similar to terrestrial research on small populations, such as rare diseases and various sports. Astronauts and Medical Operations personnel should be involved at all levels to ensure the feasibility of research protocols. There is room for creative and hybrid methodology, but careful systematic observation is likely to be more achievable and fruitful than complex trial-based comparisons. Multi-space agency collaboration will be critical to pool data from small groups of astronauts, with the accepted use of standardised outcome measures across all agencies. Systematic reviews will be an essential component. Most limitations relate to the inherently small sample size available for human spaceflight research. Early adoption of a co-operative model for spaceflight rehabilitation research is therefore advised. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Methodology for ranking restoration options

    International Nuclear Information System (INIS)

    Hedemann Jensen, Per

    1999-04-01

    The work described in this report has been performed as a part of the RESTRAT Project FI4P-CT95-0021a (PL 950128) co-funded by the Nuclear Fission Safety Programme of the European Commission. The RESTRAT project has the overall objective of developing generic methodologies for ranking restoration techniques as a function of contamination and site characteristics. The project includes analyses of existing remediation methodologies and contaminated sites, and is structured in the following steps: characterisation of relevant contaminated sites; identification and characterisation of relevant restoration techniques; assessment of the radiological impact; development and application of a selection methodology for restoration options; formulation of generic conclusions and development of a manual. The project is intended to apply to situations in which sites with nuclear installations have been contaminated with radioactive materials as a result of the operation of these installations. The areas considered for remedial measures include contaminated land areas, rivers and sediments in rivers, lakes, and sea areas. Five contaminated European sites have been studied. Various remedial measures have been envisaged with respect to the optimisation of the protection of the populations being exposed to the radionuclides at the sites. Cost-benefit analysis and multi-attribute utility analysis have been applied for optimisation. Health, economic and social attributes have been included and weighting factors for the different attributes have been determined by the use of scaling constants. (au)
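
    The ranking step in such a multi-attribute utility analysis can be reduced to a weighted sum of attribute utilities for each restoration option. The sketch below shows that step only; the options, attribute utilities and weights are invented for illustration and are not the RESTRAT values.

```python
# Multi-attribute utility ranking sketch; weights and scores are invented.
weights = {"health": 0.5, "economic": 0.3, "social": 0.2}

options = {
    "topsoil removal":     {"health": 0.9, "economic": 0.3, "social": 0.6},
    "deep ploughing":      {"health": 0.6, "economic": 0.8, "social": 0.7},
    "natural attenuation": {"health": 0.3, "economic": 1.0, "social": 0.5},
}

def utility(scores):
    return sum(weights[a] * scores[a] for a in weights)

for option in sorted(options, key=lambda o: utility(options[o]), reverse=True):
    print(f"{option:20s} {utility(options[option]):.2f}")
```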

  11. GPGPU COMPUTING

    Directory of Open Access Journals (Sweden)

    BOGDAN OANCEA

    2012-05-01

    Full Text Available Since the first idea of using GPUs for general-purpose computing, things have evolved over the years and now there are several approaches to GPU programming. GPU computing practically began with the introduction of CUDA (Compute Unified Device Architecture) by NVIDIA and Stream by AMD. These are APIs designed by the GPU vendors to be used together with the hardware that they provide. A new emerging standard, OpenCL (Open Computing Language), tries to unify different GPU general computing API implementations and provides a framework for writing programs executed across heterogeneous platforms consisting of both CPUs and GPUs. OpenCL provides parallel computing using task-based and data-based parallelism. In this paper we will focus on the CUDA parallel computing architecture and programming model introduced by NVIDIA. We will present the benefits of the CUDA programming model. We will also compare the two main approaches, CUDA and AMD APP (STREAM), and the new framework, OpenCL, which tries to unify the GPGPU computing models.

  12. Quantum Computing

    Indian Academy of Sciences (India)

    Quantum Computing – Building Blocks of a Quantum Computer. C S Vijay and Vishal Gupta. General Article, Resonance – Journal of Science Education, Volume 5, Issue 9, September 2000, pp 69-81.

  13. Platform computing

    CERN Multimedia

    2002-01-01

    "Platform Computing releases first grid-enabled workload management solution for IBM eServer Intel and UNIX high performance computing clusters. This Out-of-the-box solution maximizes the performance and capability of applications on IBM HPC clusters" (1/2 page) .

  14. Quantum Computing

    Indian Academy of Sciences (India)

    In the first part of this article, we had looked at how quantum physics can be harnessed to make the building blocks of a quantum computer. In this concluding part, we look at algorithms which can exploit the power of this computational device, and some practical difficulties in building such a device. Quantum Algorithms.

  15. Quantum computing

    OpenAIRE

    Burba, M.; Lapitskaya, T.

    2017-01-01

    This article gives an elementary introduction to quantum computing. It is a draft for a book chapter of the "Handbook of Nature-Inspired and Innovative Computing", Eds. A. Zomaya, G.J. Milburn, J. Dongarra, D. Bader, R. Brent, M. Eshaghian-Wilner, F. Seredynski (Springer, Berlin Heidelberg New York, 2006).

  16. Computational Pathology

    Science.gov (United States)

    Louis, David N.; Feldman, Michael; Carter, Alexis B.; Dighe, Anand S.; Pfeifer, John D.; Bry, Lynn; Almeida, Jonas S.; Saltz, Joel; Braun, Jonathan; Tomaszewski, John E.; Gilbertson, John R.; Sinard, John H.; Gerber, Georg K.; Galli, Stephen J.; Golden, Jeffrey A.; Becich, Michael J.

    2016-01-01

    Context We define the scope and needs within the new discipline of computational pathology, a discipline critical to the future of both the practice of pathology and, more broadly, medical practice in general. Objective To define the scope and needs of computational pathology. Data Sources A meeting was convened in Boston, Massachusetts, in July 2014 prior to the annual Association of Pathology Chairs meeting, and it was attended by a variety of pathologists, including individuals highly invested in pathology informatics as well as chairs of pathology departments. Conclusions The meeting made recommendations to promote computational pathology, including clearly defining the field and articulating its value propositions; asserting that the value propositions for health care systems must include means to incorporate robust computational approaches to implement data-driven methods that aid in guiding individual and population health care; leveraging computational pathology as a center for data interpretation in modern health care systems; stating that realizing the value proposition will require working with institutional administrations, other departments, and pathology colleagues; declaring that a robust pipeline should be fostered that trains and develops future computational pathologists, for those with both pathology and non-pathology backgrounds; and deciding that computational pathology should serve as a hub for data-related research in health care systems. The dissemination of these recommendations to pathology and bioinformatics departments should help facilitate the development of computational pathology. PMID:26098131

  17. Methodological Approaches to Experimental Teaching of Mathematics to University Students

    Directory of Open Access Journals (Sweden)

    Nikolay I.

    2018-03-01

    Full Text Available Introduction: the article imparts the authors’ thoughts on a new teaching methodology for mathematical education in universities. The aim of the study is to substantiate the efficiency of the comprehensive use of mathematical electronic courses, computer tests, original textbooks and methodologies when teaching mathematics to future agrarian engineers. The authors consider this implementation a unified educational process. Materials and Methods: the synthesis of international and domestic pedagogical experience of teaching students in university was used, together with the following methods of empirical research: pedagogical experiment, pedagogical measurements and experimental teaching of mathematics. The authors applied the methodology of revealing interdisciplinary links on the continuum of mathematical problems using key examples and exercises. Results: the online course “Mathematics” was designed and developed on the platform of the Learning Management System Moodle. The article presents the results of test assignments assessing students’ intellectual abilities and an analysis of students’ solutions of various types of mathematical problems. The pedagogical experiment substantiated the integrated selection of textbooks, online course and online tests using the methodology of determination of the key examples and exercises. Discussion and Conclusions: the analysis of the experimental work suggested that the new methodology is able to have a positive effect on the learning process. The learning programme determined the problem points for each student. The findings of this study have a number of important implications for future educational practice.

  18. Cloud Computing

    DEFF Research Database (Denmark)

    Krogh, Simon

    2013-01-01

    with technological changes, the paradigmatic pendulum has swung between increased centralization on one side and a focus on distributed computing that pushes IT power out to end users on the other. With the introduction of outsourcing and cloud computing, centralization in large data centers is again dominating the IT scene. In line with the views presented by Nicolas Carr in 2003 (Carr, 2003), it is a popular assumption that cloud computing will be the next utility (like water, electricity and gas) (Buyya, Yeo, Venugopal, Broberg, & Brandic, 2009). However, this assumption disregards the fact that most IT production ...), for instance, in establishing and maintaining trust between the involved parties (Sabherwal, 1999). So far, research in cloud computing has neglected this perspective and focused entirely on aspects relating to technology, economy, security and legal questions. While the core technologies of cloud computing (e...

  19. Computability theory

    CERN Document Server

    Weber, Rebecca

    2012-01-01

    What can we compute--even with unlimited resources? Is everything within reach? Or are computations necessarily drastically limited, not just in practice, but theoretically? These questions are at the heart of computability theory. The goal of this book is to give the reader a firm grounding in the fundamentals of computability theory and an overview of currently active areas of research, such as reverse mathematics and algorithmic randomness. Turing machines and partial recursive functions are explored in detail, and vital tools and concepts including coding, uniformity, and diagonalization are described explicitly. From there the material continues with universal machines, the halting problem, parametrization and the recursion theorem, and thence to computability for sets, enumerability, and Turing reduction and degrees. A few more advanced topics round out the book before the chapter on areas of research. The text is designed to be self-contained, with an entire chapter of preliminary material including re...

  20. Prioritization methodology for chemical replacement

    Science.gov (United States)

    Cruit, Wendy; Goldberg, Ben; Schutzenhofer, Scott

    1995-01-01

    Since United States federal legislation required ozone-depleting chemicals (Class 1 and 2) to be banned from production, the National Aeronautics and Space Administration (NASA) and industry have had to find other chemicals and methods to replace these target chemicals. This project was initiated to develop a prioritization methodology suitable for assessing and ranking existing processes for replacement 'urgency.' The methodology was produced in the form of a workbook (NASA Technical Paper 3421). The final workbook contains two tools, one for evaluation and one for prioritization. The two tools are interconnected in that they were developed from one central theme - chemical replacement due to imposed laws and regulations. This workbook provides matrices, detailed explanations of how to use them, and a detailed methodology for prioritization of replacement technology. The main objective is to provide a GUIDELINE to help direct the research for replacement technology. The approach for prioritization called for a system which would result in a numerical rating for the chemicals and processes being assessed. A Quality Function Deployment (QFD) technique was used in order to determine numerical values which would correspond to the concerns raised and their respective importance to the process. This workbook defines the approach and the application of the QFD matrix. This technique (1) provides a standard database for technology that can be easily reviewed, and (2) provides a standard format for information when requesting resources for further research on chemical replacement technology. Originally, this workbook was to be used for Class 1 and Class 2 chemicals, but it was specifically designed to be flexible enough to be used for any chemical used in a process (if the chemical and/or process needs to be replaced). The methodology consists of comparison matrices (and the smaller comparison components) which allow replacement technology
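
    The essence of the QFD-style scoring is a weighted sum of concern ratings that yields the numerical replacement-urgency figure. The sketch below shows that arithmetic only; the concerns, weights, processes and ratings are placeholders chosen for illustration and are not taken from the workbook.

```python
# Schematic QFD-style urgency scoring; all weights and ratings are placeholders.
concern_weights = {"regulatory deadline": 5, "usage volume": 3,
                   "worker exposure": 4, "replacement availability": 2}

processes = {
    "precision cleaning (CFC-113)": {"regulatory deadline": 5, "usage volume": 4,
                                     "worker exposure": 3, "replacement availability": 2},
    "foam blowing (HCFC-141b)":     {"regulatory deadline": 3, "usage volume": 5,
                                     "worker exposure": 2, "replacement availability": 4},
}

def urgency(ratings):
    return sum(concern_weights[c] * ratings[c] for c in concern_weights)

for name, ratings in sorted(processes.items(), key=lambda kv: urgency(kv[1]), reverse=True):
    print(f"{name:32s} urgency = {urgency(ratings)}")
```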

  1. Computational Streetscapes

    Directory of Open Access Journals (Sweden)

    Paul M. Torrens

    2016-09-01

    Full Text Available Streetscapes have presented a long-standing interest in many fields. Recently, there has been a resurgence of attention on streetscape issues, catalyzed in large part by computing. Because of computing, there is more understanding, vistas, data, and analysis of and on streetscape phenomena than ever before. This diversity of lenses trained on streetscapes permits us to address long-standing questions, such as how people use information while mobile, how interactions with people and things occur on streets, how we might safeguard crowds, how we can design services to assist pedestrians, and how we could better support special populations as they traverse cities. Amid each of these avenues of inquiry, computing is facilitating new ways of posing these questions, particularly by expanding the scope of what-if exploration that is possible. With assistance from computing, consideration of streetscapes now reaches across scales, from the neurological interactions that form among place cells in the brain up to informatics that afford real-time views of activity over whole urban spaces. For some streetscape phenomena, computing allows us to build realistic but synthetic facsimiles in computation, which can function as artificial laboratories for testing ideas. In this paper, I review the domain science for studying streetscapes from vantages in physics, urban studies, animation and the visual arts, psychology, biology, and behavioral geography. I also review the computational developments shaping streetscape science, with particular emphasis on modeling and simulation as informed by data acquisition and generation, data models, path-planning heuristics, artificial intelligence for navigation and way-finding, timing, synthetic vision, steering routines, kinematics, and geometrical treatment of collision detection and avoidance. I also discuss the implications that the advances in computing streetscapes might have on emerging developments in cyber

  2. COMPUTATIONAL THINKING

    Directory of Open Access Journals (Sweden)

    Evgeniy K. Khenner

    2016-01-01

    Full Text Available Abstract. The aim of the research is to draw the attention of the educational community to the phenomenon of computational thinking, which has been actively discussed over the last decade in the foreign scientific and educational literature, and to substantiate its importance, practical utility and right to recognition in Russian education. Methods. The research is based on the analysis of foreign studies of the phenomenon of computational thinking and the ways of its formation in the process of education, and on comparing the notion of «computational thinking» with related concepts used in the Russian scientific and pedagogical literature. Results. The concept «computational thinking» is analyzed from the point of view of intuitive understanding and of scientific and applied aspects. It is shown how computational thinking has evolved with the development of computer hardware and software. The practice-oriented interpretation of computational thinking which is dominant among educators is described, along with some ways of its formation. It is shown that computational thinking is a metasubject result of general education as well as its tool. From the point of view of the author, the purposeful development of computational thinking should be one of the tasks of Russian education. Scientific novelty. The author gives a theoretical justification of the role of computational thinking schemes as metasubject results of learning. The dynamics of the development of this concept is described. This process is connected with the evolution of computer and information technologies as well as the increase in the number of tasks for whose effective solution computational thinking is required. The author substantiates the claim that including «computational thinking» in the set of pedagogical concepts used in the national education system fills an existing gap. Practical significance. New metasubject result of education associated with

  3. Methodology for uranium resource estimates and reliability

    International Nuclear Information System (INIS)

    Blanchfield, D.M.

    1980-01-01

    The NURE uranium assessment method has evolved from a small group of geologists estimating resources on a few lease blocks to a national survey involving an interdisciplinary system consisting of the following: (1) geology and geologic analogs; (2) engineering and cost modeling; (3) mathematics and probability theory, psychology and the elicitation of subjective judgments; and (4) computerized calculations, computer graphics, and data base management. The evolution has been spurred primarily by two objectives: (1) quantification of uncertainty, and (2) elimination of simplifying assumptions. This has resulted in a tremendous data-gathering effort and the involvement of hundreds of technical experts, many in uranium geology, but many from other fields as well. The rationality of the methods is still largely based on the concept of an analog and the observation that the results are reasonable. The reliability, or repeatability, of the assessments is reasonably guaranteed by the series of peer and superior technical reviews which has been formalized under the current methodology. The optimism or pessimism of individual geologists who make the initial assessments is tempered by the review process, resulting in a series of assessments which are a consistent, unbiased reflection of the facts. Despite the many improvements over past methods, several objectives for future development remain, primarily to reduce subjectivity in utilizing factual information in the estimation of endowment, and to improve the recognition of cost uncertainties in the assessment of economic potential. The 1980 NURE assessment methodology will undoubtedly be improved, but the reader is reminded that resource estimates are and always will be a forecast for the future

  4. Padronização de um método para mensuração das tábuas ósseas vestibular e lingual dos maxilares na Tomografia Computadorizada de Feixe Cônico (Cone Beam Methodology standardization for measuring buccal and lingual alveolar bone plates using Cone Beam Computed Tomography

    Directory of Open Access Journals (Sweden)

    Marcos Cezar Ferreira

    2010-02-01

    Full Text Available INTRODUCTION: The thickness of the buccal and lingual bone plates covering the teeth constitutes one of the limiting factors of orthodontic tooth movement. Advances in imaging technology have permitted a detailed evaluation of these anatomical regions by means of cone beam computed tomography (CBCT). PURPOSE: To describe and standardize, in detail, a method for measuring the buccal and lingual bone plate thickness of the jaws in CBCT images. METHODOLOGY: Digital standardization of the position of the facial image should constitute the first step before the selection of CBCT slices. Two axial sections of each jaw were used for measuring the buccal and lingual alveolar bone thickness, with the cementoenamel junction of the first permanent molars used as the reference in both the upper and lower arches. RESULTS: Axial sections parallel to the palatal plane were indicated for the quantitative evaluation of the alveolar bone in the maxilla; in the lower arch, the axial sections should be parallel to the functional occlusal plane. CONCLUSION: The described method is reproducible for use in research as well as in the clinical evaluation of the periodontal repercussions of tooth movement, since it allows the comparison of pre- and post-treatment images.

  5. Improved Methodology of MSLB M/E Release Analysis for OPR1000

    International Nuclear Information System (INIS)

    Park, Seok Jeong; Kim, Cheol Woo; Seo, Jong Tae

    2006-01-01

    A new mass and energy (M/E) release analysis methodology for equipment environmental qualification (EEQ) under loss-of-coolant accident (LOCA) conditions has recently been developed and adopted for small-break LOCA EEQ. This new M/E release analysis methodology has been extended to the M/E release analysis for containment design for the large-break LOCA and the main steam line break (MSLB) accident, and named the KIMERA (KOPEC Improved Mass and Energy Release Analysis) methodology. The computer code system used in this methodology is RELAP5K/CONTEMPT4 (or RELAP5-ME), which couples RELAP5/MOD3.1/K, with an enhanced M/E model and a LOCA long-term model, and CONTEMPT4/MOD5. The KIMERA methodology is applied here to the MSLB M/E release analysis to evaluate its validity for MSLB in containment design. The results are compared with the OPR 1000 FSAR

  6. Computer interfacing

    CERN Document Server

    Dixey, Graham

    1994-01-01

    This book explains how computers interact with the world around them and therefore how to make them a useful tool. Topics covered include descriptions of all the components that make up a computer, principles of data exchange, interaction with peripherals, serial communication, input devices, recording methods, computer-controlled motors, and printers.In an informative and straightforward manner, Graham Dixey describes how to turn what might seem an incomprehensible 'black box' PC into a powerful and enjoyable tool that can help you in all areas of your work and leisure. With plenty of handy

  7. Computational physics

    CERN Document Server

    Newman, Mark

    2013-01-01

    A complete introduction to the field of computational physics, with examples and exercises in the Python programming language. Computers play a central role in virtually every major physics discovery today, from astrophysics and particle physics to biophysics and condensed matter. This book explains the fundamentals of computational physics and describes in simple terms the techniques that every physicist should know, such as finite difference methods, numerical quadrature, and the fast Fourier transform. The book offers a complete introduction to the topic at the undergraduate level, and is also suitable for the advanced student or researcher who wants to learn the foundational elements of this important field.

  8. Computational physics

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1987-01-15

    Computers have for many years played a vital role in the acquisition and treatment of experimental data, but they have more recently taken up a much more extended role in physics research. The numerical and algebraic calculations now performed on modern computers make it possible to explore consequences of basic theories in a way which goes beyond the limits of both analytic insight and experimental investigation. This was brought out clearly at the Conference on Perspectives in Computational Physics, held at the International Centre for Theoretical Physics, Trieste, Italy, from 29-31 October.

  9. Cloud Computing

    CERN Document Server

    Baun, Christian; Nimis, Jens; Tai, Stefan

    2011-01-01

    Cloud computing is a buzz-word in today's information technology (IT) that nobody can escape. But what is really behind it? There are many interpretations of this term, but no standardized or even uniform definition. Instead, as a result of the multi-faceted viewpoints and the diverse interests expressed by the various stakeholders, cloud computing is perceived as a rather fuzzy concept. With this book, the authors deliver an overview of cloud computing architecture, services, and applications. Their aim is to bring readers up to date on this technology and thus to provide a common basis for d

  10. Computational Viscoelasticity

    CERN Document Server

    Marques, Severino P C

    2012-01-01

    This text is a guide to solving problems in which viscoelasticity is present using existing commercial computational codes. The book gives information on the codes’ structure and use, data preparation, and output interpretation and verification. The first part of the book introduces the reader to the subject and provides the models, equations and notation to be used in the computational applications. The second part presents the most important computational techniques: finite element formulation, boundary element formulation, and the solution of viscoelastic problems with Abaqus.

  11. Optical computing.

    Science.gov (United States)

    Stroke, G. W.

    1972-01-01

    Applications of the optical computer include an approach for increasing the sharpness of images obtained from the most powerful electron microscopes and fingerprint/credit card identification. The information-handling capability of the various optical computing processes is very great. Modern synthetic-aperture radars scan upward of 100,000 resolvable elements per second. Fields which have assumed major importance on the basis of optical computing principles are optical image deblurring, coherent side-looking synthetic-aperture radar, and correlative pattern recognition. Some examples of the most dramatic image deblurring results are shown.

  12. Computational physics

    International Nuclear Information System (INIS)

    Anon.

    1987-01-01

    Computers have for many years played a vital role in the acquisition and treatment of experimental data, but they have more recently taken up a much more extended role in physics research. The numerical and algebraic calculations now performed on modern computers make it possible to explore consequences of basic theories in a way which goes beyond the limits of both analytic insight and experimental investigation. This was brought out clearly at the Conference on Perspectives in Computational Physics, held at the International Centre for Theoretical Physics, Trieste, Italy, from 29-31 October

  13. Phenomenological Computation?

    DEFF Research Database (Denmark)

    Brier, Søren

    2014-01-01

    Open peer commentary on the article “Info-computational Constructivism and Cognition” by Gordana Dodig-Crnkovic. Upshot: The main problems with info-computationalism are: (1) Its basic concept of natural computing has neither been defined theoretically nor implemented practically. (2) It cannot encompass human concepts of subjective experience and intersubjective meaningful communication, which prevents it from being genuinely transdisciplinary. (3) Philosophically, it does not sufficiently accept the deep ontological differences between various paradigms such as von Foerster’s second-order...

  14. A combined reaction class approach with integrated molecular orbital+molecular orbital (IMOMO) methodology: A practical tool for kinetic modeling

    International Nuclear Information System (INIS)

    Truong, Thanh N.; Maity, Dilip K.; Truong, Thanh-Thai T.

    2000-01-01

    We present a new practical computational methodology for predicting thermal rate constants of reactions involving large molecules or a large number of elementary reactions in the same class. This methodology combines the integrated molecular orbital+molecular orbital (IMOMO) approach with our recently proposed reaction class models for tunneling. With the new methodology, we show that it is possible to significantly reduce the computational cost by several orders of magnitude while compromising the accuracy in the predicted rate constants by less than 40% over a wide range of temperatures. Another important result is that the computational cost increases only slightly as the system size increases. (c) 2000 American Institute of Physics

  15. Perceptual Computing Aiding People in Making Subjective Judgments

    CERN Document Server

    Mendel, Jerry

    2010-01-01

    Explains for the first time how "computing with words" can aid in making subjective judgments. Lotfi Zadeh, the father of fuzzy logic, coined the phrase "computing with words" (CWW) to describe a methodology in which the objects of computation are words and propositions drawn from a natural language. Perceptual Computing explains how to implement CWW to aid in the important area of making subjective judgments, using a methodology that leads to an interactive device—a "Perceptual Computer"—that propagates random and linguistic uncertainties into the subjective judgment...
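
    The encode-compute-decode flow behind a perceptual computer can be caricatured with plain numeric intervals, as in the sketch below. Mendel's actual approach uses interval type-2 fuzzy sets to model the words; replacing them by crisp intervals, and the vocabulary, weights and scale used here, are simplifying assumptions for illustration only.

```python
# Crude "computing with words" sketch: words -> intervals -> aggregate -> word.
# Mendel's Perceptual Computer uses interval type-2 fuzzy sets; the plain
# intervals, vocabulary and weights below are simplifying assumptions.
vocabulary = {            # word -> (lower, upper) on a 0-10 scale
    "poor":      (0.0, 3.0),
    "fair":      (3.0, 6.0),
    "good":      (6.0, 8.5),
    "excellent": (8.5, 10.0),
}

def aggregate(judgments, weights):
    """Weighted average of word intervals (the 'CWW engine' step)."""
    total = sum(weights)
    lo = sum(vocabulary[w][0] * wt for w, wt in zip(judgments, weights)) / total
    hi = sum(vocabulary[w][1] * wt for w, wt in zip(judgments, weights)) / total
    return lo, hi

def decode(interval):
    """Map the aggregated interval back to the closest vocabulary word."""
    centre = sum(interval) / 2
    return min(vocabulary, key=lambda w: abs(sum(vocabulary[w]) / 2 - centre))

agg = aggregate(["good", "fair", "excellent"], weights=[2, 1, 1])
print(agg, "->", decode(agg))
```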

  16. Radioisotope methodology course radioprotection aspects

    International Nuclear Information System (INIS)

    Bergoc, R.M.; Caro, R.A.; Menossi, C.A.

    1996-01-01

    The advancement of knowledge in molecular and cell biology, biochemistry, medicine and pharmacology that has taken place during the last 50 years, since the end of World War II, is really outstanding. It can safely be said that this is principally due to the application of radioisotope techniques. Research on metabolism, the biodistribution of pharmaceuticals, pharmacodynamics, etc., is mostly carried out by means of techniques employing radioactive materials. Radioisotopes and radiation are frequently used in medicine both as diagnostic and therapeutic tools. Radioimmunoanalysis is today a routine method in endocrinology and in general clinical medicine. Receptor determination and characterization is a steadily growing methodology used in clinical biochemistry, pharmacology and medicine. The use of radiopharmaceuticals and radiation of different origins for therapeutic purposes should not be overlooked. For these reasons, the importance of teaching radioisotope methodology is steadily growing. This is principally the case for specialization at the postgraduate level, but in the pregraduate curriculum it is also worthwhile to give some elementary theoretical and practical notions of this subject. These observations are justified by more than 30 years of teaching experience at both levels at the School of Pharmacy and Biochemistry of the University of Buenos Aires, Argentina. In 1960 we began to teach Physics III, an obligatory pregraduate course for biochemistry students, in which some elementary notions of radioactivity and measurement techniques were given. Successive modifications of the biochemistry pregraduate curriculum incorporated radiochemistry as an elective subject and, since 1978, radioisotope methodology as an obligatory subject for biochemistry students. This subject is given at the radioisotope laboratory during the first semester of each year and its objective is to provide theoretical and practical knowledge to the biochemistry students, even

  17. Simulation statistical foundations and methodology

    CERN Document Server

    Mihram, G Arthur

    1972-01-01

    In this book, we study theoretical and practical aspects of computing methods for mathematical modelling of nonlinear systems. A number of computing techniques are considered, such as methods of operator approximation with any given accuracy; operator interpolation techniques including a non-Lagrange interpolation; methods of system representation subject to constraints associated with concepts of causality, memory and stationarity; methods of system representation with an accuracy that is the best within a given class of models; methods of covariance matrix estimation; methods for low-rank mat...

  18. Methodologies for tracking learning paths

    DEFF Research Database (Denmark)

    Frølunde, Lisbeth; Gilje, Øystein; Lindstrand, Fredrik

    2009-01-01

    filmmakers: what furthers their interest and/or hinders it, and what learning patterns emerge. The aim of this article is to present and discuss issues regarding the methodology and methods of the study, such as developing a relationship with interviewees when conducting interviews online (using MSN). We suggest two considerations about using online interviews: how the interviewees value the given subject of conversation and their familiarity with being online. The benefit of getting online communication with the young filmmakers offers ease, because it is both practical and appropriates a meeting...

  19. Continuous culture apparatus and methodology

    International Nuclear Information System (INIS)

    Conway, H.L.

    1975-01-01

    At present, we are investigating the sorption of potentially toxic trace elements by phytoplankton under controlled laboratory conditions. Continuous culture techniques were used to study the mechanism of the sorption of the trace elements by unialgal diatom populations and the factors influencing this sorption. Continuous culture methodology has been used extensively to study bacterial kinetics. It is an excellent technique for obtaining a known physiological state of phytoplankton populations. An automated method for the synthesis of continuous culture medium for use in these experiments is described

  20. Application and licensing requirements of the Framatome ANP RLBLOCA methodology

    International Nuclear Information System (INIS)

    Martin, R.P.; Dunn, B.M.

    2004-01-01

    fission product barrier. 4. A structure, system, or component which operating experience or probabilistic risk assessment has shown to be significant to public health and safety. The application of safety analysis methodologies has evolved to become a primary element in support of a plant's licensing basis. In general, the licensing basis of every plant regulated by the NRC has evolved through that plant's individual communication with the NRC. A utility's use of a safety analysis methodology may have unique elements to provide the desired licensing basis support; hence, a generic safety analysis methodology must maintain a certain amount of flexibility in anticipation of such plant-specific needs. The second component related to plant-specific RLBLOCA analyses stems from the NRC's generic review of the methodology. A key component of this review focused on quantifying the methodology's broader range of applicability. The broader range of applicability includes the quantification of the range of applicability of individual models and correlations in terms of limits on parameters considered important in specific models. It also includes qualitative limits based on unquantifiable uncertainties associated with the extension of both test facility and computer code numerical methods to the full-scale nuclear power plant of interest. These uncertainties include those associated with test facility scale effects, computer code nodalization capabilities, and code model compensating errors. The NRC's review culminated with the release of a Safety Evaluation Report (SER) documenting the NRC's conclusions about the suitability of FANP's RLBLOCA methodology. The contents of the SER include discussions on the NRC's approach to the review, review activities, acceptance rationale for key constituents of the methodology, and an itemized list of additional requirements and restrictions. This list, also referred to as the SER restrictions, reconfirms