The Radiological Safety Analysis Computer Program (RSAC-5) user's manual
International Nuclear Information System (INIS)
Wenzel, D.R.
1994-02-01
The Radiological Safety Analysis Computer Program (RSAC-5) calculates the consequences of the release of radionuclides to the atmosphere. Using a personal computer, a user can generate a fission product inventory from either reactor operating history or nuclear criticalities. RSAC-5 models the effects of high-efficiency particulate air filters or other cleanup systems and calculates decay and ingrowth during transport through processes, facilities, and the environment. Doses are calculated through the inhalation, immersion, ground surface, and ingestion pathways. RSAC+, a menu-driven companion program to RSAC-5, assists users in creating and running RSAC-5 input files. This user's manual contains the mathematical models and operating instructions for RSAC-5 and RSAC+. Instructions, screens, and examples are provided to guide the user through the functions provided by RSAC-5 and RSAC+. These programs are designed for users who are familiar with radiological dose assessment methods
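The RSAC abstracts in this collection repeatedly describe computing doses through the inhalation and other pathways from an airborne release. As a purely illustrative sketch of the inhalation-pathway bookkeeping (not RSAC's actual implementation), the following minimal Python example sums time-integrated air concentration times breathing rate times dose conversion factor over nuclides; the nuclide list, concentrations, and dose coefficients are placeholder values only.

```python
# Minimal sketch of an inhalation-pathway dose estimate of the kind the RSAC
# abstracts describe (dose = time-integrated air concentration x breathing rate
# x dose conversion factor). Not RSAC's implementation; the nuclide list and
# coefficient values below are illustrative placeholders only.

BREATHING_RATE = 3.3e-4  # m^3/s, a typical light-activity adult assumption

# time-integrated air concentration (Bq*s/m^3) and inhalation DCF (Sv/Bq), illustrative
release = {
    "Cs-137": {"chi": 5.0e6, "dcf": 4.6e-9},
    "I-131":  {"chi": 2.0e7, "dcf": 7.4e-9},
}

def inhalation_dose(nuclides, breathing_rate=BREATHING_RATE):
    """Return the committed effective dose (Sv) summed over all nuclides."""
    return sum(v["chi"] * breathing_rate * v["dcf"] for v in nuclides.values())

print(f"Committed dose: {inhalation_dose(release):.2e} Sv")
```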
Radiological Safety Analysis Computer (RSAC) Program Version 7.0 Users’ Manual
Energy Technology Data Exchange (ETDEWEB)
Dr. Bradley J Schrader
2009-03-01
The Radiological Safety Analysis Computer (RSAC) Program Version 7.0 (RSAC-7) is the newest version of the RSAC legacy code. It calculates the consequences of a release of radionuclides to the atmosphere. A user can generate a fission product inventory from either reactor operating history or a nuclear criticality event. RSAC-7 models the effects of high-efficiency particulate air filters or other cleanup systems and calculates the decay and ingrowth during transport through processes, facilities, and the environment. Doses are calculated for inhalation, air immersion, ground surface, ingestion, and cloud gamma pathways. RSAC-7 can be used as a tool to evaluate accident conditions in emergency response scenarios, radiological sabotage events and to evaluate safety basis accident consequences. This users’ manual contains the mathematical models and operating instructions for RSAC-7. Instructions, screens, and examples are provided to guide the user through the functions provided by RSAC-7. This program was designed for users who are familiar with radiological dose assessment methods.
Radiological Safety Analysis Computer (RSAC) Program Version 7.2 Users’ Manual
Energy Technology Data Exchange (ETDEWEB)
Dr. Bradley J Schrader
2010-10-01
The Radiological Safety Analysis Computer (RSAC) Program Version 7.2 (RSAC-7) is the newest version of the RSAC legacy code. It calculates the consequences of a release of radionuclides to the atmosphere. A user can generate a fission product inventory from either reactor operating history or a nuclear criticality event. RSAC-7 models the effects of high-efficiency particulate air filters or other cleanup systems and calculates the decay and ingrowth during transport through processes, facilities, and the environment. Doses are calculated for inhalation, air immersion, ground surface, ingestion, and cloud gamma pathways. RSAC-7 can be used as a tool to evaluate accident conditions in emergency response scenarios, radiological sabotage events and to evaluate safety basis accident consequences. This users’ manual contains the mathematical models and operating instructions for RSAC-7. Instructions, screens, and examples are provided to guide the user through the functions provided by RSAC-7. This program was designed for users who are familiar with radiological dose assessment methods.
Radiological Safety Analysis Computer (RSAC) Program Version 7.0 Users Manual
International Nuclear Information System (INIS)
Schrader, Bradley J.
2009-01-01
The Radiological Safety Analysis Computer (RSAC) Program Version 7.0 (RSAC-7) is the newest version of the RSAC legacy code. It calculates the consequences of a release of radionuclides to the atmosphere. A user can generate a fission product inventory from either reactor operating history or a nuclear criticality event. RSAC-7 models the effects of high-efficiency particulate air filters or other cleanup systems and calculates the decay and ingrowth during transport through processes, facilities, and the environment. Doses are calculated for inhalation, air immersion, ground surface, ingestion, and cloud gamma pathways. RSAC-7 can be used as a tool to evaluate accident conditions in emergency response scenarios, radiological sabotage events and to evaluate safety basis accident consequences. This users manual contains the mathematical models and operating instructions for RSAC-7. Instructions, screens, and examples are provided to guide the user through the functions provided by RSAC-7. This program was designed for users who are familiar with radiological dose assessment methods
RSAC 6.2 with WinRP 2.0 User Manual
Energy Technology Data Exchange (ETDEWEB)
Bradley Schrader
2005-09-01
The Radiological Safety Analysis Computer Program (RSAC-6.2) calculates the consequences of a release of radionuclides to the atmosphere. Using a personal computer, a user can generate a fission product inventory from either reactor operating history or a nuclear criticality accident. RSAC-6.2 models the effects of high-efficiency particulate air filters or other cleanup systems and calculates decay and ingrowth during transport through processes, facilities, and the environment. Doses are calculated for resuspension, inhalation, immersion, ground surface, and ingestion pathways. WinRP 2.0, a Windows-based overlay to RSAC-6.2, assists users in creating and running RSAC-6.2 input files. This users' manual contains the mathematical models and operating instructions for RSAC-6.2 and WinRP 2.0. Instructions, screens, and examples are provided to guide the user through the functions provided by RSAC-6.2 and WinRP 2.0. These programs are designed for users who are familiar with radiological dose assessment methods.
International Nuclear Information System (INIS)
Wenzel, Douglas R.; Schrader, Brad J.
2007-01-01
1 - Description of program or function: RSAC-6 is the latest version of the program RSAC (Radiological Safety Analysis Computer Program). It calculates the consequences of a release of radionuclides to the atmosphere. Using a personal computer, a user can generate a fission product inventory; decay and in-grow the inventory during transport through processes, facilities, and the environment; model the downwind dispersion of the activity; and calculate doses to downwind individuals. Internal dose from the inhalation and ingestion pathways is calculated. External dose from the ground surface and plume gamma pathways is calculated. New and exciting updates to the program include the ability to evaluate a release to an enclosed room, resuspension of deposited activity, and evaluation of a release up to 1 meter from the release point. Enhanced tools are included for dry deposition, building wake, occupancy factors, respirable fraction, AMAD adjustment, an updated and enhanced radionuclide inventory, and inclusion of the dose conversion factors from FGR 11 and 12. 2 - Methods: RSAC-6 calculates meteorological dispersion in the atmosphere using Gaussian plume diffusion for the Pasquill-Gifford, Hilsmeier-Gifford and Markee models. A unique capability is the ability to model Class F fumigation conditions, the meteorological condition that causes the highest ground-level concentrations from an elevated release. Doses may be calculated for various pathways, including the inhalation, ingestion, ground surface, air immersion, and water immersion pathways. Dose calculations may be made for either acute or chronic releases. Internal doses (inhalation and ingestion) are calculated using the ICRP-30 model with dose conversion factors from FGR 11. External doses are calculated using dose conversion factors from FGR 12. 3 - Unusual Features: RSAC-6 calculates complete progeny in-growth and decay during all accident phases. The calculation of fission product inventories is particularly useful in the analysis of accidents where the
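The methods paragraph above refers to Gaussian plume diffusion with Pasquill-Gifford dispersion parameters. A minimal sketch of a ground-level, centerline Gaussian plume calculation for an elevated release is given below; the power-law sigma curves are rough open-country class F fits used only for illustration, not the fitted forms used in RSAC.

```python
import math

# Ground-level, centerline Gaussian plume concentration for an elevated release:
# chi/Q = exp(-H^2 / (2*sigma_z^2)) / (pi * sigma_y * sigma_z * u).
# The dispersion-coefficient fits below are rough stand-ins for stability class F,
# not the parameterizations implemented in RSAC.

def sigma_y(x_m):   # horizontal dispersion coefficient (m), illustrative fit
    return 0.04 * x_m * (1.0 + 0.0001 * x_m) ** -0.5

def sigma_z(x_m):   # vertical dispersion coefficient (m), illustrative fit
    return 0.016 * x_m * (1.0 + 0.0003 * x_m) ** -1.0

def chi_over_q(x_m, wind_speed, stack_height):
    sy, sz = sigma_y(x_m), sigma_z(x_m)
    return math.exp(-stack_height**2 / (2.0 * sz**2)) / (math.pi * sy * sz * wind_speed)

# unit release rate, 2 m/s wind, 30 m stack, receptor 1 km downwind
print(f"chi/Q at 1 km: {chi_over_q(1000.0, 2.0, 30.0):.2e} s/m^3")
```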
Development of the RSAC Automation System for Reload Core of WH NPP
International Nuclear Information System (INIS)
Choi, Yu Sun; Bae, Sung Man; Koh, Byung Marn; Hong, Sun Kwan
2006-01-01
The nuclear design for a reload core of a Westinghouse nuclear power plant consists of the 'Reload Core Model Search', 'Safety Analysis (RSAC)', and 'NDR (Nuclear Design Report) and OCAP (Operational Core Analysis Package Generation)' phases. Since scores of calculations for various accidents are required to confirm that the safety analysis assumptions are valid, the Safety Analysis (RSAC) phase is the most important and the most time- and effort-consuming phase of the reload core design sequence. The Safety Analysis Automation System supports the core designer by automating the safety analysis calculations in the 'Safety Analysis' phase (about 20 calculations). More than 10 kinds of codes, APA (ALPHA/PHOENIX/ANC), APOLLO, VENUS, PHIRE, XEFIT, INCORE, etc., are being used for the safety analysis calculations. The Westinghouse code system needs numerous inputs and outputs, so the possibility of human error cannot be ignored during the safety analysis calculations. To remove these inefficiencies, all input files for the safety analysis calculations are automatically generated and executed by this Safety Analysis Automation System. All calculation notes are generated and the calculation results are summarized in the RSAC (Reload Safety Analysis Checklist) by this system. Therefore, the Safety Analysis Automation System helps the reload core designer to perform the safety analysis of the reload core model instantly and correctly.
75 FR 4904 - Railroad Safety Advisory Committee (RSAC); Working Group Activity Update
2010-01-29
... amend regulations protecting persons who work on, under, or between rolling equipment; and persons...-7257] Railroad Safety Advisory Committee (RSAC); Working Group Activity Update AGENCY: Federal Railroad... Committee (RSAC) Working Group Activities. SUMMARY: The FRA is updating its announcement of RSAC's Working...
International Nuclear Information System (INIS)
Richardson, L.C.
1967-01-01
1 - Description of problem or function: RSAC generates a fission product inventory from a given set of reactor operating conditions and then computes the external gamma dose, the deposition gamma dose, and the inhalation-ingestion dose to critical body organs as a result of exposure to these fission products. Program output includes reactor operating history, fission product inventory, dosages, and ingestion parameters. 2 - Method of solution: The fission product inventory generated by the reactor operating conditions and the inventory remaining at various times after release are computed using the equations of W. Rubinson in Journal of Chemical Physics, Vol. 17, pages 542-547, June 1949. The external gamma dose and the deposition gamma dose are calculated by determining disintegration rates as a function of space and time, then integrating using Hermite's numerical techniques for the spatial dependence. The inhalation-ingestion dose is determined by the type and quantity of activity inhaled and the biological rate of decay following inhalation. These quantities are integrated with respect to time to obtain the dosage. The ingestion dose is related to the inhalation dose by an input constant
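The method-of-solution paragraph above describes computing the fission product inventory remaining at various times after release, i.e., radioactive decay with progeny in-growth. A two-member Bateman chain is the simplest worked instance of that step; the sketch below uses hypothetical half-lives and is a generic illustration, not the Rubinson formulation cited in the abstract.

```python
import math

# Two-member Bateman chain (parent -> daughter), the simplest case of decaying and
# in-growing a fission-product inventory after release. Illustrative only; codes
# such as RSAC handle full decay chains with many members.

def decay_chain(n1_0, n2_0, lam1, lam2, t):
    """Return (parent atoms, daughter atoms) at time t for a two-member chain."""
    n1 = n1_0 * math.exp(-lam1 * t)
    n2 = (n1_0 * lam1 / (lam2 - lam1) * (math.exp(-lam1 * t) - math.exp(-lam2 * t))
          + n2_0 * math.exp(-lam2 * t))
    return n1, n2

# hypothetical parent with an 8 d half-life feeding a daughter with a 2 h half-life
lam_parent = math.log(2) / (8 * 86400.0)
lam_daughter = math.log(2) / (2 * 3600.0)
print(decay_chain(1.0e20, 0.0, lam_parent, lam_daughter, t=86400.0))
```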
75 FR 51525 - Railroad Safety Advisory Committee (RSAC); Working Group Activity Update
2010-08-20
.... The Working Group continues to work on after arrival orders, and at the September 25-26, 2008, meeting... protecting persons who work on, under, or between rolling equipment and persons applying, removing or.... 63] Railroad Safety Advisory Committee (RSAC); Working Group Activity Update AGENCY: Federal Railroad...
Computational movement analysis
Laube, Patrick
2014-01-01
This SpringerBrief discusses the characteristics of spatiotemporal movement data, including uncertainty and scale. It investigates three core aspects of Computational Movement Analysis: Conceptual modeling of movement and movement spaces, spatiotemporal analysis methods aiming at a better understanding of movement processes (with a focus on data mining for movement patterns), and using decentralized spatial computing methods in movement analysis. The author presents Computational Movement Analysis as an interdisciplinary umbrella for analyzing movement processes with methods from a range of fi
DEFF Research Database (Denmark)
This book provides an in-depth introduction and overview of current research in computational music analysis. Its seventeen chapters, written by leading researchers, collectively represent the diversity as well as the technical and philosophical sophistication of the work being done today... on well-established theories in music theory and analysis, such as Forte's pitch-class set theory, Schenkerian analysis, the methods of semiotic analysis developed by Ruwet and Nattiez, and Lerdahl and Jackendoff's Generative Theory of Tonal Music. The book is divided into six parts, covering... music analysis, the book provides an invaluable resource for researchers, teachers and students in music theory and analysis, computer science, music information retrieval and related disciplines. It also provides a state-of-the-art reference for practitioners in the music technology industry.
Computer aided safety analysis
International Nuclear Information System (INIS)
1988-05-01
The document reproduces 20 selected papers from the 38 papers presented at the Technical Committee/Workshop on Computer Aided Safety Analysis organized by the IAEA in co-operation with the Institute of Atomic Energy in Otwock-Swierk, Poland on 25-29 May 1987. A separate abstract was prepared for each of these 20 technical papers. Refs, figs and tabs
Shielding Benchmark Computational Analysis
International Nuclear Information System (INIS)
Hunter, H.T.; Slater, C.O.; Holland, L.B.; Tracz, G.; Marshall, W.J.; Parsons, J.L.
2000-01-01
Over the past several decades, nuclear science has relied on experimental research to verify and validate information about shielding nuclear radiation for a variety of applications. These benchmarks are compared with results from computer code models and are useful for the development of more accurate cross-section libraries, computer code development of radiation transport modeling, and building accurate tests for miniature shielding mockups of new nuclear facilities. When documenting measurements, one must describe many parts of the experimental results to allow a complete computational analysis. Both old and new benchmark experiments, by any definition, must provide a sound basis for modeling more complex geometries required for quality assurance and cost savings in nuclear project development. Benchmarks may involve one or many materials and thicknesses, types of sources, and measurement techniques. In this paper the benchmark experiments of varying complexity are chosen to study the transport properties of some popular materials and thicknesses. These were analyzed using three-dimensional (3-D) models and continuous energy libraries of MCNP4B2, a Monte Carlo code developed at Los Alamos National Laboratory, New Mexico. A shielding benchmark library provided the experimental data and allowed a wide range of choices for source, geometry, and measurement data. The experimental data had often been used in previous analyses by reputable groups such as the Cross Section Evaluation Working Group (CSEWG) and the Organization for Economic Cooperation and Development/Nuclear Energy Agency Nuclear Science Committee (OECD/NEANSC)
Analysis of computer programming languages
International Nuclear Information System (INIS)
Risset, Claude Alain
1967-01-01
This research thesis aims to identify methods of syntax analysis that can be used for computer programming languages, while setting aside the computer devices that influence the choice of the programming language and the methods of analysis and compilation. In the first part, the author proposes attempts at formalizing Chomsky grammar languages. In the second part, he studies analytical grammars, and then a compiler or analytic grammar for the Fortran language.
Gebali, Fayez
2015-01-01
This textbook presents the mathematical theory and techniques necessary for analyzing and modeling high-performance global networks, such as the Internet. The three main building blocks of high-performance networks are links, switching equipment connecting the links together, and software employed at the end nodes and intermediate switches. This book provides the basic techniques for modeling and analyzing these last two components. Topics covered include, but are not limited to: Markov chains and queuing analysis, traffic modeling, interconnection networks and switch architectures and buffering strategies. · Provides techniques for modeling and analysis of network software and switching equipment; · Discusses design options used to build efficient switching equipment; · Includes many worked examples of the application of discrete-time Markov chains to communication systems; · Covers the mathematical theory and techniques necessary for ana...
Affective Computing and Sentiment Analysis
Ahmad, Khurshid
2011-01-01
This volume maps the watershed areas between two 'holy grails' of computer science: the identification and interpretation of affect -- including sentiment and mood. The expression of sentiment and mood involves the use of metaphors, especially in emotive situations. Affective computing is rooted in hermeneutics, philosophy, political science and sociology, and is now a key area of research in computer science. The 24/7 news sites and blogs facilitate the expression and shaping of opinion locally and globally. Sentiment analysis, based on text and data mining, is being used in looking at news
Computer codes for safety analysis
International Nuclear Information System (INIS)
Holland, D.F.
1986-11-01
Computer codes for fusion safety analysis have been under development in the United States for about a decade. This paper will discuss five codes that are currently under development by the Fusion Safety Program. The purpose and capability of each code will be presented, a sample application given, followed by a discussion of the present status and future development plans.
Systems analysis and the computer
Energy Technology Data Exchange (ETDEWEB)
Douglas, A S
1983-08-01
The words 'systems analysis' are used in at least two senses. Whilst the general nature of the topic is well understood in the OR community, the nature of the term as used by computer scientists is less familiar. In this paper, the nature of systems analysis as it relates to computer-based systems is examined from the point of view that the computer system is an automaton embedded in a human system, and some facets of this are explored. It is concluded that OR analysts and computer analysts have things to learn from each other and that this ought to be reflected in their education. The important role played by change in the design of systems is also highlighted, and it is concluded that, whilst the application of techniques developed in the artificial intelligence field has considerable relevance to constructing automata able to adapt to change in the environment, the study of the human factors affecting the overall systems within which the automata are embedded has an even more important role. 19 references.
Computer aided safety analysis 1989
International Nuclear Information System (INIS)
1990-04-01
The meeting was conducted in a workshop style, to encourage involvement of all participants during the discussions. Forty-five (45) experts from 19 countries, plus 22 experts from the GDR participated in the meeting. A list of participants can be found at the end of this volume. Forty-two (42) papers were presented and discussed during the meeting. Additionally an open discussion was held on the possible directions of the IAEA programme on Computer Aided Safety Analysis. A summary of the conclusions of these discussions is presented in the publication. The remainder of this proceedings volume comprises the transcript of selected technical papers (22) presented in the meeting. It is the intention of the IAEA that the publication of these proceedings will extend the benefits of the discussions held during the meeting to a larger audience throughout the world. The Technical Committee/Workshop on Computer Aided Safety Analysis was organized by the IAEA in cooperation with the National Board for Safety and Radiological Protection (SAAS) of the German Democratic Republic in Berlin. The purpose of the meeting was to provide an opportunity for discussions on experiences in the use of computer codes used for safety analysis of nuclear power plants. In particular it was intended to provide a forum for exchange of information among experts using computer codes for safety analysis under the Technical Cooperation Programme on Safety of WWER Type Reactors (RER/9/004) and other experts throughout the world. A separate abstract was prepared for each of the 22 selected papers. Refs, figs tabs and pictures
Computational system for geostatistical analysis
Directory of Open Access Journals (Sweden)
Vendrusculo Laurimar Gonçalves
2004-01-01
Geostatistics identifies the spatial structure of variables representing several phenomena, and its use is becoming more intense in agricultural activities. This paper describes a computer program, based on Windows interfaces (Borland Delphi), which performs spatial analyses of datasets through geostatistical tools: classical statistical calculations, average, cross- and directional semivariograms, simple kriging estimates and jackknifing calculations. A published dataset of soil carbon and nitrogen was used to validate the system. The system was useful for the geostatistical analysis process and for the manipulation of the computational routines in an MS-DOS environment. The Windows development approach allowed the user to model the semivariogram graphically with a greater degree of interaction, functionality rarely available in similar programs. Given its characteristic of rapid prototyping and simplicity when incorporating correlated routines, the Delphi environment presents the main advantage of permitting the evolution of this system.
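The abstract above lists semivariograms and kriging among the program's tools. The sketch below computes an experimental isotropic semivariogram, gamma(h) = (1/(2N(h))) times the sum of squared differences over pairs separated by approximately h, on synthetic data; it is a generic illustration, not the Delphi implementation described.

```python
import numpy as np

# Experimental (isotropic) semivariogram:
# gamma(h) = (1 / (2*N(h))) * sum over pairs separated by ~h of (z_i - z_j)^2.
# Minimal sketch with synthetic data, not the program described in the abstract.

def empirical_semivariogram(coords, values, lags):
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)  # pairwise distances
    sq = (values[:, None] - values[None, :]) ** 2                          # squared differences
    gamma = []
    for lo, hi in zip(lags[:-1], lags[1:]):
        mask = (d > lo) & (d <= hi) & np.triu(np.ones_like(d, dtype=bool), k=1)
        gamma.append(0.5 * sq[mask].mean() if mask.any() else np.nan)
    return np.array(gamma)

rng = np.random.default_rng(0)
pts = rng.uniform(0, 100, size=(200, 2))                     # random sample locations
z = np.sin(pts[:, 0] / 15.0) + 0.1 * rng.normal(size=200)    # spatially structured variable
print(empirical_semivariogram(pts, z, lags=np.linspace(0, 50, 11)))
```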
Computational analysis of cerebral cortex
Energy Technology Data Exchange (ETDEWEB)
Takao, Hidemasa; Abe, Osamu; Ohtomo, Kuni [University of Tokyo, Department of Radiology, Graduate School of Medicine, Tokyo (Japan)
2010-08-15
Magnetic resonance imaging (MRI) has been used in many in vivo anatomical studies of the brain. Computational neuroanatomy is an expanding field of research, and a number of automated, unbiased, objective techniques have been developed to characterize structural changes in the brain using structural MRI without the need for time-consuming manual measurements. Voxel-based morphometry is one of the most widely used automated techniques to examine patterns of brain changes. Cortical thickness analysis is also becoming increasingly used as a tool for the study of cortical anatomy. Both techniques can be relatively easily used with freely available software packages. MRI data quality is important in order for the processed data to be accurate. In this review, we describe MRI data acquisition and preprocessing for morphometric analysis of the brain and present a brief summary of voxel-based morphometry and cortical thickness analysis. (orig.)
Computational analysis of cerebral cortex
International Nuclear Information System (INIS)
Takao, Hidemasa; Abe, Osamu; Ohtomo, Kuni
2010-01-01
Magnetic resonance imaging (MRI) has been used in many in vivo anatomical studies of the brain. Computational neuroanatomy is an expanding field of research, and a number of automated, unbiased, objective techniques have been developed to characterize structural changes in the brain using structural MRI without the need for time-consuming manual measurements. Voxel-based morphometry is one of the most widely used automated techniques to examine patterns of brain changes. Cortical thickness analysis is also becoming increasingly used as a tool for the study of cortical anatomy. Both techniques can be relatively easily used with freely available software packages. MRI data quality is important in order for the processed data to be accurate. In this review, we describe MRI data acquisition and preprocessing for morphometric analysis of the brain and present a brief summary of voxel-based morphometry and cortical thickness analysis. (orig.)
Computer aided analysis of disturbances
International Nuclear Information System (INIS)
Baldeweg, F.; Lindner, A.
1986-01-01
Computer aided analysis of disturbances and the prevention of failures (diagnosis and therapy control) in technological plants are among the most important tasks of process control. Research in this field is very intensive due to increasing requirements on the security and economy of process control and to the remarkable increase in the efficiency of digital electronics. This publication is concerned with the analysis of disturbances in complex technological plants, especially in so-called high-risk processes. The presentation emphasizes the theoretical concepts of diagnosis and therapy control, the modelling of the disturbance behaviour of the technological process, and man-machine communication integrating artificial intelligence methods, e.g., the expert system approach. An application is given for nuclear power plants. (author)
Personal Computer Transport Analysis Program
DiStefano, Frank, III; Wobick, Craig; Chapman, Kirt; McCloud, Peter
2012-01-01
The Personal Computer Transport Analysis Program (PCTAP) is C++ software used for analysis of thermal fluid systems. The program predicts thermal fluid system and component transients. The output consists of temperatures, flow rates, pressures, delta pressures, tank quantities, and gas quantities in the air, along with air scrubbing component performance. PCTAP's solution process assumes that the tubes in the system are well insulated so that only the heat transfer between fluid and tube wall and between adjacent tubes is modeled. The system described in the model file is broken down into its individual components; i.e., tubes, cold plates, heat exchangers, etc. A solution vector is built from the components and a flow is then simulated with fluid being transferred from one component to the next. The solution vector of components in the model file is built at the initiation of the run. This solution vector is simply a list of components in the order of their inlet dependency on other components. The component parameters are updated in the order in which they appear in the list at every time step. Once the solution vectors have been determined, PCTAP cycles through the components in the solution vector, executing their outlet function for each time-step increment.
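The solution-vector idea described above (components ordered by their inlet dependency and updated in that order at each time step) can be illustrated with a small dependency-ordering sketch; the component names and the toy update rule below are hypothetical and are not PCTAP's.

```python
from collections import deque

# Sketch of a "solution vector": order components by inlet dependency (a topological
# sort), then update them in that order each time step. Component names and the
# update rule are hypothetical placeholders, not PCTAP's model.

def build_solution_vector(inlets):
    """inlets maps component -> list of upstream components feeding it."""
    indeg = {c: len(up) for c, up in inlets.items()}
    downstream = {c: [] for c in inlets}
    for c, ups in inlets.items():
        for u in ups:
            downstream[u].append(c)
    queue = deque(c for c, n in indeg.items() if n == 0)
    order = []
    while queue:
        c = queue.popleft()
        order.append(c)
        for d in downstream[c]:
            indeg[d] -= 1
            if indeg[d] == 0:
                queue.append(d)
    return order

inlets = {"pump": [], "cold_plate": ["pump"], "heat_exchanger": ["cold_plate"], "tank": ["heat_exchanger"]}
order = build_solution_vector(inlets)
temps = {c: 20.0 for c in order}
for step in range(3):                      # crude explicit time marching
    for c in order:                        # each component takes its inlet from upstream
        upstream = inlets[c]
        inlet_t = temps[upstream[0]] if upstream else 20.0
        temps[c] = inlet_t + (1.0 if c == "cold_plate" else 0.0)   # toy heat addition
print(order, temps)
```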
Piping stress analysis with personal computers
International Nuclear Information System (INIS)
Revesz, Z.
1987-01-01
The growing market for personal computers is providing an increasing number of professionals with unprecedented and surprisingly inexpensive computing capacity which, if used with powerful software, can immensely enhance the engineer's capabilities. This paper focuses on the possibilities opened up in piping stress analysis by the widespread distribution of personal computers, on the necessary changes in the software, and on the limitations of using personal computers for engineering design and analysis. Reliability and quality assurance aspects of using personal computers for nuclear applications are also mentioned. The paper concludes with the personal views of the author and experience gained during the development of interactive graphic piping software for personal computers. (orig./GL)
Computer-Based Linguistic Analysis.
Wright, James R.
Noam Chomsky's transformational-generative grammar model may effectively be translated into an equivalent computer model. Phrase-structure rules and transformations are tested as to their validity and ordering by the computer via the process of random lexical substitution. Errors appearing in the grammar are detected and rectified, and formal…
Transportation Research & Analysis Computing Center
Federal Laboratory Consortium — The technical objectives of the TRACC project included the establishment of a high performance computing center for use by USDOT research teams, including those from...
Numerical Analysis of Multiscale Computations
Engquist, Björn; Tsai, Yen-Hsi R
2012-01-01
This book is a snapshot of current research in multiscale modeling, computations and applications. It covers fundamental mathematical theory, numerical algorithms as well as practical computational advice for analysing single and multiphysics models containing a variety of scales in time and space. Complex fluids, porous media flow and oscillatory dynamical systems are treated in some extra depth, as well as tools like analytical and numerical homogenization, and fast multipole method.
Batch Computed Tomography Analysis of Projectiles
2016-05-01
ARL-TR-7681, May 2016, US Army Research Laboratory. Batch Computed Tomography Analysis of Projectiles, by Michael C Golt, Chris M ... and Matthew S Bratcher, Weapons and Materials Research ... values to account for projectile variability in the ballistic evaluation of armor. Subject terms: computed tomography, CT, BS41, projectiles
Computer Methods of Genetic Analysis
Directory of Open Access Journals (Sweden)
A. L. Osipov
2017-02-01
The basic statistical methods used in the genetic analysis of human traits are presented; the traits were studied by segregation analysis, linkage analysis and allelic associations. Software supporting the implementation of these methods was developed.
Impact analysis on a massively parallel computer
International Nuclear Information System (INIS)
Zacharia, T.; Aramayo, G.A.
1994-01-01
Advanced mathematical techniques and computer simulation play a major role in evaluating and enhancing the design of beverage cans, industrial, and transportation containers for improved performance. Numerical models are used to evaluate the impact requirements of containers used by the Department of Energy (DOE) for transporting radioactive materials. Many of these models are highly compute-intensive. An analysis may require several hours of computational time on current supercomputers despite the simplicity of the models being studied. As computer simulations and materials databases grow in complexity, massively parallel computers have become important tools. Massively parallel computational research at the Oak Ridge National Laboratory (ORNL) and its application to the impact analysis of shipping containers is briefly described in this paper
IUE Data Analysis Software for Personal Computers
Thompson, R.; Caplinger, J.; Taylor, L.; Lawton , P.
1996-01-01
This report summarizes the work performed for the program titled, "IUE Data Analysis Software for Personal Computers" awarded under Astrophysics Data Program NRA 92-OSSA-15. The work performed was completed over a 2-year period starting in April 1994. As a result of the project, 450 IDL routines and eight database tables are now available for distribution for Power Macintosh computers and Personal Computers running Windows 3.1.
Computational methods for corpus annotation and analysis
Lu, Xiaofei
2014-01-01
This book reviews computational tools for lexical, syntactic, semantic, pragmatic and discourse analysis, with instructions on how to obtain, install and use each tool. Covers studies using Natural Language Processing, and offers ideas for better integration.
Applied time series analysis and innovative computing
Ao, Sio-Iong
2010-01-01
This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.
Computational methods in power system analysis
Idema, Reijer
2014-01-01
This book treats state-of-the-art computational methods for power flow studies and contingency analysis. In the first part the authors present the relevant computational methods and mathematical concepts. In the second part, power flow and contingency analysis are treated. Furthermore, traditional methods to solve such problems are compared to modern solvers, developed using the knowledge of the first part of the book. Finally, these solvers are analyzed both theoretically and experimentally, clearly showing the benefits of the modern approach.
A computational description of simple mediation analysis
Directory of Open Access Journals (Sweden)
Caron, Pier-Olivier
2018-04-01
Simple mediation analysis is an increasingly popular statistical analysis in psychology and in other social sciences. However, there are very few detailed accounts of the computations within the model. Articles more often focus on explaining mediation analysis conceptually rather than mathematically. Thus, the purpose of the current paper is to introduce the computational modelling within simple mediation analysis, accompanied by examples in R. Firstly, mediation analysis will be described. Then, the method to simulate data in R (with standardized coefficients) will be presented. Finally, the bootstrap method, the Sobel test and the Baron and Kenny test, all used to evaluate mediation (i.e., the indirect effect), will be developed. The R code to implement the computations presented is offered, as well as a script to carry out a power analysis and a complete example.
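The paper described above works in R; the sketch below is a Python analogue of the same computations: simulate X -> M -> Y data with standardized coefficients and bootstrap the indirect effect a*b. The coefficient values are arbitrary examples, not taken from the paper.

```python
import numpy as np

# Python analogue of the simple mediation computations described above (the paper
# itself uses R): simulate X -> M -> Y with standardized coefficients, then
# bootstrap the indirect effect a*b. Coefficient values are arbitrary examples.

rng = np.random.default_rng(1)
n, a, b, c_prime = 500, 0.4, 0.5, 0.2
x = rng.normal(size=n)
m = a * x + rng.normal(scale=np.sqrt(1 - a**2), size=n)
y = c_prime * x + b * m + rng.normal(scale=np.sqrt(1 - c_prime**2 - b**2 - 2*a*b*c_prime), size=n)

def indirect(x, m, y):
    a_hat = np.polyfit(x, m, 1)[0]                       # slope of M on X
    X = np.column_stack([np.ones_like(x), x, m])
    b_hat = np.linalg.lstsq(X, y, rcond=None)[0][2]      # slope of Y on M, controlling for X
    return a_hat * b_hat

boot = [indirect(*(arr[idx] for arr in (x, m, y)))
        for idx in (rng.integers(0, n, n) for _ in range(2000))]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect = {indirect(x, m, y):.3f}, 95% bootstrap CI [{lo:.3f}, {hi:.3f}]")
```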
Distributed computing and nuclear reactor analysis
International Nuclear Information System (INIS)
Brown, F.B.; Derstine, K.L.; Blomquist, R.N.
1994-01-01
Large-scale scientific and engineering calculations for nuclear reactor analysis can now be carried out effectively in a distributed computing environment, at costs far lower than for traditional mainframes. The distributed computing environment must include support for traditional system services, such as a queuing system for batch work, reliable filesystem backups, and parallel processing capabilities for large jobs. All ANL computer codes for reactor analysis have been adapted successfully to a distributed system based on workstations and X-terminals. Distributed parallel processing has been demonstrated to be effective for long-running Monte Carlo calculations
Computer assisted functional analysis
Energy Technology Data Exchange (ETDEWEB)
Schmidt, H A.E.; Roesler, H
1982-01-01
The latest developments in computer-assisted functional analysis (CFA) in nuclear medicine are presented in about 250 papers of the 19th international annual meeting of the Society of Nuclear Medicine (Bern, September 1981). Apart from the mathematical and instrumental aspects of CFA, computerized emission tomography is given particular attention. Advances in nuclear medical diagnosis in the fields of radiopharmaceuticals, cardiology, angiology, neurology, ophthalmology, pulmonology, gastroenterology, nephrology, endocrinology, oncology and osteology are discussed.
DFT computational analysis of piracetam
Rajesh, P.; Gunasekaran, S.; Seshadri, S.; Gnanasambandan, T.
2014-11-01
Density functional theory calculations with B3LYP using the 6-31G(d,p) and 6-31++G(d,p) basis sets have been used to determine ground state molecular geometries. The first order hyperpolarizability (β0) and related properties (β, α0 and Δα) of piracetam are calculated using the B3LYP/6-31G(d,p) method on the finite-field approach. The stability of the molecule has been analyzed using NBO/NLMO analysis. The calculation of the first hyperpolarizability shows that the molecule is attractive for future applications in non-linear optics. The molecular electrostatic potential (MEP) at a point in the space around a molecule gives an indication of the net electrostatic effect produced at that point by the total charge distribution of the molecule. The calculated HOMO and LUMO energies show that charge transfer occurs within these molecules. Mulliken population analysis of the atomic charges is also carried out. On the basis of the vibrational analysis, the thermodynamic properties of the title compound at different temperatures have been calculated. Finally, the UV-Vis spectra and electronic absorption properties are explained and illustrated from the frontier molecular orbitals.
Automating sensitivity analysis of computer models using computer calculus
International Nuclear Information System (INIS)
Oblow, E.M.; Pin, F.G.
1986-01-01
An automated procedure for performing sensitivity analysis has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with direct and adjoint sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency consideration and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies
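GRESS, as described above, augments a FORTRAN computation so that derivatives are generated alongside the original results. A minimal forward-mode automatic differentiation sketch with dual numbers conveys that general idea in a few lines; it is a generic illustration, not the GRESS compiler approach.

```python
# Minimal forward-mode automatic differentiation with dual numbers, illustrating
# the "computer calculus" idea of carrying derivatives through a computation.
# Generic sketch only; not the GRESS FORTRAN compiler described above.

class Dual:
    def __init__(self, value, deriv=0.0):
        self.value, self.deriv = value, deriv
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)
    __rmul__ = __mul__

def model(k):
    # toy response whose sensitivity to k we want: f(k) = k*k + 3*k
    return k * k + 3 * k

k = Dual(2.0, deriv=1.0)        # seed derivative dk/dk = 1
out = model(k)
print(out.value, out.deriv)     # f(2) = 10, df/dk = 2*k + 3 = 7
```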
Turbo Pascal Computer Code for PIXE Analysis
International Nuclear Information System (INIS)
Darsono
2002-01-01
To make optimal use of the 150 kV ion accelerator facilities and to master the analysis techniques based on the ion accelerator, research and development of low-energy PIXE technology has been carried out. The R and D on the hardware of the low-energy PIXE installation in P3TM has been under way since the year 2000. To support the R and D on the PIXE accelerator facilities in harmony with the R and D on the PIXE hardware, the development of PIXE analysis software is also needed. The development of a database for PIXE analysis software, written in Turbo Pascal, is reported in this paper. This computer code computes the ionization cross-section, the fluorescence yield, and the stopping power of elements; it also computes the X-ray attenuation coefficients as a function of energy. The computer code is named PIXEDASIS and is part of a larger computer code planned for PIXE analysis that will be constructed in the near future. PIXEDASIS is designed to be communicative with the user. It takes its input from the keyboard. The output is shown on the PC monitor and can also be printed. Performance tests of PIXEDASIS show that it operates well and provides data in agreement with data from other literature. (author)
Automating sensitivity analysis of computer models using computer calculus
International Nuclear Information System (INIS)
Oblow, E.M.; Pin, F.G.
1985-01-01
An automated procedure for performing sensitivity analyses has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with 'direct' and 'adjoint' sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency considerations and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies. 24 refs., 2 figs
Computer graphics in reactor safety analysis
International Nuclear Information System (INIS)
Fiala, C.; Kulak, R.F.
1989-01-01
This paper describes a family of three computer graphics codes designed to assist the analyst in three areas: the modelling of complex three-dimensional finite element models of reactor structures; the interpretation of computational results; and the reporting of the results of numerical simulations. The purpose and key features of each code are presented. The graphics output used in actual safety analysis are used to illustrate the capabilities of each code. 5 refs., 10 figs
Uncertainty analysis in Monte Carlo criticality computations
International Nuclear Information System (INIS)
Qi Ao
2011-01-01
Highlights: ► Two types of uncertainty methods for keff Monte Carlo computations are examined. ► The sampling method has the fewest restrictions on perturbations but demands more computing resources. ► The analytical method is limited to small perturbations of material properties. ► Practicality relies on efficiency, multiparameter applicability and data availability. - Abstract: Uncertainty analysis is imperative for nuclear criticality risk assessments when using Monte Carlo neutron transport methods to predict the effective neutron multiplication factor (keff) for fissionable material systems. For the validation of Monte Carlo codes for criticality computations against benchmark experiments, code accuracy and precision are measured by both the computational bias and the uncertainty in the bias. The uncertainty in the bias accounts for known or quantified experimental, computational and model uncertainties. For the application of Monte Carlo codes to the criticality analysis of fissionable material systems, an administrative margin of subcriticality must be imposed to provide additional assurance of subcriticality for any unknown or unquantified uncertainties. Because of the substantial impact of the administrative margin of subcriticality on the economics and safety of nuclear fuel cycle operations, recently increasing interest in reducing the administrative margin of subcriticality makes the uncertainty analysis in criticality safety computations more risk-significant. This paper provides an overview of the two most popular keff uncertainty analysis methods for Monte Carlo criticality computations: (1) sampling-based methods, and (2) analytical methods. Examples are given to demonstrate their usage in keff uncertainty analysis due to uncertainties in both neutronic and non-neutronic parameters of fissionable material systems.
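The sampling-based approach mentioned above can be illustrated generically: draw the uncertain inputs from assumed distributions, evaluate the model for each sample, and report the spread of the results. In the sketch below the keff model is a toy surrogate standing in for a full Monte Carlo criticality calculation, and the input uncertainties are assumed values.

```python
import numpy as np

# Sampling-based uncertainty propagation: sample the uncertain input parameters,
# evaluate the model for each sample, and take the spread of the results.
# keff_model is a toy surrogate, not a Monte Carlo transport calculation, and the
# input distributions are assumed values for illustration only.

rng = np.random.default_rng(42)

def keff_model(enrichment, density):
    # toy surrogate: keff rises with enrichment and density (illustrative only)
    return 0.8 + 0.04 * enrichment + 0.02 * density

n_samples = 1000
enrich = rng.normal(4.0, 0.05, n_samples)   # wt% U-235, assumed uncertainty
dens = rng.normal(10.4, 0.1, n_samples)     # g/cm^3 fuel density, assumed uncertainty

keff = keff_model(enrich, dens)
print(f"keff = {keff.mean():.4f} +/- {keff.std(ddof=1):.4f} (1 sigma, input sampling only)")
```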
ASTEC: Controls analysis for personal computers
Downing, John P.; Bauer, Frank H.; Thorpe, Christopher J.
1989-01-01
The ASTEC (Analysis and Simulation Tools for Engineering Controls) software is under development at Goddard Space Flight Center (GSFC). The design goal is to provide a wide selection of controls analysis tools at the personal computer level, as well as the capability to upload compute-intensive jobs to a mainframe or supercomputer. The project is a follow-on to the INCA (INteractive Controls Analysis) program that has been developed at GSFC over the past five years. While ASTEC makes use of the algorithms and expertise developed for the INCA program, the user interface was redesigned to take advantage of the capabilities of the personal computer. The design philosophy and the current capabilities of the ASTEC software are described.
Temporal fringe pattern analysis with parallel computing
International Nuclear Information System (INIS)
Tuck Wah Ng; Kar Tien Ang; Argentini, Gianluca
2005-01-01
Temporal fringe pattern analysis is invaluable in transient phenomena studies but necessitates long processing times. Here we describe a parallel computing strategy based on the single-program multiple-data model and hyperthreading processor technology to reduce the execution time. In a two-node cluster workstation configuration we found that execution periods were reduced by 1.6 times when four virtual processors were used. To allow even lower execution times with an increasing number of processors, the time allocated for data transfer, data read, and waiting should be minimized. Parallel computing is found here to present a feasible approach to reduce execution times in temporal fringe pattern analysis
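The abstract above describes a single-program multiple-data strategy for splitting the work across processors. A minimal Python multiprocessing sketch of that division of a frame stack follows; the per-frame "analysis" is a placeholder FFT, not the authors' temporal phase-retrieval algorithm.

```python
from multiprocessing import Pool
import numpy as np

# Single-program multiple-data sketch in the spirit of the strategy described above:
# split the stack of fringe frames across worker processes and analyse each chunk
# independently. The per-frame "analysis" is a placeholder FFT, not the authors'
# temporal fringe algorithm.

def analyse_chunk(frames):
    return [np.abs(np.fft.rfft(f.ravel())).argmax() for f in frames]

if __name__ == "__main__":
    frames = np.random.default_rng(0).random((64, 128, 128))   # synthetic frame stack
    chunks = np.array_split(frames, 4)                          # one chunk per worker
    with Pool(processes=4) as pool:
        results = pool.map(analyse_chunk, chunks)
    print(sum(len(r) for r in results), "frames processed")
```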
A computer program for activation analysis
International Nuclear Information System (INIS)
Rantanen, J.; Rosenberg, R.J.
1983-01-01
A computer program for calculating the results of activation analysis is described. The program comprises two gamma spectrum analysis programs, STOAV and SAMPO and one program for calculating elemental concentrations, KVANT. STOAV is based on a simple summation of channels and SAMPO is based on fitting of mathematical functions. The programs are tested by analyzing the IAEA G-1 test spectra. In the determination of peak location SAMPO is somewhat better than STOAV and in the determination of peak area SAMPO is more than twice as accurate as STOAV. On the other hand, SAMPO is three times as expensive as STOAV with the use of a Cyber 170 computer. (author)
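STOAV, as described above, estimates peak areas by simple summation of channels, while SAMPO fits analytic peak shapes. A minimal sketch of channel summation with a straight-line background subtraction on a synthetic spectrum follows; the spectrum and channel limits are illustrative only.

```python
import numpy as np

# Peak-area estimate by simple channel summation with a straight-line background,
# the kind of approach attributed to STOAV above (SAMPO instead fits analytic
# peak shapes). Synthetic spectrum; channel limits are illustrative.

rng = np.random.default_rng(3)
channels = np.arange(200)
spectrum = rng.poisson(50 + 400 * np.exp(-0.5 * ((channels - 100) / 3.0) ** 2))

def net_peak_area(counts, lo, hi, bg_width=5):
    """Sum counts in [lo, hi] and subtract a linear background estimated from
    bg_width channels on each side of the peak region."""
    left = counts[lo - bg_width:lo].mean()
    right = counts[hi + 1:hi + 1 + bg_width].mean()
    background = 0.5 * (left + right) * (hi - lo + 1)
    return counts[lo:hi + 1].sum() - background

print(f"net peak area: {net_peak_area(spectrum, 90, 110):.0f} counts")
```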
Safety analysis of control rod drive computers
International Nuclear Information System (INIS)
Ehrenberger, W.; Rauch, G.; Schmeil, U.; Maertz, J.; Mainka, E.U.; Nordland, O.; Gloee, G.
1985-01-01
The analysis of the most significant user programmes revealed no errors in these programmes. The evaluation of approximately 82 cumulated years of operation demonstrated that the operating system of the control rod positioning processor has a reliability that is sufficiently good for the tasks this computer has to fulfil. Computers can be used for safety relevant tasks. The experience gained with the control rod positioning processor confirms that computers are not less reliable than conventional instrumentation and control system for comparable tasks. The examination and evaluation of computers for safety relevant tasks can be done with programme analysis or statistical evaluation of the operating experience. Programme analysis is recommended for seldom used and well structured programmes. For programmes with a long, cumulated operating time a statistical evaluation is more advisable. The effort for examination and evaluation is not greater than the corresponding effort for conventional instrumentation and control systems. This project has also revealed that, where it is technologically sensible, process controlling computers or microprocessors can be qualified for safety relevant tasks without undue effort. (orig./HP) [de
Surface computing and collaborative analysis work
Brown, Judith; Gossage, Stevenson; Hack, Chris
2013-01-01
Large surface computing devices (wall-mounted or tabletop) with touch interfaces and their application to collaborative data analysis, an increasingly important and prevalent activity, is the primary topic of this book. Our goals are to outline the fundamentals of surface computing (a still maturing technology), review relevant work on collaborative data analysis, describe frameworks for understanding collaborative processes, and provide a better understanding of the opportunities for research and development. We describe surfaces as display technologies with which people can interact directly, and emphasize how interaction design changes when designing for large surfaces. We review efforts to use large displays, surfaces or mixed display environments to enable collaborative analytic activity. Collaborative analysis is important in many domains, but to provide concrete examples and a specific focus, we frequently consider analysis work in the security domain, and in particular the challenges security personne...
Computer-assisted qualitative data analysis software.
Cope, Diane G
2014-05-01
Advances in technology have provided new approaches for data collection methods and analysis for researchers. Data collection is no longer limited to paper-and-pencil format, and numerous methods are now available through Internet and electronic resources. With these techniques, researchers are not burdened with entering data manually and data analysis is facilitated by software programs. Quantitative research is supported by the use of computer software and provides ease in the management of large data sets and rapid analysis of numeric statistical methods. New technologies are emerging to support qualitative research with the availability of computer-assisted qualitative data analysis software (CAQDAS). CAQDAS will be presented with a discussion of advantages, limitations, controversial issues, and recommendations for this type of software use.
Spatial analysis statistics, visualization, and computational methods
Oyana, Tonny J
2015-01-01
An introductory text for the next generation of geospatial analysts and data scientists, Spatial Analysis: Statistics, Visualization, and Computational Methods focuses on the fundamentals of spatial analysis using traditional, contemporary, and computational methods. Outlining both non-spatial and spatial statistical concepts, the authors present practical applications of geospatial data tools, techniques, and strategies in geographic studies. They offer a problem-based learning (PBL) approach to spatial analysis-containing hands-on problem-sets that can be worked out in MS Excel or ArcGIS-as well as detailed illustrations and numerous case studies. The book enables readers to: Identify types and characterize non-spatial and spatial data Demonstrate their competence to explore, visualize, summarize, analyze, optimize, and clearly present statistical data and results Construct testable hypotheses that require inferential statistical analysis Process spatial data, extract explanatory variables, conduct statisti...
Computation for the analysis of designed experiments
Heiberger, Richard
2015-01-01
Addresses the statistical, mathematical, and computational aspects of the construction of packages and analysis of variance (ANOVA) programs. Includes a disk at the back of the book that contains all program codes in four languages, APL, BASIC, C, and FORTRAN. Presents illustrations of the dual space geometry for all designs, including confounded designs.
Computational analysis of a multistage axial compressor
Mamidoju, Chaithanya
Turbomachines are used extensively in Aerospace, Power Generation, and Oil & Gas Industries. Efficiency of these machines is often an important factor and has led to the continuous effort to improve the design to achieve better efficiency. The axial flow compressor is a major component in a gas turbine with the turbine's overall performance depending strongly on compressor performance. Traditional analysis of axial compressors involves throughflow calculations, isolated blade passage analysis, Quasi-3D blade-to-blade analysis, single-stage (rotor-stator) analysis, and multi-stage analysis involving larger design cycles. In the current study, the detailed flow through a 15 stage axial compressor is analyzed using a 3-D Navier Stokes CFD solver in a parallel computing environment. Methodology is described for steady state (frozen rotor stator) analysis of one blade passage per component. Various effects such as mesh type and density, boundary conditions, tip clearance and numerical issues such as turbulence model choice, advection model choice, and parallel processing performance are analyzed. A high sensitivity of the predictions to the above was found. Physical explanation to the flow features observed in the computational study are given. The total pressure rise verses mass flow rate was computed.
Computation system for nuclear reactor core analysis
International Nuclear Information System (INIS)
Vondy, D.R.; Fowler, T.B.; Cunningham, G.W.; Petrie, L.M.
1977-04-01
This report documents a system which contains computer codes as modules developed to evaluate nuclear reactor core performance. The diffusion theory approximation to neutron transport may be applied with the VENTURE code treating up to three dimensions. The effect of exposure may be determined with the BURNER code, allowing depletion calculations to be made. The features and requirements of the system are discussed, as are aspects common to the computational modules, but the latter are documented elsewhere. User input data requirements, data file management, control, and the modules which perform general functions are described. Continuing development and implementation effort is enhancing the analysis capability available locally and to other installations from remote terminals.
Computer-aided power systems analysis
Kusic, George
2008-01-01
Computer applications yield more insight into system behavior than is possible by using hand calculations on system elements. Computer-Aided Power Systems Analysis: Second Edition is a state-of-the-art presentation of basic principles and software for power systems in steady-state operation. Originally published in 1985, this revised edition explores power systems from the point of view of the central control facility. It covers the elements of transmission networks, bus reference frame, network fault and contingency calculations, power flow on transmission networks, generator base power setti
Plasma geometric optics analysis and computation
International Nuclear Information System (INIS)
Smith, T.M.
1983-01-01
Important practical applications in the generation, manipulation, and diagnosis of laboratory thermonuclear plasmas have created a need for elaborate computational capabilities in the study of high frequency wave propagation in plasmas. A reduced description of such waves suitable for digital computation is provided by the theory of plasma geometric optics. The existing theory is beset by a variety of special cases in which the straightforward analytical approach fails, and has been formulated with little attention to problems of numerical implementation of that analysis. The standard field equations are derived for the first time from kinetic theory. A discussion of certain terms previously, and erroneously, omitted from the expansion of the plasma constitutive relation is given. A powerful but little known computational prescription for determining the geometric optics field in the neighborhood of caustic singularities is rigorously developed, and a boundary layer analysis for the asymptotic matching of the plasma geometric optics field across caustic singularities is performed for the first time with considerable generality. A proper treatment of birefringence is detailed, wherein a breakdown of the fundamental perturbation theory is identified and circumvented. A general ray tracing computer code suitable for applications to radiation heating and diagnostic problems is presented and described
Analysis of electronic circuits using digital computers
International Nuclear Information System (INIS)
Tapu, C.
1968-01-01
Various programmes have been proposed for studying electronic circuits with the help of computers. It is shown here how it is possible to use the programme ECAP, developed by I.B.M., for studying the behaviour of an operational amplifier from different points of view: direct current, alternating current and transient state analysis, optimisation of the gain in open loop, and study of the reliability. (author) [fr
Computational Chemical Synthesis Analysis and Pathway Design
Directory of Open Access Journals (Sweden)
Fan Feng
2018-06-01
With the idea of retrosynthetic analysis, which was introduced in the 1960s, chemical synthesis analysis and pathway design have been transformed from a complex problem into a regular process of structural simplification. This review aims to summarize the developments of computer-assisted synthetic analysis and design in recent years, and how machine-learning algorithms contributed to them. The LHASA system started the pioneering work of designing semi-empirical reaction modes in computers, with the subsequent rule-based and network-searching work not only expanding the databases, but also building new approaches to representing reaction rules. Programs like ARChem Route Designer replaced hand-coded reaction modes with automatically extracted rules, and programs like Chematica turned traditional design into network searching. Afterward, with the help of machine learning, two-step models which combine reaction rules and statistical methods became the mainstream. Recently, fully data-driven learning methods using deep neural networks, which do not even require any prior knowledge, have been applied in this field. Up to now, however, these methods still cannot replace experienced human organic chemists due to their relatively low accuracy. New algorithms, with the aid of powerful computational hardware, will make this topic promising and give it good prospects.
CMS Computing Software and Analysis Challenge 2006
Energy Technology Data Exchange (ETDEWEB)
De Filippis, N. [Dipartimento interateneo di Fisica M. Merlin and INFN Bari, Via Amendola 173, 70126 Bari (Italy)
2007-10-15
The CMS (Compact Muon Solenoid) collaboration is making a major effort to test the workflow and the dataflow associated with its data handling model. For this purpose the Computing, Software and Analysis Challenge 2006 (CSA06) started on 15 September. It was a 50 million event exercise that included all the steps of the analysis chain, from prompt reconstruction, data streaming, iterative calibration and alignment executions, and data distribution to regional sites, up to end-user analysis. Grid tools provided by the LCG project were also exercised to provide access to the data and the resources through a user-friendly interface for the physicists submitting production and analysis jobs. An overview of the status and results of CSA06 is presented in this work.
CMS Computing Software and Analysis Challenge 2006
International Nuclear Information System (INIS)
De Filippis, N.
2007-01-01
The CMS (Compact Muon Solenoid) collaboration is making a major effort to test the workflow and the dataflow associated with its data handling model. For this purpose the Computing, Software and Analysis Challenge 2006 (CSA06) started on 15 September. It was a 50 million event exercise that included all the steps of the analysis chain, from prompt reconstruction, data streaming, iterative calibration and alignment executions, and data distribution to regional sites, up to end-user analysis. Grid tools provided by the LCG project were also exercised to provide access to the data and the resources through a user-friendly interface for the physicists submitting production and analysis jobs. An overview of the status and results of CSA06 is presented in this work.
International Nuclear Information System (INIS)
Norman, C.; Zhao, K.; Christophe, P.; Binner, R.; Iso, S.
2015-01-01
Two of the three generic objectives of safeguards under a comprehensive safeguards agreement (CSA) are to detect any undeclared production or processing of nuclear material in declared facilities and locations outside facilities (LOFs) and to detect any diversion of declared nuclear material at facilities and LOFs. The effectiveness and efficiency of the IAEA in reaching these objectives strongly relies on the quality of the State or regional system of accounting for and control of nuclear material (SSAC/RSAC) which in turn depends on the nuclear fuel cycle facility operators' capabilities to establish accurate and precise estimates of the inventories and flow of nuclear material. To monitor the performance of the State's nuclear fuel cycle facilities' accounting and measurement systems in a collaborative way, the IAEA initiated yearly trilateral liaison meetings with relevant State or regional authorities and bulk handling facilities' operators to review material balance evaluation results for the elapsed material balance period and their trends over the facility lifetime. During these meetings, trends of concern are examined and the IAEA proposes remedial actions, drawing on its expertise and experience of observations in similar facilities. Pilot trilateral meetings held in Japan over the past years demonstrate the benefits of this collaborative framework. Biases in material balance variables are identified, their causes determined and a set of recommendations is drawn to implement remedial actions before they become a safeguards concern. In the margins of these meetings, workshops are also organised to foster exchanges in the fields of measurement and analytical methods as well as statistical methodologies used to determine their uncertainties and assess the sensitivity of material balances to these uncertainties. In the context of its strategy to enhance cooperation with States, reinforce mutual trust and pursue further efficiencies though
Introduction to scientific computing and data analysis
Holmes, Mark H
2016-01-01
This textbook provides an introduction to numerical computing and its applications in science and engineering. The topics covered include those usually found in an introductory course, as well as those that arise in data analysis. This includes optimization and regression-based methods using a singular value decomposition. The emphasis is on problem solving, and there are numerous exercises throughout the text concerning applications in engineering and science. The essential role of the mathematical theory underlying the methods is also considered, both for understanding how a method works and for understanding how the error in the computation depends on the method being used. The MATLAB codes used to produce most of the figures and data tables in the text are available on the author's website and SpringerLink.
Aerodynamic analysis of Pegasus - Computations vs reality
Mendenhall, Michael R.; Lesieutre, Daniel J.; Whittaker, C. H.; Curry, Robert E.; Moulton, Bryan
1993-01-01
Pegasus, a three-stage, air-launched, winged space booster was developed to provide fast and efficient commercial launch services for small satellites. The aerodynamic design and analysis of Pegasus was conducted without benefit of wind tunnel tests using only computational aerodynamic and fluid dynamic methods. Flight test data from the first two operational flights of Pegasus are now available, and they provide an opportunity to validate the accuracy of the predicted pre-flight aerodynamic characteristics. Comparisons of measured and predicted flight characteristics are presented and discussed. Results show that the computational methods provide reasonable aerodynamic design information with acceptable margins. Post-flight analyses illustrate certain areas in which improvements are desired.
Computed image analysis of neutron radiographs
International Nuclear Information System (INIS)
Dinca, M.; Anghel, E.; Preda, M.; Pavelescu, M.
2008-01-01
Similar to X-radiography, but using neutrons as the penetrating particles, there is in practice a nondestructive technique named neutron radiology. When the information is registered on a film with the help of a conversion foil (with a high cross section for neutrons) that emits secondary radiation (β, γ) creating a latent image, the technique is named neutron radiography. A radiographic industrial film containing the image of the internal structure of an object, obtained by neutron radiography, must subsequently be analyzed to obtain qualitative and quantitative information about the structural integrity of that object. A computed analysis of a film is possible using a facility with the following main components: a film illuminator, a CCD video camera and a computer (PC) with suitable software. The qualitative analysis is intended to reveal possible anomalies of the structure due to manufacturing processes or induced by working processes (for example, irradiation in the case of nuclear fuel). The quantitative determination is based on measurements of some image parameters: dimensions and optical densities. The illuminator has been built specially for this application but can also be used for simple visual observation. The illuminated area is 9x40 cm. The frame of the system is a comparator of the Abbe Carl Zeiss Jena type, which has been adapted for this application. The video camera captures the image, which is stored and processed by the computer. A special program, SIMAG-NG, has been developed at INR Pitesti which, alongside the program SMTV II of the special acquisition module SM 5010, can analyze the images of a film. The major application of the system was the quantitative analysis of a film containing the images of some nuclear fuel pins beside a dimensional standard. The system was used to measure the lengths of the pellets of the TRIGA nuclear fuel. (authors)
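As a rough illustration of the quantitative step described above, the sketch below measures object lengths in a digitized radiograph by thresholding and labelling; the image is synthetic and the pixel calibration is an assumed value, not a parameter of the SIMAG-NG system.

```python
# Minimal sketch: measure "pellet" lengths in a thresholded radiograph image.
# The image and the mm-per-pixel calibration are illustrative assumptions only.
import numpy as np
from scipy import ndimage

image = np.zeros((50, 200))
image[20:30, 10:70] = 1.0           # synthetic pellet 1
image[20:30, 90:160] = 1.0          # synthetic pellet 2

labels, n_objects = ndimage.label(image > 0.5)
mm_per_pixel = 0.1                  # assumed calibration against a dimensional standard
for slc in ndimage.find_objects(labels):
    length_mm = (slc[1].stop - slc[1].start) * mm_per_pixel
    print(f"pellet length: {length_mm:.1f} mm")
```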
Social sciences via network analysis and computation
Kanduc, Tadej
2015-01-01
In recent years information and communication technologies have gained significant importance in the social sciences. Because there is such rapid growth of knowledge, methods and computer infrastructure, research can now seamlessly connect interdisciplinary fields such as business process management, data processing and mathematics. This study presents some of the latest results, practices and state-of-the-art approaches in network analysis, machine learning, data mining, data clustering and classifications in the contents of social sciences. It also covers various real-life examples such as t
Computer network environment planning and analysis
Dalphin, John F.
1989-01-01
The GSFC Computer Network Environment provides a broadband RF cable between campus buildings and ethernet spines in buildings for the interlinking of Local Area Networks (LANs). This system provides terminal and computer linkage among host and user systems, thereby providing E-mail services, file exchange capability, and certain distributed computing opportunities. The Environment is designed to be transparent and supports multiple protocols. Networking at Goddard has a short history and has been under coordinated control of a Network Steering Committee for slightly more than two years; network growth has been rapid, with more than 1500 nodes currently addressed and greater expansion expected. A new RF cable system with a different topology is being installed during summer 1989; consideration of a fiber optics system for the future will begin soon. Summer study was directed toward Network Steering Committee operation and planning plus consideration of Center Network Environment analysis and modeling. Biweekly Steering Committee meetings were attended to learn the background of the network and the concerns of those managing it. Suggestions for historical data gathering have been made to support future planning and modeling. Data Systems Dynamic Simulator, a simulation package developed at NASA and maintained at GSFC, was studied as a possible modeling tool for the network environment. A modeling concept based on a hierarchical model was hypothesized for further development. Such a model would allow input of newly updated parameters and would provide an estimation of the behavior of the network.
Symbolic Computing in Probabilistic and Stochastic Analysis
Directory of Open Access Journals (Sweden)
Kamiński Marcin
2015-12-01
Full Text Available The main aim is to present recent developments in applications of symbolic computing in probabilistic and stochastic analysis, and this is done using the example of the well-known MAPLE system. The key theoretical methods discussed are (i) analytical derivations, (ii) the classical Monte-Carlo simulation approach, (iii) the stochastic perturbation technique, as well as (iv) some semi-analytical approaches. It is demonstrated in particular how to engage the basic symbolic tools implemented in any such system to derive the basic equations for the stochastic perturbation technique and how to make an efficient implementation of the semi-analytical methods using the automatic differentiation and integration provided by the computer algebra program itself. The second important illustration is the probabilistic extension of the finite element and finite difference methods coded in MAPLE, showing how to solve boundary value problems with random parameters in the environment of symbolic computing. The response function method belongs to the third group, where the interfacing of classical deterministic software with the non-linear fitting numerical techniques available in various symbolic environments is displayed. We recover in this context the probabilistic structural response in engineering systems and show how to solve partial differential equations including Gaussian randomness in their coefficients.
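As a hedged illustration of the stochastic perturbation idea in a symbolic setting (SymPy standing in here for the MAPLE system the paper actually uses), the sketch below expands a simple response function about the mean of a random parameter and takes expectations term by term; the response function is purely illustrative.

```python
# Second-order stochastic perturbation sketch: E[u(b)] ~ u(mu) + 1/2 u''(mu) Var(b).
# The response u = 1/b is an illustrative stand-in for a structural response function.
import sympy as sp

b, mu, sigma2 = sp.symbols("b mu sigma2", positive=True)
u = 1 / b                                    # response as a function of the random parameter b

d2u = sp.diff(u, b, 2)                       # symbolic second derivative
expected_u = u.subs(b, mu) + sp.Rational(1, 2) * d2u.subs(b, mu) * sigma2
print(expected_u)                            # -> sigma2/mu**3 + 1/mu
```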
Analysis of a Model for Computer Virus Transmission
Directory of Open Access Journals (Sweden)
Peng Qin
2015-01-01
Full Text Available Computer viruses remain a significant threat to computer networks. In this paper, the addition of new computers to the network and the removal of old computers from the network are considered, and the computers on the network are assumed to be equipped with antivirus software. A computer virus model is established. Through analysis of the model, the disease-free and endemic equilibrium points are calculated and the stability conditions of the equilibria are derived. To illustrate the theoretical analysis, some numerical simulations are also included. The results provide a theoretical basis for controlling the spread of computer viruses.
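A minimal sketch of this kind of compartment model, not the paper's exact equations: an SIR-type computer-virus model with inflow of new computers, removal of old ones, and a cure rate due to antivirus software, integrated numerically; all rates are hypothetical.

```python
# Toy SIR-type virus model with inflow/removal of computers; parameters are assumptions.
from scipy.integrate import solve_ivp

b = 0.1      # hypothetical rate at which new computers join the network
d = 0.1      # hypothetical rate at which old computers are removed
beta = 0.5   # hypothetical infection (contact) rate
gamma = 0.3  # hypothetical cure rate due to antivirus software

def rhs(t, y):
    S, I, R = y
    N = S + I + R
    return [b * N - beta * S * I / N - d * S,
            beta * S * I / N - gamma * I - d * I,
            gamma * I - d * R]

R0 = beta / (gamma + d)   # reproduction number of this toy model; R0 < 1 -> virus-free equilibrium
sol = solve_ivp(rhs, (0, 200), [990, 10, 0])
print(f"R0 = {R0:.2f}, infected computers at t=200: {sol.y[1, -1]:.1f}")
```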
Computational methods for nuclear criticality safety analysis
International Nuclear Information System (INIS)
Maragni, M.G.
1992-01-01
Nuclear criticality safety analyses require the use of methods which have been tested and verified against benchmark results. In this work, criticality calculations based on the KENO-IV and MCNP codes are studied with the aim of qualifying these methods at IPEN-CNEN/SP and COPESP. The use of variance reduction techniques is important for reducing computer execution time, and several of them are analysed. As a practical example of the above methods, a criticality safety analysis of the storage tubes for irradiated fuel elements from the IEA-R1 research reactor has been carried out. This analysis showed that the MCNP code is more adequate for problems with complex geometries, and that the KENO-IV code gives conservative results when the generalized geometry option is not used. (author)
Computational advances in transition phase analysis
International Nuclear Information System (INIS)
Morita, K.; Kondo, S.; Tobita, Y.; Shirakawa, N.; Brear, D.J.; Fischer, E.A.
1994-01-01
In this paper, a historical perspective and recent advances in computational technologies for evaluating the transition phase of core disruptive accidents in liquid-metal fast reactors are reviewed. Analysis of the transition phase requires the treatment of multi-phase, multi-component thermohydraulics coupled with space- and energy-dependent neutron kinetics. Such a comprehensive modeling effort began when development of the SIMMER series of computer codes was initiated in the late 1970s in the USA. Successful applications of the latest SIMMER-II in the USA, western Europe and Japan have proved its effectiveness but, at the same time, several areas that require further research have been identified. Based on the experience and lessons learned during SIMMER-II applications through the 1980s, a new project to develop SIMMER-III is underway at the Power Reactor and Nuclear Fuel Development Corporation (PNC), Japan. The models and methods of SIMMER-III are briefly described, with emphasis on recent advances in multi-phase, multi-component fluid dynamics technologies and their expected implications for a future reliable transition phase analysis. (author)
Computational Analysis of Human Blood Flow
Panta, Yogendra; Marie, Hazel; Harvey, Mark
2009-11-01
Fluid flow modeling with commercially available computational fluid dynamics (CFD) software is widely used to visualize and predict physical phenomena related to various biological systems. In this presentation, a typical human aorta model was analyzed assuming laminar blood flow with compliant cardiac muscle wall boundaries. FLUENT, a commercially available finite-volume code, coupled with SolidWorks, a modeling package, was employed for the preprocessing, simulation and postprocessing of all the models. The analysis consists mainly of a fluid-dynamics analysis, including calculation of the velocity field and pressure distribution in the blood, and a mechanical analysis of the deformation of the tissue and artery in terms of wall shear stress. A number of other models, e.g. T-branches and angled geometries, were previously analyzed and their results compared for consistency under similar boundary conditions. The velocities, pressures and wall shear stress distributions obtained in all models were as expected given the similar boundary conditions. A three-dimensional, time-dependent analysis of blood flow accounting for the effect of body forces with a compliant boundary was also performed.
Computer-aided Fault Tree Analysis
International Nuclear Information System (INIS)
Willie, R.R.
1978-08-01
A computer-oriented methodology for deriving minimal cut and path set families associated with arbitrary fault trees is discussed first. Then the use of the Fault Tree Analysis Program (FTAP), an extensive FORTRAN computer package that implements the methodology, is described. An input fault tree to FTAP may specify the system state as any logical function of subsystem or component state variables or complements of these variables. When fault tree logical relations involve complements of state variables, the analyst may instruct FTAP to produce a family of prime implicants, a generalization of the minimal cut set concept. FTAP can also identify certain subsystems associated with the tree as system modules and provide a collection of minimal cut set families that essentially expresses the state of the system as a function of these module state variables. Another FTAP feature allows a subfamily to be obtained when the family of minimal cut sets or prime implicants is too large to be found in its entirety; this subfamily consists only of sets that are interesting to the analyst in a special sense.
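The minimal cut set idea itself can be illustrated independently of FTAP. The sketch below is not FTAP code and uses hypothetical event names: it expands a small AND/OR fault tree into cut sets and discards the non-minimal ones.

```python
# Expand a small AND/OR fault tree into cut sets, then keep only the minimal ones.
from itertools import product

def cut_sets(node):
    if isinstance(node, str):                        # basic event
        return [frozenset([node])]
    op, children = node[0], node[1:]
    child_sets = [cut_sets(c) for c in children]
    if op == "OR":                                   # union of the children's families
        return [cs for family in child_sets for cs in family]
    if op == "AND":                                  # one set from each child, merged
        return [frozenset().union(*combo) for combo in product(*child_sets)]
    raise ValueError(f"unknown gate {op}")

def minimal(sets):
    return [s for s in sets if not any(t < s for t in sets)]

top = ("OR",
       ("AND", "pump_A_fails", "pump_B_fails"),      # hypothetical basic events
       ("AND", "valve_stuck", "pump_A_fails"))
print(sorted(map(sorted, minimal(cut_sets(top)))))
```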
Computational System For Rapid CFD Analysis In Engineering
Barson, Steven L.; Ascoli, Edward P.; Decroix, Michelle E.; Sindir, Munir M.
1995-01-01
Computational system comprising modular hardware and software sub-systems developed to accelerate and facilitate use of techniques of computational fluid dynamics (CFD) in engineering environment. Addresses integration of all aspects of CFD analysis process, including definition of hardware surfaces, generation of computational grids, CFD flow solution, and postprocessing. Incorporates interfaces for integration of all hardware and software tools needed to perform complete CFD analysis. Includes tools for efficient definition of flow geometry, generation of computational grids, computation of flows on grids, and postprocessing of flow data. System accepts geometric input from any of three basic sources: computer-aided design (CAD), computer-aided engineering (CAE), or definition by user.
Analysis on the security of cloud computing
He, Zhonglin; He, Yuhua
2011-02-01
Cloud computing is a new technology arising from the fusion of computer technology and Internet development. It will lead a revolution in the IT and information fields. However, in cloud computing, data and application software are stored at large data centers, and the management of data and services is not completely trustworthy; the resulting security problems are the main obstacle to improving the quality of cloud services. This paper briefly introduces the concept of cloud computing. Considering the characteristics of cloud computing, it constructs a security architecture for cloud computing. At the same time, with an eye toward the security threats cloud computing faces, several corresponding strategies are provided from the perspectives of cloud computing users and service providers.
Incremental ALARA cost/benefit computer analysis
International Nuclear Information System (INIS)
Hamby, P.
1987-01-01
Commonwealth Edison Company has developed and is testing an enhanced Fortran computer program to be used for cost/benefit analysis of radiation reduction projects at its six nuclear power facilities and Corporate Technical Support Groups. This paper describes a macro-driven IBM mainframe program comprising two different types of analyses: an abbreviated program with fixed costs and base values, and an extended engineering version for a detailed, more thorough and time-consuming approach. The extended engineering version breaks radiation exposure costs down into two components: health-related costs and replacement labor costs. According to user input, the program automatically adjusts these two cost components and applies the derivation to company economic analyses such as replacement power costs, carrying charges, debt interest, and capital investment costs. The results from one or more program runs using different parameters may be compared in order to determine the most appropriate ALARA dose reduction technique. Benefits of this particular cost/benefit analysis technique include flexibility to accommodate a wide range of user data and pre-job preparation, as well as the use of proven and standardized company economic equations.
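A minimal sketch of the incremental cost/benefit comparison, with made-up dollar valuations standing in for the company-specific economic factors (replacement power, carrying charges, etc.) that the actual program applies:

```python
# Compare dose-reduction options by net benefit; all figures are illustrative assumptions.
def net_benefit(project_cost, dose_averted_person_rem,
                health_cost_per_person_rem=2000.0,        # assumed valuation, $/person-rem
                replacement_labor_per_person_rem=500.0):   # assumed valuation, $/person-rem
    avoided = dose_averted_person_rem * (health_cost_per_person_rem
                                         + replacement_labor_per_person_rem)
    return avoided - project_cost

options = [("temporary shielding", 15_000, 8.0),   # (name, cost $, person-rem averted)
           ("remote tooling", 60_000, 20.0)]
for name, cost, averted in options:
    print(f"{name}: net benefit ${net_benefit(cost, averted):,.0f}")
```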
Computing in Qualitative Analysis: A Healthy Development?
Richards, Lyn; Richards, Tom
1991-01-01
Discusses the potential impact of computers in qualitative health research. Describes the original goals, design, and implementation of NUDIST, a software package for qualitative analysis. Argues for evaluation of the impact of computer techniques and for an opening of debate among program developers and users to address the purposes and power of computing…
Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation
Stocker, John C.; Golomb, Andrew M.
2011-01-01
Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is at a disadvantage compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized, network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
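A minimal sketch of the discrete-event approach, assuming a Poisson stream of service requests handled by a fixed pool of servers as a stand-in for statically provisioned capacity; the arrival rate, service rate, and server counts are illustrative assumptions only.

```python
# Event-driven queue simulation: requests wait when all provisioned servers are busy.
import heapq, random

def simulate(n_servers, arrival_rate=5.0, service_rate=1.0, n_requests=10_000, seed=1):
    random.seed(seed)
    t, waits = 0.0, []
    free_at = [0.0] * n_servers            # times at which each server becomes free
    heapq.heapify(free_at)
    for _ in range(n_requests):
        t += random.expovariate(arrival_rate)          # next arrival
        earliest_free = heapq.heappop(free_at)
        start = max(t, earliest_free)                   # wait if all servers are busy
        waits.append(start - t)
        heapq.heappush(free_at, start + random.expovariate(service_rate))
    return sum(waits) / len(waits)

for servers in (6, 8, 12):                              # provisioned-capacity scenarios
    print(f"{servers} servers: mean wait {simulate(servers):.3f}")
```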
Can cloud computing benefit health services? - a SWOT analysis.
Kuo, Mu-Hsing; Kushniruk, Andre; Borycki, Elizabeth
2011-01-01
In this paper, we discuss cloud computing, the current state of cloud computing in healthcare, and the challenges and opportunities of adopting cloud computing in healthcare. A Strengths, Weaknesses, Opportunities and Threats (SWOT) analysis was used to evaluate the feasibility of adopting this computing model in healthcare. The paper concludes that cloud computing could have huge benefits for healthcare, but there are a number of issues that will need to be addressed before it can be widely used in healthcare.
Use of computer codes for system reliability analysis
International Nuclear Information System (INIS)
Sabek, M.; Gaafar, M.; Poucet, A.
1988-01-01
This paper gives a collective summary of the studies performed at the JRC, ISPRA on the use of computer codes for complex systems analysis. The computer codes dealt with are: the CAFTS-SALP software package, FRANTIC, FTAP, the computer code package RALLY, and the BOUNDS codes. Two reference study cases were executed with each code. The results obtained from the logic/probabilistic analysis, as well as the computation times, are compared.
Ferrofluids: Modeling, numerical analysis, and scientific computation
Tomas, Ignacio
This dissertation presents some developments in the Numerical Analysis of Partial Differential Equations (PDEs) describing the behavior of ferrofluids. The most widely accepted PDE model for ferrofluids is the Micropolar model proposed by R.E. Rosensweig. The Micropolar Navier-Stokes Equations (MNSE) are a subsystem of PDEs within the Rosensweig model. Being a simplified version of the much bigger system of PDEs proposed by Rosensweig, the MNSE are a natural starting point of this thesis. The MNSE couple linear velocity u, angular velocity w, and pressure p. We propose and analyze a first-order semi-implicit fully-discrete scheme for the MNSE, which decouples the computation of the linear and angular velocities, is unconditionally stable and delivers optimal convergence rates under assumptions analogous to those used for the Navier-Stokes equations. Moving on to Rosensweig's much more complex model, we provide a definition (approximation) for the effective magnetizing field h, and explain the assumptions behind this definition. Unlike previous definitions available in the literature, this new definition is able to accommodate the effect of external magnetic fields. Using this definition we set up the system of PDEs coupling linear velocity u, pressure p, angular velocity w, magnetization m, and magnetic potential ϕ. We show that this system is energy-stable and devise a numerical scheme that mimics the same stability property. We prove that solutions of the numerical scheme always exist and, under certain simplifying assumptions, that the discrete solutions converge. A notable outcome of the analysis of the numerical scheme for Rosensweig's model is the choice of finite element spaces that allow the construction of an energy-stable scheme. Finally, with the lessons learned from Rosensweig's model, we develop a diffuse-interface model describing the behavior of two-phase ferrofluid flows and present an energy-stable numerical scheme for this model. For a
Research in applied mathematics, numerical analysis, and computer science
1984-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.
Computational intelligence for big data analysis frontier advances and applications
Dehuri, Satchidananda; Sanyal, Sugata
2015-01-01
The work presented in this book is a combination of theoretical advancements in big data analysis and cloud computing, and their potential applications in scientific computing. The theoretical advancements are supported by illustrative examples and their applications in handling real-life problems. The applications are mostly drawn from real-life situations. The book discusses major issues pertaining to big data analysis using computational intelligence techniques, as well as some issues of cloud computing. An elaborate bibliography is provided at the end of each chapter. The material in this book includes concepts, figures, graphs, and tables to guide researchers in the area of big data analysis and cloud computing.
Computational systems analysis of dopamine metabolism.
Directory of Open Access Journals (Sweden)
Zhen Qi
2008-06-01
Full Text Available A prominent feature of Parkinson's disease (PD) is the loss of dopamine in the striatum, and many therapeutic interventions for the disease are aimed at restoring dopamine signaling. Dopamine signaling includes the synthesis, storage, release, and recycling of dopamine in the presynaptic terminal and activation of pre- and post-synaptic receptors and various downstream signaling cascades. As an aid that might facilitate our understanding of dopamine dynamics in the pathogenesis and treatment of PD, we have begun to merge currently available information and expert knowledge regarding presynaptic dopamine homeostasis into a computational model, following the guidelines of biochemical systems theory. After subjecting our model to mathematical diagnosis and analysis, we made direct comparisons between model predictions and experimental observations and found that the model exhibited a high degree of predictive capacity with respect to genetic and pharmacological changes in gene expression or function. Our results suggest potential approaches to restoring the dopamine imbalance and the associated generation of oxidative stress. While the proposed model of dopamine metabolism is preliminary, future extensions and refinements may eventually serve as an in silico platform for prescreening potential therapeutics, identifying immediate side effects, screening for biomarkers, and assessing the impact of risk factors of the disease.
Computational Analysis of Pharmacokinetic Behavior of Ampicillin
Directory of Open Access Journals (Sweden)
Mária Ďurišová
2016-07-01
Full Text Available The objective of this study was to perform a computational analysis of the pharmacokinetic behavior of ampicillin, using data from the literature. A method based on the theory of dynamic systems was used for modeling purposes. The method was introduced to pharmacokinetics with the aim of contributing to its knowledge base by providing a modeling approach that enables researchers to develop mathematical models of various pharmacokinetic processes in an identical way, using identical model structures. A few examples of successful use of the modeling method considered here in pharmacokinetics can be found in full-text articles available free of charge at the author's website, and in the example given in this study. The modeling method employed in this study can be used to develop a mathematical model of the pharmacokinetic behavior of any drug, under the condition that the pharmacokinetic behavior of the drug under study can be at least partially approximated using linear models.
Noor, Ahmed K.; Housner, Jerrold M.
1993-01-01
Recent advances in computer technology that are likely to impact structural analysis and design of flight vehicles are reviewed. A brief summary is given of the advances in microelectronics, networking technologies, and in the user-interface hardware and software. The major features of new and projected computing systems, including high performance computers, parallel processing machines, and small systems, are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed. The impact of the advances in computer technology on structural analysis and the design of flight vehicles is described. A scenario for future computing paradigms is presented, and the near-term needs in the computational structures area are outlined.
Codesign Analysis of a Computer Graphics Application
DEFF Research Database (Denmark)
Madsen, Jan; Brage, Jens P.
1996-01-01
This paper describes a codesign case study where a computer graphics application is examined with the intention to speed up its execution. The application is specified as a C program, and is characterized by the lack of a simple compute-intensive kernel. The hardware/software partitioning is based...
Architectural analysis for wirelessly powered computing platforms
Kapoor, A.; Pineda de Gyvez, J.
2013-01-01
We present a design framework for wirelessly powered generic computing platforms that takes into account various system parameters in response to a time-varying energy source. These parameters are the charging profile of the energy source, computing speed (fclk), digital supply voltage (VDD), energy
Computational Intelligence in Intelligent Data Analysis
Nürnberger, Andreas
2013-01-01
Complex systems and their phenomena are ubiquitous: they can be found in biology, finance, the humanities, management sciences, medicine, physics and similar fields. For many problems in these fields, there are no conventional ways to solve them completely, mathematically or analytically, at low cost. On the other hand, nature has already solved many optimization problems efficiently. Computational intelligence attempts to mimic nature-inspired problem-solving strategies and methods. These strategies can be used to study, model and analyze complex systems so that it becomes feasible to handle them. Key areas of computational intelligence are artificial neural networks, evolutionary computation and fuzzy systems. As one of only a few researchers in the field, Rudolf Kruse has contributed in many important ways to the understanding, modeling and application of computational intelligence methods. On the occasion of his 60th birthday, a collection of original papers of leading researchers in the field of computational intell...
Computer vision syndrome (CVS) - Thermographic Analysis
Llamosa-Rincón, L. E.; Jaime-Díaz, J. M.; Ruiz-Cardona, D. F.
2017-01-01
The use of computers has grown exponentially in recent decades; the possibility of carrying out several tasks for both professional and leisure purposes has contributed to their wide acceptance by users. The consequences and impact on visual health of uninterrupted work with computer screens or displays have attracted researchers' attention. When spending long periods of time in front of a computer screen, human eyes are subjected to considerable strain, which in turn triggers a set of symptoms known as Computer Vision Syndrome (CVS). The most common of these are blurred vision, visual fatigue and Dry Eye Syndrome (DES) due to inadequate lubrication of the ocular surface when blinking decreases. An experimental protocol was designed and implemented to perform thermographic studies of healthy human eyes during exposure to computer displays, with the main purpose of comparing the differences in temperature variation of healthy ocular surfaces.
Analysis of Computer Network Information Based on "Big Data"
Li, Tianli
2017-11-01
With current developments, computer networks and big data have gradually become part of people's lives. People use computers to make their lives more convenient, but at the same time many network information security problems demand attention. This paper analyzes the information security of computer networks based on "big data" analysis and puts forward some solutions.
Zagami, Jason
2015-01-01
Analysis of three decades of publications in Australian Educational Computing (AEC) provides insight into the historical trends in Australian educational computing, highlighting an emphasis on pedagogy, comparatively few articles on educational technologies, and strong research topic alignment with similar international journals. Analysis confirms…
Computer science: Data analysis meets quantum physics
Schramm, Steven
2017-10-01
A technique that combines machine learning and quantum computing has been used to identify the particles known as Higgs bosons. The method could find applications in many areas of science. See Letter p.375
Analysis On Security Of Cloud Computing
Directory of Open Access Journals (Sweden)
Muhammad Zunnurain Hussain
2017-01-01
Full Text Available In this paper the author discusses the security issues and challenges faced by the industry in securing cloud computing and how these problems can be tackled. Cloud computing is a modern technique for sharing resources (data sharing, file sharing, and sharing of resources in general) without building one's own infrastructure, using third-party resources to avoid large investments. It is very challenging these days to secure the communication between two users, although people use different encryption techniques.
Schottky signal analysis: tune and chromaticity computation
Chanon, Ondine
2016-01-01
Schottky monitors are used to determine important beam parameters in a non-destructive way. The Schottky signal is due to the internal statistical fluctuations of the particles inside the beam. In this report, after explaining the different components of a Schottky signal, an algorithm to compute the betatron tune is presented, followed by some ideas to compute machine chromaticity. The tests have been performed with offline and/or online LHC data.
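As a hedged illustration of the tune-estimation step only, the sketch below works on synthetic turn-by-turn data rather than real Schottky spectra: the fractional betatron tune appears as the dominant peak of the amplitude spectrum.

```python
# Estimate a fractional tune as the dominant FFT peak of synthetic, noisy data.
import numpy as np

n_turns, true_tune = 4096, 0.31                      # illustrative values
turns = np.arange(n_turns)
signal = np.cos(2 * np.pi * true_tune * turns) + 0.5 * np.random.randn(n_turns)

spectrum = np.abs(np.fft.rfft(signal * np.hanning(n_turns)))
freqs = np.fft.rfftfreq(n_turns, d=1.0)              # in units of 1/turn
estimated = freqs[np.argmax(spectrum[1:]) + 1]       # skip the DC bin
print(f"estimated fractional tune: {estimated:.4f}")
```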
Computer code MLCOSP for multiple-correlation and spectrum analysis with a hybrid computer
International Nuclear Information System (INIS)
Oguma, Ritsuo; Fujii, Yoshio; Usui, Hozumi; Watanabe, Koichi
1975-10-01
Usage of the computer code MLCOSP (Multiple Correlation and Spectrum), developed for a hybrid computer installed at JAERI, is described. Functions of the hybrid computer and its terminal devices are used ingeniously in the code to reduce the complexity of the data handling that occurs in the analysis of multivariable experimental data and to keep the analysis in perspective. Features of the code are as follows: experimental data can be fed to the digital computer through the analog part of the hybrid computer by connecting a data recorder; the computed results are displayed in figures, and hardcopies are taken when necessary; series messages to the code are shown on the terminal, so man-machine communication is possible; and, further, data can be entered through a keyboard, so case studies guided by the results of the analysis are possible. (auth.)
Computer-Assisted Linguistic Analysis of the Peshitta
Roorda, D.; Talstra, Eep; Dyk, Janet; van Keulen, Percy; Sikkel, Constantijn; Bosman, H.J.; Jenner, K.D.; Bakker, Dirk; Volkmer, J.A.; Gutman, Ariel; van Peursen, Wido Th.
2014-01-01
CALAP (Computer-Assisted Linguistic Analysis of the Peshitta) was a joint research project of the Peshitta Institute Leiden and the Werkgroep Informatica at the Vrije Universiteit Amsterdam (1999-2005). CALAP concerned the computer-assisted analysis of the Peshitta to Kings (Janet Dyk and Percy van
Run 2 analysis computing for CDF and D0
International Nuclear Information System (INIS)
Fuess, S.
1995-11-01
Two large experiments at the Fermilab Tevatron collider will use upgraded detectors for the Run 2 period of running. The associated analysis software is also expected to change, both to account for higher data rates and to embrace new computing paradigms. A discussion is given of the problems facing current and future High Energy Physics (HEP) analysis computing, and several issues are explored in detail
Frequency modulation television analysis: Threshold impulse analysis. [with computer program
Hodge, W. H.
1973-01-01
A computer program is developed to calculate the FM threshold impulse rates as a function of the carrier-to-noise ratio for a specified FM system. The system parameters and a vector of 1024 integers, representing the probability density of the modulating voltage, are required as input parameters. The computer program is utilized to calculate threshold impulse rates for twenty-four sets of measured probability data supplied by NASA and for sinusoidal and Gaussian modulating waveforms. As a result of the analysis several conclusions are drawn: (1) The use of preemphasis in an FM television system improves the threshold by reducing the impulse rate. (2) Sinusoidal modulation produces a total impulse rate which is a practical upper bound for the impulse rates of TV signals providing the same peak deviations. (3) As the moment of the FM spectrum about the center frequency of the predetection filter increases, the impulse rate tends to increase. (4) A spectrum having an expected frequency above (below) the center frequency of the predetection filter produces a higher negative (positive) than positive (negative) impulse rate.
Computational Analysis of SAXS Data Acquisition.
Dong, Hui; Kim, Jin Seob; Chirikjian, Gregory S
2015-09-01
Small-angle x-ray scattering (SAXS) is an experimental biophysical method used for gaining insight into the structure of large biomolecular complexes. Under appropriate chemical conditions, the information obtained from a SAXS experiment can be equated to the pair distribution function, which is the distribution of distances between every pair of points in the complex. Here we develop a mathematical model to calculate the pair distribution function for a structure of known density, and analyze the computational complexity of these calculations. Efficient recursive computation of this forward model is an important step in solving the inverse problem of recovering the three-dimensional density of biomolecular structures from their pair distribution functions. In particular, we show that integrals of products of three spherical-Bessel functions arise naturally in this context. We then develop an algorithm for the efficient recursive computation of these integrals.
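A minimal sketch of the forward model's central quantity: the pair distribution function of a structure of known density, here approximated by a small weighted point cloud standing in for the density on a grid (the structure is synthetic, not a real complex).

```python
# Histogram of weighted pairwise distances as an approximation to P(r).
import numpy as np

rng = np.random.default_rng(0)
points = rng.normal(size=(500, 3))                 # hypothetical "density" as a point cloud
weights = np.ones(len(points))

d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
w = weights[:, None] * weights[None, :]
iu = np.triu_indices(len(points), k=1)             # count each pair once

hist, edges = np.histogram(d[iu], bins=50, weights=w[iu], density=True)
r = 0.5 * (edges[:-1] + edges[1:])
print("P(r) peaks near r =", round(r[np.argmax(hist)], 2))
```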
Computational and Physical Analysis of Catalytic Compounds
Wu, Richard; Sohn, Jung Jae; Kyung, Richard
2015-03-01
Nanoparticles exhibit unique physical and chemical properties depending on their geometrical properties. For this reason, synthesis of nanoparticles with controlled shape and size is important in order to exploit their unique properties. Catalyst supports are usually made of high-surface-area porous oxides or carbon nanomaterials. These support materials stabilize metal catalysts against sintering at high reaction temperatures. Many studies have demonstrated large enhancements of catalytic behavior due to the role of the oxide-metal interface. In this paper, the catalyzing ability of supported nano metal oxides, such as silicon oxide and titanium oxide compounds, has been analyzed using computational chemistry methods. Computational programs such as Gamess and Chemcraft have been used in an effort to compute the efficiencies of the catalytic compounds and the bonding energy changes during the optimization convergence. The results illustrate how the metal oxides stabilize and the steps involved. The graph of energy (kcal/mol) versus computation step (N) shows that the energy of the titania converges faster, at the 7th iteration, whereas the silica converges at the 9th iteration.
Classification and Analysis of Computer Network Traffic
DEFF Research Database (Denmark)
Bujlow, Tomasz
2014-01-01
various classification modes (decision trees, rulesets, boosting, softening thresholds) regarding the classification accuracy and the time required to create the classifier. We showed how to use our VBS tool to obtain per-flow, per-application, and per-content statistics of traffic in computer networks...
Computer programs simplify optical system analysis
1965-01-01
The optical ray-trace computer program performs geometrical ray tracing. The energy-trace program calculates the relative monochromatic flux density on a specific target area. This program uses the ray-trace program as a subroutine to generate a representation of the optical system.
Analysis of airways in computed tomography
DEFF Research Database (Denmark)
Petersen, Jens
Chronic Obstructive Pulmonary Disease (COPD) is major cause of death and disability world-wide. It affects lung function through destruction of lung tissue known as emphysema and inflammation of airways, leading to thickened airway walls and narrowed airway lumen. Computed Tomography (CT) imaging...
Affect and Learning : a computational analysis
Broekens, Douwe Joost
2007-01-01
In this thesis we have studied the influence of emotion on learning. We have used computational modelling techniques to do so, more specifically, the reinforcement learning paradigm. Emotion is modelled as artificial affect, a measure that denotes the positiveness versus negativeness of a situation
Adapting computational text analysis to social science (and vice versa
Directory of Open Access Journals (Sweden)
Paul DiMaggio
2015-11-01
Full Text Available Social scientists and computer scientists are divided by small differences in perspective and not by any significant disciplinary divide. In the field of text analysis, several such differences are noted: social scientists often use unsupervised models to explore corpora, whereas many computer scientists employ supervised models trained on data; social scientists hold to more conventional causal notions than do most computer scientists, and often favor intense exploitation of existing algorithms, whereas computer scientists focus more on developing new models; and computer scientists tend to trust human judgment more than social scientists do. These differences have implications that potentially can improve the practice of social science.
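A minimal sketch contrasting the two habits described above on a toy corpus: an unsupervised topic model used to explore documents versus a supervised classifier trained on labels (the documents and labels are illustrative only).

```python
# Unsupervised exploration (LDA topics) vs. supervised classification (Naive Bayes).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.naive_bayes import MultinomialNB

docs = ["budget tax policy vote", "election campaign vote",
        "protein cell biology", "gene cell dna"]          # toy corpus
labels = ["politics", "politics", "science", "science"]

X = CountVectorizer().fit_transform(docs)
topics = LatentDirichletAllocation(n_components=2, random_state=0).fit_transform(X)
clf = MultinomialNB().fit(X, labels)

print(topics.round(2))            # document-topic proportions found without labels
print(clf.predict(X[:1]))         # label predicted by the supervised model
```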
Experience with a distributed computing system for magnetic field analysis
International Nuclear Information System (INIS)
Newman, M.J.
1978-08-01
The development of a general-purpose computer system, THESEUS, is described; its initial use has been for magnetic field analysis. The system involves several computers connected by data links. Some are small computers with interactive graphics facilities and limited analysis capabilities, and others are large computers for batch execution of analysis programs with heavy processor demands. The system is highly modular for easy extension and highly portable for transfer to different computers. It can easily be adapted for a completely different application. It provides a highly efficient and flexible interface between magnet designers and specialised analysis programs. Both the advantages and the problems experienced are highlighted, together with a mention of possible future developments. (U.K.)
Interface between computational fluid dynamics (CFD) and plant analysis computer codes
International Nuclear Information System (INIS)
Coffield, R.D.; Dunckhorst, F.F.; Tomlinson, E.T.; Welch, J.W.
1993-01-01
Computational fluid dynamics (CFD) can provide valuable input to the development of advanced plant analysis computer codes. The types of interfacing discussed in this paper will directly contribute to modeling and accuracy improvements throughout the plant system and should result in significant reduction of design conservatisms that have been applied to such analyses in the past
Computational analysis of ozonation in bubble columns
International Nuclear Information System (INIS)
Quinones-Bolanos, E.; Zhou, H.; Otten, L.
2002-01-01
This paper presents a new computational ozonation model based on the principles of computational fluid dynamics along with the kinetics of ozone decay and microbial inactivation to predict the performance of ozone disinfection in fine bubble columns. The model uses a mixture two-phase flow model to simulate the hydrodynamics of the water flow and two transport equations to track the concentration profiles of ozone and microorganisms along the height of the column, respectively. The applicability of this model was demonstrated by comparing the simulated ozone concentrations with experimental measurements obtained from a pilot-scale fine bubble column. One distinct advantage of this approach is that it does not require prior assumptions such as plug flow, perfect mixing, tanks-in-series, or uniform radial or longitudinal dispersion, and it can predict the performance of disinfection contactors without expensive and tedious tracer studies. (author)
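A minimal sketch of the kinetic ingredients only (not the two-phase CFD model): first-order ozone decay coupled to Chick-Watson microbial inactivation, with illustrative rate constants.

```python
# Ozone decay and Chick-Watson inactivation in a batch of water; constants are assumed.
import numpy as np
from scipy.integrate import solve_ivp

k_decay = 0.05    # assumed ozone decay rate, 1/min
k_inact = 0.8     # assumed inactivation rate, L/(mg.min)

def rhs(t, y):
    c, n = y                                  # ozone (mg/L), surviving microorganisms
    return [-k_decay * c, -k_inact * c * n]

sol = solve_ivp(rhs, (0.0, 10.0), [2.0, 1e6], t_eval=np.linspace(0, 10, 6))
for t, c, n in zip(sol.t, sol.y[0], sol.y[1]):
    print(f"t={t:4.1f} min  O3={c:.2f} mg/L  survivors={n:10.1f}")
```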
LHCb Distributed Data Analysis on the Computing Grid
Paterson, S; Parkes, C
2006-01-01
LHCb is one of the four Large Hadron Collider (LHC) experiments based at CERN, the European Organisation for Nuclear Research. The LHC experiments will start taking an unprecedented amount of data when they come online in 2007. Since no single institute has the compute resources to handle this data, resources must be pooled to form the Grid. Where the Internet has made it possible to share information stored on computers across the world, Grid computing aims to provide access to computing power and storage capacity on geographically distributed systems. LHCb software applications must work seamlessly on the Grid allowing users to efficiently access distributed compute resources. It is essential to the success of the LHCb experiment that physicists can access data from the detector, stored in many heterogeneous systems, to perform distributed data analysis. This thesis describes the work performed to enable distributed data analysis for the LHCb experiment on the LHC Computing Grid.
Hybrid soft computing systems for electromyographic signals analysis: a review
2014-01-01
The electromyographic (EMG) signal is a bio-signal collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With advances in artificial intelligence and soft computing, many sophisticated techniques have been proposed for these purposes. Hybrid soft computing systems (HSCS), the integration of these different techniques, aim to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural networks, support vector machines, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combination of these techniques, and their other applications in EMG analysis. PMID:24490979
Hybrid soft computing systems for electromyographic signals analysis: a review.
Xie, Hong-Bo; Guo, Tianruo; Bai, Siwei; Dokos, Socrates
2014-02-03
The electromyographic (EMG) signal is a bio-signal collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With advances in artificial intelligence and soft computing, many sophisticated techniques have been proposed for these purposes. Hybrid soft computing systems (HSCS), the integration of these different techniques, aim to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural networks, support vector machines, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combination of these techniques, and their other applications in EMG analysis.
Accident sequence analysis of human-computer interface design
International Nuclear Information System (INIS)
Fan, C.-F.; Chen, W.-H.
2000-01-01
It is important to predict potential accident sequences of human-computer interaction in a safety-critical computing system so that vulnerable points can be disclosed and removed. We address this issue by proposing a Multi-Context human-computer interaction Model along with its analysis techniques, an Augmented Fault Tree Analysis, and a Concurrent Event Tree Analysis. The proposed augmented fault tree can identify the potential weak points in software design that may induce unintended software functions or erroneous human procedures. The concurrent event tree can enumerate possible accident sequences due to these weak points
Application of microarray analysis on computer cluster and cloud platforms.
Bernau, C; Boulesteix, A-L; Knaus, J
2013-01-01
Analysis of recent high-dimensional biological data tends to be computationally intensive as many common approaches such as resampling or permutation tests require the basic statistical analysis to be repeated many times. A crucial advantage of these methods is that they can be easily parallelized due to the computational independence of the resampling or permutation iterations, which has induced many statistics departments to establish their own computer clusters. An alternative is to rent computing resources in the cloud, e.g. at Amazon Web Services. In this article we analyze whether a selection of statistical projects, recently implemented at our department, can be efficiently realized on these cloud resources. Moreover, we illustrate an opportunity to combine computer cluster and cloud resources. In order to compare the efficiency of computer cluster and cloud implementations and their respective parallelizations we use microarray analysis procedures and compare their runtimes on the different platforms. Amazon Web Services provide various instance types which meet the particular needs of the different statistical projects we analyzed in this paper. Moreover, the network capacity is sufficient and the parallelization is comparable in efficiency to standard computer cluster implementations. Our results suggest that many statistical projects can be efficiently realized on cloud resources. It is important to mention, however, that workflows can change substantially as a result of a shift from computer cluster to cloud computing.
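A minimal sketch of why such analyses parallelize so naturally: each permutation is independent, so iterations can be distributed across cluster or cloud workers. The data are synthetic, and the local process pool below merely stands in for a cluster scheduler or cloud instances.

```python
# Parallel permutation test: independent iterations are mapped onto worker processes.
import numpy as np
from multiprocessing import Pool

rng = np.random.default_rng(0)
group_a, group_b = rng.normal(0.0, 1, 50), rng.normal(0.4, 1, 50)   # synthetic expression data
observed = group_a.mean() - group_b.mean()
pooled = np.concatenate([group_a, group_b])

def one_permutation(seed):
    p = np.random.default_rng(seed).permutation(pooled)
    return p[:50].mean() - p[50:].mean()

if __name__ == "__main__":
    with Pool() as pool:                                 # stand-in for cluster/cloud workers
        null = pool.map(one_permutation, range(10_000))
    p_value = np.mean(np.abs(null) >= abs(observed))
    print(f"permutation p-value: {p_value:.4f}")
```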
Ahn, Sul-Ah; Jung, Youngim
2016-10-01
The research activities of computational physicists utilizing high-performance computing are analyzed using bibliometric approaches. This study aims to provide computational physicists utilizing high-performance computing, as well as policy planners, with useful bibliometric results for an assessment of research activities. To this end, we carried out a co-authorship network analysis of journal articles to assess the research activities of researchers in high-performance computational physics as a case study. For this study, we used journal articles from Elsevier's Scopus database covering the period 2004-2013. We ranked authors in the physics field utilizing high-performance computing by the number of papers published during the ten years from 2004. Finally, we drew the co-authorship network for the 45 top authors and their co-authors, and described some features of the co-authorship network in relation to author rank. Suggestions for further studies are discussed.
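A minimal sketch of the co-authorship network step, using a hypothetical author list per paper in place of Scopus records (requires the networkx package):

```python
# Build a weighted co-authorship graph and report simple degree statistics.
import itertools
import networkx as nx

papers = [["Kim", "Lee", "Park"], ["Kim", "Park"], ["Lee", "Choi"]]   # illustrative records

G = nx.Graph()
for authors in papers:
    for a, b in itertools.combinations(authors, 2):       # every co-author pair in a paper
        w = G.get_edge_data(a, b, {"weight": 0})["weight"]
        G.add_edge(a, b, weight=w + 1)

print("co-authors per author:", dict(G.degree()))
print("most connected author:", max(G.degree, key=lambda kv: kv[1])[0])
```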
Isogeometric analysis : a calculus for computational mechanics
Benson, D.J.; Borst, de R.; Hughes, T.J.R.; Scott, M.A.; Verhoosel, C.V.; Topping, B.H.V.; Adam, J.M.; Pallarés, F.J.; Bru, R.; Romero, M.L.
2010-01-01
The first paper on isogeometric analysis appeared only five years ago [1], and the first book appeared last year [2]. Progress has been rapid. Isogeometric analysis has been applied to a wide variety of problems in solids, fluids and fluid-structure interactions. Superior accuracy to traditional
Computer-Based Interaction Analysis with DEGREE Revisited
Barros, B.; Verdejo, M. F.
2016-01-01
We review our research with "DEGREE" and analyse how our work has impacted the collaborative learning community since 2000. Our research is framed within the context of computer-based interaction analysis and the development of computer-supported collaborative learning (CSCL) tools. We identify some aspects of our work which have been…
Isotopic analysis of plutonium by computer controlled mass spectrometry
International Nuclear Information System (INIS)
1974-01-01
Isotopic analysis of plutonium chemically purified by ion exchange is achieved using a thermal ionization mass spectrometer. Data acquisition from and control of the instrument is done automatically with a dedicated system computer in real time with subsequent automatic data reduction and reporting. Separation of isotopes is achieved by varying the ion accelerating high voltage with accurate computer control
Computer Programme for the Dynamic Analysis of Tall Regular ...
African Journals Online (AJOL)
The traditional method of dynamic analysis of tall rigid frames assumes the shear frame model. Models that allow joint rotations with/without the inclusion of the column axial loads give improved results but pose much more computational difficulty. In this work a computer program Natfrequency that determines the dynamic ...
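As a hedged illustration of the shear-frame idealization mentioned above (not the Natfrequency program itself), the sketch below assembles lumped floor masses and storey stiffnesses and obtains natural frequencies from the generalized eigenvalue problem K x = ω² M x; all numerical values are assumed.

```python
# Natural frequencies of a 3-storey shear frame from K x = w^2 M x (illustrative values).
import numpy as np
from scipy.linalg import eigh

m = np.array([2.0e5, 2.0e5, 1.5e5])            # floor masses, kg (assumed)
k = np.array([3.0e8, 2.5e8, 2.0e8])            # storey stiffnesses, N/m (assumed)

M = np.diag(m)
K = np.zeros((3, 3))
for i in range(3):
    K[i, i] += k[i]
    if i + 1 < 3:
        K[i, i] += k[i + 1]
        K[i, i + 1] = K[i + 1, i] = -k[i + 1]

w2, modes = eigh(K, M)                          # generalized eigenvalues (rad^2/s^2)
print("natural frequencies (Hz):", np.round(np.sqrt(w2) / (2 * np.pi), 2))
```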
Computer use and carpal tunnel syndrome: A meta-analysis.
Shiri, Rahman; Falah-Hassani, Kobra
2015-02-15
Studies have reported contradictory results on the role of keyboard or mouse use in carpal tunnel syndrome (CTS). This meta-analysis aimed to assess whether computer use causes CTS. Literature searches were conducted in several databases until May 2014. Twelve studies qualified for a random-effects meta-analysis. Heterogeneity and publication bias were assessed. In a meta-analysis of six studies (N=4964) that compared computer workers with the general population or other occupational populations, computer/typewriter use (pooled odds ratio (OR)=0.72, 95% confidence interval (CI) 0.58-0.90), computer/typewriter use ≥1 vs. computer/typewriter use ≥4 vs. computer/typewriter use (pooled OR=1.34, 95% CI 1.08-1.65), mouse use (OR=1.93, 95% CI 1.43-2.61), frequent computer use (OR=1.89, 95% CI 1.15-3.09), frequent mouse use (OR=1.84, 95% CI 1.18-2.87) and with years of computer work (OR=1.92, 95% CI 1.17-3.17 for long vs. short). There was no evidence of publication bias for both types of studies. Studies that compared computer workers with the general population or several occupational groups did not control their estimates for occupational risk factors. Thus, office workers with no or little computer use are a more appropriate comparison group than the general population or several occupational groups. This meta-analysis suggests that excessive computer use, particularly mouse usage might be a minor occupational risk factor for CTS. Further prospective studies among office workers with objectively assessed keyboard and mouse use, and CTS symptoms or signs confirmed by a nerve conduction study are needed. Copyright © 2014 Elsevier B.V. All rights reserved.
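For readers unfamiliar with the pooling step behind such odds ratios, the sketch below applies random-effects pooling (the DerSimonian-Laird estimator) to made-up study results; these are not the data of this meta-analysis.

```python
# Random-effects pooling of odds ratios via the DerSimonian-Laird tau^2 estimator.
import numpy as np

studies = [(1.3, 0.9, 1.9), (1.8, 1.1, 2.9), (1.1, 0.7, 1.7)]   # (OR, CI low, CI high), illustrative
y = np.log([o for o, _, _ in studies])
se = (np.log([hi for *_, hi in studies]) - np.log([lo for _, lo, _ in studies])) / (2 * 1.96)

w = 1 / se**2                                                    # fixed-effect weights
q = np.sum(w * (y - np.sum(w * y) / np.sum(w))**2)               # Cochran's Q
tau2 = max(0.0, (q - (len(y) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
w_re = 1 / (se**2 + tau2)                                        # random-effects weights
print(f"pooled OR (random effects): {np.exp(np.sum(w_re * y) / np.sum(w_re)):.2f}")
```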
Tolerance analysis through computational imaging simulations
Birch, Gabriel C.; LaCasse, Charles F.; Stubbs, Jaclynn J.; Dagel, Amber L.; Bradley, Jon
2017-11-01
The modeling and simulation of non-traditional imaging systems require holistic consideration of the end-to-end system. We demonstrate this approach through a tolerance analysis of a random scattering lensless imaging system.
Analysis and Assessment of Computer-Supported Collaborative Learning Conversations
Trausan-Matu, Stefan
2008-01-01
Trausan-Matu, S. (2008). Analysis and Assessment of Computer-Supported Collaborative Learning Conversations. Workshop presentation at the symposium Learning networks for professional. November 14, 2008, Heerlen, the Netherlands: Open Universiteit Nederland.
Surveillance Analysis Computer System (SACS) software requirements specification (SRS)
International Nuclear Information System (INIS)
Glasscock, J.A.; Flanagan, M.J.
1995-09-01
This document is the primary document establishing requirements for the Surveillance Analysis Computer System (SACS) Database, an Impact Level 3Q system. The purpose is to provide the customer and the performing organization with the requirements for the SACS Project
M. Kasemann
Overview During the past three months activities were focused on data operations, testing and reinforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...
From Digital Imaging to Computer Image Analysis of Fine Art
Stork, David G.
An expanding range of techniques from computer vision, pattern recognition, image analysis, and computer graphics are being applied to problems in the history of art. The success of these efforts is enabled by the growing corpus of high-resolution multi-spectral digital images of art (primarily paintings and drawings), sophisticated computer vision methods, and most importantly the engagement of some art scholars who bring questions that may be addressed through computer methods. This paper outlines some general problem areas and opportunities in this new inter-disciplinary research program.
Use of computer codes for system reliability analysis
International Nuclear Information System (INIS)
Sabek, M.; Gaafar, M.; Poucet, A.
1989-01-01
This paper gives a summary of studies performed at the JRC, ISPRA on the use of computer codes for complex systems analysis. The computer codes dealt with are: CAFTS-SALP software package, FRACTIC, FTAP, computer code package RALLY, and BOUNDS. Two reference case studies were executed by each code. The probabilistic results obtained, as well as the computation times are compared. The two cases studied are the auxiliary feedwater system of a 1300 MW PWR reactor and the emergency electrical power supply system. (author)
Use of computer codes for system reliability analysis
Energy Technology Data Exchange (ETDEWEB)
Sabek, M.; Gaafar, M. (Nuclear Regulatory and Safety Centre, Atomic Energy Authority, Cairo (Egypt)); Poucet, A. (Commission of the European Communities, Ispra (Italy). Joint Research Centre)
1989-01-01
This paper gives a summary of studies performed at the JRC, ISPRA on the use of computer codes for complex systems analysis. The computer codes dealt with are: CAFTS-SALP software package, FRACTIC, FTAP, computer code package RALLY, and BOUNDS. Two reference case studies were executed by each code. The probabilistic results obtained, as well as the computation times are compared. The two cases studied are the auxiliary feedwater system of a 1300 MW PWR reactor and the emergency electrical power supply system. (author).
Computer System Analysis for Decommissioning Management of Nuclear Reactor
International Nuclear Information System (INIS)
Nurokhim; Sumarbagiono
2008-01-01
Nuclear reactor decommissioning is a complex activity that should be planned and implemented carefully. A computer-based system needs to be developed to support nuclear reactor decommissioning. Some computer systems have been studied for the management of nuclear power reactors. The software systems COSMARD and DEXUS, developed in Japan, and IDMT, developed in Italy, were used as models for analysis and discussion. It can be concluded that a computer system for nuclear reactor decommissioning management is quite complex, involving computer codes for radioactive inventory database calculation, calculation modules for the stages of the decommissioning phases, and spatial data system development for virtual reality. (author)
System Matrix Analysis for Computed Tomography Imaging
Flores, Liubov; Vidal, Vicent; Verdú, Gumersindo
2015-01-01
In practical applications of computed tomography imaging (CT), it is often the case that the set of projection data is incomplete owing to the physical conditions of the data acquisition process. On the other hand, the high radiation dose imposed on patients is also undesired. These issues demand that high quality CT images can be reconstructed from limited projection data. For this reason, iterative methods of image reconstruction have become a topic of increased research interest. Several algorithms have been proposed for few-view CT. We consider that the accurate solution of the reconstruction problem also depends on the system matrix that simulates the scanning process. In this work, we analyze the application of the Siddon method to generate elements of the matrix and we present results based on real projection data. PMID:26575482
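The system matrix discussed here stores, for every ray and every pixel, the intersection length used to model the scanning process; the Siddon method computes these lengths exactly, but the sketch below only approximates them by dense sampling along each ray, purely to illustrate what a matrix element represents. The grid size, ray geometry and step count are arbitrary assumptions.

```python
import numpy as np

def ray_pixel_lengths(p0, p1, n=8, extent=1.0, samples=4000):
    """Approximate intersection lengths of the ray p0->p1 with an n x n
    pixel grid covering [0, extent]^2 by accumulating small path steps.
    (Siddon's method computes these lengths exactly; this is a naive stand-in.)"""
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    step = np.linalg.norm(p1 - p0) / samples
    t = (np.arange(samples) + 0.5) / samples
    pts = p0 + t[:, None] * (p1 - p0)           # sample points along the ray
    lengths = np.zeros((n, n))
    pix = np.floor(pts / extent * n).astype(int)
    inside = np.all((pix >= 0) & (pix < n), axis=1)
    for ix, iy in pix[inside]:
        lengths[iy, ix] += step                  # accumulate path length per pixel
    return lengths

# One row of the system matrix: a single ray crossing the grid diagonally.
row = ray_pixel_lengths(p0=(0.0, 0.1), p1=(1.0, 0.9)).ravel()
print("non-zero matrix elements in this row:", np.count_nonzero(row))
```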
Computational analysis of sequence selection mechanisms.
Meyerguz, Leonid; Grasso, Catherine; Kleinberg, Jon; Elber, Ron
2004-04-01
Mechanisms leading to gene variations are responsible for the diversity of species and are important components of the theory of evolution. One constraint on gene evolution is that of protein foldability; the three-dimensional shapes of proteins must be thermodynamically stable. We explore the impact of this constraint and calculate properties of foldable sequences using 3660 structures from the Protein Data Bank. We seek a selection function that receives sequences as input, and outputs survival probability based on sequence fitness to structure. We compute the number of sequences that match a particular protein structure with energy lower than the native sequence, the density of the number of sequences, the entropy, and the "selection" temperature. The mechanism of structure selection for sequences longer than 200 amino acids is approximately universal. For shorter sequences, it is not. We speculate on concrete evolutionary mechanisms that show this behavior.
Process for computing geometric perturbations for probabilistic analysis
Fitch, Simeon H. K. [Charlottesville, VA; Riha, David S [San Antonio, TX; Thacker, Ben H [San Antonio, TX
2012-04-10
A method for computing geometric perturbations for probabilistic analysis. The probabilistic analysis is based on finite element modeling, in which uncertainties in the modeled system are represented by changes in the nominal geometry of the model, referred to as "perturbations". These changes are accomplished using displacement vectors, which are computed for each node of a region of interest and are based on mean-value coordinate calculations.
Development of small scale cluster computer for numerical analysis
Zulkifli, N. H. N.; Sapit, A.; Mohammed, A. N.
2017-09-01
In this study, two units of personal computer were successfully networked together to form a small scale cluster. Each of the processors involved is a multicore processor with four cores, giving the cluster eight processors in total. The cluster runs an Ubuntu 14.04 Linux environment with an MPI implementation (MPICH2). Two main tests were conducted on the cluster: a communication test and a performance test. The communication test was done to make sure that the computers are able to pass the required information without any problem, using a simple MPI 'Hello' program written in C. In addition, a performance test was done to show that the cluster's calculation performance is much better than that of a single-CPU computer. In this performance test, four tests were done by running the same code using a single node, 2 processors, 4 processors, and 8 processors. The results show that with additional processors, the time required to solve the problem decreases; the calculation time is roughly halved when the number of processors is doubled. To conclude, we successfully developed a small scale cluster computer using common hardware which is capable of higher computing power when compared to a single-CPU computer, and this can be beneficial for research that requires high computing power, especially numerical analysis such as finite element analysis, computational fluid dynamics, and computational physics analysis.
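The communication test described above used a simple MPI 'Hello' program written in C; an analogous sketch in Python, assuming the mpi4py binding and an MPICH or Open MPI installation are available, is shown below and would be launched with, for example, `mpiexec -n 8 python hello_cluster.py`.

```python
# hello_cluster.py -- minimal communication check for a small MPI cluster.
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()          # this process's id: 0 .. size-1
size = comm.Get_size()          # total number of MPI processes
name = MPI.Get_processor_name() # hostname of the node running this rank

print("Hello from rank %d of %d on %s" % (rank, size, name))

# A tiny performance-flavoured step: every rank contributes a partial sum
# and rank 0 gathers the total, confirming that messages pass between nodes.
partial = sum(range(rank * 1000, (rank + 1) * 1000))
total = comm.reduce(partial, op=MPI.SUM, root=0)
if rank == 0:
    print("reduced total =", total)
```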
Data analysis through interactive computer animation method (DATICAM)
International Nuclear Information System (INIS)
Curtis, J.N.; Schwieder, D.H.
1983-01-01
DATICAM is an interactive computer animation method designed to aid in the analysis of nuclear research data. DATICAM was developed at the Idaho National Engineering Laboratory (INEL) by EG and G Idaho, Inc. INEL analysts use DATICAM to produce computer codes that are better able to predict the behavior of nuclear power reactors. In addition to increased code accuracy, DATICAM has saved manpower and computer costs. DATICAM has been generalized to assist in the data analysis of virtually any data-producing dynamic process
Computational Analysis of Spray Jet Flames
Jain, Utsav
There is a boost in the utilization of renewable sources of energy, but because of high energy density applications, combustion will never be obsolete. Spray combustion is a type of multiphase combustion which has tremendous engineering applications in different fields, varying from energy conversion devices to rocket propulsion systems. Developing accurate computational models for turbulent spray combustion is vital for improving the design of combustors and making them energy efficient. Flamelet models have been extensively used for gas phase combustion because of their relatively low computational cost to model the turbulence-chemistry interaction using a low dimensional manifold approach. This framework is designed for gas phase non-premixed combustion and its implementation is not very straightforward for multiphase and multi-regime combustion such as spray combustion. This is because of the use of a conserved scalar and various flamelet related assumptions. Mixture fraction has been popularly employed as a conserved scalar and hence used to parameterize the characteristics of gaseous flamelets. However, for spray combustion, the mixture fraction is not monotonic and does not give a unique mapping in order to parameterize the structure of spray flames. In order to develop a flamelet type model for spray flames, a new variable called the mixing variable is introduced which acts as an ideal conserved scalar and takes into account the convection and evaporation of fuel droplets. In addition to the conserved scalar, it has been observed that though gaseous flamelets can be characterized by the conserved scalar and its dissipation, this might not be true for spray flamelets. Droplet dynamics has a significant influence on the spray flamelet and because of effects such as flame penetration of droplets and oscillation of droplets across the stagnation plane, it becomes important to accommodate their influence in the flamelet formulation. In order to recognize the
Computational analysis of thresholds for magnetophosphenes
International Nuclear Information System (INIS)
Laakso, Ilkka; Hirata, Akimasa
2012-01-01
In international guidelines, basic restriction limits on the exposure of humans to low-frequency magnetic and electric fields are set with the objective of preventing the generation of phosphenes, visual sensations of flashing light not caused by light. Measured data on magnetophosphenes, i.e. phosphenes caused by a magnetically induced electric field on the retina, are available from volunteer studies. However, there is no simple way for determining the retinal threshold electric field or current density from the measured threshold magnetic flux density. In this study, the experimental field configuration of a previous study, in which phosphenes were generated in volunteers by exposing their heads to a magnetic field between the poles of an electromagnet, is computationally reproduced. The finite-element method is used for determining the induced electric field and current in five different MRI-based anatomical models of the head. The direction of the induced current density on the retina is dominantly radial to the eyeball, and the maximum induced current density is observed at the superior and inferior sides of the retina, which agrees with literature data on the location of magnetophosphenes at the periphery of the visual field. On the basis of computed data, the macroscopic retinal threshold current density for phosphenes at 20 Hz can be estimated as 10 mA m −2 (−20% to + 30%, depending on the anatomical model); this current density corresponds to an induced eddy current of 14 μA (−20% to + 10%), and about 20% of this eddy current flows through each eye. The ICNIRP basic restriction limit for the induced electric field in the case of occupational exposure is not exceeded until the magnetic flux density is about two to three times the measured threshold for magnetophosphenes, so the basic restriction limit does not seem to be conservative. However, the reasons for the non-conservativeness are purely technical: removal of the highest 1% of
Computer-automated neutron activation analysis system
International Nuclear Information System (INIS)
Minor, M.M.; Garcia, S.R.
1983-01-01
An automated delayed neutron counting and instrumental neutron activation analysis system has been developed at Los Alamos National Laboratory's Omega West Reactor (OWR) to analyze samples for uranium and 31 additional elements with a maximum throughput of 400 samples per day. 5 references
Computed tomographic analysis of urinary calculi
International Nuclear Information System (INIS)
Naito, Akira; Ito, Katsuhide; Ito, Shouko
1986-01-01
Computed tomography (CT) was employed in an effort to analyze the chemical composition of urinary calculi. Twenty-three surgically removed calculi were scanned in a water bath (in vitro study). Fourteen of them were scanned in the body (in vivo study). The calculi consisted of four types: mixed calcium oxalate and phosphate, mixed calcium carbonate and phosphate, magnesium ammonium phosphate, and uric acid. The in vitro study showed that the mean and maximum CT values of uric acid stones were significantly lower than those of the other three types of stones. This indicated that stones with less than 450 HU are composed of uric acid. In the in vivo study, CT did not help to differentiate the three types of urinary calculi, except for uric acid stones. Regarding the mean CT values, there was no correlation between the in vitro and in vivo studies. An experiment with commercially available drugs showed that CT values of urinary calculi were not dependent upon the composition, but upon the density of the calculi. (Namekawa, K.)
Analysis of computational vulnerabilities in digital repositories
Directory of Open Access Journals (Sweden)
Valdete Fernandes Belarmino
2015-04-01
Objective. Demonstrates the results of research that aimed to analyze the computational vulnerabilities of digital repositories in public universities. Argues the relevance of information in contemporary societies as an invaluable resource, emphasizing scientific information as an essential element of scientific progress. Characterizes the emergence of digital repositories and highlights their use in the academic environment to preserve, promote, disseminate and encourage scientific production. Describes the main software for the construction of digital repositories. Method. The investigation identified and analyzed the vulnerabilities to which digital repositories are exposed by running penetration tests, discriminating the levels of risk and the types of vulnerabilities. Results. From a sample of 30 repositories, 20 could be examined; it was identified that 5% of the repositories have critical vulnerabilities, 85% high, 25% medium and 100% low. Conclusions. This demonstrates the need to adopt measures for these environments that promote information security, minimizing the incidence of external and/or internal attacks on the systems.
Classification and Analysis of Computer Network Traffic
Bujlow, Tomasz
2014-01-01
Traffic monitoring and analysis can be done for multiple different reasons: to investigate the usage of network resources, assess the performance of network applications, adjust Quality of Service (QoS) policies in the network, log the traffic to comply with the law, or create realistic models of traffic for academic purposes. We define the objective of this thesis as finding a way to evaluate the performance of various applications in a high-speed Internet infrastructure. To satisfy the obje...
AIR INGRESS ANALYSIS: COMPUTATIONAL FLUID DYNAMIC MODELS
Energy Technology Data Exchange (ETDEWEB)
Chang H. Oh; Eung S. Kim; Richard Schultz; Hans Gougar; David Petti; Hyung S. Kang
2010-08-01
The Idaho National Laboratory (INL), under the auspices of the U.S. Department of Energy, is performing research and development that focuses on key phenomena important during potential scenarios that may occur in very high temperature reactors (VHTRs). Phenomena Identification and Ranking Studies to date have ranked an air ingress event, following on the heels of a VHTR depressurization, as important with regard to core safety. Consequently, the development of advanced air ingress-related models and verification and validation data are a very high priority. Following a loss of coolant and system depressurization incident, air will enter the core of the High Temperature Gas Cooled Reactor through the break, possibly causing oxidation of the in-core and reflector graphite structure. Simple core and plant models indicate that, under certain circumstances, the oxidation may proceed at an elevated rate with additional heat generated from the oxidation reaction itself. Under postulated conditions of fluid flow and temperature, excessive degradation of the lower plenum graphite can lead to a loss of structural support. Excessive oxidation of core graphite can also lead to the release of fission products into the confinement, which could be detrimental to reactor safety. The computational fluid dynamics models developed in this study will improve our understanding of this phenomenon. This paper presents two-dimensional and three-dimensional CFD results for the quantitative assessment of the air ingress phenomena. A portion of the results for the density-driven stratified flow in the inlet pipe will be compared with experimental results.
An approach to quantum-computational hydrologic inverse analysis.
O'Malley, Daniel
2018-05-02
Making predictions about flow and transport in an aquifer requires knowledge of the heterogeneous properties of the aquifer such as permeability. Computational methods for inverse analysis are commonly used to infer these properties from quantities that are more readily observable such as hydraulic head. We present a method for computational inverse analysis that utilizes a type of quantum computer called a quantum annealer. While quantum computing is in an early stage compared to classical computing, we demonstrate that it is sufficiently developed that it can be used to solve certain subsurface flow problems. We utilize a D-Wave 2X quantum annealer to solve 1D and 2D hydrologic inverse problems that, while small by modern standards, are similar in size and sometimes larger than hydrologic inverse problems that were solved with early classical computers. Our results and the rapid progress being made with quantum computing hardware indicate that the era of quantum-computational hydrology may not be too far in the future.
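A quantum annealer minimizes a quadratic unconstrained binary optimization (QUBO) objective; the sketch below shows how a toy linear inverse problem with binary unknowns can be cast as a QUBO and solved by exhaustive search. The forward operator and data are invented for illustration and are not the authors' D-Wave formulation.

```python
import itertools
import numpy as np

# Toy forward model d = G m with binary model parameters m (e.g. a coarse
# permeability indicator).  G and m_true are arbitrary illustrative values.
G = np.array([[1.0, 0.5, 0.0, 0.2],
              [0.3, 1.0, 0.4, 0.0],
              [0.0, 0.6, 1.0, 0.5]])
m_true = np.array([1, 0, 1, 1])
d = G @ m_true

# Least-squares misfit ||G m - d||^2 expanded into QUBO form m^T Q m + const:
# off-diagonal Q_ij = (G^T G)_ij, diagonal Q_ii = (G^T G)_ii - 2 (G^T d)_i.
A = G.T @ G
Q = A.copy()
Q[np.diag_indices_from(Q)] = np.diag(A) - 2.0 * (G.T @ d)

# Exhaustive search over all binary vectors (what the annealer would sample).
best = min((np.array(m) for m in itertools.product([0, 1], repeat=4)),
           key=lambda m: m @ Q @ m)
print("recovered model:", best, " true model:", m_true)
```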
Cafts: computer aided fault tree analysis
International Nuclear Information System (INIS)
Poucet, A.
1985-01-01
The fault tree technique has become a standard tool for the analysis of safety and reliability of complex systems. In spite of the costs, which may be high for a complete and detailed analysis of a complex plant, the fault tree technique is popular and its benefits are fully recognized. Computer codes have been developed to aid fault tree construction, but applications of these codes have mostly been restricted to simple academic examples and rarely concern complex, real-world systems. In this paper an interactive approach to fault tree construction is presented. The aim is not to replace the analyst, but to offer him an intelligent tool which can assist him in modeling complex systems. Using the CAFTS method, the analyst interactively constructs a fault tree in two phases: (1) In a first phase he generates an overall failure logic structure of the system, the macrofault tree. In this phase, CAFTS features an expert system approach to assist the analyst. It makes use of a knowledge base containing generic rules on the behavior of subsystems and components; (2) In a second phase the macrofault tree is further refined and transformed into a fully detailed and quantified fault tree. In this phase a library of plant-specific component failure models is used
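The two-phase construction described here ultimately yields a quantified fault tree. A minimal sketch of the quantification step, assuming independent basic events and a small AND/OR gate structure invented purely for illustration, is given below.

```python
# Minimal fault tree quantification: gates are nested tuples
# ("AND"|"OR", [children]); leaves are basic-event probabilities.
def top_event_probability(node):
    if isinstance(node, (int, float)):
        return float(node)                     # basic event
    gate, children = node
    probs = [top_event_probability(c) for c in children]
    if gate == "AND":                          # all inputs must fail
        p = 1.0
        for q in probs:
            p *= q
        return p
    if gate == "OR":                           # any failing input is enough
        p = 1.0
        for q in probs:
            p *= (1.0 - q)
        return 1.0 - p
    raise ValueError("unknown gate: %s" % gate)

# Illustrative tree: top = OR(pump failure, AND(valve A, valve B)).
tree = ("OR", [1e-3, ("AND", [5e-2, 2e-2])])
print("top event probability = %.3e" % top_event_probability(tree))
```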
I. Fisk
2011-01-01
Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...
P. McBride
The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...
Ohkubo, Hirotsugu; Nakagawa, Hiroaki; Niimi, Akio
2018-01-01
Idiopathic pulmonary fibrosis (IPF) is the most common type of progressive idiopathic interstitial pneumonia in adults. Many computer-based image analysis methods of chest computed tomography (CT) used in patients with IPF include the mean CT value of the whole lungs, density histogram analysis, density mask technique, and texture classification methods. Most of these methods offer good assessment of pulmonary functions, disease progression, and mortality. Each method has merits that can be used in clinical practice. One of the texture classification methods is reported to be superior to visual CT scoring by radiologist for correlation with pulmonary function and prediction of mortality. In this mini review, we summarize the current literature on computer-based CT image analysis of IPF and discuss its limitations and several future directions. Copyright © 2017 The Japanese Respiratory Society. Published by Elsevier B.V. All rights reserved.
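As an illustration of the density-mask idea mentioned above, the sketch below computes the mean CT value of a synthetic lung volume and the fraction of voxels falling inside an attenuation window. The -700 to 0 HU window is an assumed example threshold, not a value taken from the review, and the volume is fabricated.

```python
import numpy as np

def density_mask_stats(hu_volume, lung_mask, low=-700, high=0):
    """Mean CT value of the segmented lung and the percentage of lung voxels
    whose attenuation falls in [low, high] HU (thresholds are illustrative)."""
    lung = hu_volume[lung_mask]
    mean_ct = lung.mean()
    frac = np.mean((lung >= low) & (lung <= high)) * 100.0
    return mean_ct, frac

# Synthetic stand-in for a chest CT: mostly aerated lung around -850 HU with
# a patch of denser (fibrotic-like) voxels around -500 HU.
rng = np.random.default_rng(0)
volume = rng.normal(-850, 40, size=(32, 64, 64))
volume[:, 20:30, 20:30] = rng.normal(-500, 60, size=(32, 10, 10))
mask = np.ones(volume.shape, dtype=bool)

mean_ct, high_att_pct = density_mask_stats(volume, mask)
print("mean lung CT value: %.1f HU, high-attenuation area: %.1f%%"
      % (mean_ct, high_att_pct))
```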
Conference “Computational Analysis and Optimization” (CAO 2011)
Tiihonen, Timo; Tuovinen, Tero; Numerical Methods for Differential Equations, Optimization, and Technological Problems : Dedicated to Professor P. Neittaanmäki on His 60th Birthday
2013-01-01
This book contains the results in numerical analysis and optimization presented at the ECCOMAS thematic conference “Computational Analysis and Optimization” (CAO 2011) held in Jyväskylä, Finland, June 9–11, 2011. Both the conference and this volume are dedicated to Professor Pekka Neittaanmäki on the occasion of his sixtieth birthday. It consists of five parts that are closely related to his scientific activities and interests: Numerical Methods for Nonlinear Problems; Reliable Methods for Computer Simulation; Analysis of Noised and Uncertain Data; Optimization Methods; Mathematical Models Generated by Modern Technological Problems. The book also includes a short biography of Professor Neittaanmäki.
Computer code for qualitative analysis of gamma-ray spectra
International Nuclear Information System (INIS)
Yule, H.P.
1979-01-01
Computer code QLN1 provides complete analysis of gamma-ray spectra observed with Ge(Li) detectors and is used at both the National Bureau of Standards and the Environmental Protection Agency. It locates peaks, resolves multiplets, identifies component radioisotopes, and computes quantitative results. The qualitative-analysis (or component identification) algorithms feature thorough, self-correcting steps which provide accurate isotope identification in spite of errors in peak centroids, energy calibration, and other typical problems. The qualitative-analysis algorithm is described in this paper
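QLN1's own peak-location and multiplet-resolution algorithms are not reproduced here, but the first step, locating candidate peaks in a pulse-height spectrum, can be illustrated generically with SciPy; the synthetic spectrum and peak parameters below are arbitrary assumptions.

```python
import numpy as np
from scipy.signal import find_peaks

# Synthetic Ge(Li)-like spectrum: smooth background plus two Gaussian peaks.
channels = np.arange(2048)
background = 200.0 * np.exp(-channels / 900.0)

def gauss(c0, amp, sigma):
    return amp * np.exp(-0.5 * ((channels - c0) / sigma) ** 2)

counts = np.random.poisson(background + gauss(500, 300, 3.0) + gauss(1320, 120, 4.0))

# Locate candidate peaks; prominence guards against ripples in the background.
idx, props = find_peaks(counts, prominence=50, width=2)
for i in idx:
    print("candidate peak near channel %d (%.0f counts)" % (i, counts[i]))
```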
A single-chip computer analysis system for liquid fluorescence
International Nuclear Information System (INIS)
Zhang Yongming; Wu Ruisheng; Li Bin
1998-01-01
The single-chip computer analysis system for liquid fluorescence is an intelligent analytic instrument based on the principle that liquids containing hydrocarbons give out several characteristic fluorescences when irradiated by strong light. Besides a single-chip computer, the system makes use of the keyboard and the calculation and printing functions of a CASIO printing calculator. It combines optics, mechanics and electronics into one, and is small, light and practical, so it can be used for surface water sample analysis in oil fields and impurity analysis of other materials
A Computational Discriminability Analysis on Twin Fingerprints
Liu, Yu; Srihari, Sargur N.
Sharing similar genetic traits makes the investigation of twins an important study in forensics and biometrics. Fingerprints are one of the most commonly found types of forensic evidence. The similarity between twins' prints is critical to establishing the reliability of fingerprint identification. We present a quantitative analysis of the discriminability of twin fingerprints on a new data set (227 pairs of identical twins and fraternal twins) recently collected from a twin population using both level 1 and level 2 features. Although the patterns of minutiae among twins are more similar than in the general population, the similarity of fingerprints of twins is significantly different from that between genuine prints of the same finger. Twin fingerprints are discriminable, with a 1.5%~1.7% higher EER than non-twins, and identical twins can be distinguished by fingerprint examination with a slightly higher error rate than fraternal twins.
Content Analysis of a Computer-Based Faculty Activity Repository
Baker-Eveleth, Lori; Stone, Robert W.
2013-01-01
The research presents an analysis of faculty opinions regarding the introduction of a new computer-based faculty activity repository (FAR) in a university setting. The qualitative study employs content analysis to better understand the phenomenon underlying these faculty opinions and to augment the findings from a quantitative study. A web-based…
Computer-Aided Communication Satellite System Analysis and Optimization.
Stagl, Thomas W.; And Others
Various published computer programs for fixed/broadcast communication satellite system synthesis and optimization are discussed. The rationale for selecting General Dynamics/Convair's Satellite Telecommunication Analysis and Modeling Program (STAMP) in modified form to aid in the system costing and sensitivity analysis work in the Program on…
Tutorial: Parallel Computing of Simulation Models for Risk Analysis.
Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D
2016-10-01
Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix. © 2016 Society for Risk Analysis.
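The tutorial's complementary examples are in MATLAB and R; a comparable embarrassingly parallel sketch in Python, using the standard multiprocessing module and an invented toy risk model, would look like the following.

```python
import multiprocessing as mp
import numpy as np

def simulate_batch(args):
    """One worker's batch of Monte Carlo replications of a toy loss model."""
    seed, n = args
    rng = np.random.default_rng(seed)
    demand = rng.lognormal(mean=2.0, sigma=0.5, size=n)   # uncertain demand
    capacity = rng.normal(loc=10.0, scale=1.5, size=n)    # uncertain capacity
    return np.mean(demand > capacity)                     # failure fraction

if __name__ == "__main__":
    n_workers, n_per_worker = 4, 250_000
    jobs = [(seed, n_per_worker) for seed in range(n_workers)]
    with mp.Pool(processes=n_workers) as pool:
        fractions = pool.map(simulate_batch, jobs)        # batches run in parallel
    print("estimated failure probability: %.4f" % np.mean(fractions))
```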
Computer-Aided Qualitative Data Analysis with Word
Directory of Open Access Journals (Sweden)
Bruno Nideröst
2002-05-01
Despite some fragmentary references in the literature about qualitative methods, it is not widely known that Word can be successfully used for computer-aided Qualitative Data Analysis (QDA). Based on several standard Word operations, elementary QDA functions such as sorting data, code-and-retrieve and frequency counts can be realized. Word is particularly interesting for those users who wish to gain first experience with computer-aided analysis before investing time and money in a specialized QDA program. The well-known standard software could also be an option for those qualitative researchers who usually work with word processing but have certain reservations towards computer-aided analysis. The following article deals with the most important requirements and options of Word for computer-aided QDA. URN: urn:nbn:de:0114-fqs0202225
Computer programs for analysis of geophysical data
Energy Technology Data Exchange (ETDEWEB)
Rozhkov, M.; Nakanishi, K.
1994-06-01
This project is oriented toward the application of the mobile seismic array data analysis technique in seismic investigations of the Earth (the noise-array method). The technique falls into the class of emission tomography methods but, in contrast to classic tomography, 3-D images of the microseismic activity of the media are obtained by passive seismic antenna scanning of the half-space, rather than by solution of the inverse Radon's problem. It is reasonable to expect that areas of geothermal activity, active faults, areas of volcanic tremors and hydrocarbon deposits act as sources of intense internal microseismic activity or as effective sources for scattered (secondary) waves. The conventional approaches of seismic investigations of a geological medium include measurements of time-limited determinate signals from artificial or natural sources. However, the continuous seismic oscillations, like endogenous microseisms, coda and scattering waves, can give very important information about the structure of the Earth. The presence of microseismic sources or inhomogeneities within the Earth results in the appearance of coherent seismic components in a stochastic wave field recorded on the surface by a seismic array. By careful processing of seismic array data, these coherent components can be used to develop a 3-D model of the microseismic activity of the media or images of the noisy objects. Thus, in contrast to classic seismology where narrow windows are used to get the best time resolution of seismic signals, our model requires long record length for the best spatial resolution.
Computer programs for analysis of geophysical data
International Nuclear Information System (INIS)
Rozhkov, M.; Nakanishi, K.
1994-06-01
This project is oriented toward the application of the mobile seismic array data analysis technique in seismic investigations of the Earth (the noise-array method). The technique falls into the class of emission tomography methods but, in contrast to classic tomography, 3-D images of the microseismic activity of the media are obtained by passive seismic antenna scanning of the half-space, rather than by solution of the inverse Radon's problem. It is reasonable to expect that areas of geothermal activity, active faults, areas of volcanic tremors and hydrocarbon deposits act as sources of intense internal microseismic activity or as effective sources for scattered (secondary) waves. The conventional approaches of seismic investigations of a geological medium include measurements of time-limited determinate signals from artificial or natural sources. However, the continuous seismic oscillations, like endogenous microseisms, coda and scattering waves, can give very important information about the structure of the Earth. The presence of microseismic sources or inhomogeneities within the Earth results in the appearance of coherent seismic components in a stochastic wave field recorded on the surface by a seismic array. By careful processing of seismic array data, these coherent components can be used to develop a 3-D model of the microseismic activity of the media or images of the noisy objects. Thus, in contrast to classic seismology where narrow windows are used to get the best time resolution of seismic signals, our model requires long record length for the best spatial resolution
Introducing remarks upon the analysis of computer systems performance
International Nuclear Information System (INIS)
Baum, D.
1980-05-01
Some of the basic ideas of analytical techniques to study the behaviour of computer systems are presented. Single systems as well as networks of computers are viewed as stochastic dynamical systems which may be modelled by queueing networks. Therefore this report primarily serves as an introduction to probabilistic methods for qualitative analysis of systems. It is supplemented by an application example of Chandy's collapsing method. (orig.) [de
Computer-aided visualization and analysis system for sequence evaluation
Energy Technology Data Exchange (ETDEWEB)
Chee, Mark S.; Wang, Chunwei; Jevons, Luis C.; Bernhart, Derek H.; Lipshutz, Robert J.
2004-05-11
A computer system for analyzing nucleic acid sequences is provided. The computer system is used to perform multiple methods for determining unknown bases by analyzing the fluorescence intensities of hybridized nucleic acid probes. The results of individual experiments are improved by processing nucleic acid sequences together. Comparative analysis of multiple experiments is also provided by displaying reference sequences in one area and sample sequences in another area on a display device.
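A drastically simplified version of the base-determination step, calling each position as the base whose hybridization probe gives the strongest fluorescence intensity, is sketched below; the intensity array is fabricated for illustration, and the real system applies far more elaborate processing.

```python
import numpy as np

BASES = np.array(list("ACGT"))

def call_bases(intensities):
    """intensities: (positions x 4) fluorescence values for probes ending in
    A, C, G, T.  Returns the called sequence and a crude confidence score."""
    idx = np.argmax(intensities, axis=1)
    sorted_int = np.sort(intensities, axis=1)
    confidence = sorted_int[:, -1] / (sorted_int[:, -2] + 1e-9)  # best vs. runner-up
    return "".join(BASES[idx]), confidence

# Fabricated intensities for a 5-base stretch.
raw = np.array([[900,  80,  60,  70],
                [ 50, 820,  90,  40],
                [ 60,  70, 150, 140],    # ambiguous position
                [ 30,  40, 870,  60],
                [ 70,  50,  80, 910]])
seq, conf = call_bases(raw)
print(seq, np.round(conf, 1))
```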
Strategic Analysis of Autodesk and the Move to Cloud Computing
Kewley, Kathleen
2012-01-01
This paper provides an analysis of the opportunity for Autodesk to move its core technology to a cloud delivery model. Cloud computing offers clients a number of advantages, such as lower costs for computer hardware, increased access to technology and greater flexibility. With the IT industry embracing this transition, software companies need to plan for future change and lead with innovative solutions. Autodesk is in a unique position to capitalize on this market shift, as it is the leader i...
Computational Aspects of Dam Risk Analysis: Findings and Challenges
Directory of Open Access Journals (Sweden)
Ignacio Escuder-Bueno
2016-09-01
In recent years, risk analysis techniques have proved to be a useful tool to inform dam safety management. This paper summarizes the outcomes of three themes related to dam risk analysis discussed in the Benchmark Workshops organized by the International Commission on Large Dams Technical Committee on “Computational Aspects of Analysis and Design of Dams.” In the 2011 Benchmark Workshop, estimation of the probability of failure of a gravity dam for the sliding failure mode was discussed. Next, in 2013, the discussion focused on the computational challenges of the estimation of consequences in dam risk analysis. Finally, in 2015, the probability of sliding and overtopping in an embankment was analyzed. These Benchmark Workshops have allowed a complete review of numerical aspects for dam risk analysis, showing that risk analysis methods are a very useful tool to analyze the risk of dam systems, including downstream consequence assessments and the uncertainty of structural models.
A SURVEY ON DOCUMENT CLUSTERING APPROACH FOR COMPUTER FORENSIC ANALYSIS
Monika Raghuvanshi*, Rahul Patel
2016-01-01
In a forensic analysis, large numbers of files are examined. Much of the information is in unstructured format, so it is quite a difficult task for computer forensics to perform such analysis. Performing the forensic analysis of documents within a limited period of time therefore requires a special approach such as document clustering. This paper reviews different document clustering algorithms and methodologies, for example K-means, K-medoid, single link, complete link, and average link, in accordance...
Numeric computation and statistical data analysis on the Java platform
Chekanov, Sergei V
2016-01-01
Numerical computation, knowledge discovery and statistical data analysis integrated with powerful 2D and 3D graphics for visualization are the key topics of this book. The Python code examples powered by the Java platform can easily be transformed to other programming languages, such as Java, Groovy, Ruby and BeanShell. This book equips the reader with a computational platform which, unlike other statistical programs, is not limited by a single programming language. The author focuses on practical programming aspects and covers a broad range of topics, from basic introduction to the Python language on the Java platform (Jython), to descriptive statistics, symbolic calculations, neural networks, non-linear regression analysis and many other data-mining topics. He discusses how to find regularities in real-world data, how to classify data, and how to process data for knowledge discoveries. The code snippets are so short that they easily fit into single pages. Numeric Computation and Statistical Data Analysis ...
PIXAN: the Lucas Heights PIXE analysis computer package
International Nuclear Information System (INIS)
Clayton, E.
1986-11-01
To fully utilise the multielement capability and short measurement time of PIXE it is desirable to have an automated computer evaluation of the measured spectra. Because of the complex nature of PIXE spectra, a critical step in the analysis is the data reduction, in which the areas of characteristic peaks in the spectrum are evaluated. In this package the computer program BATTY is presented for such an analysis. The second step is to determine element concentrations, knowing the characteristic peak areas in the spectrum. This requires a knowledge of the expected X-ray yield for that element in the sample. The computer program THICK provides that information for both thick and thin PIXE samples. Together, these programs form the package PIXAN used at Lucas Heights for PIXE analysis
Automated uncertainty analysis methods in the FRAP computer codes
International Nuclear Information System (INIS)
Peck, S.O.
1980-01-01
A user oriented, automated uncertainty analysis capability has been incorporated in the Fuel Rod Analysis Program (FRAP) computer codes. The FRAP codes have been developed for the analysis of Light Water Reactor fuel rod behavior during steady state (FRAPCON) and transient (FRAP-T) conditions as part of the United States Nuclear Regulatory Commission's Water Reactor Safety Research Program. The objective of uncertainty analysis of these codes is to obtain estimates of the uncertainty in computed outputs of the codes as a function of known uncertainties in input variables. This paper presents the methods used to generate an uncertainty analysis of a large computer code, discusses the assumptions that are made, and shows techniques for testing them. An uncertainty analysis of FRAP-T calculated fuel rod behavior during a hypothetical loss-of-coolant transient is presented as an example and carried through the discussion to illustrate the various concepts
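As a generic illustration of propagating known input uncertainties to a computed output (the actual FRAP uncertainty machinery is more involved), the sketch below samples uncertain inputs of a toy cladding-temperature correlation and summarizes the resulting output distribution; the correlation and parameter ranges are invented.

```python
import numpy as np

def toy_clad_temperature(power, gap_conductance, coolant_temp):
    """Invented stand-in for a fuel-rod response: not a FRAP model."""
    return coolant_temp + power / gap_conductance * 50.0

rng = np.random.default_rng(42)
n = 100_000
power = rng.normal(1.0, 0.05, n)            # relative rod power, +/- 5%
gap_h = rng.uniform(0.8, 1.2, n)            # gap conductance multiplier
t_cool = rng.normal(560.0, 5.0, n)          # coolant temperature [K]

t_clad = toy_clad_temperature(power, gap_h, t_cool)
print("mean = %.1f K, std = %.1f K, 95th percentile = %.1f K"
      % (t_clad.mean(), t_clad.std(), np.percentile(t_clad, 95)))
```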
Conceptual design of pipe whip restraints using interactive computer analysis
International Nuclear Information System (INIS)
Rigamonti, G.; Dainora, J.
1975-01-01
Protection against pipe break effects necessitates a complex interaction between failure mode analysis, piping layout, and structural design. Many iterations are required to finalize structural designs and equipment arrangements. The magnitude of the pipe break loads transmitted by the pipe whip restraints to structural embedments precludes the application of conservative design margins. A simplified analytical formulation of the nonlinear dynamic problems associated with pipe whip has been developed and applied using interactive computer analysis techniques. In the dynamic analysis, the restraint and the associated portion of the piping system are modeled using the finite element lumped mass approach to properly reflect the dynamic characteristics of the piping/restraint system. The analysis is performed as a series of piecewise linear increments. Each of these linear increments is terminated by either formation of plastic conditions or closing/opening of gaps. The stiffness matrix is modified to reflect the changed stiffness characteristics of the system and re-started using the previous boundary conditions. The formation of yield hinges is related to the plastic moment of the section and unloading paths are automatically considered. The conceptual design of the piping/restraint system is performed using interactive computer analysis. The application of the simplified analytical approach with interactive computer analysis results in an order of magnitude reduction in engineering time and computer cost. (Auth.)
Computer aided plant engineering: An analysis and suggestions for computer use
International Nuclear Information System (INIS)
Leinemann, K.
1979-09-01
To obtain guidance on, and boundary conditions for, computer use in plant engineering, an analysis of the engineering process was carried out. The structure of plant engineering is represented by a network of subtasks and subsets of data which are to be manipulated. The main tool for integrating CAD subsystems in plant engineering should be a central database, which is described by its characteristic requirements and a possible simple conceptual schema. The main features of an interactive system for computer aided plant engineering are shortly illustrated by two examples. The analysis leads to the conclusion that an interactive graphic system for manipulation of net-like structured data, usable for various subtasks, should be the basis for computer aided plant engineering. (orig.) [de
Investigating the computer analysis of eddy current NDT data
International Nuclear Information System (INIS)
Brown, R.L.
1979-01-01
The objective of this activity was to investigate and develop techniques for computer analysis of eddy current nondestructive testing (NDT) data. A single frequency commercial eddy current tester and a precision mechanical scanner were interfaced with a PDP-11/34 computer to obtain and analyze eddy current data from samples of 316 stainless steel tubing containing known discontinuities. Among the data analysis techniques investigated were: correlation, Fast Fourier Transforms (FFT), clustering, and Adaptive Learning Networks (ALN). The results were considered encouraging. ALN, for example, correctly identified 88% of the defects and non-defects from a group of 153 signal indications
First Experiences with LHC Grid Computing and Distributed Analysis
Fisk, Ian
2010-01-01
In this presentation the experiences of the LHC experiments using grid computing were presented with a focus on experience with distributed analysis. After many years of development, preparation, exercises, and validation the LHC (Large Hadron Collider) experiments are in operations. The computing infrastructure has been heavily utilized in the first 6 months of data collection. The general experience of exploiting the grid infrastructure for organized processing and preparation is described, as well as the successes employing the infrastructure for distributed analysis. At the end the expected evolution and future plans are outlined.
Visualization and Data Analysis for High-Performance Computing
Energy Technology Data Exchange (ETDEWEB)
Sewell, Christopher Meyer [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2016-09-27
This is a set of slides from a guest lecture for a class at the University of Texas, El Paso on visualization and data analysis for high-performance computing. The topics covered are the following: trends in high-performance computing; scientific visualization, such as OpenGL, ray tracing and volume rendering, VTK, and ParaView; data science at scale, such as in-situ visualization, image databases, distributed memory parallelism, shared memory parallelism, VTK-m, "big data", and then an analysis example.
Analysis of the computed tomography in the acute abdomen
International Nuclear Information System (INIS)
Hochhegger, Bruno; Moraes, Everton; Haygert, Carlos Jesus Pereira; Antunes, Paulo Sergio Pase; Gazzoni, Fernando; Lopes, Luis Felipe Dias
2007-01-01
Introduction: This study aims to test the capacity of computed tomography to assist in the diagnosis and management of the acute abdomen. Material and method: This is a longitudinal and prospective study, in which patients with a diagnosis of acute abdomen were analyzed. A total of 105 cases of acute abdomen were obtained, and after application of the exclusion criteria 28 patients were included in the study. Results: Computed tomography changed the diagnostic hypothesis of the physicians in 50% of the cases (p 0.05); 78.57% of the patients had a surgical indication before computed tomography and 67.86% after computed tomography (p = 0.0546). An accurate diagnosis by computed tomography, when compared with the anatomopathologic examination and the final diagnosis, was observed in 82.14% of the cases (p = 0.013). When the analysis was done dividing the patients into surgical and nonsurgical groups, an accuracy of 89.28% was obtained (p 0.0001). A difference of 7.2 days of hospitalization (p = 0.003) was obtained compared with the mean for acute abdomen cases managed without computed tomography. Conclusion: Computed tomography correlates with the anatomopathological findings and has great accuracy for surgical indication; it increases the physicians' confidence, reduces hospitalization time, reduces the number of surgeries, and is cost-effective. (author)
Computational mathematics models, methods, and analysis with Matlab and MPI
White, Robert E
2004-01-01
Computational Mathematics: Models, Methods, and Analysis with MATLAB and MPI explores and illustrates this process. Each section of the first six chapters is motivated by a specific application. The author applies a model, selects a numerical method, implements computer simulations, and assesses the ensuing results. These chapters include an abundance of MATLAB code. By studying the code instead of using it as a "black box, " you take the first step toward more sophisticated numerical modeling. The last four chapters focus on multiprocessing algorithms implemented using message passing interface (MPI). These chapters include Fortran 9x codes that illustrate the basic MPI subroutines and revisit the applications of the previous chapters from a parallel implementation perspective. All of the codes are available for download from www4.ncsu.edu./~white.This book is not just about math, not just about computing, and not just about applications, but about all three--in other words, computational science. Whether us...
Convergence Analysis of a Class of Computational Intelligence Approaches
Directory of Open Access Journals (Sweden)
Junfeng Chen
2013-01-01
Computational intelligence approaches form a relatively new interdisciplinary field of research with many promising application areas. Although computational intelligence approaches have gained huge popularity, their convergence is difficult to analyze. In this paper, a computational model is built up for a class of computational intelligence approaches represented by the canonical forms of genetic algorithms, ant colony optimization, and particle swarm optimization, in order to describe the common features of these algorithms. Then two quantification indices, namely the variation rate and the progress rate, are defined to indicate the variety and the optimality of the solution sets generated in the search process of the model. Moreover, we give four types of probabilistic convergence for the solution set updating sequences, and their relations are discussed. Finally, sufficient conditions are derived for the almost sure weak convergence and the almost sure strong convergence of the model by introducing martingale theory into the Markov chain analysis.
Analysis of Biosignals During Immersion in Computer Games.
Yeo, Mina; Lim, Seokbeen; Yoon, Gilwon
2017-11-17
The number of computer game users is increasing as computers and various IT devices connected to the Internet become commonplace in all age groups. In this research, in order to find the relevance of behavioral activity and its associated biosignals, biosignal changes before, during, and after computer games were measured and analyzed for 31 subjects. For this purpose, a device to measure electrocardiogram, photoplethysmogram and skin temperature was developed such that the effect of motion artifacts could be minimized. The device was made wearable for convenient measurement. The game selected for the experiments was League of Legends™. Analysis of the pulse transit time, heart rate variability and skin temperature showed increased sympathetic nerve activity during computer gaming, while the parasympathetic nerves became less active. Interestingly, the sympathetic predominance group showed less change in heart rate variability compared to the normal group. The results can be valuable for studying internet gaming disorder.
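Heart rate variability indices of the kind analyzed here can be computed from successive R-R intervals; the sketch below derives two common time-domain measures, SDNN and RMSSD, from a fabricated interval series. It is not the authors' processing pipeline, and the baseline and gaming series are invented.

```python
import numpy as np

def hrv_time_domain(rr_ms):
    """SDNN and RMSSD from a series of R-R intervals in milliseconds."""
    rr = np.asarray(rr_ms, dtype=float)
    sdnn = rr.std(ddof=1)                         # overall variability
    rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))    # beat-to-beat variability
    return sdnn, rmssd

# Fabricated R-R intervals around 800 ms (75 beats per minute).
rng = np.random.default_rng(1)
rr_baseline = 800 + rng.normal(0, 40, 300)        # resting
rr_gaming = 750 + rng.normal(0, 25, 300)          # shorter, less variable

for label, rr in [("baseline", rr_baseline), ("gaming", rr_gaming)]:
    sdnn, rmssd = hrv_time_domain(rr)
    print("%s: SDNN = %.1f ms, RMSSD = %.1f ms" % (label, sdnn, rmssd))
```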
I. Fisk
2013-01-01
Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, using opportunistic resources like the San Diego Supercomputer Center to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...
I. Fisk
2010-01-01
Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...
SALP-PC, a computer program for fault tree analysis on personal computers
International Nuclear Information System (INIS)
Contini, S.; Poucet, A.
1987-01-01
The paper presents the main characteristics of the SALP-PC computer code for fault tree analysis. The program has been developed in Fortran 77 on an Olivetti M24 personal computer (IBM compatible) in order to reach a high degree of portability. It is composed of six processors implementing the different phases of the analysis procedure. This particular structure presents some advantages like, for instance, the restart facility and the possibility to develop an event tree analysis code. The set of allowed logical operators, i.e. AND, OR, NOT, K/N, XOR, INH, together with the possibility to define boundary conditions, make the SALP-PC code a powerful tool for risk assessment. (orig.)
Numerical methods design, analysis, and computer implementation of algorithms
Greenbaum, Anne
2012-01-01
Numerical Methods provides a clear and concise exploration of standard numerical analysis topics, as well as nontraditional ones, including mathematical modeling, Monte Carlo methods, Markov chains, and fractals. Filled with appealing examples that will motivate students, the textbook considers modern application areas, such as information retrieval and animation, and classical topics from physics and engineering. Exercises use MATLAB and promote understanding of computational results. The book gives instructors the flexibility to emphasize different aspects--design, analysis, or computer implementation--of numerical algorithms, depending on the background and interests of students. Designed for upper-division undergraduates in mathematics or computer science classes, the textbook assumes that students have prior knowledge of linear algebra and calculus, although these topics are reviewed in the text. Short discussions of the history of numerical methods are interspersed throughout the chapters. The book a...
Recent developments of the NESSUS probabilistic structural analysis computer program
Millwater, H.; Wu, Y.-T.; Torng, T.; Thacker, B.; Riha, D.; Leung, C. P.
1992-01-01
The NESSUS probabilistic structural analysis computer program combines state-of-the-art probabilistic algorithms with general purpose structural analysis methods to compute the probabilistic response and the reliability of engineering structures. Uncertainty in loading, material properties, geometry, boundary conditions and initial conditions can be simulated. The structural analysis methods include nonlinear finite element and boundary element methods. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. The scope of the code has recently been expanded to include probabilistic life and fatigue prediction of structures in terms of component and system reliability and risk analysis of structures considering cost of failure. The code is currently being extended to structural reliability considering progressive crack propagation. Several examples are presented to demonstrate the new capabilities.
Sentiment analysis and ontology engineering an environment of computational intelligence
Chen, Shyi-Ming
2016-01-01
This edited volume provides the reader with a fully updated, in-depth treatise on the emerging principles, conceptual underpinnings, algorithms and practice of Computational Intelligence in the realization of concepts and implementation of models of sentiment analysis and ontology-oriented engineering. The volume involves studies devoted to key issues of sentiment analysis, sentiment models, and ontology engineering. The book is structured into three main parts. The first part offers a comprehensive and prudently structured exposure to the fundamentals of sentiment analysis and natural language processing. The second part consists of studies devoted to the concepts, methodologies, and algorithmic developments elaborating on fuzzy linguistic aggregation to emotion analysis, carrying out interpretability of computational sentiment models, emotion classification, sentiment-oriented information retrieval, a methodology of adaptive dynamics in knowledge acquisition. The third part includes a plethora of applica...
Practical computer analysis of switch mode power supplies
Bennett, Johnny C
2006-01-01
When designing switch-mode power supplies (SMPSs), engineers need much more than simple "recipes" for analysis. Such plug-and-go instructions are not at all helpful for simulating larger and more complex circuits and systems. Offering more than merely a "cookbook," Practical Computer Analysis of Switch Mode Power Supplies provides a thorough understanding of the essential requirements for analyzing SMPS performance characteristics. It demonstrates the power of the circuit averaging technique when used with powerful computer circuit simulation programs. The book begins with SMPS fundamentals and the basics of circuit averaging models, reviewing most basic topologies and explaining all of their various modes of operation and control. The author then discusses the general analysis requirements of power supplies and how to develop the general types of SMPS models, demonstrating the use of SPICE for analysis. He examines the basic first-order analyses generally associated with SMPS performance along with more pra...
The role of the computer in automated spectral analysis
International Nuclear Information System (INIS)
Rasmussen, S.E.
This report describes how a computer can be an extremely valuable tool for routine analysis of spectra, which is a time consuming process. A number of general-purpose algorithms that are available for the various phases of the analysis can be implemented, if these algorithms are designed to cope with all the variations that may occur. Since this is basically impossible, one must find a compromise between obscure error and program complexity. This is usually possible with human interaction at critical points. In spectral analysis this is possible if the user scans the data on an interactive graphics terminal, makes the necessary changes and then returns control to the computer for completion of the analysis
Integrating computer programs for engineering analysis and design
Wilhite, A. W.; Crisp, V. K.; Johnson, S. C.
1983-01-01
The design of a third-generation system for integrating computer programs for engineering and design has been developed for the Aerospace Vehicle Interactive Design (AVID) system. This system consists of an engineering data management system, program interface software, a user interface, and a geometry system. A relational information system (ARIS) was developed specifically for the computer-aided engineering system. It is used for a repository of design data that are communicated between analysis programs, for a dictionary that describes these design data, for a directory that describes the analysis programs, and for other system functions. A method is described for interfacing independent analysis programs into a loosely-coupled design system. This method emphasizes an interactive extension of analysis techniques and manipulation of design data. Also, integrity mechanisms exist to maintain database correctness for multidisciplinary design tasks by an individual or a team of specialists. Finally, a prototype user interface program has been developed to aid in system utilization.
Integration of rocket turbine design and analysis through computer graphics
Hsu, Wayne; Boynton, Jim
1988-01-01
An interactive approach with engineering computer graphics is used to integrate the design and analysis processes of a rocket engine turbine into a progressive and iterative design procedure. The processes are interconnected through pre- and postprocessors. The graphics are used to generate the blade profiles, their stacking, finite element generation, and analysis presentation through color graphics. Steps of the design process discussed include pitch-line design, axisymmetric hub-to-tip meridional design, and quasi-three-dimensional analysis. The viscous two- and three-dimensional analysis codes are executed after acceptable designs are achieved and estimates of initial losses are confirmed.
MULGRES: a computer program for stepwise multiple regression analysis
A. Jeff Martin
1971-01-01
MULGRES is a computer program source deck that is designed for multiple regression analysis employing the technique of stepwise deletion in the search for most significant variables. The features of the program, along with inputs and outputs, are briefly described, with a note on machine compatibility.
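Since the abstract gives no algorithmic detail, the following is only a schematic Python sketch of stepwise deletion (backward elimination) for multiple regression, the general technique MULGRES implements; the drop threshold, function names and use of numpy are assumptions, not the original Fortran source deck.

    # Backward elimination: repeatedly drop the least significant predictor.
    import numpy as np

    def backward_eliminate(X, y, names, t_drop=2.0):
        """Drop the predictor with the smallest |t| until all remaining exceed t_drop."""
        keep = list(range(X.shape[1]))
        beta = None
        while keep:
            A = np.column_stack([np.ones(len(y)), X[:, keep]])   # intercept + predictors
            beta, *_ = np.linalg.lstsq(A, y, rcond=None)
            resid = y - A @ beta
            sigma2 = resid @ resid / (len(y) - A.shape[1])
            cov = sigma2 * np.linalg.inv(A.T @ A)
            t = beta[1:] / np.sqrt(np.diag(cov)[1:])              # t-statistics of predictors
            worst = int(np.argmin(np.abs(t)))
            if abs(t[worst]) >= t_drop:
                break
            keep.pop(worst)                                        # stepwise deletion
        return [names[i] for i in keep], beta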
Conversation Analysis in Computer-Assisted Language Learning
González-Lloret, Marta
2015-01-01
The use of Conversation Analysis (CA) in the study of technology-mediated interactions is a recent methodological addition to qualitative research in the field of Computer-assisted Language Learning (CALL). The expansion of CA in Second Language Acquisition research, coupled with the need for qualitative techniques to explore how people interact…
Computational content analysis of European Central Bank statements
Milea, D.V.; Almeida, R.J.; Sharef, N.M.; Kaymak, U.; Frasincar, F.
2012-01-01
In this paper we present a framework for the computational content analysis of European Central Bank (ECB) statements. Based on this framework, we provide two approaches that can be used in a practical context. Both approaches use the content of ECB statements to predict upward and downward movement
Componential analysis of kinship terminology a computational perspective
Pericliev, V
2013-01-01
This book presents the first computer program automating the task of componential analysis of kinship vocabularies. The book examines the program in relation to two basic problems: the commonly occurring inconsistency of componential models; and the huge number of alternative componential models.
HAMOC: a computer program for fluid hammer analysis
International Nuclear Information System (INIS)
Johnson, H.G.
1975-12-01
A computer program has been developed for fluid hammer analysis of piping systems attached to a vessel which has undergone a known rapid pressure transient. The program is based on the characteristics method for solution of the partial differential equations of motion and continuity. Column separation logic is included for situations in which pressures fall to saturation values
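For readers unfamiliar with the characteristics method mentioned above, the standard textbook form of the waterhammer compatibility equations (generic notation, not necessarily HAMOC's) is:

C^{+}:\quad \frac{dV}{dt} + \frac{1}{\rho a}\frac{dp}{dt} + \frac{f\,V\lvert V\rvert}{2D} = 0 \quad\text{along}\quad \frac{dx}{dt} = +a,
\qquad
C^{-}:\quad \frac{dV}{dt} - \frac{1}{\rho a}\frac{dp}{dt} + \frac{f\,V\lvert V\rvert}{2D} = 0 \quad\text{along}\quad \frac{dx}{dt} = -a,

where V is the velocity, p the pressure, a the pressure-wave speed, ρ the fluid density, f the friction factor and D the pipe diameter; integrating these ordinary differential equations along the characteristic lines gives the marching scheme that such programs use.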
Model Uncertainty and Robustness: A Computational Framework for Multimodel Analysis
Young, Cristobal; Holsteen, Katherine
2017-01-01
Model uncertainty is pervasive in social science. A key question is how robust empirical results are to sensible changes in model specification. We present a new approach and applied statistical software for computational multimodel analysis. Our approach proceeds in two steps: First, we estimate the modeling distribution of estimates across all…
Informational-computer system for the neutron spectra analysis
International Nuclear Information System (INIS)
Berzonis, M.A.; Bondars, H.Ya.; Lapenas, A.A.
1979-01-01
In this article basic principles of the build-up of the informational-computer system for the neutron spectra analysis on a basis of measured reaction rates are given. The basic data files of the system, needed software and hardware for the system operation are described
A Computer Program for Short Circuit Analysis of Electric Power ...
African Journals Online (AJOL)
The Short Circuit Analysis Program (SCAP) is to be used to assess the composite effects of unbalanced and balanced faults on the overall reliability of electric power system. The program uses the symmetrical components method to compute all phase and sequence quantities for any bus or branch of a given power network ...
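As a minimal sketch of the symmetrical components method named in the abstract (illustrative Python only, not the SCAP program itself), the phase-to-sequence transformation can be written as:

    import numpy as np

    a = np.exp(2j * np.pi / 3)                     # 120 degree rotation operator
    A = np.array([[1, 1,    1],
                  [1, a**2, a],
                  [1, a,    a**2]])                # sequence -> phase transformation

    def phase_to_sequence(v_abc):
        """Return zero, positive and negative sequence components of phase quantities."""
        return np.linalg.inv(A) @ np.asarray(v_abc)

    # Example: a balanced set has only a positive-sequence component.
    v_abc = [1.0, a**2, a]                         # Va, Vb, Vc of a balanced system (per unit)
    print(np.round(phase_to_sequence(v_abc), 6))   # ~ [0, 1, 0]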
Connecting Performance Analysis and Visualization to Advance Extreme Scale Computing
Energy Technology Data Exchange (ETDEWEB)
Bremer, Peer-Timo [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Mohr, Bernd [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Schulz, Martin [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Pascucci, Valerio [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Gamblin, Todd [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Brunst, Holger [Dresden Univ. of Technology (Germany)
2015-07-29
The characterization, modeling, analysis, and tuning of software performance has been a central topic in High Performance Computing (HPC) since its early beginnings. The overall goal is to make HPC software run faster on particular hardware, either through better scheduling, on-node resource utilization, or more efficient distributed communication.
COALA--A Computational System for Interlanguage Analysis.
Pienemann, Manfred
1992-01-01
Describes a linguistic analysis computational system that responds to highly complex queries about morphosyntactic and semantic structures contained in large sets of language acquisition data by identifying, displaying, and analyzing sentences that meet the defined linguistic criteria. (30 references) (Author/CB)
Computer system for environmental sample analysis and data storage and analysis
International Nuclear Information System (INIS)
Brauer, F.P.; Fager, J.E.
1976-01-01
A mini-computer based environmental sample analysis and data storage system has been developed. The system is used for analytical data acquisition, computation, storage of analytical results, and tabulation of selected or derived results for data analysis, interpretation and reporting. This paper discusses the structure, performance and applications of the system
Building a Prototype of LHC Analysis Oriented Computing Centers
Bagliesi, G.; Boccali, T.; Della Ricca, G.; Donvito, G.; Paganoni, M.
2012-12-01
A Consortium between four LHC Computing Centers (Bari, Milano, Pisa and Trieste) has been formed in 2010 to prototype Analysis-oriented facilities for CMS data analysis, profiting from a grant from the Italian Ministry of Research. The Consortium aims to realize an ad-hoc infrastructure to ease the analysis activities on the huge data set collected at the LHC Collider. While “Tier2” Computing Centres, specialized in organized processing tasks like Monte Carlo simulation, are nowadays a well established concept, with years of running experience, sites specialized towards end-user chaotic analysis activities do not yet have a de facto standard implementation. In our effort, we focus on all the aspects that can make the analysis tasks easier for a physics user not expert in computing. On the storage side, we are experimenting with storage techniques allowing for remote data access and with storage optimization for the typical analysis access patterns. On the networking side, we are studying the differences between flat and tiered LAN architecture, also using virtual partitioning of the same physical network for the different use patterns. Finally, on the user side, we are developing tools and instruments to allow for an exhaustive monitoring of their processes at the site, and for an efficient support system in case of problems. We will report on the results of the tests executed on the different subsystems and give a description of the layout of the infrastructure in place at the sites participating in the consortium.
Building a Prototype of LHC Analysis Oriented Computing Centers
International Nuclear Information System (INIS)
Bagliesi, G; Boccali, T; Della Ricca, G; Donvito, G; Paganoni, M
2012-01-01
A Consortium between four LHC Computing Centers (Bari, Milano, Pisa and Trieste) has been formed in 2010 to prototype Analysis-oriented facilities for CMS data analysis, profiting from a grant from the Italian Ministry of Research. The Consortium aims to realize an ad-hoc infrastructure to ease the analysis activities on the huge data set collected at the LHC Collider. While “Tier2” Computing Centres, specialized in organized processing tasks like Monte Carlo simulation, are nowadays a well established concept, with years of running experience, sites specialized towards end-user chaotic analysis activities do not yet have a de facto standard implementation. In our effort, we focus on all the aspects that can make the analysis tasks easier for a physics user not expert in computing. On the storage side, we are experimenting with storage techniques allowing for remote data access and with storage optimization for the typical analysis access patterns. On the networking side, we are studying the differences between flat and tiered LAN architecture, also using virtual partitioning of the same physical network for the different use patterns. Finally, on the user side, we are developing tools and instruments to allow for an exhaustive monitoring of their processes at the site, and for an efficient support system in case of problems. We will report on the results of the tests executed on the different subsystems and give a description of the layout of the infrastructure in place at the sites participating in the consortium.
P. McBride
It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...
I. Fisk
2012-01-01
Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...
Benchmarking undedicated cloud computing providers for analysis of genomic datasets.
Yazar, Seyhan; Gooden, George E C; Mackey, David A; Hewitt, Alex W
2014-01-01
A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (E.coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5-78.2) for E.coli and 53.5% (95% CI: 34.4-72.6) for human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5-303.1) and 173.9% (95% CI: 134.6-213.1) more expensive for E.coli and human assemblies respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present available ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE.
A Computational Analysis Model for Open-ended Cognitions
Morita, Junya; Miwa, Kazuhisa
In this paper, we propose a novel usage for computational cognitive models. In cognitive science, computational models have played a critical role of theories for human cognitions. Many computational models have simulated results of controlled psychological experiments successfully. However, there have been only a few attempts to apply the models to complex realistic phenomena. We call such a situation ``open-ended situation''. In this study, MAC/FAC (``many are called, but few are chosen''), proposed by [Forbus 95], that models two stages of analogical reasoning was applied to our open-ended psychological experiment. In our experiment, subjects were presented a cue story, and retrieved cases that had been learned in their everyday life. Following this, they rated inferential soundness (goodness as analogy) of each retrieved case. For each retrieved case, we computed two kinds of similarity scores (content vectors/structural evaluation scores) using the algorithms of the MAC/FAC. As a result, the computed content vectors explained the overall retrieval of cases well, whereas the structural evaluation scores had a strong relation to the rated scores. These results support the MAC/FAC's theoretical assumption - different similarities are involved on the two stages of analogical reasoning. Our study is an attempt to use a computational model as an analysis device for open-ended human cognitions.
Benchmarking undedicated cloud computing providers for analysis of genomic datasets.
Directory of Open Access Journals (Sweden)
Seyhan Yazar
Full Text Available A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (E.coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5-78.2) for E.coli and 53.5% (95% CI: 34.4-72.6) for human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5-303.1) and 173.9% (95% CI: 134.6-213.1) more expensive for E.coli and human assemblies respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present available ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE.
Improved Flow Modeling in Transient Reactor Safety Analysis Computer Codes
International Nuclear Information System (INIS)
Holowach, M.J.; Hochreiter, L.E.; Cheung, F.B.
2002-01-01
A method of accounting for fluid-to-fluid shear in between calculational cells over a wide range of flow conditions envisioned in reactor safety studies has been developed such that it may be easily implemented into a computer code such as COBRA-TF for more detailed subchannel analysis. At a given nodal height in the calculational model, equivalent hydraulic diameters are determined for each specific calculational cell using either laminar or turbulent velocity profiles. The velocity profile may be determined from a separate CFD (Computational Fluid Dynamics) analysis, experimental data, or existing semi-empirical relationships. The equivalent hydraulic diameter is then applied to the wall drag force calculation so as to determine the appropriate equivalent fluid-to-fluid shear caused by the wall for each cell based on the input velocity profile. This means of assigning the shear to a specific cell is independent of the actual wetted perimeter and flow area for the calculational cell. The use of this equivalent hydraulic diameter for each cell within a calculational subchannel results in a representative velocity profile which can further increase the accuracy and detail of heat transfer and fluid flow modeling within the subchannel when utilizing a thermal hydraulics systems analysis computer code such as COBRA-TF. Utilizing COBRA-TF with the flow modeling enhancement results in increased accuracy for a coarse-mesh model without the significantly greater computational and time requirements of a full-scale 3D (three-dimensional) transient CFD calculation. (authors)
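For context, the quantity being generalized above is the classical hydraulic diameter; a tiny, purely illustrative Python helper is shown below, while the profile-based equivalent diameter of the cited work is specific to that paper and is not reproduced here.

    def hydraulic_diameter(flow_area, wetted_perimeter):
        """Classical definition D_h = 4*A / P_w used in subchannel thermal-hydraulics."""
        return 4.0 * flow_area / wetted_perimeter

    # Example: a 10 cm x 10 cm square duct gives D_h equal to the side length.
    print(hydraulic_diameter(0.10 * 0.10, 4 * 0.10))   # 0.1 m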
Available computer codes and data for radiation transport analysis
International Nuclear Information System (INIS)
Trubey, D.K.; Maskewitz, B.F.; Roussin, R.W.
1975-01-01
The Radiation Shielding Information Center (RSIC), sponsored and supported by the Energy Research and Development Administration (ERDA) and the Defense Nuclear Agency (DNA), is a technical institute serving the radiation transport and shielding community. It acquires, selects, stores, retrieves, evaluates, analyzes, synthesizes, and disseminates information on shielding and ionizing radiation transport. The major activities include: (1) operating a computer-based information system and answering inquiries on radiation analysis, (2) collecting, checking out, packaging, and distributing large computer codes, and evaluated and processed data libraries. The data packages include multigroup coupled neutron-gamma-ray cross sections and kerma coefficients, other nuclear data, and radiation transport benchmark problem results
Computational analysis in support of the SSTO flowpath test
Duncan, Beverly S.; Trefny, Charles J.
1994-10-01
A synergistic approach of combining computational methods and experimental measurements is used in the analysis of a hypersonic inlet. There are four major focal points within this study which examine the boundary layer growth on a compression ramp upstream of the cowl lip of a scramjet inlet. Initially, the boundary layer growth on the NASP Concept Demonstrator Engine (CDE) is examined. The follow-up study determines the optimum diverter height required by the SSTO Flowpath test to best duplicate the CDE results. These flow field computations are then compared to the experimental measurements and the mass average Mach number is determined for this inlet.
Automated procedure for performing computer security risk analysis
International Nuclear Information System (INIS)
Smith, S.T.; Lim, J.J.
1984-05-01
Computers, the invisible backbone of nuclear safeguards, monitor and control plant operations and support many materials accounting systems. Our automated procedure to assess computer security effectiveness differs from traditional risk analysis methods. The system is modeled as an interactive questionnaire, fully automated on a portable microcomputer. A set of modular event trees links the questionnaire to the risk assessment. Qualitative scores are obtained for target vulnerability, and qualitative impact measures are evaluated for a spectrum of threat-target pairs. These are then combined by a linguistic algebra to provide an accurate and meaningful risk measure. 12 references, 7 figures
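Purely as an illustration of combining qualitative scores (the actual questionnaire, event trees and linguistic algebra of the cited procedure are not public here), a lookup-table combination might look like this in Python:

    # Hypothetical qualitative risk matrix; the levels and mapping are illustrative only.
    RISK_MATRIX = {
        ("low", "low"): "low",       ("low", "medium"): "low",       ("low", "high"): "medium",
        ("medium", "low"): "low",    ("medium", "medium"): "medium", ("medium", "high"): "high",
        ("high", "low"): "medium",   ("high", "medium"): "high",     ("high", "high"): "high",
    }

    def combine(target_vulnerability, threat_impact):
        """Map a qualitative (vulnerability, impact) pair to a qualitative risk level."""
        return RISK_MATRIX[(target_vulnerability, threat_impact)]

    print(combine("medium", "high"))   # -> "high"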
Computation system for nuclear reactor core analysis. [LMFBR
Energy Technology Data Exchange (ETDEWEB)
Vondy, D.R.; Fowler, T.B.; Cunningham, G.W.; Petrie, L.M.
1977-04-01
This report documents a system which contains computer codes as modules developed to evaluate nuclear reactor core performance. The diffusion theory approximation to neutron transport may be applied with the VENTURE code treating up to three dimensions. The effect of exposure may be determined with the BURNER code, allowing depletion calculations to be made. The features and requirements of the system are discussed, as are aspects common to the computational modules; the modules themselves are documented elsewhere. User input data requirements, data file management, control, and the modules which perform general functions are described. Continuing development and implementation effort is enhancing the analysis capability available locally and to other installations from remote terminals.
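For orientation, the diffusion theory approximation mentioned above is, in its one-group steady-state form (standard textbook notation, not the code's multigroup discretization):

-\nabla \cdot D(\mathbf{r})\,\nabla \phi(\mathbf{r}) + \Sigma_a(\mathbf{r})\,\phi(\mathbf{r}) = \frac{1}{k_{\mathrm{eff}}}\,\nu\Sigma_f(\mathbf{r})\,\phi(\mathbf{r}),

where φ is the scalar neutron flux, D the diffusion coefficient, Σ_a the absorption cross section, νΣ_f the fission neutron production cross section, and k_eff the effective multiplication factor sought as an eigenvalue.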
M. Kasemann, P. McBride. Edited by M-C. Sawley, with contributions from: P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini, M-C. Sawley
Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...
Application of computer aided tolerance analysis in product design
International Nuclear Information System (INIS)
Du Hua
2009-01-01
This paper introduces the shortcomings of the traditional tolerance design method and the strengths of the computer-aided tolerancing (CAT) method, compares the strengths and weaknesses of three tolerance analysis methods, namely Worst Case Analysis, Statistical Analysis and Monte-Carlo Simulation Analysis, and presents the basic procedure and relevant details for CAT. As study objects, the reactor pressure vessel, the core barrel, the hold-down barrel and the support plate are used to build the tolerance simulation model, based on their 3D design models. The tolerance simulation analysis is then conducted and the scheme of the tolerance distribution is optimized based on the analysis results. (authors)
I. Fisk
2011-01-01
Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...
Computational image analysis of Suspension Plasma Sprayed YSZ coatings
Directory of Open Access Journals (Sweden)
Michalak Monika
2017-01-01
Full Text Available The paper presents the computational studies of microstructure- and topography-related features of suspension plasma sprayed (SPS) coatings of yttria-stabilized zirconia (YSZ). The study mainly covers the porosity assessment, provided by ImageJ software analysis. The influence of boundary conditions, defined by: (i) circularity and (ii) size limits, on the computed values of porosity is also investigated. Additionally, the digital topography evaluation is performed: a confocal laser scanning microscope (CLSM) and a scanning electron microscope (SEM) operating in Shape from Shading (SFS) mode measure the surface roughness of the deposited coatings. Computed values of porosity and roughness are referred to the variables of the spraying process, which influence the morphology of the coatings and determine the possible fields of their applications.
A comparative analysis of soft computing techniques for gene prediction.
Goel, Neelam; Singh, Shailendra; Aseri, Trilok Chand
2013-07-01
The rapid growth of genomic sequence data for both human and nonhuman species has made analyzing these sequences, especially predicting genes in them, very important and is currently the focus of many research efforts. Beside its scientific interest in the molecular biology and genomics community, gene prediction is of considerable importance in human health and medicine. A variety of gene prediction techniques have been developed for eukaryotes over the past few years. This article reviews and analyzes the application of certain soft computing techniques in gene prediction. First, the problem of gene prediction and its challenges are described. These are followed by different soft computing techniques along with their application to gene prediction. In addition, a comparative analysis of different soft computing techniques for gene prediction is given. Finally some limitations of the current research activities and future research directions are provided. Copyright © 2013 Elsevier Inc. All rights reserved.
Global sensitivity analysis of computer models with functional inputs
International Nuclear Information System (INIS)
Iooss, Bertrand; Ribatet, Mathieu
2009-01-01
Global sensitivity analysis is used to quantify the influence of uncertain model inputs on the response variability of a numerical model. The common quantitative methods are appropriate for computer codes having scalar model inputs. This paper aims at illustrating different variance-based sensitivity analysis techniques, based on the so-called Sobol's indices, when some model inputs are functional, such as stochastic processes or random spatial fields. In this work, we focus on computer codes with large CPU time which need a preliminary metamodeling step before performing the sensitivity analysis. We propose the use of the joint modeling approach, i.e., modeling simultaneously the mean and the dispersion of the code outputs using two interlinked generalized linear models (GLMs) or generalized additive models (GAMs). The 'mean model' allows the sensitivity indices of each scalar model input to be estimated, while the 'dispersion model' allows the total sensitivity index of the functional model inputs to be derived. The proposed approach is compared to some classical sensitivity analysis methodologies on an analytical function. Lastly, the new methodology is applied to an industrial computer code that simulates the nuclear fuel irradiation.
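As a reminder of what a first-order Sobol' index measures, here is a hedged Python sketch using the standard pick-freeze Monte Carlo estimator for scalar inputs; the joint GLM/GAM metamodelling described above is specific to the cited paper and is not reproduced here.

    import numpy as np

    def first_order_sobol(model, d, i, n=100_000, seed=0):
        """Monte Carlo estimate of S_i = Var(E[Y|X_i]) / Var(Y) for independent U(0,1) inputs."""
        rng = np.random.default_rng(seed)
        A, B = rng.random((n, d)), rng.random((n, d))
        AB_i = A.copy()
        AB_i[:, i] = B[:, i]                        # resample only the i-th input
        yA, yB, yAB = model(A), model(B), model(AB_i)
        var_y = np.var(np.concatenate([yA, yB]))
        return np.mean(yB * (yAB - yA)) / var_y     # Saltelli-type first-order estimator

    # Simple additive test function y = x0 + 2*x1 with analytic indices S_0 = 1/5, S_1 = 4/5.
    f = lambda X: X[:, 0] + 2.0 * X[:, 1]
    print(round(first_order_sobol(f, d=2, i=1), 2))  # approximately 0.8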
A Research Roadmap for Computation-Based Human Reliability Analysis
Energy Technology Data Exchange (ETDEWEB)
Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
2015-08-01
The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.
A Research Roadmap for Computation-Based Human Reliability Analysis
International Nuclear Information System (INIS)
Boring, Ronald; Mandelli, Diego; Joe, Jeffrey; Smith, Curtis; Groth, Katrina
2015-01-01
The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.
Visual Analysis of Cloud Computing Performance Using Behavioral Lines.
Muelder, Chris; Zhu, Biao; Chen, Wei; Zhang, Hongxin; Ma, Kwan-Liu
2016-02-29
Cloud computing is an essential technology to Big Data analytics and services. A cloud computing system is often comprised of a large number of parallel computing and storage devices. Monitoring the usage and performance of such a system is important for efficient operations, maintenance, and security. Tracing every application on a large cloud system is untenable due to scale and privacy issues. But profile data can be collected relatively efficiently by regularly sampling the state of the system, including properties such as CPU load, memory usage, network usage, and others, creating a set of multivariate time series for each system. Adequate tools for studying such large-scale, multidimensional data are lacking. In this paper, we present a visual based analysis approach to understanding and analyzing the performance and behavior of cloud computing systems. Our design is based on similarity measures and a layout method to portray the behavior of each compute node over time. When visualizing a large number of behavioral lines together, distinct patterns often appear suggesting particular types of performance bottleneck. The resulting system provides multiple linked views, which allow the user to interactively explore the data by examining the data or a selected subset at different levels of detail. Our case studies, which use datasets collected from two different cloud systems, show that this visual based approach is effective in identifying trends and anomalies of the systems.
Ubiquitous computing in sports: A review and analysis.
Baca, Arnold; Dabnichki, Peter; Heller, Mario; Kornfeind, Philipp
2009-10-01
Ubiquitous (pervasive) computing is a term for a synergetic use of sensing, communication and computing. Pervasive use of computing has seen a rapid increase in the current decade. This development has propagated in applied sport science and everyday life. The work presents a survey of recent developments in sport and leisure with emphasis on technology and computational techniques. A detailed analysis on new technological developments is performed. Sensors for position and motion detection, and such for equipment and physiological monitoring are discussed. Aspects of novel trends in communication technologies and data processing are outlined. Computational advancements have started a new trend - development of smart and intelligent systems for a wide range of applications - from model-based posture recognition to context awareness algorithms for nutrition monitoring. Examples particular to coaching and training are discussed. Selected tools for monitoring rules' compliance and automatic decision-making are outlined. Finally, applications in leisure and entertainment are presented, from systems supporting physical activity to systems providing motivation. It is concluded that the emphasis in future will shift from technologies to intelligent systems that allow for enhanced social interaction as efforts need to be made to improve user-friendliness and standardisation of measurement and transmission protocols.
Gas analysis by computer-controlled microwave rotational spectrometry
International Nuclear Information System (INIS)
Hrubesh, L.W.
1978-01-01
Microwave rotational spectrometry has inherently high resolution and is thus nearly ideal for qualitative gas mixture analysis. Quantitative gas analysis is also possible by a simplified method which utilizes the ease with which molecular rotational transitions can be saturated at low microwave power densities. This article describes a computer-controlled microwave spectrometer which is used to demonstrate for the first time a totally automated analysis of a complex gas mixture. Examples are shown for a complete qualitative and quantitative analysis, in which a search of over 100 different compounds is made in less than 7 min, with sensitivity for most compounds in the 10 to 100 ppm range. This technique is expected to find increased use in view of the reduced complexity and increased reliability of microwave spectrometers and because of new energy-related applications for analysis of mixtures of small molecules
Thermohydraulic analysis of nuclear power plant accidents by computer codes
International Nuclear Information System (INIS)
Petelin, S.; Stritar, A.; Istenic, R.; Gregoric, M.; Jerele, A.; Mavko, B.
1982-01-01
RELAP4/MOD6, BRUCH-D-06, CONTEMPT-LT-28, RELAP5/MOD1 and COBRA-4-1 codes were successfully implemented on the CYBER 172 computer in Ljubljana. Input models of NPP Krsko for the first three codes were prepared. Because of the high computer cost, only one analysis of a double-ended guillotine break of the cold leg of NPP Krsko has been performed with the RELAP4 code. The BRUCH code is easier and cheaper to use, and several analyses have been carried out with it. A sensitivity study was performed with CONTEMPT-LT-28 for a double-ended pump suction break. These codes are intended to be used as a basis for independent safety analyses. (author)
Computational Analysis on Performance of Thermal Energy Storage (TES) Diffuser
Adib, M. A. H. M.; Adnan, F.; Ismail, A. R.; Kardigama, K.; Salaam, H. A.; Ahmad, Z.; Johari, N. H.; Anuar, Z.; Azmi, N. S. N.
2012-09-01
Application of a thermal energy storage (TES) system reduces cost and energy consumption. The performance of the overall operation is affected by the diffuser design. In this study, computational analysis is used to determine the thermocline thickness. Three-dimensional simulations with different tank height-to-diameter ratios (HD), diffuser openings and numbers of diffuser holes are investigated. Simulations of medium-HD tanks with a double ring octagonal diffuser show good thermocline behavior and a clear distinction between warm and cold water. The results show that the best thermocline thickness at 50% charging time occurs in the medium tank with a height-to-diameter ratio of 4.0 and a double ring octagonal diffuser with 48 holes (9 mm opening, ~60%), which is acceptable compared to diffusers with 6 mm (~40%) and 12 mm (~80%) openings. The conclusion is that computational analysis methods are very useful in studying the performance of thermal energy storage (TES).
Computational Analysis on Performance of Thermal Energy Storage (TES) Diffuser
International Nuclear Information System (INIS)
Adib, M A H M; Ismail, A R; Kardigama, K; Salaam, H A; Ahmad, Z; Johari, N H; Anuar, Z; Azmi, N S N; Adnan, F
2012-01-01
Application of a thermal energy storage (TES) system reduces cost and energy consumption. The performance of the overall operation is affected by the diffuser design. In this study, computational analysis is used to determine the thermocline thickness. Three-dimensional simulations with different tank height-to-diameter ratios (HD), diffuser openings and numbers of diffuser holes are investigated. Simulations of medium-HD tanks with a double ring octagonal diffuser show good thermocline behavior and a clear distinction between warm and cold water. The results show that the best thermocline thickness at 50% charging time occurs in the medium tank with a height-to-diameter ratio of 4.0 and a double ring octagonal diffuser with 48 holes (9 mm opening, ∼60%), which is acceptable compared to diffusers with 6 mm (∼40%) and 12 mm (∼80%) openings. The conclusion is that computational analysis methods are very useful in studying the performance of thermal energy storage (TES).
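The papers above quantify the thermocline as a thickness; one common (though not necessarily the authors') way to extract it from a vertical temperature profile is the height span over which the dimensionless temperature lies between two cut-offs. The Python sketch below uses assumed cut-offs of 0.1 and 0.9 and an illustrative profile, not the papers' data.

    import numpy as np

    def thermocline_thickness(z, T, T_cold, T_hot, lo=0.1, hi=0.9):
        """Height span where theta = (T - T_cold)/(T_hot - T_cold) lies between lo and hi."""
        theta = (np.asarray(T) - T_cold) / (T_hot - T_cold)
        inside = (theta > lo) & (theta < hi)
        return z[inside].max() - z[inside].min() if inside.any() else 0.0

    z = np.linspace(0.0, 4.0, 81)                          # tank height [m]
    T = 12 + (40 - 12) / (1 + np.exp(-(z - 2.0) / 0.15))   # smooth hot-over-cold profile [degC]
    print(round(thermocline_thickness(z, T, 12, 40), 2))   # approximately 0.6 m for this profile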
[Computers in biomedical research: I. Analysis of bioelectrical signals].
Vivaldi, E A; Maldonado, P
2001-08-01
A personal computer equipped with an analog-to-digital conversion card is able to input, store and display signals of biomedical interest. These signals can additionally be submitted to ad-hoc software for analysis and diagnosis. Data acquisition is based on the sampling of a signal at a given rate and amplitude resolution. The automation of signal processing involves syntactic aspects (data transduction, conditioning and reduction) and semantic aspects (feature extraction to describe and characterize the signal, and diagnostic classification). The analytical approach that is at the basis of computer programming allows for the successful resolution of apparently complex tasks. Two basic principles involved are the definition of simple fundamental functions that are then iterated and the modular subdivision of tasks. These two principles are illustrated, respectively, by presenting the algorithm that detects relevant elements for the analysis of a polysomnogram, and the task flow in systems that automate electrocardiographic reports.
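A minimal sketch of the acquisition step the abstract describes, namely sampling an analog signal at a fixed rate and quantizing it to a given amplitude resolution; the sampling rate, bit depth and input range below are illustrative values, not those of any particular acquisition card.

    import numpy as np

    fs, bits, v_range = 250.0, 12, (-5.0, 5.0)           # sampling rate [Hz], ADC bits, volts
    t = np.arange(0.0, 2.0, 1.0 / fs)                    # 2 s of sample instants
    x = 1.5 * np.sin(2 * np.pi * 10 * t)                 # a 10 Hz test signal standing in for a biosignal

    lo, hi = v_range
    levels = 2 ** bits
    codes = np.clip(np.round((x - lo) / (hi - lo) * (levels - 1)), 0, levels - 1).astype(int)
    x_quantized = lo + codes * (hi - lo) / (levels - 1)  # amplitudes reconstructed from the codes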
Computational singular perturbation analysis of stochastic chemical systems with stiffness
Wang, Lijin; Han, Xiaoying; Cao, Yanzhao; Najm, Habib N.
2017-04-01
Computational singular perturbation (CSP) is a useful method for analysis, reduction, and time integration of stiff ordinary differential equation systems. It has found dominant utility, in particular, in chemical reaction systems with a large range of time scales at the continuum and deterministic level. On the other hand, CSP is not directly applicable to chemical reaction systems at the micro or meso-scale, where stochasticity plays a non-negligible role and thus has to be taken into account. In this work we develop a novel stochastic computational singular perturbation (SCSP) analysis and time integration framework, and an associated algorithm, that can be used not only to construct accurately and efficiently the numerical solutions to stiff stochastic chemical reaction systems, but also to analyze the dynamics of the reduced stochastic reaction systems. The algorithm is illustrated by an application to a benchmark stochastic differential equation model, and numerical experiments are carried out to demonstrate the effectiveness of the construction.
Man-machine interfaces analysis system based on computer simulation
International Nuclear Information System (INIS)
Chen Xiaoming; Gao Zuying; Zhou Zhiwei; Zhao Bingquan
2004-01-01
The paper describes a software assessment system, Dynamic Interaction Analysis Support (DIAS), based on computer simulation technology for the man-machine interfaces (MMI) of a control room. It employs a computer to simulate operating procedures on the man-machine interfaces of a control room, provides quantified assessment, and at the same time analyses the operational error rate of operators by means of human error rate prediction techniques. Problems in the placement of man-machine interfaces in a control room and in the arrangement of instruments can be detected from the simulation results. The DIAS system can provide good technical support to the design and improvement of the man-machine interfaces of the main control room of a nuclear power plant
Computer based approach to fatigue analysis and design
International Nuclear Information System (INIS)
Comstock, T.R.; Bernard, T.; Nieb, J.
1979-01-01
An approach is presented which uses a mini-computer based system for data acquisition, analysis and graphic displays relative to fatigue life estimation and design. Procedures are developed for identifying and eliminating damaging events due to the overall duty cycle, forced vibration and structural dynamic characteristics. Two case histories, weld failures in heavy vehicles and low cycle fan blade failures, are discussed to illustrate the overall approach. (orig.)
A Computable OLG Model for Gender and Growth Policy Analysis
Pierre-Richard Agénor
2012-01-01
This paper develops a computable Overlapping Generations (OLG) model for gender and growth policy analysis. The model accounts for human and physical capital accumulation (both public and private), intra- and inter-generational health persistence, fertility choices, and women's time allocation between market work, child rearing, and home production. Bargaining between spouses and gender bias, in the form of discrimination in the work place and mothers' time allocation between daughters and so...
Computers in activation analysis and gamma-ray spectroscopy
Energy Technology Data Exchange (ETDEWEB)
Carpenter, B. S.; D'Agostino, M. D.; Yule, H. P. [eds.]
1979-01-01
Seventy-three papers are included under the following session headings: analytical and mathematical methods for data analysis; software systems for gamma-ray and x-ray spectrometry; gamma-ray spectra treatment, peak evaluation; least squares; IAEA intercomparison of methods for processing spectra; computer and calculator utilization in spectrometer systems; and applications in safeguards, fuel scanning, and environmental monitoring. Separate abstracts were prepared for 72 of those papers. (DLC)
Computational techniques for inelastic analysis and numerical experiments
International Nuclear Information System (INIS)
Yamada, Y.
1977-01-01
A number of formulations have been proposed for inelastic analysis, particularly for the thermal elastic-plastic creep analysis of nuclear reactor components. In the elastic-plastic regime, which principally concerns time-independent behavior, numerical techniques based on the finite element method have been well exploited and computations have become routine work. For problems in which the time-dependent behavior is significant, it is desirable to incorporate a procedure that works with the mechanical model formulation as well as with the equation-of-state methods proposed so far. A computer program should also take into account the strain-dependent and/or time-dependent micro-structural changes which often occur during the operation of structural components at increasingly high temperatures for long periods of time. Special considerations are crucial if the analysis is to be extended to the large strain regime where geometric nonlinearities predominate. The present paper introduces a rational updated formulation and a computer program under development, taking into account the various requisites stated above. (Auth.)
M. Kasemann
Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...
M. Kasemann
CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes. Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...
M. Kasemann
Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...
Trend Analysis of the Brazilian Scientific Production in Computer Science
Directory of Open Access Journals (Sweden)
TRUCOLO, C. C.
2014-12-01
Full Text Available The growth in the volume and diversity of scientific information brings new challenges in understanding the reasons, the processes and the underlying forces that propel this growth. This information can be used as the basis for the development of strategies and public policies to improve education and innovation services. Trend analysis is one step in this direction. In this work, a trend analysis of the Brazilian scientific production of graduate programs in the computer science area is carried out to identify the main subjects studied by these programs, both collectively and individually.
A visual interface to computer programs for linkage analysis.
Chapman, C J
1990-06-01
This paper describes a visual approach to the input of information about human families into computer data bases, making use of the GEM graphic interface on the Atari ST. Similar approaches could be used on the Apple Macintosh or on the IBM PC AT (to which it has been transferred). For occasional users of pedigree analysis programs, this approach has considerable advantages in ease of use and accessibility. An example of such use might be the analysis of risk in families with Huntington disease using linked RFLPs. However, graphic interfaces do make much greater demands on the programmers of these systems.
Advances in Computational Stability Analysis of Composite Aerospace Structures
International Nuclear Information System (INIS)
Degenhardt, R.; Araujo, F. C. de
2010-01-01
The European aircraft industry demands reduced development and operating costs. Structural weight reduction by exploitation of structural reserves in composite aerospace structures contributes to this aim; however, it requires accurate and experimentally validated stability analysis of real structures under realistic loading conditions. This paper presents different advances from the area of computational stability analysis of composite aerospace structures which contribute to that field. For stringer-stiffened panels, the main results of the completed EU project COCOMAT are given. It investigated the exploitation of reserves in primary fibre composite fuselage structures through an accurate and reliable simulation of postbuckling and collapse. For unstiffened cylindrical composite shells, a proposal for a new design method is presented.
Structural mode significance using INCA. [Interactive Controls Analysis computer program
Bauer, Frank H.; Downing, John P.; Thorpe, Christopher J.
1990-01-01
Structural finite element models are often too large to be used in the design and analysis of control systems. Model reduction techniques must be applied to reduce the structural model to manageable size. In the past, engineers either performed the model order reduction by hand or used distinct computer programs to retrieve the data, to perform the significance analysis and to reduce the order of the model. To expedite this process, the latest version of INCA has been expanded to include an interactive graphical structural mode significance and model order reduction capability.
Current topics in pure and computational complex analysis
Dorff, Michael; Lahiri, Indrajit
2014-01-01
The book contains 13 articles, some of which are survey articles and others research papers. Written by eminent mathematicians, these articles were presented at the International Workshop on Complex Analysis and Its Applications held at Walchand College of Engineering, Sangli. All the contributing authors are actively engaged in research fields related to the topic of the book. The workshop offered a comprehensive exposition of the recent developments in geometric function theory, planar harmonic mappings, entire and meromorphic functions and their applications, both theoretical and computational. The recent developments in complex analysis and its applications play a crucial role in research in many disciplines.
ASAS: Computational code for Analysis and Simulation of Atomic Spectra
Directory of Open Access Journals (Sweden)
Jhonatha R. dos Santos
2017-01-01
Full Text Available The laser isotopic separation process is based on the selective photoionization principle and, because of this, it is necessary to know the absorption spectrum of the desired atom. Computational resources have become indispensable for planning experiments and analyzing the acquired data. The ASAS (Analysis and Simulation of Atomic Spectra) software presented here is a helpful tool for studies involving atomic spectroscopy. The input for the simulations is user-friendly and essentially requires a database containing the energy levels and spectral lines of the atoms to be studied.
Critical Data Analysis Precedes Soft Computing Of Medical Data
DEFF Research Database (Denmark)
Keyserlingk, Diedrich Graf von; Jantzen, Jan; Berks, G.
2000-01-01
extracted. The factors had different relationships (loadings) to the symptoms. Although the factors were gained only by computations, they seemed to express some modular features of the language disturbances. This phenomenon, that factors represent superior aspects of data, is well known in factor analysis...... the deficits in communication. Sets of symptoms corresponding to the traditional symptoms in Broca and Wernicke aphasia may be represented in the factors, but the factor itself does not represent a syndrome. It is assumed that this kind of data analysis shows a new approach to the understanding of language...
Establishment of computer code system for nuclear reactor design - analysis
International Nuclear Information System (INIS)
Subki, I.R.; Santoso, B.; Syaukat, A.; Lee, S.M.
1996-01-01
The establishment of a computer code system for nuclear reactor design analysis is described in this paper. This establishment is an effort to provide the capability to run various codes, from nuclear data to reactor design, and to promote the capability for nuclear reactor design analysis, particularly from the neutronics and safety points of view. It is also an effort to enhance the coordination of nuclear code application and development existing in various research centres in Indonesia. Very promising results have been obtained with the help of IAEA technical assistance. (author). 6 refs, 1 fig., 1 tab
Analysis and computation of microstructure in finite plasticity
Hackl, Klaus
2015-01-01
This book addresses the need for a fundamental understanding of the physical origin, the mathematical behavior, and the numerical treatment of models which include microstructure. Leading scientists present their efforts involving mathematical analysis, numerical analysis, computational mechanics, material modelling and experiment. The mathematical analyses are based on methods from the calculus of variations, while in the numerical implementation global optimization algorithms play a central role. The modeling covers all length scales, from the atomic structure up to macroscopic samples. The development of the models was guided by experiments on single crystals and polycrystals, and the results are checked against experimental data.
Computer assessment of interview data using latent semantic analysis.
Dam, Gregory; Kaufmann, Stefan
2008-02-01
Clinical interviews are a powerful method for assessing students' knowledge and conceptual development. However, the analysis of the resulting data is time-consuming and can create a "bottleneck" in large-scale studies. This article demonstrates the utility of computational methods in supporting such an analysis. Thirty-four 7th-grade student explanations of the causes of Earth's seasons were assessed using latent semantic analysis (LSA). Analyses were performed on transcriptions of student responses during interviews administered prior to (n = 21) and after (n = 13) receiving earth science instruction. An instrument that uses LSA technology was developed to identify misconceptions and assess conceptual change in students' thinking. Its accuracy, as determined by comparing its classifications to the independent coding performed by four human raters, reached 90%. Techniques for adapting LSA technology to support the analysis of interview data, as well as some limitations, are discussed.
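A minimal sketch of the LSA-based scoring idea described in the abstract above: responses are projected into a low-rank semantic space and compared, by cosine similarity, with reference explanations. The corpus, the scikit-learn pipeline and the decision rule are illustrative assumptions only, not the instrument developed in the study.

```python
# Hedged sketch of an LSA-style scorer for interview transcripts.
# Reference texts, dimensionality and the classification rule are assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

reference_explanations = [
    "the seasons are caused by the tilt of earth's axis relative to its orbit",
    "the seasons happen because earth is closer to the sun in summer",  # a known misconception
]
student_responses = [
    "in summer the earth leans toward the sun so sunlight is more direct",
    "summer is when we are nearest to the sun so it gets hotter",
]

# Build a term-document matrix over all texts and reduce it to a few latent dimensions.
corpus = reference_explanations + student_responses
tfidf = TfidfVectorizer(stop_words="english").fit_transform(corpus)
lsa = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)

refs = lsa[: len(reference_explanations)]
responses = lsa[len(reference_explanations):]
scores = cosine_similarity(responses, refs)  # rows: students, columns: reference texts
for i, row in enumerate(scores):
    label = "scientific" if row[0] >= row[1] else "misconception"
    print(f"student {i}: similarities={row.round(2)} -> {label}")
```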
New Mexico district work-effort analysis computer program
Hiss, W.L.; Trantolo, A.P.; Sparks, J.L.
1972-01-01
The computer program (CAN 2) described in this report is one of several related programs used in the New Mexico District cost-analysis system. The work-effort information used in these programs is accumulated and entered to the nearest hour on forms completed by each employee. Tabulating cards are punched directly from these forms after visual examinations for errors are made. Reports containing detailed work-effort data itemized by employee within each project and account and by account and project for each employee are prepared for both current-month and year-to-date periods by the CAN 2 computer program. An option allowing preparation of reports for a specified 3-month period is provided. The total number of hours worked on each account and project and a grand total of hours worked in the New Mexico District is computed and presented in a summary report for each period. Work effort not chargeable directly to individual projects or accounts is considered as overhead and can be apportioned to the individual accounts and projects on the basis of the ratio of the total hours of work effort for the individual accounts or projects to the total New Mexico District work effort at the option of the user. The hours of work performed by a particular section, such as General Investigations or Surface Water, are prorated and charged to the projects or accounts within the particular section. A number of surveillance or buffer accounts are employed to account for the hours worked on special events or on those parts of large projects or accounts that require a more detailed analysis. Any part of the New Mexico District operation can be separated and analyzed in detail by establishing an appropriate buffer account. With the exception of statements associated with word size, the computer program is written in FORTRAN IV in a relatively low and standard language level to facilitate its use on different digital computers. The program has been run only on a Control Data Corporation
2010-01-01
Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...
Analysis of multigrid methods on massively parallel computers: Architectural implications
Matheson, Lesley R.; Tarjan, Robert E.
1993-01-01
We study the potential performance of multigrid algorithms running on massively parallel computers with the intent of discovering whether presently envisioned machines will provide an efficient platform for such algorithms. We consider the domain parallel version of the standard V cycle algorithm on model problems, discretized using finite difference techniques in two and three dimensions on block structured grids of size 10^6 and 10^9, respectively. Our models of parallel computation were developed to reflect the computing characteristics of the current generation of massively parallel multicomputers. These models are based on an interconnection network of 256 to 16,384 message passing, 'workstation size' processors executing in an SPMD mode. The first model accomplishes interprocessor communications through a multistage permutation network. The communication cost is a logarithmic function which is similar to the costs in a variety of different topologies. The second model allows single stage communication costs only. Both models were designed with information provided by machine developers and utilize implementation derived parameters. With the medium grain parallelism of the current generation and the high fixed cost of an interprocessor communication, our analysis suggests an efficient implementation requires the machine to support the efficient transmission of long messages (up to 1000 words), or the high initiation cost of a communication must be significantly reduced through an alternative optimization technique. Furthermore, with variable length message capability, our analysis suggests the low diameter multistage networks provide little or no advantage over a simple single stage communications network.
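For reference, the serial structure of the V-cycle whose parallel cost is modelled above can be sketched as follows. The 1-D Poisson problem, weighted-Jacobi smoother and small grid size are illustrative assumptions; the paper itself analyses 2-D and 3-D block-structured problems of 10^6 to 10^9 points.

```python
# Minimal 1-D multigrid V-cycle (Poisson, Dirichlet BCs) -- an illustrative serial
# reference for the algorithm whose parallel cost is modelled in the abstract above.
import numpy as np

def smooth(u, f, h, sweeps=2, omega=2.0 / 3.0):
    """Weighted-Jacobi relaxation on -u'' = f."""
    for _ in range(sweeps):
        u[1:-1] = (1 - omega) * u[1:-1] + omega * 0.5 * (u[:-2] + u[2:] + h * h * f[1:-1])
    return u

def residual(u, f, h):
    r = np.zeros_like(u)
    r[1:-1] = f[1:-1] - (2 * u[1:-1] - u[:-2] - u[2:]) / (h * h)
    return r

def restrict(fine):
    """Full-weighting restriction to the coarse grid."""
    coarse = fine[::2].copy()
    coarse[1:-1] = 0.25 * fine[1:-3:2] + 0.5 * fine[2:-2:2] + 0.25 * fine[3:-1:2]
    return coarse

def prolong(coarse):
    """Linear interpolation back to the fine grid."""
    fine = np.zeros(2 * (coarse.size - 1) + 1)
    fine[::2] = coarse
    fine[1::2] = 0.5 * (coarse[:-1] + coarse[1:])
    return fine

def v_cycle(u, f, h):
    if u.size <= 3:                      # coarsest grid: one interior point, solve exactly
        u[1] = 0.5 * h * h * f[1]
        return u
    u = smooth(u, f, h)                  # pre-smoothing
    r_coarse = restrict(residual(u, f, h))
    e_coarse = v_cycle(np.zeros_like(r_coarse), r_coarse, 2 * h)
    u += prolong(e_coarse)               # coarse-grid correction
    return smooth(u, f, h)               # post-smoothing

n = 2 ** 7                               # tiny grid; the paper considers 10^6 - 10^9 points
x = np.linspace(0.0, 1.0, n + 1)
f = np.pi ** 2 * np.sin(np.pi * x)       # exact solution u = sin(pi x)
u = np.zeros_like(x)
for _ in range(10):
    u = v_cycle(u, f, 1.0 / n)
print("max error:", np.abs(u - np.sin(np.pi * x)).max())
```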
A computational clonal analysis of the developing mouse limb bud.
Directory of Open Access Journals (Sweden)
Luciano Marcon
Full Text Available A comprehensive spatio-temporal description of the tissue movements underlying organogenesis would be an extremely useful resource for developmental biology. Clonal analysis and fate mapping are popular experiments to study tissue movement during morphogenesis. Such experiments allow cell populations to be labeled at an early stage of development and their spatial evolution to be followed over time. However, disentangling the cumulative effects of the multiple events responsible for the expansion of the labeled cell population is not always straightforward. To overcome this problem, we develop a novel computational method that combines accurate quantification of 2D limb bud morphologies and growth modeling to analyze mouse clonal data of early limb development. Firstly, we explore various tissue movements that match experimental limb bud shape changes. Secondly, by comparing computational clones with newly generated mouse clonal data we are able to choose and characterize the tissue movement map that best matches the experimental data. Our computational analysis produces for the first time a two-dimensional model of limb growth based on experimental data that can be used to better characterize limb tissue movement in space and time. The model shows that the distribution and shapes of clones can be described as a combination of anisotropic growth with isotropic cell mixing, without the need for lineage compartmentalization along the AP and PD axes. Lastly, we show that this comprehensive description can be used to reassess spatio-temporal gene regulation taking tissue movement into account and to investigate PD patterning hypotheses.
Analysis of sponge zones for computational fluid mechanics
International Nuclear Information System (INIS)
Bodony, Daniel J.
2006-01-01
The use of sponge regions, or sponge zones, which add the forcing term -σ(q - q_ref) to the right-hand side of the governing equations in computational fluid mechanics as an ad hoc boundary treatment is widespread. They are used to absorb and minimize reflections from computational boundaries and as forcing sponges to introduce prescribed disturbances into a calculation. A less common usage is as a means of extending a calculation from a smaller domain into a larger one, such as in computing the far-field sound generated in a localized region. By analogy to the penalty method of finite elements, the method is placed on a solid foundation, complete with estimates of convergence. The analysis generalizes the work of Israeli and Orszag [M. Israeli, S.A. Orszag, Approximation of radiation boundary conditions, J. Comp. Phys. 41 (1981) 115-135] and confirms their findings when applied as a special case to one-dimensional wave propagation in an absorbing sponge. It is found that the rate of convergence of the actual solution to the target solution, with an appropriate norm, is inversely proportional to the sponge strength. A detailed analysis for acoustic wave propagation in one dimension verifies the convergence rate given by the general theory. The exponential point-wise convergence derived by Israeli and Orszag in the high-frequency limit is recovered and found to hold over all frequencies. A weakly nonlinear analysis of the method when applied to Burgers' equation shows similar convergence properties. Three numerical examples are given to confirm the analysis: the acoustic extension of a two-dimensional time-harmonic point source, the acoustic extension of a three-dimensional initial-value problem of a sound pulse, and the introduction of unstable eigenmodes from linear stability theory into a two-dimensional shear layer.
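The following minimal sketch shows how the forcing term -σ(q - q_ref) from the abstract above is typically appended near an outflow boundary so that disturbances are damped before they can reflect back into the domain. The 1-D linear advection equation, first-order upwind step, sponge strength and ramp profile are illustrative assumptions, not the configurations analysed in the paper.

```python
# Hedged sketch: a sponge zone appended to a 1-D advection solver.
# The sponge adds -sigma(x) * (q - q_ref) over the last 20% of the domain.
import numpy as np

nx, c, cfl = 400, 1.0, 0.5
x = np.linspace(0.0, 10.0, nx)
dx = x[1] - x[0]
dt = cfl * dx / c

# Sponge strength: zero in the interior, ramping smoothly to sigma_max at the boundary.
sigma_max, x_start = 50.0, 8.0
sigma = np.where(x > x_start, sigma_max * ((x - x_start) / (x[-1] - x_start)) ** 2, 0.0)
q_ref = 0.0                      # target (quiescent) state inside the sponge

q = np.exp(-((x - 3.0) ** 2))    # initial Gaussian pulse travelling to the right
for _ in range(2000):
    dqdx = np.zeros_like(q)
    dqdx[1:] = (q[1:] - q[:-1]) / dx          # first-order upwind difference (c > 0)
    q = q - dt * c * dqdx - dt * sigma * (q - q_ref)
    q[0] = 0.0                                # simple inflow condition

print("max |q| remaining in the domain:", np.abs(q).max())
```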
Computer image analysis of etched tracks from ionizing radiation
Blanford, George E.
1994-01-01
I proposed to continue a cooperative research project with Dr. David S. McKay concerning image analysis of tracks. Last summer we showed that we could measure track densities using the Oxford Instruments eXL computer and software that is attached to an ISI scanning electron microscope (SEM) located in building 31 at JSC. To reduce the dependence on JSC equipment, we proposed to transfer the SEM images to UHCL for analysis. Last summer we developed techniques to use digitized scanning electron micrographs and computer image analysis programs to measure track densities in lunar soil grains. Tracks were formed by highly ionizing solar energetic particles and cosmic rays during near surface exposure on the Moon. The track densities are related to the exposure conditions (depth and time). Distributions of the number of grains as a function of their track densities can reveal the modality of soil maturation. As part of a consortium effort to better understand the maturation of lunar soil and its relation to its infrared reflectance properties, we worked on lunar samples 67701,205 and 61221,134. These samples were etched for a shorter time (6 hours) than last summer's sample and this difference has presented problems for establishing the correct analysis conditions. We used computer counting and measurement of area to obtain preliminary track densities and a track density distribution that we could interpret for sample 67701,205. This sample is a submature soil consisting of approximately 85 percent mature soil mixed with approximately 15 percent immature, but not pristine, soil.
Computational analysis of the SRS Phase III salt disposition alternatives
International Nuclear Information System (INIS)
Dimenna, R.A.
2000-01-01
In late 1997, the In-Tank Precipitation (ITP) facility was shut down and an evaluation of alternative methods to process the liquid high-level waste stored in the Savannah River Site High-Level Waste storage tanks was begun. The objective was to determine whether another process might avoid the operational difficulties encountered with ITP at a lower cost than modifying the existing facility. A structured approach was used to evaluate the proposed alternatives on a common basis and identify the best one. Results from the computational analysis were a key part of the input used to select a primary and a secondary salt disposition alternative. This paper describes the process by which the computational needs were identified, addressed, and accomplished with a limited staff under stringent schedule constraints.
Automated differentiation of computer models for sensitivity analysis
International Nuclear Information System (INIS)
Worley, B.A.
1990-01-01
Sensitivity analysis of reactor physics computer models is an established discipline after more than twenty years of active development of generalized perturbations theory based on direct and adjoint methods. Many reactor physics models have been enhanced to solve for sensitivities of model results to model data. The calculated sensitivities are usually normalized first derivatives although some codes are capable of solving for higher-order sensitivities. The purpose of this paper is to report on the development and application of the GRESS system for automating the implementation of the direct and adjoint techniques into existing FORTRAN computer codes. The GRESS system was developed at ORNL to eliminate the costly man-power intensive effort required to implement the direct and adjoint techniques into already-existing FORTRAN codes. GRESS has been successfully tested for a number of codes over a wide range of applications and presently operates on VAX machines under both VMS and UNIX operating systems
Automated differentiation of computer models for sensitivity analysis
International Nuclear Information System (INIS)
Worley, B.A.
1991-01-01
Sensitivity analysis of reactor physics computer models is an established discipline after more than twenty years of active development of generalized perturbations theory based on direct and adjoint methods. Many reactor physics models have been enhanced to solve for sensitivities of model results to model data. The calculated sensitivities are usually normalized first derivatives, although some codes are capable of solving for higher-order sensitivities. The purpose of this paper is to report on the development and application of the GRESS system for automating the implementation of the direct and adjoint techniques into existing FORTRAN computer codes. The GRESS system was developed at ORNL to eliminate the costly man-power intensive effort required to implement the direct and adjoint techniques into already-existing FORTRAN codes. GRESS has been successfully tested for a number of codes over a wide range of applications and presently operates on VAX machines under both VMS and UNIX operating systems. (author). 9 refs, 1 tab
Computer code for general analysis of radon risks (GARR)
International Nuclear Information System (INIS)
Ginevan, M.
1984-09-01
This document presents a computer model for general analysis of radon risks that allows the user to specify a large number of possible models with a small number of simple commands. The model is written in a version of BASIC which conforms closely to the American National Standards Institute (ANSI) definition for minimal BASIC and thus is readily modified for use on a wide variety of computers and, in particular, microcomputers. Model capabilities include generation of single-year life tables from 5-year abridged data, calculation of multiple-decrement life tables for lung cancer for the general population, smokers, and nonsmokers, and a cohort lung cancer risk calculation that allows specification of the level and duration of radon exposure, the form of the risk model, and the specific population assumed at risk. 36 references, 8 figures, 7 tables
Advanced data analysis in neuroscience integrating statistical and computational models
Durstewitz, Daniel
2017-01-01
This book is intended for use in advanced graduate courses in statistics / machine learning, as well as for all experimental neuroscientists seeking to understand statistical methods at a deeper level, and theoretical neuroscientists with a limited background in statistics. It reviews almost all areas of applied statistics, from basic statistical estimation and test theory, linear and nonlinear approaches for regression and classification, to model selection and methods for dimensionality reduction, density estimation and unsupervised clustering. Its focus, however, is linear and nonlinear time series analysis from a dynamical systems perspective, based on which it aims to convey an understanding also of the dynamical mechanisms that could have generated observed time series. Further, it integrates computational modeling of behavioral and neural dynamics with statistical estimation and hypothesis testing. This way computational models in neuroscience are not only explanatory frameworks, but become powerfu...
Numerical analysis of boosting scheme for scalable NMR quantum computation
International Nuclear Information System (INIS)
SaiToh, Akira; Kitagawa, Masahiro
2005-01-01
Among initialization schemes for ensemble quantum computation beginning at thermal equilibrium, the scheme proposed by Schulman and Vazirani [in Proceedings of the 31st ACM Symposium on Theory of Computing (STOC'99) (ACM Press, New York, 1999), pp. 322-329] is known for its simple quantum circuit to redistribute the biases (polarizations) of qubits and its small time complexity. However, our numerical simulation shows that the number of qubits initialized by the scheme is rather smaller than expected from the von Neumann entropy because of an increase in the sum of the binary entropies of individual qubits, which indicates a growth in the total classical correlation. This result, namely that there is such a significant growth in the total binary entropy, disagrees with their analysis.
Computer aided analysis, simulation and optimisation of thermal sterilisation processes.
Narayanan, C M; Banerjee, Arindam
2013-04-01
Although thermal sterilisation is a widely employed industrial process, little work is reported in the available literature, including patents, on the mathematical analysis and simulation of these processes. In the present work, software packages have been developed for computer-aided optimum design of thermal sterilisation processes. Systems involving steam sparging, jacketed heating/cooling, helical coils submerged in agitated vessels and systems that employ external heat exchangers (double pipe, shell and tube, and plate exchangers) have been considered. Both batch and continuous operations have been analysed and simulated. The dependence of the del factor on system and operating parameters such as mass or volume of substrate to be sterilised per batch, speed of agitation, helix diameter, substrate to steam ratio, rate of substrate circulation through the heat exchanger and that through the holding tube has been analysed separately for each mode of sterilisation. Axial dispersion in the holding tube has also been adequately accounted for through an appropriately defined axial dispersion coefficient. The effect of exchanger characteristics/specifications on the system performance has also been analysed. The multiparameter computer-aided design (CAD) software packages prepared are thus highly versatile in nature and they permit the most optimum choice of operating variables for the processes selected. The computed results have been compared with extensive data collected from a number of industries (distilleries, food processing and pharmaceutical industries) and pilot plants, and satisfactory agreement has been observed between the two, thereby confirming the accuracy of the CAD software developed. No simplifying assumptions have been made during the analysis, and the design of the associated heating/cooling equipment has been performed using the most up-to-date design correlations and computer software.
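For reference, the del factor discussed above is commonly written as ∇ = ln(N0/Nt) = ∫ k dt with an Arrhenius death-rate constant k = A·exp(-Ea/(R·T)). The sketch below integrates it numerically over an assumed heat/hold/cool temperature profile; the kinetic constants and the profile are illustrative assumptions, not values from the paper.

```python
# Hedged sketch: numerical evaluation of the del factor over a heat/hold/cool cycle.
# Kinetic constants and the temperature profile are illustrative assumptions only.
import numpy as np

R = 8.314        # J/(mol K)
A = 1.0e36       # 1/s, Arrhenius pre-exponential factor (assumed)
Ea = 2.83e5      # J/mol, activation energy for thermal death of spores (assumed)

def k(T_kelvin):
    """First-order specific death rate."""
    return A * np.exp(-Ea / (R * T_kelvin))

def trapz(y, t):
    """Trapezoidal-rule integral of y over t."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(t)))

# Assumed cycle: heat from 100 C to 121 C in 30 min, hold 20 min, cool back in 30 min.
t = np.linspace(0.0, 80.0 * 60.0, 2000)                       # seconds
T = np.interp(t, [0.0, 30.0 * 60.0, 50.0 * 60.0, 80.0 * 60.0],
              [373.15, 394.15, 394.15, 373.15])               # piecewise-linear profile

del_factor = trapz(k(T), t)
print(f"del factor = ln(N0/Nt) = {del_factor:.1f}")
print(f"log10 reduction        = {del_factor / np.log(10.0):.1f}")
```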
Analysis of pellet coating uniformity using a computer scanner.
Šibanc, Rok; Luštrik, Matevž; Dreu, Rok
2017-11-30
A fast method for pellet coating uniformity analysis, using a commercial computer scanner, was developed. The analysis of the individual particle coating thicknesses was based on a transparent orange-colored coating layer deposited on white pellet cores. Besides the analysis of the coating thickness, information on pellet size and shape was obtained as well. Particle-size-dependent coating thickness and particle-size-independent coating variability were calculated by combining the information on coating thickness and pellet size. Decoupling the sources of coating thickness variation is unique to the presented method. For each coating experiment around 10000 pellets were analyzed, giving results with a high statistical confidence. The proposed method was employed for the performance evaluation of a classical Wurster coater and a swirl-enhanced Wurster coater operated at different gap settings and air flow rates. Copyright © 2017 Elsevier B.V. All rights reserved.
Quantitative analysis by computer controlled X-ray fluorescence spectrometer
International Nuclear Information System (INIS)
Balasubramanian, T.V.; Angelo, P.C.
1981-01-01
X-ray fluorescence spectroscopy has become a widely accepted method in the metallurgical field for analysis of both minor and major elements. As encountered in many other analytical techniques, the problem of matrix effects, generally known as interelemental effects, has to be dealt with effectively in order to make the analysis accurate. There are several methods by which the effects of the matrix on the analyte are minimised or corrected for, and mathematical correction is one of them. In this method the characteristic secondary X-ray intensities are measured from standard samples and correction coefficients, if any, for interelemental effects are evaluated by mathematical calculations. This paper describes attempts to evaluate the correction coefficients for interelemental effects by multiple linear regression programmes using a computer for the quantitative analysis of stainless steel and a nickel-base cast alloy. The quantitative results obtained using this method for a standard stainless steel sample are compared with the given certified values. (author)
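A minimal sketch of the multiple-linear-regression step described above, assuming a simple model in which the analyte concentration is regressed on the measured intensities of all elements so that the cross terms absorb the interelement effects. The standards and numbers below are synthetic illustrations, not data from the paper.

```python
# Hedged sketch: interelement correction coefficients by multiple linear regression.
# Synthetic standards; the study used measured stainless-steel / Ni-alloy standards.
import numpy as np

# Measured secondary X-ray intensities (arbitrary units) for 5 standards, 3 elements (Cr, Ni, Fe).
I = np.array([
    [12.1, 8.3, 55.0],
    [15.4, 7.9, 52.1],
    [18.2, 9.1, 48.7],
    [20.5, 10.4, 45.9],
    [25.3, 12.0, 40.2],
])
# Certified Cr concentrations (wt%) of the same standards.
c_cr = np.array([12.0, 15.2, 18.1, 20.3, 25.1])

# Regression: c_Cr = b0 + b1*I_Cr + b2*I_Ni + b3*I_Fe.  The coefficients b2 and b3
# absorb the interelement (matrix) effects on the Cr line.
X = np.column_stack([np.ones_like(c_cr), I])
coeffs, *_ = np.linalg.lstsq(X, c_cr, rcond=None)
print("correction coefficients:", coeffs.round(4))

# Apply the calibration to an unknown sample's measured intensities.
unknown = np.array([1.0, 17.0, 9.0, 50.0])
print("predicted Cr concentration: %.2f wt%%" % (unknown @ coeffs))
```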
Overview of adaptive finite element analysis in computational geodynamics
May, D. A.; Schellart, W. P.; Moresi, L.
2013-10-01
The use of numerical models to develop insight and intuition into the dynamics of the Earth over geological time scales is a firmly established practice in the geodynamics community. As our depth of understanding grows, and hand-in-hand with improvements in analytical techniques and higher resolution remote sensing of the physical structure and state of the Earth, there is a continual need to develop more efficient, accurate and reliable numerical techniques. This is necessary to ensure that we can meet the challenge of generating robust conclusions, interpretations and predictions from improved observations. In adaptive numerical methods, the desire is generally to maximise the quality of the numerical solution for a given amount of computational effort. Neither of these terms has a unique, universal definition, but typically there is a trade off between the number of unknowns we can calculate to obtain a more accurate representation of the Earth, and the resources (time and computational memory) required to compute them. In the engineering community, this topic has been extensively examined using the adaptive finite element (AFE) method. Recently, the applicability of this technique to geodynamic processes has started to be explored. In this review we report on the current status and usage of spatially adaptive finite element analysis in the field of geodynamics. The objective of this review is to provide a brief introduction to the area of spatially adaptive finite element analysis, including a summary of different techniques to define spatial adaptation and of different approaches to guide the adaptive process in order to control the discretisation error inherent within the numerical solution. An overview of the current state of the art in adaptive modelling in geodynamics is provided, together with a discussion pertaining to the issues related to using adaptive analysis techniques and perspectives for future research in this area. Additionally, we also provide a
International Nuclear Information System (INIS)
Yamada, Hiroyuki; Tsutsumi, Hideaki; Ebisawa, Katsumi; Suzuki, Masahide
2002-03-01
The SHEAT code developed at the Japan Atomic Energy Research Institute is for probabilistic seismic hazard analysis, which is one of the tasks needed for the seismic Probabilistic Safety Assessment (PSA) of a nuclear power plant. At first, SHEAT was developed as the large-sized computer version. In addition, a personal computer version was provided in 2001 to improve the operation efficiency and generality of this code. It is possible to perform the earthquake hazard analysis, display and print functions with the Graphical User Interface. With the SHEAT for PC code, seismic hazard, which is defined as an annual exceedance frequency of occurrence of earthquake ground motions at various levels of intensity at a given site, is calculated by the following two steps as is done with the large-sized computer. One is the modeling of earthquake generation around a site. Future earthquake generation (locations, magnitudes and frequencies of postulated earthquakes) is modeled based on the historical earthquake records, active fault data and expert judgment. The other is the calculation of probabilistic seismic hazard at the site. An earthquake ground motion is calculated for each postulated earthquake using an attenuation model taking into account its standard deviation. Then the seismic hazard at the site is calculated by summing the frequencies of ground motions from all the earthquakes. This document is the user's manual of the SHEAT for PC code. It includes: (1) Outline of the code, which covers the overall concept, logical process, code structure, data files used and special characteristics of the code, (2) Functions of the subprograms and the analytical models in them, (3) Guidance on input and output data, (4) Sample run results, and (5) Operational manual. (author)
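The two-step hazard calculation described above can be illustrated with a minimal sketch: for each postulated earthquake, its annual rate is multiplied by the probability, from a lognormal attenuation model, that the site ground motion exceeds a given level, and the contributions are summed into a hazard curve. The source list and the attenuation relation below are invented placeholders, not SHEAT inputs.

```python
# Hedged sketch of a probabilistic seismic hazard sum over postulated earthquakes.
# Source parameters and the attenuation relation are illustrative assumptions only.
import numpy as np
from scipy.stats import norm

# Postulated earthquakes: (annual rate [1/yr], magnitude, distance to site [km]).
events = [
    (0.050, 5.5, 30.0),
    (0.010, 6.5, 50.0),
    (0.002, 7.0, 20.0),
]

def ln_pga_median(mag, dist_km):
    """Toy attenuation relation: ln of median PGA [g] as a function of M and R (assumed)."""
    return -3.5 + 0.9 * mag - 1.2 * np.log(dist_km + 10.0)

sigma_ln = 0.6                        # standard deviation of ln(PGA), as taken into account above
pga_levels = np.logspace(-2, 0, 50)   # 0.01 g .. 1 g

hazard = np.zeros_like(pga_levels)
for rate, mag, dist in events:
    mu = ln_pga_median(mag, dist)
    # P(PGA > a) under the lognormal attenuation model, weighted by the event's annual rate.
    hazard += rate * norm.sf((np.log(pga_levels) - mu) / sigma_ln)

for a, h in zip(pga_levels[::10], hazard[::10]):
    print(f"PGA > {a:5.3f} g : annual exceedance frequency = {h:.2e} /yr")
```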
Computational Fatigue Life Analysis of Carbon Fiber Laminate
Shastry, Shrimukhi G.; Chandrashekara, C. V., Dr.
2018-02-01
In the present scenario, many traditional materials are being replaced by composite materials for their light weight and high strength. Industries such as the automotive and aerospace industries use composite materials for most of their components. Replacing components which are subjected to static or impact loads is less challenging than replacing components which are subjected to dynamic loading. Replacing components with composite materials demands many stages of parametric study. One such parametric study is the fatigue analysis of the composite material. This paper focuses on the fatigue life analysis of the composite material using computational techniques. A composite plate with a hole at the center is considered for the study. The analysis is carried out on a (0°/90°/90°/90°/90°)s laminate sequence and a (45°/-45°)2s laminate sequence by using a computer script. The life cycles for the two lay-up sequences are compared with each other. It is observed that, for the same material and geometry of the component, cross-ply laminates show better fatigue life than angle-ply laminates.
P. MacBride
The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...
I. Fisk
2012-01-01
Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently. Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...
Computer content analysis of schizophrenic speech: a preliminary report.
Tucker, G J; Rosenberg, S D
1975-06-01
Computer analysis significantly differentiated the thematic content of the free speech of 10 schizophrenic patients from that of 10 nonschizophrenic patients and from the content of transcripts of dream material from 10 normal subjects. Schizophrenic patients used the thematic categories in factor 1 (the "schizophrenic factor") 3 times more frequently than the nonschizophrenics and 10 times more frequently than the normal subjects (p < .01). In general, the language content of the schizophrenic patient mirrored an almost agitated attempt to locate oneself in time and space and to defend against internal discomfort and confusion. The authors discuss the implications of this study for future research.
Spatial Analysis Along Networks Statistical and Computational Methods
Okabe, Atsuyuki
2012-01-01
In the real world, there are numerous and various events that occur on and alongside networks, including the occurrence of traffic accidents on highways, the location of stores alongside roads, the incidence of crime on streets and the contamination along rivers. In order to carry out analyses of those events, the researcher needs to be familiar with a range of specific techniques. Spatial Analysis Along Networks provides a practical guide to the necessary statistical techniques and their computational implementation. Each chapter illustrates a specific technique, from Stochastic Point Process
Integrated computer codes for nuclear power plant severe accident analysis
International Nuclear Information System (INIS)
Jordanov, I.; Khristov, Y.
1995-01-01
This overview contains a description of the Modular Accident Analysis Program (MAAP), ICARE computer code and Source Term Code Package (STCP). STCP is used to model TMLB sample problems for Zion Unit 1 and WWER-440/V-213 reactors. Comparison is made of STCP implementation on VAX and IBM systems. In order to improve accuracy, a double precision version of MARCH-3 component of STCP is created and the overall thermal hydraulics is modelled. Results of modelling the containment pressure, debris temperature, hydrogen mass are presented. 5 refs., 10 figs., 2 tabs
Integrated computer codes for nuclear power plant severe accident analysis
Energy Technology Data Exchange (ETDEWEB)
Jordanov, I; Khristov, Y [Bylgarska Akademiya na Naukite, Sofia (Bulgaria). Inst. za Yadrena Izsledvaniya i Yadrena Energetika
1996-12-31
This overview contains a description of the Modular Accident Analysis Program (MAAP), ICARE computer code and Source Term Code Package (STCP). STCP is used to model TMLB sample problems for Zion Unit 1 and WWER-440/V-213 reactors. Comparison is made of STCP implementation on VAX and IBM systems. In order to improve accuracy, a double precision version of MARCH-3 component of STCP is created and the overall thermal hydraulics is modelled. Results of modelling the containment pressure, debris temperature, hydrogen mass are presented. 5 refs., 10 figs., 2 tabs.
RADTRAN 5: A computer code for transportation risk analysis
International Nuclear Information System (INIS)
Neuhauser, K.S.; Kanipe, F.L.
1991-01-01
RADTRAN 5 is a computer code developed at Sandia National Laboratories (SNL) in Albuquerque, NM, to estimate radiological and nonradiological risks of radioactive materials transportation. RADTRAN 5 is written in ANSI Standard FORTRAN 77 and contains significant advances in the methodology for route-specific analysis first developed by SNL for RADTRAN 4 (Neuhauser and Kanipe, 1992). Like the previous RADTRAN codes, RADTRAN 5 contains two major modules for incident-free and accident risk analysis, respectively. All commercially important transportation modes may be analyzed with RADTRAN 5: highway by combination truck; highway by light-duty vehicle; rail; barge; ocean-going ship; cargo air; and passenger air
Computer Tomography Analysis of Fastrac Composite Thrust Chamber Assemblies
Beshears, Ronald D.
2000-01-01
Computed tomography (CT) inspection has been integrated into the production process for NASA's Fastrac composite thrust chamber assemblies (TCAs). CT has been proven to be uniquely qualified to detect the known critical flaw for these nozzles: liner cracks that are adjacent to debonds between the liner and overwrap. CT is also being used as a process monitoring tool through analysis of low-density indications in the nozzle overwraps. 3D reconstruction of CT images to produce models of flawed areas is being used to give program engineers better insight into the location and nature of nozzle flaws.
Development validation and use of computer codes for inelastic analysis
International Nuclear Information System (INIS)
Jobson, D.A.
1983-01-01
A finite element scheme is a system which provides routines to carry out the operations which are common to all finite element programs. The list of items that can be provided as standard by the finite element scheme is surprisingly large, and the list provided by the UNCLE finite element scheme is unusually comprehensive. This presentation covers the following: construction of the program, setting up a finite element mesh, generation of coordinates, and incorporating boundary and load conditions. Program validation was done by creep calculations performed using the CAUSE code. Program use is illustrated by calculating a typical inelastic analysis problem. This includes a computer model of the PFR intermediate heat exchanger
DYNAPO 4 - a fluid system and frames analysis computer program
International Nuclear Information System (INIS)
Lefter, J.D.; Ahdout, H.
1982-01-01
DYNAPO 4 is a user-oriented, specialized computer program capable of analyzing three-dimensional linear elastic piping systems or frames for static loads, dynamic loads represented by acceleration response spectra, and transient dynamic loads represented by harmonic, second-order polynomial, and time-history forcing functions. DYNAPO 4 has plotting capability: it plots the input configuration of the piping system or structure and also plots its deformed shape after the load is applied. DYNAPO 4 performs the analysis for ASME Section III Class 1, 2, and 3 piping, and provides the user with stress reports as per ASME and ANSI Code requirements. 3 refs
Computer compensation for NMR quantitative analysis of trace components
International Nuclear Information System (INIS)
Nakayama, T.; Fujiwara, Y.
1981-01-01
A computer program has been written that determines trace components and separates overlapping components in multicomponent NMR spectra. This program uses the Lorentzian curve as the theoretical curve for NMR spectra. The coefficients of the Lorentzian are determined by the method of least squares. Systematic errors such as baseline/phase distortion are compensated and random errors are smoothed by taking moving averages, so that these processes contribute substantially to decreasing the accumulation time of spectral data. The accuracy of quantitative analysis of trace components has been improved by two significant figures. This program was applied to determining the abundance of 13C and the saponification degree of PVA
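A minimal sketch of the least-squares step described above, with SciPy's curve_fit standing in for the original program's solver. The synthetic two-peak spectrum, noise level and starting values are assumptions; the baseline/phase compensation and moving-average smoothing mentioned in the abstract are omitted.

```python
# Hedged sketch: resolving two overlapping Lorentzian lines by nonlinear least squares.
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(x, amp, x0, gamma):
    """Single Lorentzian line with height amp, centre x0 and half-width gamma."""
    return amp * gamma**2 / ((x - x0) ** 2 + gamma**2)

def two_peaks(x, a1, x1, g1, a2, x2, g2):
    return lorentzian(x, a1, x1, g1) + lorentzian(x, a2, x2, g2)

rng = np.random.default_rng(0)
x = np.linspace(-5.0, 5.0, 400)
true = (1.0, -0.5, 0.4, 0.05, 1.2, 0.3)            # trace component: 5% of the main peak
y = two_peaks(x, *true) + rng.normal(0.0, 0.002, x.size)

p0 = (0.8, -0.4, 0.5, 0.1, 1.0, 0.5)               # rough starting guess
popt, pcov = curve_fit(two_peaks, x, y, p0=p0)
areas = np.pi * popt[0] * popt[2], np.pi * popt[3] * popt[5]   # Lorentzian area = pi*amp*gamma
print("fitted parameters:", np.round(popt, 3))
print("trace / main area ratio: %.3f" % (areas[1] / areas[0]))
```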
A computer program for automatic gamma-ray spectra analysis
International Nuclear Information System (INIS)
Hiromura, Kazuyuki
1975-01-01
A computer program for automatic analysis of gamma-ray spectra obtained with a Ge(Li) detector is presented. The program includes a method that compares successive values of the experimental data for automatic peak finding, and a method of least squares for peak fitting. The peak shape in the fitting routine is a 'modified Gaussian', which consists of two different Gaussians with the same height joined at the centroid. A quadratic form is chosen as the function representing the background. A maximum of four peaks can be treated in the fitting routine by the program. Some improvements under consideration are described. (auth.)
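A hedged sketch of the two ingredients named above: a peak search based on comparing successive channel contents, followed by a least-squares fit of the 'modified Gaussian' (two half-Gaussians of equal height joined at the centroid) over a quadratic background. The synthetic spectrum, the smoothing window and the significance threshold are assumptions, not the original routine's rules.

```python
# Hedged sketch: successive-difference peak search plus a modified-Gaussian fit.
import numpy as np
from scipy.optimize import curve_fit

def modified_gaussian(x, height, centroid, sigma_left, sigma_right, a, b, c):
    """Two half-Gaussians of equal height joined at the centroid, plus a quadratic background."""
    sigma = np.where(x < centroid, sigma_left, sigma_right)
    return height * np.exp(-0.5 * ((x - centroid) / sigma) ** 2) + a + b * x + c * x**2

channels = np.arange(200.0)
true = (500.0, 100.0, 3.0, 4.0, 20.0, 0.05, 1e-4)
rng = np.random.default_rng(1)
counts = rng.poisson(modified_gaussian(channels, *true)).astype(float)

# Peak search: a channel is a candidate if it exceeds both neighbours and rises
# significantly above the background level (threshold is an assumed heuristic).
smoothed = np.convolve(counts, np.ones(5) / 5.0, mode="same")
background = np.median(smoothed)
candidates = [i for i in range(1, len(channels) - 1)
              if smoothed[i] > smoothed[i - 1] and smoothed[i] >= smoothed[i + 1]
              and smoothed[i] > background + 3.0 * np.sqrt(background)]
print("peak candidates:", candidates)

# Fit the region around the strongest candidate.
i0 = max(candidates, key=lambda i: smoothed[i])
window = slice(i0 - 20, i0 + 20)
p0 = (counts[i0], channels[i0], 2.0, 2.0, counts[window].min(), 0.0, 0.0)
popt, _ = curve_fit(modified_gaussian, channels[window], counts[window], p0=p0)
print("centroid = %.2f, sigma_left = %.2f, sigma_right = %.2f" % (popt[1], popt[2], popt[3]))
```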
MEGA X: Molecular Evolutionary Genetics Analysis across Computing Platforms.
Kumar, Sudhir; Stecher, Glen; Li, Michael; Knyaz, Christina; Tamura, Koichiro
2018-06-01
The Molecular Evolutionary Genetics Analysis (Mega) software implements many analytical methods and tools for phylogenomics and phylomedicine. Here, we report a transformation of Mega to enable cross-platform use on Microsoft Windows and Linux operating systems. Mega X does not require virtualization or emulation software and provides a uniform user experience across platforms. Mega X has additionally been upgraded to use multiple computing cores for many molecular evolutionary analyses. Mega X is available in two interfaces (graphical and command line) and can be downloaded from www.megasoftware.net free of charge.
Modern EMC analysis I time-domain computational schemes
Kantartzis, Nikolaos V
2008-01-01
The objective of this two-volume book is the systematic and comprehensive description of the most competitive time-domain computational methods for the efficient modeling and accurate solution of contemporary real-world EMC problems. Intended to be self-contained, it performs a detailed presentation of all well-known algorithms, elucidating their merits and weaknesses, and accompanies the theoretical content with a variety of applications. Outlining the present volume, the analysis covers the theory of the finite-difference time-domain, the transmission-line matrix/modeling, and the finite i
I. Fisk
2011-01-01
Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. The GlideInWMS components are now also deployed at CERN, in addition to the GlideInWMS factory located in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...
Cepstrum analysis and applications to computational fluid dynamic solutions
Meadows, Kristine R.
1990-04-01
A novel approach to the problem of spurious reflections introduced by artificial boundary conditions in computational fluid dynamic (CFD) solutions is proposed. Instead of attempting to derive non-reflecting boundary conditions, the approach is to accept the fact that spurious reflections occur, but to remove these reflections with cepstrum analysis, a signal processing technique which has been successfully used to remove echoes from experimental data. First, the theory of the cepstrum method is presented. This includes presentation of two types of cepstra: the power cepstrum and the complex cepstrum. The definitions of the cepstrum methods are applied theoretically and numerically to the analytical solution of sinusoidal plane wave propagation in a duct. One-D and 3-D time-dependent solutions to the Euler equations are computed, and hard-wall conditions are prescribed at the numerical boundaries. The cepstrum method is applied, and the reflections from the boundaries are removed from the solutions. One-D and 3-D solutions are computed with so-called nonreflecting boundary conditions, and these solutions are compared to those obtained by prescribing hard-wall conditions and processing with the cepstrum.
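For reference, the power cepstrum used in this approach is commonly computed as the squared magnitude of the inverse Fourier transform of the log power spectrum; an echo at delay τ then appears as a peak at quefrency τ, which can be edited out ("liftered"). The sketch below shows only the detection step, on a synthetic signal; the pulse, the sampling rate and the echo parameters are assumptions.

```python
# Hedged sketch: detecting an echo with the power cepstrum.
# A synthetic pulse plus a delayed, attenuated copy stands in for a CFD time series
# contaminated by a boundary reflection.
import numpy as np

fs = 1000.0                         # samples per unit time (assumed)
t = np.arange(0, 2.0, 1.0 / fs)
pulse = np.exp(-((t - 0.3) / 0.02) ** 2) * np.sin(2 * np.pi * 60 * t)

delay, alpha = 0.5, 0.4             # echo delay and relative amplitude (assumed)
signal = pulse + alpha * np.roll(pulse, int(delay * fs))

# Power cepstrum: |IFFT(log |FFT(x)|^2)|^2.
spectrum = np.fft.rfft(signal)
log_power = np.log(np.abs(spectrum) ** 2 + 1e-30)
cepstrum = np.abs(np.fft.irfft(log_power)) ** 2

quefrency = np.arange(cepstrum.size) / fs
search = (quefrency > 0.05) & (quefrency < 1.0)   # ignore the low-quefrency region
print("echo delay detected at quefrency %.3f (true %.3f)"
      % (quefrency[search][np.argmax(cepstrum[search])], delay))
```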
G-computation demonstration in causal mediation analysis
International Nuclear Information System (INIS)
Wang, Aolin; Arah, Onyebuchi A.
2015-01-01
Recent work has considerably advanced the definition, identification and estimation of controlled direct, and natural direct and indirect effects in causal mediation analysis. Despite the various estimation methods and statistical routines being developed, a unified approach for effect estimation under different effect decomposition scenarios is still needed for epidemiologic research. G-computation offers such unification and has been used for total effect and joint controlled direct effect estimation settings, involving different types of exposure and outcome variables. In this study, we demonstrate the utility of parametric g-computation in estimating various components of the total effect, including (1) natural direct and indirect effects, (2) standard and stochastic controlled direct effects, and (3) reference and mediated interaction effects, using Monte Carlo simulations in standard statistical software. For each study subject, we estimated their nested potential outcomes corresponding to the (mediated) effects of an intervention on the exposure wherein the mediator was allowed to attain the value it would have under a possible counterfactual exposure intervention, under a pre-specified distribution of the mediator independent of any causes, or under a fixed controlled value. A final regression of the potential outcome on the exposure intervention variable was used to compute point estimates and bootstrap was used to obtain confidence intervals. Through contrasting different potential outcomes, this analytical framework provides an intuitive way of estimating effects under the recently introduced 3- and 4-way effect decomposition. This framework can be extended to complex multivariable and longitudinal mediation settings
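A minimal parametric g-computation sketch for the natural direct and indirect effects under linear models. Simulated data, no confounding and no exposure-mediator interaction are simplifying assumptions made here for brevity; the study itself covers richer effect decompositions and bootstrap confidence intervals.

```python
# Hedged sketch: parametric g-computation of natural direct/indirect effects
# with simulated data and simple linear models (no confounders, no interaction).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 20000
a = rng.binomial(1, 0.5, n)                          # exposure
m = 0.5 + 1.0 * a + rng.normal(0, 1, n)              # mediator model (true A -> M effect = 1.0)
y = 1.0 + 2.0 * a + 1.5 * m + rng.normal(0, 1, n)    # outcome model (direct 2.0, indirect 1.5)

# Fit parametric models for the mediator and the outcome.
fit_m = sm.OLS(m, np.column_stack([np.ones(n), a])).fit()
fit_y = sm.OLS(y, np.column_stack([np.ones(n), a, m])).fit()

def simulate_potential_outcome(a_set, a_mediator):
    """Mean of Y(a_set, M(a_mediator)): outcome under exposure a_set with the mediator
    drawn from its fitted distribution under exposure a_mediator."""
    Xm = np.column_stack([np.ones(n), np.full(n, float(a_mediator))])
    m_draw = fit_m.predict(Xm) + rng.normal(0, np.sqrt(fit_m.scale), n)
    Xy = np.column_stack([np.ones(n), np.full(n, float(a_set)), m_draw])
    return fit_y.predict(Xy).mean()

y11 = simulate_potential_outcome(1, 1)
y10 = simulate_potential_outcome(1, 0)
y00 = simulate_potential_outcome(0, 0)
print("natural direct effect   (NDE) ~ %.2f (true 2.0)" % (y10 - y00))
print("natural indirect effect (NIE) ~ %.2f (true 1.5)" % (y11 - y10))
```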
System and software safety analysis for the ERA control computer
International Nuclear Information System (INIS)
Beerthuizen, P.G.; Kruidhof, W.
2001-01-01
The European Robotic Arm (ERA) is a seven-degrees-of-freedom relocatable anthropomorphic robotic manipulator system, to be used in manned space operation on the International Space Station, supporting the assembly and external servicing of the Russian segment. The safety design concept and implementation of the ERA are described, in particular with respect to the central computer's software design. A top-down analysis and specification process is used to flow down the safety aspects of the ERA system towards the subsystems, which are produced by a consortium of companies in many countries. The user requirements documents and the critical function list are the key documents in this process. Bottom-up analysis (FMECA) and test, on both subsystem and system level, are the basis for safety verification. A number of examples show the use of the approach and methods used
Intra-articular calcaneal fractures: Computed tomographic analysis
International Nuclear Information System (INIS)
Rosenberg, Z.S.; Feldman, F.; Singson, R.D.
1987-01-01
Computed tomography (CT) analysis of 21 intra-articular calcaneal fractures categorized according to the Essex-Lopresti classification revealed the following distribution: joint depression-type 57%, comminuted type 43%, tongue-type 0%. The posterior calcaneal facet was fractured and/or depressed in 100% of the cases while the medial facet was involved in only 25% of the cases. CT proved superior to plain films by consistently demonstrating additional fracture components within each major category suggesting subclassifications which have potential prognostic value. CT allowed more expeditious handling of acutely injured patients, and improved preoperative planning, postoperative follow-up, and detailed analysis of causes for chronic residual pain. CT further identified significant soft tissue injuries such as peroneal tendon displacement which cannot be delineated on plain films. (orig.)
GUI program to compute probabilistic seismic hazard analysis
International Nuclear Information System (INIS)
Shin, Jin Soo; Chi, H. C.; Cho, J. C.; Park, J. H.; Kim, K. G.; Im, I. S.
2005-12-01
The first stage of development of a program to compute probabilistic seismic hazard has been completed based on a Graphical User Interface (GUI). The main program consists of three parts: the data input processes, the probabilistic seismic hazard analysis, and the result output processes. The first part has been developed and the others are under development in this term. The probabilistic seismic hazard analysis needs various input data which represent attenuation formulae, seismic zoning maps, and earthquake event catalogs. The input procedure of previous programs based on a text interface took much time to prepare the data, and the data could not be checked directly on screen to prevent erroneous input. The new program simplifies the input process and enables the data to be checked graphically in order to minimize artificial errors within the limits of the possibility
Computer-Aided Sustainable Process Synthesis-Design and Analysis
DEFF Research Database (Denmark)
Kumar Tula, Anjan
-groups is that, the performance of the entire process can be evaluated from the contributions of the individual process-groups towards the selected flowsheet property (for example, energy consumed). The developed flowsheet property models include energy consumption, carbon footprint, product recovery, product......Process synthesis involves the investigation of chemical reactions needed to produce the desired product, selection of the separation techniques needed for downstream processing, as well as taking decisions on sequencing the involved separation operations. For an effective, efficient and flexible...... focuses on the development and application of a computer-aided framework for sustainable synthesis-design and analysis of process flowsheets by generating feasible alternatives covering the entire search space and includes analysis tools for sustainability, LCA and economics. The synthesis method is based...
Computational Approaches for Integrative Analysis of the Metabolome and Microbiome
Directory of Open Access Journals (Sweden)
Jasmine Chong
2017-11-01
Full Text Available The study of the microbiome, the totality of all microbes inhabiting the host or an environmental niche, has experienced exponential growth over the past few years. The microbiome contributes functional genes and metabolites, and is an important factor for maintaining health. In this context, metabolomics is increasingly applied to complement sequencing-based approaches (marker genes or shotgun metagenomics to enable resolution of microbiome-conferred functionalities associated with health. However, analyzing the resulting multi-omics data remains a significant challenge in current microbiome studies. In this review, we provide an overview of different computational approaches that have been used in recent years for integrative analysis of metabolome and microbiome data, ranging from statistical correlation analysis to metabolic network-based modeling approaches. Throughout the process, we strive to present a unified conceptual framework for multi-omics integration and interpretation, as well as point out potential future directions.
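As a concrete example of the simplest class of approach mentioned above, pairwise correlation analysis, the sketch below computes Spearman correlations between a metabolite table and a taxon abundance table with Benjamini-Hochberg correction. The toy data, the planted association and the thresholds are assumptions only.

```python
# Hedged sketch: pairwise Spearman correlation between metabolite and microbial
# taxon abundances, with Benjamini-Hochberg (FDR) correction.  Toy data only.
import numpy as np
from scipy.stats import spearmanr
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(7)
n_samples, n_taxa, n_metabolites = 40, 6, 5
taxa = rng.lognormal(mean=0.0, sigma=1.0, size=(n_samples, n_taxa))
metabolites = rng.lognormal(mean=0.0, sigma=1.0, size=(n_samples, n_metabolites))
metabolites[:, 0] += 0.8 * taxa[:, 2]      # plant one true association for illustration

rows = []
for i in range(n_taxa):
    for j in range(n_metabolites):
        rho, p = spearmanr(taxa[:, i], metabolites[:, j])
        rows.append((i, j, rho, p))

pvals = [r[3] for r in rows]
reject, qvals, *_ = multipletests(pvals, alpha=0.05, method="fdr_bh")
for (i, j, rho, p), q, sig in zip(rows, qvals, reject):
    if sig:
        print(f"taxon {i} ~ metabolite {j}: rho={rho:.2f}, FDR q={q:.3g}")
```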
GUI program to compute probabilistic seismic hazard analysis
International Nuclear Information System (INIS)
Shin, Jin Soo; Chi, H. C.; Cho, J. C.; Park, J. H.; Kim, K. G.; Im, I. S.
2006-12-01
The development of a program to compute probabilistic seismic hazard has been completed based on a Graphical User Interface (GUI). The main program consists of three parts: the data input processes, the probabilistic seismic hazard analysis, and the result output processes. The probabilistic seismic hazard analysis needs various input data which represent attenuation formulae, seismic zoning maps, and earthquake event catalogs. The input procedure of previous programs based on a text interface took much time to prepare the data, and the data could not be checked directly on screen to prevent erroneous input. The new program simplifies the input process and enables the data to be checked graphically in order to minimize artificial errors within the limits of the possibility
Development Of The Computer Code For Comparative Neutron Activation Analysis
International Nuclear Information System (INIS)
Purwadi, Mohammad Dhandhang
2001-01-01
Qualitative and quantitative chemical analysis with Neutron Activation Analysis (NAA) is an important utilization of a nuclear research reactor, and its application and development should be accelerated and promoted to raise the utilization of the reactor. The application of the comparative NAA technique in the GA Siwabessy Multi Purpose Reactor (RSG-GAS) needs special software (not yet commercially available) for analyzing the spectra of multiple elements in a single analysis. Previously, the analysis was carried out using a single-spectrum analyzer, with each result compared manually. This method degrades the quality of the analysis significantly. To solve the problem, a computer code was designed and developed for comparative NAA. Spectrum analysis in the code is carried out using a non-linear fitting method. Before the spectrum is analyzed, it is passed through a numerical filter which improves the signal-to-noise ratio for the deconvolution operation. The software was developed using the G language and named PASAN-K. The developed software was benchmarked against the IAEA spectrum and operated well, with less than 10% deviation
Analysis of CERN computing infrastructure and monitoring data
Nieke, C.; Lassnig, M.; Menichetti, L.; Motesnitsalis, E.; Duellmann, D.
2015-12-01
Optimizing a computing infrastructure on the scale of LHC requires a quantitative understanding of a complex network of many different resources and services. For this purpose the CERN IT department and the LHC experiments are collecting a large multitude of logs and performance probes, which are already successfully used for short-term analysis (e.g. operational dashboards) within each group. The IT analytics working group has been created with the goal to bring data sources from different services and on different abstraction levels together and to implement a suitable infrastructure for mid- to long-term statistical analysis. It further provides a forum for joint optimization across single service boundaries and the exchange of analysis methods and tools. To simplify access to the collected data, we implemented an automated repository for cleaned and aggregated data sources based on the Hadoop ecosystem. This contribution describes some of the challenges encountered, such as dealing with heterogeneous data formats, selecting an efficient storage format for map reduce and external access, and will describe the repository user interface. Using this infrastructure we were able to quantitatively analyze the relationship between CPU/wall fraction, latency/throughput constraints of network and disk and the effective job throughput. In this contribution we will first describe the design of the shared analysis infrastructure and then present a summary of first analysis results from the combined data sources.
Computer-aided analysis of cutting processes for brittle materials
Ogorodnikov, A. I.; Tikhonov, I. N.
2017-12-01
This paper is focused on 3D computer simulation of cutting processes for brittle materials and silicon wafers. Computer-aided analysis of wafer scribing and dicing is carried out with the use of the ANSYS CAE (computer-aided engineering) software, and a parametric model of the processes is created by means of the internal ANSYS APDL programming language. Different types of tool tip geometry are analyzed to obtain internal stresses, such as a four-sided pyramid with an included angle of 120° and a tool inclination angle to the normal axis of 15°. The quality of the workpieces after cutting is studied by optical microscopy to verify the FE (finite-element) model. The disruption of the material structure during scribing occurs near the scratch and propagates into the wafer or over its surface at a short range. The deformation area along the scratch looks like a ragged band, but the stress width is rather low. The theory of cutting brittle semiconductor and optical materials is developed on the basis of the advanced theory of metal turning. The fall of stress intensity along the normal on the way from the tip point to the scribe line can be predicted using the developed theory and with the verified FE model. The crystal quality and dimensions of defects are determined by the mechanics of scratching, which depends on the shape of the diamond tip, the scratching direction, the velocity of the cutting tool and applied force loads. The disruption is a rate-sensitive process, and it depends on the cutting thickness. The application of numerical techniques, such as FE analysis, to cutting problems enhances understanding and promotes the further development of existing machining technologies.
Compendium of computer codes for the safety analysis of LMFBR's
International Nuclear Information System (INIS)
1975-06-01
A high level of mathematical sophistication is required in the safety analysis of LMFBR's to adequately meet the demands for realism and confidence in all areas of accident consequence evaluation. The numerical solution procedures associated with these analyses are generally so complex and time consuming as to necessitate their programming into computer codes. These computer codes have become extremely powerful tools for safety analysis, combining unique advantages in accuracy, speed and cost. The number, diversity and complexity of LMFBR safety codes in the U. S. has grown rapidly in recent years. It is estimated that over 100 such codes exist in various stages of development throughout the country. It is inevitable that such a large assortment of codes will require rigorous cataloguing and abstracting to aid individuals in identifying what is available. It is the purpose of this compendium to provide such a service through the compilation of code summaries which describe and clarify the status of domestic LMFBR safety codes. (U.S.)
Computational design analysis for deployment of cardiovascular stents
International Nuclear Information System (INIS)
Tammareddi, Sriram; Sun, Guangyong; Li, Qing
2010-01-01
Cardiovascular disease has become a major global healthcare problem. As one of the relatively new medical devices, stents offer a minimally-invasive surgical strategy to improve the quality of life for numerous cardiovascular disease patients. One of the key issues has been to understand the effect of stent structures on deployment behaviour. This paper aims to develop a computational model for exploring the biomechanical responses to changes in stent geometrical parameters, namely the strut thickness and cross-link width of the Palmaz-Schatz stent. Explicit 3D dynamic finite element analysis was carried out to explore the sensitivity of deployment performance, such as dog-boning, fore-shortening, and stent deformation over the load cycle, to these geometrical parameters. It was found that an increase in strut thickness causes a sizeable rise in the load required to deform the stent to its target diameter, whilst reducing maximum dog-boning in the stent. An increase in the cross-link width showed no change in the load required to deform the stent to its target diameter and no apparent correlation with dog-boning, but fore-shortening increased with increasing cross-link width. The computational modelling and analysis presented herein provides an effective way to refine or optimise the design of stent structures.
Computer enhanced release scenario analysis for a nuclear waste repository
International Nuclear Information System (INIS)
Stottlemyre, J.A.; Petrie, G.M.; Mullen, M.F.
1979-01-01
An interactive (user-oriented) computer tool is being developed at PNL to assist in the analysis of release scenarios for long-term safety assessment of a continental geologic nuclear waste repository. Emphasis is on characterizing the various ways the geologic and hydrologic system surrounding a repository might vary over the 10^6 to 10^7 years subsequent to final closure of the cavern. The potential disruptive phenomena are categorized as natural geologic and man-caused, and tend to be synergistic in nature. The computer tool is designed to permit simulation of the system response as a function of the ongoing disruptive phenomena and time. It is designed to be operated in a deterministic manner, i.e., user selection of the desired scenarios and associated rate, magnitude, and lag time data, or in a stochastic mode. The stochastic mode involves establishing distributions for individual phenomena occurrence probabilities, rates, magnitudes, and phase relationships. A Monte Carlo technique is then employed to generate a multitude of disruptive event scenarios, scan for breaches of the repository isolation, and develop input to the release consequence analysis task. To date, only a simplified one-dimensional version of the code has been completed. Significant modification and development is required to expand its dimensionality and apply the tool to any specific site.
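A minimal sketch of the stochastic mode described above, assuming hypothetical disruptive phenomena, occurrence rates, magnitudes and a simple breach criterion (none of which come from the PNL tool):

```python
# Monte Carlo sampling of disruptive-event scenarios and counting of repository breaches.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical phenomena: (annual occurrence rate, magnitude range in metres of barrier removed per event).
PHENOMENA = {
    "faulting":        (1e-6, (1.0, 50.0)),
    "erosion":         (1e-4, (0.01, 0.5)),
    "human_intrusion": (5e-6, (5.0, 100.0)),
}
HORIZON_YEARS = 1_000_000
BARRIER_THICKNESS_M = 300.0     # breach criterion: total degradation exceeds barrier thickness (assumed)
N_TRIALS = 10_000

breaches = 0
for _ in range(N_TRIALS):
    removed = 0.0
    for rate, (lo, hi) in PHENOMENA.values():
        n_events = rng.poisson(rate * HORIZON_YEARS)          # occurrences over the horizon
        removed += rng.uniform(lo, hi, size=n_events).sum()   # accumulated barrier degradation
    if removed >= BARRIER_THICKNESS_M:
        breaches += 1

print(f"estimated breach probability over {HORIZON_YEARS:.0e} years: {breaches / N_TRIALS:.3f}")
```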
A compendium of computer codes in fault tree analysis
International Nuclear Information System (INIS)
Lydell, B.
1981-03-01
In the past ten years principles and methods for a unified system reliability and safety analysis have been developed. Fault tree techniques serve as a central feature of unified system analysis, and there exists a specific discipline within system reliability concerned with the theoretical aspects of fault tree evaluation. Ever since the fault tree concept was established, computer codes have been developed for qualitative and quantitative analyses. In particular the presentation of the kinetic tree theory and the PREP-KITT code package has influenced the present use of fault trees and the development of new computer codes. This report is a compilation of some of the better known fault tree codes in use in system reliability. Numerous codes are available and new codes are continuously being developed. The report is designed to address the specific characteristics of each code listed. A review of the theoretical aspects of fault tree evaluation is presented in an introductory chapter, the purpose of which is to give a framework for the validity of the different codes. (Auth.)
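For orientation, a minimal sketch of the kind of quantitative evaluation such codes perform: a top-event probability computed from minimal cut sets using the rare-event approximation and the min-cut upper bound. The basic events, probabilities and cut sets are hypothetical.

```python
# Quantitative fault tree evaluation from minimal cut sets (hypothetical data).
import math

basic_events = {"pump_fails": 1e-3, "valve_stuck": 5e-4, "power_loss": 2e-4, "operator_error": 1e-2}

minimal_cut_sets = [
    {"pump_fails", "valve_stuck"},
    {"power_loss"},
    {"pump_fails", "operator_error"},
]

def cut_set_prob(cut_set):
    """Probability of a cut set assuming independent basic events."""
    p = 1.0
    for event in cut_set:
        p *= basic_events[event]
    return p

cut_probs = [cut_set_prob(cs) for cs in minimal_cut_sets]

rare_event = sum(cut_probs)                                   # first-order (rare-event) approximation
mcub = 1.0 - math.prod(1.0 - q for q in cut_probs)            # min-cut upper bound

print(f"rare-event approximation: {rare_event:.3e}")
print(f"min-cut upper bound:      {mcub:.3e}")
```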
Automatic quantitative analysis of liver functions by a computer system
International Nuclear Information System (INIS)
Shinpo, Takako
1984-01-01
In the previous paper, we confirmed the clinical usefulness of hepatic clearance (hepatic blood flow), which is the hepatic uptake and blood disappearance rate coefficients. These were obtained by the initial slope index of each minute during a period of five frames of a hepatogram by injecting sup(99m)Tc-Sn-colloid 37 MBq. To analyze the information simply, rapidly and accurately, we developed an automatic quantitative analysis for liver functions. Information was obtained every quarter minute during a period of 60 frames of the sequential image. The sequential counts were measured for the heart, the whole liver, and the left and right lobes using a computer connected to a scintillation camera. We measured the effective hepatic blood flow from the disappearance rate multiplied by the percentage of hepatic uptake, defined as (liver counts)/(total counts of the field). Our method of analysis automatically recorded the graphs of the disappearance curve and uptake curve on the basis of the heart and the whole liver, respectively, and the computations were performed in BASIC. This method makes it possible to obtain the image of the initial uptake of sup(99m)Tc-Sn-colloid into the liver with a small dose. (author)
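An illustrative sketch of the quantity defined above: an effective hepatic blood flow index obtained as the blood disappearance rate constant (from the initial slope of the heart time-activity curve) multiplied by the fractional hepatic uptake (liver counts divided by total counts). All counts below are synthetic, not clinical data.

```python
# Effective hepatic blood flow index from heart and liver region-of-interest counts.
import numpy as np

t = np.arange(0, 5.0, 0.25)                        # minutes, one frame per quarter minute
heart_counts = 10000 * np.exp(-0.35 * t) + np.random.normal(0, 50, t.size)
liver_counts_at_t = 65000.0                        # liver ROI counts at the chosen frame (assumed)
total_counts_at_t = 100000.0                       # whole-field counts at the same frame (assumed)

# Disappearance rate constant k from a log-linear fit to the early heart curve.
early = t <= 2.0
k = -np.polyfit(t[early], np.log(heart_counts[early]), 1)[0]    # per minute

hepatic_uptake_fraction = liver_counts_at_t / total_counts_at_t
ehbf_index = k * hepatic_uptake_fraction                        # relative index, per minute

print(f"k = {k:.3f} /min, uptake fraction = {hepatic_uptake_fraction:.2f}, EHBF index = {ehbf_index:.3f} /min")
```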
Nuclear power reactor analysis, methods, algorithms and computer programs
International Nuclear Information System (INIS)
Matausek, M.V
1981-01-01
Full text: For a developing country buying its first nuclear power plants from a foreign supplier, disregarding the type and scope of the contract, there is a certain number of activities which have to be performed by local staff and domestic organizations. This particularly applies to the choice of the nuclear fuel cycle strategy and the choice of the type and size of the reactors, to bid parameters specification, bid evaluation and final safety analysis report evaluation, as well as to in-core fuel management activities. In the Nuclear Engineering Department of the Boris Kidric Institute of Nuclear Sciences (NET IBK) continual work is going on related to the following topics: cross section and resonance integral calculations, spectrum calculations, generation of group constants, lattice and cell problems, criticality and global power distribution search, fuel burnup analysis, in-core fuel management procedures, cost analysis and power plant economics, safety and accident analysis, shielding problems and environmental impact studies, etc. The present paper gives the details of the methods developed and the results achieved, with particular emphasis on the NET IBK computer program package for the needs of planning, construction and operation of nuclear power plants. The main problems encountered so far were related to a small working team, lack of large and powerful computers, absence of reliable basic nuclear data and shortage of experimental and empirical results for testing theoretical models. Some of these difficulties have been overcome thanks to bilateral and multilateral cooperation with developed countries, mostly through the IAEA. It is the authors' opinion, however, that mutual cooperation of developing countries, having similar problems and similar goals, could lead to significant results. Some activities of this kind are suggested and discussed. (author)
Analysis of parallel computing performance of the code MCNP
International Nuclear Information System (INIS)
Wang Lei; Wang Kan; Yu Ganglin
2006-01-01
Parallel computing can reduce the running time of the code MCNP effectively. With the MPI message passing software, MCNP5 can perform parallel computing on a PC cluster with the Windows operating system. The parallel computing performance of MCNP is influenced by factors such as the type, the complexity level and the parameter configuration of the computing problem. This paper analyzes the parallel computing performance of MCNP with respect to these factors and gives measures to improve the MCNP parallel computing performance. (authors)
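The usual figures of merit for such a study are speedup and parallel efficiency; the short sketch below computes them from wall-clock times, which here are invented for illustration rather than measured MCNP results.

```python
# Speedup and parallel efficiency from wall-clock times (hypothetical measurements).
wall_clock = {1: 3600.0, 2: 1860.0, 4: 980.0, 8: 540.0, 16: 330.0}  # seconds per run

t1 = wall_clock[1]
for n_procs, t_n in sorted(wall_clock.items()):
    speedup = t1 / t_n
    efficiency = speedup / n_procs
    print(f"{n_procs:2d} processes: speedup = {speedup:5.2f}, efficiency = {efficiency:5.1%}")
```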
Multiscale analysis of nonlinear systems using computational homology
Energy Technology Data Exchange (ETDEWEB)
Konstantin Mischaikow; Michael Schatz; William Kalies; Thomas Wanner
2010-05-24
This is a collaborative project between the principal investigators. However, as is to be expected, different PIs have greater focus on different aspects of the project. This report lists these major directions of research which were pursued during the funding period: (1) Computational Homology in Fluids - For the computational homology effort in thermal convection, the focus of the work during the first two years of the funding period included: (1) A clear demonstration that homology can sensitively detect the presence or absence of an important flow symmetry, (2) An investigation of homology as a probe for flow dynamics, and (3) The construction of a new convection apparatus for probing the effects of large-aspect-ratio. (2) Computational Homology in Cardiac Dynamics - We have initiated an effort to test the use of homology in characterizing data from both laboratory experiments and numerical simulations of arrhythmia in the heart. Recently, the use of high speed, high sensitivity digital imaging in conjunction with voltage sensitive fluorescent dyes has enabled researchers to visualize electrical activity on the surface of cardiac tissue, both in vitro and in vivo. (3) Magnetohydrodynamics - A new research direction is to use computational homology to analyze results of large scale simulations of 2D turbulence in the presence of magnetic fields. Such simulations are relevant to the dynamics of black hole accretion disks. The complex flow patterns from simulations exhibit strong qualitative changes as a function of magnetic field strength. Efforts to characterize the pattern changes using Fourier methods and wavelet analysis have been unsuccessful. (4) Granular Flow - Two experts in the area of granular media are studying 2D model experiments of earthquake dynamics where the stress fields can be measured; these stress fields form complex patterns of 'force chains' that may be amenable to analysis using computational homology. (5) Microstructure
Multiscale analysis of nonlinear systems using computational homology
Energy Technology Data Exchange (ETDEWEB)
Konstantin Mischaikow, Rutgers University/Georgia Institute of Technology; Michael Schatz, Georgia Institute of Technology; William Kalies, Florida Atlantic University; Thomas Wanner, George Mason University
2010-05-19
This is a collaborative project between the principal investigators. However, as is to be expected, different PIs have greater focus on different aspects of the project. This report lists these major directions of research which were pursued during the funding period: (1) Computational Homology in Fluids - For the computational homology effort in thermal convection, the focus of the work during the first two years of the funding period included: (1) A clear demonstration that homology can sensitively detect the presence or absence of an important flow symmetry, (2) An investigation of homology as a probe for flow dynamics, and (3) The construction of a new convection apparatus for probing the effects of large-aspect-ratio. (2) Computational Homology in Cardiac Dynamics - We have initiated an effort to test the use of homology in characterizing data from both laboratory experiments and numerical simulations of arrhythmia in the heart. Recently, the use of high speed, high sensitivity digital imaging in conjunction with voltage sensitive fluorescent dyes has enabled researchers to visualize electrical activity on the surface of cardiac tissue, both in vitro and in vivo. (3) Magnetohydrodynamics - A new research direction is to use computational homology to analyze results of large scale simulations of 2D turbulence in the presence of magnetic fields. Such simulations are relevant to the dynamics of black hole accretion disks. The complex flow patterns from simulations exhibit strong qualitative changes as a function of magnetic field strength. Efforts to characterize the pattern changes using Fourier methods and wavelet analysis have been unsuccessful. (4) Granular Flow - Two experts in the area of granular media are studying 2D model experiments of earthquake dynamics where the stress fields can be measured; these stress fields form complex patterns of 'force chains' that may be amenable to analysis using computational homology. (5) Microstructure
Markov analysis of different standby computer based systems
International Nuclear Information System (INIS)
Srinivas, G.; Guptan, Rajee; Mohan, Nalini; Ghadge, S.G.; Bajaj, S.S.
2006-01-01
As against the conventional triplicated systems of hardware and the generation of control signals for the actuator elements by means of redundant hardwired median circuits employed in the early Indian PHWR's, a new approach of generating control signals based on software by a redundant system of computers has been introduced in the advanced/current generation of Indian PHWR's. Reliability is increased by fault diagnostics and automatic switch-over of all the loads to one computer in case of total failure of the other computer. Independent processing by a redundant CPU in each system enables inter-comparison to quickly identify system failure, in addition to the other self-diagnostic features provided. Combinatorial models such as reliability block diagrams and fault trees are frequently used to predict the reliability, maintainability and safety of complex systems. Unfortunately, these methods cannot accurately model dynamic system behavior. Because of its unique ability to handle dynamic cases, Markov analysis can be a powerful tool in the reliability, maintainability and safety (RMS) analyses of dynamic systems. A Markov model breaks the system configuration into a number of states. Each of these states is connected to all other states by transition rates. It then utilizes transition matrices to evaluate the reliability and safety of the systems, either through matrix manipulation or other analytical solution methods, such as Laplace transforms. Thus, Markov analysis is a powerful reliability, maintainability and safety analysis tool. It allows the analyst to model complex, dynamic, highly distributed, fault-tolerant systems that would otherwise be very difficult to model using classical techniques like the fault tree method. The Dual Processor Hot Standby Process Control System (DPHS-PCS) and the Computerized Channel Temperature Monitoring System (CCTM) are typical examples of hot standby systems in the Indian PHWR's. While such systems currently in use in Indian PHWR
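A minimal sketch of the Markov approach described above, applied to a dual hot-standby configuration with three states (both computers up, one up, both down) and assumed constant failure and repair rates; the generator matrix is exponentiated to obtain the state probabilities at a given time. The rates are hypothetical, not DPHS-PCS data.

```python
# Continuous-time Markov model of a dual hot-standby computer system.
import numpy as np
from scipy.linalg import expm

lam, mu = 1e-4, 0.1          # failure and repair rates per hour, per computer (assumed)

# Generator matrix Q: Q[i, j] is the transition rate from state i to state j.
# States: 0 = both computers up, 1 = one up, 2 = both down.
Q = np.array([
    [-2 * lam,      2 * lam,  0.0],
    [      mu, -(lam + mu),   lam],
    [     0.0,          mu,   -mu],
])

p0 = np.array([1.0, 0.0, 0.0])        # start with both computers healthy
t = 8760.0                            # one year of operation, in hours
p_t = p0 @ expm(Q * t)                # state probabilities at time t

print(f"P(both up) = {p_t[0]:.4f}, P(one up) = {p_t[1]:.4f}, P(system down) = {p_t[2]:.6f}")
```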
Novel computational approaches for the analysis of cosmic magnetic fields
Energy Technology Data Exchange (ETDEWEB)
Saveliev, Andrey [Universitaet Hamburg, Hamburg (Germany); Keldysh Institut, Moskau (Russian Federation)
2016-07-01
In order to give a consistent picture of cosmic, i.e. galactic and extragalactic, magnetic fields, different approaches are possible and often even necessary. Here we present three of them: First, a semianalytic analysis of the time evolution of primordial magnetic fields from which their properties and, subsequently, the nature of present-day intergalactic magnetic fields may be deduced. Second, the use of high-performance computing infrastructure by developing powerful algorithms for (magneto-)hydrodynamic simulations and applying them to astrophysical problems. We are currently developing a code which applies kinetic schemes in massive parallel computing on high performance multiprocessor systems in a new way to calculate both hydro- and electrodynamic quantities. Finally, as a third approach, astroparticle physics might be used, as magnetic fields leave imprints of their properties on charged particles traversing them. Here we focus on electromagnetic cascades by developing software based on CRPropa which simulates the propagation of particles from such cascades through the intergalactic medium in three dimensions. This may in particular be used to obtain information about the helicity of extragalactic magnetic fields.
NALDA (Naval Aviation Logistics Data Analysis) CAI (computer aided instruction)
Energy Technology Data Exchange (ETDEWEB)
Handler, B.H. (Oak Ridge K-25 Site, TN (USA)); France, P.A.; Frey, S.C.; Gaubas, N.F.; Hyland, K.J.; Lindsey, A.M.; Manley, D.O. (Oak Ridge Associated Universities, Inc., TN (USA)); Hunnum, W.H. (North Carolina Univ., Chapel Hill, NC (USA)); Smith, D.L. (Memphis State Univ., TN (USA))
1990-07-01
Data Systems Engineering Organization (DSEO) personnel developed a prototype computer-aided instruction (CAI) system for the Naval Aviation Logistics Data Analysis (NALDA) system. The objective of this project was to provide a CAI prototype that could be used as an enhancement to existing NALDA training. The CAI prototype project was performed in phases. The task undertaken in Phase I was to analyze the problem and the alternative solutions and to develop a set of recommendations on how best to proceed. The findings from Phase I are documented in Recommended CAI Approach for the NALDA System (Duncan et al., 1987). In Phase II, a structured design and specifications were developed, and a prototype CAI system was created. A report, NALDA CAI Prototype: Phase II Final Report, was written to record the findings and results of Phase II. NALDA CAI: Recommendations for an Advanced Instructional Model is comprised of related papers encompassing research on computer-aided instruction (CAI), newly developing training technologies, instructional systems development, and an Advanced Instructional Model. These topics were selected because of their relevancy to the CAI needs of NALDA. These papers provide general background information on various aspects of CAI and give a broad overview of new technologies and their impact on the future design and development of training programs. The papers within have been indexed separately elsewhere.
Shell stability analysis in a computer aided engineering (CAE) environment
Arbocz, J.; Hol, J. M. A. M.
1993-01-01
The development of 'DISDECO', the Delft Interactive Shell DEsign COde, is described. The purpose of this project is to make the accumulated theoretical, numerical and practical knowledge of the last 25 years or so readily accessible to users interested in the analysis of buckling-sensitive structures. With this open-ended, hierarchical, interactive computer code the user can successively access programs of increasing complexity from his workstation. The computational modules currently operational in DISDECO provide the prospective user with facilities to calculate the critical buckling loads of stiffened anisotropic shells under combined loading, to investigate the effects the various types of boundary conditions will have on the critical load, and to get a complete picture of the degrading effects the different shapes of possible initial imperfections might cause, all in one interactive session. Once a design is finalized, its collapse load can be verified by running a large refined model remotely from the workstation with one of the current generation of two-dimensional codes, with advanced capabilities to handle both geometric and material nonlinearities.
Computational Analysis of the G-III Laminar Flow Glove
Malik, Mujeeb R.; Liao, Wei; Lee-Rausch, Elizabeth M.; Li, Fei; Choudhari, Meelan M.; Chang, Chau-Lyan
2011-01-01
Under NASA's Environmentally Responsible Aviation Project, flight experiments are planned with the primary objective of demonstrating the Discrete Roughness Elements (DRE) technology for passive laminar flow control at chord Reynolds numbers relevant to transport aircraft. In this paper, we present a preliminary computational assessment of the Gulfstream-III (G-III) aircraft wing-glove designed to attain natural laminar flow for a leading-edge sweep angle of 34.6°. Analysis for a flight Mach number of 0.75 shows that it should be possible to achieve natural laminar flow for twice the transition Reynolds number ever achieved at this sweep angle. However, the wing-glove needs to be redesigned to effectively demonstrate passive laminar flow control using DREs. As a by-product of the computational assessment, the effect of surface curvature on stationary crossflow disturbances is found to be strongly stabilizing for the current design, and it is suggested that convex surface curvature could be used as a control parameter for natural laminar flow design, provided transition occurs via stationary crossflow disturbances.
National survey on dose data analysis in computed tomography.
Heilmaier, Christina; Treier, Reto; Merkle, Elmar Max; Alkhadi, Hatem; Weishaupt, Dominik; Schindera, Sebastian
2018-05-28
A nationwide survey was performed assessing current practice of dose data analysis in computed tomography (CT). All radiological departments in Switzerland were asked to participate in the on-line survey composed of 19 questions (16 multiple choice, 3 free text). It consisted of four sections: (1) general information on the department, (2) dose data analysis, (3) use of a dose management software (DMS) and (4) radiation protection activities. In total, 152 out of 241 Swiss radiological departments filled in the whole questionnaire (return rate, 63%). Seventy-nine per cent of the departments (n = 120/152) analyse dose data on a regular basis, with considerable heterogeneity in the frequency (1-2 times per year, 45%, n = 54/120; every month, 35%, n = 42/120) and method of analysis. Manual analysis is carried out by 58% (n = 70/120) compared with 42% (n = 50/120) of departments using a DMS. Purchase of a DMS is planned by 43% (n = 30/70) of the departments with manual analysis. Real-time analysis of dose data is performed by 42% (n = 21/50) of the departments with a DMS; however, residents can access the DMS in clinical routine in only 20% (n = 10/50) of the departments. An interdisciplinary dose team, which among other things communicates dose data internally (63%, n = 76/120) and externally, is already implemented in 57% (n = 68/120) of departments. Swiss radiological departments are committed to radiation safety. However, there is high heterogeneity among them regarding the frequency and method of dose data analysis as well as the use of DMS and radiation protection activities. • Swiss radiological departments are committed to and interested in radiation safety, as shown by the 63% return rate of the survey. • Seventy-nine per cent of departments analyse dose data on a regular basis with differences in the frequency and method of analysis: 42% use a dose management software, while 58% currently perform manual dose data analysis. Of the latter, 43% plan to buy a dose
Computer codes for the analysis of flask impact problems
International Nuclear Information System (INIS)
Neilson, A.J.
1984-09-01
This review identifies typical features of the design of transportation flasks and considers some of the analytical tools required for the analysis of impact events. Because of the complexity of the physical problem, it is unlikely that a single code will adequately deal with all the aspects of the impact incident. Candidate codes are identified on the basis of current understanding of their strengths and limitations. It is concluded that the HONDO-II, DYNA3D and ABAQUS codes, which are already mounted on UKAEA computers, will be suitable tools for use in the analysis of experiments conducted in the proposed AEEW programme and of general flask impact problems. Initial attention should be directed at the DYNA3D and ABAQUS codes, with HONDO-II being reserved for situations where the three-dimensional elements of DYNA3D may provide uneconomic simulations in planar or axisymmetric geometries. Attention is drawn to the importance of access to suitable mesh generators to create the nodal coordinate and element topology data required by these structural analysis codes. (author)
Detection of Organophosphorus Pesticides with Colorimetry and Computer Image Analysis.
Li, Yanjie; Hou, Changjun; Lei, Jincan; Deng, Bo; Huang, Jing; Yang, Mei
2016-01-01
Organophosphorus pesticides (OPs) represent a very important class of pesticides that are widely used in agriculture because of their relatively high-performance and moderate environmental persistence, hence the sensitive and specific detection of OPs is highly significant. Based on the inhibitory effect of acetylcholinesterase (AChE) induced by inhibitors, including OPs and carbamates, a colorimetric analysis was used for detection of OPs with computer image analysis of color density in CMYK (cyan, magenta, yellow and black) color space and non-linear modeling. The results showed that there was a gradually weakened trend of yellow intensity with the increase of the concentration of dichlorvos. The quantitative analysis of dichlorvos was achieved by Artificial Neural Network (ANN) modeling, and the results showed that the established model had a good predictive ability between training sets and predictive sets. Real cabbage samples containing dichlorvos were detected by colorimetry and gas chromatography (GC), respectively. The results showed that there was no significant difference between colorimetry and GC (P > 0.05). The experiments of accuracy, precision and repeatability revealed good performance for detection of OPs. AChE can also be inhibited by carbamates, and therefore this method has potential applications in real samples for OPs and carbamates because of high selectivity and sensitivity.
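An illustrative sketch of the analysis pipeline described above: the yellow channel of the CMYK decomposition is computed from the mean RGB of the test zone, and a small neural network maps yellow intensity to concentration. The calibration points below are hypothetical, not data from the paper.

```python
# CMYK yellow-channel extraction plus a small ANN calibration model (synthetic data).
import numpy as np
from sklearn.neural_network import MLPRegressor

def rgb_to_yellow(r, g, b):
    """Yellow component of the CMYK decomposition of a normalized RGB value."""
    k = 1.0 - max(r, g, b)
    if k >= 1.0:
        return 0.0
    return (1.0 - b - k) / (1.0 - k)

# Hypothetical calibration set: dichlorvos concentration (mg/L) vs mean RGB of the test zone.
conc = np.array([0.0, 0.1, 0.25, 0.5, 1.0, 2.0, 4.0])
rgb = np.array([
    (0.95, 0.92, 0.20), (0.94, 0.91, 0.28), (0.93, 0.90, 0.38),
    (0.92, 0.89, 0.50), (0.91, 0.88, 0.62), (0.90, 0.88, 0.74), (0.90, 0.87, 0.83),
])
yellow = np.array([rgb_to_yellow(*px) for px in rgb])   # yellow intensity falls as concentration rises

# Non-linear model: ANN mapping yellow intensity -> concentration.
model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=20000, random_state=0)
model.fit(yellow.reshape(-1, 1), conc)

print("predicted concentration at yellow = 0.45:", model.predict([[0.45]])[0])
```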
Automated computer analysis of plasma-streak traces from SCYLLAC
International Nuclear Information System (INIS)
Whitman, R.L.; Jahoda, F.C.; Kruger, R.P.
1977-01-01
An automated computer analysis technique that locates and references the approximate centroid of single- or dual-streak traces from the Los Alamos Scientific Laboratory SCYLLAC facility is described. The technique also determines the plasma-trace width over a limited self-adjusting region. The plasma traces are recorded with streak cameras on Polaroid film, then scanned and digitized for processing. The analysis technique uses scene segmentation to separate the plasma trace from a reference fiducial trace. The technique employs two methods of peak detection; one for the plasma trace and one for the fiducial trace. The width is obtained using an edge-detection, or slope, method. Timing data are derived from the intensity modulation of the fiducial trace. To smooth (despike) the output graphs showing the plasma-trace centroid and width, a technique of ''twicing'' developed by Tukey was employed. In addition, an interactive sorting algorithm allows retrieval of the centroid, width, and fiducial data from any test shot plasma for post analysis. As yet, only a limited set of sixteen plasma traces has been processed using this technique
Automated computer analysis of plasma-streak traces from SCYLLAC
International Nuclear Information System (INIS)
Whiteman, R.L.; Jahoda, F.C.; Kruger, R.P.
1977-11-01
An automated computer analysis technique that locates and references the approximate centroid of single- or dual-streak traces from the Los Alamos Scientific Laboratory SCYLLAC facility is described. The technique also determines the plasma-trace width over a limited self-adjusting region. The plasma traces are recorded with streak cameras on Polaroid film, then scanned and digitized for processing. The analysis technique uses scene segmentation to separate the plasma trace from a reference fiducial trace. The technique employs two methods of peak detection; one for the plasma trace and one for the fiducial trace. The width is obtained using an edge-detection, or slope, method. Timing data are derived from the intensity modulation of the fiducial trace. To smooth (despike) the output graphs showing the plasma-trace centroid and width, a technique of ''twicing'' developed by Tukey was employed. In addition, an interactive sorting algorithm allows retrieval of the centroid, width, and fiducial data from any test shot plasma for post analysis. As yet, only a limited set of the plasma traces has been processed with this technique
Contingency Analysis Post-Processing With Advanced Computing and Visualization
Energy Technology Data Exchange (ETDEWEB)
Chen, Yousu; Glaesemann, Kurt; Fitzhenry, Erin
2017-07-01
Contingency analysis is a critical function widely used in energy management systems to assess the impact of power system component failures. Its outputs are important for power system operation for improved situational awareness, power system planning studies, and power market operations. With the increased complexity of power system modeling and simulation caused by increased energy production and demand, the penetration of renewable energy and fast deployment of smart grid devices, and the trend of operating grids closer to their capacity for better efficiency, more and more contingencies must be executed and analyzed quickly in order to ensure grid reliability and accuracy for the power market. Currently, many researchers have proposed different techniques to accelerate the computational speed of contingency analysis, but not much work has been published on how to post-process the large amount of contingency outputs quickly. This paper proposes a parallel post-processing function that can analyze contingency analysis outputs faster and display them in a web-based visualization tool to help power engineers improve their work efficiency by fast information digestion. Case studies using an ESCA-60 bus system and a WECC planning system are presented to demonstrate the functionality of the parallel post-processing technique and the web-based visualization tool.
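A minimal sketch (not the implementation described above) of parallel post-processing of contingency outputs, in which worker processes scan result files for limit violations and the flagged contingencies are aggregated; the file format, directory layout and threshold are assumptions.

```python
# Parallel scan of contingency analysis output files for branch overloads.
import glob
import json
from multiprocessing import Pool

BRANCH_LIMIT_MVA = 100.0   # assumed flat limit for the example

def scan_contingency(path):
    """Return (contingency id, list of overloaded branches) for one output file."""
    with open(path) as fh:
        result = json.load(fh)   # assumed format: {"id": ..., "branch_flows": {name: MVA}}
    overloads = [name for name, flow in result["branch_flows"].items() if flow > BRANCH_LIMIT_MVA]
    return result["id"], overloads

if __name__ == "__main__":
    files = glob.glob("contingency_outputs/*.json")
    with Pool() as pool:
        flagged = [(cid, over) for cid, over in pool.map(scan_contingency, files) if over]
    print(f"{len(flagged)} contingencies with overloads out of {len(files)}")
```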
Computer-aided target tracking in motion analysis studies
Burdick, Dominic C.; Marcuse, M. L.; Mislan, J. D.
1990-08-01
Motion analysis studies require the precise tracking of reference objects in sequential scenes. In a typical situation, events of interest are captured at high frame rates using special cameras, and selected objects or targets are tracked on a frame by frame basis to provide necessary data for motion reconstruction. Tracking is usually done using manual methods which are slow and prone to error. A computer based image analysis system has been developed that performs tracking automatically. The objective of this work was to eliminate the bottleneck due to manual methods in high volume tracking applications such as the analysis of crash test films for the automotive industry. The system has proven to be successful in tracking standard fiducial targets and other objects in crash test scenes. Over 95 percent of target positions which could be located using manual methods can be tracked by the system, with a significant improvement in throughput over manual methods. Future work will focus on the tracking of clusters of targets and on tracking deformable objects such as airbags.
Data analysis using the Gnu R system for statistical computation
Energy Technology Data Exchange (ETDEWEB)
Simone, James; /Fermilab
2011-07-01
R is a language system for statistical computation. It is widely used in statistics, bioinformatics, machine learning, data mining, quantitative finance, and the analysis of clinical drug trials. Among the advantages of R are: it has become the standard language for developing statistical techniques, it is being actively developed by a large and growing global user community, it is open source software, it is highly portable (Linux, OS-X and Windows), it has a built-in documentation system, it produces high quality graphics and it is easily extensible with over four thousand extension library packages available covering statistics and applications. This report gives a very brief introduction to R with some examples using lattice QCD simulation results. It then discusses the development of R packages designed for chi-square minimization fits for lattice n-pt correlation functions.
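As an illustration of the kind of chi-square minimization fit mentioned above (shown here in Python rather than R, for consistency with the other sketches in this compilation), a single-exponential model is fitted to a synthetic two-point correlation function:

```python
# Chi-square (weighted least-squares) fit of a lattice-style 2-pt correlator, C(t) = A exp(-m t).
import numpy as np
from scipy.optimize import curve_fit

def corr_model(t, amplitude, mass):
    return amplitude * np.exp(-mass * t)

t = np.arange(1, 16)
true_amp, true_mass = 1.2, 0.35                       # assumed "true" parameters for the synthetic data
sigma = 0.02 * corr_model(t, true_amp, true_mass)     # 2% relative errors (assumed)
corr = corr_model(t, true_amp, true_mass) + np.random.normal(0, sigma)

popt, pcov = curve_fit(corr_model, t, corr, sigma=sigma, absolute_sigma=True, p0=[1.0, 0.5])
chi2 = np.sum(((corr - corr_model(t, *popt)) / sigma) ** 2)
dof = t.size - len(popt)
print(f"m = {popt[1]:.4f} +/- {np.sqrt(pcov[1, 1]):.4f}, chi2/dof = {chi2 / dof:.2f}")
```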
Time Series Analysis, Modeling and Applications A Computational Intelligence Perspective
Chen, Shyi-Ming
2013-01-01
Temporal and spatiotemporal data form an inherent fabric of the society as we are faced with streams of data coming from numerous sensors, data feeds, recordings associated with numerous areas of application embracing physical and human-generated phenomena (environmental data, financial markets, Internet activities, etc.). A quest for a thorough analysis, interpretation, modeling and prediction of time series comes with an ongoing challenge for developing models that are both accurate and user-friendly (interpretable). The volume is aimed to exploit the conceptual and algorithmic framework of Computational Intelligence (CI) to form a cohesive and comprehensive environment for building models of time series. The contributions covered in the volume are fully reflective of the wealth of the CI technologies by bringing together ideas, algorithms, and numeric studies, which convincingly demonstrate their relevance, maturity and visible usefulness. It reflects upon the truly remarkable diversity of methodological a...
Computer and Internet Addiction: Analysis and Classification of Approaches
Directory of Open Access Journals (Sweden)
Zaretskaya O.V.
2017-08-01
The theoretical analysis of modern research works on the problem of computer and Internet addiction is carried out. The main features of different approaches are outlined. The attempt is made to systematize the research conducted and to classify scientific approaches to the problem of Internet addiction. The author distinguishes nosological, cognitive-behavioral, socio-psychological and dialectical approaches. She justifies the need to use an approach that corresponds to the essence, goals and tasks of social psychology when researching problems such as Internet addiction and dependent behavior in general. In the opinion of the author, the dialectical approach integrates the experience of research within the framework of the socio-psychological approach and focuses on the observed inconsistencies in the phenomenon of Internet addiction – the compensatory nature of Internet activity, when people who are interested in the Internet are in a dysfunctional life situation.
Computational Fluid Dynamics Analysis of an Evaporative Cooling System
Directory of Open Access Journals (Sweden)
Kapilan N.
2016-11-01
The use of chlorofluorocarbon-based refrigerants in air-conditioning systems increases global warming and contributes to climate change. Climate change is expected to present a number of challenges for the built environment, and an evaporative cooling system is one of the simplest and most environmentally friendly cooling systems. The evaporative cooling system is most widely used in summer and in rural and urban areas of India for human comfort. In an evaporative cooling system, the addition of water into air reduces the temperature of the air, as the energy needed to evaporate the water is taken from the air. Computational fluid dynamics (CFD) is a numerical analysis technique and was used to analyse the evaporative cooling system. The CFD results match the experimental results.
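A back-of-the-envelope illustration of the statement that evaporation draws its latent heat from the air stream, using an adiabatic energy balance with assumed mass flows and property values:

```python
# Adiabatic evaporative cooling: sensible heat given up by the air equals latent heat of evaporation.
CP_AIR = 1.006e3       # J/(kg K), specific heat of dry air
H_FG = 2.45e6          # J/kg, latent heat of vaporisation of water near 30 degC

def outlet_temperature(t_in_c, m_air_kg_s, m_water_evap_kg_s):
    """Air temperature after evaporating m_water into the stream (assumed fully adiabatic)."""
    return t_in_c - (m_water_evap_kg_s * H_FG) / (m_air_kg_s * CP_AIR)

print(outlet_temperature(t_in_c=40.0, m_air_kg_s=1.0, m_water_evap_kg_s=0.004))  # about 30.3 degC
```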
Web Pages Content Analysis Using Browser-Based Volunteer Computing
Directory of Open Access Journals (Sweden)
Wojciech Turek
2013-01-01
Existing solutions to the problem of finding valuable information on the Web suffer from several limitations like simplified query languages, out-of-date information or arbitrary results sorting. In this paper a different approach to this problem is described. It is based on the idea of distributed processing of Web pages content. To provide sufficient performance, the idea of browser-based volunteer computing is utilized, which requires the implementation of text processing algorithms in JavaScript. In this paper the architecture of the Web pages content analysis system is presented, details concerning the implementation of the system and the text processing algorithms are described, and test results are provided.
TEABAGS: computer programs for instrumental neutron activation analysis
Energy Technology Data Exchange (ETDEWEB)
Lindstrom, D J [Washington Univ., St. Louis, MO (USA); Korotev, R L [Washington Univ., St. Louis, MO (USA). McDonnell Center for the Space Sciences
1982-01-01
Described is a series of INAA data reduction programs collectively known as TEABAGS (Trace Element Analysis By Automated Gamma-ray Spectrometry). The programs are written in FORTRAN and run on a Nuclear Data ND-6620 computer system, but should be adaptable to any medium-sized minicomputer. They are designed to monitor the status of all spectra obtained from samples and comparison standards irradiated together and to do all pending calculations without operator intervention. Major emphasis is placed on finding all peaks in the spectrum, properly identifying all nuclides present and all contributors to each peak, determining accurate estimates of the background continua under peaks, and producing realistic uncertainties on peak areas and final abundances.
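A minimal sketch of the peak bookkeeping described above: the net photopeak area is the gross area minus a background continuum estimated from channels on either side, with a counting-statistics uncertainty propagated to the result. Channel windows and counts are hypothetical.

```python
# Net photopeak area and counting-statistics uncertainty from a gamma-ray spectrum segment.
import numpy as np

spectrum = np.random.poisson(50, 1024).astype(float)
spectrum[500:511] += np.random.poisson(300, 11)            # synthetic photopeak

peak = slice(500, 511)                                     # peak window (11 channels)
left, right = slice(490, 500), slice(511, 521)             # background windows either side

gross = spectrum[peak].sum()
bkg_per_channel = (spectrum[left].mean() + spectrum[right].mean()) / 2.0
background = bkg_per_channel * (peak.stop - peak.start)

net_area = gross - background
# Propagated counting uncertainty: var(net) = var(gross) + var(background estimate)
n_bkg_channels = (left.stop - left.start) + (right.stop - right.start)
sigma_net = np.sqrt(gross + background * (peak.stop - peak.start) / n_bkg_channels)

print(f"net peak area = {net_area:.0f} +/- {sigma_net:.0f} counts")
```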
Computational Methods for Sensitivity and Uncertainty Analysis in Criticality Safety
International Nuclear Information System (INIS)
Broadhead, B.L.; Childs, R.L.; Rearden, B.T.
1999-01-01
Interest in the sensitivity methods that were developed and widely used in the 1970s (the FORSS methodology at ORNL among others) has increased recently as a result of potential use in the area of criticality safety data validation procedures to define computational bias, uncertainties and area(s) of applicability. Functional forms of the resulting sensitivity coefficients can be used as formal parameters in the determination of applicability of benchmark experiments to their corresponding industrial application areas. In order for these techniques to be generally useful to the criticality safety practitioner, the procedures governing their use had to be updated and simplified. This paper will describe the resulting sensitivity analysis tools that have been generated for potential use by the criticality safety community
Analysis of 3D crack propagation by microfocus computed tomography
International Nuclear Information System (INIS)
Ao Bo; Chen Fuxing; Deng Cuizhen; Zeng Yabin
2014-01-01
The three-point bending test of notched specimens of 2A50 forged aluminum was performed on a high-frequency fatigue tester, and the surface cracks at different stages were analyzed and compared by SEM. The crack was reconstructed by microfocus computed tomography, and its size, position and distribution were visually displayed through 3D visualization. The crack propagation behavior was studied through the gray values and positions of the crack front in 2D CT images at two adjacent stages, and the results show that crack propagation is irregular. Projection images of the crack were obtained by projecting the crack at each of the two stages onto a reference plane; a significant increase in new crack propagation was observed compared with the previous projection, and the distribution curves of the crack front at the two stages were displayed. The 3D increment distribution of the crack front propagation was obtained through the 3D crack analysis of the two stages. (authors)
Satellite interference analysis and simulation using personal computers
Kantak, Anil
1988-03-01
This report presents the complete analysis and formulas necessary to quantify the interference experienced by a generic satellite communications receiving station due to an interfering satellite. Both satellites, the desired as well as the interfering satellite, are considered to be in elliptical orbits. Formulas are developed for the satellite look angles and the satellite transmit angles generally related to the land mask of the receiving station site for both satellites. Formulas for considering Doppler effect due to the satellite motion as well as the Earth's rotation are developed. The effect of the interfering-satellite signal modulation and the Doppler effect on the power received are considered. The statistical formulation of the interference effect is presented in the form of a histogram of the interference to the desired signal power ratio. Finally, a computer program suitable for microcomputers such as IBM AT is provided with the flowchart, a sample run, results of the run, and the program code.
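One small worked example of the formulas involved: the Doppler shift of the interfering carrier seen at the receiving station for a given line-of-sight range-rate. The carrier frequency and range-rates are illustrative only.

```python
# Doppler shift of a satellite carrier from the line-of-sight range-rate.
C = 299_792_458.0            # speed of light, m/s

def doppler_shift(carrier_hz, range_rate_m_s):
    """Received-frequency offset; positive range-rate means the satellite is receding."""
    return -carrier_hz * range_rate_m_s / C

carrier = 12.0e9             # 12 GHz downlink (assumed)
for v in (-3000.0, 0.0, 3000.0):     # range-rates in m/s, illustrative of non-geostationary geometry
    print(f"range-rate {v:+7.0f} m/s -> Doppler shift {doppler_shift(carrier, v):+9.1f} Hz")
```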
M. Kasemann
CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned, it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhE-DEx topology. Since mid-February, a transfer volume of about 12 P...
Computer-assisted sperm analysis (CASA): capabilities and potential developments.
Amann, Rupert P; Waberski, Dagmar
2014-01-01
Computer-assisted sperm analysis (CASA) systems have evolved over approximately 40 years, through advances in devices to capture the image from a microscope, huge increases in computational power concurrent with amazing reduction in size of computers, new computer languages, and updated/expanded software algorithms. Remarkably, basic concepts for identifying sperm and their motion patterns are little changed. Older and slower systems remain in use. Most major spermatology laboratories and semen processing facilities have a CASA system, but the extent of reliance thereon ranges widely. This review describes capabilities and limitations of present CASA technology used with boar, bull, and stallion sperm, followed by possible future developments. Each marketed system is different. Modern CASA systems can automatically view multiple fields in a shallow specimen chamber to capture strobe-like images of 500 to >2000 sperm, at 50 or 60 frames per second, in clear or complex extenders, and record information for ≥ 30 frames, providing summary data for each spermatozoon and the population. A few systems evaluate sperm morphology concurrent with motion. CASA cannot accurately predict 'fertility' that will be obtained with a semen sample or subject. However, when carefully validated, current CASA systems provide information important for quality assurance of semen planned for marketing, and for the understanding of the diversity of sperm responses to changes in the microenvironment in research. The four take-home messages from this review are: (1) animal species, extender or medium, specimen chamber, intensity of illumination, imaging hardware and software, instrument settings, technician, etc., all affect accuracy and precision of output values; (2) semen production facilities probably do not need a substantially different CASA system whereas biology laboratories would benefit from systems capable of imaging and tracking sperm in deep chambers for a flexible period of time
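For reference, the basic kinematic values a CASA system derives from a centroid track captured at a fixed frame rate can be sketched as follows; the track coordinates are synthetic and the parameter definitions (VCL, VSL, LIN) are the standard ones rather than any particular vendor's implementation.

```python
# Curvilinear velocity (VCL), straight-line velocity (VSL) and linearity (LIN) from a centroid track.
import numpy as np

frame_rate = 60.0                                    # frames per second
track = np.array([                                   # centroid positions in micrometres (synthetic)
    (0.0, 0.0), (2.1, 1.0), (4.0, -0.5), (6.2, 1.2), (8.1, -0.3), (10.0, 0.8),
])

steps = np.diff(track, axis=0)
path_length = np.sum(np.linalg.norm(steps, axis=1))          # along the actual trajectory
straight_distance = np.linalg.norm(track[-1] - track[0])     # first point to last point
duration = (len(track) - 1) / frame_rate

vcl = path_length / duration          # curvilinear velocity, um/s
vsl = straight_distance / duration    # straight-line velocity, um/s
lin = vsl / vcl                       # linearity, 0..1

print(f"VCL = {vcl:.1f} um/s, VSL = {vsl:.1f} um/s, LIN = {lin:.2f}")
```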
Lonchamp, Jacques
2010-01-01
Computer-based interaction analysis (IA) is an automatic process that aims at understanding a computer-mediated activity. In a CSCL system, computer-based IA can provide information directly to learners for self-assessment and regulation and to tutors for coaching support. This article proposes a customizable computer-based IA approach for a…
Probabilistic evaluations for CANTUP computer code analysis improvement
International Nuclear Information System (INIS)
Florea, S.; Pavelescu, M.
2004-01-01
Structural analysis with the finite element method is today a usual way to evaluate and predict the behavior of structural assemblies subject to hard conditions, in order to ensure their safety and reliability during operation. A CANDU 600 fuel channel is an example of an assembly working in hard conditions, in which, besides corrosive and thermal aggression, long-term irradiation interferes, with implicit consequences for the evolution of material properties. That leads inevitably to scattering of the time-dependent material properties, their dynamic evolution being subject to a great degree of uncertainty. These are the reasons for developing, in association with deterministic evaluations with computer codes, probabilistic and statistical methods in order to predict the structural component response. This work initiates the possibility of extending the deterministic thermomechanical evaluation of fuel channel components to a probabilistic structural mechanics approach, starting with the deterministic analysis performed with the CANTUP computer code, which is a code developed to predict the long-term mechanical behavior of the pressure tube - calandria tube assembly. To this purpose the structure of the deterministic CANTUP computer code has been reviewed. The code has been adapted from the LAHEY 77 platform to the Microsoft Developer Studio - Fortran PowerStation platform. In order to perform probabilistic evaluations, a part was added to the deterministic code which, using a subroutine from the IMSL library of the Microsoft Developer Studio - Fortran PowerStation platform, generates pseudo-random values of a specified variable. A normal distribution around the deterministic value with a 5% standard deviation was simulated for the Young's modulus material property in order to verify the statistical calculation of the creep behavior. The tube deflection and effective stresses were the properties subject to probabilistic evaluation. All the values of these properties obtained for all the values for
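A minimal sketch of the probabilistic step described above: Young's modulus is sampled from a normal distribution centred on the deterministic value with a 5% standard deviation and propagated through a response quantity. The deterministic values and the simple deflection-versus-stiffness relation are assumptions for illustration, not the CANTUP model.

```python
# Monte Carlo propagation of Young's modulus scatter to a structural response quantity.
import numpy as np

rng = np.random.default_rng(42)

E_DET = 97.0e9                     # deterministic Young's modulus, Pa (assumed)
DEFLECTION_DET = 2.0e-3            # deterministic tube deflection, m (assumed)

e_samples = rng.normal(loc=E_DET, scale=0.05 * E_DET, size=10_000)
deflections = DEFLECTION_DET * E_DET / e_samples    # stiffer tube -> smaller deflection (assumed 1/E scaling)

print(f"deflection mean = {deflections.mean():.4e} m, "
      f"std = {deflections.std():.4e} m, "
      f"95th percentile = {np.percentile(deflections, 95):.4e} m")
```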
Summary of research in applied mathematics, numerical analysis, and computer sciences
1986-01-01
The major categories of current ICASE research programs addressed include: numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; control and parameter identification problems, with emphasis on effective numerical methods; computational problems in engineering and physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and computer systems and software, especially vector and parallel computers.
Computer assisted analysis of medical x-ray images
Bengtsson, Ewert
1996-01-01
X-rays were originally used to expose film. The early computers did not have enough capacity to handle images with useful resolution. The rapid development of computer technology over the last few decades has, however, led to the introduction of computers into radiology. In this overview paper, the various possible roles of computers in radiology are examined. The state of the art is briefly presented, and some predictions about the future are made.
Integrated severe accident containment analysis with the CONTAIN computer code
International Nuclear Information System (INIS)
Bergeron, K.D.; Williams, D.C.; Rexroth, P.E.; Tills, J.L.
1985-12-01
Analysis of physical and radiological conditions inside the containment building during a severe (core-melt) nuclear reactor accident requires quantitative evaluation of numerous highly disparate yet coupled phenomenologies. These include two-phase thermodynamics and thermal-hydraulics, aerosol physics, fission product phenomena, core-concrete interactions, the formation and combustion of flammable gases, and performance of engineered safety features. In the past, this complexity has meant that a complete containment analysis would require application of suites of separate computer codes, each of which would treat only a narrower subset of these phenomena, e.g., a thermal-hydraulics code, an aerosol code, a core-concrete interaction code, etc. In this paper, we describe the development and some recent applications of the CONTAIN code, which offers an integrated treatment of the dominant containment phenomena and the interactions among them. We describe the results of a series of containment phenomenology studies, based upon realistic accident sequence analyses in actual plants. These calculations highlight various phenomenological effects that have potentially important implications for source term and/or containment loading issues, and which are difficult or impossible to treat using a less integrated code suite
Reliability of Computer Analysis of Electrocardiograms (ECG) of ...
African Journals Online (AJOL)
Background: Computer programmes have been introduced to electrocardiography (ECG) with most physicians in Africa depending on computer interpretation of ECG. This study was undertaken to evaluate the reliability of computer interpretation of the 12-Lead ECG in the Black race. Methodology: Using the SCHILLER ...
RADTRAN 5 - A computer code for transportation risk analysis
International Nuclear Information System (INIS)
Neuhauser, K.S.; Kanipe, F.L.
1993-01-01
The RADTRAN 5 computer code has been developed to estimate radiological and nonradiological risks of radioactive materials transportation. RADTRAN 5 is written in ANSI standard FORTRAN 77; the code contains significant advances in the methodology first pioneered with the LINK option of RADTRAN 4. A major application of the LINK methodology is route-specific analysis. Another application is comparisons of attributes along the same route segments. Nonradiological risk factors have been incorporated to allow users to estimate nonradiological fatalities and injuries that might occur during the transportation event(s) being analyzed. These fatalities include prompt accidental fatalities from mechanical causes. Values of these risk factors for the United States have been made available in the code as optional defaults. Several new health effects models have been published in the wake of the Hiroshima-Nagasaki dosimetry reassessment, and this has emphasized the need for flexibility in the RADTRAN approach to health-effects calculations. Therefore, the basic set of health-effects conversion equations in RADTRAN have been made user-definable. All parameter values can be changed by the user, but a complete set of default values are available for both the new International Commission on Radiation Protection model (ICRP Publication 60) and the recent model of the U.S. National Research Council's Committee on the Biological Effects of Radiation (BEIR V). The meteorological input data tables have been modified to permit optional entry of maximum downwind distances for each dose isopleth. The expected dose to an individual in each isodose area is also calculated and printed automatically. Examples are given that illustrate the power and flexibility of the RADTRAN 5 computer code. (J.P.N.)
Genome Assembly and Computational Analysis Pipelines for Bacterial Pathogens
Rangkuti, Farania Gama Ardhina
2011-06-01
Pathogens lie behind the deadliest pandemics in history. To date, the AIDS pandemic has resulted in more than 25 million fatal cases, while tuberculosis and malaria annually claim more than 2 million lives. Comparative genomic analyses are needed to gain insights into the molecular mechanisms of pathogens, but the abundance of biological data dictates that such studies cannot be performed without the assistance of computational approaches. This explains the significant need for computational pipelines for genome assembly and analyses. The aim of this research is to develop such pipelines. This work utilizes various bioinformatics approaches to analyze the high-throughput genomic sequence data that has been obtained from several strains of bacterial pathogens. A pipeline has been compiled for quality control for sequencing and assembly, and several protocols have been developed to detect contaminations. Visualizations of genomic data have been generated in various formats, in addition to alignment, homology detection and sequence variant detection. We have also implemented a metaheuristic algorithm that significantly improves bacterial genome assemblies compared to other known methods. Experiments on Mycobacterium tuberculosis H37Rv data showed that our method resulted in an improvement of the N50 value of up to 9697% while consistently maintaining high accuracy, covering around 98% of the published reference genome. Other improvement efforts were also implemented, consisting of iterative local assemblies and iterative correction of contiguated bases. Our result expedites the genomic analysis of virulence genes down to single-base-pair resolution. It is also applicable to virtually every pathogenic microorganism, propelling further research in the control of and protection from pathogen-associated diseases.
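A small worked example of the N50 statistic quoted above (the contig length such that contigs of that length or longer contain at least half of the total assembly); the contig lengths are hypothetical.

```python
# N50 of an assembly from its contig lengths.
def n50(contig_lengths):
    lengths = sorted(contig_lengths, reverse=True)
    half_total = sum(lengths) / 2.0
    running = 0
    for length in lengths:
        running += length
        if running >= half_total:
            return length

print(n50([150_000, 90_000, 60_000, 30_000, 10_000, 5_000]))  # -> 90000
```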
M. Kasemann
Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...
1975-12-01
Frequency domain computer programs developed or acquired by TSC for the analysis of rail vehicle dynamics are described in two volumes. Volume I defines the general analytical capabilities required for computer programs applicable to single rail vehi...
Basic design of parallel computational program for probabilistic structural analysis
International Nuclear Information System (INIS)
Kaji, Yoshiyuki; Arai, Taketoshi; Gu, Wenwei; Nakamura, Hitoshi
1999-06-01
In our laboratory, for 'development of damage evaluation method of structural brittle materials by microscopic fracture mechanics and probabilistic theory' (nuclear computational science cross-over research), we examine computational methods related to a super parallel computation system coupled with material strength theory based on microscopic fracture mechanics for latent cracks and a continuum structural model, in order to develop new structural reliability evaluation methods for ceramic structures. This technical report reviews probabilistic structural mechanics theory, basic terms and formulas, and the parallel computation programming methods related to the principal items in the basic design of the computational mechanics program. (author)
Basic design of parallel computational program for probabilistic structural analysis
Energy Technology Data Exchange (ETDEWEB)
Kaji, Yoshiyuki; Arai, Taketoshi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Gu, Wenwei; Nakamura, Hitoshi
1999-06-01
In our laboratory, as part of the 'development of a damage evaluation method for brittle structural materials by microscopic fracture mechanics and probabilistic theory' (nuclear computational science cross-over research), we are examining computational methods for a massively parallel computation system that couples a material strength theory, based on microscopic fracture mechanics for latent cracks, with a continuum structural model, in order to develop new structural reliability evaluation methods for ceramic structures. This technical report reviews probabilistic structural mechanics theory, the basic terms of the formulation, and the parallel programming methods that are relevant to the principal items in the basic design of the computational mechanics program. (author)
COMTA - a computer code for fuel mechanical and thermal analysis
International Nuclear Information System (INIS)
Basu, S.; Sawhney, S.S.; Anand, A.K.; Anantharaman, K.; Mehta, S.K.
1979-01-01
COMTA is a generalized computer code for integrity analysis of free-standing fuel cladding with natural UO2 or mixed-oxide fuel pellets. Thermal and mechanical analyses are performed simultaneously for any power history of the fuel pin. For the analysis, the fuel cladding is assumed to be axisymmetric and is subjected to axisymmetric loads due to contact pressure, gas pressure, coolant pressure and thermal loads. Axial variation of load is neglected, and creep and plasticity are assumed to occur at constant volume. The pellet is assumed to be made of concentric annuli. The fission gas release integral depends on the temperature and the power produced in each annulus. To calculate the temperature distribution in the fuel pin, the variation of bulk coolant temperature is given as an input to the code. Gap conductance is calculated at every time step, considering fuel densification, fuel relocation and gap closure, filler gas dilution by released fission gas, gap closure by expansion, and irradiation swelling. Overall gap conductance comprises heat transfer by three modes, conduction, convection and radiation, as per the modified Ross and Stoute model. Equilibrium equations, compatibility equations and stress-strain relationships (including thermal strains and permanent strains due to creep and plasticity) are used to obtain triaxial stresses and strains. Thermal strain is assumed to be zero at hot zero power conditions. The boundary conditions for radial stresses at the outside and inside surfaces are obtained by setting them equal to the coolant pressure and internal pressure, respectively. A multi-mechanism creep model which accounts for thermal and irradiation creep is used to calculate the overall creep rate. Effective plastic strain is a function of effective stress and material constants. (orig.)
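As a rough illustration of the gap-conductance step, the sketch below shows the gas-conduction and radiation contributions in the spirit of a Ross and Stoute type model. The property values and the exact correlation used by COMTA are not given in the abstract, so everything below is illustrative only:

```python
STEFAN_BOLTZMANN = 5.670e-8  # W/m^2/K^4

def gap_conductance(gap_width_m, k_gas, t_fuel_surface, t_clad_inner,
                    emiss_fuel=0.8, emiss_clad=0.3):
    """Illustrative gap conductance: gas conduction plus radiation terms.

    The actual modified Ross and Stoute model also includes solid contact
    conductance and temperature-jump distances; those terms are omitted here.
    """
    h_gas = k_gas / gap_width_m  # conduction through the filler gas
    emissivity_factor = 1.0 / (1.0 / emiss_fuel + 1.0 / emiss_clad - 1.0)
    t1, t2 = t_fuel_surface, t_clad_inner
    h_rad = STEFAN_BOLTZMANN * emissivity_factor * (t1**2 + t2**2) * (t1 + t2)
    return h_gas + h_rad

# Example: 50 micron helium-filled gap, fuel surface at 800 K, clad at 600 K
print(gap_conductance(50e-6, k_gas=0.3, t_fuel_surface=800.0, t_clad_inner=600.0))
```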
Bonham, Kevin S; Stefan, Melanie I
2017-10-01
While women are generally underrepresented in STEM fields, there are noticeable differences between fields. For instance, the gender ratio in biology is more balanced than in computer science. We were interested in how this difference is reflected in the interdisciplinary field of computational/quantitative biology. To this end, we examined the proportion of female authors in publications from the PubMed and arXiv databases. There are fewer female authors on research papers in computational biology, as compared to biology in general. This is true across authorship position, year, and journal impact factor. A comparison with arXiv shows that quantitative biology papers have a higher ratio of female authors than computer science papers, placing computational biology in between its two parent fields in terms of gender representation. Both in biology and in computational biology, a female last author increases the probability of other authors on the paper being female, pointing to a potential role of female PIs in influencing the gender balance.
Directory of Open Access Journals (Sweden)
Kevin S Bonham
2017-10-01
Full Text Available While women are generally underrepresented in STEM fields, there are noticeable differences between fields. For instance, the gender ratio in biology is more balanced than in computer science. We were interested in how this difference is reflected in the interdisciplinary field of computational/quantitative biology. To this end, we examined the proportion of female authors in publications from the PubMed and arXiv databases. There are fewer female authors on research papers in computational biology, as compared to biology in general. This is true across authorship position, year, and journal impact factor. A comparison with arXiv shows that quantitative biology papers have a higher ratio of female authors than computer science papers, placing computational biology in between its two parent fields in terms of gender representation. Both in biology and in computational biology, a female last author increases the probability of other authors on the paper being female, pointing to a potential role of female PIs in influencing the gender balance.
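A minimal sketch of the kind of tabulation described above, assuming a table with one row per author that records field, authorship position, and inferred gender; the databases, gender-inference method and data layout used by the authors are not specified here, so the columns and values are invented for illustration:

```python
import pandas as pd

# Hypothetical per-author records: one row per (paper, author position)
authors = pd.DataFrame({
    "paper_id":  [1, 1, 2, 2, 3, 3, 3],
    "field":     ["comp_bio"] * 4 + ["biology"] * 3,
    "position":  ["first", "last", "first", "last", "first", "middle", "last"],
    "is_female": [True, False, False, False, True, True, True],
})

# Proportion of female authors by field and authorship position
print(authors.groupby(["field", "position"])["is_female"].mean())

# Does a female last author raise the share of other female authors on the paper?
last = authors[authors.position == "last"].set_index("paper_id")["is_female"]
others = authors[authors.position != "last"].groupby("paper_id")["is_female"].mean()
print(others.groupby(last).mean())  # mean female share, split by last-author gender
```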
Trident: scalable compute archives: workflows, visualization, and analysis
Gopu, Arvind; Hayashi, Soichi; Young, Michael D.; Kotulla, Ralf; Henschel, Robert; Harbeck, Daniel
2016-08-01
The Astronomy scientific community has embraced Big Data processing challenges, e.g. those associated with time-domain astronomy, and come up with a variety of novel and efficient data processing solutions. However, data processing is only a small part of the Big Data challenge. Efficient knowledge discovery and scientific advancement in the Big Data era requires new and equally efficient tools: modern user interfaces for searching, identifying and viewing data online without direct access to the data; tracking of data provenance; searching, plotting and analyzing metadata; interactive visual analysis, especially of (time-dependent) image data; and the ability to execute pipelines on supercomputing and cloud resources with minimal user overhead or expertise, even for novice computing users. The Trident project at Indiana University offers a comprehensive web and cloud-based microservice software suite that enables the straightforward deployment of highly customized Scalable Compute Archive (SCA) systems, including extensive visualization and analysis capabilities, with a minimal amount of additional coding. Trident seamlessly scales up or down in terms of data volumes and computational needs, and allows feature sets within a web user interface to be quickly adapted to meet individual project requirements. Domain experts only have to provide code or business logic about handling/visualizing their domain's data products and about executing their pipelines and application workflows. Trident's microservices architecture is made up of light-weight services connected by a REST API and/or a message bus; web interface elements are built using NodeJS, AngularJS, and HighCharts JavaScript libraries among others, while backend services are written in NodeJS, PHP/Zend, and Python. The software suite currently consists of (1) a simple workflow execution framework to integrate, deploy, and execute pipelines and applications, (2) a progress service to monitor workflows and sub
Computer analysis and comparison of chess players' game-playing styles
Krevs, Urša
2015-01-01
Today's computer chess programs are very good at evaluating chess positions. Research has shown that we can rank chess players by the quality of their game play, using a computer chess program. In the master's thesis Computer analysis and comparison of chess players' game-playing styles, we focus on the content analysis of chess games using a computer chess program's evaluation and attributes we determined for each individual position. We defined meaningful attributes that can be used for com...
Performance analysis of cloud computing services for many-tasks scientific computing
Iosup, A.; Ostermann, S.; Yigitbasi, M.N.; Prodan, R.; Fahringer, T.; Epema, D.H.J.
2011-01-01
Cloud computing is an emerging commercial infrastructure paradigm that promises to eliminate the need for maintaining expensive computing facilities by companies and institutes alike. Through the use of virtualization and resource time sharing, clouds serve with a single set of physical resources a
A performance analysis of EC2 cloud computing services for scientific computing
Ostermann, S.; Iosup, A.; Yigitbasi, M.N.; Prodan, R.; Fahringer, T.; Epema, D.H.J.; Avresky, D.; Diaz, M.; Bode, A.; Bruno, C.; Dekel, E.
2010-01-01
Cloud Computing is emerging today as a commercial infrastructure that eliminates the need for maintaining expensive computing hardware. Through the use of virtualization, clouds promise to address with the same shared set of physical resources a large user base with different needs. Thus, clouds
Gold-standard for computer-assisted morphological sperm analysis.
Chang, Violeta; Garcia, Alejandra; Hitschfeld, Nancy; Härtel, Steffen
2017-04-01
Published algorithms for classification of human sperm heads are based on relatively small image databases that are not open to the public, so no direct comparison is available for competing methods. We describe a gold-standard for morphological sperm analysis (SCIAN-MorphoSpermGS), a dataset of sperm head images with expert-classification labels in one of the following classes: normal, tapered, pyriform, small or amorphous. This gold-standard is for evaluating and comparing known techniques and future improvements to present approaches for classification of human sperm heads for semen analysis. Although this paper does not provide a computational tool for morphological sperm analysis, we present a set of experiments comparing common sperm head description and classification techniques. This classification baseline is intended as a reference for future improvements to present approaches for human sperm head classification. The gold-standard provides a label for each sperm head, obtained by majority voting among experts. The classification baseline compares four supervised learning methods (1-Nearest Neighbor, naive Bayes, decision trees and Support Vector Machine (SVM)) and three shape-based descriptors (Hu moments, Zernike moments and Fourier descriptors), reporting the accuracy and the true positive rate for each experiment. We used Fleiss' Kappa Coefficient to evaluate the inter-expert agreement and Fisher's exact test for inter-expert variability and statistically significant differences between descriptors and learning techniques. Our results confirm the high degree of inter-expert variability in morphological sperm analysis. Regarding the classification baseline, we show that none of the standard descriptors or classification approaches is best suited for tackling the problem of sperm head classification. We discovered that the correct classification rate was highly variable when trying to discriminate among non-normal sperm
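A minimal sketch of the kind of baseline the abstract describes, pairing one shape descriptor (Hu moments, via OpenCV) with one classifier (an SVM, via scikit-learn). The image loading and feature pipeline are hypothetical; the actual SCIAN-MorphoSpermGS experiments may differ:

```python
import cv2
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def hu_features(gray_image):
    """Seven Hu moment invariants of a grayscale sperm-head image."""
    moments = cv2.moments(gray_image)
    hu = cv2.HuMoments(moments).flatten()
    # Log-scale the moments so they are comparable in magnitude
    return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)

# images: list of grayscale numpy arrays; labels: expert majority-vote classes
# (normal, tapered, pyriform, small, amorphous) -- loading is left out here
def baseline_accuracy(images, labels):
    X = np.array([hu_features(img) for img in images])
    clf = SVC(kernel="rbf", C=1.0, gamma="scale")
    return cross_val_score(clf, X, labels, cv=5).mean()
```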
Computational analysis on plug-in hybrid electric motorcycle chassis
Teoh, S. J.; Bakar, R. A.; Gan, L. M.
2013-12-01
The plug-in hybrid electric motorcycle (PHEM) is an alternative that promotes sustainability and lower emissions. However, the overall PHEM system packaging is constrained by the limited space in a motorcycle chassis. In this paper, a chassis applying the concept of a chopper is analysed for application in a PHEM. The three-dimensional (3D) chassis model is built with CAD software. The PHEM power-train components and drive-train mechanisms are integrated into the 3D model to ensure the chassis provides sufficient space. Besides that, a human dummy model is built into the 3D model to ensure the rider's ergonomics and comfort. The chassis 3D model then undergoes stress-strain simulation. The simulation predicts the stress distribution, displacement and factor of safety (FOS). The data are used to identify the critical points and thus to judge whether the chassis design is adequate or needs to be redesigned or modified to meet the required strength. Critical points are the locations of highest stress, which might cause the chassis to fail; for a motorcycle chassis they occur at the triple-tree joints and the rear-absorber bracket. In conclusion, the computational analysis predicts the stress distribution and provides a guideline for developing a safe prototype chassis.
Automatic analysis of gamma spectra using a desk computer
International Nuclear Information System (INIS)
Rocca, H.C.
1976-10-01
A code for the analysis of gamma spectra obtained with a Ge(Li) detector was developed for use with a desk computer (Hewlett-Packard Model 9810 A). The process is performed in a totally automatic way: data are conveniently smoothed and the background is generated by a convolutive equation. A calibration of the equipment with well-known standard sources gives the data necessary for fitting a third-degree equation by least squares, relating the energy to the peak position. Criteria are given for determining whether a given group of values constitutes a peak and whether it is a double line. All peaks are fitted to a Gaussian curve and, if necessary, decomposed into their components. Data entry is by punched tape, ASCII code. An alphanumeric printer provides (a) the position of the peak and its energy, (b) its resolution if it is larger than expected, (c) the area of the peak with its statistical error determined by the method of Wasson. Optionally, the complete spectrum with the determined background can be plotted. (author) [es
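A minimal sketch of two of the steps described: a third-degree least-squares energy calibration and a Gaussian fit to an isolated peak. The channel and energy values below are invented for illustration and are not taken from the original code:

```python
import numpy as np
from scipy.optimize import curve_fit

# Third-degree least-squares calibration: energy as a function of peak channel
channels = np.array([121.3, 344.9, 778.8, 964.0, 1408.1])   # hypothetical peak positions
energies = np.array([121.8, 344.3, 778.9, 964.1, 1408.0])   # keV, Eu-152-like lines
calib = np.polyfit(channels, energies, deg=3)
energy_of = np.poly1d(calib)

def gaussian(x, area, centroid, sigma, bg):
    """Gaussian peak on a flat background."""
    return area / (sigma * np.sqrt(2 * np.pi)) * np.exp(-0.5 * ((x - centroid) / sigma) ** 2) + bg

def fit_peak(region_channels, region_counts):
    """Fit one isolated photopeak region (numpy arrays of channel, counts)."""
    guess = [region_counts.sum(), region_channels[np.argmax(region_counts)], 2.0, region_counts.min()]
    params, _ = curve_fit(gaussian, region_channels, region_counts, p0=guess)
    area, centroid, sigma, bg = params
    return area, energy_of(centroid), 2.355 * sigma  # net area, energy (keV), FWHM (channels)
```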
Recent advances in computational structural reliability analysis methods
Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.
1993-10-01
The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known, and it usually results in overly conservative designs because of compounding conservatisms. Furthermore, problem parameters that control the reliability are not identified, nor is their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of which was directed towards the prediction of failure probabilities for single mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.
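The abstract contrasts safety factors with explicit failure probabilities. As a concrete illustration of the probability-of-failure viewpoint, here is a minimal Monte Carlo sketch for a single stress-versus-strength limit state; the distributions are chosen purely for illustration and this brute-force approach is not one of the advanced methods surveyed in the paper:

```python
import numpy as np

rng = np.random.default_rng(seed=0)
n_samples = 1_000_000

# Illustrative random variables: applied stress and material strength (MPa)
stress = rng.normal(loc=300.0, scale=40.0, size=n_samples)
strength = rng.lognormal(mean=np.log(450.0), sigma=0.08, size=n_samples)

# Limit state g = strength - stress; failure when g < 0
failures = np.count_nonzero(strength - stress < 0.0)
pf = failures / n_samples
print(f"estimated probability of failure: {pf:.2e}")
```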
Methods and computer codes for probabilistic sensitivity and uncertainty analysis
International Nuclear Information System (INIS)
Vaurio, J.K.
1985-01-01
This paper describes the methods and applications experience with two computer codes that are now available from the National Energy Software Center at Argonne National Laboratory. The purpose of the SCREEN code is to identify a group of most important input variables of a code that has many (tens, hundreds) input variables with uncertainties, and to do this without relying on judgment or exhaustive sensitivity studies. The purpose of the PROSA-2 code is to propagate uncertainties and calculate the distributions of interesting output variable(s) of a safety analysis code using response surface techniques, based on the same runs used for screening. Several applications are discussed, but the codes are generic, not tailored to any specific safety application code. They are compatible in terms of input/output requirements but also independent of each other; e.g., PROSA-2 can be used without first using SCREEN if a set of important input variables has first been selected by other methods. Also, although SCREEN can select cases to be run (by random sampling), a user can select cases by other methods if he so prefers, and still use the rest of SCREEN for identifying important input variables
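A minimal sketch of the response-surface idea behind PROSA-2-style uncertainty propagation: fit a cheap surrogate to a handful of code runs, then sample the surrogate. The quadratic form and the toy "safety code" below are assumptions for illustration only, not the actual SCREEN/PROSA-2 algorithms:

```python
import numpy as np

rng = np.random.default_rng(1)

def expensive_safety_code(x1, x2):
    """Stand-in for a long-running analysis code with two uncertain inputs."""
    return 500.0 + 30.0 * x1 - 12.0 * x2 + 4.0 * x1 * x2

# A small design of computer runs over the uncertain inputs
x1_runs = rng.normal(0.0, 1.0, size=30)
x2_runs = rng.normal(0.0, 1.0, size=30)
y_runs = expensive_safety_code(x1_runs, x2_runs)

# Fit a quadratic response surface by least squares
design = np.column_stack([np.ones_like(x1_runs), x1_runs, x2_runs,
                          x1_runs**2, x2_runs**2, x1_runs * x2_runs])
coeffs, *_ = np.linalg.lstsq(design, y_runs, rcond=None)

# Propagate input uncertainties through the cheap surrogate instead of the code
x1_mc, x2_mc = rng.normal(0.0, 1.0, 100_000), rng.normal(0.0, 1.0, 100_000)
surrogate = np.column_stack([np.ones_like(x1_mc), x1_mc, x2_mc,
                             x1_mc**2, x2_mc**2, x1_mc * x2_mc]) @ coeffs
print(surrogate.mean(), surrogate.std())  # distribution of the output variable
```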
COMPUTATIONAL ANALYSIS OF BACKWARD-FACING STEP FLOW
Directory of Open Access Journals (Sweden)
Erhan PULAT
2001-01-01
Full Text Available In this study, backward-facing step flows, which are encountered in electronic systems cooling, heat exchanger design, and gas turbine cooling, are investigated computationally. Steady, incompressible, two-dimensional air flow is analyzed. The inlet velocity is assumed uniform and is obtained from a parabolic profile by using the maximum velocity. In the analysis, the effects of channel expansion ratio and Reynolds number on reattachment length are investigated. In addition, the pressure distribution along the channel length is obtained, and the flow is analyzed for Reynolds number values of 50 and 150 and channel expansion ratios of 1.5 and 2. The governing equations are solved by using the Galerkin finite element method of the ANSYS-FLOTRAN code. The results obtained are compared with solutions of the lattice BGK method, which is a relatively new method in fluid dynamics, and with other numerical and experimental results. It is concluded that reattachment length increases with increasing Reynolds number and, at the same Reynolds number, decreases with increasing channel expansion ratio.
Design of airborne wind turbine and computational fluid dynamics analysis
Anbreen, Faiqa
Wind energy is a promising alternative to depleting non-renewable sources. The height of conventional wind turbines becomes a constraint on their efficiency. An airborne wind turbine can reach much higher altitudes and produce higher power due to the high wind velocity and energy density. The focus of this thesis is to design a shrouded airborne wind turbine capable of generating 70 kW to propel a leisure boat with a capacity of 8-10 passengers. The idea of designing an airborne turbine is to take advantage of the higher velocities in the atmosphere. The SolidWorks model has been analyzed numerically using the Computational Fluid Dynamics (CFD) software StarCCM+. The Unsteady Reynolds-Averaged Navier-Stokes (URANS) approach with the k-epsilon turbulence model has been selected to study the physical properties of the flow, with emphasis on the performance of the turbine and the increase in air velocity at the throat. The analysis has been done using two ambient velocities, 12 m/s and 6 m/s. At 12 m/s inlet velocity, the air velocity at the turbine was recorded as 16 m/s and the power generated by the turbine is 61 kW. At an inlet velocity of 6 m/s, the air velocity at the turbine increased to 10 m/s and the power generated by the turbine is 25 kW.
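For context, a minimal sketch of the standard wind-power relation that links the quoted velocities and power outputs. The rotor diameter and power coefficient below are assumptions for illustration, not values from the thesis:

```python
import math

def turbine_power(wind_speed, rotor_diameter, power_coefficient=0.4, air_density=1.225):
    """Turbine power (W) from the actuator-disc relation P = 0.5 * rho * A * Cp * v^3."""
    area = math.pi * (rotor_diameter / 2.0) ** 2
    return 0.5 * air_density * area * power_coefficient * wind_speed ** 3

# Illustration of the cubic dependence on the velocity at the turbine plane
for v in (10.0, 16.0):  # m/s, the velocities reported at the turbine for the two cases
    print(v, round(turbine_power(v, rotor_diameter=5.0) / 1000.0, 1), "kW")
```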
A computer language for reducing activation analysis data
International Nuclear Information System (INIS)
Friedman, M.H.; Tanner, J.T.
1978-01-01
A program, written in FORTRAN, which defines a language for reducing activation analysis data is described. An attempt was made to optimize the choice of commands and their definitions so as to concisely express what should be done, make the language natural to use and easy to learn, arrange a system of checks to guard against communication errors, and make the language inclusive. Communications are effected through commands, and these can be given in almost any order. Consistency checks are done and diagnostic messages are printed automatically to guard against the incorrect use of commands. Default options on the commands allow instructions to be expressed concisely while providing a capability to specify details of the data reduction process. The program has been implemented on a UNIVAC 1108 computer. A complete description of the commands, the algorithms used, and the internal consistency checks is given elsewhere. The applications of the program and the methods for obtaining data automatically have already been described. (T.G.)
A fast reactor transient analysis methodology for personal computers
International Nuclear Information System (INIS)
Ott, K.O.
1993-01-01
A simplified model for liquid-metal-cooled reactor (LMR) transient analysis, in which point kinetics as well as lumped descriptions of the heat transfer equations in all components are applied, is converted from a differential into an integral formulation. All 30 differential balance equations are implicitly solved in terms of convolution integrals. The prompt jump approximation is applied, as the strong negative feedback effectively keeps the net reactivity well below prompt critical. After implicit finite differencing of the convolution integrals, the kinetics equation assumes a new form, i.e., the quadratic dynamics equation. In this integral formulation, the initial value problem of typical LMR transients can be solved with large time steps (initially 1 s, later up to 256 s). This makes transient problems amenable to treatment on a personal computer. The resulting mathematical model forms the basis for the GW-BASIC LMR transient calculation (LTC) program. The LTC program has also been converted to QuickBASIC. The running time for a 10-h overpower transient is then ∼40 to 10 s, depending on the hardware version (286, 386, or 486 with math coprocessors)
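A minimal sketch of the prompt jump approximation for one-group point kinetics, which is the starting point of the formulation described; the full LTC model adds lumped heat-transfer feedback and the convolution-integral treatment, and the parameter values below are illustrative assumptions:

```python
# One delayed-neutron group, prompt jump approximation:
#   n(t) = lam * LAMBDA * C(t) / (beta - rho(t)),   dC/dt = beta/LAMBDA * n - lam * C
beta, lam, LAMBDA = 0.0065, 0.08, 4.0e-7  # delayed fraction, decay constant (1/s), generation time (s)

def rho(t):
    """Illustrative net reactivity: a small step insertion held well below prompt critical."""
    return 0.3 * beta if t > 0.0 else 0.0

def run(t_end=50.0, dt=0.01):
    n = 1.0                        # normalized power
    c = beta * n / (lam * LAMBDA)  # equilibrium precursor concentration
    for step in range(int(t_end / dt)):
        t = step * dt
        n = lam * LAMBDA * c / (beta - rho(t))   # prompt jump: power follows precursors
        c += dt * (beta / LAMBDA * n - lam * c)  # explicit Euler update of precursors
        if step % 1000 == 0:
            print(f"t={t:5.1f} s  power={n:.3f}")
    return n

run()
```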
Recent Developments in Complex Analysis and Computer Algebra
Kajiwara, Joji; Xu, Yongzhi
1999-01-01
This volume consists of papers presented in the special sessions on "Complex and Numerical Analysis", "Value Distribution Theory and Complex Domains", and "Use of Symbolic Computation in Mathematics Education" of the ISAAC'97 Congress held at the University of Delaware, during June 2-7, 1997. The ISAAC Congress coincided with a U.S.-Japan Seminar also held at the University of Delaware. The latter was supported by the National Science Foundation through Grant INT-9603029 and the Japan Society for the Promotion of Science through Grant MTCS-134. It was natural that the participants of both meetings should interact and consequently several persons attending the Congress also presented papers in the Seminar. The success of the ISAAC Congress and the U.S.-Japan Seminar has led to the ISAAC'99 Congress being held in Fukuoka, Japan during August 1999. Many of the same participants will return to this Seminar. Indeed, it appears that the spirit of the U.S.-Japan Seminar will be continued every second year as part of...
Comparison of two three-dimensional cephalometric analysis computer software.
Sawchuk, Dena; Alhadlaq, Adel; Alkhadra, Thamer; Carlyle, Terry D; Kusnoto, Budi; El-Bialy, Tarek
2014-10-01
Three-dimensional cephalometric analyses are attracting increasing attention in orthodontics. The aim of this study was to compare two software packages for evaluating three-dimensional cephalometric analyses of orthodontic treatment outcomes. Twenty cone beam computed tomography images were obtained using the i-CAT(®) imaging system from patients' records as part of their regular orthodontic records. The images were analyzed using InVivoDental5.0 (Anatomage Inc.) and 3DCeph™ (University of Illinois at Chicago, Chicago, IL, USA) software. Data from before and after orthodontic treatment were analyzed using t-tests. Reliability testing using the intraclass correlation coefficient was stronger for InVivoDental5.0 (0.83-0.98) compared with 3DCeph™ (0.51-0.90). Paired t-test comparison of the two programs shows no statistically significant difference in the measurements made with them. InVivoDental5.0 measurements are more reproducible and user friendly compared with 3DCeph™. There was no statistical difference between the two programs in linear or angular measurements. 3DCeph™ is more time-consuming in performing three-dimensional analysis compared with InVivoDental5.0.
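A minimal sketch of the statistical comparison described (a paired t-test on measurements of the same cases from two programs); the measurement values below are invented, not from the study:

```python
import numpy as np
from scipy import stats

# Hypothetical angular measurements (degrees) of the same 20 cases in two programs
rng = np.random.default_rng(42)
truth = rng.normal(82.0, 3.0, size=20)
program_a = truth + rng.normal(0.0, 0.4, size=20)
program_b = truth + rng.normal(0.0, 0.7, size=20)

# Paired t-test: is there a systematic difference between the two programs?
t_stat, p_value = stats.ttest_rel(program_a, program_b)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

# An agreement ICC can additionally be obtained from an ANOVA decomposition or a
# package such as pingouin (pg.intraclass_corr); omitted here for brevity.
```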
Analisis cualitativo asistido por computadora Computer-assisted qualitative analysis
Directory of Open Access Journals (Sweden)
César A. Cisneros Puebla
2003-01-01
Full Text Available The aims of this article are: on the one hand, to present an approximation to the Hispano-American experience with Computer-Assisted Qualitative Data Analysis (CAQDAS), grouping, as a systematization exercise, the works carried out by several colleagues from related disciplines. Although attempting to be exhaustive and thorough, as in any attempt at systematizing experiences, this exercise presents clear gaps and omissions. On the other hand, to introduce some theoretical reflections on the role played by CAQDAS in the development of qualitative research on the basis of that systematization, with a particular focus on data generation.
Applied and computational harmonic analysis on graphs and networks
Irion, Jeff; Saito, Naoki
2015-09-01
In recent years, the advent of new sensor technologies and social network infrastructure has provided huge opportunities and challenges for analyzing data recorded on such networks. In the case of data on regular lattices, computational harmonic analysis tools such as the Fourier and wavelet transforms have well-developed theories and proven track records of success. It is therefore quite important to extend such tools from the classical setting of regular lattices to the more general setting of graphs and networks. In this article, we first review basics of graph Laplacian matrices, whose eigenpairs are often interpreted as the frequencies and the Fourier basis vectors on a given graph. We point out, however, that such an interpretation is misleading unless the underlying graph is either an unweighted path or cycle. We then discuss our recent effort of constructing multiscale basis dictionaries on a graph, including the Hierarchical Graph Laplacian Eigenbasis Dictionary and the Generalized Haar-Walsh Wavelet Packet Dictionary, which are viewed as generalizations of the classical hierarchical block DCTs and the Haar-Walsh wavelet packets, respectively, to the graph setting. Finally, we demonstrate the usefulness of our dictionaries by using them to simultaneously segment and denoise 1-D noisy signals sampled on regular lattices, a problem where classical tools have difficulty.
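A minimal sketch of the graph-Laplacian "Fourier basis" mentioned above, computed for a small unweighted path graph; this illustrates only the eigenpair interpretation, not the multiscale dictionaries constructed in the paper:

```python
import numpy as np

# Adjacency matrix of a 6-node path graph
n = 6
W = np.zeros((n, n))
for i in range(n - 1):
    W[i, i + 1] = W[i + 1, i] = 1.0

# (Unnormalized) graph Laplacian L = D - W
L = np.diag(W.sum(axis=1)) - W

# Eigenpairs: eigenvalues play the role of frequencies,
# eigenvectors the role of Fourier basis vectors on the graph
eigenvalues, eigenvectors = np.linalg.eigh(L)
print(np.round(eigenvalues, 3))          # 0 = constant (DC) mode, then increasing "frequency"
print(np.round(eigenvectors[:, 1], 3))   # slowest non-constant basis vector

# Graph Fourier transform of a signal sampled on the nodes
signal = np.array([0.0, 0.2, 0.9, 1.0, 0.3, 0.1])
coefficients = eigenvectors.T @ signal
print(np.round(coefficients, 3))
```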
Reliability analysis framework for computer-assisted medical decision systems
International Nuclear Information System (INIS)
Habas, Piotr A.; Zurada, Jacek M.; Elmaghraby, Adel S.; Tourassi, Georgia D.
2007-01-01
We present a technique that enhances computer-assisted decision (CAD) systems with the ability to assess the reliability of each individual decision they make. Reliability assessment is achieved by measuring the accuracy of a CAD system with known cases similar to the one in question. The proposed technique analyzes the feature space neighborhood of the query case to dynamically select an input-dependent set of known cases relevant to the query. This set is used to assess the local (query-specific) accuracy of the CAD system. The estimated local accuracy is utilized as a reliability measure of the CAD response to the query case. The underlying hypothesis of the study is that CAD decisions with higher reliability are more accurate. The above hypothesis was tested using a mammographic database of 1337 regions of interest (ROIs) with biopsy-proven ground truth (681 with masses, 656 with normal parenchyma). Three types of decision models, (i) a back-propagation neural network (BPNN), (ii) a generalized regression neural network (GRNN), and (iii) a support vector machine (SVM), were developed to detect masses based on eight morphological features automatically extracted from each ROI. The performance of all decision models was evaluated using the Receiver Operating Characteristic (ROC) analysis. The study showed that the proposed reliability measure is a strong predictor of the CAD system's case-specific accuracy. Specifically, the ROC area index for CAD predictions with high reliability was significantly better than for those with low reliability values. This result was consistent across all decision models investigated in the study. The proposed case-specific reliability analysis technique could be used to alert the CAD user when an opinion that is unlikely to be reliable is offered. The technique can be easily deployed in the clinical environment because it is applicable with a wide range of classifiers regardless of their structure and it requires neither additional
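A minimal sketch of the query-specific reliability idea: estimate the classifier's local accuracy from its performance on the known cases nearest to the query in feature space. The neighborhood size, data layout and classifier interface below are assumptions, not the exact procedure of the paper:

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def local_reliability(query_features, known_features, known_labels, classifier, k=25):
    """Reliability of a CAD decision = accuracy on the k known cases nearest the query."""
    nn = NearestNeighbors(n_neighbors=k).fit(known_features)
    _, idx = nn.kneighbors(query_features.reshape(1, -1))
    neighbors = known_features[idx[0]]
    neighbor_labels = known_labels[idx[0]]
    predictions = classifier.predict(neighbors)
    return np.mean(predictions == neighbor_labels)

# Usage with any fitted scikit-learn style classifier `clf` (numpy arrays assumed):
#   score = clf.predict_proba(query.reshape(1, -1))[0, 1]
#   reliability = local_reliability(query, X_train, y_train, clf)
#   -> flag the CAD opinion for the user when the reliability value is low
```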
Computer-aided pulmonary image analysis in small animal models
Energy Technology Data Exchange (ETDEWEB)
Xu, Ziyue; Mansoor, Awais; Mollura, Daniel J. [Center for Infectious Disease Imaging (CIDI), Radiology and Imaging Sciences, National Institutes of Health (NIH), Bethesda, Maryland 32892 (United States); Bagci, Ulas, E-mail: ulasbagci@gmail.com [Center for Research in Computer Vision (CRCV), University of Central Florida (UCF), Orlando, Florida 32816 (United States); Kramer-Marek, Gabriela [The Institute of Cancer Research, London SW7 3RP (United Kingdom); Luna, Brian [Microfluidic Laboratory Automation, University of California-Irvine, Irvine, California 92697-2715 (United States); Kubler, Andre [Department of Medicine, Imperial College London, London SW7 2AZ (United Kingdom); Dey, Bappaditya; Jain, Sanjay [Center for Tuberculosis Research, Johns Hopkins University School of Medicine, Baltimore, Maryland 21231 (United States); Foster, Brent [Department of Biomedical Engineering, University of California-Davis, Davis, California 95817 (United States); Papadakis, Georgios Z. [Radiology and Imaging Sciences, National Institutes of Health (NIH), Bethesda, Maryland 32892 (United States); Camp, Jeremy V. [Department of Microbiology and Immunology, University of Louisville, Louisville, Kentucky 40202 (United States); Jonsson, Colleen B. [National Institute for Mathematical and Biological Synthesis, University of Tennessee, Knoxville, Tennessee 37996 (United States); Bishai, William R. [Howard Hughes Medical Institute, Chevy Chase, Maryland 20815 and Center for Tuberculosis Research, Johns Hopkins University School of Medicine, Baltimore, Maryland 21231 (United States); Udupa, Jayaram K. [Medical Image Processing Group, Department of Radiology, University of Pennsylvania, Philadelphia, Pennsylvania 19104 (United States)
2015-07-15
Purpose: To develop an automated pulmonary image analysis framework for infectious lung diseases in small animal models. Methods: The authors describe a novel pathological lung and airway segmentation method for small animals. The proposed framework includes identification of abnormal imaging patterns pertaining to infectious lung diseases. First, the authors’ system estimates an expected lung volume by utilizing a regression function between total lung capacity and approximated rib cage volume. A significant difference between the expected lung volume and the initial lung segmentation indicates the presence of severe pathology, and invokes a machine learning based abnormal imaging pattern detection system next. The final stage of the proposed framework is the automatic extraction of airway tree for which new affinity relationships within the fuzzy connectedness image segmentation framework are proposed by combining Hessian and gray-scale morphological reconstruction filters. Results: 133 CT scans were collected from four different studies encompassing a wide spectrum of pulmonary abnormalities pertaining to two commonly used small animal models (ferret and rabbit). Sensitivity and specificity were greater than 90% for pathological lung segmentation (average dice similarity coefficient > 0.9). While qualitative visual assessments of airway tree extraction were performed by the participating expert radiologists, for quantitative evaluation the authors validated the proposed airway extraction method by using publicly available EXACT’09 data set. Conclusions: The authors developed a comprehensive computer-aided pulmonary image analysis framework for preclinical research applications. The proposed framework consists of automatic pathological lung segmentation and accurate airway tree extraction. The framework has high sensitivity and specificity; therefore, it can contribute advances in preclinical research in pulmonary diseases.
Clinical diagnosis and computer analysis of headache symptoms.
Drummond, P D; Lance, J W
1984-01-01
The headache histories obtained from clinical interviews of 600 patients were analysed by computer to see whether patients could be separated systematically into clinical categories and to see whether sets of symptoms commonly reported together differed in distribution among the categories. The computer classification procedure assigned 537 patients to the same category as their clinical diagnosis, the majority of discrepancies between clinical and computer classifications involving common mi...
Analysis of school furniture used in computer classrooms
Jiří Tauber
2011-01-01
With respect to the fast development of new computer technologies, it is absolutely necessary that school furniture reflect this trend and adapt to it. Our use of computer technologies and utilities in teaching is increasing. Therefore, it is necessary to improve school desks so that they are fit for new computer technology. Creation of a compact set of information relative to the issue concerned, which would comprise the needs and requirements for individual pieces of furnit...
Formal Specification and Analysis of Cloud Computing Management
2012-01-24
Cloud Computing in a Nutshell: We begin this introduction to Cloud Computing with a famous quote by Larry Ellison: “The interesting thing about ... the wording of some of our ads.” — Larry Ellison, Oracle CEO [106]. In view of this statement, we summarize the essential aspects of Cloud Computing ... [1] M. Abadi, M. Burrows, M. Manasse, and T. Wobber. Moderately hard, memory-bound functions. ACM Transactions on Internet Technology, 5(2):299-327.
Energy Technology Data Exchange (ETDEWEB)
Carbajo, Juan (Oak Ridge National Laboratory, Oak Ridge, TN); Jeong, Hae-Yong (Korea Atomic Energy Research Institute, Daejeon, Korea); Wigeland, Roald (Idaho National Laboratory, Idaho Falls, ID); Corradini, Michael (University of Wisconsin, Madison, WI); Schmidt, Rodney Cannon; Thomas, Justin (Argonne National Laboratory, Argonne, IL); Wei, Tom (Argonne National Laboratory, Argonne, IL); Sofu, Tanju (Argonne National Laboratory, Argonne, IL); Ludewig, Hans (Brookhaven National Laboratory, Upton, NY); Tobita, Yoshiharu (Japan Atomic Energy Agency, Ibaraki-ken, Japan); Ohshima, Hiroyuki (Japan Atomic Energy Agency, Ibaraki-ken, Japan); Serre, Frederic (Centre d'études nucléaires de Cadarache - CEA, France)
2011-06-01
This report summarizes the results of an expert-opinion elicitation activity designed to qualitatively assess the status and capabilities of currently available computer codes and models for accident analysis and reactor safety calculations of advanced sodium fast reactors, and identify important gaps. The twelve-member panel consisted of representatives from five U.S. National Laboratories (SNL, ANL, INL, ORNL, and BNL), the University of Wisconsin, the KAERI, the JAEA, and the CEA. The major portion of this elicitation activity occurred during a two-day meeting held on Aug. 10-11, 2010 at Argonne National Laboratory. There were two primary objectives of this work: (1) Identify computer codes currently available for SFR accident analysis and reactor safety calculations; and (2) Assess the status and capability of current US computer codes to adequately model the required accident scenarios and associated phenomena, and identify important gaps. During the review, panel members identified over 60 computer codes that are currently available in the international community to perform different aspects of SFR safety analysis for various event scenarios and accident categories. A brief description of each of these codes together with references (when available) is provided. An adaptation of the Predictive Capability Maturity Model (PCMM) for computational modeling and simulation is described for use in this work. The panel's assessment of the available US codes is presented in the form of nine tables, organized into groups of three for each of three risk categories considered: anticipated operational occurrences (AOOs), design basis accidents (DBA), and beyond design basis accidents (BDBA). A set of summary conclusions are drawn from the results obtained. At the highest level, the panel judged that current US code capabilities are adequate for licensing given reasonable margins, but expressed concern that US code development activities had stagnated and that the
Computer Models for IRIS Control System Transient Analysis
International Nuclear Information System (INIS)
Gary D Storrick; Bojan Petrovic; Luca Oriani
2007-01-01
This report presents results of the Westinghouse work performed under Task 3 of this Financial Assistance Award and it satisfies a Level 2 Milestone for the project. Task 3 of the collaborative effort between ORNL, Brazil and Westinghouse for the International Nuclear Energy Research Initiative entitled 'Development of Advanced Instrumentation and Control for an Integrated Primary System Reactor' focuses on developing computer models for transient analysis. This report summarizes the work performed under Task 3 on developing control system models. The present state of the IRIS plant design--such as the lack of a detailed secondary system or I and C system designs--makes finalizing models impossible at this time. However, this did not prevent making considerable progress. Westinghouse has several working models in use to further the IRIS design. We expect to continue modifying the models to incorporate the latest design information until the final IRIS unit becomes operational. Section 1.2 outlines the scope of this report. Section 2 describes the approaches we are using for non-safety transient models. It describes the need for non-safety transient analysis and the model characteristics needed to support those analyses. Section 3 presents the RELAP5 model. This is the highest-fidelity model used for benchmark evaluations. However, it is prohibitively slow for routine evaluations and additional lower-fidelity models have been developed. Section 4 discusses the current Matlab/Simulink model. This is a low-fidelity, high-speed model used to quickly evaluate and compare competing control and protection concepts. Section 5 describes the Modelica models developed by POLIMI and Westinghouse. The object-oriented Modelica language provides convenient mechanisms for developing models at several levels of detail. We have used this to develop a high-fidelity model for detailed analyses and a faster-running simplified model to help speed the I and C development process. Section
Thorp, Scott A.
1992-01-01
This presentation will discuss the development of a NASA Geometry Exchange Specification for transferring aerodynamic surface geometry between LeRC systems and grid generation software used for computational fluid dynamics research. The proposed specification is based on a subset of the Initial Graphics Exchange Specification (IGES). The presentation will include discussion of how the NASA-IGES standard will accommodate improved computer aided design inspection methods and reverse engineering techniques currently being developed. The presentation is in viewgraph format.
DEFF Research Database (Denmark)
Windfeld, Kristian
1992-01-01
Computer-intensive methods for data analysis in a traditional setting have developed rapidly in the last decade. The application and adaptation of some of these methods to the analysis of multivariate digital images and spatial data are explored, evaluated and compared to well established classical...... into the projection pursuit is presented. Examples from remote sensing are given. The ACE algorithm for computing non-linear transformations for maximizing correlation is extended and applied to obtain a non-linear transformation that maximizes autocorrelation or 'signal' in a multivariate image....... This is a generalization of the minimum/maximum autocorrelation factors (MAF's), which is a linear method. The non-linear method is compared to the linear method when analyzing a multivariate TM image from Greenland. The ACE method is shown to give a more detailed decomposition of the image than the MAF transformation...
Comparative Analysis on the Utilization of Computers | Nkata ...
African Journals Online (AJOL)
The findings reveal among others that extent of usability of computers in the two universities had a significant difference. It was concluded that the level of computer utilization in UNIPORT is more than in the RUST. It was recommended that periodical, pre and post qualification seminars be organized for the 2 university ...
From handwriting analysis to pen-computer applications
Schomaker, L
1998-01-01
In this paper, pen computing, i.e. the use of computers and applications in which the pen is the main input device, will be described from four different viewpoints. Firstly a brief overview of the hardware developments in pen systems is given, leading to the conclusion that the technological
International Nuclear Information System (INIS)
Cate, C.L.; Wagner, D.P.; Fussell, J.B.
1977-01-01
Common cause failure analysis, also called common mode failure analysis, is an integral part of a complete system reliability analysis. Existing methods of computer aided common cause failure analysis are extended by allowing analysis of the complex systems often encountered in practice. The methods aid in identifying potential common cause failures and also address quantitative common cause failure analysis
Big data mining analysis method based on cloud computing
Cai, Qing Qiu; Cui, Hong Gang; Tang, Hao
2017-08-01
In the era of information explosion, the super-large scale, discrete, and unstructured or semi-structured nature of big data has gone far beyond what traditional data management methods can handle. With the arrival of the cloud computing era, cloud computing provides a new technical way to analyze massive data, which effectively addresses the inability of traditional data mining methods to scale to massive data. This paper introduces the meaning and characteristics of cloud computing, analyzes the advantages of using cloud computing technology for data mining, designs an association rule mining algorithm based on the MapReduce parallel processing architecture, and verifies it experimentally. Parallel association rule mining on a cloud computing platform can greatly improve the execution speed of data mining.
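A minimal sketch of the MapReduce pattern that underlies parallel association rule mining: mappers emit candidate itemsets per transaction, reducers sum their support counts. The shuffle step is simulated in-process here, and the paper's actual algorithm and platform details are not given in the abstract:

```python
from itertools import combinations
from collections import defaultdict

transactions = [
    {"bread", "milk"},
    {"bread", "diapers", "beer"},
    {"milk", "diapers", "beer"},
    {"bread", "milk", "diapers", "beer"},
]

def mapper(transaction, max_size=2):
    """Emit (itemset, 1) for every candidate itemset in one transaction."""
    for size in range(1, max_size + 1):
        for itemset in combinations(sorted(transaction), size):
            yield itemset, 1

def reducer(pairs):
    """Sum support counts per itemset (the shuffle/sort step is implicit here)."""
    counts = defaultdict(int)
    for itemset, count in pairs:
        counts[itemset] += count
    return counts

support = reducer(pair for t in transactions for pair in mapper(t))
min_support = 2
frequent = {k: v for k, v in support.items() if v >= min_support}
print(frequent)  # frequent itemsets; association rules are then derived from their supports
```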
High throughput computing: a solution for scientific analysis
O'Donnell, M.
2011-01-01
Public land management agencies continually face resource management problems that are exacerbated by climate warming, land-use change, and other human activities. As the U.S. Geological Survey (USGS) Fort Collins Science Center (FORT) works with managers in U.S. Department of the Interior (DOI) agencies and other federal, state, and private entities, researchers are finding that the science needed to address these complex ecological questions across time and space produces substantial amounts of data. The additional data and the volume of computations needed to analyze it require expanded computing resources well beyond single- or even multiple-computer workstations. To meet this need for greater computational capacity, FORT investigated how to resolve the many computational shortfalls previously encountered when analyzing data for such projects. Our objectives included finding a solution that would:
Opening up to Big Data: Computer-Assisted Analysis of Textual Data in Social Sciences
Directory of Open Access Journals (Sweden)
Gregor Wiedemann
2013-05-01
Full Text Available Two developments in computational text analysis may change the way qualitative data analysis in the social sciences is performed: 1. the availability of digital text worth investigating is growing rapidly, and 2. improved algorithmic information extraction approaches, also called text mining, allow for further bridging the gap between qualitative and quantitative text analysis. The key factor here is the inclusion of context into computational linguistic models, which extends conventional computational content analysis towards the extraction of meaning. To clarify the methodological differences between various computer-assisted text analysis approaches, the article suggests a typology from the perspective of a qualitative researcher. This typology shows compatibilities between manual qualitative data analysis methods and computational, rather quantitative, approaches for large-scale mixed-method text analysis designs. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1302231
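As one concrete instance of the algorithmic text mining the article refers to (not a method proposed in the article itself), a minimal TF-IDF keyword extraction sketch with scikit-learn; the documents are invented:

```python
from sklearn.feature_extraction.text import TfidfVectorizer

documents = [
    "qualitative data analysis of interview transcripts",
    "large scale quantitative text mining of newspaper archives",
    "mixed method designs combine qualitative coding with text mining",
]

vectorizer = TfidfVectorizer(stop_words="english")
tfidf = vectorizer.fit_transform(documents)
terms = vectorizer.get_feature_names_out()

# Top weighted terms per document: a crude, purely quantitative entry point
# that a qualitative researcher would then interpret in context
for row in tfidf.toarray():
    top = row.argsort()[::-1][:3]
    print([terms[i] for i in top])
```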
Digital image processing and analysis human and computer vision applications with CVIPtools
Umbaugh, Scott E
2010-01-01
Section I, Introduction to Digital Image Processing and Analysis: Digital Image Processing and Analysis - Overview; Image Analysis and Computer Vision; Image Processing and Human Vision; Key Points; Exercises; References; Further Reading. Computer Imaging Systems - Imaging Systems Overview; Image Formation and Sensing; CVIPtools Software; Image Representation; Key Points; Exercises; Supplementary Exercises; References; Further Reading. Section II, Digital Image Analysis and Computer Vision: Introduction to Digital Image Analysis - Introduction; Preprocessing; Binary Image Analysis; Key Points; Exercises; Supplementary Exercises; References; Further Read...
MMA, A Computer Code for Multi-Model Analysis
Poeter, Eileen P.; Hill, Mary C.
2007-01-01
This report documents the Multi-Model Analysis (MMA) computer code. MMA can be used to evaluate results from alternative models of a single system using the same set of observations for all models. As long as the observations, the observation weighting, and system being represented are the same, the models can differ in nearly any way imaginable. For example, they may include different processes, different simulation software, different temporal definitions (for example, steady-state and transient models could be considered), and so on. The multiple models need to be calibrated by nonlinear regression. Calibration of the individual models needs to be completed before application of MMA. MMA can be used to rank models and calculate posterior model probabilities. These can be used to (1) determine the relative importance of the characteristics embodied in the alternative models, (2) calculate model-averaged parameter estimates and predictions, and (3) quantify the uncertainty of parameter estimates and predictions in a way that integrates the variations represented by the alternative models. There is a lack of consensus on what model analysis methods are best, so MMA provides four default methods. Two are based on Kullback-Leibler information, and use the AIC (Akaike Information Criterion) or AICc (second-order-bias-corrected AIC) model discrimination criteria. The other two default methods are the BIC (Bayesian Information Criterion) and the KIC (Kashyap Information Criterion) model discrimination criteria. Use of the KIC criterion is equivalent to using the maximum-likelihood Bayesian model averaging (MLBMA) method. AIC, AICc, and BIC can be derived from Frequentist or Bayesian arguments. The default methods based on Kullback-Leibler information have a number of theoretical advantages, including that they tend to favor more complicated models as more data become available than do the other methods, which makes sense in many situations. Many applications of MMA will
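A minimal sketch of the information-criterion model weighting on which MMA's default methods are built (posterior model probabilities from AIC differences); the criterion values and predictions below are invented for illustration:

```python
import numpy as np

# Hypothetical AIC values for three calibrated alternative models of one system
model_names = ["layered", "homogeneous", "fault-zone"]
aic = np.array([152.3, 149.8, 158.1])

# Akaike weights: exp(-delta_i / 2) normalized over the model set
delta = aic - aic.min()
weights = np.exp(-0.5 * delta)
weights /= weights.sum()

for name, w in zip(model_names, weights):
    print(f"{name:12s} posterior probability ~ {w:.3f}")

# Model-averaged prediction: weighted sum of the individual models' predictions
predictions = np.array([3.1, 2.8, 3.6])   # e.g., predicted head (m) at one location
print("model-averaged prediction:", float(weights @ predictions))
```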
International Nuclear Information System (INIS)
Vyropaev, V.Ya.; Zlokazov, V.B.; Kul'kina, L.I.; Maslov, O.D.; Fefilov, B.V.
1977-01-01
A computer program is described for processing gamma spectra in the instrumental activation analysis of multicomponent objects. Structural diagrams of various variants of connection with the computer are presented. The possibility of using a mini-computer as an analyser and for preliminary processing of gamma spectra is considered
Report--COMOLA: A Computer System for the Analysis of Interlanguage Data.
Jagtman, Margriet; Bongaerts, Theo
1994-01-01
Discusses the design and use of the Computer Model for Language Acquisition (COMOLA), a computer program designed to analyze syntactic development in second-language learners by examining their oral utterances. Also compares COMOLA to the recently developed Computer-Aided Linguistic Analysis (COALA) program. (MDM)
Interface design of VSOP'94 computer code for safety analysis
International Nuclear Information System (INIS)
Natsir, Khairina; Andiwijayakusuma, D.; Wahanani, Nursinta Adi; Yazid, Putranto Ilham
2014-01-01
Today, most software applications, also in the nuclear field, come with a graphical user interface. VSOP'94 (Very Superior Old Program) was designed to simplify the process of performing reactor simulation. VSOP is an integrated code system for simulating the life history of a nuclear reactor, devoted to education and research. One advantage of the VSOP program is its ability to calculate neutron spectrum estimation, fuel cycle, 2-D diffusion, resonance integrals, estimation of reactor fuel costs, and integrated thermal hydraulics. VSOP can also be used for comparative studies and reactor safety simulation. However, the existing VSOP is a conventional program, developed in Fortran 65, and has several usability problems: for example, it runs only on DEC Alpha mainframe platforms, provides text-based output, and is difficult to use, especially in data preparation and interpretation of results. We developed GUI-VSOP, an interface program that facilitates data preparation, runs the VSOP code, and presents the results in a more user-friendly way, and that is usable on a personal computer (PC). Modifications include the development of interfaces for preprocessing, processing and postprocessing. The GUI-based preprocessing interface aims to provide a convenient way of preparing data. The processing interface provides convenient configuration of input files and libraries and compiles the VSOP code. The postprocessing interface is designed to visualize the VSOP output in tabular and graphic forms. GUI-VSOP is expected to simplify and speed up the process and the analysis of safety aspects
Interface design of VSOP'94 computer code for safety analysis
Natsir, Khairina; Yazid, Putranto Ilham; Andiwijayakusuma, D.; Wahanani, Nursinta Adi
2014-09-01
Today, most software applications, also in the nuclear field, come with a graphical user interface. VSOP'94 (Very Superior Old Program) was designed to simplify the process of performing reactor simulation. VSOP is an integrated code system for simulating the life history of a nuclear reactor, devoted to education and research. One advantage of the VSOP program is its ability to calculate neutron spectrum estimation, fuel cycle, 2-D diffusion, resonance integrals, estimation of reactor fuel costs, and integrated thermal hydraulics. VSOP can also be used for comparative studies and reactor safety simulation. However, the existing VSOP is a conventional program, developed in Fortran 65, and has several usability problems: for example, it runs only on DEC Alpha mainframe platforms, provides text-based output, and is difficult to use, especially in data preparation and interpretation of results. We developed GUI-VSOP, an interface program that facilitates data preparation, runs the VSOP code, and presents the results in a more user-friendly way, and that is usable on a personal computer (PC). Modifications include the development of interfaces for preprocessing, processing and postprocessing. The GUI-based preprocessing interface aims to provide a convenient way of preparing data. The processing interface provides convenient configuration of input files and libraries and compiles the VSOP code. The postprocessing interface is designed to visualize the VSOP output in tabular and graphic forms. GUI-VSOP is expected to simplify and speed up the process and the analysis of safety aspects.
Computational identification and analysis of novel sugarcane microRNAs
Directory of Open Access Journals (Sweden)
Thiebaut Flávia
2012-07-01
Full Text Available Abstract Background MicroRNA regulation of gene expression plays a key role in development and in the response to biotic and abiotic stresses. Deep sequencing analyses accelerate the process of small RNA discovery in many plants and expand our understanding of miRNA-regulated processes. We therefore undertook small RNA sequencing of sugarcane miRNAs in order to understand their complexity and to explore their role in sugarcane biology. Results A bioinformatics search was carried out to discover novel miRNAs that may be regulated in sugarcane plants subjected to drought and salt stresses, and under pathogen infection. By means of the presence of miRNA precursors in the related sorghum genome, we identified 623 candidate new mature miRNAs in sugarcane. Of these, 44 were classified as high-confidence miRNAs. The biological function of the new miRNA candidates was assessed by analyzing their putative targets. The set of bona fide sugarcane miRNAs includes those likely targeting serine/threonine kinases, Myb and zinc finger proteins. Additionally, a MADS-box transcription factor and an RPP2B protein, which act in development and disease resistance processes, could be regulated by cleavage (21-nt species) and DNA methylation (24-nt species), respectively. Conclusions A large-scale investigation of sRNA in sugarcane using a computational approach has identified a substantial number of new miRNAs and provides detailed genotype-tissue-culture miRNA expression profiles. Comparative analysis between monocots was valuable to clarify aspects of the conservation of miRNAs and their targets in a plant whose genome has not yet been sequenced. Our findings contribute to knowledge of miRNA roles in regulatory pathways in the complex, polyploid sugarcane genome.
Quantitative analysis of left ventricular strain using cardiac computed tomography
Energy Technology Data Exchange (ETDEWEB)
Buss, Sebastian J., E-mail: sebastian.buss@med.uni-heidelberg.de [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Schulz, Felix; Mereles, Derliz [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Hosch, Waldemar [Department of Diagnostic and Interventional Radiology, University of Heidelberg, 69120 Heidelberg (Germany); Galuschky, Christian; Schummers, Georg; Stapf, Daniel [TomTec Imaging Systems GmbH, Munich (Germany); Hofmann, Nina; Giannitsis, Evangelos; Hardt, Stefan E. [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Kauczor, Hans-Ulrich [Department of Diagnostic and Interventional Radiology, University of Heidelberg, 69120 Heidelberg (Germany); Katus, Hugo A.; Korosoglou, Grigorios [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany)
2014-03-15
Objectives: To investigate whether cardiac computed tomography (CCT) can determine left ventricular (LV) radial, circumferential and longitudinal myocardial deformation in comparison to two-dimensional echocardiography in patients with congestive heart failure. Background: Echocardiography allows for accurate assessment of strain with high temporal resolution. A reduced strain is associated with a poor prognosis in cardiomyopathies. However, strain imaging is limited in patients with poor echogenic windows, so that, in selected cases, tomographic imaging techniques may be preferable for the evaluation of myocardial deformation. Methods: Consecutive patients (n = 27) with congestive heart failure who underwent a clinically indicated ECG-gated contrast-enhanced 64-slice dual-source CCT for the evaluation of the cardiac veins prior to cardiac resynchronization therapy (CRT) were included. All patients underwent additional echocardiography. LV radial, circumferential and longitudinal strain and strain rates were analyzed in identical midventricular short axis, 4-, 2- and 3-chamber views for both modalities using the same prototype software algorithm (feature tracking). Time for analysis was assessed for both modalities. Results: Close correlations were observed for both techniques regarding global strain (r = 0.93, r = 0.87 and r = 0.84 for radial, circumferential and longitudinal strain, respectively, p < 0.001 for all). Similar trends were observed for regional radial, longitudinal and circumferential strain (r = 0.88, r = 0.84 and r = 0.94, respectively, p < 0.001 for all). The number of non-diagnostic myocardial segments was significantly higher with echocardiography than with CCT (9.6% versus 1.9%, p < 0.001). In addition, the required time for complete quantitative strain analysis was significantly shorter for CCT compared to echocardiography (877 ± 119 s per patient versus 1105 ± 258 s per patient, p < 0.001). Conclusion: Quantitative assessment of LV strain
Task analysis and computer aid development for human reliability analysis in nuclear power plants
Energy Technology Data Exchange (ETDEWEB)
Yoon, W. C.; Kim, H.; Park, H. S.; Choi, H. H.; Moon, J. M.; Heo, J. Y.; Ham, D. H.; Lee, K. K.; Han, B. T. [Korea Advanced Institute of Science and Technology, Taejeon (Korea)
2001-04-01
The importance of human reliability analysis (HRA), which predicts the likelihood of human error in both quantitative and qualitative terms, is growing because of the effects of human error on system safety. HRA requires task analysis as a preliminary step, but existing task analysis techniques have the problem that the collection of information about the situation in which a human error occurs depends entirely on the HRA analyst. This makes the results of task analysis inconsistent and unreliable. To address this problem, KAERI developed the structural information analysis (SIA) method, which helps to analyze task structures and situations systematically. In this study, the SIA method was evaluated by HRA experts, and a prototype computerized support system named CASIA (Computer Aid for SIA) was developed to support performing HRA with the SIA method. Additionally, by applying the SIA method to emergency operating procedures, we derived generic task types used in emergencies and accumulated the analysis results in the CASIA database. CASIA is expected to help HRA analysts perform analyses more easily and consistently. As more analyses are performed and more data are accumulated in the CASIA database, analysts will be able to share and disseminate their analysis experience, thereby improving the quality of HRA. 35 refs., 38 figs., 25 tabs. (Author)
Multimedia Image Technology and Computer Aided Manufacturing Engineering Analysis
Nan, Song
2018-03-01
Since the reform and opening up, science and technology in China have developed continuously, and more and more advanced technologies have emerged amid this trend of diversification. Multimedia image technology, for example, has had a significant and positive impact on computer-aided manufacturing engineering in China, both in the functions it provides and in how those functions are applied. This paper therefore starts from the concept of multimedia image technology and analyzes its application in computer-aided manufacturing engineering.
Using Puppet to contextualize computing resources for ATLAS analysis on Google Compute Engine
International Nuclear Information System (INIS)
Öhman, Henrik; Panitkin, Sergey; Hendrix, Valerie
2014-01-01
With the advent of commercial as well as institutional and national clouds, new opportunities for on-demand computing resources become available to the HEP community. The new cloud technologies also come with new challenges, one of which is the contextualization of computing resources with regard to the requirements of the user and their experiment. In particular, on Google's new cloud platform, Google Compute Engine (GCE), uploading a user's own virtual machine images is not possible. This precludes the application of ready-to-use technologies like CernVM and forces users to build and contextualize their own VM images from scratch. We investigate the use of Puppet to facilitate the contextualization of cloud resources on GCE, with particular regard to ease of configuration and dynamic resource scaling.
Sensitivity Analysis and Error Control for Computational Aeroelasticity, Phase I
National Aeronautics and Space Administration — The objective of this proposal is the development of a next-generation computational aeroelasticity code, suitable for real-world complex geometries, and...
Cloud computing for genomic data analysis and collaboration.
Langmead, Ben; Nellore, Abhinav
2018-04-01
Next-generation sequencing has made major strides in the past decade. Studies based on large sequencing data sets are growing in number, and public archives for raw sequencing data have been doubling in size every 18 months. Leveraging these data requires researchers to use large-scale computational resources. Cloud computing, a model whereby users rent computers and storage from large data centres, is a solution that is gaining traction in genomics research. Here, we describe how cloud computing is used in genomics for research and large-scale collaborations, and argue that its elasticity, reproducibility and privacy features make it ideally suited for the large-scale reanalysis of publicly available archived data, including privacy-protected data.
Cost/Benefit Analysis of Leasing Versus Purchasing Computers
National Research Council Canada - National Science Library
Arceneaux, Alan
1997-01-01
.... In constructing this model, several factors were considered, including: The purchase cost of computer equipment, annual lease payments, depreciation costs, the opportunity cost of purchasing, tax revenue implications and various leasing terms...
Discrete calculus: applied analysis on graphs for computational science
Grady, Leo J
2010-01-01
This unique text brings together into a single framework current research in the three areas of discrete calculus, complex networks, and algorithmic content extraction. Many example applications from several fields of computational science are provided.
Computational Analysis of Flow Through a Transonic Compressor Rotor
National Research Council Canada - National Science Library
Bochette, Nikolaus J
2005-01-01
.... In examining this problem two Computational Fluid Dynamic (CFD) codes have been used by the Naval Postgraduate School to predict the performance of a transonic compressor rotor that is being tested with steam ingestion...
Automated computation of autonomous spectral submanifolds for nonlinear modal analysis
Ponsioen, Sten; Pedergnana, Tiemo; Haller, George
2018-04-01
We discuss an automated computational methodology for computing two-dimensional spectral submanifolds (SSMs) in autonomous nonlinear mechanical systems of arbitrary degrees of freedom. In our algorithm, SSMs, the smoothest nonlinear continuations of modal subspaces of the linearized system, are constructed up to arbitrary orders of accuracy, using the parameterization method. An advantage of this approach is that the construction of the SSMs does not break down when the SSM folds over its underlying spectral subspace. A further advantage is an automated a posteriori error estimation feature that enables a systematic increase in the orders of the SSM computation until the required accuracy is reached. We find that the present algorithm provides a major speed-up, relative to numerical continuation methods, in the computation of backbone curves, especially in higher-dimensional problems. We illustrate the accuracy and speed of the automated SSM algorithm on lower- and higher-dimensional mechanical systems.
Hierarchical nanoreinforced composites: Computational analysis of damage mechanisms
DEFF Research Database (Denmark)
Mishnaevsky, Leon; Pontefisso, Alessandro; Dai, Gaoming
2016-01-01
The effect of the distribution, shape and orientation of nanoparticles (carbon nanotubes, graphene) in unidirectional polymer matrix composites on the strength and damage resistance of the composites is studied computationally. The possible directions for the improvement of nanoreinforced composites by controlling shapes...
76 FR 60939 - Metal Fatigue Analysis Performed by Computer Software
2011-09-30
... Software AGENCY: Nuclear Regulatory Commission. ACTION: Regulatory issue summary; request for comment... computer software package, WESTEMS™, to demonstrate compliance with Section III, "Rules for... Software Addressees: All holders of, and applicants for, a power reactor operating license or construction...
Computational analysis of difenoconazole interaction with soil chitinases
International Nuclear Information System (INIS)
Vlǎdoiu, D L; Filimon, M N; Ostafe, V; Isvoran, A
2015-01-01
This study focusses on the investigation of the potential binding of the fungicide difenoconazole to soil chitinases using a computational approach. Computational characterization of the substrate binding sites of Serratia marcescens and Bacillus cereus chitinases using the Fpocket tool reflects the role of hydrophobic residues in substrate binding and the high local hydrophobic density of both sites. A molecular docking study reveals that difenoconazole is able to bind to the Serratia marcescens and Bacillus cereus chitinase active sites, the binding energies being comparable
Development of Computer Science Disciplines - A Social Network Analysis Approach
Pham, Manh Cuong; Klamma, Ralf; Jarke, Matthias
2011-01-01
In contrast to many other scientific disciplines, computer science places particular weight on conference publications. Conferences have the advantage of providing fast publication of papers and of bringing researchers together to present and discuss their work with peers. Previous work on knowledge mapping focused on mapping all of science, or a particular domain, based on the ISI-published JCR (Journal Citation Report). Although these data cover most of the important journals, they lack computer science conference and ...
Heat Transfer treatment in computer codes for safety analysis
International Nuclear Information System (INIS)
Jerele, A.; Gregoric, M.
1984-01-01
The increased number of operating nuclear power plants has stressed the importance of nuclear safety evaluation. For this reason, in accordance with regulatory commission requests, safety analyses are performed with computer codes. In this paper the part of the thermohydraulic models dealing with wall-to-fluid heat transfer correlations in the computer codes TRAC-PF1, RELAP4/MOD5, RELAP5/MOD1 and COBRA-IV is discussed. (author)
I. Fisk
2010-01-01
Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...
Analysis of Sci-Hub downloads of computer science papers
Directory of Open Access Journals (Sweden)
Andročec Darko
2017-07-01
Full Text Available Scientific knowledge is disseminated through research papers. Most of the research literature is copyrighted by publishers and available only through paywalls. Recently, some websites have offered much of the recent content for free. One of them is the controversial website Sci-Hub, which enables access to more than 47 million pirated research papers. In April 2016, Science Magazine published an article on Sci-Hub activity over a period of six months and publicly released Sci-Hub's server log data. That article aggregated downloads across all fields of study, but such aggregate findings might hide interesting patterns within computer science. The Sci-Hub log data were used in this paper to analyse downloads of computer science papers, identified using DBLP's list of computer science publications. The top downloads of computer science papers were analysed, together with the geographical location of Sci-Hub users, the most downloaded publishers, the types of papers downloaded, and downloads of computer science papers per publication year. The results of this research can be used to improve legal access to the most relevant scientific repositories or journals for the computer science field.
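As a rough illustration of the kind of matching described above (not the paper's actual pipeline), the following Python sketch joins a Sci-Hub download log against a set of DOIs taken from DBLP and aggregates the matches; the file names and the column names (doi, country, publication_year) are assumptions made only for illustration.

import pandas as pd

# Hypothetical inputs: one row per download in the Sci-Hub log, and a list of
# DOIs of computer science papers derived from DBLP.
logs = pd.read_csv("scihub_logs.csv")
cs_dois = set(pd.read_csv("dblp_dois.csv")["doi"].str.lower())

logs["doi"] = logs["doi"].str.lower()
cs = logs[logs["doi"].isin(cs_dois)]                 # keep only computer science papers

top_papers = cs["doi"].value_counts().head(20)       # most downloaded CS papers
by_country = cs["country"].value_counts()            # geographical distribution of users
by_year = cs["publication_year"].value_counts().sort_index()   # downloads per publication year

print(top_papers, by_country.head(10), by_year, sep="\n\n")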
Griffiths, Thomas L; Lieder, Falk; Goodman, Noah D
2015-04-01
Marr's levels of analysis (computational, algorithmic, and implementation) have served cognitive science well over the last 30 years. But the recent increase in the popularity of the computational level raises a new challenge: How do we begin to relate models at different levels of analysis? We propose that it is possible to define levels of analysis that lie between the computational and the algorithmic, providing a way to build a bridge between computational- and algorithmic-level models. The key idea is to push the notion of rationality, often used in defining computational-level models, deeper toward the algorithmic level. We offer a simple recipe for reverse-engineering the mind's cognitive strategies by deriving optimal algorithms for a series of increasingly more realistic abstract computational architectures, which we call "resource-rational analysis." Copyright © 2015 Cognitive Science Society, Inc.
Domain analysis of computational science - Fifty years of a scientific computing group
Energy Technology Data Exchange (ETDEWEB)
Tanaka, M.
2010-02-23
I employed bibliometric and historical methods to study the domain of the Scientific Computing group at Brookhaven National Laboratory (BNL) for an extended period of fifty years, from 1958 to 2007. I noted and confirmed the growing emergence of interdisciplinarity within the group. I also identified a strong, consistent mathematics and physics orientation within it.
Evans, B. J. K.; Foster, C.; Minchin, S. A.; Pugh, T.; Lewis, A.; Wyborn, L. A.; Evans, B. J.; Uhlherr, A.
2014-12-01
The National Computational Infrastructure (NCI) has established a powerful in-situ computational environment to enable both high performance computing and data-intensive science across a wide spectrum of national environmental data collections - in particular climate, observational data and geoscientific assets. This paper examines 1) the computational environments that support the modelling and data processing pipelines, 2) the analysis environments and methods to support data analysis, and 3) the progress in addressing harmonisation of the underlying data collections for future transdisciplinary research that enables accurate climate projections. NCI makes available 10+ PB of major data collections from both the government and research sectors based on six themes: 1) weather, climate, and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology, and 6) astronomy, social and biosciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. The data is largely sourced from NCI's partners (which include the custodians of many of the national scientific records), major research communities, and collaborating overseas organisations. The data is accessible within an integrated HPC-HPD environment: a 1.2 PFlop supercomputer (Raijin), an HPC-class 3000-core OpenStack cloud system and several highly connected large-scale and high-bandwidth Lustre filesystems. This computational environment supports a catalogue of integrated reusable software and workflows from earth system and ecosystem modelling, weather research, satellite and other observed data processing and analysis. To enable transdisciplinary research on this scale, data needs to be harmonised so that researchers can readily apply techniques and software across the corpus of data available and not be constrained to work within artificial disciplinary boundaries. Future challenges will
Uncertainty analysis of NDA waste measurements using computer simulations
International Nuclear Information System (INIS)
Blackwood, L.G.; Harker, Y.D.; Yoon, W.Y.; Meachum, T.R.
2000-01-01
Uncertainty assessments for nondestructive radioassay (NDA) systems for nuclear waste are complicated by factors extraneous to the measurement systems themselves. Most notably, characteristics of the waste matrix (e.g., homogeneity) and radioactive source material (e.g., particle size distribution) can have great effects on measured mass values. Under these circumstances, characterizing the waste population is as important as understanding the measurement system in obtaining realistic uncertainty values. When extraneous waste characteristics affect measurement results, the uncertainty results are waste-type specific. The goal becomes to assess the expected bias and precision for the measurement of a randomly selected item from the waste population of interest. Standard propagation-of-errors methods for uncertainty analysis can be very difficult to implement in the presence of significant extraneous effects on the measurement system. An alternative approach that naturally includes the extraneous effects is as follows: (1) Draw a random sample of items from the population of interest; (2) Measure the items using the NDA system of interest; (3) Establish the true quantity being measured using a gold standard technique; and (4) Estimate bias by deriving a statistical regression model comparing the measurements on the system of interest to the gold standard values; similar regression techniques for modeling the standard deviation of the difference values gives the estimated precision. Actual implementation of this method is often impractical. For example, a true gold standard confirmation measurement may not exist. A more tractable implementation is obtained by developing numerical models for both the waste material and the measurement system. A random sample of simulated waste containers generated by the waste population model serves as input to the measurement system model. This approach has been developed and successfully applied to assessing the quantity of
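A toy numerical sketch of the simulation-based approach outlined above; all distributions and parameter values are invented for illustration. A population model generates true masses for simulated containers, a measurement model adds matrix-dependent bias and noise, and regression of measured against true values yields the bias and precision estimates.

import numpy as np

rng = np.random.default_rng(1)
n = 500
true_mass = rng.lognormal(mean=1.0, sigma=0.6, size=n)       # "true" nuclide mass (toy units)
matrix_factor = rng.normal(1.05, 0.10, size=n)                # matrix heterogeneity effect
measured = true_mass * matrix_factor + rng.normal(0.0, 0.05, size=n)

# Bias model: measured = a + b * true (ordinary least squares)
b, a = np.polyfit(true_mass, measured, deg=1)
residuals = measured - (a + b * true_mass)
precision = residuals.std(ddof=2)                             # scatter about the fit

print(f"bias model: measured ~ {a:.3f} + {b:.3f} * true")
print(f"precision (1-sigma): {precision:.3f}")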
Nondestructive analysis of urinary calculi using micro computed tomography
Directory of Open Access Journals (Sweden)
Lingeman James E
2004-12-01
Full Text Available Abstract Background Micro computed tomography (micro CT has been shown to provide exceptionally high quality imaging of the fine structural detail within urinary calculi. We tested the idea that micro CT might also be used to identify the mineral composition of urinary stones non-destructively. Methods Micro CT x-ray attenuation values were measured for mineral that was positively identified by infrared microspectroscopy (FT-IR. To do this, human urinary stones were sectioned with a diamond wire saw. The cut surface was explored by FT-IR and regions of pure mineral were evaluated by micro CT to correlate x-ray attenuation values with mineral content. Additionally, intact stones were imaged with micro CT to visualize internal morphology and map the distribution of specific mineral components in 3-D. Results Micro CT images taken just beneath the cut surface of urinary stones showed excellent resolution of structural detail that could be correlated with structure visible in the optical image mode of FT-IR. Regions of pure mineral were not difficult to find by FT-IR for most stones and such regions could be localized on micro CT images of the cut surface. This was not true, however, for two brushite stones tested; in these, brushite was closely intermixed with calcium oxalate. Micro CT x-ray attenuation values were collected for six minerals that could be found in regions that appeared to be pure, including uric acid (3515 – 4995 micro CT attenuation units, AU, struvite (7242 – 7969 AU, cystine (8619 – 9921 AU, calcium oxalate dihydrate (13815 – 15797 AU, calcium oxalate monohydrate (16297 – 18449 AU, and hydroxyapatite (21144 – 23121 AU. These AU values did not overlap. Analysis of intact stones showed excellent resolution of structural detail and could discriminate multiple mineral types within heterogeneous stones. Conclusions Micro CT gives excellent structural detail of urinary stones, and these results demonstrate the feasibility
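A minimal sketch of how the reported attenuation ranges could be used to classify a stone region non-destructively; the ranges are those quoted in the abstract, and everything else is illustrative.

AU_RANGES = {
    "uric acid":                   (3515, 4995),
    "struvite":                    (7242, 7969),
    "cystine":                     (8619, 9921),
    "calcium oxalate dihydrate":   (13815, 15797),
    "calcium oxalate monohydrate": (16297, 18449),
    "hydroxyapatite":              (21144, 23121),
}

def classify_mineral(mean_au: float) -> str:
    """Return the mineral whose reported micro CT attenuation range contains mean_au."""
    for mineral, (lo, hi) in AU_RANGES.items():
        if lo <= mean_au <= hi:
            return mineral
    return "unclassified (outside reported ranges)"

print(classify_mineral(14900))   # calcium oxalate dihydrate
print(classify_mineral(6000))    # unclassified (outside reported ranges)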
Cluster Computing For Real Time Seismic Array Analysis.
Martini, M.; Giudicepietro, F.
A seismic array is an instrument composed of a dense distribution of seismic sensors that allows measurement of the directional properties of the wavefield (slowness or wavenumber vector) radiated by a seismic source. Over the last years arrays have been widely used in different fields of seismological research. In particular, they are applied in the investigation of seismic sources on volcanoes, where they can be successfully used for studying the volcanic microtremor and long-period events which are critical for getting information on the evolution of volcanic systems. For this reason arrays could be usefully employed for volcano monitoring; however, the huge amount of data produced by this type of instrument and the quite time-consuming processing techniques have limited their potential for this application. In order to favour a direct application of array techniques to continuous volcano monitoring, we designed and built a small PC cluster able to compute in near real time the kinematic properties of the wavefield (slowness or wavenumber vector) produced by local seismic sources. The cluster is composed of 8 dual-processor Intel Pentium-III PCs working at 550 MHz and has 4 Gigabytes of RAM memory. It runs under the Linux operating system. The developed analysis software package is based on the Multiple SIgnal Classification (MUSIC) algorithm and is written in Fortran. The message-passing part is based upon the LAM programming environment package, an open-source implementation of the Message Passing Interface (MPI). The developed software system includes modules devoted to receiving data via the internet and graphical applications for continuously displaying the processing results. The system has been tested with a data set collected during a seismic experiment conducted on Etna in 1999, when two dense seismic arrays were deployed on the northeast and southeast flanks of this volcano. A real time continuous acquisition system has been simulated by
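The sketch below shows only the MUSIC step in isolation (the cluster, MPI and real-time acquisition parts are omitted): synthetic narrowband array data are generated for one plane wave, and the MUSIC pseudo-spectrum is scanned over a grid of slowness vectors. The array geometry, frequency, noise level and number of sources are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
coords = rng.uniform(-500, 500, size=(8, 2))      # sensor positions (m)
freq = 2.0                                         # analysis frequency (Hz)
true_slowness = np.array([0.4e-3, 0.2e-3])         # s/m

steer = lambda s: np.exp(-2j * np.pi * freq * coords @ s)   # plane-wave steering vector

# Synthetic data: one plane wave plus noise, 50 snapshots
X = np.outer(steer(true_slowness), rng.standard_normal(50)) \
    + 0.1 * (rng.standard_normal((8, 50)) + 1j * rng.standard_normal((8, 50)))

R = X @ X.conj().T / X.shape[1]                    # sample covariance matrix
w, V = np.linalg.eigh(R)
En = V[:, :-1]                                     # noise subspace (one source assumed)

sx = sy = np.linspace(-1e-3, 1e-3, 101)
P = np.empty((101, 101))
for i, sa in enumerate(sx):
    for j, sb in enumerate(sy):
        v = steer(np.array([sa, sb]))
        P[i, j] = 1.0 / np.real(v.conj() @ En @ En.conj().T @ v)   # MUSIC pseudo-spectrum

imax, jmax = np.unravel_index(P.argmax(), P.shape)
print("estimated slowness (s/m):", sx[imax], sy[jmax])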
Cost-Benefit Analysis of Computer Resources for Machine Learning
Champion, Richard A.
2007-01-01
Machine learning describes pattern-recognition algorithms - in this case, probabilistic neural networks (PNNs). These can be computationally intensive, in part because of the nonlinear optimizer, a numerical process that calibrates the PNN by minimizing a sum of squared errors. This report suggests efficiencies that are expressed as cost and benefit. The cost is computer time needed to calibrate the PNN, and the benefit is goodness-of-fit, how well the PNN learns the pattern in the data. There may be a point of diminishing returns where a further expenditure of computer resources does not produce additional benefits. Sampling is suggested as a cost-reduction strategy. One consideration is how many points to select for calibration and another is the geometric distribution of the points. The data points may be nonuniformly distributed across space, so that sampling at some locations provides additional benefit while sampling at other locations does not. A stratified sampling strategy can be designed to select more points in regions where they reduce the calibration error and fewer points in regions where they do not. Goodness-of-fit tests ensure that the sampling does not introduce bias. This approach is illustrated by statistical experiments for computing correlations between measures of roadless area and population density for the San Francisco Bay Area. The alternative to training efficiencies is to rely on high-performance computer systems. These may require specialized programming and algorithms that are optimized for parallel performance.
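The report's PNN is not reproduced here; the sketch below uses a simple Gaussian-kernel regressor as a stand-in to show the cost/benefit idea: goodness-of-fit (error on a held-out pattern) is tracked as the calibration sample grows, exposing the point of diminishing returns. All data and settings are invented.

import numpy as np

rng = np.random.default_rng(3)
x_all = rng.uniform(0, 10, 2000)
y_all = np.sin(x_all) + 0.1 * rng.standard_normal(x_all.size)    # toy pattern to learn

def kernel_predict(x_train, y_train, x_eval, h=0.3):
    """Gaussian-kernel (Nadaraya-Watson) regression, a stand-in for the PNN."""
    w = np.exp(-0.5 * ((x_eval[:, None] - x_train[None, :]) / h) ** 2)
    return (w @ y_train) / w.sum(axis=1)

x_test = np.linspace(0, 10, 500)
y_test = np.sin(x_test)

for n in (10, 50, 200, 1000):                                    # "cost" = calibration sample size
    idx = rng.choice(x_all.size, size=n, replace=False)
    rmse = np.sqrt(np.mean((kernel_predict(x_all[idx], y_all[idx], x_test) - y_test) ** 2))
    print(f"n = {n:4d}   RMSE = {rmse:.4f}")                     # "benefit" = goodness of fit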
High-performance computing in accelerating structure design and analysis
International Nuclear Information System (INIS)
Li Zenghai; Folwell, Nathan; Ge Lixin; Guetz, Adam; Ivanov, Valentin; Kowalski, Marc; Lee, Lie-Quan; Ng, Cho-Kuen; Schussman, Greg; Stingelin, Lukas; Uplenchwar, Ravindra; Wolf, Michael; Xiao, Liling; Ko, Kwok
2006-01-01
Future high-energy accelerators such as the Next Linear Collider (NLC) will accelerate multi-bunch beams of high current and low emittance to obtain high luminosity, which put stringent requirements on the accelerating structures for efficiency and beam stability. While numerical modeling has been quite standard in accelerator R and D, designing the NLC accelerating structure required a new simulation capability because of the geometric complexity and level of accuracy involved. Under the US DOE Advanced Computing initiatives (first the Grand Challenge and now SciDAC), SLAC has developed a suite of electromagnetic codes based on unstructured grids and utilizing high-performance computing to provide an advanced tool for modeling structures at accuracies and scales previously not possible. This paper will discuss the code development and computational science research (e.g. domain decomposition, scalable eigensolvers, adaptive mesh refinement) that have enabled the large-scale simulations needed for meeting the computational challenges posed by the NLC as well as projects such as the PEP-II and RIA. Numerical results will be presented to show how high-performance computing has made a qualitative improvement in accelerator structure modeling for these accelerators, either at the component level (single cell optimization), or on the scale of an entire structure (beam heating and long-range wakefields)
The analysis of gastric function using computational techniques
International Nuclear Information System (INIS)
Young, Paul
2002-01-01
The work presented in this thesis was carried out at the Magnetic Resonance Centre, Department of Physics and Astronomy, University of Nottingham, between October 1996 and June 2000. This thesis describes the application of computerised techniques to the analysis of gastric function, in relation to Magnetic Resonance Imaging data. The implementation of a computer program enabling the measurement of motility in the lower stomach is described in Chapter 6. This method allowed the dimensional reduction of multi-slice image data sets into a 'Motility Plot', from which the motility parameters - the frequency, velocity and depth of contractions - could be measured. The technique was found to be simple, accurate and involved substantial time savings, when compared to manual analysis. The program was subsequently used in the measurement of motility in three separate studies, described in Chapter 7. In Study 1, four different meal types of varying viscosity and nutrient value were consumed by 12 volunteers. The aim of the study was (i) to assess the feasibility of using the motility program in a volunteer study and (ii) to determine the effects of the meals on motility. The results showed that the parameters were remarkably consistent between the 4 meals. However, for each meal, velocity and percentage occlusion were found to increase as contractions propagated along the antrum. The first clinical application of the motility program was carried out in Study 2. Motility from three patients was measured, after they had been referred to the Magnetic Resonance Centre with gastric problems. The results showed that one of the patients displayed an irregular motility, compared to the results of the volunteer study. This result had not been observed using other investigative techniques. In Study 3, motility was measured in Low Viscosity and High Viscosity liquid/solid meals, with the solid particulate consisting of agar beads of varying breakdown strength. The results showed that
Heat exchanger performance analysis programs for the personal computer
International Nuclear Information System (INIS)
Putman, R.E.
1992-01-01
Numerous utility industry heat exchange calculations are repetitive and thus lend themselves to being performed on a Personal Computer. These programs may be regarded as engineering tools which, when put together, can form a Toolbox. However, the practicing Results Engineer in the utility industry desires programs that are not only robust and easy to use but can also be run on both desktop and laptop PCs. The latter also offer the opportunity to take the computer into the plant or control room, and use it there to process test or operating data right on the spot. Most programs evolve through the needs which arise in the course of day-to-day work. This paper describes several of the more useful programs of this type and outlines some of the guidelines to be followed when designing personal computer programs for use by the practicing Results Engineer
Low-frequency computational electromagnetics for antenna analysis
Energy Technology Data Exchange (ETDEWEB)
Miller, E.K. (Los Alamos National Lab., NM (USA)); Burke, G.J. (Lawrence Livermore National Lab., CA (USA))
1991-01-01
An overview of low-frequency, computational methods for modeling the electromagnetic characteristics of antennas is presented here. The article presents a brief analytical background, and summarizes the essential ingredients of the method of moments, for numerically solving low-frequency antenna problems. Some extensions to the basic models of perfectly conducting objects in free space are also summarized, followed by a consideration of some of the computational issues that affect model accuracy, efficiency and utility. A variety of representative computations are then presented to illustrate various modeling aspects and capabilities that are currently available. A fairly extensive bibliography is included to suggest further reference material to the reader. 90 refs., 27 figs.
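As a hedged, minimal illustration of the method-of-moments discretization the article summarizes, the sketch below solves an electrostatic stand-in problem (charge distribution on a thin straight wire held at 1 V) with pulse basis functions and point matching, rather than the full-wave antenna formulation; the wire dimensions and segment count are arbitrary.

import numpy as np

eps0 = 8.854e-12
L, a, N = 1.0, 1e-3, 50                    # wire length (m), wire radius (m), segments
dz = L / N
z = (np.arange(N) + 0.5) * dz              # segment centres (match points)

# Z[m, n] = potential at match point m due to unit line charge density on segment n
Z = np.empty((N, N))
for m in range(N):
    for n in range(N):
        if m == n:                          # self term, thin-wire approximation (dz >> a)
            Z[m, n] = 2.0 * np.log(dz / a) / (4 * np.pi * eps0)
        else:
            Z[m, n] = dz / (4 * np.pi * eps0 * abs(z[m] - z[n]))

V = np.ones(N)                              # wire held at 1 V
q = np.linalg.solve(Z, V)                   # charge density per segment (C/m)
print("total charge (C):", (q * dz).sum())
print("capacitance (pF):", (q * dz).sum() * 1e12)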
Radiographic test phantom for computed tomographic lung nodule analysis
International Nuclear Information System (INIS)
Zerhouni, E.A.
1987-01-01
This patent describes a method for evaluating a computed tomograph scan of a nodule in a lung of a human or non-human animal. The method comprises generating a computer tomograph of a transverse section of the animal containing lung and nodule tissue, and generating a second computer tomograph of a test phantom comprising a device which simulates the transverse section of the animal. The tissue-simulating portions of the device are constructed of materials having radiographic densities substantially identical to those of the corresponding tissue in the simulated transverse section of the animal. They have voids which simulate, in size and shape, the lung cavities in the transverse section, and these voids contain a test reference nodule constructed of a material of predetermined radiographic density which simulates, in size, shape and position within a lung cavity void of the test phantom, the nodule in the transverse section of the animal. The respective tomographs are then compared
Contributions from I. Fisk
2012-01-01
Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences. Operations Office (Figure 6: Transfers from all sites in the last 90 days.) For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...
M. Kasemann
Introduction A large fraction of the effort was focused during the last period on the preparation and monitoring of the February tests of Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...
Matthias Kasemann
Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking, the roles of a Computing Run Coordinator and of regular computing shifts, monitoring the services and infrastructure as well as interfacing to the data operations tasks, are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...
I. Fisk
2013-01-01
Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites. Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month. Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB. Figure 3: The volume of data moved between CMS sites in the last six months The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...
New Flutter Analysis Technique for Time-Domain Computational Aeroelasticity
Pak, Chan-Gi; Lung, Shun-Fat
2017-01-01
A new time-domain approach for computing flutter speed is presented. Based on the time-history result of aeroelastic simulation, the unknown unsteady aerodynamics model is estimated using a system identification technique. The full aeroelastic model is generated via coupling the estimated unsteady aerodynamic model with the known linear structure model. The critical dynamic pressure is computed and used in the subsequent simulation until the convergence of the critical dynamic pressure is achieved. The proposed method is applied to a benchmark cantilevered rectangular wing.
Verification of structural analysis computer codes in nuclear engineering
International Nuclear Information System (INIS)
Zebeljan, Dj.; Cizelj, L.
1990-01-01
Sources of potential errors which can arise during the use of finite element method based computer programs are described in the paper. The magnitude of these errors was defined as the acceptance criterion for those programs. Error sources are described as they are treated by the National Agency for Finite Element Methods and Standards (NAFEMS). Specific verification examples are taken from the literature of the Nuclear Regulatory Commission (NRC). A verification example is presented for the PAFEC-FE computer code, for seismic response analysis of piping systems by the response spectrum method. (author)
Multi-scale analysis of lung computed tomography images
Gori, I; Fantacci, M E; Preite Martinez, A; Retico, A; De Mitri, I; Donadio, S; Fulcheri, C
2007-01-01
A computer-aided detection (CAD) system for the identification of lung internal nodules in low-dose multi-detector helical Computed Tomography (CT) images was developed in the framework of the MAGIC-5 project. The three modules of our lung CAD system, a segmentation algorithm for lung internal region identification, a multi-scale dot-enhancement filter for nodule candidate selection and a multi-scale neural technique for false positive finding reduction, are described. The results obtained on a dataset of low-dose and thin-slice CT scans are shown in terms of free response receiver operating characteristic (FROC) curves and discussed.
Computer codes for beam dynamics analysis of cyclotronlike accelerators
Smirnov, V.
2017-12-01
Computer codes suitable for the study of beam dynamics in cyclotronlike (classical and isochronous cyclotrons, synchrocyclotrons, and fixed field alternating gradient) accelerators are reviewed. Computer modeling of cyclotron segments, such as the central zone, acceleration region, and extraction system is considered. The author does not claim to give a full and detailed description of the methods and algorithms used in the codes. Special attention is paid to the codes already proven and confirmed at the existing accelerating facilities. The description of the programs prepared in the worldwide known accelerator centers is provided. The basic features of the programs available to users and limitations of their applicability are described.
1984-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science during the period October 1, 1983 through March 31, 1984 is summarized.
1989-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1988 through March 31, 1989 is summarized.
Analysis of Computer Experiments with Multiple Noise Sources
DEFF Research Database (Denmark)
Dehlendorff, Christian; Kulahci, Murat; Andersen, Klaus Kaae
2010-01-01
In this paper we present a modeling framework for analyzing computer models with two types of variations. The paper is based on a case study of an orthopedic surgical unit, which has both controllable and uncontrollable factors. Our results show that this structure of variation can be modeled...
Microscopes and computers combined for analysis of chromosomes
Butler, J. W.; Butler, M. K.; Stroud, A. N.
1969-01-01
Scanning machine CHLOE, developed for photographic use, is combined with a digital computer to obtain quantitative and statistically significant data on chromosome shapes, distribution, density, and pairing. CHLOE permits data acquisition about a chromosome complement to be obtained two times faster than by manual pairing.
Computational Modeling and Analysis of Mechanically Painful Stimulations
DEFF Research Database (Denmark)
Manafi Khanian, Bahram
Cuff algometry is used for quantitative assessment of deep-tissue sensitivity. The main purpose of this PhD dissertation is to provide a novel insight into the intrinsic and extrinsic factors which are involved in mechanically induced pain during cuff pressure algometry. A computational 3D finite...
Three-dimensional analysis of cellular microstructures by computer simulation
International Nuclear Information System (INIS)
Hanson, K.; Morris, J.W. Jr.
1977-06-01
For microstructures of the ''cellular'' type (isotropic growth from a distribution of nuclei which form simultaneously), it is possible to construct an efficient code which will completely analyze the microstructure in three dimensions. Such a computer code for creating and storing the connected graph was constructed
Two computer programs for the analysis of marine magnetic data
Digital Repository Service at National Institute of Oceanography (India)
Rao, M.M.M.; Lakshminarayana, S.; Murthy, K.S.R.; Subrahmanyam, A.S.
Computer-aided analysis of grain growth in metals
DEFF Research Database (Denmark)
Klimanek, P.; May, C.; Richter, H.
1993-01-01
Isothermal grain growth in aluminium, copper and alpha-iron was investigated experimentally at elevated temperatures and quantitatively interpreted by computer simulation on the basis of a statistical model described in [4,5,6]. As demonstrated for the grain growth kinetics, the experimental... data can be fitted satisfactorily....
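Purely as an illustration of the kind of kinetics comparison mentioned above (not the paper's statistical model), the sketch below fits the classical parabolic grain growth law D^2 - D0^2 = k*t to invented (time, mean grain size) data.

import numpy as np

t = np.array([0.0, 600.0, 1800.0, 3600.0, 7200.0])       # annealing time (s), invented
D = np.array([12.0, 19.0, 27.0, 35.0, 46.0]) * 1e-6       # mean grain size (m), invented

k, D0_sq = np.polyfit(t, D**2, deg=1)                      # fit D^2 = D0^2 + k*t
print(f"k  = {k:.3e} m^2/s")
print(f"D0 = {np.sqrt(D0_sq) * 1e6:.1f} micrometres")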
Introduction to Numerical Computation - analysis and Matlab illustrations
DEFF Research Database (Denmark)
Elden, Lars; Wittmeyer-Koch, Linde; Nielsen, Hans Bruun
In a modern programming environment like, e.g., MATLAB it is possible by simple commands to perform advanced calculations on a personal computer. In order to use such a powerful tool efficiently it is necessary to have an overview of available numerical methods and algorithms and to know about... ...are illustrated by examples in MATLAB.
Computational analysis of turbulent flow in hydroelectric plant intakes
Energy Technology Data Exchange (ETDEWEB)
Bouhadji, L.; Lemon, D.D.; Billenness, D.; Fissel, D. [ASL Environmental Sciences Inc., Sidney, British Columbia (Canada)]. E-mail: lbouhadji@aslenv.com; Djilali, N. [Univ. of Victoria, Dept. of Mechanical Engineering, Victoria, British Columbia (Canada)]. E-mail: ndjilali@uvic.ca
2003-07-01
Turbulent flows in the Lower Monumental powerhouse intake are investigated using computational fluid dynamics. Simulations are carried out to gain an understanding into the impact of a grid-like trash rack on the downstream turbulent flow characteristics within the intake. (author)
Computer Tools for Construction, Modification and Analysis of Petri Nets
DEFF Research Database (Denmark)
Jensen, Kurt
1987-01-01
The practical use of Petri nets is — just as any other description technique — very dependent on the existence of adequate computer tools, which may assist the user to cope with the many details of a large description. For Petri nets there is a need for tools supporting construction of nets...
Analysis of Piezoelectric Structural Sensors with Emergent Computing Techniques
Ramers, Douglas L.
2005-01-01
The purpose of this project was to interpret the results of tests performed earlier this year and to demonstrate a possible use of emergent computing to solve IVHM problems. The test data used were collected with piezoelectric sensors to detect mechanical changes in structures. The project team included Dr. Doug Ramers and Dr. Abdul Jallob of the Summer Faculty Fellowship Program, Arnaldo Colon-Lopez, a student intern from the University of Puerto Rico at Turabo, and John Lassister and Bob Engberg of the Structural and Dynamics Test Group. The tests were performed by Bob Engberg to compare the performance of two types of piezoelectric (piezo) sensors, Pb(Zr1-xTix)O3, which we will label PZT, and Pb(Zn1/3Nb2/3)O3-PbTiO3, which we will label SCP. The tests were conducted under varying temperature and pressure conditions. One set of tests was done by varying water pressure inside an aluminum liner covered with carbon-fiber composite layers (a cylindrical "bottle" with domed ends) and the other by varying temperatures down to cryogenic levels on some specially prepared composite panels. This report discusses the data from the pressure study. The study of the temperature results was not completed in time for this report. The particular sensing done with these piezo sensors is accomplished by the sensor generating a controlled vibration that is transmitted into the structure to which the sensor is attached, and the same sensor then responding to the induced vibration of the structure. There is a relationship between the mechanical impedance of the structure and the resulting electrical impedance produced in the piezo sensor. The impedance is also a function of the excitation frequency. Changes in the real part of the impedance signature relative to an original reference signature indicate a change in the coupled structure that could be the result of damage or strain. The water pressure tests were conducted by
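One common way of turning the change in the real part of the impedance signature into a scalar indicator, offered here as a hedged sketch and not necessarily the method used in the project, is a root-mean-square deviation (RMSD) against a baseline signature; the toy signatures below are invented.

import numpy as np

def rmsd_damage_index(baseline_re_z: np.ndarray, current_re_z: np.ndarray) -> float:
    """RMSD (%) between baseline and current real-impedance signatures,
    sampled at the same excitation frequencies."""
    num = np.sum((current_re_z - baseline_re_z) ** 2)
    den = np.sum(baseline_re_z ** 2)
    return 100.0 * np.sqrt(num / den)

# Toy impedance signatures over a frequency sweep
f = np.linspace(30e3, 40e3, 400)                           # Hz
baseline = 50 + 10 * np.exp(-((f - 35.0e3) / 500) ** 2)
shifted  = 50 + 10 * np.exp(-((f - 35.2e3) / 500) ** 2)    # resonance shift, e.g. from strain

print(f"damage index: {rmsd_damage_index(baseline, shifted):.2f} %")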
Development of computer software for pavement life cycle cost analysis.
1988-01-01
The life cycle cost analysis program (LCCA) is designed to automate and standardize life cycle costing in Virginia. It allows the user to input information necessary for the analysis, and it then completes the calculations and produces a printed copy...
Molecular cloning, expression and computational analysis of a water ...
African Journals Online (AJOL)
User
2012-11-06
Nov 6, 2012 ... analysis of a water stress inducible copper-containing ... Although in silico analysis of the protein has indicated its probable structure and functions, further ..... based on Protein Data Bank (PDB) template c1ksiA which.
Computer analysis with the CEASEMT finite element system
International Nuclear Information System (INIS)
Bung, H.; Clement, G.; Hoffmann, A.; Jakubowicz, H.
1979-01-01
This section presents results for the analyses of all three international Piping Benchmark Problems. An inelastic analysis of each problem was performed using a full three-dimensional shell analysis (TRICO code) and a simplified piping analysis based on beam theory (TEDEL code)
Computer analysis with the CEASEMT finite element system
Energy Technology Data Exchange (ETDEWEB)
Bung, H; Clement, G; Hoffmann, A; Jakubowicz, H
1979-06-01
This section presents results for the analyses of all three international Piping Benchmark Problems. An inelastic analysis of each problem was performed using a full three-dimensional shell analysis (TRICO code) and a simplified piping analysis based on beam theory (TEDEL code)
Buzmakov, Alexey; Chukalina, Marina; Nikolaev, Dmitry; Schaefer, Gerald; Gulimova, Victoria; Saveliev, Sergey; Tereschenko, Elena; Seregin, Alexey; Senin, Roman; Prun, Victor; Zolotov, Denis; Asadchikov, Victor
2013-01-01
This paper presents the results of a comprehensive analysis of structural changes in the caudal vertebrae of Turner's thick-toed geckos by computer microtomography and X-ray fluorescence analysis. We present the algorithms used for the reconstruction of tomographic images, which allow working with high-noise projections, the typical conditions dictated by the nature of the samples. Reptiles, due to their ruggedness, small size, amniote status and a number of other valuable features, are an attractive model organism for long-duration orbital experiments on unmanned spacecraft. Possible changes in their bone tissue under the influence of spaceflight are the subject of discussion among biologists from different laboratories around the world.
Directory of Open Access Journals (Sweden)
Tolga Karakan
2016-06-01
Full Text Available Objective: To investigate the ultrastructure of urinary system stones using micro-focus computed tomography (MCT), which allows non-destructive analysis, and to compare the results with wet chemical analysis. Methods: This study was carried out at the Ankara Training and Research Hospital. Renal stones removed from 30 patients during percutaneous nephrolithotomy (PNL) surgery were included in the study. The stones were evaluated blindly by the specialists with MCT and chemical analysis. Results: The comparison of the stone components between chemical analysis and MCT showed that the rate of agreement was very low (p0.05). It was also seen that there was no significant relation with the 3D structure being heterogeneous or homogeneous. Conclusion: Stone analysis with MCT is a time-consuming and costly method. The method is useful for understanding the mechanisms of stone formation and is an important guide for developing future treatment modalities.
Computational modeling applied to stress gradient analysis for metallic alloys
International Nuclear Information System (INIS)
Iglesias, Susana M.; Assis, Joaquim T. de; Monine, Vladimir I.
2009-01-01
Nowadays, composite materials, including materials reinforced by particles, are at the center of researchers' attention. There are problems with stress measurements in these materials, connected with the superficial stress gradient caused by the difference between the stress state of particles on the surface and in the matrix of the composite material. Computer simulation of the diffraction profile formed by superficial layers of material allows simulating the diffraction experiment and makes it possible to resolve the problem of stress measurements when the stress state is characterized by a strong gradient. The aim of this paper is the application of a computer simulation technique, initially developed for homogeneous materials, to diffraction line simulation for composite materials and alloys. Specifically, we applied this technique to silumin fabricated by powder metallurgy. (author)
Superimposed Code Theorectic Analysis of DNA Codes and DNA Computing
2010-03-01
that the hybridization that occurs between a DNA strand and its Watson-Crick complement can be used to perform mathematical computation. This research... (Abbreviations: ssDNA, single-stranded DNA; WC, Watson-Crick; A, adenine; C, cytosine; G, guanine; T, thymine.) ... Watson-Crick (WC) duplex, e.g., TCGCA paired with its WC complement. Note that non-WC duplexes can form and such a formation is called a cross-hybridization. Cross
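A minimal sketch of the Watson-Crick terminology used above (the complement of a strand and a perfect-duplex check); it only illustrates the definitions, not the report's superimposed-code analysis.

WC = {"A": "T", "T": "A", "C": "G", "G": "C"}

def wc_complement(strand: str) -> str:
    """Reverse complement: the strand that forms a perfect WC duplex with the input."""
    return "".join(WC[b] for b in reversed(strand))

def forms_wc_duplex(s1: str, s2: str) -> bool:
    return s2 == wc_complement(s1)

s = "TCGCA"
print(wc_complement(s))             # TGCGA
print(forms_wc_duplex(s, "TGCGA"))  # True: perfect WC duplex
print(forms_wc_duplex(s, "TCGCA"))  # False: any binding here would be a cross-hybridization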
Comparison of two three-dimensional cephalometric analysis computer software
Sawchuk, Dena; Alhadlaq, Adel; Alkhadra, Thamer; Carlyle, Terry D; Kusnoto, Budi; El-Bialy, Tarek
2014-01-01
Background: Three-dimensional cephalometric analyses are attracting increasing attention in orthodontics. The aim of this study was to compare two software packages for evaluating three-dimensional cephalometric analyses of orthodontic treatment outcomes. Materials and Methods: Twenty cone beam computed tomography images were obtained using the i-CAT® imaging system from patients' records as part of their regular orthodontic records. The images were analyzed using InVivoDental5.0 (Anatomage Inc.) and 3DCeph™ (Unive...
Reliability analysis of Airbus A-330 computer flight management system
Fajmut, Metod
2010-01-01
This diploma thesis deals with the digitized, computerized flight control system »Fly-by-wire« and the security aspects of the computer system of an Airbus A330 aircraft. As in space and military aircraft structures, in commercial airplanes a large part of the financial contribution is devoted to reliability. Conventional aircraft control systems had to rely, and some still do, on mechanical and hydraulic connections between the controls operated by the pilot and the control surfaces. But newer a...
The analysis of one-dimensional reactor kinetics benchmark computations
International Nuclear Information System (INIS)
Sidell, J.
1975-11-01
During March 1973 the European American Committee on Reactor Physics proposed a series of simple one-dimensional reactor kinetics problems, with the intention of comparing the relative efficiencies of the numerical methods employed in various codes, which are currently in use in many national laboratories. This report reviews the contributions submitted to this benchmark exercise and attempts to assess the relative merits and drawbacks of the various theoretical and computer methods. (author)
Computer assisted analysis of hand radiographs in infantile hypophosphatasia carriers
International Nuclear Information System (INIS)
Chodirker, B.N.; Greenberg, C.R.; Manitoba Univ., Winnipeg, MB; Roy, D.; Cheang, M.; Evans, J.A.; Manitoba Univ., Winnipeg, MB; Manitoba Univ., Winnipeg, MB; Reed, M.H.; Manitoba Univ., Winnipeg, MB
1991-01-01
Hand radiographs of 49 carriers of infantile hypophosphatasia and 67 non-carriers were evaluated using two Apple IIe computer programs and an Apple Graphics Tablet. CAMPS was used to determine the bone lengths and calculate the metacarpophalangeal profiles. A newly developed program (ADAM) was used to determine bone density based on the percent cortical area of the second metacarpal. Carriers of infantile hypophosphatasia had significantly less dense bones. (orig.)
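Assuming the usual definition of percent cortical area from the outer (periosteal) and inner (medullary) midshaft diameters of the second metacarpal, a minimal sketch of the density measure is given below; the diameter values are invented.

def percent_cortical_area(outer_d: float, inner_d: float) -> float:
    """PCA = 100 * (D^2 - d^2) / D^2, assuming a circular cross-section."""
    return 100.0 * (outer_d**2 - inner_d**2) / outer_d**2

print(f"{percent_cortical_area(8.0, 4.5):.1f} % cortical area")   # about 68.4 %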
Mathematical modellings and computational methods for structural analysis of LMFBR's
International Nuclear Information System (INIS)
Liu, W.K.; Lam, D.
1983-01-01
In this paper, two aspects of nuclear reactor problems are discussed: modelling techniques and computational methods for large scale linear and nonlinear analyses of LMFBRs. For the nonlinear fluid-structure interaction problem with large deformation, an arbitrary Lagrangian-Eulerian description is applicable. For certain linear fluid-structure interaction problems, the structural response spectrum can be found via the 'added mass' approach. In a sense, the fluid inertia is accounted for by a mass matrix added to the structural mass. The fluid/structural modes of certain fluid-structure problems can be uncoupled to get the reduced added mass. The advantage of this approach is that it can account for the many repeated structures of a nuclear reactor. With regard to nonlinear dynamic problems, the coupled nonlinear fluid-structure equations usually have to be solved by direct time integration. The computation can be very expensive and time consuming for nonlinear problems. Thus, it is desirable to optimize accuracy and computational effort by using an implicit-explicit mixed time integration method. (orig.)
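A toy sketch of the 'added mass' idea described above: the fluid inertia enters as a mass matrix added to the structural mass, and the coupled natural frequencies follow from the generalized eigenproblem K x = omega^2 (M_s + M_a) x. The 2-DOF matrices below are invented.

import numpy as np
from scipy.linalg import eigh

K   = np.array([[2.0e6, -1.0e6],
                [-1.0e6, 1.0e6]])        # structural stiffness (N/m)
M_s = np.diag([100.0, 100.0])            # structural mass (kg)
M_a = np.array([[30.0, 10.0],
                [10.0, 30.0]])           # added (fluid) mass (kg)

for label, M in (("dry", M_s), ("wet", M_s + M_a)):
    w2, _ = eigh(K, M)                   # generalized symmetric eigenproblem
    freqs = np.sqrt(w2) / (2 * np.pi)
    print(label, "natural frequencies (Hz):", np.round(freqs, 2))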
Analysis of helium-ion scattering with a desktop computer
Butler, J. W.
1986-04-01
This paper describes a program written in an enhanced BASIC language for a desktop computer, for simulating the energy spectra of high-energy helium ions scattered into two concurrent detectors (backward and glancing). The program is designed for 512-channel spectra from samples containing up to 8 elements and 55 user-defined layers. The program is intended to meet the needs of analyses in materials sciences, such as metallurgy, where more than a few elements may be present, where several elements may be near each other in the periodic table, and where relatively deep structure may be important. These conditions preclude the use of completely automatic procedures for obtaining the sample composition directly from the scattered ion spectrum. Therefore, efficient methods are needed for entering and editing large amounts of composition data, with many iterations and with much feedback of information from the computer to the user. The internal video screen is used exclusively for verbal and numeric communications between user and computer. The composition matrix is edited on screen with a two-dimension forms-fill-in text editor and with many automatic procedures, such as doubling the number of layers with appropriate interpolations and extrapolations. The control center of the program is a bank of 10 keys that initiate on-event branching of program flow. The experimental and calculated spectra, including those of individual elements if desired, are displayed on an external color monitor, with an optional inset plot of the depth concentration profiles of the elements in the sample.
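One relation any such simulation of scattered helium-ion spectra relies on is the two-body kinematic factor; the sketch below evaluates it for a few target elements at a typical backscattering angle. Only standard elastic two-body kinematics is used, and detector geometry, stopping and depth effects are omitted.

import numpy as np

def kinematic_factor(m1: float, m2: float, theta_deg: float) -> float:
    """E_scattered / E_incident for elastic scattering (valid for m2 >= m1*sin(theta))."""
    th = np.radians(theta_deg)
    return ((np.sqrt(m2**2 - (m1 * np.sin(th))**2) + m1 * np.cos(th)) / (m1 + m2))**2

m_he = 4.0026                                            # helium-4 mass (u)
for element, m2 in (("C", 12.011), ("Si", 28.086), ("Fe", 55.845), ("Au", 196.97)):
    print(f"{element:2s}: K = {kinematic_factor(m_he, m2, 165):.3f}")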
Analysis of control room computers at nuclear power plants
International Nuclear Information System (INIS)
Leijonhufvud, S.; Lindholm, L.
1984-03-01
The following problems are analyzed: the development of a system (hardware and software), data, the acquisition of the system, and operation and service. The findings are: most reliability problems can be solved by doubling critical units; reliability in software is a quality that can only be created through development; reliability of computer systems in extremely unusual situations cannot be quantified or verified, except possibly for very small and functionally simple systems; to attain the highest possible reliability, such simple systems have to contain one or very few functions, be functionally simple, and be application-transparent, i.e. the internal function of the system should be independent of the status of the process; a computer system will compete successfully with other possible systems regarding reliability for the following reasons: if the function is simple enough for other systems, the computer system would be small; if the functions cannot be realized by other systems, the computer system would complement the human effort, and the man-machine system would be a better solution than no system, possibly better than human function alone. (Aa)
Computational techniques in gamma-ray skyshine analysis
International Nuclear Information System (INIS)
George, D.L.
1988-12-01
Two computer codes were developed to analyze gamma-ray skyshine, the scattering of gamma photons by air molecules. A review of previous gamma-ray skyshine studies discusses several Monte Carlo codes, programs using a single-scatter model, and the MicroSkyshine program for microcomputers. A benchmark gamma-ray skyshine experiment performed at Kansas State University is also described. A single-scatter numerical model was presented which traces photons from the source to their first scatter, then applies a buildup factor along a direct path from the scattering point to a detector. The FORTRAN code SKY, developed with this model before the present study, was modified to use Gauss quadrature, recent photon attenuation data and a more accurate buildup approximation. The resulting code, SILOGP, computes response from a point photon source on the axis of a silo, with and without concrete shielding over the opening. Another program, WALLGP, was developed using the same model to compute response from a point gamma source behind a perfectly absorbing wall, with and without shielding overhead. 29 refs., 48 figs., 13 tabs
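A minimal sketch of the attenuation-plus-buildup evaluation applied along the direct path from a scattering point to the detector, as the abstract describes; the source strength, air attenuation coefficient and Berger buildup coefficients below are illustrative assumptions, not values from SKY, SILOGP or WALLGP.

```python
import math

def berger_buildup(mu_r, a=1.0, b=0.05):
    """Berger-form buildup factor B = 1 + a*mu_r*exp(b*mu_r); coefficients are illustrative."""
    return 1.0 + a * mu_r * math.exp(b * mu_r)

def direct_path_flux(source_strength, mu_air, r):
    """Photon flux at distance r along a direct path in air: inverse-square spreading,
    exponential attenuation, and a buildup factor accounting for in-scattered photons."""
    mu_r = mu_air * r
    return source_strength * math.exp(-mu_r) * berger_buildup(mu_r) / (4.0 * math.pi * r ** 2)

# Illustrative numbers: 1e10 photons/s source, mu_air ~ 0.009 1/m near 1 MeV, detector 100 m away
print(f"{direct_path_flux(1.0e10, 0.009, 100.0):.3e} photons/(m^2 s)")
```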
ELECTRONIC EVIDENCE IN THE JUDICIAL PROCEEDINGS AND COMPUTER FORENSIC ANALYSIS
Directory of Open Access Journals (Sweden)
Marija Boban
2017-01-01
Full Text Available Today’s perspective of the information society is characterized by the terminology of modern dictionaries of globalization, including terms such as convergence, digitization (of media, technology and/or telecommunications) and mobility of people or technology. Each term signals progress and development, a positive sign of the rise of the information society. On the other hand, in a virtual environment the traditional evidence used in judicial proceedings, the document on a paper substrate, is becoming electronic evidence, and the processes for managing it and the criteria for its admissibility differ from those for traditional evidence. The rapid growth of computer data created new opportunities and the growth of new forms of computing and cyber crime, but also new ways of proof in court cases that were unavailable just a few decades ago. The authors of this paper describe new trends in the development of the information society and the emergence of electronic evidence, with emphasis on the impact of the development of computer crime on electronic evidence; the concept, legal regulation and probative value of electronic evidence, and in particular of electronic documents; and the issue of expert examination of electronic evidence and electronic documents in court proceedings.
Computer-simulated experiments and computer games: a method of design analysis
Directory of Open Access Journals (Sweden)
Jerome J. Leary
1995-12-01
Full Text Available Through the new modularization of the undergraduate science degree at the University of Brighton, larger numbers of students are choosing to take some science modules which include an amount of laboratory practical work. Indeed, within energy studies, the fuels and combustion module, for which the computer simulations were written, has seen a fourfold increase in student numbers from twelve to around fifty. Fitting out additional laboratories with new equipment to accommodate this increase presented problems: the laboratory space did not exist; fitting out the laboratories with new equipment would involve a relatively large capital spend per student for equipment that would be used infrequently; and, because some of the experiments use inflammable liquids and gases, additional staff would be needed for laboratory supervision.
A computer program for structural analysis of fuel elements
International Nuclear Information System (INIS)
Hayashi, I.M.V.; Perrotta, J.A.
1988-01-01
The code ELCOM for the matrix analysis of tubular structures coupled by rigid spacers, typical of PWR fuel elements, is presented. ELCOM performs a static structural analysis, in which the displacements and internal forces are obtained for each structure at the joints with the spacers, and also obtains the natural frequencies and vibration modes of an equivalent integrated structure. The ELCOM results are compared with a PWR fuel element structural analysis reported in a published paper. (author) [pt
Data analysis of asymmetric structures advanced approaches in computational statistics
Saito, Takayuki
2004-01-01
Data Analysis of Asymmetric Structures provides a comprehensive presentation of a variety of models and theories for the analysis of asymmetry and its applications and provides a wealth of new approaches in every section. It meets both the practical and theoretical needs of research professionals across a wide range of disciplines and considers data analysis in fields such as psychology, sociology, social science, ecology, and marketing. In seven comprehensive chapters this guide details theories, methods, and models for the analysis of asymmetric structures in a variety of disciplines and presents future opportunities and challenges affecting research developments and business applications.
Visual Cluster Analysis for Computing Tasks at Workflow Management System of the ATLAS Experiment
Grigoryeva, Maria; The ATLAS collaboration
2018-01-01
Hundreds of petabytes of experimental data in high energy and nuclear physics (HENP) have already been obtained by unique scientific facilities such as LHC, RHIC, and KEK. As the accelerators are modernized (energy and luminosity are increased), data volumes are rapidly growing and have reached the exabyte scale, which also increases the number of analysis and data processing tasks competing continuously for computational resources. This growth in processing tasks is met by expanding the computing environment with high-performance computing resources, forming a heterogeneous distributed computing environment (hundreds of distributed computing centers). In addition, errors occur while executing data analysis and processing tasks, caused by software and hardware failures. With a distributed model of data processing and analysis, the optimization of data management and workload systems becomes a fundamental task, and the ...
CASKETSS: a computer code system for thermal and structural analysis of nuclear fuel shipping casks
International Nuclear Information System (INIS)
Ikushima, Takeshi
1989-02-01
A computer program CASKETSS has been developed for the purpose of thermal and structural analysis of nuclear fuel shipping casks. CASKETSS means a modular code system for CASK Evaluation code system Thermal and Structural Safety. The main features of CASKETSS are as follows: (1) Thermal and structural analysis computer programs for one-, two- and three-dimensional geometries are contained in the code system. (2) Some of the computer programs in the code system have been programmed to provide near optimal speed on vector processing computers. (3) Data libraries for thermal and structural analysis are provided in the code system. (4) An input data generator is provided in the code system. (5) A graphic computer program is provided in the code system. In the paper, a brief illustration of the calculation method, input data and sample calculations are presented. (author)
Halligan, Brian D; Geiger, Joey F; Vallejos, Andrew K; Greene, Andrew S; Twigger, Simon N
2009-06-01
One of the major difficulties for many laboratories setting up proteomics programs has been obtaining and maintaining the computational infrastructure required for the analysis of the large flow of proteomics data. We describe a system that combines distributed cloud computing and open source software to allow laboratories to set up scalable virtual proteomics analysis clusters without the investment in computational hardware or software licensing fees. Additionally, the pricing structure of distributed computing providers, such as Amazon Web Services, allows laboratories or even individuals to have large-scale computational resources at their disposal at a very low cost per run. We provide detailed step-by-step instructions on how to implement the virtual proteomics analysis clusters as well as a list of current available preconfigured Amazon machine images containing the OMSSA and X!Tandem search algorithms and sequence databases on the Medical College of Wisconsin Proteomics Center Web site ( http://proteomics.mcw.edu/vipdac ).
International Nuclear Information System (INIS)
Heo, Jaeseok; Kim, Kyung Doo
2015-01-01
Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of the sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform the model calibration, uncertainty propagation, Chi-square linearity test, and sensitivity analysis for both linear and nonlinear problems. The PAPIRUS was developed by implementing multiple packages of methodologies, and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in the PAPIRUS with multiple computing resources and proper communications between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description about the PAPIRUS with a graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, Chi-square linearity test, and sensitivity analysis implemented in the toolkit with some results obtained by each module of the software. Parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper
Energy Technology Data Exchange (ETDEWEB)
Heo, Jaeseok, E-mail: jheo@kaeri.re.kr; Kim, Kyung Doo, E-mail: kdkim@kaeri.re.kr
2015-10-15
Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of the sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform the model calibration, uncertainty propagation, Chi-square linearity test, and sensitivity analysis for both linear and nonlinear problems. The PAPIRUS was developed by implementing multiple packages of methodologies, and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in the PAPIRUS with multiple computing resources and proper communications between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description about the PAPIRUS with a graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, Chi-square linearity test, and sensitivity analysis implemented in the toolkit with some results obtained by each module of the software. Parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper.
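The abstract does not give PAPIRUS's equations, but the basic sampling-based uncertainty-propagation step that such a toolkit automates can be sketched as follows; the toy model, parameter distributions and values here are purely illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_model(k, h):
    """Stand-in for an engineering simulation: a steady temperature rise proportional
    to the heat load h and inversely proportional to the conductance k (illustrative)."""
    return h / k

# Assumed (illustrative) parameter uncertainties
k_samples = rng.normal(loc=50.0, scale=5.0, size=10_000)     # conductance, W/K
h_samples = rng.normal(loc=1000.0, scale=50.0, size=10_000)  # heat load, W

outputs = toy_model(k_samples, h_samples)

print(f"mean response : {outputs.mean():.2f} K")
print(f"std  response : {outputs.std():.2f} K")
print(f"95% interval  : {np.percentile(outputs, [2.5, 97.5])}")
```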
Routing performance analysis and optimization within a massively parallel computer
Archer, Charles Jens; Peters, Amanda; Pinnow, Kurt Walter; Swartz, Brent Allen
2013-04-16
An apparatus, program product and method optimize the operation of a massively parallel computer system by, in part, receiving actual performance data concerning an application executed by the plurality of interconnected nodes, and analyzing the actual performance data to identify an actual performance pattern. A desired performance pattern may be determined for the application, and an algorithm may be selected from among a plurality of algorithms stored within a memory, the algorithm being configured to achieve the desired performance pattern based on the actual performance data.
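A minimal sketch (not the patented mechanism) of the idea of comparing an actual performance pattern against a desired one and selecting an algorithm accordingly; the stored algorithm profiles and the least-squares matching rule are illustrative assumptions.

```python
import numpy as np

# Hypothetical per-link utilisation profiles each routing algorithm tends to produce
ALGORITHM_PROFILES = {
    "minimal_path": np.array([0.9, 0.2, 0.2, 0.9]),
    "adaptive":     np.array([0.6, 0.5, 0.5, 0.6]),
    "randomized":   np.array([0.55, 0.55, 0.55, 0.55]),
}

def select_algorithm(measured, desired):
    """Return the algorithm whose profile, scaled to the measured total load,
    is closest (least squares) to the desired performance pattern."""
    total_load = measured.sum()
    best, best_err = None, float("inf")
    for name, profile in ALGORITHM_PROFILES.items():
        predicted = profile / profile.sum() * total_load
        err = np.sum((predicted - desired) ** 2)
        if err < best_err:
            best, best_err = name, err
    return best

measured = np.array([0.95, 0.10, 0.15, 0.95])   # observed hot/cold links
desired = np.array([0.55, 0.55, 0.55, 0.55])    # want balanced utilisation
print(select_algorithm(measured, desired))       # -> "randomized"
```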
Comprehensive analysis of a Radiology Operations Management computer system.
Arenson, R L; London, J W
1979-11-01
The Radiology Operations Management computer system at the Hospital of the University of Pennsylvania is discussed. The scheduling and file room modules are based on the system at Massachusetts General Hospital. Patient delays are indicated by the patient tracking module. A reporting module allows CRT/keyboard entry by transcriptionists, entry of standard reports by radiologists using bar code labels, and entry by radiologists using a specially designed diagnostic reporting terminal. Time-flow analyses demonstrate a significant improvement in scheduling, patient waiting, retrieval of radiographs, and report delivery. Recovery of previously lost billing contributes to the proven cost effectiveness of this system.
Cost-effectiveness analysis of computer-based assessment
Directory of Open Access Journals (Sweden)
Pauline Loewenberger
2003-12-01
Full Text Available The need for more cost-effective and pedagogically acceptable combinations of teaching and learning methods to sustain increasing student numbers means that the use of innovative methods, using technology, is accelerating. There is an expectation that economies of scale might provide greater cost-effectiveness whilst also enhancing student learning. The difficulties and complexities of these expectations are considered in this paper, which explores the challenges faced by those wishing to evaluate the cost-effectiveness of computer-based assessment (CBA). The paper outlines the outcomes of a survey which attempted to gather information about the costs and benefits of CBA.
NEWCAS: an interactive computer program for particle size analysis.
Energy Technology Data Exchange (ETDEWEB)
Hill, M.A.; Watson, C.R.; Moss, O.R.
1977-12-01
NEWCAS, a FORTRAN program executable on PDP 11/70, was used to calculate the respirable fraction of aerosols from cascade impactor data. This report describes how NEWCAS works and how to use it. Included is a complete program listing. A novel feature of the program is the method used to display a log-normal probability plot on an ordinary (non-graphics) computer terminal. Calculations are independent of how the stage activity is measured. NEWCAS always assumes a log-normal distribution. 8 figures. (RWR)
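A minimal sketch of the kind of calculation NEWCAS performs, under the stated log-normal assumption: cumulative stage fractions are fitted on a probit-versus-log-diameter line to obtain the MMAD and GSD, from which a respirable fraction can be evaluated. The stage cut-points, activities and the simple 10 um respirable cut used below are illustrative, not NEWCAS's actual conventions.

```python
import numpy as np
from scipy.stats import norm

# Illustrative cascade-impactor data: stage cut diameters (um, descending) and the
# activity collected on each stage, plus a backup filter.
cuts = np.array([10.0, 5.0, 2.5, 1.0, 0.5])
activity = np.array([5.0, 10.0, 25.0, 30.0, 20.0])
filter_activity = 10.0

total = activity.sum() + filter_activity
# Fraction of activity on particles smaller than each stage's cut diameter
frac_below = (total - np.cumsum(activity)) / total

# Log-normal fit: probits of the cumulative fractions are linear in log(diameter)
probits = norm.ppf(frac_below)
slope, intercept = np.polyfit(np.log(cuts), probits, 1)

mmad = np.exp(-intercept / slope)   # mass median aerodynamic diameter
gsd = np.exp(1.0 / slope)           # geometric standard deviation

# Respirable fraction, here taken simply as the activity fraction below a 10 um cut
respirable = norm.cdf((np.log(10.0) - np.log(mmad)) * slope)
print(f"MMAD = {mmad:.2f} um, GSD = {gsd:.2f}, respirable fraction = {respirable:.2f}")
```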
GIANT: a computer code for General Interactive ANalysis of Trajectories
International Nuclear Information System (INIS)
Jaeger, J.; Lee, M.; Servranckx, R.; Shoaee, H.
1985-04-01
Many model-driven diagnostic and correction procedures have been developed at SLAC for the on-line computer controlled operation of SPEAR, PEP, the LINAC, and the Electron Damping Ring. In order to facilitate future applications and enhancements, these procedures are being collected into a single program, GIANT. The program allows interactive diagnosis as well as performance optimization of any beam transport line or circular machine. The test systems for GIANT are those of the SLC project. The organization of this program and some of the recent applications of the procedures will be described in this paper
Analysis of computer images in the presence of metals
Buzmakov, Alexey; Ingacheva, Anastasia; Prun, Victor; Nikolaev, Dmitry; Chukalina, Marina; Ferrero, Claudio; Asadchikov, Victor
2018-04-01
Artifacts caused by intensely absorbing inclusions are encountered in computed tomography via polychromatic scanning and may obscure or simulate pathologies in medical applications. To improve the quality of reconstruction in the presence of high-Z inclusions, we previously proposed and tested with synthetic data an iterative technique with a soft penalty mimicking linear inequalities on the photon-starved rays. This note reports a test at the tomographic laboratory set-up at the Institute of Crystallography FSRC "Crystallography and Photonics" RAS, in which tomographic scans were successfully made of a temporary tooth without an inclusion and with a Pb inclusion.
A Trend Analysis of Computer Literacy Skills of Preservice Teachers During Six Academic Years.
Sheffield, Caryl J.
1998-01-01
Analyzes trends in computer-literacy skills of preservice teachers during the period 1991/92 to 1996/97. A significant linear pattern of increasing means was found in word processing, spreadsheet, hardware, operating system software, and the mouse. Analysis provides a perspective on how increasing access to computers in high school translates into…
Enriquez, Judith Guevarra
2010-01-01
In this article, centrality is explored as a measure of computer-mediated communication (CMC) in networked learning. Centrality measure is quite common in performing social network analysis (SNA) and in analysing social cohesion, strength of ties and influence in CMC, and computer-supported collaborative learning research. It argues that measuring…
An Analysis of Creative Process Learning in Computer Game Activities through Player Experiences
Inchamnan, Wilawan
2016-01-01
This research investigates the extent to which creative processes can be fostered through computer gaming. It focuses on creative components in games that have been specifically designed for educational purposes: Digital Game Based Learning (DGBL). A behavior analysis for measuring the creative potential of computer game activities and learning…
Lynne M. Westphal
2000-01-01
By using computer packages designed for qualitative data analysis a researcher can increase trustworthiness (i.e., validity and reliability) of conclusions drawn from qualitative research results. This paper examines trustworthiness issues and the role of computer software (QSR's NUD*IST) in the context of a current research project investigating the social...
Analysis of irradiated biogenic amines by computational chemistry and spectroscopy
International Nuclear Information System (INIS)
Oliveira, Jorge L.S.P.; Borges Junior, Itamar; Cardozo, Monique; Souza, Stefania P.; Lima, Antonio L.S.; Lima, Keila S.C.
2011-01-01
Biogenic Amines (BA) are nitrogenous compounds able to cause food poisoning. In this work, we studied tyramine, one of the most common BAs present in foods, by combining experimentally measured IR (Infrared) and GC/MS (Gas Chromatography / Mass Spectrometry) spectra with computational quantum chemistry. Density Functional Theory (DFT) and the Deformed Atoms in Molecules (DMA) method were used to partition the electronic densities in a chemically intuitive way and to compute the electrostatic potentials of the molecule in order to identify the acid and basic sites. The commercial standard was irradiated using a Cs-137 irradiator, and each sample was identified by IR and GC/MS. Calculated and experimental IR spectra were compared. We observed that ionizing gamma irradiation was very effective in decreasing the population of the standard amine, resulting in fragments that could be rationalized through the quantum chemistry calculations. In particular, we could locate the acid and basic sites of both molecules and identify possible sites of structural weakness, which allowed us to propose mechanistic schemes for the breaking of chemical bonds by the irradiation. Moreover, from this work we hope it will also be possible to properly choose the dose of gamma irradiation that should be applied to eliminate each type of contamination. (author)
Quantitative analysis of cholesteatoma using high resolution computed tomography
International Nuclear Information System (INIS)
Kikuchi, Shigeru; Yamasoba, Tatsuya; Iinuma, Toshitaka.
1992-01-01
Seventy-three cases of adult cholesteatoma, including 52 cases of pars flaccida type cholesteatoma and 21 of pars tensa type cholesteatoma, were examined using high resolution computed tomography, in both axial (lateral semicircular canal plane) and coronal sections (cochlear, vestibular and antral plane). These cases were classified into two subtypes according to the presence of extension of cholesteatoma into the antrum. Sixty cases of chronic otitis media with central perforation (COM) were also examined as controls. Various locations of the middle ear cavity were measured in terms of size in comparison with pars flaccida type cholesteatoma, pars tensa type cholesteatoma and COM. The width of the attic was significantly larger in both pars flaccida type and pars tensa type cholesteatoma than in COM. With pars flaccida type cholesteatoma there was a significantly larger distance between the malleus and lateral wall of the attic than with COM. In contrast, the distance between the malleus and medial wall of the attic was significantly larger with pars tensa type cholesteatoma than with COM. With cholesteatoma extending into the antrum, regardless of the type of cholesteatoma, there were significantly larger distances than with COM at the following sites: the width and height of the aditus ad antrum, and the width, height and anterior-posterior diameter of the antrum. However, these distances were not significantly different between cholesteatoma without extension into the antrum and COM. The hitherto demonstrated qualitative impressions of bone destruction in cholesteatoma were quantitatively verified in detail using high resolution computed tomography. (author)
An empirical analysis of journal policy effectiveness for computational reproducibility.
Stodden, Victoria; Seiler, Jennifer; Ma, Zhaokun
2018-03-13
A key component of scientific communication is sufficient information for other researchers in the field to reproduce published findings. For computational and data-enabled research, this has often been interpreted to mean making available the raw data from which results were generated, the computer code that generated the findings, and any additional information needed such as workflows and input parameters. Many journals are revising author guidelines to include data and code availability. This work evaluates the effectiveness of journal policy that requires the data and code necessary for reproducibility be made available postpublication by the authors upon request. We assess the effectiveness of such a policy by (i) requesting data and code from authors and (ii) attempting replication of the published findings. We chose a random sample of 204 scientific papers published in the journal Science after the implementation of their policy in February 2011. We found that we were able to obtain artifacts from 44% of our sample and were able to reproduce the findings for 26%. We find this policy (author remission of data and code postpublication upon request) an improvement over no policy, but currently insufficient for reproducibility.
FRANTIC: a computer code for time dependent unavailability analysis
International Nuclear Information System (INIS)
Vesely, W.E.; Goldberg, F.F.
1977-03-01
The FRANTIC computer code evaluates the time dependent and average unavailability for any general system model. The code is written in FORTRAN IV for the IBM 370 computer. Non-repairable components, monitored components, and periodically tested components are handled. One unique feature of FRANTIC is the detailed, time dependent modeling of periodic testing which includes the effects of test downtimes, test overrides, detection inefficiencies, and test-caused failures. The exponential distribution is used for the component failure times and periodic equations are developed for the testing and repair contributions. Human errors and common mode failures can be included by assigning an appropriate constant probability for the contributors. The output from FRANTIC consists of tables and plots of the system unavailability along with a breakdown of the unavailability contributions. Sensitivity studies can be simply performed and a wide range of tables and plots can be obtained for reporting purposes. The FRANTIC code represents a first step in the development of an approach that can be of direct value in future system evaluations. Modifications resulting from use of the code, along with the development of reliability data based on operating reactor experience, can be expected to provide increased confidence in its use and potential application to the licensing process
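A minimal sketch of the time-dependent unavailability of a single periodically tested component with a constant failure rate, the basic building block such a code generalizes; the test-interval, downtime and failure-rate values are illustrative, and FRANTIC's override, detection-inefficiency and repair details are not modeled here.

```python
import numpy as np

def unavailability(t, lam, test_interval, test_duration, q_test_caused=0.0):
    """Time-dependent unavailability of one periodically tested component (sketch only).
    lam           : constant failure rate (1/h)
    test_interval : time between the starts of successive tests (h)
    test_duration : downtime per test, during which the component is unavailable (h)
    q_test_caused : probability that the test itself leaves the component failed"""
    t_in_cycle = np.asarray(t) % test_interval
    in_test = t_in_cycle < test_duration
    time_since_test = np.maximum(t_in_cycle - test_duration, 0.0)
    q_failed = q_test_caused + (1.0 - q_test_caused) * (1.0 - np.exp(-lam * time_since_test))
    return np.where(in_test, 1.0, q_failed)   # component is down/overridden during the test

times = np.linspace(0.0, 4380.0, 50_001)       # half a year in hours, uniform grid
q = unavailability(times, lam=1.0e-4, test_interval=720.0, test_duration=4.0)
print(f"average unavailability over the period: {q.mean():.4f}")
```

With these illustrative numbers the average is close to the familiar hand estimate lambda*T/2 plus the test-downtime fraction.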
Computational Fluid Dynamics Analysis of High Injection Pressure Blended Biodiesel
Khalid, Amir; Jaat, Norrizam; Faisal Hushim, Mohd; Manshoor, Bukhari; Zaman, Izzuddin; Sapit, Azwan; Razali, Azahari
2017-08-01
Biodiesel has great potential as a substitute for petroleum fuel for the purpose of achieving clean energy production and emission reduction. Among the methods that can control the combustion properties, controlling the fuel injection conditions is one of the most successful. The purpose of this study is to investigate the effect of high injection pressure of biodiesel blends on spray characteristics using Computational Fluid Dynamics (CFD). Injection pressure was observed at 220 MPa, 250 MPa and 280 MPa. The ambient temperature was held at 1050 K and the ambient pressure at 8 MPa in order to simulate the effect of boost pressure or a turbocharger during the combustion process. Computational Fluid Dynamics was used to investigate the spray characteristics of biodiesel blends such as spray penetration length, spray angle and mixture formation of fuel-air mixing. The results show that as injection pressure increases, a wider spray angle is produced by both the biodiesel blends and diesel fuel. The injection pressure strongly affects mixture formation and the characteristics of the fuel spray; a longer spray penetration length promotes fuel and air mixing.
A handheld computer-aided diagnosis system and simulated analysis
Su, Mingjian; Zhang, Xuejun; Liu, Brent; Su, Kening; Louie, Ryan
2016-03-01
This paper describes a Computer Aided Diagnosis (CAD) system based on a cellphone and a distributed cluster. One of the bottlenecks in building a CAD system for clinical practice is the storage and processing of large numbers of pathology samples freely among different devices, and normal pattern matching algorithms on large-scale image sets are very time consuming. Distributed computation on a cluster has demonstrated the ability to relieve this bottleneck. We develop a system enabling the user to compare the mass image to a dataset with a feature table by sending datasets to the Generic Data Handler Module in Hadoop, where pattern recognition is undertaken for the detection of skin diseases. A single and combination retrieval algorithm in the data pipeline, based on the MapReduce framework, is used in our system in order to make an optimal choice between recognition accuracy and system cost. The profile of the lesion area is drawn by doctors manually on the screen and then uploaded to the server. In our evaluation experiment, a diagnosis hit rate of 75% was obtained by testing 100 patients with skin illness. Our system has the potential to help in building a novel medical image dataset by collecting large amounts of gold standard data during medical diagnosis. Once the project is online, participants are free to join, and eventually an abundant sample dataset will be gathered, sufficient for learning. These results demonstrate that our technology is very promising and expected to be used in clinical practice.
An Exploratory Analysis of Computer Mediated Communications on Cyberstalking Severity
Directory of Open Access Journals (Sweden)
Stephen D. Barnes
2007-09-01
Full Text Available The interaction between disjunctive interpersonal relationships, those where the parties to the relationship disagree on the goals of the relationship, and the use of computer mediated communications channels is a relatively unexplored domain. Bargh (2002) suggests that CMC channels can amplify the development of interpersonal relationships, and notes that the effect is not constant across communications activities. This proposal suggests a line of research that explores the interaction between computer mediated communications (CMC) and stalking, which is a common form of disjunctive relationship. Field data from cyberstalking cases will be used to look at the effects of CMC channels on stalking case severity, and to explore the relative impacts of CMC channel characteristics on such cases. To accomplish this, a ratio-scaled measure of stalking case severity is proposed for use in exploring the relationship between case severity and CMC media characteristics, anonymity, and the prior relationship between the stalker and the victim. Expected results are identified, and follow-up research is proposed.
Teodorescu, Liliana; Britton, David; Glover, Nigel; Heinrich, Gudrun; Lauret, Jérôme; Naumann, Axel; Speer, Thomas; Teixeira-Dias, Pedro
2012-06-01
ACAT2011 This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011) which took place on 5-7 September 2011 at Brunel University, UK. The workshop series, which began in 1990 in Lyon, France, brings together computer science researchers and practitioners, and researchers from particle physics and related fields in order to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. It is a forum for the exchange of ideas among the fields, exploring and promoting cutting-edge computing, data analysis and theoretical calculation techniques in fundamental physics research. This year's edition of the workshop brought together over 100 participants from all over the world. 14 invited speakers presented key topics on computing ecosystems, cloud computing, multivariate data analysis, symbolic and automatic theoretical calculations as well as computing and data analysis challenges in astrophysics, bioinformatics and musicology. Over 80 other talks and posters presented state-of-the art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. Panel and round table discussions on data management and multivariate data analysis uncovered new ideas and collaboration opportunities in the respective areas. This edition of ACAT was generously sponsored by the Science and Technology Facility Council (STFC), the Institute for Particle Physics Phenomenology (IPPP) at Durham University, Brookhaven National Laboratory in the USA and Dell. We would like to thank all the participants of the workshop for the high level of their scientific contributions and for the enthusiastic participation in all its activities which were, ultimately, the key factors in the
Hussmann, Katja; Grande, Marion; Meffert, Elisabeth; Christoph, Swetlana; Piefke, Martina; Willmes, Klaus; Huber, Walter
2012-01-01
Although generally accepted as an important part of aphasia assessment, detailed analysis of spontaneous speech is rarely carried out in clinical practice mostly due to time limitations. The Aachener Sprachanalyse (ASPA; Aachen Speech Analysis) is a computer-assisted method for the quantitative analysis of German spontaneous speech that allows for…
Putten, Jim Vander; Nolen, Amanda L.
2010-01-01
This study compared qualitative research results obtained by manual constant comparative analysis with results obtained by computer software analysis of the same data. An investigation of issues of trustworthiness and accuracy ensued. Results indicated that the inductive constant comparative data analysis generated 51 codes and two coding levels…
Toward a computer-aided methodology for discourse analysis ...
African Journals Online (AJOL)
aided methods to discourse analysis”. This project aims to develop an e-learning environment dedicated to documenting, evaluating and teaching the use of corpus linguistic tools suitable for interpretative text analysis. Even though its roots are in ...
Inferring Group Processes from Computer-Mediated Affective Text Analysis
Energy Technology Data Exchange (ETDEWEB)
Schryver, Jack C [ORNL; Begoli, Edmon [ORNL; Jose, Ajith [Missouri University of Science and Technology; Griffin, Christopher [Pennsylvania State University
2011-02-01
Political communications in the form of unstructured text convey rich connotative meaning that can reveal underlying group social processes. Previous research has focused on sentiment analysis at the document level, but we extend this analysis to sub-document levels through a detailed analysis of affective relationships between entities extracted from a document. Instead of pure sentiment analysis, which is just positive or negative, we explore nuances of affective meaning in 22 affect categories. Our affect propagation algorithm automatically calculates and displays extracted affective relationships among entities in graphical form in our prototype (TEAMSTER), starting with seed lists of affect terms. Several useful metrics are defined to infer underlying group processes by aggregating affective relationships discovered in a text. Our approach has been validated with annotated documents from the MPQA corpus, achieving a performance gain of 74% over comparable random guessers.
Computational fluid dynamics analysis of a mixed flow pump impeller
African Journals Online (AJOL)
ATHARVA
International Journal of Engineering, Science and Technology ... From the CFD analysis software and advanced post processing tools the complex flow inside the ... The numerical simulation can provide quite accurate information on the fluid ...
Directory of Open Access Journals (Sweden)
Mohamed Kenawey
2016-12-01
Conclusion: Computer assisted lower limb alignment analysis is reliable whether using a graphics editing program or specialized planning software. However, slightly higher variability can be expected for angles away from the knee joint.
Fast Virtual Fractional Flow Reserve Based Upon Steady-State Computational Fluid Dynamics Analysis
Directory of Open Access Journals (Sweden)
Paul D. Morris, PhD
2017-08-01
Full Text Available Fractional flow reserve (FFR)-guided percutaneous intervention is superior to standard assessment but remains underused. The authors have developed a novel “pseudotransient” analysis protocol for computing virtual fractional flow reserve (vFFR) based upon angiographic images and steady-state computational fluid dynamics. This protocol generates vFFR results in 189 s (cf. >24 h for transient analysis) using a desktop PC, with <1% error relative to that of full-transient computational fluid dynamics analysis. Sensitivity analysis demonstrated that physiological lesion significance was influenced less by coronary or lesion anatomy (33%) and more by microvascular physiology (59%). If coronary microvascular resistance can be estimated, vFFR can be accurately computed in less time than it takes to make invasive measurements.
May Day: A computer code to perform uncertainty and sensitivity analysis. Manuals
International Nuclear Information System (INIS)
Bolado, R.; Alonso, A.; Moya, J.M.
1996-07-01
The computer program May Day was developed to carry out uncertainty and sensitivity analysis in the evaluation of radioactive waste storage. May Day was produced by the Polytechnical University of Madrid. (Author)
Computational Intelligence Techniques for Electro-Physiological Data Analysis
Riera Sardà, Alexandre
2012-01-01
This work contains the efforts I have made in the last years in the field of Electrophysiological data analysis. Most of the work has been done at Starlab Barcelona S.L. and part of it at the Neurodynamics Laboratory of the Department of Psychiatry and Clinical Psychobiology of the University of Barcelona. The main work deals with the analysis of electroencephalography (EEG) signals, although other signals, such as electrocardiography (ECG), electroculography (EOG) and electromiography (EMG) ...
Fast Computation and Assessment Methods in Power System Analysis
Nagata, Masaki
Power system analysis is essential for efficient and reliable power system operation and control. Recently, online security assessment systems have become increasingly important, as more efficient use of power networks is required. In this article, fast power system analysis techniques such as contingency screening, parallel processing and intelligent systems application are briefly surveyed from the viewpoint of their application to online dynamic security assessment.
1994-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1993 through March 31, 1994. The major categories of the current ICASE research program are: (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and (4) computer science.
Computer methods for transient fluid-structure analysis of nuclear reactors
International Nuclear Information System (INIS)
Belytschko, T.; Liu, W.K.
1985-01-01
Fluid-structure interaction problems in nuclear engineering are categorized according to the dominant physical phenomena and the appropriate computational methods. Linear fluid models that are considered include acoustic fluids, incompressible fluids undergoing small disturbances, and small amplitude sloshing. Methods available in general-purpose codes for these linear fluid problems are described. For nonlinear fluid problems, the major features of alternative computational treatments are reviewed; some special-purpose and multipurpose computer codes applicable to these problems are then described. For illustration, some examples of nuclear reactor problems that entail coupled fluid-structure analysis are described along with computational results
Analysis of ring enhancement in the cranial computed tomography
Energy Technology Data Exchange (ETDEWEB)
Huh, Seung Jae; Chung, Yong In; Chang, Kee Hyun [College of Medicine, Seoul National University, Seoul (Korea, Republic of)
1980-12-15
A total of 83 cases with ring enhancement in cranial computed tomography were radiologically analyzed to determine the specific CT findings of primary and metastatic brain tumors, inflammatory disease, resolving hematoma, and cerebral infarction. The brief results are as follows. Glioblastoma multiforme shows a characteristic thick or thin irregular ring enhancement with significant mass effect and surrounding edema. Most metastatic tumors also show irregular thick- or thin-walled ring enhancement with significant surrounding edema. Tumoral hemorrhage was observed in metastatic melanoma, breast cancer, and lung cancer. Brain abscesses usually show characteristic thin, regular and smooth ring enhancement with moderate peripheral edema. Parasitic cysts also show thin regular ring enhancement with varying degrees of surrounding edema. Ring enhancement in resolving hematomas and cerebral infarctions usually occurs about 10-30 days after the onset of symptoms and shows a thin and regular ring pattern without significant surrounding edema.
Analysis of ring enhancement in the cranial computed tomography
International Nuclear Information System (INIS)
Huh, Seung Jae; Chung, Yong In; Chang, Kee Hyun
1980-01-01
A total of 83 cases with ring enhancement in cranial computed tomography were radiologically analyzed to determine the specific CT findings of primary and metastatic brain tumors, inflammatory disease, resolving hematoma, and cerebral infarction. The brief results are as follows. Glioblastoma multiforme shows a characteristic thick or thin irregular ring enhancement with significant mass effect and surrounding edema. Most metastatic tumors also show irregular thick- or thin-walled ring enhancement with significant surrounding edema. Tumoral hemorrhage was observed in metastatic melanoma, breast cancer, and lung cancer. Brain abscesses usually show characteristic thin, regular and smooth ring enhancement with moderate peripheral edema. Parasitic cysts also show thin regular ring enhancement with varying degrees of surrounding edema. Ring enhancement in resolving hematomas and cerebral infarctions usually occurs about 10-30 days after the onset of symptoms and shows a thin and regular ring pattern without significant surrounding edema.
Computer analysis of sodium cold trap design and performance
International Nuclear Information System (INIS)
McPheeters, C.C.; Raue, D.J.
1983-11-01
Normal steam-side corrosion of steam-generator tubes in Liquid Metal Fast Breeder Reactors (LMFBRs) results in liberation of hydrogen, and most of this hydrogen diffuses through the tubes into the heat-transfer sodium and must be removed by the purification system. Cold traps are normally used to purify sodium, and they operate by cooling the sodium to temperatures near the melting point, where soluble impurities including hydrogen and oxygen precipitate as NaH and Na2O, respectively. A computer model was developed to simulate the processes that occur in sodium cold traps. The Model for Analyzing Sodium Cold Traps (MASCOT) simulates any desired configuration of mesh arrangements and dimensions and calculates pressure drops and flow distributions, temperature profiles, impurity concentration profiles, and impurity mass distributions
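A minimal sketch of the precipitation bookkeeping any cold-trap model must perform: impurity solubility falls with temperature, so sodium cooled to the trap temperature can hold only the saturation concentration and the excess precipitates. The Arrhenius-type solubility coefficients and flow numbers below are placeholders, not MASCOT's correlations.

```python
import math

def saturation_ppm(T_kelvin, A=6.5, B=3000.0):
    """Solubility law of the form log10(S) = A - B/T for an impurity in sodium.
    A and B are placeholder coefficients, not the values used in MASCOT."""
    return 10.0 ** (A - B / T_kelvin)

def precipitation_rate(flow_kg_s, c_in_ppm, trap_temp_K):
    """Impurity mass precipitated per second if the sodium is cooled to trap_temp_K
    and leaves saturated (idealised: no kinetics, no mesh or geometry effects)."""
    c_out_ppm = min(c_in_ppm, saturation_ppm(trap_temp_K))
    return flow_kg_s * (c_in_ppm - c_out_ppm) * 1e-6   # kg/s of impurity

# Illustrative numbers: 5 kg/s of sodium carrying 3 ppm hydrogen, trap held near 400 K
print(f"{saturation_ppm(400.0):.2f} ppm saturation at 400 K")
print(f"{precipitation_rate(5.0, 3.0, 400.0) * 3600.0:.3f} kg precipitated per hour")
```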
Qweak Data Analysis for Target Modeling Using Computational Fluid Dynamics
Moore, Michael; Covrig, Silviu
2015-04-01
The 2.5 kW liquid hydrogen (LH2) target used in the Qweak parity violation experiment is the highest power LH2 target in the world and the first to be designed with Computational Fluid Dynamics (CFD) at Jefferson Lab. The Qweak experiment determined the weak charge of the proton by measuring the parity-violating elastic scattering asymmetry of longitudinally polarized electrons from unpolarized liquid hydrogen at small momentum transfer (Q2 = 0 . 025 GeV2). This target met the design goals of bench-marked with the Qweak target data. This work is an essential ingredient in future designs of very high power low noise targets like MOLLER (5 kW, target noise asymmetry contribution < 25 ppm) and MESA (4.5 kW).
Analysis of Craniofacial Images using Computational Atlases and Deformation Fields
DEFF Research Database (Denmark)
Ólafsdóttir, Hildur
2008-01-01
purposes. The basis for most of the applications is non-rigid image registration. This approach brings one image into the coordinate system of another resulting in a deformation field describing the anatomical correspondence between the two images. A computational atlas representing the average anatomy...... of asymmetry. The analyses are applied to the study of three different craniofacial anomalies. The craniofacial applications include studies of Crouzon syndrome (in mice), unicoronal synostosis plagiocephaly and deformational plagiocephaly. Using the proposed methods, the thesis reveals novel findings about...... the craniofacial morphology and asymmetry of Crouzon mice. Moreover, a method to plan and evaluate treatment of children with deformational plagiocephaly, based on asymmetry assessment, is established. Finally, asymmetry in children with unicoronal synostosis is automatically assessed, confirming previous results...
Computational analysis of the flow field downstream of flow conditioners
Energy Technology Data Exchange (ETDEWEB)
Erdal, Asbjoern
1997-12-31
Technological innovations are essential for maintaining the competitiveness of the gas companies, and metering technology is one important area. This thesis shows that computational fluid dynamics techniques can be a valuable tool for examination of several parameters that may affect the performance of a flow conditioner (FC). Previous design methods, such as screen theory, could not provide a fundamental understanding of how a FC works. The thesis shows, among other things, that the flow pattern through a complex geometry, like a 19-hole plate FC, can be simulated with good accuracy by a k-ε turbulence model. The calculations illuminate how variations in pressure drop, overall porosity, grading of porosity across the cross-section and the number of holes affect the performance of FCs. These questions have been studied experimentally by researchers for a long time. Now an understanding of the important mechanisms behind efficient FCs emerges from the predictions. 179 ref., 110 figs., 8 tabs.
Computed tomographic analysis of calvarial hyperostosis in captive lions.
Gross-Tsubery, Ruth; Chai, Orit; Shilo, Yael; Miara, Limor; Horowitz, Igal H; Shmueli, Ayelet; Aizenberg, Itzhak; Hoffman, Chen; Reifen, Ram; Shamir, Merav H
2010-01-01
Osseous malformations in the skull and cervical vertebrae of lions in captivity are believed to be caused by hypovitaminosis A. These often lead to severe neurologic abnormalities and may result in death. We describe the characterization of these abnormalities based on computed tomography (CT). CT images of two affected and three healthy lions were compared to define the normal anatomy of the skull and cervical vertebrae and provide information regarding the aforementioned osseous malformations. Because bone structure is influenced by various factors other than the aforementioned disease, all values were divided by the skull width, which was not affected. The calculated ratios were compared, and the most pronounced abnormalities in the affected lions were narrowing of the foramen magnum, thickening of the tentorium osseum cerebelli and thickening of the dorsal arch of the atlas. CT is useful for detection of the calvarial abnormalities in lions and may be useful in further defining this syndrome.
A Dynamic Bayesian Approach to Computational Laban Shape Quality Analysis
Directory of Open Access Journals (Sweden)
Dilip Swaminathan
2009-01-01
kinesiology. LMA (especially Effort/Shape emphasizes how internal feelings and intentions govern the patterning of movement throughout the whole body. As we argue, a complex understanding of intention via LMA is necessary for human-computer interaction to become embodied in ways that resemble interaction in the physical world. We thus introduce a novel, flexible Bayesian fusion approach for identifying LMA Shape qualities from raw motion capture data in real time. The method uses a dynamic Bayesian network (DBN to fuse movement features across the body and across time and as we discuss can be readily adapted for low-cost video. It has delivered excellent performance in preliminary studies comprising improvisatory movements. Our approach has been incorporated in Response, a mixed-reality environment where users interact via natural, full-body human movement and enhance their bodily-kinesthetic awareness through immersive sound and light feedback, with applications to kinesiology training, Parkinson's patient rehabilitation, interactive dance, and many other areas.
Development of Computer Program for Analysis of Irregular Non Homogenous Radiation Shielding
International Nuclear Information System (INIS)
Bang Rozali; Nina Kusumah; Hendro Tjahjono; Darlis
2003-01-01
A computer program for radiation shielding analysis has been developed to calculate radiation attenuation in non-homogeneous radiation shielding and irregular geometry. By specifying the radiation source strength, the geometrical shape and location of the radiation source, and the dimensions and geometrical shape of the radiation shielding, the radiation level at a point at a certain position from the radiation source can be calculated. By using the computer program, results of the radiation distribution analysis can be obtained for several analysis points simultaneously. (author)
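A minimal sketch of a point-kernel attenuation calculation through a stack of dissimilar shield layers, evaluated at several analysis points in one run as the abstract describes; the attenuation coefficients, layer thicknesses and source strength are illustrative assumptions, and buildup is neglected.

```python
import math

# Each shield layer is (linear attenuation coefficient in 1/cm, thickness in cm);
# the values below are placeholders standing in for, e.g., steel and concrete.
LAYERS = [(0.45, 5.0),    # "steel" layer
          (0.15, 30.0)]   # "concrete" layer

def dose_rate(source_strength, distance_cm, layers=LAYERS):
    """Dose-rate-like quantity at a point: inverse-square spreading times the
    product of exponential attenuations through every layer on the line of sight."""
    attenuation = math.exp(-sum(mu * t for mu, t in layers))
    return source_strength * attenuation / (4.0 * math.pi * distance_cm ** 2)

# Several analysis points evaluated in one run
for r in (100.0, 200.0, 500.0):
    print(f"r = {r:5.0f} cm : {dose_rate(1.0e9, r):.3e} (arbitrary units)")
```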
Development of computational methods of design by analysis for pressure vessel components
International Nuclear Information System (INIS)
Bao Shiyi; Zhou Yu; He Shuyan; Wu Honglin
2005-01-01
Stress classification is not only one of the key steps when a pressure vessel component is designed by analysis, but also a difficulty which has long puzzled engineers and designers. At present, for calculating and categorizing the stress field of pressure vessel components, several computational methods of design by analysis have been developed and applied, such as Stress Equivalent Linearization, the Two-Step Approach, the Primary Structure method, the Elastic Compensation method, and the GLOSS R-Node method. Moreover, the ASME code also gives an inelastic method of design by analysis for limiting gross plastic deformation only. When pressure vessel components are designed by analysis, there are sometimes large differences between the results obtained with the different calculation and analysis methods mentioned above. This is the main reason preventing wide application of the design-by-analysis approach. Recently, a new approach, presented in the new proposal of a European Standard, CEN's unfired pressure vessel standard EN 13445-3, tries to avoid the problems of stress classification by analyzing the various failure mechanisms of the pressure vessel structure directly, based on elastic-plastic theory. In this paper, some of the stress classification methods mentioned above are described briefly, and the computational methods cited in the European pressure vessel standard, such as the Deviatoric Map and nonlinear analysis methods (plastic analysis and limit analysis), are outlined. Furthermore, the characteristics of the computational methods of design by analysis are summarized to aid in selecting the proper computational method when designing a pressure vessel component by analysis. (authors)
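A minimal sketch of the stress-linearization step behind the Stress Equivalent Linearization method named above: a through-thickness stress distribution is reduced to membrane and bending components by integration over the wall. The stress distribution used below is illustrative.

```python
import numpy as np
from scipy.integrate import trapezoid

def linearize_stress(x, sigma):
    """Reduce a through-thickness stress distribution to membrane and bending parts
    (the idea behind stress equivalent linearization; sketch only).
    x     : positions through the wall, from inner (0) to outer surface (t), in m
    sigma : stress at those positions, in Pa"""
    t = x[-1] - x[0]
    xi = x - x[0]
    sigma_m = trapezoid(sigma, xi) / t
    sigma_b = 6.0 / t ** 2 * trapezoid(sigma * (t / 2.0 - xi), xi)
    return sigma_m, sigma_b

# Illustrative distribution: linear gradient plus a local peak near the inner surface
x = np.linspace(0.0, 0.02, 201)                                   # 20 mm wall
sigma = 150e6 - 100e6 * (x / 0.02) + 50e6 * np.exp(-x / 0.002)
m, b = linearize_stress(x, sigma)
print(f"membrane = {m / 1e6:.1f} MPa, bending = {b / 1e6:.1f} MPa")
```

The membrane and bending components obtained this way are what the code categories (primary membrane, primary plus bending, peak) are compared against in the design-by-analysis route.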
Principal Component Analysis - A Powerful Tool in Computing Marketing Information
Directory of Open Access Journals (Sweden)
Constantin C.
2014-12-01
Full Text Available This paper presents an instrumental research regarding a powerful multivariate data analysis method which can be used by researchers in order to obtain valuable information for decision makers who need to solve the marketing problems a company faces. The literature stresses the need to avoid the multicollinearity phenomenon in multivariate analysis and the capability of Principal Component Analysis (PCA) to reduce a number of variables that could be correlated with each other to a small number of principal components that are uncorrelated. In this respect, the paper presents step-by-step the process of applying PCA in marketing research when we use a large number of variables that are naturally collinear.
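A minimal sketch, using assumed illustrative survey data, of the PCA workflow the paper walks through: standardize the correlated variables, extract a small number of uncorrelated principal components, and inspect how much variance they explain.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Illustrative survey data: 200 respondents rating 6 correlated attributes
rng = np.random.default_rng(42)
latent = rng.normal(size=(200, 2))               # two hidden drivers of the ratings
loadings = rng.normal(size=(2, 6))
ratings = latent @ loadings + rng.normal(scale=0.5, size=(200, 6))

# Standardize the collinear variables, then extract uncorrelated principal components
X = StandardScaler().fit_transform(ratings)
pca = PCA(n_components=2).fit(X)
scores = pca.transform(X)                         # component scores per respondent

print("explained variance ratio:", pca.explained_variance_ratio_)
print("correlation of the two components:", np.corrcoef(scores.T)[0, 1])
```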
Deterministic sensitivity and uncertainty analysis for large-scale computer models
International Nuclear Information System (INIS)
Worley, B.A.; Pin, F.G.; Oblow, E.M.; Maerker, R.E.; Horwedel, J.E.; Wright, R.Q.
1988-01-01
This paper presents a comprehensive approach to sensitivity and uncertainty analysis of large-scale computer models that is analytic (deterministic) in principle and that is firmly based on the model equations. The theory and application of two systems based upon computer calculus, GRESS and ADGEN, are discussed relative to their role in calculating model derivatives and sensitivities without a prohibitive initial manpower investment. Storage and computational requirements for these two systems are compared for a gradient-enhanced version of the PRESTO-II computer model. A Deterministic Uncertainty Analysis (DUA) method that retains the characteristics of analytically computing result uncertainties based upon parameter probability distributions is then introduced and results from recent studies are shown. 29 refs., 4 figs., 1 tab
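A minimal sketch of the derivative-based (deterministic) uncertainty propagation described above: the result uncertainty is assembled from model sensitivities and parameter variances. GRESS and ADGEN obtain the derivatives by computer calculus from the model source code; central finite differences stand in for them here, and the toy model and parameter values are illustrative assumptions.

```python
import numpy as np

def model(p):
    """Stand-in for a large simulation; p = [k, S, V] gives an illustrative response."""
    k, S, V = p
    return S * V / k

p0 = np.array([0.5, 100.0, 2.0])      # nominal parameter values (assumed)
sigma = np.array([0.05, 5.0, 0.1])    # parameter standard deviations (assumed)

def sensitivities(f, p, rel_step=1e-6):
    """Central finite-difference estimates of dR/dp_i."""
    grads = np.zeros_like(p)
    for i in range(len(p)):
        dp = np.zeros_like(p)
        dp[i] = rel_step * max(abs(p[i]), 1.0)
        grads[i] = (f(p + dp) - f(p - dp)) / (2.0 * dp[i])
    return grads

g = sensitivities(model, p0)
var_R = np.sum((g * sigma) ** 2)      # first-order uncertainty propagation
print("response      :", model(p0))
print("sensitivities :", g)
print("std of result :", np.sqrt(var_R))
```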
16th International workshop on Advanced Computing and Analysis Techniques in physics (ACAT)
Lokajicek, M; Tumova, N
2015-01-01
16th International workshop on Advanced Computing and Analysis Techniques in physics (ACAT). The ACAT workshop series, formerly AIHENP (Artificial Intelligence in High Energy and Nuclear Physics), was created back in 1990. Its main purpose is to gather researchers involved in computing for physics research, from both the physics and computer science sides, and give them a chance to communicate with each other. It has established bridges between physics and computer science research, facilitating advances in our understanding of the Universe at its smallest and largest scales. With the Large Hadron Collider and many astronomy and astrophysics experiments collecting larger and larger amounts of data, such bridges are needed now more than ever. The 16th edition of ACAT aims to bring related researchers together, once more, to explore and confront the boundaries of computing, automatic data analysis and theoretical calculation technologies. It will create a forum for exchanging ideas among the fields an...
Evaluation of Cloud Computing Hidden Benefits by Using Real Options Analysis
Directory of Open Access Journals (Sweden)
Pavel Náplava
2016-12-01
Full Text Available Cloud computing technologies have brought new attributes to the IT world. One of them is the flexibility of IT resources. It makes it possible to both downsize and upsize IT resource capacity effectively in real time. The required IT capacity is defined by the business strategy and the actual market state, so IT costs are not stable but dynamic in this case. Standard investment valuation methods (both static and dynamic) are not able to include the flexibility attribute in the evaluation of IT projects. This article describes the application of the Real Options Analysis method for the valuation of cloud computing flexibility. The method compares the costs of on-premise and cloud computing solutions by combining put and call option valuation. Cloud computing providers can use the method as an advanced tool that explains hidden benefits of cloud computing. Inexperienced cloud computing customers can simulate the market behavior and better plan necessary IT investments.
Computational methods for criticality safety analysis within the scale system
International Nuclear Information System (INIS)
Parks, C.V.; Petrie, L.M.; Landers, N.F.; Bucholz, J.A.
1986-01-01
The criticality safety analysis capabilities within the SCALE system are centered around the Monte Carlo codes KENO IV and KENO V.a, which are both included in SCALE as functional modules. The XSDRNPM-S module is also an important tool within SCALE for obtaining multiplication factors for one-dimensional system models. This paper reviews the features and modeling capabilities of these codes along with their implementation within the Criticality Safety Analysis Sequences (CSAS) of SCALE. The CSAS modules provide automated cross-section processing and user-friendly input that allow criticality safety analyses to be done in an efficient and accurate manner. 14 refs., 2 figs., 3 tabs
Computational analysis of battery optimized reactor integral system
International Nuclear Information System (INIS)
Hwang, J. S.; Son, H. M.; Jeong, W. S.; Kim, T. W.; Suh, K. Y.
2007-01-01
Battery Optimized Reactor Integral System (BORIS) is being developed as a multi-purpose fast spectrum reactor cooled by lead (Pb). BORIS is an integral optimized reactor with an ultra-long-life core. BORIS aims to satisfy various energy demands while maintaining inherent safety with the primary coolant Pb and improving economics. BORIS is being designed to generate 23 MWth with 10 MWe for at least twenty consecutive years without refueling and to meet the Generation IV Nuclear Energy System goals of sustainability, safety, reliability, and economics. BORIS is conceptualized to be used as the main power and heat source for remote areas and barren lands, and is also considered for deployment for desalination purposes. BORIS, based on modular components to be viable for rapid construction and easy maintenance, adopts an integrated heat exchanger system operated by natural circulation of Pb without pumps to realize a small-sized reactor. The BORIS primary system is designed through an optimization study. Thermal hydraulic characteristics during reactor steady state, with the heat source and sink provided by the core and heat exchanger, respectively, have been analyzed by utilizing a computational fluid dynamics code and hand calculations based on first principles. This paper analyzes transient conditions of the BORIS primary system. The Pb coolant was selected for its lower chemical activity with air or water than sodium (Na) and its good thermal characteristics. Reactor transient conditions such as core blockage, heat exchanger failure, and loss of heat sink were selected for this study. Blockage in the core or its inlet structure causes localized flow starvation in one or several fuel assemblies. Coolant loop blockages cause a more or less uniform flow reduction across the core, which may trigger a coolant temperature transient. General conservation equations were applied to model the primary system transients. Numerical approaches were adopted to discretize the governing
PAPIRUS - a computer code for FBR fuel performance analysis
International Nuclear Information System (INIS)
Kobayashi, Y.; Tsuboi, Y.; Sogame, M.
1991-01-01
The FBR fuel performance analysis code PAPIRUS has been developed to design fuels for demonstration and future commercial reactors. A pellet structural model was developed to describe the generation, depletion, and transport of vacancies and atomic elements in a unified fashion. Comparison of PAPIRUS results with the power-to-melt test data from HEDL showed the validity of the code at initial reactor startup. (author)
Improving the Computational Morphological Analysis of a Swahili ...
African Journals Online (AJOL)
approach to the morphological analysis of Swahili. We particularly focus our discussion on its ability to retrieve lemmas for word forms and evaluate it as a tool for corpus-based dictionary compilation. Keywords: LEXICOGRAPHY, MORPHOLOGY, CORPUS ANNOTATION, LEMMATIZATION, MACHINE LEARNING, SWAHILI ...
QUASAR - an interactive program for spectrum analysis in personal computers
International Nuclear Information System (INIS)
Auler, L.T.; Nobrega, J.A.W. da.
1991-11-01
The QUASAR software for the interactive analysis and report of energy (pulse-height) and time (multichannel scaling) spectra is described. The operating instructions as well as the mathematical methods and algorithms used by the program are presented in detail. This program is an extension to the PULSAR program. (author)
Computational heat transfer analysis and combined ANN–GA ...
Indian Academy of Sciences (India)
The analysis uses numerical simulation and a neural network ... Optimization is the process of finding the most plausible and desirable solution to a problem. ... increased heat transfer and compared the results of the regular (non-fuzzy) model and the fuzzy model. ... The network is designed using the MATLAB Neural Network toolbox.
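A minimal sketch of the combined ANN–GA idea (in Python rather than the MATLAB toolbox named above): a genetic algorithm searches the design space while a surrogate model scores each candidate. The quadratic function below merely stands in for a trained neural network, and the variable names and bounds are assumptions made for illustration.

    import random

    def surrogate_score(fin_height, fin_spacing):
        """Stand-in for a trained ANN predicting heat transfer (higher is better)."""
        return 100.0 - (fin_height - 12.0) ** 2 - 2.0 * (fin_spacing - 4.0) ** 2

    BOUNDS = [(5.0, 20.0), (1.0, 8.0)]   # assumed ranges of the two design variables

    def random_individual():
        return [random.uniform(lo, hi) for lo, hi in BOUNDS]

    def crossover(a, b):
        return [random.choice(genes) for genes in zip(a, b)]

    def mutate(ind, rate=0.2):
        return [min(hi, max(lo, g + random.gauss(0.0, rate * (hi - lo))))
                if random.random() < 0.5 else g
                for g, (lo, hi) in zip(ind, BOUNDS)]

    population = [random_individual() for _ in range(30)]
    for _ in range(50):
        population.sort(key=lambda ind: surrogate_score(*ind), reverse=True)
        parents = population[:10]                      # truncation selection
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(len(population) - len(parents))]
        population = parents + children

    best = max(population, key=lambda ind: surrogate_score(*ind))
    print("best design:", best, "predicted score:", surrogate_score(*best))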
Scenario analysis of false indication in computer-control systems
International Nuclear Information System (INIS)
Tseng, Wan-Hui; Fan, Chin-Feng
2013-01-01
Highlights: ► A new failure mode and effect for safety-critical systems is proposed. ► False indication is the most dreadful kind of partial failure. ► A model-based simulation approach to generate failure scenarios is proposed. ► Simulation results showed that multiple errors may cause undesired consequences. ► An assertion-based method to detect false indication problems is provided. -- Abstract: Computer control may cause additional failure modes and effects that are new to analogue systems. False indication is one such failure mode that may bring unknown risks to a system. False indication refers to the situation in which part of a system fails while other processes still work, and the failure is not revealed to operators. This paper presents a model-based simulation approach to systematically generate potential false indications and their unintended consequences. Experiments showed that once a false indication occurs, it may have drastic effects on system safety. False indication can mislead the operator into performing adverse actions or taking no action at all. Therefore, we propose an assertion-based detection method to alleviate such failures. Our assertions capture process/device dependencies, timing relations, and physical conservation rules. With these assertions, the operator may be alerted at run time. The proposed technique can reduce the false indication problem. Moreover, it can also be used to assist system design.
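A minimal sketch of the assertion idea, with the three assertion families named in the abstract (process/device dependencies, timing relations, physical conservation rules); the device names, setpoints, and tolerances below are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class PlantSnapshot:
        # Hypothetical indicated values sampled from the control system
        pump_commanded_on: bool
        pump_running: bool
        flow_in_kg_s: float
        flow_out_kg_s: float
        seconds_since_command: float

    def check_false_indication(s):
        alerts = []
        # Process/device dependency: a running pump must show inlet flow
        if s.pump_commanded_on and s.pump_running and s.flow_in_kg_s < 0.1:
            alerts.append("pump indicated running but no inlet flow measured")
        # Timing relation: the pump should start within an assumed 5 s of the command
        if s.pump_commanded_on and not s.pump_running and s.seconds_since_command > 5.0:
            alerts.append("pump has not started within the expected delay")
        # Physical conservation rule: steady-state mass balance within an assumed 2%
        if s.flow_in_kg_s > 0 and abs(s.flow_in_kg_s - s.flow_out_kg_s) > 0.02 * s.flow_in_kg_s:
            alerts.append("inlet/outlet flows violate mass conservation; an indication may be false")
        return alerts

    # Example: the pump reports 'running' while both flow meters read zero
    print(check_false_indication(PlantSnapshot(True, True, 0.0, 0.0, 12.0)))

Each alert would be raised to the operator at run time, which is exactly the point at which a false indication would otherwise go unnoticed.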
A Grounded Theory Analysis of Introductory Computer Science Pedagogy
Directory of Open Access Journals (Sweden)
Jonathan Wellons
2011-12-01
Full Text Available Planning is a critical, early step on the path to successful program writing and a skill that is often lacking in novice programmers. As practitioners we are continually searching for or creating interventions to help our students, particularly those who struggle in the early stages of their computer science education. In this paper we report on our ongoing research of novice programming skills that utilizes the qualitative research method of grounded theory to develop theories and inform the construction of these interventions. We describe how grounded theory, a popular research method in the social sciences since the 1960s, can lend formality and structure to the common practice of simply asking students what they did and why they did it. Further, we aim to inform the reader not only about our emerging theories on interventions for planning but also about how they might collect and analyze their own data in this and other areas that trouble novice programmers. In this way those who lecture and design CS1 interventions can do so from a more informed perspective.
Analysis of secondary coxarthrosis by three dimensional computed tomography
Energy Technology Data Exchange (ETDEWEB)
Hemmi, Osamu [Keio Univ., Tokyo (Japan). School of Medicine
1997-11-01
The majority of coxarthrosis in Japan is due to congenital dislocation of the hip and acetabular dysplasia. Until now coxarthrosis has been chiefly analyzed on the basis of anterior-posterior radiographs. By using three-dimensional (3D) CT, it was possible to analyze the morphological features of secondary coxarthrosis more accurately, and by using new computer graphics software, it was possible to display the contact area in the hip joint and observe changes associated with progression of the stages of the disease. There were 34 subjects (68 joints), all of whom were women. The CT data were read into a workstation, and 3D reconstruction was achieved with hip surgery simulation software (SurgiPlan). Pelvic inclination, acetabular anteversion, seven parameters indicating the investment of the femoral head, and two indicating the position of the hip joint in the pelvis were measured. The results showed that secondary coxarthrosis is characterized not only by lateral malposition of the hip joint according to the pelvic coordinates, but by anterior malposition as well. Many other measurements provided 3D information on the acetabular dysplasia. Many of them were correlated with the CE angle on plain radiographs. Furthermore, a strong correlation was not found between anterior and posterior acetabular coverage of the femoral head. In addition, SurgiPlan's distance mapping function enabled 3D observation of the pattern of progression of arthrosis based on the pattern of progression of joint space narrowing. (author)
Robust computational analysis of rRNA hypervariable tag datasets.
Directory of Open Access Journals (Sweden)
Maksim Sipos
Full Text Available Next-generation DNA sequencing is increasingly being utilized to probe microbial communities, such as gastrointestinal microbiomes, where it is important to be able to quantify measures of abundance and diversity. The fragmented nature of the 16S rRNA datasets obtained, coupled with their unprecedented size, has led to the recognition that the results of such analyses are potentially contaminated by a variety of artifacts, both experimental and computational. Here we quantify how multiple alignment and clustering errors contribute to overestimates of abundance and diversity, reflected by incorrect OTU assignment, corrupted phylogenies, inaccurate species diversity estimators, and rank abundance distribution functions. We show that straightforward procedural optimizations, combining preexisting tools, are effective in handling large (10^5–10^6 read) 16S rRNA datasets, and we describe metrics to measure the effectiveness and quality of the estimators obtained. We introduce two metrics to ascertain the quality of clustering of pyrosequenced rRNA data, and show that complete linkage clustering greatly outperforms other widely used methods.
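For readers unfamiliar with complete linkage (furthest-neighbour) clustering, the sketch below assigns a toy set of reads to OTUs with SciPy at a 3% dissimilarity cutoff; in practice the pairwise distances would come from a multiple alignment of the 16S rRNA reads rather than being written out by hand.

    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import squareform

    # Toy pairwise dissimilarities between five reads
    D = np.array([
        [0.00, 0.01, 0.02, 0.10, 0.11],
        [0.01, 0.00, 0.02, 0.10, 0.12],
        [0.02, 0.02, 0.00, 0.09, 0.10],
        [0.10, 0.10, 0.09, 0.00, 0.01],
        [0.11, 0.12, 0.10, 0.01, 0.00],
    ])

    # Complete linkage merges clusters by their *furthest* members, which resists the
    # chaining artefacts that inflate OTU sizes under single linkage
    Z = linkage(squareform(D), method="complete")

    # Cut the dendrogram at 3% dissimilarity, a conventional species-level OTU threshold
    otus = fcluster(Z, t=0.03, criterion="distance")
    print("OTU assignment per read:", otus)   # first three reads form one OTU, last two another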
Pulmonary nodule characterization, including computer analysis and quantitative features.
Bartholmai, Brian J; Koo, Chi Wan; Johnson, Geoffrey B; White, Darin B; Raghunath, Sushravya M; Rajagopalan, Srinivasan; Moynagh, Michael R; Lindell, Rebecca M; Hartman, Thomas E
2015-03-01
Pulmonary nodules are commonly detected in computed tomography (CT) chest screening of a high-risk population. The specific visual or quantitative features on CT or other modalities can be used to characterize the likelihood that a nodule is benign or malignant. Visual features on CT such as size, attenuation, location, morphology, edge characteristics, and other distinctive "signs" can be highly suggestive of a specific diagnosis and, in general, be used to determine the probability that a specific nodule is benign or malignant. Change in size, attenuation, and morphology on serial follow-up CT, or features on other modalities such as nuclear medicine studies or MRI, can also contribute to the characterization of lung nodules. Imaging analytics can objectively and reproducibly quantify nodule features on CT, nuclear medicine, and magnetic resonance imaging. Some quantitative techniques show great promise in helping to differentiate benign from malignant lesions or to stratify the risk of aggressive versus indolent neoplasm. In this article, we (1) summarize the visual characteristics, descriptors, and signs that may be helpful in management of nodules identified on screening CT, (2) discuss current quantitative and multimodality techniques that aid in the differentiation of nodules, and (3) highlight the power, pitfalls, and limitations of these various techniques.
High performance computing environment for multidimensional image analysis.
Rao, A Ravishankar; Cecchi, Guillermo A; Magnasco, Marcelo
2007-07-10
The processing of images acquired through microscopy is a challenging task due to the large size of datasets (several gigabytes) and the fast turnaround time required. If the throughput of the image processing stage is significantly increased, it can have a major impact in microscopy applications. We present a high performance computing (HPC) solution to this problem. This involves decomposing the spatial 3D image into segments that are assigned to unique processors, and matched to the 3D torus architecture of the IBM Blue Gene/L machine. Communication between segments is restricted to the nearest neighbors. When running on a 2 GHz Intel CPU, the task of 3D median filtering on a typical 256-megabyte dataset takes two and a half hours, whereas by using 1024 nodes of Blue Gene, this task can be performed in 18.8 seconds, a 478x speedup. Our parallel solution dramatically improves the performance of image processing, feature extraction and 3D reconstruction tasks. This increased throughput permits biologists to conduct unprecedented large scale experiments with massive datasets.
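(The quoted speedup follows from the arithmetic 2.5 h = 9000 s and 9000 / 18.8 ≈ 478.) The segment decomposition with nearest-neighbour communication can be illustrated serially: each slab is filtered together with a one-voxel halo borrowed from its neighbours, so the reassembled result matches a single global pass. This is only a sketch of the decomposition idea, not the parallel Blue Gene/L implementation.

    import numpy as np
    from scipy.ndimage import median_filter

    def median_filter_in_segments(volume, n_segments=4, halo=1):
        """3x3x3 median filter applied slab by slab with a one-voxel halo exchange."""
        edges = np.linspace(0, volume.shape[0], n_segments + 1, dtype=int)
        out = np.empty_like(volume)
        for lo, hi in zip(edges[:-1], edges[1:]):
            lo_h, hi_h = max(lo - halo, 0), min(hi + halo, volume.shape[0])
            filtered = median_filter(volume[lo_h:hi_h], size=2 * halo + 1)
            out[lo:hi] = filtered[lo - lo_h : lo - lo_h + (hi - lo)]
        return out

    # The segmented result agrees exactly with a single global filter pass
    volume = np.random.rand(64, 32, 32).astype(np.float32)
    assert np.array_equal(median_filter_in_segments(volume), median_filter(volume, size=3))

In the parallel setting each slab lives on its own node, and the halo voxels are the only data exchanged between nearest neighbours, which is what makes the mapping onto the 3D torus efficient.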
Computer-aided waste management strategic planning and analysis
International Nuclear Information System (INIS)
Avci, H.I.; Kotek, T.J.; Koebnick, B.L.
1995-01-01
A computational model called WASTE-MGMT has been developed to assist in the evaluation of alternative waste management approaches in a complex setting involving multiple sites, waste streams, and processing options. The model provides the quantities and characteristics of wastes processed at any facility or shipped between any two sites as well as environmental emissions at any facility within the waste management system. The model input is defined by three types of fundamental waste management data: (1) waste inventories and characteristics at the point of generation; (2) treatment, storage, and disposal facility characteristics; and (3) definitions of alternative management approaches. The model has been successfully used in the preparation of the US Department of Energy (DOE) Environmental Management Programmatic Environmental Impact Statement (EM PEIS). Certain improvements are either being implemented or planned that would extend the usefulness and applicability of the WASTE-MGMT model beyond the EM PEIS and into the strategic planning for management of wastes under the responsibility of DOE or other agencies.
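A minimal sketch of how the three input types might be combined into facility throughputs and inter-site shipments; the stream names, capacities, and routing rule below are hypothetical, since the abstract does not specify the WASTE-MGMT data formats or algorithms.

    from dataclasses import dataclass

    @dataclass
    class WasteStream:
        name: str
        site: str
        volume_m3: float          # inventory at the point of generation

    @dataclass
    class Facility:
        name: str
        site: str
        accepts: str              # waste type handled (simplified to a single name)
        capacity_m3: float

    # Hypothetical inventories and facilities at two sites
    streams = [WasteStream("LLW", "Site-A", 120.0), WasteStream("LLW", "Site-B", 40.0)]
    facilities = [Facility("Compactor-1", "Site-A", "LLW", 200.0)]

    # One alternative management approach: send each stream to the first facility
    # that accepts it and still has capacity, recording any inter-site shipment
    def route(streams, facilities):
        processed = {f.name: 0.0 for f in facilities}
        shipments = []
        for s in streams:
            for f in facilities:
                if f.accepts == s.name and processed[f.name] + s.volume_m3 <= f.capacity_m3:
                    processed[f.name] += s.volume_m3
                    if s.site != f.site:
                        shipments.append((s.site, f.site, s.volume_m3))
                    break
        return processed, shipments

    print(route(streams, facilities))
    # ({'Compactor-1': 160.0}, [('Site-B', 'Site-A', 40.0)])

Changing the routing rule or the facility set would correspond to evaluating a different management alternative, which is the comparison the model is meant to support.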