WorldWideScience

Sample records for analysis koyu kansu

  1. Ultimate strength analysis of thin plated structures using eigen-functions. 3rd Report. Application to reliability analysis; Koyu kansu wo mochiita usuita kozobutsu no dansosei kaisekiho. 3. Shinraisei kaiseki eno oyo

    Energy Technology Data Exchange (ETDEWEB)

    Ueda, Y. [Osaka University, Osaka (Japan). Welding Research Institute; Masaoka, K.; Okada, H. [University of Osaka Prefecture, Osaka (Japan). Faculty of Engineering

    1996-12-31

    A reliability analysis of the ultimate strength of a hull was performed by introducing reliability engineering into the idealized structural unit method. The elements developed in the present study were applied to a model of an actual structure to show that even an analysis that requires much time with the finite element method can be performed quickly and with high accuracy when this method is used. An analysis under simultaneously acting bending moment and shear force was performed on a model structure used in experiments carried out by Nishihara, assuming pure bending moment and longitudinal strength during slamming. A reliability analysis of the ultimate strength was then conducted on the same model based on this analysis method. In the ultimate strength analysis in which the bending and shearing assumed for slamming act simultaneously, the axial force in the hull side decreases as the loading increases, and the way the shearing force increases can be identified clearly. Although the existence of initial deflection reduces the strength, the effect of its variance in the vicinity of the average value on the reliability is rather small, while the effect of variance in yield stress is greater. 27 refs., 14 figs., 4 tabs.
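
    As a rough, purely illustrative companion to the reliability statement above (failure probability driven by scatter in yield stress and initial deflection), the following Monte Carlo sketch evaluates a toy limit-state model; the capacity expression, section modulus, load level and statistical parameters are hypothetical and are not the paper's idealized structural unit method.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 200_000
        # Toy capacity model: ultimate bending capacity falls with initial deflection.
        sigma_y = rng.normal(315.0, 20.0, n)      # yield stress, MPa (hypothetical scatter)
        w0 = rng.normal(5.0, 1.5, n)              # initial deflection, mm (hypothetical)
        Z = 1.2e7                                 # section modulus, mm^3 (fixed here)
        Mu = Z * sigma_y * (1.0 - 0.01 * w0)      # capacity, N*mm
        Ma = 3.0e9                                # applied bending moment, N*mm
        print(f"estimated failure probability: {np.mean(Mu < Ma):.4f}")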

  2. Improvement of two-dimensional gravity analysis by using logarithmic functions; Taisu kansu wo mochiita nijigen juryoku kaiseki no kairyo

    Energy Technology Data Exchange (ETDEWEB)

    Makino, M.; Murata, Y. [Geological Survey of Japan, Tsukuba (Japan)

    1996-05-01

    In two-dimensional tectonic analysis by gravity exploration, a method using logarithmic functions that is applicable from deep underground parts up to shallow geological structures was examined. The examination considered a case in which the underground structure is divided into a basement and a covering formation and the boundary between them has undulations. An equation to calculate the basement structure from a gravity anomaly was derived so that, taking the height of the observation point into consideration, it remains applicable to shallow distributions of the basement depth. In the test calculation, a model was assumed in which the basement is a step structure reaching depths near the surface. The density difference was set to 0.4 g/cm{sup 3}. An analysis using an equation modified two-dimensionally from Ogihara's (1987) method produced a fairly reasonable result, showing, however, a deformed basement around the boundary of the step structure, with the appearance of a small pulse-shaped structure. The analysis using logarithmic functions showed that the original basement structure was faithfully restored. 3 refs., 5 figs.
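
    The forward problem underlying such an inversion, i.e. the gravity anomaly produced over a basement step buried under a lighter covering formation, can be sketched numerically. The following minimal example (not the logarithmic-function method of the paper) sums the contributions of small 2D cells for a hypothetical step from 1 km to 3 km depth, using the 0.4 g/cm{sup 3} density contrast quoted above.

        import numpy as np

        G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

        def gravity_anomaly_2d(x_obs, cell_x, cell_z, cell_area, drho):
            # Vertical attraction of a 2D (infinite-strike) body approximated by
            # line-mass cells: each cell of area dA at depth z contributes
            # 2*G*drho*dA*z / (dx^2 + z^2) at a surface point offset dx horizontally.
            gz = np.zeros_like(x_obs, dtype=float)
            for xc, zc in zip(cell_x, cell_z):
                dx = xc - x_obs
                gz += 2.0 * G * drho * cell_area * zc / (dx ** 2 + zc ** 2)
            return gz

        # Hypothetical step-structure basement: depth 1 km for x < 0, 3 km for x >= 0.
        # The anomalous mass is the lighter covering formation filling the deep side,
        # with a density contrast of -400 kg/m^3 (0.4 g/cm^3 as in the abstract).
        d = 100.0                                   # cell size, m
        cell_x, cell_z = [], []
        for x in np.arange(-10000.0, 10000.0, d) + d / 2:
            basement = 1000.0 if x < 0 else 3000.0
            for z in np.arange(1000.0, basement, d) + d / 2:
                cell_x.append(x)
                cell_z.append(z)

        x_obs = np.linspace(-5000.0, 5000.0, 101)
        gz = gravity_anomaly_2d(x_obs, cell_x, cell_z, d * d, -400.0)
        print(f"anomaly ranges from {gz.min() * 1e5:.2f} to {gz.max() * 1e5:.2f} mGal")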

  3. Plans to Develop a Gas Field in the Kansu on the Border of the Usturt State Nature Reserve is a Real Threat for the Ecosystem of the Reserve and Largest Population of the Saker Falcon in Kazakhstan

    Directory of Open Access Journals (Sweden)

    Mark V. Pestov

    2016-02-01

    Full Text Available The plans of the JSC “KazMunayGaz” National Company to develop the Kansu gas field, which is situated right next to the current southern border of the Usturt Nature Reserve within the Kenderli-Kayasan conservation zone (Mangystau Province of the Republic of Kazakhstan), are a direct threat to the largest population of the Saker Falcon Falco cherrug korelovi in Kazakhstan and to the ecosystem of the Kenderli-Kayasan conservation zone as a whole. On the contrary, realization of the plans to expand the Usturt State Reserve within the Government of the Republic of Kazakhstan/GEF/UNDP project “Rising of stability of systems in conservation territories in desert ecosystems through promoting life sustaining sources compatible with biodiversity in and around conservation areas” and the initiative of an international expert group to nominate the Mangystau Protected Area System for UNESCO World Heritage status could create a favorable environment for the Usturt population of the Saker Falcon. It is evident that all possible outcomes should be taken into account in the long-term planning of the future development of the Mangystau region, and development options with less negative effect on the environment should be chosen. In their letter to the President of Kazakhstan, the experts described their opinion on the necessity of imposing a moratorium on exploration and development of the Kansu gas field and concentrating on alternative fields.

  4. Prediction of powerplant vibration using FRF data of FE model; Dentatsu kansu wo mochiita power plant shindo yosoku

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, T.; Tsukahara, M.; Sakaguchi, M.; Takahashi, Y. [Honda R and D Co. Ltd., Tokyo (Japan)

    1997-10-01

    For the purpose of shortening the development period, the estimation of powerplant vibration has become more important in the early design stage, and eigenvalue analysis by FEM is commonly used for this purpose. Eigenvalue analysis, however, cannot directly predict vibration levels under running conditions, which affect the durability of each component and the vibration of the car body. This paper presents a new approach that uses FRF data from FE models for accurate prediction of engine vibration under running conditions. When this approach is applied to an in-line four-cylinder engine, the predicted vibration is reasonably comparable with experimental results. 3 refs., 8 figs.
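
    The core operation behind any FRF-based prediction is the multiplication of frequency response functions by excitation force spectra, X(w) = H(w) F(w), frequency line by frequency line. The sketch below shows only that step; the array sizes, the random FRF values and the unit force spectrum are hypothetical placeholders for data that would come from an FE model and from engine excitation.

        import numpy as np

        def predict_response(H, F):
            # H: FRF data, shape (n_freq, n_response, n_excitation)
            # F: excitation force spectra, shape (n_excitation, n_freq)
            # Returns response spectra X, shape (n_response, n_freq).
            n_freq = H.shape[0]
            X = np.empty((H.shape[1], n_freq), dtype=complex)
            for k in range(n_freq):
                X[:, k] = H[k] @ F[:, k]
            return X

        # Hypothetical data: 512 frequency lines, 3 response points, 4 excitation
        # points (e.g. engine mounts), unit force applied at the first exciter.
        rng = np.random.default_rng(0)
        H = rng.standard_normal((512, 3, 4)) + 1j * rng.standard_normal((512, 3, 4))
        F = np.zeros((4, 512), dtype=complex)
        F[0, :] = 1.0
        X = predict_response(H, F)
        print(X.shape)  # (3, 512)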

  5. On the approximation with Chebyshev's Q-function to ABR waveform and the estimation of its characteristics; Chebyshev q kansu ni yoru ABR hakei kinji to tokusei suitei ni tsuite

    Energy Technology Data Exchange (ETDEWEB)

    Ikawa, N.; Kurata, T. [Chiba University, Chiba (Japan); Kato, S. [Teikyo Heisei University, Tokyo (Japan); Aruga, M. [Tokai University, Tokai (Japan)

    1998-10-01

    It is well known that ABR (auditory brainstem response) waveforms are used when judging brain death, checking for hearing disorders, and so on. The physiological side of these responses has largely been clarified, but their systematic mechanism has not been investigated sufficiently, so a quantitative estimation of that mechanism has not yet been obtained. In this paper, polynomial approximation results for the entire ABR waveform are first shown using the Chebyshev's q-function. Secondly, the tendency of the latency (a parameter of the ABR) is considered and compared with an experimental formula that was derived independently from other data and analyses. Through this discussion it is shown that indices for the quantitative analysis of the characteristics of ABR parameters are given as polynomial approximation functions obtained by the approximation method applying the Chebyshev's q-function. 6 refs., 1 fig., 2 tabs.
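
    The Chebyshev's q-function used in the paper is a specific construction; as a rough stand-in, the sketch below fits an ordinary Chebyshev polynomial series to a synthetic waveform by least squares and reads off a latency-like peak time. The waveform, its peak positions and the polynomial degree are invented for illustration.

        import numpy as np
        from numpy.polynomial import chebyshev as C

        # Hypothetical sampled waveform standing in for an ABR recording (10 ms window).
        t = np.linspace(0, 10e-3, 512)
        waveform = (np.exp(-((t - 5.6e-3) / 0.4e-3) ** 2)          # "wave V"-like peak
                    + 0.5 * np.exp(-((t - 3.7e-3) / 0.3e-3) ** 2))  # earlier peak

        # Least-squares Chebyshev fit; map t to [-1, 1] for good conditioning.
        x = 2 * (t - t.min()) / (t.max() - t.min()) - 1
        coeffs = C.chebfit(x, waveform, deg=30)
        approx = C.chebval(x, coeffs)

        # Estimate a latency-like parameter (time of the largest peak) from the fit.
        latency = t[np.argmax(approx)]
        print(f"rms error: {np.sqrt(np.mean((approx - waveform) ** 2)):.3e}, "
              f"latency: {latency * 1e3:.2f} ms")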

  6. Calculation of wave resistance by using Kochin function in the Rankine source method; Rankinsosuho ni okeru kochin kansu wo mochiita zoha teiko keisan

    Energy Technology Data Exchange (ETDEWEB)

    Yasukawa, H. [Mitsubishi Heavy Industries, Ltd., Tokyo (Japan)

    1997-10-01

    In order to avoid the negative wave resistance (which is physically incomprehensible) that arises when calculating wave resistance with the Rankine source method, a wave resistance calculation method was proposed that uses the Kochin function, which describes the behavior of the velocity potential in regions far from the hull. Baba's condition was used as the free-surface condition for the velocity potential expressing the wave motions around the hull. This allowed a new Kochin function, which takes as unknowns the velocity potential on the hull surface and on the free surface near the hull, to be defined and combined with the Rankine source method. The calculated wave resistance, hull sinkage and trim change of an ore carrier (SR107 type of ship) in the fully loaded condition were compared with water tank test results. The wave resistance values derived from pressure integration all became negative for Froude numbers from 0.1 to 0.2, while no negative resistance appeared in the calculations using the Kochin function, and the result agreed with that of the water tank tests. The accuracy of the calculations at low speeds was improved. The trim change in the calculations was slightly smaller than that in the water tank tests. The sinkage showed good agreement. 7 refs., 1 fig.

  7. Analysis

    DEFF Research Database (Denmark)

    Mathiesen, Brian Vad; Liu, Wen; Zhang, Xiliang

    2014-01-01

    three major technological changes: energy savings on the demand side, efficiency improvements in energy production, and the replacement of fossil fuels by various sources of renewable energy. Consequently, the analysis of these systems must include strategies for integrating renewable sources...

  8. Analysis

    CERN Document Server

    Maurin, Krzysztof

    1980-01-01

    The extraordinarily rapid advances made in mathematics since World War II have resulted in analysis becoming an enormous organism spreading in all directions. Gone for good surely are the days of the great French "courses of analysis" which embodied the whole of the "analytical" knowledge of the times in three volumes, as the classical work of Camille Jordan. Perhaps that is why present-day textbooks of analysis are disproportionately modest relative to the present state of the art. More: they have "retreated" to the state before Jordan and Goursat. In recent years the scene has been changing rapidly: Jean Dieudonné is offering us his monumental Éléments d'Analyse (10 volumes) written in the spirit of the great French Cours d'Analyse. To the best of my knowledge, the present book is the only one of its size: starting from scratch, from rational numbers, to be precise, it goes on to the theory of distributions, direct integrals, analysis on complex manifolds, Kähler manifolds, the theory of sheave...

  9. Evaluation of the autonomic neuropathy function immediately after a change to upright posture using the impulse response function; Impulse oto kansu wo mochiita shisei henkan katoki ni okeru jiritsu shinkei kino hyoka

    Energy Technology Data Exchange (ETDEWEB)

    Yokoyama, K. [Nagoya City University, Nagoya (Japan); Moyoshi, M.; Takata, K. [Daido Institute of Technology, Nagoya (Japan); Watanabe, Y. [Toyota College of Technology, Aichi (Japan)

    1997-05-20

    Autonomic nervous function immediately after a change to the upright posture has been evaluated by applying the transient response function of the blood regulation system. The impulse response function was determined from the heart rate variation before the postural change to the upright posture, and was compared with the transient change immediately after the change to the upright posture. The time series of R-R intervals of the electrocardiogram was used as the time series of heart rate variation. To determine the impulse response function, an autoregressive model was applied to the R-R interval time series. The impulse response function at the steady state is the transient reaction to an impulse stimulus applied to the blood regulation system. The R-R interval decreases rapidly through the autonomic nervous reaction in which blood is rapidly transferred into the legs immediately after the change to the upright posture. There is a close correlation between this initial temporary decrease in the R-R interval immediately after the change to the upright posture and the impulse response function derived from the heart rate variation. Accordingly, the blood regulation and autonomic nervous functions can be evaluated from the impulse response function without an actual standing test and without loading the test subjects. 9 refs., 3 figs., 1 tab.
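
    A minimal sketch of the two steps described above, fitting an autoregressive model to an R-R interval series and deriving the impulse response of the fitted system, is given below; the R-R series is synthetic and the model order is an arbitrary choice, not values from the study.

        import numpy as np

        def fit_ar(y, order):
            # Least-squares fit of y[n] = a1*y[n-1] + ... + ap*y[n-p] + e[n]
            # to the mean-removed series.
            y = np.asarray(y, dtype=float) - np.mean(y)
            X = np.column_stack([y[order - k: len(y) - k] for k in range(1, order + 1)])
            coeffs, *_ = np.linalg.lstsq(X, y[order:], rcond=None)
            return coeffs

        def impulse_response(coeffs, n_steps):
            # Response of the fitted AR system to a unit impulse at step 0.
            h = np.zeros(n_steps)
            h[0] = 1.0
            for n in range(1, n_steps):
                for k, a in enumerate(coeffs, start=1):
                    if n - k >= 0:
                        h[n] += a * h[n - k]
            return h

        # Hypothetical resting R-R interval series (seconds): mean ~0.85 s with a
        # respiratory-like oscillation plus noise, standing in for measured data.
        rng = np.random.default_rng(1)
        beats = np.arange(300)
        rr = 0.85 + 0.03 * np.sin(2 * np.pi * beats / 4.5) + 0.01 * rng.standard_normal(300)

        h = impulse_response(fit_ar(rr, order=8), n_steps=30)
        print(np.round(h[:10], 3))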

  10. Download this PDF file

    African Journals Online (AJOL)

    Dr Olaleye

    skulls and specimens from later Kansu prehistoric sites in comparison with North China and other recent crania. Palaeont. Sinica, 6:1-83. Cassidy, Patrick J. (1913a) "megaseme", Webster 1913 dictionary. answers.com (homepage on the internet), available from http://www.answers.com/topic/megaseme. Cassidy, Patrick.

  11. Decision analysis multicriteria analysis

    International Nuclear Information System (INIS)

    Lombard, J.

    1986-09-01

    The ALARA procedure covers a wide range of decisions, from the simplest to the most complex. For the simplest ones, engineering judgement is generally enough and the use of a decision-aiding technique is therefore not necessary. For some decisions the comparison of the available protection options may be performed with two or a few criteria (or attributes) (protection cost, collective dose, ...), and the use of rather simple decision-aiding techniques, like cost-effectiveness analysis or cost-benefit analysis, is quite sufficient. For the more complex decisions, involving numerous criteria, or for decisions involving large uncertainties or qualitative judgement, the use of these techniques, even the extended cost-benefit analysis, is not recommended, and appropriate techniques like multi-attribute decision-aiding techniques are more relevant. There are many such techniques and it is not possible to present all of them. Therefore only two broad categories of multi-attribute decision-aiding techniques are presented here: decision analysis and outranking analysis.
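
    As a small illustration of the multi-attribute idea (not any particular technique from the report), the sketch below aggregates two criteria, protection cost and collective dose, into a single score with a simple weighted sum; all option values and weights are invented.

        # Hypothetical protection options: cost in kEUR, collective dose in man-Sv.
        options = {
            "option A": {"cost": 100.0, "dose": 2.0},
            "option B": {"cost": 250.0, "dose": 0.8},
            "option C": {"cost": 400.0, "dose": 0.5},
        }
        weights = {"cost": 0.4, "dose": 0.6}   # decision-maker's trade-off (invented)

        lo = {c: min(o[c] for o in options.values()) for c in weights}
        hi = {c: max(o[c] for o in options.values()) for c in weights}

        def score(option):
            # Normalize each criterion to [0, 1]; lower cost and lower dose are better.
            return sum(weights[c] * (hi[c] - option[c]) / (hi[c] - lo[c]) for c in weights)

        scores = {name: round(score(o), 3) for name, o in options.items()}
        print(scores, "->", max(scores, key=scores.get))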

  12. Performance analysis

    International Nuclear Information System (INIS)

    2008-05-01

    This book introduces the energy and resource technology development business together with a performance analysis. It covers the business division and definition, an analysis of the current support situation, the substance of the national basic plan for energy and resource technology development, the selection of analysis indices, the results of the performance analysis by index, the results of the performance investigation, and the analysis and appraisal of the energy and resource technology development business in 2007.

  13. Instrumental analysis

    International Nuclear Information System (INIS)

    Kim, Seung Jae; Seo, Seong Gyu

    1995-03-01

    This textbook deals with instrumental analysis and consists of nine chapters. It covers an introduction to analytical chemistry, including the process of analysis and the types and forms of analysis; electrochemistry, with basic theory, potentiometry and conductometry; electromagnetic radiation and optical components, with an introduction and applications; ultraviolet and visible spectrophotometry; atomic absorption spectrophotometry, with an introduction, flame emission spectrometry and plasma emission spectrometry; and other instrumental methods such as infrared spectrophotometry, X-ray spectrophotometry and mass spectrometry, chromatography, and radiochemical analysis.

  14. Conference on the Physics and Chemistry of Semiconductor Interfaces (26th) Held in the Catamaran Resort Hotel in Pacific Beach, San Diego, California on 17 January 1999 to 21 January 1999. Microelectronics and Nanometer Structures: Processing, Measurement, and Phenomena

    Science.gov (United States)

    2000-01-27

    borophosphosilicate glass films, Tomoyuki Yoshida, Koyu Aoki, and Yasuichi Mitsushima, Toyota Central Research and Development Laboratories, Inc. ... PA, 1978). [18] K. Krishnan, P. J. Stout, and M. Watanabe, in Practical Fourier Transform Infrared Spectroscopy, edited by J. R. Ferraro and K...

  15. Content Analysis

    Directory of Open Access Journals (Sweden)

    George Bedinelli Rossi

    2014-09-01

    Full Text Available This study introduces the various definitions and types of content analysis. Historically, this type of analysis presented itself as a quantitative approach to data analysis, and currently it also appears as a qualitative approach. The most common types are conceptual and relational analysis. The latter is influenced by linguistic, cognitive and mental models and is subdivided into affective extraction, proximity analysis and cognitive mapping. Regarding the importance of this type of analysis, it has both a quantitative and a qualitative character, and the qualitative approach can be used to identify hypotheses, theoretical constructs or even models that can be tested by multivariate statistical techniques or by experiments.

  16. Sensitivity analysis

    Science.gov (United States)

    Sensitivity analysis determines the effectiveness of antibiotics against microorganisms (germs) ... (medlineplus.gov/ency/article/003741.htm)

  17. Real analysis

    CERN Document Server

    McShane, Edward James

    2013-01-01

    This text surveys practical elements of real function theory, general topology, and functional analysis. Discusses the maximality principle, the notion of convergence, the Lebesgue-Stieltjes integral, function spaces and harmonic analysis. Includes exercises. 1959 edition.

  18. Strategic analysis

    OpenAIRE

    Bartuňková, Alena

    2011-01-01

    The goal of the master thesis is to conduct a strategic analysis of the Czech Export Bank. The work is divided into two parts, a theoretical-methodological part and a practical part. The theoretical part defines strategy according to different approaches and describes the procedures for strategic analysis. Findings from the theoretical-methodological part are applied in the practical part and subsequently used in the strategic analysis. The external analysis is characterized by ...

  19. Fourier analysis

    CERN Document Server

    Stade, Eric

    2005-01-01

    A reader-friendly, systematic introduction to Fourier analysis Rich in both theory and application, Fourier Analysis presents a unique and thorough approach to a key topic in advanced calculus. This pioneering resource tells the full story of Fourier analysis, including its history and its impact on the development of modern mathematical analysis, and also discusses essential concepts and today's applications. Written at a rigorous level, yet in an engaging style that does not dilute the material, Fourier Analysis brings two profound aspects of the discipline to the forefront: the wealth of ap

  20. Functional analysis

    CERN Document Server

    Kantorovich, L V

    1982-01-01

    Functional Analysis examines trends in functional analysis as a mathematical discipline and the ever-increasing role played by its techniques in applications. The theory of topological vector spaces is emphasized, along with the applications of functional analysis to applied analysis. Some topics of functional analysis connected with applications to mathematical economics and control theory are also discussed. Comprised of 18 chapters, this book begins with an introduction to the elements of the theory of topological spaces, the theory of metric spaces, and the theory of abstract measure space

  1. Dimensional Analysis

    Indian Academy of Sciences (India)

    Dimensional analysis is a useful tool which finds important applications in physics and engineering. It is most effective when there exist a maximal number of dimensionless quantities constructed out of the relevant physical variables. Though a complete theory of dimensional analysis was developed way back in 1914 in a...

  2. Applied analysis

    CERN Document Server

    Lanczos, Cornelius

    1956-01-01

    Basic text for graduate and advanced undergraduate students; deals with the search for roots of algebraic equations encountered in vibration and flutter problems and in those of static and dynamic stability. Other topics are devoted to matrices and eigenvalue problems, large-scale linear systems, harmonic analysis and data analysis, and more.

  3. Incidents analysis

    International Nuclear Information System (INIS)

    Francois, P.

    1996-01-01

    We undertook a study programme at the end of 1991. To start with, we performed some exploratory studies aimed at learning some preliminary lessons on this type of analysis: assessment of the interest of probabilistic incident analysis; the possibility of using PSA scenarios; the skills and resources required. At the same time, EPN created a working group whose assignment was to define a new approach for the analysis of incidents at NPPs. This working group gave thought to both aspects of operating feedback that EPN wished to improve: analysis of significant incidents and analysis of potential consequences. We took part in the work of this group, and for the second aspect we proposed a method based on an adaptation of the event-tree method in order to establish a link between existing PSA models and actual incidents. Since PSA provides an exhaustive database of accident scenarios applicable to the two most common types of units in France, it is obviously of interest for this sort of analysis. With this method we performed some incident analyses, and at the same time explored some methods employed abroad, particularly ASP (Accident Sequence Precursor, a method used by the NRC). Early in 1994 EDF began a systematic analysis programme. The first, transient phase will set up methods and an organizational structure. 7 figs

  4. Factor Analysis via Components Analysis

    Science.gov (United States)

    Bentler, Peter M.; de Leeuw, Jan

    2011-01-01

    When the factor analysis model holds, component loadings are linear combinations of factor loadings, and vice versa. This interrelation permits us to define new optimization criteria and estimation methods for exploratory factor analysis. Although this article is primarily conceptual in nature, an illustrative example and a small simulation show…

  5. Dimensional Analysis

    CERN Document Server

    Tan, Qingming

    2011-01-01

    Dimensional analysis is an essential scientific method and a powerful tool for solving problems in physics and engineering. This book starts by introducing the Pi Theorem, which is the theoretical foundation of dimensional analysis. It also provides ample and detailed examples of how dimensional analysis is applied to solving problems in various branches of mechanics. The book covers the extensive findings on explosion mechanics and impact dynamics contributed by the author's research group over the past forty years at the Chinese Academy of Sciences. The book is intended for advanced undergra

  6. Recursive analysis

    CERN Document Server

    Goodstein, R L

    2010-01-01

    Recursive analysis develops natural number computations into a framework appropriate for real numbers. This text is based upon primitive recursive arithmetic and presents a unique combination of classical analysis and intuitional analysis. Written by a master in the field, it is suitable for graduate students of mathematics and computer science and can be read without a detailed knowledge of recursive arithmetic. Introductory chapters on recursive convergence and recursive and relative continuity are succeeded by explorations of recursive and relative differentiability, the relative integral, and

  7. Hydroeconomic analysis

    DEFF Research Database (Denmark)

    Bauer-Gottwein, Peter; Riegels, Niels; Pulido-Velazquez, Manuel

    2017-01-01

    Hydroeconomic analysis and modeling provides a consistent and quantitative framework to assess the links between water resources systems and economic activities related to water use, simultaneously modeling water supply and water demand. It supports water managers and decision makers in assessing trade-offs between different water uses, different geographic regions, and various economic sectors and between the present and the future. Hydroeconomic analysis provides consistent economic performance criteria for infrastructure development and institutional reform in water policies and management organizations. This chapter presents an introduction to hydroeconomic analysis and modeling, and reviews the state of the art in the field. We review available economic water-valuation techniques and summarize the main types of decision problems encountered in hydroeconomic analysis. Popular solution strategies...

  8. Biorefinery Analysis

    Energy Technology Data Exchange (ETDEWEB)

    2016-06-01

    Fact sheet summarizing NREL's techno-economic analysis and life-cycle assessment capabilities to connect research with future commercial process integration, a critical step in the scale-up of biomass conversion technologies.

  9. Nonlinear analysis

    CERN Document Server

    Gasinski, Leszek

    2005-01-01

    Hausdorff Measures and Capacity. Lebesgue-Bochner and Sobolev Spaces. Nonlinear Operators and Young Measures. Smooth and Nonsmooth Analysis and Variational Principles. Critical Point Theory. Eigenvalue Problems and Maximum Principles. Fixed Point Theory.

  10. Analysis II

    CERN Document Server

    Tao, Terence

    2016-01-01

    This is part two of a two-volume book on real analysis and is intended for senior undergraduate students of mathematics who have already been exposed to calculus. The emphasis is on rigour and foundations of analysis. Beginning with the construction of the number systems and set theory, the book discusses the basics of analysis (limits, series, continuity, differentiation, Riemann integration), through to power series, several variable calculus and Fourier analysis, and then finally the Lebesgue integral. These are almost entirely set in the concrete setting of the real line and Euclidean spaces, although there is some material on abstract metric and topological spaces. The book also has appendices on mathematical logic and the decimal system. The entire text (omitting some less central topics) can be taught in two quarters of 25–30 lectures each. The course material is deeply intertwined with the exercises, as it is intended that the student actively learn the material (and practice thinking and writing ri...

  11. Analysis I

    CERN Document Server

    Tao, Terence

    2016-01-01

    This is part one of a two-volume book on real analysis and is intended for senior undergraduate students of mathematics who have already been exposed to calculus. The emphasis is on rigour and foundations of analysis. Beginning with the construction of the number systems and set theory, the book discusses the basics of analysis (limits, series, continuity, differentiation, Riemann integration), through to power series, several variable calculus and Fourier analysis, and then finally the Lebesgue integral. These are almost entirely set in the concrete setting of the real line and Euclidean spaces, although there is some material on abstract metric and topological spaces. The book also has appendices on mathematical logic and the decimal system. The entire text (omitting some less central topics) can be taught in two quarters of 25–30 lectures each. The course material is deeply intertwined with the exercises, as it is intended that the student actively learn the material (and practice thinking and writing ri...

  12. Water analysis

    International Nuclear Information System (INIS)

    Fishman, M.J.; Erdmann, D.E.

    1975-01-01

    The literature of analytical chemistry applied to water analysis is reviewed for the period Oct. 1972, through Sept. 1974. The material used in preparing the review comes mainly from major analytical journals and Chemical Abstracts. Many methods, including activation and radiometric, are discussed for the analyses of various elements, including Mo, U, Th, rare earths, and halides. Radioactivity and isotope analysis are also discussed. (663 references.) (U.S.)

  13. Link Analysis

    Science.gov (United States)

    Donoho, Steve

    Link analysis is a collection of techniques that operate on data that can be represented as nodes and links. This chapter surveys a variety of techniques including subgraph matching, finding cliques and K-plexes, maximizing spread of influence, visualization, finding hubs and authorities, and combining with traditional techniques (classification, clustering, etc). It also surveys applications including social network analysis, viral marketing, Internet search, fraud detection, and crime prevention.
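
    One of the techniques named above, the hubs-and-authorities (HITS) computation, can be sketched in a few lines of power iteration; the small adjacency matrix below is invented for illustration.

        import numpy as np

        # Hypothetical directed link graph: A[i, j] = 1 if node i links to node j.
        A = np.array([
            [0, 1, 1, 0],
            [0, 0, 1, 0],
            [1, 0, 0, 1],
            [0, 0, 1, 0],
        ], dtype=float)

        # HITS by power iteration: authority scores a = A^T h, hub scores h = A a,
        # each renormalized at every step.
        h = np.ones(A.shape[0])
        for _ in range(100):
            a = A.T @ h
            a /= np.linalg.norm(a)
            h = A @ a
            h /= np.linalg.norm(h)

        print("authority scores:", np.round(a, 3))
        print("hub scores:      ", np.round(h, 3))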

  14. Radioactivation analysis

    International Nuclear Information System (INIS)

    1959-01-01

    Radioactivation analysis is the technique of analysing the constituents of a very small sample of matter by making the sample artificially radioactive. The first stage is to make the sample radioactive by artificial means, e.g. by subjecting it to neutron bombardment. Once the sample has been activated, or made radioactive, the next task is to analyze the radiations given off by the sample. This analysis indicates the nature and quantities of the various elements present in the sample, because the radiation from a particular radioisotope is characteristic of that isotope. In 1959 a symposium on 'Radioactivation Analysis' was organized in Vienna by the IAEA and the Joint Commission on Applied Radioactivity (ICSU). It was pointed out that certain factors create uncertainties, and it was elaborated how to overcome them. Attention was drawn to the fact that radioactivation analysis had proven a powerful tool for tackling fundamental problems in geo- and cosmochemistry, and a review was given of the recent work in this field. Because of its extreme sensitivity, radioactivation analysis had been principally employed for trace detection, and its most extensive use had been in the control of semiconductors and very pure metals. An account of the experience gained in the USA was given, where radioactivation analysis was being used by many investigators in various scientific fields as a practical and useful tool for elemental analyses. Much of this work had been concerned with determining sub-microgramme and microgramme concentrations of many different elements in samples of biological materials, drugs, fertilizers, fine chemicals, foods, fuels, glass, ceramic materials, metals, minerals, paints, petroleum products, resinous materials, soils, toxicants, water and other materials. In addition to these studies, radioactivation analysis had been used by other investigators to determine isotopic ratios of the stable isotopes of some of the elements. Another paper dealt with radioactivation

  15. Real analysis

    CERN Document Server

    Loeb, Peter A

    2016-01-01

    This textbook is designed for a year-long course in real analysis taken by beginning graduate and advanced undergraduate students in mathematics and other areas such as statistics, engineering, and economics. Written by one of the leading scholars in the field, it elegantly explores the core concepts in real analysis and introduces new, accessible methods for both students and instructors. The first half of the book develops both Lebesgue measure and, with essentially no additional work for the student, general Borel measures for the real line. Notation indicates when a result holds only for Lebesgue measure. Differentiation and absolute continuity are presented using a local maximal function, resulting in an exposition that is both simpler and more general than the traditional approach. The second half deals with general measures and functional analysis, including Hilbert spaces, Fourier series, and the Riesz representation theorem for positive linear functionals on continuous functions with compact support....

  16. Numerical analysis

    CERN Document Server

    Rao, G Shanker

    2006-01-01

    About the Book: This book provides an introduction to numerical analysis for students of mathematics and engineering. The book is designed in accordance with the common core syllabus of numerical analysis of the universities of Andhra Pradesh and also the syllabus prescribed in most Indian universities. Salient features: approximate and numerical solutions of algebraic and transcendental equations; interpolation of functions; numerical differentiation and integration; and numerical solution of ordinary differential equations. The last three chapters deal with curve fitting, eigenvalues and eigenvectors of a matrix, and regression analysis. Each chapter is supplemented with a number of worked-out examples as well as a number of problems to be solved by the students. This helps in a better understanding of the subject. Contents: Errors; Solution of Algebraic and Transcendental Equations; Finite Differences; Interpolation with Equal Intervals; Interpolation with Unequal Int...

  17. Real analysis

    CERN Document Server

    DiBenedetto, Emmanuele

    2016-01-01

    The second edition of this classic textbook presents a rigorous and self-contained introduction to real analysis with the goal of providing a solid foundation for future coursework and research in applied mathematics. Written in a clear and concise style, it covers all of the necessary subjects as well as those often absent from standard introductory texts. Each chapter features a “Problems and Complements” section that includes additional material that briefly expands on certain topics within the chapter and numerous exercises for practicing the key concepts. The first eight chapters explore all of the basic topics for training in real analysis, beginning with a review of countable sets before moving on to detailed discussions of measure theory, Lebesgue integration, Banach spaces, functional analysis, and weakly differentiable functions. More topical applications are discussed in the remaining chapters, such as maximal functions, functions of bounded mean oscillation, rearrangements, potential theory, a...

  18. Numerical analysis

    CERN Document Server

    Jacques, Ian

    1987-01-01

    This book is primarily intended for undergraduates in mathematics, the physical sciences and engineering. It introduces students to most of the techniques forming the core component of courses in numerical analysis. The text is divided into eight chapters which are largely self-contained. However, with a subject as intricately woven as mathematics, there is inevitably some interdependence between them. The level of difficulty varies and, although emphasis is firmly placed on the methods themselves rather than their analysis, we have not hesitated to include theoretical material when we consider it to be sufficiently interesting. However, it should be possible to omit those parts that do seem daunting while still being able to follow the worked examples and to tackle the exercises accompanying each section. Familiarity with the basic results of analysis and linear algebra is assumed since these are normally taught in first courses on mathematical methods. For reference purposes a list of theorems used in the t...

  19. Numerical analysis

    CERN Document Server

    Scott, L Ridgway

    2011-01-01

    Computational science is fundamentally changing how technological questions are addressed. The design of aircraft, automobiles, and even racing sailboats is now done by computational simulation. The mathematical foundation of this new approach is numerical analysis, which studies algorithms for computing expressions defined with real numbers. Emphasizing the theory behind the computation, this book provides a rigorous and self-contained introduction to numerical analysis and presents the advanced mathematics that underpin industrial software, including complete details that are missing from most textbooks. Using an inquiry-based learning approach, Numerical Analysis is written in a narrative style, provides historical background, and includes many of the proofs and technical details in exercises. Students will be able to go beyond an elementary understanding of numerical simulation and develop deep insights into the foundations of the subject. They will no longer have to accept the mathematical gaps that ex...

  20. META-ANALYSIS

    Directory of Open Access Journals (Sweden)

    Ivana Ilić

    2009-04-01

    Full Text Available Meta-analysis is a statistical and analytical method which combines and synthesizes different independent studies and integrates their results into a common result. In the past few years, there has been increasing interest in meta-analysis from both medical researchers and statisticians. One of the main targets of clinical research is to obtain reliable results, although clinical trials on the same topic often give contrasting results. Medical practice is strongly influenced by the results of clinical studies if they are brought to light through important scientific journals. This large amount of information often contains scattered data and discordant conclusions, and sometimes it is very hard to define the quality and validity of each study. Today, a large number of biomedical journals attach importance to articles using meta-analysis in their research. By using meta-analysis as a method of summarizing, integrating and analyzing a large number of independent studies on the same topic and finally pooling their results into a common result, a researcher can reach relevant, objective and unbiased conclusions, provided the procedure is well conducted and controlled by experts. The aim of this paper is to provide the clinical researcher with the basic principles and concepts of meta-analysis in order to perform a valid clinical study and to report results in the correct way. In today's evidence-based medical practice, it is crucial for anyone who wants to deal seriously with scientific work in the biomedical field to learn the mathematical and statistical principles on which meta-analysis is built. In that way, this statistical method can be of great importance to researchers who want to respond to the new demands of modern medical science.
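
    The pooling step at the heart of a meta-analysis can be illustrated with a simple fixed-effect (inverse-variance) model; the study effects and variances below are invented, and a real meta-analysis would additionally assess heterogeneity and possibly use a random-effects model.

        import numpy as np

        def fixed_effect_meta(effects, variances):
            # Inverse-variance (fixed-effect) pooling of independent study estimates.
            effects = np.asarray(effects, dtype=float)
            w = 1.0 / np.asarray(variances, dtype=float)
            pooled = np.sum(w * effects) / np.sum(w)
            pooled_se = np.sqrt(1.0 / np.sum(w))
            ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
            return pooled, pooled_se, ci

        # Hypothetical log-odds-ratio estimates and variances from five trials.
        effects = [-0.30, -0.10, -0.45, 0.05, -0.25]
        variances = [0.04, 0.09, 0.06, 0.12, 0.05]
        pooled, se, ci = fixed_effect_meta(effects, variances)
        print(f"pooled effect {pooled:.3f}, 95% CI [{ci[0]:.3f}, {ci[1]:.3f}]")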

  1. Outlier analysis

    CERN Document Server

    Aggarwal, Charu C

    2013-01-01

    With the increasing advances in hardware technology for data collection, and advances in software technology (databases) for data organization, computer scientists have increasingly participated in the latest advancements of the outlier analysis field. Computer scientists, specifically, approach this field based on their practical experiences in managing large amounts of data, and with far fewer assumptions- the data can be of any type, structured or unstructured, and may be extremely large.Outlier Analysis is a comprehensive exposition, as understood by data mining experts, statisticians and

  2. Field analysis

    Directory of Open Access Journals (Sweden)

    Waindok Andrzej

    2017-12-01

    Full Text Available A field analysis including eddy currents in the magnetic core of a five-phase permanent magnet tubular linear actuator (TLA) has been carried out. The eddy currents induced in the magnetic core cause losses, which have been calculated. The results from 2D finite element (FE) analysis have been compared with those from 3D calculations. The losses in the mover of the five-phase actuator are much lower than the losses in its stator, which is why the former can be neglected in computer-aided design. The calculation results have been verified experimentally.

  3. Numerical analysis

    CERN Document Server

    Brezinski, C

    2012-01-01

    Numerical analysis has witnessed many significant developments in the 20th century. This book brings together 16 papers dealing with historical developments, survey papers and papers on recent trends in selected areas of numerical analysis, such as: approximation and interpolation, solution of linear systems and eigenvalue problems, iterative methods, quadrature rules, and the solution of ordinary differential, partial differential and integral equations. The papers are reprinted from the 7-volume 'Numerical Analysis 2000' project of the Journal of Computational and Applied Mathematics.

  4. Clustering analysis

    International Nuclear Information System (INIS)

    Romli

    1997-01-01

    Cluster analysis is the name of a group of multivariate techniques whose principal purpose is to distinguish similar entities on the basis of the characteristics they possess. To study this analysis, there are several algorithms that can be used. This topic therefore focuses on those algorithms, such as similarity measures, and on hierarchical clustering, which includes the single linkage, complete linkage and average linkage methods. Also, a non-hierarchical clustering method, popularly known as the K-means method, will be discussed. Finally, this paper describes the advantages and disadvantages of each method.
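
    A minimal illustration of the hierarchical methods listed above (single, complete and average linkage), here using SciPy on invented two-group data with Euclidean distance as the similarity measure; the non-hierarchical K-means method mentioned in the abstract is not shown.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from scipy.spatial.distance import pdist

        # Hypothetical samples described by trace-element-like concentration patterns
        # (rows = samples, columns = variables), drawn around two distinct centers.
        rng = np.random.default_rng(0)
        group_a = rng.normal(loc=[1.0, 5.0, 0.2], scale=0.1, size=(10, 3))
        group_b = rng.normal(loc=[3.0, 1.0, 0.8], scale=0.1, size=(10, 3))
        X = np.vstack([group_a, group_b])

        d = pdist(X, metric='euclidean')                   # pairwise similarity measure
        for method in ('single', 'complete', 'average'):   # the three linkage rules
            Z = linkage(d, method=method)
            labels = fcluster(Z, t=2, criterion='maxclust')
            print(method, labels)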

  5. Image Analysis

    DEFF Research Database (Denmark)

    The 19th Scandinavian Conference on Image Analysis was held at the IT University of Copenhagen in Denmark during June 15-17, 2015. The SCIA conference series has been an ongoing biannual event for more than 30 years and over the years it has nurtured a world-class regional research and development... The topics of the accepted papers range from novel applications of vision systems, pattern recognition, machine learning, feature extraction, segmentation, and 3D vision, to medical and biomedical image analysis. The papers originate from all the Scandinavian countries and several other European countries...

  6. Risk analysis

    International Nuclear Information System (INIS)

    Baron, J.H.; Nunez McLeod, J.; Rivera, S.S.

    1997-01-01

    This book contains a selection of research works performed at the CEDIAC Institute (Cuyo National University) in the area of risk analysis, with specific orientation to the subjects of uncertainty and sensitivity studies, software reliability, severe accident modeling, etc. This volume presents important material for all those researchers who want to gain insight into the risk analysis field as a tool for solving several problems frequently found in engineering and the applied sciences, as well as for academic teachers who want to keep up to date, including the new developments and improvements continuously arising in this field. [es]

  7. Convex analysis

    CERN Document Server

    Rockafellar, Ralph Tyrell

    2015-01-01

    Available for the first time in paperback, R. Tyrrell Rockafellar's classic study presents readers with a coherent branch of nonlinear mathematical analysis that is especially suited to the study of optimization problems. Rockafellar's theory differs from classical analysis in that differentiability assumptions are replaced by convexity assumptions. The topics treated in this volume include: systems of inequalities, the minimum or maximum of a convex function over a convex set, Lagrange multipliers, minimax theorems and duality, as well as basic results about the structure of convex sets and

  8. Elementary analysis

    CERN Document Server

    Snell, K S; Langford, W J; Maxwell, E A

    1966-01-01

    Elementary Analysis, Volume 2 introduces several of the ideas of modern mathematics in a casual manner and provides the practical experience in algebraic and analytic operations that lays a sound foundation of basic skills. This book focuses on the nature of number, algebraic and logical structure, groups, rings, fields, vector spaces, matrices, sequences, limits, functions and inverse functions, complex numbers, and probability. The logical structure of analysis given through the treatment of differentiation and integration, with applications to the trigonometric and logarithmic functions, is

  9. Cluster analysis

    CERN Document Server

    Everitt, Brian S; Leese, Morven; Stahl, Daniel

    2011-01-01

    Cluster analysis comprises a range of methods for classifying multivariate data into subgroups. By organizing multivariate data into such subgroups, clustering can help reveal the characteristics of any structure or patterns present. These techniques have proven useful in a wide range of areas such as medicine, psychology, market research and bioinformatics.This fifth edition of the highly successful Cluster Analysis includes coverage of the latest developments in the field and a new chapter dealing with finite mixture models for structured data.Real life examples are used throughout to demons

  10. Relativistic analysis

    International Nuclear Information System (INIS)

    Unterberger, A.

    1987-01-01

    We study the Klein-Gordon symbolic calculus of operators acting on solutions of the free Klein-Gordon equation. It contracts to the Weyl calculus as c→∞. Mathematically, it may also be considered as a pseudodifferential analysis on the unit ball of R{sup n}. [fr]

  11. Analysis paralysis

    Science.gov (United States)

    Bill Block

    2012-01-01

    I have been Editor-in-Chief for about 10 months now. Over that period of time, I have processed hundreds of manuscripts and considered hundreds of reviews. In doing so, I have noticed an emphasis on analysis at the expense of a better understanding of the ecological system under study. I mention this not to belittle statistical advances made within various disciplines...

  12. IWS analysis

    International Nuclear Information System (INIS)

    Rhoades, W.A.; Dray, B.J.

    1970-01-01

    The effect of Gadolinium-155 on the prompt kinetic behavior of a zirconium hydride reactor has been deduced, using experimental data from the SNAPTRAN machine. The poison material makes the temperature coefficient more positive, and the Type IV sleeves were deduced to give a positive coefficient above 1100 deg F. A thorough discussion of the data and analysis is included. (U.S.)

  13. Analysis report

    International Nuclear Information System (INIS)

    Saadi, Radouan; Marah, Hamid

    2014-01-01

    This report presents results of tritium analysis carried out at the CNESTEN DASTE in Rabat (Morocco), on behalf of Senegal, within the framework of the RAF7011 project. It describes the analytical method and instrumentation, including general uncertainty estimation: electrolytic enrichment and liquid scintillation counting. The results are expressed in Tritium Units (TU); the lower limit of detection is 0.02 TU.

  14. Dimensional Analysis

    Indian Academy of Sciences (India)

    to understand and quite straightforward to use. Dimensional analysis is a topic which every student of 'science encounters in elementary physics courses. The basics of this topic are taught and learnt quite hurriedly (and forgotten fairly quickly thereafter!) It does not generally receive the attention and the respect it deserves ...

  15. Survival Analysis

    CERN Document Server

    Miller, Rupert G

    2011-01-01

    A concise summary of the statistical methods used in the analysis of survival data with censoring. Emphasizes recently developed nonparametric techniques. Outlines methods in detail and illustrates them with actual data. Discusses the theory behind each method. Includes numerous worked problems and numerical exercises.

  16. Poetic Analysis

    DEFF Research Database (Denmark)

    Nielsen, Kirsten

    2010-01-01

    The first part of this article presents the characteristics of Hebrew poetry: features associated with rhythm and phonology, grammatical features, structural elements like parallelism, and imagery and intertextuality. The second part consists of an analysis of Psalm 121. It is argued that assonance...

  17. Architecture Analysis

    NARCIS (Netherlands)

    Iacob, Maria-Eugenia; Jonkers, Henk; van der Torre, Leon; de Boer, Frank S.; Bonsangue, Marcello; Stam, Andries W.; Lankhorst, Marc M.; Quartel, Dick A.C.; Aldea, Adina; Lankhorst, Marc

    2017-01-01

    This chapter also explains what the added value of enterprise architecture analysis techniques is in addition to existing, more detailed, and domain-specific ones for business processes or software, for example. Analogous to the idea of using the ArchiMate enterprise modelling language to integrate

  18. Trend analysis

    International Nuclear Information System (INIS)

    Smith, M.; Jones, D.R.

    1991-01-01

    The goal of exploration is to find reserves that will earn an adequate rate of return on the capital invested. Neither exploration nor economics is an exact science. The authors must therefore explore in those trends (plays) that have the highest probability of achieving this goal. Trend analysis is a technique for organizing the available data so that these strategic exploration decisions are made objectively and in conformance with the explorers' goals and risk attitudes. Trend analysis differs from resource estimation in its purpose: it seeks to determine the probability of economic success for an exploration program, not the ultimate results of the total industry effort. Thus the recent past is assumed to be the best estimate of the exploration probabilities for the near future. This information is combined with economic forecasts. The computer software tools necessary for trend analysis are (1) an information database - requirements and sources; (2) a data conditioning program - assignment to trends, correction of errors, and conversion into usable form; (3) a statistical processing program - calculation of the probability of success and the discovery-size probability distribution; and (4) analytical processing - Monte Carlo simulation to develop the probability distribution of the economic return/investment ratio for a trend. Limited-capital (short-run) effects are analyzed using the Gambler's Ruin concept in the Monte Carlo simulation and by a short-cut method. Multiple-trend analysis is concerned with comparing and ranking trends, allocating funds among acceptable trends, and characterizing program risk by using risk profiles. In summary, trend analysis is a reality check for long-range exploration planning.
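
    Step (4) above, the Monte Carlo development of a return/investment-ratio distribution for a trend, can be sketched as follows; the probability of success, well cost and lognormal discovery-size parameters are all hypothetical, and the Gambler's Ruin treatment of limited capital is not included.

        import numpy as np

        rng = np.random.default_rng(42)
        n_programs = 20_000       # simulated exploration programs in the trend
        n_wells = 10              # wells drilled per program
        p_success = 0.2           # probability of success per well (hypothetical)
        well_cost = 5.0           # cost per well, $MM (hypothetical)
        # Hypothetical lognormal discovery-size distribution, $MM of discounted
        # net revenue per successful well.
        mu_log, sigma_log = np.log(30.0), 0.9

        successes = rng.binomial(n_wells, p_success, size=n_programs)
        revenue = np.array([rng.lognormal(mu_log, sigma_log, k).sum() for k in successes])
        ratio = revenue / (n_wells * well_cost)   # economic return / investment ratio

        print(f"P(ratio >= 1) = {np.mean(ratio >= 1.0):.2f}, "
              f"median ratio = {np.median(ratio):.2f}")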

  19. Complex analysis

    CERN Document Server

    Freitag, Eberhard

    2005-01-01

    The guiding principle of this presentation of "Classical Complex Analysis" is to proceed as quickly as possible to the central results while using a small number of notions and concepts from other fields. Thus the prerequisites for understanding this book are minimal; only elementary facts of calculus and algebra are required. The first four chapters cover the essential core of complex analysis: differentiation in C (including elementary facts about conformal mappings); integration in C (including complex line integrals, Cauchy's Integral Theorem, and the Integral Formulas); sequences and series of analytic functions, (isolated) singularities, Laurent series, calculus of residues; construction of analytic functions: the gamma function, Weierstrass' Factorization Theorem, Mittag-Leffler Partial Fraction Decomposition, and, as a particular highlight, the Riemann Mapping Theorem, which characterizes the simply connected domains in C. Further topics included are: the theory of elliptic functions based on...

  20. Wavelet analysis

    CERN Document Server

    Cheng, Lizhi; Luo, Yong; Chen, Bo

    2014-01-01

    This book can be divided into two parts, i.e. fundamental wavelet transform theory and methods, and some important applications of the wavelet transform. In the first part, as preliminary knowledge, Fourier analysis, inner product spaces, the characteristics of Haar functions, and the concepts of multi-resolution analysis are introduced, followed by a description of how to construct wavelet functions, both multi-band and multi-wavelets, and finally the design of integer wavelets via lifting schemes and its application to integer transform algorithms. In the second part, many applications in the field of image and signal processing are discussed, introducing other wavelet variants such as complex wavelets, ridgelets, and curvelets. Important application examples include image compression, image denoising/restoration, image enhancement, digital watermarking, numerical solution of partial differential equations, and solving ill-conditioned Toeplitz systems. The book is intended for senior undergraduate stude...

  1. Harmonic analysis

    CERN Document Server

    Helson, Henry

    2010-01-01

    This second edition has been enlarged and considerably rewritten. Among the new topics are infinite product spaces with applications to probability, disintegration of measures on product spaces, positive definite functions on the line, and additional information about Weyl's theorems on equidistribution. Topics that have continued from the first edition include Minkowski's theorem, measures with bounded powers, idempotent measures, spectral sets of bounded functions and a theorem of Szego, and the Wiener Tauberian theorem. Readers of the book should have studied the Lebesgue integral, the elementary theory of analytic and harmonic functions, and the basic theory of Banach spaces. The treatment is classical and as simple as possible. This is an instructional book, not a treatise. Mathematics students interested in analysis will find here what they need to know about Fourier analysis. Physicists and others can use the book as a reference for more advanced topics.

  2. Geometric analysis

    CERN Document Server

    Bray, Hubert L; Mazzeo, Rafe; Sesum, Natasa

    2015-01-01

    This volume includes expanded versions of the lectures delivered in the Graduate Minicourse portion of the 2013 Park City Mathematics Institute session on Geometric Analysis. The papers give excellent high-level introductions, suitable for graduate students wishing to enter the field and experienced researchers alike, to a range of the most important areas of geometric analysis. These include: the general issue of geometric evolution, with more detailed lectures on Ricci flow and Kähler-Ricci flow, new progress on the analytic aspects of the Willmore equation as well as an introduction to the recent proof of the Willmore conjecture and new directions in min-max theory for geometric variational problems, the current state of the art regarding minimal surfaces in R^3, the role of critical metrics in Riemannian geometry, and the modern perspective on the study of eigenfunctions and eigenvalues for Laplace-Beltrami operators.

  3. [OSTEOMETRIC ANALYSIS].

    Science.gov (United States)

    Tartaglia, Gianna; Nava, Alessia

    2015-01-01

    In paleobiological studies, osteometry is a method for gaining insight into human populations of the past. The analysis of data obtained from measurements of the skeleton can be applied to the determination of sex and of the degree of sexual dimorphism within and between populations. The results obtained from osteometric data of the postcranial skeleton allow us to formulate hypotheses on certain aspects of the living conditions of the people who lived in the urban and suburban area of ancient Rome.

  4. Survival analysis

    International Nuclear Information System (INIS)

    Badwe, R.A.

    1999-01-01

    The primary endpoint in the majority of the studies has been either disease recurrence or death. This kind of analysis requires a special method since not all patients in the study experience the endpoint. The standard method for estimating such a survival distribution is the Kaplan-Meier method. The survival function is defined as the proportion of individuals who survive beyond a certain time. Multivariate comparison of survival has been carried out with Cox's proportional hazards model.
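
    A minimal sketch of the Kaplan-Meier estimator described above, on invented follow-up data with censoring:

        import numpy as np

        def kaplan_meier(times, events):
            # Kaplan-Meier estimate of the survival function S(t).
            # times  : follow-up time for each patient
            # events : 1 if the endpoint (recurrence/death) was observed, 0 if censored
            times = np.asarray(times, dtype=float)
            events = np.asarray(events, dtype=int)
            surv = 1.0
            curve = []
            for t in np.unique(times[events == 1]):
                at_risk = np.sum(times >= t)
                d = np.sum((times == t) & (events == 1))
                surv *= 1.0 - d / at_risk
                curve.append((t, surv))
            return curve

        # Hypothetical follow-up data (months); 0 marks a censored observation.
        times  = [6, 8, 12, 12, 15, 20, 24, 30, 34, 40]
        events = [1, 0,  1,  1,  0,  1,  0,  1,  0,  0]
        for t, s in kaplan_meier(times, events):
            print(f"t = {t:4.0f}  S(t) = {s:.3f}")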

  5. Water analysis

    International Nuclear Information System (INIS)

    Garbarino, J.R.; Steinheimer, T.R.; Taylor, H.E.

    1985-01-01

    This is the twenty-first biennial review of the inorganic and organic analytical chemistry of water. The format of this review differs somewhat from previous reviews in this series - the most recent of which appeared in Analytical Chemistry in April 1983. Changes in format have occurred in the presentation of material concerning review articles and the inorganic analysis of water sections. Organic analysis of water sections are organized as in previous reviews. Review articles have been compiled and tabulated in an Appendix with respect to subject, title, author(s), citation, and number of references cited. The inorganic water analysis sections are now grouped by constituent using the periodic chart; for example, alkali, alkaline earth, 1st series transition metals, etc. Within these groupings the references are roughly grouped by instrumental technique; for example, spectrophotometry, atomic absorption spectrometry, etc. Multiconstituent methods for determining analytes that cannot be grouped in this manner are compiled into a separate section sorted by instrumental technique. References used in preparing this review were compiled from nearly 60 major journals published during the period from October 1982 through September 1984. Conference proceedings, most foreign journals, most trade journals, and most government publications are excluded. References cited were obtained using the American Chemical Society's Chemical Abstracts for sections on inorganic analytical chemistry, organic analytical chemistry, water, and sewage waste. Cross-references of these sections were also included. 860 references

  6. Economic analysis

    Energy Technology Data Exchange (ETDEWEB)

    None

    1980-06-01

    The Energy Policy and Conservation Act (EPCA) mandated that minimum energy efficiency standards be established for classes of refrigerators and refrigerator-freezers, freezers, clothes dryers, water heaters, room air conditioners, home heating equipment, kitchen ranges and ovens, central air conditioners, and furnaces. EPCA requires that standards be designed to achieve the maximum improvement in energy efficiency that is technologically feasible and economically justified. Following the introductory chapter, Chapter Two describes the methodology used in the economic analysis and its relationship to legislative criteria for consumer product efficiency assessment; details how the CPES Value Model systematically compared and evaluated the economic impacts of regulation on the consumer, manufacturer and Nation. Chapter Three briefly displays the results of the analysis and lists the proposed performance standards by product class. Chapter Four describes the reasons for developing a baseline forecast, characterizes the baseline scenario from which regulatory impacts were calculated and summarizes the primary models, data sources and assumptions used in the baseline formulations. Chapter Five summarizes the methodology used to calculate regulatory impacts; describes the impacts of energy performance standards relative to the baseline discussed in Chapter Four. Also discussed are regional standards and other program alternatives to performance standards. Chapter Six describes the procedure for balancing consumer, manufacturer, and national impacts to select standard levels. Details of models and data bases used in the analysis are included in Appendices A through K.

  7. Activation analysis

    International Nuclear Information System (INIS)

    Beeck, J. OP de

    1977-01-01

    It is shown that activation analysis is especially suited to serve as a basis for determining the chemical similarity between samples defined by their trace-element concentration patterns. The general problem of classification and identification is discussed. The nature of possible classification structures and their appropriate clustering strategies is considered. A practical computer method is suggested and its application as well as the graphical representation of classification results are given. The possibility for classification using information theory is mentioned. Classification of chemical elements is discussed and practically realized after Hadamard transformation of the concentration variation patterns in a series of samples. (Sz.N.Z.)
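
    The record above does not include code; as an illustrative sketch only, the following SciPy snippet clusters hypothetical samples by their trace-element concentration patterns with hierarchical (average-linkage) clustering, one common way to realize the similarity grouping described. It is not the paper's own computer method.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import pdist

    rng = np.random.default_rng(3)
    # Two hypothetical groups of samples, four trace elements each
    group_a = rng.normal([10, 2, 0.5, 7], 0.3, size=(5, 4))
    group_b = rng.normal([3, 8, 1.5, 1], 0.3, size=(5, 4))
    X = np.vstack([group_a, group_b])

    # Similarity of concentration *patterns*: correlation distance, average linkage
    Z = linkage(pdist(X, metric="correlation"), method="average")
    print(fcluster(Z, t=2, criterion="maxclust"))   # two recovered clusters
    ```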

  8. Exergy analysis

    DEFF Research Database (Denmark)

    Dovjak, M.; Simone, Angela; Kolarik, Jakub

    2011-01-01

    Exergy analysis enables us to make connections among processes inside the human body and processes in a building. So far, only the effect of different combinations of air temperatures and mean radiant temperatures has been studied, with constant relative humidity in experimental conditions...... al. (1998). The effect of different levels of RH, Ta and effective clothing insulation on human body exergy balance chain, changes in human body exergy consumption rate (hbExCr) and predicted mean vote (PMV) index were analyzed. The results show that thermal comfort conditions do not always result...

  9. Complex Analysis

    CERN Document Server

    Stein, Elias M

    2009-01-01

    With this second volume, we enter the intriguing world of complex analysis. From the first theorems on, the elegance and sweep of the results is evident. The starting point is the simple idea of extending a function initially given for real values of the argument to one that is defined when the argument is complex. From there, one proceeds to the main properties of holomorphic functions, whose proofs are generally short and quite illuminating: the Cauchy theorems, residues, analytic continuation, the argument principle.With this background, the reader is ready to learn a wealth of additional m

  10. Vector analysis

    CERN Document Server

    Newell, Homer E

    2006-01-01

    When employed with skill and understanding, vector analysis can be a practical and powerful tool. This text develops the algebra and calculus of vectors in a manner useful to physicists and engineers. Numerous exercises (with answers) not only provide practice in manipulation but also help establish students' physical and geometric intuition in regard to vectors and vector concepts.Part I, the basic portion of the text, consists of a thorough treatment of vector algebra and the vector calculus. Part II presents the illustrative matter, demonstrating applications to kinematics, mechanics, and e

  11. Sequential analysis

    CERN Document Server

    Wald, Abraham

    2013-01-01

    In 1943, while in charge of Columbia University's Statistical Research Group, Abraham Wald devised Sequential Design, an innovative statistical inference system. Because the decision to terminate an experiment is not predetermined, sequential analysis can arrive at a decision much sooner and with substantially fewer observations than equally reliable test procedures based on a predetermined number of observations. The system's immense value was immediately recognized, and its use was restricted to wartime research and procedures. In 1945, it was released to the public and has since revolutio

  12. Vector analysis

    CERN Document Server

    Brand, Louis

    2006-01-01

    The use of vectors not only simplifies treatments of differential geometry, mechanics, hydrodynamics, and electrodynamics, but also makes mathematical and physical concepts more tangible and easy to grasp. This text for undergraduates was designed as a short introductory course to give students the tools of vector algebra and calculus, as well as a brief glimpse into these subjects' manifold applications. The applications are developed to the extent that the uses of the potential function, both scalar and vector, are fully illustrated. Moreover, the basic postulates of vector analysis are brou

  13. Understanding analysis

    CERN Document Server

    Abbott, Stephen

    2015-01-01

    This lively introductory text exposes the student to the rewards of a rigorous study of functions of a real variable. In each chapter, informal discussions of questions that give analysis its inherent fascination are followed by precise, but not overly formal, developments of the techniques needed to make sense of them. By focusing on the unifying themes of approximation and the resolution of paradoxes that arise in the transition from the finite to the infinite, the text turns what could be a daunting cascade of definitions and theorems into a coherent and engaging progression of ideas. Acutely aware of the need for rigor, the student is much better prepared to understand what constitutes a proper mathematical proof and how to write one. Fifteen years of classroom experience with the first edition of Understanding Analysis have solidified and refined the central narrative of the second edition. Roughly 150 new exercises join a selection of the best exercises from the first edition, and three more project-sty...

  14. Failure Analysis

    International Nuclear Information System (INIS)

    Iorio, A.F.; Crespi, J.C.

    1987-01-01

    After ten years of operation at the Atucha I Nuclear Power Station, a gear belonging to a pressurized heavy water reactor refuelling machine failed. The gear box was used to operate the inlet-outlet heavy-water valve of the machine. Visual examination of the gear device showed an absence of lubricant and that several gear teeth were broken at the root. Motion was transmitted with a speed-reducing device with controlled adjustable times in order to produce a proper fit of the valve closure. The aim of this paper is to discuss the results of the gear failure analysis in order to recommend the proper solution to prevent further failures. (Author)

  15. Matrix analysis

    CERN Document Server

    Bhatia, Rajendra

    1997-01-01

    A good part of matrix theory is functional analytic in spirit. This statement can be turned around. There are many problems in operator theory, where most of the complexities and subtleties are present in the finite-dimensional case. My purpose in writing this book is to present a systematic treatment of methods that are useful in the study of such problems. This book is intended for use as a text for upper division and gradu­ ate courses. Courses based on parts of the material have been given by me at the Indian Statistical Institute and at the University of Toronto (in collaboration with Chandler Davis). The book should also be useful as a reference for research workers in linear algebra, operator theory, mathe­ matical physics and numerical analysis. A possible subtitle of this book could be Matrix Inequalities. A reader who works through the book should expect to become proficient in the art of deriving such inequalities. Other authors have compared this art to that of cutting diamonds. One first has to...

  16. Functional analysis

    CERN Document Server

    Kesavan, S

    2009-01-01

    The material presented in this book is suited for a first course in Functional Analysis which can be followed by Masters students. While covering all the standard material expected of such a course, efforts have been made to illustrate the use of various theorems via examples taken from differential equations and the calculus of variations, either through brief sections or through exercises. In fact, this book will be particularly useful for students who would like to pursue a research career in the applications of mathematics. The book includes a chapter on weak and weak topologies and their applications to the notions of reflexivity, separability and uniform convexity. The chapter on the Lebesgue spaces also presents the theory of one of the simplest classes of Sobolev spaces. The book includes a chapter on compact operators and the spectral theory for compact self-adjoint operators on a Hilbert space. Each chapter has large collection of exercises at the end. These illustrate the results of the text, show ...

  17. Uncertainty analysis

    International Nuclear Information System (INIS)

    Thomas, R.E.

    1982-03-01

    An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software
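
    As an illustration of the Latin hypercube sampling mentioned above (not the report's own implementation), a minimal hand-rolled NumPy sketch: each input dimension is stratified into n equal-probability bins, and each bin is sampled exactly once.

    ```python
    import numpy as np

    def latin_hypercube(n, d, seed=None):
        """Return n Latin-hypercube samples in the unit hypercube [0, 1)^d."""
        rng = np.random.default_rng(seed)
        # One point per stratum in each dimension...
        samples = (rng.random((n, d)) + np.arange(n)[:, None]) / n
        # ...then shuffle the strata independently per dimension.
        for j in range(d):
            samples[:, j] = rng.permutation(samples[:, j])
        return samples

    pts = latin_hypercube(n=10, d=3, seed=42)
    print(pts.round(3))
    ```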

  18. Uncertainty analysis

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, R.E.

    1982-03-01

    An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software.

  19. Information security risk analysis

    CERN Document Server

    Peltier, Thomas R

    2001-01-01

    Contents: Effective Risk Analysis; Qualitative Risk Analysis; Value Analysis; Other Qualitative Methods; Facilitated Risk Analysis Process (FRAP); Other Uses of Qualitative Risk Analysis; Case Study; Appendix A: Questionnaire; Appendix B: Facilitated Risk Analysis Process Forms; Appendix C: Business Impact Analysis Forms; Appendix D: Sample of Report; Appendix E: Threat Definitions; Appendix F: Other Risk Analysis Opinions; Index

  20. Pyrotechnic Shock Analysis Using Statistical Energy Analysis

    Science.gov (United States)

    2015-10-23

    2013. 3. Lyon, Richard H., and DeJong, Richard G., “Theory and Application of Statistical Energy Analysis, 2nd Edition,” Butterworth-Heinemann, 1995... Dalton, Eric C., “Ballistic Shock Response Prediction through the Synergistic Use of Statistical Energy Analysis, Finite Element Analysis, and

  1. Safety analysis fundamentals

    International Nuclear Information System (INIS)

    Wright, A.C.D.

    2002-01-01

    This paper discusses the fundamentals of safety analysis in reactor design. The study includes safety analysis performed to show that the consequences of postulated accidents are acceptable. Safety analysis is also used to set the design of special safety systems and includes design-assist analysis to support conceptual design. Safety analysis is necessary for licensing a reactor, maintaining an operating license, and supporting changes in plant operations.

  2. Shape analysis in medical image analysis

    CERN Document Server

    Tavares, João

    2014-01-01

    This book contains thirteen contributions from invited experts of international recognition addressing important issues in shape analysis in medical image analysis, including techniques for image segmentation, registration, modelling and classification, and applications in biology, as well as in cardiac, brain, spine, chest, lung and clinical practice. This volume treats topics such as anatomic and functional shape representation and matching; shape-based medical image segmentation; shape registration; statistical shape analysis; shape deformation; shape-based abnormity detection; shape tracking and longitudinal shape analysis; machine learning for shape modeling and analysis; shape-based computer-aided-diagnosis; shape-based medical navigation; benchmark and validation of shape representation, analysis and modeling algorithms. This work will be of interest to researchers, students, and manufacturers in the fields of artificial intelligence, bioengineering, biomechanics, computational mechanics, computationa...

  3. Integrated Sensitivity Analysis Workflow

    Energy Technology Data Exchange (ETDEWEB)

    Friedman-Hill, Ernest J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hoffman, Edward L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gibson, Marcus J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Clay, Robert L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-08-01

    Sensitivity analysis is a crucial element of rigorous engineering analysis, but performing such an analysis on a complex model is difficult and time consuming. The mission of the DART Workbench team at Sandia National Laboratories is to lower the barriers to adoption of advanced analysis tools through software integration. The integrated environment guides the engineer in the use of these integrated tools and greatly reduces the cycle time for engineering analysis.

  4. K Basin Hazard Analysis

    International Nuclear Information System (INIS)

    PECH, S.H.

    2000-01-01

    This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Final Safety Analysis Report. This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report

  5. K Basin Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    PECH, S.H.

    2000-08-23

    This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Final Safety Analysis Report. This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.

  6. K Basins Hazard Analysis

    International Nuclear Information System (INIS)

    WEBB, R.H.

    1999-01-01

    This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Safety Analysis Report (HNF-SD-WM-SAR-062/Rev.4). This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report

  7. Qualitative Content Analysis

    OpenAIRE

    Philipp Mayring

    2000-01-01

    The article describes an approach of systematic, rule guided qualitative text analysis, which tries to preserve some methodological strengths of quantitative content analysis and widen them to a concept of qualitative procedure. First the development of content analysis is delineated and the basic principles are explained (units of analysis, step models, working with categories, validity and reliability). Then the central procedures of qualitative content analysis, inductive development of ca...

  8. Quantitative analysis chemistry

    International Nuclear Information System (INIS)

    Ko, Wansuk; Lee, Choongyoung; Jun, Kwangsik; Hwang, Taeksung

    1995-02-01

    This book is about quantitative analytical chemistry. It is divided into ten chapters, which deal with basic concepts of matter and the meaning of analytical chemistry together with SI units, chemical equilibrium, basic preparation for quantitative analysis, an introduction to volumetric analysis, an outline of acid-base titration with experimental examples, chelate titration, oxidation-reduction titration with an introduction, titration curves and diazotization titration, precipitation titration, electrometric titration, and quantitative analysis.

  9. Analysis of Precision of Activation Analysis Method

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Nørgaard, K.

    1973-01-01

    The precision of an activation-analysis method prescribes the estimation of the precision of a single analytical result. The adequacy of these estimates to account for the observed variation between duplicate results from the analysis of different samples and materials, is tested by the statistic T...

  10. Sentiment Analysis for Exploratory Data Analysis

    Directory of Open Access Journals (Sweden)

    Zoë Wilkinson Saldaña

    2018-01-01

    Full Text Available In this lesson you will learn to conduct 'sentiment analysis' on texts and to interpret the results. This is a form of exploratory data analysis based on natural language processing. You will learn to install all appropriate software and to build a reusable program that can be applied to your own texts.
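
    A minimal sketch of this kind of exploratory sentiment scoring, using NLTK's VADER analyzer on two hypothetical sentences (the lesson's own setup and texts may differ):

    ```python
    # Requires: pip install nltk
    import nltk
    from nltk.sentiment.vader import SentimentIntensityAnalyzer

    nltk.download("vader_lexicon", quiet=True)   # one-time lexicon download

    analyzer = SentimentIntensityAnalyzer()
    texts = [
        "The results were excellent and the team was delighted.",
        "The delays were frustrating and the report is full of errors.",
    ]
    for text in texts:
        scores = analyzer.polarity_scores(text)   # neg / neu / pos / compound
        print(f"{scores['compound']:+.2f}  {text}")
    ```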

  11. Hazard Analysis Database Report

    CERN Document Server

    Grams, W H

    2000-01-01

    The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for HNF-SD-WM-SAR-067, Tank Farms Final Safety Analysis Report (FSAR). The FSAR is part of the approved Authorization Basis (AB) for the River Protection Project (RPP). This document describes, identifies, and defines the contents and structure of the Tank Farms FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The Hazard Analysis Database supports the preparation of Chapters 3 ,4 , and 5 of the Tank Farms FSAR and the Unreviewed Safety Question (USQ) process and consists of two major, interrelated data sets: (1) Hazard Analysis Database: Data from t...

  12. Cluster analysis for applications

    CERN Document Server

    Anderberg, Michael R

    1973-01-01

    Cluster Analysis for Applications deals with methods and various applications of cluster analysis. Topics covered range from variables and scales to measures of association among variables and among data units. Conceptual problems in cluster analysis are discussed, along with hierarchical and non-hierarchical clustering methods. The necessary elements of data analysis, statistics, cluster analysis, and computer implementation are integrated vertically to cover the complete path from raw data to a finished analysis.Comprised of 10 chapters, this book begins with an introduction to the subject o

  13. Circuit analysis for dummies

    CERN Document Server

    Santiago, John

    2013-01-01

    Circuits overloaded from electric circuit analysis? Many universities require that students pursuing a degree in electrical or computer engineering take an Electric Circuit Analysis course to determine who will ""make the cut"" and continue in the degree program. Circuit Analysis For Dummies will help these students to better understand electric circuit analysis by presenting the information in an effective and straightforward manner. Circuit Analysis For Dummies gives you clear-cut information about the topics covered in an electric circuit analysis courses to help

  14. Canonical Information Analysis

    DEFF Research Database (Denmark)

    Vestergaard, Jacob Schack; Nielsen, Allan Aasbjerg

    2015-01-01

    Canonical correlation analysis is an established multivariate statistical method in which correlation between linear combinations of multivariate sets of variables is maximized. In canonical information analysis introduced here, linear correlation as a measure of association between variables is ...
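
    As a hedged illustration of classical canonical correlation analysis (the starting point for the canonical information analysis introduced in the record), the following scikit-learn sketch finds maximally correlated linear combinations of two synthetic variable sets; it does not implement the information-theoretic variant itself.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import CCA

    rng = np.random.default_rng(0)
    latent = rng.normal(size=(200, 1))                    # shared signal
    X = np.hstack([latent + 0.3 * rng.normal(size=(200, 1)) for _ in range(3)])
    Y = np.hstack([latent + 0.3 * rng.normal(size=(200, 1)) for _ in range(2)])

    cca = CCA(n_components=1).fit(X, Y)
    U, V = cca.transform(X, Y)                            # canonical variates
    r = np.corrcoef(U[:, 0], V[:, 0])[0, 1]               # canonical correlation
    print(f"first canonical correlation: {r:.3f}")
    ```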

  15. Introductory numerical analysis

    CERN Document Server

    Pettofrezzo, Anthony J

    2006-01-01

    Written for undergraduates who require a familiarity with the principles behind numerical analysis, this classical treatment encompasses finite differences, least squares theory, and harmonic analysis. Over 70 examples and 280 exercises. 1967 edition.

  16. Ecosystem Analysis Program

    International Nuclear Information System (INIS)

    Burgess, R.L.

    1978-01-01

    Progress is reported on the following research programs: analysis and modeling of ecosystems; EDFB/IBP data center; biome analysis studies; land/water interaction studies; and computer programs for development of models

  17. Importance-performance analysis based SWOT analysis

    OpenAIRE

    Phadermrod, Boonyarat; Crowder, Richard M.; Wills, Gary B.

    2016-01-01

    SWOT analysis, a commonly used tool for strategic planning, is traditionally a form of brainstorming. Hence, it has been criticised that it is likely to hold subjective views of the individuals who participate in a brainstorming session and that SWOT factors are not prioritized by their significance thus it may result in an improper strategic action. While most studies of SWOT analysis have only focused on solving these shortcomings separately, this study offers an approach to diminish both s...

  18. Computational movement analysis

    CERN Document Server

    Laube, Patrick

    2014-01-01

    This SpringerBrief discusses the characteristics of spatiotemporal movement data, including uncertainty and scale. It investigates three core aspects of Computational Movement Analysis: Conceptual modeling of movement and movement spaces, spatiotemporal analysis methods aiming at a better understanding of movement processes (with a focus on data mining for movement patterns), and using decentralized spatial computing methods in movement analysis. The author presents Computational Movement Analysis as an interdisciplinary umbrella for analyzing movement processes with methods from a range of fi

  19. Regression analysis by example

    CERN Document Server

    Chatterjee, Samprit

    2012-01-01

    Praise for the Fourth Edition: ""This book is . . . an excellent source of examples for regression analysis. It has been and still is readily readable and understandable."" -Journal of the American Statistical Association Regression analysis is a conceptually simple method for investigating relationships among variables. Carrying out a successful application of regression analysis, however, requires a balance of theoretical results, empirical rules, and subjective judgment. Regression Analysis by Example, Fifth Edition has been expanded

  20. Applied longitudinal analysis

    CERN Document Server

    Fitzmaurice, Garrett M; Ware, James H

    2012-01-01

    Praise for the First Edition "". . . [this book] should be on the shelf of everyone interested in . . . longitudinal data analysis.""-Journal of the American Statistical Association   Features newly developed topics and applications of the analysis of longitudinal data Applied Longitudinal Analysis, Second Edition presents modern methods for analyzing data from longitudinal studies and now features the latest state-of-the-art techniques. The book emphasizes practical, rather than theoretical, aspects of methods for the analysis of diverse types of lo

  1. Discourse analysis and Foucault's

    Directory of Open Access Journals (Sweden)

    Jansen I.

    2008-01-01

    Full Text Available Discourse analysis is a method which up to now has been less recognized in nursing science, although more recently nursing scientists have been discovering it for their purposes. However, several authors have criticized that discourse analysis is often misinterpreted because of a lack of understanding of its theoretical background. In this article, I reconstruct Foucault’s writings in his “Archaeology of Knowledge” to provide a theoretical base for future archaeological discourse analysis, which can be categorized as a socio-linguistic discourse analysis.

  2. Nonstandard Analysis and Constructivism!

    OpenAIRE

    Sanders, Sam

    2017-01-01

    Almost two decades ago, Wattenberg published a paper with the title 'Nonstandard Analysis and Constructivism?' in which he speculates on a possible connection between Nonstandard Analysis and constructive mathematics. We study Wattenberg's work in light of recent research on the aforementioned connection. On one hand, with only slight modification, some of Wattenberg's theorems in Nonstandard Analysis are seen to yield effective and constructive theorems (not involving Nonstandard Analysis). ...

  3. Analysis of extreme events

    CSIR Research Space (South Africa)

    Khuluse, S

    2009-04-01

    Full Text Available Presentation slides (outline: Background; Overview of the Theory of Extremes; Case Studies: Analysis of Extreme Rainfall Events and Analysis of Extreme Wave Heights; Concluding Remarks). Figure: map of South Africa with the study areas highlighted. Western Cape: climatologically diverse, reflecting the influence of the varied topography and its...

  4. Biological sequence analysis

    DEFF Research Database (Denmark)

    Durbin, Richard; Eddy, Sean; Krogh, Anders Stærmose

    This book provides an up-to-date and tutorial-level overview of sequence analysis methods, with particular emphasis on probabilistic modelling. Discussed methods include pairwise alignment, hidden Markov models, multiple alignment, profile searches, RNA secondary structure analysis, and phylogene...

  5. Principal component analysis

    NARCIS (Netherlands)

    Bro, R.; Smilde, A.K.

    2014-01-01

    Principal component analysis is one of the most important and powerful methods in chemometrics as well as in a wealth of other areas. This paper provides a description of how to understand, use, and interpret principal component analysis. The paper focuses on the use of principal component analysis
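
    A minimal, illustrative scikit-learn sketch of principal component analysis on synthetic data (standardize, fit, inspect explained variance and scores); it is not taken from the paper.

    ```python
    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(1)
    scores_true = rng.normal(size=(100, 2))               # two latent factors
    loadings = rng.normal(size=(2, 6))                    # six measured variables
    X = scores_true @ loadings + 0.1 * rng.normal(size=(100, 6))

    X_std = StandardScaler().fit_transform(X)             # mean-center and scale
    pca = PCA(n_components=2).fit(X_std)
    print("explained variance ratio:", pca.explained_variance_ratio_.round(3))
    scores = pca.transform(X_std)                         # sample scores
    print("scores shape:", scores.shape)
    ```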

  6. Charged particle activation analysis

    International Nuclear Information System (INIS)

    Peisach, M.

    1977-01-01

    The techniques of prompt and delayed activation analysis are outlined. Methods using cyclotron beams are suitable for delayed A.A., but prompt methods with relatively low energy beams serve a useful purpose for analysis of thin layers and surfaces. Multi-element analyses with prompt X-rays, generally applicable analysis by backscattering and specific analyses by nuclear reactions are described [af

  7. Critical Classroom Discourse Analysis.

    Science.gov (United States)

    Kumaravadivelu, B.

    1999-01-01

    Conceptualizes a framework for conducting critical classroom-discourse analysis. Critiques the scope and method of current models of classroom-interaction analysis and classroom-discourse analysis and advocates using poststructuralist and postcolonialist understandings of discourse to develop a critical framework for understanding what actually…

  8. Practical data analysis

    CERN Document Server

    Cuesta, Hector

    2013-01-01

    Each chapter of the book quickly introduces a key 'theme' of Data Analysis, before immersing you in the practical aspects of each theme. You'll learn quickly how to perform all aspects of Data Analysis. Practical Data Analysis is a book ideal for home and small business users who want to slice & dice the data they have on hand with minimum hassle.

  9. Data analysis for chemistry

    CERN Document Server

    Hibbert, DBrynn

    2005-01-01

    Based on D Brynn Hibbert's lectures on data analysis to undergraduates and graduate students, this book covers topics including measurements, means and confidence intervals, hypothesis testing, analysis of variance, and calibration models. It is meant as an entry-level book targeted at learning and teaching undergraduate data analysis.

  10. Bayesian Mediation Analysis

    Science.gov (United States)

    Yuan, Ying; MacKinnon, David P.

    2009-01-01

    In this article, we propose Bayesian analysis of mediation effects. Compared with conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian…

  11. Strategic Analysis Overview

    Science.gov (United States)

    Cirillo, William M.; Earle, Kevin D.; Goodliff, Kandyce E.; Reeves, J. D.; Stromgren, Chel; Andraschko, Mark R.; Merrill, R. Gabe

    2008-01-01

    NASA's Constellation Program employs a strategic analysis methodology to provide an integrated analysis capability for Lunar exploration scenarios and to support strategic decision-making regarding those scenarios. The strategic analysis methodology integrates the assessment of the major contributors to strategic objective satisfaction (performance, affordability, and risk) and captures the linkages and feedbacks between all three components. Strategic analysis supports strategic decision making by senior management through comparable analysis of alternative strategies, provision of a consistent set of high-level value metrics, and the enabling of cost-benefit analysis. The tools developed to implement the strategic analysis methodology are not element design and sizing tools. Rather, these models evaluate strategic performance using predefined elements, imported into a library from expert-driven design/sizing tools or expert analysis. Specific components of the strategic analysis tool set include scenario definition, requirements generation, mission manifesting, scenario lifecycle costing, crew time analysis, objective satisfaction benefit, risk analysis, and probabilistic evaluation. Results from all components of strategic analysis are evaluated against a set of pre-defined figures of merit (FOMs). These FOMs capture the high-level strategic characteristics of all scenarios and facilitate direct comparison of options. The strategic analysis methodology described in this paper has previously been applied to the Space Shuttle and International Space Station Programs and is now being used to support the development of the baseline Constellation Program lunar architecture. This paper presents an overview of the strategic analysis methodology and sample results from its application to the Constellation Program lunar architecture.

  12. Foundations of mathematical analysis

    CERN Document Server

    Johnsonbaugh, Richard

    2010-01-01

    This classroom-tested volume offers a definitive look at modern analysis, with views of applications to statistics, numerical analysis, Fourier series, differential equations, mathematical analysis, and functional analysis. Upper-level undergraduate students with a background in calculus will benefit from its teachings, along with beginning graduate students seeking a firm grounding in modern analysis. A self-contained text, it presents the necessary background on the limit concept, and the first seven chapters could constitute a one-semester introduction to limits. Subsequent chapters discuss

  13. Multivariate analysis with LISREL

    CERN Document Server

    Jöreskog, Karl G; Y Wallentin, Fan

    2016-01-01

    This book traces the theory and methodology of multivariate statistical analysis and shows how it can be conducted in practice using the LISREL computer program. It presents not only the typical uses of LISREL, such as confirmatory factor analysis and structural equation models, but also several other multivariate analysis topics, including regression (univariate, multivariate, censored, logistic, and probit), generalized linear models, multilevel analysis, and principal component analysis. It provides numerous examples from several disciplines and discusses and interprets the results, illustrated with sections of output from the LISREL program, in the context of the example. The book is intended for masters and PhD students and researchers in the social, behavioral, economic and many other sciences who require a basic understanding of multivariate statistical theory and methods for their analysis of multivariate data. It can also be used as a textbook on various topics of multivariate statistical analysis.

  14. http Log Analysis

    DEFF Research Database (Denmark)

    Bøving, Kristian Billeskov; Simonsen, Jesper

    2004-01-01

    This article documents how log analysis can inform qualitative studies concerning the usage of web-based information systems (WIS). No prior research has used http log files as data to study collaboration between multiple users in organisational settings. We investigate how to perform http log...... analysis; what http log analysis says about the nature of collaborative WIS use; and how results from http log analysis may support other data collection methods such as surveys, interviews, and observation. The analysis of log files initially lends itself to research designs, which serve to test...... hypotheses using a quantitative methodology. We show that http log analysis can also be valuable in qualitative research such as case studies. The results from http log analysis can be triangulated with other data sources and for example serve as a means of supporting the interpretation of interview data...

  15. The ATLAS Analysis Model

    CERN Multimedia

    Amir Farbin

    The ATLAS Analysis Model is a continually developing vision of how to reconcile physics analysis requirements with the ATLAS offline software and computing model constraints. In the past year this vision has influenced the evolution of the ATLAS Event Data Model, the Athena software framework, and physics analysis tools. These developments, along with the October Analysis Model Workshop and the planning for CSC analyses, have led to a rapid refinement of the ATLAS Analysis Model in the past few months. This article introduces some of the relevant issues and presents the current vision of the future ATLAS Analysis Model. Event Data Model: The ATLAS Event Data Model (EDM) consists of several levels of detail, each targeted for a specific set of tasks. For example, the Event Summary Data (ESD) stores calorimeter cells and tracking system hits, thereby permitting many calibration and alignment tasks, but will only be accessible at particular computing sites with potentially large latency. In contrast, the Analysis...

  16. Social Set Analysis

    DEFF Research Database (Denmark)

    Vatrapu, Ravi; Mukkamala, Raghava Rao; Hussain, Abid

    2016-01-01

    Current analytical approaches in computational social science can be characterized by four dominant paradigms: text analysis (information extraction and classification), social network analysis (graph theory), social complexity analysis (complex systems science), and social simulations (cellular...... this limitation, based on the sociology of associations and the mathematics of set theory, this paper presents a new approach to big data analytics called social set analysis. Social set analysis consists of a generative framework for the philosophies of computational social science, theory of social data...... analysis, crisp set-theoretical interaction analysis, and event-studies-oriented set-theoretical visualizations. Implications for big data analytics, current limitations of the set-theoretical approach, and future directions are outlined....

  17. Cost benefit analysis cost effectiveness analysis

    International Nuclear Information System (INIS)

    Lombard, J.

    1986-09-01

    The comparison of various protection options in order to determine which is the best compromise between the cost of protection and the residual risk is the purpose of the ALARA procedure. The use of decision-aiding techniques is valuable as an aid to selection procedures. The purpose of this study is to introduce two rather simple and well-known decision-aiding techniques: cost-effectiveness analysis and cost-benefit analysis. These two techniques are relevant to the greater part of ALARA decisions, which require the use of a quantitative technique. The study is based on a hypothetical case of 10 protection options. Four methods are applied to the data

  18. NASA trend analysis procedures

    Science.gov (United States)

    1993-01-01

    This publication is primarily intended for use by NASA personnel engaged in managing or implementing trend analysis programs. 'Trend analysis' refers to the observation of current activity in the context of the past in order to infer the expected level of future activity. NASA trend analysis was divided into 5 categories: problem, performance, supportability, programmatic, and reliability. Problem trend analysis uncovers multiple occurrences of historical hardware or software problems or failures in order to focus future corrective action. Performance trend analysis observes changing levels of real-time or historical flight vehicle performance parameters such as temperatures, pressures, and flow rates as compared to specification or 'safe' limits. Supportability trend analysis assesses the adequacy of the spaceflight logistics system; example indicators are repair-turn-around time and parts stockage levels. Programmatic trend analysis uses quantitative indicators to evaluate the 'health' of NASA programs of all types. Finally, reliability trend analysis attempts to evaluate the growth of system reliability based on a decreasing rate of occurrence of hardware problems over time. Procedures for conducting all five types of trend analysis are provided in this publication, prepared through the joint efforts of the NASA Trend Analysis Working Group.

  19. Fundamentals of functional analysis

    CERN Document Server

    Farenick, Douglas

    2016-01-01

    This book provides a unique path for graduate or advanced undergraduate students to begin studying the rich subject of functional analysis with fewer prerequisites than is normally required. The text begins with a self-contained and highly efficient introduction to topology and measure theory, which focuses on the essential notions required for the study of functional analysis, and which are often buried within full-length overviews of the subjects. This is particularly useful for those in applied mathematics, engineering, or physics who need to have a firm grasp of functional analysis, but not necessarily some of the more abstruse aspects of topology and measure theory normally encountered. The reader is assumed to only have knowledge of basic real analysis, complex analysis, and algebra. The latter part of the text provides an outstanding treatment of Banach space theory and operator theory, covering topics not usually found together in other books on functional analysis. Written in a clear, concise manner,...

  20. Data analysis in astronomy

    International Nuclear Information System (INIS)

    Di Gesu, V.; Crane, P.; Friedman, J.H.; Levialdi, S.; Scarsi, L.

    1985-01-01

    This book presents information on the following topics: the data analysis facilities that astronomers want; time analysis in astronomy; tools for periodicity searches; graphical methods of exploratory data analysis; multivariate statistics to analyze extraterrestrial particles from the ocean floor; application of bootstrap sampling in gamma-ray astronomy; an automated method for velocity field analysis; panel discussion on data analysis trends in x-ray and gamma-ray astronomy; the Groningen image processing system; astronomical input to image processing - astronomical output from image processing; 2-D photometry; spectrometry; time dependent analysis; solar image processing with the Clark Lake Radioheliograph; steps toward parallel processing; new architectures for image processing; data structures and languages in support of parallel image processing for astronomy; and morphology and probability in image processing

  1. Computational Music Analysis

    DEFF Research Database (Denmark)

    This book provides an in-depth introduction and overview of current research in computational music analysis. Its seventeen chapters, written by leading researchers, collectively represent the diversity as well as the technical and philosophical sophistication of the work being done today...... on well-established theories in music theory and analysis, such as Forte's pitch-class set theory, Schenkerian analysis, the methods of semiotic analysis developed by Ruwet and Nattiez, and Lerdahl and Jackendoff's Generative Theory of Tonal Music. The book is divided into six parts, covering...... music analysis, the book provides an invaluable resource for researchers, teachers and students in music theory and analysis, computer science, music information retrieval and related disciplines. It also provides a state-of-the-art reference for practitioners in the music technology industry....

  2. Sparse Exploratory Factor Analysis.

    Science.gov (United States)

    Trendafilov, Nickolay T; Fontanella, Sara; Adachi, Kohei

    2017-07-13

    Sparse principal component analysis is a very active research area in the last decade. It produces component loadings with many zero entries which facilitates their interpretation and helps avoid redundant variables. The classic factor analysis is another popular dimension reduction technique which shares similar interpretation problems and could greatly benefit from sparse solutions. Unfortunately, there are very few works considering sparse versions of the classic factor analysis. Our goal is to contribute further in this direction. We revisit the most popular procedures for exploratory factor analysis, maximum likelihood and least squares. Sparse factor loadings are obtained for them by, first, adopting a special reparameterization and, second, by introducing additional [Formula: see text]-norm penalties into the standard factor analysis problems. As a result, we propose sparse versions of the major factor analysis procedures. We illustrate the developed algorithms on well-known psychometric problems. Our sparse solutions are critically compared to ones obtained by other existing methods.
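
    The authors' reparameterized maximum-likelihood and least-squares algorithms are not reproduced here; as a rough illustration of how an l1-type penalty drives loading entries to zero, the following scikit-learn SparsePCA sketch recovers sparse components from synthetic data with a known sparse loading structure.

    ```python
    import numpy as np
    from sklearn.decomposition import SparsePCA

    rng = np.random.default_rng(0)
    F = rng.normal(size=(300, 2))                      # two latent factors
    L = np.zeros((2, 8))
    L[0, :4] = rng.normal(size=4)                      # factor 1 loads on vars 0-3
    L[1, 4:] = rng.normal(size=4)                      # factor 2 loads on vars 4-7
    X = F @ L + 0.1 * rng.normal(size=(300, 8))

    model = SparsePCA(n_components=2, alpha=1.0, random_state=0).fit(X)
    print(np.round(model.components_, 2))              # many entries driven to zero
    ```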

  3. Analysis apparatus and method of analysis

    International Nuclear Information System (INIS)

    1976-01-01

    A continuous streaming method developed for the execution of immunoassays is described in this patent. In addition, a suitable apparatus for the method was developed whereby magnetic particles are automatically employed for the consecutive analysis of a series of liquid samples via the RIA technique

  4. Static Analysis Numerical Algorithms

    Science.gov (United States)

    2016-04-01

    Static Analysis of Numerical Algorithms. Kestrel Technology, LLC, April 2016, final technical report (approved for public release; distribution...). Dates covered: Nov 2013 to Nov 2015. Contract number: FA8750-14-C... ...and Honeywell Aerospace Advanced Technology to combine model-based development of complex avionics control software with static analysis of the

  5. Emission spectrochemical analysis

    International Nuclear Information System (INIS)

    Rives, R.D.; Bruks, R.R.

    1983-01-01

    The emission spectrochemical method of analysis, based on the fact that atoms of elements can be excited in an electric arc or in a laser beam and will emit radiation with characteristic wavelengths, is considered. The review contains data on the spectrochemical analysis of liquids and geological materials, and a scheme of a laser microprobe. The main characteristics of emission spectroscopy, atomic absorption spectroscopy, and X-ray fluorescence analysis are generalized

  6. Mastering Clojure data analysis

    CERN Document Server

    Rochester, Eric

    2014-01-01

    This book consists of a practical, example-oriented approach that aims to help you learn how to use Clojure for data analysis quickly and efficiently.This book is great for those who have experience with Clojure and who need to use it to perform data analysis. This book will also be hugely beneficial for readers with basic experience in data analysis and statistics.

  7. The ATLAS Analysis Architecture

    International Nuclear Information System (INIS)

    Cranmer, K.S.

    2008-01-01

    We present an overview of the ATLAS analysis architecture including the relevant aspects of the computing model and the major architectural aspects of the Athena framework. Emphasis will be given to the interplay between the analysis use cases and the technical aspects of the architecture including the design of the event data model, transient-persistent separation, data reduction strategies, analysis tools, and ROOT interoperability

  8. Stochastic Analysis 2010

    CERN Document Server

    Crisan, Dan

    2011-01-01

    "Stochastic Analysis" aims to provide mathematical tools to describe and model high dimensional random systems. Such tools arise in the study of Stochastic Differential Equations and Stochastic Partial Differential Equations, Infinite Dimensional Stochastic Geometry, Random Media and Interacting Particle Systems, Super-processes, Stochastic Filtering, Mathematical Finance, etc. Stochastic Analysis has emerged as a core area of late 20th century Mathematics and is currently undergoing a rapid scientific development. The special volume "Stochastic Analysis 2010" provides a sa

  9. Introduction to analysis

    CERN Document Server

    Gaughan, Edward D

    2009-01-01

    Introduction to Analysis is designed to bridge the gap between the intuitive calculus usually offered at the undergraduate level and the sophisticated analysis courses the student encounters at the graduate level. In this book the student is given the vocabulary and facts necessary for further study in analysis. The course for which it is designed is usually offered at the junior level, and it is assumed that the student has little or no previous experience with proofs in analysis. A considerable amount of time is spent motivating the theorems and proofs and developing the reader's intuition.

  10. Space Weather Analysis

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Space Weather Analysis archives are model output of ionospheric, thermospheric and magnetospheric particle populations, energies and electrodynamics

  11. Textile Technology Analysis Lab

    Data.gov (United States)

    Federal Laboratory Consortium — The Textile Analysis Lab is built for evaluating and characterizing the physical properties of an array of textile materials, but specifically those used in aircrew...

  12. Risk analysis methodology survey

    Science.gov (United States)

    Batson, Robert G.

    1987-01-01

    NASA regulations require that formal risk analysis be performed on a program at each of several milestones as it moves toward full-scale development. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from simple to complex network-based simulation were surveyed. A Program Risk Analysis Handbook was prepared in order to provide both analyst and manager with a guide for selection of the most appropriate technique.

  13. Thermogravimetric Analysis Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — At NETL’s Thermogravimetric Analysis Laboratory in Morgantown, WV, researchers study how chemical looping combustion (CLC) can be applied to fossil energy systems....

  14. Sensitivity and uncertainty analysis

    CERN Document Server

    Cacuci, Dan G; Navon, Ionel Michael

    2005-01-01

    As computer-assisted modeling and analysis of physical processes have continued to grow and diversify, sensitivity and uncertainty analyses have become indispensable scientific tools. Sensitivity and Uncertainty Analysis. Volume I: Theory focused on the mathematical underpinnings of two important methods for such analyses: the Adjoint Sensitivity Analysis Procedure and the Global Adjoint Sensitivity Analysis Procedure. This volume concentrates on the practical aspects of performing these analyses for large-scale systems. The applications addressed include two-phase flow problems, a radiative c

  15. International Market Analysis

    DEFF Research Database (Denmark)

    Sørensen, Olav Jull

    2009-01-01

    The review presents the book International Market Analysis: Theories and Methods, written by John Kuada, professor at the Centre of International Business, Department of Business Studies, Aalborg University. The book is refreshingly new in its way of looking at a classical problem. It looks at market analysis from the point of view of ways of thinking about markets. Furthermore, the book includes the concept of learning in the analysis of markets and how the way we understand business reality influences our choice of methodology for market analysis.

  16. Machine Fault Signature Analysis

    Directory of Open Access Journals (Sweden)

    Pratesh Jayaswal

    2008-01-01

    Full Text Available The objective of this paper is to present recent developments in the field of machine fault signature analysis with particular regard to vibration analysis. The different types of faults that can be identified from the vibration signature analysis are, for example, gear fault, rolling contact bearing fault, journal bearing fault, flexible coupling faults, and electrical machine fault. It is not the intention of the authors to attempt to provide a detailed coverage of all the faults while detailed consideration is given to the subject of the rolling element bearing fault signature analysis.

  17. Analysis of food contaminants

    National Research Council Canada - National Science Library

    Gilbert, John

    1984-01-01

    ... quantification methods used in the analysis of mycotoxins in foods - Confirmation and quantification of trace organic food contaminants by mass spectrometry-selected ion monitoring - Chemiluminescence...

  18. Geospatial Data Analysis Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Geospatial application development, location-based services, spatial modeling, and spatial analysis are examples of the many research applications that this facility...

  19. Gabor Analysis for Imaging

    DEFF Research Database (Denmark)

    Christensen, Ole; Feichtinger, Hans G.; Paukner, Stephan

    2015-01-01

    In contrast to classical Fourier analysis, time–frequency analysis is concerned with localized Fourier transforms. Gabor analysis is an important branch of time–frequency analysis. Although significantly different, it shares with the wavelet transform methods the ability to describe the smoothness......, it characterizes a function by its transform over phase space, which is the time–frequency plane (TF-plane) in a musical context or the location–wave-number domain in the context of image processing. Since the transition from the signal domain to the phase space domain introduces an enormous amount of data...
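
    As a hedged illustration of the localized Fourier transforms referred to above, the following SciPy sketch computes a short-time Fourier transform of a synthetic signal whose frequency content changes over time; it is a generic STFT, not the chapter's Gabor frame machinery.

    ```python
    import numpy as np
    from scipy.signal import stft

    fs = 1000.0                                        # sampling rate (Hz)
    t = np.arange(0, 2.0, 1 / fs)
    x = np.where(t < 1.0,
                 np.sin(2 * np.pi * 50 * t),           # 50 Hz in the first second
                 np.sin(2 * np.pi * 200 * t))          # 200 Hz in the second second

    f, tau, Z = stft(x, fs=fs, nperseg=256)            # windowed FFTs over the signal
    print(Z.shape)                                     # (frequencies, time frames)
    # |Z[i, j]| is the energy near frequency f[i] at time tau[j]: the phase-space
    # (time-frequency plane) picture referred to above.
    ```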

  20. Circuit analysis with Multisim

    CERN Document Server

    Baez-Lopez, David

    2011-01-01

    This book is concerned with circuit simulation using National Instruments Multisim. It focuses on the use and comprehension of the working techniques for electrical and electronic circuit simulation. The first chapters are devoted to basic circuit analysis. It starts by describing in detail how to perform a DC analysis using only resistors and independent and controlled sources. Then, it introduces capacitors and inductors to make a transient analysis. In the case of transient analysis, it is possible to have an initial condition either in the capacitor voltage or in the inductor current, or bo

  1. Chemical Analysis Facility

    Data.gov (United States)

    Federal Laboratory Consortium — FUNCTION: Uses state-of-the-art instrumentation for qualitative and quantitative analysis of organic and inorganic compounds, and biomolecules from gas, liquid, and...

  2. Chemical Security Analysis Center

    Data.gov (United States)

    Federal Laboratory Consortium — In 2006, by Presidential Directive, DHS established the Chemical Security Analysis Center (CSAC) to identify and assess chemical threats and vulnerabilities in the...

  3. Risk Analysis of Marine Structures

    DEFF Research Database (Denmark)

    Hansen, Peter Friis

    1998-01-01

    Basic concepts of risk analysis are introduced. Formulation and analysis of fault and event trees are treated.

  4. Electric field analysis

    CERN Document Server

    Chakravorti, Sivaji

    2015-01-01

    This book prepares newcomers to dive into the realm of electric field analysis. The book details why one should perform electric field analysis and what are its practical implications. It emphasizes both the fundamentals and modern computational methods of electric machines. The book covers practical applications of the numerical methods in high voltage equipment, including transmission lines, power transformers, cables, and gas insulated systems.

  5. SWOT ANALYSIS - CHINESE PETROLEUM

    Directory of Open Access Journals (Sweden)

    Chunlan Wang

    2014-01-01

    Full Text Available This article, written in early December 2013, presents a SWOT analysis of Chinese Petroleum that combines its historical development with the latest data. The paper conducts a comprehensive and profound analysis of corporate resources, cost and management, as well as external factors such as the political environment and market supply and demand.

  6. Numerical Limit Analysis:

    DEFF Research Database (Denmark)

    Damkilde, Lars

    2007-01-01

    Limit State analysis has a long history and many prominent researchers have contributed. The theoretical foundation is based on the upper- and lower-bound theorems which give a very comprehensive and elegant formulation on complicated physical problems. In the pre-computer age Limit State analysis...... also enabled engineers to solve practical problems within reinforced concrete, steel structures and geotechnics....

  7. Longitudinal Meta-analysis

    NARCIS (Netherlands)

    Hox, J.J.; Maas, C.J.M.; Lensvelt-Mulders, G.J.L.M.

    2004-01-01

    The goal of meta-analysis is to integrate the research results of a number of studies on a specific topic. A characteristic of meta-analysis is that, in general, only the summary statistics of the studies are used and not the original data. When the published research results to be integrated

  8. Spool assembly support analysis

    International Nuclear Information System (INIS)

    Norman, B.F.

    1994-01-01

    This document provides the wind/seismic analysis and evaluation for the pump pit spool assemblies. Hand calculations were used for the analysis. UBC, AISC, and load factors were used in this evaluation. The results show that the actual loads are under the allowable loads and all requirements are met

  9. Northern blotting analysis

    DEFF Research Database (Denmark)

    Josefsen, Knud; Nielsen, Henrik

    2011-01-01

    Northern blotting analysis is a classical method for analysis of the size and steady-state level of a specific RNA in a complex sample. In short, the RNA is size-fractionated by gel electrophoresis and transferred by blotting onto a membrane to which the RNA is covalently bound. Then, the membrane...... the gap to the more laborious nuclease protection experiments....

  10. Beyond sensitivity analysis

    DEFF Research Database (Denmark)

    Lund, Henrik; Sorknæs, Peter; Mathiesen, Brian Vad

    2018-01-01

    point of view, the typical way of handling this challenge has been to predict future prices as accurately as possible and then conduct a sensitivity analysis. This paper includes a historical analysis of such predictions, leading to the conclusion that they are almost always wrong. Not only...

  11. Enabling interdisciplinary analysis

    Science.gov (United States)

    L. M. Reid

    1996-01-01

    New requirements for evaluating environmental conditions in the Pacific Northwest have led to increased demands for interdisciplinary analysis of complex environmental problems. Procedures for watershed analysis have been developed for use on public and private lands in Washington State (Washington Forest Practices Board 1993) and for federal lands in the Pacific...

  12. Spectroscopic analysis and control

    Energy Technology Data Exchange (ETDEWEB)

    Tate; , James D.; Reed, Christopher J.; Domke, Christopher H.; Le, Linh; Seasholtz, Mary Beth; Weber, Andy; Lipp, Charles

    2017-04-18

    Apparatus for spectroscopic analysis which includes a tunable diode laser spectrometer having a digital output signal and a digital computer for receiving the digital output signal from the spectrometer, the digital computer programmed to process the digital output signal using a multivariate regression algorithm. In addition, a spectroscopic method of analysis using such apparatus. Finally, a method for controlling an ethylene cracker hydrogenator.
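
    The patent abstract describes processing digital spectrometer output with a multivariate regression algorithm. The sketch below shows the general idea using synthetic absorbance spectra and an ordinary least-squares calibration; the data, variable names, and the choice of plain least squares are assumptions for illustration, not the patented method (spectroscopic calibrations often use partial least squares instead).

```python
# Minimal sketch: calibrate a multivariate regression that maps a measured
# spectrum (absorbance at many wavelength channels) to an analyte concentration.
# Synthetic data; real calibrations typically use PLS to handle correlated channels.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_channels = 40, 200

true_spectrum = np.exp(-0.5 * ((np.arange(n_channels) - 80) / 12) ** 2)  # one absorption peak
concentrations = rng.uniform(0.1, 2.0, n_samples)
spectra = np.outer(concentrations, true_spectrum) + 0.01 * rng.standard_normal((n_samples, n_channels))

# Calibration: find coefficients b such that [spectrum, 1] @ b approximates concentration.
X = np.hstack([spectra, np.ones((n_samples, 1))])   # add an intercept column
b, *_ = np.linalg.lstsq(X, concentrations, rcond=None)

# Prediction for a new measurement
new_spectrum = 1.3 * true_spectrum + 0.01 * rng.standard_normal(n_channels)
predicted = np.append(new_spectrum, 1.0) @ b
print(f"predicted concentration: {predicted:.2f}")
```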

  13. Activation analysis. Detection limits

    International Nuclear Information System (INIS)

    Revel, G.

    1999-01-01

    Numerical data and limits of detection related to the four irradiation modes often used in activation analysis (reactor neutrons, 14 MeV neutrons, gamma photons and charged particles) are presented here. The technical presentation of activation analysis is detailed in paper P 2565 of Techniques de l'Ingenieur. (A.L.B.)

  14. Advanced Analysis Environments - Summary

    International Nuclear Information System (INIS)

    Panacek, Suzanne

    2001-01-01

    This is a summary of the panel discussion on Advanced Analysis Environments. Rene Brun, Tony Johnson, and Lassi Tuura shared their insights about the trends and challenges in analysis environments. This paper contains the initial questions, a summary of the speakers' presentation, and the questions asked by the audience

  15. Analysis of Design Documentation

    DEFF Research Database (Denmark)

    Hansen, Claus Thorp

    1998-01-01

    has been established where we seek to identify useful design work patterns by retrospective analyses of documentation created during design projects. This paper describes the analysis method, a tentatively defined metric to evaluate identified work patterns, and presents results from the first...... analysis accomplished....

  16. FOOD RISK ANALYSIS

    Science.gov (United States)

    Food risk analysis is a holistic approach to food safety because it considers all aspects of the problem. Risk assessment modeling is the foundation of food risk analysis. Proper design and simulation of the risk assessment model is important to properly predict and control risk. Because of knowl...

  17. Marketing research cluster analysis

    Directory of Open Access Journals (Sweden)

    Marić Nebojša

    2002-01-01

    Full Text Available One area of applications of cluster analysis in marketing is identification of groups of cities and towns with similar demographic profiles. This paper considers main aspects of cluster analysis by an example of clustering 12 cities with the use of Minitab software.
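
    The record describes grouping 12 cities by demographic profile using Minitab. A minimal Python sketch of the same kind of hierarchical clustering is given below; the city profiles, the three demographic variables, and the choice of Ward linkage are invented for illustration and are not taken from the paper.

```python
# Minimal sketch: hierarchical clustering of 12 cities by demographic profile.
# The figures are made up; columns are population (thousands), median age,
# and mean household income (thousands).
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

profiles = np.array([
    [120, 34, 48], [ 95, 36, 45], [310, 31, 55], [280, 30, 57],
    [ 60, 42, 40], [ 75, 41, 39], [500, 29, 62], [450, 28, 60],
    [150, 35, 47], [ 85, 40, 41], [330, 32, 54], [110, 37, 44],
])

# Standardize each variable so no single scale dominates the distances.
z = (profiles - profiles.mean(axis=0)) / profiles.std(axis=0)

# Ward linkage on the standardized profiles, cut into 3 clusters.
tree = linkage(z, method="ward")
labels = fcluster(tree, t=3, criterion="maxclust")
print(labels)  # cluster membership for each of the 12 cities
```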

  18. Interaction Analysis and Supervision.

    Science.gov (United States)

    Amidon, Edmund

    This paper describes a model that uses interaction analysis as a tool to provide feedback to a teacher in a microteaching situation. The author explains how interaction analysis can be used for teacher improvement, describes the category system used in the model, the data collection methods used, and the feedback techniques found in the model. (JF)

  19. Zen and Behavior Analysis

    Science.gov (United States)

    Bass, Roger

    2010-01-01

    Zen's challenge for behavior analysis is to explain a repertoire that renders analysis itself meaningless--a result following not from scientific or philosophical arguments but rather from a unique verbal history generated by Zen's methods. Untying Zen's verbal knots suggests how meditation's and koans' effects on verbal behavior contribute to…

  20. Statistical data analysis

    International Nuclear Information System (INIS)

    Hahn, A.A.

    1994-11-01

    The complexity of instrumentation sometimes requires data analysis to be done before the result is presented to the control room. This tutorial reviews some of the theoretical assumptions underlying the more popular forms of data analysis and presents simple examples to illuminate the advantages and hazards of different techniques

  1. Qualitative Content Analysis

    OpenAIRE

    Satu Elo; Maria Kääriäinen; Outi Kanste; Tarja Pölkki; Kati Utriainen; Helvi Kyngäs

    2014-01-01

    Qualitative content analysis is commonly used for analyzing qualitative data. However, few articles have examined the trustworthiness of its use in nursing science studies. The trustworthiness of qualitative content analysis is often presented by using terms such as credibility, dependability, confirmability, transferability, and authenticity. This article focuses on trustworthiness based on a review of previous studie...

  2. Qualitative Content Analysis

    Directory of Open Access Journals (Sweden)

    Satu Elo

    2014-02-01

    Full Text Available Qualitative content analysis is commonly used for analyzing qualitative data. However, few articles have examined the trustworthiness of its use in nursing science studies. The trustworthiness of qualitative content analysis is often presented by using terms such as credibility, dependability, confirmability, transferability, and authenticity. This article focuses on trustworthiness based on a review of previous studies, our own experiences, and methodological textbooks. Trustworthiness was described for the main qualitative content analysis phases from data collection to reporting of the results. We concluded that it is important to scrutinize the trustworthiness of every phase of the analysis process, including the preparation, organization, and reporting of results. Together, these phases should give a reader a clear indication of the overall trustworthiness of the study. Based on our findings, we compiled a checklist for researchers attempting to improve the trustworthiness of a content analysis study. The discussion in this article helps to clarify how content analysis should be reported in a valid and understandable manner, which would be of particular benefit to reviewers of scientific articles. Furthermore, we discuss that it is often difficult to evaluate the trustworthiness of qualitative content analysis studies because of defective data collection method description and/or analysis description.

  3. Interactive Controls Analysis (INCA)

    Science.gov (United States)

    Bauer, Frank H.

    1989-01-01

    Version 3.12 of INCA provides user-friendly environment for design and analysis of linear control systems. System configuration and parameters easily adjusted, enabling INCA user to create compensation networks and perform sensitivity analysis in convenient manner. Full complement of graphical routines makes output easy to understand. Written in Pascal and FORTRAN.

  4. Towards Cognitive Component Analysis

    DEFF Research Database (Denmark)

    Hansen, Lars Kai; Ahrendt, Peter; Larsen, Jan

    2005-01-01

    Cognitive component analysis (COCA) is here defined as the process of unsupervised grouping of data such that the ensuing group structure is well-aligned with that resulting from human cognitive activity. We have earlier demonstrated that independent components analysis is relevant for representi...
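
    Since the abstract relates cognitive component analysis to independent component analysis, the following minimal Python sketch shows ICA recovering two synthetic sources from their observed mixtures; the signals and mixing matrix are illustrative assumptions, not data from the paper.

```python
# Minimal sketch: independent component analysis on two synthetic mixed signals.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)
t = np.linspace(0, 8, 2000)
s1 = np.sin(2 * t)                      # source 1: sinusoid
s2 = np.sign(np.sin(3 * t))             # source 2: square wave
S = np.c_[s1, s2] + 0.02 * rng.standard_normal((2000, 2))  # sources plus a little noise

A = np.array([[1.0, 0.5], [0.4, 1.0]])  # mixing matrix
X = S @ A.T                             # observed mixtures

ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(X)        # estimated independent components
print(recovered.shape)
```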

  5. Northern blotting analysis

    DEFF Research Database (Denmark)

    Josefsen, Knud; Nielsen, Henrik

    2011-01-01

    Northern blotting analysis is a classical method for analysis of the size and steady-state level of a specific RNA in a complex sample. In short, the RNA is size-fractionated by gel electrophoresis and transferred by blotting onto a membrane to which the RNA is covalently bound. Then, the membran...

  6. Proximate Analysis of Coal

    Science.gov (United States)

    Donahue, Craig J.; Rais, Elizabeth A.

    2009-01-01

    This lab experiment illustrates the use of thermogravimetric analysis (TGA) to perform proximate analysis on a series of coal samples of different rank. Peat and coke are also examined. A total of four exercises are described. These are dry exercises as students interpret previously recorded scans. The weight percent moisture, volatile matter,…
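
    For readers unfamiliar with how proximate-analysis quantities are read off a TGA scan, the sketch below computes weight percent moisture, volatile matter, ash, and fixed carbon (by difference) from plateau masses; the numbers are invented placeholders, not values from the described exercises.

```python
# Minimal sketch of proximate analysis from TGA plateau masses (invented numbers).
initial_mass = 20.00             # mg, as-received sample
after_drying = 18.90             # mg, plateau after heating to ~110 °C in N2
after_devolatilization = 12.40   # mg, plateau after ~950 °C in N2
ash_residue = 2.10               # mg, residue after switching to air (combustion)

moisture = 100 * (initial_mass - after_drying) / initial_mass
volatile_matter = 100 * (after_drying - after_devolatilization) / initial_mass
ash = 100 * ash_residue / initial_mass
fixed_carbon = 100 - moisture - volatile_matter - ash   # by difference

print(f"moisture {moisture:.1f}%, volatile matter {volatile_matter:.1f}%, "
      f"ash {ash:.1f}%, fixed carbon {fixed_carbon:.1f}%")
```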

  7. Systems engineering and analysis

    CERN Document Server

    Blanchard, Benjamin S

    2010-01-01

    For senior-level undergraduate and first and second year graduate systems engineering and related courses. A total life-cycle approach to systems and their analysis. This practical introduction to systems engineering and analysis provides the concepts, methodologies, models, and tools needed to understand and implement a total life-cycle approach to systems and their analysis. The authors focus first on the process of bringing systems into being--beginning with the identification of a need and extending that need through requirements determination, functional analysis and allocation, design synthesis, evaluation, and validation, operation and support, phase-out, and disposal. Next, the authors discuss the improvement of systems currently in being, showing that by employing the iterative process of analysis, evaluation, feedback, and modification, most systems in existence can be improved in their affordability, effectiveness, and stakeholder satisfaction.

  8. Robust multivariate analysis

    CERN Document Server

    J Olive, David

    2017-01-01

    This text presents methods that are robust to the assumption of a multivariate normal distribution or methods that are robust to certain types of outliers. Instead of using exact theory based on the multivariate normal distribution, the simpler and more applicable large sample theory is given.  The text develops among the first practical robust regression and robust multivariate location and dispersion estimators backed by theory.   The robust techniques  are illustrated for methods such as principal component analysis, canonical correlation analysis, and factor analysis.  A simple way to bootstrap confidence regions is also provided. Much of the research on robust multivariate analysis in this book is being published for the first time. The text is suitable for a first course in Multivariate Statistical Analysis or a first course in Robust Statistics. This graduate text is also useful for people who are familiar with the traditional multivariate topics, but want to know more about handling data sets with...

  9. Essential real analysis

    CERN Document Server

    Field, Michael

    2017-01-01

    This book provides a rigorous introduction to the techniques and results of real analysis, metric spaces and multivariate differentiation, suitable for undergraduate courses. Starting from the very foundations of analysis, it offers a complete first course in real analysis, including topics rarely found in such detail in an undergraduate textbook such as the construction of non-analytic smooth functions, applications of the Euler-Maclaurin formula to estimates, and fractal geometry.  Drawing on the author’s extensive teaching and research experience, the exposition is guided by carefully chosen examples and counter-examples, with the emphasis placed on the key ideas underlying the theory. Much of the content is informed by its applicability: Fourier analysis is developed to the point where it can be rigorously applied to partial differential equations or computation, and the theory of metric spaces includes applications to ordinary differential equations and fractals. Essential Real Analysis will appeal t...

  10. From analysis to surface

    DEFF Research Database (Denmark)

    Bemman, Brian; Meredith, David

    In recent years, a significant body of research has focused on developing algorithms for computing analyses of musical works automatically from encodings of these works' surfaces [3,4,7,10,11]. The quality of the output of such analysis algorithms is typically evaluated by comparing it with a “...... an effective (i.e., computable), correct and complete description of some aspect of the structure of the music. Generating the surface structure of a piece from an analysis in this manner serves as a proof of the analysis' correctness, effectiveness and completeness. We present a reductive analysis...

  11. Functional data analysis

    CERN Document Server

    Ramsay, J O

    1997-01-01

    Scientists today collect samples of curves and other functional observations. This monograph presents many ideas and techniques for such data. Included are expressions in the functional domain of such classics as linear regression, principal components analysis, linear modelling, and canonical correlation analysis, as well as specifically functional techniques such as curve registration and principal differential analysis. Data arising in real applications are used throughout for both motivation and illustration, showing how functional approaches allow us to see new things, especially by exploiting the smoothness of the processes generating the data. The data sets exemplify the wide scope of functional data analysis; they are drawn from growth analysis, meteorology, biomechanics, equine science, economics, and medicine. The book presents novel statistical technology while keeping the mathematical level widely accessible. It is designed to appeal to students, to applied data analysts, and to experienced researc...
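
    As a rough illustration of the functional-data workflow this monograph describes, the sketch below smooths a set of synthetic sampled curves with a small basis expansion and then applies a principal components analysis to the fitted curves; the polynomial basis, data, and dimensions are assumptions for illustration (the book itself develops spline and Fourier bases, registration, and more).

```python
# Minimal sketch: smooth sampled curves with a basis expansion, then run PCA
# on the smoothed curves ("functional PCA" in spirit). Synthetic data only.
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0, 1, 50)
n_curves = 30

# Synthetic growth-like curves: shared trend, individual amplitude, noise.
curves = np.array([
    (1 + 0.2 * rng.standard_normal()) * np.sqrt(t) + 0.05 * rng.standard_normal(50)
    for _ in range(n_curves)
])

# Smooth each curve with a low-order polynomial basis (a stand-in for splines).
basis = np.vander(t, 5, increasing=True)             # columns 1, t, t^2, t^3, t^4
coefs, *_ = np.linalg.lstsq(basis, curves.T, rcond=None)
smooth = (basis @ coefs).T                           # smoothed curves, one per row

# PCA of the smoothed curves via SVD of the centered data matrix.
centered = smooth - smooth.mean(axis=0)
_, s, _ = np.linalg.svd(centered, full_matrices=False)
explained = s**2 / np.sum(s**2)
print("first component explains", round(100 * explained[0], 1), "% of the variation")
```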

  12. Java Analysis Studio

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Anthony S

    1998-10-23

    Java Analysis Studio is a desktop data analysis application aimed primarily at offline analysis of high-energy physics data. The goal is to make the application independent of any particular data format, so that it can be used to analyze data from any experiment. The application features a rich graphical user interface (GUI) aimed at making the program easy to learn and use, but which at the same time allows the user to perform arbitrarily complex data analysis tasks by writing analysis modules in Java. The application can be used either as a standalone application, or as a client for a remote Java Data Server. The client-server mechanism is targeted particularly at allowing remote users to access large data samples stored on a central data center in a natural and efficient way.

  13. Data analysis workbench

    International Nuclear Information System (INIS)

    Goetz, A.; Gerring, M.; Svensson, O.; Brockhauser, S.

    2012-01-01

    Data Analysis Workbench (DAWB) is a new software tool being developed at the ESRF. Its goal is to provide a tool both for online data analysis, which can be used on the beamlines, and for offline data analysis, which users can run during experiments or take home. The tool includes support for data visualization and workflows. Workflows allow algorithms which exploit parallel architectures to be designed from existing high-level modules for data analysis in combination with data collection. The workbench uses Passerelle as the workflow engine and EDNA plug-ins for data analysis. Actors talking to Tango are used for sending commands to a limited set of hardware to start existing data collection algorithms. A Tango server allows workflows to be executed from existing applications. There are scripting interfaces to Python, Javascript and SPEC. At present the workbench is being tested on a selected number of beamlines at the ESRF. (authors)

  14. K Basin safety analysis

    International Nuclear Information System (INIS)

    Porten, D.R.; Crowe, R.D.

    1994-01-01

    The purpose of this accident safety analysis is to document in detail the analyses whose results were reported in summary form in the K Basins Safety Analysis Report WHC-SD-SNF-SAR-001. The safety analysis addressed the potential for release of radioactive and non-radioactive hazardous material located in the K Basins and their supporting facilities. The safety analysis covers the hazards associated with normal K Basin fuel storage and handling operations, fuel encapsulation, sludge encapsulation, and canister clean-up and disposal. After a review of the Criticality Safety Evaluation of the K Basin activities, the following postulated events were evaluated: Crane failure and casks dropped into loadout pit; Design basis earthquake; Hypothetical loss of basin water accident analysis; Combustion of uranium fuel following dryout; Crane failure and cask dropped onto floor of transfer area; Spent ion exchange shipment for burial; Hydrogen deflagration in ion exchange modules and filters; Release of Chlorine; Power availability and reliability; and Ashfall

  15. Gait analysis: clinical facts.

    Science.gov (United States)

    Baker, Richard; Esquenazi, Alberto; Benedetti, Maria G; Desloovere, Kaat

    2016-08-01

    Gait analysis is a well-established tool for the quantitative assessment of gait disturbances providing functional diagnosis, assessment for treatment planning, and monitoring of disease progress. There is a large volume of literature on the research use of gait analysis, but evidence on its clinical routine use supports a favorable cost-benefit ratio in a limited number of conditions. Initially gait analysis was introduced to clinical practice to improve the management of children with cerebral palsy. However, there is good evidence to extend its use to patients with various upper motor neuron diseases, and to lower limb amputation. Thereby, the methodology for properly conducting and interpreting the exam is of paramount relevance. Appropriateness of gait analysis prescription and reliability of data obtained are required in the clinical environment. This paper provides an overview on guidelines for managing a clinical gait analysis service and on the principal clinical domains of its application: cerebral palsy, stroke, traumatic brain injury and lower limb amputation.

  16. Is activation analysis still active?

    International Nuclear Information System (INIS)

    Chai Zhifang

    2001-01-01

    This paper reviews some aspects of neutron activation analysis (NAA), covering instrumental neutron activation analysis (INAA), the k0 method, prompt gamma-ray neutron activation analysis (PGNAA), radiochemical neutron activation analysis (RNAA) and molecular activation analysis (MAA). A comparison of neutron activation analysis with other analytical techniques is also made. (author)

  17. Hazard Analysis Database Report

    Energy Technology Data Exchange (ETDEWEB)

    GAULT, G.W.

    1999-10-13

    The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for the Tank Waste Remediation System (TWRS) Final Safety Analysis Report (FSAR). The FSAR is part of the approved TWRS Authorization Basis (AB). This document describes, identifies, and defines the contents and structure of the TWRS FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The TWRS Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The database supports the preparation of Chapters 3, 4, and 5 of the TWRS FSAR and the USQ process and consists of two major, interrelated data sets: (1) Hazard Evaluation Database--Data from the results of the hazard evaluations; and (2) Hazard Topography Database--Data from the system familiarization and hazard identification.

  18. Containment vessel stability analysis

    International Nuclear Information System (INIS)

    Harstead, G.A.; Morris, N.F.; Unsal, A.I.

    1983-01-01

    The stability analysis for a steel containment shell is presented herein. The containment is a freestanding shell consisting of a vertical cylinder with a hemispherical dome. It is stiffened by large ring stiffeners and relatively small longitudinal stiffeners. The containment vessel is subjected to both static and dynamic loads which can cause buckling. These loads must be combined prior to their use in a stability analysis. The buckling loads were computed with the aid of the ASME Code case N-284 used in conjunction with general purpose computer codes and in-house programs. The equations contained in the Code case were used to compute the knockdown factors due to shell imperfections. After these knockdown factors were applied to the critical stress states determined by freezing the maximum dynamic stresses and combining them with other static stresses, a linear bifurcation analysis was carried out with the aid of the BOSOR4 program. Since the containment shell contained large penetrations, the Code case had to be supplemented by a local buckling analysis of the shell area surrounding the largest penetration. This analysis was carried out with the aid of the NASTRAN program. Although the factor of safety against buckling obtained in this analysis was satisfactory, it is claimed that the use of the Code case knockdown factors is unduly conservative when applied to the analysis of buckling around penetrations. (orig.)

  19. Analysis of monazite samples

    International Nuclear Information System (INIS)

    Kartiwa Sumadi; Yayah Rohayati

    1996-01-01

    The 'monazit' analytical program has been set up for routine Rare Earth Element analysis of monazite and xenotime mineral samples. The total relative error of the analysis is very low, less than 2.50%, and the reproducibility of the counting statistics and the stability of the instrument were excellent. The precision and accuracy of the analytical program are very good, with maximum relative percentages of 5.22% and 1.61%, respectively. The mineral compositions of the 30 monazite samples have also been calculated from their chemical constituents, and the results were compared to grain-counting microscopic analysis

  20. Methods of Multivariate Analysis

    CERN Document Server

    Rencher, Alvin C

    2012-01-01

    Praise for the Second Edition "This book is a systematic, well-written, well-organized text on multivariate analysis packed with intuition and insight . . . There is much practical wisdom in this book that is hard to find elsewhere."-IIE Transactions Filled with new and timely content, Methods of Multivariate Analysis, Third Edition provides examples and exercises based on more than sixty real data sets from a wide variety of scientific fields. It takes a "methods" approach to the subject, placing an emphasis on how students and practitioners can employ multivariate analysis in real-life sit

  1. Factor analysis and scintigraphy

    International Nuclear Information System (INIS)

    Di Paola, R.; Penel, C.; Bazin, J.P.; Berche, C.

    1976-01-01

    The goal of factor analysis is usually to achieve a reduction of a large set of data, extracting essential features without a prior hypothesis. Due to the development of computerized systems, the use of larger samplings, the possibility of sequential data acquisition and the increase in dynamic studies, the problem of data compression is now encountered routinely. Thus, results obtained for the compression of scintigraphic images are presented first. The possibilities offered by factor analysis for scan processing are then discussed. Finally, the use of this analysis for multidimensional studies, and especially dynamic studies, is considered for compression and processing [fr

  2. Applied nonstandard analysis

    CERN Document Server

    Davis, Martin

    2005-01-01

    Geared toward upper-level undergraduates and graduate students, this text explores the applications of nonstandard analysis without assuming any knowledge of mathematical logic. It develops the key techniques of nonstandard analysis at the outset from a single, powerful construction; then, beginning with a nonstandard construction of the real number system, it leads students through a nonstandard treatment of the basic topics of elementary real analysis, topological spaces, and Hilbert space.Important topics include nonstandard treatments of equicontinuity, nonmeasurable sets, and the existenc

  3. Basic stress analysis

    CERN Document Server

    Iremonger, M J

    1982-01-01

    BASIC Stress Analysis aims to help students to become proficient at BASIC programming by actually using it in an important engineering subject. It also enables the student to use computing as a means of learning stress analysis, because writing a program is analogous to teaching: it is necessary to understand the subject matter. The book begins by introducing the BASIC approach and the concept of stress analysis at first- and second-year undergraduate level. Subsequent chapters contain a summary of relevant theory, worked examples containing computer programs, and a set of problems. Topics c

  4. Foundations of Risk Analysis

    CERN Document Server

    Aven, Terje

    2012-01-01

    Foundations of Risk Analysis presents the issues core to risk analysis - understanding what risk means, expressing risk, building risk models, addressing uncertainty, and applying probability models to real problems. The author provides the readers with the knowledge and basic thinking they require to successfully manage risk and uncertainty to support decision making. This updated edition reflects recent developments on risk and uncertainty concepts, representations and treatment. New material in Foundations of Risk Analysis includes: An up-to-date presentation of how to understand, define and

  5. Real Option Analysis (ROA

    Directory of Open Access Journals (Sweden)

    Armanto Witjaksono

    2003-03-01

    Full Text Available The Net Present Value (NPV) method has been popular since the mid-1970s, but many experts now feel the method has several limitations, especially when used to analyse the allocation of capital to large-scale investments. Another method that is beginning to gain popularity is Real Option Analysis (ROA), which is used to replace the Net Present Value (NPV) method. The strength of the Real Option Analysis (ROA) method is its flexibility in providing information to the decision maker. Its weakness is that a simple mathematical formula, like the one used in the NPV method, has not yet been found.
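
    To make the contrast concrete, the sketch below computes a plain NPV and then a one-period binomial valuation of an option to defer the same investment; all cash flows, probabilities, and rates are invented for illustration and do not come from the article.

```python
# Minimal sketch: static NPV versus the value of an option to defer investment.
def npv(rate, cash_flows):
    """Discount a list of cash flows (year 0 first) at a constant rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Static NPV: invest 100 now, expect 40 per year for 3 years.
print("NPV:", round(npv(0.10, [-100, 40, 40, 40]), 2))

# Option to defer: wait one year, then invest only if the market turns out good.
investment = 100
value_up, value_down = 160, 70   # project value next year in the two states
p_up, rate = 0.5, 0.10           # illustrative probability and discount rate

payoff_up = max(value_up - investment, 0)      # invest only when worthwhile
payoff_down = max(value_down - investment, 0)  # otherwise walk away
option_value = (p_up * payoff_up + (1 - p_up) * payoff_down) / (1 + rate)
print("value with option to defer:", round(option_value, 2))
```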

  6. Fundamentals of mathematical analysis

    CERN Document Server

    Paul J Sally, Jr

    2013-01-01

    This is a textbook for a course in Honors Analysis (for freshman/sophomore undergraduates) or Real Analysis (for junior/senior undergraduates) or Analysis-I (beginning graduates). It is intended for students who completed a course in "AP Calculus", possibly followed by a routine course in multivariable calculus and a computational course in linear algebra. There are three features that distinguish this book from many other books of a similar nature and which are important for the use of this book as a text. The first, and most important, feature is the collection of exercises. These are spread

  7. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  8. Observations on risk analysis

    International Nuclear Information System (INIS)

    Thompson, W.A. Jr.

    1979-11-01

    This paper briefly describes WASH 1400 and the Lewis report. It attempts to define basic concepts such as risk and risk analysis, common mode failure, and rare event. Several probabilistic models which go beyond the WASH 1400 methodology are introduced; the common characteristic of these models is that they recognize explicitly that risk analysis is time dependent whereas WASH 1400 takes a per demand failure rate approach which obscures the important fact that accidents are time related. Further, the presentation of a realistic risk analysis should recognize that there are various risks which compete with one another for the lives of the individuals at risk. A way of doing this is suggested

  9. Bayesian nonparametric data analysis

    CERN Document Server

    Müller, Peter; Jara, Alejandro; Hanson, Tim

    2015-01-01

    This book reviews nonparametric Bayesian methods and models that have proven useful in the context of data analysis. Rather than providing an encyclopedic review of probability models, the book’s structure follows a data analysis perspective. As such, the chapters are organized by traditional data analysis problems. In selecting specific nonparametric models, simpler and more traditional models are favored over specialized ones. The discussed methods are illustrated with a wealth of examples, including applications ranging from stylized examples to case studies from recent literature. The book also includes an extensive discussion of computational methods and details on their implementation. R code for many examples is included in on-line software pages.

  10. Foundations of analysis

    CERN Document Server

    Taylor, Joseph L

    2012-01-01

    Foundations of Analysis is an excellent new text for undergraduate students in real analysis. More than other texts in the subject, it is clear, concise and to the point, without extra bells and whistles. It also has many good exercises that help illustrate the material. My students were very satisfied with it. - Nat Smale, University of Utah. I have taught our Foundations of Analysis course (based on Joe Taylor's book) several times recently, and have enjoyed doing so. The book is well-written, clear, and concise, and supplies the students with very good introductory discussions of the various t

  11. Analysis of metal samples

    International Nuclear Information System (INIS)

    Ramirez T, J.J.; Lopez M, J.; Sandoval J, A.R.; Villasenor S, P.; Aspiazu F, J.A.

    2001-01-01

    Elemental, metallographic and phase analyses were carried out in order to determine the oxidation states of the Fe contained in three metallic pieces of unknown material: a block, a plate and a cylinder. Results are presented from the elemental analysis, which was carried out in the Tandem Accelerator of ININ by proton-induced X-ray emission (PIXE). The phase analysis was carried out by X-ray diffraction, which made it possible to identify the type of alloy or alloys formed. The combined application of nuclear techniques with metallographic techniques allows the integral characterization of industrial metals. (Author)

  12. Systems analysis-independent analysis and verification

    Energy Technology Data Exchange (ETDEWEB)

    Badin, J.S.; DiPietro, J.P. [Energetics, Inc., Columbia, MD (United States)

    1995-09-01

    The DOE Hydrogen Program is supporting research, development, and demonstration activities to overcome the barriers to the integration of hydrogen into the Nation's energy infrastructure. Much work is required to gain acceptance of hydrogen energy system concepts and to develop them for implementation. A systems analysis database has been created that includes a formal documentation of technology characterization profiles and cost and performance information. Through a systematic and quantitative approach, system developers can understand and address important issues and thereby assure effective and timely commercial implementation. This project builds upon and expands the previously developed and tested pathway model and provides the basis for a consistent and objective analysis of all hydrogen energy concepts considered by the DOE Hydrogen Program Manager. This project can greatly accelerate the development of a system by minimizing the risk of costly design evolutions, and by stimulating discussions, feedback, and coordination of key players and allows them to assess the analysis, evaluate the trade-offs, and to address any emerging problem areas. Specific analytical studies will result in the validation of the competitive feasibility of the proposed system and identify system development needs. Systems that are investigated include hydrogen bromine electrolysis, municipal solid waste gasification, electro-farming (biomass gasifier and PEM fuel cell), wind/hydrogen hybrid system for remote sites, home electrolysis and alternate infrastructure options, renewable-based electrolysis to fuel PEM fuel cell vehicle fleet, and geothermal energy used to produce hydrogen. These systems are compared to conventional and benchmark technologies. Interim results and findings are presented. Independent analyses emphasize quality, integrity, objectivity, a long-term perspective, corporate memory, and the merging of technical, economic, operational, and programmatic expertise.

  13. Summary Analysis: Hanford Site Composite Analysis Update

    Energy Technology Data Exchange (ETDEWEB)

    Nichols, W. E. [CH2M HILL Plateau Remediation Company, Richland, WA (United States); Lehman, L. L. [CH2M HILL Plateau Remediation Company, Richland, WA (United States)

    2017-06-05

    The Hanford Site’s currently maintained Composite Analysis, originally completed in 1998, requires an update. A previous update effort was undertaken by the U.S. Department of Energy (DOE) in 2001-2005, but was ended before completion to allow the Tank Closure & Waste Management Environmental Impact Statement (TC&WM EIS) (DOE/EIS-0391) to be prepared without potential for conflicting sitewide models. This EIS was issued in 2012, and the deferral was ended with guidance in memorandum “Modeling to Support Regulatory Decision Making at Hanford” (Williams, 2012) provided with the aim of ensuring subsequent modeling is consistent with the EIS.

  14. Plasma data analysis using statistical analysis system

    International Nuclear Information System (INIS)

    Yoshida, Z.; Iwata, Y.; Fukuda, Y.; Inoue, N.

    1987-01-01

    Multivariate factor analysis has been applied to a plasma data base of REPUTE-1. The characteristics of the reverse field pinch plasma in REPUTE-1 are shown to be explained by four independent parameters which are described in the report. The well-known scaling laws Fχ ∝ Ip, Te ∝ Ip, and τE ∝ Ne are also confirmed. 4 refs., 8 figs., 1 tab
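
    As an illustration of how a scaling law such as τE ∝ Ne can be checked against data, the sketch below fits a power-law exponent by log-log least squares; the data points are synthetic placeholders, not REPUTE-1 measurements.

```python
# Minimal sketch: fit tau_E = C * n_e^alpha by least squares in log-log space
# and check whether alpha is close to 1 (i.e. proportionality). Synthetic data.
import numpy as np

rng = np.random.default_rng(3)
n_e = np.linspace(0.5, 5.0, 25)                              # electron density (arbitrary units)
tau_E = 0.8 * n_e * (1 + 0.05 * rng.standard_normal(25))     # confinement time, roughly proportional

# log tau_E = log C + alpha * log n_e  ->  straight-line fit gives the exponent.
alpha, _ = np.polyfit(np.log(n_e), np.log(tau_E), 1)
print(f"fitted exponent alpha = {alpha:.2f} (proportionality corresponds to alpha ≈ 1)")
```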

  15. Exercises in analysis

    CERN Document Server

    Gasińksi, Leszek

    2014-01-01

    Exercises in Analysis will be published in two volumes. This first volume covers problems in five core topics of mathematical analysis: metric spaces; topological spaces; measure, integration, and Martingales; measure and topology; and functional analysis. Each of the five topics corresponds to a different chapter, with inclusion of the basic theory and accompanying main definitions and results, followed by suitable comments and remarks for better understanding of the material. At least 170 exercises/problems are presented for each topic, with solutions available at the end of each chapter. The entire collection of exercises offers a balanced and useful picture for the application surrounding each topic. This nearly encyclopedic coverage of exercises in mathematical analysis is the first of its kind and is accessible to a wide readership. Graduate students will find the collection of problems valuable in preparation for their preliminary or qualifying exams as well as for testing their deeper understanding of th...

  16. Main: Nucleotide Analysis [KOME

    Lifescience Database Archive (English)

    Full Text Available Nucleotide Analysis: Japonica genome blast search result. Result of blastn search against the japonica genome sequence. kome_japonica_genome_blast_search_result.zip kome_japonica_genome_blast_search_result ...

  17. Lectures on Functional Analysis

    CERN Document Server

    Kurepa, Svetozar; Kraljević, Hrvoje

    1987-01-01

    This volume consists of a long monographic paper by J. Hoffmann-Jorgensen and a number of shorter research papers and survey articles covering different aspects of functional analysis and its application to probability theory and differential equations.

  18. Building energy analysis tool

    Science.gov (United States)

    Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars

    2016-04-12

    A building energy analysis system includes a building component library configured to store a plurality of building components, a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and using selected building components of the plurality of building components stored in the building component library, a building analysis engine configured to operate the building model and generate a baseline energy model of the building under analysis and further configured to apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models, and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.
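
    The patent abstract describes generating a baseline energy model, applying energy conservation measures to it, and assessing the resulting variants. The sketch below mimics that workflow with a toy data structure; the class, measure names, and savings fractions are invented for illustration and do not come from the patent.

```python
# Minimal sketch: baseline building model, apply conservation measures, compare.
from dataclasses import dataclass, field

@dataclass
class BuildingModel:
    name: str
    annual_energy_kwh: float
    measures: list = field(default_factory=list)

    def apply_measure(self, label: str, savings_fraction: float) -> "BuildingModel":
        """Return a new model variant with one conservation measure applied."""
        return BuildingModel(
            name=f"{self.name} + {label}",
            annual_energy_kwh=self.annual_energy_kwh * (1 - savings_fraction),
            measures=self.measures + [label],
        )

baseline = BuildingModel("baseline", annual_energy_kwh=120_000)
candidates = [
    baseline.apply_measure("LED lighting retrofit", 0.08),
    baseline.apply_measure("roof insulation upgrade", 0.05),
]
for model in candidates:
    saved = baseline.annual_energy_kwh - model.annual_energy_kwh
    print(f"{model.name}: saves {saved:,.0f} kWh/yr")
```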

  19. Electrical Subsurface Grounding Analysis

    International Nuclear Information System (INIS)

    J.M. Calle

    2000-01-01

    The purpose and objective of this analysis is to determine the present grounding requirements of the Exploratory Studies Facility (ESF) subsurface electrical system and to verify that the actual grounding system and devices satisfy the requirements

  20. Water Quality Analysis Simulation

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Water Quality analysis simulation Program, an enhancement of the original WASP. This model helps users interpret and predict water quality responses to natural...

  1. Water Quality Analysis Simulation

    Science.gov (United States)

    The Water Quality Analysis Simulation Program is an enhancement of the original WASP. This model helps users interpret and predict water quality responses to natural phenomena and man-made pollution for various pollution management decisions.

  2. Countercontrol in behavior analysis.

    Science.gov (United States)

    Delprato, Dennis J

    2002-01-01

    Countercontrol is a functional class of behavior that is part of Skinner's analysis of social behavior. Countercontrol refers to behavioral episodes composed of socially mediated aversive controlling conditions and escape or avoidance responses that do not reinforce, and perhaps even punish, controllers' responses. This paper suggests that neglect of countercontrol in modern behavior analysis is unfortunate because the concept applies to interpersonal and social relations the fundamental operant principle that human behavior is both controlled and controlling: humans are not passive and inflexible. Countercontrol is addressed here in terms of conceptual status, contemporary developments in behavior analysis, its importance in a behavior-analytic approach to freedom and cultural design, applications, and research. The main conclusion is that Skinner's formulation of countercontrol is scientifically supported and worthy of increased prominence in behavior analysis.

  3. Longitudinal categorical data analysis

    CERN Document Server

    Sutradhar, Brajendra C

    2014-01-01

    This is the first book in longitudinal categorical data analysis with parametric correlation models developed based on dynamic relationships among repeated categorical responses. This book is a natural generalization of the longitudinal binary data analysis to the multinomial data setup with more than two categories. Thus, unlike the existing books on cross-sectional categorical data analysis using log linear models, this book uses multinomial probability models both in cross-sectional and longitudinal setups. A theoretical foundation is provided for the analysis of univariate multinomial responses, by developing models systematically for the cases with no covariates as well as categorical covariates, both in cross-sectional and longitudinal setups. In the longitudinal setup, both stationary and non-stationary covariates are considered. These models have also been extended to the bivariate multinomial setup along with suitable covariates. For the inferences, the book uses the generalized quasi-likelihood as w...

  4. Structural analysis of DAEs

    DEFF Research Database (Denmark)

    Poulsen, Mikael Zebbelin

    2002-01-01

    , by the implementation of the Simpy tool box. This is an object-oriented system implemented in the Python language. It can be used for analysis of DAEs, ODEs and non-linear equations and uses e.g. symbolic representations of expressions and equations. The presentations of theory and algorithms for structural index...... analysis of DAE is original in the sense that it is based on a new matrix representation of the structural information of a general DAE system instead of a graph oriented representation. Also the presentation of the theory is found to be more complete compared to other presentations, since it e.g. proves....... The methodology is mainly based on structural index analysis which is not limited by the index of the DAE as other methodologies are. As a result of structural index analysis one can perform index reduction of the DAE and obtain the so-called augmented underlying ODE. It is also described how to use the augmented...

  5. Full closure strategic analysis.

    Science.gov (United States)

    2014-07-01

    The full closure strategic analysis was conducted to create a decision process whereby full roadway closures for construction and maintenance activities can be evaluated and approved or denied by CDOT Traffic personnel. The study reviewed current...

  6. Multidimensional nonlinear descriptive analysis

    CERN Document Server

    Nishisato, Shizuhiko

    2006-01-01

    Quantification of categorical, or non-numerical, data is a problem that scientists face across a wide range of disciplines. Exploring data analysis in various areas of research, such as the social sciences and biology, Multidimensional Nonlinear Descriptive Analysis presents methods for analyzing categorical data that are not necessarily sampled randomly from a normal population and often involve nonlinear relations. This reference not only provides an overview of multidimensional nonlinear descriptive analysis (MUNDA) of discrete data, it also offers new results in a variety of fields. The first part of the book covers conceptual and technical preliminaries needed to understand the data analysis in subsequent chapters. The next two parts contain applications of MUNDA to diverse data types, with each chapter devoted to one type of categorical data, a brief historical comment, and basic skills peculiar to the data types. The final part examines several problems and then concludes with suggestions for futu...

  7. Analysis in Banach spaces

    CERN Document Server

    Hytönen, Tuomas; Veraar, Mark; Weis, Lutz

    The present volume develops the theory of integration in Banach spaces, martingales and UMD spaces, and culminates in a treatment of the Hilbert transform, Littlewood-Paley theory and the vector-valued Mihlin multiplier theorem. Over the past fifteen years, motivated by regularity problems in evolution equations, there has been tremendous progress in the analysis of Banach space-valued functions and processes. The contents of this extensive and powerful toolbox have been mostly scattered around in research papers and lecture notes. Collecting this diverse body of material into a unified and accessible presentation fills a gap in the existing literature. The principal audience that we have in mind consists of researchers who need and use Analysis in Banach Spaces as a tool for studying problems in partial differential equations, harmonic analysis, and stochastic analysis. Self-contained and offering complete proofs, this work is accessible to graduate students and researchers with a background in functional an...

  8. Errors from Image Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wood, William Monford [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-02-23

    This report presents a systematic study of the standard analysis of rod-pinch radiographs for obtaining quantitative measurements of areal mass densities, and makes suggestions for improving the methodology for obtaining quantitative information from radiographed objects.

  9. Analysis in Banach spaces

    CERN Document Server

    Hytönen, Tuomas; Veraar, Mark; Weis, Lutz

    2016-01-01

    The present volume develops the theory of integration in Banach spaces, martingales and UMD spaces, and culminates in a treatment of the Hilbert transform, Littlewood-Paley theory and the vector-valued Mihlin multiplier theorem. Over the past fifteen years, motivated by regularity problems in evolution equations, there has been tremendous progress in the analysis of Banach space-valued functions and processes. The contents of this extensive and powerful toolbox have been mostly scattered around in research papers and lecture notes. Collecting this diverse body of material into a unified and accessible presentation fills a gap in the existing literature. The principal audience that we have in mind consists of researchers who need and use Analysis in Banach Spaces as a tool for studying problems in partial differential equations, harmonic analysis, and stochastic analysis. Self-contained and offering complete proofs, this work is accessible to graduate students and researchers with a background in functional an...

  10. Analysis of rare categories

    CERN Document Server

    He, Jingrui

    2012-01-01

    This book focuses on rare category analysis where the majority classes have smooth distributions and the minority classes exhibit the compactness property. It focuses on challenging cases where the support regions of the majority and minority classes overlap.

  11. Introduction to global analysis

    CERN Document Server

    Kahn, Donald W

    2007-01-01

    This text introduces the methods of mathematical analysis as applied to manifolds, including the roles of differentiation and integration, infinite dimensions, Morse theory, Lie groups, and dynamical systems. 1980 edition.

  12. Distributed Analysis in CMS

    CERN Document Server

    Fanfani, Alessandra; Sanches, Jose Afonso; Andreeva, Julia; Bagliesi, Giusepppe; Bauerdick, Lothar; Belforte, Stefano; Bittencourt Sampaio, Patricia; Bloom, Ken; Blumenfeld, Barry; Bonacorsi, Daniele; Brew, Chris; Calloni, Marco; Cesini, Daniele; Cinquilli, Mattia; Codispoti, Giuseppe; D'Hondt, Jorgen; Dong, Liang; Dongiovanni, Danilo; Donvito, Giacinto; Dykstra, David; Edelmann, Erik; Egeland, Ricky; Elmer, Peter; Eulisse, Giulio; Evans, Dave; Fanzago, Federica; Farina, Fabio; Feichtinger, Derek; Fisk, Ian; Flix, Josep; Grandi, Claudio; Guo, Yuyi; Happonen, Kalle; Hernandez, Jose M; Huang, Chih-Hao; Kang, Kejing; Karavakis, Edward; Kasemann, Matthias; Kavka, Carlos; Khan, Akram; Kim, Bockjoo; Klem, Jukka; Koivumaki, Jesper; Kress, Thomas; Kreuzer, Peter; Kurca, Tibor; Kuznetsov, Valentin; Lacaprara, Stefano; Lassila-Perini, Kati; Letts, James; Linden, Tomas; Lueking, Lee; Maes, Joris; Magini, Nicolo; Maier, Gerhild; McBride, Patricia; Metson, Simon; Miccio, Vincenzo; Padhi, Sanjay; Pi, Haifeng; Riahi, Hassen; Riley, Daniel; Rossman, Paul; Saiz, Pablo; Sartirana, Andrea; Sciaba, Andrea; Sekhri, Vijay; Spiga, Daniele; Tuura, Lassi; Vaandering, Eric; Vanelderen, Lukas; Van Mulders, Petra; Vedaee, Aresh; Villella, Ilaria; Wicklund, Eric; Wildish, Tony; Wissing, Christoph; Wurthwein, Frank

    2009-01-01

    The CMS experiment expects to manage several Pbytes of data each year during the LHC programme, distributing them over many computing sites around the world and enabling data access at those centers for analysis. CMS has identified the distributed sites as the primary location for physics analysis to support a wide community with thousands of potential users. This represents an unprecedented experimental challenge in terms of the scale of distributed computing resources and the number of users. An overview of the computing architecture, the software tools and the distributed infrastructure is reported. Summaries of the experience in establishing efficient and scalable operations to get prepared for CMS distributed analysis are presented, followed by the user experience in their current analysis activities.

  13. ITS risk analysis.

    Science.gov (United States)

    1996-06-01

    Risk analysis plays a key role in the implementation of an architecture. Early definition of the situations, processes, or events that have the potential for impeding the implementation of key elements of the ITS National Architecture is a critic...

  14. Biodiesel Emissions Analysis Program

    Science.gov (United States)

    Using existing data, the EPA's biodiesel emissions analysis program sought to quantify the air pollution emission effects of biodiesel for diesel engines that have not been specifically modified to operate on biodiesel.

  15. Analysis and logic

    CERN Document Server

    Henson, C Ward; Kechris, Alexander S; Odell, Edward; Finet, Catherine; Michaux, Christian; Cassels, J W S

    2003-01-01

    This volume comprises articles from four outstanding researchers who work at the cusp of analysis and logic. The emphasis is on active research topics; many results are presented that have not been published before and open problems are formulated.

  16. Principles of Fourier analysis

    CERN Document Server

    Howell, Kenneth B

    2001-01-01

    Fourier analysis is one of the most useful and widely employed sets of tools for the engineer, the scientist, and the applied mathematician. As such, students and practitioners in these disciplines need a practical and mathematically solid introduction to its principles. They need straightforward verifications of its results and formulas, and they need clear indications of the limitations of those results and formulas. Principles of Fourier Analysis furnishes all this and more. It provides a comprehensive overview of the mathematical theory of Fourier analysis, including the development of Fourier series, "classical" Fourier transforms, generalized Fourier transforms and analysis, and the discrete theory. Much of the author's development is strikingly different from typical presentations. His approach to defining the classical Fourier transform results in a much cleaner, more coherent theory that leads naturally to a starting point for the generalized theory. He also introduces a new generalized theory based ...

  17. Modern real analysis

    CERN Document Server

    Ziemer, William P

    2017-01-01

    This first year graduate text is a comprehensive resource in real analysis based on a modern treatment of measure and integration. Presented in a definitive and self-contained manner, it features a natural progression of concepts from simple to difficult. Several innovative topics are featured, including differentiation of measures, elements of Functional Analysis, the Riesz Representation Theorem, Schwartz distributions, the area formula, Sobolev functions and applications to harmonic functions. Together, the selection of topics forms a sound foundation in real analysis that is particularly suited to students going on to further study in partial differential equations. This second edition of Modern Real Analysis contains many substantial improvements, including the addition of problems for practicing techniques, and an entirely new section devoted to the relationship between Lebesgue and improper integrals. Aimed at graduate students with an understanding of advanced calculus, the text will also appeal to mo...

  18. Trace analysis by TXRF

    International Nuclear Information System (INIS)

    Hockett, R.S.

    1995-01-01

    Total reflection X-Ray Fluorescence (TXRF) was originally developed for trace analysis of small residues but has become a widespread method for measuring trace surface metal contamination on semiconductor substrates. It is estimated that approximately 100 TXRF instruments are in use in the semiconductor industry worldwide, and approximately half that number for residue analysis in analytical laboratories. TXRF instrumentation is available today for reaching detection limits of the order of 10⁹ atoms/cm². This review emphasizes some of the more recent developments in TXRF for trace analysis, in particular the use of synchrotron x-ray sources (SR-TXRF). There is some promise of reaching 10⁷ atoms/cm² detection limits for surface analysis of semiconductor substrates. 19 refs

  19. Visual time series analysis

    DEFF Research Database (Denmark)

    Fischer, Paul; Hilbert, Astrid

    2012-01-01

    We introduce a platform which supplies an easy-to-handle, interactive, extendable, and fast analysis tool for time series analysis. In contrast to other software suites like Maple, Matlab, or R, which use a command-line-like interface and where the user has to memorize/look-up the appropriate commands, our application is select-and-click-driven. It allows to derive many different sequences of deviations for a given time series and to visualize them in different ways in order to judge their expressive power and to reuse the procedure found. For many transformations or model-fits, the user may choose between manual and automated parameter selection. The user can define new transformations and add them to the system. The application contains efficient implementations of advanced and recent techniques for time series analysis including techniques related to extreme value analysis and filtering...

  20. Applied multivariate statistical analysis

    National Research Council Canada - National Science Library

    Johnson, Richard Arnold; Wichern, Dean W

    1988-01-01

    .... The authors hope that their discussions will meet the needs of experimental scientists, in a wide variety of subject matter areas, as a readable introduction to the statistical analysis of multivariate observations...

  1. NOAA's Inundation Analysis Tool

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Coastal storms and other meteorological phenomena can have a significant impact on how high water levels rise and how often. The inundation analysis program is...

  2. Journals Data Analysis Project

    OpenAIRE

    Killiard, Patricia

    2018-01-01

    Slides from a lightning talk (5 minute presentation) by Patricia Killiard (Acting Deputy Director, Academic Services, Cambridge University Library). Presented at Cambridge Libraries Conference 2018. The talk discussed the work carried out by the currently running Journals Data Analysis Project.

  3. B-10 analysis

    International Nuclear Information System (INIS)

    Holland, W.E.

    1980-02-01

    A method was developed to determine if boron-loaded polymeric material contained enriched boron or natural boron. A prototype analyzer was constructed, and initial planning was done for an actual analysis facility

  4. Design and Analysis Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Provides engineering design of aircraft components, subsystems and installations using Pro/E, Anvil 1000, CADKEY 97, AutoCAD 13. Engineering analysis tools include...

  5. Biorefinery Sustainability Analysis

    DEFF Research Database (Denmark)

    J. S. M. Silva, Carla; Prunescu, Remus Mihail; Gernaey, Krist

    2017-01-01

    This chapter deals with sustainability analysis of biorefinery systems in terms of environmental and socio-economic indicators. Life cycle analysis has methodological issues related to the functional unit (FU), allocation, land use and biogenic carbon neutrality of the reference system and of the biorefinery-based system. Socio-economic criteria and indicators used in sustainability frameworks assessment are presented and discussed. There is not one single methodology that can aptly cover the synergies of environmental, economic, social and governance issues required to assess the sustainable...

  6. Intelligent audio analysis

    CERN Document Server

    Schuller, Björn W

    2013-01-01

    This book provides the reader with the knowledge necessary for comprehension of the field of Intelligent Audio Analysis. It firstly introduces standard methods and discusses the typical Intelligent Audio Analysis chain going from audio data to audio features to audio recognition. Further, introductions to audio source separation, enhancement, and robustness are given. After the introductory parts, the book shows several applications for the three types of audio: speech, music, and general sound. Each task is briefly introduced, followed by a description of the specific data and methods applied, experiments and results, and a conclusion for this specific task. The book provides benchmark results and standardized test-beds for a broader range of audio analysis tasks. The main focus thereby lies on the parallel advancement of realism in audio analysis, as too often today’s results are overly optimistic owing to idealized testing conditions, and it serves to stimulate synergies arising from transfer of ...

  7. Scientific stream pollution analysis

    National Research Council Canada - National Science Library

    Nemerow, Nelson Leonard

    1974-01-01

    A comprehensive description of the analysis of water pollution that presents a careful balance of the biological, hydrological, chemical and mathematical concepts involved in the evaluation of stream...

  8. Energy Sector Market Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Arent, D.; Benioff, R.; Mosey, G.; Bird, L.; Brown, J.; Brown, E.; Vimmerstedt, L.; Aabakken, J.; Parks, K.; Lapsa, M.; Davis, S.; Olszewski, M.; Cox, D.; McElhaney, K.; Hadley, S.; Hostick, D.; Nicholls, A.; McDonald, S.; Holloman, B.

    2006-10-01

    This paper presents the results of energy market analysis sponsored by the Department of Energy's (DOE) Weatherization and International Program (WIP) within the Office of Energy Efficiency and Renewable Energy (EERE). The analysis was conducted by a team of DOE laboratory experts from the National Renewable Energy Laboratory (NREL), Oak Ridge National Laboratory (ORNL), and Pacific Northwest National Laboratory (PNNL), with additional input from Lawrence Berkeley National Laboratory (LBNL). The analysis was structured to identify those markets and niches where government can create the biggest impact by informing management decisions in the private and public sectors. The analysis identifies those markets and niches where opportunities exist for increasing energy efficiency and renewable energy use.

  9. Statistical data analysis handbook

    National Research Council Canada - National Science Library

    Wall, Francis J

    1986-01-01

    It must be emphasized that this is not a text book on statistics. Instead it is a working tool that presents data analysis in clear, concise terms which can be readily understood even by those without formal training in statistics...

  10. Perspectives in shape analysis

    CERN Document Server

    Bruckstein, Alfred; Maragos, Petros; Wuhrer, Stefanie

    2016-01-01

    This book presents recent advances in the field of shape analysis. Written by experts in the fields of continuous-scale shape analysis, discrete shape analysis and sparsity, and numerical computing who hail from different communities, it provides a unique view of the topic from a broad range of perspectives. Over the last decade, it has become increasingly affordable to digitize shape information at high resolution. Yet analyzing and processing this data remains challenging because of the large amount of data involved, and because modern applications such as human-computer interaction require real-time processing. Meeting these challenges requires interdisciplinary approaches that combine concepts from a variety of research areas, including numerical computing, differential geometry, deformable shape modeling, sparse data representation, and machine learning. On the algorithmic side, many shape analysis tasks are modeled using partial differential equations, which can be solved using tools from the field of n...

  11. Quantitative investment analysis

    CERN Document Server

    DeFusco, Richard

    2007-01-01

    In the "Second Edition" of "Quantitative Investment Analysis," financial experts Richard DeFusco, Dennis McLeavey, Jerald Pinto, and David Runkle outline the tools and techniques needed to understand and apply quantitative methods to today's investment process.

  12. Piping Stress Analysis

    International Nuclear Information System (INIS)

    Setjo, Renaningsih

    2000-01-01

    Piping stress analysis on the Primary Sampling System, Reactor Cooling System, and Feedwater System for AP600 has been performed. Piping stress analysis is one of the requirements in the design of piping systems. Piping stress occurs due to static and dynamic loads during service. The analysis was carried out using PS+CAEPIPE software, based on individual and combined loads, with the assumption that failure could happen during normal, upset, emergency and faulted conditions as described in ASME III/ANSI B31.1. By performing the piping stress analysis, the layout (proper pipe routing) of the piping system can be designed with the requirements of piping stress and pipe supports in mind, i.e. sufficient flexibility for thermal expansion, etc., commensurate with the intended service conditions such as temperature, pressure, seismic and other anticipated loading.
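
    As a rough illustration of the kind of acceptance check such an analysis automates, the sketch below compares computed stresses for individual and combined load cases against allowable values per service level. It is a generic, hypothetical example (the load-case names, stress values and allowables are invented), not the PS+CAEPIPE method and not the exact ASME III/ANSI B31.1 equations.

```python
# Hypothetical pipe-stress acceptance check: load cases versus allowable stresses.
# All numbers are invented for illustration only.
ALLOWABLE = {                # MPa, per service level (illustrative values)
    "normal": 120.0,
    "upset": 140.0,
    "faulted": 240.0,
}

computed = {                 # MPa, computed stress per load case (illustrative)
    "sustained (weight + pressure)": 85.0,
    "sustained + occasional (seismic)": 150.0,
    "sustained + occasional (faulted)": 210.0,
}

checks = [
    ("sustained (weight + pressure)", "normal"),
    ("sustained + occasional (seismic)", "upset"),
    ("sustained + occasional (faulted)", "faulted"),
]

for load_case, level in checks:
    stress, limit = computed[load_case], ALLOWABLE[level]
    status = "OK" if stress <= limit else "EXCEEDED"
    print(f"{load_case:40s} {stress:6.1f} / {limit:6.1f} MPa  {status}")
```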

  13. Invitation to classical analysis

    CERN Document Server

    Duren, Peter

    2012-01-01

    This book gives a rigorous treatment of selected topics in classical analysis, with many applications and examples. The exposition is at the undergraduate level, building on basic principles of advanced calculus without appeal to more sophisticated techniques of complex analysis and Lebesgue integration. Among the topics covered are Fourier series and integrals, approximation theory, Stirling's formula, the gamma function, Bernoulli numbers and polynomials, the Riemann zeta function, Tauberian theorems, elliptic integrals, ramifications of the Cantor set, and a theoretical discussion of differ

  14. CMS analysis operations

    International Nuclear Information System (INIS)

    Andreeva, J; Maier, G; Spiga, D; Calloni, M; Colling, D; Fanzago, F; D'Hondt, J; Maes, J; Van Mulders, P; Villella, I; Klem, J; Letts, J; Padhi, S; Sarkar, S

    2010-01-01

    During normal data taking CMS expects to support potentially as many as 2000 analysis users. Since the beginning of 2008 there have been more than 800 individuals who submitted a remote analysis job to the CMS computing infrastructure. The bulk of these users will be supported at the over 40 CMS Tier-2 centres. Supporting a globally distributed community of users on a globally distributed set of computing clusters is a task that requires reconsidering the normal methods of user support for Analysis Operations. In 2008 CMS formed an Analysis Support Task Force in preparation for large-scale physics analysis activities. The charge of the task force was to evaluate the available support tools, the user support techniques, and the direct feedback of users with the goal of improving the success rate and user experience when utilizing the distributed computing environment. The task force determined the tools needed to assess and reduce the number of non-zero exit code applications submitted through the grid interfaces and worked with the CMS experiment dashboard developers to obtain the necessary information to quickly and proactively identify issues with user jobs and data sets hosted at various sites. Results of the analysis group surveys were compiled. Reference platforms for testing and debugging problems were established in various geographic regions. The task force also assessed the resources needed to make the transition to a permanent Analysis Operations task. In this presentation the results of the task force will be discussed as well as the CMS Analysis Operations plans for the start of data taking.

  15. Applied functional analysis

    CERN Document Server

    Oden, J Tinsley

    2010-01-01

    The textbook is designed to drive a crash course for beginning graduate students majoring in something besides mathematics, introducing mathematical foundations that lead to classical results in functional analysis. More specifically, Oden and Demkowicz want to prepare students to learn the variational theory of partial differential equations, distributions, and Sobolev spaces and numerical analysis with an emphasis on finite element methods. The 1996 first edition has been used in a rather intensive two-semester course. -Book News, June 2010

  16. Countercontrol in behavior analysis

    OpenAIRE

    Delprato, Dennis J.

    2002-01-01

    Countercontrol is a functional class of behavior that is part of Skinner's analysis of social behavior. Countercontrol refers to behavioral episodes comprised of socially mediated aversive controlling conditions and escape or avoidance responses that do not reinforce, and perhaps even punish, controllers' responses. This paper suggests that neglect of countercontrol in modern behavior analysis is unfortunate because the concept applies to interpersonal and social relations the fundamental ope...

  17. Stable isotope analysis

    International Nuclear Information System (INIS)

    Tibari, Elghali; Taous, Fouad; Marah, Hamid

    2014-01-01

    This report presents results related to stable isotope analysis carried out at the CNESTEN DASTE in Rabat (Morocco), on behalf of Senegal. These analyses cover 127 samples. Oxygen-18 and deuterium in water were analysed by infrared laser spectroscopy using an LGR/DLT-100 with autosampler. The results are expressed in δ values (‰) relative to V-SMOW, to ± 0.3 ‰ for oxygen-18 and ± 1 ‰ for deuterium.
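
    For readers unfamiliar with the δ notation, the reported values follow the standard definition δ = (R_sample / R_standard − 1) × 1000 ‰, with V-SMOW as the reference. A minimal Python sketch (the sample ratio below is invented for illustration):

```python
# Delta notation for stable isotope ratios, relative to the V-SMOW standard.
# The sample ratio below is invented for illustration.
R_VSMOW_18O = 2005.20e-6   # commonly cited 18O/16O ratio of V-SMOW

def delta_permil(r_sample: float, r_standard: float) -> float:
    """Return the delta value in per mil (per thousand)."""
    return (r_sample / r_standard - 1.0) * 1000.0

r_sample = 1995.0e-6       # hypothetical 18O/16O ratio of a water sample
print(f"delta 18-O = {delta_permil(r_sample, R_VSMOW_18O):+.2f} per mil")
```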

  18. High Resolution Spectral Analysis

    Science.gov (United States)

    2006-10-25

    liable methods for high resolution spectral analysis of multivariable processes, as well as to distance measures for quantitative assessment of ... called "modern nonlinear spectral analysis methods" [27]. An alternative way to reconstruct the spectral density, based on Tn, is the periodogram/correlogram. A homotopy method was proposed in [8, 9] leading to a differential equation for A(τ) in a homotopy variable τ. If the statistics are consistent

  19. Urinary Protein Biomarker Analysis

    Science.gov (United States)

    2017-10-01

    associated protein biomarkers were identified by transcriptomic comparison of cancer cells vs. normal luminal cells and of cancer-associated stromal cells vs. ... [remaining figure-legend fragments: (C) correction with PSA, P = 0.012; (D) ROC curve analysis; 4-1. use of PSA levels for marker level normalization; other organs along the ...]

  20. Applied functional analysis

    CERN Document Server

    Griffel, DH

    2002-01-01

    A stimulating introductory text, this volume examines many important applications of functional analysis to mechanics, fluid mechanics, diffusive growth, and approximation. Detailed enough to impart a thorough understanding, the text is also sufficiently straightforward for those unfamiliar with abstract analysis. Its four-part treatment begins with distribution theory and discussions of Green's functions. Essentially independent of the preceding material, the second and third parts deal with Banach spaces, Hilbert space, spectral theory, and variational techniques. The final part outlines the

  1. DART system analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Boggs, Paul T.; Althsuler, Alan (Exagrid Engineering); Larzelere, Alex R. (Exagrid Engineering); Walsh, Edward J.; Clay, Robert L.; Hardwick, Michael F. (Sandia National Laboratories, Livermore, CA)

    2005-08-01

    The Design-through-Analysis Realization Team (DART) is chartered with reducing the time Sandia analysts require to complete the engineering analysis process. The DART system analysis team studied the engineering analysis processes employed by analysts in Centers 9100 and 8700 at Sandia to identify opportunities for reducing overall design-through-analysis process time. The team created and implemented a rigorous analysis methodology based on a generic process flow model parameterized by information obtained from analysts. They also collected data from analysis department managers to quantify the problem type and complexity distribution throughout Sandia's analyst community. They then used this information to develop a community model, which enables a simple characterization of processes that span the analyst community. The results indicate that equal opportunity for reducing analysis process time is available both by reducing the "once-through" time required to complete a process step and by reducing the probability of backward iteration. In addition, reducing the rework fraction (i.e., improving the engineering efficiency of subsequent iterations) offers approximately 40% to 80% of the benefit of reducing the "once-through" time or iteration probability, depending upon the process step being considered. Further, the results indicate that geometry manipulation and meshing is the largest portion of an analyst's effort, especially for structural problems, and offers significant opportunity for overall time reduction. Iteration loops initiated late in the process are more costly than others because they increase "inner loop" iterations. Identifying and correcting problems as early as possible in the process offers significant opportunity for time savings.
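
    The trade-off described above (once-through time vs. iteration probability vs. rework fraction) can be made concrete with a toy expected-time model: if a step takes time t once through, triggers a backward iteration with probability p, and each iteration costs a fraction r of t, the expected time is roughly t·(1 + r·p/(1 − p)). The Python sketch below is only an illustration of that reasoning, not the DART team's community model; all numbers are invented.

```python
# Toy model of design-through-analysis process time (not the DART community model).
# t: once-through time, p: probability of a backward iteration, r: rework fraction.
def expected_time(t: float, p: float, r: float) -> float:
    # Each pass triggers another iteration with probability p; each iteration costs r*t.
    return t * (1.0 + r * p / (1.0 - p))

base = expected_time(t=10.0, p=0.5, r=0.6)           # invented baseline numbers
print(f"baseline           : {base:.2f}")
print(f"once-through -20%  : {expected_time(8.0, 0.5, 0.6):.2f}")
print(f"iteration prob -20%: {expected_time(10.0, 0.4, 0.6):.2f}")
print(f"rework frac -20%   : {expected_time(10.0, 0.5, 0.48):.2f}")
```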

  2. Forensic neutron activation analysis

    International Nuclear Information System (INIS)

    Kishi, T.

    1987-01-01

    The progress of forensic neutron activation analysis (FNAA) in Japan is described. FNAA began in 1965 and during the past 20 years many cases have been handled; these include determination of toxic materials, comparison examination of physical evidences (e.g., paints, metal fragments, plastics and inks) and drug sample differentiation. Neutron activation analysis is applied routinely to the scientific criminal investigation as one of multielement analytical techniques. This paper also discusses these routine works. (author) 14 refs

  3. Cuckoo malware analysis

    CERN Document Server

    Oktavianto, Digit

    2013-01-01

    This book is a step-by-step, practical tutorial for analyzing and detecting malware and performing digital investigations. This book features clear and concise guidance in an easily accessible format. Cuckoo Malware Analysis is great for anyone who wants to analyze malware through programming, networking, disassembling, forensics, and virtualization. Whether you are new to malware analysis or have some experience, this book will help you get started with Cuckoo Sandbox so you can start analysing malware effectively and efficiently.

  4. A PROOF Analysis Framework

    CERN Document Server

    Gonzalez Caballero, Isidro

    2012-01-01

    The analysis of the complex LHC data usually follows a standard path that aims at minimizing not only the amount of data but also the number of observables used. After a number of steps of slimming and skimming the data, the remaining few terabytes of ROOT files hold a selection of the events and a flat structure for the variables needed that can be more easily inspected and traversed in the final stages of the analysis. PROOF arises at this point as an efficient mechanism to distribute the analysis load by taking advantage of all the cores in modern CPUs through PROOF Lite, or by using PROOF Cluster or PROOF on Demand tools to build dynamic PROOF cluster on computing facilities with spare CPUs. However using PROOF at the level required for a serious analysis introduces some difficulties that may scare new adopters. We have developed the PROOF Analysis Framework (PAF) to facilitate the development of new analysis by uniformly exposing the PROOF related configurations across technologies and by taking care of ...

  5. Activation analysis in Greece

    International Nuclear Information System (INIS)

    Grimanis, A.P.

    1985-01-01

    A review of research and development on NAA as well as examples of applications of this method are presented, taken from work carried out over the last 21 years at the Radioanalytical Laboratory of the Department of Chemistry in the Greek Nuclear Research Center "Demokritos". Improved and faster radiochemical NAA methods have been developed for the determination of Au, Ni, Cl, As, Cu, U, Cr, Eu, Hg and Mo in several materials, and for the simultaneous determination of Br and I; Mg, Sr and Ni; As and Cu; As, Sb and Hg; Mn, Sr and Ba; Cd and Zn; Se and As; Mo and Cr in biological materials. Instrumental NAA methods have also been developed for the determination of Ag, Cl and Na in lake waters, Al, Ca, Mg and V in wines, 7 trace elements in biological materials, 17 trace elements in sediments and 20 minor and trace elements in ceramics. A comprehensive computer program for routine activation analysis using Ge(Li) detectors has been worked out. A rather extended charged-particle activation analysis program has been carried out over the last 10 years, including particle induced X-ray emission (PIXE) analysis, particle induced prompt gamma-ray emission analysis (PIGE), other nuclear reactions and proton activation analysis. A special neutron activation method, the delayed fission neutron counting method, is used for the analysis of fissionable elements, such as U, Th and Pu, in samples of the whole nuclear fuel cycle including geological, enriched and nuclear safeguards samples

  6. Distributed analysis in ATLAS

    CERN Document Server

    Dewhurst, Alastair; The ATLAS collaboration

    2015-01-01

    The ATLAS experiment accumulated more than 140 PB of data during the first run of the Large Hadron Collider (LHC) at CERN. The analysis of such an amount of data for the distributed physics community is a challenging task. The Distributed Analysis (DA) system of the ATLAS experiment is an established and stable component of the ATLAS distributed computing operations. About half a million user jobs run daily on DA resources, submitted by more than 1500 ATLAS physicists. The reliability of the DA system during the first run of the LHC and the following shutdown period has been high thanks to the continuous automatic validation of the distributed analysis sites and the user support provided by a dedicated team of expert shifters. During the LHC shutdown, the ATLAS computing model has undergone several changes to improve the analysis workflows, including the re-design of the production system, a new analysis data format and event model, and the development of common reduction and analysis frameworks. We r...

  7. Integrated genetic analysis microsystems

    International Nuclear Information System (INIS)

    Lagally, Eric T; Mathies, Richard A

    2004-01-01

    With the completion of the Human Genome Project and the ongoing DNA sequencing of the genomes of other animals, bacteria, plants and others, a wealth of new information about the genetic composition of organisms has become available. However, as the demand for sequence information grows, so does the workload required both to generate this sequence and to use it for targeted genetic analysis. Microfabricated genetic analysis systems are well poised to assist in the collection and use of these data through increased analysis speed, lower analysis cost and higher parallelism leading to increased assay throughput. In addition, such integrated microsystems may point the way to targeted genetic experiments on single cells and in other areas that are otherwise very difficult. Concomitant with these advantages, such systems, when fully integrated, should be capable of forming portable systems for high-speed in situ analyses, enabling a new standard in disciplines such as clinical chemistry, forensics, biowarfare detection and epidemiology. This review will discuss the various technologies available for genetic analysis on the microscale, and efforts to integrate them to form fully functional robust analysis devices. (topical review)

  8. Professionalizing Intelligence Analysis

    Directory of Open Access Journals (Sweden)

    James B. Bruce

    2015-09-01

    Full Text Available This article examines the current state of professionalism in national security intelligence analysis in the U.S. Government. Since the introduction of major intelligence reforms directed by the Intelligence Reform and Terrorism Prevention Act (IRTPA in December, 2004, we have seen notable strides in many aspects of intelligence professionalization, including in analysis. But progress is halting, uneven, and by no means permanent. To consolidate its gains, and if it is to continue improving, the U.S. intelligence community (IC should commit itself to accomplishing a new program of further professionalization of analysis to ensure that it will develop an analytic cadre that is fully prepared to deal with the complexities of an emerging multipolar and highly dynamic world that the IC itself is forecasting. Some recent reforms in intelligence analysis can be assessed against established standards of more fully developed professions; these may well fall short of moving the IC closer to the more fully professionalized analytical capability required for producing the kind of analysis needed now by the United States.

  9. Harmonic and geometric analysis

    CERN Document Server

    Citti, Giovanna; Pérez, Carlos; Sarti, Alessandro; Zhong, Xiao

    2015-01-01

    This book presents an expanded version of four series of lectures delivered by the authors at the CRM. Harmonic analysis, understood in a broad sense, has a very wide interplay with partial differential equations and in particular with the theory of quasiconformal mappings and its applications. Some areas in which real analysis has been extremely influential are PDE's and geometric analysis. Their foundations and subsequent developments made extensive use of the Calderón–Zygmund theory, especially the Lp inequalities for Calderón–Zygmund operators (Beurling transform and Riesz transform, among others) and the theory of Muckenhoupt weights.  The first chapter is an application of harmonic analysis and the Heisenberg group to understanding human vision, while the second and third chapters cover some of the main topics on linear and multilinear harmonic analysis. The last serves as a comprehensive introduction to a deep result from De Giorgi, Moser and Nash on the regularity of elliptic partial differen...

  10. Biosensors for Cell Analysis.

    Science.gov (United States)

    Zhou, Qing; Son, Kyungjin; Liu, Ying; Revzin, Alexander

    2015-01-01

    Biosensors first appeared several decades ago to address the need for monitoring physiological parameters such as oxygen or glucose in biological fluids such as blood. More recently, a new wave of biosensors has emerged in order to provide more nuanced and granular information about the composition and function of living cells. Such biosensors exist at the confluence of technology and medicine and often strive to connect cell phenotype or function to physiological or pathophysiological processes. Our review aims to describe some of the key technological aspects of biosensors being developed for cell analysis. The technological aspects covered in our review include biorecognition elements used for biosensor construction, methods for integrating cells with biosensors, approaches to single-cell analysis, and the use of nanostructured biosensors for cell analysis. Our hope is that the spectrum of possibilities for cell analysis described in this review may pique the interest of biomedical scientists and engineers and may spur new collaborations in the area of using biosensors for cell analysis.

  11. Flow Latency Analysis with the Architecture Analysis & Design Language (AADL)

    National Research Council Canada - National Science Library

    Feiler, Peter; Hansson, Jorgen

    2007-01-01

    .... The latency analysis framework and calculations are illustrated in the context of an example model that uses the flow specification notation of the Architecture Analysis & Design Language (AADL...

  12. Reactor safety analysis

    International Nuclear Information System (INIS)

    Arien, B.

    1998-01-01

    Risk assessments of nuclear installations require accurate safety and reliability analyses to estimate the consequences of accidental events and their probability of occurrence. The objective of the work performed in this field at the Belgian Nuclear Research Centre SCK-CEN is to develop expertise in probabilistic and deterministic reactor safety analysis. The four main activities of the research project on reactor safety analysis are: (1) the development of software for the reliable analysis of large systems; (2) the development of an expert system for the aid to diagnosis; (3) the development and the application of a probabilistic reactor-dynamics method; and (4) participation in the international PHEBUS-FP programme for severe accidents. Progress in research during 1997 is described

  13. LIGO data analysis

    International Nuclear Information System (INIS)

    Shawhan, P.S.

    2003-01-01

    Gravitational waves promise to provide new information about massive astrophysical objects in the universe. Technological advances and engineering experience have finally made it feasible to construct detectors with sufficient sensitivity to detect the extremely weak waves which are believed to reach Earth. The Laser Interferometer Gravitational-Wave Observatory (LIGO) project has constructed two 'observatories' in the United States which are poised to begin collecting scientifically interesting data. The LIGO Data Analysis System has been designed to support various types of scientific analysis to be performed using this data; its components include an interface to the raw data archive, a 'Beowulf' cluster of PCs for parallel processing, and a database to store data analysis products. Attention has also been paid to the deployment of client interface programs and utility software to scientists throughout the LIGO Scientific Collaboration

  14. COI Structural Analysis Presentation

    Science.gov (United States)

    Cline, Todd; Stahl, H. Philip (Technical Monitor)

    2001-01-01

    This report discusses the structural analysis of the Next Generation Space Telescope Mirror System Demonstrator (NMSD) developed by Composite Optics Incorporated (COI) in support of the Next Generation Space Telescope (NGST) project. The mirror was submitted to Marshall Space Flight Center (MSFC) for cryogenic testing and evaluation. Once at MSFC, the mirror was lowered to approximately 40 K and the optical surface distortions were measured. Alongside this experiment, an analytical model was developed and used to compare to the test results. A NASTRAN finite element model was provided by COI and a thermal model was developed from it. Using the thermal model, steady state nodal temperatures were calculated based on the predicted environment of the large cryogenic test chamber at MSFC. This temperature distribution was applied in the structural analysis to solve for the deflections of the optical surface. Finally, these deflections were submitted for optical analysis and comparison to the interferometer test data.

  15. Photon activation analysis

    International Nuclear Information System (INIS)

    Segebade, C.; Weise, H.P.; Lutz, G.J.

    1988-01-01

    This book is written to give, in a concentrated form, an overview of the application of photonuclear reactions to activation analysis. It is intended to accompany the analyst's work in the photon activation analysis laboratory as a practical, usable reference. Emphasis is placed upon analytical qualitative and quantitative data which are based upon experimentally obtained results. Therefore, both a source of general information on photon activation analysis and a laboratory manual are combined in this book. The results of the authors' laboratory work and a large amount of literature data are evaluated and presented as completely as possible by the authors. Special knowledge of photonuclear physics is not required; only a very elementary theoretical introduction is given. More detailed information on the physical and mathematical theory should be sought in the special literature which is cited in the relevant chapters. (orig./RB)

  16. Exercises in analysis

    CERN Document Server

    Gasiński, Leszek

    2016-01-01

    This second of two Exercises in Analysis volumes covers problems in five core topics of mathematical analysis: Function Spaces, Nonlinear and Multivalued Maps, Smooth and Nonsmooth Calculus, Degree Theory and Fixed Point Theory, and Variational and Topological Methods. Each of five topics corresponds to a different chapter with inclusion of the basic theory and accompanying main definitions and results, followed by suitable comments and remarks for better understanding of the material. Exercises/problems are presented for each topic, with solutions available at the end of each chapter. The entire collection of exercises offers a balanced and useful picture for the application surrounding each topic. This nearly encyclopedic coverage of exercises in mathematical analysis is the first of its kind and is accessible to a wide readership. Graduate students will find the collection of problems valuable in preparation for their preliminary or qualifying exams as well as for testing their deeper understanding of the ...

  17. The data analysis handbook

    CERN Document Server

    Frank, IE

    1994-01-01

    Analyzing observed or measured data is an important step in applied sciences. The recent increase in computer capacity has resulted in a revolution both in data collection and data analysis. An increasing number of scientists, researchers and students are venturing into statistical data analysis; hence the need for more guidance in this field, which was previously dominated mainly by statisticians. This handbook fills the gap in the range of textbooks on data analysis. Written in a dictionary format, it will serve as a comprehensive reference book in a rapidly growing field. However, this book is more structured than an ordinary dictionary, where each entry is a separate, self-contained entity. The authors provide not only definitions and short descriptions, but also offer an overview of the different topics. Therefore, the handbook can also be used as a companion to textbooks for undergraduate or graduate courses. 1700 entries are given in alphabetical order grouped into 20 topics and each topic is organized...

  18. Fuzzy data analysis

    CERN Document Server

    Bandemer, Hans

    1992-01-01

    Fuzzy data such as marks, scores, verbal evaluations, imprecise observations, experts' opinions and grey tone pictures, are quite common. In Fuzzy Data Analysis the authors collect their recent results providing the reader with ideas, approaches and methods for processing such data when looking for sub-structures in knowledge bases for an evaluation of functional relationship, e.g. in order to specify diagnostic or control systems. The modelling presented uses ideas from fuzzy set theory and the suggested methods solve problems usually tackled by data analysis if the data are real numbers. Fuzzy Data Analysis is self-contained and is addressed to mathematicians oriented towards applications and to practitioners in any field of application who have some background in mathematics and statistics.

  19. Badge Office Process Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Haurykiewicz, John Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Dinehart, Timothy Grant [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Parker, Robert Young [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-12

    The purpose of this process analysis was to analyze the Badge Offices’ current processes from a systems perspective and consider ways of pursuing objectives set forth by SEC-PS, namely increased customer flow (throughput) and reduced customer wait times. Information for the analysis was gathered for the project primarily through Badge Office Subject Matter Experts (SMEs), and in-person observation of prevailing processes. Using the information gathered, a process simulation model was constructed to represent current operations and allow assessment of potential process changes relative to factors mentioned previously. The overall purpose of the analysis was to provide SEC-PS management with information and recommendations to serve as a basis for additional focused study and areas for potential process improvements in the future.

  20. Psychotherapeutic discourse analysis.

    Science.gov (United States)

    Lewis, B

    1995-01-01

    Psychotherapeutic discourse analysis is a significant, largely unexplored, tool for psychotherapists and an ideal data site for linguists. Increased multidisciplinary research in this area would be particularly fruitful. Conversations, including therapeutic conversations, are far from transparent conduits of information from one person to another. Complicating surface communication is "metacommunication," which takes the form of unconscious conversational styles--culturally influenced rules, norms, and expectations of how a conversation should proceed. If the therapist and the patient are working with different conversational styles, then blocks in communication and understanding are inevitable. Through discourse analysis conversational styles can become more available to awareness for use in the service of therapeutic goals. This paper is an example of a discourse analysis of an individual therapy session. I attempt to bridge the gap between broad psychodynamic treatment strategies and minute micro-conversational strategies that can be used to enhance the therapeutic process.

  1. Foundations of VISAR analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Dolan, Daniel H.

    2006-06-01

    The Velocity Interferometer System for Any Reflector (VISAR) is a widely used diagnostic at Sandia National Laboratories. Although the operating principles of the VISAR are well established, recently deployed systems (such as the fast push-pull and air delay VISAR) require more careful consideration, and many common assumptions about VISAR are coming into question. This report presents a comprehensive review of VISAR analysis to address these issues. Detailed treatment of several interferometer configurations is given to identify important aspects of the operation and characterization of VISAR systems. The calculation of velocity from interferometer measurements is also described. The goal is to derive the standard VISAR analysis relationships, indicate when these relationships are valid, and provide alternative methods when the standard analysis fails.

  2. Pitfalls of Exergy Analysis

    Science.gov (United States)

    Vágner, Petr; Pavelka, Michal; Maršík, František

    2017-04-01

    The well-known Gouy-Stodola theorem states that a device produces maximum useful power when working reversibly, that is with no entropy production inside the device. This statement then leads to a method of thermodynamic optimization based on entropy production minimization. Exergy destruction (difference between exergy of fuel and exhausts) is also given by entropy production inside the device. Therefore, assessing efficiency of a device by exergy analysis is also based on the Gouy-Stodola theorem. However, assumptions that had led to the Gouy-Stodola theorem are not satisfied in several optimization scenarios, e.g. non-isothermal steady-state fuel cells, where both entropy production minimization and exergy analysis should be used with caution. We demonstrate, using non-equilibrium thermodynamics, a few cases where entropy production minimization and exergy analysis should not be applied.
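
    For reference, the Gouy-Stodola theorem referred to here is usually written as below (standard textbook form; T_0 denotes the ambient, or dead-state, temperature — notation assumed here, not taken from the paper):

```latex
% Gouy-Stodola theorem: lost useful power (exergy destruction rate)
% equals ambient temperature times the entropy production rate.
\dot{W}_{\mathrm{lost}} \;=\; \dot{W}_{\mathrm{rev}} - \dot{W} \;=\; T_0\,\dot{S}_{\mathrm{gen}}
\qquad\Longleftrightarrow\qquad
\dot{E}x_{\mathrm{destroyed}} \;=\; T_0\,\dot{S}_{\mathrm{gen}}
```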

  3. Real mathematical analysis

    CERN Document Server

    Pugh, Charles C

    2015-01-01

    Based on an honors course taught by the author at UC Berkeley, this introduction to undergraduate real analysis gives a different emphasis by stressing the importance of pictures and hard problems. Topics include: a natural construction of the real numbers, four-dimensional visualization, basic point-set topology, function spaces, multivariable calculus via differential forms (leading to a simple proof of the Brouwer Fixed Point Theorem), and a pictorial treatment of Lebesgue theory. Over 150 detailed illustrations elucidate abstract concepts and salient points in proofs. The exposition is informal and relaxed, with many helpful asides, examples, some jokes, and occasional comments from mathematicians, such as Littlewood, Dieudonné, and Osserman. This book thus succeeds in being more comprehensive, more comprehensible, and more enjoyable, than standard introductions to analysis. New to the second edition of Real Mathematical Analysis is a presentation of Lebesgue integration done almost entirely using the un...

  4. Principles of harmonic analysis

    CERN Document Server

    Deitmar, Anton

    2014-01-01

    This book offers a complete and streamlined treatment of the central principles of abelian harmonic analysis: Pontryagin duality, the Plancherel theorem and the Poisson summation formula, as well as their respective generalizations to non-abelian groups, including the Selberg trace formula. The principles are then applied to spectral analysis of Heisenberg manifolds and Riemann surfaces. This new edition contains a new chapter on p-adic and adelic groups, as well as a complementary section on direct and projective limits. Many of the supporting proofs have been revised and refined. The book is an excellent resource for graduate students who wish to learn and understand harmonic analysis and for researchers seeking to apply it.

  5. Mathematical analysis II

    CERN Document Server

    Canuto, Claudio

    2015-01-01

    The purpose of the volume is to provide a support textbook for a second lecture course on Mathematical Analysis. The contents are organised to suit, in particular, students of Engineering, Computer Science and Physics, all areas in which mathematical tools play a crucial role. The basic notions and methods concerning integral and differential calculus for multivariable functions, series of functions and ordinary differential equations are presented in a manner that elicits critical reading and prompts a hands-on approach to concrete applications. The pedagogical layout echoes the one used in the companion text Mathematical Analysis I. The book’s structure has a specifically-designed modular nature, which allows for great flexibility in the preparation of a lecture course on Mathematical Analysis. The style privileges clarity in the exposition and a linear progression through the theory. The material is organised on two levels. The first, reflected in this book, allows students to grasp the essential ideas, ...

  6. Mathematical analysis I

    CERN Document Server

    Zorich, Vladimir A

    2015-01-01

    VLADIMIR A. ZORICH is professor of mathematics at Moscow State University. His areas of specialization are analysis, conformal geometry, quasiconformal mappings, and mathematical aspects of thermodynamics. He solved the problem of global homeomorphism for space quasiconformal mappings. He holds a patent in the technology of mechanical engineering, and he is also known by his book Mathematical Analysis of Problems in the Natural Sciences . This second English edition of a very popular two-volume work presents a thorough first course in analysis, leading from real numbers to such advanced topics as differential forms on manifolds; asymptotic methods; Fourier, Laplace, and Legendre transforms; elliptic functions; and distributions. Especially notable in this course are the clearly expressed orientation toward the natural sciences and the informal exploration of the essence and the roots of the basic concepts and theorems of calculus. Clarity of exposition is matched by a wealth of instructive exercises, problems...

  7. Physics analysis workstation

    International Nuclear Information System (INIS)

    Johnstad, H.

    1989-06-01

    The Physics Analysis Workstation (PAW) is a high-level program providing data presentation and statistical or mathematical analysis. PAW has been developed at CERN as an instrument to assist physicists in the analysis and presentation of their data. The program is interfaced to a high level graphics package, based on basic underlying graphics. 3-D graphics capabilities are being implemented. The major objects in PAW are 1 or 2 dimensional binned event data with fixed number of entries per event, vectors, functions, graphics pictures, and macros. Command input is handled by an integrated user interface package, which allows for a variety of choices for input, either with typed commands, or in a tree structure menu driven mode. 6 refs., 1 fig

  8. Behavior analysis in Brasil

    Directory of Open Access Journals (Sweden)

    João Cláudio Todorov

    2006-01-01

    Full Text Available The history of behavior analysis in Brazil began with the visit of Fred S. Keller as a Fulbright Scholar to the University of São Paulo in 1961. Keller introduced Skinner's work to Brazilian psychologists. His first assistant was Carolina Martuscelli Bori, then a social psychologist influenced by the work of Kurt Lewin. Initially guided by Keller, Carolina Bori was the major force in the diffusion of Behavior Analysis in Brazil, beginning with the psychology course of the University of Brasília, where the first course on Experimental Analysis of Behavior began in August of 1964. Most behavior analysts in Brazil today were students, directly or indirectly, of Carolina Bori. Several graduate programs throughout the country offer courses in behavior analysis.

  9. Physics Analysis Workstation

    International Nuclear Information System (INIS)

    Johnstad, H.

    1989-01-01

    The Physics Analysis Workstation (PAW) is a high-level program providing data presentation and statistical or mathematical analysis. PAW has been developed at CERN as an instrument to assist physicists in the analysis and presentation of their data. The program is interfaced to a high-level graphics package, based on basic underlying graphics. 3-D graphics capabilities are being implemented. The major objects in PAW are 1 or 2 dimensional binned event data with fixed number of entries per event, vectors, functions, graphics pictures, and macros. Command input is handled by an integrated user interface package, which allows for a variety of choices for input, either with typed commands, or in a tree structure menu driven mode

  10. Analysis of Waves

    DEFF Research Database (Denmark)

    Frigaard, Peter; Andersen, Thomas Lykke

    The present book describes the most important aspects of wave analysis techniques applied to physical model tests. Moreover, the book serves as technical documentation for the wave analysis software WaveLab 3, cf. Aalborg University (2012). In that respect it should be mentioned that, supplementary to the present technical documentation, there exists also an online help document describing the WaveLab software in detail, including all the input and output fields. In addition to the two main authors, also Tue Hald, Jacob Helm-Petersen and Morten Møller Jakobsen have contributed to the note. Their input is highly acknowledged. The outline of the book is as follows: • Chapters 2 and 3 describe analysis of waves in the time and frequency domain. • Chapters 4 and 5 describe the separation of incident and reflected waves for the two-dimensional case. • Chapter 6 describes the estimation of the directional spectra which also...
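
    As an illustration of the incident/reflected separation treated in Chapters 4 and 5, the sketch below solves, at a single frequency, the linear two-gauge system η̂(x) = A_I e^(−ikx) + A_R e^(+ikx) for two wave gauges. This is a minimal sketch of classical two-gauge separation (in the spirit of Goda-and-Suzuki-type methods), not the WaveLab 3 implementation; gauge positions, water depth and the synthetic signal are all invented.

```python
import numpy as np

# Minimal two-gauge separation of incident/reflected regular waves (illustrative only).
g, h = 9.81, 0.7                      # gravity [m/s^2], water depth [m] (invented)
x1, x2 = 0.0, 0.3                     # gauge positions [m] (invented)
fs, T = 50.0, 40.0                    # sampling rate [Hz], record length [s]
t = np.arange(0.0, T, 1.0 / fs)

def wavenumber(omega: float) -> float:
    """Solve the linear dispersion relation omega^2 = g*k*tanh(k*h) by fixed point."""
    k = omega**2 / g                  # deep-water first guess
    for _ in range(100):
        k = omega**2 / (g * np.tanh(k * h))
    return k

# Synthetic test signal: one incident and one reflected component at f0 = 0.8 Hz.
f0, aI, aR = 0.8, 0.05, 0.02
w0 = 2 * np.pi * f0
k0 = wavenumber(w0)
eta1 = aI * np.cos(w0 * t - k0 * x1) + aR * np.cos(w0 * t + k0 * x1)
eta2 = aI * np.cos(w0 * t - k0 * x2) + aR * np.cos(w0 * t + k0 * x2)

# FFT of both gauges; solve eta_hat(x) = A_I*exp(-i k x) + A_R*exp(+i k x) at f0.
E1, E2 = np.fft.rfft(eta1), np.fft.rfft(eta2)
freqs = np.fft.rfftfreq(len(t), 1.0 / fs)
i0 = int(np.argmin(np.abs(freqs - f0)))          # bin of the test frequency
k = wavenumber(2 * np.pi * freqs[i0])
M = np.array([[np.exp(-1j * k * x1), np.exp(1j * k * x1)],
              [np.exp(-1j * k * x2), np.exp(1j * k * x2)]])
AI, AR = np.linalg.solve(M, np.array([E1[i0], E2[i0]]))
scale = 2.0 / len(t)                              # FFT amplitude normalisation
print(f"incident amplitude  ~ {abs(AI) * scale:.3f} m (true {aI} m)")
print(f"reflected amplitude ~ {abs(AR) * scale:.3f} m (true {aR} m)")
```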

  11. Contamination analysis unit

    International Nuclear Information System (INIS)

    Gregg, H.R.; Meltzer, M.P.

    1996-01-01

    The portable Contamination Analysis Unit (CAU) measures trace quantities of surface contamination in real time. The detector head of the portable contamination analysis unit has an opening with an O-ring seal, one or more vacuum valves and a small mass spectrometer. With the valve closed, the mass spectrometer is evacuated with one or more pumps. The O-ring seal is placed against a surface to be tested and the vacuum valve is opened. Data is collected from the mass spectrometer and a portable computer provides contamination analysis. The CAU can be used to decontaminate and decommission hazardous and radioactive surfaces by measuring residual hazardous surface contamination, such as tritium and trace organics. It provides surface contamination data for research and development applications as well as real-time process control feedback for industrial cleaning operations and can be used to determine the readiness of a surface to accept bonding or coatings. 1 fig

  12. Non-commutative analysis

    CERN Document Server

    Jorgensen, Palle

    2017-01-01

    The book features new directions in analysis, with an emphasis on Hilbert space, mathematical physics, and stochastic processes. We interpret 'non-commutative analysis' broadly to include representations of non-Abelian groups and non-Abelian algebras, with emphasis on Lie groups and operator algebras (C* algebras and von Neumann algebras). A second theme is commutative and non-commutative harmonic analysis, spectral theory, operator theory and their applications. The list of topics includes shift invariant spaces, group action in differential geometry, and frame theory (over-complete bases) and their applications to engineering (signal processing and multiplexing), projective multi-resolutions, and free probability algebras. The book serves as an accessible introduction, offering a timeless presentation, attractive and accessible to students, both in mathematics and in neighboring fields.

  13. Neutron signal transfer analysis

    CERN Document Server

    Pleinert, H; Lehmann, E

    1999-01-01

    A new method called neutron signal transfer analysis has been developed for quantitative determination of hydrogenous distributions from neutron radiographic measurements. The technique is based on a model which describes the detector signal obtained in the measurement as a result of the action of three different mechanisms expressed by signal transfer functions. The explicit forms of the signal transfer functions are determined by Monte Carlo computer simulations and contain only the distribution as a variable. Therefore an unknown distribution can be determined from the detector signal by recursive iteration. This technique provides a simple and efficient tool for analysis of this type while also taking into account complex effects due to the energy dependency of neutron interaction and single and multiple scattering. Therefore this method provides an efficient tool for precise quantitative analysis using neutron radiography, as for example quantitative determination of moisture distributions in porous buil...
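
    The recursive determination mentioned above can be illustrated with a generic fixed-point iteration: given a forward model that maps a hydrogenous distribution to a detector signal, the distribution estimate is corrected until the predicted signal matches the measurement. The Python sketch below is purely illustrative; the forward model is a made-up smoothing-plus-attenuation operator, not the signal transfer functions of the paper.

```python
import numpy as np

# Illustrative inversion of a measured detector signal by fixed-point iteration.
# forward(): a made-up stand-in for the paper's signal transfer functions.
def forward(distribution: np.ndarray) -> np.ndarray:
    kernel = np.array([0.1, 0.2, 0.4, 0.2, 0.1])      # blur (scattering-like)
    blurred = np.convolve(distribution, kernel, mode="same")
    return 1.0 - np.exp(-blurred)                     # saturating, attenuation-like response

true_dist = np.clip(np.sin(np.linspace(0, np.pi, 64)), 0, None)   # invented "true" distribution
signal = forward(true_dist)                                        # synthetic measurement

estimate = np.zeros_like(signal)
for _ in range(200):
    residual = signal - forward(estimate)
    estimate = np.clip(estimate + 0.8 * residual, 0.0, None)       # damped correction, keep >= 0

print("max abs error:", float(np.max(np.abs(estimate - true_dist))))
```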

  14. Per Object statistical analysis

    DEFF Research Database (Denmark)

    2008-01-01

    This RS code is to do Object-by-Object analysis of each Object's sub-objects, e.g. statistical analysis of an object's individual image data pixels. Statistics, such as percentiles (so-called "quartiles"), are derived by the process, but the return of that can only be a Scene Variable, not an Object Variable. This procedure was developed in order to be able to export objects as ESRI shape data with the 90-percentile of the Hue of each object's pixels as an item in the shape attribute table. This procedure uses a sub-level single pixel chessboard segmentation, loops for each of the objects, and performs an analysis of the values of the object's pixels in MS-Excel. The shell of the procedure could also be used for purposes other than just the derivation of Object - Sub-object statistics, e.g. rule-based assignment processes.
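
    Outside the rule-set environment described above, the same per-object statistic can be computed directly in a general-purpose language: group pixel values by object label and take the 90th percentile per object. This is a generic re-implementation sketch in Python/NumPy with tiny invented arrays, not the original RS code.

```python
import numpy as np

# Per-object 90th percentile of a pixel attribute (e.g. hue), given a label image.
# Both arrays below are tiny invented examples.
labels = np.array([[1, 1, 2],
                   [1, 2, 2],
                   [3, 3, 3]])
hue = np.array([[0.10, 0.12, 0.55],
                [0.11, 0.60, 0.58],
                [0.90, 0.85, 0.88]])

def per_object_percentile(label_img: np.ndarray, value_img: np.ndarray, q: float = 90.0) -> dict:
    stats = {}
    for obj_id in np.unique(label_img):
        stats[int(obj_id)] = float(np.percentile(value_img[label_img == obj_id], q))
    return stats

print(per_object_percentile(labels, hue))   # e.g. {1: ..., 2: ..., 3: ...}
```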

  15. Life Cycle Inventory Analysis

    DEFF Research Database (Denmark)

    Bjørn, Anders; Moltesen, Andreas; Laurent, Alexis

    2017-01-01

    The inventory analysis is the third and often most time-consuming part of an LCA. The analysis is guided by the goal and scope definition, and its core activity is the collection and compilation of data on elementary flows from all processes in the studied product system(s), drawing on a combination of different sources. The output is a compiled inventory of elementary flows that is used as the basis of the subsequent life cycle impact assessment phase. This chapter teaches how to carry out this task through six steps: (1) identifying processes for the LCI model of the product system; (2) planning and collecting data; (3) constructing and quality checking unit processes; (4) constructing the LCI model and calculating LCI results; (5) preparing the basis for uncertainty management and sensitivity analysis; and (6) reporting.

  16. Real analysis on intervals

    CERN Document Server

    Choudary, A D R

    2014-01-01

    The book targets undergraduate and postgraduate mathematics students and helps them develop a deep understanding of mathematical analysis. Designed as a first course in real analysis, it helps students learn how abstract mathematical analysis solves mathematical problems that relate to the real world. As well as providing a valuable source of inspiration for contemporary research in mathematics, the book helps students read, understand and construct mathematical proofs, develop their problem-solving abilities and comprehend the importance and frontiers of computer facilities and much more. It offers comprehensive material for both seminars and independent study for readers with a basic knowledge of calculus and linear algebra. The first nine chapters followed by the appendix on the Stieltjes integral are recommended for graduate students studying probability and statistics, while the first eight chapters followed by the appendix on dynamical systems will be of use to students of biology and environmental scie...

  17. Basic real analysis

    CERN Document Server

    Sohrab, Houshang H

    2014-01-01

    This expanded second edition presents the fundamentals and touchstone results of real analysis in full rigor, but in a style that requires little prior familiarity with proofs or mathematical language. The text is a comprehensive and largely self-contained introduction to the theory of real-valued functions of a real variable. The chapters on Lebesgue measure and integral have been rewritten entirely and greatly improved. They now contain Lebesgue’s differentiation theorem as well as his versions of the Fundamental Theorem(s) of Calculus. With expanded chapters, additional problems, and an expansive solutions manual, Basic Real Analysis, Second Edition, is ideal for senior undergraduates and first-year graduate students, both as a classroom text and a self-study guide. Reviews of first edition: The book is a clear and well-structured introduction to real analysis aimed at senior undergraduate and beginning graduate students. The prerequisites are few, but a certain mathematical sophistication is required. ....

  18. Applied multivariate statistical analysis

    CERN Document Server

    Härdle, Wolfgang Karl

    2015-01-01

    Focusing on high-dimensional applications, this 4th edition presents the tools and concepts used in multivariate data analysis in a style that is also accessible for non-mathematicians and practitioners.  It surveys the basic principles and emphasizes both exploratory and inferential statistics; a new chapter on Variable Selection (Lasso, SCAD and Elastic Net) has also been added.  All chapters include practical exercises that highlight applications in different multivariate data analysis fields: in quantitative financial studies, where the joint dynamics of assets are observed; in medicine, where recorded observations of subjects in different locations form the basis for reliable diagnoses and medication; and in quantitative marketing, where consumers’ preferences are collected in order to construct models of consumer behavior.  All of these examples involve high to ultra-high dimensions and represent a number of major fields in big data analysis. The fourth edition of this book on Applied Multivariate ...

  19. Software safety hazard analysis

    International Nuclear Information System (INIS)

    Lawrence, J.D.

    1996-02-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper

  20. Activation analysis in numismatics

    International Nuclear Information System (INIS)

    Barrandon, J.N.

    1980-01-01

    It is shown that two nuclear methods permit a non-destructive determination of major, minor and trace elements in three important ''archaeological'' metals: gold, silver and copper, and their alloys. The first, neutron activation analysis with a 252 Cf neutron source, enables a fast and accurate determination of three important elements of a coin's composition, viz. gold, silver and copper. With the second, proton activation analysis, trace elements at the ppm level in gold, silver and copper can be determined. Using these two activation analysis techniques, two important numismatic problems can be studied: the evolution of fineness, and the characterization or differentiation, by trace elements, of the metal used to mint the coins. Examples of each numismatic problem are also given. (author)

  1. Digital Fourier analysis fundamentals

    CERN Document Server

    Kido, Ken'iti

    2015-01-01

    This textbook is a thorough, accessible introduction to digital Fourier analysis for undergraduate students in the sciences. Beginning with the principles of sine/cosine decomposition, the reader walks through the principles of discrete Fourier analysis before reaching the cornerstone of signal processing: the Fast Fourier Transform. Saturated with clear, coherent illustrations, "Digital Fourier Analysis - Fundamentals" includes practice problems and thorough Appendices for the advanced reader. As a special feature, the book includes interactive applets (available online) that mirror the illustrations.  These user-friendly applets animate concepts interactively, allowing the user to experiment with the underlying mathematics. For example, a real sine signal can be treated as a sum of clockwise and counter-clockwise rotating vectors. The applet illustration included with the book animates the rotating vectors and the resulting sine signal. By changing parameters such as amplitude and frequency, the reader ca...

  2. COMPUTER METHODS OF GENETIC ANALYSIS.

    Directory of Open Access Journals (Sweden)

    A. L. Osipov

    2017-02-01

    Full Text Available The article considers the basic statistical methods used in conducting the genetic analysis of human traits: segregation analysis, linkage analysis and allelic association studies. Software supporting the implementation of these methods has been developed.
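
    The allelic-association part of such an analysis typically reduces to a contingency-table test of allele counts in cases versus controls. A minimal Python sketch of that step (not the authors' software; the marker and counts are invented):

      from scipy.stats import chi2_contingency

      # Hypothetical allele counts for a biallelic marker (alleles A and a):
      # an allelic association test compares allele frequencies between groups.
      table = [[120, 80],   # cases:    allele A, allele a
               [ 95, 105]]  # controls: allele A, allele a

      chi2, p_value, dof, expected = chi2_contingency(table)
      print(f"chi2 = {chi2:.2f}, p = {p_value:.4f}")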

  3. Cost analysis guidelines

    International Nuclear Information System (INIS)

    Strait, R.S.

    1996-01-01

    The first phase of the Depleted Uranium Hexafluoride Management Program (Program)--management strategy selection--consists of several program elements: Technology Assessment, Engineering Analysis, Cost Analysis, and preparation of an Environmental Impact Statement (EIS). Cost Analysis will estimate the life-cycle costs associated with each of the long-term management strategy alternatives for depleted uranium hexafluoride (UF6). The scope of Cost Analysis will include all major expenditures, from the planning and design stages through decontamination and decommissioning. The costs will be estimated at a scoping or preconceptual design level and are intended to assist decision makers in comparing alternatives for further consideration. They will not be absolute costs or bid-document costs. The purpose of the Cost Analysis Guidelines is to establish a consistent approach to analyzing cost alternatives for managing the Department of Energy's (DOE's) stocks of depleted uranium hexafluoride (DUF6). The component modules that make up the DUF6 management program differ substantially in process options, requirements for R and D, equipment, facilities, regulatory compliance, operations and maintenance (O and M), and operations risk. To facilitate a consistent and equitable comparison of costs, the guidelines offer common definitions, assumptions or bases, and limitations, integrated with a standard approach to the analysis. Further, the goal is to evaluate total net life-cycle costs and display them in a way that gives DOE the capability to evaluate a variety of overall DUF6 management strategies, including commercial potential. The cost estimates reflect the preconceptual level of the designs. They will be appropriate for distinguishing among management strategies
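
    Comparisons of this kind usually come down to discounting each alternative's yearly cost stream to a single net life-cycle cost. The sketch below is a generic illustration with invented cost streams and discount rate; it is not the Program's cost model.

      # Discount a yearly cost stream (planning and design through D&D)
      # to a net present value so alternatives can be compared on one number.
      def net_present_value(costs_by_year, discount_rate):
          return sum(c / (1 + discount_rate) ** t for t, c in enumerate(costs_by_year))

      # Hypothetical cost streams (million $) for two management alternatives.
      alt_a = [10, 25, 25, 5, 5, 5, 40]     # higher up-front, lower operating costs
      alt_b = [5, 10, 10, 20, 20, 20, 30]   # lower up-front, higher operating costs
      for name, costs in [("Alternative A", alt_a), ("Alternative B", alt_b)]:
          print(name, f"NPV = {net_present_value(costs, 0.05):.1f} M$")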

  4. Flight Dynamics Analysis Branch

    Science.gov (United States)

    Stengle, Tom; Flores-Amaya, Felipe

    2000-01-01

    This report summarizes the major activities and accomplishments carried out by the Flight Dynamics Analysis Branch (FDAB), Code 572, in support of flight projects and technology development initiatives in fiscal year 2000. The report is intended to serve as a summary of the type of support carried out by the FDAB, as well as a concise reference of key accomplishments and mission experience derived from the various mission support roles. The primary focus of the FDAB is to provide expertise in the disciplines of flight dynamics, spacecraft trajectory, attitude analysis, and attitude determination and control. The FDAB currently provides support for missions and technology development projects involving NASA, government, university, and private industry.

  5. Introduction to real analysis

    CERN Document Server

    Schramm, Michael J

    2008-01-01

    This text forms a bridge between courses in calculus and real analysis. It focuses on the construction of mathematical proofs as well as their final content. Suitable for upper-level undergraduates and graduate students of real analysis, it also provides a vital reference book for advanced courses in mathematics.The four-part treatment begins with an introduction to basic logical structures and techniques of proof, including discussions of the cardinality concept and the algebraic and order structures of the real and rational number systems. Part Two presents in-depth examinations of the compl

  6. WATER SUPPLY ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    R.D. Clark

    1996-02-06

    This analysis defines and evaluates the surface water supply system from the existing J-13 well to the North Portal. This system includes the pipe running from J-13 to a proposed Booster Pump Station at the intersection of H Road and the North Portal access road. Contained herein is an analysis of the proposed Booster Pump Station with a brief description of the system that could be installed to the South Portal and the optional shaft. The tanks that supply the water to the North Portal are sized, and the supply system to the North Portal facilities and up to Topopah Spring North Ramp is defined.

  7. Real Time Text Analysis

    Science.gov (United States)

    Senthilkumar, K.; Ruchika Mehra Vijayan, E.

    2017-11-01

    This paper aims to illustrate real-time analysis of large-scale data. For practical implementation we perform sentiment analysis on live Twitter feeds, for each individual tweet. To analyze sentiments we train our data model on SentiWordNet, a polarity-assigned WordNet sample by Princeton University. Our main objective is to efficiently analyze large-scale data on the fly using distributed computation. The Apache Spark and Apache Hadoop ecosystem is used as the distributed computation platform, with Java as the development language.
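
    The core of such lexicon-based sentiment scoring can be sketched on a single machine; the example below uses Python with a toy polarity lexicon standing in for SentiWordNet (the record's actual stack is Java on Spark/Hadoop, and the words and scores here are invented).

      # Toy polarity lexicon standing in for SentiWordNet-style scores
      # (positive values = positive sentiment, negative = negative).
      lexicon = {"good": 0.7, "great": 0.9, "bad": -0.6, "awful": -0.9, "delay": -0.3}

      def tweet_score(text):
          # Sum the polarity of known words; unknown words contribute 0.
          words = (w.strip(".,!?") for w in text.lower().split())
          return sum(lexicon.get(w, 0.0) for w in words)

      for tweet in ["Great service, very good!", "Awful delay, not happy"]:
          score = tweet_score(tweet)
          label = "positive" if score > 0 else "negative" if score < 0 else "neutral"
          print(f"{label:8s} ({score:+.2f})  {tweet}")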

  8. Analysis of maintenance strategies

    International Nuclear Information System (INIS)

    Laakso, K.; Simola, K.

    1998-01-01

    The main topics of the presentation include: (1) an analysis model and methods to evaluate maintenance action programs and to support decisions on changes to them, and (2) understanding maintenance strategies from a systems perspective as a basis for future developments. The subproject showed how systematic models for maintenance analysis and decision support, utilising computerised and statistical tool packages, can be taken into use for the evaluation and optimisation of maintenance of active systems from the safety and economic points of view

  9. Spectral analysis by correlation

    International Nuclear Information System (INIS)

    Fauque, J.M.; Berthier, D.; Max, J.; Bonnet, G.

    1969-01-01

    The spectral density of a signal, which represents its power distribution along the frequency axis, is a function which is of great importance, finding many uses in all fields concerned with the processing of the signal (process identification, vibrational analysis, etc.). Amongst all the possible methods for calculating this function, the correlation method (correlation function calculation + Fourier transformation) is the most promising, mainly because of its simplicity and of the results it yields. The study carried out here will lead to the construction of an apparatus which, coupled with a correlator, will constitute a set of equipment for spectral analysis in real time covering the frequency range 0 to 5 MHz. (author) [fr]
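
    The correlation route to the spectral density (estimate the autocorrelation, then Fourier-transform it, per the Wiener-Khinchin theorem) can be sketched numerically as follows; the test signal, sampling rate and parameters are invented for illustration and are unrelated to the 0-5 MHz analogue apparatus described above.

      import numpy as np

      # Hypothetical test signal: a 50 Hz sine in white noise, sampled at 1 kHz.
      fs = 1000.0
      t = np.arange(0, 2.0, 1.0 / fs)
      rng = np.random.default_rng(0)
      x = np.sin(2 * np.pi * 50 * t) + 0.5 * rng.standard_normal(t.size)

      # Biased autocorrelation estimate for lags 0 .. N-1.
      n = x.size
      acf = np.correlate(x, x, mode="full")[n - 1:] / n

      # Wiener-Khinchin: the power spectral density is the Fourier
      # transform of the autocorrelation function.
      psd = np.abs(np.fft.rfft(acf)) / fs
      freqs = np.fft.rfftfreq(n, d=1.0 / fs)
      print("spectral peak near", freqs[np.argmax(psd)], "Hz")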

  10. Quantitative Hydrocarbon Surface Analysis

    Science.gov (United States)

    Douglas, Vonnie M.

    2000-01-01

    The elimination of ozone depleting substances, such as carbon tetrachloride, has resulted in the use of new analytical techniques for cleanliness verification and contamination sampling. The last remaining application at Rocketdyne which required a replacement technique was the quantitative analysis of hydrocarbons by infrared spectrometry. This application, which previously utilized carbon tetrachloride, was successfully modified using the SOC-400, a compact portable FTIR manufactured by Surface Optics Corporation. This instrument can quantitatively measure and identify hydrocarbons from solvent flush of hardware as well as directly analyze the surface of metallic components without the use of ozone depleting chemicals. Several sampling accessories are utilized to perform analysis for various applications.

  11. UCF WP TIPOVER ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Z. Ceylan

    1998-04-28

    The purpose of this analysis is to determine the structural response of the 21 pressurized water reactor (PWR) uncanistered fuel (UCF) waste package (WP) to a tipover design basis event (DBE) dynamic load; the results will be reported in terms of stress magnitudes. Finite-element solution was performed by making use of the commercially available ANSYS finite-element code. A finite-element model of the waste package was developed and analyzed for a tipover DBE dynamic load. The results of this analysis were provided in tables and were also plotted in terms of the maximum stress contours to determine their locations.

  12. Contamination Analysis Tools

    Science.gov (United States)

    Brieda, Lubos

    2015-01-01

    This talk presents 3 different tools developed recently for contamination analysis: (1) HTML QCM analyzer: runs in a web browser, and allows for data analysis of QCM log files; (2) Java RGA extractor: can load in multiple SRS.ana files and extract pressure vs. time data; (3) C++ Contamination Simulation code: 3D particle tracing code for modeling transport of dust particulates and molecules. Uses residence time to determine if molecules stick. Particulates can be sampled from IEST-STD-1246 and be accelerated by aerodynamic forces.

  13. WATER SUPPLY ANALYSIS

    International Nuclear Information System (INIS)

    Clark, R.D.

    1996-01-01

    This analysis defines and evaluates the surface water supply system from the existing J-13 well to the North Portal. This system includes the pipe running from J-13 to a proposed Booster Pump Station at the intersection of H Road and the North Portal access road. Contained herein is an analysis of the proposed Booster Pump Station with a brief description of the system that could be installed to the South Portal and the optional shaft. The tanks that supply the water to the North Portal are sized, and the supply system to the North Portal facilities and up to Topopah Spring North Ramp is defined

  14. Elements of real analysis

    CERN Document Server

    Sprecher, David A

    2010-01-01

    This classic text in introductory analysis delineates and explores the intermediate steps between the basics of calculus and the ultimate stage of mathematics: abstraction and generalization.Since many abstractions and generalizations originate with the real line, the author has made it the unifying theme of the text, constructing the real number system from the point of view of a Cauchy sequence (a step which Dr. Sprecher feels is essential to learn what the real number system is).The material covered in Elements of Real Analysis should be accessible to those who have completed a course in

  15. Signal flow analysis

    CERN Document Server

    Abrahams, J R; Hiller, N

    1965-01-01

    Signal Flow Analysis provides information pertinent to the fundamental aspects of signal flow analysis. This book discusses the basic theory of signal flow graphs and shows their relation to the usual algebraic equations.Organized into seven chapters, this book begins with an overview of properties of a flow graph. This text then demonstrates how flow graphs can be applied to a wide range of electrical circuits that do not involve amplification. Other chapters deal with the parameters as well as circuit applications of transistors. This book discusses as well the variety of circuits using ther

  16. Highdimensional data analysis

    CERN Document Server

    Cai, Tony

    2010-01-01

    Over the last few years, significant developments have been taking place in high-dimensional data analysis, driven primarily by a wide range of applications in many fields such as genomics and signal processing. In particular, substantial advances have been made in the areas of feature selection, covariance estimation, classification and regression. This book intends to examine important issues arising from high-dimensional data analysis to explore key ideas for statistical inference and prediction. It is structured around topics on multiple hypothesis testing, feature selection, regression, cla

  17. Notes on functional analysis

    CERN Document Server

    Bhatia, Rajendra

    2009-01-01

    These notes are a record of a one semester course on Functional Analysis given by the author to second year Master of Statistics students at the Indian Statistical Institute, New Delhi. Students taking this course have a strong background in real analysis, linear algebra, measure theory and probability, and the course proceeds rapidly from the definition of a normed linear space to the spectral theorem for bounded selfadjoint operators in a Hilbert space. The book is organised as twenty six lectures, each corresponding to a ninety minute class session. This may be helpful to teachers planning a course on this topic. Well prepared students can read it on their own.

  18. Analysis in Euclidean space

    CERN Document Server

    Hoffman, Kenneth

    2007-01-01

    Developed for an introductory course in mathematical analysis at MIT, this text focuses on concepts, principles, and methods. Its introductions to real and complex analysis are closely formulated, and they constitute a natural introduction to complex function theory.Starting with an overview of the real number system, the text presents results for subsets and functions related to Euclidean space of n dimensions. It offers a rigorous review of the fundamentals of calculus, emphasizing power series expansions and introducing the theory of complex-analytic functions. Subsequent chapters cover seq

  19. Amino acid analysis

    Science.gov (United States)

    Winitz, M.; Graff, J. (Inventor)

    1974-01-01

    The process and apparatus for qualitative and quantitative analysis of the amino acid content of a biological sample are presented. The sample is deposited on a cation exchange resin and then is washed with suitable solvents. The amino acids and various cations and organic material with a basic function remain on the resin. The resin is eluted with an acid eluant, and the eluate containing the amino acids is transferred to a reaction vessel where the eluant is removed. Final analysis of the purified acylated amino acid esters is accomplished by gas-liquid chromatographic techniques.

  20. Flow Analysis Software Toolkit

    Science.gov (United States)

    Watson, Velvin; Castagnera, Karen; Plessel, Todd; Merritt, Fergus; Kelaita, Paul; West, John; Sandstrom, Tim; Clucas, Jean; Globus, AL; Bancroft, Gordon

    1993-01-01

    Flow Analysis Software Toolkit (FAST) computer program provides software environment facilitating visualization of data. Collection of separate programs (modules) running simultaneously and helps user to examine results of numerical and experimental simulations. Intended for graphical depiction of computed flows, also assists in analysis of other types of data. Combines capabilities of such programs as PLOT3D, RIP, SURF, and GAS into one software environment with modules sharing data. All modules have consistent, highly interactive graphical user interface. Modular construction makes it flexible and extensible. Environment custom-configured, and new modules developed and added as needed. Written in ANSI compliant FORTRAN 77 and C language.

  1. Incident analysis methodology

    International Nuclear Information System (INIS)

    Libmann, J.

    1986-05-01

    The number of French nuclear power stations in operation and their division into standardized plant series very soon led to the requirement for a precise organization, within both the nuclear safety authorities and the operator, Electricite de France. The methods of analysis have been gradually extended and diversified, and we shall describe them here, but it is evident that a very precise definition is needed of the boundaries between what concerns safety and what does not. This report first deals with the criteria on which declarations are based before outlining the main guidelines of the analysis methodology [fr]

  2. Harmonic analysis and applications

    CERN Document Server

    Heil, Christopher

    2007-01-01

    This self-contained volume in honor of John J. Benedetto covers a wide range of topics in harmonic analysis and related areas. These include weighted-norm inequalities, frame theory, wavelet theory, time-frequency analysis, and sampling theory. The chapters are clustered by topic to provide authoritative expositions that will be of lasting interest. The original papers collected are written by prominent researchers and professionals in the field. The book pays tribute to John J. Benedetto's achievements and expresses an appreciation for the mathematical and personal inspiration he has given to

  3. Foundations of stochastic analysis

    CERN Document Server

    Rao, M M; Lukacs, E

    1981-01-01

    Foundations of Stochastic Analysis deals with the foundations of the theory of Kolmogorov and Bochner and its impact on the growth of stochastic analysis. Topics covered range from conditional expectations and probabilities to projective and direct limits, as well as martingales and likelihood ratios. Abstract martingales and their applications are also discussed. Comprised of five chapters, this volume begins with an overview of the basic Kolmogorov-Bochner theorem, followed by a discussion on conditional expectations and probabilities containing several characterizations of operators and mea

  4. Gait Analysis Laboratory

    Science.gov (United States)

    1976-01-01

    Complete motion analysis laboratory has evolved out of analyzing walking patterns of crippled children at Stanford Children's Hospital. Data is collected by placing tiny electrical sensors over muscle groups of child's legs and inserting step-sensing switches in soles of shoes. Miniature radio transmitters send signals to receiver for continuous recording of abnormal walking pattern. Engineers are working to apply space electronics miniaturization techniques to reduce size and weight of telemetry system further as well as striving to increase signal bandwidth so analysis can be performed faster and more accurately using a mini-computer.

  5. Beginning statistics with data analysis

    CERN Document Server

    Mosteller, Frederick; Rourke, Robert EK

    2013-01-01

    This introduction to the world of statistics covers exploratory data analysis, methods for collecting data, formal statistical inference, and techniques of regression and analysis of variance. 1983 edition.

  6. Electronic Circuit Analysis Language (ECAL)

    Science.gov (United States)

    Chenghang, C.

    1983-03-01

    The computer aided design technique is an important development in computer applications and it is an important component of computer science. The special language for electronic circuit analysis is the foundation of computer aided design or computer aided circuit analysis (abbreviated as CACD and CACA) of simulated circuits. Electronic circuit analysis language (ECAL) is a comparatively simple and easy-to-use circuit analysis special language which uses the FORTRAN language to carry out the explanatory executions. It is capable of conducting dc analysis, ac analysis, and transient analysis of a circuit. Furthermore, the results of the dc analysis can be used directly as the initial conditions for the ac and transient analyses.
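
    ECAL itself is FORTRAN-based; the core of any dc analysis of this kind is the solution of the nodal equations G v = i. A small, hypothetical illustration of that step in Python (a two-node resistive circuit, not an ECAL example):

      import numpy as np

      # DC nodal analysis of a hypothetical circuit: a 1 A source into node 1,
      # R1 = 2 ohm between nodes 1 and 2, R2 = 4 ohm from node 1 to ground,
      # R3 = 8 ohm from node 2 to ground.
      g1, g2, g3 = 1 / 2, 1 / 4, 1 / 8       # conductances in siemens

      G = np.array([[g1 + g2, -g1],          # nodal conductance matrix
                    [-g1,      g1 + g3]])
      i = np.array([1.0, 0.0])               # injected currents

      v = np.linalg.solve(G, i)              # node voltages in volts
      print("node voltages:", v)             # dc values like these can seed the ac/transient runs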

  7. Orthonormal polynomials in wavefront analysis: error analysis.

    Science.gov (United States)

    Dai, Guang-Ming; Mahajan, Virendra N

    2008-07-01

    Zernike circle polynomials are in widespread use for wavefront analysis because of their orthogonality over a circular pupil and their representation of balanced classical aberrations. However, they are not appropriate for noncircular pupils, such as annular, hexagonal, elliptical, rectangular, and square pupils, due to their lack of orthogonality over such pupils. We emphasize the use of orthonormal polynomials for such pupils, but we show how to obtain the Zernike coefficients correctly. We illustrate that the wavefront fitting with a set of orthonormal polynomials is identical to the fitting with a corresponding set of Zernike polynomials. This is a consequence of the fact that each orthonormal polynomial is a linear combination of the Zernike polynomials. However, since the Zernike polynomials do not represent balanced aberrations for a noncircular pupil, the Zernike coefficients lack the physical significance that the orthonormal coefficients provide. We also analyze the error that arises if Zernike polynomials are used for noncircular pupils by treating them as circular pupils and illustrate it with numerical examples.
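
    The central point, that each orthonormal polynomial is a linear combination of Zernike circle polynomials, can be checked numerically by Gram-Schmidt orthonormalizing a few low-order Zernike terms over samples of a non-circular pupil; the annular pupil, the chosen terms and the test wavefront below are assumptions made only for this sketch.

      import numpy as np

      # Sample an annular pupil (obscuration ratio 0.5) on a Cartesian grid.
      y, x = np.mgrid[-1:1:201j, -1:1:201j]
      r = np.hypot(x, y)
      pupil = (r <= 1.0) & (r >= 0.5)
      xs, ys = x[pupil], y[pupil]

      # A few low-order Zernike circle polynomials (piston, tilts, defocus).
      Z = np.column_stack([np.ones_like(xs), xs, ys, 2 * (xs**2 + ys**2) - 1])

      # Orthonormalize over the annular pupil samples (QR = Gram-Schmidt);
      # each column of Q is a linear combination of the Zernike columns.
      Q, R = np.linalg.qr(Z)
      Q *= np.sqrt(Z.shape[0])   # scale so the columns are orthonormal in the mean

      # Least-squares fits of a test wavefront with either basis give the
      # same fitted wavefront, as stated in the abstract.
      w = 0.3 * xs + 0.1 * (2 * (xs**2 + ys**2) - 1)
      fit_Z = Z @ np.linalg.lstsq(Z, w, rcond=None)[0]
      fit_Q = Q @ np.linalg.lstsq(Q, w, rcond=None)[0]
      print("max difference between the two fits:", np.abs(fit_Z - fit_Q).max())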

  8. Trajectory Based Traffic Analysis

    DEFF Research Database (Denmark)

    Krogh, Benjamin Bjerre; Andersen, Ove; Lewis-Kelham, Edwin

    2013-01-01

    -and-click analysis, due to a novel and efficient indexing structure. With the web-site daisy.aau.dk/its/spqdemo/we will demonstrate several analyses, using a very large real-world data set consisting of 1.9 billion GPS records (1.5 million trajectories) recorded from more than 13000 vehicles, and touching most...

  9. Static analysis for blinding

    DEFF Research Database (Denmark)

    Nielsen, Christoffer Rosenkilde; Nielson, Hanne Riis

    2006-01-01

    operation blinding. In this paper we study the theoretical foundations for one of the successful approaches to validating cryptographic protocols and we extend it to handle the blinding primitive. Our static analysis approach is based on Flow Logic; this gives us a clean separation between the specification...

  10. Russian Language Analysis Project

    Science.gov (United States)

    Serianni, Barbara; Rethwisch, Carolyn

    2011-01-01

    This paper is the result of a language analysis research project focused on the Russian Language. The study included a diverse literature review that included published materials as well as online sources in addition to an interview with a native Russian speaker residing in the United States. Areas of study include the origin and history of the…

  11. Text analysis in R

    NARCIS (Netherlands)

    Welbers, K.; van Atteveldt, W.H.; Benoit, K.

    2017-01-01

    Computational text analysis has become an exciting research field with many applications in communication research. It can be a difficult method to apply, however, because it requires knowledge of various techniques, and the software required to perform most of these techniques is not readily

  12. Digital Systems Analysis

    Science.gov (United States)

    Martin, Vance S.

    2009-01-01

    There have been many attempts to understand how the Internet affects our modern world. There have also been numerous attempts to understand specific areas of the Internet. This article applies Immanuel Wallerstein's World Systems Analysis to our informationalist society. Understanding this world as divided among individual core, semi-periphery,…

  13. PROOF Analysis Framework (PAF)

    Science.gov (United States)

    Delgado Fernández, J.; Fernández del Castillo, E.; González Caballero, I.; Rodríguez Marrero, A.

    2015-12-01

    The PROOF Analysis Framework (PAF) has been designed to improve the ability of the physicist to develop software for the final stages of an analysis, where typically simple ROOT Trees are used and where the amount of data used is of the order of several terabytes. It hides the technicalities of dealing with PROOF, leaving the scientist to concentrate on the analysis. PAF is capable of using available non-specific resources on, for example, local batch systems, remote grid sites or clouds through the integration of other toolkits like PROOF Cluster or PoD. While it has been successfully used on LHC Run-1 data for some key analyses, including the H→WW dilepton channel, the higher instantaneous and integrated luminosity, together with the increase of the center-of-mass energy foreseen for LHC Run-2, which will increase the total size of the samples by a factor of 6 to 20, will require PAF to improve its scalability and to reduce latencies as much as possible. In this paper we address the possible problems of processing such big data volumes with PAF and the solutions implemented to overcome them. We will also show the improvements made in order to make PAF more modular and accessible to other communities.

  14. Manual for subject analysis

    International Nuclear Information System (INIS)

    2002-01-01

    This document is one in a series of publications known as the ETDE/INIS Joint Reference Series and also constitutes a part of the ETDE Procedures Manual. It presents the rules, guidelines and procedures to be adopted by centers submitting input to the International Nuclear Information System (INIS) or the Energy Technology Data Exchange (ETDE). It is a manual for the subject analysis part of input preparation, meaning the selection, subject classification, abstracting and subject indexing of relevant publications, and is to be used in conjunction with the Thesauruses, Subject Categories documents and the documents providing guidelines for the preparation of abstracts. The concept and structure of the new manual are intended to describe in a logical and efficient sequence all the steps comprising the subject analysis of documents to be reported to INIS or ETDE. The manual includes new chapters on preparatory analysis, subject classification, abstracting and subject indexing, as well as rules, guidelines, procedures, examples and a special chapter on guidelines and examples for subject analysis in particular subject fields. (g.t.; a.n.)

  15. Doxing: a conceptual analysis

    NARCIS (Netherlands)

    Douglas, David

    2016-01-01

    Doxing is the intentional public release onto the Internet of personal information about an individual by a third party, often with the intent to humiliate, threaten, intimidate, or punish the identified individual. In this paper I present a conceptual analysis of the practice of doxing and how it

  16. Texture analysis of

    NARCIS (Netherlands)

    Lubsch, A.; Timmermans, K.

    2017-01-01

    Texture analysis is a method to test the physical properties of a material by tension and compression. The growing interest in commercialisation of seaweeds for human food has stimulated research into the physical properties of seaweed tissue. These are important parameters for the survival of

  17. Instrumental analysis, second edition

    International Nuclear Information System (INIS)

    Christian, G.D.; O'Reilly, J.E.

    1988-01-01

    The second edition of Instrumental Analysis is a survey of the major instrument-based methods of chemical analysis. It appears to be aimed at undergraduates but would be equally useful in a graduate course. The volume explores all of the classical quantitative methods and contains sections on techniques that usually are not included in a semester course in instrumentation (such as electron spectroscopy and the kinetic methods). Adequate coverage of all of the methods contained in this book would require several semesters of focused study. The 25 chapters were written by different authors, yet the style throughout the book is more uniform than in the earlier edition. With the exception of a two-chapter course in analog and digital circuits, the book purports to de-emphasize instrumentation, focusing more on the theory behind the methods and the application of the methods to analytical problems. However, a detailed analysis of the instruments used in each method is by no means absent. The book has the flavor of a user's guide to analysis.

  18. Analysis of numerical methods

    CERN Document Server

    Isaacson, Eugene

    1994-01-01

    This excellent text for advanced undergraduates and graduate students covers norms, numerical solution of linear systems and matrix factoring, iterative solutions of nonlinear equations, eigenvalues and eigenvectors, polynomial approximation, and other topics. It offers a careful analysis and stresses techniques for developing new methods, plus many examples and problems. 1966 edition.

  19. Information Security Risk Analysis

    CERN Document Server

    Peltier, Thomas R

    2010-01-01

    Offers readers the knowledge and the skill set needed to achieve a highly effective risk analysis assessment. This title demonstrates how to identify threats and then determine if those threats pose a real risk. It is suitable for industry and academia professionals.

  20. Grey component analysis

    NARCIS (Netherlands)

    Westerhuis, J.A.; Derks, E.P.P.A.; Hoefsloot, H.C.J.; Smilde, A.K.

    2007-01-01

    The interpretation of principal component analysis (PCA) models of complex biological or chemical data can be cumbersome because in PCA the decomposition is performed without any knowledge of the system at hand. Prior information of the system is not used to improve the interpretation. In this paper

  1. Electronic Analysis of Communication.

    Science.gov (United States)

    Baggaley, Jon

    1982-01-01

    Discusses the use of microcomputer-based testing methods in media and community research, with descriptions of the Programme Evaluation Analysis Computer (PEAC) developed for the Ontario Education Communications Authority and of the application of the PEAC system in a study of second-by-second responses to Orson Welles'"War of the…

  2. Mathematical analysis in India

    Indian Academy of Sciences (India)

    Mathematics being too vast a canvas, this writing has been broken down further into sub-disciplines. More specifically, regarding work in mathematical analysis, three people – namely Alladi Sitaram, V S Sunder and M Vanninathan – were entrusted with the task of coming up with something representative, subject to some ...

  3. Eleven papers in analysis

    CERN Document Server

    Shabalin, P L; Yakubenko, A A; Pokhilevich, VA; Krein, M G

    1986-01-01

    This collection of eleven papers covers a broad spectrum of topics in analysis, from the study of certain classes of analytic functions to the solvability of singular problems for differential and integral equations to computational schemes for the partial differential equations and singular integral equations.

  4. Android Security Analysis

    Science.gov (United States)

    2016-03-01

    Android Security Analysis Final Report. Michael Peck, Gananand Kini, Andrew Pyles. March 2016. Sponsor: NSA. Dept. No.: J83H. Contract No.: W56KGU... Contents include: Addressing Android App Vulnerabilities with the Android Lint Checker; App Data-in-Transit Vulnerabilities due to Insecure Certificate Validation or ...

  5. Advanced biomedical image analysis

    CERN Document Server

    Haidekker, Mark A

    2010-01-01

    "This book covers the four major areas of image processing: Image enhancement and restoration, image segmentation, image quantification and classification, and image visualization. Image registration, storage, and compression are also covered. The text focuses on recently developed image processing and analysis operators and covers topical research"--Provided by publisher.

  6. Neutron Multiplicity Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Frame, Katherine Chiyoko [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-06-28

    Neutron multiplicity measurements are widely used for nondestructive assay (NDA) of special nuclear material (SNM). When combined with isotopic composition information, neutron multiplicity analysis can be used to estimate the spontaneous fission rate and leakage multiplication of SNM. When combined with isotopic information, the total mass of fissile material can also be determined. This presentation provides an overview of this technique.

  7. a meta-analysis.

    African Journals Online (AJOL)

    In the subgroup analysis stratified by IBD type, a significant association was found in Crohn's disease (CD) (CT vs. CC: OR=0.68, 95% CI 0.48-0.97). ... Relationship between the IL-10 gene -819C/T polymorphism and the risk of inflammatory bowel disease: a meta-analysis ... migration inhibitory factor and cytotoxic T lymphocyte associated ...

  8. Social network analysis

    NARCIS (Netherlands)

    de Nooy, W.; Crothers, C.

    2009-01-01

    Social network analysis (SNA) focuses on the structure of ties within a set of social actors, e.g., persons, groups, organizations, and nations, or the products of human activity or cognition such as web sites, semantic concepts, and so on. It is linked to structuralism in sociology stressing the

  9. Process energy analysis

    International Nuclear Information System (INIS)

    Kaiser, V.

    1993-01-01

    In Chapter 2 process energy cost analysis for chemical processing is treated in a general way, independent of the specific form of energy and power production. Especially, energy data collection and data treatment, energy accounting (metering, balance setting), specific energy input, and utility energy costs and prices are discussed. (R.P.) 14 refs., 4 figs., 16 tabs

  10. Bayesian logistic regression analysis

    NARCIS (Netherlands)

    Van Erp, H.R.N.; Van Gelder, P.H.A.J.M.

    2012-01-01

    In this paper we present a Bayesian logistic regression analysis. It is found that if one wishes to derive the posterior distribution of the probability of some event, then, together with the traditional Bayes Theorem and the integrating out of nuisance parameters, the Jacobian transformation is an

  11. Sensitivity Analysis Without Assumptions.

    Science.gov (United States)

    Ding, Peng; VanderWeele, Tyler J

    2016-05-01

    Unmeasured confounding may undermine the validity of causal inference with observational studies. Sensitivity analysis provides an attractive way to partially circumvent this issue by assessing the potential influence of unmeasured confounding on causal conclusions. However, previous sensitivity analysis approaches often make strong and untestable assumptions such as having an unmeasured confounder that is binary, or having no interaction between the effects of the exposure and the confounder on the outcome, or having only one unmeasured confounder. Without imposing any assumptions on the unmeasured confounder or confounders, we derive a bounding factor and a sharp inequality such that the sensitivity analysis parameters must satisfy the inequality if an unmeasured confounder is to explain away the observed effect estimate or reduce it to a particular level. Our approach is easy to implement and involves only two sensitivity parameters. Surprisingly, our bounding factor, which makes no simplifying assumptions, is no more conservative than a number of previous sensitivity analysis techniques that do make assumptions. Our new bounding factor implies not only the traditional Cornfield conditions that both the relative risk of the exposure on the confounder and that of the confounder on the outcome must satisfy but also a high threshold that the maximum of these relative risks must satisfy. Furthermore, this new bounding factor can be viewed as a measure of the strength of confounding between the exposure and the outcome induced by a confounder.
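
    Concretely, writing RR_EU for the relative risk of the exposure on the confounder and RR_UD for that of the confounder on the outcome, the bounding factor is B = RR_EU * RR_UD / (RR_EU + RR_UD - 1), and the confounder-adjusted risk ratio can be no smaller than the observed risk ratio divided by B. A small illustrative Python computation (the observed risk ratio and sensitivity parameters are invented):

      def bounding_factor(rr_eu, rr_ud):
          # Ding & VanderWeele bounding factor for the two sensitivity parameters.
          return rr_eu * rr_ud / (rr_eu + rr_ud - 1)

      rr_obs = 2.0                          # hypothetical observed risk ratio
      for rr_eu, rr_ud in [(1.5, 1.5), (2.0, 2.0), (3.0, 3.0)]:
          b = bounding_factor(rr_eu, rr_ud)
          print(f"RR_EU={rr_eu}, RR_UD={rr_ud}: B={b:.2f}, "
                f"adjusted RR >= {rr_obs / b:.2f}")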

  12. Writing proofs in analysis

    CERN Document Server

    Kane, Jonathan M

    2016-01-01

    This is a textbook on proof writing in the area of analysis, balancing a survey of the core concepts of mathematical proof with a tight, rigorous examination of the specific tools needed for an understanding of analysis. Instead of the standard "transition" approach to teaching proofs, wherein students are taught fundamentals of logic, given some common proof strategies such as mathematical induction, and presented with a series of well-written proofs to mimic, this textbook teaches what a student needs to be thinking about when trying to construct a proof. Covering the fundamentals of analysis sufficient for a typical beginning Real Analysis course, it never loses sight of the fact that its primary focus is about proof writing skills. This book aims to give the student precise training in the writing of proofs by explaining exactly what elements make up a correct proof, how one goes about constructing an acceptable proof, and, by learning to recognize a correct proof, how to avoid writing incorrect proofs. T...

  13. Behavior Analysis for Elderly

    NARCIS (Netherlands)

    Salah, A.A.; Kröse, B.J.A.; Cook, D.J.; Salah, A.A.; Kröse, B.J.A.; Cook, D.J.

    2015-01-01

    Ubiquitous computing, new sensor technologies, and increasingly available and accessible algorithms for pattern recognition and machine learning enable automatic analysis and modeling of human behavior in many novel ways. In this introductory paper of the 6th International Workshop on Human Behavior

  14. Functional Object Analysis

    DEFF Research Database (Denmark)

    Raket, Lars Lau

    -effect formulations, where the observed functional signal is assumed to consist of both fixed and random functional effects. This thesis takes the initial steps toward the development of likelihood-based methodology for functional objects. We first consider analysis of functional data defined on high...

  15. Safety analysis for 'Fugen'

    International Nuclear Information System (INIS)

    1997-10-01

    The improvement of safety in nuclear power stations is an important proposition. Therefore, also as to the safety evaluation, it is important to execute it comprehensively and systematically, referring to operational experience and to new knowledge which is important for safety, throughout the period of use as well as before the construction and the start of operation of nuclear power stations. In this report, the results obtained when the safety analysis for ''Fugen'' was carried out by referring to the newest technical knowledge are described. As the result, it was confirmed that the safety of ''Fugen'' has been secured by its inherent safety and by the facilities which were designed for securing safety. The basic way of thinking on the safety analysis, including the guidelines to be conformed to, is mentioned. As to abnormal transient changes in operation and accidents, their definitions, the events to be evaluated and the standards for judgement are reported. The matters which were taken into consideration at the time of the analysis are shown. The computation programs used for the analysis were REACT, HEATUP, LAYMON, FATRAC, SENHOR, LOTRAC, FLOOD and CONPOL. The analyses of abnormal transient changes in operation and of accidents are reported in terms of their causes, countermeasures, protective functions and results. (K.I.)

  16. Python data analysis

    CERN Document Server

    Idris, Ivan

    2014-01-01

    This book is for programmers, scientists, and engineers who have knowledge of the Python language and know the basics of data science. It is for those who wish to learn different data analysis methods using Python and its libraries. This book contains all the basic ingredients you need to become an expert data analyst.

  17. Analysis and probability

    CERN Document Server

    Spataru, Aurel

    2013-01-01

    Probability theory is a rapidly expanding field and is used in many areas of science and technology. Beginning from a basis of abstract analysis, this mathematics book develops the knowledge needed for advanced students to develop a complex understanding of probability. The first part of the book systematically presents concepts and results from analysis before embarking on the study of probability theory. The initial section will also be useful for those interested in topology, measure theory, real analysis and functional analysis. The second part of the book presents the concepts, methodology and fundamental results of probability theory. Exercises are included throughout the text, not just at the end, to teach each concept fully as it is explained, including presentations of interesting extensions of the theory. The complete and detailed nature of the book makes it ideal as a reference book or for self-study in probability and related fields. It covers a wide range of subjects including f-expansions, Fuk-N...

  18. Retrospective landscape analysis

    DEFF Research Database (Denmark)

    Fritzbøger, Bo

    2011-01-01

    On the basis of maps from the 18th and 19th centuries, a retrospective analysis was carried out of documentary settlement and landscape data extending back to the Middle Ages with the intention of identifying and dating general structural and dynamic features of the cultural landscape in a select...

  19. Multiscale principal component analysis

    International Nuclear Information System (INIS)

    Akinduko, A A; Gorban, A N

    2014-01-01

    Principal component analysis (PCA) is an important tool in exploring data. The conventional approach to PCA leads to a solution which favours the structures with large variances. This is sensitive to outliers and could obfuscate interesting underlying structures. One of the equivalent definitions of PCA is that it seeks the subspaces that maximize the sum of squared pairwise distances between data projections. This definition opens up more flexibility in the analysis of principal components which is useful in enhancing PCA. In this paper we introduce scales into PCA by maximizing only the sum of pairwise distances between projections for pairs of datapoints with distances within a chosen interval of values [l,u]. The resulting principal component decompositions in Multiscale PCA depend on point (l,u) on the plane and for each point we define projectors onto principal components. Cluster analysis of these projectors reveals the structures in the data at various scales. Each structure is described by the eigenvectors at the medoid point of the cluster which represent the structure. We also use the distortion of projections as a criterion for choosing an appropriate scale especially for data with outliers. This method was tested on both artificial distribution of data and real data. For data with multiscale structures, the method was able to reveal the different structures of the data and also to reduce the effect of outliers in the principal component analysis
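
    The pairwise-distance characterization of PCA quoted above is easy to verify numerically: the first principal direction maximizes the sum of squared pairwise distances between the one-dimensional projections. The toy data and comparison directions below are invented, and the sketch covers only the unrestricted (all-pairs) case, not the [l,u]-restricted Multiscale PCA itself.

      import numpy as np

      rng = np.random.default_rng(1)
      # Toy 2-D data with a dominant direction along the x-axis.
      X = rng.standard_normal((300, 2)) @ np.array([[3.0, 0.0], [0.0, 1.0]])
      X -= X.mean(axis=0)

      def sum_sq_pairwise(proj):
          # Sum of squared pairwise distances of 1-D projections;
          # for zero-mean projections this equals n * sum(proj**2).
          d = proj[:, None] - proj[None, :]
          return 0.5 * np.sum(d ** 2)

      # First principal direction from the covariance eigendecomposition.
      evals, evecs = np.linalg.eigh(np.cov(X.T))
      pc1 = evecs[:, np.argmax(evals)]

      for name, u in [("PC1", pc1),
                      ("45 deg", np.array([1.0, 1.0]) / np.sqrt(2)),
                      ("y-axis", np.array([0.0, 1.0]))]:
          print(f"{name:7s} sum of squared pairwise distances: {sum_sq_pairwise(X @ u):.1f}")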

  20. Euler principal component analysis

    NARCIS (Netherlands)

    Liwicki, Stephan; Tzimiropoulos, Georgios; Zafeiriou, Stefanos; Pantic, Maja

    Principal Component Analysis (PCA) is perhaps the most prominent learning tool for dimensionality reduction in pattern recognition and computer vision. However, the ℓ2-norm employed by standard PCA is not robust to outliers. In this paper, we propose a kernel PCA method for fast and robust PCA,

  1. Work Analysis and Expertise.

    Science.gov (United States)

    1997

    This document contains four papers from a symposium on work analysis and expertise. "A Taxonomy of Employee Development: Toward an Organizational Culture of Expertise" (Ronald L. Jacobs) includes five categories of employee competence: novice, specialist, experienced specialist, expert, and master. "An Integrated Needs…

  2. Fluid dynamic transient analysis

    International Nuclear Information System (INIS)

    Vilhena Reigosa, R. de

    1992-01-01

    This paper describes the methodology adopted at NUCLEN for the fluid dynamic analyses for ANGRA 2. The fluid dynamic analysis allows, through computer codes, the simulation and quantification of the loads resulting from fluid dynamic transients caused by postulated ruptures or operational transients in the piping of the safety systems and of the important operational systems. (author)

  3. Advanced Economic Analysis

    Science.gov (United States)

    Greenberg, Marc W.; Laing, William

    2013-01-01

    An Economic Analysis (EA) is a systematic approach to the problem of choosing the best method of allocating scarce resources to achieve a given objective. An EA helps guide decisions on the "worth" of pursuing an action that departs from status quo ... an EA is the crux of decision-support.

  4. VENTILATION TECHNOLOGY SYSTEMS ANALYSIS

    Science.gov (United States)

    The report gives results of a project to develop a systems analysis of ventilation technology and provide a state-of-the-art assessment of ventilation and indoor air quality (IAQ) research needs. (NOTE: Ventilation technology is defined as the hardware necessary to bring outdoor ...

  5. Rotation in correspondence analysis

    NARCIS (Netherlands)

    van de Velden, Michel; Kiers, Henk A.L.

    2005-01-01

    In correspondence analysis rows and columns of a nonnegative data matrix are depicted as points in a, usually, two-dimensional plot. Although such a two-dimensional plot often provides a reasonable approximation, the situation can occur that an approximation of higher dimensionality is required.

  6. Learning Haskell data analysis

    CERN Document Server

    Church, James

    2015-01-01

    If you are a developer, analyst, or data scientist who wants to learn data analysis methods using Haskell and its libraries, then this book is for you. Prior experience with Haskell and a basic knowledge of data science will be beneficial.

  7. Introductory real analysis

    CERN Document Server

    Kolmogorov, A N; Silverman, Richard A

    1975-01-01

    Self-contained and comprehensive, this elementary introduction to real and functional analysis is readily accessible to those with background in advanced calculus. It covers basic concepts and introductory principles in set theory, metric spaces, topological and linear spaces, linear functionals and linear operators, and much more. 350 problems. 1970 edition.

  8. Radiological analysis of

    African Journals Online (AJOL)

    Dr Olaleye

    Crustacean and Sediment Samples from Fresh and Marine Water in Oil Exploration Area of Ondo State, Nigeria. J. A. Ademola1* and S. I. Ehiedu2. 1. Department of Physics, University of Ibadan, Ibadan, Nigeria. 2. National Institute of Radiation Protection and Research, Ibadan, Nigeria. ABSTRACT: Radiological analysis ...

  9. [Analysis of pacemaker ECGs].

    Science.gov (United States)

    Israel, Carsten W; Ekosso-Ejangue, Lucy; Sheta, Mohamed-Karim

    2015-09-01

    The key to a successful analysis of a pacemaker electrocardiogram (ECG) is the application of the systematic approach used for any other ECG without a pacemaker: analysis of (1) basic rhythm and rate, (2) QRS axis, (3) PQ, QRS and QT intervals, (4) morphology of P waves, QRS, ST segments and T(U) waves and (5) the presence of arrhythmias. If only the most obvious abnormality of a pacemaker ECG is considered, wrong conclusions can easily be drawn. If a systematic approach is skipped it may be overlooked that e.g. atrial pacing is ineffective, the left ventricle is paced instead of the right ventricle, pacing competes with intrinsic conduction or that the atrioventricular (AV) conduction time is programmed too long. Apart from this analysis, a pacemaker ECG which is not clear should be checked for the presence of arrhythmias (e.g. atrial fibrillation, atrial flutter, junctional escape rhythm and endless loop tachycardia), pacemaker malfunction (e.g. atrial or ventricular undersensing or oversensing, atrial or ventricular loss of capture) and activity of specific pacing algorithms, such as automatic mode switching, rate adaptation, AV delay modifying algorithms, reaction to premature ventricular contractions (PVC), safety window pacing, hysteresis and noise mode. A systematic analysis of the pacemaker ECG almost always allows a probable diagnosis of arrhythmias and malfunctions to be made, which can be confirmed by pacemaker control and can often be corrected at the touch of the right button to the patient's benefit.

  10. Policy Analysis and Decisionmaking,

    Science.gov (United States)

    1980-01-01

    on "PoI icy Analysis as appl ied to Hiqh-Level Decision-Makinq,’ orciani,’t, ard ,ponsored by Orinoquia Asociacion Civil , in Caracas. Vvelc/t l...substantive and tactical c. Rand’s experience (the new Institute for Civil Justice as a change) i

  11. CMS Analysis School Model

    Energy Technology Data Exchange (ETDEWEB)

    Malik, S. [Nebraska U.; Shipsey, I. [Purdue U.; Cavanaugh, R. [Illinois U., Chicago; Bloom, K. [Nebraska U.; Chan, Kai-Feng [Taiwan, Natl. Taiwan U.; D' Hondt, J. [Vrije U., Brussels; Klima, B. [Fermilab; Narain, M. [Brown U.; Palla, F. [INFN, Pisa; Rolandi, G. [CERN; Schörner-Sadenius, T. [DESY

    2014-01-01

    To impart hands-on training in physics analysis, the CMS experiment initiated the concept of the CMS Data Analysis School (CMSDAS). It was born over three years ago at the LPC (LHC Physics Centre) at Fermilab and is based on earlier workshops held at the LPC and the CLEO experiment. As CMS transitioned from construction to data-taking mode, the nature of the earlier training also evolved to include more analysis tools, software tutorials and physics analysis. This effort, epitomized as CMSDAS, has proven to be key for new and young physicists to jump-start their contributions to the physics goals of CMS by looking for new physics with the collision data. With over 400 physicists trained in six CMSDAS schools around the globe, CMS is trying to engage the collaboration in its discovery potential and to maximize physics output. As a bigger goal, CMS is striving to nurture and increase the engagement of its myriad talents in the development of physics, service, upgrades, and the education of those new to CMS, as well as the career development of younger members. An extension of the concept to dedicated software and hardware schools is also planned, keeping in mind the ensuing upgrade phase.

  12. Experimental Analysis of Algorithms.

    Science.gov (United States)

    1987-12-01

    analysis tools for benchmark studies of heuristics for the Traveling Salesman Problem. Crowder, Dembo, and Mulvey [17] discuss issues in the presentation of... H. P. Crowder, R. S. Dembo, and J. M. Mulvey. Reporting computational experiments in mathematical programming. Mathematical Programming 15:316-329

  13. Statistical Analysis Plan

    DEFF Research Database (Denmark)

    Ris Hansen, Inge; Søgaard, Karen; Gram, Bibi

    2015-01-01

    This is the analysis plan for the multicentre randomised controlled study looking at the effect of training and exercises in chronic neck pain patients that is being conducted in Jutland and Funen, Denmark. This plan will be used as a work description for the analyses of the data collected....

  14. Applied Behavior Analysis

    Science.gov (United States)

    Szapacs, Cindy

    2006-01-01

    Teaching strategies that work for typically developing children often do not work for those diagnosed with an autism spectrum disorder. However, teaching strategies that work for children with autism do work for typically developing children. In this article, the author explains how the principles and concepts of Applied Behavior Analysis can be…

  15. Visual Interactive Analysis

    DEFF Research Database (Denmark)

    Kirchmeier-Andersen, Sabine; Møller Christensen, Jakob; Lihn Jensen, Bente

    2004-01-01

    This article presents the latest version of VIA (version 3.0). The development of the program was initiated by a demand for more systematic training of language analysis in high schools and universities. The system is now web-based, which enables teachers and students to share exercises across...

  16. Chapter 8. Data Analysis

    Science.gov (United States)

    Lyman L. McDonald; Christina D. Vojta; Kevin S. McKelvey

    2013-01-01

    Perhaps the greatest barrier between monitoring and management is data analysis. Data languish in drawers and spreadsheets because those who collect or maintain monitoring data lack training in how to effectively summarize and analyze their findings. This chapter serves as a first step to surmounting that barrier by empowering any monitoring team with the basic...

  17. Analysis of orthotropic beams

    Science.gov (United States)

    Jen Y. Liu; S. Cheng

    1979-01-01

    A plane-stress analysis of orthotropic or isotropic beams is presented. The loading conditions considered are: (1) a concentrated normal load arbitrarily located on the beam, and (2) a distributed normal load covering an arbitrary length of the beam. exhibit close agreement with existing experimental data from Sitka spruce beams. Other loading conditions can similarly...

  18. Benchmark risk analysis models

    NARCIS (Netherlands)

    Ale BJM; Golbach GAM; Goos D; Ham K; Janssen LAM; Shield SR; LSO

    2002-01-01

    A so-called benchmark exercise was initiated in which the results of five sets of tools available in the Netherlands would be compared. In the benchmark exercise a quantified risk analysis was performed on a hypothetical, non-existing hazardous establishment located at a randomly chosen location in

  19. Polysome Profile Analysis - Yeast

    Czech Academy of Sciences Publication Activity Database

    Pospíšek, M.; Valášek, Leoš Shivaya

    2013-01-01

    Roč. 530, č. 2013 (2013), s. 173-181 ISSN 0076-6879 Institutional support: RVO:61388971 Keywords : grow yeast cultures * polysome profile analysis * sucrose density gradient centrifugation Subject RIV: CE - Biochemistry Impact factor: 2.194, year: 2013

  20. ATLAS Distributed Analysis Tools

    CERN Document Server

    Gonzalez de la Hoz, Santiago; Liko, Dietrich

    2008-01-01

    The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale. Up to 10000 jobs were processed in one day. The experience obtained operating the system on several grid flavours was essential to perform a user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase data were registered in the LHC File Catalog (LFC) and replicated to external sites. For the main test, few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS management computing board decided to integrate the collaboration's efforts in distributed analysis into only one project, GANGA. The goal is to test the reconstruction and analysis software in a large-scale data production using Grid flavours in several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting a...

  1. Local Noncollinear Spin Analysis.

    Science.gov (United States)

    Abate, Bayileyegn A; Joshi, Rajendra P; Peralta, Juan E

    2017-12-12

    In this work, we generalize the local spin analysis of Clark and Davidson [J. Chem. Phys. 2001, 115 (16), 7382] for the partitioning of the expectation value of the molecular spin square operator, ⟨Ŝ²⟩, into atomic contributions, ⟨Ŝ_A·Ŝ_B⟩, to the noncollinear spin case in the framework of density functional theory (DFT). We derive the working equations, and we show applications to the analysis of the noncollinear spin solutions of typical spin-frustrated systems and to the calculation of magnetic exchange couplings. In the former case, we employ the triangular H3He3 test molecule and a Mn3 complex to show that the local spin analysis provides additional information that complements the standard one-particle spin population analysis. For the calculation of magnetic exchange couplings, J_AB, we employ the local spin partitioning to extract ⟨Ŝ_A·Ŝ_B⟩ as a function of the interatomic spin orientation given by the angle θ. This, combined with the dependence of the electronic energy on θ, provides a methodology to extract J_AB from DFT calculations that, in contrast to conventional energy-difference-based methods, does not require the use of ad hoc S_A and S_B values.
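
    As an illustrative aside (not code from the article): if one assumes the bilinear Heisenberg form E(θ) = E0 − 2·J_AB·⟨Ŝ_A·Ŝ_B⟩(θ), the coupling J_AB can be recovered by a linear fit of the electronic energy against the local spin correlation evaluated at several orientations θ. The numbers below are synthetic placeholders, not DFT output.

        # Hedged sketch: recover J_AB from E(theta) vs <S_A . S_B>(theta),
        # assuming the bilinear Heisenberg form E = E0 - 2 * J_AB * <S_A . S_B>.
        # The data points are synthetic placeholders, not results from the paper.
        import numpy as np

        # Local spin correlations and total energies (hartree) at several angles theta.
        s_corr = np.array([-2.25, -1.60, -0.75, 0.40, 1.10, 2.25])   # <S_A . S_B>
        e_tot = np.array([-150.0020, -150.0033, -150.0050,
                          -150.0073, -150.0087, -150.0110])

        # Linear least-squares fit: E = a * <S_A . S_B> + E0, with a = -2 * J_AB.
        a, e0 = np.polyfit(s_corr, e_tot, 1)
        j_ab = -a / 2.0
        print(f"fitted J_AB = {j_ab:.6f} hartree, E0 = {e0:.4f} hartree")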

  2. Spatial Data Analysis.

    Science.gov (United States)

    Banerjee, Sudipto

    2016-01-01

    With increasing accessibility to geographic information systems (GIS) software, statisticians and data analysts routinely encounter scientific data sets with geocoded locations. This has generated considerable interest in statistical modeling for location-referenced spatial data. In public health, spatial data routinely arise as aggregates over regions, such as counts or rates over counties, census tracts, or some other administrative delineation. Such data are often referred to as areal data. This review article provides a brief overview of statistical models that account for spatial dependence in areal data. It does so in the context of two applications: disease mapping and spatial survival analysis. Disease maps are used to highlight geographic areas with high and low prevalence, incidence, or mortality rates of a specific disease and the variability of such rates over a spatial domain. They can also be used to detect hot spots or spatial clusters that may arise owing to common environmental, demographic, or cultural effects shared by neighboring regions. Spatial survival analysis refers to the modeling and analysis for geographically referenced time-to-event data, where a subject is followed up to an event (e.g., death or onset of a disease) or is censored, whichever comes first. Spatial survival analysis is used to analyze clustered survival data when the clustering arises from geographical regions or strata. Illustrations are provided in these application domains.
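
    As a concrete, hedged illustration of quantifying spatial dependence in areal data (Moran's I is a standard spatial autocorrelation statistic, not code taken from the review; the adjacency matrix and rates below are invented):

        # Hedged sketch: Moran's I for areal (county-level) rates, a standard
        # measure of spatial autocorrelation. Adjacency and rates are invented.
        import numpy as np

        def morans_i(values, weights):
            """I = (n / S0) * sum_ij w_ij (x_i - m)(x_j - m) / sum_i (x_i - m)^2."""
            x = np.asarray(values, dtype=float)
            w = np.asarray(weights, dtype=float)
            dev = x - x.mean()
            num = dev @ w @ dev            # sum_ij w_ij (x_i - m)(x_j - m)
            den = (dev ** 2).sum()
            return (x.size / w.sum()) * num / den

        # Five hypothetical counties in a row: each is a neighbour of the next.
        adjacency = np.array([
            [0, 1, 0, 0, 0],
            [1, 0, 1, 0, 0],
            [0, 1, 0, 1, 0],
            [0, 0, 1, 0, 1],
            [0, 0, 0, 1, 0],
        ])
        rates = [12.0, 14.0, 15.0, 21.0, 23.0]   # e.g. disease rates per 10,000
        print(f"Moran's I = {morans_i(rates, adjacency):.3f}")   # > 0 suggests clustering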

  3. Haskell data analysis cookbook

    CERN Document Server

    Shukla, Nishant

    2014-01-01

    Step-by-step recipes filled with practical code samples and engaging examples demonstrate Haskell in practice, and then the concepts behind the code. This book shows functional developers and analysts how to leverage their existing knowledge of Haskell specifically for high-quality data analysis. A good understanding of data sets and functional programming is assumed.

  4. Analysis of indentation creep

    Science.gov (United States)

    Don S. Stone; Joseph E. Jakes; Jonathan Puthoff; Abdelmageed A. Elmustafa

    2010-01-01

    Finite element analysis is used to simulate cone indentation creep in materials across a wide range of hardness, strain rate sensitivity, and work-hardening exponent. Modeling reveals that the commonly held assumption of the hardness strain rate sensitivity (m_H) equaling the flow stress strain rate sensitivity (m_σ...
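
    A hedged numerical aside (not from the article): the hardness strain rate sensitivity m_H is commonly defined as the slope of ln(hardness) versus ln(indentation strain rate), so it can be estimated from a log-log linear fit of creep data such as the synthetic values below.

        # Hedged sketch: estimate m_H = d ln(H) / d ln(strain_rate) by a log-log fit.
        # The hardness/strain-rate pairs are synthetic, not data from the article.
        import numpy as np

        strain_rate = np.array([1e-4, 1e-3, 1e-2, 1e-1])   # 1/s
        hardness = np.array([1.00, 1.12, 1.26, 1.41])       # GPa

        m_H, ln_H0 = np.polyfit(np.log(strain_rate), np.log(hardness), 1)
        print(f"m_H ~ {m_H:.3f}")   # about 0.05 for these synthetic values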

  5. Uranium and transuranium analysis

    International Nuclear Information System (INIS)

    Regnaud, F.

    1989-01-01

    Analytical chemistry of uranium, neptunium, plutonium, americium and curium is reviewed. Uranium and neptunium are mainly treated and curium is only briefly evoked. Analysis methods include coulometry, titration, mass spectrometry, absorption spectrometry, spectrofluorometry, X-ray spectrometry, nuclear methods and radiation spectrometry [fr

  6. Proteoglycan isolation and analysis

    DEFF Research Database (Denmark)

    Woods, A; Couchman, J R

    2001-01-01

    Proteoglycans can be difficult molecules to isolate and analyze due to large mass, charge, and tendency to aggregate or form macromolecular complexes. This unit describes detailed methods for purification of matrix, cell surface, and cytoskeleton-linked proteoglycans. Methods for analysis...

  7. Monotowns: A Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Shastitko Andrei

    2016-06-01

    Full Text Available The authors propose an empirical analysis of the current situation in monotowns. The study questions the perceived seriousness of the ‘monotown problem’ as well as the actual challenges it presents. The authors use a cluster analysis to divide monotowns into groups for further structural comparison. The structural differences in the available databases limit the possibilities of empirical analysis. Hence, alternative approaches are required. The authors consider possible reasons for the limitations identified. Special attention is paid to the monotowns that were granted the status of advanced development territories. A comparative analysis makes it possible to study their general characteristics and socioeconomic indicators. The authors apply the theory of opportunistic behaviour to describe potential problems caused by the lack of unified criteria for granting monotowns the status of advanced development territories. The article identifies the main stakeholders and the character of their interaction; it describes a conceptual model built on the principal/agent interactions, and identifies the parametric space of mutually beneficial cooperation. The solution to the principal/agent problem suggested in the article contributes to the development of an alternative approach to the current situation and a rational approach to overcoming the ‘monotown problem’.
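
    Purely as an illustration of the clustering step described above (the feature names and values are invented placeholders; the article's own variables and method details may differ), a k-means grouping of monotowns by a few socioeconomic indicators can be sketched as follows.

        # Hedged sketch: group monotowns by socioeconomic indicators with k-means.
        # Feature names and values are invented, not the article's data.
        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.preprocessing import StandardScaler

        # Columns: population (thousands), unemployment rate (%), share of
        # employment at the town-forming enterprise (%).
        towns = np.array([
            [12.0,  9.5, 55.0],
            [45.0,  4.2, 30.0],
            [18.0, 11.0, 62.0],
            [70.0,  3.8, 25.0],
            [15.0, 10.2, 58.0],
            [52.0,  5.0, 28.0],
        ])

        features = StandardScaler().fit_transform(towns)   # put indicators on one scale
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
        print(labels)   # two groups: small/depressed vs. larger/stable towns (label order arbitrary)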

  8. Activation Analysis of Aluminium

    Energy Technology Data Exchange (ETDEWEB)

    Brune, Dag

    1961-01-15

    An analysis of pure aluminium alloyed with magnesium was performed by means of gamma spectrometry. Chemical separations were not employed. The isotopes to be determined were obtained in conditions of optimum activity by suitably choosing the time of irradiation and decay. The following elements were detected and measured quantitatively: iron, zinc, copper, gallium, manganese, chromium, scandium and hafnium.

  9. Sensitivity analysis and related analysis : A survey of statistical techniques

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    1995-01-01

    This paper reviews the state of the art in five related types of analysis, namely (i) sensitivity or what-if analysis, (ii) uncertainty or risk analysis, (iii) screening, (iv) validation, and (v) optimization. The main question is: when should which type of analysis be applied; which statistical

  10. A Role for Language Analysis in Mathematics Textbook Analysis

    Science.gov (United States)

    O'Keeffe, Lisa; O'Donoghue, John

    2015-01-01

    In current textbook analysis research, there is a strong focus on the content, structure and expectation presented by the textbook as elements for analysis. This research moves beyond such foci and proposes a framework for textbook language analysis which is intended to be integrated into an overall framework for mathematics textbook analysis. The…

  11. Introduction to Food Analysis

    Science.gov (United States)

    Nielsen, S. Suzanne

    Investigations in food science and technology, whether by the food industry, governmental agencies, or universities, often require determination of food composition and characteristics. Trends and demands of consumers, the food industry, and national and international regulations challenge food scientists as they work to monitor food composition and to ensure the quality and safety of the food supply. All food products require analysis as part of a quality management program throughout the development process (including raw ingredients), through production, and after a product is in the market. In addition, analysis is done of problem samples and competitor products. The characteristics of foods (i.e., chemical composition, physical properties, sensory properties) are used to answer specific questions for regulatory purposes and typical quality control. The nature of the sample and the specific reason for the analysis commonly dictate the choice of analytical methods. Speed, precision, accuracy, and ruggedness often are key factors in this choice. Validation of the method for the specific food matrix being analyzed is necessary to ensure usefulness of the method. Making an appropriate choice of the analytical technique for a specific application requires a good knowledge of the various techniques (Fig. 1.1). For example, your choice of method to determine the salt content of potato chips would be different if it is for nutrition labeling than for quality control. The success of any analytical method relies on the proper selection and preparation of the food sample, carefully performing the analysis, and doing the appropriate calculations and interpretation of the data. Methods of analysis developed and endorsed by several nonprofit scientific organizations allow for standardized comparisons of results between different laboratories and for evaluation of less standard procedures. Such official methods are critical in the analysis of foods, to ensure that they meet

  12. Drift Degradation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Dwayne C. Kicker

    2001-09-28

    A statistical description of the probable block sizes formed by fractures around the emplacement drifts has been developed for each of the lithologic units of the repository host horizon. A range of drift orientations with the drift azimuth varied in 15° increments has been considered in the static analysis. For the quasi-static seismic analysis, and the time-dependent and thermal effects analysis, two drift orientations have been considered: a drift azimuth of 105° and the current emplacement drift azimuth of 75°. The change in drift profile resulting from progressive deterioration of the emplacement drifts has been assessed both with and without backfill. Drift profiles have been determined for four different time increments, including static (i.e., upon excavation), 200 years, 2,000 years, and 10,000 years. The effect of seismic events on rock fall has been analyzed. Block size distributions and drift profiles have been determined for three seismic levels, including a 1,000-year event, a 5,000-year event, and a 10,000-year event. Data developed in this modeling and analysis activity have been entered into the TDMS (DTN: MO0109RDDAAMRR.003). The following conclusions have resulted from this drift degradation analysis: (1) The available fracture data are suitable for supporting a detailed key block analysis of the repository host horizon rock mass. The available data from the north-south Main Drift and the east-west Cross Drift provide a sufficient representative fracture sample of the repository emplacement drift horizon. However, the Tptpln fracture data are only available from a relatively small section of the Cross Drift, resulting in a smaller fracture sample size compared to the other lithologic units. This results in a lower degree of confidence that the key block data based on the Tptpln data set is actually representative of the overall Tptpln key block population. (2) The seismic effect on the rock fall size distribution for all events

  13. Best estimate containment analysis

    International Nuclear Information System (INIS)

    Smith, L.C.; Gresham, J.A.

    1993-01-01

    Primary reactor coolant system pipe ruptures are postulated as part of the design basis for containment integrity and equipment qualification validation for nuclear power plants. Current licensing analysis uses bounding conditions and assumptions, outside the range of actual operation, to determine a conservative measure of the performance requirements. Although this method has been adequate in the past, it does often involve the inclusion of excessive conservatism. A new licensing approach is under development that considers the performance of realistic analysis which quantifies the true plant response. A licensing limit is then quantified above the realistic requirements by applying the appropriate plant data and methodology uncertainties. This best estimate approach allows a true measure of the conservative margin, above the plant performance requirements, to be quantified. By utilizing a portion of this margin, the operation, surveillance and maintenance burden can be reduced by transferring the theoretical margin inherent in the licensing analysis to real margin applied at the plant. Relaxation of surveillance and maintenance intervals, relaxation of diesel loading and containment cooling requirements, increased quantity of necessary equipment allowed to be out of service, and allowances for equipment degradation are all potential benefits of applying this approach. Significant margins exist in current calculations due to the bounding nature of the evaluations. Scoping studies, which help quantify the potential margin available through best estimate mass and energy release analysis, demonstrate this. Also discussed in this paper are the approach for best estimate loss-of-coolant accident mass and energy release and containment analysis, the computer programs, the projected benefits, and the expected future directions

  14. Complementing Gender Analysis Methods.

    Science.gov (United States)

    Kumar, Anant

    2016-01-01

    The existing gender analysis frameworks start with a premise that men and women are equal and should be treated equally. These frameworks give emphasis on equal distribution of resources between men and women and believe that this will bring equality which is not always true. Despite equal distribution of resources, women tend to suffer and experience discrimination in many areas of their lives such as the power to control resources within social relationships, and the need for emotional security and reproductive rights within interpersonal relationships. These frameworks believe that patriarchy as an institution plays an important role in women's oppression, exploitation, and it is a barrier in their empowerment and rights. Thus, some think that by ensuring equal distribution of resources and empowering women economically, institutions like patriarchy can be challenged. These frameworks are based on proposed equality principle which puts men and women in competing roles. Thus, the real equality will never be achieved. Contrary to the existing gender analysis frameworks, the Complementing Gender Analysis framework proposed by the author provides a new approach toward gender analysis which not only recognizes the role of economic empowerment and equal distribution of resources but suggests to incorporate the concept and role of social capital, equity, and doing gender in gender analysis which is based on perceived equity principle, putting men and women in complementing roles that may lead to equality. In this article the author reviews the mainstream gender theories in development from the viewpoint of the complementary roles of gender. This alternative view is argued based on existing literature and an anecdote of observations made by the author. While criticizing the equality theory, the author offers equity theory in resolving the gender conflict by using the concept of social and psychological capital.

  15. Physics analysis tools

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1991-04-01

    There are many tools used in analysis in High Energy Physics (HEP). They range from low level tools such as a programming language to high level such as a detector simulation package. This paper will discuss some aspects of these tools that are directly associated with the process of analyzing HEP data. Physics analysis tools cover the whole range from the simulation of the interactions of particles to the display and fitting of statistical data. For purposes of this paper, the analysis is broken down into five main stages. The categories are also classified as areas of generation, reconstruction, and analysis. Different detector groups use different terms for these stages, thus it is useful to define what is meant by them in this paper. The particle generation stage is a simulation of the initial interaction, the production of particles, and the decay of the short lived particles. The detector simulation stage simulates the behavior of an event in a detector. The track reconstruction stage does pattern recognition on the measured or simulated space points, calorimeter information, etc., and reconstructs track segments of the original event. The event reconstruction stage takes the reconstructed tracks, along with particle identification information, and assigns masses to produce 4-vectors. Finally the display and fit stage displays statistical data accumulated in the preceding stages in the form of histograms, scatter plots, etc. The remainder of this paper will consider what analysis tools are available today, and what one might expect in the future. In each stage, the integration of the tools with other stages and the portability of the tool will be analyzed

  16. NID Copper Sample Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kouzes, Richard T.; Zhu, Zihua

    2011-09-12

    The current focal point of the nuclear physics program at PNNL is the MAJORANA DEMONSTRATOR, and the follow-on Tonne-Scale experiment, a large array of ultra-low background high-purity germanium detectors, enriched in 76Ge, designed to search for zero-neutrino double-beta decay (0νββ). This experiment requires the use of germanium isotopically enriched in 76Ge. The MAJORANA DEMONSTRATOR is a DOE and NSF funded project with a major science impact. The DEMONSTRATOR will utilize 76Ge from Russia, but for the Tonne-Scale experiment it is hoped that an alternate technology, possibly one under development at Nonlinear Ion Dynamics (NID), will be a viable, US-based, lower-cost source of separated material. Samples of separated material from NID require analysis to determine the isotopic distribution and impurities. DOE is funding NID through an SBIR grant for development of their separation technology for application to the Tonne-Scale experiment. The Environmental Molecular Sciences facility (EMSL), a DOE user facility at PNNL, has the required mass spectroscopy instruments for making isotopic measurements that are essential to the quality assurance for the MAJORANA DEMONSTRATOR and for the development of the future separation technology required for the Tonne-Scale experiment. A sample of isotopically separated copper was provided by NID to PNNL in January 2011 for isotopic analysis as a test of the NID technology. The results of that analysis are reported here. A second sample of isotopically separated copper was provided by NID to PNNL in August 2011 for isotopic analysis as a test of the NID technology. The results of that analysis are also reported here.

  17. Extended Testability Analysis Tool

    Science.gov (United States)

    Melcher, Kevin; Maul, William A.; Fulton, Christopher

    2012-01-01

    The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.

  18. Critical Analysis of Multimodal Discourse

    DEFF Research Database (Denmark)

    van Leeuwen, Theo

    2013-01-01

    This is an encyclopaedia article which defines the fields of critical discourse analysis and multimodality studies, argues that within critical discourse analysis more attention should be paid to multimodality, and within multimodality to critical analysis, and ends by reviewing a few examples of recent work in the critical analysis of multimodal discourse.

  19. Regularized Generalized Canonical Correlation Analysis

    Science.gov (United States)

    Tenenhaus, Arthur; Tenenhaus, Michel

    2011-01-01

    Regularized generalized canonical correlation analysis (RGCCA) is a generalization of regularized canonical correlation analysis to three or more sets of variables. It constitutes a general framework for many multi-block data analysis methods. It combines the power of multi-block data analysis methods (maximization of well identified criteria) and…
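
    As a hedged, simplified companion (ordinary two-block canonical correlation with scikit-learn rather than the regularized multi-block RGCCA described in the record; the data are synthetic):

        # Hedged sketch: standard two-block CCA. RGCCA, as described in the record,
        # generalizes this to three or more (regularized) blocks; this snippet only
        # illustrates the basic canonical-correlation idea on synthetic data.
        import numpy as np
        from sklearn.cross_decomposition import CCA

        rng = np.random.default_rng(0)
        latent = rng.normal(size=(100, 1))                   # shared latent variable
        X = np.hstack([latent + 0.3 * rng.normal(size=(100, 1)) for _ in range(4)])
        Y = np.hstack([latent + 0.3 * rng.normal(size=(100, 1)) for _ in range(3)])

        cca = CCA(n_components=1)
        x_scores, y_scores = cca.fit_transform(X, Y)
        r = np.corrcoef(x_scores[:, 0], y_scores[:, 0])[0, 1]
        print(f"first canonical correlation ~ {r:.2f}")       # close to 1 by construction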

  20. Exploratory Bi-Factor Analysis

    Science.gov (United States)

    Jennrich, Robert I.; Bentler, Peter M.

    2011-01-01

    Bi-factor analysis is a form of confirmatory factor analysis originally introduced by Holzinger. The bi-factor model has a general factor and a number of group factors. The purpose of this article is to introduce an exploratory form of bi-factor analysis. An advantage of using exploratory bi-factor analysis is that one need not provide a specific…

  1. Workbook on data analysis

    International Nuclear Information System (INIS)

    Hopke, P.K.

    2000-01-01

    As a consequence of various IAEA programmes to sample airborne particulate matter and determine its elemental composition, the participating research groups are accumulating data on the composition of the atmospheric aerosol. It is necessary to consider ways in which these data can be utilized in order to be certain that the data obtained are correct and that the information then being transmitted to others who may make decisions based on such information is as representative and correct as possible. In order to both examine the validity of those data and extract appropriate information from them, it is necessary to utilize a variety of data analysis methods. The objective of this workbook is to provide a guide with examples of utilizing data analysis on airborne particle composition data using a spreadsheet program (EXCEL) and a personal computer based statistical package (StatGraphics)

  2. MIR Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hazen, Damian [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States); Hick, Jason [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States)

    2012-06-12

    We provide analysis of Oracle StorageTek T10000 Generation B (T10KB) Media Information Record (MIR) Performance Data gathered over the course of a year from our production High Performance Storage System (HPSS). The analysis shows information in the MIR may be used to improve tape subsystem operations. Most notably, we found the MIR information to be helpful in determining whether the drive or tape was most suspect given a read or write error, and for helping identify which tapes should not be reused given their history of read or write errors. We also explored using the MIR Assisted Search to order file retrieval requests. We found that MIR Assisted Search may be used to reduce the time needed to retrieve collections of files from a tape volume.

  3. Multivariate analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Bendavid, Josh [European Organization for Nuclear Research (CERN), Geneva (Switzerland); Fisher, Wade C. [Michigan State Univ., East Lansing, MI (United States); Junk, Thomas R. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)

    2016-01-01

    The end products of experimental data analysis are designed to be simple and easy to understand: hypothesis tests and measurements of parameters. But, the experimental data themselves are voluminous and complex. Furthermore, in modern collider experiments, many petabytes of data must be processed in search of rare new processes which occur together with much more copious background processes that are of less interest to the task at hand. The systematic uncertainties on the background may be larger than the expected signal in many cases. The statistical power of an analysis and its sensitivity to systematic uncertainty can therefore usually both be improved by separating signal events from background events with higher efficiency and purity.
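
    A hedged toy example of the final point, separating signal events from background with higher efficiency and purity; the dataset is synthetic and the classifier choice is illustrative, not the multivariate analysis actually used in any collider experiment.

        # Hedged sketch: separate rare "signal" from "background" events with a
        # multivariate classifier. Events are synthetic; real analyses use
        # detector-level features and dedicated statistical treatments.
        from sklearn.datasets import make_classification
        from sklearn.ensemble import GradientBoostingClassifier
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        X, y = make_classification(n_samples=5000, n_features=8, n_informative=5,
                                   weights=[0.9, 0.1], random_state=0)   # rare "signal"
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

        clf = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
        scores = clf.predict_proba(X_te)[:, 1]
        print(f"ROC AUC = {roc_auc_score(y_te, scores):.3f}")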

  4. Bayesian Independent Component Analysis

    DEFF Research Database (Denmark)

    Winther, Ole; Petersen, Kaare Brandt

    2007-01-01

    In this paper we present an empirical Bayesian framework for independent component analysis. The framework provides estimates of the sources, the mixing matrix and the noise parameters, and is flexible with respect to choice of source prior and the number of sources and sensors. Inside the engine... in a Matlab toolbox, is demonstrated for non-negative decompositions and compared with non-negative matrix factorization.
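
    For orientation only (this uses scikit-learn's non-Bayesian FastICA, not the empirical Bayesian framework or the Matlab toolbox described in the record), recovering two mixed sources with independent component analysis looks like this:

        # Hedged sketch: standard (non-Bayesian) ICA source separation with FastICA,
        # shown only to illustrate estimating the sources and the mixing matrix.
        import numpy as np
        from sklearn.decomposition import FastICA

        t = np.linspace(0, 8, 2000)
        sources = np.c_[np.sin(2 * t), np.sign(np.sin(3 * t))]   # two independent sources
        mixing = np.array([[1.0, 0.5],
                           [0.4, 1.0]])
        observed = sources @ mixing.T                             # mixed observations

        ica = FastICA(n_components=2, random_state=0)
        recovered = ica.fit_transform(observed)      # estimated sources
        print(ica.mixing_.shape)                      # estimated mixing matrix (2, 2)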

  5. TMI-2 analysis exercise

    International Nuclear Information System (INIS)

    1990-01-01

    On March 28, 1979 an accident at the Three Mile Island Unit 2 (TMI-2) nuclear power plant led to melting of approximately 50% of the reactor's core. Since the TMI-2 accident has not been duplicated in an experimental facility, TMI-2 provides a unique opportunity to assess the capability of our severe accident analysis methods to simulate an accident in a full scale nuclear power plant. In this regard the OECD Nuclear Energy Agency (NEA), in collaboration with the US Department of Energy (DOE), has established a Joint Task Group to analyze various periods of the accident in an effort to benchmark severe accident computer codes. Participants in the analysis exercise and contributors to this report are listed, and the results of the benchmark analyses, together with the conclusions that may be drawn from them, are documented in this report. 12 refs., 10 figs., 5 tabs

  6. Waveform analysis of sound

    CERN Document Server

    Tohyama, Mikio

    2015-01-01

    What is this sound? What does that sound indicate? These are two questions frequently heard in daily conversation. Sound results from the vibrations of elastic media and in daily life provides informative signals of events happening in the surrounding environment. In interpreting auditory sensations, the human ear seems particularly good at extracting the signal signatures from sound waves. Although exploring auditory processing schemes may be beyond our capabilities, source signature analysis is a very attractive area in which signal-processing schemes can be developed using mathematical expressions. This book is inspired by such processing schemes and is oriented to signature analysis of waveforms. Most of the examples in the book are taken from data of sound and vibrations; however, the methods and theories are mostly formulated using mathematical expressions rather than by acoustical interpretation. This book might therefore be attractive and informative for scientists, engineers, researchers, and graduat...
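
    As a small, generic illustration of the kind of waveform signature analysis the book formulates mathematically (the signal below is synthetic, not an example taken from the book): the magnitude spectrum of a short recording exposes the dominant frequencies that characterize a sound source.

        # Hedged sketch: a basic waveform "signature" - the magnitude spectrum of a
        # synthetic two-partial signal - exposing its dominant frequency components.
        import numpy as np

        fs = 8000                                    # sampling rate (Hz)
        t = np.arange(0, 1.0, 1 / fs)
        x = np.sin(2 * np.pi * 440 * t) + 0.5 * np.sin(2 * np.pi * 880 * t)

        spectrum = np.abs(np.fft.rfft(x))
        freqs = np.fft.rfftfreq(x.size, d=1 / fs)
        peaks = freqs[np.argsort(spectrum)[-2:]]
        print(sorted(peaks))                         # approximately [440.0, 880.0] Hz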

  7. In Silico Expression Analysis.

    Science.gov (United States)

    Bolívar, Julio; Hehl, Reinhard; Bülow, Lorenz

    2016-01-01

    Information on the specificity of cis-sequences enables the design of functional synthetic plant promoters that are responsive to specific stresses. Potential cis-sequences may be experimentally tested; however, correlation of genomic sequence with gene expression data enables an in silico expression analysis approach to bioinformatically assess the stress specificity of candidate cis-sequences prior to experimental verification. The present chapter demonstrates an example of the in silico validation of a potential cis-regulatory sequence responsive to cold stress. The described online tool can be applied for the bioinformatic assessment of cis-sequences responsive to most abiotic and biotic stresses of plants. Furthermore, a method is presented, based on a reverted in silico expression analysis approach, that predicts highly specific, potentially functional cis-regulatory elements for a given stress.

  8. Exascale Data Analysis

    CERN Multimedia

    CERN. Geneva; Fitch, Blake

    2011-01-01

    Traditionally, the primary role of supercomputers was to create data, primarily for simulation applications. Due to usage and technology trends, supercomputers are increasingly also used for data analysis. Some of this data is from simulations, but there is also a rapidly increasing amount of real-world science and business data to be analyzed. We briefly overview Blue Gene and other current supercomputer architectures. We outline future architectures, up to the Exascale supercomputers expected in the 2020 time frame. We focus on the data analysis challenges and opportunities, especially those concerning Flash and other up-and-coming storage class memory. About the speakers Blake G. Fitch has been with IBM Research, Yorktown Heights, NY since 1987, mainly pursuing interests in parallel systems. He joined the Scalable Parallel Systems Group in 1990, contributing to research and development that culminated in the IBM scalable parallel system (SP*) product. His research interests have focused on applicatio...

  9. Handbook of radioactivity analysis

    CERN Document Server

    2012-01-01

    The updated and much expanded Third Edition of the "Handbook of Radioactivity Analysis" is an authoritative reference providing the principles, practical techniques, and procedures for the accurate measurement of radioactivity from the very low levels encountered in the environment to higher levels measured in radioisotope research, clinical laboratories, biological sciences, radionuclide standardization, nuclear medicine, nuclear power, fuel cycle facilities and in the implementation of nuclear forensic analysis and nuclear safeguards. The Third Edition contains seven new chapters providing a reference text much broader in scope than the previous Second Edition, and all of the other chapters have been updated and expanded many with new authors. The book describes the basic principles of radiation detection and measurement, the preparation of samples from a wide variety of matrices, assists the investigator or technician in the selection and use of appropriate radiation detectors, and presents state-of-the-ar...

  10. Intracochlear microprobe analysis

    International Nuclear Information System (INIS)

    Bone, R.C.; Ryan, A.F.

    1982-01-01

    Energy dispersive x-ray analysis (EDXA) or microprobe analysis provides cochlear physiologists with a means of accurately assessing relative ionic concentrations in selected portions of the auditory mechanism. Rapid freezing followed by lyophilization allows the recovery of fluid samples in crystalline form not only from perilymphatic and endolymphatic spaces, but also from much smaller subregions of the cochlea. Because samples are examined in a solid state, there is no risk of diffusion into surrounding or juxtaposed fluids. Samples of cochlear tissues may also be evaluated without the danger of intercellular ionic diffusion. During direct visualization by scanning electron microscopy, determination of the biochemical makeup of the material being examined can be performed simultaneously, assuring the source of the data collected. Other potential advantages and disadvantages of EDXA are reviewed. Initial findings as they relate to endolymph, perilymph, stria vascularis, and the undersurface of the tectorial membrane are presented

  11. Structured Analysis - IDEF0

    DEFF Research Database (Denmark)

    Larsen, Michael Holm

    1999-01-01

    This note introduces the IDEF0 modelling language (semantics and syntax), and associated rules and techniques, for developing structured graphical representations of a system or enterprise. Use of this standard for IDEF0 permits the construction of models comprising system functions (activities...) that require a modelling technique for the analysis, development, re-engineering, integration, or acquisition of information systems; and incorporate a systems or enterprise modelling technique into a business process analysis or software engineering methodology. This note is a summary of the Standard for Integration Definition for Function Modelling (IDEF0), i.e. the Draft Federal Information Processing Standards Publication 183, 1993, December 21, Announcing the Standard for Integration Definition for Function Modelling (IDEF0).

  12. Multicriteria diversity analysis

    Energy Technology Data Exchange (ETDEWEB)

    Stirling, Andy, E-mail: a.c.stirling@sussex.ac.u [SPRU-Science and Technology Policy Research, Freeman Centre, University of Sussex, Sussex BN1 9QE (United Kingdom)

    2010-04-15

    This paper outlines a novel general framework for analysing energy diversity. A critical review of different reasons for policy interest reveals that diversity is more than a supply security strategy. There are particular synergies with strategies for transitions to sustainability. Yet - despite much important work - policy analysis tends to address only a subset of the properties of diversity and remains subject to ambiguity, neglect and special pleading. Developing earlier work, the paper proposes a more comprehensive heuristic framework, accommodating a wide range of different disciplinary and socio-political perspectives. It is argued that the associated multicriteria diversity analysis method provides a more systematic, complete and transparent way to articulate disparate perspectives and approaches and so help to inform more robust and accountable policymaking.

  13. Cognitive Component Analysis

    DEFF Research Database (Denmark)

    Feng, Ling

    2008-01-01

    This dissertation concerns the investigation of the consistency of statistical regularities in a signaling ecology and human cognition, while inferring appropriate actions for a speech-based perceptual task. It is based on unsupervised Independent Component Analysis providing a rich spectrum... of audio contexts along with pattern recognition methods to map components to known contexts. It also involves looking for the right representations for auditory inputs, i.e. the data analytic processing pipelines invoked by human brains. The main ideas refer to Cognitive Component Analysis, defined... as the process of unsupervised grouping of generic data such that the ensuing group structure is well-aligned with that resulting from human cognitive activity. Its hypothesis runs ecologically: features which are essentially independent in a context defined ensemble, can be efficiently coded as sparse...

  14. Nuclear forensic analysis

    International Nuclear Information System (INIS)

    Tomar, B.S.

    2016-01-01

    In the present talk the fundamentals of the nuclear forensic investigations will be discussed followed by the detailed standard operating procedure (SOP) for the nuclear forensic analysis. The characteristics, such as, dimensions, particle size, elemental and isotopic composition help the nuclear forensic analyst in source attribution of the interdicted material, as the specifications of the nuclear materials used by different countries are different. The analysis of elemental composition could be done by SEM-EDS, XRF, CHNS analyser, etc. depending upon the type of the material. Often the trace constituents (analysed by ICP-AES, ICP-MS, AAS, etc) provide valuable information about the processes followed during the production of the material. Likewise the isotopic composition determined by thermal ionization mass spectrometry provides useful information about the enrichment of the nuclear fuel and hence its intended use

  15. Analysis of neural data

    CERN Document Server

    Kass, Robert E; Brown, Emery N

    2014-01-01

    Continual improvements in data collection and processing have had a huge impact on brain research, producing data sets that are often large and complicated. By emphasizing a few fundamental principles, and a handful of ubiquitous techniques, Analysis of Neural Data provides a unified treatment of analytical methods that have become essential for contemporary researchers. Throughout the book ideas are illustrated with more than 100 examples drawn from the literature, ranging from electrophysiology, to neuroimaging, to behavior. By demonstrating the commonality among various statistical approaches the authors provide the crucial tools for gaining knowledge from diverse types of data. Aimed at experimentalists with only high-school level mathematics, as well as computationally-oriented neuroscientists who have limited familiarity with statistics, Analysis of Neural Data serves as both a self-contained introduction and a reference work.

  16. Physical analysis for tribology

    International Nuclear Information System (INIS)

    Quinn, F.J.

    1991-01-01

    This textbook by Dr. Quinn contains an interesting and useful combination of subject matter related to tribology and methods of surface analysis pertinent to wear problems. A brief introductory chapter includes a good overview of wear phenomena and mechanisms. Three chapters, comprising about one-third of the book, discuss surface and surface film diagnostic and analysis methods. These include optical, electrical and magnetic techniques as well as electron and x-ray diffraction methods. Considerable detail is provided on background related to crystallography and diffraction. Those not concerned with technique per se, will likely omit these sections. The last five chapters are core subject matter for students, engineers, and researchers interested in wear phenomena. Dr. Quinn draws considerable material from his own extensive background in the area, as well as a good selection of other examples from the research literature

  17. Constructive real analysis

    CERN Document Server

    Goldstein, Allen A

    1967-01-01

    This text introduces the methods of applied functional analysis and applied convexity. Suitable for advanced undergraduates and graduate students of mathematics, science, and technology, it focuses on the solutions to two closely related problems. The first concerns finding roots of systems of equations and operative equations in a given region. The second involves extremal problems of minimizing or maximizing functions defined on subsets of finite and infinite dimensional spaces. Rather than citing practical algorithms for solving problems, this treatment provides the tools for studying problem-related algorithms.Topics include iterations and fixed points, metric spaces, nonlinear programming, polyhedral convex programming, and infinite convex programming. Additional subjects include linear spaces and convex sets and applications to integral equations. Students should be familiar with advanced calculus and linear algebra. As an introduction to elementary functional analysis motivated by application, this vol...

  18. Finite Discrete Gabor Analysis

    DEFF Research Database (Denmark)

    Søndergaard, Peter Lempel

    2007-01-01

    on the real line to be well approximated by finite and discrete Gabor frames. This method of approximation is especially attractive because efficient numerical methods exist for doing computations with finite, discrete Gabor systems. This thesis presents new algorithms for the efficient computation of finite, discrete Gabor coefficients. Reconstruction of a signal from its Gabor coefficients is done by the use of a so-called dual window. This thesis presents a number of iterative algorithms to compute dual and self-dual windows. The Linear Time Frequency Toolbox is a Matlab/Octave/C toolbox for doing basic discrete time/frequency and Gabor analysis. It is intended to be both an educational and a computational tool. The toolbox was developed as part of this Ph.D. project to provide a solid foundation for the field of computational Gabor analysis.
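
    As a rough, hedged parallel (SciPy's short-time Fourier transform with a finite window is a finite, discrete Gabor-type analysis, and its inverse plays the role of synthesis with a dual window; this is not the thesis's Linear Time Frequency Toolbox):

        # Hedged sketch: finite, discrete Gabor-style analysis/synthesis via the STFT.
        # scipy's istft performs the reconstruction that a dual window provides.
        import numpy as np
        from scipy.signal import stft, istft

        fs = 1000
        t = np.arange(0, 1.0, 1 / fs)
        x = np.sin(2 * np.pi * 50 * t) * np.exp(-3 * t)       # a decaying tone

        f, tau, coeffs = stft(x, fs=fs, nperseg=128)          # Gabor-type coefficients
        _, x_rec = istft(coeffs, fs=fs, nperseg=128)          # reconstruction

        print(coeffs.shape)                                   # (frequencies, frames)
        print(np.allclose(x, x_rec[:x.size]))                 # near-perfect reconstruction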

  19. Methods for RNA Analysis

    DEFF Research Database (Denmark)

    Olivarius, Signe

    While increasing evidence appoints diverse types of RNA as key players in the regulatory networks underlying cellular differentiation and metabolism, the potential functions of thousands of conserved RNA structures encoded in mammalian genomes remain to be determined. Since the functions of most RNAs rely on interactions with proteins, the establishment of protein-binding profiles is essential for the characterization of RNAs. Aiming to facilitate RNA analysis, this thesis introduces proteomics- as well as transcriptomics-based methods for the functional characterization of RNA. First, RNA-protein pulldown combined with mass spectrometry analysis is applied for in vivo as well as in vitro identification of RNA-binding proteins, the latter succeeding in verifying known RNA-protein interactions. Secondly, acknowledging the significance of flexible promoter usage for the diversification...

  20. Bayesian Exploratory Factor Analysis

    DEFF Research Database (Denmark)

    Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.

    2014-01-01

    This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates...

  1. Social Set Analysis

    DEFF Research Database (Denmark)

    Vatrapu, Ravi; Hussain, Abid; Buus Lassen, Niels

    2015-01-01

    This paper argues that the basic premise of Social Network Analysis (SNA) -- namely that social reality is constituted by dyadic relations and that social interactions are determined by structural properties of networks -- is neither necessary nor sufficient for Big Social Data analytics of Facebook or Twitter data. However, there exists no other holistic computational social science approach beyond the relational sociology and graph theory of SNA. To address this limitation, this paper presents an alternative holistic approach to Big Social Data analytics called Social Set Analysis (SSA). Based on the sociology of associations and the mathematics of classical, fuzzy and rough set theories, this paper proposes a research program, the function of which is to design, develop and evaluate social set analytics in terms of fundamentally novel formal models, predictive methods and visual...

  2. Neutron activation analysis

    International Nuclear Information System (INIS)

    Taure, I.; Riekstina, D.; Veveris, O.

    2004-01-01

    Neutron activation analysis (NAA) in Latvia began to develop after 1961, when the nuclear reactor in Salaspils started to work. It provided a powerful neutron source, which is necessary for this analytical method. In 1963 the Laboratory of Neutron Activation Analysis was formed at the Institute of Physics of the Latvian Academy of Sciences. At the first stage of development the main tasks concerned theoretical and technical aspects of NAA. Later NAA was used to solve problems in technology, biology, and medicine. At the beginning of the 1980s more attention was focused on the use of NAA in environmental research. Environmental problems remained the main task until the closing of the nuclear reactor in Salaspils in 1998, which ended the existence of the laboratory and of NAA, this significant and powerful analytical method, in Latvia and the Baltics in general. (authors)

  3. Methods for RNA Analysis

    DEFF Research Database (Denmark)

    Olivarius, Signe

    RNAs rely on interactions with proteins, the establishment of protein-binding profiles is essential for the characterization of RNAs. Aiming to facilitate RNA analysis, this thesis introduces proteomics- as well as transcriptomics-based methods for the functional characterization of RNA. First, RNA-protein pulldown combined with mass spectrometry analysis is applied for in vivo as well as in vitro identification of RNA-binding proteins, the latter succeeding in verifying known RNA-protein interactions. Secondly, acknowledging the significance of flexible promoter usage for the diversification of the transcriptome, 5’ end capture of RNA is combined with next-generation sequencing for high-throughput quantitative assessment of transcription start sites by two different methods. The methods presented here allow for functional investigation of coding as well as noncoding RNA and contribute to future...

  4. UVISS preliminary visibility analysis

    DEFF Research Database (Denmark)

    Betto, Maurizio

    1998-01-01

    The goal of this work is to obtain a preliminary assessment of the sky visibility for an astronomical telescope located on the express pallet of the International Space Station (ISS), taking into account the major constraints imposed on the instrument by the ISS attitude and structure. Part of the work is also to set up the kernel of a software tool for the visibility analysis that should be easily expandable to consider more complex structures for future activities. This analysis is part of the UVISS assessment study and it is meant to provide elements for the definition and the selection...

  5. Automated Motivic Analysis

    DEFF Research Database (Denmark)

    Lartillot, Olivier

    2016-01-01

    Motivic analysis provides very detailed understanding of musical compositions, but is also particularly difficult to formalize and systematize. A computational automation of the discovery of motivic patterns cannot be reduced to a mere extraction of all possible sequences of descriptions. The systematic approach inexorably leads to a proliferation of redundant structures that needs to be addressed properly. Global filtering techniques cause a drastic elimination of interesting structures that damages the quality of the analysis. On the other hand, a selection of closed patterns allows for lossless compression. The structural complexity resulting from successive repetitions of patterns can be controlled through a simple modelling of cycles. Generally, motivic patterns cannot always be defined solely as sequences of descriptions in a fixed set of dimensions: throughout the descriptions...

  6. Visualization analysis and design

    CERN Document Server

    Munzner, Tamara

    2015-01-01

    Visualization Analysis and Design provides a systematic, comprehensive framework for thinking about visualization in terms of principles and design choices. The book features a unified approach encompassing information visualization techniques for abstract data, scientific visualization techniques for spatial data, and visual analytics techniques for interweaving data transformation and analysis with interactive visual exploration. It emphasizes the careful validation of effectiveness and the consideration of function before form. The book breaks down visualization design according to three questions: what data users need to see, why users need to carry out their tasks, and how the visual representations proposed can be constructed and manipulated. It walks readers through the use of space and color to visually encode data in a view, the trade-offs between changing a single view and using multiple linked views, and the ways to reduce the amount of data shown in each view. The book concludes with six case stu...

  7. Overcoming: a concept analysis.

    Science.gov (United States)

    Brush, Barbara L; Kirk, Keri; Gultekin, Laura; Baiardi, Janet M

    2011-01-01

    Nurses often work with individuals and populations striving to improve or maintain the quality of their lives. Many, struggling from complex health and social problems, are challenged to surmount barriers to achieve this goal. The growing number of homeless families in the United States represent one such cohort. To develop an operational definition of overcoming and explicate its meaning, attributes, and characteristics as it relates to homeless families. Using the concept analysis method described by Walker and Avant, along with an extensive literature review, and sample cases pertaining to family homelessness, we delineated the defining attributes, antecedents, consequences, and empirical referents of the concept, overcoming. The results of this concept analysis, particularly the relationship of overcoming to family homelessness, provide guidance for further conceptualization and empirical testing, as well as for clinical practice. © 2011 Wiley Periodicals, Inc.

  8. Thermal analysis of COIL

    Science.gov (United States)

    Takeuchi, Noriyuki; Sugimoto, Daichi; Tei, Kazuyoku; Fujioka, Tomoo

    2004-05-01

    The analysis of heat release into the operative gas of the Chemical Oxygen Iodine Laser (COIL) is discussed. The pooling reaction of oxygen molecules in the excited state, the iodine dissociation process, and their interaction with water vapor release the energy of excited-state oxygen molecules as heat. As a result of heat release in the plenum, a rise of the total pressure together with a rise of the total temperature is observed, while in the supersonic region a rise of static pressure and a decrease of total pressure accompany a rise of total temperature. By applying our analysis technique to pressure data from three different nozzles, quantities such as the energy loss in the duct from a Singlet Delta Oxygen Generator (SOG) and the number of oxygen molecules dissipated in iodine dissociation can be estimated.

  9. Value tree analysis

    International Nuclear Information System (INIS)

    Keeney, R.; Renn, O.; Winterfeldt, D. von; Kotte, U.

    1985-01-01

    What are the targets and criteria on which national energy policy should be based? What priorities should be set, and how can different social interests be matched? To answer these questions, a new instrument of decision theory is presented which has been applied with good results to controversial political issues in the USA. The new technique is known under the name of value tree analysis. Members of important West German organisations (BDI, VDI, RWE, the Catholic and Protestant Churches, Deutscher Naturschutzring, and ecological research institutions) were asked about the goals of their organisations. These goals were then ordered systematically and arranged in a hierarchical tree structure. The value trees of different groups can be combined into a catalogue of social criteria of acceptability and policy assessment. The authors describe the philosophy and methodology of value tree analysis and give an outline of its application in the development of a socially acceptable energy policy. (orig.) [de

  10. Data Systems Task Analysis.

    Science.gov (United States)

    1979-08-01

    QUALITY CONTROL SUPERVISOR/NCOIC 369. PROGRAMMER 07?. PROGRAMMER ANALYST C7l, PROGRAMMING/ANALYSIS SUPERVISOR. QUALITY CONTROL PETTY OFFICER/CLERK...CLASSIFICATION OF THE FACILITY OR SITE THAT YOU ARE PRESENTLY WORKING IN? 01. CDPA (CENTRAL DESIGN PROGRAMMING ACTIVITY) 02. RASC (REGIONAL AUTOMATED...CARDS MANUALLY I)9. COORDINATE WITH OFFICES OF PRIMARY RESPONSIBILITY (OPR) ON NEW OR REVISED REPORTING REQUIREMENTS 115. DETERMINE ALTERNATE METHODS

  11. Paternity analysis in Excel.

    Science.gov (United States)

    Rocheta, Margarida; Dionísio, F Miguel; Fonseca, Luís; Pires, Ana M

    2007-12-01

    Paternity analysis using microsatellite information is a well-studied subject. These markers are ideal for parentage studies and fingerprinting, due to their high discrimination power. This type of data is used to assign paternity, to compute the average selfing and outcrossing rates and to estimate the biparental inbreeding. There are several public domain programs that compute all this information from data. Most of the time, it is necessary to export data to some sort of format, feed it to the program and import the output to an Excel book for further processing. In this article we briefly describe a program, referred to from now on as Paternity Analysis in Excel (PAE), developed at IST and IBET (see the acknowledgments), that computes paternity candidates from data, and other information, from within Excel. In practice this means that the end user provides the data in an Excel sheet and, by pressing an appropriate button, obtains the results in another Excel sheet. For convenience PAE is divided into two modules. The first one is a filtering module that selects data from the sequencer and reorganizes it in a format appropriate to process paternity analysis, assuming certain conventions for the names of parents and offspring from the sequencer. The second module carries out the paternity analysis assuming that one parent is known. Both modules are written in Excel-VBA and can be obtained at the address (www.math.ist.utl.pt/~fmd/pa/pa.zip). They are free for non-commercial purposes and have been tested with different data and against different software (Cervus, FaMoz, and MLTR).
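
    As a generic, hedged illustration of the underlying logic (not PAE's actual VBA code): with the mother known, a candidate father can be excluded at any locus where he carries no allele that could account for the offspring's non-maternal allele. The loci and allele sizes below are invented.

        # Hedged sketch of the basic exclusion logic behind microsatellite paternity
        # analysis (not the PAE VBA implementation). Genotypes map locus -> (a1, a2).
        def compatible(offspring, mother, candidate):
            """True if the candidate father is not excluded at any locus,
            assuming the mother is the true mother."""
            for locus, (o1, o2) in offspring.items():
                m = set(mother[locus])
                c = set(candidate[locus])
                # Offspring alleles that could have come from the father, i.e. the
                # other allele is attributable to the mother.
                paternal = set()
                if o2 in m:
                    paternal.add(o1)
                if o1 in m:
                    paternal.add(o2)
                if not paternal or not (paternal & c):
                    return False
            return True

        offspring = {"L1": (152, 160), "L2": (201, 205)}
        mother    = {"L1": (152, 156), "L2": (201, 201)}
        father_a  = {"L1": (160, 164), "L2": (205, 209)}   # compatible at both loci
        father_b  = {"L1": (148, 156), "L2": (205, 209)}   # excluded at locus L1

        print(compatible(offspring, mother, father_a))   # True
        print(compatible(offspring, mother, father_b))   # False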

  12. Techno-Economic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Salvesen, F.; Sandgren, J. [KanEnergi AS, Rud (Norway)

    1997-12-31

    The present energy situation in the target area is summarized: 20 million inhabitants without electricity in north-west Russia, 50 % of the people in the Baltics without electricity, very high technical skills, and finance as the biggest problem. The energy situation, the advantages of the renewables, the restrictions, and examples of possible technical solutions are reviewed on the basis of short analyses and experience with the Baltics and Russia.

  13. Dental Forensics: Bitemark Analysis

    Directory of Open Access Journals (Sweden)

    Elza Ibrahim Auerkari

    2013-06-01

    Full Text Available Forensic odontology (dental forensics) can provide useful evidence in both criminal and civil cases, and therefore remains a part of the wider discipline of forensic science. As an example from the toolbox of forensic odontology, the practice and experience of bitemark analysis is reviewed here in brief. The principle of using visible bitemarks in crime victims or in other objects as evidence is fundamentally based on the observation that the detailed pattern of dental imprints tends to be practically unique for each individual. Therefore, finding such an imprint as a bitemark can bear strong testimony that it was produced by the individual with the matching dental pattern. However, the comparison of the observed bitemark and the suspected set of teeth necessarily requires human interpretation, and this is not infallible. Both technical challenges in the bitemarks and human errors in the interpretation are possible. To minimise such errors and to maximise the value of bitemark analysis, dedicated procedures and protocols have been developed, and the personnel taking care of the analysis need to be properly trained. In principle, work within the discipline should be conducted as in evidence-based dentistry, i.e. accepted procedures should have known error rates. Because of the involvement of human interpretation, even personal performance statistics may be required in legal expert statements. These requirements have been introduced largely due to cases where false convictions based on bitemark analysis have been overturned after DNA analysis. DOI: 10.14693/jdi.v15i2.76

  14. Dental Forensics: Bitemark Analysis

    OpenAIRE

    Elza Ibrahim Auerkari

    2013-01-01

    Forensic odontology (dental forensics) can provide useful evidence in both criminal and civil cases, and therefore remains a part of the wider discipline of forensic science. As an example from the toolbox of forensic odontology, the practice and experience on bitemark analysis is reviewed here in brief. The principle of using visible bitemarks in crime victims or in other objects as evidence is fundamentally based on the observation that the detailed pattern of dental imprints tends to be pra...

  15. CLEAN: CLustering Enrichment ANalysis

    Science.gov (United States)

    Freudenberg, Johannes M; Joshi, Vineet K; Hu, Zhen; Medvedovic, Mario

    2009-01-01

    Background Integration of biological knowledge encoded in various lists of functionally related genes has become one of the most important aspects of analyzing genome-wide functional genomics data. In the context of cluster analysis, the functional coherence of clusters established through such analyses has been used to identify biologically meaningful clusters, compare clustering algorithms and identify biological pathways associated with the biological process under investigation. Results We developed a computational framework for analytically and visually integrating knowledge-based functional categories with the cluster analysis of genomics data. The framework is based on the simple, conceptually appealing, and biologically interpretable gene-specific functional coherence score (CLEAN score). The score is derived by correlating the clustering structure as a whole with functional categories of interest. We directly demonstrate that integrating biological knowledge in this way improves the reproducibility of conclusions derived from cluster analysis. The CLEAN score differentiates between the levels of functional coherence for genes within the same cluster based on their membership in enriched functional categories. We show that this aspect results in higher reproducibility across independent datasets and produces more informative genes for distinguishing different sample types than the scores based on the traditional cluster-wide analysis. We also demonstrate the utility of the CLEAN framework in comparing clusterings produced by different algorithms. CLEAN was implemented as an add-on R package and can be downloaded at . The package integrates routines for calculating gene-specific functional coherence scores and the open source interactive Java-based viewer Functional TreeView (FTreeView). Conclusion Our results indicate that using the gene-specific functional coherence score improves the reproducibility of the conclusions made about clusters of co
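
    As a rough, hedged illustration of what a gene-specific functional coherence score can look like, the sketch below scores each gene by the strongest hypergeometric enrichment of its cluster among the functional categories that contain the gene. This is not the exact CLEAN definition (which correlates the clustering structure as a whole with the categories); the scoring rule, data layout, and names are assumptions made for the example.

```python
# Illustrative gene-specific coherence score: for each gene, the best
# -log10 hypergeometric enrichment p-value of its cluster among the
# functional categories containing that gene. Sketch only, not CLEAN itself.
from math import log10
from scipy.stats import hypergeom

def coherence_scores(clusters, categories, all_genes):
    """clusters: {gene: cluster_id}; categories: {name: set of genes}."""
    n_total = len(all_genes)
    members = {}
    for g, c in clusters.items():
        members.setdefault(c, set()).add(g)
    scores = {}
    for g, c in clusters.items():
        cluster_genes = members[c]
        best = 0.0
        for cat_genes in categories.values():
            if g not in cat_genes:
                continue
            k = len(cluster_genes & cat_genes)   # overlap of cluster and category
            p = hypergeom.sf(k - 1, n_total, len(cat_genes), len(cluster_genes))
            best = max(best, -log10(max(p, 1e-300)))
        scores[g] = best
    return scores
```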

  16. Risk analysis and reliability

    International Nuclear Information System (INIS)

    Uppuluri, V.R.R.

    1979-01-01

    Mathematical foundations of risk analysis are addressed. The importance of having the same probability space in order to compare different experiments is pointed out. Then the following topics are discussed: consequences as random variables with infinite expectations; the phenomenon of rare events; series-parallel systems and different kinds of randomness that could be imposed on such systems; and the problem of consensus of estimates of expert opinion

  17. Limit analysis via creep

    International Nuclear Information System (INIS)

    Taroco, E.; Feijoo, R.A.

    1981-07-01

    In this paper a variational method for the limit analysis of an ideal plastic solid is presented. This method has been termed Modified Secondary Creep and makes it possible to find the collapse loads through the minimization of a functional and a limit process. Given an ideal plastic material, it is shown how to determine the associated secondary creep constitutive equation. Finally, as an application, the limit load in a pressurized von Mises rigid-plastic sphere is found. (Author) [pt

  18. Biomedical signal analysis

    CERN Document Server

    Rangayyan, Rangaraj M

    2015-01-01

    The book will assist the reader in developing techniques for the analysis of biomedical signals and computer-aided diagnosis, with a pedagogical examination of basic and advanced topics accompanied by over 350 figures and illustrations. A wide range of filtering techniques is presented to address various applications. 800 mathematical expressions and equations. Practical questions, problems and laboratory exercises. Includes fractals and chaos theory with biomedical applications.

  19. Accident consequence analysis

    International Nuclear Information System (INIS)

    Nixon, W.; Cooper, P.J.; Underwood, B.Y.; Peckover, R.S.

    1985-01-01

    The essential elements of an analysis of the radiological consequences of accidental atmospheric releases from nuclear plants are identified, and the modelling approaches currently used are briefly outlined. For the model description, attention is focused on the techniques used within the context of a probabilistic risk assessment. This is followed by a brief outline of current research and development work in the field, giving an indication of the nature of the next generation of consequence assessment methods. (author)

  20. Thoughts on multielement analysis

    International Nuclear Information System (INIS)

    Kaiser, H.

    1976-01-01

    The author discusses, in an informal fashion, some of the important aspects of multielement analysis that are frequently overlooked in the present-day trend of trying to measure everything (elements, compounds) in everything (environmental samples). While many points are touched upon, with the aim of providing 'fuel' for the subsequent General Discussion, two themes are illustrated in some depth: do our backgrounds spoil our results, and do our experiments require the impossible? A basis for planning experimental strategy is outlined. (author)

  1. CONCEPT ANALYSIS: AGGRESSION

    OpenAIRE

    Liu, Jianghong

    2004-01-01

    The concept of aggression is important to nursing because further knowledge of aggression can help generate a better theoretical model to drive more effective intervention and prevention approaches. This paper outlines a conceptual analysis of aggression. First, the different forms of aggression are reviewed, including the clinical classification and the stimulus-based classification. Then the manifestations and measurement of aggression are described. Finally, the causes and consequences of ...

  2. Behavioral Task Analysis

    Science.gov (United States)

    2010-01-01

    methods included task analysis as a critical phase in developing instruction and training. Montemerlo and Tennyson (1976) noted that from 1951 to 1976...designed. The trend in the U.S. Department of Defense toward extensive procedural documentation noted by Montemerlo and Tennyson (1976) has not...M. Gagné (Ed.), Psychological principles in system development (pp. 187-228). New York: Holt. Montemerlo, M. D., & Tennyson, M. E. (1975

  3. Cash flow analysis

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    This chapter addresses the cash flow analysis of dual-purpose power plants. The topics discussed include the impact of inflation (including the relative inflation of individual cost components such as purchased power, supplemental power, standby charges, buy back rate, heating fuel costs, power plants costs, staffing, maintenance, insurance, and administrative charges), controlling costs, purchased power and fuel price projections, varying the assumptions, financing, and modeling of tax impacts

  4. Bayesian Factor Analysis.

    Science.gov (United States)

    1985-03-23

    Fundamental Factors of Comprehension in... University of Iowa/Novick 8 March 1985 Dr. James McBride Program Manager for Manpower, Psychological... Princeton, NJ 08541 Dr. Vern W. Urry Dr. Peter Stoloff Personnel R&D Center Center for Naval Analysis Office of Personnel Management 200 North

  5. Future Climate Analysis

    International Nuclear Information System (INIS)

    James Houseworth

    2001-01-01

    This Analysis/Model Report (AMR) documents an analysis that was performed to estimate climatic variables for the next 10,000 years by forecasting the timing and nature of climate change at Yucca Mountain (YM), Nevada (Figure 1), the site of a potential repository for high-level radioactive waste. The future-climate estimates are based on an analysis of past-climate data from analog meteorological stations, and this AMR provides the rationale for the selection of these analog stations. The stations selected provide an upper and a lower climate bound for each future climate, and the data from those sites will provide input to the infiltration model (USGS 2000) and for the total system performance assessment for the Site Recommendation (TSPA-SR) at YM. Forecasting long-term future climates, especially for the next 10,000 years, is highly speculative and rarely attempted. A very limited literature exists concerning the subject, largely from the British radioactive waste disposal effort. The discussion presented here is one method, among many, of establishing upper and lower bounds for future climate estimates. The method used here involves selecting a particular past climate from many past climates, as an analog for future climate. Other studies might develop a different rationale or select other past climates resulting in a different future climate analog. Revision 00 of this AMR was prepared in accordance with the ''Work Direction and Planning Document for Future Climate Analysis'' (Peterman 1999) under Interagency Agreement DE-AI08-97NV12033 with the U.S. Department of Energy (DOE). The planning document for the technical scope, content, and management of ICN 01 of this AMR is the ''Technical Work Plan for Unsaturated Zone (UZ) Flow and Transport Process Model Report'' (BSC 2001a). The scope for the TBV resolution actions in this ICN is described in the ''Technical Work Plan for: Integrated Management of Technical Product Input Department''. (BSC 2001b, Addendum B

  6. Bite Mark Analysis

    OpenAIRE

    SK Padmakumar; VT Beena; N Salmanulfaris; Ashith B Acharya; G Indu; Sajai J Kumar

    2014-01-01

    Bite mark analysis plays an important role in personal identification in forensic odontology. They are commonly seen in violent crimes such as sexual assaults, homicides, child abuse, etc. Human bites are common on the face and are usually seen on prominent locations of the face such as the ears, nose and lips. Individual characteristics recorded in the bite marks such as fractures, rotations, attrition, and congenital malformations are helpful in identifying the in...

  7. [Diagnosis: synovial fluid analysis].

    Science.gov (United States)

    Gallo Vallejo, Francisco Javier; Giner Ruiz, Vicente

    2014-01-01

    Synovial fluid analysis in rheumatological diseases allows a more accurate diagnosis in some entities, mainly infectious and microcrystalline arthritis. Examination of synovial fluid in patients with osteoarthritis is useful when a differential diagnosis with other processes is to be performed and to distinguish between inflammatory and non-inflammatory forms. Joint aspiration is a diagnostic and sometimes therapeutic procedure that is available to primary care physicians. Copyright © 2014 Elsevier España, S.L. All rights reserved.

  8. Digital image analysis

    DEFF Research Database (Denmark)

    Riber-Hansen, Rikke; Vainer, Ben; Steiniche, Torben

    2012-01-01

    Digital image analysis (DIA) is increasingly implemented in histopathological research to facilitate truly quantitative measurements, decrease inter-observer variation and reduce hands-on time. Originally, efforts were made to enable DIA to reproduce manually obtained results on histological slides...... reproducibility, application of stereology-based quantitative measurements, time consumption, optimization of histological slides, regions of interest selection and recent developments in staining and imaging techniques....

  9. SAMPLING AND ANALYSIS PROTOCOLS

    Energy Technology Data Exchange (ETDEWEB)

    Jannik, T; P Fledderman, P

    2007-02-09

    Radiological sampling and analyses are performed to collect data for a variety of specific reasons covering a wide range of projects. These activities include: Effluent monitoring; Environmental surveillance; Emergency response; Routine ambient monitoring; Background assessments; Nuclear license termination; Remediation; Deactivation and decommissioning (D&D); and Waste management. In this chapter, effluent monitoring and environmental surveillance programs at nuclear operating facilities and radiological sampling and analysis plans for remediation and D&D activities will be discussed.

  10. Pulsed rf operation analysis

    International Nuclear Information System (INIS)

    Puglisi, M.; Cornacchia, M.

    1981-01-01

    The need for a very low final-amplifier output impedance, always associated with class A operation, entails a very large power waste in the final tube. The recently suggested pulsed rf operation, while saving a large amount of power, increases the inherent final-amplifier nonlinearity. A method is presented for avoiding the large-signal nonlinear analysis, and it is shown how each component of the beam-induced voltage depends upon all the beam harmonics via some coupling coefficients which are evaluated

  11. Interpretative phenomenological analysis

    OpenAIRE

    Eatough, Virginia; Smith, Jonathan A.

    2017-01-01

    The Second Edition of The SAGE Handbook of Qualitative Research in Psychology provides comprehensive coverage of the qualitative methods, strategies, and research issues in psychology. Qualitative research in psychology has been transformed since the first edition's publication. Responding to this evolving field, existing chapters have been updated while three new chapters have been added on Thematic Analysis, Interpretation, and Netnography. With a focus on methodological progress thr...

  12. Numerical analysis II essentials

    CERN Document Server

    REA, The Editors of; Staff of Research Education Association

    1989-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Numerical Analysis II covers simultaneous linear systems and matrix methods, differential equations, Fourier transformations, partial differential equations, and Monte Carlo methods.

  13. Regional energy facility siting analysis

    International Nuclear Information System (INIS)

    Eberhart, R.C.; Eagles, T.W.

    1976-01-01

    Results of the energy facility siting analysis portion of a regional pilot study performed for the anticipated National Energy Siting and Facility Report are presented. The question of cell analysis versus site-specific analysis is explored, including an evaluation of the difference in depth between the two approaches. A discussion of the possible accomplishments of regional analysis is presented. It is concluded that regional siting analysis could be of use in a national siting study, if its inherent limits are recognized

  14. Strictness Analysis for Attribute Grammars

    DEFF Research Database (Denmark)

    Rosendahl, Mads

    1992-01-01

    interpretation of attribute grammars. The framework is used to construct a strictness analysis for attribute grammars. Results of the analysis enable us to transform an attribute grammar such that attributes are evaluated during parsing, if possible. The analysis is proved correct by relating it to a fixpoint...... semantics for attribute grammars. An implementation of the analysis is discussed and some extensions to the analysis are mentioned....

  15. NETWORK ANALYSIS IN PSYCHOLOGY

    Directory of Open Access Journals (Sweden)

    Eduardo Fonseca-Pedrero

    2018-01-01

    Full Text Available The main goal of this work is to introduce a new approach called network analysis for its application in the field of psychology. In this paper we present the network model in a brief, entertaining and simple way and, as far as possible, away from technicalities and the statistical point of view. The aim of this outline is, on the one hand, to take the first steps in network analysis, and on the other, to show the theoretical and clinical implications underlying this model. Firstly, the roots of this approach are discussed as well as its way of understanding psychological phenomena, specifically psychopathological problems. The concepts of network, node and edge, the types of networks and the procedures for their estimation are all addressed. Next, measures of centrality are explained and some applications in the field of psychology are mentioned. Later, this approach is exemplified with a specific case, which estimates and analyzes a network of personality traits within the Big Five model. The syntax of this analysis is provided. Finally, by way of conclusion, a brief recapitulation is provided, and some cautionary reflections and future research lines are discussed.
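
    A hedged sketch of the most basic step behind such a psychological network: estimating partial correlations among observed variables (here, stand-in scores for the Big Five traits) and treating them as edge weights. Published analyses typically add regularization such as the graphical lasso; the data and dimensions below are invented purely for illustration.

```python
# Partial-correlation network from the inverse covariance matrix. Sketch only;
# real network analyses usually regularize (e.g., graphical lasso).
import numpy as np

def partial_correlations(data):
    """data: (n_subjects, n_variables) array; returns the partial-correlation matrix."""
    precision = np.linalg.inv(np.cov(data, rowvar=False))
    d = np.sqrt(np.diag(precision))
    pcor = -precision / np.outer(d, d)
    np.fill_diagonal(pcor, 1.0)
    return pcor

rng = np.random.default_rng(1)
traits = rng.standard_normal((200, 5))        # stand-in for Big Five trait scores
print(np.round(partial_correlations(traits), 2))  # off-diagonals = edge weights
```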

  16. Drift Degradation Analysis

    International Nuclear Information System (INIS)

    G.H. Nieder-Westermann

    2005-01-01

    The outputs from the drift degradation analysis support scientific analyses, models, and design calculations, including the following: (1) Abstraction of Drift Seepage; (2) Seismic Consequence Abstraction; (3) Structural Stability of a Drip Shield Under Quasi-Static Pressure; and (4) Drip Shield Structural Response to Rock Fall. This report has been developed in accordance with ''Technical Work Plan for: Regulatory Integration Modeling of Drift Degradation, Waste Package and Drip Shield Vibratory Motion and Seismic Consequences'' (BSC 2004 [DIRS 171520]). The drift degradation analysis includes the development and validation of rockfall models that approximate phenomenon associated with various components of rock mass behavior anticipated within the repository horizon. Two drift degradation rockfall models have been developed: the rockfall model for nonlithophysal rock and the rockfall model for lithophysal rock. These models reflect the two distinct types of tuffaceous rock at Yucca Mountain. The output of this modeling and analysis activity documents the expected drift deterioration for drifts constructed in accordance with the repository layout configuration (BSC 2004 [DIRS 172801])

  17. FUTURE CLIMATE ANALYSIS

    International Nuclear Information System (INIS)

    R.M. Forester

    2000-01-01

    This Analysis/Model Report (AMR) documents an analysis that was performed to estimate climatic variables for the next 10,000 years by forecasting the timing and nature of climate change at Yucca Mountain (YM), Nevada (Figure 1), the site of a potential repository for high-level radioactive waste. The future-climate estimates are based on an analysis of past-climate data from analog meteorological stations, and this AMR provides the rationale for the selection of these analog stations. The stations selected provide an upper and a lower climate bound for each future climate, and the data from those sites will provide input to the infiltration model (USGS 2000) and for the total system performance assessment for the Site Recommendation (TSPA-SR) at YM. Forecasting long-term future climates, especially for the next 10,000 years, is highly speculative and rarely attempted. A very limited literature exists concerning the subject, largely from the British radioactive waste disposal effort. The discussion presented here is one method, among many, of establishing upper and lower bounds for future climate estimates. The method used here involves selecting a particular past climate from many past climates, as an analog for future climate. Other studies might develop a different rationale or select other past climates resulting in a different future climate analog

  18. Political analysis using R

    CERN Document Server

    Monogan III, James E

    2015-01-01

    Political Analysis Using R can serve as a textbook for undergraduate or graduate students as well as a manual for independent researchers. It is unique among competitor books in its usage of 21 example datasets that are all drawn from political research. All of the data and example code is available from the Springer website, as well as from Dataverse (http://dx.doi.org/10.7910/DVN/ARKOTI). The book provides a narrative of how R can be useful for addressing problems common to the analysis of public administration, public policy, and political science data specifically, in addition to the social sciences more broadly. While the book uses data drawn from political science, public administration, and policy analyses, it is written so that students and researchers in other fields should find it accessible and useful as well. Political Analysis Using R is perfect for the first-time R user who has no prior knowledge about the program. By working through the first seven chapters of this book, an entry-level user sho...

  19. Regional Shelter Analysis Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Dillon, Michael B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dennison, Deborah [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kane, Jave [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Walker, Hoyt [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Miller, Paul [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-08-01

    The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
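
    A minimal sketch of the bookkeeping such a methodology implies: combine a protection factor for each building type or location with the fraction of the population sheltered there to estimate the population-averaged fraction of the outdoor dose received. All numbers below are invented for illustration and are not values from the report.

```python
# Population-weighted shelter protection, in the spirit of a regional shelter
# analysis. Protection factor = outdoor dose / indoor dose, so the dose
# fraction received at a location is 1 / PF. Illustrative values only.

def effective_dose_fraction(population_fractions, protection_factors):
    """Population-weighted fraction of the unsheltered dose actually received."""
    assert abs(sum(population_fractions.values()) - 1.0) < 1e-9
    return sum(frac / protection_factors[loc]
               for loc, frac in population_fractions.items())

# Example posture: night-time, everyone indoors (hypothetical distribution).
pop = {"wood_frame_house": 0.6, "masonry_apartment": 0.3, "basement": 0.1}
pf  = {"wood_frame_house": 3.0, "masonry_apartment": 10.0, "basement": 40.0}
print(effective_dose_fraction(pop, pf))   # ~0.23 of the unsheltered dose
```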

  20. Analysis of radioactive strontium

    International Nuclear Information System (INIS)

    1977-01-01

    In environmental radiation surveys, radioactive strontium has been analyzed in compliance with the manual ''Analyzing methods for radioactive strontium'' published in 1960 by the Science and Technology Agency, Japan, and revised in 1963. However, in the past decade, progress and development in analyzing methods and measuring equipment have been significant, and therefore the manual was revised in 1974. Major revisions are as follows. (1) Analysis of 90Sr, with its long half-life, was made the main theme, and that of 89Sr, with its short half-life, became a subordinate one. (2) Measuring criteria and sampling volume were revised. (3) The sample collection method was unified. (4) The analyzing method for soil was improved to the NaOH-HCl method, which has a good recovery rate. (5) A 90Y separation method of simple operation was added for sea water analysis besides the EDTA and fuming nitric acid methods. (6) Flame spectrometry for quantitative analysis of stable strontium was revised to atomic absorption spectrometry. The contents of the manual comprise 11 chapters describing the introduction, measuring criteria for 90Sr (89Sr), rain and dust, land water, sea water, soil, sea bottom and river bottom sediments (changed from human urine and human bones), crops, milk (the previous one chapter was divided into two), marine organisms, and everyday foods, respectively. (Wakatsuki, Y.)

  1. Generalized Linear Covariance Analysis

    Science.gov (United States)

    Carpenter, James R.; Markley, F. Landis

    2014-01-01

    This talk presents a comprehensive approach to filter modeling for generalized covariance analysis of both batch least-squares and sequential estimators. We review and extend in two directions the results of prior work that allowed for partitioning of the state space into "solve-for" and "consider" parameters, accounted for differences between the formal values and the true values of the measurement noise, process noise, and a priori solve-for and consider covariances, and explicitly partitioned the errors into subspaces containing only the influence of the measurement noise, process noise, and solve-for and consider covariances. In this work, we explicitly add sensitivity analysis to this prior work, and relax an implicit assumption that the batch estimator's epoch time occurs prior to the definitive span. We also apply the method to an integrated orbit and attitude problem, in which gyro and accelerometer errors, though not estimated, influence the orbit determination performance. We illustrate our results using two graphical presentations, which we call the "variance sandpile" and the "sensitivity mosaic," and we compare the linear covariance results to confidence intervals associated with ensemble statistics from a Monte Carlo analysis.

  2. H2@Scale Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ruth, Mark

    2017-07-12

    'H2@Scale' is a concept based on the opportunity for hydrogen to act as an intermediate between energy sources and uses. Hydrogen has the potential to be used like the primary intermediate in use today, electricity, because it too is fungible. This presentation summarizes the H2@Scale analysis efforts performed during the first third of 2017. Results of technical potential uses and supply options are summarized and show that the technical potential demand for hydrogen is 60 million metric tons per year and that the U.S. has sufficient domestic resources to meet that demand. A high level infrastructure analysis is also presented that shows an 85% increase in energy on the grid if all hydrogen is produced from grid electricity. However, a preliminary spatial assessment shows that supply is sufficient in most counties across the U.S. The presentation also shows plans for analysis of the economic potential for the H2@Scale concept. Those plans involve developing supply and demand curves for potential hydrogen generation options and as compared to other options for use of that hydrogen.

  3. Interference and Sensitivity Analysis.

    Science.gov (United States)

    VanderWeele, Tyler J; Tchetgen Tchetgen, Eric J; Halloran, M Elizabeth

    2014-11-01

    Causal inference with interference is a rapidly growing area. The literature has begun to relax the "no-interference" assumption that the treatment received by one individual does not affect the outcomes of other individuals. In this paper we briefly review the literature on causal inference in the presence of interference when treatments have been randomized. We then consider settings in which causal effects in the presence of interference are not identified, either because randomization alone does not suffice for identification, or because treatment is not randomized and there may be unmeasured confounders of the treatment-outcome relationship. We develop sensitivity analysis techniques for these settings. We describe several sensitivity analysis techniques for the infectiousness effect which, in a vaccine trial, captures the effect of the vaccine of one person on protecting a second person from infection even if the first is infected. We also develop two sensitivity analysis techniques for causal effects in the presence of unmeasured confounding which generalize analogous techniques when interference is absent. These two techniques for unmeasured confounding are compared and contrasted.

  4. Methods for geochemical analysis

    Science.gov (United States)

    Baedecker, Philip A.

    1987-01-01

    The laboratories for analytical chemistry within the Geologic Division of the U.S. Geological Survey are administered by the Office of Mineral Resources. The laboratory analysts provide analytical support to those programs of the Geologic Division that require chemical information and conduct basic research in analytical and geochemical areas vital to the furtherance of Division program goals. Laboratories for research and geochemical analysis are maintained at the three major centers in Reston, Virginia, Denver, Colorado, and Menlo Park, California. The Division has an expertise in a broad spectrum of analytical techniques, and the analytical research is designed to advance the state of the art of existing techniques and to develop new methods of analysis in response to special problems in geochemical analysis. The geochemical research and analytical results are applied to the solution of fundamental geochemical problems relating to the origin of mineral deposits and fossil fuels, as well as to studies relating to the distribution of elements in varied geologic systems, the mechanisms by which they are transported, and their impact on the environment.

  5. Numerical analysis using Sage

    CERN Document Server

    Anastassiou, George A

    2015-01-01

    This is the first numerical analysis text to use Sage for the implementation of algorithms and can be used in a one-semester course for undergraduates in mathematics, math education, computer science/information technology, engineering, and physical sciences. The primary aim of this text is to simplify understanding of the theories and ideas from a numerical analysis/numerical methods course via a modern programming language like Sage. Aside from the presentation of fundamental theoretical notions of numerical analysis throughout the text, each chapter concludes with several exercises that are oriented to real-world application. Answers may be verified using Sage. The presented code, written in core components of Sage, is backward compatible, i.e., easily applicable to other software systems such as Mathematica®. Sage is open-source software and uses Python-like syntax. Previous Python programming experience is not a requirement for the reader, though familiarity with any programming language is a p...

  6. Uncertainty analysis guide

    International Nuclear Information System (INIS)

    Andres, T.H.

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)
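
    As a hedged illustration of one of the propagation approaches mentioned (Monte Carlo), the sketch below samples uncertain inputs of a toy model and summarizes the spread of the output. The model and the input distributions are assumptions made purely for the example; they are not taken from the guide.

```python
# Monte Carlo propagation of input uncertainties through a toy model.
import random
import statistics

def model(k, area, dT):
    """Toy model: heat flow through a slab, q = k * area * dT."""
    return k * area * dT

samples = []
for _ in range(10_000):
    k    = random.gauss(0.5, 0.05)    # conductivity, assumed ~10% uncertainty
    area = random.gauss(2.0, 0.02)    # area, small uncertainty
    dT   = random.gauss(30.0, 1.5)    # temperature difference
    samples.append(model(k, area, dT))

mean = statistics.mean(samples)
std  = statistics.stdev(samples)
print(f"q = {mean:.1f} +/- {std:.1f}")   # combined random uncertainty estimate
```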

  7. FUTURE CLIMATE ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    R.M. Forester

    2000-03-14

    This Analysis/Model Report (AMR) documents an analysis that was performed to estimate climatic variables for the next 10,000 years by forecasting the timing and nature of climate change at Yucca Mountain (YM), Nevada (Figure 1), the site of a potential repository for high-level radioactive waste. The future-climate estimates are based on an analysis of past-climate data from analog meteorological stations, and this AMR provides the rationale for the selection of these analog stations. The stations selected provide an upper and a lower climate bound for each future climate, and the data from those sites will provide input to the infiltration model (USGS 2000) and for the total system performance assessment for the Site Recommendation (TSPA-SR) at YM. Forecasting long-term future climates, especially for the next 10,000 years, is highly speculative and rarely attempted. A very limited literature exists concerning the subject, largely from the British radioactive waste disposal effort. The discussion presented here is one method, among many, of establishing upper and lower bounds for future climate estimates. The method used here involves selecting a particular past climate from many past climates, as an analog for future climate. Other studies might develop a different rationale or select other past climates resulting in a different future climate analog.

  8. Analysis of simulator training

    International Nuclear Information System (INIS)

    Hollnagel, E.

    1983-01-01

    A method has been developed for systematic observation of operator performance in nuclear training simulators which combines training and research. It is based on generally accepted theories of operator models and decision making developed at Risø and elsewhere. It makes explicitly available the data which experienced instructors implicitly use in their assessment of operator performance. This means that the feedback/debriefing function of the training is facilitated, it becomes possible to use normal training sessions to obtain data which can be used in further theoretical studies of e.g. operator decision making, and the generalized description of operator performance may be used to evaluate the training program as such. The method for observation is designed in cooperation with the instructors so that it does not interfere with their normal work. It is based on a detailed prior analysis of experienced transients, leading to a description of an expected performance, and some transient-independent observation schemes, which are used to characterize points where the actual performance deviates from the expected performance. The analysis of the observations takes place according to the structure of a general model of analysis developed from numerous studies of operator performance, in real life and in simulators. (author)

  9. NID Copper Sample Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kouzes, Richard T.; Zhu, Zihua

    2011-02-01

    The current focal point of the nuclear physics program at PNNL is the MAJORANA DEMONSTRATOR, and the follow-on Tonne-Scale experiment, a large array of ultra-low background high-purity germanium detectors, enriched in 76Ge, designed to search for zero-neutrino double-beta decay (0νββ). This experiment requires the use of germanium isotopically enriched in 76Ge. The DEMONSTRATOR will utilize 76Ge from Russia, but for the Tonne-Scale experiment it is hoped that an alternate technology under development at Nonlinear Ion Dynamics (NID) will be a viable, US-based, lower-cost source of separated material. Samples of separated material from NID require analysis to determine the isotopic distribution and impurities. The MAJORANA DEMONSTRATOR is a DOE and NSF funded project with a major science impact. DOE is funding NID through an SBIR grant for development of their separation technology for application to the Tonne-Scale experiment. The Environmental Molecular Sciences facility (EMSL), a DOE user facility at PNNL, has the required mass spectroscopy instruments for making these isotopic measurements that are essential to the quality assurance for the MAJORANA DEMONSTRATOR and for the development of the future separation technology required for the Tonne-Scale experiment. A sample of isotopically separated copper was provided by NID to PNNL for isotopic analysis as a test of the NID technology. The results of that analysis are reported here.

  10. [Mindfulness: A Concept Analysis].

    Science.gov (United States)

    Chen, Tsai-Ling; Chou, Fan-Hao; Wang, Hsiu-Hung

    2016-04-01

    "Mindfulness" is an emerging concept in the field of healthcare. Ranging from stress relief to psychotherapy, mindfulness has been confirmed to be an effective tool to help individuals manage depression, anxiety, obsessive-compulsive disorder, and other health problems in clinical settings. Scholars currently use various definitions for mindfulness. While some of these definitions overlap, significant differences remain and a general scholarly consensus has yet to be reached. Several domestic and international studies have explored mindfulness-related interventions and their effectiveness. However, the majority of these studies have focused on the fields of clinical medicine, consultation, and education. Mindfulness has rarely been applied in clinical nursing practice and no related systematic concept analysis has been conducted. This paper conducts a concept analysis of mindfulness using the concept analysis method proposed by Walker and Avant (2011). We describe the defining characteristics of mindfulness, clarify the concept, and confirm the predisposing factors and effects of mindfulness using examples of typical cases, borderline cases, related cases, and contrary case. Findings may provide nursing staff with an understanding of the concept of mindfulness for use in clinical practice in order to help patients achieve a comfortable state of body and mind healing.

  11. Multilevel functional clustering analysis.

    Science.gov (United States)

    Serban, Nicoleta; Jiang, Huijing

    2012-09-01

    In this article, we investigate clustering methods for multilevel functional data, which consist of repeated random functions observed for a large number of units (e.g., genes) at multiple subunits (e.g., bacteria types). To describe the within- and between variability induced by the hierarchical structure in the data, we take a multilevel functional principal component analysis (MFPCA) approach. We develop and compare a hard clustering method applied to the scores derived from the MFPCA and a soft clustering method using an MFPCA decomposition. In a simulation study, we assess the estimation accuracy of the clustering membership and the cluster patterns under a series of settings: small versus moderate number of time points; various noise levels; and varying number of subunits per unit. We demonstrate the applicability of the clustering analysis to a real data set consisting of expression profiles from genes activated by immunity system cells. Prevalent response patterns are identified by clustering the expression profiles using our multilevel clustering analysis. © 2012, The International Biometric Society.
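
    A hedged sketch of the "hard clustering on functional principal component scores" idea: approximate FPCA by ordinary PCA on densely sampled curves and cluster the resulting scores with k-means. It ignores the multilevel (within-/between-unit) decomposition that MFPCA adds and uses synthetic curves; it only illustrates the general workflow, not the authors' method.

```python
# Approximate FPCA via PCA on sampled curves, then k-means on the scores.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 50)
# Toy curves: two groups with different mean shapes plus noise.
curves = np.vstack([np.sin(2 * np.pi * t) + 0.3 * rng.standard_normal((30, 50)),
                    np.cos(2 * np.pi * t) + 0.3 * rng.standard_normal((30, 50))])

scores = PCA(n_components=3).fit_transform(curves)        # approximate FPC scores
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
print(labels)                                              # hard cluster memberships
```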

  12. Market analysis. Renewable fuels

    International Nuclear Information System (INIS)

    2014-01-01

    On behalf of the Federal Ministry of Food and Agriculture, the Agency for Renewable Resources (FNR) had a study prepared on the market development of renewable resources in Germany and published it in 2006. The aim of that study was to identify the actual status and market performance of the individual market segments of material and energetic use, as a basis for policy recommendations for an accelerated and long-term successful market launch and market share expansion of renewable raw materials. On behalf of the FNR, a market analysis was carried out from mid-2011 until the beginning of 2013, the results of which are hereby resubmitted. This market analysis covers all markets of material and energetic use in the global context, taking account of possible competing uses. A market segmentation based on the product classification of the Federal Statistical Office formed the basis of the analysis. A total of ten markets have been defined, seven for material and three for energetic use. [de

  13. Actinide isotopic analysis systems

    International Nuclear Information System (INIS)

    Koenig, Z.M.; Ruhter, W.D.; Gunnink, R.

    1990-01-01

    This manual provides instructions and procedures for using the Lawrence Livermore National Laboratory's two-detector actinide isotope analysis system to measure plutonium samples with other possible actinides (including uranium, americium, and neptunium) by gamma-ray spectrometry. The computer program that controls the system and analyzes the gamma-ray spectral data is driven by a menu of one-, two-, or three-letter options chosen by the operator. Provided in this manual are descriptions of these options and their functions, plus detailed instructions (operator dialog) for choosing among the options. Also provided are general instructions for calibrating the actinide isotopic analysis system and for monitoring its performance. The inventory measurement of a sample's total plutonium and other actinides content is determined by two nondestructive measurements. One is a calorimetry measurement of the sample's heat or power output, and the other is a gamma-ray spectrometry measurement of its relative isotopic abundances. The isotopic measurements needed to interpret the observed calorimetric power measurement are the relative abundances of various plutonium and uranium isotopes and americium-241. The actinide analysis system carries out these measurements. 8 figs

  14. Future Climate Analysis

    International Nuclear Information System (INIS)

    Cambell, C. G.

    2004-01-01

    This report documents an analysis that was performed to estimate climatic variables for the next 10,000 years by forecasting the timing and nature of climate change at Yucca Mountain, Nevada, the site of a repository for spent nuclear fuel and high-level radioactive waste. The future-climate estimates are based on an analysis of past-climate data from analog meteorological stations, and this report provides the rationale for the selection of these analog stations. The stations selected provide an upper and a lower climate bound for each future climate, and the data from those sites will provide input to the following reports: ''Simulation of Net Infiltration for Present-Day and Potential Future Climates'' (BSC 2004 [DIRS 170007]), ''Total System Performance Assessment (TSPA) Model/Analysis for the License Application'' (BSC 2004 [DIRS 168504]), ''Features, Events, and Processes in UZ Flow and Transport'' (BSC 2004 [DIRS 170012]), and ''Features, Events, and Processes in SZ Flow and Transport'' (BSC 2004 [DIRS 170013]). Forecasting long-term future climates, especially for the next 10,000 years, is highly speculative and rarely attempted. A very limited literature exists concerning the subject, largely from the British radioactive waste disposal effort. The discussion presented here is one available forecasting method for establishing upper and lower bounds for future climate estimates. The selection of different methods is directly dependent on the available evidence used to build a forecasting argument. The method used here involves selecting a particular past climate from many past climates, as an analog for future climate. While alternative analyses are possible for the case presented for Yucca Mountain, the evidence (data) used would be the same and the conclusions would not be expected to drastically change. Other studies might develop a different rationale or select other past climates resulting in a different future climate analog. Other alternative

  15. Future Climate Analysis

    Energy Technology Data Exchange (ETDEWEB)

    C. G. Cambell

    2004-09-03

    This report documents an analysis that was performed to estimate climatic variables for the next 10,000 years by forecasting the timing and nature of climate change at Yucca Mountain, Nevada, the site of a repository for spent nuclear fuel and high-level radioactive waste. The future-climate estimates are based on an analysis of past-climate data from analog meteorological stations, and this report provides the rationale for the selection of these analog stations. The stations selected provide an upper and a lower climate bound for each future climate, and the data from those sites will provide input to the following reports: ''Simulation of Net Infiltration for Present-Day and Potential Future Climates'' (BSC 2004 [DIRS 170007]), ''Total System Performance Assessment (TSPA) Model/Analysis for the License Application'' (BSC 2004 [DIRS 168504]), ''Features, Events, and Processes in UZ Flow and Transport'' (BSC 2004 [DIRS 170012]), and ''Features, Events, and Processes in SZ Flow and Transport'' (BSC 2004 [DIRS 170013]). Forecasting long-term future climates, especially for the next 10,000 years, is highly speculative and rarely attempted. A very limited literature exists concerning the subject, largely from the British radioactive waste disposal effort. The discussion presented here is one available forecasting method for establishing upper and lower bounds for future climate estimates. The selection of different methods is directly dependent on the available evidence used to build a forecasting argument. The method used here involves selecting a particular past climate from many past climates, as an analog for future climate. While alternative analyses are possible for the case presented for Yucca Mountain, the evidence (data) used would be the same and the conclusions would not be expected to drastically change. Other studies might develop a different rationale or select other past

  16. The OVIS analysis architecture.

    Energy Technology Data Exchange (ETDEWEB)

    Mayo, Jackson R.; Gentile, Ann C.; Brandt, James M.; De Sapio, Vincent; Thompson, David C.; Roe, Diana C.; Wong, Matthew H.; Pebay, Philippe Pierre

    2010-07-01

    This report summarizes the current statistical analysis capability of OVIS and how it works in conjunction with the OVIS data readers and interpolators. It also documents how to extend these capabilities. OVIS is a tool for parallel statistical analysis of sensor data to improve system reliability. Parallelism is achieved using a distributed data model: many sensors on similar components (metaphorically sheep) insert measurements into a series of databases on computers reserved for analyzing the measurements (metaphorically shepherds). Each shepherd node then processes the sheep data stored locally and the results are aggregated across all shepherds. OVIS uses the Visualization Tool Kit (VTK) statistics algorithm class hierarchy to perform analysis of each process's data but avoids VTK's model aggregation stage which uses the Message Passing Interface (MPI); this is because if a single process in an MPI job fails, the entire job will fail. Instead, OVIS uses asynchronous database replication to aggregate statistical models. OVIS has several additional features beyond those present in VTK that, first, accommodate its particular data format and, second, improve the memory and speed of the statistical analyses. First, because many statistical algorithms are multivariate in nature and sensor data is typically univariate, interpolation of data is required to provide simultaneous observations of metrics. Note that in this report, we will refer to a single value obtained from a sensor as a measurement while a collection of multiple sensor values simultaneously present in the system is an observation. A base class for interpolation is provided that abstracts the operation of converting multiple sensor measurements into simultaneous observations. A concrete implementation is provided that performs piecewise constant temporal interpolation of multiple metrics across a single component. Secondly, because calculations may summarize data too large to fit in memory
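
    A hedged sketch of the piecewise-constant temporal interpolation described above: each metric is held at its last measured value so that several univariate sensor streams can be aligned into simultaneous observations on a common time grid. The metric names, timestamps, and grid are invented for the example; this is not the OVIS code itself.

```python
# Align several sensor metric streams onto a common time grid using
# piecewise-constant (last-value) interpolation.
import bisect

def piecewise_constant(series, t):
    """series: sorted list of (timestamp, value); return the last value at or
    before t, or None if t precedes the first measurement."""
    times = [ts for ts, _ in series]
    i = bisect.bisect_right(times, t) - 1
    return series[i][1] if i >= 0 else None

def observations(metrics, grid):
    """Build one multivariate observation per grid time point."""
    return [{name: piecewise_constant(s, t) for name, s in metrics.items()}
            for t in grid]

metrics = {"cpu_temp": [(0, 41.0), (5, 43.5), (9, 44.0)],
           "fan_rpm":  [(1, 1200), (6, 1350)]}
print(observations(metrics, grid=[2, 4, 6, 8]))
```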

  17. Evaluating the function of applied behavior analysis a bibliometric analysis.

    OpenAIRE

    Critchfield, Thomas S

    2002-01-01

    Analysis of scholarly citations involving behavioral journals reveals that, consistent with its mission, applied behavior analysis research frequently references the basic behavioral literature but, as some have suspected, exerts narrow scholarly influence.

  18. RISK ANALYSIS, ANALYSIS OF VARIANCE: GETTING MORE FROM OUR DATA

    Science.gov (United States)

    Analysis of variance (ANOVA) and regression are common statistical techniques used to analyze agronomic experimental data and determine significant differences among yields due to treatments or other experimental factors. Risk analysis provides an alternative and complementary examination of the same...

  19. CADDIS Volume 4. Data Analysis: Exploratory Data Analysis

    Science.gov (United States)

    Intro to exploratory data analysis. Overview of variable distributions, scatter plots, correlation analysis, GIS datasets. Use of conditional probability to examine stressor levels and impairment. Exploring correlations among multiple stressors.

  20. Resonance frequency analysis

    Directory of Open Access Journals (Sweden)

    Rajiv K Gupta

    2011-01-01

    Full Text Available Initial stability at placement and the development of osseointegration are two major issues for implant survival. Implant stability is a mechanical phenomenon which is related to the local bone quality and quantity, the type of implant, and the placement technique used. The application of a simple, clinically applicable, non-invasive test to assess implant stability and osseointegration is considered highly desirable. Resonance frequency analysis (RFA) is one such technique which is most frequently used nowadays. The aim of this paper was to review and critically analyze the currently available literature in the field of RFA, and also to discuss, based on scientific evidence, the prognostic value of RFA to detect implants at risk of failure. A search was made using the PubMed database to find all the literature published on "Resonance frequency analysis for implant stability" to date. Articles discussing in vivo or in vitro studies comparing RFA with other methods of implant stability measurement and articles discussing its reliability were thoroughly reviewed and discussed. A limited number of clinical reports were found. Various studies have demonstrated the feasibility and predictability of the technique. However, most of these articles are based on retrospective data or uncontrolled cases. Randomized, prospective, parallel-armed longitudinal human trials are based on short-term results, and long-term follow-up is still scarce in this field. Nonetheless, from the available literature, it may be concluded that the RFA technique evaluates implant stability as a function of the stiffness of the implant-bone interface and is influenced by factors such as bone type and exposed implant height above the alveolar crest. Resonance frequency analysis could serve as a non-invasive diagnostic tool for detecting the implant stability of dental implants during the healing stages and in subsequent routine follow-up care after treatment. Future studies, preferably randomized

  1. Health literacy: concept analysis.

    Science.gov (United States)

    Speros, Carolyn

    2005-06-01

    This paper reports an analysis of the concept of health literacy in order to clarify its meaning, reduce ambiguities associated with references to it, and promote consistency in using the concept in nursing dialogue and research. Health literacy is a relatively new concept in health promotion research. Only within the last decade have researchers identified the problems associated with health literacy, the role it plays in an individual's ability to comprehend health and self-care information, and its relationship to health outcomes. Clarifying the concept is essential so that nurses develop an awareness of the phenomenon and its relationship to the outcomes of their communication and health education efforts. The method used for this concept analysis was that of Walker and Avant (1995). Health literacy empowers people to act appropriately in new and changing health-related circumstances through the use of advanced cognitive and social skills. The defining attributes of health literacy are reading and numeracy skills, comprehension, the capacity to use information in health care decision-making, and successful functioning as a healthcare consumer. Antecedents of health literacy are literacy and a health-related experience. Consequences of health literacy include improved self-reported health status, lower health care costs, increased health knowledge, shorter hospitalizations, and less frequent use of health care services. Empirical referents of the concept are the Test of Functional Health Literacy in Adults and the health literacy component of the National Assessment of Adult Literacy. An analysis of the concept of health literacy enhances nurses' ability to assess more accurately their clients' levels of health literacy, thus identifying those at risk for misunderstanding health care instructions, shame associated with inadequate reading skills, and inability to adhere to health care recommendations.

  2. Plutonium-238 Decision Analysis

    International Nuclear Information System (INIS)

    Brown, Mike; Lechel, David J.; Leigh, C.D.

    1999-01-01

    Five transuranic (TRU) waste sites in the Department of Energy (DOE) complex collectively have more than 2,100 cubic meters of Plutonium-238 (Pu-238) TRU waste that exceeds the wattage restrictions of the Transuranic Package Transporter-II (TRUPACT-II). The Waste Isolation Pilot Plant (WIPP) is being developed by the DOE as a repository for TRU waste. With WIPP opening in 1999, these sites are faced with a need to develop waste management practices that will enable the transportation of Pu-238 TRU waste to WIPP for disposal. This paper describes a decision analysis that provided a logical framework for addressing the Pu-238 TRU waste issue. The insights that can be gained by performing a formalized decision analysis are multifold. First and foremost, the very process of formulating a decision tree forces the decision maker into structured, logical thinking where alternatives can be evaluated one against the other using a uniform set of criteria. In the process of developing the decision tree for transportation of Pu-238 TRU waste, several alternatives were eliminated and the logical order for decision making was discovered. Moreover, the key areas of uncertainty for proposed alternatives were identified and quantified. The decision analysis showed that the DOE can employ a combination approach where they will (1) use headspace gas analyses to show that a fraction of the Pu-238 TRU waste drums are no longer generating hydrogen gas and can be shipped to WIPP ''as-is'', (2) use drums and bags with advanced filter systems to repackage Pu-238 TRU waste drums that are still generating hydrogen, and (3) add hydrogen getter materials to the inner containment vessel of the TRUPACT-II to relieve the build-up of hydrogen gas during transportation of the Pu-238 TRU waste drums

  3. Meta-analysis with R

    CERN Document Server

    Schwarzer, Guido; Rücker, Gerta

    2015-01-01

    This book provides a comprehensive introduction to performing meta-analysis using the statistical software R. It is intended for quantitative researchers and students in the medical and social sciences who wish to learn how to perform meta-analysis with R. As such, the book introduces the key concepts and models used in meta-analysis. It also includes chapters on the following advanced topics: publication bias and small study effects; missing data; multivariate meta-analysis; network meta-analysis; and meta-analysis of diagnostic studies.
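    The book itself works in R; as a language-neutral illustration of the fixed-effect (inverse-variance) pooling at the core of any meta-analysis package, here is a short Python sketch with made-up study estimates.

      import math

      # Hypothetical study effect estimates (e.g., log odds ratios) and standard errors.
      effects = [0.30, 0.10, 0.25, -0.05]
      ses     = [0.12, 0.20, 0.15, 0.25]

      # Fixed-effect (inverse-variance) pooling.
      weights   = [1.0 / se**2 for se in ses]
      pooled    = sum(w * y for w, y in zip(weights, effects)) / sum(weights)
      pooled_se = math.sqrt(1.0 / sum(weights))

      ci_low, ci_high = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
      print(f"pooled effect = {pooled:.3f}  (95% CI {ci_low:.3f} to {ci_high:.3f})")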

  4. Analysis and assessment

    International Nuclear Information System (INIS)

    Grahn, D.

    1975-01-01

    The ultimate objective is to predict potential health costs to man accruing from the effluents or by-products of any energy system or mix of systems, but the establishment of reliable prediction equations first requires a baseline analysis of those preexisting and essentially uncontrolled factors known to have significant influence on patterns of mortality. These factors are the cultural, social, economic, and demographic traits of a defined local or regional population. Thus, the immediate objective is the rigorous statistical definition of consistent relationships that may exist among the above traits and between them and selected causes of death, especially those causes that may have interpretive value for the detection of environmental pollutants

  5. Morphological analysis of ionomers

    Energy Technology Data Exchange (ETDEWEB)

    1990-01-01

    This report discusses the progress made during the period of April 1st, 1989 and March 31st, 1990. Topics covered are: SANS of Telechelic Ionomers, SANS of Sulfonated Polyurethanes, Effect of Matrix Polarity and Ambient Aging on the Morphology of Sulfonated Polyurethane Ionomers, Adhesive Sphere Model for Analysis of SAXS Data from Ionomers, Comparison of Structure-Property Relationships in Carboxylated and Sulfonated Polyurethane Ionomers, Development of a Liquid-like Hard Sphere Model for Deformed Ionomer Samples, and Polymer Synthesis for Proposed Research. (JDL)

  6. Kindle Forensics: Acquisition & Analysis

    Directory of Open Access Journals (Sweden)

    Peter Hannay

    2011-06-01

    Full Text Available The Amazon Kindle eBook reader supports a wide range of capabilities beyond reading books. This functionality includes an inbuilt cellular data connection known as Whispernet. The Kindle provides web browsing, an application framework, eBook delivery and other services over this connection. The historic data left by user interaction with this device may be of forensic interest. Analysis of the Amazon Kindle device has resulted in a method to reliably extract and interpret data from these devices in a forensically complete manner.

  7. Network performance analysis

    CERN Document Server

    Bonald, Thomas

    2013-01-01

    The book presents some key mathematical tools for the performance analysis of communication networks and computer systems. Communication networks and computer systems have become extremely complex. The statistical resource sharing induced by the random behavior of users and the underlying protocols and algorithms may affect Quality of Service. This book introduces the main results of queuing theory that are useful for analyzing the performance of these systems. These mathematical tools are key to the development of robust dimensioning rules and engineering methods. A number of examples i
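    As a minimal example of the queuing results such a book covers, the following Python sketch evaluates the standard M/M/1 formulas (utilisation, mean occupancy, and delay); the arrival and service rates are arbitrary illustrative values.

      # M/M/1 queue: the simplest queuing-theory model of a shared resource.
      # lambda_ = arrival rate, mu = service rate; rho must be < 1 for stability.
      def mm1_metrics(lambda_, mu):
          rho = lambda_ / mu                 # server utilisation
          if rho >= 1.0:
              raise ValueError("unstable queue: arrival rate >= service rate")
          L  = rho / (1.0 - rho)             # mean number in system (Little's law: L = lambda * W)
          W  = 1.0 / (mu - lambda_)          # mean time in system
          Lq = rho**2 / (1.0 - rho)          # mean number waiting
          Wq = rho / (mu - lambda_)          # mean waiting time
          return {"utilisation": rho, "L": L, "W": W, "Lq": Lq, "Wq": Wq}

      # Example: 8 requests/s arriving at a server that handles 10 requests/s.
      print(mm1_metrics(8.0, 10.0))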

  8. Heat Capacity Analysis Report

    International Nuclear Information System (INIS)

    Findikakis, A.

    2004-01-01

    The purpose of this report is to provide heat capacity values for the host and surrounding rock layers for the waste repository at Yucca Mountain. The heat capacity representations provided by this analysis are used in unsaturated zone (UZ) flow, transport, and coupled processes numerical modeling activities, and in thermal analyses as part of the design of the repository to support the license application. Among the reports that use the heat capacity values estimated in this report are the ''Multiscale Thermohydrologic Model'' report, the ''Drift Degradation Analysis'' report, the ''Ventilation Model and Analysis Report'', the ''Igneous Intrusion Impacts on Waste Packages and Waste Forms'' report, the ''Dike/Drift Interactions'' report, the ''Drift-Scale Coupled Processes (DST and TH Seepage) Models'' report, and the ''In-Drift Natural Convection and Condensation'' report. The specific objective of this study is to determine the rock-grain and rock-mass heat capacities for the geologic stratigraphy identified in the ''Mineralogic Model (MM3.0) Report'' (BSC 2004 [DIRS 170031], Table 1-1). This report provides estimates of the heat capacity for all stratigraphic layers except the Paleozoic, for which the mineralogic abundance data required to estimate the heat capacity are not available. The temperature range of interest in this analysis is 25 C to 325 C. This interval is broken into three separate temperature sub-intervals: 25 C to 95 C, 95 C to 114 C, and 114 C to 325 C, which correspond to the preboiling, trans-boiling, and postboiling regimes. Heat capacity is defined as the amount of energy required to raise the temperature of a unit mass of material by one degree (Nimick and Connolly 1991 [DIRS 100690], p. 5). The rock-grain heat capacity is defined as the heat capacity of the rock solids (minerals), and does not include the effect of water that exists in the rock pores. By comparison, the rock-mass heat capacity considers the heat capacity of both solids and pore
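    The report's qualified procedure is not reproduced here, but the general idea of combining grain and pore-water contributions can be illustrated with a simple mass-weighted mixing calculation; every mineral fraction, density, and specific heat below is an assumed, round-number value, not data from the analysis.

      # Illustrative mixing calculation, not the report's qualified method:
      # rock-grain heat capacity as a mass-weighted average of mineral heat
      # capacities, and rock-mass heat capacity including pore water.
      # All numbers below are rough, assumed values in J/(kg*K).

      minerals = {                # mass fraction, specific heat
          "quartz":       (0.40, 740.0),
          "feldspar":     (0.45, 700.0),
          "cristobalite": (0.15, 760.0),
      }

      def grain_heat_capacity(minerals):
          return sum(frac * cp for frac, cp in minerals.values())

      def mass_heat_capacity(cp_grain, porosity, saturation,
                             rho_grain=2500.0, rho_water=1000.0, cp_water=4180.0):
          """Mass-weighted combination of solid grains and pore water."""
          m_solid = (1.0 - porosity) * rho_grain
          m_water = porosity * saturation * rho_water
          return (m_solid * cp_grain + m_water * cp_water) / (m_solid + m_water)

      cp_g = grain_heat_capacity(minerals)
      print(f"rock-grain heat capacity ~ {cp_g:.0f} J/(kg K)")
      print(f"rock-mass  heat capacity ~ {mass_heat_capacity(cp_g, 0.15, 0.8):.0f} J/(kg K)")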

  9. Drift Degradation Analysis

    International Nuclear Information System (INIS)

    D. Kicker

    2004-01-01

    Degradation of underground openings as a function of time is a natural and expected occurrence for any subsurface excavation. Over time, changes occur to both the stress condition and the strength of the rock mass due to several interacting factors. Once the factors contributing to degradation are characterized, the effects of drift degradation can typically be mitigated through appropriate design and maintenance of the ground support system. However, for the emplacement drifts of the geologic repository at Yucca Mountain, it is necessary to characterize drift degradation over a 10,000-year period, which is well beyond the functional period of the ground support system. This document provides an analysis of the amount of drift degradation anticipated in repository emplacement drifts for discrete events and time increments extending throughout the 10,000-year regulatory period for postclosure performance. This revision of the drift degradation analysis was developed to support the license application and fulfill specific agreement items between the U.S. Nuclear Regulatory Commission (NRC) and the U.S. Department of Energy (DOE). The earlier versions of ''Drift Degradation Analysis'' (BSC 2001 [DIRS 156304]) relied primarily on the DRKBA numerical code, which provides for a probabilistic key-block assessment based on realistic fracture patterns determined from field mapping in the Exploratory Studies Facility (ESF) at Yucca Mountain. A key block is defined as a critical block in the surrounding rock mass of an excavation, which is removable and oriented in an unsafe manner such that it is likely to move into an opening unless support is provided. However, the use of the DRKBA code to determine potential rockfall data at the repository horizon during the postclosure period has several limitations: (1) The DRKBA code cannot explicitly apply dynamic loads due to seismic ground motion. (2) The DRKBA code cannot explicitly apply loads due to thermal stress. (3) The DRKBA

  10. Analysis by compression

    DEFF Research Database (Denmark)

    Meredith, David

    MEL is a geometric music encoding language designed to allow for musical objects to be encoded parsimoniously as sets of points in pitch-time space, generated by performing geometric transformations on component patterns. MEL has been implemented in Java and coupled with the SIATEC pattern discovery algorithm to allow for compact encodings to be generated automatically from in extenso note lists. The MEL-SIATEC system is founded on the belief that music analysis and music perception can be modelled as the compression of in extenso descriptions of musical objects.
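    A toy version of the pitch-time point-set idea (not the MEL or SIATEC implementations themselves) can be sketched in a few lines of Python: notes become (onset, pitch) points, and a repeated pattern is recognised as a translation of another point set.

      # Toy illustration of the pitch-time point-set idea: notes as (onset, pitch)
      # points, with a check that one pattern is a translation of another.
      # Correspondence via sorted order is a simplification for this small example.

      motif  = {(0, 60), (1, 62), (2, 64)}           # C-D-E starting at time 0
      answer = {(8, 67), (9, 69), (10, 71)}          # same shape, later and higher

      def translation(pattern_a, pattern_b):
          """Return the (time, pitch) shift mapping pattern_a onto pattern_b, or None."""
          if len(pattern_a) != len(pattern_b):
              return None
          a, b = sorted(pattern_a), sorted(pattern_b)
          shift = (b[0][0] - a[0][0], b[0][1] - a[0][1])
          ok = all((x + shift[0], y + shift[1]) in pattern_b for x, y in pattern_a)
          return shift if ok else None

      print(translation(motif, answer))   # -> (8, 7): 8 time units later, 7 semitones up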

  11. Drift Degradation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    D. Kicker

    2004-09-16

    Degradation of underground openings as a function of time is a natural and expected occurrence for any subsurface excavation. Over time, changes occur to both the stress condition and the strength of the rock mass due to several interacting factors. Once the factors contributing to degradation are characterized, the effects of drift degradation can typically be mitigated through appropriate design and maintenance of the ground support system. However, for the emplacement drifts of the geologic repository at Yucca Mountain, it is necessary to characterize drift degradation over a 10,000-year period, which is well beyond the functional period of the ground support system. This document provides an analysis of the amount of drift degradation anticipated in repository emplacement drifts for discrete events and time increments extending throughout the 10,000-year regulatory period for postclosure performance. This revision of the drift degradation analysis was developed to support the license application and fulfill specific agreement items between the U.S. Nuclear Regulatory Commission (NRC) and the U.S. Department of Energy (DOE). The earlier versions of ''Drift Degradation Analysis'' (BSC 2001 [DIRS 156304]) relied primarily on the DRKBA numerical code, which provides for a probabilistic key-block assessment based on realistic fracture patterns determined from field mapping in the Exploratory Studies Facility (ESF) at Yucca Mountain. A key block is defined as a critical block in the surrounding rock mass of an excavation, which is removable and oriented in an unsafe manner such that it is likely to move into an opening unless support is provided. However, the use of the DRKBA code to determine potential rockfall data at the repository horizon during the postclosure period has several limitations: (1) The DRKBA code cannot explicitly apply dynamic loads due to seismic ground motion. (2) The DRKBA code cannot explicitly apply loads due to thermal

  12. Data analysis with Mplus

    CERN Document Server

    Geiser, Christian

    2012-01-01

    A practical introduction to using Mplus for the analysis of multivariate data, this volume provides step-by-step guidance, complete with real data examples, numerous screen shots, and output excerpts. The author shows how to prepare a data set for import in Mplus using SPSS. He explains how to specify different types of models in Mplus syntax and address typical caveats--for example, assessing measurement invariance in longitudinal SEMs. Coverage includes path and factor analytic models as well as mediational, longitudinal, multilevel, and latent class models. Specific programming tips an

  13. Industrial numerical analysis

    International Nuclear Information System (INIS)

    McKee, S.; Elliott, C.M.

    1986-01-01

    The applications of mathematics to industrial problems involve the formulation of problems which are amenable to mathematical investigation, mathematical modelling, the solution of the mathematical problem and the interpretation of the results. There are 12 chapters describing industrial problems where mathematics and numerical analysis can be applied. These range from the numerical assessment of the flatness of engineering surfaces and plates, the design of chain links, control problems in tidal power generation and low thrust satellite trajectory optimization to mathematical models in welding. One chapter, on the ageing of stainless steels, is indexed separately. (UK)

  14. Analysis of financial opportunities

    Directory of Open Access Journals (Sweden)

    Dessislava Kostova – Pickett

    2018-02-01

    Full Text Available We will start with an analysis structure that defines the key areas in which long-term funding sources will be analyzed and attracted. We will then look at the techniques for calculating the impact on the company's financial performance caused by the introduction of new capital from each of the major sources. We will turn to the often used form of graphing, the EBIT (earnings before interest and taxes) break-even chart, to demonstrate the dynamic impact of financial choices, which changes the company's situation. Once we touch on leases as a special source, we will list the key issues involved in the stock options.
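    The EBIT break-even chart mentioned above compares earnings per share under alternative financing plans; the indifference point can also be computed directly, as in this Python sketch with invented share counts, interest, and tax figures.

      # EBIT-EPS indifference (break-even) point between equity and debt financing.
      # All figures are invented for illustration; the article discusses the chart form.

      def eps(ebit, interest, shares, tax_rate=0.30):
          """Earnings per share after interest and tax."""
          return (ebit - interest) * (1.0 - tax_rate) / shares

      # Plan A: issue new shares (no extra interest, more shares outstanding).
      # Plan B: borrow (extra interest, share count unchanged).
      plan_a = dict(interest=0.0,     shares=1_200_000)
      plan_b = dict(interest=400_000, shares=1_000_000)

      # Indifference EBIT solves eps_A(EBIT) = eps_B(EBIT):
      ebit_star = (plan_b["interest"] * plan_a["shares"]
                   - plan_a["interest"] * plan_b["shares"]) / (plan_a["shares"] - plan_b["shares"])
      print(f"indifference EBIT = {ebit_star:,.0f}")
      print(f"EPS at that EBIT: A={eps(ebit_star, **plan_a):.2f}, B={eps(ebit_star, **plan_b):.2f}")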

  15. Uncertain data envelopment analysis

    CERN Document Server

    Wen, Meilin

    2014-01-01

    This book is intended to present the milestones in the progression of uncertain Data envelopment analysis (DEA). Chapter 1 gives some basic introduction to uncertain theories, including probability theory, credibility theory, uncertainty theory and chance theory. Chapter 2 presents a comprehensive review and discussion of basic DEA models. The stochastic DEA is introduced in Chapter 3, in which the inputs and outputs are assumed to be random variables. To obtain the probability distribution of a random variable, a lot of samples are needed to apply the statistics inference approach. Chapter 4

  16. Future Climate Analysis

    Energy Technology Data Exchange (ETDEWEB)

    James Houseworth

    2001-10-12

    This Analysis/Model Report (AMR) documents an analysis that was performed to estimate climatic variables for the next 10,000 years by forecasting the timing and nature of climate change at Yucca Mountain (YM), Nevada (Figure 1), the site of a potential repository for high-level radioactive waste. The future-climate estimates are based on an analysis of past-climate data from analog meteorological stations, and this AMR provides the rationale for the selection of these analog stations. The stations selected provide an upper and a lower climate bound for each future climate, and the data from those sites will provide input to the infiltration model (USGS 2000) and for the total system performance assessment for the Site Recommendation (TSPA-SR) at YM. Forecasting long-term future climates, especially for the next 10,000 years, is highly speculative and rarely attempted. A very limited literature exists concerning the subject, largely from the British radioactive waste disposal effort. The discussion presented here is one method, among many, of establishing upper and lower bounds for future climate estimates. The method used here involves selecting a particular past climate from many past climates, as an analog for future climate. Other studies might develop a different rationale or select other past climates resulting in a different future climate analog. Revision 00 of this AMR was prepared in accordance with the ''Work Direction and Planning Document for Future Climate Analysis'' (Peterman 1999) under Interagency Agreement DE-AI08-97NV12033 with the U.S. Department of Energy (DOE). The planning document for the technical scope, content, and management of ICN 01 of this AMR is the ''Technical Work Plan for Unsaturated Zone (UZ) Flow and Transport Process Model Report'' (BSC 2001a). The scope for the TBV resolution actions in this ICN is described in the ''Technical Work Plan for: Integrated Management of Technical

  17. Visual Analysis of Humans

    CERN Document Server

    Moeslund, Thomas B

    2011-01-01

    This unique text/reference provides a coherent and comprehensive overview of all aspects of video analysis of humans. Broad in coverage and accessible in style, the text presents original perspectives collected from preeminent researchers gathered from across the world. In addition to presenting state-of-the-art research, the book reviews the historical origins of the different existing methods, and predicts future trends and challenges. This title: features a Foreword by Professor Larry Davis; contains contributions from an international selection of leading authorities in the field; includes

  18. Atom trap trace analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Z.-T.; Bailey, K.; Chen, C.-Y.; Du, X.; Li, Y.-M.; O' Connor, T. P.; Young, L.

    2000-05-25

    A new method of ultrasensitive trace-isotope analysis has been developed based upon the technique of laser manipulation of neutral atoms. It has been used to count individual {sup 85}Kr and {sup 81}Kr atoms present in a natural krypton sample with isotopic abundances in the range of 10{sup {minus}11} and 10{sup {minus}13}, respectively. The atom counts are free of contamination from other isotopes, elements, or molecules. The method is applicable to other trace isotopes that can be efficiently captured with a magneto-optical trap, and has a broad range of potential applications.

  19. analysis methods of uranium

    International Nuclear Information System (INIS)

    Bekdemir, N.; Acarkan, S.

    1997-01-01

    There are various methods for the determination of uranium. The most often used methods are spectrophotometric (PAR, DBM and Arsenazo III) and potentiometric titration methods. For uranium contents between 1-300 g/L U, the potentiometric titration method, based on oxidation-reduction reactions, gives reliable results. PAR (4-(2-pyridylazo)resorcinol) is a sensitive reagent for uranium, forming complexes in aqueous solutions. It is a suitable method for the determination of uranium in amounts between 2-400 micrograms of U. In this study, the spectrophotometric and potentiometric analysis methods used in the Nuclear Fuel Department will be discussed in detail, and other methods and their principles will be briefly mentioned

  20. Analysis in indexing

    DEFF Research Database (Denmark)

    Mai, Jens Erik

    2005-01-01

    The paper discusses the notion of steps in indexing and reveals that the document-centered approach to indexing is prevalent and argues that the document-centered approach is problematic because it blocks out context-dependent factors in the indexing process. A domain-centered approach to indexing is presented as an alternative and the paper discusses how this approach includes a broader range of analyses and how it requires a new set of actions from using this approach; analysis of the domain, users and indexers. The paper concludes that the two-step procedure to indexing is insufficient to explain...

  1. Dimensional analysis made simple

    International Nuclear Information System (INIS)

    Lira, Ignacio

    2013-01-01

    An inductive strategy is proposed for teaching dimensional analysis to second- or third-year students of physics, chemistry, or engineering. In this strategy, Buckingham's theorem is seen as a consequence and not as the starting point. In order to concentrate on the basics, the mathematics is kept as elementary as possible. Simple examples are suggested for classroom demonstrations of the power of the technique and others are put forward for homework or experimentation, but instructors are encouraged to produce examples of their own. (paper)
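    A classic classroom instance of the strategy (not taken from the paper) is the period of a simple pendulum, where matching dimensions fixes the result up to a dimensionless constant:

      \[
      T \propto \ell^{a} m^{b} g^{c}, \qquad
      [\ell^{a} m^{b} g^{c}] = \mathrm{m}^{a+c}\,\mathrm{kg}^{b}\,\mathrm{s}^{-2c} = \mathrm{s},
      \]
      so $b = 0$, $a + c = 0$ and $-2c = 1$, hence $a = \tfrac{1}{2}$, $c = -\tfrac{1}{2}$ and
      \[
      T \propto \sqrt{\ell / g},
      \]
      with the dimensionless prefactor ($2\pi$ for small oscillations) left undetermined, exactly as Buckingham's theorem predicts.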

  2. Analysis of Infiltration Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    R. McCurley

    2003-10-27

    The primary objectives of this uncertainty analysis are: (1) to develop and justify a set of uncertain parameters along with associated distributions; and (2) to use the developed uncertain parameter distributions and the results from selected analog site calculations done in ''Simulation of Net Infiltration for Modern and Potential Future Climates'' (USGS 2001 [160355]) to obtain the net infiltration weighting factors for the glacial transition climate. These weighting factors are applied to unsaturated zone (UZ) flow fields in Total System Performance Assessment (TSPA), as outlined in the ''Total System Performance Assessment-License Application Methods and Approach'' (BSC 2002 [160146], Section 3.1) as a method for the treatment of uncertainty. This report is a scientific analysis because no new mathematical or physical models are developed herein, and it is based on the use of the models developed in or for ''Simulation of Net Infiltration for Modern and Potential Future Climates'' (USGS 2001 [160355]). Any use of the term model refers to those developed in the infiltration numerical model report. TSPA License Application (LA) has included three distinct climate regimes in the comprehensive repository performance analysis for Yucca Mountain: present-day, monsoon, and glacial transition. Each climate regime was characterized using three infiltration-rate maps, including a lower- and upper-bound and a mean value (equal to the average of the two boundary values). For each of these maps, which were obtained based on analog site climate data, a spatially averaged value was also calculated by the USGS. For a more detailed discussion of these infiltration-rate maps, see ''Simulation of Net Infiltration for Modern and Potential Future Climates'' (USGS 2001 [160355]). For this Scientific Analysis Report, spatially averaged values were calculated for the lower-bound, mean, and upper

  3. Simulator training analysis

    International Nuclear Information System (INIS)

    Hollnagel, E.; Rasmussen, J.

    1981-08-01

    This paper presents a suggestion for systematic collection of data during the normal use of training simulators, with the double purpose of supporting trainee debriefing and providing data for further theoretical studies of operator performance. The method is based on previously described models of operator performance and decision-making, and is a specific instance of the general method for analysis of operator performance data. The method combines a detailed transient-specific description of the expected performance with transient-independent tools for observation of critical activities. (author)

  4. Sequence analysis on microcomputers.

    Science.gov (United States)

    Cannon, G C

    1987-10-02

    Overall, each of the program packages performed their tasks satisfactorily. For analyses where there was a well-defined answer, such as a search for a restriction site, there were few significant differences between the program sets. However, for tasks in which a degree of flexibility is desirable, such as homology or similarity determinations and database searches, DNASTAR consistently afforded the user more options in conducting the required analysis than did the other two packages. However, for laboratories where sequence analysis is not a major effort and the expense of a full sequence analysis workstation cannot be justified, MicroGenie and IBI-Pustell offer a satisfactory alternative. MicroGenie is a polished program system. Many may find that its user interface is more "user friendly" than the standard menu-driven interfaces. Its system of filing sequences under individual passwords facilitates use by more than one person. MicroGenie uses a hardware device for software protection that occupies a card slot in the computer on which it is used. Although I am sympathetic to the problem of software piracy, I feel that a less drastic solution is in order for a program likely to be sharing limited computer space with other software packages. The IBI-Pustell package performs the required analysis functions as accurately and quickly as MicroGenie but it lacks the clearness and ease of use. The menu system seems disjointed, and new or infrequent users often find themselves at apparent "dead-end menus" where the only clear alternative is to restart the entire program package. It is suggested from published accounts that the user interface is going to be upgraded and perhaps when that version is available, use of the system will be improved. The documentation accompanying each package was relatively clear as to how to run the programs, but all three packages assumed that the user was familiar with the computational techniques employed. MicroGenie and IBI-Pustell further

  5. [Snoring analysis methods].

    Science.gov (United States)

    Fiz Fernández, José Antonio; Solà Soler, Jordi; Jané Campos, Raimon

    2011-06-11

    Snore is a breathing sound that originates during sleep, either nocturnal or diurnal. Snoring may be inspiratory, expiratory or it may occupy the whole breathing cycle. It is caused by the vibrations of the different tissues of the upper airway. Many procedures have been used to analyze it, from simple interrogation, to standardized questionnaires, to more sophisticated acoustic methods developed thanks to the advance of biomedical techniques in the last years. The present work describes the current state of the art of snoring analysis procedures. Copyright © 2010 Elsevier España, S.L. All rights reserved.
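    Acoustic snore analysis typically starts from simple spectral features of each snore episode. The Python sketch below computes two such features on a synthetic, snore-like signal; the sampling rate, tone frequency, and noise level are arbitrary assumptions, not values from the article.

      # Toy acoustic-feature sketch (not a clinical method): estimate the peak
      # frequency and spectral centroid of a short, synthetic snore-like signal.
      import numpy as np

      fs = 8000                                   # sampling rate in Hz (assumed)
      t = np.arange(0, 1.0, 1.0 / fs)
      # Crude stand-in for a snore epoch: low-frequency tone plus noise.
      signal = np.sin(2 * np.pi * 120 * t) + 0.3 * np.random.randn(t.size)

      spectrum = np.abs(np.fft.rfft(signal * np.hanning(t.size)))
      freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)

      peak_freq = freqs[np.argmax(spectrum)]
      centroid = np.sum(freqs * spectrum) / np.sum(spectrum)
      print(f"peak frequency ~ {peak_freq:.0f} Hz, spectral centroid ~ {centroid:.0f} Hz")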

  6. [Preoperative analysis in rhinoplasty].

    Science.gov (United States)

    Nguyen, P S; Bardot, J; Duron, J B; Levet, Y; Aiach, G

    2014-12-01

    Preoperative analysis in rhinoplasty consists in analyzing individual anatomical and functional characteristics without losing sight of the initial requirements of the patient, to which priority should be given. The examination is primarily clinical but it also uses preoperative photographs taken at specific, accurate angles. Detecting functional disorders or associated general pathologies will reduce the risk of complications. With all of these factors taken into account, the surgeon can work out a rhinoplasty plan which he or she will subsequently explain to the patient and obtain his or her approbation. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  7. Practical multivariate analysis

    CERN Document Server

    Afifi, Abdelmonem; Clark, Virginia A

    2011-01-01

    ""First of all, it is very easy to read. … The authors manage to introduce and (at least partially) explain even quite complex concepts, e.g. eigenvalues, in an easy and pedagogical way that I suppose is attractive to readers without deeper statistical knowledge. The text is also sprinkled with references for those who want to probe deeper into a certain topic. Secondly, I personally find the book's emphasis on practical data handling very appealing. … Thirdly, the book gives very nice coverage of regression analysis. … this is a nicely written book that gives a good overview of a large number

  8. Bayesian Exploratory Factor Analysis

    Science.gov (United States)

    Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.; Piatek, Rémi

    2014-01-01

    This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates from a high dimensional set of psychological measurements. PMID:25431517
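    For contrast with the Bayesian procedure described above (and not as a substitute for it), a classical maximum-likelihood exploratory factor analysis on simulated data can be run in a few lines with scikit-learn; the loading matrix, sample size, and noise level below are invented.

      # Classical (maximum-likelihood) exploratory factor analysis on simulated data,
      # as a baseline illustration only; this is NOT the authors' Bayesian method.
      import numpy as np
      from sklearn.decomposition import FactorAnalysis

      rng = np.random.default_rng(0)
      n, k = 500, 2                               # observations, latent factors
      loadings = np.array([[0.9, 0.0],            # 6 measurements, each loading on one factor
                           [0.8, 0.0],
                           [0.7, 0.0],
                           [0.0, 0.9],
                           [0.0, 0.8],
                           [0.0, 0.7]])
      factors = rng.normal(size=(n, k))
      noise = rng.normal(scale=0.4, size=(n, loadings.shape[0]))
      X = factors @ loadings.T + noise

      fa = FactorAnalysis(n_components=k, random_state=0).fit(X)
      print(np.round(fa.components_.T, 2))        # estimated loadings (up to rotation/sign)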

  9. Contextual analysis of videos

    CERN Document Server

    Thida, Myo; Monekosso, Dorothy

    2013-01-01

    Video context analysis is an active and vibrant research area, which provides means for extracting, analyzing and understanding behavior of a single target and multiple targets. Over the last few decades, computer vision researchers have been working to improve the accuracy and robustness of algorithms to analyse the context of a video automatically. In general, the research work in this area can be categorized into three major topics: 1) counting number of people in the scene 2) tracking individuals in a crowd and 3) understanding behavior of a single target or multiple targets in the scene.

  10. Geometric analysis and PDEs

    CERN Document Server

    Ambrosetti, Antonio; Malchiodi, Andrea

    2009-01-01

    This volume contains lecture notes on some topics in geometric analysis, a growing mathematical subject which uses analytical techniques, mostly of partial differential equations, to treat problems in differential geometry and mathematical physics. The presentation of the material should be rather accessible to non-experts in the field, since the presentation is didactic in nature. The reader will be provided with a survey containing some of the most exciting topics in the field, with a series of techniques used to treat such problems.

  11. gap: Genetic Analysis Package

    Directory of Open Access Journals (Sweden)

    Jing Hua Zhao

    2007-06-01

    Full Text Available A preliminary attempt at collecting tools and utilities for genetic data as an R package called gap is described. Genomewide association is then described as a specific example, linking the work of Risch and Merikangas (1996), Long and Langley (1997) for family-based and population-based studies, and the counterpart for case-cohort design established by Cai and Zeng (2004). Analysis of staged design as outlined by Skol et al. (2006) and associated methods are discussed. The package is flexible, customizable, and should prove useful to researchers especially in its application to genomewide association studies.

  12. Analysis, manifolds and physics

    CERN Document Server

    Choquet-Bruhat, Y

    2000-01-01

    Twelve problems have been added to the first edition; four of them are supplements to problems in the first edition. The others deal with issues that have become important, since the first edition of Volume II, in recent developments of various areas of physics. All the problems have their foundations in volume 1 of the 2-Volume set Analysis, Manifolds and Physics. It would have been prohibitively expensive to insert the new problems at their respective places. They are grouped together at the end of this volume; their logical place is indicated by a number in parentheses following the title.

  13. Analysis of irradiated food

    International Nuclear Information System (INIS)

    Meier, W.

    1991-01-01

    Foods, e.g. chicken, shrimps, frog legs, spices, different dried vegetables, potatoes and fruits are legally irradiated in many countries and are probably also exported into countries, which do not permit irradiation of any food. Therefore all countries need analytical methods to determine whether food has been irradiated or not. Up to now, two physical (ESR-spectroscopy and thermoluminescence) and two chemical methods (o-tyrosine and volatile compounds) are available for routine analysis. Several results of the application of these four mentioned methods on different foods are presented and a short outlook on other methods (chemiluminescence, DNA-changes, biological assays, viscometric method and photostimulated luminescence) will be given. (author)

  14. Limestone rocks analysis by X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Izquierdo M, G.; Ponce R, R.; Vazquez J, J.

    1996-01-01

    At the request of a private company, a fast and accurate method, based essentially on X-ray fluorescence analysis (XRF), was established for the analysis of the major elements in limestone rocks. As complementary analyses, the presence of chlorides was determined by ion chromatography and sodium by atomic absorption. Loss on ignition and alpha quartz were determined by gravimetry. (Author)

  15. Rapid Analysis Model: Reducing Analysis Time without Sacrificing Quality.

    Science.gov (United States)

    Lee, William W.; Owens, Diana

    2001-01-01

    Discusses the performance technology design process and the fact that the analysis phase is often being eliminated to speed up the process. Proposes a rapid analysis model that reduces time needed for analysis and still ensures more successful value-added solutions that focus on customer satisfaction. (LRW)

  16. Image sequence analysis

    CERN Document Server

    1981-01-01

    The processing of image sequences has a broad spectrum of important applications including target tracking, robot navigation, bandwidth compression of TV conferencing video signals, studying the motion of biological cells using microcinematography, cloud tracking, and highway traffic monitoring. Image sequence processing involves a large amount of data. However, because of the progress in computer, LSI, and VLSI technologies, we have now reached a stage when many useful processing tasks can be done in a reasonable amount of time. As a result, research and development activities in image sequence analysis have recently been growing at a rapid pace. An IEEE Computer Society Workshop on Computer Analysis of Time-Varying Imagery was held in Philadelphia, April 5-6, 1979. A related special issue of the IEEE Transactions on Pattern Analysis and Machine Intelligence was published in November 1980. The IEEE Computer magazine has also published a special issue on the subject in 1981. The purpose of this book ...

  17. Dynamic Contingency Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2016-01-14

    The Dynamic Contingency Analysis Tool (DCAT) is an open-platform and publicly available methodology to help develop applications that aim to improve the capabilities of power system planning engineers to assess the impact and likelihood of extreme contingencies and potential cascading events across their systems and interconnections. Outputs from the DCAT will help find mitigation solutions to reduce the risk of cascading outages in technically sound and effective ways. The current prototype DCAT implementation has been developed as a Python code that accesses the simulation functions of the Siemens PSS/E planning tool. It has the following features: It uses a hybrid dynamic and steady-state approach to simulating the cascading outage sequences that includes fast dynamic and slower steady-state events. It integrates dynamic models with protection scheme models for generation, transmission, and load. It models special protection systems (SPSs)/remedial action schemes (RASs) and automatic and manual corrective actions. Overall, the DCAT attempts to bridge multiple gaps in cascading-outage analysis in a single, unique prototype tool capable of automatically simulating and analyzing cascading sequences in real systems using multiprocessor computers. While the DCAT has been implemented using PSS/E in Phase I of the study, other commercial software packages with similar capabilities can be used within the DCAT framework.
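    The DCAT source itself is not reproduced here; the following Python sketch only illustrates the hybrid dynamic/steady-state cascading loop described above. The GridSimulator class is a hypothetical placeholder adapter, not the DCAT code and not the PSS/E (psspy) API.

      # Schematic of a hybrid cascading-outage loop in the spirit described above.
      # 'GridSimulator' is a hypothetical adapter around a planning tool such as
      # PSS/E; its method names are invented for illustration only.

      class GridSimulator:
          """Placeholder wrapper around a power system planning tool."""
          def apply_contingency(self, event): ...
          def run_dynamics(self, seconds): ...
          def solve_steady_state(self): ...
          def tripped_by_protection(self): return []   # elements tripped by relays/SPS
          def overloaded_elements(self): return []      # thermal/voltage violations
          def apply_corrective_actions(self, elements): ...

      def simulate_cascade(sim, initiating_event, max_stages=20):
          sim.apply_contingency(initiating_event)
          sequence = [initiating_event]
          for _ in range(max_stages):
              sim.run_dynamics(seconds=10)              # fast dynamic phase
              trips = sim.tripped_by_protection()
              sim.solve_steady_state()                  # slower quasi-steady-state phase
              overloads = sim.overloaded_elements()
              if not trips and not overloads:
                  break                                 # cascade has stopped
              sim.apply_corrective_actions(overloads)
              sequence.extend(trips + overloads)
          return sequence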

  18. Nonlinear functional analysis

    CERN Document Server

    Deimling, Klaus

    1985-01-01

    topics. However, only a modest preliminary knowledge is needed. In the first chapter, where we introduce an important topological concept, the so-called topological degree for continuous maps from subsets of Rn into Rn, you need not know anything about functional analysis. Starting with Chapter 2, where infinite dimensions first appear, one should be familiar with the essential step of considering a sequence or a function of some sort as a point in the corresponding vector space of all such sequences or functions, whenever this abstraction is worthwhile. One should also work out the things which are proved in § 7 and accept certain basic principles of linear functional analysis quoted there for easier references, until they are applied in later chapters. In other words, even the 'completely linear' sections which we have included for your convenience serve only as a vehicle for progress in nonlinearity. Another point that makes the text introductory is the use of an essentially uniform mathematical languag...

  19. Reactor operational transient analysis

    International Nuclear Information System (INIS)

    Shin, W.K.; Chae, S.K.; Han, K.I.; Yang, K.S.; Chung, H. D.; Kim, H.G.; Moon, H.J.; Ryu, Y.H.

    1983-01-01

    To build up an efficient capability for safety review and inspection of nuclear power plants, studies in the following four areas have been performed: 1) In order to search for the most optimized operating method during load follow, two operating schemes, automatic control and normal control, are compared with each other under the CAOC condition. The analysis performed with the DDID code has shown that the reactor has to be controlled manually by the operator during load follow operation. 2) Through a sensitivity analysis with the COBRA code, operating parameters such as coolant pressure, flow rate, inlet temperature, and power distribution are shown to be important to the determination of DNBR. In particular, the inlet temperature of the primary coolant system appears to be the most sensitive parameter for DNBR. 3) The FRAPCON code is adopted to study the sensitivity of several operational parameters on the mechanical properties of the reactor fuel rod. 4) The calculation procedure required to obtain the neutron fluence at the reactor vessel and the spectrum at the surveillance capsule is established. The results of the computation are compared with those of the FSAR and the SWRI report, proving its applicability to the reactor surveillance program. (Author)

  20. Heavy ion activation analysis

    International Nuclear Information System (INIS)

    Lass, B.D.; Roche, N.G.; Sanni, A.O.; Schweikert, E.A.; Ojo, J.F.

    1982-01-01

    A report on radioactivation with ion beams of ⁶Li and ¹⁴N is presented with some analytical applications: the determination of C via ¹²C(⁶Li,αn)¹³N, and the determination of Li and Be using ¹⁴N activation. Next, examples with limitations in selectivity are given. The detection limits using a 1 μA·h activation irradiation are 5 ppm for C and 1 ppm for Li or Be. With ⁹Be, the reactions suitable for analytical applications are ¹⁰,¹¹B(⁹Be,xn)¹⁸F and ¹⁴N(⁹Be,αn)¹⁸F. Assuming a 1 μA·h irradiation, the detection limits for N and B are 1.5 ng and 0.5 ng, respectively, using a 7.8 MeV ⁹Be beam. For activation with ¹²C, experimental results with a 12 MeV ¹²C beam demonstrate that the beam is best suited for ⁷Li analysis by the reaction ⁷Li(¹²C,n)¹⁸F. The detection limit for a 1 μA·h irradiation is 1 ng, and the only other low-Z elements activated are B and C. Finally, ¹²C radioactivation was further combined with autoradiography for positional analysis. The spatial resolution of the technique was estimated to be 40 μm for an exposure corresponding to 6×10⁵ disintegrations. As little as 10⁻¹² g of Li was readily detected by autoradiography. (author)