Pokkuluri, P Raj; Dwulit-Smith, Jeff; Duke, Norma E; Wilton, Rosemarie; Mack, Jamey C; Bearden, Jessica; Rakowski, Ella; Babnigg, Gyorgy; Szurmant, Hendrik; Joachimiak, Andrzej; Schiffer, Marianne
2013-10-01
Anaeromyxobacter dehalogenans is a δ-proteobacterium found in diverse soils and sediments. It is of interest in bioremediation efforts due to its dechlorination and metal-reducing capabilities. To gain an understanding of A. dehalogenans' ability to adapt to diverse environments, we analyzed its signal transduction proteins. The A. dehalogenans genome codes for a large number of sensor histidine kinases (HK) and methyl-accepting chemotaxis proteins (MCP); among these, 23 HK and 11 MCP proteins have a sensor domain in the periplasm. These proteins most likely contribute to adaptation to the organism's surroundings. We predicted their three-dimensional folds and determined the structures of two of the periplasmic sensor domains by X-ray diffraction. Most of the domains are predicted to have either PAS-like or helical-bundle structures, with two predicted to have a solute-binding protein fold and another predicted to have a 6-phosphogluconolactonase-like fold. Atomic structures of two sensor domains confirmed the respective fold predictions. The Adeh_2942 sensor (HK) was found to have a helical-bundle structure, and the Adeh_3718 sensor (MCP) has a PAS-like structure. Interestingly, the Adeh_3718 sensor has an acetate moiety bound in a binding site typical of PAS-like domains. Future work is needed to determine whether Adeh_3718 is involved in acetate sensing by A. dehalogenans. PMID:23897711
Smidt, H.; de Leest, H.T.J.I.; van der Oost, J.; de Vos
2000-01-01
To characterize the expression and possible regulation of reductive dehalogenation in halorespiring bacteria, an 11.5-kb genomic fragment containing the o-chlorophenol reductive dehalogenase-encoding cprBA genes of the gram-positive bacterium Desulfitobacterium dehalogenans was subjected to detailed molecular characterization. Sequence analysis revealed the presence of eight designated genes in the order cprTKZEBACD, all with the same polarity except for cprT. The deduced cprC and cprK gene p...
Wiegel, Juergen; Zhang, Xiaoming; Wu, Qingzhong
1999-01-01
Ten years after reports on the existence of anaerobic dehalogenation of polychlorinated biphenyls (PCBs) in sediment slurries, we report here on the rapid reductive dehalogenation of para-hydroxylated PCBs (HO-PCBs), the excreted main metabolites of PCB in mammals, which can exhibit estrogenic and antiestrogenic activities in humans. The anaerobic bacterium Desulfitobacterium dehalogenans completely dehalogenates all flanking chlorines (chlorines in ortho position to the para-hydroxyl group) ...
Utkin, I.; Dalton, D. D.; Wiegel, J.
1995-01-01
Resting cells of Desulfitobacterium dehalogenans JW/IU-DC1 grown with pyruvate and 3-chloro-4-hydroxyphenylacetate (3-Cl-4-OHPA) as the electron acceptor and inducer of dehalogenation reductively ortho-dehalogenate pentachlorophenol (PCP); tetrachlorophenols (TeCPs); the trichlorophenols 2,3,4-TCP, 2,3,6-TCP, and 2,4,6-TCP; the dichlorophenols 2,3-DCP, 2,4-DCP, and 2,6-DCP; 2,6-dichloro-4-R-phenols (2,6-DCl-4-RPs, where R is -H, -F, -Cl, -NO2, -CO2, or -COOCH3); 2-chloro-4-R-phenols (2-Cl-4-R...
Hwang, Soonkyu; Song, Yoseb; Cho, Byung-Kwan
2015-01-01
Acetobacterium bakii DSM 8239 is an anaerobic, psychrophilic, and chemolithoautotrophic bacterium that is a potential platform for producing commodity chemicals from syngas fermentation. We report here the draft genome sequence of A. bakii DSM 8239 (4.14 Mb) to elucidate its physiological and metabolic properties related to syngas fermentation.
Transformation of tetrachloromethane to dichloromethane and carbon dioxide by Acetobacterium woodii
Five anaerobic bacteria were tested for their abilities to transform tetrachloromethane so that information about enzymes involved in reductive dehalogenations of polychloromethanes could be obtained. Cultures of the sulfate reducer Desulfobacterium autotrophicum transformed some 80 μM tetrachloromethane to trichloromethane and a small amount of dichloromethane in 18 days under conditions of heterotrophic growth. The acetogens Acetobacterium woodii and Clostridium thermoaceticum in fructose-salts and glucose-salts media, respectively, degraded some 80 μM tetrachloromethane completely within 3 days. Trichloromethane accumulated as a transient intermediate, but the only chlorinated methanes recovered at the end of the incubation were 8 μM dichloromethane and traces of chloromethane. Desulfobacter hydrogenophilus and an autotrophic, nitrate-reducing bacterium were unable to transform tetrachloromethane. Reduction of chlorinated methanes was thus observed only in the organisms with the acetyl-coenzyme A pathway. Experiments with [14C]tetrachloromethane were done to determine the fate of this compound in the acetogen A. woodii. Radioactivity in an 11-day heterotrophic culture was largely (67%) recovered in CO2, acetate, pyruvate, and cell material. In experiments with cell suspensions to which [14C]tetrachloromethane was added, 14CO2 appeared within 20 s as the major transformation product. A. woodii thus catalyzes reductive dechlorinations and transforms tetrachloromethane to CO2 by a series of unknown reactions
Molecular characterization of anaerobic dehalogenation by Desulfitobacterium dehalogenans
Smidt, H.
2001-01-01
Haloorganics such as chlorophenols and chlorinated ethenes are among the most abundant pollutants in soil, sediments and groundwater, mainly caused by past and present industrial and agricultural activities. Due to bioaccumulation and toxicity, these compounds threaten the integrity of the environment, and human and animal health. A recently discovered, phylogenetically diverse, group of anaerobic so-called halorespiring bacteria is able to couple the reductive dehalogenation of various haloo...
Liu, Jin-Feng; Sun, Xiao-Bo; Yang, Guang-Chao; Mbadinga, Serge M; Gu, Ji-Dong; Mu, Bo-Zhong
2015-01-01
Sequestration of CO2 in oil reservoirs is considered one of the feasible options for mitigating the atmospheric CO2 buildup and also for the potential in situ bioconversion of stored CO2 to methane. However, information on these functional microbial communities and on the impact of CO2 storage on them is hardly available. In this paper a comprehensive molecular survey was performed on microbial communities in production water samples from oil reservoirs that had experienced CO2-flooding, by analysis of functional genes involved in the process, including cbbM, cbbL, fthfs, [FeFe]-hydrogenase, and mcrA. As a comparison, these functional genes were also analyzed in production water samples from an oil reservoir in the same oil-bearing bed that had experienced only water-flooding. All of these functional genes showed rich diversity in the samples, and the functional microbial communities and their diversity were strongly affected by long-term exposure to injected CO2. More interestingly, microorganisms affiliated with members of the genera Methanothermobacter, Acetobacterium, and Halothiobacillus, as well as hydrogen producers, either increased or remained unchanged in relative abundance in the CO2-injected area compared to the water-flooded area, which implies that these microorganisms can adapt to CO2 injection and, if so, demonstrates the potential for microbial fixation and conversion of CO2 into methane in subsurface oil reservoirs. PMID:25873911
Sanford, Robert A.; Cole, James R.; Tiedje, James M
2002-01-01
Five strains were isolated which form a physiologically and phylogenetically coherent group of chlororespiring microorganisms and represent the first taxon in the Myxobacteria capable of anaerobic growth. The strains were enriched and isolated from various soils and sediments based on their ability to grow using acetate as an electron donor and 2-chlorophenol (2-CPh) as an electron acceptor. They are slender gram-negative rods with a bright red pigmentation that exhibit gliding motility and f...
Process intensification of gas fermentation with Acetobacterium woodii in stirred-tank reactors
Kantzow, Christina
2016-01-01
Gas fermentations, in which carbon dioxide is converted to acetate with hydrogen by acetogenic bacteria, are an alternative to the synthesis of acetic acid from fossil raw materials. Reaction-engineering analysis of gas fermentations with A. woodii in stirred-tank reactors showed that high acetate concentrations (60 g/L) can be reached in a simple batch process, and high space-time yields (148 g/L/d) in a continuously operated membrane reactor with complete cell retention....
Maurin, Krzysztof
1980-01-01
The extraordinarily rapid advances made in mathematics since World War II have resulted in analysis becoming an enormous organism spreading in all directions. Gone for good surely are the days of the great French "courses of analysis" which embodied the whole of the "analytical" knowledge of the times in three volumes, as in the classical work of Camille Jordan. Perhaps that is why present-day textbooks of analysis are disproportionately modest relative to the present state of the art. More: they have "retreated" to the state before Jordan and Goursat. In recent years the scene has been changing rapidly: Jean Dieudonné is offering us his monumental Éléments d'Analyse (10 volumes) written in the spirit of the great French Cours d'Analyse. To the best of my knowledge, the present book is the only one of its size: starting from scratch-from rational numbers, to be precise-it goes on to the theory of distributions, direct integrals, analysis on complex manifolds, Kähler manifolds, the theory of sheaves...
Richardson, Ruth [Cornell Univ., Ithaca, NY (United States)
2016-02-28
Our overall goal was to improve the understanding of microbial iron and sulfate reduction by evaluating diverse iron- and sulfate-reducing organisms using a multi-omics approach combining “top-down” and “bottom-up” omics methodologies. We initiated one of the first combined comparative genomics, shotgun proteomics, RT-qPCR, and heterologous expression studies in pursuit of our project objectives. Within the first year of this project, we created a new bioinformatics tool for ortholog identification (“SPOCS”). SPOCS is described in our publication, Curtis et al., 2013. Using this tool we were able to identify conserved orthologous groups across diverse iron- and sulfate-reducing microorganisms from the Firmicutes, gamma-proteobacteria, and delta-proteobacteria. For six iron and sulfate reducers we also performed shotgun proteomics (“bottom-up” proteomics, including accurate mass and time (AMT) tag and iTRAQ approaches). Cultures included Gram (-) and Gram (+) microbes. The Gram (-) organisms were Geobacter sulfurreducens, Geobacter bemidjiensis, Shewanella oneidensis, and Anaeromyxobacter dehalogenans, each grown on iron citrate and fumarate. Although all cultures grew on insoluble iron, the iron precipitates interfered with protein extraction and analysis, which remains a major challenge for researchers in disparate study systems. Among the Gram (-) organisms studied, Anaeromyxobacter dehalogenans remains the most poorly characterized; yet it is arguably the most versatile organism we studied. In this work we used comparative proteomics to hypothesize which two of the dozens of predicted c-type cytochromes in Anaeromyxobacter dehalogenans may be directly involved in soluble iron reduction. Unfortunately, heterologous expression of these Anaeromyxobacter dehalogenans c-type cytochromes led to poor protein production and/or formation of inclusion bodies
Thomas, Sara H.; Sanford, Robert A.; Amos, Benjamin K.; Leigh, Mary Beth; Cardenas, Erick; Löffler, Frank E.
2009-01-01
Anaeromyxobacter spp. respire soluble hexavalent uranium, U(VI), leading to the formation of insoluble U(IV), and are present at the uranium-contaminated Oak Ridge Integrated Field Research Challenge (IFC) site. Pilot-scale in situ bioreduction of U(VI) has been accomplished in area 3 of the Oak Ridge IFC site following biostimulation, but the susceptibility of the reduced material to oxidants (i.e., oxygen) compromises long-term U immobilization. Following oxygen intrusion, attached Anaeromy...
Decision analysis multicriteria analysis
The ALARA procedure covers a wide range of decisions, from the simplest to the most complex. For the simplest, engineering judgement is generally sufficient and the use of a decision-aiding technique is therefore not necessary. For some decisions the comparison of the available protection options may be performed on the basis of two or a few criteria (or attributes) (protection cost, collective dose, ...), and rather simple decision-aiding techniques, such as cost-effectiveness analysis or cost-benefit analysis, are quite sufficient. For more complex decisions, involving numerous criteria, large uncertainties, or qualitative judgement, the use of these techniques, even extended cost-benefit analysis, is not recommended, and appropriate techniques such as multi-attribute decision-aiding techniques are more relevant. There are many such techniques and it is not possible to present all of them. Therefore only two broad categories of multi-attribute decision-aiding techniques are presented here: decision analysis and outranking analysis
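The multi-attribute idea described above can be illustrated with a minimal weighted-sum scoring sketch. The options, criterion values, and weights below are invented for illustration; real ALARA assessments would use site-specific data and possibly a more elaborate aggregation method:

```python
# Minimal weighted-sum multi-attribute sketch (illustrative data only).
# Each protection option is scored on several criteria; raw values are
# normalized per criterion (1.0 = best, i.e. lowest cost or dose),
# weighted by the decision-maker's trade-offs, and summed.

options = {
    "option_A": {"protection_cost": 100.0, "collective_dose": 2.0},
    "option_B": {"protection_cost": 250.0, "collective_dose": 0.5},
}
weights = {"protection_cost": 0.4, "collective_dose": 0.6}

def normalize(values):
    """Map raw criterion values to [0, 1]; lower raw value scores higher."""
    lo, hi = min(values.values()), max(values.values())
    if hi == lo:
        return {k: 1.0 for k in values}
    return {k: (hi - v) / (hi - lo) for k, v in values.items()}

def rank(options, weights):
    scores = {name: 0.0 for name in options}
    for crit, w in weights.items():
        norm = normalize({name: vals[crit] for name, vals in options.items()})
        for name in options:
            scores[name] += w * norm[name]
    return sorted(scores.items(), key=lambda kv: -kv[1])

print(rank(options, weights))
```

With these invented weights the dose criterion dominates, so the lower-dose (but costlier) option ranks first; shifting the weights toward cost reverses the ranking, which is exactly the sensitivity such techniques are meant to expose.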
This textbook on instrumental analysis consists of nine chapters. It covers an introduction to analytical chemistry, the process of analysis, and the types and forms of analysis; electrochemistry, including basic theory, potentiometry, and conductometry; electromagnetic radiation and optical components, with an introduction and applications; ultraviolet and visible spectrophotometry; and atomic absorption spectrophotometry, with an introduction, flame emission spectrometry, and plasma emission spectrometry. The remaining chapters cover infrared spectrophotometry, X-ray spectrophotometry and mass spectrometry, chromatography, and other instrumental methods such as radiochemistry.
Sensitivity analysis determines the effectiveness of antibiotics against microorganisms (germs) ... (medlineplus.gov/ency/article/003741.htm)
Gionis, Aristides
2013-01-01
The objective of this report is to highlight opportunities for enhancing global research data infrastructures from the point of view of data analysis. We discuss various directions and data-analysis functionalities for supporting such infrastructures.
Cerebrospinal fluid (CSF) analysis can help detect certain conditions and diseases. An abnormal CSF analysis result may be due to many different causes, including encephalitis (such as West Nile and Eastern equine) and hepatic ...
Kantorovich, L V
1982-01-01
Functional Analysis examines trends in functional analysis as a mathematical discipline and the ever-increasing role played by its techniques in applications. The theory of topological vector spaces is emphasized, along with the applications of functional analysis to applied analysis. Some topics of functional analysis connected with applications to mathematical economics and control theory are also discussed. Comprised of 18 chapters, this book begins with an introduction to the elements of the theory of topological spaces, the theory of metric spaces, and the theory of abstract measure space
Neutron activation analysis, which appears to be approaching the limits of further advance, is the most suitable method for providing information on the principal components as well as the microcomponents in any solid sample. Moreover, instrumental activation analysis is capable of determining a great many elements in various samples. Focusing principally on neutron activation analysis, the literature from 1982 to mid-1984 is surveyed under the following headings: bibliographies, reviews, data collections, etc.; problems in spectral analysis and measurement; activation analysis with neutrons; charged-particle and photonuclear reactions; chemical separation and isotope-dilution activation analysis; molecular activation analysis; standard materials; life sciences and related samples; environmental, food, court-trial, and archaeological samples; space and earth sciences. (Mori, K.)
Bartuňková, Alena
2008-01-01
The objective of this Bachelor thesis is to carry out a strategic analysis of a Czech-owned limited company, Český národní podnik s.r.o. This company sells traditional Czech products and manufactures cosmetics and body care products. The first part of the thesis provides the theoretical background and methodology that are later used for the strategic analysis of the company. The theory outlined in this paper is based on the analysis of external and internal factors. Firstly, the PEST analysis has ...
Chládek, Vítězslav
2012-01-01
The objective of this Bachelor thesis is to carry out a strategic analysis of a Czech-owned limited company, Český národní podnik s.r.o. This company sells traditional Czech products and manufactures cosmetics and body care products. The first part of the thesis provides the theoretical background and methodology that are later used for the strategic analysis of the company. The theory outlined in this paper is based on the analysis of external and internal factors. Firstly, the PEST analysis has ...
Li, L.; Braat, L.C.; Lei, G.; Arets, E.J.M.M.; Liu, J.; Jiang, L.; Fan, Z.; Liu, W.; He, H.; Sun, X.
2014-01-01
This chapter presents the results of the scenario analysis of China’s ecosystems focusing on forest, grassland, and wetland ecosystems. The analysis was undertaken using Conversion of Land Use Change and its Effects (CLUE) modeling and an ecosystem service matrix (as explained below) complemented by
Lanczos, Cornelius
2010-01-01
Basic text for graduate and advanced undergraduate deals with search for roots of algebraic equations encountered in vibration and flutter problems and in those of static and dynamic stability. Other topics devoted to matrices and eigenvalue problems, large-scale linear systems, harmonic analysis and data analysis, more.
We undertook a study programme at the end of 1991. To start with, we performed some exploratory studies aimed at learning preliminary lessons about this type of analysis: assessment of the interest of probabilistic incident analysis; the possibility of using PSA scenarios; and the skills and resources required. At the same time, EPN created a working group whose assignment was to define a new approach to the analysis of incidents at NPPs. This working group gave thought to both aspects of operating feedback that EPN wished to improve: analysis of significant incidents and analysis of potential consequences. We took part in the work of this group, and for the second aspect we proposed a method based on an adaptation of the event-tree method in order to establish a link between existing PSA models and actual incidents. Since the PSAs provide an exhaustive database of accident scenarios applicable to the two most common types of units in France, they are obviously of interest for this sort of analysis. With this method we performed some incident analyses, and at the same time explored some methods employed abroad, particularly ASP (Accident Sequence Precursor, a method used by the NRC). Early in 1994 EDF began a systematic analysis programme; the first, transient phase will set up the methods and an organizational structure. 7 figs
Factor Analysis via Components Analysis
Bentler, Peter M.; de Leeuw, Jan
2011-01-01
When the factor analysis model holds, component loadings are linear combinations of factor loadings, and vice versa. This interrelation permits us to define new optimization criteria and estimation methods for exploratory factor analysis. Although this article is primarily conceptual in nature, an illustrative example and a small simulation show…
Khabaza, I M
1960-01-01
Numerical Analysis is an elementary introduction to numerical analysis, its applications, limitations, and pitfalls. Methods suitable for digital computers are emphasized, but some desk computations are also described. Topics covered range from the use of digital computers in numerical work to errors in computations using desk machines, finite difference methods, and numerical solution of ordinary differential equations. This book is comprised of eight chapters and begins with an overview of the importance of digital computers in numerical analysis, followed by a discussion on errors in comput
Brænder, Morten; Andersen, Lotte Bøgh
2014-01-01
Based on our 2013 article, ”Does Deployment to War Affect Soldiers' Public Service Motivation – A Panel Study of Soldiers Before and After their Service in Afghanistan”, we present panel analysis as a methodological discipline. Panels consist of multiple units of analysis, observed at two or more ... in research settings where it is not possible to distribute units of analysis randomly or where the independent variables cannot be manipulated. The greatest disadvantage of using panel studies is that data may be difficult to obtain. This is most clearly evident in regard to the use of panel surveys...
Tan, Qingming
2011-01-01
Dimensional analysis is an essential scientific method and a powerful tool for solving problems in physics and engineering. This book starts by introducing the Pi Theorem, which is the theoretical foundation of dimensional analysis. It also provides ample and detailed examples of how dimensional analysis is applied to solving problems in various branches of mechanics. The book covers the extensive findings on explosion mechanics and impact dynamics contributed by the author's research group over the past forty years at the Chinese Academy of Sciences. The book is intended for advanced undergra
Goodstein, R L
2010-01-01
Recursive analysis develops natural number computations into a framework appropriate for real numbers. This text is based upon primary recursive arithmetic and presents a unique combination of classical analysis and intuitional analysis. Written by a master in the field, it is suitable for graduate students of mathematics and computer science and can be read without a detailed knowledge of recursive arithmetic.Introductory chapters on recursive convergence and recursive and relative continuity are succeeded by explorations of recursive and relative differentiability, the relative integral, and
Tao, Terence
2016-01-01
This is part one of a two-volume book on real analysis and is intended for senior undergraduate students of mathematics who have already been exposed to calculus. The emphasis is on rigour and foundations of analysis. Beginning with the construction of the number systems and set theory, the book discusses the basics of analysis (limits, series, continuity, differentiation, Riemann integration), through to power series, several variable calculus and Fourier analysis, and then finally the Lebesgue integral. These are almost entirely set in the concrete setting of the real line and Euclidean spaces, although there is some material on abstract metric and topological spaces. The book also has appendices on mathematical logic and the decimal system. The entire text (omitting some less central topics) can be taught in two quarters of 25–30 lectures each. The course material is deeply intertwined with the exercises, as it is intended that the student actively learn the material (and practice thinking and writing ri...
Tao, Terence
2016-01-01
This is part two of a two-volume book on real analysis and is intended for senior undergraduate students of mathematics who have already been exposed to calculus. The emphasis is on rigour and foundations of analysis. Beginning with the construction of the number systems and set theory, the book discusses the basics of analysis (limits, series, continuity, differentiation, Riemann integration), through to power series, several variable calculus and Fourier analysis, and then finally the Lebesgue integral. These are almost entirely set in the concrete setting of the real line and Euclidean spaces, although there is some material on abstract metric and topological spaces. The book also has appendices on mathematical logic and the decimal system. The entire text (omitting some less central topics) can be taught in two quarters of 25–30 lectures each. The course material is deeply intertwined with the exercises, as it is intended that the student actively learn the material (and practice thinking and writing ri...
2016-06-01
Fact sheet summarizing NREL's techno-economic analysis and life-cycle assessment capabilities to connect research with future commercial process integration, a critical step in the scale-up of biomass conversion technologies.
Donoho, Steve
Link analysis is a collection of techniques that operate on data that can be represented as nodes and links. This chapter surveys a variety of techniques including subgraph matching, finding cliques and K-plexes, maximizing spread of influence, visualization, finding hubs and authorities, and combining with traditional techniques (classification, clustering, etc). It also surveys applications including social network analysis, viral marketing, Internet search, fraud detection, and crime prevention.
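The hubs-and-authorities technique surveyed above can be sketched as power iteration over a toy directed graph (the graph and iteration count below are invented for illustration; this follows the standard HITS scheme, not any particular implementation from the chapter):

```python
# Minimal HITS (hubs and authorities) sketch on a toy directed graph.
# links[u] lists the nodes that page u points to.
links = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
nodes = sorted(links)

hub = {n: 1.0 for n in nodes}
auth = {n: 1.0 for n in nodes}

for _ in range(50):  # power iteration until scores stabilize
    # Authority score: sum of hub scores of the pages linking to the node.
    auth = {n: sum(hub[u] for u in nodes if n in links[u]) for n in nodes}
    # Hub score: sum of authority scores of the pages the node links to.
    hub = {n: sum(auth[v] for v in links[n]) for n in nodes}
    # Normalize so the scores stay bounded across iterations.
    s = sum(auth.values()); auth = {n: v / s for n, v in auth.items()}
    s = sum(hub.values()); hub = {n: v / s for n, v in hub.items()}

best_authority = max(auth, key=auth.get)
print(best_authority)
```

Here node "c" is pointed to by three of the four pages, so it emerges as the strongest authority; pages that link to good authorities ("a", "d") accumulate hub score in turn.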
Graffelman, Jan
2013-01-01
Factor analysis is a multivariate statistical method for data reduction that originated in psychometrics and has found applications in many branches of science. The method aims to describe the correlation structure between a large set of observed variables in terms of a few underlying latent variables called factors. Factor analysis employs a specific model, where observed variables are modelled as linear combinations of common factors plus a specific error term. This model can be estimated b...
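The factor model summarized above (observed variables as linear combinations of common factors plus a specific error term) can be sketched numerically. The loading matrix, dimensions, and noise scale below are invented for illustration:

```python
import numpy as np

# Sketch of the factor analysis model: x = L f + e, where L is the
# (variables x factors) loading matrix, f the common factors, and
# e the specific (unique) error terms.
rng = np.random.default_rng(0)

n_obs, n_fac = 5000, 2
L = np.array([[0.9, 0.0],   # illustrative loadings: the first three
              [0.8, 0.1],   # variables load on factor 1,
              [0.7, 0.0],
              [0.0, 0.9],   # the last three on factor 2
              [0.1, 0.8],
              [0.0, 0.7]])

f = rng.standard_normal((n_obs, n_fac))             # common factors
e = 0.5 * rng.standard_normal((n_obs, L.shape[0]))  # specific errors
x = f @ L.T + e                                     # observed variables

# The model implies that variables sharing a factor correlate strongly,
# while variables driven by different factors are nearly uncorrelated.
corr = np.corrcoef(x, rowvar=False)
print(round(corr[0, 1], 2), round(corr[0, 3], 2))
```

The simulated correlation matrix reproduces the structure the model postulates: a large entry for two variables loading on the same factor and a near-zero entry across factors, which is exactly the "correlation structure in terms of a few latent variables" the abstract refers to.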
Gorsuch, Richard L
2013-01-01
Comprehensive and comprehensible, this classic covers the basic and advanced topics essential for using factor analysis as a scientific tool in psychology, education, sociology, and related areas. Emphasizing the usefulness of the techniques, it presents sufficient mathematical background for understanding and sufficient discussion of applications for effective use. This includes not only theory but also the empirical evaluations of the importance of mathematical distinctions for applied scientific analysis.
Paul, Debra; Cadle, James
2010-01-01
Throughout the business world, public, private and not-for-profit organisations face huge challenges. Business analysts must respond by developing practical, creative and financially sound solutions. This excellent guide gives them the necessary tools. It supports everyone wanting to achieve university and industry qualifications in business analysis and information systems. It is particularly beneficial for those studying for ISEB qualifications in Business Analysis. Some important additions since the first edition (2006): the inclusion of new techniques such as Ishikawa diagrams and spaghe
Scott, L Ridgway
2011-01-01
Computational science is fundamentally changing how technological questions are addressed. The design of aircraft, automobiles, and even racing sailboats is now done by computational simulation. The mathematical foundation of this new approach is numerical analysis, which studies algorithms for computing expressions defined with real numbers. Emphasizing the theory behind the computation, this book provides a rigorous and self-contained introduction to numerical analysis and presents the advanced mathematics that underpin industrial software, including complete details that are missing from m
Radioactivation analysis is the technique of analyzing the constituents of a very small sample of matter by making the sample artificially radioactive. The first stage is to make the sample radioactive by artificial means, e.g. by subjecting it to neutron bombardment. Once the sample has been activated, or made radioactive, the next task is to analyze the radiations given off by the sample. This analysis indicates the nature and quantities of the various elements present in the sample, because the radiation from a particular radioisotope is characteristic of that isotope. In 1959 a symposium on 'Radioactivation Analysis' was organized in Vienna by the IAEA and the Joint Commission on Applied Radioactivity (ICSU). It was pointed out that certain factors create uncertainties, and it was elaborated how to overcome them. Attention was drawn to the fact that radioactivation analysis had proven a powerful tool for tackling fundamental problems in geo- and cosmochemistry, and a review was given of the recent work in this field. Because of its extreme sensitivity, radioactivation analysis had been principally employed for trace detection, and its most extensive use had been in the control of semiconductors and very pure metals. An account of the experience gained in the USA was given, where radioactivation analysis was being used by many investigators in various scientific fields as a practical and useful tool for elemental analyses. Much of this work had been concerned with determining sub-microgramme and microgramme concentrations of many different elements in samples of biological materials, drugs, fertilizers, fine chemicals, foods, fuels, glass, ceramic materials, metals, minerals, paints, petroleum products, resinous materials, soils, toxicants, water, and other materials. In addition to these studies, radioactivation analysis had been used by other investigators to determine isotopic ratios of the stable isotopes of some of the elements. Another paper dealt with radioactivation
Loeb, Peter A
2016-01-01
This textbook is designed for a year-long course in real analysis taken by beginning graduate and advanced undergraduate students in mathematics and other areas such as statistics, engineering, and economics. Written by one of the leading scholars in the field, it elegantly explores the core concepts in real analysis and introduces new, accessible methods for both students and instructors. The first half of the book develops both Lebesgue measure and, with essentially no additional work for the student, general Borel measures for the real line. Notation indicates when a result holds only for Lebesgue measure. Differentiation and absolute continuity are presented using a local maximal function, resulting in an exposition that is both simpler and more general than the traditional approach. The second half deals with general measures and functional analysis, including Hilbert spaces, Fourier series, and the Riesz representation theorem for positive linear functionals on continuous functions with compact support....
Jacques, Ian
1987-01-01
This book is primarily intended for undergraduates in mathematics, the physical sciences and engineering. It introduces students to most of the techniques forming the core component of courses in numerical analysis. The text is divided into eight chapters which are largely self-contained. However, with a subject as intricately woven as mathematics, there is inevitably some interdependence between them. The level of difficulty varies and, although emphasis is firmly placed on the methods themselves rather than their analysis, we have not hesitated to include theoretical material when we consider it to be sufficiently interesting. However, it should be possible to omit those parts that do seem daunting while still being able to follow the worked examples and to tackle the exercises accompanying each section. Familiarity with the basic results of analysis and linear algebra is assumed since these are normally taught in first courses on mathematical methods. For reference purposes a list of theorems used in the t...
Brezinski, C
2012-01-01
Numerical analysis has witnessed many significant developments in the 20th century. This book brings together 16 papers dealing with historical developments, survey papers and papers on recent trends in selected areas of numerical analysis, such as: approximation and interpolation, solution of linear systems and eigenvalue problems, iterative methods, quadrature rules, and the solution of ordinary, partial and integral equations. The papers are reprinted from the 7-volume project of the Journal of Computational and Applied Mathematics ('/homepage/sac/cam/na2000/index.html').
Aggarwal, Charu C
2013-01-01
With the increasing advances in hardware technology for data collection, and advances in software technology (databases) for data organization, computer scientists have increasingly participated in the latest advancements of the outlier analysis field. Computer scientists, specifically, approach this field based on their practical experience in managing large amounts of data, and with far fewer assumptions: the data can be of any type, structured or unstructured, and may be extremely large. Outlier Analysis is a comprehensive exposition, as understood by data mining experts, statisticians and
Snell, K S; Langford, W J; Maxwell, E A
1966-01-01
Elementary Analysis, Volume 2 introduces several of the ideas of modern mathematics in a casual manner and provides the practical experience in algebraic and analytic operations that lays a sound foundation of basic skills. This book focuses on the nature of number, algebraic and logical structure, groups, rings, fields, vector spaces, matrices, sequences, limits, functions and inverse functions, complex numbers, and probability. The logical structure of analysis given through the treatment of differentiation and integration, with applications to the trigonometric and logarithmic functions, is
Cluster analysis is the name of a group of multivariate techniques whose principal purpose is to distinguish similar entities from the characteristics they possess. Several algorithms can be used for this analysis. This topic therefore discusses similarity measures and hierarchical clustering, which includes the single linkage, complete linkage and average linkage methods, as well as the popular non-hierarchical K-means method. Finally, this paper describes the advantages and disadvantages of each method
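The three hierarchical linkage criteria named above differ only in how the distance between two clusters is defined. A minimal pure-Python sketch of agglomerative clustering under each criterion, using hypothetical toy data (not from any study discussed here):

```python
import math

# Toy 2-D points, chosen only to illustrate the merging behaviour.
points = [(0.0, 0.0), (0.1, 0.2), (5.0, 5.0), (5.1, 4.9), (9.0, 0.0)]

def single_linkage(c1, c2):
    # Distance between the two CLOSEST members of the clusters.
    return min(math.dist(a, b) for a in c1 for b in c2)

def complete_linkage(c1, c2):
    # Distance between the two FARTHEST members of the clusters.
    return max(math.dist(a, b) for a in c1 for b in c2)

def average_linkage(c1, c2):
    # Mean of all pairwise member distances.
    return sum(math.dist(a, b) for a in c1 for b in c2) / (len(c1) * len(c2))

def agglomerate(pts, linkage, k):
    """Repeatedly merge the two closest clusters until k clusters remain."""
    clusters = [[p] for p in pts]
    while len(clusters) > k:
        i, j = min(
            ((i, j) for i in range(len(clusters)) for j in range(i + 1, len(clusters))),
            key=lambda ij: linkage(clusters[ij[0]], clusters[ij[1]]),
        )
        clusters[i] = clusters[i] + clusters[j]
        del clusters[j]
    return clusters

print(agglomerate(points, single_linkage, 3))
```

With any of the three criteria on this toy data, the two tight pairs merge first and the isolated point stays a singleton; the criteria only begin to disagree on elongated or chained cluster shapes.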
The 19th Scandinavian Conference on Image Analysis was held at the IT University of Copenhagen in Denmark during June 15-17, 2015. The SCIA conference series has been an ongoing biannual event for more than 30 years and over the years it has nurtured a world-class regional research and development... The topics of the accepted papers range from novel applications of vision systems, pattern recognition, machine learning, feature extraction, segmentation, 3D vision, to medical and biomedical image analysis. The papers originate from all the Scandinavian countries and several other European countries...
Rockafellar, Ralph Tyrell
2015-01-01
Available for the first time in paperback, R. Tyrrell Rockafellar's classic study presents readers with a coherent branch of nonlinear mathematical analysis that is especially suited to the study of optimization problems. Rockafellar's theory differs from classical analysis in that differentiability assumptions are replaced by convexity assumptions. The topics treated in this volume include: systems of inequalities, the minimum or maximum of a convex function over a convex set, Lagrange multipliers, minimax theorems and duality, as well as basic results about the structure of convex sets and
Everitt, Brian S; Leese, Morven; Stahl, Daniel
2011-01-01
Cluster analysis comprises a range of methods for classifying multivariate data into subgroups. By organizing multivariate data into such subgroups, clustering can help reveal the characteristics of any structure or patterns present. These techniques have proven useful in a wide range of areas such as medicine, psychology, market research and bioinformatics.This fifth edition of the highly successful Cluster Analysis includes coverage of the latest developments in the field and a new chapter dealing with finite mixture models for structured data.Real life examples are used throughout to demons
Steuding, Jorn
2005-01-01
While its roots reach back to the third century, diophantine analysis continues to be an extremely active and powerful area of number theory. Many diophantine problems have simple formulations, yet they can be extremely difficult to attack, and many open problems and conjectures remain. Diophantine Analysis examines the theory of diophantine approximations and the theory of diophantine equations, with emphasis on interactions between these subjects. Beginning with the basic principles, the author develops his treatment around the theory of continued fractions and examines the classic theory, inclu
This book contains a selection of research works performed in the CEDIAC Institute (Cuyo National University) in the area of Risk Analysis, with specific orientation to the subjects of uncertainty and sensitivity studies, software reliability, severe accident modeling, etc. This volume presents important material for all those researchers who want insight into the risk analysis field as a tool for solving several problems frequently found in engineering and the applied sciences, as well as for academic teachers who want to keep up to date with the new developments and improvements continuously arising in this field
This paper presents the criteria, previous nuclear experience in space, analysis techniques, and possible breakup enhancement devices applicable to an acceptable SP-100 reentry from space. Reactor operation in nuclear-safe orbit will minimize the radiological risk; the remaining safeguards criteria need to be defined. A simple analytical point mass reentry technique and a more comprehensive analysis method that considers vehicle dynamics and orbit insertion malfunctions are presented. Vehicle trajectory, attitude, and possible breakup enhancement devices will be integrated in the simulation as required to ensure an adequate representation of the reentry process
Song Xuexia
2005-01-01
In the past, more attempts have been made to explore ways for teachers to teach English than for learners to learn the language. Learner analysis analyzes "what the learner is", including age, attitude, motivation, intelligence, aptitude, personality, etc., with the purpose of realizing the transition from "teacher-centered" to "learner-oriented" instruction.
Koornneef, M.; Alonso-Blanco, C.; Stam, P.
2006-01-01
The Mendelian analysis of genetic variation, available as induced mutants or as natural variation, requires a number of steps that are described in this chapter. These include the determination of the number of genes involved in the observed trait's variation, the determination of dominance relation
David P. MacKinnon; Fairchild, Amanda J.; Fritz, Matthew S.
2007-01-01
Mediating variables are prominent in psychological theory and research. A mediating variable transmits the effect of an independent variable on a dependent variable. Differences between mediating variables and confounders, moderators, and covariates are outlined. Statistical methods to assess mediation and modern comprehensive approaches are described. Future directions for mediation analysis are discussed.
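One standard way to quantify the mediated effect described above is the product-of-coefficients approach: estimate path a (independent variable to mediator), path b (mediator to dependent variable, holding the independent variable constant), and take a·b as the indirect effect. The sketch below uses hypothetical synthetic data with known true paths; it is an illustration of the idea, not the authors' own procedure:

```python
import random

random.seed(1)

# Hypothetical data: X -> M -> Y with true paths a = 0.5, b = 0.7,
# plus a small direct effect c' = 0.2. True indirect effect = 0.35.
n = 5000
X = [random.gauss(0, 1) for _ in range(n)]
M = [0.5 * x + random.gauss(0, 0.1) for x in X]
Y = [0.7 * m + 0.2 * x + random.gauss(0, 0.1) for m, x in zip(M, X)]

def slope(x, y):
    """OLS slope of y on a single predictor x."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

def two_predictor_ols(x1, x2, y):
    """Solve the 2x2 normal equations for y ~ x1 + x2 (centered data)."""
    m1, m2, my = sum(x1) / len(x1), sum(x2) / len(x2), sum(y) / len(y)
    c1 = [a - m1 for a in x1]
    c2 = [a - m2 for a in x2]
    cy = [a - my for a in y]
    s11 = sum(a * a for a in c1)
    s22 = sum(a * a for a in c2)
    s12 = sum(a * b for a, b in zip(c1, c2))
    s1y = sum(a * b for a, b in zip(c1, cy))
    s2y = sum(a * b for a, b in zip(c2, cy))
    det = s11 * s22 - s12 * s12
    return (s22 * s1y - s12 * s2y) / det, (s11 * s2y - s12 * s1y) / det

a = slope(X, M)                    # path X -> M
_, b = two_predictor_ols(X, M, Y)  # path M -> Y, adjusting for X
print(f"estimated indirect effect a*b = {a * b:.3f}")
```

Note the key distinction from a confounder or moderator: b is estimated with X in the model, so a·b isolates the portion of X's effect transmitted through M.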
Dovjak, M.; Simone, Angela; Kolarik, Jakub;
2011-01-01
Exergy analysis enables us to make connections between processes inside the human body and processes in a building. So far, only the effect of different combinations of air temperatures and mean radiant temperatures has been studied, with constant relative humidity in experimental conditions...
Nielsen, Kirsten
2010-01-01
The first part of this article presents the characteristics of Hebrew poetry: features associated with rhythm and phonology, grammatical features, structural elements like parallelism, and imagery and intertextuality. The second part consists of an analysis of Psalm 121. It is argued that assonance...
Assaf, A. George; Josiassen, Alexander
2016-01-01
and macro applications of these approaches, summarizing and critically reviewing the characteristics of the existing studies. We also conduct a meta-analysis to create an overview of the efficiency results of frontier applications. This allows for an investigation of the impact of frontier methodology...
The objectives of this paper are to: Provide a realistic assessment of consequences; Account for plant and site-specific characteristics; Adjust accident release characteristics to account for results of plant-containment analysis; Produce conditional risk curves for each of five health effects; and Estimate uncertainties
Moore, R; Brødsgaard, I; Miller, ML;
1997-01-01
consistency in use of descriptors within groups, validity of description, accuracy of individuals compared with others in their group, and minimum required sample size were calculated using Cronbach's alpha, factor analysis, and Bayesian probability. Ethnic and professional differences within and across...
Miller, Rupert G
2011-01-01
A concise summary of the statistical methods used in the analysis of survival data with censoring. Emphasizes recently developed nonparametric techniques. Outlines methods in detail and illustrates them with actual data. Discusses the theory behind each method. Includes numerous worked problems and numerical exercises.
Freund, Rudolf J; Sa, Ping
2006-01-01
The book provides complete coverage of the classical methods of statistical analysis. It is designed to give students an understanding of the purpose of statistical analyses, to allow the student to determine, at least to some degree, the correct type of statistical analyses to be performed in a given situation, and have some appreciation of what constitutes good experimental design
Colver, David
2010-01-01
Inclusion analysis is the name given by Operis to a black box testing technique that it has found to make the checking of key financial ratios calculated by spreadsheet models quicker, easier and more likely to find omission errors than code inspection.
The goal of exploration is to find reserves that will earn an adequate rate of return on the capital invested. Neither exploration nor economics is an exact science. The authors must therefore explore in those trends (plays) that have the highest probability of achieving this goal. Trend analysis is a technique for organizing the available data so that these strategic exploration decisions can be made objectively and in conformance with the explorers' goals and risk attitudes. Trend analysis differs from resource estimation in its purpose: it seeks to determine the probability of economic success for an exploration program, not the ultimate results of the total industry effort. Thus the recent past is assumed to be the best estimate of the exploration probabilities for the near future. This information is combined with economic forecasts. The computer software tools necessary for trend analysis are (1) an information data base - requirements and sources; (2) a data conditioning program - assignment to trends, correction of errors, and conversion into usable form; (3) a statistical processing program - calculation of probability of success and the discovery size probability distribution; (4) analytical processing - Monte Carlo simulation to develop the probability distribution of the economic return/investment ratio for a trend. Limited capital (short-run) effects are analyzed using the Gambler's Ruin concept in the Monte Carlo simulation and by a short-cut method. Multiple trend analysis is concerned with comparing and ranking trends, allocating funds among acceptable trends, and characterizing program risk by using risk profiles. In summary, trend analysis is a reality check for long-range exploration planning
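A minimal sketch of step (4), Monte Carlo simulation with a Gambler's Ruin check for limited capital. Every parameter value below (success probability, well cost, discovery value distribution, budget) is a hypothetical assumption for illustration, not a figure from the paper:

```python
import random

random.seed(0)

# Hypothetical trend parameters (assumptions, not data from the paper):
P_SUCCESS = 0.15        # probability a wildcat well is an economic discovery
WELL_COST = 2.0         # $MM cost per well
MEAN_VALUE = 20.0       # $MM mean present value of a discovery
LOGNORM_MEAN = 1.6487   # mean of lognormvariate(0, 1) is e^0.5; used to
                        # rescale draws so their mean equals MEAN_VALUE

def drill_program(budget, n_wells):
    """Simulate one exploration program.

    Returns (ruined, return/investment ratio). Ruin (the Gambler's Ruin
    concept) occurs when remaining capital cannot fund the next well.
    """
    capital, invested, returned = budget, 0.0, 0.0
    for _ in range(n_wells):
        if capital < WELL_COST:
            return True, (returned / invested) if invested else 0.0
        capital -= WELL_COST
        invested += WELL_COST
        if random.random() < P_SUCCESS:
            value = random.lognormvariate(0, 1) * MEAN_VALUE / LOGNORM_MEAN
            capital += value
            returned += value
    return False, returned / invested

runs = [drill_program(budget=10.0, n_wells=20) for _ in range(10000)]
p_ruin = sum(ruined for ruined, _ in runs) / len(runs)
print(f"P(ruin) over 10000 simulated programs: {p_ruin:.2f}")
```

Collecting the return/investment ratios from the surviving runs gives exactly the probability distribution the paper's analytical processing step describes; the ruin flag captures the short-run limited-capital effect.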
Factor Analysis via Components Analysis
Bentler, Peter M.; Jan de Leeuw
2011-01-01
Under the null hypothesis, component loadings are linear combinations of factor loadings, and vice versa. This interrelation permits defining new optimization criteria and estimation methods for exploratory factor analysis. Although this note is primarily conceptual in nature, an illustrative example shows the methodology to be promising.
Cheng, Lizhi; Luo, Yong; Chen, Bo
2014-01-01
This book could be divided into two parts i.e. fundamental wavelet transform theory and method and some important applications of wavelet transform. In the first part, as preliminary knowledge, the Fourier analysis, inner product space, the characteristics of Haar functions, and concepts of multi-resolution analysis, are introduced followed by a description on how to construct wavelet functions both multi-band and multi wavelets, and finally introduces the design of integer wavelets via lifting schemes and its application to integer transform algorithm. In the second part, many applications are discussed in the field of image and signal processing by introducing other wavelet variants such as complex wavelets, ridgelets, and curvelets. Important application examples include image compression, image denoising/restoration, image enhancement, digital watermarking, numerical solution of partial differential equations, and solving ill-conditioned Toeplitz system. The book is intended for senior undergraduate stude...
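As a taste of the Haar functions and multi-resolution analysis the book introduces, one level of the Haar wavelet transform can be sketched in a few lines: pairwise averages form the low-pass (coarse) channel and pairwise half-differences the high-pass (detail) channel. This is a simplified illustration, not code from the book:

```python
def haar_step(signal):
    """One level of the Haar transform: (averages, half-differences)."""
    avg = [(signal[2 * i] + signal[2 * i + 1]) / 2 for i in range(len(signal) // 2)]
    diff = [(signal[2 * i] - signal[2 * i + 1]) / 2 for i in range(len(signal) // 2)]
    return avg, diff

def haar_inverse(avg, diff):
    """Perfect reconstruction: each pair is (a + d, a - d)."""
    out = []
    for a, d in zip(avg, diff):
        out += [a + d, a - d]
    return out

x = [4.0, 2.0, 5.0, 5.0, 7.0, 1.0, 0.0, 2.0]
avg, diff = haar_step(x)
print(avg, diff)
assert haar_inverse(avg, diff) == x
```

Applying `haar_step` recursively to the averages yields the multi-resolution pyramid; compression and denoising then amount to shrinking or discarding small detail coefficients.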
Freitag, Eberhard
2005-01-01
The guiding principle of this presentation of ``Classical Complex Analysis'' is to proceed as quickly as possible to the central results while using a small number of notions and concepts from other fields. Thus the prerequisites for understanding this book are minimal; only elementary facts of calculus and algebra are required. The first four chapters cover the essential core of complex analysis: - differentiation in C (including elementary facts about conformal mappings) - integration in C (including complex line integrals, Cauchy's Integral Theorem, and the Integral Formulas) - sequences and series of analytic functions, (isolated) singularities, Laurent series, calculus of residues - construction of analytic functions: the gamma function, Weierstrass' Factorization Theorem, Mittag-Leffler Partial Fraction Decomposition, and -as a particular highlight- the Riemann Mapping Theorem, which characterizes the simply connected domains in C. Further topics included are: - the theory of elliptic functions based on...
Andersen, Lars
This book contains the lecture notes for the 9th semester course on elastodynamics. The first chapter gives an overview of the basic theory of stress waves propagating in viscoelastic media. In particular, the effect of surfaces and interfaces in a viscoelastic material is studied, and different... Thus, in Chapter 3, an alternative semi-analytic method is derived, which may be applied for the analysis of layered half-spaces subject to moving or stationary loads...
General remarks on sensitivity analysis, the study of changes in a model output produced by varying model inputs, are made first. Sampling methods are discussed, and three sensitivity measures: partial rank correlation, derivative or response surface, and partial variance are described. Some sample results for a 16-input, 13-output hydrodynamics model are given. Both agreement and disagreement were found among the sensitivity measures. 4 figures
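Of the three sensitivity measures listed, rank correlation is the simplest to sketch: rank-transform the sampled inputs and outputs, then correlate the ranks. The two-input model below is hypothetical and only illustrates why rank-based measures tolerate the monotone nonlinearities that weaken a raw linear correlation:

```python
import random

random.seed(42)

def ranks(xs):
    """Replace each value by its 1-based rank (ties are negligible here)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    for rank, i in enumerate(order):
        r[i] = rank + 1.0
    return r

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

# Hypothetical 2-input model: output dominated by a cubic term in x1,
# weakly affected by x2 -- a deliberately nonlinear but monotone response.
n = 500
x1 = [random.uniform(0, 1) for _ in range(n)]
x2 = [random.uniform(0, 1) for _ in range(n)]
y = [10 * a ** 3 + 0.5 * b + random.gauss(0, 0.1) for a, b in zip(x1, x2)]

# Pearson on ranks = Spearman rank correlation, used as a sensitivity measure.
s1 = pearson(ranks(x1), ranks(y))
s2 = pearson(ranks(x2), ranks(y))
print(f"rank correlation with x1: {s1:.2f}, with x2: {s2:.2f}")
```

The cubic input scores near 1 while the weak input scores near 0, correctly ordering the inputs by influence; a partial rank correlation additionally regresses out the other inputs' ranks before correlating, and a partial-variance measure apportions output variance instead.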
Cecconi, Jaures
2011-01-01
G. Bottaro: Quelques resultats d'analyse spectrale pour des operateurs differentiels a coefficients constants sur des domaines non bornes.- L. Garding: Eigenfunction expansions.- C. Goulaouic: Valeurs propres de problemes aux limites irreguliers: applications.- G. Grubb: Essential spectra of elliptic systems on compact manifolds.- J.Cl. Guillot: Quelques resultats recents en Scattering.- N. Schechter: Theory of perturbations of partial differential operators.- C.H. Wilcox: Spectral analysis of the Laplacian with a discontinuous coefficient.
None
1980-06-01
The Energy Policy and Conservation Act (EPCA) mandated that minimum energy efficiency standards be established for classes of refrigerators and refrigerator-freezers, freezers, clothes dryers, water heaters, room air conditioners, home heating equipment, kitchen ranges and ovens, central air conditioners, and furnaces. EPCA requires that standards be designed to achieve the maximum improvement in energy efficiency that is technologically feasible and economically justified. Following the introductory chapter, Chapter Two describes the methodology used in the economic analysis and its relationship to legislative criteria for consumer product efficiency assessment; details how the CPES Value Model systematically compared and evaluated the economic impacts of regulation on the consumer, manufacturer and Nation. Chapter Three briefly displays the results of the analysis and lists the proposed performance standards by product class. Chapter Four describes the reasons for developing a baseline forecast, characterizes the baseline scenario from which regulatory impacts were calculated and summarizes the primary models, data sources and assumptions used in the baseline formulations. Chapter Five summarizes the methodology used to calculate regulatory impacts; describes the impacts of energy performance standards relative to the baseline discussed in Chapter Four. Also discussed are regional standards and other program alternatives to performance standards. Chapter Six describes the procedure for balancing consumer, manufacturer, and national impacts to select standard levels. Details of models and data bases used in the analysis are included in Appendices A through K.
This is the twenty-first biennial review of the inorganic and organic analytical chemistry of water. The format of this review differs somewhat from previous reviews in this series - the most recent of which appeared in Analytical Chemistry in April 1983. Changes in format have occurred in the presentation of material concerning review articles and the inorganic analysis of water sections. Organic analysis of water sections are organized as in previous reviews. Review articles have been compiled and tabulated in an Appendix with respect to subject, title, author(s), citation, and number of references cited. The inorganic water analysis sections are now grouped by constituent using the periodic chart; for example, alkali, alkaline earth, 1st series transition metals, etc. Within these groupings the references are roughly grouped by instrumental technique; for example, spectrophotometry, atomic absorption spectrometry, etc. Multiconstituent methods for determining analytes that cannot be grouped in this manner are compiled into a separate section sorted by instrumental technique. References used in preparing this review were compiled from nearly 60 major journals published during the period from October 1982 through September 1984. Conference proceedings, most foreign journals, most trade journals, and most government publications are excluded. References cited were obtained using the American Chemical Society's Chemical Abstracts for sections on inorganic analytical chemistry, organic analytical chemistry, water, and sewage waste. Cross-references of these sections were also included. 860 references
Newell, Homer E
2006-01-01
When employed with skill and understanding, vector analysis can be a practical and powerful tool. This text develops the algebra and calculus of vectors in a manner useful to physicists and engineers. Numerous exercises (with answers) not only provide practice in manipulation but also help establish students' physical and geometric intuition in regard to vectors and vector concepts.Part I, the basic portion of the text, consists of a thorough treatment of vector algebra and the vector calculus. Part II presents the illustrative matter, demonstrating applications to kinematics, mechanics, and e
Wald, Abraham
2013-01-01
In 1943, while in charge of Columbia University's Statistical Research Group, Abraham Wald devised Sequential Design, an innovative statistical inference system. Because the decision to terminate an experiment is not predetermined, sequential analysis can arrive at a decision much sooner and with substantially fewer observations than equally reliable test procedures based on a predetermined number of observations. The system's immense value was immediately recognized, and its use was restricted to wartime research and procedures. In 1945, it was released to the public and has since revolutio
Clement, J. D.; Kirby, K. D.
1973-01-01
Exploratory calculations were performed for several gas core breeder reactor configurations. The computational method involved the use of the MACH-1 one dimensional diffusion theory code and the THERMOS integral transport theory code for thermal cross sections. Computations were performed to analyze thermal breeder concepts and nonbreeder concepts. Analysis of breeders was restricted to the (U-233)-Th breeding cycle, and computations were performed to examine a range of parameters. These parameters include U-233 to hydrogen atom ratio in the gaseous cavity, carbon to thorium atom ratio in the breeding blanket, cavity size, and blanket size.
Brand, Louis
2006-01-01
The use of vectors not only simplifies treatments of differential geometry, mechanics, hydrodynamics, and electrodynamics, but also makes mathematical and physical concepts more tangible and easy to grasp. This text for undergraduates was designed as a short introductory course to give students the tools of vector algebra and calculus, as well as a brief glimpse into these subjects' manifold applications. The applications are developed to the extent that the uses of the potential function, both scalar and vector, are fully illustrated. Moreover, the basic postulates of vector analysis are brou
Popovová, Šárka
2015-01-01
The aim of this bachelor thesis is to define basic methods used for the preparation of a business strategy and to apply those methods in a real situation. The theoretical part describes the methodology of external and internal analysis. The practical part then applies individual methods such as PEST, VRIO, Porter's five forces and the value chain in order to define the competitive advantages of the Dr. Popov company. The end of the bachelor thesis assesses the current situation and s...
Abbott, Stephen
2015-01-01
This lively introductory text exposes the student to the rewards of a rigorous study of functions of a real variable. In each chapter, informal discussions of questions that give analysis its inherent fascination are followed by precise, but not overly formal, developments of the techniques needed to make sense of them. By focusing on the unifying themes of approximation and the resolution of paradoxes that arise in the transition from the finite to the infinite, the text turns what could be a daunting cascade of definitions and theorems into a coherent and engaging progression of ideas. Acutely aware of the need for rigor, the student is much better prepared to understand what constitutes a proper mathematical proof and how to write one. Fifteen years of classroom experience with the first edition of Understanding Analysis have solidified and refined the central narrative of the second edition. Roughly 150 new exercises join a selection of the best exercises from the first edition, and three more project-sty...
After ten years of operation at the Atucha I Nuclear Power Station, a gear belonging to a pressurized heavy water reactor refuelling machine failed. The gear box was used to operate the inlet-outlet heavy-water valve of the machine. Visual examination of the gear device showed an absence of lubricant and that several gear teeth were broken at the root. Motion was transmitted with a speed-reducing device with controlled adjustable times in order to produce a proper fitness of the valve closure. The aim of this paper is to discuss the results of the gear failure analysis in order to recommend the proper solution to prevent further failures. (Author)
Pharyngeal analysis studies examine the function of the oropharyngeal structures and confirm or refute the impression formed during the bedside clinical assessment. Three test meals consisting of liquid barium, puree, and cookie crumbs are usually used. These consistencies do not help either the nurses or the relatives of the patients in choosing meals. The authors use several test meals that equate to real-life foods. The equivalent real-life foods can be produced in any kitchen. The examination is done with a radiologist, speech pathologist, and a nutritionist present, and a single multidisciplinary report is issued after review of the video study. This report indicates which foods the patient can swallow effectively without aspiration and is much more useful than the three test meals for ongoing management
Rockafellar, R Tyrrell
1998-01-01
From its origins in the minimization of integral functionals, the notion of 'variations' has evolved greatly in connection with applications in optimization, equilibrium, and control. It refers not only to constrained movement away from a point, but also to modes of perturbation and approximation that are best describable by 'set convergence', variational convergence of functions and the like. This book develops a unified framework and, in finite dimension, provides a detailed exposition of variational geometry and subdifferential calculus in their current forms beyond classical and convex analysis. Also covered are set-convergence, set-valued mappings, epi-convergence, duality, maximal monotone mappings, second-order subderivatives, measurable selections and normal integrands. The changes in this 3rd printing mainly concern various typographical corrections, and reference omissions that came to light in the previous printings. Many of these reached the authors' notice through their own re-reading, that of th...
Oehler Dirk
2012-12-01
Abstract Background Thermacetogenium phaeum is a thermophilic, strictly anaerobic bacterium oxidizing acetate to CO2 in syntrophic association with a methanogenic partner. It can also grow in pure culture, e.g., by fermentation of methanol to acetate. The key enzymes of homoacetate fermentation (the Wood-Ljungdahl pathway) are used both in acetate oxidation and acetate formation. The obvious reversibility of this pathway in this organism is of specific interest since syntrophic acetate oxidation operates close to the energetic limitations of microbial life. Results The genome of Th. phaeum is organized on a single circular chromosome and has a total size of 2,939,057 bp. It comprises 3,215 open reading frames, of which 75% could be assigned to a gene function. The G+C content is 53.88 mol%. Many CRISPR sequences were found, indicating heavy phage attack in the past. A complete gene set for a phage was found in the genome, and indications of phage action could also be observed in culture. The genome contained all genes required for CO2 reduction through the Wood-Ljungdahl pathway, including two formyl tetrahydrofolate ligases, three carbon monoxide dehydrogenases, one formate hydrogenlyase complex, three further formate dehydrogenases, and three further hydrogenases. The bacterium contains a menaquinone MQ-7. No indications of cytochromes or Rnf complexes could be found in the genome. Conclusions The information obtained from the genome sequence indicates that Th. phaeum differs basically from the three homoacetogenic bacteria sequenced so far, i.e., the sodium ion-dependent Acetobacterium woodii, the ethanol-producing Clostridium ljungdahlii, and the cytochrome-containing Moorella thermoacetica. The specific enzyme outfit of Th. phaeum obviously allows ATP formation both in acetate formation and acetate oxidation.
Thomas, R.E.
1982-03-01
An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software.
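For reference, the Latin Hypercube Sampling scheme discussed above can be sketched in a few lines: each input's [0, 1) range is divided into n equiprobable strata, each stratum is sampled exactly once, and the strata are permuted independently across inputs. This is a generic sketch of the method, not the study's implementation:

```python
import random

random.seed(0)

def latin_hypercube(n_samples, n_dims):
    """Draw an n_samples x n_dims Latin Hypercube sample on [0, 1)^n_dims.

    Per dimension: shuffle the stratum indices 0..n_samples-1, then place
    one uniform draw inside each stratum. Every marginal stratum is hit
    exactly once, which is what distinguishes LHS from plain random sampling.
    """
    samples = [[0.0] * n_dims for _ in range(n_samples)]
    for d in range(n_dims):
        strata = list(range(n_samples))
        random.shuffle(strata)
        for i in range(n_samples):
            samples[i][d] = (strata[i] + random.random()) / n_samples
    return samples

pts = latin_hypercube(10, 3)
# Stratification check: in each dimension, exactly one point per decile.
for d in range(3):
    assert sorted(int(p[d] * 10) for p in pts) == list(range(10))
```

This stratification is why LHS estimates output distributions efficiently, and also why, as noted above, per-input uncertainty information requires a follow-up step such as rank transformation and stepwise regression on the resulting sample.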
Bhatia, Rajendra
1997-01-01
A good part of matrix theory is functional analytic in spirit. This statement can be turned around. There are many problems in operator theory, where most of the complexities and subtleties are present in the finite-dimensional case. My purpose in writing this book is to present a systematic treatment of methods that are useful in the study of such problems. This book is intended for use as a text for upper division and graduate courses. Courses based on parts of the material have been given by me at the Indian Statistical Institute and at the University of Toronto (in collaboration with Chandler Davis). The book should also be useful as a reference for research workers in linear algebra, operator theory, mathematical physics and numerical analysis. A possible subtitle of this book could be Matrix Inequalities. A reader who works through the book should expect to become proficient in the art of deriving such inequalities. Other authors have compared this art to that of cutting diamonds. One first has to...
This review is the seventh in the series compiled by using the Dialog on-line CA Search facilities at the Information Resource Center of USS Technical Center, covering the period from Oct. 1984 to Nov. 1, 1986. The quest for better surface properties, through the application of various electrochemical and other coating techniques, seems to have increased and reinforces the notion that only through the value added to a steel by proper finishing steps can a major supplier hope to compete profitably. The detection, determination, and control of microalloying constituents has also been generating a lot of interest, as evidenced by the number of publications devoted to this subject in the last few years. Several recent review articles amplify on the recent trends in the application of modern analytical technology to steelmaking. Another review has been devoted to the determination of trace elements and the simultaneous determination of elements in metals by mass spectrometry, atomic absorption spectrometry, and multielement emission spectrometry. Problems associated with the analysis of electroplating wastewaters have been reviewed in a recent publication that has described the use of various spectrophotometric methods for this purpose. The collection and treatment of analytical data in the modern steelmaking environment have been extensively reviewed, with emphasis on the interaction of the providers and users of the analytical data, its quality, and the cost of its collection. Raw material treatment and beneficiation was the dominant theme
Pericardial Fluid Analysis. Related tests: Pleural Fluid Analysis, Peritoneal Fluid Analysis, ...
Peritoneal Fluid Analysis. Related tests: Pleural Fluid Analysis, Pericardial Fluid Analysis, ...
Information security risk analysis
Peltier, Thomas R
2001-01-01
Effective Risk Analysis; Qualitative Risk Analysis; Value Analysis; Other Qualitative Methods; Facilitated Risk Analysis Process (FRAP); Other Uses of Qualitative Risk Analysis; Case Study; Appendix A: Questionnaire; Appendix B: Facilitated Risk Analysis Process Forms; Appendix C: Business Impact Analysis Forms; Appendix D: Sample of Report; Appendix E: Threat Definitions; Appendix F: Other Risk Analysis Opinions; Index
This book deals with information technology and business processes; information system architecture; methods of system development; system development planning, including problem analysis and feasibility analysis; cases of system development; understanding and analyzing user demands, both with traditional analysis and with an integrated information system architecture; system design using an integrated information system architecture; system implementation; and system maintenance.
Theoretical numerical analysis a functional analysis framework
Atkinson, Kendall
2005-01-01
This textbook prepares graduate students for research in numerical analysis/computational mathematics by giving to them a mathematical framework embedded in functional analysis and focused on numerical analysis. This helps the student to move rapidly into a research program. The text covers basic results of functional analysis, approximation theory, Fourier analysis and wavelets, iteration methods for nonlinear equations, finite difference methods, Sobolev spaces and weak formulations of boundary value problems, finite element methods, elliptic variational inequalities and their numerical solu
An example of multidimensional analysis: Discriminant analysis
Among the approaches to multi-dimensional data analysis, lectures on discriminant analysis covering both theoretical and practical aspects are presented. The discrimination problem, the analysis steps and the discrimination categories are stressed. Examples are given of descriptive historical analysis, discrimination for decision making, and the demonstration and separation of the top quark. In the linear discriminant analysis the following subjects are discussed: Huyghens' theorem, projection, the discriminant variable, geometrical interpretation, the case g=2, the classification method, and the separation of top events. Criteria allowing relevant results to be obtained are included
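For the two-class case (g=2) covered in the lectures, Fisher's linear discriminant can be sketched as follows. This is an illustrative pure-Python sketch (not from the lectures): it assumes two-dimensional features so the within-class scatter matrix can be inverted in closed form, and the sample points are hypothetical.

```python
def fisher_lda_2d(class0, class1):
    """Two-class Fisher discriminant in 2-D: w = Sw^{-1} (m1 - m0)."""
    def mean(pts):
        n = len(pts)
        return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)
    def scatter(pts, m):
        sxx = sum((p[0] - m[0]) ** 2 for p in pts)
        sxy = sum((p[0] - m[0]) * (p[1] - m[1]) for p in pts)
        syy = sum((p[1] - m[1]) ** 2 for p in pts)
        return [[sxx, sxy], [sxy, syy]]
    m0, m1 = mean(class0), mean(class1)
    s0, s1 = scatter(class0, m0), scatter(class1, m1)
    sw = [[s0[i][j] + s1[i][j] for j in range(2)] for i in range(2)]
    det = sw[0][0] * sw[1][1] - sw[0][1] * sw[1][0]
    inv = [[sw[1][1] / det, -sw[0][1] / det],
           [-sw[1][0] / det, sw[0][0] / det]]
    d = (m1[0] - m0[0], m1[1] - m0[1])
    w = (inv[0][0] * d[0] + inv[0][1] * d[1],
         inv[1][0] * d[0] + inv[1][1] * d[1])
    # classify by projecting onto w; threshold at the midpoint of class means
    thresh = sum(w[i] * (m0[i] + m1[i]) / 2 for i in range(2))
    return w, thresh

a = [(1.0, 1.1), (1.2, 0.9), (0.8, 1.0), (1.1, 1.2)]
b = [(3.0, 3.1), (3.2, 2.9), (2.8, 3.0), (3.1, 3.2)]
w, t = fisher_lda_2d(a, b)
proj = lambda p: w[0] * p[0] + w[1] * p[1]
assert all(proj(p) < t for p in a) and all(proj(p) > t for p in b)
```

The projection onto w is the "discriminant variable" of the lecture; the geometrical interpretation is the direction that maximizes between-class separation relative to within-class scatter.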
Basic principles of neutron activation analysis are outlined. Examples of its use in police science include analysis for gunshot residues, toxic element determinations and multielement comparisons. Advantages of neutron activation analysis over other techniques are described. (R.L.)
Papageorgiou, Nikolaos S
2009-01-01
Offers an examination of important theoretical methods and procedures in applied analysis. This book details the important theoretical trends in nonlinear analysis and applications to different fields. It is suitable for those working on nonlinear analysis.
Shape analysis in medical image analysis
Tavares, João
2014-01-01
This book contains thirteen contributions from invited experts of international recognition addressing important issues in shape analysis in medical image analysis, including techniques for image segmentation, registration, modelling and classification, and applications in biology, as well as in cardiac, brain, spine, chest, lung and clinical practice. This volume treats topics such as anatomic and functional shape representation and matching; shape-based medical image segmentation; shape registration; statistical shape analysis; shape deformation; shape-based abnormality detection; shape tracking and longitudinal shape analysis; machine learning for shape modeling and analysis; shape-based computer-aided diagnosis; shape-based medical navigation; benchmark and validation of shape representation, analysis and modeling algorithms. This work will be of interest to researchers, students, and manufacturers in the fields of artificial intelligence, bioengineering, biomechanics, computational mechanics, computationa...
杜梅香
2006-01-01
This paper concerns discourse analysis and illustrates the approach to analyzing critical discourses, in particular discourses about the educational situation of China. It also includes a condensed account of the theoretical underpinnings of critical discourse analysis and an analysis of a sample of discourse between an illiterate person and a literate one.
Ibsen, Lars Bo; Liingaard, Morten
This technical report concerns the basic theory and principles for experimental modal analysis. The sections within the report are: Output-only modal analysis software (section 1.1), general digital analysis (section 1.2), basics of structural dynamics and modal analysis (section 1.3) and system...
Foundations of factor analysis
Mulaik, Stanley A
2009-01-01
Introduction; Factor Analysis and Structural Theories; Brief History of Factor Analysis as a Linear Model; Example of Factor Analysis; Mathematical Foundations for Factor Analysis; Introduction; Scalar Algebra; Vectors; Matrix Algebra; Determinants; Treatment of Variables as Vectors; Maxima and Minima of Functions; Composite Variables and Linear Transformations; Introduction; Composite Variables; Unweighted Composite Variables; Differentially Weighted Composites; Matrix Equations; Multi...
Using OrCAD PSpice EDA software, circuit simulation and analyses such as transient analysis, noise analysis and temperature analysis are carried out for a charge-sensitive preamplifier. By calculation and comparison, the results show how the circuit noise responds to temperature changes. (authors)
This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Final Safety Analysis Report. This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report
This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Safety Analysis Report (HNF-SD-WM-SAR-062/Rev.4). This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report
Quantitative analysis chemistry
This book is about quantitative analytical chemistry. It is divided into ten chapters, which deal with the basic concepts of matter and the meaning of analytical chemistry together with SI units; chemical equilibrium; basic preparations for quantitative analysis; an introduction to volumetric analysis; acid-base titration, with an outline and example experiments; chelate titration; oxidation-reduction titration, with an introduction, titration curves and diazotization titration; precipitation titration; and electrometric titration and quantitative analysis.
Analysis of Precision of Activation Analysis Method
Heydorn, Kaj; Nørgaard, K.
1973-01-01
The precision of an activation-analysis method prescribes the estimation of the precision of a single analytical result. The adequacy of these estimates to account for the observed variation between duplicate results from the analysis of different samples and materials, is tested by the statistic T...
Santiago, John
2013-01-01
Circuits overloaded from electric circuit analysis? Many universities require that students pursuing a degree in electrical or computer engineering take an Electric Circuit Analysis course to determine who will "make the cut" and continue in the degree program. Circuit Analysis For Dummies will help these students to better understand electric circuit analysis by presenting the information in an effective and straightforward manner. Circuit Analysis For Dummies gives you clear-cut information about the topics covered in an electric circuit analysis course to help
Cluster analysis for applications
Anderberg, Michael R
1973-01-01
Cluster Analysis for Applications deals with methods and various applications of cluster analysis. Topics covered range from variables and scales to measures of association among variables and among data units. Conceptual problems in cluster analysis are discussed, along with hierarchical and non-hierarchical clustering methods. The necessary elements of data analysis, statistics, cluster analysis, and computer implementation are integrated vertically to cover the complete path from raw data to a finished analysis.Comprised of 10 chapters, this book begins with an introduction to the subject o
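One of the hierarchical methods covered by texts of this kind, single-linkage agglomerative clustering, can be sketched in a few lines. The data below are hypothetical and the naive O(n^3) implementation is for illustration only, not the book's algorithms.

```python
def single_linkage(points, n_clusters):
    """Naive agglomerative clustering with single linkage: repeatedly
    merge the two clusters whose closest members are nearest."""
    clusters = [[p] for p in points]
    def dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
    while len(clusters) > n_clusters:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                # single linkage: distance between closest members
                d = min(dist(p, q) for p in clusters[i] for q in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters.pop(j)
    return clusters

pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
out = single_linkage(pts, 2)
assert sorted(len(c) for c in out) == [3, 3]  # two well-separated groups
```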
Georgiana Cristina NUKINA
2012-07-01
Full Text Available Through the risk analysis model developed, it is decided whether control measures are suitable for implementation. The analysis determines whether the benefits of a given control option outweigh the cost of its implementation.
Joint fluid analysis; Joint fluid aspiration ... El-Gabalawy HS. Synovial fluid analysis, synovial biopsy, and synovial pathology. In: Firestein GS, Budd RC, Gabriel SE, McInnes IB, O'Dell JR, eds. Kelly's Textbook of ...
Hostick, Donna J.; Nicholls, Andrew K.; McDonald, Sean C.; Hollomon, Jonathan B.
2005-08-01
A joint NREL, ORNL, and PNNL team conducted market analysis to help inform DOE/EERE's Weatherization and Intergovernmental Program planning and management decisions. This chapter presents the results of the market analysis for the Buildings sector.
Event history analysis: overview
Keiding, Niels
2001-01-01
Survival analysis, Multi-state models, Counting processes, Aalen-Johansen estimator, Markov processes
Durbin, Richard; Eddy, Sean; Krogh, Anders Stærmose;
This book provides an up-to-date and tutorial-level overview of sequence analysis methods, with particular emphasis on probabilistic modelling. Discussed methods include pairwise alignment, hidden Markov models, multiple alignment, profile searches, RNA secondary structure analysis, and...
Deckert, George
2010-01-01
This viewgraph presentation reviews The NASA Hazard Analysis process. The contents include: 1) Significant Incidents and Close Calls in Human Spaceflight; 2) Subsystem Safety Engineering Through the Project Life Cycle; 3) The Risk Informed Design Process; 4) Types of NASA Hazard Analysis; 5) Preliminary Hazard Analysis (PHA); 6) Hazard Analysis Process; 7) Identify Hazardous Conditions; 8) Consider All Interfaces; 9) Work a Preliminary Hazard List; 10) NASA Generic Hazards List; and 11) Final Thoughts
Whitaker, Simon
2009-01-01
It is argued that the determinants of low-frequency (less than once an hour) challenging behavior are likely to be more complex than those of high-frequency behavior, involving setting events that may not be present when the behavior occurs. The analysis of case records is then examined as a method of identifying possible setting events for low-frequency behaviours. It is suggested that time series analysis, correlational analysis and time lag sequential analysis may all be useful methods in th...
Analysis of Business Environment
Horáková, Eva
2012-01-01
This bachelor's thesis deals with an analysis of the entrepreneurial environment of the company ALBO okna - dveře, s.r.o. With the help of a SWOT analysis, passportisation and Porter's analysis of five competitive forces, an analysis of the entrepreneurial environment in which the company ALBO okna - dveře, s.r.o. is situated is carried out. The thesis also comprises an evaluation of the opportunities and risks resulting from the given entrepreneurial environment.
Computer Programming Job Analysis
Debdulal Dutta Roy
2002-01-01
This study investigated the relative uses of computer programming job characteristics across different organizations and the effects of different demographic variables on job analysis ratings. Data were collected from 201 computer programmers of 6 different organizations through a checklist. Principal component analysis noted four mostly used job characteristics: program writing and testing, human relations, data analysis and user satisfaction. Of them only data analysis differed among different organ...
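Principal component analysis of the kind used in the study above can be sketched in closed form for two variables, where the leading eigenvector of the 2x2 covariance matrix has an explicit formula. The data here are synthetic and purely illustrative.

```python
import math

def first_principal_component(data):
    """Leading eigenvector and eigenvalue of the 2x2 sample covariance
    matrix, found in closed form (sufficient for two-variable data)."""
    n = len(data)
    mx = sum(x for x, _ in data) / n
    my = sum(y for _, y in data) / n
    sxx = sum((x - mx) ** 2 for x, _ in data) / (n - 1)
    syy = sum((y - my) ** 2 for _, y in data) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in data) / (n - 1)
    # largest eigenvalue of [[sxx, sxy], [sxy, syy]]
    lam = (sxx + syy) / 2 + math.sqrt(((sxx - syy) / 2) ** 2 + sxy ** 2)
    # corresponding eigenvector (b, lam - a), normalized
    v = (sxy, lam - sxx) if abs(sxy) > 1e-12 else (1.0, 0.0)
    norm = math.hypot(*v)
    return (v[0] / norm, v[1] / norm), lam

data = [(i, 2 * i + 0.1 * ((-1) ** i)) for i in range(10)]
(vx, vy), lam = first_principal_component(data)
# data lies close to the line y = 2x, so the component slope is near 2
assert abs(vy / vx - 2.0) < 0.1
```

Larger problems (like the study's checklist items) need iterative eigendecomposition, but the geometry is the same: the component is the direction of maximum variance.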
Discourse analysis and Foucault's
Jansen I.
2008-01-01
Full Text Available Discourse analysis is a method which up to now has been little recognized in nursing science, although more recently nursing scientists are discovering it for their purposes. However, several authors have criticized that discourse analysis is often misinterpreted because of a lack of understanding of its theoretical backgrounds. In this article, I reconstruct Foucault’s writings in his “Archaeology of Knowledge” to provide a theoretical base for future archaeological discourse analysis, which can be categorized as a socio-linguistic discourse analysis.
Extending Scalasca's analysis features
Lorenz, Daniel; Böhme, David; Mohr, Bernd; Strube, Alexandre; Szebenyi, Zoltan
2013-01-01
Scalasca is a performance analysis tool, which parses the trace of an application run for certain patterns that indicate performance inefficiencies. In this paper, we present recently developed new features in Scalasca. In particular, we describe two newly implemented analysis methods: the root cause analysis, which tries to identify the cause of a delay, and the critical path analysis, which analyses the path of execution that determines the application runtime. Furthermore, we present time-s...
SOFAS: Software Analysis Services
Ghezzi, G
2010-01-01
We propose a distributed and collaborative software analysis platform to enable seamless interoperability of software analysis tools across platform, geographical and organizational boundaries. In particular, we devise software analysis tools as services that can be accessed and composed over the Internet. These distributed services shall be widely accessible through a software analysis broker where organizations and research groups can register and share their tools. To enable (semi)-automat...
Regression analysis by example
Chatterjee, Samprit
2012-01-01
Praise for the Fourth Edition: "This book is . . . an excellent source of examples for regression analysis. It has been and still is readily readable and understandable." -Journal of the American Statistical Association. Regression analysis is a conceptually simple method for investigating relationships among variables. Carrying out a successful application of regression analysis, however, requires a balance of theoretical results, empirical rules, and subjective judgment. Regression Analysis by Example, Fifth Edition has been expanded
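The conceptually simple core of regression analysis, a one-predictor least-squares fit, can be sketched with the closed-form normal equations. This is an illustrative stdlib-Python sketch with made-up data, not an example from the book.

```python
def simple_ols(xs, ys):
    """Least-squares fit y ~ b0 + b1*x via the closed-form
    normal equations for one predictor."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # slope: covariance of x and y over variance of x
    b1 = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
          / sum((x - mx) ** 2 for x in xs))
    b0 = my - b1 * mx  # intercept passes through the mean point
    return b0, b1

xs = [0, 1, 2, 3, 4]
ys = [1.0, 3.0, 5.0, 7.0, 9.0]   # exactly y = 1 + 2x
b0, b1 = simple_ols(xs, ys)
assert abs(b0 - 1.0) < 1e-9 and abs(b1 - 2.0) < 1e-9
```

The empirical rules and diagnostics the book emphasizes (residual plots, influence measures) sit on top of this basic fit.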
Computational movement analysis
Laube, Patrick
2014-01-01
This SpringerBrief discusses the characteristics of spatiotemporal movement data, including uncertainty and scale. It investigates three core aspects of Computational Movement Analysis: Conceptual modeling of movement and movement spaces, spatiotemporal analysis methods aiming at a better understanding of movement processes (with a focus on data mining for movement patterns), and using decentralized spatial computing methods in movement analysis. The author presents Computational Movement Analysis as an interdisciplinary umbrella for analyzing movement processes with methods from a range of fi
Fitzmaurice, Garrett M; Ware, James H
2012-01-01
Praise for the First Edition: ". . . [this book] should be on the shelf of everyone interested in . . . longitudinal data analysis." -Journal of the American Statistical Association. Features newly developed topics and applications of the analysis of longitudinal data. Applied Longitudinal Analysis, Second Edition presents modern methods for analyzing data from longitudinal studies and now features the latest state-of-the-art techniques. The book emphasizes practical, rather than theoretical, aspects of methods for the analysis of diverse types of lo
This book gives descriptions of use and methods of analysis, including the handling of glassware, boiling, filtration of liquids, distillation methods and sampling processes, evaporation and drying, and reagent solutions and concentrations. It explains water quality analysis with quantitative analysis, gravimetric analysis and solvent extraction. It also introduces flow measurement, and test methods for individual items such as hydrogen ion concentration, ammoniacal nitrogen, phosphate, chromium, cadmium, alkyl mercury and coliform counts.
Yuan, Ying; MacKinnon, David P.
2009-01-01
This article proposes Bayesian analysis of mediation effects. Compared to conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian mediation analysis, inference is straightforward and exact, which makes it appealing for studies with small samples. Third, the Bayesian approach is conceptua...
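One simple variant of the product-of-coefficients ("a times b") indirect effect with a Monte Carlo interval can be sketched as follows. This is not the authors' method in full: it treats the posteriors of the two path coefficients as independent normals, a common large-sample approximation, and the inputs are hypothetical.

```python
import random

def mediation_ab_interval(a_hat, a_se, b_hat, b_se, n_draws=20000, seed=1):
    """Monte Carlo 95% interval for the indirect effect a*b, sampling
    the path coefficients from independent normal approximations."""
    rng = random.Random(seed)
    draws = sorted(rng.gauss(a_hat, a_se) * rng.gauss(b_hat, b_se)
                   for _ in range(n_draws))
    lo = draws[int(0.025 * n_draws)]
    hi = draws[int(0.975 * n_draws)]
    return lo, hi

# hypothetical estimates: X->M path a = 0.5, M->Y path b = 0.4
lo, hi = mediation_ab_interval(0.5, 0.05, 0.4, 0.05)
assert lo < 0.5 * 0.4 < hi   # point estimate inside the interval
assert lo > 0                # effect clearly nonzero at these SEs
```

A full Bayesian treatment as described in the article would place priors on the regression coefficients and sample the joint posterior rather than rely on this normal approximation.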
Corliss, William R.
1968-01-01
In activation analysis, a sample of an unknown material is first irradiated (activated) with nuclear particles. In practice these nuclear particles are almost always neutrons. The success of activation analysis depends upon nuclear reactions which are completely independent of an atom's chemical associations. The value of activation analysis as a research tool was recognized almost immediately upon the discovery of artificial radioactivity. This book discusses activation analysis experiments, applications and technical considerations.
[Transcriptome analysis and epigenetic analysis during osteoclastogenesis].
Nakamura, Shinya; Tanaka, Sakae
2016-04-01
The importance of receptor activator of nuclear factor-κB ligand (RANKL) during osteoclastogenesis was discovered in 1998. Thereafter, Nfatc1, a downstream gene of RANKL-RANK signaling, was identified as a master regulator of osteoclastogenesis by transcriptome analysis. In recent years, with the advancement of epigenetic analysis methods and big-data analysis technology, epigenetic analysis of osteoclastogenesis has gradually progressed. Papers using H3K4me3 and H3K27me3 histone modification change data, DNase-seq data and formaldehyde-assisted isolation of regulatory elements (FAIRE)-seq data during osteoclastogenesis were published recently. This will probably contribute to elucidating the crosstalk between osteoclasts and osteoblasts, osteocytes or chondrocytes in the future. PMID:27013627
Cuesta, Hector
2013-01-01
Each chapter of the book quickly introduces a key 'theme' of Data Analysis, before immersing you in the practical aspects of each theme. You'll learn quickly how to perform all aspects of Data Analysis. Practical Data Analysis is a book ideal for home and small business users who want to slice & dice the data they have on hand with minimum hassle.
Hibbert, DBrynn
2005-01-01
Based on D. Brynn Hibbert's lectures on data analysis to undergraduates and graduate students, this book covers topics including measurements, means and confidence intervals, hypothesis testing, analysis of variance, and calibration models. It is meant as an entry-level book targeted at learning and teaching undergraduate data analysis.
Yuan, Ying; MacKinnon, David P.
2009-01-01
In this article, we propose Bayesian analysis of mediation effects. Compared with conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian…
Sparse exploratory factor analysis
Fontanella, Sara; Trendafilov, Nickolay; Adachi, Kohei
2014-01-01
Sparse principal component analysis has been a very active research area over the last decade. At the same time, there are very few works on sparse factor analysis. We propose a new contribution to the area by exploring a procedure for sparse factor analysis where the unknown parameters are found simultaneously.
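One common sketch of a sparse-loading procedure (not necessarily the authors' method) is power iteration with soft-thresholding, which drives small loadings exactly to zero. The covariance matrix and threshold below are hypothetical.

```python
def sparse_leading_vector(cov, lam, iters=200):
    """Power iteration with soft-thresholding: a simple sparse-PCA
    sketch that zeroes out small loadings of the leading vector."""
    n = len(cov)
    v = [1.0 / n ** 0.5] * n
    for _ in range(iters):
        # multiply by the covariance matrix
        w = [sum(cov[i][j] * v[j] for j in range(n)) for i in range(n)]
        # soft-threshold each loading, then renormalize
        w = [max(abs(x) - lam, 0.0) * (1 if x >= 0 else -1) for x in w]
        norm = sum(x * x for x in w) ** 0.5
        if norm == 0:
            break
        v = [x / norm for x in w]
    return v

# block covariance: variables 0-1 are strongly correlated, 2-3 are noise
cov = [[1.0, 0.9, 0.0, 0.0],
       [0.9, 1.0, 0.0, 0.0],
       [0.0, 0.0, 0.2, 0.0],
       [0.0, 0.0, 0.0, 0.2]]
v = sparse_leading_vector(cov, lam=0.3)
assert v[2] == 0.0 and v[3] == 0.0       # noise loadings zeroed out
assert abs(v[0]) > 0.5 and abs(v[1]) > 0.5
```

Sparse factor analysis adds a noise model on top of this picture, but the thresholding idea, trading a little variance for exact zeros, is the same.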
Harmonic and spectral analysis
Székelyhidi, László
2014-01-01
This book provides a modern introduction to harmonic analysis and synthesis on topological groups. It serves as a guide to the abstract theory of Fourier transformation. For the first time, it presents a detailed account of the theory of classical harmonic analysis together with the recent developments in spectral analysis and synthesis. Sample Chapter(s): Chapter 1: Duality of Finite Abelian Groups (254 KB). Contents: Abstract Harmonic Analysis: Duality of Finite Abelian Groups; Harmonic Analysis on Finite Abelian Groups; Set
Foundations of mathematical analysis
Johnsonbaugh, Richard
2010-01-01
This classroom-tested volume offers a definitive look at modern analysis, with views of applications to statistics, numerical analysis, Fourier series, differential equations, mathematical analysis, and functional analysis. Upper-level undergraduate students with a background in calculus will benefit from its teachings, along with beginning graduate students seeking a firm grounding in modern analysis. A self-contained text, it presents the necessary background on the limit concept, and the first seven chapters could constitute a one-semester introduction to limits. Subsequent chapters discuss
Mathematical analysis fundamentals
Bashirov, Agamirza
2014-01-01
The author's goal is a rigorous presentation of the fundamentals of analysis, starting from the elementary level and moving to advanced coursework. The curriculum of all mathematics (pure or applied) and physics programs includes a compulsory course in mathematical analysis. This book can serve as the main textbook for such (one-semester) courses. The book can also serve as additional reading for courses such as real analysis, functional analysis, harmonic analysis, etc. For non-math-major students requiring math beyond calculus, this is a more friendly approach than many math-centric o
Analysis in usability evaluations
Følstad, Asbjørn; Lai-Chong Law, Effie; Hornbæk, Kasper Anders Søren
2010-01-01
While the planning and implementation of usability evaluations are well described in the literature, the analysis of the evaluation data is not. We present interviews with 11 usability professionals on how they conduct analysis, describing the resources, collaboration, creation of recommendations, and prioritization involved. The interviews indicate a lack of structure in the analysis process and suggest activities, such as generating recommendations, that are unsupported by existing methods. We discuss how to better support analysis, and propose four themes for future research on analysis in usability evaluations.
Gabriel Data Analysis (GDA): from data analysis to food analysis
OLIVE, Gilles
2011-01-01
GDA is a software belonging to the Gabriel package and is devoted to data analysis. Year after year some new features have been introduced and the latest introductions are more dedicated to food. GDA is built around modules and we describe here the most widely used in food chemistry. GDA can be obtained free of charge upon request.
Amir Farbin
The ATLAS Analysis Model is a continually developing vision of how to reconcile physics analysis requirements with the ATLAS offline software and computing model constraints. In the past year this vision has influenced the evolution of the ATLAS Event Data Model, the Athena software framework, and physics analysis tools. These developments, along with the October Analysis Model Workshop and the planning for CSC analyses, have led to a rapid refinement of the ATLAS Analysis Model in the past few months. This article introduces some of the relevant issues and presents the current vision of the future ATLAS Analysis Model. Event Data Model: The ATLAS Event Data Model (EDM) consists of several levels of detail, each targeted at a specific set of tasks. For example the Event Summary Data (ESD) stores calorimeter cells and tracking system hits, thereby permitting many calibration and alignment tasks, but will be only accessible at particular computing sites with potentially large latency. In contrast, the Analysis...
Bøving, Kristian Billeskov; Simonsen, Jesper
2004-01-01
This article documents how log analysis can inform qualitative studies concerning the usage of web-based information systems (WIS). No prior research has used http log files as data to study collaboration between multiple users in organisational settings. We investigate how to perform http log analysis; what http log analysis says about the nature of collaborative WIS use; and how results from http log analysis may support other data collection methods such as surveys, interviews, and observation. The analysis of log files initially lends itself to research designs which serve to test hypotheses using a quantitative methodology. We show that http log analysis can also be valuable in qualitative research such as case studies. The results from http log analysis can be triangulated with other data sources and for example serve as a means of supporting the interpretation of interview data. It...
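A minimal sketch of the quantitative side of http log analysis is parsing Common Log Format lines and counting requests per authenticated user. The log lines below are fabricated; a real WIS study like the one described would also reconstruct sessions and collaboration patterns.

```python
import re
from collections import Counter

# Common Log Format: host ident user [time] "request" status bytes
LOG_RE = re.compile(r'(\S+) \S+ (\S+) \[([^\]]+)\] "([^"]*)" (\d{3}) (\S+)')

def requests_per_user(lines):
    """Count requests per authenticated user in an http access log."""
    counts = Counter()
    for line in lines:
        m = LOG_RE.match(line)
        if m:
            counts[m.group(2)] += 1   # group 2 is the authuser field
    return counts

log = [
    '10.0.0.1 - alice [10/Oct/2000:13:55:36 -0700] "GET /doc HTTP/1.0" 200 2326',
    '10.0.0.2 - bob [10/Oct/2000:13:56:01 -0700] "POST /doc HTTP/1.0" 302 512',
    '10.0.0.1 - alice [10/Oct/2000:13:57:14 -0700] "GET /doc/2 HTTP/1.0" 200 104',
]
counts = requests_per_user(log)
assert counts["alice"] == 2 and counts["bob"] == 1
```

Counts like these are the kind of quantitative result the article proposes triangulating with interviews and observation.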
Washington Safety Management Solutions, LLC developed web-based software to improve the efficiency and consistency of hazard identification and analysis, control selection and classification, and to standardize analysis reporting at Savannah River Site. In the new nuclear age, information technology provides methods to improve the efficiency of the documented safety analysis development process which includes hazard analysis activities. This software provides a web interface that interacts with a relational database to support analysis, record data, and to ensure reporting consistency. A team of subject matter experts participated in a series of meetings to review the associated processes and procedures for requirements and standard practices. Through these meetings, a set of software requirements were developed and compiled into a requirements traceability matrix from which software could be developed. The software was tested to ensure compliance with the requirements. Training was provided to the hazard analysis leads. Hazard analysis teams using the software have verified its operability. The software has been classified as NQA-1, Level D, as it supports the analysis team but does not perform the analysis. The software can be transported to other sites with alternate risk schemes. The software is being used to support the development of 14 hazard analyses. User responses have been positive with a number of suggestions for improvement which are being incorporated as time permits. The software has enforced a uniform implementation of the site procedures. The software has significantly improved the efficiency and standardization of the hazard analysis process
Establishing PIXE analysis conditions for sweat analysis
This paper describes a technical analysis of biological samples of sweat by proton induced X-ray emission (hereafter referred to as PIXE). The choice of the irradiation conditions, the preparation of the targets, and the processing treatment of spectra are described. Two excitation systems were used, one under vacuum and the other in atmospheric pressure under helium. Advantages and disadvantages of the two systems are described and detection limits obtained in both cases are presented. A comparison with results obtained by X-ray analysis induced by radioactive sources is made. (orig.)
Cost benefit analysis cost effectiveness analysis
The comparison of various protection options in order to determine the best compromise between the cost of protection and the residual risk is the purpose of the ALARA procedure. The use of decision-aiding techniques is valuable as an aid to selection procedures. The purpose of this study is to introduce two rather simple and well-known decision-aiding techniques: cost-effectiveness analysis and cost-benefit analysis. These two techniques are relevant to the large share of ALARA decisions that require a quantitative technique. The study is based on a hypothetical case of 10 protection options. Four methods are applied to the data
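The cost-effectiveness side of such a comparison reduces to ranking options by cost per unit of detriment averted. The figures below are illustrative only, not the study's hypothetical case.

```python
def rank_options(options):
    """Rank protection options by cost-effectiveness ratio:
    cost per unit of dose averted (lower ratio = better compromise)."""
    ranked = sorted(options, key=lambda o: o["cost"] / o["dose_averted"])
    return [o["name"] for o in ranked]

options = [
    {"name": "A", "cost": 10_000, "dose_averted": 2.0},   # 5000 per unit
    {"name": "B", "cost": 12_000, "dose_averted": 4.0},   # 3000 per unit
    {"name": "C", "cost": 30_000, "dose_averted": 5.0},   # 6000 per unit
]
assert rank_options(options) == ["B", "A", "C"]
```

Cost-benefit analysis would go one step further and monetize the averted detriment so that cost and benefit can be compared directly.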
Introduction to reliability analysis
Reliability analysis is concerned with the analysis of devices and systems whose individual components are prone to failure. This textbook presents an introduction to reliability analysis of repairable and nonrepairable systems. It is based on courses given to both undergraduate and graduate students of engineering and statistics as well as in workshops for professional engineers and scientists. The book concentrates on the methodology of the subject and on understanding theoretical results rather than on its theoretical development. An intrinsic aspect of reliability analysis is that the failure of components is best modelled using techniques drawn from probability and statistics. The author covers all the basic concepts required from these subjects and covers the main modern reliability analysis techniques thoroughly. These include: the graphical analysis of life data, maximum likelihood estimation, and Bayesian estimation. Throughout, the emphasis is on the practicalities of the subject, with numerous examples drawn from industrial and engineering settings. (orig.) With 50 figs
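Maximum likelihood estimation, one of the techniques listed, is especially simple for the exponential life model: the rate estimate is the number of failures divided by the total time on test. A sketch with hypothetical failure times (not from the book):

```python
def exponential_mle(lifetimes):
    """MLE of the failure rate for an exponential life model:
    lambda_hat = n / total time on test."""
    return len(lifetimes) / sum(lifetimes)

times = [2.0, 3.0, 5.0, 10.0]          # observed times to failure
lam = exponential_mle(times)
assert abs(lam - 4 / 20.0) < 1e-12     # 0.2 failures per unit time
mttf = 1 / lam                          # mean time to failure
assert mttf == 5.0
```

Censored data and non-constant hazard models (e.g. Weibull) complicate the likelihood, which is where the book's fuller treatment comes in.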
This book provides an in-depth introduction and overview of current research in computational music analysis. Its seventeen chapters, written by leading researchers, collectively represent the diversity as well as the technical and philosophical sophistication of the work being done today in this ... well-established theories in music theory and analysis, such as Forte's pitch-class set theory, Schenkerian analysis, the methods of semiotic analysis developed by Ruwet and Nattiez, and Lerdahl and Jackendoff's Generative Theory of Tonal Music. The book is divided into six parts, covering ... music analysis, the book provides an invaluable resource for researchers, teachers and students in music theory and analysis, computer science, music information retrieval and related disciplines. It also provides a state-of-the-art reference for practitioners in the music technology industry.
Vatrapu, Ravi; Mukkamala, Raghava Rao; Hussain, Abid;
2016-01-01
Current analytical approaches in computational social science can be characterized by four dominant paradigms: text analysis (information extraction and classification), social network analysis (graph theory), social complexity analysis (complex systems science), and social simulations (cellular automata and agent-based modeling). However, when it comes to organizational and societal units of analysis, there exists no approach to conceptualize, model, analyze, explain, and predict social media interactions as individuals' associations with ideas, values, identities, and so on. To address this limitation, based on the sociology of associations and the mathematics of set theory, this paper presents a new approach to big data analytics called social set analysis. Social set analysis consists of a generative framework for the philosophies of computational social science, theory of social data…
The heat transport system of MONJU comprises three main heat transport loops, each consisting of a primary loop, a secondary loop and a water-steam system, in addition to the auxiliary cooling system. These systems influence one another during plant transients, so it is important to evaluate the flow and heat characteristics of the heat transport systems when calculating plant transients. We developed plant dynamic analysis codes for MONJU to calculate plant transients and to evaluate the plant characteristics under disturbances during on-power operation as well as the performance of the plant control systems. In this paper, one of the main plant dynamic simulation codes of MONJU, the calculation conditions for the analysis, the plant safety analysis, the plant stability analysis and the plant thermal transient analysis are described. (author)
Kosmidis, Kosmas; Kalampokis, Alkiviadis; Argyrakis, Panos
2006-01-01
We use the Detrended Fluctuation Analysis (DFA) and the Grassberger-Procaccia analysis (GP) methods in order to study language characteristics. Although we construct our signals using only word lengths or word frequencies, excluding in this way a huge amount of information from language, the application of Grassberger-Procaccia (GP) analysis indicates that linguistic signals may be considered as the manifestation of a complex system of high dimensionality, different from random signals or ...
Verhoosel, C.V.; Gutiérrez, M. A.; Hulshoff, S.J.
2006-01-01
The field of fluid-structure interaction is combined with the field of stochastics to perform a stochastic flutter analysis. Various methods to directly incorporate the effects of uncertainties in the flutter analysis are investigated. The panel problem with a supersonic fluid flowing over it is considered as a test case. The stochastic moments (mean, standard deviation, etc.) of the flutter point are computed by an uncertainty analysis. Sensitivity-based methods are used to determine the stoc...
Procházková, Gabriela
2014-01-01
The Bachelor thesis focuses on the analysis of the marketing mix as applied to the company Billa Ltd. The theoretical part focuses on the general description of the marketing mix and its individual elements. The practical part covers the history and present of the company Billa Ltd. It also includes a specific analysis of the individual elements of the marketing mix of the company Billa Ltd. and an analysis of the results of the electronic questioning, which focuses on br...
Skrypnyuk, Nataliya; Nielson, Flemming; Pilegaard, Henrik
We present the ongoing work on the pathway analysis of a stochastic calculus. Firstly we present a particular stochastic calculus that we have chosen for our modeling - the Interactive Markov Chains calculus, IMC for short. After that we specify a few restrictions that we have introduced into the syntax of IMC in order to make our analysis feasible. Finally we describe the analysis itself together with several theoretical results that we have proved for it.
Ranganath, Rajesh; Perotte, Adler; Elhadad, Noémie; Blei, David
2016-01-01
The electronic health record (EHR) provides an unprecedented opportunity to build actionable tools to support physicians at the point of care. In this paper, we investigate survival analysis in the context of EHR data. We introduce deep survival analysis, a hierarchical generative approach to survival analysis. It departs from previous approaches in two primary ways: (1) all observations, including covariates, are modeled jointly conditioned on a rich latent structure; and (2) the observation...
Our analysis program LULU has proven very useful in all stages of experiment analysis, from prerun detector debugging through final data reduction. It has solved our problem of having arbitrary word length events and is easy enough to use that many separate experimenters are now analyzing with LULU. The ability to use the same software for all stages of experiment analysis greatly eases the programming burden. We may even get around to making the graphics elegant someday
Crisan, Dan
2011-01-01
"Stochastic Analysis" aims to provide mathematical tools to describe and model high dimensional random systems. Such tools arise in the study of Stochastic Differential Equations and Stochastic Partial Differential Equations, Infinite Dimensional Stochastic Geometry, Random Media and Interacting Particle Systems, Super-processes, Stochastic Filtering, Mathematical Finance, etc. Stochastic Analysis has emerged as a core area of late 20th century Mathematics and is currently undergoing a rapid scientific development. The special volume "Stochastic Analysis 2010" provides a sa
The authors present a treatment of human reliability analysis incorporating an introduction to probabilistic risk assessment for nuclear power generating stations. They treat the subject according to the framework established for general systems theory. The book draws upon reliability analysis, psychology, human factors engineering, and statistics, integrating elements of these fields within a systems framework. It provides a history of human reliability analysis and includes examples of the application of the systems approach.
Mastering Clojure data analysis
Rochester, Eric
2014-01-01
This book takes a practical, example-oriented approach that aims to help you learn how to use Clojure for data analysis quickly and efficiently. This book is great for those who have experience with Clojure and who need to use it to perform data analysis. It will also be hugely beneficial for readers with basic experience in data analysis and statistics.
Möncke, Ulrich R.
1986-01-01
This paper specifies the theoretical basis for the implementation of different generators of the OPTRAN system. Grammar Flow analysis transports the techniques of data flow analysis to the meta level of compiler construction. The analogue of the states in data flow analysis are the syntax trees together with some information, which is associated with the trees by propagation functions. One example is the association of characteristic graphs; another example is the association of sets of matching tre...
Machine Fault Signature Analysis
Mulchandani, K. B.; A.K. Wadhwani; Pratesh Jayaswal
2008-01-01
The objective of this paper is to present recent developments in the field of machine fault signature analysis with particular regard to vibration analysis. The different types of faults that can be identified from the vibration signature analysis are, for example, gear fault, rolling contact bearing fault, journal bearing fault, flexible coupling faults, and electrical machine fault. It is not the intention of the authors to attempt to provide a detailed coverage of all the faults while deta...
Johnson, Andrew
2009-01-01
This essay provides a detailed strategic analysis of 3rdWhale, a Vancouver-based start-up in the sustainability sector, along with an analysis of the smartphone applications industry. Porter's five forces model is used to perform an industry analysis of the smartphone application industry and identify key success factors for application developers. Using the identified factors, 3rdWhale is compared to its indirect competitors to identify opportunities and threats and produce a range of strate...
Circuit analysis with Multisim
Baez-Lopez, David
2011-01-01
This book is concerned with circuit simulation using National Instruments Multisim. It focuses on the use and comprehension of the working techniques for electrical and electronic circuit simulation. The first chapters are devoted to basic circuit analysis. It starts by describing in detail how to perform a DC analysis using only resistors and independent and controlled sources. Then, it introduces capacitors and inductors to make a transient analysis. In the case of transient analysis, it is possible to have an initial condition either in the capacitor voltage or in the inductor current, or bo
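The DC analysis of resistor networks with independent sources described above can also be sketched numerically. The following is an illustrative example of my own (not from the book, and not Multisim output) solving a hypothetical three-resistor divider by nodal analysis with Cramer's rule.

```python
def solve2(a, b, c, d, e, f):
    """Solve the 2x2 linear system [[a, b], [c, d]] @ [x, y] = [e, f]."""
    det = a * d - b * c
    return (e * d - b * f) / det, (a * f - e * c) / det

# Hypothetical circuit: 10 V source -> R1 -> node A -> R2 -> node B -> R3 -> ground
Vs, R1, R2, R3 = 10.0, 1e3, 2e3, 2e3
g1, g2, g3 = 1 / R1, 1 / R2, 1 / R3

# KCL at nodes A and B in conductance form:
#   (g1 + g2)*VA - g2*VB = Vs*g1
#   -g2*VA + (g2 + g3)*VB = 0
VA, VB = solve2(g1 + g2, -g2, -g2, g2 + g3, Vs * g1, 0.0)
print(round(VA, 6), round(VB, 6))  # 8.0 4.0
```

SPICE-based simulators such as Multisim assemble and solve conductance-matrix systems of this kind automatically when computing a DC operating point.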
Textile Technology Analysis Lab
Federal Laboratory Consortium — The Textile Analysis Lab is built for evaluating and characterizing the physical properties of an array of textile materials, but specifically those used in aircrew...
Hazard Analysis Database Report
The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for HNF-SD-WM-SAR-067, Tank Farms Final Safety Analysis Report (FSAR). The FSAR is part of the approved Authorization Basis (AB) for the River Protection Project (RPP). This document describes, identifies, and defines the contents and structure of the Tank Farms FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The Hazard Analysis Database supports the preparation of Chapters 3, 4, and 5 of the Tank Farms FSAR and the Unreviewed Safety Question (USQ) process and consists of two major, interrelated data sets: (1) Hazard Analysis Database: Data from the results of the hazard evaluations, and (2) Hazard Topography Database: Data from the system familiarization and hazard identification
2010-01-01
Finite element analysis is an engineering method for the numerical analysis of complex structures. This book provides a bird's eye view on this very broad matter through 27 original and innovative research studies exhibiting various investigation directions. Through its chapters the reader will have access to works related to Biomedical Engineering, Materials Engineering, Process Analysis and Civil Engineering. The text is addressed not only to researchers, but also to professional engineers, engineering lecturers and students seeking to gain a better understanding of where Finite Element Analysis stands today.
Chemical Security Analysis Center
Federal Laboratory Consortium — In 2006, by Presidential Directive, DHS established the Chemical Security Analysis Center (CSAC) to identify and assess chemical threats and vulnerabilities in the...
HIRENASD analysis Information Package
National Aeronautics and Space Administration — Updated November 2, 2011 Contains summary information and analysis condition details for the Aeroelastic Prediction Workshop Information plotted in this package is...
Risk analysis methodology survey
Batson, Robert G.
1987-01-01
NASA regulations require that formal risk analysis be performed on a program at each of several milestones as it moves toward full-scale development. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from the simple to complex network-based simulations, were surveyed. A Program Risk Analysis Handbook was prepared in order to provide both analyst and manager with a guide for selection of the most appropriate technique.
Sensitivity and uncertainty analysis
Cacuci, Dan G; Navon, Ionel Michael
2005-01-01
As computer-assisted modeling and analysis of physical processes have continued to grow and diversify, sensitivity and uncertainty analyses have become indispensable scientific tools. Sensitivity and Uncertainty Analysis. Volume I: Theory focused on the mathematical underpinnings of two important methods for such analyses: the Adjoint Sensitivity Analysis Procedure and the Global Adjoint Sensitivity Analysis Procedure. This volume concentrates on the practical aspects of performing these analyses for large-scale systems. The applications addressed include two-phase flow problems, a radiative c
Sørensen, Olav Jull
2009-01-01
The review presents the book International Market Analysis: Theories and Methods, written by John Kuiada, professor at Centre of International Business, Department of Business Studies, Aalborg University. The book is refreshingly new in its way of looking at a classical problem. It looks at market analysis from the point of view of ways of thinking about markets. Furthermore, the book includes the concept of learning in the analysis of markets and how the way we understand business reality influences our choice of methodology for market analysis.
The Krsko Source Term Analysis (STA) has been provided as an integral part of the Krsko Individual Plant Examination Project (IPE) Level 2 (Containment Analysis). By definition, the STA quantifies the magnitude, time dependence and composition of the fission product releases which characterize each Release Category (RC). The Krsko STA also addresses the definition of each RC, the identification and choice of dominant accident sequences within a release category, the analysis of the representative accident sequences using MAAP 3.0B (Modular Accident Analysis Program) revision 18 to estimate the source term characteristics, and a discussion of identified major areas of uncertainty. (author)
Murty, PSR
2007-01-01
Power system analysis is a prerequisite course for electrical engineering students. This book introduces the concepts of a power system, the network model, faults and analysis, the primitive network, and stability. It also deals with graph theory relevant to the various incidence matrices, the building of network matrices and power flow studies. It further discusses short circuit analysis, unbalanced fault analysis and power system stability problems, such as steady state stability, transient stability and dynamic stability. Salient features: numerous worked examples follow the explanation of theory
Federal Laboratory Consortium — FUNCTION: Uses state-of-the-art instrumentation for qualitative and quantitative analysis of organic and inorganic compounds, and biomolecules from gas, liquid, and...
Thermogravimetric Analysis Laboratory
Federal Laboratory Consortium — At NETL’s Thermogravimetric Analysis Laboratory in Morgantown, WV, researchers study how chemical looping combustion (CLC) can be applied to fossil energy systems....
Hazard Analysis Database Report
GRAMS, W.H.
2000-12-28
The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for HNF-SD-WM-SAR-067, Tank Farms Final Safety Analysis Report (FSAR). The FSAR is part of the approved Authorization Basis (AB) for the River Protection Project (RPP). This document describes, identifies, and defines the contents and structure of the Tank Farms FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The Hazard Analysis Database supports the preparation of Chapters 3, 4, and 5 of the Tank Farms FSAR and the Unreviewed Safety Question (USQ) process and consists of two major, interrelated data sets: (1) Hazard Analysis Database: Data from the results of the hazard evaluations, and (2) Hazard Topography Database: Data from the system familiarization and hazard identification.
National Oceanic and Atmospheric Administration, Department of Commerce — Space Weather Analysis archives are model output of ionospheric, thermospheric and magnetospheric particle populations, energies and electrodynamics
Machine Fault Signature Analysis
K. B. Mulchandani
2008-03-01
The objective of this paper is to present recent developments in the field of machine fault signature analysis with particular regard to vibration analysis. The different types of faults that can be identified from the vibration signature analysis are, for example, gear fault, rolling contact bearing fault, journal bearing fault, flexible coupling faults, and electrical machine fault. It is not the intention of the authors to attempt to provide a detailed coverage of all the faults while detailed consideration is given to the subject of the rolling element bearing fault signature analysis.
Risk Analysis of Marine Structures
Hansen, Peter Friis
1998-01-01
Basic concepts of risk analysis are introduced. Formulation and analysis of fault and event trees are treated.
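As a minimal sketch of the fault tree analysis mentioned above (my own illustration, not material from the course): assuming independent basic events, an AND gate multiplies the event probabilities, and an OR gate combines their complements.

```python
def and_gate(*probs):
    """Probability that all independent basic events occur."""
    p = 1.0
    for pi in probs:
        p *= pi
    return p

def or_gate(*probs):
    """Probability that at least one independent basic event occurs."""
    q = 1.0
    for pi in probs:
        q *= 1.0 - pi
    return 1.0 - q

# Hypothetical top event: (pump A fails AND pump B fails) OR control valve fails
p_top = or_gate(and_gate(0.01, 0.01), 0.001)
print(round(p_top, 7))  # 0.0010999
```

For rare events the OR gate is often approximated by the sum of the input probabilities; here the exact value 0.0010999 is close to the rare-event approximation 0.0011.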
Advanced Analysis Environments - Summary
This is a summary of the panel discussion on Advanced Analysis Environments. Rene Brun, Tony Johnson, and Lassi Tuura shared their insights about the trends and challenges in analysis environments. This paper contains the initial questions, a summary of the speakers' presentation, and the questions asked by the audience
Interaction Analysis and Supervision.
Amidon, Edmund
This paper describes a model that uses interaction analysis as a tool to provide feedback to a teacher in a microteaching situation. The author explains how interaction analysis can be used for teacher improvement, describes the category system used in the model, the data collection methods used, and the feedback techniques found in the model. (JF)
Spool assembly support analysis
This document provides the wind/seismic analysis and evaluation for the pump pit spool assemblies. Hand calculations were used for the analysis. UBC and AISC load factors were used in this evaluation. The results show that the actual loads are under the allowable loads and all requirements are met
MEAD retrospective analysis report
Hasager, Charlotte Bay; Carstensen, J.; Frohn, L.M.;
2003-01-01
The retrospective analysis investigates links between atmospheric nitrogen deposition and algal bloom development in the Kattegat Sea from April to September 1989-1999. The analysis is based on atmospheric deposition model results from the ACDEP model, hydrodynamic deep-water flux results, phytopl...
Damkilde, Lars
2007-01-01
Limit State analysis has a long history and many prominent researchers have contributed. The theoretical foundation is based on the upper- and lower-bound theorems which give a very comprehensive and elegant formulation on complicated physical problems. In the pre-computer age Limit State analysis...
Shot loading platform analysis
This document provides the wind/seismic analysis and evaluation for the shot loading platform. Hand calculations were used for the analysis. AISC and UBC load factors were used in this evaluation. The results show that the actual loads are under the allowable loads and all requirements are met
Structural analysis for Diagnosis
Izadi-Zamanabadi, Roozbeh; Blanke, M.
2001-01-01
Aiming at design of algorithms for fault diagnosis, structural analysis of systems offers concise yet easy overall analysis. Graph-based matching, which is the essential technique to obtain redundant information for diagnosis, is re-considered in this paper. Matching is re-formulated as a problem...
Food risk analysis is a holistic approach to food safety because it considers all aspects of the problem. Risk assessment modeling is the foundation of food risk analysis. Proper design and simulation of the risk assessment model is important to properly predict and control risk. Because of knowl...
Bunke, H; Baird, H
1994-01-01
Interest in the automatic processing and analysis of document images has been rapidly increasing during the past few years. This book addresses the different subfields of document image analysis, including preprocessing and segmentation, form processing, handwriting recognition, line drawing and map processing, and contextual processing.
Analysis of Design Documentation
Hansen, Claus Thorp
1998-01-01
has been established where we seek to identify useful design work patterns by retrospective analyses of documentation created during design projects. This paper describes the analysis method, a tentatively defined metric to evaluate identified work patterns, and presents results from the first analysis accomplished.
Bemman, Brian; Meredith, David
Sheer Pluck (1984), a twelve-tone composition for guitar by Milton Babbitt (1916–2011). This analysis focuses on the all-partition array structure on which the piece is based. Having presented this analysis, we formalize some constraints on the structure of the piece and explore some computational...
Bass, Roger
2010-01-01
Zen's challenge for behavior analysis is to explain a repertoire that renders analysis itself meaningless--a result following not from scientific or philosophical arguments but rather from a unique verbal history generated by Zen's methods. Untying Zen's verbal knots suggests how meditation's and koans' effects on verbal behavior contribute to…
Donahue, Craig J.; Rais, Elizabeth A.
2009-01-01
This lab experiment illustrates the use of thermogravimetric analysis (TGA) to perform proximate analysis on a series of coal samples of different rank. Peat and coke are also examined. A total of four exercises are described. These are dry exercises as students interpret previously recorded scans. The weight percent moisture, volatile matter,…
Interactive Controls Analysis (INCA)
Bauer, Frank H.
1989-01-01
Version 3.12 of INCA provides user-friendly environment for design and analysis of linear control systems. System configuration and parameters easily adjusted, enabling INCA user to create compensation networks and perform sensitivity analysis in convenient manner. Full complement of graphical routines makes output easy to understand. Written in Pascal and FORTRAN.
Josefsen, Knud; Nielsen, Henrik
2011-01-01
Northern blotting analysis is a classical method for analysis of the size and steady-state level of a specific RNA in a complex sample. In short, the RNA is size-fractionated by gel electrophoresis and transferred by blotting onto a membrane to which the RNA is covalently bound. Then, the membrane… …closing the gap to the more laborious nuclease protection experiments.
Analysis of Industrial Wastewaters.
Mancy, K. H.; Weber, W. J., Jr.
A comprehensive, documented discussion of certain operating principles useful as guidelines for the analysis of industrial wastewaters is presented. Intended primarily for the chemist, engineer, or other professional person concerned with all aspects of industrial wastewater analysis, it is not to be considered as a substitute for standard manuals…
The complexity of instrumentation sometimes requires data analysis to be done before the result is presented to the control room. This tutorial reviews some of the theoretical assumptions underlying the more popular forms of data analysis and presents simple examples to illuminate the advantages and hazards of different techniques
Liberdová, I.
2015-01-01
This article is focused on dynamic drawing analysis. It deals with temporal segmentation methods for hand-drawn pictures. The automatic vectorization of segmentation results is considered as well. Dynamic drawing analysis may significantly improve the utilization of tracing drawing tests in clinical physiology trials.
Structural analysis for diagnosis
Izadi-Zamanabadi, Roozbeh; Blanke, M.
2002-01-01
Aiming at design of algorithms for fault diagnosis, structural analysis of systems offers concise yet easy overall analysis. Graph-based matching, which is the essential technique to obtain redundant information for diagnosis, is reconsidered in this paper. Matching is reformulated as a problem of...
Louis Kaplow; Steven Shavell
1999-01-01
This entry for the forthcoming The New Palgrave Dictionary of Economics (Second Edition) surveys the economic analysis of five primary fields of law: property law; liability for accidents; contract law; litigation; and public enforcement and criminal law. It also briefly considers some criticisms of the economic analysis of law.
The objective of SCK-CEN's programme on reactor safety is to develop expertise in probabilistic and deterministic reactor safety analysis. The research programme consists of two main activities, in particular the development of software for reliability analysis of large systems and participation in the international PHEBUS-FP programme for severe accidents. Main achievements in 1999 are reported
Chakravorti, Sivaji
2015-01-01
This book prepares newcomers to dive into the realm of electric field analysis. The book details why one should perform electric field analysis and what are its practical implications. It emphasizes both the fundamentals and modern computational methods of electric machines. The book covers practical applications of the numerical methods in high voltage equipment, including transmission lines, power transformers, cables, and gas insulated systems.
The purpose of this accident safety analysis is to document in detail analyses whose results were reported in summary form in the K Basins Safety Analysis Report WHC-SD-SNF-SAR-001. The safety analysis addressed the potential for release of radioactive and non-radioactive hazardous material located in the K Basins and their supporting facilities. The safety analysis covers the hazards associated with normal K Basin fuel storage and handling operations, fuel encapsulation, sludge encapsulation, and canister clean-up and disposal. After a review of the Criticality Safety Evaluation of the K Basin activities, the following postulated events were evaluated: Crane failure and casks dropped into loadout pit; Design basis earthquake; Hypothetical loss of basin water accident analysis; Combustion of uranium fuel following dryout; Crane failure and cask dropped onto floor of transfer area; Spent ion exchange shipment for burial; Hydrogen deflagration in ion exchange modules and filters; Release of Chlorine; Power availability and reliability; and Ashfall
Porten, D.R.; Crowe, R.D.
1994-12-16
The purpose of this accident safety analysis is to document in detail analyses whose results were reported in summary form in the K Basins Safety Analysis Report WHC-SD-SNF-SAR-001. The safety analysis addressed the potential for release of radioactive and non-radioactive hazardous material located in the K Basins and their supporting facilities. The safety analysis covers the hazards associated with normal K Basin fuel storage and handling operations, fuel encapsulation, sludge encapsulation, and canister clean-up and disposal. After a review of the Criticality Safety Evaluation of the K Basin activities, the following postulated events were evaluated: Crane failure and casks dropped into loadout pit; Design basis earthquake; Hypothetical loss of basin water accident analysis; Combustion of uranium fuel following dryout; Crane failure and cask dropped onto floor of transfer area; Spent ion exchange shipment for burial; Hydrogen deflagration in ion exchange modules and filters; Release of Chlorine; Power availability and reliability; and Ashfall.
Systems engineering and analysis
Blanchard, Benjamin S
2010-01-01
For senior-level undergraduate and first and second year graduate systems engineering and related courses. A total life-cycle approach to systems and their analysis. This practical introduction to systems engineering and analysis provides the concepts, methodologies, models, and tools needed to understand and implement a total life-cycle approach to systems and their analysis. The authors focus first on the process of bringing systems into being--beginning with the identification of a need and extending that need through requirements determination, functional analysis and allocation, design synthesis, evaluation, and validation, operation and support, phase-out, and disposal. Next, the authors discuss the improvement of systems currently in being, showing that by employing the iterative process of analysis, evaluation, feedback, and modification, most systems in existence can be improved in their affordability, effectiveness, and stakeholder satisfaction.
Ramsay, J O
1997-01-01
Scientists today collect samples of curves and other functional observations. This monograph presents many ideas and techniques for such data. Included are expressions in the functional domain of such classics as linear regression, principal components analysis, linear modelling, and canonical correlation analysis, as well as specifically functional techniques such as curve registration and principal differential analysis. Data arising in real applications are used throughout for both motivation and illustration, showing how functional approaches allow us to see new things, especially by exploiting the smoothness of the processes generating the data. The data sets exemplify the wide scope of functional data analysis; they are drawn from growth analysis, meteorology, biomechanics, equine science, economics, and medicine. The book presents novel statistical technology while keeping the mathematical level widely accessible. It is designed to appeal to students, to applied data analysts, and to experienced researc...
Strategic analysis of the company
Matoušková, Irena
2012-01-01
Strategic analysis of the company. In my thesis I developed a strategic analysis of the company Pacovské strojírny a.s. In the theoretical part I describe the various methods of internal and external strategic analysis; I then apply these methods in the practical part. In the internal strategic analysis, I focused on the identification of internal resources and capabilities, the financial analysis and the value chain. The external strategic analysis includes a PEST analysis, Porter's fiv...
Material analysis on engineering statistics
Lee, Seung Hun
2008-03-15
This book is about material analysis in engineering statistics using Minitab. It covers technical statistics and the seven tools of QC, probability distributions, estimation and hypothesis testing, regression analysis, time series analysis, control charts, process capability analysis, measurement system analysis, sampling inspection, design of experiments, response surface analysis, compound experiments, the Taguchi method, and nonparametric statistics. It is suitable for universities and companies because it presents theory first and then analysis using Minitab for Six Sigma BB and MBB.
Material analysis on engineering statistics
This book is about material analysis in engineering statistics using Minitab. It covers technical statistics and the seven tools of QC, probability distributions, estimation and hypothesis testing, regression analysis, time series analysis, control charts, process capability analysis, measurement system analysis, sampling inspection, design of experiments, response surface analysis, compound experiments, the Taguchi method, and nonparametric statistics. It is suitable for universities and companies because it presents theory first and then analysis using Minitab for Six Sigma BB and MBB.
Raket, Lars Lau
We propose a direction in the field of statistics which we will call functional object analysis. This subfield considers the analysis of functional objects defined on continuous domains. In this setting we will focus on model-based statistics, with a particular emphasis on mixed-effect formulations, where the observed functional signal is assumed to consist of both fixed and random functional effects. This thesis takes the initial steps toward the development of likelihood-based methodology for functional objects. We first consider analysis of functional data defined on high...
Elemental, metallographic, and phase analyses were carried out in order to determine the oxidation states of Fe contained in three metallic pieces of unknown material: a block, a plate, and a cylinder. Results are presented from the elemental analysis, which was carried out at the Tandem Accelerator of ININ by proton-induced X-ray emission (PIXE). The phase analysis was carried out by X-ray diffraction, which made it possible to identify the type of alloy or alloys formed. The combined application of nuclear and metallographic techniques allows the integral characterization of industrial metals. (Author)
NASA Enterprise Visual Analysis
Lopez-Tellado, Maria; DiSanto, Brenda; Humeniuk, Robert; Bard, Richard, Jr.; Little, Mia; Edwards, Robert; Ma, Tien-Chi; Hollifield, Kenneith; White, Chuck
2007-01-01
NASA Enterprise Visual Analysis (NEVA) is a computer program undergoing development as a successor to Launch Services Analysis Tool (LSAT), formerly known as Payload Carrier Analysis Tool (PCAT). NEVA facilitates analyses of proposed configurations of payloads and packing fixtures (e.g. pallets) in a space shuttle payload bay for transport to the International Space Station. NEVA reduces the need to use physical models, mockups, and full-scale ground support equipment in performing such analyses. Using NEVA, one can take account of such diverse considerations as those of weight distribution, geometry, collision avoidance, power requirements, thermal loads, and mechanical loads.
Kosmidis, Kosmas; Kalampokis, Alkiviadis; Argyrakis, Panos
2006-10-01
We use the detrended fluctuation analysis (DFA) and Grassberger-Procaccia (GP) methods to study language characteristics. Even though we construct our signals using only word lengths or word frequencies, thereby excluding a huge amount of the information in language, the GP analysis indicates that linguistic signals may be considered the manifestation of a complex system of high dimensionality, different from random signals or low-dimensional systems such as the Earth's climate. The DFA method is additionally able to distinguish a natural language signal from a computer code signal. This last result may be useful in the field of cryptography.
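As a rough illustration of the DFA method named in this record, here is a minimal Python sketch (the function name, window scheme, and parameters are my own, not the authors'): it integrates the mean-centered signal, removes a linear trend in each window, and reads the scaling exponent from the slope of log F(n) versus log n.

```python
import numpy as np

def dfa(signal, scales):
    """Detrended fluctuation analysis: fluctuation F(n) per window size n."""
    profile = np.cumsum(signal - np.mean(signal))  # integrated profile
    F = []
    for n in scales:
        n_win = len(profile) // n
        segments = profile[:n_win * n].reshape(n_win, n)
        x = np.arange(n)
        sq = []
        for seg in segments:
            coeffs = np.polyfit(x, seg, 1)           # local linear trend
            detrended = seg - np.polyval(coeffs, x)
            sq.append(np.mean(detrended ** 2))
        F.append(np.sqrt(np.mean(sq)))
    return np.array(F)

# Scaling exponent alpha = slope of log F(n) vs log n;
# alpha near 0.5 indicates uncorrelated (white) noise.
rng = np.random.default_rng(0)
white = rng.standard_normal(4096)
scales = np.array([16, 32, 64, 128, 256])
F = dfa(white, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
```

For a natural-language signal built from word lengths, one would expect alpha to deviate from the white-noise value, which is the property the paper exploits.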
Kosmidis, K; Argyrakis, P; Kosmidis, Kosmas; Kalampokis, Alkiviadis; Argyrakis, Panos
2006-01-01
We use the Detrended Fluctuation Analysis (DFA) and Grassberger-Procaccia (GP) methods to study language characteristics. Even though we construct our signals using only word lengths or word frequencies, thereby excluding a huge amount of the information in language, the GP analysis indicates that linguistic signals may be considered the manifestation of a complex system of high dimensionality, different from random signals or low-dimensional systems such as the Earth's climate. The DFA method is additionally able to distinguish a natural language signal from a computer code signal. This last result may be useful in the field of cryptography.
Nielsen, Christoffer Rosenkilde; Nielson, Hanne Riis
2006-01-01
operation blinding. In this paper we study the theoretical foundations for one of the successful approaches to validating cryptographic protocols and we extend it to handle the blinding primitive. Our static analysis approach is based on Flow Logic; this gives us a clean separation between the specification of the analysis and its realisation in an automatic tool. We concentrate on the former in the present paper and provide the semantic foundation for our analysis of protocols using blinding - also in the presence of malicious attackers.
Integrated Analysis Capability Program
Vos, R. G.; Beste, D. L.; Greg, J.; Frisch, H. P.
1991-01-01
Integrated Analysis Capability (IAC) software system intended to provide highly effective, interactive analysis tool for integrated design of large structures. Supports needs of engineering analysis groups concerned with interdisciplinary problems. Developed to serve as software interface between computer programs from fields of structures, thermodynamics, controls, and system dynamics on one hand, and executive software system and data base on other, to yield highly efficient multidisciplinary system. Special attention given to such user requirements as data handling, online assistance with operational features, and ability to add new modules of user's choice at future date. Written in FORTRAN 77.
Bayesian nonparametric data analysis
Müller, Peter; Jara, Alejandro; Hanson, Tim
2015-01-01
This book reviews nonparametric Bayesian methods and models that have proven useful in the context of data analysis. Rather than providing an encyclopedic review of probability models, the book’s structure follows a data analysis perspective. As such, the chapters are organized by traditional data analysis problems. In selecting specific nonparametric models, simpler and more traditional models are favored over specialized ones. The discussed methods are illustrated with a wealth of examples, including applications ranging from stylized examples to case studies from recent literature. The book also includes an extensive discussion of computational methods and details on their implementation. R code for many examples is included in on-line software pages.
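To give a flavor of the nonparametric Bayesian models the book covers, here is a minimal Python sketch (my own illustration, not from the book) of the truncated stick-breaking construction of Dirichlet process weights, one of the standard building blocks of such models.

```python
import numpy as np

def stick_breaking(alpha, n_atoms, rng):
    """Truncated stick-breaking weights for a Dirichlet process DP(alpha, G0):
    w_k = v_k * prod_{j<k} (1 - v_j), with v_k ~ Beta(1, alpha)."""
    v = rng.beta(1.0, alpha, size=n_atoms)                      # stick proportions
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - v)[:-1]])  # unbroken stick
    return v * remaining

rng = np.random.default_rng(42)
w = stick_breaking(alpha=1.0, n_atoms=1000, rng=rng)
```

With 1000 atoms the truncation error (the unassigned stick mass) is negligible for moderate alpha, so the weights sum to essentially 1.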
Finite Discrete Gabor Analysis
Søndergaard, Peter Lempel
2007-01-01
Gabor analysis is a method for analyzing signals through the use of a set of basic building blocks. The building blocks consist of a certain function (the window) that is shifted in time and frequency. The Gabor expansion of a signal contains information on the behavior of the signal in certain... basic discrete time/frequency and Gabor analysis. It is intended to be both an educational and a computational tool. The toolbox was developed as part of this Ph.D. project to provide a solid foundation for the field of computational Gabor analysis...
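The time-frequency shifting of a window function described in this record can be sketched as a discrete Gabor/STFT in a few lines of Python (my own illustration; the window length, hop size, and Gaussian width are arbitrary choices, not the toolbox's):

```python
import numpy as np

def gabor_transform(signal, win_len=64, hop=16):
    """Discrete Gabor/STFT sketch: Gaussian window shifted in time (hop),
    frequency shifts realized by the FFT of each windowed frame."""
    t = np.arange(win_len) - win_len / 2
    window = np.exp(-0.5 * (t / (win_len / 6)) ** 2)  # Gaussian window
    frames = []
    for start in range(0, len(signal) - win_len + 1, hop):
        seg = signal[start:start + win_len] * window
        frames.append(np.fft.rfft(seg))
    return np.array(frames)  # shape: (n_frames, win_len // 2 + 1)

# Demo: a pure tone at 8 cycles per 64-sample window
sig = np.sin(2 * np.pi * 8 * np.arange(512) / 64)
C = gabor_transform(sig)
```

Each row of `C` describes the signal's frequency content around one time position; for the pure tone the energy concentrates in frequency bin 8.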
Factor analysis and scintigraphy
The goal of factor analysis is usually to achieve reduction of a large set of data, extracting essential features without a prior hypothesis. Owing to the development of computerized systems, the use of larger samplings, the possibility of sequential data acquisition, and the increase in dynamic studies, the problem of data compression is now encountered routinely. Thus, results obtained for compression of scintigraphic images are first presented. Then the possibilities offered by factor analysis for scan processing are discussed. Finally, the use of this analysis for multidimensional studies, and especially dynamic studies, is considered for compression and processing.
In applying ALARA somebody has to determine what is reasonable. A decision has to be made. Decision Analysis is a procedure for structuring decision problems, and therefore it might be expected to be of use in the application of the ALARA principle. This paper explores the contribution that Decision Analysis may make in this context, particularly discussing the application of multi-attribute value theory. An example of using the theory on a problem of radioactive waste disposal is given, and the potential of an increased use of Decision Analysis in making the ALARA principle work, is discussed
The 'monazit' analytical program has been set up for routine analysis of rare earth elements in monazite and xenotime mineral samples. The total relative error of the analysis is very low, less than 2.50%, and the reproducibility of the counting statistics and the stability of the instrument were excellent. The precision and accuracy of the analytical program are very good, with maximum relative percentages of 5.22% and 1.61%, respectively. The mineral compositions of the 30 monazite samples have also been calculated from their chemical constituents, and the results were compared to grain-counting microscopic analysis.
Numerical analysis of inflation
In this paper the mechanism of a cosmological phase transition is addressed in a new way which avoids weaknesses of previous approaches. The effects of inhomogeneities are included explicitly. A numerical analysis is presented in which for a wide class of models the Universe enters a period of new inflation. The analysis is classical and applies to models in which the scalar field responsible for driving inflation is weakly coupled to other fields. We derive heuristic arguments which determine the boundaries of the region in parameter space for which inflation is realized. The agreement with the numerical results is good. This paper complements a previous analytical analysis
Methods of Multivariate Analysis
Rencher, Alvin C
2012-01-01
Praise for the Second Edition "This book is a systematic, well-written, well-organized text on multivariate analysis packed with intuition and insight . . . There is much practical wisdom in this book that is hard to find elsewhere."-IIE Transactions Filled with new and timely content, Methods of Multivariate Analysis, Third Edition provides examples and exercises based on more than sixty real data sets from a wide variety of scientific fields. It takes a "methods" approach to the subject, placing an emphasis on how students and practitioners can employ multivariate analysis in real-life sit
Aven, Terje
2012-01-01
Foundations of Risk Analysis presents the issues core to risk analysis - understanding what risk means, expressing risk, building risk models, addressing uncertainty, and applying probability models to real problems. The author provides the readers with the knowledge and basic thinking they require to successfully manage risk and uncertainty to support decision making. This updated edition reflects recent developments on risk and uncertainty concepts, representations and treatment. New material in Foundations of Risk Analysis includes: An up-to-date presentation of how to understand, define and
Dunham, Ken
2014-01-01
The rapid growth and development of Android-based devices has resulted in a wealth of sensitive information on mobile devices that offer minimal malware protection. This has created an immediate demand for security professionals who understand how to best approach the subject of Android malware threats and analysis. In Android Malware and Analysis, Ken Dunham, renowned global malware expert and author, teams up with international experts to document the best tools and tactics available for analyzing Android malware. The book covers both methods of malware analysis: dynamic and static. This tact
The distributed analysis experience to date at LHCb has been positive: job success rates are high and wait times for high-priority jobs are low. LHCb users access the grid using the GANGA job-management package, while the LHCb virtual organization manages its resources using the DIRAC package. This clear division of labor has benefitted LHCb and its users greatly; it is a major reason why distributed analysis at LHCb has been so successful. The newly formed LHCb distributed analysis support team has also proved to be a success.
Iremonger, M J
2013-01-01
BASIC Stress Analysis aims to help students to become proficient at BASIC programming by actually using it in an important engineering subject. It also enables the student to use computing as a means of learning stress analysis because writing a program is analogous to teaching-it is necessary to understand the subject matter. The book begins by introducing the BASIC approach and the concept of stress analysis at first- and second-year undergraduate level. Subsequent chapters contain a summary of relevant theory, worked examples containing computer programs, and a set of problems. Topics c
This paper briefly describes WASH 1400 and the Lewis report. It attempts to define basic concepts such as risk and risk analysis, common mode failure, and rare event. Several probabilistic models which go beyond the WASH 1400 methodology are introduced; the common characteristic of these models is that they recognize explicitly that risk analysis is time dependent whereas WASH 1400 takes a per demand failure rate approach which obscures the important fact that accidents are time related. Further, the presentation of a realistic risk analysis should recognize that there are various risks which compete with one another for the lives of the individuals at risk. A way of doing this is suggested
Ash, Robert B; Lukacs, E
1972-01-01
Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory. Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var
Schuller, Björn W
2013-01-01
This book provides the reader with the knowledge necessary for comprehension of the field of Intelligent Audio Analysis. It first introduces standard methods and discusses the typical Intelligent Audio Analysis chain going from audio data to audio features to audio recognition. Further, introductions to audio source separation, enhancement, and robustness are given. After the introductory parts, the book shows several applications for the three types of audio: speech, music, and general sound. Each task is shortly introduced, followed by a description of the specific data and methods applied, experiments and results, and a conclusion for this specific task. The book provides benchmark results and standardized test-beds for a broader range of audio analysis tasks. The main focus thereby lies on the parallel advancement of realism in audio analysis, as too often today's results are overly optimistic owing to idealized testing conditions, and it serves to stimulate synergies arising from transfer of ...
Pesticide Instrumental Analysis
This workshop evaluated the impact of pesticides on vegetable matrices, with the purpose of determining them by GC/MS analysis. The working materials were a lettuce matrix, chard, a mix of green leaves, and the pesticides.
Canonical Information Analysis
Vestergaard, Jacob Schack; Nielsen, Allan Aasbjerg
2015-01-01
Canonical correlation analysis is an established multivariate statistical method in which correlation between linear combinations of multivariate sets of variables is maximized. In canonical information analysis, introduced here, linear correlation as a measure of association between variables is replaced by the information-theoretical, entropy-based measure mutual information, which is a much more general measure of association. We make canonical information analysis feasible for large sample problems, including for example multispectral images, due to the use of a fast kernel density estimator for entropy estimation. Canonical information analysis is applied successfully to (1) simple simulated data to illustrate the basic idea and evaluate performance, (2) fusion of weather radar and optical geostationary satellite data in a situation with heavy precipitation, and (3) change detection in...
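For reference, the classical canonical correlation analysis that this record generalizes can be sketched in a few lines of numpy (my own illustration of the standard whitening-plus-SVD formulation; the entropy-based variant in the paper replaces the correlation objective with mutual information):

```python
import numpy as np

def cca(X, Y, n_components=1, eps=1e-8):
    """Classical CCA: whiten each block via Cholesky, then SVD of the
    cross-covariance; singular values are the canonical correlations."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    n = X.shape[0]
    Sxx = X.T @ X / n + eps * np.eye(X.shape[1])
    Syy = Y.T @ Y / n + eps * np.eye(Y.shape[1])
    Sxy = X.T @ Y / n
    Lx = np.linalg.cholesky(Sxx)
    Ly = np.linalg.cholesky(Syy)
    K = np.linalg.solve(Lx, Sxy) @ np.linalg.inv(Ly).T  # whitened cross-covariance
    U, s, Vt = np.linalg.svd(K)
    A = np.linalg.solve(Lx.T, U[:, :n_components])   # weight vectors for X
    B = np.linalg.solve(Ly.T, Vt[:n_components].T)   # weight vectors for Y
    return A, B, s[:n_components]

# Demo: two blocks sharing one latent signal z
rng = np.random.default_rng(1)
z = rng.standard_normal(2000)
X = np.column_stack([z + 0.1 * rng.standard_normal(2000), rng.standard_normal(2000)])
Y = np.column_stack([z + 0.1 * rng.standard_normal(2000), rng.standard_normal(2000)])
A, B, s = cca(X, Y)
```

The first canonical correlation `s[0]` recovers the strong shared component; swapping the linear-correlation objective for kernel-estimated mutual information is precisely the step the record describes.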
Henson, C Ward; Kechris, Alexander S; Odell, Edward; Finet, Catherine; Michaux, Christian; Cassels, J W S
2003-01-01
This volume comprises articles from four outstanding researchers who work at the cusp of analysis and logic. The emphasis is on active research topics; many results are presented that have not been published before and open problems are formulated.
Unsupervised Linear Discriminant Analysis
(no author listed)
2006-01-01
An algorithm for unsupervised linear discriminant analysis is presented. Optimal unsupervised discriminant vectors are obtained by maximizing the covariance of all samples while minimizing the covariance of local k-nearest-neighbor samples. The experimental results show that our algorithm is effective.
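The criterion in this record (total covariance maximized, local k-nearest-neighbor covariance minimized) naturally leads to a generalized eigenvalue problem. A hedged Python sketch of that idea follows; the function name, the ridge term `eps`, and the scatter definitions are my own reconstruction, not the authors' exact algorithm.

```python
import numpy as np

def unsupervised_lda(X, k=5, n_components=2, eps=1e-6):
    """Directions w maximizing (w' St w) / (w' Sl w): total scatter St
    over local k-nearest-neighbor scatter Sl, via eig of Sl^-1 St."""
    X = X - X.mean(axis=0)
    n, d = X.shape
    St = X.T @ X / n                                   # total scatter
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    Sl = np.zeros((d, d))
    for i in range(n):
        nbrs = np.argsort(D[i])[1:k + 1]               # k nearest neighbors, skip self
        diffs = X[nbrs] - X[i]
        Sl += diffs.T @ diffs
    Sl /= n * k
    M = np.linalg.solve(Sl + eps * np.eye(d), St)      # Sl^-1 St (ridge-stabilized)
    vals, vecs = np.linalg.eig(M)
    order = np.argsort(vals.real)[::-1]
    return vecs.real[:, order[:n_components]]

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
W = unsupervised_lda(X)
```

Directions with large total variance but small neighbor-to-neighbor variance score highest, which is the unsupervised analogue of between-class over within-class scatter in classical LDA.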
Fischer, Paul; Hilbert, Astrid
2012-01-01
We introduce a platform which supplies an easy-to-handle, interactive, extendable, and fast analysis tool for time series analysis. In contrast to other software suites like Maple, Matlab, or R, which use a command-line-like interface and where the user has to memorize/look up the appropriate commands, our application is select-and-click-driven. It allows to derive many different sequences of deviations for a given time series and to visualize them in different ways in order to judge their expressive power and to reuse the procedure found. For many transformations or model fits, the user may choose between manual and automated parameter selection. The user can define new transformations and add them to the system. The application contains efficient implementations of advanced and recent techniques for time series analysis including techniques related to extreme value analysis and filtering...
Principles of Fourier analysis
Howell, Kenneth B
2001-01-01
Fourier analysis is one of the most useful and widely employed sets of tools for the engineer, the scientist, and the applied mathematician. As such, students and practitioners in these disciplines need a practical and mathematically solid introduction to its principles. They need straightforward verifications of its results and formulas, and they need clear indications of the limitations of those results and formulas. Principles of Fourier Analysis furnishes all this and more. It provides a comprehensive overview of the mathematical theory of Fourier analysis, including the development of Fourier series, "classical" Fourier transforms, generalized Fourier transforms and analysis, and the discrete theory. Much of the author's development is strikingly different from typical presentations. His approach to defining the classical Fourier transform results in a much cleaner, more coherent theory that leads naturally to a starting point for the generalized theory. He also introduces a new generalized theory based ...
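The "discrete theory" the record mentions reduces, computationally, to the discrete Fourier transform. As a small self-contained check (my own illustration, not from the book), a naive O(N^2) DFT can be written directly from the definition and verified against numpy's FFT:

```python
import numpy as np

def dft(x):
    """Naive discrete Fourier transform, O(N^2),
    matching the conventions of np.fft.fft."""
    N = len(x)
    n = np.arange(N)
    W = np.exp(-2j * np.pi * np.outer(n, n) / N)  # DFT matrix W[k, n] = e^{-2*pi*i*k*n/N}
    return W @ x

rng = np.random.default_rng(7)
x = rng.standard_normal(16)
X_naive = dft(x)
```

The FFT computes exactly this transform, just in O(N log N) time, which is why the two agree to machine precision.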
Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars
2016-04-12
A building energy analysis system includes a building component library configured to store a plurality of building components, a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and using selected building components of the plurality of building components stored in the building component library, a building analysis engine configured to operate the building model and generate a baseline energy model of the building under analysis and further configured to apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models, and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.
Paciesas, William S.
1993-01-01
Miscellaneous tasks related to the operation of, and analysis of data from, the Burst and Transient Experiment (BATSE) on the Compton Gamma Ray Observatory (CGRO) were performed. The results are summarized and relevant references are included.
Main: Nucleotide Analysis [KOME
Full Text Available Nucleotide Analysis GenBank blastx search result Result of blastx search against GenBank amino acid sequence kome_genbank_blastx_search_result.zip kome_genbank_blastx_search_result
Main: Nucleotide Analysis [KOME
Full Text Available Nucleotide Analysis GenBank blastn search result Result of blastn search against GenBank nucleotide sequence kome_genbank_blastn_search_result.zip kome_genbank_blastn_search_result
Quantitative investment analysis
DeFusco, Richard
2007-01-01
In the "Second Edition" of "Quantitative Investment Analysis," financial experts Richard DeFusco, Dennis McLeavey, Jerald Pinto, and David Runkle outline the tools and techniques needed to understand and apply quantitative methods to today's investment process.
Arent, D.; Benioff, R.; Mosey, G.; Bird, L.; Brown, J.; Brown, E.; Vimmerstedt, L.; Aabakken, J.; Parks, K.; Lapsa, M.; Davis, S.; Olszewski, M.; Cox, D.; McElhaney, K.; Hadley, S.; Hostick, D.; Nicholls, A.; McDonald, S.; Holloman, B.
2006-10-01
This paper presents the results of energy market analysis sponsored by the Department of Energy's (DOE) Weatherization and International Program (WIP) within the Office of Energy Efficiency and Renewable Energy (EERE). The analysis was conducted by a team of DOE laboratory experts from the National Renewable Energy Laboratory (NREL), Oak Ridge National Laboratory (ORNL), and Pacific Northwest National Laboratory (PNNL), with additional input from Lawrence Berkeley National Laboratory (LBNL). The analysis was structured to identify those markets and niches where government can create the biggest impact by informing management decisions in the private and public sectors. The analysis identifies those markets and niches where opportunities exist for increasing energy efficiency and renewable energy use.
Gasiński, Leszek
2014-01-01
Exercises in Analysis will be published in two volumes. This first volume covers problems in five core topics of mathematical analysis: metric spaces; topological spaces; measure, integration, and martingales; measure and topology; and functional analysis. Each of the five topics corresponds to a different chapter, with inclusion of the basic theory and accompanying main definitions and results, followed by suitable comments and remarks for better understanding of the material. At least 170 exercises/problems are presented for each topic, with solutions available at the end of each chapter. The entire collection of exercises offers a balanced and useful picture of the applications surrounding each topic. This nearly encyclopedic coverage of exercises in mathematical analysis is the first of its kind and is accessible to a wide readership. Graduate students will find the collection of problems valuable in preparation for their preliminary or qualifying exams as well as for testing their deeper understanding of th...
Denker, A; Rauschenberg, J; Röhrich, J; Strub, E
2006-01-01
Materials analysis with ion beams exploits the interaction of ions with the electrons and nuclei in the sample. Among the vast variety of analytical techniques available with ion beams, we restrict ourselves to ion beam analysis with beams in the energy range from one to several MeV per mass unit. It is possible to use either the back-scattered projectiles (RBS - Rutherford Backscattering) or the recoiled target atoms themselves (ERDA - Elastic Recoil Detection Analysis) from the elastic scattering processes. These techniques allow the simultaneous and absolute determination of stoichiometry and depth profiles of the detected elements. The interaction of the ions with the electrons in the sample produces holes in the inner electronic shells of the sample atoms, which recombine and emit X-rays characteristic of the element in question. Particle Induced X-ray Emission (PIXE) has been shown to be a fast technique for the analysis of elements with an atomic number above 11.
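The element identification in RBS described here rests on the kinematic factor of elastic scattering, which fixes how much energy the backscattered projectile keeps as a function of target mass. A short Python sketch of the standard formula (the function name and example numbers are my own illustration):

```python
import numpy as np

def kinematic_factor(m_projectile, m_target, theta_deg):
    """RBS kinematic factor K = E_scattered / E_incident for elastic
    scattering through lab angle theta (valid for m_target >= m_projectile)."""
    theta = np.radians(theta_deg)
    m1, m2 = m_projectile, m_target
    root = np.sqrt(m2 ** 2 - (m1 * np.sin(theta)) ** 2)
    return ((root + m1 * np.cos(theta)) / (m1 + m2)) ** 2

# 4He backscattered at 170 degrees: a heavy Au target returns ~92% of the
# beam energy, a light C target far less -- this mass contrast is what
# lets RBS identify the elements in a sample.
K_au = kinematic_factor(4.0, 197.0, 170.0)
K_c = kinematic_factor(4.0, 12.0, 170.0)
```

Measuring the energy of the backscattered ions and inverting this relation yields the mass (hence identity) of the scattering atoms.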
Crabb, J W; West, K A; Dodson, W S; Hulmes, J D
2001-05-01
Amino acid analysis (AAA) is one of the best methods to quantify peptides and proteins. Two general approaches to quantitative AAA exist, namely, classical postcolumn derivatization following ion-exchange chromatography and precolumn derivatization followed by reversed-phase HPLC (RP-HPLC). Excellent instrumentation and several specific methodologies are available for both approaches, and both have advantages and disadvantages. This unit focuses on picomole-level AAA of peptides and proteins using the most popular precolumn-derivatization method, namely, phenylthiocarbamyl amino acid analysis (PTC-AAA). It is directed primarily toward those interested in establishing the technology with a modest budget. PTC derivatization and analysis conditions are described, and support and alternate protocols describe additional techniques necessary or useful for most any AAA method--e.g., sample preparation, hydrolysis, instrument calibration, data interpretation, and analysis of difficult or unusual residues such as cysteine, tryptophan, phosphoamino acids, and hydroxyproline. PMID:18429107
Main: Nucleotide Analysis [KOME
Full Text Available Nucleotide Analysis PLACE search result Result of signal search against PLACE: cis-acting regulatory DNA elements database kome_place_search_result.zip kome_place_search_res ...
Longitudinal categorical data analysis
Sutradhar, Brajendra C
2014-01-01
This is the first book in longitudinal categorical data analysis with parametric correlation models developed based on dynamic relationships among repeated categorical responses. This book is a natural generalization of the longitudinal binary data analysis to the multinomial data setup with more than two categories. Thus, unlike the existing books on cross-sectional categorical data analysis using log linear models, this book uses multinomial probability models both in cross-sectional and longitudinal setups. A theoretical foundation is provided for the analysis of univariate multinomial responses, by developing models systematically for the cases with no covariates as well as categorical covariates, both in cross-sectional and longitudinal setups. In the longitudinal setup, both stationary and non-stationary covariates are considered. These models have also been extended to the bivariate multinomial setup along with suitable covariates. For the inferences, the book uses the generalized quasi-likelihood as w...
13. seminar 'Activation analysis'
Collection of the abstracts of contributions to the seminar covering broad ranges of application of activation analysis and improvements of systems and process steps. Most of them have been prepared separately for the energy data bases. (RB)
Multidimensional nonlinear descriptive analysis
Nishisato, Shizuhiko
2006-01-01
Quantification of categorical, or non-numerical, data is a problem that scientists face across a wide range of disciplines. Exploring data analysis in various areas of research, such as the social sciences and biology, Multidimensional Nonlinear Descriptive Analysis presents methods for analyzing categorical data that are not necessarily sampled randomly from a normal population and often involve nonlinear relations. This reference not only provides an overview of multidimensional nonlinear descriptive analysis (MUNDA) of discrete data, it also offers new results in a variety of fields. The first part of the book covers conceptual and technical preliminaries needed to understand the data analysis in subsequent chapters. The next two parts contain applications of MUNDA to diverse data types, with each chapter devoted to one type of categorical data, a brief historical comment, and basic skills peculiar to the data types. The final part examines several problems and then concludes with suggestions for futu...
Water Quality Analysis Simulation
U.S. Environmental Protection Agency — The Water Quality analysis simulation Program, an enhancement of the original WASP. This model helps users interpret and predict water quality responses to natural...
... may experience difficulties. Several factors can affect the sperm count or other semen analysis values, including use of alcohol, tobacco, ...
Federal Laboratory Consortium — Provides engineering design of aircraft components, subsystems and installations using Pro/E, Anvil 1000, CADKEY 97, AutoCAD 13. Engineering analysis tools include...
Piping stress analyses of the Primary Sampling System, Reactor Cooling System, and Feedwater System for the AP600 have been performed. Piping stress analysis is one of the requirements in the design of a piping system; piping stress occurs due to static and dynamic loads during service. The analysis was carried out using the PS+CAEPIPE software, based on individual and combined loads, with the assumption that failure could happen during normal, upset, emergency, and faulted conditions as described in ASME III/ANSI B31.1. By performing the piping stress analysis, the layout (proper pipe routing) of the piping system can be designed with the requirements of piping stress and pipe supports in mind, i.e., sufficient flexibility for thermal expansion, etc., commensurate with the intended service conditions such as temperature, pressure, seismic, and other anticipated loading.
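The "sufficient flexibility for thermal expansion" requirement mentioned in this record can be motivated with the textbook worst case of a fully constrained straight pipe, where the thermal stress is sigma = E * alpha * dT. A minimal sketch (my own illustration with typical carbon-steel values, not numbers from the record):

```python
def thermal_stress_mpa(e_gpa, alpha_per_k, delta_t_k):
    """Thermal stress in a fully constrained straight member:
    sigma = E * alpha * dT, returned in MPa (E given in GPa)."""
    return e_gpa * 1e3 * alpha_per_k * delta_t_k

# Carbon steel (E ~ 200 GPa, alpha ~ 12e-6 /K) heated 100 K
# above its installation temperature with no room to expand:
sigma = thermal_stress_mpa(e_gpa=200.0, alpha_per_k=12e-6, delta_t_k=100.0)
```

The resulting 240 MPa is of the order of the yield strength of common piping steels, which is why layouts must provide expansion loops or flexible routing rather than rigid anchoring.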
Main: Nucleotide Analysis [KOME
Full Text Available Nucleotide Analysis Japonica genome blast search result Result of blastn search against japonica genome sequence kome_japonica_genome_blast_search_result.zip kome_japonica_genome_blast_search_result
Main: Nucleotide Analysis [KOME
Full Text Available Nucleotide Analysis Indica genome blast search result Result of blastn search against Indica genome sequence (top hit only) kome_indica_genome_blast_search_result.zip kome_indica_genome_blast_search_result
US Fish and Wildlife Service, Department of the Interior — This document is an analysis and summary of progress toward achieving the interim management objectives for the Russian River during the 1979 season. Additionally,...
NOAA's Inundation Analysis Tool
National Oceanic and Atmospheric Administration, Department of Commerce — Coastal storms and other meteorological phenomenon can have a significant impact on how high water levels rise and how often. The inundation analysis program is...
Barth, Howard G.; Sun, Shao-Tang
1989-01-01
Presents a review of research focusing on scattering, elution techniques, electrozone sensing, filtration, centrifugation, comparison of techniques, data analysis, and particle size standards. The review covers the period 1986-1988. (MVL)
Systems analysis-independent analysis and verification
Badin, J.S.; DiPietro, J.P. [Energetics, Inc., Columbia, MD (United States)
1995-09-01
The DOE Hydrogen Program is supporting research, development, and demonstration activities to overcome the barriers to the integration of hydrogen into the Nation's energy infrastructure. Much work is required to gain acceptance of hydrogen energy system concepts and to develop them for implementation. A systems analysis database has been created that includes a formal documentation of technology characterization profiles and cost and performance information. Through a systematic and quantitative approach, system developers can understand and address important issues and thereby assure effective and timely commercial implementation. This project builds upon and expands the previously developed and tested pathway model and provides the basis for a consistent and objective analysis of all hydrogen energy concepts considered by the DOE Hydrogen Program Manager. This project can greatly accelerate the development of a system by minimizing the risk of costly design evolutions and by stimulating discussions, feedback, and coordination among key players; it allows them to assess the analysis, evaluate the trade-offs, and address any emerging problem areas. Specific analytical studies will result in the validation of the competitive feasibility of the proposed systems and identify system development needs. Systems that are investigated include hydrogen bromine electrolysis, municipal solid waste gasification, electro-farming (biomass gasifier and PEM fuel cell), a wind/hydrogen hybrid system for remote sites, home electrolysis and alternate infrastructure options, renewable-based electrolysis to fuel a PEM fuel cell vehicle fleet, and geothermal energy used to produce hydrogen. These systems are compared to conventional and benchmark technologies. Interim results and findings are presented. Independent analyses emphasize quality, integrity, objectivity, a long-term perspective, corporate memory, and the merging of technical, economic, operational, and programmatic expertise.
Software reliability analysis in probabilistic risk analysis
Probabilistic Risk Analysis (PRA) is a tool which can reveal shortcomings of the NPP design in general. PRA analysts have not had sufficient guiding principles for modelling malfunctions of particular digital components. Digital I and C systems are mostly analysed simply, and the software reliability estimates are engineering judgments often lacking proper justification. The OECD/NEA Working Group RISK's task DIGREL develops a taxonomy of failure modes of digital I and C systems. The EU FP7 project HARMONICS develops a software reliability estimation method based on an analytic approach and a Bayesian belief network. (author)
Gait Analysis by Multi Video Sequence Analysis
Jensen, Karsten; Juhl, Jens
2009-01-01
The project presented in this article aims to develop software so that close-range photogrammetry with sufficient accuracy can be used to point out the most frequent foot malpositions and monitor the effect of the traditional treatment. The project is carried out as a cooperation between the Ort...... Sequence Analysis (MVSA). Results show that the developed MVSA system, in the following called Fodex, can measure the navicula height with a precision of 0.5-0.8 mm. The calcaneus angle can be measured with a precision of 0.8-1.5 degrees....
Hartmanová, Dominika
2013-01-01
The bachelor thesis Analysis of the Marketing Mix describes the marketing mix of the company Lego Tradings, s. r. o. The theoretical part includes specification of basic concepts, such as marketing, the marketing mix, tools of the marketing mix, product, price, place and promotion. The second part is devoted to the practical application. The introduction of the Lego company comes first, followed by an analysis of the tools of the marketing mix. In this part the results of marketing research are described, namely a quest...
Oktavianto, Digit
2013-01-01
This book is a step-by-step, practical tutorial for analyzing and detecting malware and performing digital investigations. This book features clear and concise guidance in an easily accessible format. Cuckoo Malware Analysis is great for anyone who wants to analyze malware through programming, networking, disassembling, forensics, and virtualization. Whether you are new to malware analysis or have some experience, this book will help you get started with Cuckoo Sandbox so you can start analyzing malware effectively and efficiently.
Economic Analysis of Constitutions
Roger B. Myerson
2000-01-01
This paper is a preliminary draft of an article to appear in Chicago Law Review (2000), as part of a symposium reviewing two new books on economic analysis of constitutions: Dennis Mueller's Constitutional Democracy and Robert Cooter's Strategic Constitution. Some of the basic questions of constitutional analysis are introduced, and the importance of work in this area is shown as one of the major new developments in social theory. The methods of economic theory are then shown to be particular...
MEAD retrospective analysis report
Hasager, Charlotte Bay; CARSTENSEN J.; Frohn, L. M.; Gustafson, B.; Brandt, J.; Conley, D.; Geernaert, G.; Henriksen, P.; C. A. Skjøth; Johnsson, M.
2003-01-01
The retrospective analysis investigates links between atmospheric nitrogen deposition and algal bloom development in the Kattegat Sea from April to September 1989-1999. The analysis is based on atmospheric deposition model results from the ACDEP model, hydrodynamic deep-water flux results, phytoplankton abundance observations from Danish and Swedish marine monitoring stations and optical satellite data. Approximately 70 % of the atmospheric deposition consists of wet deposition of highly episod...
Boggs, Paul T.; Altshuler, Alan (Exagrid Engineering); Larzelere, Alex R. (Exagrid Engineering); Walsh, Edward J.; Clay, Robert L.; Hardwick, Michael F. (Sandia National Laboratories, Livermore, CA)
2005-08-01
The Design-through-Analysis Realization Team (DART) is chartered with reducing the time Sandia analysts require to complete the engineering analysis process. The DART system analysis team studied the engineering analysis processes employed by analysts in Centers 9100 and 8700 at Sandia to identify opportunities for reducing overall design-through-analysis process time. The team created and implemented a rigorous analysis methodology based on a generic process flow model parameterized by information obtained from analysts. They also collected data from analysis department managers to quantify the problem type and complexity distribution throughout Sandia's analyst community. They then used this information to develop a community model, which enables a simple characterization of processes that span the analyst community. The results indicate that equal opportunity for reducing analysis process time is available both by reducing the "once-through" time required to complete a process step and by reducing the probability of backward iteration. In addition, reducing the rework fraction (i.e., improving the engineering efficiency of subsequent iterations) offers approximately 40% to 80% of the benefit of reducing the "once-through" time or iteration probability, depending upon the process step being considered. Further, the results indicate that geometry manipulation and meshing is the largest portion of an analyst's effort, especially for structural problems, and offers significant opportunity for overall time reduction. Iteration loops initiated late in the process are more costly than others because they increase "inner loop" iterations. Identifying and correcting problems as early as possible in the process offers significant opportunity for time savings.
Submarine hydraulic control analysis
Bower, Michael J.
1980-01-01
Approved for public release; distribution unlimited. A mathematical model was developed to include line effects in the submarine hydraulic system dynamic performance analysis. The project was undertaken in an effort to demonstrate the necessity of coupling the entire hydraulic power network for an accurate analysis of any of the subsystems, rather than the current practice of treating a component loop as an isolated system. It was intended that the line model could be co...
Vávrová, Eva
2014-01-01
This thesis deals with the analysis of EEG during various sleep stages, performed by calculating selected parameters from the time and frequency domains. These parameters are calculated from individual segments of EEG signals that correspond to various sleep stages. Based on this analysis, it is decided which EEG parameters are appropriate for automatic detection of the sleep stages and which method is more suitable for evaluating the data in the hypnogram. The programme MATLAB was used for the ...
Speed, T. P.
2003-01-01
This talk will review a little over a decade's research on applying certain stochastic models to biological sequence analysis. The models themselves have a longer history, going back over 30 years, although many novel variants have arisen since that time. The function of the models in biological sequence analysis is to summarize the information concerning what is known as a motif or a domain in bioinformatics, and to provide a tool for discovering instances of that motif or domain in a separa...
Mateev, Nikolay; Menon, Vijay; Pingali, Keshav
2000-01-01
Restructuring compilers use dependence analysis to prove that the meaning of a program is not changed by a transformation. A well-known limitation of dependence analysis is that it examines only the memory locations read and written by a statement, and does not assume any particular interpretation for the operations in that statement. Exploiting the semantics of these operations enables a wider set of transformations to be used, and is critical for optimizing important codes such as LU factor...
Bass, Roger
2010-01-01
Zen's challenge for behavior analysis is to explain a repertoire that renders analysis itself meaningless—a result following not from scientific or philosophical arguments but rather from a unique verbal history generated by Zen's methods. Untying Zen's verbal knots suggests how meditation's and koans' effects on verbal behavior contribute to Enlightenment and Samādhi. The concept of stimulus singularity is introduced to account for why, within Zen's frame of reference, its methods can be stu...
During normal data taking CMS expects to support potentially as many as 2000 analysis users. Since the beginning of 2008 there have been more than 800 individuals who submitted a remote analysis job to the CMS computing infrastructure. The bulk of these users will be supported at the over 40 CMS Tier-2 centres. Supporting a globally distributed community of users on a globally distributed set of computing clusters is a task that requires reconsidering the normal methods of user support for Analysis Operations. In 2008 CMS formed an Analysis Support Task Force in preparation for large-scale physics analysis activities. The charge of the task force was to evaluate the available support tools, the user support techniques, and the direct feedback of users with the goal of improving the success rate and user experience when utilizing the distributed computing environment. The task force determined the tools needed to assess and reduce the number of non-zero exit code applications submitted through the grid interfaces and worked with the CMS experiment dashboard developers to obtain the necessary information to quickly and proactively identify issues with user jobs and data sets hosted at various sites. Results of the analysis group surveys were compiled. Reference platforms for testing and debugging problems were established in various geographic regions. The task force also assessed the resources needed to make the transition to a permanent Analysis Operations task. In this presentation the results of the task force will be discussed as well as the CMS Analysis Operations plans for the start of data taking.
Provenance as Dependency Analysis
Cheney, James; Ahmed, Amal; Acar, Umut
2007-01-01
Provenance is information recording the source, derivation, or history of some information. Provenance tracking has been studied in a variety of settings; however, although many design points have been explored, the mathematical or semantic foundations of data provenance have received comparatively little attention. In this paper, we argue that dependency analysis techniques familiar from program analysis and program slicing provide a formal foundation for forms of provenance that are intende...
Bayesian Group Factor Analysis
Virtanen, Seppo; Klami, Arto; Khan, Suleiman A; Kaski, Samuel
2011-01-01
We introduce a factor analysis model that summarizes the dependencies between observed variable groups, instead of dependencies between individual variables as standard factor analysis does. A group may correspond to one view of the same set of objects, one of many data sets tied by co-occurrence, or a set of alternative variables collected from statistics tables to measure one property of interest. We show that by assuming group-wise sparse factors, active in a subset of the sets, the variat...
Klami, Arto; Virtanen, Seppo; Leppäaho, Eemeli; Kaski, Samuel
2014-01-01
Factor analysis provides linear factors that describe relationships between individual variables of a data set. We extend this classical formulation into linear factors that describe relationships between groups of variables, where each group represents either a set of related variables or a data set. The model also naturally extends canonical correlation analysis to more than two sets, in a way that is more flexible than previous extensions. Our solution is formulated as variational inferenc...
Analysis of irradiated materials
Papers presented at the UKAEA Conference on Materials Analysis by Physical Techniques (1987) covered a wide range of techniques as applied to the analysis of irradiated materials. These varied from reactor component materials, materials associated with the Authority's radwaste disposal programme, fission products and products associated with the decommissioning of nuclear reactors. An invited paper giving a very comprehensive review of Laser Ablation Microprobe Mass Spectroscopy (LAMMS) was included in the programme. (author)
Oden, J Tinsley
2010-01-01
The textbook is designed as a crash course for beginning graduate students majoring in something besides mathematics, introducing mathematical foundations that lead to classical results in functional analysis. More specifically, Oden and Demkowicz want to prepare students to learn the variational theory of partial differential equations, distributions, and Sobolev spaces and numerical analysis with an emphasis on finite element methods. The 1996 first edition has been used in a rather intensive two-semester course. -Book News, June 2010
Software for Schenkerian Analysis
Marsden, Alan
2011-01-01
Software developed to automate the process of Schenkerian analysis is described. The current state of the art is that moderately good analyses of small extracts can be generated, but more information is required about the criteria by which analysts make decisions among alternative interpretations in the course of analysis. The software described here allows the procedure of reduction to be examined while in process, allowing decision points, and potentially criteria, to become clear.
Ho, Nancy; Sommers, Marilyn
2013-01-01
Anhedonia presents itself in a myriad of disease processes. To further develop our understanding of anhedonia and effective ways to manage it, the concept requires clear boundaries. This paper critically examined the current scientific literature and conducted a concept analysis of anhedonia to provide a more accurate and lucid understanding of the concept. As part of the concept analysis, this paper also provides model, borderline, related, and contrary examples of anhedonia.
The objective of SCK-CEN's programme on reactor safety is to develop expertise in probabilistic and deterministic reactor safety analysis. The research programme consists of four main activities: the development of software for reliability analysis of large systems; participation in the international PHEBUS-FP programme for severe accidents; the development of an expert system to aid diagnosis; and the development and application of a probabilistic reactor dynamics method. Main achievements in 1999 are reported.
Griffel, DH
2002-01-01
A stimulating introductory text, this volume examines many important applications of functional analysis to mechanics, fluid mechanics, diffusive growth, and approximation. Detailed enough to impart a thorough understanding, the text is also sufficiently straightforward for those unfamiliar with abstract analysis. Its four-part treatment begins with distribution theory and discussions of Green's functions. Essentially independent of the preceding material, the second and third parts deal with Banach spaces, Hilbert space, spectral theory, and variational techniques. The final part outlines the
Khedker, Uday P; Rawat, Prashant Singh
2011-01-01
Flow- and context-sensitive pointer analysis is generally considered too expensive for large programs; most tools relax one or both of the requirements for scalability. We formulate a flow- and context-sensitive points-to analysis that is lazy in the following sense: points-to information is computed only for live pointers and its propagation is sparse (restricted to live ranges of the respective pointers). Further, our analysis (a) uses strong liveness, effectively including dead code elimination; (b) afterwards calculates must-points-to information from may-points-to information instead of using a mutual fixed point; and (c) uses value-based termination of call strings during interprocedural analysis (which reduces the number of call strings significantly). A naive implementation of our analysis within GCC-4.6.0 gave analysis-time and points-to-set-size measurements for SPEC2006. Using liveness reduced the amount of points-to information by an order of magnitude with no loss of precision. For all programs under ...
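The core idea of this record, "keep points-to facts only for live pointers," can be illustrated with a minimal toy sketch. This is not the paper's GCC implementation: the three-operation straight-line IR, the statement forms (`addr`, `copy`, `use`), and the function name `lazy_points_to` are all invented for illustration. A backward strong-liveness pass decides which pointers matter (a copy's source becomes live only if its target is live, so dead code contributes nothing), and a forward pass then records points-to sets only for pointers live after each statement.

```python
# Statement forms: ("addr", p, x)  means p = &x
#                  ("copy", p, q)  means p = q
#                  ("use", p)      means *p is dereferenced

def lazy_points_to(program):
    # Backward pass: strong liveness of pointer variables.
    n = len(program)
    live = set()
    live_after = [None] * n
    for i in range(n - 1, -1, -1):
        live_after[i] = set(live)
        stmt = program[i]
        if stmt[0] == "use":
            live.add(stmt[1])
        elif stmt[0] == "addr":
            live.discard(stmt[1])        # definition kills liveness
        elif stmt[0] == "copy":
            p, q = stmt[1], stmt[2]
            if p in live:                # strong liveness: q matters only if p does
                live.discard(p)
                live.add(q)
            # if p is dead here, the copy is dead code and q gains no liveness

    # Forward pass: propagate points-to sets, but record facts
    # only for pointers that are live after the statement.
    pts, result = {}, {}
    for i, stmt in enumerate(program):
        if stmt[0] == "addr":
            pts[stmt[1]] = {stmt[2]}
        elif stmt[0] == "copy":
            pts[stmt[1]] = set(pts.get(stmt[2], set()))
        for ptr in live_after[i]:
            if ptr in pts:
                result.setdefault(ptr, set()).update(pts[ptr])
    return result

prog = [
    ("addr", "p", "x"),   # p = &x
    ("addr", "r", "y"),   # r = &y  -- r is never used, so it is dead
    ("copy", "q", "p"),   # q = p
    ("use", "q"),         # *q
]
print(lazy_points_to(prog))  # no entry for the dead pointer 'r'
```

On this toy program the analysis keeps facts for `p` and `q` but drops `r` entirely, which is the order-of-magnitude reduction mechanism the abstract describes, in miniature.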
Integrated genetic analysis microsystems
With the completion of the Human Genome Project and the ongoing DNA sequencing of the genomes of other animals, bacteria, plants and others, a wealth of new information about the genetic composition of organisms has become available. However, as the demand for sequence information grows, so does the workload required both to generate this sequence and to use it for targeted genetic analysis. Microfabricated genetic analysis systems are well poised to assist in the collection and use of these data through increased analysis speed, lower analysis cost and higher parallelism leading to increased assay throughput. In addition, such integrated microsystems may point the way to targeted genetic experiments on single cells and in other areas that are otherwise very difficult. Concomitant with these advantages, such systems, when fully integrated, should be capable of forming portable systems for high-speed in situ analyses, enabling a new standard in disciplines such as clinical chemistry, forensics, biowarfare detection and epidemiology. This review will discuss the various technologies available for genetic analysis on the microscale, and efforts to integrate them to form fully functional robust analysis devices. (topical review)
A review of research and development on NAA as well as examples of applications of this method are presented, taken from work carried out over the last 21 years at the Radioanalytical Laboratory of the Department of Chemistry in the Greek Nuclear Research Center ''Demokritos''. Improved and faster radiochemical NAA methods have been developed for the determination of Au, Ni, Cl, As, Cu, U, Cr, Eu, Hg and Mo in several materials, for the simultaneous determination of Br and I; Mg, Sr and Ni; As and Cu; As, Sb and Hg; Mn, Sr and Ba; Cd and Zn; Se and As; Mo and Cr in biological materials. Instrumental NAA methods have also been developed for the determination of Ag, Cl and Na in lake waters, Al, Ca, Mg and V in wines, 7 trace elements in biological materials, 17 trace elements in sediments and 20 minor and trace elements in ceramics. A comprehensive computer program for routine activation analysis using Ge(Li) detectors has been worked out. A rather extended charged-particle activation analysis program has been carried out over the last 10 years, including particle induced X-ray emission (PIXE) analysis, particle induced prompt gamma-ray emission analysis (PIGE), other nuclear reactions and proton activation analysis. A special neutron activation method, the delayed fission neutron counting method, is used for the analysis of fissionable elements, as U, Th, Pu, in samples of the whole nuclear fuel cycle including geological, enriched and nuclear safeguards samples
Dewhurst, A.; Legger, F.
2015-12-01
The ATLAS experiment accumulated more than 140 PB of data during the first run of the Large Hadron Collider (LHC) at CERN. The analysis of such an amount of data is a challenging task for the distributed physics community. The Distributed Analysis (DA) system of the ATLAS experiment is an established and stable component of the ATLAS distributed computing operations. About half a million user jobs are running daily on DA resources, submitted by more than 1500 ATLAS physicists. The reliability of the DA system during the first run of the LHC and the following shutdown period has been high thanks to the continuous automatic validation of the distributed analysis sites and the user support provided by a dedicated team of expert shifters. During the LHC shutdown, the ATLAS computing model has undergone several changes to improve the analysis workflows, including the re-design of the production system, a new analysis data format and event model, and the development of common reduction and analysis frameworks. We report on the impact such changes have on the DA infrastructure, describe the new DA components, and include recent performance measurements.
Legger, Federica; The ATLAS collaboration
2015-01-01
The ATLAS experiment accumulated more than 140 PB of data during the first run of the Large Hadron Collider (LHC) at CERN. The analysis of such an amount of data is a challenging task for the distributed physics community. The Distributed Analysis (DA) system of the ATLAS experiment is an established and stable component of the ATLAS distributed computing operations. About half a million user jobs are running daily on DA resources, submitted by more than 1500 ATLAS physicists. The reliability of the DA system during the first run of the LHC and the following shutdown period has been high thanks to the continuous automatic validation of the distributed analysis sites and the user support provided by a dedicated team of expert shifters. During the LHC shutdown, the ATLAS computing model has undergone several changes to improve the analysis workflows, including the re-design of the production system, a new analysis data format and event model, and the development of common reduction and analysis frameworks. We r...
Activation analysis in forensic studies
Applications of neutron activation analysis in forensics are grouped into 3 categories: firearms-discharge applications, elemental analysis of other nonbiological evidence materials (paint, other), and elemental analysis of biological evidence materials (multielemental analysis of hair, analysis of hair for As and Hg). 18 refs
Failure Analysis for Improved Reliability
Sood, Bhanu
2016-01-01
Outline: Section 1 - What is reliability and root cause? Section 2 - Overview of failure mechanisms. Section 3 - Failure analysis techniques (1. Non-destructive analysis techniques, 2. Destructive Analysis, 3. Materials Characterization). Section 4 - Summary and Closure
Renal calculi: genesis and analysis
In this paper, the genesis, composition and analysis of urinary calculi (urolithiasis) are discussed. Methods of medical examination are considered, e.g. quantitative analysis, infrared spectrometry, and X-ray diffraction analysis. (Auth.)
Software engineering with analysis patterns
Geyer-Schulz, Andreas; Hahsler, Michael
2001-01-01
The purpose of this article is twofold: first, to promote the use of patterns in the analysis phase of the software life-cycle by proposing an outline template for analysis patterns that strongly supports the whole analysis process, from requirements analysis to the analysis model and further on to its transformation into a flexible design. Second, we present, as an example, a family of analysis patterns that deal with a series of pressing problems in cooperative work, collaborative informat...
Neutron multiplicity analysis tool
Stewart, Scott L [Los Alamos National Laboratory
2010-01-01
I describe the capabilities of the EXCOM (EXcel based COincidence and Multiplicity) calculation tool which is used to analyze experimental data or simulated neutron multiplicity data. The input to the program is the count-rate data (including the multiplicity distribution) for a measurement, the isotopic composition of the sample and relevant dates. The program carries out deadtime correction and background subtraction and then performs a number of analyses. These are: passive calibration curve, known alpha and multiplicity analysis. The latter is done with both the point model and with the weighted point model. In the current application EXCOM carries out the rapid analysis of Monte Carlo calculated quantities and allows the user to determine the magnitude of sample perturbations that lead to systematic errors. Neutron multiplicity counting is an assay method used in the analysis of plutonium for safeguards applications. It is widely used in nuclear material accountancy by international (IAEA) and national inspectors. The method uses the measurement of the correlations in a pulse train to extract information on the spontaneous fission rate in the presence of neutrons from (α,n) reactions and induced fission. The measurement is relatively simple to perform and gives results very quickly (≤ 1 hour). By contrast, destructive analysis techniques are extremely costly and time consuming (several days). By improving the achievable accuracy of neutron multiplicity counting, a nondestructive analysis technique, it could be possible to reduce the use of destructive analysis measurements required in safeguards applications. The accuracy of a neutron multiplicity measurement can be affected by a number of variables such as density, isotopic composition, chemical composition and moisture in the material. In order to determine the magnitude of these effects on the measured plutonium mass a calculational tool, EXCOM, has been produced using VBA within Excel. This
INTERNAL ENVIRONMENT ANALYSIS TECHNIQUES
Caescu Stefan Claudiu
2011-12-01
Theme: The situation analysis, as a separate component of the strategic planning, involves collecting and analysing relevant types of information on the components of the marketing environment and their evolution on the one hand and also on the organization's resources and capabilities on the other. Objectives of the Research: The main purpose of the study of the analysis techniques of the internal environment is to provide insight on those aspects that are of strategic importance to the organization. Literature Review: The marketing environment consists of two distinct components, the internal environment that is made from specific variables within the organization and the external environment that is made from variables external to the organization. Although analysing the external environment is essential for corporate success, it is not enough unless it is backed by a detailed analysis of the internal environment of the organization. The internal environment includes all elements that are endogenous to the organization, which are influenced to a great extent and totally controlled by it. The study of the internal environment must answer all resource related questions, solve all resource management issues and represents the first step in drawing up the marketing strategy. Research Methodology: The present paper accomplished a documentary study of the main techniques used for the analysis of the internal environment. Results: The special literature emphasizes that the differences in performance from one organization to another are primarily dependent not on the differences between the fields of activity, but especially on the differences between the resources and capabilities and the ways these are capitalized on. The main methods of analysing the internal environment addressed in this paper are: the analysis of the organizational resources, the performance analysis, the value chain analysis and the functional analysis. Implications: Basically such
The high sensitivity of high-flux (reactor) thermal-neutron activation analysis (NAA) for the detection and quantitative measurement of a large number of elements has led, in recent years, to a considerable degree of application of the method in the area of scientific crime investigation (criminalistics). Thus, in a Forensic Activation Analysis Bibliography recently compiled by the author, some 135 publications in this field are listed - and more are appearing quite rapidly. The nondestructive character of the purely-instrumental form of the method is an added advantage in forensic work, since evidence samples involved in actual criminal cases are not destroyed during analysis, but are preserved intact for possible presentation in court. Quite aside from, or in addition to, use in court, NAA results can be very helpful in the investigative stage of particular criminal cases. The ultra sensitivity of the method often enables one to analyze evidence specimens that are too tiny for meaningful analysis by more conventional elemental analysis methods. Also, this high sensitivity often enables one to characterize, or individualize, evidence specimens as to the possibility of common origin - via the principle of multi-element trace-constituent characterization
Automated document analysis system
Black, Jeffrey D.; Dietzel, Robert; Hartnett, David
2002-08-01
A software application has been developed to aid law enforcement and government intelligence gathering organizations in the translation and analysis of foreign language documents with potential intelligence content. The Automated Document Analysis System (ADAS) provides the capability to search (data or text mine) documents in English and the most commonly encountered foreign languages, including Arabic. Hardcopy documents are scanned by a high-speed scanner and are optical character recognized (OCR). Documents obtained in an electronic format bypass the OCR and are copied directly to a working directory. For translation and analysis, the script and the language of the documents are first determined. If the document is not in English, the document is machine translated to English. The documents are searched for keywords and key features in either the native language or translated English. The user can quickly review the document to determine if it has any intelligence content and whether detailed, verbatim human translation is required. The documents and document content are cataloged for potential future analysis. The system allows non-linguists to evaluate foreign language documents and allows for the quick analysis of a large quantity of documents. All document processing can be performed manually or automatically on a single document or a batch of documents.
Harmonic and geometric analysis
Citti, Giovanna; Pérez, Carlos; Sarti, Alessandro; Zhong, Xiao
2015-01-01
This book presents an expanded version of four series of lectures delivered by the authors at the CRM. Harmonic analysis, understood in a broad sense, has a very wide interplay with partial differential equations and in particular with the theory of quasiconformal mappings and its applications. Some areas in which real analysis has been extremely influential are PDE's and geometric analysis. Their foundations and subsequent developments made extensive use of the Calderón–Zygmund theory, especially the Lp inequalities for Calderón–Zygmund operators (Beurling transform and Riesz transform, among others) and the theory of Muckenhoupt weights. The first chapter is an application of harmonic analysis and the Heisenberg group to understanding human vision, while the second and third chapters cover some of the main topics on linear and multilinear harmonic analysis. The last serves as a comprehensive introduction to a deep result from De Giorgi, Moser and Nash on the regularity of elliptic partial differen...
Suphattharachai Chomphan
2012-01-01
Problem statement: The ukulele is a trendy instrument in the present day. It is a member of the guitar family of instruments and employs four nylon or gut strings or four courses of strings. However, a statistical analysis of the pitch of this instrument has not been conducted; analysis of the pitch, or fundamental frequency, of its main chords should be performed in an appropriate way. This study brings about its effective sound synthesis, which is an important issue in the future. Approach: An efficient technique for the analysis of the fundamental frequency (F0) of human speech was applied to the analysis of the main chords of the ukulele. The autocorrelation-based technique was used on the signal waveform to extract the optimal period, or pitch, for the corresponding analyzed frame in the time domain. The corresponding fundamental frequency was then calculated in the frequency domain. Results: The 21 main chords were chosen for the study. The number of distinct fundamental frequency values per chord varied from one to three, with values ranging from 65.42 Hz to 329.93 Hz. Conclusion: By using this analysis technique for the fundamental frequency of human speech, the output frequencies of all main chords can be extracted, and it can be seen empirically that each has its own unique values.
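The autocorrelation-based period extraction described in this record can be sketched as follows. This is a generic illustration, not the authors' code: the function name `estimate_f0`, the frame length, and the lag search bounds are assumptions; the idea is simply to pick the lag with the largest autocorrelation as the period and convert it to a frequency.

```python
import math

def estimate_f0(frame, fs, f0_min=50.0, f0_max=400.0):
    """Estimate the fundamental frequency of one frame by autocorrelation:
    search candidate periods (lags), keep the one with the largest
    autocorrelation, and return fs / lag."""
    lag_lo = int(fs / f0_max)   # shortest period considered
    lag_hi = int(fs / f0_min)   # longest period considered
    best_lag, best_r = lag_lo, float("-inf")
    for lag in range(lag_lo, lag_hi + 1):
        r = sum(frame[i] * frame[i + lag] for i in range(len(frame) - lag))
        if r > best_r:
            best_r, best_lag = r, lag
    return fs / best_lag

# A synthetic 110 Hz tone; 110 Hz sits inside the 65.42-329.93 Hz
# range reported for the ukulele chords.
fs = 8000
tone = [math.sin(2 * math.pi * 110 * n / fs) for n in range(2048)]
print(estimate_f0(tone, fs))  # close to 110 Hz
```

With real recordings one would additionally window each frame and interpolate around the best lag for sub-sample precision; the raw autocorrelation peak is enough to show the principle.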
Ryan, Harry; Junell, Justin; Albasini, Colby; O'Rourke, William; Le, Thang; Strain, Ted; Stiglets, Tim
2011-01-01
A package for the automation of the Engineering Analysis (EA) process at the Stennis Space Center has been customized. It provides the ability to assign and track analysis tasks electronically and to route a task electronically for approval, and it now provides a mechanism to keep these analyses under configuration management. It also allows an analysis to be stored and linked to the engineering data needed to perform it (drawings, etc.). PTC's (Parametric Technology Corporation) Windchill product was customized to allow the EA to be created, routed, and maintained under configuration management. Using Info*Engine tasks, JSP (JavaServer Pages), and JavaScript, a user interface was created within the Windchill product that allows users to create EAs. Not only does this interface allow users to create and track EAs, it also plugs directly into the out-of-the-box ability to associate these analyses with other relevant engineering data such as drawings. Using the Windchill workflow tool, the Design and Data Management System (DDMS) team created an electronic routing process based on the existing manual/informal approval process, and added the ability for users to notify individuals about the EA and to track those notifications. Prior to this work, there was no electronic way of creating and tracking these analyses. A feature was also added that allows users to track and log e-mail notifications of the EA.
Improved superpower relations followed by the Soviet Union's collapse acted as catalysts for changing the mission at Rocky Flats. Now, environmental concerns command as much attention as production capability. As a result, laboratory instruments once dedicated to plutonium production have a new purpose: the analysis of mixed wastes. Waste drums destined for WIPP require headspace analysis by GC/MS (gas chromatography/mass spectrometry) for volatile and semi-volatile organic compounds (VOCs and SVOCs). Flame AA analysis provides information on inorganic constituents. EPA guidelines for waste analysis (SW-846) overlook the obstacles of glove-box manipulations, and sometimes conflict with the Rocky Flats waste minimization effort. However, the EPA encourages adaptations of SW-846 if experimental data confirm the results. For water and soil samples, AA analysis of laboratory control samples demonstrates method capability inside a glove box. Non-radioactive drum headspace samples use a revised version of USEPA compendium method TO-14. Radioactive drum headspace samples require new instrumentation and changes to SW-846 methods
BAT - Bayesian Analysis Toolkit
One of the most vital steps in any data analysis is the statistical analysis and comparison with the prediction of a theoretical model. The many uncertainties associated with the theoretical model and the observed data require a robust statistical analysis tool. The Bayesian Analysis Toolkit (BAT) is a powerful statistical analysis software package based on Bayes' theorem, developed to evaluate the posterior probability distribution for models and their parameters. It implements Markov chain Monte Carlo to obtain the full posterior probability distribution, which in turn provides straightforward parameter estimation, limit setting and uncertainty propagation. Additional algorithms, such as simulated annealing, allow evaluation of the global mode of the posterior. BAT is developed in C++ and allows for a flexible definition of models. A set of predefined models covering standard statistical cases is also included in BAT. It has been interfaced to other commonly used software packages such as ROOT, Minuit, RooStats and CUBA. An overview of the software and its algorithms is provided, along with several physics examples covering a range of applications of this statistical tool. Future plans, new features and recent developments are briefly discussed.
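The Markov chain Monte Carlo machinery underlying such posterior evaluation can be illustrated with a minimal random-walk Metropolis sampler. This is a toy one-dimensional sketch for intuition only, not BAT's C++ interface; the Gaussian posterior and step size are invented.

```python
import math
import random

def metropolis(log_post, start, n_steps, step=0.5, seed=1):
    """Random-walk Metropolis: draws samples from exp(log_post)."""
    rng = random.Random(seed)
    x, lp = start, log_post(start)
    samples = []
    for _ in range(n_steps):
        prop = x + rng.gauss(0.0, step)       # propose a nearby point
        lp_prop = log_post(prop)
        if math.log(rng.random()) < lp_prop - lp:  # accept with ratio of posteriors
            x, lp = prop, lp_prop
        samples.append(x)                     # rejected proposals repeat the old point
    return samples

# Toy model: flat prior, Gaussian likelihood centered on 2.0 (hypothetical data)
log_post = lambda mu: -0.5 * (mu - 2.0) ** 2
chain = metropolis(log_post, start=0.0, n_steps=20000)
mean_est = sum(chain[5000:]) / len(chain[5000:])  # discard burn-in, then estimate
```

The posterior mean recovered from the chain approaches the true value 2.0; parameter estimates, limits, and propagated uncertainties all reduce to such summaries of the sampled posterior.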
[Demoralization: A Concept Analysis].
Li, Yu-Chi; Chou, Fan-Hao; Wang, Hsiu-Huang
2015-06-01
Demoralization is a relatively new diagnosis that has received increased attention in recent years. This condition is most frequently diagnosed in hospice, cancer, and critical illness patients. Serious demoralization may induce suicidal ideation. The current literature on demoralization primarily elaborates on the history of this condition and on the reliability and validity of scales developed to assess demoralization in patients. The present paper conducts a concept analysis of demoralization in accordance with the concept-analysis steps outlined by Walker and Avant. We elucidate the definition of relevant terms and identify the characteristic attributes, antecedents, and consequences of demoralization. Additionally, this paper provides examples of model, borderline, contrary, and related cases, and reviews relevant empirical data. We hope this concept analysis enhances the understanding of demoralization among clinical caregivers, increases general understanding of the methods available to assess demoralization, and, ultimately, helps set patients free of their demoralization haze. PMID:26073961
Concept analysis of mentoring.
2013-10-01
The purpose of a concept analysis is to examine the structure and function of a concept by defining its attributes and internal structure. Concept analysis can clarify an overused or vague concept and promote mutual understanding by providing a precise operational definition. Mentoring is a concept more fully used by other fields, such as business, than by nursing, and may not always translate well for use in nursing. Therefore, the aims of this concept analysis are to clarify the meaning of the existing concept of mentoring and to develop an operational definition for use in nursing. Mentoring is broadly based and concentrates on developing areas such as career progression, scholarly achievement, and personal development. Mentoring relationships are based on developing reciprocity and accountability between the partners. Mentoring is related to transition in practice, role acquisition, and socialization, as a way to support new colleagues. Mentorship is linked to nurses' success in nursing practice, including professionalism, nursing quality improvement, and self-confidence. PMID:24042140
Zorich, Vladimir A
2015-01-01
VLADIMIR A. ZORICH is professor of mathematics at Moscow State University. His areas of specialization are analysis, conformal geometry, quasiconformal mappings, and mathematical aspects of thermodynamics. He solved the problem of global homeomorphism for space quasiconformal mappings. He holds a patent in the technology of mechanical engineering, and he is also known for his book Mathematical Analysis of Problems in the Natural Sciences. This second English edition of a very popular two-volume work presents a thorough first course in analysis, leading from real numbers to such advanced topics as differential forms on manifolds; asymptotic methods; Fourier, Laplace, and Legendre transforms; elliptic functions; and distributions. Especially notable in this course are the clearly expressed orientation toward the natural sciences and the informal exploration of the essence and the roots of the basic concepts and theorems of calculus. Clarity of exposition is matched by a wealth of instructive exercises, problems...
Choudary, A D R
2014-01-01
The book targets undergraduate and postgraduate mathematics students and helps them develop a deep understanding of mathematical analysis. Designed as a first course in real analysis, it helps students learn how abstract mathematical analysis solves mathematical problems that relate to the real world. As well as providing a valuable source of inspiration for contemporary research in mathematics, the book helps students read, understand and construct mathematical proofs, develop their problem-solving abilities and comprehend the importance and frontiers of computer facilities and much more. It offers comprehensive material for both seminars and independent study for readers with a basic knowledge of calculus and linear algebra. The first nine chapters followed by the appendix on the Stieltjes integral are recommended for graduate students studying probability and statistics, while the first eight chapters followed by the appendix on dynamical systems will be of use to students of biology and environmental scie...
Frigaard, Peter; Andersen, Thomas Lykke
The present book describes the most important aspects of wave analysis techniques applied to physical model tests. Moreover, the book serves as technical documentation for the wave analysis software WaveLab 3, cf. Aalborg University (2012). In that respect it should be mentioned that, supplementary to the present technical documentation, there also exists an online help document describing the WaveLab software in detail, including all the input and output fields. In addition to the two main authors, Tue Hald, Jacob Helm-Petersen and Morten Møller Jakobsen have also contributed to the note. Their input is highly acknowledged. The outline of the book is as follows: • Chapters 2 and 3 describe analysis of waves in the time and frequency domains. • Chapters 4 and 5 describe the separation of incident and reflected waves for the two-dimensional case. • Chapter 6 describes the estimation of directional spectra, which also...
Neutron signal transfer analysis
Pleinert, H; Lehmann, E
1999-01-01
A new method called neutron signal transfer analysis has been developed for the quantitative determination of hydrogenous distributions from neutron radiographic measurements. The technique is based on a model which describes the detector signal obtained in the measurement as the result of three different mechanisms expressed by signal transfer functions. The explicit forms of the signal transfer functions are determined by Monte Carlo computer simulations and contain only the distribution as a variable; an unknown distribution can therefore be determined from the detector signal by recursive iteration. The method takes into account complex effects due to the energy dependence of neutron interactions and to single and multiple scattering, and thus provides a simple and efficient tool for precise quantitative analysis using neutron radiography, for example the quantitative determination of moisture distributions in porous buil...
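The recursive-iteration step can be illustrated with a toy scalar version: given a known (here invented) transfer function mapping a distribution value to a detector signal, repeatedly correct the estimate until the modelled signal matches the measurement. The quadratic "scattering" term and relaxation factor below are hypothetical stand-ins for the Monte Carlo-derived transfer functions of the actual method.

```python
def recover_distribution(signal, transfer, guess=0.0, relax=0.8,
                         tol=1e-9, max_iter=200):
    """Invert a known signal-transfer function by fixed-point iteration:
    d <- d + relax * (measured - transfer(d))."""
    d = guess
    for _ in range(max_iter):
        residual = signal - transfer(d)
        if abs(residual) < tol:   # modelled signal matches the measurement
            break
        d += relax * residual     # push the estimate toward agreement
    return d

# Hypothetical transfer function: linear response minus a scattering correction
transfer = lambda d: d - 0.15 * d * d
measured = transfer(0.6)                     # signal from a "true" value of 0.6
estimate = recover_distribution(measured, transfer)
```

Because the correction term is a contraction near the solution, the iteration converges back to the value that produced the signal; the real method applies the same idea pixel-wise with simulated transfer functions.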
Frank, IE
1994-01-01
Analyzing observed or measured data is an important step in applied sciences. The recent increase in computer capacity has resulted in a revolution both in data collection and data analysis. An increasing number of scientists, researchers and students are venturing into statistical data analysis; hence the need for more guidance in this field, which was previously dominated mainly by statisticians. This handbook fills the gap in the range of textbooks on data analysis. Written in a dictionary format, it will serve as a comprehensive reference book in a rapidly growing field. However, this book is more structured than an ordinary dictionary, where each entry is a separate, self-contained entity. The authors provide not only definitions and short descriptions, but also offer an overview of the different topics. Therefore, the handbook can also be used as a companion to textbooks for undergraduate or graduate courses. 1700 entries are given in alphabetical order grouped into 20 topics and each topic is organized...
Bandemer, Hans
1992-01-01
Fuzzy data such as marks, scores, verbal evaluations, imprecise observations, experts' opinions and grey tone pictures, are quite common. In Fuzzy Data Analysis the authors collect their recent results providing the reader with ideas, approaches and methods for processing such data when looking for sub-structures in knowledge bases for an evaluation of functional relationship, e.g. in order to specify diagnostic or control systems. The modelling presented uses ideas from fuzzy set theory and the suggested methods solve problems usually tackled by data analysis if the data are real numbers. Fuzzy Data Analysis is self-contained and is addressed to mathematicians oriented towards applications and to practitioners in any field of application who have some background in mathematics and statistics.
Sutawanir Darwis
2012-05-01
Empirical decline curve analysis of oil production data gives reasonable answers in hyperbolic type-curve situations; however, the methodology has limitations in fitting real historical production data in the presence of unusual observations caused by treatments applied to a well to increase its production capacity. The development of robust least squares offers new possibilities for better fitting of production data in decline curve analysis by down-weighting the unusual observations. This paper proposes a robust least squares fitting (lmRobMM) approach to estimate the decline rate of daily production data and compares the results with reservoir simulation results. As a case study, we use the oil production data from the TBA Field, West Java. The results demonstrate that the approach is suitable for decline curve fitting and offers new insight into decline curve analysis in the presence of unusual observations.
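The down-weighting idea can be sketched with a simple Huber-weighted iteratively reweighted least squares fit of an exponential decline. The lmRobMM estimator referred to in the paper is an R routine with different internals, so this Python sketch is only an analogue; the synthetic data and the outlier are invented for illustration.

```python
import math

def robust_decline_fit(times, rates, n_iter=20, c=1.345):
    """Fit log q = log(q0) - D*t by IRLS with Huber weights,
    down-weighting unusual observations."""
    y = [math.log(q) for q in rates]
    w = [1.0] * len(y)
    a = b = 0.0                                   # intercept log(q0), slope -D
    for _ in range(n_iter):
        sw = sum(w)
        sx = sum(wi * t for wi, t in zip(w, times))
        sy = sum(wi * yi for wi, yi in zip(w, y))
        sxx = sum(wi * t * t for wi, t in zip(w, times))
        sxy = sum(wi * t * yi for wi, t, yi in zip(w, times, y))
        b = (sw * sxy - sx * sy) / (sw * sxx - sx * sx)   # weighted LS slope
        a = (sy - b * sx) / sw                            # weighted LS intercept
        resid = [yi - (a + b * t) for t, yi in zip(times, y)]
        s = 1.4826 * sorted(abs(r) for r in resid)[len(resid) // 2] or 1.0  # MAD scale
        w = [1.0 if abs(r) <= c * s else c * s / abs(r) for r in resid]     # Huber weights
    return math.exp(a), -b                        # (initial rate q0, decline rate D)

# Synthetic decline, D = 0.01/day, with one treated-well outlier at t = 50
times = list(range(0, 100, 5))
rates = [100.0 * math.exp(-0.01 * t) for t in times]
rates[10] *= 3.0                                  # unusual observation
q0, D = robust_decline_fit(times, rates)
```

After a few reweighting passes the outlier's influence on the estimated decline rate is largely removed, which is the behaviour the paper exploits for historical production data.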
Haurykiewicz, John Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Dinehart, Timothy Grant [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Parker, Robert Young [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2016-05-12
The purpose of this process analysis was to analyze the Badge Offices’ current processes from a systems perspective and consider ways of pursuing objectives set forth by SEC-PS, namely increased customer flow (throughput) and reduced customer wait times. Information for the analysis was gathered for the project primarily through Badge Office Subject Matter Experts (SMEs), and in-person observation of prevailing processes. Using the information gathered, a process simulation model was constructed to represent current operations and allow assessment of potential process changes relative to factors mentioned previously. The overall purpose of the analysis was to provide SEC-PS management with information and recommendations to serve as a basis for additional focused study and areas for potential process improvements in the future.
Gasiński, Leszek
2016-01-01
This second of two Exercises in Analysis volumes covers problems in five core topics of mathematical analysis: Function Spaces, Nonlinear and Multivalued Maps, Smooth and Nonsmooth Calculus, Degree Theory and Fixed Point Theory, and Variational and Topological Methods. Each of the five topics corresponds to a different chapter, with inclusion of the basic theory and accompanying main definitions and results, followed by suitable comments and remarks for better understanding of the material. Exercises/problems are presented for each topic, with solutions available at the end of each chapter. The entire collection of exercises offers a balanced and useful picture of the applications surrounding each topic. This nearly encyclopedic coverage of exercises in mathematical analysis is the first of its kind and is accessible to a wide readership. Graduate students will find the collection of problems valuable in preparation for their preliminary or qualifying exams as well as for testing their deeper understanding of the ...
Trajectory Based Traffic Analysis
Krogh, Benjamin Bjerre; Andersen, Ove; Lewis-Kelham, Edwin;
2013-01-01
We present the INTRA system for interactive path-based traffic analysis. The analyses are developed in collaboration with traffic researchers and provide novel insights into conditions such as congestion, travel time, choice of route, and traffic flow. INTRA supports interactive point-and-click analysis, due to a novel and efficient indexing structure. With the web site daisy.aau.dk/its/spqdemo/ we will demonstrate several analyses, using a very large real-world data set consisting of 1.9 billion GPS records (1.5 million trajectories) recorded from more than 13000 vehicles, and touching most of...
Applied multivariate statistical analysis
Härdle, Wolfgang Karl
2015-01-01
Focusing on high-dimensional applications, this 4th edition presents the tools and concepts used in multivariate data analysis in a style that is also accessible for non-mathematicians and practitioners. It surveys the basic principles and emphasizes both exploratory and inferential statistics; a new chapter on Variable Selection (Lasso, SCAD and Elastic Net) has also been added. All chapters include practical exercises that highlight applications in different multivariate data analysis fields: in quantitative financial studies, where the joint dynamics of assets are observed; in medicine, where recorded observations of subjects in different locations form the basis for reliable diagnoses and medication; and in quantitative marketing, where consumers’ preferences are collected in order to construct models of consumer behavior. All of these examples involve high to ultra-high dimensions and represent a number of major fields in big data analysis. The fourth edition of this book on Applied Multivariate ...
Pugh, Charles C
2015-01-01
Based on an honors course taught by the author at UC Berkeley, this introduction to undergraduate real analysis gives a different emphasis by stressing the importance of pictures and hard problems. Topics include: a natural construction of the real numbers, four-dimensional visualization, basic point-set topology, function spaces, multivariable calculus via differential forms (leading to a simple proof of the Brouwer Fixed Point Theorem), and a pictorial treatment of Lebesgue theory. Over 150 detailed illustrations elucidate abstract concepts and salient points in proofs. The exposition is informal and relaxed, with many helpful asides, examples, some jokes, and occasional comments from mathematicians, such as Littlewood, Dieudonné, and Osserman. This book thus succeeds in being more comprehensive, more comprehensible, and more enjoyable than standard introductions to analysis. New to the second edition of Real Mathematical Analysis is a presentation of Lebesgue integration done almost entirely using the un...
The Physics Analysis Workstation (PAW) is a high-level program providing data presentation and statistical or mathematical analysis. PAW has been developed at CERN as an instrument to assist physicists in the analysis and presentation of their data. The program is interfaced to a high-level graphics package built on basic underlying graphics; 3-D graphics capabilities are being implemented. The major objects in PAW are one- or two-dimensional binned event data with a fixed number of entries per event, vectors, functions, graphics pictures, and macros. Command input is handled by an integrated user-interface package, which allows a variety of choices for input, either as typed commands or in a tree-structured, menu-driven mode. 6 refs., 1 fig
Schaefer, Gerald
2013-01-01
Since the early 20th century, medical imaging has been dominated by monochrome imaging modalities such as x-ray, computed tomography, ultrasound, and magnetic resonance imaging. As a result, color information has been overlooked in medical image analysis applications. Recently, various medical imaging modalities that involve color information have been introduced. These include cervicography, dermoscopy, fundus photography, gastrointestinal endoscopy, microscopy, and wound photography. However, in comparison to monochrome images, the analysis of color images is a relatively unexplored area. The multivariate nature of color image data presents new challenges for researchers and practitioners as the numerous methods developed for monochrome images are often not directly applicable to multichannel images. The goal of this volume is to summarize the state-of-the-art in the utilization of color information in medical image analysis.
Principles of harmonic analysis
Deitmar, Anton
2014-01-01
This book offers a complete and streamlined treatment of the central principles of abelian harmonic analysis: Pontryagin duality, the Plancherel theorem and the Poisson summation formula, as well as their respective generalizations to non-abelian groups, including the Selberg trace formula. The principles are then applied to spectral analysis of Heisenberg manifolds and Riemann surfaces. This new edition contains a new chapter on p-adic and adelic groups, as well as a complementary section on direct and projective limits. Many of the supporting proofs have been revised and refined. The book is an excellent resource for graduate students who wish to learn and understand harmonic analysis and for researchers seeking to apply it.
Power electronics reliability analysis.
Smith, Mark A.; Atcitty, Stanley
2009-12-01
This report provides the DOE and industry with a general process for analyzing power electronics reliability. The analysis can help with understanding the main causes of failures, downtime, and cost and how to reduce them. One approach is to collect field maintenance data and use it directly to calculate reliability metrics related to each cause. Another approach is to model the functional structure of the equipment using a fault tree to derive system reliability from component reliability. Analysis of a fictitious device demonstrates the latter process. Optimization can use the resulting baseline model to decide how to improve reliability and/or lower costs. It is recommended that both electric utilities and equipment manufacturers make provisions to collect and share data in order to lay the groundwork for improving reliability into the future. Reliability analysis helps guide reliability improvements in hardware and software technology including condition monitoring and prognostics and health management.
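For simple success/failure logic, the fault-tree approach of deriving system reliability from component reliabilities reduces to products over series (AND) and parallel (redundant) blocks. The component values and converter structure below are hypothetical, in the spirit of the report's fictitious device.

```python
def series(*rels):
    """All components must work: system reliability is the product."""
    p = 1.0
    for r in rels:
        p *= r
    return p

def parallel(*rels):
    """Redundant components: the system fails only if every branch fails."""
    q = 1.0
    for r in rels:
        q *= (1.0 - r)
    return 1.0 - q

# Hypothetical converter: two redundant gate drivers feeding one DC link
r_system = series(parallel(0.95, 0.95), 0.99)   # = (1 - 0.05**2) * 0.99
```

Redundancy lifts the gate-driver block from 0.95 to 0.9975, so the single 0.99 DC link dominates the system figure, which is exactly the kind of insight the fault-tree baseline model is meant to expose before optimizing.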
Software safety hazard analysis
Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper
Sohrab, Houshang H
2014-01-01
This expanded second edition presents the fundamentals and touchstone results of real analysis in full rigor, but in a style that requires little prior familiarity with proofs or mathematical language. The text is a comprehensive and largely self-contained introduction to the theory of real-valued functions of a real variable. The chapters on Lebesgue measure and integral have been rewritten entirely and greatly improved. They now contain Lebesgue’s differentiation theorem as well as his versions of the Fundamental Theorem(s) of Calculus. With expanded chapters, additional problems, and an expansive solutions manual, Basic Real Analysis, Second Edition, is ideal for senior undergraduates and first-year graduate students, both as a classroom text and a self-study guide. Reviews of first edition: The book is a clear and well-structured introduction to real analysis aimed at senior undergraduate and beginning graduate students. The prerequisites are few, but a certain mathematical sophistication is required. ....
Digital Fourier analysis fundamentals
Kido, Ken'iti
2015-01-01
This textbook is a thorough, accessible introduction to digital Fourier analysis for undergraduate students in the sciences. Beginning with the principles of sine/cosine decomposition, the reader walks through the principles of discrete Fourier analysis before reaching the cornerstone of signal processing: the Fast Fourier Transform. Saturated with clear, coherent illustrations, "Digital Fourier Analysis - Fundamentals" includes practice problems and thorough Appendices for the advanced reader. As a special feature, the book includes interactive applets (available online) that mirror the illustrations. These user-friendly applets animate concepts interactively, allowing the user to experiment with the underlying mathematics. For example, a real sine signal can be treated as a sum of clockwise and counter-clockwise rotating vectors. The applet illustration included with the book animates the rotating vectors and the resulting sine signal. By changing parameters such as amplitude and frequency, the reader ca...
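The rotating-vector view mentioned above corresponds to the identity sin(ωt) = (e^{iωt} − e^{−iωt}) / 2i, which is easy to check numerically; the frequency and sample grid below are arbitrary illustrative choices.

```python
import cmath
import math

def sine_from_rotating_vectors(freq, t):
    """A real sine as the sum of two counter-rotating complex vectors."""
    w = 2 * math.pi * freq
    ccw = cmath.exp(1j * w * t)    # counter-clockwise rotating vector
    cw = cmath.exp(-1j * w * t)    # clockwise rotating vector
    return ((ccw - cw) / 2j).real  # imaginary parts of the sum cancel

# One period of a 1 Hz sine sampled at 16 points
samples = [sine_from_rotating_vectors(1.0, n / 16) for n in range(16)]
```

Varying the amplitude or frequency of the two vectors reproduces the experiments the book's applets animate: the real signal is always the shadow of the rotating pair on the imaginary axis.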
Canuto, Claudio
2015-01-01
The purpose of the volume is to provide a support textbook for a second lecture course on Mathematical Analysis. The contents are organised to suit, in particular, students of Engineering, Computer Science and Physics, all areas in which mathematical tools play a crucial role. The basic notions and methods concerning integral and differential calculus for multivariable functions, series of functions and ordinary differential equations are presented in a manner that elicits critical reading and prompts a hands-on approach to concrete applications. The pedagogical layout echoes the one used in the companion text Mathematical Analysis I. The book’s structure has a specifically-designed modular nature, which allows for great flexibility in the preparation of a lecture course on Mathematical Analysis. The style privileges clarity in the exposition and a linear progression through the theory. The material is organised on two levels. The first, reflected in this book, allows students to grasp the essential ideas, ...
The first phase of the Depleted Uranium Hexafluoride Management Program (Program)--management strategy selection--consists of several program elements: Technology Assessment, Engineering Analysis, Cost Analysis, and preparation of an Environmental Impact Statement (EIS). Cost Analysis will estimate the life-cycle costs associated with each of the long-term management strategy alternatives for depleted uranium hexafluoride (UF6). The scope of Cost Analysis will include all major expenditures, from the planning and design stages through decontamination and decommissioning. The costs will be estimated at a scoping or preconceptual design level and are intended to assist decision makers in comparing alternatives for further consideration; they will not be absolute costs or bid-document costs. The purpose of the Cost Analysis Guidelines is to establish a consistent approach to analyzing cost alternatives for managing the Department of Energy's (DOE's) stocks of depleted uranium hexafluoride (DUF6). The component modules that make up the DUF6 management program differ substantially in process options, requirements for R and D, equipment, facilities, regulatory compliance, operations and maintenance (O and M), and operations risk. To facilitate a consistent and equitable comparison of costs, the guidelines offer common definitions, assumptions or bases, and limitations, integrated with a standard approach to the analysis. Further, the goal is to evaluate total net life-cycle costs and display them in a way that gives DOE the capability to evaluate a variety of overall DUF6 management strategies, including commercial potential. The cost estimates reflect the preconceptual level of the designs and will be appropriate for distinguishing among management strategies
This analysis defines and evaluates the surface water supply system from the existing J-13 well to the North Portal. This system includes the pipe running from J-13 to a proposed Booster Pump Station at the intersection of H Road and the North Portal access road. Contained herein is an analysis of the proposed Booster Pump Station with a brief description of the system that could be installed to the South Portal and the optional shaft. The tanks that supply the water to the North Portal are sized, and the supply system to the North Portal facilities and up to Topopah Spring North Ramp is defined
Harmonic analysis and applications
Heil, Christopher
2007-01-01
This self-contained volume in honor of John J. Benedetto covers a wide range of topics in harmonic analysis and related areas. These include weighted-norm inequalities, frame theory, wavelet theory, time-frequency analysis, and sampling theory. The chapters are clustered by topic to provide authoritative expositions that will be of lasting interest. The original papers collected are written by prominent researchers and professionals in the field. The book pays tribute to John J. Benedetto's achievements and expresses an appreciation for the mathematical and personal inspiration he has given to
Cunnold, Derek; Wang, Ray
2002-01-01
Publications from 1999-2002 describing research funded by the SAGE II contract to Dr. Cunnold and Dr. Wang are listed below. Our most recent accomplishments include a detailed analysis of the quality of SAGE II, v6.1, ozone measurements below 20 km altitude (Wang et al., 2002 and Kar et al., 2002) and an analysis of the consistency between SAGE upper stratospheric ozone trends and model predictions with emphasis on hemispheric asymmetry (Li et al., 2001). Abstracts of the 11 papers are attached.
Rudin, Walter
2011-01-01
In the late 1950s, many of the more refined aspects of Fourier analysis were transferred from their original settings (the unit circle, the integers, the real line) to arbitrary locally compact abelian (LCA) groups. Rudin's book, published in 1962, was the first to give a systematic account of these developments and has come to be regarded as a classic in the field. The basic facts concerning Fourier analysis and the structure of LCA groups are proved in the opening chapters, in order to make the treatment relatively self-contained.
Chambers, L.; Tromp, E.; Pechenizkiy, M.; M. Gaber
2012-01-01
Mobile devices play a significant part in a user's communication, and much of the data a user reads and writes is received and sent via mobile phone, for instance SMS messages, e-mails, Twitter tweets and social media networking feeds. One of the main goals is to make people aware of how much negative and positive content they read and write via their mobile phones. Existing sentiment analysis applications perform sentiment analysis on data downloaded from mobile phones or use an applicatio...
Provenance as Dependency Analysis
Cheney, James; Acar, Umut
2007-01-01
Provenance is information recording the source, derivation, or history of some information. Provenance tracking has been studied in a variety of settings; however, although many design points have been explored, the mathematical or semantic foundations of data provenance have received comparatively little attention. In this paper, we argue that dependency analysis techniques familiar from program analysis and program slicing provide a formal foundation for forms of provenance that are intended to show how (part of) the output of a query depends on (parts of) its input. We introduce a semantic characterization of such dependency provenance, show that this form of provenance is not computable, and provide dynamic and static approximation techniques.
Eliezer, C J; Maxwell, E A; Sneddon, I N
1963-01-01
Concise Vector Analysis is a five-chapter introductory account of the methods and techniques of vector analysis. These methods are indispensable tools in mathematics, physics, and engineering. The book is based on lectures given by the author at the University of Ceylon. The first two chapters deal with vector algebra, particularly the addition, representation, and resolution of vectors. The next two chapters examine the various aspects and specificities of vector calculus. The last chapter looks into some standard applications of vector algebra and calculus. This book wil
In order to verify compliance with safeguards and draw conclusions on the absence of undeclared nuclear material and activities, the International Atomic Energy Agency (IAEA) collects and analyses trade information that it receives from open sources as well as from Member States. Although the IAEA does not intervene in national export controls, it has to monitor the trade of dual-use items. Trade analysis helps the IAEA to evaluate global proliferation threats, to understand States' ability to report exports according to additional protocols, and to compare against State declarations. Consequently, the IAEA has explored sources of trade-related information and has developed analysis methodologies beyond its traditional safeguards approaches. (author)
Lartillot, Olivier
2016-01-01
Motivic analysis provides very detailed understanding of musical compositions, but is also particularly difficult to formalize and systematize. A computational automation of the discovery of motivic patterns cannot be reduced to a mere extraction of all possible sequences of descriptions... The systematic approach inexorably leads to a proliferation of redundant structures that needs to be addressed properly. Global filtering techniques cause a drastic elimination of interesting structures that damages the quality of the analysis. On the other hand, a selection of closed patterns allows...
This article presents some fundamental techniques for evaluating human performance and equipment-related events that are in use at the Krsko NPP. Before the large industrial accidents, the human factor was considered very reliable and was not accepted as a possible source of errors. Today it is evident that safety is a proper combination of factors associated with people, technology and organization. Determining the cause of equipment failures is a much more enjoyable exercise than doing the same for human errors. People are emotional: they can be angry, scared, defensive, distrustful. Because of all that, determining the causes of human errors is much more difficult. In many cases the definition of human factors relates to operators as the source of the human errors. Such an approach restricts the search for the true root cause of an event. In reality the human factor is associated with operators as well as with managers, designers, instructors, maintenance people etc. Operating experience and in-depth analysis, with the resulting lessons learnt, are all evidence of the relevance of human errors for safety. The nuclear power plant industry has estimated the risk due to human errors at close to 70%. It is therefore obvious that sophisticated techniques are needed to focus on human errors. Root cause analysis in NPP Krsko is based on the following methods: Event and Causal Factor Charting, Change Analysis, Barrier Analysis, MORT (Management Oversight and Risk Tree Analysis) and Human Performance Evaluation. Event and Causal Factor Charting is used for the investigation of complex problems that need to be visualized in the form of a chart so as to provide a better understanding of the chronology of an event. Change Analysis is usually used for a particular problem with an equipment failure, using the key questions what?, when?, where?, who? and how? to find a final answer to the question WHY something happened. Barrier Analysis is used for procedural and
Towards Cognitive Component Analysis
Hansen, Lars Kai; Ahrendt, Peter; Larsen, Jan
2005-01-01
Cognitive component analysis (COCA) is here defined as the process of unsupervised grouping of data such that the ensuing group structure is well-aligned with that resulting from human cognitive activity. We have earlier demonstrated that independent component analysis is relevant for representing semantics, not only in text, but also in dynamic text (chat), images, and combinations of text and images. Here we further expand on the relevance of the ICA model for representing context, including two new analyses of abstract data: social networks and musical features.
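The ICA model referenced in this abstract can be illustrated with a short NumPy sketch. This is a generic FastICA-style demonstration under stated assumptions (two synthetic uniform sources, an arbitrary mixing matrix, a tanh nonlinearity), not the COCA pipeline or the authors' code:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
# Two independent non-Gaussian (uniform) sources, linearly mixed.
S = rng.uniform(-1, 1, size=(2, n))
A = np.array([[1.0, 0.6], [0.4, 1.0]])   # arbitrary mixing matrix
X = A @ S

# Whiten the observations (zero mean, identity covariance).
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(X @ X.T / n)
Z = E @ np.diag(d ** -0.5) @ E.T @ X

# FastICA fixed-point iteration with deflation and tanh nonlinearity.
W = np.zeros((2, 2))
for i in range(2):
    w = rng.normal(size=2)
    w /= np.linalg.norm(w)
    for _ in range(200):
        wx = w @ Z
        g, gp = np.tanh(wx), 1 - np.tanh(wx) ** 2
        w_new = (Z * g).mean(axis=1) - gp.mean() * w
        # Deflation: orthogonalize against already-found components.
        w_new -= W[:i].T @ (W[:i] @ w_new)
        w_new /= np.linalg.norm(w_new)
        converged = abs(abs(w_new @ w) - 1) < 1e-10
        w = w_new
        if converged:
            break
    W[i] = w

S_hat = W @ Z   # recovered sources (up to sign and permutation)
```

Each recovered component should be strongly correlated with one of the true sources, up to the sign and ordering ambiguity inherent in ICA.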
Peng, Chao-Ying Joanne
2008-01-01
"Peng provides an excellent overview of data analysis using the powerful statistical software package SAS. This book is quite appropriate as a self-paced tutorial for researchers, as well as a textbook or supplemental workbook for data analysis courses such as statistics or research methods. Peng provides detailed coverage of SAS capabilities using step-by-step procedures and includes numerous comprehensive graphics and figures, as well as SAS printouts. Readers do not need a background in computer science or programming. Includes numerous examples in education, health sciences, and business.
Associative Analysis in Statistics
Mihaela Muntean
2015-03-01
In recent years, interest in technologies such as in-memory analytics and associative search has increased. This paper explores how in-memory analytics and an associative model can be used in statistics. The word “associative” puts the emphasis on understanding how datasets relate to one another. The paper presents the main characteristics of the associative data model and shows how to design an associative model for the analysis of labor market indicators, with the EU Labor Force Survey as the source, and how to carry out associative analysis.
Environmental analysis support
Miller, R.L.
1996-06-01
Activities in environmental analysis support included assistance to the Morgantown and Pittsburgh Energy Technology Centers (METC and PETC) in reviewing and preparing documents required by the National Environmental Policy Act (NEPA) for projects selected for the Clean Coal Technology (CCT) Program. An important activity was the preparation for METC of a final Environmental Assessment (EA) for the proposed Externally Fired Combined Cycle (EFCC) Project in Warren, Pennsylvania. In addition, a post-project environmental analysis was prepared for PETC to evaluate the Demonstration of Advanced Combustion Techniques for a Tangentially-Fired Boiler in Lynn Haven, Florida.
The accident at Three Mile Island Unit 2 (TMI-2) provides an opportunity to benchmark severe accident analysis methods against full-scale, integrated facility data. In collaboration with the U.S. Department of Energy (DOE), the OECD Nuclear Energy Agency established a joint task group to analyse various periods of the accident and benchmark the relevant severe accident codes. In this paper the author presents one result from the TMI-2 analysis exercise that may be of interest in evaluating thermal hydraulics codes. (author). 11 figs., 3 refs
Analysis of maintenance strategies
The main topics of the presentation include: (1) an analysis model and methods to evaluate maintenance action programs and to support decisions to change them, and (2) an understanding of maintenance strategies in a systems perspective as a basis for future developments. The subproject showed how systematic models for maintenance analysis and decision support, utilising computerised and statistical tool packages, can be taken into use for the evaluation and optimisation of maintenance of active systems from the safety and economic points of view
Abrahams, J R; Hiller, N
1965-01-01
Signal Flow Analysis provides information pertinent to the fundamental aspects of signal flow analysis. This book discusses the basic theory of signal flow graphs and shows their relation to the usual algebraic equations. Organized into seven chapters, this book begins with an overview of properties of a flow graph. This text then demonstrates how flow graphs can be applied to a wide range of electrical circuits that do not involve amplification. Other chapters deal with the parameters as well as circuit applications of transistors. This book discusses as well the variety of circuits using ther
Hoffman, Kenneth
2007-01-01
Developed for an introductory course in mathematical analysis at MIT, this text focuses on concepts, principles, and methods. Its introductions to real and complex analysis are closely formulated, and they constitute a natural introduction to complex function theory. Starting with an overview of the real number system, the text presents results for subsets and functions related to Euclidean space of n dimensions. It offers a rigorous review of the fundamentals of calculus, emphasizing power series expansions and introducing the theory of complex-analytic functions. Subsequent chapters cover seq
Z. Ceylan
1998-04-28
The purpose of this analysis is to determine the structural response of the 21 pressurized water reactor (PWR) uncanistered fuel (UCF) waste package (WP) to a tipover design basis event (DBE) dynamic load; the results will be reported in terms of stress magnitudes. The finite-element solution was obtained using the commercially available ANSYS finite-element code. A finite-element model of the waste package was developed and analyzed for a tipover DBE dynamic load. The results of this analysis were provided in tables and were also plotted as maximum stress contours to determine their locations.
Introduction to complex analysis
Priestley, H A
2003-01-01
Complex analysis is a classic and central area of mathematics, which is studied and exploited in a range of important fields, from number theory to engineering. Introduction to Complex Analysis was first published in 1985, and for this much awaited second edition the text has been considerably expanded, while retaining the style of the original. More detailed presentation is given of elementary topics, to reflect the knowledge base of current students. Exercise sets have been substantially revised and enlarged, with carefully graded exercises at the end of each chapter. This is the latest additi
Schramm, Michael J
2008-01-01
This text forms a bridge between courses in calculus and real analysis. It focuses on the construction of mathematical proofs as well as their final content. Suitable for upper-level undergraduates and graduate students of real analysis, it also provides a vital reference book for advanced courses in mathematics. The four-part treatment begins with an introduction to basic logical structures and techniques of proof, including discussions of the cardinality concept and the algebraic and order structures of the real and rational number systems. Part Two presents in-depth examinations of the compl
Kishore, K Lal
2008-01-01
The second edition of Electronic Circuit Analysis is brought out with certain new topics and a reorganization of the text matter into eight units. With the addition of new topics, the syllabi of many universities in this subject can be covered. Besides this, the book can also meet the requirements of M.Sc (Electronics), AMIETE and AMIE (Electronics) courses. The text matter has been improved thoroughly. New topics include frequency effects in multistage amplifiers, amplifier circuit analysis, design of high frequency amplifiers, switching regulators, voltage multipliers, Uninterrupted Power Supplies (UPS), and Switchi
Harris, D. W.
1972-01-01
The radiation interface in spacecraft using radioisotope thermoelectric generators (RTGs) is studied. A Monte Carlo analysis of the radiation field that includes scattered radiation effects produced neutron and gamma photon isoflux contours as functions of distance from the RTG center line. It is shown that the photon flux is significantly depressed in the RTG axial direction because of self-shielding. Total flux values are determined by converting the uncollided flux values into an equivalent RTG surface source and then performing a Monte Carlo analysis for each specific dose point. Energy distributions of the particle spectra completely define the radiation interface for a spacecraft model.
Greenberg, Marc W.; Laing, William
2013-01-01
An Economic Analysis (EA) is a systematic approach to the problem of choosing the best method of allocating scarce resources to achieve a given objective. An EA helps guide decisions on the "worth" of pursuing an action that departs from status quo ... an EA is the crux of decision-support.
Learning Haskell data analysis
Church, James
2015-01-01
If you are a developer, analyst, or data scientist who wants to learn data analysis methods using Haskell and its libraries, then this book is for you. Prior experience with Haskell and a basic knowledge of data science will be beneficial.
Proteoglycan isolation and analysis
Woods, A; Couchman, J R
Proteoglycans can be difficult molecules to isolate and analyze due to large mass, charge, and tendency to aggregate or form macromolecular complexes. This unit describes detailed methods for purification of matrix, cell surface, and cytoskeleton-linked proteoglycans. Methods for analysis of...
Douglas, David M.
2016-01-01
Doxing is the intentional public release onto the Internet of personal information about an individual by a third party, often with the intent to humiliate, threaten, intimidate, or punish the identified individual. In this paper I present a conceptual analysis of the practice of doxing and how it d
This document presents the results from the analysis of the shot loading trainer (SLT). This device will be used to test the procedure for installing shot into the annulus of the Project W-320 shipping container. To ensure that the shot is installed uniformly around the container, vibrators will be used to settle the shot. The SLT was analyzed to ensure that it would not jeopardize worker safety during operation. The results from the static analysis of the SLT under deadweight and vibrator operating loads show that the stresses in the SLT are below code allowables. The results from the modal analysis show that the natural frequencies of the SLT are far below the operating frequencies of the vibrators, provided the SLT is mounted on pneumatic tires. The SLT was also analyzed for wind, seismic, deadweight, and moving/transporting loads. Analysis of the SLT is in accordance with SDC-4.1 for safety class 3 structures (DOE-RL 1993) and the American Institute of Steel Construction (AISC) Manual of Steel Construction (AISC 1989)
Ris Hansen, Inge; Søgaard, Karen; Gram, Bibi;
2015-01-01
This is the analysis plan for the multicentre randomised controlled study looking at the effect of training and exercises in chronic neck pain patients that is being conducted in Jutland and Funen, Denmark. This plan will be used as a work description for the analyses of the data collected.
Advanced biomedical image analysis
Haidekker, Mark A
2010-01-01
"This book covers the four major areas of image processing: Image enhancement and restoration, image segmentation, image quantification and classification, and image visualization. Image registration, storage, and compression are also covered. The text focuses on recently developed image processing and analysis operators and covers topical research"--Provided by publisher.
The conventional Risk Analysis (RA) usually relates a certain undesired event frequency to its consequences. Such a technique is used nowadays in Brazil to analyze accidents and their consequences strictly under the human approach, valuing loss of human equipment, human structures and human lives, without considering the damage caused to the natural resources that keep life possible on Earth. This paradigm developed primarily because of Homo sapiens' lack of perception of the natural web needed to sustain his own life. In reality, the Brazilian professionals responsible today for licensing, auditing and inspecting environmental aspects of human activities face huge difficulties in writing technical specifications and procedures leading to acceptable levels of impact, all the more so given the intrinsic difficulties in defining those levels. Therefore, in Brazil the RA technique is a weak tool for licensing for many reasons, among them its narrow scope (accident considerations only) and a wrong paradigm (direct human damages only). A paper from the author on the former was already proposed to the 7th International Conference on Environmetrics, July 1996, USP-SP. This one discusses the extension of the risk analysis concept to take into account environmental consequences, transforming the conventional analysis into a broader methodology named here Environmental Risk Analysis. (author)
Multiscale principal component analysis
Principal component analysis (PCA) is an important tool in exploring data. The conventional approach to PCA leads to a solution which favours the structures with large variances. This is sensitive to outliers and could obfuscate interesting underlying structures. One of the equivalent definitions of PCA is that it seeks the subspaces that maximize the sum of squared pairwise distances between data projections. This definition opens up more flexibility in the analysis of principal components which is useful in enhancing PCA. In this paper we introduce scales into PCA by maximizing only the sum of pairwise distances between projections for pairs of datapoints with distances within a chosen interval of values [l,u]. The resulting principal component decompositions in Multiscale PCA depend on the point (l,u) in the plane, and for each point we define projectors onto principal components. Cluster analysis of these projectors reveals the structures in the data at various scales. Each structure is described by the eigenvectors at the medoid point of the cluster which represents the structure. We also use the distortion of projections as a criterion for choosing an appropriate scale, especially for data with outliers. This method was tested on both artificially generated and real data. For data with multiscale structures, the method was able to reveal the different structures of the data and also to reduce the effect of outliers in the principal component analysis
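The pairwise-distance definition of PCA quoted in this abstract can be checked numerically. The sketch below (assumptions: synthetic 2-D data and plain NumPy, not the paper's code) verifies that the leading covariance eigenvector also maximizes the sum of squared pairwise distances between 1-D projections; restricting the pair set to distances inside a window [l, u] would give the multiscale variant the abstract describes:

```python
import numpy as np

rng = np.random.default_rng(0)
# Anisotropic 2-D cloud: variance along x much larger than along y.
X = rng.normal(size=(200, 2)) * np.array([3.0, 0.5])
X = X - X.mean(axis=0)

# Classical PCA: leading eigenvector of the covariance matrix.
cov = X.T @ X / len(X)
eigvals, eigvecs = np.linalg.eigh(cov)   # ascending eigenvalues
pc1 = eigvecs[:, -1]                     # top principal direction

def sum_sq_pairwise(proj):
    """Sum of squared pairwise distances between 1-D projections."""
    d = proj[:, None] - proj[None, :]
    return (d ** 2).sum() / 2

# The PC1 projection maximizes the pairwise-distance criterion,
# here compared against a few random unit directions.
best = sum_sq_pairwise(X @ pc1)
for _ in range(20):
    v = rng.normal(size=2)
    v /= np.linalg.norm(v)
    assert sum_sq_pairwise(X @ v) <= best + 1e-9
```

For centered data the criterion equals n times the projected variance, which is why the two definitions of PCA coincide.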
Bayesian Independent Component Analysis
Winther, Ole; Petersen, Kaare Brandt
2007-01-01
In this paper we present an empirical Bayesian framework for independent component analysis. The framework provides estimates of the sources, the mixing matrix and the noise parameters, and is flexible with respect to choice of source prior and the number of sources and sensors. Inside the engine...
Szapacs, Cindy
2006-01-01
Teaching strategies that work for typically developing children often do not work for those diagnosed with an autism spectrum disorder. However, teaching strategies that work for children with autism do work for typically developing children. In this article, the author explains how the principles and concepts of Applied Behavior Analysis can be…
Radiochemical analysis. Chapter 6
A brief description is presented of sample selection and preparation, of radiochemical analysis principles, and of the determinations of tritium, phosphorus, strontium, zirconium, niobium, ruthenium, iodine, cesium, barium, cerium, polonium, radium, thorium, uranium, plutonium, and of rare earths. The occurrence of radionuclides in the environment and the prospects of radiochemical method applications are briefly described. (J.P.)
Conti, Roberto; Hong, Jeong Hee; Szymanski, Wojciech
2012-01-01
of such an algebra. Then we outline a powerful combinatorial approach to analysis of endomorphisms arising from permutation unitaries. The restricted Weyl group consists of automorphisms of this type. We also discuss the action of the restricted Weyl group on the diagonal MASA and its relationship...
Idris, Ivan
2014-01-01
This book is for programmers, scientists, and engineers who have knowledge of the Python language and know the basics of data science. It is for those who wish to learn different data analysis methods using Python and its libraries. This book contains all the basic ingredients you need to become an expert data analyst.
Communication Network Analysis Methods.
Farace, Richard V.; Mabee, Timothy
This paper reviews a variety of analytic procedures that can be applied to network data, discussing the assumptions and usefulness of each procedure when applied to the complexity of human communication. Special attention is paid to the network properties measured or implied by each procedure. Factor analysis and multidimensional scaling are among…
Silverman, Richard A
1984-01-01
A shorter version of A. I. Markushevich's masterly three-volume Theory of Functions of a Complex Variable, this edition is appropriate for advanced undergraduate and graduate courses in complex analysis. Numerous worked-out examples and more than 300 problems, some with hints and answers, make it suitable for independent study. 1967 edition.
Antonucci, F; Cavalleri, A; Congedo, G [Dipartimento di Fisica, Universita di Trento and INFN, Gruppo Collegato di Trento, 38123 Povo, Trento (Italy); Armano, M [European Space Astronomy Centre, European Space Agency, Villanueva de la Canada, 28692 Madrid (Spain); Audley, H; Bogenstahl, J; Danzmann, K [Albert-Einstein-Institut, Max-Planck-Institut fuer Gravitationsphysik und Universitaet Hannover, 30167 Hannover (Germany); Auger, G; Binetruy, P [APC UMR7164, Universite Paris Diderot, Paris (France); Benedetti, M [Dipartimento di Ingegneria dei Materiali e Tecnologie Industriali, Universita di Trento and INFN, Gruppo Collegato di Trento, Mesiano, Trento (Italy); Boatella, C [CNES, DCT/AQ/EC, 18 Avenue Edouard Belin, 31401 Toulouse, Cedex9 (France); Bortoluzzi, D; Bosetti, P; Cristofolini, I [Dipartimento di Ingegneria Meccanica e Strutturale, Universita di Trento and INFN, Gruppo Collegato di Trento, Mesiano, Trento (Italy); Caleno, M; Cesa, M [European Space Technology Centre, European Space Agency, Keplerlaan 1, 2200 AG Noordwijk (Netherlands); Chmeissani, M [IFAE, Universitat Autonoma de Barcelona, E-08193 Bellaterra (Barcelona) (Spain); Ciani, G [Department of Physics, University of Florida, Gainesville, FL 32611-8440 (United States); Conchillo, A [ICE-CSIC/IEEC, Facultat de Ciencies, E-08193 Bellaterra (Barcelona) (Spain); Cruise, M, E-mail: martin.hewitson@aei.mpg.de [Department of Physics and Astronomy, University of Birmingham, Birmingham (United Kingdom)
2011-05-07
As the launch of LISA Pathfinder (LPF) draws near, more and more effort is being put in to the preparation of the data analysis activities that will be carried out during the mission operations. The operations phase of the mission will be composed of a series of experiments that will be carried out on the satellite. These experiments will be directed and analysed by the data analysis team, which is part of the operations team. The operations phase will last about 90 days, during which time the data analysis team aims to fully characterize the LPF, and in particular, its core instrument the LISA Technology Package. By analysing the various couplings present in the system, the different noise sources that will disturb the system, and through the identification of the key physical parameters of the system, a detailed noise budget of the instrument will be constructed that will allow the performance of the different subsystems to be assessed and projected towards LISA. This paper describes the various aspects of the full data analysis chain that are needed to successfully characterize the LPF and build up the noise budget during mission operations.
Braun, W. John
2012-01-01
The Analysis of Variance is often taught in introductory statistics courses, but it is not clear that students really understand the method. This is because the derivation of the test statistic and p-value requires a relatively sophisticated mathematical background which may not be well-remembered or understood. Thus, the essential concept behind…
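The test statistic whose derivation this abstract says students find difficult can be computed directly from its definition. A minimal sketch (the toy data and function name are illustrative, not from the article):

```python
import numpy as np

def one_way_anova_F(*groups):
    """One-way ANOVA F statistic from its definition:
    mean between-group variability over mean within-group variability."""
    all_data = np.concatenate([np.asarray(g, dtype=float) for g in groups])
    grand_mean = all_data.mean()
    k, n = len(groups), len(all_data)
    # Sum of squares between groups (each group mean vs grand mean).
    ss_between = sum(len(g) * (np.mean(g) - grand_mean) ** 2 for g in groups)
    # Sum of squares within groups (each point vs its group mean).
    ss_within = sum(((np.asarray(g) - np.mean(g)) ** 2).sum() for g in groups)
    ms_between = ss_between / (k - 1)   # df between = k - 1
    ms_within = ss_within / (n - k)     # df within  = n - k
    return ms_between / ms_within

F = one_way_anova_F([1, 2, 3], [2, 3, 4], [3, 4, 5])
# F == 3.0 for this toy data: ss_between = 6 (df 2), ss_within = 6 (df 6)
```

The p-value is then the upper tail of the F(k-1, n-k) distribution at this value.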
Computer aided safety analysis
The document reproduces 20 selected papers from the 38 papers presented at the Technical Committee/Workshop on Computer Aided Safety Analysis organized by the IAEA in co-operation with the Institute of Atomic Energy in Otwock-Swierk, Poland on 25-29 May 1987. A separate abstract was prepared for each of these 20 technical papers. Refs, figs and tabs
Senegal : Country Environmental Analysis
World Bank
2008-01-01
The main objective of the Senegal Country Environmental Analysis (CEA) is to reinforce the ongoing dialogue on environmental issues between the World Bank and the Government of Senegal. The CEA also aims to support the ongoing Government implementation of a strategic results-based planning process at the Environment Ministry (MEPNBRLA). The main goal is to enable Senegal to have the necess...
Isaacson, Eugene
1994-01-01
This excellent text for advanced undergraduates and graduate students covers norms, numerical solution of linear systems and matrix factoring, iterative solutions of nonlinear equations, eigenvalues and eigenvectors, polynomial approximation, and other topics. It offers a careful analysis and stresses techniques for developing new methods, plus many examples and problems. 1966 edition.
To impart hands-on training in physics analysis, the CMS experiment initiated the concept of the CMS Data Analysis School (CMSDAS). It was born over three years ago at the LPC (LHC Physics Centre), Fermilab, and is based on earlier workshops held at the LPC and the CLEO experiment. As CMS transitioned from construction to data-taking mode, the nature of the training also evolved to include more analysis tools, software tutorials and physics analysis. This effort, epitomized as CMSDAS, has proven to be key for new and young physicists to jump-start their work and contribute to the physics goals of CMS by looking for new physics with the collision data. With over 400 physicists trained in six CMSDAS events around the globe, CMS is engaging the collaboration in its discovery potential and maximizing physics output. As a bigger goal, CMS is striving to nurture and increase the engagement of its myriad talents in the development of physics, service, upgrades, the education of those new to CMS, and the career development of younger members. An extension of the concept to dedicated software and hardware schools is also planned, keeping in mind the ensuing upgrade phase.
Kubista, Mikael; Rusňáková, Vendula; Švec, David; Sjögreen, B.; Tichopád, Aleš
Essex: Caister Academic Press, 2012 - (Filion, M.), s. 63-84 ISBN 978-1-908230-01-0 Institutional research plan: CEZ:AV0Z50520701 Keywords : qPCR data analysis * real-time PCR * GenEx Subject RIV: EB - Genetics ; Molecular Biology www.caister.com
Shifted Independent Component Analysis
Mørup, Morten; Madsen, Kristoffer Hougaard; Hansen, Lars Kai
2007-01-01
Delayed mixing is a problem of theoretical interest and practical importance, e.g., in speech processing, bio-medical signal analysis and financial data modelling. Most previous analyses have been based on models with integer shifts, i.e., shifts by a number of samples, and have often been carried...
Inverse correspondence analysis
Groenen, PJF; van de Velden, M
2004-01-01
In correspondence analysis (CA), rows and columns of a data matrix are depicted as points in low-dimensional space. The row and column profiles are approximated by minimizing the so-called weighted chi-squared distance between the original profiles and their approximations, see for example, [Theory
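The weighted chi-squared approximation described in this abstract is usually computed via an SVD of the standardized residuals. A minimal NumPy sketch (the toy contingency table and function name are illustrative, and this is plain CA, not the inverse-CA method of the paper):

```python
import numpy as np

def correspondence_analysis(N):
    """Row and column principal coordinates of a contingency table N,
    via SVD of the standardized residuals (the usual CA computation)."""
    P = N / N.sum()                       # correspondence matrix
    r, c = P.sum(axis=1), P.sum(axis=0)   # row and column masses
    # Chi-squared-metric residuals from the independence model r c^T.
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
    U, sv, Vt = np.linalg.svd(S, full_matrices=False)
    rows = U * sv / np.sqrt(r)[:, None]      # row principal coordinates
    cols = Vt.T * sv / np.sqrt(c)[:, None]   # column principal coordinates
    return rows, cols, sv

# Toy 2x2 table with a strong diagonal association.
N = np.array([[30.0, 10.0], [10.0, 30.0]])
rows, cols, sv = correspondence_analysis(N)
# Total inertia (sum of squared singular values) equals chi-squared / n.
```

The squared singular values are the principal inertias; minimizing the weighted chi-squared distance to a low-rank approximation amounts to truncating this SVD.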
Statistical fluctuations of reactivity represent reactor noise. Analysis of reactor noise enables the determination of a series of reactor kinetic parameters. Power fluctuations were measured by an ionization chamber placed next to the tank of the RB reactor. The signal was digitized by an analog-to-digital converter. After calculation of the mean power, the 3000 data points obtained by sampling were analysed
Operando (micro) XAFS analysis
Arčon, Iztok; Dominko, Robert; Vogel-Mikuš, Katarina
2016-01-01
In the talk the principles of XAS methods were presented with practical examples which illustrate the possibilities and advanced approaches for their use in the structural analysis of different types of materials. The emphasis was on the use of XAS spectroscopy in operando mode and in combination with X-ray microscopy.
This paper presents a short review of the parallel safety analysis of various types of NPPs. NPPs with PWR, WWER, BWR and HWR type reactors are covered. Technical, economic, siting and ecological aspects of NPP safety are analysed. (author)
Russian Language Analysis Project
Serianni, Barbara; Rethwisch, Carolyn
2011-01-01
This paper is the result of a language analysis research project focused on the Russian Language. The study included a diverse literature review that included published materials as well as online sources in addition to an interview with a native Russian speaker residing in the United States. Areas of study include the origin and history of the…
Atlas Distributed Analysis Tools
de La Hoz, Santiago Gonzalez; Ruiz, Luis March; Liko, Dietrich
2008-06-01
The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale. Up to 10000 jobs were processed in one day. The experiences obtained operating the system on several grid flavours were essential to performing a user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase data was registered in the LHC File Catalog (LFC) and replicated in external sites. For the main test, few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS management computing board decided to integrate the collaboration efforts in distributed analysis in only one project, GANGA. The goal is to test the reconstruction and analysis software in a large scale data production using grid flavours in several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting and merging, and includes automated job monitoring and output retrieval.
ATLAS Distributed Analysis Tools
Gonzalez de la Hoz, Santiago; Liko, Dietrich
2008-01-01
The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale. Up to 10000 jobs were processed in one day. The experiences obtained operating the system on several grid flavours were essential to performing a user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase data was registered in the LHC File Catalog (LFC) and replicated in external sites. For the main test, few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS management computing board decided to integrate the collaboration efforts in distributed analysis in only one project, GANGA. The goal is to test the reconstruction and analysis software in a large scale data production using grid flavours in several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting a...
Safeguards system analysis, (1)
A system analysis of an implemented safeguards system based on traditional materials accountancy was performed. This report describes the verification methods applied to operators' measurement data, the MUF evaluation method, theories on deciding PIT frequency, and the design of inspection plans. (author)
Louis Kaplow; Steven Shavell
1999-01-01
This is a survey of the field of economic analysis of law, focusing on the work of economists. The survey covers the three central areas of civil law liability for accidents (tort law), property law, and contracts as well as the litigation process and public enforcement of law.
Shah, Anwar
2005-01-01
This book provides tools of analysis for discovering equity in tax burdens as well as in public spending and judging government performance in its role in safeguarding the interests of the poor and those otherwise disadvantaged members of society, such as women, children, and minorities. The book further provides a framework for a rights-based approach to citizen empowerment-in other words, ...
Polysome Profile Analysis - Yeast
Pospíšek, M.; Valášek, Leoš
2013-01-01
Roč. 530, č. 2013 (2013), s. 173-181. ISSN 0076-6879 Institutional support: RVO:61388971 Keywords : grow yeast cultures * polysome profile analysis * sucrose density gradient centrifugation Subject RIV: CE - Biochemistry Impact factor: 2.194, year: 2013
Kolmogorov, A N; Silverman, Richard A
1975-01-01
Self-contained and comprehensive, this elementary introduction to real and functional analysis is readily accessible to those with background in advanced calculus. It covers basic concepts and introductory principles in set theory, metric spaces, topological and linear spaces, linear functionals and linear operators, and much more. 350 problems. 1970 edition.
Information Security Risk Analysis
Peltier, Thomas R
2010-01-01
Offers readers the knowledge and skill set needed to achieve a highly effective risk analysis assessment. This title demonstrates how to identify threats and then determine whether those threats pose a real risk. It is suitable for industry and academia professionals.
Doaa Mohey El-Din
2015-09-01
Full Text Available Sentiment analysis, or opinion mining, is used to automate the detection of subjective information such as opinions, attitudes, emotions, and feelings. Hundreds of thousands of people engaged in scientific research spend a long time selecting suitable papers for their work, and online reviews of papers are an essential source of help: reviews save reading time and the cost of obtaining papers. This paper proposes a new technique for analyzing online reviews, called sentiment analysis of online papers (SAOOP). SAOOP is a new technique for enhancing the bag-of-words model, improving accuracy and performance. SAOOP is useful in increasing the understanding rate of review sentences through broader language coverage. SAOOP introduces solutions for several sentiment analysis challenges and uses them to achieve higher accuracy. This paper also presents a measure of topic-domain attributes, which provides a ranking of the overall judgment of each text review for assessing and comparing results across different sentiment techniques for a given text review. Finally, the efficiency of the proposed approach is shown by comparing the proposed technique with two sentiment analysis techniques in terms of accuracy, performance, and sentence understanding rate.
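As a point of reference for the bag-of-words model that SAOOP builds on, here is a minimal lexicon-based bag-of-words sentiment scorer. The word lists and the function are illustrative assumptions; this does not reproduce SAOOP or its enhancements.

```python
# Minimal lexicon-based bag-of-words sentiment scorer (illustration of
# the general bag-of-words idea only; NOT the SAOOP technique).
from collections import Counter

POSITIVE = {"good", "excellent", "clear", "novel", "useful"}
NEGATIVE = {"bad", "poor", "unclear", "weak", "confusing"}

def sentiment(review: str) -> str:
    words = Counter(review.lower().split())          # bag of words: counts only
    score = (sum(words[w] for w in POSITIVE)
             - sum(words[w] for w in NEGATIVE))
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("a clear and useful paper"))        # positive
print(sentiment("the evaluation section is weak"))  # negative
```

A plain bag-of-words scorer like this ignores word order and negation ("not useful" scores positive), which is exactly the kind of limitation that motivates enhancements such as those the paper proposes.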
Elementary functional analysis
Shilov, Georgi E
1996-01-01
Introductory text covers basic structures of mathematical analysis (linear spaces, metric spaces, normed linear spaces, etc.), differential equations, orthogonal expansions, Fourier transforms - including problems in the complex domain, especially involving the Laplace transform - and more. Each chapter includes a set of problems, with hints and answers. Bibliography. 1974 edition.
On frame multiresolution analysis
Christensen, Ole
We use the freedom in frame multiresolution analysis to construct tight wavelet frames (even in the case where the refinable function does not generate a tight frame). In cases where a frame multiresolution does not lead to a construction of a wavelet frame we show how one can nevertheless construct a wavelet frame with two generators.
Wood, P. J.; Gower, D. B.
This chapter covers the analysis of steroids with progesterone-like activity, classified as “progestagens”. Steroids in this group include the naturally occurring C21 steroids, progesterone (4-pregnene-3,20-dione) and its metabolites, together with synthetic steroids, such as norgestrel norethisterone (NE), and medroxyprogesterone acetate which also have progestational activity.
In some cases, control rod worth efficiencies evaluated by inverse point kinetics from out-of-core detector currents differ remarkably from direct calculations. An explanation of this effect is given and is supported by the analysis of some WWER-440 rod drop experiments. (Authors)
Bayesian logistic regression analysis
Van Erp, H.R.N.; Van Gelder, P.H.A.J.M.
2012-01-01
In this paper we present a Bayesian logistic regression analysis. It is found that if one wishes to derive the posterior distribution of the probability of some event, then, together with the traditional Bayes Theorem and the integrating out of nuisance parameters, the Jacobian transformation is an
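To illustrate the general Bayesian machinery mentioned here (the posterior distribution of an event probability), a grid-approximation sketch follows. This is a generic textbook example under a uniform prior; it does not reproduce the paper's Jacobian-transformation derivation, and the function name is hypothetical.

```python
# Grid approximation of the posterior for an event probability p given
# k successes in n Bernoulli trials, under a uniform prior. Generic
# Bayesian illustration; NOT the derivation from the paper above.
def posterior_mean(k, n, grid_size=10000):
    grid = [(i + 0.5) / grid_size for i in range(grid_size)]   # midpoints in (0, 1)
    like = [p**k * (1 - p)**(n - k) for p in grid]             # likelihood x flat prior
    norm = sum(like)                                           # normalizing constant
    return sum(p * l for p, l in zip(grid, like)) / norm

# With a uniform prior the exact posterior is Beta(k+1, n-k+1), whose
# mean is (k+1)/(n+2): here (7+1)/(10+2) = 2/3.
print(round(posterior_mean(7, 10), 3))  # 0.667
```

The conjugate Beta result gives an exact check on the grid approximation, which is why the example uses this simple model.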
Kane, Jonathan M
2016-01-01
This is a textbook on proof writing in the area of analysis, balancing a survey of the core concepts of mathematical proof with a tight, rigorous examination of the specific tools needed for an understanding of analysis. Instead of the standard "transition" approach to teaching proofs, wherein students are taught fundamentals of logic, given some common proof strategies such as mathematical induction, and presented with a series of well-written proofs to mimic, this textbook teaches what a student needs to be thinking about when trying to construct a proof. Covering the fundamentals of analysis sufficient for a typical beginning Real Analysis course, it never loses sight of the fact that its primary focus is about proof writing skills. This book aims to give the student precise training in the writing of proofs by explaining exactly what elements make up a correct proof, how one goes about constructing an acceptable proof, and, by learning to recognize a correct proof, how to avoid writing incorrect proofs. T...
Dumais, Susan T.
2004-01-01
Presents a literature review that covers the following topics related to Latent Semantic Analysis (LSA): (1) LSA overview; (2) applications of LSA, including information retrieval (IR), information filtering, cross-language retrieval, and other IR-related LSA applications; (3) modeling human memory, including the relationship of LSA to other…
Introduction to Survival Analysis
Valenta, Zdeněk
Brno: Masarykova Univerzita, 2013 - (Pavlík, T.; Májek, O.), s. 44-56 ISBN 978-80-210-6305-1. [Summer School on Computational Biology /9./. Svratka (CZ), 10.09.2013-13.09.2013] Institutional support: RVO:67985807 Keywords : survival analysis * time-to-event data * censoring process * hazard function * survival time Subject RIV: IN - Informatics, Computer Science
Developing Word Analysis Skills.
Heilman, Arthur W.
The importance of word analysis skills to reading ability is discussed, and methodologies for teaching such skills are examined. It is stated that a child cannot become proficient in reading if he does not master the skill of associating printed letter symbols with the sounds they represent. Instructional procedures which augment the alphabet with…
This document is one in a series of publications known as the ETDE/INIS Joint Reference Series and also constitutes a part of the ETDE Procedures Manual. It presents the rules, guidelines and procedures to be adopted by centers submitting input to the International Nuclear Information System (INIS) or the Energy Technology Data Exchange (ETDE). It is a manual for the subject analysis part of input preparation, meaning the selection, subject classification, abstracting and subject indexing of relevant publications, and is to be used in conjunction with the Thesauruses, Subject Categories documents and the documents providing guidelines for the preparation of abstracts. The concept and structure of the new manual are intended to describe in a logical and efficient sequence all the steps comprising the subject analysis of documents to be reported to INIS or ETDE. The manual includes new chapters on preparatory analysis, subject classification, abstracting and subject indexing, as well as rules, guidelines, procedures, examples and a special chapter on guidelines and examples for subject analysis in particular subject fields. (g.t.; a.n.)
Malik, S. [Nebraska U.; Shipsey, I. [Purdue U.; Cavanaugh, R. [Illinois U., Chicago; Bloom, K. [Nebraska U.; Chan, Kai-Feng [Taiwan, Natl. Taiwan U.; D' Hondt, J. [Vrije U., Brussels; Klima, B. [Fermilab; Narain, M. [Brown U.; Palla, F. [INFN, Pisa; Rolandi, G. [CERN; Schörner-Sadenius, T. [DESY
2014-01-01
To impart hands-on training in physics analysis, the CMS experiment initiated the concept of the CMS Data Analysis School (CMSDAS). It was born over three years ago at the LPC (LHC Physics Centre) at Fermilab and is based on earlier workshops held at the LPC and the CLEO experiment. As CMS transitioned from construction to data-taking mode, the nature of the earlier training also evolved to include more analysis tools, software tutorials and physics analysis. This effort, epitomized as CMSDAS, has proven to be key for new and young physicists to jump-start and contribute to the physics goals of CMS by looking for new physics with the collision data. With over 400 physicists trained in six CMSDAS events around the globe, CMS is trying to engage the collaboration in its discovery potential and maximize physics output. As a bigger goal, CMS is striving to nurture and increase engagement of its myriad talents in the development of physics, service, upgrades, the education of those new to CMS and the career development of younger members. An extension of the concept to dedicated software and hardware schools is also planned, keeping in mind the ensuing upgrade phase.
Martin, Vance S.
2009-01-01
There have been many attempts to understand how the Internet affects our modern world. There have also been numerous attempts to understand specific areas of the Internet. This article applies Immanuel Wallerstein's World Systems Analysis to our informationalist society. Understanding this world as divided among individual core, semi-periphery,…
NONE
1997-10-01
Improving safety in nuclear power stations is an important proposition. It is therefore important to carry out safety evaluation comprehensively and systematically, drawing on operational experience and new knowledge important for safety throughout the period of use, as well as before construction and the start of operation of nuclear power stations. This report describes the results of a safety analysis of "Fugen" carried out with reference to the newest technical knowledge. As a result, it was confirmed that the safety of "Fugen" is secured by its inherent safety and by the facilities designed for securing safety. The basic approach to the safety analysis, including the guidelines to be conformed to, is described. For abnormal transient changes in operation and for accidents, their definitions, the events to be evaluated, and the standards for judgement are reported, along with the matters taken into consideration in the analysis. The computation programs used for the analysis were REACT, HEATUP, LAYMON, FATRAC, SENHOR, LOTRAC, FLOOD and CONPOL. The analyses of abnormal transient changes in operation and of accidents are reported with respect to causes, countermeasures, protective functions and results. (K.I.)
Generalization of exergy analysis
Highlights: • The area of validity of standard exergy analysis is discussed carefully. • A generalization of exergy analysis is developed within classical irreversible thermodynamics. • The generalization is demonstrated on fuel cells, osmotic power plants and heat engines. • A rigorous method indicating where exactly in a device useful work is being lost is developed. • A general algorithm of thermodynamic optimization is formulated. - Abstract: Exergy analysis, which provides means of calculating efficiency losses in industrial devices, is reviewed, and the area of its validity is carefully discussed. Consequently, a generalization is proposed, which holds also beyond the area of applicability of exergy analysis. The generalization is formulated within the framework of classical irreversible thermodynamics, and interestingly, it leads to minimization of a functional different from entropy production. Fuel cells, osmotic power plants and heat engines are analyzed within the theory; in particular, the theory is demonstrated quantitatively on a toy model of solid oxide fuel cells. Finally, a new general algorithm of thermodynamic optimization is proposed
Haskell data analysis cookbook
Shukla, Nishant
2014-01-01
Step-by-step recipes filled with practical code samples and engaging examples demonstrate Haskell in practice, and then the concepts behind the code. This book shows functional developers and analysts how to leverage their existing knowledge of Haskell specifically for high-quality data analysis. A good understanding of data sets and functional programming is assumed.
SLOWPOKE: neutron activation analysis
Neutron activation analysis permits the non-destructive determination of trace elements in crude oil and its derivatives at high sensitivity (down to 10⁻⁹ g/g) and good precision. This article consists of a quick survey of the method, followed by an illustration based on the results of recent work at the SLOWPOKE reactor laboratory at the Ecole Polytechnique
Nielsen, S. Suzanne
Investigations in food science and technology, whether by the food industry, governmental agencies, or universities, often require determination of food composition and characteristics. Trends and demands of consumers, the food industry, and national and international regulations challenge food scientists as they work to monitor food composition and to ensure the quality and safety of the food supply. All food products require analysis as part of a quality management program throughout the development process (including raw ingredients), through production, and after a product is in the market. In addition, analysis is done of problem samples and competitor products. The characteristics of foods (i.e., chemical composition, physical properties, sensory properties) are used to answer specific questions for regulatory purposes and typical quality control. The nature of the sample and the specific reason for the analysis commonly dictate the choice of analytical methods. Speed, precision, accuracy, and ruggedness often are key factors in this choice. Validation of the method for the specific food matrix being analyzed is necessary to ensure usefulness of the method. Making an appropriate choice of the analytical technique for a specific application requires a good knowledge of the various techniques (Fig. 1.1). For example, your choice of method to determine the salt content of potato chips would be different if it is for nutrition labeling than for quality control. The success of any analytical method relies on the proper selection and preparation of the food sample, carefully performing the analysis, and doing the appropriate calculations and interpretation of the data. Methods of analysis developed and endorsed by several nonprofit scientific organizations allow for standardized comparisons of results between different laboratories and for evaluation of less standard procedures. Such official methods are critical in the analysis of foods, to ensure that they meet
Systems analysis - independent analysis and verification
DiPietro, J.P.; Skolnik, E.G.; Badin, J.S. [Energetics, Inc., Columbia, MD (United States)
1996-10-01
The Hydrogen Program of the U.S. Department of Energy (DOE) funds a portfolio of activities ranging from conceptual research to pilot plant testing. The long-term research projects support DOE's goal of a sustainable, domestically based energy system, and the development activities are focused on hydrogen-based energy systems that can be commercially viable in the near term. Energetics develops analytic products that enable the Hydrogen Program Manager to assess the potential for near- and long-term R&D activities to satisfy DOE and energy market criteria. This work is based on a pathway analysis methodology. The authors consider an energy component (e.g., hydrogen production from biomass gasification, or a hybrid hydrogen internal combustion engine (ICE) vehicle) within a complete energy system. The work involves close interaction with the principal investigators to ensure accurate representation of the component technology. Comparisons are made with the current cost and performance of fossil-based and alternative renewable energy systems, and sensitivity analyses are conducted to determine the effect of changes in cost and performance parameters on the projects' viability.
Critical Analysis of Multimodal Discourse
van Leeuwen, Theo
2013-01-01
This is an encyclopaedia article which defines the fields of critical discourse analysis and multimodality studies, argues that within critical discourse analysis more attention should be paid to multimodality, and within multimodality to critical analysis, and ends by reviewing a few examples of recent work in the critical analysis of multimodal discourse.
Exploratory Bi-Factor Analysis
Jennrich, Robert I.; Bentler, Peter M.
2011-01-01
Bi-factor analysis is a form of confirmatory factor analysis originally introduced by Holzinger. The bi-factor model has a general factor and a number of group factors. The purpose of this article is to introduce an exploratory form of bi-factor analysis. An advantage of using exploratory bi-factor analysis is that one need not provide a specific…
Regularized Generalized Canonical Correlation Analysis
Tenenhaus, Arthur; Tenenhaus, Michel
2011-01-01
Regularized generalized canonical correlation analysis (RGCCA) is a generalization of regularized canonical correlation analysis to three or more sets of variables. It constitutes a general framework for many multi-block data analysis methods. It combines the power of multi-block data analysis methods (maximization of well identified criteria) and…
Rider, William; Witkowski, Walt; Kamm, James R.; Wildey, Tim
2016-02-01
We introduce a new methodology for inferring the accuracy of computational simulations through the practice of solution verification. We demonstrate this methodology on examples from computational heat transfer, fluid dynamics and radiation transport. Our methodology is suited to both well- and ill-behaved sequences of simulations. Our approach to the analysis of these sequences of simulations incorporates expert judgment into the process directly via a flexible optimization framework, and the application of robust statistics. The expert judgment is systematically applied as constraints to the analysis, and together with the robust statistics guards against over-emphasis on anomalous analysis results. We have named our methodology Robust Verification. Our methodology is based on utilizing multiple constrained optimization problems to solve the verification model in a manner that varies the analysis' underlying assumptions. Constraints applied in the analysis can include expert judgment regarding convergence rates (bounds and expectations) as well as bounding values for physical quantities (e.g., positivity of energy or density). This approach then produces a number of error models, which are then analyzed through robust statistical techniques (median instead of mean statistics). This provides self-contained, data and expert informed error estimation including uncertainties for both the solution itself and order of convergence. Our method produces high quality results for the well-behaved cases relatively consistent with existing practice. The methodology can also produce reliable results for ill-behaved circumstances predicated on appropriate expert judgment. We demonstrate the method and compare the results with standard approaches used for both code and solution verification on well-behaved and ill-behaved simulations.
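One ingredient of the methodology, estimating an observed order of convergence from a sequence of simulations and summarizing it with median rather than mean statistics, can be sketched as follows. This is a simplified illustration under the assumption of a constant grid refinement ratio; it omits the constrained-optimization machinery of Robust Verification, and the function names are illustrative.

```python
# Median-based estimate of observed convergence order from a sequence of
# grid solutions with constant refinement ratio r (coarse -> fine).
# Simplified illustration of the robust-statistics flavor only.
import math
from statistics import median

def observed_orders(solutions, r):
    """Observed order p from each consecutive solution triplet."""
    ps = []
    for f3, f2, f1 in zip(solutions, solutions[1:], solutions[2:]):
        # Classic three-grid estimate: p = log(|Δ_coarse / Δ_fine|) / log(r)
        ps.append(math.log(abs((f3 - f2) / (f2 - f1))) / math.log(r))
    return ps

# Manufactured example: f(h) = 1 + h**2 with grid spacing halved each
# step, i.e. exact second-order convergence.
hs = [0.4, 0.2, 0.1, 0.05, 0.025]
sols = [1 + h**2 for h in hs]
print(round(median(observed_orders(sols, r=2)), 2))  # 2.0
```

For a well-behaved sequence like this, every triplet gives the same order and the median adds nothing; its value is in ill-behaved sequences, where a single anomalous triplet would distort a mean but leaves the median essentially unchanged.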
Best estimate containment analysis
Primary reactor coolant system pipe ruptures are postulated as part of the design basis for containment integrity and equipment qualification validation for nuclear power plants. Current licensing analysis uses bounding conditions and assumptions, outside the range of actual operation, to determine a conservative measure of the performance requirements. Although this method has been adequate in the past, it often involves the inclusion of excessive conservatism. A new licensing approach is under development that considers the performance of realistic analysis which quantifies the true plant response. A licensing limit is then quantified above the realistic requirements by applying the appropriate plant data and methodology uncertainties. This best estimate approach allows a true measure of the conservative margin, above the plant performance requirements, to be quantified. By utilizing a portion of this margin, the operation, surveillance and maintenance burden can be reduced by transferring the theoretical margin inherent in the licensing analysis to real margin applied at the plant. Relaxation of surveillance and maintenance intervals, relaxation of diesel loading and containment cooling requirements, an increased quantity of necessary equipment allowed to be out of service, and allowances for equipment degradation are all potential benefits of applying this approach. Significant margins exist in current calculations due to the bounding nature of the evaluations; scoping studies, which help quantify the potential margin available through best estimate mass and energy release analysis, demonstrate this. Also discussed in this paper are the approach for best estimate loss-of-coolant accident mass and energy release and containment analysis, the computer programs, the projected benefits, and the expected future directions
Ceramic tubesheet design analysis
Mallett, R.H.; Swindeman, R.W.
1996-06-01
A transport combustor is being commissioned at the Southern Services facility in Wilsonville, Alabama to provide a gaseous product for the assessment of hot-gas filtering systems. One of the barrier filters incorporates a ceramic tubesheet to support candle filters. The ceramic tubesheet, designed and manufactured by Industrial Filter and Pump Manufacturing Company (IF&PM), is unique and offers distinct advantages over metallic systems in terms of density, resistance to corrosion, and resistance to creep at operating temperatures above 815°C (1500°F). Nevertheless, the operational requirements of the ceramic tubesheet are severe. The tubesheet is almost 1.5 m (55 in.) in diameter, has many penetrations, and must support the weight of the ceramic filters, coal ash accumulation, and a pressure drop of one atmosphere. Further, thermal stresses related to steady state and transient conditions will occur. To gain a better understanding of the structural performance limitations, a contract was placed with Mallett Technology, Inc. to perform a thermal and structural analysis of the tubesheet design. The design analysis specification and a preliminary design analysis were completed in the early part of 1995. The analyses indicated that modifications to the design were necessary to reduce thermal stress, and it was necessary to complete the redesign before the final thermal/mechanical analysis could be undertaken. The preliminary analysis identified the need to confirm that the physical and mechanical properties data used in the design were representative of the material in the tubesheet. Subsequently, a few exploratory tests were performed at ORNL to evaluate the ceramic structural material.
There are many tools used in analysis in High Energy Physics (HEP), ranging from low-level tools such as a programming language to high-level ones such as a detector simulation package. This paper will discuss some aspects of these tools that are directly associated with the process of analyzing HEP data. Physics analysis tools cover the whole range from the simulation of the interactions of particles to the display and fitting of statistical data. For the purposes of this paper, analysis is broken down into five main stages, classified under the areas of generation, reconstruction, and analysis. Different detector groups use different terms for these stages, so it is useful to define what is meant by them in this paper. The particle generation stage is a simulation of the initial interaction, the production of particles, and the decay of the short-lived particles. The detector simulation stage simulates the behavior of an event in a detector. The track reconstruction stage does pattern recognition on the measured or simulated space points, calorimeter information, etc., and reconstructs track segments of the original event. The event reconstruction stage takes the reconstructed tracks, along with particle identification information, and assigns masses to produce 4-vectors. Finally, the display and fit stage displays statistical data accumulated in the preceding stages in the form of histograms, scatter plots, etc. The remainder of this paper will consider what analysis tools are available today and what one might expect in the future. In each stage, the integration of the tools with other stages and the portability of the tool will be analyzed
Żarnecki, Aleksander F.; Piotrowski, Lech W.; Mankiewicz, Lech; Małek, Sebastian
2012-05-01
GLORIA stands for "GLObal Robotic-telescopes Intelligent Array". GLORIA will be the first free and open-access network of robotic telescopes in the world. It will be a Web 2.0 environment where users can do research in astronomy by observing with robotic telescopes and/or analyzing data that other users have acquired with GLORIA, or from other free-access databases such as the European Virtual Observatory. The GLORIA project will define free standards, protocols and methodology for controlling robotic telescopes and related instrumentation, for conducting so-called on-line experiments by scheduling observations in the telescope network, and for conducting so-called off-line experiments based on the analysis of astronomical meta-data produced by GLORIA or other databases. The Luiza analysis framework for GLORIA is based on the Marlin package developed for International Linear Collider (ILC) data analysis. HEP experiments have to deal with enormous amounts of data, and distributed data analysis is a must, so the Marlin framework concept seemed well suited to GLORIA's needs. The idea (and large parts of the code) taken from Marlin is that every computing task is implemented as a processor (module) that analyzes the data stored in an internal data structure, and any additional output created is added back to that collection. The advantage of such a modular approach is to keep things as simple as possible. Every single step of the full analysis chain that goes, e.g., from raw images to light curves can be processed separately, and the output of each step is still self-consistent and can be fed into the next step without any manipulation.
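The Marlin-style processor pattern described above can be sketched as follows. The class and field names here are illustrative assumptions, not actual Marlin or Luiza identifiers; the point is only that each processor reads from a shared event collection and appends its output back to it, so steps chain without manipulation.

```python
# Sketch of the modular processor-chain pattern (illustrative names,
# NOT actual Marlin or Luiza classes): each step reads from and adds
# to a shared event collection.
class Processor:
    def process(self, event: dict) -> None:
        raise NotImplementedError

class CalibrateImage(Processor):
    def process(self, event):
        # Adds a "calibrated" collection derived from the raw data.
        event["calibrated"] = [x - event["bias"] for x in event["raw"]]

class MeasureBrightness(Processor):
    def process(self, event):
        # Consumes the previous step's output, adds its own.
        event["brightness"] = sum(event["calibrated"]) / len(event["calibrated"])

def run_chain(event, processors):
    for p in processors:          # each step feeds the next via the event
        p.process(event)
    return event

event = {"raw": [10, 12, 14], "bias": 2}
out = run_chain(event, [CalibrateImage(), MeasureBrightness()])
print(out["brightness"])  # 10.0
```

Because every processor only adds to the shared collection, any prefix of the chain still yields a self-consistent event, which is the property the abstract highlights.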
Financial Analysis of an Enterprise
Machek, Tomáš
2012-01-01
The main goal of this bachelor thesis is to analyse the financial situation of the company TOPTHERM, Ltd. with the help of financial statement analysis over the period under consideration, 2008 to 2011. The first part of the thesis contains the theoretical bases of financial analysis. The second part, the practical part, contains basic methods of financial analysis, such as horizontal analysis, vertical analysis, analysis of ratio indicators and balance sheet financing arrangements. In conclusion of the thesis...
Financial analysis of chosen company
Bražniková, Klára
2011-01-01
Financial analysis is important when a company decides on its future business. Financial analysis is an analysis of data derived from the financial statements: the balance sheet, the profit and loss account, the notes to the financial statements and the cash flow statement. The objective of financial analysis is to determine the financial health of the company and to identify the weaknesses that lead to problems. The analysis must be comprehensive and systematic in its implementation. Users of the ...
Analysis of Muji's Business Strategy
范晶
2011-01-01
This article is a report of an analysis of Muji's business strategy. First, the vision and mission are introduced. Second, the current strategy is identified. Then the industry analysis, industry driving forces, key success factors, value chain analysis, competitive advantage, and competitive power of the competitive advantage are analyzed. At last, on the basis of the foregoing analysis, the SWOT analysis is worked out.
Trends in BWR transient analysis
While boiling water reactor (BWR) analysis methods for transient and loss of coolant accident analysis are well established, refinements and improvements continue to be made. This evolution of BWR analysis methods is driven by the new applications. This paper discusses some examples of these trends, specifically, time domain stability analysis and analysis of the simplified BWR (SBWR), General Electric's design approach involving a shift from active to passive safety systems and the elimination/simplification of systems for improved operation and maintenance
Exploratory data analysis with Matlab
Martinez, Wendy L; Solka, Jeffrey
2010-01-01
Since the publication of the bestselling first edition, many advances have been made in exploratory data analysis (EDA). Covering innovative approaches for dimensionality reduction, clustering, and visualization, Exploratory Data Analysis with MATLAB®, Second Edition uses numerous examples and applications to show how the methods are used in practice. New to the Second Edition: discussions of nonnegative matrix factorization, linear discriminant analysis, curvilinear component analysis, independent component analysis, and smoothing splines; an expanded set of methods for estimating the intrinsic di
Strategic analysis of Czech Airlines
Moiseeva, Polina
2016-01-01
The thesis, called Strategic Analysis of Czech Airlines, comprehensively analyses the current situation within the company. It presents the theoretical basis for such an analysis and subsequently offers a situational analysis, which includes the analysis of the external environment, the internal environment and suggestions for improvement. The thesis includes a complete SWOT analysis of the company and applies Porter's five forces framework. The thesis also includes recommendations and suggestions for th...
Kim, Young Suk; Kim, Nak Bae; Woo, Hyung Joo; Kim, Joon Kon; Kim, Gi Dong; Choi, Han Woo; Yoon, Yoon Yeol; Shim, Sang Kwun [Korea Institute of Geology Mining and Materials, Taejon (Korea, Republic of)
1997-12-01
Light elements in semiconductors, superconductors, magnetic or optical storage devices and surface-hardened metals may have serious effects on electrical, chemical and physical properties. Nevertheless, it is extremely difficult to quantitatively analyze their contents with conventional surface analysis tools like SIMS, AES and ESCA. The ERD-TOF (Elastic Recoil Detection by Time Of Flight) method has recently been developed in a few prominent accelerator laboratories and has proved to be very useful for such quantitative depth profiling of light elements. This project aims to construct an ERD-TOF system which can provide routine light-element analysis of thin films. The TOF spectrometer used in the system can also be utilized in HIRBS (Heavy Ion Rutherford Backscattering Spectrometry) for better resolution and sensitivity than conventional He RBS in certain cases. The work performed this year comprises: 1) optimization of the ERD-TOF system for practical use; 2) construction of a separate HIRBS line; 3) development of the analysis computer program and improvement of the data acquisition system; 4) construction of a new vacuum chamber with an automatic target controller. The optimization was done by considering such parameters as mass resolution, depth resolution, accessible depth and detection sensitivity. All these parameters have strong correlations with the type, energy and dose of the beams to be used, the detection angle, the target angle and the flight length. In a practical analysis system one cannot change the system parameters for every measurement, although there exists only one optimum condition for each measurement. Therefore, a condition was deduced which is applicable to the majority of general semiconductor samples. For practical analysis service, a separate HIRBS line has been constructed. The line uses the same TOF spectrometer as the ERD line, but the shape of the chamber is slightly modified. A computer program, DoERD, was written for rapid analysis
Implementing Horn’s parallel analysis for principal component analysis and factor analysis
Alexis Dinno
2009-01-01
I present paran, an implementation of Horn's parallel analysis criteria for factor or component retention in common factor analysis or principal component analysis in Stata. The command permits classical parallel analysis and more recent extensions to it for the pca and factor commands. paran provides a needed extension to Stata's built-in factor- and component-retention criteria.
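Horn's criterion retains an observed component only while its eigenvalue exceeds what random data of the same dimensions would produce. A minimal sketch of the classical procedure in Python (not the paran Stata command itself; the function name and the 95th-centile variant are illustrative):

```python
import numpy as np

def horn_parallel_analysis(X, n_iter=200, centile=95, seed=0):
    """Retain components whose correlation-matrix eigenvalues exceed
    the given percentile of eigenvalues from random normal data of
    the same shape (Horn, 1965). Returns the number retained."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    obs = np.sort(np.linalg.eigvalsh(np.corrcoef(X, rowvar=False)))[::-1]
    rand = np.empty((n_iter, p))
    for i in range(n_iter):
        R = rng.standard_normal((n, p))
        rand[i] = np.sort(np.linalg.eigvalsh(np.corrcoef(R, rowvar=False)))[::-1]
    thresh = np.percentile(rand, centile, axis=0)
    keep = obs > thresh
    # Retain leading components up to the first failure.
    return p if keep.all() else int(np.argmin(keep))
```

Using Monte Carlo thresholds rather than the Kaiser eigenvalue-greater-than-one rule accounts directly for sampling error in the random eigenvalues.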
Forensic analysis of biodiesel.
Goodman, Michael R; Kaley, Elizabeth A; Finney, Eric E
2016-06-01
The analysis of four different biodiesel blends, as well as homemade biodiesel prepared from vegetable oil, has been performed using gas chromatography-mass spectrometry. The identification of methyl esters within the biodiesel, along with any background components, is made possible by recognizing their mass spectral fragmentation patterns. These fuels were subjected to typical fire scene environments, specifically weathering and microbial degradation, to investigate how these environments affect the analysis. A matrix study was also performed on wood, carpet, and clothing in order to identify any interferences from these substrates. The results obtained herein will provide the forensic science community with the data needed to recognize these increasingly common ignitable liquids. PMID:27060442
In Silico Expression Analysis.
Bolívar, Julio; Hehl, Reinhard; Bülow, Lorenz
2016-01-01
Information on the specificity of cis-sequences enables the design of functional synthetic plant promoters that are responsive to specific stresses. Potential cis-sequences may be tested experimentally; however, correlating genomic sequence with gene expression data enables an in silico expression analysis approach to bioinformatically assess the stress specificity of candidate cis-sequences prior to experimental verification. The present chapter demonstrates an example of the in silico validation of a potential cis-regulatory sequence responsive to cold stress. The described online tool can be applied for the bioinformatic assessment of cis-sequences responsive to most abiotic and biotic stresses of plants. Furthermore, a method is presented based on a reverted in silico expression analysis approach that predicts highly specific, potentially functional cis-regulatory elements for a given stress. PMID:27557772
Intracochlear microprobe analysis
Bone, R.C.; Ryan, A.F.
1982-04-01
Energy dispersive x-ray analysis (EDXA), or microprobe analysis, provides cochlear physiologists with a means of accurately assessing relative ionic concentrations in selected portions of the auditory mechanism. Rapid freezing followed by lyophilization allows the recovery of fluid samples in crystalline form not only from perilymphatic and endolymphatic spaces, but also from much smaller subregions of the cochlea. Because samples are examined in a solid state, there is no risk of diffusion into surrounding or juxtaposed fluids. Samples of cochlear tissues may also be evaluated without the danger of intercellular ionic diffusion. During direct visualization by scanning electron microscopy, the biochemical makeup of the material being examined can be determined simultaneously, assuring the source of the data collected. Other potential advantages and disadvantages of EDXA are reviewed. Initial findings as they relate to endolymph, perilymph, stria vascularis, and the undersurface of the tectorial membrane are presented.
Kanjilal, S. K.; Lindquist, M. R.; Ulbricht, L. E.
1994-02-01
Jumper connectors are used for remotely connecting pipe lines containing transfer fluids ranging from hazardous chemicals to other nonhazardous liquids. The jumper connector assembly comprises hooks, hookpins, a block, a nozzle, an operating screw, and a nut. The hooks are tightened against the nozzle flanges by the operating screw that is tightened with a remotely connected torque wrench. Stress analysis for the jumper connector assembly (used extensively on the US Department of Energy's Hanford Site, near Richland, Washington) is performed by using hand calculation and finite-element techniques to determine the stress levels resulting from operating and seismic loads on components of the assembly. The analysis addresses loading conditions such as prestress, seismic, operating, thermal, and leakage. The preload torque-generated forces at which each component reaches its stress limits are presented in a tabulated format. Allowable operating loads for the jumper assembly are provided to prevent leakage of the assembly during operating cycles.
Generative pulsar timing analysis
Lentati, L; Hobson, M P
2014-01-01
A new Bayesian method for the analysis of folded pulsar timing data is presented that allows for the simultaneous evaluation of evolution in the pulse profile in either frequency or time, along with the timing model and additional stochastic processes such as red spin noise, or dispersion measure variations. We model the pulse profiles using `shapelets' - a complete orthonormal set of basis functions that allow us to recreate any physical profile shape. Any evolution in the profiles can then be described as either an arbitrary number of independent profiles, or using some functional form. We perform simulations to compare this approach with established methods for pulsar timing analysis, and to demonstrate model selection between different evolutionary scenarios using the Bayesian evidence. The simplicity of our method allows for many possible extensions, such as including models for correlated noise in the pulse profile, or broadening of the pulse profiles due to scattering. As such, while it is a marked...
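Shapelets are Gauss-Hermite basis functions, so fitting a profile template reduces to linear least squares once a scale β and a maximum order are chosen. A brief numpy sketch under those assumptions (illustrative function names; the paper's own normalization conventions may differ):

```python
import numpy as np
from numpy.polynomial.hermite import hermval
from math import factorial, pi

def shapelet_basis(x, n_max, beta):
    """Columns are the first n_max+1 Gauss-Hermite shapelets:
    B_n(x) proportional to H_n(x/beta) * exp(-x^2 / (2 beta^2))."""
    cols = []
    for n in range(n_max + 1):
        c = np.zeros(n + 1)
        c[n] = 1.0                     # select the degree-n Hermite polynomial
        norm = (2.0**n * factorial(n) * np.sqrt(pi) * beta) ** -0.5
        cols.append(norm * hermval(x / beta, c) * np.exp(-x**2 / (2 * beta**2)))
    return np.column_stack(cols)

def fit_profile(phase, profile, n_max=10, beta=0.05):
    """Least-squares shapelet amplitudes and the reconstructed profile."""
    B = shapelet_basis(phase, n_max, beta)
    amp, *_ = np.linalg.lstsq(B, profile, rcond=None)
    return amp, B @ amp
```

Because the basis is orthonormal in the continuum limit, low orders capture smooth pulse shapes, and the amplitudes give a compact profile model that can then be allowed to evolve in time or frequency.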
Kass, Robert E; Brown, Emery N
2014-01-01
Continual improvements in data collection and processing have had a huge impact on brain research, producing data sets that are often large and complicated. By emphasizing a few fundamental principles, and a handful of ubiquitous techniques, Analysis of Neural Data provides a unified treatment of analytical methods that have become essential for contemporary researchers. Throughout the book ideas are illustrated with more than 100 examples drawn from the literature, ranging from electrophysiology, to neuroimaging, to behavior. By demonstrating the commonality among various statistical approaches the authors provide the crucial tools for gaining knowledge from diverse types of data. Aimed at experimentalists with only high-school level mathematics, as well as computationally-oriented neuroscientists who have limited familiarity with statistics, Analysis of Neural Data serves as both a self-contained introduction and a reference work.
CERN. Geneva; Fitch, Blake
2011-01-01
Traditionally, the primary role of supercomputers was to create data, primarily for simulation applications. Due to usage and technology trends, supercomputers are increasingly also used for data analysis. Some of this data is from simulations, but there is also a rapidly increasing amount of real-world science and business data to be analyzed. We briefly overview Blue Gene and other current supercomputer architectures. We outline future architectures, up to the Exascale supercomputers expected in the 2020 time frame. We focus on the data analysis challenges and opportunities, especially those concerning Flash and other up-and-coming storage class memory. About the speakers Blake G. Fitch has been with IBM Research, Yorktown Heights, NY since 1987, mainly pursuing interests in parallel systems. He joined the Scalable Parallel Systems Group in 1990, contributing to research and development that culminated in the IBM scalable parallel system (SP*) product. His research interests have focused on applicatio...
What are the targets and criteria on which national energy policy should be based? What priorities should be set, and how can different social interests be matched? To answer these questions, a new instrument of decision theory is presented which has been applied with good results to controversial political issues in the USA. The new technique is known under the name of value tree analysis. Members of important West German organisations (BDI, VDI, RWE, the Catholic and Protestant Church, Deutscher Naturschutzring, and ecological research institutions) were asked about the goals of their organisations. These goals were then ordered systematically and arranged in a hierarchical tree structure. The value trees of different groups can be combined into a catalogue of social criteria of acceptability and policy assessment. The authors describe the philosophy and methodology of value tree analysis and give an outline of its application in the development of a socially acceptable energy policy. (orig.)
Hazen, Damian [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States); Hick, Jason [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States)
2012-06-12
We provide analysis of Oracle StorageTek T10000 Generation B (T10KB) Media Information Record (MIR) Performance Data gathered over the course of a year from our production High Performance Storage System (HPSS). The analysis shows information in the MIR may be used to improve tape subsystem operations. Most notably, we found the MIR information to be helpful in determining whether the drive or tape was most suspect given a read or write error, and for helping identify which tapes should not be reused given their history of read or write errors. We also explored using the MIR Assisted Search to order file retrieval requests. We found that MIR Assisted Search may be used to reduce the time needed to retrieve collections of files from a tape volume.
Visualization analysis and design
Munzner, Tamara
2015-01-01
Visualization Analysis and Design provides a systematic, comprehensive framework for thinking about visualization in terms of principles and design choices. The book features a unified approach encompassing information visualization techniques for abstract data, scientific visualization techniques for spatial data, and visual analytics techniques for interweaving data transformation and analysis with interactive visual exploration. It emphasizes the careful validation of effectiveness and the consideration of function before form. The book breaks down visualization design according to three questions: what data users need to see, why users need to carry out their tasks, and how the visual representations proposed can be constructed and manipulated. It walks readers through the use of space and color to visually encode data in a view, the trade-offs between changing a single view and using multiple linked views, and the ways to reduce the amount of data shown in each view. The book concludes with six case stu...
Vatrapu, Ravi; Hussain, Abid; Buus Lassen, Niels;
2015-01-01
This paper argues that the basic premise of Social Network Analysis (SNA) -- namely that social reality is constituted by dyadic relations and that social interactions are determined by structural properties of networks -- is neither necessary nor sufficient for Big Social Data analytics... of Facebook or Twitter data. However, there exists no other holistic computational social science approach beyond the relational sociology and graph theory of SNA. To address this limitation, this paper presents an alternative holistic approach to Big Social Data analytics called Social Set Analysis (SSA...). Based on the sociology of associations and the mathematics of classical, fuzzy and rough set theories, this paper proposes a research program, the function of which is to design, develop and evaluate social set analytics in terms of fundamentally novel formal models, predictive methods and visual...
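The contrast SSA draws can be made concrete: instead of an actor-by-actor edge list, each actor is described by a set of associations, and analytics operate with set algebra. A toy crisp-set illustration in Python (hypothetical data; SSA also covers fuzzy and rough set variants not shown here):

```python
# Actors characterized by sets of associations (e.g. hashtags engaged
# with), rather than by edges to other actors as in SNA.
engagement = {
    "ana":  {"#climate", "#energy", "#policy"},
    "ben":  {"#energy", "#sports"},
    "cara": {"#climate", "#policy", "#health"},
}

def jaccard(a: set, b: set) -> float:
    """Crisp-set similarity of two actors' association sets."""
    return len(a & b) / len(a | b)

# Set operations replace graph traversals: shared and distinctive topics.
shared = engagement["ana"] & engagement["cara"]
only_ana = engagement["ana"] - engagement["cara"]
```

The point of the set-theoretic view is that such queries need no network construction step at all; associations, not dyadic ties, are the primitive.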
Layered Composite Analysis Capability
Narayanaswami, R.; Cole, J. G.
1985-01-01
Laminated composite material construction is gaining popularity within industry as an attractive alternative to metallic designs where high strength at reduced weights is of prime consideration. This has necessitated the development of an effective analysis capability for the static, dynamic and buckling analyses of structural components constructed of layered composites. Theoretical and user aspects of layered composite analysis and its incorporation into CSA/NASTRAN are discussed. The availability of stress and strain based failure criteria is described which aids the user in reviewing the voluminous output normally produced in such analyses. Simple strategies to obtain minimum weight designs of composite structures are discussed. Several example problems are presented to demonstrate the accuracy and user convenient features of the capability.
Leonard, J. I.
1977-01-01
The water balance of the Skylab crew was analyzed. Evaporative water loss was evaluated using a whole-body input/output balance equation, together with water, body-tissue, and energy balances. The approach utilizes the results of several major Skylab medical experiments. Subsystems were designed for the use of the software necessary for the analysis. A partitional water balance that graphically depicts the changes due to water intake is presented. The energy balance analysis determines the net available energy to the individual crewman during any period. The balances produce a visual description of the total change of a particular body component during the course of the mission. The information is salvaged from metabolic balance data if certain techniques are used to reduce errors inherent in the balance method.
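The balance-equation approach amounts to solving for the one unmeasurable term as the residual of measured intakes, outputs, and stores. A schematic sketch (variable names and g/day units are illustrative, not the study's actual data items):

```python
def evaporative_loss_g_per_day(drink: float, food_water: float,
                               metabolic_water: float, urine: float,
                               fecal: float, delta_tbw: float) -> float:
    """Evaporative (insensible) loss as the residual of a whole-body
    water balance: delta_TBW = inputs - (urine + fecal + evaporative).
    All terms in g/day; delta_tbw is the change in total body water."""
    inputs = drink + food_water + metabolic_water
    return inputs - urine - fecal - delta_tbw
```

For example, 2500 g/day drunk, 700 g/day of food water and 300 g/day of metabolic water, with 1500 g/day urine, 100 g/day fecal water, and a 200 g/day loss of total body water, implies about 2100 g/day of evaporative loss.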
Handbook of radioactivity analysis
2012-01-01
The updated and much expanded Third Edition of the "Handbook of Radioactivity Analysis" is an authoritative reference providing the principles, practical techniques, and procedures for the accurate measurement of radioactivity from the very low levels encountered in the environment to higher levels measured in radioisotope research, clinical laboratories, biological sciences, radionuclide standardization, nuclear medicine, nuclear power, fuel cycle facilities and in the implementation of nuclear forensic analysis and nuclear safeguards. The Third Edition contains seven new chapters providing a reference text much broader in scope than the previous Second Edition, and all of the other chapters have been updated and expanded many with new authors. The book describes the basic principles of radiation detection and measurement, the preparation of samples from a wide variety of matrices, assists the investigator or technician in the selection and use of appropriate radiation detectors, and presents state-of-the-ar...
This paper presents the structural analysis developed during the TBR-2 tokamak project studies. Starting with the electromagnetic interaction forces on each of the toroidal and poloidal coils, many structural calculations have been carried out using locally developed E.M. interaction codes and a finite element stress code. The analysis determined that there is a radially inward force of 1235 kN and an overturning torque of 243 kNm acting on the toroidal coils. The stresses and displacements due to in-plane loads have been calculated using a finite element code, which shows that a maximum stress of 240 MPa and a displacement of 0.21 mm can be present at the inner part of the toroidal field coil. (Author)
Tohyama, Mikio
2015-01-01
What is this sound? What does that sound indicate? These are two questions frequently heard in daily conversation. Sound results from the vibrations of elastic media and in daily life provides informative signals of events happening in the surrounding environment. In interpreting auditory sensations, the human ear seems particularly good at extracting the signal signatures from sound waves. Although exploring auditory processing schemes may be beyond our capabilities, source signature analysis is a very attractive area in which signal-processing schemes can be developed using mathematical expressions. This book is inspired by such processing schemes and is oriented to signature analysis of waveforms. Most of the examples in the book are taken from data of sound and vibrations; however, the methods and theories are mostly formulated using mathematical expressions rather than by acoustical interpretation. This book might therefore be attractive and informative for scientists, engineers, researchers, and graduat...
Nonstandard asymptotic analysis
Berg, Imme
1987-01-01
This research monograph considers the subject of asymptotics from a nonstandard view point. It is intended both for classical asymptoticists - they will discover a new approach to problems very familiar to them - and for nonstandard analysts but includes topics of general interest, like the remarkable behaviour of Taylor polynomials of elementary functions. Noting that within nonstandard analysis, "small", "large", and "domain of validity of asymptotic behaviour" have a precise meaning, a nonstandard alternative to classical asymptotics is developed. Special emphasis is given to applications in numerical approximation by convergent and divergent expansions: in the latter case a clear asymptotic answer is given to the problem of optimal approximation, which is valid for a large class of functions including many special functions. The author's approach is didactical. The book opens with a large introductory chapter which can be read without much knowledge of nonstandard analysis. Here the main features of the t...
Olivarius, Signe
While increasing evidence appoints diverse types of RNA as key players in the regulatory networks underlying cellular differentiation and metabolism, the potential functions of thousands of conserved RNA structures encoded in mammalian genomes remain to be determined. Since the functions of most... RNAs rely on interactions with proteins, the establishment of protein-binding profiles is essential for the characterization of RNAs. Aiming to facilitate RNA analysis, this thesis introduces proteomics- as well as transcriptomics-based methods for the functional characterization of RNA. First, RNA...-protein pulldown combined with mass spectrometry analysis is applied for in vivo as well as in vitro identification of RNA-binding proteins, the latter succeeding in verifying known RNA-protein interactions. Secondly, acknowledging the significance of flexible promoter usage for the diversification of the...
As a consequence of various IAEA programmes to sample airborne particulate matter and determine its elemental composition, the participating research groups are accumulating data on the composition of the atmospheric aerosol. It is necessary to consider ways in which these data can be utilized in order to be certain that the data obtained are correct and that the information then being transmitted to others who may make decisions based on such information is as representative and correct as possible. In order to both examine the validity of those data and extract appropriate information from them, it is necessary to utilize a variety of data analysis methods. The objective of this workbook is to provide a guide with examples of utilizing data analysis on airborne particle composition data using a spreadsheet program (EXCEL) and a personal computer based statistical package (StatGraphics)
Riber-Hansen, Rikke; Vainer, Ben; Steiniche, Torben
2012-01-01
Digital image analysis (DIA) is increasingly implemented in histopathological research to facilitate truly quantitative measurements, decrease inter-observer variation and reduce hands-on time. Originally, efforts were made to enable DIA to reproduce manually obtained results on histological slides... optimized for light microscopy and the human eye. With improved technical methods and the acknowledgement that computerized readings are different from analysis by human eye, recognition has been achieved that to really empower DIA, histological slides must be optimized for the digital 'eye', with... reproducible results correlating with clinical findings. In this review, we focus on the basic expectations and requirements for DIA to gain wider use in histopathological research and diagnostics. With a reference to studies that specifically compare DIA with conventional methods, this review discusses...
Paige Cooke
2013-01-01
This project presents a mathematical analysis of the high jump, a popular track and field event. The first and second stages of the high jump correspond to the athlete’s run along two distinct trajectories. The third stage is the actual jump. We propose an individual model for each of these stages and show how to combine these models to study the dynamics of the entire high jump.
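Once the athlete leaves the ground, the third stage reduces to projectile motion of the centre of mass. A minimal sketch of that stage (parameter values are illustrative; the paper's actual stage models are not reproduced here):

```python
from math import sin, radians

G = 9.81  # gravitational acceleration, m/s^2

def peak_com_height(h0: float, v: float, theta_deg: float) -> float:
    """Peak centre-of-mass height during the jump stage, modelled as
    projectile motion: h0 + (v sin(theta))^2 / (2 g), with takeoff
    height h0 (m), takeoff speed v (m/s), and takeoff angle in degrees."""
    vz = v * sin(radians(theta_deg))
    return h0 + vz**2 / (2 * G)
```

A takeoff centre-of-mass height of 1.1 m, takeoff speed of 4.5 m/s, and takeoff angle of 60 degrees give a peak centre-of-mass height of about 1.87 m.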
Vončina, Bojan
2016-01-01
The purpose of the thesis was to analyse the acceptance of Scrum methodology, which has become one of the leading agile methodologies, and to find out which were the key factors that influenced the acceptance. The analysis was conducted in Comtrade, one of the largest Slovenian software development companies. The first part (theoretical part) contains an introduction chapter, a detailed presentation of Scrum methodology, and the presentation of theoretical models on which practical ...
Bayesian exploratory factor analysis
Gabriella Conti; Sylvia Frühwirth-Schnatter; James Heckman; Rémi Piatek
2014-01-01
This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identifi cation criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study c...
Bayesian Exploratory Factor Analysis
Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.; Piatek, Rémi
2014-01-01
This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study co...
Bayesian Exploratory Factor Analysis
Gabriella Conti; Sylvia Fruehwirth-Schnatter; Heckman, James J.; Remi Piatek
2014-01-01
This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo s...
Bayesian exploratory factor analysis
Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.; Piatek, Rémi
2014-01-01
This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo st...
Bayesian exploratory factor analysis
Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James; Piatek, Rémi
2014-01-01
This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study co...
Rocheta, Margarida; Dionísio, F Miguel; Fonseca, Luís; Pires, Ana M
2007-12-01
Paternity analysis using microsatellite information is a well-studied subject. These markers are ideal for parentage studies and fingerprinting, due to their high discrimination power. This type of data is used to assign paternity, to compute the average selfing and outcrossing rates, and to estimate the biparental inbreeding. There are several public domain programs that compute all this information from data. Most of the time, it is necessary to export data to some sort of format, feed it to the program, and import the output to an Excel book for further processing. In this article we briefly describe a program, referred to from now on as Paternity Analysis in Excel (PAE), developed at IST and IBET (see the acknowledgments), that computes paternity candidates from data, and other information, from within Excel. In practice this means that the end user provides the data in an Excel sheet and, by pressing an appropriate button, obtains the results in another Excel sheet. For convenience PAE is divided into two modules. The first one is a filtering module that selects data from the sequencer and reorganizes it in a format appropriate to process paternity analysis, assuming certain conventions for the names of parents and offspring from the sequencer. The second module carries out the paternity analysis assuming that one parent is known. Both modules are written in Excel-VBA and can be obtained at the address (www.math.ist.utl.pt/~fmd/pa/pa.zip). They are free for non-commercial purposes and have been tested with different data and against different software (Cervus, FaMoz, and MLTR). PMID:17928093
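The core test behind candidate-father selection when one parent is known is Mendelian compatibility at each microsatellite locus. A simplified Python sketch (PAE itself is Excel-VBA; this is an illustrative restatement that ignores mutations and null alleles):

```python
def locus_compatible(off, mom, dad):
    """True if the offspring's two alleles at a locus can be split so
    that one comes from the mother and the other from the candidate
    father. Genotypes are (allele, allele) tuples of fragment sizes."""
    a, b = off
    return (a in mom and b in dad) or (b in mom and a in dad)

def is_candidate_father(offspring, mother, candidate):
    """A single incompatible locus excludes the candidate; candidates
    must be compatible at every typed locus."""
    return all(locus_compatible(o, m, c)
               for o, m, c in zip(offspring, mother, candidate))
```

For example, an offspring typed (150, 154) with a mother carrying allele 150 must have received allele 154 from its father, so any candidate lacking 154 at that locus is excluded.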
Anonymous
2001-01-01
Risk analysis is a useful tool for making good decisions on the risks of certain potentially hazardous agents and suggests a safe margin through scientific processes using toxicological data, contaminant residue levels, statistical tools, exposure values and relevant variants. Risk managers consider scientific evidence and risk estimates, along with statutory, engineering, economic, social, and political factors, in evaluating alternative regulatory options and choosing among those options (NRC, 1983).
Anshul Sharma; Preeti Gulia
2014-01-01
Big Data is data that either is too large, grows too fast, or does not fit into traditional architectures. Within such data can be valuable information that can be discovered through data analysis [1]. Big data is a collection of complex and large data sets that are difficult to process and mine for patterns and knowledge using traditional database management tools or data processing and mining systems. Big Data is data whose scale, diversity and complexity require new architecture, technique...
Crawler Solids Unknown Analysis
Frandsen, Athela
2016-01-01
Crawler Transporter (CT) #2 has been undergoing refurbishment to carry the Space Launch System (SLS). After returning to normal operation, multiple filters of the gear box lubrication system failed/clogged and went on bypass during a test run to the launch pad. Analysis of the filters was done in large part with polarized light microscopy (PLM) to identify the filter contaminants and their source of origin.
Fiala, Tomáš
2011-01-01
Summary: This bachelor thesis focuses on the actual use of DNA analysis. The emphasis is placed mainly on the DNA databases of the Czech Republic and the United Kingdom. Both systems have their own specific characteristics that significantly differ from one another. Some of those characteristics could be considered advantages, while others rather disadvantages, mainly with regard to the right to privacy. Therefore, an introduction to the topic (as well as its historical development) will be...
Futures Trading: Task Analysis
Zeman, Jan
Praha: UTIA AV ČR, v.v.i, - DAR, 2009 - (Janžura; Ivánek). s. 38-38 [5th International Workshop on Data – Algorithms – Decision Making. 29.11.2009-01.12.2009, Plzeň] Institutional research plan: CEZ:AV0Z10750506 Keywords : decision making * futures trading * optimization Subject RIV: IN - Informatics, Computer Science http://library.utia.cas.cz/separaty/2009/AS/zeman-futures trading task analysis .pdf
Deep Linear Discriminant Analysis
Dorfer, Matthias; Kelz, Rainer; WIDMER, Gerhard
2015-01-01
We introduce Deep Linear Discriminant Analysis (DeepLDA) which learns linearly separable latent representations in an end-to-end fashion. Classic LDA extracts features which preserve class separability and is used for dimensionality reduction for many classification problems. The central idea of this paper is to put LDA on top of a deep neural network. This can be seen as a non-linear extension of classic LDA. Instead of maximizing the likelihood of target labels for individual samples, we pr...
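The quantity DeepLDA back-propagates through is the set of generalized eigenvalues of the between-class versus within-class scatter of the network's latent features. A numpy sketch of just that objective computation (illustrative names; details of the paper's loss, such as optimizing only the smallest retained eigenvalues, are omitted):

```python
import numpy as np

def lda_eigvals(H, y, eps=1e-3):
    """Generalized eigenvalues of the between/within scatter matrices of
    features H (n x d) with integer labels y. In DeepLDA these features
    are the output of a neural network, and this quantity is maximized
    instead of per-sample label likelihood."""
    classes = np.unique(y)
    mu = H.mean(axis=0)
    d = H.shape[1]
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for c in classes:
        Hc = H[y == c]
        mc = Hc.mean(axis=0)
        Sw += (Hc - mc).T @ (Hc - mc)                 # within-class scatter
        Sb += len(Hc) * np.outer(mc - mu, mc - mu)    # between-class scatter
    Sw += eps * np.eye(d)  # regularize the within-class scatter
    evals = np.linalg.eigvals(np.linalg.solve(Sw, Sb))
    return np.sort(evals.real)[::-1]
```

With C classes, Sb has rank at most C-1, so at most C-1 eigenvalues are nonzero; larger values mean better-separated latent representations.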
New Nonlinear Multigrid Analysis
Xie, Dexuan
1996-01-01
The nonlinear multigrid is an efficient algorithm for solving the system of nonlinear equations arising from the numerical discretization of nonlinear elliptic boundary problems. In this paper, we present a new nonlinear multigrid analysis as an extension of the linear multigrid theory presented by Bramble. In particular, we prove the convergence of the nonlinear V-cycle method for a class of mildly nonlinear second order elliptic boundary value problems which do not have full elliptic regularity.
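The nonlinear V-cycle referred to here is commonly realized as the Full Approximation Scheme (FAS), whose coarse-grid problem carries the full solution rather than just a correction. Writing $N_h$ for the fine-grid nonlinear operator, $I_h^H$ for restriction (fine to coarse), and $I_H^h$ for prolongation, one level of the cycle solves (a standard textbook formulation, stated for orientation rather than quoted from the paper):

```latex
% FAS coarse-grid equation: solve for the full coarse approximation
% \hat{u}_H, then prolongate only the correction back to the fine grid.
N_H(\hat{u}_H) = N_H\!\left(I_h^H u_h\right) + I_h^H\left(f_h - N_h(u_h)\right),
\qquad
u_h \leftarrow u_h + I_H^h\left(\hat{u}_H - I_h^H u_h\right).
```

For a linear operator this reduces to the usual coarse-grid correction; sandwiched between pre- and post-smoothing sweeps it yields the V-cycle whose convergence is analyzed.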
Givoni, Inmar; Cheung, Vincent; Frey, Brendan J.
2012-01-01
Many tasks require finding groups of elements in a matrix of numbers, symbols or class likelihoods. One approach is to use efficient bi- or tri-linear factorization techniques including PCA, ICA, sparse matrix factorization and plaid analysis. These techniques are not appropriate when addition and multiplication of matrix elements are not sensibly defined. More directly, methods like bi-clustering can be used to classify matrix elements, but these methods make the overly-restrictive assumptio...
SAMPLING AND ANALYSIS PROTOCOLS
Jannik, T; P Fledderman, P
2007-02-09
Radiological sampling and analyses are performed to collect data for a variety of specific reasons covering a wide range of projects. These activities include: Effluent monitoring; Environmental surveillance; Emergency response; Routine ambient monitoring; Background assessments; Nuclear license termination; Remediation; Deactivation and decommissioning (D&D); and Waste management. In this chapter, effluent monitoring and environmental surveillance programs at nuclear operating facilities and radiological sampling and analysis plans for remediation and D&D activities will be discussed.
This Analysis/Model Report (AMR) documents an analysis that was performed to estimate climatic variables for the next 10,000 years by forecasting the timing and nature of climate change at Yucca Mountain (YM), Nevada (Figure 1), the site of a potential repository for high-level radioactive waste. The future-climate estimates are based on an analysis of past-climate data from analog meteorological stations, and this AMR provides the rationale for the selection of these analog stations. The stations selected provide an upper and a lower climate bound for each future climate, and the data from those sites will provide input to the infiltration model (USGS 2000) and for the total system performance assessment for the Site Recommendation (TSPA-SR) at YM. Forecasting long-term future climates, especially for the next 10,000 years, is highly speculative and rarely attempted. A very limited literature exists concerning the subject, largely from the British radioactive waste disposal effort. The discussion presented here is one method, among many, of establishing upper and lower bounds for future climate estimates. The method used here involves selecting a particular past climate from many past climates, as an analog for future climate. Other studies might develop a different rationale or select other past climates resulting in a different future climate analog. Revision 00 of this AMR was prepared in accordance with the ''Work Direction and Planning Document for Future Climate Analysis'' (Peterman 1999) under Interagency Agreement DE-AI08-97NV12033 with the U.S. Department of Energy (DOE). The planning document for the technical scope, content, and management of ICN 01 of this AMR is the ''Technical Work Plan for Unsaturated Zone (UZ) Flow and Transport Process Model Report'' (BSC 2001a). The scope for the TBV resolution actions in this ICN is described in the ''Technical Work Plan for: Integrated Management of Technical Product Input Department'' (BSC 2001b, Addendum B).
KYU-HWAN; YANG
2001-01-01
Risk analysis is a useful tool for making good decisions on the risks of certain potentially hazardous agents and suggests a safe margin through scientific processes using toxicological data, contaminant residue levels, statistical tools, exposure values and relevant variants. Risk managers consider scientific evidence and risk estimates, along with statutory, engineering, economic, social, and political factors, in evaluating alternative regulatory options and choosing among those options (NRC, 1983).
KYU-HWAN; YANG
2001-01-01
Risk analysis is a useful tool for making good decisions on the risks of certain potentially hazardous agents and suggests a safe margin through scientific processes using toxicological data, contaminant residue levels, statistical tools, exposure values and relevant variants. Risk managers consider scientific evidence and risk estimates, along with statutory, engineering, economic, social, and political factors, in evaluating alternative regulatory options and choosing among those options (NRC, 1983).
Šemberová, Veronika
2014-01-01
Milk and milk products are important sources of protein, vitamins and minerals that are hard to substitute in human nutrition. In the last two decades agriculture underwent several changes and the size of the cattle herd decreased. The share of imports in the consumption of milk and milk products increased while, at the same time, exports of raw milk grew. Self-sufficiency in milk production thus decreased from 118 % to 103 % between 2004 and 2009. The main aim of this thesis called Analysis of the sector ...
Rangayyan, Rangaraj M
2015-01-01
The book assists the reader in developing techniques for the analysis of biomedical signals and computer-aided diagnosis, with a pedagogical examination of basic and advanced topics accompanied by over 350 figures and illustrations. A wide range of filtering techniques is presented to address various applications, together with 800 mathematical expressions and equations, practical questions, problems and laboratory exercises. Includes fractals and chaos theory with biomedical applications.
Bayesian Benchmark Dose Analysis
Fang, Qijun; Piegorsch, Walter W.; Barnes, Katherine Y.
2014-01-01
An important objective in environmental risk assessment is estimation of minimum exposure levels, called Benchmark Doses (BMDs) that induce a pre-specified Benchmark Response (BMR) in a target population. Established inferential approaches for BMD analysis typically involve one-sided, frequentist confidence limits, leading in practice to what are called Benchmark Dose Lower Limits (BMDLs). Appeal to Bayesian modeling and credible limits for building BMDLs is far less developed, however. Indee...
Dental Forensics: Bitemark Analysis
Elza Ibrahim Auerkari
2013-01-01
Forensic odontology (dental forensics) can provide useful evidence in both criminal and civil cases, and therefore remains a part of the wider discipline of forensic science. As an example from the toolbox of forensic odontology, the practice and experience on bitemark analysis is reviewed here in brief. The principle of using visible bitemarks in crime victims or in other objects as evidence is fundamentally based on the observation that the detailed pattern of dental imprints tend to be pra...
Dental Forensics: Bitemark Analysis
Elza Ibrahim Auerkari
2013-06-01
Forensic odontology (dental forensics) can provide useful evidence in both criminal and civil cases, and therefore remains a part of the wider discipline of forensic science. As an example from the toolbox of forensic odontology, the practice and experience on bitemark analysis is reviewed here in brief. The principle of using visible bitemarks in crime victims or in other objects as evidence is fundamentally based on the observation that the detailed pattern of dental imprints tends to be practically unique for each individual. Therefore, finding such an imprint as a bitemark can bear a strong testimony that it was produced by the individual that has the matching dental pattern. However, the comparison of the observed bitemark and the suspected set of teeth will necessarily require human interpretation, and this is not infallible. Both technical challenges in the bitemarks and human errors in the interpretation are possible. To minimise such errors and to maximise the value of bitemark analysis, dedicated procedures and protocols have been developed, and the personnel taking care of the analysis need to be properly trained. In principle the action within the discipline should be conducted as in evidence-based dentistry, i.e. accepted procedures should have known error rates. Because of the involvement of human interpretation, even personal performance statistics may be required from legal expert statements. The requirements have been introduced largely due to cases where false convictions based on bitemark analysis have been overturned after DNA analysis. DOI: 10.14693/jdi.v15i2.76
This paper presents a variational method for the limit analysis of an ideal plastic solid. The method, denominated Modified Secondary Creep, enables the collapse loads to be found through minimization of a functional and a limit process. Given an ideal plastic material, it is shown how to determine the associated secondary creep constitutive equation. Finally, as an application, the limit load of a pressurized von Mises rigid plastic sphere is found. (Author)
Gelman, Andrew; Stern, Hal S; Dunson, David B; Vehtari, Aki; Rubin, Donald B
2013-01-01
FUNDAMENTALS OF BAYESIAN INFERENCE: Probability and Inference; Single-Parameter Models; Introduction to Multiparameter Models; Asymptotics and Connections to Non-Bayesian Approaches; Hierarchical Models. FUNDAMENTALS OF BAYESIAN DATA ANALYSIS: Model Checking; Evaluating, Comparing, and Expanding Models; Modeling Accounting for Data Collection; Decision Analysis. ADVANCED COMPUTATION: Introduction to Bayesian Computation; Basics of Markov Chain Simulation; Computationally Efficient Markov Chain Simulation; Modal and Distributional Approximations. REGRESSION MODELS: Introduction to Regression Models; Hierarchical Linear
Sociology and Systems Analysis
Becker, H.A.
1982-01-01
The Management and Technology (MMT) Area of IIASA organizes, from time to time, seminars on topics that are of interest in connection with the work at the Institute. Since MMT sees the importance of investigating the broader management aspects when using systems analytical tools, it was of great interest to have Professor Henk Becker from the University of Utrecht give a seminar on "Sociology of Systems Analysis". As his presentation at this seminar should be of interest to a wider audie...
Heemink, Arnold; de Jong, Franciska; Prins, Harrie
1991-01-01
In this paper we describe a new approach to the harmonic analysis of the tide. For a number of reasons the harmonic constants are not really constant but vary slowly in time. Therefore, we introduce a narrow-band noise process to model the time-varying behaviour of these harmonic parameters. Furthermore, since the measurements available are not perfect, we also introduce a, possibly time-varying, measurement noise process to model the errors associated with the measurement process. By employi...
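The idea of tracking slowly varying harmonic constants under measurement noise can be sketched with a minimal Kalman filter for a single tidal constituent (a toy formulation of our own: the amplitude pair (a, b) follows a random walk, standing in for the paper's narrow-band noise process):

```python
import numpy as np

def kalman_harmonic(z, t, omega, q=1e-6, r=0.01):
    """Track slowly varying harmonic constants (a, b) in
    z_t = a_t*cos(omega*t) + b_t*sin(omega*t) + noise,
    modelling (a, b) as a random walk with process-noise variance q."""
    x = np.zeros(2)        # state: [a, b]
    P = np.eye(2)          # state covariance
    Q = q * np.eye(2)      # slow drift of the "constants"
    for zk, tk in zip(z, t):
        P = P + Q                          # predict (random walk: x unchanged)
        H = np.array([np.cos(omega * tk), np.sin(omega * tk)])
        S = H @ P @ H + r                  # innovation variance
        K = P @ H / S                      # Kalman gain
        x = x + K * (zk - H @ x)           # measurement update
        P = P - np.outer(K, H @ P)
    return x
```

With q set to zero this reduces to a recursive least-squares fit of fixed harmonic constants; a small positive q lets the estimates follow their slow variation in time.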
Rajiv K Gupta; Thallam V Padmanabhan
2011-01-01
Initial stability at placement and the development of osseointegration are two major issues for implant survival. Implant stability is a mechanical phenomenon related to the local bone quality and quantity, the type of implant, and the placement technique used. The application of a simple, clinically applicable, non-invasive test to assess implant stability and osseointegration is considered highly desirable. Resonance frequency analysis (RFA) is one such technique which is most frequent...
Anonymous
2001-01-01
This review introduces the history and present status of data envelopment analysis (DEA) research, particularly the evaluation process, and describes extensions of some DEA models. It is pointed out that mathematics, economics and management science are the main forces in DEA development, that optimization provides the fundamental method for DEA research, and that the wide range of applications drives the rapid development of DEA.
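As a minimal illustration of the optimization viewpoint, the CCR (constant returns to scale) DEA model in the single-input, single-output case reduces to each unit's output/input ratio normalized by the best observed ratio (the general multi-input, multi-output case requires solving one linear program per unit, which this sketch deliberately avoids):

```python
import numpy as np

def ccr_efficiency(inputs, outputs):
    """CCR DEA efficiency for the single-input, single-output case:
    each decision-making unit's output/input ratio, normalised by
    the best ratio observed across all units."""
    ratio = np.asarray(outputs, float) / np.asarray(inputs, float)
    return ratio / ratio.max()
```

Units with efficiency 1 lie on the empirical frontier; the rest are dominated by some combination of frontier units.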
PROFITABILITY ANALYSIS OF TOURIST HOSTELS
Cristiana Tindeche; Romeo Catalin Cretu
2015-01-01
This study aims at a comparative analysis of the economic efficiency of the Confort Pension located in a rural area and the Danacris Pension from the urban area. The reason for choosing these two units is that the types of tourism they represent are significant areas of operation, namely leisure tourism (the "Confort" Pension) in the Suceava area and business tourism (the "Danacris" Pension) in Bucharest. Based on the existing methodology in the specialized literature we computed specific indic...
Gong, Shaogang
2011-01-01
This book presents a comprehensive treatment of visual analysis of behaviour from computational-modelling and algorithm-design perspectives. This title: covers learning-group activity models, unsupervised behaviour profiling, hierarchical behaviour discovery, learning behavioural context, modelling rare behaviours, and 'man-in-the-loop' active learning; examines multi-camera behaviour correlation, person re-identification, and 'connecting-the-dots' for abnormal behaviour detection; discusses Bayesian information criterion, Bayesian networks, 'bag-of-words' representation, canonical correlation
ORGANISATIONAL CULTURE ANALYSIS MODEL
Mihaela Simona Maracine
2012-01-01
The studies and research undertaken have demonstrated the importance of studying organisational culture because of its practical value and because it contributes to increasing the organisation’s performance. The analysis of the organisational culture’s dimensions allows observing human behaviour within the organisation and highlighting reality, identifying the strengths and also the weaknesses which have an impact on its functionality and development. In this paper, we try to...
Liu, Jianghong
2004-01-01
The concept of aggression is important to nursing because further knowledge of aggression can help generate a better theoretical model to drive more effective intervention and prevention approaches. This paper outlines a conceptual analysis of aggression. First, the different forms of aggression are reviewed, including the clinical classification and the stimulus-based classification. Then the manifestations and measurement of aggression are described. Finally, the causes and consequences of ...
Adams, David; Branco, Miguel; Albrand, Solveig; Rybkine, G.; Orellana, F.; Liko, D.; Tan, C.L.; Deng, W.; Kannan, C.; Harrison, Karl; Fassi, Farida; Fulachier, J.; Chetan, N.; Haeberli, C.; Soroko, A.
2004-01-01
The ATLAS distributed analysis (ADA) system is described. The ATLAS experiment has more than 2000 physicists from 150 institutions in 34 countries. Users, data and processing are distributed over these sites. ADA makes use of a collection of high-level web services whose interfaces are expressed in terms of AJDL (abstract job definition language), which includes descriptions of datasets, transformations and jobs. The high-level services are implemented using generic parts...
2016-01-01
This book provides an in-depth introduction and overview of current research in computational music analysis. Its seventeen chapters, written by leading researchers, collectively represent the diversity as well as the technical and philosophical sophistication of the work being done today in this intensely interdisciplinary field. A broad range of approaches are presented, employing techniques originating in disciplines such as linguistics, information theory, information retrieval, pattern r...
Kenya; Debt Sustainability Analysis
International Monetary Fund
2004-01-01
This report of the Debt Sustainability Analysis (DSA) indicates that the envisaged strategy of a partial substitution of domestic debt by increased inflows of external grants and concessional loans, as well as a rescheduling of external debt by the Paris and London Clubs, would facilitate the achievement of debt sustainability. The DSA also confirms that such a debt rescheduling could constitute an appropriate exit strategy for Kenya. The DSA also shows that debt sustainability would improve ...
SAMPLING AND ANALYSIS PROTOCOLS
Radiological sampling and analyses are performed to collect data for a variety of specific reasons covering a wide range of projects. These activities include: Effluent monitoring; Environmental surveillance; Emergency response; Routine ambient monitoring; Background assessments; Nuclear license termination; Remediation; Deactivation and decommissioning (D and D); and Waste management. In this chapter, effluent monitoring and environmental surveillance programs at nuclear operating facilities and radiological sampling and analysis plans for remediation and D and D activities will be discussed
Analysis of personnel management
Novotná, Petra
2012-01-01
The topic of this Bachelor's thesis is an analysis of personnel management in the company DIAMONDS INTERNATIONAL CORPORATION - D.I.C. plc., which is engaged in production and selling of diamond jewellery as well as the selling of invested diamonds. The thesis is divided into theoretical part, where the basic personnel activities are characterized, and into practical part, where the theoretical knowledge is compared to the practical reality in the company; in addition, the recommendations are ...
Salvesen, F.; Sandgren, J. [KanEnergi AS, Rud (Norway)
1997-12-31
The present energy situation in the target area is summarized: 20 million inhabitants without electricity in north-west Russia, 50 % of the people in the Baltics without electricity, very high technical skills; the biggest problem is financing. The energy situation, the advantages of the renewables, the restrictions, and examples of possible technical solutions are reviewed on the basis of short analyses and experience with the Baltics and Russia
Sroufe, Paul; Phithakkitnukoon, Santi; Dantu, Ram; Cangussu, João
2010-01-01
Email has become an integral part of everyday life. Without a second thought we receive bills, bank statements, and sales promotions, all to our inbox. Each email has hidden features that can be extracted. In this paper, we present a new mechanism to characterize an email without using content or context, called Email Shape Analysis. We explore the applications of the email shape by carrying out a case study: botnet detection, and two possible applications: spam filtering and social-context bas...
Contingency and behavior analysis
Lattal, Kennon A.
1995-01-01
The concept of contingency is central to theoretical discussions of learned behavior and in the application of learning research to problems of social significance. This paper reviews three aspects of the contingency concept as it has been developed by behavior analysts. The first is the empirical analysis of contingency through experimental studies of both human and nonhuman behavior. The second is the synthesis of experimental studies in theoretical and conceptual frameworks to yield a more...
Whelan, Paul F.; Ghita, O.
2008-01-01
This chapter presents a novel and generic framework for image segmentation using a compound image descriptor that encompasses both colour and texture information in an adaptive fashion. The developed image segmentation method extracts the texture information using low-level image descriptors (such as the Local Binary Patterns (LBP)) and colour information by using colour space partitioning. The main advantage of this approach is the analysis of the textured images at a micro-level using the l...
Analysis Bitcoin virtual currency
Potužník, Jan
2014-01-01
The goal of the submitted thesis “Analysis Bitcoin virtual currency” is to analyze the first decentralized digital currency called Bitcoin and become an active member of the mining process. The thesis is also supposed to analyze different decentralized digital currencies that were created based on Bitcoin and compare these currencies to Bitcoin. The practical part of this bachelor thesis is focused on the detailed description of how to become an active member of the mining process and if mini...
Sivakumar, R.; Ravindran, G.; Muthayya, M.; Lakshminarayanan, S.; Velmurughendran, C. U.
2005-01-01
Diabetic retinopathy is one of the common complications of diabetes. Unfortunately, in many cases the patient is not aware of any symptoms until it is too late for effective treatment. Through analysis of evoked potential response of the retina, the optical nerve, and the optical brain center, a way will be paved for early diagnosis of diabetic retinopathy and prognosis during the treatment process. In this paper, we present an artificial-neural-network-based method to classify diabetic retin...
Sivakumar, R.; Muthayya, M.; Lakshminarayanan, S.; Velmurughendran, C. U.
2003-01-01
Diabetic retinopathy is one of the common complications of diabetes. Unfortunately, in many cases the patient is not aware of any symptoms until it is too late for effective treatment. Analysis of the evoked potential response of the optic nerve and optical brain centre will pave the way for early diagnosis of diabetic retinopathy and prognosis during the treatment process. In this paper, we present a method to classify diabetic retinopathy subjects from changes in visual evoked potential spe...
Barsanti, M.L. [Naval Research Lab., Washington, DC (United States); Smutek, L.S. [Mission Research Corp., Newington, VA (United States); Armstrong, C.M. [Northrop Grumman Corp., Rolling Meadows, IL (United States)
1995-12-31
NRL studies of various microwave and millimeter wave amplifiers and the need to know and understand their characteristics have led to the development of a pulsed noise analysis system. This system is capable of measuring amplitude, phase, and total spectra of two signals simultaneously. It can also measure am/pm conversion, group or phase delay and linearity, and phase and amplitude jitter. The system downconverts the signal to be analyzed and then digitizes the intermediate frequency. The digitized sample is transferred to a personal computer equipped with LabVIEW, where all further processing and analysis takes place. It is digitally demodulated into I and Q channels, which can then be separated into amplitude and phase information. Discrete Fourier Transforms are used to display the spectral information. Measurements have been performed on conventional linear beam devices, such as commercially available TWT and klystron amplifiers, as well as on two gyro-devices, the NRL gyroklystron and gyrotwystron amplifiers. These measurements were expanded to include noise power dependency on various tube parameters such as beam current and alpha. The results of the analysis will be used as a basis for modifying the gyrotwystron amplifier to improve its operating characteristics.
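The digital demodulation step can be sketched in NumPy (a generic textbook I/Q demodulator with a crude moving-average low-pass filter; the parameter names and filter choice are ours, not details of the NRL system):

```python
import numpy as np

def iq_demodulate(x, f_if, fs, lp_taps=101):
    """Demodulate a sampled IF signal into I and Q channels, then
    recover instantaneous amplitude and phase."""
    t = np.arange(len(x)) / fs
    i_raw = 2 * x * np.cos(2 * np.pi * f_if * t)   # mix down to baseband
    q_raw = -2 * x * np.sin(2 * np.pi * f_if * t)
    h = np.ones(lp_taps) / lp_taps                 # moving-average low-pass
    i_bb = np.convolve(i_raw, h, mode='same')      # reject the 2*f_if term
    q_bb = np.convolve(q_raw, h, mode='same')
    amplitude = np.hypot(i_bb, q_bb)
    phase = np.arctan2(q_bb, i_bb)
    return amplitude, phase
```

For x = A cos(2π f_if t + φ), the low-passed channels are I = A cos φ and Q = A sin φ, so the amplitude and phase fall out directly; spectra of either channel can then be displayed with a DFT as described above.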
This Analysis/Model Report (AMR) documents an analysis that was performed to estimate climatic variables for the next 10,000 years by forecasting the timing and nature of climate change at Yucca Mountain (YM), Nevada (Figure 1), the site of a potential repository for high-level radioactive waste. The future-climate estimates are based on an analysis of past-climate data from analog meteorological stations, and this AMR provides the rationale for the selection of these analog stations. The stations selected provide an upper and a lower climate bound for each future climate, and the data from those sites will provide input to the infiltration model (USGS 2000) and for the total system performance assessment for the Site Recommendation (TSPA-SR) at YM. Forecasting long-term future climates, especially for the next 10,000 years, is highly speculative and rarely attempted. A very limited literature exists concerning the subject, largely from the British radioactive waste disposal effort. The discussion presented here is one method, among many, of establishing upper and lower bounds for future climate estimates. The method used here involves selecting a particular past climate from many past climates, as an analog for future climate. Other studies might develop a different rationale or select other past climates resulting in a different future climate analog.
Medical Image Analysis Facility
1978-01-01
To improve the quality of photos sent to Earth by unmanned spacecraft, NASA's Jet Propulsion Laboratory (JPL) developed a computerized image enhancement process that brings out detail not visible in the basic photo. JPL is now applying this technology to biomedical research in its Medical Image Analysis Facility, which employs computer enhancement techniques to analyze x-ray films of internal organs, such as the heart and lung. A major objective is study of the effects of stress on persons with heart disease. In animal tests, computerized image processing is being used to study coronary artery lesions and the degree to which they reduce arterial blood flow when stress is applied. The photos illustrate the enhancement process. The upper picture is an x-ray photo in which the artery (dotted line) is barely discernible; in the post-enhancement photo at right, the whole artery and the lesions along its wall are clearly visible. The Medical Image Analysis Facility offers a faster means of studying the effects of complex coronary lesions in humans, and the research now being conducted on animals is expected to have important application to diagnosis and treatment of human coronary disease. Other uses of the facility's image processing capability include analysis of muscle biopsy and pap smear specimens, and study of the microscopic structure of fibroprotein in the human lung. Working with JPL on experiments are NASA's Ames Research Center, the University of Southern California School of Medicine, and Rancho Los Amigos Hospital, Downey, California.
Generalized Linear Covariance Analysis
Carpenter, James R.; Markley, F. Landis
2014-01-01
This talk presents a comprehensive approach to filter modeling for generalized covariance analysis of both batch least-squares and sequential estimators. We review and extend in two directions the results of prior work that allowed for partitioning of the state space into "solve-for" and "consider" parameters, accounted for differences between the formal values and the true values of the measurement noise, process noise, and a priori solve-for and consider covariances, and explicitly partitioned the errors into subspaces containing only the influence of the measurement noise, process noise, and solve-for and consider covariances. In this work, we explicitly add sensitivity analysis to this prior work, and relax an implicit assumption that the batch estimator's epoch time occurs prior to the definitive span. We also apply the method to an integrated orbit and attitude problem, in which gyro and accelerometer errors, though not estimated, influence the orbit determination performance. We illustrate our results using two graphical presentations, which we call the "variance sandpile" and the "sensitivity mosaic," and we compare the linear covariance results to confidence intervals associated with ensemble statistics from a Monte Carlo analysis.
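The baseline operation behind any linear covariance analysis, propagating an error covariance through linear dynamics with process noise, can be sketched as follows (a generic example; it does not reproduce the talk's solve-for/consider partitioning or its graphical presentations):

```python
import numpy as np

def linear_covariance(F, P0, Q, steps):
    """Propagate an error covariance through the linear dynamics
    x_{k+1} = F x_k + w_k, with w_k ~ N(0, Q)."""
    P = P0.copy()
    for _ in range(steps):
        P = F @ P @ F.T + Q
    return P
```

An ensemble of trajectories sampled from N(0, P0) and pushed through the same dynamics should reproduce this covariance to within sampling error, which is the linear-covariance-versus-Monte-Carlo comparison the talk closes with.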
The outputs from the drift degradation analysis support scientific analyses, models, and design calculations, including the following: (1) Abstraction of Drift Seepage; (2) Seismic Consequence Abstraction; (3) Structural Stability of a Drip Shield Under Quasi-Static Pressure; and (4) Drip Shield Structural Response to Rock Fall. This report has been developed in accordance with ''Technical Work Plan for: Regulatory Integration Modeling of Drift Degradation, Waste Package and Drip Shield Vibratory Motion and Seismic Consequences'' (BSC 2004 [DIRS 171520]). The drift degradation analysis includes the development and validation of rockfall models that approximate phenomenon associated with various components of rock mass behavior anticipated within the repository horizon. Two drift degradation rockfall models have been developed: the rockfall model for nonlithophysal rock and the rockfall model for lithophysal rock. These models reflect the two distinct types of tuffaceous rock at Yucca Mountain. The output of this modeling and analysis activity documents the expected drift deterioration for drifts constructed in accordance with the repository layout configuration (BSC 2004 [DIRS 172801]).
Exploration Laboratory Analysis
Krihak, M.; Ronzano, K.; Shaw, T.
2016-01-01
The Exploration Laboratory Analysis (ELA) project supports the Exploration Medical Capability (ExMC) risk to minimize or reduce the risk of adverse health outcomes and decrements in performance due to limitations of in-flight medical capabilities on human exploration missions. To mitigate this risk, the availability of in-flight laboratory analysis instrumentation has been identified as an essential capability for manned exploration missions. Since a single, compact, space-ready laboratory analysis capability to perform all exploration clinical measurements is not commercially available, the ELA project objective is to demonstrate the feasibility of emerging operational and analytical capability as a biomedical diagnostics precursor to long duration manned exploration missions. The initial step towards ground and flight demonstrations in fiscal year (FY) 2015 was the down selection of platform technologies for demonstrations in the space environment. The technologies selected included two Small Business Innovation Research (SBIR) performers: the DNA Medicine Institute's rHEALTH X, and Intelligent Optical Systems' lateral flow assays combined with Holomic's smartphone analyzer. The selection of these technologies was based on their compact size, breadth of analytical capability and favorable ability to process fluids in a space environment, among several factors. These two technologies will be advanced to meet ground and flight demonstration success criteria and requirements that will be finalized in FY16. Also, the down-selected performers will continue the technology development phase towards meeting prototype deliverables in either late 2016 or 2017.
Andres, T.H
2002-05-01
This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)
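Two of the propagation routes the guide mentions, series approximation and Monte Carlo, can be compared on a toy scalar model (an illustrative sketch with our own function names, not part of the AECL guide):

```python
import numpy as np

def propagate(model, grad, mu, sigma, n=100000, rng=None):
    """Propagate an input uncertainty x ~ N(mu, sigma^2) through a scalar
    model two ways: a first-order series approximation of the output
    standard deviation, and a Monte Carlo sample standard deviation."""
    rng = rng or np.random.default_rng(0)
    series_sd = abs(grad(mu)) * sigma           # first-order (delta method)
    samples = model(rng.normal(mu, sigma, n))   # Monte Carlo
    return series_sd, samples.std()
```

For a mildly nonlinear model and a small input uncertainty the two estimates agree closely; their divergence as sigma grows is one signal that the series approximation is no longer adequate.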
Analysis of radioactive strontium
In environmental radiation survey, radioactive strontium has been analyzed in compliance with the manual ''Analyzing methods for radioactive strontium'' published in 1960 by the Science and Technology Agency, Japan, and revised in 1963. However, in the past decade, progress and development in analyzing methods and measuring equipment have been significant, therefore the manual was revised in 1974. Major revisions are as follows. (1) Analysis of 90Sr with long half life was changed to the main theme and that of 89Sr with short half life became a subordinate one. (2) Measuring criteria and sampling volume were revised. (3) Sample collection method was unified. (4) Analyzing method for soil was improved to the NaOH-HCl method which has a good recovery rate. (5) A 90Y separation method of simple operation was added for sea water analysis besides the EDTA and fuming nitric acid methods. (6) Flame spectrometry for quantitative analysis of stable strontium was revised to atomic absorption spectrometry. The contents of the manual comprise 11 chapters describing introduction, measuring criteria for 90Sr (89Sr), rain and dust, land water, sea water, soil, sea bottom and river bottom sediments (changed from human urine and human bones), crops, milk (the previous one chapter was divided into two), marine organisms, and everyday foods, respectively. (Wakatsuki, Y.)
Enhanced target factor analysis.
Rostami, Akram; Abdollahi, Hamid; Maeder, Marcel
2016-03-10
Target testing or target factor analysis, TFA, is a well-established soft analysis method. TFA answers the question whether an independent target test vector measured at the same wavelengths as the collection of spectra in a data matrix can be excluded as the spectrum of one of the components in the system under investigation. Essentially, TFA cannot positively prove that a particular test spectrum is the true spectrum of one of the components; it can only reject a spectrum. However, TFA will not reject, or in other words TFA will accept, many spectra which cannot be component spectra. Enhanced Target Factor Analysis, ETFA, addresses the above problem. Compared with traditional TFA, ETFA results in a significantly narrower range of positive results, i.e. the chance of a false positive test result is dramatically reduced. ETFA is based on feasibility testing as described in Refs. [16-19]. The method has been tested and validated with computer generated and real data sets. PMID:26893084
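The projection test at the core of classic TFA can be sketched with an SVD (a generic target test; ETFA's feasibility-based refinement is not reproduced here, and the tolerance is an arbitrary choice):

```python
import numpy as np

def target_test(D, target, n_factors, tol=1e-6):
    """Classic target test: project the candidate spectrum onto the
    abstract factor space of the data matrix D (rows = mixture spectra,
    columns = wavelengths); a large relative residual rejects the target."""
    U, s, Vt = np.linalg.svd(D, full_matrices=False)
    V = Vt[:n_factors].T                  # abstract spectral factors
    projected = V @ (V.T @ target)
    residual = np.linalg.norm(target - projected) / np.linalg.norm(target)
    return residual < tol, residual
```

A true component spectrum lies (up to noise) in the span of the abstract factors and yields a near-zero residual, while an arbitrary spectrum does not; as the abstract notes, passing this test is necessary but not sufficient, which is the gap ETFA narrows.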
Analysis of simulator training
A method has been developed for systematic observation of operator performance in nuclear training simulators which combines training and research. It is based on generally accepted theories of operator models and decision making developed at Risø and elsewhere. It makes explicitly available the data which experienced instructors implicitly use in their assessment of operator performance. This means that the feedback/debriefing function of the training is facilitated; it becomes possible to use normal training sessions to obtain data which can be used in further theoretical studies of, e.g., operator decision making; and the generalized description of operator performance may be used to evaluate the training program as such. The method of observation is designed in cooperation with the instructors so that it does not interfere with their normal work. It is based on a detailed prior analysis of experienced transients, leading to a description of an expected performance, and on transient-independent observation schemes, which are used to characterize points where the actual performance deviates from the expected performance. The analysis of the observations takes place according to the structure of a general model of analysis developed from numerous studies of operator performance, in real life and in simulators. (author)
Actinide isotopic analysis systems
This manual provides instructions and procedures for using the Lawrence Livermore National Laboratory's two-detector actinide isotope analysis system to measure plutonium samples with other possible actinides (including uranium, americium, and neptunium) by gamma-ray spectrometry. The computer program that controls the system and analyzes the gamma-ray spectral data is driven by a menu of one-, two-, or three-letter options chosen by the operator. Provided in this manual are descriptions of these options and their functions, plus detailed instructions (operator dialog) for choosing among the options. Also provided are general instructions for calibrating the actinide isotopic analysis system and for monitoring its performance. The inventory measurement of a sample's total plutonium and other actinide content is determined by two nondestructive measurements. One is a calorimetry measurement of the sample's heat or power output, and the other is a gamma-ray spectrometry measurement of its relative isotopic abundances. The isotopic measurements needed to interpret the observed calorimetric power measurement are the relative abundances of various plutonium and uranium isotopes and americium-241. The actinide analysis system carries out these measurements. 8 figs
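The calorimetry/isotopics combination described above reduces to a simple relation: total actinide mass equals the measured thermal power divided by an effective specific power computed from the relative isotopic abundances. A hedged sketch (the specific-power values are commonly tabulated figures quoted here for illustration only, and the sample composition is hypothetical; verify against current standards before any real use):

```python
# Specific powers in W/g (illustrative, commonly tabulated values --
# not taken from the manual; verify before real use).
SPECIFIC_POWER = {
    "Pu238": 0.5674, "Pu239": 0.0019288, "Pu240": 0.0070824,
    "Pu241": 0.0033423, "Pu242": 0.0001159, "Am241": 0.1142,
}

def effective_specific_power(mass_fractions):
    """Peff = sum over isotopes of (mass fraction x specific power), in W/g."""
    return sum(f * SPECIFIC_POWER[iso] for iso, f in mass_fractions.items())

def plutonium_mass(measured_power_w, mass_fractions):
    """Calorimetric assay: total mass = measured power / effective specific power."""
    return measured_power_w / effective_specific_power(mass_fractions)

# Hypothetical low-burnup-like composition (fractions are invented).
fractions = {"Pu238": 0.0001, "Pu239": 0.938, "Pu240": 0.058,
             "Pu241": 0.0035, "Pu242": 0.0004, "Am241": 0.0}
mass_g = plutonium_mass(2.5, fractions)   # 2.5 W measured by calorimetry
```

This is why the gamma-ray isotopic measurement is indispensable: the same 2.5 W corresponds to very different total masses depending on the isotopic mix.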
This guide applies to the estimation of uncertainty in quantities calculated by scientific analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner in which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)
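Monte Carlo propagation, one of the representations mentioned above, can be sketched by sampling the uncertain inputs and running the model repeatedly, then summarizing the output distribution. A toy example (the model function and input distributions are invented for illustration, not taken from the guide):

```python
import random
import statistics

def model(k, h):
    # Toy computer model: output depends nonlinearly on two uncertain inputs.
    return k * h ** 2

def monte_carlo(n=20000, seed=1):
    """Propagate input uncertainty by repeated sampling: draw each input
    from its distribution, run the model, and summarize the outputs."""
    rng = random.Random(seed)
    outs = [model(rng.gauss(2.0, 0.1), rng.gauss(3.0, 0.2)) for _ in range(n)]
    return statistics.mean(outs), statistics.stdev(outs)

mean, sd = monte_carlo()
```

The sample mean and standard deviation estimate the combined random uncertainty in the model output; systematic (model-form) uncertainty, as the abstract notes, must be assessed separately and combined with this.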
Deterministic uncertainty analysis
This paper presents a deterministic uncertainty analysis (DUA) method for calculating uncertainties that has the potential to significantly reduce the number of computer runs compared to conventional statistical analysis. The method is based upon the availability of derivative and sensitivity data such as that calculated using the well-known direct or adjoint sensitivity analysis techniques. Formation of response surfaces using derivative data and the propagation of input probability distributions are discussed relative to their role in the DUA method. A sample problem that models the flow of water through a borehole is used as a basis to compare the cumulative distribution function of the flow rate as calculated by the standard statistical methods and the DUA method. Propagation of uncertainties by the DUA method is compared for ten cases in which the number of reference model runs was varied from one to ten. The DUA method gives a more accurate representation of the true cumulative distribution of the flow rate based upon as few as two model executions, compared to fifty model executions using a statistical approach. 16 refs., 4 figs., 5 tabs
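The core idea of DUA, replacing many statistical samples with a reference run plus derivative (sensitivity) information, can be sketched with first-order propagation on a stand-in model. The function, its distributions, and the analytic sensitivities below are invented for illustration; this is not the paper's borehole model or its full response-surface construction:

```python
import math

def model(x1, x2):
    # Stand-in for a flow model (illustrative only).
    return x1 * math.exp(0.5 * x2)

def dua_first_order(mu1, s1, mu2, s2):
    """Derivative-based (first-order) propagation: one reference model run
    plus analytic sensitivities replaces many statistical samples.
    Returns the reference output and its propagated standard deviation."""
    y0 = model(mu1, mu2)                  # single reference model run
    dy_dx1 = math.exp(0.5 * mu2)          # sensitivity to x1 (direct/adjoint)
    dy_dx2 = 0.5 * mu1 * math.exp(0.5 * mu2)  # sensitivity to x2
    var = (dy_dx1 * s1) ** 2 + (dy_dx2 * s2) ** 2
    return y0, math.sqrt(var)

y0, sigma = dua_first_order(2.0, 0.1, 1.0, 0.2)
```

With the derivatives in hand, a single model execution yields an uncertainty estimate; the paper's method extends this idea to response surfaces that recover the full cumulative distribution from as few as two runs.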
[Mindfulness: A Concept Analysis].
Chen, Tsai-Ling; Chou, Fan-Hao; Wang, Hsiu-Hung
2016-04-01
"Mindfulness" is an emerging concept in the field of healthcare. Ranging from stress relief to psychotherapy, mindfulness has been confirmed to be an effective tool to help individuals manage depression, anxiety, obsessive-compulsive disorder, and other health problems in clinical settings. Scholars currently use various definitions for mindfulness. While some of these definitions overlap, significant differences remain and a general scholarly consensus has yet to be reached. Several domestic and international studies have explored mindfulness-related interventions and their effectiveness. However, the majority of these studies have focused on the fields of clinical medicine, consultation, and education. Mindfulness has rarely been applied in clinical nursing practice and no related systematic concept analysis has been conducted. This paper conducts a concept analysis of mindfulness using the concept analysis method proposed by Walker and Avant (2011). We describe the defining characteristics of mindfulness, clarify the concept, and confirm the predisposing factors and effects of mindfulness using examples of typical cases, borderline cases, related cases, and contrary case. Findings may provide nursing staff with an understanding of the concept of mindfulness for use in clinical practice in order to help patients achieve a comfortable state of body and mind healing. PMID:27026563
The accident at Three Mile Island Unit 2 (TMI-2) provides an opportunity to benchmark severe accident analysis methods against full-scale, integrated facility data. In collaboration with the U.S. Department of Energy (DOE), the OECD Nuclear Energy Agency established a joint task group to analyze various periods of the accident and benchmark the relevant severe accident codes. In this paper the author presents one result from the TMI-2 analysis exercise that may be of interest in evaluating thermal hydraulics codes. Arguably one of the more interesting aspects of the analysis results is the predicted pressurizer level response during core heatup. A number of conclusions may be drawn from the predictions of pressurizer level. First, the predicted pressurizer level after PORV block valve closure is determined by thermal hydraulics as well as the calculation of hydrogen generation. Second, it is possible to arrive at a non-conservative prediction of pressurizer level (pressurizer drainage). Although some aspects of severe accidents may not depend greatly on thermal hydraulics, it is the author's conclusion that predictions of severe accidents require severe accident models properly coupled to reliable thermal hydraulics models
R.M. Forester
2000-03-14
This Analysis/Model Report (AMR) documents an analysis that was performed to estimate climatic variables for the next 10,000 years by forecasting the timing and nature of climate change at Yucca Mountain (YM), Nevada (Figure 1), the site of a potential repository for high-level radioactive waste. The future-climate estimates are based on an analysis of past-climate data from analog meteorological stations, and this AMR provides the rationale for the selection of these analog stations. The stations selected provide an upper and a lower climate bound for each future climate, and the data from those sites will provide input to the infiltration model (USGS 2000) and to the total system performance assessment for the Site Recommendation (TSPA-SR) at YM. Forecasting long-term future climates, especially for the next 10,000 years, is highly speculative and rarely attempted. A very limited literature exists concerning the subject, largely from the British radioactive waste disposal effort. The discussion presented here is one method, among many, of establishing upper and lower bounds for future climate estimates. The method used here involves selecting a particular past climate from many past climates, as an analog for future climate. Other studies might develop a different rationale or select other past climates, resulting in a different future climate analog.
Monogan III, James E
2015-01-01
Political Analysis Using R can serve as a textbook for undergraduate or graduate students as well as a manual for independent researchers. It is unique among competitor books in its usage of 21 example datasets that are all drawn from political research. All of the data and example code is available from the Springer website, as well as from Dataverse (http://dx.doi.org/10.7910/DVN/ARKOTI). The book provides a narrative of how R can be useful for addressing problems common to the analysis of public administration, public policy, and political science data specifically, in addition to the social sciences more broadly. While the book uses data drawn from political science, public administration, and policy analyses, it is written so that students and researchers in other fields should find it accessible and useful as well. Political Analysis Using R is perfect for the first-time R user who has no prior knowledge about the program. By working through the first seven chapters of this book, an entry-level user sho...
Anastassiou, George A
2015-01-01
This is the first numerical analysis text to use Sage for the implementation of algorithms and can be used in a one-semester course for undergraduates in mathematics, math education, computer science/information technology, engineering, and physical sciences. The primary aim of this text is to simplify understanding of the theories and ideas from a numerical analysis/numerical methods course via a modern programming language like Sage. Aside from the presentation of fundamental theoretical notions of numerical analysis throughout the text, each chapter concludes with several exercises that are oriented to real-world application. Answers may be verified using Sage. The presented code, written in core components of Sage, is backward compatible, i.e., easily applicable to other software systems such as Mathematica®. Sage is open source software and uses Python-like syntax. Previous Python programming experience is not a requirement for the reader, though familiarity with any programming language is a p...
Market analysis. Renewable fuels
On behalf of the Federal Ministry of Food and Agriculture, the Agency for Renewable Resources (FNR) had a study created on the market development of renewable resources in Germany and published it in 2006. The aim of that study was to identify the current status and market performance of the individual market segments of material and energetic use, as a basis for policy recommendations for an accelerated and lastingly successful market launch and expansion of the market share of renewable raw materials. On behalf of the FNR, a market analysis was carried out from mid-2011 until the beginning of 2013, the results of which are presented here. This market analysis covers all markets of material and energetic use in a global context, taking account of possible competing uses. A market segmentation based on the product classification of the Federal Statistical Office formed the basis of the analysis. A total of ten markets were defined: seven for material use and three for energetic use.
Regional Shelter Analysis Methodology
Dillon, Michael B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dennison, Deborah [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kane, Jave [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Walker, Hoyt [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Miller, Paul [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)
2015-08-01
The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
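The population-weighted combination at the heart of a Regional Shelter Analysis can be sketched as follows: each shelter category contributes its population fraction divided by its protection factor. All the fractions and protection factors below are hypothetical illustrations, not values from the report:

```python
def population_weighted_dose(outdoor_dose, occupancy):
    """Combine building protection with where people actually are.
    occupancy maps a shelter category to (population_fraction,
    protection_factor); dose inside = outdoor dose / protection factor."""
    assert abs(sum(f for f, _ in occupancy.values()) - 1.0) < 1e-9
    return sum(f * outdoor_dose / pf for f, pf in occupancy.values())

# Hypothetical daytime posture for one region (all numbers illustrative).
occupancy = {
    "outdoors":        (0.10, 1.0),
    "wood_frame_home": (0.50, 3.0),
    "masonry_office":  (0.30, 10.0),
    "basement":        (0.10, 40.0),
}
avg_dose = population_weighted_dose(100.0, occupancy)  # 100 dose units outdoors
```

Varying the occupancy table by time of day or warning posture, as the methodology describes, changes the weights and hence the regional average exposure.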
The radiometric method of analysis is noted for its sensitivity and its simplicity in both apparatus and procedure. A few inexpensive radioactive reagents permit the analysis of a wide variety of chemical elements and compounds. Any particular procedure is generally applicable over a very wide range of concentrations. It is potentially an analytical method of great industrial significance. Specific examples of analyses are cited to illustrate the potentialities of ordinary equipment. Apparatus specifically designed for radiometric chemistry may shorten the time required, and increase the precision and accuracy for routine analyses. A sensitive and convenient apparatus for the routine performance of radiometric chemical analysis is a special type of centrifuge which has been used in obtaining the data presented in this paper. The radioactivity of the solution is measured while the centrifuge is spinning. This device has been used as the basis for an automatic analyser for phosphate ion, programmed to follow a sequence of unknown sampling, reagent mixing, centrifugation, counting, data presentation, and phosphate replenishment. This analyser can repeatedly measure phosphate concentration in the range of 5 to 50 ppm with an accuracy of ±5%. (author)
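Routine quantitative use of such an analyser rests on an ordinary calibration curve: count rates from standards of known concentration define a line that is then inverted for unknowns. A generic sketch (the standards and counter readings are made up; the paper's 5-50 ppm phosphate range is used only to pick plausible numbers):

```python
def fit_line(xs, ys):
    """Least-squares calibration line: counts = a + b * concentration."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

def concentration_from_counts(counts, a, b):
    """Invert the calibration line for an unknown sample."""
    return (counts - a) / b

# Hypothetical phosphate standards across the 5-50 ppm working range.
standards_ppm = [5, 10, 20, 30, 50]
counts = [520, 1010, 2030, 2980, 5050]   # invented counter readings
a, b = fit_line(standards_ppm, counts)
unknown_ppm = concentration_from_counts(2500, a, b)
```

In an automated analyser this fit would be refreshed periodically against fresh standards, which is one way the stated ±5% accuracy can be maintained over long runs.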
Methods have been developed for the determination of Np over a wide range of concentrations and for the elucidation of the oxidation number of Np in spent fuel solutions and reactor water. Precise methods for the analysis of U and Pu make it possible to evaluate Np amounts relative to the U and Pu contents. For the determination of Np traces, isotope dilution activation analysis (IDAA) was developed. Two variants were applied, using 239Np or 238Np as tracers before activation. Separations were carried out with the help of isotope dilution. The yield achieved allows determination within an accuracy of ±5%. Using polarography and voltammetry, an interaction between U and Np ions in acetate solution was found. The increase in the limiting diffusion current of the Np(IV/III) step is caused by the reaction of U(III) with Np(IV) at the surface of the electrode. In this way the polarographic determination of small amounts (0.05 mg) of Np in the presence of U is possible. By voltammetry, the determination of Np in various oxidation states was investigated. In non-complexing solutions a redox reaction of Np(VI/V) takes place, but no reduction to Np(III) could be observed, as was found in complexing media at the glassy-carbon electrode. For the determination of Pu, alpha-spectroscopic isotope dilution analysis was developed, and for U in various concentrations potentiostatic coulometry, potentiometric titration, and IDAA were developed.
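The isotope-dilution step used in these methods corrects for losses during chemical separation: a known tracer activity added before separation fixes the chemical yield, which then scales the measured analyte amount. A minimal sketch (all numbers are illustrative, not from the paper):

```python
def yield_corrected_amount(measured_amount, tracer_added, tracer_recovered):
    """Isotope dilution: a known tracer activity is added before the
    chemical separation; the recovered fraction gives the chemical
    yield, which corrects the measured amount for separation losses."""
    chemical_yield = tracer_recovered / tracer_added
    return measured_amount / chemical_yield

# Hypothetical 239Np tracer: 500 Bq added, 400 Bq recovered -> 80 % yield.
np_amount = yield_corrected_amount(0.040, 500.0, 400.0)  # mg Np measured
```

Because only the ratio of recovered to added tracer matters, the separation need not be quantitative, which is what allows the stated ±5% accuracy despite incomplete recoveries.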