Kaufmann, F; Wohlfarth, G; Diekert, G
1998-05-01
The ether-cleaving O-demethylase isolated from syringate-grown cells of Acetobacterium dehalogenans (formerly named strain MC) consists of four proteins, components A, B, C and D. The enzyme system converts only phenyl methyl ethers with a hydroxyl group in the ortho position to the methoxyl moiety. The presence of a carboxyl group in the aromatic compound was not required for the O-demethylase reaction. Component B mediated the conversion of vanillate to 3,4-dihydroxybenzoate in the presence of the Ti(III)-reduced corrinoid-containing component A. After addition of component D and tetrahydrofolate, methyltetrahydrofolate was formed from vanillate in stoichiometric amounts. Titanium(III) citrate as a reductant could be replaced by H2, methyl viologen or ferredoxin, partially purified hydrogenase, purified component C obtained from A. dehalogenans, and ATP. From these findings, it was deduced that component B serves as vanillate:corrinoid protein methyltransferase (methyltransferase I), mediating the methyl transfer from vanillate to the reduced corrinoid protein component A. Component D functions as methylcorrinoid protein:tetrahydrofolate methyltransferase (methyltransferase II). The role of component C is probably that of an activating protein reversing accidental oxidation of the protein-bound cob(I)alamin to cob(II)alamin in the presence of ATP and reducing equivalents supplied by the enzymatic oxidation of hydrogen.
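The two methyl-transfer steps described in this abstract can be sketched schematically as follows (a reconstruction from the text above, not the authors' notation; component names in parentheses, cofactor oxidation states as given in the abstract):

```latex
% Step 1: methyltransferase I (component B) methylates the reduced corrinoid protein (component A)
\mathrm{vanillate} + \mathrm{Co^{I}\text{-}A}
  \xrightarrow{\;\text{MT I (B)}\;}
  \text{3,4-dihydroxybenzoate} + \mathrm{CH_3\text{-}Co^{III}\text{-}A}
% Step 2: methyltransferase II (component D) passes the methyl group to tetrahydrofolate (THF)
\mathrm{CH_3\text{-}Co^{III}\text{-}A} + \mathrm{THF}
  \xrightarrow{\;\text{MT II (D)}\;}
  \mathrm{Co^{I}\text{-}A} + \mathrm{CH_3\text{-}THF}
```

Component C would then act outside this cycle, rescuing accidentally oxidized cob(II)alamin back to the active cob(I)alamin state.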
Kaufmann, F; Wohlfarth, G; Diekert, G
1998-10-15
The ether-cleaving O-demethylase from the strictly anaerobic homoacetogen Acetobacterium dehalogenans catalyses the methyl transfer from 4-hydroxy-3-methoxybenzoate (vanillate) to tetrahydrofolate. In the first step a vanillate:corrinoid protein methyltransferase (methyltransferase I) mediates the methylation of a 25-kDa corrinoid protein with the cofactor reduced to cob(I)alamin. The methyl group is then transferred to tetrahydrofolate by the action of a methylcorrinoid protein:tetrahydrofolate methyltransferase (methyltransferase II). Using primers derived from the amino-terminal sequences of the corrinoid protein and the vanillate:corrinoid protein methyltransferase (methyltransferase I), a 723-bp fragment was amplified by PCR, which contained the gene odmA encoding the corrinoid protein of O-demethylase. Downstream of odmA, part of the odmB gene encoding methyltransferase I was identified. The amino acid sequence deduced from odmA showed about 60% similarity to the cobalamin-binding domain of methionine synthase from Escherichia coli (MetH) and to corrinoid proteins of methyltransferase systems involved in methanogenesis from methanol and methylamines. The sequence contained the DXHXXG consensus sequence typical for displacement of the dimethylbenzimidazole base of the corrinoid cofactor by a histidine from the protein. Heterologous expression of odmA in E. coli yielded a colourless, oxygen-insensitive apoprotein, which was able to bind one mol of cobalamin or methylcobalamin per mol of protein. Both of these reconstituted forms of the protein were active in the overall O-demethylation reaction. OdmA reconstituted with hydroxocobalamin and reduced by titanium(III) citrate to the cob(I)alamin form was methylated with vanillate by methyltransferase I in an irreversible reaction. OdmA carrying methylcobalamin served as methyl group donor for the methylation of tetrahydrofolate by methyltransferase II. This reaction was found to be reversible, since methyltransferase II ...
Kruse, T.; Pas, van de B.A.; Atteia, A.; Krab, K.; Hagen, W.R.; Goodwin, L.; Chain, P.; Boeren, S.; Maphosa, F.; Schraa, G.; Vos, de W.M.; Oost, van der J.; Smidt, H.; Stams, A.J.M.
2015-01-01
Desulfitobacterium dehalogenans is able to grow by organohalide respiration using 3-chloro-4-hydroxyphenyl acetate (Cl-OHPA) as an electron acceptor. We used a combination of genome sequencing, biochemical analysis of redox active components and shotgun proteomics to study elements of the organohali...
Pokkuluri, P Raj; Dwulit-Smith, Jeff; Duke, Norma E; Wilton, Rosemarie; Mack, Jamey C; Bearden, Jessica; Rakowski, Ella; Babnigg, Gyorgy; Szurmant, Hendrik; Joachimiak, Andrzej; Schiffer, Marianne
2013-10-01
Anaeromyxobacter dehalogenans is a δ-proteobacterium found in diverse soils and sediments. It is of interest in bioremediation efforts due to its dechlorination and metal-reducing capabilities. To gain an understanding on A. dehalogenans' abilities to adapt to diverse environments we analyzed its signal transduction proteins. The A. dehalogenans genome codes for a large number of sensor histidine kinases (HK) and methyl-accepting chemotaxis proteins (MCP); among these 23 HK and 11 MCP proteins have a sensor domain in the periplasm. These proteins most likely contribute to adaptation to the organism's surroundings. We predicted their three-dimensional folds and determined the structures of two of the periplasmic sensor domains by X-ray diffraction. Most of the domains are predicted to have either PAS-like or helical bundle structures, with two predicted to have solute-binding protein fold, and another predicted to have a 6-phosphogluconolactonase like fold. Atomic structures of two sensor domains confirmed the respective fold predictions. The Adeh_2942 sensor (HK) was found to have a helical bundle structure, and the Adeh_3718 sensor (MCP) has a PAS-like structure. Interestingly, the Adeh_3718 sensor has an acetate moiety bound in a binding site typical for PAS-like domains. Future work is needed to determine whether Adeh_3718 is involved in acetate sensing by A. dehalogenans. PMID:23897711
Smidt, H.; Leest, de, H.T.J.I.; Oost, van der, J.; De Vos
2000-01-01
To characterize the expression and possible regulation of reductive dehalogenation in halorespiring bacteria, an 11.5-kb genomic fragment containing the o-chlorophenol reductive dehalogenase-encoding cprBA genes of the gram-positive bacterium Desulfitobacterium dehalogenans was subjected to detailed molecular characterization. Sequence analysis revealed the presence of eight designated genes with the order cprTKZEBACD and with the same polarity except for cprT. The deduced cprC and cprK gene p...
Wiegel, Juergen; Zhang, Xiaoming; Wu, Qingzhong
1999-01-01
Ten years after reports on the existence of anaerobic dehalogenation of polychlorinated biphenyls (PCBs) in sediment slurries, we report here on the rapid reductive dehalogenation of para-hydroxylated PCBs (HO-PCBs), the excreted main metabolites of PCB in mammals, which can exhibit estrogenic and antiestrogenic activities in humans. The anaerobic bacterium Desulfitobacterium dehalogenans completely dehalogenates all flanking chlorines (chlorines in ortho position to the para-hydroxyl group) ...
Transformation of tetrachloromethane to dichloromethane and carbon dioxide by Acetobacterium woodii
International Nuclear Information System (INIS)
Five anaerobic bacteria were tested for their abilities to transform tetrachloromethane so that information about enzymes involved in reductive dehalogenations of polychloromethanes could be obtained. Cultures of the sulfate reducer Desulfobacterium autotrophicum transformed some 80 μM tetrachloromethane to trichloromethane and a small amount of dichloromethane in 18 days under conditions of heterotrophic growth. The acetogens Acetobacterium woodii and Clostridium thermoaceticum in fructose-salts and glucose-salts media, respectively, degraded some 80 μM tetrachloromethane completely within 3 days. Trichloromethane accumulated as a transient intermediate, but the only chlorinated methanes recovered at the end of the incubation were 8 μM dichloromethane and traces of chloromethane. Desulfobacter hydrogenophilus and an autotrophic, nitrate-reducing bacterium were unable to transform tetrachloromethane. Reduction of chlorinated methanes was thus observed only in the organisms with the acetyl-coenzyme A pathway. Experiments with [14C]tetrachloromethane were done to determine the fate of this compound in the acetogen A. woodii. Radioactivity in an 11-day heterotrophic culture was largely (67%) recovered in CO2, acetate, pyruvate, and cell material. In experiments with cell suspensions to which [14C]tetrachloromethane was added, 14CO2 appeared within 20 s as the major transformation product. A. woodii thus catalyzes reductive dechlorinations and transforms tetrachloromethane to CO2 by a series of unknown reactions.
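The product pattern reported in this abstract implies the standard stepwise reductive dechlorination series, each step consuming two electrons and releasing chloride (a sketch consistent with the text, not the authors' notation):

```latex
\mathrm{CCl_4}
  \xrightarrow{+2e^-,\,+\mathrm{H}^+,\,-\mathrm{Cl}^-} \mathrm{CHCl_3}
  \xrightarrow{+2e^-,\,+\mathrm{H}^+,\,-\mathrm{Cl}^-} \mathrm{CH_2Cl_2}
  \xrightarrow{+2e^-,\,+\mathrm{H}^+,\,-\mathrm{Cl}^-} \mathrm{CH_3Cl}
```

In A. woodii this reductive route accounts only for the minor chlorinated products; the 14C data show that most of the carbon instead leaves via a rapid net substitutive/oxidative conversion of CCl4 to CO2 by reactions the abstract itself calls unknown.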
Utkin, I.; Dalton, D. D.; Wiegel, J.
1995-01-01
Resting cells of Desulfitobacterium dehalogenans JW/IU-DC1 grown with pyruvate and 3-chloro-4-hydroxyphenylacetate (3-Cl-4-OHPA) as the electron acceptor and inducer of dehalogenation reductively ortho-dehalogenate pentachlorophenol (PCP); tetrachlorophenols (TeCPs); the trichlorophenols 2,3,4-TCP, 2,3,6-TCP, and 2,4,6-TCP; the dichlorophenols 2,3-DCP, 2,4-DCP, and 2,6-DCP; 2,6-dichloro-4-R-phenols (2,6-DCl-4-RPs, where R is -H, -F, -Cl, -NO2, -CO2, or -COOCH3); 2-chloro-4-R-phenols (2-Cl-4-R...
Molecular characterization of anaerobic dehalogenation by Desulfitobacterium dehalogenans
Smidt, H.
2001-01-01
Haloorganics such as chlorophenols and chlorinated ethenes are among the most abundant pollutants in soil, sediments and groundwater, mainly caused by past and present industrial and agricultural activities. Due to bioaccumulation and toxicity, these compounds threaten the integrity of the environment, and human and animal health. A recently discovered, phylogenetically diverse, group of anaerobic so-called halorespiring bacteria is able to couple the reductive dehalogenation of various haloo...
Directory of Open Access Journals (Sweden)
Jin-Feng Liu
2015-03-01
Sequestration of CO2 in oil reservoirs is considered one of the feasible options for mitigating the build-up of atmospheric CO2 and also for the potential in situ bioconversion of stored CO2 to methane. However, information on the responsible functional microbial communities and on the impact of CO2 storage on them is hardly available. In this paper a comprehensive molecular survey was performed on microbial communities in production water samples from oil reservoirs that had experienced CO2 flooding, by analysis of functional genes involved in the process, including cbbM, cbbL, fthfs, [FeFe]-hydrogenase and mcrA. As a comparison, these functional genes were also analyzed in production water samples from an oil reservoir in the same oil-bearing bed that had experienced only water flooding. All of these functional genes were highly diverse in the samples, and the functional microbial communities and their diversity were strongly affected by long-term exposure to injected CO2. More interestingly, microorganisms affiliated with members of the genera Methanothermobacter, Acetobacterium and Halothiobacillus, as well as hydrogen producers, either increased or remained unchanged in relative abundance in the CO2-injected area compared to the water-flooded area, which implied that these microorganisms could adapt to CO2 injection and, if so, demonstrated the potential for microbial fixation and conversion of CO2 into methane in subsurface oil reservoirs.
Sanford, Robert A.; Cole, James R.; Tiedje, James M
2002-01-01
Five strains were isolated which form a physiologically and phylogenetically coherent group of chlororespiring microorganisms and represent the first taxon in the Myxobacteria capable of anaerobic growth. The strains were enriched and isolated from various soils and sediments based on their ability to grow using acetate as an electron donor and 2-chlorophenol (2-CPh) as an electron acceptor. They are slender gram-negative rods with a bright red pigmentation that exhibit gliding motility and f...
Maurin, Krzysztof
1980-01-01
The extraordinarily rapid advances made in mathematics since World War II have resulted in analysis becoming an enormous organism spreading in all directions. Gone for good surely are the days of the great French "courses of analysis" which embodied the whole of the "analytical" knowledge of the times in three volumes, as did the classical work of Camille Jordan. Perhaps that is why present-day textbooks of analysis are disproportionately modest relative to the present state of the art. More: they have "retreated" to the state before Jordan and Goursat. In recent years the scene has been changing rapidly: Jean Dieudonné is offering us his monumental Éléments d'analyse (10 volumes), written in the spirit of the great French Cours d'Analyse. To the best of my knowledge, the present book is the only one of its size: starting from scratch (from rational numbers, to be precise), it goes on to the theory of distributions, direct integrals, analysis on complex manifolds, Kähler manifolds, the theory of sheave...
Abdelazeem, Maha; El-Sawy, El-Sawy K.; Gobashy, Mohamed M.
2013-06-01
Ar Rika fault zone constitutes one of the two major parts of the NW-SE Najd fault system (NFS), which is one of the most prominent structural features located east of the center of the Arabian Shield, Saudi Arabia. By using Enhanced Thematic Mapper Plus data (ETM+) and Principal Component Analysis (PCA), surface geological characteristics, distribution of rock types, and the different trends of linear features and faults are determined in the study area. First and second order magnetic gradients of the geomagnetic field at the north east of Wadi Ar Rika have been calculated in the frequency domain to map both surface and subsurface lineaments and faults. Lineaments, as deduced from previous studies, suggest an extension of the NFS beneath the cover rocks in the study area. In the present study, integration of magnetic gradients and remote sensing analysis, which resulted in different valuable derivative maps, confirms the subsurface extension of some of the surface features. The 3D Euler deconvolution, the total gradient, and the tilt angle maps have been utilized to determine accurately the distribution of shear zones, the tectonic implications, and the internal structures of the terranes in the Ar Rika quadrangle in three dimensions.
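The tilt angle used in this kind of edge mapping is conventionally defined as the ratio of the vertical to the total horizontal gradient of the magnetic anomaly (the standard textbook definition, not a formula quoted from this abstract):

```latex
\theta = \arctan\!\left(
  \frac{\partial T/\partial z}
       {\sqrt{\left(\partial T/\partial x\right)^2 + \left(\partial T/\partial y\right)^2}}
\right)
```

where T is the total-intensity magnetic anomaly; because the arctangent bounds θ to ±90°, the zero crossings of the tilt map trace the edges of magnetic sources regardless of anomaly amplitude.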
Energy Technology Data Exchange (ETDEWEB)
Richardson, Ruth [Cornell Univ., Ithaca, NY (United States)
2016-02-28
Our overall goal was to improve the understanding of microbial iron and sulfate reduction by evaluating diverse iron- and sulfate-reducing organisms using a multi-omics approach combining "top-down" and "bottom-up" omics methodologies. We initiated one of the first combined comparative genomics, shotgun proteomics, RT-qPCR, and heterologous expression studies in pursuit of our project objectives. Within the first year of this project, we created a new bioinformatics tool for ortholog identification ("SPOCS"). SPOCS is described in our publication, Curtis et al., 2013. Using this tool we were able to identify conserved orthologous groups across diverse iron- and sulfate-reducing microorganisms from the Firmicutes, gamma-proteobacteria and delta-proteobacteria. For six iron and sulfate reducers we also performed shotgun proteomics ("bottom-up" proteomics including accurate mass and time (AMT) tag and iTRAQ approaches). Cultures included Gram-negative and Gram-positive microbes. The Gram-negative organisms were Geobacter sulfurreducens, Geobacter bemidjiensis, Shewanella oneidensis and Anaeromyxobacter dehalogenans, each grown on iron citrate and fumarate. Although all cultures grew on insoluble iron, the iron precipitates interfered with protein extraction and analysis, which remains a major challenge for researchers in disparate study systems. Among the Gram-negative organisms studied, Anaeromyxobacter dehalogenans remains the most poorly characterized, yet it is arguably the most versatile organism we studied. In this work we used comparative proteomics to hypothesize which two of the dozens of predicted c-type cytochromes in Anaeromyxobacter dehalogenans may be directly involved in soluble iron reduction. Unfortunately, heterologous expression of these Anaeromyxobacter dehalogenans c-type cytochromes led to poor protein production and/or formation of inclusion bodies.
International Nuclear Information System (INIS)
This textbook on instrumental analysis consists of nine chapters. It covers an introduction to analytical chemistry (the process of analysis and the types and forms of analysis), electrochemistry (basic theory, potentiometry and conductometry), electromagnetic radiation and optical components (introduction and application), ultraviolet and visible spectrophotometry, and atomic absorption spectrophotometry (introduction, flame emission spectrometry and plasma emission spectrometry). The remaining chapters treat infrared spectrophotometry, X-ray spectrophotometry and mass spectrometry, chromatography, and other instrumental methods such as radiochemistry.
Sensitivity analysis determines the effectiveness of antibiotics against microorganisms (germs) ...
Gionis, Aristides
2013-01-01
The objective of this report is to highlight opportunities for enhancing global research data infrastructures from the point of view of data analysis. We discuss various directions and data-analysis functionalities for supporting such infrastructures.
Cerebrospinal fluid analysis: analysis of CSF can help detect certain conditions and diseases. An abnormal CSF analysis result may be due to many different causes, including encephalitis (such as West Nile and Eastern Equine) and hepatic ...
International Nuclear Information System (INIS)
Neutron activation analysis, which appears to be approaching the limits of further advance, is the most suitable technique for providing information on both the principal components and the microcomponents of any solid sample. Instrumental activation analysis, moreover, is capable of determining a great many elements in various samples. Focusing principally on neutron activation analysis, the following are described in a literature survey from 1982 to mid-1984: bibliography, reviews, data collections, etc.; problems in spectral analysis and measurement; activation analysis with neutrons; charged-particle and photonuclear reactions; chemical separation and isotope-dilution activation analysis; molecular activation analysis; standard materials; life sciences and related samples; environmental, food, forensic and archaeological samples; space and earth sciences. (Mori, K.)
Directory of Open Access Journals (Sweden)
Oehler Dirk
2012-12-01
Background: Thermacetogenium phaeum is a thermophilic, strictly anaerobic bacterium oxidizing acetate to CO2 in syntrophic association with a methanogenic partner. It can also grow in pure culture, e.g., by fermentation of methanol to acetate. The key enzymes of homoacetate fermentation (Wood-Ljungdahl pathway) are used both in acetate oxidation and acetate formation. The obvious reversibility of this pathway in this organism is of specific interest, since syntrophic acetate oxidation operates close to the energetic limitations of microbial life. Results: The genome of Th. phaeum is organized on a single circular chromosome and has a total size of 2,939,057 bp. It comprises 3,215 open reading frames, of which 75% could be assigned a gene function. The G+C content is 53.88 mol%. Many CRISPR sequences were found, indicating heavy phage attack in the past. A complete gene set for a phage was found in the genome, and indications of phage action could also be observed in culture. The genome contained all genes required for CO2 reduction through the Wood-Ljungdahl pathway, including two formyl tetrahydrofolate ligases, three carbon monoxide dehydrogenases, one formate hydrogenlyase complex, three further formate dehydrogenases, and three further hydrogenases. The bacterium contains the menaquinone MQ-7. No indications of cytochromes or Rnf complexes could be found in the genome. Conclusions: The information obtained from the genome sequence indicates that Th. phaeum differs basically from the three homoacetogenic bacteria sequenced so far, i.e., the sodium ion-dependent Acetobacterium woodii, the ethanol-producing Clostridium ljungdahlii, and the cytochrome-containing Moorella thermoacetica. The specific enzyme outfit of Th. phaeum obviously allows ATP formation both in acetate formation and acetate oxidation.
Chládek, Vítězslav
2012-01-01
The objective of this Bachelor thesis is to carry out a strategic analysis of a Czech owned limited company, Česky národní podnik s.r.o. This company sells traditional Czech products and manufactures cosmetics and body care products. The first part of the thesis provides theoretical background and methodology that are used later for the strategic analysis of the company. The theory outlined in this paper is based on the analysis of external and internal factors. Firstly the PEST analysis has ...
Bartuňková, Alena
2008-01-01
The objective of this Bachelor thesis is to carry out a strategic analysis of a Czech owned limited company, Česky národní podnik s.r.o. This company sells traditional Czech products and manufactures cosmetics and body care products. The first part of the thesis provides theoretical background and methodology that are used later for the strategic analysis of the company. The theory outlined in this paper is based on the analysis of external and internal factors. Firstly the PEST analysis has ...
Lanczos, Cornelius
2010-01-01
Basic text for graduate and advanced undergraduate students deals with the search for roots of algebraic equations encountered in vibration and flutter problems and in those of static and dynamic stability. Other topics are devoted to matrices and eigenvalue problems, large-scale linear systems, harmonic analysis and data analysis, and more.
Li, L.; Braat, L.C.; Lei, G.; Arets, E.J.M.M.; Liu, J.; Jiang, L.; Fan, Z.; Liu, W.; He, H.; Sun, X.
2014-01-01
This chapter presents the results of the scenario analysis of China’s ecosystems focusing on forest, grassland, and wetland ecosystems. The analysis was undertaken using Conversion of Land Use Change and its Effects (CLUE) modeling and an ecosystem service matrix (as explained below) complemented by
Goodstein, R L
2010-01-01
Recursive analysis develops natural number computations into a framework appropriate for real numbers. This text is based upon primary recursive arithmetic and presents a unique combination of classical analysis and intuitional analysis. Written by a master in the field, it is suitable for graduate students of mathematics and computer science and can be read without a detailed knowledge of recursive arithmetic.Introductory chapters on recursive convergence and recursive and relative continuity are succeeded by explorations of recursive and relative differentiability, the relative integral, and
Khabaza, I M
1960-01-01
Numerical Analysis is an elementary introduction to numerical analysis, its applications, limitations, and pitfalls. Methods suitable for digital computers are emphasized, but some desk computations are also described. Topics covered range from the use of digital computers in numerical work to errors in computations using desk machines, finite difference methods, and numerical solution of ordinary differential equations. This book is comprised of eight chapters and begins with an overview of the importance of digital computers in numerical analysis, followed by a discussion on errors in comput
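The numerical solution of ordinary differential equations mentioned in this blurb can be illustrated with the simplest such method, forward Euler. This is a minimal sketch, not an example taken from the book; the test problem y' = -y with y(0) = 1 (exact solution exp(-t)) is chosen here for illustration:

```python
import math

def euler(f, y0, t0, t1, n):
    """Advance y' = f(t, y) from t0 to t1 in n fixed forward-Euler steps."""
    h = (t1 - t0) / n
    t, y = t0, y0
    for _ in range(n):
        y += h * f(t, y)  # one Euler step: follow the tangent line
        t += h
    return y

# y' = -y, y(0) = 1; the exact value at t = 1 is exp(-1) ~ 0.3679
approx = euler(lambda t, y: -y, 1.0, 0.0, 1.0, 1000)
```

The global error of forward Euler is O(h), which is exactly the kind of error behavior, alongside rounding errors on desk machines, that the book's chapters on computational errors discuss.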
DEFF Research Database (Denmark)
Brænder, Morten; Andersen, Lotte Bøgh
2014-01-01
Based on our 2013-article, ”Does Deployment to War Affect Soldiers' Public Service Motivation – A Panel Study of Soldiers Before and After their Service in Afghanistan”, we present Panel Analysis as a methodological discipline. Panels consist of multiple units of analysis, observed at two or more...... in research settings where it is not possible to distribute units of analysis randomly or where the independent variables cannot be manipulated. The greatest disadvantage in regard to using panel studies is that data may be difficult to obtain. This is most clearly vivid in regard to the use of panel surveys...
Tan, Qingming
2011-01-01
Dimensional analysis is an essential scientific method and a powerful tool for solving problems in physics and engineering. This book starts by introducing the Pi Theorem, which is the theoretical foundation of dimensional analysis. It also provides ample and detailed examples of how dimensional analysis is applied to solving problems in various branches of mechanics. The book covers the extensive findings on explosion mechanics and impact dynamics contributed by the author's research group over the past forty years at the Chinese Academy of Sciences. The book is intended for advanced undergra
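The bookkeeping behind the Pi Theorem can be mechanized: a physical dimension is just a tuple of exponents over the base dimensions, and combining quantities adds those exponents. The following toy sketch (illustrative only, not from the book) checks that the classic pendulum scale sqrt(L/g) has the dimension of time, using (M, L, T) exponent tuples:

```python
from fractions import Fraction

def dim_mul(a, b):
    """Dimension of a product: exponents add."""
    return tuple(x + y for x, y in zip(a, b))

def dim_pow(a, p):
    """Dimension of a power: exponents scale by p (p may be fractional)."""
    return tuple(Fraction(x) * p for x in a)

# base dimensions as (M, L, T) exponent tuples
LENGTH = (0, 1, 0)   # L
ACCEL  = (0, 1, -2)  # L T^-2, e.g. gravitational acceleration g
TIME   = (0, 0, 1)   # T

# sqrt(L / g): multiply L by g^-1, then take the 1/2 power
period_scale = dim_pow(dim_mul(LENGTH, dim_pow(ACCEL, -1)), Fraction(1, 2))
# period_scale equals the dimension of time, so T ~ sqrt(L/g) is dimensionally consistent
```

Finding all independent dimensionless groups amounts to computing the null space of the matrix whose columns are such exponent tuples, which is the linear-algebra content of the Pi Theorem.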
Schiffrin, Deborah
1990-01-01
Summarizes the current state of research in conversation analysis, referring primarily to six different perspectives that have developed from the philosophy, sociology, anthropology, and linguistics disciplines. These include pragmatics; speech act theory; interactional sociolinguistics; ethnomethodology; ethnography of communication; and…
Energy Technology Data Exchange (ETDEWEB)
2016-06-01
Fact sheet summarizing NREL's techno-economic analysis and life-cycle assessment capabilities to connect research with future commercial process integration, a critical step in the scale-up of biomass conversion technologies.
Tao, Terence
2016-01-01
This is part one of a two-volume book on real analysis and is intended for senior undergraduate students of mathematics who have already been exposed to calculus. The emphasis is on rigour and foundations of analysis. Beginning with the construction of the number systems and set theory, the book discusses the basics of analysis (limits, series, continuity, differentiation, Riemann integration), through to power series, several variable calculus and Fourier analysis, and then finally the Lebesgue integral. These are almost entirely set in the concrete setting of the real line and Euclidean spaces, although there is some material on abstract metric and topological spaces. The book also has appendices on mathematical logic and the decimal system. The entire text (omitting some less central topics) can be taught in two quarters of 25–30 lectures each. The course material is deeply intertwined with the exercises, as it is intended that the student actively learn the material (and practice thinking and writing ri...
Tao, Terence
2016-01-01
This is part two of a two-volume book on real analysis and is intended for senior undergraduate students of mathematics who have already been exposed to calculus. The emphasis is on rigour and foundations of analysis. Beginning with the construction of the number systems and set theory, the book discusses the basics of analysis (limits, series, continuity, differentiation, Riemann integration), through to power series, several variable calculus and Fourier analysis, and then finally the Lebesgue integral. These are almost entirely set in the concrete setting of the real line and Euclidean spaces, although there is some material on abstract metric and topological spaces. The book also has appendices on mathematical logic and the decimal system. The entire text (omitting some less central topics) can be taught in two quarters of 25–30 lectures each. The course material is deeply intertwined with the exercises, as it is intended that the student actively learn the material (and practice thinking and writing ri...
Donoho, Steve
Link analysis is a collection of techniques that operate on data that can be represented as nodes and links. This chapter surveys a variety of techniques including subgraph matching, finding cliques and K-plexes, maximizing spread of influence, visualization, finding hubs and authorities, and combining with traditional techniques (classification, clustering, etc). It also surveys applications including social network analysis, viral marketing, Internet search, fraud detection, and crime prevention.
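Of the techniques surveyed, "hubs and authorities" has a particularly compact formulation: alternately score each node by the hub scores of its in-neighbors (authority) and the authority scores of its out-neighbors (hub), normalizing each round. A minimal sketch (the graph and names are illustrative, not from the chapter):

```python
def hits(edges, iters=50):
    """Hubs-and-authorities iteration on a directed edge list."""
    nodes = {n for e in edges for n in e}
    hub = {n: 1.0 for n in nodes}
    auth = {n: 1.0 for n in nodes}
    for _ in range(iters):
        # authority: sum of hub scores of nodes linking in
        auth = {n: sum(hub[u] for u, v in edges if v == n) for n in nodes}
        norm = sum(x * x for x in auth.values()) ** 0.5 or 1.0
        auth = {n: x / norm for n, x in auth.items()}
        # hub: sum of authority scores of nodes linked to
        hub = {n: sum(auth[v] for u, v in edges if u == n) for n in nodes}
        norm = sum(x * x for x in hub.values()) ** 0.5 or 1.0
        hub = {n: x / norm for n, x in hub.items()}
    return hub, auth

# 'a' and 'b' both point at 'c', so 'c' emerges as the top authority
# and 'a' (which links to both 'b' and 'c') as the top hub
hub, auth = hits([("a", "c"), ("b", "c"), ("a", "b")])
```

The same node-and-link representation underlies the other surveyed techniques, from subgraph matching to influence maximization.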
Paul, Debra; Cadle, James
2010-01-01
Throughout the business world, public, private and not-for-profit organisations face huge challenges. Business analysts must respond by developing practical, creative and financially sound solutions. This excellent guide gives them the necessary tools. It supports everyone wanting to achieve university and industry qualifications in business analysis and information systems. It is particularly beneficial for those studying for ISEB qualifications in Business Analysis. Some important additions since the first edition (2006): the inclusion of new techniques such as Ishikawa diagrams and spaghe
Scott, L Ridgway
2011-01-01
Computational science is fundamentally changing how technological questions are addressed. The design of aircraft, automobiles, and even racing sailboats is now done by computational simulation. The mathematical foundation of this new approach is numerical analysis, which studies algorithms for computing expressions defined with real numbers. Emphasizing the theory behind the computation, this book provides a rigorous and self-contained introduction to numerical analysis and presents the advanced mathematics that underpin industrial software, including complete details that are missing from m
Gorsuch, Richard L
2013-01-01
Comprehensive and comprehensible, this classic covers the basic and advanced topics essential for using factor analysis as a scientific tool in psychology, education, sociology, and related areas. Emphasizing the usefulness of the techniques, it presents sufficient mathematical background for understanding and sufficient discussion of applications for effective use. This includes not only theory but also the empirical evaluations of the importance of mathematical distinctions for applied scientific analysis.
International Nuclear Information System (INIS)
Radioactivation analysis is the technique of analyzing the constituents of a very small sample of matter by making the sample artificially radioactive. The first stage is to make the sample radioactive by artificial means, e.g. by subjecting it to neutron bombardment. Once the sample has been activated, or made radioactive, the next task is to analyze the radiations given off by the sample. This analysis indicates the nature and quantities of the various elements present in the sample, because the radiation given off is characteristic of the particular radioisotope. In 1959 a symposium on 'Radioactivation Analysis' was organized in Vienna by the IAEA and the Joint Commission on Applied Radioactivity (ICSU). It was pointed out that certain factors create uncertainties, and it was elaborated how to overcome them. Attention was drawn to the fact that radioactivation analysis had proven a powerful tool for tackling fundamental problems in geo- and cosmochemistry, and a review was given of the recent work in this field. Because of its extreme sensitivity, radioactivation analysis had been principally employed for trace detection, and its most extensive use has been in the control of semiconductors and very pure metals. An account of the experience gained in the USA was given, where radioactivation analysis was being used by many investigators in various scientific fields as a practical and useful tool for elemental analyses. Much of this work had been concerned with determining sub-microgramme and microgramme concentrations of many different elements in samples of biological materials, drugs, fertilizers, fine chemicals, foods, fuels, glass, ceramic materials, metals, minerals, paints, petroleum products, resinous materials, soils, toxicants, water and other materials. In addition to these studies, radioactivation analysis had been used by other investigators to determine isotopic ratios of the stable isotopes of some of the elements. Another paper dealt with radioactivation
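The quantitative basis of the neutron-bombardment step described above is the standard activation equation (a textbook relation, not one stated in this record):

```latex
A = N \, \sigma \, \varphi \left(1 - e^{-\lambda t_{\mathrm{irr}}}\right)
```

where A is the induced activity, N the number of target atoms of the element of interest, sigma the activation cross-section, phi the neutron flux, lambda the decay constant of the product radioisotope, and t_irr the irradiation time. Comparing the measured activity against a standard irradiated under the same conditions yields the elemental concentration, which is what gives the method its trace-level sensitivity.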
DiBenedetto, Emmanuele
2016-01-01
The second edition of this classic textbook presents a rigorous and self-contained introduction to real analysis with the goal of providing a solid foundation for future coursework and research in applied mathematics. Written in a clear and concise style, it covers all of the necessary subjects as well as those often absent from standard introductory texts. Each chapter features a “Problems and Complements” section that includes additional material that briefly expands on certain topics within the chapter and numerous exercises for practicing the key concepts. The first eight chapters explore all of the basic topics for training in real analysis, beginning with a review of countable sets before moving on to detailed discussions of measure theory, Lebesgue integration, Banach spaces, functional analysis, and weakly differentiable functions. More topical applications are discussed in the remaining chapters, such as maximal functions, functions of bounded mean oscillation, rearrangements, potential theory, a...
Rao, G Shanker
2006-01-01
About the Book: This book provides an introduction to Numerical Analysis for the students of Mathematics and Engineering. The book is designed in accordance with the common core syllabus of Numerical Analysis of Universities of Andhra Pradesh and also the syllabus prescribed in most of the Indian Universities. Salient features: Approximate and Numerical Solutions of Algebraic and Transcendental Equations; Interpolation of Functions; Numerical Differentiation and Integration; Numerical Solution of Ordinary Differential Equations. The last three chapters deal with Curve Fitting, Eigen Values and Eigen Vectors of a Matrix and Regression Analysis. Each chapter is supplemented with a number of worked-out examples as well as a number of problems to be solved by the students. This would help in the better understanding of the subject. Contents: Errors; Solution of Algebraic and Transcendental Equations; Finite Differences; Interpolation with Equal Intervals; Interpolation with Unequal Int...
Loeb, Peter A
2016-01-01
This textbook is designed for a year-long course in real analysis taken by beginning graduate and advanced undergraduate students in mathematics and other areas such as statistics, engineering, and economics. Written by one of the leading scholars in the field, it elegantly explores the core concepts in real analysis and introduces new, accessible methods for both students and instructors. The first half of the book develops both Lebesgue measure and, with essentially no additional work for the student, general Borel measures for the real line. Notation indicates when a result holds only for Lebesgue measure. Differentiation and absolute continuity are presented using a local maximal function, resulting in an exposition that is both simpler and more general than the traditional approach. The second half deals with general measures and functional analysis, including Hilbert spaces, Fourier series, and the Riesz representation theorem for positive linear functionals on continuous functions with compact support....
Jacques, Ian
1987-01-01
This book is primarily intended for undergraduates in mathematics, the physical sciences and engineering. It introduces students to most of the techniques forming the core component of courses in numerical analysis. The text is divided into eight chapters which are largely self-contained. However, with a subject as intricately woven as mathematics, there is inevitably some interdependence between them. The level of difficulty varies and, although emphasis is firmly placed on the methods themselves rather than their analysis, we have not hesitated to include theoretical material when we consider it to be sufficiently interesting. However, it should be possible to omit those parts that do seem daunting while still being able to follow the worked examples and to tackle the exercises accompanying each section. Familiarity with the basic results of analysis and linear algebra is assumed since these are normally taught in first courses on mathematical methods. For reference purposes a list of theorems used in the t...
Aggarwal, Charu C
2013-01-01
With the increasing advances in hardware technology for data collection, and advances in software technology (databases) for data organization, computer scientists have increasingly participated in the latest advancements of the outlier analysis field. Computer scientists, specifically, approach this field based on their practical experiences in managing large amounts of data, and with far fewer assumptions: the data can be of any type, structured or unstructured, and may be extremely large. Outlier Analysis is a comprehensive exposition, as understood by data mining experts, statisticians and
Nanda, Sudarsan
2013-01-01
"Nonlinear Analysis" presents recent developments in calculus in Banach spaces, convex sets, convex functions, best approximation, fixed point theorems, nonlinear operators, variational inequalities, complementarity problems and semi-inner-product spaces. Nonlinear analysis has become important and useful nowadays because many real-world problems are nonlinear, nonconvex and nonsmooth in nature. Although basic concepts are presented here, many of the results presented have not appeared in any book till now. The book can be used as a text for graduate students and will also be useful for researchers working in this field.
Snell, K S; Langford, W J; Maxwell, E A
1966-01-01
Elementary Analysis, Volume 2 introduces several of the ideas of modern mathematics in a casual manner and provides the practical experience in algebraic and analytic operations that lays a sound foundation of basic skills. This book focuses on the nature of number, algebraic and logical structure, groups, rings, fields, vector spaces, matrices, sequences, limits, functions and inverse functions, complex numbers, and probability. The logical structure of analysis given through the treatment of differentiation and integration, with applications to the trigonometric and logarithmic functions, is
Steuding, Jorn
2005-01-01
While its roots reach back to the third century, diophantine analysis continues to be an extremely active and powerful area of number theory. Many diophantine problems have simple formulations, yet they can be extremely difficult to attack, and many open problems and conjectures remain. Diophantine Analysis examines the theory of diophantine approximations and the theory of diophantine equations, with emphasis on interactions between these subjects. Beginning with the basic principles, the author develops his treatment around the theory of continued fractions and examines the classic theory, inclu
International Nuclear Information System (INIS)
This book contains a selection of research works performed in the CEDIAC Institute (Cuyo National University) in the area of risk analysis, with specific orientation to the subjects of uncertainty and sensitivity studies, software reliability, severe accident modeling, etc. This volume presents important material for all those researchers who want to gain insight into the risk analysis field as a tool for solving several problems frequently found in engineering and the applied sciences, as well as for academic teachers who want to keep up to date with the new developments and improvements continuously arising in this field
Brezinski, C
2012-01-01
Numerical analysis has witnessed many significant developments in the 20th century. This book brings together 16 papers dealing with historical developments, survey papers and papers on recent trends in selected areas of numerical analysis, such as: approximation and interpolation, solution of linear systems and eigenvalue problems, iterative methods, quadrature rules, solution of ordinary, partial and integral equations. The papers are reprinted from the 7-volume 'Numerical Analysis 2000' project of the Journal of Computational and Applied Mathematics.
DEFF Research Database (Denmark)
The 19th Scandinavian Conference on Image Analysis was held at the IT University of Copenhagen in Denmark during June 15-17, 2015. The SCIA conference series has been an ongoing biannual event for more than 30 years and over the years it has nurtured a world-class regional research and development... The topics of the accepted papers range from novel applications of vision systems, pattern recognition, machine learning, feature extraction, segmentation, 3D vision, to medical and biomedical image analysis. The papers originate from all the Scandinavian countries and several other European countries
Everitt, Brian S; Leese, Morven; Stahl, Daniel
2011-01-01
Cluster analysis comprises a range of methods for classifying multivariate data into subgroups. By organizing multivariate data into such subgroups, clustering can help reveal the characteristics of any structure or patterns present. These techniques have proven useful in a wide range of areas such as medicine, psychology, market research and bioinformatics. This fifth edition of the highly successful Cluster Analysis includes coverage of the latest developments in the field and a new chapter dealing with finite mixture models for structured data. Real life examples are used throughout to demons
Institute of Scientific and Technical Information of China (English)
Song Xuexia
2005-01-01
In the past, more attempts had been made to explore ways for teachers to teach English, but fewer for learners to learn the language. Learner analysis is to analyze "what the learner is", including age, attitude, motivation, intelligence, aptitude, personality, etc., with the purpose of realizing the transition from "teacher-centered" to "learner-oriented".
Freund, Rudolf J; Sa, Ping
2006-01-01
The book provides complete coverage of the classical methods of statistical analysis. It is designed to give students an understanding of the purpose of statistical analyses, to allow the student to determine, at least to some degree, the correct type of statistical analyses to be performed in a given situation, and have some appreciation of what constitutes good experimental design
Colver, David
2010-01-01
Inclusion analysis is the name given by Operis to a black box testing technique that it has found to make the checking of key financial ratios calculated by spreadsheet models quicker, easier and more likely to find omission errors than code inspection.
Miller, Rupert G
2011-01-01
A concise summary of the statistical methods used in the analysis of survival data with censoring. Emphasizes recently developed nonparametric techniques. Outlines methods in detail and illustrates them with actual data. Discusses the theory behind each method. Includes numerous worked problems and numerical exercises.
David P. MacKinnon; Fairchild, Amanda J.; Fritz, Matthew S.
2007-01-01
Mediating variables are prominent in psychological theory and research. A mediating variable transmits the effect of an independent variable on a dependent variable. Differences between mediating variables and confounders, moderators, and covariates are outlined. Statistical methods to assess mediation and modern comprehensive approaches are described. Future directions for mediation analysis are discussed.
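The product-of-coefficients idea behind the mediation model described above can be sketched in a few lines of Python. This is a minimal illustration on synthetic data, not the authors' procedure; the variable names, path values and noise levels are invented for the example, and the b-path is fitted by a simple regression of Y on M (valid here only because the toy data contain no direct X-to-Y path):

```python
import random

def slope(x, y):
    # ordinary least-squares slope of y on x (single predictor, with intercept)
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = sum((a - mx) ** 2 for a in x)
    return num / den

# Synthetic single-mediator chain X -> M -> Y with known paths a=2, b=3
rng = random.Random(1)
X = [rng.gauss(0, 1) for _ in range(5000)]
M = [2.0 * x + rng.gauss(0, 0.1) for x in X]   # a-path: X affects the mediator
Y = [3.0 * m + rng.gauss(0, 0.1) for m in M]   # b-path: the mediator affects Y

a_hat = slope(X, M)          # estimated effect of X on M
b_hat = slope(M, Y)          # estimated effect of M on Y
indirect = a_hat * b_hat     # product-of-coefficients estimate, near 2 * 3 = 6
print(a_hat, b_hat, indirect)
```

The estimated indirect effect recovers the product of the two generating path coefficients; modern approaches discussed in the article add standard errors and confidence intervals for this product.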
DEFF Research Database (Denmark)
Assaf, A. George; Josiassen, Alexander
2016-01-01
and macro applications of these approaches, summarizing and critically reviewing the characteristics of the existing studies. We also conduct a meta-analysis to create an overview of the efficiency results of frontier applications. This allows for an investigation of the impact of frontier methodology...
DEFF Research Database (Denmark)
Dovjak, M.; Simone, Angela; Kolarik, Jakub;
2011-01-01
Exergy analysis enables us to make connections between processes inside the human body and processes in a building. So far, only the effect of different combinations of air temperatures and mean radiant temperatures has been studied, with constant relative humidity in experimental conditions...
Koornneef, M.; Alonso-Blanco, C.; Stam, P.
2006-01-01
The Mendelian analysis of genetic variation, available as induced mutants or as natural variation, requires a number of steps that are described in this chapter. These include the determination of the number of genes involved in the observed trait's variation, the determination of dominance relation
Freitag, Eberhard
2005-01-01
The guiding principle of this presentation of ``Classical Complex Analysis'' is to proceed as quickly as possible to the central results while using a small number of notions and concepts from other fields. Thus the prerequisites for understanding this book are minimal; only elementary facts of calculus and algebra are required. The first four chapters cover the essential core of complex analysis: - differentiation in C (including elementary facts about conformal mappings) - integration in C (including complex line integrals, Cauchy's Integral Theorem, and the Integral Formulas) - sequences and series of analytic functions, (isolated) singularities, Laurent series, calculus of residues - construction of analytic functions: the gamma function, Weierstrass' Factorization Theorem, Mittag-Leffler Partial Fraction Decomposition, and -as a particular highlight- the Riemann Mapping Theorem, which characterizes the simply connected domains in C. Further topics included are: - the theory of elliptic functions based on...
Cheng, Lizhi; Luo, Yong; Chen, Bo
2014-01-01
This book can be divided into two parts, i.e. fundamental wavelet transform theory and methods, and some important applications of wavelet transform. In the first part, as preliminary knowledge, Fourier analysis, inner product spaces, the characteristics of Haar functions, and concepts of multi-resolution analysis are introduced, followed by a description of how to construct wavelet functions, both multi-band and multi-wavelet, and finally the design of integer wavelets via lifting schemes and its application to integer transform algorithms. In the second part, many applications are discussed in the field of image and signal processing by introducing other wavelet variants such as complex wavelets, ridgelets, and curvelets. Important application examples include image compression, image denoising/restoration, image enhancement, digital watermarking, numerical solution of partial differential equations, and solving ill-conditioned Toeplitz systems. The book is intended for senior undergraduate stude...
Donaldson, J. A.
1984-01-01
Simple continuum models used in the design, analysis, and control of large space structures are examined. Particular emphasis is placed on boundary value problems associated with the Load Correction Method and control problems involving partial differential equations for the large space structure models. Partial differential equations will be used to model a large space structure, base the design of an optimal controller on this model, approximate the resulting optimal control model, and compare the results with data from other methods.
DEFF Research Database (Denmark)
Andersen, Lars
This book contains the lecture notes for the 9th semester course on elastodynamics. The first chapter gives an overview of the basic theory of stress waves propagating in viscoelastic media. In particular, the effect of surfaces and interfaces in a viscoelastic material is studied, and different ... Thus, in Chapter 3, an alternative semi-analytic method is derived, which may be applied for the analysis of layered half-spaces subject to moving or stationary loads...
International Nuclear Information System (INIS)
General remarks on sensitivity analysis, the study of changes in a model output produced by varying model inputs, are made first. Sampling methods are discussed, and three sensitivity measures: partial rank correlation, derivative or response surface, and partial variance are described. Some sample results for a 16-input, 13-output hydrodynamics model are given. Both agreement and disagreement were found among the sensitivity measures. 4 figures
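The first of the three measures named above rests on rank correlation between sampled inputs and the model output. As a hedged sketch (a plain, not partial, rank correlation; the toy two-input model is invented for illustration and is unrelated to the 16-input hydrodynamics model):

```python
import random

def ranks(xs):
    # rank each value (1 = smallest); ties ignored, fine for continuous samples
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0] * len(xs)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman(xs, ys):
    # Spearman rank correlation: Pearson correlation computed on the ranks
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx) ** 0.5
    vy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (vx * vy)

# Toy model: the output depends strongly and monotonically on x1, weakly on x2,
# so the rank correlations should rank the inputs by influence.
rng = random.Random(0)
x1 = [rng.uniform(0, 1) for _ in range(200)]
x2 = [rng.uniform(0, 1) for _ in range(200)]
y = [10 * a ** 3 + 0.1 * b for a, b in zip(x1, x2)]

print(spearman(x1, y))  # close to 1: x1 dominates the output
print(spearman(x2, y))  # near 0: x2 has little influence
```

The *partial* rank correlation used in such studies additionally controls for the other inputs when scoring each one, which matters when inputs are correlated.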
Energy Technology Data Exchange (ETDEWEB)
None
1980-06-01
The Energy Policy and Conservation Act (EPCA) mandated that minimum energy efficiency standards be established for classes of refrigerators and refrigerator-freezers, freezers, clothes dryers, water heaters, room air conditioners, home heating equipment, kitchen ranges and ovens, central air conditioners, and furnaces. EPCA requires that standards be designed to achieve the maximum improvement in energy efficiency that is technologically feasible and economically justified. Following the introductory chapter, Chapter Two describes the methodology used in the economic analysis and its relationship to legislative criteria for consumer product efficiency assessment; details how the CPES Value Model systematically compared and evaluated the economic impacts of regulation on the consumer, manufacturer and Nation. Chapter Three briefly displays the results of the analysis and lists the proposed performance standards by product class. Chapter Four describes the reasons for developing a baseline forecast, characterizes the baseline scenario from which regulatory impacts were calculated and summarizes the primary models, data sources and assumptions used in the baseline formulations. Chapter Five summarizes the methodology used to calculate regulatory impacts; describes the impacts of energy performance standards relative to the baseline discussed in Chapter Four. Also discussed are regional standards and other program alternatives to performance standards. Chapter Six describes the procedure for balancing consumer, manufacturer, and national impacts to select standard levels. Details of models and data bases used in the analysis are included in Appendices A through K.
Popovová, Šárka
2015-01-01
The aim of this bachelor thesis is to define basic methods, which are used for the preparation of a business strategy and to use those methods in a real situation. The theoretical part describes the methodology of external and internal analysis. The practical part then applies single methods such as PEST, VRIO, Porter`s five forces and value chain in order to define competitive advantages of Dr. Popov company. At the end of the Bachelor thesis will be assessment of the current situation and s...
Newell, Homer E
2006-01-01
When employed with skill and understanding, vector analysis can be a practical and powerful tool. This text develops the algebra and calculus of vectors in a manner useful to physicists and engineers. Numerous exercises (with answers) not only provide practice in manipulation but also help establish students' physical and geometric intuition in regard to vectors and vector concepts.Part I, the basic portion of the text, consists of a thorough treatment of vector algebra and the vector calculus. Part II presents the illustrative matter, demonstrating applications to kinematics, mechanics, and e
DEFF Research Database (Denmark)
Nielsen, Kirsten
2010-01-01
The first part of this article presents the characteristics of Hebrew poetry: features associated with rhythm and phonology, grammatical features, structural elements like parallelism, and imagery and intertextuality. The second part consists of an analysis of Psalm 121. It is argued that assonance and alliteration, parallelism and the wide use of imagery create coherence in the psalm but at the same time ambiguity. According to the heading, the psalm is a song of ascents, but it can be read both as a psalm for pilgrimage and as a psalm of trust. The metaphors can be understood both as metaphors...
Brand, Louis
2006-01-01
The use of vectors not only simplifies treatments of differential geometry, mechanics, hydrodynamics, and electrodynamics, but also makes mathematical and physical concepts more tangible and easy to grasp. This text for undergraduates was designed as a short introductory course to give students the tools of vector algebra and calculus, as well as a brief glimpse into these subjects' manifold applications. The applications are developed to the extent that the uses of the potential function, both scalar and vector, are fully illustrated. Moreover, the basic postulates of vector analysis are brou
Institute of Scientific and Technical Information of China (English)
PEI Yuan-yuan
2013-01-01
With the development of global communication all over the world, English has become the international language for global communication. Furthermore, speaking as a skill in English language learning becomes more and more important. As is known, spoken English has specific features, so data analysis is helpful in gaining an explicit understanding of those features. Moreover, it is useful for language teachers to make developments in teaching to satisfy learners' needs. Grammar and phonology are the two remarkable aspects and specific features of spoken language; therefore, discovering elements in these aspects seems helpful to the teaching of spoken English.
Wald, Abraham
2013-01-01
In 1943, while in charge of Columbia University's Statistical Research Group, Abraham Wald devised Sequential Design, an innovative statistical inference system. Because the decision to terminate an experiment is not predetermined, sequential analysis can arrive at a decision much sooner and with substantially fewer observations than equally reliable test procedures based on a predetermined number of observations. The system's immense value was immediately recognized, and its use was restricted to wartime research and procedures. In 1945, it was released to the public and has since revolutio
Abbott, Stephen
2015-01-01
This lively introductory text exposes the student to the rewards of a rigorous study of functions of a real variable. In each chapter, informal discussions of questions that give analysis its inherent fascination are followed by precise, but not overly formal, developments of the techniques needed to make sense of them. By focusing on the unifying themes of approximation and the resolution of paradoxes that arise in the transition from the finite to the infinite, the text turns what could be a daunting cascade of definitions and theorems into a coherent and engaging progression of ideas. Acutely aware of the need for rigor, the student is much better prepared to understand what constitutes a proper mathematical proof and how to write one. Fifteen years of classroom experience with the first edition of Understanding Analysis have solidified and refined the central narrative of the second edition. Roughly 150 new exercises join a selection of the best exercises from the first edition, and three more project-sty...
Energy Technology Data Exchange (ETDEWEB)
Thomas, R.E.
1982-03-01
An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software.
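The Latin Hypercube Sampling method evaluated above can be sketched in a few lines of Python. This is a minimal, assumption-laden illustration of the basic design (unit hypercube, independent random pairing of strata across inputs), not the report's own implementation:

```python
import random

def latin_hypercube(n_samples, n_inputs, seed=42):
    """One LHS design on the unit hypercube: each input's range [0, 1) is cut
    into n_samples equal strata, each stratum is sampled exactly once, and the
    strata are paired across inputs by independent random permutations."""
    rng = random.Random(seed)
    columns = []
    for _ in range(n_inputs):
        strata = list(range(n_samples))
        rng.shuffle(strata)
        # one point drawn uniformly inside each stratum
        columns.append([(s + rng.random()) / n_samples for s in strata])
    # transpose: one row per sample point
    return list(zip(*columns))

points = latin_hypercube(5, 2)
for p in points:
    print(p)
```

Each input dimension hits every stratum exactly once, which is what gives LHS its space-filling advantage over simple random sampling when estimating output distributions.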
Bhatia, Rajendra
1997-01-01
A good part of matrix theory is functional analytic in spirit. This statement can be turned around. There are many problems in operator theory, where most of the complexities and subtleties are present in the finite-dimensional case. My purpose in writing this book is to present a systematic treatment of methods that are useful in the study of such problems. This book is intended for use as a text for upper division and gradu ate courses. Courses based on parts of the material have been given by me at the Indian Statistical Institute and at the University of Toronto (in collaboration with Chandler Davis). The book should also be useful as a reference for research workers in linear algebra, operator theory, mathe matical physics and numerical analysis. A possible subtitle of this book could be Matrix Inequalities. A reader who works through the book should expect to become proficient in the art of deriving such inequalities. Other authors have compared this art to that of cutting diamonds. One first has to...
Formal name: Pericardial Fluid Analysis. Related tests: Pleural Fluid Analysis, Peritoneal Fluid Analysis, ...
Formal name: Peritoneal Fluid Analysis. Related tests: Pleural Fluid Analysis, Pericardial Fluid Analysis, ...
Information security risk analysis
Peltier, Thomas R
2001-01-01
Effective Risk Analysis; Qualitative Risk Analysis; Value Analysis; Other Qualitative Methods; Facilitated Risk Analysis Process (FRAP); Other Uses of Qualitative Risk Analysis; Case Study; Appendix A: Questionnaire; Appendix B: Facilitated Risk Analysis Process Forms; Appendix C: Business Impact Analysis Forms; Appendix D: Sample of Report; Appendix E: Threat Definitions; Appendix F: Other Risk Analysis Opinions; Index
International Nuclear Information System (INIS)
This book deals with information technology and business processes, information system architecture, methods of system development, planning of system development (such as problem analysis and feasibility analysis), cases of system development, comprehension of user demands, analysis of user demands using traditional analysis, analysis of user demands using integrated information system architecture, system design using integrated information system architecture, system implementation, and system maintenance.
Theoretical numerical analysis a functional analysis framework
Atkinson, Kendall
2005-01-01
This textbook prepares graduate students for research in numerical analysis/computational mathematics by giving to them a mathematical framework embedded in functional analysis and focused on numerical analysis. This helps the student to move rapidly into a research program. The text covers basic results of functional analysis, approximation theory, Fourier analysis and wavelets, iteration methods for nonlinear equations, finite difference methods, Sobolev spaces and weak formulations of boundary value problems, finite element methods, elliptic variational inequalities and their numerical solu
Röder, Martin; Weber, Wolfgang
2016-07-01
The fundamental requirement when testing for and ensuring compliance with legally required labelling regulations is the reliable analysis of food allergens. This can be carried out by means of either DNA (deoxyribonucleic acid) or protein detection. Protein detection has the advantage of directly detecting the allergenic component and can currently be carried out using immunological techniques (enzyme-linked immunosorbent assay [ELISA] / lateral flow devices [LFD]) or mass spectrometry-based techniques. DNA detection is indirect, but allows the presence of food allergens to be validated through the use of another marker. Each method has its pros and cons, which have to be considered on a case-by-case basis. ELISA is quantitative, quick and easy to carry out and has high sensitivity. LFD testing is ideal for industrial applications, as the tests can be carried out on-site. Both antibody-based tests may have problems with processed foods and false positive results. Mass-spectrometric techniques show a lot of promise, but are currently still time-consuming and complex to carry out. They also run into problems with processed foods, and their degree of sensitivity is matrix and parameter dependent. For these reasons, this technique is only occasionally used. Polymerase chain reaction (PCR) provides the highest specificity and, depending on the target sequence, a very good to good level of sensitivity. Despite the high stability of DNA, PCR is still subject to the influence of processing and matrix related factors. Due to natural variation and production-related changes in the structures relevant in the process of detection, all methods exhibit a relatively high level of uncertainty of measurement. At present, there is no method which provides absolutely correct quantification. However, by means of laboratory-based analyses it is possible to calibrate for the allergen in question and thus make reliable measurements using methods that are already available.
International Nuclear Information System (INIS)
Basic principles of neutron activation analysis are outlined. Examples of its use in police science include analysis for gunshot residues, toxic element determinations and multielement comparisons. Advantages of neutron activation analysis over other techniques are described. (R.L.)
Papageorgiou, Nikolaos S
2009-01-01
Offers an examination of important theoretical methods and procedures in applied analysis. This book details the important theoretical trends in nonlinear analysis and applications to different fields. It is suitable for those working on nonlinear analysis.
Shape analysis in medical image analysis
Tavares, João
2014-01-01
This book contains thirteen contributions from invited experts of international recognition addressing important issues in shape analysis in medical image analysis, including techniques for image segmentation, registration, modelling and classification, and applications in biology, as well as in cardiac, brain, spine, chest, lung and clinical practice. This volume treats topics such as, anatomic and functional shape representation and matching; shape-based medical image segmentation; shape registration; statistical shape analysis; shape deformation; shape-based abnormity detection; shape tracking and longitudinal shape analysis; machine learning for shape modeling and analysis; shape-based computer-aided-diagnosis; shape-based medical navigation; benchmark and validation of shape representation, analysis and modeling algorithms. This work will be of interest to researchers, students, and manufacturers in the fields of artificial intelligence, bioengineering, biomechanics, computational mechanics, computationa...
Foundations of factor analysis
Mulaik, Stanley A
2009-01-01
Introduction; Factor Analysis and Structural Theories; Brief History of Factor Analysis as a Linear Model; Example of Factor Analysis; Mathematical Foundations for Factor Analysis: Introduction, Scalar Algebra, Vectors, Matrix Algebra, Determinants, Treatment of Variables as Vectors, Maxima and Minima of Functions; Composite Variables and Linear Transformations: Introduction, Composite Variables, Unweighted Composite Variables, Differentially Weighted Composites, Matrix Equations; Multi
Institute of Scientific and Technical Information of China (English)
杜梅香
2006-01-01
This paper is about discourse analysis and illustrates the approach of analyzing critical discourses, including discourses about the educational situation of China. It also includes condensed theoretical support for critical discourse analysis and an analysis of Sample I, the discourses between an illiterate person and a literate one.
International Nuclear Information System (INIS)
Using OrCAD PSpice EDA software, circuit simulation and analyses such as transient analysis, noise analysis and temperature analysis are carried out for a charge-sensitive preamplifier. By calculation and comparison, the results show the circuit noise response as the temperature changes. (authors)
Quantitative analysis chemistry
International Nuclear Information System (INIS)
This book is about quantitative analytical chemistry. It is divided into ten chapters, which deal with basic concepts, including the meaning of analytical chemistry and SI units; chemical equilibrium; basic preparation for quantitative analysis; an introduction to volumetric analysis; an outline of acid-base titration with experimental examples; chelate titration; oxidation-reduction titration, with an introduction, titration curves and diazotization titration; precipitation titration; electrometric titration; and quantitative analysis.
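As a minimal, hand-checkable illustration of the volumetric-analysis material this record lists, the equivalence-point relation of a monoprotic acid-base titration can be sketched as follows (the function name and the concentrations are illustrative, not taken from the book):

```python
def analyte_concentration(c_titrant, v_titrant, v_analyte):
    """Analyte concentration at the equivalence point of a monoprotic
    acid-base titration, from c_a * V_a = c_t * V_t (any consistent units)."""
    return c_titrant * v_titrant / v_analyte

# Example: 25.0 mL of 0.100 M NaOH neutralises a 20.0 mL HCl sample.
c_hcl = analyte_concentration(0.100, 25.0, 20.0)  # 0.125 M
```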
Energy Technology Data Exchange (ETDEWEB)
PECH, S.H.
2000-08-23
This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Final Safety Analysis Report. This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.
Analysis of Precision of Activation Analysis Method
DEFF Research Database (Denmark)
Heydorn, Kaj; Nørgaard, K.
1973-01-01
The precision of an activation-analysis method prescribes the estimation of the precision of a single analytical result. The adequacy of these estimates to account for the observed variation between duplicate results from the analysis of different samples and materials is tested by the statistic T...
Hazard Analysis Database Report
Grams, W H
2000-01-01
The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for HNF-SD-WM-SAR-067, Tank Farms Final Safety Analysis Report (FSAR). The FSAR is part of the approved Authorization Basis (AB) for the River Protection Project (RPP). This document describes, identifies, and defines the contents and structure of the Tank Farms FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The Hazard Analysis Database supports the preparation of Chapters 3, 4, and 5 of the Tank Farms FSAR and the Unreviewed Safety Question (USQ) process and consists of two major, interrelated data sets: (1) Hazard Analysis Database: Data from t...
Cluster analysis for applications
Anderberg, Michael R
1973-01-01
Cluster Analysis for Applications deals with methods and various applications of cluster analysis. Topics covered range from variables and scales to measures of association among variables and among data units. Conceptual problems in cluster analysis are discussed, along with hierarchical and non-hierarchical clustering methods. The necessary elements of data analysis, statistics, cluster analysis, and computer implementation are integrated vertically to cover the complete path from raw data to a finished analysis. Comprised of 10 chapters, this book begins with an introduction to the subject o...
Santiago, John
2013-01-01
Circuits overloaded from electric circuit analysis? Many universities require that students pursuing a degree in electrical or computer engineering take an Electric Circuit Analysis course to determine who will "make the cut" and continue in the degree program. Circuit Analysis For Dummies will help these students to better understand electric circuit analysis by presenting the information in an effective and straightforward manner. Circuit Analysis For Dummies gives you clear-cut information about the topics covered in electric circuit analysis courses to help
Event history analysis: overview
DEFF Research Database (Denmark)
Keiding, Niels
2001-01-01
Survival analysis, Multi-state models, Counting processes, Aalen-Johansen estimator, Markov processes
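The Aalen-Johansen estimator named in these keywords generalises the Kaplan-Meier product-limit estimator to multi-state models; a bare-bones Kaplan-Meier sketch in pure Python (function and variable names are illustrative) is:

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate S(t) at each distinct event time.
    times: observation times; events: 1 = event observed, 0 = censored."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    ts = [times[k] for k in order]
    ev = [events[k] for k in order]
    n_at_risk = len(ts)
    surv, s, i = [], 1.0, 0
    while i < len(ts):
        t = ts[i]
        d = n_leaving = 0  # events and total observations leaving at time t
        while i < len(ts) and ts[i] == t:
            d += ev[i]
            n_leaving += 1
            i += 1
        if d:
            s *= 1.0 - d / n_at_risk
            surv.append((t, s))
        n_at_risk -= n_leaving
    return surv
```

Censored observations leave the risk set without contributing a factor, which is exactly what distinguishes this estimator from a naive empirical survival curve.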
Directory of Open Access Journals (Sweden)
Georgiana Cristina NUKINA
2012-07-01
Full Text Available Through the risk analysis model developed, it is decided whether control measures are suitable for implementation. The analysis also determines whether the benefits of a given control option outweigh the cost of its implementation.
Joint fluid analysis; Joint fluid aspiration ... El-Gabalawy HS. Synovial fluid analysis, synovial biopsy, and synovial pathology. In: Firestein GS, Budd RC, Gabriel SE, McInnes IB, O'Dell JR, eds. Kelly's Textbook of ...
Energy Technology Data Exchange (ETDEWEB)
Hostick, Donna J.; Nicholls, Andrew K.; McDonald, Sean C.; Hollomon, Jonathan B.
2005-08-01
A joint NREL, ORNL, and PNNL team conducted market analysis to help inform DOE/EERE's Weatherization and Intergovernmental Program planning and management decisions. This chapter presents the results of the market analysis for the Buildings sector.
Deckert, George
2010-01-01
This viewgraph presentation reviews The NASA Hazard Analysis process. The contents include: 1) Significant Incidents and Close Calls in Human Spaceflight; 2) Subsystem Safety Engineering Through the Project Life Cycle; 3) The Risk Informed Design Process; 4) Types of NASA Hazard Analysis; 5) Preliminary Hazard Analysis (PHA); 6) Hazard Analysis Process; 7) Identify Hazardous Conditions; 8) Consider All Interfaces; 9) Work a Preliminary Hazard List; 10) NASA Generic Hazards List; and 11) Final Thoughts
Fitzmaurice, Garrett M; Ware, James H
2012-01-01
Praise for the First Edition: ". . . [this book] should be on the shelf of everyone interested in . . . longitudinal data analysis." -Journal of the American Statistical Association. Features newly developed topics and applications of the analysis of longitudinal data. Applied Longitudinal Analysis, Second Edition presents modern methods for analyzing data from longitudinal studies and now features the latest state-of-the-art techniques. The book emphasizes practical, rather than theoretical, aspects of methods for the analysis of diverse types of lo...
DEFF Research Database (Denmark)
Durbin, Richard; Eddy, Sean; Krogh, Anders Stærmose;
This book provides an up-to-date and tutorial-level overview of sequence analysis methods, with particular emphasis on probabilistic modelling. Discussed methods include pairwise alignment, hidden Markov models, multiple alignment, profile searches, RNA secondary structure analysis, and phylogene...
Analysis of Business Environment
Horáková, Eva
2012-01-01
This bachelor's thesis deals with an analysis of the entrepreneurial environment of the company ALBO okna - dveře, s.r.o. With the help of a SWOT analysis, passportisation and Porter's analysis of five rival companies, an analysis of the entrepreneurial environment in which the company ALBO okna - dveře, s.r.o. is situated will be carried out. The thesis also comprises an evaluation of the opportunities and risks resulting from the given entrepreneurial environment.
SOFAS: Software Analysis Services
Ghezzi, G
2010-01-01
We propose a distributed and collaborative software analysis platform to enable seamless interoperability of software analysis tools across platform, geographical and organizational boundaries. In particular, we devise software analysis tools as services that can be accessed and composed over the Internet. These distributed services shall be widely accessible through a software analysis broker where organizations and research groups can register and share their tools. To enable (semi)-automat...
Regression analysis by example
Chatterjee, Samprit
2012-01-01
Praise for the Fourth Edition: "This book is . . . an excellent source of examples for regression analysis. It has been and still is readily readable and understandable." -Journal of the American Statistical Association. Regression analysis is a conceptually simple method for investigating relationships among variables. Carrying out a successful application of regression analysis, however, requires a balance of theoretical results, empirical rules, and subjective judgment. Regression Analysis by Example, Fifth Edition has been expanded
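As a hedged sketch of the example-driven approach this record describes, ordinary least squares with a single predictor reduces to closed-form slope and intercept formulas (the code and data below are illustrative, not taken from the book):

```python
def fit_line(xs, ys):
    """Least-squares fit of y = b0 + b1*x; returns (b0, b1).
    Assumes the xs are not all identical (otherwise sxx is zero)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    b1 = sxy / sxx          # slope
    return my - b1 * mx, b1  # intercept, slope

# Data lying exactly on y = 1 + 2x is recovered exactly.
b0, b1 = fit_line([0, 1, 2, 3], [1, 3, 5, 7])
```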
Energy Technology Data Exchange (ETDEWEB)
Corliss, William R.
1968-01-01
In activation analysis, a sample of an unknown material is first irradiated (activated) with nuclear particles. In practice these nuclear particles are almost always neutrons. The success of activation analysis depends upon nuclear reactions which are completely independent of an atom's chemical associations. The value of activation analysis as a research tool was recognized almost immediately upon the discovery of artificial radioactivity. This book discusses activation analysis experiments, applications and technical considerations.
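The buildup of induced radioactivity during irradiation follows the standard activation equation A(t) = φσN(1 − e^(−λt)); a small sketch under illustrative parameter values (the flux, cross-section and atom count below are assumptions, not from this book) is:

```python
import math

def induced_activity(phi, sigma, n_atoms, half_life, t_irr):
    """Induced activity (decays/s): A = phi * sigma * N * (1 - exp(-lambda*t)).
    phi: neutron flux (n/cm^2/s), sigma: cross-section (cm^2),
    n_atoms: number of target atoms; half_life and t_irr in the same unit."""
    lam = math.log(2.0) / half_life
    return phi * sigma * n_atoms * (1.0 - math.exp(-lam * t_irr))

# Saturation activity is phi*sigma*N; irradiating for one half-life
# yields exactly half of it.
saturation = 1e13 * 1e-24 * 1e20
a_half = induced_activity(1e13, 1e-24, 1e20, 3600.0, 3600.0)
```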
Computational movement analysis
Laube, Patrick
2014-01-01
This SpringerBrief discusses the characteristics of spatiotemporal movement data, including uncertainty and scale. It investigates three core aspects of Computational Movement Analysis: Conceptual modeling of movement and movement spaces, spatiotemporal analysis methods aiming at a better understanding of movement processes (with a focus on data mining for movement patterns), and using decentralized spatial computing methods in movement analysis. The author presents Computational Movement Analysis as an interdisciplinary umbrella for analyzing movement processes with methods from a range of fi
Discourse analysis and Foucault's
Directory of Open Access Journals (Sweden)
Jansen I.
2008-01-01
Full Text Available Discourse analysis is a method which up to now has been little recognised in nursing science, although more recently nursing scientists are discovering it for their purposes. However, several authors have criticized that discourse analysis is often misinterpreted because of a lack of understanding of its theoretical backgrounds. In this article, I reconstruct Foucault's writings in his "Archaeology of Knowledge" to provide a theoretical base for future archaeological discourse analysis, which can be categorized as a socio-linguistic discourse analysis.
Extending Scalasca's analysis features
Lorenz, Daniel; Böhme, David; Mohr, Bernd; Strube, Alexandre; Szebenyi, Zoltan
2013-01-01
Scalasca is a performance analysis tool, which parses the trace of an application run for certain patterns that indicate performance inefficiencies. In this paper, we present recently developed new features in Scalasca. In particular, we describe two newly implemented analysis methods: the root cause analysis, which tries to identify the cause of a delay, and the critical path analysis, which analyses the path of execution that determines the application runtime. Furthermore, we present time-s...
Yuan, Ying; MacKinnon, David P.
2009-01-01
This article proposes Bayesian analysis of mediation effects. Compared to conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian mediation analysis, inference is straightforward and exact, which makes it appealing for studies with small samples. Third, the Bayesian approach is conceptua...
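As a toy sketch of the mediation effect being estimated (X → M → Y), the indirect effect is the product of the X→M slope a and the M→Y slope b. The noise-free data and names below are illustrative, and the sketch assumes no direct X→Y path (the article's Bayesian machinery is not reproduced here):

```python
def slope(xs, ys):
    """Simple-regression slope of ys on xs."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

x = [0, 1, 2, 3]
m = [2 * xi for xi in x]   # mediator: a = 2
y = [3 * mi for mi in m]   # outcome:  b = 3
a = slope(x, m)
b = slope(m, y)
indirect = a * b           # mediated (indirect) effect a*b = 6
```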
Hibbert, DBrynn
2005-01-01
Based on D Brynn Hibbert''s lectures on data analysis to undergraduates and graduate students, this book covers topics including measurements, means and confidence intervals, hypothesis testing, analysis of variance, and calibration models. It is meant as an entry level book targeted at learning and teaching undergraduate data analysis.
Cuesta, Hector
2013-01-01
Each chapter of the book quickly introduces a key 'theme' of Data Analysis, before immersing you in the practical aspects of each theme. You'll learn quickly how to perform all aspects of Data Analysis. Practical Data Analysis is a book ideal for home and small business users who want to slice & dice the data they have on hand with minimum hassle.
Yuan, Ying; MacKinnon, David P.
2009-01-01
In this article, we propose Bayesian analysis of mediation effects. Compared with conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian…
Cirillo, William M.; Earle, Kevin D.; Goodliff, Kandyce E.; Reeves, J. D.; Stromgren, Chel; Andraschko, Mark R.; Merrill, R. Gabe
2008-01-01
NASA's Constellation Program employs a strategic analysis methodology in providing an integrated analysis capability of Lunar exploration scenarios and to support strategic decision-making regarding those scenarios. The strategic analysis methodology integrates the assessment of the major contributors to strategic objective satisfaction (performance, affordability, and risk) and captures the linkages and feedbacks between all three components. Strategic analysis supports strategic decision making by senior management through comparable analysis of alternative strategies, provision of a consistent set of high level value metrics, and the enabling of cost-benefit analysis. The tools developed to implement the strategic analysis methodology are not element design and sizing tools. Rather, these models evaluate strategic performance using predefined elements, imported into a library from expert-driven design/sizing tools or expert analysis. Specific components of the strategic analysis tool set include scenario definition, requirements generation, mission manifesting, scenario lifecycle costing, crew time analysis, objective satisfaction benefit, risk analysis, and probabilistic evaluation. Results from all components of strategic analysis are evaluated against a set of pre-defined figures of merit (FOMs). These FOMs capture the high-level strategic characteristics of all scenarios and facilitate direct comparison of options. The strategic analysis methodology that is described in this paper has previously been applied to the Space Shuttle and International Space Station Programs and is now being used to support the development of the baseline Constellation Program lunar architecture. This paper will present an overview of the strategic analysis methodology and will present sample results from its application to the Constellation Program lunar architecture.
Foundations of mathematical analysis
Johnsonbaugh, Richard
2010-01-01
This classroom-tested volume offers a definitive look at modern analysis, with views of applications to statistics, numerical analysis, Fourier series, differential equations, mathematical analysis, and functional analysis. Upper-level undergraduate students with a background in calculus will benefit from its teachings, along with beginning graduate students seeking a firm grounding in modern analysis. A self-contained text, it presents the necessary background on the limit concept, and the first seven chapters could constitute a one-semester introduction to limits. Subsequent chapters discuss
Mathematical analysis fundamentals
Bashirov, Agamirza
2014-01-01
The author's goal is a rigorous presentation of the fundamentals of analysis, starting from the elementary level and moving to advanced coursework. The curricula of all mathematics (pure or applied) and physics programs include a compulsory course in mathematical analysis. This book can serve as the main textbook for such (one-semester) courses. The book can also serve as additional reading for courses such as real analysis, functional analysis, harmonic analysis etc. For non-math-major students requiring math beyond calculus, this is a more friendly approach than many math-centric o...
Gabriel Data Analysis (GDA): from data analysis to food analysis
OLIVE, Gilles
2011-01-01
GDA is a software belonging to the Gabriel package and is devoted to data analysis. Year after year some new features have been introduced and the latest introductions are more dedicated to food. GDA is built around modules and we describe here the most widely used in food chemistry. GDA can be obtained free of charge upon request.
Multivariate analysis with LISREL
Jöreskog, Karl G; Y Wallentin, Fan
2016-01-01
This book traces the theory and methodology of multivariate statistical analysis and shows how it can be conducted in practice using the LISREL computer program. It presents not only the typical uses of LISREL, such as confirmatory factor analysis and structural equation models, but also several other multivariate analysis topics, including regression (univariate, multivariate, censored, logistic, and probit), generalized linear models, multilevel analysis, and principal component analysis. It provides numerous examples from several disciplines and discusses and interprets the results, illustrated with sections of output from the LISREL program, in the context of the example. The book is intended for masters and PhD students and researchers in the social, behavioral, economic and many other sciences who require a basic understanding of multivariate statistical theory and methods for their analysis of multivariate data. It can also be used as a textbook on various topics of multivariate statistical analysis.
DEFF Research Database (Denmark)
Bøving, Kristian Billeskov; Simonsen, Jesper
2004-01-01
This article documents how log analysis can inform qualitative studies concerning the usage of web-based information systems (WIS). No prior research has used http log files as data to study collaboration between multiple users in organisational settings. We investigate how to perform http log analysis; what http log analysis says about the nature of collaborative WIS use; and how results from http log analysis may support other data collection methods such as surveys, interviews, and observation. The analysis of log files initially lends itself to research designs which serve to test hypotheses using a quantitative methodology. We show that http log analysis can also be valuable in qualitative research such as case studies. The results from http log analysis can be triangulated with other data sources and for example serve as a means of supporting the interpretation of interview data...
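A minimal sketch of http log analysis as a first quantitative step is parsing a single Apache-style access-log line with a regular expression. The field pattern and the sample line below are illustrative (the article does not prescribe a parser):

```python
import re

# Common Log Format-style pattern; group names are illustrative.
LOG_RE = re.compile(
    r'(?P<host>\S+) \S+ (?P<user>\S+) \[(?P<ts>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) (?P<size>\d+|-)'
)

line = ('192.0.2.1 - alice [10/Oct/2000:13:55:36 -0700] '
        '"GET /doc/index.html HTTP/1.0" 200 2326')
hit = LOG_RE.match(line).groupdict()  # one record: host, user, path, status...
```

Aggregating such records per user or per path is what yields the collaboration patterns discussed above.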
Amir Farbin
The ATLAS Analysis Model is a continually developing vision of how to reconcile physics analysis requirements with the ATLAS offline software and computing model constraints. In the past year this vision has influenced the evolution of the ATLAS Event Data Model, the Athena software framework, and physics analysis tools. These developments, along with the October Analysis Model Workshop and the planning for CSC analyses have led to a rapid refinement of the ATLAS Analysis Model in the past few months. This article introduces some of the relevant issues and presents the current vision of the future ATLAS Analysis Model. Event Data Model The ATLAS Event Data Model (EDM) consists of several levels of details, each targeted for a specific set of tasks. For example the Event Summary Data (ESD) stores calorimeter cells and tracking system hits thereby permitting many calibration and alignment tasks, but will be only accessible at particular computing sites with potentially large latency. In contrast, the Analysis...
Fundamentals of functional analysis
Farenick, Douglas
2016-01-01
This book provides a unique path for graduate or advanced undergraduate students to begin studying the rich subject of functional analysis with fewer prerequisites than is normally required. The text begins with a self-contained and highly efficient introduction to topology and measure theory, which focuses on the essential notions required for the study of functional analysis, and which are often buried within full-length overviews of the subjects. This is particularly useful for those in applied mathematics, engineering, or physics who need to have a firm grasp of functional analysis, but not necessarily some of the more abstruse aspects of topology and measure theory normally encountered. The reader is assumed to only have knowledge of basic real analysis, complex analysis, and algebra. The latter part of the text provides an outstanding treatment of Banach space theory and operator theory, covering topics not usually found together in other books on functional analysis. Written in a clear, concise manner,...
DEFF Research Database (Denmark)
This book provides an in-depth introduction and overview of current research in computational music analysis. Its seventeen chapters, written by leading researchers, collectively represent the diversity as well as the technical and philosophical sophistication of the work being done today ... on well-established theories in music theory and analysis, such as Forte's pitch-class set theory, Schenkerian analysis, the methods of semiotic analysis developed by Ruwet and Nattiez, and Lerdahl and Jackendoff's Generative Theory of Tonal Music. The book is divided into six parts, covering ... music analysis. The book provides an invaluable resource for researchers, teachers and students in music theory and analysis, computer science, music information retrieval and related disciplines. It also provides a state-of-the-art reference for practitioners in the music technology industry.
International Nuclear Information System (INIS)
The heat transport system of MONJU comprises three main heat transport loops; each loop consists of a primary loop, a secondary loop and a water-steam system, in addition to the auxiliary cooling system. These systems influence one another during plant transients, so it is important to evaluate the flow and heat characteristics of the heat transport systems when calculating plant transients. We developed plant dynamic analysis codes for MONJU to calculate plant transient behaviour and to evaluate the plant characteristics under disturbances during on-power operation, as well as the performance of the plant control systems. In this paper, one of the main plant dynamic simulation codes of MONJU, the calculation conditions for the analysis, the plant safety analysis, the plant stability analysis and the plant thermal transient analysis are described. (author)
Johnson, Andrew
2009-01-01
This essay provides a detailed strategic analysis of 3rdWhale, a Vancouver-based start-up in the sustainability sector, along with an analysis of the smartphone applications industry. Porter's five forces model is used to perform an industry analysis of the smartphone application industry and identify key success factors for application developers. Using the identified factors, 3rdWhale is compared to its indirect competitors to identify opportunities and threats and produce a range of strate...
Procházková, Gabriela
2014-01-01
The Bachelor thesis focuses on an analysis of the marketing mix applied in the company Billa Ltd. The theoretical part focuses on a general description of the marketing mix and its individual elements. The practical part covers the history and present of the company Billa Ltd. It also includes a specific analysis of the individual elements of the marketing mix of the company Billa Ltd. and an analysis of the results of the electronic questioning, which focuses on br...
Crisan, Dan
2011-01-01
"Stochastic Analysis" aims to provide mathematical tools to describe and model high dimensional random systems. Such tools arise in the study of Stochastic Differential Equations and Stochastic Partial Differential Equations, Infinite Dimensional Stochastic Geometry, Random Media and Interacting Particle Systems, Super-processes, Stochastic Filtering, Mathematical Finance, etc. Stochastic Analysis has emerged as a core area of late 20th century Mathematics and is currently undergoing a rapid scientific development. The special volume "Stochastic Analysis 2010" provides a sa
Mastering Clojure data analysis
Rochester, Eric
2014-01-01
This book consists of a practical, example-oriented approach that aims to help you learn how to use Clojure for data analysis quickly and efficiently.This book is great for those who have experience with Clojure and who need to use it to perform data analysis. This book will also be hugely beneficial for readers with basic experience in data analysis and statistics.
International Nuclear Information System (INIS)
Our analysis program LULU has proven very useful in all stages of experiment analysis, from pre-run detector debugging through final data reduction. It has solved our problem of having arbitrary word-length events and is easy enough to use that many separate experimenters are now analyzing with LULU. The ability to use the same software for all stages of experiment analysis greatly eases the programming burden. We may even get around to making the graphics elegant someday.
Möncke, Ulrich R.
1986-01-01
This paper specifies the theoretical basis for the implementation of different generators of the OPTRAN system. Grammar flow analysis transports the techniques of data flow analysis to the meta level of compiler construction. The analogue of the states in data flow analysis are the syntax trees, together with some information which is associated with trees by propagation functions. One example is the association of characteristic graphs; another example is the association of sets of matching tre...
Ranganath, Rajesh; Perotte, Adler; Elhadad, Noémie; Blei, David
2016-01-01
The electronic health record (EHR) provides an unprecedented opportunity to build actionable tools to support physicians at the point of care. In this paper, we investigate survival analysis in the context of EHR data. We introduce deep survival analysis, a hierarchical generative approach to survival analysis. It departs from previous approaches in two primary ways: (1) all observations, including covariates, are modeled jointly conditioned on a rich latent structure; and (2) the observation...
DEFF Research Database (Denmark)
Clemmensen, Line Katrine Harder; Hastie, Trevor; Witten, Daniela;
2011-01-01
commonplace in biological and medical applications. In this setting, a traditional approach involves performing feature selection before classification. We propose sparse discriminant analysis, a method for performing linear discriminant analysis with a sparseness criterion imposed such that classification and feature selection are performed simultaneously. Sparse discriminant analysis is based on the optimal scoring interpretation of linear discriminant analysis, and can be extended to perform sparse discrimination via mixtures of Gaussians if boundaries between classes are nonlinear or if subgroups are present within each class. Our proposal also provides low-dimensional views of the discriminative directions. © 2011 American Statistical Association and the American Society for Quality.
Chemical Security Analysis Center
Federal Laboratory Consortium — In 2006, by Presidential Directive, DHS established the Chemical Security Analysis Center (CSAC) to identify and assess chemical threats and vulnerabilities in the...
Textile Technology Analysis Lab
Federal Laboratory Consortium — The Textile Analysis Lab is built for evaluating and characterizing the physical properties of an array of textile materials, but specifically those used in aircrew...
Circuit analysis with Multisim
Baez-Lopez, David
2011-01-01
This book is concerned with circuit simulation using National Instruments Multisim. It focuses on the use and comprehension of the working techniques for electrical and electronic circuit simulation. The first chapters are devoted to basic circuit analysis.It starts by describing in detail how to perform a DC analysis using only resistors and independent and controlled sources. Then, it introduces capacitors and inductors to make a transient analysis. In the case of transient analysis, it is possible to have an initial condition either in the capacitor voltage or in the inductor current, or bo
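As a hand-checkable counterpart to the DC analysis described above (resistors with independent sources), the classic two-resistor voltage divider can be sketched as follows (values are illustrative, not from the book):

```python
def divider_voltage(v_in, r1, r2):
    """Output of a two-resistor voltage divider: V_out = V_in * R2 / (R1 + R2)."""
    return v_in * r2 / (r1 + r2)

# 12 V across 1 kOhm + 2 kOhm in series; the 2 kOhm resistor sees 8 V.
v_out = divider_voltage(12.0, 1000.0, 2000.0)
```

A simulator such as Multisim should reproduce this value in a DC operating-point analysis of the same circuit.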
Federal Laboratory Consortium — FUNCTION: Uses state-of-the-art instrumentation for qualitative and quantitative analysis of organic and inorganic compounds, and biomolecules from gas, liquid, and...
Risk analysis methodology survey
Batson, Robert G.
1987-01-01
NASA regulations require that formal risk analysis be performed on a program at each of several milestones as it moves toward full-scale development. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from simple methods to complex network-based simulation, were surveyed. A Program Risk Analysis Handbook was prepared in order to provide both analyst and manager with a guide for selection of the most appropriate technique.
2010-01-01
Finite element analysis is an engineering method for the numerical analysis of complex structures. This book provides a bird's eye view on this very broad matter through 27 original and innovative research studies exhibiting various investigation directions. Through its chapters the reader will have access to works related to Biomedical Engineering, Materials Engineering, Process Analysis and Civil Engineering. The text is addressed not only to researchers, but also to professional engineers, engineering lecturers and students seeking to gain a better understanding of where Finite Element Analysis stands today.
Sensitivity and uncertainty analysis
Cacuci, Dan G; Navon, Ionel Michael
2005-01-01
As computer-assisted modeling and analysis of physical processes have continued to grow and diversify, sensitivity and uncertainty analyses have become indispensable scientific tools. Sensitivity and Uncertainty Analysis. Volume I: Theory focused on the mathematical underpinnings of two important methods for such analyses: the Adjoint Sensitivity Analysis Procedure and the Global Adjoint Sensitivity Analysis Procedure. This volume concentrates on the practical aspects of performing these analyses for large-scale systems. The applications addressed include two-phase flow problems, a radiative c
National Oceanic and Atmospheric Administration, Department of Commerce — Space Weather Analysis archives are model output of ionospheric, thermospheric and magnetospheric particle populations, energies and electrodynamics
DEFF Research Database (Denmark)
Sørensen, Olav Jull
2009-01-01
The review presents the book International Market Analysis: Theories and Methods, written by John Kuiada, professor at the Centre of International Business, Department of Business Studies, Aalborg University. The book is refreshingly new in its way of looking at a classical problem. It looks at market analysis from the point of view of ways of thinking about markets. Furthermore, the book includes the concept of learning in the analysis of markets and discusses how the way we understand business reality influences our choice of methodology for market analysis.
Hazard Analysis Database Report
Energy Technology Data Exchange (ETDEWEB)
GRAMS, W.H.
2000-12-28
The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for HNF-SD-WM-SAR-067, Tank Farms Final Safety Analysis Report (FSAR). The FSAR is part of the approved Authorization Basis (AB) for the River Protection Project (RPP). This document describes, identifies, and defines the contents and structure of the Tank Farms FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The Hazard Analysis Database supports the preparation of Chapters 3, 4, and 5 of the Tank Farms FSAR and the Unreviewed Safety Question (USQ) process and consists of two major, interrelated data sets: (1) Hazard Analysis Database: Data from the results of the hazard evaluations, and (2) Hazard Topography Database: Data from the system familiarization and hazard identification.
Murty, PSR
2007-01-01
Power system analysis is a prerequisite course for electrical engineering students. This book introduces the concepts of a power system, the network model, fault analysis, and primitive network stability. It also deals with graph theory relevant to various incidence matrices, the building of network matrices, and power flow studies. It further discusses short circuit analysis, unbalanced fault analysis, and power system stability problems such as steady state stability, transient stability, and dynamic stability. Salient features: numerous worked examples follow the theory.
HIRENASD analysis Information Package
National Aeronautics and Space Administration — Updated November 2, 2011. Contains summary information and analysis condition details for the Aeroelastic Prediction Workshop. Information plotted in this package is...
Thermogravimetric Analysis Laboratory
Federal Laboratory Consortium — At NETL’s Thermogravimetric Analysis Laboratory in Morgantown, WV, researchers study how chemical looping combustion (CLC) can be applied to fossil energy systems....
Risk Analysis of Marine Structures
DEFF Research Database (Denmark)
Hansen, Peter Friis
1998-01-01
Basic concepts of risk analysis are introduced. Formulation and analysis of fault and event trees are treated.
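The fault- and event-tree formalism mentioned in this abstract can be sketched numerically. The gate functions below are a minimal illustration assuming independent basic events; the event names and probabilities are hypothetical, not taken from the book.

```python
# Minimal fault-tree evaluation sketch (illustrative; basic events assumed
# independent). OR-gates combine via P(A or B) = 1 - (1-pA)(1-pB);
# AND-gates via P(A and B) = pA * pB.
from functools import reduce

def and_gate(probs):
    """Probability that all input events occur."""
    return reduce(lambda acc, p: acc * p, probs, 1.0)

def or_gate(probs):
    """Probability that at least one input event occurs."""
    return 1.0 - reduce(lambda acc, p: acc * (1.0 - p), probs, 1.0)

# Hypothetical top event: (pump fails AND backup fails) OR control fault
pump, backup, control = 0.01, 0.05, 0.001
top = or_gate([and_gate([pump, backup]), control])
print(top)
```

Event trees work the same way in reverse: each branch point multiplies a conditional probability along the accident sequence.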
Advanced Analysis Environments - Summary
International Nuclear Information System (INIS)
This is a summary of the panel discussion on Advanced Analysis Environments. Rene Brun, Tony Johnson, and Lassi Tuura shared their insights about the trends and challenges in analysis environments. This paper contains the initial questions, a summary of the speakers' presentations, and the questions asked by the audience.
Structural analysis for Diagnosis
DEFF Research Database (Denmark)
Izadi-Zamanabadi, Roozbeh; Blanke, M.
2001-01-01
Aiming at design of algorithms for fault diagnosis, structural analysis of systems offers concise yet easy overall analysis. Graph-based matching, which is the essential technique to obtain redundant information for diagnosis, is re-considered in this paper. Matching is re-formulated as a problem...
Laboratory analysis of stardust.
Zinner, Ernst
2013-02-01
Tiny dust grains extracted from primitive meteorites are identified to have originated in the atmospheres of stars on the basis of their anomalous isotopic compositions. Although isotopic analysis with the ion microprobe plays a major role in the laboratory analysis of these stardust grains, many other microanalytical techniques are applied to extract the maximum amount of information.
Louis Kaplow; Steven Shavell
1999-01-01
This entry for the forthcoming The New Palgrave Dictionary of Economics (Second Edition) surveys the economic analysis of five primary fields of law: property law; liability for accidents; contract law; litigation; and public enforcement and criminal law. It also briefly considers some criticisms of the economic analysis of law.
Food risk analysis is a holistic approach to food safety because it considers all aspects of the problem. Risk assessment modeling is the foundation of food risk analysis. Proper design and simulation of the risk assessment model is important to properly predict and control risk. Because of knowl...
Donahue, Craig J.; Rais, Elizabeth A.
2009-01-01
This lab experiment illustrates the use of thermogravimetric analysis (TGA) to perform proximate analysis on a series of coal samples of different rank. Peat and coke are also examined. A total of four exercises are described. These are dry exercises as students interpret previously recorded scans. The weight percent moisture, volatile matter,…
DEFF Research Database (Denmark)
Skrypnyuk, Nataliya; Nielson, Flemming; Pilegaard, Henrik
2012-01-01
-type synchronisation – in particular, on our variant of IMC with a more permissive syntax, i.e. with a possibility to start a bounded number of new processes. We prove that the defined Pathway Analysis captures all the properties of the systems, i.e. is precise. The results of the Pathway Analysis can be therefore...
Directory of Open Access Journals (Sweden)
Ian M. Franks
2004-06-01
This book addresses and appropriately explains the notational analysis of technique, tactics, individual athlete/team exercise and work-rate in sport. The book offers guidance in: developing a system, analysis of data, effective coaching using notational performance analysis, and modeling sport behaviors. It updates and improves the 1997 edition.
DEFF Research Database (Denmark)
Josefsen, Knud; Nielsen, Henrik
2011-01-01
Northern blotting analysis is a classical method for analysis of the size and steady-state level of a specific RNA in a complex sample. In short, the RNA is size-fractionated by gel electrophoresis and transferred by blotting onto a membrane to which the RNA is covalently bound. Then, the membrane...
Flynn, Donald L.
Traditional approaches to the cost analysis of educational programs involve examining annual budgets. Such approaches do not properly consider the cost of either new capital expenditures or the current value of previously purchased items. This paper presents the methodology for a new approach to educational cost analysis that identifies the actual…
Analysis of Design Documentation
DEFF Research Database (Denmark)
Hansen, Claus Thorp
1998-01-01
has been established where we seek to identify useful design work patterns by retrospective analyses of documentation created during design projects. This paper describes the analysis method, a tentatively defined metric to evaluate identified work patterns, and presents results from the first analysis accomplished.
Chakravorti, Sivaji
2015-01-01
This book prepares newcomers to dive into the realm of electric field analysis. The book details why one should perform electric field analysis and what are its practical implications. It emphasizes both the fundamentals and modern computational methods of electric machines. The book covers practical applications of the numerical methods in high voltage equipment, including transmission lines, power transformers, cables, and gas insulated systems.
Spool assembly support analysis
International Nuclear Information System (INIS)
This document provides the wind/seismic analysis and evaluation for the pump pit spool assemblies. Hand calculations were used for the analysis. UBC, AISC, and load factors were used in this evaluation. The results show that the actual loads are under the allowable loads and all requirements are met
Analysis of Industrial Wastewaters.
Mancy, K. H.; Weber, W. J., Jr.
A comprehensive, documented discussion of certain operating principles useful as guidelines for the analysis of industrial wastewaters is presented. Intended primarily for the chemist, engineer, or other professional person concerned with all aspects of industrial wastewater analysis, it is not to be considered as a substitute for standard manuals…
Analysis in usability evaluations
DEFF Research Database (Denmark)
Følstad, Asbjørn; Lai-Chong Law, Effie; Hornbæk, Kasper Anders Søren
2010-01-01
While the planning and implementation of usability evaluations are well described in the literature, the analysis of the evaluation data is not. We present interviews with 11 usability professionals on how they conduct analysis, describing the resources, collaboration, creation of recommendations...
Liberdová, I.
2015-01-01
This article is focused on dynamic drawing analysis. It deals with temporal segmentation methods for hand-drawn pictures. The automatic vectorization of segmentation results is considered as well. Dynamic drawing analysis may significantly improve the utilization of tracing drawing tests in clinical physiology trials.
Casebeer, A
SWOT analysis is an effective and simple planning technique which addresses one aspect of many strategic planning processes. Given the complex nature of modern health care systems, the ability to use this type of technique can enable health professionals to participate more fully in the analysis and implementation of health care improvement. PMID:8472105
2013-01-01
Time series analysis can be used to quantitatively monitor, describe, explain, and predict road safety developments. Time series analysis techniques offer the possibility of quantitatively modelling road safety developments in such a way that the dependencies between the observations of time series
Bass, Roger
2010-01-01
Zen's challenge for behavior analysis is to explain a repertoire that renders analysis itself meaningless--a result following not from scientific or philosophical arguments but rather from a unique verbal history generated by Zen's methods. Untying Zen's verbal knots suggests how meditation's and koans' effects on verbal behavior contribute to…
Per Object statistical analysis
DEFF Research Database (Denmark)
Groom, Geoffrey Brian
2008-01-01
This RS code performs object-by-object analysis of each object's sub-objects, e.g. statistical analysis of an object's individual image data pixels. Statistics such as percentiles (so-called "quartiles") are derived by the process, but the return can only be a Scene Variable, not an Obje...
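As a language-neutral illustration of the per-object statistics described above (the original runs as RS code inside a remote-sensing package), here is a sketch in plain Python; the object identifiers and pixel values are hypothetical.

```python
# Per-object quartile sketch: for each object, compute quartiles of the
# values of its sub-objects (e.g. pixel values). Data below is invented.
import statistics

objects = {
    "obj1": [12, 15, 11, 20, 18, 14, 16],
    "obj2": [40, 42, 39, 45, 41],
}

# statistics.quantiles with n=4 returns the three quartile cut points
quartiles = {
    oid: statistics.quantiles(vals, n=4, method="inclusive")
    for oid, vals in objects.items()
}
for oid, (q1, q2, q3) in quartiles.items():
    print(oid, q1, q2, q3)
```

The limitation noted in the abstract (only a scene-level return value) does not apply here; a plain dictionary keyed by object id carries the per-object results.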
DEFF Research Database (Denmark)
Damkilde, Lars
2007-01-01
Limit State analysis has a long history and many prominent researchers have contributed. The theoretical foundation is based on the upper- and lower-bound theorems which give a very comprehensive and elegant formulation on complicated physical problems. In the pre-computer age Limit State analysi...
Gait analysis: clinical facts.
Baker, Richard; Esquenazi, Alberto; Benedetti, Maria G; Desloovere, Kaat
2016-08-01
Gait analysis is a well-established tool for the quantitative assessment of gait disturbances providing functional diagnosis, assessment for treatment planning, and monitoring of disease progress. There is a large volume of literature on the research use of gait analysis, but evidence on its clinical routine use supports a favorable cost-benefit ratio in a limited number of conditions. Initially gait analysis was introduced to clinical practice to improve the management of children with cerebral palsy. However, there is good evidence to extend its use to patients with various upper motor neuron diseases, and to lower limb amputation. Thereby, the methodology for properly conducting and interpreting the exam is of paramount relevance. Appropriateness of gait analysis prescription and reliability of data obtained are required in the clinical environment. This paper provides an overview on guidelines for managing a clinical gait analysis service and on the principal clinical domains of its application: cerebral palsy, stroke, traumatic brain injury and lower limb amputation.
IAC - INTEGRATED ANALYSIS CAPABILITY
Frisch, H. P.
1994-01-01
The objective of the Integrated Analysis Capability (IAC) system is to provide a highly effective, interactive analysis tool for the integrated design of large structures. With the goal of supporting the unique needs of engineering analysis groups concerned with interdisciplinary problems, IAC was developed to interface programs from the fields of structures, thermodynamics, controls, and system dynamics with an executive system and database to yield a highly efficient multi-disciplinary system. Special attention is given to user requirements such as data handling and on-line assistance with operational features, and the ability to add new modules of the user's choice at a future date. IAC contains an executive system, a data base, general utilities, interfaces to various engineering programs, and a framework for building interfaces to other programs. IAC has shown itself to be effective in automatic data transfer among analysis programs. IAC 2.5, designed to be compatible as far as possible with Level 1.5, contains a major upgrade in executive and database management system capabilities, and includes interfaces to enable thermal, structures, optics, and control interaction dynamics analysis. The IAC system architecture is modular in design. 1) The executive module contains an input command processor, an extensive data management system, and driver code to execute the application modules. 2) Technical modules provide standalone computational capability as well as support for various solution paths or coupled analyses. 3) Graphics and model generation interfaces are supplied for building and viewing models. Advanced graphics capabilities are provided within particular analysis modules such as INCA and NASTRAN. 4) Interface modules provide for the required data flow between IAC and other modules. 5) User modules can be arbitrary executable programs or JCL procedures with no pre-defined relationship to IAC. 6) Special purpose modules are included, such as MIMIC (Model
Energy Technology Data Exchange (ETDEWEB)
Porten, D.R.; Crowe, R.D.
1994-12-16
The purpose of this accident safety analysis is to document in detail, analyses whose results were reported in summary form in the K Basins Safety Analysis Report WHC-SD-SNF-SAR-001. The safety analysis addressed the potential for release of radioactive and non-radioactive hazardous material located in the K Basins and their supporting facilities. The safety analysis covers the hazards associated with normal K Basin fuel storage and handling operations, fuel encapsulation, sludge encapsulation, and canister clean-up and disposal. After a review of the Criticality Safety Evaluation of the K Basin activities, the following postulated events were evaluated: Crane failure and casks dropped into loadout pit; Design basis earthquake; Hypothetical loss of basin water accident analysis; Combustion of uranium fuel following dryout; Crane failure and cask dropped onto floor of transfer area; Spent ion exchange shipment for burial; Hydrogen deflagration in ion exchange modules and filters; Release of Chlorine; Power availability and reliability; and Ashfall.
Robust Sparse Analysis Regularization
Vaiter, Samuel; Dossal, Charles; Fadili, Jalal
2011-01-01
This paper studies the properties of L1-analysis regularization for the resolution of linear inverse problems. Most previous works consider sparse synthesis priors where the sparsity is measured as the L1 norm of the coefficients that synthesize the signal in a given dictionary. In contrast, the more general analysis regularization minimizes the L1 norm of the correlations between the signal and the atoms in the dictionary. The corresponding variational problem includes several well-known regularizations such as the discrete total variation and the fused lasso. We first prove that a solution of analysis regularization is a piecewise affine function of the observations. Similarly, it is a piecewise affine function of the regularization parameter. This allows us to compute the degrees of freedom associated to sparse analysis estimators. Another contribution gives a sufficient condition to ensure that a signal is the unique solution of the analysis regularization when there is no noise in the observations. The s...
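For readers unfamiliar with the synthesis/analysis distinction the abstract draws, the two variational problems can be stated informally as follows (notation assumed for illustration: y the observations, Φ the forward operator, D the dictionary, λ the regularization parameter):

```latex
% Synthesis-sparse regularization: penalize the coefficients x that
% synthesize the signal as Dx
\min_{x}\; \tfrac{1}{2}\,\| y - \Phi D x \|_2^2 + \lambda \, \| x \|_1

% Analysis-sparse regularization: penalize the correlations D^* u
% between the signal u and the atoms of the dictionary
\min_{u}\; \tfrac{1}{2}\,\| y - \Phi u \|_2^2 + \lambda \, \| D^* u \|_1
```

With D the discrete difference operator, the analysis form specializes to the total variation mentioned in the abstract.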
Systems engineering and analysis
Blanchard, Benjamin S
2010-01-01
For senior-level undergraduate and first and second year graduate systems engineering and related courses. A total life-cycle approach to systems and their analysis. This practical introduction to systems engineering and analysis provides the concepts, methodologies, models, and tools needed to understand and implement a total life-cycle approach to systems and their analysis. The authors focus first on the process of bringing systems into being--beginning with the identification of a need and extending that need through requirements determination, functional analysis and allocation, design synthesis, evaluation, and validation, operation and support, phase-out, and disposal. Next, the authors discuss the improvement of systems currently in being, showing that by employing the iterative process of analysis, evaluation, feedback, and modification, most systems in existence can be improved in their affordability, effectiveness, and stakeholder satisfaction.
Ramsay, J O
1997-01-01
Scientists today collect samples of curves and other functional observations. This monograph presents many ideas and techniques for such data. Included are expressions in the functional domain of such classics as linear regression, principal components analysis, linear modelling, and canonical correlation analysis, as well as specifically functional techniques such as curve registration and principal differential analysis. Data arising in real applications are used throughout for both motivation and illustration, showing how functional approaches allow us to see new things, especially by exploiting the smoothness of the processes generating the data. The data sets exemplify the wide scope of functional data analysis; they are drawn from growth analysis, meteorology, biomechanics, equine science, economics, and medicine. The book presents novel statistical technology while keeping the mathematical level widely accessible. It is designed to appeal to students, to applied data analysts, and to experienced researc...
Statistical Analysis of Thermal Analysis Margin
Garrison, Matthew B.
2011-01-01
NASA Goddard Space Flight Center requires that each project demonstrate a minimum of 5 C margin between temperature predictions and hot and cold flight operational limits. The bounding temperature predictions include worst-case environment and thermal optical properties. The purpose of this work is to: assess how current missions are performing against their pre-launch bounding temperature predictions and suggest any possible changes to the thermal analysis margin rules
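The 5 C margin rule described above reduces to a simple check. The sketch below is one possible reading of that rule (the component predictions and limits are hypothetical numbers, not mission data):

```python
# Sketch of a 5 degC thermal-margin check (assumed interpretation of the
# rule: worst-case predictions must sit at least MARGIN inside both the
# hot and cold flight operational limits).
MARGIN = 5.0  # required margin, degrees C

def margin_ok(pred_hot, pred_cold, limit_hot, limit_cold, margin=MARGIN):
    """True if predictions keep `margin` to both operational limits."""
    return (limit_hot - pred_hot) >= margin and (pred_cold - limit_cold) >= margin

# Hypothetical component: predictions -10..45 C against limits -20..55 C
print(margin_ok(pred_hot=45.0, pred_cold=-10.0, limit_hot=55.0, limit_cold=-20.0))
```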
Meaning from curriculum analysis
Finegold, Menahem; Mackeracher, Dorothy
This paper reports on the analysis of science curricula carried out across Canada within the framework of the Second International Science Study (SISS). The organization of Canadian education in twelve autonomous educational jurisdictions is briefly described and problems are noted in relation to the analysis of curricula on a national scale. The international design for curriculum analysis is discussed and an alternative design, more suited to the diversity of science education in Canada, is introduced. The analysis of curriculum documents is described and three patterns which emerge from this analysis are identified. These derive from the concepts of commonality, specificity and prescriptiveness. Commonality relates to topics listed in curriculum guideline documents by a number of jurisdictions. Specificity refers to the richness of curriculum documents. Prescriptiveness is a measure of the extent to which jurisdictions do or do not make provision for local options in curriculum design. The Canadian analysis, using the concepts of the common curriculum, specificity and prescriptiveness, is described and research procedures are exemplified. Outcomes of curriculum analysis are presented in graphical form.
DEFF Research Database (Denmark)
Bemman, Brian; Meredith, David
In recent years, a significant body of research has focused on developing algorithms for computing analyses of musical works automatically from encodings of these works' surfaces [3,4,7,10,11]. The quality of the output of such analysis algorithms is typically evaluated by comparing it with a “ground truth” analysis of the same music produced by a human expert (see, in particular, [5]). In this paper, we explore the problem of generating an encoding of the musical surface of a work automatically from a systematic encoding of an analysis. The ability to do this depends on one having an effective (i.e., computable), correct and complete description of some aspect of the structure of the music. Generating the surface structure of a piece from an analysis in this manner serves as a proof of the analysis' correctness, effectiveness and completeness. We present a reductive analysis...
Strategic analysis of the company
Matoušková, Irena
2012-01-01
In my thesis I developed a strategic analysis of the company Pacovské strojírny a.s. I describe the various methods of internal and external strategic analysis in the theoretical part, and I followed these methods in the practical part. In the internal strategic analysis I focused on the identification of internal resources and capabilities, the financial analysis, and the value chain. The external strategic analysis includes PEST analysis, Porter's fiv...
DEFF Research Database (Denmark)
Raket, Lars Lau
We propose a direction in the field of statistics which we will call functional object analysis. This subfield considers the analysis of functional objects defined on continuous domains. In this setting we will focus on model-based statistics, with a particular emphasis on mixed-effect formulations, where the observed functional signal is assumed to consist of both fixed and random functional effects. This thesis takes the initial steps toward the development of likelihood-based methodology for functional objects. We first consider analysis of functional data defined on high...
NASA Enterprise Visual Analysis
Lopez-Tellado, Maria; DiSanto, Brenda; Humeniuk, Robert; Bard, Richard, Jr.; Little, Mia; Edwards, Robert; Ma, Tien-Chi; Hollifield, Kenneith; White, Chuck
2007-01-01
NASA Enterprise Visual Analysis (NEVA) is a computer program undergoing development as a successor to Launch Services Analysis Tool (LSAT), formerly known as Payload Carrier Analysis Tool (PCAT). NEVA facilitates analyses of proposed configurations of payloads and packing fixtures (e.g. pallets) in a space shuttle payload bay for transport to the International Space Station. NEVA reduces the need to use physical models, mockups, and full-scale ground support equipment in performing such analyses. Using NEVA, one can take account of such diverse considerations as those of weight distribution, geometry, collision avoidance, power requirements, thermal loads, and mechanical loads.
DEFF Research Database (Denmark)
Nielsen, Christoffer Rosenkilde; Nielson, Hanne Riis
2006-01-01
operation blinding. In this paper we study the theoretical foundations for one of the successful approaches to validating cryptographic protocols and we extend it to handle the blinding primitive. Our static analysis approach is based on Flow Logic; this gives us a clean separation between the specification of the analysis and its realisation in an automatic tool. We concentrate on the former in the present paper and provide the semantic foundation for our analysis of protocols using blinding, also in the presence of malicious attackers.
International Nuclear Information System (INIS)
This paper briefly describes WASH 1400 and the Lewis report. It attempts to define basic concepts such as risk and risk analysis, common mode failure, and rare event. Several probabilistic models which go beyond the WASH 1400 methodology are introduced; the common characteristic of these models is that they recognize explicitly that risk analysis is time dependent whereas WASH 1400 takes a per demand failure rate approach which obscures the important fact that accidents are time related. Further, the presentation of a realistic risk analysis should recognize that there are various risks which compete with one another for the lives of the individuals at risk. A way of doing this is suggested
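The abstract's contrast between a per-demand failure rate and a time-dependent view can be illustrated with a constant-hazard sketch (the rate and horizons below are invented for illustration, not drawn from WASH 1400):

```python
# Per-demand vs time-dependent framing of failure probability.
# Assumes a constant hazard rate lam (failures per year); then
# P(at least one failure in [0, t]) = 1 - exp(-lam * t).
import math

lam = 1e-4  # assumed hazard rate, failures per year

def prob_fail_by(t, rate=lam):
    """Probability of at least one failure within t years."""
    return 1.0 - math.exp(-rate * t)

# A per-demand number quotes lam alone; the time view makes the exposure
# horizon explicit, so longer horizons visibly accumulate risk.
for years in (1, 10, 40):
    print(years, prob_fail_by(years))
```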
Iremonger, M J
1982-01-01
BASIC Stress Analysis aims to help students become proficient at BASIC programming by actually using it in an important engineering subject. It also enables the student to use computing as a means of learning stress analysis, because writing a program is analogous to teaching: it is necessary to understand the subject matter. The book begins by introducing the BASIC approach and the concept of stress analysis at first- and second-year undergraduate level. Subsequent chapters contain a summary of relevant theory, worked examples containing computer programs, and a set of problems. Topics c
Methods of Multivariate Analysis
Rencher, Alvin C
2012-01-01
Praise for the Second Edition "This book is a systematic, well-written, well-organized text on multivariate analysis packed with intuition and insight . . . There is much practical wisdom in this book that is hard to find elsewhere."-IIE Transactions Filled with new and timely content, Methods of Multivariate Analysis, Third Edition provides examples and exercises based on more than sixty real data sets from a wide variety of scientific fields. It takes a "methods" approach to the subject, placing an emphasis on how students and practitioners can employ multivariate analysis in real-life sit
Finite Discrete Gabor Analysis
DEFF Research Database (Denmark)
Søndergaard, Peter Lempel
2007-01-01
Gabor analysis is a method for analyzing signals through the use of a set of basic building blocks. The building blocks consists of a certain function (the window) that is shifted in time and frequency. The Gabor expansion of a signal contains information on the behavior of the signal in certain...... discrete time/frequency and Gabor analysis. It is intended to be both an educational and a computational tool. The toolbox was developed as part of this Ph.D. project to provide a solid foundation for the field of computational Gabor analysis....
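A toy version of the Gabor building blocks described above (one window, shifted in time and modulated in frequency) can be written in pure Python. The window shape, lattice parameters and test signal below are arbitrary choices for illustration, not taken from the toolbox:

```python
# Toy Gabor coefficients: inner products of a signal with time-shifted,
# frequency-modulated copies of a Gaussian window (illustrative only).
import cmath
import math

N, a, b = 16, 4, 4  # signal length, time step, number of frequency channels

def gauss_window(length):
    """Periodized-use Gaussian window centered at length/2."""
    return [math.exp(-((n - length / 2) ** 2) / (2 * (length / 8) ** 2))
            for n in range(length)]

def gabor_coeffs(signal, window, a, b):
    """c[m][k] = <signal, window shifted by m*a, modulated at frequency k/b>."""
    length = len(signal)
    return [[sum(signal[n] * window[(n - m * a) % length]
                 * cmath.exp(-2j * math.pi * k * n / b)
                 for n in range(length))
             for k in range(b)]          # frequency channels
            for m in range(length // a)]  # time shifts

# A tone with period 4 should concentrate in frequency channel k=1 of b=4
sig = [math.cos(2 * math.pi * n / 4) for n in range(N)]
c = gabor_coeffs(sig, gauss_window(N), a, b)
```

For this test tone, the magnitude in channel k=1 dominates channel k=0 at every time shift, which is the time-frequency localization the abstract refers to.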
Bayesian nonparametric data analysis
Müller, Peter; Jara, Alejandro; Hanson, Tim
2015-01-01
This book reviews nonparametric Bayesian methods and models that have proven useful in the context of data analysis. Rather than providing an encyclopedic review of probability models, the book’s structure follows a data analysis perspective. As such, the chapters are organized by traditional data analysis problems. In selecting specific nonparametric models, simpler and more traditional models are favored over specialized ones. The discussed methods are illustrated with a wealth of examples, including applications ranging from stylized examples to case studies from recent literature. The book also includes an extensive discussion of computational methods and details on their implementation. R code for many examples is included in on-line software pages.
Dunham, Ken
2014-01-01
The rapid growth and development of Android-based devices has resulted in a wealth of sensitive information on mobile devices that offer minimal malware protection. This has created an immediate demand for security professionals who understand how to best approach the subject of Android malware threats and analysis. In Android Malware and Analysis, Ken Dunham, renowned global malware expert and author, teams up with international experts to document the best tools and tactics available for analyzing Android malware. The book covers both methods of malware analysis: dynamic and static. This tact
DEFF Research Database (Denmark)
Hansen, Elo Harald
2004-01-01
This chapter provides an introduction to automated chemical analysis, which essentially can be divided into two groups: batch assays, where the solution is stationary while the container is moved through a number of stations where various unit operations are performed; and continuous-flow procedures, where the system is stationary while the solution moves through a set of conduits in which all required manipulations are performed. Emphasis is placed on flow injection analysis (FIA) and its further developments, that is, sequential injection analysis (SIA) and the Lab-on-Valve (LOV) approach. Since...
Aven, Terje
2012-01-01
Foundations of Risk Analysis presents the issues core to risk analysis - understanding what risk means, expressing risk, building risk models, addressing uncertainty, and applying probability models to real problems. The author provides the readers with the knowledge and basic thinking they require to successfully manage risk and uncertainty to support decision making. This updated edition reflects recent developments on risk and uncertainty concepts, representations and treatment. New material in Foundations of Risk Analysis includes: an up-to-date presentation of how to understand, define and
Ash, Robert B; Lukacs, E
1972-01-01
Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var
1990-01-01
Structural Reliability Consultants' computer program creates graphic plots showing the statistical parameters of glue laminated timbers, or 'glulam.' The company president, Dr. Joseph Murphy, read in NASA Tech Briefs about work related to analysis of Space Shuttle surface tile strength performed for Johnson Space Center by Rockwell International Corporation. Analysis led to a theory of 'consistent tolerance bounds' for statistical distributions, applicable in industrial testing where statistical analysis can influence product development and use. Dr. Murphy then obtained the Tech Support Package that covers the subject in greater detail. The TSP became the basis for Dr. Murphy's computer program PC-DATA, which he is marketing commercially.
Pesticide Instrumental Analysis
International Nuclear Information System (INIS)
This workshop evaluated the impact of pesticides on vegetable matrices, with the aim of determining them by GC/MS analysis. The working materials were a lettuce matrix, chard and a mix of green leaves, together with pesticides.
Principles of Fourier analysis
Howell, Kenneth B
2001-01-01
Fourier analysis is one of the most useful and widely employed sets of tools for the engineer, the scientist, and the applied mathematician. As such, students and practitioners in these disciplines need a practical and mathematically solid introduction to its principles. They need straightforward verifications of its results and formulas, and they need clear indications of the limitations of those results and formulas.Principles of Fourier Analysis furnishes all this and more. It provides a comprehensive overview of the mathematical theory of Fourier analysis, including the development of Fourier series, "classical" Fourier transforms, generalized Fourier transforms and analysis, and the discrete theory. Much of the author's development is strikingly different from typical presentations. His approach to defining the classical Fourier transform results in a much cleaner, more coherent theory that leads naturally to a starting point for the generalized theory. He also introduces a new generalized theory based ...
US Fish and Wildlife Service, Department of the Interior — This document is an analysis and summary of progress toward achieving the interim management objectives for the Russian River during the 1979 season. Additionally,...
Water Quality Analysis Simulation
U.S. Environmental Protection Agency — The Water Quality Analysis Simulation Program, an enhancement of the original WASP. This model helps users interpret and predict water quality responses to natural...
Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars
2016-04-12
A building energy analysis system includes a building component library configured to store a plurality of building components, a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and using selected building components of the plurality of building components stored in the building component library, a building analysis engine configured to operate the building model and generate a baseline energy model of the building under analysis and further configured to apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models, and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.
Canonical Information Analysis
DEFF Research Database (Denmark)
Vestergaard, Jacob Schack; Nielsen, Allan Aasbjerg
2015-01-01
Canonical correlation analysis is an established multivariate statistical method in which correlation between linear combinations of multivariate sets of variables is maximized. In canonical information analysis, introduced here, linear correlation as a measure of association between variables is replaced by the information-theoretical, entropy-based measure mutual information, which is a much more general measure of association. We make canonical information analysis feasible for large sample problems, including for example multispectral images, due to the use of a fast kernel density estimator for entropy estimation. Canonical information analysis is applied successfully to (1) simple simulated data to illustrate the basic idea and evaluate performance, (2) fusion of weather radar and optical geostationary satellite data in a situation with heavy precipitation, and (3) change detection in optical...
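The classical canonical correlation step that this entropy-based method generalizes can be sketched in a few lines of NumPy. This is an illustrative computation on simulated two-view data, not the authors' implementation (which further replaces linear correlation with mutual information estimated via kernel densities):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two "views" sharing a common latent signal z plus independent noise.
z = rng.normal(size=500)
X = np.column_stack([z + 0.5 * rng.normal(size=500), rng.normal(size=500)])
Y = np.column_stack([z + 0.5 * rng.normal(size=500), rng.normal(size=500)])

def canonical_correlations(X, Y):
    # Centre both views.
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    # Orthonormal bases for the column spaces (the whitening step).
    Qx = np.linalg.svd(Xc, full_matrices=False)[0]
    Qy = np.linalg.svd(Yc, full_matrices=False)[0]
    # The singular values of Qx^T Qy are the canonical correlations.
    return np.linalg.svd(Qx.T @ Qy, compute_uv=False)

rho = canonical_correlations(X, Y)[0]  # first (largest) canonical correlation
```

Because both views contain the shared latent signal, the first canonical correlation is high; replacing this linear measure with mutual information is what turns the procedure into canonical information analysis.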
Energy Technology Data Exchange (ETDEWEB)
Arent, D.; Benioff, R.; Mosey, G.; Bird, L.; Brown, J.; Brown, E.; Vimmerstedt, L.; Aabakken, J.; Parks, K.; Lapsa, M.; Davis, S.; Olszewski, M.; Cox, D.; McElhaney, K.; Hadley, S.; Hostick, D.; Nicholls, A.; McDonald, S.; Holloman, B.
2006-10-01
This paper presents the results of energy market analysis sponsored by the Department of Energy's (DOE) Weatherization and International Program (WIP) within the Office of Energy Efficiency and Renewable Energy (EERE). The analysis was conducted by a team of DOE laboratory experts from the National Renewable Energy Laboratory (NREL), Oak Ridge National Laboratory (ORNL), and Pacific Northwest National Laboratory (PNNL), with additional input from Lawrence Berkeley National Laboratory (LBNL). The analysis was structured to identify those markets and niches where government can create the biggest impact by informing management decisions in the private and public sectors. The analysis identifies those markets and niches where opportunities exist for increasing energy efficiency and renewable energy use.
Federal Laboratory Consortium — Provides engineering design of aircraft components, subsystems and installations using Pro/E, Anvil 1000, CADKEY 97, AutoCAD 13. Engineering analysis tools include...
... may experience difficulties. Several factors can affect the sperm count or other semen analysis values, including use of alcohol, tobacco, ...
NOAA's Inundation Analysis Tool
National Oceanic and Atmospheric Administration, Department of Commerce — Coastal storms and other meteorological phenomenon can have a significant impact on how high water levels rise and how often. The inundation analysis program is...
Energy Technology Data Exchange (ETDEWEB)
NONE
2005-06-14
This Standard establishes a practice for the proximate analysis of coal, that is, analysis of the coal for moisture, ash and volatile matter content; fixed carbon is calculated by difference. The standard provides a basis for the comparison of coals.
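In proximate analysis the fixed-carbon figure is conventionally obtained by difference from the three measured quantities. A one-line sketch (the input figures below are hypothetical, for illustration only, not taken from the Standard):

```python
def fixed_carbon(moisture, ash, volatile_matter):
    """Fixed carbon (wt%) by difference, as in proximate analysis of coal."""
    return 100.0 - (moisture + ash + volatile_matter)

# Hypothetical analysis figures in wt%, for illustration only.
fc = fixed_carbon(moisture=5.2, ash=9.8, volatile_matter=32.5)  # -> 52.5
```

Since the four quantities sum to 100 % by construction, any measurement error in moisture, ash or volatile matter propagates directly into the fixed-carbon value.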
Barth, Howard G.; Sun, Shao-Tang
1989-01-01
Presents a review of research focusing on scattering, elution techniques, electrozone sensing, filtration, centrifugation, comparison of techniques, data analysis, and particle size standards. The review covers the period 1986-1988. (MVL)
Crabb, J W; West, K A; Dodson, W S; Hulmes, J D
2001-05-01
Amino acid analysis (AAA) is one of the best methods to quantify peptides and proteins. Two general approaches to quantitative AAA exist, namely, classical postcolumn derivatization following ion-exchange chromatography and precolumn derivatization followed by reversed-phase HPLC (RP-HPLC). Excellent instrumentation and several specific methodologies are available for both approaches, and both have advantages and disadvantages. This unit focuses on picomole-level AAA of peptides and proteins using the most popular precolumn-derivatization method, namely, phenylthiocarbamyl amino acid analysis (PTC-AAA). It is directed primarily toward those interested in establishing the technology with a modest budget. PTC derivatization and analysis conditions are described, and support and alternate protocols describe additional techniques necessary or useful for almost any AAA method, e.g., sample preparation, hydrolysis, instrument calibration, data interpretation, and analysis of difficult or unusual residues such as cysteine, tryptophan, phosphoamino acids, and hydroxyproline. PMID:18429107
Longitudinal categorical data analysis
Sutradhar, Brajendra C
2014-01-01
This is the first book in longitudinal categorical data analysis with parametric correlation models developed based on dynamic relationships among repeated categorical responses. This book is a natural generalization of the longitudinal binary data analysis to the multinomial data setup with more than two categories. Thus, unlike the existing books on cross-sectional categorical data analysis using log linear models, this book uses multinomial probability models both in cross-sectional and longitudinal setups. A theoretical foundation is provided for the analysis of univariate multinomial responses, by developing models systematically for the cases with no covariates as well as categorical covariates, both in cross-sectional and longitudinal setups. In the longitudinal setup, both stationary and non-stationary covariates are considered. These models have also been extended to the bivariate multinomial setup along with suitable covariates. For the inferences, the book uses the generalized quasi-likelihood as w...
DEFF Research Database (Denmark)
Skrypnyuk, Nataliya; Nielson, Flemming; Pilegaard, Henrik
2009-01-01
We present the ongoing work on the pathway analysis of a stochastic calculus. Firstly we present the particular stochastic calculus that we have chosen for our modeling - the Interactive Markov Chains calculus, IMC for short. After that we specify a few restrictions that we have introduced into the syntax of IMC in order to make our analysis feasible. Finally we describe the analysis itself together with several theoretical results that we have proved for it.
DEFF Research Database (Denmark)
Christensen, Ole; Feichtinger, Hans G.; Paukner, Stephan
2015-01-01
In contrast to classical Fourier analysis, time–frequency analysis is concerned with localized Fourier transforms. Gabor analysis is an important branch of time–frequency analysis. Although significantly different, it shares with the wavelet transform methods the ability to describe the smoothness of a given function in a location-dependent way. The main tool is the sliding window Fourier transform or short-time Fourier transform (STFT) in the context of audio signals. It describes the correlation of a signal with the time–frequency shifted copies of a fixed function (or window or atom). Thus, it characterizes a function by its transform over phase space, which is the time–frequency plane (TF-plane) in a musical context or the location–wave-number domain in the context of image processing. Since the transition from the signal domain to the phase space domain introduces an enormous amount of data...
Quantitative investment analysis
DeFusco, Richard
2007-01-01
In the "Second Edition" of "Quantitative Investment Analysis," financial experts Richard DeFusco, Dennis McLeavey, Jerald Pinto, and David Runkle outline the tools and techniques needed to understand and apply quantitative methods to today's investment process.
Henson, C Ward; Kechris, Alexander S; Odell, Edward; Finet, Catherine; Michaux, Christian; Cassels, J W S
2003-01-01
This volume comprises articles from four outstanding researchers who work at the cusp of analysis and logic. The emphasis is on active research topics; many results are presented that have not been published before and open problems are formulated.
DEFF Research Database (Denmark)
Fischer, Paul; Hilbert, Astrid
2012-01-01
We introduce a platform which supplies an easy-to-handle, interactive, extendable, and fast analysis tool for time series analysis. In contrast to other software suites like Maple, Matlab, or R, which use a command-line-like interface and where the user has to memorize or look up the appropriate commands, our application is select-and-click-driven. It allows the user to derive many different sequences of deviations for a given time series and to visualize them in different ways in order to judge their expressive power and to reuse the procedure found. For many transformations or model fits, the user may choose between manual and automated parameter selection. The user can define new transformations and add them to the system. The application contains efficient implementations of advanced and recent techniques for time series analysis including techniques related to extreme value analysis and filtering...
13. seminar 'Activation analysis'
International Nuclear Information System (INIS)
Collection of the abstracts of contributions to the seminar covering broad ranges of application of activation analysis and improvements of systems and process steps. Most of them have been prepared separately for the energy data bases. (RB)
Unsupervised Linear Discriminant Analysis
Institute of Scientific and Technical Information of China (English)
Anonymous
2006-01-01
An algorithm for unsupervised linear discriminant analysis was presented. Optimal unsupervised discriminant vectors are obtained through maximizing covariance of all samples and minimizing covariance of local k-nearest neighbor samples. The experimental results show our algorithm is effective.
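The criterion described in the abstract, maximizing the covariance of all samples while minimizing the covariance of local k-nearest-neighbour samples, can be sketched as a generalized eigenvector problem. The simulated data, the neighbourhood size, and the Sl⁻¹St eigen-formulation below are illustrative assumptions, not the authors' algorithm as published:

```python
import numpy as np

rng = np.random.default_rng(1)
# Two elongated clusters; no class labels are used anywhere below.
X = np.vstack([rng.normal([0, 0], [1.0, 0.2], size=(50, 2)),
               rng.normal([4, 0], [1.0, 0.2], size=(50, 2))])

def unsupervised_discriminant_direction(X, k=5):
    n = len(X)
    Xc = X - X.mean(axis=0)
    St = Xc.T @ Xc / n                        # covariance of all samples
    # Local scatter: covariance of differences to the k nearest neighbours.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    Sl = np.zeros_like(St)
    for i in range(n):
        nbrs = np.argsort(d2[i])[1:k + 1]     # skip the point itself
        diffs = X[nbrs] - X[i]
        Sl += diffs.T @ diffs
    Sl /= n * k
    # Maximize w^T St w relative to w^T Sl w: top eigenvector of Sl^-1 St.
    evals, evecs = np.linalg.eig(np.linalg.solve(Sl, St))
    return np.real(evecs[:, np.argmax(np.real(evals))])

w = unsupervised_discriminant_direction(X)
```

On this toy data the recovered direction aligns with the axis separating the two clusters, even though no labels were supplied, which is the intuition behind the unsupervised variant.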
Multidimensional nonlinear descriptive analysis
Nishisato, Shizuhiko
2006-01-01
Quantification of categorical, or non-numerical, data is a problem that scientists face across a wide range of disciplines. Exploring data analysis in various areas of research, such as the social sciences and biology, Multidimensional Nonlinear Descriptive Analysis presents methods for analyzing categorical data that are not necessarily sampled randomly from a normal population and often involve nonlinear relations. This reference not only provides an overview of multidimensional nonlinear descriptive analysis (MUNDA) of discrete data, it also offers new results in a variety of fields. The first part of the book covers conceptual and technical preliminaries needed to understand the data analysis in subsequent chapters. The next two parts contain applications of MUNDA to diverse data types, with each chapter devoted to one type of categorical data, a brief historical comment, and basic skills peculiar to the data types. The final part examines several problems and then concludes with suggestions for futu...
Perspectives in shape analysis
Bruckstein, Alfred; Maragos, Petros; Wuhrer, Stefanie
2016-01-01
This book presents recent advances in the field of shape analysis. Written by experts in the fields of continuous-scale shape analysis, discrete shape analysis and sparsity, and numerical computing who hail from different communities, it provides a unique view of the topic from a broad range of perspectives. Over the last decade, it has become increasingly affordable to digitize shape information at high resolution. Yet analyzing and processing this data remains challenging because of the large amount of data involved, and because modern applications such as human-computer interaction require real-time processing. Meeting these challenges requires interdisciplinary approaches that combine concepts from a variety of research areas, including numerical computing, differential geometry, deformable shape modeling, sparse data representation, and machine learning. On the algorithmic side, many shape analysis tasks are modeled using partial differential equations, which can be solved using tools from the field of n...
Systems analysis-independent analysis and verification
Energy Technology Data Exchange (ETDEWEB)
Badin, J.S.; DiPietro, J.P. [Energetics, Inc., Columbia, MD (United States)
1995-09-01
The DOE Hydrogen Program is supporting research, development, and demonstration activities to overcome the barriers to the integration of hydrogen into the Nation's energy infrastructure. Much work is required to gain acceptance of hydrogen energy system concepts and to develop them for implementation. A systems analysis database has been created that includes a formal documentation of technology characterization profiles and cost and performance information. Through a systematic and quantitative approach, system developers can understand and address important issues and thereby assure effective and timely commercial implementation. This project builds upon and expands the previously developed and tested pathway model and provides the basis for a consistent and objective analysis of all hydrogen energy concepts considered by the DOE Hydrogen Program Manager. This project can greatly accelerate the development of a system by minimizing the risk of costly design evolutions, and by stimulating discussions, feedback, and coordination of key players and allows them to assess the analysis, evaluate the trade-offs, and to address any emerging problem areas. Specific analytical studies will result in the validation of the competitive feasibility of the proposed system and identify system development needs. Systems that are investigated include hydrogen bromine electrolysis, municipal solid waste gasification, electro-farming (biomass gasifier and PEM fuel cell), wind/hydrogen hybrid system for remote sites, home electrolysis and alternate infrastructure options, renewable-based electrolysis to fuel PEM fuel cell vehicle fleet, and geothermal energy used to produce hydrogen. These systems are compared to conventional and benchmark technologies. Interim results and findings are presented. Independent analyses emphasize quality, integrity, objectivity, a long-term perspective, corporate memory, and the merging of technical, economic, operational, and programmatic expertise.
Hartmanová, Dominika
2013-01-01
The Bachelor Thesis "Analysis of the marketing mix" describes the marketing mix of the company Lego Tradings, s. r. o. The theoretical part includes specification of basic concepts, such as marketing, the marketing mix, tools of the marketing mix, product, price, place and promotion. The second part is devoted to custom solutions. The introduction of the Lego company comes first. There is also an analysis of the tools of the marketing mix. In this part the results of marketing research are described, namely a quest...
Theoretical numerical analysis
Wendroff, Burton
2014-01-01
Theoretical Numerical Analysis focuses on the presentation of numerical analysis as a legitimate branch of mathematics. The publication first elaborates on interpolation and quadrature and approximation. Discussions focus on the degree of approximation by polynomials, Chebyshev approximation, orthogonal polynomials and Gaussian quadrature, approximation by interpolation, nonanalytic interpolation and associated quadrature, and Hermite interpolation. The text then ponders on ordinary differential equations and solutions of equations. Topics include iterative methods for nonlinear systems, matri
Analysis of educational blogs.
Christenová, Jindřiška
2014-01-01
Abstract The bachelor's thesis "Analysis of educational blogs" is focused on theme content analysis of educational blogs written by teachers from the Czech Republic in the Czech language. A further aim of the thesis was the finding and description of the structure of themes and their attributes. In the research the method of theme content analysis with elements of the inductive analysis method was used. The main research sample was fifteen posts from ten educational blogs written by teachers and educationali...
Griffel, DH
2002-01-01
A stimulating introductory text, this volume examines many important applications of functional analysis to mechanics, fluid mechanics, diffusive growth, and approximation. Detailed enough to impart a thorough understanding, the text is also sufficiently straightforward for those unfamiliar with abstract analysis. Its four-part treatment begins with distribution theory and discussions of Green's functions. Essentially independent of the preceding material, the second and third parts deal with Banach spaces, Hilbert space, spectral theory, and variational techniques. The final part outlines the
Economic Analysis of Constitutions
Roger B. Myerson
2000-01-01
This paper is a preliminary draft of an article to appear in Chicago Law Review (2000), as part of a symposium reviewing two new books on economic analysis of constitutions: Dennis Mueller's Constitutional Democracy and Robert Cooter's Strategic Constitution. Some of the basic questions of constitutional analysis are introduced, and the importance of work in this area is shown as one of the major new developments in social theory. The methods of economic theory are then shown to be particular...
MEAD retrospective analysis report
Hasager, Charlotte Bay; CARSTENSEN J.; Frohn, L. M.; Gustafson, B.; Brandt, J.; Conley, D.; Geernaert, G.; Henriksen, P.; C. A. Skjøth; Johnsson, M.
2003-01-01
The retrospective analysis investigates links between atmospheric nitrogen deposition and algal bloom development in the Kattegat Sea from April to September 1989-1999. The analysis is based on atmospheric deposition model results from the ACDEP model, hydrodynamic deep-water flux results, phytoplankton abundance observations from Danish and Swedish marine monitoring stations and optical satellite data. Approximately 70 % of the atmospheric deposition consists of wet deposition of highly episod...
Bayesian Group Factor Analysis
Virtanen, Seppo; Klami, Arto; Khan, Suleiman A; Kaski, Samuel
2011-01-01
We introduce a factor analysis model that summarizes the dependencies between observed variable groups, instead of dependencies between individual variables as standard factor analysis does. A group may correspond to one view of the same set of objects, one of many data sets tied by co-occurrence, or a set of alternative variables collected from statistics tables to measure one property of interest. We show that by assuming group-wise sparse factors, active in a subset of the sets, the variat...
Klami, Arto; Virtanen, Seppo; Leppäaho, Eemeli; Kaski, Samuel
2014-01-01
Factor analysis provides linear factors that describe relationships between individual variables of a data set. We extend this classical formulation into linear factors that describe relationships between groups of variables, where each group represents either a set of related variables or a data set. The model also naturally extends canonical correlation analysis to more than two sets, in a way that is more flexible than previous extensions. Our solution is formulated as variational inferenc...
Institute of Scientific and Technical Information of China (English)
XIAO Xin
2015-01-01
In this article an attempt is made to enhance awareness of hedging use in discourse analysis and academic writing by analyzing the hedges employed in two comparable texts. The discourse analysis is conducted in terms of “content-oriented” hedges and “reader-oriented” hedges. The article suggests that hedging can dampen utterances and statements, weaken the force of what one says and show politeness to the listeners or readers, and that its use varies across the discourse styles of various genres.
Vávrová, Eva
2014-01-01
This thesis deals with the analysis of EEG during various sleep stages, which is done by calculating selected parameters from the time and frequency domains. These parameters are calculated from individual segments of EEG signals that correspond to the various sleep stages. Based on the analysis, it is decided which EEG parameters are appropriate for automatic detection of the phases and which method is more suitable for evaluating the data in a hypnogram. The programme MATLAB was used for the ...
Energy Technology Data Exchange (ETDEWEB)
Boggs, Paul T.; Althsuler, Alan (Exagrid Engineering); Larzelere, Alex R. (Exagrid Engineering); Walsh, Edward J.; Clay, Robert L.; Hardwick, Michael F. (Sandia National Laboratories, Livermore, CA)
2005-08-01
The Design-through-Analysis Realization Team (DART) is chartered with reducing the time Sandia analysts require to complete the engineering analysis process. The DART system analysis team studied the engineering analysis processes employed by analysts in Centers 9100 and 8700 at Sandia to identify opportunities for reducing overall design-through-analysis process time. The team created and implemented a rigorous analysis methodology based on a generic process flow model parameterized by information obtained from analysts. They also collected data from analysis department managers to quantify the problem type and complexity distribution throughout Sandia's analyst community. They then used this information to develop a community model, which enables a simple characterization of processes that span the analyst community. The results indicate that equal opportunity for reducing analysis process time is available both by reducing the "once-through" time required to complete a process step and by reducing the probability of backward iteration. In addition, reducing the rework fraction (i.e., improving the engineering efficiency of subsequent iterations) offers approximately 40% to 80% of the benefit of reducing the "once-through" time or iteration probability, depending upon the process step being considered. Further, the results indicate that geometry manipulation and meshing is the largest portion of an analyst's effort, especially for structural problems, and offers significant opportunity for overall time reduction. Iteration loops initiated late in the process are more costly than others because they increase "inner loop" iterations. Identifying and correcting problems as early as possible in the process offers significant opportunity for time savings.
Oden, J Tinsley
2010-01-01
The textbook is designed to drive a crash course for beginning graduate students majoring in something besides mathematics, introducing mathematical foundations that lead to classical results in functional analysis. More specifically, Oden and Demkowicz want to prepare students to learn the variational theory of partial differential equations, distributions, and Sobolev spaces and numerical analysis with an emphasis on finite element methods. The 1996 first edition has been used in a rather intensive two-semester course. -Book News, June 2010
Submarine hydraulic control analysis
Bower, Michael J.
1980-01-01
Approved for public release; distribution unlimited A mathematical model was developed to include line effects in the submarine hydraulic system dynamic performance analysis. The project was undertaken in an effort to demonstrate the necessity of coupling the entire hydraulic power network for an accurate analysis of any of the subsystems rather than the current practice of treating a component loop as an isolated system. It was intended that the line model could be co...
Mateev, Nikolay; Menon, Vijay; Pingali, Keshav
2000-01-01
Restructuring compilers use dependence analysis to prove that the meaning of a program is not changed by a transformation. A well-known limitation of dependence analysis is that it examines only the memory locations read and written by a statement, and does not assume any particular interpretation for the operations in that statement. Exploiting the semantics of these operations enables a wider set of transformations to be used, and is critical for optimizing important codes such as LU factor...
Oktavianto, Digit
2013-01-01
This book is a step-by-step, practical tutorial for analyzing and detecting malware and performing digital investigations. This book features clear and concise guidance in an easily accessible format.Cuckoo Malware Analysis is great for anyone who wants to analyze malware through programming, networking, disassembling, forensics, and virtualization. Whether you are new to malware analysis or have some experience, this book will help you get started with Cuckoo Sandbox so you can start analysing malware effectively and efficiently.
Bass, Roger
2010-01-01
Zen's challenge for behavior analysis is to explain a repertoire that renders analysis itself meaningless—a result following not from scientific or philosophical arguments but rather from a unique verbal history generated by Zen's methods. Untying Zen's verbal knots suggests how meditation's and koans' effects on verbal behavior contribute to Enlightenment and Samādhi. The concept of stimulus singularity is introduced to account for why, within Zen's frame of reference, its methods can be stu...
Temperature reconstruction analysis
Scafetta, N; Grigolini, P; Roberts, J; Scafetta, Nicola; Imholt, Tim; Grigolini, Paolo; Roberts, Jim
2002-01-01
This paper presents a wavelet multiresolution analysis of a time series dataset to study the correlation between the real temperature data and three temperature model reconstructions at different scales. We show that the Mann et al. model reconstructs the temperature better at all temporal resolutions. We show and discuss the wavelet multiresolution analysis of Mann's temperature reconstruction for the period from 1400 to 2000 A.D.
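The multiresolution idea, splitting a series into a coarse approximation and a detail component at successive scales so that correlations can be compared scale by scale, can be illustrated with a single Haar analysis step. This is a toy sketch on synthetic data, not the wavelet or the temperature series used in the paper:

```python
import numpy as np

def haar_step(x):
    """One Haar analysis step: local averages (coarse approximation)
    and local differences (detail), each at half the resolution."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    return approx, detail

# Toy "temperature" series: a slow trend plus a fast oscillation.
t = np.arange(64)
signal = 0.05 * t + np.sin(2 * np.pi * t / 4)

a, d = haar_step(signal)
# The coarse level mostly follows the trend; the detail captures
# the fast oscillation. Iterating haar_step on `a` yields coarser scales.
```

Because the Haar step is an orthonormal transform, the signal's energy is exactly split between the approximation and detail coefficients, which is what makes scale-by-scale comparison of two series meaningful.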
International Nuclear Information System (INIS)
During normal data taking CMS expects to support potentially as many as 2000 analysis users. Since the beginning of 2008 there have been more than 800 individuals who submitted a remote analysis job to the CMS computing infrastructure. The bulk of these users will be supported at the over 40 CMS Tier-2 centres. Supporting a globally distributed community of users on a globally distributed set of computing clusters is a task that requires reconsidering the normal methods of user support for Analysis Operations. In 2008 CMS formed an Analysis Support Task Force in preparation for large-scale physics analysis activities. The charge of the task force was to evaluate the available support tools, the user support techniques, and the direct feedback of users with the goal of improving the success rate and user experience when utilizing the distributed computing environment. The task force determined the tools needed to assess and reduce the number of non-zero exit code applications submitted through the grid interfaces and worked with the CMS experiment dashboard developers to obtain the necessary information to quickly and proactively identify issues with user jobs and data sets hosted at various sites. Results of the analysis group surveys were compiled. Reference platforms for testing and debugging problems were established in various geographic regions. The task force also assessed the resources needed to make the transition to a permanent Analysis Operations task. In this presentation the results of the task force will be discussed as well as the CMS Analysis Operations plans for the start of data taking.
Ivan Stosic; Drasko Nikolic; Aleksandar Zdravkovic
2012-01-01
The main purpose of this paper is to examine the impact of the current Serbian macro-environment on businesses through the implementation of PEST analysis as a framework for assessing the general or macro environment in which companies are operating. The authors argue that the elements of the presented PEST analysis indicate that the current macro-environment is characterized by the dominance of threats and weaknesses, with few opportunities and strengths. Consequently, there is a strong need for faste...
Provenance as Dependency Analysis
Cheney, James; Ahmed, Amal; Acar, Umut,
2007-01-01
Provenance is information recording the source, derivation, or history of some information. Provenance tracking has been studied in a variety of settings; however, although many design points have been explored, the mathematical or semantic foundations of data provenance have received comparatively little attention. In this paper, we argue that dependency analysis techniques familiar from program analysis and program slicing provide a formal foundation for forms of provenance that are intende...
Diversity and abundance of bacteria in an underground oil-storage cavity
Directory of Open Access Journals (Sweden)
Kodama Yumiko
2002-08-01
Abstract Background Microorganisms inhabiting subterranean oil fields have recently attracted much attention. Since intact groundwater can easily be obtained from the bottom of underground oil-storage cavities without contamination by surface water, studies on such oil-storage cavities are expected to provide valuable information to understand the microbial ecology of subterranean oil fields. Results DNA was extracted from the groundwater obtained from an oil-storage cavity situated at Kuji in Iwate, Japan, and 16S rRNA gene (16S rDNA) fragments were amplified by PCR using combinations of universal and Bacteria-specific primers. The sequence analysis of 154 clones produced 31 different bacterial sequence types (a unique clone or group of clones with sequence similarity of > 98%). Major sequence types were related to Desulfotomaculum, Acetobacterium, Desulfovibrio, Desulfobacula, Zoogloea and Thiomicrospira denitrificans. The abundance in the groundwater of the bacterial populations represented by these major sequence types was assessed by quantitative competitive PCR using specific primers, showing that the five rDNA types other than that related to Desulfobacula each accounted for a significant proportion (more than 1%) of the total bacterial rDNA. Conclusions Bacteria inhabiting the oil-storage cavity were unexpectedly diverse. The phylogenetic affiliation of the cloned 16S rDNA sequences suggests that bacteria exhibiting different types of energy metabolism coexist in the cavity.
Activation analysis in forensic studies
International Nuclear Information System (INIS)
Applications of neutron activation analysis in forensics are grouped into three categories: firearms-discharge applications, elemental analysis of other non-biological evidence materials (paint, etc.), and elemental analysis of biological evidence materials (multielemental analysis of hair; analysis of hair for As and Hg). 18 refs
Software reliability analysis in probabilistic risk analysis
International Nuclear Information System (INIS)
Probabilistic Risk Analysis (PRA) is a tool which can reveal shortcomings of an NPP design in general. PRA analysts have not had sufficient guiding principles for modelling malfunctions of particular digital components. Digital I and C systems are mostly analysed in a simplified manner, and the software reliability estimates are engineering judgments often lacking proper justification. The OECD/NEA Working Group RISK's task DIGREL develops a taxonomy of failure modes of digital I and C systems. The EU FP7 project HARMONICS develops a software reliability estimation method based on an analytic approach and a Bayesian belief network. (author)
International Nuclear Information System (INIS)
A review of research and development on NAA as well as examples of applications of this method are presented, taken from work carried out over the last 21 years at the Radioanalytical Laboratory of the Department of Chemistry in the Greek Nuclear Research Center ''Demokritos''. Improved and faster radiochemical NAA methods have been developed for the determination of Au, Ni, Cl, As, Cu, U, Cr, Eu, Hg and Mo in several materials, and for the simultaneous determination of Br and I; Mg, Sr and Ni; As and Cu; As, Sb and Hg; Mn, Sr and Ba; Cd and Zn; Se and As; Mo and Cr in biological materials. Instrumental NAA methods have also been developed for the determination of Ag, Cl and Na in lake waters, Al, Ca, Mg and V in wines, 7 trace elements in biological materials, 17 trace elements in sediments and 20 minor and trace elements in ceramics. A comprehensive computer program for routine activation analysis using Ge(Li) detectors has been worked out. A rather extended charged-particle activation analysis program has been carried out over the last 10 years, including particle induced X-ray emission (PIXE) analysis, particle induced prompt gamma-ray emission analysis (PIGE), other nuclear reactions and proton activation analysis. A special neutron activation method, the delayed fission neutron counting method, is used for the analysis of fissionable elements, such as U, Th and Pu, in samples of the whole nuclear fuel cycle including geological, enriched and nuclear safeguards samples
Dewhurst, A.; Legger, F.
2015-12-01
The ATLAS experiment accumulated more than 140 PB of data during the first run of the Large Hadron Collider (LHC) at CERN. The analysis of such an amount of data is a challenging task for the distributed physics community. The Distributed Analysis (DA) system of the ATLAS experiment is an established and stable component of the ATLAS distributed computing operations. About half a million user jobs are running daily on DA resources, submitted by more than 1500 ATLAS physicists. The reliability of the DA system during the first run of the LHC and the following shutdown period has been high thanks to the continuous automatic validation of the distributed analysis sites and the user support provided by a dedicated team of expert shifters. During the LHC shutdown, the ATLAS computing model has undergone several changes to improve the analysis workflows, including the re-design of the production system, a new analysis data format and event model, and the development of common reduction and analysis frameworks. We report on the impact such changes have on the DA infrastructure, describe the new DA components, and include recent performance measurements.
Gonzalez Caballero, Isidro
2012-01-01
The analysis of the complex LHC data usually follows a standard path that aims at minimizing not only the amount of data but also the number of observables used. After a number of steps of slimming and skimming the data, the remaining few terabytes of ROOT files hold a selection of the events and a flat structure for the variables needed that can be more easily inspected and traversed in the final stages of the analysis. PROOF arises at this point as an efficient mechanism to distribute the analysis load by taking advantage of all the cores in modern CPUs through PROOF Lite, or by using the PROOF Cluster or PROOF on Demand tools to build dynamic PROOF clusters on computing facilities with spare CPUs. However, using PROOF at the level required for a serious analysis introduces some difficulties that may scare new adopters. We have developed the PROOF Analysis Framework (PAF) to facilitate the development of new analyses by uniformly exposing the PROOF-related configurations across technologies and by taking care of ...
Failure Analysis for Improved Reliability
Sood, Bhanu
2016-01-01
Outline: Section 1 - What is reliability and root cause? Section 2 - Overview of failure mechanisms. Section 3 - Failure analysis techniques (1. Non destructive analysis techniques, 2. Destructive Analysis, 3. Materials Characterization). Section 4 - Summary and Closure
Neutron multiplicity analysis tool
Energy Technology Data Exchange (ETDEWEB)
Stewart, Scott L [Los Alamos National Laboratory
2010-01-01
I describe the capabilities of the EXCOM (EXcel based COincidence and Multiplicity) calculation tool which is used to analyze experimental data or simulated neutron multiplicity data. The input to the program is the count-rate data (including the multiplicity distribution) for a measurement, the isotopic composition of the sample and relevant dates. The program carries out deadtime correction and background subtraction and then performs a number of analyses. These are: passive calibration curve, known alpha and multiplicity analysis. The latter is done with both the point model and with the weighted point model. In the current application EXCOM carries out the rapid analysis of Monte Carlo calculated quantities and allows the user to determine the magnitude of sample perturbations that lead to systematic errors. Neutron multiplicity counting is an assay method used in the analysis of plutonium for safeguards applications. It is widely used in nuclear material accountancy by international (IAEA) and national inspectors. The method uses the measurement of the correlations in a pulse train to extract information on the spontaneous fission rate in the presence of neutrons from (α,n) reactions and induced fission. The measurement is relatively simple to perform and gives results very quickly (≤ 1 hour). By contrast, destructive analysis techniques are extremely costly and time consuming (several days). By improving the achievable accuracy of neutron multiplicity counting, a nondestructive analysis technique, it could be possible to reduce the use of destructive analysis measurements required in safeguards applications. The accuracy of a neutron multiplicity measurement can be affected by a number of variables such as density, isotopic composition, chemical composition and moisture in the material. In order to determine the magnitude of these effects on the measured plutonium mass a calculational tool, EXCOM, has been produced using VBA within Excel. This
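The abstract does not give EXCOM's actual correction formulas; purely as an illustration of the deadtime-correction and background-subtraction step, a generic non-paralyzable dead-time model might look like this (tau and all rates are made-up values):

```python
def deadtime_correct(measured_rate: float, tau: float) -> float:
    """Non-paralyzable dead-time correction: true = m / (1 - m * tau).
    tau is the dead time in seconds; rates are in counts/s."""
    return measured_rate / (1.0 - measured_rate * tau)

def net_rate(gross_rate: float, background_rate: float, tau: float) -> float:
    """Dead-time-correct the gross rate, then subtract the background."""
    return deadtime_correct(gross_rate, tau) - background_rate

# Example: 1e4 counts/s with 1 microsecond dead time loses about 1% of counts
corrected = deadtime_correct(1.0e4, 1.0e-6)
print(round(corrected, 1))  # 10101.0
```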
INTERNAL ENVIRONMENT ANALYSIS TECHNIQUES
Directory of Open Access Journals (Sweden)
Caescu Stefan Claudiu
2011-12-01
Full Text Available Theme The situation analysis, as a separate component of strategic planning, involves collecting and analysing relevant types of information on the components of the marketing environment and their evolution on the one hand, and on the organization’s resources and capabilities on the other. Objectives of the Research The main purpose of the study of the analysis techniques of the internal environment is to provide insight into those aspects that are of strategic importance to the organization. Literature Review The marketing environment consists of two distinct components: the internal environment, made up of specific variables within the organization, and the external environment, made up of variables external to the organization. Although analysing the external environment is essential for corporate success, it is not enough unless it is backed by a detailed analysis of the internal environment of the organization. The internal environment includes all elements that are endogenous to the organization, which the organization influences to a great extent and fully controls. The study of the internal environment must answer all resource-related questions, solve all resource management issues, and represents the first step in drawing up the marketing strategy. Research Methodology The present paper accomplished a documentary study of the main techniques used for the analysis of the internal environment. Results The literature emphasizes that differences in performance from one organization to another depend primarily not on differences between fields of activity, but especially on differences between resources and capabilities and the ways these are capitalized on. The main methods of analysing the internal environment addressed in this paper are: the analysis of organizational resources, performance analysis, value chain analysis and functional analysis. Implications Basically such
Harmonic and geometric analysis
Citti, Giovanna; Pérez, Carlos; Sarti, Alessandro; Zhong, Xiao
2015-01-01
This book presents an expanded version of four series of lectures delivered by the authors at the CRM. Harmonic analysis, understood in a broad sense, has a very wide interplay with partial differential equations and in particular with the theory of quasiconformal mappings and its applications. Some areas in which real analysis has been extremely influential are PDE's and geometric analysis. Their foundations and subsequent developments made extensive use of the Calderón–Zygmund theory, especially the Lp inequalities for Calderón–Zygmund operators (Beurling transform and Riesz transform, among others) and the theory of Muckenhoupt weights. The first chapter is an application of harmonic analysis and the Heisenberg group to understanding human vision, while the second and third chapters cover some of the main topics on linear and multilinear harmonic analysis. The last serves as a comprehensive introduction to a deep result from De Giorgi, Moser and Nash on the regularity of elliptic partial differen...
International Nuclear Information System (INIS)
The high sensitivity of high-flux (reactor) thermal-neutron activation analysis (NAA) for the detection and quantitative measurement of a large number of elements has led, in recent years, to a considerable degree of application of the method in the area of scientific crime investigation (criminalistics). Thus, in a Forensic Activation Analysis Bibliography recently compiled by the author, some 135 publications in this field are listed - and more are appearing quite rapidly. The nondestructive character of the purely-instrumental form of the method is an added advantage in forensic work, since evidence samples involved in actual criminal cases are not destroyed during analysis, but are preserved intact for possible presentation in court. Quite aside from, or in addition to, use in court, NAA results can be very helpful in the investigative stage of particular criminal cases. The ultra sensitivity of the method often enables one to analyze evidence specimens that are too tiny for meaningful analysis by more conventional elemental analysis methods. Also, this high sensitivity often enables one to characterize, or individualize, evidence specimens as to the possibility of common origin - via the principle of multi-element trace-constituent characterization
Directory of Open Access Journals (Sweden)
Suphattharachai Chomphan
2012-01-01
Full Text Available Problem statement: The ukulele is a trendy instrument in the present day. It is a member of the guitar family of instruments and employs four nylon or gut strings or four courses of strings. However, a statistical analysis of the pitch of this instrument has not been conducted. The analysis of the pitch, or fundamental frequency, of its main chords should be performed in an appropriate way, as it enables effective sound synthesis, an important issue for the future. Approach: An efficient technique for the analysis of the fundamental frequency (F0) of human speech was applied to the analysis of the main chords of the ukulele. The autocorrelation-based technique was applied to the signal waveform to extract the optimal period, or pitch, for the corresponding analyzed frame in the time domain. The corresponding fundamental frequency was then calculated in the frequency domain. Results: The 21 main chords were chosen for the study. The number of distinct fundamental frequency values per chord varied from one to three, ranging from 65.42 Hz to 329.93 Hz. Conclusion: By using the analysis technique of the fundamental frequency of human speech, the output frequencies of all main chords can be extracted. It can be empirically seen that each chord has its own unique values.
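The autocorrelation-based F0 extraction described above can be sketched as follows; the sampling rate, search band and synthetic test tone are assumed values for illustration, not parameters from the study:

```python
import numpy as np

def f0_autocorrelation(signal: np.ndarray, fs: float,
                       fmin: float = 60.0, fmax: float = 400.0) -> float:
    """Estimate the fundamental frequency of one analysis frame by locating
    the autocorrelation peak in the lag range corresponding to [fmin, fmax]."""
    sig = signal - signal.mean()
    # one-sided autocorrelation, lag 0 .. N-1
    ac = np.correlate(sig, sig, mode="full")[len(sig) - 1:]
    lo, hi = int(fs / fmax), int(fs / fmin)
    lag = lo + int(np.argmax(ac[lo:hi]))   # optimal period in samples
    return fs / lag

# Synthetic test tone near C3 (the real input would be a recorded chord frame)
fs = 8000.0
t = np.arange(0, 0.5, 1 / fs)
tone = np.sin(2 * np.pi * 130.8 * t)
print(f0_autocorrelation(tone, fs))
```

The estimate is quantized by the integer lag; production pitch trackers typically add parabolic interpolation around the peak.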
Applied multivariate statistical analysis
Härdle, Wolfgang Karl
2015-01-01
Focusing on high-dimensional applications, this 4th edition presents the tools and concepts used in multivariate data analysis in a style that is also accessible for non-mathematicians and practitioners. It surveys the basic principles and emphasizes both exploratory and inferential statistics; a new chapter on Variable Selection (Lasso, SCAD and Elastic Net) has also been added. All chapters include practical exercises that highlight applications in different multivariate data analysis fields: in quantitative financial studies, where the joint dynamics of assets are observed; in medicine, where recorded observations of subjects in different locations form the basis for reliable diagnoses and medication; and in quantitative marketing, where consumers’ preferences are collected in order to construct models of consumer behavior. All of these examples involve high to ultra-high dimensions and represent a number of major fields in big data analysis. The fourth edition of this book on Applied Multivariate ...
Canuto, Claudio
2015-01-01
The purpose of the volume is to provide a support textbook for a second lecture course on Mathematical Analysis. The contents are organised to suit, in particular, students of Engineering, Computer Science and Physics, all areas in which mathematical tools play a crucial role. The basic notions and methods concerning integral and differential calculus for multivariable functions, series of functions and ordinary differential equations are presented in a manner that elicits critical reading and prompts a hands-on approach to concrete applications. The pedagogical layout echoes the one used in the companion text Mathematical Analysis I. The book’s structure has a specifically-designed modular nature, which allows for great flexibility in the preparation of a lecture course on Mathematical Analysis. The style privileges clarity in the exposition and a linear progression through the theory. The material is organised on two levels. The first, reflected in this book, allows students to grasp the essential ideas, ...
Frank, IE
1994-01-01
Analyzing observed or measured data is an important step in applied sciences. The recent increase in computer capacity has resulted in a revolution both in data collection and data analysis. An increasing number of scientists, researchers and students are venturing into statistical data analysis; hence the need for more guidance in this field, which was previously dominated mainly by statisticians. This handbook fills the gap in the range of textbooks on data analysis. Written in a dictionary format, it will serve as a comprehensive reference book in a rapidly growing field. However, this book is more structured than an ordinary dictionary, where each entry is a separate, self-contained entity. The authors provide not only definitions and short descriptions, but also offer an overview of the different topics. Therefore, the handbook can also be used as a companion to textbooks for undergraduate or graduate courses. 1700 entries are given in alphabetical order grouped into 20 topics and each topic is organized...
Trajectory Based Traffic Analysis
DEFF Research Database (Denmark)
Krogh, Benjamin Bjerre; Andersen, Ove; Lewis-Kelham, Edwin;
2013-01-01
We present the INTRA system for interactive path-based traffic analysis. The analyses are developed in collaboration with traffic researchers and provide novel insights into conditions such as congestion, travel-time, choice of route, and traffic-flow. INTRA supports interactive point-and-click analysis, due to a novel and efficient indexing structure. With the web-site daisy.aau.dk/its/spqdemo/ we will demonstrate several analyses, using a very large real-world data set consisting of 1.9 billion GPS records (1.5 million trajectories) recorded from more than 13000 vehicles, and touching most...
Principles of harmonic analysis
Deitmar, Anton
2014-01-01
This book offers a complete and streamlined treatment of the central principles of abelian harmonic analysis: Pontryagin duality, the Plancherel theorem and the Poisson summation formula, as well as their respective generalizations to non-abelian groups, including the Selberg trace formula. The principles are then applied to spectral analysis of Heisenberg manifolds and Riemann surfaces. This new edition contains a new chapter on p-adic and adelic groups, as well as a complementary section on direct and projective limits. Many of the supporting proofs have been revised and refined. The book is an excellent resource for graduate students who wish to learn and understand harmonic analysis and for researchers seeking to apply it.
Zorich, Vladimir A
2015-01-01
VLADIMIR A. ZORICH is professor of mathematics at Moscow State University. His areas of specialization are analysis, conformal geometry, quasiconformal mappings, and mathematical aspects of thermodynamics. He solved the problem of global homeomorphism for space quasiconformal mappings. He holds a patent in the technology of mechanical engineering, and he is also known by his book Mathematical Analysis of Problems in the Natural Sciences . This second English edition of a very popular two-volume work presents a thorough first course in analysis, leading from real numbers to such advanced topics as differential forms on manifolds; asymptotic methods; Fourier, Laplace, and Legendre transforms; elliptic functions; and distributions. Especially notable in this course are the clearly expressed orientation toward the natural sciences and the informal exploration of the essence and the roots of the basic concepts and theorems of calculus. Clarity of exposition is matched by a wealth of instructive exercises, problems...
Digital Fourier analysis fundamentals
Kido, Ken'iti
2015-01-01
This textbook is a thorough, accessible introduction to digital Fourier analysis for undergraduate students in the sciences. Beginning with the principles of sine/cosine decomposition, the reader walks through the principles of discrete Fourier analysis before reaching the cornerstone of signal processing: the Fast Fourier Transform. Saturated with clear, coherent illustrations, "Digital Fourier Analysis - Fundamentals" includes practice problems and thorough Appendices for the advanced reader. As a special feature, the book includes interactive applets (available online) that mirror the illustrations. These user-friendly applets animate concepts interactively, allowing the user to experiment with the underlying mathematics. For example, a real sine signal can be treated as a sum of clockwise and counter-clockwise rotating vectors. The applet illustration included with the book animates the rotating vectors and the resulting sine signal. By changing parameters such as amplitude and frequency, the reader ca...
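The rotating-vector view of a real sine mentioned above can be checked numerically. The frequency and sampling rate below are arbitrary illustrative choices; over a 1-second window the rfft bin index equals the frequency in Hz:

```python
import numpy as np

# A real sine as the sum of two counter-rotating complex vectors:
#   sin(wt) = (e^{iwt} - e^{-iwt}) / (2i)
f, fs = 5.0, 1000.0
t = np.arange(0, 1, 1 / fs)
ccw = np.exp(1j * 2 * np.pi * f * t) / 2j    # counter-clockwise rotating vector
cw = -np.exp(-1j * 2 * np.pi * f * t) / 2j   # clockwise rotating vector
reconstructed = (ccw + cw).real

assert np.allclose(reconstructed, np.sin(2 * np.pi * f * t))

# The FFT recovers the frequency: with a 1 s window, bin k corresponds to k Hz
peak_bin = int(np.argmax(np.abs(np.fft.rfft(reconstructed))))
print(peak_bin)  # 5
```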
Choudary, A D R
2014-01-01
The book targets undergraduate and postgraduate mathematics students and helps them develop a deep understanding of mathematical analysis. Designed as a first course in real analysis, it helps students learn how abstract mathematical analysis solves mathematical problems that relate to the real world. As well as providing a valuable source of inspiration for contemporary research in mathematics, the book helps students read, understand and construct mathematical proofs, develop their problem-solving abilities and comprehend the importance and frontiers of computer facilities and much more. It offers comprehensive material for both seminars and independent study for readers with a basic knowledge of calculus and linear algebra. The first nine chapters followed by the appendix on the Stieltjes integral are recommended for graduate students studying probability and statistics, while the first eight chapters followed by the appendix on dynamical systems will be of use to students of biology and environmental scie...
DEFF Research Database (Denmark)
Frigaard, Peter; Andersen, Thomas Lykke
The present book describes the most important aspects of wave analysis techniques applied to physical model tests. Moreover, the book serves as technical documentation for the wave analysis software WaveLab 3, cf. Aalborg University (2012). In that respect it should be mentioned that supplementary...... to the present technical documentation exists also the online help document describing the WaveLab software in detail including all the inputs and output fields. In addition to the two main authors also Tue Hald, Jacob Helm-Petersen and Morten Møller Jakobsen have contributed to the note. Their input is highly...... acknowledged. The outline of the book is as follows: • Chapter 2 and 3 describes analysis of waves in time and frequency domain. • Chapter 4 and 5 describes the separation of incident and reflected waves for the two-dimensional case. • Chapter 6 describes the estimation of the directional spectra which also...
Energy Technology Data Exchange (ETDEWEB)
Donahue, C.J.; Rais, E.A. [University of Michigan, Dearborn, MI (USA)
2009-02-15
This lab experiment illustrates the use of thermogravimetric analysis (TGA) to perform proximate analysis on a series of coal samples of different rank. Peat and coke are also examined. A total of four exercises are described. These are dry exercises, as students interpret previously recorded scans. The weight percent moisture, volatile matter, fixed carbon, and ash content are determined for each sample and comparisons are made. Proximate analysis is performed on a coal sample from a local electric utility. From the weight percent sulfur found in the coal (determined by a separate procedure, the Eschka method) and the ash content, students calculate the quantity of sulfur dioxide emissions and ash produced annually by a large coal-fired electric power plant.
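The emission arithmetic the students perform can be sketched as follows; the coal tonnage and the sulfur and ash fractions are placeholder values, not figures from the experiment:

```python
# Hypothetical plant figures -- not values from the experiment:
coal_burned_t = 2_000_000   # tonnes of coal burned per year (assumed)
sulfur_frac = 0.02          # 2 wt% S, as from the Eschka method (assumed)
ash_frac = 0.10             # 10 wt% ash, as from the TGA proximate analysis (assumed)

# Each tonne of S oxidizes to SO2 with mass ratio M(SO2)/M(S) = 64.07/32.07
so2_t = coal_burned_t * sulfur_frac * (64.07 / 32.07)
ash_t = coal_burned_t * ash_frac
print(f"SO2: {so2_t:,.0f} t/yr, ash: {ash_t:,.0f} t/yr")
```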
Pugh, Charles C
2015-01-01
Based on an honors course taught by the author at UC Berkeley, this introduction to undergraduate real analysis gives a different emphasis by stressing the importance of pictures and hard problems. Topics include: a natural construction of the real numbers, four-dimensional visualization, basic point-set topology, function spaces, multivariable calculus via differential forms (leading to a simple proof of the Brouwer Fixed Point Theorem), and a pictorial treatment of Lebesgue theory. Over 150 detailed illustrations elucidate abstract concepts and salient points in proofs. The exposition is informal and relaxed, with many helpful asides, examples, some jokes, and occasional comments from mathematicians, such as Littlewood, Dieudonné, and Osserman. This book thus succeeds in being more comprehensive, more comprehensible, and more enjoyable, than standard introductions to analysis. New to the second edition of Real Mathematical Analysis is a presentation of Lebesgue integration done almost entirely using the un...
Vágner, Petr; Maršík, František
2016-01-01
The well-known Gouy-Stodola theorem states that a device produces maximum useful power when working reversibly, that is with no entropy production inside the device. This statement then leads to a method of thermodynamic optimization based on entropy production minimization. Exergy destruction (difference between exergy of fuel and exhausts) is also given by entropy production inside the device. Therefore, assessing efficiency of a device by exergy analysis is also based on the Gouy-Stodola theorem. However, assumptions that had led to the Gouy-Stodola theorem are not satisfied in several optimization scenarios, e.g. non-isothermal steady-state fuel cells, where both entropy production minimization and exergy analysis should be used with caution. We demonstrate, using non-equilibrium thermodynamics, a few cases where entropy production minimization and exergy analysis should not be applied.
Kalaitzis, Alfredo A
2011-01-01
Probabilistic principal component analysis (PPCA) seeks a low dimensional representation of a data set in the presence of independent spherical Gaussian noise, Σ = σ²I. The maximum likelihood solution for the model is an eigenvalue problem on the sample covariance matrix. In this paper we consider the situation where the data variance is already partially explained by other factors, e.g. covariates of interest, or temporal correlations leaving some residual variance. We decompose the residual variance into its components through a generalized eigenvalue problem, which we call residual component analysis (RCA). We show that canonical covariates analysis (CCA) is a special case of our algorithm and explore a range of new algorithms that arise from the framework. We illustrate the ideas on a gene expression time series data set and the recovery of human pose from silhouette.
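A minimal NumPy-only sketch of the generalized eigenvalue step behind RCA, with a toy data set and a stand-in explained-covariance matrix K (not the paper's model):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: rank-2 signal plus isotropic noise.
n, d = 500, 8
W = rng.normal(size=(d, 2))
X = rng.normal(size=(n, 2)) @ W.T + 0.1 * rng.normal(size=(n, d))
S = np.cov(X, rowvar=False)       # sample covariance

# Covariance already explained by other factors (hypothetical): here just noise.
K = 0.01 * np.eye(d)

# Generalized eigenproblem S v = lambda K v, solved by whitening with chol(K).
L = np.linalg.cholesky(K)
Linv = np.linalg.inv(L)
M = Linv @ S @ Linv.T
evals, U = np.linalg.eigh(M)      # eigenvalues in ascending order
V = Linv.T @ U                    # back-transform eigenvectors
residual_components = V[:, ::-1][:, :2]   # top-2 residual directions
print(residual_components.shape)  # (8, 2)
```

The directions found maximize variance in S relative to K; with K = σ²I this reduces to ordinary PCA up to scaling.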
Gasiński, Leszek
2016-01-01
This second of two Exercises in Analysis volumes covers problems in five core topics of mathematical analysis: Function Spaces, Nonlinear and Multivalued Maps, Smooth and Nonsmooth Calculus, Degree Theory and Fixed Point Theory, and Variational and Topological Methods. Each of five topics corresponds to a different chapter with inclusion of the basic theory and accompanying main definitions and results, followed by suitable comments and remarks for better understanding of the material. Exercises/problems are presented for each topic, with solutions available at the end of each chapter. The entire collection of exercises offers a balanced and useful picture for the application surrounding each topic. This nearly encyclopedic coverage of exercises in mathematical analysis is the first of its kind and is accessible to a wide readership. Graduate students will find the collection of problems valuable in preparation for their preliminary or qualifying exams as well as for testing their deeper understanding of the ...
Power electronics reliability analysis.
Energy Technology Data Exchange (ETDEWEB)
Smith, Mark A.; Atcitty, Stanley
2009-12-01
This report provides the DOE and industry with a general process for analyzing power electronics reliability. The analysis can help with understanding the main causes of failures, downtime, and cost and how to reduce them. One approach is to collect field maintenance data and use it directly to calculate reliability metrics related to each cause. Another approach is to model the functional structure of the equipment using a fault tree to derive system reliability from component reliability. Analysis of a fictitious device demonstrates the latter process. Optimization can use the resulting baseline model to decide how to improve reliability and/or lower costs. It is recommended that both electric utilities and equipment manufacturers make provisions to collect and share data in order to lay the groundwork for improving reliability into the future. Reliability analysis helps guide reliability improvements in hardware and software technology including condition monitoring and prognostics and health management.
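The fault-tree approach of deriving system reliability from component reliability can be sketched with independent components and AND/OR gates; the device layout and the failure probabilities below are hypothetical, not the report's fictitious device:

```python
def and_gate(probs):
    """Failure probability when ALL inputs must fail (independent components)."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(probs):
    """Failure probability when ANY single input failing fails the gate."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# Hypothetical converter: it fails if the controller fails, or if BOTH
# redundant switching legs fail.
controller, leg_a, leg_b = 0.01, 0.05, 0.05
system_failure = or_gate([controller, and_gate([leg_a, leg_b])])
system_reliability = 1.0 - system_failure
print(f"{system_reliability:.6f}")  # 0.987525
```

Redundancy turns the two 5% legs into a 0.25% joint failure, so the single-point controller dominates the system unreliability.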
Neutron signal transfer analysis
Pleinert, H; Lehmann, E
1999-01-01
A new method called neutron signal transfer analysis has been developed for the quantitative determination of hydrogenous distributions from neutron radiographic measurements. The technique is based on a model which describes the detector signal obtained in the measurement as the result of the action of three different mechanisms expressed by signal transfer functions. The explicit forms of the signal transfer functions are determined by Monte Carlo computer simulations and contain only the distribution as a variable; an unknown distribution can therefore be determined from the detector signal by recursive iteration. The method also takes into account complex effects due to the energy dependence of neutron interactions and single and multiple scattering, and thus provides a simple and efficient tool for precise quantitative analysis using neutron radiography, as for example quantitative determination of moisture distributions in porous buil...
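The recursive-iteration idea, refining a distribution estimate until the modelled signal matches the measurement, can be sketched as a damped fixed-point iteration. The exponential forward model below is an assumption for illustration only, standing in for the Monte Carlo-derived transfer functions:

```python
import numpy as np

def recover_distribution(signal, transfer, x0, iters=200, relax=0.5):
    """Refine an estimate h of the distribution so that transfer(h) matches
    the measured signal. The minus sign suits a transfer function that
    decreases with h (more hydrogen -> more attenuation -> smaller signal)."""
    h = x0.copy()
    for _ in range(iters):
        h = h - relax * (signal - transfer(h))
        h = np.clip(h, 0.0, None)   # physical distributions are non-negative
    return h

# Toy forward model (assumed): exponential beam attenuation, s = exp(-a * h)
a = 1.3
transfer = lambda h: np.exp(-a * h)
h_true = np.array([0.1, 0.4, 0.8])
s = transfer(h_true)

h_est = recover_distribution(s, transfer, x0=np.zeros(3))
print(np.round(h_est, 6))
```

With this decreasing transfer function the update map is a contraction on the relevant range, so the iteration converges to the distribution that reproduces the signal.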
Energy Technology Data Exchange (ETDEWEB)
Haurykiewicz, John Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Dinehart, Timothy Grant [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Parker, Robert Young [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2016-05-12
The purpose of this process analysis was to analyze the Badge Offices’ current processes from a systems perspective and consider ways of pursuing objectives set forth by SEC-PS, namely increased customer flow (throughput) and reduced customer wait times. Information for the analysis was gathered for the project primarily through Badge Office Subject Matter Experts (SMEs), and in-person observation of prevailing processes. Using the information gathered, a process simulation model was constructed to represent current operations and allow assessment of potential process changes relative to factors mentioned previously. The overall purpose of the analysis was to provide SEC-PS management with information and recommendations to serve as a basis for additional focused study and areas for potential process improvements in the future.
Foundations of VISAR analysis.
Energy Technology Data Exchange (ETDEWEB)
Dolan, Daniel H.
2006-06-01
The Velocity Interferometer System for Any Reflector (VISAR) is a widely used diagnostic at Sandia National Laboratories. Although the operating principles of the VISAR are well established, recently deployed systems (such as the fast push-pull and air delay VISAR) require more careful consideration, and many common assumptions about VISAR are coming into question. This report presents a comprehensive review of VISAR analysis to address these issues. Detailed treatment of several interferometer configurations is given to identify important aspects of the operation and characterization of VISAR systems. The calculation of velocity from interferometer measurements is also described. The goal is to derive the standard VISAR analysis relationships, indicate when these relationships are valid, and provide alternative methods when the standard analysis fails.
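The standard velocity-from-fringes relationship the report derives can be written down in a few lines. The parameter values below (532 nm laser, 1 ns delay, no window or dispersion correction) are illustrative assumptions, and the simple proportional form ignores exactly the corrections the report treats in detail:

```python
wavelength = 532e-9      # laser wavelength, m (assumed)
tau = 1.0e-9             # interferometer delay time, s (assumed)
delta = 0.0              # dispersion correction of the delay element

# velocity per fringe: VPF = lambda / (2 * tau * (1 + delta))
vpf = wavelength / (2.0 * tau * (1.0 + delta))   # m/s per fringe

def velocity(fringe_count):
    """Apparent velocity for an accumulated fringe count F: v = F * VPF."""
    return fringe_count * vpf
```

With these numbers one fringe corresponds to 266 m/s; real analyses must also handle window corrections and lost fringes, which is where the standard relationship breaks down.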
Software safety hazard analysis
Energy Technology Data Exchange (ETDEWEB)
Lawrence, J.D. [Lawrence Livermore National Lab., CA (United States)
1996-02-01
Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper.
Sohrab, Houshang H
2014-01-01
This expanded second edition presents the fundamentals and touchstone results of real analysis in full rigor, but in a style that requires little prior familiarity with proofs or mathematical language. The text is a comprehensive and largely self-contained introduction to the theory of real-valued functions of a real variable. The chapters on Lebesgue measure and integral have been rewritten entirely and greatly improved. They now contain Lebesgue’s differentiation theorem as well as his versions of the Fundamental Theorem(s) of Calculus. With expanded chapters, additional problems, and an expansive solutions manual, Basic Real Analysis, Second Edition, is ideal for senior undergraduates and first-year graduate students, both as a classroom text and a self-study guide. Reviews of first edition: The book is a clear and well-structured introduction to real analysis aimed at senior undergraduate and beginning graduate students. The prerequisites are few, but a certain mathematical sophistication is required. ....
Bandemer, Hans
1992-01-01
Fuzzy data such as marks, scores, verbal evaluations, imprecise observations, experts' opinions and grey tone pictures, are quite common. In Fuzzy Data Analysis the authors collect their recent results providing the reader with ideas, approaches and methods for processing such data when looking for sub-structures in knowledge bases for an evaluation of functional relationship, e.g. in order to specify diagnostic or control systems. The modelling presented uses ideas from fuzzy set theory and the suggested methods solve problems usually tackled by data analysis if the data are real numbers. Fuzzy Data Analysis is self-contained and is addressed to mathematicians oriented towards applications and to practitioners in any field of application who have some background in mathematics and statistics.
Sensitivity analysis and related analysis : A survey of statistical techniques
Kleijnen, J.P.C.
1995-01-01
This paper reviews the state of the art in five related types of analysis, namely (i) sensitivity or what-if analysis, (ii) uncertainty or risk analysis, (iii) screening, (iv) validation, and (v) optimization. The main question is: when should which type of analysis be applied; which statistical tec
Energy Technology Data Exchange (ETDEWEB)
Z. Ceylan
1998-04-28
The purpose of this analysis is to determine the structural response of the 21 pressurized water reactor (PWR) uncanistered fuel (UCF) waste package (WP) to a tipover design basis event (DBE) dynamic load; the results are reported in terms of stress magnitudes. The finite-element solution was performed using the commercially available ANSYS finite-element code. A finite-element model of the waste package was developed and analyzed for the tipover DBE dynamic load. The results of the analysis are provided in tables and are also plotted as maximum stress contours to determine the locations of the peak stresses.
Schramm, Michael J
2008-01-01
This text forms a bridge between courses in calculus and real analysis. It focuses on the construction of mathematical proofs as well as their final content. Suitable for upper-level undergraduates and graduate students of real analysis, it also provides a vital reference book for advanced courses in mathematics.The four-part treatment begins with an introduction to basic logical structures and techniques of proof, including discussions of the cardinality concept and the algebraic and order structures of the real and rational number systems. Part Two presents in-depth examinations of the compl
Energy Technology Data Exchange (ETDEWEB)
Witholder, R.E.
1980-04-01
The Solar Energy Research Institute has conducted a limited sensitivity analysis on a System for Projecting the Utilization of Renewable Resources (SPURR). The study utilized the Domestic Policy Review scenario for SPURR agricultural and industrial process heat and utility market sectors. This sensitivity analysis determines whether variations in solar system capital cost, operation and maintenance cost, and fuel cost (biomass only) correlate with intuitive expectations. The results of this effort contribute to a much larger issue: validation of SPURR. Such a study has practical applications for engineering improvements in solar technologies and is useful as a planning tool in the R and D allocation process.
Harmonic analysis and applications
Heil, Christopher
2007-01-01
This self-contained volume in honor of John J. Benedetto covers a wide range of topics in harmonic analysis and related areas. These include weighted-norm inequalities, frame theory, wavelet theory, time-frequency analysis, and sampling theory. The chapters are clustered by topic to provide authoritative expositions that will be of lasting interest. The original papers collected are written by prominent researchers and professionals in the field. The book pays tribute to John J. Benedetto's achievements and expresses an appreciation for the mathematical and personal inspiration he has given to
Analysis of maintenance strategies
International Nuclear Information System (INIS)
The main topics of the presentation include: (1) an analysis model and methods to evaluate maintenance action programs and to support decisions on changes to them, and (2) an understanding of maintenance strategies from a systems perspective as a basis for future developments. The subproject showed how systematic models for maintenance analysis and decision support, utilising computerised and statistical tool packages, can be put into use for the evaluation and optimisation of maintenance of active systems from the safety and economic points of view
Eliezer, C J; Maxwell, E A; Sneddon, I N
1963-01-01
Concise Vector Analysis is a five-chapter introductory account of the methods and techniques of vector analysis. These methods are indispensable tools in mathematics, physics, and engineering. The book is based on lectures given by the author in the University of Ceylon.The first two chapters deal with vector algebra. These chapters particularly present the addition, representation, and resolution of vectors. The next two chapters examine the various aspects and specificities of vector calculus. The last chapter looks into some standard applications of vector algebra and calculus.This book wil
Introduction to complex analysis
Priestley, H A
2003-01-01
Complex analysis is a classic and central area of mathematics, which is studied and exploited in a range of important fields, from number theory to engineering. Introduction to Complex Analysis was first published in 1985, and for this much awaited second edition the text has been considerably expanded, while retaining the style of the original. More detailed presentation is given of elementary topics, to reflect the knowledge base of current students. Exercise sets have been substantially revised and enlarged, with carefully graded exercises at the end of each chapter. This is the latest additi
Kishore, K Lal
2008-01-01
The second edition of Electronic Circuit Analysis is brought out with certain new topics and a reorganization of the text matter into eight units. With the addition of new topics, the syllabi of many universities in this subject can be covered. Besides this, the book can also meet the requirements of M.Sc (Electronics), AMIETE, and AMIE (Electronics) courses. The text matter has been improved thoroughly. New topics include frequency effects in multistage amplifiers, amplifier circuit analysis, design of high-frequency amplifiers, switching regulators, voltage multipliers, Uninterruptible Power Supplies (UPS), and Switchi
Environmental analysis support
Energy Technology Data Exchange (ETDEWEB)
Miller, R.L.
1996-06-01
Activities in environmental analysis support included assistance to the Morgantown and Pittsburgh Energy Technology Centers (METC and PETC) in reviewing and preparing documents required by the National Environmental Policy Act (NEPA) for projects selected for the Clean Coal Technology (CCT) Program. An important activity was the preparation for METC of a final Environmental Assessment (EA) for the proposed Externally Fired Combined Cycle (EFCC) Project in Warren, Pennsylvania. In addition, a post-project environmental analysis was prepared for PETC to evaluate the Demonstration of Advanced Combustion Techniques for a Tangentially-Fired Boiler in Lynn Haven, Florida.
Provenance as Dependency Analysis
Cheney, James; Acar, Umut
2007-01-01
Provenance is information recording the source, derivation, or history of some information. Provenance tracking has been studied in a variety of settings; however, although many design points have been explored, the mathematical or semantic foundations of data provenance have received comparatively little attention. In this paper, we argue that dependency analysis techniques familiar from program analysis and program slicing provide a formal foundation for forms of provenance that are intended to show how (part of) the output of a query depends on (parts of) its input. We introduce a semantic characterization of such dependency provenance, show that this form of provenance is not computable, and provide dynamic and static approximation techniques.
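A toy sketch of the dependency idea, under the simplifying assumption that provenance can be carried as a set of input record ids attached to each value (the paper's semantic characterization is far more general, and its exact form is proved uncomputable):

```python
def tag(value, ids):
    # a tagged value: (value, set of input record ids it was derived from)
    return (value, frozenset(ids))

def add(a, b):
    # combining two tagged values unions their provenance sets
    return (a[0] + b[0], a[1] | b[1])

# input "database": record id -> value
db = {1: 10, 2: 20, 3: 30}
tagged = [tag(v, {k}) for k, v in db.items()]

# a query: sum of all values greater than 15
selected = [t for t in tagged if t[0] > 15]
total = selected[0]
for t in selected[1:]:
    total = add(total, t)
# total carries both the answer and the records it depends on: (50, {2, 3})
```

This dynamic tagging is the flavor of the "dynamic approximation" the abstract mentions; a static analysis would instead over-approximate the dependency sets from the query text alone.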
Strictness and Totality Analysis
DEFF Research Database (Denmark)
Solberg, K. L.; Nielson, Hanne Riis; Nielson, Flemming
1998-01-01
We define a novel inference system for strictness and totality analysis for the simply-typed lazy lambda-calculus with constants and fixpoints. Strictness information identifies those terms that definitely denote bottom (i.e. do not evaluate to WHNF), whereas totality information identifies those terms that definitely do not denote bottom (i.e. do evaluate to WHNF). The analysis is presented as an annotated type system allowing conjunctions at 'top-level' only. We give examples of its use and prove the correctness with respect to a natural-style operational semantics.
Towards Cognitive Component Analysis
DEFF Research Database (Denmark)
Hansen, Lars Kai; Ahrendt, Peter; Larsen, Jan
2005-01-01
Cognitive component analysis (COCA) is here defined as the process of unsupervised grouping of data such that the ensuing group structure is well-aligned with that resulting from human cognitive activity. We have earlier demonstrated that independent component analysis is relevant for representing semantics, not only in text, but also in dynamic text (chat), images, and combinations of text and images. Here we further expand on the relevance of the ICA model for representing context, including two new analyses of abstract data: social networks and musical features.
Abrahams, J R; Hiller, N
1965-01-01
Signal Flow Analysis provides information pertinent to the fundamental aspects of signal flow analysis. This book discusses the basic theory of signal flow graphs and shows their relation to the usual algebraic equations.Organized into seven chapters, this book begins with an overview of properties of a flow graph. This text then demonstrates how flow graphs can be applied to a wide range of electrical circuits that do not involve amplification. Other chapters deal with the parameters as well as circuit applications of transistors. This book discusses as well the variety of circuits using ther
Hoffman, Kenneth
2007-01-01
Developed for an introductory course in mathematical analysis at MIT, this text focuses on concepts, principles, and methods. Its introductions to real and complex analysis are closely formulated, and they constitute a natural introduction to complex function theory.Starting with an overview of the real number system, the text presents results for subsets and functions related to Euclidean space of n dimensions. It offers a rigorous review of the fundamentals of calculus, emphasizing power series expansions and introducing the theory of complex-analytic functions. Subsequent chapters cover seq
Rudin, Walter
2011-01-01
In the late 1950s, many of the more refined aspects of Fourier analysis were transferred from their original settings (the unit circle, the integers, the real line) to arbitrary locally compact abelian (LCA) groups. Rudin's book, published in 1962, was the first to give a systematic account of these developments and has come to be regarded as a classic in the field. The basic facts concerning Fourier analysis and the structure of LCA groups are proved in the opening chapters, in order to make the treatment relatively self-contained.
Associative Analysis in Statistics
Directory of Open Access Journals (Sweden)
Mihaela Muntean
2015-03-01
In recent years, interest in technologies such as in-memory analytics and associative search has increased. This paper explores how in-memory analytics and an associative model can be used in statistics. The word “associative” puts the emphasis on understanding how datasets relate to one another. The paper presents the main characteristics of the associative data model, shows how to design an associative model for the analysis of labor market indicators, with the EU Labor Force Survey as the source, and describes how to perform associative analysis.
Peng, Chao-Ying Joanne
2008-01-01
"Peng provides an excellent overview of data analysis using the powerful statistical software package SAS. This book is quite appropriate as a self-placed tutorial for researchers, as well as a textbook or supplemental workbook for data analysis courses such as statistics or research methods. Peng provides detailed coverage of SAS capabilities using step-by-step procedures and includes numerous comprehensive graphics and figures, as well as SAS printouts. Readers do not need a background in computer science or programming. Includes numerous examples in education, health sciences, and business.
Institute of Scientific and Technical Information of China (English)
ZHANG Hong; WANG Xin; LI Junwei; CAO Xianguang
2006-01-01
A new unsupervised feature extraction method called similar component analysis (SCA) is proposed in this paper. SCA has a self-aggregation property: the data objects move towards each other to form clusters, which can theoretically reveal the inherent pattern of similarity hidden in the dataset. The inputs of SCA are just the pairwise similarities of the dataset, which makes it well suited to time series analysis given the variable lengths of time series. Our experimental results on many problems have verified the effectiveness of SCA in some engineering applications.
1976-01-01
A complete motion analysis laboratory has evolved out of analyzing the walking patterns of crippled children at Stanford Children's Hospital. Data are collected by placing tiny electrical sensors over muscle groups of the child's legs and inserting step-sensing switches in the soles of the shoes. Miniature radio transmitters send signals to a receiver for continuous recording of the abnormal walking pattern. Engineers are working to apply space-electronics miniaturization techniques to further reduce the size and weight of the telemetry system, and are striving to increase the signal bandwidth so that the analysis can be performed faster and more accurately using a minicomputer.
Harris, D. W.
1972-01-01
The radiation interface in spacecraft using radioisotope thermoelectric generators (RTGs) is studied. A Monte Carlo analysis of the radiation field, which includes scattered-radiation effects, produced neutron and gamma-photon isoflux contours as functions of distance from the RTG centerline. It is shown that the photon flux is significantly depressed in the RTG axial direction because of self-shielding. Total flux values are determined by converting the uncollided flux values into an equivalent RTG surface source and then performing a Monte Carlo analysis for each specific dose point. Energy distributions of the particle spectra completely define the radiation interface for a spacecraft model.
Greenberg, Marc W.; Laing, William
2013-01-01
An Economic Analysis (EA) is a systematic approach to the problem of choosing the best method of allocating scarce resources to achieve a given objective. An EA helps guide decisions on the "worth" of pursuing an action that departs from status quo ... an EA is the crux of decision-support.
Inverse correspondence analysis
Groenen, PJF; van de Velden, M
2004-01-01
In correspondence analysis (CA), rows and columns of a data matrix are depicted as points in low-dimensional space. The row and column profiles are approximated by minimizing the so-called weighted chi-squared distance between the original profiles and their approximations, see for example, [Theory
Retrospective landscape analysis
DEFF Research Database (Denmark)
Fritzbøger, Bo
2011-01-01
On the basis of maps from the 18th and 19th centuries, a retrospective analysis was carried out of documentary settlement and landscape data extending back to the Middle Ages with the intention of identifying and dating general structural and dynamic features of the cultural landscape in a selected...
Haskell data analysis cookbook
Shukla, Nishant
2014-01-01
Step-by-step recipes filled with practical code samples and engaging examples demonstrate Haskell in practice, and then the concepts behind the code. This book shows functional developers and analysts how to leverage their existing knowledge of Haskell specifically for high-quality data analysis. A good understanding of data sets and functional programming is assumed.
Kolmogorov, A N; Silverman, Richard A
1975-01-01
Self-contained and comprehensive, this elementary introduction to real and functional analysis is readily accessible to those with background in advanced calculus. It covers basic concepts and introductory principles in set theory, metric spaces, topological and linear spaces, linear functionals and linear operators, and much more. 350 problems. 1970 edition.
Information Security Risk Analysis
Peltier, Thomas R
2010-01-01
Offers readers the knowledge and skill set needed to achieve a highly effective risk analysis assessment. This title demonstrates how to identify threats and then determine whether those threats pose a real risk. It is suitable for industry and academia professionals.
Kane, Jonathan M
2016-01-01
This is a textbook on proof writing in the area of analysis, balancing a survey of the core concepts of mathematical proof with a tight, rigorous examination of the specific tools needed for an understanding of analysis. Instead of the standard "transition" approach to teaching proofs, wherein students are taught fundamentals of logic, given some common proof strategies such as mathematical induction, and presented with a series of well-written proofs to mimic, this textbook teaches what a student needs to be thinking about when trying to construct a proof. Covering the fundamentals of analysis sufficient for a typical beginning Real Analysis course, it never loses sight of the fact that its primary focus is about proof writing skills. This book aims to give the student precise training in the writing of proofs by explaining exactly what elements make up a correct proof, how one goes about constructing an acceptable proof, and, by learning to recognize a correct proof, how to avoid writing incorrect proofs. T...
Bayesian logistic regression analysis
Van Erp, H.R.N.; Van Gelder, P.H.A.J.M.
2012-01-01
In this paper we present a Bayesian logistic regression analysis. It is found that if one wishes to derive the posterior distribution of the probability of some event, then, together with the traditional Bayes Theorem and the integrating out of nuisance parameters, the Jacobian transformation is an
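As a hedged illustration of the Bayesian machinery involved (a grid approximation with a flat prior, not the Jacobian-transformation derivation of the paper), the posterior of a single logistic-regression coefficient can be computed directly; the data below are invented:

```python
import math

x = [-2.0, -1.0, 0.0, 1.0, 2.0]
y = [0, 0, 1, 1, 1]               # binary outcomes (illustrative)

def log_likelihood(beta):
    # Bernoulli log-likelihood under p(y=1|x) = sigmoid(beta * x)
    ll = 0.0
    for xi, yi in zip(x, y):
        p = 1.0 / (1.0 + math.exp(-beta * xi))
        ll += math.log(p if yi else 1.0 - p)
    return ll

# flat prior on a grid: posterior is proportional to the likelihood
grid = [i * 0.05 for i in range(-100, 101)]       # beta in [-5, 5]
weights = [math.exp(log_likelihood(b)) for b in grid]
z = sum(weights)
posterior = [w / z for w in weights]

beta_mean = sum(b * p for b, p in zip(grid, posterior))   # posterior mean
```

With only one parameter the grid replaces the integrating-out step; with nuisance parameters one would marginalize over the extra grid dimensions.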
International Nuclear Information System (INIS)
This document is one in a series of publications known as the ETDE/INIS Joint Reference Series and also constitutes a part of the ETDE Procedures Manual. It presents the rules, guidelines and procedures to be adopted by centers submitting input to the International Nuclear Information System (INIS) or the Energy Technology Data Exchange (ETDE). It is a manual for the subject analysis part of input preparation, meaning the selection, subject classification, abstracting and subject indexing of relevant publications, and is to be used in conjunction with the Thesauruses, Subject Categories documents and the documents providing guidelines for the preparation of abstracts. The concept and structure of the new manual are intended to describe in a logical and efficient sequence all the steps comprising the subject analysis of documents to be reported to INIS or ETDE. The manual includes new chapters on preparatory analysis, subject classification, abstracting and subject indexing, as well as rules, guidelines, procedures, examples and a special chapter on guidelines and examples for subject analysis in particular subject fields. (g.t.; a.n.)
International Nuclear Information System (INIS)
To impart hands-on training in physics analysis, the CMS experiment initiated the concept of the CMS Data Analysis School (CMSDAS). It was born over three years ago at the LPC (LHC Physics Centre) at Fermilab and is based on earlier workshops held at the LPC and the CLEO experiment. As CMS transitioned from construction to data-taking mode, the nature of the training also evolved to include more analysis tools, software tutorials and physics analysis. This effort, epitomized in CMSDAS, has proven to be key for new and young physicists to jump-start and contribute to the physics goals of CMS by looking for new physics with the collision data. With over 400 physicists trained in six CMSDAS events around the globe, CMS is trying to engage the collaboration in its discovery potential and maximize physics output. As a bigger goal, CMS is striving to nurture and increase the engagement of its myriad talents in the development of physics, service, upgrades, the education of those new to CMS and the career development of younger members. An extension of the concept to dedicated software and hardware schools is also planned, keeping in mind the ensuing upgrade phase.
Senegal : Country Environmental Analysis
World Bank
2008-01-01
The main objective of the Senegal Country Environmental Analysis (CEA) is to reinforce the ongoing dialogue on environmental issues between the World Bank and the Government of Senegal. The CEA also aims to support the ongoing Government implementation of a strategic results-based planning process at the Environment Ministry (MEPNBRLA). The main goal is to enable Senegal to have the necess...
D'Amico, Teresa; Donahue, Craig J.; Rais, Elizabeth A.
2008-01-01
This lab experiment illustrates the use of differential scanning calorimetry (DSC) and thermal gravimetric analysis (TGA) in the measurement of polymer properties. A total of seven exercises are described. These are dry exercises: students interpret previously recorded scans. They do not perform the experiments. DSC was used to determine the…
Braun, W. John
2012-01-01
The Analysis of Variance is often taught in introductory statistics courses, but it is not clear that students really understand the method. This is because the derivation of the test statistic and p-value requires a relatively sophisticated mathematical background which may not be well-remembered or understood. Thus, the essential concept behind…
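The derivation the authors say students rarely retain reduces, in the one-way case, to a short computation: partition the total variation into between-group and within-group sums of squares and take their ratio of mean squares. A worked sketch with invented data:

```python
# three small groups of observations (invented for illustration)
groups = [[4.0, 5.0, 6.0], [7.0, 8.0, 9.0], [5.0, 6.0, 7.0]]

n = sum(len(g) for g in groups)           # total observations
k = len(groups)                           # number of groups
grand_mean = sum(sum(g) for g in groups) / n

# between-group sum of squares: group size times squared deviation
# of each group mean from the grand mean
ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
# within-group sum of squares: deviations from each group's own mean
ss_within = sum(sum((v - sum(g) / len(g)) ** 2 for v in g) for g in groups)

# F = between-group mean square / within-group mean square
f_stat = (ss_between / (k - 1)) / (ss_within / (n - k))
```

The p-value then comes from the F distribution with (k - 1, n - k) degrees of freedom; the essential concept is that a large ratio means group means differ by more than within-group noise explains.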
Elementary functional analysis
Shilov, Georgi E
1996-01-01
Introductory text covers basic structures of mathematical analysis (linear spaces, metric spaces, normed linear spaces, etc.), differential equations, orthogonal expansions, Fourier transforms - including problems in the complex domain, especially involving the Laplace transform - and more. Each chapter includes a set of problems, with hints and answers. Bibliography. 1974 edition.
Silverman, Richard A
1984-01-01
A shorter version of A. I. Markushevich's masterly three-volume Theory of Functions of a Complex Variable, this edition is appropriate for advanced undergraduate and graduate courses in complex analysis. Numerous worked-out examples and more than 300 problems, some with hints and answers, make it suitable for independent study. 1967 edition.
Operando (micro) XAFS analysis
Arčon, Iztok; Dominko, Robert; Vogel-Mikuš, Katarina
2016-01-01
In the talk the principles of XAS methods were presented, with practical examples illustrating the possibilities and advanced approaches for their use in the structural analysis of different types of materials. The emphasis was on the use of XAS spectroscopy in operando mode and in combination with X-ray microscopy.
Martin, Vance S.
2009-01-01
There have been many attempts to understand how the Internet affects our modern world. There have also been numerous attempts to understand specific areas of the Internet. This article applies Immanuel Wallerstein's World Systems Analysis to our informationalist society. Understanding this world as divided among individual core, semi-periphery,…
Multiphasic analysis of growth.
Koops, W.J.
1989-01-01
The central theme of this thesis is the mathematical analysis of growth in animals, based on the theory of multiphasic growth. Growth in biological terms is related to increase in size and shape. This increase is determined by internal (genetical) and external (environmental) factors. Well known mat
Russian Language Analysis Project
Serianni, Barbara; Rethwisch, Carolyn
2011-01-01
This paper is the result of a language analysis research project focused on the Russian Language. The study included a diverse literature review that included published materials as well as online sources in addition to an interview with a native Russian speaker residing in the United States. Areas of study include the origin and history of the…
Isaacson, Eugene
1994-01-01
This excellent text for advanced undergraduates and graduate students covers norms, numerical solution of linear systems and matrix factoring, iterative solutions of nonlinear equations, eigenvalues and eigenvectors, polynomial approximation, and other topics. It offers a careful analysis and stresses techniques for developing new methods, plus many examples and problems. 1966 edition.
Safeguards system analysis, (1)
International Nuclear Information System (INIS)
A system analysis of an implemented safeguards system based on traditional materials accountancy was carried out. This report describes the verification methods applied to operators' measurement data, the MUF evaluation method, theories on the choice of PIT frequency, and the design of inspection plans. (author)
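The MUF (material unaccounted for) evaluation at the heart of materials accountancy rests on a simple balance over an inventory period; a minimal sketch with illustrative figures:

```python
def muf(beginning, receipts, shipments, ending):
    """MUF = beginning inventory + receipts - shipments - ending inventory.
    All quantities in the same unit (here: kg of nuclear material)."""
    return beginning + sum(receipts) - sum(shipments) - ending

# illustrative book and physical inventory figures, in kg
value = muf(beginning=100.0, receipts=[20.0, 5.0], shipments=[30.0], ending=94.2)
```

A nonzero MUF (here 0.8 kg) is not by itself a diagnosis: the evaluation methods the report describes compare it against the propagated measurement uncertainty before any conclusion is drawn.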
Westerhuis, J.A.; Derks, E.P.P.A.; Hoefsloot, H.C.J.; Smilde, A.K.
2007-01-01
The interpretation of principal component analysis (PCA) models of complex biological or chemical data can be cumbersome because in PCA the decomposition is performed without any knowledge of the system at hand. Prior information about the system is not used to improve the interpretation. In this paper
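For reference, the knowledge-free decomposition the abstract refers to can be sketched in a few lines: the first principal component is the leading eigenvector of the covariance matrix, found here by power iteration on a tiny invented two-variable data set:

```python
import math

data = [(2.5, 2.4), (0.5, 0.7), (2.2, 2.9), (1.9, 2.2), (3.1, 3.0)]

# center the data
mx = sum(p[0] for p in data) / len(data)
my = sum(p[1] for p in data) / len(data)
centered = [(x - mx, y - my) for x, y in data]

# 2x2 sample covariance matrix
n = len(data) - 1
cxx = sum(x * x for x, _ in centered) / n
cyy = sum(y * y for _, y in centered) / n
cxy = sum(x * y for x, y in centered) / n

# power iteration for the leading eigenvector (first principal component)
v = (1.0, 0.0)
for _ in range(100):
    w = (cxx * v[0] + cxy * v[1], cxy * v[0] + cyy * v[1])
    norm = math.hypot(*w)
    v = (w[0] / norm, w[1] / norm)
# v now points along the direction of maximal variance
```

Nothing in this computation knows what the variables mean, which is exactly the interpretation problem the paper addresses.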
Advanced biomedical image analysis
Haidekker, Mark A
2010-01-01
"This book covers the four major areas of image processing: Image enhancement and restoration, image segmentation, image quantification and classification, and image visualization. Image registration, storage, and compression are also covered. The text focuses on recently developed image processing and analysis operators and covers topical research"--Provided by publisher.
Learning: An Evolutionary Analysis
Swann, Joanna
2009-01-01
This paper draws on the philosophy of Karl Popper to present a descriptive evolutionary epistemology that offers philosophical solutions to the following related problems: "What happens when learning takes place?" and "What happens in human learning?" It provides a detailed analysis of how learning takes place without any direct transfer of…
Developing Word Analysis Skills.
Heilman, Arthur W.
The importance of word analysis skills to reading ability is discussed, and methodologies for teaching such skills are examined. It is stated that a child cannot become proficient in reading if he does not master the skill of associating printed letter symbols with the sounds they represent. Instructional procedures which augment the alphabet with…
DEFF Research Database (Denmark)
Vatrapu, Ravi; Mukkamala, Raghava Rao; Hussain, Abid;
2016-01-01
automata and agent-based modeling). However, when it comes to organizational and societal units of analysis, there exists no approach to conceptualize, model, analyze, explain, and predict social media interactions as individuals' associations with ideas, values, identities, and so on. To address...
Learning Haskell data analysis
Church, James
2015-01-01
If you are a developer, analyst, or data scientist who wants to learn data analysis methods using Haskell and its libraries, then this book is for you. Prior experience with Haskell and a basic knowledge of data science will be beneficial.
Communication Network Analysis Methods.
Farace, Richard V.; Mabee, Timothy
This paper reviews a variety of analytic procedures that can be applied to network data, discussing the assumptions and usefulness of each procedure when applied to the complexity of human communication. Special attention is paid to the network properties measured or implied by each procedure. Factor analysis and multidimensional scaling are among…
DEFF Research Database (Denmark)
Poulsen, Mikael Zebbelin
2002-01-01
, by the implementation of the Simpy tool box. This is an object oriented system implemented in the Python language. It can be used for analysis of DAEs, ODEs and non-linear equation and uses e.g. symbolic representations of expressions and equations. The presentations of theory and algorithms for structural index...
Shifted Independent Component Analysis
DEFF Research Database (Denmark)
Mørup, Morten; Madsen, Kristoffer Hougaard; Hansen, Lars Kai
2007-01-01
Delayed mixing is a problem of theoretical interest and practical importance, e.g., in speech processing, bio-medical signal analysis and financial data modelling. Most previous analyses have been based on models with integer shifts, i.e., shifts by a number of samples, and have often been carried...
Banerjee, Sudipto
2016-01-01
With increasing accessibility to geographic information systems (GIS) software, statisticians and data analysts routinely encounter scientific data sets with geocoded locations. This has generated considerable interest in statistical modeling for location-referenced spatial data. In public health, spatial data routinely arise as aggregates over regions, such as counts or rates over counties, census tracts, or some other administrative delineation. Such data are often referred to as areal data. This review article provides a brief overview of statistical models that account for spatial dependence in areal data. It does so in the context of two applications: disease mapping and spatial survival analysis. Disease maps are used to highlight geographic areas with high and low prevalence, incidence, or mortality rates of a specific disease and the variability of such rates over a spatial domain. They can also be used to detect hot spots or spatial clusters that may arise owing to common environmental, demographic, or cultural effects shared by neighboring regions. Spatial survival analysis refers to the modeling and analysis for geographically referenced time-to-event data, where a subject is followed up to an event (e.g., death or onset of a disease) or is censored, whichever comes first. Spatial survival analysis is used to analyze clustered survival data when the clustering arises from geographical regions or strata. Illustrations are provided in these application domains.
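The review above centres on models that capture spatial dependence between neighbouring regions. A quick diagnostic for such dependence in areal data is Moran's I; the sketch below computes it in plain Python for four hypothetical counties along a corridor (the rates and adjacency are invented for illustration, not taken from the article).

```python
# Hypothetical illustration: Moran's I, a standard statistic for spatial
# autocorrelation in areal data. Data and neighbourhood structure are toy values.

def morans_i(values, weights):
    """values: regional rates; weights[i][j] is 1 if regions i and j are neighbours."""
    n = len(values)
    mean = sum(values) / n
    w_total = sum(sum(row) for row in weights)
    num = sum(weights[i][j] * (values[i] - mean) * (values[j] - mean)
              for i in range(n) for j in range(n))
    den = sum((v - mean) ** 2 for v in values)
    return (n / w_total) * (num / den)

# four counties in a chain: 0-1-2-3, with low rates clustered at one end
rates = [10.0, 12.0, 28.0, 30.0]
adj = [[0, 1, 0, 0],
       [1, 0, 1, 0],
       [0, 1, 0, 1],
       [0, 0, 1, 0]]
print(round(morans_i(rates, adj), 3))  # → 0.39
```

Values near +1 indicate that neighbouring regions carry similar rates, values near 0 indicate no spatial pattern, and negative values indicate checkerboard-like alternation.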
International Nuclear Information System (INIS)
Conventional Risk Analysis (RA) usually relates the frequency of an undesired event to its consequences. This technique is used nowadays in Brazil to analyze accidents and their consequences strictly from a human standpoint, valuing losses of human equipment, human structures and human lives, without considering the damage caused to the natural resources that keep life possible on Earth. This paradigm developed primarily because of Homo sapiens' lack of perception of the natural web needed to sustain his own life. In reality, the Brazilian professionals responsible today for licensing, auditing and inspecting environmental aspects of human activities face huge difficulties in drafting technical specifications and procedures leading to acceptable levels of impact, the more so given the intrinsic difficulty of defining those levels. Therefore, in Brazil the RA technique is a weak tool for licensing for many reasons, among them its short scope (only accident considerations) and a wrong paradigm (only direct human damages). A paper by the author on the former was already presented at the 7th International Conference on Environmetrics, July 1996, USP-SP. This one discusses the extension of the risk-analysis concept to take environmental consequences into account, transforming the conventional analysis into a broader methodology named here Environmental Risk Analysis. (author)
Institute of Scientific and Technical Information of China (English)
于昌利
2008-01-01
This paper is based on Chinese and English examples, presenting a systematic research and analysis of zero anaphora, pronominal anaphora and NP anaphora. It is found that semantic features, pragmatic elements, contexts and syntactic structures play important roles in the choice and interpretation of anaphora.
Directory of Open Access Journals (Sweden)
Doaa Mohey El-Din
2015-09-01
Sentiment analysis, or opinion mining, is used to automate the detection of subjective information such as opinions, attitudes, emotions, and feelings. Hundreds of thousands of researchers spend a long time selecting suitable papers for their work; online reviews of papers are an essential aid, saving reading time and paper cost. This paper proposes a new technique for analyzing online reviews, called sentiment analysis of online papers (SAOOP). SAOOP enhances the bag-of-words model, improving accuracy and performance, and increases the rate at which review sentences are understood by covering more language cases. It introduces solutions to several sentiment-analysis challenges and uses them to achieve higher accuracy. This paper also presents a measure of topic-domain attributes, which provides a ranking of the total judgement on each text review for assessing and comparing results across different sentiment techniques. Finally, the efficiency of the proposed approach is shown by comparing it with two existing sentiment-analysis techniques on accuracy, performance and sentence-understanding rate.
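SAOOP builds on the bag-of-words model; as a point of reference, a minimal bag-of-words polarity scorer with naive negation handling might look like the following. The lexicon, weights and negation rule here are toy choices of ours, not the paper's technique.

```python
# Toy bag-of-words sentiment scorer: sum word polarities, flipping the sign of a
# sentiment word that follows a negator. Lexicon and weights are invented.

POLARITY = {"good": 1, "great": 2, "clear": 1, "poor": -1, "weak": -1, "confusing": -2}
NEGATORS = {"not", "no", "never"}

def score(review: str) -> int:
    tokens = review.lower().replace(".", " ").replace(",", " ").split()
    total, negate = 0, False
    for tok in tokens:
        if tok in NEGATORS:
            negate = True          # flip polarity of the next sentiment word
            continue
        if tok in POLARITY:
            total += -POLARITY[tok] if negate else POLARITY[tok]
            negate = False
    return total

print(score("The method is clear and the results are great"))  # → 3
print(score("Not a good paper, the evaluation is weak"))       # → -2
```

Real systems (SAOOP included) go well beyond this, handling intensifiers, domain attributes and wider language coverage; the sketch only fixes the baseline the abstract refers to.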
Wood, P. J.; Gower, D. B.
This chapter covers the analysis of steroids with progesterone-like activity, classified as “progestagens”. Steroids in this group include the naturally occurring C21 steroids, progesterone (4-pregnene-3,20-dione) and its metabolites, together with synthetic steroids, such as norgestrel, norethisterone (NE), and medroxyprogesterone acetate, which also have progestational activity.
Anisotropic generalized Procrustes analysis
Bennani Dosse, Mohammed; Kiers, Henk A.L.; Ten Berge, Jos M.F.
2011-01-01
Generalized Procrustes analysis is a popular method for matching several configurations by translations, rotations/reflections and scaling constants. It aims at producing a group average from these Euclidean similarity transformations followed by bi-linear approximation of this group average for gra
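As context for the abstract above, the core of classical (isotropic) Procrustes matching in two dimensions reduces to finding the rotation angle that best aligns two centred configurations in the least-squares sense; the anisotropic scaling studied in the paper is not reproduced in this sketch.

```python
# Classical 2-D Procrustes rotation: for centred point sets x and y, the angle
# maximizing sum(y_i . R(theta) x_i) is atan2(sum of cross products, sum of dots).
import math

def best_rotation(x, y):
    """x, y: lists of centred 2-D points; returns the optimal rotation angle (radians)."""
    dot = sum(xi * yi + xj * yj for (xi, xj), (yi, yj) in zip(x, y))
    cross = sum(xi * yj - xj * yi for (xi, xj), (yi, yj) in zip(x, y))
    return math.atan2(cross, dot)

x = [(1, 0), (0, 1), (-1, 0)]
y = [(0, 1), (-1, 0), (0, -1)]   # x rotated by 90 degrees
print(round(math.degrees(best_rotation(x, y))))  # → 90
```

Generalized Procrustes analysis iterates steps like this one (plus translation and scaling) over several configurations to build the group average the abstract describes.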
W. de Nooy
2009-01-01
Social network analysis (SNA) focuses on the structure of ties within a set of social actors, e.g., persons, groups, organizations, and nations, or the products of human activity or cognition such as web sites, semantic concepts, and so on. It is linked to structuralism in sociology stressing the si
Bayesian Independent Component Analysis
DEFF Research Database (Denmark)
Winther, Ole; Petersen, Kaare Brandt
2007-01-01
In this paper we present an empirical Bayesian framework for independent component analysis. The framework provides estimates of the sources, the mixing matrix and the noise parameters, and is flexible with respect to choice of source prior and the number of sources and sensors. Inside the engine...
Transactional Analysis in Management.
Hewson, Julie; Turner, Colin
Although Transactional Analysis (TA) has heavily influenced psychotherapy, little has been written to parallel that influence in areas of organization theory, organization behavior, or management studies. This book is intended primarily for people working in management roles. In part one, personal experiences are drawn upon to describe a fictional…
DEFF Research Database (Denmark)
Ris Hansen, Inge; Søgaard, Karen; Gram, Bibi;
2015-01-01
This is the analysis plan for the multicentre randomised controlled study looking at the effect of training and exercises in chronic neck pain patients that is being conducted in Jutland and Funen, Denmark. This plan will be used as a work description for the analyses of the data collected.
Shah, Anwar
2005-01-01
This book provides tools of analysis for discovering equity in tax burdens as well as in public spending and judging government performance in its role in safeguarding the interests of the poor and those otherwise disadvantaged members of society, such as women, children, and minorities. The book further provides a framework for a rights-based approach to citizen empowerment-in other words, ...
Idris, Ivan
2014-01-01
This book is for programmers, scientists, and engineers who have knowledge of the Python language and know the basics of data science. It is for those who wish to learn different data analysis methods using Python and its libraries. This book contains all the basic ingredients you need to become an expert data analyst.
Douglas, David M.
2016-01-01
Doxing is the intentional public release onto the Internet of personal information about an individual by a third party, often with the intent to humiliate, threaten, intimidate, or punish the identified individual. In this paper I present a conceptual analysis of the practice of doxing and how it d
Stekelenburg, G.J. van; Koorevaar, G.
1971-01-01
Additional analytical information is given about the method of head space analysis. From the data presented it can be concluded that this technique may be advantageous for enzyme kinetic studies in turbid solutions, provided a volatile organic substance is involved in the chemical reaction. Also som
Energy Technology Data Exchange (ETDEWEB)
NONE
1997-10-01
The improvement of safety in nuclear power stations is an important proposition. Safety evaluation must therefore be executed comprehensively and systematically, referring to operational experience and to new knowledge important for safety throughout the period of use, as well as before construction and the start of operation of nuclear power stations. This report describes the results of a safety analysis for "Fugen" carried out with reference to the newest technical knowledge. As a result, it was confirmed that the safety of "Fugen" is secured by its inherent safety and by the facilities designed for securing safety. The basic approach to the safety analysis, including the guidelines to be conformed to, is presented. As to abnormal transient changes in operation and accidents, their definitions, the events to be evaluated and the standards for judgement are reported. The matters taken into consideration in the analysis are shown. The computation programs used for the analysis were REACT, HEATUP, LAYMON, FATRAC, SENHOR, LOTRAC, FLOOD and CONPOL. The analyses of abnormal transient changes in operation and of accidents are reported with respect to causes, countermeasures, protective functions and results. (K.I.)
Timmerman, M.E.
2006-01-01
A general framework for the exploratory component analysis of multilevel data (MLCA) is proposed. In this framework, a separate component model is specified for each group of objects at a certain level. The similarities between the groups of objects at a given level can be expressed by imposing cons
Proteoglycan isolation and analysis
DEFF Research Database (Denmark)
Woods, A; Couchman, J R
2001-01-01
Proteoglycans can be difficult molecules to isolate and analyze due to their large mass, charge, and tendency to aggregate or form macromolecular complexes. This unit describes detailed methods for purification of matrix, cell-surface, and cytoskeleton-linked proteoglycans. Methods for analysis...
Energy Technology Data Exchange (ETDEWEB)
Malik, S. [Nebraska U.; Shipsey, I. [Purdue U.; Cavanaugh, R. [Illinois U., Chicago; Bloom, K. [Nebraska U.; Chan, Kai-Feng [Taiwan, Natl. Taiwan U.; D'Hondt, J. [Vrije U., Brussels; Klima, B. [Fermilab; Narain, M. [Brown U.; Palla, F. [INFN, Pisa; Rolandi, G. [CERN; Schörner-Sadenius, T. [DESY
2014-01-01
To impart hands-on training in physics analysis, the CMS experiment initiated the concept of the CMS Data Analysis School (CMSDAS). It was born over three years ago at the LPC (LHC Physics Centre) at Fermilab and is based on earlier workshops held at the LPC and the CLEO experiment. As CMS transitioned from construction to data-taking mode, the nature of the training also evolved to include more analysis tools, software tutorials and physics analysis. This effort, epitomized as CMSDAS, has proven key for new and young physicists to jump-start their contributions to the physics goals of CMS by looking for new physics with the collision data. With over 400 physicists trained in six CMSDAS schools around the globe, CMS is engaging the collaboration in its discovery potential and maximizing physics output. As a larger goal, CMS is striving to nurture and increase engagement of its myriad talents in the development of physics, service, upgrades, and the education of those new to CMS, and in the career development of younger members. An extension of the concept to dedicated software and hardware schools is also planned, keeping in mind the ensuing upgrade phase.
Critical Analysis of Multimodal Discourse
DEFF Research Database (Denmark)
van Leeuwen, Theo
2013-01-01
This is an encyclopaedia article which defines the fields of critical discourse analysis and multimodality studies, argues that within critical discourse analysis more attention should be paid to multimodality, and within multimodality to critical analysis, and ends by reviewing a few examples of recent work in the critical analysis of multimodal discourse.
Nielsen, S. Suzanne
Investigations in food science and technology, whether by the food industry, governmental agencies, or universities, often require determination of food composition and characteristics. Trends and demands of consumers, the food industry, and national and international regulations challenge food scientists as they work to monitor food composition and to ensure the quality and safety of the food supply. All food products require analysis as part of a quality management program throughout the development process (including raw ingredients), through production, and after a product is in the market. In addition, analysis is done of problem samples and competitor products. The characteristics of foods (i.e., chemical composition, physical properties, sensory properties) are used to answer specific questions for regulatory purposes and typical quality control. The nature of the sample and the specific reason for the analysis commonly dictate the choice of analytical methods. Speed, precision, accuracy, and ruggedness often are key factors in this choice. Validation of the method for the specific food matrix being analyzed is necessary to ensure usefulness of the method. Making an appropriate choice of the analytical technique for a specific application requires a good knowledge of the various techniques (Fig. 1.1). For example, your choice of method to determine the salt content of potato chips would be different if it is for nutrition labeling than for quality control. The success of any analytical method relies on the proper selection and preparation of the food sample, carefully performing the analysis, and doing the appropriate calculations and interpretation of the data. Methods of analysis developed and endorsed by several nonprofit scientific organizations allow for standardized comparisons of results between different laboratories and for evaluation of less standard procedures. Such official methods are critical in the analysis of foods, to ensure that they meet
Systems analysis - independent analysis and verification
Energy Technology Data Exchange (ETDEWEB)
DiPietro, J.P.; Skolnik, E.G.; Badin, J.S. [Energetics, Inc., Columbia, MD (United States)
1996-10-01
The Hydrogen Program of the U.S. Department of Energy (DOE) funds a portfolio of activities ranging from conceptual research to pilot plant testing. The long-term research projects support DOE's goal of a sustainable, domestically based energy system, and the development activities are focused on hydrogen-based energy systems that can be commercially viable in the near term. Energetics develops analytic products that enable the Hydrogen Program Manager to assess the potential for near- and long-term R&D activities to satisfy DOE and energy market criteria. This work is based on a pathway analysis methodology. The authors consider an energy component (e.g., hydrogen production from biomass gasification, hybrid hydrogen internal combustion engine (ICE) vehicle) within a complete energy system. The work involves close interaction with the principal investigators to ensure accurate representation of the component technology. Comparisons are made with the current cost and performance of fossil-based and alternative renewable energy systems, and sensitivity analyses are conducted to determine the effect of changes in cost and performance parameters on the projects' viability.
Complementing Gender Analysis Methods.
Kumar, Anant
2016-01-01
The existing gender analysis frameworks start with a premise that men and women are equal and should be treated equally. These frameworks give emphasis on equal distribution of resources between men and women and believe that this will bring equality which is not always true. Despite equal distribution of resources, women tend to suffer and experience discrimination in many areas of their lives such as the power to control resources within social relationships, and the need for emotional security and reproductive rights within interpersonal relationships. These frameworks believe that patriarchy as an institution plays an important role in women's oppression, exploitation, and it is a barrier in their empowerment and rights. Thus, some think that by ensuring equal distribution of resources and empowering women economically, institutions like patriarchy can be challenged. These frameworks are based on proposed equality principle which puts men and women in competing roles. Thus, the real equality will never be achieved. Contrary to the existing gender analysis frameworks, the Complementing Gender Analysis framework proposed by the author provides a new approach toward gender analysis which not only recognizes the role of economic empowerment and equal distribution of resources but suggests to incorporate the concept and role of social capital, equity, and doing gender in gender analysis which is based on perceived equity principle, putting men and women in complementing roles that may lead to equality. In this article the author reviews the mainstream gender theories in development from the viewpoint of the complementary roles of gender. This alternative view is argued based on existing literature and an anecdote of observations made by the author. While criticizing the equality theory, the author offers equity theory in resolving the gender conflict by using the concept of social and psychological capital. PMID:25941756
Ceramic tubesheet design analysis
Energy Technology Data Exchange (ETDEWEB)
Mallett, R.H.; Swindeman, R.W.
1996-06-01
A transport combustor is being commissioned at the Southern Services facility in Wilsonville, Alabama to provide a gaseous product for the assessment of hot-gas filtering systems. One of the barrier filters incorporates a ceramic tubesheet to support candle filters. The ceramic tubesheet, designed and manufactured by Industrial Filter and Pump Manufacturing Company (IF&PM), is unique and offers distinct advantages over metallic systems in terms of density, resistance to corrosion, and resistance to creep at operating temperatures above 815°C (1500°F). Nevertheless, the operational requirements of the ceramic tubesheet are severe. The tubesheet is almost 1.5 m (55 in.) in diameter, has many penetrations, and must support the weight of the ceramic filters, coal-ash accumulation, and a pressure drop of one atmosphere. Further, thermal stresses related to steady-state and transient conditions will occur. To gain a better understanding of the structural performance limitations, a contract was placed with Mallett Technology, Inc. to perform a thermal and structural analysis of the tubesheet design. The design analysis specification and a preliminary design analysis were completed in the early part of 1995. The analyses indicated that modifications to the design were necessary to reduce thermal stress, and it was necessary to complete the redesign before the final thermal/mechanical analysis could be undertaken. The preliminary analysis identified the need to confirm that the physical and mechanical properties data used in the design were representative of the material in the tubesheet. Subsequently, a few exploratory tests were performed at ORNL to evaluate the ceramic structural material.
Żarnecki, Aleksander F.; Piotrowski, Lech W.; Mankiewicz, Lech; Małek, Sebastian
2012-05-01
GLORIA stands for "GLObal Robotic-telescopes Intelligent Array". GLORIA will be the first free and open-access network of robotic telescopes in the world. It will be a Web 2.0 environment where users can do research in astronomy by observing with robotic telescopes and/or by analyzing data that other users have acquired with GLORIA, or from other free-access databases, like the European Virtual Observatory. The GLORIA project will define free standards, protocols and methodology for controlling robotic telescopes and related instrumentation, for conducting so-called on-line experiments by scheduling observations in the telescope network, and for conducting so-called off-line experiments based on the analysis of astronomical metadata produced by GLORIA or other databases. The Luiza analysis framework for GLORIA is based on the Marlin package developed for data analysis at the International Linear Collider (ILC). HEP experiments have to deal with enormous amounts of data, and distributed data analysis is a must, so the Marlin framework concept seemed well suited to GLORIA's needs. The idea (and large parts of the code) taken from Marlin is that every computing task is implemented as a processor (module) that analyzes the data stored in an internal data structure; any additional output created is added to that collection. The advantage of such a modular approach is to keep things as simple as possible. Every single step of the full analysis chain, going e.g. from raw images to light curves, can be processed separately, and the output of each step is still self-consistent and can be fed into the next step without any manipulation.
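The processor-chain idea borrowed from Marlin can be sketched as follows; class names, dictionary keys and the toy calibration step are illustrative, not GLORIA's or Marlin's actual API.

```python
# Schematic of a Marlin-style processor chain: each module reads from a shared
# event collection and appends its own output, so steps stay independent and
# the output of each step can feed the next without manipulation.

class Processor:
    def process(self, event: dict) -> None:
        raise NotImplementedError

class DarkSubtract(Processor):
    """Toy first step: subtract a dark level from raw pixel values."""
    def process(self, event):
        event["calibrated"] = [p - event["dark_level"] for p in event["raw_pixels"]]

class Photometry(Processor):
    """Toy second step: sum calibrated pixels into a flux estimate."""
    def process(self, event):
        event["flux"] = sum(event["calibrated"])

chain = [DarkSubtract(), Photometry()]
event = {"raw_pixels": [10, 12, 11], "dark_level": 10}
for proc in chain:
    proc.process(event)   # each step adds its result to the same collection
print(event["flux"])      # → 3
```

Because every processor only reads and appends to the shared collection, individual steps (raw images, calibration, light curves) can be developed, tested and reordered separately, which is the advantage the abstract describes.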
Strategic analysis of Czech Airlines
Moiseeva, Polina
2016-01-01
The thesis, called Strategic Analysis of Czech Airlines, completely analyses the current situation within the company. It presents the theoretical base for such an analysis and subsequently offers a situational analysis, which includes analysis of the external environment and the internal environment and suggestions for improvement. The thesis includes a complete SWOT analysis of the company and applies Porter's five forces framework. The thesis also includes recommendations and suggestions for th...
Analysis of Muji's Business Strategy
Institute of Scientific and Technical Information of China (English)
范晶
2011-01-01
This article is a report of an analysis of Muji's business strategy. First, the vision and mission are introduced. Second, the current strategy is identified. Then the industry analysis, industry driving forces, key success factors, value-chain analysis, competitive advantage and the competitive power of that advantage are analyzed. Finally, on the basis of the foregoing, a SWOT analysis is worked out.
Trends in BWR transient analysis
International Nuclear Information System (INIS)
While boiling water reactor (BWR) analysis methods for transient and loss-of-coolant accident analysis are well established, refinements and improvements continue to be made. This evolution of BWR analysis methods is driven by new applications. This paper discusses some examples of these trends, specifically time-domain stability analysis and analysis of the simplified BWR (SBWR), General Electric's design approach involving a shift from active to passive safety systems and the elimination/simplification of systems for improved operation and maintenance.
Exploratory data analysis with Matlab
Martinez, Wendy L; Solka, Jeffrey
2010-01-01
Since the publication of the bestselling first edition, many advances have been made in exploratory data analysis (EDA). Covering innovative approaches for dimensionality reduction, clustering, and visualization, Exploratory Data Analysis with MATLAB®, Second Edition uses numerous examples and applications to show how the methods are used in practice. New to the second edition: discussions of nonnegative matrix factorization, linear discriminant analysis, curvilinear component analysis, independent component analysis, and smoothing splines; an expanded set of methods for estimating the intrinsic di...
International Nuclear Information System (INIS)
As a consequence of various IAEA programmes to sample airborne particulate matter and determine its elemental composition, the participating research groups are accumulating data on the composition of the atmospheric aerosol. It is necessary to consider ways in which these data can be utilized in order to be certain that the data obtained are correct and that the information then being transmitted to others who may make decisions based on such information is as representative and correct as possible. In order to both examine the validity of those data and extract appropriate information from them, it is necessary to utilize a variety of data analysis methods. The objective of this workbook is to provide a guide with examples of utilizing data analysis on airborne particle composition data using a spreadsheet program (EXCEL) and a personal computer based statistical package (StatGraphics)
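One routine check on such airborne-particle composition data is the crustal enrichment factor, EF = (X/Al)_aerosol / (X/Al)_crust, which flags elements whose concentrations cannot be explained by soil dust. The sketch below uses invented concentrations and rough illustrative reference abundances, and Python rather than the EXCEL/StatGraphics tools the workbook itself uses.

```python
# Crustal enrichment factor for aerosol composition data. EF near 1 suggests a
# crustal (soil) origin; EF well above ~10 suggests an anthropogenic source.
# CRUST values are rough illustrative numbers, not authoritative abundances.

CRUST = {"Al": 8.0, "Fe": 5.0, "Pb": 0.002}   # reference composition, % by mass

def enrichment_factor(sample: dict, element: str, ref: str = "Al") -> float:
    return (sample[element] / sample[ref]) / (CRUST[element] / CRUST[ref])

aerosol = {"Al": 1.0, "Fe": 0.7, "Pb": 0.05}  # measured concentrations (invented)
print(round(enrichment_factor(aerosol, "Fe"), 2))  # → 1.12 (crustal)
print(round(enrichment_factor(aerosol, "Pb"), 1))  # → 200.0 (enriched)
```

Computations of exactly this kind are what a spreadsheet or statistical package would be used for when validating the elemental data described above.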
Tohyama, Mikio
2015-01-01
What is this sound? What does that sound indicate? These are two questions frequently heard in daily conversation. Sound results from the vibrations of elastic media and in daily life provides informative signals of events happening in the surrounding environment. In interpreting auditory sensations, the human ear seems particularly good at extracting the signal signatures from sound waves. Although exploring auditory processing schemes may be beyond our capabilities, source signature analysis is a very attractive area in which signal-processing schemes can be developed using mathematical expressions. This book is inspired by such processing schemes and is oriented to signature analysis of waveforms. Most of the examples in the book are taken from data of sound and vibrations; however, the methods and theories are mostly formulated using mathematical expressions rather than by acoustical interpretation. This book might therefore be attractive and informative for scientists, engineers, researchers, and graduat...
DEFF Research Database (Denmark)
Lartillot, Olivier
2016-01-01
Motivic analysis provides very detailed understanding of musical compositions, but is also particularly difficult to formalize and systematize. A computational automation of the discovery of motivic patterns cannot be reduced to a mere extraction of all possible sequences of descriptions. The systematic approach inexorably leads to a proliferation of redundant structures that needs to be addressed properly. Global filtering techniques cause a drastic elimination of interesting structures that damages the quality of the analysis. On the other hand, a selection of closed patterns allows for lossless compression. The structural complexity resulting from successive repetitions of patterns can be controlled through a simple modelling of cycles. Generally, motivic patterns cannot always be defined solely as sequences of descriptions in a fixed set of dimensions: throughout the descriptions...
Intracochlear microprobe analysis
Energy Technology Data Exchange (ETDEWEB)
Bone, R.C.; Ryan, A.F.
1982-04-01
Energy dispersive x-ray analysis (EDXA) or microprobe analysis provides cochlear physiologists with a means of accurately assessing relative ionic concentrations in selected portions of the auditory mechanism. Rapid freezing followed by lyophilization allows the recovery of fluid samples in crystalline form not only from perilymphatic and endolymphatic spaces, but also from much smaller subregions of the cochlea. Because samples are examined in a solid state, there is no risk of diffusion into surrounding or juxtaposed fluids. Samples of cochlear tissues may also be evaluated without the danger of intercellular ionic diffusion. During direct visualization by scanning electron microscopy, determination of the biochemical makeup of the material being examined can be made simultaneously, assuring the source of the data collected. Other potential advantages and disadvantages of EDXA are reviewed. Initial findings as they relate to endolymph, perilymph, stria vascularis, and the undersurface of the tectorial membrane are presented.
CERN. Geneva; Fitch, Blake
2011-01-01
Traditionally, the primary role of supercomputers was to create data, primarily for simulation applications. Due to usage and technology trends, supercomputers are increasingly also used for data analysis. Some of these data come from simulations, but there is also a rapidly increasing amount of real-world science and business data to be analyzed. We briefly overview Blue Gene and other current supercomputer architectures. We outline future architectures, up to the exascale supercomputers expected in the 2020 time frame. We focus on the data analysis challenges and opportunities, especially those concerning Flash and other up-and-coming storage-class memory. About the speakers: Blake G. Fitch has been with IBM Research, Yorktown Heights, NY since 1987, mainly pursuing interests in parallel systems. He joined the Scalable Parallel Systems Group in 1990, contributing to research and development that culminated in the IBM scalable parallel system (SP*) product. His research interests have focused on applicatio...
Layered Composite Analysis Capability
Narayanaswami, R.; Cole, J. G.
1985-01-01
Laminated composite material construction is gaining popularity within industry as an attractive alternative to metallic designs where high strength at reduced weights is of prime consideration. This has necessitated the development of an effective analysis capability for the static, dynamic and buckling analyses of structural components constructed of layered composites. Theoretical and user aspects of layered composite analysis and its incorporation into CSA/NASTRAN are discussed. The availability of stress and strain based failure criteria is described which aids the user in reviewing the voluminous output normally produced in such analyses. Simple strategies to obtain minimum weight designs of composite structures are discussed. Several example problems are presented to demonstrate the accuracy and user convenient features of the capability.
Elemental analysis in biotechnology.
Hann, Stephan; Dernovics, Mihaly; Koellensperger, Gunda
2015-02-01
This article focuses on analytical strategies integrating atomic spectroscopy in biotechnology. The rationale behind developing such methods is inherently linked to unique features of the key technique in elemental analysis, which is inductively coupled plasma mass spectrometry: (1) the high sensitivity and selectivity of state of the art instrumentation, (2) the possibility of accurate absolute quantification even in complex matrices, (3) the capability of combining elemental detectors with chromatographic separation methods and the versatility of the latter approach, (4) the complementarity of inorganic and organic mass spectrometry, (5) the multi-element capability and finally (6) the capability of isotopic analysis. The article highlights the most recent bio-analytical developments exploiting these methodological advantages and shows the potential in biotechnological applications.
Wyer, J C; Salzinger, F H
1983-01-01
Many common management techniques have little use in managing a medical group practice. Ratio analysis, however, can easily be adapted to the group practice setting. Acting as broad-gauge indicators, financial ratios provide an early warning of potential problems and can be very useful in planning for future operations. The author has gathered a collection of financial ratios which were developed by participants at an education seminar presented for the Virginia Medical Group Management Association. Classified according to the human element, system component, and financial factor, the ratios provide a good sampling of measurements relevant to medical group practices and can serve as an example for custom-tailoring a ratio analysis system for your medical group.
In Silico Expression Analysis.
Bolívar, Julio; Hehl, Reinhard; Bülow, Lorenz
2016-01-01
Information on the specificity of cis-sequences enables the design of functional synthetic plant promoters that are responsive to specific stresses. Potential cis-sequences may be experimentally tested; however, correlation of genomic sequence with gene expression data enables an in silico expression analysis approach to bioinformatically assess the stress specificity of candidate cis-sequences prior to experimental verification. The present chapter demonstrates an example for the in silico validation of a potential cis-regulatory sequence responsive to cold stress. The described online tool can be applied for the bioinformatic assessment of cis-sequences responsive to most abiotic and biotic stresses of plants. Furthermore, a method is presented based on a reverted in silico expression analysis approach that predicts highly specific potentially functional cis-regulatory elements for a given stress.
DEFF Research Database (Denmark)
Feng, Ling
2008-01-01
This dissertation concerns the investigation of the consistency of statistical regularities in a signaling ecology and human cognition, while inferring appropriate actions for a speech-based perceptual task. It is based on unsupervised Independent Component Analysis providing a rich spectrum of audio contexts along with pattern recognition methods to map components to known contexts. It also involves looking for the right representations for auditory inputs, i.e. the data analytic processing pipelines invoked by human brains. The main ideas refer to Cognitive Component Analysis, defined as the process of unsupervised grouping of generic data such that the ensuing group structure is well-aligned with that resulting from human cognitive activity. Its hypothesis runs ecologically: features which are essentially independent in a context-defined ensemble can be efficiently coded as sparse...
DEFF Research Database (Denmark)
Olivarius, Signe
While increasing evidence appoints diverse types of RNA as key players in the regulatory networks underlying cellular differentiation and metabolism, the potential functions of thousands of conserved RNA structures encoded in mammalian genomes remain to be determined. Since the functions of most RNAs rely on interactions with proteins, the establishment of protein-binding profiles is essential for the characterization of RNAs. Aiming to facilitate RNA analysis, this thesis introduces proteomics- as well as transcriptomics-based methods for the functional characterization of RNA. First, RNA-protein pulldown combined with mass spectrometry analysis is applied for in vivo as well as in vitro identification of RNA-binding proteins, the latter succeeding in verifying known RNA-protein interactions. Secondly, acknowledging the significance of flexible promoter usage for the diversification...
Visualization analysis and design
Munzner, Tamara
2015-01-01
Visualization Analysis and Design provides a systematic, comprehensive framework for thinking about visualization in terms of principles and design choices. The book features a unified approach encompassing information visualization techniques for abstract data, scientific visualization techniques for spatial data, and visual analytics techniques for interweaving data transformation and analysis with interactive visual exploration. It emphasizes the careful validation of effectiveness and the consideration of function before form. The book breaks down visualization design according to three questions: what data users need to see, why users need to carry out their tasks, and how the visual representations proposed can be constructed and manipulated. It walks readers through the use of space and color to visually encode data in a view, the trade-offs between changing a single view and using multiple linked views, and the ways to reduce the amount of data shown in each view. The book concludes with six case stu...
Handbook of radioactivity analysis
2012-01-01
The updated and much expanded Third Edition of the "Handbook of Radioactivity Analysis" is an authoritative reference providing the principles, practical techniques, and procedures for the accurate measurement of radioactivity from the very low levels encountered in the environment to higher levels measured in radioisotope research, clinical laboratories, biological sciences, radionuclide standardization, nuclear medicine, nuclear power, fuel cycle facilities and in the implementation of nuclear forensic analysis and nuclear safeguards. The Third Edition contains seven new chapters providing a reference text much broader in scope than the previous Second Edition, and all of the other chapters have been updated and expanded, many with new authors. The book describes the basic principles of radiation detection and measurement, the preparation of samples from a wide variety of matrices, assists the investigator or technician in the selection and use of appropriate radiation detectors, and presents state-of-the-ar...
Generative pulsar timing analysis
Lentati, L; Hobson, M P
2014-01-01
A new Bayesian method for the analysis of folded pulsar timing data is presented that allows for the simultaneous evaluation of evolution in the pulse profile in either frequency or time, along with the timing model and additional stochastic processes such as red spin noise, or dispersion measure variations. We model the pulse profiles using `shapelets' - a complete ortho-normal set of basis functions that allow us to recreate any physical profile shape. Any evolution in the profiles can then be described as either an arbitrary number of independent profiles, or using some functional form. We perform simulations to compare this approach with established methods for pulsar timing analysis, and to demonstrate model selection between different evolutionary scenarios using the Bayesian evidence. The simplicity of our method allows for many possible extensions, such as including models for correlated noise in the pulse profile, or broadening of the pulse profiles due to scattering. As such, while it is a marked...
Goorhuis-Brouwer, S M
1989-01-01
There are children who are assumed to have normal hearing and normal intelligence, but who do not speak at all. The phoniatric diagnosis for these children used to be audimutitas. Nowadays these children are not specified any more. They are called language-disturbed and are diagnosed and treated like all other language-disturbed children. An analysis is made of 46 nonspeaking children, aged 1.5-3, who are assumed to understand language in a proper way. The analysis deals with their language (language comprehension, language production and pragmatics) as well as with medical and psychological factors influencing the language disorder. We found no correlations between language comprehension, language production and pragmatics. So, when we know one language aspect, we cannot be sure of the others. We also found various factors contributing to the language problem.
Kass, Robert E; Brown, Emery N
2014-01-01
Continual improvements in data collection and processing have had a huge impact on brain research, producing data sets that are often large and complicated. By emphasizing a few fundamental principles, and a handful of ubiquitous techniques, Analysis of Neural Data provides a unified treatment of analytical methods that have become essential for contemporary researchers. Throughout the book ideas are illustrated with more than 100 examples drawn from the literature, ranging from electrophysiology, to neuroimaging, to behavior. By demonstrating the commonality among various statistical approaches the authors provide the crucial tools for gaining knowledge from diverse types of data. Aimed at experimentalists with only high-school level mathematics, as well as computationally-oriented neuroscientists who have limited familiarity with statistics, Analysis of Neural Data serves as both a self-contained introduction and a reference work.
Energy Technology Data Exchange (ETDEWEB)
Kanjilal, S.K.; Lindquist, M.R.; Ulbricht, L.E.
1994-02-01
Jumper connectors are used for remotely connecting pipe lines containing transfer fluids ranging from hazardous chemicals to other nonhazardous liquids. The jumper connector assembly comprises hooks, hookpins, a block, a nozzle, an operating screw, and a nut. The hooks are tightened against the nozzle flanges by the operating screw, which is tightened with a remotely connected torque wrench. Stress analysis for the jumper connector assembly (used extensively on the US Department of Energy's Hanford Site, near Richland, Washington) is performed by using hand calculation and finite-element techniques to determine the stress levels resulting from operating and seismic loads on components of the assembly. The analysis addresses loading conditions such as prestress, seismic, operating, thermal, and leakage. The preload torque-generated forces at which each component reaches its stress limits are presented in a tabulated format. Allowable operating loads for the jumper assembly are provided to prevent leakage of the assembly during operating cycles.
Pavlovic, Dusko
2012-01-01
Formal Concept Analysis (FCA) begins from a context, given as a binary relation between some objects and some attributes, and derives a lattice of concepts, where each concept is given as a set of objects and a set of attributes, such that the first set consists of all objects that satisfy all attributes in the second, and vice versa. Many applications, though, provide contexts with quantitative information, telling not just whether an object satisfies an attribute, but also quantifying this satisfaction. Contexts in this form arise as rating matrices in recommender systems, as occurrence matrices in text analysis, as pixel intensity matrices in digital image processing, etc. Such applications have attracted a lot of attention, and several numeric extensions of FCA have been proposed. We propose the framework of proximity sets (proxets), which subsume partially ordered sets (posets) as well as metric spaces. One feature of this approach is that it extracts from quantified contexts quantified concepts, and thu...
Bayesian Exploratory Factor Analysis
DEFF Research Database (Denmark)
Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.;
2014-01-01
This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates...
DEFF Research Database (Denmark)
Esteban, Marta; Schindler, Birgit K; Jiménez-Guerrero, José A;
2015-01-01
Human biomonitoring (HBM) is an effective tool for assessing actual exposure to chemicals that takes into account all routes of intake. Although hair analysis is considered to be an optimal biomarker for assessing mercury exposure, the lack of harmonization as regards sampling and analytical ... assurance program (QAP) for assessing mercury levels in hair samples from more than 1800 mother-child pairs recruited in 17 European countries. To ensure the comparability of the results, standard operating procedures (SOPs) for sampling and for mercury analysis were drafted and distributed to participating laboratories. Training sessions were organized for field workers and four external quality-assessment exercises (ICI/EQUAS), followed by the corresponding web conferences, were organized between March 2011 and February 2012. ICI/EQUAS used native hair samples at two mercury concentration ranges (0...
Communication Analysis modelling techniques
España, Sergio; Pastor, Óscar; Ruiz, Marcela
2012-01-01
This report describes and illustrates several modelling techniques proposed by Communication Analysis; namely Communicative Event Diagram, Message Structures and Event Specification Templates. The Communicative Event Diagram is a business process modelling technique that adopts a communicational perspective by focusing on communicative interactions when describing the organizational work practice, instead of focusing on physical activities1; at this abstraction level, we refer to business activities as communicative events. Message Structures is a technique based on structured text that allows specifying the messages associated to communicative events. Event Specification Templates are a means to organise the requirements concerning a communicative event. This report can be useful to analysts and business process modellers in general, since, according to our industrial experience, it is possible to apply many Communication Analysis concepts, guidelines and criteria to other business process modelling notation...
Energy Technology Data Exchange (ETDEWEB)
Hazen, Damian [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States); Hick, Jason [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States)
2012-06-12
We provide analysis of Oracle StorageTek T10000 Generation B (T10KB) Media Information Record (MIR) Performance Data gathered over the course of a year from our production High Performance Storage System (HPSS). The analysis shows information in the MIR may be used to improve tape subsystem operations. Most notably, we found the MIR information to be helpful in determining whether the drive or tape was most suspect given a read or write error, and for helping identify which tapes should not be reused given their history of read or write errors. We also explored using the MIR Assisted Search to order file retrieval requests. We found that MIR Assisted Search may be used to reduce the time needed to retrieve collections of files from a tape volume.
Rangayyan, Rangaraj M
2015-01-01
The book will help assist a reader in the development of techniques for analysis of biomedical signals and computer aided diagnoses with a pedagogical examination of basic and advanced topics accompanied by over 350 figures and illustrations. Wide range of filtering techniques presented to address various applications. 800 mathematical expressions and equations. Practical questions, problems and laboratory exercises. Includes fractals and chaos theory with biomedical applications.
Energy Technology Data Exchange (ETDEWEB)
Salvesen, F.; Sandgren, J. [KanEnergi AS, Rud (Norway)
1997-12-31
The present energy situation in the target area is summarized: 20 million inhabitants without electricity in northwest Russia, 50% of the people in the Baltics without electricity, very high technical skills, and finance as the biggest problem. The energy situation, the advantages of the renewables, the restrictions, and examples of possible technical solutions are reviewed on the basis of short analyses and experience with the Baltics and Russia.
International Nuclear Information System (INIS)
This Analysis/Model Report (AMR) documents an analysis that was performed to estimate climatic variables for the next 10,000 years by forecasting the timing and nature of climate change at Yucca Mountain (YM), Nevada (Figure 1), the site of a potential repository for high-level radioactive waste. The future-climate estimates are based on an analysis of past-climate data from analog meteorological stations, and this AMR provides the rationale for the selection of these analog stations. The stations selected provide an upper and a lower climate bound for each future climate, and the data from those sites will provide input to the infiltration model (USGS 2000) and for the total system performance assessment for the Site Recommendation (TSPA-SR) at YM. Forecasting long-term future climates, especially for the next 10,000 years, is highly speculative and rarely attempted. A very limited literature exists concerning the subject, largely from the British radioactive waste disposal effort. The discussion presented here is one method, among many, of establishing upper and lower bounds for future climate estimates. The method used here involves selecting a particular past climate from many past climates, as an analog for future climate. Other studies might develop a different rationale or select other past climates resulting in a different future climate analog. Revision 00 of this AMR was prepared in accordance with the "Work Direction and Planning Document for Future Climate Analysis" (Peterman 1999) under Interagency Agreement DE-AI08-97NV12033 with the U.S. Department of Energy (DOE). The planning document for the technical scope, content, and management of ICN 01 of this AMR is the "Technical Work Plan for Unsaturated Zone (UZ) Flow and Transport Process Model Report" (BSC 2001a). The scope for the TBV resolution actions in this ICN is described in the "Technical Work Plan for: Integrated Management of Technical Product Input Department" (BSC 2001b, Addendum B
Šemberová, Veronika
2014-01-01
Milk and milk products are important sources of protein, vitamins and minerals that are hard to substitute in the human nutrition. In the last two decades agriculture underwent several changes and the size of the cattle herd decreased. The share of imports in the consumption of milk and milk products increased and simultaneously the export of raw milk grew. Self-sustainability in milk production thus decreased from 118% to 103% between 2004 and 2009. The main aim of this thesis called Analysis of the sector ...
Vončina, Bojan
2016-01-01
The purpose of the thesis was to analyse the acceptance of Scrum methodology, which has become one of the leading agile methodologies, and to find out which were the key factors that influenced the acceptance. The analysis was conducted in Comtrade, which is one of the largest Slovenian software development companies. The first part (theoretical part) contains an introduction chapter, a detailed presentation of Scrum methodology and the presentation of theoretical models, on which practical ...
Institute of Scientific and Technical Information of China (English)
Yang, Kyu-Hwan
2001-01-01
Risk analysis is a useful tool for making good decisions on the risks of certain potentially hazardous agents and suggests a safe margin through scientific processes using toxicological data, contaminant residue levels, statistical tools, exposure values and relevant variants. Risk managers consider scientific evidence and risk estimates, along with statutory, engineering, economic, social, and political factors, in evaluating alternative regulatory options and choosing among those options (NRC, 1983).
Rocheta, Margarida; Dionísio, F Miguel; Fonseca, Luís; Pires, Ana M
2007-12-01
Paternity analysis using microsatellite information is a well-studied subject. These markers are ideal for parentage studies and fingerprinting, due to their high-discrimination power. This type of data is used to assign paternity, to compute the average selfing and outcrossing rates and to estimate the biparental inbreeding. There are several public domain programs that compute all this information from data. Most of the time, it is necessary to export data to some sort of format, feed it to the program and import the output to an Excel book for further processing. In this article we briefly describe a program, referred to from now on as Paternity Analysis in Excel (PAE), developed at IST and IBET (see the acknowledgments), that computes paternity candidates from data, and other information, from within Excel. In practice this means that the end user provides the data in an Excel sheet and, by pressing an appropriate button, obtains the results in another Excel sheet. For convenience PAE is divided into two modules. The first one is a filtering module that selects data from the sequencer and reorganizes it in a format appropriate to process paternity analysis, assuming certain conventions for the names of parents and offspring from the sequencer. The second module carries out the paternity analysis assuming that one parent is known. Both modules are written in Excel-VBA and can be obtained at the address (www.math.ist.utl.pt/~fmd/pa/pa.zip). They are free for non-commercial purposes and have been tested with different data and against different software (Cervus, FaMoz, and MLTR). PMID:17928093
Sjogren, William L.
1987-01-01
Work on three different efforts related to gravity data analysis is discussed. The reduction of raw Doppler data from the Apollo 15 subsatellite to produce acceleration profiles as a function of latitude, longitude and altitude; an investigation related to fitting long arcs of Pioneer Venus Orbiter tracking data; and a study of gravity/topography ratios which were found to have a linear trend with longitude are discussed.
2016-01-01
This book provides an in-depth introduction and overview of current research in computational music analysis. Its seventeen chapters, written by leading researchers, collectively represent the diversity as well as the technical and philosophical sophistication of the work being done today in this intensely interdisciplinary field. A broad range of approaches are presented, employing techniques originating in disciplines such as linguistics, information theory, information retrieval, pattern r...
Kenya; Debt Sustainability Analysis
International Monetary Fund
2004-01-01
This report of the Debt Sustainability Analysis (DSA) indicates that the envisaged strategy of a partial substitution of domestic debt by increased inflows of external grants and concessional loans, as well as a rescheduling of external debt by the Paris and London Clubs, would facilitate the achievement of debt sustainability. The DSA also confirms that such a debt rescheduling could constitute an appropriate exit strategy for Kenya. The DSA also shows that debt sustainability would improve ...
Dental Forensics: Bitemark Analysis
Directory of Open Access Journals (Sweden)
Elza Ibrahim Auerkari
2013-06-01
Forensic odontology (dental forensics) can provide useful evidence in both criminal and civil cases, and therefore remains a part of the wider discipline of forensic science. As an example from the toolbox of forensic odontology, the practice and experience on bitemark analysis is reviewed here in brief. The principle of using visible bitemarks in crime victims or in other objects as evidence is fundamentally based on the observation that the detailed pattern of dental imprints tends to be practically unique for each individual. Therefore, finding such an imprint as a bitemark can bear a strong testimony that it was produced by the individual that has the matching dental pattern. However, the comparison of the observed bitemark and the suspected set of teeth will necessarily require human interpretation, and this is not infallible. Both technical challenges in the bitemarks and human errors in the interpretation are possible. To minimise such errors and to maximise the value of bitemark analysis, dedicated procedures and protocols have been developed, and the personnel taking care of the analysis need to be properly trained. In principle the action within the discipline should be conducted as in evidence-based dentistry, i.e. accepted procedures should have known error rates. Because of the involvement of human interpretation, even personal performance statistics may be required from legal expert statements. The requirements have been introduced largely due to cases where false convictions based on bitemark analysis have been overturned after DNA analysis. DOI: 10.14693/jdi.v15i2.76
International Nuclear Information System (INIS)
In this paper a variational method for the limit analysis of an ideal plastic solid is presented. This method has been denominated Modified Secondary Creep and enables the collapse loads to be found through minimization of a functional and a limit process. Given an ideal plastic material, it is shown how to determine the associated secondary creep constitutive equation. Finally, as an application, the limit load in a pressurized von Mises rigid plastic sphere is found. (Author)
Sroufe, Paul; Phithakkitnukoon, Santi; Dantu, Ram; Cangussu, João
2010-01-01
Email has become an integral part of everyday life. Without a second thought we receive bills, bank statements, and sales promotions all to our inbox. Each email has hidden features that can be extracted. In this paper, we present a new mechanism to characterize an email without using content or context called Email Shape Analysis. We explore the applications of the email shape by carrying out a case study; botnet detection and two possible applications: spam filtering, and social-context bas...
Analysis Bitcoin virtual currency
Potužník, Jan
2014-01-01
The goal of the submitted thesis “Analysis Bitcoin virtual currency” is to analyze the first decentralized digital currency called Bitcoin and become an active member of the mining process. The thesis is also supposed to analyze different decentralized digital currencies that were created based on Bitcoin and compare these currencies to Bitcoin. The practical part of this bachelor thesis is focused on the detailed description of how to become an active member of the mining process and if mini...
Whelan, Paul F.; Ghita, O.
2008-01-01
This chapter presents a novel and generic framework for image segmentation using a compound image descriptor that encompasses both colour and texture information in an adaptive fashion. The developed image segmentation method extracts the texture information using low-level image descriptors (such as the Local Binary Patterns (LBP)) and colour information by using colour space partitioning. The main advantage of this approach is the analysis of the textured images at a micro-level using the l...
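The Local Binary Patterns mentioned above are a concrete, easily computed texture descriptor. The following is a minimal pure-Python sketch of the basic 8-neighbour LBP operator and its histogram; the sample patch and function names are illustrative and not taken from the chapter, which builds a more elaborate compound colour-texture descriptor on top of such low-level operators.

```python
def lbp_code(img, r, c):
    """8-neighbour LBP code for pixel (r, c): each neighbour whose value is
    >= the centre contributes one bit, walked clockwise from the top-left."""
    centre = img[r][c]
    neighbours = [img[r-1][c-1], img[r-1][c], img[r-1][c+1],
                  img[r][c+1],   img[r+1][c+1], img[r+1][c],
                  img[r+1][c-1], img[r][c-1]]
    code = 0
    for bit, value in enumerate(neighbours):
        if value >= centre:
            code |= 1 << bit
    return code

def lbp_histogram(img):
    """Histogram of LBP codes over all interior pixels -- a simple
    texture signature for the image patch."""
    hist = [0] * 256
    for r in range(1, len(img) - 1):
        for c in range(1, len(img[0]) - 1):
            hist[lbp_code(img, r, c)] += 1
    return hist

patch = [[1, 2, 3],
         [4, 5, 6],
         [7, 8, 9]]  # toy grayscale patch with a single interior pixel
```

Segmentation methods of this kind typically compare such histograms between regions rather than using the raw codes directly.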
Introduction to abstract analysis
Goldstein, Marvin E
2015-01-01
Developed from lectures delivered at NASA's Lewis Research Center, this concise text introduces scientists and engineers with backgrounds in applied mathematics to the concepts of abstract analysis. Rather than preparing readers for research in the field, this volume offers background necessary for reading the literature of pure mathematics. Starting with elementary set concepts, the treatment explores real numbers, vector and metric spaces, functions and relations, infinite collections of sets, and limits of sequences. Additional topics include continuity and function algebras, Cauchy complet
Crawler Solids Unknown Analysis
Frandsen, Athela
2016-01-01
Crawler Transporter (CT) #2 has been undergoing refurbishment to carry the Space Launch System (SLS). After the CT returned to normal operation, multiple filters of the gearbox lubrication system failed/clogged and went on bypass during a test run to the launch pad. Analysis of the filters was done in large part with polarized light microscopy (PLM) to identify the filter contaminants and their source of origin.
Contingency and behavior analysis
Lattal, Kennon A.
1995-01-01
The concept of contingency is central to theoretical discussions of learned behavior and in the application of learning research to problems of social significance. This paper reviews three aspects of the contingency concept as it has been developed by behavior analysts. The first is the empirical analysis of contingency through experimental studies of both human and nonhuman behavior. The second is the synthesis of experimental studies in theoretical and conceptual frameworks to yield a more...
Gelman, Andrew; Stern, Hal S; Dunson, David B; Vehtari, Aki; Rubin, Donald B
2013-01-01
FUNDAMENTALS OF BAYESIAN INFERENCE: Probability and Inference; Single-Parameter Models; Introduction to Multiparameter Models; Asymptotics and Connections to Non-Bayesian Approaches; Hierarchical Models. FUNDAMENTALS OF BAYESIAN DATA ANALYSIS: Model Checking; Evaluating, Comparing, and Expanding Models; Modeling Accounting for Data Collection; Decision Analysis. ADVANCED COMPUTATION: Introduction to Bayesian Computation; Basics of Markov Chain Simulation; Computationally Efficient Markov Chain Simulation; Modal and Distributional Approximations. REGRESSION MODELS: Introduction to Regression Models; Hierarchical Linear
Institute of Scientific and Technical Information of China (English)
无
2001-01-01
This review introduces the history and present status of data envelopment analysis (DEA) research, particularly the evaluation process, and describes extensions of some DEA models. It is pointed out that mathematics, economics and management science are the main forces behind the development of DEA, that optimization provides the fundamental method for DEA research, and that the wide range of applications drives the rapid development of DEA.
SAMPLING AND ANALYSIS PROTOCOLS
Energy Technology Data Exchange (ETDEWEB)
Jannik, T.; Fledderman, P.
2007-02-09
Radiological sampling and analyses are performed to collect data for a variety of specific reasons covering a wide range of projects. These activities include: Effluent monitoring; Environmental surveillance; Emergency response; Routine ambient monitoring; Background assessments; Nuclear license termination; Remediation; Deactivation and decommissioning (D&D); and Waste management. In this chapter, effluent monitoring and environmental surveillance programs at nuclear operating facilities and radiological sampling and analysis plans for remediation and D&D activities will be discussed.
Givoni, Inmar; Cheung, Vincent; Frey, Brendan J.
2012-01-01
Many tasks require finding groups of elements in a matrix of numbers, symbols or class likelihoods. One approach is to use efficient bi- or tri-linear factorization techniques including PCA, ICA, sparse matrix factorization and plaid analysis. These techniques are not appropriate when addition and multiplication of matrix elements are not sensibly defined. More directly, methods like bi-clustering can be used to classify matrix elements, but these methods make the overly-restrictive assumptio...
Sociology and Systems Analysis
Becker, H.A.
1982-01-01
The Management and Technology (MMT) Area of IIASA organizes, from time to time, seminars on topics that are of interest in connection with the work at the Institute. Since MMT sees the importance of investigating the broader management aspects when using systems analytical tools, it was of great interest to have Professor Henk Becker from the University of Utrecht give a seminar on "Sociology of Systems Analysis". As his presentation at this seminar should be of interest to a wider audie...
Anshul Sharma; Preeti Gulia
2014-01-01
Big Data is data that either is too large, grows too fast, or does not fit into traditional architectures. Within such data can be valuable information that can be discovered through data analysis [1]. Big data is a collection of complex and large data sets that are difficult to process and mine for patterns and knowledge using traditional database management tools or data processing and mining systems. Big Data is data whose scale, diversity and complexity require new architecture, technique...
Rajiv K Gupta; Thallam V Padmanabhan
2011-01-01
Initial stability at the placement and development of osseointegration are two major issues for implant survival. Implant stability is a mechanical phenomenon which is related to the local bone quality and quantity, type of implant, and placement technique used. The application of a simple, clinically applicable, non-invasive test to assess implant stability and osseointegration is considered highly desirable. Resonance frequency analysis (RFA) is one of such techniques which is most frequent...
PROFITABILITY ANALYSIS TOURIST HOSTELS
Cristiana Tindeche; Romeo Catalin Cretu
2015-01-01
This study aims at a comparative analysis of the economic efficiency of the Confort Pension, located in a rural area, and the Danacris Pension, from an urban area. The reason for choosing these two units is that the types of tourism they represent are significant areas of operation, namely leisure tourism (the "Confort" Pension) in the Suceava area and business tourism (the "Danacris" Pension) in Bucharest. Based on the existing methodology in the specialized literature we computed specific indic...
ORGANISATIONAL CULTURE ANALYSIS MODEL
Mihaela Simona Maracine
2012-01-01
The studies and research undertaken have demonstrated the importance of studying organisational culture because of its practical value and its contribution to increasing the organisation's performance. The analysis of the organisational culture's dimensions allows observing human behaviour within the organisation and highlighting reality, identifying the strengths and also the weaknesses which have an impact on its functionality and development. In this paper, we try to...
Analysis of personnel management
Novotná, Petra
2012-01-01
The topic of this Bachelor's thesis is an analysis of personnel management in the company DIAMONDS INTERNATIONAL CORPORATION - D.I.C. plc., which is engaged in the production and sale of diamond jewellery as well as the sale of investment diamonds. The thesis is divided into a theoretical part, where the basic personnel activities are characterized, and a practical part, where the theoretical knowledge is compared to the practical reality in the company; in addition, the recommendations are ...
DEFF Research Database (Denmark)
Riber-Hansen, Rikke; Vainer, Ben; Steiniche, Torben
2012-01-01
Digital image analysis (DIA) is increasingly implemented in histopathological research to facilitate truly quantitative measurements, decrease inter-observer variation and reduce hands-on time. Originally, efforts were made to enable DIA to reproduce manually obtained results on histological slides...... reproducibility, application of stereology-based quantitative measurements, time consumption, optimization of histological slides, regions of interest selection and recent developments in staining and imaging techniques....
Liu, Jianghong
2004-01-01
The concept of aggression is important to nursing because further knowledge of aggression can help generate a better theoretical model to drive more effective intervention and prevention approaches. This paper outlines a conceptual analysis of aggression. First, the different forms of aggression are reviewed, including the clinical classification and the stimulus-based classification. Then the manifestations and measurement of aggression are described. Finally, the causes and consequences of ...
International Nuclear Information System (INIS)
This Analysis/Model Report (AMR) documents an analysis that was performed to estimate climatic variables for the next 10,000 years by forecasting the timing and nature of climate change at Yucca Mountain (YM), Nevada (Figure 1), the site of a potential repository for high-level radioactive waste. The future-climate estimates are based on an analysis of past-climate data from analog meteorological stations, and this AMR provides the rationale for the selection of these analog stations. The stations selected provide an upper and a lower climate bound for each future climate, and the data from those sites will provide input to the infiltration model (USGS 2000) and for the total system performance assessment for the Site Recommendation (TSPA-SR) at YM. Forecasting long-term future climates, especially for the next 10,000 years, is highly speculative and rarely attempted. A very limited literature exists concerning the subject, largely from the British radioactive waste disposal effort. The discussion presented here is one method, among many, of establishing upper and lower bounds for future climate estimates. The method used here involves selecting a particular past climate from many past climates, as an analog for future climate. Other studies might develop a different rationale or select other past climates resulting in a different future climate analog
International Nuclear Information System (INIS)
The outputs from the drift degradation analysis support scientific analyses, models, and design calculations, including the following: (1) Abstraction of Drift Seepage; (2) Seismic Consequence Abstraction; (3) Structural Stability of a Drip Shield Under Quasi-Static Pressure; and (4) Drip Shield Structural Response to Rock Fall. This report has been developed in accordance with ''Technical Work Plan for: Regulatory Integration Modeling of Drift Degradation, Waste Package and Drip Shield Vibratory Motion and Seismic Consequences'' (BSC 2004 [DIRS 171520]). The drift degradation analysis includes the development and validation of rockfall models that approximate phenomenon associated with various components of rock mass behavior anticipated within the repository horizon. Two drift degradation rockfall models have been developed: the rockfall model for nonlithophysal rock and the rockfall model for lithophysal rock. These models reflect the two distinct types of tuffaceous rock at Yucca Mountain. The output of this modeling and analysis activity documents the expected drift deterioration for drifts constructed in accordance with the repository layout configuration (BSC 2004 [DIRS 172801])
[Mindfulness: A Concept Analysis].
Chen, Tsai-Ling; Chou, Fan-Hao; Wang, Hsiu-Hung
2016-04-01
"Mindfulness" is an emerging concept in the field of healthcare. Ranging from stress relief to psychotherapy, mindfulness has been confirmed to be an effective tool to help individuals manage depression, anxiety, obsessive-compulsive disorder, and other health problems in clinical settings. Scholars currently use various definitions for mindfulness. While some of these definitions overlap, significant differences remain and a general scholarly consensus has yet to be reached. Several domestic and international studies have explored mindfulness-related interventions and their effectiveness. However, the majority of these studies have focused on the fields of clinical medicine, consultation, and education. Mindfulness has rarely been applied in clinical nursing practice and no related systematic concept analysis has been conducted. This paper conducts a concept analysis of mindfulness using the concept analysis method proposed by Walker and Avant (2011). We describe the defining characteristics of mindfulness, clarify the concept, and confirm the predisposing factors and effects of mindfulness using examples of typical cases, borderline cases, related cases, and a contrary case. Findings may provide nursing staff with an understanding of the concept of mindfulness for use in clinical practice in order to help patients achieve a comfortable state of body and mind healing.
Regional Shelter Analysis Methodology
Energy Technology Data Exchange (ETDEWEB)
Dillon, Michael B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dennison, Deborah [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kane, Jave [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Walker, Hoyt [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Miller, Paul [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)
2015-08-01
The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
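The core bookkeeping described above, combining the fraction of people in each shelter category with that category's fallout protection, can be sketched as a population-weighted exposure calculation. The categories, fractions, and protection factors below are invented placeholder values, not numbers from the report.

```python
def weighted_exposure(outdoor_dose, population_fractions, protection_factors):
    """Mean dose per person: the unsheltered (outdoor) dose divided by each
    shelter category's protection factor, weighted by the share of the
    population in that category."""
    # Fractions must describe the whole population.
    assert abs(sum(population_fractions.values()) - 1.0) < 1e-9
    return sum(frac * outdoor_dose / protection_factors[cat]
               for cat, frac in population_fractions.items())

# Hypothetical shelter posture for one region (not from the report).
fractions = {"outdoors": 0.1, "wood_frame": 0.6, "concrete": 0.3}
pf = {"outdoors": 1.0, "wood_frame": 3.0, "concrete": 10.0}
mean_dose = weighted_exposure(100.0, fractions, pf)  # dose units arbitrary
```

Repeating this per building type, location, population posture, and time of day, as the methodology describes, amounts to evaluating this sum over a finer partition of the population.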
Energy Technology Data Exchange (ETDEWEB)
R.M. Forester
2000-03-14
This Analysis/Model Report (AMR) documents an analysis that was performed to estimate climatic variables for the next 10,000 years by forecasting the timing and nature of climate change at Yucca Mountain (YM), Nevada (Figure 1), the site of a potential repository for high-level radioactive waste. The future-climate estimates are based on an analysis of past-climate data from analog meteorological stations, and this AMR provides the rationale for the selection of these analog stations. The stations selected provide an upper and a lower climate bound for each future climate, and the data from those sites will provide input to the infiltration model (USGS 2000) and for the total system performance assessment for the Site Recommendation (TSPA-SR) at YM. Forecasting long-term future climates, especially for the next 10,000 years, is highly speculative and rarely attempted. A very limited literature exists concerning the subject, largely from the British radioactive waste disposal effort. The discussion presented here is one method, among many, of establishing upper and lower bounds for future climate estimates. The method used here involves selecting a particular past climate from many past climates, as an analog for future climate. Other studies might develop a different rationale or select other past climates resulting in a different future climate analog.
Exploration Laboratory Analysis
Krihak, M.; Ronzano, K.; Shaw, T.
2016-01-01
The Exploration Laboratory Analysis (ELA) project supports the Exploration Medical Capability (ExMC) risk to minimize or reduce the risk of adverse health outcomes and decrements in performance due to in-flight medical capabilities on human exploration missions. To mitigate this risk, the availability of in-flight laboratory analysis instrumentation has been identified as an essential capability for manned exploration missions. Since a single, compact space-ready laboratory analysis capability to perform all exploration clinical measurements is not commercially available, the ELA project objective is to demonstrate the feasibility of emerging operational and analytical capability as a biomedical diagnostics precursor to long duration manned exploration missions. The initial step towards ground and flight demonstrations in fiscal year (FY) 2015 was the down-selection of platform technologies for demonstrations in the space environment. The technologies selected included two Small Business Innovation Research (SBIR) performers: DNA Medicine Institute's rHEALTH X and Intelligent Optical Systems' lateral flow assays combined with Holomic's smartphone analyzer. The selection of these technologies was based on their compact size, breadth of analytical capability and favorable ability to process fluids in a space environment, among several factors. These two technologies will be advanced to meet ground and flight demonstration success criteria and requirements that will be finalized in FY16. Also, the down-selected performers will continue the technology development phase towards meeting prototype deliverables in either late 2016 or 2017.
International Nuclear Information System (INIS)
The radiometric method of analysis is noted for its sensitivity and its simplicity in both apparatus and procedure. A few inexpensive radioactive reagents permit the analysis of a wide variety of chemical elements and compounds. Any particular procedure is generally applicable over a very wide range of concentrations. It is potentially an analytical method of great industrial significance. Specific examples of analyses are cited to illustrate the potentialities of ordinary equipment. Apparatus specifically designed for radiometric chemistry may shorten the time required, and increase the precision and accuracy for routine analyses. A sensitive and convenient apparatus for the routine performance of radiometric chemical analysis is a special type of centrifuge, which has been used in obtaining the data presented in this paper. The radioactivity of the solution is measured while the centrifuge is spinning. This device has been used as the basis for an automatic analyser for phosphate ion, programmed to follow a sequence of unknown sampling, reagent mixing, centrifugation, counting, data presentation, and phosphate replenishment. This analyser can repeatedly measure phosphate concentration in the range of 5 to 50 ppm with an accuracy of ±5%. (author)
Enhanced target factor analysis.
Rostami, Akram; Abdollahi, Hamid; Maeder, Marcel
2016-03-10
Target testing or target factor analysis, TFA, is a well-established soft analysis method. TFA answers the question of whether an independent target test vector, measured at the same wavelengths as the collection of spectra in a data matrix, can be excluded as the spectrum of one of the components in the system under investigation. Essentially, TFA cannot positively prove that a particular test spectrum is the true spectrum of one of the components; it can only reject a spectrum. However, TFA will not reject, or in other words will accept, many spectra which cannot be component spectra. Enhanced Target Factor Analysis, ETFA, addresses this problem. Compared with traditional TFA, ETFA results in a significantly narrower range of positive results, i.e. the chance of a false positive test result is dramatically reduced. ETFA is based on feasibility testing as described in Refs. [16-19]. The method has been tested and validated with computer-generated and real data sets. PMID:26893084
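The rejection step at the heart of target testing can be sketched as a projection: regress the candidate target vector onto the abstract factor space spanned by the data, and reject it if the residual is large. The two "factor" vectors, the hand-solved 2x2 normal equations, and the tolerance below are toy illustrations, not the abstract-factor machinery or data of the paper.

```python
def project_residual(factors, target):
    """Least-squares residual norm of `target` after projection onto the
    span of two factor vectors (2x2 normal equations solved by hand)."""
    f1, f2 = factors
    a11 = sum(x * x for x in f1)
    a12 = sum(x * y for x, y in zip(f1, f2))
    a22 = sum(y * y for y in f2)
    b1 = sum(x * t for x, t in zip(f1, target))
    b2 = sum(y * t for y, t in zip(f2, target))
    det = a11 * a22 - a12 * a12
    c1 = (b1 * a22 - b2 * a12) / det   # least-squares coefficients
    c2 = (a11 * b2 - a12 * b1) / det
    resid = [t - c1 * x - c2 * y for x, y, t in zip(f1, f2, target)]
    return sum(r * r for r in resid) ** 0.5

def target_rejected(factors, target, tol=1e-6):
    """TFA-style verdict: True means the target cannot be a component
    spectrum. False only means "not rejected" -- never a positive proof."""
    return project_residual(factors, target) > tol
```

In real use the factor space comes from an abstract factorization of the noisy data matrix and the tolerance from its noise level; ETFA tightens this test further via feasibility constraints.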
Anastassiou, George A
2015-01-01
This is the first numerical analysis text to use Sage for the implementation of algorithms and can be used in a one-semester course for undergraduates in mathematics, math education, computer science/information technology, engineering, and physical sciences. The primary aim of this text is to simplify understanding of the theories and ideas from a numerical analysis/numerical methods course via a modern programming language like Sage. Aside from the presentation of fundamental theoretical notions of numerical analysis throughout the text, each chapter concludes with several exercises that are oriented to real-world application. Answers may be verified using Sage. The presented code, written in core components of Sage, are backward compatible, i.e., easily applicable to other software systems such as Mathematica®. Sage is open source software and uses Python-like syntax. Previous Python programming experience is not a requirement for the reader, though familiarity with any programming language is a p...
Energy Technology Data Exchange (ETDEWEB)
Andres, T.H
2002-05-01
This guide applies to the estimation of uncertainty in quantities calculated by scientific analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner in which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)
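Of the propagation options listed above, the Monte Carlo route is the simplest to sketch: sample the uncertain inputs, run the model, and summarise the spread of the output. The stand-in model and input distributions below are invented placeholders, not AECL's codes or data.

```python
import random
import statistics

def model(k, h):
    """Stand-in computer model: output proportional to both inputs."""
    return k * h

def propagate(n_samples=20000, seed=1):
    """Monte Carlo propagation of input uncertainty to the model output."""
    rng = random.Random(seed)
    outputs = []
    for _ in range(n_samples):
        k = rng.gauss(2.0, 0.1)   # uncertain input 1: mean 2.0, sd 0.1
        h = rng.gauss(5.0, 0.2)   # uncertain input 2: mean 5.0, sd 0.2
        outputs.append(model(k, h))
    return statistics.mean(outputs), statistics.stdev(outputs)

mean_q, sd_q = propagate()
```

For this multiplicative model the first-order series approximation, sd ≈ sqrt((5.0·0.1)² + (2.0·0.2)²) ≈ 0.64, agrees closely with the sampled spread, illustrating why the two representations are interchangeable for mildly nonlinear models.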
Medical Image Analysis Facility
1978-01-01
To improve the quality of photos sent to Earth by unmanned spacecraft, NASA's Jet Propulsion Laboratory (JPL) developed a computerized image enhancement process that brings out detail not visible in the basic photo. JPL is now applying this technology to biomedical research in its Medical Image Analysis Facility, which employs computer enhancement techniques to analyze x-ray films of internal organs, such as the heart and lung. A major objective is the study of the effects of stress on persons with heart disease. In animal tests, computerized image processing is being used to study coronary artery lesions and the degree to which they reduce arterial blood flow when stress is applied. The photos illustrate the enhancement process. The upper picture is an x-ray photo in which the artery (dotted line) is barely discernible; in the post-enhancement photo at right, the whole artery and the lesions along its wall are clearly visible. The Medical Image Analysis Facility offers a faster means of studying the effects of complex coronary lesions in humans, and the research now being conducted on animals is expected to have important application to diagnosis and treatment of human coronary disease. Other uses of the facility's image processing capability include analysis of muscle biopsy and pap smear specimens, and study of the microscopic structure of fibroprotein in the human lung. Working with JPL on the experiments are NASA's Ames Research Center, the University of Southern California School of Medicine, and Rancho Los Amigos Hospital, Downey, California.
Market analysis. Renewable fuels
International Nuclear Information System (INIS)
The Agency for Renewable Resources (FNR), on behalf of the Federal Ministry of Food and Agriculture, commissioned a study on the market development of renewable resources in Germany and published it in 2006. The aim of that study was to identify the actual status and market performance of the individual market segments of material and energetic use, as a basis for policy recommendations for an accelerated and long-term successful market launch and expansion of the market share of renewable raw materials. On behalf of the FNR, a market analysis was carried out from mid-2011 until the beginning of 2013, the results of which are hereby resubmitted. This market analysis covers all markets of material and energetic use in a global context, taking account of possible competing uses. A market segmentation based on the product classification of the Federal Statistical Office formed the basis of the analysis. A total of ten markets have been defined: seven for material and three for energetic use.
Nominal analysis of "variance".
Weiss, David J
2009-08-01
Nominal responses are the natural way for people to report actions or opinions. Because nominal responses do not generate numerical data, they have been underutilized in behavioral research. On those occasions in which nominal responses are elicited, the responses are customarily aggregated over people or trials so that large-sample statistics can be employed. A new analysis is proposed that directly associates differences among responses with particular sources in factorial designs. A pair of nominal responses either matches or does not; when responses do not match, they vary. That analogue to variance is incorporated in the nominal analysis of "variance" (NANOVA) procedure, wherein the proportions of matches associated with sources play the same role as do sums of squares in an ANOVA. The NANOVA table is structured like an ANOVA table. The significance levels of the N ratios formed by comparing proportions are determined by resampling. Fictitious behavioral examples featuring independent groups and repeated measures designs are presented. A Windows program for the analysis is available.
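The match/mismatch scoring and resampling-based significance test described above can be sketched directly. The data and the simple two-column pairing below are fabricated for illustration; the full NANOVA procedure partitions match proportions across all sources of a factorial design, which this sketch does not attempt.

```python
import random

def match_proportion(pairs):
    """Proportion of nominal-response pairs that match exactly --
    the NANOVA analogue of low 'variance'."""
    return sum(a == b for a, b in pairs) / len(pairs)

def permutation_p_value(pairs, n_perm=2000, seed=7):
    """Resampling significance test: how often does shuffling the second
    responses yield a match proportion at least as high as observed?"""
    rng = random.Random(seed)
    observed = match_proportion(pairs)
    seconds = [b for _, b in pairs]
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(seconds)
        shuffled = list(zip((a for a, _ in pairs), seconds))
        if match_proportion(shuffled) >= observed:
            hits += 1
    return hits / n_perm
```

A small observed p-value indicates that the responses match more often than chance pairing would produce, playing the role that a significant F ratio plays in an ordinary ANOVA.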
Monogan III, James E
2015-01-01
Political Analysis Using R can serve as a textbook for undergraduate or graduate students as well as a manual for independent researchers. It is unique among competitor books in its usage of 21 example datasets that are all drawn from political research. All of the data and example code is available from the Springer website, as well as from Dataverse (http://dx.doi.org/10.7910/DVN/ARKOTI). The book provides a narrative of how R can be useful for addressing problems common to the analysis of public administration, public policy, and political science data specifically, in addition to the social sciences more broadly. While the book uses data drawn from political science, public administration, and policy analyses, it is written so that students and researchers in other fields should find it accessible and useful as well. Political Analysis Using R is perfect for the first-time R user who has no prior knowledge about the program. By working through the first seven chapters of this book, an entry-level user sho...
Generalized Linear Covariance Analysis
Carpenter, James R.; Markley, F. Landis
2014-01-01
This talk presents a comprehensive approach to filter modeling for generalized covariance analysis of both batch least-squares and sequential estimators. We review and extend in two directions the results of prior work that allowed for partitioning of the state space into "solve-for" and "consider" parameters, accounted for differences between the formal values and the true values of the measurement noise, process noise, and a priori solve-for and consider covariances, and explicitly partitioned the errors into subspaces containing only the influence of the measurement noise, process noise, and solve-for and consider covariances. In this work, we explicitly add sensitivity analysis to this prior work, and relax an implicit assumption that the batch estimator's epoch time occurs prior to the definitive span. We also apply the method to an integrated orbit and attitude problem, in which gyro and accelerometer errors, though not estimated, influence the orbit determination performance. We illustrate our results using two graphical presentations, which we call the "variance sandpile" and the "sensitivity mosaic," and we compare the linear covariance results to confidence intervals associated with ensemble statistics from a Monte Carlo analysis.
Deterministic uncertainty analysis
International Nuclear Information System (INIS)
This paper presents a deterministic uncertainty analysis (DUA) method for calculating uncertainties that has the potential to significantly reduce the number of computer runs compared to conventional statistical analysis. The method is based upon the availability of derivative and sensitivity data such as that calculated using the well-known direct or adjoint sensitivity analysis techniques. Formation of response surfaces using derivative data and the propagation of input probability distributions are discussed relative to their role in the DUA method. A sample problem that models the flow of water through a borehole is used as a basis to compare the cumulative distribution function of the flow rate as calculated by the standard statistical methods and the DUA method. Propagation of uncertainties by the DUA method is compared for ten cases in which the number of reference model runs was varied from one to ten. The DUA method gives a more accurate representation of the true cumulative distribution of the flow rate based upon as few as two model executions, compared to fifty model executions using a statistical approach. 16 refs., 4 figs., 5 tabs
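The derivative-based step underlying DUA can be sketched in its simplest first-order form: with sensitivities dy/dx_i in hand (here from central finite differences standing in for direct or adjoint sensitivity data), input variances propagate to an output variance without a large statistical sample. The model and numbers are illustrative placeholders, not the borehole problem from the paper.

```python
def model(x1, x2):
    """Stand-in for the physics model (e.g. a flow calculation)."""
    return x1 ** 2 + 3.0 * x2

def dua_variance(x, sigmas, h=1e-6):
    """First-order output variance: sum over inputs of
    (sensitivity * input standard deviation)^2."""
    var = 0.0
    for i in range(len(x)):
        hi = list(x); lo = list(x)
        hi[i] += h
        lo[i] -= h
        sens = (model(*hi) - model(*lo)) / (2.0 * h)  # central difference
        var += (sens * sigmas[i]) ** 2
    return var

# Nominal inputs [2.0, 1.0] with standard deviations [0.1, 0.2].
out_sd = dua_variance([2.0, 1.0], [0.1, 0.2]) ** 0.5
```

Each extra reference model run in the paper's method refines the response surface beyond this linear approximation, which is how DUA recovers the full cumulative distribution from only a handful of executions.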
[Mindfulness: A Concept Analysis].
Chen, Tsai-Ling; Chou, Fan-Hao; Wang, Hsiu-Hung
2016-04-01
"Mindfulness" is an emerging concept in the field of healthcare. Ranging from stress relief to psychotherapy, mindfulness has been confirmed to be an effective tool to help individuals manage depression, anxiety, obsessive-compulsive disorder, and other health problems in clinical settings. Scholars currently use various definitions for mindfulness. While some of these definitions overlap, significant differences remain and a general scholarly consensus has yet to be reached. Several domestic and international studies have explored mindfulness-related interventions and their effectiveness. However, the majority of these studies have focused on the fields of clinical medicine, consultation, and education. Mindfulness has rarely been applied in clinical nursing practice and no related systematic concept analysis has been conducted. This paper conducts a concept analysis of mindfulness using the concept analysis method proposed by Walker and Avant (2011). We describe the defining characteristics of mindfulness, clarify the concept, and confirm the predisposing factors and effects of mindfulness using examples of typical cases, borderline cases, related cases, and contrary case. Findings may provide nursing staff with an understanding of the concept of mindfulness for use in clinical practice in order to help patients achieve a comfortable state of body and mind healing. PMID:27026563
International Nuclear Information System (INIS)
This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)
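The combination of random and systematic uncertainties described here can be sketched in the spirit of the ISO Guide's quadrature rule; the numbers below are hypothetical:

```python
import math

# Hypothetical standard uncertainties for one code output:
u_random = 0.8       # from propagating input uncertainties through the model
u_systematic = 0.5   # from estimated model simplifications and biases

# Combined standard uncertainty: uncorrelated components add in quadrature.
u_combined = math.sqrt(u_random**2 + u_systematic**2)

# Expanded uncertainty at roughly 95 % coverage (coverage factor k = 2).
U = 2.0 * u_combined
print(round(u_combined, 3), round(U, 3))
```

A validation comparison would then ask whether the difference between experiment and simulation lies within an interval of half-width U.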
RISK ANALYSIS, ANALYSIS OF VARIANCE: GETTING MORE FROM OUR DATA
Analysis of variance (ANOVA) and regression are common statistical techniques used to analyze agronomic experimental data and determine significant differences among yields due to treatments or other experimental factors. Risk analysis provides an alternate and complementary examination of the same...
Travis, Brandon E.
2004-01-01
In designing and testing bolted joints there are multiple parameters to be considered and calculations that must be performed to predict the joint behavior. Each different set of parameters may call for a different set of equations. Determining every parameter in each bolted joint is impractical and in many cases impossible. On the other hand, it is much easier to reduce these calculations to a universal set that can be used for all bolted joints. This is the purpose of the Bolt Analysis Program. My project under the Mechanical and Rotating Systems branch of the Engineering Development and Analysis Division was to take the Bolt Analysis Program Version 2.0 and update the program to a modern and user-friendly format. Version 2.0 of the Bolt Analysis Program is a useful program, but lacks the dynamic capabilities that are needed for current applications. Version 2.0 of the Bolt Analysis Program was written in 1993 using the Pascal programming language in a DOS format. This program allows you to input data in a step-by-step format, calculates the data, and then on a final screen displays the input and the output from the calculations. Version 2.0 is still applicable for all bolted joint analysis, but has updates that are desired. First, the program runs in DOS format. With the applications available today, my mentor decided it would be best to update the program into Excel using Visual Basic for Applications (VBA). This would allow the program to have multiple Graphical User Interfaces (GUIs) while retaining all functions of the previous program. Version 2.0 only allows you to input data in a step-by-step process. If you make a mistake and need to go back, you must run through the entire program before you can return to fix your error. This becomes tedious when needing to change one parameter or test multiple sets of data. In Version 3.0, the program allows you to enter and change data at any time while displaying real-time output data. If you realize an error, it is
International Nuclear Information System (INIS)
This report documents an analysis that was performed to estimate climatic variables for the next 10,000 years by forecasting the timing and nature of climate change at Yucca Mountain, Nevada, the site of a repository for spent nuclear fuel and high-level radioactive waste. The future-climate estimates are based on an analysis of past-climate data from analog meteorological stations, and this report provides the rationale for the selection of these analog stations. The stations selected provide an upper and a lower climate bound for each future climate, and the data from those sites will provide input to the following reports: ''Simulation of Net Infiltration for Present-Day and Potential Future Climates'' (BSC 2004 [DIRS 170007]), ''Total System Performance Assessment (TSPA) Model/Analysis for the License Application'' (BSC 2004 [DIRS 168504]), ''Features, Events, and Processes in UZ Flow and Transport'' (BSC 2004 [DIRS 170012]), and ''Features, Events, and Processes in SZ Flow and Transport'' (BSC 2004 [DIRS 170013]). Forecasting long-term future climates, especially for the next 10,000 years, is highly speculative and rarely attempted. A very limited literature exists concerning the subject, largely from the British radioactive waste disposal effort. The discussion presented here is one available forecasting method for establishing upper and lower bounds for future climate estimates. The selection of different methods is directly dependent on the available evidence used to build a forecasting argument. The method used here involves selecting a particular past climate from many past climates, as an analog for future climate. While alternative analyses are possible for the case presented for Yucca Mountain, the evidence (data) used would be the same and the conclusions would not be expected to drastically change. Other studies might develop a different rationale or select other past climates resulting in a different future climate analog. Other alternative
Energy Technology Data Exchange (ETDEWEB)
Vivancos, E; Healy, C; Mueller, F; Whalley, D
2001-05-09
Embedded systems often have real-time constraints. Traditional timing analysis statically determines the maximum execution time of a task or a program in a real-time system. These systems typically depend on the worst-case execution time of tasks in order to make static scheduling decisions so that tasks can meet their deadlines. Static determination of worst-case execution times imposes numerous restrictions on real-time programs, which include that the maximum number of iterations of each loop must be known statically. These restrictions can significantly limit the class of programs that would be suitable for a real-time embedded system. This paper describes work-in-progress that uses static timing analysis to aid in making dynamic scheduling decisions. For instance, different algorithms with varying levels of accuracy may be selected based on the algorithm's predicted worst-case execution time and the time allotted for the task. We represent the worst-case execution time of a function or a loop as a formula, where the unknown values affecting the execution time are parameterized. This parametric timing analysis produces formulas that can then be quickly evaluated at run-time so dynamic scheduling decisions can be made with little overhead. Benefits of this work include expanding the class of applications that can be used in a real-time system, improving the accuracy of dynamic scheduling decisions, and more effective utilization of system resources.
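The idea of a parametric WCET formula evaluated at run time can be sketched as follows; the cycle counts and algorithm formulas are hypothetical, not from the paper:

```python
# Hypothetical parametric WCET formulas emitted by a timing analyzer,
# parameterized on the unknown loop bound n (cycle counts are made up).
def wcet_accurate(n):
    return 120 + 35 * n   # constant overhead + per-iteration worst case

def wcet_fast(n):
    return 80 + 12 * n    # cheaper, less accurate algorithm

def pick_algorithm(n, budget_cycles):
    """At run time, evaluate the formulas cheaply and select the accurate
    algorithm only if its predicted WCET fits the remaining budget."""
    return "accurate" if wcet_accurate(n) <= budget_cycles else "fast"

print(wcet_accurate(100))         # 3620 cycles for a loop bound of 100
print(pick_algorithm(100, 4000))  # accurate fits the budget
print(pick_algorithm(100, 3000))  # fall back to the fast algorithm
```

Because the formula evaluation is just arithmetic, the scheduling decision adds negligible overhead compared with a full static re-analysis.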
Energy Technology Data Exchange (ETDEWEB)
C. G. Cambell
2004-09-03
This report documents an analysis that was performed to estimate climatic variables for the next 10,000 years by forecasting the timing and nature of climate change at Yucca Mountain, Nevada, the site of a repository for spent nuclear fuel and high-level radioactive waste. The future-climate estimates are based on an analysis of past-climate data from analog meteorological stations, and this report provides the rationale for the selection of these analog stations. The stations selected provide an upper and a lower climate bound for each future climate, and the data from those sites will provide input to the following reports: ''Simulation of Net Infiltration for Present-Day and Potential Future Climates'' (BSC 2004 [DIRS 170007]), ''Total System Performance Assessment (TSPA) Model/Analysis for the License Application'' (BSC 2004 [DIRS 168504]), ''Features, Events, and Processes in UZ Flow and Transport'' (BSC 2004 [DIRS 170012]), and ''Features, Events, and Processes in SZ Flow and Transport'' (BSC 2004 [DIRS 170013]). Forecasting long-term future climates, especially for the next 10,000 years, is highly speculative and rarely attempted. A very limited literature exists concerning the subject, largely from the British radioactive waste disposal effort. The discussion presented here is one available forecasting method for establishing upper and lower bounds for future climate estimates. The selection of different methods is directly dependent on the available evidence used to build a forecasting argument. The method used here involves selecting a particular past climate from many past climates, as an analog for future climate. While alternative analyses are possible for the case presented for Yucca Mountain, the evidence (data) used would be the same and the conclusions would not be expected to drastically change. Other studies might develop a different rationale or select other past
Schwarzer, Guido; Rücker, Gerta
2015-01-01
This book provides a comprehensive introduction to performing meta-analysis using the statistical software R. It is intended for quantitative researchers and students in the medical and social sciences who wish to learn how to perform meta-analysis with R. As such, the book introduces the key concepts and models used in meta-analysis. It also includes chapters on the following advanced topics: publication bias and small study effects; missing data; multivariate meta-analysis, network meta-analysis; and meta-analysis of diagnostic studies. .
Preprint Big City 3D Visual Analysis
Lv, Zhihan; Li, Xiaoming; Zhang, Baoyun; Wang, Weixi; Feng, Shengzhong; Hu, Jinxing
2015-01-01
This is the preprint version of our paper at EUROGRAPHICS 2015. A big city visual analysis platform based on the Web Virtual Reality Geographical Information System (WEBVRGIS) is presented. Extensive model editing functions and spatial analysis functions are available, including terrain analysis, spatial analysis, sunlight analysis, traffic analysis, population analysis and community analysis.
CADAT integrated circuit mask analysis
1981-01-01
CADAT System Mask Analysis Program (MAPS2) is automated software tool for analyzing integrated-circuit mask design. Included in MAPS2 functions are artwork verification, device identification, nodal analysis, capacitance calculation, and logic equation generation.
Logical analysis of biological systems
DEFF Research Database (Denmark)
Mardare, Radu Iulian
2005-01-01
R. Mardare, Logical analysis of biological systems. Fundamenta Informaticae, N 64:271-285, 2005.
The OVIS analysis architecture.
Energy Technology Data Exchange (ETDEWEB)
Mayo, Jackson R.; Gentile, Ann C.; Brandt, James M.; De Sapio, Vincent; Thompson, David C.; Roe, Diana C.; Wong, Matthew H.; Pebay, Philippe Pierre
2010-07-01
This report summarizes the current statistical analysis capability of OVIS and how it works in conjunction with the OVIS data readers and interpolators. It also documents how to extend these capabilities. OVIS is a tool for parallel statistical analysis of sensor data to improve system reliability. Parallelism is achieved using a distributed data model: many sensors on similar components (metaphorically sheep) insert measurements into a series of databases on computers reserved for analyzing the measurements (metaphorically shepherds). Each shepherd node then processes the sheep data stored locally and the results are aggregated across all shepherds. OVIS uses the Visualization Tool Kit (VTK) statistics algorithm class hierarchy to perform analysis of each process's data but avoids VTK's model aggregation stage which uses the Message Passing Interface (MPI); this is because if a single process in an MPI job fails, the entire job will fail. Instead, OVIS uses asynchronous database replication to aggregate statistical models. OVIS has several additional features beyond those present in VTK that, first, accommodate its particular data format and, second, improve the memory and speed of the statistical analyses. First, because many statistical algorithms are multivariate in nature and sensor data is typically univariate, interpolation of data is required to provide simultaneous observations of metrics. Note that in this report, we will refer to a single value obtained from a sensor as a measurement while a collection of multiple sensor values simultaneously present in the system is an observation. A base class for interpolation is provided that abstracts the operation of converting multiple sensor measurements into simultaneous observations. A concrete implementation is provided that performs piecewise constant temporal interpolation of multiple metrics across a single component. Secondly, because calculations may summarize data too large to fit in memory
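The interpolation step described here, converting univariate sensor measurements into simultaneous multivariate observations by piecewise-constant (hold-last-value) interpolation, can be sketched with hypothetical metric streams (this is an illustration of the idea, not OVIS code):

```python
import bisect

def last_value(times, values, t):
    """Return the most recent measurement at or before time t,
    or None if no measurement exists yet (piecewise-constant hold)."""
    i = bisect.bisect_right(times, t) - 1
    return values[i] if i >= 0 else None

# Two univariate streams on one component: (timestamps, readings).
temp = ([0.0, 2.0, 5.0], [40.0, 42.0, 41.0])
fan = ([1.0, 4.0], [3000.0, 3200.0])

# One multivariate observation per query time, built from the streams.
obs = [(last_value(*temp, t), last_value(*fan, t)) for t in (2.5, 4.5)]
print(obs)
```

Each tuple is the "observation" in the report's terminology: several metric values treated as simultaneously present in the system, ready for a multivariate statistics algorithm.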
Directory of Open Access Journals (Sweden)
Rajiv K Gupta
2011-01-01
Initial stability at placement and the development of osseointegration are two major issues for implant survival. Implant stability is a mechanical phenomenon related to the local bone quality and quantity, the type of implant, and the placement technique used. The application of a simple, clinically applicable, non-invasive test to assess implant stability and osseointegration is considered highly desirable. Resonance frequency analysis (RFA) is one such technique, and it is now among the most frequently used. The aim of this paper was to review and critically analyze the current literature in the field of RFA, and to discuss, based on scientific evidence, the prognostic value of RFA for detecting implants at risk of failure. A search was made using the PubMed database to find all the literature published to date on "resonance frequency analysis for implant stability". Articles discussing in vivo or in vitro studies comparing RFA with other methods of implant stability measurement, and articles discussing its reliability, were thoroughly reviewed and discussed. A limited number of clinical reports were found. Various studies have demonstrated the feasibility and predictability of the technique. However, most of these articles are based on retrospective data or uncontrolled cases. Randomized, prospective, parallel-armed longitudinal human trials are based on short-term results, and long-term follow-up is still scarce in this field. Nonetheless, from the available literature it may be concluded that the RFA technique evaluates implant stability as a function of the stiffness of the implant-bone interface and is influenced by factors such as bone type and exposed implant height above the alveolar crest. Resonance frequency analysis could serve as a non-invasive diagnostic tool for detecting the stability of dental implants during the healing stages and in subsequent routine follow-up care after treatment. Future studies, preferably randomized
Mergenova, Saule
2013-01-01
This bachelor thesis presents a situational analysis of the marketing mix for the sport centre "Růžova 5". All parts of the marketing mix are considered, specifically the 4Ps (product, price, place, promotion). As a country becomes more developed, more people can afford to visit sport centres to improve their fitness or simply for their health. Today, people are able to pay for services such as sport, and day by day a healthy lifestyle becomes more of a trend. In the modern worl...
Analysis, manifolds and physics
Choquet-Bruhat, Y
2000-01-01
Twelve problems have been added to the first edition; four of them are supplements to problems in the first edition. The others deal with issues that have become important in recent developments of various areas of physics since the first edition of Volume II. All the problems have their foundations in Volume 1 of the two-volume set Analysis, Manifolds and Physics. It would have been prohibitively expensive to insert the new problems at their respective places. They are grouped together at the end of this volume, and their logical place is indicated by a number in parentheses following the title.
Dimensional analysis made simple
Lira, Ignacio
2013-11-01
An inductive strategy is proposed for teaching dimensional analysis to second- or third-year students of physics, chemistry, or engineering. In this strategy, Buckingham's theorem is seen as a consequence and not as the starting point. In order to concentrate on the basics, the mathematics is kept as elementary as possible. Simple examples are suggested for classroom demonstrations of the power of the technique and others are put forward for homework or experimentation, but instructors are encouraged to produce examples of their own.
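Buckingham's theorem as a consequence, not a starting point, can be shown with elementary linear algebra on the classic pendulum example; this sketch (with a hand-picked candidate group) is an illustration, not taken from the paper:

```python
import numpy as np

# Dimensional matrix for a simple pendulum (rows: mass, length, time;
# columns: period t, length l, gravity g). A dimensionless group's
# exponent vector must lie in the null space of this matrix.
#             t    l    g
D = np.array([[0,  0,  0],    # mass dimension
              [0,  1,  1],    # length dimension
              [1,  0, -2]])   # time dimension

# Candidate group: pi = t * l^(-1/2) * g^(1/2)
exponents = np.array([1.0, -0.5, 0.5])
print(D @ exponents)   # all zeros, so the group is dimensionless

# Buckingham's theorem: number of independent dimensionless groups
# equals the number of variables minus the rank of D.
n_groups = D.shape[1] - np.linalg.matrix_rank(D)
print(n_groups)
```

Here the single group pi = t * sqrt(g/l) recovers the familiar result that the pendulum period scales as sqrt(l/g).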
Elementary heat transfer analysis
Whitaker, Stephen; Hartnett, James P
1976-01-01
Elementary Heat Transfer Analysis provides information pertinent to the fundamental aspects of the nature of transient heat conduction. This book presents a thorough understanding of the thermal energy equation and its application to boundary layer flows and confined and unconfined turbulent flows. Organized into nine chapters, this book begins with an overview of the use of heat transfer coefficients in formulating the flux condition at phase interface. This text then explains the specification as well as application of flux boundary conditions. Other chapters consider a derivation of the tra
Horejšová, Zuzana
2013-01-01
The bachelor thesis named ‘Analysis of the marketing mix’ focuses on examining the marketing-mix techniques used and applied by the company IHR Autodily Ltd. It is divided into two main parts: a theoretical part and a practical part. The theoretical part discusses the general characteristics and basic rules of marketing, including the four tools of the marketing mix: product, price, place and promotion. The pract...
Dimensional analysis made simple
International Nuclear Information System (INIS)
An inductive strategy is proposed for teaching dimensional analysis to second- or third-year students of physics, chemistry, or engineering. In this strategy, Buckingham's theorem is seen as a consequence and not as the starting point. In order to concentrate on the basics, the mathematics is kept as elementary as possible. Simple examples are suggested for classroom demonstrations of the power of the technique and others are put forward for homework or experimentation, but instructors are encouraged to produce examples of their own. (paper)
Ambrosetti, Antonio; Malchiodi, Andrea
2009-01-01
This volume contains lecture notes on some topics in geometric analysis, a growing mathematical subject which uses analytical techniques, mostly of partial differential equations, to treat problems in differential geometry and mathematical physics. The presentation of the material should be rather accessible to non-experts in the field, since the presentation is didactic in nature. The reader will be provided with a survey containing some of the most exciting topics in the field, with a series of techniques used to treat such problems.
Practical multivariate analysis
Afifi, Abdelmonem; Clark, Virginia A
2011-01-01
""First of all, it is very easy to read. … The authors manage to introduce and (at least partially) explain even quite complex concepts, e.g. eigenvalues, in an easy and pedagogical way that I suppose is attractive to readers without deeper statistical knowledge. The text is also sprinkled with references for those who want to probe deeper into a certain topic. Secondly, I personally find the book's emphasis on practical data handling very appealing. … Thirdly, the book gives very nice coverage of regression analysis. … this is a nicely written book that gives a good overview of a large number
International Nuclear Information System (INIS)
Degradation of underground openings as a function of time is a natural and expected occurrence for any subsurface excavation. Over time, changes occur to both the stress condition and the strength of the rock mass due to several interacting factors. Once the factors contributing to degradation are characterized, the effects of drift degradation can typically be mitigated through appropriate design and maintenance of the ground support system. However, for the emplacement drifts of the geologic repository at Yucca Mountain, it is necessary to characterize drift degradation over a 10,000-year period, which is well beyond the functional period of the ground support system. This document provides an analysis of the amount of drift degradation anticipated in repository emplacement drifts for discrete events and time increments extending throughout the 10,000-year regulatory period for postclosure performance. This revision of the drift degradation analysis was developed to support the license application and fulfill specific agreement items between the U.S. Nuclear Regulatory Commission (NRC) and the U.S. Department of Energy (DOE). The earlier versions of ''Drift Degradation Analysis'' (BSC 2001 [DIRS 156304]) relied primarily on the DRKBA numerical code, which provides for a probabilistic key-block assessment based on realistic fracture patterns determined from field mapping in the Exploratory Studies Facility (ESF) at Yucca Mountain. A key block is defined as a critical block in the surrounding rock mass of an excavation, which is removable and oriented in an unsafe manner such that it is likely to move into an opening unless support is provided. However, the use of the DRKBA code to determine potential rockfall data at the repository horizon during the postclosure period has several limitations: (1) The DRKBA code cannot explicitly apply dynamic loads due to seismic ground motion. (2) The DRKBA code cannot explicitly apply loads due to thermal stress. (3) The DRKBA
Badler, N. I.
1985-01-01
Human motion analysis is the task of converting actual human movements into computer readable data. Such movement information may be obtained though active or passive sensing methods. Active methods include physical measuring devices such as goniometers on joints of the body, force plates, and manually operated sensors such as a Cybex dynamometer. Passive sensing de-couples the position measuring device from actual human contact. Passive sensors include Selspot scanning systems (since there is no mechanical connection between the subject's attached LEDs and the infrared sensing cameras), sonic (spark-based) three-dimensional digitizers, Polhemus six-dimensional tracking systems, and image processing systems based on multiple views and photogrammetric calculations.
International Nuclear Information System (INIS)
This paper presents a suggestion for systematic collection of data during the normal use of training simulators, with the double purpose of supporting trainee debriefing and providing data for further theoretical studies of operator performance. The method is based on previously described models of operator performance and decision-making, and is a specific instance of the general method for analysis of operator performance data. The method combines a detailed transient-specific description of the expected performance with transient-independent tools for observation of critical activities. (author)
Geiser, Christian
2012-01-01
A practical introduction to using Mplus for the analysis of multivariate data, this volume provides step-by-step guidance, complete with real data examples, numerous screen shots, and output excerpts. The author shows how to prepare a data set for import in Mplus using SPSS. He explains how to specify different types of models in Mplus syntax and address typical caveats--for example, assessing measurement invariance in longitudinal SEMs. Coverage includes path and factor analytic models as well as mediational, longitudinal, multilevel, and latent class models. Specific programming tips an
Introduction to combinatorial analysis
Riordan, John
2002-01-01
This introduction to combinatorial analysis defines the subject as "the number of ways there are of doing some well-defined operation." Chapter 1 surveys that part of the theory of permutations and combinations that finds a place in books on elementary algebra, which leads to the extended treatment of generating functions in Chapter 2, where an important result is the introduction of a set of multivariable polynomials. Chapter 3 contains an extended treatment of the principle of inclusion and exclusion which is indispensable to the enumeration of permutations with restricted position given
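The principle of inclusion and exclusion mentioned above can be illustrated by the standard derangement count (permutations with no fixed points), checked against direct enumeration:

```python
from itertools import permutations
from math import comb, factorial

def derangements(n):
    """Inclusion-exclusion: start from all n! permutations, subtract those
    fixing at least one chosen point, re-add those fixing two, and so on."""
    return sum((-1) ** k * comb(n, k) * factorial(n - k) for k in range(n + 1))

def brute(n):
    """Direct enumeration for small n, as a sanity check."""
    return sum(all(p[i] != i for i in range(n)) for p in permutations(range(n)))

print([derangements(n) for n in range(1, 7)])   # 0, 1, 2, 9, 44, 265
```

The alternating sum is exactly the inclusion-exclusion pattern: each term corrects the overcounting introduced by the previous one.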
Integrated Process Capability Analysis
Institute of Scientific and Technical Information of China (English)
Chen, H. T.; Huang, M. L.; Hung, Y. H.; Chen, K. S.
2002-01-01
Process Capability Analysis (PCA) is a powerful tool to assess the ability of a process for manufacturing product that meets specifications. A larger process capability index implies a higher process yield, and a larger process capability index also indicates a lower expected process loss. Chen et al. (2001) applied the indices Cpu, Cpl, and Cpk for evaluating the process capability of a multi-process product with smaller-the-better, larger-the-better, and nominal-the-best spec...
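A minimal sketch of the indices Cpu, Cpl, and Cpk for a two-sided specification; the data and limits are hypothetical, and this is not the extended multi-process method of Chen et al.:

```python
import statistics

def capability(data, lsl, usl):
    """Compute Cpu, Cpl, and Cpk from sample data and spec limits."""
    mu = statistics.fmean(data)
    s = statistics.stdev(data)
    cpu = (usl - mu) / (3 * s)   # distance to the upper limit (smaller-the-better side)
    cpl = (mu - lsl) / (3 * s)   # distance to the lower limit (larger-the-better side)
    cpk = min(cpu, cpl)          # nominal-the-best: limited by the worse side
    return cpu, cpl, cpk

data = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 9.7, 10.3]   # hypothetical measurements
cpu, cpl, cpk = capability(data, lsl=9.0, usl=11.0)
print(round(cpk, 2))
```

A larger Cpk means the process mean sits well inside the specification relative to its spread, which is the yield/expected-loss connection the abstract refers to.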
Broukal, Radek
2012-01-01
This bachelor thesis describes an analysis of the marketing mix used for the cream „Bobík“ by Bohušovická mlékárna a.s. The work is divided into two parts: a review of the available literature and a practical part. The first part gives a literature overview of the marketing mix and its tools: product, price, place and promotion. The second part describes each marketing-mix tool for the product „Bobík“, ...
International Nuclear Information System (INIS)
The purpose of this report is to provide heat capacity values for the host and surrounding rock layers for the waste repository at Yucca Mountain. The heat capacity representations provided by this analysis are used in unsaturated zone (UZ) flow, transport, and coupled processes numerical modeling activities, and in thermal analyses as part of the design of the repository to support the license application. Among the reports that use the heat capacity values estimated in this report are the ''Multiscale Thermohydrologic Model'' report, the ''Drift Degradation Analysis'' report, the ''Ventilation Model and Analysis'' report, the ''Igneous Intrusion Impacts on Waste Packages and Waste Forms'' report, the ''Dike/Drift Interactions'' report, the ''Drift-Scale Coupled Processes (DST and TH Seepage) Models'' report, and the ''In-Drift Natural Convection and Condensation'' report. The specific objective of this study is to determine the rock-grain and rock-mass heat capacities for the geologic stratigraphy identified in the ''Mineralogic Model (MM3.0) Report'' (BSC 2004 [DIRS 170031], Table 1-1). This report provides estimates of the heat capacity for all stratigraphic layers except the Paleozoic, for which the mineralogic abundance data required to estimate the heat capacity are not available. The temperature range of interest in this analysis is 25°C to 325°C. This interval is broken into three separate temperature sub-intervals: 25°C to 95°C, 95°C to 114°C, and 114°C to 325°C, which correspond to the preboiling, trans-boiling, and postboiling regimes. Heat capacity is defined as the amount of energy required to raise the temperature of a unit mass of material by one degree (Nimick and Connolly 1991 [DIRS 100690], p. 5). The rock-grain heat capacity is defined as the heat capacity of the rock solids (minerals), and does not include the effect of water that exists in the rock pores. By comparison, the rock-mass heat capacity considers the heat capacity of both solids and pore
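A rock-grain heat capacity estimated from mineralogic abundances is essentially a mass-weighted average of mineral heat capacities; this sketch uses hypothetical fractions and mineral values, not the report's data:

```python
def grain_heat_capacity(fractions, capacities):
    """Mass-weighted average heat capacity of the rock solids.

    fractions: mineral mass fractions summing to 1.
    capacities: corresponding heat capacities in J/(kg*K).
    """
    assert abs(sum(fractions) - 1.0) < 1e-9, "mass fractions must sum to 1"
    return sum(w * c for w, c in zip(fractions, capacities))

# Hypothetical mineral assemblage: quartz-, feldspar-, and clay-like phases,
# with heat capacities at some fixed temperature in the preboiling regime.
w = [0.5, 0.3, 0.2]
cp = [740.0, 700.0, 900.0]          # J/(kg*K), illustrative values
print(grain_heat_capacity(w, cp))
```

A rock-mass value would additionally weight in the pore water's heat capacity, which is why the report distinguishes the two quantities.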
Proximate Composition Analysis.
2016-01-01
The proximate composition of foods includes moisture, ash, lipid, protein and carbohydrate contents. These food components may be of interest in the food industry for product development, quality control (QC) or regulatory purposes. Analyses used may be rapid methods for QC or more accurate but time-consuming official methods. Sample collection and preparation must be considered carefully to ensure analysis of a homogeneous and representative sample, and to obtain accurate results. Estimation methods of moisture content, ash value, crude lipid, total carbohydrates, starch, total free amino acids and total proteins are put together in a lucid manner.
Surface Aesthetics and Analysis.
Çakır, Barış; Öreroğlu, Ali Rıza; Daniel, Rollin K
2016-01-01
Surface aesthetics of an attractive nose result from certain lines, shadows, and highlights with specific proportions and breakpoints. Analysis emphasizes geometric polygons as aesthetic subunits. Evaluation of the complete nasal surface aesthetics is achieved using geometric polygons to define the existing deformity and aesthetic goals. The relationship between the dome triangles, interdomal triangle, facet polygons, and infralobular polygon are integrated to form the "diamond shape" light reflection on the nasal tip. The principles of geometric polygons allow the surgeon to analyze the deformities of the nose, define an operative plan to achieve specific goals, and select the appropriate operative technique.
Energy Technology Data Exchange (ETDEWEB)
James Houseworth
2001-10-12
This Analysis/Model Report (AMR) documents an analysis that was performed to estimate climatic variables for the next 10,000 years by forecasting the timing and nature of climate change at Yucca Mountain (YM), Nevada (Figure 1), the site of a potential repository for high-level radioactive waste. The future-climate estimates are based on an analysis of past-climate data from analog meteorological stations, and this AMR provides the rationale for the selection of these analog stations. The stations selected provide an upper and a lower climate bound for each future climate, and the data from those sites will provide input to the infiltration model (USGS 2000) and for the total system performance assessment for the Site Recommendation (TSPA-SR) at YM. Forecasting long-term future climates, especially for the next 10,000 years, is highly speculative and rarely attempted. A very limited literature exists concerning the subject, largely from the British radioactive waste disposal effort. The discussion presented here is one method, among many, of establishing upper and lower bounds for future climate estimates. The method used here involves selecting a particular past climate from many past climates, as an analog for future climate. Other studies might develop a different rationale or select other past climates resulting in a different future climate analog. Revision 00 of this AMR was prepared in accordance with the ''Work Direction and Planning Document for Future Climate Analysis'' (Peterman 1999) under Interagency Agreement DE-AI08-97NV12033 with the U.S. Department of Energy (DOE). The planning document for the technical scope, content, and management of ICN 01 of this AMR is the ''Technical Work Plan for Unsaturated Zone (UZ) Flow and Transport Process Model Report'' (BSC 2001a). The scope for the TBV resolution actions in this ICN is described in the ''Technical Work Plan for: Integrated Management of Technical
DEFF Research Database (Denmark)
Meredith, David
MEL is a geometric music encoding language designed to allow musical objects to be encoded parsimoniously as sets of points in pitch-time space, generated by performing geometric transformations on component patterns. MEL has been implemented in Java and coupled with the SIATEC pattern discovery algorithm to allow compact encodings to be generated automatically from in extenso note lists. The MEL-SIATEC system is founded on the belief that music analysis and music perception can be modelled as the compression of in extenso descriptions of musical objects.
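The point-set idea behind this kind of encoding can be illustrated with a small sketch. This is not actual MEL syntax; the motif and the translation vectors are invented for illustration, but the principle is the one described above: a passage is encoded as a component pattern plus geometric (here, translational) transformations of it.

```python
# Illustrative sketch (not actual MEL syntax): notes as (onset, pitch) points,
# with a motif reused via translation in pitch-time space.

def translate(pattern, dt, dp):
    """Translate a point-set pattern by dt in time and dp in pitch."""
    return {(t + dt, p + dp) for (t, p) in pattern}

# A hypothetical three-note motif in (onset, MIDI-pitch) space.
motif = {(0, 60), (1, 62), (2, 64)}

# The passage is encoded compactly as the motif plus two translated copies,
# rather than as an in extenso list of all nine notes.
passage = motif | translate(motif, 4, 7) | translate(motif, 8, 0)

print(len(passage))  # 9 distinct points
```

An algorithm such as SIATEC works in the opposite direction: given the nine points of `passage`, it searches for maximal translationally related patterns, recovering a compressed description like the one constructed here.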
Krantz, Steven G
2013-01-01
Praise for the Second Edition: "The book is recommended as a source for middle-level mathematical courses. It can be used not only in mathematical departments, but also by physicists, engineers, economists, and other experts in applied sciences who want to understand the main ideas of analysis in order to use them to study mathematical models of different types of processes." - Zentralblatt MATH "The book contains many well-chosen examples and each of the fifteen chapters is followed by almost 500 exercises. … Illustrative pictures are instructive and the design of the book makes reading it a real
Murphy, R F
1985-01-01
Imposed restrictions on inpatient revenue have encouraged hospitals to seek alternative sources of revenue through diversification. The venture profile analysis is a low-cost, orderly process to help hospitals plan for service diversification. Potential business ventures are assigned a weighted score based on nine evaluation criteria. Potential business ventures with high relative scores should be those opportunities with the greater prospects of success and those deserving of serious consideration by the hospital. The format of the profile facilitates active involvement of board members in the decision making process and prudent management of risk in market-based strategic planning. PMID:10300483
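The weighted-score mechanism described above is straightforward to express in code. The abstract only says that nine weighted evaluation criteria are used; the criterion names, weights, and the 1-5 rating scale below are invented for illustration.

```python
# Hypothetical sketch of a venture profile score. The nine criteria, their
# weights, and the rating scale are assumptions; the source only states that
# nine weighted evaluation criteria are applied.

def venture_score(ratings, weights):
    """Weighted score: sum of rating * weight over the evaluation criteria."""
    return sum(ratings[c] * weights[c] for c in weights)

weights = {"market_size": 0.2, "fit_with_mission": 0.15, "capital_required": 0.1,
           "competition": 0.1, "regulatory_risk": 0.1, "staff_expertise": 0.1,
           "payback_period": 0.1, "community_need": 0.1, "revenue_potential": 0.05}

ratings = {c: 3 for c in weights}     # a neutral venture on a 1-5 scale
ratings["market_size"] = 5            # strong market
ratings["revenue_potential"] = 4

print(round(venture_score(ratings, weights), 2))  # 3.45
```

Ventures are then ranked by this score, and those with high relative scores are flagged for serious consideration, as the abstract describes.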
Bonald, Thomas
2013-01-01
The book presents some key mathematical tools for the performance analysis of communication networks and computer systems.Communication networks and computer systems have become extremely complex. The statistical resource sharing induced by the random behavior of users and the underlying protocols and algorithms may affect Quality of Service.This book introduces the main results of queuing theory that are useful for analyzing the performance of these systems. These mathematical tools are key to the development of robust dimensioning rules and engineering methods. A number of examples i
Hartmanová, Dominika
2010-01-01
The bachelor thesis ‘Analysis of the marketing mix’ focuses on examining the marketing mix techniques used and applied by the company IHR Autodily Ltd. It is divided into two main parts: a theoretical part and a practical part. The theoretical part discusses the general characteristics and basic rules of marketing, including the four tools of the marketing mix: product, price, place and promotion. The pract...
Chemical exchange program analysis.
Energy Technology Data Exchange (ETDEWEB)
Waffelaert, Pascale
2007-09-01
As part of its EMS, Sandia performs an annual environmental aspects/impacts analysis. The purpose of this analysis is to identify the environmental aspects associated with Sandia's activities, products, and services and the potential environmental impacts associated with those aspects. Division and environmental programs established objectives and targets based on the environmental aspects associated with their operations. In 2007 the most significant aspect identified was Hazardous Materials (Use and Storage). The objective for Hazardous Materials (Use and Storage) was to improve chemical handling, storage, and on-site movement of hazardous materials. One of the targets supporting this objective was to develop an effective chemical exchange program, making a business case for it in FY07, and fully implementing a comprehensive chemical exchange program in FY08. A Chemical Exchange Program (CEP) team was formed to implement this target. The team consists of representatives from the Chemical Information System (CIS), Pollution Prevention (P2), the HWMF, Procurement and the Environmental Management System (EMS). The CEP Team performed benchmarking and conducted a life-cycle analysis of the current management of chemicals at SNL/NM and compared it to Chemical Exchange alternatives. Those alternatives are as follows: (1) Revive the 'Virtual' Chemical Exchange Program; (2) Re-implement a 'Physical' Chemical Exchange Program using a Chemical Information System; and (3) Transition to a Chemical Management Services System. The analysis and benchmarking study shows that the present management of chemicals at SNL/NM is significantly disjointed and a life-cycle or 'Cradle-to-Grave' approach to chemical management is needed. This approach must consider the purchasing and maintenance costs as well as the cost of ultimate disposal of the chemicals and materials. A chemical exchange is needed as a mechanism to re-apply chemicals on site. This
Housner, J. M.; Anderson, M.; Belvin, W.; Horner, G.
1985-01-01
Dynamic analysis of large space antenna systems must treat the deployment as well as vibration and control of the deployed antenna. Candidate computer programs for deployment dynamics, and issues and needs for future program developments are reviewed. Some results for mast and hoop deployment are also presented. Modeling of complex antenna geometry with conventional finite element methods and with repetitive exact elements is considered. Analytical comparisons with experimental results for a 15 meter hoop/column antenna revealed the importance of accurate structural properties including nonlinear joints. Slackening of cables in this antenna is also a consideration. The technology of designing actively damped structures through analytical optimization is discussed and results are presented.
Energy Technology Data Exchange (ETDEWEB)
Lu, Z.-T.; Bailey, K.; Chen, C.-Y.; Du, X.; Li, Y.-M.; O' Connor, T. P.; Young, L.
2000-05-25
A new method of ultrasensitive trace-isotope analysis has been developed based upon the technique of laser manipulation of neutral atoms. It has been used to count individual 85Kr and 81Kr atoms present in a natural krypton sample with isotopic abundances in the range of 10^-11 and 10^-13, respectively. The atom counts are free of contamination from other isotopes, elements, or molecules. The method is applicable to other trace isotopes that can be efficiently captured with a magneto-optical trap, and has a broad range of potential applications.
Radioactivation analysis of hair
International Nuclear Information System (INIS)
With the aim of indicating environmental pollution effects by heavy metals on humans using hair, nondestructive activation analysis was applied to 382 normal Japanese hair samples (background level). Elemental contents of hair could be determined for Ag, Al, As, Br, Ca, Cd, Cl, Co, Cr, Cu, Fe, Hg, I, K, La, Mg, Mn, Na, S, Sb, Sc, Se, Sm, Ti, V and Zn. As these elements in hair have wide concentration ranges, the differences in concentration distribution between groups (sex, age, permanent treatment and regional difference) are discussed. A method for hair sampling is presented. (author)
Uncertain data envelopment analysis
Wen, Meilin
2014-01-01
This book is intended to present the milestones in the progression of uncertain Data envelopment analysis (DEA). Chapter 1 gives some basic introduction to uncertain theories, including probability theory, credibility theory, uncertainty theory and chance theory. Chapter 2 presents a comprehensive review and discussion of basic DEA models. The stochastic DEA is introduced in Chapter 3, in which the inputs and outputs are assumed to be random variables. To obtain the probability distribution of a random variable, a lot of samples are needed to apply the statistics inference approach. Chapter 4
Pedersen, Steen
2015-01-01
This textbook features applications including a proof of the Fundamental Theorem of Algebra, space filling curves, and the theory of irrational numbers. In addition to the standard results of advanced calculus, the book contains several interesting applications of these results. The text is intended to form a bridge between calculus and analysis. It is based on the authors lecture notes used and revised nearly every year over the last decade. The book contains numerous illustrations and cross references throughout, as well as exercises with solutions at the end of each section
Thida, Myo; Monekosso, Dorothy
2013-01-01
Video context analysis is an active and vibrant research area, which provides means for extracting, analyzing and understanding behavior of a single target and multiple targets. Over the last few decades, computer vision researchers have been working to improve the accuracy and robustness of algorithms to analyse the context of a video automatically. In general, the research work in this area can be categorized into three major topics: 1) counting number of people in the scene 2) tracking individuals in a crowd and 3) understanding behavior of a single target or multiple targets in the scene.
Moeslund, Thomas B
2011-01-01
This unique text/reference provides a coherent and comprehensive overview of all aspects of video analysis of humans. Broad in coverage and accessible in style, the text presents original perspectives collected from preeminent researchers gathered from across the world. In addition to presenting state-of-the-art research, the book reviews the historical origins of the different existing methods, and predicts future trends and challenges. This title: features a Foreword by Professor Larry Davis; contains contributions from an international selection of leading authorities in the field; includes
He, Jingrui
2012-01-01
In many real-world problems, rare categories (minority classes) play essential roles despite their extreme scarcity. The discovery, characterization and prediction of rare categories of rare examples may protect us from fraudulent or malicious behavior, aid scientific discovery, and even save lives. This book focuses on rare category analysis, where the majority classes have smooth distributions, and the minority classes exhibit the compactness property. Furthermore, it focuses on the challenging cases where the support regions of the majority and minority classes overlap. The author has devel
Associative Analysis in Statistics
Mihaela Muntean
2015-01-01
In the last years, the interest in technologies such as in-memory analytics and associative search has increased. This paper explores how you can use in-memory analytics and an associative model in statistics. The word “associative” puts the emphasis on understanding how datasets relate to one another. The paper presents the main characteristics of “associative” data model. Also, the paper presents how to design an associative model for labor market indicators analysis. The source is the EU L...
Akcoglu, Mustafa A; Ha, Dzung Minh
2011-01-01
A rigorous introduction to calculus in vector spaces The concepts and theorems of advanced calculus combined with related computational methods are essential to understanding nearly all areas of quantitative science. Analysis in Vector Spaces presents the central results of this classic subject through rigorous arguments, discussions, and examples. The book aims to cultivate not only knowledge of the major theoretical results, but also the geometric intuition needed for both mathematical problem-solving and modeling in the formal sciences. The authors begin with an outline of key concepts, ter
Directory of Open Access Journals (Sweden)
Colin Crews
2015-06-01
The principles and application of established and newer methods for the quantitative and semi-quantitative determination of ergot alkaloids in food, feed, plant materials and animal tissues are reviewed. The techniques of sampling, extraction, clean-up, detection, quantification and validation are described. The major procedures for ergot alkaloid analysis comprise liquid chromatography with tandem mass spectrometry (LC-MS/MS) and liquid chromatography with fluorescence detection (LC-FLD). Other methods based on immunoassays are under development, and variations of these and minor techniques are available for specific purposes.
Analysis of synchronous machines
Lipo, TA
2012-01-01
Analysis of Synchronous Machines, Second Edition is a thoroughly modern treatment of an old subject. Courses generally teach about synchronous machines by introducing the steady-state per phase equivalent circuit without a clear, thorough presentation of the source of this circuit representation, which is a crucial aspect. Taking a different approach, this book provides a deeper understanding of complex electromechanical drives. Focusing on the terminal rather than on the internal characteristics of machines, the book begins with the general concept of winding functions, describing the placeme
Performance Analysis in Bigdata
Directory of Open Access Journals (Sweden)
Pankaj Deep Kaur
2015-10-01
Big data technologies such as Hadoop, NoSQL and messaging queues help in big data analytics, drive business growth, and support taking the right decisions in time. These big data environments are very dynamic and complex; they require performance validation, root-cause analysis, and tuning to ensure success. In this paper we discuss how to analyse and test the performance of these systems. We present the important factors in a big data environment that are primary candidates for performance testing, such as data-ingestion capacity and throughput, data-processing capacity, simulation of expected usage, and map-reduce jobs, and we suggest measures to improve the performance of big data systems.
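One of the factors listed above, data-ingestion throughput, can be measured with a small harness like the one below. This is a minimal sketch: the `ingest` function is a stand-in for a real sink (Hadoop, a NoSQL store, a message queue), and the record shape is invented for illustration.

```python
# Minimal sketch of one performance-testing factor from the paper:
# measuring data-ingestion throughput in records per second.
import time

def ingest(record, store):
    store.append(record)   # stand-in for writing to Hadoop, NoSQL, a queue, ...

def measure_throughput(n_records):
    store = []
    start = time.perf_counter()
    for i in range(n_records):
        ingest({"id": i, "payload": "x" * 64}, store)
    elapsed = time.perf_counter() - start
    return n_records / elapsed

rate = measure_throughput(100_000)
print(f"ingested at {rate:,.0f} records/s")
```

In a real test the same loop would drive the actual ingestion path under expected and peak load, and the measured rate would be compared against the capacity target for the system.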
Operational Modal Analysis Tutorial
DEFF Research Database (Denmark)
Brincker, Rune; Andersen, Palle
In this paper the basic principles in operational modal testing and analysis are presented and discussed. A brief review of the techniques for operational modal testing and identification is presented, and it is argued that there is now a wide range of techniques for effective identification of modal parameters of practical interest - including the mode shape scaling factor - with a high degree of accuracy. It is also argued that the operational technology offers the user a number of advantages over traditional modal testing. The operational modal technology allows the user to perform a modal...
Kumar, Narenkumarsanjiv
2010-01-01
This book has been written for B.Tech/B.Sc (Engg.)/B.E. students. It consists of seven chapters in all, covering the complete topics systematically and exhaustively. The book is designed as a complete course text on 'Power System Analysis' for undergraduate students of electrical engineering in accordance with the syllabi of Delhi Technological University, Indraprastha University, and other Indian universities/institutions. The book is intended to meet the needs of third-year (6th semester) students of B.Tech. (Electrical Engineering and Electrical & Electronics Engineering) studying in Engineering co
Energy Technology Data Exchange (ETDEWEB)
D. Kicker
2004-09-16
Degradation of underground openings as a function of time is a natural and expected occurrence for any subsurface excavation. Over time, changes occur to both the stress condition and the strength of the rock mass due to several interacting factors. Once the factors contributing to degradation are characterized, the effects of drift degradation can typically be mitigated through appropriate design and maintenance of the ground support system. However, for the emplacement drifts of the geologic repository at Yucca Mountain, it is necessary to characterize drift degradation over a 10,000-year period, which is well beyond the functional period of the ground support system. This document provides an analysis of the amount of drift degradation anticipated in repository emplacement drifts for discrete events and time increments extending throughout the 10,000-year regulatory period for postclosure performance. This revision of the drift degradation analysis was developed to support the license application and fulfill specific agreement items between the U.S. Nuclear Regulatory Commission (NRC) and the U.S. Department of Energy (DOE). The earlier versions of "Drift Degradation Analysis" (BSC 2001 [DIRS 156304]) relied primarily on the DRKBA numerical code, which provides for a probabilistic key-block assessment based on realistic fracture patterns determined from field mapping in the Exploratory Studies Facility (ESF) at Yucca Mountain. A key block is defined as a critical block in the surrounding rock mass of an excavation, which is removable and oriented in an unsafe manner such that it is likely to move into an opening unless support is provided. However, the use of the DRKBA code to determine potential rockfall data at the repository horizon during the postclosure period has several limitations: (1) The DRKBA code cannot explicitly apply dynamic loads due to seismic ground motion. (2) The DRKBA code cannot explicitly apply loads due to thermal
DEFF Research Database (Denmark)
Conti, Roberto; Hong, Jeong Hee; Szymanski, Wojciech
2012-01-01
In this expository article, we discuss the recent progress in the study of endomorphisms and automorphisms of the Cuntz algebras and, more generally, graph C*-algebras (or Cuntz-Krieger algebras). In particular, we discuss the definition and properties of both the full and the restricted Weyl group of such an algebra. Then we outline a powerful combinatorial approach to analysis of endomorphisms arising from permutation unitaries. The restricted Weyl group consists of automorphisms of this type. We also discuss the action of the restricted Weyl group on the diagonal MASA and its relationship...
Mendes, N
2015-01-01
Structural analysis involves the definition of the model and the selection of the analysis type. The model should represent the stiffness, the mass and the loads of the structure. Structures can be represented using simplified models, such as lumped mass models, or advanced models resorting to the Finite Element Method (FEM) and the Discrete Element Method (DEM). Depending on the characteristics of the structure, different types of analysis can be used, such as limit analysis, linear and non...
[Cluster analysis in biomedical researches].
Akopov, A S; Moskovtsev, A A; Dolenko, S A; Savina, G D
2013-01-01
Cluster analysis is one of the most popular methods for the analysis of multi-parameter data. Cluster analysis reveals the internal structure of the data, grouping separate observations by their degree of similarity. The review provides definitions of the basic concepts of cluster analysis and discusses the most popular clustering algorithms: k-means, hierarchical algorithms, and Kohonen network algorithms. Examples of the use of these algorithms in biomedical research are given. PMID:24640781
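The first algorithm named in the review, k-means, can be sketched in a few lines: assign each observation to its nearest centroid, recompute centroids as cluster means, and repeat until the assignments stabilise. The two-dimensional data below are invented to mimic two well-separated subpopulations, as might arise for two patient groups in a biomedical dataset.

```python
# Minimal k-means sketch (one of the algorithms reviewed). Deterministic
# initialisation from the first k points keeps the example reproducible;
# production code would use smarter seeding such as k-means++.

def kmeans(points, k, iters=100):
    centroids = list(points[:k])
    for _ in range(iters):
        # Assignment step: each point joins its nearest centroid's cluster.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda j: sum((a - b) ** 2
                                            for a, b in zip(p, centroids[j])))
            clusters[nearest].append(p)
        # Update step: move each centroid to the mean of its cluster.
        new = [tuple(sum(xs) / len(xs) for xs in zip(*cl)) if cl else centroids[j]
               for j, cl in enumerate(clusters)]
        if new == centroids:   # assignments stable: converged
            break
        centroids = new
    return centroids, clusters

# Two hypothetical well-separated groups of observations.
data = [(0.1, 0.2), (0.2, 0.1), (0.0, 0.0),
        (5.0, 5.1), (5.2, 4.9), (4.8, 5.0)]
centroids, clusters = kmeans(data, k=2)
print(sorted(len(c) for c in clusters))  # [3, 3]
```

Hierarchical algorithms and Kohonen networks differ in mechanism (successive merging of clusters, and a trained neuron grid, respectively) but share the same goal of grouping observations by similarity.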
Blind Analysis in Particle Physics
Roodman, Aaron
2003-01-01
A review of the blind analysis technique, as used in particle physics measurements, is presented. The history of blind analyses in physics is briefly discussed. Next the dangers of "experimenter's bias" and the advantages of a blind analysis are described. Three distinct kinds of blind analysis in particle physics are presented in detail. Finally, the BABAR collaboration's experience with the blind analysis technique is discussed.
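One concrete blinding technique of the kind covered in such reviews is the "hidden offset": the fitted value is shifted by a concealed pseudorandom amount, so analysts can finalise cuts and systematic-error estimates without seeing the answer. The sketch below illustrates the mechanism with invented numbers; it is not taken from any specific BABAR analysis.

```python
# Sketch of a hidden-offset blind analysis. The seed string acts as the sealed
# "blinding box": anyone with it can unblind, but nobody peeks until the
# analysis procedure is frozen.
import random

def blind(value, seed="box-secret"):
    rng = random.Random(seed)
    return value + rng.uniform(-1.0, 1.0)   # reproducible hidden offset

def unblind(blinded, seed="box-secret"):
    rng = random.Random(seed)
    return blinded - rng.uniform(-1.0, 1.0)  # same seed -> same offset

measured = 0.7321              # hypothetical raw fit result
hidden = blind(measured)       # analysts only ever see this number
# ... selection cuts and systematics are finalised while the result is hidden ...
print(abs(unblind(hidden) - measured) < 1e-12)  # True
```

Because the offset is removed only after the procedure is fixed, experimenters cannot steer cuts toward a preferred answer, which is precisely the bias the blind analysis is designed to prevent.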
Distributed and collaborative software analysis
Ghezzi, G; H.C. Gall
2010-01-01
Throughout the years software engineers have come up with a myriad of specialized tools and techniques that focus on a certain type of software analysis, such as source code analysis, duplication analysis, co-change analysis, bug prediction, or detection of bug-fixing patterns. However, easy and straightforward synergies between these analyses and tools rarely exist because of their stand-alone nature, their platform dependence, their different input and output formats and the variety of data...
Firm Analysis by Different Methods
Píbilová, Kateřina
2012-01-01
This Diploma Thesis deals with an analysis of the company carried out using selected methods. The external environment of the company is analysed using PESTLE analysis and Porter's five-factor model. The internal environment is analysed by means of the Kralicek Quick test and fundamental analysis. A SWOT analysis relates the opportunities and threats of the external environment to the strengths and weaknesses of the company. A proposal for improving the company's economic management is designed on the basi...
Strategic Analysis for Patch Ltd.
Louis, Owen
2012-01-01
This paper is a strategic analysis for the start-up Patch Ltd. Patch has developed innovative products for growing produce in homes and will compete in the consumer container-growing industry. The industry and the company are introduced along with urban agriculture trends. The industry is analysed using Porter's 5 forces analysis, and a competitive analysis compares Patch to its competitors on key success factors found in the 5 forces analysis. A strategy is developed using opportunities and t...
From Critical Discourse Analysis to Positive Discourse Analysis
Institute of Scientific and Technical Information of China (English)
黄茜然
2014-01-01
Different scholars hold different views of the definition of Discourse Analysis; it is a research method that can be used by scholars from a variety of academic and non-academic affiliations, coming from a variety of disciplines, to answer a variety of questions. Critical Discourse Analysis is a branch of Discourse Analysis; this paper introduces its development, guiding theory and approach. As Positive Discourse Analysis is an extension of Critical Discourse Analysis, this paper introduces its production and main theories. Finally, a comparison is made between the two.
Correspondence analysis of longitudinal data
Van der Heijden, P.G.M.
2005-01-01
Correspondence analysis is an exploratory tool for the analysis of associations between categorical variables, the results of which may be displayed graphically. For longitudinal data with two time points, an analysis of the transition matrix (showing the relative frequencies for pairs of categories
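The quantity correspondence analysis starts from in the longitudinal case is the transition matrix of counts, normalised into relative frequencies (row profiles). The sketch below shows only this first step with an invented 3x3 transition table; the full method would go on to decompose the deviations of these profiles from independence (typically via a singular value decomposition) for graphical display.

```python
# Sketch: row profiles of a transition matrix, i.e. the relative frequencies
# for pairs of categories at two time points. Counts are hypothetical.
counts = [   # rows = category at time 1, columns = category at time 2
    [30, 10, 10],
    [5, 40, 5],
    [10, 10, 30],
]

def row_profiles(m):
    """Divide each row by its total, giving transition probabilities."""
    return [[x / sum(row) for x in row] for row in m]

for row in row_profiles(counts):
    print([round(x, 2) for x in row])
# [0.6, 0.2, 0.2]
# [0.1, 0.8, 0.1]
# [0.2, 0.2, 0.6]
```

The heavy diagonal of these profiles indicates that units tend to stay in their category between the two time points, the kind of association a correspondence analysis map would make visible.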
Discourse Analysis and Language Communication
Institute of Scientific and Technical Information of China (English)
韦钧玮
2008-01-01
A considerable portion of the work of discourse analysis as a research method can be found in its two major families: linguistically based analysis (such as conversation analysis) and culturally or socially based analysis of discursive practices. The potential of both families, along with examples of each, is discussed.
Multilevel analysis in CSCL research
Janssen, J.J.H.M.; Erkens, G.; Kirschner, P.A.; Kanselaar, G.
2011-01-01
The aim of this chapter is to explain why multilevel analysis (MLA) is often necessary to correctly answer the questions CSCL researchers address. Although CSCL researchers continue to use statistical techniques such as analysis of variance or regression analysis, their datasets are often not suite
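The core problem MLA addresses is that students nested in collaborating groups are not independent observations, which techniques like ordinary analysis of variance assume. The degree of non-independence can be quantified by the intraclass correlation (ICC); the sketch below estimates it from one-way ANOVA mean squares for invented, equally sized groups.

```python
# Sketch of why multilevel analysis matters in CSCL data: scores of students
# nested in small collaborating groups. Data are hypothetical; ICC is
# estimated as (MSB - MSW) / (MSB + (n - 1) * MSW) for equal group size n.
groups = [[5, 6, 7], [2, 3, 1], [8, 9, 7]]   # scores per 3-person group

n = len(groups[0])                            # students per group
grand = sum(sum(g) for g in groups) / (n * len(groups))
msb = n * sum((sum(g) / n - grand) ** 2 for g in groups) / (len(groups) - 1)
msw = sum((x - sum(g) / n) ** 2 for g in groups for x in g) \
      / (len(groups) * (n - 1))
icc = (msb - msw) / (msb + (n - 1) * msw)
print(round(icc, 2))  # 0.9
```

An ICC this large means most of the variance lies between groups; treating the nine students as independent would badly understate standard errors, which is exactly the situation where a multilevel model is required.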
Exploratory Analysis in Learning Analytics
Gibson, David; de Freitas, Sara
2016-01-01
This article summarizes the methods, observations, challenges and implications for exploratory analysis drawn from two learning analytics research projects. The cases include an analysis of a games-based virtual performance assessment and an analysis of data from 52,000 students over a 5-year period at a large Australian university. The complex…
Portfolio Analysis for Vector Calculus
Kaplan, Samuel R.
2015-01-01
Classic stock portfolio analysis provides an applied context for Lagrange multipliers that undergraduate students appreciate. Although modern methods of portfolio analysis are beyond the scope of vector calculus, classic methods reinforce the utility of this material. This paper discusses how to introduce classic stock portfolio analysis in a…
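A worked instance of the classic exercise: the minimum-variance two-asset portfolio. Minimising the portfolio variance w'Σw subject to the constraint w1 + w2 = 1 with a Lagrange multiplier yields a closed-form weight; the variances and covariance below are hypothetical.

```python
# Classic minimum-variance portfolio for two assets, from the
# Lagrange-multiplier solution of  min w'Σw  s.t.  w1 + w2 = 1,
# which gives  w1 = (var2 - cov) / (var1 + var2 - 2*cov).
var1, var2, cov = 0.04, 0.09, 0.01   # hypothetical return variances/covariance

w1 = (var2 - cov) / (var1 + var2 - 2 * cov)
w2 = 1 - w1
port_var = var1 * w1**2 + var2 * w2**2 + 2 * cov * w1 * w2
print(round(w1, 3), round(port_var, 4))  # 0.727 0.0318
```

The punchline students appreciate: the constrained optimum (variance 0.0318) beats holding either asset alone (0.04 or 0.09), a concrete payoff for the Lagrange-multiplier machinery.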
Cross-impacts analysis development and energy policy analysis applications
Energy Technology Data Exchange (ETDEWEB)
Roop, J.M.; Scheer, R.M.; Stacey, G.S.
1986-12-01
The purpose of this report is to describe the cross-impact analysis process and the microcomputer software developed for the Office of Policy, Planning, and Analysis (PPA) of DOE. First introduced in 1968, cross-impact analysis is a technique that produces scenarios of future conditions and possibilities. Cross-impact analysis has several unique attributes that make it a tool worth examining, especially in the current climate when the outlook for the economy and several of the key energy markets is uncertain. Cross-impact analysis complements the econometric, engineering, systems dynamics, and trend approaches already in use at DOE. Cross-impact analysis produces self-consistent scenarios in the broadest sense and can include interaction between the economy, technology, society and the environment. Energy policy analyses that couple broad scenarios of the future with detailed forecasting can produce more powerful results than scenario analysis or forecasts can produce alone.
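The cross-impact mechanism can be sketched in a few lines: each event has an initial probability, the occurrence of one event shifts the probabilities of the others, and Monte Carlo runs yield a distribution of self-consistent scenarios. The events, probabilities, and impact factors below are invented for illustration and are not from the DOE software.

```python
# Hypothetical cross-impact sketch. Events, initial probabilities, and the
# cross-impact adjustments are invented; only the mechanism follows the
# technique described in the report.
import random

events = ["oil_shock", "recession", "efficiency_gains"]
p0 = {"oil_shock": 0.2, "recession": 0.3, "efficiency_gains": 0.5}
# impact[a][b]: shift applied to P(b) if event a occurs
impact = {"oil_shock": {"recession": 0.3, "efficiency_gains": 0.2},
          "recession": {"efficiency_gains": -0.1}}

def scenario(rng):
    """One Monte Carlo run: draw events in order, adjusting later probabilities."""
    occurred, p = set(), dict(p0)
    for e in events:
        if rng.random() < p[e]:
            occurred.add(e)
            for other, delta in impact.get(e, {}).items():
                p[other] = min(1.0, max(0.0, p[other] + delta))
    return occurred

rng = random.Random(1)
runs = [scenario(rng) for _ in range(10_000)]
print(round(sum("recession" in s for s in runs) / len(runs), 2))
```

Because an oil shock raises the chance of a recession, the simulated recession frequency lands above its initial 0.3, illustrating how the interactions, not just the marginal probabilities, shape the resulting scenario set.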
Dynamic Contingency Analysis Tool
Energy Technology Data Exchange (ETDEWEB)
2016-01-14
The Dynamic Contingency Analysis Tool (DCAT) is an open-platform and publicly available methodology to help develop applications that aim to improve the capabilities of power system planning engineers to assess the impact and likelihood of extreme contingencies and potential cascading events across their systems and interconnections. Outputs from the DCAT will help find mitigation solutions to reduce the risk of cascading outages in technically sound and effective ways. The current prototype DCAT implementation has been developed as a Python code that accesses the simulation functions of the Siemens PSS®E planning tool (PSS/E). It has the following features: It uses a hybrid dynamic and steady-state approach to simulating the cascading outage sequences that includes fast dynamic and slower steady-state events. It integrates dynamic models with protection scheme models for generation, transmission, and load. It models special protection systems (SPSs)/remedial action schemes (RASs) and automatic and manual corrective actions. Overall, the DCAT attempts to bridge multiple gaps in cascading-outage analysis in a single, unique prototype tool capable of automatically simulating and analyzing cascading sequences in real systems using multiprocessor computers. While the DCAT has been implemented using PSS/E in Phase I of the study, other commercial software packages with similar capabilities can be used within the DCAT framework.
Maintenance Process Strategic Analysis
Jasiulewicz-Kaczmarek, M.; Stachowiak, A.
2016-08-01
The performance and competitiveness of manufacturing companies depend on the availability, reliability and productivity of their production facilities. Low productivity, downtime, and poor machine performance are often linked to inadequate plant maintenance, which in turn can lead to reduced production levels, increasing costs, lost market opportunities, and lower profits. These pressures have given firms worldwide the motivation to explore and embrace proactive maintenance strategies over the traditional reactive firefighting methods. The traditional view of maintenance has shifted into an overall view that encompasses Overall Equipment Efficiency, stakeholder management and life cycle assessment. From a practical point of view this requires changes in the approach to maintenance taken by managers and changes in the actions performed within the maintenance area. Managers have to understand that maintenance is not only about repair and conservation of machines and devices, but also about actions striving for more efficient resource management and care for the safety and health of employees. The purpose of the work is to present a strategic analysis based on SWOT analysis to identify the opportunities and strengths of the maintenance process, so as to benefit from them as much as possible, as well as to identify weaknesses and threats, so that they can be eliminated or minimized.
Concept Analysis: Music Therapy.
Murrock, Carolyn J; Bekhet, Abir K
2016-01-01
Down through the ages, music has been universally valued for its therapeutic properties, based on the psychological and physiological responses it evokes in humans. However, the underlying mechanisms of these psychological and physiological responses have been poorly identified and defined. Without clarification, a concept can be misused, thereby diminishing its importance for application to nursing research and practice. The purpose of this article was to clarify the concept of music therapy based on Walker and Avant's concept analysis strategy. A review of recent nursing and health-related literature covering the years 2007-2014 was performed on the concepts of music, music therapy, preferred music, and individualized music. As a result of the search, the attributes, antecedents, and consequences of music therapy were identified, defined, and used to develop a conceptual model of music therapy. The conceptual model of music therapy provides direction for developing music interventions for nursing research and practice to be tested in various settings to improve various patient outcomes. Based on Walker and Avant's concept analysis strategy, model and contrary cases are included. Implications for future nursing research and practice using the psychological and physiological responses to music therapy are discussed. PMID:27024999
Deimling, Klaus
1985-01-01
topics. However, only a modest preliminary knowledge is needed. In the first chapter, where we introduce an important topological concept, the so-called topological degree for continuous maps from subsets of Rn into Rn, you need not know anything about functional analysis. Starting with Chapter 2, where infinite dimensions first appear, one should be familiar with the essential step of considering a sequence or a function of some sort as a point in the corresponding vector space of all such sequences or functions, whenever this abstraction is worthwhile. One should also work out the things which are proved in § 7 and accept certain basic principles of linear functional analysis quoted there for easier reference, until they are applied in later chapters. In other words, even the 'completely linear' sections which we have included for your convenience serve only as a vehicle for progress in nonlinearity. Another point that makes the text introductory is the use of an essentially uniform mathematical languag...
Zorich, Vladimir A
2016-01-01
This second English edition of a very popular two-volume work presents a thorough first course in analysis, leading from real numbers to such advanced topics as differential forms on manifolds; asymptotic methods; Fourier, Laplace, and Legendre transforms; elliptic functions; and distributions. Especially notable in this course are the clearly expressed orientation toward the natural sciences and the informal exploration of the essence and the roots of the basic concepts and theorems of calculus. Clarity of exposition is matched by a wealth of instructive exercises, problems, and fresh applications to areas seldom touched on in textbooks on real analysis. The main difference between the second and first English editions is the addition of a series of appendices to each volume. There are six of them in the first volume and five in the second. The subjects of these appendices are diverse. They are meant to be useful to both students (in mathematics and physics) and teachers, who may be motivated by different go...
Sun, Jia-Ling; Lin, Chia-Clin; Tsai, Pei-Shan; Chou, Kuei-Ru
2008-10-01
Sleep performs an essential function in humans. Insomnia is one of the most common manifestations of poor sleep. Long-term insomnia can result in somatic symptoms and the development of disease, and can even induce mental disorders. Insomnia is thus an indicator of poor health, yet no systematic analysis of the concept has been performed. The purpose of this study, therefore, was to describe the concept of insomnia. In accordance with Walker and Avant's (2005) methodology of concept analysis, this paper presents a review of the conceptual definitions, characteristics, antecedents and consequences, constructed cases, and empirical referents of insomnia. The results indicate that: (1) Insomnia's defining attributes are insufficient quality and quantity of sleep for more than one month. (2) Antecedents of insomnia include changes in life habits, physiological demands caused by changes in sleep time, and the experience of uncomfortable sensations. (3) Consequences of insomnia include poor health across physical, psychological, social, and global dimensions. (4) Many instruments can be used to assess insomnia, including questionnaires and tools for physiological measurement. Insomnia is a serious problem with various facets. An understanding of the concept of insomnia will help nurses to recognize this problem when caring for patients. PMID:18836979
[Analysis of antibiotic usage].
Balpataki, R; Balogh, J; Zelkó, R; Vincze, Z
2001-01-01
Economic analysis is founded on the assumption that resources are limited and should be used in a way that maximizes the benefit gained. Pharmacoeconomics extends these assumptions to drug treatment. A full pharmacoeconomic analysis must therefore consider two or more alternative treatments and should be founded on measurement of incremental cost, incremental efficacy, and the value of a successful outcome. Antibiotic policy based only on administrative restrictions has failed; instead, disease-specific formularies and an infectious disease consultation system are needed. Equally important are programmes that encourage the cost-conscious use of the antibiotics chosen. Methods evaluated in the literature include: streamlining from combination therapy to a single agent, early switching from parenteral to oral therapy, initiating treatment with oral agents, administering parenteral antibiotics at home from the outset of therapy, and antibiotic streamlining programmes run in partnership with infectious disease physicians. The solution is the rational and adequate use of antibiotics, based on the modern theory and practice of antibiotic policy and infection control, which cannot be carried out without the activities of experts in this field. PMID:11769090
Gebali, Fayez
2015-01-01
This textbook presents the mathematical theory and techniques necessary for analyzing and modeling high-performance global networks, such as the Internet. The three main building blocks of high-performance networks are links, switching equipment connecting the links together, and software employed at the end nodes and intermediate switches. This book provides the basic techniques for modeling and analyzing these last two components. Topics covered include, but are not limited to: Markov chains and queuing analysis, traffic modeling, interconnection networks and switch architectures and buffering strategies. · Provides techniques for modeling and analysis of network software and switching equipment; · Discusses design options used to build efficient switching equipment; · Includes many worked examples of the application of discrete-time Markov chains to communication systems; · Covers the mathematical theory and techniques necessary for ana...
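The abstract above mentions worked examples applying discrete-time Markov chains to communication systems. As a minimal illustrative sketch (not taken from the book), the following models a single switch port with a two-packet buffer as a three-state birth-death chain and finds its stationary occupancy distribution by power iteration; the per-slot arrival and service probabilities `a` and `s` are assumed values chosen only for the example.

```python
# Steady-state distribution of a discrete-time Markov chain modeling a
# single-buffer switch port. States 0..2 = number of queued packets.
# The probabilities a (arrival) and s (service) are illustrative values.

def steady_state(P, iters=10_000):
    """Power iteration: repeatedly apply pi <- pi * P until it converges.
    Valid here because the chain is irreducible and aperiodic."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

a, s = 0.3, 0.5  # per-slot arrival / service probabilities (assumed)
P = [
    [1 - a,       a,                       0.0        ],  # queue empty
    [s * (1 - a), 1 - s*(1 - a) - a*(1 - s), a*(1 - s)],  # one packet queued
    [0.0,         s,                       1 - s      ],  # buffer full
]
pi = steady_state(P)
print([round(p, 4) for p in pi])  # stationary occupancy probabilities
```

The stationary vector gives, for example, the long-run probability that the buffer is full (the blocking probability), which is the kind of performance metric queuing analysis of switching equipment is after.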
Forensic Analysis of Cathinones.
Gautam, L; Shanmuganathan, A; Cole, M D
2013-03-01
In the past decade there has been a significant increase in the popularity of synthetic cathinones in the illegal drug market. They have been readily available from Internet-based vendors as well as at "head shops" and "smart shops". The recent prominence of synthetic cathinones can be attributed to their stimulatory properties, similar to those of amphetamines. This paper provides a review of the currently popular cathinone derivatives: their history and prevalence in the illegal drug market, legislation of these drugs in various countries, pharmacology, toxicology, and metabolism studies, and the analysis of toxicological samples (blood, urine, and hair) and criminalistic samples (seized, purchased via the Internet, and synthesized). From the reviewed literature, it is concluded that products sold as "legal highs" contain not only cathinone but also cathinone derivatives, as well as adulterants such as caffeine, lidocaine, and inorganic materials. Full toxicity data are currently unavailable for this drug class, and hence more research is required on their analysis and metabolism. Moreover, clandestine chemists are constantly synthesizing new derivatives, so forensic chemists often need to synthesize and characterize these drugs to confirm the identity of seized samples. This is expensive as well as time-consuming. Therefore, there is a need for national and international collaboration among forensic chemists to overcome this difficulty. PMID:26226850
Neptune Aerocapture Systems Analysis
Lockwood, Mary Kae
2004-01-01
A Neptune Aerocapture Systems Analysis is completed to determine the feasibility, benefit and risk of an aeroshell aerocapture system for Neptune and to identify technology gaps and technology performance goals. The high fidelity systems analysis is completed by a five center NASA team and includes the following disciplines and analyses: science; mission design; aeroshell configuration screening and definition; interplanetary navigation analyses; atmosphere modeling; computational fluid dynamics for aerodynamic performance and database definition; initial stability analyses; guidance development; atmospheric flight simulation; computational fluid dynamics and radiation analyses for aeroheating environment definition; thermal protection system design, concepts and sizing; mass properties; structures; spacecraft design and packaging; and mass sensitivities. Results show that aerocapture can deliver 1.4 times more mass to Neptune orbit than an all-propulsive system for the same launch vehicle. In addition aerocapture results in a 3-4 year reduction in trip time compared to all-propulsive systems. Aerocapture is feasible and performance is adequate for the Neptune aerocapture mission. Monte Carlo simulation results show 100% successful capture for all cases including conservative assumptions on atmosphere and navigation. Enabling technologies for this mission include TPS manufacturing; and aerothermodynamic methods and validation for determining coupled 3-D convection, radiation and ablation aeroheating rates and loads, and the effects on surface recession.
Concept Analysis: Music Therapy.
Murrock, Carolyn J; Bekhet, Abir K
2016-01-01
Down through the ages, music has been universally valued for its therapeutic properties, based on the psychological and physiological responses it evokes in humans. However, the underlying mechanisms of these psychological and physiological responses have been poorly identified and defined. Without clarification, a concept can be misused, thereby diminishing its importance for application to nursing research and practice. The purpose of this article was to clarify the concept of music therapy using Walker and Avant's concept analysis strategy. A review of recent nursing and health-related literature covering the years 2007-2014 was performed on the concepts of music, music therapy, preferred music, and individualized music. As a result of the search, the attributes, antecedents, and consequences of music therapy were identified, defined, and used to develop a conceptual model of music therapy. The conceptual model provides direction for developing music interventions for nursing research and practice, to be tested in various settings to improve various patient outcomes. Based on Walker and Avant's concept analysis strategy, model and contrary cases are included. Implications for future nursing research and practice using the psychological and physiological responses to music therapy are discussed.