WorldWideScience

Sample records for analysis with Methanosarcina acetivorans

  1. The Methanosarcina barkeri genome: comparative analysis with Methanosarcina acetivorans and Methanosarcina mazei reveals extensive rearrangement within methanosarcinal genomes

    Energy Technology Data Exchange (ETDEWEB)

    Maeder, Dennis L.; Anderson, Iain; Brettin, Thomas S.; Bruce,David C.; Gilna, Paul; Han, Cliff S.; Lapidus, Alla; Metcalf, William W.; Saunders, Elizabeth; Tapia, Roxanne; Sowers, Kevin R.

    2006-05-19

    We report here a comparative analysis of the genome sequence of Methanosarcina barkeri with those of Methanosarcina acetivorans and Methanosarcina mazei. All three genomes share a conserved double origin of replication and many gene clusters. M. barkeri is distinguished by an organization that is well conserved with respect to the other Methanosarcinae in the region proximal to the origin of replication, with interspecies gene similarities as high as 95%; however, it is disordered and marked by increased transposase frequency and decreased gene synteny and gene density in the distal semi-genome. Of the 3680 open reading frames in M. barkeri, 678 had paralogs with better than 80% similarity to both M. acetivorans and M. mazei, while 128 nonhypothetical ORFs were unique (non-paralogous) amongst these species, including a complete formate dehydrogenase operon, two genes required for N-acetylmuramic acid synthesis, a 14-gene gas vesicle cluster and a bacterial P450-specific ferredoxin reductase cluster not previously observed or characterized in this genus. A cryptic 36 kbp plasmid sequence was detected in M. barkeri that contains an orc1 gene flanked by a presumptive origin of replication consisting of 38 tandem repeats of a 143 nt motif. Three-way comparison of these genomes reveals differing mechanisms for the accrual of changes. Elongation of the large M. acetivorans genome is the result of multiple gene-scale insertions and duplications uniformly distributed in that genome, while M. barkeri is characterized by localized inversions associated with the loss of gene content. In contrast, the relatively short M. mazei genome most closely approximates the ancestral organizational state.
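
    The headline numbers above amount to a simple classification of ORFs by cross-species similarity. As a purely illustrative sketch (not the authors' pipeline), the following Python snippet classifies ORFs from hypothetical two-column best-hit similarity tables, using the 80% threshold quoted in the abstract:

    import csv

    def load_best_hits(path):
        """Read a hypothetical two-column TSV of (orf_id, best percent similarity)."""
        hits = {}
        with open(path, newline="") as handle:
            for orf_id, similarity in csv.reader(handle, delimiter="\t"):
                hits[orf_id] = float(similarity)
        return hits

    def classify_orfs(barkeri_orfs, hits_acetivorans, hits_mazei, threshold=80.0):
        """Split M. barkeri ORFs into shared paralogs (>= threshold in both
        relatives) and unique ORFs (no hit in either relative)."""
        shared, unique = [], []
        for orf in barkeri_orfs:
            sim_a = hits_acetivorans.get(orf)
            sim_m = hits_mazei.get(orf)
            if (sim_a is not None and sim_m is not None
                    and sim_a >= threshold and sim_m >= threshold):
                shared.append(orf)
            elif sim_a is None and sim_m is None:
                unique.append(orf)
        return shared, unique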

  2. Reductive nitrosylation of Methanosarcina acetivorans protoglobin: A comparative study

    Energy Technology Data Exchange (ETDEWEB)

    Ascenzi, Paolo, E-mail: ascenzi@uniroma3.it [Laboratorio Interdipartimentale di Microscopia Elettronica, Università Roma Tre, Via della Vasca Navale 79, I-00146 Roma (Italy); Istituto di Biochimica delle Proteine, CNR, Via Pietro Castellino 111, I-80131 Napoli (Italy); Pesce, Alessandra [Dipartimento di Fisica, Università di Genova, I-16146 Genova (Italy); Nardini, Marco; Bolognesi, Martino [Dipartimento di Bioscienze, Università di Milano, Via Celoria 26, I-20133 Milano (Italy); Ciaccio, Chiara; Coletta, Massimo [Dipartimento di Scienze Cliniche e Medicina Traslazionale, Università di Roma Tor Vergata, Via Montpellier 1, I-00133 Roma (Italy); Consorzio Interuniversitario di Ricerca in Chimica dei Metalli nei Sistemi Biologici, Piazza Umberto I 1, I-70121 Bari (Italy); Dewilde, Sylvia [Department of Biomedical Sciences, University of Antwerp, Universiteitsplein 1, B-2610 Antwerp (Belgium)

    2013-01-25

    Highlights: ► Methanosarcina acetivorans is a strictly anaerobic, non-motile, methane-producing archaeon. ► M. acetivorans protoglobin preferentially binds O2 rather than CO. ► Reductive nitrosylation of ferric M. acetivorans protoglobin. ► Nitrosylation of ferrous M. acetivorans protoglobin. ► M. acetivorans protoglobin is a scavenger of RNS and ROS. -- Abstract: Methanosarcina acetivorans is a strictly anaerobic, non-motile, methane-producing archaeon expressing protoglobin (Pgb), which might either facilitate O2 detoxification or act as a CO sensor/supplier in methanogenesis. Unusually, M. acetivorans Pgb (MaPgb) preferentially binds O2 rather than CO and displays anticooperativity in ligand binding. Here, the kinetics and/or thermodynamics of ferric and ferrous MaPgb (MaPgb(III) and MaPgb(II), respectively) nitrosylation are reported. Data were obtained between pH 7.2 and 9.5, at 22.0 °C. Addition of NO to MaPgb(III) leads to the transient formation of MaPgb(III)–NO in equilibrium with MaPgb(II)–NO+. In turn, MaPgb(II)–NO+ is converted to MaPgb(II) by OH−-based catalysis. Then, MaPgb(II) binds NO very rapidly, leading to MaPgb(II)–NO. The rate-limiting step for reductive nitrosylation of MaPgb(III) is the OH−-mediated reduction of MaPgb(II)–NO+ to MaPgb(II). The present results highlight the potential role of MaPgb in the scavenging of reactive nitrogen and oxygen species.
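
    The mechanism described in the abstract can be summarized as a kinetic scheme (a LaTeX transcription of the text above, with no additional assumptions):

    \[
    \mathrm{MaPgb(III)} + \mathrm{NO} \rightleftharpoons \mathrm{MaPgb(III){-}NO} \rightleftharpoons \mathrm{MaPgb(II){-}NO^{+}}
    \]
    \[
    \mathrm{MaPgb(II){-}NO^{+}} \;\xrightarrow{\;\mathrm{OH^{-}},\ \text{rate-limiting}\;}\; \mathrm{MaPgb(II)}
    \;\xrightarrow{\;+\,\mathrm{NO},\ \text{fast}\;}\; \mathrm{MaPgb(II){-}NO}
    \]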

  3. The Geoglobus acetivorans genome: Fe(III) reduction, acetate utilization, autotrophic growth, and degradation of aromatic compounds in a hyperthermophilic archaeon.

    Science.gov (United States)

    Mardanov, Andrey V; Slobodkina, Galina B; Slobodkin, Alexander I; Beletsky, Alexey V; Gavrilov, Sergey N; Kublanov, Ilya V; Bonch-Osmolovskaya, Elizaveta A; Skryabin, Konstantin G; Ravin, Nikolai V

    2015-02-01

    Geoglobus acetivorans is a hyperthermophilic anaerobic euryarchaeon of the order Archaeoglobales isolated from deep-sea hydrothermal vents. A unique physiological feature of the members of the genus Geoglobus is their obligate dependence on Fe(III) reduction, which plays an important role in the geochemistry of hydrothermal systems. The features of this organism and its complete 1,860,815-bp genome sequence are described in this report. Genome analysis revealed pathways enabling oxidation of molecular hydrogen, proteinaceous substrates, fatty acids, aromatic compounds, n-alkanes, and organic acids, including acetate, through anaerobic respiration linked to Fe(III) reduction. Consistent with the inability of G. acetivorans to grow on carbohydrates, the modified Embden-Meyerhof pathway encoded by the genome is incomplete. Autotrophic CO2 fixation is enabled by the Wood-Ljungdahl pathway. Reduction of insoluble poorly crystalline Fe(III) oxide depends on the transfer of electrons from the quinone pool to multiheme c-type cytochromes exposed on the cell surface. Direct contact of the cells and Fe(III) oxide particles could be facilitated by pilus-like appendages. Genome analysis indicated the presence of metabolic pathways for anaerobic degradation of aromatic compounds and n-alkanes, although an ability of G. acetivorans to grow on these substrates was not observed in laboratory experiments. Overall, our results suggest that Geoglobus species could play an important role in microbial communities of deep-sea hydrothermal vents as lithoautotrophic producers. An additional role as decomposers would close the biogeochemical cycle of carbon through complete mineralization of various organic compounds via Fe(III) respiration.

  5. Air-adapted Methanosarcina acetivorans shows high methane production and develops resistance against oxygen stress.

    Directory of Open Access Journals (Sweden)

    Ricardo Jasso-Chávez

    Methanosarcina acetivorans, considered a strictly anaerobic archaeon, was cultured in the presence of 0.4-1% O2 (atmospheric) for at least 6 months to generate air-adapted cells; further, the biochemical mechanisms developed to deal with O2 were characterized. Methane production and protein content, as indicators of cell growth, did not change in air-adapted cells with respect to cells cultured under anoxia (control cells). In contrast, growth and methane production significantly decreased in control cells exposed for the first time to O2. Production of reactive oxygen species was 50 times lower in air-adapted cells versus control cells, suggesting enhanced anti-oxidant mechanisms that attenuated the O2 toxicity. In this regard, (i) the transcripts and activities of superoxide dismutase, catalase and peroxidase significantly increased; and (ii) the thiol-molecule (cysteine + coenzyme M-SH + sulfide) and polyphosphate contents were, respectively, 2 and 5 times higher in air-adapted cells versus anaerobic control cells. Long-term cultures (18 days) of air-adapted cells exposed to 2% O2 exhibited the ability to form biofilms. These data indicate that M. acetivorans develops multiple mechanisms to contend with O2 and the associated oxidative stress, as also suggested by genome analyses for some methanogens.

  6. Activation of methanogenesis by cadmium in the marine archaeon Methanosarcina acetivorans.

    Directory of Open Access Journals (Sweden)

    Elizabeth Lira-Silva

    Methanosarcina acetivorans was cultured in the presence of CdCl2 to determine the metal effect on cell growth and biogas production. With methanol as substrate, cell growth and methane synthesis were not altered by cadmium, whereas with acetate, cadmium slightly increased both growth and the rate of methane synthesis. In metabolically active cultures, short-term incubations (minutes) with 10 µM total cadmium increased the methanogenesis rate by 6- and 9-fold in methanol- and acetate-grown cells, respectively. Cobalt and zinc, but not copper or iron, also activated the methane production rate. Methanogenic carbonic anhydrase and acetate kinase were directly activated by cadmium. Indeed, cells cultured in 100 µM total cadmium removed 41-69% of the heavy metal from the culture and accumulated 231-539 nmol Cd/mg cell protein. This is the first report showing that (i) Cd2+ has an activating effect on methanogenesis, a biotechnologically relevant process in the biofuels field; and (ii) a methanogenic archaeon is able to remove a heavy metal from aquatic environments.

  7. Structure of the surface layer of the methanogenic archaean Methanosarcina acetivorans

    Energy Technology Data Exchange (ETDEWEB)

    Arbing, Mark A.; Chan, Sum; Shin, Annie; Phan, Tung; Ahn, Christine J.; Rohlin, Lars; Gunsalus, Robert P. (UCLA)

    2012-09-05

    Archaea have a self-assembling proteinaceous surface (S-) layer as the primary and outermost boundary of their cell envelopes. The S-layer maintains structural rigidity, protects the organism from adverse environmental elements, and yet provides access to all essential nutrients. We have determined the crystal structure of one of the two 'homologous' tandem polypeptide repeats that comprise the Methanosarcina acetivorans S-layer protein and propose a high-resolution model for a microbial S-layer. The molecular features of our hexameric S-layer model recapitulate those visualized by medium-resolution electron microscopy studies of microbial S-layers and greatly expand our molecular view of S-layer dimensions, porosity, and symmetry. The S-layer model reveals a negatively charged molecular sieve that presents both a charge and size barrier to restrict access to the cell periplasmic-like space. The β-sandwich folds of the S-layer protein are structurally homologous to eukaryotic virus envelope proteins, suggesting that Archaea and viruses have arrived at a common solution for protective envelope structures. These results provide insight into the evolutionary origins of primitive cell envelope structures, of which the S-layer is considered to be among the most primitive; it also provides a platform for the development of self-assembling nanomaterials with diverse functional and structural properties.

  8. Structural Bases for the Regulation of CO Binding in the Archaeal Protoglobin from Methanosarcina acetivorans.

    Directory of Open Access Journals (Sweden)

    Lesley Tilleman

    Studies of CO ligand binding revealed that two protein states with different ligand affinities exist in the protoglobin from Methanosarcina acetivorans (in MaPgb*, residue Cys(E20)101 was mutated to Ser). The switch between the two states occurs upon the ligation of MaPgb*. In this work, site-directed mutagenesis was used to explore the role of selected amino acids in ligand sensing and stabilization and in affecting the equilibrium between the "more reactive" and "less reactive" conformational states of MaPgb*. A combination of experimental data obtained from electronic and resonance Raman absorption spectra, CO ligand-binding kinetics, and X-ray crystallography was employed. Three amino acids were assigned a critical role: Trp(60)B9, Tyr(61)B10, and Phe(93)E11. Trp(60)B9 and Tyr(61)B10 are involved in ligand stabilization in the distal heme pocket; the strength of their interaction was reflected by the spectra of the CO-ligated MaPgb* and by the CO dissociation rate constants. In contrast, Phe(93)E11 is a key player in sensing the heme-bound ligand and promotes the rotation of the Trp(60)B9 side chain, thus favoring ligand stabilization. Although the structural bases of the fast CO binding rate constant of MaPgb* are still unclear, Trp(60)B9, Tyr(61)B10, and Phe(93)E11 play a role in regulating heme/ligand affinity.

  9. Structure of the surface layer of the methanogenic archaean Methanosarcina acetivorans.

    Science.gov (United States)

    Arbing, Mark A; Chan, Sum; Shin, Annie; Phan, Tung; Ahn, Christine J; Rohlin, Lars; Gunsalus, Robert P

    2012-07-17

    Archaea have a self-assembling proteinaceous surface (S-) layer as the primary and outermost boundary of their cell envelopes. The S-layer maintains structural rigidity, protects the organism from adverse environmental elements, and yet provides access to all essential nutrients. We have determined the crystal structure of one of the two "homologous" tandem polypeptide repeats that comprise the Methanosarcina acetivorans S-layer protein and propose a high-resolution model for a microbial S-layer. The molecular features of our hexameric S-layer model recapitulate those visualized by medium resolution electron microscopy studies of microbial S-layers and greatly expand our molecular view of S-layer dimensions, porosity, and symmetry. The S-layer model reveals a negatively charged molecular sieve that presents both a charge and size barrier to restrict access to the cell periplasmic-like space. The β-sandwich folds of the S-layer protein are structurally homologous to eukaryotic virus envelope proteins, suggesting that Archaea and viruses have arrived at a common solution for protective envelope structures. These results provide insight into the evolutionary origins of primitive cell envelope structures, of which the S-layer is considered to be among the most primitive: it also provides a platform for the development of self-assembling nanomaterials with diverse functional and structural properties.

  10. Apo and ligand-bound structures of ModA from the archaeon Methanosarcina acetivorans.

    Science.gov (United States)

    Chan, Sum; Giuroiu, Iulia; Chernishof, Irina; Sawaya, Michael R; Chiang, Janet; Gunsalus, Robert P; Arbing, Mark A; Perry, L Jeanne

    2010-03-01

    The trace-element oxyanion molybdate, which is required for the growth of many bacterial and archaeal species, is transported into the cell by an ATP-binding cassette (ABC) transporter superfamily uptake system called ModABC. ModABC consists of the ModA periplasmic solute-binding protein, the integral membrane-transport protein ModB and the ATP-binding and hydrolysis cassette protein ModC. In this study, X-ray crystal structures of ModA from the archaeon Methanosarcina acetivorans (MaModA) have been determined in the apoprotein conformation at 1.95 and 1.69 Å resolution and in the molybdate-bound conformation at 2.25 and 2.45 Å resolution. The overall domain structure of MaModA is similar to other ModA proteins in that it has a bilobal structure in which two mixed alpha/beta domains are linked by a hinge region. The apo MaModA is the first unliganded archaeal ModA structure to be determined: it exhibits a deep cleft between the two domains and confirms that upon binding ligand one domain is rotated towards the other by a hinge-bending motion, which is consistent with the 'Venus flytrap' model seen for bacterial-type periplasmic binding proteins. In contrast to the bacterial ModA structures, which have tetrahedral coordination of their metal substrates, molybdate-bound MaModA employs octahedral coordination of its substrate like other archaeal ModA proteins.

  11. Development of β-Lactamase as a Tool for Monitoring Conditional Gene Expression by a Tetracycline-Riboswitch in Methanosarcina acetivorans

    Directory of Open Access Journals (Sweden)

    Shemsi Demolli

    2014-01-01

    The use of reporter gene fusions to assess cellular processes such as protein targeting and regulation of transcription or translation is established technology in archaeal, bacterial, and eukaryal genetics. Fluorescent proteins or enzymes resulting in chromogenic substrate turnover, like β-galactosidase, have been particularly useful for microscopic and screening purposes. However, application of such methodology is of limited use for strictly anaerobic organisms due to the requirement of molecular oxygen for chromophore formation or color development. We have developed β-lactamase from Escherichia coli (encoded by bla), in conjunction with the chromogenic substrate nitrocefin, into a reporter system usable under anaerobic conditions for the methanogenic archaeon Methanosarcina acetivorans. By using a signal peptide of a putative flagellin from M. acetivorans and different catabolic promoters, we could demonstrate growth substrate-dependent secretion of β-lactamase, facilitating its use in colony screening on agar plates. Furthermore, a series of fusions comprised of a constitutive promoter and sequences encoding variants of the synthetic tetracycline-responsive riboswitch (tc-RS) was created to characterize its influence on translation initiation in M. acetivorans. One tc-RS variant resulted in more than 11-fold tetracycline-dependent regulation of bla expression, which is in the range of regulation by naturally occurring riboswitches. Thus, tc-RS fusions represent the first solely cis-active, that is, factor-independent system for controlled gene expression in Archaea.

  12. Identification of the major expressed S-layer and cell surface-layer-related proteins in the model methanogenic archaea: Methanosarcina barkeri Fusaro and Methanosarcina acetivorans C2A.

    Science.gov (United States)

    Rohlin, Lars; Leon, Deborah R; Kim, Unmi; Loo, Joseph A; Ogorzalek Loo, Rachel R; Gunsalus, Robert P

    2012-01-01

    Many archaeal cell envelopes contain a protein coat or sheath composed of one or more surface-exposed proteins. These surface layer (S-layer) proteins contribute structural integrity and protect the lipid membrane from environmental challenges. To explore the species diversity of these layers in the Methanosarcinaceae, the major S-layer protein in Methanosarcina barkeri strain Fusaro was identified using proteomics. The Mbar_A1758 gene product was present in multiple forms with apparent sizes of 130, 120, and 100 kDa, consistent with post-translational modifications including signal peptide excision and protein glycosylation. A protein with features related to the surface layer proteins found in Methanosarcina acetivorans C2A and Methanosarcina mazei Gö1 was identified in the M. barkeri genome. These data reveal a distinct conserved protein signature with features and implied cell surface architecture in the Methanosarcinaceae that is absent in other archaea. Paralogous gene expression patterns in two Methanosarcina species revealed abundant expression of a single S-layer paralog in each strain. Respective promoter elements were identified and shown to be conserved in mRNA coding and upstream untranslated regions. Prior M. acetivorans genome annotations assigned S-layer or surface-layer-associated roles to eighty genes; however, of 68 genes examined, none was significantly expressed relative to the experimentally determined S-layer gene.

  13. Analysis of the MT1/MT2 Systems Involved in the Metabolism of One-Carbon Compounds in "Methanosarcina acetivorans" C2A

    Science.gov (United States)

    Opulencia, Rina Bagsic

    2009-01-01

    Methanogens are strictly anaerobic Archaea that derive their energy for growth by reducing a limited number of substrates to methane. "Methanosarcina" spp. utilize the methylotrophic pathway to grow on methanol, methylamines and methylsulfides. These compounds enter the methylotrophic pathway as methyl-coenzyme M, the synthesis of which is…

  14. Analysis

    CERN Document Server

    Maurin, Krzysztof

    1980-01-01

    The extraordinarily rapid advances made in mathematics since World War II have resulted in analysis becoming an enormous organism spreading in all directions. Gone for good surely are the days of the great French "courses of analysis" which embodied the whole of the "analytical" knowledge of the times in three volumes, as in the classical work of Camille Jordan. Perhaps that is why present-day textbooks of analysis are disproportionately modest relative to the present state of the art. More: they have "retreated" to the state before Jordan and Goursat. In recent years the scene has been changing rapidly: Jean Dieudonné is offering us his monumental Elements d'Analyse (10 volumes) written in the spirit of the great French Cours d'Analyse. To the best of my knowledge, the present book is the only one of its size: starting from scratch, from rational numbers, to be precise, it goes on to the theory of distributions, direct integrals, analysis on complex manifolds, Kähler manifolds, the theory of sheave...

  15. Analysis

    Science.gov (United States)

    Abdelazeem, Maha; El-Sawy, El-Sawy K.; Gobashy, Mohamed M.

    2013-06-01

    The Ar Rika fault zone constitutes one of the two major parts of the NW-SE Najd fault system (NFS), which is one of the most prominent structural features located east of the center of the Arabian Shield, Saudi Arabia. By using Enhanced Thematic Mapper (ETM+) data and Principal Component Analysis (PCA), surface geological characteristics, the distribution of rock types, and the different trends of linear features and faults are determined in the study area. First- and second-order magnetic gradients of the geomagnetic field northeast of Wadi Ar Rika have been calculated in the frequency domain to map both surface and subsurface lineaments and faults. Lineaments, as deduced from previous studies, suggest an extension of the NFS beneath the cover rocks in the study area. In the present study, the integration of magnetic gradients and remote sensing analysis, which resulted in several valuable derivative maps, confirms the subsurface extension of some of the surface features. The 3D Euler deconvolution, total gradient, and tilt angle maps have been utilized to accurately determine the distribution of shear zones, the tectonic implications, and the internal structures of the terranes in the Ar Rika quadrangle in three dimensions.
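
    As an illustration of the kind of derivative-based enhancement mentioned above (a sketch, not the authors' actual processing chain), a tilt-angle map can be computed from a gridded magnetic anomaly using horizontal derivatives by finite differences and a vertical derivative applied in the frequency domain; the grid array and spacings below are assumed inputs:

    import numpy as np

    def tilt_angle(field, dx, dy):
        """Tilt angle of a gridded magnetic anomaly:
        arctan(vertical derivative / total horizontal gradient)."""
        # Horizontal derivatives by central differences (axis 0 = y, axis 1 = x).
        dTdy, dTdx = np.gradient(field, dy, dx)

        # Vertical derivative via the Fourier-domain operator |k|.
        ny, nx = field.shape
        kx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)
        ky = 2 * np.pi * np.fft.fftfreq(ny, d=dy)
        KX, KY = np.meshgrid(kx, ky)
        k = np.sqrt(KX**2 + KY**2)
        dTdz = np.real(np.fft.ifft2(np.fft.fft2(field) * k))

        horizontal = np.hypot(dTdx, dTdy)
        return np.arctan2(dTdz, horizontal)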

  16. Needs analysis

    Institute of Scientific and Technical Information of China (English)

    黄静雅

    2009-01-01

    Though needs analysis has been seen as one of the fundamental defining characteristics of ESP, it is not unknown in GPE. It is a salient stage, especially for task selection and syllabus evaluation. In this essay, the meaning of needs, the importance of needs analysis, and the way to implement needs analysis are discussed.

  17. Sensitivity analysis

    Science.gov (United States)

    Sensitivity analysis determines the effectiveness of antibiotics against microorganisms (germs) ...

  18. CSF analysis

    Science.gov (United States)

    Cerebrospinal fluid analysis ... Analysis of CSF can help detect certain conditions and diseases. All of the following can be, but ... An abnormal CSF analysis result may be due to many different causes, ... Encephalitis (such as West Nile and Eastern Equine) Hepatic ...

  19. Grey analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cable, G.D.

    1996-12-01

    Grey logic is not another name for fuzzy logic. Grey logic--also called grey analysis or grey system theory--is a new technology, a group of techniques for system analysis and modeling. Like fuzzy logic, grey logic is useful in situations with incomplete and uncertain information. Grey analysis is particularly applicable in instances with very limited data and in cases with little system knowledge or understanding. In this paper, a summary of the basic concepts of grey analysis is provided, with descriptions of its application to several classes of problems. Calculation methods are provided for grey relation analysis, and for modeling and prediction using grey methods.
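
    For illustration, the core computation of grey relation analysis is compact enough to sketch directly; the min-max normalization and the distinguishing coefficient zeta = 0.5 below are common conventions, not necessarily those used in the paper:

    import numpy as np

    def grey_relational_grades(reference, comparisons, zeta=0.5):
        """Grey relational grade of each comparison series against a reference.

        reference:   1-D array of length n
        comparisons: 2-D array of shape (m, n)
        """
        ref = np.asarray(reference, dtype=float)
        series = np.asarray(comparisons, dtype=float)

        def normalize(x):
            # Scale a series to [0, 1]; constant series map to zeros.
            span = x.max() - x.min()
            return (x - x.min()) / span if span else np.zeros_like(x)

        ref_n = normalize(ref)
        series_n = np.apply_along_axis(normalize, 1, series)

        # Absolute differences and grey relational coefficients.
        delta = np.abs(series_n - ref_n)
        d_min, d_max = delta.min(), delta.max()
        coefficients = (d_min + zeta * d_max) / (delta + zeta * d_max)

        # Grade = mean coefficient over each series; higher = closer to the reference.
        return coefficients.mean(axis=1)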

  20. Fourier analysis

    CERN Document Server

    Stade, Eric

    2005-01-01

    A reader-friendly, systematic introduction to Fourier analysis Rich in both theory and application, Fourier Analysis presents a unique and thorough approach to a key topic in advanced calculus. This pioneering resource tells the full story of Fourier analysis, including its history and its impact on the development of modern mathematical analysis, and also discusses essential concepts and today's applications. Written at a rigorous level, yet in an engaging style that does not dilute the material, Fourier Analysis brings two profound aspects of the discipline to the forefront: the wealth of ap

  1. Panel Analysis

    DEFF Research Database (Denmark)

    Brænder, Morten; Andersen, Lotte Bøgh

    2014-01-01

    Based on our 2013 article, "Does Deployment to War Affect Soldiers' Public Service Motivation – A Panel Study of Soldiers Before and After their Service in Afghanistan", we present Panel Analysis as a methodological discipline. Panels consist of multiple units of analysis, observed at two or more...
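
    One workhorse of panel analysis (not necessarily the estimator used in the cited article) is the fixed-effects, or within, estimator: each variable is demeaned within its unit before fitting, so stable unit-level differences drop out. A minimal sketch for a single regressor, assuming long-format arrays of (unit, y, x):

    import numpy as np

    def within_estimator(units, y, x):
        """Fixed-effects (within) slope estimate for a single regressor.

        units: 1-D array of unit identifiers (one row per unit-period)
        y, x:  1-D response and regressor arrays of the same length
        """
        units = np.asarray(units)
        y = np.asarray(y, dtype=float)
        x = np.asarray(x, dtype=float)

        y_dm = np.empty_like(y)
        x_dm = np.empty_like(x)
        for u in np.unique(units):
            mask = units == u
            y_dm[mask] = y[mask] - y[mask].mean()   # demean within unit
            x_dm[mask] = x[mask] - x[mask].mean()

        return np.sum(x_dm * y_dm) / np.sum(x_dm ** 2)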

  2. Applied analysis

    CERN Document Server

    Lanczos, Cornelius

    2010-01-01

    Basic text for graduate and advanced undergraduate deals with search for roots of algebraic equations encountered in vibration and flutter problems and in those of static and dynamic stability. Other topics devoted to matrices and eigenvalue problems, large-scale linear systems, harmonic analysis and data analysis, more.

  3. Scenario analysis

    NARCIS (Netherlands)

    Li, L.; Braat, L.C.; Lei, G.; Arets, E.J.M.M.; Liu, J.; Jiang, L.; Fan, Z.; Liu, W.; He, H.; Sun, X.

    2014-01-01

    This chapter presents the results of the scenario analysis of China’s ecosystems focusing on forest, grassland, and wetland ecosystems. The analysis was undertaken using Conversion of Land Use Change and its Effects (CLUE) modeling and an ecosystem service matrix (as explained below) complemented by

  4. Recursive analysis

    CERN Document Server

    Goodstein, R L

    2010-01-01

    Recursive analysis develops natural number computations into a framework appropriate for real numbers. This text is based upon primary recursive arithmetic and presents a unique combination of classical analysis and intuitional analysis. Written by a master in the field, it is suitable for graduate students of mathematics and computer science and can be read without a detailed knowledge of recursive arithmetic.Introductory chapters on recursive convergence and recursive and relative continuity are succeeded by explorations of recursive and relative differentiability, the relative integral, and

  5. Numerical analysis

    CERN Document Server

    Khabaza, I M

    1960-01-01

    Numerical Analysis is an elementary introduction to numerical analysis, its applications, limitations, and pitfalls. Methods suitable for digital computers are emphasized, but some desk computations are also described. Topics covered range from the use of digital computers in numerical work to errors in computations using desk machines, finite difference methods, and numerical solution of ordinary differential equations. This book is comprised of eight chapters and begins with an overview of the importance of digital computers in numerical analysis, followed by a discussion on errors in comput

  6. Conversation Analysis.

    Science.gov (United States)

    Schiffrin, Deborah

    1990-01-01

    Summarizes the current state of research in conversation analysis, referring primarily to six different perspectives that have developed from the philosophy, sociology, anthropology, and linguistics disciplines. These include pragmatics; speech act theory; interactional sociolinguistics; ethnomethodology; ethnography of communication; and…

  7. Surface analysis.

    Science.gov (United States)

    Kinsella, T

    2006-10-01

    Surface analysis techniques are important tools to use in the verification of surface cleanliness and medical device functionality. How these techniques can be employed and some example applications are described.

  8. Biorefinery Analysis

    Energy Technology Data Exchange (ETDEWEB)

    2016-06-01

    Fact sheet summarizing NREL's techno-economic analysis and life-cycle assessment capabilities to connect research with future commercial process integration, a critical step in the scale-up of biomass conversion technologies.

  9. CSF Analysis

    Science.gov (United States)

    ... a chronic disease, such as multiple sclerosis or Alzheimer disease. Depending on a person's history, a healthcare provider may order CSF analysis when some combination of the following signs and symptoms appear, especially when accompanied by flu-like symptoms ...

  10. Analysis II

    CERN Document Server

    Tao, Terence

    2016-01-01

    This is part two of a two-volume book on real analysis and is intended for senior undergraduate students of mathematics who have already been exposed to calculus. The emphasis is on rigour and foundations of analysis. Beginning with the construction of the number systems and set theory, the book discusses the basics of analysis (limits, series, continuity, differentiation, Riemann integration), through to power series, several variable calculus and Fourier analysis, and then finally the Lebesgue integral. These are almost entirely set in the concrete setting of the real line and Euclidean spaces, although there is some material on abstract metric and topological spaces. The book also has appendices on mathematical logic and the decimal system. The entire text (omitting some less central topics) can be taught in two quarters of 25–30 lectures each. The course material is deeply intertwined with the exercises, as it is intended that the student actively learn the material (and practice thinking and writing ri...

  11. Analysis I

    CERN Document Server

    Tao, Terence

    2016-01-01

    This is part one of a two-volume book on real analysis and is intended for senior undergraduate students of mathematics who have already been exposed to calculus. The emphasis is on rigour and foundations of analysis. Beginning with the construction of the number systems and set theory, the book discusses the basics of analysis (limits, series, continuity, differentiation, Riemann integration), through to power series, several variable calculus and Fourier analysis, and then finally the Lebesgue integral. These are almost entirely set in the concrete setting of the real line and Euclidean spaces, although there is some material on abstract metric and topological spaces. The book also has appendices on mathematical logic and the decimal system. The entire text (omitting some less central topics) can be taught in two quarters of 25–30 lectures each. The course material is deeply intertwined with the exercises, as it is intended that the student actively learn the material (and practice thinking and writing ri...

  12. Numerical analysis

    CERN Document Server

    Scott, L Ridgway

    2011-01-01

    Computational science is fundamentally changing how technological questions are addressed. The design of aircraft, automobiles, and even racing sailboats is now done by computational simulation. The mathematical foundation of this new approach is numerical analysis, which studies algorithms for computing expressions defined with real numbers. Emphasizing the theory behind the computation, this book provides a rigorous and self-contained introduction to numerical analysis and presents the advanced mathematics that underpin industrial software, including complete details that are missing from m

  13. Link Analysis

    Science.gov (United States)

    Donoho, Steve

    Link analysis is a collection of techniques that operate on data that can be represented as nodes and links. This chapter surveys a variety of techniques including subgraph matching, finding cliques and K-plexes, maximizing spread of influence, visualization, finding hubs and authorities, and combining with traditional techniques (classification, clustering, etc.). It also surveys applications including social network analysis, viral marketing, Internet search, fraud detection, and crime prevention.
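
    As a concrete example of one technique named above, hub and authority scores (the HITS iteration) for a small directed graph can be computed with a few lines of power iteration; the adjacency-matrix input below is an assumption made for illustration:

    import numpy as np

    def hits(adjacency, iterations=100):
        """Hub and authority scores via the HITS power iteration.

        adjacency[i, j] = 1 if node i links to node j.
        """
        A = np.asarray(adjacency, dtype=float)
        n = A.shape[0]
        hubs = np.ones(n)
        authorities = np.ones(n)
        for _ in range(iterations):
            authorities = A.T @ hubs          # pointed to by good hubs
            authorities /= np.linalg.norm(authorities)
            hubs = A @ authorities            # pointing to good authorities
            hubs /= np.linalg.norm(hubs)
        return hubs, authorities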

  14. Business analysis

    CERN Document Server

    Paul, Debra; Cadle, James

    2010-01-01

    Throughout the business world, public, private and not-for-profit organisations face huge challenges. Business analysts must respond by developing practical, creative and financially sound solutions. This excellent guide gives them the necessary tools. It supports everyone wanting to achieve university and industry qualifications in business analysis and information systems. It is particularly beneficial for those studying for ISEB qualifications in Business Analysis. Some important additions since the first edition (2006): the inclusion of new techniques such as Ishikawa diagrams and spaghe

  15. Factor analysis

    CERN Document Server

    Gorsuch, Richard L

    2013-01-01

    Comprehensive and comprehensible, this classic covers the basic and advanced topics essential for using factor analysis as a scientific tool in psychology, education, sociology, and related areas. Emphasizing the usefulness of the techniques, it presents sufficient mathematical background for understanding and sufficient discussion of applications for effective use. This includes not only theory but also the empirical evaluations of the importance of mathematical distinctions for applied scientific analysis.

  16. Numerical analysis

    CERN Document Server

    Jacques, Ian

    1987-01-01

    This book is primarily intended for undergraduates in mathematics, the physical sciences and engineering. It introduces students to most of the techniques forming the core component of courses in numerical analysis. The text is divided into eight chapters which are largely self-contained. However, with a subject as intricately woven as mathematics, there is inevitably some interdependence between them. The level of difficulty varies and, although emphasis is firmly placed on the methods themselves rather than their analysis, we have not hesitated to include theoretical material when we consider it to be sufficiently interesting. However, it should be possible to omit those parts that do seem daunting while still being able to follow the worked examples and to tackle the exercises accompanying each section. Familiarity with the basic results of analysis and linear algebra is assumed since these are normally taught in first courses on mathematical methods. For reference purposes a list of theorems used in the t...

  17. Numerical analysis

    CERN Document Server

    Rao, G Shanker

    2006-01-01

    About the Book: This book provides an introduction to Numerical Analysis for the students of Mathematics and Engineering. The book is designed in accordance with the common core syllabus of Numerical Analysis of Universities of Andhra Pradesh and also the syllabus prescribed in most of the Indian Universities. Salient features: Approximate and Numerical Solutions of Algebraic and Transcendental Equation Interpolation of Functions Numerical Differentiation and Integration and Numerical Solution of Ordinary Differential Equations The last three chapters deal with Curve Fitting, Eigen Values and Eigen Vectors of a Matrix and Regression Analysis. Each chapter is supplemented with a number of worked-out examples as well as number of problems to be solved by the students. This would help in the better understanding of the subject. Contents: Errors Solution of Algebraic and Transcendental Equations Finite Differences Interpolation with Equal Intervals Interpolation with Unequal Int...

  18. Real analysis

    CERN Document Server

    Loeb, Peter A

    2016-01-01

    This textbook is designed for a year-long course in real analysis taken by beginning graduate and advanced undergraduate students in mathematics and other areas such as statistics, engineering, and economics. Written by one of the leading scholars in the field, it elegantly explores the core concepts in real analysis and introduces new, accessible methods for both students and instructors. The first half of the book develops both Lebesgue measure and, with essentially no additional work for the student, general Borel measures for the real line. Notation indicates when a result holds only for Lebesgue measure. Differentiation and absolute continuity are presented using a local maximal function, resulting in an exposition that is both simpler and more general than the traditional approach. The second half deals with general measures and functional analysis, including Hilbert spaces, Fourier series, and the Riesz representation theorem for positive linear functionals on continuous functions with compact support....

  19. Real analysis

    CERN Document Server

    DiBenedetto, Emmanuele

    2016-01-01

    The second edition of this classic textbook presents a rigorous and self-contained introduction to real analysis with the goal of providing a solid foundation for future coursework and research in applied mathematics. Written in a clear and concise style, it covers all of the necessary subjects as well as those often absent from standard introductory texts. Each chapter features a “Problems and Complements” section that includes additional material that briefly expands on certain topics within the chapter and numerous exercises for practicing the key concepts. The first eight chapters explore all of the basic topics for training in real analysis, beginning with a review of countable sets before moving on to detailed discussions of measure theory, Lebesgue integration, Banach spaces, functional analysis, and weakly differentiable functions. More topical applications are discussed in the remaining chapters, such as maximal functions, functions of bounded mean oscillation, rearrangements, potential theory, a...

  20. Image Analysis

    DEFF Research Database (Denmark)

    The 19th Scandinavian Conference on Image Analysis was held at the IT University of Copenhagen in Denmark during June 15-17, 2015. The SCIA conference series has been an ongoing biannual event for more than 30 years and over the years it has nurtured a world-class regional research and development... The topics of the accepted papers range from novel applications of vision systems, pattern recognition, machine learning, feature extraction, segmentation, and 3D vision, to medical and biomedical image analysis. The papers originate from all the Scandinavian countries and several other European countries...

  1. Cluster analysis

    CERN Document Server

    Everitt, Brian S; Leese, Morven; Stahl, Daniel

    2011-01-01

    Cluster analysis comprises a range of methods for classifying multivariate data into subgroups. By organizing multivariate data into such subgroups, clustering can help reveal the characteristics of any structure or patterns present. These techniques have proven useful in a wide range of areas such as medicine, psychology, market research and bioinformatics.This fifth edition of the highly successful Cluster Analysis includes coverage of the latest developments in the field and a new chapter dealing with finite mixture models for structured data.Real life examples are used throughout to demons
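
    As a minimal, illustrative example of partitioning multivariate data into subgroups (the book itself covers a much wider range of methods), a plain k-means clustering can be written as follows:

    import numpy as np

    def kmeans(points, k, iterations=100, seed=0):
        """Plain k-means: returns (centroids, labels) for an (n, d) array."""
        rng = np.random.default_rng(seed)
        X = np.asarray(points, dtype=float)
        centroids = X[rng.choice(len(X), size=k, replace=False)]
        for _ in range(iterations):
            # Assign each point to its nearest centroid.
            distances = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
            labels = distances.argmin(axis=1)
            # Recompute centroids; keep the old one if a cluster empties.
            new_centroids = np.array([
                X[labels == j].mean(axis=0) if np.any(labels == j) else centroids[j]
                for j in range(k)
            ])
            if np.allclose(new_centroids, centroids):
                break
            centroids = new_centroids
        return centroids, labels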

  2. Elementary analysis

    CERN Document Server

    Snell, K S; Langford, W J; Maxwell, E A

    1966-01-01

    Elementary Analysis, Volume 2 introduces several of the ideas of modern mathematics in a casual manner and provides the practical experience in algebraic and analytic operations that lays a sound foundation of basic skills. This book focuses on the nature of number, algebraic and logical structure, groups, rings, fields, vector spaces, matrices, sequences, limits, functions and inverse functions, complex numbers, and probability. The logical structure of analysis given through the treatment of differentiation and integration, with applications to the trigonometric and logarithmic functions, is

  3. Nonlinear analysis

    CERN Document Server

    Nanda, Sudarsan

    2013-01-01

    "Nonlinear analysis" presents recent developments in calculus in Banach space, convex sets, convex functions, best approximation, fixed point theorems, nonlinear operators, variational inequality, complementary problem and semi-inner-product spaces. Nonlinear Analysis has become important and useful in the present days because many real world problems are nonlinear, nonconvex and nonsmooth in nature. Although basic concepts have been presented here but many results presented have not appeared in any book till now. The book could be used as a text for graduate students and also it will be useful for researchers working in this field.

  4. Piping Analysis

    Science.gov (United States)

    1980-01-01

    Burns & McDonnell provides architectural and engineering services in the planning, design and construction of a wide range of projects all over the world. In design analysis, the company regularly uses COSMIC computer programs. In computer testing of a power plant's piping design, the company uses the Pipe Flexibility Analysis Program (MEL-21) to analyze stresses due to weight, temperature, and pressure in proposed piping systems. Individual flow rates are entered into the computer, which then calculates the pressure drop across each component; if needed, design corrections or adjustments can be made and rechecked.

  5. Numerical analysis

    CERN Document Server

    Brezinski, C

    2012-01-01

    Numerical analysis has witnessed many significant developments in the 20th century. This book brings together 16 papers dealing with historical developments, survey papers and papers on recent trends in selected areas of numerical analysis, such as: approximation and interpolation, solution of linear systems and eigenvalue problems, iterative methods, quadrature rules, and the solution of ordinary differential, partial differential and integral equations. The papers are reprinted from the 7-volume project of the Journal of Computational and Applied Mathematics.

  6. Genre Analysis

    Institute of Scientific and Technical Information of China (English)

    张楠

    2015-01-01

    Swales' CARS model is a robust method of genre analysis. The model was first designed for article introductions; its application covers a wide range from academic to non-academic texts. The present paper is entirely a theoretical framework. Finally, future research is proposed.

  8. Hydroeconomic analysis

    DEFF Research Database (Denmark)

    Bauer-Gottwein, Peter; Riegels, Niels; Pulido-Velazquez, Manuel

    2017-01-01

    Hydroeconomic analysis and modeling provides a consistent and quantitative framework to assess the links between water resources systems and economic activities related to water use, simultaneously modeling water supply and water demand. It supports water managers and decision makers in assessing...

  9. Survival Analysis

    CERN Document Server

    Miller, Rupert G

    2011-01-01

    A concise summary of the statistical methods used in the analysis of survival data with censoring. Emphasizes recently developed nonparametric techniques. Outlines methods in detail and illustrates them with actual data. Discusses the theory behind each method. Includes numerous worked problems and numerical exercises.
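
    One of the nonparametric techniques for censored data that texts of this kind cover is the Kaplan-Meier estimator; the sketch below is illustrative and not taken from the book:

    import numpy as np

    def kaplan_meier(times, events):
        """Kaplan-Meier survival curve.

        times:  array of follow-up times
        events: 1 if the event was observed at that time, 0 if censored
        Returns (event_times, survival_probabilities).
        """
        times = np.asarray(times, dtype=float)
        events = np.asarray(events, dtype=int)

        event_times = np.unique(times[events == 1])
        survival = []
        s = 1.0
        for t in event_times:
            at_risk = np.sum(times >= t)              # still under observation at t
            died = np.sum((times == t) & (events == 1))
            s *= 1.0 - died / at_risk
            survival.append(s)
        return event_times, np.array(survival)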

  10. Regression Analysis

    CERN Document Server

    Freund, Rudolf J; Sa, Ping

    2006-01-01

    The book provides complete coverage of the classical methods of statistical analysis. It is designed to give students an understanding of the purpose of statistical analyses, to allow the student to determine, at least to some degree, the correct type of statistical analyses to be performed in a given situation, and to have some appreciation of what constitutes good experimental design.
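
    As a minimal illustration of the classical methods such a text covers, an ordinary least-squares fit can be obtained directly from NumPy's least-squares solver (illustrative only):

    import numpy as np

    def ols_fit(X, y):
        """Least-squares coefficients (including intercept) for y ~ X."""
        X = np.asarray(X, dtype=float)
        y = np.asarray(y, dtype=float)
        design = np.column_stack([np.ones(len(X)), X])   # prepend intercept column
        coef, *_ = np.linalg.lstsq(design, y, rcond=None)
        return coef   # coef[0] is the intercept, coef[1:] the slopes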

  11. Learner Analysis

    Institute of Scientific and Technical Information of China (English)

    Song Xuexia

    2005-01-01

    In the past, more attempts were made to explore ways for teachers to teach English than ways for learners to learn the language. Learner analysis is to analyze "what the learner is", including age, attitude, motivation, intelligence, aptitude, personality, etc., with the purpose of realizing the transition from "teacher-centered" to "learner-oriented" instruction.

  12. Poetic Analysis

    DEFF Research Database (Denmark)

    Nielsen, Kirsten

    2010-01-01

    The first part of this article presents the characteristics of Hebrew poetry: features associated with rhythm and phonology, grammatical features, structural elements like parallelism, and imagery and intertextuality. The second part consists of an analysis of Psalm 121. It is argued that assonan...

  13. Inclusion Analysis

    CERN Document Server

    Colver, David

    2010-01-01

    Inclusion analysis is the name given by Operis to a black box testing technique that it has found to make the checking of key financial ratios calculated by spreadsheet models quicker, easier and more likely to find omission errors than code inspection.

  14. Wavelet analysis

    CERN Document Server

    Cheng, Lizhi; Luo, Yong; Chen, Bo

    2014-01-01

    This book can be divided into two parts, i.e. fundamental wavelet transform theory and methods, and some important applications of wavelet transform. In the first part, as preliminary knowledge, Fourier analysis, inner product spaces, the characteristics of Haar functions, and concepts of multi-resolution analysis are introduced, followed by a description of how to construct wavelet functions, both multi-band and multiwavelets, and finally the design of integer wavelets via lifting schemes and its application to integer transform algorithms. In the second part, many applications are discussed in the field of image and signal processing by introducing other wavelet variants such as complex wavelets, ridgelets, and curvelets. Important application examples include image compression, image denoising/restoration, image enhancement, digital watermarking, numerical solution of partial differential equations, and solving ill-conditioned Toeplitz systems. The book is intended for senior undergraduate stude...
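
    As a small illustration of the Haar/multi-resolution material described for the first part of the book (a sketch, not the book's own construction), one level of the orthonormal Haar transform and its inverse can be written as:

    import numpy as np

    def haar_dwt_level(signal):
        """One level of the orthonormal Haar transform.

        Returns (approximation, detail) coefficients; the input length
        is assumed to be even.
        """
        x = np.asarray(signal, dtype=float)
        even, odd = x[0::2], x[1::2]
        approximation = (even + odd) / np.sqrt(2.0)
        detail = (even - odd) / np.sqrt(2.0)
        return approximation, detail

    def haar_idwt_level(approximation, detail):
        """Inverse of haar_dwt_level."""
        even = (approximation + detail) / np.sqrt(2.0)
        odd = (approximation - detail) / np.sqrt(2.0)
        out = np.empty(2 * len(approximation))
        out[0::2], out[1::2] = even, odd
        return out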

  15. Complex analysis

    CERN Document Server

    Freitag, Eberhard

    2005-01-01

    The guiding principle of this presentation of "Classical Complex Analysis" is to proceed as quickly as possible to the central results while using a small number of notions and concepts from other fields. Thus the prerequisites for understanding this book are minimal; only elementary facts of calculus and algebra are required. The first four chapters cover the essential core of complex analysis: - differentiation in C (including elementary facts about conformal mappings) - integration in C (including complex line integrals, Cauchy's Integral Theorem, and the Integral Formulas) - sequences and series of analytic functions, (isolated) singularities, Laurent series, calculus of residues - construction of analytic functions: the gamma function, Weierstrass' Factorization Theorem, Mittag-Leffler Partial Fraction Decomposition, and - as a particular highlight - the Riemann Mapping Theorem, which characterizes the simply connected domains in C. Further topics included are: - the theory of elliptic functions based on...
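
    For reference, two of the central results named above can be stated in their standard form for a function holomorphic on and inside a positively oriented simple closed contour gamma (standard statements, not quoted from the book):

    \[
    f(a) \;=\; \frac{1}{2\pi i}\oint_{\gamma}\frac{f(z)}{z-a}\,dz ,
    \qquad
    \oint_{\gamma} g(z)\,dz \;=\; 2\pi i \sum_{k}\operatorname{Res}\,(g, z_k),
    \]

    where the second sum runs over the isolated singularities z_k of g enclosed by gamma.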

  16. CMB Analysis

    CERN Document Server

    Bond, J R; Crittenden, Robert G.

    2001-01-01

    We describe the subject of Cosmic Microwave Background (CMB) analysis - its past, present and future. The theory of Gaussian primary anisotropies, those arising from linear physics operating in the early Universe, is in reasonably good shape so the focus has shifted to the statistical pipeline which confronts the data with the theory: mapping, filtering, comparing, cleaning, compressing, forecasting, estimating. There have been many algorithmic advances in the analysis pipeline in recent years, but still more are needed for the forecasts of high precision cosmic parameter estimation to be realized. For secondary anisotropies, those arising once nonlinearity develops, the computational state of the art currently needs effort in all the areas: the Sunyaev-Zeldovich effect, inhomogeneous reionization, gravitational lensing, the Rees-Sciama effect, dusty galaxies. We use the Sunyaev-Zeldovich example to illustrate the issues. The direct interface with observations for these non-Gaussian signals is much more compl...

  17. Elastodynamic Analysis

    DEFF Research Database (Denmark)

    Andersen, Lars

    This book contains the lecture notes for the 9th semester course on elastodynamics. The first chapter gives an overview of the basic theory of stress waves propagating in viscoelastic media. In particular, the effect of surfaces and interfaces in a viscoelastic material is studied, and different ... Thus, in Chapter 3, an alternative semi-analytic method is derived, which may be applied for the analysis of layered half-spaces subject to moving or stationary loads....

  18. Mathematical analysis

    Science.gov (United States)

    Donaldson, J. A.

    1984-01-01

    Simple continuum models used in the design, analysis, and control of large space structures are examined. Particular emphasis is placed on boundary value problems associated with the Load Correction Method and control problems involving partial differential equations for the large space structure models. Partial differential equations will be used to model a large space structure, base the design of an optimal controller on this model, approximate the resulting optimal control model, and compare the results with data from other methods.

  19. Chromosome Analysis

    Science.gov (United States)

    1998-01-01

    Perceptive Scientific Instruments, Inc., provides the foundation for the Powergene line of chromosome analysis and molecular genetic instrumentation. This product employs image processing technology from NASA's Jet Propulsion Laboratory and image enhancement techniques from Johnson Space Center. Originally developed to send pictures back to earth from space probes, digital imaging techniques have been developed and refined for use in a variety of medical applications, including diagnosis of disease.

  20. Spectral Analysis

    CERN Document Server

    Cecconi, Jaures

    2011-01-01

    G. Bottaro: Quelques résultats d'analyse spectrale pour des opérateurs différentiels à coefficients constants sur des domaines non bornés.- L. Gårding: Eigenfunction expansions.- C. Goulaouic: Valeurs propres de problèmes aux limites irréguliers: applications.- G. Grubb: Essential spectra of elliptic systems on compact manifolds.- J.Cl. Guillot: Quelques résultats récents en Scattering.- N. Schechter: Theory of perturbations of partial differential operators.- C.H. Wilcox: Spectral analysis of the Laplacian with a discontinuous coefficient.

  1. Paleoenvironmental analysis.

    Science.gov (United States)

    Arobba, Daniele; Boscato, Paolo; Boschian, Giovanni; Falgueres, Christophe; Fasani, Leone; Peretto, Carlo; Sala, Benedetto; Hohenstein, Ursula Thun; Tozzi, Carlo

    2004-06-01

    New analyses have been carried out concerning the palaeoenvironmental reconstruction of some Italian sites dating from the Middle Pleistocene to the Bronze Age. Different aspects have been investigated at each site, considering the data collected. The following sites have been analyzed: Isernia La Pineta (Molise); Visogliano and Caverna degli Orsi (Trieste); Toirano Caves (Liguria); Grotta Paglicci (Gargano); Riparo del Molare (Salerno); Grotta del Cavallo (Lecce); Castellaro Lagusello (Monzambano, Mantova).

  2. Economic analysis

    Energy Technology Data Exchange (ETDEWEB)

    None

    1980-06-01

    The Energy Policy and Conservation Act (EPCA) mandated that minimum energy efficiency standards be established for classes of refrigerators and refrigerator-freezers, freezers, clothes dryers, water heaters, room air conditioners, home heating equipment, kitchen ranges and ovens, central air conditioners, and furnaces. EPCA requires that standards be designed to achieve the maximum improvement in energy efficiency that is technologically feasible and economically justified. Following the introductory chapter, Chapter Two describes the methodology used in the economic analysis and its relationship to legislative criteria for consumer product efficiency assessment; details how the CPES Value Model systematically compared and evaluated the economic impacts of regulation on the consumer, manufacturer and Nation. Chapter Three briefly displays the results of the analysis and lists the proposed performance standards by product class. Chapter Four describes the reasons for developing a baseline forecast, characterizes the baseline scenario from which regulatory impacts were calculated and summarizes the primary models, data sources and assumptions used in the baseline formulations. Chapter Five summarizes the methodology used to calculate regulatory impacts; describes the impacts of energy performance standards relative to the baseline discussed in Chapter Four. Also discussed are regional standards and other program alternatives to performance standards. Chapter Six describes the procedure for balancing consumer, manufacturer, and national impacts to select standard levels. Details of models and data bases used in the analysis are included in Appendices A through K.

  3. Frontier Analysis

    DEFF Research Database (Denmark)

    Assaf, A. George; Josiassen, Alexander

    2016-01-01

    This article presents a comprehensive review of frontier studies in the tourism literature. We discuss the main advantages and disadvantages of the various frontier approaches, in particular, the nonparametric and parametric frontier approaches. The study further differentiates between micro and macro applications of these approaches, summarizing and critically reviewing the characteristics of the existing studies. We also conduct a meta-analysis to create an overview of the efficiency results of frontier applications. This allows for an investigation of the impact of frontier methodology on tourism research. The present review also highlights the limitations of existing studies and suggests an agenda for future research.

  4. Vector analysis

    CERN Document Server

    Brand, Louis

    2006-01-01

    The use of vectors not only simplifies treatments of differential geometry, mechanics, hydrodynamics, and electrodynamics, but also makes mathematical and physical concepts more tangible and easy to grasp. This text for undergraduates was designed as a short introductory course to give students the tools of vector algebra and calculus, as well as a brief glimpse into these subjects' manifold applications. The applications are developed to the extent that the uses of the potential function, both scalar and vector, are fully illustrated. Moreover, the basic postulates of vector analysis are brou

  5. Grammar Analysis

    Institute of Scientific and Technical Information of China (English)

    PEI Yuan-yuan

    2013-01-01

    With the development of global communication all over the world, English has become an international language. Furthermore, speaking as a skill in English language learning is becoming more and more important. As is known, spoken English has specific features, so data analysis is helpful for gaining an explicit understanding of these features. It is also useful for language teachers who want to develop their teaching to satisfy learners' needs. Grammar and phonology are two remarkable aspects and specific features of spoken language; therefore, examining elements of these aspects seems helpful for teaching spoken English.

  6. Vector analysis

    CERN Document Server

    Newell, Homer E

    2006-01-01

    When employed with skill and understanding, vector analysis can be a practical and powerful tool. This text develops the algebra and calculus of vectors in a manner useful to physicists and engineers. Numerous exercises (with answers) not only provide practice in manipulation but also help establish students' physical and geometric intuition in regard to vectors and vector concepts. Part I, the basic portion of the text, consists of a thorough treatment of vector algebra and the vector calculus. Part II presents the illustrative matter, demonstrating applications to kinematics, mechanics, and e

  7. Sequential analysis

    CERN Document Server

    Wald, Abraham

    2013-01-01

    In 1943, while in charge of Columbia University's Statistical Research Group, Abraham Wald devised Sequential Design, an innovative statistical inference system. Because the decision to terminate an experiment is not predetermined, sequential analysis can arrive at a decision much sooner and with substantially fewer observations than equally reliable test procedures based on a predetermined number of observations. The system's immense value was immediately recognized, and its use was restricted to wartime research and procedures. In 1945, it was released to the public and has since revolutio
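
    As a minimal illustration of the idea described in this record, the sketch below implements Wald's sequential probability ratio test for a Bernoulli parameter in Python; the hypothesized rates, error levels, and simulated data are illustrative choices and are not taken from the book.

      import math
      import random

      def sprt_bernoulli(observations, p0=0.5, p1=0.7, alpha=0.05, beta=0.05):
          """Wald's sequential probability ratio test for a Bernoulli parameter.

          Returns ("H0", "H1" or "undecided") and the number of observations used.
          """
          upper = math.log((1 - beta) / alpha)   # accept H1 when the LLR exceeds this
          lower = math.log(beta / (1 - alpha))   # accept H0 when the LLR falls below this
          llr = 0.0
          for n, x in enumerate(observations, start=1):
              # log-likelihood ratio contribution of a single 0/1 observation
              llr += math.log((p1 if x else 1 - p1) / (p0 if x else 1 - p0))
              if llr >= upper:
                  return "H1", n
              if llr <= lower:
                  return "H0", n
          return "undecided", len(observations)

      random.seed(1)
      data = [1 if random.random() < 0.7 else 0 for _ in range(1000)]
      print(sprt_bernoulli(data))   # typically decides for H1 after a few dozen observations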

  8. Understanding analysis

    CERN Document Server

    Abbott, Stephen

    2015-01-01

    This lively introductory text exposes the student to the rewards of a rigorous study of functions of a real variable. In each chapter, informal discussions of questions that give analysis its inherent fascination are followed by precise, but not overly formal, developments of the techniques needed to make sense of them. By focusing on the unifying themes of approximation and the resolution of paradoxes that arise in the transition from the finite to the infinite, the text turns what could be a daunting cascade of definitions and theorems into a coherent and engaging progression of ideas. Acutely aware of the need for rigor, the student is much better prepared to understand what constitutes a proper mathematical proof and how to write one. Fifteen years of classroom experience with the first edition of Understanding Analysis have solidified and refined the central narrative of the second edition. Roughly 150 new exercises join a selection of the best exercises from the first edition, and three more project-sty...

  9. Exergy analysis

    DEFF Research Database (Denmark)

    Dovjak, M.; Simone, Angela; Kolarik, Jakub

    2011-01-01

    Exergy analysis enables us to make connections among processes inside the human body and processes in a building. So far, only the effect of different combinations of air temperatures and mean radiant temperatures has been studied, with constant relative humidity in experimental conditions. The objective of this study is to determine the effects of different levels of relative humidity (RH), air temperature (Ta) and effective clothing insulation on thermal comfort conditions from the exergy point of view. The performed analyses take into consideration the available data from the study by Toftum et al. (1998). The effect of different levels of RH, Ta and effective clothing insulation on the human body exergy balance chain, changes in human body exergy consumption rate (hbExCr) and predicted mean vote (PMV) index were analyzed. The results show that thermal comfort conditions do not always result ...

  10. Peritoneal Fluid Analysis

    Science.gov (United States)

    Formal name: Peritoneal Fluid Analysis. Related tests: Pleural Fluid Analysis, Pericardial Fluid Analysis, ...

  11. Pleural Fluid Analysis Test

    Science.gov (United States)

    Formal name: Pleural Fluid Analysis. Related tests: Pericardial Fluid Analysis, Peritoneal Fluid Analysis, ...

  12. Pericardial Fluid Analysis

    Science.gov (United States)

    Formal name: Pericardial Fluid Analysis. Related tests: Pleural Fluid Analysis, Peritoneal Fluid Analysis, ...

  13. Experimental modal analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ibsen, Lars Bo; Liingaard, M.

    2006-12-15

    This technical report concerns the basic theory and principles for experimental modal analysis. The sections within the report are: Output-only modal analysis software, general digital analysis, basics of structural dynamics and modal analysis and system identification. (au)

  14. Uncertainty analysis

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, R.E.

    1982-03-01

    An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software.
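
    The remark about linear models of the form Ax=b can be made concrete with a short sketch. The fragment below is only an illustration on randomly generated data, not code from the report: it obtains the sensitivity of a scalar output y = c.x to every component of b from a single adjoint solve and checks one component against a central finite difference.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 5
      A = rng.normal(size=(n, n)) + n * np.eye(n)   # well-conditioned system matrix
      b = rng.normal(size=n)
      c = rng.normal(size=n)                        # defines the scalar output y = c.x

      x = np.linalg.solve(A, b)
      y = c @ x

      # One adjoint solve gives the sensitivity of y to every entry of b at once:
      # A^T lam = c  =>  dy/db_i = lam_i  (and dy/dA_ij = -lam_i * x_j)
      lam = np.linalg.solve(A.T, c)

      # Check dy/db_0 against a central finite difference
      eps = 1e-6
      bp, bm = b.copy(), b.copy()
      bp[0] += eps
      bm[0] -= eps
      fd = (c @ np.linalg.solve(A, bp) - c @ np.linalg.solve(A, bm)) / (2 * eps)
      print(lam[0], fd)   # the two estimates agree closely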

  15. Matrix analysis

    CERN Document Server

    Bhatia, Rajendra

    1997-01-01

    A good part of matrix theory is functional analytic in spirit. This statement can be turned around. There are many problems in operator theory, where most of the complexities and subtleties are present in the finite-dimensional case. My purpose in writing this book is to present a systematic treatment of methods that are useful in the study of such problems. This book is intended for use as a text for upper division and graduate courses. Courses based on parts of the material have been given by me at the Indian Statistical Institute and at the University of Toronto (in collaboration with Chandler Davis). The book should also be useful as a reference for research workers in linear algebra, operator theory, mathematical physics and numerical analysis. A possible subtitle of this book could be Matrix Inequalities. A reader who works through the book should expect to become proficient in the art of deriving such inequalities. Other authors have compared this art to that of cutting diamonds. One first has to...

  16. Information security risk analysis

    CERN Document Server

    Peltier, Thomas R

    2001-01-01

    Effective Risk Analysis; Qualitative Risk Analysis; Value Analysis; Other Qualitative Methods; Facilitated Risk Analysis Process (FRAP); Other Uses of Qualitative Risk Analysis; Case Study; Appendix A: Questionnaire; Appendix B: Facilitated Risk Analysis Process Forms; Appendix C: Business Impact Analysis Forms; Appendix D: Sample of Report; Appendix E: Threat Definitions; Appendix F: Other Risk Analysis Opinions; Index.

  17. Theoretical numerical analysis a functional analysis framework

    CERN Document Server

    Atkinson, Kendall

    2005-01-01

    This textbook prepares graduate students for research in numerical analysis/computational mathematics by giving to them a mathematical framework embedded in functional analysis and focused on numerical analysis. This helps the student to move rapidly into a research program. The text covers basic results of functional analysis, approximation theory, Fourier analysis and wavelets, iteration methods for nonlinear equations, finite difference methods, Sobolev spaces and weak formulations of boundary value problems, finite element methods, elliptic variational inequalities and their numerical solu

  18. Shape analysis in medical image analysis

    CERN Document Server

    Tavares, João

    2014-01-01

    This book contains thirteen contributions from invited experts of international recognition addressing important issues in shape analysis in medical image analysis, including techniques for image segmentation, registration, modelling and classification, and applications in biology, as well as in cardiac, brain, spine, chest, lung and clinical practice. This volume treats topics such as, anatomic and functional shape representation and matching; shape-based medical image segmentation; shape registration; statistical shape analysis; shape deformation; shape-based abnormity detection; shape tracking and longitudinal shape analysis; machine learning for shape modeling and analysis; shape-based computer-aided-diagnosis; shape-based medical navigation; benchmark and validation of shape representation, analysis and modeling algorithms. This work will be of interest to researchers, students, and manufacturers in the fields of artificial intelligence, bioengineering, biomechanics, computational mechanics, computationa...

  19. Foundations of factor analysis

    CERN Document Server

    Mulaik, Stanley A

    2009-01-01

    Introduction; Factor Analysis and Structural Theories; Brief History of Factor Analysis as a Linear Model; Example of Factor Analysis. Mathematical Foundations for Factor Analysis: Introduction; Scalar Algebra; Vectors; Matrix Algebra; Determinants; Treatment of Variables as Vectors; Maxima and Minima of Functions. Composite Variables and Linear Transformations: Introduction; Composite Variables; Unweighted Composite Variables; Differentially Weighted Composites; Matrix Equations; Multi ...

  20. Critical Discourse Analysis

    Institute of Scientific and Technical Information of China (English)

    杜梅香

    2006-01-01

    This paper is about discourse analysis and illustrates an approach to analyzing critical discourses, including discourses about the educational situation of China. It also includes condensed theoretical support for critical discourse analysis and an analysis of Sample I, the discourse between an illiterate person and a literate one.

  1. Experimental modal analysis

    DEFF Research Database (Denmark)

    Ibsen, Lars Bo; Liingaard, Morten

    This technical report concerns the basic theory and principles for experimental modal analysis. The sections within the report are: Output-only modal analysis software (section 1.1), general digital analysis (section 1.2), basics of structural dynamics and modal analysis (section 1.3) and system ...

  2. K Basin Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    PECH, S.H.

    2000-08-23

    This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Final Safety Analysis Report. This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.

  3. K Basins Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    WEBB, R.H.

    1999-12-29

    This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Safety Analysis Report (HNF-SD-WM-SAR-062, Rev.4). This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.

  4. Integrated Sensitivity Analysis Workflow

    Energy Technology Data Exchange (ETDEWEB)

    Friedman-Hill, Ernest J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hoffman, Edward L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gibson, Marcus J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Clay, Robert L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-08-01

    Sensitivity analysis is a crucial element of rigorous engineering analysis, but performing such an analysis on a complex model is difficult and time consuming. The mission of the DART Workbench team at Sandia National Laboratories is to lower the barriers to adoption of advanced analysis tools through software integration. The integrated environment guides the engineer in the use of these integrated tools and greatly reduces the cycle time for engineering analysis.

  5. Citation Analysis and Discourse Analysis Revisited

    Science.gov (United States)

    White, Howard D.

    2004-01-01

    John Swales's 1986 article "Citation analysis and discourse analysis" was written by a discourse analyst to introduce citation research from other fields, mainly sociology of science, to his own discipline. Here, I introduce applied linguists and discourse analysts to citation studies from information science, a complementary tradition not…

  6. Hazard Analysis Database Report

    CERN Document Server

    Grams, W H

    2000-01-01

    The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for HNF-SD-WM-SAR-067, Tank Farms Final Safety Analysis Report (FSAR). The FSAR is part of the approved Authorization Basis (AB) for the River Protection Project (RPP). This document describes, identifies, and defines the contents and structure of the Tank Farms FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The Hazard Analysis Database supports the preparation of Chapters 3, 4, and 5 of the Tank Farms FSAR and the Unreviewed Safety Question (USQ) process and consists of two major, interrelated data sets: (1) Hazard Analysis Database: Data from t...

  7. Qualitative Content Analysis

    Directory of Open Access Journals (Sweden)

    Philipp Mayring

    2000-06-01

    Full Text Available The article describes an approach of systematic, rule-guided qualitative text analysis, which tries to preserve some methodological strengths of quantitative content analysis and widen them to a concept of qualitative procedure. First the development of content analysis is delineated and the basic principles are explained (units of analysis, step models, working with categories, validity and reliability). Then the central procedures of qualitative content analysis, inductive development of categories and deductive application of categories, are worked out. The possibilities of computer programs in supporting those qualitative steps of analysis are shown and the possibilities and limits of the approach are discussed. URN: urn:nbn:de:0114-fqs0002204

  8. Cluster analysis for applications

    CERN Document Server

    Anderberg, Michael R

    1973-01-01

    Cluster Analysis for Applications deals with methods and various applications of cluster analysis. Topics covered range from variables and scales to measures of association among variables and among data units. Conceptual problems in cluster analysis are discussed, along with hierarchical and non-hierarchical clustering methods. The necessary elements of data analysis, statistics, cluster analysis, and computer implementation are integrated vertically to cover the complete path from raw data to a finished analysis.Comprised of 10 chapters, this book begins with an introduction to the subject o

  9. Circuit analysis for dummies

    CERN Document Server

    Santiago, John

    2013-01-01

    Circuits overloaded from electric circuit analysis? Many universities require that students pursuing a degree in electrical or computer engineering take an Electric Circuit Analysis course to determine who will "make the cut" and continue in the degree program. Circuit Analysis For Dummies will help these students to better understand electric circuit analysis by presenting the information in an effective and straightforward manner. Circuit Analysis For Dummies gives you clear-cut information about the topics covered in an electric circuit analysis course to help

  10. Operational Modal Analysis Tutorial

    DEFF Research Database (Denmark)

    Brincker, Rune; Andersen, Palle

    In this paper the basic principles in operational modal testing and analysis are presented and discussed. A brief review of the techniques for operational modal testing and identification is presented, and it is argued that there is now a wide range of techniques for effective identification ... analysis in an easier way and in many cases more effectively than traditional modal analysis methods. It can be applied for modal testing and analysis on a wide range of structures, and not only for problems generally investigated using traditional modal analysis, but also for those requiring load estimation, vibration level estimation and fatigue analysis ...

  11. Synovial fluid analysis

    Science.gov (United States)

    Alternate names: joint fluid analysis; joint fluid aspiration. Reference: El-Gabalawy HS. Synovial fluid analysis, synovial biopsy, and synovial pathology. In: Firestein GS, Budd RC, Gabriel SE, McInnes IB, O'Dell JR, eds. Kelly's Textbook of ...

  12. RISK ANALYSIS DEVELOPED MODEL

    Directory of Open Access Journals (Sweden)

    Georgiana Cristina NUKINA

    2012-07-01

    Full Text Available Through the developed risk analysis model, it is decided whether control measures are suitable for implementation. The analysis also determines whether the benefits of a given control option are worth more than the cost of implementing it.

  13. Mathematical and statistical analysis

    Science.gov (United States)

    Houston, A. Glen

    1988-01-01

    The goal of the mathematical and statistical analysis component of RICIS is to research, develop, and evaluate mathematical and statistical techniques for aerospace technology applications. Specific research areas of interest include modeling, simulation, experiment design, reliability assessment, and numerical analysis.

  14. Buildings Sector Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hostick, Donna J.; Nicholls, Andrew K.; McDonald, Sean C.; Hollomon, Jonathan B.

    2005-08-01

    A joint NREL, ORNL, and PNNL team conducted market analysis to help inform DOE/EERE's Weatherization and Intergovernmental Program planning and management decisions. This chapter presents the results of the market analysis for the Buildings sector.

  15. Geologic spatial analysis

    Energy Technology Data Exchange (ETDEWEB)

    Thiessen, R.L.; Eliason, J.R.

    1989-01-01

    This report describes the development of geologic spatial analysis research which focuses on conducting comprehensive three-dimensional analysis of regions using geologic data sets that can be referenced by latitude, longitude, and elevation/depth. (CBS)

  16. Regression analysis by example

    CERN Document Server

    Chatterjee, Samprit

    2012-01-01

    Praise for the Fourth Edition: "This book is . . . an excellent source of examples for regression analysis. It has been and still is readily readable and understandable." -Journal of the American Statistical Association. Regression analysis is a conceptually simple method for investigating relationships among variables. Carrying out a successful application of regression analysis, however, requires a balance of theoretical results, empirical rules, and subjective judgment. Regression Analysis by Example, Fifth Edition has been expanded

  17. Applied longitudinal analysis

    CERN Document Server

    Fitzmaurice, Garrett M; Ware, James H

    2012-01-01

    Praise for the First Edition: ". . . [this book] should be on the shelf of everyone interested in . . . longitudinal data analysis." -Journal of the American Statistical Association. Features newly developed topics and applications of the analysis of longitudinal data. Applied Longitudinal Analysis, Second Edition presents modern methods for analyzing data from longitudinal studies and now features the latest state-of-the-art techniques. The book emphasizes practical, rather than theoretical, aspects of methods for the analysis of diverse types of lo

  18. Computational movement analysis

    CERN Document Server

    Laube, Patrick

    2014-01-01

    This SpringerBrief discusses the characteristics of spatiotemporal movement data, including uncertainty and scale. It investigates three core aspects of Computational Movement Analysis: Conceptual modeling of movement and movement spaces, spatiotemporal analysis methods aiming at a better understanding of movement processes (with a focus on data mining for movement patterns), and using decentralized spatial computing methods in movement analysis. The author presents Computational Movement Analysis as an interdisciplinary umbrella for analyzing movement processes with methods from a range of fi

  19. Slice hyperholomorphic Schur analysis

    CERN Document Server

    Alpay, Daniel; Sabadini, Irene

    2016-01-01

    This book defines and examines the counterpart of Schur functions and Schur analysis in the slice hyperholomorphic setting. It is organized into three parts: the first introduces readers to classical Schur analysis, while the second offers background material on quaternions, slice hyperholomorphic functions, and quaternionic functional analysis. The third part represents the core of the book and explores quaternionic Schur analysis and its various applications. The book includes previously unpublished results and provides the basis for new directions of research.

  20. Discourse analysis and Foucault's

    Directory of Open Access Journals (Sweden)

    Jansen I.

    2008-01-01

    Full Text Available Discourse analysis is a method which up to now has been less recognized in nursing science, although more recently nursing scientists are discovering it for their purposes. However, several authors have criticized that discourse analysis is often misinterpreted because of a lack of understanding of its theoretical background. In this article, I reconstruct Foucault's writings in his "Archaeology of Knowledge" to provide a theoretical base for future archaeological discourse analysis, which can be categorized as a socio-linguistic discourse analysis.

  1. Biological sequence analysis

    DEFF Research Database (Denmark)

    Durbin, Richard; Eddy, Sean; Krogh, Anders Stærmose

    This book provides an up-to-date and tutorial-level overview of sequence analysis methods, with particular emphasis on probabilistic modelling. Discussed methods include pairwise alignment, hidden Markov models, multiple alignment, profile searches, RNA secondary structure analysis, and phylogene ...
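
    As a pointer to what "pairwise alignment" involves, the sketch below computes a global alignment score with the classical Needleman-Wunsch dynamic programme; the scoring parameters and test strings are arbitrary illustrative choices, not taken from the book.

      def needleman_wunsch(a, b, match=1, mismatch=-1, gap=-2):
          """Global pairwise alignment score by dynamic programming."""
          rows, cols = len(a) + 1, len(b) + 1
          F = [[0] * cols for _ in range(rows)]
          for i in range(1, rows):
              F[i][0] = i * gap            # aligning a prefix of a against gaps
          for j in range(1, cols):
              F[0][j] = j * gap            # aligning a prefix of b against gaps
          for i in range(1, rows):
              for j in range(1, cols):
                  diag = F[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
                  F[i][j] = max(diag, F[i - 1][j] + gap, F[i][j - 1] + gap)
          return F[-1][-1]

      print(needleman_wunsch("GATTACA", "GCATGCU"))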

  2. NASA Hazard Analysis Process

    Science.gov (United States)

    Deckert, George

    2010-01-01

    This viewgraph presentation reviews The NASA Hazard Analysis process. The contents include: 1) Significant Incidents and Close Calls in Human Spaceflight; 2) Subsystem Safety Engineering Through the Project Life Cycle; 3) The Risk Informed Design Process; 4) Types of NASA Hazard Analysis; 5) Preliminary Hazard Analysis (PHA); 6) Hazard Analysis Process; 7) Identify Hazardous Conditions; 8) Consider All Interfaces; 9) Work a Preliminary Hazard List; 10) NASA Generic Hazards List; and 11) Final Thoughts

  3. Bayesian Mediation Analysis

    Science.gov (United States)

    Yuan, Ying; MacKinnon, David P.

    2009-01-01

    In this article, we propose Bayesian analysis of mediation effects. Compared with conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian…

  4. Practical data analysis

    CERN Document Server

    Cuesta, Hector

    2013-01-01

    Each chapter of the book quickly introduces a key 'theme' of Data Analysis, before immersing you in the practical aspects of each theme. You'll learn quickly how to perform all aspects of Data Analysis. Practical Data Analysis is a book ideal for home and small business users who want to slice & dice the data they have on hand with minimum hassle.

  5. Text analysis and computers

    OpenAIRE

    1995-01-01

    Content: Erhard Mergenthaler: Computer-assisted content analysis (3-32); Udo Kelle: Computer-aided qualitative data analysis: an overview (33-63); Christian Mair: Machine-readable text corpora and the linguistic description of languages (64-75); Jürgen Krause: Principles of content analysis for information retrieval systems (76-99); Conference Abstracts (100-131).

  6. Data analysis for chemistry

    CERN Document Server

    Hibbert, DBrynn

    2005-01-01

    Based on D Brynn Hibbert's lectures on data analysis to undergraduates and graduate students, this book covers topics including measurements, means and confidence intervals, hypothesis testing, analysis of variance, and calibration models. It is meant as an entry level book targeted at learning and teaching undergraduate data analysis.

  7. Strategic Analysis Overview

    Science.gov (United States)

    Cirillo, William M.; Earle, Kevin D.; Goodliff, Kandyce E.; Reeves, J. D.; Stromgren, Chel; Andraschko, Mark R.; Merrill, R. Gabe

    2008-01-01

    NASA's Constellation Program employs a strategic analysis methodology to provide an integrated analysis capability for Lunar exploration scenarios and to support strategic decision-making regarding those scenarios. The strategic analysis methodology integrates the assessment of the major contributors to strategic objective satisfaction (performance, affordability, and risk) and captures the linkages and feedbacks between all three components. Strategic analysis supports strategic decision making by senior management through comparable analysis of alternative strategies, provision of a consistent set of high-level value metrics, and the enabling of cost-benefit analysis. The tools developed to implement the strategic analysis methodology are not element design and sizing tools. Rather, these models evaluate strategic performance using predefined elements, imported into a library from expert-driven design/sizing tools or expert analysis. Specific components of the strategic analysis tool set include scenario definition, requirements generation, mission manifesting, scenario lifecycle costing, crew time analysis, objective satisfaction benefit, risk analysis, and probabilistic evaluation. Results from all components of strategic analysis are evaluated against a set of pre-defined figures of merit (FOMs). These FOMs capture the high-level strategic characteristics of all scenarios and facilitate direct comparison of options. The strategic analysis methodology described in this paper has previously been applied to the Space Shuttle and International Space Station Programs and is now being used to support the development of the baseline Constellation Program lunar architecture. This paper presents an overview of the strategic analysis methodology and sample results from its application to the Constellation Program lunar architecture.

  8. Foundations of mathematical analysis

    CERN Document Server

    Johnsonbaugh, Richard

    2010-01-01

    This classroom-tested volume offers a definitive look at modern analysis, with views of applications to statistics, numerical analysis, Fourier series, differential equations, mathematical analysis, and functional analysis. Upper-level undergraduate students with a background in calculus will benefit from its teachings, along with beginning graduate students seeking a firm grounding in modern analysis. A self-contained text, it presents the necessary background on the limit concept, and the first seven chapters could constitute a one-semester introduction to limits. Subsequent chapters discuss

  9. Mathematical analysis fundamentals

    CERN Document Server

    Bashirov, Agamirza

    2014-01-01

    The author's goal is a rigorous presentation of the fundamentals of analysis, starting from the elementary level and moving to the advanced coursework. The curriculum of all mathematics (pure or applied) and physics programs includes a compulsory course in mathematical analysis. This book can serve as the main textbook for such (one-semester) courses. The book can also serve as additional reading for courses such as real analysis, functional analysis, harmonic analysis, etc. For non-math major students requiring math beyond calculus, this is a more friendly approach than many math-centric o

  10. Analysis in usability evaluations

    DEFF Research Database (Denmark)

    Følstad, Asbjørn; Lai-Chong Law, Effie; Hornbæk, Kasper

    2010-01-01

    While the planning and implementation of usability evaluations are well described in the literature, the analysis of the evaluation data is not. We present interviews with 11 usability professionals on how they conduct analysis, describing the resources, collaboration, creation of recommendations, and prioritization involved. The interviews indicate a lack of structure in the analysis process and suggest activities, such as generating recommendations, that are unsupported by existing methods. We discuss how to better support analysis, and propose four themes for future research on analysis in usability ...

  11. Visual physics analysis VISPA

    Energy Technology Data Exchange (ETDEWEB)

    Actis, Oxana; Brodski, Michael; Erdmann, Martin; Fischer, Robert; Hinzmann, Andreas; Mueller, Gero; Muenzer, Thomas; Plum, Matthias; Steggemann, Jan; Winchen, Tobias; Klimkovich, Tatsiana, E-mail: tatsiana.klimkovich@physik.rwth-aachen.d [Physics Institute 3a, RWTH Aachen University, 52056 Aachen (Germany)

    2010-04-01

    VISPA is a development environment for high energy physics analyses which enables physicists to combine graphical and textual work. A physics analysis cycle consists of prototyping, performing, and verifying the analysis. The main feature of VISPA is a multipurpose window for visual steering of analysis steps, creation of analysis templates, and browsing physics event data at different steps of an analysis. VISPA follows an experiment-independent approach and incorporates various tools for steering and controlling required in a typical analysis. Connection to different frameworks of high energy physics experiments is achieved by using different types of interfaces. We present the look-and-feel for an example physics analysis at the LHC and explain the underlying software concepts of VISPA.

  12. The ATLAS Analysis Model

    CERN Multimedia

    Amir Farbin

    The ATLAS Analysis Model is a continually developing vision of how to reconcile physics analysis requirements with the ATLAS offline software and computing model constraints. In the past year this vision has influenced the evolution of the ATLAS Event Data Model, the Athena software framework, and physics analysis tools. These developments, along with the October Analysis Model Workshop and the planning for CSC analyses have led to a rapid refinement of the ATLAS Analysis Model in the past few months. This article introduces some of the relevant issues and presents the current vision of the future ATLAS Analysis Model. Event Data Model The ATLAS Event Data Model (EDM) consists of several levels of details, each targeted for a specific set of tasks. For example the Event Summary Data (ESD) stores calorimeter cells and tracking system hits thereby permitting many calibration and alignment tasks, but will be only accessible at particular computing sites with potentially large latency. In contrast, the Analysis...

  13. Multivariate analysis with LISREL

    CERN Document Server

    Jöreskog, Karl G; Y Wallentin, Fan

    2016-01-01

    This book traces the theory and methodology of multivariate statistical analysis and shows how it can be conducted in practice using the LISREL computer program. It presents not only the typical uses of LISREL, such as confirmatory factor analysis and structural equation models, but also several other multivariate analysis topics, including regression (univariate, multivariate, censored, logistic, and probit), generalized linear models, multilevel analysis, and principal component analysis. It provides numerous examples from several disciplines and discusses and interprets the results, illustrated with sections of output from the LISREL program, in the context of the example. The book is intended for masters and PhD students and researchers in the social, behavioral, economic and many other sciences who require a basic understanding of multivariate statistical theory and methods for their analysis of multivariate data. It can also be used as a textbook on various topics of multivariate statistical analysis.

  14. http Log Analysis

    DEFF Research Database (Denmark)

    Bøving, Kristian Billeskov; Simonsen, Jesper

    2004-01-01

    This article documents how log analysis can inform qualitative studies concerning the usage of web-based information systems (WIS). No prior research has used http log files as data to study collaboration between multiple users in organisational settings. We investigate how to perform http log analysis; what http log analysis says about the nature of collaborative WIS use; and how results from http log analysis may support other data collection methods such as surveys, interviews, and observation. The analysis of log files initially lends itself to research designs which serve to test hypotheses using a quantitative methodology. We show that http log analysis can also be valuable in qualitative research such as case studies. The results from http log analysis can be triangulated with other data sources and for example serve as a means of supporting the interpretation of interview data ...
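
    A minimal sketch of the kind of processing involved, assuming logs in the NCSA Common Log Format (the article does not specify a format): the fragment below parses log lines and counts successful requests per authenticated user.

      import re
      from collections import Counter

      # NCSA Common Log Format: host ident authuser [date] "request" status bytes
      LOG_LINE = re.compile(
          r'(?P<host>\S+) \S+ (?P<user>\S+) \[(?P<time>[^\]]+)\] '
          r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) (?P<size>\S+)'
      )

      def requests_per_user(lines):
          """Count successful (2xx) requests per authenticated user."""
          counts = Counter()
          for line in lines:
              m = LOG_LINE.match(line)
              if m and m.group("status").startswith("2"):
                  counts[m.group("user")] += 1
          return counts

      sample = [
          '10.0.0.1 - alice [10/Oct/2000:13:55:36 -0700] "GET /doc/index.html HTTP/1.0" 200 2326',
          '10.0.0.2 - bob [10/Oct/2000:13:57:01 -0700] "POST /doc/save HTTP/1.0" 302 512',
      ]
      print(requests_per_user(sample))   # Counter({'alice': 1})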

  15. NASA trend analysis procedures

    Science.gov (United States)

    1993-01-01

    This publication is primarily intended for use by NASA personnel engaged in managing or implementing trend analysis programs. 'Trend analysis' refers to the observation of current activity in the context of the past in order to infer the expected level of future activity. NASA trend analysis was divided into 5 categories: problem, performance, supportability, programmatic, and reliability. Problem trend analysis uncovers multiple occurrences of historical hardware or software problems or failures in order to focus future corrective action. Performance trend analysis observes changing levels of real-time or historical flight vehicle performance parameters such as temperatures, pressures, and flow rates as compared to specification or 'safe' limits. Supportability trend analysis assesses the adequacy of the spaceflight logistics system; example indicators are repair-turn-around time and parts stockage levels. Programmatic trend analysis uses quantitative indicators to evaluate the 'health' of NASA programs of all types. Finally, reliability trend analysis attempts to evaluate the growth of system reliability based on a decreasing rate of occurrence of hardware problems over time. Procedures for conducting all five types of trend analysis are provided in this publication, prepared through the joint efforts of the NASA Trend Analysis Working Group.
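
    As a small worked example of the performance trend analysis described above, the sketch below fits a linear trend to hypothetical monthly readings of a parameter and projects when a specification limit would be reached; the data and limit are invented for illustration only.

      import numpy as np

      # Hypothetical monthly readings of a temperature parameter (deg C)
      months = np.arange(12)
      temps = np.array([610, 612, 611, 615, 617, 616, 620, 622, 621, 625, 627, 626], float)
      limit = 650.0   # hypothetical specification ("safe") limit

      # Least-squares linear trend: temps ~ slope * month + intercept
      slope, intercept = np.polyfit(months, temps, 1)

      if slope > 0:
          crossing = (limit - intercept) / slope
          print(f"trend {slope:.2f} deg/month; limit projected to be reached near month {crossing:.1f}")
      else:
          print("no upward trend toward the limit")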

  16. Analysis of Precision of Activation Analysis Method

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Nørgaard, K.

    1973-01-01

    The precision of an activation-analysis method prescribes the estimation of the precision of a single analytical result. The adequacy of these estimates to account for the observed variation between duplicate results from the analysis of different samples and materials is tested by the statistic T, which is shown to be approximated by a χ2 distribution. Application of this test to the results of determinations of manganese in human serum by a method of established precision led to the detection of airborne pollution of the serum during the sampling process. The subsequent improvement in sampling conditions was shown to give not only increased precision, but also improved accuracy of the results.
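
    A common form of such a statistic compares the squared differences of duplicates with their a priori variances and refers the sum to a chi-square distribution. The sketch below uses hypothetical duplicate determinations and is offered only as an assumed reading of the abstract, not as the authors' exact formula.

      import numpy as np
      from scipy.stats import chi2

      def duplicate_t_statistic(x1, x2, s1, s2):
          """Sum of squared duplicate differences scaled by their a priori variances.

          If the stated precisions fully account for the scatter, T is approximately
          chi-square distributed with len(x1) degrees of freedom.
          """
          x1, x2, s1, s2 = map(np.asarray, (x1, x2, s1, s2))
          return np.sum((x1 - x2) ** 2 / (s1 ** 2 + s2 ** 2))

      # Hypothetical duplicate results and their a priori standard deviations
      a = [10.2, 8.7, 12.1, 9.9]
      b = [10.6, 8.5, 11.4, 10.3]
      sa = [0.3, 0.25, 0.4, 0.3]
      sb = [0.3, 0.25, 0.4, 0.3]

      T = duplicate_t_statistic(a, b, sa, sb)
      p = chi2.sf(T, df=len(a))
      print(f"T = {T:.2f}, p = {p:.3f}")   # a small p would indicate unaccounted-for variation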

  17. Fundamentals of functional analysis

    CERN Document Server

    Farenick, Douglas

    2016-01-01

    This book provides a unique path for graduate or advanced undergraduate students to begin studying the rich subject of functional analysis with fewer prerequisites than is normally required. The text begins with a self-contained and highly efficient introduction to topology and measure theory, which focuses on the essential notions required for the study of functional analysis, and which are often buried within full-length overviews of the subjects. This is particularly useful for those in applied mathematics, engineering, or physics who need to have a firm grasp of functional analysis, but not necessarily some of the more abstruse aspects of topology and measure theory normally encountered. The reader is assumed to only have knowledge of basic real analysis, complex analysis, and algebra. The latter part of the text provides an outstanding treatment of Banach space theory and operator theory, covering topics not usually found together in other books on functional analysis. Written in a clear, concise manner,...

  18. LULU analysis program

    Energy Technology Data Exchange (ETDEWEB)

    Crawford, H.J.; Lindstrom, P.J.

    1983-06-01

    Our analysis program LULU has proven very useful in all stages of experiment analysis, from prerun detector debugging through final data reduction. It has solved our problem of having arbitrary word length events and is easy enough to use that many separate experimenters are now analyzing with LULU. The ability to use the same software for all stages of experiment analysis greatly eases the programming burden. We may even get around to making the graphics elegant someday.

  19. Mastering Clojure data analysis

    CERN Document Server

    Rochester, Eric

    2014-01-01

    This book consists of a practical, example-oriented approach that aims to help you learn how to use Clojure for data analysis quickly and efficiently. This book is great for those who have experience with Clojure and who need to use it to perform data analysis. This book will also be hugely beneficial for readers with basic experience in data analysis and statistics.

  20. Social network analysis

    OpenAIRE

    Mathias, Carlos Leonardo Kelmer

    2014-01-01

    In general, the paper develops a historiographical debate about the methodology of social network analysis. More than answering questions using this methodology, the article tries to introduce the historian to the founding bibliography of social network analysis. Since the publication of the famous article by John Barnes in 1954, sociologists linked to sociometric studies have usually employed social network analysis in their studies. On the other hand, this methodology is not widespread...

  1. Problems in real analysis

    OpenAIRE

    Rǎdulescu, Teodora-Liliana T.; Andreescu, Titu; Rǎdulescu, Vicenţiu

    2015-01-01

    This book contains a collection of challenging problems in elementary mathematical analysis, uses competition-inspired problems as a platform for training typical inventive skills, develops basic valuable techniques for solving problems in mathematical analysis on the real axis, assumes only a basic knowledge of the topic but opens the path to competitive research in the field, includes interesting and valuable historical accounts of ideas and methods in analysis, presents a connection between...

  2. Stochastic Analysis 2010

    CERN Document Server

    Crisan, Dan

    2011-01-01

    "Stochastic Analysis" aims to provide mathematical tools to describe and model high dimensional random systems. Such tools arise in the study of Stochastic Differential Equations and Stochastic Partial Differential Equations, Infinite Dimensional Stochastic Geometry, Random Media and Interacting Particle Systems, Super-processes, Stochastic Filtering, Mathematical Finance, etc. Stochastic Analysis has emerged as a core area of late 20th century Mathematics and is currently undergoing a rapid scientific development. The special volume "Stochastic Analysis 2010" provides a sa

  3. Canonical Information Analysis

    DEFF Research Database (Denmark)

    Vestergaard, Jacob Schack; Nielsen, Allan Aasbjerg

    2015-01-01

    Canonical correlation analysis is an established multivariate statistical method in which correlation between linear combinations of multivariate sets of variables is maximized. In canonical information analysis introduced here, linear correlation as a measure of association between variables is ... airborne data. The simulation study shows that canonical information analysis is as accurate as and much faster than algorithms presented in previous work, especially for large sample sizes. URL: http://www.imm.dtu.dk/pubdb/p.php?6270...
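
    For reference, the classical canonical correlation computation that canonical information analysis generalizes can be sketched in a few lines. The example below uses a QR/SVD formulation on randomly generated data sharing one latent signal; it is a generic illustration, not the authors' algorithm.

      import numpy as np

      def canonical_correlations(X, Y):
          """Canonical correlations between two multivariate data sets.

          Rows are observations and columns are variables; each block is centred
          and orthonormalised by a QR decomposition, and the singular values of
          Qx.T @ Qy are the canonical correlations.
          """
          Qx, _ = np.linalg.qr(X - X.mean(axis=0))
          Qy, _ = np.linalg.qr(Y - Y.mean(axis=0))
          s = np.linalg.svd(Qx.T @ Qy, compute_uv=False)
          return np.clip(s, 0.0, 1.0)

      rng = np.random.default_rng(42)
      z = rng.normal(size=(500, 1))             # shared latent signal
      X = z + 0.5 * rng.normal(size=(500, 2))   # two noisy views of it
      Y = z + 0.5 * rng.normal(size=(500, 3))   # three noisy views of it
      print(canonical_correlations(X, Y))       # first value is large, the second near zero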

  4. Space Weather Analysis

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Space Weather Analysis archives are model output of ionospheric, thermospheric and magnetospheric particle populations, energies and electrodynamics

  5. Sensitivity and uncertainty analysis

    CERN Document Server

    Cacuci, Dan G; Navon, Ionel Michael

    2005-01-01

    As computer-assisted modeling and analysis of physical processes have continued to grow and diversify, sensitivity and uncertainty analyses have become indispensable scientific tools. Sensitivity and Uncertainty Analysis. Volume I: Theory focused on the mathematical underpinnings of two important methods for such analyses: the Adjoint Sensitivity Analysis Procedure and the Global Adjoint Sensitivity Analysis Procedure. This volume concentrates on the practical aspects of performing these analyses for large-scale systems. The applications addressed include two-phase flow problems, a radiative c

  6. Thermogravimetric Analysis Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — At NETL’s Thermogravimetric Analysis Laboratory in Morgantown, WV, researchers study how chemical looping combustion (CLC) can be applied to fossil energy systems....

  7. Machine Fault Signature Analysis

    Directory of Open Access Journals (Sweden)

    Pratesh Jayaswal

    2008-01-01

    Full Text Available The objective of this paper is to present recent developments in the field of machine fault signature analysis with particular regard to vibration analysis. The different types of faults that can be identified from the vibration signature analysis are, for example, gear fault, rolling contact bearing fault, journal bearing fault, flexible coupling faults, and electrical machine fault. It is not the intention of the authors to provide detailed coverage of all of these faults; instead, detailed consideration is given to rolling element bearing fault signature analysis.
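
    A minimal sketch of vibration signature analysis, using a simulated signal with a hypothetical defect frequency rather than measured data: the defect component shows up as a discrete line in the amplitude spectrum.

      import numpy as np

      # Simulated vibration: shaft rotation at 25 Hz plus a weak bearing-defect
      # component at a hypothetical 87 Hz, buried in noise.
      fs = 2000                       # sampling rate (Hz)
      t = np.arange(0, 2.0, 1 / fs)
      signal = (1.0 * np.sin(2 * np.pi * 25 * t)
                + 0.3 * np.sin(2 * np.pi * 87 * t)
                + 0.2 * np.random.default_rng(0).normal(size=t.size))

      # One-sided amplitude spectrum: fault frequencies appear as discrete peaks
      spectrum = 2 * np.abs(np.fft.rfft(signal)) / t.size
      freqs = np.fft.rfftfreq(t.size, 1 / fs)

      peaks = freqs[np.argsort(spectrum)[-2:]]
      print(sorted(peaks))   # approximately [25.0, 87.0] Hz - rotation and defect lines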

  8. Finite element analysis

    CERN Document Server

    2010-01-01

    Finite element analysis is an engineering method for the numerical analysis of complex structures. This book provides a bird's eye view on this very broad matter through 27 original and innovative research studies exhibiting various investigation directions. Through its chapters the reader will have access to works related to Biomedical Engineering, Materials Engineering, Process Analysis and Civil Engineering. The text is addressed not only to researchers, but also to professional engineers, engineering lecturers and students seeking to gain a better understanding of where Finite Element Analysis stands today.

  9. Textile Technology Analysis Lab

    Data.gov (United States)

    Federal Laboratory Consortium — The Textile Analysis Labis built for evaluating and characterizing the physical properties of an array of textile materials, but specifically those used in aircrew...

  10. Chemical Analysis Facility

    Data.gov (United States)

    Federal Laboratory Consortium — FUNCTION: Uses state-of-the-art instrumentation for qualitative and quantitative analysis of organic and inorganic compounds, and biomolecules from gas, liquid, and...

  11. Chemical Security Analysis Center

    Data.gov (United States)

    Federal Laboratory Consortium — In 2006, by Presidential Directive, DHS established the Chemical Security Analysis Center (CSAC) to identify and assess chemical threats and vulnerabilities in the...

  12. Circuit analysis with Multisim

    CERN Document Server

    Baez-Lopez, David

    2011-01-01

    This book is concerned with circuit simulation using National Instruments Multisim. It focuses on the use and comprehension of the working techniques for electrical and electronic circuit simulation. The first chapters are devoted to basic circuit analysis. It starts by describing in detail how to perform a DC analysis using only resistors and independent and controlled sources. Then, it introduces capacitors and inductors to make a transient analysis. In the case of transient analysis, it is possible to have an initial condition either in the capacitor voltage or in the inductor current, or bo
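
    Underlying such a DC analysis is a small linear system of nodal equations. The sketch below is plain Python rather than Multisim and uses a hypothetical two-node resistor network driven by a current source, simply to show the kind of system being solved.

      import numpy as np

      # Hypothetical circuit: a 1 mA current source feeding node 1, R1 = 1 kOhm
      # between nodes 1 and 2, R2 = 2 kOhm from node 2 to ground, and
      # R3 = 1 kOhm from node 1 to ground.
      R1, R2, R3 = 1e3, 2e3, 1e3
      I = 1e-3

      # Nodal equations G @ v = i (Kirchhoff's current law at nodes 1 and 2)
      G = np.array([[1 / R1 + 1 / R3, -1 / R1],
                    [-1 / R1,          1 / R1 + 1 / R2]])
      i = np.array([I, 0.0])

      v = np.linalg.solve(G, i)
      print(v)   # node voltages [0.75, 0.5] volts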

  13. Geospatial Data Analysis Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Geospatial application development, location-based services, spatial modeling, and spatial analysis are examples of the many research applications that this facility...

  14. Risk analysis methodology survey

    Science.gov (United States)

    Batson, Robert G.

    1987-01-01

    NASA regulations require that formal risk analysis be performed on a program at each of several milestones as it moves toward full-scale development. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from simple to complex network-based simulation were surveyed. A Program Risk Analysis Handbook was prepared in order to provide both analyst and manager with a guide for selection of the most appropriate technique.

  15. International Market Analysis

    DEFF Research Database (Denmark)

    Sørensen, Olav Jull

    2009-01-01

    The review presents the book International Market Analysis: Theories and Methods, written by John Kuada, professor at the Centre of International Business, Department of Business Studies, Aalborg University. The book is refreshingly new in its way of looking at a classical problem. It looks at market analysis from the point of view of ways of thinking about markets. Furthermore, the book includes the concept of learning in the analysis of markets and how the way we understand business reality influences our choice of methodology for market analysis.

  16. Stable carbon isotope fractionation of six strongly fractionating microorganisms is not affected by growth temperature under laboratory conditions

    Science.gov (United States)

    Penger, Jörn; Conrad, Ralf; Blaser, Martin

    2014-09-01

    Temperature is the major driving force for many biological as well as chemical reactions and may impact the fractionation of stable carbon isotopes. Thus, a good correlation between temperature and fractionation is observed in many chemical systems that are controlled by an equilibrium isotope effect. In contrast, biological systems that are usually controlled by a kinetic isotope effect are less well studied with respect to temperature effects and have shown contrasting results. We studied three different biological pathways (methylotrophic methanogenesis, hydrogenotrophic methanogenesis, acetogenesis by the acetyl-CoA pathway) which are characterized by very strong carbon isotope enrichment factors (-50‰ to -83‰). The microorganisms (Methanosarcina barkeri, Methanosarcina acetivorans, Methanolobus zinderi, Methanothermobacter marburgensis, Methanothermobacter thermoautotrophicus, Thermoanaerobacter kivui) exhibiting these pathways were grown at different temperatures ranging between 25 and 68 °C, and the fractionation factors were determined from 13C/12C isotope discrimination during substrate depletion and product formation. Our experiments showed that the fractionation factors were different for the different metabolic pathways but were not much affected by the different growth temperatures. Slight variations were well within the standard errors of replication and regression analysis. Our results showed that temperature had no significant effect on the fractionation of stable carbon isotopes during anaerobic microbial metabolism with relatively strong isotope fractionation.
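
    Enrichment factors of this kind are commonly estimated from a Rayleigh plot of the residual substrate, delta = delta0 + eps * ln(f), where f is the fraction of substrate remaining; the sketch below fits eps by linear regression to hypothetical data and is only an assumed illustration of the general approach, not the authors' exact procedure.

      import numpy as np

      # Hypothetical substrate time course: fraction remaining f and measured
      # delta13C of the residual substrate (permil), generated with eps = -60 permil
      f = np.array([1.0, 0.8, 0.6, 0.4, 0.2])
      delta0, eps_true = -40.0, -60.0
      delta = delta0 + eps_true * np.log(f) + np.random.default_rng(1).normal(0, 0.3, f.size)

      # Rayleigh plot: delta ~ delta0 + eps * ln(f); the slope estimates eps
      eps_est, _ = np.polyfit(np.log(f), delta, 1)
      print(f"estimated enrichment factor: {eps_est:.1f} permil (true value {eps_true} permil)")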

  17. The stepwise evolution of early life driven by energy conservation.

    Science.gov (United States)

    Ferry, James G; House, Christopher H

    2006-06-01

    Two main theories have emerged for the origin and early evolution of life based on heterotrophic versus chemoautotrophic metabolisms. With the exception of a role for CO, the theories have little common ground. Here we propose an alternative theory for the early evolution of the cell which combines principal features of the widely disparate theories. The theory is based on the extant pathway for conversion of CO to methane and acetate, largely deduced from the genomic analysis of the archaeon Methanosarcina acetivorans. In contrast to current paradigms, we propose that an energy-conservation pathway was the major force which powered and directed the early evolution of the cell. We envision the proposed primitive energy-conservation pathway to have developed sometime after a period of chemical evolution but prior to the establishment of diverse protein-based anaerobic metabolisms. We further propose that energy conservation played the predominant role in the later evolution of anaerobic metabolisms which explains the origin and evolution of extant methanogenic pathways.

  18. Enhancement of bioenergy production from organic wastes by two-stage anaerobic hydrogen and methane production process.

    Science.gov (United States)

    Luo, Gang; Xie, Li; Zhou, Qi; Angelidaki, Irini

    2011-09-01

    The present study investigated a two-stage anaerobic hydrogen and methane process for increasing bioenergy production from organic wastes. A two-stage process with a hydraulic retention time (HRT) of 3 d for the hydrogen reactor and 12 d for the methane reactor obtained 11% more energy than a single-stage methanogenic process (HRT 15 d) under an organic loading rate (OLR) of 3 gVS/(L d). The two-stage process was still stable when the OLR was increased to 4.5 gVS/(L d), while the single-stage process failed. The study further revealed that by changing the HRT(hydrogen):HRT(methane) ratio of the two-stage process from 3:12 to 1:14, 6.7% more energy could be obtained. Microbial community analysis indicated that the dominant bacterial species were different in the hydrogen reactors (Thermoanaerobacterium thermosaccharolyticum-like species) and the methane reactors (Clostridium thermocellum-like species). The changes of substrates and HRT did not change the dominant species. The archaeal community structures in the methane reactors were similar in both the single- and two-stage reactors, with acetoclastic Methanosarcina acetivorans-like organisms as the dominant species.
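
    The energy comparison behind such figures can be illustrated with simple arithmetic using approximate lower heating values for hydrogen and methane; the gas yields below are hypothetical and are not the yields measured in this study.

      # Approximate lower heating values (MJ per normal cubic metre)
      LHV_H2, LHV_CH4 = 10.8, 35.8

      def energy(h2_yield, ch4_yield):
          """Total energy (MJ per kg VS) from H2 and CH4 yields given in Nm3 per kg VS."""
          return h2_yield * LHV_H2 + ch4_yield * LHV_CH4

      single_stage = energy(0.00, 0.30)   # hypothetical yields: methane only
      two_stage = energy(0.04, 0.32)      # hypothetical yields: some H2 plus more CH4
      gain = (two_stage - single_stage) / single_stage * 100
      print(f"single {single_stage:.1f} MJ, two-stage {two_stage:.1f} MJ, gain {gain:.1f}%")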

  19. Topological Data Analysis

    OpenAIRE

    2016-01-01

    Topological Data Analysis (TDA) can broadly be described as a collection of data analysis methods that find structure in data. This includes: clustering, manifold estimation, nonlinear dimension reduction, mode estimation, ridge estimation and persistent homology. This paper reviews some of these methods.

  20. Proximate Analysis of Coal

    Science.gov (United States)

    Donahue, Craig J.; Rais, Elizabeth A.

    2009-01-01

    This lab experiment illustrates the use of thermogravimetric analysis (TGA) to perform proximate analysis on a series of coal samples of different rank. Peat and coke are also examined. A total of four exercises are described. These are dry exercises as students interpret previously recorded scans. The weight percent moisture, volatile matter,…
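
    The arithmetic of a TGA-based proximate analysis can be sketched as follows; the mass readings are hypothetical and the temperature programme is only indicative of the usual procedure.

      # Hypothetical TGA mass readings (mg): after drying, after pyrolysis in N2,
      # and the residue after switching to air for combustion.
      m_initial = 25.00   # as received
      m_dry     = 24.10   # after holding at about 110 C
      m_char    = 16.30   # after pyrolysis at about 950 C in nitrogen
      m_ash     = 2.90    # residue after combustion in air

      moisture        = (m_initial - m_dry) / m_initial * 100
      volatile_matter = (m_dry - m_char) / m_initial * 100
      ash             = m_ash / m_initial * 100
      fixed_carbon    = 100 - moisture - volatile_matter - ash

      print(f"moisture {moisture:.1f}%  volatile matter {volatile_matter:.1f}%  "
            f"ash {ash:.1f}%  fixed carbon {fixed_carbon:.1f}%")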

  1. Deconstructing Statistical Analysis

    Science.gov (United States)

    Snell, Joel

    2014-01-01

    Using a very complex statistical analysis and research method for the sake of enhancing the prestige of an article, or to make a new product or service appear legitimate, needs to be monitored and questioned for accuracy. 1) The more complicated the statistical analysis and research, the fewer learned readers can understand it. This adds a…

  2. Structural analysis for diagnosis

    DEFF Research Database (Denmark)

    Izadi-Zamanabadi, Roozbeh; Blanke, M.

    2002-01-01

    Aiming at design of algorithms for fault diagnosis, structural analysis of systems offers concise yet easy overall analysis. Graph-based matching, which is the essential technique to obtain redundant information for diagnosis, is reconsidered in this paper. Matching is reformulated as a problem…

  3. Structural analysis for Diagnosis

    DEFF Research Database (Denmark)

    Izadi-Zamanabadi, Roozbeh; Blanke, M.

    2001-01-01

    Aiming at design of algorithms for fault diagnosis, structural analysis of systems offers concise yet easy overall analysis. Graph-based matching, which is the essential technique to obtain redundant information for diagnosis, is re-considered in this paper. Matching is re-formulated as a problem...

  4. Educational Cost Analysis.

    Science.gov (United States)

    Flynn, Donald L.

    Traditional approaches to the cost analysis of educational programs involve examining annual budgets. Such approaches do not properly consider the cost of either new capital expenditures or the current value of previously purchased items. This paper presents the methodology for a new approach to educational cost analysis that identifies the actual…

  5. Interactive Controls Analysis (INCA)

    Science.gov (United States)

    Bauer, Frank H.

    1989-01-01

    Version 3.12 of INCA provides user-friendly environment for design and analysis of linear control systems. System configuration and parameters easily adjusted, enabling INCA user to create compensation networks and perform sensitivity analysis in convenient manner. Full complement of graphical routines makes output easy to understand. Written in Pascal and FORTRAN.

  6. Northern blotting analysis

    DEFF Research Database (Denmark)

    Josefsen, Knud; Nielsen, Henrik

    2011-01-01

    Northern blotting analysis is a classical method for analysis of the size and steady-state level of a specific RNA in a complex sample. In short, the RNA is size-fractionated by gel electrophoresis and transferred by blotting onto a membrane to which the RNA is covalently bound. Then, the membran...

  7. Electric field analysis

    CERN Document Server

    Chakravorti, Sivaji

    2015-01-01

    This book prepares newcomers to dive into the realm of electric field analysis. The book details why one should perform electric field analysis and what are its practical implications. It emphasizes both the fundamentals and modern computational methods of electric machines. The book covers practical applications of the numerical methods in high voltage equipment, including transmission lines, power transformers, cables, and gas insulated systems.

  8. Computational Music Analysis

    DEFF Research Database (Denmark)

    methodological issues, harmonic and pitch-class set analysis, form and voice-separation, grammars and hierarchical reduction, motivic analysis and pattern discovery and, finally, classification and the discovery of distinctive patterns. As a detailed and up-to-date picture of current research in computational...

  9. NOTATIONAL ANALYSIS OF SPORT

    Directory of Open Access Journals (Sweden)

    Ian M. Franks

    2004-06-01

    Full Text Available This book addresses and appropriately explains the notational analysis of technique, tactics, individual athlete/team exercise and work-rate in sport. The book offers guidance in: developing a system, analysis of data, effective coaching using notational performance analysis, and modeling sport behaviors. It updates and improves the 1997 edition.

  10. Numerical Limit Analysis:

    DEFF Research Database (Denmark)

    Damkilde, Lars

    2007-01-01

    Limit State analysis has a long history and many prominent researchers have contributed. The theoretical foundation is based on the upper- and lower-bound theorems which give a very comprehensive and elegant formulation on complicated physical problems. In the pre-computer age Limit State analysis...

  11. From analysis to surface

    DEFF Research Database (Denmark)

    Bemman, Brian; Meredith, David

    2014-01-01

    of Sheer Pluck (1984), a twelve-tone composition for guitar by Milton Babbitt (1916–2011). This analysis focuses on the all-partition array structure on which the piece is based. Having pre- sented this analysis, we formalize some constraints on the structure of the piece and explore some computational...

  12. Structured Analysis - IDEF0

    DEFF Research Database (Denmark)

    Larsen, Michael Holm

    1999-01-01

    that require a modelling technique for the analysis, development, re-engineering, integration, or acquisition of information systems; and incorporate a systems or enterprise modelling technique into a business process analysis or software engineering methodology.This note is a summary of the Standard...

  13. Laboratory analysis of stardust.

    Science.gov (United States)

    Zinner, Ernst

    2013-02-01

    Tiny dust grains extracted from primitive meteorites are identified to have originated in the atmospheres of stars on the basis of their anomalous isotopic compositions. Although isotopic analysis with the ion microprobe plays a major role in the laboratory analysis of these stardust grains, many other microanalytical techniques are applied to extract the maximum amount of information.

  14. Fundamentals of Functional Analysis

    CERN Document Server

    Kutateladze, S S; Slovák, Jan

    2001-01-01

    A concise guide to basic sections of modern functional analysis. Included are such topics as the principles of Banach and Hilbert spaces, the theory of multinormed and uniform spaces, the Riesz-Dunford holomorphic functional calculus, the Fredholm index theory, convex analysis and duality theory for locally convex spaces with applications to the Schwartz spaces of distributions and Radon measures.

  15. Application of SWOT analysis.

    Science.gov (United States)

    Casebeer, A

    SWOT analysis is an effective and simple planning technique which addresses one aspect of many strategic planning processes. Given the complex nature of modern health care systems, the ability to use this type of technique can enable health professionals to participate more fully in the analysis and implementation of health care improvement.

  16. Zen and Behavior Analysis

    Science.gov (United States)

    Bass, Roger

    2010-01-01

    Zen's challenge for behavior analysis is to explain a repertoire that renders analysis itself meaningless--a result following not from scientific or philosophical arguments but rather from a unique verbal history generated by Zen's methods. Untying Zen's verbal knots suggests how meditation's and koans' effects on verbal behavior contribute to…

  17. Stochastic Flutter Analysis

    NARCIS (Netherlands)

    Verhoosel, C.V.; Gutiérrez, M.A.; Hulshoff, S.J.

    2006-01-01

    The field of fluid-structure interaction is combined with the field of stochastics to perform a stochastic flutter analysis. Various methods to directly incorporate the effects of uncertainties in the flutter analysis are investigated. The panel problem with a supersonic fluid flowing over it is con

  18. Robust Sparse Analysis Regularization

    CERN Document Server

    Vaiter, Samuel; Dossal, Charles; Fadili, Jalal

    2011-01-01

    This paper studies the properties of L1-analysis regularization for the resolution of linear inverse problems. Most previous works consider sparse synthesis priors where the sparsity is measured as the L1 norm of the coefficients that synthesize the signal in a given dictionary. In contrast, the more general analysis regularization minimizes the L1 norm of the correlations between the signal and the atoms in the dictionary. The corresponding variational problem includes several well-known regularizations such as the discrete total variation and the fused lasso. We first prove that a solution of analysis regularization is a piecewise affine function of the observations. Similarly, it is a piecewise affine function of the regularization parameter. This allows us to compute the degrees of freedom associated to sparse analysis estimators. Another contribution gives a sufficient condition to ensure that a signal is the unique solution of the analysis regularization when there is no noise in the observations. The s...
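
    A small numerical sketch of the variational problem described above, assuming the cvxpy package is available: the analysis prior penalizes the L1 norm of the correlations D x for a first-difference operator D, which is the discrete total variation case named in the abstract. The signal and noise level are invented.

      import numpy as np
      import cvxpy as cp

      n = 100
      rng = np.random.default_rng(0)
      truth = np.repeat([0.0, 1.0, -0.5], [40, 30, 30])         # piecewise-constant signal
      y = truth + 0.1 * rng.standard_normal(n)                  # noisy observations

      D = np.eye(n, k=1)[:-1] - np.eye(n)[:-1]                  # (n-1) x n first-difference operator
      x = cp.Variable(n)
      lam = 0.5
      problem = cp.Problem(cp.Minimize(0.5 * cp.sum_squares(y - x) + lam * cp.norm1(D @ x)))
      problem.solve()
      print("error of the analysis-regularized estimate:", np.linalg.norm(x.value - truth))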

  19. NMA Analysis Center

    Science.gov (United States)

    Kierulf, Halfdan Pascal; Andersen, Per Helge

    2013-01-01

    The Norwegian Mapping Authority (NMA) has during the last few years had a close cooperation with the Norwegian Defence Research Establishment (FFI) in the analysis of space geodetic data using the GEOSAT software. In 2012 NMA took over the full responsibility for the GEOSAT software. This implies that FFI stopped being an IVS Associate Analysis Center in 2012. NMA has been an IVS Associate Analysis Center since 28 October 2010. NMA's contributions to the IVS as an Analysis Center focus primarily on routine production of session-by-session unconstrained and consistent normal equations by GEOSAT as input to the IVS combined solution. After the recent improvements, we expect that VLBI results produced with GEOSAT will be consistent with results from the other VLBI Analysis Centers to a satisfactory level.

  20. IAC - INTEGRATED ANALYSIS CAPABILITY

    Science.gov (United States)

    Frisch, H. P.

    1994-01-01

    The objective of the Integrated Analysis Capability (IAC) system is to provide a highly effective, interactive analysis tool for the integrated design of large structures. With the goal of supporting the unique needs of engineering analysis groups concerned with interdisciplinary problems, IAC was developed to interface programs from the fields of structures, thermodynamics, controls, and system dynamics with an executive system and database to yield a highly efficient multi-disciplinary system. Special attention is given to user requirements such as data handling and on-line assistance with operational features, and the ability to add new modules of the user's choice at a future date. IAC contains an executive system, a data base, general utilities, interfaces to various engineering programs, and a framework for building interfaces to other programs. IAC has shown itself to be effective in automatic data transfer among analysis programs. IAC 2.5, designed to be compatible as far as possible with Level 1.5, contains a major upgrade in executive and database management system capabilities, and includes interfaces to enable thermal, structures, optics, and control interaction dynamics analysis. The IAC system architecture is modular in design. 1) The executive module contains an input command processor, an extensive data management system, and driver code to execute the application modules. 2) Technical modules provide standalone computational capability as well as support for various solution paths or coupled analyses. 3) Graphics and model generation interfaces are supplied for building and viewing models. Advanced graphics capabilities are provided within particular analysis modules such as INCA and NASTRAN. 4) Interface modules provide for the required data flow between IAC and other modules. 5) User modules can be arbitrary executable programs or JCL procedures with no pre-defined relationship to IAC. 6) Special purpose modules are included, such as MIMIC (Model

  1. Functional data analysis

    CERN Document Server

    Ramsay, J O

    1997-01-01

    Scientists today collect samples of curves and other functional observations. This monograph presents many ideas and techniques for such data. Included are expressions in the functional domain of such classics as linear regression, principal components analysis, linear modelling, and canonical correlation analysis, as well as specifically functional techniques such as curve registration and principal differential analysis. Data arising in real applications are used throughout for both motivation and illustration, showing how functional approaches allow us to see new things, especially by exploiting the smoothness of the processes generating the data. The data sets exemplify the wide scope of functional data analysis; they are drawn from growth analysis, meteorology, biomechanics, equine science, economics, and medicine. The book presents novel statistical technology while keeping the mathematical level widely accessible. It is designed to appeal to students, to applied data analysts, and to experienced researc...
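
    One of the specifically functional techniques the monograph covers, principal components analysis of curves, can be sketched in a few lines of numpy: curves sampled on a common grid are centred and the discretized covariance operator is eigen-decomposed. The simulated curves below are assumptions for illustration only.

      import numpy as np

      rng = np.random.default_rng(1)
      t = np.linspace(0.0, 1.0, 101)                   # common evaluation grid
      n_curves = 50
      scores = rng.standard_normal((n_curves, 2))
      curves = (scores[:, :1] * np.sin(2 * np.pi * t)  # two smooth modes of variation
                + 0.3 * scores[:, 1:] * np.cos(4 * np.pi * t)
                + 0.05 * rng.standard_normal((n_curves, t.size)))

      centered = curves - curves.mean(axis=0)
      cov = centered.T @ centered / n_curves           # discretized covariance operator
      eigvals, eigvecs = np.linalg.eigh(cov)
      order = np.argsort(eigvals)[::-1]
      print("variance share of first two functional PCs:",
            eigvals[order][:2].sum() / eigvals.sum())  # close to 1 for these curves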

  2. Gait analysis: clinical facts.

    Science.gov (United States)

    Baker, Richard; Esquenazi, Alberto; Benedetti, Maria G; Desloovere, Kaat

    2016-08-01

    Gait analysis is a well-established tool for the quantitative assessment of gait disturbances providing functional diagnosis, assessment for treatment planning, and monitoring of disease progress. There is a large volume of literature on the research use of gait analysis, but evidence on its clinical routine use supports a favorable cost-benefit ratio in a limited number of conditions. Initially gait analysis was introduced to clinical practice to improve the management of children with cerebral palsy. However, there is good evidence to extend its use to patients with various upper motor neuron diseases, and to lower limb amputation. Thereby, the methodology for properly conducting and interpreting the exam is of paramount relevance. Appropriateness of gait analysis prescription and reliability of data obtained are required in the clinical environment. This paper provides an overview on guidelines for managing a clinical gait analysis service and on the principal clinical domains of its application: cerebral palsy, stroke, traumatic brain injury and lower limb amputation.

  3. Systems engineering and analysis

    CERN Document Server

    Blanchard, Benjamin S

    2010-01-01

    For senior-level undergraduate and first and second year graduate systems engineering and related courses. A total life-cycle approach to systems and their analysis. This practical introduction to systems engineering and analysis provides the concepts, methodologies, models, and tools needed to understand and implement a total life-cycle approach to systems and their analysis. The authors focus first on the process of bringing systems into being--beginning with the identification of a need and extending that need through requirements determination, functional analysis and allocation, design synthesis, evaluation, and validation, operation and support, phase-out, and disposal. Next, the authors discuss the improvement of systems currently in being, showing that by employing the iterative process of analysis, evaluation, feedback, and modification, most systems in existence can be improved in their affordability, effectiveness, and stakeholder satisfaction.

  4. Social Set Analysis

    DEFF Research Database (Denmark)

    Vatrapu, Ravi; Mukkamala, Raghava Rao; Hussain, Abid

    2016-01-01

    this limitation, based on the sociology of associations and the mathematics of set theory, this paper presents a new approach to big data analytics called social set analysis. Social set analysis consists of a generative framework for the philosophies of computational social science, theory of social data......, conceptual and formal models of social data, and an analytical framework for combining big social data sets with organizational and societal data sets. Three empirical studies of big social data are presented to illustrate and demonstrate social set analysis in terms of fuzzy set-theoretical sentiment...... analysis, crisp set-theoretical interaction analysis, and event-studies-oriented set-theoretical visualizations. Implications for big data analytics, current limitations of the set-theoretical approach, and future directions are outlined....

  5. Statistical Analysis of Thermal Analysis Margin

    Science.gov (United States)

    Garrison, Matthew B.

    2011-01-01

    NASA Goddard Space Flight Center requires that each project demonstrate a minimum of 5 C margin between temperature predictions and hot and cold flight operational limits. The bounding temperature predictions include worst-case environment and thermal optical properties. The purpose of this work is to assess how current missions are performing against their pre-launch bounding temperature predictions and to suggest possible changes to the thermal analysis margin rules.
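
    The 5 C rule amounts to a simple comparison between bounding temperature predictions and flight operational limits. A small sketch with invented component data (not mission data from the study):

      REQUIRED_MARGIN_C = 5.0
      components = [
          # name, predicted (cold, hot), operational limits (cold, hot), all in deg C
          ("battery",      (-4.0, 24.0), (-10.0, 30.0)),
          ("star_tracker", (-18.0, 42.0), (-20.0, 50.0)),
      ]
      for name, (pred_cold, pred_hot), (lim_cold, lim_hot) in components:
          cold_margin = pred_cold - lim_cold
          hot_margin = lim_hot - pred_hot
          ok = min(cold_margin, hot_margin) >= REQUIRED_MARGIN_C
          print(f"{name:12s} cold margin {cold_margin:4.1f} C, hot margin {hot_margin:4.1f} C"
                f" -> {'OK' if ok else 'violates the 5 C rule'}")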

  6. Dynamics in Epistasis Analysis.

    Science.gov (United States)

    Awdeh, Aseel; Phenix, Hilary; Kaern, Mads; Perkins, Theodore

    2017-01-16

    Finding regulatory relationships between genes, including the direction and nature of influence between them, is a fundamental challenge in the field of molecular genetics. One classical approach to this problem is epistasis analysis. Broadly speaking, epistasis analysis infers the regulatory relationships between a pair of genes in a genetic pathway by considering the patterns of change in an observable trait resulting from single and double deletion of genes. While classical epistasis analysis has yielded deep insights on numerous genetic pathways, it is not without limitations. Here, we explore the possibility of dynamic epistasis analysis, in which, in addition to performing genetic perturbations of a pathway, we drive the pathway by a time-varying upstream signal. We explore the theoretical power of dynamical epistasis analysis by conducting an identifiability analysis of Boolean models of genetic pathways, comparing static and dynamic approaches. We find that even relatively simple input dynamics greatly increases the power of epistasis analysis to discriminate alternative network structures. Further, we explore the question of experiment design, and show that a subset of short time-varying signals, which we call dynamic primitives, allow maximum discriminative power with a reduced number of experiments.
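
    A toy illustration of the approach, using assumptions of my own rather than the authors' models: a small Boolean pathway is driven by a time-varying upstream signal, and the trajectory of an observable trait is recorded for the wild type and for single and double deletions, so that the genotype-dependent responses can be compared.

      def simulate(input_signal, deleted=frozenset()):
          a = b = 0
          trait = []
          for u in input_signal:
              a_next = 0 if "A" in deleted else u        # gene A is activated by the input
              b_next = 0 if "B" in deleted else a        # gene B is activated by A
              a, b = a_next, b_next
              trait.append(a | b)                        # trait is on if A or B is active
          return trait

      signal = [0, 1, 1, 0, 1, 0, 0, 1]                  # a simple time-varying input
      for genotype in [set(), {"A"}, {"B"}, {"A", "B"}]:
          label = "wild type" if not genotype else "delete " + "+".join(sorted(genotype))
          print(f"{label:12s}", simulate(signal, frozenset(genotype)))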

  7. Meaning from curriculum analysis

    Science.gov (United States)

    Finegold, Menahem; Mackeracher, Dorothy

    This paper reports on the analysis of science curricula carried out across Canada within the framework of the Second International Science Study (SISS). The organization of Canadian education in twelve autonomous educational jurisdictions is briefly described and problems are noted in relation to the analysis of curricula on a national scale. The international design for curriculum analysis is discussed and an alternative design, more suited to the diversity of science education in Canada, is introduced. The analysis of curriculum documents is described and three patterns which emerge from this analysis are identified. These derive from the concepts of commonality, specificity and prescriptiveness. Commonality relates to topics listed in curriculum guideline documents by a number of jurisdictions. Specificity refers to the richness of curriculum documents. Prescriptiveness is a measure of the extent to which jurisdictions do or do not make provision for local options in curriculum design. The Canadian analysis, using the concepts of the common curriculum, specificity and prescriptiveness, is described and research procedures are exemplified. Outcomes of curriculum analysis are presented in graphical form.

  8. Material analysis on engineering statistics

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seung Hun

    2008-03-15

    This book presents material on engineering statistics using Minitab, including technical statistics and the seven tools of QC, probability distributions, estimation and hypothesis testing, regression analysis, time series analysis, control charts, process capability analysis, measurement system analysis, sampling inspection, design of experiments, response surface analysis, compound experiments, the Taguchi method, and nonparametric statistics. It is suitable for use by both universities and companies because it presents theory first and then analysis with Minitab for Six Sigma BB and MBB work.
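
    One of the listed topics, process capability analysis, reduces to a short calculation that can be reproduced without Minitab. The measurements and specification limits below are invented for illustration.

      import numpy as np

      data = np.random.default_rng(4).normal(10.02, 0.03, 200)   # measured values
      lsl, usl = 9.90, 10.10                                      # specification limits
      mu, sigma = data.mean(), data.std(ddof=1)

      cp = (usl - lsl) / (6 * sigma)                              # potential capability
      cpk = min(usl - mu, mu - lsl) / (3 * sigma)                 # capability allowing for centring
      print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")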

  9. Static analysis for blinding

    DEFF Research Database (Denmark)

    Nielsen, Christoffer Rosenkilde; Nielson, Hanne Riis

    2006-01-01

    operation blinding. In this paper we study the theoretical foundations for one of the successful approaches to validating cryptographic protocols and we extend it to handle the blinding primitive. Our static analysis approach is based on Flow Logic; this gives us a clean separation between the specification...... of the analysis and its realisation in an automatic tool. We concentrate on the former in the present paper and provide the semantic foundation for our analysis of protocols using blinding - also in the presence of malicious attackers....

  10. Foundations of Risk Analysis

    CERN Document Server

    Aven, Terje

    2012-01-01

    Foundations of Risk Analysis presents the issues core to risk analysis - understanding what risk means, expressing risk, building risk models, addressing uncertainty, and applying probability models to real problems. The author provides the readers with the knowledge and basic thinking they require to successfully manage risk and uncertainty to support decision making. This updated edition reflects recent developments on risk and uncertainty concepts, representations and treatment. New material in Foundations of Risk Analysis includes: An up to date presentation of how to understand, define and

  11. Android malware and analysis

    CERN Document Server

    Dunham, Ken

    2014-01-01

    The rapid growth and development of Android-based devices has resulted in a wealth of sensitive information on mobile devices that offer minimal malware protection. This has created an immediate demand for security professionals that understand how to best approach the subject of Android malware threats and analysis.In Android Malware and Analysis, Ken Dunham, renowned global malware expert and author, teams up with international experts to document the best tools and tactics available for analyzing Android malware. The book covers both methods of malware analysis: dynamic and static.This tact

  12. Observations on risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, W.A. Jr.

    1979-11-01

    This paper briefly describes WASH 1400 and the Lewis report. It attempts to define basic concepts such as risk and risk analysis, common mode failure, and rare event. Several probabilistic models which go beyond the WASH 1400 methodology are introduced; the common characteristic of these models is that they recognize explicitly that risk analysis is time dependent whereas WASH 1400 takes a per demand failure rate approach which obscures the important fact that accidents are time related. Further, the presentation of a realistic risk analysis should recognize that there are various risks which compete with one another for the lives of the individuals at risk. A way of doing this is suggested.
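
    The time-dependence and competing-risks points can be made with a few lines of arithmetic. Assuming constant (exponential) hazards with invented rates, the risk accumulated by time t grows with t rather than being a fixed per-demand figure, and the competing causes split the probability of an event in proportion to their rates.

      import math

      lam_accident = 1e-4      # assumed hazard of the accident of interest, per year
      lam_other = 1e-2         # assumed hazard of the competing risks, per year
      t = 40.0                 # exposure time in years

      total = lam_accident + lam_other
      p_any_event = 1.0 - math.exp(-total * t)
      p_accident_first = (lam_accident / total) * p_any_event
      print(f"P(some event by time t)      = {p_any_event:.4f}")
      print(f"P(the accident occurs first) = {p_accident_first:.6f}")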

  13. Basic stress analysis

    CERN Document Server

    Iremonger, M J

    1982-01-01

    BASIC Stress Analysis aims to help students to become proficient at BASIC programming by actually using it in an important engineering subject. It also enables the student to use computing as a means of learning stress analysis because writing a program is analogous to teaching: it is necessary to understand the subject matter. The book begins by introducing the BASIC approach and the concept of stress analysis at first- and second-year undergraduate level. Subsequent chapters contain a summary of relevant theory, worked examples containing computer programs, and a set of problems. Topics c

  14. Methods of Multivariate Analysis

    CERN Document Server

    Rencher, Alvin C

    2012-01-01

    Praise for the Second Edition "This book is a systematic, well-written, well-organized text on multivariate analysis packed with intuition and insight . . . There is much practical wisdom in this book that is hard to find elsewhere."-IIE Transactions Filled with new and timely content, Methods of Multivariate Analysis, Third Edition provides examples and exercises based on more than sixty real data sets from a wide variety of scientific fields. It takes a "methods" approach to the subject, placing an emphasis on how students and practitioners can employ multivariate analysis in real-life sit

  15. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  16. Bayesian nonparametric data analysis

    CERN Document Server

    Müller, Peter; Jara, Alejandro; Hanson, Tim

    2015-01-01

    This book reviews nonparametric Bayesian methods and models that have proven useful in the context of data analysis. Rather than providing an encyclopedic review of probability models, the book’s structure follows a data analysis perspective. As such, the chapters are organized by traditional data analysis problems. In selecting specific nonparametric models, simpler and more traditional models are favored over specialized ones. The discussed methods are illustrated with a wealth of examples, including applications ranging from stylized examples to case studies from recent literature. The book also includes an extensive discussion of computational methods and details on their implementation. R code for many examples is included in on-line software pages.

  17. Wood Products Analysis

    Science.gov (United States)

    1990-01-01

    Structural Reliability Consultants' computer program creates graphic plots showing the statistical parameters of glue laminated timbers, or 'glulam.' The company president, Dr. Joseph Murphy, read in NASA Tech Briefs about work related to analysis of Space Shuttle surface tile strength performed for Johnson Space Center by Rockwell International Corporation. Analysis led to a theory of 'consistent tolerance bounds' for statistical distributions, applicable in industrial testing where statistical analysis can influence product development and use. Dr. Murphy then obtained the Tech Support Package that covers the subject in greater detail. The TSP became the basis for Dr. Murphy's computer program PC-DATA, which he is marketing commercially.

  18. [Emergy analysis of ecosystems].

    Science.gov (United States)

    Lan, S; Qin, P

    2001-02-01

    The concept of emergy and its use for comparing with energy were discussed in this paper. Emergy was defined as the amount of available energy required directly and indirectly to make a product or services, and emergy analysis was based on the solar emergy units (solar emjoules). Emergy provided the methodology with a common basis for measuring the value of different kinds of energy, and for evaluating the contributions from nature and humanity. A comparison of emergy analysis with previous energy analysis was also performed.
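
    In practice an emergy evaluation multiplies each input energy flow by a solar transformity and sums the products in solar emjoules. A minimal sketch; apart from the definitional transformity of 1 sej/J for sunlight, the flows and transformities below are placeholders, not literature values.

      flows = {
          # name: (energy flow in J/yr, solar transformity in sej/J)
          "sunlight":      (6.0e15, 1.0),
          "rain_chemical": (2.5e12, 1.8e4),
          "fuel":          (4.0e11, 6.6e4),
          "labor":         (1.0e9,  8.0e6),
      }
      emergy = {name: energy * transformity for name, (energy, transformity) in flows.items()}
      total = sum(emergy.values())
      for name, value in emergy.items():
          print(f"{name:14s} {value:10.3e} sej/yr ({value / total:6.1%} of total)")
      print(f"{'total':14s} {total:10.3e} sej/yr")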

  19. Building energy analysis tool

    Science.gov (United States)

    Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars

    2016-04-12

    A building energy analysis system includes a building component library configured to store a plurality of building components, a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and using selected building components of the plurality of building components stored in the building component library, a building analysis engine configured to operate the building model and generate a baseline energy model of the building under analysis and further configured to apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models, and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.

  20. Structural analysis of DAEs

    DEFF Research Database (Denmark)

    Poulsen, Mikael Zebbelin

    2002-01-01

    analysis of DAE is original in the sense that it is based on a new matrix representation of the structural information of a general DAE system instead of a graph oriented representation. Also the presentation of the theory is found to be more complete compared to other presentations, since it e.g. proves....... The methodology is mainly based on structural index analysis which is not limited by the index of the DAE as other methodologies. As a result of structural index analysis one can perform index reduction of the DAE and obtain the so-called augmented underlying ODE. It is also described how to use the augmented......, by the implementation of the Simpy tool box. This is an object oriented system implemented in the Python language. It can be used for analysis of DAEs, ODEs and non-linear equations and uses e.g. symbolic representations of expressions and equations. The presentations of theory and algorithms for structural index...

  1. Principles of Fourier analysis

    CERN Document Server

    Howell, Kenneth B

    2001-01-01

    Fourier analysis is one of the most useful and widely employed sets of tools for the engineer, the scientist, and the applied mathematician. As such, students and practitioners in these disciplines need a practical and mathematically solid introduction to its principles. They need straightforward verifications of its results and formulas, and they need clear indications of the limitations of those results and formulas. Principles of Fourier Analysis furnishes all this and more. It provides a comprehensive overview of the mathematical theory of Fourier analysis, including the development of Fourier series, "classical" Fourier transforms, generalized Fourier transforms and analysis, and the discrete theory. Much of the author's development is strikingly different from typical presentations. His approach to defining the classical Fourier transform results in a much cleaner, more coherent theory that leads naturally to a starting point for the generalized theory. He also introduces a new generalized theory based ...
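
    The discrete theory mentioned above is the part most readily demonstrated in code. A short numpy example recovering the two frequencies present in an invented sampled signal:

      import numpy as np

      fs = 200.0                                     # sampling rate in Hz
      t = np.arange(0.0, 1.0, 1.0 / fs)
      signal = np.sin(2 * np.pi * 15 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)

      spectrum = np.fft.rfft(signal)                 # discrete Fourier analysis
      freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
      peaks = freqs[np.argsort(np.abs(spectrum))[-2:]]
      print("dominant frequencies (Hz):", sorted(peaks))   # approximately [15.0, 40.0]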

  2. Analysis in Banach spaces

    CERN Document Server

    Hytönen, Tuomas; Veraar, Mark; Weis, Lutz

    2016-01-01

    The present volume develops the theory of integration in Banach spaces, martingales and UMD spaces, and culminates in a treatment of the Hilbert transform, Littlewood-Paley theory and the vector-valued Mihlin multiplier theorem. Over the past fifteen years, motivated by regularity problems in evolution equations, there has been tremendous progress in the analysis of Banach space-valued functions and processes. The contents of this extensive and powerful toolbox have been mostly scattered around in research papers and lecture notes. Collecting this diverse body of material into a unified and accessible presentation fills a gap in the existing literature. The principal audience that we have in mind consists of researchers who need and use Analysis in Banach Spaces as a tool for studying problems in partial differential equations, harmonic analysis, and stochastic analysis. Self-contained and offering complete proofs, this work is accessible to graduate students and researchers with a background in functional an...

  3. NOAA's Inundation Analysis Tool

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Coastal storms and other meteorological phenomenon can have a significant impact on how high water levels rise and how often. The inundation analysis program is...

  4. Coal - proximate analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-06-14

    This Standard establishes a practice for the proximate analysis of coal, that is, the coal is analysed for the content of moisture, ash and volatile matter; fixed carbon is calculated. The standard provides a basis for the comparison of coals.

  5. Lectures on Functional Analysis

    CERN Document Server

    Kurepa, Svetozar; Kraljević, Hrvoje

    1987-01-01

    This volume consists of a long monographic paper by J. Hoffmann-Jorgensen and a number of shorter research papers and survey articles covering different aspects of functional analysis and its application to probability theory and differential equations.

  6. Water Quality Analysis Simulation

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Water Quality analysis simulation Program, an enhancement of the original WASP. This model helps users interpret and predict water quality responses to natural...

  7. Energy Sector Market Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Arent, D.; Benioff, R.; Mosey, G.; Bird, L.; Brown, J.; Brown, E.; Vimmerstedt, L.; Aabakken, J.; Parks, K.; Lapsa, M.; Davis, S.; Olszewski, M.; Cox, D.; McElhaney, K.; Hadley, S.; Hostick, D.; Nicholls, A.; McDonald, S.; Holloman, B.

    2006-10-01

    This paper presents the results of energy market analysis sponsored by the Department of Energy's (DOE) Weatherization and International Program (WIP) within the Office of Energy Efficiency and Renewable Energy (EERE). The analysis was conducted by a team of DOE laboratory experts from the National Renewable Energy Laboratory (NREL), Oak Ridge National Laboratory (ORNL), and Pacific Northwest National Laboratory (PNNL), with additional input from Lawrence Berkeley National Laboratory (LBNL). The analysis was structured to identify those markets and niches where government can create the biggest impact by informing management decisions in the private and public sectors. The analysis identifies those markets and niches where opportunities exist for increasing energy efficiency and renewable energy use.

  8. Multidimensional nonlinear descriptive analysis

    CERN Document Server

    Nishisato, Shizuhiko

    2006-01-01

    Quantification of categorical, or non-numerical, data is a problem that scientists face across a wide range of disciplines. Exploring data analysis in various areas of research, such as the social sciences and biology, Multidimensional Nonlinear Descriptive Analysis presents methods for analyzing categorical data that are not necessarily sampled randomly from a normal population and often involve nonlinear relations. This reference not only provides an overview of multidimensional nonlinear descriptive analysis (MUNDA) of discrete data, it also offers new results in a variety of fields. The first part of the book covers conceptual and technical preliminaries needed to understand the data analysis in subsequent chapters. The next two parts contain applications of MUNDA to diverse data types, with each chapter devoted to one type of categorical data, a brief historical comment, and basic skills peculiar to the data types. The final part examines several problems and then concludes with suggestions for futu...

  9. GAP Analysis Program (GAP)

    Data.gov (United States)

    Kansas Data Access and Support Center — The Kansas GAP Analysis Land Cover database depicts 43 land cover classes for the state of Kansas. The database was generated using a two-stage hybrid classification...

  10. Errors from Image Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wood, William Monford [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-02-23

    This report presents a systematic study of the standard analysis of rod-pinch radiographs for obtaining quantitative measurements of areal mass densities, and makes suggestions for improving the methodology of obtaining quantitative information from radiographed objects.

  11. ANALYSIS OF MULTISCALE METHODS

    Institute of Scientific and Technical Information of China (English)

    Wei-nan E; Ping-bing Ming

    2004-01-01

    The heterogeneous multiscale method gives a general framework for the analysis of multiscale methods. In this paper, we demonstrate this by applying this framework to two canonical problems: The elliptic problem with multiscale coefficients and the quasicontinuum method.

  12. Perspectives in shape analysis

    CERN Document Server

    Bruckstein, Alfred; Maragos, Petros; Wuhrer, Stefanie

    2016-01-01

    This book presents recent advances in the field of shape analysis. Written by experts in the fields of continuous-scale shape analysis, discrete shape analysis and sparsity, and numerical computing who hail from different communities, it provides a unique view of the topic from a broad range of perspectives. Over the last decade, it has become increasingly affordable to digitize shape information at high resolution. Yet analyzing and processing this data remains challenging because of the large amount of data involved, and because modern applications such as human-computer interaction require real-time processing. Meeting these challenges requires interdisciplinary approaches that combine concepts from a variety of research areas, including numerical computing, differential geometry, deformable shape modeling, sparse data representation, and machine learning. On the algorithmic side, many shape analysis tasks are modeled using partial differential equations, which can be solved using tools from the field of n...

  13. Intelligent audio analysis

    CERN Document Server

    Schuller, Björn W

    2013-01-01

    This book provides the reader with the knowledge necessary for comprehension of the field of Intelligent Audio Analysis. It firstly introduces standard methods and discusses the typical Intelligent Audio Analysis chain going from audio data to audio features to audio recognition.  Further, an introduction to audio source separation, and enhancement and robustness are given. After the introductory parts, the book shows several applications for the three types of audio: speech, music, and general sound. Each task is shortly introduced, followed by a description of the specific data and methods applied, experiments and results, and a conclusion for this specific task. The book provides benchmark results and standardized test-beds for a broader range of audio analysis tasks. The main focus thereby lies on the parallel advancement of realism in audio analysis, as too often today’s results are overly optimistic owing to idealized testing conditions, and it serves to stimulate synergies arising from transfer of ...

  14. Longitudinal categorical data analysis

    CERN Document Server

    Sutradhar, Brajendra C

    2014-01-01

    This is the first book in longitudinal categorical data analysis with parametric correlation models developed based on dynamic relationships among repeated categorical responses. This book is a natural generalization of the longitudinal binary data analysis to the multinomial data setup with more than two categories. Thus, unlike the existing books on cross-sectional categorical data analysis using log linear models, this book uses multinomial probability models both in cross-sectional and longitudinal setups. A theoretical foundation is provided for the analysis of univariate multinomial responses, by developing models systematically for the cases with no covariates as well as categorical covariates, both in cross-sectional and longitudinal setups. In the longitudinal setup, both stationary and non-stationary covariates are considered. These models have also been extended to the bivariate multinomial setup along with suitable covariates. For the inferences, the book uses the generalized quasi-likelihood as w...

  15. Unsupervised Linear Discriminant Analysis

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    An algorithm for unsupervised linear discriminant analysis was presented. Optimal unsupervised discriminant vectors are obtained through maximizing covariance of all samples and minimizing covariance of local k-nearest neighbor samples. The experimental results show our algorithm is effective.
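
    The description above can be read as a generalized eigenproblem between a total-scatter matrix and a local scatter matrix built from k-nearest-neighbour pairs. The sketch below is a rough reading of that description, not the authors' exact algorithm; the regularization term and the random test data are my own additions.

      import numpy as np
      from scipy.linalg import eigh
      from scipy.spatial.distance import cdist

      def unsupervised_discriminant_vectors(X, k=5, n_vectors=2):
          Xc = X - X.mean(axis=0)
          S_total = Xc.T @ Xc / len(X)                       # covariance of all samples
          D = cdist(X, X)
          np.fill_diagonal(D, np.inf)
          S_local = np.zeros_like(S_total)                   # scatter of local kNN differences
          for i, row in enumerate(D):
              for j in np.argsort(row)[:k]:
                  d = (X[i] - X[j])[:, None]
                  S_local += d @ d.T
          S_local /= len(X) * k
          vals, vecs = eigh(S_total, S_local + 1e-6 * np.eye(X.shape[1]))
          return vecs[:, np.argsort(vals)[::-1][:n_vectors]]

      X = np.random.default_rng(2).standard_normal((100, 5))
      print(unsupervised_discriminant_vectors(X).shape)      # (5, 2)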

  16. Main: Nucleotide Analysis [KOME

    Lifescience Database Archive (English)

    Full Text Available Nucleotide Analysis: japonica genome BLAST search result. Result of a blastn search against the japonica genome sequence (kome_japonica_genome_blast_search_result.zip).

  17. Russian River Analysis

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This document is an analysis and summary of progress toward achieving the interim management objectives for the Russian River during the 1979 season. Additionally,...

  18. Design and Analysis Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Provides engineering design of aircraft components, subsystems and installations using Pro/E, Anvil 1000, CADKEY 97, AutoCAD 13. Engineering analysis tools include...

  19. Distributed Analysis in CMS

    CERN Document Server

    Fanfani, Alessandra; Sanches, Jose Afonso; Andreeva, Julia; Bagliesi, Giusepppe; Bauerdick, Lothar; Belforte, Stefano; Bittencourt Sampaio, Patricia; Bloom, Ken; Blumenfeld, Barry; Bonacorsi, Daniele; Brew, Chris; Calloni, Marco; Cesini, Daniele; Cinquilli, Mattia; Codispoti, Giuseppe; D'Hondt, Jorgen; Dong, Liang; Dongiovanni, Danilo; Donvito, Giacinto; Dykstra, David; Edelmann, Erik; Egeland, Ricky; Elmer, Peter; Eulisse, Giulio; Evans, Dave; Fanzago, Federica; Farina, Fabio; Feichtinger, Derek; Fisk, Ian; Flix, Josep; Grandi, Claudio; Guo, Yuyi; Happonen, Kalle; Hernandez, Jose M; Huang, Chih-Hao; Kang, Kejing; Karavakis, Edward; Kasemann, Matthias; Kavka, Carlos; Khan, Akram; Kim, Bockjoo; Klem, Jukka; Koivumaki, Jesper; Kress, Thomas; Kreuzer, Peter; Kurca, Tibor; Kuznetsov, Valentin; Lacaprara, Stefano; Lassila-Perini, Kati; Letts, James; Linden, Tomas; Lueking, Lee; Maes, Joris; Magini, Nicolo; Maier, Gerhild; McBride, Patricia; Metson, Simon; Miccio, Vincenzo; Padhi, Sanjay; Pi, Haifeng; Riahi, Hassen; Riley, Daniel; Rossman, Paul; Saiz, Pablo; Sartirana, Andrea; Sciaba, Andrea; Sekhri, Vijay; Spiga, Daniele; Tuura, Lassi; Vaandering, Eric; Vanelderen, Lukas; Van Mulders, Petra; Vedaee, Aresh; Villella, Ilaria; Wicklund, Eric; Wildish, Tony; Wissing, Christoph; Wurthwein, Frank

    2009-01-01

    The CMS experiment expects to manage several Pbytes of data each year during the LHC programme, distributing them over many computing sites around the world and enabling data access at those centers for analysis. CMS has identified the distributed sites as the primary location for physics analysis to support a wide community with thousands of potential users. This represents an unprecedented experimental challenge in terms of the scale of distributed computing resources and the number of users. An overview of the computing architecture, the software tools and the distributed infrastructure is reported. Summaries of the experience in establishing efficient and scalable operations to get prepared for CMS distributed analysis are presented, followed by the user experience in their current analysis activities.

  20. Materials analysis fast ions

    CERN Document Server

    Denker, A; Rauschenberg, J; Röhrich, J; Strub, E

    2006-01-01

    Materials analysis with ion beams exploits the interaction of ions with the electrons and nuclei in the sample. Among the vast variety of possible analytical techniques available with ion beams, we will restrict ourselves to ion beam analysis in the energy range from one to several MeV per mass unit. It is possible to use either the back-scattered projectiles (RBS – Rutherford Back Scattering) or the recoiled atoms themselves (ERDA – Elastic Recoil Detection Analysis) from the elastic scattering processes. These techniques allow the simultaneous and absolute determination of stoichiometry and depth profiles of the detected elements. The interaction of the ions with the electrons in the sample produces holes in the inner electronic shells of the sample atoms, which recombine and emit X-rays characteristic for the element in question. Particle Induced X-ray Emission (PIXE) has proven to be a fast technique for the analysis of elements with an atomic number above 11.
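
    The element identification behind RBS rests on the elastic kinematic factor K = E_scattered / E_incident, which depends only on the projectile mass, the target mass and the scattering angle. A short sketch; the 2 MeV He-4 beam and 170 degree detector angle are typical assumed values, not parameters quoted in the article.

      import math

      def kinematic_factor(m1, m2, theta_deg):
          """RBS kinematic factor for a projectile of mass m1 on a heavier target mass m2."""
          theta = math.radians(theta_deg)
          root = math.sqrt(m2**2 - (m1 * math.sin(theta))**2)
          return ((root + m1 * math.cos(theta)) / (m1 + m2))**2

      # backscattered energy of 2 MeV He-4 at 170 degrees from silicon and from gold
      for target, mass in [("Si", 28.09), ("Au", 196.97)]:
          print(target, round(2.0 * kinematic_factor(4.0026, mass, 170.0), 3), "MeV")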

  1. Systems analysis-independent analysis and verification

    Energy Technology Data Exchange (ETDEWEB)

    Badin, J.S.; DiPietro, J.P. [Energetics, Inc., Columbia, MD (United States)

    1995-09-01

    The DOE Hydrogen Program is supporting research, development, and demonstration activities to overcome the barriers to the integration of hydrogen into the Nation's energy infrastructure. Much work is required to gain acceptance of hydrogen energy system concepts and to develop them for implementation. A systems analysis database has been created that includes a formal documentation of technology characterization profiles and cost and performance information. Through a systematic and quantitative approach, system developers can understand and address important issues and thereby assure effective and timely commercial implementation. This project builds upon and expands the previously developed and tested pathway model and provides the basis for a consistent and objective analysis of all hydrogen energy concepts considered by the DOE Hydrogen Program Manager. This project can greatly accelerate the development of a system by minimizing the risk of costly design evolutions, and by stimulating discussions, feedback, and coordination of key players and allows them to assess the analysis, evaluate the trade-offs, and to address any emerging problem areas. Specific analytical studies will result in the validation of the competitive feasibility of the proposed system and identify system development needs. Systems that are investigated include hydrogen bromine electrolysis, municipal solid waste gasification, electro-farming (biomass gasifier and PEM fuel cell), wind/hydrogen hybrid system for remote sites, home electrolysis and alternate infrastructure options, renewable-based electrolysis to fuel PEM fuel cell vehicle fleet, and geothermal energy used to produce hydrogen. These systems are compared to conventional and benchmark technologies. Interim results and findings are presented. Independent analyses emphasize quality, integrity, objectivity, a long-term perspective, corporate memory, and the merging of technical, economic, operational, and programmatic expertise.

  2. Static Analysis Numerical Algorithms

    Science.gov (United States)

    2016-04-01

    Static Analysis of Numerical Algorithms. Kestrel Technology, LLC, April 2016. Final technical report, approved for public release; distribution... Dates covered: Nov 2013 – Nov 2015. Contract number FA8750-14-C... algorithms, linear digital filters and integrating accumulators, modifying existing versions of Honeywell’s HiLiTE model-based development system and

  3. Applied functional analysis

    CERN Document Server

    Griffel, DH

    2002-01-01

    A stimulating introductory text, this volume examines many important applications of functional analysis to mechanics, fluid mechanics, diffusive growth, and approximation. Detailed enough to impart a thorough understanding, the text is also sufficiently straightforward for those unfamiliar with abstract analysis. Its four-part treatment begins with distribution theory and discussions of Green's functions. Essentially independent of the preceding material, the second and third parts deal with Banach spaces, Hilbert space, spectral theory, and variational techniques. The final part outlines the

  4. DART system analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Boggs, Paul T.; Althsuler, Alan (Exagrid Engineering); Larzelere, Alex R. (Exagrid Engineering); Walsh, Edward J.; Clay, Ruuobert L.; Hardwick, Michael F. (Sandia National Laboratories, Livermore, CA)

    2005-08-01

    The Design-through-Analysis Realization Team (DART) is chartered with reducing the time Sandia analysts require to complete the engineering analysis process. The DART system analysis team studied the engineering analysis processes employed by analysts in Centers 9100 and 8700 at Sandia to identify opportunities for reducing overall design-through-analysis process time. The team created and implemented a rigorous analysis methodology based on a generic process flow model parameterized by information obtained from analysts. They also collected data from analysis department managers to quantify the problem type and complexity distribution throughout Sandia's analyst community. They then used this information to develop a community model, which enables a simple characterization of processes that span the analyst community. The results indicate that equal opportunity for reducing analysis process time is available both by reducing the "once-through" time required to complete a process step and by reducing the probability of backward iteration. In addition, reducing the rework fraction (i.e., improving the engineering efficiency of subsequent iterations) offers approximately 40% to 80% of the benefit of reducing the "once-through" time or iteration probability, depending upon the process step being considered. Further, the results indicate that geometry manipulation and meshing is the largest portion of an analyst's effort, especially for structural problems, and offers significant opportunity for overall time reduction. Iteration loops initiated late in the process are more costly than others because they increase "inner loop" iterations. Identifying and correcting problems as early as possible in the process offers significant opportunity for time savings.
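
    The once-through time, backward-iteration probability and rework fraction combine naturally into an expected step time. The toy model below is my own simplification for illustration, not the DART team's process flow model: repeat passes form a geometric series, each costing a rework fraction of the first pass.

      def expected_step_time(t_once, p_iter, rework_fraction):
          expected_repeats = p_iter / (1.0 - p_iter)           # mean number of repeat passes
          return t_once * (1.0 + rework_fraction * expected_repeats)

      # e.g. a 10-day meshing step, 30% chance of backward iteration, repeats costing 60%
      print(expected_step_time(10.0, 0.30, 0.60))              # about 12.6 days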

  5. Cuckoo malware analysis

    CERN Document Server

    Oktavianto, Digit

    2013-01-01

    This book is a step-by-step, practical tutorial for analyzing and detecting malware and performing digital investigations. This book features clear and concise guidance in an easily accessible format.Cuckoo Malware Analysis is great for anyone who wants to analyze malware through programming, networking, disassembling, forensics, and virtualization. Whether you are new to malware analysis or have some experience, this book will help you get started with Cuckoo Sandbox so you can start analysing malware effectively and efficiently.

  6. Hedging in Discourse Analysis

    Institute of Scientific and Technical Information of China (English)

    XIAO Xin

    2015-01-01

    In this article an attempt is made to enhance awareness of the use of hedging in discourse analysis and academic writing by analyzing the hedges employed in two comparable texts. The discourse analysis is conducted in terms of “content-oriented” hedges and “reader-oriented” hedges. The article suggests that hedging can dampen utterances and statements, weaken the force of what one says and show politeness to listeners or readers, and that its use varies across the discourse styles of different genres.

  7. Automated Sentiment Analysis

    Science.gov (United States)

    2009-06-01

    Sentiment Analysis? Deep philosophical questions could be raised about the nature of sentiment. It is not exactly an emotion – one can choose to...and syntactic analysis easier. It also forestalls misunderstanding; sentences likely to be misclassified (because of unusual style, sarcasm, etc...has no emotional significance. We focus on supervised learning for this prototype, though we can alter our program to perform unsupervised learning
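
    A minimal supervised-learning pipeline in the spirit described above, assuming scikit-learn is available; the tiny training set is invented and the report's actual prototype is not reproduced here.

      from sklearn.feature_extraction.text import CountVectorizer
      from sklearn.linear_model import LogisticRegression
      from sklearn.pipeline import make_pipeline

      texts = ["great result, very pleased", "terrible and disappointing",
               "works well, happy with it", "awful experience, would not recommend"]
      labels = ["pos", "neg", "pos", "neg"]

      model = make_pipeline(CountVectorizer(), LogisticRegression())
      model.fit(texts, labels)
      print(model.predict(["pleased with the result", "disappointing outcome"]))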

  8. Nonlinear functional analysis

    Directory of Open Access Journals (Sweden)

    W. L. Fouché

    1983-03-01

    Full Text Available In this article we discuss some aspects of nonlinear functional analysis. It includes reviews of Banach’s contraction theorem, Schauder’s fixed point theorem, globalising techniques and applications of homotopy theory to nonlinear functional analysis. The author emphasises that fundamentally new ideas are required in order to achieve a better understanding of phenomena which contain both nonlinear and definite infinite dimensional features.
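
    Banach's contraction theorem, the first result reviewed above, has a one-line numerical illustration: the map x -> cos(x) is a contraction on [0, 1] (its derivative is bounded by sin(1) < 1 there), so fixed-point iteration converges to the unique solution of x = cos(x).

      import math

      x = 0.5
      for _ in range(50):
          x = math.cos(x)                 # iterate the contraction
      print(x)                            # approximately 0.739085, the unique fixed point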

  9. Video analysis platform

    OpenAIRE

    FLORES, Pablo; Arias, Pablo; Lecumberry, Federico; Pardo, Álvaro

    2006-01-01

    In this article we present the Video Analysis Platform (VAP) which is an open source software framework for video analysis, processing and description. The main goals of VAP are: to provide a multiplatform system which allows the easy implementation of video algorithms, provide structures and algorithms for the segmentation of video data in its different levels of abstraction: shots, frames, objects, regions, etc, permit the generation and comparison of MPEG7-like descriptors, and develop tes...

  10. ESF BLAST DESIGN ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    E.F. fitch

    1995-03-13

    The purpose and objective of this design analysis are to develop controls considered necessary and sufficient to implement the requirements for the controlled drilling and blasting excavation of operations support alcoves and test support alcoves in the Exploratory Studies Facility (ESF). The conclusions reached in this analysis will flow down into a construction specification ensuring controlled drilling and blasting excavation will be performed within the bounds established here.

  11. Applied functional analysis

    CERN Document Server

    Oden, J Tinsley

    2010-01-01

    The textbook is designed to drive a crash course for beginning graduate students majoring in something besides mathematics, introducing mathematical foundations that lead to classical results in functional analysis. More specifically, Oden and Demkowicz want to prepare students to learn the variational theory of partial differential equations, distributions, and Sobolev spaces and numerical analysis with an emphasis on finite element methods. The 1996 first edition has been used in a rather intensive two-semester course. -Book News, June 2010

  12. PEST Analysis of Serbia

    OpenAIRE

    Ivan Stosic; Drasko Nikolic; Aleksandar Zdravkovic

    2012-01-01

    The main purpose of this paper is to examine the impact of the current Serbian macro-environment on businesses through the implementation of PEST analysis as a framework for assessing the general or macro environment in which companies operate. The authors argue that the elements in the presented PEST analysis indicate that the current macro-environment is characterized by the dominance of threats and weaknesses, with few opportunities and strengths. Consequently, there is a strong need for faste...

  13. Probabilistic Structural Analysis Program

    Science.gov (United States)

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

    NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and lifting methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.
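
    The central quantity NASA/NESSUS reports, a probability of failure, can be illustrated generically. The sketch below uses plain Monte Carlo on an invented strength-minus-stress limit state, not the advanced mean value or adaptive importance sampling algorithms named above.

      import numpy as np

      rng = np.random.default_rng(3)
      n = 1_000_000
      strength = rng.normal(500.0, 40.0, n)               # MPa, assumed distribution
      stress = rng.normal(350.0, 50.0, n)                 # MPa, assumed distribution
      p_failure = np.mean(strength - stress < 0.0)        # limit state g = strength - stress
      print(f"estimated probability of failure: {p_failure:.2e}")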

  14. Distributed analysis in ATLAS

    CERN Document Server

    Legger, Federica; The ATLAS collaboration

    2015-01-01

    The ATLAS experiment accumulated more than 140 PB of data during the first run of the Large Hadron Collider (LHC) at CERN. The analysis of such an amount of data for the distributed physics community is a challenging task. The Distributed Analysis (DA) system of the ATLAS experiment is an established and stable component of the ATLAS distributed computing operations. About half a million user jobs are daily running on DA resources, submitted by more than 1500 ATLAS physicists. The reliability of the DA system during the first run of the LHC and the following shutdown period has been high thanks to the continuous automatic validation of the distributed analysis sites and the user support provided by a dedicated team of expert shifters. During the LHC shutdown, the ATLAS computing model has undergone several changes to improve the analysis workflows, including the re-design of the production system, a new analysis data format and event model, and the development of common reduction and analysis frameworks. We r...

  15. A PROOF Analysis Framework

    CERN Document Server

    Gonzalez Caballero, Isidro

    2012-01-01

    The analysis of the complex LHC data usually follows a standard path that aims at minimizing not only the amount of data but also the number of observables used. After a number of steps of slimming and skimming the data, the remaining few terabytes of ROOT files hold a selection of the events and a flat structure for the variables needed that can be more easily inspected and traversed in the final stages of the analysis. PROOF arises at this point as an efficient mechanism to distribute the analysis load by taking advantage of all the cores in modern CPUs through PROOF Lite, or by using PROOF Cluster or PROOF on Demand tools to build dynamic PROOF cluster on computing facilities with spare CPUs. However using PROOF at the level required for a serious analysis introduces some difficulties that may scare new adopters. We have developed the PROOF Analysis Framework (PAF) to facilitate the development of new analysis by uniformly exposing the PROOF related configurations across technologies and by taking care of ...

  16. Morrey spaces in harmonic analysis

    OpenAIRE

    David R. Adams; Xiao, Jie

    2012-01-01

    Through a geometric capacitary analysis based on space dualities, this paper addresses several fundamental aspects of functional analysis and potential theory for the Morrey spaces in harmonic analysis over the Euclidean spaces.

  17. Morrey spaces in harmonic analysis

    Science.gov (United States)

    Adams, David R.; Xiao, Jie

    2012-10-01

    Through a geometric capacitary analysis based on space dualities, this paper addresses several fundamental aspects of functional analysis and potential theory for the Morrey spaces in harmonic analysis over the Euclidean spaces.

  18. INTERNAL ENVIRONMENT ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Caescu Stefan Claudiu

    2011-12-01

    Full Text Available Theme The situation analysis, as a separate component of the strategic planning, involves collecting and analysing relevant types of information on the components of the marketing environment and their evolution on the one hand and also on the organization’s resources and capabilities on the other. Objectives of the Research The main purpose of the study of the analysis techniques of the internal environment is to provide insight on those aspects that are of strategic importance to the organization. Literature Review The marketing environment consists of two distinct components, the internal environment that is made up of specific variables within the organization and the external environment that is made up of variables external to the organization. Although analysing the external environment is essential for corporate success, it is not enough unless it is backed by a detailed analysis of the internal environment of the organization. The internal environment includes all elements that are endogenous to the organization, which are influenced to a great extent and totally controlled by it. The study of the internal environment must answer all resource-related questions and solve all resource management issues, and it represents the first step in drawing up the marketing strategy. Research Methodology The present paper accomplished a documentary study of the main techniques used for the analysis of the internal environment. Results The special literature emphasizes that the differences in performance from one organization to another are primarily dependent not on the differences between the fields of activity, but especially on the differences between the resources and capabilities and the ways these are capitalized on. The main methods of analysing the internal environment addressed in this paper are: the analysis of the organizational resources, the performance analysis, the value chain analysis and the functional analysis. Implications Basically such

  19. Biosensors for Cell Analysis.

    Science.gov (United States)

    Zhou, Qing; Son, Kyungjin; Liu, Ying; Revzin, Alexander

    2015-01-01

    Biosensors first appeared several decades ago to address the need for monitoring physiological parameters such as oxygen or glucose in biological fluids such as blood. More recently, a new wave of biosensors has emerged in order to provide more nuanced and granular information about the composition and function of living cells. Such biosensors exist at the confluence of technology and medicine and often strive to connect cell phenotype or function to physiological or pathophysiological processes. Our review aims to describe some of the key technological aspects of biosensors being developed for cell analysis. The technological aspects covered in our review include biorecognition elements used for biosensor construction, methods for integrating cells with biosensors, approaches to single-cell analysis, and the use of nanostructured biosensors for cell analysis. Our hope is that the spectrum of possibilities for cell analysis described in this review may pique the interest of biomedical scientists and engineers and may spur new collaborations in the area of using biosensors for cell analysis.

  20. Pitch Analysis of Ukulele

    Directory of Open Access Journals (Sweden)

    Suphattharachai Chomphan

    2012-01-01

    Full Text Available Problem statement: The ukulele is a trendy instrument in the present day. It is a member of the guitar family of instruments which employs four nylon or gut strings or four courses of strings. However, a statistical analysis of the pitch of this instrument has not been conducted. The analysis of the pitch, or fundamental frequency, of its main chords should therefore be performed in an appropriate way. This study supports effective sound synthesis of the instrument, which is an important issue for the future. Approach: An efficient technique for the analysis of the fundamental frequency (F0) of human speech was applied to the analysis of the main chords of the ukulele. The autocorrelation-based technique was used with the signal waveform to extract the optimal period, or pitch, for the corresponding analyzed frame in the time domain. The corresponding fundamental frequency was then calculated in the frequency domain. Results: The 21 main chords were chosen for the study. The number of distinct fundamental frequency values per chord was found to vary from one to three, with values ranging from 65.42 Hz to 329.93 Hz. Conclusion: By using the analysis technique for the fundamental frequency of human speech, the output frequencies of all main chords can be extracted. It can be empirically seen that they have unique values with respect to each other.
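
    A minimal sketch of the autocorrelation-based F0 estimation described above is given below, applied to a synthetic tone rather than recorded ukulele chords; the frame length, lag search range and test frequency are assumptions made for the example.

```python
# Autocorrelation-based pitch (F0) estimation sketch for one analysis frame;
# the synthetic test signal and all parameters are illustrative assumptions.
import numpy as np

def estimate_f0(signal, fs, fmin=60.0, fmax=400.0):
    """Return an F0 estimate (Hz) for one frame via the autocorrelation peak."""
    signal = signal - np.mean(signal)
    ac = np.correlate(signal, signal, mode="full")[len(signal) - 1:]  # lags 0..N-1
    lo, hi = int(fs / fmax), int(fs / fmin)       # candidate lag range
    lag = lo + int(np.argmax(ac[lo:hi]))          # lag of the strongest peak
    return fs / lag

fs = 44100
t = np.arange(0, 0.1, 1 / fs)
tone = np.sin(2 * np.pi * 261.6 * t) + 0.5 * np.sin(2 * np.pi * 2 * 261.6 * t)
print(f"estimated F0: {estimate_f0(tone, fs):.1f} Hz")   # close to 261.6 Hz (C4)
```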

  1. Arterial waveform analysis.

    Science.gov (United States)

    Esper, Stephen A; Pinsky, Michael R

    2014-12-01

    The bedside measurement of continuous arterial pressure values from waveform analysis has been routinely available via indwelling arterial catheterization for >50 years. Invasive blood pressure monitoring has been utilized in critically ill patients, in both the operating room and critical care units, to facilitate rapid diagnoses of cardiovascular insufficiency and monitor response to treatments aimed at correcting abnormalities before the consequences of either hypo- or hypertension are seen. Minimally invasive techniques to estimate cardiac output (CO) have gained increased appeal. This has led to the increased interest in arterial waveform analysis to provide this important information, as it is measured continuously in many operating rooms and intensive care units. Arterial waveform analysis also allows for the calculation of many so-called derived parameters intrinsically created by this pulse pressure profile. These include estimates of left ventricular stroke volume (SV), CO, vascular resistance, and during positive-pressure breathing, SV variation, and pulse pressure variation. This article focuses on the principles of arterial waveform analysis and their determinants, components of the arterial system, and arterial pulse contour. It will also address the advantage of measuring real-time CO by the arterial waveform and the benefits to measuring SV variation. Arterial waveform analysis has gained a large interest in the overall assessment and management of the critically ill and those at a risk of hemodynamic deterioration.
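
    As an illustration of one of the derived parameters mentioned above, the sketch below computes pulse pressure variation from per-beat systolic and diastolic values over a single positive-pressure breath. The beat values are invented, and clinical systems work on the continuous waveform rather than on a handful of beats.

```python
# Pulse pressure variation (PPV) sketch; the per-beat pressures are hypothetical.
import numpy as np

def pulse_pressure_variation(systolic, diastolic):
    """PPV (%) = 100 * (PPmax - PPmin) / mean(PPmax, PPmin)."""
    pp = np.asarray(systolic) - np.asarray(diastolic)
    return 100.0 * (pp.max() - pp.min()) / ((pp.max() + pp.min()) / 2.0)

# Hypothetical beats within one positive-pressure breath (mmHg).
systolic  = [122, 118, 112, 109, 114, 120]
diastolic = [ 78,  76,  74,  73,  75,  77]
print(f"PPV ≈ {pulse_pressure_variation(systolic, diastolic):.1f} %")
```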

  2. Harmonic and geometric analysis

    CERN Document Server

    Citti, Giovanna; Pérez, Carlos; Sarti, Alessandro; Zhong, Xiao

    2015-01-01

    This book presents an expanded version of four series of lectures delivered by the authors at the CRM. Harmonic analysis, understood in a broad sense, has a very wide interplay with partial differential equations and in particular with the theory of quasiconformal mappings and its applications. Some areas in which real analysis has been extremely influential are PDE's and geometric analysis. Their foundations and subsequent developments made extensive use of the Calderón–Zygmund theory, especially the Lp inequalities for Calderón–Zygmund operators (Beurling transform and Riesz transform, among others) and the theory of Muckenhoupt weights.  The first chapter is an application of harmonic analysis and the Heisenberg group to understanding human vision, while the second and third chapters cover some of the main topics on linear and multilinear harmonic analysis. The last serves as a comprehensive introduction to a deep result from De Giorgi, Moser and Nash on the regularity of elliptic partial differen...

  3. Professionalizing Intelligence Analysis

    Directory of Open Access Journals (Sweden)

    James B. Bruce

    2015-09-01

    Full Text Available This article examines the current state of professionalism in national security intelligence analysis in the U.S. Government. Since the introduction of major intelligence reforms directed by the Intelligence Reform and Terrorism Prevention Act (IRTPA in December, 2004, we have seen notable strides in many aspects of intelligence professionalization, including in analysis. But progress is halting, uneven, and by no means permanent. To consolidate its gains, and if it is to continue improving, the U.S. intelligence community (IC should commit itself to accomplishing a new program of further professionalization of analysis to ensure that it will develop an analytic cadre that is fully prepared to deal with the complexities of an emerging multipolar and highly dynamic world that the IC itself is forecasting. Some recent reforms in intelligence analysis can be assessed against established standards of more fully developed professions; these may well fall short of moving the IC closer to the more fully professionalized analytical capability required for producing the kind of analysis needed now by the United States.

  4. Color Medical Image Analysis

    CERN Document Server

    Schaefer, Gerald

    2013-01-01

    Since the early 20th century, medical imaging has been dominated by monochrome imaging modalities such as x-ray, computed tomography, ultrasound, and magnetic resonance imaging. As a result, color information has been overlooked in medical image analysis applications. Recently, various medical imaging modalities that involve color information have been introduced. These include cervicography, dermoscopy, fundus photography, gastrointestinal endoscopy, microscopy, and wound photography. However, in comparison to monochrome images, the analysis of color images is a relatively unexplored area. The multivariate nature of color image data presents new challenges for researchers and practitioners as the numerous methods developed for monochrome images are often not directly applicable to multichannel images. The goal of this volume is to summarize the state-of-the-art in the utilization of color information in medical image analysis.

  5. Real analysis on intervals

    CERN Document Server

    Choudary, A D R

    2014-01-01

    The book targets undergraduate and postgraduate mathematics students and helps them develop a deep understanding of mathematical analysis. Designed as a first course in real analysis, it helps students learn how abstract mathematical analysis solves mathematical problems that relate to the real world. As well as providing a valuable source of inspiration for contemporary research in mathematics, the book helps students read, understand and construct mathematical proofs, develop their problem-solving abilities and comprehend the importance and frontiers of computer facilities and much more. It offers comprehensive material for both seminars and independent study for readers with a basic knowledge of calculus and linear algebra. The first nine chapters followed by the appendix on the Stieltjes integral are recommended for graduate students studying probability and statistics, while the first eight chapters followed by the appendix on dynamical systems will be of use to students of biology and environmental scie...

  6. Analysis of Waves

    DEFF Research Database (Denmark)

    Frigaard, Peter; Andersen, Thomas Lykke

    The present book describes the most important aspects of wave analysis techniques applied to physical model tests. Moreover, the book serves as technical documentation for the wave analysis software WaveLab 3, cf. Aalborg University (2012). In that respect it should be mentioned that supplementary...... to the present technical documentation exists also the online help document describing the WaveLab software in detail including all the inputs and output fields. In addition to the two main authors also Tue Hald, Jacob Helm-Petersen and Morten Møller Jakobsen have contributed to the note. Their input is highly...... acknowledged. The outline of the book is as follows: • Chapter 2 and 3 describes analysis of waves in time and frequency domain. • Chapter 4 and 5 describes the separation of incident and reflected waves for the two-dimensional case. • Chapter 6 describes the estimation of the directional spectra which also...

  7. Basic real analysis

    CERN Document Server

    Sohrab, Houshang H

    2014-01-01

    This expanded second edition presents the fundamentals and touchstone results of real analysis in full rigor, but in a style that requires little prior familiarity with proofs or mathematical language. The text is a comprehensive and largely self-contained introduction to the theory of real-valued functions of a real variable. The chapters on Lebesgue measure and integral have been rewritten entirely and greatly improved. They now contain Lebesgue’s differentiation theorem as well as his versions of the Fundamental Theorem(s) of Calculus. With expanded chapters, additional problems, and an expansive solutions manual, Basic Real Analysis, Second Edition, is ideal for senior undergraduates and first-year graduate students, both as a classroom text and a self-study guide. Reviews of first edition: The book is a clear and well-structured introduction to real analysis aimed at senior undergraduate and beginning graduate students. The prerequisites are few, but a certain mathematical sophistication is required. ....

  8. Mathematical analysis I

    CERN Document Server

    Zorich, Vladimir A

    2015-01-01

    VLADIMIR A. ZORICH is professor of mathematics at Moscow State University. His areas of specialization are analysis, conformal geometry, quasiconformal mappings, and mathematical aspects of thermodynamics. He solved the problem of global homeomorphism for space quasiconformal mappings. He holds a patent in the technology of mechanical engineering, and he is also known by his book Mathematical Analysis of Problems in the Natural Sciences . This second English edition of a very popular two-volume work presents a thorough first course in analysis, leading from real numbers to such advanced topics as differential forms on manifolds; asymptotic methods; Fourier, Laplace, and Legendre transforms; elliptic functions; and distributions. Especially notable in this course are the clearly expressed orientation toward the natural sciences and the informal exploration of the essence and the roots of the basic concepts and theorems of calculus. Clarity of exposition is matched by a wealth of instructive exercises, problems...

  9. Software safety hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lawrence, J.D. [Lawrence Livermore National Lab., CA (United States)

    1996-02-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper.

  10. Proximate analysis of coal

    Energy Technology Data Exchange (ETDEWEB)

    Donahue, C.J.; Rais, E.A. [University of Michigan, Dearborn, MI (USA)

    2009-02-15

    This lab experiment illustrates the use of thermogravimetric analysis (TGA) to perform proximate analysis on a series of coal samples of different rank. Peat and coke are also examined. A total of four exercises are described. These are dry exercises, as students interpret previously recorded scans. The weight percent moisture, volatile matter, fixed carbon, and ash content are determined for each sample and comparisons are made. Proximate analysis is performed on a coal sample from a local electric utility. From the weight percent sulfur found in the coal (determined by a separate procedure, the Eschka method) and the ash content, students calculate the quantity of sulfur dioxide emissions and ash produced annually by a large coal-fired electric power plant.
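
    The arithmetic the students carry out can be sketched as follows; the TGA weights, plant coal consumption and sulfur content below are hypothetical, not the article's data.

```python
# Proximate-analysis arithmetic from TGA weight losses, plus a rough annual
# SO2 and ash estimate for a coal-fired plant. All numbers are invented.
def proximate(w_initial, w_after_drying, w_after_devol, w_ash):
    """Return (moisture, volatile matter, fixed carbon, ash) as weight %."""
    moisture = 100.0 * (w_initial - w_after_drying) / w_initial
    volatile = 100.0 * (w_after_drying - w_after_devol) / w_initial
    ash      = 100.0 * w_ash / w_initial
    fixed_c  = 100.0 - moisture - volatile - ash
    return moisture, volatile, fixed_c, ash

# Hypothetical sample weights (mg) read off a TGA scan.
m, v, fc, a = proximate(w_initial=25.0, w_after_drying=24.1,
                        w_after_devol=15.6, w_ash=2.3)
print(f"moisture {m:.1f}%, volatiles {v:.1f}%, fixed C {fc:.1f}%, ash {a:.1f}%")

coal_per_year_tonnes = 3.0e6     # hypothetical plant consumption
sulfur_wt_percent = 1.8          # hypothetical Eschka result
# Assume all sulfur is emitted as SO2 (molar mass ratio 64.06/32.06).
so2_tonnes = coal_per_year_tonnes * sulfur_wt_percent / 100 * (64.06 / 32.06)
ash_tonnes = coal_per_year_tonnes * a / 100
print(f"SO2 ≈ {so2_tonnes:.2e} t/yr, ash ≈ {ash_tonnes:.2e} t/yr")
```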

  11. Digital Fourier analysis fundamentals

    CERN Document Server

    Kido, Ken'iti

    2015-01-01

    This textbook is a thorough, accessible introduction to digital Fourier analysis for undergraduate students in the sciences. Beginning with the principles of sine/cosine decomposition, the reader walks through the principles of discrete Fourier analysis before reaching the cornerstone of signal processing: the Fast Fourier Transform. Saturated with clear, coherent illustrations, "Digital Fourier Analysis - Fundamentals" includes practice problems and thorough Appendices for the advanced reader. As a special feature, the book includes interactive applets (available online) that mirror the illustrations.  These user-friendly applets animate concepts interactively, allowing the user to experiment with the underlying mathematics. For example, a real sine signal can be treated as a sum of clockwise and counter-clockwise rotating vectors. The applet illustration included with the book animates the rotating vectors and the resulting sine signal. By changing parameters such as amplitude and frequency, the reader ca...

  12. Principles of harmonic analysis

    CERN Document Server

    Deitmar, Anton

    2014-01-01

    This book offers a complete and streamlined treatment of the central principles of abelian harmonic analysis: Pontryagin duality, the Plancherel theorem and the Poisson summation formula, as well as their respective generalizations to non-abelian groups, including the Selberg trace formula. The principles are then applied to spectral analysis of Heisenberg manifolds and Riemann surfaces. This new edition contains a new chapter on p-adic and adelic groups, as well as a complementary section on direct and projective limits. Many of the supporting proofs have been revised and refined. The book is an excellent resource for graduate students who wish to learn and understand harmonic analysis and for researchers seeking to apply it.

  13. Real mathematical analysis

    CERN Document Server

    Pugh, Charles C

    2015-01-01

    Based on an honors course taught by the author at UC Berkeley, this introduction to undergraduate real analysis gives a different emphasis by stressing the importance of pictures and hard problems. Topics include: a natural construction of the real numbers, four-dimensional visualization, basic point-set topology, function spaces, multivariable calculus via differential forms (leading to a simple proof of the Brouwer Fixed Point Theorem), and a pictorial treatment of Lebesgue theory. Over 150 detailed illustrations elucidate abstract concepts and salient points in proofs. The exposition is informal and relaxed, with many helpful asides, examples, some jokes, and occasional comments from mathematicians, such as Littlewood, Dieudonné, and Osserman. This book thus succeeds in being more comprehensive, more comprehensible, and more enjoyable, than standard introductions to analysis. New to the second edition of Real Mathematical Analysis is a presentation of Lebesgue integration done almost entirely using the un...

  14. Power electronics reliability analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Mark A.; Atcitty, Stanley

    2009-12-01

    This report provides the DOE and industry with a general process for analyzing power electronics reliability. The analysis can help with understanding the main causes of failures, downtime, and cost and how to reduce them. One approach is to collect field maintenance data and use it directly to calculate reliability metrics related to each cause. Another approach is to model the functional structure of the equipment using a fault tree to derive system reliability from component reliability. Analysis of a fictitious device demonstrates the latter process. Optimization can use the resulting baseline model to decide how to improve reliability and/or lower costs. It is recommended that both electric utilities and equipment manufacturers make provisions to collect and share data in order to lay the groundwork for improving reliability into the future. Reliability analysis helps guide reliability improvements in hardware and software technology including condition monitoring and prognostics and health management.
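
    A minimal sketch of the fault-tree approach mentioned above, deriving system reliability from component failure probabilities, is shown below. The two-gate tree and the component probabilities are invented for illustration, and independent failures are assumed.

```python
# Tiny fault-tree evaluation sketch (hypothetical tree and probabilities).
def and_gate(*p_fail):
    """Gate fails only if all inputs fail (models redundancy)."""
    prob = 1.0
    for p in p_fail:
        prob *= p
    return prob

def or_gate(*p_fail):
    """Gate fails if any input fails, assuming independent failures."""
    prob_ok = 1.0
    for p in p_fail:
        prob_ok *= (1.0 - p)
    return 1.0 - prob_ok

# Hypothetical annual failure probabilities for power-electronics components.
p_igbt, p_capacitor, p_controller = 0.02, 0.05, 0.01
p_redundant_caps = and_gate(p_capacitor, p_capacitor)   # two capacitors in parallel
p_system_fail = or_gate(p_igbt, p_redundant_caps, p_controller)
print(f"system reliability ≈ {1.0 - p_system_fail:.4f}")
```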

  15. Pitfalls of exergy analysis

    CERN Document Server

    Vágner, Petr; Maršík, František

    2016-01-01

    The well-known Gouy-Stodola theorem states that a device produces maximum useful power when working reversibly, that is with no entropy production inside the device. This statement then leads to a method of thermodynamic optimization based on entropy production minimization. Exergy destruction (difference between exergy of fuel and exhausts) is also given by entropy production inside the device. Therefore, assessing efficiency of a device by exergy analysis is also based on the Gouy-Stodola theorem. However, assumptions that had led to the Gouy-Stodola theorem are not satisfied in several optimization scenarios, e.g. non-isothermal steady-state fuel cells, where both entropy production minimization and exergy analysis should be used with caution. We demonstrate, using non-equilibrium thermodynamics, a few cases where entropy production minimization and exergy analysis should not be applied.
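
    For reference, a standard textbook statement of the Gouy-Stodola relation the abstract builds on is reproduced below; the notation is conventional and not taken from the paper itself.

```latex
% Gouy-Stodola relation (standard form): the lost work, and equivalently the
% destroyed exergy, equal the ambient temperature times the entropy produced.
\begin{align}
  \dot{W}_{\mathrm{lost}} &= \dot{W}_{\mathrm{rev}} - \dot{W}
                           = T_0\,\dot{S}_{\mathrm{gen}}, \\
  \dot{E}x_{\mathrm{destroyed}} &= T_0\,\dot{S}_{\mathrm{gen}},
\end{align}
% where $T_0$ is the ambient (dead-state) temperature and
% $\dot{S}_{\mathrm{gen}}$ is the entropy production rate inside the device.
```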

  16. Foundations of VISAR analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Dolan, Daniel H.

    2006-06-01

    The Velocity Interferometer System for Any Reflector (VISAR) is a widely used diagnostic at Sandia National Laboratories. Although the operating principles of the VISAR are well established, recently deployed systems (such as the fast push-pull and air delay VISAR) require more careful consideration, and many common assumptions about VISAR are coming into question. This report presents a comprehensive review of VISAR analysis to address these issues. Detailed treatment of several interferometer configurations is given to identify important aspects of the operation and characterization of VISAR systems. The calculation of velocity from interferometer measurements is also described. The goal is to derive the standard VISAR analysis relationships, indicate when these relationships are valid, and provide alternative methods when the standard analysis fails.
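
    For orientation, the commonly quoted velocity-per-fringe relation underlying standard VISAR analysis is reproduced below; the symbols are conventional usage rather than quotations from the report, which examines when approximations of this kind are valid.

```latex
% Commonly quoted VISAR velocity-per-fringe relation (an approximation whose
% validity conditions are the subject of the report):
\begin{equation}
  v\!\left(t - \tfrac{\tau}{2}\right) \;\approx\;
  \frac{\lambda}{2\tau\,(1 + \delta)}\, F(t) \;=\; \mathrm{VPF}\cdot F(t),
\end{equation}
% where $\lambda$ is the laser wavelength, $\tau$ the interferometer delay,
% $\delta$ a correction for etalon dispersion, and $F(t)$ the fringe count.
```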

  17. The data analysis handbook

    CERN Document Server

    Frank, IE

    1994-01-01

    Analyzing observed or measured data is an important step in applied sciences. The recent increase in computer capacity has resulted in a revolution both in data collection and data analysis. An increasing number of scientists, researchers and students are venturing into statistical data analysis; hence the need for more guidance in this field, which was previously dominated mainly by statisticians. This handbook fills the gap in the range of textbooks on data analysis. Written in a dictionary format, it will serve as a comprehensive reference book in a rapidly growing field. However, this book is more structured than an ordinary dictionary, where each entry is a separate, self-contained entity. The authors provide not only definitions and short descriptions, but also offer an overview of the different topics. Therefore, the handbook can also be used as a companion to textbooks for undergraduate or graduate courses. 1700 entries are given in alphabetical order grouped into 20 topics and each topic is organized...

  18. Exercises in analysis

    CERN Document Server

    Gasiński, Leszek

    2016-01-01

    This second of two Exercises in Analysis volumes covers problems in five core topics of mathematical analysis: Function Spaces, Nonlinear and Multivalued Maps, Smooth and Nonsmooth Calculus, Degree Theory and Fixed Point Theory, and Variational and Topological Methods. Each of five topics corresponds to a different chapter with inclusion of the basic theory and accompanying main definitions and results, followed by suitable comments and remarks for better understanding of the material. Exercises/problems are presented for each topic, with solutions available at the end of each chapter. The entire collection of exercises offers a balanced and useful picture for the application surrounding each topic. This nearly encyclopedic coverage of exercises in mathematical analysis is the first of its kind and is accessible to a wide readership. Graduate students will find the collection of problems valuable in preparation for their preliminary or qualifying exams as well as for testing their deeper understanding of the ...

  19. Gabor Analysis for Imaging

    DEFF Research Database (Denmark)

    Christensen, Ole; Feichtinger, Hans G.; Paukner, Stephan

    2015-01-01

    , it characterizes a function by its transform over phase space, which is the time–frequency plane (TF-plane) in a musical context or the location–wave-number domain in the context of image processing. Since the transition from the signal domain to the phase space domain introduces an enormous amount of data...... of the generalities relevant for an understanding of Gabor analysis of functions on Rd. We pay special attention to the case d = 2, which is the most important case for image processing and image analysis applications. The chapter is organized as follows. Section 2 presents central tools from functional analysis......, the application of Gabor expansions to image representation is considered in Sect. 6....

  20. Badge Office Process Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Haurykiewicz, John Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Dinehart, Timothy Grant [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Parker, Robert Young [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-12

    The purpose of this process analysis was to analyze the Badge Offices’ current processes from a systems perspective and consider ways of pursuing objectives set forth by SEC-PS, namely increased customer flow (throughput) and reduced customer wait times. Information for the analysis was gathered for the project primarily through Badge Office Subject Matter Experts (SMEs), and in-person observation of prevailing processes. Using the information gathered, a process simulation model was constructed to represent current operations and allow assessment of potential process changes relative to factors mentioned previously. The overall purpose of the analysis was to provide SEC-PS management with information and recommendations to serve as a basis for additional focused study and areas for potential process improvements in the future.

  1. Applied multivariate statistical analysis

    CERN Document Server

    Härdle, Wolfgang Karl

    2015-01-01

    Focusing on high-dimensional applications, this 4th edition presents the tools and concepts used in multivariate data analysis in a style that is also accessible for non-mathematicians and practitioners.  It surveys the basic principles and emphasizes both exploratory and inferential statistics; a new chapter on Variable Selection (Lasso, SCAD and Elastic Net) has also been added.  All chapters include practical exercises that highlight applications in different multivariate data analysis fields: in quantitative financial studies, where the joint dynamics of assets are observed; in medicine, where recorded observations of subjects in different locations form the basis for reliable diagnoses and medication; and in quantitative marketing, where consumers’ preferences are collected in order to construct models of consumer behavior.  All of these examples involve high to ultra-high dimensions and represent a number of major fields in big data analysis. The fourth edition of this book on Applied Multivariate ...

  2. Fuzzy data analysis

    CERN Document Server

    Bandemer, Hans

    1992-01-01

    Fuzzy data such as marks, scores, verbal evaluations, imprecise observations, experts' opinions and grey tone pictures, are quite common. In Fuzzy Data Analysis the authors collect their recent results providing the reader with ideas, approaches and methods for processing such data when looking for sub-structures in knowledge bases for an evaluation of functional relationship, e.g. in order to specify diagnostic or control systems. The modelling presented uses ideas from fuzzy set theory and the suggested methods solve problems usually tackled by data analysis if the data are real numbers. Fuzzy Data Analysis is self-contained and is addressed to mathematicians oriented towards applications and to practitioners in any field of application who have some background in mathematics and statistics.

  3. Neutron signal transfer analysis

    CERN Document Server

    Pleinert, H; Lehmann, E

    1999-01-01

    A new method called neutron signal transfer analysis has been developed for quantitative determination of hydrogenous distributions from neutron radiographic measurements. The technique is based on a model which describes the detector signal obtained in the measurement as a result of the action of three different mechanisms expressed by signal transfer functions. The explicit forms of the signal transfer functions are determined by Monte Carlo computer simulations and contain only the distribution as a variable. Therefore an unknown distribution can be determined from the detector signal by recursive iteration. This technique provides a simple and efficient tool for analysis of this type while also taking into account complex effects due to the energy dependency of neutron interaction and single and multiple scattering. Therefore this method provides an efficient tool for precise quantitative analysis using neutron radiography, as for example quantitative determination of moisture distributions in porous buil...
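
    The recursive-iteration idea can be sketched as below, with an invented forward model (exponential attenuation plus a smoothed scatter term) standing in for the Monte Carlo-derived signal transfer functions; the step size, iteration count and the model itself are assumptions for illustration only.

```python
# Recovering a distribution from a detector signal by recursive iteration on
# the residual of a forward model. The forward model is hypothetical.
import numpy as np

def forward_model(d):
    """Toy detector model: attenuated primary beam plus blurred scatter term."""
    primary = np.exp(-2.0 * d)
    scatter = 0.1 * np.convolve(d, np.ones(5) / 5.0, mode="same")
    return primary + scatter

rng = np.random.default_rng(1)
d_true = 0.5 * rng.random(50)        # "unknown" hydrogen distribution
signal = forward_model(d_true)       # simulated detector signal

d = np.zeros_like(d_true)            # initial guess
for _ in range(200):                 # recursive iteration on the residual
    residual = forward_model(d) - signal
    # Damped Newton-like step using only the primary-beam derivative.
    d = np.clip(d + 0.5 * residual / (2.0 * np.exp(-2.0 * d)), 0.0, None)

print(f"max reconstruction error: {np.abs(d - d_true).max():.2e}")
```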

  4. Analysis of Autopilot Behavior

    Science.gov (United States)

    Sherry, Lance; Polson, Peter; Feay, Mike; Palmer, Everett; Null, Cynthia H. (Technical Monitor)

    1998-01-01

    Aviation and cognitive science researchers have identified situations in which the pilot's expectations for behavior of autopilot avionics are not matched by the actual behavior of the avionics. These "automation surprises" have been attributed to differences between the pilot's model of the behavior of the avionics and the actual behavior encoded in the avionics software. A formal technique is described for the analysis and measurement of the behavior of the cruise pitch modes of a modern Autopilot. The analysis characterizes the behavior of the Autopilot as situation-action rules. The behavior of the cruise pitch mode logic for a contemporary modern Autopilot was found to include 177 rules, including Level Change (23), Vertical Speed (16), Altitude Capture (50), and Altitude Hold (88). These rules are determined based on the values of 62 inputs. Analysis of the rule-based model also shed light on the factors cited in the literature as contributors to "automation surprises."
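
    A toy rendering of the situation-action representation is sketched below; the three rules and input names are invented and far simpler than the 177 rules reported for the real cruise pitch mode logic.

```python
# Situation-action rules as (predicate, action) pairs; everything here is a
# made-up example, not the analyzed Autopilot logic.
rules = [
    (lambda s: s["alt_error_ft"] < 250 and s["capture_armed"], "ALT CAPTURE"),
    (lambda s: s["mode"] == "LVL CHG" and s["speed_error_kt"] > 5, "PITCH FOR SPEED"),
    (lambda s: abs(s["alt_error_ft"]) < 50, "ALT HOLD"),
]

def select_action(situation):
    """Return the action of the first rule whose situation predicate holds."""
    for predicate, action in rules:
        if predicate(situation):
            return action
    return "NO CHANGE"

print(select_action({"alt_error_ft": 180, "capture_armed": True,
                     "mode": "VS", "speed_error_kt": 0}))   # -> ALT CAPTURE
```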

  5. Behavior analysis in Brasil

    Directory of Open Access Journals (Sweden)

    João Cláudio Todorov

    2006-01-01

    Full Text Available The history of behavior analysis in Brazil began with the visit of Fred S. Keller as a Fulbright Scholar to the University of São Paulo in 1961. Keller introduced Skinner's works to the Brazilian psychologists. His first assistant was Carolina Martuscelli Bori, then a social psychologist influenced by the work of Kurt Lewin. Initially guided by Keller, Carolina Bori was the major force in the diffusion of Behavior Analysis in Brazil, beginning with the psychology course of the University of Brasília, where the first course on Experimental Analysis of Behavior began in August of 1964. Most behavior analysts in Brazil today were students, directly or indirectly, of Carolina Bori. Several graduate programs throughout the country offer courses in behavior analysis.

  6. Mathematical analysis II

    CERN Document Server

    Canuto, Claudio

    2015-01-01

    The purpose of the volume is to provide a support textbook for a second lecture course on Mathematical Analysis. The contents are organised to suit, in particular, students of Engineering, Computer Science and Physics, all areas in which mathematical tools play a crucial role. The basic notions and methods concerning integral and differential calculus for multivariable functions, series of functions and ordinary differential equations are presented in a manner that elicits critical reading and prompts a hands-on approach to concrete applications. The pedagogical layout echoes the one used in the companion text Mathematical Analysis I. The book’s structure has a specifically-designed modular nature, which allows for great flexibility in the preparation of a lecture course on Mathematical Analysis. The style privileges clarity in the exposition and a linear progression through the theory. The material is organised on two levels. The first, reflected in this book, allows students to grasp the essential ideas, ...

  7. A Role for Language Analysis in Mathematics Textbook Analysis

    Science.gov (United States)

    O'Keeffe, Lisa; O'Donoghue, John

    2015-01-01

    In current textbook analysis research, there is a strong focus on the content, structure and expectation presented by the textbook as elements for analysis. This research moves beyond such foci and proposes a framework for textbook language analysis which is intended to be integrated into an overall framework for mathematics textbook analysis. The…

  8. Criticality Analysis of GTPPS

    Directory of Open Access Journals (Sweden)

    Asis Sarkar

    2011-10-01

    Full Text Available The paper is concerned with the criticality analysis of components of Gas Turbine Power Plant Systems (GTPPS) and the failures occurring in the plant. Failure mode, effects and criticality analysis (FMECA) is carried out to estimate the criticality number for the different components and failure modes. In addition, the failure effects, higher-level effects and end effects are incorporated in the final FMECA sheet. The criticality results and compensating provisions will highlight possible ways to tackle the failures economically. The findings in this paper are (1) the criticality index of the components, (2) the critical failures and (3) the compensating provisions for critical failures.
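
    The criticality-number arithmetic of a MIL-STD-1629A style FMECA, which this kind of study typically follows, can be sketched as below; the failure modes, rates and ratios are invented, and the paper's actual data and conventions may differ.

```python
# FMECA criticality-number sketch: Cm = beta * alpha * lambda_p * t for each
# failure mode (a common convention); component criticality is the sum over
# its modes. All gas-turbine data below are hypothetical.
failure_modes = [
    # (component, mode, beta, alpha, failures per 1e6 h, operating hours)
    ("fuel pump", "seal leak",       0.5, 0.30, 12.0, 8000),
    ("fuel pump", "bearing seizure", 1.0, 0.10, 12.0, 8000),
    ("combustor", "liner crack",     0.7, 0.60,  4.0, 8000),
]

criticality = {}
for component, mode, beta, alpha, lam, hours in failure_modes:
    cm = beta * alpha * (lam * 1e-6) * hours
    criticality[component] = criticality.get(component, 0.0) + cm
    print(f"{component:10s} {mode:16s} Cm = {cm:.4f}")

for component, cr in sorted(criticality.items(), key=lambda kv: -kv[1]):
    print(f"criticality index of {component}: {cr:.4f}")
```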

  9. Sensitivity analysis of SPURR

    Energy Technology Data Exchange (ETDEWEB)

    Witholder, R.E.

    1980-04-01

    The Solar Energy Research Institute has conducted a limited sensitivity analysis on a System for Projecting the Utilization of Renewable Resources (SPURR). The study utilized the Domestic Policy Review scenario for SPURR agricultural and industrial process heat and utility market sectors. This sensitivity analysis determines whether variations in solar system capital cost, operation and maintenance cost, and fuel cost (biomass only) correlate with intuitive expectations. The results of this effort contribute to a much larger issue: validation of SPURR. Such a study has practical applications for engineering improvements in solar technologies and is useful as a planning tool in the R and D allocation process.

  10. Environmental analysis support

    Energy Technology Data Exchange (ETDEWEB)

    Miller, R.L.

    1996-06-01

    Activities in environmental analysis support included assistance to the Morgantown and Pittsburgh Energy Technology Centers (METC and PETC) in reviewing and preparing documents required by the National Environmental Policy Act (NEPA) for projects selected for the Clean Coal Technology (CCT) Program. An important activity was the preparation for METC of a final Environmental Assessment (EA) for the proposed Externally Fired Combined Cycle (EFCC) Project in Warren, Pennsylvania. In addition, a post-project environmental analysis was prepared for PETC to evaluate the Demonstration of Advanced Combustion Techniques for a Tangentially-Fired Boiler in Lynn Haven, Florida.

  11. Stochastic decision analysis

    Science.gov (United States)

    Lacksonen, Thomas A.

    1994-01-01

    Small space flight project design at NASA Langley Research Center goes through a multi-phase process from preliminary analysis to flight operations. The process ensures that each system achieves its technical objectives with demonstrated quality and within planned budgets and schedules. A key technical component of early phases is decision analysis, which is a structured procedure for determining the best of a number of feasible concepts based upon project objectives. Feasible system concepts are generated by the designers and analyzed for schedule, cost, risk, and technical measures. Each performance measure value is normalized between the best and worst values and a weighted average score of all measures is calculated for each concept. The concept(s) with the highest scores are retained, while others are eliminated from further analysis. This project automated and enhanced the decision analysis process. Automation of the decision analysis process was done by creating a user-friendly, menu-driven, spreadsheet-macro-based decision analysis software program. The program contains data entry dialog boxes, automated data and output report generation, and automated output chart generation. The enhancements to the decision analysis process permit stochastic data entry and analysis. Rather than enter single measure values, the designers enter the range and most likely value for each measure and concept. The data can be entered at the system or subsystem level. System-level data can be calculated as either sum, maximum, or product functions of the subsystem data. For each concept, the probability distributions are approximated for each measure and the total score for each concept as either constant, triangular, normal, or log-normal distributions. Based on these distributions, formulas are derived for the probability that the concept meets any given constraint, the probability that the concept meets all constraints, and the probability that the concept is within a given
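
    The stochastic scoring step described above can be sketched as follows; the concepts, measures, weights, bounds and the budget constraint are hypothetical, and only the triangular-distribution case is shown.

```python
# Stochastic weighted scoring sketch: each measure is given as
# (min, most likely, max), sampled from a triangular distribution, normalized
# between worst and best values, and combined into a weighted score. All data
# are invented for illustration.
import numpy as np

rng = np.random.default_rng(2)
weights = {"cost_k$": 0.4, "mass_kg": 0.3, "risk": 0.3}
bounds  = {"cost_k$": (900, 400), "mass_kg": (120, 60), "risk": (10, 1)}  # (worst, best)

concepts = {
    "A": {"cost_k$": (500, 600, 750), "mass_kg": (70, 80, 95), "risk": (2, 4, 7)},
    "B": {"cost_k$": (450, 520, 800), "mass_kg": (65, 90, 110), "risk": (3, 5, 9)},
}

n = 100_000
for name, measures in concepts.items():
    score = np.zeros(n)
    cost = None
    for m, (lo, mode, hi) in measures.items():
        samples = rng.triangular(lo, mode, hi, size=n)
        worst, best = bounds[m]
        score += weights[m] * (samples - worst) / (best - worst)  # 0 = worst, 1 = best
        if m == "cost_k$":
            cost = samples
    p_within_budget = np.mean(cost <= 700)   # hypothetical cost constraint
    print(f"concept {name}: mean score {score.mean():.3f}, "
          f"P(cost <= 700 k$) = {p_within_budget:.2f}")
```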

  12. Transient Error Data Analysis.

    Science.gov (United States)

    1979-05-01

    [No abstract survives in this record; the scraped text contains only fragments of the report's table of contents (Graphical Data Analysis; General Statistics and Confidence Intervals; Goodness of Fit Test; Conclusions), of a table of mean time to failure per system (CMUA PDP-10 ECL parity; Cm* LSI-11 NMOS diagnostics), and of a data log spanning roughly 1542 hours and 18445 entries from February 1979.]

  13. Fourier Analysis on Groups

    CERN Document Server

    Rudin, Walter

    2011-01-01

    In the late 1950s, many of the more refined aspects of Fourier analysis were transferred from their original settings (the unit circle, the integers, the real line) to arbitrary locally compact abelian (LCA) groups. Rudin's book, published in 1962, was the first to give a systematic account of these developments and has come to be regarded as a classic in the field. The basic facts concerning Fourier analysis and the structure of LCA groups are proved in the opening chapters, in order to make the treatment relatively self-contained.

  14. Introduction to real analysis

    CERN Document Server

    Schramm, Michael J

    2008-01-01

    This text forms a bridge between courses in calculus and real analysis. It focuses on the construction of mathematical proofs as well as their final content. Suitable for upper-level undergraduates and graduate students of real analysis, it also provides a vital reference book for advanced courses in mathematics.The four-part treatment begins with an introduction to basic logical structures and techniques of proof, including discussions of the cardinality concept and the algebraic and order structures of the real and rational number systems. Part Two presents in-depth examinations of the compl

  15. Pathway analysis of IMC

    DEFF Research Database (Denmark)

    Skrypnyuk, Nataliya; Nielson, Flemming; Pilegaard, Henrik

    2009-01-01

    We present the ongoing work on the pathway analysis of a stochastic calculus. Firstly we present a particular stochastic calculus that we have chosen for our modeling - the Interactive Markov Chains calculus, IMC for short. After that we specify a few restrictions that we have introduced...

  16. Towards Cognitive Component Analysis

    DEFF Research Database (Denmark)

    Hansen, Lars Kai; Ahrendt, Peter; Larsen, Jan

    2005-01-01

    Cognitive component analysis (COCA) is here defined as the process of unsupervised grouping of data such that the ensuing group structure is well-aligned with that resulting from human cognitive activity. We have earlier demonstrated that independent components analysis is relevant for representing semantics, not only in text, but also in dynamic text (chat), images, and combinations of text and images. Here we further expand on the relevance of the ICA model for representing context, including two new analyses of abstract data: social networks and musical features....

  17. Foundations of stochastic analysis

    CERN Document Server

    Rao, M M; Lukacs, E

    1981-01-01

    Foundations of Stochastic Analysis deals with the foundations of the theory of Kolmogorov and Bochner and its impact on the growth of stochastic analysis. Topics covered range from conditional expectations and probabilities to projective and direct limits, as well as martingales and likelihood ratios. Abstract martingales and their applications are also discussed. Comprised of five chapters, this volume begins with an overview of the basic Kolmogorov-Bochner theorem, followed by a discussion on conditional expectations and probabilities containing several characterizations of operators and mea

  18. Analysis of Design Documentation

    DEFF Research Database (Denmark)

    Hansen, Claus Thorp

    1998-01-01

    In design practice a process where a satisfactory solution is created within limited resources is required. However, since the design process is not well understood, research into how engineering designers actually solve design problems is needed. As a contribution to that end a research project...... has been established where we seek to identify useful design work patterns by retrospective analyses of documentation created during design projects. This paper describes the analysis method, a tentatively defined metric to evaluate identified work patterns, and presents results from the first...... analysis accomplished....

  19. Data analysis using SAS

    CERN Document Server

    Peng, Chao-Ying Joanne

    2008-01-01

    "Peng provides an excellent overview of data analysis using the powerful statistical software package SAS. This book is quite appropriate as a self-placed tutorial for researchers, as well as a textbook or supplemental workbook for data analysis courses such as statistics or research methods. Peng provides detailed coverage of SAS capabilities using step-by-step procedures and includes numerous comprehensive graphics and figures, as well as SAS printouts. Readers do not need a background in computer science or programming. Includes numerous examples in education, health sciences, and business.

  20. High-dimensional data analysis

    CERN Document Server

    Cai, Tony

    2010-01-01

    Over the last few years, significant developments have been taking place in high-dimensional data analysis, driven primarily by a wide range of applications in many fields such as genomics and signal processing. In particular, substantial advances have been made in the areas of feature selection, covariance estimation, classification and regression. This book intends to examine important issues arising from high-dimensional data analysis to explore key ideas for statistical inference and prediction. It is structured around topics on multiple hypothesis testing, feature selection, regression, cla

  1. Elements of real analysis

    CERN Document Server

    Sprecher, David A

    2010-01-01

    This classic text in introductory analysis delineates and explores the intermediate steps between the basics of calculus and the ultimate stage of mathematics: abstraction and generalization.Since many abstractions and generalizations originate with the real line, the author has made it the unifying theme of the text, constructing the real number system from the point of view of a Cauchy sequence (a step which Dr. Sprecher feels is essential to learn what the real number system is).The material covered in Elements of Real Analysis should be accessible to those who have completed a course in

  2. Concise vector analysis

    CERN Document Server

    Eliezer, C J; Maxwell, E A; Sneddon, I N

    1963-01-01

    Concise Vector Analysis is a five-chapter introductory account of the methods and techniques of vector analysis. These methods are indispensable tools in mathematics, physics, and engineering. The book is based on lectures given by the author in the University of Ceylon.The first two chapters deal with vector algebra. These chapters particularly present the addition, representation, and resolution of vectors. The next two chapters examine the various aspects and specificities of vector calculus. The last chapter looks into some standard applications of vector algebra and calculus.This book wil

  3. Introduction to complex analysis

    CERN Document Server

    Priestley, H A

    2003-01-01

    Complex analysis is a classic and central area of mathematics, which is studied and exploited in a range of important fields, from number theory to engineering. Introduction to Complex Analysis was first published in 1985, and for this much awaited second edition the text has been considerably expanded, while retaining the style of the original. More detailed presentation is given of elementary topics, to reflect the knowledge base of current students. Exercise sets have been substantially revised and enlarged, with carefully graded exercises at the end of each chapter. This is the latest additi

  4. Analysis in Euclidean space

    CERN Document Server

    Hoffman, Kenneth

    2007-01-01

    Developed for an introductory course in mathematical analysis at MIT, this text focuses on concepts, principles, and methods. Its introductions to real and complex analysis are closely formulated, and they constitute a natural introduction to complex function theory.Starting with an overview of the real number system, the text presents results for subsets and functions related to Euclidean space of n dimensions. It offers a rigorous review of the fundamentals of calculus, emphasizing power series expansions and introducing the theory of complex-analytic functions. Subsequent chapters cover seq

  5. Signal flow analysis

    CERN Document Server

    Abrahams, J R; Hiller, N

    1965-01-01

    Signal Flow Analysis provides information pertinent to the fundamental aspects of signal flow analysis. This book discusses the basic theory of signal flow graphs and shows their relation to the usual algebraic equations.Organized into seven chapters, this book begins with an overview of properties of a flow graph. This text then demonstrates how flow graphs can be applied to a wide range of electrical circuits that do not involve amplification. Other chapters deal with the parameters as well as circuit applications of transistors. This book discusses as well the variety of circuits using ther

  6. Similar component analysis

    Institute of Scientific and Technical Information of China (English)

    ZHANG Hong; WANG Xin; LI Junwei; CAO Xianguang

    2006-01-01

    A new unsupervised feature extraction method called similar component analysis (SCA) is proposed in this paper. The SCA method has a self-aggregation property: theoretically, the data objects will move towards each other through SCA to form clusters, which can reveal the inherent pattern of similarity hidden in the dataset. The inputs of SCA are just the pairwise similarities of the dataset, which makes it easier to use for time series analysis despite the variable length of the time series. Our experimental results on many problems have verified the effectiveness of SCA on some engineering applications.

  7. Gait Analysis Laboratory

    Science.gov (United States)

    1976-01-01

    Complete motion analysis laboratory has evolved out of analyzing walking patterns of crippled children at Stanford Children's Hospital. Data is collected by placing tiny electrical sensors over muscle groups of child's legs and inserting step-sensing switches in soles of shoes. Miniature radio transmitters send signals to receiver for continuous recording of abnormal walking pattern. Engineers are working to apply space electronics miniaturization techniques to reduce size and weight of telemetry system further as well as striving to increase signal bandwidth so analysis can be performed faster and more accurately using a mini-computer.

  8. Quantitative Hydrocarbon Surface Analysis

    Science.gov (United States)

    Douglas, Vonnie M.

    2000-01-01

    The elimination of ozone depleting substances, such as carbon tetrachloride, has resulted in the use of new analytical techniques for cleanliness verification and contamination sampling. The last remaining application at Rocketdyne which required a replacement technique was the quantitative analysis of hydrocarbons by infrared spectrometry. This application, which previously utilized carbon tetrachloride, was successfully modified using the SOC-400, a compact portable FTIR manufactured by Surface Optics Corporation. This instrument can quantitatively measure and identify hydrocarbons from solvent flush of hardware as well as directly analyze the surface of metallic components without the use of ozone depleting chemicals. Several sampling accessories are utilized to perform analysis for various applications.

  9. Critical Analysis of Multimodal Discourse

    DEFF Research Database (Denmark)

    van Leeuwen, Theo

    2013-01-01

    This is an encyclopaedia article which defines the fields of critical discourse analysis and multimodality studies, argues that within critical discourse analysis more attention should be paid to multimodality, and within multimodality to critical analysis, and ends reviewing a few examples...... of recent work in the critical analysis of multimodal discourse....

  10. Linear Regression Analysis

    CERN Document Server

    Seber, George A F

    2012-01-01

    Concise, mathematically clear, and comprehensive treatment of the subject.* Expanded coverage of diagnostics and methods of model fitting.* Requires no specialized knowledge beyond a good grasp of matrix algebra and some acquaintance with straight-line regression and simple analysis of variance models.* More than 200 problems throughout the book plus outline solutions for the exercises.* This revision has been extensively class-tested.

  11. Thermal Analysis of Plastics

    Science.gov (United States)

    D'Amico, Teresa; Donahue, Craig J.; Rais, Elizabeth A.

    2008-01-01

    This lab experiment illustrates the use of differential scanning calorimetry (DSC) and thermal gravimetric analysis (TGA) in the measurement of polymer properties. A total of seven exercises are described. These are dry exercises: students interpret previously recorded scans. They do not perform the experiments. DSC was used to determine the…

  12. Learning: An Evolutionary Analysis

    Science.gov (United States)

    Swann, Joanna

    2009-01-01

    This paper draws on the philosophy of Karl Popper to present a descriptive evolutionary epistemology that offers philosophical solutions to the following related problems: "What happens when learning takes place?" and "What happens in human learning?" It provides a detailed analysis of how learning takes place without any direct transfer of…

  13. Retrospective landscape analysis

    DEFF Research Database (Denmark)

    Fritzbøger, Bo

    2011-01-01

    On the basis of maps from the 18th and 19th centuries, a retrospective analysis was carried out of documentary settlement and landscape data extending back to the Middle Ages with the intention of identifying and dating general structural and dynamic features of the cultural landscape in a selected...

  14. Multiphasic analysis of growth.

    NARCIS (Netherlands)

    Koops, W.J.

    1989-01-01

    The central theme of this thesis is the mathematical analysis of growth in animals, based on the theory of multiphasic growth. Growth in biological terms is related to increase in size and shape. This increase is determined by internal (genetical) and external (environmental) factors. Well known mat

  15. Morphological image analysis

    NARCIS (Netherlands)

    Michielsen, K.; Raedt, H. De; Kawakatsu, T.

    2000-01-01

    We describe a morphological image analysis method to characterize images in terms of geometry and topology. We present a method to compute the morphological properties of the objects building up the image and apply the method to triply periodic minimal surfaces and to images taken from polymer chemi
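
    A generic sketch of simple morphological measures for a binary image (area, a pixel-edge perimeter estimate, and the Euler characteristic computed as objects minus holes) is given below. It is not the specific machinery of these papers, and the test image is synthetic.

```python
# Simple geometric and topological measures of a binary image; the test image
# (a solid square plus a ring) is invented for illustration.
import numpy as np
from scipy import ndimage

img = np.zeros((40, 40), dtype=bool)
img[5:15, 5:15] = True                 # solid square: 1 object, 0 holes
img[22:36, 22:36] = True               # ring: 1 object, 1 hole
img[26:32, 26:32] = False

area = img.sum()

# Perimeter estimate: count exposed pixel edges along both axes, plus image borders.
perimeter = (np.count_nonzero(img[:, 1:] != img[:, :-1])
             + np.count_nonzero(img[1:, :] != img[:-1, :])
             + np.count_nonzero(img[0, :]) + np.count_nonzero(img[-1, :])
             + np.count_nonzero(img[:, 0]) + np.count_nonzero(img[:, -1]))

_, n_objects = ndimage.label(img)
# Holes = background components that do not touch the image border.
bg_labels, n_bg = ndimage.label(~img)
border = np.unique(np.r_[bg_labels[0], bg_labels[-1], bg_labels[:, 0], bg_labels[:, -1]])
n_holes = n_bg - len(set(border) - {0})

print(f"area={area}, perimeter≈{perimeter}, Euler characteristic={n_objects - n_holes}")
```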

  16. Morphological image analysis

    NARCIS (Netherlands)

    Michielsen, K; De Raedt, H; Kawakatsu, T; Landau, DP; Lewis, SP; Schuttler, HB

    2001-01-01

    We describe a morphological image analysis method to characterize images in terms of geometry and topology. We present a method to compute the morphological properties of the objects building up the image and apply the method to triply periodic minimal surfaces and to images taken from polymer chemi

  17. Introductory complex analysis

    CERN Document Server

    Silverman, Richard A

    1984-01-01

    A shorter version of A. I. Markushevich's masterly three-volume Theory of Functions of a Complex Variable, this edition is appropriate for advanced undergraduate and graduate courses in complex analysis. Numerous worked-out examples and more than 300 problems, some with hints and answers, make it suitable for independent study. 1967 edition.

  18. Microfluidic single sperm analysis

    NARCIS (Netherlands)

    Wagenaar, de Bjorn

    2016-01-01

    Microfluidic technology has been occasionally used for in vitro analysis and separation of cells. The small dimensions of microfluidic chips are very suitable to study cells on the single cell level rather than in whole populations. Also sperm cells have been studied and manipulated using microfluid

  19. Monotowns: A Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Shastitko Andrei

    2016-06-01

    Full Text Available The authors propose an empirical analysis of the current situation in monotowns. The study questions the perceived seriousness of the ‘monotown problem’ as well as the actual challenges it presents. The authors use a cluster analysis to divide monotowns into groups for further structural comparison. The structural differences in the available databases limit the possibilities of empirical analysis. Hence, alternative approaches are required. The authors consider possible reasons for the limitations identified. Special attention is paid to the monotowns that were granted the status of advanced development territories. A comparative analysis makes it possible to study their general characteristics and socioeconomic indicators. The authors apply the theory of opportunistic behaviour to describe potential problems caused by the lack of unified criteria for granting monotowns the status of advanced development territories. The article identifies the main stakeholders and the character of their interaction; it describes a conceptual model built on the principal/agent interactions, and identifies the parametric space of mutually beneficial cooperation. The solution to the principal/agent problem suggested in the article contributes to the development of an alternative approach to the current situation and a rational approach to overcoming the ‘monotown problem’.

  20. Flipping the Analysis Classroom

    Science.gov (United States)

    Shannon, Christine Ann

    2016-01-01

    Advances in learning theory call us to examine ways to get students more actively engaged both inside and outside of the classroom. This report offers suggestions for encouraging and increasing student reading, writing, and collaborative development in a real analysis course.

  1. Generic flux coupling analysis

    NARCIS (Netherlands)

    Reimers, A.C.; Goldstein, Y.; Bockmayr, A.

    2015-01-01

    Flux coupling analysis (FCA) has become a useful tool for aiding metabolic reconstructions and guiding genetic manipulations. Originally, it was introduced for constraint-based models of metabolic networks that are based on the steady-state assumption. Recently, we have shown that the steady-state a
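
    Flux coupling analysis is commonly formulated as a series of linear programs over the steady-state flux cone. The sketch below is a simplified illustration under assumed conditions (all reactions irreversible, a made-up toy network), not the generic method of the cited paper: it fixes the flux of a reference reaction and computes the attainable range of another reaction's flux, which indicates whether the two are coupled.

```python
# Illustrative sketch, not the generic FCA of the cited paper: an LP test for
# flux coupling in a made-up toy network, assuming all reactions irreversible.
# Requires numpy and scipy.
import numpy as np
from scipy.optimize import linprog

def flux_range(S, i, j, v_max=1000.0):
    """Min and max of v_i subject to S v = 0, 0 <= v <= v_max, v_j = 1."""
    n_reactions = S.shape[1]
    bounds = [(0.0, v_max)] * n_reactions
    bounds[j] = (1.0, 1.0)                      # fix the reference flux
    values = []
    for sign in (+1.0, -1.0):                   # minimise, then maximise v_i
        c = np.zeros(n_reactions)
        c[i] = sign
        res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]),
                      bounds=bounds, method="highs")
        values.append(sign * res.fun if res.success else None)
    return tuple(values)

# Toy linear pathway  -> A -> B ->  (three irreversible reactions in series).
S = np.array([[1, -1,  0],    # metabolite A
              [0,  1, -1]])   # metabolite B
print(flux_range(S, i=0, j=2))  # (1.0, 1.0): uptake fully coupled to the exit flux
```

    If the returned interval excludes zero, a nonzero flux through the reference reaction forces a nonzero flux through the tested reaction; a degenerate interval such as the one above indicates full coupling.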

  2. Python data analysis

    CERN Document Server

    Idris, Ivan

    2014-01-01

    This book is for programmers, scientists, and engineers who have knowledge of the Python language and know the basics of data science. It is for those who wish to learn different data analysis methods using Python and its libraries. This book contains all the basic ingredients you need to become an expert data analyst.

  3. Statistical Analysis Plan

    DEFF Research Database (Denmark)

    Ris Hansen, Inge; Søgaard, Karen; Gram, Bibi;

    2015-01-01

    This is the analysis plan for the multicentre randomised control study looking at the effect of training and exercises in chronic neck pain patients that is being conducted in Jutland and Funen, Denmark. This plan will be used as a work description for the analyses of the data collected....

  4. Shifted Independent Component Analysis

    DEFF Research Database (Denmark)

    Mørup, Morten; Madsen, Kristoffer Hougaard; Hansen, Lars Kai

    2007-01-01

    Delayed mixing is a problem of theoretical interest and practical importance, e.g., in speech processing, bio-medical signal analysis and financial data modelling. Most previous analyses have been based on models with integer shifts, i.e., shifts by a number of samples, and have often been carried...

  5. Writing proofs in analysis

    CERN Document Server

    Kane, Jonathan M

    2016-01-01

    This is a textbook on proof writing in the area of analysis, balancing a survey of the core concepts of mathematical proof with a tight, rigorous examination of the specific tools needed for an understanding of analysis. Instead of the standard "transition" approach to teaching proofs, wherein students are taught fundamentals of logic, given some common proof strategies such as mathematical induction, and presented with a series of well-written proofs to mimic, this textbook teaches what a student needs to be thinking about when trying to construct a proof. Covering the fundamentals of analysis sufficient for a typical beginning Real Analysis course, it never loses sight of the fact that its primary focus is about proof writing skills. This book aims to give the student precise training in the writing of proofs by explaining exactly what elements make up a correct proof, how one goes about constructing an acceptable proof, and, by learning to recognize a correct proof, how to avoid writing incorrect proofs. T...

  6. Flow Injection Analysis

    DEFF Research Database (Denmark)

    Hansen, Elo Harald

    2004-01-01

    This chapter provides an introduction to automated chemical analysis, which essentially can be divided into two groups: batch assays, where the solution is stationary while the container is moved through a number of stations where various unit operations performed; and continuous-flow procedures,...

  7. Vibration Sensitive Keystroke Analysis

    NARCIS (Netherlands)

    Lopatka, M.; Peetz, M.-H.; van Erp, M.; Stehouwer, H.; van Zaanen, M.

    2009-01-01

    We present a novel method for performing non-invasive biometric analysis on habitual keystroke patterns using a vibration-based feature space. With the increasing availability of 3-D accelerometer chips in laptop computers, conventional methods using time vectors may be augmented using a distinct fe

  8. Real Analysis in Reverse

    CERN Document Server

    Propp, James

    2012-01-01

    Many of the theorems of real analysis, against the background of the ordered field axioms, are equivalent to Dedekind completeness, and hence can serve as completeness axioms for the reals. In the course of demonstrating this, the article offers a tour of some less-familiar ordered fields, provides some of the relevant history, and considers pedagogical implications.

  9. Strategies of Data Analysis.

    Science.gov (United States)

    1987-06-01

    outliers and to determine clusters. More recently, other types of graphical representations, such as correspondence analysis (Benzécri and Benzécri, 1980) ... illustrated with the aid of examples. Bibliography: Benzécri, J.P. and Benzécri, F. (1980). L'Analyse des Correspondances: Exposé Élémentaire, Dunod, Paris.

  10. CMS Analysis School Model

    Energy Technology Data Exchange (ETDEWEB)

    Malik, S. [Nebraska U.; Shipsey, I. [Purdue U.; Cavanaugh, R. [Illinois U., Chicago; Bloom, K. [Nebraska U.; Chan, Kai-Feng [Taiwan, Natl. Taiwan U.; D' Hondt, J. [Vrije U., Brussels; Klima, B. [Fermilab; Narain, M. [Brown U.; Palla, F. [INFN, Pisa; Rolandi, G. [CERN; Schörner-Sadenius, T. [DESY

    2014-01-01

    To impart hands-on training in physics analysis, the CMS experiment initiated the concept of the CMS Data Analysis School (CMSDAS). It was born over three years ago at the LPC (LHC Physics Centre) at Fermilab and is based on earlier workshops held at the LPC and the CLEO experiment. As CMS transitioned from construction to data-taking mode, the nature of the earlier training also evolved to include more analysis tools, software tutorials and physics analysis. This effort, epitomized by CMSDAS, has proven to be key for new and young physicists to jump-start and contribute to the physics goals of CMS by looking for new physics with the collision data. With over 400 physicists trained in six CMSDAS events around the globe, CMS is trying to engage the collaboration in its discovery potential and maximize physics output. As a bigger goal, CMS is striving to nurture and increase the engagement of its myriad talents in the development of physics, service, upgrades, the education of those new to CMS and the career development of younger members. An extension of the concept to dedicated software and hardware schools is also planned, keeping in mind the ensuing upgrade phase.

  11. A Domain Analysis Bibliography

    Science.gov (United States)

    1990-06-01

    Bauhaus, a prototype CASE workstation for D-SAPS development. [ARAN88A] Guillermo F. Arango. Domain Engineering for Software Reuse. PhD thesis ... [VITA90B] Domain Analysis within the ISEC Rapid Center.

  12. Haskell data analysis cookbook

    CERN Document Server

    Shukla, Nishant

    2014-01-01

    Step-by-step recipes filled with practical code samples and engaging examples demonstrate Haskell in practice, and then the concepts behind the code. This book shows functional developers and analysts how to leverage their existing knowledge of Haskell specifically for high-quality data analysis. A good understanding of data sets and functional programming is assumed.

  13. Learning Haskell data analysis

    CERN Document Server

    Church, James

    2015-01-01

    If you are a developer, analyst, or data scientist who wants to learn data analysis methods using Haskell and its libraries, then this book is for you. Prior experience with Haskell and a basic knowledge of data science will be beneficial.

  14. Analysis of numerical methods

    CERN Document Server

    Isaacson, Eugene

    1994-01-01

    This excellent text for advanced undergraduates and graduate students covers norms, numerical solution of linear systems and matrix factoring, iterative solutions of nonlinear equations, eigenvalues and eigenvectors, polynomial approximation, and other topics. It offers a careful analysis and stresses techniques for developing new methods, plus many examples and problems. 1966 edition.

  15. Strictness and Totality Analysis

    DEFF Research Database (Denmark)

    Solberg, K. L.; Nielson, Hanne Riis; Nielson, Flemming

    1998-01-01

    We define a novel inference system for strictness and totality analysis for the simply-typed lazy lambda-calculus with constants and fixpoints. Strictness information identifies those terms that definitely denote bottom (i.e. do not evaluate to WHNF) whereas totality information identifies those ...

  16. An Analysis of Anaphora

    Institute of Scientific and Technical Information of China (English)

    于昌利

    2008-01-01

    This paper is based on Chinese and English examples, presenting a systematic study and analysis of zero anaphora, pronominal anaphora and NP anaphora. It is found that semantic features, pragmatic elements, contexts and syntactic structures play important roles in the choice and interpretation of anaphora.

  17. Digital Systems Analysis

    Science.gov (United States)

    Martin, Vance S.

    2009-01-01

    There have been many attempts to understand how the Internet affects our modern world. There have also been numerous attempts to understand specific areas of the Internet. This article applies Immanuel Wallerstein's World Systems Analysis to our informationalist society. Understanding this world as divided among individual core, semi-periphery,…

  18. Anisotropic generalized Procrustes analysis

    NARCIS (Netherlands)

    Bennani Dosse, Mohammed; Kiers, Henk A.L.; Ten Berge, Jos M.F.

    2011-01-01

    Generalized Procrustes analysis is a popular method for matching several configurations by translations, rotations/reflections and scaling constants. It aims at producing a group average from these Euclidean similarity transformations followed by bi-linear approximation of this group average for gra
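
    For orientation, the sketch below shows a plain (isotropic) generalized Procrustes loop in Python on made-up configurations; the anisotropic scaling that is the subject of the cited paper is not implemented here.

```python
# Illustrative sketch of ordinary (isotropic) generalized Procrustes analysis;
# the anisotropic variant of the cited paper is not implemented. Data are made up.
import numpy as np

def procrustes_fit(X, target):
    """Translate, rotate/reflect and scale X to match target in the least-squares sense."""
    Xc = X - X.mean(axis=0)
    Tc = target - target.mean(axis=0)
    U, _, Vt = np.linalg.svd(Xc.T @ Tc)
    R = U @ Vt                                           # optimal rotation/reflection
    s = np.trace(Tc.T @ (Xc @ R)) / np.trace(Xc.T @ Xc)  # optimal isotropic scale
    return s * (Xc @ R) + target.mean(axis=0)

def gpa(configs, n_iter=20):
    """Iteratively align all configurations to their evolving group average.
    Note: production GPA also rescales the average each iteration to avoid shrinkage."""
    aligned = [X - X.mean(axis=0) for X in configs]
    for _ in range(n_iter):
        average = np.mean(aligned, axis=0)
        aligned = [procrustes_fit(X, average) for X in aligned]
    return aligned, np.mean(aligned, axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    base = rng.normal(size=(5, 2))
    theta = np.pi / 5
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    # Three noisy, rotated, scaled copies of the same 5-point configuration.
    configs = [2.0 * base @ rot + rng.normal(scale=0.01, size=base.shape)
               for _ in range(3)]
    _, consensus = gpa(configs)
    print(consensus.round(2))
```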

  19. Social network analysis

    NARCIS (Netherlands)

    W. de Nooy

    2009-01-01

    Social network analysis (SNA) focuses on the structure of ties within a set of social actors, e.g., persons, groups, organizations, and nations, or the products of human activity or cognition such as web sites, semantic concepts, and so on. It is linked to structuralism in sociology stressing the si

  20. Social Set Analysis

    DEFF Research Database (Denmark)

    Vatrapu, Ravi; Hussain, Abid; Buus Lassen, Niels

    2015-01-01

    of Facebook or Twitter data. However, there exists no other holistic computational social science approach beyond the relational sociology and graph theory of SNA. To address this limitation, this paper presents an alternative holistic approach to Big Social Data analytics called Social Set Analysis (SSA...

  1. Transactional Analysis in Management.

    Science.gov (United States)

    Hewson, Julie; Turner, Colin

    Although Transactional Analysis (TA) has heavily influenced psychotherapy, little has been written to parallel that influence in areas of organization theory, organization behavior, or management studies. This book is intended primarily for people working in management roles. In part one, personal experiences are drawn upon to describe a fictional…

  2. Visual Interactive Analysis

    DEFF Research Database (Denmark)

    Kirchmeier-Andersen, Sabine; Møller Christensen, Jakob; Lihn Jensen, Bente

    2004-01-01

    This article presents the latest version of VIA (version 3.0). The development of the program was initiated by a demand for more systematic training of language analysis in high schools and universities. The system is now web-based, which enables teachers and students to share exercises across th...

  3. Spatial Data Analysis.

    Science.gov (United States)

    Banerjee, Sudipto

    2016-01-01

    With increasing accessibility to geographic information systems (GIS) software, statisticians and data analysts routinely encounter scientific data sets with geocoded locations. This has generated considerable interest in statistical modeling for location-referenced spatial data. In public health, spatial data routinely arise as aggregates over regions, such as counts or rates over counties, census tracts, or some other administrative delineation. Such data are often referred to as areal data. This review article provides a brief overview of statistical models that account for spatial dependence in areal data. It does so in the context of two applications: disease mapping and spatial survival analysis. Disease maps are used to highlight geographic areas with high and low prevalence, incidence, or mortality rates of a specific disease and the variability of such rates over a spatial domain. They can also be used to detect hot spots or spatial clusters that may arise owing to common environmental, demographic, or cultural effects shared by neighboring regions. Spatial survival analysis refers to the modeling and analysis for geographically referenced time-to-event data, where a subject is followed up to an event (e.g., death or onset of a disease) or is censored, whichever comes first. Spatial survival analysis is used to analyze clustered survival data when the clustering arises from geographical regions or strata. Illustrations are provided in these application domains.
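
    As a small, self-contained illustration of one routine step in areal data analysis (not taken from the review itself), the sketch below computes Moran's I, a standard measure of spatial autocorrelation for region-level rates under a binary adjacency matrix; the regions and rates are hypothetical.

```python
# Illustrative sketch with made-up data: Moran's I for areal (region-level) values
# under a symmetric 0/1 neighbour (adjacency) matrix W with no self-links.
import numpy as np

def morans_i(y, W):
    """y: one value per region; W: symmetric 0/1 adjacency matrix."""
    y = np.asarray(y, dtype=float)
    z = y - y.mean()
    n = len(y)
    numerator = n * np.sum(W * np.outer(z, z))
    denominator = W.sum() * np.sum(z ** 2)
    return numerator / denominator

# Four regions on a line, 1-2-3-4, with a smoothly increasing rate.
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])
print(morans_i([1.0, 2.0, 3.0, 4.0], W))   # positive value -> spatial clustering
```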

  4. ATLAS Distributed Analysis Tools

    CERN Document Server

    Gonzalez de la Hoz, Santiago; Liko, Dietrich

    2008-01-01

    The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale. Up to 10000 jobs were processed in one day. The experiences obtained operating the system on several grid flavours were essential to perform a user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase data was registered in the LHC File Catalog (LFC) and replicated in external sites. For the main test, few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS management computing board decided to integrate the collaboration efforts in distributed analysis in only one project, GANGA. The goal is to test the reconstruction and analysis software in a large-scale data production using Grid flavours in several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting a...

  5. LISA Pathfinder data analysis

    Science.gov (United States)

    Antonucci, F.; Armano, M.; Audley, H.; Auger, G.; Benedetti, M.; Binetruy, P.; Boatella, C.; Bogenstahl, J.; Bortoluzzi, D.; Bosetti, P.; Caleno, M.; Cavalleri, A.; Cesa, M.; Chmeissani, M.; Ciani, G.; Conchillo, A.; Congedo, G.; Cristofolini, I.; Cruise, M.; Danzmann, K.; De Marchi, F.; Diaz-Aguilo, M.; Diepholz, I.; Dixon, G.; Dolesi, R.; Fauste, J.; Ferraioli, L.; Fertin, D.; Fichter, W.; Fitzsimons, E.; Freschi, M.; García Marin, A.; García Marirrodriga, C.; Gesa, L.; Giardini, D.; Grimani, C.; Grynagier, A.; Guillaume, B.; Guzmán, F.; Harrison, I.; Heinzel, G.; Hewitson, M.; Hollington, D.; Hough, J.; Hoyland, D.; Hueller, M.; Huesler, J.; Jeannin, O.; Jennrich, O.; Jetzer, P.; Johlander, B.; Killow, C.; Llamas, X.; Lloro, I.; Lobo, A.; Maarschalkerweerd, R.; Madden, S.; Mance, D.; Mateos, I.; McNamara, P. W.; Mendes, J.; Mitchell, E.; Monsky, A.; Nicolini, D.; Nicolodi, D.; Nofrarias, M.; Pedersen, F.; Perreur-Lloyd, M.; Perreca, A.; Plagnol, E.; Prat, P.; Racca, G. D.; Rais, B.; Ramos-Castro, J.; Reiche, J.; Romera Perez, J. A.; Robertson, D.; Rozemeijer, H.; Sanjuan, J.; Schulte, M.; Shaul, D.; Stagnaro, L.; Strandmoe, S.; Steier, F.; Sumner, T. J.; Taylor, A.; Texier, D.; Trenkel, C.; Tombolato, D.; Vitale, S.; Wanner, G.; Ward, H.; Waschke, S.; Wass, P.; Weber, W. J.; Zweifel, P.

    2011-05-01

    As the launch of LISA Pathfinder (LPF) draws near, more and more effort is being put into the preparation of the data analysis activities that will be carried out during the mission operations. The operations phase of the mission will be composed of a series of experiments that will be carried out on the satellite. These experiments will be directed and analysed by the data analysis team, which is part of the operations team. The operations phase will last about 90 days, during which time the data analysis team aims to fully characterize the LPF, and in particular, its core instrument the LISA Technology Package. By analysing the various couplings present in the system, the different noise sources that will disturb the system, and through the identification of the key physical parameters of the system, a detailed noise budget of the instrument will be constructed that will allow the performance of the different subsystems to be assessed and projected towards LISA. This paper describes the various aspects of the full data analysis chain that are needed to successfully characterize the LPF and build up the noise budget during mission operations.

  6. LISA Pathfinder data analysis

    Energy Technology Data Exchange (ETDEWEB)

    Antonucci, F; Cavalleri, A; Congedo, G [Dipartimento di Fisica, Universita di Trento and INFN, Gruppo Collegato di Trento, 38123 Povo, Trento (Italy); Armano, M [European Space Astronomy Centre, European Space Agency, Villanueva de la Canada, 28692 Madrid (Spain); Audley, H; Bogenstahl, J; Danzmann, K [Albert-Einstein-Institut, Max-Planck-Institut fuer Gravitationsphysik und Universitaet Hannover, 30167 Hannover (Germany); Auger, G; Binetruy, P [APC UMR7164, Universite Paris Diderot, Paris (France); Benedetti, M [Dipartimento di Ingegneria dei Materiali e Tecnologie Industriali, Universita di Trento and INFN, Gruppo Collegato di Trento, Mesiano, Trento (Italy); Boatella, C [CNES, DCT/AQ/EC, 18 Avenue Edouard Belin, 31401 Toulouse, Cedex9 (France); Bortoluzzi, D; Bosetti, P; Cristofolini, I [Dipartimento di Ingegneria Meccanica e Strutturale, Universita di Trento and INFN, Gruppo Collegato di Trento, Mesiano, Trento (Italy); Caleno, M; Cesa, M [European Space Technology Centre, European Space Agency, Keplerlaan 1, 2200 AG Noordwijk (Netherlands); Chmeissani, M [IFAE, Universitat Autonoma de Barcelona, E-08193 Bellaterra (Barcelona) (Spain); Ciani, G [Department of Physics, University of Florida, Gainesville, FL 32611-8440 (United States); Conchillo, A [ICE-CSIC/IEEC, Facultat de Ciencies, E-08193 Bellaterra (Barcelona) (Spain); Cruise, M, E-mail: martin.hewitson@aei.mpg.de [Department of Physics and Astronomy, University of Birmingham, Birmingham (United Kingdom)

    2011-05-07

    As the launch of LISA Pathfinder (LPF) draws near, more and more effort is being put into the preparation of the data analysis activities that will be carried out during the mission operations. The operations phase of the mission will be composed of a series of experiments that will be carried out on the satellite. These experiments will be directed and analysed by the data analysis team, which is part of the operations team. The operations phase will last about 90 days, during which time the data analysis team aims to fully characterize the LPF, and in particular, its core instrument the LISA Technology Package. By analysing the various couplings present in the system, the different noise sources that will disturb the system, and through the identification of the key physical parameters of the system, a detailed noise budget of the instrument will be constructed that will allow the performance of the different subsystems to be assessed and projected towards LISA. This paper describes the various aspects of the full data analysis chain that are needed to successfully characterize the LPF and build up the noise budget during mission operations.

  7. Introductory real analysis

    CERN Document Server

    Kolmogorov, A N; Silverman, Richard A

    1975-01-01

    Self-contained and comprehensive, this elementary introduction to real and functional analysis is readily accessible to those with background in advanced calculus. It covers basic concepts and introductory principles in set theory, metric spaces, topological and linear spaces, linear functionals and linear operators, and much more. 350 problems. 1970 edition.

  8. Policy analysis without data

    Energy Technology Data Exchange (ETDEWEB)

    Hale, D.R.; Spancake, L.R. (Energy Information Adminstration, Washington, DC (United States))

    1993-01-01

    This article describes the Energy Information Administration's work to assess the potential of experimental economics for policy analysis. The article reviews the experimental approach and describes an application to provisions of the Clean Air Act Amendments of 1990. 10 refs.

  9. Multilevel component analysis

    NARCIS (Netherlands)

    Timmerman, M.E.

    2006-01-01

    A general framework for the exploratory component analysis of multilevel data (MLCA) is proposed. In this framework, a separate component model is specified for each group of objects at a certain level. The similarities between the groups of objects at a given level can be expressed by imposing cons

  10. Methods in algorithmic analysis

    CERN Document Server

    Dobrushkin, Vladimir A

    2009-01-01

    …helpful to any mathematics student who wishes to acquire a background in classical probability and analysis … This is a remarkably beautiful book that would be a pleasure for a student to read, or for a teacher to make into a year's course.-Harvey Cohn, Computing Reviews, May 2010

  11. Elementary functional analysis

    CERN Document Server

    Shilov, Georgi E

    1996-01-01

    Introductory text covers basic structures of mathematical analysis (linear spaces, metric spaces, normed linear spaces, etc.), differential equations, orthogonal expansions, Fourier transforms - including problems in the complex domain, especially involving the Laplace transform - and more. Each chapter includes a set of problems, with hints and answers. Bibliography. 1974 edition.

  12. Trajectory Based Traffic Analysis

    DEFF Research Database (Denmark)

    Krogh, Benjamin Bjerre; Andersen, Ove; Lewis-Kelham, Edwin

    2013-01-01

    -and-click analysis, due to a novel and efficient indexing structure. With the web-site daisy.aau.dk/its/spqdemo/ we will demonstrate several analyses, using a very large real-world data set consisting of 1.9 billion GPS records (1.5 million trajectories) recorded from more than 13000 vehicles, and touching most...

  13. Learning Talk Analysis

    Science.gov (United States)

    Markee, Numa; Seo, Mi-Suk

    2009-01-01

    Since the beginning, second language acquisition (SLA) studies have been predominantly cognitive in their theoretical assumptions and programmatic agendas. This is still largely true today. In this paper, we set out our proposals for learning talk analysis (LTA). LTA synthesizes insights from linguistic philosophy, ethnomethodology, conversation…

  14. Design Analysis and Synthesis.

    Science.gov (United States)

    Redekop, David

    1984-01-01

    Encourages use of case studies to introduce "real-life" engineering assignments, suggesting that they should be given in a natural order as a series. Describes three such assignments being used at the University of the West Indies. Description, analysis, and design of an existing engineering system is included. (JM)

  15. Doxing: a conceptual analysis

    NARCIS (Netherlands)

    Douglas, David M.

    2016-01-01

    Doxing is the intentional public release onto the Internet of personal information about an individual by a third party, often with the intent to humiliate, threaten, intimidate, or punish the identified individual. In this paper I present a conceptual analysis of the practice of doxing and how it d

  16. SWOT analysis - Chinese Petroleum

    OpenAIRE

    Chunlan Wang; Lei Zhang; Qi Zhong

    2014-01-01

    This article, written in early December 2013, presents a SWOT analysis of Chinese Petroleum based on the company's historical development and the latest available data. The paper provides a comprehensive and in-depth analysis of corporate resources, costs and management, as well as external factors such as the political environment and market supply and demand.

  17. Analysis of Endomorphisms

    DEFF Research Database (Denmark)

    Conti, Roberto; Hong, Jeong Hee; Szymanski, Wojciech

    2012-01-01

    of such an algebra. Then we outline a powerful combinatorial approach to analysis of endomorphisms arising from permutation unitaries. The restricted Weyl group consists of automorphisms of this type. We also discuss the action of the restricted Weyl group on the diagonal MASA and its relationship...

  18. Advanced Economic Analysis

    Science.gov (United States)

    Greenberg, Marc W.; Laing, William

    2013-01-01

    An Economic Analysis (EA) is a systematic approach to the problem of choosing the best method of allocating scarce resources to achieve a given objective. An EA helps guide decisions on the "worth" of pursuing an action that departs from status quo ... an EA is the crux of decision-support.

  19. Experiments in Scene Analysis

    Science.gov (United States)

    1970-01-01

    analyzer that we construct will very likely combine some of the ideas reported here with the approach described by our co-workers Brice and Fennema ... 188-205 (1968). Brice, C. R., and C. L. Fennema, "Scene Analysis of Pictures Using Regions," Technical Note No. 17, Artificial Intelligence Group

  20. Safety analysis for 'Fugen'

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-10-01

    The improvement of safety in nuclear power stations is an important proposition. Safety evaluation should therefore be carried out comprehensively and systematically, drawing on operational experience and new safety-relevant knowledge throughout the period of use as well as before construction and the start of operation. This report describes the results of a safety analysis for 'Fugen' carried out with reference to the latest technical knowledge. As a result, it was confirmed that the safety of 'Fugen' is secured by its inherent safety and by the facilities designed to secure safety. The basic approach to the safety analysis, including the guidelines to be conformed to, is described. For abnormal operational transients and accidents, their definition, the events to be evaluated and the standards for judgement are reported, together with the matters taken into consideration in the analysis. The computation programs used for the analysis were REACT, HEATUP, LAYMON, FATRAC, SENHOR, LOTRAC, FLOOD and CONPOL. The causes, countermeasures, protective functions and results for the abnormal operational transients and accidents are reported. (K.I.)

  1. Information Security Risk Analysis

    CERN Document Server

    Peltier, Thomas R

    2010-01-01

    Offers readers the knowledge and the skill-set needed to achieve a highly effective risk analysis assessment. This title demonstrates how to identify threats and then determine if those threats pose a real risk. It is suitable for industry and academia professionals.

  2. Financial Analysis of HRD.

    Science.gov (United States)

    1995

    These five papers are from a symposium facilitated by Ronald L. Jacobs on financial analysis of human resource development (HRD) at the 1995 Academy of Human Resource Development conference. "Return on Investment--Beyond the Four Levels!" (Jack J. Phillips) offers a modification of the Kirkpatrick model of evaluation for HRD in…

  3. Proteoglycan isolation and analysis

    DEFF Research Database (Denmark)

    Woods, A; Couchman, J R

    2001-01-01

    Proteoglycans can be difficult molecules to isolate and analyze due to large mass, charge, and tendency to aggregate or form macromolecular complexes. This unit describes detailed methods for purification of matrix, cell surface, and cytoskeleton-linked proteoglycans. Methods for analysis...

  4. Online Paper Review Analysis

    Directory of Open Access Journals (Sweden)

    Doaa Mohey El-Din

    2015-09-01

    Full Text Available Sentiment analysis or opinion mining is used to automate the detection of subjective information such as opinions, attitudes, emotions, and feelings. Hundreds of thousands of researchers care about scientific research and spend a long time selecting suitable papers for their work. Online reviews of papers are an essential source of help: they save reading time and the cost of obtaining papers. This paper proposes a new technique to analyze online reviews, called sentiment analysis of online papers (SAOOP). SAOOP is a new technique for enhancing the bag-of-words model, improving accuracy and performance. SAOOP is useful in increasing the understanding rate of review sentences through broader language coverage. SAOOP introduces solutions for some sentiment analysis challenges and uses them to achieve higher accuracy. This paper also presents a measure of topic-domain attributes, which provides a ranking of the total judgement on each text review for assessing and comparing results across different sentiment techniques for a given text review. Finally, the efficiency of the proposed approach is shown by comparing the proposed technique with two sentiment analysis techniques. The comparison terms are based on measuring accuracy, performance and the understanding rate of sentences.
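
    For orientation, the sketch below shows the kind of plain bag-of-words, lexicon-based baseline that such techniques improve upon; it is not SAOOP itself, and the lexicon, negation rule and example review are invented.

```python
# Illustrative baseline only (NOT the SAOOP technique): score a review by
# counting lexicon hits, with a toy negation rule. Lexicon and example are made up.
POSITIVE = {"clear", "novel", "sound", "thorough", "interesting"}
NEGATIVE = {"weak", "unclear", "trivial", "flawed", "incomplete"}
NEGATORS = {"not", "no", "never", "hardly"}

def sentiment_score(review: str) -> int:
    score, negate = 0, False
    for token in review.lower().replace(",", " ").replace(".", " ").split():
        if token in NEGATORS:
            negate = True
            continue
        hit = (token in POSITIVE) - (token in NEGATIVE)   # +1, 0 or -1
        score += -hit if negate else hit
        negate = False
    return score

print(sentiment_score("The method is novel but the evaluation is not thorough."))
# -> 0 : one positive hit cancelled by a negated positive
```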

  5. Introduction to Food Analysis

    Science.gov (United States)

    Nielsen, S. Suzanne

    Investigations in food science and technology, whether by the food industry, governmental agencies, or universities, often require determination of food composition and characteristics. Trends and demands of consumers, the food industry, and national and international regulations challenge food scientists as they work to monitor food composition and to ensure the quality and safety of the food supply. All food products require analysis as part of a quality management program throughout the development process (including raw ingredients), through production, and after a product is in the market. In addition, analysis is done of problem samples and competitor products. The characteristics of foods (i.e., chemical composition, physical properties, sensory properties) are used to answer specific questions for regulatory purposes and typical quality control. The nature of the sample and the specific reason for the analysis commonly dictate the choice of analytical methods. Speed, precision, accuracy, and ruggedness often are key factors in this choice. Validation of the method for the specific food matrix being analyzed is necessary to ensure usefulness of the method. Making an appropriate choice of the analytical technique for a specific application requires a good knowledge of the various techniques (Fig. 1.1). For example, your choice of method to determine the salt content of potato chips would be different if it is for nutrition labeling than for quality control. The success of any analytical method relies on the proper selection and preparation of the food sample, carefully performing the analysis, and doing the appropriate calculations and interpretation of the data. Methods of analysis developed and endorsed by several nonprofit scientific organizations allow for standardized comparisons of results between different laboratories and for evaluation of less standard procedures. Such official methods are critical in the analysis of foods, to ensure that they meet

  6. Systems analysis - independent analysis and verification

    Energy Technology Data Exchange (ETDEWEB)

    DiPietro, J.P.; Skolnik, E.G.; Badin, J.S. [Energetics, Inc., Columbia, MD (United States)

    1996-10-01

    The Hydrogen Program of the U.S. Department of Energy (DOE) funds a portfolio of activities ranging from conceptual research to pilot plant testing. The long-term research projects support DOE's goal of a sustainable, domestically based energy system, and the development activities are focused on hydrogen-based energy systems that can be commercially viable in the near-term. Energetics develops analytic products that enable the Hydrogen Program Manager to assess the potential for near- and long-term R&D activities to satisfy DOE and energy market criteria. This work is based on a pathway analysis methodology. The authors consider an energy component (e.g., hydrogen production from biomass gasification, hybrid hydrogen internal combustion engine (ICE) vehicle) within a complete energy system. The work involves close interaction with the principal investigators to ensure accurate representation of the component technology. Comparisons are made with the current cost and performance of fossil-based and alternative renewable energy systems, and sensitivity analyses are conducted to determine the effect of changes in cost and performance parameters on the projects' viability.

  7. Extended Testability Analysis Tool

    Science.gov (United States)

    Melcher, Kevin; Maul, William A.; Fulton, Christopher

    2012-01-01

    The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.

  8. Ceramic tubesheet design analysis

    Energy Technology Data Exchange (ETDEWEB)

    Mallett, R.H.; Swindeman, R.W.

    1996-06-01

    A transport combustor is being commissioned at the Southern Services facility in Wilsonville, Alabama to provide a gaseous product for the assessment of hot-gas filtering systems. One of the barrier filters incorporates a ceramic tubesheet to support candle filters. The ceramic tubesheet, designed and manufactured by Industrial Filter and Pump Manufacturing Company (IF&PM), is unique and offers distinct advantages over metallic systems in terms of density, resistance to corrosion, and resistance to creep at operating temperatures above 815°C (1500°F). Nevertheless, the operational requirements of the ceramic tubesheet are severe. The tubesheet is almost 1.5 m (55 in.) in diameter, has many penetrations, and must support the weight of the ceramic filters, coal ash accumulation, and a pressure drop (one atmosphere). Further, thermal stresses related to steady state and transient conditions will occur. To gain a better understanding of the structural performance limitations, a contract was placed with Mallett Technology, Inc. to perform a thermal and structural analysis of the tubesheet design. The design analysis specification and a preliminary design analysis were completed in the early part of 1995. The analyses indicated that modifications to the design were necessary to reduce thermal stress, and it was necessary to complete the redesign before the final thermal/mechanical analysis could be undertaken. The preliminary analysis identified the need to confirm that the physical and mechanical properties data used in the design were representative of the material in the tubesheet. Subsequently, a few exploratory tests were performed at ORNL to evaluate the ceramic structural material.

  9. Robust verification analysis

    Energy Technology Data Exchange (ETDEWEB)

    Rider, William, E-mail: wjrider@sandia.gov [Sandia National Laboratories, Center for Computing Research, Albuquerque, NM 87185 (United States); Witkowski, Walt [Sandia National Laboratories, Verification and Validation, Uncertainty Quantification, Credibility Processes Department, Engineering Sciences Center, Albuquerque, NM 87185 (United States); Kamm, James R. [Los Alamos National Laboratory, Methods and Algorithms Group, Computational Physics Division, Los Alamos, NM 87545 (United States); Wildey, Tim [Sandia National Laboratories, Center for Computing Research, Albuquerque, NM 87185 (United States)

    2016-02-15

    We introduce a new methodology for inferring the accuracy of computational simulations through the practice of solution verification. We demonstrate this methodology on examples from computational heat transfer, fluid dynamics and radiation transport. Our methodology is suited to both well- and ill-behaved sequences of simulations. Our approach to the analysis of these sequences of simulations incorporates expert judgment into the process directly via a flexible optimization framework, and the application of robust statistics. The expert judgment is systematically applied as constraints to the analysis, and together with the robust statistics guards against over-emphasis on anomalous analysis results. We have named our methodology Robust Verification. Our methodology is based on utilizing multiple constrained optimization problems to solve the verification model in a manner that varies the analysis' underlying assumptions. Constraints applied in the analysis can include expert judgment regarding convergence rates (bounds and expectations) as well as bounding values for physical quantities (e.g., positivity of energy or density). This approach then produces a number of error models, which are then analyzed through robust statistical techniques (median instead of mean statistics). This provides self-contained, data and expert informed error estimation including uncertainties for both the solution itself and order of convergence. Our method produces high quality results for the well-behaved cases relatively consistent with existing practice. The methodology can also produce reliable results for ill-behaved circumstances predicated on appropriate expert judgment. We demonstrate the method and compare the results with standard approaches used for both code and solution verification on well-behaved and ill-behaved simulations.
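
    The basic quantity the methodology works with can be illustrated with a short sketch (not the authors' Robust Verification framework): estimating the observed order of convergence from outputs on successively refined grids and summarizing the estimates with a median rather than a mean; the numbers below are made up.

```python
# Illustrative sketch only: observed order of convergence from a sequence of
# simulation outputs on successively refined grids, summarized by a median.
import math
import statistics

def observed_orders(values, refinement_ratio=2.0):
    """values: outputs ordered coarse -> fine, assuming a constant refinement ratio."""
    orders = []
    for f_coarse, f_mid, f_fine in zip(values, values[1:], values[2:]):
        num, den = abs(f_mid - f_coarse), abs(f_fine - f_mid)
        if num > 0 and den > 0:
            orders.append(math.log(num / den) / math.log(refinement_ratio))
    return orders

vals = [1.120, 1.031, 1.008, 1.002]          # made-up, roughly second-order sequence
orders = observed_orders(vals)
print(orders, "median order:", statistics.median(orders))
```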

  10. Exploratory data analysis with Matlab

    CERN Document Server

    Martinez, Wendy L; Solka, Jeffrey

    2010-01-01

    Since the publication of the bestselling first edition, many advances have been made in exploratory data analysis (EDA). Covering innovative approaches for dimensionality reduction, clustering, and visualization, Exploratory Data Analysis with MATLAB®, Second Edition uses numerous examples and applications to show how the methods are used in practice. New to the Second Edition: discussions of nonnegative matrix factorization, linear discriminant analysis, curvilinear component analysis, independent component analysis, and smoothing splines; an expanded set of methods for estimating the intrinsic di...

  11. Analysis of Muji's Business Strategy

    Institute of Scientific and Technical Information of China (English)

    范晶

    2011-01-01

    This article is a report of an analysis of Muji's business strategy. First, the vision and mission are introduced. Second, the current strategy is identified. Then the industry analysis, industry driving forces, key success factors, value chain analysis, competitive advantage, and the competitive power of that advantage are analyzed. Finally, on the basis of the foregoing analysis, a SWOT analysis is worked out.

  12. Dicty_cDB: Contig-U15070-1 [Dicty_cDB

    Lifescience Database Archive (English)

    Full Text Available _2742 (AE010299 |pid:none) Methanosarcina acetivorans str. ... 37 1.9; AJ006017_1 (AJ006017 |pid:none) Hamster ... in Kilh... 37 2.5; AJ006016_1 (AJ006016 |pid:none) Hamster polyomavirus VP2 gene f

  13. Handbook of radioactivity analysis

    CERN Document Server

    2012-01-01

    The updated and much expanded Third Edition of the "Handbook of Radioactivity Analysis" is an authoritative reference providing the principles, practical techniques, and procedures for the accurate measurement of radioactivity from the very low levels encountered in the environment to higher levels measured in radioisotope research, clinical laboratories, biological sciences, radionuclide standardization, nuclear medicine, nuclear power, fuel cycle facilities and in the implementation of nuclear forensic analysis and nuclear safeguards. The Third Edition contains seven new chapters providing a reference text much broader in scope than the previous Second Edition, and all of the other chapters have been updated and expanded many with new authors. The book describes the basic principles of radiation detection and measurement, the preparation of samples from a wide variety of matrices, assists the investigator or technician in the selection and use of appropriate radiation detectors, and presents state-of-the-ar...

  14. MIR Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hazen, Damian [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States); Hick, Jason [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States)

    2012-06-12

    We provide analysis of Oracle StorageTek T10000 Generation B (T10KB) Media Information Record (MIR) Performance Data gathered over the course of a year from our production High Performance Storage System (HPSS). The analysis shows information in the MIR may be used to improve tape subsystem operations. Most notably, we found the MIR information to be helpful in determining whether the drive or tape was most suspect given a read or write error, and for helping identify which tapes should not be reused given their history of read or write errors. We also explored using the MIR Assisted Search to order file retrieval requests. We found that MIR Assisted Search may be used to reduce the time needed to retrieve collections of files from a tape volume.

  15. Communication Analysis modelling techniques

    CERN Document Server

    España, Sergio; Pastor, Óscar; Ruiz, Marcela

    2012-01-01

    This report describes and illustrates several modelling techniques proposed by Communication Analysis; namely Communicative Event Diagram, Message Structures and Event Specification Templates. The Communicative Event Diagram is a business process modelling technique that adopts a communicational perspective by focusing on communicative interactions when describing the organizational work practice, instead of focusing on physical activities; at this abstraction level, we refer to business activities as communicative events. Message Structures is a technique based on structured text that allows specifying the messages associated to communicative events. Event Specification Templates are a means to organise the requirements concerning a communicative event. This report can be useful to analysts and business process modellers in general, since, according to our industrial experience, it is possible to apply many Communication Analysis concepts, guidelines and criteria to other business process modelling notation...

  16. Exascale Data Analysis

    CERN Document Server

    CERN. Geneva; Fitch, Blake

    2011-01-01

    Traditionally, the primary role of supercomputers was to create data, primarily for simulation applications. Due to usage and technology trends, supercomputers are increasingly also used for data analysis. Some of this data is from simulations, but there is also a rapidly increasing amount of real-world science and business data to be analyzed. We briefly overview Blue Gene and other current supercomputer architectures. We outline future architectures, up to the Exascale supercomputers expected in the 2020 time frame. We focus on the data analysis challenges and opportunities, especially those concerning Flash and other up-and-coming storage class memory. About the speakers Blake G. Fitch has been with IBM Research, Yorktown Heights, NY since 1987, mainly pursuing interests in parallel systems. He joined the Scalable Parallel Systems Group in 1990, contributing to research and development that culminated in the IBM scalable parallel system (SP*) product. His research interests have focused on applicatio...

  17. In Silico Expression Analysis.

    Science.gov (United States)

    Bolívar, Julio; Hehl, Reinhard; Bülow, Lorenz

    2016-01-01

    Information on the specificity of cis-sequences enables the design of functional synthetic plant promoters that are responsive to specific stresses. Potential cis-sequences may be experimentally tested; however, correlation of genomic sequence with gene expression data enables an in silico expression analysis approach to bioinformatically assess the stress specificity of candidate cis-sequences prior to experimental verification. The present chapter demonstrates an example for the in silico validation of a potential cis-regulatory sequence responsive to cold stress. The described online tool can be applied for the bioinformatic assessment of cis-sequences responsive to most abiotic and biotic stresses of plants. Furthermore, a method is presented based on a reverted in silico expression analysis approach that predicts highly specific potentially functional cis-regulatory elements for a given stress.

  18. Specialized ratio analysis.

    Science.gov (United States)

    Wyer, J C; Salzinger, F H

    1983-01-01

    Many common management techniques have little use in managing a medical group practice. Ratio analysis, however, can easily be adapted to the group practice setting. Acting as broad-gauge indicators, financial ratios provide an early warning of potential problems and can be very useful in planning for future operations. The author has gathered a collection of financial ratios which were developed by participants at an education seminar presented for the Virginia Medical Group Management Association. Classified according to the human element, system component, and financial factor, the ratios provide a good sampling of measurements relevant to medical group practices and can serve as an example for custom-tailoring a ratio analysis system for your medical group.
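
    As a hypothetical illustration of how such broad-gauge indicators are computed (the ratio definitions below are common textbook ones, not the author's seminar list, and the figures are invented), a practice's bookkeeping totals can be reduced to a handful of warning indicators:

```python
# Illustrative sketch with made-up figures; the ratio definitions are common
# textbook ones, not the specific set gathered in the cited article.
practice = {
    "current_assets": 250_000, "current_liabilities": 100_000,
    "accounts_receivable": 180_000, "annual_charges": 1_460_000,
    "operating_expenses": 820_000, "net_revenue": 1_000_000,
}

ratios = {
    "current_ratio": practice["current_assets"] / practice["current_liabilities"],
    "days_in_accounts_receivable": practice["accounts_receivable"] / (practice["annual_charges"] / 365),
    "overhead_ratio": practice["operating_expenses"] / practice["net_revenue"],
}
for name, value in ratios.items():
    print(f"{name}: {value:.2f}")   # broad-gauge indicators tracked over time
```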

  19. Jumper connector analysis

    Science.gov (United States)

    Kanjilal, S. K.; Lindquist, M. R.; Ulbricht, L. E.

    1994-02-01

    Jumper connectors are used for remotely connecting pipe lines containing transfer fluids ranging from hazardous chemicals to other nonhazardous liquids. The jumper connector assembly comprises hooks, hookpins, a block, a nozzle, an operating screw, and a nut. The hooks are tightened against the nozzle flanges by the operating screw that is tightened with a remotely connected torque wrench. Stress analysis for the jumper connector assembly (used extensively on the US Department of Energy's Hanford Site, near Richland, Washington) is performed by using hand calculation and finite-element techniques to determine the stress levels resulting from operating and seismic loads on components of the assembly. The analysis addresses loading conditions such as prestress, seismic, operating, thermal, and leakage. The preload torque-generated forces at which each component reaches its stress limits are presented in a tabulated format. Allowable operating loads for the jumper assembly are provided to prevent leakage of the assembly during operating cycles.

  20. Mercury analysis in hair

    DEFF Research Database (Denmark)

    Esteban, Marta; Schindler, Birgit K; Jiménez-Guerrero, José A

    2015-01-01

    Human biomonitoring (HBM) is an effective tool for assessing actual exposure to chemicals that takes into account all routes of intake. Although hair analysis is considered to be an optimal biomarker for assessing mercury exposure, the lack of harmonization as regards sampling and analytical...... assurance program (QAP) for assessing mercury levels in hair samples from more than 1800 mother-child pairs recruited in 17 European countries. To ensure the comparability of the results, standard operating procedures (SOPs) for sampling and for mercury analysis were drafted and distributed to participating...... laboratories. Training sessions were organized for field workers and four external quality-assessment exercises (ICI/EQUAS), followed by the corresponding web conferences, were organized between March 2011 and February 2012. ICI/EQUAS used native hair samples at two mercury concentration ranges (0...

  1. Elemental analysis in biotechnology.

    Science.gov (United States)

    Hann, Stephan; Dernovics, Mihaly; Koellensperger, Gunda

    2015-02-01

    This article focuses on analytical strategies integrating atomic spectroscopy in biotechnology. The rationale behind developing such methods is inherently linked to unique features of the key technique in elemental analysis, which is inductively coupled plasma mass spectrometry: (1) the high sensitivity and selectivity of state of the art instrumentation, (2) the possibility of accurate absolute quantification even in complex matrices, (3) the capability of combining elemental detectors with chromatographic separation methods and the versatility of the latter approach, (4) the complementarity of inorganic and organic mass spectrometry, (5) the multi-element capability and finally (6) the capability of isotopic analysis. The article highlights the most recent bio-analytical developments exploiting these methodological advantages and shows the potential in biotechnological applications.

  2. Exploration Laboratory Analysis - ARC

    Science.gov (United States)

    Krihak, Michael K.; Fung, Paul P.

    2012-01-01

    The Exploration Laboratory Analysis (ELA) project supports the Exploration Medical Capability (ExMC) risk, Risk of Inability to Adequately Treat an Ill or Injured Crew Member, and ExMC Gap 4.05: Lack of minimally invasive in-flight laboratory capabilities with limited consumables required for diagnosing identified Exploration Medical Conditions. To mitigate this risk, the availability of inflight laboratory analysis instrumentation has been identified as an essential capability in future exploration missions. Mission architecture poses constraints on equipment and procedures that will be available to treat evidence-based medical conditions according to the Space Medicine Exploration Medical Conditions List (SMEMCL). The SMEMCL provided diagnosis and treatment for the evidence-based medical conditions and hence, a basis for developing ELA functional requirements.

  3. Invitation to complex analysis

    CERN Document Server

    Boas, Ralph P

    2010-01-01

    Ideal for a first course in complex analysis, this book can be used either as a classroom text or for independent study. Written at a level accessible to advanced undergraduates and beginning graduate students, the book is suitable for readers acquainted with advanced calculus or introductory real analysis. The treatment goes beyond the standard material of power series, Cauchy's theorem, residues, conformal mapping, and harmonic functions by including accessible discussions of intriguing topics that are uncommon in a book at this level. The flexibility afforded by the supplementary topics and applications makes the book adaptable either to a short, one-term course or to a comprehensive, full-year course. Detailed solutions of the exercises both serve as models for students and facilitate independent study. Supplementary exercises, not solved in the book, provide an additional teaching tool. This second edition has been painstakingly revised by the author's son, himself an award-winning mathematical expositor...

  4. [Audimutitas: An analysis].

    Science.gov (United States)

    Goorhuis-Brouwer, S M

    1989-01-01

    There are children who are assumed to have normal hearing and normal intelligence, but who do not speak at all. The phoniatric diagnosis for these children used to be audimutitas. Nowadays these children are no longer given a separate diagnosis; they are called language-disturbed and are diagnosed and treated like all other language-disturbed children. An analysis is made of 46 nonspeaking children, aged 1.5-3, who are assumed to understand language properly. The analysis deals with their language (language comprehension, language production and pragmatics) as well as with medical and psychological factors influencing the language disorder. We found no correlations between language comprehension, language production and pragmatics. So, when we know one language aspect, we cannot be sure of the others. We also found various factors contributing to the language problem.

  5. Nonstandard asymptotic analysis

    CERN Document Server

    Berg, Imme

    1987-01-01

    This research monograph considers the subject of asymptotics from a nonstandard view point. It is intended both for classical asymptoticists - they will discover a new approach to problems very familiar to them - and for nonstandard analysts but includes topics of general interest, like the remarkable behaviour of Taylor polynomials of elementary functions. Noting that within nonstandard analysis, "small", "large", and "domain of validity of asymptotic behaviour" have a precise meaning, a nonstandard alternative to classical asymptotics is developed. Special emphasis is given to applications in numerical approximation by convergent and divergent expansions: in the latter case a clear asymptotic answer is given to the problem of optimal approximation, which is valid for a large class of functions including many special functions. The author's approach is didactical. The book opens with a large introductory chapter which can be read without much knowledge of nonstandard analysis. Here the main features of the t...

  6. Waveform analysis of sound

    CERN Document Server

    Tohyama, Mikio

    2015-01-01

    What is this sound? What does that sound indicate? These are two questions frequently heard in daily conversation. Sound results from the vibrations of elastic media and in daily life provides informative signals of events happening in the surrounding environment. In interpreting auditory sensations, the human ear seems particularly good at extracting the signal signatures from sound waves. Although exploring auditory processing schemes may be beyond our capabilities, source signature analysis is a very attractive area in which signal-processing schemes can be developed using mathematical expressions. This book is inspired by such processing schemes and is oriented to signature analysis of waveforms. Most of the examples in the book are taken from data of sound and vibrations; however, the methods and theories are mostly formulated using mathematical expressions rather than by acoustical interpretation. This book might therefore be attractive and informative for scientists, engineers, researchers, and graduat...

  7. Bayesian Exploratory Factor Analysis

    DEFF Research Database (Denmark)

    Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.;

    2014-01-01

    This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates...

  8. MEAD retrospective analysis report

    DEFF Research Database (Denmark)

    Hasager, Charlotte Bay; Carstensen, J.; Frohn, L.M.;

    2003-01-01

    The retrospective analysis investigates links between atmospheric nitrogen deposition and algal bloom development in the Kattegat Sea from April to September 1989-1999. The analysis is based on atmospheric deposition model results from the ACDEP model, hydrodynamic deep-water flux results...... with an increase above 0.5 µg/l chlorophyll a, but several consecutive days of high nitrogen inputs create the potential for blooms. The physical and chemical conditions before and during a bloom revealed that blooms occurred under higher salinity and wind conditions on 2-6 days prior to the observed bloom...... the bottom waters. Yet the cumulative atmospheric deposition is always larger than the marine deep-water flux. The mixing of nutrient-rich water from below the pycnocline into the euphotic zone is also a process of highly episodic character and provides sufficient nitrogen to the euphotic zone to sustain...

  9. Methods for RNA Analysis

    DEFF Research Database (Denmark)

    Olivarius, Signe

    While increasing evidence appoints diverse types of RNA as key players in the regulatory networks underlying cellular differentiation and metabolism, the potential functions of thousands of conserved RNA structures encoded in mammalian genomes remain to be determined. Since the functions of most...... RNAs rely on interactions with proteins, the establishment of protein-binding profiles is essential for the characterization of RNAs. Aiming to facilitate RNA analysis, this thesis introduces proteomics- as well as transcriptomics-based methods for the functional characterization of RNA. First, RNA......-protein pulldown combined with mass spectrometry analysis is applied for in vivo as well as in vitro identification of RNA-binding proteins, the latter succeeding in verifying known RNA-protein interactions. Secondly, acknowledging the significance of flexible promoter usage for the diversification...

  10. Reconstructability analysis of epistasis.

    Science.gov (United States)

    Zwick, Martin

    2011-01-01

    The literature on epistasis describes various methods to detect epistatic interactions and to classify different types of epistasis. Reconstructability analysis (RA) has recently been used to detect epistasis in genomic data. This paper shows that RA offers a classification of types of epistasis at three levels of resolution (variable-based models without loops, variable-based models with loops, state-based models). These types can be defined by the simplest RA structures that model the data without information loss; a more detailed classification can be defined by the information content of multiple candidate structures. The RA classification can be augmented with structures from related graphical modeling approaches. RA can analyze epistatic interactions involving an arbitrary number of genes or SNPs and constitutes a flexible and effective methodology for genomic analysis.

  11. Analysis of neural data

    CERN Document Server

    Kass, Robert E; Brown, Emery N

    2014-01-01

    Continual improvements in data collection and processing have had a huge impact on brain research, producing data sets that are often large and complicated. By emphasizing a few fundamental principles, and a handful of ubiquitous techniques, Analysis of Neural Data provides a unified treatment of analytical methods that have become essential for contemporary researchers. Throughout the book ideas are illustrated with more than 100 examples drawn from the literature, ranging from electrophysiology, to neuroimaging, to behavior. By demonstrating the commonality among various statistical approaches the authors provide the crucial tools for gaining knowledge from diverse types of data. Aimed at experimentalists with only high-school level mathematics, as well as computationally-oriented neuroscientists who have limited familiarity with statistics, Analysis of Neural Data serves as both a self-contained introduction and a reference work.

  12. Theories of Comparative Analysis

    Science.gov (United States)

    1988-05-01

    model of a projectile fired from a cannon in a uniform gravitational field serves to demonstrate the problems due to qualitative arithmetic. Nei...recently demonstrated the qualitative Gauss rule, a type of algebraic manipulation that is solution preserving. While it cannot eliminate all...projectile fired from a cannon illustrates this point. Given an increase in muzzle velocity, Vft, as a perturbation, DQ analysis predicts that apogee

  13. Analysis of employee satisfaction

    OpenAIRE

    Cikrytová, Kateřina

    2009-01-01

    The thesis contains an analysis of the job satisfaction of call-center employees. The theoretical part defines the concept of job satisfaction, describes the determinants of job satisfaction, and outlines the relationship between job satisfaction and work motivation. The practical part analyses the results of a questionnaire survey and presents suggested measures to increase job satisfaction. The respondents were asked about satisfaction with the content of work, remuneration, work organization, ...

  14. Enterprise Architecture Tradespace Analysis

    Science.gov (United States)

    2014-02-21

    Investigator: Dr. Tommer Ender, Georgia Institute of Technology Research Team Dr. Santiago Balestrini-Robinson Daniel Browne Jennifer DeLockery Aaron...analysis of alternatives for the acquisition programs capable of assessing cost, schedule and performance risk. [O’Neal et al., 2011] [Ender et al... Ender, Tommer R., Daniel C. Browne, William W. Yates, and Michael O’Neal. 2012. "FACT: An M&S Framework for Systems Engineering." In The

  15. Analysis of Ergot Alkaloids

    OpenAIRE

    Colin Crews

    2015-01-01

    The principles and application of established and newer methods for the quantitative and semi-quantitative determination of ergot alkaloids in food, feed, plant materials and animal tissues are reviewed. The techniques of sampling, extraction, clean-up, detection, quantification and validation are described. The major procedures for ergot alkaloid analysis comprise liquid chromatography with tandem mass spectrometry (LC-MS/MS) and liquid chromatography with fluorescence detection (LC-FLD). Ot...

  16. Introduction to abstract analysis

    CERN Document Server

    Goldstein, Marvin E

    2015-01-01

    Developed from lectures delivered at NASA's Lewis Research Center, this concise text introduces scientists and engineers with backgrounds in applied mathematics to the concepts of abstract analysis. Rather than preparing readers for research in the field, this volume offers background necessary for reading the literature of pure mathematics. Starting with elementary set concepts, the treatment explores real numbers, vector and metric spaces, functions and relations, infinite collections of sets, and limits of sequences. Additional topics include continuity and function algebras, Cauchy complet

  17. Dental Forensics: Bitemark Analysis

    Directory of Open Access Journals (Sweden)

    Elza Ibrahim Auerkari

    2013-06-01

    Forensic odontology (dental forensics) can provide useful evidence in both criminal and civil cases, and therefore remains a part of the wider discipline of forensic science. As an example from the toolbox of forensic odontology, the practice and experience on bitemark analysis is reviewed here in brief. The principle of using visible bitemarks in crime victims or in other objects as evidence is fundamentally based on the observation that the detailed pattern of dental imprints tends to be practically unique for each individual. Therefore, finding such an imprint as a bitemark can bear a strong testimony that it was produced by the individual that has the matching dental pattern. However, the comparison of the observed bitemark and the suspected set of teeth will necessarily require human interpretation, and this is not infallible. Both technical challenges in the bitemarks and human errors in the interpretation are possible. To minimise such errors and to maximise the value of bitemark analysis, dedicated procedures and protocols have been developed, and the personnel taking care of the analysis need to be properly trained. In principle the action within the discipline should be conducted as in evidence-based dentistry, i.e. accepted procedures should have known error rates. Because of the involvement of human interpretation, even personal performance statistics may be required in legal expert statements. The requirements have been introduced largely due to cases where false convictions based on bitemark analysis have been overturned after DNA analysis. DOI: 10.14693/jdi.v15i2.76

  18. Risk Analysis in Action

    Institute of Scientific and Technical Information of China (English)

    KYU-HWAN; YANG

    2001-01-01

    Risk analysis is a useful tool for making good decisions on the risks of certain potentially hazardous agents and suggests a safe margin through scientific processes using toxicological data, contaminant residue levels, statistical tools, exposure values and relevant variants. Risk managers consider scientific evidence and risk estimates, along with statutory, engineering, economic, social, and political factors, in evaluating alternative regulatory options and choosing among those options (NRC, 1983).……

  19. Risk Analysis in Action

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Risk analysis is a useful tool for making good decisions on the risks of certain potentially hazardous agents and suggests a safe margin through scientific processes using toxicological data, contaminant residue levels, statistical tools, exposure values and relevant variants. Risk managers consider scientific evidence and risk estimates, along with statutory, engineering, economic, social, and political factors, in evaluating alternative regulatory options and choosing among those options (NRC, 1983).

  20. Risk Analysis in Action

    Institute of Scientific and Technical Information of China (English)

    KYU-HAWNYANG

    2001-01-01

    Risk analysis is a useful tool for making good decisions on the risks of certain potentially hazardous agents and suggests a safe margin through scientific processes using toxicological data, contaminant residue levels, statistical tools, exposure values and relevant variants. Risk managers consider scientific evidence and risk estimates, along with statutory, engineering, economic, social, and political factors, in evaluating alternative regulatory options and choosing among those options (NRC, 1983).

  1. Economic Analysis of Ransomware

    OpenAIRE

    Hernandez-Castro, Julio; Cartwright, Edward; Stepanova, Anna

    2017-01-01

    We present in this work an economic analysis of ransomware, with relevant data from Cryptolocker, CryptoWall, TeslaCrypt and other major strands. We include a detailed study of the impact that different price discrimination strategies can have on the success of a ransomware family, examining uniform pricing, optimal price discrimination and bargaining strategies and analysing their advantages and limitations. In addition, we present results of a preliminary survey that can help in estimating...

  2. Bayesian data analysis

    CERN Document Server

    Gelman, Andrew; Stern, Hal S; Dunson, David B; Vehtari, Aki; Rubin, Donald B

    2013-01-01

    FUNDAMENTALS OF BAYESIAN INFERENCE: Probability and Inference; Single-Parameter Models; Introduction to Multiparameter Models; Asymptotics and Connections to Non-Bayesian Approaches; Hierarchical Models. FUNDAMENTALS OF BAYESIAN DATA ANALYSIS: Model Checking; Evaluating, Comparing, and Expanding Models; Modeling Accounting for Data Collection; Decision Analysis. ADVANCED COMPUTATION: Introduction to Bayesian Computation; Basics of Markov Chain Simulation; Computationally Efficient Markov Chain Simulation; Modal and Distributional Approximations. REGRESSION MODELS: Introduction to Regression Models; Hierarchical Linear

  3. Making Strategic Analysis Matter

    Science.gov (United States)

    2012-01-01

    way, so intelligence can help them ask, for instance, "Does Plan Colombia offer insights for Afghanistan?" If the analogy is to be useful, it...Intelligence Agency DNI Director of National Intelligence EADS European Aeronautic Defence and Space Company HIV human immunodeficiency virus NATO...can be a task for strategic analysis. For instance, does Plan Colombia offer any insights by analogy for counterinsurgency in Afghanistan? If

  4. Dimensional analysis for engineers

    CERN Document Server

    Simon, Volker; Gomaa, Hassan

    2017-01-01

    This monograph provides the fundamentals of dimensional analysis and illustrates the method by numerous examples for a wide spectrum of applications in engineering. The book covers thoroughly the fundamental definitions and the Buckingham theorem, as well as the choice of the system of basic units. The authors also include a presentation of model theory and similarity solutions. The target audience primarily comprises researchers and practitioners but the book may also be suitable as a textbook at university level.

  5. CLEAN: CLustering Enrichment ANalysis

    Directory of Open Access Journals (Sweden)

    Medvedovic Mario

    2009-07-01

    Background: Integration of biological knowledge encoded in various lists of functionally related genes has become one of the most important aspects of analyzing genome-wide functional genomics data. In the context of cluster analysis, the functional coherence of clusters established through such analyses has been used to identify biologically meaningful clusters, compare clustering algorithms and identify biological pathways associated with the biological process under investigation. Results: We developed a computational framework for analytically and visually integrating knowledge-based functional categories with the cluster analysis of genomics data. The framework is based on the simple, conceptually appealing, and biologically interpretable gene-specific functional coherence score (CLEAN score). The score is derived by correlating the clustering structure as a whole with functional categories of interest. We directly demonstrate that integrating biological knowledge in this way improves the reproducibility of conclusions derived from cluster analysis. The CLEAN score differentiates between the levels of functional coherence for genes within the same cluster based on their membership in enriched functional categories. We show that this aspect results in higher reproducibility across independent datasets and produces more informative genes for distinguishing different sample types than the scores based on the traditional cluster-wide analysis. We also demonstrate the utility of the CLEAN framework in comparing clusterings produced by different algorithms. CLEAN was implemented as an add-on R package and can be downloaded at http://Clusteranalysis.org. The package integrates routines for calculating gene-specific functional coherence scores and the open source interactive Java-based viewer Functional TreeView (FTreeView). Conclusion: Our results indicate that using the gene-specific functional coherence score improves the reproducibility of the
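
    The per-cluster enrichment idea that scores of this kind build on can be illustrated in a few lines. The sketch below is not the CLEAN score itself (which correlates the whole clustering structure with the categories); it only shows a hypergeometric enrichment test, and all gene, cluster and category names are invented.

```python
# Toy illustration of per-cluster functional-category enrichment via the
# hypergeometric test. This is NOT the CLEAN score itself, only the basic
# enrichment idea such scores build on; all names below are invented.
from scipy.stats import hypergeom

def cluster_enrichment_pvalue(cluster_genes, category_genes, all_genes):
    """P-value that the cluster is enriched for the functional category."""
    population = len(all_genes)                      # total genes considered
    category_size = len(category_genes & all_genes)  # category members in population
    cluster_size = len(cluster_genes)
    overlap = len(cluster_genes & category_genes)    # category members in the cluster
    # P(X >= overlap) when drawing cluster_size genes without replacement
    return hypergeom.sf(overlap - 1, population, category_size, cluster_size)

all_genes = {f"g{i}" for i in range(1000)}
cluster = {f"g{i}" for i in range(50)}                  # hypothetical cluster
category = {f"g{i}" for i in range(30)} | {"g900"}      # hypothetical GO category
print(cluster_enrichment_pvalue(cluster, category, all_genes))
```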

  6. Intelligence Analysis: Once Again

    Science.gov (United States)

    2008-02-01

    least touch on the subject of intelligence analysis. However, while still a large body of work, it is a considerably smaller set that specifically...meaning is influenced by the analyst’s mindset, mental model, or frame of mind. Kent (1949, p. 199) indicated “…an intelligence staff which must...or a top-down process are not unique to the intelligence literature. In the scientific literature, arguments date back to Descartes (1596-1650

  7. Data envelopment analysis

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    This review introduces the history and present status of data envelopment analysis (DEA) research, particularly the evaluation process, and also describes extensions of some DEA models. It is pointed out that mathematics, economics and management science are the main forces in the DEA development, optimization provides the fundamental method for DEA research, and the wide range of applications drives the rapid development of DEA.
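
    As an illustration of the optimization core mentioned above, the sketch below solves the classic input-oriented CCR envelopment model as a linear program with SciPy. It is a minimal example only: the input/output data for the four decision-making units (DMUs) are invented, and real DEA studies add slacks, returns-to-scale variants and sensitivity checks.

```python
# Minimal input-oriented CCR DEA sketch solved as a linear program with SciPy.
# One input and one output, four decision-making units (DMUs); data invented.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0, 4.0, 5.0]])   # inputs,  shape (m inputs,  n DMUs)
Y = np.array([[1.0, 2.0, 3.0, 3.5]])   # outputs, shape (s outputs, n DMUs)
m, n = X.shape
s = Y.shape[0]

def ccr_efficiency(j0):
    # decision vector z = [theta, lambda_1, ..., lambda_n]; minimize theta
    c = np.r_[1.0, np.zeros(n)]
    # inputs:  sum_j lambda_j x_ij - theta * x_i,j0 <= 0
    A_in = np.hstack([-X[:, [j0]], X])
    # outputs: -sum_j lambda_j y_rj <= -y_r,j0
    A_out = np.hstack([np.zeros((s, 1)), -Y])
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[:, j0]],
                  bounds=[(0, None)] * (n + 1),
                  method="highs")
    return res.fun          # theta* in (0, 1]; 1 means CCR-efficient

for j in range(n):
    print(f"DMU {j}: efficiency = {ccr_efficiency(j):.3f}")
```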

  8. Projectile Base Flow Analysis

    Science.gov (United States)

    2007-11-02

    Report documentation excerpt: performing organization DCW Industries, Inc., 5354 Palm Drive, La Canada, CA 91011; report number DCW-38-R-05; sponsoring/monitoring agency U.S. Army Research Office. Cited references include Turbulence Modeling for CFD, Second Edition, DCW Industries, Inc., La Cañada, CA, and Wilcox, D. C. (2001), "Projectile Base Flow Analysis," DCW

  9. Digital image analysis

    DEFF Research Database (Denmark)

    Riber-Hansen, Rikke; Vainer, Ben; Steiniche, Torben

    2012-01-01

    Digital image analysis (DIA) is increasingly implemented in histopathological research to facilitate truly quantitative measurements, decrease inter-observer variation and reduce hands-on time. Originally, efforts were made to enable DIA to reproduce manually obtained results on histological slides...... reproducibility, application of stereology-based quantitative measurements, time consumption, optimization of histological slides, regions of interest selection and recent developments in staining and imaging techniques....

  10. Bayesian Independent Component Analysis

    DEFF Research Database (Denmark)

    Winther, Ole; Petersen, Kaare Brandt

    2007-01-01

    In this paper we present an empirical Bayesian framework for independent component analysis. The framework provides estimates of the sources, the mixing matrix and the noise parameters, and is flexible with respect to choice of source prior and the number of sources and sensors. Inside the engine...... in a Matlab toolbox, is demonstrated for non-negative decompositions and compared with non-negative matrix factorization....

  11. Direct Olive Oil Analysis

    OpenAIRE

    2002-01-01

    The practical impact of “direct analysis” is undeniable, as it strongly contributes to enhancing the so-called productive analytical features such as expeditiousness, reduction of costs and minimisation of risks for the analysts and the environment. The main objective is to establish a reliable bypass to the conventional preliminary operations of the analytical process. This paper offers a systematic approach in this context and emphasises the great field of action of...

  12. Operational Modal Analysis Tutorial

    OpenAIRE

    Brincker, Rune; Andersen, Palle

    2007-01-01

    In this paper the basic principles in operational modal testing and analysis are presented and discussed. A brief review of the techniques for operational modal testing and identification is presented, and it is argued that there is now a wide range of techniques for effective identification of modal parameters of practical interest - including the mode shape scaling factor - with a high degree of accuracy. It is also argued that the operational technology offers the user a number of advanta...

  13. SAMPLING AND ANALYSIS PROTOCOLS

    Energy Technology Data Exchange (ETDEWEB)

    Jannik, T; P Fledderman, P

    2007-02-09

    Radiological sampling and analyses are performed to collect data for a variety of specific reasons covering a wide range of projects. These activities include: Effluent monitoring; Environmental surveillance; Emergency response; Routine ambient monitoring; Background assessments; Nuclear license termination; Remediation; Deactivation and decommissioning (D&D); and Waste management. In this chapter, effluent monitoring and environmental surveillance programs at nuclear operating facilities and radiological sampling and analysis plans for remediation and D&D activities will be discussed.

  14. Techno-Economic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Salvesen, F.; Sandgren, J. [KanEnergi AS, Rud (Norway)

    1997-12-31

    The present energy situation in the target area is summarized: 20 million inhabitants without electricity in north-west Russia, 50% of the people in the Baltics without electricity, very high technical skills, and financing as the biggest problem. The energy situation, the advantages of the renewables, the restrictions, and examples of possible technical solutions are reviewed on the basis of a short analysis and of experience with the Baltics and Russia.

  15. ATLAS reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bartsch, R.R.

    1995-09-01

    Key elements of the 36 MJ ATLAS capacitor bank have been evaluated for individual probabilities of failure. These have been combined to estimate system reliability which is to be greater than 95% on each experimental shot. This analysis utilizes Weibull or Weibull-like distributions with increasing probability of failure with the number of shots. For transmission line insulation, a minimum thickness is obtained and for the railgaps, a method for obtaining a maintenance interval from forthcoming life tests is suggested.
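
    A minimal sketch of this kind of calculation, assuming independent components in series and two-parameter Weibull life models in the number of shots; the component names and parameters below are invented for illustration and are not the ATLAS values.

```python
# Series-system reliability from per-component two-parameter Weibull models,
# with the shot count as the life variable. Parameters below are invented.
import numpy as np

def weibull_survival(shots, eta, beta):
    """Probability that a component survives the given number of shots."""
    return np.exp(-(shots / eta) ** beta)

components = {     # name: (characteristic life eta [shots], shape beta > 1)
    "capacitor_module": (5000.0, 1.8),
    "railgap_switch":   (2000.0, 2.5),
    "line_insulation":  (8000.0, 3.0),
}

shots = 100
reliabilities = {name: weibull_survival(shots, eta, beta)
                 for name, (eta, beta) in components.items()}
system = float(np.prod(list(reliabilities.values())))  # independent, in series
print(reliabilities)
print(f"system reliability after {shots} shots: {system:.4f}")
```

    A shape parameter beta greater than 1 gives an increasing failure probability with the number of shots, matching the Weibull-like behaviour described above.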

  16. Multivariate Quantitative Chemical Analysis

    Science.gov (United States)

    Kinchen, David G.; Capezza, Mary

    1995-01-01

    Technique of multivariate quantitative chemical analysis devised for use in determining relative proportions of two components mixed and sprayed together onto object to form thermally insulating foam. Potentially adaptable to other materials, especially in process-monitoring applications in which necessary to know and control critical properties of products via quantitative chemical analyses of products. In addition to chemical composition, also used to determine such physical properties as densities and strengths.

  17. Performance Analysis of MYSEA

    Science.gov (United States)

    2012-09-01

    Services FSD Federated Services Daemon I&A Identification and Authentication IKE Internet Key Exchange KPI Key Performance Indicator LAN Local Area...inspection takes place in different processes in the server architecture. Key Performance Indicators (KPIs) associated with the system need to be...application and risk analysis of security controls. Thus, measurement of the KPIs is needed before an informed tradeoff between the performance penalties

  18. Paternity analysis in Excel.

    Science.gov (United States)

    Rocheta, Margarida; Dionísio, F Miguel; Fonseca, Luís; Pires, Ana M

    2007-12-01

    Paternity analysis using microsatellite information is a well-studied subject. These markers are ideal for parentage studies and fingerprinting, due to their high-discrimination power. This type of data is used to assign paternity, to compute the average selfing and outcrossing rates and to estimate the biparental inbreeding. There are several public domain programs that compute all this information from data. Most of the time, it is necessary to export data to some sort of format, feed it to the program and import the output to an Excel book for further processing. In this article we briefly describe a program referred from now on as Paternity Analysis in Excel (PAE), developed at IST and IBET (see the acknowledgments) that computes paternity candidates from data, and other information, from within Excel. In practice this means that the end user provides the data in an Excel sheet and, by pressing an appropriate button, obtains the results in another Excel sheet. For convenience PAE is divided into two modules. The first one is a filtering module that selects data from the sequencer and reorganizes it in a format appropriate to process paternity analysis, assuming certain conventions for the names of parents and offspring from the sequencer. The second module carries out the paternity analysis assuming that one parent is known. Both modules are written in Excel-VBA and can be obtained at the address (www.math.ist.utl.pt/~fmd/pa/pa.zip). They are free for non-commercial purposes and have been tested with different data and against different software (Cervus, FaMoz, and MLTR).
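
    The Mendelian-exclusion step behind this kind of analysis is easy to sketch. The code below is not PAE itself (which adds sequencer-format filtering and further statistics); it only checks, locus by locus, whether a candidate father could have supplied the non-maternal allele, ignoring mutations and null alleles. All genotypes are invented.

```python
# Mendelian-exclusion sketch: a candidate father is compatible at a locus if
# the offspring's alleles can be split into one maternal and one paternal
# allele. Mutations and null alleles are ignored; genotypes are invented.
def compatible_at_locus(offspring, mother, candidate):
    o1, o2 = offspring
    return ((o1 in mother and o2 in candidate) or
            (o2 in mother and o1 in candidate))

def possible_father(offspring, mother, candidate):
    """All genotypes are dicts: locus -> (allele, allele)."""
    return all(compatible_at_locus(offspring[loc], mother[loc], candidate[loc])
               for loc in offspring)

offspring = {"L1": (152, 160), "L2": (201, 205)}
mother    = {"L1": (152, 156), "L2": (201, 201)}
cand_a    = {"L1": (160, 164), "L2": (205, 209)}   # compatible at both loci
cand_b    = {"L1": (148, 156), "L2": (205, 209)}   # excluded at locus L1

for name, cand in [("candidate A", cand_a), ("candidate B", cand_b)]:
    print(name, "possible father:", possible_father(offspring, mother, cand))
```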

  19. Biomedical signal analysis

    CERN Document Server

    Rangayyan, Rangaraj M

    2015-01-01

    The book will assist a reader in the development of techniques for analysis of biomedical signals and computer-aided diagnosis, with a pedagogical examination of basic and advanced topics accompanied by over 350 figures and illustrations. Wide range of filtering techniques presented to address various applications. 800 mathematical expressions and equations. Practical questions, problems and laboratory exercises. Includes fractals and chaos theory with biomedical applications.

  20. [Mindfulness: A Concept Analysis].

    Science.gov (United States)

    Chen, Tsai-Ling; Chou, Fan-Hao; Wang, Hsiu-Hung

    2016-04-01

    "Mindfulness" is an emerging concept in the field of healthcare. Ranging from stress relief to psychotherapy, mindfulness has been confirmed to be an effective tool to help individuals manage depression, anxiety, obsessive-compulsive disorder, and other health problems in clinical settings. Scholars currently use various definitions for mindfulness. While some of these definitions overlap, significant differences remain and a general scholarly consensus has yet to be reached. Several domestic and international studies have explored mindfulness-related interventions and their effectiveness. However, the majority of these studies have focused on the fields of clinical medicine, consultation, and education. Mindfulness has rarely been applied in clinical nursing practice and no related systematic concept analysis has been conducted. This paper conducts a concept analysis of mindfulness using the concept analysis method proposed by Walker and Avant (2011). We describe the defining characteristics of mindfulness, clarify the concept, and confirm the predisposing factors and effects of mindfulness using examples of typical cases, borderline cases, related cases, and contrary case. Findings may provide nursing staff with an understanding of the concept of mindfulness for use in clinical practice in order to help patients achieve a comfortable state of body and mind healing.

  1. Regional Shelter Analysis Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Dillon, Michael B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dennison, Deborah [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kane, Jave [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Walker, Hoyt [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Miller, Paul [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-08-01

    The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
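
    The core bookkeeping of combining building protection with population distribution can be sketched in a few lines: the population-averaged dose is the population-weighted sum of the sheltered doses. The protection factors and population fractions below are purely illustrative, not values from the report.

```python
# Population-weighted protection bookkeeping; all numbers are illustrative.
unsheltered_dose = 1.0   # normalized outdoor fallout dose

# location: (fraction of population, protection factor; dose is divided by PF)
shelter_distribution = {
    "wood_frame_house":   (0.45, 3.0),
    "apartment_mid_rise": (0.30, 10.0),
    "large_office_core":  (0.15, 40.0),
    "outdoors":           (0.10, 1.0),
}

avg_dose = sum(frac * unsheltered_dose / pf
               for frac, pf in shelter_distribution.values())
print(f"population-averaged dose fraction: {avg_dose:.3f}")
print(f"effective regional protection factor: {1.0 / avg_dose:.1f}")
```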

  2. Numerical analysis using Sage

    CERN Document Server

    Anastassiou, George A

    2015-01-01

    This is the first numerical analysis text to use Sage for the implementation of algorithms and can be used in a one-semester course for undergraduates in mathematics, math education, computer science/information technology, engineering, and physical sciences. The primary aim of this text is to simplify understanding of the theories and ideas from a numerical analysis/numerical methods course via a modern programming language like Sage. Aside from the presentation of fundamental theoretical notions of numerical analysis throughout the text, each chapter concludes with several exercises that are oriented to real-world application.  Answers may be verified using Sage.  The presented code, written in core components of Sage, are backward compatible, i.e., easily applicable to other software systems such as Mathematica®.  Sage is  open source software and uses Python-like syntax. Previous Python programming experience is not a requirement for the reader, though familiarity with any programming language is a p...
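
    As a flavor of the kind of algorithm such a course implements, here is a minimal Newton's method iteration. It is written in plain Python, which (because Sage uses Python-like syntax) also runs essentially unchanged in a Sage session; it is not taken from the book.

```python
# Newton's method for f(x) = 0; plain Python, also valid in a Sage session.
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton iteration did not converge")

# Example: the positive root of f(x) = x^2 - 2, i.e. sqrt(2).
print(newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, x0=1.0))
```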

  3. Medical Image Analysis Facility

    Science.gov (United States)

    1978-01-01

    To improve the quality of photos sent to Earth by unmanned spacecraft, NASA's Jet Propulsion Laboratory (JPL) developed a computerized image enhancement process that brings out detail not visible in the basic photo. JPL is now applying this technology to biomedical research in its Medical Image Analysis Facility, which employs computer enhancement techniques to analyze x-ray films of internal organs, such as the heart and lung. A major objective is study of the effects of stress on persons with heart disease. In animal tests, computerized image processing is being used to study coronary artery lesions and the degree to which they reduce arterial blood flow when stress is applied. The photos illustrate the enhancement process. The upper picture is an x-ray photo in which the artery (dotted line) is barely discernible; in the post-enhancement photo at right, the whole artery and the lesions along its wall are clearly visible. The Medical Image Analysis Facility offers a faster means of studying the effects of complex coronary lesions in humans, and the research now being conducted on animals is expected to have important application to diagnosis and treatment of human coronary disease. Other uses of the facility's image processing capability include analysis of muscle biopsy and pap smear specimens, and study of the microscopic structure of fibroprotein in the human lung. Working with JPL on experiments are NASA's Ames Research Center, the University of Southern California School of Medicine, and Rancho Los Amigos Hospital, Downey, California.

  4. Laminar Flow Analysis

    Science.gov (United States)

    Rogers, David F.

    1992-10-01

    The major thrust of this book is to present a technique of analysis that aids the formulation, understanding, and solution of problems of viscous flow. The intent is to avoid providing a "canned" program to solve a problem, offering instead a way to recognize the underlying physical, mathematical, and modeling concepts inherent in the solutions. The reader must first choose a mathematical model and derive governing equations based on realistic assumptions, or become aware of the limitations and assumptions associated with existing models. An appropriate solution technique is then selected. The solution technique may be either analytical or numerical. Computer-aided analysis algorithms supplement the classical analyses. The book begins by deriving the Navier-Stokes equation for a viscous compressible variable property fluid. The second chapter considers exact solutions of the incompressible hydrodynamic boundary layer equations solved with and without mass transfer at the wall. Forced convection, free convection, and the compressible laminar boundary layer are discussed in the remaining chapters. The text unifies the various topics by tracing a logical progression from simple to complex governing differential equations and boundary conditions. Numerical, parametric, and directed analysis problems are included at the end of each chapter.

  5. Interstage Flammability Analysis Approach

    Science.gov (United States)

    Little, Jeffrey K.; Eppard, William M.

    2011-01-01

    The Interstage of the Ares I launch platform houses several key components which are on standby during First Stage operation: the Reaction Control System (ReCS), the Upper Stage (US) Thrust Vector Control (TVC) and the J-2X with the Main Propulsion System (MPS) propellant feed system. Therefore potentially dangerous leaks of propellants could develop. The Interstage leaks analysis addresses the concerns of localized mixing of hydrogen and oxygen gases to produce deflagration zones in the Interstage of the Ares I launch vehicle during First Stage operation. This report details the approach taken to accomplish the analysis. Specified leakage profiles and actual flammability results are not presented due to proprietary and security restrictions. The interior volume formed by the Interstage walls, bounding interfaces with the Upper and First Stages, and surrounding the J2-X engine was modeled using Loci-CHEM to assess the potential for flammable gas mixtures to develop during First Stage operations. The transient analysis included a derived flammability indicator based on mixture ratios to maintain achievable simulation times. Validation of results was based on a comparison to Interstage pressure profiles outlined in prior NASA studies. The approach proved useful in the bounding of flammability risk in supporting program hazard reviews.

  6. Uncertainty analysis guide

    Energy Technology Data Exchange (ETDEWEB)

    Andres, T.H

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)
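
    One of the propagation options named above, Monte Carlo sampling of input uncertainties through a model, can be sketched briefly. The model (a heating power balance) and the input distributions below are invented for illustration and are not taken from the guide.

```python
# Monte Carlo propagation of random input uncertainties through a toy model
# (heating power P = m_dot * cp * dT); inputs and uncertainties are invented.
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

m_dot = rng.normal(2.0, 0.05, N)       # mass flow, kg/s
cp = rng.normal(4180.0, 20.0, N)       # specific heat, J/(kg K)
dT = rng.normal(30.0, 1.0, N)          # temperature rise, K

power = m_dot * cp * dT                # model output, W

print(f"mean = {power.mean():.0f} W, "
      f"standard uncertainty = {power.std(ddof=1):.0f} W")
print("95% interval [W]:", np.percentile(power, [2.5, 97.5]).round(0))
```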

  7. FUTURE CLIMATE ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    R.M. Forester

    2000-03-14

    This Analysis/Model Report (AMR) documents an analysis that was performed to estimate climatic variables for the next 10,000 years by forecasting the timing and nature of climate change at Yucca Mountain (YM), Nevada (Figure 1), the site of a potential repository for high-level radioactive waste. The future-climate estimates are based on an analysis of past-climate data from analog meteorological stations, and this AMR provides the rationale for the selection of these analog stations. The stations selected provide an upper and a lower climate bound for each future climate, and the data from those sites will provide input to the infiltration model (USGS 2000) and for the total system performance assessment for the Site Recommendation (TSPA-SR) at YM. Forecasting long-term future climates, especially for the next 10,000 years, is highly speculative and rarely attempted. A very limited literature exists concerning the subject, largely from the British radioactive waste disposal effort. The discussion presented here is one method, among many, of establishing upper and lower bounds for future climate estimates. The method used here involves selecting a particular past climate from many past climates, as an analog for future climate. Other studies might develop a different rationale or select other past climates resulting in a different future climate analog.

  8. Nominal analysis of "variance".

    Science.gov (United States)

    Weiss, David J

    2009-08-01

    Nominal responses are the natural way for people to report actions or opinions. Because nominal responses do not generate numerical data, they have been underutilized in behavioral research. On those occasions in which nominal responses are elicited, the responses are customarily aggregated over people or trials so that large-sample statistics can be employed. A new analysis is proposed that directly associates differences among responses with particular sources in factorial designs. A pair of nominal responses either matches or does not; when responses do not match, they vary. That analogue to variance is incorporated in the nominal analysis of "variance" (NANOVA) procedure, wherein the proportions of matches associated with sources play the same role as do sums of squares in an ANOVA. The NANOVA table is structured like an ANOVA table. The significance levels of the N ratios formed by comparing proportions are determined by resampling. Fictitious behavioral examples featuring independent groups and repeated measures designs are presented. A Windows program for the analysis is available.
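
    The match/mismatch bookkeeping and the resampling step can be illustrated with a toy two-group example. The sketch below is not the published NANOVA table; it only computes an average within-group match proportion and compares it with a permutation reference distribution, using invented responses.

```python
# Toy match-proportion statistic for nominal responses in two independent
# groups, with significance from a permutation (resampling) test.
import itertools
import random

def match_proportion(responses):
    pairs = list(itertools.combinations(responses, 2))
    return sum(a == b for a, b in pairs) / len(pairs)

def mean_within_group_matches(groups):
    return sum(match_proportion(g) for g in groups) / len(groups)

group_a = ["yes", "yes", "no", "yes", "yes", "maybe"]   # invented responses
group_b = ["no", "no", "no", "maybe", "no", "yes"]

observed = mean_within_group_matches([group_a, group_b])

pooled = group_a + group_b
random.seed(1)
n_perm, exceed = 10_000, 0
for _ in range(n_perm):
    random.shuffle(pooled)
    permuted = [pooled[:len(group_a)], pooled[len(group_a):]]
    if mean_within_group_matches(permuted) >= observed:
        exceed += 1

print(f"observed within-group match proportion: {observed:.3f}")
print(f"permutation p-value: {exceed / n_perm:.4f}")
```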

  9. Political analysis using R

    CERN Document Server

    Monogan III, James E

    2015-01-01

    Political Analysis Using R can serve as a textbook for undergraduate or graduate students as well as a manual for independent researchers. It is unique among competitor books in its usage of 21 example datasets that are all drawn from political research. All of the data and example code is available from the Springer website, as well as from Dataverse (http://dx.doi.org/10.7910/DVN/ARKOTI). The book provides a narrative of how R can be useful for addressing problems common to the analysis of public administration, public policy, and political science data specifically, in addition to the social sciences more broadly. While the book uses data drawn from political science, public administration, and policy analyses, it is written so that students and researchers in other fields should find it accessible and useful as well. Political Analysis Using R is perfect for the first-time R user who has no prior knowledge about the program. By working through the first seven chapters of this book, an entry-level user sho...

  10. Meta-analysis with R

    CERN Document Server

    Schwarzer, Guido; Rücker, Gerta

    2015-01-01

    This book provides a comprehensive introduction to performing meta-analysis using the statistical software R. It is intended for quantitative researchers and students in the medical and social sciences who wish to learn how to perform meta-analysis with R. As such, the book introduces the key concepts and models used in meta-analysis. It also includes chapters on the following advanced topics: publication bias and small study effects; missing data; multivariate meta-analysis; network meta-analysis; and meta-analysis of diagnostic studies.
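
    The simplest model such a book starts from, a fixed-effect inverse-variance weighted meta-analysis, reduces to a few lines. The book itself works in R; the standalone sketch below uses plain Python purely for illustration, with made-up study estimates and standard errors.

```python
# Fixed-effect, inverse-variance weighted meta-analysis; study data invented.
import math

estimates = [0.30, 0.12, 0.45, 0.25]    # per-study effect estimates
std_errors = [0.15, 0.10, 0.20, 0.12]   # per-study standard errors

weights = [1.0 / se ** 2 for se in std_errors]
pooled = sum(w * y for w, y in zip(weights, estimates)) / sum(weights)
se_pooled = math.sqrt(1.0 / sum(weights))

print(f"pooled effect = {pooled:.3f}, 95% CI = "
      f"[{pooled - 1.96 * se_pooled:.3f}, {pooled + 1.96 * se_pooled:.3f}]")
```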

  11. Simplified seismic risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pellissetti, Manuel; Klapp, Ulrich [AREVA NP GmbH, Erlangen (Germany)

    2011-07-01

    Within the context of probabilistic safety analysis (PSA) for nuclear power plants (NPP's), seismic risk assessment has the purpose of demonstrating that the contribution of seismic events to overall risk is not excessive. The most suitable vehicle for seismic risk assessment is a full scope seismic PSA (SPSA), in which the frequency of core damage due to seismic events is estimated. An alternative method is represented by seismic margin assessment (SMA), which aims at showing sufficient margin between the site-specific safe shutdown earthquake (SSE) and the actual capacity of the plant. Both methods are based on system analysis (fault-trees and event-trees) and hence require fragility estimates for safety relevant systems, structures and components (SSC's). If the seismic conditions at a specific site of a plant are not very demanding, then it is reasonable to expect that the risk due to seismic events is low. In such cases, the cost-benefit ratio for performing a full scale, site-specific SPSA or SMA will be excessive, considering the ultimate objective of seismic risk analysis. Rather, it will be more rational to rely on a less comprehensive analysis, used as a basis for demonstrating that the risk due to seismic events is not excessive. The present paper addresses such a simplified approach to seismic risk assessment, which is used in AREVA to: estimate seismic risk in early design stages; identify needs to extend the design basis; and define a reasonable level of seismic risk analysis. Starting from a conservative estimate of the overall plant capacity, in terms of the HCLPF (High Confidence of Low Probability of Failure), and utilizing a generic value for the variability, the seismic risk is estimated by convolution of the hazard and the fragility curve. Critical importance is attached to the selection of the plant capacity in terms of the HCLPF, without performing extensive fragility calculations of seismically relevant SSC's. A suitable basis
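
    The convolution of hazard and fragility mentioned above can be sketched numerically. The code below is heavily simplified and every number in it (HCLPF, composite variability, hazard curve) is invented; it uses the common approximation that the HCLPF corresponds to roughly the 1% failure-probability point of a lognormal fragility curve, i.e. HCLPF ≈ Am·exp(−2.326·βc).

```python
# Simplified hazard-fragility convolution; every number here is invented.
import numpy as np
from scipy.stats import norm

hclpf = 0.30      # g, assumed High Confidence of Low Probability of Failure
beta_c = 0.40     # assumed composite logarithmic standard deviation
Am = hclpf * np.exp(2.326 * beta_c)   # median capacity from HCLPF approximation

def fragility(a):                     # conditional probability of failure at "a"
    return norm.cdf(np.log(a / Am) / beta_c)

def hazard(a):                        # toy annual frequency of exceeding "a"
    return 1e-4 * (0.3 / a) ** 2.5

a = np.linspace(0.05, 3.0, 2000)
dH_da = np.gradient(hazard(a), a)     # slope of the hazard curve (negative)
integrand = fragility(a) * (-dH_da)
annual_pf = float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(a)))

print(f"median capacity Am = {Am:.2f} g")
print(f"estimated annual failure frequency = {annual_pf:.2e}")
```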

  12. Parametric Timing Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Vivancos, E; Healy, C; Mueller, F; Whalley, D

    2001-05-09

    Embedded systems often have real-time constraints. Traditional timing analysis statically determines the maximum execution time of a task or a program in a real-time system. These systems typically depend on the worst-case execution time of tasks in order to make static scheduling decisions so that tasks can meet their deadlines. Static determination of worst-case execution times imposes numerous restrictions on real-time programs, which include that the maximum number of iterations of each loop must be known statically. These restrictions can significantly limit the class of programs that would be suitable for a real-time embedded system. This paper describes work-in-progress that uses static timing analysis to aid in making dynamic scheduling decisions. For instance, different algorithms with varying levels of accuracy may be selected based on the algorithm's predicted worst-case execution time and the time allotted for the task. We represent the worst-case execution time of a function or a loop as a formula, where the unknown values affecting the execution time are parameterized. This parametric timing analysis produces formulas that can then be quickly evaluated at run-time so dynamic scheduling decisions can be made with little overhead. Benefits of this work include expanding the class of applications that can be used in a real-time system, improving the accuracy of dynamic scheduling decisions, and more effective utilization of system resources.
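
    The idea of a parametric WCET formula evaluated at run time can be illustrated with a toy example: a loop whose worst-case cost is a fixed setup term plus a per-iteration term in the unknown iteration count n. The cycle constants and deadline below are invented.

```python
# Parametric WCET formula evaluated at run time; cycle constants are invented.
def wcet_cycles(n, c_setup=120, c_body=85):
    """Worst-case cycles for a loop with n iterations: setup + n * body."""
    return c_setup + c_body * n

DEADLINE_CYCLES = 10_000
for n in (10, 50, 120):
    estimate = wcet_cycles(n)
    decision = ("schedule task" if estimate <= DEADLINE_CYCLES
                else "fall back to a cheaper algorithm")
    print(f"n={n:4d}: WCET estimate {estimate:6d} cycles -> {decision}")
```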

  13. Future Climate Analysis

    Energy Technology Data Exchange (ETDEWEB)

    C. G. Cambell

    2004-09-03

    This report documents an analysis that was performed to estimate climatic variables for the next 10,000 years by forecasting the timing and nature of climate change at Yucca Mountain, Nevada, the site of a repository for spent nuclear fuel and high-level radioactive waste. The future-climate estimates are based on an analysis of past-climate data from analog meteorological stations, and this report provides the rationale for the selection of these analog stations. The stations selected provide an upper and a lower climate bound for each future climate, and the data from those sites will provide input to the following reports: "Simulation of Net Infiltration for Present-Day and Potential Future Climates" (BSC 2004 [DIRS 170007]), "Total System Performance Assessment (TSPA) Model/Analysis for the License Application" (BSC 2004 [DIRS 168504]), "Features, Events, and Processes in UZ Flow and Transport" (BSC 2004 [DIRS 170012]), and "Features, Events, and Processes in SZ Flow and Transport" (BSC 2004 [DIRS 170013]). Forecasting long-term future climates, especially for the next 10,000 years, is highly speculative and rarely attempted. A very limited literature exists concerning the subject, largely from the British radioactive waste disposal effort. The discussion presented here is one available forecasting method for establishing upper and lower bounds for future climate estimates. The selection of different methods is directly dependent on the available evidence used to build a forecasting argument. The method used here involves selecting a particular past climate from many past climates, as an analog for future climate. While alternative analyses are possible for the case presented for Yucca Mountain, the evidence (data) used would be the same and the conclusions would not be expected to drastically change. Other studies might develop a different rationale or select other past

  14. Text analysis methods, text analysis apparatuses, and articles of manufacture

    Science.gov (United States)

    Whitney, Paul D; Willse, Alan R; Lopresti, Charles A; White, Amanda M

    2014-10-28

    Text analysis methods, text analysis apparatuses, and articles of manufacture are described according to some aspects. In one aspect, a text analysis method includes accessing information indicative of data content of a collection of text comprising a plurality of different topics, using a computing device, analyzing the information indicative of the data content, and using results of the analysis, identifying a presence of a new topic in the collection of text.

  15. Root Cause Analysis - A Diagnostic Failure Analysis Technique for Managers

    Science.gov (United States)

    1975-03-26

    TECHNICAL REPORT RF-75-2: ROOT CAUSE ANALYSIS - A DIAGNOSTIC FAILURE ANALYSIS TECHNIQUE FOR MANAGERS. Augustine E. Magistro, Nuclear...through 1975. Augustine E. Magistro has participated in root cause analysis task teams, including as team member and Blue Ribbon panel reviewer, team

  16. CADAT integrated circuit mask analysis

    Science.gov (United States)

    1981-01-01

    CADAT System Mask Analysis Program (MAPS2) is automated software tool for analyzing integrated-circuit mask design. Included in MAPS2 functions are artwork verification, device identification, nodal analysis, capacitance calculation, and logic equation generation.

  17. Logical analysis of biological systems

    DEFF Research Database (Denmark)

    Mardare, Radu Iulian

    2005-01-01

    R. Mardare, Logical analysis of biological systems. Fundamenta Informaticae, N 64:271-285, 2005.

  18. Resonance frequency analysis

    Directory of Open Access Journals (Sweden)

    Rajiv K Gupta

    2011-01-01

    Initial stability at placement and the development of osseointegration are two major issues for implant survival. Implant stability is a mechanical phenomenon which is related to the local bone quality and quantity, the type of implant, and the placement technique used. The application of a simple, clinically applicable, non-invasive test to assess implant stability and osseointegration is considered highly desirable. Resonance frequency analysis (RFA) is one such technique and is among the most frequently used nowadays. The aim of this paper was to review and critically analyze the currently available literature in the field of RFA, and to discuss, based on scientific evidence, the prognostic value of RFA to detect implants at risk of failure. A search was made using the PubMed database to find all the literature published on "resonance frequency analysis for implant stability" to date. Articles reporting in vivo or in vitro studies that compared RFA with other methods of implant stability measurement, and articles discussing its reliability, were thoroughly reviewed and discussed. A limited number of clinical reports were found. Various studies have demonstrated the feasibility and predictability of the technique. However, most of these articles are based on retrospective data or uncontrolled cases. Randomized, prospective, parallel-armed longitudinal human trials are based on short-term results, and long-term follow-up data are still scarce in this field. Nonetheless, from the available literature, it may be concluded that the RFA technique evaluates implant stability as a function of the stiffness of the implant-bone interface and is influenced by factors such as bone type and exposed implant height above the alveolar crest. Resonance frequency analysis could serve as a non-invasive diagnostic tool for detecting the implant stability of dental implants during the healing stages and in subsequent routine follow-up care after treatment. Future studies, preferably randomized

  19. Meta-analysis with R

    OpenAIRE

    Schwarzer, G; Carpenter, JR; Rucker, G

    2015-01-01

    This book provides a comprehensive introduction to performing meta-analysis using the statistical software R. It is intended for quantitative researchers and students in the medical and social sciences who wish to learn how to perform meta-analysis with R. As such, the book introduces the key concepts and models used in meta-analysis. It also includes chapters on the following advanced topics: publication bias and small study effects; missing data; multivariate meta-analysis, network meta-ana...

  20. Masonry macro-block analysis

    OpenAIRE

    Mendes, N

    2015-01-01

    The structural analysis involves the definition of the model and the selection of the analysis type. The model should represent the stiffness, the mass and the loads of the structure. Structures can be represented using simplified models, such as lumped mass models, or advanced models resorting to the Finite Element Method (FEM) and the Discrete Element Method (DEM). Depending on the characteristics of the structure, different types of analysis can be used, such as limit analysis, linear and non...

  1. Strategic Analysis for Patch Ltd.

    OpenAIRE

    Louis, Owen

    2012-01-01

    This paper is a strategic analysis for the start-up Patch Ltd. Patch has developed innovative products for growing produce in homes and will compete in the consumer container-growing industry. The industry and the company are introduced along with urban agriculture trends. The industry is analysed using Porter’s 5 forces analysis, and a competitive analysis compares Patch to its competitors in key success factors found in the 5 forces analysis. A strategy is developed using opportunities and t...

  2. From Critical Discourse Analysis to Positive Discourse Analysis

    Institute of Scientific and Technical Information of China (English)

    黄茜然

    2014-01-01

    Different scholars have different views of the definition of Discourse Analysis; it is a research method that can be used by scholars with a variety of academic and non-academic affiliations, coming from a variety of disciplines, to answer a variety of questions. Critical Discourse Analysis is a branch of Discourse Analysis; this paper introduces its development, guiding theory and approach. As Positive Discourse Analysis is an extension of Critical Discourse Analysis, this paper also introduces its production and main theories. Finally, a comparison is made between them.

  3. Wavelet analysis in neurodynamics

    Science.gov (United States)

    Pavlov, Aleksei N.; Hramov, Aleksandr E.; Koronovskii, Aleksei A.; Sitnikova, Evgenija Yu; Makarov, Valeri A.; Ovchinnikov, Alexey A.

    2012-09-01

    Results obtained using continuous and discrete wavelet transforms as applied to problems in neurodynamics are reviewed, with the emphasis on the potential of wavelet analysis for decoding signal information from neural systems and networks. The following areas of application are considered: (1) the microscopic dynamics of single cells and intracellular processes, (2) sensory data processing, (3) the group dynamics of neuronal ensembles, and (4) the macrodynamics of rhythmical brain activity (using multichannel EEG recordings). The detection and classification of various oscillatory patterns of brain electrical activity and the development of continuous wavelet-based brain activity monitoring systems are also discussed as possibilities.
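
    A minimal continuous wavelet transform of the kind used to track time-varying rhythms in EEG-like signals can be written directly with NumPy. The sketch below uses a complex Morlet wavelet without normalization refinements; the sampling rate, scales and test signal are illustrative only.

```python
# Minimal complex-Morlet continuous wavelet transform with NumPy.
import numpy as np

def morlet(t, scale, w0=6.0):
    x = t / scale
    return np.exp(1j * w0 * x) * np.exp(-0.5 * x ** 2) / np.sqrt(scale)

def cwt(signal, scales, dt):
    n = len(signal)
    t = (np.arange(n) - n // 2) * dt
    coeffs = np.empty((len(scales), n), dtype=complex)
    for i, s in enumerate(scales):
        coeffs[i] = np.convolve(signal, morlet(t, s), mode="same") * dt
    return coeffs

fs = 250.0                                   # sampling rate, Hz
t = np.arange(0, 4, 1 / fs)
sig = np.where(t < 2, np.sin(2 * np.pi * 8 * t),    # 8 Hz rhythm ...
               np.sin(2 * np.pi * 20 * t))          # ... switching to 20 Hz

scales = np.geomspace(0.01, 0.2, 40)         # roughly 5-100 Hz for w0 = 6
power = np.abs(cwt(sig, scales, 1 / fs)) ** 2
print(power.shape)                           # (scales, time) scalogram
```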

  4. Network performance analysis

    CERN Document Server

    Bonald, Thomas

    2013-01-01

    The book presents some key mathematical tools for the performance analysis of communication networks and computer systems.Communication networks and computer systems have become extremely complex. The statistical resource sharing induced by the random behavior of users and the underlying protocols and algorithms may affect Quality of Service.This book introduces the main results of queuing theory that are useful for analyzing the performance of these systems. These mathematical tools are key to the development of robust dimensioning rules and engineering methods. A number of examples i
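
    The most basic result such a book starts from, the M/M/1 queue analyzed via Little's law, fits in a few lines; the arrival and service rates below are illustrative only.

```python
# M/M/1 queue metrics via Little's law; rates below are illustrative.
def mm1_metrics(arrival_rate, service_rate):
    rho = arrival_rate / service_rate            # utilization, must be < 1
    if rho >= 1:
        raise ValueError("unstable queue: utilization >= 1")
    mean_in_system = rho / (1 - rho)             # mean number in system, L
    mean_delay = mean_in_system / arrival_rate   # Little's law: W = L / lambda
    return rho, mean_in_system, mean_delay

rho, L, W = mm1_metrics(arrival_rate=80.0, service_rate=100.0)   # packets/s
print(f"utilization = {rho:.2f}, mean packets in system = {L:.2f}, "
      f"mean delay = {W * 1000:.1f} ms")
```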

  5. Real analysis and foundations

    CERN Document Server

    Krantz, Steven G

    2013-01-01

    Praise for the Second Edition:""The book is recommended as a source for middle-level mathematical courses. It can be used not only in mathematical departments, but also by physicists, engineers, economists, and other experts in applied sciences who want to understand the main ideas of analysis in order to use them to study mathematical models of different type processes.""-Zentralblatt MATH ""The book contains many well-chosen examples and each of the fifteen chapters is followed by almost 500 exercises. … Illustrative pictures are instructive and the design of the book makes reading it a real

  6. Practical multivariate analysis

    CERN Document Server

    Afifi, Abdelmonem; Clark, Virginia A

    2011-01-01

    ""First of all, it is very easy to read. … The authors manage to introduce and (at least partially) explain even quite complex concepts, e.g. eigenvalues, in an easy and pedagogical way that I suppose is attractive to readers without deeper statistical knowledge. The text is also sprinkled with references for those who want to probe deeper into a certain topic. Secondly, I personally find the book's emphasis on practical data handling very appealing. … Thirdly, the book gives very nice coverage of regression analysis. … this is a nicely written book that gives a good overview of a large number

  7. Analysis of Ergot Alkaloids

    Directory of Open Access Journals (Sweden)

    Colin Crews

    2015-06-01

    The principles and application of established and newer methods for the quantitative and semi-quantitative determination of ergot alkaloids in food, feed, plant materials and animal tissues are reviewed. The techniques of sampling, extraction, clean-up, detection, quantification and validation are described. The major procedures for ergot alkaloid analysis comprise liquid chromatography with tandem mass spectrometry (LC-MS/MS) and liquid chromatography with fluorescence detection (LC-FLD). Other methods based on immunoassays are under development and variations of these and minor techniques are available for specific purposes.

  8. Analysis of synchronous machines

    CERN Document Server

    Lipo, TA

    2012-01-01

    Analysis of Synchronous Machines, Second Edition is a thoroughly modern treatment of an old subject. Courses generally teach about synchronous machines by introducing the steady-state per phase equivalent circuit without a clear, thorough presentation of the source of this circuit representation, which is a crucial aspect. Taking a different approach, this book provides a deeper understanding of complex electromechanical drives. Focusing on the terminal rather than on the internal characteristics of machines, the book begins with the general concept of winding functions, describing the placeme

  9. Analysis, manifolds and physics

    CERN Document Server

    Choquet-Bruhat, Y

    2000-01-01

    Twelve problems have been added to the first edition; four of them are supplements to problems in the first edition. The others deal with issues that have become important, since the first edition of Volume II, in recent developments of various areas of physics. All the problems have their foundations in volume 1 of the 2-Volume set Analysis, Manifolds and Physics. It would have been prohibitively expensive to insert the new problems at their respective places. They are grouped together at the end of this volume; their logical place is indicated by a number in parentheses following the title.

  10. Dimensional analysis made simple

    Science.gov (United States)

    Lira, Ignacio

    2013-11-01

    An inductive strategy is proposed for teaching dimensional analysis to second- or third-year students of physics, chemistry, or engineering. In this strategy, Buckingham's theorem is seen as a consequence and not as the starting point. In order to concentrate on the basics, the mathematics is kept as elementary as possible. Simple examples are suggested for classroom demonstrations of the power of the technique and others are put forward for homework or experimentation, but instructors are encouraged to produce examples of their own.
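
    A standard classroom illustration of this inductive strategy (not an example drawn from the paper itself) is the simple pendulum, where matching dimensions fixes the form of the answer before Buckingham's theorem is ever mentioned; in LaTeX notation:

      % Simple-pendulum example (standard classroom illustration, not from the paper).
      % Suppose the period T depends on the bob mass m, the length L and gravity g:
      \[
        T = f(m, L, g), \qquad
        [T] = \mathrm{s}, \quad [m] = \mathrm{kg}, \quad [L] = \mathrm{m}, \quad [g] = \mathrm{m\,s^{-2}} .
      \]
      % Trying a power-law form T \propto m^a L^b g^c and matching dimensions:
      \[
        \mathrm{s}^{1} = \mathrm{kg}^{a}\,\mathrm{m}^{\,b+c}\,\mathrm{s}^{-2c}
        \;\Longrightarrow\; a = 0, \quad b + c = 0, \quad -2c = 1 ,
      \]
      % so a = 0, b = 1/2, c = -1/2, and the only dimensionally consistent form is
      \[
        T = C \sqrt{L/g},
      \]
      % with C a dimensionless constant (2\pi in the small-angle limit); the mass drops out.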

  11. Proximate Composition Analysis.

    Science.gov (United States)

    2016-01-01

    The proximate composition of foods includes moisture, ash, lipid, protein and carbohydrate contents. These food components may be of interest in the food industry for product development, quality control (QC) or regulatory purposes. Analyses used may be rapid methods for QC or more accurate but time-consuming official methods. Sample collection and preparation must be considered carefully to ensure analysis of a homogeneous and representative sample, and to obtain accurate results. Estimation methods of moisture content, ash value, crude lipid, total carbohydrates, starch, total free amino acids and total proteins are put together in a lucid manner.
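
    One common piece of proximate-analysis bookkeeping is "carbohydrate by difference", in which total carbohydrate is whatever remains after the other measured fractions; the sketch below assumes that convention and uses hypothetical percentages, not data from the chapter.

      # "Carbohydrate by difference": total carbohydrate = 100 - (moisture + ash + protein + lipid).
      # The numbers below are hypothetical illustration values.
      def carbohydrate_by_difference(moisture, ash, protein, lipid):
          """All inputs and the result are g per 100 g of food (percent by mass)."""
          carb = 100.0 - (moisture + ash + protein + lipid)
          if carb < 0:
              raise ValueError("fractions exceed 100 g/100 g -- check the input data")
          return carb

      print(carbohydrate_by_difference(moisture=12.0, ash=1.5, protein=10.5, lipid=2.0))  # -> 74.0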

  12. Functional Object Analysis

    DEFF Research Database (Denmark)

    Raket, Lars Lau

    Mixed-effect formulations, where the observed functional signal is assumed to consist of both fixed and random functional effects. This thesis takes the initial steps toward the development of likelihood-based methodology for functional objects. We first consider analysis of functional data defined on high-dimensional Euclidean spaces under the effect of additive spatially correlated effects, and then move on to consider how to include data alignment in the statistical model as a nonlinear effect under additive correlated noise. In both cases, we will give directions on how to generalize the methodology to more complex...

  13. Elementary heat transfer analysis

    CERN Document Server

    Whitaker, Stephen; Hartnett, James P

    1976-01-01

    Elementary Heat Transfer Analysis provides information pertinent to the fundamental aspects of the nature of transient heat conduction. This book presents a thorough understanding of the thermal energy equation and its application to boundary layer flows and confined and unconfined turbulent flows. Organized into nine chapters, this book begins with an overview of the use of heat transfer coefficients in formulating the flux condition at phase interface. This text then explains the specification as well as application of flux boundary conditions. Other chapters consider a derivation of the tra

  14. Contextual analysis of videos

    CERN Document Server

    Thida, Myo; Monekosso, Dorothy

    2013-01-01

    Video context analysis is an active and vibrant research area, which provides means for extracting, analyzing and understanding the behavior of a single target and of multiple targets. Over the last few decades, computer vision researchers have been working to improve the accuracy and robustness of algorithms to analyse the context of a video automatically. In general, the research work in this area can be categorized into three major topics: 1) counting the number of people in the scene, 2) tracking individuals in a crowd, and 3) understanding the behavior of a single target or multiple targets in the scene.

  15. System Reliability Analysis: Foundations.

    Science.gov (United States)

    1982-07-01

    Performance formulas for systems subject to preventive maintenance are given. System reliability is expressed through h(p), the probability that a source s can communicate with a terminal t; the remainder of the record is OCR residue of the corresponding reliability polynomials. For undirected networks, the basic reference is A. Satyanarayana and Kevin Wood (1982). For directed networks, the basic reference is Avinash

  16. SCATHA-Analysis System

    Science.gov (United States)

    1981-01-31

    The record consists largely of OCR residue from a figure ("Data Analysis System Functional Flow") and a telemetry word-assignment table: SCF and agency tapes, indexed by experiment, tape number, or day, are converted to digital data, with mainframe (MF) words mapped to subcom levels and data channels such as D4001 and D4002, the word order repeating for successive subcom levels.

  17. Analysis in indexing

    DEFF Research Database (Denmark)

    Mai, Jens Erik

    2005-01-01

    A domain-centered approach is presented as an alternative, and the paper discusses how this approach includes a broader range of analyses and how it requires a new set of actions: analysis of the domain, the users and the indexers. The paper concludes that the two-step procedure to indexing is insufficient to explain the indexing process and suggests that the domain-centered approach offers a guide for indexers that can help them manage the complexity of indexing. © 2004 Elsevier Ltd. All rights reserved.

  18. Future Climate Analysis

    Energy Technology Data Exchange (ETDEWEB)

    James Houseworth

    2001-10-12

    This Analysis/Model Report (AMR) documents an analysis that was performed to estimate climatic variables for the next 10,000 years by forecasting the timing and nature of climate change at Yucca Mountain (YM), Nevada (Figure 1), the site of a potential repository for high-level radioactive waste. The future-climate estimates are based on an analysis of past-climate data from analog meteorological stations, and this AMR provides the rationale for the selection of these analog stations. The stations selected provide an upper and a lower climate bound for each future climate, and the data from those sites will provide input to the infiltration model (USGS 2000) and for the total system performance assessment for the Site Recommendation (TSPA-SR) at YM. Forecasting long-term future climates, especially for the next 10,000 years, is highly speculative and rarely attempted. A very limited literature exists concerning the subject, largely from the British radioactive waste disposal effort. The discussion presented here is one method, among many, of establishing upper and lower bounds for future climate estimates. The method used here involves selecting a particular past climate from many past climates, as an analog for future climate. Other studies might develop a different rationale or select other past climates resulting in a different future climate analog. Revision 00 of this AMR was prepared in accordance with the "Work Direction and Planning Document for Future Climate Analysis" (Peterman 1999) under Interagency Agreement DE-AI08-97NV12033 with the U.S. Department of Energy (DOE). The planning document for the technical scope, content, and management of ICN 01 of this AMR is the "Technical Work Plan for Unsaturated Zone (UZ) Flow and Transport Process Model Report" (BSC 2001a). The scope for the TBV resolution actions in this ICN is described in the "Technical Work Plan for: Integrated Management of Technical

  19. Drift Degradation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    D. Kicker

    2004-09-16

    Degradation of underground openings as a function of time is a natural and expected occurrence for any subsurface excavation. Over time, changes occur to both the stress condition and the strength of the rock mass due to several interacting factors. Once the factors contributing to degradation are characterized, the effects of drift degradation can typically be mitigated through appropriate design and maintenance of the ground support system. However, for the emplacement drifts of the geologic repository at Yucca Mountain, it is necessary to characterize drift degradation over a 10,000-year period, which is well beyond the functional period of the ground support system. This document provides an analysis of the amount of drift degradation anticipated in repository emplacement drifts for discrete events and time increments extending throughout the 10,000-year regulatory period for postclosure performance. This revision of the drift degradation analysis was developed to support the license application and fulfill specific agreement items between the U.S. Nuclear Regulatory Commission (NRC) and the U.S. Department of Energy (DOE). The earlier versions of "Drift Degradation Analysis" (BSC 2001 [DIRS 156304]) relied primarily on the DRKBA numerical code, which provides for a probabilistic key-block assessment based on realistic fracture patterns determined from field mapping in the Exploratory Studies Facility (ESF) at Yucca Mountain. A key block is defined as a critical block in the surrounding rock mass of an excavation, which is removable and oriented in an unsafe manner such that it is likely to move into an opening unless support is provided. However, the use of the DRKBA code to determine potential rockfall data at the repository horizon during the postclosure period has several limitations: (1) The DRKBA code cannot explicitly apply dynamic loads due to seismic ground motion. (2) The DRKBA code cannot explicitly apply loads due to thermal

  20. Data analysis with Mplus

    CERN Document Server

    Geiser, Christian

    2012-01-01

    A practical introduction to using Mplus for the analysis of multivariate data, this volume provides step-by-step guidance, complete with real data examples, numerous screen shots, and output excerpts. The author shows how to prepare a data set for import in Mplus using SPSS. He explains how to specify different types of models in Mplus syntax and address typical caveats--for example, assessing measurement invariance in longitudinal SEMs. Coverage includes path and factor analytic models as well as mediational, longitudinal, multilevel, and latent class models. Specific programming tips an

  1. Uncertain data envelopment analysis

    CERN Document Server

    Wen, Meilin

    2014-01-01

    This book is intended to present the milestones in the progression of uncertain data envelopment analysis (DEA). Chapter 1 gives a basic introduction to uncertainty theories, including probability theory, credibility theory, uncertainty theory and chance theory. Chapter 2 presents a comprehensive review and discussion of basic DEA models. The stochastic DEA is introduced in Chapter 3, in which the inputs and outputs are assumed to be random variables. To obtain the probability distribution of a random variable, many samples are needed to apply the statistical inference approach. Chapter 4

  2. Recursive principal components analysis.

    Science.gov (United States)

    Voegtlin, Thomas

    2005-10-01

    A recurrent linear network can be trained with Oja's constrained Hebbian learning rule. As a result, the network learns to represent the temporal context associated to its input sequence. The operation performed by the network is a generalization of Principal Components Analysis (PCA) to time-series, called Recursive PCA. The representations learned by the network are adapted to the temporal statistics of the input. Moreover, sequences stored in the network may be retrieved explicitly, in the reverse order of presentation, thus providing a straight-forward neural implementation of a logical stack.
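
    To make the underlying learning rule concrete, the sketch below applies Oja's constrained Hebbian rule to extract the first principal component of a synthetic 2-D dataset. It is a minimal, non-recurrent illustration of the rule the abstract builds on, not the paper's recursive (temporal-context) network; the data, learning rate and seed are all invented for the example.

      # Oja's constrained Hebbian rule for the first principal component (minimal sketch).
      import numpy as np

      rng = np.random.default_rng(0)
      # Synthetic 2-D data whose dominant variance lies along the direction (1, 1).
      data = rng.normal(size=(5000, 2)) @ np.array([[3.0, 0.0], [0.0, 0.5]])
      theta = np.pi / 4
      R = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
      data = data @ R.T

      w = rng.normal(size=2)                 # weight vector; converges to the leading eigenvector
      eta = 1e-3                             # learning rate
      for x in data:
          y = w @ x                          # neuron output
          w += eta * y * (x - y * w)         # Oja's rule: Hebbian term with implicit normalisation

      w /= np.linalg.norm(w)
      pc1 = np.linalg.eigh(np.cov(data.T))[1][:, -1]   # reference: leading covariance eigenvector
      print("learned :", np.round(w, 3))
      print("eig PCA :", np.round(pc1, 3))   # should match the learned vector up to sign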

  3. Atom trap trace analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Z.-T.; Bailey, K.; Chen, C.-Y.; Du, X.; Li, Y.-M.; O' Connor, T. P.; Young, L.

    2000-05-25

    A new method of ultrasensitive trace-isotope analysis has been developed based upon the technique of laser manipulation of neutral atoms. It has been used to count individual ⁸⁵Kr and ⁸¹Kr atoms present in a natural krypton sample with isotopic abundances in the range of 10⁻¹¹ and 10⁻¹³, respectively. The atom counts are free of contamination from other isotopes, elements, or molecules. The method is applicable to other trace-isotopes that can be efficiently captured with a magneto-optical trap, and has a broad range of potential applications.

  4. Confinement Vessel Dynamic Analysis

    Energy Technology Data Exchange (ETDEWEB)

    R. Robert Stevens; Stephen P. Rojas

    1999-08-01

    A series of hydrodynamic and structural analyses of a spherical confinement vessel has been performed. The analyses used a hydrodynamic code to estimate the dynamic blast pressures at the vessel's internal surfaces caused by the detonation of a mass of high explosive, then used those blast pressures as applied loads in an explicit finite element model to simulate the vessel's structural response. Numerous load cases were considered. Particular attention was paid to the bolted port connections and the O-ring pressure seals. The analysis methods and results are discussed, and comparisons to experimental results are made.

  5. Structural dynamics analysis

    Science.gov (United States)

    Housner, J. M.; Anderson, M.; Belvin, W.; Horner, G.

    1985-01-01

    Dynamic analysis of large space antenna systems must treat the deployment as well as vibration and control of the deployed antenna. Candidate computer programs for deployment dynamics, and issues and needs for future program developments are reviewed. Some results for mast and hoop deployment are also presented. Modeling of complex antenna geometry with conventional finite element methods and with repetitive exact elements is considered. Analytical comparisons with experimental results for a 15 meter hoop/column antenna revealed the importance of accurate structural properties including nonlinear joints. Slackening of cables in this antenna is also a consideration. The technology of designing actively damped structures through analytical optimization is discussed and results are presented.

  6. Geometric analysis and PDEs

    CERN Document Server

    Ambrosetti, Antonio; Malchiodi, Andrea

    2009-01-01

    This volume contains lecture notes on some topics in geometric analysis, a growing mathematical subject which uses analytical techniques, mostly of partial differential equations, to treat problems in differential geometry and mathematical physics. The presentation of the material should be rather accessible to non-experts in the field, since the presentation is didactic in nature. The reader will be provided with a survey containing some of the most exciting topics in the field, with a series of techniques used to treat such problems.

  7. Analysis of rare categories

    CERN Document Server

    He, Jingrui

    2012-01-01

    In many real-world problems, rare categories (minority classes) play essential roles despite their extreme scarcity. The discovery, characterization and prediction of rare categories of rare examples may protect us from fraudulent or malicious behavior, aid scientific discovery, and even save lives. This book focuses on rare category analysis, where the majority classes have smooth distributions, and the minority classes exhibit the compactness property. Furthermore, it focuses on the challenging cases where the support regions of the majority and minority classes overlap. The author has devel

  8. Integrated Process Capability Analysis

    Institute of Scientific and Technical Information of China (English)

    Chen, H. T.; Huang, M. L.; Hung, Y. H.; Chen, K. S.

    2002-01-01

    Process Capability Analysis (PCA) is a powerful tool to assess the ability of a process for manufacturing product that meets specifications. The larger process capability index implies the higher process yield, and the larger process capability index also indicates the lower process expected loss. Chen et al. (2001) has applied indices Cpu, Cpl, and Cpk for evaluating the process capability for a multi-process product with smaller-the-better, larger-the-better, and nominal-the-best spec...
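
    For context, the snippet below computes the textbook single-characteristic indices Cp and Cpk on simulated measurements; the specification limits and data are hypothetical, and this is not the multi-process index of Chen et al. (2001).

      # Standard process capability indices Cp and Cpk for a nominal-the-best characteristic.
      import numpy as np

      def cp_cpk(samples, lsl, usl):
          mu, sigma = np.mean(samples), np.std(samples, ddof=1)
          cp  = (usl - lsl) / (6.0 * sigma)        # potential capability (spread only)
          cpu = (usl - mu) / (3.0 * sigma)         # upper one-sided index
          cpl = (mu - lsl) / (3.0 * sigma)         # lower one-sided index
          cpk = min(cpu, cpl)                      # actual capability, penalising off-centring
          return cp, cpk

      rng = np.random.default_rng(1)
      x = rng.normal(loc=10.02, scale=0.03, size=200)          # simulated measurements
      print("Cp = %.2f, Cpk = %.2f" % cp_cpk(x, lsl=9.90, usl=10.10))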

  9. Surface Aesthetics and Analysis.

    Science.gov (United States)

    Çakır, Barış; Öreroğlu, Ali Rıza; Daniel, Rollin K

    2016-01-01

    Surface aesthetics of an attractive nose result from certain lines, shadows, and highlights with specific proportions and breakpoints. Analysis emphasizes geometric polygons as aesthetic subunits. Evaluation of the complete nasal surface aesthetics is achieved using geometric polygons to define the existing deformity and aesthetic goals. The relationship between the dome triangles, interdomal triangle, facet polygons, and infralobular polygon are integrated to form the "diamond shape" light reflection on the nasal tip. The principles of geometric polygons allow the surgeon to analyze the deformities of the nose, define an operative plan to achieve specific goals, and select the appropriate operative technique.

  10. Introduction to combinatorial analysis

    CERN Document Server

    Riordan, John

    2002-01-01

    This introduction to combinatorial analysis defines the subject as "the number of ways there are of doing some well-defined operation." Chapter 1 surveys that part of the theory of permutations and combinations that finds a place in books on elementary algebra, which leads to the extended treatment of generating functions in Chapter 2, where an important result is the introduction of a set of multivariable polynomials. Chapter 3 contains an extended treatment of the principle of inclusion and exclusion which is indispensable to the enumeration of permutations with restricted position given

  11. Compositional analysis of silicon

    Science.gov (United States)

    Kazmerski, L. L.

    1985-05-01

    The use of surface analysis methods in the detection and evaluation of elemental and impurity species in Si is presented. Examples are provided from polycrystalline Si and high-efficiency MINP cells. Auger electron spectroscopy and secondary ion mass spectrometry are used to complement microelectrical data obtained by electron-beam induced-current measurements. A new method is discussed which utilizes the volume indexing of digital secondary ion mass spectroscopy signals, providing compositional information and impurity maps on internal materials/device interfaces.

  12. Motion analysis report

    Science.gov (United States)

    Badler, N. I.

    1985-01-01

    Human motion analysis is the task of converting actual human movements into computer readable data. Such movement information may be obtained though active or passive sensing methods. Active methods include physical measuring devices such as goniometers on joints of the body, force plates, and manually operated sensors such as a Cybex dynamometer. Passive sensing de-couples the position measuring device from actual human contact. Passive sensors include Selspot scanning systems (since there is no mechanical connection between the subject's attached LEDs and the infrared sensing cameras), sonic (spark-based) three-dimensional digitizers, Polhemus six-dimensional tracking systems, and image processing systems based on multiple views and photogrammetric calculations.

  13. Performance Analysis in Bigdata

    Directory of Open Access Journals (Sweden)

    Pankaj Deep Kaur

    2015-10-01

    Full Text Available Big data technologies like Hadoop, NoSQL, messaging queues, etc. help in big data analytics, drive business growth, and support timely decision making. These big data environments are very dynamic and complex; they require performance validation, root cause analysis, and tuning to ensure success. In this paper we discuss how to analyse and test the performance of these systems. We present the important factors in a big data system that are primary candidates for performance testing, such as data ingestion capacity and throughput, data processing capacity, simulation of expected usage, map reduce jobs and so on, and we suggest measures to improve the performance of big data

  14. Visual Analysis of Humans

    CERN Document Server

    Moeslund, Thomas B

    2011-01-01

    This unique text/reference provides a coherent and comprehensive overview of all aspects of video analysis of humans. Broad in coverage and accessible in style, the text presents original perspectives collected from preeminent researchers gathered from across the world. In addition to presenting state-of-the-art research, the book reviews the historical origins of the different existing methods, and predicts future trends and challenges. This title: features a Foreword by Professor Larry Davis; contains contributions from an international selection of leading authorities in the field; includes

  15. Portfolio Analysis for Vector Calculus

    Science.gov (United States)

    Kaplan, Samuel R.

    2015-01-01

    Classic stock portfolio analysis provides an applied context for Lagrange multipliers that undergraduate students appreciate. Although modern methods of portfolio analysis are beyond the scope of vector calculus, classic methods reinforce the utility of this material. This paper discusses how to introduce classic stock portfolio analysis in a…
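
    A hedged sketch of the classic calculation: minimising portfolio variance subject to a full-investment constraint with a Lagrange multiplier gives the closed form w = S⁻¹1 / (1ᵀS⁻¹1). The covariance matrix below is hypothetical and the example is an illustration of the mathematics, not investment advice or the paper's own dataset.

      # Minimum-variance portfolio via Lagrange multipliers:
      # minimise w' S w subject to sum(w) = 1  =>  w = S^{-1} 1 / (1' S^{-1} 1).
      import numpy as np

      S = np.array([[0.040, 0.006, 0.012],
                    [0.006, 0.090, 0.015],
                    [0.012, 0.015, 0.160]])       # hypothetical annualised covariance of three assets

      ones = np.ones(S.shape[0])
      w = np.linalg.solve(S, ones)
      w /= ones @ w                                # enforce the budget constraint sum(w) = 1

      print("weights           :", np.round(w, 3))
      print("portfolio variance:", round(w @ S @ w, 5))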

  16. Multilevel analysis in CSCL research

    NARCIS (Netherlands)

    Janssen, J.J.H.M.; Erkens, G.; Kirschner, P.A.; Kanselaar, G.

    2011-01-01

    The aim of this chapter is to explain why multilevel analysis (MLA) is often necessary to correctly answer the questions CSCL researchers address. Although CSCL researchers continue to use statistical techniques such as analysis of variance or regression analysis, their datasets are often not suite

  17. Analysis of bilinear stochastic systems

    Science.gov (United States)

    Willsky, A. S.; Martin, D. N.; Marcus, S. I.

    1975-01-01

    Analysis of stochastic dynamical systems that involve multiplicative (bilinear) noise processes. After defining the systems of interest, consideration is given to the evolution of the moments of such systems, the question of stochastic stability, and estimation for bilinear stochastic systems. Both exact and approximate methods of analysis are introduced, and, in particular, the uses of Lie-theoretic concepts and harmonic analysis are discussed.

  18. Exploratory Analysis in Learning Analytics

    Science.gov (United States)

    Gibson, David; de Freitas, Sara

    2016-01-01

    This article summarizes the methods, observations, challenges and implications for exploratory analysis drawn from two learning analytics research projects. The cases include an analysis of a games-based virtual performance assessment and an analysis of data from 52,000 students over a 5-year period at a large Australian university. The complex…

  19. Discourse Analysis and Language Communication

    Institute of Scientific and Technical Information of China (English)

    韦钧玮

    2008-01-01

    A considerable portion of the work of discourse analysis as a research method can be found in its two major families: linguistic-based analysis (such as conversation analysis) and culturally or socially based discursive practices. The potential of both families, along with examples of each, is discussed.

  20. Correspondence analysis of longitudinal data

    NARCIS (Netherlands)

    Van der Heijden, P.G.M.

    2005-01-01

    Correspondence analysis is an exploratory tool for the analysis of associations between categorical variables, the results of which may be displayed graphically. For longitudinal data with two time points, an analysis of the transition matrix (showing the relative frequencies for pairs of categories

  1. Error Analysis and Propagation in Metabolomics Data Analysis.

    Science.gov (United States)

    Moseley, Hunter N B

    2013-01-01

    Error analysis plays a fundamental role in describing the uncertainty in experimental results. It has several fundamental uses in metabolomics including experimental design, quality control of experiments, the selection of appropriate statistical methods, and the determination of uncertainty in results. Furthermore, the importance of error analysis has grown with the increasing number, complexity, and heterogeneity of measurements characteristic of 'omics research. The increase in data complexity is particularly problematic for metabolomics, which has more heterogeneity than other omics technologies due to the much wider range of molecular entities detected and measured. This review introduces the fundamental concepts of error analysis as they apply to a wide range of metabolomics experimental designs and it discusses current methodologies for determining the propagation of uncertainty in appropriate metabolomics data analysis. These methodologies include analytical derivation and approximation techniques, Monte Carlo error analysis, and error analysis in metabolic inverse problems. Current limitations of each methodology with respect to metabolomics data analysis are also discussed.
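
    As a generic illustration of the Monte Carlo approach mentioned above, the sketch below propagates measurement uncertainty through a simple derived quantity (a ratio of two measured intensities) and compares the result with first-order analytical propagation; the means and standard deviations are hypothetical, not real metabolomics data.

      # Monte Carlo propagation of measurement uncertainty through a derived quantity.
      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000

      a = rng.normal(loc=250.0, scale=12.0, size=n)   # measured intensity A and its sd
      b = rng.normal(loc=125.0, scale=9.0,  size=n)   # measured intensity B and its sd
      ratio = a / b                                   # derived quantity of interest

      print(f"Monte Carlo: ratio = {ratio.mean():.3f} +/- {ratio.std(ddof=1):.3f}")

      # First-order (analytical) propagation for comparison:
      # sd(ratio) ~ ratio * sqrt((sd_a/a)^2 + (sd_b/b)^2)
      approx = (250.0 / 125.0) * np.sqrt((12.0 / 250.0) ** 2 + (9.0 / 125.0) ** 2)
      print(f"first-order approximation: +/- {approx:.3f}")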

  2. Cross-impacts analysis development and energy policy analysis applications

    Energy Technology Data Exchange (ETDEWEB)

    Roop, J.M.; Scheer, R.M.; Stacey, G.S.

    1986-12-01

    Purpose of this report is to describe the cross-impact analysis process and microcomputer software developed for the Office of Policy, Planning, and Analysis (PPA) of DOE. First introduced in 1968, cross-impact analysis is a technique that produces scenarios of future conditions and possibilities. Cross-impact analysis has several unique attributes that make it a tool worth examining, especially in the current climate when the outlook for the economy and several of the key energy markets is uncertain. Cross-impact analysis complements the econometric, engineering, systems dynamics, or trend approaches already in use at DOE. Cross-impact analysis produces self-consistent scenarios in the broadest sense and can include interaction between the economy, technology, society and the environment. Energy policy analyses that couple broad scenarios of the future with detailed forecasting can produce more powerful results than scenario analysis or forecasts can produce alone.

  3. JASMINE data analysis

    Science.gov (United States)

    Yamada, Yoshiyuki; Gouda, Naoteru; Yoshioka, Satoshi

    2015-08-01

    We are planning JASMINE (Japan Astrometric Satellite Mission for INfrared Exploration) as a series of missions: Nano-JASMINE, Small-JASMINE, and JASMINE. Nano-JASMINE data analysis will be performed in collaboration with the Gaia data analysis team. We apply the Gaia core processing software, named AGIS, as the Nano-JASMINE core solution; applicability has been confirmed by D. Michalik and the Gaia DPAC team. Converting telemetry data to AGIS input is the JASMINE team's task, and it includes centroid calculation of the stellar image. The accuracy of Gaia is two orders of magnitude better than that of Nano-JASMINE, but these are the only two astrometric satellite missions with CCD detectors for global astrometry, so Nano-JASMINE will have a role in calibrating Gaia data. Bright-star centroiding is the most important science target. Small-JASMINE has a completely different observation strategy: it will perform step-stair observations with about a million observations of each individual star. Sub-milliarcsecond centroid errors of individual stellar images will be reduced by two orders of magnitude, yielding 10 microarcsecond astrometric accuracy by applying the square-root-N law to a million observations. Various systematic noise sources should be estimated, modelled, and subtracted. Some statistical studies will be shown in this poster.
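
    For reference, the averaging law appealed to above can be written as follows (a generic statement of the square-root-N law, not a figure taken from the mission documentation):

      \[
        \sigma_N = \frac{\sigma_1}{\sqrt{N}}, \qquad
        \sqrt{N} = \sqrt{10^{6}} = 10^{3},
      \]
      % i.e. averaging a million independent observations reduces the single-observation
      % centroid error by a factor of about one thousand.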

  4. Compressed Submanifold Multifactor Analysis.

    Science.gov (United States)

    Luu, Khoa; Savvides, Marios; Bui, Tien; Suen, Ching

    2016-04-14

    Although widely used, Multilinear PCA (MPCA), one of the leading multilinear analysis methods, still suffers from four major drawbacks. First, it is very sensitive to outliers and noise. Second, it is unable to cope with missing values. Third, it is computationally expensive since MPCA deals with large multi-dimensional datasets. Finally, it is unable to maintain the local geometrical structures due to the averaging process. This paper proposes a novel approach named Compressed Submanifold Multifactor Analysis (CSMA) to solve the four problems mentioned above. Our approach can deal with the problem of missing values and outliers via SVD-L1. The Random Projection method is used to obtain the fast low-rank approximation of a given multifactor dataset. In addition, it is able to preserve the geometry of the original data. Our CSMA method can be used efficiently for multiple purposes, e.g. noise and outlier removal, estimation of missing values, biometric applications. We show that CSMA method can achieve good results and is very efficient in the inpainting problem as compared to [1], [2]. Our method also achieves higher face recognition rates compared to LRTC, SPMA, MPCA and some other methods, i.e. PCA, LDA and LPP, on three challenging face databases, i.e. CMU-MPIE, CMU-PIE and Extended YALE-B.
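
    The random-projection step the abstract relies on can be illustrated with a generic Gaussian randomized range finder, shown below on a synthetic low-rank matrix. This is not the authors' CSMA algorithm or their SVD-L1 step, only a sketch of how random projection yields a fast low-rank approximation; the matrix sizes and rank are invented.

      # Generic Gaussian random-projection sketch for fast low-rank approximation.
      import numpy as np

      rng = np.random.default_rng(7)

      # Synthetic 500 x 300 matrix of approximate rank 10 plus small noise.
      A = rng.normal(size=(500, 10)) @ rng.normal(size=(10, 300)) + 0.01 * rng.normal(size=(500, 300))

      k = 10                                          # target rank
      Omega = rng.normal(size=(A.shape[1], k + 5))    # random test matrix with small oversampling
      Y = A @ Omega                                   # project columns into a low-dimensional range
      Q, _ = np.linalg.qr(Y)                          # orthonormal basis for the sketched range
      A_approx = Q @ (Q.T @ A)                        # rank-(k+5) approximation of A

      rel_err = np.linalg.norm(A - A_approx) / np.linalg.norm(A)
      print(f"relative approximation error: {rel_err:.4f}")   # close to the noise floor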

  5. Mitigation analysis for Estonia

    Energy Technology Data Exchange (ETDEWEB)

    Martins, A.; Roos, J.; Pesur, A. [Inst. of Energy Research, Tallinn (Estonia)] [and others

    1996-09-01

    The present report provides data on the mitigation analysis of Estonia. The results for energy, forest and agricultural sectors and macro-economic analysis are given. The Government of Estonia has identified the development of energy production as the main strategical means in the movement towards market economy. Now 99% of electricity generation and about 25% of heat production in Estonia is based on oil shale combustion. To increase the efficiency of oil shale-fired power plants and decrease CO₂ emissions, the State Enterprise (SE) Eesti Energia (Estonian Energy) is planning to reconstruct these power plants and introduce the Circulating Fluidized Bed (CFB) combustion technology for oil shale burning to replace the Pulverized Combustion (PC). According to the Estonian Forest Policy, two general objectives are of importance: sustainability in forestry and efficiency in forest management. For the reduction of greenhouse gases (GHG) emissions from agriculture, it is necessary to increase the efficiency of production resource usage. The growth of the GDP in 1995 was 2.9% as a result of large-scale privatization activities in Estonia and re-introduction of the available, but unused production capacities with the help of foreign and domestic investments. It is assumed that the medium growth rate of GDP reaches 6% in 1998.

  6. Discriminant Incoherent Component Analysis.

    Science.gov (United States)

    Georgakis, Christos; Panagakis, Yannis; Pantic, Maja

    2016-05-01

    Face images convey rich information which can be perceived as a superposition of low-complexity components associated with attributes, such as facial identity, expressions, and activation of facial action units (AUs). For instance, low-rank components characterizing neutral facial images are associated with identity, while sparse components capturing non-rigid deformations occurring in certain face regions reveal expressions and AU activations. In this paper, the discriminant incoherent component analysis (DICA) is proposed in order to extract low-complexity components, corresponding to facial attributes, which are mutually incoherent among different classes (e.g., identity, expression, and AU activation) from training data, even in the presence of gross sparse errors. To this end, a suitable optimization problem, involving the minimization of the nuclear and l1 norms, is solved. Having found an ensemble of class-specific incoherent components by the DICA, an unseen (test) image is expressed as a group-sparse linear combination of these components, where the non-zero coefficients reveal the class(es) of the respective facial attribute(s) that it belongs to. The performance of the DICA is experimentally assessed on both synthetic and real-world data. Emphasis is placed on face analysis tasks, namely, joint face and expression recognition, face recognition under varying percentages of training data corruption, subject-independent expression recognition, and AU detection by conducting experiments on four data sets. The proposed method outperforms all compared methods across all tasks and experimental settings.

  7. Blood stain pattern analysis.

    Science.gov (United States)

    Peschel, O; Kunz, S N; Rothschild, M A; Mützel, E

    2011-09-01

    Bloodstain pattern analysis (BPA) refers to the collection, categorization and interpretation of the shape and distribution of bloodstains connected with a crime. These kinds of stains occur in a considerable proportion of homicide cases. They offer extensive information and are an important part of a functional, medically and scientifically based reconstruction of a crime. The following groups of patterns can essentially be distinguished: dripped and splashed blood, projected blood, impact patterns, cast-off stains, expirated and transferred bloodstains. A highly qualified analysis can help to estimate facts concerning the location, quality and intensity of an external force. A sequence of events may be recognized, and detailed questions connected with the reconstruction of the crime might be answered. In some cases, BPA helps to distinguish between accident, homicide and suicide or to identify bloodstains originating from a perpetrator. BPA is based on systematic training, a visit to the crime scene or alternatively good photographic documentation, and an understanding and knowledge of autopsy findings or statements made by the perpetrator and/or victim. A BPA working group has been established within the German Society of Legal Medicine aiming to put the knowledge and practical applications of this subdiscipline of forensic science on a wider basis.
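
    One standard quantitative relation used in BPA is the angle-of-impact formula for an elliptical stain, sin(alpha) = width/length; the sketch below applies it to hypothetical stain dimensions.

      # Standard bloodstain-pattern relation for the angle of impact of an elliptical stain:
      # sin(alpha) = width / length. The stain dimensions below are hypothetical.
      import math

      def impact_angle_deg(width_mm, length_mm):
          """Angle between the blood-drop trajectory and the target surface, in degrees."""
          if not 0 < width_mm <= length_mm:
              raise ValueError("width must be positive and no larger than length")
          return math.degrees(math.asin(width_mm / length_mm))

      print(f"{impact_angle_deg(width_mm=6.0, length_mm=12.0):.1f} degrees")   # -> 30.0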

  8. Modern Fourier analysis

    CERN Document Server

    Grafakos, Loukas

    2014-01-01

    This text is addressed to graduate students in mathematics and to interested researchers who wish to acquire an in depth understanding of Euclidean Harmonic analysis. The text covers modern topics and techniques in function spaces, atomic decompositions, singular integrals of nonconvolution type, and the boundedness and convergence of Fourier series and integrals. The exposition and style are designed to stimulate further study and promote research. Historical information and references are included at the end of each chapter. This third edition includes a new chapter entitled "Multilinear Harmonic Analysis" which focuses on topics related to multilinear operators and their applications. Sections 1.1 and 1.2 are also new in this edition. Numerous corrections have been made to the text from the previous editions and several improvements have been incorporated, such as the adoption of clear and elegant statements. A few more exercises have been added with relevant hints when necessary. Reviews fr...

  9. Dynamic Contingency Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2016-01-14

    The Dynamic Contingency Analysis Tool (DCAT) is an open-platform and publicly available methodology to help develop applications that aim to improve the capabilities of power system planning engineers to assess the impact and likelihood of extreme contingencies and potential cascading events across their systems and interconnections. Outputs from the DCAT will help find mitigation solutions to reduce the risk of cascading outages in technically sound and effective ways. The current prototype DCAT implementation has been developed as a Python code that accesses the simulation functions of the Siemens PSS®E planning tool (PSS/E). It has the following features: It uses a hybrid dynamic and steady-state approach to simulating the cascading outage sequences that includes fast dynamic and slower steady-state events. It integrates dynamic models with protection scheme models for generation, transmission, and load. It models special protection systems (SPSs)/remedial action schemes (RASs) and automatic and manual corrective actions. Overall, the DCAT attempts to bridge multiple gaps in cascading-outage analysis in a single, unique prototype tool capable of automatically simulating and analyzing cascading sequences in real systems using multiprocessor computers. While the DCAT has been implemented using PSS/E in Phase I of the study, other commercial software packages with similar capabilities can be used within the DCAT framework.

  10. Image sequence analysis

    CERN Document Server

    1981-01-01

    The processing of image sequences has a broad spectrum of important applications including target tracking, robot navigation, bandwidth compression of TV conferencing video signals, studying the motion of biological cells using microcinematography, cloud tracking, and highway traffic monitoring. Image sequence processing involves a large amount of data. However, because of the progress in computer, LSI, and VLSI technologies, we have now reached a stage when many useful processing tasks can be done in a reasonable amount of time. As a result, research and development activities in image sequence analysis have recently been growing at a rapid pace. An IEEE Computer Society Workshop on Computer Analysis of Time-Varying Imagery was held in Philadelphia, April 5-6, 1979. A related special issue of the IEEE Transactions on Pattern Analysis and Machine Intelligence was published in November 1980. The IEEE Computer magazine has also published a special issue on the subject in 1981. The purpose of this book ...

  11. Analysis of computer networks

    CERN Document Server

    Gebali, Fayez

    2015-01-01

    This textbook presents the mathematical theory and techniques necessary for analyzing and modeling high-performance global networks, such as the Internet. The three main building blocks of high-performance networks are links, switching equipment connecting the links together, and software employed at the end nodes and intermediate switches. This book provides the basic techniques for modeling and analyzing these last two components. Topics covered include, but are not limited to: Markov chains and queuing analysis, traffic modeling, interconnection networks and switch architectures and buffering strategies. The book provides techniques for modeling and analysis of network software and switching equipment; discusses design options used to build efficient switching equipment; includes many worked examples of the application of discrete-time Markov chains to communication systems; and covers the mathematical theory and techniques necessary for ana...

  12. Microarray Analysis in Glioblastomas

    Science.gov (United States)

    Bhawe, Kaumudi M.; Aghi, Manish K.

    2016-01-01

    Microarray analysis in glioblastomas is done using either cell lines or patient samples as starting material. A survey of the current literature points to transcript-based microarrays and immunohistochemistry (IHC)-based tissue microarrays as being the preferred methods of choice in cancers of neurological origin. Microarray analysis may be carried out for various purposes including the following: to correlate gene expression signatures of glioblastoma cell lines or tumors with response to chemotherapy (DeLay et al., Clin Cancer Res 18(10):2930–2942, 2012); to correlate gene expression patterns with biological features like proliferation or invasiveness of the glioblastoma cells (Jiang et al., PLoS One 8(6):e66008, 2013); and to discover new tumor classificatory systems based on gene expression signature, and to correlate therapeutic response and prognosis with these signatures (Huse et al., Annu Rev Med 64(1):59–70, 2013; Verhaak et al., Cancer Cell 17(1):98–110, 2010). While investigators can sometimes use archived tumor gene expression data available from repositories such as the NCBI Gene Expression Omnibus to answer their questions, new arrays must often be run to adequately answer specific questions. Here, we provide a detailed description of microarray methodologies, how to select the appropriate methodology for a given question, and analytical strategies that can be used. Experimental methodology for protein microarrays is outside the scope of this chapter, but basic sample preparation techniques for transcript-based microarrays are included here. PMID:26113463

  13. Per Object statistical analysis

    DEFF Research Database (Denmark)

    2008-01-01

    This RS code is to do Object-by-Object analysis of each Object's sub-objects, e.g. statistical analysis of an object's individual image data pixels. Statistics, such as percentiles (so-called "quartiles"), are derived by the process, but the return of that can only be a Scene Variable, not an Object Variable. This procedure was developed in order to be able to export objects as ESRI shape data with the 90-percentile of the Hue of each object's pixels as an item in the shape attribute table. This procedure uses a sub-level single pixel chessboard segmentation, loops for each of the objects of a specific class in turn, and uses a pair of PPO stages to derive the statistics and then assign them to the objects' Object Variables. It may be that this could all be done in some other, simpler way, but several other ways that were tried did not succeed. The procedure output has been tested against...

  14. [Public policy analysis].

    Science.gov (United States)

    Subirats, J

    2001-01-01

    This article presents to public health professionals concepts and perspectives from political science relevant for creating a healthier public policy. Currently, there is no uniform vision of what constitutes public interest and the decisions of public administrations tend to be based on compromise. In public debate, what is paramount is the capacity to persuade. From the perspective of public policy analysis, the crucial issue is definition: the final decision depends on the definition of the problem that has emerged triumphant in the public debate among competing actors with different definitions of the problem. From a policy analysis perspective, the problems entering the agenda of public administration do not necessarily correspond to their severity, as competing actors try to impose their point of view. Because of its historical evolution, the Spanish political system has specific traits. The relatively weak democratic tradition tends to make the decision process less visible, with strong technocratic elements and weaker social articulation. Both the juridical tradition and liberal rhetoric portray lobbying as contrary to public interest, when in fact it is constantly performed by powerful vested interest groups, through both personal contacts and economic connections. Regulatory policies, with concentrated costs and diffuse benefits, seem to be moving from Spain to the European Union. To promote healthier public policies, the development of civil society initiatives and the building of coalitions will play an increasingly greater role in the future.

  15. Parametric scramjet analysis

    Science.gov (United States)

    Choi, Jongseong

    The performance of a hypersonic flight vehicle will depend on existing materials and fuels; this work presents the performance of the ideal scramjet engine for three different combustion chamber materials and three different candidate fuels. Engine performance is explored by parametric cycle analysis for the ideal scramjet as a function of material maximum service temperature and the lower heating value of jet engine fuels. The thermodynamic analysis is based on the Brayton cycle as similarly employed in describing the performance of the ramjet, turbojet, and fanjet ideal engines. The objective of this work is to explore material operating temperatures and fuel possibilities for the combustion chamber of a scramjet propulsion system to show how they relate to scramjet performance and the seven scramjet engine parameters: specific thrust, fuel-to-air ratio, thrust-specific fuel consumption, thermal efficiency, propulsive efficiency, overall efficiency, and thrust flux. This analysis has not previously been presented by others in the scientific literature. This work yields simple algebraic equations for scramjet performance which are similar to those of the ideal ramjet, ideal turbojet and ideal turbofan engines.

  16. Mathematical analysis II

    CERN Document Server

    Zorich, Vladimir A

    2016-01-01

    This second English edition of a very popular two-volume work presents a thorough first course in analysis, leading from real numbers to such advanced topics as differential forms on manifolds; asymptotic methods; Fourier, Laplace, and Legendre transforms; elliptic functions; and distributions. Especially notable in this course are the clearly expressed orientation toward the natural sciences and the informal exploration of the essence and the roots of the basic concepts and theorems of calculus. Clarity of exposition is matched by a wealth of instructive exercises, problems, and fresh applications to areas seldom touched on in textbooks on real analysis. The main difference between the second and first English editions is the addition of a series of appendices to each volume. There are six of them in the first volume and five in the second. The subjects of these appendices are diverse. They are meant to be useful to both students (in mathematics and physics) and teachers, who may be motivated by different go...

  17. Surface analysis in microelectronics.

    Science.gov (United States)

    Pignataro, S

    1995-10-01

    The contributions made by surface analysis to solving some problems encountered in the production of electronic power devices are discussed. Two main types of problems have been addressed. One of these deals with interfacial chemistry, for which three examples have been investigated. The first applies to the improvement of the quality and the reliability of plastic packages through the optimization of the resin/metal and resin/die adhesion. The second relates to the adhesion between polyimide and silicon nitride used in the multilevel technology. The third example refers to the so-called die-attach process and related problems. Another area of interest in microelectronics is that of the erosion of various types of surfaces and the possibility of wrong etching. A few examples of the application of surface analytical techniques to these problems are presented. XPS and SIMS working in imaging and multipoint analysis mode, scanning acoustic microscopy, contact angle measurements as well as peeling and tensile strength measurements are the main tools used to obtain useful data.

  18. Concept Analysis: Music Therapy.

    Science.gov (United States)

    Murrock, Carolyn J; Bekhet, Abir K

    2016-01-01

    Down through the ages, music has been universally valued for its therapeutic properties based on the psychological and physiological responses in humans. However, the underlying mechanisms of the psychological and physiological responses to music have been poorly identified and defined. Without clarification, a concept can be misused, thereby diminishing its importance for application to nursing research and practice. The purpose of this article was to clarify the concept of music therapy based on Walker and Avant's concept analysis strategy. A review of recent nursing and health-related literature covering the years 2007-2014 was performed on the concepts of music, music therapy, preferred music, and individualized music. As a result of the search, the attributes, antecedents, and consequences of music therapy were identified, defined, and used to develop a conceptual model of music therapy. The conceptual model of music therapy provides direction for developing music interventions for nursing research and practice to be tested in various settings to improve various patient outcomes. Based on Walker and Avant's concept analysis strategy, model and contrary cases are included. Implications for future nursing research and practice to use the psychological and physiological responses to music therapy are discussed.

  19. Nonlinear functional analysis

    CERN Document Server

    Deimling, Klaus

    1985-01-01

    topics. However, only a modest preliminary knowledge is needed. In the first chapter, where we introduce an important topological concept, the so-called topological degree for continuous maps from subsets of Rn into Rn, you need not know anything about functional analysis. Starting with Chapter 2, where infinite dimensions first appear, one should be familiar with the essential step of considering a sequence or a function of some sort as a point in the corresponding vector space of all such sequences or functions, whenever this abstraction is worthwhile. One should also work out the things which are proved in § 7 and accept certain basic principles of linear functional analysis quoted there for easier references, until they are applied in later chapters. In other words, even the 'completely linear' sections which we have included for your convenience serve only as a vehicle for progress in nonlinearity. Another point that makes the text introductory is the use of an essentially uniform mathematical languag...

  20. Maintenance Process Strategic Analysis

    Science.gov (United States)

    Jasiulewicz-Kaczmarek, M.; Stachowiak, A.

    2016-08-01

    The performance and competitiveness of manufacturing companies is dependent on the availability, reliability and productivity of their production facilities. Low productivity, downtime, and poor machine performance are often linked to inadequate plant maintenance, which in turn can lead to reduced production levels, increasing costs, lost market opportunities, and lower profits. These pressures have given firms worldwide the motivation to explore and embrace proactive maintenance strategies over the traditional reactive firefighting methods. The traditional view of maintenance has shifted into one of an overall view that encompasses Overall Equipment Efficiency, Stakeholders Management and Life Cycle assessment. From a practical point of view, this requires changes in the approach to maintenance represented by managers and changes in the actions performed within the maintenance area. Managers have to understand that maintenance is not only about repairs and conservation of machines and devices, but also about actions striving for more efficient resource management and care for the safety and health of employees. The purpose of the work is to present a strategic analysis based on SWOT analysis to identify the opportunities and strengths of the maintenance process, to benefit from them as much as possible, as well as to identify weaknesses and threats, so that they could be eliminated or minimized.

  1. Targeted assets risk analysis.

    Science.gov (United States)

    Bouwsema, Barry

    2013-01-01

    Risk assessments utilising the consolidated risk assessment process as described by Public Safety Canada and the Centre for Security Science utilise the five threat categories of natural, human accidental, technological, human intentional and chemical, biological, radiological, nuclear or explosive (CBRNE). The categories of human intentional and CBRNE indicate intended actions against specific targets. It is therefore necessary to be able to identify which pieces of critical infrastructure represent the likely targets of individuals with malicious intent. Using the consolidated risk assessment process and the target capabilities list, coupled with the CARVER methodology and a security vulnerability analysis, it is possible to identify these targeted assets and their weaknesses. This process can help emergency managers to identify where resources should be allocated and funding spent. Targeted Assets Risk Analysis (TARA) presents a new opportunity to improve how risk is measured, monitored, managed and minimised through the four phases of emergency management, namely, prevention, preparation, response and recovery. To reduce risk throughout Canada, Defence Research and Development Canada is interested in researching the potential benefits of a comprehensive approach to risk assessment and management. The TARA provides a framework against which potential human intentional threats can be measured and quantified, thereby improving safety for all Canadians.

  2. SCUPOL Magnetic Field Analysis

    CERN Document Server

    Poidevin, Frederick; Kowal, Grzegorz; Pino, Elisabete de Gouveia Dal; Magalhaes, Antonio-Mario

    2013-01-01

    We present an extensive analysis of the 850 microns polarization maps of the SCUPOL Catalog produced by Matthews et al. (2009), focusing exclusively on the molecular clouds and star-forming regions. For the sufficiently sampled regions, we characterize the depolarization properties and the turbulent-to-mean magnetic field ratio of each region. Similar sets of parameters are calculated from 2D synthetic maps of dust emission polarization produced with 3D MHD numerical simulations scaled to the S106, OMC-2/3, W49 and DR21 molecular clouds polarization maps. For these specific regions the turbulent MHD regimes retrieved from the simulations, as described by the turbulent Alfvén and sonic Mach numbers, are consistent within a factor 1 to 2 with the values of the same turbulent regimes estimated from the analysis of Zeeman measurements data provided by Crutcher (1999). Constraints on the values of the inclination angle of the mean magnetic field with respect to the LOS are also given. The values obtained from th...

  3. Exploratory and multivariate data analysis

    CERN Document Server

    Jambu, Michel

    1991-01-01

    With a useful index of notations at the beginning, this book explains and illustrates the theory and application of data analysis methods from univariate to multidimensional and how to learn and use them efficiently. This book is well illustrated and is a useful and well-documented review of the most important data analysis techniques. Key features: it describes, in detail, exploratory data analysis techniques from the univariate to the multivariate ones, and features a complete description of correspondence analysis and factor analysis techniques as multidimensional statistical data a

  4. Survival analysis models and applications

    CERN Document Server

    Liu, Xian

    2012-01-01

    Survival analysis concerns sequential occurrences of events governed by probabilistic laws. Recent decades have witnessed many applications of survival analysis in various disciplines. This book introduces both classic survival models and theories along with newly developed techniques. Readers will learn how to perform analysis of survival data by following numerous empirical illustrations in SAS. Survival Analysis: Models and Applications: Presents basic techniques before leading onto some of the most advanced topics in survival analysis. Assumes only a minimal knowledge of SAS whilst enablin
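
    The book's own examples use SAS; as a language-neutral sketch of the core calculation, the snippet below implements a minimal Kaplan-Meier product-limit estimator on made-up follow-up data (times in arbitrary units, 1 = event observed, 0 = right-censored).

      # Minimal Kaplan-Meier estimator on hypothetical data.
      import numpy as np

      def kaplan_meier(times, events):
          """times: follow-up times; events: 1 = event observed, 0 = right-censored."""
          times, events = np.asarray(times, float), np.asarray(events, int)
          order = np.argsort(times)
          times, events = times[order], events[order]
          at_risk = len(times)
          surv, curve = 1.0, []
          for t in np.unique(times):
              mask = times == t
              d = events[mask].sum()               # events at time t
              if d > 0:
                  surv *= 1.0 - d / at_risk        # product-limit step
                  curve.append((t, surv))
              at_risk -= mask.sum()                # remove events and censored subjects
          return curve

      t = [5, 6, 6, 8, 10, 12, 15, 15, 17, 20]
      e = [1, 1, 0, 1,  0,  1,  1,  0,  1,  0]
      for time, s in kaplan_meier(t, e):
          print(f"t = {time:>4.0f}   S(t) = {s:.3f}")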

  5. Data-variant kernel analysis

    CERN Document Server

    Motai, Yuichi

    2015-01-01

    Describes and discusses the variants of kernel analysis methods for data types that have been intensely studied in recent years This book covers kernel analysis topics ranging from the fundamental theory of kernel functions to its applications. The book surveys the current status, popular trends, and developments in kernel analysis studies. The author discusses multiple kernel learning algorithms and how to choose the appropriate kernels during the learning phase. Data-Variant Kernel Analysis is a new pattern analysis framework for different types of data configurations. The chapters include

  6. Elements of stock market analysis

    Directory of Open Access Journals (Sweden)

    Suciu, T.

    2013-12-01

    Full Text Available The paper represents a starting point in the presentation of the two types of stock market analysis: fundamental analysis and technical analysis. Fundamental analysis consists of assessing the financial and economic status of the company together with the context and macroeconomic environment in which it operates. Technical analysis deals with the demand and supply of securities and the evolution of their trend on the market, using a range of graphs and charts to illustrate market tendencies and to quickly identify the best moments to buy or sell.

  7. Real analysis with economic applications

    CERN Document Server

    Ok, Efe A

    2011-01-01

    There are many mathematics textbooks on real analysis, but they focus on topics not readily helpful for studying economic theory or they are inaccessible to most graduate students of economics. Real Analysis with Economic Applications aims to fill this gap by providing an ideal textbook and reference on real analysis tailored specifically to the concerns of such students. The emphasis throughout is on topics directly relevant to economic theory. In addition to addressing the usual topics of real analysis, this book discusses the elements of order theory, convex analysis, optimizatio

  8. Strategic Management and Business Analysis

    CERN Document Server

    Williamson, David; Jenkins, Wyn; Moreton, Keith Michael

    2003-01-01

    Strategic Business Analysis shows students how to carry out a strategic analysis of a business, with clear guidelines on where and how to apply the core strategic techniques and models that are the integral tools of strategic management. The authors identify the key questions in strategic analysis and provide an understandable framework for answering these questions. Several case studies are used to focus understanding and enable a more thorough analysis of the concepts and issues, especially useful for students involved with case study analysis. Accompanying the text is a CD-Rom containing the m

  9. Fixed effects analysis of variance

    CERN Document Server

    Fisher, Lloyd; Birnbaum, Z W; Lukacs, E

    1978-01-01

    Fixed Effects Analysis of Variance covers the mathematical theory of the fixed effects analysis of variance. The book discusses the theoretical ideas and some applications of the analysis of variance. The text then describes topics such as the t-test; two-sample t-test; the k-sample comparison of means (one-way analysis of variance); the balanced two-way factorial design without interaction; estimation and factorial designs; and the Latin square. Confidence sets, simultaneous confidence intervals, and multiple comparisons; orthogonal and nonorthogonal designs; and multiple regression analysi
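
    Since the blurb centres on the k-sample comparison of means, a minimal self-contained one-way ANOVA example may help; the data are illustrative, and scipy's f_oneway is shown alongside the hand computation.

    ```python
    # Minimal one-way (fixed effects) analysis of variance for k samples,
    # using illustrative data; scipy returns the F statistic and p-value.
    import numpy as np
    from scipy import stats

    group_a = np.array([4.1, 5.0, 4.8, 5.3, 4.6])
    group_b = np.array([5.9, 6.1, 5.7, 6.4, 6.0])
    group_c = np.array([4.9, 5.2, 5.1, 5.5, 4.7])

    f_stat, p_value = stats.f_oneway(group_a, group_b, group_c)
    print(f"F = {f_stat:.2f}, p = {p_value:.4f}")

    # The same F statistic from the between/within sum-of-squares decomposition:
    groups = [group_a, group_b, group_c]
    grand = np.mean(np.concatenate(groups))
    ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    df_b, df_w = len(groups) - 1, sum(len(g) for g in groups) - len(groups)
    print(f"F (by hand) = {(ss_between / df_b) / (ss_within / df_w):.2f}")
    ```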

  10. Gait Analysis Using Wearable Sensors

    Directory of Open Access Journals (Sweden)

    Hutian Feng

    2012-02-01

    Full Text Available Gait analysis using wearable sensors is an inexpensive, convenient, and efficient manner of providing useful information for multiple health-related applications. As a clinical tool applied in the rehabilitation and diagnosis of medical conditions and sport activities, gait analysis using wearable sensors shows great prospects. The current paper reviews available wearable sensors and ambulatory gait analysis methods based on the various wearable sensors. After an introduction of the gait phases, the principles and features of wearable sensors used in gait analysis are provided. The gait analysis methods based on wearable sensors are divided into gait kinematics, gait kinetics, and electromyography. Studies on the current methods are reviewed, and applications in sports, rehabilitation, and clinical diagnosis are summarized separately. With the development of sensor technology and analysis methods, gait analysis using wearable sensors is expected to play an increasingly important role in clinical applications.

  11. Gait analysis using wearable sensors.

    Science.gov (United States)

    Tao, Weijun; Liu, Tao; Zheng, Rencheng; Feng, Hutian

    2012-01-01

    Gait analysis using wearable sensors is an inexpensive, convenient, and efficient manner of providing useful information for multiple health-related applications. As a clinical tool applied in the rehabilitation and diagnosis of medical conditions and sport activities, gait analysis using wearable sensors shows great prospects. The current paper reviews available wearable sensors and ambulatory gait analysis methods based on the various wearable sensors. After an introduction of the gait phases, the principles and features of wearable sensors used in gait analysis are provided. The gait analysis methods based on wearable sensors are divided into gait kinematics, gait kinetics, and electromyography. Studies on the current methods are reviewed, and applications in sports, rehabilitation, and clinical diagnosis are summarized separately. With the development of sensor technology and analysis methods, gait analysis using wearable sensors is expected to play an increasingly important role in clinical applications.

  12. Model Checking as Static Analysis

    DEFF Research Database (Denmark)

    Zhang, Fuyuan

    Both model checking and static analysis are prominent approaches to detecting software errors. Model checking is a successful formal method for verifying properties specified in temporal logics with respect to transition systems. Static analysis is also a powerful method for validating program...... properties which can predict safe approximations to program behaviors. In this thesis, we have developed several static analysis based techniques to solve model checking problems, aiming at showing the link between static analysis and model checking. We focus on logical approaches to static analysis......-calculus can be encoded as the intended model of SFP. Our research results have strengthened the link between model checking and static analysis. This provides a theoretical foundation for developing a unified tool for both model checking and static analysis techniques....

  13. Direct olive oil analysis

    Directory of Open Access Journals (Sweden)

    Peña, F.

    2002-03-01

    Full Text Available The practical impact of "direct analysis" is undeniable, as it strongly contributes to enhancing the so-called productive analytical features such as expeditiousness, reduction of costs and minimisation of risks for the analysts and the environment. The main objective is to establish a reliable bypass to the conventional preliminary operations of the analytical process. This paper offers a systematic approach in this context and emphasises the great field of action of direct methodologies in the routine analysis of olive oil. Two main types of methodologies are considered. On the one hand, the direct determination of volatile components is systematically considered. On the other hand, simple procedures to automatically implement the preliminary operations of the oil analysis using simple devices, in which the sample is directly introduced with or without a simple dilution, are presented and discussed.

  14. Nonlinear Analysis of Buckling

    Directory of Open Access Journals (Sweden)

    Psotný Martin

    2014-06-01

    Full Text Available The stability analysis of a slender web loaded in compression is presented. To solve this problem, a specialized computer program based on the FEM was created. The nonlinear finite element method equations were derived from the variational principle of minimum total potential energy. To obtain the nonlinear equilibrium paths, the Newton-Raphson iteration algorithm was used. Corresponding levels of the total potential energy were defined. The effects of initial imperfections were investigated, with special attention focused on the influence of imperfections on the post-critical buckling mode. The stable and unstable paths of the nonlinear solution were separated. The results obtained were compared with those gained using the ANSYS system.
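
    For readers unfamiliar with the solution strategy, the sketch below shows the load-stepping Newton-Raphson idea on a one-degree-of-freedom nonlinear spring; the spring law is a made-up stand-in, not the paper's finite element model of the slender web.

    ```python
    # Minimal sketch of tracing a nonlinear equilibrium path with load
    # stepping and Newton-Raphson iteration, for a one-DOF model problem.
    import numpy as np

    def internal_force(u):          # hypothetical nonlinear spring
        return 10.0 * u - 4.0 * u**2 + 0.8 * u**3

    def tangent_stiffness(u):       # derivative of the internal force
        return 10.0 - 8.0 * u + 2.4 * u**2

    u = 0.0
    path = []
    for lam in np.linspace(0.0, 1.0, 21):      # load factor steps
        load = lam * 6.0                        # external reference load
        for _ in range(30):                     # Newton-Raphson iterations
            residual = internal_force(u) - load
            if abs(residual) < 1e-10:
                break
            u -= residual / tangent_stiffness(u)
        path.append((lam, u))

    for lam, disp in path[::5]:
        print(f"load factor {lam:.2f} -> displacement {disp:.4f}")
    ```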

  15. Certification/enforcement analysis

    Energy Technology Data Exchange (ETDEWEB)

    None

    1980-06-01

    Industry compliance with minimum energy efficiency standards will be assured through a two-part program approach of certification and enforcement activities. The technical support document (TSD) presents the analyses upon which the proposed rule for assuring that consumer products comply with applicable energy efficiency standards is based. Much of the TSD is based upon support provided to DOE by Vitro Laboratories. The OAO Corporation provided additional support in the development of the sampling plan incorporated in the proposed rule. Vitro's recommended approach to appliance certification and enforcement, developed after consideration of various program options, benefits, and impacts, establishes the C/E program framework, general criteria, and procedures for assuring a specified level of energy efficiency performance of covered consumer products. The results of the OAO analysis are given in Volume II of the TSD.

  16. Cognitive Component Analysis

    DEFF Research Database (Denmark)

    Feng, Ling

    2008-01-01

    of audio contexts along with pattern recognition methods to map components to known contexts. It also involves looking for the right representations for auditory inputs, i.e. the data analytic processing pipelines invoked by human brains. The main ideas refer to Cognitive Component Analysis, defined...... as the process of unsupervised grouping of generic data such that the ensuing group structure is well-aligned with that resulting from human cognitive activity. Its hypothesis runs ecologically: features which are essentially independent in a context-defined ensemble can be efficiently coded as sparse......-documented that unsupervised learning discovers statistical regularities. However, human cognition is too complicated and not yet fully understood. Nevertheless, in our approach we represent human cognitive processes as a classification rule in supervised learning. Thus we have devised a testable protocol to test...

  17. Multimedia content analysis

    CERN Document Server

    Ohm, Jens

    2016-01-01

    This textbook covers the theoretical backgrounds and practical aspects of image, video and audio feature expression, e.g., color, texture, edge, shape, salient point and area, motion, 3D structure, audio/sound in time, frequency and cepstral domains, structure and melody. Up-to-date algorithms for estimation, search, classification and compact expression of feature data are described in detail. Concepts of signal decomposition (such as segmentation, source tracking and separation), as well as composition, mixing, effects, and rendering, are discussed. Numerous figures and examples help to illustrate the aspects covered. The book was developed on the basis of a graduate-level university course, and most chapters are supplemented by problem-solving exercises. The book is also a self-contained introduction both for researchers and developers of multimedia content analysis systems in industry.

  18. Policy Analysis Reaches Midlife

    Directory of Open Access Journals (Sweden)

    Beryl A. Radin

    2013-07-01

    Full Text Available The field of policy analysis that exists in the 21st century is quite different from that found in earlier phases. The world of the 1960s that gave rise to this field in the US often seems unrelated to the world we experience today. These shifts have occurred as a result of a range of developments – technological changes, changes in the structure and processes of government both internally and globally, new expectations about accountability and transparency, economic and fiscal problems, and increased political and ideological conflict. It is clear that globalization has had a significant impact on the field. Shifts in the type of decisionmaking have also created challenges for policy analysts, since analysts are now found in every nook and cranny of the decisionmaking world. Thus it is relevant to look at the work that they do, the skills that they require, and the background experience that is relevant to them.

  19. Computational Analysis of Behavior.

    Science.gov (United States)

    Egnor, S E Roian; Branson, Kristin

    2016-07-01

    In this review, we discuss the emerging field of computational behavioral analysis-the use of modern methods from computer science and engineering to quantitatively measure animal behavior. We discuss aspects of experiment design important to both obtaining biologically relevant behavioral data and enabling the use of machine vision and learning techniques for automation. These two goals are often in conflict. Restraining or restricting the environment of the animal can simplify automatic behavior quantification, but it can also degrade the quality or alter important aspects of behavior. To enable biologists to design experiments to obtain better behavioral measurements, and computer scientists to pinpoint fruitful directions for algorithm improvement, we review known effects of artificial manipulation of the animal on behavior. We also review machine vision and learning techniques for tracking, feature extraction, automated behavior classification, and automated behavior discovery, the assumptions they make, and the types of data they work best with.

  20. NEO Test Stand Analysis

    Science.gov (United States)

    Pike, Cody J.

    2015-01-01

    A project within SwampWorks is building a test stand to hold regolith in order to study how dust is ejected when exposed to the hot exhaust plume of a rocket engine. The test stand needed to be analyzed and finalized, and fabrication drawings generated, to move forward. Modifications of the test stand assembly were made with Creo 2 modeling software. Structural analysis calculations were developed by hand to confirm that the structure will hold the expected loads while optimizing support positions. Iterating these calculations in MATLAB showed the optimized position of the vertical support to be 98'' from the far end of the stand. All remaining deflections were shown to be under the 0.6'' requirement, and internal stresses were shown to meet NASA Ground Support Equipment (GSE) Safety Standards. At the time of writing, fabrication drawings had yet to be generated but were expected shortly.
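
    The optimization described here is essentially a parameter sweep against a deflection limit. The sketch below shows that pattern in Python with a made-up deflection model; the function, numbers and limit handling are stand-ins, not the actual test stand calculations.

    ```python
    # Generic sketch of the optimization described above: sweep the vertical
    # support position and keep the one that minimizes peak deflection while
    # respecting the 0.6-inch limit.  The deflection model is a hypothetical
    # smooth function of position, not the hand calculations for the stand.
    import numpy as np

    LIMIT_IN = 0.6            # allowable deflection (inches)

    def peak_deflection(support_pos_in):
        """Hypothetical peak deflection (inches) vs. support position."""
        x = support_pos_in
        return 0.2 + 0.00004 * (x - 98.0) ** 2   # minimum placed near 98 in.

    positions = np.arange(60.0, 140.0, 0.5)
    deflections = np.array([peak_deflection(x) for x in positions])

    best = positions[np.argmin(deflections)]
    status = "OK" if deflections.min() < LIMIT_IN else "exceeds limit"
    print(f"best support position ~ {best:.1f} in, "
          f"deflection {deflections.min():.3f} in ({status})")
    ```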

  1. Diabetic Retinopathy Analysis

    Directory of Open Access Journals (Sweden)

    Sivakumar R.

    2005-01-01

    Full Text Available Diabetic retinopathy is one of the common complications of diabetes. Unfortunately, in many cases the patient is not aware of any symptoms until it is too late for effective treatment. Through analysis of the evoked potential response of the retina, the optic nerve, and the optical brain center, a way will be paved for early diagnosis of diabetic retinopathy and prognosis during the treatment process. In this paper, we present an artificial-neural-network-based method to classify diabetic retinopathy subjects according to changes in visual evoked potential spectral components, and an anatomically realistic computer model of the human eye under normal and retinopathy conditions in a virtual environment, built using 3D Max Studio and Windows Movie Maker.

  2. Thermal analysis of peat

    Energy Technology Data Exchange (ETDEWEB)

    Bergner, K.; Albano, C. (Swedish University of Agricultural Science, Umea (Sweden))

    1993-02-01

    Thermal analysis has been performed on samples of plants, peat, chemical fractions of peat, and coal. The simultaneous thermogravimetry (TG) and differential scanning calorimetry (DSC) technique has proved useful in classifying and separating the samples. Because of probable redundant information in the TG and DSC signals, the sampling frequency has been investigated. Quantitative predictions of 15 chemical and physical constituents in peat are performed using partial least squares regression (PLSR). Prediction properties are compared with near infrared reflectance spectroscopy (NIR), which shows that TG/DSC and NIR are comparable in predictability of the investigated constituents. Using the simultaneous TG and DSC signals in predictions, compared with using TG or DSC separately, shows that the combination increases predictability, as indicated by standard error of prediction (SEP) values.
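
    As a small illustration of the SEP metric mentioned at the end, the following sketch computes the bias-corrected standard error of prediction for a set of reference values and model predictions; the numbers are synthetic, not the peat data.

    ```python
    # Sketch of the standard error of prediction (SEP) as commonly defined in
    # chemometrics: the bias-corrected standard deviation of the residuals
    # between reference values and PLSR (or NIR) predictions.
    import numpy as np

    def sep(y_ref, y_pred):
        e = np.asarray(y_ref) - np.asarray(y_pred)
        bias = e.mean()
        return np.sqrt(((e - bias) ** 2).sum() / (len(e) - 1)), bias

    rng = np.random.default_rng(1)
    y_ref = rng.uniform(10, 60, size=40)               # e.g. % of a constituent
    y_pred = y_ref + rng.normal(0.5, 2.0, size=40)     # predictions with bias + noise

    sep_value, bias = sep(y_ref, y_pred)
    print(f"SEP = {sep_value:.2f}, bias = {bias:.2f}")
    ```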

  3. SDI CFD MODELING ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Lee, S.

    2011-05-05

    The Savannah River Remediation (SRR) Organization requested that Savannah River National Laboratory (SRNL) develop a Computational Fluid Dynamics (CFD) method to mix and blend the miscible contents of the blend tanks to ensure the contents are properly blended before they are transferred from the blend tank, such as Tank 50H, to the Salt Waste Processing Facility (SWPF) feed tank. The work described here consists of two modeling areas: the mixing modeling analysis during the miscible liquid blending operation, and the flow pattern analysis during the transfer operation of the blended liquid. The transient CFD governing equations, consisting of three momentum equations, one mass balance, two turbulence transport equations for kinetic energy and dissipation rate, and one species transport equation, were solved by an iterative technique until the species concentrations of the tank fluid were in equilibrium. The steady-state flow solutions for the entire tank fluid were used for flow pattern analysis, for velocity scaling analysis, and as the initial conditions for transient blending calculations. A series of modeling calculations was performed to estimate the blending times for various jet flow conditions, and to investigate the impact of the cooling coils on the blending time of the tank contents. The modeling results were benchmarked against the pilot scale test results. All of the flow and mixing models were performed with the nozzles installed at the mid-elevation, and parallel to the tank wall. From the CFD modeling calculations, the main results are summarized as follows: (1) The benchmark analyses for the CFD flow velocity and blending models demonstrate their consistency with Engineering Development Laboratory (EDL) and literature test results in terms of local velocity measurements and experimental observations. Thus, an application of the established criterion to the SRS full-scale tank will provide a better, physically-based estimate of the required mixing time, and
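
    A common way to turn species-concentration histories into a blending time is a tolerance criterion. The sketch below assumes a +/-5 % band around the well-mixed value and synthetic probe signals; it is an illustrative post-processing step, not taken from the SRNL models.

    ```python
    # Post-processing sketch of a blending-time criterion: given concentration
    # histories at monitoring probes, the blending time is the first time after
    # which every probe stays within +/- 5 % of the final well-mixed value.
    import numpy as np

    def blending_time(t, probes, c_final, tol=0.05):
        probes = np.atleast_2d(probes)
        within = np.all(np.abs(probes - c_final) <= tol * c_final, axis=0)
        for i in range(len(t)):                 # first index after which the
            if within[i:].all():                # criterion holds permanently
                return t[i]
        return None

    t = np.linspace(0.0, 600.0, 601)            # seconds
    c_final = 1.0
    # two synthetic probe responses approaching the well-mixed value
    probe1 = c_final * (1.0 - np.exp(-t / 90.0))
    probe2 = c_final * (1.0 - np.exp(-t / 140.0) * np.cos(t / 50.0))

    tb = blending_time(t, [probe1, probe2], c_final)
    print(f"estimated blending time ~ {tb:.0f} s")
    ```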

  4. Bite Mark Analysis

    Directory of Open Access Journals (Sweden)

    SK Padmakumar

    2014-07-01

    Full Text Available Bite mark analysis plays an important role in personal identification in forensic odontology. Bite marks are commonly seen in violent crimes such as sexual assaults, homicides, child abuse, etc. Human bites are common on the face and are usually seen on prominent locations such as the ears, nose and lips. Individual characteristics recorded in the bite marks, such as fractures, rotations, attrition, and congenital malformations, are helpful in identifying the individual who caused them. We report the case of a 55-year-old lady with bite marks on her left ear, who was allegedly assaulted by the suspect. On the basis of characteristic features of the suspect’s dentition, it was concluded that the bite marks seen on the victim were most probably caused by the suspect.

  5. Capacitor discharge pulse analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Baker, Michael Sean; Griffiths, Stewart K.; Tanner, Danelle Mary

    2013-08-01

    Capacitors used in firing sets and other high discharge current applications are discharge tested to verify performance of the capacitor against the application requirements. Parameters such as capacitance, inductance, rise time, pulse width, peak current and current reversal must be verified to ensure that the capacitor will meet the application needs. This report summarizes an analysis performed on the discharge current data to extract these parameters by fitting a second-order system model to the discharge data and using this fit to determine the resulting performance metrics. Details of the theory and implementation are presented. Using the best-fit second-order system model to extract these metrics results in less sensitivity to noise in the measured data and allows for direct extraction of the total series resistance, inductance, and capacitance.
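
    A minimal sketch of the fitting idea described above, assuming an underdamped series-RLC model and a known initial capacitor voltage; the trace, noise level and initial guesses are synthetic stand-ins, not the report's data or code.

    ```python
    # Fit an underdamped series-RLC discharge current and recover R, L, C.
    import numpy as np
    from scipy.optimize import curve_fit

    V0 = 2000.0                                   # assumed charge voltage (V)

    def rlc_current(t, i0, alpha, omega_d):
        """i(t) = I0 * exp(-alpha t) * sin(omega_d t) for an underdamped series RLC."""
        return i0 * np.exp(-alpha * t) * np.sin(omega_d * t)

    # synthetic "measured" discharge trace
    R_true, L_true, C_true = 0.05, 2e-6, 100e-6
    alpha_t = R_true / (2 * L_true)
    omega_d_t = np.sqrt(1 / (L_true * C_true) - alpha_t**2)
    t = np.linspace(0, 200e-6, 2000)
    rng = np.random.default_rng(2)
    i_meas = rlc_current(t, V0 / (omega_d_t * L_true), alpha_t, omega_d_t)
    i_meas += rng.normal(0, 200.0, size=t.size)

    # rough initial guesses (in practice, e.g. from peak amplitude and
    # zero-crossing spacing of the measured trace)
    popt, _ = curve_fit(rlc_current, t, i_meas, p0=(1.5e4, 1e4, 7e4))
    i0, alpha, omega_d = popt

    omega_0_sq = omega_d**2 + alpha**2
    L = V0 / (i0 * omega_d)          # from I0 = V0 / (omega_d * L)
    R = 2 * alpha * L
    C = 1 / (omega_0_sq * L)
    print(f"R ~ {R*1e3:.1f} mOhm, L ~ {L*1e9:.0f} nH, C ~ {C*1e6:.0f} uF")
    ```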

  6. Sandia PUF Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2014-06-11

    This program is a graphical user interface for measuring and performing interactive analysis of physical unclonable functions (PUFs). It is intended for demonstration and education purposes. See license.txt for license details. The program features a PUF visualization that demonstrates how signatures differ between PUFs and how they exhibit noise over repeated measurements. A similarity scoreboard shows the user how close the current measurement is to the closest chip signatures in the database. Other metrics such as average noise and inter-chip Hamming distances are presented to the user. Randomness tests published in NIST SP 800-22 can be computed and displayed. Noise and inter-chip histograms for the sample of PUFs and repeated PUF measurements can be drawn.
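
    The noise and inter-chip metrics mentioned above reduce to fractional Hamming distances between response bit-strings. A tiny illustration with randomly generated stand-in responses follows; it is not output of, or code from, the Sandia tool.

    ```python
    # Fractional Hamming distance between PUF response bit-strings, used for
    # both intra-chip (noise) and inter-chip comparisons.
    import numpy as np

    def fractional_hamming(a, b):
        a, b = np.asarray(a, dtype=np.uint8), np.asarray(b, dtype=np.uint8)
        return np.count_nonzero(a != b) / a.size

    rng = np.random.default_rng(3)
    n_bits = 256

    chip_a = rng.integers(0, 2, size=n_bits, dtype=np.uint8)
    chip_b = rng.integers(0, 2, size=n_bits, dtype=np.uint8)

    # noisy re-measurement of chip A: flip ~3 % of the bits
    flips = (rng.random(n_bits) < 0.03).astype(np.uint8)
    chip_a_remeasured = chip_a ^ flips

    print(f"intra-chip (noise) HD : {fractional_hamming(chip_a, chip_a_remeasured):.3f}")
    print(f"inter-chip HD         : {fractional_hamming(chip_a, chip_b):.3f}")
    ```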

  7. Exploratory orbit analysis

    Energy Technology Data Exchange (ETDEWEB)

    Michelotti, L.

    1989-03-01

    Unlike the other documents in these proceedings, this paper is neither a scientific nor a technical report. It is, rather, a short personal essay which attempts to describe an Exploratory Orbit Analysis (EOA) environment. Analyzing the behavior of a four or six dimensional nonlinear dynamical system is at least as difficult as analyzing events in high-energy collisions; the consequences of doing it badly, or slowly, would be at least as devastating; and yet the level of effort and expenditure invested in the latter, the very attention paid to it by physicists at large, must be two orders of magnitude greater than that given to the former. It is difficult to choose the model which best explains the behavior of a physical device if one does not first understand the behavior of the available models. The time is ripe for the development of a functioning EOA environment, which I will try to describe in this paper to help us achieve this goal.

  8. Spectrophotometric Analysis of Caffeine

    Directory of Open Access Journals (Sweden)

    Showkat Ahmad Bhawani

    2015-01-01

    Full Text Available Caffeine is a bitter white crystalline alkaloid. It is a common ingredient in a variety of drinks (soft and energy drinks) and is also used in combination with various medicines. In order to maintain the optimum level of caffeine, various spectrophotometric methods have been developed. The monitoring of caffeine is an important aspect because consumption in higher doses can lead to various physiological disorders. This paper covers various spectrophotometric methods used in the analysis of caffeine in samples such as pharmaceuticals, soft and energy drinks, tea, and coffee. A range of spectrophotometric methodologies, including chemometric techniques and derivatization of spectra, have been used to analyse caffeine.
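
    Most of the reviewed methods ultimately rest on a Beer-Lambert calibration. A minimal sketch with illustrative standards follows; the wavelength and all values are assumptions, not taken from any cited method.

    ```python
    # Beer-Lambert calibration line (absorbance vs. concentration at a fixed
    # wavelength, e.g. around 273 nm for caffeine), fitted to standards and
    # then inverted for an unknown sample.
    import numpy as np

    conc_std = np.array([2.0, 4.0, 6.0, 8.0, 10.0])       # mg/L standards
    abs_std = np.array([0.11, 0.21, 0.32, 0.42, 0.53])    # measured absorbance

    slope, intercept = np.polyfit(conc_std, abs_std, 1)    # A = slope*c + intercept

    a_sample = 0.37                                        # absorbance of unknown
    c_sample = (a_sample - intercept) / slope
    print(f"estimated caffeine concentration ~ {c_sample:.2f} mg/L")
    ```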

  9. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew...... characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a lookout etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds...... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving...
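
    A hedged sketch of the two-step structure such models typically take (geometric collision candidates multiplied by a causation factor) is given below; the lateral distributions, ship breadths, meeting count and causation probability are assumed illustrative values, not those derived in the report.

    ```python
    # Two-step collision frequency sketch: a geometric probability that two
    # meeting ships are on collision course, times a causation factor for
    # failing to resolve the situation.
    import numpy as np
    from scipy.stats import norm

    # lateral position of each ship in its lane, modelled as normal (metres)
    mu1, sigma1 = 0.0, 150.0       # own ship mean offset / spread
    mu2, sigma2 = 300.0, 200.0     # opposing ship, displaced lane
    b1, b2 = 30.0, 25.0            # ship breadths (m)

    # the difference of lateral positions is normal; collision course if the
    # separation is smaller than the mean breadth of the two ships
    d_mu = mu1 - mu2
    d_sigma = np.hypot(sigma1, sigma2)
    half_width = 0.5 * (b1 + b2)
    p_geom = norm.cdf(half_width, d_mu, d_sigma) - norm.cdf(-half_width, d_mu, d_sigma)

    n_meetings_per_year = 5000.0   # assumed number of meeting situations
    p_causation = 5e-5             # assumed probability of failing to react

    collisions_per_year = n_meetings_per_year * p_geom * p_causation
    print(f"P(geometric) = {p_geom:.3e}, expected collisions/yr = {collisions_per_year:.2e}")
    ```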

  10. Accident Tolerant Fuel Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Curtis Smith; Heather Chichester; Jesse Johns; Melissa Teague; Michael Tonks; Robert Youngblood

    2014-09-01

    Safety is central to the design, licensing, operation, and economics of Nuclear Power Plants (NPPs). Consequently, the ability to better characterize and quantify safety margin holds the key to improved decision making about light water reactor design, operation, and plant life extension. A systematic approach to characterization of safety margins and the subsequent margins management options represents a vital input to the licensee and regulatory analysis and decision making that will be involved. The purpose of the Risk Informed Safety Margin Characterization (RISMC) Pathway research and development (R&D) is to support plant decisions for risk-informed margins management by improving economics and reliability, and sustaining safety, of current NPPs. Goals of the RISMC Pathway are twofold: (1) Develop and demonstrate a risk-assessment method coupled to safety margin quantification that can be used by NPP decision makers as part of their margin recovery strategies. (2) Create an advanced “RISMC toolkit” that enables more accurate representation of NPP safety margin. In order to carry out the R&D needed for the Pathway, the Idaho National Laboratory is performing a series of case studies that will explore methods- and tools-development issues, in addition to being of current interest in their own right. One such study is a comparative analysis of safety margins of plants using different fuel cladding types: specifically, a comparison between current-technology Zircaloy cladding and a notional “accident-tolerant” (e.g., SiC-based) cladding. The present report begins the process of applying capabilities that are still under development to the problem of assessing new fuel designs. The approach and lessons learned from this case study will be included in future Technical Basis Guides produced by the RISMC Pathway. These guides will be the mechanism for developing the specifications for RISMC tools and for defining how plant decision makers should propose and

  11. Dynamic Analysis of Shells

    Directory of Open Access Journals (Sweden)

    Charles R. Steele

    1995-01-01

    Full Text Available Shell structures are indispensable in virtually every industry. However, in the design, analysis, fabrication, and maintenance of such structures, there are many pitfalls leading to various forms of disaster. The experience gained by engineers over some 200 years of disasters and brushes with disaster is expressed in the extensive archival literature, national codes, and procedural documentation found in larger companies. However, the advantage of the richness in the behavior of shells is that the way is always open for innovation. In this survey, we present a broad overview of the dynamic response of shell structures. The intention is to provide an understanding of the basic themes behind the detailed codes and stimulate, not restrict, positive innovation. Such understanding is also crucial for the correct computation of shell structures by any computer code. The physics dictates that the thin shell structure offers a challenge for analysis and computation. Shell response can be generally categorized by states of extension, inextensional bending, edge bending, and edge transverse shear. Simple estimates for the magnitudes of stress, deformation, and resonance in the extensional and inextensional states are provided by ring response. Several shell examples demonstrate the different states and combinations. For excitation frequency above the extensional resonance, such as in impact and acoustic excitation, a fine mesh is needed over the entire shell surface. For this range, modal and implicit methods are of limited value. The example of a sphere impacting a rigid surface shows that plastic unloading occurs continuously. Thus, there are no short cuts; the complete material behavior must be included.
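
    As an example of the "simple estimates provided by ring response", the sketch below evaluates commonly quoted thin-ring frequency formulas for an illustrative steel ring; the formulas and section properties are treated as assumptions here, not as the survey's own results.

    ```python
    # Order-of-magnitude ring estimates: extensional "breathing" frequency and
    # the lowest inextensional (bending) modes of a thin circular ring.
    import numpy as np

    E = 200e9          # Young's modulus, steel (Pa)
    rho = 7800.0       # density (kg/m^3)
    R = 0.5            # ring radius (m)
    h = 0.005          # wall thickness (m), rectangular section assumed

    c = np.sqrt(E / rho)                       # bar wave speed
    f_ext = c / (2 * np.pi * R)                # extensional (breathing) mode
    print(f"extensional ring frequency ~ {f_ext:.0f} Hz")

    # inextensional (bending) modes, n = 2, 3, 4, using I/A = h^2/12
    for n in (2, 3, 4):
        omega_n = (n * (n**2 - 1) / np.sqrt(n**2 + 1)) * np.sqrt(E * h**2 / (12 * rho * R**4))
        print(f"inextensional mode n={n}: ~ {omega_n / (2 * np.pi):.0f} Hz")
    ```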

  12. IDEAL trial: economic analysis

    Directory of Open Access Journals (Sweden)

    Simona de Portu

    2008-01-01

    Full Text Available Introduction: the IDEAL (“High-dose atorvastatin vs usual-dose simvastatin for secondary prevention after myocardial infarction”) study was carried out to compare intensive lowering of low-density lipoprotein (LDL) cholesterol using the highest recommended dose of atorvastatin (80 mg) with simvastatin (20 mg). Aim: our aim was to investigate the economic consequences of a high dose of atorvastatin vs the usual dose of simvastatin in reducing major coronary events in patients with a history of acute myocardial infarction (AMI). Methods: the analysis is based on clinical outcome data from the IDEAL study. We conducted a cost-effectiveness analysis comparing a high dose of atorvastatin (80 mg/die) versus the usual dose of simvastatin (20 mg/die) from the perspective of the Italian National Health Service. We identified and quantified medical costs: drug costs according to the Italian National Therapeutic Formulary, and hospitalizations quantified on the basis of the Italian National Health Service tariffs (2008). Effects were measured in terms of morbidity reduction (frequency of hospitalizations). We considered an observation period of 4.8 years. The costs borne after the first 12 months were discounted using an annual rate of 3%. We conducted one- and multi-way sensitivity analyses on unit costs and effectiveness. Results: the cost of atorvastatin therapy over the 4.8-year period amounted to approximately 2.4 million euro per 1,000 patients. The total cost of high-dose atorvastatin was about 3.9 million euro; the incremental cost per patient free from event was 31,176.03 euro. Discussion: this evaluation found that atorvastatin therapy is cost-effective. Results were sensitive to both clinical and economic variables.
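
    The headline figure is an incremental cost-effectiveness ratio. A minimal sketch of that arithmetic follows, with placeholder numbers that are not the IDEAL results.

    ```python
    # Incremental cost-effectiveness ratio (ICER) = delta cost / delta effect,
    # here with "patients free from event per 1,000 treated" as the effect.
    costs = {"high_dose": 4_000_000.0,          # total cost per 1,000 patients (euro)
             "usual_dose": 2_500_000.0}
    patients_free_of_event = {"high_dose": 880,  # per 1,000 patients
                              "usual_dose": 850}

    delta_cost = costs["high_dose"] - costs["usual_dose"]
    delta_effect = patients_free_of_event["high_dose"] - patients_free_of_event["usual_dose"]

    icer = delta_cost / delta_effect
    print(f"incremental cost per additional patient free from event: {icer:,.0f} euro")
    ```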

  13. Accident tolerant fuel analysis

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis [Idaho National Laboratory]; Chichester, Heather [Idaho National Laboratory]; Johns, Jesse [Texas A&M University]; Teague, Melissa [Idaho National Laboratory]; Tonks, Michael [Idaho National Laboratory]; Youngblood, Robert [Idaho National Laboratory]

    2014-09-01

    Safety is central to the design, licensing, operation, and economics of Nuclear Power Plants (NPPs). Consequently, the ability to better characterize and quantify safety margin holds the key to improved decision making about light water reactor design, operation, and plant life extension. A systematic approach to characterization of safety margins and the subsequent margins management options represents a vital input to the licensee and regulatory analysis and decision making that will be involved. The purpose of the Risk Informed Safety Margin Characterization (RISMC) Pathway research and development (R&D) is to support plant decisions for risk-informed margins management by improving economics and reliability, and sustaining safety, of current NPPs. Goals of the RISMC Pathway are twofold: (1) Develop and demonstrate a risk-assessment method coupled to safety margin quantification that can be used by NPP decision makers as part of their margin recovery strategies. (2) Create an advanced "RISMC toolkit" that enables more accurate representation of NPP safety margin. In order to carry out the R&D needed for the Pathway, the Idaho National Laboratory is performing a series of case studies that will explore methods- and tools-development issues, in addition to being of current interest in their own right. One such study is a comparative analysis of safety margins of plants using different fuel cladding types: specifically, a comparison between current-technology Zircaloy cladding and a notional "accident-tolerant" (e.g., SiC-based) cladding. The present report begins the process of applying capabilities that are still under development to the problem of assessing new fuel designs. The approach and lessons learned from this case study will be included in future Technical Basis Guides produced by the RISMC Pathway. These guides will be the mechanism for developing the specifications for RISMC tools and for defining how plant

  14. HSAPS market analysis project

    Energy Technology Data Exchange (ETDEWEB)

    Gloeckner, Ronny; Aaberg, Rolf Jarle

    2006-12-15

    The H-SAPS (Hydrogen Stand-Alone Power System) project, an EU project within the ALTENER programme in the period 2002-2004, was initiated to determine the potential for the introduction of environmentally benign hydrogen technology in what is believed to be a near-term market, namely stand-alone power systems (SAPS). The objective of the project was to examine the technological, political, social and economical factors affecting the emergence of hydrogen technology in the stand-alone power system market today and in the future. The scope of the project was limited to small and medium sized stand-alone power systems, up to a few hundred kilowatts (kW) power rating and based on renewable energy as the primary energy source. The work was divided into five phases: (1) Inception, (2) Data collection and analysis, (3) Market analysis and barrier removal, (4) Dissemination, and (5) Final report. Separate reports were written on these topics and later summarised in this final report. The H-SAPS project identified the following critical technical barriers (in prioritized order): (1) High costs of both electrolyser and fuel cell solutions, (2) Short lifetime warranties and little lifetime experience for PEM electrolysers and PEM fuel cells, (3) Low energy efficiency of the hydrogen energy system (critical for small systems), and (4) The need to develop easy-to-use and energy efficient gas and electricity control systems. One of the main conclusions from the project is that there is a need to focus on interim solutions, based on conventional energy technologies (e.g., internal combustion engines instead of fuel cells), in order for H-SAPS to compete in the near-term SAPS market.

  15. On a Combined Analysis Framework for Multimodal Discourse Analysis

    Institute of Scientific and Technical Information of China (English)

    窦瑞芳

    2015-01-01

    When people communicate, they not only use language, that is, a single mode of communication, but also simultaneously use body language, eye contact, pictures, etc., which is called multimodal communication. Multimodal communication is, as a matter of fact, the most natural way of communicating. Therefore, in order to make a complete discourse analysis, all the modes involved in an interaction or discourse should be taken into account, and a new analysis framework for Multimodal Discourse Analysis ought to be created to move this type of analysis forward. In this paper, the author makes a tentative move to shape a new analysis framework for Multimodal Discourse Analysis.

  17. Static Structural and Modal Analysis Using Isogeometric Analysis

    Science.gov (United States)

    Gondegaon, Sangamesh; Voruganti, Hari K.

    2016-12-01

    Isogeometric Analysis (IGA) is a new analysis method for unification of Computer Aided Design (CAD) and Computer Aided Engineering (CAE). With the use of NURBS basis functions for both modelling and analysis, the bottleneck of meshing is avoided and a seamless integration is achieved. The CAD and computational geometry concepts in IGA are new to the analysis community. Though there is a steady growth of literature, details of calculations, explanations and examples are not reported. The content of the paper is complementary to the existing literature and addresses the gaps. It includes a summary of the literature, an overview of the methodology, step-by-step calculations and Matlab codes for example problems in static structural and modal analysis in 1-D and 2-D. At appropriate places, comparison with the finite element method (FEM) is also included, so that those familiar with FEM can appreciate IGA better.
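
    A small taste of the NURBS machinery that IGA builds on is the Cox-de Boor recursion for B-spline basis functions, shown here in Python rather than the paper's Matlab, with an illustrative clamped knot vector; this is not the paper's own code.

    ```python
    # Cox-de Boor recursion for B-spline basis functions N_{i,p}(u).
    def bspline_basis(i, p, u, knots):
        """Value of the i-th B-spline basis function of degree p at u."""
        if p == 0:
            return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
        left = 0.0
        if knots[i + p] != knots[i]:
            left = (u - knots[i]) / (knots[i + p] - knots[i]) * bspline_basis(i, p - 1, u, knots)
        right = 0.0
        if knots[i + p + 1] != knots[i + 1]:
            right = ((knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1])
                     * bspline_basis(i + 1, p - 1, u, knots))
        return left + right

    # quadratic basis on an open (clamped) knot vector
    knots = [0, 0, 0, 0.5, 1, 1, 1]
    p = 2
    n_funcs = len(knots) - p - 1          # number of basis functions = 4
    for u in (0.0, 0.25, 0.5, 0.75):
        vals = [bspline_basis(i, p, u, knots) for i in range(n_funcs)]
        print(u, [round(v, 3) for v in vals], "sum =", round(sum(vals), 3))
    ```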

  18. Real time analysis under EDS

    Energy Technology Data Exchange (ETDEWEB)

    Schneberk, D.

    1985-07-01

    This paper describes the analysis component of the Enrichment Diagnostic System (EDS) developed for the Atomic Vapor Laser Isotope Separation Program (AVLIS) at Lawrence Livermore National Laboratory (LLNL). Four different types of analysis are performed on data acquired through EDS: (1) absorption spectroscopy on laser-generated spectral lines, (2) mass spectrometer analysis, (3) general purpose waveform analysis, and (4) separation performance calculations. The information produced from this data includes: measures of particle density and velocity, partial pressures of residual gases, and overall measures of isotope enrichment. The analysis component supports a variety of real-time modeling tasks, a means for broadcasting data to other nodes, and a great degree of flexibility for tailoring computations to the exact needs of the process. A particular data base structure and program flow is common to all types of analysis. Key elements of the analysis component are: (1) a fast access data base which can configure all types of analysis, (2) a selected set of analysis routines, (3) a general purpose data manipulation and graphics package for the results of real time analysis. Each of these components is described with an emphasis upon how it contributes to overall system capability. 3 figs.
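
    The configurable, database-driven analysis described here is essentially a dispatch from analysis type to routine. A toy sketch of that pattern follows; the names, routines and record layout are illustrative, not those of the EDS software.

    ```python
    # A small registry maps each analysis type to a routine, so a configuration
    # record can select and parameterize the computation.
    import statistics

    def absorption_analysis(samples, **kw):
        return {"mean_absorbance": statistics.fmean(samples)}

    def waveform_analysis(samples, **kw):
        return {"peak": max(samples),
                "rms": (sum(s * s for s in samples) / len(samples)) ** 0.5}

    ANALYSIS_REGISTRY = {
        "absorption": absorption_analysis,
        "waveform": waveform_analysis,
    }

    # one configuration record selecting an analysis run
    record = {"type": "waveform", "samples": [0.1, 0.4, 0.9, 0.3, -0.2]}

    routine = ANALYSIS_REGISTRY[record["type"]]
    print(routine(record["samples"]))
    ```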

  19. Developments in CTG analysis.

    Science.gov (United States)

    Van Geijn, H P

    1996-06-01

    FHR monitoring has been the subject of many debates. The technique, in itself, can be considered to be accurate and reliable both in the antenatal period, when using the Doppler signal in combination with autocorrelation techniques, and during the intrapartum period, in particular when the FHR signal can be obtained from a fetal ECG electrode placed on the presenting part. The major problems with FHR monitoring relate to the reading and interpretation of the CTG tracings. Since the FHR pattern is primarily an expression of the activity of the control by the central and peripheral nervous system over cardiovascular haemodynamics, it is possibly too indirect a signal. In other specialities such as neonatology, anaesthesiology and cardiology, monitoring and graphic display of heart rate patterns have not gained wide acceptance among clinicians. Digitized archiving, numerical analysis and even more advanced techniques, as described in this chapter, have primarily found a place in obstetrics. This can be easily explained, since the obstetrician is fully dependent on indirectly collected information regarding the fetal condition, such as (a) movements experienced by the mother, observed with ultrasound or recorded with kinetocardiotocography (Schmidt, 1994), (b) perfusion of various vessels, as assessed by Doppler velocimetry, (c) the amount of amniotic fluid or (d) changes reflected in the condition of the mother, such as the development of gestation-induced hypertension and (e) the easily, continuously obtainable FHR signal. It is of particular comfort to the obstetrician that a normal FHR tracing reliably predicts the birth of the infant in a good condition, which makes cardiotocography so attractive for widespread application. However, in the intrapartum period, many traces cannot fulfil the criteria of normality, especially in the second stage. In this respect, cardiotocography remains primarily a screening and not so much a diagnostic method. As long as continuous

  20. MGR External Events Hazards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    L. Booth

    1999-11-06

    The purpose and objective of this analysis is to apply an external events Hazards Analysis (HA) to the License Application Design Selection Enhanced Design Alternative II (the LADS EDA II design, Reference 8.32). The output of the HA is called a Hazards List (HL). This analysis supersedes the external hazards portion of Rev. 00 of the PHA (Reference 8.1). The PHA for internal events will also be updated to the LADS EDA II design, but under a separate analysis. Like the PHA methodology, the HA methodology provides a systematic method to identify potential hazards during the 100-year Monitored Geologic Repository (MGR) operating period, updated to reflect the EDA II design. The resulting events on the HL are candidates that may have potential radiological consequences as determined during Design Basis Events (DBEs) analyses. Therefore, the HL that results from this analysis will undergo further screening and analysis based on the criteria that apply during the performance of DBE analyses.