Combined principal component preprocessing and n-tuple neural networks for improved classification
DEFF Research Database (Denmark)
Høskuldsson, Agnar; Linneberg, Christian
2000-01-01
We present a combined principal component analysis/neural network scheme for classification. The data used to illustrate the method consist of spectral fluorescence recordings from seven different production facilities, and the task is to relate an unknown sample to one of these seven factories. … The data are first preprocessed by performing an individual principal component analysis on each of the seven groups of data. The components found are then used for classifying the data, but instead of building a single multiclass classifier, we follow the idea of turning a multiclass problem into a number of two-class problems. For each possible pair of classes we further apply a transformation to the calculated principal components in order to increase the separation between the classes. Finally, we apply the so-called n-tuple neural network to the transformed data in order to give the classification. …
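A minimal numeric sketch of the per-class PCA preprocessing described above. Note the hedges: the data, dimensions and class shapes are invented, and the paper's pairwise transformation and n-tuple network stages are replaced here by a simple nearest-subspace rule, so this only illustrates the "one PCA per class" idea, not the full method.

```python
import numpy as np

def fit_class_pca(X, k):
    """Fit a k-component PCA (mean + principal directions) to one class."""
    mu = X.mean(axis=0)
    # Right singular vectors of the centered data are the principal axes.
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:k]

def nearest_subspace_label(x, models):
    """Assign x to the class whose PCA subspace reconstructs it best."""
    errs = []
    for mu, V in models:
        z = (x - mu) @ V.T                       # project onto class subspace
        errs.append(np.linalg.norm(x - (mu + z @ V)))
    return int(np.argmin(errs))

rng = np.random.default_rng(0)
# Two toy "factories": elongated Gaussian clouds with different centers.
X0 = rng.normal([0, 0, 0], [2.0, 0.2, 0.2], size=(200, 3))
X1 = rng.normal([5, 5, 0], [0.2, 2.0, 0.2], size=(200, 3))
models = [fit_class_pca(X0, 1), fit_class_pca(X1, 1)]
print(nearest_subspace_label(np.array([0.5, 0.1, 0.0]), models))  # → 0
```

The nearest-subspace rule keeps the per-class PCA idea while sidestepping the pairwise separation transform and the n-tuple classifier of the abstract.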
Hybrid normed ideal perturbations of n-tuples of operators I
Voiculescu, Dan-Virgil
2018-06-01
In hybrid normed ideal perturbations of n-tuples of operators, the normed ideal is allowed to vary with the component operators. We begin extending to this setting the machinery we developed for normed ideal perturbations based on the modulus of quasicentral approximation and an adaptation of our non-commutative generalization of the Weyl-von Neumann theorem. For commuting n-tuples of hermitian operators, the modulus of quasicentral approximation remains essentially the same when C_n^- is replaced by a hybrid n-tuple C_{p_1}^-, …, C_{p_n}^-, with 1/p_1 + ⋯ + 1/p_n = 1. The proof involves singular integrals of mixed homogeneity.
n-Tupled Coincidence Point Theorems in Partially Ordered Metric Spaces for Compatible Mappings
Directory of Open Access Journals (Sweden)
Sumitra Dalal
2014-01-01
The intent of this paper is to introduce the notion of compatible mappings for n-tupled coincidence points due to Imdad et al. (2013). Related examples are also given to support our main results. Our results generalize the results of Gnana Bhaskar and Lakshmikantham (2006), Lakshmikantham and Ćirić (2009), Choudhury and Kundu (2010), and Choudhary et al. (2013).
Towards Informetrics: Haitun, Laplace, Zipf, Bradford and the Alvey Programme.
Brookes, B. C.
1984-01-01
Review of recent developments in statistical theories for social sciences highlights Haitun's statistical distributions, Laplace's "Law of Succession" and distribution, Laplace and Bradford analysis of book-index data, inefficiency of frequency distribution analysis, Laws of Bradford and Zipf, natural categorization, and Bradford Law and…
Greedy algorithms and Zipf laws
Moran, José; Bouchaud, Jean-Philippe
2018-04-01
We consider a simple model of firm/city/etc. growth based on a multi-item criterion: whenever entity B fares better than entity A on a subset of M items out of K, the agent originally in A moves to B. We solve the model analytically in the cases K = 1 and K → ∞. The resulting stationary distribution of sizes is generically a Zipf law provided M > K/2. When M ≤ K/2, no selection occurs and the size distribution remains thin-tailed. In the special case M = K, one needs to regularize the problem by introducing a small 'default' probability ϕ. We find that the stationary distribution has a power-law tail that becomes a Zipf law when ϕ → 0. The approach to the stationary state can also be characterized, with strong similarities to a simple 'aging' model considered by Barrat and Mézard.
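The move rule above is easy to simulate. This is a toy discretization with invented parameters (fixed quality vectors per entity; an agent moves when the candidate entity wins on at least M of the K items), not the paper's analytical setup, so it only illustrates the condensation mechanism qualitatively.

```python
import random

def simulate(n_entities=50, n_agents=2000, K=3, M=2, steps=20000, seed=1):
    """Toy M-out-of-K greedy move rule: agents hop to entities that
    beat their current one on at least M of K fixed quality scores."""
    rng = random.Random(seed)
    quality = [[rng.random() for _ in range(K)] for _ in range(n_entities)]
    home = [rng.randrange(n_entities) for _ in range(n_agents)]
    for _ in range(steps):
        i = rng.randrange(n_agents)
        a, b = home[i], rng.randrange(n_entities)
        wins = sum(qb > qa for qa, qb in zip(quality[a], quality[b]))
        if wins >= M:        # B fares better on at least M of the K items
            home[i] = b
    sizes = [home.count(e) for e in range(n_entities)]
    return sorted(sizes, reverse=True)

sizes = simulate()
print(sizes[:5])  # with M > K/2, a few entities capture most agents
```

With M > K/2 the comparison relation is close to transitive, so agents pile onto the highest-quality entities and the size distribution becomes heavy-tailed, as the abstract describes.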
Towards a seascape typology. I. Zipf versus Pareto laws
Seuront, Laurent; Mitchell, James G.
Two data analysis methods, referred to as the Zipf and Pareto methods, initially introduced in economics and linguistics two centuries ago and subsequently used in a wide range of fields (word frequency in languages and literature, human demographics, finance, city formation, genomics and physics), are described and proposed here as a potential tool to classify space-time patterns in marine ecology. The aim of this paper is, first, to present the theoretical bases of Zipf and Pareto laws, and to demonstrate that they are strictly equivalent. In that way, we provide a one-to-one correspondence between their characteristic exponents and argue that the choice of technique is a matter of convenience. Second, we argue that the appeal of this technique is that it is assumption-free for the distribution of the data and regularity of sampling interval, as well as being extremely easy to implement. Finally, in order to allow marine ecologists to identify and classify any structure in their data sets, we provide a step by step overview of the characteristic shapes expected for Zipf's law for the cases of randomness, power law behavior, power law behavior contaminated by internal and external noise, and competing power laws illustrated on the basis of typical ecological situations such as mixing processes involving non-interacting and interacting species, phytoplankton growth processes and differential grazing by zooplankton.
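The claimed one-to-one correspondence between the Zipf and Pareto exponents can be checked numerically. In this sketch the ranks and sizes are synthetic (not the paper's ecological data): sizes following a Zipf law s(r) ∝ r^(-α) produce a Pareto rank-based CCDF whose log-log slope is exactly -1/α.

```python
import numpy as np

alpha = 0.8
ranks = np.arange(1, 10001)
sizes = ranks ** (-alpha)              # Zipf law: size ~ rank^(-alpha)

# Pareto view: with sizes sorted in decreasing order, the empirical CCDF
# P(S >= s) at the r-th size is r/N, so log(rank/N) vs log(size) is linear
# with slope -1/alpha.
slope = np.polyfit(np.log(sizes), np.log(ranks / len(ranks)), 1)[0]
print(round(slope, 3))   # → -1.25, i.e. -1/alpha
```

This is the sense in which the two methods are strictly equivalent: each exponent determines the other, so the choice of technique is indeed a matter of convenience.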
Why Does Zipf's Law Break Down in Rank-Size Distribution of Cities?
Kuninaka, Hiroto; Matsushita, Mitsugu
2008-01-01
We study the rank-size distribution of cities in Japan on the basis of data analysis. From census data after World War II, we find that the rank-size distribution of cities is composed of two parts, each of which has an independent power exponent. In addition, the power exponent of the head part of the distribution changes in time, and Zipf's law holds only in a restricted period. We show that Zipf's law broke down due to both the Showa and Heisei great mergers and recovered due to population grow...
Martínez-Santiago, O; Marrero-Ponce, Y; Vivas-Reyes, R; Rivera-Borroto, O M; Hurtado, E; Treto-Suarez, M A; Ramos, Y; Vergara-Murillo, F; Orozco-Ugarriza, M E; Martínez-López, Y
2017-05-01
Graph derivative indices (GDIs) have recently been defined over N-atoms (N = 2, 3 and 4) simultaneously, which are based on the concept of derivatives in discrete mathematics (finite difference), metaphorical to the derivative concept in classical mathematical analysis. These molecular descriptors (MDs) codify topo-chemical and topo-structural information based on the concept of the derivative of a molecular graph with respect to a given event (S) over duplex, triplex and quadruplex relations of atoms (vertices). These GDIs have been successfully applied in the description of physicochemical properties like reactivity, solubility and chemical shift, among others, and in several comparative quantitative structure activity/property relationship (QSAR/QSPR) studies. Although satisfactory results have been obtained in previous modelling studies with the aforementioned indices, it is necessary to develop new, more rigorous analyses to assess the true predictive performance of the novel structure codification. So, in the present paper, an assessment and statistical validation of the performance of these novel approaches in QSAR studies are executed, as well as a comparison with those of other QSAR procedures reported in the literature. To achieve the main aim of this research, QSARs were developed on eight chemical datasets widely used as benchmarks in the evaluation/validation of several QSAR methods and/or many different MDs (fundamentally 3D MDs). Three- to seven-variable QSAR models were built for each chemical dataset, according to the original dissection into training/test sets. The models were developed by using multiple linear regression (MLR) coupled with a genetic algorithm as the feature wrapper selection technique in the MobyDigs software. Each family of GDIs (for duplex, triplex and quadruplex) behaves similarly in all modelling, although there were some exceptions. However, when all families were used in combination, the results achieved were quantitatively
Zipf's law, power laws and maximum entropy
International Nuclear Information System (INIS)
Visser, Matt
2013-01-01
Zipf's law, and power laws in general, have attracted and continue to attract considerable attention in a wide variety of disciplines—from astronomy to demographics to software structure to economics to linguistics to zoology, and even warfare. A recent model of random group formation (RGF) attempts a general explanation of such phenomena based on Jaynes' notion of maximum entropy applied to a particular choice of cost function. In the present paper I argue that the specific cost function used in the RGF model is in fact unnecessarily complicated, and that power laws can be obtained in a much simpler way by applying maximum entropy ideas directly to the Shannon entropy subject only to a single constraint: that the average of the logarithm of the observable quantity is specified. (paper)
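The single-constraint maximization sketched in the abstract is a standard Lagrange-multiplier computation (our notation, not the paper's): maximize the Shannon entropy subject to normalization and a fixed average of the logarithm of the observable,

```latex
\mathcal{L} = -\sum_n p_n \ln p_n
  - \mu\Big(\sum_n p_n - 1\Big)
  - \lambda\Big(\sum_n p_n \ln n - \chi\Big),
\qquad
\frac{\partial \mathcal{L}}{\partial p_n}
  = -\ln p_n - 1 - \mu - \lambda \ln n = 0
\;\Rightarrow\;
p_n = e^{-(1+\mu)}\, n^{-\lambda} \propto n^{-\lambda}.
```

So fixing only ⟨ln n⟩ already forces a power law, with the multiplier λ playing the role of the Zipf exponent.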
The Evolution of the Exponent of Zipf's Law in Language Ontogeny
Baixeries, Jaume; Elvevåg, Brita; Ferrer-i-Cancho, Ramon
2013-01-01
It is well-known that word frequencies arrange themselves according to Zipf's law. However, little is known about how the parameters of the law depend on the complexity of a communication system. Many models of the evolution of language assume that the exponent of the law remains constant as the complexity of a communication system increases. Using longitudinal studies of child language, we analysed the word rank distribution for the speech of children and adults participating in conversations. The adults typically included family members (e.g., parents) or the investigators conducting the research. Our analysis of the evolution of Zipf's law yields two main unexpected results. First, in children the exponent of the law tends to decrease over time while this tendency is weaker in adults, thus suggesting this is not a mere mirror effect of adult speech. Second, although the exponent of the law is more stable in adults, their exponents fall below 1, which is the typical value of the exponent assumed for both children and adults. Our analysis also shows a tendency of the mean length of utterances (MLU), a simple estimate of syntactic complexity, to increase as the exponent decreases. The parallel evolution of the exponent and a simple indicator of syntactic complexity (MLU) supports the hypothesis that the exponent of Zipf's law and linguistic complexity are inter-related. The assumption that Zipf's law for word ranks is a power-law with a constant exponent of one in both adults and children needs to be revised. PMID:23516390
Deviations in the Zipf and Heaps laws in natural languages
Bochkarev, Vladimir V.; Lerner, Eduard Yu; Shevlyakova, Anna V.
2014-03-01
This paper is devoted to verifying the empirical Zipf and Heaps laws in natural languages using Google Books Ngram corpus data. The connection between the Zipf law and the Heaps law, which predicts a power dependence of the vocabulary size on the text size, is discussed. In fact, the Heaps exponent in this dependence varies as the text corpus grows. To explain this, the obtained results are compared with a probability model of text generation. Quasi-periodic variations with characteristic time periods of 60-100 years were also found.
Pareto-Zipf law in growing systems with multiplicative interactions
Ohtsuki, Toshiya; Tanimoto, Satoshi; Sekiyama, Makoto; Fujihara, Akihiro; Yamamoto, Hiroshi
2018-06-01
Numerical simulations of multiplicatively interacting stochastic processes with weighted selections were conducted. A feedback mechanism to control the weight w of selections was proposed. It becomes evident that when w is moderately controlled around 0, such systems spontaneously exhibit the Pareto-Zipf distribution. The simulation results are universal in the sense that microscopic details, such as parameter values and the type of control and weight, are irrelevant. The central ingredient of the Pareto-Zipf law is argued to be the mild control of interactions.
Zipf rank approach and cross-country convergence of incomes
Shao, Jia; Ivanov, Plamen Ch.; Urošević, Branko; Stanley, H. Eugene; Podobnik, Boris
2011-05-01
We employ a concept popular in physics, the Zipf rank approach, in order to estimate the number of years that EU members would need in order to achieve "convergence" of their per capita incomes. Assuming that the trends of the past twenty years continue to hold in the future, we find that after t ≈ 30 years both developing and developed EU countries indexed by i will have comparable values of their per capita gross domestic product G_{i,t}. Besides the traditional Zipf rank approach, we also propose a weighted Zipf rank method. In contrast to the EU block, at the world level the Zipf rank approach shows that, between 1960 and 2009, cross-country income differences increased over time. For a brief period during the 2007-2008 global economic crisis, at the world level the G_{i,t} of richer countries declined more rapidly than the G_{i,t} of poorer countries, in contrast to the EU, where the G_{i,t} of developing EU countries declined faster than the G_{i,t} of developed EU countries, indicating that the recession interrupted the convergence between EU members. We propose a simple model of GDP evolution that accounts for the scaling we observe in the data.
Variation of Zipf's exponent in one hundred live languages: A study of the Holy Bible translations
Mehri, Ali; Jamaati, Maryam
2017-08-01
Zipf's law, as a power-law regularity, reflects long-range correlations between the elements of natural and artificial systems. In this article, the law is evaluated for one hundred living languages: we calculate Zipf's exponent for translations of the Holy Bible into these languages. The results show that the average Zipf exponent across the studied texts is slightly above unity. Within some language families, all studied languages have a Zipf exponent consistently below or above unity. It seems that geographical distribution shapes the communication between speakers of different languages within a family, and hence the similarity of their Zipf exponents. The Bible conveys the same content regardless of language, but discrepancies in grammatical rules and in the syntactic regularities governing how stop words are used to build sentences expressing a given concept lead to differences in Zipf's exponent across languages.
Empirical tests of Zipf's law mechanism in open source Linux distribution.
Maillart, T; Sornette, D; Spaeth, S; von Krogh, G
2008-11-21
Zipf's power law is a ubiquitous empirical regularity found in many systems, thought to result from proportional growth. Here, we establish empirically the usually assumed ingredients of stochastic growth models that have been previously conjectured to be at the origin of Zipf's law. We use exceptionally detailed data on the evolution of open source software projects in Linux distributions, which offer a remarkable example of a growing complex self-organizing adaptive system, exhibiting Zipf's law over four full decades.
The Zipf Law revisited: An evolutionary model of emerging classification
Energy Technology Data Exchange (ETDEWEB)
Levitin, L.B. [Boston Univ., MA (United States); Schapiro, B. [TINA, Brandenburg (Germany); Perlovsky, L. [NRC, Wakefield, MA (United States)
1996-12-31
Zipf's Law is a remarkable rank-frequency relationship observed in linguistics (the frequencies of the use of words are approximately inversely proportional to their ranks in the decreasing frequency order) as well as in the behavior of many complex systems of surprisingly different nature. We suggest an evolutionary model of emerging classification of objects into classes corresponding to concepts and denoted by words. The evolution of the system is derived from two basic assumptions: first, the probability to recognize an object as belonging to a known class is proportional to the number of objects in this class already recognized, and, second, there exists a small probability to observe an object that requires creation of a new class (a "mutation" that gives birth to a new "species"). It is shown that the populations of classes in such a system obey the Zipf Law provided that the rate of emergence of new classes is small. The model also leads to the emergence of a second-tier structure of "super-classes": groups of classes with almost equal populations.
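The two assumptions map directly onto a Simon-style growth process. A sketch with invented parameter values: with small probability eps a new class is founded ("mutation"), otherwise a new object joins a class with probability proportional to its current population.

```python
import random
from collections import Counter

def simon_classes(n_objects=20000, eps=0.05, seed=7):
    """Simon-style growth: size-proportional recognition into an existing
    class, or founding of a new class with small probability eps."""
    rng = random.Random(seed)
    labels = [0]                 # object 0 founds class 0
    n_classes = 1
    for _ in range(n_objects - 1):
        if rng.random() < eps:
            labels.append(n_classes)     # "mutation": a new class is born
            n_classes += 1
        else:
            # Picking a uniformly random existing object realizes the
            # size-proportional class choice.
            labels.append(labels[rng.randrange(len(labels))])
    return sorted(Counter(labels).values(), reverse=True)

pops = simon_classes()
print(pops[:3])   # a few dominant classes; rank-frequency close to Zipf
```

For small eps the resulting class populations approximate the Zipf rank-frequency relationship, which is the regime the abstract identifies.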
Predicted and verified deviations from Zipf's law in ecology of competing products.
Hisano, Ryohei; Sornette, Didier; Mizuno, Takayuki
2011-08-01
Zipf's power-law distribution is a generic empirical statistical regularity found in many complex systems. However, rather than universality with a single power-law exponent (equal to 1 for Zipf's law), there are many reported deviations that remain unexplained. A recently developed theory finds that the interplay between (i) one of the most universal ingredients, namely stochastic proportional growth, and (ii) birth and death processes, leads to a generic power-law distribution with an exponent that depends on the characteristics of each ingredient. Here, we report the first complete empirical test of the theory and its application, based on the empirical analysis of the dynamics of market shares in the product market. We estimate directly the average growth rate of market shares and its standard deviation, the birth rates and the "death" (hazard) rate of products. We find that temporal variations and product differences of the observed power-law exponents can be fully captured by the theory with no adjustable parameters. Our results can be generalized to many systems for which the statistical properties revealed by power-law exponents are directly linked to the underlying generating mechanism.
Systematic analysis of coding and noncoding DNA sequences using methods of statistical linguistics
Mantegna, R. N.; Buldyrev, S. V.; Goldberger, A. L.; Havlin, S.; Peng, C. K.; Simons, M.; Stanley, H. E.
1995-01-01
We compare the statistical properties of coding and noncoding regions in eukaryotic and viral DNA sequences by adapting two tests developed for the analysis of natural languages and symbolic sequences. The data set comprises all 30 sequences of length above 50 000 base pairs in GenBank Release No. 81.0, as well as the recently published sequences of C. elegans chromosome III (2.2 Mbp) and yeast chromosome XI (661 Kbp). We find that for the three chromosomes we studied, the statistical properties of noncoding regions appear to be closer to those observed in natural languages than those of coding regions. In particular, (i) an n-tuple Zipf analysis of noncoding regions reveals a regime close to power-law behavior, whereas the coding regions show logarithmic behavior over a wide interval, and (ii) an n-gram entropy measurement shows that the noncoding regions have a lower n-gram entropy (and hence a larger "n-gram redundancy") than the coding regions. In contrast to the three chromosomes, we find that for vertebrates such as primates and rodents and for viral DNA, the difference between the statistical properties of coding and noncoding regions is not pronounced and therefore the results of the analyses of the investigated sequences are less conclusive. After noting the intrinsic limitations of the n-gram redundancy analysis, we also briefly discuss the failure of the zeroth- and first-order Markovian models or simple nucleotide repeats to account fully for these "linguistic" features of DNA. Finally, we emphasize that our results by no means prove the existence of a "language" in noncoding DNA.
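The n-tuple Zipf procedure itself is simple to sketch. The sequence below is uniform random, standing in for a real GenBank sequence, so it only demonstrates the mechanics of ranking overlapping n-tuples by frequency.

```python
import random
from collections import Counter

def ntuple_ranks(seq, n):
    """Rank-frequency list of overlapping n-tuples in a symbolic sequence."""
    counts = Counter(seq[i:i + n] for i in range(len(seq) - n + 1))
    return counts.most_common()

# Toy random sequence (illustration only, not the GenBank data of the paper).
rng = random.Random(3)
seq = "".join(rng.choice("ACGT") for _ in range(5000))
ranked = ntuple_ranks(seq, 3)
print(ranked[0][1], len(ranked))  # count of the top 3-tuple, number of distinct 3-tuples
```

On a real coding or noncoding region, plotting log frequency against log rank of this list is exactly the "n-tuple Zipf analysis" the abstract refers to.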
Zipf exponent of trajectory distribution in the hidden Markov model
Bochkarev, V. V.; Lerner, E. Yu
2014-03-01
This paper takes the first step in generalizing the previously obtained full classification of the asymptotic behavior of the probability of Markov chain trajectories to the case of hidden Markov models. The main goal is to study the power (Zipf) and nonpower asymptotics of the frequency list of trajectories of hidden Markov models and to obtain explicit formulae for the exponent of the power asymptotics. We consider several simple classes of hidden Markov models. We prove that the asymptotics for a hidden Markov model and for the corresponding Markov chain can be essentially different.
Using Zipf-Mandelbrot law and graph theory to evaluate animal welfare
de Oliveira, Caprice G. L.; Miranda, José G. V.; Japyassú, Hilton F.; El-Hani, Charbel N.
2018-02-01
This work deals with the construction and testing of metrics of welfare based on behavioral complexity, using assumptions derived from Zipf-Mandelbrot law and graph theory. To test these metrics we compared yellow-breasted capuchins (Sapajus xanthosternos) (Wied-Neuwied, 1826) (PRIMATES CEBIDAE) found in two institutions, subjected to different captive conditions: a Zoobotanical Garden (hereafter, ZOO; n = 14), in good welfare condition, and a Wildlife Rescue Center (hereafter, WRC; n = 8), in poor welfare condition. In the Zipf-Mandelbrot-based analysis, the power law exponent was calculated using behavior frequency values versus behavior rank value. These values allow us to evaluate variations in individual behavioral complexity. For each individual we also constructed a graph using the sequence of behavioral units displayed in each recording (average recording time per individual: 4 h 26 min in the ZOO, 4 h 30 min in the WRC). Then, we calculated the values of the main graph attributes, which allowed us to analyze the complexity of the connectivity of the behaviors displayed in the individuals' behavioral sequences. We found significant differences between the two groups for the slope values in the Zipf-Mandelbrot analysis. The slope values for the ZOO individuals approached -1, with graphs representing a power law, while the values for the WRC individuals diverged from -1, differing from a power law pattern. Likewise, we found significant differences for the graph attributes average degree, weighted average degree, and clustering coefficient when comparing the ZOO and WRC individual graphs. However, no significant difference was found for the attributes modularity and average path length. Both analyses were effective in detecting differences between the patterns of behavioral complexity in the two groups. The slope values for the ZOO individuals indicated a higher behavioral complexity when compared to the WRC individuals. Similarly, graph construction and the
Event analysis using a massively parallel processor
International Nuclear Information System (INIS)
Bale, A.; Gerelle, E.; Messersmith, J.; Warren, R.; Hoek, J.
1990-01-01
This paper describes a system for performing histogramming of n-tuple data at interactive rates using a commercial SIMD processor array connected to a work-station running the well-known Physics Analysis Workstation software (PAW). Results indicate that an order of magnitude performance improvement over current RISC technology is easily achievable
Log-Log Convexity of Type-Token Growth in Zipf's Systems
Font-Clos, Francesc; Corral, Álvaro
2015-06-01
It is traditionally assumed that Zipf's law implies the power-law growth of the number of different elements with the total number of elements in a system—the so-called Heaps' law. We show that a careful definition of Zipf's law leads to the violation of Heaps' law in random systems, with growth curves that have a convex shape in log-log scale. These curves fulfill universal data collapse that only depends on the value of Zipf's exponent. We observe that real books behave very much in the same way as random systems, despite the presence of burstiness in word occurrence. We advance an explanation for this unexpected correspondence.
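The interplay between Zipf sampling and type-token growth is easy to reproduce. This sketch (vocabulary size, exponent and seed are arbitrary choices, not the paper's) draws i.i.d. tokens from a finite Zipf distribution and records the growth curve of distinct types, the quantity whose log-log convexity the paper analyzes.

```python
import numpy as np

def type_token_curve(zipf_exp=1.2, vocab=50000, tokens=20000, seed=0):
    """Sample i.i.d. tokens from a Zipf distribution and track how many
    distinct types have appeared after each token (a Heaps-style curve)."""
    rng = np.random.default_rng(seed)
    p = 1.0 / np.arange(1, vocab + 1) ** zipf_exp
    p /= p.sum()
    draws = rng.choice(vocab, size=tokens, p=p)
    seen, curve = set(), []
    for w in draws:
        seen.add(int(w))
        curve.append(len(seen))
    return curve

curve = type_token_curve()
print(curve[-1])   # distinct types after 20000 tokens, well below 20000
```

Plotting log(curve) against log(token count) for such random systems shows the convex shape the paper describes, rather than the straight line a pure Heaps power law would give.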
Weiqi games as a tree: Zipf's law of openings and beyond
Xu, Li-Gong; Li, Ming-Xia; Zhou, Wei-Xing
2015-06-01
Weiqi is one of the most complex board games played by two players. The placement strategies adopted by Weiqi players are often used as analogues for the philosophy of human warfare. In contrast to Western chess, Weiqi games are less studied by academics, partially because Weiqi is popular only in East Asia, especially in China, Japan and Korea. Here, we propose to construct a directed tree from an extensive database of Weiqi games and perform a quantitative analysis of the Weiqi tree. We find that the popularity distribution of Weiqi openings with the same number of moves is distributed according to a power law and the tail exponent increases with the number of moves. Intriguingly, the superposition of the popularity distributions of Weiqi openings with a number of moves not higher than a given number also has a power-law tail in which the tail exponent increases with the number of moves, and the superposed distribution approaches the Zipf law. These findings are the same as for chess and support the conjecture that the popularity distribution of board game openings follows the Zipf law with a universal exponent. We also find that the distribution of out-degrees has a power-law form, the distribution of branching ratios has a very complicated pattern, and the distribution of uniqueness scores defined by the path lengths from the root vertex to the leaf vertices exhibits a unimodal shape. Our work provides a promising direction for the study of the decision-making process of Weiqi playing from the perspective of directed branching trees.
Statistical properties of nucleotides in human chromosomes 21 and 22
International Nuclear Information System (INIS)
Zhang Linxi; Sun Tingting
2005-01-01
In this paper the statistical properties of nucleotides in human chromosomes 21 and 22 are investigated. The n-tuple Zipf analysis with n = 3, 4, 5, 6, and 7 is used in our investigation. It is found that the most common n-tuples are those which consist only of adenine (A) and thymine (T), and the rarest n-tuples are those in which a GC or CG pattern appears twice. As the n-tuples become more and more frequent, the double GC or CG pattern becomes a single GC or CG pattern. The percentage of the four nucleotides in the rarest ten and the most common ten n-tuples is also considered for human chromosomes 21 and 22, and different behaviors are found in the percentage of the four nucleotides. The frequency of appearance f(r) of an n-tuple as a function of its rank r is also examined. We find the n-tuple Zipf plot shows a power-law behavior for r < 4^(n-1) and a rapid decrease for r > 4^(n-1). In order to explore the interior statistical properties of human chromosomes 21 and 22 in detail, we divide the chromosome sequence into moving windows and discuss the percentage of ξη (ξ, η = A, C, G, T) pairs in those windows. In some particular regions there are obvious changes in the percentage of ξη pairs, and functional differences may exist there. The normalized number of repeats N_0(l) can be described by a power law: N_0(l) ∼ l^(-μ). The distance distributions P_0(S) between two nucleotides in human chromosomes 21 and 22 are also discussed. A second-order polynomial fit holds for those distance distributions: log P_0(S) = a + bS + cS^2, which is quite different from the random sequence.
Where Gibrat meets Zipf: Scale and scope of French firms
Bee, Marco; Riccaboni, Massimo; Schiavo, Stefano
2017-09-01
The proper characterization of the size distribution and growth of firms represents an important issue in economics and business. We use the Maximum Entropy approach to assess the plausibility of the assumption that firm size follows Lognormal or Pareto distributions, which underlies most recent works on the subject. A comprehensive dataset covering the universe of French firms allows us to draw two major conclusions. First, the Pareto hypothesis for the whole distribution should be rejected. Second, by discriminating across firms based on the number of products sold and markets served, we find that, within the class of multi-product companies active in multiple markets, the distribution converges to a Zipf's law. Conversely, Lognormal distribution is a good benchmark for small single-product firms. The size distribution of firms largely depends on firms' diversification patterns.
Zipf's law and city size distribution: A survey of the literature and future research agenda
Arshad, Sidra; Hu, Shougeng; Ashraf, Badar Nadeem
2018-02-01
This study provides a systematic review of the existing literature on Zipf's law for city size distribution. Existing empirical evidence suggests that Zipf's law is not always observable, even for the upper-tail cities of a territory. However, the controversy over the empirical findings arises from sample selection biases, methodological weaknesses and data limitations. The hypothesis of Zipf's law is more likely to be rejected for the entire city size distribution and, in such cases, alternative distributions have been suggested. On the contrary, the hypothesis is more likely to be accepted if better empirical methods are employed and cities are properly defined. The debate is still far from conclusive. In addition, we identify four emerging areas in Zipf's law and city size distribution research: the size distribution of lower-tail cities, the size distribution of cities in sub-national regions, the alternative forms of Zipf's law, and the relationship between Zipf's law and the coherence property of the urban system.
Urban Concentration and Spatial Allocation of Rents from natural resources. A Zipf's Curve Approach
Directory of Open Access Journals (Sweden)
Tomaz Ponce Dentinho
2017-10-01
This paper aims to demonstrate how countries' dependency on natural resources plays a crucial role in urban concentration. The Zipf's Curve Elasticity is estimated for a group of countries and related to a set of indicators of unilateral transfers. Results show that countries with higher urban concentration, as captured by a higher Zipf's Curve Elasticity, draw a higher percentage of income from natural resources and education expenditures, whereas public spending on health and outflows of Foreign Direct Investment seem to have spatial redistribution effects. Summing up, there are signs that the spatial allocation of property rights over natural resources, and of the related rents, influences urban concentration.
Stochastic model of Zipf's law and the universality of the power-law exponent.
Yamamoto, Ken
2014-04-01
We propose a stochastic model of Zipf's law, namely a power-law relation between rank and size, and clarify why a specific value of its power-law exponent is quite universal. We focus on the successive total of a multiplicative stochastic process. By employing properties of a well-known stochastic process, we concisely show that the successive total follows a stationary power-law distribution, which is directly related to Zipf's law. The formula for the power-law exponent is also derived. Finally, we conclude that the universality of the rank-size exponent is brought about by symmetry between an increase and a decrease in the random growth rate.
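As a rough numerical illustration of the mechanism described (not the authors' own derivation), a "successive total" of a multiplicative process can be simulated as a Kesten-type recursion, whose stationary law is known to have a power-law tail with exponent μ solving E[g^μ] = 1. The parameter choice below is an assumption made to target μ = 1, the Zipf value:

```python
import numpy as np

rng = np.random.default_rng(1)

# Kesten-type recursion y_{t+1} = g_t * y_t + 1: a "successive total" of a
# multiplicative process.  With E[log g] < 0 the stationary law has a
# power-law tail P(Y > y) ~ y^(-mu), where mu solves E[g^mu] = 1.  For
# lognormal g with log g ~ N(m, s^2) this gives mu = -2m/s^2; choosing
# m = -0.125, s = 0.5 targets mu = 1, the Zipf value.
m, s, T = -0.125, 0.5, 200000
g = np.exp(rng.normal(m, s, T))
y = np.empty(T)
y[0] = 1.0
for t in range(1, T):
    y[t] = g[t] * y[t - 1] + 1.0

# Hill estimator of the tail exponent from the top order statistics.
x = np.sort(y[1000:])[-2000:]          # discard burn-in, keep upper tail
mu_hat = 1.0 / np.mean(np.log(x / x[0]))
print(round(mu_hat, 2))    # should land in the vicinity of 1
```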
Two Universality Properties Associated with the Monkey Model of Zipf's Law
Perline, Richard; Perline, Ron
2016-03-01
The distribution of word probabilities in the monkey model of Zipf's law is associated with two universality properties: (1) the power law exponent converges strongly to $-1$ as the alphabet size increases and the letter probabilities are specified as the spacings from a random division of the unit interval for any distribution with a bounded density function on $[0,1]$; and (2), on a logarithmic scale the version of the model with a finite word length cutoff and unequal letter probabilities is approximately normally distributed in the part of the distribution away from the tails. The first property is proved using a remarkably general limit theorem for the logarithm of sample spacings from Shao and Hahn, and the second property follows from Anscombe's central limit theorem for a random number of i.i.d. random variables. The finite word length model leads to a hybrid Zipf-lognormal mixture distribution closely related to work in other areas.
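The word-probability structure of the monkey model is simple enough to enumerate directly. This hypothetical sketch draws the letter probabilities as spacings of a random division of the unit interval, as in property (1), and fits the rank-probability slope (alphabet size, space probability and fitting window are all illustrative choices):

```python
import itertools
import numpy as np

rng = np.random.default_rng(2)
k = 4                                   # alphabet size
q = 0.2                                 # probability of hitting the space bar
# Letter probabilities as spacings from a random division of [0, 1].
cuts = np.sort(rng.random(k - 1))
letter_p = (1 - q) * np.diff(np.concatenate(([0.0], cuts, [1.0])))

# Enumerate all words up to length 5 and their monkey-typing probabilities.
probs = []
for L in range(1, 6):
    for word in itertools.product(letter_p, repeat=L):
        probs.append(np.prod(word) * q)
probs = np.sort(np.array(probs))[::-1]

# Fit the log-log slope over middle ranks, away from both tails.
ranks = np.arange(1, len(probs) + 1)
lo, hi = 10, 1000
slope, _ = np.polyfit(np.log(ranks[lo:hi]), np.log(probs[lo:hi]), 1)
print(round(slope, 2))   # approaches -1 as the alphabet grows
```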
Stable power laws in variable economies; Lotka-Volterra implies Pareto-Zipf
Solomon, S.; Richmond, P.
2002-05-01
In recent years we have found that logistic systems of the Generalized Lotka-Volterra type (GLV) describing statistical systems of auto-catalytic elements posses power law distributions of the Pareto-Zipf type. In particular, when applied to economic systems, GLV leads to power laws in the relative individual wealth distribution and in market returns. These power laws and their exponent α are invariant to arbitrary variations in the total wealth of the system and to other endogenously and exogenously induced variations.
Emergence of good conduct, scaling and Zipf laws in human behavioral sequences in an online world.
Directory of Open Access Journals (Sweden)
Stefan Thurner
Full Text Available We study behavioral action sequences of players in a massive multiplayer online game. In their virtual life players use eight basic actions which allow them to interact with each other. These actions are communication, trade, establishing or breaking friendships and enmities, attack, and punishment. We measure the probabilities for these actions conditional on previously taken and received actions and find a dramatic increase of negative behavior immediately after receiving negative actions. Similarly, positive behavior is intensified by receiving positive actions. We observe a tendency towards antipersistence in communication sequences. Classifying actions as positive (good) and negative (bad) allows us to define binary 'world lines' of the lives of individuals. Positive and negative actions are persistent and occur in clusters, indicated by large scaling exponents α ~ 0.87 of the mean square displacement of the world lines. For all eight action types we find strong signs of high levels of repetitiveness, especially for negative actions. We partition behavioral sequences into segments of length n (behavioral 'words' and 'motifs') and study their statistical properties. We find two approximate power laws in the word ranking distribution, one with an exponent of κ ~ -1 for ranks up to 100, and another with a lower exponent for higher ranks. The Shannon n-tuple redundancy yields large values and increases with word length, further underscoring the non-trivial statistical properties of behavioral sequences. On the collective, societal level the time series of particular actions per day can be understood by a simple mean-reverting log-normal model.
Multi-fractal measures of city-size distributions based on the three-parameter Zipf model
International Nuclear Information System (INIS)
Chen Yanguang; Zhou Yixing
2004-01-01
A multi-fractal framework of urban hierarchies is presented to address the rank-size distribution of cities. The three-parameter Zipf model based on a pair of exponential-type scaling laws is generalized to multi-scale fractal measures. Then, according to the equivalence between Zipf's law and the Pareto distribution, a set of multi-fractal equations is derived using dual conversion and the Legendre transform. The US city population data from the 2000 census are employed to verify the multi-fractal models and the results are satisfying. The multi-fractal measures reveal some strange symmetry regularity of urban systems. While partially explaining the remains of the hierarchical step-like frequency distribution of city sizes suggested by central place theory, the mathematical framework can be interpreted with the entropy-maximizing principle and some related ideas from self-organization.
Kanwal, Jasmeen; Smith, Kenny; Culbertson, Jennifer; Kirby, Simon
2017-08-01
The linguist George Kingsley Zipf made a now classic observation about the relationship between a word's length and its frequency: the more frequent a word is, the shorter it tends to be. He claimed that this "Law of Abbreviation" is a universal structural property of language. The Law of Abbreviation has since been documented in a wide range of human languages, and extended to animal communication systems and even computer programming languages. Zipf hypothesised that this universal design feature arises as a result of individuals optimising form-meaning mappings under competing pressures to communicate accurately but also efficiently, his famous Principle of Least Effort. In this study, we use a miniature artificial language learning paradigm to provide direct experimental evidence for this explanatory hypothesis. We show that language users optimise form-meaning mappings only when pressures for accuracy and efficiency both operate during a communicative task, supporting Zipf's conjecture that the Principle of Least Effort can explain this universal feature of word length distributions. Copyright © 2017 Elsevier B.V. All rights reserved.
A scaling law beyond Zipf's law and its relation to Heaps' law
International Nuclear Information System (INIS)
Font-Clos, Francesc; Corral, Álvaro; Boleda, Gemma
2013-01-01
The dependence on text length of the statistical properties of word occurrences has long been considered a severe limitation on the usefulness of quantitative linguistics. We propose a simple scaling form for the distribution of absolute word frequencies that brings to light the robustness of this distribution as text grows. In this way, the shape of the distribution is always the same, and it is only a scale parameter that increases (linearly) with text length. By analyzing very long novels we show that this behavior holds both for raw, unlemmatized texts and for lemmatized texts. In the latter case, the distribution of frequencies is well approximated by a double power law, maintaining Zipf's exponent value γ ≃ 2 for large frequencies but yielding a smaller exponent in the low-frequency regime. The growth of the distribution with text length allows us to estimate the size of the vocabulary at each step and to propose a generic alternative to Heaps' law, which turns out to be intimately connected to the distribution of frequencies, thanks to its scaling behavior. (paper)
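The connection to Heaps' law can be checked with a toy sampling experiment (an illustration of the general mechanism, not the authors' scaling analysis): drawing tokens from a Zipf-distributed vocabulary yields sublinear vocabulary growth V(N) ~ N^β. The vocabulary size and text length below are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(3)

# Draw a "text" of tokens from a truncated Zipf word distribution
# (rank exponent 1, i.e. frequency exponent gamma = 2), then measure
# vocabulary growth V(N), which Heaps' law predicts to be sublinear.
W = 50000                                 # potential vocabulary size
ranks = np.arange(1, W + 1)
p = 1.0 / ranks                           # rank-size Zipf weights
p /= p.sum()

tokens = rng.choice(W, size=20000, p=p)
seen, growth = set(), []
for i, t in enumerate(tokens, 1):
    seen.add(t)
    if i % 1000 == 0:
        growth.append((i, len(seen)))

n, v = np.array(growth).T
beta, _ = np.polyfit(np.log(n), np.log(v), 1)
print(round(beta, 2))   # sublinear growth: 0 < beta < 1
```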
Finding exact constants in a Markov model of Zipf's law generation
Bochkarev, V. V.; Lerner, E. Yu.; Nikiforov, A. A.; Pismenskiy, A. A.
2017-12-01
According to the classical Zipf's law, word frequency is a power function of word rank with exponent -1. The objective of this work is to find the multiplicative constant in a Markov model of word generation. Previously, the case of independent letters was investigated with mathematical rigor in [Bochkarev V V and Lerner E Yu 2017 International Journal of Mathematics and Mathematical Sciences Article ID 914374]. Unfortunately, the methods used in that paper cannot be generalized to the case of Markov chains. The search for the correct formulation of the Markov generalization of these results was performed using experiments with different ergodic matrices of transition probability P. A combinatorial technique allowed us to take into account all the words with probability greater than e^-300 in the case of 2-by-2 matrices. It was experimentally shown that the required constant in the limit is equal to the reciprocal of the conditional entropy of the rows of P, with weights given by the elements of the vector π of the stationary distribution of the Markov chain.
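The limiting constant described, the reciprocal of the row entropy of P weighted by the stationary distribution π, is straightforward to compute. A sketch for an arbitrary ergodic matrix follows; the example 2-by-2 matrix is made up for illustration:

```python
import numpy as np

def markov_entropy_constant(P):
    """Reciprocal of the entropy rate of an ergodic Markov chain.

    H = -sum_i pi_i sum_j P_ij ln P_ij, where pi is the stationary
    distribution (left eigenvector of P for eigenvalue 1).
    """
    P = np.asarray(P, dtype=float)
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmax(np.real(vals))])
    pi /= pi.sum()
    H = -np.sum(pi[:, None] * P * np.log(P))
    return 1.0 / H

# Example 2x2 transition matrix (rows sum to 1).
P = [[0.7, 0.3],
     [0.4, 0.6]]
print(round(markov_entropy_constant(P), 3))   # → 1.569
```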
Physico-Chemical and Structural Interpretation of Discrete Derivative Indices on N-Tuples Atoms
Martínez-Santiago, Oscar; Marrero-Ponce, Yovani; Barigye, Stephen J.; Le Thi Thu, Huong; Torres, F. Javier; Zambrano, Cesar H.; Muñiz Olite, Jorge L.; Cruz-Monteagudo, Maykel; Vivas-Reyes, Ricardo; Vázquez Infante, Liliana; Artiles Martínez, Luis M.
2016-01-01
This report examines the interpretation of the Graph Derivative Indices (GDIs) from three different perspectives (i.e., in structural, steric and electronic terms). It is found that the individual vertex frequencies may be expressed in terms of the geometrical and electronic reactivity of the atoms and bonds, respectively. On the other hand, it is demonstrated that the GDIs are sensitive to progressive structural modifications in terms of size, ramifications, electronic richness, conjugation effects and molecular symmetry. Moreover, it is observed that the GDIs quantify the interaction capacity among molecules and codify information on the activation entropy. A structure-property relationship study reveals that there exists a direct correspondence between the individual frequencies of atoms and Hückel's Free Valence, as well as between the atomic GDIs and the chemical shift in NMR, which collectively validates the theory that these indices codify steric and electronic information of the atoms in a molecule. Taking into consideration the regularity and coherence found in experiments performed with the GDIs, it is possible to say that GDIs possess a plausible interpretation in structural and physicochemical terms. PMID:27240357
Integrating PAW, a graphical analysis interface to Sybase
International Nuclear Information System (INIS)
Fry, A.; Chow, I.
1993-04-01
The program PAW (Physics Analysis Workstation) enjoys tremendous popularity within the high energy physics community. It is implemented on a large number of platforms and is available to the high energy physics community free of charge from the CERN computing division. PAW combines extensive graphical display capability (HPLOT/HIGZ) with histogramming (HBOOK4), file and data handling (ZEBRA), vector arithmetic manipulation (SIGMA), user defined functions (COMIS), powerful function minimization (MINUIT), and a command interpreter (KUIP). To facilitate the possibility of using relational databases in physics analysis, we have added an SQL interface to PAW. This interface allows users to create PAW N-tuples from Sybase tables and vice versa. We discuss the implementation below.
Esteganografía lingüística en lengua española basada en modelo N-gram y ley de Zipf
Directory of Open Access Journals (Sweden)
Muñoz Muñoz, Alfonso
2014-08-01
Full Text Available Linguistic steganography is a science that utilises computational linguistics to design systems that can be used to protect and ensure the privacy of digital communications and for the digital marking of texts. Various proposed ways of achieving this goal have been documented in recent years. This paper analyses the possibility of automatically generating natural language texts in Spanish that conceal a given piece of information. A number of hypotheses are put forward and tested using an algorithm. Experimental evidence suggests that it is feasible to use N-gram models and specific features of Zipf's law to generate stegotexts of such linguistic quality that human readers could not differentiate them from authentic texts. The stegotexts obtained allow the concealment of at least 0.5 bits per word generated.
Power-law connections: From Zipf to Heaps and beyond
International Nuclear Information System (INIS)
Eliazar, Iddo I.; Cohen, Morrel H.
2013-01-01
In this paper we explore the asymptotic statistics of a general model of rank distributions in the large-ensemble limit; the construction of the general model is motivated by recent empirical studies of rank distributions. Applying Lorenzian, oligarchic, and Heapsian asymptotic analyses we establish a comprehensive set of closed-form results linking together rank distributions, probability distributions, oligarchy sizes, and innovation rates. In particular, the general results reveal the fundamental underlying connections between Zipf's law, Pareto's law, and Heaps' law, three elemental empirical power laws that are ubiquitously observed in the sciences. Highlights: the large-ensemble asymptotic statistics of rank distributions are explored; Lorenzian, oligarchic, and Heapsian asymptotic analyses are applied; associated oligarchy sizes and induced innovation rates are analyzed; general elemental statistical connections are established; the underlying connections between Zipf's, Pareto's and Heaps' laws are unveiled.
Emergent intelligent properties of progressively structured pattern recognition nets
Energy Technology Data Exchange (ETDEWEB)
Aleksander, I
1983-07-01
The n-tuple recognition net is seen as a building brick of a progression of network structures. The emergent intelligent properties of such systems are discussed. They include the amplification of confidence for the recognition of images that differ in small detail, a short-term memory of the last seen image, sequence sensitivity, sequence acceptance and saccadic inspection as an aid in scene analysis. 12 references.
Quantiprot - a Python package for quantitative analysis of protein sequences.
Konopka, Bogumił M; Marciniak, Marta; Dyrka, Witold
2017-07-17
The field of protein sequence analysis is dominated by tools rooted in substitution matrices and alignments. A complementary approach is provided by methods of quantitative characterization. A major advantage of the approach is that quantitative properties define a multidimensional solution space, where sequences can be related to each other and differences can be meaningfully interpreted. Quantiprot is a software package in Python which provides a simple and consistent interface to multiple methods for quantitative characterization of protein sequences. The package can be used to calculate dozens of characteristics directly from sequences or using physico-chemical properties of amino acids. Besides basic measures, Quantiprot performs quantitative analysis of recurrence and determinism in the sequence, calculates the distribution of n-grams and computes the Zipf's law coefficient. We propose three main fields of application of the Quantiprot package. First, quantitative characteristics can be used in alignment-free similarity searches, and in clustering of large and/or divergent sequence sets. Second, a feature space defined by quantitative properties can be used in comparative studies of protein families and organisms. Third, the feature space can be used for evaluating generative models, where a large number of sequences generated by the model can be compared to actually observed sequences.
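A Zipf's-law coefficient of the kind described can be sketched independently of the package (the helper below is illustrative and is not Quantiprot's actual API): rank the n-gram frequencies of a sequence and fit the log-log slope:

```python
from collections import Counter
import numpy as np

def zipf_coefficient(seq, n=2):
    """Slope of log(frequency) vs log(rank) for the n-grams of `seq`."""
    grams = Counter(seq[i:i + n] for i in range(len(seq) - n + 1))
    freqs = np.sort(np.array(list(grams.values()), dtype=float))[::-1]
    ranks = np.arange(1, len(freqs) + 1)
    slope, _ = np.polyfit(np.log(ranks), np.log(freqs), 1)
    return slope

# Toy protein-like sequence; a real analysis would use full sequences.
seq = "MKVLAAGLLKVLAAGMKVLAAGLLKKAAGMKVL" * 4
print(round(zipf_coefficient(seq), 2))
```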
Analysis of mass incident diffusion in Weibo based on self-organization theory
Pan, Jun; Shen, Huizhang
2018-02-01
This study introduces theories and methods of self-organization systems to the research on the diffusion mechanism of mass incidents in Weibo (Chinese Twitter). Based on the analysis of massive Weibo data from the Songjiang battery factory incident of 2013 and the Jiangsu Qidong OJI PAPER incident of 2012, we find that the diffusion system of mass incidents in Weibo satisfies a power law, Zipf's law, 1/f noise and self-similarity. This means the system is a self-organized criticality system and dissemination bursts can be understood as a kind of self-organized behavior. As a consequence, self-organized criticality (SOC) theory can be used to explain the evolution of mass incident diffusion, and one may come up with the right strategy to control such diffusion by handling the key ingredients of self-organization well. Such a study is of practical importance, offering opportunities for policy makers to manage these events well.
DEFF Research Database (Denmark)
Mathiesen, Brian Vad; Liu, Wen; Zhang, Xiliang
2014-01-01
three major technological changes: energy savings on the demand side, efficiency improvements in energy production, and the replacement of fossil fuels by various sources of renewable energy. Consequently, the analysis of these systems must include strategies for integrating renewable sources...
On power series expansions of the S-resolvent operator and the Taylor formula
Colombo, Fabrizio; Gantner, Jonathan
2016-12-01
The S-functional calculus is based on the theory of slice hyperholomorphic functions and defines functions of n-tuples of not necessarily commuting operators or of quaternionic operators. This calculus relies on the notions of S-spectrum and of S-resolvent operator. Since most of the properties that hold for the Riesz-Dunford functional calculus extend to the S-functional calculus, it can be considered its noncommutative version. In this paper we show that the Taylor formula of the Riesz-Dunford functional calculus can be generalized to the S-functional calculus. The proof is not a trivial extension of the classical case because several obstructions due to the noncommutativity of the setting in which we work have to be overcome. To prove the Taylor formula we need to introduce a new series expansion of the S-resolvent operators associated to the sum of two n-tuples of operators. This result is a crucial step in the proof of our main results, but it is also of independent interest because it gives a new series expansion for the S-resolvent operators. This paper is addressed to researchers working in operator theory and in hypercomplex analysis.
Wong, Wing-Cheong; Ng, Hong-Kiat; Tantoso, Erwin; Soong, Richie; Eisenhaber, Frank
2018-02-12
Though earlier works on modelling transcript abundance from vertebrates to lower eukaryotes have specifically singled out Zipf's law, the observed distributions often deviate from a single power-law slope. In hindsight, while power laws of critical phenomena are derived asymptotically under the condition of infinite observations, real-world observations are finite, where finite-size effects set in to force a power-law distribution into an exponential decay and, consequently, manifest as a curvature (i.e., varying exponent values) in a log-log plot. If transcript abundance is truly power-law distributed, the varying exponent signifies changing mathematical moments (e.g., mean, variance) and creates heteroskedasticity which compromises statistical rigor in analysis. The impact of this deviation from the asymptotic power law on sequencing count data has never truly been examined and quantified. The anecdotal description of transcript abundance as almost Zipf's-law-like distributed can be conceptualized as the imperfect mathematical rendition of the Pareto power-law distribution when subjected to finite-size effects in the real world, regardless of the advancement in sequencing technology, since sampling is finite in practice. Our conceptualization agrees well with our empirical analysis of two modern-day NGS (next-generation sequencing) datasets: an in-house generated dilution miRNA study of two gastric cancer cell lines (NUGC3 and AGS) and a publicly available spike-in miRNA dataset. First, the finite-size effects cause the deviations of sequencing count data from Zipf's law and issues of reproducibility in sequencing experiments. Second, they manifest as heteroskedasticity among experimental replicates to bring about statistical woes. Surprisingly, a straightforward power-law correction that restores the distribution distortion to a single exponent value can dramatically reduce data heteroskedasticity to invoke an instant increase in
Evolution of scaling emergence in large-scale spatial epidemic spreading.
Wang, Lin; Li, Xiang; Zhang, Yi-Qing; Zhang, Yan; Zhang, Kan
2011-01-01
Zipf's law and Heaps' law are two representatives of the scaling concepts which play a significant role in the study of complexity science. The coexistence of Zipf's law and Heaps' law motivates different understandings of the dependence between these two scalings, which has yet to be clarified. In this article, we observe an evolution process of the scalings: Zipf's law and Heaps' law are naturally shaped to coexist at the initial time, while a crossover comes with the emergence of their inconsistency at larger times before reaching a stable state, where Heaps' law still holds while strict Zipf's law disappears. Such findings are illustrated with a scenario of large-scale spatial epidemic spreading, and the empirical results of pandemic disease support a universal analysis of the relation between the two laws regardless of the biological details of the disease. Employing the United States domestic air transportation and demographic data to construct a metapopulation model for simulating pandemic spread at the U.S. country level, we uncover that the broad heterogeneity of the infrastructure plays a key role in the evolution of scaling emergence. The analyses of large-scale spatial epidemic spreading help understand the temporal evolution of scalings, indicating that the coexistence of Zipf's law and Heaps' law depends on the collective dynamics of epidemic processes, and the heterogeneity of epidemic spread indicates the significance of performing targeted containment strategies in the early stage of a pandemic disease.
Bankruptcy risk model and empirical tests
Podobnik, Boris; Horvatic, Davor; Petersen, Alexander M.; Urošević, Branko; Stanley, H. Eugene
2010-01-01
We analyze the size dependence and temporal stability of firm bankruptcy risk in the US economy by applying Zipf scaling techniques. We focus on a single risk factor—the debt-to-asset ratio R—in order to study the stability of the Zipf distribution of R over time. We find that the Zipf exponent increases during market crashes, implying that firms go bankrupt with larger values of R. Based on the Zipf analysis, we employ Bayes’s theorem and relate the conditional probability that a bankrupt firm has a ratio R with the conditional probability of bankruptcy for a firm with a given R value. For 2,737 bankrupt firms, we demonstrate size dependence in assets change during the bankruptcy proceedings. Prepetition firm assets and petition firm assets follow Zipf distributions but with different exponents, meaning that firms with smaller assets adjust their assets more than firms with larger assets during the bankruptcy process. We compare bankrupt firms with nonbankrupt firms by analyzing the assets and liabilities of two large subsets of the US economy: 2,545 Nasdaq members and 1,680 New York Stock Exchange (NYSE) members. We find that both assets and liabilities follow a Pareto distribution. The finding is not a trivial consequence of the Zipf scaling relationship of firm size quantified by employees—although the market capitalization of Nasdaq stocks follows a Pareto distribution, the same distribution does not describe NYSE stocks. We propose a coupled Simon model that simultaneously evolves both assets and debt with the possibility of bankruptcy, and we also consider the possibility of firm mergers. PMID:20937903
Do Young Children Have Adult-Like Syntactic Categories? Zipf's Law and the Case of the Determiner
Pine, Julian M.; Freudenthal, Daniel; Krajewski, Grzegorz; Gobet, Fernand
2013-01-01
Generativist models of grammatical development assume that children have adult-like grammatical categories from the earliest observable stages, whereas constructivist models assume that children's early categories are more limited in scope. In the present paper, we test these assumptions with respect to one particular syntactic category, the…
GAO Hongying; WU Kangping
2007-01-01
This paper estimates the Pareto exponent of the city size (population size and economy size) distribution for all provinces and three regions in China in 1997, 2000 and 2003 by OLS, comparatively analyzes the Pareto exponents across sections and over time, and empirically analyzes the factors which impact the Pareto exponents of provinces. Our analyses show that the size distributions of cities in China follow the Pareto distribution and have structural features. Variations in the value of the P...
Kenett, Dror Y.; Shapira, Yoash; Ben-Jacob, Eshel
2009-01-01
We present here an assessment of the latent market information embedded in the raw, affinity (normalized), and partial correlations. We compared the Zipf plot, spectrum, and distribution of the eigenvalues for each matrix with the results of the corresponding random matrix. The analysis was performed on stocks belonging to the New York ...
International Nuclear Information System (INIS)
Gravina, M.F.; Kunz, P.F.; Pavel, T.J.; Rensing, P.E.
1992-02-01
HippoDraw is the result of research into finding better ways to visualize the kind of statistical data that is so common in high energy physics analyses. In these analyses, frequency distributions are visualized as histograms, contour plots, scatter plots, etc. Traditionally, one used a library of subroutines, called a histogram package, within one's analysis programs to create and display such distributions. HippoDraw is a NeXTstep application for viewing statistical data. It has several unique features which make viewing data distributions highly interactive. It also incorporates simple drawing tools. HippoDraw is written in Objective-C and uses the Hippoplotamus library package which handles the n-tuples and displays. Hippoplotamus is written in ANSI C. 4 refs
Modeling fractal structure of city-size distributions using correlation functions.
Chen, Yanguang
2011-01-01
Zipf's law is one of the most conspicuous empirical facts for cities; however, there is no convincing explanation for the scaling relation between rank and size and its scaling exponent. Using ideas from general fractals and scaling, I propose a dual competition hypothesis of city development to explain the value intervals and the special value, 1, of the power exponent. Zipf's law and Pareto's law can be mathematically transformed into one another, but represent different processes of urban evolution. Based on the Pareto distribution, a frequency correlation function can be constructed; by scaling analysis and the multifractal spectrum, the parameter interval of the Pareto exponent is derived as (0.5, 1]. Based on the Zipf distribution, a size correlation function can be built, and it is the opposite of the first one; by the second correlation function and the multifractal notion, the Pareto exponent interval is derived as [1, 2). Thus the process of urban evolution falls into two effects: one is the Pareto effect, indicating city number increase (external complexity), and the other the Zipf effect, indicating city size growth (internal complexity). Because of the struggle between the two effects, the scaling exponent varies from 0.5 to 2; but if the two effects reach equilibrium with each other, the scaling exponent approaches 1. A series of mathematical experiments on hierarchical correlation is employed to verify the models, and a conclusion can be drawn that if cities in a given region follow Zipf's law, the frequency and size correlations will follow the scaling law. This theory can be generalized to interpret the inverse power-law distributions in various fields of the physical and social sciences.
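The mathematical transformation between Pareto's law and Zipf's law invoked above can be verified numerically: if sizes have a Pareto tail P(X > x) ~ x^(-α), the size of rank r scales as r^(-1/α). A quick sketch with synthetic data (α = 1.5 is an arbitrary choice inside the derived intervals):

```python
import numpy as np

rng = np.random.default_rng(5)
alpha = 1.5                              # Pareto (frequency) exponent
x = rng.pareto(alpha, 50000) + 1.0       # sizes with tail index alpha

# Rank-size (Zipf) view of the same sample: size of rank r ~ r^(-1/alpha),
# so the fitted Zipf exponent should be close to 1/alpha.
sizes = np.sort(x)[::-1][:2000]          # upper tail only
ranks = np.arange(1, len(sizes) + 1)
q, _ = np.polyfit(np.log(ranks), np.log(sizes), 1)
print(round(-q, 2), round(1 / alpha, 2))   # the two numbers should agree
```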
Nonlinear Fluctuation Behavior of Financial Time Series Model by Statistical Physics System
Directory of Open Access Journals (Sweden)
Wuyang Cheng
2014-01-01
Full Text Available We develop a random financial time series model of the stock market based on one of the statistical physics systems, the stochastic contact interacting system. The contact process is a continuous-time Markov process; one interpretation of this model is as a model for the spread of an infection, where the epidemic spreading mimics the interplay of local infections and recovery of individuals. From this financial model, we study the statistical behaviors of return time series, and the corresponding behaviors of returns for the Shanghai Stock Exchange Composite Index (SSECI) and the Hang Seng Index (HSI) are also comparatively studied. Further, we investigate the Zipf distribution and the multifractal phenomenon of returns and price changes. Zipf analysis and MF-DFA analysis are applied to investigate the nature of fluctuations in the stock market.
Codon and amino-acid distribution in DNA
International Nuclear Information System (INIS)
Kim, J.K.; Yang, S.I.; Kwon, Y.H.; Lee, E.I.
2005-01-01
According to Zipf's law, the rank-ordered frequency distribution of words in natural language can be modelled by a power law. In this paper, we examine the frequency distribution of the 64 codons over the coding and non-coding regions of 88 DNA sequences from the EMBL and GenBank databases, using exponential fitting. Also, we regard the 20 amino acids as a vocabulary, perform the same frequency analysis on the same database, and show that amino acids can be used as biologically meaningful words for Zipf's approach. Our analysis suggests that a natural language structure may exist not only in the coding region of DNA but also in the non-coding one.
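The exponential fitting used here amounts to a linear regression of log frequency on rank, f(r) ≈ a·e^(−br). A toy sketch follows; the synthetic codon sequence and its bias parameter 0.05 are invented for illustration in place of the EMBL/GenBank data:

```python
import itertools
from collections import Counter
import numpy as np

rng = np.random.default_rng(7)
codons = ["".join(c) for c in itertools.product("ACGT", repeat=3)]

# Toy coding region: sample 10000 codons with a mildly biased usage so
# the rank-frequency curve decays; real data would come from EMBL/GenBank.
weights = np.exp(-0.05 * np.arange(64))
weights /= weights.sum()
seq = rng.choice(codons, size=10000, p=weights)

counts = np.sort(np.array(list(Counter(seq).values()), dtype=float))[::-1]
ranks = np.arange(1, len(counts) + 1)

# Exponential fit f(r) = a * exp(-b r): regress log f on r.
slope, log_a = np.polyfit(ranks, np.log(counts), 1)
print(round(-slope, 3))   # decay rate b; should be near the 0.05 used above
```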
Decision analysis multicriteria analysis
International Nuclear Information System (INIS)
Lombard, J.
1986-09-01
The ALARA procedure covers a wide range of decisions, from the simplest to the most complex. For the simplest, engineering judgement is generally sufficient, and the use of a decision-aiding technique is not necessary. For some decisions, the comparison of the available protection options may be performed using two or a few criteria (or attributes), such as protection cost and collective dose, and rather simple decision-aiding techniques, like cost-effectiveness analysis or cost-benefit analysis, are quite adequate. For more complex decisions, involving numerous criteria, large uncertainties, or qualitative judgement, these techniques, even extended cost-benefit analysis, are not recommended, and appropriate techniques such as multi-attribute decision-aiding techniques are more relevant. There are many such techniques, and it is not possible to present all of them. Therefore only two broad categories of multi-attribute decision-aiding techniques are presented here: decision analysis and outranking analysis.
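As a hedged illustration of one of the simplest multi-attribute techniques (simple additive weighting, which the abstract does not single out; the options, criteria values, and the cost-per-dose trade-off weight below are all hypothetical):

```python
def weighted_score(option, weights):
    """Simple additive weighting over cost-like criteria:
    each attribute is something to be minimized, so the
    option with the lowest total score is preferred."""
    return sum(weights[k] * option[k] for k in weights)

# Hypothetical protection options scored on cost (k$) and collective dose (man.Sv)
options = {
    "shielding": {"cost": 120.0, "dose": 0.8},
    "procedure": {"cost": 40.0, "dose": 1.5},
}
weights = {"cost": 1.0, "dose": 100.0}  # assumed trade-off: 100 k$ per man.Sv
best = min(options, key=lambda name: weighted_score(options[name], weights))
```

At this assumed trade-off value the cheaper procedural option wins; a higher monetary value per man.Sv would tip the choice toward shielding, which is exactly the sensitivity such techniques make explicit.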
Chalise, Darshan
2017-01-01
The interaction between Dark Matter particles and Standard Model particles is possible through a force mediated by a Dark Matter (DM) - Standard Model (SM) mediator. If that mediator decays through a dijet event, the reconstructed invariant mass of the jets will peak at a specific value, in contrast to the smooth QCD background. This analysis is preliminary work towards understanding how changes in detector conditions at the Future Circular Collider affect the sensitivity of the mediator signal. MadGraph 5 was used to produce events with a 30 TeV DM mediator, and Heppy was used to produce flat n-tuples for ROOT analysis. MadAnalysis 5 was then used to produce histograms of the MadGraph events, and PyRoot was used to analyze the Heppy output. Histograms of the invariant mass of the jets, both after event production through MadGraph and after the Heppy analysis, showed a peak at 30 TeV. This verified the production of a 30 TeV mediator during event production.
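The dijet invariant mass used to locate the mediator peak follows from the summed jet four-vectors; a minimal sketch in natural units (illustrative four-vectors, not code from the analysis):

```python
import math

def invariant_mass(jet1, jet2):
    """Invariant mass of two jets given (E, px, py, pz) four-vectors,
    m^2 = (E1 + E2)^2 - |p1 + p2|^2, in natural units (c = 1)."""
    E = jet1[0] + jet2[0]
    px = jet1[1] + jet2[1]
    py = jet1[2] + jet2[2]
    pz = jet1[3] + jet2[3]
    return math.sqrt(max(E * E - px * px - py * py - pz * pz, 0.0))

# Back-to-back massless jets of 15 TeV each reconstruct a 30 TeV resonance
j1 = (15000.0, 15000.0, 0.0, 0.0)   # GeV
j2 = (15000.0, -15000.0, 0.0, 0.0)
m = invariant_mass(j1, j2)  # 30000.0 GeV = 30 TeV
```

Filling a histogram of this quantity over many events produces the peak-over-smooth-background picture described above.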
Using Grid for the BABAR Experiment
International Nuclear Information System (INIS)
Bozzi, C.
2005-01-01
The BaBar experiment has been taking data since 1999. In 2001 the computing group started to evaluate the possibility of evolving toward a distributed computing model in a grid environment. We built a prototype system, based on the European Data Grid (EDG), to submit full-scale analysis and Monte Carlo simulation jobs. Computing elements, storage elements, and worker nodes have been installed at SLAC and at various European sites. A BaBar virtual organization (VO) and a test replica catalog (RC) are maintained in Manchester, U.K., and the experiment is using three EDG testbed resource brokers in the U.K. and in Italy. First analysis tests were performed under the assumption that a standard BaBar software release was available at the grid target sites, using the RC to register information about the executable and the produced n-tuples. Hundreds of analysis jobs accessing either Objectivity or Root data files ran on the grid. We tested Monte Carlo production using a farm of the INFN-grid testbed customized to install an Objectivity database and run BaBar simulation software. First simulation production tests were performed using standard Job Description Language commands, and the output files were written to the closest storage element. A package that can be officially distributed to grid sites not specifically customized for BaBar has been prepared. We are studying the possibility of adding a user-friendly interface to access grid services for BaBar.
International Nuclear Information System (INIS)
2008-05-01
This book introduces the energy and resource technology development business together with a performance analysis. It covers the division and definition of the business, an analysis of the current state of support, the substance of the national basic plan for energy and resource technology development, the selection of analysis indices, performance analysis results by index, the investigation of performance results, and the analysis and appraisal of the energy and resource technology development business in 2007.
Word frequencies: A comparison of Pareto type distributions
Wiegand, Martin; Nadarajah, Saralees; Si, Yuancheng
2018-03-01
Mehri and Jamaati (2017) [18] used Zipf's law to model word frequencies in Holy Bible translations for one hundred living languages. We compare the fit of Zipf's law to a number of Pareto type distributions. The latter distributions are shown to provide the best fit, as judged by a number of comparative plots and error measures. The fit of Zipf's law appears generally poor.
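Comparative error measures of the kind used above can be sketched as follows (hypothetical frequencies and model predictions; the paper's actual distributions and error measures differ):

```python
import math

def sse_log(observed, predicted):
    """Sum of squared errors on the log scale, one simple comparative
    measure of fit for heavy-tailed frequency data."""
    return sum((math.log(o) - math.log(p)) ** 2
               for o, p in zip(observed, predicted))

# Hypothetical word frequencies and two candidate models' predictions
freqs = [100, 48, 33, 24, 20]
zipf_pred = [100 / r for r in range(1, 6)]                     # Zipf: f ~ 1/r
pareto_pred = [100 / (r + 0.2) ** 0.95 for r in range(1, 6)]   # a shifted Pareto-type form

zipf_err = sse_log(freqs, zipf_pred)
pareto_err = sse_log(freqs, pareto_pred)
```

Whichever model yields the smaller error on real data would be preferred, which is the comparison the paper carries out across several Pareto-type families.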
International Nuclear Information System (INIS)
Jae, Myeong Gi; Lee, Won Seong; Kim, Ha Hyeok
1989-02-01
This book gives a description of electronic engineering, covering circuit elements and devices, circuit analysis, and digital logic circuits; electrochemical methods such as conductometry, potentiometry, and current measurement; spectrochemical analysis with electromagnetic radiation, optical components, absorption spectroscopy, X-ray analysis, and atomic absorption spectrometry, with references; chromatography, including gas chromatography and liquid chromatography; and automated analysis, covering control-system evaluation of automated analysis and automated analysis systems, with references.
Subspace gaps and Weyl's theorem for an elementary operator
Directory of Open Access Journals (Sweden)
B. P. Duggal
2005-01-01
Full Text Available A range-kernal orthogonality property is established for the elementary operators ℰ(X=∑i=1nAiXBi and ℰ*(X=∑i=1nAi*XBi*, where A=(A1,A2,…,An and B=(B1,B2,…,Bn are n-tuples of mutually commuting scalar operators (in the sense of Dunford in the algebra B(H of operators on a Hilbert space H. It is proved that the operator ℰ satisfies Weyl's theorem in the case in which A and B are n-tuples of mutually commuting generalized scalar operators.
Spectral theory of linear operators and spectral systems in Banach algebras
Müller, Vladimir
2003-01-01
This book is dedicated to the spectral theory of linear operators on Banach spaces and of elements in Banach algebras. It presents a survey of results concerning various types of spectra, both of single and n-tuples of elements. Typical examples are the one-sided spectra, the approximate point, essential, local and Taylor spectrum, and their variants. The theory is presented in a unified, axiomatic and elementary way. Many results appear here for the first time in a monograph. The material is self-contained. Only a basic knowledge of functional analysis, topology, and complex analysis is assumed. The monograph should appeal both to students who would like to learn about spectral theory and to experts in the field. It can also serve as a reference book. The present second edition contains a number of new results, in particular, concerning orbits and their relations to the invariant subspace problem.
Energy Technology Data Exchange (ETDEWEB)
Kim, Seung Jae; Seo, Seong Gyu
1995-03-15
This textbook deals with instrumental analysis and consists of nine chapters. It covers an introduction to analytical chemistry, including the process of analysis and the types and forms of analysis; electrochemistry, covering basic theory, potentiometry, and conductometry; electromagnetic radiation and optical components, with an introduction and applications; ultraviolet and visible spectrophotometry; atomic absorption spectrophotometry, including flame emission spectrometry and plasma emission spectrometry; and further topics such as infrared spectrophotometry, X-ray spectrophotometry, mass spectrometry, chromatography, and other instrumental methods such as radiochemistry.
Feynman's Operational Calculi: Spectral Theory for Noncommuting Self-adjoint Operators
International Nuclear Information System (INIS)
Jefferies, Brian; Johnson, Gerald W.; Nielsen, Lance
2007-01-01
The spectral theorem for commuting self-adjoint operators along with the associated functional (or operational) calculus is among the most useful and beautiful results of analysis. It is well known that forming a functional calculus for noncommuting self-adjoint operators is far more problematic. The central result of this paper establishes a rich functional calculus for any finite number of noncommuting (i.e. not necessarily commuting) bounded, self-adjoint operators A1,...,An and associated continuous Borel probability measures μ1,...,μn on [0,1]. Fix A1,...,An. Then each choice of an n-tuple (μ1,...,μn) of measures determines one of Feynman's operational calculi acting on a certain Banach algebra of analytic functions, even when A1,...,An are just bounded linear operators on a Banach space. The Hilbert space setting along with self-adjointness allows us to extend the operational calculi well beyond the analytic functions. Using results and ideas drawn largely from the proof of our main theorem, we also establish a family of Trotter product type formulas suitable for Feynman's operational calculi
Haralick, R. M.; Kanemasu, E. T.; Morain, S. A.; Yarger, H. L.; Ulaby, F. T.; Davis, J. C. (Principal Investigator); Bosley, R. J.; Williams, D. L.; Mccauley, J. R.; Mcnaughton, J. L.
1973-01-01
The author has identified the following significant results. Improvement in the land-use classification accuracy of ERTS-1 MSS multi-images over Kansas can be made by using two distances between neighboring grey-tone N-tuples instead of one distance. Much more information is contained texturally than spectrally in the Kansas image. Ground truth measurements indicate that reflectance ratios of the 545 and 655 nm wavebands provide an index of plant development and possibly of physiological stress. Preliminary analysis of MSS channels 4 and 5 substantiates the ground truth interpretation. Results of the land-use mapping experiment indicate that ERTS-1 imagery has major potential in regionalization. The ways in which land is utilized within these regions may then be studied more effectively than if no adequate regionalization were available. A model for estimating wheat yield per acre has been applied to acreage estimates derived from ERTS-1 imagery to project the 1973 wheat yields for a ten-county area in southwest Kansas. The results are within 3% of the preharvest estimates for the same area prepared by the USDA. Visual identification of winter wheat is readily achieved by using a temporal sequence of images. Identification can be improved by stratifying the project area into subregions having more or less homogeneous agricultural practices and crop mixes.
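The band-ratio index and the yield projection described above reduce to simple arithmetic; a sketch follows (all numbers are hypothetical illustrations, not the study's data):

```python
def band_ratio_index(r545, r655):
    """Ratio of reflectances in the 545 nm and 655 nm wavebands,
    used as a simple index of plant development."""
    return r545 / r655

def projected_production(acreage_by_county, yield_per_acre):
    """Total projected wheat production: estimated acreage per county
    times a (here assumed uniform) modelled yield per acre."""
    return sum(acres * yield_per_acre for acres in acreage_by_county.values())

# Hypothetical acreage estimates (acres) and an assumed 30 bushels/acre yield
acreage = {"county_a": 120_000, "county_b": 95_000, "county_c": 80_000}
total_bushels = projected_production(acreage, 30.0)
```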
Sensitivity analysis determines the effectiveness of antibiotics against microorganisms (germs) ...
McShane, Edward James
2013-01-01
This text surveys practical elements of real function theory, general topology, and functional analysis. Discusses the maximality principle, the notion of convergence, the Lebesgue-Stieltjes integral, function spaces and harmonic analysis. Includes exercises. 1959 edition.
Cerebrospinal fluid analysis ... Analysis of CSF can help detect certain conditions and diseases. All of the following can be, but ... An abnormal CSF analysis result may be due to many different causes, ... Encephalitis (such as West Nile and Eastern Equine) Hepatic ...
Semen analysis measures the amount and quality of a man's semen and sperm. Semen is ...
Kantorovich, L V
1982-01-01
Functional Analysis examines trends in functional analysis as a mathematical discipline and the ever-increasing role played by its techniques in applications. The theory of topological vector spaces is emphasized, along with the applications of functional analysis to applied analysis. Some topics of functional analysis connected with applications to mathematical economics and control theory are also discussed. Comprised of 18 chapters, this book begins with an introduction to the elements of the theory of topological spaces, the theory of metric spaces, and the theory of abstract measure space
International Nuclear Information System (INIS)
Berman, M.; Bischof, L.M.; Breen, E.J.; Peden, G.M.
1994-01-01
This paper provides an overview of modern image analysis techniques pertinent to materials science. The usual approach in image analysis contains two basic steps: first, the image is segmented into its constituent components (e.g. individual grains), and second, measurement and quantitative analysis are performed. Usually, the segmentation part of the process is the harder of the two. Consequently, much of the paper concentrates on this aspect, reviewing both fundamental segmentation tools (commonly found in commercial image analysis packages) and more advanced segmentation tools. There is also a review of the most widely used quantitative analysis methods for measuring the size, shape and spatial arrangements of objects. Many of the segmentation and analysis methods are demonstrated using complex real-world examples. Finally, there is a discussion of hardware and software issues. 42 refs., 17 figs
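The two-step approach described above (segment, then measure) can be sketched with thresholding followed by connected-component counting, a toy stand-in for the fundamental segmentation tools reviewed; real image analysis packages use far richer methods:

```python
from collections import deque

def segment_and_count(image, threshold):
    """Two-step image analysis sketch: (1) segment by thresholding,
    (2) measure by counting 4-connected foreground components
    (e.g. individual grains)."""
    h, w = len(image), len(image[0])
    mask = [[px > threshold for px in row] for row in image]
    seen = [[False] * w for _ in range(h)]
    components = 0
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                components += 1
                queue = deque([(y, x)])
                seen[y][x] = True
                while queue:  # breadth-first flood fill of one component
                    cy, cx = queue.popleft()
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
    return components

# Two bright "grains" on a dark background
img = [
    [0, 9, 0, 0],
    [0, 9, 0, 8],
    [0, 0, 0, 8],
]
n = segment_and_count(img, 5)  # finds 2 components
```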
Thiemann, Francis C.
Semiotic analysis is a method of analyzing signs (e.g., words) to reduce non-numeric data to their component parts without losing essential meanings. Semiotics dates back to Aristotle's analysis of language; it was much advanced by nineteenth-century analyses of style and logic and by Whitehead and Russell's description in this century of the role…
Indian Academy of Sciences (India)
Dimensional analysis is a useful tool which finds important applications in physics and engineering. It is most effective when there exists a maximal number of dimensionless quantities constructed out of the relevant physical variables. Though a complete theory of dimensional analysis was developed way back in 1914 in a.
Bravená, Helena
2009-01-01
This bachelor thesis deals with the importance of job analysis for personnel activities in the company. The aim of this work is to find the most suitable method of job analysis in a particular enterprise, and then to create descriptions and specifications of each job.
Involutive distributions of operator-valued evolutionary vector fields and their affine geometry
Kiselev, A.V.; van de Leur, J.W.
2010-01-01
We generalize the notion of a Lie algebroid over infinite jet bundle by replacing the variational anchor with an N-tuple of differential operators whose images in the Lie algebra of evolutionary vector fields of the jet space are subject to collective commutation closure. The linear space of such
International Nuclear Information System (INIS)
Gravina, M.F.; Kunz, P.F.; Rensing, P.E.
1992-09-01
HippoDraw is a NeXTSTEP application for viewing statistical data. It has several unique features which make viewing data distributions highly interactive. It also incorporates a set of simple drawing tools. HippoDraw is written in Objective-C and uses the Hippoplotamus library package to handle the n-tuples and displays
International Nuclear Information System (INIS)
Francois, P.
1996-01-01
We undertook a study programme at the end of 1991. To start with, we performed some exploratory studies aimed at learning some preliminary lessons on this type of analysis: assessment of the interest of probabilistic incident analysis; the possibility of using PSA scenarios; and the skills and resources required. At the same time, EPN created a working group whose assignment was to define a new approach for the analysis of incidents at NPPs. This working group considered both aspects of operating feedback that EPN wished to improve: analysis of significant incidents and analysis of potential consequences. We took part in the work of this group, and for the second aspect we proposed a method based on an adaptation of the event-tree method in order to establish a link between existing PSA models and actual incidents. Since PSAs provide an exhaustive database of accident scenarios applicable to the two most common types of units in France, they are obviously of interest for this sort of analysis. With this method we performed some incident analyses, and at the same time explored some methods employed abroad, particularly ASP (Accident Sequence Precursor, a method used by the NRC). Early in 1994 EDF began a systematic analysis programme. The first, transient phase will set up methods and an organizational structure. 7 figs
Khabaza, I M
1960-01-01
Numerical Analysis is an elementary introduction to numerical analysis, its applications, limitations, and pitfalls. Methods suitable for digital computers are emphasized, but some desk computations are also described. Topics covered range from the use of digital computers in numerical work to errors in computations using desk machines, finite difference methods, and numerical solution of ordinary differential equations. This book is comprised of eight chapters and begins with an overview of the importance of digital computers in numerical analysis, followed by a discussion on errors in comput
International Nuclear Information System (INIS)
Warner, M.
1987-01-01
What is the current state of quantitative trace analytical chemistry? What are today's research efforts? And what challenges does the future hold? These are some of the questions addressed at a recent four-day symposium sponsored by the National Bureau of Standards (NBS) entitled Accuracy in Trace Analysis - Accomplishments, Goals, Challenges. The two plenary sessions held on the first day of the symposium reviewed the history of quantitative trace analysis, discussed the present situation from academic and industrial perspectives, and summarized future needs. The remaining three days of the symposium consisted of parallel sessions dealing with the measurement process; quantitation in materials; environmental, clinical, and nutrient analysis; and advances in analytical techniques
Goodstein, R L
2010-01-01
Recursive analysis develops natural number computations into a framework appropriate for real numbers. This text is based upon primary recursive arithmetic and presents a unique combination of classical analysis and intuitional analysis. Written by a master in the field, it is suitable for graduate students of mathematics and computer science and can be read without a detailed knowledge of recursive arithmetic.Introductory chapters on recursive convergence and recursive and relative continuity are succeeded by explorations of recursive and relative differentiability, the relative integral, and
Tao, Terence
2016-01-01
This is part one of a two-volume book on real analysis and is intended for senior undergraduate students of mathematics who have already been exposed to calculus. The emphasis is on rigour and foundations of analysis. Beginning with the construction of the number systems and set theory, the book discusses the basics of analysis (limits, series, continuity, differentiation, Riemann integration), through to power series, several variable calculus and Fourier analysis, and then finally the Lebesgue integral. These are almost entirely set in the concrete setting of the real line and Euclidean spaces, although there is some material on abstract metric and topological spaces. The book also has appendices on mathematical logic and the decimal system. The entire text (omitting some less central topics) can be taught in two quarters of 25–30 lectures each. The course material is deeply intertwined with the exercises, as it is intended that the student actively learn the material (and practice thinking and writing ri...
Tao, Terence
2016-01-01
This is part two of a two-volume book on real analysis and is intended for senior undergraduate students of mathematics who have already been exposed to calculus. The emphasis is on rigour and foundations of analysis. Beginning with the construction of the number systems and set theory, the book discusses the basics of analysis (limits, series, continuity, differentiation, Riemann integration), through to power series, several variable calculus and Fourier analysis, and then finally the Lebesgue integral. These are almost entirely set in the concrete setting of the real line and Euclidean spaces, although there is some material on abstract metric and topological spaces. The book also has appendices on mathematical logic and the decimal system. The entire text (omitting some less central topics) can be taught in two quarters of 25–30 lectures each. The course material is deeply intertwined with the exercises, as it is intended that the student actively learn the material (and practice thinking and writing ri...
International Nuclear Information System (INIS)
1988-01-01
Basic studies in nuclear analytical techniques include the examination of underlying assumptions and the development and extension of techniques involving the use of ion beams for elemental and mass analysis. 1 ref., 1 tab
Energy Technology Data Exchange (ETDEWEB)
2016-06-01
Fact sheet summarizing NREL's techno-economic analysis and life-cycle assessment capabilities to connect research with future commercial process integration, a critical step in the scale-up of biomass conversion technologies.
Gasinski, Leszek
2005-01-01
Hausdorff Measures and Capacity. Lebesgue-Bochner and Sobolev Spaces. Nonlinear Operators and Young Measures. Smooth and Nonsmooth Analysis and Variational Principles. Critical Point Theory. Eigenvalue Problems and Maximum Principles. Fixed Point Theory.
DEFF Research Database (Denmark)
Bauer-Gottwein, Peter; Riegels, Niels; Pulido-Velazquez, Manuel
2017-01-01
Hydroeconomic analysis and modeling provides a consistent and quantitative framework to assess the links between water resources systems and economic activities related to water use, simultaneously modeling water supply and water demand. It supports water managers and decision makers in assessing...... trade-offs between different water uses, different geographic regions, and various economic sectors and between the present and the future. Hydroeconomic analysis provides consistent economic performance criteria for infrastructure development and institutional reform in water policies and management...... organizations. This chapter presents an introduction to hydroeconomic analysis and modeling, and reviews the state of the art in the field. We review available economic water-valuation techniques and summarize the main types of decision problems encountered in hydroeconomic analysis. Popular solution strategies...
Schiffrin, Deborah
1990-01-01
Summarizes the current state of research in conversation analysis, referring primarily to six different perspectives that have developed from the philosophy, sociology, anthropology, and linguistics disciplines. These include pragmatics; speech act theory; interactional sociolinguistics; ethnomethodology; ethnography of communication; and…
Gorsuch, Richard L
2013-01-01
Comprehensive and comprehensible, this classic covers the basic and advanced topics essential for using factor analysis as a scientific tool in psychology, education, sociology, and related areas. Emphasizing the usefulness of the techniques, it presents sufficient mathematical background for understanding and sufficient discussion of applications for effective use. This includes not only theory but also the empirical evaluations of the importance of mathematical distinctions for applied scientific analysis.
International Nuclear Information System (INIS)
1959-01-01
Radioactivation analysis is a technique for analysing the constituents of a very small sample of matter by making the sample artificially radioactive. The first stage is to make the sample radioactive by artificial means, e.g. subjecting it to neutron bombardment. Once the sample has been activated, or made radioactive, the next task is to analyze the radiations given off by the sample. This analysis indicates the nature and quantities of the various elements present in the sample, because the radiation from a particular radioisotope is characteristic of that isotope. In 1959 a symposium on 'Radioactivation Analysis' was organized in Vienna by the IAEA and the Joint Commission on Applied Radioactivity (ICSU). It was pointed out that there are certain factors creating uncertainties, and it was elaborated how to overcome them. Attention was drawn to the fact that radioactivation analysis had proven a powerful tool for tackling fundamental problems in geo- and cosmochemistry, and a review was given of the recent work in this field. Because of its extreme sensitivity, radioactivation analysis had been principally employed for trace detection, and its most extensive use has been in the control of semiconductors and very pure metals. An account was given of the experience gained in the USA, where radioactivation analysis was being used by many investigators in various scientific fields as a practical and useful tool for elemental analyses. Much of this work had been concerned with determining sub-microgramme and microgramme concentrations of many different elements in samples of biological materials, drugs, fertilizers, fine chemicals, foods, fuels, glass, ceramic materials, metals, minerals, paints, petroleum products, resinous materials, soils, toxicants, water and other materials. In addition to these studies, radioactivation analysis had been used by other investigators to determine isotopic ratios of the stable isotopes of some of the elements. Another paper dealt with radioactivation
DEFF Research Database (Denmark)
Brænder, Morten; Andersen, Lotte Bøgh
2014-01-01
Based on our 2013 article, ”Does Deployment to War Affect Soldiers' Public Service Motivation – A Panel Study of Soldiers Before and After their Service in Afghanistan”, we present panel analysis as a methodological discipline. Panels consist of multiple units of analysis, observed at two or more...... in research settings where it is not possible to distribute units of analysis randomly or where the independent variables cannot be manipulated. The greatest disadvantage of using panel studies is that data may be difficult to obtain. This is most clearly evident in regard to the use of panel surveys...... points in time. In comparison with traditional cross-sectional studies, the advantage of using panel studies is that the time dimension enables us to study effects. Whereas experimental designs may have a clear advantage in regard to causal inference, the strength of panel studies is difficult to match
Loeb, Peter A
2016-01-01
This textbook is designed for a year-long course in real analysis taken by beginning graduate and advanced undergraduate students in mathematics and other areas such as statistics, engineering, and economics. Written by one of the leading scholars in the field, it elegantly explores the core concepts in real analysis and introduces new, accessible methods for both students and instructors. The first half of the book develops both Lebesgue measure and, with essentially no additional work for the student, general Borel measures for the real line. Notation indicates when a result holds only for Lebesgue measure. Differentiation and absolute continuity are presented using a local maximal function, resulting in an exposition that is both simpler and more general than the traditional approach. The second half deals with general measures and functional analysis, including Hilbert spaces, Fourier series, and the Riesz representation theorem for positive linear functionals on continuous functions with compact support....
Scott, L Ridgway
2011-01-01
Computational science is fundamentally changing how technological questions are addressed. The design of aircraft, automobiles, and even racing sailboats is now done by computational simulation. The mathematical foundation of this new approach is numerical analysis, which studies algorithms for computing expressions defined with real numbers. Emphasizing the theory behind the computation, this book provides a rigorous and self-contained introduction to numerical analysis and presents the advanced mathematics that underpin industrial software, including complete details that are missing from most textbooks. Using an inquiry-based learning approach, Numerical Analysis is written in a narrative style, provides historical background, and includes many of the proofs and technical details in exercises. Students will be able to go beyond an elementary understanding of numerical simulation and develop deep insights into the foundations of the subject. They will no longer have to accept the mathematical gaps that ex...
Rao, G Shanker
2006-01-01
About the Book: This book provides an introduction to Numerical Analysis for students of Mathematics and Engineering. The book is designed in accordance with the common core syllabus of Numerical Analysis of the Universities of Andhra Pradesh and also the syllabus prescribed in most Indian universities. Salient features: Approximate and Numerical Solutions of Algebraic and Transcendental Equations, Interpolation of Functions, Numerical Differentiation and Integration, and Numerical Solution of Ordinary Differential Equations. The last three chapters deal with Curve Fitting, Eigen Values and Eigen Vectors of a Matrix, and Regression Analysis. Each chapter is supplemented with a number of worked-out examples as well as a number of problems to be solved by the students, which helps in a better understanding of the subject. Contents: Errors Solution of Algebraic and Transcendental Equations Finite Differences Interpolation with Equal Intervals Interpolation with Unequal Int...
Jacques, Ian
1987-01-01
This book is primarily intended for undergraduates in mathematics, the physical sciences and engineering. It introduces students to most of the techniques forming the core component of courses in numerical analysis. The text is divided into eight chapters which are largely self-contained. However, with a subject as intricately woven as mathematics, there is inevitably some interdependence between them. The level of difficulty varies and, although emphasis is firmly placed on the methods themselves rather than their analysis, we have not hesitated to include theoretical material when we consider it to be sufficiently interesting. However, it should be possible to omit those parts that do seem daunting while still being able to follow the worked examples and to tackle the exercises accompanying each section. Familiarity with the basic results of analysis and linear algebra is assumed since these are normally taught in first courses on mathematical methods. For reference purposes a list of theorems used in the t...
DiBenedetto, Emmanuele
2016-01-01
The second edition of this classic textbook presents a rigorous and self-contained introduction to real analysis with the goal of providing a solid foundation for future coursework and research in applied mathematics. Written in a clear and concise style, it covers all of the necessary subjects as well as those often absent from standard introductory texts. Each chapter features a “Problems and Complements” section that includes additional material that briefly expands on certain topics within the chapter and numerous exercises for practicing the key concepts. The first eight chapters explore all of the basic topics for training in real analysis, beginning with a review of countable sets before moving on to detailed discussions of measure theory, Lebesgue integration, Banach spaces, functional analysis, and weakly differentiable functions. More topical applications are discussed in the remaining chapters, such as maximal functions, functions of bounded mean oscillation, rearrangements, potential theory, a...
International Nuclear Information System (INIS)
Romli
1997-01-01
Cluster analysis is the name of a group of multivariate techniques whose principal purpose is to distinguish similar entities by the characteristics they possess. Several algorithms can be used for this analysis; this topic therefore focuses on similarity measures and on hierarchical clustering, which includes the single linkage, complete linkage and average linkage methods. The non-hierarchical clustering method popularly known as the K-means method is also discussed. Finally, this paper describes the advantages and disadvantages of each method.
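The three linkage measures the abstract names differ only in how the distance between two clusters is summarized. A minimal sketch of agglomerative clustering on one-dimensional points (the data and cluster count below are invented for illustration, not taken from the paper):

```python
def dist(a, b):
    return abs(a - b)

def linkage(c1, c2, method):
    """Distance between two clusters under the chosen linkage rule."""
    d = [dist(a, b) for a in c1 for b in c2]
    if method == "single":    # nearest pair of points
        return min(d)
    if method == "complete":  # farthest pair of points
        return max(d)
    return sum(d) / len(d)    # average over all pairs

def agglomerate(points, k, method="single"):
    """Repeatedly merge the two closest clusters until k remain."""
    clusters = [[p] for p in points]
    while len(clusters) > k:
        pairs = [(linkage(clusters[i], clusters[j], method), i, j)
                 for i in range(len(clusters))
                 for j in range(i + 1, len(clusters))]
        _, i, j = min(pairs)
        clusters[i] = clusters[i] + clusters[j]
        del clusters[j]
    return clusters

print(agglomerate([1.0, 1.2, 5.0, 5.3, 9.0], 2, "single"))
# → [[1.0, 1.2], [5.0, 5.3, 9.0]]
```

Swapping `method` between "single", "complete" and "average" changes only the merge criterion, which is exactly the distinction the abstract draws between the hierarchical variants.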
Rockafellar, Ralph Tyrell
2015-01-01
Available for the first time in paperback, R. Tyrrell Rockafellar's classic study presents readers with a coherent branch of nonlinear mathematical analysis that is especially suited to the study of optimization problems. Rockafellar's theory differs from classical analysis in that differentiability assumptions are replaced by convexity assumptions. The topics treated in this volume include: systems of inequalities, the minimum or maximum of a convex function over a convex set, Lagrange multipliers, minimax theorems and duality, as well as basic results about the structure of convex sets and
Brezinski, C
2012-01-01
Numerical analysis has witnessed many significant developments in the 20th century. This book brings together 16 papers dealing with historical developments, survey papers and papers on recent trends in selected areas of numerical analysis, such as: approximation and interpolation, solution of linear systems and eigenvalue problems, iterative methods, quadrature rules, and the solution of ordinary, partial and integral equations. The papers are reprinted from the 7-volume project of the Journal of Computational and Applied Mathematics.
International Nuclear Information System (INIS)
Biehl, F.A.
1984-05-01
This paper presents the criteria, previous nuclear experience in space, analysis techniques, and possible breakup enhancement devices applicable to an acceptable SP-100 reentry from space. Reactor operation in nuclear-safe orbit will minimize the radiological risk; the remaining safeguards criteria need to be defined. A simple analytical point mass reentry technique and a more comprehensive analysis method that considers vehicle dynamics and orbit insertion malfunctions are presented. Vehicle trajectory, attitude, and possible breakup enhancement devices will be integrated in the simulation as required to ensure an adequate representation of the reentry process
Aggarwal, Charu C
2013-01-01
With the increasing advances in hardware technology for data collection, and advances in software technology (databases) for data organization, computer scientists have increasingly participated in the latest advancements of the outlier analysis field. Computer scientists, specifically, approach this field based on their practical experiences in managing large amounts of data, and with far fewer assumptions: the data can be of any type, structured or unstructured, and may be extremely large. Outlier Analysis is a comprehensive exposition, as understood by data mining experts, statisticians and
Everitt, Brian S; Leese, Morven; Stahl, Daniel
2011-01-01
Cluster analysis comprises a range of methods for classifying multivariate data into subgroups. By organizing multivariate data into such subgroups, clustering can help reveal the characteristics of any structure or patterns present. These techniques have proven useful in a wide range of areas such as medicine, psychology, market research and bioinformatics. This fifth edition of the highly successful Cluster Analysis includes coverage of the latest developments in the field and a new chapter dealing with finite mixture models for structured data. Real life examples are used throughout to demons
Snell, K S; Langford, W J; Maxwell, E A
1966-01-01
Elementary Analysis, Volume 2 introduces several of the ideas of modern mathematics in a casual manner and provides the practical experience in algebraic and analytic operations that lays a sound foundation of basic skills. This book focuses on the nature of number, algebraic and logical structure, groups, rings, fields, vector spaces, matrices, sequences, limits, functions and inverse functions, complex numbers, and probability. The logical structure of analysis given through the treatment of differentiation and integration, with applications to the trigonometric and logarithmic functions, is
International Nuclear Information System (INIS)
Baron, J.H.; Nunez McLeod, J.; Rivera, S.S.
1997-01-01
This book contains a selection of research works performed in the CEDIAC Institute (Cuyo National University) in the area of Risk Analysis, with specific orientations to the subjects of uncertainty and sensitivity studies, software reliability, severe accident modeling, etc. This volume presents important material for all those researchers who want to gain insight into the risk analysis field as a tool for solving several problems frequently found in engineering and the applied sciences, as well as for academic teachers who want to keep up to date with the new developments and improvements continuously arising in this field.
Alan Gallegos
2002-01-01
Watershed analyses and assessments for the Kings River Sustainable Forest Ecosystems Project were done on about 33,000 acres of the 45,500-acre Big Creek watershed and 32,000 acres of the 85,100-acre Dinkey Creek watershed. Following procedures developed for analysis of cumulative watershed effects (CWE) in the Pacific Northwest Region of the USDA Forest Service, the...
Freund, Rudolf J; Sa, Ping
2006-01-01
The book provides complete coverage of the classical methods of statistical analysis. It is designed to give students an understanding of the purpose of statistical analyses, to allow the student to determine, at least to some degree, the correct type of statistical analyses to be performed in a given situation, and have some appreciation of what constitutes good experimental design
International Nuclear Information System (INIS)
Unterberger, A.
1987-01-01
We study the Klein-Gordon symbolic calculus of operators acting on solutions of the free Klein-Gordon equation. It contracts to the Weyl calculus as c→∞. Mathematically, it may also be considered as a pseudodifferential analysis on the unit ball of R^n.
International Nuclear Information System (INIS)
Woodard, K.
1985-01-01
The objectives of this paper are to: Provide a realistic assessment of consequences; Account for plant and site-specific characteristics; Adjust accident release characteristics to account for results of plant-containment analysis; Produce conditional risk curves for each of five health effects; and Estimate uncertainties
DEFF Research Database (Denmark)
Hjørland, Birger
2017-01-01
The domain-analytic approach to knowledge organization (KO) (and to the broader field of library and information science, LIS) is outlined. The article reviews the discussions and proposals on the definition of domains, and provides an example of a domain-analytic study in the field of art studies. Varieties of domain analysis as well as criticism and controversies are presented and discussed.
International Nuclear Information System (INIS)
Rhoades, W.A.; Dray, B.J.
1970-01-01
The effect of Gadolinium-155 on the prompt kinetic behavior of a zirconium hydride reactor has been deduced, using experimental data from the SNAPTRAN machine. The poison material makes the temperature coefficient more positive, and the Type IV sleeves were deduced to give a positive coefficient above 1100 °F. A thorough discussion of the data and analysis is included. (U.S.)
International Nuclear Information System (INIS)
Saadi, Radouan; Marah, Hamid
2014-01-01
This report presents results related to tritium analysis carried out at the CNESTEN DASTE in Rabat (Morocco), on behalf of Senegal, within the framework of the RAF7011 project. It describes the analytical method and instrumentation, including general uncertainty estimation: electrolytic enrichment and liquid scintillation counting. The results are expressed in Tritium Units (TU); the lower limit of detection is 0.02 TU.
Miller, Rupert G
2011-01-01
A concise summary of the statistical methods used in the analysis of survival data with censoring. Emphasizes recently developed nonparametric techniques. Outlines methods in detail and illustrates them with actual data. Discusses the theory behind each method. Includes numerous worked problems and numerical exercises.
Koornneef, M.; Alonso-Blanco, C.; Stam, P.
2006-01-01
The Mendelian analysis of genetic variation, available as induced mutants or as natural variation, requires a number of steps that are described in this chapter. These include the determination of the number of genes involved in the observed trait's variation, the determination of dominance
DEFF Research Database (Denmark)
Nielsen, Kirsten
2010-01-01
The first part of this article presents the characteristics of Hebrew poetry: features associated with rhythm and phonology, grammatical features, structural elements like parallelism, and imagery and intertextuality. The second part consists of an analysis of Psalm 121. It is argued that assonan...
Adrian Ioana; Tiberiu Socaciu
2013-01-01
The article presents specific aspects of management and models for economic analysis. Thus, we present the main types of economic analysis: statistical analysis, dynamic analysis, static analysis, mathematical analysis, psychological analysis. We also present the main objects of the analysis: the technological activity analysis of a company, the analysis of the production costs, the economic activity analysis of a company, the analysis of equipment, the analysis of labor productivity, the anal...
International Nuclear Information System (INIS)
Smith, M.; Jones, D.R.
1991-01-01
The goal of exploration is to find reserves that will earn an adequate rate of return on the capital invested. Neither exploration nor economics is an exact science. One must therefore explore in those trends (plays) that have the highest probability of achieving this goal. Trend analysis is a technique for organizing the available data to make these strategic exploration decisions objectively and in conformance with the explorer's goals and risk attitudes. Trend analysis differs from resource estimation in its purpose. It seeks to determine the probability of economic success for an exploration program, not the ultimate results of the total industry effort. Thus the recent past is assumed to be the best estimate of the exploration probabilities for the near future. This information is combined with economic forecasts. The computer software tools necessary for trend analysis are (1) Information data base - requirements and sources. (2) Data conditioning program - assignment to trends, correction of errors, and conversion into usable form. (3) Statistical processing program - calculation of probability of success and discovery size probability distribution. (4) Analytical processing - Monte Carlo simulation to develop the probability distribution of the economic return/investment ratio for a trend. Limited capital (short-run) effects are analyzed using the Gambler's Ruin concept in the Monte Carlo simulation and by a short-cut method. Multiple trend analysis is concerned with comparing and ranking trends, allocating funds among acceptable trends, and characterizing program risk by using risk profiles. In summary, trend analysis is a reality check for long-range exploration planning
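Step (4) above, the Monte Carlo development of the return/investment ratio distribution, can be sketched in a few lines. The success probability, discovery sizes and costs below are invented illustrative numbers, not data from the abstract:

```python
import random

def simulate_ratio(p_success, sizes, well_cost, n_wells, trials, seed=1):
    """Draw Monte Carlo samples of the economic return/investment ratio
    for a trend: each well succeeds with probability p_success and, if it
    does, yields a discovery drawn from the size distribution `sizes`."""
    rng = random.Random(seed)
    ratios = []
    for _ in range(trials):
        revenue = 0.0
        for _ in range(n_wells):
            if rng.random() < p_success:
                revenue += rng.choice(sizes)  # discovery size, revenue units
        ratios.append(revenue / (well_cost * n_wells))
    return ratios

ratios = simulate_ratio(p_success=0.2, sizes=[5.0, 20.0, 80.0],
                        well_cost=1.0, n_wells=10, trials=5000)
# probability that the program at least pays back its investment
prob_economic = sum(r >= 1.0 for r in ratios) / len(ratios)
```

The empirical distribution of `ratios` is the risk profile the abstract refers to; ruin-style limited-capital effects would be layered on top of the same simulation loop.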
DEFF Research Database (Denmark)
The 19th Scandinavian Conference on Image Analysis was held at the IT University of Copenhagen in Denmark during June 15-17, 2015. The SCIA conference series has been an ongoing biannual event for more than 30 years, and over the years it has nurtured a world-class regional research and development area within the four participating Nordic countries. It is a regional meeting of the International Association for Pattern Recognition (IAPR). We would like to thank all authors who submitted works to this year's SCIA, the invited speakers, and our Program Committee. In total 67 papers were submitted. The topics of the accepted papers range from novel applications of vision systems, pattern recognition, machine learning, feature extraction, segmentation, and 3D vision to medical and biomedical image analysis. The papers originate from all the Scandinavian countries and several other European countries.
Helson, Henry
2010-01-01
This second edition has been enlarged and considerably rewritten. Among the new topics are infinite product spaces with applications to probability, disintegration of measures on product spaces, positive definite functions on the line, and additional information about Weyl's theorems on equidistribution. Topics that have continued from the first edition include Minkowski's theorem, measures with bounded powers, idempotent measures, spectral sets of bounded functions and a theorem of Szego, and the Wiener Tauberian theorem. Readers of the book should have studied the Lebesgue integral, the elementary theory of analytic and harmonic functions, and the basic theory of Banach spaces. The treatment is classical and as simple as possible. This is an instructional book, not a treatise. Mathematics students interested in analysis will find here what they need to know about Fourier analysis. Physicists and others can use the book as a reference for more advanced topics.
Bray, Hubert L; Mazzeo, Rafe; Sesum, Natasa
2015-01-01
This volume includes expanded versions of the lectures delivered in the Graduate Minicourse portion of the 2013 Park City Mathematics Institute session on Geometric Analysis. The papers give excellent high-level introductions, suitable for graduate students wishing to enter the field and experienced researchers alike, to a range of the most important areas of geometric analysis. These include: the general issue of geometric evolution, with more detailed lectures on Ricci flow and Kähler-Ricci flow, new progress on the analytic aspects of the Willmore equation as well as an introduction to the recent proof of the Willmore conjecture and new directions in min-max theory for geometric variational problems, the current state of the art regarding minimal surfaces in R^3, the role of critical metrics in Riemannian geometry, and the modern perspective on the study of eigenfunctions and eigenvalues for Laplace-Beltrami operators.
Freitag, Eberhard
2005-01-01
The guiding principle of this presentation of ``Classical Complex Analysis'' is to proceed as quickly as possible to the central results while using a small number of notions and concepts from other fields. Thus the prerequisites for understanding this book are minimal; only elementary facts of calculus and algebra are required. The first four chapters cover the essential core of complex analysis: - differentiation in C (including elementary facts about conformal mappings) - integration in C (including complex line integrals, Cauchy's Integral Theorem, and the Integral Formulas) - sequences and series of analytic functions, (isolated) singularities, Laurent series, calculus of residues - construction of analytic functions: the gamma function, Weierstrass' Factorization Theorem, Mittag-Leffler Partial Fraction Decomposition, and -as a particular highlight- the Riemann Mapping Theorem, which characterizes the simply connected domains in C. Further topics included are: - the theory of elliptic functions based on...
International Nuclear Information System (INIS)
Quinn, C.A.
1983-01-01
The article deals with spectrographic analysis and the analytical methods based on it. The theory of spectrographic analysis is discussed, as well as the layout of a spectrometer system. The infrared absorption spectrum of a compound is probably its most unique property. The absorption of infrared radiation depends on increasing the energy of vibration and rotation associated with a covalent bond. The infrared region is intrinsically low in energy; thus the design of infrared spectrometers is always directed toward maximising energy throughput. The article also considers atomic absorption: flame atomizers, non-flame atomizers and the source of radiation. Under the section on emission spectroscopy, non-electrical energy sources, electrical energy sources and electrical flames are discussed. Digital computers form a part of the development of spectrographic instrumentation.
Cheng, Lizhi; Luo, Yong; Chen, Bo
2014-01-01
This book is divided into two parts: fundamental wavelet transform theory and methods, and important applications of wavelet transform. In the first part, as preliminary knowledge, Fourier analysis, inner product spaces, the characteristics of Haar functions, and concepts of multi-resolution analysis are introduced, followed by a description of how to construct wavelet functions, both multi-band and multi-wavelets; finally the design of integer wavelets via lifting schemes and its application to integer transform algorithms is introduced. In the second part, many applications are discussed in the field of image and signal processing by introducing other wavelet variants such as complex wavelets, ridgelets, and curvelets. Important application examples include image compression, image denoising/restoration, image enhancement, digital watermarking, numerical solution of partial differential equations, and solving ill-conditioned Toeplitz systems. The book is intended for senior undergraduate stude...
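The Haar functions and multi-resolution idea mentioned above have a very small concrete core: one analysis step splits a signal into a coarser approximation plus detail coefficients, and the step is exactly invertible. A minimal sketch (illustrative code, not taken from the book):

```python
import math

def haar_step(signal):
    """One level of the Haar transform on an even-length signal:
    pairwise averages (approximation) and differences (detail),
    each scaled by 1/sqrt(2) so that energy is preserved."""
    s = 1.0 / math.sqrt(2.0)
    approx = [(a + b) * s for a, b in zip(signal[0::2], signal[1::2])]
    detail = [(a - b) * s for a, b in zip(signal[0::2], signal[1::2])]
    return approx, detail

def haar_inverse(approx, detail):
    """Exact inverse of haar_step."""
    s = 1.0 / math.sqrt(2.0)
    out = []
    for a, d in zip(approx, detail):
        out.extend([(a + d) * s, (a - d) * s])
    return out

x = [4.0, 2.0, 5.0, 5.0]
a, d = haar_step(x)
# perfect reconstruction of the original signal
assert all(abs(u - v) < 1e-12 for u, v in zip(haar_inverse(a, d), x))
```

Recursing `haar_step` on the approximation coefficients yields the multi-level decomposition that multi-resolution analysis formalizes.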
International Nuclear Information System (INIS)
Hwang, Hun
2007-02-01
This book explains potentiometry, voltammetry, amperometry and the basic concepts of conductometry in eleven chapters. It gives specific descriptions of the electrochemical cell and its modes, the basic concepts of electrochemical analysis of oxidation-reduction reactions, standard electrode potential, formal potential, faradaic current and faradaic processes, mass transfer and overvoltage, potentiometry and indirect potentiometry, polarography with TAST, normal pulse and differential pulse modes, voltammetry, conductometry and conductometric titration.
International Nuclear Information System (INIS)
Badwe, R.A.
1999-01-01
The primary endpoint in the majority of the studies has been either disease recurrence or death. This kind of analysis requires a special method, since not all patients in the study experience the endpoint. The standard method for estimating such a survival distribution is the Kaplan-Meier method. The survival function is defined as the proportion of individuals who survive beyond a certain time. Multivariate comparison of survival has been carried out with Cox's proportional hazards model.
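The Kaplan-Meier estimator named in the abstract multiplies, at each event time, the fraction of at-risk subjects who survive that time. A minimal sketch; the follow-up times and censoring flags below are made-up illustrative data:

```python
def kaplan_meier(times, events):
    """times: follow-up times; events: 1 = endpoint observed, 0 = censored.
    Returns (time, S(t)) pairs at each observed event time."""
    data = sorted(zip(times, events))
    s = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        at_risk = sum(1 for tt, _ in data if tt >= t)  # still under observation
        if deaths:
            s *= 1.0 - deaths / at_risk  # product-limit update
            curve.append((t, s))
        i += sum(1 for tt, _ in data if tt == t)  # skip past this time point
    return curve

print(kaplan_meier([2, 3, 3, 5, 8], [1, 1, 0, 1, 0]))
# → [(2, 0.8), (3, 0.6...), (5, 0.3...)]
```

Censored subjects (event flag 0) shrink the risk set without triggering a step in the curve, which is exactly why a special method is needed when not every patient reaches the endpoint.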
DEFF Research Database (Denmark)
Andersen, Lars
This book contains the lecture notes for the 9th semester course on elastodynamics. The first chapter gives an overview of the basic theory of stress waves propagating in viscoelastic media. In particular, the effect of surfaces and interfaces in a viscoelastic material is studied, and different … Thus, in Chapter 3, an alternative semi-analytic method is derived, which may be applied for the analysis of layered half-spaces subject to moving or stationary loads.
Mucha, Hans-Joachim; Sofyan, Hizir
2000-01-01
As an explorative technique, cluster analysis provides a description or a reduction in the dimension of the data. It classifies a set of observations into two or more mutually exclusive unknown groups based on combinations of many variables. Its aim is to construct groups in such a way that the profiles of objects in the same group are relatively homogeneous whereas the profiles of objects in different groups are relatively heterogeneous. Clustering is distinct from classification techniques, ...
International Nuclear Information System (INIS)
Garbarino, J.R.; Steinheimer, T.R.; Taylor, H.E.
1985-01-01
This is the twenty-first biennial review of the inorganic and organic analytical chemistry of water. The format of this review differs somewhat from previous reviews in this series - the most recent of which appeared in Analytical Chemistry in April 1983. Changes in format have occurred in the presentation of material concerning review articles and the inorganic analysis of water sections. Organic analysis of water sections are organized as in previous reviews. Review articles have been compiled and tabulated in an Appendix with respect to subject, title, author(s), citation, and number of references cited. The inorganic water analysis sections are now grouped by constituent using the periodic chart; for example, alkali, alkaline earth, 1st series transition metals, etc. Within these groupings the references are roughly grouped by instrumental technique; for example, spectrophotometry, atomic absorption spectrometry, etc. Multiconstituent methods for determining analytes that cannot be grouped in this manner are compiled into a separate section sorted by instrumental technique. References used in preparing this review were compiled from nearly 60 major journals published during the period from October 1982 through September 1984. Conference proceedings, most foreign journals, most trade journals, and most government publications are excluded. References cited were obtained using the American Chemical Society's Chemical Abstracts for sections on inorganic analytical chemistry, organic analytical chemistry, water, and sewage waste. Cross-references of these sections were also included. 860 references
Energy Technology Data Exchange (ETDEWEB)
None
1980-06-01
The Energy Policy and Conservation Act (EPCA) mandated that minimum energy efficiency standards be established for classes of refrigerators and refrigerator-freezers, freezers, clothes dryers, water heaters, room air conditioners, home heating equipment, kitchen ranges and ovens, central air conditioners, and furnaces. EPCA requires that standards be designed to achieve the maximum improvement in energy efficiency that is technologically feasible and economically justified. Following the introductory chapter, Chapter Two describes the methodology used in the economic analysis and its relationship to legislative criteria for consumer product efficiency assessment; details how the CPES Value Model systematically compared and evaluated the economic impacts of regulation on the consumer, manufacturer and Nation. Chapter Three briefly displays the results of the analysis and lists the proposed performance standards by product class. Chapter Four describes the reasons for developing a baseline forecast, characterizes the baseline scenario from which regulatory impacts were calculated and summarizes the primary models, data sources and assumptions used in the baseline formulations. Chapter Five summarizes the methodology used to calculate regulatory impacts; describes the impacts of energy performance standards relative to the baseline discussed in Chapter Four. Also discussed are regional standards and other program alternatives to performance standards. Chapter Six describes the procedure for balancing consumer, manufacturer, and national impacts to select standard levels. Details of models and data bases used in the analysis are included in Appendices A through K.
Newell, Homer E
2006-01-01
When employed with skill and understanding, vector analysis can be a practical and powerful tool. This text develops the algebra and calculus of vectors in a manner useful to physicists and engineers. Numerous exercises (with answers) not only provide practice in manipulation but also help establish students' physical and geometric intuition in regard to vectors and vector concepts.Part I, the basic portion of the text, consists of a thorough treatment of vector algebra and the vector calculus. Part II presents the illustrative matter, demonstrating applications to kinematics, mechanics, and e
Brand, Louis
2006-01-01
The use of vectors not only simplifies treatments of differential geometry, mechanics, hydrodynamics, and electrodynamics, but also makes mathematical and physical concepts more tangible and easy to grasp. This text for undergraduates was designed as a short introductory course to give students the tools of vector algebra and calculus, as well as a brief glimpse into these subjects' manifold applications. The applications are developed to the extent that the uses of the potential function, both scalar and vector, are fully illustrated. Moreover, the basic postulates of vector analysis are brou
Abbott, Stephen
2015-01-01
This lively introductory text exposes the student to the rewards of a rigorous study of functions of a real variable. In each chapter, informal discussions of questions that give analysis its inherent fascination are followed by precise, but not overly formal, developments of the techniques needed to make sense of them. By focusing on the unifying themes of approximation and the resolution of paradoxes that arise in the transition from the finite to the infinite, the text turns what could be a daunting cascade of definitions and theorems into a coherent and engaging progression of ideas. Acutely aware of the need for rigor, the student is much better prepared to understand what constitutes a proper mathematical proof and how to write one. Fifteen years of classroom experience with the first edition of Understanding Analysis have solidified and refined the central narrative of the second edition. Roughly 150 new exercises join a selection of the best exercises from the first edition, and three more project-sty...
DEFF Research Database (Denmark)
Moore, R; Brødsgaard, I; Miller, ML
1997-01-01
A quantitative method for validating qualitative interview results and checking sample parameters is described and illustrated using common pain descriptions among a sample of Anglo-American and Mandarin Chinese patients and dentists matched by age and gender. Assumptions were that subjects were … of consistency in use of descriptors within groups, validity of description, accuracy of individuals compared with others in their group, and minimum required sample size were calculated using Cronbach's alpha, factor analysis, and Bayesian probability. Ethnic and professional differences within and across … of covalidating questionnaires that reflect results of qualitative interviews are recommended in order to estimate sample parameters such as intersubject agreement, individual subject accuracy, and minimum required sample sizes.
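Of the statistics the abstract names, Cronbach's alpha is the standard summary of intersubject (internal) consistency. A minimal sketch of its computation; the item scores below are invented for illustration, not the study's data:

```python
def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """items: one inner list per item, aligned across the same subjects.
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = len(items)
    item_vars = sum(variance(it) for it in items)
    totals = [sum(scores) for scores in zip(*items)]  # per-subject totals
    return (k / (k - 1)) * (1.0 - item_vars / variance(totals))

scores = [[3, 4, 4, 5],   # item 1, four subjects
          [2, 4, 5, 5],   # item 2
          [3, 5, 4, 5]]   # item 3
alpha = cronbach_alpha(scores)
```

Values near 1 indicate that subjects use the descriptors consistently; low values flag items (or subjects) whose responses do not agree with the rest of the group.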
International Nuclear Information System (INIS)
Iorio, A.F.; Crespi, J.C.
1987-01-01
After ten years of operation at the Atucha I Nuclear Power Station, a gear belonging to a pressurized heavy water reactor refuelling machine failed. The gear box was used to operate the inlet-outlet heavy-water valve of the machine. Visual examination of the gear device showed an absence of lubricant and that several gear teeth were broken at the root. Motion was transmitted through a speed-reducing device with controlled adjustable times in order to produce a proper fit of the valve closure. The aim of this paper is to discuss the results of the gear failure analysis in order to recommend the proper solution to prevent further failures. (Author)
International Nuclear Information System (INIS)
1988-01-01
In a search for correlations between the elemental composition of trace elements in human stones and the stone types with relation to their growth pattern, a combined PIXE and x-ray diffraction spectrometry approach was implemented. The combination of scanning PIXE and XRD has proved to be an advance in the methodology of stone analysis and may point to the growth pattern in the body. The exact role of trace elements in the formation and growth of urinary stones is not fully understood. Efforts are thus continuing firstly to solve the analytical problems concerned and secondly to design suitable experiments that would provide information about the occurrence and distribution of trace elements in urine. 1 fig., 1 ref
A self-similar hierarchy of the Korean stock market
Lim, Gyuchang; Min, Seungsik; Yoo, Kun-Woo
2013-01-01
A scaling analysis is performed on market values of stocks listed on Korean stock exchanges such as the KOSPI and the KOSDAQ. Different from previous studies on price fluctuations, market capitalizations are dealt with in this work. First, we show that the sum of the two stock exchanges shows a clear rank-size distribution, i.e., the Zipf's law, just as each separate one does. Second, by abstracting Zipf's law as a γ-sequence, we define a self-similar hierarchy consisting of many levels, with the numbers of firms at each level forming a geometric sequence. We also use two exponential functions to describe the hierarchy and derive a scaling law from them. Lastly, we propose a self-similar hierarchical process and perform an empirical analysis on our data set. Based on our findings, we argue that all money invested in the stock market is distributed in a hierarchical way and that a slight difference exists between the two exchanges.
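The rank-size check described above can be sketched numerically: sort market capitalizations in descending order and regress log(size) on log(rank); a slope near -1 indicates Zipf's law. The data and the power-of-two level sizes below are illustrative assumptions of this sketch, not the authors' data set or their fitted hierarchy.

```python
import numpy as np

def zipf_slope(values):
    """Slope of log(size) vs log(rank) for descending-sorted values."""
    sizes = np.sort(np.asarray(values, dtype=float))[::-1]
    ranks = np.arange(1, len(sizes) + 1)
    slope, _intercept = np.polyfit(np.log(ranks), np.log(sizes), 1)
    return slope

# Synthetic market caps following an ideal Zipf law: cap(r) = C / r
caps = 1e9 / np.arange(1, 201)
print(round(zipf_slope(caps), 3))  # -1.0 for an ideal Zipf distribution

# A self-similar hierarchy in the spirit of the abstract: level k holds
# 2**(k-1) firms, so the level sizes form a geometric sequence.
levels = np.floor(np.log2(np.arange(1, 201))).astype(int) + 1
```

Real market-cap data would give a slope only approximately equal to -1, and the geometric ratio between level sizes would be estimated from the fit rather than fixed at 2.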
International Nuclear Information System (INIS)
Straub, W.A.
1987-01-01
This review is the seventh in the series compiled by using the Dialog on-line CA Search facilities at the Information Resource Center of USS Technical Center covering the period from Oct. 1984 to Nov. 1, 1986. The quest for better surface properties, through the application of various electrochemical and other coating techniques, seems to have increased and reinforces the notion that only through the value added to a steel by proper finishing steps can a major supplier hope to compete profitably. The detection, determination, and control of microalloying constituents has also been generating a lot of interest, as evidenced by the number of publications devoted to this subject in the last few years. Several recent review articles amplify on the recent trends in the application of modern analytical technology to steelmaking. Another review has been devoted to the determination of trace elements and the simultaneous determination of elements in metals by mass spectrometry, atomic absorption spectrometry, and multielement emission spectrometry. Problems associated with the analysis of electroplating wastewaters have been reviewed in a recent publication that has described the use of various spectrophotometric methods for this purpose. The collection and treatment of analytical data in the modern steelmaking environment have been extensively reviewed, with emphasis on the interaction of the providers and users of the analytical data, its quality, and the cost of its collection. Raw material treatment and beneficiation was the dominant theme.
Bhatia, Rajendra
1997-01-01
A good part of matrix theory is functional analytic in spirit. This statement can be turned around. There are many problems in operator theory, where most of the complexities and subtleties are present in the finite-dimensional case. My purpose in writing this book is to present a systematic treatment of methods that are useful in the study of such problems. This book is intended for use as a text for upper division and graduate courses. Courses based on parts of the material have been given by me at the Indian Statistical Institute and at the University of Toronto (in collaboration with Chandler Davis). The book should also be useful as a reference for research workers in linear algebra, operator theory, mathematical physics and numerical analysis. A possible subtitle of this book could be Matrix Inequalities. A reader who works through the book should expect to become proficient in the art of deriving such inequalities. Other authors have compared this art to that of cutting diamonds. One first has to...
International Nuclear Information System (INIS)
Thomas, R.E.
1982-03-01
An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software
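For the linear case Ax = b mentioned above, the adjoint idea can be illustrated directly: to obtain the sensitivity of a scalar response J = cᵀx to every entry of b, a single adjoint solve Aᵀλ = c suffices, since dJ/db = λ (and dJ/dA_ij = -λ_i x_j). The matrices below are random illustrations, not the report's model; the adjoint result is checked against a finite difference.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.normal(size=(n, n)) + n * np.eye(n)  # well-conditioned test matrix
b = rng.normal(size=n)
c = rng.normal(size=n)

x = np.linalg.solve(A, b)        # primal solve: A x = b
lam = np.linalg.solve(A.T, c)    # single adjoint solve: A^T lambda = c

# dJ/db_i = lambda_i for J = c^T x; verify one component by finite difference
eps = 1e-6
b_pert = b.copy()
b_pert[2] += eps
fd = (c @ np.linalg.solve(A, b_pert) - c @ x) / eps
assert abs(fd - lam[2]) < 1e-4   # adjoint and finite difference agree
```

The point of the abstract's observation is visible here: all n sensitivity coefficients come from one extra linear solve, whereas a statistical sampling approach would need many forward solves.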
A common real time framework for SuperKEKB and Hyper Suprime-Cam at Subaru telescope
International Nuclear Information System (INIS)
Lee, S; Itoh, R; Katayama, N; Furusawa, H; Aihara, H; Mineo, S
2010-01-01
Real time data analysis at next-generation experiments is a challenge because of their enormous data rate and size. The SuperKEKB experiment, the upgrade of the Belle experiment, is required to process 100 times more data than the current one. Offline-level data analysis in the HLT farm is necessary for efficient data reduction. Real time processing of huge data volumes is also key to the planned dark energy survey using the Subaru telescope. The main camera for the survey, called Hyper Suprime-Cam, consists of 100 CCDs with 8 mega pixels each, and the total data size is expected to become comparable with that of SuperKEKB. Real time processing is planned to enable online tuning of measurement parameters, which in the past was done empirically. We started a joint development of a real time framework to be shared by both SuperKEKB and Hyper Suprime-Cam. Parallel processing techniques are widely adopted in the framework design to utilize a huge number of network-connected PCs with multi-core CPUs. Parallel processing is performed not only in the trivial event-by-event manner, but also in pipelines of software modules that are dynamically placed over the distributed computing nodes. Object data flow in the framework is realized by object serialization with object persistency. On-the-fly collection of histograms and N-tuples is supported for run-time monitoring. The detailed design and the development status of the framework are presented.
Energy Technology Data Exchange (ETDEWEB)
Ibsen, Lars Bo; Liingaard, M.
2006-12-15
This technical report concerns the basic theory and principles for experimental modal analysis. The sections within the report are: Output-only modal analysis software, general digital analysis, basics of structural dynamics and modal analysis and system identification. (au)
Theoretical numerical analysis a functional analysis framework
Atkinson, Kendall
2005-01-01
This textbook prepares graduate students for research in numerical analysis/computational mathematics by giving to them a mathematical framework embedded in functional analysis and focused on numerical analysis. This helps the student to move rapidly into a research program. The text covers basic results of functional analysis, approximation theory, Fourier analysis and wavelets, iteration methods for nonlinear equations, finite difference methods, Sobolev spaces and weak formulations of boundary value problems, finite element methods, elliptic variational inequalities and their numerical solu
International Nuclear Information System (INIS)
2003-08-01
This book deals with analysis of heat transfer which includes nonlinear analysis examples, radiation heat transfer, analysis of heat transfer in ANSYS, verification of analysis result, analysis of heat transfer of transition with automatic time stepping and open control, analysis of heat transfer using arrangement of ANSYS, resistance of thermal contact, coupled field analysis such as of thermal-structural interaction, cases of coupled field analysis, and phase change.
Information security risk analysis
Peltier, Thomas R
2001-01-01
Effective Risk Analysis; Qualitative Risk Analysis; Value Analysis; Other Qualitative Methods; Facilitated Risk Analysis Process (FRAP); Other Uses of Qualitative Risk Analysis; Case Study; Appendix A: Questionnaire; Appendix B: Facilitated Risk Analysis Process Forms; Appendix C: Business Impact Analysis Forms; Appendix D: Sample of Report; Appendix E: Threat Definitions; Appendix F: Other Risk Analysis Opinions; Index
Benjafield, John G
2016-05-01
The digital humanities are being applied with increasing frequency to the analysis of historically important texts. In this study, the methods of G. K. Zipf are used to explore the digital history of the vocabulary of psychology. Zipf studied a great many phenomena, from word frequencies to city sizes, showing that they tend to have a characteristic distribution in which there are a few cases that occur very frequently and many more cases that occur very infrequently. We find that the numbers of new words and word senses that writers contribute to the vocabulary of psychology have such a Zipfian distribution. Moreover, those who make the most contributions, such as William James, tend also to invent new metaphorical senses of words rather than new words. By contrast, those who make the fewest contributions tend to invent entirely new words. The use of metaphor makes a text easier for a reader to understand. While the use of new words requires more effort on the part of the reader, it may lead to more precise understanding than does metaphor. On average, new words and word senses become a part of psychology's vocabulary in the time leading up to World War I, suggesting that psychology was "finding its language" (Danziger, 1997) during this period. (c) 2016 APA, all rights reserved.
International Nuclear Information System (INIS)
Son, Seung Hui
2004-02-01
This book deals with information technology and business processes, information system architecture, methods of system development, planning of system development (such as problem analysis and feasibility analysis), cases of system development, comprehension of analysis of user demands, analysis of user demands using traditional methods, user-demand analysis using integrated information system architecture, system design using integrated information system architecture, system implementation, and system maintenance.
Analysis of Project Finance | Energy Analysis | NREL
NREL analysis helps potential renewable energy developers and investors gain insights into the complex world of project finance. Renewable energy project finance is complex, requiring knowledge of federal tax credits, state-level incentives, renewable
International Nuclear Information System (INIS)
Wright, A.C.D.
2002-01-01
This paper discusses safety analysis fundamentals in reactor design. This includes safety analysis done to show that the consequences of postulated accidents are acceptable. Safety analysis is also used to set the design of special safety systems and includes design-assist analysis to support conceptual design. Safety analysis is necessary for licensing a reactor, maintaining an operating license, and supporting changes in plant operations.
An example of multidimensional analysis: Discriminant analysis
International Nuclear Information System (INIS)
Lutz, P.
1990-01-01
Among the approaches to multi-dimensional data analysis, lectures on discriminant analysis covering theoretical and practical aspects are presented. The discrimination problem, the analysis steps and the discrimination categories are stressed. Examples are given on descriptive historical analysis, discrimination for decision making, and the demonstration and separation of the top quark. In the linear discriminant analysis the following subjects are discussed: Huyghens theorem, projection, discriminant variable, geometrical interpretation, the case g=2, classification method, separation of top events. Criteria allowing relevant results to be obtained are included [fr]
Papageorgiou, Nikolaos S
2009-01-01
Offers an examination of important theoretical methods and procedures in applied analysis. This book details the important theoretical trends in nonlinear analysis and applications to different fields. It is suitable for those working on nonlinear analysis.
Shape analysis in medical image analysis
Tavares, João
2014-01-01
This book contains thirteen contributions from invited experts of international recognition addressing important issues in shape analysis in medical image analysis, including techniques for image segmentation, registration, modelling and classification, and applications in biology, as well as in cardiac, brain, spine, chest, lung and clinical practice. This volume treats topics such as, anatomic and functional shape representation and matching; shape-based medical image segmentation; shape registration; statistical shape analysis; shape deformation; shape-based abnormity detection; shape tracking and longitudinal shape analysis; machine learning for shape modeling and analysis; shape-based computer-aided-diagnosis; shape-based medical navigation; benchmark and validation of shape representation, analysis and modeling algorithms. This work will be of interest to researchers, students, and manufacturers in the fields of artificial intelligence, bioengineering, biomechanics, computational mechanics, computationa...
An Algorithm to Solve the Equal-Sum-Product Problem
Nyblom, M. A.; Evans, C. D.
2013-01-01
A recursive algorithm is constructed which finds all solutions to a class of Diophantine equations connected to the problem of determining ordered n-tuples of positive integers satisfying the property that their sum is equal to their product. An examination of the use of Binary Search Trees in implementing the algorithm into a working program is given. In addition an application of the algorithm for searching possible extra exceptional values of the equal-sum-product problem is explored after...
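The underlying problem can be illustrated with a simple brute-force search (not the paper's recursive/BST algorithm): enumerate nondecreasing n-tuples of positive integers and keep those whose sum equals their product. As a search cap we bound elements by 2n, since the family (1, ..., 1, 2, n) always gives sum = product = 2n; treating 2n as a sufficient cap for all solutions is an assumption of this sketch.

```python
from itertools import combinations_with_replacement
from math import prod

def equal_sum_product(n):
    """Nondecreasing n-tuples of positive integers with sum == product.

    Elements are capped at 2*n -- an assumption of this sketch, chosen
    because (1, ..., 1, 2, n) attains sum = product = 2n.
    """
    return [t for t in combinations_with_replacement(range(1, 2 * n + 1), n)
            if sum(t) == prod(t)]

print(equal_sum_product(2))  # [(2, 2)]
print(equal_sum_product(3))  # [(1, 2, 3)]
print(equal_sum_product(5))  # three solutions, including (1, 1, 1, 2, 5)
```

The brute force is fine for small n; the appeal of a recursive algorithm like the one in the paper is that it prunes the combinatorial explosion for larger n.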
Foundations of factor analysis
Mulaik, Stanley A
2009-01-01
Introduction; Factor Analysis and Structural Theories; Brief History of Factor Analysis as a Linear Model; Example of Factor Analysis; Mathematical Foundations for Factor Analysis; Introduction; Scalar Algebra; Vectors; Matrix Algebra; Determinants; Treatment of Variables as Vectors; Maxima and Minima of Functions; Composite Variables and Linear Transformations; Introduction; Composite Variables; Unweighted Composite Variables; Differentially Weighted Composites; Matrix Equations; Multi...
International Nuclear Information System (INIS)
PECH, S.H.
2000-01-01
This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Final Safety Analysis Report. This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report
Quantitative analysis chemistry
International Nuclear Information System (INIS)
Ko, Wansuk; Lee, Choongyoung; Jun, Kwangsik; Hwang, Taeksung
1995-02-01
This book is about quantitative analysis chemistry. It is divided into ten chapters, which deal with the basic concepts of matter, the meaning of analytical chemistry and SI units, chemical equilibrium, basic preparation for quantitative analysis, an introduction to volumetric analysis, acid-base titration (outline and experimental examples), chelate titration, oxidation-reduction titration (introduction, titration curves, and diazotization titration), precipitation titration, electrometric titration, and quantitative analysis.
Energy Technology Data Exchange (ETDEWEB)
PECH, S.H.
2000-08-23
This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Final Safety Analysis Report. This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.
International Nuclear Information System (INIS)
WEBB, R.H.
1999-01-01
This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Safety Analysis Report (HNF-SD-WM-SAR-062/Rev.4). This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report
Philipp Mayring
2000-01-01
The article describes an approach to systematic, rule-guided qualitative text analysis, which tries to preserve some methodological strengths of quantitative content analysis and widen them into a concept of qualitative procedure. First the development of content analysis is delineated and the basic principles are explained (units of analysis, step models, working with categories, validity and reliability). Then the central procedures of qualitative content analysis, inductive development of ca...
RELIABILITY ANALYSIS OF BENDING ...
African Journals Online (AJOL)
Reliability analysis of the safety levels of the criteria slabs has been ... It was also noted [2] that if the risk level β < 3.1, the ... reliability analysis. A study [6] has shown that all geometric variables ...
DTI analysis methods : Voxel-based analysis
Van Hecke, Wim; Leemans, Alexander; Emsell, Louise
2016-01-01
Voxel-based analysis (VBA) of diffusion tensor imaging (DTI) data permits the investigation of voxel-wise differences or changes in DTI metrics in every voxel of a brain dataset. It is applied primarily in the exploratory analysis of hypothesized group-level alterations in DTI parameters, as it does
Analysis of Precision of Activation Analysis Method
DEFF Research Database (Denmark)
Heydorn, Kaj; Nørgaard, K.
1973-01-01
The precision of an activation-analysis method prescribes the estimation of the precision of a single analytical result. The adequacy of these estimates to account for the observed variation between duplicate results from the analysis of different samples and materials, is tested by the statistic T...
Hazard Analysis Database Report
Grams, W H
2000-01-01
The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for HNF-SD-WM-SAR-067, Tank Farms Final Safety Analysis Report (FSAR). The FSAR is part of the approved Authorization Basis (AB) for the River Protection Project (RPP). This document describes, identifies, and defines the contents and structure of the Tank Farms FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The Hazard Analysis Database supports the preparation of Chapters 3, 4, and 5 of the Tank Farms FSAR and the Unreviewed Safety Question (USQ) process and consists of two major, interrelated data sets: (1) Hazard Analysis Database: Data from t...
Santiago, John
2013-01-01
Circuits overloaded from electric circuit analysis? Many universities require that students pursuing a degree in electrical or computer engineering take an Electric Circuit Analysis course to determine who will "make the cut" and continue in the degree program. Circuit Analysis For Dummies will help these students to better understand electric circuit analysis by presenting the information in an effective and straightforward manner. Circuit Analysis For Dummies gives you clear-cut information about the topics covered in an electric circuit analysis course to help
Cluster analysis for applications
Anderberg, Michael R
1973-01-01
Cluster Analysis for Applications deals with methods and various applications of cluster analysis. Topics covered range from variables and scales to measures of association among variables and among data units. Conceptual problems in cluster analysis are discussed, along with hierarchical and non-hierarchical clustering methods. The necessary elements of data analysis, statistics, cluster analysis, and computer implementation are integrated vertically to cover the complete path from raw data to a finished analysis.Comprised of 10 chapters, this book begins with an introduction to the subject o
Báze nejsou písmena / The Bases Are Not the Letters
Directory of Open Access Journals (Sweden)
Vladimír Matlach
2016-06-01
In this paper we present an interpretation of the design of the genetic code. We proceed from the discovery of the DNA structure to the current state of molecular biology. We introduce the basic semiotic assumptions of molecular biology in the description of the structure of DNA, proteins and the genetic code, and focus on the interpretations of Francis Crick, other molecular biologists, biosemioticians and linguists. For the aims of the paper we describe some fundamentals of molecular biology. The core of our text is a quantitative analysis (n-gram structure, Zipf's law) of mRNA strings and natural language text. We also take into consideration representative quantitative analyses of DNA, RNA and proteins. Our analysis of mRNA confirms the assumption that the design of the genetic code does not allow DNA bases to be analogized with letters.
Activation analysis in food analysis. Pt. 9
International Nuclear Information System (INIS)
Szabo, S.A.
1992-01-01
An overview is presented on the application of activation analysis (AA) techniques for food analysis, as reflected at a recent international conference titled Activation Analysis and its Applications. The most popular analytical techniques include instrumental neutron AA, (INAA or NAA), radiochemical NAA (RNAA), X-ray fluorescence analysis and mass spectrometry. Data are presented for the multielemental NAA of instant soups, for elemental composition of drinking water in Iraq, for Na, K, Mn contents of various Indian rices, for As, Hg, Sb and Se determination in various seafoods, for daily microelement takeup in China, for the elemental composition of Chinese teas. Expected development trends in AA are outlined. (R.P.) 24 refs.; 8 tabs
International Nuclear Information System (INIS)
Burgess, R.L.
1978-01-01
Progress is reported on the following research programs: analysis and modeling of ecosystems; EDFB/IBP data center; biome analysis studies; land/water interaction studies; and computer programs for development of models
Confirmatory Composite Analysis
Schuberth, Florian; Henseler, Jörg; Dijkstra, Theo K.
2018-01-01
We introduce confirmatory composite analysis (CCA) as a structural equation modeling technique that aims at testing composite models. CCA entails the same steps as confirmatory factor analysis: model specification, model identification, model estimation, and model testing. Composite models are
Introductory numerical analysis
Pettofrezzo, Anthony J
2006-01-01
Written for undergraduates who require a familiarity with the principles behind numerical analysis, this classical treatment encompasses finite differences, least squares theory, and harmonic analysis. Over 70 examples and 280 exercises. 1967 edition.
Gap Analysis: Application to Earned Value Analysis
Langford, Gary O.; Franck, Raymond (Chip)
2008-01-01
Sponsored Report (for Acquisition Research Program) Earned Value is regarded as a useful tool to monitor commercial and defense system acquisitions. This paper applies the theoretical foundations and systematics of Gap Analysis to improve Earned Value Management. As currently implemented, Earned Value inaccurately provides a higher value for the work performed. This preliminary research indicates that Earned Value calculations can be corrected. Value Analysis, properly defined and enacted,...
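For context, the conventional earned-value indices that such a gap analysis would re-examine can be computed as below. This is a sketch using the standard PMBOK-style definitions; it does not implement the paper's proposed correction, and the numbers are hypothetical.

```python
def earned_value_metrics(pv, ev, ac):
    """Standard earned-value indices.

    pv -- planned value (budgeted cost of work scheduled)
    ev -- earned value (budgeted cost of work performed)
    ac -- actual cost of work performed
    """
    return {
        "cost_variance": ev - ac,      # CV > 0 means under budget
        "schedule_variance": ev - pv,  # SV > 0 means ahead of schedule
        "cpi": ev / ac,                # cost performance index
        "spi": ev / pv,                # schedule performance index
    }

# Hypothetical status: $100 planned, $80 earned, $90 spent
m = earned_value_metrics(pv=100.0, ev=80.0, ac=90.0)
print(m["cpi"], m["spi"])  # both < 1: over budget and behind schedule
```

The paper's critique targets exactly the "ev" input: if the earned value credited for work performed is overstated, every index above inherits the bias.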
Importance-performance analysis based SWOT analysis
Phadermrod, Boonyarat; Crowder, Richard M.; Wills, Gary B.
2016-01-01
SWOT analysis, a commonly used tool for strategic planning, is traditionally a form of brainstorming. Hence, it has been criticised that it is likely to reflect the subjective views of the individuals who participate in a brainstorming session, and that SWOT factors are not prioritized by their significance, which may result in improper strategic actions. While most studies of SWOT analysis have only focused on solving these shortcomings separately, this study offers an approach to diminish both s...
Discourse analysis and Foucault's
Directory of Open Access Journals (Sweden)
Jansen I.
2008-01-01
Discourse analysis is a method that up to now has been little recognized in nursing science, although more recently nursing scientists are discovering it for their purposes. However, several authors have criticized that discourse analysis is often misinterpreted because of a lack of understanding of its theoretical background. In this article, I reconstruct Foucault's writings in his "Archaeology of Knowledge" to provide a theoretical base for future archaeological discourse analysis, which can be categorized as a socio-linguistic discourse analysis.
Yuan, Ying; MacKinnon, David P.
2009-01-01
This article proposes Bayesian analysis of mediation effects. Compared to conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian mediation analysis, inference is straightforward and exact, which makes it appealing for studies with small samples. Third, the Bayesian approach is conceptua...
Fitzmaurice, Garrett M; Ware, James H
2012-01-01
Praise for the First Edition: ". . . [this book] should be on the shelf of everyone interested in . . . longitudinal data analysis." -Journal of the American Statistical Association. Features newly developed topics and applications of the analysis of longitudinal data. Applied Longitudinal Analysis, Second Edition presents modern methods for analyzing data from longitudinal studies and now features the latest state-of-the-art techniques. The book emphasizes practical, rather than theoretical, aspects of methods for the analysis of diverse types of lo
Regression analysis by example
Chatterjee, Samprit
2012-01-01
Praise for the Fourth Edition: "This book is . . . an excellent source of examples for regression analysis. It has been and still is readily readable and understandable." -Journal of the American Statistical Association. Regression analysis is a conceptually simple method for investigating relationships among variables. Carrying out a successful application of regression analysis, however, requires a balance of theoretical results, empirical rules, and subjective judgment. Regression Analysis by Example, Fifth Edition has been expanded
2014-01-01
M.Ing. (Electrical & Electronic Engineering) One of the most important steps to be taken before a site is selected for the extraction of wind energy is the analysis of the energy within the wind on that particular site. No wind energy analysis system exists for the measurement and analysis of wind power. This dissertation documents the design and development of a Wind Energy Analysis System (WEAS). Using a micro-controller-based design in conjunction with sensors, WEAS measures, calcu...
Slice hyperholomorphic Schur analysis
Alpay, Daniel; Sabadini, Irene
2016-01-01
This book defines and examines the counterpart of Schur functions and Schur analysis in the slice hyperholomorphic setting. It is organized into three parts: the first introduces readers to classical Schur analysis, while the second offers background material on quaternions, slice hyperholomorphic functions, and quaternionic functional analysis. The third part represents the core of the book and explores quaternionic Schur analysis and its various applications. The book includes previously unpublished results and provides the basis for new directions of research.
Computational movement analysis
Laube, Patrick
2014-01-01
This SpringerBrief discusses the characteristics of spatiotemporal movement data, including uncertainty and scale. It investigates three core aspects of Computational Movement Analysis: Conceptual modeling of movement and movement spaces, spatiotemporal analysis methods aiming at a better understanding of movement processes (with a focus on data mining for movement patterns), and using decentralized spatial computing methods in movement analysis. The author presents Computational Movement Analysis as an interdisciplinary umbrella for analyzing movement processes with methods from a range of fi
Thompson, Cheryl Bagley
2009-01-01
This 13th article of the Basics of Research series is first in a short series on statistical analysis. These articles will discuss creating your statistical analysis plan, levels of measurement, descriptive statistics, probability theory, inferential statistics, and general considerations for interpretation of the results of a statistical analysis.
Trend Analysis Using Microcomputers.
Berger, Carl F.
A trend analysis statistical package and additional programs for the Apple microcomputer are presented. They illustrate strategies of data analysis suitable to the graphics and processing capabilities of the microcomputer. The programs analyze data sets using examples of: (1) analysis of variance with multiple linear regression; (2) exponential…
Yuan, Ying; MacKinnon, David P.
2009-01-01
In this article, we propose Bayesian analysis of mediation effects. Compared with conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian…
Automation of activation analysis
International Nuclear Information System (INIS)
Ivanov, I.N.; Ivanets, V.N.; Filippov, V.V.
1985-01-01
The basic data on the methods and equipment of activation analysis are presented. Recommendations are given on the selection of activation analysis techniques, especially techniques using short-lived isotopes. The possibilities of increasing data-path throughput by using modern computers to automate the analysis and data-processing procedures are shown
Cuesta, Hector
2013-01-01
Each chapter of the book quickly introduces a key 'theme' of Data Analysis, before immersing you in the practical aspects of each theme. You'll learn quickly how to perform all aspects of Data Analysis.Practical Data Analysis is a book ideal for home and small business users who want to slice & dice the data they have on hand with minimum hassle.
Mathematical analysis fundamentals
Bashirov, Agamirza
2014-01-01
The author's goal is a rigorous presentation of the fundamentals of analysis, starting from the elementary level and moving to advanced coursework. The curricula of all mathematics (pure or applied) and physics programs include a compulsory course in mathematical analysis. This book can serve as the main textbook for such (one-semester) courses. The book can also serve as additional reading for such courses as real analysis, functional analysis, harmonic analysis etc. For non-math major students requiring math beyond calculus, this is a more friendly approach than many math-centric o
Foundations of mathematical analysis
Johnsonbaugh, Richard
2010-01-01
This classroom-tested volume offers a definitive look at modern analysis, with views of applications to statistics, numerical analysis, Fourier series, differential equations, mathematical analysis, and functional analysis. Upper-level undergraduate students with a background in calculus will benefit from its teachings, along with beginning graduate students seeking a firm grounding in modern analysis. A self-contained text, it presents the necessary background on the limit concept, and the first seven chapters could constitute a one-semester introduction to limits. Subsequent chapters discuss
Analysis in usability evaluations
DEFF Research Database (Denmark)
Følstad, Asbjørn; Lai-Chong Law, Effie; Hornbæk, Kasper
2010-01-01
While the planning and implementation of usability evaluations are well described in the literature, the analysis of the evaluation data is not. We present interviews with 11 usability professionals on how they conduct analysis, describing the resources, collaboration, creation of recommendations......, and prioritization involved. The interviews indicate a lack of structure in the analysis process and suggest activities, such as generating recommendations, that are unsupported by existing methods. We discuss how to better support analysis, and propose four themes for future research on analysis in usability...
DEFF Research Database (Denmark)
Bøving, Kristian Billeskov; Simonsen, Jesper
2004-01-01
This article documents how log analysis can inform qualitative studies concerning the usage of web-based information systems (WIS). No prior research has used http log files as data to study collaboration between multiple users in organisational settings. We investigate how to perform http log...... analysis; what http log analysis says about the nature of collaborative WIS use; and how results from http log analysis may support other data collection methods such as surveys, interviews, and observation. The analysis of log files initially lends itself to research designs, which serve to test...... hypotheses using a quantitative methodology. We show that http log analysis can also be valuable in qualitative research such as case studies. The results from http log analysis can be triangulated with other data sources and for example serve as a means of supporting the interpretation of interview data...
Amir Farbin
The ATLAS Analysis Model is a continually developing vision of how to reconcile physics analysis requirements with the ATLAS offline software and computing model constraints. In the past year this vision has influenced the evolution of the ATLAS Event Data Model, the Athena software framework, and physics analysis tools. These developments, along with the October Analysis Model Workshop and the planning for CSC analyses have led to a rapid refinement of the ATLAS Analysis Model in the past few months. This article introduces some of the relevant issues and presents the current vision of the future ATLAS Analysis Model. Event Data Model The ATLAS Event Data Model (EDM) consists of several levels of details, each targeted for a specific set of tasks. For example the Event Summary Data (ESD) stores calorimeter cells and tracking system hits thereby permitting many calibration and alignment tasks, but will be only accessible at particular computing sites with potentially large latency. In contrast, the Analysis...
Multivariate analysis with LISREL
Jöreskog, Karl G; Y Wallentin, Fan
2016-01-01
This book traces the theory and methodology of multivariate statistical analysis and shows how it can be conducted in practice using the LISREL computer program. It presents not only the typical uses of LISREL, such as confirmatory factor analysis and structural equation models, but also several other multivariate analysis topics, including regression (univariate, multivariate, censored, logistic, and probit), generalized linear models, multilevel analysis, and principal component analysis. It provides numerous examples from several disciplines and discusses and interprets the results, illustrated with sections of output from the LISREL program, in the context of the example. The book is intended for masters and PhD students and researchers in the social, behavioral, economic and many other sciences who require a basic understanding of multivariate statistical theory and methods for their analysis of multivariate data. It can also be used as a textbook on various topics of multivariate statistical analysis.
International Nuclear Information System (INIS)
Actis, Oxana; Brodski, Michael; Erdmann, Martin; Fischer, Robert; Hinzmann, Andreas; Mueller, Gero; Muenzer, Thomas; Plum, Matthias; Steggemann, Jan; Winchen, Tobias; Klimkovich, Tatsiana
2010-01-01
VISPA is a development environment for high energy physics analyses which enables physicists to combine graphical and textual work. A physics analysis cycle consists of prototyping, performing, and verifying the analysis. The main feature of VISPA is a multipurpose window for visual steering of analysis steps, creation of analysis templates, and browsing physics event data at different steps of an analysis. VISPA follows an experiment-independent approach and incorporates various tools for steering and controlling required in a typical analysis. Connection to different frameworks of high energy physics experiments is achieved by using different types of interfaces. We present the look-and-feel for an example physics analysis at the LHC and explain the underlying software concepts of VISPA.
Multiscale volatility duration characteristics on financial multi-continuum percolation dynamics
Wang, Min; Wang, Jun
A random stock price model based on the multi-continuum percolation system is developed to investigate the nonlinear dynamics of stock price volatility duration, in an attempt to explain various statistical facts found in financial data and to reach a deeper understanding of mechanisms in the financial market. The continuum percolation system, usually referred to as a random coverage process or a Boolean model, is a member of a class of statistical physics systems. In this paper, multi-continuum percolation (with different values of radius) is employed to model and reproduce the dispersal of information among investors. To test the validity of the proposed model, nonlinear analyses of the return volatility duration series are performed by multifractal detrending moving average analysis and Zipf analysis. The empirical comparison indicates similar nonlinear behaviors for the proposed model and the actual Chinese stock market.
Cost benefit analysis cost effectiveness analysis
International Nuclear Information System (INIS)
Lombard, J.
1986-09-01
The comparison of various protection options in order to determine the best compromise between the cost of protection and the residual risk is the purpose of the ALARA procedure. The use of decision-aiding techniques is valuable as an aid to selection procedures. The purpose of this study is to introduce two rather simple and well-known decision-aiding techniques: cost-effectiveness analysis and cost-benefit analysis. These two techniques are relevant for the majority of ALARA decisions, which require the use of a quantitative technique. The study is based on a hypothetical case of 10 protection options. Four methods are applied to the data
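As a minimal sketch of the cost-effectiveness idea described above (the option names and figures below are hypothetical, not the study's actual data), protection options can be ranked by their cost per unit of collective dose averted:

```python
# Hypothetical protection options: (name, cost in k$, collective dose averted in person-Sv).
# Figures are illustrative only, not taken from the study.
options = [
    ("A", 50.0, 0.8),
    ("B", 120.0, 1.5),
    ("C", 300.0, 1.9),
]

def cost_effectiveness(cost, dose_averted):
    """Cost per unit of dose averted (k$ per person-Sv); lower is better."""
    return cost / dose_averted

# Rank options from most to least cost-effective.
ranked = sorted(options, key=lambda o: cost_effectiveness(o[1], o[2]))
for name, cost, averted in ranked:
    print(f"Option {name}: {cost_effectiveness(cost, averted):.1f} k$/person-Sv")
```

A full cost-benefit analysis would additionally monetize the dose averted and maximize net benefit rather than rank ratios.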
International Nuclear Information System (INIS)
Sommer, S; Tinh Tran, T.
2008-01-01
Washington Safety Management Solutions, LLC developed web-based software to improve the efficiency and consistency of hazard identification and analysis, control selection and classification, and to standardize analysis reporting at Savannah River Site. In the new nuclear age, information technology provides methods to improve the efficiency of the documented safety analysis development process, which includes hazard analysis activities. This software provides a web interface that interacts with a relational database to support analysis, record data, and ensure reporting consistency. A team of subject matter experts participated in a series of meetings to review the associated processes and procedures for requirements and standard practices. Through these meetings, a set of software requirements was developed and compiled into a requirements traceability matrix from which software could be developed. The software was tested to ensure compliance with the requirements. Training was provided to the hazard analysis leads. Hazard analysis teams using the software have verified its operability. The software has been classified as NQA-1, Level D, as it supports the analysis team but does not perform the analysis. The software can be transported to other sites with alternate risk schemes. The software is being used to support the development of 14 hazard analyses. User responses have been positive, with a number of suggestions for improvement which are being incorporated as time permits. The software has enforced a uniform implementation of the site procedures. The software has significantly improved the efficiency and standardization of the hazard analysis process
Functional analysis and applications
Siddiqi, Abul Hasan
2018-01-01
This self-contained textbook discusses all major topics in functional analysis. Combining classical materials with new methods, it supplies numerous relevant solved examples and problems and discusses the applications of functional analysis in diverse fields. The book is unique in its scope, and a variety of applications of functional analysis and operator-theoretic methods are devoted to each area of application. Each chapter includes a set of problems, some of which are routine and elementary, and some of which are more advanced. The book is primarily intended as a textbook for graduate and advanced undergraduate students in applied mathematics and engineering. It offers several attractive features making it ideally suited for courses on functional analysis intended to provide a basic introduction to the subject and the impact of functional analysis on applied and computational mathematics, nonlinear functional analysis and optimization. It introduces emerging topics like wavelets, Gabor system, inverse pro...
DEFF Research Database (Denmark)
Bemman, Brian; Meredith, David
it with a “ground truth” analysis of the same music produced by a human expert (see, in particular, [5]). In this paper, we explore the problem of generating an encoding of the musical surface of a work automatically from a systematic encoding of an analysis. The ability to do this depends on one having...... an effective (i.e., computable), correct and complete description of some aspect of the structure of the music. Generating the surface structure of a piece from an analysis in this manner serves as a proof of the analysis' correctness, effectiveness and completeness. We present a reductive analysis......In recent years, a significant body of research has focused on developing algorithms for computing analyses of musical works automatically from encodings of these works' surfaces [3,4,7,10,11]. The quality of the output of such analysis algorithms is typically evaluated by comparing
Fundamentals of functional analysis
Farenick, Douglas
2016-01-01
This book provides a unique path for graduate or advanced undergraduate students to begin studying the rich subject of functional analysis with fewer prerequisites than is normally required. The text begins with a self-contained and highly efficient introduction to topology and measure theory, which focuses on the essential notions required for the study of functional analysis, and which are often buried within full-length overviews of the subjects. This is particularly useful for those in applied mathematics, engineering, or physics who need to have a firm grasp of functional analysis, but not necessarily some of the more abstruse aspects of topology and measure theory normally encountered. The reader is assumed to only have knowledge of basic real analysis, complex analysis, and algebra. The latter part of the text provides an outstanding treatment of Banach space theory and operator theory, covering topics not usually found together in other books on functional analysis. Written in a clear, concise manner,...
DEFF Research Database (Denmark)
This book provides an in-depth introduction and overview of current research in computational music analysis. Its seventeen chapters, written by leading researchers, collectively represent the diversity as well as the technical and philosophical sophistication of the work being done today...... on well-established theories in music theory and analysis, such as Forte's pitch-class set theory, Schenkerian analysis, the methods of semiotic analysis developed by Ruwet and Nattiez, and Lerdahl and Jackendoff's Generative Theory of Tonal Music. The book is divided into six parts, covering...... music analysis, the book provides an invaluable resource for researchers, teachers and students in music theory and analysis, computer science, music information retrieval and related disciplines. It also provides a state-of-the-art reference for practitioners in the music technology industry....
Analysis apparatus and method of analysis
International Nuclear Information System (INIS)
1976-01-01
A continuous streaming method developed for the execution of immunoassays is described in this patent. In addition, a suitable apparatus for the method was developed, whereby magnetic particles are automatically employed for the consecutive analysis of a series of liquid samples via the RIA technique
International Nuclear Information System (INIS)
Dougherty, E.M.; Fragola, J.R.
1988-01-01
The authors present a treatment of human reliability analysis, incorporating an introduction to probabilistic risk assessment for nuclear power generating stations. They treat the subject according to the framework established for general systems theory, drawing upon reliability analysis, psychology, human factors engineering, and statistics, and integrating elements of these fields within a systems framework. The book provides a history of human reliability analysis and includes examples of the application of the systems approach
Emission spectrochemical analysis
International Nuclear Information System (INIS)
Rives, R.D.; Bruks, R.R.
1983-01-01
The emission spectrochemical method of analysis, based on the fact that atoms of elements can be excited in an electric arc or a laser beam and will emit radiation with characteristic wavelengths, is considered. The review contains data on the spectrochemical analysis of liquids and geological materials and a scheme of a laser microprobe. The main characteristics of emission spectroscopy, atomic absorption spectroscopy, and X-ray fluorescence analysis are generalized
International Nuclear Information System (INIS)
Crawford, H.J.; Lindstrom, P.J.
1983-06-01
Our analysis program LULU has proven very useful in all stages of experiment analysis, from prerun detector debugging through final data reduction. It has solved our problem of having arbitrary word length events and is easy enough to use that many separate experimenters are now analyzing with LULU. The ability to use the same software for all stages of experiment analysis greatly eases the programming burden. We may even get around to making the graphics elegant someday
Mastering Clojure data analysis
Rochester, Eric
2014-01-01
This book consists of a practical, example-oriented approach that aims to help you learn how to use Clojure for data analysis quickly and efficiently.This book is great for those who have experience with Clojure and who need to use it to perform data analysis. This book will also be hugely beneficial for readers with basic experience in data analysis and statistics.
Fast neutron activation analysis
International Nuclear Information System (INIS)
Pepelnik, R.
1986-01-01
Since 1981, numerous 14 MeV neutron activation analyses have been performed at Korona. On the basis of that work, the advantages of this analytical technique and the results obtained with it are compared with other analytical methods. The procedure of activation analysis, the characteristics of Korona, some analytical investigations in environmental research and materials physics, as well as sources of systematic errors in trace analysis, are described. (orig.) [de
Crisan, Dan
2011-01-01
"Stochastic Analysis" aims to provide mathematical tools to describe and model high dimensional random systems. Such tools arise in the study of Stochastic Differential Equations and Stochastic Partial Differential Equations, Infinite Dimensional Stochastic Geometry, Random Media and Interacting Particle Systems, Super-processes, Stochastic Filtering, Mathematical Finance, etc. Stochastic Analysis has emerged as a core area of late 20th century Mathematics and is currently undergoing a rapid scientific development. The special volume "Stochastic Analysis 2010" provides a sa
The ATLAS Analysis Architecture
International Nuclear Information System (INIS)
Cranmer, K.S.
2008-01-01
We present an overview of the ATLAS analysis architecture including the relevant aspects of the computing model and the major architectural aspects of the Athena framework. Emphasis will be given to the interplay between the analysis use cases and the technical aspects of the architecture including the design of the event data model, transient-persistent separation, data reduction strategies, analysis tools, and ROOT interoperability
Circuit analysis with Multisim
Baez-Lopez, David
2011-01-01
This book is concerned with circuit simulation using National Instruments Multisim. It focuses on the use and comprehension of the working techniques for electrical and electronic circuit simulation. The first chapters are devoted to basic circuit analysis. It starts by describing in detail how to perform a DC analysis using only resistors and independent and controlled sources. Then, it introduces capacitors and inductors to make a transient analysis. In the case of transient analysis, it is possible to have an initial condition either in the capacitor voltage or in the inductor current, or bo
Textile Technology Analysis Lab
Federal Laboratory Consortium — The Textile Analysis Lab is built for evaluating and characterizing the physical properties of an array of textile materials, but specifically those used in aircrew...
DEFF Research Database (Denmark)
Sørensen, Olav Jull
2009-01-01
The review presents the book International Market Analysis: Theories and Methods, written by John Kuiada, professor at the Centre of International Business, Department of Business Studies, Aalborg University. The book is refreshingly new in its way of looking at a classical problem. It looks at market...... analysis from the point of view of ways of thinking about markets. Furthermore, the book includes the concept of learning in the analysis of markets and how the way we understand business reality influences our choice of methodology for market analysis....
Chemical Security Analysis Center
Federal Laboratory Consortium — In 2006, by Presidential Directive, DHS established the Chemical Security Analysis Center (CSAC) to identify and assess chemical threats and vulnerabilities in the...
Geospatial Data Analysis Facility
Federal Laboratory Consortium — Geospatial application development, location-based services, spatial modeling, and spatial analysis are examples of the many research applications that this facility...
National Research Council Canada - National Science Library
Gilbert, John
1984-01-01
... quantification methods used in the analysis of mycotoxins in foods - Confirmation and quantification of trace organic food contaminants by mass spectrometry-selected ion monitoring - Chemiluminescence...
Federal Laboratory Consortium — FUNCTION: Uses state-of-the-art instrumentation for qualitative and quantitative analysis of organic and inorganic compounds, and biomolecules from gas, liquid, and...
Thermogravimetric Analysis Laboratory
Federal Laboratory Consortium — At NETL’s Thermogravimetric Analysis Laboratory in Morgantown, WV, researchers study how chemical looping combustion (CLC) can be applied to fossil energy systems....
Sensitivity and uncertainty analysis
Cacuci, Dan G; Navon, Ionel Michael
2005-01-01
As computer-assisted modeling and analysis of physical processes have continued to grow and diversify, sensitivity and uncertainty analyses have become indispensable scientific tools. Sensitivity and Uncertainty Analysis. Volume I: Theory focused on the mathematical underpinnings of two important methods for such analyses: the Adjoint Sensitivity Analysis Procedure and the Global Adjoint Sensitivity Analysis Procedure. This volume concentrates on the practical aspects of performing these analyses for large-scale systems. The applications addressed include two-phase flow problems, a radiative c
Hox, J.J.; Maas, C.J.M.; Lensvelt-Mulders, G.J.L.M.
2004-01-01
The goal of meta-analysis is to integrate the research results of a number of studies on a specific topic. Characteristic for meta-analysis is that in general only the summary statistics of the studies are used and not the original data. When the published research results to be integrated
International Nuclear Information System (INIS)
Hahn, A.A.
1994-11-01
The complexity of instrumentation sometimes requires data analysis to be done before the result is presented to the control room. This tutorial reviews some of the theoretical assumptions underlying the more popular forms of data analysis and presents simple examples to illuminate the advantages and hazards of different techniques
Activation analysis. Detection limits
International Nuclear Information System (INIS)
Revel, G.
1999-01-01
Numerical data and limits of detection related to the four irradiation modes, often used in activation analysis (reactor neutrons, 14 MeV neutrons, photon gamma and charged particles) are presented here. The technical presentation of the activation analysis is detailed in the paper P 2565 of Techniques de l'Ingenieur. (A.L.B.)
SMART performance analysis methodology
International Nuclear Information System (INIS)
Lim, H. S.; Kim, H. C.; Lee, D. J.
2001-04-01
To ensure the required and desired operation over the plant lifetime, the performance analysis for the SMART NSSS design is done by means of the specified analysis methodologies for the performance-related design basis events (PRDBE). A PRDBE is an occurrence (event) that shall be accommodated in the design of the plant and whose consequence would be no more severe than normal service effects of the plant equipment. The performance analysis methodology, which systematizes the methods and procedures to analyze the PRDBEs, is as follows. Based on the operation mode suitable to the characteristics of the SMART NSSS, the corresponding PRDBEs and the allowable range of process parameters for these events are deduced. With the developed control logic for each operation mode, the system thermal-hydraulics are analyzed for the chosen PRDBEs using the system analysis code. In particular, because the system characteristics of SMART differ from those of existing commercial nuclear power plants, the operation mode, PRDBEs, control logic, and analysis code should be consistent with the SMART design. This report presents the categories of the PRDBEs chosen based on each operation mode, the transitions among these, and the acceptance criteria for each PRDBE. It also includes the analysis methods and procedures for each PRDBE and the concept of the control logic for each operation mode. Therefore this report, in which the overall details for SMART performance analysis are specified based on the current SMART design, would be utilized as a guide for the detailed performance analysis
Contrast analysis : A tutorial
Haans, A.
2018-01-01
Contrast analysis is a relatively simple but effective statistical method for testing theoretical predictions about differences between group means against the empirical data. Despite its advantages, contrast analysis is hardly used to date, perhaps because it is not implemented in a convenient
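As a hedged sketch of the method described above (the three groups, scores, and weights below are invented for illustration), a contrast is a weighted combination of group means whose weights sum to zero; it is tested against the pooled within-group error term:

```python
import math
from statistics import mean, variance

# Illustrative scores for three hypothetical conditions (not real study data).
groups = [
    [4.0, 5.0, 6.0, 5.5],   # control
    [6.0, 7.0, 6.5, 7.5],   # treatment 1
    [8.0, 7.5, 9.0, 8.5],   # treatment 2
]
# Contrast weights testing a linear trend across the groups; they must sum to 0.
weights = [-1.0, 0.0, 1.0]

means = [mean(g) for g in groups]
n = [len(g) for g in groups]
# Pooled within-group variance (MSE), the one-way ANOVA error term.
mse = sum((len(g) - 1) * variance(g) for g in groups) / sum(len(g) - 1 for g in groups)

L = sum(w * m for w, m in zip(weights, means))                  # contrast estimate
se = math.sqrt(mse * sum(w**2 / k for w, k in zip(weights, n)))  # standard error
t = L / se                                                       # t with N - (number of groups) df
print(f"L = {L:.3f}, t = {t:.2f}")
```

The resulting t statistic is referred to a t distribution with N minus the number of groups degrees of freedom; the appeal of the method is that the weights encode the theoretical prediction directly.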
Interactive Controls Analysis (INCA)
Bauer, Frank H.
1989-01-01
Version 3.12 of INCA provides user-friendly environment for design and analysis of linear control systems. System configuration and parameters easily adjusted, enabling INCA user to create compensation networks and perform sensitivity analysis in convenient manner. Full complement of graphical routines makes output easy to understand. Written in Pascal and FORTRAN.
Marketing research cluster analysis
Directory of Open Access Journals (Sweden)
Marić Nebojša
2002-01-01
One area of applications of cluster analysis in marketing is the identification of groups of cities and towns with similar demographic profiles. This paper considers the main aspects of cluster analysis by an example of clustering 12 cities with the use of Minitab software.
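A minimal sketch of the clustering idea (this is a toy k-means in pure Python on hypothetical, already-normalized two-feature city profiles, not the paper's 12-city Minitab workflow):

```python
import random

# Hypothetical (population_density, median_age) profiles for 6 cities, scaled to [0, 1].
cities = {
    "A": (0.9, 0.2), "B": (0.8, 0.3), "C": (0.85, 0.25),
    "D": (0.2, 0.8), "E": (0.3, 0.7), "F": (0.25, 0.75),
}

def kmeans(points, k, iters=20, seed=0):
    """Minimal k-means: returns a cluster label (0..k-1) for each point name."""
    random.seed(seed)
    names = list(points)
    centers = [points[n] for n in random.sample(names, k)]
    labels = {}
    for _ in range(iters):
        # Assignment step: each point goes to the nearest centre (squared Euclidean distance).
        labels = {n: min(range(k), key=lambda c: sum((a - b) ** 2
                  for a, b in zip(points[n], centers[c]))) for n in names}
        # Update step: each centre moves to the mean of its members.
        for c in range(k):
            members = [points[n] for n in names if labels[n] == c]
            if members:
                centers[c] = tuple(sum(xs) / len(xs) for xs in zip(*members))
    return labels

labels = kmeans(cities, k=2)  # two clearly separated demographic groups
```

In practice one would standardize each demographic variable first and choose k by inspecting within-cluster variation, as statistics packages such as Minitab do.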
SWOT ANALYSIS - CHINESE PETROLEUM
Directory of Open Access Journals (Sweden)
Chunlan Wang
2014-01-01
This article, written in early December 2013, carries out a SWOT analysis combining the historical development and the latest data on Chinese Petroleum. The paper discusses corporate resources, cost, and management, as well as external factors such as the political environment and market supply and demand, and conducts a comprehensive and profound analysis.
de Roon, F.A.; Nijman, T.E.; Ter Horst, J.R.
2000-01-01
In this paper we evaluate applications of (return based) style analysis. The portfolio and positivity constraints imposed by style analysis are useful in constructing mimicking portfolios without short positions. Such mimicking portfolios can be used, e.g., to construct efficient portfolios of mutual
F.A. de Roon (Frans); T.E. Nijman (Theo); B.J.M. Werker
2000-01-01
In this paper we evaluate applications of (return based) style analysis. The portfolio and positivity constraints imposed by style analysis are useful in constructing mimicking portfolios without short positions. Such mimicking portfolios can be used e.g. to construct efficient
Directory of Open Access Journals (Sweden)
Satu Elo
2014-02-01
Qualitative content analysis is commonly used for analyzing qualitative data. However, few articles have examined the trustworthiness of its use in nursing science studies. The trustworthiness of qualitative content analysis is often presented by using terms such as credibility, dependability, conformability, transferability, and authenticity. This article focuses on trustworthiness based on a review of previous studies, our own experiences, and methodological textbooks. Trustworthiness was described for the main qualitative content analysis phases from data collection to reporting of the results. We concluded that it is important to scrutinize the trustworthiness of every phase of the analysis process, including the preparation, organization, and reporting of results. Together, these phases should give a reader a clear indication of the overall trustworthiness of the study. Based on our findings, we compiled a checklist for researchers attempting to improve the trustworthiness of a content analysis study. The discussion in this article helps to clarify how content analysis should be reported in a valid and understandable manner, which would be of particular benefit to reviewers of scientific articles. Furthermore, we discuss that it is often difficult to evaluate the trustworthiness of qualitative content analysis studies because of defective data collection method description and/or analysis description.
Schraagen, J.M.C.
2000-01-01
Cognitive task analysis is defined as the extension of traditional task analysis techniques to yield information about the knowledge, thought processes and goal structures that underlie observable task performance. Cognitive task analyses are conducted for a wide variety of purposes, including the
DEFF Research Database (Denmark)
Damkilde, Lars
2007-01-01
Limit State analysis has a long history and many prominent researchers have contributed. The theoretical foundation is based on the upper- and lower-bound theorems which give a very comprehensive and elegant formulation on complicated physical problems. In the pre-computer age Limit State analysis...... also enabled engineers to solve practical problems within reinforced concrete, steel structures and geotechnics....
Verhoosel, C.V.; Scott, M.A.; Borden, M.J.; Borst, de R.; Hughes, T.J.R.; Mueller-Hoeppe, D.; Loehnert, S.; Reese, S.
2011-01-01
Isogeometric analysis is a versatile tool for failure analysis. On the one hand, the excellent control over the inter-element continuity conditions enables a natural incorporation of continuum constitutive relations that incorporate higher-order strain gradients, as in gradient plasticity or damage.
DEFF Research Database (Denmark)
Durbin, Richard; Eddy, Sean; Krogh, Anders Stærmose
This book provides an up-to-date and tutorial-level overview of sequence analysis methods, with particular emphasis on probabilistic modelling. Discussed methods include pairwise alignment, hidden Markov models, multiple alignment, profile searches, RNA secondary structure analysis, and phylogene...
International Nuclear Information System (INIS)
Arien, B.
2000-01-01
The objective of SCK-CEN's programme on reactor safety is to develop expertise in probabilistic and deterministic reactor safety analysis. The research programme consists of two main activities, in particular the development of software for reliability analysis of large systems and participation in the international PHEBUS-FP programme for severe accidents. Main achievements in 1999 are reported
Factorial Analysis of Profitability
Georgeta VINTILA; Ilie GHEORGHE; Ioana Mihaela POCAN; Madalina Gabriela ANGHEL
2012-01-01
The DuPont analysis system is based on decomposing the profitability ratio into factors of influence. This paper describes the factorial analysis of profitability based on the DuPont system. Particular attention is given to the impact of various indicators on share value and profitability.
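The DuPont decomposition mentioned above factors return on equity into margin, turnover, and leverage. A minimal sketch with hypothetical figures (illustrative only, not from the paper):

```python
# DuPont decomposition: ROE = net margin x asset turnover x equity multiplier.
# All figures below are hypothetical, for illustration only.
net_income, sales, assets, equity = 12.0, 150.0, 200.0, 80.0

net_margin = net_income / sales          # profitability of each unit of sales
asset_turnover = sales / assets          # efficiency of asset use
equity_multiplier = assets / equity      # financial leverage

roe = net_margin * asset_turnover * equity_multiplier
# The product collapses algebraically to net_income / equity, which is the point
# of the decomposition: it attributes ROE to three separable drivers.
print(f"ROE = {roe:.1%}")
```

The factorial analysis then asks which of the three drivers moved ROE between periods, e.g. by substituting one factor at a time.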
Spool assembly support analysis
International Nuclear Information System (INIS)
Norman, B.F.
1994-01-01
This document provides the wind/seismic analysis and evaluation for the pump pit spool assemblies. Hand calculations were used for the analysis. UBC and AISC load factors were used in this evaluation. The results show that the actual loads are under the allowable loads and all requirements are met
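The acceptance logic of such an evaluation reduces to demand-to-capacity ratios. A hedged sketch (the load names and values are hypothetical, not the document's actual calculations):

```python
# Hypothetical actual vs. allowable loads (kN) for an allowable-load evaluation.
actual_loads = {"wind": 12.0, "seismic": 18.0}
allowable_loads = {"wind": 20.0, "seismic": 25.0}

# Demand/capacity ratio per load case; every ratio must be <= 1.0 to pass.
ratios = {case: actual_loads[case] / allowable_loads[case] for case in actual_loads}
ok = all(r <= 1.0 for r in ratios.values())
for case, r in ratios.items():
    print(f"{case}: D/C = {r:.2f} ({'OK' if r <= 1.0 else 'FAIL'})")
```

Real evaluations combine load cases with code-specified factors before forming the ratio; this sketch only shows the final check.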
International Nuclear Information System (INIS)
Hansen, J.D.
1976-01-01
This article discusses the partial wave analysis of two, three and four meson systems. The difference between the two approaches, referred to as amplitude and Ascoli analysis is discussed. Some of the results obtained with these methods are shown. (B.R.H.)
Enabling interdisciplinary analysis
L. M. Reid
1996-01-01
New requirements for evaluating environmental conditions in the Pacific Northwest have led to increased demands for interdisciplinary analysis of complex environmental problems. Procedures for watershed analysis have been developed for use on public and private lands in Washington State (Washington Forest Practices Board 1993) and for federal lands in the Pacific...
Shot loading platform analysis
International Nuclear Information System (INIS)
Norman, B.F.
1994-01-01
This document provides the wind/seismic analysis and evaluation for the shot loading platform. Hand calculations were used for the analysis. AISC and UBC load factors were used in this evaluation. The results show that the actual loads are under the allowable loads and all requirements are met
Marketing research cluster analysis
Marić Nebojša
2002-01-01
One area of application of cluster analysis in marketing is the identification of groups of cities and towns with similar demographic profiles. This paper considers the main aspects of cluster analysis through the example of clustering 12 cities using Minitab software.
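As a minimal sketch of the kind of grouping the abstract describes (the paper itself uses Minitab), here is a plain k-means clustering of hypothetical two-dimensional demographic profiles; the data points and initialization are illustrative assumptions, not from the paper:

```python
import math

def kmeans(points, k, iters=20):
    """Minimal k-means sketch: repeatedly assign each point to its nearest
    centroid, then move each centroid to the mean of its cluster."""
    centroids = [list(p) for p in points[:k]]  # naive init: first k points
    clusters = []
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            idx = min(range(k), key=lambda c: math.dist(p, centroids[c]))
            clusters[idx].append(p)
        centroids = [
            [sum(col) / len(cl) for col in zip(*cl)] if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids, clusters

# Two visually separated groups of hypothetical "city profiles".
cities = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
centroids, clusters = kmeans(cities, k=2)
```

Real demographic profiling would use more variables and a considered initialization, but the assignment/update loop is the core of the method.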
Towards Cognitive Component Analysis
DEFF Research Database (Denmark)
Hansen, Lars Kai; Ahrendt, Peter; Larsen, Jan
2005-01-01
Cognitive component analysis (COCA) is here defined as the process of unsupervised grouping of data such that the ensuing group structure is well-aligned with that resulting from human cognitive activity. We have earlier demonstrated that independent component analysis is relevant for representing...
Satu Elo; Maria Kääriäinen; Outi Kanste; Tarja Pölkki; Kati Utriainen; Helvi Kyngäs
2014-01-01
Qualitative content analysis is commonly used for analyzing qualitative data. However, few articles have examined the trustworthiness of its use in nursing science studies. The trustworthiness of qualitative content analysis is often presented by using terms such as credibility, dependability, confirmability, transferability, and authenticity. This article focuses on trustworthiness based on a review of previous studie...
Interaction Analysis and Supervision.
Amidon, Edmund
This paper describes a model that uses interaction analysis as a tool to provide feedback to a teacher in a microteaching situation. The author explains how interaction analysis can be used for teacher improvement, describes the category system used in the model, the data collection methods used, and the feedback techniques found in the model. (JF)
Activation analysis. Chapter 4
International Nuclear Information System (INIS)
1976-01-01
The principle, sample and calibration standard preparation, activation by neutrons, charged particles and gamma radiation, sample transport after activation, activity measurement, and chemical sample processing are described for activation analysis. Possible applications are shown of nondestructive activation analysis. (J.P.)
Donahue, Craig J.; Rais, Elizabeth A.
2009-01-01
This lab experiment illustrates the use of thermogravimetric analysis (TGA) to perform proximate analysis on a series of coal samples of different rank. Peat and coke are also examined. A total of four exercises are described. These are dry exercises as students interpret previously recorded scans. The weight percent moisture, volatile matter,…
Ian M. Franks; Mike Hughes
2004-01-01
This book addresses and appropriately explains the notational analysis of technique, tactics, individual athlete/team exercise and work-rate in sport. The book offers guidance in: developing a system, analysis of data, effective coaching using notational performance analysis and modeling sport behaviors. It updates and improves the 1997 edition.
Directory of Open Access Journals (Sweden)
Ian M. Franks
2004-06-01
Full Text Available This book addresses and appropriately explains the notational analysis of technique, tactics, individual athlete/team exercise and work-rate in sport. The book offers guidance in: developing a system, analysis of data, effective coaching using notational performance analysis and modeling sport behaviors. It updates and improves the 1997 edition.
Allain, Rhett
2016-05-01
We currently live in a world filled with videos. There are videos on YouTube, feature movies and even videos recorded with our own cameras and smartphones. These videos present an excellent opportunity to not only explore physical concepts, but also inspire others to investigate physics ideas. With video analysis, we can explore the fantasy world in science-fiction films. We can also look at online videos to determine if they are genuine or fake. Video analysis can be used in the introductory physics lab and it can even be used to explore the make-believe physics embedded in video games. This book covers the basic ideas behind video analysis along with the fundamental physics principles used in video analysis. The book also includes several examples of the unique situations in which video analysis can be used.
Ramsay, J O
1997-01-01
Scientists today collect samples of curves and other functional observations. This monograph presents many ideas and techniques for such data. Included are expressions in the functional domain of such classics as linear regression, principal components analysis, linear modelling, and canonical correlation analysis, as well as specifically functional techniques such as curve registration and principal differential analysis. Data arising in real applications are used throughout for both motivation and illustration, showing how functional approaches allow us to see new things, especially by exploiting the smoothness of the processes generating the data. The data sets exemplify the wide scope of functional data analysis; they are drawn from growth analysis, meteorology, biomechanics, equine science, economics, and medicine. The book presents novel statistical technology while keeping the mathematical level widely accessible. It is designed to appeal to students, to applied data analysts, and to experienced researc...
Systems engineering and analysis
Blanchard, Benjamin S
2010-01-01
For senior-level undergraduate and first and second year graduate systems engineering and related courses. A total life-cycle approach to systems and their analysis. This practical introduction to systems engineering and analysis provides the concepts, methodologies, models, and tools needed to understand and implement a total life-cycle approach to systems and their analysis. The authors focus first on the process of bringing systems into being--beginning with the identification of a need and extending that need through requirements determination, functional analysis and allocation, design synthesis, evaluation, and validation, operation and support, phase-out, and disposal. Next, the authors discuss the improvement of systems currently in being, showing that by employing the iterative process of analysis, evaluation, feedback, and modification, most systems in existence can be improved in their affordability, effectiveness, and stakeholder satisfaction.
International Nuclear Information System (INIS)
Ishii, Keizo
1997-01-01
Elemental analysis based on particle induced x-ray emission (PIXE) is a novel technique for analyzing trace elements. The method is very simple, its sensitivity is very high, multiple elements in a sample can be analyzed simultaneously, and a few tens of μg of a sample are enough to be analyzed. Owing to these characteristics, PIXE analysis is now used in many fields (e.g. biology, medicine, dentistry, environmental pollution, archaeology, cultural assets etc.). Fundamentals of the PIXE analysis are described here: the production of characteristic x-rays and inner-shell ionization by heavy charged particles, the continuous background in the PIXE spectrum, quantitative formulae of the PIXE analysis, the detection limit of PIXE analysis, etc. (author)
International Nuclear Information System (INIS)
Porten, D.R.; Crowe, R.D.
1994-01-01
The purpose of this accident safety analysis is to document in detail the analyses whose results were reported in summary form in the K Basins Safety Analysis Report WHC-SD-SNF-SAR-001. The safety analysis addressed the potential for release of radioactive and non-radioactive hazardous material located in the K Basins and their supporting facilities. The safety analysis covers the hazards associated with normal K Basin fuel storage and handling operations, fuel encapsulation, sludge encapsulation, and canister clean-up and disposal. After a review of the Criticality Safety Evaluation of the K Basin activities, the following postulated events were evaluated: Crane failure and casks dropped into loadout pit; Design basis earthquake; Hypothetical loss of basin water accident analysis; Combustion of uranium fuel following dryout; Crane failure and cask dropped onto floor of transfer area; Spent ion exchange shipment for burial; Hydrogen deflagration in ion exchange modules and filters; Release of Chlorine; Power availability and reliability; and Ashfall
International Nuclear Information System (INIS)
Goetz, A.; Gerring, M.; Svensson, O.; Brockhauser, S.
2012-01-01
Data Analysis Workbench (DAWB) is a new software tool being developed at the ESRF. Its goal is to provide a tool both for online data analysis, which can be used on the beamlines, and for offline data analysis, which users can use during experiments or take home. The tool includes support for data visualization and workflows. Workflows allow algorithms which exploit parallel architectures to be designed from existing high-level modules for data analysis in combination with data collection. The workbench uses Passerelle as the workflow engine and EDNA plug-ins for data analysis. Actors talking to Tango are used for sending commands to a limited set of hardware to start existing data collection algorithms. A Tango server allows workflows to be executed from existing applications. There are scripting interfaces to Python, Javascript and SPEC. The workbench is currently being tested at the ESRF on a selected number of beamlines. (authors)
J Olive, David
2017-01-01
This text presents methods that are robust to the assumption of a multivariate normal distribution or methods that are robust to certain types of outliers. Instead of using exact theory based on the multivariate normal distribution, the simpler and more applicable large sample theory is given. The text develops among the first practical robust regression and robust multivariate location and dispersion estimators backed by theory. The robust techniques are illustrated for methods such as principal component analysis, canonical correlation analysis, and factor analysis. A simple way to bootstrap confidence regions is also provided. Much of the research on robust multivariate analysis in this book is being published for the first time. The text is suitable for a first course in Multivariate Statistical Analysis or a first course in Robust Statistics. This graduate text is also useful for people who are familiar with the traditional multivariate topics, but want to know more about handling data sets with...
Field, Michael
2017-01-01
This book provides a rigorous introduction to the techniques and results of real analysis, metric spaces and multivariate differentiation, suitable for undergraduate courses. Starting from the very foundations of analysis, it offers a complete first course in real analysis, including topics rarely found in such detail in an undergraduate textbook such as the construction of non-analytic smooth functions, applications of the Euler-Maclaurin formula to estimates, and fractal geometry. Drawing on the author’s extensive teaching and research experience, the exposition is guided by carefully chosen examples and counter-examples, with the emphasis placed on the key ideas underlying the theory. Much of the content is informed by its applicability: Fourier analysis is developed to the point where it can be rigorously applied to partial differential equations or computation, and the theory of metric spaces includes applications to ordinary differential equations and fractals. Essential Real Analysis will appeal t...
Real analysis and applications
Botelho, Fabio Silva
2018-01-01
This textbook introduces readers to real analysis in one and n dimensions. It is divided into two parts: Part I explores real analysis in one variable, starting with key concepts such as the construction of the real number system, metric spaces, and real sequences and series. In turn, Part II addresses the multi-variable aspects of real analysis. Further, the book presents detailed, rigorous proofs of the implicit theorem for the vectorial case by applying the Banach fixed-point theorem and the differential forms concept to surfaces in Rn. It also provides a brief introduction to Riemannian geometry. With its rigorous, elegant proofs, this self-contained work is easy to read, making it suitable for undergraduate and beginning graduate students seeking a deeper understanding of real analysis and applications, and for all those looking for a well-founded, detailed approach to real analysis.
Nonactivation interaction analysis. Chapter 5
International Nuclear Information System (INIS)
1976-01-01
Analyses are described including the alpha scattering analysis, beta absorption and scattering analysis, gamma and X-ray absorption and scattering analysis, X-ray fluorescence analysis, neutron absorption and scattering analysis, Moessbauer effect application and an analysis based on the application of radiation ionizing effects. (J.P.)
Is activation analysis still active?
International Nuclear Information System (INIS)
Chai Zhifang
2001-01-01
This paper reviews some aspects of neutron activation analysis (NAA), covering instrumental neutron activation analysis (INAA), the k0 method, prompt gamma-ray neutron activation analysis (PGNAA), radiochemical neutron activation analysis (RNAA) and molecular activation analysis (MAA). A comparison of neutron activation analysis with other analytical techniques is also made. (author)
International Nuclear Information System (INIS)
He, Ling-Yun; Fan, Ying; Wei, Yi-Ming
2009-01-01
Based on time series of crude oil prices (daily spot), this paper analyses price fluctuation with two significant parameters τ (speculators' time scales of investment) and ε (speculators' expectations of return) by using the Zipf analysis technique, specifically by mapping τ-returns of prices into 3-alphabeted sequences (absolute frequencies) and 2-alphabeted sequences (relative frequencies), which contain the fundamental information of price fluctuations. The paper empirically explores these parameters and identifies various types of speculators' cognition patterns of price behavior. In order to quantify the degree of distortion, a feasible reference is proposed: an ideal speculator. Finally, the paper discusses the similarities and differences between the cognition patterns of speculators and those of an ideal speculator. The resultant analyses identify the possible distortion of price behaviors by their patterns. (author)
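A minimal sketch of the coding step described above, under the simplest reading of the mapping: each τ-return is coded 'u', 'd', or 'f' depending on whether it exceeds ε, falls below -ε, or lies in between, and ranked word frequencies then feed a Zipf-style analysis. The prices and threshold are illustrative, not from the paper:

```python
def encode_returns(prices, tau=1, eps=0.0):
    """Map tau-step relative returns to a 3-letter alphabet:
    'u' if the return exceeds eps, 'd' if below -eps, 'f' (flat) otherwise."""
    symbols = []
    for i in range(len(prices) - tau):
        r = (prices[i + tau] - prices[i]) / prices[i]
        symbols.append('u' if r > eps else 'd' if r < -eps else 'f')
    return ''.join(symbols)

def ranked_frequencies(seq, word_len=2):
    """Absolute frequencies of fixed-length words, sorted descending
    for a Zipf-style rank/frequency plot."""
    counts = {}
    for i in range(len(seq) - word_len + 1):
        word = seq[i:i + word_len]
        counts[word] = counts.get(word, 0) + 1
    return sorted(counts.values(), reverse=True)

code = encode_returns([100, 101, 101, 99, 100], tau=1, eps=0.005)  # -> 'ufdu'
```

Varying `tau` and `eps` over a grid and comparing the resulting rank/frequency curves is the kind of exploration the abstract describes.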
Universality and Shannon entropy of codon usage
Frappat, L; Sciarrino, A; Sorba, Paul
2003-01-01
The distribution functions of the codon usage probabilities, computed over all the available GenBank data, for 40 eukaryotic biological species and 5 chloroplasts, do not follow a Zipf law, but are best fitted by the sum of a constant, an exponential and a linear function in the rank of usage. For mitochondria the analysis is not conclusive. A quantum-mechanics-inspired model is proposed to describe the observed behaviour. These functions are characterized by parameters that strongly depend on the total GC content of the coding regions of biological species. It is predicted that the codon usage is the same in all exonic genes with the same GC content. The Shannon entropy for codons, also strongly depending on the exonic GC content, is computed.
Dependence of exponents on text length versus finite-size scaling for word-frequency distributions
Corral, Álvaro; Font-Clos, Francesc
2017-08-01
Some authors have recently argued that a finite-size scaling law for the text-length dependence of word-frequency distributions cannot be conceptually valid. Here we give solid quantitative evidence for the validity of this scaling law, using both careful statistical tests and analytical arguments based on the generalized central-limit theorem applied to the moments of the distribution (and obtaining a novel derivation of Heaps' law as a by-product). We also find that the picture of word-frequency distributions with power-law exponents that decrease with text length [X. Yan and P. Minnhagen, Physica A 444, 828 (2016), 10.1016/j.physa.2015.10.082] does not stand up to rigorous statistical analysis. Instead, we show that the distributions are perfectly described by power-law tails with stable exponents, whose values are close to 2, in agreement with the classical Zipf's law. Some misconceptions about scaling are also clarified.
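A power-law tail exponent of the kind discussed here can be estimated with the standard Hill (maximum-likelihood) estimator for a continuous power law; this is a generic sketch of that estimator, not the statistical machinery of the paper:

```python
import math
import random

def hill_exponent(samples, xmin):
    """Maximum-likelihood (Hill) estimate of alpha for a continuous
    power-law tail p(x) ~ x**(-alpha), restricted to x >= xmin."""
    tail = [x for x in samples if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)

# Synthetic check: draw from an exact power law with alpha = 2
# (Zipf-like tail) via inverse-transform sampling.
random.seed(42)
draws = [(1.0 - random.random()) ** (-1.0 / (2.0 - 1.0)) for _ in range(20000)]
alpha_hat = hill_exponent(draws, xmin=1.0)
```

On synthetic data drawn from the model the estimate recovers the true exponent; on real word-frequency data the hard part, as the paper stresses, is the statistical testing around such point estimates.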
Directory of Open Access Journals (Sweden)
Dror Y. Kenett
2009-01-01
Full Text Available We present here an assessment of the latent market information embedded in the raw, affinity (normalized), and partial correlations. We compared the Zipf plot, spectrum, and distribution of the eigenvalues for each matrix with the results of the corresponding random matrix. The analysis was performed on stocks belonging to the New York and Tel Aviv Stock Exchanges, for the time period of January 2000 to March 2009. Our results show that in comparison to the raw correlations, the affinity matrices highlight the dominant factors of the system, and the partial correlation matrices contain more information. We propose that significant stock market information, which cannot be captured by the raw correlations, is embedded in the affinity and partial correlations. Our results further demonstrate the differences between the NY and TA markets.
Complexity multiscale asynchrony measure and behavior for interacting financial dynamics
Yang, Ge; Wang, Jun; Niu, Hongli
2016-08-01
A stochastic financial price process is proposed and investigated by the finite-range multitype contact dynamical system, in an attempt to study the nonlinear behaviors of real asset markets. The virus-spreading process in a finite-range multitype system is used to imitate the interacting behaviors of diverse investment attitudes in a financial market, and the empirical research on descriptive statistics and autocorrelation behaviors of return time series is performed for different values of propagation rates. Then the multiscale entropy analysis is adopted to study several different shuffled return series, including the original return series, the corresponding reversal series, the random shuffled series, the volatility shuffled series and the Zipf-type shuffled series. Furthermore, we propose and compare the multiscale cross-sample entropy and its modification algorithm called composite multiscale cross-sample entropy. We apply them to study the asynchrony of pairs of time series under different time scales.
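The two building blocks of multiscale entropy mentioned in the abstract, coarse-graining and sample entropy, can be sketched as follows. This is a generic textbook formulation, not the authors' composite cross-sample variant:

```python
import math

def coarse_grain(series, scale):
    """Coarse-graining step of multiscale entropy: means of consecutive
    non-overlapping windows of length `scale`."""
    n = len(series) // scale
    return [sum(series[i * scale:(i + 1) * scale]) / scale for i in range(n)]

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy -ln(A/B): B (A) counts template pairs of length m (m+1)
    whose Chebyshev distance is within tolerance r."""
    def matches(length):
        templates = [x[i:i + length] for i in range(len(x) - m)]
        count = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    count += 1
        return count
    b, a = matches(m), matches(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float('inf')
```

At scale s, multiscale entropy is simply `sample_entropy(coarse_grain(series, s))`; a perfectly regular series yields entropy 0, while irregular series score higher across scales.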
International Nuclear Information System (INIS)
González Caballero, I; Cuesta Noriega, A; Rodríguez Marrero, A; Fernández del Castillo, E
2012-01-01
The analysis of the complex LHC data usually follows a standard path that aims at minimizing not only the amount of data but also the number of observables used. After a number of steps of slimming and skimming the data, the remaining few terabytes of ROOT files hold a selection of the events and a flat structure for the variables needed that can be more easily inspected and traversed in the final stages of the analysis. PROOF arises at this point as an efficient mechanism to distribute the analysis load by taking advantage of all the cores in modern CPUs through PROOF Lite, or by using PROOF Cluster or PROOF on Demand tools to build dynamic PROOF clusters on computing facilities with spare CPUs. However, using PROOF at the level required for a serious analysis introduces some difficulties that may scare new adopters. We have developed the PROOF Analysis Framework (PAF) to facilitate the development of new analyses by uniformly exposing the PROOF-related configurations across technologies and by taking care of the routine tasks as much as possible. We describe the details of the PAF implementation as well as how we succeeded in engaging a group of CMS physicists to use PAF as their daily analysis framework.
Hazard Analysis Database Report
Energy Technology Data Exchange (ETDEWEB)
GAULT, G.W.
1999-10-13
The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for the Tank Waste Remediation System (TWRS) Final Safety Analysis Report (FSAR). The FSAR is part of the approved TWRS Authorization Basis (AB). This document describes, identifies, and defines the contents and structure of the TWRS FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The TWRS Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The database supports the preparation of Chapters 3,4, and 5 of the TWRS FSAR and the USQ process and consists of two major, interrelated data sets: (1) Hazard Evaluation Database--Data from the results of the hazard evaluations; and (2) Hazard Topography Database--Data from the system familiarization and hazard identification.
Containment vessel stability analysis
International Nuclear Information System (INIS)
Harstead, G.A.; Morris, N.F.; Unsal, A.I.
1983-01-01
The stability analysis for a steel containment shell is presented herein. The containment is a freestanding shell consisting of a vertical cylinder with a hemispherical dome. It is stiffened by large ring stiffeners and relatively small longitudinal stiffeners. The containment vessel is subjected to both static and dynamic loads which can cause buckling. These loads must be combined prior to their use in a stability analysis. The buckling loads were computed with the aid of the ASME Code case N-284 used in conjunction with general purpose computer codes and in-house programs. The equations contained in the Code case were used to compute the knockdown factors due to shell imperfections. After these knockdown factors were applied to the critical stress states determined by freezing the maximum dynamic stresses and combining them with other static stresses, a linear bifurcation analysis was carried out with the aid of the BOSOR4 program. Since the containment shell contained large penetrations, the Code case had to be supplemented by a local buckling analysis of the shell area surrounding the largest penetration. This analysis was carried out with the aid of the NASTRAN program. Although the factor of safety against buckling obtained in this analysis was satisfactory, it is claimed that the use of the Code case knockdown factors is unduly conservative when applied to the analysis of buckling around penetrations. (orig.)
International Nuclear Information System (INIS)
Thompson, W.A. Jr.
1979-11-01
This paper briefly describes WASH 1400 and the Lewis report. It attempts to define basic concepts such as risk and risk analysis, common mode failure, and rare event. Several probabilistic models which go beyond the WASH 1400 methodology are introduced; the common characteristic of these models is that they recognize explicitly that risk analysis is time dependent, whereas WASH 1400 takes a per-demand failure-rate approach which obscures the important fact that accidents are time related. Further, the presentation of a realistic risk analysis should recognize that there are various risks which compete with one another for the lives of the individuals at risk. A way of doing this is suggested.
International Nuclear Information System (INIS)
Kartiwa Sumadi; Yayah Rohayati
1996-01-01
The 'monazit' analytical program has been set up for the routine analysis of rare earth elements in monazite and xenotime mineral samples. The total relative error of the analysis is very low, less than 2.50%, and the reproducibility of the counting statistics and the stability of the instrument were excellent. The precision and accuracy of the analytical program are very good, with maximum relative percentages of 5.22% and 1.61%, respectively. The mineral compositions of the 30 monazite samples were also calculated from their chemical constituents, and the results were compared to the grain-counting microscopic analysis.
Methods of Multivariate Analysis
Rencher, Alvin C
2012-01-01
Praise for the Second Edition "This book is a systematic, well-written, well-organized text on multivariate analysis packed with intuition and insight . . . There is much practical wisdom in this book that is hard to find elsewhere."-IIE Transactions Filled with new and timely content, Methods of Multivariate Analysis, Third Edition provides examples and exercises based on more than sixty real data sets from a wide variety of scientific fields. It takes a "methods" approach to the subject, placing an emphasis on how students and practitioners can employ multivariate analysis in real-life sit
Dunham, Ken
2014-01-01
The rapid growth and development of Android-based devices has resulted in a wealth of sensitive information on mobile devices that offer minimal malware protection. This has created an immediate demand for security professionals that understand how to best approach the subject of Android malware threats and analysis.In Android Malware and Analysis, Ken Dunham, renowned global malware expert and author, teams up with international experts to document the best tools and tactics available for analyzing Android malware. The book covers both methods of malware analysis: dynamic and static.This tact
Aven, Terje
2012-01-01
Foundations of Risk Analysis presents the issues core to risk analysis - understanding what risk means, expressing risk, building risk models, addressing uncertainty, and applying probability models to real problems. The author provides the readers with the knowledge and basic thinking they require to successfully manage risk and uncertainty to support decision making. This updated edition reflects recent developments on risk and uncertainty concepts, representations and treatment. New material in Foundations of Risk Analysis includes: an up-to-date presentation of how to understand, define and
International Nuclear Information System (INIS)
Ko, Myeong Su; Kim, Tae Hwa; Park, Gyu Hyeon; Yang, Jong Beom; Oh, Chang Hwan; Lee, Kyoung Hye
2010-04-01
This textbook describes instrumental analysis in an accessible way in twelve chapters. The contents of the book are: pH measurement (principle, pH meters, measurement, and example experiments); centrifugation; absorptiometry; fluorescence methods; atomic absorption analysis; gas chromatography; gas chromatography-mass spectrometry; high-performance liquid chromatography and liquid chromatography-mass spectrometry; electrophoresis (practical cases, analysis of results, and examples); PCR (principle, devices, applications, and examples); and enzyme-linked immunosorbent assay, including indirect ELISA, sandwich ELISA, and the ELISA reader.
International Nuclear Information System (INIS)
Williams, Mike; Egede, Ulrik; Paterson, Stuart
2011-01-01
The distributed analysis experience to date at LHCb has been positive: job success rates are high and wait times for high-priority jobs are low. LHCb users access the grid using the GANGA job-management package, while the LHCb virtual organization manages its resources using the DIRAC package. This clear division of labor has benefitted LHCb and its users greatly; it is a major reason why distributed analysis at LHCb has been so successful. The newly formed LHCb distributed analysis support team has also proved to be a success.
Factor analysis and scintigraphy
International Nuclear Information System (INIS)
Di Paola, R.; Penel, C.; Bazin, J.P.; Berche, C.
1976-01-01
The goal of factor analysis is usually to achieve reduction of a large set of data, extracting essential features without a prior hypothesis. Owing to the development of computerized systems, the use of larger samples, the possibility of sequential data acquisition and the increase in dynamic studies, the problem of data compression is now encountered routinely. Thus, results obtained for the compression of scintigraphic images are first presented. Then the possibilities offered by factor analysis for scan processing are discussed. Finally, the use of this analysis for multidimensional studies, and especially dynamic studies, is considered for compression and processing [fr
Energy Technology Data Exchange (ETDEWEB)
Ko, Myeong Su; Kim, Tae Hwa; Park, Gyu Hyeon; Yang, Jong Beom; Oh, Chang Hwan; Lee, Kyoung Hye
2010-04-15
This textbook describes instrumental analysis in an accessible way in twelve chapters. The contents of the book are: pH measurement (principle, pH meters, measurement, and example experiments); centrifugation; absorptiometry; fluorescence methods; atomic absorption analysis; gas chromatography; gas chromatography-mass spectrometry; high-performance liquid chromatography and liquid chromatography-mass spectrometry; electrophoresis (practical cases, analysis of results, and examples); PCR (principle, devices, applications, and examples); and enzyme-linked immunosorbent assay, including indirect ELISA, sandwich ELISA, and the ELISA reader.
DEFF Research Database (Denmark)
Raket, Lars Lau
We propose a direction in the field of statistics which we will call functional object analysis. This subfield considers the analysis of functional objects defined on continuous domains. In this setting we will focus on model-based statistics, with a particular emphasis on mixed-effect formulations, where the observed functional signal is assumed to consist of both fixed and random functional effects. This thesis takes the initial steps toward the development of likelihood-based methodology for functional objects. We first consider analysis of functional data defined on high...
Bayesian nonparametric data analysis
Müller, Peter; Jara, Alejandro; Hanson, Tim
2015-01-01
This book reviews nonparametric Bayesian methods and models that have proven useful in the context of data analysis. Rather than providing an encyclopedic review of probability models, the book’s structure follows a data analysis perspective. As such, the chapters are organized by traditional data analysis problems. In selecting specific nonparametric models, simpler and more traditional models are favored over specialized ones. The discussed methods are illustrated with a wealth of examples, including applications ranging from stylized examples to case studies from recent literature. The book also includes an extensive discussion of computational methods and details on their implementation. R code for many examples is included in on-line software pages.
International Nuclear Information System (INIS)
Ramirez T, J.J.; Lopez M, J.; Sandoval J, A.R.; Villasenor S, P.; Aspiazu F, J.A.
2001-01-01
Elemental, metallographic, and phase analyses were performed in order to determine the oxidation states of the Fe contained in three metallic pieces of unknown material: a block, a plate, and a cylinder. Results are presented from the elemental analysis, which was carried out at the Tandem Accelerator of ININ by proton-induced X-ray emission (PIXE). The phase analysis was carried out by X-ray diffraction, which made it possible to identify the type of alloy or alloys formed. The combined application of nuclear techniques with metallographic techniques allows the integral characterization of industrial metals. (Author)
Ash, Robert B; Lukacs, E
1972-01-01
Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var
Iremonger, M J
1982-01-01
BASIC Stress Analysis aims to help students to become proficient at BASIC programming by actually using it in an important engineering subject. It also enables the student to use computing as a means of learning stress analysis because writing a program is analogous to teaching-it is necessary to understand the subject matter. The book begins by introducing the BASIC approach and the concept of stress analysis at first- and second-year undergraduate level. Subsequent chapters contain a summary of relevant theory, worked examples containing computer programs, and a set of problems. Topics c
Fundamentals of mathematical analysis
Paul J Sally, Jr
2013-01-01
This is a textbook for a course in Honors Analysis (for freshman/sophomore undergraduates) or Real Analysis (for junior/senior undergraduates) or Analysis-I (beginning graduates). It is intended for students who completed a course in "AP Calculus", possibly followed by a routine course in multivariable calculus and a computational course in linear algebra. There are three features that distinguish this book from many other books of a similar nature and which are important for the use of this book as a text. The first, and most important, feature is the collection of exercises. These are spread
Systems analysis-independent analysis and verification
Energy Technology Data Exchange (ETDEWEB)
Badin, J.S.; DiPietro, J.P. [Energetics, Inc., Columbia, MD (United States)
1995-09-01
The DOE Hydrogen Program is supporting research, development, and demonstration activities to overcome the barriers to the integration of hydrogen into the Nation's energy infrastructure. Much work is required to gain acceptance of hydrogen energy system concepts and to develop them for implementation. A systems analysis database has been created that includes a formal documentation of technology characterization profiles and cost and performance information. Through a systematic and quantitative approach, system developers can understand and address important issues and thereby assure effective and timely commercial implementation. This project builds upon and expands the previously developed and tested pathway model and provides the basis for a consistent and objective analysis of all hydrogen energy concepts considered by the DOE Hydrogen Program Manager. This project can greatly accelerate the development of a system by minimizing the risk of costly design evolutions and by stimulating discussions, feedback, and coordination among key players, allowing them to assess the analysis, evaluate the trade-offs, and address any emerging problem areas. Specific analytical studies will result in the validation of the competitive feasibility of the proposed system and identify system development needs. Systems that are investigated include hydrogen bromine electrolysis, municipal solid waste gasification, electro-farming (biomass gasifier and PEM fuel cell), a wind/hydrogen hybrid system for remote sites, home electrolysis and alternate infrastructure options, renewable-based electrolysis to fuel a PEM fuel cell vehicle fleet, and geothermal energy used to produce hydrogen. These systems are compared to conventional and benchmark technologies. Interim results and findings are presented. Independent analyses emphasize quality, integrity, objectivity, a long-term perspective, corporate memory, and the merging of technical, economic, operational, and programmatic expertise.
Plasma data analysis using statistical analysis system
International Nuclear Information System (INIS)
Yoshida, Z.; Iwata, Y.; Fukuda, Y.; Inoue, N.
1987-01-01
Multivariate factor analysis has been applied to a plasma data base of REPUTE-1. The characteristics of the reverse field pinch plasma in REPUTE-1 are shown to be explained by four independent parameters which are described in the report. The well known scaling laws F_chi ∝ I_p, T_e ∝ I_p, and tau_E ∝ N_e are also confirmed. 4 refs., 8 figs., 1 tab
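A scaling law of the form y ∝ x can be checked against data with a log-log least-squares fit; a minimal sketch, with synthetic hypothetical numbers standing in for the REPUTE-1 database:

```python
import numpy as np

def fit_power_law(x, y):
    """Fit y = c * x**a by least squares in log-log space.
    Returns (c, a); a close to 1 indicates a linear scaling."""
    a, log_c = np.polyfit(np.log(x), np.log(y), 1)
    return np.exp(log_c), a

# Synthetic example: tau_E proportional to N_e, with multiplicative noise
rng = np.random.default_rng(0)
n_e = np.linspace(1e19, 5e19, 50)
tau_e = 2.5e-21 * n_e * rng.lognormal(0.0, 0.05, n_e.size)

c, a = fit_power_law(n_e, tau_e)
print(f"exponent = {a:.2f}")
```

The fitted exponent, not the prefactor, is what confirms or refutes the proportionality.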
Summary Analysis: Hanford Site Composite Analysis Update
Energy Technology Data Exchange (ETDEWEB)
Nichols, W. E. [CH2M HILL Plateau Remediation Company, Richland, WA (United States); Lehman, L. L. [CH2M HILL Plateau Remediation Company, Richland, WA (United States)
2017-06-05
The Hanford Site’s currently maintained Composite Analysis, originally completed in 1998, requires an update. A previous update effort was undertaken by the U.S. Department of Energy (DOE) in 2001-2005, but was ended before completion to allow the Tank Closure & Waste Management Environmental Impact Statement (TC&WM EIS) (DOE/EIS-0391) to be prepared without potential for conflicting sitewide models. This EIS was issued in 2012, and the deferral was ended with guidance in memorandum “Modeling to Support Regulatory Decision Making at Hanford” (Williams, 2012) provided with the aim of ensuring subsequent modeling is consistent with the EIS.
He, Jingrui
2012-01-01
This book focuses on rare category analysis where the majority classes have smooth distributions and the minority classes exhibit the compactness property. It focuses on challenging cases where the support regions of the majority and minority classes overlap.
Longitudinal categorical data analysis
Sutradhar, Brajendra C
2014-01-01
This is the first book in longitudinal categorical data analysis with parametric correlation models developed based on dynamic relationships among repeated categorical responses. This book is a natural generalization of the longitudinal binary data analysis to the multinomial data setup with more than two categories. Thus, unlike the existing books on cross-sectional categorical data analysis using log linear models, this book uses multinomial probability models both in cross-sectional and longitudinal setups. A theoretical foundation is provided for the analysis of univariate multinomial responses, by developing models systematically for the cases with no covariates as well as categorical covariates, both in cross-sectional and longitudinal setups. In the longitudinal setup, both stationary and non-stationary covariates are considered. These models have also been extended to the bivariate multinomial setup along with suitable covariates. For the inferences, the book uses the generalized quasi-likelihood as w...
International Nuclear Information System (INIS)
1981-09-01
Suggestions are made concerning the method of fault tree analysis and the use of certain symbols in the examination of system failures. The purpose of the fault tree analysis is to find logical connections of component or subsystem failures leading to undesirable occurrences. The results of these examinations are part of the system assessment concerning operation and safety. The objectives of the analysis are: systematic identification of all possible failure combinations (causes) leading to a specific undesirable occurrence, and the determination of reliability parameters such as the frequency of failure combinations, the frequency of the undesirable occurrence, or the non-availability of the system when required. The fault tree analysis provides a clear and reconstructable documentation of the examination. (orig./HP) [de
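The logical combination of failures described above can be evaluated numerically once basic-event probabilities are known; a minimal sketch assuming independent basic events, with a hypothetical pump/valve/backup tree (not taken from the report):

```python
from functools import reduce

def p_and(probs):
    """AND gate: the event occurs only if all inputs fail (independence assumed)."""
    return reduce(lambda a, b: a * b, probs, 1.0)

def p_or(probs):
    """OR gate: the event occurs if at least one input fails (independence assumed)."""
    return 1.0 - reduce(lambda a, b: a * (1.0 - b), probs, 1.0)

# Hypothetical tree: top event = (pump fails OR valve fails) AND backup fails
p_pump, p_valve, p_backup = 1e-3, 5e-4, 1e-2
p_top = p_and([p_or([p_pump, p_valve]), p_backup])
print(f"{p_top:.2e}")  # → 1.50e-05
```

Real fault tree codes additionally enumerate minimal cut sets and handle common-cause failures, which this sketch ignores.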
Denker, A; Rauschenberg, J; Röhrich, J; Strub, E
2006-01-01
Materials analysis with ion beams exploits the interaction of ions with the electrons and nuclei in the sample. Among the vast variety of analytical techniques available with ion beams, we restrict ourselves to ion beam analysis with ions in the energy range from one to several MeV per mass unit. It is possible to use either the back-scattered projectiles (RBS – Rutherford Back Scattering) or the recoiled atoms themselves (ERDA – Elastic Recoil Detection Analysis) from the elastic scattering processes. These techniques allow the simultaneous and absolute determination of stoichiometry and depth profiles of the detected elements. The interaction of the ions with the electrons in the sample produces holes in the inner electronic shells of the sample atoms, which recombine and emit X-rays characteristic of the element in question. Particle Induced X-ray Emission (PIXE) has been shown to be a fast technique for the analysis of elements with an atomic number above 11.
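In RBS, the identity of the target atom follows from the energy of the backscattered projectile via the standard elastic-scattering kinematic factor; a small sketch (masses in atomic mass units; the He-on-Si numbers are an illustrative example, not from the text):

```python
import math

def kinematic_factor(m1, m2, theta_deg):
    """Kinematic factor K = E_out / E_in for a projectile of mass m1
    elastically backscattered from a target atom of mass m2 (m2 > m1)
    at laboratory scattering angle theta."""
    t = math.radians(theta_deg)
    root = math.sqrt(m2**2 - (m1 * math.sin(t))**2)
    return ((m1 * math.cos(t) + root) / (m1 + m2))**2

# Example: He-4 (m1 = 4) backscattered at 170 degrees from Si (m2 = 28)
k = kinematic_factor(4.0, 28.0, 170.0)
print(f"K = {k:.3f}")
```

Heavier target atoms give K closer to 1, which is why backscattered energy spectra separate the elements in a sample.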
DEFF Research Database (Denmark)
Vatrapu, Ravi; Mukkamala, Raghava Rao; Hussain, Abid
2016-01-01
... automata and agent-based modeling). However, when it comes to organizational and societal units of analysis, there exists no approach to conceptualize, model, analyze, explain, and predict social media interactions as individuals' associations with ideas, values, identities, and so on. To address this, ... conceptual and formal models of social data, and an analytical framework for combining big social data sets with organizational and societal data sets. Three empirical studies of big social data are presented to illustrate and demonstrate social set analysis in terms of fuzzy set-theoretical sentiment analysis, crisp set-theoretical interaction analysis, and event-studies-oriented set-theoretical visualizations. Implications for big data analytics, current limitations of the set-theoretical approach, and future directions are outlined.
PWR systems transient analysis
International Nuclear Information System (INIS)
Kennedy, M.F.; Peeler, G.B.; Abramson, P.B.
1985-01-01
Analysis of transients in pressurized water reactor (PWR) systems involves the assessment of the response of the total plant, including primary and secondary coolant systems, steam piping and turbine (possibly including the complete feedwater train), and various control and safety systems. Transient analysis is performed as part of the plant safety analysis to insure the adequacy of the reactor design and operating procedures and to verify the applicable plant emergency guidelines. Event sequences which must be examined are developed by considering possible failures or maloperations of plant components. These vary in severity (and calculational difficulty) from a series of normal operational transients, such as minor load changes, reactor trips, valve and pump malfunctions, up to the double-ended guillotine rupture of a primary reactor coolant system pipe known as a Large Break Loss of Coolant Accident (LBLOCA). The focus of this paper is the analysis of all those transients and accidents except loss of coolant accidents
Full closure strategic analysis.
2014-07-01
The full closure strategic analysis was conducted to create a decision process whereby full roadway closures for construction and maintenance activities can be evaluated and approved or denied by CDOT Traffic personnel. The study reviewed current...
Electrical Subsurface Grounding Analysis
International Nuclear Information System (INIS)
J.M. Calle
2000-01-01
The purpose and objective of this analysis is to determine the present grounding requirements of the Exploratory Studies Facility (ESF) subsurface electrical system and to verify that the actual grounding system and devices satisfy the requirements
DEFF Research Database (Denmark)
Skrypnyuk, Nataliya; Nielson, Flemming; Pilegaard, Henrik
2009-01-01
We present the ongoing work on the pathway analysis of a stochastic calculus. Firstly we present a particular stochastic calculus that we have chosen for our modeling - the Interactive Markov Chains calculus, IMC for short. After that we specify a few restrictions that we have introduced into the syntax of IMC in order to make our analysis feasible. Finally we describe the analysis itself together with several theoretical results that we have proved for it.
Canonical Information Analysis
DEFF Research Database (Denmark)
Vestergaard, Jacob Schack; Nielsen, Allan Aasbjerg
2015-01-01
Canonical correlation analysis is an established multivariate statistical method in which correlation between linear combinations of multivariate sets of variables is maximized. In canonical information analysis introduced here, linear correlation as a measure of association between variables is replaced by the information theoretical, entropy based measure mutual information, which is a much more general measure of association. We make canonical information analysis feasible for large sample problems, including for example multispectral images, due to the use of a fast kernel density estimator for entropy estimation. Canonical information analysis is applied successfully to (1) simple simulated data to illustrate the basic idea and evaluate performance, (2) fusion of weather radar and optical geostationary satellite data in a situation with heavy precipitation, and (3) change detection in optical...
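Classical canonical correlation analysis, the starting point of the method above, can be sketched via an SVD of the whitened cross-covariance matrix; the two-block data below, sharing one latent factor, are synthetic:

```python
import numpy as np

def cca_first_correlation(x, y):
    """First canonical correlation between data sets x (n, p) and y (n, q):
    the largest singular value of the whitened cross-covariance matrix."""
    x = x - x.mean(axis=0)
    y = y - y.mean(axis=0)
    cxx = x.T @ x / (len(x) - 1)
    cyy = y.T @ y / (len(y) - 1)
    cxy = x.T @ y / (len(x) - 1)
    # whiten each block via Cholesky factors, then take singular values
    wx = np.linalg.inv(np.linalg.cholesky(cxx))
    wy = np.linalg.inv(np.linalg.cholesky(cyy))
    return np.linalg.svd(wx @ cxy @ wy.T, compute_uv=False)[0]

rng = np.random.default_rng(1)
z = rng.normal(size=(500, 1))  # shared latent factor
x = np.hstack([z + 0.1 * rng.normal(size=(500, 1)), rng.normal(size=(500, 1))])
y = np.hstack([z + 0.1 * rng.normal(size=(500, 1)), rng.normal(size=(500, 1))])
print(f"{cca_first_correlation(x, y):.2f}")  # near 1: the shared factor is found
```

Canonical information analysis replaces the correlation objective here with mutual information estimated by kernel density methods; this sketch shows only the linear special case.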
Qualitative Data Analysis Strategies
Greaves, Kristoffer
2014-01-01
A set of concept maps for qualitative data analysis strategies, inspired by Corbin, JM & Strauss, AL 2008, Basics of qualitative research: Techniques and procedures for developing grounded theory, 3rd edn, Sage Publications, Inc, Thousand Oaks, California.
Statistical data analysis handbook
National Research Council Canada - National Science Library
Wall, Francis J
1986-01-01
It must be emphasized that this is not a text book on statistics. Instead it is a working tool that presents data analysis in clear, concise terms which can be readily understood even by those without formal training in statistics...
Fanfani, Alessandra; Sanches, Jose Afonso; Andreeva, Julia; Bagliesi, Giusepppe; Bauerdick, Lothar; Belforte, Stefano; Bittencourt Sampaio, Patricia; Bloom, Ken; Blumenfeld, Barry; Bonacorsi, Daniele; Brew, Chris; Calloni, Marco; Cesini, Daniele; Cinquilli, Mattia; Codispoti, Giuseppe; D'Hondt, Jorgen; Dong, Liang; Dongiovanni, Danilo; Donvito, Giacinto; Dykstra, David; Edelmann, Erik; Egeland, Ricky; Elmer, Peter; Eulisse, Giulio; Evans, Dave; Fanzago, Federica; Farina, Fabio; Feichtinger, Derek; Fisk, Ian; Flix, Josep; Grandi, Claudio; Guo, Yuyi; Happonen, Kalle; Hernandez, Jose M; Huang, Chih-Hao; Kang, Kejing; Karavakis, Edward; Kasemann, Matthias; Kavka, Carlos; Khan, Akram; Kim, Bockjoo; Klem, Jukka; Koivumaki, Jesper; Kress, Thomas; Kreuzer, Peter; Kurca, Tibor; Kuznetsov, Valentin; Lacaprara, Stefano; Lassila-Perini, Kati; Letts, James; Linden, Tomas; Lueking, Lee; Maes, Joris; Magini, Nicolo; Maier, Gerhild; McBride, Patricia; Metson, Simon; Miccio, Vincenzo; Padhi, Sanjay; Pi, Haifeng; Riahi, Hassen; Riley, Daniel; Rossman, Paul; Saiz, Pablo; Sartirana, Andrea; Sciaba, Andrea; Sekhri, Vijay; Spiga, Daniele; Tuura, Lassi; Vaandering, Eric; Vanelderen, Lukas; Van Mulders, Petra; Vedaee, Aresh; Villella, Ilaria; Wicklund, Eric; Wildish, Tony; Wissing, Christoph; Wurthwein, Frank
2009-01-01
The CMS experiment expects to manage several petabytes of data each year during the LHC programme, distributing them over many computing sites around the world and enabling data access at those centers for analysis. CMS has identified the distributed sites as the primary location for physics analysis, to support a wide community with thousands of potential users. This represents an unprecedented experimental challenge in terms of the scale of distributed computing resources and the number of users. An overview of the computing architecture, the software tools and the distributed infrastructure is reported. Summaries of the experience in establishing efficient and scalable operations to get prepared for CMS distributed analysis are presented, followed by the user experience in their current analysis activities.
NOAA's Inundation Analysis Tool
National Oceanic and Atmospheric Administration, Department of Commerce — Coastal storms and other meteorological phenomenon can have a significant impact on how high water levels rise and how often. The inundation analysis program is...
Multidimensional nonlinear descriptive analysis
Nishisato, Shizuhiko
2006-01-01
Quantification of categorical, or non-numerical, data is a problem that scientists face across a wide range of disciplines. Exploring data analysis in various areas of research, such as the social sciences and biology, Multidimensional Nonlinear Descriptive Analysis presents methods for analyzing categorical data that are not necessarily sampled randomly from a normal population and often involve nonlinear relations. This reference not only provides an overview of multidimensional nonlinear descriptive analysis (MUNDA) of discrete data, it also offers new results in a variety of fields. The first part of the book covers conceptual and technical preliminaries needed to understand the data analysis in subsequent chapters. The next two parts contain applications of MUNDA to diverse data types, with each chapter devoted to one type of categorical data, a brief historical comment, and basic skills peculiar to the data types. The final part examines several problems and then concludes with suggestions for futu...
Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars
2016-04-12
A building energy analysis system includes a building component library configured to store a plurality of building components, a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and using selected building components of the plurality of building components stored in the building component library, a building analysis engine configured to operate the building model and generate a baseline energy model of the building under analysis and further configured to apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models, and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.
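The baseline-versus-conservation-measure comparison described above can be illustrated with a toy model; the component names, energy figures, and savings fractions below are hypothetical, not from the patent:

```python
from dataclasses import dataclass

@dataclass
class Component:
    name: str
    annual_kwh: float  # contribution to baseline energy use

@dataclass
class BuildingModel:
    components: list
    def baseline_kwh(self):
        return sum(c.annual_kwh for c in self.components)

def apply_ecm(model, name, savings_fraction):
    """Return a new model with one energy-conservation measure applied:
    the named component's annual use is reduced by savings_fraction."""
    new = [Component(c.name, c.annual_kwh * (1 - savings_fraction))
           if c.name == name else c for c in model.components]
    return BuildingModel(new)

def recommend(baseline, candidates):
    """Rank candidate ECM models by energy saved relative to the baseline."""
    base = baseline.baseline_kwh()
    return sorted(((base - m.baseline_kwh(), label) for label, m in candidates),
                  reverse=True)

model = BuildingModel([Component("lighting", 12000), Component("hvac", 30000)])
ranked = recommend(model, [
    ("LED retrofit", apply_ecm(model, "lighting", 0.5)),
    ("HVAC tune-up", apply_ecm(model, "hvac", 0.1)),
])
print(ranked[0][1])  # → LED retrofit
```

A real engine would run a physics-based simulation per model rather than summing fixed per-component figures; the ranking step, however, has the same shape.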
Water Quality Analysis Simulation
The Water Quality Analysis Simulation Program is an enhancement of the original WASP. This model helps users interpret and predict water quality responses to natural phenomena and man-made pollution for various pollution management decisions.
... is being tested? Synovial fluid is a thick liquid that acts as a lubricant for the body's ...
Hytönen, Tuomas; Veraar, Mark; Weis, Lutz
The present volume develops the theory of integration in Banach spaces, martingales and UMD spaces, and culminates in a treatment of the Hilbert transform, Littlewood-Paley theory and the vector-valued Mihlin multiplier theorem. Over the past fifteen years, motivated by regularity problems in evolution equations, there has been tremendous progress in the analysis of Banach space-valued functions and processes. The contents of this extensive and powerful toolbox have been mostly scattered around in research papers and lecture notes. Collecting this diverse body of material into a unified and accessible presentation fills a gap in the existing literature. The principal audience that we have in mind consists of researchers who need and use Analysis in Banach Spaces as a tool for studying problems in partial differential equations, harmonic analysis, and stochastic analysis. Self-contained and offering complete proofs, this work is accessible to graduate students and researchers with a background in functional an...
Analysis Streamlining in ATLAS
Heinrich, Lukas; The ATLAS collaboration
2018-01-01
We present recent work within the ATLAS collaboration to centrally provide tools that facilitate analysis management and highly automated container-based analysis execution, in order both to enable non-experts to benefit from these best practices and to allow the collaboration to track and re-execute analyses independently, e.g. during their review phase. Through integration with the ATLAS GLANCE system, users can request a pre-configured but customizable version control setup, including continuous integration for automated build and testing as well as continuous Linux Container image building for software preservation purposes. As analyses typically require many individual steps, analysis workflow pipelines can then be defined using such images and the yadage workflow description language. The integration into the workflow execution service REANA allows the interactive or automated reproduction of the main analysis results by orchestrating a large number of container jobs using Kubernetes. For long-term archival,...
Wolff, Thomas H; Shubin, Carol
2003-01-01
This book demonstrates how harmonic analysis can provide penetrating insights into deep aspects of modern analysis. It is both an introduction to the subject as a whole and an overview of those branches of harmonic analysis that are relevant to the Kakeya conjecture. The usual background material is covered in the first few chapters: the Fourier transform, convolution, the inversion theorem, the uncertainty principle and the method of stationary phase. However, the choice of topics is highly selective, with emphasis on those frequently used in research inspired by the problems discussed in the later chapters. These include questions related to the restriction conjecture and the Kakeya conjecture, distance sets, and Fourier transforms of singular measures. These problems are diverse, but often interconnected; they all combine sophisticated Fourier analysis with intriguing links to other areas of mathematics and they continue to stimulate first-rate work. The book focuses on laying out a solid foundation for fu...
Water Quality Analysis Simulation
U.S. Environmental Protection Agency — The Water Quality Analysis Simulation Program, an enhancement of the original WASP. This model helps users interpret and predict water quality responses to natural...
Federal Laboratory Consortium — Provides engineering design of aircraft components, subsystems and installations using Pro/E, Anvil 1000, CADKEY 97, AutoCAD 13. Engineering analysis tools include...
CSIR Research Space (South Africa)
Khuluse, S
2009-04-01
Full Text Available ) determination of the distribution of the damage and (iii) preparation of products that enable prediction of future risk events. The methodology provided by extreme value theory can also be a powerful tool in risk analysis...
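A simple calculation of the extreme-value kind mentioned above: a Gumbel distribution fitted to block maxima by the method of moments, with a synthetic sample standing in for damage or rainfall data:

```python
import math, random

def gumbel_fit_moments(maxima):
    """Method-of-moments Gumbel fit to block maxima:
    scale = s * sqrt(6)/pi, loc = mean - Euler-Mascheroni * scale."""
    n = len(maxima)
    mean = sum(maxima) / n
    s = math.sqrt(sum((x - mean) ** 2 for x in maxima) / (n - 1))
    scale = s * math.sqrt(6) / math.pi
    loc = mean - 0.5772156649 * scale
    return loc, scale

def return_level(loc, scale, T):
    """Level exceeded on average once every T blocks (e.g. T-year event)."""
    return loc - scale * math.log(-math.log(1 - 1 / T))

random.seed(0)
# Yearly maxima of 365 daily values, as a stand-in for observed extremes
maxima = [max(random.expovariate(1.0) for _ in range(365)) for _ in range(50)]
loc, scale = gumbel_fit_moments(maxima)
print(f"100-year level ~ {return_level(loc, scale, 100):.1f}")
```

Production risk analyses would prefer maximum-likelihood fits of the full generalized extreme value family; the Gumbel moment fit is the simplest self-contained illustration of predicting future risk events from the damage distribution.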
Ziemer, William P
2017-01-01
This first year graduate text is a comprehensive resource in real analysis based on a modern treatment of measure and integration. Presented in a definitive and self-contained manner, it features a natural progression of concepts from simple to difficult. Several innovative topics are featured, including differentiation of measures, elements of Functional Analysis, the Riesz Representation Theorem, Schwartz distributions, the area formula, Sobolev functions and applications to harmonic functions. Together, the selection of topics forms a sound foundation in real analysis that is particularly suited to students going on to further study in partial differential equations. This second edition of Modern Real Analysis contains many substantial improvements, including the addition of problems for practicing techniques, and an entirely new section devoted to the relationship between Lebesgue and improper integrals. Aimed at graduate students with an understanding of advanced calculus, the text will also appeal to mo...
DEFF Research Database (Denmark)
Fischer, Paul; Hilbert, Astrid
2012-01-01
We introduce a platform which supplies an easy-to-handle, interactive, extendable, and fast analysis tool for time series analysis. In contrast to other software suites like Maple, Matlab, or R, which use a command-line-like interface and where the user has to memorize/look-up the appropriate commands, our application is select-and-click-driven. It allows the user to derive many different sequences of deviations for a given time series and to visualize them in different ways in order to judge their expressive power and to reuse the procedure found. For many transformations or model-fits, the user may choose between manual and automated parameter selection. The user can define new transformations and add them to the system. The application contains efficient implementations of advanced and recent techniques for time series analysis including techniques related to extreme value analysis and filtering...
International Nuclear Information System (INIS)
Holland, W.E.
1980-02-01
A method was developed to determine if boron-loaded polymeric material contained enriched boron or natural boron. A prototype analyzer was constructed, and initial planning was done for an actual analysis facility
Stakeholder Analysis Worksheet
Stakeholder Analysis Worksheet: A worksheet that can be used to document potential stakeholder groups, the information or expertise they hold, the role that they can play, and their interests or concerns about the HIA.
Energy Technology Data Exchange (ETDEWEB)
Arent, D.; Benioff, R.; Mosey, G.; Bird, L.; Brown, J.; Brown, E.; Vimmerstedt, L.; Aabakken, J.; Parks, K.; Lapsa, M.; Davis, S.; Olszewski, M.; Cox, D.; McElhaney, K.; Hadley, S.; Hostick, D.; Nicholls, A.; McDonald, S.; Holloman, B.
2006-10-01
This paper presents the results of energy market analysis sponsored by the Department of Energy's (DOE) Weatherization and International Program (WIP) within the Office of Energy Efficiency and Renewable Energy (EERE). The analysis was conducted by a team of DOE laboratory experts from the National Renewable Energy Laboratory (NREL), Oak Ridge National Laboratory (ORNL), and Pacific Northwest National Laboratory (PNNL), with additional input from Lawrence Berkeley National Laboratory (LBNL). The analysis was structured to identify those markets and niches where government can create the biggest impact by informing management decisions in the private and public sectors. The analysis identifies those markets and niches where opportunities exist for increasing energy efficiency and renewable energy use.
Principles of Fourier analysis
Howell, Kenneth B
2001-01-01
Fourier analysis is one of the most useful and widely employed sets of tools for the engineer, the scientist, and the applied mathematician. As such, students and practitioners in these disciplines need a practical and mathematically solid introduction to its principles. They need straightforward verifications of its results and formulas, and they need clear indications of the limitations of those results and formulas. Principles of Fourier Analysis furnishes all this and more. It provides a comprehensive overview of the mathematical theory of Fourier analysis, including the development of Fourier series, "classical" Fourier transforms, generalized Fourier transforms and analysis, and the discrete theory. Much of the author's development is strikingly different from typical presentations. His approach to defining the classical Fourier transform results in a much cleaner, more coherent theory that leads naturally to a starting point for the generalized theory. He also introduces a new generalized theory based ...
Quantitative investment analysis
DeFusco, Richard
2007-01-01
In the "Second Edition" of "Quantitative Investment Analysis," financial experts Richard DeFusco, Dennis McLeavey, Jerald Pinto, and David Runkle outline the tools and techniques needed to understand and apply quantitative methods to today's investment process.
Energy Technology Data Exchange (ETDEWEB)
Wood, William Monford [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2015-02-23
A systematic study is presented of the standard analysis of rod-pinch radiographs for obtaining quantitative measurements of areal mass densities, together with suggestions for improving the methodology of obtaining quantitative information from radiographed objects.
Biodiesel Emissions Analysis Program
Using existing data, the EPA's biodiesel emissions analysis program sought to quantify the air pollution emission effects of biodiesel for diesel engines that have not been specifically modified to operate on biodiesel.
Introduction to global analysis
Kahn, Donald W
2007-01-01
This text introduces the methods of mathematical analysis as applied to manifolds, including the roles of differentiation and integration, infinite dimensions, Morse theory, Lie groups, and dynamical systems. 1980 edition.
Biorefinery Sustainability Analysis
DEFF Research Database (Denmark)
J. S. M. Silva, Carla; Prunescu, Remus Mihail; Gernaey, Krist
2017-01-01
This chapter deals with sustainability analysis of biorefinery systems in terms of environmental and socio-economic indicators. Life cycle analysis has methodological issues related to the functional unit (FU), allocation, land use, and biogenic carbon neutrality of the reference system and of the biorefinery-based system. Socio-economic criteria and indicators used in sustainability assessment frameworks are presented and discussed. There is not one single methodology that can aptly cover the synergies of environmental, economic, social and governance issues required to assess the sustainable...
Pesticide Instrumental Analysis
International Nuclear Information System (INIS)
Samir, E.; Fonseca, E.; Baldyga, N.; Acosta, A.; Gonzalez, F.; Felicita, F.; Tomasso, M.; Esquivel, D.; Parada, A.; Enriquez, P.; Amilibia, M.
2012-01-01
This workshop evaluated the impact of pesticides on vegetable matrices, with the purpose of determining them by GC/MS analysis. The working materials were a lettuce matrix, chard, a mix of green leaves, and pesticides.
Kansas Data Access and Support Center — The Kansas GAP Analysis Land Cover database depicts 43 land cover classes for the state of Kansas. The database was generated using a two-stage hybrid classification...
Perspectives in shape analysis
Bruckstein, Alfred; Maragos, Petros; Wuhrer, Stefanie
2016-01-01
This book presents recent advances in the field of shape analysis. Written by experts in the fields of continuous-scale shape analysis, discrete shape analysis and sparsity, and numerical computing who hail from different communities, it provides a unique view of the topic from a broad range of perspectives. Over the last decade, it has become increasingly affordable to digitize shape information at high resolution. Yet analyzing and processing this data remains challenging because of the large amount of data involved, and because modern applications such as human-computer interaction require real-time processing. Meeting these challenges requires interdisciplinary approaches that combine concepts from a variety of research areas, including numerical computing, differential geometry, deformable shape modeling, sparse data representation, and machine learning. On the algorithmic side, many shape analysis tasks are modeled using partial differential equations, which can be solved using tools from the field of n...
National Research Council Canada - National Science Library
1998-01-01
.... Establishing proper job procedures is one of the benefits of conducting a job hazard analysis: carefully studying and recording each step of a job, identifying existing or potential job hazards...
Main: Nucleotide Analysis [KOME
Lifescience Database Archive (English)
Full Text Available Nucleotide Analysis Japonica genome blast search result Result of blastn search against japonica genome sequence kome_japonica_genome_blast_search_result.zip kome_japonica_genome_blast_search_result ...
Schuller, Björn W
2013-01-01
This book provides the reader with the knowledge necessary for comprehension of the field of Intelligent Audio Analysis. It first introduces standard methods and discusses the typical Intelligent Audio Analysis chain going from audio data to audio features to audio recognition. Further, introductions to audio source separation, enhancement and robustness are given. After the introductory parts, the book shows several applications for the three types of audio: speech, music, and general sound. Each task is briefly introduced, followed by a description of the specific data and methods applied, experiments and results, and a conclusion for this specific task. The book provides benchmark results and standardized test-beds for a broader range of audio analysis tasks. The main focus thereby lies on the parallel advancement of realism in audio analysis, as too often today’s results are overly optimistic owing to idealized testing conditions, and it serves to stimulate synergies arising from transfer of ...
Scientific stream pollution analysis
National Research Council Canada - National Science Library
Nemerow, Nelson Leonard
1974-01-01
A comprehensive description of the analysis of water pollution that presents a careful balance of the biological,hydrological, chemical and mathematical concepts involved in the evaluation of stream...
International Nuclear Information System (INIS)
Andreeva, J; Maier, G; Spiga, D; Calloni, M; Colling, D; Fanzago, F; D'Hondt, J; Maes, J; Van Mulders, P; Villella, I; Klem, J; Letts, J; Padhi, S; Sarkar, S
2010-01-01
During normal data taking CMS expects to support potentially as many as 2000 analysis users. Since the beginning of 2008 there have been more than 800 individuals who submitted a remote analysis job to the CMS computing infrastructure. The bulk of these users will be supported at the over 40 CMS Tier-2 centres. Supporting a globally distributed community of users on a globally distributed set of computing clusters is a task that requires reconsidering the normal methods of user support for Analysis Operations. In 2008 CMS formed an Analysis Support Task Force in preparation for large-scale physics analysis activities. The charge of the task force was to evaluate the available support tools, the user support techniques, and the direct feedback of users with the goal of improving the success rate and user experience when utilizing the distributed computing environment. The task force determined the tools needed to assess and reduce the number of non-zero exit code applications submitted through the grid interfaces and worked with the CMS experiment dashboard developers to obtain the necessary information to quickly and proactively identify issues with user jobs and data sets hosted at various sites. Results of the analysis group surveys were compiled. Reference platforms for testing and debugging problems were established in various geographic regions. The task force also assessed the resources needed to make the transition to a permanent Analysis Operations task. In this presentation the results of the task force will be discussed as well as the CMS Analysis Operations plans for the start of data taking.
Invitation to classical analysis
Duren, Peter
2012-01-01
This book gives a rigorous treatment of selected topics in classical analysis, with many applications and examples. The exposition is at the undergraduate level, building on basic principles of advanced calculus without appeal to more sophisticated techniques of complex analysis and Lebesgue integration. Among the topics covered are Fourier series and integrals, approximation theory, Stirling's formula, the gamma function, Bernoulli numbers and polynomials, the Riemann zeta function, Tauberian theorems, elliptic integrals, ramifications of the Cantor set, and a theoretical discussion of differ
Analysis of irradiated materials
International Nuclear Information System (INIS)
Bellamy, B.A.
1988-01-01
Papers presented at the UKAEA Conference on Materials Analysis by Physical Techniques (1987) covered a wide range of techniques as applied to the analysis of irradiated materials. These varied from reactor component materials, materials associated with the Authority's radwaste disposal programme, fission products and products associated with the decommissioning of nuclear reactors. An invited paper giving a very comprehensive review of Laser Ablation Microprobe Mass Spectroscopy (LAMMS) was included in the programme. (author)
Oktavianto, Digit
2013-01-01
This book is a step-by-step, practical tutorial for analyzing and detecting malware and performing digital investigations. This book features clear and concise guidance in an easily accessible format.Cuckoo Malware Analysis is great for anyone who wants to analyze malware through programming, networking, disassembling, forensics, and virtualization. Whether you are new to malware analysis or have some experience, this book will help you get started with Cuckoo Sandbox so you can start analysing malware effectively and efficiently.
International Nuclear Information System (INIS)
Niehaus, F.
1988-01-01
In this paper, the risks of various energy systems are discussed, considering severe accident analysis, particularly probabilistic safety analysis and probabilistic safety criteria, and the applications of these criteria and analyses. Comparative risk analysis has demonstrated that the largest source of risk in every society is daily small accidents. Nevertheless, we have to be more concerned about severe accidents. The comparative risk analysis of five different energy systems (coal, oil, gas, LWR and STEC (solar)) for the public has shown that the main sources of risk are coal and oil. The latest comparative risk study of various energy systems, conducted in the USA, has revealed that the number of victims from coal is 42 times as many as the victims from nuclear. A study of severe accidents from hydro-dams in the United States has estimated the probability of dam failure at 1 in 10,000 years and the number of victims between 11,000 and 260,000. The average occupational risk from coal is one fatal accident per 1,000 workers/year. Probabilistic safety analysis is a method that can be used to assess nuclear energy risks, to analyze severe accidents, and to model all possible accident sequences and consequences. Fault tree analysis is used to determine the probability of failure of the different systems at each point of the accident sequences and to calculate the probability of risks. After calculating the probability of failure, criteria for judging the numerical results have to be developed, that is, quantitative and qualitative goals. To achieve these goals, several schemes have been devised by various member countries of the IAEA. The probabilistic safety analysis method has been developed by establishing a computer program permitting access to different categories of safety-related information. 19 tabs. (author)
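The fault-tree calculation sketched in the abstract above combines basic-event failure probabilities through AND and OR gates. A minimal illustration, assuming independent events and purely hypothetical probabilities (none taken from the paper):

```python
# Minimal fault-tree sketch: an OR gate fails if any input fails;
# an AND gate fails only if all inputs fail (independent events assumed).

def p_or(*probs):
    """P(OR gate fails) = 1 - product of survival probabilities."""
    survive = 1.0
    for p in probs:
        survive *= (1.0 - p)
    return 1.0 - survive

def p_and(*probs):
    """P(AND gate fails) = product of failure probabilities."""
    fail = 1.0
    for p in probs:
        fail *= p
    return fail

# Hypothetical top event: pump failure AND (valve failure OR power failure).
p_top = p_and(1e-3, p_or(1e-2, 5e-3))
print(f"{p_top:.3e}")
```

With these illustrative numbers the top-event probability is about 1.5e-05 per demand; real analyses also handle dependent failures and common-cause effects, which this sketch ignores.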
International Nuclear Information System (INIS)
Arien, B.
1998-01-01
The objective of SCK-CEN's programme on reactor safety is to develop expertise in probabilistic and deterministic reactor safety analysis. The research programme consists of four main activities: the development of software for reliability analysis of large systems and participation in the international PHEBUS-FP programme on severe accidents; the development of an expert system to aid diagnosis; and the development and application of a probabilistic reactor dynamics method. Main achievements in 1999 are reported
Brady, Sir Michael; Highnam, Ralph; Irving, Benjamin; Schnabel, Julia A
2016-10-01
Cancer is one of the world's major healthcare challenges and, as such, an important application of medical image analysis. After a brief introduction to cancer, we summarise some of the major developments in oncological image analysis over the past 20 years, concentrating on those in the authors' laboratories, and then outline opportunities and challenges for the next decade. Copyright © 2016 Elsevier B.V. All rights reserved.
International Nuclear Information System (INIS)
Tibari, Elghali; Taous, Fouad; Marah, Hamid
2014-01-01
This report presents results of the stable isotope analyses carried out at the CNESTEN DASTE in Rabat (Morocco) on behalf of Senegal. These analyses cover 127 samples. Oxygen-18 and deuterium in water were analysed by infrared laser spectroscopy using an LGR / DLT-100 with autosampler. The results are expressed as δ values (‰) relative to V-SMOW, to ±0.3‰ for oxygen-18 and ±1‰ for deuterium.
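The δ notation used above expresses an isotope ratio relative to a reference standard, in per mil. A minimal sketch; the sample ratio below is invented for illustration, and the V-SMOW reference value is the commonly cited 18O/16O ratio, not a number from the report:

```python
# Delta notation for stable-isotope results: delta = (R_sample/R_standard - 1) * 1000,
# where R is the isotope ratio (here 18O/16O) and the standard is V-SMOW.

R_VSMOW_18O = 2005.20e-6  # commonly cited 18O/16O ratio of V-SMOW (assumption)

def delta_permil(r_sample, r_standard):
    """Delta value in per mil (permil) relative to the standard."""
    return (r_sample / r_standard - 1.0) * 1000.0

# A sample depleted in 18O relative to V-SMOW yields a negative delta:
print(round(delta_permil(1995.0e-6, R_VSMOW_18O), 2))
```

A measurement uncertainty of ±0.3‰, as quoted for oxygen-18, corresponds to resolving ratio differences of a few parts in 10⁷.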
Oden, J Tinsley
2010-01-01
The textbook is designed to drive a crash course for beginning graduate students majoring in something besides mathematics, introducing mathematical foundations that lead to classical results in functional analysis. More specifically, Oden and Demkowicz want to prepare students to learn the variational theory of partial differential equations, distributions, and Sobolev spaces and numerical analysis with an emphasis on finite element methods. The 1996 first edition has been used in a rather intensive two-semester course. -Book News, June 2010
Griffel, DH
2002-01-01
A stimulating introductory text, this volume examines many important applications of functional analysis to mechanics, fluid mechanics, diffusive growth, and approximation. Detailed enough to impart a thorough understanding, the text is also sufficiently straightforward for those unfamiliar with abstract analysis. Its four-part treatment begins with distribution theory and discussions of Green's functions. Essentially independent of the preceding material, the second and third parts deal with Banach spaces, Hilbert space, spectral theory, and variational techniques. The final part outlines the
Ivan Stosic; Drasko Nikolic; Aleksandar Zdravkovic
2012-01-01
The main purpose of this paper is to examine the impact of the current Serbian macro-environment on businesses through the implementation of PEST analysis as a framework for assessing the general or macro environment in which companies operate. The authors argue that the elements in the presented PEST analysis indicate that the current macro-environment is characterized by the dominance of threats and weaknesses, with few opportunities and strengths. Consequently, there is a strong need for faste...
Forensic neutron activation analysis
International Nuclear Information System (INIS)
Kishi, T.
1987-01-01
The progress of forensic neutron activation analysis (FNAA) in Japan is described. FNAA began in 1965 and during the past 20 years many cases have been handled; these include determination of toxic materials, comparison examination of physical evidences (e.g., paints, metal fragments, plastics and inks) and drug sample differentiation. Neutron activation analysis is applied routinely to the scientific criminal investigation as one of multielement analytical techniques. This paper also discusses these routine works. (author) 14 refs
Probabilistic Structural Analysis Program
Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.
2010-01-01
NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and lifting methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.
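The probability-of-failure computation described above can be illustrated with a plain Monte Carlo estimate for a simple limit state g = R − S (strength minus load). This is only a sketch of the general idea with invented distributions; NASA/NESSUS itself uses far more efficient methods such as the advanced mean value and adaptive importance sampling algorithms:

```python
# Illustrative Monte Carlo probability-of-failure estimate for the limit
# state g = R - S; failure occurs when g < 0. Distributions are hypothetical.
import random

random.seed(1)  # deterministic for reproducibility

def prob_failure(n=200_000):
    failures = 0
    for _ in range(n):
        strength = random.gauss(mu=100.0, sigma=10.0)  # resistance R
        load = random.gauss(mu=60.0, sigma=15.0)       # demand S
        if strength - load < 0.0:                      # limit state violated
            failures += 1
    return failures / n

pf = prob_failure()
print(pf)  # roughly 0.013 for these distributions (analytic value ~0.0132)
```

Plain sampling needs on the order of 10/pf samples for a stable estimate, which is why variance-reduction methods like importance sampling matter for the small failure probabilities typical of safety-critical structures.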
Directory of Open Access Journals (Sweden)
Iulian N. BUJOREANU
2011-01-01
Full Text Available Sensitivity analysis represents such a well known and deeply analyzed subject that anyone entering the field feels unable to add anything new. Still, there are so many facets to be taken into consideration. The paper introduces the reader to the various ways sensitivity analysis is implemented and the reasons for which it has to be implemented in most analyses in decision-making processes. Risk analysis is of utmost importance in dealing with resource allocation and is presented at the beginning of the paper as the initial reason to implement sensitivity analysis. Different views and approaches are added during the discussion of sensitivity analysis so that the reader develops as thorough an opinion as possible on the use and utility of sensitivity analysis. Finally, a round-up conclusion brings us to the question of the possibility of generating the future and analyzing it before it unfolds so that, when it happens, it brings less uncertainty.
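The basic mechanics of sensitivity analysis can be shown with a one-at-a-time (OAT) scheme: perturb each input of a model in turn and record the normalized change in the output. The cost model and numbers below are invented purely for illustration:

```python
# One-at-a-time sensitivity sketch: vary each input by +/-10% and report the
# central-difference output change, normalized by the baseline output.

def cost_model(units, unit_cost, overhead):
    """Hypothetical cost model used only to demonstrate the method."""
    return units * unit_cost + overhead

base = dict(units=1000, unit_cost=12.5, overhead=4000)

def oat_sensitivity(model, params, rel_step=0.10):
    baseline = model(**params)
    sens = {}
    for name, value in params.items():
        hi = dict(params, **{name: value * (1 + rel_step)})
        lo = dict(params, **{name: value * (1 - rel_step)})
        sens[name] = (model(**hi) - model(**lo)) / (2 * rel_step * baseline)
    return sens

print(oat_sensitivity(cost_model, base))
```

For this model the variable cost inputs dominate (normalized sensitivity ≈ 0.76 each) over overhead (≈ 0.24), which is the kind of ranking a decision maker would use to focus further analysis. OAT ignores interactions between inputs; variance-based methods address that at higher computational cost.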
International Nuclear Information System (INIS)
Grimanis, A.P.
1985-01-01
A review of research and development on NAA as well as examples of applications of this method are presented, taken from work carried out over the last 21 years at the Radioanalytical Laboratory of the Department of Chemistry in the Greek Nuclear Research Center "Demokritos". Improved and faster radiochemical NAA methods have been developed for the determination of Au, Ni, Cl, As, Cu, U, Cr, Eu, Hg and Mo in several materials, and for the simultaneous determination of Br and I; Mg, Sr and Ni; As and Cu; As, Sb and Hg; Mn, Sr and Ba; Cd and Zn; Se and As; and Mo and Cr in biological materials. Instrumental NAA methods have also been developed for the determination of Ag, Cl and Na in lake waters, Al, Ca, Mg and V in wines, 7 trace elements in biological materials, 17 trace elements in sediments and 20 minor and trace elements in ceramics. A comprehensive computer program for routine activation analysis using Ge(Li) detectors has been worked out. A rather extensive charged-particle activation analysis programme has been carried out over the last 10 years, including particle-induced X-ray emission (PIXE) analysis, particle-induced prompt gamma-ray emission (PIGE) analysis, other nuclear reactions and proton activation analysis. A special neutron activation method, the delayed fission neutron counting method, is used for the analysis of fissionable elements, such as U, Th and Pu, in samples from the whole nuclear fuel cycle, including geological, enriched and nuclear safeguards samples
Integrated genetic analysis microsystems
International Nuclear Information System (INIS)
Lagally, Eric T; Mathies, Richard A
2004-01-01
With the completion of the Human Genome Project and the ongoing DNA sequencing of the genomes of other animals, bacteria, plants and others, a wealth of new information about the genetic composition of organisms has become available. However, as the demand for sequence information grows, so does the workload required both to generate this sequence and to use it for targeted genetic analysis. Microfabricated genetic analysis systems are well poised to assist in the collection and use of these data through increased analysis speed, lower analysis cost and higher parallelism leading to increased assay throughput. In addition, such integrated microsystems may point the way to targeted genetic experiments on single cells and in other areas that are otherwise very difficult. Concomitant with these advantages, such systems, when fully integrated, should be capable of forming portable systems for high-speed in situ analyses, enabling a new standard in disciplines such as clinical chemistry, forensics, biowarfare detection and epidemiology. This review will discuss the various technologies available for genetic analysis on the microscale, and efforts to integrate them to form fully functional robust analysis devices. (topical review)
INTERNAL ENVIRONMENT ANALYSIS TECHNIQUES
Directory of Open Access Journals (Sweden)
Caescu Stefan Claudiu
2011-12-01
Full Text Available Theme The situation analysis, as a separate component of strategic planning, involves collecting and analysing relevant types of information on the components of the marketing environment and their evolution on the one hand, and on the organization’s resources and capabilities on the other. Objectives of the Research The main purpose of the study of the analysis techniques of the internal environment is to provide insight on those aspects that are of strategic importance to the organization. Literature Review The marketing environment consists of two distinct components: the internal environment, made up of specific variables within the organization, and the external environment, made up of variables external to the organization. Although analysing the external environment is essential for corporate success, it is not enough unless it is backed by a detailed analysis of the internal environment of the organization. The internal environment includes all elements that are endogenous to the organization, which are influenced to a great extent and totally controlled by it. The study of the internal environment must answer all resource-related questions, solve all resource management issues and represents the first step in drawing up the marketing strategy. Research Methodology The present paper accomplished a documentary study of the main techniques used for the analysis of the internal environment. Results The special literature emphasizes that the differences in performance from one organization to another are primarily dependent not on the differences between the fields of activity, but especially on the differences between the resources and capabilities and the ways these are capitalized on. The main methods of analysing the internal environment addressed in this paper are: the analysis of the organizational resources, the performance analysis, the value chain analysis and the functional analysis. Implications Basically such
Professionalizing Intelligence Analysis
Directory of Open Access Journals (Sweden)
James B. Bruce
2015-09-01
Full Text Available This article examines the current state of professionalism in national security intelligence analysis in the U.S. Government. Since the introduction of major intelligence reforms directed by the Intelligence Reform and Terrorism Prevention Act (IRTPA in December, 2004, we have seen notable strides in many aspects of intelligence professionalization, including in analysis. But progress is halting, uneven, and by no means permanent. To consolidate its gains, and if it is to continue improving, the U.S. intelligence community (IC should commit itself to accomplishing a new program of further professionalization of analysis to ensure that it will develop an analytic cadre that is fully prepared to deal with the complexities of an emerging multipolar and highly dynamic world that the IC itself is forecasting. Some recent reforms in intelligence analysis can be assessed against established standards of more fully developed professions; these may well fall short of moving the IC closer to the more fully professionalized analytical capability required for producing the kind of analysis needed now by the United States.
Harmonic and geometric analysis
Citti, Giovanna; Pérez, Carlos; Sarti, Alessandro; Zhong, Xiao
2015-01-01
This book presents an expanded version of four series of lectures delivered by the authors at the CRM. Harmonic analysis, understood in a broad sense, has a very wide interplay with partial differential equations and in particular with the theory of quasiconformal mappings and its applications. Some areas in which real analysis has been extremely influential are PDE's and geometric analysis. Their foundations and subsequent developments made extensive use of the Calderón–Zygmund theory, especially the Lp inequalities for Calderón–Zygmund operators (Beurling transform and Riesz transform, among others) and the theory of Muckenhoupt weights. The first chapter is an application of harmonic analysis and the Heisenberg group to understanding human vision, while the second and third chapters cover some of the main topics on linear and multilinear harmonic analysis. The last serves as a comprehensive introduction to a deep result from De Giorgi, Moser and Nash on the regularity of elliptic partial differen...
Hansson, Sven Ove; Aven, Terje
2014-07-01
This article discusses to what extent risk analysis is scientific in view of a set of commonly used definitions and criteria. We consider scientific knowledge to be characterized by its subject matter, its success in developing the best available knowledge in its fields of study, and the epistemic norms and values that guide scientific investigations. We proceed to assess the field of risk analysis according to these criteria. For this purpose, we use a model for risk analysis in which science is used as a base for decision making on risks, which covers the five elements evidence, knowledge base, broad risk evaluation, managerial review and judgment, and the decision; and that relates these elements to the domains experts and decisionmakers, and to the domains fact-based or value-based. We conclude that risk analysis is a scientific field of study, when understood as consisting primarily of (i) knowledge about risk-related phenomena, processes, events, etc., and (ii) concepts, theories, frameworks, approaches, principles, methods and models to understand, assess, characterize, communicate, and manage risk, in general and for specific applications (the instrumental part). © 2014 Society for Risk Analysis.
Zhou, Qing; Son, Kyungjin; Liu, Ying; Revzin, Alexander
2015-01-01
Biosensors first appeared several decades ago to address the need for monitoring physiological parameters such as oxygen or glucose in biological fluids such as blood. More recently, a new wave of biosensors has emerged in order to provide more nuanced and granular information about the composition and function of living cells. Such biosensors exist at the confluence of technology and medicine and often strive to connect cell phenotype or function to physiological or pathophysiological processes. Our review aims to describe some of the key technological aspects of biosensors being developed for cell analysis. The technological aspects covered in our review include biorecognition elements used for biosensor construction, methods for integrating cells with biosensors, approaches to single-cell analysis, and the use of nanostructured biosensors for cell analysis. Our hope is that the spectrum of possibilities for cell analysis described in this review may pique the interest of biomedical scientists and engineers and may spur new collaborations in the area of using biosensors for cell analysis.
Clinical reasoning: concept analysis.
Simmons, Barbara
2010-05-01
This paper is a report of a concept analysis of clinical reasoning in nursing. Clinical reasoning is an ambiguous term that is often used synonymously with decision-making and clinical judgment. Clinical reasoning has not been clearly defined in the literature. Healthcare settings are increasingly filled with uncertainty, risk and complexity due to increased patient acuity, multiple comorbidities, and enhanced use of technology, all of which require clinical reasoning. Data sources. Literature for this concept analysis was retrieved from several databases, including CINAHL, PubMed, PsycINFO, ERIC and OvidMEDLINE, for the years 1980 to 2008. Rodgers's evolutionary method of concept analysis was used because of its applicability to concepts that are still evolving. Multiple terms have been used synonymously to describe the thinking skills that nurses use. Research in the past 20 years has elucidated differences among these terms and identified the cognitive processes that precede judgment and decision-making. Our concept analysis defines one of these terms, 'clinical reasoning,' as a complex process that uses cognition, metacognition, and discipline-specific knowledge to gather and analyse patient information, evaluate its significance, and weigh alternative actions. This concept analysis provides a middle-range descriptive theory of clinical reasoning in nursing that helps clarify meaning and gives direction for future research. Appropriate instruments to operationalize the concept need to be developed. Research is needed to identify additional variables that have an impact on clinical reasoning and what are the consequences of clinical reasoning in specific situations.
International Nuclear Information System (INIS)
Sitek, J.; Degmova, J.; Dekan, J.
2011-01-01
The Košice meteorite fell on 28 February 2010 near Košice and represents a unique find, because the last meteorite fall observed in Slovakia was in 1895. It is supposed that for this kind of meteorite the orbit in cosmic space can be calculated. This is one of the most important aspects, because until now only 13 meteorite finds are known in the world for which the cosmic orbit has been calculated. Slovakia is a member of the international bolide network dealing with meteorite analysis in Central Europe. Analysis of the Košice meteorite will also concern the long-lived and short-lived nuclides. The results should contribute to the determination of radiation and formation ages. Structural analysis of the meteorite will make it possible to compare it with similar types of meteorites. In this work, Moessbauer spectroscopy is used for phase analysis of the iron-containing components, with the aim of identifying the magnetic and non-magnetic fractions. From the analysis of the magnetic part we find that the first sextet, with hyperfine magnetic field 33.5 T, corresponds to the bcc Fe-Ni alloy (kamacite), and the second, with field 31.5 T, to FeS (troilite). Meteorites with the mentioned composition belong to the mineral group of chondrites. Comparing our parameters with results of measurements on similar meteorites, we can conclude that the Košice meteorite contains the same components. According to all Moessbauer parameters we can also include this meteorite in the mineral group of chondrites. (authors)
Foundations of VISAR analysis.
Energy Technology Data Exchange (ETDEWEB)
Dolan, Daniel H.
2006-06-01
The Velocity Interferometer System for Any Reflector (VISAR) is a widely used diagnostic at Sandia National Laboratories. Although the operating principles of the VISAR are well established, recently deployed systems (such as the fast push-pull and air delay VISAR) require more careful consideration, and many common assumptions about VISAR are coming into question. This report presents a comprehensive review of VISAR analysis to address these issues. Detailed treatment of several interferometer configurations is given to identify important aspects of the operation and characterization of VISAR systems. The calculation of velocity from interferometer measurements is also described. The goal is to derive the standard VISAR analysis relationships, indicate when these relationships are valid, and provide alternative methods when the standard analysis fails.
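The standard VISAR analysis relationship maps the measured fringe count to surface velocity through a velocity-per-fringe (VPF) constant set by the interferometer delay. A minimal sketch for the simplest (non-dispersive, normal-incidence) case; the VPF value and fringe history below are illustrative, not taken from the report:

```python
# Simplest VISAR relation: v(t) = VPF * F(t), where F(t) is the fringe count
# and VPF (velocity per fringe) is fixed by the interferometer delay leg.
# VPF value is illustrative only.

VPF = 129.5  # m/s per fringe (hypothetical instrument setting)

def velocity(fringes):
    """Velocity history (m/s) from a fringe-count history."""
    return [VPF * f for f in fringes]

print(velocity([0.0, 0.5, 2.0]))  # [0.0, 64.75, 259.0]
```

The report's point is precisely that this simple linear relation is not always valid; window dispersion, fast push-pull configurations and air delay legs modify the relationship, which is why the full derivation matters.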
Pugh, Charles C
2015-01-01
Based on an honors course taught by the author at UC Berkeley, this introduction to undergraduate real analysis gives a different emphasis by stressing the importance of pictures and hard problems. Topics include: a natural construction of the real numbers, four-dimensional visualization, basic point-set topology, function spaces, multivariable calculus via differential forms (leading to a simple proof of the Brouwer Fixed Point Theorem), and a pictorial treatment of Lebesgue theory. Over 150 detailed illustrations elucidate abstract concepts and salient points in proofs. The exposition is informal and relaxed, with many helpful asides, examples, some jokes, and occasional comments from mathematicians, such as Littlewood, Dieudonné, and Osserman. This book thus succeeds in being more comprehensive, more comprehensible, and more enjoyable, than standard introductions to analysis. New to the second edition of Real Mathematical Analysis is a presentation of Lebesgue integration done almost entirely using the un...
Digital Fourier analysis fundamentals
Kido, Ken'iti
2015-01-01
This textbook is a thorough, accessible introduction to digital Fourier analysis for undergraduate students in the sciences. Beginning with the principles of sine/cosine decomposition, the reader walks through the principles of discrete Fourier analysis before reaching the cornerstone of signal processing: the Fast Fourier Transform. Saturated with clear, coherent illustrations, "Digital Fourier Analysis - Fundamentals" includes practice problems and thorough Appendices for the advanced reader. As a special feature, the book includes interactive applets (available online) that mirror the illustrations. These user-friendly applets animate concepts interactively, allowing the user to experiment with the underlying mathematics. For example, a real sine signal can be treated as a sum of clockwise and counter-clockwise rotating vectors. The applet illustration included with the book animates the rotating vectors and the resulting sine signal. By changing parameters such as amplitude and frequency, the reader ca...
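The rotating-vector picture described above, in which a real sine signal is the sum of a clockwise and a counter-clockwise rotating complex vector, is the identity sin(x) = (e^{ix} − e^{−ix}) / (2i), and is easy to check numerically:

```python
# A real sine as the sum of two counter-rotating complex vectors:
# sin(x) = (e^{ix} - e^{-ix}) / (2i).
import cmath
import math

def sine_from_vectors(x):
    ccw = cmath.exp(1j * x)   # counter-clockwise rotating vector
    cw = cmath.exp(-1j * x)   # clockwise rotating vector
    return ((ccw - cw) / 2j).real

for x in [0.0, 0.7, 2.0]:
    assert abs(sine_from_vectors(x) - math.sin(x)) < 1e-12
print("ok")
```

This decomposition is why the discrete Fourier transform of a real sine shows a pair of conjugate peaks at positive and negative frequency, each carrying half the amplitude.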
Frank, IE
1994-01-01
Analyzing observed or measured data is an important step in applied sciences. The recent increase in computer capacity has resulted in a revolution both in data collection and data analysis. An increasing number of scientists, researchers and students are venturing into statistical data analysis; hence the need for more guidance in this field, which was previously dominated mainly by statisticians. This handbook fills the gap in the range of textbooks on data analysis. Written in a dictionary format, it will serve as a comprehensive reference book in a rapidly growing field. However, this book is more structured than an ordinary dictionary, where each entry is a separate, self-contained entity. The authors provide not only definitions and short descriptions, but also offer an overview of the different topics. Therefore, the handbook can also be used as a companion to textbooks for undergraduate or graduate courses. 1700 entries are given in alphabetical order grouped into 20 topics and each topic is organized...
International Nuclear Information System (INIS)
Arien, B.
1998-01-01
Risk assessments of nuclear installations require accurate safety and reliability analyses to estimate the consequences of accidental events and their probability of occurrence. The objective of the work performed in this field at the Belgian Nuclear Research Centre SCK-CEN is to develop expertise in probabilistic and deterministic reactor safety analysis. The four main activities of the research project on reactor safety analysis are: (1) the development of software for the reliable analysis of large systems; (2) the development of an expert system for the aid to diagnosis; (3) the development and the application of a probabilistic reactor-dynamics method, and (4) to participate in the international PHEBUS-FP programme for severe accidents. Progress in research during 1997 is described
Energy Technology Data Exchange (ETDEWEB)
Haurykiewicz, John Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Dinehart, Timothy Grant [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Parker, Robert Young [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2016-05-12
The purpose of this process analysis was to analyze the Badge Offices’ current processes from a systems perspective and consider ways of pursuing objectives set forth by SEC-PS, namely increased customer flow (throughput) and reduced customer wait times. Information for the analysis was gathered for the project primarily through Badge Office Subject Matter Experts (SMEs), and in-person observation of prevailing processes. Using the information gathered, a process simulation model was constructed to represent current operations and allow assessment of potential process changes relative to factors mentioned previously. The overall purpose of the analysis was to provide SEC-PS management with information and recommendations to serve as a basis for additional focused study and areas for potential process improvements in the future.
International Nuclear Information System (INIS)
Deville, J.P.
1998-01-01
Nowadays there are many surface analysis methods, each having its specificity, its qualities, its constraints (for instance vacuum) and its limits. Expensive in time and in investment, these methods have to be used deliberately. This article is addressed to non-specialists. It gives some elements of choice according to the information sought, the sensitivity, the constraints of use or the answer to a precise question. After recalling the fundamental principles which govern these analysis methods, based on the interaction of radiation (ultraviolet, X-rays) or particles (ions, electrons) with matter, two methods are described in more detail: Auger electron spectroscopy (AES) and X-ray photoemission spectroscopy (ESCA or XPS). Indeed, they are the most widespread methods in laboratories, the easiest to use and probably the most productive for the analysis of surfaces of industrial materials or samples submitted to treatments in aggressive media. (O.M.)
Power electronics reliability analysis.
Energy Technology Data Exchange (ETDEWEB)
Smith, Mark A.; Atcitty, Stanley
2009-12-01
This report provides the DOE and industry with a general process for analyzing power electronics reliability. The analysis can help with understanding the main causes of failures, downtime, and cost and how to reduce them. One approach is to collect field maintenance data and use it directly to calculate reliability metrics related to each cause. Another approach is to model the functional structure of the equipment using a fault tree to derive system reliability from component reliability. Analysis of a fictitious device demonstrates the latter process. Optimization can use the resulting baseline model to decide how to improve reliability and/or lower costs. It is recommended that both electric utilities and equipment manufacturers make provisions to collect and share data in order to lay the groundwork for improving reliability into the future. Reliability analysis helps guide reliability improvements in hardware and software technology including condition monitoring and prognostics and health management.
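The fault-tree step described above (deriving system reliability from component reliability) can be illustrated with a toy calculation; the component failure probabilities and gate structure below are invented for the sketch and are not taken from the report:

```python
def and_gate(failure_probs):
    """AND gate: the top event occurs only if every input fails
    (independent components), so the probabilities multiply."""
    p = 1.0
    for q in failure_probs:
        p *= q
    return p

def or_gate(failure_probs):
    """OR gate: the top event occurs if any input fails;
    computed as the complement of 'all inputs survive'."""
    p_all_survive = 1.0
    for q in failure_probs:
        p_all_survive *= (1.0 - q)
    return 1.0 - p_all_survive

# Hypothetical converter: a redundant switch pair (AND gate) combined
# with a single controller failure path through an OR gate.
switch_pair = and_gate([0.02, 0.02])
system = or_gate([switch_pair, 0.001])
assert abs(switch_pair - 4e-4) < 1e-12
assert abs(system - (1 - (1 - 4e-4) * (1 - 1e-3))) < 1e-12
```

The baseline model in the report works the same way, just with a deeper tree and failure rates taken from field data.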
International Nuclear Information System (INIS)
Gregg, H.R.; Meltzer, M.P.
1996-01-01
The portable Contamination Analysis Unit (CAU) measures trace quantities of surface contamination in real time. The detector head of the portable contamination analysis unit has an opening with an O-ring seal, one or more vacuum valves and a small mass spectrometer. With the valve closed, the mass spectrometer is evacuated with one or more pumps. The O-ring seal is placed against a surface to be tested and the vacuum valve is opened. Data is collected from the mass spectrometer and a portable computer provides contamination analysis. The CAU can be used to decontaminate and decommission hazardous and radioactive surfaces by measuring residual hazardous surface contamination, such as tritium and trace organics. It provides surface contamination data for research and development applications as well as real-time process control feedback for industrial cleaning operations and can be used to determine the readiness of a surface to accept bonding or coatings. 1 fig
Jorgensen, Palle
2017-01-01
The book features new directions in analysis, with an emphasis on Hilbert space, mathematical physics, and stochastic processes. We interpret 'non-commutative analysis' broadly to include representations of non-Abelian groups and non-Abelian algebras, with emphasis on Lie groups and operator algebras (C* algebras and von Neumann algebras). A second theme is commutative and non-commutative harmonic analysis, spectral theory, operator theory and their applications. The list of topics includes shift-invariant spaces, group action in differential geometry, and frame theory (over-complete bases) and their applications to engineering (signal processing and multiplexing), projective multi-resolutions, and free probability algebras. The book serves as an accessible introduction, offering a timeless presentation, attractive and accessible to students, both in mathematics and in neighboring fields.
DEFF Research Database (Denmark)
Frigaard, Peter; Andersen, Thomas Lykke
The present book describes the most important aspects of wave analysis techniques applied to physical model tests. Moreover, the book serves as technical documentation for the wave analysis software WaveLab 3, cf. Aalborg University (2012). In that respect it should be mentioned that, supplementary to the present technical documentation, there exists also an online help document describing the WaveLab software in detail, including all the input and output fields. In addition to the two main authors, Tue Hald, Jacob Helm-Petersen and Morten Møller Jakobsen have also contributed to the note. Their input is highly acknowledged. The outline of the book is as follows: • Chapters 2 and 3 describe analysis of waves in the time and frequency domain. • Chapters 4 and 5 describe the separation of incident and reflected waves for the two-dimensional case. • Chapter 6 describes the estimation of the directional spectra which also...
Canuto, Claudio
2015-01-01
The purpose of the volume is to provide a support textbook for a second lecture course on Mathematical Analysis. The contents are organised to suit, in particular, students of Engineering, Computer Science and Physics, all areas in which mathematical tools play a crucial role. The basic notions and methods concerning integral and differential calculus for multivariable functions, series of functions and ordinary differential equations are presented in a manner that elicits critical reading and prompts a hands-on approach to concrete applications. The pedagogical layout echoes the one used in the companion text Mathematical Analysis I. The book’s structure has a specifically-designed modular nature, which allows for great flexibility in the preparation of a lecture course on Mathematical Analysis. The style privileges clarity in the exposition and a linear progression through the theory. The material is organised on two levels. The first, reflected in this book, allows students to grasp the essential ideas, ...
Bandemer, Hans
1992-01-01
Fuzzy data such as marks, scores, verbal evaluations, imprecise observations, experts' opinions and grey tone pictures, are quite common. In Fuzzy Data Analysis the authors collect their recent results providing the reader with ideas, approaches and methods for processing such data when looking for sub-structures in knowledge bases for an evaluation of functional relationship, e.g. in order to specify diagnostic or control systems. The modelling presented uses ideas from fuzzy set theory and the suggested methods solve problems usually tackled by data analysis if the data are real numbers. Fuzzy Data Analysis is self-contained and is addressed to mathematicians oriented towards applications and to practitioners in any field of application who have some background in mathematics and statistics.
Zorich, Vladimir A
2015-01-01
VLADIMIR A. ZORICH is professor of mathematics at Moscow State University. His areas of specialization are analysis, conformal geometry, quasiconformal mappings, and mathematical aspects of thermodynamics. He solved the problem of global homeomorphism for space quasiconformal mappings. He holds a patent in the technology of mechanical engineering, and he is also known for his book Mathematical Analysis of Problems in the Natural Sciences. This second English edition of a very popular two-volume work presents a thorough first course in analysis, leading from real numbers to such advanced topics as differential forms on manifolds; asymptotic methods; Fourier, Laplace, and Legendre transforms; elliptic functions; and distributions. Especially notable in this course are the clearly expressed orientation toward the natural sciences and the informal exploration of the essence and the roots of the basic concepts and theorems of calculus. Clarity of exposition is matched by a wealth of instructive exercises, problems...
DEFF Research Database (Denmark)
Christensen, Ole; Feichtinger, Hans G.; Paukner, Stephan
2015-01-01
Gabor analysis characterizes a function by its transform over phase space, which is the time–frequency plane (TF-plane) in a musical context or the location–wave-number domain in the context of image processing. Since the transition from the signal domain to the phase-space domain introduces an enormous amount of data... The chapter covers the generalities relevant for an understanding of Gabor analysis of functions on Rd. We pay special attention to the case d = 2, which is the most important case for image processing and image analysis applications. The chapter is organized as follows. Section 2 presents central tools from functional analysis..., and the application of Gabor expansions to image representation is considered in Sect. 6.
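The phase-space transform the chapter builds on can be sketched as a windowed inner product. The toy discrete Gabor coefficient below is my illustration, not the chapter's notation; it shows how a Gaussian window localizes frequency content around a chosen time index:

```python
import cmath
import math

def gabor_coefficient(x, center, freq, sigma=6.0):
    """One point of a discrete Gabor (windowed Fourier) transform:
    inner product of the signal with a Gaussian-windowed complex
    exponential centered at `center`, analysed at frequency `freq`
    (in cycles per sample)."""
    return sum(
        xi
        * math.exp(-((i - center) ** 2) / (2.0 * sigma ** 2))  # Gaussian window
        * cmath.exp(-2j * math.pi * freq * i)                   # analysis frequency
        for i, xi in enumerate(x)
    )

# A pure 0.2 cycles/sample cosine: the coefficient magnitude is large
# at the matching frequency and small away from it.
x = [math.cos(2 * math.pi * 0.2 * i) for i in range(128)]
assert abs(gabor_coefficient(x, 64, 0.2)) > 5 * abs(gabor_coefficient(x, 64, 0.4))
```

Evaluating this coefficient on a grid of (center, freq) pairs yields the time–frequency representation the abstract refers to; the data-volume issue it mentions is exactly the redundancy of that grid.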
Sohrab, Houshang H
2014-01-01
This expanded second edition presents the fundamentals and touchstone results of real analysis in full rigor, but in a style that requires little prior familiarity with proofs or mathematical language. The text is a comprehensive and largely self-contained introduction to the theory of real-valued functions of a real variable. The chapters on Lebesgue measure and integral have been rewritten entirely and greatly improved. They now contain Lebesgue’s differentiation theorem as well as his versions of the Fundamental Theorem(s) of Calculus. With expanded chapters, additional problems, and an expansive solutions manual, Basic Real Analysis, Second Edition, is ideal for senior undergraduates and first-year graduate students, both as a classroom text and a self-study guide. Reviews of first edition: The book is a clear and well-structured introduction to real analysis aimed at senior undergraduate and beginning graduate students. The prerequisites are few, but a certain mathematical sophistication is required. ....
Software safety hazard analysis
International Nuclear Information System (INIS)
Lawrence, J.D.
1996-02-01
Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper
Vágner, Petr; Pavelka, Michal; Maršík, František
2017-04-01
The well-known Gouy-Stodola theorem states that a device produces maximum useful power when working reversibly, that is with no entropy production inside the device. This statement then leads to a method of thermodynamic optimization based on entropy production minimization. Exergy destruction (difference between exergy of fuel and exhausts) is also given by entropy production inside the device. Therefore, assessing efficiency of a device by exergy analysis is also based on the Gouy-Stodola theorem. However, assumptions that had led to the Gouy-Stodola theorem are not satisfied in several optimization scenarios, e.g. non-isothermal steady-state fuel cells, where both entropy production minimization and exergy analysis should be used with caution. We demonstrate, using non-equilibrium thermodynamics, a few cases where entropy production minimization and exergy analysis should not be applied.
Directory of Open Access Journals (Sweden)
Sutawanir Darwis
2012-05-01
Full Text Available Empirical decline curve analysis of oil production data gives reasonable answers in hyperbolic type-curve situations; however, the methodology has limitations in fitting real historical production data in the presence of unusual observations due to treatments applied to the well in order to increase production capacity. The development of robust least squares offers new possibilities for better fitting production data using decline curve analysis by down-weighting the unusual observations. This paper proposes a robust least squares fitting (lmRobMM) approach to estimate the decline rate of daily production data and compares the results with reservoir simulation results. As a case study, we use the oil production data at TBA Field, West Java. The results demonstrate that the approach is suitable for decline curve fitting and offers new insight into decline curve analysis in the presence of unusual observations.
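The robust-fitting idea can be sketched with Huber-weighted iteratively reweighted least squares on an exponential decline q(t) = q0·e^(−D·t), linearized as ln q = ln q0 − D·t. The lmRobMM estimator in the paper is more sophisticated, and the data below are synthetic, not the TBA Field data:

```python
import math

def weighted_linefit(t, y, w):
    """Closed-form weighted least squares for y ≈ a + b*t."""
    sw = sum(w)
    st = sum(wi * ti for wi, ti in zip(w, t))
    sy = sum(wi * yi for wi, yi in zip(w, y))
    stt = sum(wi * ti * ti for wi, ti in zip(w, t))
    sty = sum(wi * ti * yi for wi, ti, yi in zip(w, t, y))
    b = (sw * sty - st * sy) / (sw * stt - st * st)
    return (sy - b * st) / sw, b

def huber_irls(t, y, k=1.345, iters=30):
    """Iteratively reweighted least squares with Huber weights:
    large residuals are down-weighted instead of dominating the fit."""
    w = [1.0] * len(t)
    for _ in range(iters):
        a, b = weighted_linefit(t, y, w)
        r = [yi - (a + b * ti) for ti, yi in zip(t, y)]
        # robust scale estimate from the median absolute residual
        s = max(1.4826 * sorted(abs(ri) for ri in r)[len(r) // 2], 1e-9)
        w = [1.0 if abs(ri) <= k * s else k * s / abs(ri) for ri in r]
    return a, b

# Synthetic decline data: q0 = 100, D = 0.05, one treatment spike (outlier).
t = list(range(30))
y = [math.log(100.0) - 0.05 * ti for ti in t]
y[10] += 2.0                       # unusual observation
a, b = huber_irls(t, y)
assert abs(-b - 0.05) < 1e-3       # decline rate recovered despite the outlier
```

An ordinary least squares fit on the same data would be pulled noticeably toward the spike; the down-weighting step is what makes the decline-rate estimate robust.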
Applied multivariate statistical analysis
Härdle, Wolfgang Karl
2015-01-01
Focusing on high-dimensional applications, this 4th edition presents the tools and concepts used in multivariate data analysis in a style that is also accessible for non-mathematicians and practitioners. It surveys the basic principles and emphasizes both exploratory and inferential statistics; a new chapter on Variable Selection (Lasso, SCAD and Elastic Net) has also been added. All chapters include practical exercises that highlight applications in different multivariate data analysis fields: in quantitative financial studies, where the joint dynamics of assets are observed; in medicine, where recorded observations of subjects in different locations form the basis for reliable diagnoses and medication; and in quantitative marketing, where consumers’ preferences are collected in order to construct models of consumer behavior. All of these examples involve high to ultra-high dimensions and represent a number of major fields in big data analysis. The fourth edition of this book on Applied Multivariate ...
Trajectory Based Traffic Analysis
DEFF Research Database (Denmark)
Krogh, Benjamin Bjerre; Andersen, Ove; Lewis-Kelham, Edwin
2013-01-01
We present the INTRA system for interactive path-based traffic analysis. The analyses are developed in collaboration with traffic researchers and provide novel insights into conditions such as congestion, travel-time, choice of route, and traffic-flow. INTRA supports interactive point-and-click analysis, due to a novel and efficient indexing structure. With the web-site daisy.aau.dk/its/spqdemo/ we will demonstrate several analyses, using a very large real-world data set consisting of 1.9 billion GPS records (1.5 million trajectories) recorded from more than 13000 vehicles, and touching most...
International Nuclear Information System (INIS)
Johnstad, H.
1989-06-01
The Physics Analysis Workstation (PAW) is a high-level program providing data presentation and statistical or mathematical analysis. PAW has been developed at CERN as an instrument to assist physicists in the analysis and presentation of their data. The program is interfaced to a high-level graphics package, based on basic underlying graphics. 3-D graphics capabilities are being implemented. The major objects in PAW are 1- or 2-dimensional binned event data with a fixed number of entries per event, vectors, functions, graphics pictures, and macros. Command input is handled by an integrated user interface package, which allows for a variety of choices for input, either with typed commands or in a tree-structured menu-driven mode. 6 refs., 1 fig
Choudary, A D R
2014-01-01
The book targets undergraduate and postgraduate mathematics students and helps them develop a deep understanding of mathematical analysis. Designed as a first course in real analysis, it helps students learn how abstract mathematical analysis solves mathematical problems that relate to the real world. As well as providing a valuable source of inspiration for contemporary research in mathematics, the book helps students read, understand and construct mathematical proofs, develop their problem-solving abilities and comprehend the importance and frontiers of computer facilities and much more. It offers comprehensive material for both seminars and independent study for readers with a basic knowledge of calculus and linear algebra. The first nine chapters followed by the appendix on the Stieltjes integral are recommended for graduate students studying probability and statistics, while the first eight chapters followed by the appendix on dynamical systems will be of use to students of biology and environmental scie...
Banks, David L; Rios Insua, David
2015-01-01
Flexible Models to Analyze Opponent Behavior A relatively new area of research, adversarial risk analysis (ARA) informs decision making when there are intelligent opponents and uncertain outcomes. Adversarial Risk Analysis develops methods for allocating defensive or offensive resources against intelligent adversaries. Many examples throughout illustrate the application of the ARA approach to a variety of games and strategic situations. The book shows decision makers how to build Bayesian models for the strategic calculation of their opponents, enabling decision makers to maximize their expected utility or minimize their expected loss. This new approach to risk analysis asserts that analysts should use Bayesian thinking to describe their beliefs about an opponent's goals, resources, optimism, and type of strategic calculation, such as minimax and level-k thinking. Within that framework, analysts then solve the problem from the perspective of the opponent while placing subjective probability distributions on a...
COMPUTER METHODS OF GENETIC ANALYSIS.
Directory of Open Access Journals (Sweden)
A. L. Osipov
2017-02-01
Full Text Available The paper describes the basic statistical methods used in conducting the genetic analysis of human traits: segregation analysis, linkage analysis, and the analysis of allelic associations. Software supporting the implementation of these methods was developed.
Evens, Nicholas P; Buchner, Peter; Williams, Lorraine E; Hawkesford, Malcolm J
2017-10-01
Understanding the molecular basis of zinc (Zn) uptake and transport in staple cereal crops is critical for improving both Zn content and tolerance to low-Zn soils. This study demonstrates the importance of group F bZIP transcription factors and ZIP transporters in responses to Zn deficiency in wheat (Triticum aestivum). Seven group F TabZIP genes and 14 ZIPs with homeologs were identified in hexaploid wheat. Promoter analysis revealed the presence of Zn-deficiency-response elements (ZDREs) in a number of the ZIPs. Functional complementation of the zrt1/zrt2 yeast mutant by TaZIP3, -6, -7, -9 and -13 supported an ability to transport Zn. Group F TabZIPs contain the group-defining cysteine-histidine-rich motifs, which are the predicted binding site of Zn 2+ in the Zn-deficiency response. Conservation of these motifs varied between the TabZIPs suggesting that individual TabZIPs may have specific roles in the wheat Zn-homeostatic network. Increased expression in response to low Zn levels was observed for several of the wheat ZIPs and bZIPs; this varied temporally and spatially suggesting specific functions in the response mechanism. The ability of the group F TabZIPs to bind to specific ZDREs in the promoters of TaZIPs indicates a conserved mechanism in monocots and dicots in responding to Zn deficiency. In support of this, TabZIPF1-7DL and TabZIPF4-7AL afforded a strong level of rescue to the Arabidopsis hypersensitive bzip19 bzip23 double mutant under Zn deficiency. These results provide a greater understanding of Zn-homeostatic mechanisms in wheat, demonstrating an expanded repertoire of group F bZIP transcription factors, adding to the complexity of Zn homeostasis. © 2017 The Authors The Plant Journal published by John Wiley & Sons Ltd and Society for Experimental Biology.
International Nuclear Information System (INIS)
Strait, R.S.
1996-01-01
The first phase of the Depleted Uranium Hexafluoride Management Program (Program)--management strategy selection--consists of several program elements: Technology Assessment, Engineering Analysis, Cost Analysis, and preparation of an Environmental Impact Statement (EIS). Cost Analysis will estimate the life-cycle costs associated with each of the long-term management strategy alternatives for depleted uranium hexafluoride (UF6). The scope of Cost Analysis will include all major expenditures, from the planning and design stages through decontamination and decommissioning. The costs will be estimated at a scoping or preconceptual design level and are intended to assist decision makers in comparing alternatives for further consideration. They will not be absolute costs or bid-document costs. The purpose of the Cost Analysis Guidelines is to establish a consistent approach to analyzing cost alternatives for managing the Department of Energy's (DOE's) stocks of depleted uranium hexafluoride (DUF6). The component modules that make up the DUF6 management program differ substantially in process options, requirements for R and D, equipment, facilities, regulatory compliance, operations and maintenance (O and M), and operations risk. To facilitate a consistent and equitable comparison of costs, the guidelines offer common definitions, assumptions or bases, and limitations integrated with a standard approach to the analysis. Further, the goal is to evaluate total net life-cycle costs and display them in a way that gives DOE the capability to evaluate a variety of overall DUF6 management strategies, including commercial potential. The cost estimates reflect the preconceptual level of the designs. They will be appropriate for distinguishing among management strategies
Schramm, Michael J
2008-01-01
This text forms a bridge between courses in calculus and real analysis. It focuses on the construction of mathematical proofs as well as their final content. Suitable for upper-level undergraduates and graduate students of real analysis, it also provides a vital reference book for advanced courses in mathematics.The four-part treatment begins with an introduction to basic logical structures and techniques of proof, including discussions of the cardinality concept and the algebraic and order structures of the real and rational number systems. Part Two presents in-depth examinations of the compl
Fourier analysis an introduction
Stein, Elias M
2003-01-01
This first volume, a three-part introduction to the subject, is intended for students with a beginning knowledge of mathematical analysis who are motivated to discover the ideas that shape Fourier analysis. It begins with the simple conviction that Fourier arrived at in the early nineteenth century when studying problems in the physical sciences--that an arbitrary function can be written as an infinite sum of the most basic trigonometric functions.The first part implements this idea in terms of notions of convergence and summability of Fourier series, while highlighting applications such as th
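Fourier's conviction that an arbitrary function can be written as an infinite sum of trigonometric functions can be made concrete with the classic square-wave series; this numerical check is my example, not one from the book:

```python
import math

def square_wave_partial_sum(t, n_terms):
    """Partial sum of the Fourier series of a unit square wave:
    (4/pi) * sum_{k=1}^{n} sin((2k-1)*t) / (2k-1)."""
    return (4.0 / math.pi) * sum(
        math.sin((2 * k - 1) * t) / (2 * k - 1) for k in range(1, n_terms + 1)
    )

# Away from the jump the partial sums converge to the square wave's value 1.
assert abs(square_wave_partial_sum(math.pi / 2, 200) - 1.0) < 0.01
```

Near the discontinuity the partial sums overshoot (the Gibbs phenomenon), which is exactly the kind of convergence and summability question the first part of the book studies.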
DEFF Research Database (Denmark)
Reinau, Kristian Hegner
Traditionally, focus in the transport field, both politically and scientifically, has been on private cars and public transport. Freight transport has been a neglected topic. Recent years have seen an increased focus upon congestion as a core issue across Europe, resulting in a great need for knowledge... speed data for freight. Secondly, the analytical methods used, space-time cubes and emerging hot spot analysis, are also new in the freight transport field. The analysis thus estimates precisely how fast freight moves on the roads in Northern Jutland and how this has evolved over time.
Abrahams, J R; Hiller, N
1965-01-01
Signal Flow Analysis provides information pertinent to the fundamental aspects of signal flow analysis. This book discusses the basic theory of signal flow graphs and shows their relation to the usual algebraic equations.Organized into seven chapters, this book begins with an overview of properties of a flow graph. This text then demonstrates how flow graphs can be applied to a wide range of electrical circuits that do not involve amplification. Other chapters deal with the parameters as well as circuit applications of transistors. This book discusses as well the variety of circuits using ther
Automated Software Vulnerability Analysis
Sezer, Emre C.; Kil, Chongkyung; Ning, Peng
Despite decades of research, software continues to have vulnerabilities. Successful exploitations of these vulnerabilities by attackers cost millions of dollars to businesses and individuals. Unfortunately, most effective defensive measures, such as patching and intrusion prevention systems, require an intimate knowledge of the vulnerabilities. Many systems for detecting attacks have been proposed. However, the analysis of the exploited vulnerabilities is left to security experts and programmers. Both the human effort involved and the slow analysis process are unfavorable for timely defensive measures to be deployed. The problem is exacerbated by zero-day attacks.
Spectral analysis by correlation
International Nuclear Information System (INIS)
Fauque, J.M.; Berthier, D.; Max, J.; Bonnet, G.
1969-01-01
The spectral density of a signal, which represents its power distribution along the frequency axis, is a function which is of great importance, finding many uses in all fields concerned with the processing of the signal (process identification, vibrational analysis, etc...). Amongst all the possible methods for calculating this function, the correlation method (correlation function calculation + Fourier transformation) is the most promising, mainly because of its simplicity and of the results it yields. The study carried out here will lead to the construction of an apparatus which, coupled with a correlator, will constitute a set of equipment for spectral analysis in real time covering the frequency range 0 to 5 MHz. (author) [fr
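The correlation method described here (autocorrelation function calculation followed by a Fourier transformation, i.e. the Wiener–Khinchin route) can be sketched numerically; frequencies below are in cycles per sample and the signal is synthetic:

```python
import math

def autocorrelation(x, max_lag):
    """Unbiased estimate of the autocorrelation function up to max_lag."""
    n = len(x)
    return [sum(x[i] * x[i + k] for i in range(n - k)) / (n - k)
            for k in range(max_lag + 1)]

def psd_from_correlation(r, freq):
    """Cosine (Fourier) transform of the even autocorrelation function:
    one point of the power spectral density estimate."""
    return r[0] + 2.0 * sum(
        r[k] * math.cos(2.0 * math.pi * freq * k) for k in range(1, len(r))
    )

# A 0.1 cycles/sample sinusoid: the spectral peak lands at 0.1.
x = [math.cos(2 * math.pi * 0.1 * i) for i in range(400)]
r = autocorrelation(x, 50)
freqs = [j / 100.0 for j in range(1, 50)]
peak = max(freqs, key=lambda f: psd_from_correlation(r, f))
assert abs(peak - 0.1) < 0.011
```

The hardware described in the article does the same two steps in real time: the correlator computes r(k), and the spectral-analysis unit performs the transform.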
DEFF Research Database (Denmark)
Josefsen, Knud; Nielsen, Henrik
2011-01-01
Northern blotting analysis is a classical method for analysis of the size and steady-state level of a specific RNA in a complex sample. In short, the RNA is size-fractionated by gel electrophoresis and transferred by blotting onto a membrane to which the RNA is covalently bound. Then, the membrane is analysed by hybridization to one or more specific probes that are labelled for subsequent detection. Northern blotting is relatively simple to perform, inexpensive, and not plagued by artefacts. Recent developments of hybridization membranes and buffers have resulted in increased sensitivity closing...
Subseabed disposal safety analysis
International Nuclear Information System (INIS)
Koplick, C.M.; Kabele, T.J.
1982-01-01
This report summarizes the status of work performed by Analytic Sciences Corporation (TASC) in FY'81 on subseabed disposal safety analysis. Safety analysis for subseabed disposal is divided into two phases: pre-emplacement which includes all transportation, handling, and emplacement activities; and long-term (post-emplacement), which is concerned with the potential hazard after waste is safely emplaced. Details of TASC work in these two areas are provided in two technical reports. The work to date, while preliminary, supports the technical and environmental feasibility of subseabed disposal of HLW
Sprecher, David A
2010-01-01
This classic text in introductory analysis delineates and explores the intermediate steps between the basics of calculus and the ultimate stage of mathematics: abstraction and generalization.Since many abstractions and generalizations originate with the real line, the author has made it the unifying theme of the text, constructing the real number system from the point of view of a Cauchy sequence (a step which Dr. Sprecher feels is essential to learn what the real number system is).The material covered in Elements of Real Analysis should be accessible to those who have completed a course in
Energy Technology Data Exchange (ETDEWEB)
Johnson, M A
1983-03-01
Energy analysis contributed to the public debate on the gasohol programme in the U.S., where this analysis became a legal requirement. The published energy analyses for gasohol are reviewed and their inherent assumptions and data sources assessed. The analyses are normalised to S.I. units to facilitate comparisons. The process of rationalising the various treatments uncovered areas of uncertainty, particularly in the methodologies which could be used to analyse some parts of the process. Although the definitive study has still to be written, the consensus is that maize to fuel ethanol via the traditional fermentation route is a net consumer of energy. (Refs. 13).
Bhatia, Rajendra
2009-01-01
These notes are a record of a one semester course on Functional Analysis given by the author to second year Master of Statistics students at the Indian Statistical Institute, New Delhi. Students taking this course have a strong background in real analysis, linear algebra, measure theory and probability, and the course proceeds rapidly from the definition of a normed linear space to the spectral theorem for bounded selfadjoint operators in a Hilbert space. The book is organised as twenty six lectures, each corresponding to a ninety minute class session. This may be helpful to teachers planning a course on this topic. Well prepared students can read it on their own.
Analysis of maintenance strategies
International Nuclear Information System (INIS)
Laakso, K.; Simola, K.
1998-01-01
The main topics of the presentation include: (1) an analysis model and methods to evaluate maintenance action programs and to support decisions on changes to them, and (2) understanding maintenance strategies from a systems perspective as a basis for future developments. The subproject showed how systematic models for maintenance analysis and decision support, utilising computerised and statistical tool packages, can be taken into use for evaluation and optimisation of maintenance of active systems from the safety and economic point of view.
Foundations of stochastic analysis
Rao, M M; Lukacs, E
1981-01-01
Foundations of Stochastic Analysis deals with the foundations of the theory of Kolmogorov and Bochner and its impact on the growth of stochastic analysis. Topics covered range from conditional expectations and probabilities to projective and direct limits, as well as martingales and likelihood ratios. Abstract martingales and their applications are also discussed. Comprised of five chapters, this volume begins with an overview of the basic Kolmogorov-Bochner theorem, followed by a discussion on conditional expectations and probabilities containing several characterizations of operators and mea
International Nuclear Information System (INIS)
Chatelus, R.; Schot, P.M.
2010-01-01
In order to verify compliance with safeguards and draw conclusions on the absence of undeclared nuclear material and activities, the International Atomic Energy Agency (IAEA) collects and analyses trade information that it receives from open sources as well as from Member States. Although the IAEA does not intervene in national export controls, it has to monitor the trade of dual use items. Trade analysis helps the IAEA to evaluate global proliferation threats, to understand States' ability to report exports according to additional protocols but also to compare against State declarations. Consequently, the IAEA has explored sources of trade-related information and has developed analysis methodologies beyond its traditional safeguards approaches. (author)
Cai, Tony
2010-01-01
Over the last few years, significant developments have been taking place in highdimensional data analysis, driven primarily by a wide range of applications in many fields such as genomics and signal processing. In particular, substantial advances have been made in the areas of feature selection, covariance estimation, classification and regression. This book intends to examine important issues arising from highdimensional data analysis to explore key ideas for statistical inference and prediction. It is structured around topics on multiple hypothesis testing, feature selection, regression, cla
Senthilkumar, K.; Ruchika Mehra Vijayan, E.
2017-11-01
This paper aims to illustrate real-time analysis of large-scale data. For practical implementation we perform sentiment analysis on live Twitter feeds, scoring each individual tweet. To analyse sentiments we train our data model on SentiWordNet, a polarity-annotated WordNet sample from Princeton University. Our main objective is to efficiently analyse large-scale data on the fly using distributed computation. The Apache Spark and Apache Hadoop ecosystem is used as the distributed computation platform, with Java as the development language.
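The lexicon-based scoring the abstract describes can be sketched as follows. This is a minimal illustration of the idea, not the paper's implementation: the tiny lexicon below is invented for the demo (real SentiWordNet entries carry per-synset positivity and negativity scores).

```python
# Illustrative lexicon-based tweet scoring: each word carries a (positive,
# negative) score pair; a tweet's polarity is the sum over its tokens.
# The lexicon values here are made up for the example.
LEXICON = {
    "good": (0.75, 0.0),
    "great": (0.88, 0.0),
    "bad": (0.0, 0.65),
    "terrible": (0.0, 0.90),
}

def tweet_polarity(text):
    """Return summed (positive, negative) scores for one tweet."""
    pos = neg = 0.0
    for token in text.lower().split():
        p, n = LEXICON.get(token.strip(".,!?"), (0.0, 0.0))
        pos += p
        neg += n
    return pos, neg

def label(text):
    """Classify a tweet by comparing its summed polarity scores."""
    pos, neg = tweet_polarity(text)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"
```

In a streaming setting, a function like `label` would be applied per tweet inside the distributed map stage; the scoring itself stays embarrassingly parallel.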
DEFF Research Database (Denmark)
Vatrapu, Ravi; Hussain, Abid; Buus Lassen, Niels
2015-01-01
This paper argues that the basic premise of Social Network Analysis (SNA) -- namely that social reality is constituted by dyadic relations and that social interactions are determined by structural properties of networks -- is neither necessary nor sufficient for Big Social Data analytics of Facebook or Twitter data. However, there exists no other holistic computational social science approach beyond the relational sociology and graph theory of SNA. To address this limitation, this paper presents an alternative holistic approach to Big Social Data analytics called Social Set Analysis (SSA)......
Hoffman, Kenneth
2007-01-01
Developed for an introductory course in mathematical analysis at MIT, this text focuses on concepts, principles, and methods. Its introductions to real and complex analysis are closely formulated, and they constitute a natural introduction to complex function theory.Starting with an overview of the real number system, the text presents results for subsets and functions related to Euclidean space of n dimensions. It offers a rigorous review of the fundamentals of calculus, emphasizing power series expansions and introducing the theory of complex-analytic functions. Subsequent chapters cover seq
International Nuclear Information System (INIS)
Clark, R.D.
1996-01-01
This analysis defines and evaluates the surface water supply system from the existing J-13 well to the North Portal. This system includes the pipe running from J-13 to a proposed Booster Pump Station at the intersection of H Road and the North Portal access road. Contained herein is an analysis of the proposed Booster Pump Station with a brief description of the system that could be installed to the South Portal and the optional shaft. The tanks that supply the water to the North Portal are sized, and the supply system to the North Portal facilities and up to Topopah Spring North Ramp is defined
Structural analysis for Diagnosis
DEFF Research Database (Denmark)
Izadi-Zamanabadi, Roozbeh; Blanke, M.
2001-01-01
Aiming at design of algorithms for fault diagnosis, structural analysis of systems offers concise yet easy overall analysis. Graph-based matching, which is the essential technique to obtain redundant information for diagnosis, is re-considered in this paper. Matching is re-formulated as a problem...... of relating faults to known parameters and measurements of a system. Using explicit fault modelling, minimal over-determined subsystems are shown to provide necessary redundancy relations from the matching. Details of the method are presented and a realistic example used to clearly describe individual steps...
Structural analysis for diagnosis
DEFF Research Database (Denmark)
Izadi-Zamanabadi, Roozbeh; Blanke, M.
2002-01-01
Aiming at design of algorithms for fault diagnosis, structural analysis of systems offers concise yet easy overall analysis. Graph-based matching, which is the essential technique to obtain redundant information for diagnosis, is reconsidered in this paper. Matching is reformulated as a problem...... of relating faults to known parameters and measurements of a system. Using explicit fault modelling, minimal overdetermined subsystems are shown to provide necessary redundancy relations from the matching. Details of the method are presented and a realistic example used to clearly describe individual steps....
International Nuclear Information System (INIS)
Bergman, R.
1980-12-01
The report describes the development of a method for in vivo Cd analysis. The method is based on analysis of the prompt gamma radiation emitted on neutron capture by the isotope 113Cd. Different parts of the body can be analysed selectively with neutrons in the interval 1 to 100 keV. The results show that the Cd level in the kidneys can be measured without exceeding a dose of 40 mrad, and that only 20% uncertainty is introduced in the Cd analysis. The development was carried out at the R2 reactor in Studsvik using 25 keV neutrons. (G.B.)
Brieda, Lubos
2015-01-01
This talk presents three tools developed recently for contamination analysis: (1) HTML QCM analyzer: runs in a web browser and allows for data analysis of QCM log files; (2) Java RGA extractor: can load multiple SRS .ana files and extract pressure vs. time data; (3) C++ contamination simulation code: a 3D particle-tracing code for modeling transport of dust particulates and molecules. It uses residence time to determine whether molecules stick; particulates can be sampled from IEST-STD-1246 and be accelerated by aerodynamic forces.
Beginning statistics with data analysis
Mosteller, Frederick; Rourke, Robert EK
2013-01-01
This introduction to the world of statistics covers exploratory data analysis, methods for collecting data, formal statistical inference, and techniques of regression and analysis of variance. 1983 edition.
Electronic Circuit Analysis Language (ECAL)
Chenghang, C.
1983-03-01
The computer aided design technique is an important development in computer applications and an important component of computer science. A special language for electronic circuit analysis is the foundation of computer aided design or computer aided circuit analysis (abbreviated as CACD and CACA) of simulated circuits. Electronic Circuit Analysis Language (ECAL) is a comparatively simple and easy-to-use special-purpose circuit analysis language which uses the FORTRAN language to carry out its interpretive execution. It is capable of conducting dc analysis, ac analysis, and transient analysis of a circuit. Furthermore, the results of the dc analysis can be used directly as the initial conditions for the ac and transient analyses.
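The dc-to-transient handoff described above can be illustrated on the simplest possible case, a source-R-C loop. This is a hedged sketch of the general idea, not ECAL itself: the dc operating point (the capacitor fully charged, since no dc current flows through a capacitor at steady state) becomes the initial condition for the transient solution after a source step.

```python
import math

def dc_operating_point(v_source):
    # DC analysis of a source-R-C loop: at steady state no current flows,
    # so the capacitor voltage equals the source voltage.
    return v_source

def transient(v0, v_final, r, c, t):
    # Transient analysis: first-order exponential relaxation of the
    # capacitor voltage toward the new source level, starting from v0.
    tau = r * c
    return v_final + (v0 - v_final) * math.exp(-t / tau)

v0 = dc_operating_point(5.0)                 # dc solution: 5 V across C
# One time constant (tau = 1 ms) after the source is shorted to 0 V:
v_at_tau = transient(v0, 0.0, r=1e3, c=1e-6, t=1e-3)
```

A full simulator solves the same two stages numerically for arbitrary networks; the point here is only the data flow from the dc result into the transient initial condition.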
Prehistory analysis using photon activation analysis
International Nuclear Information System (INIS)
Krausova, I.; Chvatil, D.; Tajer, J.
2017-01-01
Instrumental photon activation analysis (IPAA) is a suitable radio-analytical method for non-destructive determination of total nitrogen in various matrices. IPAA determination of nitrogen is based on the 14N(γ,n)13N nuclear reaction after high-energy photon irradiation. The analytically usable product of this photo-nuclear reaction is a positron emitter that emits only the non-specific 511 keV annihilation line, which can also be emitted by other radionuclides present in the sample. Some of these, besides the non-specific 511 keV line, also emit specific lines that allow their contribution to the analytical radionuclide 13N to be subtracted. An efficient source of high-energy photon radiation is the secondary bremsstrahlung generated by the conversion of an electron beam accelerated in a high-frequency circular accelerator, a microtron. The non-destructive IPAA contributed to clarifying the origin of a precious bracelet from a fortified settlement in the Karlovy Vary - Drahovice area, dating from the late Bronze Age. (authors)
Systems analysis - independent analysis and verification
Energy Technology Data Exchange (ETDEWEB)
DiPietro, J.P.; Skolnik, E.G.; Badin, J.S. [Energetics, Inc., Columbia, MD (United States)
1996-10-01
The Hydrogen Program of the U.S. Department of Energy (DOE) funds a portfolio of activities ranging from conceptual research to pilot plant testing. The long-term research projects support DOE's goal of a sustainable, domestically based energy system, and the development activities are focused on hydrogen-based energy systems that can be commercially viable in the near-term. Energetics develops analytic products that enable the Hydrogen Program Manager to assess the potential for near- and long-term R&D activities to satisfy DOE and energy market criteria. This work is based on a pathway analysis methodology. The authors consider an energy component (e.g., hydrogen production from biomass gasification, hybrid hydrogen internal combustion engine (ICE) vehicle) within a complete energy system. The work involves close interaction with the principal investigators to ensure accurate representation of the component technology. Comparisons are made with the current cost and performance of fossil-based and alternative renewable energy systems, and sensitivity analyses are conducted to determine the effect of changes in cost and performance parameters on the projects' viability.
Using Xrootd to Federate Regional Storage
International Nuclear Information System (INIS)
Bauerdick, L; Benjamin, D; Bloom, K; Bockelman, B; Bradley, D; Dasu, S; Ernst, M; Ito, H; Rind, O; Gardner, R; Vukotic, I; Hanushevsky, A; Lesny, D; McGuigan, P; McKee, S; Severini, H; Sfiligoi, I; Tadel, M; Würthwein, F; Williams, S
2012-01-01
While the LHC data movement systems have demonstrated the ability to move data at the necessary throughput, we have identified two weaknesses: the latency for physicists to access data and the complexity of the tools involved. To address these, both ATLAS and CMS have begun to federate regional storage systems using Xrootd. Xrootd, referring to a protocol and implementation, allows us to provide data access to all disk-resident data from a single virtual endpoint. This “redirector” discovers the actual location of the data and redirects the client to the appropriate site. The approach is particularly advantageous since typically the redirection requires much less than 500 milliseconds and the Xrootd client is conveniently built into LHC physicists’ analysis tools. Currently, there are three regional storage federations - a US ATLAS region, a European CMS region, and a US CMS region. The US ATLAS and US CMS regions include their respective Tier 1, Tier 2 and some Tier 3 facilities; a large percentage of experimental data is available via the federation. Additionally, US ATLAS has begun studying low-latency regional federations of close-by sites. From the base idea of federating storage behind an endpoint, the implementations and use cases diverge. The CMS software framework is capable of efficiently processing data over high-latency links, so using the remote site directly is comparable to accessing local data. The ATLAS processing model allows a broad spectrum of user applications with varying degrees of performance with regard to latency; a particular focus has been optimizing n-tuple analysis. Both VOs use GSI security. ATLAS has developed a mapping of VOMS roles to specific file system authorizations, while CMS has developed callouts to the site's mapping service. Each federation presents a global namespace to users. For ATLAS, the global-to-local mapping is based on a heuristic-based lookup from the site's local file catalog, while CMS does the mapping
Using Xrootd to federate regional storage
Energy Technology Data Exchange (ETDEWEB)
Bauerdick, L.; et al.
2012-01-01
While the LHC data movement systems have demonstrated the ability to move data at the necessary throughput, we have identified two weaknesses: the latency for physicists to access data and the complexity of the tools involved. To address these, both ATLAS and CMS have begun to federate regional storage systems using Xrootd. Xrootd, referring to a protocol and implementation, allows us to provide data access to all disk-resident data from a single virtual endpoint. This redirector discovers the actual location of the data and redirects the client to the appropriate site. The approach is particularly advantageous since typically the redirection requires much less than 500 milliseconds and the Xrootd client is conveniently built into LHC physicists analysis tools. Currently, there are three regional storage federations - a US ATLAS region, a European CMS region, and a US CMS region. The US ATLAS and US CMS regions include their respective Tier 1, Tier 2 and some Tier 3 facilities, a large percentage of experimental data is available via the federation. Additionally, US ATLAS has begun studying low-latency regional federations of close-by sites. From the base idea of federating storage behind an endpoint, the implementations and use cases diverge. The CMS software framework is capable of efficiently processing data over high-latency links, so using the remote site directly is comparable to accessing local data. The ATLAS processing model allows a broad spectrum of user applications with varying degrees of performance with regard to latency, a particular focus has been optimizing n-tuple analysis. Both VOs use GSI security. ATLAS has developed a mapping of VOMS roles to specific file system authorizations, while CMS has developed callouts to the site's mapping service. Each federation presents a global namespace to users. For ATLAS, the global-to-local mapping is based on a heuristic-based lookup from the site's local file catalog, while CMS does the mapping
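The redirector behaviour the two records above describe (a single virtual endpoint that resolves a global logical filename to the site that actually hosts it) can be sketched in a few lines. This is a toy illustration of the lookup idea only; the site names, catalogs and paths are invented, and a real federation performs the resolution via the Xrootd protocol rather than an in-memory dictionary.

```python
# Hypothetical per-site catalogs mapping a global logical filename (LFN)
# to that site's local physical path. All entries are invented for the demo.
SITE_CATALOGS = {
    "T2_US_Nebraska": {"/store/data/run1.root": "/hadoop/cms/store/data/run1.root"},
    "T1_US_FNAL": {"/store/data/run2.root": "/dcache/cms/store/data/run2.root"},
}

def redirect(lfn):
    """Return (site, local_path) for the first site holding the file,
    mimicking a redirector's global-to-local namespace mapping; None if
    the file is nowhere in the federation."""
    for site, catalog in SITE_CATALOGS.items():
        if lfn in catalog:
            return site, catalog[lfn]
    return None
```

The client-facing value of the scheme is exactly this indirection: the user addresses one namespace, and the sub-second lookup decides which storage element serves the bytes.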
Learning Haskell data analysis
Church, James
2015-01-01
If you are a developer, analyst, or data scientist who wants to learn data analysis methods using Haskell and its libraries, then this book is for you. Prior experience with Haskell and a basic knowledge of data science will be beneficial.
International Nuclear Information System (INIS)
Malik, S; Bloom, K; Shipsey, I; Cavanaugh, R; Klima, B; Chan, Kai-Feng; D'Hondt, J; Narain, M; Palla, F; Rolandi, G; Schörner-Sadenius, T
2014-01-01
To impart hands-on training in physics analysis, the CMS experiment initiated the concept of the CMS Data Analysis School (CMSDAS). It was born over three years ago at the LPC (LHC Physics Centre) at Fermilab and is based on earlier workshops held at the LPC and the CLEO experiment. As CMS transitioned from construction to data-taking mode, the nature of the earlier training also evolved to include more analysis tools, software tutorials and physics analysis. This effort, epitomized as CMSDAS, has proven to be key for new and young physicists to jump-start and contribute to the physics goals of CMS by looking for new physics with the collision data. With over 400 physicists trained in six CMSDAS events around the globe, CMS is trying to engage the collaboration in its discovery potential and maximize physics output. As a bigger goal, CMS is striving to nurture and increase the engagement of its myriad talents in the development of physics, service, upgrades, the education of those new to CMS and the career development of younger members. An extension of the concept to dedicated software and hardware schools is also planned, keeping in mind the ensuing upgrade phase.
International Nuclear Information System (INIS)
2002-01-01
This document is one in a series of publications known as the ETDE/INIS Joint Reference Series and also constitutes a part of the ETDE Procedures Manual. It presents the rules, guidelines and procedures to be adopted by centers submitting input to the International Nuclear Information System (INIS) or the Energy Technology Data Exchange (ETDE). It is a manual for the subject analysis part of input preparation, meaning the selection, subject classification, abstracting and subject indexing of relevant publications, and is to be used in conjunction with the Thesauruses, Subject Categories documents and the documents providing guidelines for the preparation of abstracts. The concept and structure of the new manual are intended to describe in a logical and efficient sequence all the steps comprising the subject analysis of documents to be reported to INIS or ETDE. The manual includes new chapters on preparatory analysis, subject classification, abstracting and subject indexing, as well as rules, guidelines, procedures, examples and a special chapter on guidelines and examples for subject analysis in particular subject fields. (g.t.; a.n.)
Bedrossian, Nazareth; Jang, Jiann-Woei; McCants, Edward; Omohundro, Zachary; Ring, Tom; Templeton, Jeremy; Zoss, Jeremy; Wallace, Jonathan; Ziegler, Philip
2011-01-01
Draper Station Analysis Tool (DSAT) is a computer program, built on commercially available software, for simulating and analyzing complex dynamic systems. Heretofore used in designing and verifying guidance, navigation, and control systems of the International Space Station, DSAT has a modular architecture that lends itself to modification for application to spacecraft or terrestrial systems. DSAT consists of user-interface, data-structures, simulation-generation, analysis, plotting, documentation, and help components. DSAT automates the construction of simulations and the process of analysis. DSAT provides a graphical user interface (GUI), plus a Web-enabled interface, similar to the GUI, that enables a remotely located user to gain access to the full capabilities of DSAT via the Internet and Webbrowser software. Data structures are used to define the GUI, the Web-enabled interface, simulations, and analyses. Three data structures define the type of analysis to be performed: closed-loop simulation, frequency response, and/or stability margins. DSAT can be executed on almost any workstation, desktop, or laptop computer. DSAT provides better than an order of magnitude improvement in cost, schedule, and risk assessment for simulation based design and verification of complex dynamic systems.
Israel, Carsten W; Ekosso-Ejangue, Lucy; Sheta, Mohamed-Karim
2015-09-01
The key to a successful analysis of a pacemaker electrocardiogram (ECG) is the application of the systematic approach used for any other ECG without a pacemaker: analysis of (1) basic rhythm and rate, (2) QRS axis, (3) PQ, QRS and QT intervals, (4) morphology of P waves, QRS, ST segments and T(U) waves and (5) the presence of arrhythmias. If only the most obvious abnormality of a pacemaker ECG is considered, wrong conclusions can easily be drawn. If a systematic approach is skipped it may be overlooked that e.g. atrial pacing is ineffective, the left ventricle is paced instead of the right ventricle, pacing competes with intrinsic conduction or that the atrioventricular (AV) conduction time is programmed too long. Apart from this analysis, a pacemaker ECG which is not clear should be checked for the presence of arrhythmias (e.g. atrial fibrillation, atrial flutter, junctional escape rhythm and endless loop tachycardia), pacemaker malfunction (e.g. atrial or ventricular undersensing or oversensing, atrial or ventricular loss of capture) and activity of specific pacing algorithms, such as automatic mode switching, rate adaptation, AV delay modifying algorithms, reaction to premature ventricular contractions (PVC), safety window pacing, hysteresis and noise mode. A systematic analysis of the pacemaker ECG almost always allows a probable diagnosis of arrhythmias and malfunctions to be made, which can be confirmed by pacemaker control and can often be corrected at the touch of the right button to the patient's benefit.
Nancy E. Fleenor
2002-01-01
A Landscape Analysis Plan (LAP) sets out broad guidelines for project development within boundaries of the Kings River Sustainable Forest Ecosystems Project. The plan must be a dynamic, living document, subject to change as new information arises over the course of this very long-term project (several decades). Two watersheds, each of 32,000 acres, were dedicated to...
Bayesian logistic regression analysis
Van Erp, H.R.N.; Van Gelder, P.H.A.J.M.
2012-01-01
In this paper we present a Bayesian logistic regression analysis. It is found that if one wishes to derive the posterior distribution of the probability of some event, then, together with the traditional Bayes Theorem and the integrating out of nuissance parameters, the Jacobian transformation is an
Monotowns: A Quantitative Analysis
Directory of Open Access Journals (Sweden)
Shastitko Andrei
2016-06-01
The authors propose an empirical analysis of the current situation in monotowns. The study questions the perceived seriousness of the 'monotown problem' as well as the actual challenges it presents. The authors use a cluster analysis to divide monotowns into groups for further structural comparison. The structural differences in the available databases limit the possibilities of empirical analysis; hence, alternative approaches are required. The authors consider possible reasons for the limitations identified. Special attention is paid to the monotowns that were granted the status of advanced development territories. A comparative analysis makes it possible to study their general characteristics and socioeconomic indicators. The authors apply the theory of opportunistic behaviour to describe potential problems caused by the lack of unified criteria for granting monotowns the status of advanced development territories. The article identifies the main stakeholders and the character of their interaction; it describes a conceptual model built on principal/agent interactions and identifies the parametric space of mutually beneficial cooperation. The solution to the principal/agent problem suggested in the article contributes to the development of an alternative approach to the current situation and a rational approach to overcoming the 'monotown problem'.
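The clustering step used to group monotowns can be illustrated with a toy one-dimensional k-means over a single socio-economic indicator. This is a generic sketch of the technique, not the authors' procedure; the indicator values below are invented for the demo.

```python
def kmeans_1d(values, iters=20):
    """Split a list of numbers into two clusters with a tiny 1-D k-means,
    seeded at the minimum and maximum values."""
    centers = [min(values), max(values)]
    groups = [[], []]
    for _ in range(iters):
        groups = [[], []]
        for v in values:
            nearest = 0 if abs(v - centers[0]) <= abs(v - centers[1]) else 1
            groups[nearest].append(v)
        # Recompute each center as its group mean (keep the old center if empty).
        centers = [sum(g) / len(g) if g else centers[i] for i, g in enumerate(groups)]
    return centers, groups

# Hypothetical per-town indicator, e.g. an unemployment rate:
indicators = [1.0, 1.2, 0.9, 5.1, 4.8, 5.3]
centers, groups = kmeans_1d(indicators)
```

With more indicators per town the same loop runs on vectors with a Euclidean distance, which is the usual setting for structural comparison across groups.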
SWOT ANALYSIS - CHINESE PETROLEUM
Chunlan Wang; Lei Zhang; Qi Zhong
2014-01-01
This article, written in early December 2013, combines the historical development and the latest data on Chinese Petroleum to carry out a SWOT analysis. The paper discusses corporate resources, cost and management, together with external factors such as the political environment and market supply and demand, and conducts a comprehensive and profound analysis.
Sensitivity Analysis Without Assumptions.
Ding, Peng; VanderWeele, Tyler J
2016-05-01
Unmeasured confounding may undermine the validity of causal inference with observational studies. Sensitivity analysis provides an attractive way to partially circumvent this issue by assessing the potential influence of unmeasured confounding on causal conclusions. However, previous sensitivity analysis approaches often make strong and untestable assumptions such as having an unmeasured confounder that is binary, or having no interaction between the effects of the exposure and the confounder on the outcome, or having only one unmeasured confounder. Without imposing any assumptions on the unmeasured confounder or confounders, we derive a bounding factor and a sharp inequality such that the sensitivity analysis parameters must satisfy the inequality if an unmeasured confounder is to explain away the observed effect estimate or reduce it to a particular level. Our approach is easy to implement and involves only two sensitivity parameters. Surprisingly, our bounding factor, which makes no simplifying assumptions, is no more conservative than a number of previous sensitivity analysis techniques that do make assumptions. Our new bounding factor implies not only the traditional Cornfield conditions that both the relative risk of the exposure on the confounder and that of the confounder on the outcome must satisfy but also a high threshold that the maximum of these relative risks must satisfy. Furthermore, this new bounding factor can be viewed as a measure of the strength of confounding between the exposure and the outcome induced by a confounder.
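The bounding factor described above can be turned into a short calculation. Under my reading of the Ding and VanderWeele approach, the joint bound for two confounding relative risks is RR_EU * RR_UD / (RR_EU + RR_UD - 1), and the smallest equal confounding strength that could fully explain away an observed risk ratio (the E-value) is RR + sqrt(RR * (RR - 1)); treat the formulas here as a sketch to be checked against the paper.

```python
import math

def bounding_factor(rr_eu, rr_ud):
    """Joint bounding factor for the exposure-confounder and
    confounder-outcome relative risks (both assumed > 1)."""
    return rr_eu * rr_ud / (rr_eu + rr_ud - 1.0)

def e_value(rr_obs):
    """Smallest confounding strength, applied to both sensitivity
    parameters at once, that could fully explain away an observed
    risk ratio rr_obs > 1."""
    return rr_obs + math.sqrt(rr_obs * (rr_obs - 1.0))
```

A sanity check of the pair: plugging the E-value of an observed risk ratio into both arguments of the bounding factor should recover that risk ratio exactly.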
Seber, George A F
2012-01-01
Concise, mathematically clear, and comprehensive treatment of the subject.* Expanded coverage of diagnostics and methods of model fitting.* Requires no specialized knowledge beyond a good grasp of matrix algebra and some acquaintance with straight-line regression and simple analysis of variance models.* More than 200 problems throughout the book plus outline solutions for the exercises.* This revision has been extensively class-tested.
DEFF Research Database (Denmark)
Kirchmeier-Andersen, Sabine; Møller Christensen, Jakob; Lihn Jensen, Bente
2004-01-01
This article presents the latest version of VIA (version 3.0). The development of the program was initiated by a demand for more systematic training of language analysis in high schools and universities. The system is now web-based, which enables teachers and students to share exercises across...
Shabalin, P L; Yakubenko, A A; Pokhilevich, VA; Krein, M G
1986-01-01
This collection of eleven papers covers a broad spectrum of topics in analysis, from the study of certain classes of analytic functions to the solvability of singular problems for differential and integral equations to computational schemes for the partial differential equations and singular integral equations.
Lyman L. McDonald; Christina D. Vojta; Kevin S. McKelvey
2013-01-01
Perhaps the greatest barrier between monitoring and management is data analysis. Data languish in drawers and spreadsheets because those who collect or maintain monitoring data lack training in how to effectively summarize and analyze their findings. This chapter serves as a first step to surmounting that barrier by empowering any monitoring team with the basic...
Idris, Ivan
2014-01-01
This book is for programmers, scientists, and engineers who have knowledge of the Python language and know the basics of data science. It is for those who wish to learn different data analysis methods using Python and its libraries. This book contains all the basic ingredients you need to become an expert data analyst.
Haskell data analysis cookbook
Shukla, Nishant
2014-01-01
Step-by-step recipes filled with practical code samples and engaging examples demonstrate Haskell in practice, and then the concepts behind the code. This book shows functional developers and analysts how to leverage their existing knowledge of Haskell specifically for high-quality data analysis. A good understanding of data sets and functional programming is assumed.
Energy Technology Data Exchange (ETDEWEB)
Frame, Katherine Chiyoko [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2017-06-28
Neutron multiplicity measurements are widely used for nondestructive assay (NDA) of special nuclear material (SNM). When combined with isotopic composition information, neutron multiplicity analysis can be used to estimate the spontaneous fission rate and leakage multiplication of SNM, and the total mass of fissile material can also be determined. This presentation provides an overview of this technique.
Greenberg, Marc W.; Laing, William
2013-01-01
An Economic Analysis (EA) is a systematic approach to the problem of choosing the best method of allocating scarce resources to achieve a given objective. An EA helps guide decisions on the "worth" of pursuing an action that departs from status quo ... an EA is the crux of decision-support.
Activation Analysis of Aluminium
Energy Technology Data Exchange (ETDEWEB)
Brune, Dag
1961-01-15
An analysis of pure aluminium alloyed with magnesium was performed by means of gamma spectrometry; chemical separations were not employed. The isotopes to be determined were brought to optimum activity by suitably choosing the times of irradiation and decay. The following elements were detected and measured quantitatively: iron, zinc, copper, gallium, manganese, chromium, scandium and hafnium.
VENTILATION TECHNOLOGY SYSTEMS ANALYSIS
The report gives results of a project to develop a systems analysis of ventilation technology and provide a state-of-the-art assessment of ventilation and indoor air quality (IAQ) research needs. (NOTE: Ventilation technology is defined as the hardware necessary to bring outdoor ...
International Nuclear Information System (INIS)
Kaiser, V.
1993-01-01
In Chapter 2 process energy cost analysis for chemical processing is treated in a general way, independent of the specific form of energy and power production. Especially, energy data collection and data treatment, energy accounting (metering, balance setting), specific energy input, and utility energy costs and prices are discussed. (R.P.) 14 refs., 4 figs., 16 tabs
DEFF Research Database (Denmark)
Mai, Jens Erik
2005-01-01
is presented as an alternative and the paper discusses how this approach includes a broader range of analyses and how it requires a new set of actions from using this approach; analysis of the domain, users and indexers. The paper concludes that the two-step procedure to indexing is insufficient to explain...
On frame multiresolution analysis
DEFF Research Database (Denmark)
Christensen, Ole
2003-01-01
We use the freedom in frame multiresolution analysis to construct tight wavelet frames (even in the case where the refinable function does not generate a tight frame). In cases where a frame multiresolution does not lead to a construction of a wavelet frame we show how one can nevertheless...
Methods in algorithmic analysis
Dobrushkin, Vladimir A
2009-01-01
…helpful to any mathematics student who wishes to acquire a background in classical probability and analysis … This is a remarkably beautiful book that would be a pleasure for a student to read, or for a teacher to make into a year's course.-Harvey Cohn, Computing Reviews, May 2010
Quantitative Moessbauer analysis
International Nuclear Information System (INIS)
Collins, R.L.
1978-01-01
The quantitative analysis of Moessbauer data, as in the measurement of Fe3+/Fe2+ concentration, has not been possible because of the different mean square velocities ⟨x²⟩ of Moessbauer nuclei at chemically different sites. A method is now described which, based on Moessbauer data at several temperatures, permits the comparison of absorption areas at ⟨x²⟩ = 0. (Auth.)
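The extrapolation idea behind comparing absorption areas at ⟨x²⟩ = 0 can be sketched with a linear fit of log-area against temperature and a comparison of intercepts. This is a generic illustration of temperature extrapolation, not the paper's actual procedure, and the data points below are invented.

```python
import math

def fit_log_area(temps, areas):
    """Least-squares fit of ln(area) = intercept + slope * T.
    The intercept is the T -> 0 extrapolation of the log-area,
    standing in for the comparison at vanishing <x^2>."""
    xs = temps
    ys = [math.log(a) for a in areas]
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    intercept = ybar - slope * xbar
    return intercept, slope

# Invented areas following ln A = 2 - 0.003 * T exactly:
temps = [100.0, 200.0, 300.0]
areas = [math.exp(2.0 - 0.003 * t) for t in temps]
intercept, slope = fit_log_area(temps, areas)
```

Fitting each chemical site separately and comparing the intercepts removes the site-dependent attenuation that the varying ⟨x²⟩ introduces at finite temperature.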
DEFF Research Database (Denmark)
Ris Hansen, Inge; Søgaard, Karen; Gram, Bibi
2015-01-01
This is the analysis plan for the multicentre randomised control study looking at the effect of training and exercises in chronic neck pain patients that is being conducted in Jutland and Funen, Denmark. This plan will be used as a work description for the analyses of the data collected....
Energy Technology Data Exchange (ETDEWEB)
Malik, S. [Nebraska U.; Shipsey, I. [Purdue U.; Cavanaugh, R. [Illinois U., Chicago; Bloom, K. [Nebraska U.; Chan, Kai-Feng [Taiwan, Natl. Taiwan U.; D' Hondt, J. [Vrije U., Brussels; Klima, B. [Fermilab; Narain, M. [Brown U.; Palla, F. [INFN, Pisa; Rolandi, G. [CERN; Schörner-Sadenius, T. [DESY
2014-01-01
To impart hands-on training in physics analysis, the CMS experiment initiated the concept of the CMS Data Analysis School (CMSDAS). It was born over three years ago at the LPC (LHC Physics Centre) at Fermilab and is based on earlier workshops held at the LPC and the CLEO experiment. As CMS transitioned from construction to data-taking mode, the nature of the earlier training also evolved to include more analysis tools, software tutorials and physics analysis. This effort, epitomized as CMSDAS, has proven to be key for new and young physicists to jump-start and contribute to the physics goals of CMS by looking for new physics in the collision data. With over 400 physicists trained in six CMSDAS schools around the globe, CMS is trying to engage the collaboration in its discovery potential and maximize physics output. As a bigger goal, CMS is striving to nurture and increase the engagement of its myriad talents in the development of physics, service, upgrades, and the education of those new to CMS, as well as the career development of younger members. An extension of the concept to dedicated software and hardware schools is also planned, keeping in mind the ensuing upgrade phase.
DEFF Research Database (Denmark)
Conti, Roberto; Hong, Jeong Hee; Szymanski, Wojciech
2012-01-01
of such an algebra. Then we outline a powerful combinatorial approach to analysis of endomorphisms arising from permutation unitaries. The restricted Weyl group consists of automorphisms of this type. We also discuss the action of the restricted Weyl group on the diagonal MASA and its relationship...
International Nuclear Information System (INIS)
Kelsey, C.A.; Mettler, F.A.
1988-01-01
An elementary introduction to ROC analysis illustrates how ROC curves depend on observer threshold levels and discusses the relation between ROC curve parameters and other measures of observer performance, including accuracy, sensitivity, specificity, true positive fraction, true negative fraction, false positive fraction and false negative fraction.
Information Security Risk Analysis
Peltier, Thomas R
2010-01-01
Offers readers the knowledge and the skill set needed to achieve a highly effective risk analysis assessment. This title demonstrates how to identify threats and then determine whether those threats pose a real risk. It is suitable for industry and academia professionals.
Lubsch, A.; Timmermans, K.
2017-01-01
Texture analysis is a method to test the physical properties of a material by tension and compression. The growing interest in commercialisation of seaweeds for human food has stimulated research into the physical properties of seaweed tissue. These are important parameters for the survival of
Shifted Independent Component Analysis
DEFF Research Database (Denmark)
Mørup, Morten; Madsen, Kristoffer Hougaard; Hansen, Lars Kai
2007-01-01
Delayed mixing is a problem of theoretical interest and practical importance, e.g., in speech processing, bio-medical signal analysis and financial data modelling. Most previous analyses have been based on models with integer shifts, i.e., shifts by a number of samples, and have often been carried...
Multiscale principal component analysis
International Nuclear Information System (INIS)
Akinduko, A A; Gorban, A N
2014-01-01
Principal component analysis (PCA) is an important tool in exploring data. The conventional approach to PCA leads to a solution which favours the structures with large variances. This is sensitive to outliers and can obfuscate interesting underlying structures. One of the equivalent definitions of PCA is that it seeks the subspaces that maximize the sum of squared pairwise distances between data projections. This definition opens up more flexibility in the analysis of principal components, which is useful in enhancing PCA. In this paper we introduce scales into PCA by maximizing only the sum of pairwise distances between projections for pairs of datapoints with distances within a chosen interval of values [l,u]. The resulting principal component decompositions in Multiscale PCA depend on the point (l,u) in the plane, and for each point we define projectors onto the principal components. Cluster analysis of these projectors reveals the structures in the data at various scales. Each structure is described by the eigenvectors at the medoid point of the cluster which represent the structure. We also use the distortion of projections as a criterion for choosing an appropriate scale, especially for data with outliers. The method was tested on both artificial distributions of data and real data. For data with multiscale structures, the method was able to reveal the different structures of the data and also to reduce the effect of outliers in the principal component analysis.
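The pair-restricted maximization described above can be sketched as a small eigenproblem (this brute-force implementation and its names are illustrative assumptions, not the authors' code): summing the outer products of difference vectors for pairs whose distance lies in [l,u] gives a restricted scatter matrix whose leading eigenvectors are the scale-dependent components; taking the interval [0, ∞) recovers ordinary PCA.

```python
import numpy as np

def multiscale_pca(X, l, u, n_components=1):
    """Leading directions maximizing the sum of squared pairwise distances
    between projections, restricted to pairs with distance in [l, u]."""
    dim = X.shape[1]
    S = np.zeros((dim, dim))
    for i in range(len(X)):
        for j in range(i + 1, len(X)):
            d = X[i] - X[j]
            if l <= np.linalg.norm(d) <= u:
                S += np.outer(d, d)   # pair-restricted scatter matrix
    vals, vecs = np.linalg.eigh(S)    # eigenvalues in ascending order
    return vecs[:, ::-1][:, :n_components]

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
W = multiscale_pca(X, 0.0, np.inf, n_components=1)
print(W.shape)  # (3, 1)
```

With the unrestricted interval, the restricted scatter equals n times the centered scatter matrix, so the first column of W coincides (up to sign) with the classical first principal component.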
Euler principal component analysis
Liwicki, Stephan; Tzimiropoulos, Georgios; Zafeiriou, Stefanos; Pantic, Maja
Principal Component Analysis (PCA) is perhaps the most prominent learning tool for dimensionality reduction in pattern recognition and computer vision. However, the ℓ2-norm employed by standard PCA is not robust to outliers. In this paper, we propose a kernel PCA method for fast and robust PCA,
International Nuclear Information System (INIS)
Santoro, R.T.; Iida, H.; Khripunov, V.; Petrizzi, L.; Sato, S.; Sawan, M.; Shatalov, G.; Schipakin, O.
2001-01-01
This paper summarizes the main results of nuclear analysis calculations performed during the International Thermonuclear Experimental Reactor (ITER) Engineering Design Activity (EDA). Major efforts were devoted to fulfilling the General Design Requirements to minimize the nuclear heating rate in the superconducting magnets and to ensuring that radiation conditions at the cryostat are suitable for hands-on maintenance after reactor shut-down. (author)
Elementary functional analysis
Shilov, Georgi E
1996-01-01
Introductory text covers basic structures of mathematical analysis (linear spaces, metric spaces, normed linear spaces, etc.), differential equations, orthogonal expansions, Fourier transforms - including problems in the complex domain, especially involving the Laplace transform - and more. Each chapter includes a set of problems, with hints and answers. Bibliography. 1974 edition.
Computer aided safety analysis
International Nuclear Information System (INIS)
1988-05-01
The document reproduces 20 selected papers from the 38 papers presented at the Technical Committee/Workshop on Computer Aided Safety Analysis organized by the IAEA in co-operation with the Institute of Atomic Energy in Otwock-Swierk, Poland on 25-29 May 1987. A separate abstract was prepared for each of these 20 technical papers. Refs, figs and tabs
1979-01-31
but expands 'accordionlike.' (4) The height-integrated intensity ratio of the red (6300 Å) to green (5577 Å) emissions of atomic oxygen is a good... molecular ion: Analysis of two rocket experiments, Planet. Space Sci. 16, 737, 1968. Hays, P. B. and C. D. Anger, The influence of ground scattering on
Communication Network Analysis Methods.
Farace, Richard V.; Mabee, Timothy
This paper reviews a variety of analytic procedures that can be applied to network data, discussing the assumptions and usefulness of each procedure when applied to the complexity of human communication. Special attention is paid to the network properties measured or implied by each procedure. Factor analysis and multidimensional scaling are among…
Making Strategic Analysis Matter
2012-01-01
Bryan Gabbard, Assessing the Tradecraft of Intelligence Analysis, Santa Monica, Calif.: RAND Corporation, TR-293, 2008. 4 See The Commission on the... July 7, 2011: http://www.rand.org/pubs/occasional_papers/OP152.html Treverton, Gregory F., and C. Bryan Gabbard, Assessing the Tradecraft of
Instrumental analysis, second edition
International Nuclear Information System (INIS)
Christian, G.D.; O'Reilly, J.E.
1988-01-01
The second edition of Instrumental Analysis is a survey of the major instrument-based methods of chemical analysis. It appears to be aimed at undergraduates but would be equally useful in a graduate course. The volume explores all of the classical quantitative methods and contains sections on techniques that usually are not included in a semester course in instrumentation (such as electron spectroscopy and the kinetic methods). Adequate coverage of all of the methods contained in this book would require several semesters of focused study. The 25 chapters were written by different authors, yet the style throughout the book is more uniform than in the earlier edition. With the exception of a two-chapter course in analog and digital circuits, the book purports to de-emphasize instrumentation, focusing more on the theory behind the methods and the application of the methods to analytical problems. However, a detailed analysis of the instruments used in each method is by no means absent. The book has the flavor of a user's guide to analysis.
Kolmogorov, A N; Silverman, Richard A
1975-01-01
Self-contained and comprehensive, this elementary introduction to real and functional analysis is readily accessible to those with background in advanced calculus. It covers basic concepts and introductory principles in set theory, metric spaces, topological and linear spaces, linear functionals and linear operators, and much more. 350 problems. 1970 edition.
Rotation in correspondence analysis
van de Velden, Michel; Kiers, Henk A.L.
2005-01-01
In correspondence analysis rows and columns of a nonnegative data matrix are depicted as points in a, usually, two-dimensional plot. Although such a two-dimensional plot often provides a reasonable approximation, the situation can occur that an approximation of higher dimensionality is required.
2016-04-01
Expand childcare center hours; dual-military co-location policy; maternity, paternity, and adoption leave; women in service increase... Distribution unlimited. Analysis of Undesignated Work, Karan A. Schriver, Edward J. Schmitz, Greggory J. Schell, Hoda Parvin, April 2016... designated and undesignated work requirements. Over time, this mix fluctuates, causing changes to the force profile. Undesignated workload has
Indian Academy of Sciences (India)
Chrissa G. Tsiara
2018-03-13
Mar 13, 2018 ... a meta-analysis of case–control studies was conducted. Univariate and ... recent hepatitis C virus: potential benefit for ribavirin use in HCV/HIV ... C/G polymorphism in breast pathologies and in HIV-infected patients.
International Nuclear Information System (INIS)
Abe, Toshinori
2001-01-01
The North American Linear Collider Detector group has developed simulation and analysis program packages. LCDROOT is one of these packages; it is based on ROOT and the C++ programming language to benefit maximally from object-oriented programming techniques. LCDROOT is constantly improved and now has a new topological vertex finder, ZVTOP3. In these proceedings, the features of the LCDROOT simulation are briefly described
Communication Analysis of Environment.
Malik, M. F.; Thwaites, H. M.
This textbook was developed for use in a Concordia University (Quebec) course entitled "Communication Analysis of Environment." Designed as a practical application of information theory and cybernetics in the field of communication studies, the course is intended to be a self-instructional process, whereby each student chooses one…
Learning: An Evolutionary Analysis
Swann, Joanna
2009-01-01
This paper draws on the philosophy of Karl Popper to present a descriptive evolutionary epistemology that offers philosophical solutions to the following related problems: "What happens when learning takes place?" and "What happens in human learning?" It provides a detailed analysis of how learning takes place without any direct transfer of…
Stress Analysis of Composites.
1981-01-01
'Finite Elements in Nonlinear Mechanics, 1, 109-130, Tapir Publishers, Norway (1978). 9. A.J. Barnard and P.W. Sharman, 'Elastic-Plastic Analysis Using Hybrid Stress Finite Elements,' Finite Elements in Nonlinear Mechanics, 1, 131-148, Tapir Publishers, Norway (1978). ... Pian, 'Variational
Pavlovic, Dusko; Domenach, Florent; Ignatov, Dmitry I.; Poelmans, Jonas
2012-01-01
Formal Concept Analysis (FCA) begins from a context, given as a binary relation between some objects and some attributes, and derives a lattice of concepts, where each concept is given as a set of objects and a set of attributes, such that the first set consists of all objects that satisfy all
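As a toy sketch of this derivation (the naive exponential enumeration and all names below are illustrative assumptions, not the authors' algorithm), the two derivation operators can be composed to close every subset of objects into a formal concept:

```python
from itertools import combinations

def intent_of(objs, relation, attrs):
    """Attributes shared by every object in objs (the ' operator)."""
    return frozenset(m for m in attrs if all((g, m) in relation for g in objs))

def extent_of(attr_set, relation, objects):
    """Objects possessing every attribute in attr_set."""
    return frozenset(g for g in objects if all((g, m) in relation for m in attr_set))

def concepts(objects, attrs, relation):
    """All formal concepts (extent, intent), found by closing every subset
    of objects. Exponential in |objects| -- fine only for toy contexts."""
    found = set()
    for r in range(len(objects) + 1):
        for A in combinations(sorted(objects), r):
            intent = intent_of(A, relation, attrs)
            extent = extent_of(intent, relation, objects)
            found.add((extent, intent))
    return found

# Toy context: object 1 has attributes a and b; object 2 has only b.
ctx = {(1, 'a'), (1, 'b'), (2, 'b')}
cs = concepts({1, 2}, {'a', 'b'}, ctx)
print(sorted((sorted(e), sorted(i)) for e, i in cs))
# [([1], ['a', 'b']), ([1, 2], ['b'])]
```

Each pair in the output is closed both ways: its extent is exactly the set of objects satisfying its intent, and its intent is exactly the set of attributes common to its extent, which is the defining property of a concept in the lattice.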
Multidisciplinary System Reliability Analysis
Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)
2001-01-01
The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics, and electrical circuits, without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines is investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code, to successfully compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated, through a numerical example of a heat exchanger system involving failure modes in structural, heat transfer and fluid flow disciplines.
Deterministic uncertainty analysis
International Nuclear Information System (INIS)
Worley, B.A.
1987-01-01
Uncertainties of computer results are of primary interest in applications such as high-level waste (HLW) repository performance assessment in which experimental validation is not possible or practical. This work presents an alternate deterministic approach for calculating uncertainties that has the potential to significantly reduce the number of computer runs required for conventional statistical analysis. 7 refs., 1 fig
International Nuclear Information System (INIS)
1997-10-01
The improvement of safety in nuclear power stations is an important proposition. It is therefore important that the safety evaluation be carried out comprehensively and systematically, drawing on operational experience and on new safety-relevant knowledge throughout the period of use, as well as before the construction and start of operation of nuclear power stations. This report describes the results of a safety analysis of ''Fugen'' carried out in the light of the newest technical knowledge. As a result, it was confirmed that the safety of ''Fugen'' is secured by its inherent safety and by the facilities designed for securing safety. The basic approach to the safety analysis, including the guidelines to be conformed to, is described. For abnormal transient changes in operation and for accidents, their definition, the events to be evaluated and the standards for judgement are reported, and the matters taken into consideration in the analysis are shown. The computation programs used for the analysis were REACT, HEATUP, LAYMON, FATRAC, SENHOR, LOTRAC, FLOOD and CONPOL. The analyses of abnormal transient changes in operation and of accidents are reported with respect to causes, countermeasures, protective functions and results. (K.I.)
International Nuclear Information System (INIS)
Lima-e-Silva, Pedro Paulo de
1996-01-01
Conventional Risk Analysis (RA) usually relates the frequency of a certain undesired event to its consequences. This technique is used nowadays in Brazil to analyze accidents and their consequences strictly from the human standpoint, valuing losses of human equipment, human structures and human lives, without considering the damage caused to the natural resources that keep life possible on Earth. This paradigm developed primarily because of Homo sapiens' lack of perception of the natural web needed to sustain his own life. In reality, the Brazilian professionals responsible today for licensing, auditing and inspecting the environmental aspects of human activities face huge difficulties in drafting technical specifications and procedures that lead to acceptable levels of impact, all the more so given the intrinsic difficulty of defining those levels. Therefore, in Brazil the RA technique is a weak tool for licensing for many reasons, among them its narrow scope (only accident considerations) and its wrong paradigm (only direct human damages). A paper by the author on the former was already presented at the 7th International Conference on Environmetrics, July 1996, USP-SP. This one discusses the extension of the risk analysis concept to take environmental consequences into account, transforming the conventional analysis into a broader methodology named here Environmental Risk Analysis. (author)
Energy Technology Data Exchange (ETDEWEB)
Fudge, A.
1978-12-15
The following aspects of isotope dilution analysis are covered in this report: fundamental aspects of the technique; elements of interest in the nuclear field, choice and standardization of spike nuclide; pre-treatment to achieve isotopic exchange and chemical separation; sensitivity; selectivity; and accuracy.
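A simplified sketch of the core calculation behind the technique (a two-isotope model that ignores abundance-correction and mass-bias factors; the function name and the numbers are illustrative assumptions, not taken from the report): given the moles of reference isotope added as spike and the isotope ratios of spike, sample, and blend, the sample content follows from the usual dilution equation.

```python
def isotope_dilution(n_spike, r_spike, r_sample, r_mix):
    """Moles of the reference isotope in the sample, from the simplified
    isotope-dilution equation
        n = n_spike * (R_spike - R_mix) / (R_mix - R_sample),
    where each R is the ratio (isotope 1)/(isotope 2)."""
    return n_spike * (r_spike - r_mix) / (r_mix - r_sample)

# Consistency check with made-up amounts:
# sample: 0.2 mol isotope-1, 2.0 mol isotope-2  -> R_sample = 0.1
# spike:  5.0 mol isotope-1, 0.5 mol isotope-2  -> R_spike  = 10.0
r_mix = (0.2 + 5.0) / (2.0 + 0.5)   # blend ratio after isotopic exchange
print(round(isotope_dilution(0.5, 10.0, 0.1, r_mix), 6))  # 2.0, as constructed
```

The check recovers the 2.0 mol of isotope-2 put into the synthetic sample, illustrating why complete isotopic exchange before measurement (the pre-treatment step above) is essential: the equation assumes spike and sample share one blended ratio.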
Szapacs, Cindy
2006-01-01
Teaching strategies that work for typically developing children often do not work for those diagnosed with an autism spectrum disorder. However, teaching strategies that work for children with autism do work for typically developing children. In this article, the author explains how the principles and concepts of Applied Behavior Analysis can be…
Perfusion dyssynchrony analysis
Chiribiri, A.; Villa, A.D.M.; Sammut, E.; Breeuwer, M.; Nagel, E.
2015-01-01
AIMS: We sought to describe perfusion dyssynchrony analysis specifically to exploit the high temporal resolution of stress perfusion CMR. This novel approach detects differences in the temporal distribution of the wash-in of contrast agent across the left ventricular wall. METHODS AND RESULTS:
Proteoglycan isolation and analysis
DEFF Research Database (Denmark)
Woods, A; Couchman, J R
2001-01-01
Proteoglycans can be difficult molecules to isolate and analyze due to large mass, charge, and tendency to aggregate or form macromolecular complexes. This unit describes detailed methods for purification of matrix, cell surface, and cytoskeleton-linked proteoglycans. Methods for analysis...
Uranium and transuranium analysis
International Nuclear Information System (INIS)
Regnaud, F.
1989-01-01
Analytical chemistry of uranium, neptunium, plutonium, americium and curium is reviewed. Uranium and neptunium are mainly treated and curium is only briefly evoked. Analysis methods include coulometry, titration, mass spectrometry, absorption spectrometry, spectrofluorometry, X-ray spectrometry, nuclear methods and radiation spectrometry [fr
International Nuclear Information System (INIS)
Preyssl, C.
1986-01-01
Safety analysis provides the only tool for evaluation and quantification of rare or hypothetical events leading to system failure. So far probability theory has been used for the fault- and event-tree methodology. The phenomenon of uncertainties constitutes an important aspect in risk analysis. Uncertainties can be classified as originating from 'randomness' or 'fuzziness'. Probability theory addresses randomness only. The use of 'fuzzy set theory' makes it possible to include both types of uncertainty in the mathematical model of risk analysis. Thus the 'fuzzy fault tree' is expressed in 'possibilistic' terms, implying a range of simplifications and improvements. 'Human failure' and 'conditionality' can be treated correctly. Only minimum-maximum relations are used to combine the possibility distributions of events. Various event classifications facilitate the interpretation of the results. The method is demonstrated by application to a TRIGA research reactor. Uncertainty as an implicit part of 'fuzzy risk' can be quantified explicitly using an 'uncertainty measure'. Based on this, the 'degree of relative compliance' with a quantitative safety goal can be defined for a particular risk. The introduction of 'weighting functionals' guarantees the consideration of the importances attached to different parts of the risk exceeding or complying with the standard. The comparison of two reference systems is demonstrated in a case study. It is concluded that any application of the 'fuzzy risk analysis' has to be free of any hypostatization when reducing subjective to objective information. (Author)
Bayesian Independent Component Analysis
DEFF Research Database (Denmark)
Winther, Ole; Petersen, Kaare Brandt
2007-01-01
In this paper we present an empirical Bayesian framework for independent component analysis. The framework provides estimates of the sources, the mixing matrix and the noise parameters, and is flexible with respect to choice of source prior and the number of sources and sensors. Inside the engine...
Ignalina Safety Analysis Group
International Nuclear Information System (INIS)
Ushpuras, E.
1995-01-01
The article describes the fields of activity of the Ignalina NPP Safety Analysis Group (ISAG) in the Lithuanian Energy Institute and overviews the main achievements gained since the group's establishment in 1992. The group is working under the following guidelines: in-depth analysis of the fundamental physical processes of RBMK-1500 reactors; collection, systematization and verification of the design and operational data; simulation and analysis of potential accident consequences; analysis of thermohydraulic and neutronic characteristics of the plant; and provision of technical and scientific consultations to VATESI, governmental authorities, and international institutions participating in various projects aimed at Ignalina NPP safety enhancement. The ISAG is performing broad scientific co-operation programs with both Eastern and Western scientific groups, supplying engineering assistance for Ignalina NPP. ISAG is also participating in the joint Lithuanian - Swedish - Russian project Barselina, the first Probabilistic Safety Assessment (PSA) study of Ignalina NPP. Work is underway together with Maryland University (USA) on assessment of the accident confinement system for a range of breaks in the primary circuit. At present the ISAG personnel is also involved in a project under a grant from the Nuclear Safety Account, administered by the European Bank for Reconstruction and Development, for the preparation and review of an in-depth safety assessment of the Ignalina plant
Kane, Jonathan M
2016-01-01
This is a textbook on proof writing in the area of analysis, balancing a survey of the core concepts of mathematical proof with a tight, rigorous examination of the specific tools needed for an understanding of analysis. Instead of the standard "transition" approach to teaching proofs, wherein students are taught fundamentals of logic, given some common proof strategies such as mathematical induction, and presented with a series of well-written proofs to mimic, this textbook teaches what a student needs to be thinking about when trying to construct a proof. Covering the fundamentals of analysis sufficient for a typical beginning Real Analysis course, it never loses sight of the fact that its primary focus is about proof writing skills. This book aims to give the student precise training in the writing of proofs by explaining exactly what elements make up a correct proof, how one goes about constructing an acceptable proof, and, by learning to recognize a correct proof, how to avoid writing incorrect proofs. T...
Russian Language Analysis Project
Serianni, Barbara; Rethwisch, Carolyn
2011-01-01
This paper is the result of a language analysis research project focused on the Russian Language. The study included a diverse literature review that included published materials as well as online sources in addition to an interview with a native Russian speaker residing in the United States. Areas of study include the origin and history of the…
Douglas, David
2016-01-01
Doxing is the intentional public release onto the Internet of personal information about an individual by a third party, often with the intent to humiliate, threaten, intimidate, or punish the identified individual. In this paper I present a conceptual analysis of the practice of doxing and how it
Polysome Profile Analysis - Yeast
Czech Academy of Sciences Publication Activity Database
Pospíšek, M.; Valášek, Leoš Shivaya
2013-01-01
Vol. 530 (2013), pp. 173-181. ISSN 0076-6879. Institutional support: RVO:61388971. Keywords: grow yeast cultures * polysome profile analysis * sucrose density gradient centrifugation. Subject RIV: CE - Biochemistry. Impact factor: 2.194, year: 2013
1999-03-01
analysis that takes place in anatomy or circuit diagrams. The goal is to break an entity down into a set of non-overlapping parts, and to specify the...components. For example, one subject, in predicting the fate of different species, broke them into three types: animals that humans would save (e.g., gorillas
ATLAS Distributed Analysis Tools
Gonzalez de la Hoz, Santiago; Liko, Dietrich
2008-01-01
The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale: up to 10000 jobs were processed in one day. The experience obtained operating the system on several grid flavours was essential to performing user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase, data was registered in the LHC File Catalog (LFC) and replicated to external sites. For the main test, few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS management computing board decided to integrate the collaboration's efforts in distributed analysis into a single project, GANGA. The goal is to test the reconstruction and analysis software in a large-scale data production using grid flavours at several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting a...
Isaacson, Eugene
1994-01-01
This excellent text for advanced undergraduates and graduate students covers norms, numerical solution of linear systems and matrix factoring, iterative solutions of nonlinear equations, eigenvalues and eigenvectors, polynomial approximation, and other topics. It offers a careful analysis and stresses techniques for developing new methods, plus many examples and problems. 1966 edition.
Dyess, Susan Macleod
2011-12-01
This paper reports a concept analysis of faith. There are numerous scholars who consider spirituality and religiosity as they relate to health and nursing. Faith is often implied as linked to these concepts but deserves distinct exploration. In addition, as nursing practice conducted within communities of faith continues to emerge, concept clarification of faith is warranted. Qualitative analysis deliberately considered the concept of faith through the lens of Margaret Newman's health as expanding consciousness. Data sources included a secondary analysis of stories collected within a study conducted in 2008, two specific reconstructed stories, the identification of attributes noted within these various stories, and selected philosophical literature from 1950 to 2009. A definition was identified from the analysis: faith is an evolving pattern of believing that grounds and guides authentic living and gives meaning in the present moment of inter-relating. Four key attributes of faith were also identified: focusing on beliefs, foundational meaning for life, living authentically in accordance with beliefs, and interrelating with self, others and/or the Divine. Although a seemingly universal concept, faith was defined individually. Faith appeared to be broader than spiritual practices and religious ritual and became the very foundation that enabled human beings to make sense of their world and circumstances. More work is needed to understand how faith community nursing can expand the traditional understanding of denominationally defined faith community practices and how nurses can support faith for the individuals they encounter within all nursing practice. © 2011 Blackwell Publishing Ltd.
Don S. Stone; Joseph E. Jakes; Jonathan Puthoff; Abdelmageed A. Elmustafa
2010-01-01
Finite element analysis is used to simulate cone indentation creep in materials across a wide range of hardness, strain rate sensitivity, and work-hardening exponent. Modeling reveals that the commonly held assumption of the hardness strain rate sensitivity (m_H) equaling the flow stress strain rate sensitivity (m_σ...
Energy-Water Modeling and Analysis | Energy Analysis | NREL
NREL's energy-water modeling and analysis addresses energy sector vulnerabilities from various factors, including water. Example projects: Renewable Electricity Futures Study (ReEDS model analysis); U.S. Energy Sector Vulnerabilities to Climate Change and Extreme Weather.
Sensitivity analysis and related analysis : A survey of statistical techniques
Kleijnen, J.P.C.
1995-01-01
This paper reviews the state of the art in five related types of analysis, namely (i) sensitivity or what-if analysis, (ii) uncertainty or risk analysis, (iii) screening, (iv) validation, and (v) optimization. The main question is: when should which type of analysis be applied; which statistical
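As a hedged illustration of type (i) alone (the one-at-a-time scheme, the function names, and the toy model are assumptions of this sketch, not recommendations drawn from the survey), a minimal what-if analysis perturbs each input in turn and reports a normalized output change:

```python
def one_at_a_time(f, x0, rel_step=0.1):
    """Crude what-if sensitivity: perturb each input by +/-rel_step
    (relative) and return the central-difference elasticity of the output."""
    base = f(x0)
    sens = {}
    for i in range(len(x0)):
        hi, lo = list(x0), list(x0)
        hi[i] *= 1 + rel_step
        lo[i] *= 1 - rel_step
        sens[i] = (f(hi) - f(lo)) / (2 * rel_step * base)
    return sens

# Toy model y = a * b**2: elasticity 1 in a and 2 in b
model = lambda p: p[0] * p[1] ** 2
s = one_at_a_time(model, [2.0, 3.0])
print({k: round(v, 3) for k, v in s.items()})  # {0: 1.0, 1: 2.0}
```

One-at-a-time designs ignore interactions between inputs, which is precisely the gap the survey's screening and designed-experiment techniques are meant to close.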
Energy Technology Data Exchange (ETDEWEB)
Dwayne C. Kicker
2001-09-28
A statistical description of the probable block sizes formed by fractures around the emplacement drifts has been developed for each of the lithologic units of the repository host horizon. A range of drift orientations with the drift azimuth varied in 15° increments has been considered in the static analysis. For the quasi-static seismic analysis, and the time-dependent and thermal effects analysis, two drift orientations have been considered: a drift azimuth of 105° and the current emplacement drift azimuth of 75°. The change in drift profile resulting from progressive deterioration of the emplacement drifts has been assessed both with and without backfill. Drift profiles have been determined for four different time increments, including static (i.e., upon excavation), 200 years, 2,000 years, and 10,000 years. The effect of seismic events on rock fall has been analyzed. Block size distributions and drift profiles have been determined for three seismic levels, including a 1,000-year event, a 5,000-year event, and a 10,000-year event. Data developed in this modeling and analysis activity have been entered into the TDMS (DTN: MO0109RDDAAMRR.003). The following conclusions have resulted from this drift degradation analysis: (1) The available fracture data are suitable for supporting a detailed key block analysis of the repository host horizon rock mass. The available data from the north-south Main Drift and the east-west Cross Drift provide a sufficient representative fracture sample of the repository emplacement drift horizon. However, the Tptpln fracture data are only available from a relatively small section of the Cross Drift, resulting in a smaller fracture sample size compared to the other lithologic units. This results in a lower degree of confidence that the key block data based on the Tptpln data set are actually representative of the overall Tptpln key block population. (2) The seismic effect on the rock fall size distribution for all events
Energy Technology Data Exchange (ETDEWEB)
Kouzes, Richard T.; Zhu, Zihua
2011-09-12
The current focal point of the nuclear physics program at PNNL is the MAJORANA DEMONSTRATOR, and the follow-on Tonne-Scale experiment, a large array of ultra-low background high-purity germanium detectors, enriched in 76Ge, designed to search for neutrinoless double-beta decay (0νββ). This experiment requires the use of germanium isotopically enriched in 76Ge. The MAJORANA DEMONSTRATOR is a DOE- and NSF-funded project with a major science impact. The DEMONSTRATOR will utilize 76Ge from Russia, but for the Tonne-Scale experiment it is hoped that an alternate technology, possibly one under development at Nonlinear Ion Dynamics (NID), will be a viable, US-based, lower-cost source of separated material. Samples of separated material from NID require analysis to determine the isotopic distribution and impurities. DOE is funding NID through an SBIR grant to develop their separation technology for application to the Tonne-Scale experiment. The Environmental Molecular Sciences Laboratory (EMSL), a DOE user facility at PNNL, has the mass spectrometry instruments required for the isotopic measurements that are essential to quality assurance for the MAJORANA DEMONSTRATOR and for the development of the future separation technology required for the Tonne-Scale experiment. Samples of isotopically separated copper were provided by NID to PNNL in January 2011 and again in August 2011 for isotopic analysis as tests of the NID technology; the results of both analyses are reported here.
International Nuclear Information System (INIS)
Kunz, P.F.
1991-04-01
There are many tools used in analysis in High Energy Physics (HEP). They range from low-level tools, such as a programming language, to high-level tools, such as a detector simulation package. This paper discusses some aspects of these tools that are directly associated with the process of analyzing HEP data. Physics analysis tools cover the whole range from the simulation of the interactions of particles to the display and fitting of statistical data. For the purposes of this paper, the analysis process is broken down into five main stages, classified into the areas of generation, reconstruction, and analysis. Different detector groups use different terms for these stages, so it is useful to define what is meant by them in this paper. The particle generation stage is a simulation of the initial interaction, the production of particles, and the decay of the short-lived particles. The detector simulation stage simulates the behavior of an event in a detector. The track reconstruction stage performs pattern recognition on the measured or simulated space points, calorimeter information, etc., and reconstructs track segments of the original event. The event reconstruction stage takes the reconstructed tracks, along with particle identification information, and assigns masses to produce 4-vectors. Finally, the display and fit stage displays the statistical data accumulated in the preceding stages in the form of histograms, scatter plots, etc. The remainder of this paper considers what analysis tools are available today, and what one might expect in the future. For each stage, the integration of the tools with other stages and the portability of the tool are analyzed.
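The five stages described above can be sketched as a chain of simple functions. This is an illustrative toy only, not any real HEP framework: the event contents, the idealized (smearing-free) detector and tracking steps, and the pion mass hypothesis are all invented for the example.

```python
import math

def generate_event():
    """Stage 1 -- particle generation: simulate the initial interaction.
    Here the 'event' is just a list of momentum 3-vectors in GeV/c."""
    return {"particles": [(1.2, 0.3, 4.5), (0.4, 2.1, 0.9)]}

def simulate_detector(event):
    """Stage 2 -- detector simulation: particles -> measured space points.
    Idealized: no smearing, each particle yields one perfect hit."""
    event["hits"] = list(event["particles"])
    return event

def reconstruct_tracks(event):
    """Stage 3 -- track reconstruction: pattern recognition on hits.
    Idealized: a trivial one-to-one hit-to-track matching."""
    event["tracks"] = list(event["hits"])
    return event

def reconstruct_event(event, mass=0.13957):
    """Stage 4 -- event reconstruction: assign a mass hypothesis
    (charged pion here, in GeV/c^2) and build 4-vectors (E, px, py, pz)
    with E = sqrt(m^2 + |p|^2)."""
    event["four_vectors"] = [
        (math.sqrt(mass**2 + sum(c * c for c in p)), *p)
        for p in event["tracks"]
    ]
    return event

def display_and_fit(event):
    """Stage 5 -- display and fit: accumulate statistics to histogram
    (here we simply collect the reconstructed energies)."""
    return [fv[0] for fv in event["four_vectors"]]

# Run the full chain, one stage feeding the next.
energies = display_and_fit(
    reconstruct_event(reconstruct_tracks(simulate_detector(generate_event())))
)
```

The point of the sketch is the staged structure itself: each stage consumes the output of the previous one, so individual stages can be swapped (e.g., real data replacing the generation and detector simulation stages) without touching the rest of the chain.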
Żarnecki, Aleksander F.; Piotrowski, Lech W.; Mankiewicz, Lech; Małek, Sebastian
2012-05-01
GLORIA stands for “GLObal Robotic-telescopes Intelligent Array”. GLORIA will be the first free and open-access network of robotic telescopes in the world. It will be a Web 2.0 environment where users can do research in astronomy by observing with robotic telescopes, and/or by analyzing data that other users have acquired with GLORIA, or data from other free-access databases, like the European Virtual Observatory. The GLORIA project will define free standards, protocols, and methodology for controlling robotic telescopes and related instrumentation, for conducting so-called on-line experiments by scheduling observations in the telescope network, and for conducting so-called off-line experiments based on the analysis of astronomical meta-data produced by GLORIA or other databases. The Luiza analysis framework for GLORIA is based on the Marlin package developed for International Linear Collider (ILC) data analysis. HEP experiments have to deal with enormous amounts of data, and distributed data analysis is a must, so the Marlin framework concept seemed well suited to GLORIA's needs. The idea (and large parts of the code) taken from Marlin is that every computing task is implemented as a processor (module) that analyzes data stored in an internal data structure; any additional output it creates is added back to that collection. The advantage of such a modular approach is that it keeps things as simple as possible. Every single step of the full analysis chain that goes, e.g., from raw images to light curves can be processed separately, and the output of each step is still self-consistent and can be fed into the next step without any manipulation.
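The processor/collection pattern borrowed from Marlin can be illustrated with a minimal sketch. The class names, collection keys, and toy image data below are invented for illustration and do not reflect the actual Luiza or Marlin API:

```python
class Processor:
    """Base class: a processor reads from a shared collection store and
    appends any output it creates back into that same store."""
    def process(self, collections):
        raise NotImplementedError

class DarkFrameSubtraction(Processor):
    """First step of a raw-images-to-light-curves chain: calibrate the
    raw image by subtracting a dark frame, pixel by pixel."""
    def process(self, collections):
        raw = collections["raw_image"]
        dark = collections["dark_frame"]
        collections["calibrated_image"] = [p - d for p, d in zip(raw, dark)]

class PhotometryProcessor(Processor):
    """Next step: a toy aperture sum over the calibrated image,
    yielding one light-curve point."""
    def process(self, collections):
        img = collections["calibrated_image"]
        collections["light_curve_point"] = sum(img)

class Framework:
    """Runs the configured processors in order over the shared data
    structure, so each step's output is self-consistent input for the
    next step."""
    def __init__(self, processors):
        self.processors = processors

    def run(self, collections):
        for proc in self.processors:
            proc.process(collections)
        return collections

data = {"raw_image": [10, 12, 30, 11], "dark_frame": [2, 2, 2, 2]}
result = Framework([DarkFrameSubtraction(), PhotometryProcessor()]).run(data)
# result["calibrated_image"] -> [8, 10, 28, 9]; result["light_curve_point"] -> 55
```

Because each processor only sees the shared collection store, any single step can be rerun or replaced in isolation, which is the modularity advantage the abstract describes.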
Extended Testability Analysis Tool
Melcher, Kevin; Maul, William A.; Fulton, Christopher
2012-01-01
The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies of sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.
Multiparameter Cell Cycle Analysis.
Jacobberger, James W; Sramkoski, R Michael; Stefan, Tammy; Woost, Philip G
2018-01-01
Cell cycle cytometry and analysis are essential tools for studying cells of model organisms and natural populations (e.g., bone marrow). Methods have not changed much for many years. The simplest and most common protocol is DNA content analysis, which is extensively published and reviewed. The next most common protocol, 5-bromo-2-deoxyuridine S phase labeling detected by specific antibodies, is also well published and reviewed. More recently, S phase labeling using 5'-ethynyl-2'-deoxyuridine incorporation and a chemical reaction to label substituted DNA has been established as a basic, reliable protocol. Multiple antibody labeling to detect epitopes on cell cycle regulated proteins, which is what this chapter is about, is the most complex of these cytometric cell cycle assays, requiring knowledge of the chemistry of fixation, the biochemistry of antibody-antigen reactions, and spectral compensation. However, because this knowledge is relatively well presented methodologically in many papers and reviews, this chapter will present a minimal Methods section for one mammalian cell type and an extended Notes section, focusing on aspects that are problematic or not well described in the literature. Most of the presented work involves how to segment the data to produce a complete, progressive, and compartmentalized cell cycle analysis from early G1 to late mitosis (telophase). A more recent development, using fluorescent proteins fused with proteins or peptides that are degraded by ubiquitination during specific periods of the cell cycle, termed "Fucci" (fluorescent, ubiquitination-based cell cycle indicators) provide an analysis similar in concept to multiple antibody labeling, except in this case cells can be analyzed while living and transgenic organisms can be created to perform cell cycle analysis ex or in vivo (Sakaue-Sawano et al., Cell 132:487-498, 2007). This technology will not be discussed.
Complementing Gender Analysis Methods.
Kumar, Anant
2016-01-01
The existing gender analysis frameworks start with the premise that men and women are equal and should be treated equally. These frameworks emphasize equal distribution of resources between men and women and assume that this will bring equality, which is not always true. Despite equal distribution of resources, women tend to suffer and experience discrimination in many areas of their lives, such as the power to control resources within social relationships, and the need for emotional security and reproductive rights within interpersonal relationships. These frameworks hold that patriarchy as an institution plays an important role in women's oppression and exploitation, and that it is a barrier to their empowerment and rights. Thus, some think that by ensuring equal distribution of resources and empowering women economically, institutions like patriarchy can be challenged. These frameworks are based on a proposed equality principle that puts men and women in competing roles, so real equality will never be achieved. Contrary to the existing gender analysis frameworks, the Complementing Gender Analysis framework proposed by the author provides a new approach to gender analysis that not only recognizes the role of economic empowerment and equal distribution of resources but also suggests incorporating the concepts and roles of social capital, equity, and doing gender into gender analysis. It is based on a perceived equity principle, putting men and women in complementing roles that may lead to equality. In this article the author reviews the mainstream gender theories in development from the viewpoint of the complementary roles of gender. This alternative view is argued based on existing literature and an anecdote of observations made by the author. While criticizing the equality theory, the author offers equity theory for resolving the gender conflict by using the concepts of social and psychological capital.
Blind Analysis in Particle Physics
International Nuclear Information System (INIS)
Roodman, A
2003-01-01
A review of the blind analysis technique, as used in particle physics measurements, is presented. The history of blind analyses in physics is briefly discussed. Next, the dangers and the advantages of a blind analysis are described. Three distinct kinds of blind analysis in particle physics are presented in detail. Finally, the BABAR collaboration's experience with the blind analysis technique is discussed.
Proton-excited X-ray analysis
International Nuclear Information System (INIS)
Ma Xinpei
1986-04-01
The capability of proton-excited X-ray analysis for determining different elements in organisms is discussed, with examples of trace element analysis in the human body and in animal organisms, such as blood serum, urine, and hair. The sensitivity, accuracy, and multielement analysis capability are discussed, and the method's strong points for trace element analysis in biomedicine are explained.
DEFF Research Database (Denmark)
Larsen, Michael Holm
1999-01-01
This note introduces the IDEF0 modelling language (semantics and syntax), and associated rules and techniques, for developing structured graphical representations of a system or enterprise. Use of this standard for IDEF0 permits the construction of models comprising system functions (activities...... that require a modelling technique for the analysis, development, re-engineering, integration, or acquisition of information systems; and incorporate a systems or enterprise modelling technique into a business process analysis or software engineering methodology.This note is a summary of the Standard...... for Integration Definition for Function Modelling (IDEF0). I.e. the Draft Federal Information Processing Standards Publication 183, 1993, December 21, Announcing the Standard for Integration Definition for Function Modelling (IDEF0)....
International Nuclear Information System (INIS)
Tomar, B.S.
2016-01-01
In the present talk, the fundamentals of nuclear forensic investigations are discussed, followed by a detailed standard operating procedure (SOP) for nuclear forensic analysis. Characteristics such as dimensions, particle size, and elemental and isotopic composition help the nuclear forensic analyst in source attribution of the interdicted material, as the specifications of the nuclear materials used by different countries differ. The analysis of elemental composition can be done by SEM-EDS, XRF, a CHNS analyser, etc., depending upon the type of material. Often the trace constituents (analysed by ICP-AES, ICP-MS, AAS, etc.) provide valuable information about the processes followed during the production of the material. Likewise, the isotopic composition determined by thermal ionization mass spectrometry provides useful information about the enrichment of the nuclear fuel and hence its intended use.
Visualization analysis and design
Munzner, Tamara
2015-01-01
Visualization Analysis and Design provides a systematic, comprehensive framework for thinking about visualization in terms of principles and design choices. The book features a unified approach encompassing information visualization techniques for abstract data, scientific visualization techniques for spatial data, and visual analytics techniques for interweaving data transformation and analysis with interactive visual exploration. It emphasizes the careful validation of effectiveness and the consideration of function before form. The book breaks down visualization design according to three questions: what data users need to see, why users need to carry out their tasks, and how the visual representations proposed can be constructed and manipulated. It walks readers through the use of space and color to visually encode data in a view, the trade-offs between changing a single view and using multiple linked views, and the ways to reduce the amount of data shown in each view. The book concludes with six case stu...
Invitation to complex analysis
Boas, Ralph P
2010-01-01
Ideal for a first course in complex analysis, this book can be used either as a classroom text or for independent study. Written at a level accessible to advanced undergraduates and beginning graduate students, the book is suitable for readers acquainted with advanced calculus or introductory real analysis. The treatment goes beyond the standard material of power series, Cauchy's theorem, residues, conformal mapping, and harmonic functions by including accessible discussions of intriguing topics that are uncommon in a book at this level. The flexibility afforded by the supplementary topics and applications makes the book adaptable either to a short, one-term course or to a comprehensive, full-year course. Detailed solutions of the exercises both serve as models for students and facilitate independent study. Supplementary exercises, not solved in the book, provide an additional teaching tool. This second edition has been painstakingly revised by the author's son, himself an award-winning mathematical expositor...
Tohyama, Mikio
2015-01-01
What is this sound? What does that sound indicate? These are two questions frequently heard in daily conversation. Sound results from the vibrations of elastic media and in daily life provides informative signals of events happening in the surrounding environment. In interpreting auditory sensations, the human ear seems particularly good at extracting the signal signatures from sound waves. Although exploring auditory processing schemes may be beyond our capabilities, source signature analysis is a very attractive area in which signal-processing schemes can be developed using mathematical expressions. This book is inspired by such processing schemes and is oriented to signature analysis of waveforms. Most of the examples in the book are taken from data of sound and vibrations; however, the methods and theories are mostly formulated using mathematical expressions rather than by acoustical interpretation. This book might therefore be attractive and informative for scientists, engineers, researchers, and graduat...
Energy Technology Data Exchange (ETDEWEB)
Hopke, P K [Department of Chemistry, Clarkson Univ., Potsdam, NY (United States)
2000-07-01
As a consequence of various IAEA programmes to sample airborne particulate matter and determine its elemental composition, the participating research groups are accumulating data on the composition of the atmospheric aerosol. Ways of utilizing these data must be considered, in order to be certain that the data obtained are correct and that the information transmitted to others who may make decisions based on it is as representative and correct as possible. To both examine the validity of those data and extract appropriate information from them, a variety of data analysis methods must be utilized. The objective of this workbook is to provide a guide, with examples, to data analysis of airborne particle composition data using a spreadsheet program (Excel) and a personal-computer-based statistical package (StatGraphics).
Kass, Robert E; Brown, Emery N
2014-01-01
Continual improvements in data collection and processing have had a huge impact on brain research, producing data sets that are often large and complicated. By emphasizing a few fundamental principles, and a handful of ubiquitous techniques, Analysis of Neural Data provides a unified treatment of analytical methods that have become essential for contemporary researchers. Throughout the book ideas are illustrated with more than 100 examples drawn from the literature, ranging from electrophysiology, to neuroimaging, to behavior. By demonstrating the commonality among various statistical approaches the authors provide the crucial tools for gaining knowledge from diverse types of data. Aimed at experimentalists with only high-school level mathematics, as well as computationally-oriented neuroscientists who have limited familiarity with statistics, Analysis of Neural Data serves as both a self-contained introduction and a reference work.
In Silico Expression Analysis.
Bolívar, Julio; Hehl, Reinhard; Bülow, Lorenz
2016-01-01
Information on the specificity of cis-sequences enables the design of functional synthetic plant promoters that are responsive to specific stresses. Potential cis-sequences may be experimentally tested, however, correlation of genomic sequence with gene expression data enables an in silico expression analysis approach to bioinformatically assess the stress specificity of candidate cis-sequences prior to experimental verification. The present chapter demonstrates an example for the in silico validation of a potential cis-regulatory sequence responsive to cold stress. The described online tool can be applied for the bioinformatic assessment of cis-sequences responsive to most abiotic and biotic stresses of plants. Furthermore, a method is presented based on a reverted in silico expression analysis approach that predicts highly specific potentially functional cis-regulatory elements for a given stress.
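As a hedged illustration of the correlation idea behind in silico expression analysis, the toy sketch below checks whether a candidate cis-sequence (here CCGAC, the core of the DRE/CRT cold-responsive element) is enriched in promoters of stress-induced genes relative to non-induced ones. The promoter sequences are invented; a real analysis would use genome-scale promoter and expression data, the tool described in the chapter, and proper statistical testing.

```python
def motif_enrichment(promoters, motif):
    """For each expression class, compute the fraction of promoter
    sequences that contain the candidate cis-sequence at least once."""
    enrichment = {}
    for label, seqs in promoters.items():
        hits = sum(motif in seq for seq in seqs)
        enrichment[label] = hits / len(seqs)
    return enrichment

# Toy promoter sets grouped by observed expression behavior under cold
# stress; every sequence here is fabricated for the example.
promoters = {
    "cold_induced": ["AACCGACATGCT", "TTCCGACGGTAA", "GGCCGACTTACA"],
    "not_induced":  ["AATTGGCCATGC", "TTGACCGTTAAG", "GGCCTTAATACG"],
}

# CCGAC is the DRE/CRT core; a strong skew toward the induced class
# supports (but does not prove) stress specificity of the element.
enrichment = motif_enrichment(promoters, "CCGAC")
# enrichment -> {"cold_induced": 1.0, "not_induced": 0.0}
```

The reverted approach mentioned above runs the same correlation in the other direction: instead of scoring a given motif against expression classes, it scans the promoters of a stress-specific gene set for over-represented subsequences and reports those as candidate functional cis-elements.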
Leonard, Kathryn; Tari, Sibel; Hubert, Evelyne; Morin, Geraldine; El-Zehiry, Noha; Chambers, Erin
2018-01-01
Based on the second Women in Shape (WiSH) workshop held in Sirince, Turkey in June 2016, these proceedings offer the latest research on shape modeling and analysis and their applications. The 10 peer-reviewed articles in this volume cover a broad range of topics, including shape representation, shape complexity, and characterization in solving image-processing problems. While the first six chapters establish understanding in the theoretical topics, the remaining chapters discuss important applications such as image segmentation, registration, image deblurring, and shape patterns in digital fabrication. The authors in this volume are members of the WiSH network and their colleagues, and most were involved in the research groups formed at the workshop. This volume sheds light on a variety of shape analysis methods and their applications, and researchers and graduate students will find it to be an invaluable resource for further research in the area.