WorldWideScience

Sample records for source theory analysis

  1. Gravitation and source theory

    Yilmaz, H.

    1975-01-01

    Schwinger's source theory is applied to the problem of gravitation and its quantization. It is shown that, within the framework of flat space, the source-theory implementation leads to a violation of probability conservation. To avoid this difficulty one must introduce a curved space-time; hence the source concept may be said to necessitate the transition to a curved-space theory of gravitation. It is further shown that the curved-space theory of gravitation implied by the source theory is not equivalent to the conventional Einstein theory. The source concept leads to a different theory in which the gravitational field has a stress-energy tensor t^ν_μ that contributes to geometric curvatures.

  2. Analysis, Design and Implementation of an Embedded Realtime Sound Source Localization System Based on Beamforming Theory

    Arko Djajadi

    2009-12-01

    This project is intended to analyze, design and implement a real-time sound source localization system using a mobile robot as the platform. The implemented system uses 2 microphones as the sensors, an Arduino Duemilanove microcontroller board with an ATMega328p as the processor, two permanent-magnet DC motors as the actuators for the mobile robot, a servo motor as the actuator that rotates a webcam toward the location of the sound source, and a laptop/PC as the simulation and display medium. To find the position of a specific sound source, beamforming theory is applied to the system. Once the location of the sound source is detected and determined, either the mobile robot adjusts its position according to the direction of the sound source, or only the webcam rotates in the direction of the incoming sound, simulating the use of this system in a video conference. The integrated system has been tested, and the results show that it can localize in real time a sound source placed randomly on a half-circle area (0-180°) with a radius of 0.3 m - 3 m, with the system as the center point of the circle. Due to low ADC and processor speed, the best achievable angular resolution is still limited to 25°.
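
    The record does not detail the authors' algorithm, but the core of any two-microphone bearing estimate is the time difference of arrival (TDOA) between the sensors. The sketch below is a standard cross-correlation TDOA estimate in Python with illustrative parameters (sampling rate, spacing, the noise-burst test signal); it is a minimal stand-in for the geometry the abstract describes, not the paper's beamforming implementation.

```python
import numpy as np

def estimate_bearing(sig1, sig2, fs, spacing, c=343.0):
    """Bearing from the time delay between two microphones.

    A standard cross-correlation TDOA sketch (not the paper's beamformer).
    A positive delay of sig2 means the wavefront reached mic 1 first.
    """
    corr = np.correlate(sig2, sig1, mode="full")
    lag = np.argmax(corr) - (len(sig1) - 1)   # delay of sig2 in samples
    tau = lag / fs
    # Far-field plane wave: tau = (spacing / c) * sin(theta),
    # with theta measured from broadside toward mic 1.
    return np.degrees(np.arcsin(np.clip(tau * c / spacing, -1.0, 1.0)))

# Demo: a broadband burst arriving from 30 degrees off broadside.
rng = np.random.default_rng(1)
fs, d, c = 48000, 0.2, 343.0
delay = round(d * np.sin(np.radians(30.0)) / c * fs)   # ~14 samples
s = rng.standard_normal(fs // 10)                      # 100 ms noise burst
sig1 = s
sig2 = np.concatenate([np.zeros(delay), s[:-delay]])   # delayed copy
print(round(estimate_bearing(sig1, sig2, fs, d), 1))   # ~30.0
```

    The one-sample quantization of the lag is also why a slow ADC limits angular resolution, as the abstract notes for the embedded implementation.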

  3. Superradiance in the source theory

    Kim, Y.D.

    1979-01-01

    The radiative transition rate is formulated in a new approach within the framework of the source theory, which makes use of a vacuum persistence amplitude. It is also shown that the source theory can be applied effectively to determine the likelihood of superradiance, i.e. coherence in spontaneous emission. Since the source theory is applicable not only to electromagnetic interactions but also to many other interactions, it would be most interesting to inquire whether superradiance can occur in processes other than the electromagnetic radiative process, such as nuclear or gravitational processes. (Author)

  4. Non-equilibrium thermodynamics theory of econometric source discovery for large data analysis

    van Bergem, Rutger; Jenkins, Jeffrey; Benachenhou, Dalila; Szu, Harold

    2014-05-01

    Almost all consumer and firm transactions are conducted using computers, and as a result increasingly large amounts of data are available to analysts. The gold-standard techniques of economic data manipulation matured during a period of limited data access, and the new Large Data Analysis (LDA) paradigm we all face may quickly render obsolete most tools used by economists. When coupled with an increased availability of numerous unstructured, multi-modal data sets, the impending 'data tsunami' could have serious detrimental effects for economic forecasting, analysis, and research in general. Given this reality we propose a decision-aid framework for Augmented-LDA (A-LDA) - a synergistic approach to LDA which combines traditional supervised, rule-based Machine Learning (ML) strategies to iteratively uncover hidden sources in large data, artificial neural network (ANN) Unsupervised Learning (USL) at the minimum Helmholtz free energy for isothermal dynamic equilibrium, and the economic intuitions required to handle problems encountered when interpreting large amounts of financial or economic data. To make the ANN USL framework applicable to economics we define the temperature, entropy, and energy concepts in economics from the Boltzmann viewpoint of non-equilibrium molecular thermodynamics, and we define an information geometry on which the ANN can operate using USL to reduce information saturation. An exemplar of such a system representation is given for firm industry equilibrium. We demonstrate the traditional ML methodology in the economics context and leverage firm financial data to explore a frontier concept known as behavioral heterogeneity. Behavioral heterogeneity on the firm level can be imagined as a firm's interactions with different types of economic entities over time. These interactions could impose varying degrees of institutional constraints on a firm's business behavior. We specifically look at behavioral heterogeneity for firms

  5. Random matrix theory with an external source

    Brézin, Edouard

    2016-01-01

    This is the first book to show that the theory of the Gaussian random matrix is essential to understanding the universal correlations associated with random fluctuations, and to demonstrate that it is useful for evaluating universal topological quantities. We consider Gaussian random matrix models in the presence of a deterministic matrix source. In such models the correlation functions are known exactly for an arbitrary source and for any size of the matrices. The freedom given by the external source allows for various tunings to different classes of universality. The main interest is to use this freedom to compute various topological invariants for surfaces, such as the intersection numbers for curves drawn on a surface of given genus with marked points, Euler characteristics, and the Gromov–Witten invariants. A remarkable duality for the average of characteristic polynomials is essential for obtaining such topological invariants. The analysis is extended to nonorientable surfaces and to surfaces with boundaries.
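
    The band structure induced by the external source is easy to see numerically. The sketch below (illustrative, not from the book; the matrix size, source strength and the 0.4 test window are assumptions) adds a deterministic source with eigenvalues ±a to a GUE matrix normalized to a semicircle of radius 2; for large enough a the spectrum splits into two separated bands, one of the changes of universality class the freedom in the source allows.

```python
import numpy as np

def gue(n, rng):
    """n x n GUE sample, normalized so the spectrum has support ~[-2, 2]."""
    a = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return (a + a.conj().T) / (2 * np.sqrt(n))

# Deterministic source with eigenvalues +a and -a, each n/2 times.
n, a = 400, 2.5
rng = np.random.default_rng(0)
source = np.diag(np.repeat([a, -a], n // 2))

eig = np.linalg.eigvalsh(source + gue(n, rng))
print("support:", round(eig.min(), 2), "to", round(eig.max(), 2))
print("eigenvalues near zero:", int(np.sum(np.abs(eig) < 0.4)))  # ~0: two bands
```

    Reducing a below the critical value merges the two bands back into a single deformed semicircle.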

  6. Blind source separation theory and applications

    Yu, Xianchuan; Xu, Jindong

    2013-01-01

    A systematic exploration of both classic and contemporary algorithms in blind source separation with practical case studies. The book presents an overview of blind source separation, a relatively new signal processing method. Due to the multidisciplinary nature of the subject, the book has been written so as to appeal to an audience from very different backgrounds. Basic mathematical skills (e.g. in matrix algebra and the foundations of probability theory) are essential in order to understand the algorithms, although the book is written in an introductory, accessible style. This book offers
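
    As a concrete taste of the subject, the classic BSS demonstration (illustrative, not code from the book) mixes two independent signals with an unknown matrix and recovers them, up to permutation and scale, with scikit-learn's FastICA:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)

# Two independent sources: a sinusoid and a sawtooth.
S = np.c_[np.sin(2 * t), ((t * 1.3) % 1.0) - 0.5]

A = np.array([[1.0, 0.6], [0.4, 1.0]])    # unknown mixing matrix
X = S @ A.T                               # observed mixtures

ica = FastICA(n_components=2, random_state=0)
S_hat = ica.fit_transform(X)              # recovered sources (up to order/scale)

# Correlate recovered with true sources; each row should have one |r| near 1.
corr = np.corrcoef(S.T, S_hat.T)[:2, 2:]
print(np.round(np.abs(corr), 2))
```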

  7. Antenna theory: Analysis and design

    Balanis, C. A.

    The book's main objective is to introduce the fundamental principles of antenna theory and to apply them to the analysis, design, and measurements of antennas. In a description of antennas, the radiation mechanism is discussed along with the current distribution on a thin wire. Fundamental parameters of antennas are examined, taking into account the radiation pattern, radiation power density, radiation intensity, directivity, numerical techniques, gain, antenna efficiency, half-power beamwidth, beam efficiency, bandwidth, polarization, input impedance, and antenna temperature. Attention is given to radiation integrals and auxiliary potential functions, linear wire antennas, loop antennas, linear and circular arrays, self- and mutual impedances of linear elements and arrays, broadband dipoles and matching techniques, traveling wave and broadband antennas, frequency independent antennas and antenna miniaturization, the geometrical theory of diffraction, horns, reflectors and lens antennas, antenna synthesis and continuous sources, and antenna measurements.

  8. An overview of gravitational waves theory, sources and detection

    Auger, Gerard

    2017-01-01

    This book describes detection techniques used to search for and analyze gravitational waves (GW). It covers the whole domain of GW science, starting from the theory and ending with the experimental techniques (both present and future) used to detect them. The theoretical sections of the book address the theory of general relativity and of GW, followed by the theory of GW detection. The various sources of GW are described as well as the methods used to analyse them and to extract their physical parameters. It includes an analysis of the consequences of GW observations in terms of astrophysics as well as a description of the different detectors that exist and that are planned for the future. With the recent announcement of GW detection and the first results from LISA Pathfinder, this book will allow non-specialists to understand the present status of the field and the future of gravitational wave science.

  9. An analysis of natural gas exploration potential in the Qiongdongnan Basin by use of the theory of “joint control of source rocks and geothermal heat”

    Zhang Gongcheng

    2014-10-01

    The Oligocene Yacheng Fm contains the most important source rocks that have been confirmed by exploratory wells in the Qiongdongnan Basin. The efficiency of these source rocks is the key to a breakthrough in natural gas exploration in the study area. This paper analyzes the hydrocarbon potential of each sag in this basin from the perspective of joint control by source rocks and geothermal heat. Two types of source rocks occur in the Yacheng Fm, namely mudstone of transitional facies and mudstone of neritic facies. Both are dominated by kerogen of type III, followed by type II. Their organic matter abundances are controlled by the amount of continental clastic input: the mudstone of transitional facies is commonly higher in organic matter abundance, while that of neritic facies is lower. The coal-measure source rocks of transitional facies were mainly formed in environments such as delta plains, coastal plains and barrier tidal flat-marshes. Due to the control of Cenozoic lithosphere extension and the influence of neotectonism, the geothermal gradient, terrestrial heat flow value (HFV) and level of thermal evolution are generally high in deep water. This hot setting not only determines the predominance of gas generation in the deep-water sags, but can also promote shallow-buried source rocks in shallow water into the oil window to generate oil. In addition to promoting the hydrocarbon generation of source rocks, the high geothermal gradient and heat flow can also speed up the cracking of residual hydrocarbons, thus enhancing hydrocarbon generation efficiency and capacity. According to the theory of joint control of source quality and geothermal heat on hydrocarbon generation, we comprehensively evaluate and rank the exploration potentials of the major sags in the Qiongdongnan Basin. These sags are divided into 3 types, of which the type-I sags, including Yanan, Lingshui, Baodao, Ledong and Huaguang, have the highest hydrocarbon exploration potential.

  10. Functional analysis theory and applications

    Edwards, RE

    2011-01-01

    ""The book contains an enormous amount of information - mathematical, bibliographical and historical - interwoven with some outstanding heuristic discussions."" - Mathematical Reviews.In this massive graduate-level study, Emeritus Professor Edwards (Australian National University, Canberra) presents a balanced account of both the abstract theory and the applications of linear functional analysis. Written for readers with a basic knowledge of set theory, general topology, and vector spaces, the book includes an abundance of carefully chosen illustrative examples and excellent exercises at the

  11. Analysis of the orderly distribution of oil and gas fields in China based on the theory of co-control of source and heat

    Gongcheng Zhang

    2015-01-01

    Taking a hydrocarbon zone or a basin group as a unit, this paper analyzed the vertical hydrocarbon generation regularity of onshore and offshore oil and gas fields in China, based on the theory of co-control of source and heat. The results demonstrated that the hydrocarbon generation modes of oil and gas fields in China are orderly. First, the hydrocarbon zones in the southeastern China offshore area, including the East and South China Sea basins, are dominated by a single hydrocarbon generation mode, which appears as either single oil generation near shore or single gas generation offshore, controlled by both source and heat. Second, the eastern hydrocarbon zones, including the Bohai Bay, Songliao and Jianghan basins and the North and South Yellow Sea basins, are dominated by a two-layer hydrocarbon generation mode of "upper oil and lower gas". Third, the central hydrocarbon zones, including the Ordos, Sichuan and Chuxiong basins, are also dominated by the "upper oil and lower gas" two-layer hydrocarbon generation mode. In the Ordos Basin, oil is mainly generated in the Triassic, and gas is predominantly generated in the Paleozoic. In the Sichuan Basin, oil was discovered in the Jurassic, and gas was mostly discovered in the Sinian and Triassic. Fourth, the western hydrocarbon zones, such as the Junggar, Tarim and Qaidam basins, are dominated by a "sandwich" multi-layer mode. In summary, the theory of co-control of source and heat can be widely applied to oil and gas exploration all over China. Oil targets should be focused on the near-shore areas of the southeastern China sea, the upper strata in the eastern and central hydrocarbon zones, and the Ordovician, Permian and Paleogene strata in the western hydrocarbon zone, while gas targets should be focused on the offshore areas of the southeastern China sea and the Cambrian, Carboniferous, Jurassic and Quaternary strata in the western hydrocarbon zone. A pattern of

  12. Acoustic source localization : Exploring theory and practice

    Wind, Jelmer

    2009-01-01

    Over the past few decades, noise pollution has become an important issue in modern society. This has led to an increased effort in industry to reduce noise. Acoustic source localization methods determine the location and strength of the vibrations which are the cause of sound, based on measurements of

  13. Dimensional analysis in field theory

    Stevenson, P.M.

    1981-01-01

    Dimensional Transmutation (the breakdown of scale invariance in field theories) is reconciled with the commonsense notions of Dimensional Analysis. This makes possible a discussion of the meaning of the Renormalisation Group equations, completely divorced from the technicalities of renormalisation. As illustrations, I describe some very familiar QCD results in these terms.

  14. A theory of gradient analysis

    Braak, ter C.J.F.

    1988-01-01

    The theory of gradient analysis is presented in this chapter, in which the heuristic techniques are integrated with regression, calibration, ordination and constrained ordination as distinct, well-defined statistical problems. The various techniques used for each type of problem are classified into

  15. Mathematical theory of sedimentation analysis

    Fujita, Hiroshi; Van Rysselberghe, P

    1962-01-01

    Mathematical Theory of Sedimentation Analysis presents the flow equations for the ultracentrifuge. This book is organized into two parts encompassing six chapters that evaluate the systems of reacting components, the differential equations for the ultracentrifuge, and the case of negligible diffusion. The first chapters consider the Archibald method for molecular weight determination; pressure-dependent sedimentation; expressions for the refractive index and its gradient; relation between refractive index and concentration; and the analysis of Gaussian distribution. Other chapters deal with th

  16. Polar source analysis : technical memorandum

    2017-09-29

    The following technical memorandum describes the development, testing and analysis of various polar source data sets. The memorandum also includes recommendations for potential inclusion in future releases of AEDT. This memorandum is the final deliver...

  17. Hamiltonian analysis of Plebanski theory

    Buffenoir, E; Henneaux, M; Noui, K; Roche, Ph

    2004-01-01

    We study the Hamiltonian formulation of Plebanski theory in both the Euclidean and Lorentzian cases. A careful analysis of the constraints shows that the system is non-regular, i.e., the rank of the Dirac matrix is non-constant on the non-reduced phase space. We identify the gravitational and topological sectors which are regular subspaces of the non-reduced phase space. The theory can be restricted to the regular subspace which contains the gravitational sector. We explicitly identify first- and second-class constraints in this case. We compute the determinant of the Dirac matrix and the natural measure for the path integral of the Plebanski theory (restricted to the gravitational sector). This measure is the analogue of the Leutwyler-Fradkin-Vilkovisky measure of quantum gravity

  18. Sources of Wilhelm Johannsen's genotype theory.

    Roll-Hansen, Nils

    2009-01-01

    This paper describes the historical background and early formation of Wilhelm Johannsen's distinction between genotype and phenotype. It is argued that, contrary to a widely accepted interpretation (for instance, W. Provine, 1971. The Origins of Theoretical Population Genetics. Chicago: The University of Chicago Press; Mayr, 1973; F. B. Churchill, 1974. Journal of the History of Biology 7: 5-30; E. Mayr, 1982. The Growth of Biological Thought, Cambridge: Harvard University Press; J. Sapp, 2003. Genesis. The Evolution of Biology. New York: Oxford University Press), his concepts referred primarily to properties of individual organisms and not to statistical averages. Johannsen's concept of genotype was derived from the idea of species in the tradition of biological systematics from Linnaeus to de Vries: an individual belonged to a group - species, subspecies, elementary species - by representing a certain underlying type (S. Müller-Wille and V. Orel, 2007. Annals of Science 64: 171-215). Johannsen sharpened this idea theoretically in the light of recent biological discoveries, not least those of cytology. He tested and confirmed it experimentally by combining the methods of biometry, as developed by Francis Galton, with the individual selection method and pedigree analysis, as developed for instance by Louis Vilmorin. The term "genotype" was introduced in W. Johannsen's 1909 treatise (Elemente der Exakten Erblichkeitslehre. Jena: Gustav Fischer), but the idea of a stable underlying biological "type" distinct from observable properties was the core idea of his classical bean selection experiment published 6 years earlier (W. Johannsen, 1903. Ueber Erblichkeit in Populationen und reinen Linien. Ein Beitrag zur Beleuchtung schwebender Selektionsfragen, Jena: Gustav Fischer, pp. 58-59). The individual ontological foundation of population analysis was a self-evident presupposition in Johannsen's studies of heredity in populations from their start in the early 1890s till his

  19. Classical electromagnetic field theory in the presence of magnetic sources

    Chen, Wen-Jun; Li, Kang; Naón, Carlos

    2001-01-01

    Using two new well-defined 4-dimensional potential vectors, we formulate classical Maxwell field theory in a form which has manifest Lorentz covariance and SO(2) duality symmetry in the presence of magnetic sources. We set up a consistent Lagrangian for the theory. Then from the action principle we obtain both Maxwell's equations and the equation of motion of a dyon moving in the electromagnetic field.

  20. On the finite line source problem in diffusion theory

    Mikkelsen, T.; Troen, I.; Larsen, S.E.

    1981-09-01

    A simple formula for calculating dispersion from a continuous finite line source, placed at right angles to the mean wind direction, is derived on the basis of statistical theory. Comparison is made with the virtual source concept usually used, and this is shown to be correct only in the limit where the virtual time lag T_v is small compared to the timescale of the turbulence t_l. (Auth.)
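
    The record does not reproduce the formula, but the standard Gaussian-plume counterpart of a finite crosswind line source, obtained by integrating point-source plumes along the line, shows the structure such results take. The sketch below is that textbook expression with illustrative parameters, not the authors' statistical-theory derivation; the virtual-source approximation the abstract criticizes replaces the erf bracket by a single widened Gaussian.

```python
import numpy as np
from scipy.special import erf

def c_line(q, u, y, L, sigma_y, sigma_z):
    """Ground-level concentration from a finite crosswind line source.

    Textbook Gaussian-plume form (an illustrative stand-in, not the
    paper's statistical-theory result). q: emission rate per unit length;
    u: mean wind speed; L: line length; sigma_y, sigma_z: dispersion
    parameters evaluated at the receptor's downwind distance.
    """
    s = np.sqrt(2.0) * sigma_y
    spread = 0.5 * (erf((y + L / 2) / s) - erf((y - L / 2) / s))
    return 2.0 * q * spread / (np.sqrt(2.0 * np.pi) * u * sigma_z)

# Opposite the centre of a 200 m line vs. 500 m off to the side
# (sigma values are illustrative, as at some fixed downwind distance).
print(c_line(q=1.0, u=5.0, y=0.0, L=200.0, sigma_y=35.0, sigma_z=20.0))
print(c_line(q=1.0, u=5.0, y=500.0, L=200.0, sigma_y=35.0, sigma_z=20.0))
```

    For L much larger than sigma_y the bracket tends to 1 and the familiar infinite line source result is recovered.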

  1. Bifurcation and stability in Yang-Mills theory with sources

    Jackiw, R.

    1979-06-01

    A lecture is presented in which some recent work on solutions to classical Yang-Mills theory is discussed. The investigations summarized include the field equations with static, external sources. A pattern allowing a comprehensive description of the solutions, as well as the stability of dynamical systems, is covered. A list of open questions and problems for further research is given. 20 references

  2. Point sources and multipoles in inverse scattering theory

    Potthast, Roland

    2001-01-01

    Over the last twenty years, the growing availability of computing power has had an enormous impact on the classical fields of direct and inverse scattering. The study of inverse scattering, in particular, has developed rapidly with the ability to perform computational simulations of scattering processes and led to remarkable advances in a range of applications, from medical imaging and radar to remote sensing and seismic exploration. Point Sources and Multipoles in Inverse Scattering Theory provides a survey of recent developments in inverse acoustic and electromagnetic scattering theory. Focusing on methods developed over the last six years by Colton, Kirsch, and the author, this treatment uses point sources combined with several far-reaching techniques to obtain qualitative reconstruction methods. The author addresses questions of uniqueness, stability, and reconstructions for both two-and three-dimensional problems.With interest in extracting information about an object through scattered waves at an all-ti...

  3. Theories of police legitimacy – its sources and effects

    Pavla Homolová

    2018-04-01

    This review of theories of police legitimacy aims at introducing the subject with a multidisciplinary approach. It draws on criminological and sociological as well as psychological and institutional theories of legitimacy, in order to provide the reader with a rich framework in which the findings of the presented current empirical studies can be evaluated. Police legitimacy is conceived as a social phenomenon, closely related to social norms such as socially constructed police roles and models of policing. The prevailing normative model of police legitimacy in criminology is discussed in greater detail, including a critical outlook on procedural fairness as the assumed main source of empirical police legitimacy. Recent findings concerning legal socialization and theories of legitimization myths are highlighted in order to supplement the micro-level oriented criminological literature on police legitimacy. Possible future pathways of legitimacy research in criminology are discussed.

  4. A Meta-Analysis of Institutional Theories

    1989-06-01

    Keywords: institutional theory; isomorphism; administrative differentiation; diffusion of change; rational; unit of analysis. Institutional theory may lead to better decision making and evaluation criteria on the part of managers in the non-profit sector. Two propositions from institutional theory: 1) organizations evolving in environments with elaborated institutional rules create structures that conform to those rules; 2)

  5. The theory of magnetohydrodynamic wave generation by localized sources. I - General asymptotic theory

    Collins, William

    1989-01-01

    The magnetohydrodynamic wave emission from several localized, periodic, kinematically specified fluid velocity fields is calculated using Lighthill's method for finding the far-field wave forms. The waves propagate through an isothermal and uniform plasma with a constant B field. General properties of the energy flux are illustrated with models of pulsating flux tubes and convective rolls. Interference theory from geometrical optics is used to find the direction of minimum fast-wave emission from multipole sources and slow-wave emission from discontinuous sources. The distribution of total flux in fast and slow waves varies with the ratios of the source dimensions l to the acoustic and Alfvén wavelengths.

  6. Mathematical analysis, approximation theory and their applications

    Gupta, Vijay

    2016-01-01

    Designed for graduate students, researchers, and engineers in mathematics, optimization, and economics, this self-contained volume presents theory, methods, and applications in mathematical analysis and approximation theory. Specific topics include: approximation of functions by linear positive operators with applications to computer aided geometric design, numerical analysis, optimization theory, and solutions of differential equations. Recent and significant developments in approximation theory, special functions and q-calculus along with their applications to mathematics, engineering, and social sciences are discussed and analyzed. Each chapter enriches the understanding of current research problems and theories in pure and applied research.

  7. Antenna theory analysis and design

    Balanis, Constantine A

    2005-01-01

    The discipline of antenna theory has experienced vast technological changes. In response, Constantine Balanis has updated his classic text, Antenna Theory, offering the most recent look at all the necessary topics. New material includes smart antennas and fractal antennas, along with the latest applications in wireless communications. Multimedia material on an accompanying CD presents PowerPoint viewgraphs of lecture notes, interactive review questions, Java animations and applets, and MATLAB features. Like the previous editions, Antenna Theory, Third Edition meets the needs of e

  8. Fourier analysis in combinatorial number theory

    Shkredov, Il'ya D

    2010-01-01

    In this survey applications of harmonic analysis to combinatorial number theory are considered. Discussion topics include classical problems of additive combinatorics, colouring problems, higher-order Fourier analysis, theorems about sets of large trigonometric sums, results on estimates for trigonometric sums over subgroups, and the connection between combinatorial and analytic number theory. Bibliography: 162 titles.

  9. Fourier analysis in combinatorial number theory

    Shkredov, Il'ya D [M.V. Lomonosov Moscow State University, Moscow (Russian Federation)]

    2010-09-16

    In this survey applications of harmonic analysis to combinatorial number theory are considered. Discussion topics include classical problems of additive combinatorics, colouring problems, higher-order Fourier analysis, theorems about sets of large trigonometric sums, results on estimates for trigonometric sums over subgroups, and the connection between combinatorial and analytic number theory. Bibliography: 162 titles.

  10. Source Similarity and Social Media Health Messages: Extending Construal Level Theory to Message Sources.

    Young, Rachel

    2015-09-01

    Social media users post messages about health goals and behaviors to online social networks. Compared with more traditional sources of health communication such as physicians or health journalists, peer sources are likely to be perceived as more socially close or similar, which influences how messages are processed. This experimental study uses construal level theory of psychological distance to predict how mediated health messages from peers influence health-related cognition and behavioral intention. Participants were exposed to source cues that identified peer sources as being either highly attitudinally and demographically similar to or different from participants. As predicted by construal level theory, participants who perceived sources of social media health messages as highly similar listed a greater proportion of beliefs about the feasibility of health behaviors and a greater proportion of negative beliefs, while participants who perceived sources as more dissimilar listed a greater proportion of positive beliefs about the health behaviors. Results of the study could be useful in determining how health messages from peers could encourage individuals to set realistic health goals.

  11. Diffusion theory model for optimization calculations of cold neutron sources

    Azmy, Y.Y.

    1987-01-01

    Cold neutron sources are becoming increasingly important and common experimental facilities at many research reactors around the world, due to the high utility of cold neutrons in scattering experiments. The authors describe a simple two-group diffusion model of an infinite-slab LD2 cold source. The simplicity of the model permits an analytical solution to be obtained, from which one can deduce the reason for the optimum thickness based solely on diffusion-type phenomena. A second, more sophisticated model is also described and the results compared to a deterministic transport calculation. The good (particularly qualitative) agreement between the results suggests that diffusion theory methods can be used in parametric and optimization studies to avoid the generally more expensive transport calculations.

  12. Functional analysis, spectral theory, and applications

    Einsiedler, Manfred

    2017-01-01

    This textbook provides a careful treatment of functional analysis and some of its applications in analysis, number theory, and ergodic theory. In addition to discussing core material in functional analysis, the authors cover more recent and advanced topics, including Weyl’s law for eigenfunctions of the Laplace operator, amenability and property (T), the measurable functional calculus, spectral theory for unbounded operators, and an account of Tao’s approach to the prime number theorem using Banach algebras. The book further contains numerous examples and exercises, making it suitable for both lecture courses and self-study. Functional Analysis, Spectral Theory, and Applications is aimed at postgraduate and advanced undergraduate students with some background in analysis and algebra, but will also appeal to everyone with an interest in seeing how functional analysis can be applied to other parts of mathematics.

  13. Operator theory a comprehensive course in analysis, part 4

    Simon, Barry

    2015-01-01

    A Comprehensive Course in Analysis by Poincaré Prize winner Barry Simon is a five-volume set that can serve as a graduate-level analysis textbook with a lot of additional bonus information, including hundreds of problems and numerous notes that extend the text and provide important historical background. Depth and breadth of exposition make this set a valuable reference source for almost all areas of classical analysis. Part 4 focuses on operator theory, especially on a Hilbert space. Central topics are the spectral theorem, the theory of trace class and Fredholm determinants, and the study of

  14. Acoustic Source Localization and Beamforming: Theory and Practice

    Chen Joe C

    2003-01-01

    We consider the theoretical and practical aspects of locating acoustic sources using an array of microphones. A maximum-likelihood (ML) direct localization is obtained when the sound source is near the array, while in the far-field case, we demonstrate localization via cross bearings from several widely separated arrays. In the case of multiple sources, an alternating projection procedure is applied to determine the ML estimate of the directions of arrival (DOAs) from the observed data. The ML estimator is shown to be effective in locating sound sources of various types, for example, vehicle, music, and even white noise. From the theoretical Cramér-Rao bound analysis, we find that better source location estimates can be obtained for high-frequency signals than for low-frequency signals. In addition, a large range estimation error results when the source signal is unknown, but this unknown parameter does not have much impact on angle estimation. Extensive experimentally measured acoustic data was used to verify the proposed algorithms.
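
    The far-field cross-bearing step reduces to intersecting bearing lines from two separated arrays. Below is a minimal least-squares sketch of that geometry (positions and bearings are illustrative; this is not the paper's ML estimator):

```python
import numpy as np

def cross_bearing(p1, theta1, p2, theta2):
    """Locate a source from bearings measured at two array positions.

    p1, p2: array positions (x, y); theta1, theta2: bearings in radians
    measured from the x-axis. Solves p1 + t1*u1 = p2 + t2*u2 by least
    squares -- a sketch of the cross-bearing idea, not an ML fit.
    """
    u1 = np.array([np.cos(theta1), np.sin(theta1)])
    u2 = np.array([np.cos(theta2), np.sin(theta2)])
    A = np.column_stack([u1, -u2])
    t, *_ = np.linalg.lstsq(A, np.asarray(p2) - np.asarray(p1), rcond=None)
    return np.asarray(p1) + t[0] * u1

# Source at (60, 40) observed from arrays at the origin and at (100, 0).
src = np.array([60.0, 40.0])
p1, p2 = np.array([0.0, 0.0]), np.array([100.0, 0.0])
b1 = np.arctan2(*(src - p1)[::-1])
b2 = np.arctan2(*(src - p2)[::-1])
print(np.round(cross_bearing(p1, b1, p2, b2), 1))   # [60. 40.]
```

    With noisy bearings the same least-squares solve returns the closest consistent intersection, which is why widely separated arrays give the better range estimates noted in the abstract.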

  15. Methods of Fourier analysis and approximation theory

    Tikhonov, Sergey

    2016-01-01

    Different facets of the interplay between harmonic analysis and approximation theory are covered in this volume. The topics included are Fourier analysis, function spaces, optimization theory, partial differential equations, and their links to modern developments in approximation theory. The articles of this collection originated from two events. The first event took place during the 9th ISAAC Congress in Krakow, Poland, 5th-9th August 2013, in the section "Approximation Theory and Fourier Analysis". The second event was the conference on Fourier Analysis and Approximation Theory at the Centre de Recerca Matemàtica (CRM), Barcelona, during 4th-8th November 2013, organized by the editors of this volume. All articles selected to be part of this collection were carefully reviewed.

  16. Formal analysis of physical theories

    Dalla Chiara, M.L.; Toraldo di Francia, G.

    1979-01-01

    The rules of inference that are made use of in formalization are considered. It is maintained that a physical law represents the universal assertion of a probability, and not the assessment of the probability of a universal assertion. The precision of the apparatus used to collect the experimental evidence is introduced as an essential part of the theoretical structure of physics. This approach allows the author to define the concept of truth in a satisfactory way, abandoning the unacceptable notion of approximate truth. It is shown that a considerable amount of light can be shed on a number of much debated problems arising in the logic of quantum mechanics. It is stressed that the deductive structure of quantum theory seems to be essentially founded on a kind of mixture of different logics. Two different concepts of truth are distinguished within quantum theory, an empirical truth and quantum-logical truth. (Auth.)

  17. Complex space source theory of partially coherent light wave.

    Seshadri, S R

    2010-07-01

    The complex space source theory is used to derive a general integral expression for the vector potential that generates the extended full Gaussian wave in terms of the input value of the vector potential of the corresponding paraxial beam. The vector potential and the fields are assumed to fluctuate on a time scale that is large compared to the wave period. The Poynting vector in the propagation direction averaged over a wave period is expressed in terms of the cross-spectral density of the fluctuating vector potential across the input plane. The Schell model is assumed for the cross-spectral density. The radiation intensity distribution and the power radiated are determined. The effect of spatial coherence on the radiation intensity distribution and the radiated power are investigated for different values of the physical parameters. Illustrative numerical results are provided to bring out the effect of spatial coherence on the propagation characteristics of the fluctuating light wave.
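
    To see how the coherence parameters enter such calculations, here is a 1-D Gaussian Schell-model sketch (the standard textbook model the abstract invokes, not Seshadri's complex-space formulation; the wavelength, widths and grid are illustrative): discretize the cross-spectral density W(x1, x2) and evaluate the far-field radiant intensity J(theta) ~ v(theta)† W v(theta) with v = exp(i k x sin theta).

```python
import numpy as np

# 1-D Gaussian Schell-model source: W(x1, x2) = sqrt(S(x1) S(x2)) mu(x1 - x2),
# Gaussian spectral density S (width sigma_s), Gaussian coherence mu (width delta).
lam = 0.5e-6                          # wavelength (m), illustrative
k = 2 * np.pi / lam
sigma_s = 50e-6                       # source intensity width (m)
x = np.linspace(-250e-6, 250e-6, 400)
dx = x[1] - x[0]

def half_width_deg(delta):
    """Angle at which the radiant intensity falls to half its axial value."""
    X1, X2 = np.meshgrid(x, x, indexing="ij")
    W = (np.exp(-(X1**2 + X2**2) / (4 * sigma_s**2))
         * np.exp(-((X1 - X2)**2) / (2 * delta**2)))
    thetas = np.radians(np.linspace(0.0, 2.0, 400))
    # J(theta) ~ v^H W v with v_n = exp(i k x_n sin theta)
    J = np.array([(np.exp(-1j * k * np.sin(t) * x) @ W @
                   np.exp(1j * k * np.sin(t) * x)).real for t in thetas]) * dx**2
    return np.degrees(thetas[np.argmax(J <= J[0] / 2)])

for delta in (100e-6, 20e-6, 5e-6):   # nearly coherent -> nearly incoherent
    print(f"delta = {delta*1e6:5.0f} um -> half-width ~ {half_width_deg(delta):.3f} deg")
```

    The printed half-widths grow as the coherence length delta shrinks, the qualitative trend in the radiation intensity distribution that the paper quantifies.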

  18. Fixed point theory, variational analysis, and optimization

    Al-Mezel, Saleh Abdullah R; Ansari, Qamrul Hasan

    2015-01-01

    ""There is a real need for this book. It is useful for people who work in areas of nonlinear analysis, optimization theory, variational inequalities, and mathematical economics.""-Nan-Jing Huang, Sichuan University, Chengdu, People's Republic of China

  19. Blind source separation advances in theory, algorithms and applications

    Wang, Wenwu

    2014-01-01

    Blind Source Separation reports new results from the study of blind source separation (BSS). The book collects novel research ideas and tutorial material in BSS, independent component analysis (ICA), artificial intelligence and signal processing applications. Furthermore, research results previously scattered in many journals and conferences worldwide are methodically edited and presented in a unified form. The book is likely to be of interest to university researchers, R&D engineers and graduate students in computer science and electronics who wish to learn the core principles, methods, algorithms, and applications of BSS. Dr. Ganesh R. Naik works at the University of Technology, Sydney, Australia; Dr. Wenwu Wang works at the University of Surrey, UK.

  20. Decision analysis with cumulative prospect theory.

    Bayoumi, A M; Redelmeier, D A

    2000-01-01

    Individuals sometimes express preferences that do not follow expected utility theory. Cumulative prospect theory adjusts for some phenomena by using decision weights rather than probabilities when analyzing a decision tree. The authors examined how probability transformations from cumulative prospect theory might alter a decision analysis of a prophylactic therapy in AIDS, eliciting utilities from patients with HIV infection (n = 75) and calculating expected outcomes using an established Markov model. They next focused on transformations of three sets of probabilities: 1) the probabilities used in calculating standard-gamble utility scores; 2) the probabilities of being in discrete Markov states; 3) the probabilities of transitioning between Markov states. The same prophylaxis strategy yielded the highest quality-adjusted survival under all transformations. For the average patient, prophylaxis appeared relatively less advantageous when standard-gamble utilities were transformed. Prophylaxis appeared relatively more advantageous when state probabilities were transformed and relatively less advantageous when transition probabilities were transformed. Transforming standard-gamble and transition probabilities simultaneously decreased the gain from prophylaxis by almost half. Sensitivity analysis indicated that even near-linear probability weighting transformations could substantially alter quality-adjusted survival estimates. The magnitude of benefit estimated in a decision-analytic model can change significantly after using cumulative prospect theory. Incorporating cumulative prospect theory into decision analysis can provide a form of sensitivity analysis and may help describe when people deviate from expected utility theory.
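
    The probability transformation at the heart of this approach is easy to state concretely. Below is a sketch using the Tversky-Kahneman weighting function for gains (a standard CPT parameterization; the paper's exact form and parameters are not given in the record, and utilities here are raw outcomes for brevity):

```python
import numpy as np

def tk_weight(p, gamma=0.61):
    """Tversky-Kahneman (1992) probability weighting function."""
    return p**gamma / (p**gamma + (1 - p) ** gamma) ** (1 / gamma)

def cpt_value(outcomes, probs, gamma=0.61):
    """Cumulative prospect theory value of a gamble over gains only.

    Decision weights are differences of the transformed decumulative
    distribution; an illustrative sketch with identity utility.
    """
    order = np.argsort(outcomes)[::-1]          # rank outcomes best-first
    x = np.asarray(outcomes, float)[order]
    p = np.asarray(probs, float)[order]
    w = tk_weight(np.cumsum(p), gamma)
    weights = np.diff(np.concatenate([[0.0], w]))
    return float(np.sum(weights * x))

# A 10% chance of 10 extra quality-adjusted life months, else 1 month.
outcomes, probs = [10.0, 1.0], [0.1, 0.9]
print(np.dot(outcomes, probs))          # expected value: 1.9
print(cpt_value(outcomes, probs))       # ~2.7: the small chance is overweighted
```

    With gamma = 0.61 the 10% chance is overweighted, so the CPT value exceeds the expected value; the same kind of shift, applied inside a Markov model as the authors do, changes the apparent benefit of prophylaxis.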

  1. Spectral theory and nonlinear functional analysis

    Lopez-Gomez, Julian

    2001-01-01

    This Research Note addresses several pivotal problems in spectral theory and nonlinear functional analysis in connection with the analysis of the structure of the set of zeroes of a general class of nonlinear operators. It features the construction of an optimal algebraic/analytic invariant for calculating the Leray-Schauder degree, new methods for solving nonlinear equations in Banach spaces, and general properties of components of solutions sets presented with minimal use of topological tools. The author also gives several applications of the abstract theory to reaction diffusion equations and systems.The results presented cover a thirty-year period and include recent, unpublished findings of the author and his coworkers. Appealing to a broad audience, Spectral Theory and Nonlinear Functional Analysis contains many important contributions to linear algebra, linear and nonlinear functional analysis, and topology and opens the door for further advances.

  2. Evolution of source term definition and analysis

    Lutz, R.J. Jr.

    2004-01-01

    The objective of this presentation was to provide an overview of the evolution of accident fission product release analysis methodology and the results obtained, and of the implementation of source term analysis in regulatory decisions.

  3. Second-order generalized perturbation theory for source-driven systems

    Greenspan, E.; Gilai, D.; Oblow, E.M.

    1978-01-01

    A second-order generalized perturbation theory (GPT) for the effect of multiple system variations on a general flux functional in source-driven systems is derived. The derivation is based on a functional Taylor series in which second-order derivatives are retained. The resulting formulation accounts for the nonlinear effect of a given variation accurate to third order in the flux and adjoint perturbations. It also accounts for the effect of interaction between any number of variations. The new formulation is compared with exact perturbation theory as well as with perturbation theory for altered systems. The usefulness of the second-order GPT formulation is illustrated by applying it to optimization problems. Its applicability to areas of cross-section sensitivity analysis and system design and evaluation is also discussed.

  4. Evolutionary Game Theory Analysis of Tumor Progression

    Wu, Amy; Liao, David; Sturm, James; Austin, Robert

    2014-03-01

    Evolutionary game theory applied to two interacting cell populations can yield quantitative predictions of the future densities of the two populations based on the initial interaction terms. We will discuss how, in a complex ecology, evolutionary game theory successfully predicts the future densities of strains of stromal and cancer cells (multiple myeloma), and discuss the possible clinical use of such analysis for predicting cancer progression. Supported by the National Science Foundation and the National Cancer Institute.

  5. Probabilistic Structural Analysis Theory Development

    Burnside, O. H.

    1985-01-01

    The objective of the Probabilistic Structural Analysis Methods (PSAM) project is to develop analysis techniques and computer programs for predicting the probabilistic response of critical structural components for current and future space propulsion systems. This technology will play a central role in establishing system performance and durability. The first year's technical activity is concentrating on probabilistic finite element formulation strategy and code development. Work is also in progress to survey critical materials and Space Shuttle main engine components. The probabilistic finite element computer program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) is being developed. The final probabilistic code will have, in the general case, the capability of performing nonlinear dynamic analysis of stochastic structures. It is the goal of the approximate methods effort to increase problem-solving efficiency relative to finite element methods by using energy methods to generate trial solutions which satisfy the structural boundary conditions. These approximate methods will be less computer intensive than the finite element approach.

  6. Kepler's theory of force and his medical sources.

    Regier, Jonathan

    2014-01-01

    Johannes Kepler (1571-1630) makes extensive use of souls and spiritus in his natural philosophy. Recent studies have highlighted their importance in his accounts of celestial generation and astrology. In this study, I would like to address two pressing issues. The first is Kepler's context. The biological side of his natural philosophy is not naively Aristotelian. Instead, he is up to date with contemporary discussions in medically flavored natural philosophy. I will examine his relationship to Melanchthon's anatomical-theological Liber de anima (1552) and to Jean Fernel's very popular Physiologia (1567), two Galenic sources with a noticeable impact on how he understands the functions of life. The other issue that will direct my article is force at a distance. Medical ideas deeply inform Kepler's theories of light and solar force (virtus motrix). It will become clear that they are not a hindrance even to the hard core of his celestial physics. Instead, he makes use of soul and spiritus in order to develop a fully mathematized dynamics.

  7. Generalised perturbation theory and source of information through chemical measurements

    Lelek, V.; Marek, T.

    2001-01-01

    It is important to perform all analyses and collect all information from the operation of a new facility (which the transmutation demonstration unit will surely be), to verify that the operation corresponds to the forecast or to correct the equations describing the facility. The behaviour of the molten salt reactor, and in particular its system of measurement, is very different from that of a solid fuel reactor. Key information from the long-time kinetics could be nearly on-line knowledge of the fuel composition. In this work it is shown how to include such information in the control and to use such data for the correction of neutron cross-sections for the higher actinides or other characteristics. The problem of safety - the change of the boundary problem to the initial problem - is also mentioned. The problem is transformed into the generalised perturbation theory, in which the adjoint function is obtained through the solution of equations whose right-hand side has the form of a source. Such an approach should be a theoretical base for the calculation of the sensitivity coefficients. (authors)

  8. Theory-of-Mind Development Influences Suggestibility and Source Monitoring

    Bright-Paul, Alexandra; Jarrold, Christopher; Wright, Daniel B.

    2008-01-01

    According to the mental-state reasoning model of suggestibility, 2 components of theory of mind mediate reductions in suggestibility across the preschool years. The authors examined whether theory-of-mind performance may be legitimately separated into 2 components and explored the memory processes underlying the associations between theory of mind…

  9. Pangenesis as a source of new genetic information. The history of a now disproven theory.

    Bergman, Gerald

    2006-01-01

    Evolution is based on natural selection of existing biological phenotypic traits. Natural selection can only eliminate traits. It cannot create new ones, requiring a theory to explain the origin of new genetic information. The theory of pangenesis was a major attempt to explain the source of new genetic information required to produce phenotypic variety. This theory, advocated by Darwin as the main source of genetic variety, has now been empirically disproved. It is currently a theory mainly of interest to science historians.

  10. Theory and Application of DNA Histogram Analysis.

    Bagwell, Charles Bruce

    The underlying principles and assumptions associated with DNA histograms are discussed along with the characteristics of fluorescent probes. Information theory was described and used to calculate the information content of a DNA histogram. Two major types of DNA histogram analyses are proposed: parametric and nonparametric analysis. Three levels…

  11. Blind source separation dependent component analysis

    Xiang, Yong; Yang, Zuyuan

    2015-01-01

    This book provides readers with a complete and self-contained account of dependent source separation, including the latest developments in this field. The book gives an overview of blind source separation, in which three promising blind separation techniques that can tackle mutually correlated sources are presented. The book further focuses on the non-negativity based methods, the time-frequency analysis based methods, and the pre-coding based methods, respectively.

  12. Correspondence analysis theory, practice and new strategies

    Beh, Eric J

    2014-01-01

    A comprehensive overview of the internationalisation of correspondence analysis Correspondence Analysis: Theory, Practice and New Strategies examines the key issues of correspondence analysis, and discusses the new advances that have been made over the last 20 years. The main focus of this book is to provide a comprehensive discussion of some of the key technical and practical aspects of correspondence analysis, and to demonstrate how they may be put to use.  Particular attention is given to the history and mathematical links of the developments made. These links include not just those majo

  13. An Introduction to Wavelet Theory and Analysis

    Miner, N.E.

    1998-10-01

    This report reviews the history, theory and mathematics of wavelet analysis. Examination of the Fourier transform and short-time Fourier transform methods provides information about the evolution of the wavelet analysis technique. This overview is intended to provide readers with a basic understanding of wavelet analysis, define common wavelet terminology and describe wavelet analysis algorithms. The most common algorithms for performing efficient, discrete wavelet transforms for signal analysis and inverse discrete wavelet transforms for signal reconstruction are presented. This report is intended to be approachable by non-mathematicians, although a basic understanding of engineering mathematics is necessary.
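
    In its simplest (Haar) form, the discrete wavelet transform the report surveys reduces to pairwise averages and differences. A self-contained sketch of one analysis/synthesis level (illustrative, not code from the report):

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar discrete wavelet transform.

    Returns (approximation, detail) coefficients; len(x) must be even.
    """
    x = np.asarray(x, dtype=float)
    s = np.sqrt(2.0)
    approx = (x[0::2] + x[1::2]) / s     # low-pass: pairwise averages
    detail = (x[0::2] - x[1::2]) / s     # high-pass: pairwise differences
    return approx, detail

def haar_idwt(approx, detail):
    """Invert one level of the Haar DWT exactly."""
    s = np.sqrt(2.0)
    x = np.empty(2 * len(approx))
    x[0::2] = (approx + detail) / s
    x[1::2] = (approx - detail) / s
    return x

signal = np.array([4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0])
a, d = haar_dwt(signal)
print(a)                                      # smooth trend at half resolution
print(d)                                      # local changes (edges, noise)
print(np.allclose(haar_idwt(a, d), signal))   # True: perfect reconstruction
```

    Recursing on the approximation coefficients gives the multi-level transform used in practical signal analysis.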

  14. Dimensional analysis and group theory in astrophysics

    Kurth, Rudolf

    2013-01-01

    Dimensional Analysis and Group Theory in Astrophysics describes how dimensional analysis, refined by mathematical regularity hypotheses, can be applied to purely qualitative physical assumptions. The book focuses on the continuous spectra of the stars and the mass-luminosity relationship. The text discusses the technique of dimensional analysis, covering both relativistic phenomena and stellar systems. The book also explains the fundamental conclusion of dimensional analysis, wherein the unknown functions shall be given certain specified forms. The Wien and Stefan-Boltzmann Laws can be si

  15. Interior point algorithms theory and analysis

    Ye, Yinyu

    2011-01-01

    The first comprehensive review of the theory and practice of one of today's most powerful optimization techniques. The explosive growth of research into and development of interior point algorithms over the past two decades has significantly improved the complexity of linear programming and yielded some of today's most sophisticated computing techniques. This book offers a comprehensive and thorough treatment of the theory, analysis, and implementation of this powerful computational tool. Interior Point Algorithms provides detailed coverage of all basic and advanced aspects of the subject.

  16. Analysis of open source GIS software

    Božnis, Andrius

    2006-01-01

    GIS is one of the most promising spheres of information technology. GIS combines digital image analysis and database systems, which makes it widely applicable but demanding of advanced skills. There is a lot of commercial GIS software which is well advertised and whose functionality is fairly well known, while open-source software is often overlooked. This diploma thesis analyses the open-source GIS software available on the Internet, within the scope of different projects interr...

  17. Noncommutative analysis, operator theory and applications

    Cipriani, Fabio; Colombo, Fabrizio; Guido, Daniele; Sabadini, Irene; Sauvageot, Jean-Luc

    2016-01-01

    This book illustrates several aspects of the current research activity in operator theory, operator algebras and applications in various areas of mathematics and mathematical physics. It is addressed to specialists but also to graduate students in several fields including global analysis, Schur analysis, complex analysis, C*-algebras, noncommutative geometry, operator algebras, operator theory and their applications. Contributors: F. Arici, S. Bernstein, V. Bolotnikov, J. Bourgain, P. Cerejeiras, F. Cipriani, F. Colombo, F. D'Andrea, G. Dell'Antonio, M. Elin, U. Franz, D. Guido, T. Isola, A. Kula, L.E. Labuschagne, G. Landi, W.A. Majewski, I. Sabadini, J.-L. Sauvageot, D. Shoikhet, A. Skalski, H. de Snoo, D. C. Struppa, N. Vieira, D.V. Voiculescu, and H. Woracek.

  18. Rhetorical structure theory and text analysis

    Mann, William C.; Matthiessen, Christian M. I. M.; Thompson, Sandra A.

    1989-11-01

    Recent research on text generation has shown that there is a need for stronger linguistic theories that tell in detail how texts communicate. The prevailing theories are very difficult to compare, and it is also very difficult to see how they might be combined into stronger theories. To make comparison and combination a bit more approachable, we have created a book which is designed to encourage comparison. A dozen different authors or teams, all experienced in discourse research, are given exactly the same text to analyze. The text is an appeal for money by a lobbying organization in Washington, DC. It informs, stimulates and manipulates the reader in a fascinating way. The joint analysis is far more insightful than any one team's analysis alone. This paper is our contribution to the book. Rhetorical Structure Theory (RST), the focus of this paper, is a way to account for the functional potential of text, its capacity to achieve the purposes of speakers and produce effects in hearers. It also shows a way to distinguish coherent texts from incoherent ones, and identifies consequences of text structure.

  19. Information Foraging Theory: A Framework for Intelligence Analysis

    2014-11-01

    oceanographic information, human intelligence (HUMINT), open-source intelligence (OSINT), and information provided by other governmental departments [1][5... Abbreviations: HUMINT - human intelligence; IFT - information foraging theory; LSA - latent semantic similarity; MVT - marginal value theorem; OFT - optimal foraging theory; OSINT - open-source intelligence

  20. Crime analysis using open source information

    Nizamani, Sarwat; Memon, Nasrullah; Shah, Azhar Ali

    2015-01-01

    In this paper, we present a method of crime analysis from open-source information. We employed unsupervised methods of data mining to explore the facts regarding the crimes in an area of interest. The analysis is based on well-known clustering and association techniques. The results show...
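
    The clustering step of such a pipeline can be sketched in a few lines. In the snippet below, the toy incident snippets and the TF-IDF/k-means choice are assumptions (the paper's exact features and algorithms are not given in the record); it simply groups short open-source crime reports by topic.

```python
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

# Toy open-source incident snippets (hypothetical examples).
reports = [
    "armed robbery at a downtown bank, suspects fled by car",
    "bank robbery with a weapon reported in the city centre",
    "phishing emails stole online banking credentials",
    "fraudulent emails harvesting account passwords",
    "car stolen overnight from a residential street",
    "vehicle theft reported outside an apartment block",
]

X = TfidfVectorizer(stop_words="english").fit_transform(reports)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

for label, text in sorted(zip(labels, reports)):
    print(label, text[:50])
```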

  1. Global Sourcing of Heterogeneous Firms: Theory and Evidence

    Kohler, Wilhelm; Smolka, Marcel

    the Encuesta sobre Estrategias Empresariales (ESEE). We find a pattern of effects whereby productivity stimulates vertical integration in industries of low sourcing intensity, but favors outsourcing in industries of high sourcing intensity. Moreover, we find that productivity boosts offshoring throughout all...

  2. Theory and applications of numerical analysis

    Phillips, G M

    1996-01-01

    This text is a self-contained Second Edition, providing an introductory account of the main topics in numerical analysis. The book emphasizes both the theorems which show the underlying rigorous mathematics and the algorithms which define precisely how to program the numerical methods. Both theoretical and practical examples are included. Features: a unique blend of theory and applications; two brand new chapters on eigenvalues and splines; inclusion of formal algorithms; numerous fully worked examples; a large number of problems, many with solutions.

  3. Can one extract source radii from transport theories?

    Aichelin, J.

    1996-01-01

    To know the space-time evolution of a heavy ion reaction is of great interest, especially in cases where the measured spectra do not allow one to ascertain the underlying reaction mechanism. In recent times it has become popular to believe that the comparison of Hanbury-Brown Twiss correlation functions obtained from classical or semiclassical transport theories, like Boltzmann Uehling Uhlenbeck (BUU), Quantum Molecular Dynamics (QMD), VENUS or ARC, with experiments may provide this insight. It is the purpose of this article to show that this is not the case. None of these transport theories provides a reliable time evolution of those quantities which are mandatory for a correct calculation of the correlation function. The reason for this failure is different for the different transport theories. (author)

  4. Can one extract source radii from transport theories?

    Aichelin, J.

    1996-12-31

    To know the space-time evolution of a heavy ion reaction is of great interest, especially in cases where the measured spectra do not allow one to ascertain the underlying reaction mechanism. In recent times it has become popular to believe that the comparison of Hanbury-Brown Twiss correlation functions obtained from classical or semiclassical transport theories, like Boltzmann Uehling Uhlenbeck (BUU), Quantum Molecular Dynamics (QMD), VENUS or ARC, with experiments may provide this insight. It is the purpose of this article to show that this is not the case. None of these transport theories provides a reliable time evolution of those quantities which are mandatory for a correct calculation of the correlation function. The reason for this failure is different for the different transport theories. (author).

  5. Compositional Data Analysis Theory and Applications

    Pawlowsky-Glahn, Vera

    2011-01-01

    This book presents the state-of-the-art in compositional data analysis and will feature a collection of papers covering theory, applications to various fields of science, and software. Areas covered will range from geology, biology, environmental sciences, forensic sciences, medicine and hydrology. Key features: provides the state-of-the-art text in compositional data analysis; covers a variety of subject areas, from geology to medicine; written by leading researchers in the field; is supported by a website featuring R code.

  6. Characterizing Sources of Uncertainty in Item Response Theory Scale Scores

    Yang, Ji Seung; Hansen, Mark; Cai, Li

    2012-01-01

    Traditional estimators of item response theory scale scores ignore uncertainty carried over from the item calibration process, which can lead to incorrect estimates of the standard errors of measurement (SEMs). Here, the authors review a variety of approaches that have been applied to this problem and compare them on the basis of their statistical…

  7. Theory of nanolaser devices: Rate equation analysis versus microscopic theory

    Lorke, Michael; Skovgård, Troels Suhr; Gregersen, Niels

    2013-01-01

    A rate equation theory for quantum-dot-based nanolaser devices is developed. We show that these rate equations are capable of reproducing results of a microscopic semiconductor theory, making them an appropriate starting point for complex device simulations of nanolasers. The input...

  8. Mechanistic facility safety and source term analysis

    PLYS, M.G.

    1999-01-01

    A PC-based computer program was created for facility safety and source term analysis at Hanford. The program has been successfully applied to mechanistic prediction of source terms from chemical reactions in underground storage tanks, hydrogen combustion in double contained receiver tanks, and process evaluation including the potential for runaway reactions in spent nuclear fuel processing. Model features include user-defined facility rooms, flow path geometry, and heat conductors; user-defined non-ideal vapor and aerosol species; pressure- and density-driven gas flows; aerosol transport and deposition; and structure to accommodate facility-specific source terms. Example applications are presented here.

  9. Astrophysical data analysis with information field theory

    Enßlin, Torsten

    2014-01-01

    Non-parametric imaging and data analysis in astrophysics and cosmology can be addressed by information field theory (IFT), a means of Bayesian, data-based inference on spatially distributed signal fields. IFT is a statistical field theory, which permits the construction of optimal signal recovery algorithms. It exploits spatial correlations of the signal fields even for nonlinear and non-Gaussian signal inference problems. The alleviation of a perception threshold for recovering signals of unknown correlation structure by using IFT will be discussed in particular, as well as a novel improvement on instrumental self-calibration schemes. IFT can be applied to many areas. Here, applications in cosmology (cosmic microwave background, large-scale structure) and astrophysics (galactic magnetism, radio interferometry) are presented.
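
    As a concrete anchor for what "optimal signal recovery" means here (a standard IFT result stated in generic notation, not quoted from this record): for linear Gaussian data d = Rs + n, with signal covariance S and noise covariance N, the optimal reconstruction is the generalized Wiener filter,

    $$m = D\,j, \qquad D = \left(S^{-1} + R^{\dagger} N^{-1} R\right)^{-1}, \qquad j = R^{\dagger} N^{-1} d,$$

    where D is the posterior covariance (the "information propagator") and j the information source. Nonlinear and non-Gaussian problems generalize this result via perturbative or variational expansions.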

  12. Model Theory in Algebra, Analysis and Arithmetic

    Dries, Lou; Macpherson, H Dugald; Pillay, Anand; Toffalori, Carlo; Wilkie, Alex J

    2014-01-01

    Presenting recent developments and applications, the book focuses on four main topics in current model theory: 1) the model theory of valued fields; 2) undecidability in arithmetic; 3) NIP theories; and 4) the model theory of real and complex exponentiation. Young researchers in model theory will particularly benefit from the book, as will more senior researchers in other branches of mathematics.

  13. Legal Theory, Sources of Law and the Semantic Web

    Boer, A

    2009-01-01

    This work attempts to construct an integrated conceptual framework for the application-neutral and problem-neutral representation of sources of law, using Semantic Web technology and concepts together with some technically straightforward extensions to Semantic Web technology based on established practices found in fielded applications.

  14. Sourcing Premia with Incomplete Contracts: Theory and Evidence

    Kohler, Wilhelm; Smolka, Marcel

    2011-01-01

    We identify general conditions that result in an unambiguous mapping of a firm's productivity level into the organizational form and location of its input sourcing. We then explore a Spanish firm-level data set in order to establish a number of stylized facts about firm-level heterogeneity...

  15. Global Sourcing of Heterogeneous Firms: Theory and Evidence

    Kohler, Wilhelm; Smolka, Marcel

    2015-01-01

    The share of international trade within firm boundaries varies greatly across countries. This column presents new evidence on how the productivity of a firm affects the choice between vertical integration and outsourcing, as well as between foreign and domestic sourcing. The productivity effects...

  16. Source-system windowing for speech analysis

    Yegnanarayana, B.; Satyanarayana Murthy, P.; Eggen, J.H.

    1993-01-01

    In this paper we propose a speech-analysis method to bring out characteristics of the vocal tract system in short segments which are much less than a pitch period. The method performs windowing in the source and system components of the speech signal and recombines them to obtain a signal reflecting

  17. Isotopic neutron sources for neutron activation analysis

    Hoste, J.

    1988-06-01

    This User's Manual is an attempt to provide, for teaching and training purposes, a series of well-thought-out demonstrative experiments in neutron activation analysis based on the utilization of an isotopic neutron source. In some cases, these ideas can be applied to solve practical analytical problems. 19 refs, figs and tabs

  18. Constructivism theory analysis and application to curricula.

    Brandon, Amy F; All, Anita C

    2010-01-01

    Today's nursing programs are struggling to accommodate the changing needs of the health care environment and need to make changes in how students are taught. Using constructivism theory, whereby learning is an active process in which learners construct new ideas or concepts based upon their current or past knowledge, leaders in nursing education can make a paradigm shift toward concept-based curricula. This article presents a summary and analysis of constructivism and an innovative application of its active-learning principles to curriculum development, specifically for the education of nursing students.

  19. Turbulence in extended synchrotron radio sources. I. Polarization of turbulent sources. II. Power-spectral analysis

    Eilek, J.A.

    1989-01-01

    Recent theories of magnetohydrodynamic turbulence are used to construct microphysical turbulence models, with emphasis on models of anisotropic turbulence. These models have been applied to the determination of the emergent polarization from a resolved uniform source. It is found that depolarization alone is not a unique measure of the turbulence, and that the turbulence will also affect the total-intensity distributions. Fluctuations in the intensity image can thus be employed to measure turbulence strength. In the second part, it is demonstrated that a power-spectral analysis of the total and polarized intensity images can be used to obtain the power spectra of the synchrotron emission. 81 refs

  20. How Many Separable Sources? Model Selection In Independent Components Analysis

    Woods, Roger P.; Hansen, Lars Kai; Strother, Stephen

    2015-01-01

    Unlike mixtures consisting solely of non-Gaussian sources, mixtures including two or more Gaussian components cannot be separated using standard independent components analysis methods that are based on higher order statistics and independent observations. The mixed Independent Components Analysis/Principal Components Analysis (mixed ICA/PCA) model described here accommodates one or more Gaussian components in the independent components analysis model and uses principal components analysis to characterize contributions from this inseparable Gaussian subspace. Information theory can then be used to select from among potential model categories with differing numbers of Gaussian components. Based on simulation studies, the assumptions and approximations underlying the Akaike Information Criterion do not hold in this setting, even with a very large number of observations. Cross-validation is a suitable, though computationally intensive alternative for model selection. Application of the algorithm is illustrated using Fisher's iris data set and Howells' craniometric data set. Mixed ICA/PCA is of potential interest in any field of scientific investigation where the authenticity of blindly separated non-Gaussian sources might otherwise be questionable. Failure of the Akaike Information Criterion in model selection also has relevance in traditional independent components analysis where all sources are assumed non-Gaussian. PMID:25811988
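
    The mixed ICA/PCA algorithm itself is too involved to reproduce here, but the cross-validation idea the abstract recommends for model selection can be sketched with probabilistic PCA, whose per-sample log-likelihood scikit-learn exposes through PCA.score. This is a simplified stand-in for the authors' method; the data and dimension range are placeholders.

```python
# Cross-validated selection of subspace dimensionality via the held-out
# log-likelihood of probabilistic PCA (illustrates the model-selection
# idea only; this is not the authors' mixed ICA/PCA implementation).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Hypothetical data: 2 non-Gaussian + 3 Gaussian sources, linearly mixed.
S = np.c_[rng.laplace(size=(500, 2)), rng.normal(size=(500, 3))]
X = S @ rng.normal(size=(5, 5))

scores = {k: cross_val_score(PCA(n_components=k), X, cv=5).mean()
          for k in range(1, 5)}
print("held-out log-likelihoods:", scores)
print("selected dimensionality:", max(scores, key=scores.get))
```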

  1. THE RESPONSIBILITY TO PROTECT. A JUST WAR THEORY BASED ANALYSIS

    Andreea IANCU

    2014-11-01

    This paper analyzes the Responsibility to Protect principle as the paradigm that reinforces the just war theory in current international relations. The importance of this analysis is given by the fact that, in the current change in the sources of international conflict, the Responsibility to Protect principle affirms the responsibility of the international community to protect all the citizens of the world. In this context we witness a shift toward a Post-Westphalian international system, which values the individual as a security referent. This article discusses the origins of the Responsibility to Protect principle and problematizes the legitimacy of the use of violence and force in the current international system. Moreover, the paper analyzes the possible humanization of current international relations and, simultaneously, the persistence of conflict and warfare in the international system. The conclusion of this research states that the Responsibility to Protect principle revises the just war theory by centering it on the individual.

  2. Nonlinear analysis approximation theory, optimization and applications

    2014-01-01

    Many of our daily-life problems can be written in the form of an optimization problem. Therefore, solution methods are needed to solve such problems. Due to the complexity of the problems, it is not always easy to find the exact solution. However, approximate solutions can be found. The theory of the best approximation is applicable in a variety of problems arising in nonlinear functional analysis and optimization. This book highlights interesting aspects of nonlinear analysis and optimization together with many applications in the areas of physical and social sciences including engineering. It is immensely helpful for young graduates and researchers who are pursuing research in this field, as it provides abundant research resources for researchers and post-doctoral fellows. This will be a valuable addition to the library of anyone who works in the field of applied mathematics, economics and engineering.

  3. Perturbative analysis in higher-spin theories

    Didenko, V.E. [I.E. Tamm Department of Theoretical Physics, Lebedev Physical Institute,Leninsky prospect 53, 119991, Moscow (Russian Federation); Misuna, N.G. [Moscow Institute of Physics and Technology,Institutsky lane 9, 141700, Dolgoprudny, Moscow region (Russian Federation); Vasiliev, M.A. [I.E. Tamm Department of Theoretical Physics, Lebedev Physical Institute,Leninsky prospect 53, 119991, Moscow (Russian Federation)

    2016-07-28

    A new scheme of the perturbative analysis of the nonlinear HS equations is developed giving directly the final result for the successive application of the homotopy integrations which appear in the standard approach. It drastically simplifies the analysis and results from the application of the standard spectral sequence approach to the higher-spin covariant derivatives, allowing us in particular to reduce multiple homotopy integrals resulting from the successive application of the homotopy trick to a single integral. Efficiency of the proposed method is illustrated by various examples. In particular, it is shown how the Central on-shell theorem of the free theory immediately results from the nonlinear HS field equations with no intermediate computations.

  4. A Comparative Analysis of Three Unique Theories of Organizational Learning

    Leavitt, Carol C.

    2011-01-01

    The purpose of this paper is to present three classical theories on organizational learning and conduct a comparative analysis that highlights their strengths, similarities, and differences. Two of the theories -- experiential learning theory and adaptive-generative learning theory -- represent the thinking of the cognitive perspective, while…

  5. The liquidity preference theory: a critical analysis

    Giancarlo Bertocco; Andrea Kalajzic

    2014-01-01

    Keynes, in the General Theory, explains the monetary nature of the interest rate by means of the liquidity preference theory. The objective of this paper is twofold: first, to point out the limits of the liquidity preference theory; second, to present an explanation of the monetary nature of the interest rate based on the arguments with which Keynes responded to the criticism levelled at the liquidity preference theory by supporters of the loanable funds theory such as Ohlin and Robertson. It ...

  7. Identifying the Source of Misfit in Item Response Theory Models.

    Liu, Yang; Maydeu-Olivares, Alberto

    2014-01-01

    When an item response theory model fails to fit adequately, the items for which the model provides a good fit and those for which it does not must be determined. To this end, we compare the performance of several fit statistics for item pairs with known asymptotic distributions under maximum likelihood estimation of the item parameters: (a) a mean and variance adjustment to bivariate Pearson's X(2), (b) a bivariate subtable analog to Reiser's (1996) overall goodness-of-fit test, (c) a z statistic for the bivariate residual cross product, and (d) Maydeu-Olivares and Joe's (2006) M2 statistic applied to bivariate subtables. The unadjusted Pearson's X(2) with heuristically determined degrees of freedom is also included in the comparison. For binary and ordinal data, our simulation results suggest that the z statistic has the best Type I error and power behavior among all the statistics under investigation when the observed information matrix is used in its computation. However, if one has to use the cross-product information, the mean and variance adjusted X(2) is recommended. We illustrate the use of pairwise fit statistics in 2 real-data examples and discuss possible extensions of the current research in various directions.
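
    To make the flavor of statistic (c) concrete, a bivariate residual cross-product z compares the observed joint proportion for an item pair with its model-implied value. The toy version below ignores item-parameter estimation error, which is precisely what the paper's adjusted statistics account for, so it illustrates only the basic form.

```python
# Toy bivariate residual z for a binary item pair: compares the observed
# proportion of (1,1) responses with a model-implied probability.
# Ignores parameter-estimation error, unlike the statistics in the paper.
import numpy as np

def bivariate_residual_z(x_i, x_j, p_model):
    n = len(x_i)
    p_obs = np.mean(x_i * x_j)                 # observed cross-product moment
    se = np.sqrt(p_model * (1 - p_model) / n)  # binomial standard error
    return (p_obs - p_model) / se

rng = np.random.default_rng(1)
xi = rng.integers(0, 2, size=1000)
xj = rng.integers(0, 2, size=1000)
print(bivariate_residual_z(xi, xj, p_model=0.25))
```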

  8. The problem of electric sources in Einstein's Hermite-symmetric field theory

    Kreisel, E.

    1986-01-01

    The possibility of introducing a geometric source without breaking the A-invariance and Hermite symmetry of Einstein's Hermitian relativity is investigated. It would be very meaningful to interpret a source of this kind as an electric current. With this extension, Einstein's unitary field theory contains Einstein's gravitation, electromagnetism and the gluonic vacuum of chromodynamics. (author)

  9. Accounting for uncertain fault geometry in earthquake source inversions - I: theory and simplified application

    Ragon, Théa; Sladen, Anthony; Simons, Mark

    2018-05-01

    The ill-posed nature of earthquake source estimation derives from several factors including the quality and quantity of available observations and the fidelity of our forward theory. Observational errors are usually accounted for in the inversion process. Epistemic errors, which stem from our simplified description of the forward problem, are rarely dealt with despite their potential to bias the estimate of a source model. In this study, we explore the impact of uncertainties related to the choice of a fault geometry in source inversion problems. The geometry of a fault structure is generally reduced to a set of parameters, such as position, strike and dip, for one or a few planar fault segments. While some of these parameters can be solved for, more often they are fixed to an uncertain value. We propose a practical framework to address this limitation by following a previously implemented method exploring the impact of uncertainties on the elastic properties of our models. We develop a sensitivity analysis to small perturbations of fault dip and position. The uncertainties in fault geometry are included in the inverse problem under the formulation of the misfit covariance matrix that combines both prediction and observation uncertainties. We validate this approach with the simplified case of a fault that extends infinitely along strike, using both Bayesian and optimization formulations of a static inversion. If epistemic errors are ignored, predictions are overconfident in the data and source parameters are not reliably estimated. In contrast, inclusion of uncertainties in fault geometry allows us to infer a robust posterior source model. Epistemic uncertainties can be many orders of magnitude larger than observational errors for great earthquakes (Mw > 8). Not accounting for uncertainties in fault geometry may partly explain observed shallow slip deficits for continental earthquakes. Similarly, ignoring the impact of epistemic errors can also bias estimates of
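
    The key device described in the abstract is a misfit covariance that augments the observational covariance with a prediction term. A minimal linear-Gaussian sketch of that construction follows; the operators, dimensions and prior values are invented for illustration and are not the authors' code.

```python
# Sketch: fold geometry uncertainty into the misfit covariance,
# C_chi = C_d + K_mu C_mu K_mu^T, where K_mu holds sensitivities of the
# predicted data to uncertain geometry parameters (e.g., dip, position).
import numpy as np

rng = np.random.default_rng(0)
n_data, n_slip, n_geom = 20, 5, 2

G = rng.normal(size=(n_data, n_slip))      # forward operator: slip -> data
K_mu = rng.normal(size=(n_data, n_geom))   # sensitivity to geometry params
C_d = 0.01 * np.eye(n_data)                # observational covariance
C_mu = np.diag([0.5, 0.5])                 # prior geometry uncertainty

C_chi = C_d + K_mu @ C_mu @ K_mu.T         # combined misfit covariance

m_true = rng.normal(size=n_slip)
d = G @ m_true + rng.multivariate_normal(np.zeros(n_data), C_chi)

# Generalized least squares using the combined covariance:
W = np.linalg.inv(C_chi)
m_hat = np.linalg.solve(G.T @ W @ G, G.T @ W @ d)
print(np.round(m_hat - m_true, 2))
```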

  10. Challenges in combining different data sets during analysis when using grounded theory.

    Rintala, Tuula-Maria; Paavilainen, Eija; Astedt-Kurki, Päivi

    2014-05-01

    To describe the challenges in combining two data sets during grounded theory analysis. The use of grounded theory in nursing research is common. It is a suitable method for studying human action and interaction. It is recommended that many alternative sources of data are collected to create as rich a dataset as possible. Data were drawn from interviews with people with diabetes (n=19) and their family members (n=19), and the two data sets were combined during the analysis. When using grounded theory, there are numerous challenges in collecting and managing data, especially for the novice researcher. One challenge is to combine different data sets during the analysis. There are many methodological textbooks about grounded theory, but little has been written in the literature about combining different data sets. Discussion is needed on the management of data and the challenges of grounded theory. This article provides a means for combining different data sets in the grounded theory analysis process.

  11. Theoretical and methodological analysis of personality theories of leadership

    Оксана Григорівна Гуменюк

    2016-01-01

    The psychological analysis of personality theories of leadership, which are the basis for other conceptual approaches to understanding the nature of leadership, is conducted. The conceptual approach to leadership is analyzed taking into account the priority of personality theories, including: heroic, psychoanalytic, «trait» theory, charismatic and five-factor. It is noted that the psychological analysis of personality theories is important in understanding the nature of leadership.

  12. Transport perturbation theory in nuclear reactor analysis

    Nishigori, Takeo; Takeda, Toshikazu; Selvi, S.

    1985-01-01

    Perturbation theory is formulated on the basis of transport theory to obtain a formula for the reactivity changes due to possible variations of cross sections. Useful applications to cell homogenization are presented for the whole core calculation in transport and in diffusion theories. (author)

  13. Applications of model theory to functional analysis

    Iovino, Jose

    2014-01-01

    During the last two decades, methods that originated within mathematical logic, particularly set theory and model theory, have exhibited powerful applications to Banach space theory. This volume constitutes the first self-contained introduction to techniques of model theory in Banach space theory. The area of research has grown rapidly since this monograph's first appearance, but much of this material is still not readily available elsewhere. For instance, this volume offers a unified presentation of Krivine's theorem and the Krivine-Maurey theorem on stable Banach spaces, with emphasis on the

  14. The flow analysis of supercavitating cascade by linear theory

    Park, E.T. [Sung Kyun Kwan Univ., Seoul (Korea, Republic of); Hwang, Y. [Seoul National Univ., Seoul (Korea, Republic of)

    1996-06-01

    In order to reduce damage due to cavitation effects and to improve the performance of fluid machinery, supercavitation around the cascade and the hydraulic characteristics of the supercavitating cascade must be analyzed accurately. The study of the effects of cavitation on fluid machinery, and the analysis of the performance of a supercavitating hydrofoil through the various elements governing the flow field, are critically important. In this study, experimental results were compared with the computed results of a linear theory using the singularity method. Specifically, singularity points such as sources and vortices were distributed on the hydrofoil and free streamline to analyze the two-dimensional flow field of the supercavitating cascade; the governing equations of the flow field were derived, and the hydraulic characteristics of the cascade were calculated by numerical analysis of the governing equations. 7 refs., 6 figs.

  15. Data analysis and source modelling for LISA

    Shang, Yu

    2014-01-01

    Gravitational waves (GWs) are one of the most important predictions of general relativity. Besides the direct proof of the existence of GWs, there are already several ground-based detectors (such as LIGO, GEO, etc.) and a planned future space mission (LISA) which aim to detect GWs directly. GWs contain a large amount of information about their source; extracting this information can help us uncover the physical properties of the source, and even open a new window for understanding the Universe. Hence, GW data analysis will be a challenging task in seeking GWs. In this thesis, I present two works on data analysis for LISA. In the first work, we introduce an extended multimodal genetic algorithm which utilizes the properties of the signal and the detector response function to analyze data from the third round of the mock LISA data challenge. We have found all five sources present in the data and recovered the coalescence time, chirp mass, mass ratio and sky location with reasonable accuracy. As for the orbital angular momentum and the two spins of the black holes, we have found a large number of widely separated modes in the parameter space with similar maximum likelihood values. The performance of this method is comparable, if not superior, to already existing algorithms. In the second work, we introduce a new phenomenological waveform model for the extreme mass ratio inspiral (EMRI) system. This waveform consists of a set of harmonics with constant amplitude and slowly evolving phase, which we decompose in a Taylor series. We use these phenomenological templates to detect the signal in the simulated data and then, assuming a particular EMRI model, estimate the physical parameters of the binary with high precision. The results show that our phenomenological waveform is very practical in the data analysis of EMRI signals.

  16. Efficient image enhancement using sparse source separation in the Retinex theory

    Yoon, Jongsu; Choi, Jangwon; Choe, Yoonsik

    2017-11-01

    Color constancy is the feature of the human vision system (HVS) that ensures the relative constancy of the perceived color of objects under varying illumination conditions. The Retinex theory of machine vision systems is based on the HVS. Among Retinex algorithms, the physics-based algorithms are efficient; however, they generally do not satisfy the local characteristics of the original Retinex theory because they eliminate global illumination from their optimization. We apply the sparse source separation technique to the Retinex theory to present a physics-based algorithm that satisfies the locality characteristic of the original Retinex theory. Previous Retinex algorithms have limited use in image enhancement because the total variation Retinex results in an overly enhanced image and the sparse source separation Retinex cannot completely restore the original image. In contrast, our proposed method preserves the image edge and can very nearly replicate the original image without any special operation.
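
    For reference, the decomposition at stake can be shown with a minimal single-scale, log-domain Retinex split, where illumination is estimated by heavy Gaussian smoothing. This is a simplified stand-in to fix notation, not the sparse source separation algorithm the paper develops.

```python
# Minimal log-domain Retinex-style decomposition: log I = log L + log R,
# with illumination L estimated by Gaussian smoothing. A simplified
# stand-in for the paper's sparse-source-separation approach.
import numpy as np
from scipy.ndimage import gaussian_filter

def retinex_decompose(image, sigma=30.0, eps=1e-6):
    log_i = np.log(image + eps)
    log_l = gaussian_filter(log_i, sigma)  # smooth illumination estimate
    log_r = log_i - log_l                  # reflectance as the residual
    return np.exp(log_l), np.exp(log_r)

img = np.random.rand(64, 64)               # hypothetical grayscale image
L, R = retinex_decompose(img)
print(L.shape, R.shape)
```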

  17. Analysis of Multidimensional Poverty: Theory and Case Studies ...

    2009-08-18

    Aug 18, 2009 ... of applying a factorial technique, Multiple Correspondence Analysis, to poverty analysis. ... Analysis of Multidimensional Poverty: Theory and Case Studies ...

  18. The Mayaguez Incident: An Organizational Theory Analysis

    Lengel, Edward J; Rambo, Charles R; Rodriguez, Shelley A; Tyynismaa, Michael D

    2006-01-01

    Henry Mintzberg's structural contingency model and Lee Bolman and Terrence Deal's frames model within organizational theory are applied to the executive-level decisions made during the operation...

  19. Variational analysis of regular mappings theory and applications

    Ioffe, Alexander D

    2017-01-01

    This monograph offers the first systematic account of (metric) regularity theory in variational analysis. It presents new developments alongside classical results and demonstrates the power of the theory through applications to various problems in analysis and optimization theory. The origins of metric regularity theory can be traced back to a series of fundamental ideas and results of nonlinear functional analysis and global analysis centered around problems of existence and stability of solutions of nonlinear equations. In variational analysis, regularity theory goes far beyond the classical setting and is also concerned with non-differentiable and multi-valued operators. The present volume explores all basic aspects of the theory, from the most general problems for mappings between metric spaces to those connected with fairly concrete and important classes of operators acting in Banach and finite dimensional spaces. Written by a leading expert in the field, the book covers new and powerful techniques, whic...

  20. Organizational Theories and Analysis: A Feminist Perspective

    Irefin, Peace; Ifah, S. S.; Bwala, M. H.

    2012-06-01

    This paper is a critique of organization theories and their failure to come to terms with the fact of the reproduction of labour power within a particular form of the division of labour. It examines feminist theory, which aims to understand the nature of inequality and focuses on gender, power relations and sexuality. Part of the task of feminists, which organizational theories have neglected, is to offer an account of how the different treatments of the sexes operate in our culture. The paper concludes that gender has been completely neglected within organizational theory, which results in a rhetorical reproduction of males as the norm and women as others. It is recommended that only a radical form of organization theory can account for the situation of women in organisational settings.

  1. Coalescent: an open-source and scalable framework for exact calculations in coalescent theory

    2012-01-01

    Background: Currently, there is no open-source, cross-platform and scalable framework for coalescent analysis in population genetics. There is no scalable GUI-based user application either. Such a framework and application would not only drive the creation of more complex and realistic models but also make them truly accessible. Results: As a first attempt, we built a framework and user application for the domain of exact calculations in coalescent analysis. The framework provides an API with the concepts of model, data, statistic, phylogeny, gene tree and recursion. Infinite-alleles and infinite-sites models are considered. It defines pluggable computations such as counting and listing all the ancestral configurations and genealogies and computing the exact probability of data. It can visualize a gene tree, trace and visualize the internals of the recursion algorithm for further improvement, and attach dynamically a number of output processors. The user application defines jobs in a plug-in-like manner so that they can be activated, deactivated, installed or uninstalled on demand. Multiple jobs can be run and their inputs edited. Job inputs are persisted across restarts and running jobs can be cancelled where applicable. Conclusions: Coalescent theory plays an increasingly important role in analysing molecular population genetic data. Models involved are mathematically difficult and computationally challenging. An open-source, scalable framework that lets users immediately take advantage of the progress made by others will enable exploration of yet more difficult and realistic models. As models become more complex and mathematically less tractable, the need for an integrated computational approach is obvious. Object-oriented designs, though they have upfront costs, are practical now and can provide such an integrated approach. PMID:23033878
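
    For the infinite-alleles model named in the abstract, the exact sampling probability that such a framework computes recursively also has a classical closed form, the Ewens sampling formula, which makes a compact cross-check. The snippet below implements that standard formula; it is not code from the project.

```python
# Ewens sampling formula (infinite-alleles model): probability of an
# allele configuration a = (a_1, ..., a_n), where a_j alleles occur j
# times, in a sample of size n with scaled mutation rate theta.
from math import factorial, prod

def ewens_probability(a, theta):
    n = sum(j * a_j for j, a_j in enumerate(a, start=1))
    rising = prod(theta + i for i in range(n))   # rising factorial (theta)_n
    num = factorial(n) * prod(
        theta ** a_j / (j ** a_j * factorial(a_j))
        for j, a_j in enumerate(a, start=1)
    )
    return num / rising

# Sample of n=4 genes: two singleton alleles plus one allele seen twice.
print(ewens_probability([2, 1, 0, 0], theta=1.0))  # -> 0.25
```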

  3. Soprano and source: A laryngographic analysis

    Bateman, Laura Anne

    2005-04-01

    Popular music in the 21st century uses a particular singing quality for female voice that is quite different from the trained classical singing quality. Classical quality has been the subject of a vast body of research, whereas research that deals with non-classical qualities is limited. In order to learn more about these issues, the author chose to do research on singing qualities using a variety of standard voice quality tests. This paper looks at voice qualities found in various different styles of singing: Classical, Belt, Legit, R&B, Jazz, Country, and Pop. The data was elicited from a professional soprano and the voice qualities reflect industry standards. The data set for this paper is limited to samples using the vowel [i]. Laryngographic (LGG) data was generated simultaneously with the audio samples. This paper will focus on the results of the LGG analysis; however, an audio analysis was also performed using Spectrogram, LPC, and FFT. Data from the LGG is used to calculate the contact quotient, speed quotient, and ascending slope. The LGG waveform is also visually assessed. The LGG analysis gives insights into the source vibration for the different singing styles.

  4. An Analysis of Theories on Stock Returns

    Ahmet Sekreter

    2017-03-01

    The objective of this article is to provide an overview of the theories that have been developed for stock returns, an important area of research on financial markets. Since research in this field has been very active for the past quarter century, it is not possible to describe all the work that has been done in this area. The most important research will be discussed without going deeper into the mathematical tools and theories.

  5. Applying circular economy innovation theory in business process modeling and analysis

    Popa, V.; Popa, L.

    2017-08-01

    The overall aim of this paper is to develop a new conceptual framework for business process modeling and analysis using circular economy innovation theory as a source for business knowledge management. The last part of the paper presents the authors' proposed basic structure for new business models applying circular economy innovation theories. For people working on new innovative business models in the field of the circular economy, this paper provides new ideas for clustering their concepts.

  6. Does the source energy change when gravitational waves are emitted in Einstein's gravitation theory?

    Logunov, A.A.; Folomeshkin, V.N.

    1977-01-01

    It is shown that in Einstein's gravitation theory the total ''energy'' of a plane gravitational wave calculated with any pseudotensor is equal to zero. The known Einstein result, according to which the energy of a source is decreased when plane weak gravitational waves are emitted, has no place in Einstein's gravitation theory. Examples are given of exact wave solutions for which the pseudotensor is strictly equal to zero. The energy-momentum of any weak gravitational waves is always equal to zero in Einstein's gravitation theory. When such waves are emitted the energy of the source cannot change, although these waves are real curvature waves. By this means, in Einstein's gravitation theory the energy is, in essence, generated from nothing.

  7. Modern Theory of Gratings Resonant Scattering: Analysis Techniques and Phenomena

    Sirenko, Yuriy K

    2010-01-01

    Diffraction gratings are one of the most popular objects of analysis in electromagnetic theory. The requirements of applied optics and microwave engineering lead to many new problems and challenges for the theory of diffraction gratings, which force us to search for new methods and tools for their resolution. In Modern Theory of Gratings, the authors present results of the electromagnetic theory of diffraction gratings that will constitute the base of further development of this theory, which meet the challenges provided by modern requirements of fundamental and applied science. This volume covers: spectral theory of gratings (Chapter 1) giving reliable grounds for physical analysis of space-frequency and space-time transformations of the electromagnetic field in open periodic resonators and waveguides; authentic analytic regularization procedures (Chapter 2) that, in contradistinction to the traditional frequency-domain approaches, fit perfectly for the analysis of resonant wave scattering processes; paramet...

  8. Source-Free Exchange-Correlation Magnetic Fields in Density Functional Theory.

    Sharma, S; Gross, E K U; Sanna, A; Dewhurst, J K

    2018-03-13

    Spin-dependent exchange-correlation energy functionals in use today depend on the charge density and the magnetization density: E_xc[ρ, m]. However, it is also correct to define the functional in terms of the curl of m for physical external fields: E_xc[ρ, ∇×m]. The exchange-correlation magnetic field, B_xc, then becomes source-free. We study this variation of the theory by uniquely removing the source term from local and generalized gradient approximations to the functional. By doing so, the total Kohn-Sham moments are improved for a wide range of materials for both functionals. Significantly, the moments for the pnictides are now in good agreement with experiment. This source-free method is simple to implement in all existing density functional theory codes.
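
    The removal step can be summarized, in generic notation consistent with the abstract, as a Helmholtz decomposition: the source (divergence) part of B_xc is projected out by solving a Poisson equation,

    $$\nabla^2 \phi = \nabla \cdot \mathbf{B}_{xc}, \qquad \mathbf{B}_{xc}^{\mathrm{sf}} = \mathbf{B}_{xc} - \nabla \phi, \qquad \nabla \cdot \mathbf{B}_{xc}^{\mathrm{sf}} = 0,$$

    leaving only the curl-carrying part of the exchange-correlation field to act on the Kohn-Sham spins.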

  9. Concept analysis and the building blocks of theory: misconceptions regarding theory development.

    Bergdahl, Elisabeth; Berterö, Carina M

    2016-10-01

    The purpose of this article is to discuss the attempts to justify concept analysis as a way to construct theory - a notion often advocated in nursing. The notion that concepts are the building blocks or threads from which theory is constructed is often repeated. It can be found in many articles and well-known textbooks. However, this notion is seldom explained or defended. The notion of concepts as building blocks has also been questioned by several authors. However, most of these authors seem to agree to some degree that concepts are essential components from which theory is built. Discussion paper. Literature was reviewed to synthesize and debate current knowledge. Our point is that theory is not built by concept analysis or clarification, and we will show that this notion has its basis in some serious misunderstandings. We argue that concept analysis is not a part of sound scientific method and should be abandoned. The current methods of concept analysis in nursing have no foundation in the philosophy of science or in the philosophy of language. The type of concept analysis performed in nursing is not a way to 'construct' theory. Rather, theories are formed by creative endeavour to propose a solution to a scientific and/or practical problem. The bottom line is that the current style and form of concept analysis in nursing should be abandoned in favour of methods in line with modern theory of science. © 2016 John Wiley & Sons Ltd.

  10. The resolution of point sources of light as analyzed by quantum detection theory

    Helstrom, C. W.

    1972-01-01

    The resolvability of point sources of incoherent light is analyzed by quantum detection theory in terms of two hypothesis-testing problems. In the first, the observer must decide whether there are two sources of equal radiant power at given locations, or whether there is only one source of twice the power located midway between them. In the second problem, either one, but not both, of two point sources is radiating, and the observer must decide which it is. The decisions are based on optimum processing of the electromagnetic field at the aperture of an optical instrument. In both problems the density operators of the field under the two hypotheses do not commute. The error probabilities, determined as functions of the separation of the points and the mean number of received photons, characterize the ultimate resolvability of the sources.
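
    For reference, the minimum error probability in such a binary quantum hypothesis test is given by the Helstrom bound (a standard result, stated here in generic notation): for equiprobable hypotheses with density operators ρ0 and ρ1,

    $$P_e = \frac{1}{2}\left(1 - \frac{1}{2}\,\lVert \rho_1 - \rho_0 \rVert_1\right), \qquad \lVert A \rVert_1 = \operatorname{Tr}\sqrt{A^{\dagger} A},$$

    and the non-commutativity of the two density operators noted in the abstract is exactly what makes the optimal measurement nontrivial.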

  11. Resolution of point sources of light as analyzed by quantum detection theory.

    Helstrom, C. W.

    1973-01-01

    The resolvability of point sources of incoherent thermal light is analyzed by quantum detection theory in terms of two hypothesis-testing problems. In the first, the observer must decide whether there are two sources of equal radiant power at given locations, or whether there is only one source of twice the power located midway between them. In the second problem, either one, but not both, of two point sources is radiating, and the observer must decide which it is. The decisions are based on optimum processing of the electromagnetic field at the aperture of an optical instrument. In both problems the density operators of the field under the two hypotheses do not commute. The error probabilities, determined as functions of the separation of the points and the mean number of received photons, characterize the ultimate resolvability of the sources.

  12. Concept of spatial channel theory applied to reactor shielding analysis

    Williams, M.L.; Engle, W.W. Jr.

    1977-01-01

    The concept of channel theory is used to locate spatial regions that are important in contributing to a shielding response. The method is analogous to the channel-theory method developed for ascertaining important energy channels in cross-section analysis. The mathematical basis for the theory is shown to be the generalized reciprocity relation, and sample problems are given to exhibit and verify properties predicted by the mathematical equations. A practical example is cited from the shielding analysis of the Fast Flux Test Facility performed at Oak Ridge National Laboratory, in which a perspective plot of channel-theory results was found useful in locating streaming paths around the reactor cavity shield

  13. Spectral theory and nonlinear analysis with applications to spatial ecology

    Cano-Casanova, S; Mora-Corral , C

    2005-01-01

    This volume details some of the latest advances in spectral theory and nonlinear analysis through various cutting-edge theories on algebraic multiplicities, global bifurcation theory, non-linear Schrödinger equations, non-linear boundary value problems, large solutions, metasolutions, dynamical systems, and applications to spatial ecology. The main scope of the book is bringing together a series of topics that have evolved separately during the last decades around the common denominator of spectral theory and nonlinear analysis - from the most abstract developments up to the most concrete applications to population dynamics and socio-biology - in an effort to fill the existing gaps between these fields.

  14. Mean-deviation analysis in the theory of choice.

    Grechuk, Bogdan; Molyboha, Anton; Zabarankin, Michael

    2012-08-01

    Mean-deviation analysis, along with the existing theories of coherent risk measures and dual utility, is examined in the context of the theory of choice under uncertainty, which studies rational preference relations for random outcomes based on different sets of axioms such as transitivity, monotonicity, continuity, etc. An axiomatic foundation of the theory of coherent risk measures is obtained as a relaxation of the axioms of the dual utility theory, and a further relaxation of the axioms are shown to lead to the mean-deviation analysis. Paradoxes arising from the sets of axioms corresponding to these theories and their possible resolutions are discussed, and application of the mean-deviation analysis to optimal risk sharing and portfolio selection in the context of rational choice is considered. © 2012 Society for Risk Analysis.
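
    To fix what "mean-deviation analysis" evaluates (standard axioms for a general deviation measure in the Rockafellar-Uryasev sense, not text from the paper): random outcomes X are ranked through the pair (E[X], D(X)), where the deviation measure D satisfies

    $$D(X + c) = D(X)\ \ \forall c \in \mathbb{R}, \qquad D(\lambda X) = \lambda D(X)\ \ \forall \lambda > 0, \qquad D(X) \ge 0,$$

    with D(X) = 0 only for constant X, together with subadditivity D(X+Y) ≤ D(X) + D(Y). Standard deviation is the prototypical example.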

  15. What kind of theory is music theory? : Epistemological exercises in music theory and analysis

    2008-01-01

    Music theory has long aligned itself with the sciences - particularly with physics, mathematics, and experimental psychology - seeking to cloak itself in the mantle of their epistemological legitimacy. This affinity, which was foreshadowed in music's inclusion in the medieval quadrivium alongside geometry, astronomy, and arithmetic, is evident throughout the history of music theory from the scientific revolution onward. Yet, as eager as music theorists have been to claim the epistemological p...

  16. An introduction to nonlinear analysis and fixed point theory

    Pathak, Hemant Kumar

    2018-01-01

    This book systematically introduces the theory of nonlinear analysis, providing an overview of topics such as geometry of Banach spaces, differential calculus in Banach spaces, monotone operators, and fixed point theorems. It also discusses degree theory, nonlinear matrix equations, control theory, differential and integral equations, and inclusions. The book presents surjectivity theorems, variational inequalities, stochastic game theory and mathematical biology, along with a large number of applications of these theories in various other disciplines. Nonlinear analysis is characterised by its applications in numerous interdisciplinary fields, ranging from engineering to space science, hydromechanics to astrophysics, chemistry to biology, theoretical mechanics to biomechanics and economics to stochastic game theory. Organised into ten chapters, the book shows the elegance of the subject and its deep-rooted concepts and techniques, which provide the tools for developing more realistic and accurate models for ...

  17. Development of a noncompact source theory with applications to helicopter rotors

    Farassat, F.; Brown, T. J.

    1976-01-01

    A new formulation for determining the acoustic field of moving bodies, based on the acoustic analogy, is derived. The acoustic pressure is given as the sum of two integrals, one of which has a derivative with respect to time. The integrands are functions of the normal velocity and surface pressure of the body. A computer program based on this formulation was used to calculate acoustic pressure signatures for several helicopter rotors from experimental surface pressure data. Results are compared with those from compact source calculations. It is shown that noncompactness of steady sources on the rotor can account for the high harmonics of the pressure signature. Thickness noise is shown to be a significant source of sound, especially for blunt airfoils in regions where noncompact source theory should be applied.
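
    The "sum of two integrals, one of which has a derivative with respect to time" is characteristic of retarded-time surface formulations of the acoustic analogy. A representative simplified form, written from general knowledge rather than taken from the paper, is

    $$4\pi\, p'(\mathbf{x}, t) = \frac{\partial}{\partial t}\int_S \left[\frac{\rho_0 v_n}{r\,\lvert 1 - M_r \rvert}\right]_{\mathrm{ret}} \mathrm{d}S + \int_S \left[\frac{p \cos\theta}{r^2\,\lvert 1 - M_r \rvert}\right]_{\mathrm{ret}} \mathrm{d}S,$$

    with both integrands evaluated at retarded time; v_n is the surface normal velocity, p the surface pressure, r the source-observer distance and M_r the Mach number component along the radiation direction, matching the abstract's statement that the integrands depend only on normal velocity and surface pressure.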

  18. Finite element analysis theory and application with ANSYS

    Moaveni, Saeed

    2015-01-01

    For courses in Finite Element Analysis, offered in departments of Mechanical or Civil and Environmental Engineering. While many good textbooks cover the theory of finite element modeling, Finite Element Analysis: Theory and Application with ANSYS is the only text available that incorporates ANSYS as an integral part of its content. Moaveni presents the theory of finite element analysis, explores its application as a design/modeling tool, and explains in detail how to use ANSYS intelligently and effectively. Teaching and Learning Experience: This program will provide a better teaching and learning experience, for you and your students. It will help: *Present the Theory of Finite Element Analysis: The presentation of theoretical aspects of finite element analysis is carefully designed not to overwhelm students. *Explain How to Use ANSYS Effectively: ANSYS is incorporated as an integral part of the content throughout the book. *Explore How to Use FEA as a Design/Modeling Tool: Open-ended design problems help stude...

  19. Symmetry analysis for anisotropic field theories

    Parra, Lorena; Vergara, J. David

    2012-01-01

    The purpose of this paper is to study, with the help of Noether's theorem, the symmetries of anisotropic actions for arbitrary fields which generally depend on higher order spatial derivatives, and to find the corresponding current densities and the Noether charges. We study in particular scale invariance and consider the cases of higher derivative extensions of the scalar field, electrodynamics and Chern-Simons theory.
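
    For orientation, the first-order template that such an analysis generalizes (the standard Noether construction; the paper extends it to actions with higher-order spatial derivatives): if a symmetry variation changes the Lagrangian density by a total derivative, the conserved current and charge follow as

    $$\delta\mathcal{L} = \partial_{\mu} K^{\mu} \;\Rightarrow\; j^{\mu} = \frac{\partial \mathcal{L}}{\partial(\partial_{\mu}\phi)}\,\delta\phi - K^{\mu}, \qquad \partial_{\mu} j^{\mu} = 0 \ \text{on-shell}, \qquad Q = \int \mathrm{d}^{d}x\; j^{0}.$$

    In the anisotropic, higher-derivative case, the current picks up additional terms involving derivatives of the Lagrangian with respect to the higher spatial derivatives of the fields.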

  1. Addendum to foundations of multidimensional wave field signal theory: Gaussian source function

    Baddour, Natalie

    2018-02-01

    Many important physical phenomena are described by wave or diffusion-wave type equations. Recent work has shown that a transform domain signal description from linear system theory can give meaningful insight to multi-dimensional wave fields. In N. Baddour [AIP Adv. 1, 022120 (2011)], certain results were derived that are mathematically useful for the inversion of multi-dimensional Fourier transforms, but more importantly provide useful insight into how source functions are related to the resulting wave field. In this short addendum to that work, it is shown that these results can be applied with a Gaussian source function, which is often useful for modelling various physical phenomena.
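
    The convenience of a Gaussian source function comes from its closed-form N-dimensional Fourier transform (a standard identity, stated here for reference):

    $$\int_{\mathbb{R}^{N}} e^{-a\lvert\mathbf{x}\rvert^{2}}\, e^{-i\,\mathbf{k}\cdot\mathbf{x}}\; \mathrm{d}^{N}x = \left(\frac{\pi}{a}\right)^{N/2} e^{-\lvert\mathbf{k}\rvert^{2}/(4a)}, \qquad a > 0,$$

    so a Gaussian source remains Gaussian in the transform domain where the wave-field relations of the earlier work are applied.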

  2. Theory analysis of the Dental Hygiene Human Needs Conceptual Model.

    MacDonald, L; Bowen, D M

    2017-11-01

    Theories provide a structural knowing about concept relationships, practice intricacies, and intuitions, and thus shape the distinct body of the profession. Capturing ways of knowing and being is essential to any profession's practice, education and research. This process defines the phenomenon of the profession - its existence or experience. Theory evaluation is a systematic criterion-based assessment of a specific theory. This study presents a theory analysis of the Dental Hygiene Human Needs Conceptual Model (DH HNCM). Using the Walker and Avant Theory Analysis, a seven-step process, the DH HNCM was analysed and evaluated for its meaningfulness and contribution to dental hygiene. The steps include the following: (i) investigate the origins; (ii) examine relationships of the theory's concepts; (iii) assess the logic of the theory's structure; (iv) consider the usefulness to practice; (v) judge the generalizability; (vi) evaluate the parsimony; and (vii) appraise the testability of the theory. Human needs theory in nursing and Maslow's Hierarchy of Needs Theory prompted this theory's development. The DH HNCM depicts four concepts based on the paradigm concepts of the profession: client, health/oral health, environment and dental hygiene actions, and includes eleven validated human needs that evolved over time to eight. It is logical and parsimonious, allows scientific prediction and testing, and provides a unique lens for the dental hygiene practitioner. With this model, dental hygienists have entered practice knowing they enable clients to meet their human needs. For the DH HNCM, theory analysis affirmed that the model is reasonable and insightful and adds to the dental hygiene profession's epistemology and ontology. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  3. Particle production in field theories coupled to strong external sources, I: Formalism and main results

    Gelis, Francois; Venugopalan, Raju

    2006-01-01

    We develop a formalism for particle production in a field theory coupled to a strong time-dependent external source. An example of such a theory is the color glass condensate. We derive a formula, in terms of cut vacuum-vacuum Feynman graphs, for the probability of producing a given number of particles. This formula is valid to all orders in the coupling constant. The distribution of multiplicities is non-Poissonian, even in the classical approximation. We investigate an alternative method of calculating the mean multiplicity. At leading order, the average multiplicity can be expressed in terms of retarded solutions of classical equations of motion. We demonstrate that the average multiplicity at next-to-leading order can be formulated as an initial value problem by solving equations of motion for small fluctuation fields with retarded boundary conditions. The variance of the distribution can be calculated in a similar fashion. Our formalism therefore provides a framework to compute from first principles particle production in proton-nucleus and nucleus-nucleus collisions beyond leading order in the coupling constant and to all orders in the source density. We also provide a transparent interpretation (in conventional field theory language) of the well-known Abramovsky-Gribov-Kancheli (AGK) cancellations. Explicit connections are made between the framework for multi-particle production developed here and the framework of reggeon field theory

  4. Using Generalizability Theory to Disattenuate Correlation Coefficients for Multiple Sources of Measurement Error.

    Vispoel, Walter P; Morris, Carrie A; Kilinc, Murat

    2018-05-02

    Over the years, research in the social sciences has been dominated by reporting of reliability coefficients that fail to account for key sources of measurement error. Use of these coefficients, in turn, to correct for measurement error can hinder scientific progress by misrepresenting true relationships among the underlying constructs being investigated. In the research reported here, we addressed these issues using generalizability theory (G-theory) in both traditional and new ways to account for the three key sources of measurement error (random-response, specific-factor, and transient) that affect scores from objectively scored measures. Results from 20 widely used measures of personality, self-concept, and socially desirable responding showed that conventional indices consistently misrepresented reliability and relationships among psychological constructs by failing to account for key sources of measurement error and correlated transient errors within occasions. The results further revealed that G-theory served as an effective framework for remedying these problems. We discuss possible extensions in future research and provide code from the computer package R in an online supplement to enable readers to apply the procedures we demonstrate to their own research.
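
    The correction itself takes the classical disattenuation form, with generalizability coefficients standing in for conventional reliabilities. A minimal sketch, with illustrative placeholder values rather than the authors' data:

```python
# Disattenuate an observed correlation using generalizability (G)
# coefficients that account for random-response, specific-factor and
# transient error. Input values are illustrative placeholders.
import math

def disattenuate(r_xy, g_xx, g_yy):
    """Estimated true-score correlation: r_xy / sqrt(g_xx * g_yy)."""
    return r_xy / math.sqrt(g_xx * g_yy)

print(disattenuate(r_xy=0.42, g_xx=0.80, g_yy=0.75))  # ~0.54
```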

  5. Spectral analysis and filter theory in applied geophysics

    Buttkus, Burkhard

    2000-01-01

    This book is intended to be an introduction to the fundamentals and methods of spectral analysis and filter theory and their applications in geophysics. The principles and theoretical basis of the various methods are described, their efficiency and effectiveness evaluated, and instructions provided for their practical application. Besides the conventional methods, newer methods are discussed, such as the spectral analysis of random processes by fitting models to the observed data, maximum-entropy spectral analysis and maximum-likelihood spectral analysis, the Wiener and Kalman filtering methods, homomorphic deconvolution, and adaptive methods for nonstationary processes. Multidimensional spectral analysis and filtering, as well as multichannel filters, are given extensive treatment. The book provides a survey of the state-of-the-art of spectral analysis and filter theory. The importance and possibilities of spectral analysis and filter theory in geophysics for data acquisition, processing an...
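
    As a flavor of two of the methods surveyed, the following sketch computes a raw FFT periodogram and applies a crude frequency-domain Wiener filter H = S_signal / (S_signal + S_noise). The toy signal, noise level and noise-spectrum estimate are assumptions for illustration, not examples taken from the book.

    ```python
    import numpy as np

    # Toy record: a 5 Hz sinusoid in unit-variance white noise.
    rng = np.random.default_rng(0)
    n, dt = 1024, 0.01
    t = np.arange(n) * dt
    x = np.sin(2 * np.pi * 5.0 * t) + rng.normal(0.0, 1.0, n)

    # Periodogram: squared FFT magnitude, the most basic spectral estimate.
    X = np.fft.rfft(x)
    psd = np.abs(X) ** 2 / n
    freqs = np.fft.rfftfreq(n, dt)

    # Crude Wiener filter H = S_signal / (S_signal + S_noise): take the
    # median periodogram level as a flat noise-spectrum estimate and treat
    # any excess power as signal.
    s_noise = np.median(psd)
    s_signal = np.clip(psd - s_noise, 0.0, None)
    x_filtered = np.fft.irfft(s_signal / (s_signal + s_noise) * X, n)

    print(f"spectral peak at {freqs[np.argmax(psd)]:.2f} Hz")  # near 5 Hz
    ```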

  6. An analysis of the transformational leadership theory | Moradi ...

    An analysis of the transformational leadership theory. ... at all levels of the organization also feel the need to cooperate with others to achieve the desired results. ... intellectual stimulation; inspirational motivation; personal considerations ...

  7. Sources of Currency Crisis: An Empirical Analysis

    Weber, Axel A.

    1997-01-01

    Two types of currency crisis models coexist in the literature: first generation models view speculative attacks as being caused by economic fundamentals which are inconsistent with a given parity. Second generation models claim self-fulfilling speculation as the main source of a currency crisis. Recent empirical research in international macroeconomics has attempted to distinguish between the sources of currency crises. This paper adds to this literature by proposing a new empirical approach ...

  9. Platoon Dispersion Analysis Based on Diffusion Theory

    Badhrudeen Mohamed

    2017-01-01

    Urbanization and growing demand for travel cause the traffic system to work ineffectively in most urban areas, leading to traffic congestion. Many approaches have been adopted to address this problem, one among them being signal co-ordination. This can be achieved if the platoon of vehicles that gets discharged at one signal gets green at consecutive signals with minimal delay. However, platoons tend to get dispersed as they travel, and this dispersion phenomenon should be taken into account for effective signal coordination. Reported studies in this area are from homogeneous and lane-disciplined traffic conditions. This paper analyses the platoon dispersion characteristics under heterogeneous and lane-less traffic conditions. Of the various modelling techniques reported, the approach based on diffusion theory is used in this study. The diffusion theory based models so far assumed the data to follow a normal distribution. However, in the present study, the data was found to follow a lognormal distribution, and hence the implementation was carried out using the lognormal distribution. The parameters of the lognormal distribution were calibrated for the study condition. For comparison purposes, the normal distribution was also calibrated and the results were evaluated. It was found that the model with the lognormal distribution performed better in all cases than the one with the normal distribution.
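
    The dispersion step of such a diffusion-type model can be sketched as the convolution of an upstream departure profile with a travel-time density. In the fragment below the lognormal parameters, discharge rate and time axis are hypothetical placeholders, not the calibrated values from the study.

    ```python
    import numpy as np

    # Discrete time axis (seconds) and an upstream platoon: 20 s of
    # saturated discharge at the stop line (illustrative numbers).
    dt = 1.0
    t = np.arange(0, 120, dt)
    q_up = np.where(t < 20, 0.5, 0.0)  # vehicles per second

    # Lognormal travel-time density to the downstream detector; mu and
    # sigma are hypothetical parameters of log(travel time).
    mu, sigma = np.log(30.0), 0.25
    tt = t[1:]  # avoid log(0)
    g = np.exp(-(np.log(tt) - mu) ** 2 / (2 * sigma**2)) \
        / (tt * sigma * np.sqrt(2 * np.pi))

    # Downstream arrival profile = departures convolved with the
    # travel-time density (the dispersion step of the model).
    q_down = np.convolve(q_up, g)[: len(t)] * dt
    print(q_up.sum() * dt, q_down.sum() * dt)  # counts roughly conserved
    ```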

  10. LED intense headband light source for fingerprint analysis

    Villa-Aleman, Eliel

    2005-03-08

    A portable, lightweight and high-intensity light source for detecting and analyzing fingerprints during field investigation. On-site field analysis requires long hours of mobile analysis. In one embodiment, the present invention comprises a plurality of light emitting diodes; a power source; and a personal attachment means; wherein the light emitting diodes are powered by the power source, and wherein the power source and the light emitting diodes are attached to the personal attachment means to produce a personal light source for on-site analysis of latent fingerprints. The present invention is available for other applications as well.

  11. Effective field theory analysis of Higgs naturalness

    Bar-Shalom, Shaouly [Technion-Israel Inst. of Tech., Haifa (Israel); Soni, Amarjit [Brookhaven National Lab. (BNL), Upton, NY (United States); Wudka, Jose [Univ. of California, Riverside, CA (United States)

    2015-07-20

    Assuming the presence of physics beyond the Standard Model (SM) with a characteristic scale M ~ O(10) TeV, we investigate the naturalness of the Higgs sector at scales below M using an effective field theory (EFT) approach. We obtain the leading 1-loop EFT contributions to the Higgs mass with a Wilsonian-like hard cutoff, and determine the constraints on the corresponding operator coefficients for these effects to alleviate the little hierarchy problem up to the scale of the effective action Λ < M, a condition we denote by “EFT-naturalness”. We also determine the types of physics that can lead to EFT-naturalness and show that these types of new physics are best probed in vector-boson and multiple-Higgs production. The current experimental constraints on these coefficients are also discussed.

  12. Asteroid orbital error analysis: Theory and application

    Muinonen, K.; Bowell, Edward

    1992-01-01

    We present a rigorous Bayesian theory for asteroid orbital error estimation in which the probability density of the orbital elements is derived from the noise statistics of the observations. For Gaussian noise in a linearized approximation the probability density is also Gaussian, and the errors of the orbital elements at a given epoch are fully described by the covariance matrix. The law of error propagation can then be applied to calculate past and future positional uncertainty ellipsoids (Cappellari et al. 1976, Yeomans et al. 1987, Whipple et al. 1991). To our knowledge, this is the first time a Bayesian approach has been formulated for orbital element estimation. In contrast to the classical Fisherian school of statistics, the Bayesian school allows a priori information to be formally present in the final estimation. However, Bayesian estimation does give the same results as Fisherian estimation when no a priori information is assumed (Lehtinen 1988, and references therein).
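
    The linearized propagation step can be illustrated with a toy two-element problem (semimajor axis and mean anomaly on a circular orbit); the element values, covariance and epoch below are all hypothetical, and a real implementation would carry the full six-element set.

    ```python
    import numpy as np

    # Hypothetical elements p = (a, M0) with a Gaussian covariance from
    # the fit to observations.
    p = np.array([2.5, 0.3])           # semimajor axis (AU), mean anomaly (rad)
    cov_p = np.array([[1e-6, 2e-8],
                      [2e-8, 4e-7]])

    def position(p, t):
        """Toy mapping from elements to a 2-D heliocentric position at time t."""
        a, M0 = p
        n = a ** -1.5                  # mean motion, Kepler's third law (AU, yr)
        M = M0 + n * t
        return a * np.array([np.cos(M), np.sin(M)])

    # Numerical Jacobian of position with respect to the elements at epoch t.
    t, eps = 5.0, 1e-6
    J = np.column_stack([
        (position(p + eps * e, t) - position(p - eps * e, t)) / (2 * eps)
        for e in np.eye(2)
    ])

    # Law of error propagation: positional covariance (uncertainty ellipsoid).
    cov_r = J @ cov_p @ J.T
    print(np.sqrt(np.diag(cov_r)))     # 1-sigma position errors
    ```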

  13. Kinematic analysis of parallel manipulators by algebraic screw theory

    Gallardo-Alvarado, Jaime

    2016-01-01

    This book reviews the fundamentals of screw theory concerned with velocity analysis of rigid bodies, confirmed with detailed and explicit proofs. The author additionally investigates acceleration, jerk, and hyper-jerk analyses of rigid bodies following the trend of the velocity analysis. With the material provided in this book, readers can extend the theory of screws into the kinematics of arbitrary order of rigid bodies. Illustrative examples and exercises to reinforce learning are provided. Of particular note, the kinematics of emblematic parallel manipulators, such as the Delta robot as well as the original Gough and Stewart platforms, are revisited applying, in addition to the theory of screws, new methods devoted to simplifying the corresponding forward-displacement analysis, a challenging task for most parallel manipulators. Stands as the only book devoted to the acceleration, jerk and hyper-jerk (snap) analyses of rigid bodies by means of screw theory; provides new strategies to simplify the forward kinematic...

  14. Adapted wavelet analysis from theory to software

    Wickerhauser, Mladen Victor

    1994-01-01

    This detail-oriented text is intended for engineers and applied mathematicians who must write computer programs to perform wavelet and related analysis on real data. It contains an overview of mathematical prerequisites and proceeds to describe hands-on programming techniques to implement special programs for signal analysis and other applications. From the table of contents: - Mathematical Preliminaries - Programming Techniques - The Discrete Fourier Transform - Local Trigonometric Transforms - Quadrature Filters - The Discrete Wavelet Transform - Wavelet Packets - The Best Basis Algorithm - Multidimensional Library Trees - Time-Frequency Analysis - Some Applications - Solutions to Some of the Exercises - List of Symbols - Quadrature Filter Coefficients
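
    As a minimal taste of the material, one level of the orthonormal Haar discrete wavelet transform and its inverse can be written directly. This sketch is not taken from the book's software; the sample vector is illustrative.

    ```python
    import numpy as np

    def haar_step(x):
        """One level of the discrete Haar wavelet transform (orthonormal)."""
        x = np.asarray(x, dtype=float)
        s = (x[0::2] + x[1::2]) / np.sqrt(2)  # smooth (scaling) coefficients
        d = (x[0::2] - x[1::2]) / np.sqrt(2)  # detail (wavelet) coefficients
        return s, d

    def haar_inverse(s, d):
        """Invert one Haar step; perfect reconstruction up to rounding."""
        x = np.empty(2 * len(s))
        x[0::2] = (s + d) / np.sqrt(2)
        x[1::2] = (s - d) / np.sqrt(2)
        return x

    x = np.array([4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0])
    s, d = haar_step(x)
    print(np.allclose(haar_inverse(s, d), x))  # True
    ```

    Recursing haar_step on the smooth coefficients gives the full multilevel decomposition used by the best-basis algorithm the book describes.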

  15. Super Yang-Mills theory in 10+2 dimensions, The 2T-physics Source for N=4 SYM and M(atrix) Theory

    Bars, Itzhak

    2010-01-01

    In this paper we construct super Yang-Mills theory in 10+2 dimensions, a number of dimensions that was not reached before in a unitary supersymmetric field theory, and show that this is the 2T-physics source of some cherished lower dimensional field theories. The much studied conformally exact N=4 Super Yang-Mills field theory in 3+1 dimensions is known to be a compactified version of N=1 SYM in 9+1 dimensions, while M(atrix) theory is obtained by compactifications of the 9+1 theory to 0 dimensions (also 0+1 and others). We show that there is a deeper origin of these theories in two higher dimensions as they emerge from the new theory with two times. Pursuing various alternatives of gauge choices, solving kinematic equations and/or dimensional reductions of the 10+2 theory, we suggest a web of connections that include those mentioned above and a host of new theories that relate 2T-physics and 1T-physics field theories, all of which have the 10+2 theory as the parent. In addition to establishing the higher spa...

  16. Film scoring today - Theory, practice and analysis

    Flach, Paula Sophie

    2012-01-01

    This thesis considers film scoring by taking a closer look at the theoretical discourse throughout the last decades, examining current production practice of film music and showcasing a musical analysis of the film Inception (2010).

  17. Theory and Practice of Financial Analysis

    Jakova, Ivana

    2009-01-01

    Analysts, managers, other business executives and students have at their disposal a wide variety of analytical techniques when they want to evaluate a company's financial position or when they wish to better understand the financial implications of business operational activities or investment. This thesis examines the uses of financial analysis as one of the main financial assessment techniques. After describing theoretically the main tools of financial analysis, this thesis determines the practic...

  19. Analysis of Contract Source Selection Strategy

    2015-07-07

    The task of understanding the impact of a source selection strategy on resultant contract outcomes is a topic rich for further research.

  20. Ion sources for solids isotopic analysis

    Tyrrell, A. C. [Ministry of Defence, Foulness (UK). Atomic Weapons Research Establishment

    1978-12-15

    Of the dozen or so methods of producing ions from solid samples only the surface or thermal ionisation method has found general application for precise measurement of isotopic ratios. The author discusses the principal variables affecting the performance of the thermal source; sample preparation, loading onto the filament, sample pre-treatment, filament material.

  1. Probabilistic forward model for electroencephalography source analysis

    Plis, Sergey M; George, John S; Jun, Sung C; Ranken, Doug M; Volegov, Petr L; Schmidt, David M

    2007-01-01

    Source localization by electroencephalography (EEG) requires an accurate model of head geometry and tissue conductivity. The estimation of source time courses from EEG or from EEG in conjunction with magnetoencephalography (MEG) requires a forward model consistent with true activity for the best outcome. Although MRI provides an excellent description of soft tissue anatomy, a high resolution model of the skull (the dominant resistive component of the head) requires CT, which is not justified for routine physiological studies. Although a number of techniques have been employed to estimate tissue conductivity, no present techniques provide the noninvasive 3D tomographic mapping of conductivity that would be desirable. We introduce a formalism for probabilistic forward modeling that allows the propagation of uncertainties in model parameters into possible errors in source localization. We consider uncertainties in the conductivity profile of the skull, but the approach is general and can be extended to other kinds of uncertainties in the forward model. We and others have previously suggested the possibility of extracting conductivity of the skull from measured electroencephalography data by simultaneously optimizing over dipole parameters and the conductivity values required by the forward model. Using Cramer-Rao bounds, we demonstrate that this approach does not improve localization results nor does it produce reliable conductivity estimates. We conclude that the conductivity of the skull has to be either accurately measured by an independent technique, or that the uncertainties in the conductivity values should be reflected in uncertainty in the source location estimates
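
    The idea of propagating conductivity uncertainty into source-location uncertainty can be caricatured with a one-parameter toy model. The half-space monopole below is only a stand-in for a real EEG lead field, and every numerical value (geometry, conductivity prior, noise level) is an assumption for illustration.

    ```python
    import numpy as np

    # Toy half-space model: a current monopole of known strength at depth z
    # under the origin; electrode potentials V = I / (2*pi*sigma*r).
    rng = np.random.default_rng(1)
    xs = np.linspace(-0.1, 0.1, 21)           # electrode positions (m)
    I, sigma_true, z_true = 1e-3, 0.33, 0.03  # amplitude (A), S/m, depth (m)

    def forward(z, sigma):
        return I / (2 * np.pi * sigma * np.hypot(xs, z))

    v_obs = forward(z_true, sigma_true) + rng.normal(0, 1e-4, xs.size)

    # Monte Carlo: draw the conductivity from a prior, localize the source
    # by grid search assuming each draw, then inspect the spread of depths.
    z_grid = np.linspace(0.01, 0.06, 501)
    z_hats = []
    for sigma in rng.normal(0.33, 0.05, 200):
        sse = [np.sum((v_obs - forward(z, sigma)) ** 2) for z in z_grid]
        z_hats.append(z_grid[int(np.argmin(sse))])
    print(np.mean(z_hats), np.std(z_hats))    # bias and spread of estimates
    ```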

  2. A critical experimental test of synchrotron radiation theory with 3rd generation light source

    Geloni, Gianluca [European XFEL GmbH, Hamburg (Germany); Kocharyan, Vitali; Saldin, Evgeni [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2016-05-15

    A recent "beam splitting" experiment at LCLS apparently demonstrated that after a microbunched electron beam is kicked at a large angle compared to the divergence of the FEL radiation, the microbunching wave front is readjusted along the new direction of motion of the kicked beam. Therefore, coherent radiation from an undulator placed after the kicker is emitted along the kicked direction without suppression. This strong emission of coherent undulator radiation in the kicked direction cannot be explained in the framework of conventional synchrotron radiation theory. In a previous paper we explained this puzzle. We demonstrated that, in accelerator physics, the coupling of fields and particles is based, on the one hand, on the use of results from particle dynamics treated according to the absolute time convention and, on the other hand, on the use of Maxwell equations treated according to the standard (Einstein) synchronization convention. Here lies the misconception which led to the strong qualitative disagreement between theory and experiment. After the "beam splitting" experiment at LCLS, it became clear that the conventional theory of synchrotron radiation cannot ensure the correct description of coherent and spontaneous emission from a kicked electron beam, nor the emission from a beam with finite angular divergence, in an undulator or a bending magnet. However, this result requires further experimental confirmation. In this publication we propose an uncomplicated and inexpensive experiment to test synchrotron radiation theory at 3rd generation light sources.

  4. Analysis and evaluation of the moral distress theory.

    Wilson, Melissa A

    2018-04-01

    Moral distress is a pervasive problem in nursing resulting in a detriment to patient care, providers, and organizations. Over a decade ago, the moral distress theory (MDT) was proposed and utilized in multiple research studies. This middle range theory explains and predicts the distress that occurs in a nurse because of moral conflict. The research findings born from this theory have been substantial. Since inception of this theory, moral distress has been extensively examined which has further elaborated its understanding. This paper provides an analysis and evaluation of the MDT according to applicable guidelines. Current understanding of the phenomenon indicates that a new theory may be warranted to better predict, treat, and manage moral distress. © 2017 Wiley Periodicals, Inc.

  5. Recurrence quantification analysis theory and best practices

    Webber, Charles L., Jr; Marwan, Norbert

    2015-01-01

    The analysis of recurrences in dynamical systems by using recurrence plots and their quantification is still an emerging field. Over the past decades recurrence plots have proven to be valuable data visualization and analysis tools in the theoretical study of complex, time-varying dynamical systems as well as in various applications in biology, neuroscience, kinesiology, psychology, physiology, engineering, physics, geosciences, linguistics, finance, economics, and other disciplines. This multi-authored book intends to comprehensively introduce and showcase recent advances as well as established best practices concerning both theoretical and practical aspects of recurrence plot based analysis. Edited and authored by leading researchers in the field, the various chapters address an interdisciplinary readership, ranging from theoretical physicists to application-oriented scientists in all data-providing disciplines.
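
    A recurrence plot and the simplest quantification measure take only a few lines. In the sketch below the toy signal, delay-embedding parameters and threshold are illustrative choices, not recommendations from the book.

    ```python
    import numpy as np

    # Toy trajectory: a noisy sine, delay-embedded in two dimensions.
    rng = np.random.default_rng(0)
    x = np.sin(np.linspace(0, 8 * np.pi, 400)) + 0.1 * rng.normal(size=400)
    tau, m = 5, 2
    emb = np.column_stack(
        [x[i * tau : len(x) - (m - 1 - i) * tau] for i in range(m)]
    )

    # Recurrence matrix: R[i, j] = 1 where states i and j lie within epsilon.
    dists = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    eps = 0.2 * dists.max()
    R = (dists <= eps).astype(int)

    # Simplest RQA measure: the recurrence rate (density of recurrence points).
    print(f"recurrence rate: {R.mean():.3f}")
    ```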

  6. Applying thematic analysis theory to practice: a researcher's experience.

    Tuckett, Anthony G

    2005-01-01

    This article describes an experience of thematic analysis. In order to answer the question 'What does analysis look like in practice?' it describes in brief how the methodology of grounded theory, the epistemology of social constructionism, and the theoretical stance of symbolic interactionism inform analysis. Additionally, analysis is examined by evidencing the systematic processes--here termed organising, coding, writing, theorising, and reading--that led the researcher to develop a final thematic schema.

  7. Source theory analysis of electron--positron annihilation experiments

    Schwinger, J.

    1975-01-01

    The phenomenological viewpoint already applied to deep inelastic scattering is extended to the discussion of electron-positron annihilation experiments. Some heuristic arguments lead to simple forms for the pion differential cross section that are in reasonable accord with the published experimental data in the energy interval 3 to 4.8 GeV

  8. Interaction Analysis: Theory, Research and Application.

    Amidon, Edmund J., Ed.; Hough, John J., Ed.

    This volume of selected readings developed for students and practitioners at various levels of sophistication is intended to be representative of work done to date on interaction analysis. The contents include journal articles, papers read at professional meetings, abstracts of doctoral dissertations, and selections from larger monographs, plus 12…

  9. Analysis of primary teacher stress' sources

    Katja Depolli Steiner

    2011-12-01

    Teachers are subject to many different work stressors. This study focused on differences in the intensity and frequency of potential stressors facing primary schoolteachers, with the goal of identifying the most important sources of teacher stress in primary school. The study included 242 primary schoolteachers from different parts of Slovenia. We used a Stress Inventory designed to identify the intensity and frequency of 49 situations that can act as teachers' work stressors. Findings showed that the major sources of stress facing teachers are factors related to work overload, factors stemming from pupils' behaviour and motivation, and factors related to the school system. Results also showed some small differences in the perception of stressors in different groups of teachers (by gender and by teaching level).

  10. Antioxidants: Characterization, natural sources, extraction and analysis.

    Oroian, Mircea; Escriche, Isabel

    2015-08-01

    Recently many review papers regarding antioxidants from different sources and different extraction and quantification procedures have been published. However none of them has all the information regarding antioxidants (chemistry, sources, extraction and quantification). This article tries to take a different perspective on antioxidants for the new researcher involved in this field. Antioxidants from fruit, vegetables and beverages play an important role in human health, for example preventing cancer and cardiovascular diseases, and lowering the incidence of different diseases. In this paper the main classes of antioxidants are presented: vitamins, carotenoids and polyphenols. Recently, many analytical methodologies involving diverse instrumental techniques have been developed for the extraction, separation, identification and quantification of these compounds. Antioxidants have been quantified by different researchers using one or more of these methods: in vivo, in vitro, electrochemical, chemiluminescent, electron spin resonance, chromatography, capillary electrophoresis, nuclear magnetic resonance, near infrared spectroscopy and mass spectrometry methods. Copyright © 2015. Published by Elsevier Ltd.

  11. Network analysis and synthesis a modern systems theory approach

    Anderson, Brian D O

    2006-01-01

    Geared toward upper-level undergraduates and graduate students, this book offers a comprehensive look at linear network analysis and synthesis. It explores state-space synthesis as well as analysis, employing modern systems theory to unite the classical concepts of network theory. The authors stress passive networks but include material on active networks. They avoid topology in dealing with analysis problems and discuss computational techniques. The concepts of controllability, observability, and degree are emphasized in reviewing the state-variable description of linear systems. Explorations

  12. The Constant Comparative Analysis Method Outside of Grounded Theory

    Fram, Sheila M.

    2013-01-01

    This commentary addresses the gap in the literature regarding discussion of the legitimate use of Constant Comparative Analysis Method (CCA) outside of Grounded Theory. The purpose is to show the strength of using CCA to maintain the emic perspective and how theoretical frameworks can maintain the etic perspective throughout the analysis. My…

  13. Analysis of graphic representations of activity theory in international journals

    Marco André Mazzarotto

    2016-05-01

    Activity theory is a relevant framework for the Design field, and its graphic representations are cognitive artifacts that aid the understanding, use and communication of this theory. However, there is a lack of consistency around the graphics and labels used in these representations. Based on this, the aim of this study was to identify, analyze and evaluate these differences and propose a representation that aims to be more suitable for the theory. For this, it uses as its method a literature review based on Engeström (2001) and his three generations of visual models, combined with graphical analysis of representations collected in a hundred papers from international journals.

  14. A network analysis of leadership theory : the infancy of integration.

    Meuser, J. D.; Gardner, W. L.; Dinh, J. E.; Hu, J.; Liden, R. C.; Lord, R. G.

    2016-01-01

    We investigated the status of leadership theory integration by reviewing 14 years of published research (2000 through 2013) in 10 top journals (864 articles). The authors of these articles examined 49 leadership approaches/theories, and in 293 articles, 3 or more of these leadership approaches were included in their investigations. Focusing on these articles that reflected relatively extensive integration, we applied an inductive approach and used graphic network analysis as a guide for drawi...

  16. Radar Polarimetry: Theory, Analysis, and Applications

    Hubbert, John Clark

    The fields of radar polarimetry and optical polarimetry are compared. The mathematics of optical polarimetry are formulated such that a local right-handed coordinate system is always used to describe the polarization states. This is not done in radar polarimetry. Radar optimum polarization theory is redeveloped within the framework of optical polarimetry. The radar optimum polarizations and optic eigenvalues of common scatterers are compared. In addition, a novel definition of an eigenpolarization state is given and the accompanying mathematics is developed. The polarization response calculated using the optic, radar and novel definitions is presented for a variety of scatterers. Polarimetric transformation provides a means to characterize scatterers in more than one polarization basis. Polarimetric transformation for an ensemble of scatterers is obtained via two methods: (1) the covariance method and (2) the instantaneous scattering matrix (ISM) method. The covariance method is used to relate the mean radar parameters of a ±45° linear polarization basis to those of a horizontal and vertical polarization basis. In contrast, the ISM method transforms the individual time samples. Algorithms are developed for transforming the time series from fully polarimetric radars that switch between orthogonal states. The transformed time series are then used to calculate the mean radar parameters of interest. It is also shown that propagation effects do not need to be removed from the ISMs before transformation. The techniques are demonstrated using data collected by POLDIRAD, the German Aerospace Research Establishment's fully polarimetric C-band radar. The differential phase observed between two copolar states, Ψ_CO, is composed of two phases: (1) the differential propagation phase, φ_DP, and (2) the differential backscatter phase, δ. The slope of φ_DP with range is an estimate of the specific differential phase, K_DP. The process of estimating K_DP is complicated when
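
    Estimating K_DP from the range profile of φ_DP is essentially a least-squares slope. The sketch below uses synthetic data (all values illustrative) and assumes the common convention that K_DP is half the range derivative of the two-way phase.

    ```python
    import numpy as np

    # Synthetic range profile of differential propagation phase: two-way
    # phi_DP increasing at 2 deg/km plus measurement noise (illustrative).
    rng = np.random.default_rng(2)
    r = np.linspace(0.0, 30.0, 121)                  # range (km)
    phi_dp = 2.0 * r + rng.normal(0.0, 1.5, r.size)  # degrees

    # K_DP from the least-squares slope of phi_DP versus range; with the
    # two-way phase convention, K_DP is half that slope (deg/km).
    slope = np.polyfit(r, phi_dp, 1)[0]
    k_dp = 0.5 * slope
    print(f"K_DP ~ {k_dp:.2f} deg/km")
    ```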

  17. Supercontinuum light sources for food analysis

    Møller, Uffe Visbech; Petersen, Christian Rosenberg; Kubat, Irnis

    2014-01-01

    One track of Light & Food will target the mid-infrared spectral region. To date, the limitations of mid-infrared light sources, such as thermal emitters, low-power laser diodes, quantum cascade lasers and synchrotron radiation, have precluded mid-IR applications where the spatial coherence, broad bandwidth, high brightness and portability of a supercontinuum laser are all required. DTU Fotonik has now demonstrated the first optical-fiber-based broadband supercontinuum light source, which covers 1.4-13.3 μm and thereby most of the molecular fingerprint region.

  18. An Analysis of Programming Beginners' Source Programs

    Matsuyama, Chieko; Nakashima, Toyoshiro; Ishii, Naohiro

    The production of animations was made the subject of a university programming course in order to make students understand the process of program creation, and so that students could tackle programming with interest. In this paper, the formats and composition of the programs which students produced were investigated. As a result, it was found that there were many problems related to matters such as how to use indentation and how to apply comments and functions in the format and composition of the source code.

  19. MHODE: a local-homogeneity theory for improved source-parameter estimation of potential fields

    Fedi, Maurizio; Florio, Giovanni; Paoletti, Valeria

    2015-08-01

    We describe a multihomogeneity theory for source-parameter estimation of potential fields. Similar to what happens for random source models, where the monofractal scaling-law has been generalized into a multifractal law, we propose to generalize the homogeneity law into a multihomogeneity law. This allows a theoretically correct approach to study real-world potential fields, which are inhomogeneous and so do not show scale invariance, except in the asymptotic regions (very near to or very far from their sources). Since the scaling properties of inhomogeneous fields change with the scale of observation, we show that they may be better studied at a set of scales than at a single scale and that a multihomogeneous model is needed to explain their complex scaling behaviour. In order to perform this task, we first introduce fractional-degree homogeneous fields, to show that: (i) homogeneous potential fields may have fractional or integer degree; (ii) the source distributions for a fractional degree are not confined in a bounded region, similarly to some integer-degree models, such as the infinite line mass; and (iii) differently from the integer-degree case, the fractional-degree source distributions are no longer uniform density functions. Using this enlarged set of homogeneous fields, real-world anomaly fields are studied at different scales, by a simple search, in any local window W, for the best homogeneous field of either integer or fractional degree, yielding a multiscale set of local homogeneity degrees and depth estimations which we call a multihomogeneous model. This defines a new technique of source-parameter estimation (Multi-HOmogeneity Depth Estimation, MHODE), permitting retrieval of the source parameters of complex sources. We test the method with inhomogeneous fields of finite sources, such as faults or cylinders, and show its effectiveness also in a real-case example. These applications show the usefulness of the new concepts, multihomogeneity and
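
    The core idea, fitting a local homogeneity degree and depth to an observed field, can be caricatured for a one-dimensional altitude profile. The field model and grid search below are a simplified stand-in under assumed parameter values, not the authors' MHODE algorithm.

    ```python
    import numpy as np

    # Toy homogeneous field sampled along altitude z: f = c / (z - z0)**N,
    # with a degree-N source at z0 = -2 (two units below the datum).
    z = np.linspace(1.0, 5.0, 40)
    c, z0_true, N_true = 100.0, -2.0, 3.0
    f = c / (z - z0_true) ** N_true

    # Grid search over candidate depths: for each z0 the homogeneity degree
    # is the negative slope of log f versus log(z - z0), and the best pair
    # minimizes the residual of that straight-line fit.
    best = None
    for z0 in np.linspace(-5.0, 0.5, 111):
        L = np.log(z - z0)
        coef, res = np.polyfit(L, np.log(f), 1, full=True)[:2]
        if best is None or res[0] < best[0]:
            best = (res[0], -coef[0], z0)
    print(f"estimated degree N ~ {best[1]:.2f}, depth z0 ~ {best[2]:.2f}")
    ```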

  20. Relationship of Source Selection Methods to Contract Outcomes: an Analysis of Air Force Source Selection

    2015-12-01

    On some occasions, performance is terminated early; this can occur due to either mutual agreement or a breach of contract by one of the parties (Garrett... The analysis focuses on the contract management process, with special emphasis on the source selection methods of tradeoff and lowest price technically acceptable (LPTA

  1. Chromatographic fingerprint similarity analysis for pollutant source identification

    Xie, Juan-Ping; Ni, Hong-Gang

    2015-01-01

    In the present study, a similarity analysis method was proposed to evaluate the source-sink relationships among environmental media for polybrominated diphenyl ethers (PBDEs), which were taken as the representative contaminants. Chromatographic fingerprint analysis has been widely used in the fields of natural products chemistry and forensic chemistry, but its application to environmental science has been limited. We established a library of various sources of media containing contaminants (e.g., plastics), recognizing that the establishment of a more comprehensive library allows for a better understanding of the sources of contamination. We then compared an environmental complex mixture (e.g., sediment, soil) with the profiles in the library. These comparisons could be used as the first step in source tracking. The cosine similarities between plastic and soil or sediment ranged from 0.53 to 0.68, suggesting that plastic in electronic waste is an important source of PBDEs in the environment, but it is not the only source. A similarity analysis between soil and sediment indicated that they have a source-sink relationship. Generally, the similarity analysis method can encompass more relevant information of complex mixtures in the environment than a profile-based approach that only focuses on target pollutants. There is an inherent advantage to creating a data matrix containing all peaks and their relative levels after matching the peaks based on retention times and peak areas. This data matrix can be used for source identification via a similarity analysis without quantitative or qualitative analysis of all chemicals in a sample. Highlights: chromatographic fingerprint analysis can be used as the first step in source tracking; the similarity analysis method can encompass more relevant information of pollution; the fingerprints strongly depend on the chromatographic conditions; a more effective and robust method for identifying similarities is required.
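
    The cosine similarity used above is straightforward to compute once peaks have been matched across chromatograms; the peak-area vectors in this sketch are hypothetical.

    ```python
    import numpy as np

    def cosine_similarity(a, b):
        """Cosine similarity between two aligned chromatographic fingerprints."""
        a, b = np.asarray(a, float), np.asarray(b, float)
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

    # Hypothetical relative peak areas after retention-time matching:
    # a PBDE profile of a plastic source versus a sediment sample.
    plastic = [0.10, 0.35, 0.05, 0.30, 0.20]
    sediment = [0.15, 0.25, 0.15, 0.25, 0.20]
    print(f"{cosine_similarity(plastic, sediment):.2f}")
    ```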

  2. Theory for source-responsive and free-surface film modeling of unsaturated flow

    Nimmo, J.R.

    2010-01-01

    A new model explicitly incorporates the possibility of rapid response, across significant distance, to substantial water input. It is useful for unsaturated flow processes that are not inherently diffusive, or that do not progress through a series of equilibrium states. The term source-responsive is used to mean that flow responds sensitively to changing conditions at the source of water input (e.g., rainfall, irrigation, or ponded infiltration). The domain of preferential flow can be conceptualized as laminar flow in free-surface films along the walls of pores. These films may be considered to have uniform thickness, as suggested by field evidence that preferential flow moves at an approximately uniform rate when generated by a continuous and ample water supply. An effective facial area per unit volume quantitatively characterizes the medium with respect to source-responsive flow. A flow-intensity factor dependent on conditions within the medium represents the amount of source-responsive flow at a given time and position. Laminar flow theory provides relations for the velocity and thickness of flowing source-responsive films. Combination with the Darcy-Buckingham law and the continuity equation leads to expressions for both fluxes and dynamic water contents. Where preferential flow is sometimes or always significant, the interactive combination of source-responsive and diffuse flow has the potential to improve prediction of unsaturated-zone fluxes in response to hydraulic inputs and the evolving distribution of soil moisture. Examples for which this approach is efficient and physically plausible include (i) rainstorm-generated rapid fluctuations of a deep water table and (ii) space- and time-dependent soil water content response to infiltration in a macroporous soil. © Soil Science Society of America.

  3. Stability Analysis for Car Following Model Based on Control Theory

    Meng Xiang-Pei; Li Zhi-Peng; Ge Hong-Xia

    2014-01-01

    Stability analysis is one of the key issues in car-following theory. A stability analysis with a Lyapunov function for the two velocity difference car-following model (TVDM for short) is conducted, and a control method to suppress traffic congestion is introduced. Numerical simulations are given and the results are consistent with the theoretical analysis.

  4. Video Game Characters. Theory and Analysis

    Felix Schröter

    2014-06-01

    This essay develops a method for the analysis of video game characters based on a theoretical understanding of their medium-specific representation and the mental processes involved in their intersubjective construction by video game players. We propose to distinguish, first, between narration, simulation, and communication as three modes of representation particularly salient for contemporary video games and the characters they represent; second, between narrative, ludic, and social experience as three ways in which players perceive video game characters and their representations; and, third, between three dimensions of video game characters as ‘intersubjective constructs’, which usually are to be analyzed not only as fictional beings with certain diegetic properties but also as game pieces with certain ludic properties and, in those cases in which they function as avatars in the social space of a multiplayer game, as representations of other players. Having established these basic distinctions, we proceed to analyze their realization and interrelation by reference to the character of Martin Walker from the third-person shooter Spec Ops: The Line (Yager Development 2012), the highly customizable player-controlled characters from the role-playing game The Elder Scrolls V: Skyrim (Bethesda 2011), and the complex multidimensional characters in the massively multiplayer online role-playing game Star Wars: The Old Republic (BioWare 2011-2014).

  5. Complex analysis a modern first course in function theory

    Muir, Jerry R

    2015-01-01

    A thorough introduction to the theory of complex functions emphasizing the beauty, power, and counterintuitive nature of the subject. Written with a reader-friendly approach, Complex Analysis: A Modern First Course in Function Theory features a self-contained, concise development of the fundamental principles of complex analysis. After laying groundwork on complex numbers and the calculus and geometric mapping properties of functions of a complex variable, the author uses power series as a unifying theme to define and study the many rich and occasionally surprising properties of analytic fun

  6. Analysis of Ward identities in supersymmetric Yang-Mills theory

    Ali, Sajid; Bergner, Georg; Gerber, Henning; Montvay, Istvan; Münster, Gernot; Piemonte, Stefano; Scior, Philipp

    2018-05-01

    In numerical investigations of supersymmetric Yang-Mills theory on a lattice, the supersymmetric Ward identities are valuable for finding the critical value of the hopping parameter and for examining the size of supersymmetry breaking by the lattice discretisation. In this article we present an improved method for the numerical analysis of supersymmetric Ward identities, which takes into account the correlations between the various observables involved. We present the first complete analysis of supersymmetric Ward identities in N=1 supersymmetric Yang-Mills theory with gauge group SU(3). The results indicate that lattice artefacts scale to zero as O(a^2) towards the continuum limit in agreement with theoretical expectations.

  7. Analysis of General Power Counting Rules in Effective Field Theory

    Gavela, B M; Manohar, A V; Merlo, L

    2016-01-01

    We derive the general counting rules for a quantum effective field theory (EFT) in d dimensions. The rules are valid for strongly and weakly coupled theories, and predict that all kinetic energy terms are canonically normalized. They determine the energy dependence of scattering cross sections in the range of validity of the EFT expansion. The size of cross sections is controlled by the Λ power counting of EFT, not by chiral counting, even for chiral perturbation theory (χPT). The relation between Λ and f is generalized to d dimensions. We show that the naive dimensional analysis 4π counting is related to ħ counting. The EFT counting rules are applied to χPT, to Standard Model EFT and to the non-trivial case of Higgs EFT, which combines the Λ and chiral counting rules within a single theory.

  8. Towards a Design Theory for Collaborative Qualitative Data Analysis

    Nielsen, Peter Axel

    2016-01-01

    This position paper addresses how to develop a design theory to support the collaborative practice of qualitative data analysis. Qualitative researchers face several challenges in making sense of their empirical data, and IS-support for this practice can be found in software applications such as NVivo, Atlas.ti, and DeDoose. While these software tools have utility and are valuable, they are also limiting – and they are particularly limiting researchers in their collaborative efforts with their co-researchers. In this paper, we investigate a design theory to extend it to support collaboration. We use this as a stepping stone to discuss how to use a design theory to problematize existing applications and how to extend a design theory by abduction.

  9. Source Signals Separation and Reconstruction Following Principal Component Analysis

    WANG Cheng

    2014-02-01

    For the problem of separating and reconstructing source signals from observed signals, the physical meaning of the blind source separation model and of independent component analysis is not very clear, and the solution is not unique. To address these disadvantages, a new linear instantaneous mixing model, and a novel method for separating and reconstructing source signals from observed signals based on principal component analysis (PCA), are put forward. The assumption of this new model is that the source signals are statistically uncorrelated rather than independent, which differs from the traditional blind source separation model. A one-to-one relationship between the linear instantaneous mixing matrix of the new model and the linear compound matrix of PCA, and a one-to-one relationship between the uncorrelated source signals and the principal components, are demonstrated using the concepts of the linear separation matrix and uncorrelated source signals. Based on this theoretical link, the source signal separation and reconstruction problem is transformed into a PCA of the observed signals. The theoretical derivation and numerical simulation results show that, despite Gaussian measurement noise, the waveform and amplitude information of an uncorrelated source signal can be separated and reconstructed by PCA when the linear mixing matrix is column-orthogonal and normalized; that only the waveform information can be separated and reconstructed when the mixing matrix is column-orthogonal but not normalized; and that uncorrelated source signals cannot be separated and reconstructed by PCA when the mixing matrix is not column-orthogonal or the mixing is not linear.
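
    Under the stated assumptions (uncorrelated zero-mean sources, column-orthogonal normalized mixing), PCA of the observations recovers the sources up to sign and order. The following sketch with synthetic signals illustrates this; the mixing angle, waveforms and noise level are illustrative assumptions.

    ```python
    import numpy as np

    # Two zero-mean, mutually uncorrelated toy sources mixed by an
    # orthonormal matrix, matching the model above.
    rng = np.random.default_rng(3)
    t = np.linspace(0, 1, 2000, endpoint=False)
    S = np.vstack([np.sin(2 * np.pi * 7 * t),            # variance 0.5
                   np.sign(np.sin(2 * np.pi * 3 * t))])  # variance 1.0
    A = np.array([[np.cos(0.6), -np.sin(0.6)],
                  [np.sin(0.6),  np.cos(0.6)]])          # orthonormal mixing
    X = A @ S + 0.05 * rng.normal(size=S.shape)          # observed signals

    # PCA via SVD of the centered observations: because the sources are
    # uncorrelated with distinct variances and the mixing is orthonormal,
    # the principal components recover the sources up to sign and order.
    Xc = X - X.mean(axis=1, keepdims=True)
    U, _, _ = np.linalg.svd(Xc, full_matrices=False)
    S_hat = U.T @ Xc

    for i in range(2):
        c = max(abs(np.corrcoef(S_hat[i], S[j])[0, 1]) for j in range(2))
        print(f"component {i}: best |corr| with a true source = {c:.3f}")
    ```

    If the sources had equal variances, the principal directions would be degenerate and PCA alone could not separate them, which is one reason the column-orthogonality condition in the abstract matters.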

  10. Extraction of space-charge-dominated ion beams from an ECR ion source: Theory and simulation

    Alton, G. D.; Bilheux, H.

    2004-05-01

    Extraction of high quality space-charge-dominated ion beams from plasma ion sources constitutes an optimization problem centered about finding an optimal concave plasma emission boundary that minimizes half-angular divergence for a given charge state, independent of the presence or lack thereof of a magnetic field in the extraction region. The curvature of the emission boundary acts to converge/diverge the low velocity beam during extraction. Beams of highest quality are extracted whenever the half-angular divergence, ω, is minimized. Under minimum half-angular divergence conditions, the plasma emission boundary has an optimum curvature and the perveance, P, current density, j+ext, and extraction gap, d, have optimum values for a given charge state, q. Optimum values for each of the independent variables (P, j+ext and d) are found to be in close agreement with those derived from elementary analytical theory for extraction with a simple two-electrode extraction system, independent of the presence of a magnetic field. The magnetic field only increases the emittances of beams through additional aberrational effects caused by increased angular divergences through coupling of the longitudinal to the transverse velocity components of particles as they pass through the mirror region of the electron cyclotron resonance (ECR) ion source. This article reviews the underlying theory of elementary extraction optics and presents results derived from simulation studies of extraction of space-charge dominated heavy-ion beams of varying mass, charge state, and intensity from an ECR ion source with emphasis on magnetic field induced effects.
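
    The elementary analytical theory referred to above starts from the Child-Langmuir space-charge limit for a planar two-electrode gap. The sketch below evaluates it for an assumed charge state, voltage and gap; these values are illustrative, not the paper's.

    ```python
    import math

    # Child-Langmuir space-charge-limited current density for a planar gap:
    # j = (4*eps0/9) * sqrt(2*q/m) * V**1.5 / d**2.
    eps0 = 8.8541878128e-12          # vacuum permittivity (F/m)
    e, amu = 1.602176634e-19, 1.66053906660e-27

    q_state, mass_amu = 8, 40        # e.g. an Ar(8+) beam from an ECR source
    V, d = 20e3, 0.02                # extraction voltage (V) and gap (m)

    q, m = q_state * e, mass_amu * amu
    j = (4 * eps0 / 9) * math.sqrt(2 * q / m) * V ** 1.5 / d ** 2
    print(f"space-charge-limited j = {j:.0f} A/m^2 ({0.1 * j:.1f} mA/cm^2)")
    ```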

  12. Theory of the detection of the field surrounding half-dressed sources

    Compagno, G.; Passante, R.; Persico, F.

    1988-01-01

    Half-dressed sources are defined as sources deprived partially or totally of the cloud of virtual quanta which surrounds them in the ground state of the total system. Two models of a half-dressed point source S are considered, the first in the framework of the theory of massive scalar fields and the second in quantum electrodynamics (QED). In both cases the detector is modeled by a second fully dressed source T of the same field, which is also bound to an oscillation center by harmonic forces. It is shown that when S at time t = 0 is suddenly coupled to or decoupled from the field, the detector T, which is initially at rest, is set in motion after a time t = R 0 /c, where R 0 is the S-T distance. Neglecting the reaction back on the field due to the oscillatory motion of T, the amplitude of oscillation for t = ∞ is obtained as a function of R 0 . Thus the time-varying virtual field of S is shown to be capable of exerting a force which excites the model detector. For the QED case, this force is related to the properties of the energy density of the virtual field. This energy density displays a singularity at r = ct, and the mathematical nature of this singularity is studied in detail. In this way it is shown that the energy density of the time-dependent virtual field is rather different from that of a pulse of radiation emitted by a source during energy-conserving processes. The differences are discussed in detail, as well as the limitations of the model

  13. Nonpoint source pollution of urban stormwater runoff: a methodology for source analysis.

    Petrucci, Guido; Gromaire, Marie-Christine; Shorshani, Masoud Fallah; Chebbo, Ghassan

    2014-09-01

    The characterization and control of runoff pollution from nonpoint sources in urban areas are a major issue for the protection of aquatic environments. We propose a methodology to quantify the sources of pollutants in an urban catchment and to analyze the associated uncertainties. After describing the methodology, we illustrate it through an application to the sources of Cu, Pb, Zn, and polycyclic aromatic hydrocarbons (PAH) from a residential catchment (228 ha) in the Paris region. In this application, we suggest several procedures that can be applied for the analysis of other pollutants in different catchments, including an estimation of the total extent of roof accessories (gutters and downspouts, watertight joints and valleys) in a catchment. These accessories emerge as the major source of Pb and as an important source of Zn in the example catchment, while activity-related sources (traffic, heating) are dominant for Cu (brake pad wear) and PAH (tire wear, atmospheric deposition).

  14. Antieigenvalue analysis for continuum mechanics, economics, and number theory

    Gustafson Karl

    2016-01-01

    My recent book Antieigenvalue Analysis, World-Scientific, 2012, presented the theory of antieigenvalues from its inception in 1966 up to 2010, and its applications within those forty-five years to Numerical Analysis, Wavelets, Statistics, Quantum Mechanics, Finance, and Optimization. Here I am able to offer three further areas of application: Continuum Mechanics, Economics, and Number Theory. In particular, the critical angle of repose in a continuum model of granular materials is shown to be exactly my matrix maximum turning angle of the stress tensor of the material. The important Sharpe ratio of the Capital Asset Pricing Model is now seen in terms of my antieigenvalue theory. Euclid’s Formula for Pythagorean triples becomes a special case of my operator trigonometry.
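
    For a symmetric positive definite matrix the first antieigenvalue and the maximum turning angle have the closed forms μ₁ = 2√(λ_min λ_max)/(λ_min + λ_max) and φ = arccos μ₁; the matrix in the following sketch is an illustrative choice.

    ```python
    import numpy as np

    # First antieigenvalue of a symmetric positive definite matrix:
    # mu_1 = min_x <Ax, x> / (||Ax|| ||x||) = 2*sqrt(l*L) / (l + L),
    # where l and L are the extreme eigenvalues; the maximum turning
    # angle of the operator is phi(A) = arccos(mu_1).
    A = np.array([[9.0, 0.0],
                  [0.0, 1.0]])  # illustrative SPD matrix
    lmin, lmax = np.linalg.eigvalsh(A)[[0, -1]]
    mu1 = 2 * np.sqrt(lmin * lmax) / (lmin + lmax)
    phi = np.degrees(np.arccos(mu1))
    print(f"antieigenvalue mu_1 = {mu1:.3f}, max turning angle = {phi:.1f} deg")
    ```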

  15. Convex analysis and monotone operator theory in Hilbert spaces

    Bauschke, Heinz H

    2017-01-01

    This reference text, now in its second edition, offers a modern unifying presentation of three basic areas of nonlinear analysis: convex analysis, monotone operator theory, and the fixed point theory of nonexpansive operators. Taking a unique comprehensive approach, the theory is developed from the ground up, with the rich connections and interactions between the areas as the central focus, and it is illustrated by a large number of examples. The Hilbert space setting of the material offers a wide range of applications while avoiding the technical difficulties of general Banach spaces. The authors have also drawn upon recent advances and modern tools to simplify the proofs of key results making the book more accessible to a broader range of scholars and users. Combining a strong emphasis on applications with exceptionally lucid writing and an abundance of exercises, this text is of great value to a large audience including pure and applied mathematicians as well as researchers in engineering, data science, ma...

  16. Data Analysis with Open Source Tools

    Janert, Philipp

    2010-01-01

    Collecting data is relatively easy, but turning raw information into something useful requires that you know how to extract precisely what you need. With this insightful book, intermediate to experienced programmers interested in data analysis will learn techniques for working with data in a business environment. You'll learn how to look at data to discover what it contains, how to capture those ideas in conceptual models, and then feed your understanding back into the organization through business plans, metrics dashboards, and other applications. Along the way, you'll experiment with conce

  17. Sources of political violence, political and psychological analysis

    O. B. Balatska

    2015-05-01

    We also consider the following approaches to determining the nature and sources of aggression and violence: instinctivism (K. Lorenz) and behaviorism (J. B. Watson, B. F. Skinner et al.). Special attention is paid to frustration-aggression theories (J. Dollard, N. E. Miller, L. Berkowitz et al.), according to which the causes of aggression and violence are hidden in a particular mental state - frustration. The particular importance of the theory of T. R. Gurr, in which the sources of aggression and political violence are defined through the concept of relative deprivation, is underlined. Another approach is described in the article - the concept of aggression as a learned reaction (A. Bandura, G. Levin, B. Fleischmann et al.). Supporters of this approach believe that aggressive behavior is formed in the process of social training.

  18. Variational analysis and generalized differentiation I basic theory

    Mordukhovich, Boris S

    2006-01-01

    Contains a study of the basic concepts and principles of variational analysis and generalized differentiation in both finite-dimensional and infinite-dimensional spaces. This title presents many applications to problems in optimization, equilibria, stability and sensitivity, control theory, economics, mechanics, and more.

  19. Acknowledging the Infrasystem: A Critical Feminist Analysis of Systems Theory.

    Creedon, Pamela J.

    1993-01-01

    Examines the absence of a critical feminist perspective in the application of systems theory as a unifying model for public relations. Describes an unacknowledged third system, the infrasystem, that constructs both suprasystem and subsystem interactions. Concludes with a case analysis of sport as illustration. (HB)

  20. The Use of Modelling for Theory Building in Qualitative Analysis

    Briggs, Ann R. J.

    2007-01-01

    The purpose of this article is to exemplify and enhance the place of modelling as a qualitative process in educational research. Modelling is widely used in quantitative research as a tool for analysis, theory building and prediction. Statistical data lend themselves to graphical representation of values, interrelationships and operational…

  1. A tutorial on incremental stability analysis using contraction theory

    Jouffroy, Jerome; Fossen, Thor I.

    2010-01-01

    This paper introduces a methodology for differential nonlinear stability analysis using contraction theory (Lohmiller and Slotine, 1998). The methodology includes four distinct steps: the descriptions of two systems to be compared (the plant and the observer in the case of observer convergence... on several simple examples.
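
    A numerical version of the basic contraction test, checking that the symmetric part of the Jacobian is uniformly negative definite, can be sketched as follows; the toy system and sampling region are assumptions for illustration, not examples from the paper.

    ```python
    import numpy as np

    # Contraction check for dx/dt = f(x): trajectories converge exponentially
    # (in the identity metric) where (J + J^T)/2 is negative definite.
    def f(x):
        return np.array([-x[0] + np.tanh(x[1]), -2.0 * x[1]])

    def jacobian(x, eps=1e-6):
        n = x.size
        J = np.empty((n, n))
        for j in range(n):
            e = np.zeros(n); e[j] = eps
            J[:, j] = (f(x + e) - f(x - e)) / (2 * eps)
        return J

    # Sample the state space and track the largest eigenvalue of the
    # symmetric part; a uniformly negative bound certifies contraction.
    worst = max(
        np.linalg.eigvalsh((jacobian(x) + jacobian(x).T) / 2).max()
        for x in np.random.default_rng(4).uniform(-3, 3, size=(500, 2))
    )
    print(f"max eigenvalue of symmetric Jacobian over samples: {worst:.3f}")
    ```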

  2. Risk analysis of alternative energy sources

    Kazmer, D.R.

    1982-01-01

    The author explores two points raised by Miller Spangler in a January 1981 issue: public perception of risks involving nuclear power plants relative to those of conventional plants and criteria for evaluating the way risk analyses are made. On the first point, he concludes that translating public attitudes into the experts' language of probability and risk could provide better information and understanding of both the attitudes and the risks. Viewing risk analysis methodologies as filters which help to test historical change, he suggests that the lack of information favors a lay jury approach for energy decisions. Spangler responds that Congress is an example of lay decision making, but that a lay jury, given public disinterest and polarization, would probably not improve social justice on the nuclear issue. 5 references, 4 figures

  3. Optimal Measurement Conditions for Spatiotemporal EEG/MEG Source Analysis.

    Huizenga, Hilde M.; Heslenfeld, Dirk J.; Molenaar, Peter C. M.

    2002-01-01

    Developed a method to determine the required number and position of sensors for human brain electromagnetic source analysis. Studied the method through a simulation study and an empirical study on visual evoked potentials in one adult male. Results indicate the method is fast and reliable and improves source precision. (SLD)

  4. Application of depletion perturbation theory to fuel cycle burnup analysis

    White, J.R.

    1979-01-01

    Over the past several years static perturbation theory methods have been increasingly used for reactor analysis in lieu of more detailed and costly direct computations. Recently, perturbation methods incorporating time dependence have also received attention, and several authors have demonstrated their applicability to fuel burnup analysis. The objective of the work described here is to demonstrate that a time-dependent perturbation method can be easily and accurately applied to realistic depletion problems

  5. Real analysis measure theory, integration, and Hilbert spaces

    Stein, Elias M

    2005-01-01

    Real Analysis is the third volume in the Princeton Lectures in Analysis, a series of four textbooks that aim to present, in an integrated manner, the core areas of analysis. Here the focus is on the development of measure and integration theory, differentiation and integration, Hilbert spaces, and Hausdorff measure and fractals. This book reflects the objective of the series as a whole: to make plain the organic unity that exists between the various parts of the subject, and to illustrate the wide applicability of ideas of analysis to other fields of mathematics and science. After

  6. Transfer of learning between 2D and 3D sources during infancy: Informing theory and practice.

    Barr, Rachel

    2010-06-01

    The ability to transfer learning across contexts is an adaptive skill that develops rapidly during early childhood. Learning from television is a specific instance of transfer of learning between a 2-Dimensional (2D) representation and a 3-Dimensional (3D) object. Understanding the conditions under which young children might accomplish this particular kind of transfer is important because by 2 years of age 90% of US children are viewing television on a daily basis. Recent research shows that children can imitate actions presented on television using the corresponding real-world objects, but this same research also shows that children learn less from television than they do from live demonstrations until they are at least 3 years old; termed the video deficit effect. At present, there is no coherent theory to account for the video deficit effect; how learning is disrupted by this change in context is poorly understood. The aims of the present review are (1) to review the conditions under which children transfer learning between 2D images and 3D objects during early childhood, and (2) to integrate developmental theories of memory processing into the transfer of learning from media literature using Hayne's (2004) developmental representational flexibility account. The review will conclude that studies on the transfer of learning between 2D and 3D sources have important theoretical implications for general developmental theories of cognitive development, and in particular the development of a flexible representational system, as well as policy implications for early education regarding the potential use and limitations of media as effective teaching tools during early childhood.

  7. Exact-to-precision generalized perturbation theory for source-driven systems

    Wang Congjian; Abdel-Khalik, Hany S.

    2011-01-01

    Highlights: ► We present a new development in higher order generalized perturbation theory. ► The method addresses the explosion in the flux phase space, input parameters, and responses. ► The method hybridizes first-order GPT and the proper orthogonal decomposition snapshots method. ► Simplified 1D and realistic 2D assembly models demonstrate the applicability of the method. ► The accuracy of the method is compared to exact direct perturbations and first-order GPT. - Abstract: Presented in this manuscript are new developments to perturbation theory which are intended to extend its applicability to estimate, with quantifiable accuracy, the exact variations in all responses calculated by the model with respect to all possible perturbations in the model's input parameters. The new developments place a high premium on reducing the associated computational overhead in order to enable the use of perturbation theory in routine reactor design calculations. By way of example, these developments could be employed in core simulation to accurately estimate the few-group cross-section variations resulting from perturbations in neutronics and thermal-hydraulics core conditions. These variations are currently being described using a look-up table approach, where thousands of assembly calculations are performed to capture few-group cross-section variations for the downstream core calculations. Other applications include the efficient evaluation of surrogates for applications that require repeated model runs such as design optimization, inverse studies, uncertainty quantification, and online core monitoring. The theoretical background of these developments applied to source-driven systems and supporting numerical experiments are presented in this manuscript. Extension to eigenvalue problems will be presented in a future article.
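
    The snapshots step mentioned in the highlights can be sketched with a truncated SVD. A minimal illustration with synthetic low-rank data standing in for flux solutions (all sizes and names are invented):

      import numpy as np

      # Synthetic snapshot matrix: each column stands in for a flux solution
      # from one perturbed model run (rank ~5 by construction).
      rng = np.random.default_rng(0)
      snapshots = rng.standard_normal((500, 5)) @ rng.standard_normal((5, 40))
      snapshots += 0.01 * rng.standard_normal(snapshots.shape)

      # POD: leading left singular vectors form the reduced orthonormal basis
      U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
      energy = np.cumsum(s**2) / np.sum(s**2)
      r = int(np.searchsorted(energy, 0.999)) + 1      # rank for 99.9% energy
      basis = U[:, :r]

      # New flux perturbations are approximated in this r-dimensional
      # subspace, so only r adjoint (GPT) solves are needed, not one per response.
      coeffs = basis.T @ snapshots[:, 0]
      print(r, np.linalg.norm(snapshots[:, 0] - basis @ coeffs))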

  8. Frames and operator theory in analysis and signal processing

    Larson, David R; Nashed, Zuhair; Nguyen, Minh Chuong; Papadakis, Manos

    2008-01-01

    This volume contains articles based on talks presented at the Special Session Frames and Operator Theory in Analysis and Signal Processing, held in San Antonio, Texas, in January of 2006. Recently, the field of frames has undergone tremendous advancement. Most of the work in this field is focused on the design and construction of more versatile frames and frames tailored towards specific applications, e.g., finite dimensional uniform frames for cellular communication. In addition, frames are now becoming a hot topic in mathematical research as a part of many engineering applications, e.g., matching pursuits and greedy algorithms for image and signal processing. Topics covered in this book include: Application of several branches of analysis (e.g., PDEs; Fourier, wavelet, and harmonic analysis; transform techniques; data representations) to industrial and engineering problems, specifically image and signal processing. Theoretical and applied aspects of frames and wavelets. Pure aspects of operator theory empha...

  9. Methods of Approximation Theory in Complex Analysis and Mathematical Physics

    Saff, Edward

    1993-01-01

    The book incorporates research papers and surveys written by participants of an International Scientific Programme on Approximation Theory jointly supervised by the Institute for Constructive Mathematics of the University of South Florida at Tampa, USA and the Euler International Mathematical Institute at St. Petersburg, Russia. The aim of the Programme was to present new developments in Constructive Approximation Theory. The topics of the papers are: asymptotic behaviour of orthogonal polynomials, rational approximation of classical functions, quadrature formulas, theory of n-widths, nonlinear approximation in Hardy algebras, numerical results on best polynomial approximations, wavelet analysis. FROM THE CONTENTS: E.A. Rakhmanov: Strong asymptotics for orthogonal polynomials associated with exponential weights on R.- A.L. Levin, E.B. Saff: Exact Convergence Rates for Best Lp Rational Approximation to the Signum Function and for Optimal Quadrature in Hp.- H. Stahl: Uniform Rational Approximation of x .- M. Rahman, S.K. ...

  10. Source location in plates based on the multiple sensors array method and wavelet analysis

    Yang, Hong Jun; Shin, Tae Jin; Lee, Sang Kwon

    2014-01-01

    A new method for impact source localization in a plate is proposed based on multiple signal classification (MUSIC) and wavelet analysis. For source localization, the direction of arrival of the wave caused by an impact on a plate and the distance between the impact position and the sensor should be estimated. The direction of arrival can be estimated accurately using the MUSIC method. The distance can be obtained by using the time delay of arrival and the group velocity of the Lamb wave in the plate. The time delay is estimated experimentally using the continuous wavelet transform of the wave. Elastodynamic theory is used for the group velocity estimation.
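
    A compact numerical sketch of the direction-of-arrival step: classical narrowband MUSIC on a simulated uniform linear array, with the final distance obtained from an assumed time delay and group velocity. The array geometry, wave speed, and the 40-degree source are all invented for illustration:

      import numpy as np

      # Narrowband MUSIC on a simulated 8-sensor uniform linear array.
      c, f = 5100.0, 50e3                  # phase speed (m/s), frequency (Hz)
      d, M = 0.02, 8                       # sensor spacing (m), sensor count
      lam = c / f
      steer = lambda th: np.exp(-2j * np.pi * d * np.arange(M) * np.sin(th) / lam)

      rng = np.random.default_rng(1)
      n_snap = 200
      s = rng.standard_normal(n_snap) + 1j * rng.standard_normal(n_snap)
      noise = 0.1 * (rng.standard_normal((M, n_snap)) + 1j * rng.standard_normal((M, n_snap)))
      X = np.outer(steer(np.deg2rad(40.0)), s) + noise

      R = X @ X.conj().T / n_snap          # sample covariance matrix
      _, V = np.linalg.eigh(R)
      En = V[:, :-1]                       # noise subspace (one source assumed)

      angles = np.deg2rad(np.linspace(-90.0, 90.0, 721))
      spectrum = [1.0 / np.linalg.norm(En.conj().T @ steer(th))**2 for th in angles]
      doa = np.rad2deg(angles[int(np.argmax(spectrum))])

      # Distance from a wavelet-picked time delay and the Lamb-mode group velocity
      group_velocity, time_delay = 3200.0, 1.5e-4      # assumed values
      print("DOA:", doa, "distance:", group_velocity * time_delay)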

  12. Analysis of Health Behavior Theories for Clustering of Health Behaviors.

    Choi, Seung Hee; Duffy, Sonia A

    The objective of this article was to review the utility of established behavior theories, including the Health Belief Model, Theory of Reasoned Action, Theory of Planned Behavior, Transtheoretical Model, and Health Promotion Model, for addressing multiple health behaviors among people who smoke. It is critical to design future interventions for multiple health behavior changes tailored to individuals who currently smoke, yet it has not been addressed. Five health behavior theories/models were analyzed and critically evaluated. A review of the literature included a search of PubMed and Google Scholar from 2010 to 2016. Two hundred sixty-seven articles (252 studies from the initial search and 15 studies from the references of initially identified studies) were included in the analysis. Most of the health behavior theories/models emphasize psychological and cognitive constructs that can be applied only to one specific behavior at a time, thus making them not suitable to address multiple health behaviors. However, the Health Promotion Model incorporates "related behavior factors" that can explain multiple health behaviors among persons who smoke. Future multiple behavior interventions guided by the Health Promotion Model are necessary to show the utility and applicability of the model to address multiple health behaviors.

  13. Prequantum classical statistical field theory: background field as a source of everything?

    Khrennikov, Andrei

    2011-01-01

    Prequantum classical statistical field theory (PCSFT) is a new attempt to consider quantum mechanics (QM) as an emergent phenomenon, cf. with De Broglie's 'double solution' approach, Bohmian mechanics, stochastic electrodynamics (SED), Nelson's stochastic QM and its generalization by Davidson, 't Hooft's models and their development by Elze. PCSFT is a comeback to a purely wave viewpoint on QM, cf. with early Schrodinger. There are no quantum particles at all, only waves. In particular, photons are simply wave-pulses of the classical electromagnetic field, cf. SED. Moreover, even massive particles are special 'prequantum fields': the electron field, the neutron field, and so on. PCSFT claims that (sooner or later) people will be able to measure components of these fields: components of the 'photonic field' (the classical electromagnetic field of low intensity), the electronic field, the neutronic field, and so on. At the moment we are able to produce quantum correlations as correlations of classical Gaussian random fields. In this paper we are interested in the mathematical and physical reasons for the use of Gaussian fields. We consider prequantum signals (corresponding to quantum systems) as composed of a huge number of wave-pulses (on a very fine prequantum time scale). We speculate that the prequantum background field (the field of 'vacuum fluctuations') might play the role of a source of such pulses, i.e., the source of everything.

  14. Theory and technique of spark source mass spectrometry

    Stefani, R. [Commissariat a l'Energie Atomique, Grenoble (France). Centre d'Etudes Nucleaires]

    1968-07-01

    Trace analysis in solids by spark source mass spectrometry involves complex phenomena: the ionization of the elements in the spark, and the blackening of the sensitive emulsion by the focused ion beams. However, the principal risk of selectivity lies in the analyser system, which achieves double focusing for only a fraction of the ion beam. Each analyst therefore has to know the ion optics of his apparatus in detail, to ensure the transmission of the mean-energy band of ions, which alone quantitatively characterizes the chemical composition of the sample. Careful photometry of the mass spectrum then gives reproducible results, whose accuracy depends only on the sensitivity coefficients proper to this type of instrument. (author)

  15. Stability analysis of jointed rock slope by the block theory

    Yoshinaka, Ryunoshin; Yamabe, Tadashi; Fujita, Tomoo.

    1990-01-01

    The block theory for analyzing three-dimensional stability problems of discontinuous rock masses is applied to an actual discontinuous rock slope. Taking into consideration that geometrical information about discontinuities generally increases with the progressive steps of rock investigation in the field, the analysis is divided into the following two steps: 1) a statistical/probabilistic analysis using information from the primary investigation stage, which mainly consists of surveys of natural rock outcrops, and 2) a deterministic analysis corresponding to the secondary stage, using exploration adits. (author)

  16. Source modelling in seismic risk analysis for nuclear power plants

    Yucemen, M.S.

    1978-12-01

    The proposed probabilistic procedure provides a consistent method for the modelling, analysis and updating of uncertainties that are involved in the seismic risk analysis for nuclear power plants. The potential earthquake activity zones are idealized as point, line or area sources. For these seismic source types, expressions to evaluate their contribution to seismic risk are derived, considering all possible site-source configurations. The seismic risk at a site is found to depend not only on the inherent randomness of the earthquake occurrences with respect to magnitude, time and space, but also on the uncertainties associated with the predicted values of the seismic and geometric parameters, as well as the uncertainty in the attenuation model. The uncertainty due to the attenuation equation is incorporated into the analysis through the use of random correction factors. The influence of the uncertainty resulting from the insufficient information on the seismic parameters and source geometry is introduced into the analysis by computing a mean risk curve averaged over the various alternative assumptions on the parameters and source geometry. Seismic risk analysis is carried out for the city of Denizli, which is located in the most seismically active zone of Turkey. The second analysis is for Akkuyu
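
    The basic hazard integration described here reduces to a small amount of arithmetic: the annual rate of exceeding a ground-motion level is the event rate times the magnitude-weighted probability of exceedance under a lognormal attenuation law. A toy sketch for a single point source; every number (rate, b-value, attenuation form, sigma) is illustrative, not taken from the study:

      import numpy as np
      from scipy.stats import norm

      nu = 0.2                                  # mean annual rate of events
      mags = np.linspace(5.0, 7.5, 26)
      fm = 10.0 ** (-1.0 * mags)                # Gutenberg-Richter shape, b = 1
      fm /= fm.sum()                            # discrete magnitude pmf
      r_km = 30.0                               # site-to-source distance

      def median_pga(m, r):                     # assumed attenuation law
          return np.exp(-1.0 + 0.9 * m - 1.3 * np.log(r + 10.0))

      sigma_ln = 0.6                            # attenuation-model uncertainty
      a_target = 0.2                            # PGA level of interest (g)

      # P(PGA > a | m, r) from the lognormal scatter of the attenuation law
      z = (np.log(a_target) - np.log(median_pga(mags, r_km))) / sigma_ln
      annual_rate = nu * np.sum(fm * (1.0 - norm.cdf(z)))
      print("annual exceedance rate:", annual_rate)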

  17. Utilization of graph theory in security analysis of power grid

    Dalibor Válek

    2014-12-01

    Full Text Available This paper describes how graph theory can be used in security analysis, using a network of power lines and the devices connected to them as the environment. The power grid is considered as a system of nodes which together form a graph (network). Fiedler's theory, which is able to select the most important power lines of the whole network, is applied to a simple example. Components related to these lines are logically ordered and assessed by the author's modified analysis. This method has been improved and optimized for risks related to illegal acts. Each power grid component has been associated with a possible kind of attack, and each of these devices was evaluated in turn by five coefficients which take values from 1 to 10. The level of risk was assessed on the basis of these coefficients. In the last phase, the most risky power grid components have been selected, and security measures have been proposed for the selected devices.
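
    A minimal sketch of the Fiedler-vector step on a toy grid: take the eigenvector of the second-smallest eigenvalue of the graph Laplacian, and flag the lines whose endpoints differ most in that vector as sitting on the weakest cut. The six-node edge list is invented:

      import numpy as np

      # Fiedler vector of a toy 6-node "grid"; the edge list is invented.
      edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (1, 4)]
      n = 6
      A = np.zeros((n, n))
      for i, j in edges:
          A[i, j] = A[j, i] = 1.0
      L = np.diag(A.sum(axis=1)) - A      # graph Laplacian

      w, V = np.linalg.eigh(L)
      fiedler = V[:, 1]                   # eigenvector of the 2nd-smallest eigenvalue

      # Lines whose endpoints differ most in the Fiedler vector sit on the
      # weakest cut of the network: candidates for the most critical lines.
      ranked = sorted(edges, key=lambda e: -abs(fiedler[e[0]] - fiedler[e[1]]))
      print("algebraic connectivity:", w[1], "most critical lines:", ranked[:3])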

  18. An introduction to queueing theory modeling and analysis in applications

    Bhat, U Narayan

    2015-01-01

    This introductory textbook is designed for a one-semester course on queueing theory that does not require a course on stochastic processes as a prerequisite. By integrating the necessary background on stochastic processes with the analysis of models, the work provides a sound foundational introduction to the modeling and analysis of queueing systems for a wide interdisciplinary audience of students in mathematics, statistics, and applied disciplines such as computer science, operations research, and engineering. This edition includes additional topics in methodology and applications. Key features: • An introductory chapter including a historical account of the growth of queueing theory over more than 100 years. • A modeling-based approach with emphasis on identification of models. • Rigorous treatment of the foundations of basic models commonly used in applications with appropriate references for advanced topics. • Applications in manufacturing, and in computer and communication systems. • A chapter on ...

  19. Mathematical theory of compressible viscous fluids analysis and numerics

    Feireisl, Eduard; Pokorný, Milan

    2016-01-01

    This book offers an essential introduction to the mathematical theory of compressible viscous fluids. The main goal is to present analytical methods from the perspective of their numerical applications. Accordingly, we introduce the principal theoretical tools needed to handle well-posedness of the underlying Navier-Stokes system, study the problems of sequential stability, and, lastly, construct solutions by means of an implicit numerical scheme. By exploring in detail the "synergy" of analytical and numerical methods, the book offers a unique contribution and a valuable resource for graduate students in mathematics and researchers working in mathematical fluid mechanics. Mathematical fluid mechanics concerns problems that are closely connected to real-world applications and is also an important part of the theory of partial differential equations and numerical analysis in general. This book highlights the fact that numerical and mathematical analysis are not two separate fields of mathematic...

  20. Qualitative data analysis for health services research: developing taxonomy, themes, and theory.

    Bradley, Elizabeth H; Curry, Leslie A; Devers, Kelly J

    2007-08-01

    To provide practical strategies for conducting and evaluating analyses of qualitative data applicable for health services researchers. DATA SOURCES AND DESIGN: We draw on extant qualitative methodological literature to describe practical approaches to qualitative data analysis. Approaches to data analysis vary by discipline and analytic tradition; however, we focus on qualitative data analysis that has as a goal the generation of taxonomy, themes, and theory germane to health services research. We describe an approach to qualitative data analysis that applies the principles of inductive reasoning while also employing predetermined code types to guide data analysis and interpretation. These code types (conceptual, relationship, perspective, participant characteristics, and setting codes) define a structure that is appropriate for generation of taxonomy, themes, and theory. Conceptual codes and subcodes facilitate the development of taxonomies. Relationship and perspective codes facilitate the development of themes and theory. Intersectional analyses with data coded for participant characteristics and setting codes can facilitate comparative analyses. Qualitative inquiry can improve the description and explanation of complex, real-world phenomena pertinent to health services research. Greater understanding of the processes of qualitative data analysis can be helpful for health services researchers as they use these methods themselves or collaborate with qualitative researchers from a wide range of disciplines.

  1. Conceptual and critical analysis of the Implicit Leadership Theory

    Hernández Avilés, Omar David; García Ramos, Tania

    2013-01-01

    The purpose of this essay is to present a conceptual and critical analysis of the Implicit Leadership Theory (ILT). The objectives are: 1) explaining the main concepts of the ILT; 2) explaining the main processes of the ILT; 3) identifying constructivist assumptions in the ILT; 4) identifying constructionist assumptions in the ILT, and 5) critically analyzing the theoretical assumptions of the ILT. In analyzing constructivist and constructionist assumptions in the ILT, the constructivist leadersh...

  2. Theory analysis and simple calculation of travelling wave burnup scheme

    Zhang Jian; Yu Hong; Gang Zhi

    2012-01-01

    Travelling wave burnup scheme is a new burnup scheme that breeds fuel locally just before it burns. Based on a preliminary theoretical analysis, the physical picture was established. Through the calculation of an R-z cylinder travelling wave reactor core with the ERANOS code system, the basic physical characteristics of this new burnup scheme were determined. The results show that a travelling wave reactor is physically feasible and exhibits some favourable reactor physics features. (authors)

  3. Continuous wavelet transform analysis and modal location analysis acoustic emission source location for nuclear piping crack growth monitoring

    Shukri Mohd

    2013-01-01

    Full-text: Source location is an important feature of acoustic emission (AE) damage monitoring in nuclear piping. The ability to accurately locate sources can assist in source characterisation and early warning of failure. This paper describes the development of a novel AE source location technique termed Wavelet Transform analysis and Modal Location (WTML), based on Lamb wave theory and time-frequency analysis, that can be used for global monitoring of plate-like steel structures. Source location was performed on a steel pipe 1500 mm long and 220 mm in outer diameter with a nominal thickness of 5 mm, under a planar location test setup using H-N sources. The accuracy of the new technique was compared with other AE source location methods such as the time of arrival (TOA) technique and deltaT location. The results of the study show that the WTML method produces more accurate location results compared with the TOA and triple point filtering location methods. The accuracy of the WTML approach is comparable with the deltaT location method but requires no initial acoustic calibration of the structure. (author)
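
    The arrival-time step can be illustrated numerically: band-pass the two sensor signals with a complex Morlet wavelet, take the envelope peaks as modal arrival times, and convert the delay to a path-length difference with an assumed group velocity. The signals, frequencies, and velocity below are synthetic stand-ins, not the paper's data:

      import numpy as np

      fs = 1e6
      t = np.arange(0, 2e-3, 1 / fs)

      def burst(t0):                       # toy AE wave packet arriving at t0
          return np.exp(-((t - t0) * 2e4) ** 2) * np.sin(2 * np.pi * 1e5 * (t - t0))

      s1, s2 = burst(4.0e-4), burst(5.2e-4)    # sensor 1 hears the wave first

      # Complex Morlet wavelet tuned to the 100 kHz mode of interest
      tw = np.arange(-5e-5, 5e-5, 1 / fs)
      morlet = np.exp(2j * np.pi * 1e5 * tw) * np.exp(-(tw * 2e4) ** 2)

      env1 = np.abs(np.convolve(s1, morlet, mode="same"))
      env2 = np.abs(np.convolve(s2, morlet, mode="same"))
      dt = (np.argmax(env2) - np.argmax(env1)) / fs    # time delay of arrival

      group_velocity = 3200.0              # assumed Lamb-mode group velocity (m/s)
      print("delay:", dt, "path difference:", group_velocity * dt)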

  6. Developing interprofessional education online: An ecological systems theory analysis.

    Bluteau, Patricia; Clouder, Lynn; Cureton, Debra

    2017-07-01

    This article relates the findings of a discourse analysis of an online asynchronous interprofessional learning initiative involving two UK universities. The impact of the initiative is traced over three intensive periods of online interaction, each of several weeks' duration, occurring over a three-year period, through an analysis of a random sample of discussion forum threads. The corpus of rich data drawn from the forums is interpreted using ecological systems theory, which highlights the complexity of interaction of individual, social and cultural elements. Ecological systems theory adopts a life course approach to understand how development occurs through processes of progressively more complex reciprocal interaction between people and their environment. This lens provides a novel approach for analysis and interpretation of findings with respect to the impact of pre-registration interprofessional education and the interaction between the individual and their social and cultural contexts as they progress through the three to four years of their programmes. Development is mapped over time (the chronosystem) to highlight the complexity of interaction across microsystems (individual), mesosystems (curriculum and institutional/care settings), exosystems (community/wider local context), and macrosystems (national context and culture). This article illustrates the intricacies of students' interprofessional development over time and the interactive effects of social ecological components in terms of professional knowledge and understanding, wider appreciation of health and social care culture and identity work. The implications for contemporary pre-registration interprofessional education and the usefulness and applicability of ecological systems theory for future research and development are considered.

  7. The quantitative analysis of 163Ho source by PIXE

    Sera, K.; Ishii, K.; Fujioka, M.; Izawa, G.; Omori, T.

    1984-01-01

    We have been studying the electron capture in 163Ho as a method for determining the mass of the electron neutrino. The 163Ho sources were produced with the 164Dy(p,2n) reaction by means of a method of internal irradiation. We applied the PIXE method to determine the total number of 163Ho atoms in the source. Proton beams of 3 MeV and a method of 'external standard' were employed for nondestructive analysis of the 163Ho source, as well as an additional method of 'internal standard'. (author)

  8. Pattern theory the stochastic analysis of real-world signals

    Mumford, David

    2010-01-01

    Pattern theory is a distinctive approach to the analysis of all forms of real-world signals. At its core is the design of a large variety of probabilistic models whose samples reproduce the look and feel of the real signals, their patterns, and their variability. Bayesian statistical inference then allows you to apply these models in the analysis of new signals. This book treats the mathematical tools, the models themselves, and the computational algorithms for applying statistics to analyze six representative classes of signals of increasing complexity. The book covers patterns in text, sound

  9. Fringe pattern analysis for optical metrology theory, algorithms, and applications

    Servin, Manuel; Padilla, Moises

    2014-01-01

    The main objective of this book is to present the basic theoretical principles and practical applications for the classical interferometric techniques and the most advanced methods in the field of modern fringe pattern analysis applied to optical metrology. A major novelty of this work is the presentation of a unified theoretical framework based on the Fourier description of phase shifting interferometry using the Frequency Transfer Function (FTF) along with the theory of Stochastic Process for the straightforward analysis and synthesis of phase shifting algorithms with desired properties such

  10. Mobile applications for weight management: theory-based content analysis.

    Azar, Kristen M J; Lesser, Lenard I; Laing, Brian Y; Stephens, Janna; Aurora, Magi S; Burke, Lora E; Palaniappan, Latha P

    2013-11-01

    The use of smartphone applications (apps) to assist with weight management is increasingly prevalent, but the quality of these apps is not well characterized. The goal of the study was to evaluate diet/nutrition and anthropometric tracking apps based on incorporation of features consistent with theories of behavior change. A comparative, descriptive assessment was conducted of the top-rated free apps in the Health and Fitness category available in the iTunes App Store. Health and Fitness apps (N=200) were evaluated using predetermined inclusion/exclusion criteria and categorized based on commonality in functionality, features, and developer description. Four researchers then evaluated the two most popular apps in each category using two instruments: one based on traditional behavioral theory (score range: 0-100) and the other on the Fogg Behavioral Model (score range: 0-6). Data collection and analysis occurred in November 2012. Eligible apps (n=23) were divided into five categories: (1) diet tracking; (2) healthy cooking; (3) weight/anthropometric tracking; (4) grocery decision making; and (5) restaurant decision making. The mean behavioral theory score was 8.1 (SD=4.2); the mean persuasive technology score was 1.9 (SD=1.7). The top-rated app on both scales was Lose It! by Fitnow Inc. All apps received low overall scores for inclusion of behavioral theory-based strategies. © 2013 American Journal of Preventive Medicine.

  11. Analysis of 3-panel and 4-panel microscale ionization sources

    Natarajan, Srividya; Parker, Charles B.; Glass, Jeffrey T.; Piascik, Jeffrey R.; Gilchrist, Kristin H.; Stoner, Brian R.

    2010-01-01

    Two designs of a microscale electron ionization (EI) source are analyzed herein: a 3-panel design and a 4-panel design. Devices were fabricated using microelectromechanical systems technology. Field emission from carbon nanotubes provided the electrons for the EI source. Ion currents were measured for helium, nitrogen, and xenon at pressures ranging from 10⁻⁴ to 0.1 Torr. A comparison of the performance of both designs is presented. The 4-panel microion source showed a 10x improvement in performance compared to the 3-panel device. An analysis of the various factors affecting the performance of the microion sources is also presented. SIMION, an electron and ion optics software package, was coupled with experimental measurements to analyze the ion current results. The electron current contributing to ionization and the ion collection efficiency are believed to be the primary factors responsible for the higher efficiency of the 4-panel microion source. Other improvements in device design that could lead to higher ion source efficiency in the future are also discussed. These microscale ion sources are expected to find application as stand-alone ion sources as well as in miniature mass spectrometers.

  12. Analysis of the tuning characteristics of microwave plasma source

    Miotk, Robert, E-mail: rmiotk@imp.gda.pl; Jasiński, Mariusz [Centre for Plasma and Laser Engineering, The Szewalski Institute of Fluid-Flow Machinery, Polish Academy of Sciences, Fiszera 14, 80-231 Gdańsk (Poland); Mizeraczyk, Jerzy [Department of Marine Electronics, Gdynia Maritime University, Morska 81-87, 81-225 Gdynia (Poland)

    2016-04-15

    In this paper, we present an analysis of the tuning characteristics of a waveguide-supplied metal-cylinder-based nozzleless microwave plasma source. This analysis has enabled us to estimate the electron concentration n_e and the electron collision frequency ν in the plasma generated in nitrogen and in a mixture of nitrogen and ethanol vapour. The parameters n_e and ν are the basic quantities that characterize the plasma. The presented new plasma diagnostic method is particularly useful when spectroscopic methods are useless. The presented plasma source is currently used in research on hydrogen production from liquids.

  14. Analysis on the inbound tourist source market in Fujian Province

    YU, Tong

    2017-06-01

    The paper analyzes the development and structure of inbound tourism in Fujian Province with Excel software and conducts a cluster analysis of the inbound tourism market with SPSS 23.0 software, based on the inbound tourism data of Fujian Province from 2006 to 2015. The results show that inbound tourism in Fujian Province is developing rapidly and that its diversified source countries indicate a stable inbound tourism market; according to the cluster analysis, the inbound tourist source market in Fujian Province can be divided into four categories, and tourists from the United States, Japan, Malaysia, and Singapore are the key to inbound tourism in Fujian Province.

  15. Principle-based concept analysis: intentionality in holistic nursing theories.

    Aghebati, Nahid; Mohammadi, Eesa; Ahmadi, Fazlollah; Noaparast, Khosrow Bagheri

    2015-03-01

    This is a report of a principle-based concept analysis of intentionality in holistic nursing theories. A principle-based concept analysis method was used to analyze seven holistic theories. The data included eight books and 31 articles (1998-2011), which were retrieved through MEDLINE and CINAHL. Erickson, Kriger, Parse, Watson, and Zahourek define intentionality as a capacity, a focused consciousness, and a pattern of human being. Rogers and Newman do not explicitly mention intentionality; however, they do explain pattern and consciousness (epistemology). Intentionality has been operationalized as a core concept of nurse-client relationships (pragmatic). The theories are consistent on intentionality as a noun and as an attribute of the person-intentionality is different from intent and intention (linguistic). There is ambiguity concerning the boundaries between intentionality and consciousness (logic). Theoretically, intentionality is an evolutionary capacity to integrate human awareness and experience. Because intentionality is an individualized concept, we introduced it as "a matrix of continuous known changes" that emerges in two forms: as a capacity of human being and as a capacity of transpersonal caring. This study has produced a theoretical definition of intentionality and provides a foundation for future research to further investigate intentionality to better delineate its boundaries. © The Author(s) 2014.

  16. Sources of Cognitive Inflexibility in Set-Shifting Tasks: Insights Into Developmental Theories From Adult Data.

    Dick, Anthony Steven

    2012-01-01

    Two experiments examined processes underlying cognitive inflexibility in set-shifting tasks typically used to assess the development of executive function in children. Adult participants performed a Flexible Item Selection Task (FIST) that requires shifting from categorizing by one dimension (e.g., color) to categorizing by a second orthogonal dimension (e.g., shape). The experiments showed performance of the FIST involves suppression of the representation of the ignored dimension; response times for selecting a target object in an immediately-following oddity task were slower when the oddity target was the previously-ignored stimulus of the FIST. However, proactive interference from the previously relevant stimulus dimension also impaired responding. The results are discussed with respect to two prominent theories of the source of difficulty for children and adults on dimensional shifting tasks: attentional inertia and negative priming. In contrast to prior work emphasizing one over the other process, the findings indicate that difficulty in the FIST, and by extension other set-shifting tasks, can be attributed to both the need to shift away from the previously attended representation (attentional inertia), and the need to shift to the previously ignored representation (negative priming). Results are discussed in relation to theoretical explanations for cognitive inflexibility in adults and children.

  17. Recognition memory, self-other source memory, and theory-of-mind in children with autism spectrum disorder.

    Lind, Sophie E; Bowler, Dermot M

    2009-09-01

    This study investigated semantic and episodic memory in autism spectrum disorder (ASD), using a task which assessed recognition and self-other source memory. Children with ASD showed undiminished recognition memory but significantly diminished source memory, relative to age- and verbal ability-matched comparison children. Both children with and without ASD showed an "enactment effect", demonstrating significantly better recognition and source memory for self-performed actions than other-person-performed actions. Within the comparison group, theory-of-mind (ToM) task performance was significantly correlated with source memory, specifically for other-person-performed actions (after statistically controlling for verbal ability). Within the ASD group, ToM task performance was not significantly correlated with source memory (after controlling for verbal ability). Possible explanations for these relations between source memory and ToM are considered.

  18. Sensitivity Analysis of Deviation Source for Fast Assembly Precision Optimization

    Jianjun Tang

    2014-01-01

    Full Text Available Assembly precision optimization of complex products has a huge benefit in improving product quality. Because many deviation sources couple with one another, the goal of assembly precision optimization is difficult to determine accurately. In order to optimize assembly precision accurately and rapidly, sensitivity analysis of deviation sources is proposed. First, deviation source sensitivity is defined as the ratio of assembly dimension variation to deviation source dimension variation. Second, according to the assembly constraint relations, assembly sequences and locating, deviation transmission paths are established by locating the joints between adjacent parts and establishing each part's datum reference frame. Third, assembly multidimensional vector loops are created using the deviation transmission paths, and the corresponding scalar equations of each dimension are established. Then, assembly deviation source sensitivity is calculated by using a first-order Taylor expansion and a matrix transformation method. Finally, taking the assembly precision optimization of a wing flap rocker as an example, the effectiveness and efficiency of the deviation source sensitivity analysis method are verified.
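
    The sensitivity definition lends itself to a short numerical sketch: linearize an assembly dimension about the nominal source dimensions and read the sensitivities off the first-order terms. The two-link assembly function below is invented, not the paper's wing-flap rocker:

      import numpy as np

      # Deviation-source sensitivity as the ratio of assembly-dimension
      # variation to source-dimension variation, via a first-order Taylor
      # expansion (central differences stand in for the analytic terms).
      def assembly_dim(x):
          L1, L2, theta = x
          return L1 + L2 * np.cos(theta)   # scalar closed-loop dimension

      x0 = np.array([100.0, 50.0, np.deg2rad(30.0)])   # nominal dimensions

      eps = 1e-6
      sens = np.array([
          (assembly_dim(x0 + eps * e) - assembly_dim(x0 - eps * e)) / (2 * eps)
          for e in np.eye(3)
      ])

      tol = np.array([0.05, 0.05, np.deg2rad(0.1)])    # source tolerances
      assembly_variation = np.sqrt(np.sum((sens * tol) ** 2))  # RSS stack-up
      print("sensitivities:", sens, "assembly variation:", assembly_variation)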

  19. Thought analysis on self-organization theories of MHD plasma

    Kondoh, Yoshiomi; Sato, Tetsuya.

    1992-08-01

    A thought analysis of the self-organization theories of dissipative MHD plasma is presented, leading to three groups of theories that arrive at the same relaxed state ∇ × B = λB, in order to find an essential physical picture embedded in the self-organization phenomena due to nonlinear and dissipative processes. The self-organized relaxed state due to dissipation by Ohmic loss is shown to be formulated generally as the state that yields the minimum dissipation rate of global auto- and/or cross-correlations between two of the quantities j, B, and A, for given instantaneous values of those global correlations. (author)
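
    For reference, the most familiar route to this relaxed state (Taylor-style relaxation, one of the groups being compared) minimizes magnetic energy at fixed global helicity; in standard notation:

      \delta\!\left[\int \frac{B^{2}}{2\mu_{0}}\,dV \;-\; \frac{\lambda}{2\mu_{0}}\int \mathbf{A}\cdot\mathbf{B}\,dV\right] = 0 \;\Longrightarrow\; \nabla\times\mathbf{B} = \lambda\mathbf{B},

    with the Lagrange multiplier λ constant throughout the volume, which is exactly the force-free state quoted in the abstract.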

  20. Human factors and fuzzy set theory for safety analysis

    Nishiwaki, Y.

    1987-01-01

    Human reliability and performance is affected by many factors: medical, physiological and psychological, etc. The uncertainty involved in human factors may not necessarily be probabilistic, but fuzzy. Therefore, it is important to develop a theory by which both the non-probabilistic uncertainties, or fuzziness, of human factors and the probabilistic properties of machines can be treated consistently. In reality, randomness and fuzziness are sometimes mixed. From the mathematical point of view, probabilistic measures may be considered a special case of fuzzy measures. Therefore, fuzzy set theory seems to be an effective tool for analysing man-machine systems. The concept 'failure possibility' based on fuzzy sets is suggested as an approach to safety analysis and fault diagnosis of a large complex system. Fuzzy measures and fuzzy integrals are introduced and their possible applications are also discussed. (author)
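
    The 'failure possibility' idea can be made concrete with standard max-min fuzzy operations. A toy sketch in which the membership grades and the rule base are entirely invented:

      # Fuzzy "failure possibility" via standard max-min composition.
      fatigue = {"low": 0.2, "medium": 0.7, "high": 0.4}   # operator state grades
      stress = {"low": 0.5, "medium": 0.6, "high": 0.3}

      # Fuzzy relation: grade to which each (fatigue, stress) pair implies failure
      failure_rule = {
          ("high", "high"): 0.9, ("high", "medium"): 0.7, ("medium", "high"): 0.7,
          ("medium", "medium"): 0.5, ("low", "low"): 0.1,
      }

      # Possibility of failure = sup over states of min(memberships, rule grade)
      possibility = max(
          min(fatigue[f], stress[s], grade)
          for (f, s), grade in failure_rule.items()
      )
      print("failure possibility:", possibility)   # 0.5 for these grades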

  1. A game theory analysis of green infrastructure stormwater management policies

    William, Reshmina; Garg, Jugal; Stillwell, Ashlynn S.

    2017-09-01

    Green stormwater infrastructure has been demonstrated as an innovative water resources management approach that addresses multiple challenges facing urban environments. However, there is little consensus on what policy strategies can be used to best incentivize green infrastructure adoption by private landowners. Game theory, an analysis framework that has historically been under-utilized within the context of stormwater management, is uniquely suited to address this policy question. We used a cooperative game theory framework to investigate the potential impacts of different policy strategies used to incentivize green infrastructure installation. The results indicate that municipal regulation leads to the greatest reduction in pollutant loading. However, the choice of the "best" regulatory approach will depend on a variety of different factors including politics and financial considerations. Large, downstream agents have a disproportionate share of bargaining power. Results also reveal that policy impacts are highly dependent on agents' spatial position within the stormwater network, leading to important questions of social equity and environmental justice.
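
    The bargaining-power observation can be illustrated with the standard cooperative-game calculation of Shapley values. The three-landowner characteristic function below is invented; the downstream agent's larger marginal contributions give it the larger share:

      from itertools import permutations

      # Shapley values for a 3-landowner coalition game; v(S) is an invented
      # pollutant-load reduction achieved when coalition S installs together.
      players = ("upstream", "midstream", "downstream")
      v = {
          (): 0, ("upstream",): 2, ("midstream",): 3, ("downstream",): 5,
          ("upstream", "midstream"): 6, ("upstream", "downstream"): 8,
          ("midstream", "downstream"): 10,
          ("upstream", "midstream", "downstream"): 14,
      }

      def val(coalition):                  # look up v for an unordered set
          return v[tuple(p for p in players if p in coalition)]

      # Shapley value = average marginal contribution over all join orders
      shapley = {p: 0.0 for p in players}
      orders = list(permutations(players))
      for order in orders:
          seen = set()
          for p in order:
              shapley[p] += (val(seen | {p}) - val(seen)) / len(orders)
              seen.add(p)

      print(shapley)   # downstream's larger share mirrors its bargaining power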

  2. Sensitivity theory for reactor burnup analysis based on depletion perturbation theory

    Yang, Wonsik.

    1989-01-01

    The large computational effort involved in the design and analysis of advanced reactor configurations motivated the development of Depletion Perturbation Theory (DPT) for general fuel cycle analysis. The work here focused on two important advances in the current methods. First, the adjoint equations were developed for using the efficient linear flux approximation to decouple the neutron/nuclide field equations. And second, DPT was extended to the constrained equilibrium cycle which is important for the consistent comparison and evaluation of alternative reactor designs. Practical strategies were formulated for solving the resulting adjoint equations and a computer code was developed for practical applications. In all cases analyzed, the sensitivity coefficients generated by DPT were in excellent agreement with the results of exact calculations. The work here indicates that for a given core response, the sensitivity coefficients to all input parameters can be computed by DPT with a computational effort similar to a single forward depletion calculation

  3. Decision Engines for Software Analysis Using Satisfiability Modulo Theories Solvers

    Bjorner, Nikolaj

    2010-01-01

    The area of software analysis, testing and verification is now undergoing a revolution thanks to the use of automated and scalable support for logical methods. A well-recognized premise is that at the core of software analysis engines is invariably a component using logical formulas for describing states and transformations between system states. The process of using this information for discovering and checking program properties (including such important properties as safety and security) amounts to automatic theorem proving. In particular, theorem provers that directly support common software constructs offer a compelling basis. Such provers are commonly called satisfiability modulo theories (SMT) solvers. Z3 is a state-of-the-art SMT solver. It is developed at Microsoft Research. It can be used to check the satisfiability of logical formulas over one or more theories such as arithmetic, bit-vectors, lists, records and arrays. The talk describes some of the technology behind modern SMT solvers, including the solver Z3. Z3 is currently mainly targeted at solving problems that arise in software analysis and verification. It has been applied in various contexts, such as systems for dynamic symbolic execution (Pex, SAGE, Vigilante), for program verification and extended static checking (Spec#/Boogie, VCC, HAVOC), for software model checking (Yogi, SLAM), model-based design (FORMULA), security protocol code (F7), and program run-time analysis and invariant generation (VS3). We will describe how it integrates support for a variety of theories that arise naturally in the context of the applications. There are several new promising avenues and the talk will touch on some of these and the challenges related to SMT solvers.
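
    A minimal taste of the Z3 Python API (the z3-solver package): assert the negation of a claimed program invariant and ask the solver for a counterexample. The invariant itself is a made-up toy, not an example from the talk:

      from z3 import And, Implies, Ints, Not, Solver, sat

      x, y = Ints("x y")
      # Claim: if 0 <= x < y then x + 1 <= y
      claim = Implies(And(x >= 0, x < y), x + 1 <= y)

      s = Solver()
      s.add(Not(claim))                    # search for a counterexample
      if s.check() == sat:
          print("counterexample:", s.model())
      else:
          print("claim holds for all integers")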

  4. Dosimetric analysis of radiation sources to use in dermatological lesions

    Tada, Ariane

    2010-01-01

    Skin lesions undergoing therapy with radiation sources may have different patterns of malignancy. The malignant lesions, or cancers, most commonly found in radiotherapy services are carcinomas. Radiation therapy of skin lesions is performed with low-penetration beams: orthovoltage X-rays, electron beams, and radioactive sources (192Ir, 198Au, and 90Sr) arranged on a surface mould or in a metal applicator. This study aims to analyze the therapeutic radiation dose profile produced by radiation sources used in skin lesion radiotherapy procedures. Experimental measurements for the dosimetric analysis of the radiation sources were compared with calculations obtained from a computer system based on the Monte Carlo method, and the computational results showed good agreement with the experimental measurements. The experimental measurements and the results computed with the MCNP4C code have been used to validate the calculations obtained by the MCNP code and to provide a reliable medical application for each clinical case. (author)

  5. Exact interior solutions for static spheres in the Einstein-Cartan theory with two sources of torsion

    Gallakhmetov, A M

    2002-01-01

    In the framework of the problem of existence of exact interior solutions for static spherically symmetric configurations in the Einstein-Cartan theory (ECT), the distributions of perfect fluid and non-minimally coupled scalar field are considered. The exact solutions in the one-torsion ECT and two-torsion one are obtained. Some consequences of two sources of torsion are discussed.

  6. Stability analysis of black holes via a catastrophe theory and black hole thermodynamics in generalized theories of gravity

    Tamaki, Takashi; Torii, Takashi; Maeda, Kei-ichi

    2003-01-01

    We perform a linear perturbation analysis for black hole solutions with a 'massive' Yang-Mills field (the Proca field) in Brans-Dicke theory and find that the results are quite consistent with those via catastrophe theory where thermodynamic variables play an intrinsic role. Based on this observation, we show the general relation between these two methods in generalized theories of gravity which are conformally related to the Einstein-Hilbert action

  7. Critical Analysis on Open Source LMSs Using FCA

    Sumangali, K.; Kumar, Ch. Aswani

    2013-01-01

    The objective of this paper is to apply Formal Concept Analysis (FCA) to identify the best open source Learning Management System (LMS) for an E-learning environment. FCA is a mathematical framework that represents knowledge derived from a formal context. In constructing the formal context, LMSs are treated as objects and their features as…

  8. Modular Open-Source Software for Item Factor Analysis

    Pritikin, Joshua N.; Hunter, Micheal D.; Boker, Steven M.

    2015-01-01

    This article introduces an item factor analysis (IFA) module for "OpenMx," a free, open-source, and modular statistical modeling package that runs within the R programming environment on GNU/Linux, Mac OS X, and Microsoft Windows. The IFA module offers a novel model specification language that is well suited to programmatic generation…

  9. Analysis on Dangerous Source of Large Safety Accident in Storage Tank Area

    Wang, Tong; Li, Ying; Xie, Tiansheng; Liu, Yu; Zhu, Xueyuan

    2018-01-01

    The difference between a large safety accident and a general accident is that the consequences of a large safety accident are particularly serious. This paper studies which factors in a tank area directly or indirectly lead to the occurrence of large safety accidents. According to the theory of three kinds of hazard sources and a consequence-cause analysis of major safety accidents, the dangerous sources of major safety accidents in tank areas are analyzed from four aspects: energy sources, direct causes of large safety accidents, missing management, and environmental impact. Based on the analysis of the three kinds of hazard sources and the environment, the main risk factors are derived and an AHP evaluation model is established; after rigorous and scientific calculation, the weights of the related factors across the four kinds of risk factors, and within each kind, are obtained. The result of the analytic hierarchy process shows that management reasons are the most important, followed by environmental factors, the direct causes, and energy sources. It should be noted that although the direct causes have relatively low overall importance, the direct causes 'failure of emergency measures' and 'failure of prevention and control facilities' carry greater weight.
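
    The AHP weighting step can be sketched in a few lines: build a pairwise-comparison matrix over the four aspects, take the principal eigenvector as the weight vector, and check consistency. The judgment matrix below is invented for illustration, not the paper's data:

      import numpy as np

      # AHP priority weights from a pairwise-comparison matrix.
      labels = ["management", "environment", "direct cause", "energy source"]
      A = np.array([
          [1.0,   3.0,   5.0,   7.0],
          [1/3.0, 1.0,   3.0,   5.0],
          [1/5.0, 1/3.0, 1.0,   3.0],
          [1/7.0, 1/5.0, 1/3.0, 1.0],
      ])

      w, V = np.linalg.eig(A)
      k = int(np.argmax(w.real))
      weights = V[:, k].real
      weights /= weights.sum()             # principal eigenvector as weights

      ci = (w[k].real - 4) / (4 - 1)       # consistency index
      print(dict(zip(labels, np.round(weights, 3))), "CI:", round(ci, 3))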

  10. Positioning Theory and Discourse Analysis: Some Tools for Social Interaction Analysis

    Francisco Tirado

    2007-05-01

    Full Text Available This article outlines positioning theory as a discursive analysis of interaction, focusing on the topic of conflict. Moreover, the theory is applied to a new work environment for the social sciences: virtual spaces. The analysis is organized in the following way. First, the major psychosocial issues which define the topic of conflict are reviewed. Then, virtual environments are presented as a new work space for the social sciences. Thirdly, a synthesis of positioning theory and its FOUCAULTian legacy is conducted, while appreciating its particular appropriateness for analyzing conflictive interaction in virtual environments. An empirical case is then presented, consisting of an analysis of interactive sequences within a specific virtual environment: the Universitat Oberta de Catalunya (UOC) Humanitats i Filologia Catalana studies forum. Through positioning theory, the production and effects of a conflictive interaction sequence on the community in which it is produced are understood and explained. URN: urn:nbn:de:0114-fqs0702317

  11. Stepped-frequency radar sensors theory, analysis and design

    Nguyen, Cam

    2016-01-01

    This book presents the theory, analysis and design of microwave stepped-frequency radar sensors. Stepped-frequency radar sensors are attractive for various sensing applications that require fine resolution. The book consists of five chapters. The first chapter describes the fundamentals of radar sensors including applications followed by a review of ultra-wideband pulsed, frequency-modulated continuous-wave (FMCW), and stepped-frequency radar sensors. The second chapter discusses a general analysis of radar sensors including wave propagation in media and scattering on targets, as well as the radar equation. The third chapter addresses the analysis of stepped-frequency radar sensors including their principles and design parameters. Chapter 4 presents the development of two stepped-frequency radar sensors at microwave and millimeter-wave frequencies based on microwave integrated circuits (MICs), microwave monolithic integrated circuits (MMICs) and printed-circuit antennas, and discusses their signal processing....
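
    The stepped-frequency principle reduces to a small amount of arithmetic: N frequency steps of size Δf give range resolution c/(2NΔf) within an unambiguous window of c/(2Δf), and an IDFT of the phase history forms the range profile. A synthetic single-target sketch with all radar parameters invented:

      import numpy as np

      c = 3e8
      f0, df, N = 2e9, 10e6, 64            # start frequency, step size, steps
      R = 12.4                             # assumed target range (m)

      freqs = f0 + df * np.arange(N)
      echo = np.exp(-1j * 4 * np.pi * freqs * R / c)   # point-target phase history

      profile = np.abs(np.fft.ifft(echo))  # synthetic range profile
      window = c / (2 * df)                # unambiguous range window (15 m here)
      ranges = np.arange(N) * window / N
      print("resolution:", c / (2 * N * df),
            "estimated range:", ranges[int(np.argmax(profile))])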

  12. Dimensional analysis, similarity, analogy, and the simulation theory

    Davis, A.A.

    1978-01-01

    Dimensional analysis, similarity, analogy, and cybernetics are shown to be four consecutive steps in applying the simulation theory. This paper introduces the classes of phenomena that obey the same formal mathematical equations, taken as models of the natural laws, and the groups of phenomena within which simplified nondimensional mathematical equations can be introduced. Simulation by similarity within a single field of physics, by analogy across two or more different fields of physics, and by cybernetics across two or more fields of mathematics, physics, biology, economics, politics, sociology, etc., appears as a single theory that permits transporting experimental results from the models, suitably selected to meet the conditions of the research, construction and measurement in the laboratory, to the originals that are the primary objects of the research. Some unavoidable conclusions about the use of simplified nondimensional mathematical equations as models of natural laws are presented, and limitations on the use of simulation theory based on assumed simplifications are recognized. The paper argues that scientific research must write mathematical models of general laws applicable to nature in its entirety, and proposes extending the second law of thermodynamics, as a generalized law of entropy, to model life and its activities. It concludes that the physical study and philosophical interpretation of phenomena and natural laws cannot be separated in scientific work; they are interconnected, and neither can be placed above the other.
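
    As a one-line worked example of the nondimensionalization step (the pendulum is an illustration chosen here, not one of the paper's cases):

    ```latex
    % Buckingham-pi reduction for a simple pendulum, T = f(L, g, m).
    % Dimensions: [T] = s, [L] = m, [g] = m s^{-2}, [m] = kg. Mass is the
    % only variable carrying kg, so it cannot enter a dimensionless product;
    % the single pi-group and the resulting model law are
    \[
      \Pi = T\sqrt{g/L}
      \quad\Longrightarrow\quad
      T = C\,\sqrt{L/g},
    \]
    % so one measurement on a scale model fixes the constant C for every
    % geometrically similar original.
    ```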

  13. Item response theory analysis of the mechanics baseline test

    Cardamone, Caroline N.; Abbott, Jonathan E.; Rayyan, Saif; Seaton, Daniel T.; Pawl, Andrew; Pritchard, David E.

    2012-02-01

    Item response theory is useful in both the development and evaluation of assessments and in computing standardized measures of student performance. In item response theory, individual parameters (difficulty, discrimination) for each item or question are fit by item response models. These parameters provide a means for evaluating a test and offer a better measure of student skill than a raw test score, because each skill calculation considers not only the number of questions answered correctly, but the individual properties of all questions answered. Here, we present the results from an analysis of the Mechanics Baseline Test given at MIT during 2005-2010. Using the item parameters, we identify questions on the Mechanics Baseline Test that are not effective in discriminating between MIT students of different abilities. We show that a limited subset of the highest quality questions on the Mechanics Baseline Test returns accurate measures of student skill. We compare student skills as determined by item response theory to the more traditional measurement of the raw score and show that a comparable measure of learning gain can be computed.
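
    A minimal sketch of the two-parameter logistic model behind such an analysis (the parameter values are illustrative, not the fitted MBT estimates):

    ```python
    import numpy as np

    # 2PL item response model: probability that a student of ability theta
    # answers an item with difficulty b and discrimination a correctly.
    def p_correct(theta: float, a: float, b: float) -> float:
        return 1.0 / (1.0 + np.exp(-a * (theta - b)))

    # A discriminating item (a = 2.0) separates students around its
    # difficulty far more sharply than a weak item (a = 0.3) does.
    for theta in (-1.0, 0.0, 1.0):
        print(theta, p_correct(theta, 2.0, 0.0), p_correct(theta, 0.3, 0.0))
    ```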

  14. Analysis of interacting quantum field theory in curved spacetime

    Birrell, N.D.; Taylor, J.G.

    1980-01-01

    A detailed analysis of interacting quantized fields propagating in a curved background spacetime is given. Reduction formulas for S-matrix elements in terms of vacuum Green's functions are derived, special attention being paid to the possibility that the ''in'' and ''out'' vacuum states may not be equivalent. Green's function equations are obtained and a diagrammatic representation for them is given, allowing a formal, diagrammatic renormalization to be effected. Coordinate-space techniques for showing renormalizability are developed in Minkowski space for λφ³ field theories in four and six dimensions. The extension of these techniques to curved spacetimes is considered. It is shown that the possibility of field theories becoming nonrenormalizable there cannot be ruled out, although, allowing certain modifications to the theory, λφ³ in four dimensions is proven renormalizable in a large class of spacetimes. Finally, particle production from the vacuum by the gravitational field is discussed with particular reference to Schwarzschild spacetime. We shed some light on the nonlocalizability of the production process and on the definition of the S matrix for such processes.

  15. Statistical Analysis of Designed Experiments Theory and Applications

    Tamhane, Ajit C

    2012-01-01

    An indispensable guide to understanding and designing modern experiments. The tools and techniques of Design of Experiments (DOE) allow researchers to successfully collect, analyze, and interpret data across a wide array of disciplines. Statistical Analysis of Designed Experiments provides a modern and balanced treatment of DOE methodology with thorough coverage of the underlying theory and standard designs of experiments, guiding the reader through applications to research in various fields such as engineering, medicine, business, and the social sciences. The book supplies a foundation for the...

  16. Linearly Polarized IR Spectroscopy Theory and Applications for Structural Analysis

    Kolev, Tsonko

    2011-01-01

    A technique that is useful in the study of pharmaceutical products and biological molecules, polarization IR spectroscopy has undergone continuous development since it first emerged almost 100 years ago. Capturing the state of the science as it exists today, "Linearly Polarized IR Spectroscopy: Theory and Applications for Structural Analysis" demonstrates how the technique can be properly utilized to obtain important information about the structure and spectral properties of oriented compounds. The book starts with the theoretical basis of linear-dichroic infrared (IR-LD) spectroscopy...

  17. Gravitational-wave physics and astronomy an introduction to theory, experiment and data analysis

    Creighton, Jolien D E

    2011-01-01

    This most up-to-date, one-stop reference combines coverage of both theory and observational techniques, with introductory sections to bring all readers up to the same level. Written by outstanding researchers directly involved with the scientific program of the Laser Interferometer Gravitational-Wave Observatory (LIGO), the book begins with a brief review of general relativity before going on to describe the physics of gravitational waves and the astrophysical sources of gravitational radiation. Further sections cover gravitational wave detectors, data analysis, and the outlook of gravitation

  18. Radioisotope sources for X-ray fluorescence analysis

    Leonowich, J.; Pandian, S.; Preiss, I.L.

    1977-01-01

    Problems involved in developing radioisotope sources and the characteristics of potentially useful radioisotopes for X-ray fluorescence analysis are presented. These include the following. The isotope must be evaluated for the physical and chemical forms available, purity, half-life, specific activity, toxicity, and cost. The radiation hazards of the source must be considered. The type and amount of radiation output of the source must be evaluated. The source construction must be planned. The source should also present an advance over those currently available in order to justify its development. Some of the isotopes which are not in use but look very promising are indicated, and their data are tabulated. A more or less ''perfect'' source within a given range of interest would exhibit the following characteristics. (1) Decay by an isomeric transition with little or no internal conversion. (2) Have an intense gamma transition near the absorption edge of the element(s) of interest, with no high-energy gammas. (3) Have a sufficiently long half-life (on the order of years) for both economic and calibration reasons. (4) Have a sufficiently large cross-section for production in a reasonable amount of time. If there are competing reactions, the interfering isotopes should be reasonably short-lived or, if not, should be separable from the isotope chemically with a minimum of difficulty. (T.G.)

  19. Constrained Null Space Component Analysis for Semiblind Source Separation Problem.

    Hwang, Wen-Liang; Lu, Keng-Shih; Ho, Jinn

    2018-02-01

    The blind source separation (BSS) problem extracts unknown sources from observations of their unknown mixtures. A current trend in BSS is the semiblind approach, which incorporates prior information on the sources or on how the sources are mixed. The constrained independent component analysis (ICA) approach has been studied as a way to impose constraints on the well-known ICA framework. We introduced an alternative approach based on the null space component analysis (NCA) framework, referred to as the c-NCA approach. We also presented the c-NCA algorithm, which uses signal-dependent semidefinite operators, a bilinear mapping, as signatures for operator design in the c-NCA approach. Theoretically, we showed that the source estimation of the c-NCA algorithm converges, with a convergence rate dependent on the decay of the sequence obtained by applying the estimated operators to the corresponding sources. The c-NCA can be formulated as a deterministic constrained optimization method, and thus it can take advantage of solvers developed in the optimization community for solving the BSS problem. As examples, we demonstrated that electroencephalogram interference rejection problems can be solved by the c-NCA with proximal splitting algorithms, by incorporating a sparsity-enforcing separation model and considering the case when reference signals are available.
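
    For readers unfamiliar with the underlying problem, here is a minimal sketch of the generic BSS setting using classical FastICA (this illustrates the problem statement only, not the c-NCA algorithm; all signals are synthetic):

    ```python
    import numpy as np
    from sklearn.decomposition import FastICA

    # Recover unknown sources from observations of their unknown mixture.
    rng = np.random.default_rng(0)
    t = np.linspace(0, 8, 2000)
    sources = np.c_[np.sin(2 * t), np.sign(np.sin(3 * t))]  # unknown sources
    mixing = np.array([[1.0, 0.5], [0.4, 1.0]])             # unknown mixing
    observed = sources @ mixing.T

    ica = FastICA(n_components=2, random_state=0)
    estimated = ica.fit_transform(observed)  # sources up to order and scale
    ```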

  20. Vocal individuality cues in the African penguin (Spheniscus demersus): a source-filter theory approach.

    Favaro, Livio; Gamba, Marco; Alfieri, Chiara; Pessani, Daniela; McElligott, Alan G

    2015-11-25

    The African penguin is a nesting seabird endemic to southern Africa. In penguins of the genus Spheniscus, vocalisations are important for social recognition. However, it is not clear which acoustic features of calls can encode individual identity information. We recorded contact calls and ecstatic display songs of 12 adult birds from a captive colony. For each vocalisation, we measured 31 spectral and temporal acoustic parameters related to both the source and filter components of calls. For each parameter, we calculated the Potential of Individual Coding (PIC). The acoustic parameters showing PIC ≥ 1.1 were used to perform a stepwise cross-validated discriminant function analysis (DFA). The DFA assigned 66.1% of the contact calls and 62.5% of the display songs to the correct individual. The DFA also resulted in the further selection of 10 acoustic features for contact calls and 9 for display songs that were important for vocal individuality. Our results suggest that studying the anatomical constraints that influence nesting penguin vocalisations from a source-filter perspective can lead to a much better understanding of the acoustic cues of individuality contained in their calls. This approach could be further extended to study and understand vocal communication in other bird species.
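
    A skeletal version of the cross-validated DFA workflow described above (random placeholder features stand in for the measured acoustic parameters; with real source- and filter-related features the classification rate rises well above chance):

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    # Rows are calls, columns are acoustic parameters (placeholders here);
    # labels identify the calling individual (12 birds, 10 calls each).
    rng = np.random.default_rng(1)
    X = rng.normal(size=(120, 10))
    y = np.tile(np.arange(12), 10)

    scores = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5)
    print("cross-validated classification rate:", scores.mean())
    ```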

  1. ERP Reliability Analysis (ERA) Toolbox: An open-source toolbox for analyzing the reliability of event-related brain potentials.

    Clayson, Peter E; Miller, Gregory A

    2017-01-01

    Generalizability theory (G theory) provides a flexible, multifaceted approach to estimating score reliability. G theory's approach to estimating score reliability has important advantages over classical test theory that are relevant for research using event-related brain potentials (ERPs). For example, G theory does not require parallel forms (i.e., equal means, variances, and covariances), can handle unbalanced designs, and provides a single reliability estimate for designs with multiple sources of error. This monograph provides a detailed description of the conceptual framework of G theory using examples relevant to ERP researchers, presents the algorithms needed to estimate ERP score reliability, and provides a detailed walkthrough of newly developed software, the ERP Reliability Analysis (ERA) Toolbox, that calculates score reliability using G theory. The ERA Toolbox is open-source, Matlab software that uses G theory to estimate the contribution of the number of trials retained for averaging, group, and/or event types on ERP score reliability. The toolbox facilitates the rigorous evaluation of psychometric properties of ERP scores recommended elsewhere in this special issue.
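
    A tiny Python sketch of the single-facet special case of the idea the toolbox quantifies (the toolbox itself is Matlab and handles multifaceted designs; the variance components here are invented):

    ```python
    # With variance components for persons (true score) and trials (error),
    # the dependability of an ERP score averaged over n trials is
    #   phi(n) = var_p / (var_p + var_e / n).
    var_person, var_trial = 4.0, 25.0

    def dependability(n_trials: int) -> float:
        return var_person / (var_person + var_trial / n_trials)

    for n in (5, 10, 20, 40):
        print(n, round(dependability(n), 3))  # reliability grows with trials
    ```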

  2. CO_2 volatility impact on energy portfolio choice: A fully stochastic LCOE theory analysis

    Lucheroni, Carlo; Mari, Carlo

    2017-01-01

    Highlights: • Stochastic LCOE theory is an extension of the levelized cost of electricity analysis. • The fully stochastic analysis includes stochastic processes for fossil fuel prices and CO₂ prices. • The nuclear asset is risky through uncertainty about construction times and is used as a hedge. • The volatility of CO₂ prices has a strong influence on CO₂ emissions reduction. - Abstract: Market-based pricing of CO₂ was designed to control CO₂ emissions by means of the price level, since high CO₂ price levels discourage emissions. In this paper, it is shown that the level of uncertainty in CO₂ market prices, i.e. the volatility of CO₂ prices itself, has a strong influence not only on generation portfolio risk management but also on CO₂ emissions abatement. A reduction of emissions can be obtained when rational investors in power generation capacity decide that the capacity-expansion cost risk induced jointly by CO₂ volatility and fossil fuel price volatility can be efficiently hedged by adding some nuclear power, as a carbon-free asset, to otherwise fossil-fuel portfolios. This intriguing effect is discussed using a recently introduced economic analysis tool, called stochastic LCOE theory, designed to investigate diversification effects on energy portfolios. In previous papers this theory was used to study diversification effects on portfolios composed of carbon-risky fossil technologies and a carbon-risk-free nuclear technology in a risk-reward trade-off frame. In this paper the stochastic LCOE theory is extended to include uncertainty about nuclear power plant construction times, i.e. considering nuclear risky as well, this being the main source of financial risk in nuclear technology. Two measures of risk, standard deviation and CVaR deviation, are used to derive efficient frontiers for generation portfolios. Frontier portfolios are analyzed in their implications for emissions...
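
    A minimal stochastic-LCOE sketch in the spirit of the analysis (discounted lifetime costs over discounted lifetime output, with fuel and CO₂ prices drawn from lognormal distributions; all numbers are illustrative assumptions, not the paper's calibration):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    years, r = 30, 0.07
    energy_mwh = 8e6        # MWh produced per year
    capex = 1.5e9           # overnight cost, taken as already discounted

    disc = (1.0 + r) ** -np.arange(1, years + 1)
    n_paths = 10_000
    fuel = rng.lognormal(np.log(3e8), 0.2, size=(n_paths, years))  # $/yr
    co2 = rng.lognormal(np.log(1e8), 0.5, size=(n_paths, years))   # $/yr

    # One LCOE ($/MWh) per simulated price path; a larger CO2 sigma
    # widens the spread, i.e. increases portfolio cost risk.
    lcoe = (capex + ((fuel + co2) * disc).sum(axis=1)) / (energy_mwh * disc).sum()
    print(lcoe.mean(), lcoe.std())
    ```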

  3. Open source tools for the information theoretic analysis of neural data

    Robin A. A Ince

    2010-05-01

    The recent and rapid development of open-source software tools for the analysis of neurophysiological datasets consisting of multiple simultaneous recordings of spikes, field potentials and other neural signals holds the promise for a significant advance in the standardization, transparency, quality, reproducibility and variety of techniques used to analyze neurophysiological data and integrate the information obtained at different spatial and temporal scales. In this Review we focus on recent advances in open source toolboxes for the information theoretic analysis of neural responses. We also present examples of their use to investigate the role of spike timing precision, correlations across neurons and field potential fluctuations in the encoding of sensory information. These information toolboxes, available both in Matlab and Python programming environments, hold the potential to enlarge the domain of application of information theory to neuroscience and to lead to new discoveries about how neurons encode and transmit information.

  4. Open source tools for the information theoretic analysis of neural data.

    Ince, Robin A A; Mazzoni, Alberto; Petersen, Rasmus S; Panzeri, Stefano

    2010-01-01

    The recent and rapid development of open source software tools for the analysis of neurophysiological datasets consisting of simultaneous multiple recordings of spikes, field potentials and other neural signals holds the promise for a significant advance in the standardization, transparency, quality, reproducibility and variety of techniques used to analyze neurophysiological data and for the integration of information obtained at different spatial and temporal scales. In this review we focus on recent advances in open source toolboxes for the information theoretic analysis of neural responses. We also present examples of their use to investigate the role of spike timing precision, correlations across neurons, and field potential fluctuations in the encoding of sensory information. These information toolboxes, available both in MATLAB and Python programming environments, hold the potential to enlarge the domain of application of information theory to neuroscience and to lead to new discoveries about how neurons encode and transmit information.
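
    A naive plug-in estimator of the central quantity these toolboxes compute, the mutual information between a discrete stimulus and a discretized response (synthetic data; the toolboxes' real value lies in the bias corrections this simple version lacks):

    ```python
    import numpy as np

    def mutual_information(s: np.ndarray, r: np.ndarray) -> float:
        # Plug-in estimate of I(S;R) in bits from paired integer samples.
        joint = np.zeros((s.max() + 1, r.max() + 1))
        for si, ri in zip(s, r):
            joint[si, ri] += 1
        joint /= joint.sum()
        ps, pr = joint.sum(1, keepdims=True), joint.sum(0, keepdims=True)
        nz = joint > 0
        return float((joint[nz] * np.log2(joint[nz] / (ps @ pr)[nz])).sum())

    rng = np.random.default_rng(0)
    stim = rng.integers(0, 4, 5000)
    resp = np.clip(stim + rng.integers(0, 2, 5000), 0, 4)  # noisy coding
    print(mutual_information(stim, resp))  # bits per response
    ```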

  5. Inference algorithms and learning theory for Bayesian sparse factor analysis

    Rattray, Magnus; Sharp, Kevin; Stegle, Oliver; Winn, John

    2009-01-01

    Bayesian sparse factor analysis has many applications; for example, it has been applied to the problem of inferring a sparse regulatory network from gene expression data. We describe a number of inference algorithms for Bayesian sparse factor analysis using a slab and spike mixture prior. These include well-established Markov chain Monte Carlo (MCMC) and variational Bayes (VB) algorithms as well as a novel hybrid of VB and Expectation Propagation (EP). For the case of a single latent factor we derive a theory for learning performance using the replica method. We compare the MCMC and VB/EP algorithm results with simulated data to the theoretical prediction. The results for MCMC agree closely with the theory as expected. Results for VB/EP are slightly sub-optimal but show that the new algorithm is effective for sparse inference. In large-scale problems MCMC is infeasible due to computational limitations and the VB/EP algorithm then provides a very useful computationally efficient alternative.
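
    A sketch of the slab-and-spike mixture prior named above (hyperparameters are illustrative): each loading is exactly zero with probability 1 - pi (the spike at zero) and Gaussian otherwise (the slab), which is what enforces sparsity in the inferred network.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def sample_loadings(n, pi=0.2, slab_sd=1.0):
        active = rng.random(n) < pi          # indicators of nonzero loadings
        return np.where(active, rng.normal(0.0, slab_sd, n), 0.0)

    w = sample_loadings(1000)
    print("fraction exactly zero:", (w == 0.0).mean())   # ~0.8
    ```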

  6. Inference algorithms and learning theory for Bayesian sparse factor analysis

    Rattray, Magnus; Sharp, Kevin [School of Computer Science, University of Manchester, Manchester M13 9PL (United Kingdom); Stegle, Oliver [Max-Planck-Institute for Biological Cybernetics, Tuebingen (Germany); Winn, John, E-mail: magnus.rattray@manchester.ac.u [Microsoft Research Cambridge, Roger Needham Building, Cambridge, CB3 0FB (United Kingdom)

    2009-12-01

    Bayesian sparse factor analysis has many applications; for example, it has been applied to the problem of inferring a sparse regulatory network from gene expression data. We describe a number of inference algorithms for Bayesian sparse factor analysis using a slab and spike mixture prior. These include well-established Markov chain Monte Carlo (MCMC) and variational Bayes (VB) algorithms as well as a novel hybrid of VB and Expectation Propagation (EP). For the case of a single latent factor we derive a theory for learning performance using the replica method. We compare the MCMC and VB/EP algorithm results with simulated data to the theoretical prediction. The results for MCMC agree closely with the theory as expected. Results for VB/EP are slightly sub-optimal but show that the new algorithm is effective for sparse inference. In large-scale problems MCMC is infeasible due to computational limitations and the VB/EP algorithm then provides a very useful computationally efficient alternative.

  7. Theory, analysis and design of RF interferometric sensors

    Nguyen, Cam

    2012-01-01

    Theory, Analysis and Design of RF Interferometric Sensors presents the theory, analysis and design of RF interferometric sensors. RF interferometric sensors are attractive for various sensing applications that require very fine resolution and accuracy as well as fast speed. The book also presents two millimeter-wave interferometric sensors realized using RF integrated circuits. The developed millimeter-wave homodyne sensor shows sub-millimeter resolution on the order of 0.05 mm without correction for the non-linear phase response of the sensor's quadrature mixer. The designed millimeter-wave double-channel homodyne sensor provides a resolution of only 0.01 mm, or 1/840th of the operating wavelength, and can inherently suppress the non-linearity of the sensor's quadrature mixer. The experimental results of displacement and velocity measurement are presented as a way to demonstrate the sensing ability of RF interferometry and to illustrate its many possible applications in sensing. The book is succinct, yet...
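
    For context, the standard interferometric displacement relation such sensors build on (the frequency and phase-resolution numbers below are illustrative assumptions, not the book's designs):

    ```python
    import numpy as np

    # A round-trip phase shift delta_phi at wavelength lam corresponds to a
    # target displacement d = lam * delta_phi / (4 * pi).
    def displacement_m(delta_phi_rad: float, wavelength_m: float) -> float:
        return wavelength_m * delta_phi_rad / (4.0 * np.pi)

    lam = 3e8 / 35e9                              # ~8.6 mm at 35 GHz
    print(displacement_m(np.deg2rad(1.0), lam))   # ~12 micrometres per degree
    ```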

  8. Applications of surface analysis and surface theory in tribology

    Ferrante, John

    1989-01-01

    Tribology, the study of adhesion, friction and wear of materials, is a complex field which requires a knowledge of solid state physics, surface physics, chemistry, material science, and mechanical engineering. It has been dominated, however, by the more practical need to make equipment work. With the advent of surface analysis and advances in surface and solid-state theory, a new dimension has been added to the analysis of interactions at tribological interfaces. In this paper the applications of tribological studies and their limitations are presented. Examples from research at the NASA Lewis Research Center are given. Emphasis is on fundamental studies involving the effects of monolayer coverage and thick films on friction and wear. A summary of the current status of theoretical calculations of defect energetics is presented. In addition, some new theoretical techniques which enable simplified quantitative calculations of adhesion, fracture, and friction are discussed.

  9. Probability theory versus simulation of petroleum potential in play analysis

    Crovelli, R.A.

    1987-01-01

    An analytic probabilistic methodology for resource appraisal of undiscovered oil and gas resources in play analysis is presented. This play-analysis methodology is a geostochastic system for petroleum resource appraisal in explored as well as frontier areas. An objective was to replace an existing Monte Carlo simulation method in order to increase the efficiency of the appraisal process. Underlying the two methods is a single geologic model which considers both the uncertainty of the presence of the assessed hydrocarbon and its amount if present. The results of the model are resource estimates of crude oil, nonassociated gas, dissolved gas, and gas for a geologic play in terms of probability distributions. The analytic method is based upon conditional probability theory and a closed form solution of all means and standard deviations, along with the probabilities of occurrence. © 1987 J.C. Baltzer A.G., Scientific Publishing Company.

  10. Cultural-Historical Activity Theory and Domain Analysis: Metatheoretical Implications for Information Science

    Wang, Lin

    2013-01-01

    Background: Cultural-historical activity theory is an important theory in modern psychology. In recent years, it has drawn more attention from related disciplines including information science. Argument: This paper argues that activity theory and domain analysis which uses the theory as one of its bases could bring about some important…

  11. Proposed Sources of Coaching Efficacy: A Meta-Analysis.

    Myers, Nicholas D; Park, Sung Eun; Ahn, Soyeon; Lee, Seungmin; Sullivan, Philip J; Feltz, Deborah L

    2017-08-01

    Coaching efficacy refers to the extent to which a coach believes that he or she has the capacity to affect the learning and performance of his or her athletes. The purpose of the current study was to empirically synthesize findings across the extant literature to estimate relationships between the proposed sources of coaching efficacy and each of the dimensions of coaching efficacy. A literature search yielded 20 studies and 278 effect size estimates that met the inclusion criteria. The overall relationship between the proposed sources of coaching efficacy and each dimension of coaching efficacy was positive and ranged from small to medium in size. Coach gender and level coached moderated the overall relationship between the proposed sources of coaching efficacy and each of the dimensions of coaching efficacy. Results from this meta-analysis provided some evidence for both the utility of, and possible revisions to, the conceptual model of coaching efficacy.

  12. Elaborations of grounded theory in information research: arenas/social worlds theory, discourse and situational analysis

    Vasconcelos, A.C.; Sen, B.A.; Rosa, A.; Ellis, D.

    2012-01-01

    This paper explores elaborations of Grounded Theory in relation to Arenas/Social Worlds Theory. The notions of arenas and social worlds were present in early applications of Grounded Theory but have not been as much used or recognised as the general Grounded Theory approach, particularly in the information studies field. The studies discussed here are therefore very unusual in information research. The empirical contexts of these studies are those of (1) the role of discourse in the organisat...

  13. Comparative analysis of traditional and alternative energy sources

    Adriana Csikósová

    2008-11-01

    The thesis Comparative analysis of traditional and alternative energy resources includes, on the basis of theoretical information sources, research in the firm, internal data, trends in company and market development, a description of the problem, and its application. The theoretical part is dedicated to traditional and alternative energy resources: their reserves, trends in their use and development, and their balance in the world, the EU and Slovakia. The analytical part reflects the profile of the company and an evaluation of the heat pump market using the General Electric method. Since the company implements, among other products, heat pumps based on geothermal energy and ambient energy (air), the mission of the comparative analysis is to compare traditional energy resources with the heat pump from the ecological, utility and economic points of view. The results of the comparative analysis are summarised in a SWOT analysis. The thesis also includes a proposed questionnaire for improving effectiveness and analysing customer satisfaction, and the expected forms of support for alternative energy resources (benefits from the government and EU funds).

  14. Sensitivity analysis of source driven subcritical systems by the HGPT methodology

    Gandini, A.

    1997-01-01

    The heuristically based generalized perturbation theory (HGPT) methodology has been extensively used in recent decades for analysis studies in the nuclear reactor field. Its use leads to fundamental reciprocity relationships from which perturbation, or sensitivity, expressions can be derived, to first and higher order, in terms of simple integration operations on quantities calculated at unperturbed system conditions. Its application to subcritical, source-driven systems, now considered with increasing interest in many laboratories for their potential use as nuclear waste burners and/or safer energy producers, is discussed here, with particular emphasis on problems involving an intensive system control variable. (author)

  15. Stratified source-sampling techniques for Monte Carlo eigenvalue analysis

    Mohamed, A.

    1998-01-01

    In 1995, at a conference on criticality safety, a special session was devoted to the Monte Carlo ''Eigenvalue of the World'' problem. Argonne presented a paper at that session in which the anomalies originally observed in that problem were reproduced in a much simplified model-problem configuration, and removed by a version of stratified source-sampling. In this paper, stratified source-sampling techniques are generalized and applied to three different Eigenvalue of the World configurations which take into account real-world statistical noise sources not included in the model problem, but which differ in the amount of neutronic coupling among the constituents of each configuration. It is concluded that, in Monte Carlo eigenvalue analysis of loosely coupled arrays, the use of stratified source-sampling reduces the probability of encountering an anomalous result relative to conventional source-sampling methods. However, this gain in reliability is substantially less than that observed in the model-problem results.
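
    A toy illustration of why stratification reduces estimator variance (generic stratified sampling over a unit interval, not the paper's eigenvalue-specific scheme): drawing a fixed number of samples from each stratum removes the stratum-to-stratum fluctuation that plain sampling leaves in.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    f = lambda u: u ** 2                    # stand-in for a tally response
    n, strata, reps = 400, 20, 2000

    plain, strat = [], []
    for _ in range(reps):
        plain.append(f(rng.random(n)).mean())
        # n // strata points forced into each of the equal-width strata
        u = (np.arange(strata)[:, None] + rng.random((strata, n // strata))) / strata
        strat.append(f(u).mean())

    print("plain variance:     ", np.var(plain))
    print("stratified variance:", np.var(strat))   # markedly smaller
    ```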

  16. Your Personal Analysis Toolkit - An Open Source Solution

    Mitchell, T.

    2009-12-01

    Open source software is commonly known for its web browsers, word processors and programming languages. However, there is a vast array of open source software focused on geographic information management and geospatial application building in general. As geo-professionals, having easy access to tools for our jobs is crucial. Open source software provides the opportunity to add a tool to your tool belt and carry it with you for your entire career - with no license fees, a supportive community and the opportunity to test, adopt and upgrade at your own pace. OSGeo is a US registered non-profit representing more than a dozen mature geospatial data management applications and programming resources. Tools cover areas such as desktop GIS, web-based mapping frameworks, metadata cataloging, spatial database analysis, image processing and more. Learn about some of these tools as they apply to AGU members, as well as how you can join OSGeo and its members in getting the job done with powerful open source tools. If you haven't heard of OSSIM, MapServer, OpenLayers, PostGIS, GRASS GIS or the many other projects under our umbrella - then you need to hear this talk. Invest in yourself - use open source!

  17. Source-Type Identification Analysis Using Regional Seismic Moment Tensors

    Chiang, A.; Dreger, D. S.; Ford, S. R.; Walter, W. R.

    2012-12-01

    Waveform inversion to determine the seismic moment tensor is a standard approach for determining the source mechanism of natural and manmade seismicity, and may be used to identify, or discriminate among, different types of seismic sources. The successful applications of the regional moment tensor method at the Nevada Test Site (NTS) and to the 2006 and 2009 North Korean nuclear tests (Ford et al., 2009a, 2009b, 2010) show that the method is robust and capable of source-type discrimination at regional distances. The well-separated populations of explosions, earthquakes and collapses on a Hudson et al. (1989) source-type diagram enable source-type discrimination; however, the question remains whether the separation of events is universal in other regions, where we have limited station coverage and knowledge of Earth structure. Ford et al. (2012) have shown that combining regional waveform data and P-wave first motions removes the CLVD-isotropic tradeoff and uniquely discriminates the 2009 North Korean test as an explosion. Therefore, including additional constraints from regional and teleseismic P-wave first motions enables source-type discrimination in regions with limited station coverage. We present moment tensor analysis of earthquakes and explosions (M6) from the Lop Nor and Semipalatinsk test sites for station paths crossing Kazakhstan and Western China. We also present analyses of smaller events from industrial sites. In these sparse-coverage situations we combine regional long-period waveforms and high-frequency P-wave polarity from the same stations, as well as from teleseismic arrays, to constrain the source type. Discrimination capability with respect to velocity model and station coverage is examined, and additionally we investigate the velocity-model dependence of vanishing free-surface traction effects on seismic moment tensor inversion of shallow sources and recovery of the explosive scalar moment. Our synthetic data tests indicate that biases in scalar...

  18. A nuclear source term analysis for spacecraft power systems

    McCulloch, W.H.

    1998-01-01

    All US space missions involving on board nuclear material must be approved by the Office of the President. To be approved the mission and the hardware systems must undergo evaluations of the associated nuclear health and safety risk. One part of these evaluations is the characterization of the source terms, i.e., the estimate of the amount, physical form, and location of nuclear material, which might be released into the environment in the event of credible accidents. This paper presents a brief overview of the source term analysis by the Interagency Nuclear Safety Review Panel for the NASA Cassini Space Mission launched in October 1997. Included is a description of the Energy Interaction Model, an innovative approach to the analysis of potential releases from high velocity impacts resulting from launch aborts and reentries

  19. Analysis of the Structure Ratios of the Funding Sources

    Maria Daniela Bondoc

    2014-06-01

    The funding sources of the assets and liabilities in the balance sheet include the equity capital and the debts of the entity. The analysis of the structure ratios of the funding sources allows assessments of the funding policy, highlighting financial autonomy and how resources are provided. Drawing on the literature on economic and financial analysis, this paper presents the rates that reflect, on the one hand, the degree of financial dependence (the rate of financial stability, the rate of global financial autonomy, the rate of on-term financial autonomy) and, on the other hand, the debt structure (the rate of short-term debts, the global indebtedness rate, the on-term indebtedness rate). Based on the financial statements of an entity in Argeş County, I analysed these indicators and drew conclusions and assessments related to the autonomy, indebtedness and financial stability of the studied entity.
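
    A sketch of the named ratios computed from a toy balance sheet (the figures are invented, and the definitions below are the ones commonly used in the financial-analysis literature; the paper's exact formulas may differ in detail):

    ```python
    equity, long_term_debt, short_term_debt = 600.0, 250.0, 150.0
    total = equity + long_term_debt + short_term_debt
    permanent = equity + long_term_debt   # permanent (stable) capital

    rates = {
        "financial stability":        permanent / total,
        "global financial autonomy":  equity / total,
        "on-term financial autonomy": equity / permanent,
        "short-term debts":           short_term_debt / total,
        "global indebtedness":        (long_term_debt + short_term_debt) / total,
        "on-term indebtedness":       long_term_debt / permanent,
    }
    for name, value in rates.items():
        print(f"{name}: {value:.1%}")
    ```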

  20. Economics of Water Quality Protection from Nonpoint Sources: Theory and Practice

    Ribaudo, Marc; Horan, Richard D.; Smith, Mark E.

    1999-01-01

    Water quality is a major environmental issue. Pollution from nonpoint sources is the single largest remaining source of water quality impairments in the United States. Agriculture is a major source of several nonpoint-source pollutants, including nutrients, sediment, pesticides, and salts. Agricultural nonpoint pollution reduction policies can be designed to induce producers to change their production practices in ways that improve the environmental and related economic consequences of produc...

  1. Obsidian sources characterized by neutron-activation analysis.

    Gordus, A A; Wright, G A; Griffin, J B

    1968-07-26

    Concentrations of elements such as manganese, scandium, lanthanum, rubidium, samarium, barium, and zirconium in obsidian samples from different flows show ranges of 1000 percent or more, whereas the variation in element content in obsidian samples from a single flow appears to be less than 40 percent. Neutron-activation analysis of these elements, as well as of sodium and iron, provides a means of identifying the geologic source of an archeological artifact of obsidian.

  2. Cost Analysis Sources and Documents Data Base Reference Manual (Update)

    1989-06-01

    PRICE H: Reference Manual; Training Course Workbook. Use in cost analysis: an important source of cost estimates for electronic and mechanical ... Nature of data: contains many microeconomic time series by month or quarter. Level of detail: very detailed. Normalization processes required ... Reference Manual. Moorestown, N.J.: GE Corporation, September 1986. PRICE Training Course Workbook. Moorestown, N.J.: GE Corporation, February 1986

  3. Natural disaster risk analysis for critical infrastructure systems: An approach based on statistical learning theory

    Guikema, Seth D.

    2009-01-01

    Probabilistic risk analysis has historically been developed for situations in which measured data about the overall reliability of a system are limited and expert knowledge is the best source of information available. There continue to be a number of important problem areas characterized by a lack of hard data. However, in other important problem areas the emergence of information technology has transformed the situation from one characterized by little data to one characterized by data overabundance. Natural disaster risk assessments for events impacting large-scale, critical infrastructure systems such as electric power distribution systems, transportation systems, water supply systems, and natural gas supply systems are important examples of problems characterized by data overabundance. There are often substantial amounts of information collected and archived about the behavior of these systems over time. Yet it can be difficult to effectively utilize these large data sets for risk assessment. Using this information for estimating the probability or consequences of system failure requires a different approach and analysis paradigm than risk analysis for data-poor systems does. Statistical learning theory, a diverse set of methods designed to draw inferences from large, complex data sets, can provide a basis for risk analysis for data-rich systems. This paper provides an overview of statistical learning theory methods and discusses their potential for greater use in risk analysis

  4. Political Discourse Analysis Through Solving Problems of Graph Theory

    Monica Patrut

    2010-03-01

    In this article, we show how, using graph theory, we can carry out a content analysis of political discourse. The assumptions of this analysis are:
    - we have a corpus of speech of each party or candidate;
    - we consider that speech conveys economic, political, socio-cultural values, these taking the form of words or word families;
    - we consider that there are interdependences between the values of a political discourse; they are given by the co-occurrence of two values, as words in the text, within a well defined fragment, or they are determined by the internal logic of political discourse;
    - established links between values in a political speech have associated positive numbers indicating the "power" of those links; these "powers" are defined according to both the number of co-occurrences of values, and the internal logic of the discourse where they occur.
    In this context we intend to highlight the following:
    a) which is the dominant value in a political speech;
    b) which groups of values have ties between them and no connection with the rest;
    c) in which order the political values should be set in order to obtain an equivalent but more synthetic speech compared to the given one;
    d) which links between values form the "core" of the political speech.
    To solve these problems, we shall use the Political Analyst program. After that, we shall present the concepts necessary to the understanding of the introductory graph theory, useful in understanding the analysis of the software and then the operation of the program. This paper extends the previous paper [6].
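
    To make the graph representation concrete, here is a minimal sketch using networkx (the values and weights are invented placeholders; the Political Analyst program itself is not shown). It answers problems a) and b) above:

    ```python
    import networkx as nx

    # Nodes are political values; weighted edges count co-occurrences of
    # two values within a well-defined discourse fragment.
    G = nx.Graph()
    G.add_weighted_edges_from([
        ("economy", "jobs", 7), ("economy", "taxes", 5),
        ("jobs", "taxes", 3), ("culture", "education", 4),
    ])

    # a) dominant value: the node with the largest weighted degree
    dominant = max(G.degree(weight="weight"), key=lambda kv: kv[1])
    # b) groups with no ties to the rest: connected components
    groups = list(nx.connected_components(G))
    print(dominant, groups)
    ```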

  5. Neutronics of the IFMIF neutron source: development and analysis

    Wilson, P.P.H.

    1999-01-01

    The accurate analysis of this system required the development of a code system and methodology capable of modelling the various physical processes. A generic code system for the neutronics analysis of neutron sources has been created by loosely integrating existing components with new developments: the data processing code NJOY, the Monte Carlo neutron transport code MCNP, and the activation code ALARA were supplemented by a damage data processing program, damChar, and integrated with a number of flexible and extensible modules for the Perl scripting language. Specific advances were required to apply this code system to IFMIF. Based on the ENDF-6 data format requirements of this system, new data evaluations have been implemented for neutron transport and activation. Extensive analysis of the Li(d, xn) reaction has led to a new MCNP source function module, McDeLi, based on physical reaction models and capable of accurate and flexible modelling of the IFMIF neutron source term. In-depth analyses of the neutron flux spectra and spatial distribution throughout the high flux test region permitted a basic validation of the tools and data. The understanding of the features of the neutron flux provided a foundation for the analyses of the other neutron responses. (orig./DGE)

  6. Java Source Code Analysis for API Migration to Embedded Systems

    Winter, Victor [Univ. of Nebraska, Omaha, NE (United States); McCoy, James A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Guerrero, Jonathan [Univ. of Nebraska, Omaha, NE (United States); Reinke, Carl Werner [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Perry, James Thomas [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-02-01

    Embedded systems form an integral part of our technological infrastructure and oftentimes play a complex and critical role within larger systems. From the perspective of reliability, security, and safety, strong arguments can be made favoring the use of Java over C in such systems. In part, this argument is based on the assumption that suitable subsets of Java’s APIs and extension libraries are available to embedded software developers. In practice, a number of Java-based embedded processors do not support the full features of the JVM. For such processors, source code migration is a mechanism by which key abstractions offered by APIs and extension libraries can be made available to embedded software developers. The analysis required for Java source code-level library migration is based on the ability to correctly resolve element references to their corresponding element declarations. A key challenge in this setting is how to perform analysis for incomplete source-code bases (e.g., subsets of libraries) from which types and packages have been omitted. This article formalizes an approach that can be used to extend code bases targeted for migration in such a manner that the threats associated with the analysis of incomplete code bases are eliminated.

  7. Decision theory, the context for risk and reliability analysis

    Kaplan, S.

    1985-01-01

    According to this model of the decision process, then, the optimum decision is the option having the largest expected utility. This is the fundamental model of a decision situation. It is necessary to remark that in order for the model to represent a real-life decision situation, it must include all the options present in that situation, including, for example, the option of not deciding--which is itself a decision, although usually not the optimum one. Similarly, it should include the option of delaying the decision while further information is gathered. Both of these options have probabilities, outcomes, impacts, and utilities like any option and should be included explicitly in the decision diagram. The reason for doing a quantitative risk or reliability analysis is always that, somewhere underlying it, there is a decision to be made. The decision analysis therefore always forms the context for the risk or reliability analysis, and this context shapes the form and language of that analysis. A brief review of the well-known decision theory diagram is therefore given in this section.
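
    A minimal sketch of this fundamental model (the options, probabilities and utilities are invented for illustration): each option carries outcome probabilities and utilities, and the optimum is the option with the largest expected utility.

    ```python
    # Note that "do not decide" and "gather info" appear as explicit
    # options, exactly as the text requires.
    options = {
        "act now":       [(0.7, 100.0), (0.3, -50.0)],
        "do not decide": [(1.0, -10.0)],
        "gather info":   [(0.5, 80.0), (0.5, 20.0)],
    }

    def expected_utility(outcomes):
        return sum(p * u for p, u in outcomes)

    best = max(options, key=lambda k: expected_utility(options[k]))
    print(best, {k: expected_utility(v) for k, v in options.items()})
    ```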

  8. Total source charge and charge screening in Yang-Mills theories

    Campbell, W.B.; Norton, R.E.

    1991-01-01

    New gauge-invariant definitions for the total charge on a static Yang-Mills source are suggested which we argue are better suited for determining when true color screening has occurred. In particular, these new definitions imply that the Abelian Coulomb solution for a simple ''electric'' dipole source made up of two opposite point charges has zero total source charge and therefore no color screening. With the definition of total source charge previously suggested by other authors, such a source would have a total source charge of 2q and therefore a screening charge in the field of -2q, where q is the magnitude of the charge of either point charge. Our definitions for more general solutions are not unique because of the path dependence of the parallel transport of charges. Suggestions for removing this ambiguity are offered, but it is not known if a unique, physically meaningful definition of total source charge in fact exists

  9. Schwinger's quantum action principle from Dirac’s formulation through Feynman’s path integrals, the Schwinger-Keldysh method, quantum field theory, to source theory

    Milton, Kimball A

    2015-01-01

    Starting from the earlier notions of stationary action principles, these tutorial notes show how Schwinger’s Quantum Action Principle descended from Dirac’s formulation, which independently led Feynman to his path-integral formulation of quantum mechanics. Part I brings out in more detail the connection between the two formulations, and applications are discussed. Then, the Keldysh-Schwinger time-cycle method of extracting matrix elements is described. Part II will discuss the variational formulation of quantum electrodynamics and the development of source theory.

  10. Childhood obesity in transition zones: an analysis using structuration theory.

    Chan, Christine; Deave, Toity; Greenhalgh, Trisha

    2010-07-01

    Childhood obesity is particularly prevalent in areas that have seen rapid economic growth, urbanisation, cultural transition, and commodification of food systems. Structuration theory may illuminate the interaction between population and individual-level causes of obesity. We conducted in-depth ethnographies of six overweight/obese and four non-overweight preschool children in Hong Kong, each followed for 12-18 months. Analysis was informed by Stones' strong structuration theory. Risk factors played out differently for different children as social structures were enacted at the level of family and preschool. The network of caregiving roles and relationships around the overweight/obese child was typically weak and disjointed, and the primary caregiver appeared confused by mixed messages about what is normal, expected and legitimate behaviour. In particular, external social structures created pressure to shift childcare routines from the logic of nurturing to the logic of consumption. Our findings suggest that threats to what Giddens called ontological security in the primary caregiver may underpin the poor parenting, family stress and weak mealtime routines that mediate the relationship between an obesogenic environment and the development of obesity in a particular child. This preliminary study offers a potentially transferable approach for studying emerging epidemics of diseases of modernity in transition societies.

  11. Time-correlated neutron analysis of a multiplying HEU source

    Miller, E.C.; Kalter, J.M.; Lavelle, C.M.; Watson, S.M.; Kinlaw, M.T.; Chichester, D.L.; Noonan, W.A.

    2015-01-01

    The ability to quickly identify and characterize special nuclear material remains a national security challenge. In counter-proliferation applications, identifying the neutron multiplication of a sample can be a good indication of the level of threat. Currently neutron multiplicity measurements are performed with moderated ³He proportional counters. These systems rely on the detection of thermalized neutrons, a process which obscures both energy and time information from the source. Fast neutron detectors, such as liquid scintillators, have the ability to detect events on nanosecond time scales, providing more information on the temporal structure of the arriving signal, and provide an alternative method for extracting information from the source. To explore this possibility, a series of measurements were performed on the Idaho National Laboratory's MARVEL assembly, a configurable HEU source. The source assembly was measured in a variety of different HEU configurations and with different reflectors, covering a range of neutron multiplications from 2 to 8. The data was collected with liquid scintillator detectors and digitized for offline analysis. A gap based approach for identifying the bursts of detected neutrons associated with the same fission chain was used. Using this approach, we are able to study various statistical properties of individual fission chains. One of these properties is the distribution of neutron arrival times within a given burst. We have observed two interesting empirical trends. First, this distribution exhibits a weak, but definite, dependence on source multiplication. Second, there are distinctive differences in the distribution depending on the presence and type of reflector. Both of these phenomena might prove to be useful when assessing an unknown source. The physical origins of these phenomena can be illuminated with help of MCNPX-PoliMi simulations
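
    A sketch of the gap-based burst identification described above: detection times closer together than a gap threshold are grouped into one burst, a candidate fission chain (timestamps and the threshold are illustrative, not the MARVEL analysis values).

    ```python
    import numpy as np

    def find_bursts(times_ns: np.ndarray, max_gap_ns: float):
        # Split the sorted time series wherever the inter-event gap
        # exceeds the threshold; each segment is one burst.
        times_ns = np.sort(times_ns)
        breaks = np.where(np.diff(times_ns) > max_gap_ns)[0] + 1
        return np.split(times_ns, breaks)

    events = np.array([10., 12., 15., 400., 401., 405., 900.])
    for burst in find_bursts(events, max_gap_ns=50.0):
        print(len(burst), burst - burst[0])  # multiplicity and arrival times
    ```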

  12. Time-correlated neutron analysis of a multiplying HEU source

    Miller, E.C., E-mail: Eric.Miller@jhuapl.edu [Johns Hopkins University Applied Physics Laboratory, Laurel, MD (United States); Kalter, J.M.; Lavelle, C.M. [Johns Hopkins University Applied Physics Laboratory, Laurel, MD (United States); Watson, S.M.; Kinlaw, M.T.; Chichester, D.L. [Idaho National Laboratory, Idaho Falls, ID (United States); Noonan, W.A. [Johns Hopkins University Applied Physics Laboratory, Laurel, MD (United States)

    2015-06-01

    The ability to quickly identify and characterize special nuclear material remains a national security challenge. In counter-proliferation applications, identifying the neutron multiplication of a sample can be a good indication of the level of threat. Currently neutron multiplicity measurements are performed with moderated ³He proportional counters. These systems rely on the detection of thermalized neutrons, a process which obscures both energy and time information from the source. Fast neutron detectors, such as liquid scintillators, have the ability to detect events on nanosecond time scales, providing more information on the temporal structure of the arriving signal, and provide an alternative method for extracting information from the source. To explore this possibility, a series of measurements were performed on the Idaho National Laboratory's MARVEL assembly, a configurable HEU source. The source assembly was measured in a variety of different HEU configurations and with different reflectors, covering a range of neutron multiplications from 2 to 8. The data was collected with liquid scintillator detectors and digitized for offline analysis. A gap based approach for identifying the bursts of detected neutrons associated with the same fission chain was used. Using this approach, we are able to study various statistical properties of individual fission chains. One of these properties is the distribution of neutron arrival times within a given burst. We have observed two interesting empirical trends. First, this distribution exhibits a weak, but definite, dependence on source multiplication. Second, there are distinctive differences in the distribution depending on the presence and type of reflector. Both of these phenomena might prove to be useful when assessing an unknown source. The physical origins of these phenomena can be illuminated with help of MCNPX-PoliMi simulations.

  13. Time-correlated neutron analysis of a multiplying HEU source

    Miller, E. C.; Kalter, J. M.; Lavelle, C. M.; Watson, S. M.; Kinlaw, M. T.; Chichester, D. L.; Noonan, W. A.

    2015-06-01

    The ability to quickly identify and characterize special nuclear material remains a national security challenge. In counter-proliferation applications, identifying the neutron multiplication of a sample can be a good indication of the level of threat. Currently neutron multiplicity measurements are performed with moderated 3He proportional counters. These systems rely on the detection of thermalized neutrons, a process which obscures both energy and time information from the source. Fast neutron detectors, such as liquid scintillators, have the ability to detect events on nanosecond time scales, providing more information on the temporal structure of the arriving signal, and provide an alternative method for extracting information from the source. To explore this possibility, a series of measurements were performed on the Idaho National Laboratory's MARVEL assembly, a configurable HEU source. The source assembly was measured in a variety of different HEU configurations and with different reflectors, covering a range of neutron multiplications from 2 to 8. The data was collected with liquid scintillator detectors and digitized for offline analysis. A gap based approach for identifying the bursts of detected neutrons associated with the same fission chain was used. Using this approach, we are able to study various statistical properties of individual fission chains. One of these properties is the distribution of neutron arrival times within a given burst. We have observed two interesting empirical trends. First, this distribution exhibits a weak, but definite, dependence on source multiplication. Second, there are distinctive differences in the distribution depending on the presence and type of reflector. Both of these phenomena might prove to be useful when assessing an unknown source. The physical origins of these phenomena can be illuminated with help of MCNPX-PoliMi simulations.

  14. A non-perturbative analysis in finite volume gauge theory

    Koller, J.; State Univ. of New York, Stony Brook; Van Baal, P.; State Univ. of New York, Stony Brook

    1988-01-01

    We discuss SU(2) gauge theory on a three-torus using a finite volume expansion. Our discovery of natural coordinates allows us to obtain continuum results in a region where Monte Carlo data are also available. The obtained results agree well with the perturbative and semiclassical analysis for small volumes, and there is fair agreement with the Monte Carlo results in intermediate volumes. The simple picture which emerges for the approximate low energy dynamics is that of three interacting particles enclosed in a sphere, with zero total 'angular momentum'. The validity of an adiabatic approximation is investigated. The fundamentally new understanding gained, is that non-perturbative dynamics can be incorporated by imposing boundary conditions which arise through the nontrivial topology of configuration space. (orig.)

  15. Empathy from the client's perspective: A grounded theory analysis.

    MacFarlane, Peter; Anderson, Timothy; McClintock, Andrew S

    2017-03-01

    Although empathy is one of most robust predictors of client outcome, there is little consensus about how best to conceptualize this construct. The aim of the present research was to investigate clients' perceptions and in-session experiences of empathy. Semi-structured, video-assisted interpersonal process recall interviews were used to collect data from nine clients receiving individual psychotherapy at a university psychology clinic. Grounded theory analysis yielded a model consisting of three clusters: (1) relational context of empathy (i.e., personal relationship and professional relationship), (2) types of empathy (i.e., psychotherapists' cognitive empathy, psychotherapists' emotional empathy, and client attunement to psychotherapist), and (3) utility of empathy (i.e., process-related benefits and client-related benefits). These results suggest that empathy is a multi-dimensional, interactional process that affects-and is affected by-the broader relationship between client and psychotherapist.

  16. Theory of sampling: four critical success factors before analysis.

    Wagner, Claas; Esbensen, Kim H

    2015-01-01

    Food and feed materials characterization, risk assessment, and safety evaluations can only be ensured if QC measures are based on valid analytical data, stemming from representative samples. The Theory of Sampling (TOS) is the only comprehensive theoretical framework that fully defines all requirements to ensure sampling correctness and representativity, and to provide the guiding principles for sampling in practice. TOS also defines the concept of material heterogeneity and its impact on the sampling process, including the effects from all potential sampling errors. TOS's primary task is to eliminate bias-generating errors and to minimize sampling variability. Quantitative measures are provided to characterize material heterogeneity, on which an optimal sampling strategy should be based. Four critical success factors preceding analysis to ensure a representative sampling process are presented here.

  17. GRAPHICAL ANALYSIS OF LAFFER'S THEORY FOR EUROPEAN UNION MEMBER STATES

    LILIANA BUNESCU

    2013-04-01

    Most times, the current situation of one or another country depends on the historical development of its own tax system. A practical question for any government is to determine the optimal taxation rate, the level bringing the state the highest tax revenues. A good place to start is with what is popularly known as the Laffer curve. This paper aims to determine graphically where European economies rank by using the Laffer curve, based on the data series provided by the European Commission and the World Bank. Graphical analysis of Laffer's theory can emphasize only the positioning on one or the other side of the point of maximum tax revenues, a position that can influence fiscal policy decisions. The conclusions at the European Union level are simple: the taxation rate at the fiscal optimum varies from one Member State to another, from 48.9% in Denmark to 28% in Romania, with an average of 37.1% for the EU-27.
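
    As a toy illustration of the graphical approach, the sketch below fits an inverted parabola to invented (rate, revenue) points and reads off its vertex; the numbers are made up for illustration and are not the Commission or World Bank series used in the paper.

      import numpy as np

      # invented (tax rate %, tax revenue) observations -- illustration only
      rates   = np.array([20, 25, 30, 35, 40, 45, 50], dtype=float)
      revenue = np.array([3.1, 3.6, 3.9, 4.0, 3.9, 3.6, 3.1])

      # fit an inverted parabola; its vertex is the revenue-maximizing rate
      a, b, c = np.polyfit(rates, revenue, 2)
      print(f"revenue-maximizing rate ~ {-b / (2 * a):.1f}%")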

  18. Dosimetric analysis of radiation sources for use dermatological lesions

    Tada, Ariane

    2010-01-01

    Skin lesions undergoing therapy with radiation sources may have different patterns of malignancy. Malignant lesions or cancer most commonly found in radiotherapy services are carcinomas. Radiation therapy in skin lesions is performed with low penetration beams and orthovoltage X-rays, electron beams and radioactive sources ( 192 Ir, 198 Au, e 90 Sr) arranged on a surface mold or in metal applicator. This study aims to analyze the therapeutic radiation dose profile produced by radiation sources used in skin lesions radiotherapy procedures . Experimental measurements for the analysis of dosimetric radiation sources were compared with calculations obtained from a computer system based on the Monte Carlo Method. Computational results had a good agreement with the experimental measurements. Experimental measurements and computational results by the MCNP4C code were both physically consistent as expected. These experimental measurements compared with calculations using the MCNP-4C code have been used to validate the calculations obtained by MCNP code and to provide a reliable medical application for each clinical case. (author)

  19. Application of Extreme Value Theory to Crash Data Analysis.

    Xu, Lan; Nusholtz, Guy

    2017-11-01

    A parametric model obtained by fitting a set of data to a function generally uses a procedure such as maximum likelihood or least squares. In general this will generate the best estimate for the distribution of the data overall but will not necessarily generate a reasonable estimate for the tail of the distribution unless the fitted function resembles the underlying distribution function. A fitted distribution function can represent an estimate that is significantly different from the actual tail data, even while the bulk of the data is reasonably represented by the central part of the fitted distribution. Extreme value theory can be used to improve the predictive capabilities of the fitted function in the tail region. In this study the peak-over-threshold approach from extreme value theory was utilized to show that it is possible to obtain a better fit of the tail of a distribution than with procedures that use the entire distribution only. Additional constraints are placed on the selection of the threshold (an estimate of the beginning of the tail region) to minimize the sensitivity to individual data samples in the tail section as well as contamination from the central distribution. Once the threshold is determined, the maximum likelihood method is used to fit the exceedances with the Generalized Pareto Distribution to obtain the tail distribution. The approach was then used in the analysis of airbag inflator pressure data from tank tests, and of crash velocity and mass distributions from field crash data (NASS). In these examples, the extreme (tail) distributions were better estimated with the Generalized Pareto Distribution than with a single overall distribution, along with the probability of occurrence of a given extreme value, or a rare observation such as a high-speed crash. It was concluded that the peak-over-threshold approach from extreme value theory can be a useful tool in
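
    A minimal sketch of the peak-over-threshold workflow under stated assumptions: the data are synthetic stand-ins for crash speeds, the 95th-percentile threshold is a crude illustrative choice rather than the paper's tuned selection, and scipy's maximum-likelihood fit plays the role of the GPD fitting step.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      speeds = rng.lognormal(mean=3.0, sigma=0.5, size=10_000)  # stand-in data

      threshold = np.quantile(speeds, 0.95)            # crude tail-onset estimate
      exceedances = speeds[speeds > threshold] - threshold

      # maximum-likelihood GPD fit to the exceedances (location fixed at 0)
      shape, loc, scale = stats.genpareto.fit(exceedances, floc=0)

      # P(speed > x) = P(exceed threshold) * GPD survival function
      x = threshold + 30.0
      p_tail = (len(exceedances) / len(speeds)) \
               * stats.genpareto.sf(x - threshold, shape, loc, scale)
      print(f"P(speed > {x:.1f}) ~ {p_tail:.2e}")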

  20. Theory of economic cycle: analysis of Austrian school

    Nesterenko, O.

    2008-01-01

    The essence of the Austrian theory of the economic cycle is revealed. Differences between the Austrian school's approach and the theories of economic fluctuations in other streams of economic science are analyzed.

  1. Self-consistent field theory of collisions: Orbital equations with asymptotic sources and self-averaged potentials

    Hahn, Y.K., E-mail: ykhahn22@verizon.net

    2014-12-15

    The self-consistent field theory of collisions is formulated, incorporating the unique dynamics generated by the self-averaged potentials. The bound state Hartree–Fock approach is extended for the first time to scattering states, by properly resolving the principal difficulties of non-integrable continuum orbitals and imposing complex asymptotic conditions. The recently developed asymptotic source theory provides the natural theoretical basis, as the asymptotic conditions are completely transferred to the source terms and the new scattering function is made fully integrable. The scattering solutions can then be directly expressed in terms of bound state HF configurations, establishing the relationship between the bound and scattering state solutions. Alternatively, the integrable spin orbitals are generated by constructing the individual orbital equations that contain asymptotic sources and self-averaged potentials. However, the orbital energies are not determined by the equations, and a special channel energy fixing procedure is developed to secure the solutions. It is also shown that the variational construction of the orbital equations has intrinsic ambiguities that are generally associated with the self-consistent approach. On the other hand, when a small subset of open channels is included in the source term, the solutions are only partially integrable, but the individual open channels can then be treated more simply by properly selecting the orbital energies. The configuration mixing and channel coupling are then necessary to complete the solution. The new theory improves the earlier continuum HF model. - Highlights: • First extension of HF to scattering states, with proper asymptotic conditions. • Orbital equations with asymptotic sources and integrable orbital solutions. • Construction of self-averaged potentials, and orbital energy fixing. • Channel coupling and configuration mixing, involving the new orbitals. • Critical evaluation of the

  2. Estimation of distance error by fuzzy set theory required for strength determination of HDR (192)Ir brachytherapy sources.

    Kumar, Sudhir; Datta, D; Sharma, S D; Chourasiya, G; Babu, D A R; Sharma, D N

    2014-04-01

    Verification of the strength of high dose rate (HDR) (192)Ir brachytherapy sources on receipt from the vendor is an important component of an institutional quality assurance program. Either reference air-kerma rate (RAKR) or air-kerma strength (AKS) is the recommended quantity to specify the strength of gamma-emitting brachytherapy sources. The use of a Farmer-type cylindrical ionization chamber of sensitive volume 0.6 cm(3) is one of the recommended methods for measuring the RAKR of HDR (192)Ir brachytherapy sources. While using the cylindrical chamber method, it is required to determine the positioning error of the ionization chamber with respect to the source, which is called the distance error. An attempt has been made to apply fuzzy set theory to estimate the subjective uncertainty associated with the distance error. A simplified approach of applying fuzzy set theory to the quantification of the uncertainty associated with the distance error has been proposed. In order to express the uncertainty in the framework of fuzzy sets, the uncertainty index was estimated and was found to be within 2.5%, which further indicates that the possible error in measuring such distance may be of this order. It is observed that the relative distances li estimated by the analytical method and the fuzzy set theoretic approach are consistent with each other. The crisp values of li estimated using the analytical method lie within the bounds computed using fuzzy set theory. This indicates that li values estimated using analytical methods are within 2.5% uncertainty. This value of uncertainty in distance measurement should be incorporated in the uncertainty budget, while estimating the expanded uncertainty in HDR (192)Ir source strength measurement.

  3. PROTEINCHALLENGE: Crowd sourcing in proteomics analysis and software development

    Martin, Sarah F.; Falkenberg, Heiner; Dyrlund, Thomas Franck

    2013-01-01

    In large-scale proteomics studies there is a temptation, after months of experimental work, to plug resulting data into a convenient—if poorly implemented—set of tools, which may neither do the data justice nor help answer the scientific question. In this paper we have captured key concerns; however, there is clearly a real need for robust tools, standard operating procedures and general acceptance of best practises. Thus we submit to the proteomics community a call for a community-wide open set of proteomics analysis challenges—PROTEINCHALLENGE—that directly target and compare data analysis workflows, including arguments for community-wide open source software development and “big data” compatible solutions for the future. For the meantime, we have laid out ten top tips for data processing. With these at hand, a first large-scale proteomics analysis hopefully becomes less daunting to navigate.

  4. Bivariate Rainfall and Runoff Analysis Using Shannon Entropy Theory

    Rahimi, A.; Zhang, L.

    2012-12-01

    Rainfall-runoff analysis is the key component for many hydrological and hydraulic designs in which the dependence of rainfall and runoff needs to be studied. It is known that convenient bivariate distributions are often unable to model the rainfall-runoff variables because they either have constraints on the range of the dependence or a fixed form for the marginal distributions. Thus, this paper presents an approach to derive the entropy-based joint rainfall-runoff distribution using Shannon entropy theory. The distribution derived can model the full range of dependence and allow different specified marginals. The modeling and estimation proceed as: (i) univariate analysis of marginal distributions, which includes two steps, (a) using the nonparametric statistics approach to detect modes and the underlying probability density, and (b) fitting the appropriate parametric probability density functions; (ii) defining the constraints based on the univariate analysis and the dependence structure; (iii) deriving and validating the entropy-based joint distribution. To validate the method, rainfall-runoff data were collected from the small agricultural experimental watersheds located in the semi-arid region near Riesel (Waco), Texas, maintained by the USDA. The results of univariate analysis show that the rainfall variables follow the gamma distribution, whereas the runoff variables have a mixed structure and follow the mixed-gamma distribution. With this information, the entropy-based joint distribution is derived using the first moments, the first moments of logarithm-transformed rainfall and runoff, and the covariance between rainfall and runoff. The results of the entropy-based joint distribution indicate: (1) the joint distribution derived successfully preserves the dependence between rainfall and runoff, and (2) the K-S goodness-of-fit statistical tests confirm that the marginal distributions re-derived reveal the underlying univariate probability densities, which further
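
    A small sketch of step (i)(b), fitting and checking a gamma marginal with scipy; the synthetic rainfall values and all parameter choices are invented for illustration, not the Riesel watershed data.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      rainfall = rng.gamma(shape=2.0, scale=15.0, size=500)   # synthetic stand-in

      # fit the parametric gamma marginal (location pinned at zero)
      a, loc, scale = stats.gamma.fit(rainfall, floc=0)

      # K-S goodness-of-fit check on the fitted marginal
      ks_stat, p_value = stats.kstest(rainfall, "gamma", args=(a, loc, scale))
      print(f"shape={a:.2f}, scale={scale:.2f}, KS p-value={p_value:.3f}")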

  5. Item response theory analysis applied to the Spanish version of the Personal Outcomes Scale.

    Guàrdia-Olmos, J; Carbó-Carreté, M; Peró-Cebollero, M; Giné, C

    2017-11-01

    The study of measurements of quality of life (QoL) is one of the great challenges of modern psychology and psychometric approaches. This issue has greater importance when examining QoL in populations that were historically treated on the basis of their deficiency; recently, the focus has shifted to what each person values and desires in their life, as in cases of people with intellectual disability (ID). Many studies of QoL scales applied in this area have attempted to improve the validity and reliability of their components by incorporating various sources of information to achieve consistency in the data obtained. The adaptation of the Personal Outcomes Scale (POS) in Spanish has shown excellent psychometric attributes, and its administration has three sources of information: self-assessment, practitioner and family. The study of possible congruence or incongruence of the observed distributions of each item between sources is therefore essential to ensure a correct interpretation of the measure. The aim of this paper was to analyse the observed distribution of items and dimensions from the three Spanish POS information sources cited earlier, using item response theory. We studied a sample of 529 people with ID and their respective practitioners and family members, and in each case, we analysed items and factors using Samejima's model for polytomous ordinal scales. The results indicated an important number of items with differential effects regarding sources, and in some cases, they indicated significant differences in the distribution of items, factors and sources of information. As a result of this analysis, we must affirm that the administration of the POS, considering three sources of information, was adequate overall, but a correct interpretation of the results requires much more information to be considered, as well as some specific items in specific dimensions. The overall ratings, if these caveats are not considered, could result in bias. © 2017
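
    For illustration, a minimal implementation of the category probabilities in Samejima's graded response model, the model named above; the item parameters below (discrimination a and ordered thresholds b) are hypothetical, not estimates from the POS data.

      import numpy as np

      def graded_response_probs(theta, a, b):
          """Samejima graded-response model: probability of each ordinal
          category given ability theta, discrimination a, and ordered
          thresholds b (len(b) + 1 categories)."""
          b = np.asarray(b, dtype=float)
          # cumulative P(X >= k) for k = 1..K-1, padded with 1 and 0
          p_star = 1.0 / (1.0 + np.exp(-a * (theta - b)))
          cum = np.concatenate(([1.0], p_star, [0.0]))
          return cum[:-1] - cum[1:]        # per-category probabilities

      # e.g. a 4-category POS-style item with illustrative parameters
      print(graded_response_probs(theta=0.3, a=1.4, b=[-1.0, 0.2, 1.1]))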

  6. PUTTING COMMUNICATION FRONT AND CENTER IN INSTITUTIONAL THEORY AND ANALYSIS

    Cornelissen, J.P.; Durand, R.; Fiss, P.C.; Lammers, J.C.; Vaara, E.

    2015-01-01

    We conceptualize the roots of cognitive, linguistic, and communicative theories of institutions and outline the promise and potential of a stronger communication focus for institutional theory. In particular, we outline a theoretical approach that puts communication at the heart of theories of institutions.

  7. Energy sources and nuclear energy. Comparative analysis and ethical reflections

    Hoenraet, C.

    1999-01-01

    Under the authority of the episcopacy of Brugge in Belgium an independent working group Ethics and Nuclear Energy was set up. The purpose of the working group was to collect all the necessary information on existing energy sources and to carry out a comparative analysis of their impact on mankind and the environment. Also attention was paid to economical and social aspects. The results of the study are subjected to an ethical reflection. The book is aimed at politicians, teachers, journalists and every interested layman who wants to gain insight into the consequences of the use of nuclear energy and other energy sources. Based on the information in this book one should be able to objectively define one's position in future debates on this subject

  8. Analysis of the TMI-2 source range detector response

    Carew, J.F.; Diamond, D.J.; Eridon, J.M.

    1980-01-01

    In the first few hours following the TMI-2 accident, large variations (factors of 10-100) in the source range (SR) detector response were observed. The purpose of this analysis was to quantify the various effects which could contribute to these large variations. The effects evaluated included the transmission of neutrons and photons from the core to the detector, and the reduction in the multiplication of the Am-Be startup sources, with the subsequent reduction in SR detector response, due to core voiding. A one-dimensional ANISN slab model of the TMI-2 core, core externals, pressure vessel and containment has been constructed for calculation of the SR detector response and is presented

  9. Obsidian sourcing by PIXE analysis at AURA2

    Neve, S.R.; Barker, P.H.; Holroyd, S.; Sheppard, P.J.

    1994-01-01

    The technique of Proton Induced X-ray Emission is a suitable method for the elemental analysis of obsidian samples and artefacts. By comparing the elemental composition of obsidian artefacts with those of known sources of obsidian and identifying similarities, the likely origin of the sample can be discovered and information about resource procurement gained. A PIXE facility has now been established at the Auckland University Research Accelerator Laboratory, AURA2. It offers a rapid, multi-element, non-destructive method of characterisation of obsidian samples ranging from small chips to large pieces. In an extensive survey of Mayor Island obsidian, a discrimination has been made between the different locations of obsidian deposits on the island. In addition, using the database developed at AURA2, artefacts from the site of Opita, Hauraki Plains, have been sourced. (Author). 18 refs., 8 figs., 7 tabs., 1 appendix

  10. Analysis of Earthquake Source Spectra in Salton Trough

    Chen, X.; Shearer, P. M.

    2009-12-01

    Previous studies of the source spectra of small earthquakes in southern California show that average Brune-type stress drops vary among different regions, with particularly low stress drops observed in the Salton Trough (Shearer et al., 2006). The Salton Trough marks the southern end of the San Andreas Fault and is prone to earthquake swarms, some of which are driven by aseismic creep events (Lohman and McGuire, 2007). In order to learn the stress state and understand the physical mechanisms of swarms and slow slip events, we analyze the source spectra of earthquakes in this region. We obtain Southern California Seismic Network (SCSN) waveforms for earthquakes from 1977 to 2009 archived at the Southern California Earthquake Center (SCEC) data center, which includes over 17,000 events. After resampling the data to a uniform 100 Hz sample rate, we compute spectra for both signal and noise windows for each seismogram, and select traces with a P-wave signal-to-noise ratio greater than 5 between 5 Hz and 15 Hz. Using selected displacement spectra, we isolate the source spectra from station terms and path effects using an empirical Green’s function approach. From the corrected source spectra, we compute corner frequencies and estimate moments and stress drops. Finally we analyze spatial and temporal variations in stress drop in the Salton Trough and compare them with studies of swarms and creep events to assess the evolution of faulting and stress in the region. References: Lohman, R. B., and J. J. McGuire (2007), Earthquake swarms driven by aseismic creep in the Salton Trough, California, J. Geophys. Res., 112, B04405, doi:10.1029/2006JB004596 Shearer, P. M., G. A. Prieto, and E. Hauksson (2006), Comprehensive analysis of earthquake source spectra in southern California, J. Geophys. Res., 111, B06303, doi:10.1029/2005JB003979.
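
    As a worked illustration of the final step, a sketch of a Brune-model stress-drop estimate from a corner frequency; the shear-wave speed and the magnitude-3 example values are generic assumptions, not results from the Salton Trough study.

      import numpy as np

      def brune_stress_drop(m0, fc, beta=3500.0):
          """Brune-model stress drop (Pa) from seismic moment m0 (N·m) and
          corner frequency fc (Hz); beta is the shear-wave speed (m/s)."""
          radius = 0.37 * beta / fc           # Brune source radius (m)
          return (7.0 / 16.0) * m0 / radius**3

      # magnitude ~3 event: M0 ~ 4e13 N·m, fc ~ 5 Hz -> roughly 1 MPa
      print(f"{brune_stress_drop(4e13, 5.0) / 1e6:.2f} MPa")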

  11. Ambient Seismic Source Inversion in a Heterogeneous Earth: Theory and Application to the Earth's Hum

    Ermert, Laura; Sager, Korbinian; Afanasiev, Michael; Boehm, Christian; Fichtner, Andreas

    2017-11-01

    The sources of ambient seismic noise are extensively studied both to better understand their influence on ambient noise tomography and related techniques, and to infer constraints on their excitation mechanisms. Here we develop a gradient-based inversion method to infer the space-dependent and time-varying source power spectral density of the Earth's hum from cross correlations of continuous seismic data. The precomputation of wavefields using spectral elements allows us to account for both finite-frequency sensitivity and for three-dimensional Earth structure. Although similar methods have been proposed previously, they have not yet been applied to data to the best of our knowledge. We apply this method to image the seasonally varying sources of Earth's hum during North and South Hemisphere winter. The resulting models suggest that hum sources are localized, persistent features that occur at Pacific coasts or shelves and in the North Atlantic during North Hemisphere winter, as well as South Pacific coasts and several distinct locations in the Southern Ocean in South Hemisphere winter. The contribution of pelagic sources from the central North Pacific cannot be constrained. Besides improving the accuracy of noise source locations through the incorporation of finite-frequency effects and 3-D Earth structure, this method may be used in future cross-correlation waveform inversion studies to provide initial source models and source model updates.

  12. Archival Theory and the Shaping of Educational History: Utilizing New Sources and Reinterpreting Traditional Ones

    Glotzer, Richard

    2013-01-01

    Information technology has spawned new evidentiary sources, better retrieval systems for existing ones, and new tools for interpreting traditional source materials. These advances have contributed to a broadening of public participation in civil society (Blouin and Rosenberg 2006). In these culturally unsettled and economically fragile times…

  13. Lattice field theories: non-perturbative methods of analysis

    Weinstein, M.

    1978-01-01

    A lecture is given on the possible extraction of interesting physical information from quantum field theories by studying their semiclassical versions. From the beginning the problem of solving for the spectrum states of any given continuum quantum field theory is considered as a giant Schroedinger problem, and then some nonperturbative methods for diagonalizing the Hamiltonian of the theory are explained without recourse to semiclassical approximations. The notion of a lattice appears as an artifice to handle the problems associated with the familiar infrared and ultraviolet divergences of continuum quantum field theory and in fact for all but gauge theories. 18 references

  14. Potential Functional Embedding Theory at the Correlated Wave Function Level. 2. Error Sources and Performance Tests.

    Cheng, Jin; Yu, Kuang; Libisch, Florian; Dieterich, Johannes M; Carter, Emily A

    2017-03-14

    Quantum mechanical embedding theories partition a complex system into multiple spatial regions that can use different electronic structure methods within each, to optimize trade-offs between accuracy and cost. The present work incorporates accurate but expensive correlated wave function (CW) methods for a subsystem containing the phenomenon or feature of greatest interest, while self-consistently capturing quantum effects of the surroundings using fast but less accurate density functional theory (DFT) approximations. We recently proposed two embedding methods [for a review, see: Acc. Chem. Res. 2014, 47, 2768]: density functional embedding theory (DFET) and potential functional embedding theory (PFET). DFET provides a fast but non-self-consistent density-based embedding scheme, whereas PFET offers a more rigorous theoretical framework to perform fully self-consistent, variational CW/DFT calculations [as defined in part 1, CW/DFT means subsystem 1(2) is treated with CW(DFT) methods]. When originally presented, PFET was only tested at the DFT/DFT level of theory as a proof of principle within a planewave (PW) basis. Part 1 of this two-part series demonstrated that PFET can be made to work well with mixed Gaussian type orbital (GTO)/PW bases, as long as optimized GTO bases and consistent electron-ion potentials are employed throughout. Here in part 2 we conduct the first PFET calculations at the CW/DFT level and compare them to DFET and full CW benchmarks. We test the performance of PFET at the CW/DFT level for a variety of types of interactions (hydrogen bonding, metallic, and ionic). By introducing an intermediate CW/DFT embedding scheme denoted DFET/PFET, we show how PFET remedies different types of errors in DFET, serving as a more robust type of embedding theory.

  15. Gambler Risk Perception: A Mental Model and Grounded Theory Analysis.

    Spurrier, Michael; Blaszczynski, Alexander; Rhodes, Paul

    2015-09-01

    Few studies have investigated how gamblers perceive risk or the role of risk perception in disordered gambling. The purpose of the current study therefore was to obtain data on lay gamblers' beliefs on these variables and their effects on decision-making, behaviour, and disordered gambling aetiology. Fifteen regular lay gamblers (non-problem/low risk, moderate risk and problem gamblers) completed a semi-structured interview following mental models and grounded theory methodologies. Gambler interview data was compared to an expert 'map' of risk-perception, to identify comparative gaps or differences associated with harmful or safe gambling. Systematic overlapping processes of data gathering and analysis were used to iteratively extend, saturate, test for exception, and verify concepts and themes emerging from the data. The preliminary findings suggested that gambler accounts supported the presence of expert conceptual constructs, and to some degree the role of risk perception in protecting against or increasing vulnerability to harm and disordered gambling. Gambler accounts of causality, meaning, motivation, and strategy were highly idiosyncratic, and often contained content inconsistent with measures of disordered gambling. Disordered gambling appears heavily influenced by relative underestimation of risk and overvaluation of gambling, based on explicit and implicit analysis, and deliberate, innate, contextual, and learned processing evaluations and biases.

  16. Creep analysis of fuel plates for the Advanced Neutron Source

    Swinson, W.F.; Yahr, G.T.

    1994-11-01

    The reactor for the planned Advanced Neutron Source will use closely spaced arrays of fuel plates. The plates are thin and will have a core containing enriched uranium silicide fuel clad in aluminum. The heat load caused by the nuclear reactions within the fuel plates will be removed by flowing high-velocity heavy water through narrow channels between the plates. However, the plates will still be at elevated temperatures while in service, and the potential for excessive plate deformation because of creep must be considered. An analysis including creep, for the deformation and stresses due to temperature over a given time span, has been performed and is reported herein.

  17. A Study of the Unified Theory of Acceptance and Use of Technology in the Use of Open Source Database Management System Software

    Michael Sonny

    2016-06-01

    Computer software is currently developing at a remarkable pace, and this development is not limited to software under particular licenses; open source software is developing as well. This development is of course very welcome for computer users, particularly in education and among students, because users have several options for the applications they use. Open source software also offers products that are generally free, come with their source code, and grant the freedom to modify and extend them. Research on open source applications covers a wide range, including applications for programming (PHP, Gambas), Database Management Systems (MySQL, SQLite), and browsing (Mozilla, Firefox, Opera). This study examines the acceptance of DBMS (Database Management System) applications such as MySQL and SQLite using a model developed by Venkatesh (2003), namely UTAUT (Unified Theory of Acceptance and Use of Technology). Certain factors, so-called moderating factors, also influence the learning of these open source applications and can affect effectiveness and efficiency. The results should thereby support smoother learning of open source-based applications. Keywords: open source, Database Management System (DBMS), moderating

  18. Who uses nursing theory? A univariate descriptive analysis of five years' research articles.

    Bond, A Elaine; Eshah, Nidal Farid; Bani-Khaled, Mohammed; Hamad, Atef Omar; Habashneh, Samira; Kataua', Hussein; al-Jarrah, Imad; Abu Kamal, Andaleeb; Hamdan, Falastine Rafic; Maabreh, Roqia

    2011-06-01

    Since the early 1950s, nursing leaders have worked diligently to build the Scientific Discipline of Nursing, integrating Theory, Research and Practice. Recently, the role of theory has again come into question, with some scientists claiming nurses are not using theory to guide their research and thereby improve practice. The purposes of this descriptive study were to determine: (i) Were nursing scientists' research articles in leading nursing journals based on theory? (ii) If so, were the theories nursing theories or borrowed theories? (iii) Were the theories integrated into the studies, or were they used as organizing frameworks? Research articles from seven top ISI journals were analysed, excluding regularly featured columns, meta-analyses, secondary analyses, case studies and literature reviews. The authors used King's dynamic Interacting system and Goal Attainment Theory as an organizing framework. They developed consensus on how to identify the integration of theory, searching the Title, Abstract, Aims, Methods, Discussion and Conclusion sections of each research article, whether quantitative or qualitative. Of 2857 articles published in the seven journals from 2002 through 2006, 2184 (76%) were research articles. Of the 837 (38%) authors who used theories, 460 (55%) used nursing theories and 377 (45%) used other theories; 776 (93%) of those who used theory integrated it into their studies, including qualitative studies, while 51 (7%) reported they used theory as an organizing framework for their studies. Closer analysis revealed theory principles were implicitly implied, even in research reports that did not explicitly report theory usage. Increasing numbers of nursing research articles (though not percentagewise) continue to be guided by theory, and not always by nursing theory. Newer nursing research methods may not explicitly state the use of nursing theory, though it is implicitly implied. © 2010 The Authors. Scandinavian Journal of Caring

  19. Contrast and Critique of Two Approaches to Discourse Analysis: Conversation Analysis and Speech Act Theory

    Nguyen Van Han

    2014-08-01

    Discourse analysis, as Murcia and Olshtain (2000) assume, is a vast study of language in use that extends beyond sentence level, and it involves a more cognitive and social perspective on language use and communication exchanges. Covering a wide range of phenomena relating language to society, culture and thought, discourse analysis contains various approaches: speech act theory, pragmatics, conversation analysis, variation analysis, and critical discourse analysis. Each approach works in its own domain of discourse. In one dimension, it shares the same assumptions or general problems in discourse analysis with the other approaches: for instance, the explanation of how we organize language into units beyond sentence boundaries, or how language is used to convey information about the world, ourselves and human relationships (Schiffrin 1994: viii). In other dimensions, each approach holds its distinctive characteristics contributing to the vastness of discourse analysis. This paper will mainly discuss two approaches to discourse analysis, conversation analysis and speech act theory, and will attempt to point out some similarities as well as contrasting features between the two approaches, followed by a short reflection on the strengths and weaknesses of each approach. The organizational and discourse features in the exchanges among three teachers at the College of Finance and Customs in Vietnam will be analysed in terms of conversation analysis and speech act theory.

  20. Theory of Maxwell's fish eye with mutually interacting sources and drains

    Leonhardt, Ulf; Sahebdivan, Sahar

    2015-11-01

    Maxwell's fish eye is predicted to image with a resolution not limited by the wavelength of light. However, interactions between sources and drains may ruin the subwavelength imaging capabilities of this and similar absolute optical instruments. Nevertheless, as we show in this paper, at resonance frequencies of the device, an array of drains may resolve a single source, or alternatively, a single drain may scan an array of sources, no matter how narrowly spaced they are. It seems that near-field information can be obtained from far-field distances.

  1. Sources

    Duffy, L.P.

    1991-01-01

    This paper discusses the sources of radiation in the narrow perspective of radioactivity, and the even narrower perspective of those sources that concern environmental management and restoration activities at DOE facilities, as well as a few related sources: sources of irritation, sources of inflammatory jingoism, and sources of information. First, the sources of irritation fall into three categories: no reliable scientific ombudsman to speak without bias and prejudice for the public good; technical jargon with unclear definitions within the radioactive nomenclature; and a scientific community that keeps a low profile with regard to public information. The next area of personal concern is the sources of inflammation. These include: plutonium being described as the most dangerous substance known to man; the amount of plutonium required to make a bomb; talk of transuranic waste containing plutonium and its health effects; TMI-2 and Chernobyl being described as Siamese twins; inadequate information on low-level disposal sites and current regulatory requirements under 10 CFR 61; and enhanced engineered waste disposal not being presented to the public accurately. Finally, there are numerous sources of disinformation regarding low-level and high-level radiation, the elusive nature of the scientific community, the resources of Federal and State health agencies to address comparative risk, and regulatory agencies speaking out without the support of the scientific community

  2. Search Analytics: Automated Learning, Analysis, and Search with Open Source

    Hundman, K.; Mattmann, C. A.; Hyon, J.; Ramirez, P.

    2016-12-01

    The sheer volume of unstructured scientific data makes comprehensive human analysis impossible, resulting in missed opportunities to identify relationships, trends, gaps, and outliers. As the open source community continues to grow, tools like Apache Tika, Apache Solr, Stanford's DeepDive, and Data-Driven Documents (D3) can help address this challenge. With a focus on journal publications and conference abstracts often in the form of PDF and Microsoft Office documents, we've initiated an exploratory NASA Advanced Concepts project aiming to use the aforementioned open source text analytics tools to build a data-driven justification for the HyspIRI Decadal Survey mission. We call this capability Search Analytics, and it fuses and augments these open source tools to enable the automatic discovery and extraction of salient information. In the case of HyspIRI, a hyperspectral infrared imager mission, key findings resulted from the extractions and visualizations of relationships from thousands of unstructured scientific documents. The relationships include links between satellites (e.g. Landsat 8), domain-specific measurements (e.g. spectral coverage) and subjects (e.g. invasive species). Using the above open source tools, Search Analytics mined and characterized a corpus of information that would be infeasible for a human to process. More broadly, Search Analytics offers insights into various scientific and commercial applications enabled through missions and instrumentation with specific technical capabilities. For example, the following phrases were extracted in close proximity within a publication: "In this study, hyperspectral images…with high spatial resolution (1 m) were analyzed to detect cutleaf teasel in two areas. …Classification of cutleaf teasel reached a users accuracy of 82 to 84%." Without reading a single paper we can use Search Analytics to automatically identify that a 1 m spatial resolution provides a cutleaf teasel detection users accuracy of 82
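
    A toy sketch of the kind of proximity-based relationship extraction described above, written in plain Python rather than the Tika/Solr/DeepDive stack the project actually uses; the term lists and word window are arbitrary illustrative choices.

      import re

      def nearby_pairs(text, terms_a, terms_b, window=30):
          """Find (a, b) term pairs whose occurrences lie within `window`
          words of each other; a crude stand-in for the relationship
          extraction described above."""
          words = re.findall(r"[a-z0-9%]+", text.lower())
          positions = lambda terms: {t: [i for i, w in enumerate(words) if w == t]
                                     for t in terms}
          a_pos, b_pos = positions(terms_a), positions(terms_b)
          return {(a, b) for a in a_pos for b in b_pos
                  if any(abs(i - j) <= window
                         for i in a_pos[a] for j in b_pos[b])}

      doc = ("In this study, hyperspectral images with high spatial "
             "resolution (1 m) were analyzed to detect cutleaf teasel.")
      print(nearby_pairs(doc, ["hyperspectral"], ["resolution"]))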

  3. Review on solving the inverse problem in EEG source analysis

    Fabri Simon G

    2008-11-01

    In this primer, we give a review of the inverse problem for EEG source localization. It is intended to give researchers new to the field insight into the state-of-the-art techniques used to find approximate solutions of the brain sources giving rise to a scalp potential recording. Furthermore, a review of the performance results of the different techniques is provided to compare these different inverse solutions. The authors also include the results of a Monte-Carlo analysis which they performed to compare four non-parametric algorithms and hence contribute to what is presently recorded in the literature. An extensive list of references to the work of other researchers is also provided. This paper starts off with a mathematical description of the inverse problem and proceeds to discuss the two main categories of methods which were developed to solve the EEG inverse problem, namely the non-parametric and parametric methods. The main difference between the two is whether a fixed number of dipoles is assumed a priori or not. Various techniques falling within these categories are described, including minimum norm estimates and their generalizations, LORETA, sLORETA, VARETA, S-MAP, ST-MAP, Backus-Gilbert, LAURA, Shrinking LORETA FOCUSS (SLF), SSLOFO and ALF for non-parametric methods, and beamforming techniques, BESA, subspace techniques such as MUSIC and methods derived from it, FINES, simulated annealing and computational intelligence algorithms for parametric methods. From a review of the performance of these techniques as documented in the literature, one could conclude that in most cases the LORETA solution gives satisfactory results. In situations involving clusters of dipoles, higher resolution algorithms such as MUSIC or FINES are however preferred. Imposing reliable biophysical and psychological constraints, as done by LAURA, has given superior results. The Monte-Carlo analysis performed, comparing WMN, LORETA, sLORETA and SLF
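
    As a concrete anchor for the minimum norm family discussed above, a Tikhonov-regularized minimum-norm estimate in a few lines of numpy; the leadfield, data, and regularization value are synthetic placeholders, not any of the reviewed packages.

      import numpy as np

      def minimum_norm_estimate(leadfield, scalp_data, alpha=1e-2):
          """Regularized minimum norm estimate:
          sources = L^T (L L^T + alpha * I)^{-1} y."""
          L = np.asarray(leadfield)                  # (n_sensors, n_sources)
          gram = L @ L.T + alpha * np.eye(L.shape[0])
          return L.T @ np.linalg.solve(gram, scalp_data)

      rng = np.random.default_rng(2)
      L = rng.standard_normal((32, 500))             # toy leadfield: 32 electrodes
      y = L @ np.where(rng.random(500) < 0.01, 1.0, 0.0)  # a few active sources
      print(minimum_norm_estimate(L, y).shape)       # -> (500,)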

  4. Fetal source extraction from magnetocardiographic recordings by dependent component analysis

    Araujo, Draulio B de [Department of Physics and Mathematics, FFCLRP, University of Sao Paulo, Ribeirao Preto, SP (Brazil); Barros, Allan Kardec [Department of Electrical Engineering, Federal University of Maranhao, Sao Luis, Maranhao (Brazil); Estombelo-Montesco, Carlos [Department of Physics and Mathematics, FFCLRP, University of Sao Paulo, Ribeirao Preto, SP (Brazil); Zhao, Hui [Department of Medical Physics, University of Wisconsin, Madison, WI (United States); Filho, A C Roque da Silva [Department of Physics and Mathematics, FFCLRP, University of Sao Paulo, Ribeirao Preto, SP (Brazil); Baffa, Oswaldo [Department of Physics and Mathematics, FFCLRP, University of Sao Paulo, Ribeirao Preto, SP (Brazil); Wakai, Ronald [Department of Medical Physics, University of Wisconsin, Madison, WI (United States); Ohnishi, Noboru [Department of Information Engineering, Nagoya University (Japan)

    2005-10-07

    Fetal magnetocardiography (fMCG) has been extensively reported in the literature as a non-invasive, prenatal technique that can be used to monitor various functions of the fetal heart. However, fMCG signals often have low signal-to-noise ratio (SNR) and are contaminated by strong interference from the mother's magnetocardiogram signal. A promising, efficient tool for extracting signals, even under low SNR conditions, is blind source separation (BSS), or independent component analysis (ICA). Herein we propose an algorithm based on a variation of ICA, where the signal of interest is extracted using a time delay obtained from an autocorrelation analysis. We model the system using autoregression, and identify the signal component of interest from the poles of the autocorrelation function. We show that the method is effective in removing the maternal signal, and is computationally efficient. We also compare our results to more established ICA methods, such as FastICA.
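
    A simplified sketch of blind source separation on a synthetic maternal/fetal mixture using scikit-learn's FastICA, standing in for the paper's autocorrelation-guided ICA variant; the waveforms, heart rates, and mixing matrix are all invented for illustration.

      import numpy as np
      from sklearn.decomposition import FastICA

      rng = np.random.default_rng(3)
      n = 5000
      t = np.arange(n) / 1000.0
      maternal = np.sign(np.sin(2 * np.pi * 1.2 * t))   # ~72 bpm stand-in
      fetal    = np.sign(np.sin(2 * np.pi * 2.3 * t))   # ~138 bpm stand-in

      S = np.c_[maternal, fetal]                           # true sources
      A = np.array([[1.0, 0.4], [0.5, 1.0], [1.0, -1.0]])  # unknown mixing
      X = S @ A.T + 0.2 * rng.standard_normal((n, 3))      # 3-channel recording

      ica = FastICA(n_components=2, random_state=0)
      estimated = ica.fit_transform(X)       # columns ~ separated sources
      print(estimated.shape)                 # -> (5000, 2)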

  5. Thermal hydraulic analysis of the encapsulated nuclear heat source

    Sienicki, J.J.; Wade, D.C. [Argonne National Lab., IL (United States)

    2001-07-01

    An analysis has been carried out of the steady state thermal hydraulic performance of the Encapsulated Nuclear Heat Source (ENHS) 125 MWt, heavy liquid metal coolant (HLMC) reactor concept at nominal operating power and shutdown decay heat levels. The analysis includes the development and application of correlation-type analytical solutions based upon first principles modeling of the ENHS concept that encompass both pure as well as gas injection augmented natural circulation conditions, and primary-to-intermediate coolant heat transfer. The results indicate that natural circulation of the primary coolant is effective in removing heat from the core and transferring it to the intermediate coolant without the attainment of excessive coolant temperatures. (authors)

  6. IMMAN: free software for information theory-based chemometric analysis.

    Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo

    2015-05-01

    The features and theoretical background of a new and free computational program for chemometric analysis named IMMAN (acronym for Information theory-based CheMoMetrics ANalysis) are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches in each case. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. On the other hand, a generalization scheme for the previously defined differential Shannon's entropy is discussed, as well as the introduction of the Jeffreys information measure for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty, are incorporated into the IMMAN software (http://mobiosd-hub.com/imman-soft/), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing values processing, dataset partitioning, and browsing. Moreover, single parameter or ensemble (multi-criteria) ranking options are provided. Consequently, this software is suitable for tasks like dimensionality reduction and feature ranking, as well as comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA
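
    A small sketch of one of the supervised parameters named above, symmetrical uncertainty, computed from scratch on already-discretized values; the toy descriptor and class vectors are made up, and this is not IMMAN's own implementation.

      import numpy as np

      def entropy(labels):
          """Shannon entropy (bits) of a discrete label sequence."""
          _, counts = np.unique(labels, return_counts=True)
          p = counts / counts.sum()
          return -(p * np.log2(p)).sum()

      def symmetrical_uncertainty(x, y):
          """SU(X, Y) = 2 * IG / (H(X) + H(Y)), IG = H(X) + H(Y) - H(X, Y)."""
          hx, hy = entropy(x), entropy(y)
          joint = [f"{a}|{b}" for a, b in zip(x, y)]   # joint symbol stream
          ig = hx + hy - entropy(joint)
          return 2.0 * ig / (hx + hy) if hx + hy else 0.0

      # discretized descriptor vs class labels (equal-interval bins upstream)
      x = [0, 0, 1, 1, 2, 2, 2, 0]
      y = [0, 0, 1, 1, 1, 1, 0, 0]
      print(f"SU = {symmetrical_uncertainty(x, y):.3f}")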

  7. An Inverse Kinematic Approach Using Groebner Basis Theory Applied to Gait Cycle Analysis

    2013-03-01

    Thesis: An Inverse Kinematic Approach Using Groebner Basis Theory Applied to Gait Cycle Analysis, by Anum Barki, BS (AFIT-ENP-13-M-02), Department of the Air Force, Air Force Institute of Technology; approved by Dr. Ronald F. Tuttle (Chairman) and Dr. Kimberly Kendricks. [Record contains title-page matter only; no abstract text is available.]

  8. Primer Vector Optimization: Survey of Theory, new Analysis and Applications

    Guzman

    This paper presents a preliminary study in developing a set of optimization tools for orbit rendezvous, transfer and station keeping. This work is part of a large-scale effort under way at NASA Goddard Space Flight Center and a.i. solutions, Inc. to build generic methods which will enable missions with tight fuel budgets. Since no single optimization technique can solve all existing problems efficiently, a library of tools where the user could pick the method most suited for the particular mission is envisioned. The first trajectory optimization technique explored is Lawden's primer vector theory [Ref. 1]. Primer vector theory can be considered as a byproduct of applying Calculus of Variations (COV) techniques to the problem of minimizing the fuel usage of impulsive trajectories. For an n-impulse trajectory, it involves the solution of n-1 two-point boundary value problems. In this paper, we look at some of the different formulations of the primer vector (dependent on the frame employed and on the force model). Also, the applicability of primer vector theory is examined in an effort to understand when and why the theory can fail. Specifically, since COV is based on "small variations", singularities in the linearized (variational) equations of motion along the arcs must be taken into account. These singularities are a recurring problem in analyses that employ "small variations" [Refs. 2, 3]. For example, singularities in the (2-body problem) variational equations along elliptic arcs occur when [Ref. 4]: 1) the difference between the initial and final times is a multiple of the reference orbit period; 2) the difference between the initial and final true anomalies is a multiple of π (kπ, for k = 0, 1, 2, 3, ...); or 3) the time of flight is a minimum for the given difference in true anomaly. For the N-body problem, the situation is more complex and is still under investigation. Several examples, such as the initialization of an orbit (ascent trajectory) and

  9. Error Analysis of CM Data Products Sources of Uncertainty

    Hunt, Brian D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eckert-Gallup, Aubrey Celia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Cochran, Lainy Dromgoole [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kraus, Terrence D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Allen, Mark B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Beal, Bill [National Security Technologies, Joint Base Andrews, MD (United States); Okada, Colin [National Security Technologies, LLC. (NSTec), Las Vegas, NV (United States); Simpson, Mathew [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-01

    The goal of this project is to address the current inability to assess the overall error and uncertainty of data products developed and distributed by DOE's Consequence Management (CM) Program. This is a widely recognized shortfall, the resolution of which would provide a great deal of value and defensibility to the analysis results, data products, and the decision-making process that follows this work. A global approach to this problem is necessary because multiple sources of error and uncertainty contribute to the ultimate production of CM data products. Therefore, this project will require collaboration with subject matter experts across a wide range of FRMAC skill sets in order to quantify the types of uncertainty that each area of the CM process might contain and to understand how variations in these uncertainty sources contribute to the aggregated uncertainty present in CM data products. The ultimate goal of this project is to quantify the confidence level of CM products to ensure that appropriate public and worker protection decisions are supported by defensible analysis.

  10. Application of Open Source Technologies for Oceanographic Data Analysis

    Huang, T.; Gangl, M.; Quach, N. T.; Wilson, B. D.; Chang, G.; Armstrong, E. M.; Chin, T. M.; Greguska, F.

    2015-12-01

    NEXUS is a data-intensive analysis solution developed with a new approach for handling science data that enables large-scale data analysis by leveraging open source technologies such as Apache Cassandra, Apache Spark, Apache Solr, and Webification. NEXUS has been selected to provide on-the-fly time-series and histogram generation for the Soil Moisture Active Passive (SMAP) mission for Level 2 and Level 3 Active, Passive, and Active Passive products. It also provides an on-the-fly data subsetting capability. NEXUS is designed to scale horizontally, enabling it to handle massive amounts of data in parallel. It takes a new approach to managing time- and geo-referenced array data by dividing data artifacts into chunks and storing them in an industry-standard, horizontally scaled NoSQL database. This approach enables the development of scalable data analysis services that can infuse and leverage the elastic computing infrastructure of the Cloud. It is equipped with a high-performance geospatial and indexed data search solution, coupled with a high-performance data Webification solution free from file I/O bottlenecks, as well as a high-performance, in-memory data analysis engine. In this talk, we will focus on the recently funded AIST 2014 project that uses NEXUS as the core of an oceanographic anomaly detection service and web portal. We call it OceanXtremes.

  11. Prospects for accelerator neutron sources for large volume minerals analysis

    Clayton, C.G.; Spackman, R.

    1988-01-01

    The electron Linac can be regarded as a practical source of thermal neutrons for activation analysis of large-volume mineral samples. With a suitable target and moderator, a neutron flux of about 10^10 n/cm^2/s over 2-3 kg of rock can be generated. The proton Linac gives the possibility of a high neutron yield (> 10^12 n/s) of fast neutrons at selected energies. For the electron Linac, targets of W-U and W-Be are discussed. The advantages and limitations of the system are demonstrated for the analysis of gold in rocks and ores and for platinum in chromitite. These elements were selected as they are most likely to justify an accelerator installation at the present time. Errors due to self-shielding in gold particles for thermal neutrons are discussed. The proton Linac is considered for neutrons generated from a lithium target through the 7Li(p,n)7Be reaction. The analysis of gold by fast neutron activation is considered. This approach avoids particle self-absorption and, by appropriate proton energy selection, avoids potentially dominating interfering reactions. The analysis of 235U in the presence of 238U and 232Th is also considered. (author)

  12. Lyapunov analysis: from dynamical systems theory to applications

    Cencini, Massimo; Ginelli, Francesco

    2013-06-01

    [17], von Neumann [18], Krylov [19] and Anosov and Sinai [20] on ergodic theory. Lyapunov exponents quantify exponential sensitivity to initial conditions and provide direct access to the entropy production in ergodic systems via the Pesin theory [21]. Further advances have been made possible by the introduction of proper physical invariant measures for certain dissipative systems due to Sinai [22], Ruelle [23] and Bowen [24, 25]. However, it was necessary to wait until the end of the 1970s before the independent works of Shimada and Nagashima [26] and Benettin et al [27] introduced the numerical algorithms required to compute Lyapunov exponents beyond the largest one. The availability of such algorithms and also, at about the same time, of those necessary for the computation of fractal dimensions and entropies by Grassberger and Procaccia [28], made possible the study of chaotic behavior in physically relevant models. Lyapunov analysis, applied to experimental systems [29], was also made possible by a combination of these numerical methods with ideas from nonlinear time series analysis [30]. As a result, it is nowadays widely recognized that Lyapunov exponents are a central tool of chaos theory, crucial for characterizing a number of interesting physical properties including dynamical entropies and fractal dimensions [31]. Their pivotal role in modern dynamical systems theory has been established by a fruitful exchange between a rigorous (and beautiful) mathematical theory and the algorithmic approaches essential for understanding many physical phenomena. From the 1990s to the present, with the concomitant progress in both theoretical understanding and computer capabilities, there has been a progressive shift of interest from low dimensional towards high dimensional systems. This shift towards dynamics characterized by many degrees of freedom, possibly spatially organized and/or with several characteristic temporal scales, has been accompanied by the need for
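
    As a minimal numerical anchor, the largest Lyapunov exponent of the logistic map estimated as the time average of log|f'(x)| along an orbit, the one-dimensional reduction of the Benettin-type algorithms mentioned above; the r = 4 case is the textbook example with known exponent ln 2.

      import numpy as np

      def logistic_lyapunov(r=4.0, x0=0.3, n=100_000, discard=1_000):
          """Largest Lyapunov exponent of the logistic map x -> r x (1 - x),
          estimated as the orbit average of log|f'(x)| = log|r (1 - 2x)|."""
          x, acc = x0, 0.0
          for i in range(n + discard):
              if i >= discard:                 # skip the transient
                  acc += np.log(abs(r * (1.0 - 2.0 * x)))
              x = r * x * (1.0 - x)
          return acc / n

      print(f"lambda ~ {logistic_lyapunov():.4f}  (theory: ln 2 ~ 0.6931)")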

  13. Layers of protection analysis in the framework of possibility theory.

    Ouazraoui, N; Nait-Said, R; Bourareche, M; Sellami, I

    2013-11-15

    An important issue faced by risk analysts is how to deal with uncertainties associated with accident scenarios. In industry, one often uses single values derived from historical data or literature to estimate event probabilities or frequencies. However, both the dynamic environments of systems and the need to consider rare component failures may make this kind of data unrealistic. In this paper, uncertainty encountered in Layers Of Protection Analysis (LOPA) is considered in the framework of possibility theory. Data provided by reliability databases and/or expert judgments are represented by fuzzy quantities (possibilities). The fuzzy outcome frequency is calculated by extended multiplication using the α-cuts method. The fuzzy outcome is compared to a scenario risk tolerance criterion, and the required reduction is obtained by solving a possibilistic decision-making problem under a necessity constraint. In order to validate the proposed model, a case study concerning the protection layers of an operational heater is carried out. Copyright © 2013 Elsevier B.V. All rights reserved.
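
    A sketch of the α-cut extended multiplication step, assuming triangular fuzzy numbers for an initiating-event frequency and two probabilities of failure on demand; the numbers are illustrative, not the heater case-study values.

      import numpy as np

      def tfn_alpha_cut(tfn, alpha):
          """Alpha-cut interval of a triangular fuzzy number (low, mode, high)."""
          lo, m, hi = tfn
          return lo + alpha * (m - lo), hi - alpha * (hi - m)

      def fuzzy_product(tfns, alphas=np.linspace(0, 1, 11)):
          """Extended multiplication via alpha-cuts: multiply interval bounds
          level by level (valid here since all quantities are positive)."""
          cuts = []
          for a in alphas:
              lo, hi = 1.0, 1.0
              for tfn in tfns:
                  l, h = tfn_alpha_cut(tfn, a)
                  lo, hi = lo * l, hi * h
              cuts.append((a, lo, hi))
          return cuts

      # initiating-event frequency (per yr) times two PFDs, LOPA-style
      scenario = [(0.05, 0.1, 0.2), (0.005, 0.01, 0.05), (0.05, 0.1, 0.2)]
      for a, lo, hi in fuzzy_product(scenario)[::5]:
          print(f"alpha={a:.1f}: outcome frequency in [{lo:.2e}, {hi:.2e}] /yr")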

  14. Design and analysis of nuclear battery driven by the external neutron source

    Wang, Sanbing; He, Chaohui

    2014-01-01

    Highlights: • A new type of space nuclear power source called NBDEx is investigated. • NBDEx with 252Cf has better performance than an RTG with similar structure. • Its thermal power improves greatly with increasing fuel enrichment. • The service life of NBDEx is about 2.96 years. • The launch abort accident analysis fully demonstrates the advantage of NBDEx. - Abstract: Based on the theory of ADS (Accelerator Driven Subcritical reactor), a new type of nuclear battery was investigated, composed of a subcritical fission module and an isotope neutron source, called NBDEx (Nuclear Battery Driven by External neutron source). According to the structure of the GPHS-RTG (General Purpose Heat Source Radioisotope Thermoelectric Generator), fuel cell and fuel assembly models of NBDEx were set up, and their performance was analyzed with the MCNP code. From these results, it was found that the power and power density of NBDEx were almost six times higher than the RTG's. To fully demonstrate the advantage of NBDEx, an analysis of its impact factors was performed with the MCNP code, and its lifetime was calculated using the Origen code. These results verified that NBDEx is more suitable for space missions than an RTG

  15. Neutron activation analysis detection limits using 252Cf sources

    DiPrete, D.P.; Sigg, R.A.

    2000-01-01

    The Savannah River Technology Center (SRTC) developed a neutron activation analysis (NAA) facility several decades ago using low-flux 252Cf neutron sources. Over this time, the facility has addressed areas of applied interest in managing the Savannah River Site (SRS). Some applications are unique because of the site's operating history and its chemical-processing facilities. Because sensitivity needs for many applications are not severe, they can be accomplished using an ~6-mg 252Cf NAA facility. The SRTC 252Cf facility continues to support applied research programs at SRTC as well as other SRS programs for environmental and waste management customers. Samples analyzed by NAA include organic compounds, metal alloys, sediments, site process solutions, and many other materials. Numerous radiochemical analyses also rely on the facility for production of short-lived tracers, yielding by activation of carriers and small-scale isotope production for separation methods testing. These applications are more fully reviewed in Ref. 1. Although the flux (approximately 2 x 10^7 n/cm^2·s) is low relative to reactor facilities, more than 40 elements can be detected at low and sub-part-per-million levels. Detection limits provided by the facility are adequate for many analytical projects. Other multielement analysis methods, particularly inductively coupled plasma atomic emission and inductively coupled plasma mass spectrometry, can now provide sensitivities on dissolved samples that are often better than those available by NAA using low-flux isotopic sources. Because NAA allows analysis of bulk samples, (a) it is a more cost-effective choice than methods that require digestion when its sensitivity is adequate, and (b) it eliminates uncertainties that can be introduced by digestion processes

  16. An analysis of the concept of equilibrium in organization theory

    Gazendam, H.W.M.; Simons, John L.

    1998-01-01

    This article analyzes how the equilibrium concept is used in four organization theories: the theories of Fayol, Mintzberg, Morgan, and Volberda. Equilibrium can be defined as balance, fit or requisite variety. Equilibrium is related to observables dependent on the definition of organization as work

  17. An Analysis of Stochastic Game Theory for Multiagent Reinforcement Learning

    Bowling, Michael

    2000-01-01

    In this paper we contribute a comprehensive presentation of the relevant techniques for solving stochastic games from both the game theory and reinforcement learning communities. We examine the assumptions and limitations of these algorithms, and identify similarities among these algorithms, single-agent reinforcement learners, and basic game theory techniques.

  18. Theories of conduct disorder: a causal modelling analysis

    Krol, N.P.C.M.; Morton, J.; Bruyn, E.E.J. De

    2004-01-01

    Background: If a clinician has to make decisions on diagnosis and treatment, he or she is confronted with a variety of causal theories. In order to compare these theories a neutral terminology and notational system is needed. The Causal Modelling framework involving three levels of description –

  19. Principals' Leadership and Teachers' Motivation: Self-Determination Theory Analysis

    Eyal, Ori; Roth, Guy

    2011-01-01

    Purpose: The purpose of this paper is to investigate the relationship between educational leadership and teacher's motivation. The research described here was anchored in the convergence of two fundamental theories of leadership and motivation: the full range model of leadership and self-determination theory. The central hypotheses were that…

  20. sources

    Shu-Yin Chiang

    2002-01-01

    In this paper, we study simplified models of an ATM (Asynchronous Transfer Mode) multiplexer network with Bernoulli random traffic sources. Based on the model, the performance measures are analyzed for the different output service schemes.

  1. Identifying avian sources of faecal contamination using sterol analysis.

    Devane, Megan L; Wood, David; Chappell, Andrew; Robson, Beth; Webster-Brown, Jenny; Gilpin, Brent J

    2015-10-01

    Discrimination of the source of faecal pollution in water bodies is an important step in the assessment and mitigation of public health risk. One tool for faecal source tracking is the analysis of faecal sterols which are present in faeces of animals in a range of distinctive ratios. Published ratios are able to discriminate between human and herbivore mammal faecal inputs but are of less value for identifying pollution from wildfowl, which can be a common cause of elevated bacterial indicators in rivers and streams. In this study, the sterol profiles of 50 avian-derived faecal specimens (seagulls, ducks and chickens) were examined alongside those of 57 ruminant faeces and previously published sterol profiles of human wastewater, chicken effluent and animal meatwork effluent. Two novel sterol ratios were identified as specific to avian faecal scats, which, when incorporated into a decision tree with human and herbivore mammal indicative ratios, were able to identify sterols from avian-polluted waterways. For samples where the sterol profile was not consistent with herbivore mammal or human pollution, avian pollution is indicated when the ratio of 24-ethylcholestanol/(24-ethylcholestanol + 24-ethylcoprostanol + 24-ethylepicoprostanol) is ≥0.4 (avian ratio 1) and the ratio of cholestanol/(cholestanol + coprostanol + epicoprostanol) is ≥0.5 (avian ratio 2). When avian pollution is indicated, further confirmation by targeted PCR specific markers can be employed if greater confidence in the pollution source is required. A 66% concordance between sterol ratios and current avian PCR markers was achieved when 56 water samples from polluted waterways were analysed.
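
    The two avian ratios quoted above translate directly into a small screening routine. The sketch below uses the thresholds reported in the abstract; the field names and sample concentrations are hypothetical, and the check applies only after the human and herbivore indicative ratios have been ruled out.

      def avian_ratios(s):
          """Avian ratios 1 and 2 from sterol concentrations (same units)."""
          r1 = s["ethylcholestanol"] / (
              s["ethylcholestanol"] + s["ethylcoprostanol"] + s["ethylepicoprostanol"])
          r2 = s["cholestanol"] / (
              s["cholestanol"] + s["coprostanol"] + s["epicoprostanol"])
          return r1, r2

      def avian_indicated(s):
          """Apply only when human/herbivore ratios have already been excluded."""
          r1, r2 = avian_ratios(s)
          return r1 >= 0.4 and r2 >= 0.5   # avian ratio 1 and avian ratio 2

      sample = {"ethylcholestanol": 42.0, "ethylcoprostanol": 31.0,
                "ethylepicoprostanol": 8.0, "cholestanol": 55.0,
                "coprostanol": 35.0, "epicoprostanol": 12.0}   # ng/g, illustrative
      print(avian_ratios(sample), avian_indicated(sample))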

  2. Theory for beam-plasma millimeter-wave radiation source experiments

    Rosenberg, M.; Krall, N.A.

    1989-01-01

    This paper reports on theoretical studies for millimeter-wave plasma source experiments. In the device, millimeter-wave radiation is generated in a plasma-filled waveguide driven by counter-streaming electron beams. The beams excite electron plasma waves which couple to produce radiation at twice the plasma frequency. Physics topics relevant to the high electron beam current regime are discussed

  3. Discrimination and Well-being: Testing the differential source and Organizational Justice theories of workplace aggression

    Wood, S.; Braeken, J.; Niven, K.

    2013-01-01

    People may be subjected to discrimination from a variety of sources in the workplace. In this study of mental health workers, we contrast four potential perpetrators of discrimination (managers, co-workers, patients, and visitors) to investigate whether the negative impact of discrimination on

  4. Microeconomic theory and computation applying the maxima open-source computer algebra system

    Hammock, Michael R

    2014-01-01

    This book provides a step-by-step tutorial for using Maxima, an open-source multi-platform computer algebra system, to examine the economic relationships that form the core of microeconomics in a way that complements traditional modeling techniques.

  5. Detection, Source Location, and Analysis of Volcano Infrasound

    McKee, Kathleen F.

    The study of volcano infrasound focuses on low frequency sound from volcanoes, how volcanic processes produce it, and the path it travels from the source to our receivers. In this dissertation we focus on detecting, locating, and analyzing infrasound from a number of different volcanoes using a variety of analysis techniques. These works will help inform future volcano monitoring using infrasound with respect to infrasonic source location, signal characterization, volatile flux estimation, and back-azimuth to source determination. Source location is an important component of the study of volcano infrasound and of its application to volcano monitoring. Semblance is a forward grid-search technique and a common source location method in infrasound studies as well as in seismology. We evaluated the effectiveness of semblance in the presence of significant topographic features for explosions of Sakurajima Volcano, Japan, while taking into account temperature and wind variations. We show that topographic obstacles at Sakurajima cause a semblance source location offset of 360-420 m to the northeast of the actual source location. In addition, we found that, despite the consistent offset in source location, semblance can still be a useful tool for determining periods of volcanic activity. Infrasonic signal characterization follows signal detection and source location in volcano monitoring in that it informs us of the type of volcanic activity detected. In large volcanic eruptions the lowermost portion of the eruption column is momentum-driven and termed the volcanic jet or gas-thrust zone. This turbulent fluid flow perturbs the atmosphere and produces a sound similar to that of jet and rocket engines, known as jet noise. We deployed an array of infrasound sensors near an accessible, less hazardous, fumarolic jet at Aso Volcano, Japan as an analogue to large, violent volcanic eruption jets. We recorded volcanic jet noise at 57.6° from vertical, a recording angle not normally feasible
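
    The semblance method discussed above is, at its core, a forward grid search over candidate locations, aligning traces by predicted travel times and scoring their coherence. The following sketch assumes straight-ray acoustic propagation, an invented array geometry, and synthetic Gaussian-pulse waveforms; it illustrates the technique, not the dissertation's processing code.

      import numpy as np

      def semblance(traces, delays):
          """Semblance of delay-aligned traces; traces is (nsta, nsamp)."""
          aligned = np.array([np.roll(tr, -d) for tr, d in zip(traces, delays)])
          num = np.sum(np.sum(aligned, axis=0) ** 2)
          den = aligned.shape[0] * np.sum(aligned ** 2)
          return num / den

      def grid_search(traces, stations, grid, c=340.0, fs=100.0):
          """Return the grid point whose straight-ray delays maximize semblance."""
          best, best_xy = -1.0, None
          for xy in grid:
              t = np.linalg.norm(stations - xy, axis=1) / c     # travel times (s)
              d = np.round((t - t.min()) * fs).astype(int)      # relative delays
              s = semblance(traces, d)
              if s > best:
                  best, best_xy = s, xy
          return best_xy, best

      # synthetic test: Gaussian pulses with the move-out of a source at (200, 120) m
      fs, c, n = 100.0, 340.0, 512
      stations = np.array([[0.0, 0.0], [50.0, 0.0], [0.0, 50.0]])
      t0 = np.linalg.norm(stations - np.array([200.0, 120.0]), axis=1) / c
      traces = np.array([np.exp(-((np.arange(n) / fs - 2.0 - (t - t0.min())) ** 2) / 0.01)
                         for t in t0])
      grid = [np.array([x, y], dtype=float) for x in range(0, 401, 20)
              for y in range(0, 401, 20)]
      print(grid_search(traces, stations, grid, c, fs))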

  6. Multi-source Geospatial Data Analysis with Google Earth Engine

    Erickson, T.

    2014-12-01

    The Google Earth Engine platform is a cloud computing environment for data analysis that combines a public data catalog with a large-scale computational facility optimized for parallel processing of geospatial data. The data catalog is a multi-petabyte archive of georeferenced datasets that include images from Earth observing satellite and airborne sensors (examples: USGS Landsat, NASA MODIS, USDA NAIP), weather and climate datasets, and digital elevation models. Earth Engine supports both a just-in-time computation model that enables real-time preview and debugging during algorithm development for open-ended data exploration, and a batch computation mode for applying algorithms over large spatial and temporal extents. The platform automatically handles many traditionally-onerous data management tasks, such as data format conversion, reprojection, and resampling, which facilitates writing algorithms that combine data from multiple sensors and/or models. Although the primary use of Earth Engine, to date, has been the analysis of large Earth observing satellite datasets, the computational platform is generally applicable to a wide variety of use cases that require large-scale geospatial data analyses. This presentation will focus on how Earth Engine facilitates the analysis of geospatial data streams that originate from multiple separate sources (and often communities) and how it enables collaboration during algorithm development and data exploration. The talk will highlight current projects/analyses that are enabled by this functionality. https://earthengine.google.org
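
    A minimal sketch of the kind of multi-sensor combination described above, using the Earth Engine Python client; the asset identifiers, dates, and region are examples only, and running it requires an authenticated Earth Engine account.

      import ee

      ee.Initialize()   # assumes prior ee.Authenticate() on this machine

      region = ee.Geometry.Point(-122.0, 37.0).buffer(10000)   # example region

      # Median Landsat 8 composite for one year over the region; Earth Engine
      # handles reprojection and resampling between datasets automatically.
      landsat = (ee.ImageCollection('LANDSAT/LC08/C02/T1_L2')
                 .filterBounds(region)
                 .filterDate('2014-01-01', '2015-01-01')
                 .median())

      # Combine with a digital elevation model from a different data source.
      dem = ee.Image('USGS/SRTMGL1_003')
      stack = landsat.addBands(dem)
      print(stack.bandNames().getInfo())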

  7. A One-Dimensional Thermoelastic Problem due to a Moving Heat Source under Fractional Order Theory of Thermoelasticity

    Tianhu He

    2014-01-01

    The dynamic response of a one-dimensional problem for a thermoelastic rod with finite length is investigated in the context of the fractional order theory of thermoelasticity in the present work. The rod is fixed at both ends and subjected to a moving heat source. The fractional order thermoelastic coupled governing equations for the rod are formulated. Laplace transform as well as its numerical inversion is applied to solving the governing equations. The variations of the considered temperature, displacement, and stress in the rod are obtained and demonstrated graphically. The effects of time, velocity of the moving heat source, and fractional order parameter on the distributions of the considered variables are of concern and discussed in detail.
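
    The abstract mentions Laplace transforms with numerical inversion. One common scheme for problems of this kind is the Gaver-Stehfest algorithm, sketched below as a representative choice, not necessarily the scheme the authors used; F is the Laplace-domain function F(s).

      import math

      def stehfest_coefficients(N):
          """Stehfest weights V_k for even N."""
          V = []
          for k in range(1, N + 1):
              s = 0.0
              for j in range((k + 1) // 2, min(k, N // 2) + 1):
                  s += (j ** (N // 2) * math.factorial(2 * j)) / (
                      math.factorial(N // 2 - j) * math.factorial(j)
                      * math.factorial(j - 1) * math.factorial(k - j)
                      * math.factorial(2 * j - k))
              V.append((-1) ** (k + N // 2) * s)
          return V

      def invert(F, t, N=12):
          """Approximate f(t) from its Laplace transform F(s)."""
          ln2_t = math.log(2.0) / t
          V = stehfest_coefficients(N)
          return ln2_t * sum(V[k - 1] * F(k * ln2_t) for k in range(1, N + 1))

      # check against F(s) = 1/(s+1)  ->  f(t) = exp(-t)
      print(invert(lambda s: 1.0 / (s + 1.0), 1.0), math.exp(-1.0))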

  8. Phase 2 safety analysis report: National Synchrotron Light Source

    Stefan, P.

    1989-06-01

    The Phase II program was established in order to provide additional space for experiments, as well as staging and equipment storage areas. It also provides additional office space and new types of advanced instrumentation for users. This document deals with the new safety issues resulting from this extensive expansion program, and should be used as a supplement to BNL Report No. 51584, "National Synchrotron Light Source Safety Analysis Report," July 1982 (hereafter referred to as the Phase I SAR). The initial NSLS facility is described in the Phase I SAR. It comprises two electron storage rings, an injection system common to both, experimental beam lines and equipment, and office and support areas, all of which are housed in a 74,000 sq. ft. building. The X-ray Ring provides for 28 primary beam ports and the VUV Ring, 16. Each port is capable of division into 2 or 3 separate beam lines. All ports receive their synchrotron light from conventional bending magnet sources, the magnets being part of the storage ring lattice.

  9. Analysis of the source term in the Chernobyl-4 accident

    Alonso, A.; Lopez Montero, J.V.; Pinedo Garrido, P.

    1990-01-01

    The report presents an analysis of the Chernobyl accident and of the phenomena with major influence on the source term, including the chemical effects of materials dumped over the reactor, carried out by the Chair of Nuclear Technology at Madrid University under a contract with the CEC. It also includes a comparison of the Cs-137/Cs-134 ratio between measurements performed by Soviet authorities and by countries belonging to the Community and the OECD area. Chapter II contains a summary of both isotope measurements (Cs-134 and Cs-137), and their ratios, in samples of air, water, soil and agricultural and animal products collected by the Soviets for their report presented in Vienna (1986). Chapter III reports on the inventories of cesium isotopes in the core, while Chapter IV analyses the transient, especially the fuel temperature reached, as a way to deduce the mechanisms of cesium escape. The cesium source term is analyzed in Chapter V. Normal conditions are considered, as well as the transient and the post-accident period, including the effects of deposited materials. The conclusion of this study is that the Chernobyl accident sequence is specific to RBMK-type reactors, and that in the Western world basic research on fuel behaviour in reactivity transients has already been carried out.

  10. Theory of fourfold interference with photon pairs from spatially separated sources

    Zhang, Hui Rong; Wang, Ruo Peng

    2007-01-01

    We present a theory for fourfold quantum interference of photons generated by independent spontaneous parametric down-conversion processes. Closed-form expressions for fourfold quantum interference patterns and visibility are found. The theoretical result for the fourfold quantum interference patterns is in good agreement with reported experimental data. Detailed numerical calculations of the dependence of fourfold quantum interference visibility on experimentally controllable parameters are carried out. It is found that higher visibility can be achieved for small biphoton width, short pump pulse coherence time, and narrow bandwidth of spectral filters. The optimal condition for obtaining both high fourfold interference visibility and high intensity at the same time is also discussed.

  11. On Sources of Structural Injustice: A Feminist Reading of the Theory of Iris M. Young

    Uhde, Zuzana

    2010-01-01

    Vol. 20, No. 2 (2010), pp. 151-166. ISSN 1210-3055. R&D Projects: GA ČR GAP404/10/0021. Institutional research plan: CEZ:AV0Z70280505. Keywords: Iris M. Young * critical theory of justice * gender. Subject RIV: AD - Politology; Political Sciences. http://versita.metapress.com/content/u4kx375l313k/?p=15604142b3014f328f90d518f50e85ac&pi=0

  12. 252Cf-source-driven neutron noise analysis method

    Mihalczo, J.T.; King, W.T.; Blakeman, E.D.

    1985-01-01

    The 252Cf-source-driven neutron noise analysis method has been tested in a wide variety of experiments that have indicated the broad range of applicability of the method. The neutron multiplication factor k_eff has been satisfactorily determined for a variety of materials including uranium metal, light water reactor fuel pins, fissile solutions, fuel plates in water, and interacting cylinders. For a uranyl nitrate solution tank typical of a fuel processing or reprocessing plant, k_eff values between 0.92 and 0.5 were satisfactorily determined using a simple point kinetics interpretation of the experimental data. The short measurement times, in several cases as low as 1 min, have shown that the development of this method can lead to a practical subcriticality monitor for many in-plant applications. Further development of the method will require experiments and the development of theoretical methods to predict the experimental observables.

  13. Statistical Analysis of the Microvariable AGN Source Mrk 501

    Alberto C. Sadun

    2018-02-01

    We report on the optical observations and analysis of the high-energy peaked BL Lac object (HBL) Mrk 501, at redshift z = 0.033. We can confirm microvariable behavior over the course of minutes on several occasions per night. As an alternative to the commonly understood dynamical model of random variations in intensity of the AGN, we develop a relativistic beaming model with a minimum of free parameters, which allows us to infer changes in the line-of-sight angles for the motion of the different relativistic components. We hope our methods can be used in future studies of beamed emission in other active microvariable sources, similar to the one we explored.

  14. Evaluating source separation of plastic waste using conjoint analysis.

    Nakatani, Jun; Aramaki, Toshiya; Hanaki, Keisuke

    2008-11-01

    Using conjoint analysis, we estimated households' willingness to pay (WTP) for source separation of plastic waste and for the improvement of related environmental impacts: the residents' loss of life expectancy (LLE), the landfill capacity, and the CO2 emissions. Unreliable respondents were identified and removed from the sample based on their answers to follow-up questions. It was found that the utilities associated with reducing LLE and with landfill capacity were both well expressed by logarithmic functions, but that residents were indifferent to the level of CO2 emissions even though they approved of CO2 reduction. In addition, residents derived utility from the act of separating plastic waste, irrespective of its environmental impacts; that is, they were willing to practice the separation of plastic waste at home in anticipation of its "invisible effects", such as the improvement of citizens' attitudes toward solid waste issues.

  15. PROTEINCHALLENGE: Crowd sourcing in proteomics analysis and software development

    Martin, Sarah F.; Falkenberg, Heiner; Dyrlund, Thomas Franck

    2013-01-01

    In large-scale proteomics studies there is a temptation, after months of experimental work, to plug resulting data into a convenient, if poorly implemented, set of tools, which may neither do the data justice nor help answer the scientific question. In this paper we have captured key concerns, including arguments for community-wide open source software development and "big data" compatible solutions for the future, with the aim of setting a community-driven gold standard for data handling, reporting and sharing. For the meantime, we have laid out ten top tips for data processing. With these at hand, a first large-scale proteomics analysis hopefully becomes less daunting to navigate. This article is part of a Special Issue entitled: New Horizons and Applications for Proteomics [EuPA 2012].

  16. Graph theory applied to noise and vibration control in statistical energy analysis models.

    Guasch, Oriol; Cortés, Lluís

    2009-06-01

    A fundamental aspect of noise and vibration control in statistical energy analysis (SEA) models consists in first identifying and then reducing the energy flow paths between subsystems. In this work, it is proposed to make use of some results from graph theory to address both issues. On the one hand, linear and path algebras applied to adjacency matrices of SEA graphs are used to determine the existence of any order paths between subsystems, counting and labeling them, finding extremal paths, or determining the power flow contributions from groups of paths. On the other hand, a strategy is presented that makes use of graph cut algorithms to reduce the energy flow from a source subsystem to a receiver one, modifying as few internal and coupling loss factors as possible.
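
    The two graph-theoretic ingredients named above, path algebra on the adjacency matrix and graph cuts, can be illustrated in a few lines. In the sketch below the adjacency matrix and coupling capacities describe an invented four-subsystem SEA model, not the paper's examples; powers of the adjacency matrix count paths of a given order, and a minimum cut between source and receiver suggests which couplings to treat first.

      import numpy as np
      import networkx as nx

      A = np.array([[0, 1, 1, 0],      # adjacency matrix of a 4-subsystem model
                    [1, 0, 1, 1],
                    [1, 1, 0, 1],
                    [0, 1, 1, 0]])
      # (A^n)[i, j] counts paths of order n from subsystem i to subsystem j
      print("2nd-order paths 0 -> 3:", np.linalg.matrix_power(A, 2)[0, 3])

      # coupling strengths as flow capacities (symmetric, invented values)
      G = nx.DiGraph()
      for u, v, c in [(0, 1, 2.0), (0, 2, 1.0), (1, 2, 0.5), (1, 3, 1.5), (2, 3, 1.0)]:
          G.add_edge(u, v, capacity=c)
          G.add_edge(v, u, capacity=c)

      # minimum cut separating source subsystem 0 from receiver subsystem 3
      cut_value, (src_side, snk_side) = nx.minimum_cut(G, 0, 3)
      cut_edges = [(u, v) for u in src_side for v in G[u] if v in snk_side]
      print("cut value:", cut_value, "couplings to modify:", cut_edges)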

  17. Advancing Explosion Source Theory through Experimentation: Results from Seismic Experiments Since the Moratorium on Nuclear Testing

    Bonner, J. L.; Stump, B. W.

    2011-12-01

    On 23 September 1992, the United States conducted the nuclear explosion DIVIDER at the Nevada Test Site (NTS). It would become the last US nuclear test when a moratorium ended testing the following month. Many of the theoretical explosion seismic models used today were developed from observations of hundreds of nuclear tests at NTS and around the world. Since the moratorium, researchers have turned to chemical explosions as a possible surrogate for continued nuclear explosion research. This talk reviews experiments since the moratorium that have used chemical explosions to advance explosion source models. The 1993 Non-Proliferation Experiment examined single-point, fully contained chemical-nuclear equivalence by detonating over a kiloton of chemical explosive at NTS in close proximity to previous nuclear explosion tests. When compared with data from these nearby nuclear explosions, the regional and near-source seismic data were found to be essentially identical after accounting for different yield scaling factors for chemical and nuclear explosions. The relationship between contained chemical explosions and large production mining shots was studied at the Black Thunder coal mine in Wyoming in 1995. The research led to an improved source model for delay-fired mining explosions and a better understanding of mining explosion detection by the International Monitoring System (IMS). The effect of depth was examined in a 1997 Kazakhstan Depth of Burial experiment. Researchers used local and regional seismic observations to conclude that the dominant mechanism for enhanced regional shear waves was local Rg scattering. Travel-time calibration for the IMS was the focus of the 1999 Dead Sea Experiment where a 10-ton shot was recorded as far away as 5000 km. The Arizona Source Phenomenology Experiments provided a comparison of fully- and partially-contained chemical shots with mining explosions, thus quantifying the reduction in seismic amplitudes associated with partial

  18. Analysis of observables in Chern-Simons perturbation theory

    Alvarez, M.; Labastida, J.M.F.

    1993-01-01

    Chern-Simons theory with gauge group SU(N) is analyzed from a perturbation theory point of view. Computations up to order g^6 of the vacuum expectation value of the unknot are carried out, and it is shown that agreement with the exact result by Witten implies no quantum correction at two loops for the two-point function. In addition, it is shown from a perturbation theory point of view that the framing dependence of the vacuum expectation value of an arbitrary knot factorizes in the form predicted by Witten.

  19. Activity Analysis: Bridging the Gap between Production Economics Theory and Practical Farm Management Procedures

    Longworth, John W.; Menz, Kenneth M.

    1980-01-01

    This paper is addressed to the traditional problem of demonstrating the relevance of production theory to management-oriented people. Activity analysis, it is argued, is the most appropriate pedagogic framework within which to commence either a production economics or a farm management course. Production economics theory has not been widely accepted as a useful method for the analysis of practical management problems. The theory has been traditionally presented in terms of continuous function...

  20. Surface-Source Downhole Seismic Analysis in R

    Thompson, Eric M.

    2007-01-01

    This report discusses a method for interpreting a layered slowness or velocity model from surface-source downhole seismic data originally presented by Boore (2003). I have implemented this method in the statistical computing language R (R Development Core Team, 2007), so that it is freely and easily available to researchers and practitioners who may find it useful. I originally applied an early version of these routines to seismic cone penetration test data (SCPT) to analyze the horizontal variability of shear-wave velocity within the sediments in the San Francisco Bay area (Thompson et al., 2006). A more recent version of these codes was used to analyze the influence of interface-selection and model assumptions on velocity/slowness estimates and the resulting differences in site amplification (Boore and Thompson, 2007). The R environment has many benefits for scientific and statistical computation; I have chosen R to disseminate these routines because it is versatile enough to program specialized routines, is highly interactive which aids in the analysis of data, and is freely and conveniently available to install on a wide variety of computer platforms. These scripts are useful for the interpretation of layered velocity models from surface-source downhole seismic data such as deep boreholes and SCPT data. The inputs are the travel-time data and the offset of the source at the surface. The travel-time arrivals for the P- and S-waves must already be picked from the original data. An option in the inversion is to include estimates of the standard deviation of the travel-time picks for a weighted inversion of the velocity profile. The standard deviation of each travel-time pick is defined relative to the standard deviation of the best pick in a profile and is based on the accuracy with which the travel-time measurement could be determined from the seismogram. The analysis of the travel-time data consists of two parts: the identification of layer-interfaces, and the
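
    For fixed interfaces and near-vertical rays, the weighted travel-time inversion described above reduces to a linear least-squares problem for one slowness per layer. The sketch below illustrates that step in Python rather than the author's R code; the depths, picks, and pick uncertainties are invented.

      import numpy as np

      def fit_slowness(z, t, sigma, interfaces):
          """Weighted least-squares layer slownesses from vertical travel times.

          z, t, sigma : receiver depths (m), travel times (s), pick std devs (s)
          interfaces  : layer boundary depths, e.g. [0, 5, 12, 30]
          """
          tops = np.array(interfaces[:-1])
          bottoms = np.array(interfaces[1:])
          # G[i, j] = length of the ray to receiver i inside layer j
          G = np.clip(np.clip(z[:, None], None, bottoms) - tops, 0.0, None)
          W = np.diag(1.0 / sigma)                     # weight by pick quality
          s, *_ = np.linalg.lstsq(W @ G, W @ t, rcond=None)
          return s                                      # slowness (s/m) per layer

      z = np.array([2.0, 4.0, 8.0, 11.0, 20.0, 28.0])
      t = np.array([0.00400, 0.00800, 0.01375, 0.01750, 0.02542, 0.03208])
      sigma = np.full_like(t, 5e-4)
      print(1.0 / fit_slowness(z, t, sigma, [0.0, 5.0, 12.0, 30.0]))  # ~[500, 800, 1200] m/s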

  1. Scalar radiation from a radially infalling source into a Schwarzschild black hole in the framework of quantum field theory

    Oliveira, Leandro A. [Campus Salinopolis, Universidade Federal do Para, Salinopolis, Para (Brazil); Universidade Federal do Para, Faculdade de Fisica, Belem, Para (Brazil); Crispino, Luis C.B. [Universidade Federal do Para, Faculdade de Fisica, Belem, Para (Brazil); Higuchi, Atsushi [University of York, Department of Mathematics, Heslington, York (United Kingdom)

    2018-02-15

    We investigate the radiation to infinity of a massless scalar field from a source falling radially towards a Schwarzschild black hole, using the framework of quantum field theory at tree level. When the source falls from infinity, the monopole radiation is dominant for low initial velocities. Higher multipoles become dominant at high initial velocities. It is found that, as in the electromagnetic and gravitational cases, at high initial velocities the energy spectrum for each multipole with l ≥ 1 is approximately constant up to the fundamental quasinormal frequency and then drops to zero. We also investigate the case where the source falls from rest at a finite distance from the black hole. In this case the monopole and dipole contributions are dominant. This case needs to be carefully distinguished from the unphysical process where the source abruptly appears at rest and starts falling, which would result in the radiation of an infinite amount of energy. We also investigate the radiation of a massless scalar field to the horizon of the black hole, finding some features similar to the gravitational case.

  2. Fire Hazard Analysis for the Cold Neutron Source System

    Choi, Jung Won; Kim, Young Ki; Wu, Sang Ik; Park, Young Cheol; Kim, Bong Soo; Kang, Mee Jin; Oh, Sung Wook

    2006-04-15

    As the Cold Neutron Source (CNS) system has been designed for installation in HANARO, a fire hazard analysis of the CNS system is required under MOST Notice No. 2003-20, Technical Standard for Fire Hazard Analysis. As a moderator, strongly flammable hydrogen fills the hydrogen system of the CNS. Accordingly, the physical damage that a fire or explosion in the reactor hall could inflict on the reactor safety system should be evaluated, so that safety protection measures can be reflected in the design of the CNS system. For the fire hazard analysis, the accident scenarios were divided into three: a hydrogen leak while charging the system with hydrogen, a hydrogen leak during normal operation of the CNS, and an explosion of the hydrogen buffer tank caused by an external fire. The analysis results can be summarized as follows. First, there is no physical damage threatening the reactor safety system even if all hydrogen gas escapes the system and ignites as a jet fire. Second, since the CNS equipment island (CEI) is located far enough from the reactor, a buffer-tank explosion imposes no physical damage on the reactor in terms of overpressure, except for flying debris, so a light two-hour fireproof panel is installed on one side of the hydrogen buffer tank. Third, there are few combustibles on the second floor of the CEI, so a fire cannot propagate to other areas in the reactor hall; nevertheless, a light two-hour fireproof panel will be installed on the second floor against external or internal fire so as to serve as a fire protection barrier.


  4. Spatial layout optimization design of multi-type LEDs lighting source based on photoelectrothermal coupling theory

    Xue, Lingyun; Li, Guang; Chen, Qingguang; Rao, Huanle; Xu, Ping

    2018-03-01

    Multiple-LED-based spectral synthesis technology has been widely used in solar simulators, color mixing, artificial lighting for plant factories, and similar fields. Generally, many LEDs are spatially arranged in a compact layout to obtain a high power density output. Mutual heat spreading among the LEDs produces a coupled thermal effect that additionally increases the junction temperature of each LED. Affected by this photoelectrothermal coupling, the spectrum of an LED shifts and its luminous efficiency decreases; correspondingly, the spectral synthesis result becomes mismatched. Thermal management of the LED spatial layout therefore plays an important role in a multi-LED light source system. In this paper, a thermal dissipation network topology model considering mutual heat spreading among the LEDs is proposed for multi-LED systems with various power ratings. The junction temperature increment caused by thermal coupling depends strongly on the spatial arrangement. To minimize the thermal coupling effect, an optimization method for the LED spatial layout of a specific light source structure is presented and analyzed. The results show that high-power LEDs should be arranged at the corners and low-power LEDs at the center. With this method it is convenient to determine the spatial layout of the LEDs in a system with any kind of light source structure, and the method has the advantage of being universally applicable and easy to adjust.
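
    To make the idea concrete, the toy heuristic below scores a candidate layout by each LED's self-heating plus distance-weighted coupling from its neighbours, then searches for the placement of the high-power devices that minimizes the peak temperature rise. The coupling model and all constants are assumptions for illustration, not the paper's thermal dissipation network model.

      import numpy as np
      from itertools import combinations

      def peak_rise(powers, coords, r_self=10.0, k=2.0):
          """Peak junction-temperature rise (K); r_self, k are assumed constants."""
          rises = []
          for i in range(len(powers)):
              coupling = sum(k * powers[j] / np.linalg.norm(coords[i] - coords[j])
                             for j in range(len(powers)) if j != i)
              rises.append(r_self * powers[i] + coupling)
          return max(rises)

      coords = np.array([(x, y) for x in range(3) for y in range(3)], dtype=float)

      def layout(high_idx):
          p = np.ones(9)            # 1 W low-power LEDs everywhere
          p[list(high_idx)] = 3.0   # 3 W high-power LEDs at the chosen positions
          return p

      best = min(combinations(range(9), 3),
                 key=lambda idx: peak_rise(layout(idx), coords))
      print(best)   # widely separated positions, i.e. corners of the 3x3 grid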

  5. New Analysis and Theory of Deployable Folded Structures, Phase I

    National Aeronautics and Space Administration — A recently developed mathematical theory has great value for deployable space structures and in situ manufacture of large beams, panels, cylinders and other...

  6. New Analysis and Theory of Deployable Folded Structures, Phase II

    National Aeronautics and Space Administration — A recently developed mathematical folding theory has great value for deployable space structures and in situ manufacture of large beams, panels and cylinders. The...

  7. grounded theory approach in sermon analysis of sermons

    The grounded theory approach is implemented in analysing sermons on poverty and directed at ... the poverty situation in South Africa, especially in the black community (Pieterse ...). The activity of open coding discovers gaps or holes of needed ...

  8. Using Molecular Modeling in Teaching Group Theory Analysis of the Infrared Spectra of Organometallic Compounds

    Wang, Lihua

    2012-01-01

    A new method is introduced for teaching group theory analysis of the infrared spectra of organometallic compounds using molecular modeling. The main focus of this method is to enhance student understanding of the symmetry properties of vibrational modes and of the group theory analysis of infrared (IR) spectra by using visual aids provided by…

  9. Mokken scale analysis : Between the Guttman scale and parametric item response theory

    van Schuur, Wijbrandt H.

    2003-01-01

    This article introduces a model of ordinal unidimensional measurement known as Mokken scale analysis. Mokken scaling is based on principles of Item Response Theory (IRT) that originated in the Guttman scale. I compare the Mokken model with both Classical Test Theory (reliability or factor analysis)

  10. Chiral symmetry breaking in gauge theories from Reggeon diagram analysis

    White, A.R.

    1991-01-01

    It is argued that reggeon diagrams can be used to study dynamical properties of gauge theories containing a large number of massless fermions. SU(2) gauge theory is studied in detail and it is argued that there is a high energy solution which is analogous to the solution of the massless Schwinger model. A generalized winding-number condensate produces the massless pseudoscalar spectrum associated with chiral symmetry breaking and a ''trivial'' S-Matrix

  11. The filmic emotion. A comparative analysis of film theories

    Imanol Zumalde-Arregi, Ph.D.

    2011-01-01

    This article tries to explain the origins of filmic emotion. Given the lack of a widely agreed idea about the way film texts play with the emotional world of the empirical spectator, this work outlines the diverse arguments and approaches through which film theory has tried to explain the catalogue of emotions that a film can provoke in the spectator. This review reveals a conflict between the theories oriented to the spectator and context, and the theories focused on the text. There are, in fact, two approaches to evaluating the emotional effects provoked by films: the cultural studies approach, which is based on pragmatics and gender-oriented theories and focuses on the social and subjective conditions through which cinema is experienced, and the approach of structural semiotics and cognitive theory, which focuses on the way a film text tries to direct the spectator's emotional experience. Between these two approaches, psychoanalytic theory conceives the filmic experience as a simulation of daily life.

  12. Contract Source Selection: An Analysis of Lowest Price Technically Acceptable and Tradeoff Strategies

    2016-06-15

    Contract Source Selection: An Analysis of Lowest Price Technically Acceptable and Tradeoff Strategies. Acquisition Research Program Sponsored Report Series, 15 June 2016. LCDR Jamal M. Osman, USN. Cited work includes Lamoureux, J., Murrow, M., & Walls, C. (2015), Relationship of source selection methods to contract outcomes: an analysis.

  13. [Sources of finance for provincial occupational health services. Theory and practice].

    Rydlewska-Liszkowska, I; Jugo, B

    1999-01-01

    The financing of occupational health services (OHS) at the provincial level is an important issue in view of the transformation process going on not only in OHS but also in the overall health care system in Poland. New principles of financing must now be based on cost and effect analyses. Thus, the question arises of how to provide financial means adequate to the needs of health care institutions resulting from their tasks and responsibilities. The gaps existing in the information system have encouraged us to examine the situation with regard to the structure of financing and the internal allocation of financial means. The objectives were formulated as follows: to characterise the sources of financial means received by provincial OHS centres; to analyse the structure of financial means derived from various sources, taking into account forms of financial administration, using the data provided by selected centres; and to define the relation between the financial means at the disposal of OHS centres and the scope of their activities. The information on the financing system was collected using a questionnaire mailed to directors of selected OHS centres. The information collected proved to be a valuable source of knowledge on the above issues, as well as on how far the new system of financing, associated with a new form of financial administration (an independent public health institution), has already been implemented. The studies indicated that at the present stage of the OHS system transformation it is very difficult to formulate conclusions on financing administration in provincial OHS centres.

  14. Automatic Wave Equation Migration Velocity Analysis by Focusing Subsurface Virtual Sources

    Sun, Bingbing; Alkhalifah, Tariq Ali

    2017-11-03

    Macro velocity model building is important for subsequent pre-stack depth migration and full waveform inversion. Wave equation migration velocity analysis (WEMVA) utilizes the band-limited waveform to invert for the velocity. Normally, inversion would be implemented by focusing the subsurface offset common image gathers (SOCIGs). We re-examine this concept with a different perspective: In subsurface offset domain, using extended Born modeling, the recorded data can be considered as invariant with respect to the perturbation of the position of the virtual sources and velocity at the same time. A linear system connecting the perturbation of the position of those virtual sources and velocity is derived and solved subsequently by Conjugate Gradient method. In theory, the perturbation of the position of the virtual sources is given by the Rytov approximation. Thus, compared to the Born approximation, it relaxes the dependency on amplitude and makes the proposed method more applicable for real data. We demonstrate the effectiveness of the approach by applying the proposed method on both isotropic and anisotropic VTI synthetic data. A real dataset example verifies the robustness of the proposed method.
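
    The linear system mentioned above is solved with the conjugate gradient method. A generic matrix-free CG of the kind used for such problems is sketched below, with a small symmetric positive-definite matrix standing in for the wave-equation operator; this is a sketch of the standard algorithm, not the authors' implementation.

      import numpy as np

      def conjugate_gradient(A, b, x0=None, tol=1e-8, maxiter=200):
          """Solve A(x) = b for a symmetric positive-definite operator A."""
          x = np.zeros_like(b) if x0 is None else x0.copy()
          r = b - A(x)                  # initial residual
          p = r.copy()                  # initial search direction
          rs = r @ r
          for _ in range(maxiter):
              Ap = A(p)
              alpha = rs / (p @ Ap)     # optimal step along p
              x += alpha * p
              r -= alpha * Ap
              rs_new = r @ r
              if np.sqrt(rs_new) < tol:
                  break
              p = r + (rs_new / rs) * p # conjugate direction update
              rs = rs_new
          return x

      M = np.array([[4.0, 1.0], [1.0, 3.0]])   # toy SPD stand-in for the operator
      print(conjugate_gradient(lambda v: M @ v, np.array([1.0, 2.0])))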


  16. Dimensional analysis and extended hydrodynamic theory applied to long-rod penetration of ceramics

    J.D. Clayton

    2016-08-01

    Principles of dimensional analysis are applied in a new interpretation of penetration of ceramic targets subjected to hypervelocity impact. The analysis results in a power series representation, in terms of inverse velocity, of normalized depth of penetration that reduces to the hydrodynamic solution at high impact velocities. Specifically considered are test data from four literature sources involving penetration of confined thick ceramic targets by tungsten long rod projectiles. The ceramics are AD-995 alumina, aluminum nitride, silicon carbide, and boron carbide. Test data can be accurately represented by the linear form of the power series, whereby the same value of a single fitting parameter applies remarkably well for all four ceramics. Comparison of the present model with others in the literature (e.g., Tate's theory) demonstrates a target resistance stress that depends on impact velocity, linearly in the limiting case. Comparison of the present analysis with recent research involving penetration of thin ceramic tiles at lower typical impact velocities confirms the importance of target properties related to fracture and shear strength at the Hugoniot Elastic Limit (HEL) only in the latter. In contrast, in the former (i.e., hypervelocity and thick target experiments), the current analysis demonstrates dominant dependence of penetration depth only on target mass density. Such comparisons suggest transitions from microstructure-controlled to density-controlled penetration resistance with increasing impact velocity and ceramic target thickness.
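
    The linear form of the power series described above can be written as P/L = sqrt(rho_p/rho_t) * (1 - c/v), recovering the hydrodynamic limit as v grows. The sketch below fits the single parameter c to invented data points; the densities are nominal values for a tungsten rod and a silicon carbide target, and none of the numbers come from the cited tests.

      import numpy as np

      rho_p, rho_t = 17600.0, 3215.0      # tungsten rod, SiC target (kg/m^3)
      hydro = np.sqrt(rho_p / rho_t)      # hydrodynamic limit of P/L

      v = np.array([1500.0, 2000.0, 2500.0, 3000.0, 4000.0])   # impact speed (m/s)
      PL = np.array([0.94, 1.29, 1.50, 1.64, 1.81])            # invented P/L data

      # linear form: P/L = hydro * (1 - c / v)  ->  fit c by least squares
      A = (-hydro / v)[:, None]
      c, *_ = np.linalg.lstsq(A, PL - hydro, rcond=None)
      print(f"fitted c = {c[0]:.0f} m/s; predicted P/L at 5 km/s: "
            f"{hydro * (1 - c[0] / 5000.0):.2f}")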

  17. Comparative Analysis Study of Open Source GIS in Malaysia

    Rasid, Muhammad Zamir Abdul; Kamis, Naddia; Halim, Mohd Khuizham Abd

    2014-01-01

    Open source software might appear to be a major prospective change capable of delivering value in various industries and of competing in developing countries. The leading purpose of this research study is to discover the degree of adoption of Open Source Software (OSS) for Geographic Information System (GIS) applications within Malaysia, where uptake has been limited by inadequate awareness of open source concepts and by technical deficiencies in open source tools. This research was carried out in two significant stages. The first stage involved a survey questionnaire to evaluate awareness and acceptance levels based on comparative feedback regarding OSS and commercial GIS. The survey covered three groups of respondents: government servants, university students and lecturers, and individuals. Awareness was measured using a comprehension indicator and a perception indicator for each survey question; these indicators were designed to provide a measurable and descriptive basis for the final result. The second stage involved an interview session with a major organization that operates an open source web GIS: the Federal Department of Town and Country Planning Peninsular Malaysia (JPBD). The outcome of this preliminary study was an understanding of the viewpoints of different groups of people on open source, and of how insufficient awareness of open source concepts and capabilities may be a significant barrier to the adoption of open source solutions.

  18. Analysis and Optimal Condition of the Rear-Sound-Aided Control Source in Active Noise Control

    Karel Kreuter

    2011-01-01

    An active noise control scenario for simple ducts is considered. The previously suggested technique of using a single loudspeaker and its rear sound to cancel the upstream sound is further examined and compared to the bidirectional solution in order to give theoretical proof of its advantage. First, a model with a new approach for taking damping effects into account is derived based on electrical transmission line theory. By comparison with the old model, the new approach is validated, and the differences that arise are discussed. Moreover, a numerical application considering damping is implemented for confirmation. The influence of the rear sound strength on the feedback-path system is investigated, and the optimal condition is determined. Finally, it is proven that the proposed source has the advantage of an extended phase lag and a time delay in the feedback-path system, by both frequency-response analysis and numerical calculation of the time response.

  19. Visualizing context through theory deconstruction: a content analysis of three bodies of evaluation theory literature.

    Vo, Anne T

    2013-06-01

    While the evaluation field collectively agrees that contextual factors bear on evaluation practice and related scholarly endeavors, the discipline does not yet have an explicit framework for understanding evaluation context. To address this gap in the knowledge base, this paper explores the ways in which evaluation context has been addressed in the practical-participatory, values-engaged, and emergent realist evaluation literatures. Five primary dimensions that constitute evaluation context were identified for this purpose: (1) stakeholder; (2) program; (3) organization; (4) historical/political; and (5) evaluator. Journal articles, book chapters, and conference papers rooted in the selected evaluation approaches were compared along these dimensions in order to explore points of convergence and divergence in the theories. Study results suggest that the selected prescriptive theories most clearly explicate stakeholder and evaluator contexts. Programmatic, organizational, and historical/political contexts, on the other hand, require further clarification.

  20. An Analysis of Open Source Security Software Products Downloads

    Barta, Brian J.

    2014-01-01

    Despite the continued demand for open source security software, a gap in the identification of success factors related to the success of open source security software persists. There are no studies that accurately assess the extent of this persistent gap, particularly with respect to the strength of the relationships of open source software…

  1. Models of collapsing and expanding anisotropic gravitating source in f(R, T) theory of gravity

    Abbas, G. [The Islamia University of Bahawalpur, Department of Mathematics, Bahawalpur (Pakistan); Ahmed, Riaz [The Islamia University of Bahawalpur, Department of Mathematics, Bahawalpur (Pakistan); University of the Central Punjab, Department of Mathematics, Lahore (Pakistan)

    2017-07-15

    In this paper, we have formulated exact solutions for a non-static anisotropic gravitating source in f(R, T) gravity which may lead to expansion or collapse. Assuming no thermal conduction in the gravitating source, we have determined parametric solutions in f(R, T) gravity for a non-static spherical geometry filled with an anisotropic fluid. We have examined the ranges of the parameters for which the expansion scalar becomes negative or positive, leading to collapse or expansion, respectively. Further, using the definition of the mass function, the conditions for the trapped surface have been explored, and it has been shown that a single horizon exists in this case. The impact of the coupling parameter λ has been discussed in detail in both cases. For various values of the coupling parameter λ, we have plotted the energy density, anisotropic pressure and anisotropy parameter in the cases of collapse and expansion. The physical significance of the graphs is explained in detail.

  2. Source selection problem of competitive power plants under government intervention: a game theory approach

    Mahmoudi, Reza; Hafezalkotob, Ashkan; Makui, Ahmad

    2014-06-01

    Pollution and environmental protection in the present century are extremely significant global problems. Power plants, as the largest pollution-emitting industry, have been the subject of a great deal of scientific research. The fuel or source type used by power plants to generate electricity plays an important role in the amount of pollution produced. Governments should take visible actions to promote green fuel. These actions are often called governmental financial interventions, and include legislation such as green subsidies and taxes. In this paper, by considering the government's role in the competition between two power plants, we propose a game-theoretic model that helps the government determine the optimal taxes and subsidies. The numerical examples demonstrate how a government could intervene in a competitive electricity market to achieve its environmental objectives, and how power plants maximize their utilities for each energy source. The results also reveal that the government's taxes and subsidies effectively influence the fuel types selected by power plants in the competitive market.
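
    A toy version of the fuel-choice game described above is sketched below: two plants simultaneously pick a fuel, the government levies an emission tax and pays a green subsidy, and price competition penalizes matching choices. All payoffs, emission factors, and policy levels are invented for illustration and are not the paper's model.

      PROFIT = {"coal": 10.0, "gas": 8.0, "wind": 6.0}    # pre-intervention profit
      EMISSION = {"coal": 5.0, "gas": 2.0, "wind": 0.0}   # pollution per unit output

      def utility(f, other, tax, subsidy):
          base = PROFIT[f] * (0.6 if f == other else 1.0)   # price competition
          return base - tax * EMISSION[f] + (subsidy if f == "wind" else 0.0)

      def best_response(other, tax, subsidy):
          return max(PROFIT, key=lambda f: utility(f, other, tax, subsidy))

      def nash_fuels(tax, subsidy):
          """Pure-strategy equilibria of the two-plant fuel-choice game."""
          return [(f1, f2) for f1 in PROFIT for f2 in PROFIT
                  if f1 == best_response(f2, tax, subsidy)
                  and f2 == best_response(f1, tax, subsidy)]

      print(nash_fuels(0.0, 0.0))   # no intervention: fossil-fuel equilibria
      print(nash_fuels(1.5, 2.5))   # tax plus green subsidy: both choose wind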

  3. Coherence properties of third and fourth generation X-ray sources. Theory and experiment

    Singer, Andrej

    2013-06-15

    Interference effects are among the most fascinating optical phenomena. For instance, butterflies and soap bubbles owe their beautiful colors to interference effects. They appear as a result of the superposition principle, valid in electrodynamics due to the linearity of the wave equation. If two waves interfere, the total radiation field is the sum of these two fields and depends strongly on the relative phases between them. While the oscillation frequency of the individual fields is typically too high to be observed by the human eye or other detection systems, the phase differences between these fields manifest themselves as relatively slowly varying field strength modulations. These modulations can be detected, provided the oscillation frequencies of the superposed fields are similar. As such, interference provides a superb measure of the phase differences of optical light, which may carry detailed information about a source or a scattering object. The ability of waves to interfere depends strongly on the degree of correlation between these waves, i.e. their mutual coherence. Until the middle of the 20th century, the coherence of light available to experimentalists was poor. A significant effort had to be made to extend the degree of coherence, which made electromagnetic field determination using the interference principle very challenging. Coherence is the defining feature of a laser, whose invention initiated a revolutionary development of experimental techniques based on interference, such as holography. Important contributions to this development were also provided by astronomers, as due to enormous intergalactic distances the radiation from stars has a high transverse coherence length at Earth. With the construction of third generation synchrotron sources, partially coherent X-ray sources have become feasible. New areas of research utilizing highly coherent X-ray beams have emerged, including X-ray photon correlation spectroscopy (XPCS), X


  5. Sixty years of interest in flow and transport theories: Sources of inspiration and a few results

    Raats, Peter A. C.

    2016-04-01

    By choosing to major in soil physics at Wageningen now exactly 60 years ago, I could combine my interest in the exact sciences with my experience of growing up on a farm. I never regretted that choice. In the first twenty years, I profited much from close contacts with members of the immediate post-WW II generation of soil physicists (especially Jerry Bolt, Arnold Klute, Ed Miller, Champ Tanner, Wilford Gardner, John Philip, and Jan van Schilfgaarde), chemical engineers (especially, at UW Madison, the trio Bob Bird, Warren Stewart and Ed Lightfoot) and experts in continuum mechanics (especially, at Johns Hopkins, Clifford Truesdell and Jerald Ericksen). As a graduate student at Illinois with Klute, to describe flow and transport in soil science I initially explored thermodynamics of irreversible processes (TIP) as a possible framework, but soon switched to the continuum theory of mixtures (CTM), initiated by Truesdell in 1957. In CTM, the balance of forces gave a rational basis for flux equations. CTM allowed me to deal with swelling/shrinkage, the role of inertia, boundary conditions, and structured soils. Later, I did use TIP to deal with certain aspects of the transfer of water and heat in soils and the selective uptake of water and nutrients by plant roots. Recently, a variety of theories for upscaling from the pore scale to the Darcy scale have clarified the potential, limits and common ground of CTM and TIP. A great advantage of CTM is that it provides geometric tools suited to the kinematic aspects of flow, transport, and growth/decay processes. In particular, the concept of material coordinates of the solid phase, which I used in my PhD thesis to cope with large deformation due to swelling/shrinkage of soils, later turned out to be useful for dealing with simultaneous shrinkage and decay in peat soils and compost heaps, and with the growth of plant tissues. Also, by focusing on the material coordinates of the water, it became possible to describe transport of solutes in unsaturated

  6. Between practice and theory: Melanie Klein, Anna Freud and the development of child analysis.

    Donaldson, G

    1996-04-01

    An examination of the early history of child analysis in the writings of Melanie Klein and Anna Freud reveals how two different and opposing approaches to child analysis arose at the same time. The two methods of child analysis are rooted in a differential emphasis on psychoanalytic theory and practice. The Kleinian method derives from the application of technique while the Anna Freudian method is driven by theory. Furthermore, by holding to the Freudian theory of child development Anna Freud was forced to limit the scope of child analysis, while Klein's application of Freudian practice has led to new discoveries about the development of the infant psyche.

  7. Prospect theory: A parametric analysis of functional forms in Brazil

    Robert Eugene Lobel

    2017-10-01

    This study aims to analyze risk preferences in Brazil based on prospect theory by estimating the risk aversion parameter of expected utility theory (EUT) for a select sample, in addition to the value and probability function parameters, assuming various functional forms, including a newly proposed value function, the modified log. This is the first such study in Brazil, and the parameter results are slightly different from studies in other countries, indicating that subjects are more risk averse and exhibit less loss aversion. Probability distortion is the only common factor. As expected, the study finds that behavioral models are superior to EUT, and that models based on prospect theory, the TK and Prelec weighting functions, and the power value function perform better than the others. Finally, the modified log function proposed in the study fits the data well, and can thus be used for future studies in Brazil.
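
    To make the functional forms concrete, here is a minimal sketch (not the paper's fitted model) of the Tversky-Kahneman (TK) power value function and the Prelec probability weighting function named in the abstract. The parameter values are illustrative textbook estimates, not those estimated for the Brazilian sample, and the separable gain/loss weighting below is a simplification of full rank-dependent CPT.

```python
import numpy as np

def tk_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """TK power value function: concave for gains, convex and
    steeper for losses (lam > 1 encodes loss aversion)."""
    x = np.asarray(x, dtype=float)
    return np.where(x >= 0, np.abs(x)**alpha, -lam * np.abs(x)**beta)

def prelec_weight(p, gamma=0.65):
    """Prelec weighting function w(p) = exp(-(-ln p)^gamma),
    overweighting small and underweighting large probabilities."""
    p = np.asarray(p, dtype=float)
    return np.exp(-(-np.log(p))**gamma)

# Prospect value of a mixed lottery: 50% chance of +100, 50% chance of -50.
outcomes = np.array([100.0, -50.0])
probs = np.array([0.5, 0.5])
V = float(np.sum(prelec_weight(probs) * tk_value(outcomes)))
print(f"prospect value: {V:.2f}")   # negative despite an expected value of +25
```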

  8. A Preliminary ZEUS Lightning Location Error Analysis Using a Modified Retrieval Theory

    Elander, Valjean; Koshak, William; Phanord, Dieudonne

    2004-01-01

    The ZEUS long-range VLF arrival time difference lightning detection network now covers both Europe and Africa, and there are plans for further expansion into the western hemisphere. In order to fully optimize and assess ZEUS lightning location retrieval errors, and to determine the best placement of future receivers expected to be added to the network, a software package is being developed jointly between the NASA Marshall Space Flight Center (MSFC) and the University of Nevada Las Vegas (UNLV). The software package, called the ZEUS Error Analysis for Lightning (ZEAL), will be used to obtain global-scale lightning location retrieval error maps using both a Monte Carlo approach and chi-squared curvature matrix theory. At the core of ZEAL will be an implementation of an Iterative Oblate (IO) lightning location retrieval method recently developed at MSFC. The IO method will be appropriately modified to account for variable wave propagation speed, and the new retrieval results will be compared with the current ZEUS retrieval algorithm to assess potential improvements. In this preliminary ZEAL work effort, we defined 5000 source locations evenly distributed across the Earth. We then used the existing sites (as well as potential future ZEUS sites) to simulate arrival time data between each source and ZEUS site. A total of 100 sources were considered at each of the 5000 locations, and timing errors were drawn from a normal distribution having a mean of 0 seconds and a standard deviation of 20 microseconds. This simulated "noisy" dataset was analyzed using the IO algorithm to estimate source locations. The exact locations were compared with the retrieved locations, and the results are summarized via several color-coded "error maps."
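
    The Monte Carlo part of such an error analysis is easy to reproduce in miniature. The sketch below uses a hypothetical planar receiver geometry and a brute-force grid fit in place of the Iterative Oblate retrieval (which works on the oblate Earth); only the noise model follows the abstract: Gaussian timing errors with zero mean and a 20 microsecond standard deviation, 100 realizations per source.

```python
import numpy as np

rng = np.random.default_rng(42)
C = 3.0e8          # assumed VLF propagation speed, m/s
SIGMA_T = 20e-6    # timing-error standard deviation, s (per the abstract)

# Hypothetical receiver coordinates on a flat 2-D plane (not the real ZEUS sites).
receivers = np.array([[0.0, 0.0], [2.0e6, 0.0], [0.0, 2.5e6], [2.2e6, 2.4e6]])
true_src = np.array([1.4e6, 0.9e6])

def arrival_times(src):
    """Propagation times from a source position to every receiver."""
    return np.linalg.norm(receivers - src, axis=1) / C

# Precompute a candidate grid and its arrival times for a vectorized search.
xs = np.arange(0.0, 2.5e6, 1.0e4)
grid = np.array([[x, y] for x in xs for y in xs])
grid_times = np.linalg.norm(grid[:, None, :] - receivers[None, :, :], axis=2) / C

def locate(times):
    """Least-squares fit of arrival-time *differences* over the grid
    (a stand-in for the Iterative Oblate retrieval)."""
    cost = np.sum(((grid_times - grid_times[:, :1]) - (times - times[0]))**2, axis=1)
    return grid[np.argmin(cost)]

errors = []
for _ in range(100):                       # 100 noisy realizations per source
    noisy = arrival_times(true_src) + rng.normal(0.0, SIGMA_T, len(receivers))
    errors.append(np.linalg.norm(locate(noisy) - true_src))
print(f"median location error: {np.median(errors) / 1e3:.1f} km")
```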

  9. A novel source convergence acceleration scheme for Monte Carlo criticality calculations, part I: Theory

    Griesheimer, D. P.; Toth, B. E.

    2007-01-01

    A novel technique for accelerating the convergence rate of the iterative power method for solving eigenvalue problems is presented. Smoothed Residual Acceleration (SRA) is based on a modification of the well-known fixed-parameter extrapolation method for power iterations. In SRA the residual vector is passed through a low-pass filter before the extrapolation step. Filtering limits the extrapolation to the lower-order eigenmodes, improving the stability of the method and allowing the use of larger extrapolation parameters. In simple tests SRA demonstrates superior convergence acceleration when compared with an optimal fixed-parameter extrapolation scheme. The primary advantage of SRA is that it can be easily applied to Monte Carlo criticality calculations in order to reduce the number of discard cycles required before a stationary fission source distribution is reached. A simple algorithm for applying SRA to Monte Carlo criticality problems is described. (authors)
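
    A toy version of the idea can be written down in a few lines. The sketch below is not the authors' algorithm; it merely grafts a crude moving-average low-pass filter onto the residual of a fixed-parameter extrapolated power iteration, applied to a made-up positive "fission-like" matrix whose higher eigenmodes are spatially oscillatory. Setting alpha=0 recovers plain power iteration for comparison.

```python
import numpy as np

def sra_power(A, alpha=1.0, window=5, tol=1e-10, max_iter=1000):
    """Power iteration with a smoothed-residual extrapolation step (sketch).

    The residual of each normalized power step is passed through a
    moving-average low-pass filter, damping high-order eigenmodes, and
    the iterate is extrapolated along the smoothed residual by `alpha`.
    """
    x = np.ones(A.shape[0])
    x /= np.linalg.norm(x)
    kernel = np.ones(window) / window
    for it in range(max_iter):
        y = A @ x
        lam = x @ y                       # Rayleigh quotient (x is normalized)
        y /= np.linalg.norm(y)
        r = y - x                         # residual of the power step
        if np.linalg.norm(r) < tol:
            return lam, x, it
        x = y + alpha * np.convolve(r, kernel, mode='same')
        x /= np.linalg.norm(x)
    return lam, x, max_iter

# Toy positive matrix whose fundamental mode is smooth in "space".
n = 200
idx = np.arange(n)
A = np.exp(-np.abs(idx[:, None] - idx[None, :]) / 20.0)
for a in (0.0, 1.0):                      # plain vs. smoothed-residual iteration
    lam, _, iters = sra_power(A, alpha=a)
    print(f"alpha={a}: {iters} iterations, dominant eigenvalue ~ {lam:.4f}")
```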

  10. The Swift/UVOT catalogue of NGC 4321 star-forming sources: a case against density wave theory

    Ferreras, Ignacio; Cropper, Mark; Kawata, Daisuke; Page, Mat; Hoversten, Erik A.

    2012-08-01

    We study the star-forming regions in the spiral galaxy NGC 4321 (M100). We take advantage of the spatial resolution (2.5 arcsec full width at half-maximum) of the Swift/Ultraviolet/Optical Telescope camera and the availability of three ultraviolet (UV) passbands in the region 1600-3000 Å to select star-forming regions in and around the spiral arms. The Hα luminosities of the sources have a strong decreasing radial trend, suggesting more massive star-forming regions in the central part of the galaxy. When segregated with respect to near-UV (NUV)-optical colour, blue sources have a significant excess of flux in the IR at 8 μm, revealing the contribution from polycyclic aromatic hydrocarbons, although the overall reddening of these sources stays below E(B - V) = 0.2 mag. The distribution of distances to the spiral arms is compared for subsamples selected according to Hα luminosity, NUV-optical colour or ages derived from a population synthesis model. An offset between these subsamples would be expected as a function of radius if the pattern speed of the spiral arms were constant, as predicted by classic density wave theory. No significant offsets are found, favouring instead a mechanism where the pattern speed has a radial dependence.

  11. Analysis on stability of strategic alliance: A game theory perspective

    CHEN Fei-qiong; FAN Liang-cong

    2006-01-01

    Strategic alliances have suffered much instability since their first implementation. Scholars have carried out many embedded, precise and comprehensive studies, both theoretical and empirical. Here we try to find certain stable solutions by employing game theory, in an attempt to construct theoretical bases for the strategic alliance, which has been called "one of the most important organizational innovations of the end of the 20th century" (Shi, 2001), so that it can exploit its advantages in the process of globalization. Finally, this article puts forward some advice for its success.
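
    As a minimal illustration of the kind of stability analysis involved (not the authors' model), the sketch below finds pure-strategy Nash equilibria in a hypothetical two-partner alliance game: without a penalty on defection the only stable outcome is mutual defection, while a sufficiently strong penalty makes mutual cooperation the unique stable solution. All payoff numbers are invented.

```python
import numpy as np

def pure_nash(pay_row, pay_col):
    """Pure-strategy Nash equilibria of a two-player bimatrix game:
    cells where each player's action is a best response to the other's."""
    return [(i, j)
            for i in range(pay_row.shape[0])
            for j in range(pay_row.shape[1])
            if pay_row[i, j] >= pay_row[:, j].max()
            and pay_col[i, j] >= pay_col[i, :].max()]

# Hypothetical alliance payoffs: action 0 = cooperate, action 1 = defect.
A = np.array([[5, 1],     # prisoner's-dilemma structure:
              [6, 2]])    # defecting against a cooperator pays best
print(pure_nash(A, A.T))  # -> [(1, 1)]: the alliance unravels

A_pen = np.array([[5, 1],   # same game, but defection is penalized
                  [3, 0]])  # (e.g. by contract or reputation loss)
print(pure_nash(A_pen, A_pen.T))  # -> [(0, 0)]: cooperation is now stable
```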

  12. 252Cf-source-driven neutron noise analysis method

    Mihalczo, J.T.; King, W.T.; Blakeman, E.D.

    1985-01-01

    The 252Cf-source-driven neutron noise analysis method has been tested in a wide variety of experiments that have indicated the broad range of applicability of the method. The neutron multiplication factor k_eff has been satisfactorily determined for a variety of materials including uranium metal, light water reactor fuel pins, fissile solutions, fuel plates in water, and interacting cylinders. For a uranyl nitrate solution tank typical of a fuel processing or reprocessing plant, k_eff values between 0.92 and 0.5 were satisfactorily determined using a simple point kinetics interpretation of the experimental data. The short measurement times, in several cases as low as 1 min, have shown that the development of this method can lead to a practical subcriticality monitor for many in-plant applications. Further development of the method will require experiments oriented toward particular applications, including dynamic experiments, and the development of theoretical methods to predict the experimental observables

  13. Source analysis of spaceborne microwave radiometer interference over land

    Guan, Li; Zhang, Sibo

    2016-03-01

    Satellite microwave thermal emissions mixed with signals from active sensors are referred to as radio-frequency interference (RFI). Based on Advanced Microwave Scanning Radiometer-Earth Observing System (AMSR-E) observations from June 1 to 16, 2011, RFI over Europe was identified and analyzed using the modified principal component analysis algorithm in this paper. The X-band AMSR-E measurements in England and Italy are mostly affected by stable, persistent, active microwave transmitters on the surface, while for the other European countries the RFI source is reflected geostationary TV satellite downlink signals interfering with the measurements of spaceborne microwave radiometers. The locations and intensities of the RFI induced by the geostationary TV and communication satellites changed with time within the observed period. The observations of spaceborne microwave radiometers in the ascending portions of orbits are usually subject to interference over European land, while no RFI was detected in descending passes. The RFI locations and intensities from the reflection of downlink radiation are highly dependent upon the relative geometry between the geostationary satellite and the measuring passive sensor. Only those fields of view of a spaceborne instrument whose scan azimuths are close to the azimuth relative to the geostationary satellite are likely to be affected by RFI.

  14. The use of Theory in Family Therapy Research: Content Analysis and Update.

    Chen, Ruoxi; Hughes, Alexandria C; Austin, Jason P

    2017-07-01

    In this study, we evaluated 275 empirical studies from Journal of Marital and Family Therapy and Family Process from 2010 to 2015 on their use of theory, and compared our findings to those of a similar previous analysis (Hawley & Geske, 2000). Overall, theory seems to have become much better incorporated in empirical family therapy research, with only 16.4% of the articles not using theory in either their introductory or discussion sections. Theory appeared better incorporated in the introductory sections than in the discussion sections. Systems theory remained the most commonly used conceptual framework, followed by attachment theory. We discuss areas for improving theory incorporation in family therapy research, and offer suggestions for both family therapy researchers and educators.

  15. Conference on Geometric Analysis & Conference on Type Theory, Homotopy Theory and Univalent Foundations: Extended Abstracts Fall 2013

    Yang, Paul; Gambino, Nicola; Kock, Joachim

    2015-01-01

    The two parts of the present volume contain extended conference abstracts corresponding to selected talks given by participants at the "Conference on Geometric Analysis" (thirteen abstracts) and at the "Conference on Type Theory, Homotopy Theory and Univalent Foundations" (seven abstracts), both held at the Centre de Recerca Matemàtica (CRM) in Barcelona, from July 1st to 5th, 2013, and from September 23rd to 27th, 2013, respectively. Most of them are brief articles, containing preliminary presentations of new results not yet published in regular research journals. The articles are the result of direct collaboration between active researchers in the area, working in a dynamic and productive atmosphere. The first part is about Geometric Analysis and Conformal Geometry; this modern field lies at the intersection of many branches of mathematics (Riemannian, Conformal, Complex or Algebraic Geometry, Calculus of Variations, PDEs, etc.) and relates directly to the physical world, since many natural phenomena...

  16. Prosumers and Emirecs: Analysis of Two Confronted Theories

    Aparici, Roberto; García-Marín, David

    2018-01-01

    In the 1970s, the publications of Alvin Toffler and Jean Cloutier were essential for the emergence of two concepts, prosumer and emirec, whose meanings have been mistakenly equated by numerous scholars and researchers. At the same time, the mercantilist theories linked to prosumption have made invisible the models of communication designed by…

  17. Discussion: The Forward Search: Theory and Data Analysis

    Johansen, Søren; Nielsen, Bent

    2010-01-01

    The Forward Search Algorithm is a statistical algorithm for obtaining robust estimators of regression coefficients in the presence of outliers. The algorithm selects a succession of subsets of observations from which the parameters are estimated. The present note shows how the theory of empirical...

  18. Item Response Data Analysis Using Stata Item Response Theory Package

    Yang, Ji Seung; Zheng, Xiaying

    2018-01-01

    The purpose of this article is to introduce and review the capability and performance of the Stata item response theory (IRT) package that is available from Stata v.14, 2015. Using a simulated data set and a publicly available item response data set extracted from Programme of International Student Assessment, we review the IRT package from…

  19. A Grounded Theory Analysis of Introductory Computer Science Pedagogy

    Jonathan Wellons

    2011-12-01

    Planning is a critical, early step on the path to successful program writing and a skill that is often lacking in novice programmers. As practitioners we are continually searching for or creating interventions to help our students, particularly those who struggle in the early stages of their computer science education. In this paper we report on our ongoing research into novice programming skills that utilizes the qualitative research method of grounded theory to develop theories and inform the construction of these interventions. We describe how grounded theory, a popular research method in the social sciences since the 1960s, can lend formality and structure to the common practice of simply asking students what they did and why they did it. Further, we aim to inform the reader not only about our emerging theories on interventions for planning but also about how they might collect and analyze their own data in this and other areas that trouble novice programmers. In this way those who lecture and design CS1 interventions can do so from a more informed perspective.

  20. Intermittency in multihadron production: An analysis using stochastic theories

    Biyajima, M.

    1989-01-01

    Multiplicity data of the NA22, KLM, and UA1 collaborations are analysed by means of probability distributions derived in the framework of pure birth stochastic equations. The intermittent behaviour of the KLM and UA1 data is well reproduced by the theory. A comparison with the negative binomial distribution is also made. 19 refs., 3 figs., 1 tab. (Authors)

  1. Multiple Intelligences Theory and Iranian Textbooks: An Analysis

    Taase, Yoones

    2012-01-01

    The purpose of this study is to investigate locally designed ELT textbooks in the light of multiple intelligences theory. Three textbooks (grades 1, 2 and 3) used in guidance schools of the Iranian educational system were analyzed using the MI checklist developed by Botelho, Mario do Rozario. The kinds of intelligences catered for in the activities and exercises…

  2. The application of catastrophe theory to medical image analysis

    Kuijper, A.; Florack, L.M.J.

    2001-01-01

    In order to investigate the deep structure of Gaussian scale space images, one needs to understand the behaviour of critical points under the influence of blurring. We show how the mathematical framework of catastrophe theory can be used to describe the various different types of annihilations and

  5. Conflict Theory and the Analysis of Religious Experience | Obiefuna ...

    Both the experience and the way it is interpreted can, and most of the time do, lead to conflict and violence. One of the appropriate theoretical approaches to explaining, understanding, and resolving religion-related conflicts and violence is conflict theory, with its micro (psycho-spiritual) and macro (socio-cultural) features ...

  6. Thermodynamic Models from Fluctuation Solution Theory Analysis of Molecular Simulations

    Christensen, Steen; Peters, Günther H.j.; Hansen, Flemming Yssing

    2007-01-01

    Fluctuation solution theory (FST) is employed to analyze results of molecular dynamics (MD) simulations of liquid mixtures. The objective is to generate parameters for macroscopic GE-models, here the modified Margules model. We present a strategy for choosing the number of parameters included...
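
    For readers unfamiliar with the target model class, the sketch below evaluates activity coefficients for the standard two-parameter Margules excess-Gibbs-energy model (the paper fits a modified variant; the parameter values here are purely illustrative, not derived from any simulation).

```python
import numpy as np

def margules_gamma(x1, a12, a21):
    """Activity coefficients from the two-parameter Margules model:
    G^E/RT = x1*x2*(a21*x1 + a12*x2)."""
    x2 = 1.0 - x1
    ln_g1 = x2**2 * (a12 + 2.0 * (a21 - a12) * x1)
    ln_g2 = x1**2 * (a21 + 2.0 * (a12 - a21) * x2)
    return np.exp(ln_g1), np.exp(ln_g2)

# Illustrative parameters (not fitted values from the paper):
x1 = np.linspace(0.0, 1.0, 5)
g1, g2 = margules_gamma(x1, a12=0.5, a21=0.8)
print(np.round(g1, 3))  # gamma_1 -> exp(a12) at infinite dilution (x1 -> 0)
print(np.round(g2, 3))  # gamma_2 -> exp(a21) at infinite dilution (x1 -> 1)
```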

  8. New perspective of real options theory for policy analysis

    Sadowski, B.M.

    2007-01-01

    In this Reflection, Sadowski offers an alternative, or rather a complement, to the cost-benefit analysis (KBA), based on real options theory as it is also used in the financial sector. The idea is that dynamic markets see many developments: technological, a market in motion, or industrial

  9. Practical Guide to Conducting an Item Response Theory Analysis

    Toland, Michael D.

    2014-01-01

    Item response theory (IRT) is a psychometric technique used in the development, evaluation, improvement, and scoring of multi-item scales. This pedagogical article provides the necessary information needed to understand how to conduct, interpret, and report results from two commonly used ordered polytomous IRT models (Samejima's graded…
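
    As a pointer to what such a model computes, here is a minimal sketch of category probabilities under Samejima's graded response model, one of the two polytomous models the article covers (the parameters are invented for illustration; the Stata IRT package itself is not reproduced here).

```python
import numpy as np

def grm_probs(theta, a, b):
    """Samejima's graded response model for one item.

    a : discrimination; b : increasing category thresholds (K-1 of them).
    Returns the probabilities of the K ordered response categories
    for a respondent of ability theta.
    """
    # cumulative probabilities P(X >= k) = logistic(a * (theta - b_k))
    pstar = 1.0 / (1.0 + np.exp(-a * (theta - np.asarray(b, dtype=float))))
    bounds = np.concatenate(([1.0], pstar, [0.0]))
    return bounds[:-1] - bounds[1:]   # adjacent differences give category probs

p = grm_probs(theta=0.3, a=1.7, b=[-1.0, 0.2, 1.1])
print(p.round(3), p.sum())            # four categories, probabilities sum to 1
```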

  10. A critical analysis of the quantum theory of measurement

    Fer, F.

    1984-01-01

    Keeping strictly within the positivist and probabilistic, hence Hilbertian, frame of Quantum Mechanics, the author tries to ascertain whether or not Quantum Mechanics, starting from its axioms, reaches the aim of any physical theory, that is, comparison with experiment. The answer is: no, as long as it keeps close to the existing axiomatics, and also to accurate mathematics. (Auth.)

  11. Darwinism and the Behavioral Theory of Sociocultural Evolution: An Analysis.

    Langdon, John

    1979-01-01

    Challenges the view that the social sciences are theoretically impoverished disciplines when compared with the natural sciences. Demonstrates that the synthesis of an abstract Darwinian model of systemic adaptation and the behavioral principles of social learning produces a logical theory of sociocultural evolution. (DB)

  12. Analysis of neutron dispersion in a semi-infinite medium based on transport theory and the Monte Carlo method

    Arreola V, G.; Vazquez R, R.; Guzman A, J. R.

    2012-10-01

    In this work a comparative analysis of results for neutron dispersion in a non-multiplying semi-infinite medium is presented. One boundary of this medium is located at the origin of coordinates, where there is also a neutron source in beam form, i.e., μ0 = 1. Neutron dispersion is studied with the Monte Carlo statistical method and with one-dimensional transport theory, for one energy group. The application of transport theory gives a semi-analytic solution for this problem, while the statistical solution for the flux was obtained with the MCNPX code. Dispersion in light water and in heavy water was studied. A first remarkable result is that both methods locate the maximum of the neutron distribution at less than two transport mean free paths for heavy water and at less than ten transport mean free paths for light water; the differences between the two methods are larger in the light water case. A second remarkable result is that the two distributions have similar tendencies at small numbers of mean free paths, while at large numbers of mean free paths the transport-theory solution tends to an asymptotic value and the statistical solution tends to zero. The existence of a low-energy neutron current directed back toward the source is demonstrated, opposite in direction to the high-energy neutron current coming from the source itself. (Author)
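
    The statistical side of such a comparison is straightforward to sketch. The toy one-group Monte Carlo below (not the MCNPX calculation) follows neutrons from a μ0 = 1 beam source at the boundary of a semi-infinite medium, with path lengths in units of the total mean free path and an assumed scattering probability c per collision; the histogram of collision depths shows the distribution maximum lying within a few mean free paths of the source, qualitatively as described in the abstract.

```python
import numpy as np

rng = np.random.default_rng(7)

def collision_depths(n_neutrons=50_000, c=0.9, max_depth=20.0, nbins=40):
    """One-group Monte Carlo in a semi-infinite medium (sketch).

    Neutrons enter at x = 0 travelling along +x (beam source, mu0 = 1).
    Distances are in units of the total mean free path; c is the
    per-collision survival (scattering) probability.  Returns the
    normalized histogram of collision depths.
    """
    edges = np.linspace(0.0, max_depth, nbins + 1)
    tally = np.zeros(nbins)
    for _ in range(n_neutrons):
        x, mu = 0.0, 1.0
        while True:
            x += mu * rng.exponential(1.0)       # flight to the next collision
            if x < 0.0:                          # leaked back through the boundary
                break
            if x < max_depth:
                tally[np.searchsorted(edges, x) - 1] += 1
            if rng.random() > c:                 # absorbed
                break
            mu = rng.uniform(-1.0, 1.0)          # isotropic scattering
    return edges, tally / n_neutrons

edges, dens = collision_depths()
peak = np.argmax(dens)
print(f"collision density peaks at {0.5 * (edges[peak] + edges[peak + 1]):.2f} mfp")
```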

  13. Nuisance Source Population Modeling for Radiation Detection System Analysis

    Sokkappa, P; Lange, D; Nelson, K; Wheeler, R

    2009-10-05

    A major challenge facing the prospective deployment of radiation detection systems for homeland security applications is the discrimination of radiological or nuclear 'threat sources' from radioactive, but benign, 'nuisance sources'. Common examples of such nuisance sources include naturally occurring radioactive material (NORM), medical patients who have received radioactive drugs for either diagnostics or treatment, and industrial sources. A sensitive detector that cannot distinguish between 'threat' and 'benign' classes will generate false positives which, if sufficiently frequent, will preclude it from being operationally deployed. In this report, we describe a first-principles physics-based modeling approach that is used to approximate the physical properties and corresponding gamma ray spectral signatures of real nuisance sources. Specific models are proposed for the three nuisance source classes - NORM, medical and industrial. The models can be validated against measured data - that is, energy spectra generated with the model can be compared to actual nuisance source data. We show by example how this is done for NORM and medical sources, using data sets obtained from spectroscopic detector deployments for cargo container screening and urban area traffic screening, respectively. In addition to capturing the range of radioactive signatures of individual nuisance sources, a nuisance source population model must generate sources with a frequency of occurrence consistent with that found in actual movement of goods and people. Measured radiation detection data can indicate these frequencies, but, at present, such data are available only for a very limited set of locations and time periods. In this report, we make more general estimates of frequencies for NORM and medical sources using a range of data sources such as shipping manifests and medical treatment statistics. We also identify potential data sources for industrial

  14. Men's passage to fatherhood: an analysis of the contemporary relevance of transition theory

    Draper, Janet

    2003-01-01

    This paper presents a theoretical analysis of men's experiences of pregnancy, birth and early fatherhood. It does so using a framework of ritual transition theory and argues that despite its earlier structural-functionalist roots, transition theory remains a valuable framework, illuminating contemporary transitions across the life course. The paper discusses the historical development of transition or ritual theory and, drawing upon data generated during longitudinal ethnographic interviews w...

  15. OSSMETER D3.4 – Language-Specific Source Code Quality Analysis

    J.J. Vinju (Jurgen); A. Shahi (Ashim); H.J.S. Basten (Bas)

    2014-01-01

    This deliverable is part of WP3: Source Code Quality and Activity Analysis. It provides descriptions and prototypes of the tools that are needed for source code quality analysis in open source software projects. It builds upon the results of: • Deliverable 3.1 where infrastructure and

  16. Parental hostility and its sources in psychologically abusive mothers: a test of the three-factor theory.

    Lesnik-Oberstein, M; Koers, A J; Cohen, L

    1995-01-01

    A revised version of the three-factor theory of child abuse (Lesnik-Oberstein, Cohen, & Koers, 1982) is presented. Further, we report on research designed to test three main hypotheses derived from Factor I (a high level of hostility in abusive parents) and its sources. The three main hypotheses are: (1) that psychologically abusive mothers have a high level of hostile feelings (Factor I); (2) that the high level of hostile feelings in abusive mothers is associated with low marital coping skills (resulting in affectionless, violent marriages), a negative childhood upbringing (punitive, uncaring, overcontrolling), a high level of stress (objective stress), and a high level of strain (low self-esteem, depression, neurotic symptoms, social anxiety, feelings of being wronged); and (3) that maternal psychological child abuse is associated with low marital coping skills, a negative childhood upbringing, a high level of stress and a high level of strain. Forty-four psychologically abusing mothers were compared with 128 nonabusing mothers on a variety of measures, matched for age and educational level. All the mothers had children who were hospitalized for medical symptoms. The three hypotheses were supported, with the exception of the component of hypothesis 2 concerning the association between objective stress and maternal hostility. The positive results are consistent with the three-factor theory.

  17. An application of the theory of planned behaviour to study the influencing factors of participation in source separation of food waste

    Karim Ghani, Wan Azlina Wan Ab.; Rusli, Iffah Farizan; Biak, Dayang Radiah Awang; Idris, Azni

    2013-01-01

    Highlights: ► The theory of planned behaviour (TPB) was used to identify the factors influencing participation in source separation of food waste, using self-administered questionnaires. ► The findings suggest several implications for the development and implementation of a waste-separation-at-home programme. ► The analysis indicates that attitude towards waste separation is the main predictor, which in turn is a significant predictor of the respondent's actual food waste separation behaviour. ► No similar findings have been reported elsewhere to date, and this result will be beneficial to local authorities as an indicator in designing campaigns that promote waste separation programmes and reinforce positive attitudes. - Abstract: Tremendous increases in biodegradable (food) waste generation significantly impact local authorities, who are responsible for managing, treating and disposing of this waste. Separation of food waste at its generation source is identified as an effective means of reducing the amount of food waste sent to landfill; the separated waste can be reused as feedstock for downstream treatment processes, namely composting or anaerobic digestion. However, these efforts will only succeed with positive attitudes and high participation rates among the public. Thus, a social survey (using questionnaires) analysing the public's views and the factors influencing participation in source separation of food waste in households, based on the theory of planned behaviour (TPB), was performed in June and July 2011 among selected staff in Universiti Putra Malaysia, Serdang, Selangor. The survey demonstrates that the public has positive intentions to participate provided that the opportunities, facilities and knowledge on waste separation at source are adequately prepared by the respective local authorities. Furthermore, good moral values and situational factors such as storage convenience and

  19. Selection of important ecological source patches based on Green Infrastructure theory: A case study of Wuhan city

    Ke, Yuanyuan; Yu, Yan; Tong, Yan

    2018-01-01

    Selecting urban ecological patches is of great significance for constructing an urban green infrastructure network and protecting urban biodiversity and the ecological environment. Supported by GIS technology, a criterion for selecting source patches was developed according to existing planning, and ecological source patches for terrestrial organisms and for aquatic and amphibious organisms were then selected in Wuhan city. To increase the connectivity of the ecological patches and achieve greater ecological protection benefits, green infrastructure networks in Wuhan city were constructed with the minimum-path analysis method. Finally, the characteristics of the ecological source patches were analyzed with landscape metrics, and the ecological protection importance of each source patch was evaluated comprehensively. The results showed that there are 23 important ecological source patches in Wuhan city, among which the Sushan Temple forest patch and the Lu Lake and Shangshe Lake wetland patches are the most important of their kinds for ecological protection. This study can provide a scientific basis for the preservation of urban ecological space, the delineation of nature conservation areas and the protection of biological diversity.

  20. Sensitivity analysis: Theory and practical application in safety cases

    Kuhlmann, Sebastian; Plischke, Elmar; Roehlig, Klaus-Juergen; Becker, Dirk-Alexander

    2014-01-01

    The projects described here aim at deriving an adaptive and stepwise approach to sensitivity analysis (SA). Since the appropriateness of a single SA method strongly depends on the nature of the model under study, a top-down approach (from simple to sophisticated methods) is suggested: if simple methods explain the model behaviour sufficiently well, there is no need to apply more sophisticated ones and the SA procedure can be considered complete. The procedure is developed and tested using a model for a LLW/ILW repository in salt. Additionally, a new model for the disposal of HLW in rock salt will soon be available for SA studies within the MOSEL/NUMSA projects. This model will address special characteristics of waste disposal in undisturbed rock salt, especially the case of total confinement, resulting in a zero release, which is indeed the objective of radioactive waste disposal. A high proportion of zero-output realisations causes many SA methods to fail, so special treatment is needed and has to be developed. Furthermore, the HLW disposal model will be used as a first test case for applying the procedure described above, which was and is being derived using the LLW/ILW model. How to treat dependencies in the input, model conservatism and time-dependent outputs will be addressed in the future project programme. If correlations or, more generally, dependencies between input parameters exist, the question arises about the deeper meaning of sensitivity results in such cases: a strict separation between inputs, internal states and outputs is no longer possible. Such correlations (or dependencies) might have different causes. In some cases correlated input parameters might have a common, physically well-known fundamental cause, but there are reasons why this fundamental cause cannot or should not be integrated into the model, e.g. because it would generate a very complex model which could not be calculated in an appropriate time. In other cases the correlation may
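
    The "simple first" step of such a top-down procedure can be illustrated with standardized regression coefficients (SRCs): if the regression R² is close to 1, a linear sensitivity picture suffices; if not, more sophisticated (e.g. variance-based) methods are warranted. The model below is invented for illustration and has nothing to do with the repository models named in the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)

def src_analysis(X, y):
    """Standardized regression coefficients and R^2 of a linear fit."""
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    ys = (y - y.mean()) / y.std()
    beta, ss_res, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
    return beta, 1.0 - ss_res[0] / len(ys)

# Toy model: x1 acts linearly, x2 only quadratically, x3 is inert.
n = 2000
X = rng.uniform(-1.0, 1.0, size=(n, 3))
y = 4.0 * X[:, 0] + X[:, 1]**2
beta, r2 = src_analysis(X, y)
print("SRC:", beta.round(3), " R2:", round(r2, 3))
# The near-zero SRC for x2 despite its real influence, together with
# R2 < 1, flags the need for a more sophisticated SA method.
```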

  1. Preliminary thermal analysis of grids for twin source extraction system

    Pandey, Ravi; Bandyopadhyay, Mainak; Chakraborty, Arun K.

    2017-01-01

    The TWIN (Two driver based Indigenously built Negative ion source) source provides a bridge between ROBIN, the operational single-driver negative ion source test facility at IPR, and an ITER-type multi-driver ion source. The source is designed to be operated in CW mode with 180 kW, 1 MHz RF power at a 5 s ON/600 s OFF duty cycle, and also in 5 Hz modulation mode with a 3 s ON/20 s OFF duty cycle for 3 such cycles. The TWIN source comprises an ion source sub-assembly (consisting of driver and plasma box) and an extraction system sub-assembly. The extraction system consists of the plasma grid (PG), extraction grid (EG) and ground grid (GG) sub-assemblies. The plasma grid, where the negative ion beams are produced and which faces the plasma side of the ion source, receives a moderate heat flux, whereas the extraction grid and ground grid receive the majority of the heat flux from the extracted negative ion and co-extracted electron beams. The entire co-extracted electron beam is dumped onto the extraction grid by an electron-deflection magnetic field, which makes the thermal and hydraulic design of the extraction grid critical. All three grids are made of OFHC copper and are actively water cooled, keeping the peak temperature rise of the grid surface within the allowable limit with optimum uniformity. All grids are to be made by a vacuum brazing process, where joint strength becomes crucial at elevated temperature. The hydraulic design must maintain the peak temperature at the brazed joints within acceptable limits

  2. Plagiarism and Source Deception Detection Based on Syntax Analysis

    Eman Salih Al-Shamery

    2017-02-01

    In this research, the shingle algorithm with the Jaccard method is employed as a new approach to detect deception in sources, in addition to detecting plagiarism. Source deception occurs when a particular text is taken from one source and attributed to another, while plagiarism occurs when part or all of a document's text is taken from another work. The approach is based on the shingle algorithm with the Jaccard coefficient: shingling is an efficient way to compare the sets of shingles in text files, which are used as features to measure the syntactic similarity of documents, while the Jaccard coefficient measures the similarity between the resulting sample sets. In the proposed system, a text is checked for syntactic plagiarism and given a percentage of similarity with other documents. Research sources are likewise checked for source deception, by matching them against the available sources from the Turnitin report of the same research using the shingle algorithm with the Jaccard coefficient. The motivation for this work is the discovery of literary theft in research, especially in student work, as well as of deception in cited sources.
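
    A minimal sketch of the core mechanism, word k-shingling plus the Jaccard coefficient, is given below; the production system, its Turnitin integration and its decision thresholds are not reproduced.

```python
def shingles(text, k=4):
    """The set of word k-shingles (k consecutive words) of a document."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard coefficient |A & B| / |A | B| between two shingle sets."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

doc1 = "the quick brown fox jumps over the lazy dog near the river bank"
doc2 = "a quick brown fox jumps over the lazy dog beside the river bank"
sim = jaccard(shingles(doc1), shingles(doc2))
print(f"shingle similarity: {sim:.2%}")   # flag documents above some threshold
```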

  3. Conformational analysis of cellobiose by electronic structure theories.

    French, Alfred D; Johnson, Glenn P; Cramer, Christopher J; Csonka, Gábor I

    2012-03-01

    Adiabatic Φ/ψ maps for cellobiose were prepared with B3LYP density functional theory. A mixed basis set was used for minimization, followed by 6-31+G(d) single-point calculations, with and without SMD continuum solvation. Different arrangements of the exocyclic groups (38 starting geometries) were considered for each Φ/ψ point. The vacuum calculations agreed with earlier computational and experimental results on the preferred gas-phase conformation (anti-Φ(H), syn-ψ(H)), and the results from the solvated calculations were consistent with the syn Φ(H)/ψ(H) conformations from condensed phases (crystals or solutions). Results from related studies were compared, and there is substantial dependence on the solvation model as well as on the arrangements of the exocyclic groups. New stabilizing interactions were revealed by Atoms-In-Molecules theory.

  4. Technical conditions for sustainable growth in economic theory. An analysis

    Granda C, Catalina

    2008-01-01

    Economic theory and its models point to returns to scale, substitution among productive factors and technological progress as conditions for sustainable growth. This work aims at a critical appraisal of these conditions, particularly those related to substitution between natural resources and man-made capital and to technical change, recognizing the inevitable physical scarcity of resources concomitant to human action in a world governed by thermodynamic restrictions. To that end, the role that the mentioned conditions play in theories of economic growth with resources is analyzed, and their limitations, along with the objections to them from a biophysical perspective, are indicated as well. Finally, a brief consideration of how inappropriate the theoretical representations of economic activity are for accounting for growth in spite of resource exhaustion or degradation is carried out

  5. [Analysis of Sexual Strategies Theory in the Spanish population].

    Yela, Carlos

    2012-02-01

    The aim of this study was to test some of the main hypotheses derived from Buss' Sexual Strategies Theory on a representative Spanish sample. These hypotheses refer to the different strategies men and women seem to adopt when they want to engage in a short-term sexual relationship (fling) or in a long-term one (loving relationship). (The technical term "strategies", unlike its equivalent in everyday language, is not meant to imply something necessarily planned or conscious.) Data were obtained by self-report from a representative sample of 1949 Spanish people. Almost all the results verify the working hypotheses, conferring some empirical support on the theory from which they were deduced, within an evolutionary Social Psychology paradigm. In any case, it is argued that socialization approaches (the "double standard" and the social construction of gender roles and sexual identity) can also explain most of the differences obtained, in perfect compatibility with biological perspectives.

  6. Kinetic theory analysis of electron attachment cooling in oxygen

    Skullerud, H.R.

    1983-01-01

    The attachment cooling effect observed by Hegerberg and Crompton (1983) has been analysed theoretically and numerically in a Boltzmann-equation eigenvalue approach. The effect is highly sensitive to the shape and magnitude of the rotational excitation cross sections. When due account is taken of the rotational excitations associated with the (O2-) negative-ion resonances, good agreement between theory and experiment can be obtained with reasonable input cross-section data

  7. Cumulative prospect theory and mean variance analysis. A rigorous comparison

    Hens, Thorsten; Mayer, Janos

    2012-01-01

    We compare asset allocations derived for cumulative prospect theory (CPT) based on two different methods: maximizing CPT along the mean-variance efficient frontier and maximizing it without that restriction. We find that with normally distributed returns the difference is negligible. However, using standard asset allocation data of pension funds, the difference is considerable. Moreover, with derivatives like call options, the restriction to the mean-variance efficient frontier results in a siza...

  8. Identifying Barriers in Implementing Outcomes-Based Assessment Program Review: A Grounded Theory Analysis

    Bresciani, Marilee J.

    2011-01-01

    The purpose of this grounded theory study was to identify the typical barriers encountered by faculty and administrators when implementing outcomes-based assessment program review. An analysis of interviews with faculty and administrators at nine institutions revealed a theory that faculty and administrators' promotion, tenure (if applicable),…

  9. Insult in Context: Incorporating Speech Act Theory in Doctrinal Legal Analysis of Interpretative Discussions

    H.T.M. Kloosterhuis (Harm)

    2015-01-01

    In this article, I want to show that some doctrinal problems of legal interpretation and argumentation can be analysed in a more precise way than in a standard doctrinal analysis when we use insights from speech act theory and argumentation theory. Taking a discussion about the accusation

  10. Assessing Coverage of Maslow's Theory in Educational Psychology Textbooks: A Content Analysis

    Wininger, Steven R.; Norman, Antony D.

    2010-01-01

    Although Maslow's hierarchy of needs theory (HNT) is one of the most prevalent theories in psychology, the authors argued that it is also one of the most misinterpreted or misrepresented, particularly in educational psychology textbooks. Therefore, after carefully reading Maslow's writings on HNT they conducted a content analysis of 18 educational…

  11. An Institutional Theory Analysis of Charter Schools: Addressing Institutional Challenges to Scale

    Huerta, Luis A.; Zuckerman, Andrew

    2009-01-01

    This article presents a conceptual framework derived from institutional theory in sociology that offers two competing policy contexts in which charter schools operate--a bureaucratic frame versus a decentralized frame. An analysis of evolving charter school types based on three underlying theories of action is considered. As charter school leaders…

  12. Seismicity and source spectra analysis in Salton Sea Geothermal Field

    Cheng, Y.; Chen, X.

    2016-12-01

    The surge of "man-made" earthquakes in recent years has led to considerable concern about the associated hazards. Improved monitoring of small earthquakes would significantly help in understanding such phenomena and the underlying physical mechanisms. In the Salton Sea Geothermal Field in southern California, open access to a local borehole network provides a unique opportunity to better understand the seismicity characteristics, the related earthquake hazards, and the relationship with the geothermal system, tectonic faulting and other physical conditions. We obtain high-resolution earthquake locations in the Salton Sea Geothermal Field and analyze the characteristics of spatiotemporally isolated earthquake clusters, magnitude-frequency distributions and the spatial variation of stress drops. The analysis reveals spatially coherent distributions of different types of clustering, of b-values, and of stress drops. The mixture-type clusters (short-duration rapid bursts with high aftershock productivity) are predominantly located within the active geothermal field and correlate with high-b-value, low-stress-drop microearthquake clouds, while regular aftershock sequences and swarms are distributed throughout the study area. The differences between earthquakes inside and outside the geothermal operation field suggest a possible way to distinguish seismicity directly induced by energy operations from typical sequences driven by seismic slip. The spatially coherent b-value distribution enables in-situ estimation of probabilities for M ≥ 3 earthquakes, and shows that the high large-magnitude-event (LME) probability zones with high stress drops are likely associated with tectonic faulting. The high stress drops at shallow (1-3 km) depth indicate the existence of active faults, while the low stress drops near injection wells likely correspond to the seismic response to fluid injection. I interpret the spatial variation of seismicity and source characteristics as the result of fluid
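
    One quantitative ingredient of such a map, the b-value of the Gutenberg-Richter magnitude-frequency distribution, can be estimated with the classical Aki (1965) maximum-likelihood formula. The sketch below applies it to a synthetic catalog and is not the authors' spatial-mapping procedure.

```python
import numpy as np

def b_value_mle(mags, mc, dm=0.0):
    """Aki (1965) maximum-likelihood b-value above completeness mc,
    with Utsu's half-bin correction dm/2 for binned magnitudes."""
    m = np.asarray(mags, dtype=float)
    m = m[m >= mc]
    return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))

# Synthetic catalog drawn from a Gutenberg-Richter law with true b = 1.0:
# magnitudes above mc are exponential with rate b * ln(10).
rng = np.random.default_rng(0)
mags = 2.0 + rng.exponential(1.0 / (1.0 * np.log(10.0)), size=5000)
print(f"estimated b-value: {b_value_mle(mags, mc=2.0):.2f}")   # ~1.0
```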

  13. Analysis of North Korea's Nuclear Tests under Prospect Theory

    Lee, Han Myung; Ryu, Jae Soo; Lee, Kwang Seok; Lee, Dong Hoon; Jun, Eunju; Kim, Mi Jin

    2013-01-01

    North Korea has chosen nuclear weapons as the means to protect its sovereignty. Despite international society's endeavors and sanctions to encourage North Korea to abandon its nuclear ambitions, North Korea has repeatedly conducted nuclear tests. In this paper, the reason for North Korea's addiction to a nuclear arsenal is addressed within the framework of cognitive psychology. Prospect theory takes an epistemological approach usually overlooked in rational choice theories. It provides useful insight into why North Korea, in a crisis situation, has thrown out a stable choice and taken a risky one such as nuclear testing. From the viewpoint of prospect theory, North Korea's nuclear tests can be understood as follows: the first nuclear test in 2006 is seen as a trial to escape from a domain of losses, such as financial sanctions and regime threats; the second test in 2009 is interpreted as a consequence of the strategy of recovering losses by direct confrontation with the United States; and the third test in 2013 is understood as an attempt to strengthen internal solidarity after Kim Jong-eun inherited the dynasty, as well as to enhance bargaining power against the United States. Thus, it can be summarized that Pyongyang repeated its nuclear tests to escape from a negative domain and to settle into a positive one. In addition, in the future, North Korea may not be willing to readily give up its nuclear capabilities, in order to ensure the survival of its own regime

  14. A lifting-surface theory solution for the diffraction of internal sound sources by an engine nacelle

    Martinez, R.

    1986-07-01

    Lifting-surface theory is used to solve the problem of diffraction by a rigid open-ended pipe of zero thickness and finite length, with application to the prediction of the acoustic insertion-loss performance of the encasing structure of a ducted propeller or turbofan. An axisymmetric situation is assumed, and the incident field is that due to a force applied directly to the fluid in the direction of the cylinder axis. A virtual-source distribution of unsteady dipoles is found whose integrated component of radial velocity is set to cancel that of the incident field over the surface. The calculated virtual load is verified by checking whether its effect on the near-field input power at the actual source is consistent with the far-field power radiated by the system, a balance which is possible only if the no-flow-through boundary condition has been satisfied over the rigid pipe surface, such that the velocity component of the acoustic intensity is zero there.

  15. Scoping Analysis of Source Term and Functional Containment Attenuation Factors

    Pete Lowry

    2012-10-01

    In order to meet future regulatory requirements, the Next Generation Nuclear Plant (NGNP) Project must fully establish and validate the mechanistic modular high temperature gas-cooled reactor (HTGR) source term. This is not possible at this stage in the project, as significant uncertainties in the final design remain unresolved. In the interim, however, there is a need to establish an approximate characterization of the source term. The NGNP team developed a simplified parametric model to establish mechanistic source term estimates for a set of proposed HTGR configurations.

  18. Regional Moment Tensor Source-Type Discrimination Analysis

    2015-11-16

    [Figure-caption fragments only; no abstract was recovered for this record. The fragments describe moment tensor solutions plotted by their normalized eigenvalues (source types) on (a) the fundamental Lune (Tape and Tape, 2012a,b) and (b) the Hudson source-type plot, color-coded by variance reduction (VR).]

  19. Studying young people’ views on deployment of renewable energy sources in Iran through the lenses of Social Cognitive Theory

    Nadejda Komendantova

    2018-03-01

    Renewable energy sources (RES) have the potential to address the goals of climate change mitigation at the global level. Iran has abundant RES potential, and investment in renewable energy sources can contribute to its socio-economic development and to the diversification of its energy mix. Economic and technical capacities, but also human factors such as stakeholders' views, public and social acceptance, willingness to use RES, willingness to pay for their deployment, and willingness to participate in decision-making processes on energy transition, are crucial for the deployment of RES at scale. These human factors affect the development and implementation of energy transition at the national and local governance levels. Deployment of new technology and energy transition can lead to conflicting views, beliefs and risk perceptions among the stakeholders involved and among the people affected by the deployment of new technology infrastructure. To be sustainable and acceptable to all social groups, such a process should be based on an understanding of the positions of the different stakeholders and on the development of compromise solutions. It is crucial to understand the views of young people on the deployment of RES, as young people represent a significant share of the population and are future decision makers. Their support and willingness to use RES will be a significant driver for RES deployment in the short and medium term. Based on social cognitive theory, this paper examines the patterns of behavior of young adults in relation to energy use. The results show a positive influence of self-reward in encouraging young adults to participate in energy transition. Another important driver is the expectation of social outcomes, which involves existing social norms in the community. Trust in the source of information is another important driver, and the level of information about RES has an important influence on the willingness to use them.

  20. Safety analysis of passing maneuvers using extreme value theory

    Farah, H.; Azevedo, C.L.

    2015-01-01

    The increased availability of detailed trajectory data sets from naturalistic, observational and simulation-based studies is a key source of potential improvements in the development of detailed safety models that explicitly account for vehicle conflict interactions and the various driving

  1. Safety analysis of passing maneuvers using extreme value theory

    Farah, H.; Azevedo, Carlos Lima

    2017-01-01

    The increased availability of detailed trajectory data sets from naturalistic, observational, and simulation-based studies is a key source of potential improvements in the development of detailed safety models that explicitly account for vehicle conflict interactions and various driving
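
    The block-maxima flavour of such an extreme value analysis can be sketched as follows, using surrogate data in place of real trajectories: per-maneuver minimum time-to-collision (TTC) values are negated so that the most dangerous moment becomes a block maximum, a generalized extreme value distribution is fitted, and its tail gives the probability of a crash surrogate. The distributional choice and the 0.5 s threshold are purely illustrative.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(3)

# Surrogate data: minimum TTC (seconds) observed in each of 400 passing maneuvers.
ttc_minima = rng.gamma(shape=4.0, scale=0.8, size=400)

# Negate the minima so that "dangerously small TTC" becomes a block maximum,
# then fit a generalized extreme value (GEV) distribution.
c, loc, scale = genextreme.fit(-ttc_minima)

# P(min TTC < 0.5 s) = P(-min TTC > -0.5) = survival function at -0.5.
p_surrogate = genextreme.sf(-0.5, c, loc=loc, scale=scale)
print(f"estimated P(min TTC < 0.5 s per maneuver): {p_surrogate:.2e}")
```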

  2. COalitions in COOperation Networks (COCOON): Social Network Analysis and Game Theory to Enhance Cooperation Networks

    Sie, Rory

    2012-01-01

    Sie, R. L. L. (2012). COalitions in COOperation Networks (COCOON): Social Network Analysis and Game Theory to Enhance Cooperation Networks (Unpublished doctoral dissertation). September 28, 2012, Open Universiteit in the Netherlands (CELSTEC), Heerlen, The Netherlands.

  3. Towards program theory validation: Crowdsourcing the qualitative analysis of participant experiences.

    Harman, Elena; Azzam, Tarek

    2018-02-01

    This exploratory study examines a novel tool for validating program theory through crowdsourced qualitative analysis. It combines a quantitative pattern matching framework traditionally used in theory-driven evaluation with crowdsourcing to analyze qualitative interview data. A sample of crowdsourced participants are asked to read an interview transcript, identify whether program theory components (Activities and Outcomes) are discussed, and highlight the most relevant passage about each component. The findings indicate that using crowdsourcing to analyze qualitative data can differentiate between program theory components that are supported by a participant's experience and those that are not. This approach expands the range of tools available to validate program theory using qualitative data, thus strengthening the theory-driven approach.

  4. Cross talk analysis in multicore optical fibers by supermode theory.

    Szostkiewicz, Lukasz; Napierala, Marek; Ziolowicz, Anna; Pytel, Anna; Tenderenda, Tadeusz; Nasilowski, Tomasz

    2016-08-15

    We discuss the theoretical aspects of core-to-core power transfer in multicore fibers relying on supermode theory. Based on a dual core fiber model, we investigate the consequences of this approach, such as the influence of initial excitation conditions on cross talk. Supermode interpretation of power coupling proves to be intuitive and thus may lead to new concepts of multicore fiber-based devices. As a conclusion, we propose a definition of a uniform cross talk parameter that describes multicore fiber design.
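
    The dual-core case discussed in the abstract can be reproduced in a few lines under idealized assumptions (two identical, lossless cores with coupling coefficient κ, so the even/odd supermode propagation constants are split by 2κ): exciting a single core yields full sin²(κz) power exchange, while exciting the even supermode yields no cross talk at all, illustrating the dependence on initial excitation conditions.

```python
import numpy as np

def core_powers(z, kappa, a1, a2):
    """Power in each core of an ideal dual-core fiber at distance z.

    The input field (a1, a2) is decomposed into even/odd supermodes,
    which accumulate phases +kappa*z and -kappa*z; recombining them
    gives the z-dependent core powers.
    """
    even = (a1 + a2) / np.sqrt(2.0)
    odd = (a1 - a2) / np.sqrt(2.0)
    u1 = (even * np.exp(1j * kappa * z) + odd * np.exp(-1j * kappa * z)) / np.sqrt(2.0)
    u2 = (even * np.exp(1j * kappa * z) - odd * np.exp(-1j * kappa * z)) / np.sqrt(2.0)
    return np.abs(u1)**2, np.abs(u2)**2

z = np.linspace(0.0, np.pi, 5)                      # in units where kappa = 1
print(core_powers(z, 1.0, a1=1.0, a2=0.0))          # one core in: full sin^2 transfer
print(core_powers(z, 1.0, a1=2**-0.5, a2=2**-0.5))  # even supermode in: no cross talk
```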

  5. Theory of Belief Functions for Data Analysis and Machine Learning Applications: Review and Prospects

    Denoeux, Thierry

    The Dempster-Shafer theory of belief functions provides a unified framework for handling both aleatory uncertainty, arising from statistical variability in populations, and epistemic uncertainty, arising from incompleteness of knowledge. An overview of both the fundamentals and some recent developments in this theory will first be presented. Several applications in data analysis and machine learning will then be reviewed, including learning under partial supervision, multi-label classification, ensemble clustering and the treatment of pairwise comparisons in sensory or preference analysis.
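
    At the heart of the framework sits Dempster's rule of combination, sketched below for two mass functions over a small frame of discernment (a generic textbook implementation, not tied to any application in the review).

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions.

    Mass functions map frozensets (focal elements) to masses summing to 1.
    Mass assigned to empty intersections (conflict) is renormalized away.
    """
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

# Two sources over the frame {'a', 'b', 'c'}: partial knowledge plus ignorance.
m1 = {frozenset({'a'}): 0.6, frozenset({'a', 'b', 'c'}): 0.4}
m2 = {frozenset({'a', 'b'}): 0.7, frozenset({'a', 'b', 'c'}): 0.3}
print(dempster_combine(m1, m2))
```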

  6. Bispectral pairwise interacting source analysis for identifying systems of cross-frequency interacting brain sources from electroencephalographic or magnetoencephalographic signals

    Chella, Federico; Pizzella, Vittorio; Zappasodi, Filippo; Nolte, Guido; Marzetti, Laura

    2016-05-01

    Brain cognitive functions arise through the coordinated activity of several brain regions, which actually form complex dynamical systems operating at multiple frequencies. These systems often consist of interacting subsystems, whose characterization is of importance for a complete understanding of the brain interaction processes. To address this issue, we present a technique, namely the bispectral pairwise interacting source analysis (biPISA), for analyzing systems of cross-frequency interacting brain sources when multichannel electroencephalographic (EEG) or magnetoencephalographic (MEG) data are available. Specifically, the biPISA makes it possible to identify one or many subsystems of cross-frequency interacting sources by decomposing the antisymmetric components of the cross-bispectra between EEG or MEG signals, based on the assumption that interactions are pairwise. Thanks to the properties of the antisymmetric components of the cross-bispectra, biPISA is also robust to spurious interactions arising from mixing artifacts, i.e., volume conduction or field spread, which always affect EEG or MEG functional connectivity estimates. This method is an extension of the pairwise interacting source analysis (PISA), which was originally introduced for investigating interactions at the same frequency, to the study of cross-frequency interactions. The effectiveness of this approach is demonstrated in simulations for up to three interacting source pairs and for real MEG recordings of spontaneous brain activity. Simulations show that the performances of biPISA in estimating the phase difference between the interacting sources are affected by the increasing level of noise rather than by the number of the interacting subsystems. The analysis of real MEG data reveals an interaction between two pairs of sources of central mu and beta rhythms, localizing in the proximity of the left and right central sulci.

  7. A historical analysis of the theories of money

    Farah Durani

    2016-03-01

    Money, one of the most complex ideas to understand, has long been a subject of disagreement and thorough confusion among economists, yet it has been fortunate enough to receive ample attention from philosophers. Economic literature is replete with theories concerning the understanding and behavior of money from different historical eras. Present-day knowledge of money is still very limited when it comes to understanding what money actually is. Most people believe that money is something determined by governments, and generally hold the view that citizens have a legitimate duty to honor the payment systems stipulated by the authorities. This paper aims to bring together the most prominent contributions of the great philosophers of money and to clearly demarcate the various schools of monetary thought, be they Classical, Neo-Classical or Heterodox. Its aims are to review the relevant theories concerning the understanding of money, to present the monetary dogmas of historical times in chronological order, to establish the links between predecessors and successors, to elaborate on the most obscure confusions and their causes in easy-to-understand terms, and to deviate from the mainstream to discuss the Heterodox yet appealing schools of thought.

  8. The Interaction between Multimedia Data Analysis and Theory Development in Design Research

    van Nes, Fenna; Doorman, Michiel

    2010-01-01

    Mathematics education researchers conducting instruction experiments using a design research methodology are challenged with the analysis of often complex and large amounts of qualitative data. In this paper, we present two case studies that show how multimedia analysis software can greatly support video data analysis and theory development in…

  9. Studying emotion theories through connectivity analysis: Evidence from generalized psychophysiological interactions and graph theory.

    Huang, Yun-An; Jastorff, Jan; Van den Stock, Jan; Van de Vliet, Laura; Dupont, Patrick; Vandenbulcke, Mathieu

    2018-05-15

    Psychological construction models of emotion state that emotions are variable concepts constructed by fundamental psychological processes, whereas according to basic emotion theory, emotions cannot be divided into more fundamental units and each basic emotion is represented by a unique and innate neural circuitry. In a previous study, we found evidence for the psychological construction account by showing that several brain regions were commonly activated when perceiving different emotions (i.e. a general emotion network). Moreover, this set of brain regions included areas associated with core affect, conceptualization and executive control, as predicted by psychological construction models. Here we investigate directed functional brain connectivity in the same dataset to address two questions: 1) is there a common pathway within the general emotion network for the perception of different emotions and 2) if so, does this common pathway contain information to distinguish between different emotions? We used generalized psychophysiological interactions and information flow indices to examine the connectivity within the general emotion network. The results revealed a general emotion pathway that connects neural nodes involved in core affect, conceptualization, language and executive control. Perception of different emotions could not be accurately classified based on the connectivity patterns from the nodes of the general emotion pathway. Successful classification was achieved when connections outside the general emotion pathway were included. We propose that the general emotion pathway functions as a common pathway within the general emotion network and is involved in shared basic psychological processes across emotions. However, additional connections within the general emotion network are required to classify different emotions, consistent with a constructionist account.

  10. Theory analysis for Pender's health promotion model (HPM) by Barnum's criteria: a critical perspective.

    Khoshnood, Zohreh; Rayyani, Masoud; Tirgari, Batool

    2018-01-13

    Background Analysis of nursing theoretical works and their role in knowledge development is presented as an essential process of critical reflection. The health promotion model (HPM) focuses on helping people achieve higher levels of well-being and identifies background factors that influence health behaviors. Objectives This paper aims to evaluate and critique the HPM using Barnum's criteria. Methods The present study reviewed books and articles retrieved from the ProQuest, PubMed, and Blackwell databases. The method of evaluation for this model is based on Barnum's criteria for the analysis, application, and evaluation of nursing theories. The criteria selected by Barnum embrace both internal and external criticism. Internal criticism deals with how theory components fit with each other (the internal construction of the theory), and external criticism deals with the way in which the theory relates to the extended world (considering the theory in its relationships to human beings, nursing, and health). Results The electronic database search yielded 27,717 titles and abstracts. Following removal of duplicates, 18,963 titles and abstracts were screened using the inclusion criteria and 1,278 manuscripts were retrieved. Of these, 80 were specific to the HPM and 23 to the analysis of any nursing theory relating to the aim of this article. After final selection using the inclusion criteria for this review, 28 manuscripts were identified as examining the factors contributing to theory analysis. Evaluation of the health promotion theory showed that the philosophical claims and their content are consistent and clear. The HPM has a logical structure and has been applied to diverse age groups from differing cultures with varying health concerns. Conclusion Among the strategies for theory critique, the Barnum approach is structured and accurate, and considers theory in its relationship to human beings, community psychiatric nursing, and health. While according to Pender, nursing assessment, diagnosis and interventions

  11. Incorporating priors for EEG source imaging and connectivity analysis

    Xu Lei

    2015-08-01

    Electroencephalography source imaging (ESI) is a useful technique to localize the generators of a given scalp electric measurement and to investigate the temporal dynamics of large-scale neural circuits. By introducing reasonable priors from other modalities, ESI reveals the most probable sources and communication structures at every moment in time. Here, we review the available priors from such techniques as magnetic resonance imaging (MRI), functional MRI (fMRI), and positron emission tomography (PET). Each modality's specific contribution is analyzed from the perspective of source reconstruction. As spatial priors, EEG-correlated fMRI, temporally coherent networks, and resting-state fMRI are systematically introduced into the ESI. Moreover, fiber tracking (diffusion tensor imaging, DTI) and neuro-stimulation techniques (transcranial magnetic stimulation, TMS) are also introduced as potential priors, which can help to draw inferences about neuroelectric connectivity in the source space. We conclude that combining EEG source imaging with other complementary modalities is a promising approach towards the study of brain networks in cognitive and clinical neurosciences.

  12. Blind Time-Frequency Analysis for Source Discrimination in Multisensor Array Processing

    Amin, Moeness

    1999-01-01

    .... We have clearly demonstrated, through analysis and simulations, the offerings of time-frequency distributions in solving key problems in sensor array processing, including direction finding, source...

  13. pyAudioAnalysis: An Open-Source Python Library for Audio Signal Analysis.

    Giannakopoulos, Theodoros

    2015-01-01

    Audio information plays a rather important role in the increasing digital content that is available today, resulting in a need for methodologies that automatically analyze such content: audio event recognition for home automation and surveillance systems, speech recognition, music information retrieval, multimodal analysis (e.g. audio-visual analysis of online videos for content-based recommendation), etc. This paper presents pyAudioAnalysis, an open-source Python library that provides a wide range of audio analysis procedures including: feature extraction, classification of audio signals, supervised and unsupervised segmentation, and content visualization. pyAudioAnalysis is licensed under the Apache License and is available at GitHub (https://github.com/tyiannak/pyAudioAnalysis/). Here we present the theoretical background behind the wide range of implemented methodologies, along with evaluation metrics for some of the methods. pyAudioAnalysis has already been used in several audio analysis research applications: smart-home functionalities through audio event detection, speech emotion recognition, depression classification based on audio-visual features, music segmentation, multimodal content-based movie recommendation, and health applications (e.g. monitoring eating habits). The feedback provided from all these particular audio applications has led to practical enhancement of the library.
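
    A typical entry point is short-term feature extraction. The sketch below reflects our reading of the library's recent API; module and function names have changed across releases (older versions expose audioFeatureExtraction.stFeatureExtraction instead), so treat the exact names as assumptions and check the GitHub README:

        from pyAudioAnalysis import audioBasicIO, ShortTermFeatures

        # "sample.wav" is a hypothetical input file
        fs, x = audioBasicIO.read_audio_file("sample.wav")
        # 50 ms windows with a 25 ms step; returns (n_features x n_frames) matrix
        features, feature_names = ShortTermFeatures.feature_extraction(
            x, fs, int(0.050 * fs), int(0.025 * fs))
        print(len(feature_names), features.shape)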

  14. Renewable energy sources cost benefit analysis and prospects for Italy

    Ariemma, A.; Montanino, G.

    1992-01-01

    In light of Italy's over-dependency on imported oil, and due to the nation's commitment to the pursuit of the strict environmental protection policies of the European Communities, ENEL (the Italian National Electricity Board) has become actively involved in research efforts aimed at the commercialization of renewable energy sources: photovoltaic, wind, biomass, and mini-hydraulic. Through the use of energy production cost estimates based on current and near-future levels of technological advancement, this paper assesses prospects for the different sources. The advantages and disadvantages of each source as a suitable complementary energy supply satisfying specific sets of constraints regarding siting, weather, capital and operating costs, maintenance, etc., are pointed out. In comparing the various alternatives, the paper also considers environmental benefits and commercialization feasibility in terms of time and outlay.

  15. Complex analysis fundamentals of the classical theory of functions

    Stalker, John

    1998-01-01

    This clear, concise introduction to the classical theory of one complex variable is based on the premise that "anything worth doing is worth doing with interesting examples." The content is driven by techniques and examples rather than definitions and theorems. This self-contained monograph is an excellent resource for a self-study guide and should appeal to a broad audience. The only prerequisite is a standard calculus course. The first chapter deals with a beautiful presentation of special functions. . . . The third chapter covers elliptic and modular functions. . . in much more detail, and from a different point of view, than one can find in standard introductory books. . . . For [the] subjects that are omitted, the author has suggested some excellent references for the reader who wants to go through these topics. The book is read easily and with great interest. It can be recommended to both students as a textbook and to mathematicians and physicists as a useful reference. ---Mathematical Reviews Mainly or...

  16. Classical and modern numerical analysis theory, methods and practice

    Ackleh, Azmy S; Kearfott, R Baker; Seshaiyer, Padmanabhan

    2009-01-01

    Mathematical Review and Computer Arithmetic: Mathematical Review; Computer Arithmetic; Interval Computations. Numerical Solution of Nonlinear Equations of One Variable: Introduction; Bisection Method; The Fixed Point Method; Newton's Method (Newton-Raphson Method); The Univariate Interval Newton Method; Secant Method and Müller's Method; Aitken Acceleration and Steffensen's Method; Roots of Polynomials; Additional Notes and Summary. Numerical Linear Algebra: Basic Results from Linear Algebra; Normed Linear Spaces; Direct Methods for Solving Linear Systems; Iterative Methods for Solving Linear Systems; The Singular Value Decomposition. Approximation Theory: Introduction; Norms, Projections, Inner Product Spaces, and Orthogonalization in Function Spaces; Polynomial Approximation; Piecewise Polynomial Approximation; Trigonometric Approximation; Rational Approximation; Wavelet Bases; Least Squares Approximation on a Finite Point Set. Eigenvalue-Eigenvector Computation: Basic Results from Linear Algebra; The Power Method; The Inverse Power Method; Deflation; T...

  17. Effective theory analysis for vector-like quark model

    Morozumi, Takuya; Shimizu, Yusuke; Takahashi, Shunya; Umeeda, Hiroyuki

    2018-04-01

    We study a model with a down-type SU(2) singlet vector-like quark (VLQ) as a minimal extension of the standard model (SM). In this model, flavor-changing neutral currents (FCNCs) arise at tree level and the unitarity of the 3×3 Cabibbo-Kobayashi-Maskawa (CKM) matrix does not hold. In this paper, we constrain the FCNC coupling from b → s transitions, especially the B_s → μ+μ− and B̄ → X_sγ processes. In order to analyze these processes we derive an effective Lagrangian that is valid below the electroweak symmetry breaking scale. For this purpose, we first integrate out the VLQ field and derive an effective theory by matching Wilson coefficients up to one-loop level. Using the effective theory, we construct the effective Lagrangian for b → sγ^(*). It includes the effects of the SM quarks and the violation of CKM unitarity. We show the constraints on the magnitude of the FCNC coupling and its phase by taking account of the current experimental data on ΔM_{B_s}, Br[B_s → μ+μ−], Br[B̄ → X_sγ], and the CKM matrix elements, as well as theoretical uncertainties. We find that the constraint from Br[B_s → μ+μ−] is more stringent than that from Br[B̄ → X_sγ]. We also obtain a bound for the mass of the VLQ and the strength of the Yukawa couplings related to the FCNC coupling of the b → s transition. Using CKM elements that satisfy the above constraints, we show how the unitarity is violated on the complex plane.

  18. HFIR cold neutron source moderator vessel design analysis

    Chang, S.J.

    1998-04-01

    A cold neutron source capsule made of aluminum alloy is to be installed at the tip of one of the neutron beam tubes of the High Flux Isotope Reactor. Cold liquid hydrogen at a temperature of approximately 20 K and a pressure of 15 bar is designed to flow through the aluminum capsule, which serves to chill and moderate the incoming neutrons produced by the reactor core. The cold, low-energy neutrons thus produced will be used as cold neutron sources for diffraction experiments. The structural design calculation for the aluminum capsule is reported in this paper.

  19. Comprehensive analysis of earthquake source spectra in southern California

    Shearer, Peter M.; Prieto, Germán A.; Hauksson, Egill

    2006-01-01

    We compute and analyze P wave spectra from earthquakes in southern California between 1989 and 2001 using a method that isolates source-, receiver-, and path-dependent terms. We correct observed source spectra for attenuation using both fixed and spatially varying empirical Green's function methods. Estimated Brune-type stress drops for over 60,000 M_L = 1.5 to 3.1 earthquakes range from 0.2 to 20 MPa with no dependence on moment or local b value. Median computed stress drop increases with de...
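
    For orientation, the Brune-type estimate referenced above relates stress drop to the seismic moment M_0 and corner frequency f_c through the inferred source radius r (β is the shear-wave speed; the constant ≈ 0.37 holds for the Brune model specifically):

        r = 0.37 β / f_c,        Δσ = (7/16) M_0 / r³.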

  20. Alice and Bob meet Banach the interface of asymptotic geometric analysis and quantum information theory

    Aubrun, Guillaume

    2017-01-01

    The quest to build a quantum computer is arguably one of the major scientific and technological challenges of the twenty-first century, and quantum information theory (QIT) provides the mathematical framework for that quest. Over the last dozen or so years, it has become clear that quantum information theory is closely linked to geometric functional analysis (Banach space theory, operator spaces, high-dimensional probability), a field also known as asymptotic geometric analysis (AGA). In a nutshell, asymptotic geometric analysis investigates quantitative properties of convex sets, or other geometric structures, and their approximate symmetries as the dimension becomes large. This makes it especially relevant to quantum theory, where systems consisting of just a few particles naturally lead to models whose dimension is in the thousands, or even in the billions. Alice and Bob Meet Banach is aimed at multiple audiences connected through their interest in the interface of QIT and AGA: at quantum information resea...

  1. Real analysis an introduction to the theory of real functions and integration

    Dshalalow, Jewgeni H

    2000-01-01

    Designed for use in a two-semester course on abstract analysis, REAL ANALYSIS: An Introduction to the Theory of Real Functions and Integration illuminates the principal topics that constitute real analysis. Self-contained, with coverage of topology, measure theory, and integration, it offers a thorough elaboration of major theorems, notions, and constructions needed not only by mathematics students but also by students of statistics and probability, operations research, physics, and engineering. Structured logically and flexibly through the author's many years of teaching experience, the material is presented in three main sections: Part I, chapters 1 through 3, covers the preliminaries of set theory and the fundamentals of metric spaces and topology; this section can also serve as a text for first courses in topology. Part II, chapters 4 through 7, details the basics of measure and integration and stands independently for use in a separate measure theory course. Part III addresses more advanced topics, includin...

  2. Operational analysis and comparative evaluation of embedded Z-Source inverters

    Blaabjerg, Frede; Gao, F.; Loh, P.C.

    2008-01-01

    This paper presents various embedded Z-source (EZ-source) inverters, broadly classified as shunt or parallel embedded Z-source inverters. Being different from the traditional Z-source inverter, EZ-source inverters are constructed by inserting dc sources into the X-shaped impedance network so that the dc input current flows smoothly during the whole switching period, unlike the traditional Z-source inverter. This feature is interesting when PV panels or fuel cells are assumed to power the load, since the continuous input current flow reduces the control complexity of the dc source and the system design burden ... circuitry connected instead of the generic voltage source inverter (VSI) circuitry. Further proceeding on to the topological variation, parallel embedded Z-source inverters are presented with detailed analysis of the topological configuration and operational principles, showing that they are the superior...

  3. STICAP: A linear circuit analysis program with stiff systems capability. Volume 1: Theory manual. [network analysis

    Cooke, C. H.

    1975-01-01

    STICAP (Stiff Circuit Analysis Program) is a FORTRAN 4 computer program written for the CDC-6400-6600 computer series and SCOPE 3.0 operating system. It provides the circuit analyst a tool for automatically computing the transient responses and frequency responses of large linear time invariant networks, both stiff and nonstiff (algorithms and numerical integration techniques are described). The circuit description and user's program input language is engineer-oriented, making simple the task of using the program. Engineering theories underlying STICAP are examined. A user's manual is included which explains user interaction with the program and gives results of typical circuit design applications. Also, the program structure from a systems programmer's viewpoint is depicted and flow charts and other software documentation are given.
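
    The "stiff" label refers to networks whose time constants differ by orders of magnitude. A minimal modern equivalent of the problem such a solver must handle (illustrative only, not STICAP's input language) is an RC ladder integrated with a backward-differentiation method:

        import numpy as np
        from scipy.integrate import solve_ivp

        # State-space form x' = A x + B u for two RC stages, tau = 1 µs and 1 s
        A = np.array([[-1e6, 0.0],
                      [ 1.0, -1.0]])
        B = np.array([1e6, 0.0])

        def rhs(t, x):
            return A @ x + B * 1.0  # unit step input

        # BDF handles the six-decade spread in time constants efficiently
        sol = solve_ivp(rhs, (0.0, 5.0), y0=[0.0, 0.0], method="BDF", rtol=1e-8)
        print(sol.y[:, -1])  # both node voltages settle to the 1 V steady state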

  4. Reactor Core Design and Analysis for a Micronuclear Power Source

    Hao Sun

    2018-03-01

    Underwater vehicles are designed to help secure national sea boundaries, which imposes harsh requirements on their power system design. Conventional power sources, such as batteries and Stirling engines, feature low power and short lifetimes. A micronuclear reactor power source, featuring higher power density and longer lifetime, would strongly meet the demands of unmanned underwater vehicle power systems. In this paper, a 2.4 MWt lithium heat pipe cooled reactor core is designed as a micronuclear power source that can be applied to underwater vehicles. The core features small volume, high power density, long lifetime, and a low noise level. Uranium nitride fuel with 70% enrichment and lithium heat pipes are adopted in the core. The reactivity is controlled by six control drums with B4C neutron absorber. The Monte Carlo code MCNP is used for calculating the power distribution, the characteristics of reactivity feedback, and core criticality safety. The code MCORE, coupling MCNP and ORIGEN, is used to analyze the burnup characteristics of the designed core. The results show that the core life is 14 years and the core parameters satisfy the safety requirements. This work provides a reference for the design and application of micronuclear power sources.

  5. Fecal bacteria source characterization and sensitivity analysis of SWAT 2005

    The Soil and Water Assessment Tool (SWAT) version 2005 includes a microbial sub-model to simulate fecal bacteria transport at the watershed scale. The objectives of this study were to demonstrate methods to characterize fecal coliform bacteria (FCB) source loads and to assess the model sensitivity t...

  6. Source term analysis for a RCRA mixed waste disposal facility

    Jordan, D.L.; Blandford, T.N.; MacKinnon, R.J.

    1996-01-01

    A Monte Carlo transport scheme was used to estimate the source strength resulting from potential releases from a mixed waste disposal facility. Infiltration rates were estimated using the HELP code, and transport through the facility was modeled using the DUST code, linked to a Monte Carlo driver

  7. Stability analysis of direct current control in current source rectifier

    Lu, Dapeng; Wang, Xiongfei; Blaabjerg, Frede

    2017-01-01

    Current source rectifier with high switching frequency has a great potential for improving the power efficiency and power density in ac-dc power conversion. This paper analyzes the stability of direct current control based on the time delay effect. Small signal model including dynamic behaviors...

  8. An open-source solution for advanced imaging flow cytometry data analysis using machine learning.

    Hennig, Holger; Rees, Paul; Blasi, Thomas; Kamentsky, Lee; Hung, Jane; Dao, David; Carpenter, Anne E; Filby, Andrew

    2017-01-01

    Imaging flow cytometry (IFC) enables the high-throughput collection of morphological and spatial information from hundreds of thousands of single cells. This high-content, information-rich image data can in theory resolve important biological differences among complex, often heterogeneous biological samples. However, data analysis is often performed in a highly manual and subjective manner using very limited image analysis techniques in combination with conventional flow cytometry gating strategies. This approach is not scalable to the hundreds of available image-based features per cell and thus makes use of only a fraction of the spatial and morphometric information. As a result, the quality, reproducibility and rigour of results are limited by the skill, experience and ingenuity of the data analyst. Here, we describe a pipeline using open-source software that leverages the rich information in digital imagery using machine learning algorithms. Compensated and corrected raw image (.rif) data files from an imaging flow cytometer (the proprietary .cif file format) are imported into the open-source software CellProfiler, where an image processing pipeline identifies cells and subcellular compartments, allowing hundreds of morphological features to be measured. This high-dimensional data can then be analysed using cutting-edge machine learning and clustering approaches using "user-friendly" platforms such as CellProfiler Analyst. Researchers can train an automated cell classifier to recognize different cell types, cell cycle phases, drug treatment/control conditions, etc., using supervised machine learning. This workflow should enable the scientific community to leverage the full analytical power of IFC-derived data sets. It will help to reveal otherwise unappreciated populations of cells based on features that may be hidden to the human eye, including subtle measured differences in label-free detection channels such as bright-field and dark-field imagery.
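
    The supervised step the pipeline describes reduces, in generic scikit-learn terms (a sketch under stand-in data, not the CellProfiler Analyst implementation), to training a classifier on the per-cell feature matrix:

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(1)
        X = rng.normal(size=(1000, 50))                # stand-in: 1000 cells x 50 features
        y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # stand-in labels (drug vs control)

        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        print(cross_val_score(clf, X, y, cv=5).mean())  # out-of-sample accuracy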

  9. A Method for the Analysis of Information Use in Source-Based Writing

    Sormunen, Eero; Heinstrom, Jannica; Romu, Leena; Turunen, Risto

    2012-01-01

    Introduction: Past research on source-based writing assignments has hesitated to scrutinize how students actually use information afforded by sources. This paper introduces a method for the analysis of text transformations from sources to texts composed. The method is aimed to serve scholars in building a more detailed understanding of how…

  10. Tracing diffuse anthropogenic Pb sources in rural soils by means of Pb isotope analysis

    Walraven, N.; Gaans, P.F.M. van; Veer, G. van der; Os, B.J.H. van; Klaver, G.T.; Vriend, S.P.; Middelburg, J.J.; Davies, G.R.

    2013-01-01

    Knowledge of the cause and source of Pb pollution is important to abate environmental Pb pollution by taking source-related actions. Lead isotope analysis is a potentially powerful tool to identify anthropogenic Pb and its sources in the environment. Spatial information on the variation of

  11. Vrancea seismic source analysis using a small-aperture array

    Popescu, E.; Popa, M.; Radulian, M.; Placinta, A.O.

    2005-01-01

    A small-aperture seismic array (BURAR) was installed in 1999 in the northern part of the Romanian territory (Bucovina area). Since then, the array has been in operation under a joint cooperation programme between Romania and the USA. The array consists of 10 stations installed in boreholes (nine short-period instruments and one broadband instrument) with sufficiently high sensitivity to properly detect earthquakes generated in the Vrancea subcrustal domain (at about 250 km epicentral distance) with magnitude M_w below 3. Our main purpose is to investigate and calibrate the source parameters of the Vrancea intermediate-depth earthquakes using specific techniques provided by the BURAR array data. Forty earthquakes with magnitudes between 2.9 and 6.0 were selected, including the recent events of September 27, 2004 (45.70°N, 26.45°E, h = 166 km, M_w = 4.7), October 27, 2004 (45.84°N, 26.63°E, h = 105 km, M_w = 6.0) and May 14, 2005 (45.66°N, 26.52°E, h = 146 km, M_w = 5.1), which are the best recorded earthquakes ever on the Romanian territory. Empirical Green's function deconvolution and spectral ratio methods are applied for pairs of collocated events with similar focal mechanisms. Stability tests are performed for the retrieved source time function using the array elements. Empirical scaling and calibration relationships are also determined. Our study shows the capability of the BURAR array to determine the source parameters of the Vrancea intermediate-depth earthquakes as a stand-alone station and proves that the recordings of this array alone provide reliable and useful tools to efficiently constrain the source parameters and consequently the source scaling properties. (authors)

  12. Correspondence Analysis-Theory and Application in Management Accounting Research

    Duller, Christine

    2010-09-01

    Correspondence analysis is an exploratory data analytic technique used to identify systematic relations between categorical variables. It is related to principal component analysis, and its results provide information on the structure of categorical variables similar to that given by a principal component analysis in the case of metric variables. Classical correspondence analysis is designed for two variables, whereas multiple correspondence analysis is an extension to more than two variables. After an introductory overview of the idea and its implementation in standard software packages (PASW, SAS, R), an example from recent research is presented, which deals with strategic management accounting in family and non-family enterprises in Austria, where 70% to 80% of all enterprises can be classified as family firms. Although there is a growing body of literature focusing on various management issues in family firms, the state of the art of strategic management accounting in family firms is so far an empirically under-researched subject. In the relevant literature only the (empirically untested) hypothesis can be found that family firms tend to have less formalized management accounting systems than non-family enterprises. Creating a correspondence analysis will help to identify the underlying structure that is responsible for differences in strategic management accounting.
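
    At its core, classical correspondence analysis is a singular value decomposition of the standardized residuals of a contingency table; a compact sketch with hypothetical counts:

        import numpy as np

        N = np.array([[20, 10, 5],
                      [ 5, 15, 10],
                      [10,  5, 20]], dtype=float)          # hypothetical cross-tabulation
        P = N / N.sum()
        r, c = P.sum(axis=1), P.sum(axis=0)                # row and column masses
        S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c)) # standardized residuals
        U, sv, Vt = np.linalg.svd(S, full_matrices=False)
        row_coords = (U * sv) / np.sqrt(r)[:, None]        # principal row coordinates
        col_coords = (Vt.T * sv) / np.sqrt(c)[:, None]     # principal column coordinates
        print(np.round(row_coords[:, :2], 3))              # first two CA dimensions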

  13. Residents’ Waste Separation Behaviors at the Source: Using SEM with the Theory of Planned Behavior in Guangzhou, China

    Zhang, Dongliang; Huang, Guangqing; Yin, Xiaoling; Gong, Qinghua

    2015-01-01

    Understanding the factors that affect residents’ waste separation behaviors helps in constructing effective environmental campaigns for a community. Using the theory of planned behavior (TPB), this study examines factors associated with waste separation behaviors by analyzing responses to questionnaires distributed in Guangzhou, China. Data drawn from 208 of 1,000 field questionnaires were used to assess socio-demographic factors and the TPB constructs (i.e., attitudes, subjective norms, perceived behavioral control, intentions, and situational factors). The questionnaire data revealed that attitudes, subjective norms, perceived behavioral control, intentions, and situational factors significantly predicted household waste separation behaviors in Guangzhou, China. Through a structural equation modeling analysis, we concluded that campaigns targeting moral obligations may be particularly effective for increasing the participation rate in waste separation behaviors. PMID:26274969

  14. NDARC NASA Design and Analysis of Rotorcraft. Appendix 5; Theory

    Johnson, Wayne

    2017-01-01

    The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. Specific rotorcraft configurations considered are single-main-rotor and tail

  15. NDARC NASA Design and Analysis of Rotorcraft Theory Appendix 1

    Johnson, Wayne

    2016-01-01

    The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. Specific rotorcraft configurations considered are single-main-rotor and tail

  16. NDARC: NASA Design and Analysis of Rotorcraft. Appendix 3; Theory

    Johnson, Wayne

    2016-01-01

    The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter

  17. Problems in the fingerprints based polycyclic aromatic hydrocarbons source apportionment analysis and a practical solution.

    Zou, Yonghong; Wang, Lixia; Christensen, Erik R

    2015-10-01

    This work intended to explain the challenges of the fingerprint-based source apportionment method for polycyclic aromatic hydrocarbons (PAHs) in the aquatic environment, and to illustrate a practical and robust solution. The PAH data detected in sediment cores from the Illinois River provide the basis for this study. Principal component analysis (PCA) separates PAH compounds into two groups reflecting their possible airborne transport patterns, but it is not able to suggest specific sources. Not all positive matrix factorization (PMF) determined sources are distinguishable due to the variability of source fingerprints. However, they constitute useful suggestions for inputs to a Bayesian chemical mass balance (CMB) analysis. The Bayesian CMB analysis takes into account the measurement errors as well as the variations of source fingerprints, and provides a credible source apportionment. The major PAH sources for Illinois River sediments are traffic (35%), coke ovens (24%), coal combustion (18%), and wood combustion (14%).
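
    The (non-Bayesian) chemical mass balance core of such an analysis is a nonnegative least-squares fit of measured concentrations to source fingerprints; the numbers below are hypothetical:

        import numpy as np
        from scipy.optimize import nnls

        # Columns: fingerprints of 3 sources across 4 PAH compounds (mass fractions)
        F = np.array([[0.40, 0.10, 0.30],
                      [0.30, 0.20, 0.30],
                      [0.20, 0.30, 0.20],
                      [0.10, 0.40, 0.20]])
        C = np.array([0.31, 0.28, 0.23, 0.18])  # receptor (sediment) concentrations

        s, residual = nnls(F, C)                # nonnegative source contributions
        print(np.round(s, 3), round(residual, 4))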

  18. Computational science and re-discovery: open-source implementation of ellipsoidal harmonics for problems in potential theory

    Bardhan, Jaydeep P; Knepley, Matthew G

    2012-01-01

    We present two open-source (BSD) implementations of ellipsoidal harmonic expansions for solving problems of potential theory using separation of variables. Ellipsoidal harmonics are used surprisingly infrequently, considering their substantial value for problems ranging in scale from molecules to the entire solar system. In this paper, we suggest two possible reasons for the paucity relative to spherical harmonics. The first is essentially historical—ellipsoidal harmonics developed during the late 19th century and early 20th, when it was found that only the lowest-order harmonics are expressible in closed form. Each higher-order term requires the solution of an eigenvalue problem, and tedious manual computation seems to have discouraged applications and theoretical studies. The second explanation is practical: even with modern computers and accurate eigenvalue algorithms, expansions in ellipsoidal harmonics are significantly more challenging to compute than those in Cartesian or spherical coordinates. The present implementations reduce the 'barrier to entry' by providing an easy and free way for the community to begin using ellipsoidal harmonics in actual research. We demonstrate our implementation using the specific and physiologically crucial problem of how charged proteins interact with their environment, and ask: what other analytical tools await re-discovery in an era of inexpensive computation?

  19. An open-source framework for analyzing N-electron dynamics. II. Hybrid density functional theory/configuration interaction methodology.

    Hermann, Gunter; Pohl, Vincent; Tremblay, Jean Christophe

    2017-10-30

    In this contribution, we extend our framework for analyzing and visualizing correlated many-electron dynamics to a non-variational, highly scalable electronic structure method. Specifically, an explicitly time-dependent electronic wave packet is written as a linear combination of N-electron wave functions at the configuration interaction singles (CIS) level, which are obtained from a reference time-dependent density functional theory (TDDFT) calculation. The procedure is implemented in the open-source Python program detCI@ORBKIT, which extends the capabilities of our recently published post-processing toolbox (Hermann et al., J. Comput. Chem. 2016, 37, 1511). From the output of standard quantum chemistry packages using atom-centered Gaussian-type basis functions, the framework exploits the multideterminental structure of the hybrid TDDFT/CIS wave packet to compute fundamental one-electron quantities such as difference electronic densities, transient electronic flux densities, and transition dipole moments. The hybrid scheme is benchmarked against wave function data for the laser-driven state-selective excitation in LiH. It is shown that all features of the electron dynamics are in good quantitative agreement with the higher-level method provided a judicious choice of functional is made. Broadband excitation of a medium-sized organic chromophore further demonstrates the scalability of the method. In addition, the time-dependent flux densities unravel the mechanistic details of the simulated charge migration process at a glance.

  20. Portfolio theory and cost-effectiveness analysis: a further discussion.

    Sendi, Pedram; Al, Maiwenn J; Rutten, Frans F H

    2004-01-01

    Portfolio theory has been suggested as a means to improve the risk-return characteristics of investments in health-care programs through diversification when costs and effects are uncertain. This approach is based on the assumption that the investment proportions are not subject to uncertainty and that the budget can be invested in toto in health-care programs. In the present paper we develop an algorithm that accounts for the fact that investment proportions in health-care programs may be uncertain (due to the uncertainty associated with costs) and limited (due to the size of the programs). The initial budget allocation across programs may therefore be revised at the end of the investment period to cover the extra costs of some programs with the leftover budget of other programs in the portfolio. Once the total budget is equivalent to or exceeds the expected costs of the programs in the portfolio, the initial budget allocation policy does not impact the risk-return characteristics of the combined portfolio, i.e., there is no benefit from diversification anymore. The applicability of portfolio methods to improve the risk-return characteristics of investments in health care is limited to situations where the available budget is much smaller than the expected costs of the programs to be funded.

  1. Efficiency and credit ratings: a permutation-information-theory analysis

    Bariviera, Aurelio Fernandez; Martinez, Lisana B; Zunino, Luciano; Belén Guercio, M; Rosso, Osvaldo A

    2013-01-01

    The role of credit rating agencies has been under severe scrutiny after the subprime crisis. In this paper we explore the relationship between credit ratings and the informational efficiency of a sample of thirty-nine corporate bonds of US oil and energy companies from April 2008 to November 2012. For this purpose we use a powerful statistical tool, relatively new in the financial literature: the complexity–entropy causality plane. This representation space allows us to graphically classify the different bonds according to their degree of informational efficiency. We find that this classification agrees with the credit ratings assigned by Moody’s. In particular, we detect the formation of two clusters, which correspond to the global categories of investment and speculative grades. Regarding the latter cluster, two subgroups reflect distinct levels of efficiency. Additionally, we also find an intriguing absence of correlation between informational efficiency and firm characteristics. This allows us to conclude that the proposed permutation-information-theory approach provides an alternative practical way to justify bond classification.
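
    The permutation-information quantities used here build on the Bandt-Pompe symbolization of a time series; a self-contained sketch of the normalized permutation entropy (one axis of the complexity-entropy plane):

        import numpy as np
        from itertools import permutations
        from math import factorial, log

        def permutation_entropy(x, d=4, tau=1):
            """Normalized Shannon entropy of ordinal patterns of order d."""
            counts = {p: 0 for p in permutations(range(d))}
            n = len(x) - (d - 1) * tau
            for i in range(n):
                # Rank pattern of each embedded window is its ordinal symbol
                counts[tuple(np.argsort(x[i:i + d * tau:tau]))] += 1
            probs = np.array([v for v in counts.values() if v > 0]) / n
            return float(-np.sum(probs * np.log(probs)) / log(factorial(d)))

        rng = np.random.default_rng(0)
        print(permutation_entropy(rng.normal(size=5000)))      # near 1: efficient/random
        print(permutation_entropy(np.sin(np.arange(5000)/5)))  # well below 1: predictable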

  2. Learning Theory and Equity Valuation: an Empirical Analysis

    Antonio Zoratto Sanvicente

    2010-07-01

    This paper tested the Pástor and Veronesi (2003) hypothesis that the market-to-book ratio (M/B) is negatively related to the number of years (age) during which a firm has had its stock traded on an exchange. The predicted decline takes place as a result of a learning process by investors. The authors tested this implication in the U.S. market using the Fama-MacBeth (1973) methodology. In the present article a more general econometric approach is adopted, with the use of panel data and fixed-factor regressors, with data for stocks traded on the São Paulo Stock Exchange (BOVESPA). The evidence does not reject the Pástor and Veronesi hypothesis. Additional conjectures were tested regarding the learning process. These tests indicate that the greater availability of data on a company amplifies the effect of the age variable on the M/B ratio, implying a more accelerated learning process. This paper concludes that the evidence for the Brazilian market supports the theory that investors learn.

  3. Buckling analysis of micro- and nano-rods/tubes based on nonlocal Timoshenko beam theory

    Wang, C M; Zhang, Y Y; Ramesh, Sai Sudha; Kitipornchai, S

    2006-01-01

    This paper is concerned with the elastic buckling analysis of micro- and nano-rods/tubes based on Eringen's nonlocal elasticity theory and the Timoshenko beam theory. In the former theory, the small scale effect is taken into consideration, while the effect of transverse shear deformation is accounted for in the latter theory. The governing equations and the boundary conditions are derived using the principle of virtual work. Explicit expressions for the critical buckling loads are derived for axially loaded rods/tubes with various end conditions. These expressions provide a better representation of the buckling behaviour of micro- and nano-rods/tubes, where the small scale and transverse shear deformation effects are significant. By comparing with the classical beam theories, the sensitivity of the buckling loads to the small scale effect may be observed
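
    To make the two effects concrete, schematically (our hedged paraphrase for the simply supported case, not a quotation of the paper's result): the local Euler load P_E = π²EI/L² is reduced by both the nonlocal parameter e_0 a and the finite shear rigidity κ_s GA,

        P_cr ≈ P_E / [(1 + (e_0 a)² π²/L²) (1 + P_E/(κ_s GA))],

    so setting e_0 a = 0 recovers a local Timoshenko load, and letting κ_s GA → ∞ recovers the Euler load.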

  4. Do violations of the axioms of expected utility theory threaten decision analysis?

    Nease, R F

    1996-01-01

    Research demonstrates that people violate the independence principle of expected utility theory, raising the question of whether expected utility theory is normative for medical decision making. The author provides three arguments that violations of the independence principle are less problematic than they might first appear. First, the independence principle follows from other more fundamental axioms whose appeal may be more readily apparent than that of the independence principle. Second, the axioms need not be descriptive to be normative, and they need not be attractive to all decision makers for expected utility theory to be useful for some. Finally, by providing a metaphor of decision analysis as a conversation between the actual decision maker and a model decision maker, the author argues that expected utility theory need not be purely normative for decision analysis to be useful. In short, violations of the independence principle do not necessarily represent direct violations of the axioms of expected utility theory; behavioral violations of the axioms of expected utility theory do not necessarily imply that decision analysis is not normative; and full normativeness is not necessary for decision analysis to generate valuable insights.

  5. Car indoor air pollution - analysis of potential sources

    Müller Daniel

    2011-12-01

    The populations of industrialized countries such as the United States or the member states of the European Union spend on average more than one hour each day in vehicles. In this respect, numerous studies have so far addressed outdoor air pollution that arises from traffic. By contrast, only little is known about indoor air quality in vehicles and the influence of non-vehicle sources. The present article therefore aims to summarize recent studies that address, for example, particulate matter exposure. It can be stated that although a large amount of data is available for outdoor air pollution, research in the area of indoor air quality in vehicles is still limited. In particular, knowledge on non-vehicular sources is missing. In this respect, an understanding of the effects and interactions of, for example, tobacco smoke under realistic automobile conditions should be achieved in future work.

  6. The Spallation Neutron Source (SNS) conceptual design shielding analysis

    Johnson, J.O.; Odano, N.; Lillie, R.A.

    1998-03-01

    The shielding design is important for the construction of an intense high-energy accelerator facility like the proposed Spallation Neutron Source (SNS) because of its impact on conventional facility design and maintenance operations, and because the radiation shielding accounts for a considerable part of the total facility costs. A calculational strategy utilizing coupled high-energy Monte Carlo calculations and multi-dimensional discrete ordinates calculations, along with semi-empirical calculations, was implemented to perform the conceptual design shielding assessment of the proposed SNS. Biological shields have been designed and assessed for the proton beam transport system and associated beam dumps, the target station, and the target service cell and general remote maintenance cell. Shielding requirements have been assessed with respect to weight, space, and dose-rate constraints for operating, shutdown, and accident conditions. A discussion of the proposed facility design, the conceptual design shielding requirements, the calculational strategy, source terms, preliminary results and conclusions, and recommendations for additional analyses is presented.

  7. Surveys on surgery theory

    Cappell, Sylvain; Rosenberg, Jonathan

    2014-01-01

    Surgery theory, the basis for the classification theory of manifolds, is now about forty years old. The sixtieth birthday (on December 14, 1996) of C.T.C. Wall, a leading member of the subject's founding generation, led the editors of this volume to reflect on the extraordinary accomplishments of surgery theory as well as its current enormously varied interactions with algebra, analysis, and geometry. Workers in many of these areas have often lamented the lack of a single source surveying surgery theory and its applications. Because no one person could write such a survey, the editors ask

  8. Surveys on surgery theory

    Cappell, Sylvain; Rosenberg, Jonathan

    2014-01-01

    Surgery theory, the basis for the classification theory of manifolds, is now about forty years old. There have been some extraordinary accomplishments in that time, which have led to enormously varied interactions with algebra, analysis, and geometry. Workers in many of these areas have often lamented the lack of a single source that surveys surgery theory and its applications. Indeed, no one person could write such a survey. The sixtieth birthday of C. T. C. Wall, one of the leaders of the founding generation of surgery theory, provided an opportunity to rectify the situation and produce a

  9. Asymptotic Analysis of Large Cooperative Relay Networks Using Random Matrix Theory

    H. Poor

    2008-04-01

    Cooperative transmission is an emerging communication technology that takes advantage of the broadcast nature of wireless channels. In cooperative transmission, the use of relays can create a virtual antenna array so that multiple-input/multiple-output (MIMO) techniques can be employed. Most existing work in this area has focused on the situation in which there are a small number of sources and relays and a destination. In this paper, cooperative relay networks with large numbers of nodes are analyzed, and in particular the asymptotic performance improvement of cooperative transmission over direct transmission and relay transmission is analyzed using random matrix theory. The key idea is to investigate the eigenvalue distributions related to channel capacity and to analyze the moments of these distributions in large wireless networks. A performance upper bound is derived, the performance in the low signal-to-noise-ratio regime is analyzed, and two approximations are obtained for high and low relay-to-destination link qualities, respectively. Finally, simulations are provided to validate the accuracy of the analytical results. The analysis in this paper provides important tools for the understanding and the design of large cooperative wireless networks.

  10. Development of in-vessel source term analysis code, tracer

    Miyagi, K.; Miyahara, S.

    1996-01-01

    Analyses of radionuclide transport in fuel failure accidents (generally referred to as source term analyses) are considered to be important, especially in severe accident evaluation. The TRACER code has been developed to realistically predict the time-dependent behavior of fission products (FPs) and aerosols within the primary cooling system for a wide range of fuel failure events. This paper presents the model description, the results of a validation study, the recent model advancement status of the code, and results of check-out calculations under reactor conditions. (author)

  11. Economic analysis of the need for advanced power sources

    Hardie, R.W.; Omberg, R.P.

    1975-01-01

    The purpose of this paper is to determine the economic need for an advanced power source, be it fusion, solar, or some other concept. However, calculations were also performed assuming abandonment of the LMFBR program, so breeder benefits are a by-product of this study. The model used was the ALPS linear programming system for forecasting optimum power growth patterns. Total power costs were calculated over a planning horizon from 1975 to 2041 and discounted at 7.5 percent. The benefit of a particular advanced power source is simply the reduction in total power cost resulting from its introduction. Since data concerning advanced power sources (APS) are speculative, parametric calculations varying introduction dates and capital costs about a hypothetical APS plant were performed. Calculations were also performed without the LMFBR to determine the effect of the breeder on the benefits of an advanced power source. Other data used in the study, such as the energy demand curve and uranium resource estimates, are given in the Appendix, and a list of the 11 power plants used in this study is given. Calculations were performed for APS introduction dates of 2001 and 2011. Estimates of APS capital costs included cases where it was assumed the costs were $50/kW and $25/kW higher than the LMFBR. In addition, cases where APS and LMFBR capital costs are identical were also considered. It is noted that the APS capital costs used in this study are not estimates of potential advanced power system plant costs, but were chosen to compute potential dollar benefits of advanced power systems under extremely optimistic assumptions. As a further example, all APS fuel cycle costs were assumed to be zero

  12. National Synchrotron Light Source safety-analysis report

    Batchelor, K.

    1982-07-01

    This document covers all of the safety issues relating to the design and operation of the storage rings and injection system of the National Synchrotron Light Source. The building systems for fire protection, access, and egress are described, together with air and other gaseous control or venting systems. Details of shielding against prompt bremsstrahlung radiation and synchrotron radiation are described, and the administrative requirements to be satisfied for operation of a beam line at the facility are given.

  13. Analysis of polymer foil heaters as infrared radiation sources

    Witek, Krzysztof; Piotrowski, Tadeusz; Skwarek, Agata

    2012-01-01

    Infrared radiation as a heat source is used in many fields. In particular, the positive effect of far-infrared radiation on living organisms has been observed. This paper presents two technological solutions for infrared heater production using polymer-silver and polymer-carbon pastes screenprinted on foil substrates. The purpose of this work was the identification of polymer layers as IR radiation sources in a specific frequency range. The characterization of the heaters was determined mainly by measurement of the surface temperature distribution using a thermovision camera, and the spectral characteristics were determined using a special measuring system. Basic parameters obtained for both polymer-silver and polymer-carbon heaters were similar and were as follows: power rating of 10–12 W/dm², continuous working surface temperature of 80–90 °C, temperature coefficient of resistance (TCR) of about +900 ppm/K for the polymer-carbon heater and about +2000 ppm/K for the polymer-silver one, maximum radiation intensity in the wavelength range of 6–14 μm with top intensity at 8.5 μm, and a heating time of about 20 min. For comparison purposes, a commercial panel heater was tested. The results show that the characteristics of the infrared polymer heaters are similar to those of the commercial heater, so they can be considered as alternative infrared radiation sources.
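
    As a rough consistency check (not part of the paper), Wien's displacement law relates the reported surface temperature to the reported emission peak: lambda_max = b / T with b ≈ 2898 μm·K, which for 80–90 °C gives a blackbody peak near 8 μm, in line with the measured top intensity at 8.5 μm.

      WIEN_B_UM_K = 2897.8  # Wien's displacement constant in micrometre-kelvin

      def peak_wavelength_um(surface_temp_c):
          """Blackbody peak-emission wavelength for a surface temperature in deg C."""
          return WIEN_B_UM_K / (surface_temp_c + 273.15)

      for t in (80.0, 85.0, 90.0):
          print(t, round(peak_wavelength_um(t), 2))  # ~8.2, ~8.1, ~8.0 um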

  14. Analysis of Extended Z-source Inverter for Photovoltaic System

    Prakash, G.; Subramani, C.; Dhineshkumar, K.; Rayavel, P.

    2018-04-01

    The Z-source inverter has gained prominence as a single-stage buck-boost inverter topology among many researchers. The Z-source inverter is a recent converter topology that exhibits both voltage-buck and voltage-boost capability. However, its boosting capability can be limited, and it may therefore not be suitable for applications requiring a high boost, which would demand cascading additional dc-dc boost converters; this could reduce efficiency and require additional sensing to control the extra stages. This paper proposes a new family of extended-boost quasi-Z-source inverters (ZSI) to fill the research gap left in the development of the ZSI. These new topologies can be operated with the same modulation strategies that were developed for the original ZSI. Likewise, they have the same number of active switches as the original ZSI, preserving its single-stage nature. The proposed topologies are analyzed in the steady state, and their performance is validated using simulation results obtained in MATLAB/Simulink. Furthermore, they are experimentally validated with results obtained from a prototype developed in the laboratory. The trend of rapid growth in PV energy use is related to the increasing efficiency of solar cells as well as improvements in the manufacturing technology of solar panels.
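
    The boost limitation that motivates the extended topologies can be seen from the classical boost factor of the original ZSI, B = 1/(1 - 2*D0) for shoot-through duty ratio D0 < 0.5. This is a textbook relation, not taken from this paper, and the sketch is purely illustrative.

      def zsi_boost_factor(d0):
          """Boost factor of the original Z-source inverter, B = 1 / (1 - 2*D0).
          Valid only for shoot-through duty ratio D0 in [0, 0.5)."""
          if not 0.0 <= d0 < 0.5:
              raise ValueError("shoot-through duty ratio must be in [0, 0.5)")
          return 1.0 / (1.0 - 2.0 * d0)

      # high boost forces D0 toward 0.5, squeezing the usable modulation index,
      # which is why extended-boost quasi-ZSI topologies are proposed
      for d0 in (0.1, 0.2, 0.3, 0.4, 0.45):
          print(d0, round(zsi_boost_factor(d0), 2))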

  15. An Analysis of School Administration Using Schema Theory.

    Nagy, Philip

    The use of a schema analysis, or hierarchical classification system, to identify problem-solving approaches among principals with different levels of experience is examined in this paper. Forty-one students in Ontario--10 undergraduate teacher preparation students, 8 teachers with principal qualifications, and 23 principals grouped according to 3…

  16. Safety analysis of passing maneuvers using extreme value theory

    Haneen Farah

    2017-04-01

    The results indicate that this is a promising approach for safety evaluation. Ongoing work by the authors will attempt to generalize this method to other safety measures related to passing maneuvers, to test it in a detailed analysis of the effect of demographic factors on passing-maneuver crash probability, and to assess its usefulness in a traffic simulation environment.

  17. Dynamical analysis of an optical rocking ratchet: Theory and experiment

    Arzola, Alejandro V.; Volke-Sepúlveda, K.; Mateos, J.L.

    2013-01-01

    Vol. 87, No. 6 (2013), 062910:1-9. ISSN 1539-3755. R&D Projects: GA MŠk LH12018; GA MŠk EE2.4.31.0016. Institutional support: RVO:68081731. Keywords: deterministic optical rocking ratchet; analysis of the dynamics. Subject RIV: BH - Optics, Masers, Lasers. Impact factor: 2.326, year: 2013

  18. The interaction between theory and experiment in charge density analysis

    Coppens, Phillip

    2013-01-01

    The field of X-ray charge density analysis has gradually morphed into an area benefiting from strong interactions between theoreticians and experimentalists, leading to new concepts of chemical bonding and of intermolecular interactions in condensed phases. Some highlights of the developments culminating in the 2013 Aminoff Award are described in this paper. (comment)

  19. Theories of the Syllogism: A Meta-Analysis

    Khemlani, Sangeet; Johnson-Laird, P. N.

    2012-01-01

    Syllogisms are arguments about the properties of entities. They consist of 2 premises and a conclusion, which can each be in 1 of 4 "moods": "All A are B," "Some A are B," "No A are B," and "Some A are not B." Their logical analysis began with Aristotle, and their psychological investigation began over 100 years ago. This article outlines the…

  20. Adult Attachment Ratings (AAR): an item response theory analysis.

    Pilkonis, Paul A; Kim, Yookyung; Yu, Lan; Morse, Jennifer Q

    2014-01-01

    The Adult Attachment Ratings (AAR) include 3 scales for anxious, ambivalent attachment (excessive dependency, interpersonal ambivalence, and compulsive care-giving), 3 for avoidant attachment (rigid self-control, defensive separation, and emotional detachment), and 1 for secure attachment. The scales include items (ranging from 6 to 16 in their original form) scored by raters using a 3-point format (0 = absent, 1 = present, and 2 = strongly present) and summed to produce a total score. Item response theory (IRT) analyses were conducted with data from 414 participants recruited from psychiatric outpatient, medical, and community settings to identify the most informative items from each scale. The IRT results allowed us to shorten the scales to 5-item versions that are more precise and easier to rate because of their brevity. In general, the effective range of measurement for the scales was 0 to +2 SDs for each of the attachment constructs; that is, from average to high levels of attachment problems. Evidence for convergent and discriminant validity of the scales was investigated by comparing them with the Experiences of Close Relationships-Revised (ECR-R) scale and the Kobak Attachment Q-sort. The best consensus among self-reports on the ECR-R, informant ratings on the ECR-R, and expert judgments on the Q-sort and the AAR emerged for anxious, ambivalent attachment. Given the good psychometric characteristics of the scale for secure attachment, however, this measure alone might provide a simple alternative to more elaborate procedures for some measurement purposes. Conversion tables are provided for the 7 scales to facilitate transformation from raw scores to IRT-calibrated (theta) scores.
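
    For readers unfamiliar with IRT for polytomous items such as these 0/1/2 ratings, the sketch below evaluates category probabilities under Samejima's graded response model. The abstract does not name the exact model or parameters used, so both are assumptions here.

      import numpy as np

      def graded_response_probs(theta, a, b1, b2):
          """Category probabilities P(0), P(1), P(2) for a 3-point item under
          Samejima's graded response model (a: discrimination, b1 < b2: thresholds)."""
          p_ge1 = 1.0 / (1.0 + np.exp(-a * (theta - b1)))  # P(rating >= 1)
          p_ge2 = 1.0 / (1.0 + np.exp(-a * (theta - b2)))  # P(rating >= 2)
          return 1.0 - p_ge1, p_ge1 - p_ge2, p_ge2

      # an informative item separates respondents across the 0 to +2 SD range
      for theta in (-1.0, 0.0, 1.0, 2.0):
          probs = graded_response_probs(theta, a=1.5, b1=0.5, b2=1.5)
          print(theta, [round(p, 2) for p in probs])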

  1. Analysis of event tree with imprecise inputs by fuzzy set theory

    Ahn, Kwang Il; Chun, Moon Hyun

    1990-01-01

    A fuzzy set theory approach is proposed as a method to analyze event trees with imprecise or linguistic input variables such as 'likely' or 'improbable' instead of numerical probabilities. In this paper, it is shown how fuzzy set theory can be applied to event tree analysis. The result of this study shows that the fuzzy set theory approach can be applied as an acceptable and effective tool for analysis of event trees with fuzzy inputs. Comparisons of the fuzzy-theory approach with the probabilistic approach, which computes the probabilities of the final states of the event tree through subjective weighting factors and the Latin hypercube sampling (LHS) technique, show that the two approaches have common factors and give reasonable results.
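
    One common way to implement such an analysis is to represent each linguistic probability as a triangular fuzzy number and propagate alpha-cut intervals along each event-tree sequence. The sketch below shows that mechanic with invented membership functions, which may differ from the paper's formulation.

      def alpha_cut(tri, alpha):
          """Interval of a triangular fuzzy number (low, mode, high) at level alpha."""
          lo, m, hi = tri
          return (lo + alpha * (m - lo), hi - alpha * (hi - m))

      def path_probability(branch_probs, alpha):
          """Multiply branch-probability intervals along one event-tree sequence."""
          lo, hi = 1.0, 1.0
          for tri in branch_probs:
              a, b = alpha_cut(tri, alpha)
              lo, hi = lo * a, hi * b
          return lo, hi

      # a 'likely' initiating event followed by an 'improbable' failure branch
      likely     = (0.7, 0.85, 1.0)   # hypothetical triangular memberships
      improbable = (0.0, 0.05, 0.15)
      print(path_probability([likely, improbable], alpha=0.5))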

  2. GENOVA: a generalized perturbation theory program for various applications to CANDU core physics analysis (I)-theory and application

    Kim, Do Heon; Choi, Hang Bok

    2001-01-01

    A generalized perturbation theory (GPT) program, GENOVA, has been developed for various applications to Canadian deuterium uranium (CANDU) reactor physics analyses. GENOVA was written within the framework of the CANDU physics design and analysis code, RFSP. A sensitivity method based on the GPT was implemented in GENOVA to estimate various sensitivity coefficients related to the movement of the zone controller units (ZCUs) in the CANDU reactor. The numerical algorithm for the sensitivity method was verified by a simple 2 x 2 node problem. The capability of predicting ZCU levels upon a refueling perturbation was validated for a CANDU-6 reactor problem. The applicability of GENOVA to CANDU-6 core physics analysis has been demonstrated with optimum refueling simulation and uncertainty analysis problems. For the optimum refueling simulation, an optimum channel selection strategy has been proposed, using the ZCU level predicted by GENOVA. The refueling simulation of a CANDU-6 natural uranium core has shown that the ZCU levels are successfully controlled within the operating range while the channel and bundle powers satisfy the license limits. An uncertainty analysis has been performed for the fuel composition heterogeneity of a CANDU DUPIC core, using the sensitivity coefficients generated by GENOVA. The results have shown that the uncertainty of the core performance parameter can be reduced appreciably when the contents of the major fissile isotopes are tightly controlled. The GENOVA code has been successfully used to supplement the weak points of the current design and analysis code, such as the inability to perform an optimum refueling simulation and uncertainty analysis. The sample calculations have shown that GENOVA has strong potential for CANDU core analysis combined with the current design and analysis code, RFSP, especially for the development of advanced CANDU fuels.

  3. Qualitative research in healthcare: an introduction to grounded theory using thematic analysis.

    Chapman, A L; Hadfield, M; Chapman, C J

    2015-01-01

    In today's NHS, qualitative research is increasingly important as a method of assessing and improving quality of care. Grounded theory has developed as an analytical approach to qualitative data over the last 40 years. It is primarily an inductive process whereby theoretical insights are generated from data, in contrast to deductive research where theoretical hypotheses are tested via data collection. Grounded theory has been one of the main contributors to the acceptance of qualitative methods in a wide range of applied social sciences. The influence of grounded theory as an approach is, in part, based on its provision of an explicit framework for analysis and theory generation. Furthermore the stress upon grounding research in the reality of participants has also given it credence in healthcare research. As with all analytical approaches, grounded theory has drawbacks and limitations. It is important to have an understanding of these in order to assess the applicability of this approach to healthcare research. In this review we outline the principles of grounded theory, and focus on thematic analysis as the analytical approach used most frequently in grounded theory studies, with the aim of providing clinicians with the skills to critically review studies using this methodology.

  4. Recruiting highly educated graduates: a study on the relationship between recruitment information sources, the theory of planned behavior, and actual job pursuit

    Jaidi, Y.; van Hooft, E.A.J.; Arends, L.R.

    2011-01-01

    Using the theory of planned behavior, we examined the effects of different recruitment-related information sources on the job pursuit of highly educated graduates. The study was conducted using a real-life longitudinal design. Participants reported on potential employers they were interested in. We

  5. Fit for Practice: Analysis and Evaluation of Watson's Theory of Human Caring.

    Pajnkihar, Majda; McKenna, Hugh P; Štiglic, Gregor; Vrbnjak, Dominika

    2017-07-01

    The aim of the authors of this paper is to analyze Watson's theory of human caring for its usefulness and worth in education, practice, and research. The reason for undertaking this analysis is to evaluate if Watson's theory would be useful for nursing in those countries where such theories were not an established part of the nursing curriculum. Furthermore, in some European countries, their political past or cultural influences led to an unquestioned adoption of the biomedical model. As their political culture changes, many social structures have had to be revisited, and for nursing, this has meant the introduction of theoretical reasoning, teaching, and practice.

  6. Physical analysis of some features of the gauge theories with Higgs sectors

    Beshtoev, Kh.M.

    1995-01-01

    A physical analysis of some features of the gauge theories with Higgs sectors is made. It is shown that we should assume gauge transformations in the fermion and Higgs sectors to be different (i.e., to have different charges) in order to remove contradictions arising in gauge theories with Higgs sectors. Then, the Higgs mechanism can be interpreted as some mechanism of gauge field shielding. In such a mechanism fermions remain without masses. The conclusion is made that in the standard theory of the development of the Universe, monopoles cannot survive at low temperatures. 15 refs

  7. Sociological Analysis of Contemporary Youth Movements: The Strengths of Classical Leadership Theories

    N V Andrievskaya

    2011-03-01

    Full Text Available The article is devoted to the study of youth socio-political movements in terms of leadership theories. The author examines the development of leadership theories and analyzes the activities of the leaders of two youth organizations in the context of those theories. In order to analyze the efficiency of their leadership, the author highlights the qualities essential for an ideal leader of a youth organization and identifies each leader's style of leadership. The author then considers how closely each of the candidates matches the ideal leadership model.

  8. Statistical analysis of activation and reaction energies with quasi-variational coupled-cluster theory

    Black, Joshua A.; Knowles, Peter J.

    2018-06-01

    The performance of quasi-variational coupled-cluster (QV) theory applied to the calculation of activation and reaction energies has been investigated. A statistical analysis of results obtained for six different sets of reactions has been carried out, and the results have been compared to those from standard single-reference methods. In general, the QV methods lead to increased activation energies and larger absolute reaction energies compared to those obtained with traditional coupled-cluster theory.

  9. Health Behavior Theory in Popular Calorie Counting Apps: A Content Analysis

    Davis, Siena F; Ellsworth, Marisa A; Payne, Hannah E; Hall, Shelby M; West, Joshua H; Nordhagen, Amber L

    2016-01-01

    Background Although the Health & Fitness category of the Apple App Store features hundreds of calorie counting apps, the extent to which popular calorie counting apps include health behavior theory is unknown. Objective This study evaluates the presence of health behavior theory in calorie counting apps. Methods Data for this study came from an extensive content analysis of the 10 most popular calorie counting apps in the Health & Fitness category of the Apple App Store. Results Each app was ...

  10. Spot-on or not? : an analysis of Seurat's colour theory

    Marks-Donaldson, Roberta Lynne

    1997-01-01

    An analysis of mid- to late-nineteenth-century scientific colour theories sets the stage for the introduction of the artistic style of the French painter Georges Seurat. His traditional beaux-arts training, extraordinary skills as a draughtsman, and keen interest in the scientific theories of colour then current combined in his person to create a new approach called Divisionism (also called Pointillisme, pointillism, and mélange optique). As Seurat's readings of scientific literature and his pract...

  11. Dual-layer ultrathin film optics: I. Theory and analysis

    Wang, Qian; Lim, Kim Peng

    2015-01-01

    This paper revisits dual-layer ultrathin film optics, which can be used for a functionally graded refractive-index thin film stack. We present the detailed derivation, including s-polarized and p-polarized light under an arbitrary incidence angle, showing the equivalence between the dual-layer ultrathin films and a negative birefringent thin film, as well as the approximations made during the derivation. Analysis of the approximations shows the influence of the thickness of the dual-layer thin films, the incidence angle, and the desired refractive index of the birefringent film. Numerical comparison between the titanium dioxide/aluminum oxide based dual-layer ultrathin film stack and the equivalent birefringent film verifies the theoretical analysis. The detailed theoretical study and numerical comparison provide physical insight and design guidelines for dual-layer ultrathin film based optical devices. (paper)
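
    In the limit of vanishing layer thickness at normal incidence, the equivalence the paper derives reduces to the classical form-birefringence result for a two-material stack. The sketch below uses that textbook limit with illustrative indices for TiO2/Al2O3; the paper's full derivation also covers oblique incidence and finite thickness.

      def form_birefringence(n1, n2, f1):
          """Zeroth-order effective indices of an ultrathin two-layer stack.

          n1, n2: layer refractive indices; f1: thickness fraction of layer 1.
          Returns (n_ordinary, n_extraordinary); n_e < n_o means negative
          birefringence, as in the equivalent film discussed in the paper."""
          f2 = 1.0 - f1
          eps1, eps2 = n1 ** 2, n2 ** 2
          n_o = (f1 * eps1 + f2 * eps2) ** 0.5            # E-field in-plane
          n_e = (1.0 / (f1 / eps1 + f2 / eps2)) ** 0.5    # E-field along normal
          return n_o, n_e

      print(form_birefringence(2.35, 1.65, 0.5))  # TiO2/Al2O3, equal thicknesses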

  12. The analysis of linear partial differential operators I distribution theory and Fourier analysis

    Hörmander, Lars

    2003-01-01

    The main change in this edition is the inclusion of exercises with answers and hints. This is meant to emphasize that this volume has been written as a general course in modern analysis on a graduate student level and not only as the beginning of a specialized course in partial differential equations. In particular, it could also serve as an introduction to harmonic analysis. Exercises are given primarily to the sections of general interest; there are none to the last two chapters. Most of the exercises are just routine problems meant to give some familiarity with standard use of the tools introduced in the text. Others are extensions of the theory presented there. As a rule rather complete though brief solutions are then given in the answers and hints. To a large extent the exercises have been taken over from courses or examinations given by Anders Melin or myself at the University of Lund. I am grateful to Anders Melin for letting me use the problems originating from him and for numerous valuable comm...

  13. Analysis of Generation Y Workforce Motivation Using Multiattribute Utility Theory

    2011-01-01

    the Berlin Wall, the induction of music television (MTV) into society, Columbine High School shootings, 9/11 terrorist attacks, more frequent... high inflation of the 1980s (Dries et al., 2008; Crumpacker & Crumpacker, 2007; Weingarten, 2009). ... and abhors slowness (Weingarten, 2009). To some, Generation Y's work values and attributes paint a picture of being high maintenance

  14. Viscous wing theory development. Volume 1: Analysis, method and results

    Chow, R. R.; Melnik, R. E.; Marconi, F.; Steinhoff, J.

    1986-01-01

    Viscous transonic flows at large Reynolds numbers over 3-D wings were analyzed using a zonal viscid-inviscid interaction approach. A new numerical AFZ scheme was developed in conjunction with the finite volume formulation for the solution of the inviscid full-potential equation. A special far-field asymptotic boundary condition was developed and a second-order artificial viscosity included for an improved inviscid solution methodology. The integral method was used for the laminar/turbulent boundary layer and 3-D viscous wake calculation. The interaction calculation included the coupling conditions of the source flux due to the wing surface boundary layer, the flux jump due to the viscous wake, and the wake curvature effect. A method was also devised incorporating the 2-D trailing edge strong interaction solution for the normal pressure correction near the trailing edge region. A fully automated computer program was developed to perform the proposed method with one scalar version to be used on an IBM-3081 and two vectorized versions on Cray-1 and Cyber-205 computers.

  15. Finite element analysis of advanced neutron source fuel plates

    Luttrell, C.R.

    1995-08-01

    The proposed design for the Advanced Neutron Source reactor core consists of closely spaced involute fuel plates. Coolant flows between the plates at high velocities. It is vital that adjacent plates do not come into contact and that the coolant channels between the plates remain open. Several scenarios that could result in problems with the fuel plates are studied. Finite element analyses are performed on fuel plates under pressure from the coolant flowing between the plates at high velocity, under pressure from a partial flow blockage in one of the channels, and with different temperature profiles.

  16. Review on solving the forward problem in EEG source analysis

    Vergult Anneleen

    2007-11-01

    Full Text Available Abstract Background The aim of electroencephalogram (EEG) source localization is to find the brain areas responsible for EEG waves of interest. It consists of solving forward and inverse problems. The forward problem is solved by starting from a given electrical source and calculating the potentials at the electrodes. These evaluations are necessary to solve the inverse problem, which is defined as finding the brain sources responsible for the measured potentials at the EEG electrodes. Methods While other reviews give an extensive summary of both the forward and inverse problems, this review article focuses on different aspects of solving the forward problem and is intended for newcomers to this research field. Results It starts by focusing on the generators of the EEG: the post-synaptic potentials in the apical dendrites of pyramidal neurons. These cells generate an extracellular current which can be modeled by Poisson's differential equation with Neumann and Dirichlet boundary conditions. The compartments in which these currents flow can be anisotropic (e.g., skull and white matter). In a three-shell spherical head model an analytical expression exists to solve the forward problem. During the last two decades researchers have tried to solve Poisson's equation in a realistically shaped head model obtained from 3D medical images, which requires numerical methods. The following methods are compared with each other: the boundary element method (BEM), the finite element method (FEM), and the finite difference method (FDM). In the last two methods anisotropic conducting compartments can conveniently be introduced. The focus is then set on the use of reciprocity in EEG source localization. It is introduced to speed up the forward calculations, which are then performed for each electrode position rather than for each dipole position. Solving Poisson's equation utilizing FEM and FDM corresponds to solving a large sparse linear system. Iterative
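
    A minimal illustration of the FDM route described in the review: discretizing Poisson's equation on a regular grid with a dipolar source term yields a large sparse system A·V = b that can be solved iteratively. The toy below assumes a homogeneous, isotropic 2-D conductor with zero-potential boundaries, far simpler than a realistic head model.

      import numpy as np
      from scipy.sparse import lil_matrix
      from scipy.sparse.linalg import cg

      def solve_poisson_dipole(n=40, sigma=0.33):
          """Solve sigma * (-laplacian) V = I_source on the interior of a grid
          with V = 0 on the boundary, using the 5-point FDM stencil and an
          iterative conjugate-gradient solver (toy homogeneous conductor)."""
          N = n * n
          A = lil_matrix((N, N))
          b = np.zeros(N)
          idx = lambda i, j: i * n + j
          for i in range(n):
              for j in range(n):
                  k = idx(i, j)
                  A[k, k] = 4.0 * sigma
                  for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                      ii, jj = i + di, j + dj
                      if 0 <= ii < n and 0 <= jj < n:
                          A[k, idx(ii, jj)] = -sigma  # V = 0 outside (Dirichlet)
          # current dipole: a source/sink pair straddling the grid centre
          b[idx(n // 2, n // 2 - 1)] = -1.0
          b[idx(n // 2, n // 2 + 1)] = +1.0
          V, info = cg(A.tocsr(), b)
          assert info == 0, "CG did not converge"
          return V.reshape(n, n)

      V = solve_poisson_dipole()
      print(V[20, 18:23])   # potential near the dipole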

  17. Collection, Analysis, and Dissemination of Open Source News and Analysis for Safeguards Implementation and Evaluation

    Khaled, J.; Reed, J.; Ferguson, M.; Hepworth, C.; Serrat, J.; Priori, M.; Hammond, W.

    2015-01-01

    Analysis of all safeguards-relevant information is an essential component of IAEA safeguards and the ongoing State evaluation underlying IAEA verification activities. In addition to State-declared safeguards information and information generated from safeguards activities both in the field and at headquarters, the IAEA collects and analyzes information from a wide array of open sources relevant to States' nuclear-related activities. A number of these open sources include information that could be loosely categorized as "news": international, regional, and local media; company and government press releases; public records of parliamentary proceedings; and NGO/academic commentaries and analyses. It is the task of the State Factors Analysis Section of the Department of Safeguards to collect, analyze, and disseminate news of relevance to support ongoing State evaluation. This information supports State evaluation by providing the Department with a global overview of safeguards-relevant nuclear developments. Additionally, this type of information can support in-depth analyses of nuclear fuel cycle related activities, alerting State Evaluation Groups to potential inconsistencies in State declarations, and preparing inspectors for activities in the field. The State Factors Analysis Section uses a variety of tools, including subscription services, news aggregators, a roster of specialized sources, and a custom software application developed by an external partner to manage incoming data streams and to ensure that critical information is not overlooked. When analyzing data, it is necessary to determine the credibility of a given source and piece of information. Data must be considered for accuracy, bias, and relevance to the overall assessment. Analysts use a variety of methodological techniques to make these types of judgments, which are included when the information is presented to State Evaluation Groups. Dissemination of news to

  18. The analysis of security cost for different energy sources

    Jun, Eunju; Kim, Wonjoon; Chang, Soon Heung

    2009-01-01

    Global concerns for the security of energy have steadily been on the increase and are expected to become a major issue over the next few decades. Urgent policy response is thus essential. However, little attempt has been made at defining both energy security and energy metrics. In this study, we provide such metrics and apply them to four major energy sources in the Korean electricity market: coal, oil, liquefied natural gas, and nuclear. In our approach, we measure the cost of energy security in terms of supply disruption and price volatility, and we consider the degree of concentration in energy supply and demand using the Hirschman-Herfindahl index (HHI). Due to its balanced fuel supply and demand, relatively stable price, and high abundance, we find nuclear energy to be the most competitive energy source in terms of energy security in the Korean electricity market. LNG, on the other hand, was found to have the highest cost in terms of energy security due to its high concentration in supply and demand, and its high price volatility. In addition, in terms of cost, we find that economic security dominates supply security, and as such, it is the main factor in the total security cost. Within the confines of concern for global energy security, our study both broadens our understanding of energy security and enables a strategic approach in the portfolio management of energy consumption.
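
    The concentration measure named above is simple to compute: the HHI is the sum of squared shares. A sketch with made-up supplier shares, not the paper's data:

      def hhi(shares):
          """Hirschman-Herfindahl index of a supply (or demand) mix.

          shares: fractions of total supply by source or country.
          Ranges from 1/n (perfectly diversified) to 1.0 (single supplier)."""
          total = sum(shares)
          return sum((s / total) ** 2 for s in shares)

      # hypothetical LNG import shares from four suppliers
      print(round(hhi([0.55, 0.25, 0.15, 0.05]), 3))  # 0.39 -> highly concentrated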

  19. Pteros: fast and easy to use open-source C++ library for molecular analysis.

    Yesylevskyy, Semen O

    2012-07-15

    An open-source library, Pteros, for molecular modeling and analysis of molecular dynamics trajectories in the C++ programming language is introduced. Pteros provides a number of routine analysis operations, ranging from reading and writing trajectory files and geometry transformations to structural alignment and computation of nonbonded interaction energies. The library features asynchronous trajectory reading and parallel execution of several analysis routines, which greatly simplifies the development of computationally intensive trajectory analysis algorithms. The Pteros programming interface is very simple and intuitive, while the source code is well documented and easily extensible. Pteros is available for free under the open-source Artistic License from http://sourceforge.net/projects/pteros/. Copyright © 2012 Wiley Periodicals, Inc.

  20. Use of CITATION code for flux calculation in neutron activation analysis with voluminous sample using an Am-Be source

    Khelifi, R.; Idiri, Z.; Bode, P.

    2002-01-01

    The CITATION code, based on neutron diffusion theory, was used for flux calculations inside voluminous samples in prompt gamma activation analysis with an isotopic neutron source (Am-Be). The code uses specific parameters related to the source energy spectrum and the irradiation system materials (shielding, reflector). The flux distribution (thermal and fast) was calculated in three-dimensional geometry for the system: air, polyethylene, and a cuboidal water sample (50x50x50 cm). The thermal flux was calculated at a series of points inside the sample. The results agreed reasonably well with observed values: the maximum thermal flux was observed at a depth of 3.2 cm, while CITATION gave 3.7 cm. Beyond a depth of 7.2 cm, the thermal-to-fast flux ratio increases by up to a factor of two, which allows the detection system position to be optimised for in-situ PGAA.
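
    The class of calculation CITATION performs can be miniaturized to a one-group, one-dimensional diffusion problem solved by finite differences. The toy below uses illustrative constants, not the paper's model; a real calculation is multigroup and 3-D, and the interior thermal-flux maximum reported above comes from fast-to-thermal slowing down that a single group cannot reproduce.

      import numpy as np

      def flux_1d(n=100, length_cm=50.0, D=0.16, sigma_a=0.02, q0=1.0):
          """One-group 1-D diffusion, -D phi'' + Sigma_a phi = S, zero-flux edges,
          discretized with the standard 3-point finite-difference stencil."""
          h = length_cm / (n + 1)
          A = np.zeros((n, n))
          S = np.zeros(n)
          for i in range(n):
              A[i, i] = 2.0 * D / h**2 + sigma_a
              if i > 0:
                  A[i, i - 1] = -D / h**2
              if i < n - 1:
                  A[i, i + 1] = -D / h**2
          S[0] = q0 / h          # neutrons entering at the face nearest the source
          return np.linalg.solve(A, S)   # flux profile vs depth into the sample

      phi = flux_1d()
      print(phi[:5])   # flux decays with depth on the diffusion-length scale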