WorldWideScience

Sample records for source theory analysis

  1. Gravitation and source theory

    International Nuclear Information System (INIS)

    Yilmaz, H.

    1975-01-01

    Schwinger's source theory is applied to the problem of gravitation and its quantization. It is shown that within the framework of a flat space-time, the source-theory implementation leads to a violation of probability. To avoid this difficulty one must introduce a curved space-time; hence the source concept may be said to necessitate the transition to a curved-space theory of gravitation. It is further shown that the curved-space theory of gravitation implied by the source theory is not equivalent to the conventional Einstein theory. The source concept leads to a different theory in which the gravitational field has a stress-energy tensor t^ν_μ that contributes to geometric curvatures.

  2. Superradiance in the source theory

    International Nuclear Information System (INIS)

    Kim, Y.D.

    1979-01-01

    The radiative transition rate is formulated in a new approach within the framework of the source theory, which makes use of a vacuum persistence amplitude. It is also shown that the source theory can be applied effectively to determine the likelihood of superradiance of coherence in spontaneous emission. Since the source theory is applicable not only to electromagnetic interactions but also to many other interactions, it would be most interesting to inquire whether superradiance can occur in processes other than the electromagnetic radiative process, such as nuclear or gravitational processes. (Author)

  3. Second-order generalized perturbation theory for source-driven systems

    International Nuclear Information System (INIS)

    Greenspan, E.; Gilai, D.; Oblow, E.M.

    1978-01-01

    A second-order generalized perturbation theory (GPT) for the effect of multiple system variations on a general flux functional in source-driven systems is derived. The derivation is based on a functional Taylor series in which second-order derivatives are retained. The resulting formulation accounts for the nonlinear effect of a given variation accurate to third order in the flux and adjoint perturbations. It also accounts for the effect of interaction between any number of variations. The new formulation is compared with exact perturbation theory as well as with perturbation theory for altered systems. The usefulness of the second-order GPT formulation is illustrated by applying it to optimization problems. Its applicability to the areas of cross-section sensitivity analysis and system design and evaluation is also discussed.

  4. Random matrix theory with an external source

    CERN Document Server

    Brézin, Edouard

    2016-01-01

    This is the first book to show that the theory of the Gaussian random matrix is essential to understanding the universal correlations of random fluctuations, and to demonstrate that it is useful for evaluating universal topological quantities. We consider Gaussian random matrix models in the presence of a deterministic matrix source. In such models the correlation functions are known exactly for an arbitrary source and for any size of the matrices. The freedom given by the external source allows for various tunings to different classes of universality. The main interest is to use this freedom to compute various topological invariants for surfaces, such as the intersection numbers for curves drawn on a surface of given genus with marked points, Euler characteristics, and the Gromov–Witten invariants. A remarkable duality for the average of characteristic polynomials is essential for obtaining such topological invariants. The analysis is extended to nonorientable surfaces and to surfaces with boundaries.
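
    For orientation, the Gaussian model with an external source studied in this line of work is conventionally written as below; the notation is the standard one for such models, not a quotation from the book:

        \[ P_A(M) \;=\; \frac{1}{Z_N}\,\exp\!\Big[-N\,\mathrm{Tr}\Big(\tfrac{1}{2}M^{2}-AM\Big)\Big], \]

    where M is an N×N Hermitian random matrix and A is the deterministic source; tuning the eigenvalues of A is what moves the model between the universality classes mentioned above.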

  5. An overview of gravitational waves: theory, sources and detection

    CERN Document Server

    Auger, Gerard

    2017-01-01

    This book describes detection techniques used to search for and analyze gravitational waves (GW). It covers the whole domain of GW science, starting from the theory and ending with the experimental techniques (both present and future) used to detect them. The theoretical sections of the book address the theory of general relativity and of GW, followed by the theory of GW detection. The various sources of GW are described, as well as the methods used to analyse them and to extract their physical parameters. It includes an analysis of the consequences of GW observations in terms of astrophysics, as well as a description of the different detectors that exist and that are planned for the future. With the recent announcement of GW detection and the first results from LISA Pathfinder, this book will allow non-specialists to understand the present status of the field and the future of gravitational wave science.

  6. Operator theory: a comprehensive course in analysis, part 4

    CERN Document Server

    Simon, Barry

    2015-01-01

    A Comprehensive Course in Analysis by Poincaré Prize winner Barry Simon is a five-volume set that can serve as a graduate-level analysis textbook with a lot of additional bonus information, including hundreds of problems and numerous notes that extend the text and provide important historical background. Depth and breadth of exposition make this set a valuable reference source for almost all areas of classical analysis. Part 4 focuses on operator theory, especially on a Hilbert space. Central topics are the spectral theorem, the theory of trace class and Fredholm determinants, and the study of

  7. Blind source separation: theory and applications

    CERN Document Server

    Yu, Xianchuan; Xu, Jindong

    2013-01-01

    A systematic exploration of both classic and contemporary algorithms in blind source separation, with practical case studies. The book presents an overview of Blind Source Separation, a relatively new signal processing method. Due to the multidisciplinary nature of the subject, the book has been written so as to appeal to an audience from very different backgrounds. Basic mathematical skills (e.g., matrix algebra and foundations of probability theory) are essential in order to understand the algorithms, although the book is written in an introductory, accessible style. This book offers

  8. Source Similarity and Social Media Health Messages: Extending Construal Level Theory to Message Sources.

    Science.gov (United States)

    Young, Rachel

    2015-09-01

    Social media users post messages about health goals and behaviors to online social networks. Compared with more traditional sources of health communication such as physicians or health journalists, peer sources are likely to be perceived as more socially close or similar, which influences how messages are processed. This experimental study uses construal level theory of psychological distance to predict how mediated health messages from peers influence health-related cognition and behavioral intention. Participants were exposed to source cues that identified peer sources as being either highly attitudinally and demographically similar to or different from participants. As predicted by construal level theory, participants who perceived sources of social media health messages as highly similar listed a greater proportion of beliefs about the feasibility of health behaviors and a greater proportion of negative beliefs, while participants who perceived sources as more dissimilar listed a greater proportion of positive beliefs about the health behaviors. Results of the study could be useful in determining how health messages from peers could encourage individuals to set realistic health goals.

  9. Antenna theory: Analysis and design

    Science.gov (United States)

    Balanis, C. A.

    The book's main objective is to introduce the fundamental principles of antenna theory and to apply them to the analysis, design, and measurements of antennas. In a description of antennas, the radiation mechanism is discussed along with the current distribution on a thin wire. Fundamental parameters of antennas are examined, taking into account the radiation pattern, radiation power density, radiation intensity, directivity, numerical techniques, gain, antenna efficiency, half-power beamwidth, beam efficiency, bandwidth, polarization, input impedance, and antenna temperature. Attention is given to radiation integrals and auxiliary potential functions, linear wire antennas, loop antennas, linear and circular arrays, self- and mutual impedances of linear elements and arrays, broadband dipoles and matching techniques, traveling wave and broadband antennas, frequency independent antennas and antenna miniaturization, the geometrical theory of diffraction, horns, reflectors and lens antennas, antenna synthesis and continuous sources, and antenna measurements.

  10. Challenges in combining different data sets during analysis when using grounded theory.

    Science.gov (United States)

    Rintala, Tuula-Maria; Paavilainen, Eija; Astedt-Kurki, Päivi

    2014-05-01

    To describe the challenges in combining two data sets during grounded theory analysis. The use of grounded theory in nursing research is common; it is a suitable method for studying human action and interaction. It is recommended that many alternative sources of data are collected to create as rich a dataset as possible. Data were drawn from interviews with people with diabetes (n=19) and their family members (n=19), and the two data sets were combined during the analysis. When using grounded theory, there are numerous challenges in collecting and managing data, especially for the novice researcher. One challenge is to combine different data sets during the analysis. There are many methodological textbooks about grounded theory, but little has been written about combining different data sets. Discussion is needed on the management of data and the challenges of grounded theory. This article provides a means for combining different data sets in the grounded theory analysis process.

  11. Point sources and multipoles in inverse scattering theory

    CERN Document Server

    Potthast, Roland

    2001-01-01

    Over the last twenty years, the growing availability of computing power has had an enormous impact on the classical fields of direct and inverse scattering. The study of inverse scattering, in particular, has developed rapidly with the ability to perform computational simulations of scattering processes and has led to remarkable advances in a range of applications, from medical imaging and radar to remote sensing and seismic exploration. Point Sources and Multipoles in Inverse Scattering Theory provides a survey of recent developments in inverse acoustic and electromagnetic scattering theory. Focusing on methods developed over the last six years by Colton, Kirsch, and the author, this treatment uses point sources combined with several far-reaching techniques to obtain qualitative reconstruction methods. The author addresses questions of uniqueness, stability, and reconstructions for both two- and three-dimensional problems. With interest in extracting information about an object through scattered waves at an all-ti...

  12. The theory of magnetohydrodynamic wave generation by localized sources. I - General asymptotic theory

    Science.gov (United States)

    Collins, William

    1989-01-01

    The magnetohydrodynamic wave emission from several localized, periodic, kinematically specified fluid velocity fields is calculated using Lighthill's method for finding the far-field wave forms. The waves propagate through an isothermal and uniform plasma with a constant B field. General properties of the energy flux are illustrated with models of pulsating flux tubes and convective rolls. Interference theory from geometrical optics is used to find the direction of minimum fast-wave emission from multipole sources and slow-wave emission from discontinuous sources. The distribution of total flux in fast and slow waves varies with the ratios of the source dimensions l to the acoustic and Alfven wavelengths.

  13. Source-Free Exchange-Correlation Magnetic Fields in Density Functional Theory.

    Science.gov (United States)

    Sharma, S; Gross, E K U; Sanna, A; Dewhurst, J K

    2018-03-13

    Spin-dependent exchange-correlation energy functionals in use today depend on the charge density and the magnetization density: E_xc[ρ, m]. However, it is also correct to define the functional in terms of the curl of m for physical external fields: E_xc[ρ, ∇×m]. The exchange-correlation magnetic field, B_xc, then becomes source-free. We study this variation of the theory by uniquely removing the source term from local and generalized gradient approximations to the functional. By doing so, the total Kohn-Sham moments are improved for a wide range of materials for both functionals. Significantly, the moments for the pnictides are now in good agreement with experiment. This source-free method is simple to implement in all existing density functional theory codes.
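
    The "source-free" construction described above can be read as a Helmholtz-type projection; a minimal sketch of the idea, in standard vector calculus rather than the authors' exact numerical procedure, is

        \[ \nabla^{2}\phi = \nabla\cdot\mathbf{B}_{\mathrm{xc}}, \qquad \mathbf{B}_{\mathrm{xc}}' = \mathbf{B}_{\mathrm{xc}}-\nabla\phi \;\;\Rightarrow\;\; \nabla\cdot\mathbf{B}_{\mathrm{xc}}' = 0, \]

    so a single Poisson solve removes the longitudinal (source) component and leaves only the divergence-free part of B_xc acting on the magnetization.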

  14. Efficient image enhancement using sparse source separation in the Retinex theory

    Science.gov (United States)

    Yoon, Jongsu; Choi, Jangwon; Choe, Yoonsik

    2017-11-01

    Color constancy is the feature of the human vision system (HVS) that ensures the relative constancy of the perceived color of objects under varying illumination conditions. The Retinex theory of machine vision systems is based on the HVS. Among Retinex algorithms, the physics-based algorithms are efficient; however, they generally do not satisfy the local characteristics of the original Retinex theory because they eliminate global illumination from their optimization. We apply the sparse source separation technique to the Retinex theory to present a physics-based algorithm that satisfies the locality characteristic of the original Retinex theory. Previous Retinex algorithms have limited use in image enhancement because the total variation Retinex results in an overly enhanced image and the sparse source separation Retinex cannot completely restore the original image. In contrast, our proposed method preserves the image edge and can very nearly replicate the original image without any special operation.

  15. Applying circular economy innovation theory in business process modeling and analysis

    Science.gov (United States)

    Popa, V.; Popa, L.

    2017-08-01

    The overall aim of this paper is to develop a new conceptual framework for business process modeling and analysis, using circular economy innovation theory as a source for business knowledge management. The last part of the paper presents the authors' proposed basic structure for new business models applying circular economy innovation theories. For people working on new innovative business models in the field of the circular economy, this paper provides new ideas for clustering their concepts.

  16. On the finite line source problem in diffusion theory

    International Nuclear Information System (INIS)

    Mikkelsen, T.; Troen, I.; Larsen, S.E.

    1981-09-01

    A simple formula for calculating dispersion from a continuous finite line source, placed at right angles to the mean wind direction, is derived on the basis of statistical theory. Comparison is made with the virtual source concept usually used, and this is shown to be correct only in the limit where the virtual time lag T_v is small compared to the timescale of the turbulence t_l. (Auth.)

  17. Analysis, Design and Implementation of an Embedded Realtime Sound Source Localization System Based on Beamforming Theory

    Directory of Open Access Journals (Sweden)

    Arko Djajadi

    2009-12-01

    This project is intended to analyze, design and implement a realtime sound source localization system using a mobile robot as the medium. The implemented system uses 2 microphones as the sensors, an Arduino Duemilanove microcontroller system with an ATMega328p as the microprocessor, two permanent-magnet DC motors as the actuators for the mobile robot, a servo motor as the actuator to rotate the webcam toward the location of the sound source, and a laptop/PC as the simulation and display medium. In order to achieve the objective of finding the position of a specific sound source, beamforming theory is applied to the system. Once the location of the sound source is detected and determined, either the mobile robot adjusts its position according to the direction of the sound source, or only the webcam rotates in the direction of the incoming sound, simulating the use of this system in a video conference. The integrated system has been tested and the results show that the system can localize, in realtime, a sound source placed randomly on a half-circle area (0° - 180°) with a radius of 0.3 m - 3 m, assuming the system is the center point of the circle. Due to low ADC and processor speed, the best achievable angular resolution is still limited to 25°.
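
    As an illustration of the two-microphone geometry described above, the following sketch estimates a bearing from the inter-channel delay via cross-correlation. It is a toy example with invented signal, sampling rate and microphone spacing, not the authors' Arduino implementation, which applies beamforming on the embedded processor.

        import numpy as np

        def tdoa_bearing(left, right, fs, mic_distance, c=343.0):
            # Cross-correlate the channels to find the inter-channel delay,
            # then use the far-field relation tau = (d / c) * sin(theta).
            corr = np.correlate(left, right, mode="full")
            lag = np.argmax(corr) - (len(right) - 1)  # delay in samples
            tau = lag / fs
            sin_theta = np.clip(c * tau / mic_distance, -1.0, 1.0)
            return np.degrees(np.arcsin(sin_theta))

        # Toy check: a broadband source 30 degrees off broadside.
        fs, d, c = 48_000, 0.2, 343.0
        delay = d * np.sin(np.radians(30)) / c         # inter-mic delay
        src = np.random.default_rng(0).normal(size=2400)
        right = src
        left = np.roll(src, int(round(delay * fs)))    # left channel lags
        print(tdoa_bearing(left, right, fs, d))        # ~30 degrees

    With these numbers, one sample of delay resolution at 48 kHz over a 0.2 m baseline corresponds to a few degrees near broadside, which makes the coarse 25° resolution reported for a slow ADC plausible.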

  18. Classical electromagnetic field theory in the presence of magnetic sources

    OpenAIRE

    Chen, Wen-Jun; Li, Kang; Naón, Carlos

    2001-01-01

    Using two new well defined 4-dimensional potential vectors, we formulate the classical Maxwell's field theory in a form which has manifest Lorentz covariance and SO(2) duality symmetry in the presence of magnetic sources. We set up a consistent Lagrangian for the theory. Then from the action principle we get both Maxwell's equation and the equation of motion of a dyon moving in the electro-magnetic field.

  19. How Many Separable Sources? Model Selection In Independent Components Analysis

    Science.gov (United States)

    Woods, Roger P.; Hansen, Lars Kai; Strother, Stephen

    2015-01-01

    Unlike mixtures consisting solely of non-Gaussian sources, mixtures including two or more Gaussian components cannot be separated using standard independent components analysis methods that are based on higher order statistics and independent observations. The mixed Independent Components Analysis/Principal Components Analysis (mixed ICA/PCA) model described here accommodates one or more Gaussian components in the independent components analysis model and uses principal components analysis to characterize contributions from this inseparable Gaussian subspace. Information theory can then be used to select from among potential model categories with differing numbers of Gaussian components. Based on simulation studies, the assumptions and approximations underlying the Akaike Information Criterion do not hold in this setting, even with a very large number of observations. Cross-validation is a suitable, though computationally intensive alternative for model selection. Application of the algorithm is illustrated using Fisher's iris data set and Howells' craniometric data set. Mixed ICA/PCA is of potential interest in any field of scientific investigation where the authenticity of blindly separated non-Gaussian sources might otherwise be questionable. Failure of the Akaike Information Criterion in model selection also has relevance in traditional independent components analysis where all sources are assumed non-Gaussian. PMID:25811988
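
    The model-selection point above, that cross-validated likelihood is preferable to AIC-style in-sample criteria for choosing the size of the Gaussian subspace, can be illustrated with a simpler relative of mixed ICA/PCA: cross-validating the probabilistic-PCA log-likelihood over candidate dimensions. The sketch below uses scikit-learn on synthetic data and is an analogy, not the authors' algorithm.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.model_selection import cross_val_score

        # Synthetic data: 3 informative directions plus isotropic noise.
        rng = np.random.default_rng(0)
        n, p, k_true = 500, 10, 3
        W = rng.normal(size=(p, k_true))
        X = rng.normal(size=(n, k_true)) @ W.T + 0.5 * rng.normal(size=(n, p))

        # Held-out log-likelihood under probabilistic PCA per dimension.
        scores = {k: cross_val_score(PCA(n_components=k), X, cv=5).mean()
                  for k in range(1, p)}
        print(max(scores, key=scores.get))  # typically recovers k_true = 3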

  20. Pangenesis as a source of new genetic information. The history of a now disproven theory.

    Science.gov (United States)

    Bergman, Gerald

    2006-01-01

    Evolution is based on natural selection of existing biological phenotypic traits. Natural selection can only eliminate traits. It cannot create new ones, requiring a theory to explain the origin of new genetic information. The theory of pangenesis was a major attempt to explain the source of new genetic information required to produce phenotypic variety. This theory, advocated by Darwin as the main source of genetic variety, has now been empirically disproved. It is currently a theory mainly of interest to science historians.

  1. The problem of electric sources in Einstein's Hermite-symmetric field theory

    International Nuclear Information System (INIS)

    Kreisel, E.

    1986-01-01

    The possibility of introducing a geometric source without breaking the A-invariance and Hermite symmetry of Einstein's Hermitian relativity is investigated. It would be very meaningful to interpret a source of this kind as an electric current. With this extension, Einstein's unitary field theory contains Einstein's gravitation, electromagnetism and the gluonic vacuum of chromodynamics. (author)

  2. Functional analysis, spectral theory, and applications

    CERN Document Server

    Einsiedler, Manfred

    2017-01-01

    This textbook provides a careful treatment of functional analysis and some of its applications in analysis, number theory, and ergodic theory. In addition to discussing core material in functional analysis, the authors cover more recent and advanced topics, including Weyl’s law for eigenfunctions of the Laplace operator, amenability and property (T), the measurable functional calculus, spectral theory for unbounded operators, and an account of Tao’s approach to the prime number theorem using Banach algebras. The book further contains numerous examples and exercises, making it suitable for both lecture courses and self-study. Functional Analysis, Spectral Theory, and Applications is aimed at postgraduate and advanced undergraduate students with some background in analysis and algebra, but will also appeal to everyone with an interest in seeing how functional analysis can be applied to other parts of mathematics.

  3. Theories of police legitimacy – its sources and effects

    Directory of Open Access Journals (Sweden)

    Pavla Homolová

    2018-04-01

    This review of theories of police legitimacy aims to introduce the subject with a multidisciplinary approach. It draws on criminological, sociological, psychological and institutional theories of legitimacy in order to provide the reader with a rich framework in which the findings of the current empirical studies presented can be evaluated. Police legitimacy is conceived as a social phenomenon, closely related to social norms such as socially constructed police roles and models of policing. The prevailing normative model of police legitimacy in criminology is discussed in greater detail, including a critical outlook on procedural fairness as the assumed main source of empirical police legitimacy. Recent findings concerning legal socialization and theories of legitimization myths are highlighted in order to supplement the micro-level oriented criminological literature on police legitimacy. Possible future pathways of legitimacy research in criminology are discussed.

  4. Bifurcation and stability in Yang-Mills theory with sources

    International Nuclear Information System (INIS)

    Jackiw, R.

    1979-06-01

    A lecture is presented in which some recent work on solutions to classical Yang-Mills theory is discussed. The investigations summarized include the field equations with static, external sources. A pattern allowing a comprehensive description of the solutions and stability in dynamical systems are covered. A list of open questions and problems for further research is given. 20 references

  5. Information Foraging Theory: A Framework for Intelligence Analysis

    Science.gov (United States)

    2014-11-01

    …oceanographic information, human intelligence (HUMINT), open-source intelligence (OSINT), and information provided by other governmental departments [1][5…]. Glossary fragment: HUMINT, Human Intelligence; IFT, Information Foraging Theory; LSA, Latent Semantic Similarity; MVT, Marginal Value Theorem; OFT, Optimal Foraging Theory; OSINT…

  6. Concept analysis and the building blocks of theory: misconceptions regarding theory development.

    Science.gov (United States)

    Bergdahl, Elisabeth; Berterö, Carina M

    2016-10-01

    The purpose of this article is to discuss the attempts to justify concept analysis as a way to construct theory - a notion often advocated in nursing. The notion that concepts are the building blocks or threads from which theory is constructed is often repeated; it can be found in many articles and well-known textbooks. However, this notion is seldom explained or defended. The notion of concepts as building blocks has also been questioned by several authors, although most of these authors seem to agree to some degree that concepts are essential components from which theory is built. Discussion paper. Literature was reviewed to synthesize and debate current knowledge. Our point is that theory is not built by concept analysis or clarification, and we show that this notion has its basis in some serious misunderstandings. We argue that concept analysis is not a part of sound scientific method and should be abandoned. The current methods of concept analysis in nursing have no foundation in philosophy of science or in language philosophy. The type of concept analysis performed in nursing is not a way to 'construct' theory. Rather, theories are formed by creative endeavour to propose a solution to a scientific and/or practical problem. The bottom line is that the current style and form of concept analysis in nursing should be abandoned in favour of methods in line with modern theory of science. © 2016 John Wiley & Sons Ltd.

  7. Qualitative data analysis for health services research: developing taxonomy, themes, and theory.

    Science.gov (United States)

    Bradley, Elizabeth H; Curry, Leslie A; Devers, Kelly J

    2007-08-01

    To provide practical strategies for conducting and evaluating analyses of qualitative data applicable for health services researchers. DATA SOURCES AND DESIGN: We draw on extant qualitative methodological literature to describe practical approaches to qualitative data analysis. Approaches to data analysis vary by discipline and analytic tradition; however, we focus on qualitative data analysis that has as a goal the generation of taxonomy, themes, and theory germane to health services research. We describe an approach to qualitative data analysis that applies the principles of inductive reasoning while also employing predetermined code types to guide data analysis and interpretation. These code types (conceptual, relationship, perspective, participant characteristics, and setting codes) define a structure that is appropriate for generation of taxonomy, themes, and theory. Conceptual codes and subcodes facilitate the development of taxonomies. Relationship and perspective codes facilitate the development of themes and theory. Intersectional analyses with data coded for participant characteristics and setting codes can facilitate comparative analyses. Qualitative inquiry can improve the description and explanation of complex, real-world phenomena pertinent to health services research. Greater understanding of the processes of qualitative data analysis can be helpful for health services researchers as they use these methods themselves or collaborate with qualitative researchers from a wide range of disciplines.

  8. Continuous wavelet transform analysis and modal location analysis acoustic emission source location for nuclear piping crack growth monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Mohd, Shukri [Nondestructive Testing Group, Industrial Technology Division, Malaysian Nuclear Agency, 43000, Bangi, Selangor (Malaysia); Holford, Karen M.; Pullin, Rhys [Cardiff School of Engineering, Cardiff University, Queen's Buildings, The Parade, CARDIFF CF24 3AA (United Kingdom)

    2014-02-12

    Source location is an important feature of acoustic emission (AE) damage monitoring in nuclear piping. The ability to accurately locate sources can assist in source characterisation and early warning of failure. This paper describes the development of a novel AE source location technique termed 'Wavelet Transform analysis and Modal Location (WTML)', based on Lamb wave theory and time-frequency analysis, that can be used for global monitoring of plate-like steel structures. Source location was performed on a steel pipe 1500 mm long and 220 mm in outer diameter with a nominal thickness of 5 mm under a planar location test setup using H-N sources. The accuracy of the new technique was compared with other AE source location methods such as the time of arrival (TOA) technique and DeltaT location. The results of the study show that the WTML method produces more accurate location results compared with the TOA and triple point filtering location methods. The accuracy of the WTML approach is comparable with the DeltaT location method but requires no initial acoustic calibration of the structure.

  9. Continuous wavelet transform analysis and modal location analysis acoustic emission source location for nuclear piping crack growth monitoring

    International Nuclear Information System (INIS)

    Shukri Mohd

    2013-01-01

    Source location is an important feature of acoustic emission (AE) damage monitoring in nuclear piping. The ability to accurately locate sources can assist in source characterisation and early warning of failure. This paper describes the development of a novel AE source location technique termed Wavelet Transform analysis and Modal Location (WTML), based on Lamb wave theory and time-frequency analysis, that can be used for global monitoring of plate-like steel structures. Source location was performed on a steel pipe 1500 mm long and 220 mm in outer diameter with a nominal thickness of 5 mm under a planar location test setup using H-N sources. The accuracy of the new technique was compared with other AE source location methods such as the time of arrival (TOA) technique and DeltaT location. The results of the study show that the WTML method produces more accurate location results compared with the TOA and triple point filtering location methods. The accuracy of the WTML approach is comparable with the DeltaT location method but requires no initial acoustic calibration of the structure. (author)

  10. Continuous wavelet transform analysis and modal location analysis acoustic emission source location for nuclear piping crack growth monitoring

    International Nuclear Information System (INIS)

    Mohd, Shukri; Holford, Karen M.; Pullin, Rhys

    2014-01-01

    Source location is an important feature of acoustic emission (AE) damage monitoring in nuclear piping. The ability to accurately locate sources can assist in source characterisation and early warning of failure. This paper describes the development of a novel AE source location technique termed 'Wavelet Transform analysis and Modal Location (WTML)', based on Lamb wave theory and time-frequency analysis, that can be used for global monitoring of plate-like steel structures. Source location was performed on a steel pipe 1500 mm long and 220 mm in outer diameter with a nominal thickness of 5 mm under a planar location test setup using H-N sources. The accuracy of the new technique was compared with other AE source location methods such as the time of arrival (TOA) technique and DeltaT location. The results of the study show that the WTML method produces more accurate location results compared with the TOA and triple point filtering location methods. The accuracy of the WTML approach is comparable with the DeltaT location method but requires no initial acoustic calibration of the structure.

  11. How Many Separable Sources? Model Selection In Independent Components Analysis

    DEFF Research Database (Denmark)

    Woods, Roger P.; Hansen, Lars Kai; Strother, Stephen

    2015-01-01

    The mixed Independent Components Analysis/Principal Components Analysis (mixed ICA/PCA) model described here accommodates one or more Gaussian components in the independent components analysis model and uses principal components analysis to characterize contributions from this inseparable Gaussian subspace. Information theory can then be used to select from among potential model categories with differing numbers of Gaussian components. Based on simulation studies, the assumptions and approximations underlying the Akaike Information Criterion do not hold in this setting, even with a very large number of observations. Cross-validation is a suitable, though computationally intensive, alternative for model selection. Mixed ICA/PCA is of potential interest in any field of scientific investigation where the authenticity of blindly separated non-Gaussian sources might otherwise be questionable. Failure of the Akaike Information Criterion in model selection also has relevance in traditional independent components analysis where all sources are assumed non-Gaussian.

  12. Decision analysis with cumulative prospect theory.

    Science.gov (United States)

    Bayoumi, A M; Redelmeier, D A

    2000-01-01

    Individuals sometimes express preferences that do not follow expected utility theory. Cumulative prospect theory adjusts for some phenomena by using decision weights rather than probabilities when analyzing a decision tree. The authors examined how probability transformations from cumulative prospect theory might alter a decision analysis of a prophylactic therapy in AIDS, eliciting utilities from patients with HIV infection (n = 75) and calculating expected outcomes using an established Markov model. They next focused on transformations of three sets of probabilities: 1) the probabilities used in calculating standard-gamble utility scores; 2) the probabilities of being in discrete Markov states; 3) the probabilities of transitioning between Markov states. The same prophylaxis strategy yielded the highest quality-adjusted survival under all transformations. For the average patient, prophylaxis appeared relatively less advantageous when standard-gamble utilities were transformed. Prophylaxis appeared relatively more advantageous when state probabilities were transformed and relatively less advantageous when transition probabilities were transformed. Transforming standard-gamble and transition probabilities simultaneously decreased the gain from prophylaxis by almost half. Sensitivity analysis indicated that even near-linear probability weighting transformations could substantially alter quality-adjusted survival estimates. The magnitude of benefit estimated in a decision-analytic model can change significantly after using cumulative prospect theory. Incorporating cumulative prospect theory into decision analysis can provide a form of sensitivity analysis and may help describe when people deviate from expected utility theory.
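
    A minimal sketch of the decision-weight idea: the Tversky-Kahneman weighting function (their published gamma = 0.61 estimate for gains; the two-outcome gamble values are invented) transforms a probability before it multiplies an outcome, so the CPT value of the same tree differs from its expected utility.

        def w(p, gamma=0.61):
            # Tversky-Kahneman (1992) probability weighting for gains.
            return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

        p, v_hi, v_lo = 0.2, 10.0, 0.0          # invented two-outcome gamble
        eu = p * v_hi + (1 - p) * v_lo          # expected utility: 2.0
        cpt = w(p) * v_hi + (1 - w(p)) * v_lo   # ~2.6: small p overweighted
        print(eu, cpt)

    Applying such weights at the standard-gamble, state-probability, or transition-probability level is what produces the shifts in quality-adjusted survival described above.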

  13. Mathematical analysis, approximation theory and their applications

    CERN Document Server

    Gupta, Vijay

    2016-01-01

    Designed for graduate students, researchers, and engineers in mathematics, optimization, and economics, this self-contained volume presents theory, methods, and applications in mathematical analysis and approximation theory. Specific topics include: approximation of functions by linear positive operators with applications to computer aided geometric design, numerical analysis, optimization theory, and solutions of differential equations. Recent and significant developments in approximation theory, special functions and q-calculus along with their applications to mathematics, engineering, and social sciences are discussed and analyzed. Each chapter enriches the understanding of current research problems and theories in pure and applied research.

  14. Methods of Fourier analysis and approximation theory

    CERN Document Server

    Tikhonov, Sergey

    2016-01-01

    Different facets of interplay between harmonic analysis and approximation theory are covered in this volume. The topics included are Fourier analysis, function spaces, optimization theory, partial differential equations, and their links to modern developments in the approximation theory. The articles of this collection originated from two events. The first event took place during the 9th ISAAC Congress in Krakow, Poland, 5th-9th August 2013, at the section “Approximation Theory and Fourier Analysis”. The second event was the conference on Fourier Analysis and Approximation Theory in the Centre de Recerca Matemàtica (CRM), Barcelona, during 4th-8th November 2013, organized by the editors of this volume. All articles selected to be part of this collection were carefully reviewed.

  15. Variational analysis of regular mappings: theory and applications

    CERN Document Server

    Ioffe, Alexander D

    2017-01-01

    This monograph offers the first systematic account of (metric) regularity theory in variational analysis. It presents new developments alongside classical results and demonstrates the power of the theory through applications to various problems in analysis and optimization theory. The origins of metric regularity theory can be traced back to a series of fundamental ideas and results of nonlinear functional analysis and global analysis centered around problems of existence and stability of solutions of nonlinear equations. In variational analysis, regularity theory goes far beyond the classical setting and is also concerned with non-differentiable and multi-valued operators. The present volume explores all basic aspects of the theory, from the most general problems for mappings between metric spaces to those connected with fairly concrete and important classes of operators acting in Banach and finite dimensional spaces. Written by a leading expert in the field, the book covers new and powerful techniques, whic...

  16. Using Generalizability Theory to Disattenuate Correlation Coefficients for Multiple Sources of Measurement Error.

    Science.gov (United States)

    Vispoel, Walter P; Morris, Carrie A; Kilinc, Murat

    2018-05-02

    Over the years, research in the social sciences has been dominated by reporting of reliability coefficients that fail to account for key sources of measurement error. Use of these coefficients, in turn, to correct for measurement error can hinder scientific progress by misrepresenting true relationships among the underlying constructs being investigated. In the research reported here, we addressed these issues using generalizability theory (G-theory) in both traditional and new ways to account for the three key sources of measurement error (random-response, specific-factor, and transient) that affect scores from objectively scored measures. Results from 20 widely used measures of personality, self-concept, and socially desirable responding showed that conventional indices consistently misrepresented reliability and relationships among psychological constructs by failing to account for key sources of measurement error and correlated transient errors within occasions. The results further revealed that G-theory served as an effective framework for remedying these problems. We discuss possible extensions in future research and provide code from the computer package R in an online supplement to enable readers to apply the procedures we demonstrate to their own research.
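
    The correction itself is compact. A sketch with invented numbers, where the reliabilities are G-theory generalizability coefficients that account for random-response, specific-factor, and transient error rather than single-source estimates:

        def disattenuate(r_xy, rel_x, rel_y):
            # Classical correction for attenuation: the estimated
            # true-score correlation between the two constructs.
            return r_xy / (rel_x * rel_y) ** 0.5

        print(disattenuate(0.40, 0.70, 0.80))  # observed .40 -> about .53

    Reliabilities that are overstated shrink this correction, which is how the conventional indices discussed above misrepresent relationships among constructs.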

  17. Fourier analysis in combinatorial number theory

    International Nuclear Information System (INIS)

    Shkredov, Il'ya D

    2010-01-01

    In this survey applications of harmonic analysis to combinatorial number theory are considered. Discussion topics include classical problems of additive combinatorics, colouring problems, higher-order Fourier analysis, theorems about sets of large trigonometric sums, results on estimates for trigonometric sums over subgroups, and the connection between combinatorial and analytic number theory. Bibliography: 162 titles.

  18. Fourier analysis in combinatorial number theory

    Energy Technology Data Exchange (ETDEWEB)

    Shkredov, Il'ya D [M. V. Lomonosov Moscow State University, Moscow (Russian Federation)]

    2010-09-16

    In this survey applications of harmonic analysis to combinatorial number theory are considered. Discussion topics include classical problems of additive combinatorics, colouring problems, higher-order Fourier analysis, theorems about sets of large trigonometric sums, results on estimates for trigonometric sums over subgroups, and the connection between combinatorial and analytic number theory. Bibliography: 162 titles.

  19. Analysis on Dangerous Source of Large Safety Accident in Storage Tank Area

    Science.gov (United States)

    Wang, Tong; Li, Ying; Xie, Tiansheng; Liu, Yu; Zhu, Xueyuan

    2018-01-01

    The difference between a large safety accident and a general accident is that the consequences of a large safety accident are particularly serious. This paper studies which factors in the tank area directly or indirectly lead to the occurrence of large safety accidents. According to the three kinds of hazard source theory and the consequence cause analysis of the super safety accident, the dangerous sources of the super safety accident in the tank area are analyzed from four aspects: energy source, large safety accident causes, missing management, and environmental impact. Based on the analysis of the three kinds of hazard sources and the environmental analysis, the main risk factors are derived and an AHP evaluation model is established; after rigorous and scientific calculation, the weights of the four kinds of risk factors, and of the factors within each kind, are obtained. The result of the analytic hierarchy process shows that management reasons are the most important, followed by environmental factors, direct causes and energy sources. It should be noted that although the direct causes are of relatively low overall importance, the direct causes 'failure of emergency measures' and 'failure of prevention and control facilities' carry greater weight.
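
    A sketch of the eigenvector computation behind an AHP weighting of this kind; the pairwise-comparison values are invented for illustration and are not the paper's:

        import numpy as np

        # Saaty pairwise-comparison matrix over the four factor groups
        # (management, environment, direct cause, energy source).
        A = np.array([[1.0, 3.0, 5.0, 7.0],
                      [1/3, 1.0, 3.0, 5.0],
                      [1/5, 1/3, 1.0, 3.0],
                      [1/7, 1/5, 1/3, 1.0]])

        vals, vecs = np.linalg.eig(A)
        k = np.argmax(vals.real)
        w = np.abs(vecs[:, k].real)
        w /= w.sum()                       # priority weights
        n = A.shape[0]
        CI = (vals.real[k] - n) / (n - 1)  # consistency index
        CR = CI / 0.90                     # random index RI = 0.90 for n = 4
        print(w.round(3), round(CR, 3))    # CR < 0.1 is acceptable by convention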

  20. Spectral theory and nonlinear functional analysis

    CERN Document Server

    Lopez-Gomez, Julian

    2001-01-01

    This Research Note addresses several pivotal problems in spectral theory and nonlinear functional analysis in connection with the analysis of the structure of the set of zeroes of a general class of nonlinear operators. It features the construction of an optimal algebraic/analytic invariant for calculating the Leray-Schauder degree, new methods for solving nonlinear equations in Banach spaces, and general properties of components of solution sets presented with minimal use of topological tools. The author also gives several applications of the abstract theory to reaction diffusion equations and systems. The results presented cover a thirty-year period and include recent, unpublished findings of the author and his coworkers. Appealing to a broad audience, Spectral Theory and Nonlinear Functional Analysis contains many important contributions to linear algebra, linear and nonlinear functional analysis, and topology and opens the door for further advances.

  1. The flow analysis of supercavitating cascade by linear theory

    Energy Technology Data Exchange (ETDEWEB)

    Park, E.T. [Sung Kyun Kwan Univ., Seoul (Korea, Republic of); Hwang, Y. [Seoul National Univ., Seoul (Korea, Republic of)

    1996-06-01

    In order to reduce damage due to cavitation effects and to improve the performance of fluid machinery, supercavitation around the cascade and the hydraulic characteristics of the supercavitating cascade must be analyzed accurately. The study of the effects of cavitation on fluid machinery, and the analysis of the performance of a supercavitating hydrofoil through the various elements governing the flow field, are critically important. In this study, experimental results were compared with results computed from a linear theory using the singularity method. Specifically, singularities such as sources and vortices were distributed on the hydrofoil and the free streamline to analyze the two-dimensional flow field of the supercavitating cascade; the governing equations of the flow field were derived, and the hydraulic characteristics of the cascade were calculated by numerical analysis of the governing equations. 7 refs., 6 figs.

  2. ERP Reliability Analysis (ERA) Toolbox: An open-source toolbox for analyzing the reliability of event-related brain potentials.

    Science.gov (United States)

    Clayson, Peter E; Miller, Gregory A

    2017-01-01

    Generalizability theory (G theory) provides a flexible, multifaceted approach to estimating score reliability. G theory's approach to estimating score reliability has important advantages over classical test theory that are relevant for research using event-related brain potentials (ERPs). For example, G theory does not require parallel forms (i.e., equal means, variances, and covariances), can handle unbalanced designs, and provides a single reliability estimate for designs with multiple sources of error. This monograph provides a detailed description of the conceptual framework of G theory using examples relevant to ERP researchers, presents the algorithms needed to estimate ERP score reliability, and provides a detailed walkthrough of newly-developed software, the ERP Reliability Analysis (ERA) Toolbox, that calculates score reliability using G theory. The ERA Toolbox is open-source, Matlab software that uses G theory to estimate the contribution of the number of trials retained for averaging, group, and/or event types on ERP score reliability. The toolbox facilitates the rigorous evaluation of psychometric properties of ERP scores recommended elsewhere in this special issue. Copyright © 2016 Elsevier B.V. All rights reserved.
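
    A minimal sketch of the G-theory computation underneath such a toolbox, for a persons x trials design: estimate variance components from ANOVA mean squares, then form the dependability of the across-trial ERP average. This shows the idea only; the ERA Toolbox itself is Matlab code handling more facets and designs.

        import numpy as np

        def dependability(scores):
            # Single-facet G study (persons x trials, both random):
            # variance components from expected mean squares, then the
            # dependability coefficient of the mean over n_t trials.
            n_p, n_t = scores.shape
            grand = scores.mean()
            pm, tm = scores.mean(axis=1), scores.mean(axis=0)
            ms_p = n_t * np.sum((pm - grand) ** 2) / (n_p - 1)
            ms_t = n_p * np.sum((tm - grand) ** 2) / (n_t - 1)
            resid = scores - pm[:, None] - tm[None, :] + grand
            ms_res = np.sum(resid ** 2) / ((n_p - 1) * (n_t - 1))
            var_p = max((ms_p - ms_res) / n_t, 0.0)  # person variance
            var_t = max((ms_t - ms_res) / n_p, 0.0)  # trial variance
            return var_p / (var_p + (var_t + ms_res) / n_t)

        rng = np.random.default_rng(1)
        amp = rng.normal(5, 2, size=(60, 1))           # stable person effect
        trials = amp + rng.normal(0, 4, size=(60, 40)) # 40 noisy trials each
        print(round(dependability(trials), 2))         # ~0.9 for 40-trial means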

  3. Development of a noncompact source theory with applications to helicopter rotors

    Science.gov (United States)

    Farassat, F.; Brown, T. J.

    1976-01-01

    A new formulation for determining the acoustic field of moving bodies, based on the acoustic analogy, is derived. The acoustic pressure is given as the sum of two integrals, one of which has a derivative with respect to time. The integrands are functions of the normal velocity and surface pressure of the body. A computer program based on this formulation was used to calculate acoustic pressure signatures for several helicopter rotors from experimental surface pressure data. Results are compared with those from compact source calculations. It is shown that noncompactness of steady sources on the rotor can account for the high harmonics of the pressure system. Thickness noise is shown to be a significant source of sound, especially for blunt airfoils in regions where noncompact source theory should be applied.

  4. Mean-deviation analysis in the theory of choice.

    Science.gov (United States)

    Grechuk, Bogdan; Molyboha, Anton; Zabarankin, Michael

    2012-08-01

    Mean-deviation analysis, along with the existing theories of coherent risk measures and dual utility, is examined in the context of the theory of choice under uncertainty, which studies rational preference relations for random outcomes based on different sets of axioms such as transitivity, monotonicity, continuity, etc. An axiomatic foundation of the theory of coherent risk measures is obtained as a relaxation of the axioms of the dual utility theory, and a further relaxation of the axioms is shown to lead to the mean-deviation analysis. Paradoxes arising from the sets of axioms corresponding to these theories, and their possible resolutions, are discussed, and application of the mean-deviation analysis to optimal risk sharing and portfolio selection in the context of rational choice is considered. © 2012 Society for Risk Analysis.

  5. Turbulence in extended synchrotron radio sources. I. Polarization of turbulent sources. II. Power-spectral analysis

    International Nuclear Information System (INIS)

    Eilek, J.A.

    1989-01-01

    Recent theories of magnetohydrodynamic turbulence are used to construct microphysical turbulence models, with emphasis on models of anisotropic turbulence. These models have been applied to the determination of the emergent polarization from a resolved uniform source. It is found that depolarization alone is not a unique measure of the turbulence, and that the turbulence will also affect the total-intensity distributions. Fluctuations in the intensity image can thus be employed to measure turbulence strength. In the second part, it is demonstrated that a power-spectral analysis of the total and polarized intensity images can be used to obtain the power spectra of the synchrotron emission. 81 refs

  6. A Meta-Analysis of Institutional Theories

    Science.gov (United States)

    1989-06-01

    Keywords: Institutional Theory, Isomorphism, Administrative Differentiation, Diffusion of Change, Rational, Unit of Analysis. Abstract fragment: …institutional theory may lead to better decision making and evaluation criteria on the part of managers in the non-profit sector. … This paper… institutional theory: 1) Organizations evolving in environments with elaborated institutional rules create structures that conform to those rules. 2) …

  7. Super Yang-Mills theory in 10+2 dimensions: The 2T-physics Source for N=4 SYM and M(atrix) Theory

    CERN Document Server

    Bars, Itzhak

    2010-01-01

    In this paper we construct super Yang-Mills theory in 10+2 dimensions, a number of dimensions that was not reached before in a unitary supersymmetric field theory, and show that this is the 2T-physics source of some cherished lower-dimensional field theories. The much-studied conformally exact N=4 Super Yang-Mills field theory in 3+1 dimensions is known to be a compactified version of N=1 SYM in 9+1 dimensions, while M(atrix) theory is obtained by compactifications of the 9+1 theory to 0 dimensions (also 0+1 and others). We show that there is a deeper origin of these theories in two higher dimensions, as they emerge from the new theory with two times. Pursuing various alternatives of gauge choices, solving kinematic equations and/or dimensional reductions of the 10+2 theory, we suggest a web of connections that include those mentioned above and a host of new theories that relate 2T-physics and 1T-physics field theories, all of which have the 10+2 theory as the parent. In addition to establishing the higher spa...

  8. Open source tools for the information theoretic analysis of neural data

    Directory of Open Access Journals (Sweden)

    Robin A. A Ince

    2010-05-01

    The recent and rapid development of open-source software tools for the analysis of neurophysiological datasets consisting of multiple simultaneous recordings of spikes, field potentials and other neural signals holds the promise for a significant advance in the standardization, transparency, quality, reproducibility and variety of techniques used to analyze neurophysiological data and integrate the information obtained at different spatial and temporal scales. In this Review we focus on recent advances in open source toolboxes for the information theoretic analysis of neural responses. We also present examples of their use to investigate the role of spike timing precision, correlations across neurons and field potential fluctuations in the encoding of sensory information. These information toolboxes, available both in Matlab and Python programming environments, hold the potential to enlarge the domain of application of information theory to neuroscience and to lead to new discoveries about how neurons encode and transmit information.

  9. Open source tools for the information theoretic analysis of neural data.

    Science.gov (United States)

    Ince, Robin A A; Mazzoni, Alberto; Petersen, Rasmus S; Panzeri, Stefano

    2010-01-01

    The recent and rapid development of open source software tools for the analysis of neurophysiological datasets consisting of simultaneous multiple recordings of spikes, field potentials and other neural signals holds the promise for a significant advance in the standardization, transparency, quality, reproducibility and variety of techniques used to analyze neurophysiological data and for the integration of information obtained at different spatial and temporal scales. In this review we focus on recent advances in open source toolboxes for the information theoretic analysis of neural responses. We also present examples of their use to investigate the role of spike timing precision, correlations across neurons, and field potential fluctuations in the encoding of sensory information. These information toolboxes, available both in MATLAB and Python programming environments, hold the potential to enlarge the domain of application of information theory to neuroscience and to lead to new discoveries about how neurons encode and transmit information.
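
    As an illustration of the quantity these toolboxes estimate, the sketch below computes a naive plug-in mutual information between a discrete stimulus and simulated spike counts. The Poisson rates are invented, and real analyses need the bias corrections that the reviewed toolboxes implement.

        import numpy as np

        def mutual_information(stim, resp):
            # Naive plug-in estimate of I(S;R) in bits from paired samples.
            s_vals, s_idx = np.unique(stim, return_inverse=True)
            r_vals, r_idx = np.unique(resp, return_inverse=True)
            joint = np.zeros((s_vals.size, r_vals.size))
            np.add.at(joint, (s_idx, r_idx), 1.0)
            joint /= joint.sum()
            ps = joint.sum(axis=1, keepdims=True)
            pr = joint.sum(axis=0, keepdims=True)
            nz = joint > 0
            return float(np.sum(joint[nz] * np.log2(joint[nz] / (ps @ pr)[nz])))

        rng = np.random.default_rng(0)
        stim = rng.integers(0, 2, 5000)      # binary stimulus
        resp = rng.poisson(2 + 3 * stim)     # spike counts: rate 2 vs 5
        print(round(mutual_information(stim, resp), 3))  # below the 1-bit cap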

  10. Advertisement Translation under Skopos Theory

    Institute of Scientific and Technical Information of China (English)

    严妙

    2014-01-01

    This paper is an analysis of advertisement translation under skopos theory. It is explained that the nature of advertisement translation under skopos theory is reconstructing the information of the source text to persuade the target audience. Three translation strategies are put forward for translating advertisements.

  11. Resolution of point sources of light as analyzed by quantum detection theory.

    Science.gov (United States)

    Helstrom, C. W.

    1973-01-01

    The resolvability of point sources of incoherent thermal light is analyzed by quantum detection theory in terms of two hypothesis-testing problems. In the first, the observer must decide whether there are two sources of equal radiant power at given locations, or whether there is only one source of twice the power located midway between them. In the second problem, either one, but not both, of two point sources is radiating, and the observer must decide which it is. The decisions are based on optimum processing of the electromagnetic field at the aperture of an optical instrument. In both problems the density operators of the field under the two hypotheses do not commute. The error probabilities, determined as functions of the separation of the points and the mean number of received photons, characterize the ultimate resolvability of the sources.
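
    The benchmark in such binary quantum hypothesis tests is the Helstrom minimum error probability; in standard notation, with priors p_0 and p_1 and field density operators rho_0 and rho_1,

        \[ P_e \;=\; \tfrac{1}{2}\Big(1-\big\lVert p_1\rho_1-p_0\rho_0\big\rVert_1\Big), \qquad \lVert A\rVert_1 \equiv \mathrm{Tr}\sqrt{A^{\dagger}A}, \]

    the error probability optimized over all physically allowed measurements; the non-commuting density operators noted above are exactly what separate this bound from the classical likelihood-ratio result.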

  12. Diffusion theory model for optimization calculations of cold neutron sources

    International Nuclear Information System (INIS)

    Azmy, Y.Y.

    1987-01-01

    Cold neutron sources are becoming increasingly important and common experimental facilities at many research reactors around the world, due to the high utility of cold neutrons in scattering experiments. The authors describe a simple two-group diffusion model of an infinite-slab LD₂ cold source. The simplicity of the model permits an analytical solution to be obtained, from which one can deduce the reason for the optimum thickness based solely on diffusion-type phenomena. A second, more sophisticated model is also described and the results compared to a deterministic transport calculation. The good (particularly qualitative) agreement between the results suggests that diffusion theory methods can be used in parametric and optimization studies to avoid the generally more expensive transport calculations.
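
    A generic two-group structure of the kind described above, with thermal neutrons (group 1) feeding the cold group (group 2) by scattering; the symbols follow standard diffusion-theory notation and are not necessarily the paper's:

        \[ -D_1\,\phi_1'' + \big(\Sigma_{a,1}+\Sigma_{1\to 2}\big)\,\phi_1 = S_1(x), \qquad -D_2\,\phi_2'' + \Sigma_{a,2}\,\phi_2 = \Sigma_{1\to 2}\,\phi_1(x), \]

    whose slab solutions are sums of exponentials in x; the competition between cold-neutron production through \Sigma_{1\to 2} and absorption through \Sigma_{a,2} is what produces an optimum moderator thickness.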

  13. Finite element analysis theory and application with ANSYS

    CERN Document Server

    Moaveni, Saeed

    2015-01-01

    For courses in Finite Element Analysis, offered in departments of Mechanical or Civil and Environmental Engineering. While many good textbooks cover the theory of finite element modeling, Finite Element Analysis: Theory and Application with ANSYS is the only text available that incorporates ANSYS as an integral part of its content. Moaveni presents the theory of finite element analysis, explores its application as a design/modeling tool, and explains in detail how to use ANSYS intelligently and effectively. Teaching and Learning Experience: This program will provide a better teaching and learning experience for you and your students. It will help: *Present the Theory of Finite Element Analysis: The presentation of theoretical aspects of finite element analysis is carefully designed not to overwhelm students. *Explain How to Use ANSYS Effectively: ANSYS is incorporated as an integral part of the content throughout the book. *Explore How to Use FEA as a Design/Modeling Tool: Open-ended design problems help stude...

  14. Acoustic Source Localization and Beamforming: Theory and Practice

    Directory of Open Access Journals (Sweden)

    Chen Joe C

    2003-01-01

    We consider the theoretical and practical aspects of locating acoustic sources using an array of microphones. A maximum-likelihood (ML) direct localization is obtained when the sound source is near the array, while in the far-field case, we demonstrate the localization via the cross bearing from several widely separated arrays. In the case of multiple sources, an alternating projection procedure is applied to determine the ML estimate of the DOAs from the observed data. The ML estimator is shown to be effective in locating sound sources of various types, for example, vehicle, music, and even white noise. From the theoretical Cramér-Rao bound analysis, we find that better source location estimates can be obtained for high-frequency signals than for low-frequency signals. In addition, a large range estimation error results when the source signal is unknown, but this unknown parameter does not have much impact on angle estimation. Much experimentally measured acoustic data was used to verify the proposed algorithms.

  15. Theoretical and methodological analysis of personality theories of leadership

    OpenAIRE

    Оксана Григорівна Гуменюк

    2016-01-01

    The psychological analysis of personality theories of leadership, which form the basis for other conceptual approaches to understanding the nature of leadership, is conducted. The conceptual approach to leadership is analyzed taking into account the priority of personality theories, including the heroic, psychoanalytic, «trait», charismatic and five-factor theories. It is noted that the psychological analysis of personality theories is important in understanding the nature of leadership.

  16. THE RESPONSIBILITY TO PROTECT. A JUST WAR THEORY BASED ANALYSIS

    Directory of Open Access Journals (Sweden)

    Andreea IANCU

    2014-11-01

    Full Text Available This paper analyzes the Responsibility to Protect principle as the paradigm that reinforces the just war theory in current international relations. The importance of this analysis lies in the fact that, given the changing sources of international conflict, the Responsibility to Protect principle affirms the responsibility of the international community to protect all the citizens of the world. In this context we witness a transition toward a post-Westphalian international system, which values the individual as a security referent. This article discusses the origins of the Responsibility to Protect principle and problematizes the legitimacy of the use of violence and force in the current international system. Moreover, the paper analyzes the possible humanization of current international relations and, simultaneously, the persistence of conflict and warfare in the international system. The conclusion of this research is that the Responsibility to Protect principle revises the just war theory by centering it on the individual.

  17. Positioning Theory and Discourse Analysis: Some Tools for Social Interaction Analysis

    Directory of Open Access Journals (Sweden)

    Francisco Tirado

    2007-05-01

    Full Text Available This article outlines positioning theory as a discursive analysis of interaction, focusing on the topic of conflict. The theory is then applied to a new work environment for the social sciences: virtual spaces. The analysis is organized in the following way. First, the major psychosocial issues which define the topic of conflict are reviewed. Then, virtual environments are presented as a new work space for the social sciences. Thirdly, a synthesis of positioning theory and its FOUCAULTian legacy is conducted, highlighting its particular appropriateness for analyzing conflictive interaction in virtual environments. An empirical case is then presented, consisting of an analysis of interactive sequences within a specific virtual environment: the Universitat Oberta de Catalunya (UOC) Humanitats i Filologia Catalana studies forum. Through positioning theory, the production and effects of a conflictive interaction sequence on the community in which it is produced are understood and explained. URN: urn:nbn:de:0114-fqs0702317

  18. Functional analysis theory and applications

    CERN Document Server

    Edwards, RE

    2011-01-01

    ""The book contains an enormous amount of information - mathematical, bibliographical and historical - interwoven with some outstanding heuristic discussions."" - Mathematical Reviews.In this massive graduate-level study, Emeritus Professor Edwards (Australian National University, Canberra) presents a balanced account of both the abstract theory and the applications of linear functional analysis. Written for readers with a basic knowledge of set theory, general topology, and vector spaces, the book includes an abundance of carefully chosen illustrative examples and excellent exercises at the

  19. Noncommutative analysis, operator theory and applications

    CERN Document Server

    Cipriani, Fabio; Colombo, Fabrizio; Guido, Daniele; Sabadini, Irene; Sauvageot, Jean-Luc

    2016-01-01

    This book illustrates several aspects of the current research activity in operator theory, operator algebras and applications in various areas of mathematics and mathematical physics. It is addressed to specialists but also to graduate students in several fields including global analysis, Schur analysis, complex analysis, C*-algebras, noncommutative geometry, operator algebras, operator theory and their applications. Contributors: F. Arici, S. Bernstein, V. Bolotnikov, J. Bourgain, P. Cerejeiras, F. Cipriani, F. Colombo, F. D'Andrea, G. Dell'Antonio, M. Elin, U. Franz, D. Guido, T. Isola, A. Kula, L.E. Labuschagne, G. Landi, W.A. Majewski, I. Sabadini, J.-L. Sauvageot, D. Shoikhet, A. Skalski, H. de Snoo, D. C. Struppa, N. Vieira, D.V. Voiculescu, and H. Woracek.

  20. Rhetorical structure theory and text analysis

    Science.gov (United States)

    Mann, William C.; Matthiessen, Christian M. I. M.; Thompson, Sandra A.

    1989-11-01

    Recent research on text generation has shown that there is a need for stronger linguistic theories that tell in detail how texts communicate. The prevailing theories are very difficult to compare, and it is also very difficult to see how they might be combined into stronger theories. To make comparison and combination a bit more approachable, we have created a book which is designed to encourage comparison. A dozen different authors or teams, all experienced in discourse research, are given exactly the same text to analyze. The text is an appeal for money by a lobbying organization in Washington, DC. It informs, stimulates and manipulates the reader in a fascinating way. The joint analysis is far more insightful than any one team's analysis alone. This paper is our contribution to the book. Rhetorical Structure Theory (RST), the focus of this paper, is a way to account for the functional potential of text, its capacity to achieve the purposes of speakers and produce effects in hearers. It also shows a way to distinguish coherent texts from incoherent ones, and identifies consequences of text structure.

  1. Spectral analysis and filter theory in applied geophysics

    CERN Document Server

    Buttkus, Burkhard

    2000-01-01

    This book is intended to be an introduction to the fundamentals and methods of spectral analysis and filter theory and their appli­ cations in geophysics. The principles and theoretical basis of the various methods are described, their efficiency and effectiveness eval­ uated, and instructions provided for their practical application. Be­ sides the conventional methods, newer methods arediscussed, such as the spectral analysis ofrandom processes by fitting models to the ob­ served data, maximum-entropy spectral analysis and maximum-like­ lihood spectral analysis, the Wiener and Kalman filtering methods, homomorphic deconvolution, and adaptive methods for nonstation­ ary processes. Multidimensional spectral analysis and filtering, as well as multichannel filters, are given extensive treatment. The book provides a survey of the state-of-the-art of spectral analysis and fil­ ter theory. The importance and possibilities ofspectral analysis and filter theory in geophysics for data acquisition, processing an...

  2. The resolution of point sources of light as analyzed by quantum detection theory

    Science.gov (United States)

    Helstrom, C. W.

    1972-01-01

    The resolvability of point sources of incoherent light is analyzed by quantum detection theory in terms of two hypothesis-testing problems. In the first, the observer must decide whether there are two sources of equal radiant power at given locations, or whether there is only one source of twice the power located midway between them. In the second problem, either one, but not both, of two point sources is radiating, and the observer must decide which it is. The decisions are based on optimum processing of the electromagnetic field at the aperture of an optical instrument. In both problems the density operators of the field under the two hypotheses do not commute. The error probabilities, determined as functions of the separation of the points and the mean number of received photons, characterize the ultimate resolvability of the sources.

  3. Hamiltonian analysis of Plebanski theory

    International Nuclear Information System (INIS)

    Buffenoir, E; Henneaux, M; Noui, K; Roche, Ph

    2004-01-01

    We study the Hamiltonian formulation of Plebanski theory in both the Euclidean and Lorentzian cases. A careful analysis of the constraints shows that the system is non-regular, i.e., the rank of the Dirac matrix is non-constant on the non-reduced phase space. We identify the gravitational and topological sectors which are regular subspaces of the non-reduced phase space. The theory can be restricted to the regular subspace which contains the gravitational sector. We explicitly identify first- and second-class constraints in this case. We compute the determinant of the Dirac matrix and the natural measure for the path integral of the Plebanski theory (restricted to the gravitational sector). This measure is the analogue of the Leutwyler-Fradkin-Vilkovisky measure of quantum gravity.

  4. Sources of political violence, political and psychological analysis

    Directory of Open Access Journals (Sweden)

    O. B. Balatska

    2015-05-01

    We also consider the following approaches to determining the nature and sources of aggression and violence: instinctivism (K. Lorenz) and behaviorism (J. B. Watson, B. F. Skinner et al.). Special attention is paid to theories of frustration aggression (J. Dollard, N. E. Miller, L. Berkowitz et al.), according to which the causes of aggression and violence are hidden in a particular mental state - frustration. The particular importance of the theory of T. R. Gurr, in which the source of aggression and political violence is defined through the concept of relative deprivation, is underlined. Another approach described in the article is the concept of aggression as a learned reaction (A. Bandura, G. Levin, B. Fleischmann et al.). Supporters of this approach believe that aggressive behavior is formed in the process of social learning.

  5. Surveys on surgery theory

    CERN Document Server

    Cappell, Sylvain; Rosenberg, Jonathan

    2014-01-01

    Surgery theory, the basis for the classification theory of manifolds, is now about forty years old. The sixtieth birthday (on December 14, 1996) of C.T.C. Wall, a leading member of the subject's founding generation, led the editors of this volume to reflect on the extraordinary accomplishments of surgery theory as well as its current enormously varied interactions with algebra, analysis, and geometry. Workers in many of these areas have often lamented the lack of a single source surveying surgery theory and its applications. Because no one person could write such a survey, the editors ask

  6. Surveys on surgery theory

    CERN Document Server

    Cappell, Sylvain; Rosenberg, Jonathan

    2014-01-01

    Surgery theory, the basis for the classification theory of manifolds, is now about forty years old. There have been some extraordinary accomplishments in that time, which have led to enormously varied interactions with algebra, analysis, and geometry. Workers in many of these areas have often lamented the lack of a single source that surveys surgery theory and its applications. Indeed, no one person could write such a survey. The sixtieth birthday of C. T. C. Wall, one of the leaders of the founding generation of surgery theory, provided an opportunity to rectify the situation and produce a

  7. Sensitivity analysis of source driven subcritical systems by the HGPT methodology

    International Nuclear Information System (INIS)

    Gandini, A.

    1997-01-01

    The heuristically based generalized perturbation theory (HGPT) methodology has been used extensively in recent decades for analysis studies in the nuclear reactor field. Its use leads to fundamental reciprocity relationships from which perturbation, or sensitivity, expressions can be derived, to first and higher order, in terms of simple integration operations on quantities calculated at unperturbed system conditions. Its application to subcritical, source-driven systems, now considered with increasing interest in many laboratories for their potential use as nuclear waste burners and/or safer energy producers, is discussed here, with particular emphasis on problems involving an intensive system control variable. (author)
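
    The flavor of a first-order perturbation estimate for a source-driven system can be sketched as follows; the operator, source, and detector vectors are random stand-ins, not an HGPT implementation.

    ```python
    import numpy as np

    # Fixed-source problem A(p) phi = s with response R = h . phi.
    rng = np.random.default_rng(0)
    n = 6
    A = 2.0 * np.eye(n) + 0.1 * rng.random((n, n))  # transport-like operator
    dA = 0.05 * rng.random((n, n))                  # dA/dp for one parameter p
    s = np.ones(n)                                  # external source (p-independent)
    h = np.zeros(n)
    h[0] = 1.0                                      # detector response weights

    phi = np.linalg.solve(A, s)        # forward flux
    phi_dag = np.linalg.solve(A.T, h)  # adjoint (importance) function

    # First-order sensitivity: dR/dp = -phi_dag . (dA/dp) . phi
    dR_adjoint = -phi_dag @ dA @ phi

    # Direct finite-difference check
    eps = 1e-7
    dR_direct = (h @ np.linalg.solve(A + eps * dA, s) - h @ phi) / eps
    print(dR_adjoint, dR_direct)  # agree to first order
    ```

    The point of the adjoint route is that one extra solve (for phi_dag) yields the sensitivity of the response to every parameter, instead of one forward solve per parameter.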

  8. Source location in plates based on the multiple sensors array method and wavelet analysis

    International Nuclear Information System (INIS)

    Yang, Hong Jun; Shin, Tae Jin; Lee, Sang Kwon

    2014-01-01

    A new method for impact source localization in a plate is proposed based on multiple signal classification (MUSIC) and wavelet analysis. For source localization, the direction of arrival of the wave caused by an impact on a plate and the distance between the impact position and the sensor should be estimated. The direction of arrival can be estimated accurately using the MUSIC method. The distance can be obtained by using the time delay of arrival and the group velocity of the Lamb wave in the plate. The time delay is experimentally estimated using the continuous wavelet transform of the wave. Elastodynamic theory is used for the group velocity estimation.

  9. Source location in plates based on the multiple sensors array method and wavelet analysis

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Hong Jun; Shin, Tae Jin; Lee, Sang Kwon [Inha University, Incheon (Korea, Republic of)

    2014-01-15

    A new method for impact source localization in a plate is proposed based on multiple signal classification (MUSIC) and wavelet analysis. For source localization, the direction of arrival of the wave caused by an impact on a plate and the distance between the impact position and the sensor should be estimated. The direction of arrival can be estimated accurately using the MUSIC method. The distance can be obtained by using the time delay of arrival and the group velocity of the Lamb wave in the plate. The time delay is experimentally estimated using the continuous wavelet transform of the wave. Elastodynamic theory is used for the group velocity estimation.
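
    To make the subspace idea concrete, here is a minimal narrowband MUSIC pseudospectrum for a uniform linear array; the array geometry and snapshot model are simplifying assumptions, and the paper's wavelet-based delay estimation is not reproduced.

    ```python
    import numpy as np

    def music_spectrum(X, n_src, d_over_lambda, angles_deg):
        """MUSIC pseudospectrum; X is (n_sensors, n_snapshots) narrowband data."""
        m = X.shape[0]
        R = X @ X.conj().T / X.shape[1]     # sample covariance matrix
        _, V = np.linalg.eigh(R)            # eigenvalues ascending
        En = V[:, : m - n_src]              # noise subspace
        k = np.arange(m)
        out = []
        for theta in np.radians(angles_deg):
            a = np.exp(2j * np.pi * d_over_lambda * k * np.sin(theta))  # steering vector
            out.append(1.0 / np.linalg.norm(En.conj().T @ a) ** 2)
        return np.array(out)

    # Toy usage: one source at +20 degrees, 8 sensors, half-wavelength spacing
    rng = np.random.default_rng(1)
    m, snaps = 8, 200
    a0 = np.exp(2j * np.pi * 0.5 * np.arange(m) * np.sin(np.radians(20.0)))
    sig = rng.standard_normal(snaps) + 1j * rng.standard_normal(snaps)
    noise = 0.1 * (rng.standard_normal((m, snaps)) + 1j * rng.standard_normal((m, snaps)))
    X = np.outer(a0, sig) + noise
    angles = np.arange(-90, 91)
    print(angles[np.argmax(music_spectrum(X, 1, 0.5, angles))])   # close to 20
    ```

    The source range then follows, as the abstract notes, from the measured arrival-time delay multiplied by the Lamb-wave group velocity.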

  10. Addendum to foundations of multidimensional wave field signal theory: Gaussian source function

    Directory of Open Access Journals (Sweden)

    Natalie Baddour

    2018-02-01

    Full Text Available Many important physical phenomena are described by wave or diffusion-wave type equations. Recent work has shown that a transform domain signal description from linear system theory can give meaningful insight into multi-dimensional wave fields. In N. Baddour [AIP Adv. 1, 022120 (2011)], certain results were derived that are mathematically useful for the inversion of multi-dimensional Fourier transforms, but more importantly provide useful insight into how source functions are related to the resulting wave field. In this short addendum to that work, it is shown that these results can be applied with a Gaussian source function, which is often useful for modelling various physical phenomena.

  11. Addendum to foundations of multidimensional wave field signal theory: Gaussian source function

    Science.gov (United States)

    Baddour, Natalie

    2018-02-01

    Many important physical phenomena are described by wave or diffusion-wave type equations. Recent work has shown that a transform domain signal description from linear system theory can give meaningful insight into multi-dimensional wave fields. In N. Baddour [AIP Adv. 1, 022120 (2011)], certain results were derived that are mathematically useful for the inversion of multi-dimensional Fourier transforms, but more importantly provide useful insight into how source functions are related to the resulting wave field. In this short addendum to that work, it is shown that these results can be applied with a Gaussian source function, which is often useful for modelling various physical phenomena.
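
    The convenience of the Gaussian source rests on a textbook identity (stated here for reference, not quoted from the addendum): its N-dimensional Fourier transform is again a Gaussian,

    ```latex
    S(\mathbf{r}) = e^{-a\lvert\mathbf{r}\rvert^{2}}
    \;\Longrightarrow\;
    \tilde{S}(\mathbf{k})
    = \int_{\mathbb{R}^{N}} e^{-a\lvert\mathbf{r}\rvert^{2}}\,
      e^{-i\,\mathbf{k}\cdot\mathbf{r}}\,\mathrm{d}^{N}r
    = \left(\frac{\pi}{a}\right)^{N/2} e^{-\lvert\mathbf{k}\rvert^{2}/(4a)},
    \qquad a > 0,
    ```

    so the source term stays closed-form under the multi-dimensional transforms that the wave field signal theory manipulates.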

  12. Self-consistent field theory of collisions: Orbital equations with asymptotic sources and self-averaged potentials

    Energy Technology Data Exchange (ETDEWEB)

    Hahn, Y.K., E-mail: ykhahn22@verizon.net

    2014-12-15

    The self-consistent field theory of collisions is formulated, incorporating the unique dynamics generated by the self-averaged potentials. The bound state Hartree–Fock approach is extended for the first time to scattering states, by properly resolving the principal difficulties of non-integrable continuum orbitals and imposing complex asymptotic conditions. The recently developed asymptotic source theory provides the natural theoretical basis, as the asymptotic conditions are completely transferred to the source terms and the new scattering function is made fully integrable. The scattering solutions can then be directly expressed in terms of bound state HF configurations, establishing the relationship between the bound and scattering state solutions. Alternatively, the integrable spin orbitals are generated by constructing the individual orbital equations that contain asymptotic sources and self-averaged potentials. However, the orbital energies are not determined by the equations, and a special channel energy fixing procedure is developed to secure the solutions. It is also shown that the variational construction of the orbital equations has intrinsic ambiguities that are generally associated with the self-consistent approach. On the other hand, when a small subset of open channels is included in the source term, the solutions are only partially integrable, but the individual open channels can then be treated more simply by properly selecting the orbital energies. The configuration mixing and channel coupling are then necessary to complete the solution. The new theory improves the earlier continuum HF model. - Highlights: • First extension of HF to scattering states, with proper asymptotic conditions. • Orbital equations with asymptotic sources and integrable orbital solutions. • Construction of self-averaged potentials, and orbital energy fixing. • Channel coupling and configuration mixing, involving the new orbitals. • Critical evaluation of the

  13. Automatic Wave Equation Migration Velocity Analysis by Focusing Subsurface Virtual Sources

    KAUST Repository

    Sun, Bingbing

    2017-11-03

    Macro velocity model building is important for subsequent pre-stack depth migration and full waveform inversion. Wave equation migration velocity analysis (WEMVA) utilizes the band-limited waveform to invert for the velocity. Normally, inversion would be implemented by focusing the subsurface offset common image gathers (SOCIGs). We re-examine this concept from a different perspective: in the subsurface offset domain, using extended Born modeling, the recorded data can be considered invariant with respect to simultaneous perturbations of the positions of the virtual sources and of the velocity. A linear system connecting the perturbation of the positions of those virtual sources and the velocity is derived and solved by the Conjugate Gradient method. In theory, the perturbation of the positions of the virtual sources is given by the Rytov approximation. Thus, compared to the Born approximation, it relaxes the dependency on amplitude and makes the proposed method more applicable to real data. We demonstrate the effectiveness of the approach by applying the proposed method to both isotropic and anisotropic VTI synthetic data. A real dataset example verifies the robustness of the proposed method.

  14. Automatic Wave Equation Migration Velocity Analysis by Focusing Subsurface Virtual Sources

    KAUST Repository

    Sun, Bingbing; Alkhalifah, Tariq Ali

    2017-01-01

    Macro velocity model building is important for subsequent pre-stack depth migration and full waveform inversion. Wave equation migration velocity analysis (WEMVA) utilizes the band-limited waveform to invert for the velocity. Normally, inversion would be implemented by focusing the subsurface offset common image gathers (SOCIGs). We re-examine this concept from a different perspective: in the subsurface offset domain, using extended Born modeling, the recorded data can be considered invariant with respect to simultaneous perturbations of the positions of the virtual sources and of the velocity. A linear system connecting the perturbation of the positions of those virtual sources and the velocity is derived and solved by the Conjugate Gradient method. In theory, the perturbation of the positions of the virtual sources is given by the Rytov approximation. Thus, compared to the Born approximation, it relaxes the dependency on amplitude and makes the proposed method more applicable to real data. We demonstrate the effectiveness of the approach by applying the proposed method to both isotropic and anisotropic VTI synthetic data. A real dataset example verifies the robustness of the proposed method.
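
    The linear system mentioned in the abstract is solved with conjugate gradients; a generic matrix-free CG loop of the kind typically used in such inversions (a sketch, not the authors' implementation) looks like this:

    ```python
    import numpy as np

    def conjugate_gradient(apply_A, b, tol=1e-8, max_iter=200):
        """Solve A x = b for symmetric positive definite A given only as a callable."""
        x = np.zeros_like(b)
        r = b - apply_A(x)
        p = r.copy()
        rs = r @ r
        for _ in range(max_iter):
            Ap = apply_A(p)
            alpha = rs / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            rs_new = r @ r
            if np.sqrt(rs_new) < tol:
                break
            p = r + (rs_new / rs) * p
            rs = rs_new
        return x

    # Tiny check with an explicit SPD matrix standing in for the WEMVA operator
    A = np.array([[4.0, 1.0], [1.0, 3.0]])
    b = np.array([1.0, 2.0])
    print(conjugate_gradient(lambda v: A @ v, b), np.linalg.solve(A, b))
    ```

    A matrix-free formulation matters here because the operator linking virtual-source positions and velocity perturbations is far too large to form explicitly.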

  15. A theory of gradient analysis

    NARCIS (Netherlands)

    Braak, ter C.J.F.

    1988-01-01

    The theory of gradient analysis is presented in this chapter, in which the heuristic techniques are integrated with regression, calibration, ordination and constrained ordination as distinct, well-defined statistical problems. The various techniques used for each type of problem are classified into

  16. Does the source energy change when gravitaion waves are emitted in the einstein's gravitation theory

    International Nuclear Information System (INIS)

    Logunov, A.A.; Folomeshkin, V.N.

    1977-01-01

    It is shown that in Einstein's gravitation theory the total "energy" of a plane gravitational wave, calculated with any pseudotensor, is equal to zero. The known Einstein result, according to which the energy of a source is decreased when plane weak gravitational waves are emitted, has no place in Einstein's gravitation theory. Examples are given of exact wave solutions for which the pseudotensor is strictly equal to zero. The energy-momentum of any weak gravitational wave is always equal to zero in Einstein's gravitation theory. When such waves are emitted the energy of the source cannot change, although these waves are real curvature waves. This means that in Einstein's gravitation theory the energy is, in essence, generated from nothing.

  17. Gravitational-wave physics and astronomy an introduction to theory, experiment and data analysis

    CERN Document Server

    Creighton, Jolien D E

    2011-01-01

    This most up-to-date, one-stop reference combines coverage of both theory and observational techniques, with introductory sections to bring all readers up to the same level. Written by outstanding researchers directly involved with the scientific program of the Laser Interferometer Gravitational-Wave Observatory (LIGO), the book begins with a brief review of general relativity before going on to describe the physics of gravitational waves and the astrophysical sources of gravitational radiation. Further sections cover gravitational wave detectors, data analysis, and the outlook of gravitation

  18. An introduction to nonlinear analysis and fixed point theory

    CERN Document Server

    Pathak, Hemant Kumar

    2018-01-01

    This book systematically introduces the theory of nonlinear analysis, providing an overview of topics such as geometry of Banach spaces, differential calculus in Banach spaces, monotone operators, and fixed point theorems. It also discusses degree theory, nonlinear matrix equations, control theory, differential and integral equations, and inclusions. The book presents surjectivity theorems, variational inequalities, stochastic game theory and mathematical biology, along with a large number of applications of these theories in various other disciplines. Nonlinear analysis is characterised by its applications in numerous interdisciplinary fields, ranging from engineering to space science, hydromechanics to astrophysics, chemistry to biology, theoretical mechanics to biomechanics and economics to stochastic game theory. Organised into ten chapters, the book shows the elegance of the subject and its deep-rooted concepts and techniques, which provide the tools for developing more realistic and accurate models for ...

  19. Chromatographic fingerprint similarity analysis for pollutant source identification

    International Nuclear Information System (INIS)

    Xie, Juan-Ping; Ni, Hong-Gang

    2015-01-01

    In the present study, a similarity analysis method was proposed to evaluate the source-sink relationships among environmental media for polybrominated diphenyl ethers (PBDEs), which were taken as the representative contaminants. Chromatographic fingerprint analysis has been widely used in the fields of natural products chemistry and forensic chemistry, but its application to environmental science has been limited. We established a library of various sources of media containing contaminants (e.g., plastics), recognizing that the establishment of a more comprehensive library allows for a better understanding of the sources of contamination. We then compared an environmental complex mixture (e.g., sediment, soil) with the profiles in the library. These comparisons could be used as the first step in source tracking. The cosine similarities between plastic and soil or sediment ranged from 0.53 to 0.68, suggesting that plastic in electronic waste is an important source of PBDEs in the environment, but it is not the only source. A similarity analysis between soil and sediment indicated that they have a source-sink relationship. Generally, the similarity analysis method can encompass more relevant information of complex mixtures in the environment than a profile-based approach that only focuses on target pollutants. There is an inherent advantage to creating a data matrix containing all peaks and their relative levels after matching the peaks based on retention times and peak areas. This data matrix can be used for source identification via a similarity analysis without quantitative or qualitative analysis of all chemicals in a sample. - Highlights: • Chromatographic fingerprint analysis can be used as the first step in source tracking. • Similarity analysis method can encompass more relevant information of pollution. • The fingerprints strongly depend on the chromatographic conditions. • A more effective and robust method for identifying similarities is required
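
    The cosine similarity underlying the reported 0.53-0.68 values is straightforward once peak matching has produced aligned peak-area vectors; the congener profiles below are invented numbers for illustration only.

    ```python
    import numpy as np

    def cosine_similarity(a, b):
        """Cosine of the angle between two aligned chromatographic profiles."""
        a = np.asarray(a, dtype=float)
        b = np.asarray(b, dtype=float)
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    # Hypothetical relative peak areas for matched PBDE congeners
    plastic = [0.12, 0.30, 0.05, 0.40, 0.13]
    sediment = [0.20, 0.22, 0.10, 0.35, 0.13]
    print(round(cosine_similarity(plastic, sediment), 2))
    ```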

  20. Modern Theory of Gratings Resonant Scattering: Analysis Techniques and Phenomena

    CERN Document Server

    Sirenko, Yuriy K

    2010-01-01

    Diffraction gratings are one of the most popular objects of analysis in electromagnetic theory. The requirements of applied optics and microwave engineering lead to many new problems and challenges for the theory of diffraction gratings, which force us to search for new methods and tools for their resolution. In Modern Theory of Gratings, the authors present results of the electromagnetic theory of diffraction gratings that will constitute the basis for further development of this theory and meet the challenges posed by the modern requirements of fundamental and applied science. This volume covers: spectral theory of gratings (Chapter 1), giving reliable grounds for physical analysis of space-frequency and space-time transformations of the electromagnetic field in open periodic resonators and waveguides; authentic analytic regularization procedures (Chapter 2) that, in contradistinction to the traditional frequency-domain approaches, fit perfectly for the analysis of resonant wave scattering processes; paramet...

  1. Design and analysis of nuclear battery driven by the external neutron source

    International Nuclear Information System (INIS)

    Wang, Sanbing; He, Chaohui

    2014-01-01

    Highlights: • A new type of space nuclear power source, called NBDEx, is investigated. • NBDEx with 252Cf has better performance than an RTG of similar structure. • Its thermal power improves greatly with increasing fuel enrichment. • The service life of NBDEx is about 2.96 years. • The launch-abort accident analysis fully demonstrates the advantage of NBDEx. - Abstract: Based on the theory of the ADS (Accelerator-Driven Subcritical reactor), a new type of nuclear battery was investigated, composed of a subcritical fission module and an isotope neutron source, called NBDEx (Nuclear Battery Driven by External neutron source). Following the structure of the GPHS-RTG (General Purpose Heat Source Radioisotope Thermoelectric Generator), fuel cell and fuel assembly models of NBDEx were set up, and their performance was analyzed with the MCNP code. From these results, it was found that the power and power density of NBDEx were almost six times higher than the RTG's. To demonstrate the advantage of NBDEx fully, an analysis of its impact factors was performed with the MCNP code, and its lifetime was calculated using the Origen code. These results verified that NBDEx is more suitable for space missions than the RTG.
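
    For orientation, the standard source-multiplication relation of ADS theory (general background, not a formula taken from the paper) ties the fission power of a subcritical assembly to the external source strength S and the multiplication factor k < 1:

    ```latex
    P \;=\; S \,\frac{k}{1-k}\,\frac{E_{f}}{\bar{\nu}},
    ```

    where E_f is the recoverable energy per fission and \bar{\nu} the mean number of neutrons per fission; as k rises with fuel enrichment, the 1/(1-k) amplification grows, consistent with the strong enrichment dependence the highlights report.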

  2. Analysis of Multidimensional Poverty: Theory and Case Studies ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2009-08-18

    Aug 18, 2009 ... of applying a factorial technique, Multiple Correspondence Analysis, to poverty analysis.

  3. Particle production in field theories coupled to strong external sources, I: Formalism and main results

    International Nuclear Information System (INIS)

    Gelis, Francois; Venugopalan, Raju

    2006-01-01

    We develop a formalism for particle production in a field theory coupled to a strong time-dependent external source. An example of such a theory is the color glass condensate. We derive a formula, in terms of cut vacuum-vacuum Feynman graphs, for the probability of producing a given number of particles. This formula is valid to all orders in the coupling constant. The distribution of multiplicities is non-Poissonian, even in the classical approximation. We investigate an alternative method of calculating the mean multiplicity. At leading order, the average multiplicity can be expressed in terms of retarded solutions of classical equations of motion. We demonstrate that the average multiplicity at next-to-leading order can be formulated as an initial value problem by solving equations of motion for small fluctuation fields with retarded boundary conditions. The variance of the distribution can be calculated in a similar fashion. Our formalism therefore provides a framework to compute from first principles particle production in proton-nucleus and nucleus-nucleus collisions beyond leading order in the coupling constant and to all orders in the source density. We also provide a transparent interpretation (in conventional field theory language) of the well-known Abramovsky-Gribov-Kancheli (AGK) cancellations. Explicit connections are made between the framework for multi-particle production developed here and the framework of reggeon field theory
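
    For contrast with the non-Poissonian distribution reported here, the textbook baseline is worth recalling: a free field driven linearly by a classical c-number source radiates a coherent state, whose multiplicity is Poisson-distributed,

    ```latex
    P_n = \frac{\bar{n}^{\,n}}{n!}\, e^{-\bar{n}},
    \qquad
    \langle n^{2}\rangle - \langle n\rangle^{2} = \bar{n},
    ```

    so any departure of the variance from the mean is a direct measure of the correlations that the strong-source formalism retains.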

  4. Coalescent: an open-source and scalable framework for exact calculations in coalescent theory

    Science.gov (United States)

    2012-01-01

    Background Currently, there is no open-source, cross-platform and scalable framework for coalescent analysis in population genetics. There is no scalable GUI based user application either. Such a framework and application would not only drive the creation of more complex and realistic models but also make them truly accessible. Results As a first attempt, we built a framework and user application for the domain of exact calculations in coalescent analysis. The framework provides an API with the concepts of model, data, statistic, phylogeny, gene tree and recursion. Infinite-alleles and infinite-sites models are considered. It defines pluggable computations such as counting and listing all the ancestral configurations and genealogies and computing the exact probability of data. It can visualize a gene tree, trace and visualize the internals of the recursion algorithm for further improvement, and attach dynamically a number of output processors. The user application defines jobs in a plug-in like manner so that they can be activated, deactivated, installed or uninstalled on demand. Multiple jobs can be run and their inputs edited. Job inputs are persisted across restarts and running jobs can be cancelled where applicable. Conclusions Coalescent theory plays an increasingly important role in analysing molecular population genetic data. Models involved are mathematically difficult and computationally challenging. An open-source, scalable framework that lets users immediately take advantage of the progress made by others will enable exploration of yet more difficult and realistic models. As models become more complex and mathematically less tractable, the need for an integrated computational approach is obvious. Object-oriented designs, though they have upfront costs, are practical now and can provide such an integrated approach. PMID:23033878

  5. Coalescent: an open-source and scalable framework for exact calculations in coalescent theory

    Directory of Open Access Journals (Sweden)

    Tewari Susanta

    2012-10-01

    Full Text Available Abstract Background Currently, there is no open-source, cross-platform and scalable framework for coalescent analysis in population genetics. There is no scalable GUI based user application either. Such a framework and application would not only drive the creation of more complex and realistic models but also make them truly accessible. Results As a first attempt, we built a framework and user application for the domain of exact calculations in coalescent analysis. The framework provides an API with the concepts of model, data, statistic, phylogeny, gene tree and recursion. Infinite-alleles and infinite-sites models are considered. It defines pluggable computations such as counting and listing all the ancestral configurations and genealogies and computing the exact probability of data. It can visualize a gene tree, trace and visualize the internals of the recursion algorithm for further improvement, and attach dynamically a number of output processors. The user application defines jobs in a plug-in like manner so that they can be activated, deactivated, installed or uninstalled on demand. Multiple jobs can be run and their inputs edited. Job inputs are persisted across restarts and running jobs can be cancelled where applicable. Conclusions Coalescent theory plays an increasingly important role in analysing molecular population genetic data. Models involved are mathematically difficult and computationally challenging. An open-source, scalable framework that lets users immediately take advantage of the progress made by others will enable exploration of yet more difficult and realistic models. As models become more complex and mathematically less tractable, the need for an integrated computational approach is obvious. Object-oriented designs, though they have upfront costs, are practical now and can provide such an integrated approach.
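
    To give a feel for the "exact calculations" such a framework automates (an independent textbook illustration, not the framework's API), the Ewens sampling formula gives the exact probability of an allele configuration under the infinite-alleles coalescent.

    ```python
    from math import factorial

    def ewens_probability(allele_counts, theta):
        """Ewens sampling formula.

        allele_counts: dict {j: a_j} with a_j alleles each represented j times.
        theta: scaled mutation rate.
        """
        n = sum(j * a for j, a in allele_counts.items())   # sample size
        rising = 1.0
        for i in range(n):                                 # theta rising factorial
            rising *= theta + i
        prob = factorial(n) / rising
        for j, a in allele_counts.items():
            prob *= theta ** a / (j ** a * factorial(a))
        return prob

    # Sample of n = 4 genes: one allele seen twice plus two singletons
    print(ewens_probability({1: 2, 2: 1}, theta=1.0))   # 0.25
    ```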

  6. Dimensional analysis in field theory

    International Nuclear Information System (INIS)

    Stevenson, P.M.

    1981-01-01

    Dimensional Transmutation (the breakdown of scale invariance in field theories) is reconciled with the commonsense notions of Dimensional Analysis. This makes possible a discussion of the meaning of the Renormalisation Group equations, completely divorced from the technicalities of renormalisation. As illustrations, I describe some very familiar QCD results in these terms.
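
    The canonical QCD illustration (standard material consistent with, but not copied from, the paper) is the one-loop trade of a dimensionless coupling for a dimensionful scale:

    ```latex
    \mu\,\frac{dg}{d\mu} = -b_{0}\,g^{3}
    \quad\Longrightarrow\quad
    \Lambda = \mu\, e^{-1/\bigl(2 b_{0}\, g^{2}(\mu)\bigr)},
    ```

    where Λ is renormalization-group invariant even though neither μ nor g(μ) is separately; squaring statements of this kind with ordinary dimensional analysis is exactly the paper's concern.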

  7. LED intense headband light source for fingerprint analysis

    Science.gov (United States)

    Villa-Aleman, Eliel

    2005-03-08

    A portable, lightweight and high-intensity light source for detecting and analyzing fingerprints during field investigation. On-site field analysis requires long hours of mobile analysis. In one embodiment, the present invention comprises a plurality of light emitting diodes; a power source; and a personal attachment means; wherein the light emitting diodes are powered by the power source, and wherein the power source and the light emitting diodes are attached to the personal attachment means to produce a personal light source for on-site analysis of latent fingerprints. The present invention is available for other applications as well.

  8. Mathematical theory of sedimentation analysis

    CERN Document Server

    Fujita, Hiroshi; Van Rysselberghe, P

    1962-01-01

    Mathematical Theory of Sedimentation Analysis presents the flow equations for the ultracentrifuge. This book is organized into two parts encompassing six chapters that evaluate the systems of reacting components, the differential equations for the ultracentrifuge, and the case of negligible diffusion. The first chapters consider the Archibald method for molecular weight determination; pressure-dependent sedimentation; expressions for the refractive index and its gradient; relation between refractive index and concentration; and the analysis of Gaussian distribution. Other chapters deal with th
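
    The flow equation referred to is the Lamm equation; in its standard form for the solute concentration c(r, t) in a cell spinning at angular velocity ω, with sedimentation coefficient s and diffusion coefficient D,

    ```latex
    \frac{\partial c}{\partial t}
    = \frac{1}{r}\,\frac{\partial}{\partial r}
      \left[ r \left( D\,\frac{\partial c}{\partial r} - s\,\omega^{2} r\, c \right) \right].
    ```

    The case of negligible diffusion treated in the book corresponds to dropping the D term.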

  9. Interior point algorithms theory and analysis

    CERN Document Server

    Ye, Yinyu

    2011-01-01

    The first comprehensive review of the theory and practice of one of today's most powerful optimization techniques. The explosive growth of research into and development of interior point algorithms over the past two decades has significantly improved the complexity of linear programming and yielded some of today's most sophisticated computing techniques. This book offers a comprehensive and thorough treatment of the theory, analysis, and implementation of this powerful computational tool. Interior Point Algorithms provides detailed coverage of all basic and advanced aspects of the subject.
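
    The construction at the heart of the subject can be stated in one line (a standard formulation, not specific to this book): a logarithmic barrier absorbs the nonnegativity constraints of the linear program, and letting the barrier weight μ tend to zero traces the central path to the optimum:

    ```latex
    \min_{x > 0}\; c^{\top}x \;-\; \mu \sum_{i=1}^{n} \ln x_{i}
    \quad\text{subject to}\quad A x = b,
    \qquad \mu \downarrow 0.
    ```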

  10. A Comparative Analysis of Three Unique Theories of Organizational Learning

    Science.gov (United States)

    Leavitt, Carol C.

    2011-01-01

    The purpose of this paper is to present three classical theories on organizational learning and conduct a comparative analysis that highlights their strengths, similarities, and differences. Two of the theories -- experiential learning theory and adaptive -- generative learning theory -- represent the thinking of the cognitive perspective, while…

  11. Kinematic analysis of parallel manipulators by algebraic screw theory

    CERN Document Server

    Gallardo-Alvarado, Jaime

    2016-01-01

    This book reviews the fundamentals of screw theory concerned with velocity analysis of rigid-bodies, confirmed with detailed and explicit proofs. The author additionally investigates acceleration, jerk, and hyper-jerk analyses of rigid-bodies following the trend of the velocity analysis. With the material provided in this book, readers can extend the theory of screws into the kinematics of optional order of rigid-bodies. Illustrative examples and exercises to reinforce learning are provided. Of particular note, the kinematics of emblematic parallel manipulators, such as the Delta robot as well as the original Gough and Stewart platforms are revisited applying, in addition to the theory of screws, new methods devoted to simplify the corresponding forward-displacement analysis, a challenging task for most parallel manipulators. Stands as the only book devoted to the acceleration, jerk and hyper-jerk (snap) analyses of rigid-body by means of screw theory; Provides new strategies to simplify the forward kinematic...

  12. Research in particle theory

    International Nuclear Information System (INIS)

    Mansouri, F.; Suranyi, P.; Wijewardhana, L.C.R.

    1992-10-01

    Dynamics of 2+1 dimensional gravity is analyzed by coupling matter to the Chern-Simons-Witten action in two ways and obtaining the exact gravity Hamiltonian for each case. 't Hooft's Hamiltonian is obtained as an approximation. The notion of space-time emerges at the very end as a broken phase of the gauge theory. We have studied the patterns of discrete and continuous symmetry breaking in 2+1 dimensional field theories. We formulate our analysis in terms of effective composite scalar field theories. Point-like sources in the Chern-Simons theory of gravity in 2+1 dimensions are described by their Poincaré charges. We have obtained exact solutions of the constraints of Chern-Simons theory with an arbitrary number of isolated point sources in relative motion. We then showed how the space-time metric is constructed. A reorganized perturbation expansion with a propagator of soft infrared behavior has been used to study the critical behavior of the mass gap. The condition of relativistic covariance fixes the form of the soft propagator. Approximants to the correlation critical exponent were obtained at two-loop order for the two- and three-dimensional theories. We have proposed a new model of QED exhibiting two phases and a Majorana mass spectrum of single-particle states. The model has a new source of coupling constant renormalization which opposes screening and suggests the model may confine. Assuming that the bound states of e+e- essentially obey a Majorana spectrum, we obtained a consistent fit of the GSI peaks as well as predictions of new peaks and their spin assignments.

  13. Theory analysis of the Dental Hygiene Human Needs Conceptual Model.

    Science.gov (United States)

    MacDonald, L; Bowen, D M

    2017-11-01

    Theories provide a structural knowing about concept relationships, practice intricacies, and intuitions and thus shape the distinct body of the profession. Capturing ways of knowing and being is essential to any profession's practice, education and research. This process defines the phenomenon of the profession - its existence or experience. Theory evaluation is a systematic criterion-based assessment of a specific theory. This study presents a theory analysis of the Dental Hygiene Human Needs Conceptual Model (DH HNCM). Using the Walker and Avant Theory Analysis, a seven-step process, the DH HNCM was analysed and evaluated for its meaningfulness and contribution to dental hygiene. The steps include the following: (i) investigate the origins; (ii) examine relationships of the theory's concepts; (iii) assess the logic of the theory's structure; (iv) consider the usefulness to practice; (v) judge the generalizability; (vi) evaluate the parsimony; and (vii) appraise the testability of the theory. Human needs theory in nursing and Maslow's Hierarchy of Needs Theory prompted this theory's development. The DH HNCM depicts four concepts based on the paradigm concepts of the profession: client, health/oral health, environment and dental hygiene actions, and includes eleven validated human needs that evolved over time to eight. It is logical and parsimonious, allows scientific prediction and testing, and provides a unique lens for the dental hygiene practitioner. With this model, dental hygienists have entered practice knowing they enable clients to meet their human needs. For the DH HNCM, theory analysis affirmed that the model is reasonable and insightful and adds to the dental hygiene profession's epistemology and ontology. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  14. Network analysis and synthesis a modern systems theory approach

    CERN Document Server

    Anderson, Brian D O

    2006-01-01

    Geared toward upper-level undergraduates and graduate students, this book offers a comprehensive look at linear network analysis and synthesis. It explores state-space synthesis as well as analysis, employing modern systems theory to unite the classical concepts of network theory. The authors stress passive networks but include material on active networks. They avoid topology in dealing with analysis problems and discuss computational techniques. The concepts of controllability, observability, and degree are emphasized in reviewing the state-variable description of linear systems. Explorations

  15. Kepler's theory of force and his medical sources.

    Science.gov (United States)

    Regier, Jonathan

    2014-01-01

    Johannes Kepler (1571-1630) makes extensive use of souls and spiritus in his natural philosophy. Recent studies have highlighted their importance in his accounts of celestial generation and astrology. In this study, I would like to address two pressing issues. The first is Kepler's context. The biological side of his natural philosophy is not naively Aristotelian. Instead, he is up to date with contemporary discussions in medically flavored natural philosophy. I will examine his relationship to Melanchthon's anatomical-theological Liber de anima (1552) and to Jean Fernel's very popular Physiologia (1567), two Galenic sources with a noticeable impact on how he understands the functions of life. The other issue that will direct my article is force at a distance. Medical ideas deeply inform Kepler's theories of light and solar force (virtus motrix). It will become clear that they are not a hindrance even to the hardcore of his celestial physics. Instead, he makes use of soul and spiritus in order to develop a fully mathematized dynamics.

  16. Recognition memory, self-other source memory, and theory-of-mind in children with autism spectrum disorder.

    Science.gov (United States)

    Lind, Sophie E; Bowler, Dermot M

    2009-09-01

    This study investigated semantic and episodic memory in autism spectrum disorder (ASD), using a task which assessed recognition and self-other source memory. Children with ASD showed undiminished recognition memory but significantly diminished source memory, relative to age- and verbal ability-matched comparison children. Both children with and without ASD showed an "enactment effect", demonstrating significantly better recognition and source memory for self-performed actions than other-person-performed actions. Within the comparison group, theory-of-mind (ToM) task performance was significantly correlated with source memory, specifically for other-person-performed actions (after statistically controlling for verbal ability). Within the ASD group, ToM task performance was not significantly correlated with source memory (after controlling for verbal ability). Possible explanations for these relations between source memory and ToM are considered.

  17. Analysis and evaluation of the moral distress theory.

    Science.gov (United States)

    Wilson, Melissa A

    2018-04-01

    Moral distress is a pervasive problem in nursing resulting in a detriment to patient care, providers, and organizations. Over a decade ago, the moral distress theory (MDT) was proposed and utilized in multiple research studies. This middle range theory explains and predicts the distress that occurs in a nurse because of moral conflict. The research findings born from this theory have been substantial. Since inception of this theory, moral distress has been extensively examined which has further elaborated its understanding. This paper provides an analysis and evaluation of the MDT according to applicable guidelines. Current understanding of the phenomenon indicates that a new theory may be warranted to better predict, treat, and manage moral distress. © 2017 Wiley Periodicals, Inc.

  18. Spectral theory and nonlinear analysis with applications to spatial ecology

    CERN Document Server

    Cano-Casanova, S; Mora-Corral , C

    2005-01-01

    This volume details some of the latest advances in spectral theory and nonlinear analysis through various cutting-edge theories on algebraic multiplicities, global bifurcation theory, non-linear Schrödinger equations, non-linear boundary value problems, large solutions, metasolutions, dynamical systems, and applications to spatial ecology. The main scope of the book is bringing together a series of topics that have evolved separately during the last decades around the common denominator of spectral theory and nonlinear analysis - from the most abstract developments up to the most concrete applications to population dynamics and socio-biology - in an effort to fill the existing gaps between these fields.

  19. Concept of spatial channel theory applied to reactor shielding analysis

    International Nuclear Information System (INIS)

    Williams, M.L.; Engle, W.W. Jr.

    1977-01-01

    The concept of channel theory is used to locate spatial regions that are important in contributing to a shielding response. The method is analogous to the channel-theory method developed for ascertaining important energy channels in cross-section analysis. The mathematical basis for the theory is shown to be the generalized reciprocity relation, and sample problems are given to exhibit and verify properties predicted by the mathematical equations. A practical example is cited from the shielding analysis of the Fast Flux Test Facility performed at Oak Ridge National Laboratory, in which a perspective plot of channel-theory results was found useful in locating streaming paths around the reactor cavity shield

  20. Accounting for uncertain fault geometry in earthquake source inversions - I: theory and simplified application

    Science.gov (United States)

    Ragon, Théa; Sladen, Anthony; Simons, Mark

    2018-05-01

    The ill-posed nature of earthquake source estimation derives from several factors including the quality and quantity of available observations and the fidelity of our forward theory. Observational errors are usually accounted for in the inversion process. Epistemic errors, which stem from our simplified description of the forward problem, are rarely dealt with despite their potential to bias the estimate of a source model. In this study, we explore the impact of uncertainties related to the choice of a fault geometry in source inversion problems. The geometry of a fault structure is generally reduced to a set of parameters, such as position, strike and dip, for one or a few planar fault segments. While some of these parameters can be solved for, more often they are fixed to an uncertain value. We propose a practical framework to address this limitation by following a previously implemented method exploring the impact of uncertainties on the elastic properties of our models. We develop a sensitivity analysis to small perturbations of fault dip and position. The uncertainties in fault geometry are included in the inverse problem under the formulation of the misfit covariance matrix that combines both prediction and observation uncertainties. We validate this approach with the simplified case of a fault that extends infinitely along strike, using both Bayesian and optimization formulations of a static inversion. If epistemic errors are ignored, predictions are overconfident in the data and source parameters are not reliably estimated. In contrast, inclusion of uncertainties in fault geometry allows us to infer a robust posterior source model. Epistemic uncertainties can be many orders of magnitude larger than observational errors for great earthquakes (Mw > 8). Not accounting for uncertainties in fault geometry may partly explain observed shallow slip deficits for continental earthquakes. Similarly, ignoring the impact of epistemic errors can also bias estimates of
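
    In notation of our choosing (paraphrasing the abstract's description rather than quoting the paper), the construction amounts to augmenting the misfit covariance with a prediction term built from the sensitivity of the forward model g to the uncertain geometry parameters:

    ```latex
    C_{\chi} = C_{d} + C_{p},
    \qquad
    C_{p} \approx K_{\mu}\, C_{\mu}\, K_{\mu}^{\top},
    \qquad
    K_{\mu} = \frac{\partial g(\mathbf{m}, \boldsymbol{\mu})}{\partial \boldsymbol{\mu}},
    ```

    where C_d is the observational covariance, μ collects the fixed but uncertain fault dip and position with prior covariance C_μ, and m holds the source parameters being inverted.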

  1. Analysis of open source GIS software

    OpenAIRE

    Božnis, Andrius

    2006-01-01

    GIS is one of the most promising spheres of information technology. GIS combines digital image analysis with database systems, which makes GIS widely applicable and demanding of very high skills. There is a lot of commercial GIS software which is well advertised and whose functionality is fairly well known, while open source software tends to be overlooked. In this diploma work an analysis is made of the open source GIS software available on the Internet, in the scope of different projects interr...

  2. Blind source separation dependent component analysis

    CERN Document Server

    Xiang, Yong; Yang, Zuyuan

    2015-01-01

    This book provides readers a complete and self-contained set of knowledge about dependent source separation, including the latest development in this field. The book gives an overview on blind source separation where three promising blind separation techniques that can tackle mutually correlated sources are presented. The book further focuses on the non-negativity based methods, the time-frequency analysis based methods, and the pre-coding based methods, respectively.

  3. CO_2 volatility impact on energy portfolio choice: A fully stochastic LCOE theory analysis

    International Nuclear Information System (INIS)

    Lucheroni, Carlo; Mari, Carlo

    2017-01-01

    Highlights: • Stochastic LCOE theory is an extension of the levelized cost of electricity analysis. • The fully stochastic analysis includes stochastic processes for fossil fuel prices and CO_2 prices. • The nuclear asset is risky through uncertainty about construction times and is used as a hedge. • The volatility of CO_2 prices has a strong influence on CO_2 emissions reduction. - Abstract: Market-based pricing of CO_2 was designed to control CO_2 emissions by means of the price level, since high CO_2 price levels discourage emissions. In this paper, it will be shown that the level of uncertainty in CO_2 market prices, i.e. the volatility of CO_2 prices itself, has a strong influence not only on generation portfolio risk management but also on CO_2 emissions abatement. A reduction of emissions can be obtained when rational power generation capacity investors decide that the capacity expansion cost risk induced jointly by CO_2 volatility and fossil fuel price volatility can be efficiently hedged by adding to otherwise fossil-fuel portfolios some nuclear power as a carbon-free asset. This intriguing effect will be discussed using a recently introduced economic analysis tool, called stochastic LCOE theory. The stochastic LCOE theory used here was designed to investigate diversification effects on energy portfolios. In previous papers this theory was used to study diversification effects on portfolios composed of carbon-risky fossil technologies and a carbon risk-free nuclear technology in a risk-reward trade-off frame. In this paper the stochastic LCOE theory will be extended to include uncertainty about nuclear power plant construction times, i.e. treating nuclear as risky as well, this being the main source of financial risk in nuclear technology. Two measures of risk will be used, standard deviation and CVaR deviation, to derive efficient frontiers for generation portfolios. Frontier portfolios will be analyzed for their implications for emissions
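
    A minimal sketch of the two risk measures named in the abstract, computed from Monte Carlo LCOE scenarios; the scenario generator and technology mix below are placeholders, not the authors' stochastic LCOE model.

    ```python
    import numpy as np

    def portfolio_risk(lcoe_samples, weights, alpha=0.95):
        """Mean, standard deviation and CVaR deviation of a generation mix.

        lcoe_samples: (n_scenarios, n_technologies) simulated LCOE outcomes,
        e.g. driven by stochastic fossil-fuel and CO2 price paths.
        """
        cost = lcoe_samples @ np.asarray(weights)
        mean = cost.mean()
        tail = cost[cost >= np.quantile(cost, alpha)]    # worst (1 - alpha) tail
        return mean, cost.std(), tail.mean() - mean      # CVaR deviation = CVaR - mean

    # Placeholder scenarios for a gas/coal/nuclear mix ($/MWh)
    rng = np.random.default_rng(0)
    samples = rng.lognormal(mean=4.2, sigma=0.25, size=(10_000, 3))
    print(portfolio_risk(samples, weights=[0.4, 0.3, 0.3]))
    ```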

  4. Evolutionary Game Theory Analysis of Tumor Progression

    Science.gov (United States)

    Wu, Amy; Liao, David; Sturm, James; Austin, Robert

    2014-03-01

    Evolutionary game theory applied to two interacting cell populations can yield quantitative predictions of the future densities of the two cell populations based on the initial interaction terms. We will discuss how, in a complex ecology, evolutionary game theory successfully predicts the future densities of strains of stromal and cancer cells (multiple myeloma), and discuss the possible clinical use of such analysis for predicting cancer progression. Supported by the National Science Foundation and the National Cancer Institute.
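
    The quantitative prediction described here is typically carried by the replicator equation; below is a sketch with an invented 2x2 payoff matrix for two interacting cell populations (illustrative parameters only, not the authors' fitted values).

    ```python
    import numpy as np

    def replicator_step(x, A, dt=0.01):
        """One Euler step of dx_i/dt = x_i * ((A x)_i - x . A x) on the simplex."""
        f = A @ x                       # fitness of each population
        return x + dt * x * (f - x @ f)

    # Hypothetical payoffs for stromal (row 0) vs. myeloma (row 1) cells
    A = np.array([[1.0, 0.5],
                  [1.2, 0.8]])
    x = np.array([0.7, 0.3])            # initial population fractions
    for _ in range(2000):
        x = replicator_step(x, A)
    print(x)                            # predicted long-run densities
    ```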

  5. Salinas : theory manual.

    Energy Technology Data Exchange (ETDEWEB)

    Walsh, Timothy Francis; Reese, Garth M.; Bhardwaj, Manoj Kumar

    2011-11-01

    Salinas provides a massively parallel implementation of structural dynamics finite element analysis, required for high fidelity, validated models used in modal, vibration, static and shock analysis of structural systems. This manual describes the theory behind many of the constructs in Salinas. For a more detailed description of how to use Salinas, we refer the reader to Salinas, User's Notes. Many of the constructs in Salinas are pulled directly from published material. Where possible, these materials are referenced herein. However, certain functions in Salinas are specific to our implementation. We try to be far more complete in those areas. The theory manual was developed from several sources including general notes, a programmer notes manual, the user's notes and of course the material in the open literature.

  6. Exact-to-precision generalized perturbation theory for source-driven systems

    International Nuclear Information System (INIS)

    Wang Congjian; Abdel-Khalik, Hany S.

    2011-01-01

    Highlights: ► We present a new development in higher order generalized perturbation theory. ► The method addresses the explosion in the flux phase space, input parameters, and responses. ► The method hybridizes first-order GPT and proper orthogonal decomposition snapshots method. ► A simplified 1D and realistic 2D assembly models demonstrate applicability of the method. ► The accuracy of the method is compared to exact direct perturbations and first-order GPT. - Abstract: Presented in this manuscript are new developments to perturbation theory which are intended to extend its applicability to estimate, with quantifiable accuracy, the exact variations in all responses calculated by the model with respect to all possible perturbations in the model's input parameters. The new developments place high premium on reducing the associated computational overhead in order to enable the use of perturbation theory in routine reactor design calculations. By way of examples, these developments could be employed in core simulation to accurately estimate the few-group cross-sections variations resulting from perturbations in neutronics and thermal-hydraulics core conditions. These variations are currently being described using a look-up table approach, where thousands of assembly calculations are performed to capture few-group cross-sections variations for the downstream core calculations. Other applications include the efficient evaluation of surrogates for applications that require repeated model runs such as design optimization, inverse studies, uncertainty quantification, and online core monitoring. The theoretical background of these developments applied to source-driven systems and supporting numerical experiments are presented in this manuscript. Extension to eigenvalue problems will be presented in a future article.
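
    The proper orthogonal decomposition (POD) snapshot ingredient named in the highlights can be sketched as follows: a reduced basis is extracted from the singular value decomposition of a snapshot matrix and reused to compress new flux solutions. The dimensions, random data, and 99.9% energy cutoff are illustrative assumptions, not the paper's settings.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical snapshots: flux solutions for randomly perturbed inputs
    # (names and sizes are illustrative only).
    n_cells, n_snapshots = 500, 40
    snapshots = rng.standard_normal((n_cells, n_snapshots)).cumsum(axis=0)

    # POD: rank-r orthonormal basis from the snapshot SVD, keeping 99.9% of
    # the singular-value energy.
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    r = np.searchsorted((s**2).cumsum() / (s**2).sum(), 0.999) + 1
    basis = U[:, :r]

    # Any new perturbed flux is approximated in the reduced subspace.
    phi_new = snapshots @ rng.random(n_snapshots)
    phi_reduced = basis @ (basis.T @ phi_new)
    print(r, np.linalg.norm(phi_new - phi_reduced) / np.linalg.norm(phi_new))
    ```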

  7. Theory of the detection of the field surrounding half-dressed sources

    International Nuclear Information System (INIS)

    Compagno, G.; Passante, R.; Persico, F.

    1988-01-01

    Half-dressed sources are defined as sources deprived partially or totally of the cloud of virtual quanta which surrounds them in the ground state of the total system. Two models of a half-dressed point source S are considered, the first in the framework of the theory of massive scalar fields and the second in quantum electrodynamics (QED). In both cases the detector is modeled by a second fully dressed source T of the same field, which is also bound to an oscillation center by harmonic forces. It is shown that when S at time t = 0 is suddenly coupled to or decoupled from the field, the detector T, which is initially at rest, is set in motion after a time t = R_0/c, where R_0 is the S-T distance. Neglecting the back-reaction on the field due to the oscillatory motion of T, the amplitude of oscillation for t = ∞ is obtained as a function of R_0. Thus the time-varying virtual field of S is shown to be capable of exerting a force which excites the model detector. For the QED case, this force is related to the properties of the energy density of the virtual field. This energy density displays a singularity at r = ct, and the mathematical nature of this singularity is studied in detail. In this way it is shown that the energy density of the time-dependent virtual field is rather different from that of a pulse of radiation emitted by a source during energy-conserving processes. The differences are discussed in detail, as well as the limitations of the model

  8. Estimation of distance error by fuzzy set theory required for strength determination of HDR (192)Ir brachytherapy sources.

    Science.gov (United States)

    Kumar, Sudhir; Datta, D; Sharma, S D; Chourasiya, G; Babu, D A R; Sharma, D N

    2014-04-01

    Verification of the strength of high dose rate (HDR) (192)Ir brachytherapy sources on receipt from the vendor is an important component of an institutional quality assurance program. Either reference air-kerma rate (RAKR) or air-kerma strength (AKS) is the recommended quantity to specify the strength of gamma-emitting brachytherapy sources. The use of a Farmer-type cylindrical ionization chamber of sensitive volume 0.6 cm(3) is one of the recommended methods for measuring the RAKR of HDR (192)Ir brachytherapy sources. While using the cylindrical chamber method, it is required to determine the positioning error of the ionization chamber with respect to the source, which is called the distance error. An attempt has been made to apply fuzzy set theory to estimate the subjective uncertainty associated with the distance error. A simplified approach of applying fuzzy set theory has been proposed for the quantification of the uncertainty associated with the distance error. In order to express the uncertainty in the framework of fuzzy sets, the uncertainty index was estimated and was found to be within 2.5%, which further indicates that the possibility of error in measuring such distance may be of this order. It is observed that the relative distance l_i estimated by the analytical method and by the fuzzy set theoretic approach are consistent with each other. The crisp values of l_i estimated using the analytical method lie within the bounds computed using fuzzy set theory. This indicates that l_i values estimated using analytical methods are within 2.5% uncertainty. This value of uncertainty in distance measurement should be incorporated in the uncertainty budget, while estimating the expanded uncertainty in HDR (192)Ir source strength measurement.
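
    A minimal sketch of the kind of triangular-fuzzy-number computation presumably involved; the nominal distance and spread below are hypothetical, chosen only so that the support half-width works out to the 2.5% order quoted in the abstract.

    ```python
    import numpy as np

    # Illustrative triangular fuzzy number for the chamber-to-source distance
    # (cm); the values are hypothetical, not the paper's measurements.
    a, m, b = 9.75, 10.0, 10.25   # lower bound, modal value, upper bound

    def alpha_cut(alpha):
        """Interval of distances whose membership is at least alpha."""
        return a + alpha * (m - a), b - alpha * (b - m)

    for alpha in (0.0, 0.5, 1.0):
        lo, hi = alpha_cut(alpha)
        print(f"alpha={alpha:.1f}: [{lo:.3f}, {hi:.3f}] cm")

    # A crude uncertainty index: half-width of the support relative to the mode.
    print("uncertainty index ~", (b - a) / (2 * m) * 100, "%")   # 2.5 here
    ```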

  9. Adventures in graph theory

    CERN Document Server

    Joyner, W David

    2017-01-01

    This textbook acts as a pathway to higher mathematics by seeking and illuminating the connections between graph theory and diverse fields of mathematics, such as calculus on manifolds, group theory, algebraic curves, Fourier analysis, cryptography and other areas of combinatorics. An overview of graph theory definitions and polynomial invariants for graphs prepares the reader for the subsequent dive into the applications of graph theory. To pique the reader’s interest in areas of possible exploration, recent results in mathematics appear throughout the book, accompanied with examples of related graphs, how they arise, and what their valuable uses are. The consequences of graph theory covered by the authors are complicated and far-reaching, so topics are always exhibited in a user-friendly manner with copious graphs, exercises, and Sage code for the computation of equations. Samples of the book’s source code can be found at github.com/springer-math/adventures-in-graph-theory. The text is geared towards ad...

  10. Natural disaster risk analysis for critical infrastructure systems: An approach based on statistical learning theory

    International Nuclear Information System (INIS)

    Guikema, Seth D.

    2009-01-01

    Probabilistic risk analysis has historically been developed for situations in which measured data about the overall reliability of a system are limited and expert knowledge is the best source of information available. There continue to be a number of important problem areas characterized by a lack of hard data. However, in other important problem areas the emergence of information technology has transformed the situation from one characterized by little data to one characterized by data overabundance. Natural disaster risk assessments for events impacting large-scale, critical infrastructure systems such as electric power distribution systems, transportation systems, water supply systems, and natural gas supply systems are important examples of problems characterized by data overabundance. There are often substantial amounts of information collected and archived about the behavior of these systems over time. Yet it can be difficult to effectively utilize these large data sets for risk assessment. Using this information for estimating the probability or consequences of system failure requires a different approach and analysis paradigm than risk analysis for data-poor systems does. Statistical learning theory, a diverse set of methods designed to draw inferences from large, complex data sets, can provide a basis for risk analysis for data-rich systems. This paper provides an overview of statistical learning theory methods and discusses their potential for greater use in risk analysis

  11. Antieigenvalue analysis for continuum mechanics, economics, and number theory

    Directory of Open Access Journals (Sweden)

    Gustafson Karl

    2016-01-01

    My recent book Antieigenvalue Analysis, World Scientific, 2012, presented the theory of antieigenvalues from its inception in 1966 up to 2010, and its applications within those forty-five years to Numerical Analysis, Wavelets, Statistics, Quantum Mechanics, Finance, and Optimization. Here I am able to offer three further areas of application: Continuum Mechanics, Economics, and Number Theory. In particular, the critical angle of repose in a continuum model of granular materials is shown to be exactly my matrix maximum turning angle of the stress tensor of the material. The important Sharpe ratio of the Capital Asset Pricing Model is now seen in terms of my antieigenvalue theory. Euclid’s Formula for Pythagorean triples becomes a special case of my operator trigonometry.
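
    For orientation, the first antieigenvalue of a symmetric positive-definite matrix has the closed form μ_1 = 2√(λ_min λ_max)/(λ_min + λ_max), and arccos μ_1 is the maximum turning angle referred to above. A quick numerical check (not from the book):

    ```python
    import numpy as np

    # For A = diag(1, 4, 9): mu_1 = 2*sqrt(1*9)/(1+9) = 0.6.
    A = np.diag([1.0, 4.0, 9.0])
    lmin, lmax = 1.0, 9.0
    mu1 = 2 * np.sqrt(lmin * lmax) / (lmin + lmax)

    # Brute-force check: the smallest cosine of the angle between x and Ax
    # over many random directions approaches mu_1 from above.
    rng = np.random.default_rng(2)
    x = rng.standard_normal((3, 200_000))
    x /= np.linalg.norm(x, axis=0)
    Ax = A @ x
    cosines = (x * Ax).sum(axis=0) / np.linalg.norm(Ax, axis=0)
    print(mu1, cosines.min())   # both ~0.6
    ```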

  12. Dimensional analysis and group theory in astrophysics

    CERN Document Server

    Kurth, Rudolf

    2013-01-01

    Dimensional Analysis and Group Theory in Astrophysics describes how dimensional analysis, refined by mathematical regularity hypotheses, can be applied to purely qualitative physical assumptions. The book focuses on the continuous spectra of the stars and the mass-luminosity relationship. The text discusses the technique of dimensional analysis, covering both relativistic phenomena and stellar systems. The book also explains the fundamental conclusion of dimensional analysis, wherein the unknown functions shall be given certain specified forms. The Wien and Stefan-Boltzmann Laws can be si

  13. A fundamental study of ''contributon'' transport theory and channel theory applications

    International Nuclear Information System (INIS)

    Williams, M.L.

    1992-01-01

    The objective of this three-year study is to develop a technique called ''channel theory'' that can be used in interpreting particle transport analyses of the kind frequently required in radiation shielding design and assessment. Channel theory is a technique used to provide insight into the mechanisms by which particles emitted from a source are transported through a complex system and register a response on some detector. It is based on the behavior of a pseudo particle called a ''contributon,'' which is the response carrier through space and energy channels that connect the source and detector. ''Contributons'' are those particles among all the ones contained in the system which will eventually contribute some amount of response to the detector. The specific goals of this project are to provide a more fundamental theoretical understanding of the method, and to develop computer programs to apply the techniques to practical problems encountered in radiation transport analysis. The overall project can be divided into three components to meet these objectives: (a) Theoretical Development, (b) Code Development, and (c) Sample Applications. During the present third year of this study, an application of contributon theory to the analysis of radiation heating in a nuclear rocket has been completed, and a paper on the assessment of radiation damage response of an LWR pressure vessel and analysis of radiation propagation through space and energy channels in air at the Hiroshima weapon burst was accepted for publication. A major effort was devoted to developing a new ''Contributon Monte Carlo'' method, which can improve the efficiency of Monte Carlo calculations of radiation transport by tracking only contributons. The theoretical basis for Contributon Monte Carlo has been completed, and the implementation and testing of the technique are presently being performed
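
    The contributon construction can be summarized by two standard channel-theory relations, sketched here in generic notation rather than quoted from the report:

    ```latex
    % Response R expressed through either the forward flux \phi (with detector
    % cross section \Sigma_d) or the adjoint flux \phi^\dagger (with source S):
    \begin{equation}
      R = \int \Sigma_d\,\phi \;\mathrm{d}V\,\mathrm{d}E\,\mathrm{d}\Omega
        = \int S\,\phi^\dagger \;\mathrm{d}V\,\mathrm{d}E\,\mathrm{d}\Omega .
    \end{equation}
    % The contributon flux is the pointwise product of forward and adjoint
    % fluxes: the density of response carriers in the space-energy channels
    % connecting source and detector.
    \begin{equation}
      \psi_c(\mathbf{r},E,\boldsymbol{\Omega})
        = \phi(\mathbf{r},E,\boldsymbol{\Omega})\,
          \phi^\dagger(\mathbf{r},E,\boldsymbol{\Omega}) .
    \end{equation}
    ```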

  14. Non-equilibrium thermodynamics theory of econometric source discovery for large data analysis

    Science.gov (United States)

    van Bergem, Rutger; Jenkins, Jeffrey; Benachenhou, Dalila; Szu, Harold

    2014-05-01

    Almost all consumer and firm transactions are carried out using computers, which gives rise to increasingly large amounts of data available to analysts. The gold standard in Economic data manipulation techniques matured during a period of limited data access, and the new Large Data Analysis (LDA) paradigm we all face may quickly obfuscate most tools used by Economists. When coupled with an increased availability of numerous unstructured, multi-modal data sets, the impending 'data tsunami' could have serious detrimental effects for Economic forecasting, analysis, and research in general. Given this reality we propose a decision-aid framework for Augmented-LDA (A-LDA) - a synergistic approach to LDA which combines traditional supervised, rule-based Machine Learning (ML) strategies to iteratively uncover hidden sources in large data, artificial neural network (ANN) Unsupervised Learning (USL) strategies operating at the minimum Helmholtz free energy of isothermal dynamic equilibrium, and the Economic intuitions required to handle problems encountered when interpreting large amounts of Financial or Economic data. To make the ANN USL framework applicable to economics we define the temperature, entropy, and energy concepts in Economics from the Boltzmann viewpoint of non-equilibrium molecular thermodynamics, as well as defining an information geometry on which the ANN can operate using USL to reduce information saturation. An exemplar of such a system representation is given for firm industry equilibrium. We demonstrate the traditional ML methodology in the economics context and leverage firm financial data to explore a frontier concept known as behavioral heterogeneity. Behavioral heterogeneity on the firm level can be imagined as a firm's interactions with different types of Economic entities over time. These interactions could impose varying degrees of institutional constraints on a firm's business behavior. We specifically look at behavioral heterogeneity for firms

  15. Compositional Data Analysis Theory and Applications

    CERN Document Server

    Pawlowsky-Glahn, Vera

    2011-01-01

    This book presents the state of the art in compositional data analysis and features a collection of papers covering theory, applications to various fields of science, and software. Areas covered include geology, biology, environmental sciences, forensic sciences, medicine and hydrology. Key features: • Provides a state-of-the-art text in compositional data analysis • Covers a variety of subject areas, from geology to medicine • Written by leading researchers in the field • Supported by a website featuring R code

  16. Polar source analysis : technical memorandum

    Science.gov (United States)

    2017-09-29

    The following technical memorandum describes the development, testing and analysis of various polar source data sets. The memorandum also includes recommendations for potential inclusion in future releases of AEDT. This memorandum is the final deliver...

  17. Fixed point theory, variational analysis, and optimization

    CERN Document Server

    Al-Mezel, Saleh Abdullah R; Ansari, Qamrul Hasan

    2015-01-01

    ""There is a real need for this book. It is useful for people who work in areas of nonlinear analysis, optimization theory, variational inequalities, and mathematical economics.""-Nan-Jing Huang, Sichuan University, Chengdu, People's Republic of China

  18. Theory for source-responsive and free-surface film modeling of unsaturated flow

    Science.gov (United States)

    Nimmo, J.R.

    2010-01-01

    A new model explicitly incorporates the possibility of rapid response, across significant distance, to substantial water input. It is useful for unsaturated flow processes that are not inherently diffusive, or that do not progress through a series of equilibrium states. The term source-responsive is used to mean that flow responds sensitively to changing conditions at the source of water input (e.g., rainfall, irrigation, or ponded infiltration). The domain of preferential flow can be conceptualized as laminar flow in free-surface films along the walls of pores. These films may be considered to have uniform thickness, as suggested by field evidence that preferential flow moves at an approximately uniform rate when generated by a continuous and ample water supply. An effective facial area per unit volume quantitatively characterizes the medium with respect to source-responsive flow. A flow-intensity factor dependent on conditions within the medium represents the amount of source-responsive flow at a given time and position. Laminar flow theory provides relations for the velocity and thickness of flowing source-responsive films. Combination with the Darcy-Buckingham law and the continuity equation leads to expressions for both fluxes and dynamic water contents. Where preferential flow is sometimes or always significant, the interactive combination of source-responsive and diffuse flow has the potential to improve prediction of unsaturated-zone fluxes in response to hydraulic inputs and the evolving distribution of soil moisture. Examples for which this approach is efficient and physically plausible include (i) rainstorm-generated rapid fluctuations of a deep water table and (ii) space- and time-dependent soil water content response to infiltration in a macroporous soil. © Soil Science Society of America.
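
    The laminar-film relations invoked above are the classical ones; a sketch in generic notation (M and f follow the abstract's verbal definitions, the rest is standard free-surface film theory):

    ```latex
    % Mean velocity and flux per unit width of a laminar free-surface film of
    % uniform thickness \delta on a pore wall (density \rho, viscosity \mu):
    \begin{equation}
      \bar{v} = \frac{\rho g \delta^{2}}{3\mu}, \qquad
      q = \bar{v}\,\delta = \frac{\rho g \delta^{3}}{3\mu} .
    \end{equation}
    % With M the effective facial area per unit volume and f the
    % flow-intensity factor, the source-responsive flux density is of order
    \begin{equation}
      q_{\mathrm{sr}} \approx f\,M\,q .
    \end{equation}
    ```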

  19. Correspondence analysis theory, practice and new strategies

    CERN Document Server

    Beh, Eric J

    2014-01-01

    A comprehensive overview of the internationalisation of correspondence analysis Correspondence Analysis: Theory, Practice and New Strategies examines the key issues of correspondence analysis, and discusses the new advances that have been made over the last 20 years. The main focus of this book is to provide a comprehensive discussion of some of the key technical and practical aspects of correspondence analysis, and to demonstrate how they may be put to use.  Particular attention is given to the history and mathematical links of the developments made. These links include not just those majo

  20. Blind source separation advances in theory, algorithms and applications

    CERN Document Server

    Wang, Wenwu

    2014-01-01

    Blind Source Separation reports new results from efforts on the study of Blind Source Separation (BSS). The book collects novel research ideas and some training in BSS, independent component analysis (ICA), artificial intelligence and signal processing applications. Furthermore, research results previously scattered across many journals and conferences worldwide are methodically edited and presented in a unified form. The book is likely to be of interest to university researchers, R&D engineers and graduate students in computer science and electronics who wish to learn the core principles, methods, algorithms, and applications of BSS. Dr. Ganesh R. Naik works at University of Technology, Sydney, Australia; Dr. Wenwu Wang works at University of Surrey, UK.
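
    As a minimal illustration of the BSS problem the book addresses, the sketch below unmixes two synthetic signals with FastICA from scikit-learn; the signals and mixing matrix are invented for the example.

    ```python
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(3)
    t = np.linspace(0, 8, 2000)
    S = np.c_[np.sin(3 * t), np.sign(np.sin(5 * t))]   # independent sources
    A = np.array([[1.0, 0.5], [0.4, 1.0]])             # unknown mixing matrix
    X = S @ A.T                                        # observed mixtures

    ica = FastICA(n_components=2, random_state=0)
    S_hat = ica.fit_transform(X)                       # recovered sources

    # Each true source should match one recovered component up to
    # permutation, sign, and scale (the inherent BSS ambiguities).
    c = np.abs(np.corrcoef(S.T, S_hat.T)[:2, 2:])
    print(c.max(axis=1))                               # both close to 1
    ```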

  1. Do violations of the axioms of expected utility theory threaten decision analysis?

    Science.gov (United States)

    Nease, R F

    1996-01-01

    Research demonstrates that people violate the independence principle of expected utility theory, raising the question of whether expected utility theory is normative for medical decision making. The author provides three arguments that violations of the independence principle are less problematic than they might first appear. First, the independence principle follows from other more fundamental axioms whose appeal may be more readily apparent than that of the independence principle. Second, the axioms need not be descriptive to be normative, and they need not be attractive to all decision makers for expected utility theory to be useful for some. Finally, by providing a metaphor of decision analysis as a conversation between the actual decision maker and a model decision maker, the author argues that expected utility theory need not be purely normative for decision analysis to be useful. In short, violations of the independence principle do not necessarily represent direct violations of the axioms of expected utility theory; behavioral violations of the axioms of expected utility theory do not necessarily imply that decision analysis is not normative; and full normativeness is not necessary for decision analysis to generate valuable insights.

  2. Complex space source theory of partially coherent light wave.

    Science.gov (United States)

    Seshadri, S R

    2010-07-01

    The complex space source theory is used to derive a general integral expression for the vector potential that generates the extended full Gaussian wave in terms of the input value of the vector potential of the corresponding paraxial beam. The vector potential and the fields are assumed to fluctuate on a time scale that is large compared to the wave period. The Poynting vector in the propagation direction averaged over a wave period is expressed in terms of the cross-spectral density of the fluctuating vector potential across the input plane. The Schell model is assumed for the cross-spectral density. The radiation intensity distribution and the power radiated are determined. The effect of spatial coherence on the radiation intensity distribution and the radiated power are investigated for different values of the physical parameters. Illustrative numerical results are provided to bring out the effect of spatial coherence on the propagation characteristics of the fluctuating light wave.
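
    The Schell-model assumption mentioned above takes the standard factorized form (generic notation, not necessarily the paper's symbols):

    ```latex
    % Cross-spectral density across the input plane: a spectral density S and
    % a degree of spatial coherence g depending only on the point separation.
    \begin{equation}
      W(\boldsymbol{\rho}_1,\boldsymbol{\rho}_2)
        = \sqrt{S(\boldsymbol{\rho}_1)}\,\sqrt{S(\boldsymbol{\rho}_2)}\;
          g(\boldsymbol{\rho}_2-\boldsymbol{\rho}_1) .
    \end{equation}
    ```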

  3. Complex analysis a modern first course in function theory

    CERN Document Server

    Muir, Jerry R

    2015-01-01

    A thorough introduction to the theory of complex functions emphasizing the beauty, power, and counterintuitive nature of the subject Written with a reader-friendly approach, Complex Analysis: A Modern First Course in Function Theory features a self-contained, concise development of the fundamental principles of complex analysis. After laying groundwork on complex numbers and the calculus and geometric mapping properties of functions of a complex variable, the author uses power series as a unifying theme to define and study the many rich and occasionally surprising properties of analytic fun

  4. Analysis of event tree with imprecise inputs by fuzzy set theory

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Chun, Moon Hyun

    1990-01-01

    A fuzzy set theory approach is proposed as a method to analyze event trees with imprecise or linguistic input variables such as 'likely' or 'improbable' instead of numerical probabilities. In this paper, it is shown how fuzzy set theory can be applied to event tree analysis. The result of this study shows that the fuzzy set theory approach can be applied as an acceptable and effective tool for analysis of event trees with fuzzy inputs. Comparisons of the fuzzy theory approach with the probabilistic approach of computing probabilities of final states of the event tree through subjective weighting factors and the LHS technique show that the two approaches have common factors and give reasonable results
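
    A minimal sketch of how linguistic branch probabilities can be propagated through one event-tree sequence with alpha-cut interval arithmetic; the triangular numbers below are illustrative, not the paper's assignments.

    ```python
    # Triangular fuzzy numbers (low, mode, high) for two linguistic labels.
    likely     = (0.7, 0.8, 0.9)
    improbable = (0.01, 0.05, 0.1)

    def cut(tfn, alpha):
        """Alpha-cut: interval of values with membership at least alpha."""
        a, m, b = tfn
        return a + alpha * (m - a), b - alpha * (b - m)

    for alpha in (0.0, 0.5, 1.0):
        lo1, hi1 = cut(likely, alpha)
        lo2, hi2 = cut(improbable, alpha)
        # Sequence probability = product of branch probabilities, computed
        # as an interval product at this membership level.
        print(f"alpha={alpha:.1f}: [{lo1 * lo2:.4f}, {hi1 * hi2:.4f}]")
    ```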

  5. Crime analysis using open source information

    DEFF Research Database (Denmark)

    Nizamani, Sarwat; Memon, Nasrullah; Shah, Azhar Ali

    2015-01-01

    In this paper, we present a method of crime analysis from open source information. We employed unsupervised methods of data mining to explore the facts regarding the crimes of an area of interest. The analysis is based on well-known clustering and association techniques. The results show

  6. An Introduction to Wavelet Theory and Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Miner, N.E.

    1998-10-01

    This report reviews the history, theory and mathematics of wavelet analysis. Examination of the Fourier Transform and Short-time Fourier Transform methods provides information about the evolution of the wavelet analysis technique. This overview is intended to provide readers with a basic understanding of wavelet analysis, define common wavelet terminology and describe wavelet analysis algorithms. The most common algorithms for performing efficient, discrete wavelet transforms for signal analysis and inverse discrete wavelet transforms for signal reconstruction are presented. This report is intended to be approachable by non-mathematicians, although a basic understanding of engineering mathematics is necessary.
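
    A minimal example of the analysis/synthesis transform pair the report describes, using the PyWavelets package (an assumed dependency, not one named by the report):

    ```python
    import numpy as np
    import pywt  # PyWavelets

    # Noisy test signal of dyadic length.
    t = np.linspace(0, 1, 1024)
    rng = np.random.default_rng(4)
    signal = np.sin(2 * np.pi * 5 * t) + 0.3 * rng.standard_normal(1024)

    coeffs = pywt.wavedec(signal, 'db4', level=4)   # forward DWT (analysis)
    reconstructed = pywt.waverec(coeffs, 'db4')     # inverse DWT (synthesis)
    print(np.allclose(signal, reconstructed))       # True: perfect reconstruction
    ```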

  7. Analysis of the neutrons dispersion in a semi-infinite medium based in transport theory and the Monte Carlo method

    International Nuclear Information System (INIS)

    Arreola V, G.; Vazquez R, R.; Guzman A, J. R.

    2012-10-01

    In this work a comparative analysis of results for neutron dispersion in a non-multiplying semi-infinite medium is presented. One of the boundaries of this medium is located at the origin of coordinates, where a neutron source in beam form (i.e., μ₀ = 1) is also placed. The neutron dispersion is studied with the statistical Monte Carlo method and through one-dimensional transport theory for one energy group. The application of transport theory gives a semi-analytic solution for this problem, while the statistical solution for the flux was obtained with the MCNPX code. Dispersion in light water and heavy water was studied. A first remarkable result is that both methods locate the maximum of the neutron distribution at less than two transport mean free paths for heavy water, while for light water it lies at less than ten transport mean free paths; the differences between the two methods are larger in the light water case. A second remarkable result is that the two distributions show similar behaviour at small numbers of mean free paths, while at large numbers of mean free paths the transport theory solution tends to an asymptotic value and the statistical solution tends to zero. The existence of a low-energy neutron current directed toward the source is demonstrated, opposite in sense to the high-energy neutron current coming from the source itself. (Author)
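
    A toy version of the statistical calculation is easy to sketch: one-group tracking of neutrons injected normally into a half-space with isotropic scattering. The cross sections and scattering ratio below are invented, not water data, and MCNPX is of course far more general.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    sigma_t, c = 1.0, 0.9          # total cross section (1/mfp), scattering ratio
    depths = []

    for _ in range(200_000):
        x, mu = 0.0, 1.0           # start at the surface, beam directed inward
        while True:
            x += mu * rng.exponential(1 / sigma_t)   # free flight to next collision
            if x < 0:              # leaked back out of the half-space
                break
            if rng.random() > c:   # absorbed: record the collision depth
                depths.append(x)
                break
            mu = 2 * rng.random() - 1                # isotropic scatter

    hist, edges = np.histogram(depths, bins=20, range=(0, 10))
    print(edges[np.argmax(hist)])  # depth bin (in mfp) where absorptions peak
    ```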

  8. An Inverse Kinematic Approach Using Groebner Basis Theory Applied to Gait Cycle Analysis

    Science.gov (United States)

    2013-03-01

    Master's thesis by Anum Barki, BS, Air Force Institute of Technology (AFIT-ENP-13-M-02). Approved: Dr. Ronald F. Tuttle (Chairman) and Dr. Kimberly Kendricks.

  9. Cultural-Historical Activity Theory and Domain Analysis: Metatheoretical Implications for Information Science

    Science.gov (United States)

    Wang, Lin

    2013-01-01

    Background: Cultural-historical activity theory is an important theory in modern psychology. In recent years, it has drawn more attention from related disciplines including information science. Argument: This paper argues that activity theory and domain analysis which uses the theory as one of its bases could bring about some important…

  10. Thermal convection at low Rayleigh number from concentrated sources in porous media

    International Nuclear Information System (INIS)

    Hickox, C.E.

    1980-01-01

    A simple mathematical theory is proposed for the analysis of natural convective motion, at low Rayleigh number, from a concentrated source of heat in a fluid-saturated porous medium. The theory consists of retaining only the leading terms of series expansions of the dependent variables in terms of the Rayleigh number, is thus linear, and is valid only in the limit of small Rayleigh number. Based on fundamental results for a variety of isolated sources, superposition is used to provide solutions for situations of practical interest. Special emphasis is given to the analysis of sub-seabed disposal of nuclear waste. 8 figures
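
    The expansion retained by the theory has the standard regular-perturbation form; the sketch below uses generic notation reconstructed from the abstract, not the paper's equations.

    ```latex
    % Dependent variables expanded in the Rayleigh number Ra; only leading
    % terms are retained, so the theory is linear and valid as Ra -> 0:
    \begin{equation}
      T = T_0 + \mathrm{Ra}\,T_1 + \mathcal{O}(\mathrm{Ra}^2), \qquad
      \psi = \mathrm{Ra}\,\psi_1 + \mathcal{O}(\mathrm{Ra}^2).
    \end{equation}
    % The leading temperature field is then pure conduction; for a point
    % source of strength Q in a medium of effective conductivity k,
    \begin{equation}
      T_0 - T_\infty = \frac{Q}{4\pi k r} .
    \end{equation}
    ```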

  11. Mokken scale analysis : Between the Guttman scale and parametric item response theory

    NARCIS (Netherlands)

    van Schuur, Wijbrandt H.

    2003-01-01

    This article introduces a model of ordinal unidimensional measurement known as Mokken scale analysis. Mokken scaling is based on principles of Item Response Theory (IRT) that originated in the Guttman scale. I compare the Mokken model with both Classical Test Theory (reliability or factor analysis)

  12. MHODE: a local-homogeneity theory for improved source-parameter estimation of potential fields

    Science.gov (United States)

    Fedi, Maurizio; Florio, Giovanni; Paoletti, Valeria

    2015-08-01

    We describe a multihomogeneity theory for source-parameter estimation of potential fields. Similar to what happens for random source models, where the monofractal scaling-law has been generalized into a multifractal law, we propose to generalize the homogeneity law into a multihomogeneity law. This allows a theoretically correct approach to study real-world potential fields, which are inhomogeneous and so do not show scale invariance, except in the asymptotic regions (very near to or very far from their sources). Since the scaling properties of inhomogeneous fields change with the scale of observation, we show that they may be better studied at a set of scales than at a single scale and that a multihomogeneous model is needed to explain their complex scaling behaviour. In order to perform this task, we first introduce fractional-degree homogeneous fields, to show that: (i) homogeneous potential fields may have fractional or integer degree; (ii) the source distributions for a fractional degree are not confined to a bounded region, similarly to some integer-degree models such as the infinite line mass; and (iii) differently from the integer-degree case, the fractional-degree source distributions are no longer uniform density functions. Using this enlarged set of homogeneous fields, real-world anomaly fields are studied at different scales, by a simple search, at any local window W, for the best homogeneous field of either integer or fractional degree, yielding a multiscale set of local homogeneity degrees and depth estimations which we call the multihomogeneous model. This defines a new source-parameter estimation technique (Multi-HOmogeneity Depth Estimation, MHODE), which permits retrieval of the parameters of complex sources. We test the method with inhomogeneous fields of finite sources, such as faults or cylinders, and show its effectiveness also in a real-case example. These applications show the usefulness of the new concepts, multihomogeneity and
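
    The homogeneity law being generalized is the classical Euler relation of potential-field analysis (generic notation, not the paper's exact symbols):

    ```latex
    % A field homogeneous of degree n about the source position r_0 satisfies
    \begin{equation}
      f\bigl(\mathbf{r}_0 + t(\mathbf{r}-\mathbf{r}_0)\bigr) = t^{\,n} f(\mathbf{r}),
    \end{equation}
    % or, equivalently, Euler's differential equation
    \begin{equation}
      (\mathbf{r}-\mathbf{r}_0)\cdot\nabla f = n\,f ,
    \end{equation}
    % which is solved locally in windows W for r_0 and the (integer or
    % fractional) degree n.
    ```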

  13. Generalizability Theory and Classical Test Theory

    Science.gov (United States)

    Brennan, Robert L.

    2011-01-01

    Broadly conceived, reliability involves quantifying the consistencies and inconsistencies in observed scores. Generalizability theory, or G theory, is particularly well suited to addressing such matters in that it enables an investigator to quantify and distinguish the sources of inconsistencies in observed scores that arise, or could arise, over…

  14. An analysis of the transformational leadership theory | Moradi ...

    African Journals Online (AJOL)

    An analysis of the transformational leadership theory. ... at all levels of the organization also feel the need to cooperate with others to achieve the desired results. ... intellectual stimulation; inspirational motivation; personal considerations ...

  15. Exact interior solutions for static spheres in the Einstein-Cartan theory with two sources of torsion

    CERN Document Server

    Gallakhmetov, A M

    2002-01-01

    In the framework of the problem of the existence of exact interior solutions for static spherically symmetric configurations in the Einstein-Cartan theory (ECT), the distributions of a perfect fluid and a non-minimally coupled scalar field are considered. Exact solutions in the one-torsion ECT and in the two-torsion one are obtained. Some consequences of two sources of torsion are discussed.

  16. Stability analysis of black holes via a catastrophe theory and black hole thermodynamics in generalized theories of gravity

    International Nuclear Information System (INIS)

    Tamaki, Takashi; Torii, Takashi; Maeda, Kei-ichi

    2003-01-01

    We perform a linear perturbation analysis for black hole solutions with a 'massive' Yang-Mills field (the Proca field) in Brans-Dicke theory and find that the results are quite consistent with those via catastrophe theory where thermodynamic variables play an intrinsic role. Based on this observation, we show the general relation between these two methods in generalized theories of gravity which are conformally related to the Einstein-Hilbert action

  17. Theory and Application of DNA Histogram Analysis.

    Science.gov (United States)

    Bagwell, Charles Bruce

    The underlying principles and assumptions associated with DNA histograms are discussed along with the characteristics of fluorescent probes. Information theory is described and used to calculate the information content of a DNA histogram. Two major types of DNA histogram analyses are proposed: parametric and nonparametric analysis. Three levels…

  18. Towards a Design Theory for Collaborative Qualitative Data Analysis

    DEFF Research Database (Denmark)

    Nielsen, Peter Axel

    2016-01-01

    This position paper addresses how to develop a design theory to support the collaborative practice of qualitative data analysis. Qualitative researchers face several challenges in making sense of their empirical data, and IS-support for this practice can be found in software applications such as NVivo, Atlas.ti, and DeDoose. While these software tools have utility and are valuable, they are also limiting – and they are particularly limiting researchers in their collaborative efforts with their co-researchers. In this paper, we investigate a design theory to extend it to support collaboration. We use this as a stepping stone to discuss how to use a design theory to problematize existing applications and how to extend a design theory by abduction.

  19. Analysis of Ward identities in supersymmetric Yang-Mills theory

    Science.gov (United States)

    Ali, Sajid; Bergner, Georg; Gerber, Henning; Montvay, Istvan; Münster, Gernot; Piemonte, Stefano; Scior, Philipp

    2018-05-01

    In numerical investigations of supersymmetric Yang-Mills theory on a lattice, the supersymmetric Ward identities are valuable for finding the critical value of the hopping parameter and for examining the size of supersymmetry breaking by the lattice discretisation. In this article we present an improved method for the numerical analysis of supersymmetric Ward identities, which takes into account the correlations between the various observables involved. We present the first complete analysis of supersymmetric Ward identities in N=1 supersymmetric Yang-Mills theory with gauge group SU(3). The results indicate that lattice artefacts scale to zero as O(a^2) towards the continuum limit in agreement with theoretical expectations.

  20. Evolution of source term definition and analysis

    International Nuclear Information System (INIS)

    Lutz, R.J. Jr.

    2004-01-01

    The objective of this presentation was to provide an overview of the evolution of accident fission product release analysis methodology and the results obtained, and an overview of the implementation of source term analysis in regulatory decisions

  1. Convex analysis and monotone operator theory in Hilbert spaces

    CERN Document Server

    Bauschke, Heinz H

    2017-01-01

    This reference text, now in its second edition, offers a modern unifying presentation of three basic areas of nonlinear analysis: convex analysis, monotone operator theory, and the fixed point theory of nonexpansive operators. Taking a unique comprehensive approach, the theory is developed from the ground up, with the rich connections and interactions between the areas as the central focus, and it is illustrated by a large number of examples. The Hilbert space setting of the material offers a wide range of applications while avoiding the technical difficulties of general Banach spaces. The authors have also drawn upon recent advances and modern tools to simplify the proofs of key results making the book more accessible to a broader range of scholars and users. Combining a strong emphasis on applications with exceptionally lucid writing and an abundance of exercises, this text is of great value to a large audience including pure and applied mathematicians as well as researchers in engineering, data science, ma...

  2. Nonpoint source pollution of urban stormwater runoff: a methodology for source analysis.

    Science.gov (United States)

    Petrucci, Guido; Gromaire, Marie-Christine; Shorshani, Masoud Fallah; Chebbo, Ghassan

    2014-09-01

    The characterization and control of runoff pollution from nonpoint sources in urban areas are a major issue for the protection of aquatic environments. We propose a methodology to quantify the sources of pollutants in an urban catchment and to analyze the associated uncertainties. After describing the methodology, we illustrate it through an application to the sources of Cu, Pb, Zn, and polycyclic aromatic hydrocarbons (PAH) from a residential catchment (228 ha) in the Paris region. In this application, we suggest several procedures that can be applied for the analysis of other pollutants in different catchments, including an estimation of the total extent of roof accessories (gutters and downspouts, watertight joints and valleys) in a catchment. These accessories result as the major source of Pb and as an important source of Zn in the example catchment, while activity-related sources (traffic, heating) are dominant for Cu (brake pad wear) and PAH (tire wear, atmospheric deposition).

  3. Extraction of space-charge-dominated ion beams from an ECR ion source: Theory and simulation

    Science.gov (United States)

    Alton, G. D.; Bilheux, H.

    2004-05-01

    Extraction of high quality space-charge-dominated ion beams from plasma ion sources constitutes an optimization problem centered about finding an optimal concave plasma emission boundary that minimizes half-angular divergence for a given charge state, independent of the presence or lack thereof of a magnetic field in the extraction region. The curvature of the emission boundary acts to converge/diverge the low velocity beam during extraction. Beams of highest quality are extracted whenever the half-angular divergence, ω, is minimized. Under minimum half-angular divergence conditions, the plasma emission boundary has an optimum curvature and the perveance, P, current density, j+ext, and extraction gap, d, have optimum values for a given charge state, q. Optimum values for each of the independent variables (P, j+ext and d) are found to be in close agreement with those derived from elementary analytical theory for extraction with a simple two-electrode extraction system, independent of the presence of a magnetic field. The magnetic field only increases the emittances of beams through additional aberrational effects caused by increased angular divergences through coupling of the longitudinal to the transverse velocity components of particles as they pass through the mirror region of the electron cyclotron resonance (ECR) ion source. This article reviews the underlying theory of elementary extraction optics and presents results derived from simulation studies of extraction of space-charge dominated heavy-ion beams of varying mass, charge state, and intensity from an ECR ion source with emphasis on magnetic field induced effects.
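
    For orientation, the optimum perveance, current density and gap mentioned above are tied together by the classical Child-Langmuir law for space-charge-limited flow in a planar two-electrode gap (a background sketch, not the article's full treatment):

    ```latex
    % Space-charge-limited current density for extraction voltage V, gap d,
    % ion charge q and mass m, together with the definition of perveance P:
    \begin{equation}
      j_{+}^{\mathrm{ext}} = \frac{4\varepsilon_0}{9}\sqrt{\frac{2q}{m}}\;
      \frac{V^{3/2}}{d^{2}}, \qquad
      P \equiv \frac{I}{V^{3/2}} ,
    \end{equation}
    % so for each charge state the optimum P, j and d are mutually linked.
    ```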

  4. Extraction of space-charge-dominated ion beams from an ECR ion source: Theory and simulation

    International Nuclear Information System (INIS)

    Alton, G.D.; Bilheux, H.

    2004-01-01

    Extraction of high quality space-charge-dominated ion beams from plasma ion sources constitutes an optimization problem centered about finding an optimal concave plasma emission boundary that minimizes half-angular divergence for a given charge state, independent of the presence or lack thereof of a magnetic field in the extraction region. The curvature of the emission boundary acts to converge/diverge the low velocity beam during extraction. Beams of highest quality are extracted whenever the half-angular divergence, ω, is minimized. Under minimum half-angular divergence conditions, the plasma emission boundary has an optimum curvature and the perveance, P, current density, j+ext, and extraction gap, d, have optimum values for a given charge state, q. Optimum values for each of the independent variables (P, j+ext and d) are found to be in close agreement with those derived from elementary analytical theory for extraction with a simple two-electrode extraction system, independent of the presence of a magnetic field. The magnetic field only increases the emittances of beams through additional aberrational effects caused by increased angular divergences through coupling of the longitudinal to the transverse velocity components of particles as they pass through the mirror region of the electron cyclotron resonance (ECR) ion source. This article reviews the underlying theory of elementary extraction optics and presents results derived from simulation studies of extraction of space-charge dominated heavy-ion beams of varying mass, charge state, and intensity from an ECR ion source with emphasis on magnetic field induced effects

  5. Network theory-based analysis of risk interactions in large engineering projects

    International Nuclear Information System (INIS)

    Fang, Chao; Marle, Franck; Zio, Enrico; Bocquet, Jean-Claude

    2012-01-01

    This paper presents an approach based on network theory to deal with risk interactions in large engineering projects. Indeed, such projects are exposed to numerous and interdependent risks of various nature, which makes their management more difficult. In this paper, a topological analysis based on network theory is presented, which aims at identifying key elements in the structure of interrelated risks potentially affecting a large engineering project. This analysis serves as a powerful complement to classical project risk analysis. Its originality lies in the application of some network theory indicators to the project risk management field. The construction of the risk network requires the involvement of the project manager and other team members assigned to the risk management process. Its interpretation improves their understanding of risks and their potential interactions. The outcomes of the analysis provide a support for decision-making regarding project risk management. An example of application to a real large engineering project is presented. The conclusion is that some new insights can be found about risks, about their interactions and about the global potential behavior of the project. - Highlights: ► The method addresses the modeling of complexity in project risk analysis. ► Network theory indicators enable other risks than classical criticality analysis to be highlighted. ► This topological analysis improves project manager's understanding of risks and risk interactions. ► This helps project manager to make decisions considering the position in the risk network. ► An application to a real tramway implementation project in a city is provided.
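
    A minimal sketch of the kind of topological indicators involved, using the networkx package on a toy directed risk graph; the risks, edges, and weights are invented for illustration.

    ```python
    import networkx as nx

    # Edge u -> v means "risk u can trigger risk v", weighted by strength.
    G = nx.DiGraph()
    G.add_weighted_edges_from([
        ("delay", "cost_overrun", 0.8),
        ("design_change", "delay", 0.6),
        ("supplier_failure", "delay", 0.5),
        ("cost_overrun", "funding_gap", 0.7),
        ("funding_gap", "delay", 0.4),   # feedback loop
    ])

    print(nx.betweenness_centrality(G))      # risks brokering propagation paths
    print(nx.pagerank(G, weight="weight"))   # risks most reachable via the network
    print(list(nx.simple_cycles(G)))         # reinforcing loops worth monitoring
    ```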

  6. Analysis of the orderly distribution of oil and gas fields in China based on the theory of co-control of source and heat

    Directory of Open Access Journals (Sweden)

    Gongcheng Zhang

    2015-01-01

    Taking a hydrocarbon zone or a basin group as a unit, this paper analyzed the vertical hydrocarbon generation regularity of onshore and offshore oil and gas fields in China, based on the theory of co-control of source and heat. The results demonstrated that the hydrocarbon generation modes of oil and gas fields in China are orderly. First, the hydrocarbon zones in the southeastern China offshore area, including the East and South China Sea basins, are dominated by a single hydrocarbon generation mode, which appears as either single oil generation near shore or single gas generation offshore, controlled by both source and heat. Second, the eastern hydrocarbon zones, including the Bohai Bay, Songliao and Jianghan basins and the North and South Yellow Sea basins, are dominated by a two-layer hydrocarbon generation mode, which takes the form of “upper oil and lower gas”. Third, the central hydrocarbon zones, including the Ordos, Sichuan and Chuxiong basins, are also dominated by the “upper oil and lower gas” two-layer hydrocarbon generation mode. In the Ordos Basin, oil is mainly generated in the Triassic, and gas is predominantly generated in the Paleozoic. In the Sichuan Basin, oil was discovered in the Jurassic, and gas was mostly discovered in the Sinian and Triassic. Fourth, the western hydrocarbon zones are dominated by a “sandwich” multi-layer mode, as in the Junggar, Tarim and Qaidam basins. In summary, the theory of co-control of source and heat will be widely applied to oil and gas exploration all over China. Oil targets should be focused on the near shore areas in the southeastern China sea, the upper strata in the eastern and middle hydrocarbon zones, and the Ordovician, Permian and Paleogene strata in the western hydrocarbon zone, while gas targets should be focused on the off-shore areas in the southeastern China sea, the Cambrian, Carboniferous, Jurassic, and Quaternary strata in the western hydrocarbon zone. A pattern of

  7. Cohomology and renormalization of BFYM theory in three dimensions

    International Nuclear Information System (INIS)

    Accardi, A.; Belli, A.; Zeni, M.

    1997-01-01

    The first-order formalism for the 3D Yang-Mills theory is considered and two different formulations are introduced, in which the gauge theory appears to be a deformation of the topological BF theory. We perform the quantization and the algebraic analysis of the renormalization of both models, which are found to be anomaly free. We also discuss their stability against radiative corrections, giving the full structure of possible counterterms, which requires an involved matricial renormalization of fields and sources. Both models are then proved to be equivalent to the Yang-Mills theory at the renormalized level. (orig.)

  8. Theory-of-Mind Development Influences Suggestibility and Source Monitoring

    Science.gov (United States)

    Bright-Paul, Alexandra; Jarrold, Christopher; Wright, Daniel B.

    2008-01-01

    According to the mental-state reasoning model of suggestibility, 2 components of theory of mind mediate reductions in suggestibility across the preschool years. The authors examined whether theory-of-mind performance may be legitimately separated into 2 components and explored the memory processes underlying the associations between theory of mind…

  9. A critical experimental test of synchrotron radiation theory with 3rd generation light source

    International Nuclear Information System (INIS)

    Geloni, Gianluca; Kocharyan, Vitali; Saldin, Evgeni

    2016-05-01

    A recent ''beam splitting'' experiment at LCLS apparently demonstrated that after a microbunched electron beam is kicked on a large angle compared to the divergence of the FEL radiation, the microbunching wave front is readjusted along the new direction of motion of the kicked beam. Therefore, coherent radiation from an undulator placed after the kicker is emitted along the kicked direction without suppression. This strong emission of coherent undulator radiation in the kicked direction cannot be explained in the framework of conventional synchrotron radiation theory. In a previous paper we explained this puzzle. We demonstrated that, in accelerator physics, the coupling of fields and particles is based, on the one hand, on the use of results from particle dynamics treated according to the absolute time convention and, on the other hand, on the use of Maxwell equations treated according to the standard (Einstein) synchronization convention. Here lies the misconception which led to the strong qualitative disagreement between theory and experiment. After the ''beam splitting'' experiment at LCLS, it became clear that the conventional theory of synchrotron radiation cannot ensure the correct description of coherent and spontaneous emission from a kicked electron beam, nor the emission from a beam with finite angular divergence, in an undulator or a bending magnet. However, this result requires further experimental confirmation. In this publication we propose an uncomplicated and inexpensive experiment to test synchrotron radiation theory at 3rd generation light sources.

  10. A critical experimental test of synchrotron radiation theory with 3rd generation light source

    Energy Technology Data Exchange (ETDEWEB)

    Geloni, Gianluca [European XFEL GmbH, Hamburg (Germany); Kocharyan, Vitali; Saldin, Evgeni [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2016-05-15

    A recent ''beam splitting'' experiment at LCLS apparently demonstrated that after a microbunched electron beam is kicked on a large angle compared to the divergence of the FEL radiation, the microbunching wave front is readjusted along the new direction of motion of the kicked beam. Therefore, coherent radiation from an undulator placed after the kicker is emitted along the kicked direction without suppression. This strong emission of coherent undulator radiation in the kicked direction cannot be explained in the framework of conventional synchrotron radiation theory. In a previous paper we explained this puzzle. We demonstrated that, in accelerator physics, the coupling of fields and particles is based, on the one hand, on the use of results from particle dynamics treated according to the absolute time convention and, on the other hand, on the use of Maxwell equations treated according to the standard (Einstein) synchronization convention. Here lies the misconception which led to the strong qualitative disagreement between theory and experiment. After the ''beam splitting'' experiment at LCLS, it became clear that the conventional theory of synchrotron radiation cannot ensure the correct description of coherent and spontaneous emission from a kicked electron beam, nor the emission from a beam with finite angular divergence, in an undulator or a bending magnet. However, this result requires further experimental confirmation. In this publication we propose an uncomplicated and inexpensive experiment to test synchrotron radiation theory at 3rd generation light sources.

  11. Applying thematic analysis theory to practice: a researcher's experience.

    Science.gov (United States)

    Tuckett, Anthony G

    2005-01-01

    This article describes an experience of thematic analysis. In order to answer the question 'What does analysis look like in practice?' it describes in brief how the methodology of grounded theory, the epistemology of social constructionism, and the theoretical stance of symbolic interactionism inform analysis. Additionally, analysis is examined by evidencing the systematic processes--here termed organising, coding, writing, theorising, and reading--that led the researcher to develop a final thematic schema.

  12. Real analysis an introduction to the theory of real functions and integration

    CERN Document Server

    Dshalalow, Jewgeni H

    2000-01-01

    Designed for use in a two-semester course on abstract analysis, REAL ANALYSIS: An Introduction to the Theory of Real Functions and Integration illuminates the principal topics that constitute real analysis. Self-contained, with coverage of topology, measure theory, and integration, it offers a thorough elaboration of major theorems, notions, and constructions needed not only by mathematics students but also by students of statistics and probability, operations research, physics, and engineering. Structured logically and flexibly through the author's many years of teaching experience, the material is presented in three main sections: Part I, chapters 1 through 3, covers the preliminaries of set theory and the fundamentals of metric spaces and topology. This section can also serve as a text for first courses in topology. Part II, chapters 4 through 7, details the basics of measure and integration and stands independently for use in a separate measure theory course. Part III addresses more advanced topics, includin...

  13. An introduction to single-user information theory

    CERN Document Server

    Alajaji, Fady

    2018-01-01

    This book presents a succinct and mathematically rigorous treatment of the main pillars of Shannon’s information theory, discussing the fundamental concepts and indispensable results of Shannon’s mathematical theory of communications. It includes five meticulously written core chapters (with accompanying problems), emphasizing the key topics of information measures; lossless and lossy data compression; channel coding; and joint source-channel coding for single-user (point-to-point) communications systems. It also features two appendices covering necessary background material in real analysis and in probability theory and stochastic processes. The book is ideal for a one-semester foundational course on information theory for senior undergraduate and entry-level graduate students in mathematics, statistics, engineering, and computing and information sciences. A comprehensive instructor’s solutions manual is available.

  14. Frames and operator theory in analysis and signal processing

    CERN Document Server

    Larson, David R; Nashed, Zuhair; Nguyen, Minh Chuong; Papadakis, Manos

    2008-01-01

    This volume contains articles based on talks presented at the Special Session Frames and Operator Theory in Analysis and Signal Processing, held in San Antonio, Texas, in January of 2006. Recently, the field of frames has undergone tremendous advancement. Most of the work in this field is focused on the design and construction of more versatile frames and frames tailored towards specific applications, e.g., finite dimensional uniform frames for cellular communication. In addition, frames are now becoming a hot topic in mathematical research as a part of many engineering applications, e.g., matching pursuits and greedy algorithms for image and signal processing. Topics covered in this book include: Application of several branches of analysis (e.g., PDEs; Fourier, wavelet, and harmonic analysis; transform techniques; data representations) to industrial and engineering problems, specifically image and signal processing. Theoretical and applied aspects of frames and wavelets. Pure aspects of operator theory empha...

  15. The Constant Comparative Analysis Method Outside of Grounded Theory

    Science.gov (United States)

    Fram, Sheila M.

    2013-01-01

    This commentary addresses the gap in the literature regarding discussion of the legitimate use of Constant Comparative Analysis Method (CCA) outside of Grounded Theory. The purpose is to show the strength of using CCA to maintain the emic perspective and how theoretical frameworks can maintain the etic perspective throughout the analysis. My…

  16. Variational analysis and generalized differentiation I basic theory

    CERN Document Server

    Mordukhovich, Boris S

    2006-01-01

    Contains a study of the basic concepts and principles of variational analysis and generalized differentiation in both finite-dimensional and infinite-dimensional spaces. This title presents many applications to problems in optimization, equilibria, stability and sensitivity, control theory, economics, mechanics, and more.

  17. Source modelling in seismic risk analysis for nuclear power plants

    International Nuclear Information System (INIS)

    Yucemen, M.S.

    1978-12-01

    The proposed probabilistic procedure provides a consistent method for the modelling, analysis and updating of uncertainties that are involved in the seismic risk analysis for nuclear power plants. The potential earthquake activity zones are idealized as point, line or area sources. For these seismic source types, expressions to evaluate their contribution to seismic risk are derived, considering all the possible site-source configurations. The seismic risk at a site is found to depend not only on the inherent randomness of the earthquake occurrences with respect to magnitude, time and space, but also on the uncertainties associated with the predicted values of the seismic and geometric parameters, as well as the uncertainty in the attenuation model. The uncertainty due to the attenuation equation is incorporated into the analysis through the use of random correction factors. The influence of the uncertainty resulting from the insufficient information on the seismic parameters and source geometry is introduced into the analysis by computing a mean risk curve averaged over the various alternative assumptions on the parameters and source geometry. Seismic risk analysis is carried out for the city of Denizli, which is located in the seismically most active zone of Turkey. The second analysis is for Akkuyu
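
    The contribution of a single point source can be sketched with the textbook hazard integral; all parameter values and the attenuation relation below are illustrative assumptions, not the report's model.

    ```python
    import numpy as np
    from scipy.stats import norm

    # Gutenberg-Richter activity for one point source at a fixed distance,
    # with lognormal attenuation error (all numbers invented).
    nu, beta = 0.2, 2.0                 # events/yr above M4, G-R decay rate
    m_min, m_max, r = 4.0, 7.5, 30.0    # magnitude range, distance (km)
    sigma_ln = 0.6                      # attenuation-model uncertainty

    def ln_pga(m, r):                   # hypothetical attenuation relation (ln g)
        return -3.5 + 0.9 * m - 1.1 * np.log(r + 10.0)

    m = np.linspace(m_min, m_max, 200)
    # Truncated-exponential magnitude density.
    f_m = beta * np.exp(-beta * (m - m_min)) / (1 - np.exp(-beta * (m_max - m_min)))

    a_target = np.log(0.1)              # target PGA of 0.1 g
    p_exceed = 1 - norm.cdf((a_target - ln_pga(m, r)) / sigma_ln)
    annual_rate = nu * np.trapz(p_exceed * f_m, m)
    print(f"annual exceedance rate ~ {annual_rate:.4f}")
    ```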

  18. Real analysis measure theory, integration, and Hilbert spaces

    CERN Document Server

    Stein, Elias M

    2005-01-01

    Real Analysis is the third volume in the Princeton Lectures in Analysis, a series of four textbooks that aim to present, in an integrated manner, the core areas of analysis. Here the focus is on the development of measure and integration theory, differentiation and integration, Hilbert spaces, and Hausdorff measure and fractals. This book reflects the objective of the series as a whole: to make plain the organic unity that exists between the various parts of the subject, and to illustrate the wide applicability of ideas of analysis to other fields of mathematics and science. After

  19. Towards program theory validation: Crowdsourcing the qualitative analysis of participant experiences.

    Science.gov (United States)

    Harman, Elena; Azzam, Tarek

    2018-02-01

This exploratory study examines a novel tool for validating program theory through crowdsourced qualitative analysis. It combines a quantitative pattern matching framework traditionally used in theory-driven evaluation with crowdsourcing to analyze qualitative interview data. A sample of crowdsourced participants is asked to read an interview transcript, identify whether program theory components (Activities and Outcomes) are discussed, and highlight the most relevant passage about each component. The findings indicate that using crowdsourcing to analyze qualitative data can differentiate between program theory components that are supported by a participant's experience and those that are not. This approach expands the range of tools available to validate program theory using qualitative data, thus strengthening the theory-driven approach. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Perturbative analysis in higher-spin theories

    Energy Technology Data Exchange (ETDEWEB)

    Didenko, V.E. [I.E. Tamm Department of Theoretical Physics, Lebedev Physical Institute,Leninsky prospect 53, 119991, Moscow (Russian Federation); Misuna, N.G. [Moscow Institute of Physics and Technology,Institutsky lane 9, 141700, Dolgoprudny, Moscow region (Russian Federation); Vasiliev, M.A. [I.E. Tamm Department of Theoretical Physics, Lebedev Physical Institute,Leninsky prospect 53, 119991, Moscow (Russian Federation)

    2016-07-28

A new scheme for the perturbative analysis of the nonlinear HS equations is developed, giving directly the final result of the successive homotopy integrations that appear in the standard approach. It drastically simplifies the analysis and results from the application of the standard spectral sequence approach to the higher-spin covariant derivatives, allowing us in particular to reduce the multiple homotopy integrals resulting from the successive application of the homotopy trick to a single integral. The efficiency of the proposed method is illustrated by various examples. In particular, it is shown how the Central on-shell theorem of the free theory immediately results from the nonlinear HS field equations with no intermediate computations.

  1. Astrophysical data analysis with information field theory

    International Nuclear Information System (INIS)

    Enßlin, Torsten

    2014-01-01

Non-parametric imaging and data analysis in astrophysics and cosmology can be addressed by information field theory (IFT), a means of Bayesian, data based inference on spatially distributed signal fields. IFT is a statistical field theory, which permits the construction of optimal signal recovery algorithms. It exploits spatial correlations of the signal fields even for nonlinear and non-Gaussian signal inference problems. The alleviation of a perception threshold for recovering signals of unknown correlation structure by using IFT will be discussed in particular, as well as a novel improvement on instrumental self-calibration schemes. IFT can be applied to many areas. Here, applications in cosmology (cosmic microwave background, large-scale structure) and astrophysics (galactic magnetism, radio interferometry) are presented.
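
    For a concrete handle on the kind of inference IFT formalizes, below is a hedged sketch of its simplest linear, Gaussian case, the Wiener filter m = D j with D = (S⁻¹ + Rᵀ N⁻¹ R)⁻¹ and j = Rᵀ N⁻¹ d. The prior covariance, noise level and response used here are illustrative assumptions, not those of any application named in the abstract.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 128
    x = np.arange(n)

    # Assumed prior signal covariance S: Gaussian correlations, length 8 pixels.
    S = np.exp(-0.5 * ((x[:, None] - x[None, :]) / 8.0) ** 2) + 1e-8 * np.eye(n)
    N = 0.1 * np.eye(n)   # white noise covariance
    R = np.eye(n)         # instrument response: direct, noisy sampling

    # Draw a signal from the prior and simulate data d = R s + noise.
    s = np.linalg.cholesky(S) @ rng.normal(size=n)
    d = R @ s + rng.normal(scale=np.sqrt(0.1), size=n)

    # Wiener filter, the simplest IFT signal-recovery estimator:
    #   m = D j,  D = (S^-1 + R^T N^-1 R)^-1,  j = R^T N^-1 d
    Ninv = np.linalg.inv(N)
    D = np.linalg.inv(np.linalg.inv(S) + R.T @ Ninv @ R)
    m = D @ (R.T @ Ninv @ d)

    print("rms error of raw data:      ", np.sqrt(np.mean((d - s) ** 2)))
    print("rms error of posterior mean:", np.sqrt(np.mean((m - s) ** 2)))
    ```

    The posterior mean beats the raw data precisely because the filter exploits the spatial correlations encoded in S, which is the mechanism the abstract refers to.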

  2. Astrophysical data analysis with information field theory

    Science.gov (United States)

    Enßlin, Torsten

    2014-12-01

Non-parametric imaging and data analysis in astrophysics and cosmology can be addressed by information field theory (IFT), a means of Bayesian, data based inference on spatially distributed signal fields. IFT is a statistical field theory, which permits the construction of optimal signal recovery algorithms. It exploits spatial correlations of the signal fields even for nonlinear and non-Gaussian signal inference problems. The alleviation of a perception threshold for recovering signals of unknown correlation structure by using IFT will be discussed in particular, as well as a novel improvement on instrumental self-calibration schemes. IFT can be applied to many areas. Here, applications in cosmology (cosmic microwave background, large-scale structure) and astrophysics (galactic magnetism, radio interferometry) are presented.

  3. Astrophysical data analysis with information field theory

    Energy Technology Data Exchange (ETDEWEB)

    Enßlin, Torsten, E-mail: ensslin@mpa-garching.mpg.de [Max Planck Institut für Astrophysik, Karl-Schwarzschild-Straße 1, D-85748 Garching, Germany and Ludwig-Maximilians-Universität München, Geschwister-Scholl-Platz 1, D-80539 München (Germany)

    2014-12-05

Non-parametric imaging and data analysis in astrophysics and cosmology can be addressed by information field theory (IFT), a means of Bayesian, data based inference on spatially distributed signal fields. IFT is a statistical field theory, which permits the construction of optimal signal recovery algorithms. It exploits spatial correlations of the signal fields even for nonlinear and non-Gaussian signal inference problems. The alleviation of a perception threshold for recovering signals of unknown correlation structure by using IFT will be discussed in particular, as well as a novel improvement on instrumental self-calibration schemes. IFT can be applied to many areas. Here, applications in cosmology (cosmic microwave background, large-scale structure) and astrophysics (galactic magnetism, radio interferometry) are presented.

  4. Graph theory applied to noise and vibration control in statistical energy analysis models.

    Science.gov (United States)

    Guasch, Oriol; Cortés, Lluís

    2009-06-01

A fundamental aspect of noise and vibration control in statistical energy analysis (SEA) models consists in first identifying and then reducing the energy flow paths between subsystems. In this work, it is proposed to make use of some results from graph theory to address both issues. On the one hand, linear and path algebras applied to adjacency matrices of SEA graphs are used to determine the existence of paths of any order between subsystems, to count and label them, to find extremal paths, and to determine the power flow contributions from groups of paths. On the other hand, a strategy is presented that makes use of graph cut algorithms to reduce the energy flow from a source subsystem to a receiver one, modifying as few internal and coupling loss factors as possible.
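
    A small sketch of the adjacency-matrix machinery the abstract refers to, assuming a toy four-subsystem SEA graph with invented connectivity: powers of the adjacency matrix count the walks of a given order between a source and a receiver subsystem, which is the graph-theoretic fact the path algebras build on.

    ```python
    import numpy as np

    # Hypothetical SEA connectivity: subsystem i couples to subsystem j
    # wherever A[i, j] = 1; subsystem 0 is the source, 3 the receiver.
    A = np.array([
        [0, 1, 1, 0],
        [1, 0, 1, 1],
        [1, 1, 0, 1],
        [0, 1, 1, 0],
    ])

    # (A^k)[i, j] counts the order-k walks from i to j (paths that may
    # revisit subsystems), the quantity the path algebras operate on.
    Ak = np.eye(A.shape[0], dtype=int)
    for k in range(1, 5):
        Ak = Ak @ A
        print(f"order-{k} paths from subsystem 0 to subsystem 3: {Ak[0, 3]}")
    ```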

  5. An application of the theory of planned behaviour to study the influencing factors of participation in source separation of food waste

    International Nuclear Information System (INIS)

    Karim Ghani, Wan Azlina Wan Ab.; Rusli, Iffah Farizan; Biak, Dayang Radiah Awang; Idris, Azni

    2013-01-01

Highlights: ► The theory of planned behaviour (TPB) was used to identify the influencing factors for participation in source separation of food waste, using self-administered questionnaires. ► The findings suggest several implications for the development and implementation of a waste-separation-at-home programme. ► The analysis indicates that attitude towards waste separation is the main predictor, which in turn could be a significant predictor of the respondent's actual food waste separation behaviour. ► To date, no similar findings have been reported elsewhere, and this finding will be beneficial to local authorities as an indicator in designing campaigns to promote waste separation programmes and reinforce positive attitudes. - Abstract: Tremendous increases in biodegradable (food) waste generation significantly impact the local authorities, who are responsible for managing, treating and disposing of this waste. Separation of food waste at its generation source is identified as an effective means of reducing the amount of food waste sent to landfill; the separated waste can be reused as feedstock for downstream treatment processes, namely composting or anaerobic digestion. However, these efforts will only succeed with positive attitudes and high participation rates by the public. Thus, a social survey (using questionnaires) analysing the public's views and the factors influencing participation in source separation of food waste in households, based on the theory of planned behaviour (TPB), was performed in June and July 2011 among selected staff in Universiti Putra Malaysia, Serdang, Selangor. The survey demonstrates that the public has positive intentions to participate provided the opportunities, facilities and knowledge on waste separation at source are adequately prepared by the respective local authorities. Furthermore, good moral values and situational factors such as storage convenience and

  6. An application of the theory of planned behaviour to study the influencing factors of participation in source separation of food waste

    Energy Technology Data Exchange (ETDEWEB)

    Karim Ghani, Wan Azlina Wan Ab., E-mail: wanaz@eng.upm.edu.my [Department of Chemical and Environmental Engineering, Faculty of Engineering, University Putra Malaysia, 43400 Serdang, Selangor Darul Ehsan (Malaysia); Rusli, Iffah Farizan, E-mail: iffahrusli@yahoo.com [Department of Chemical and Environmental Engineering, Faculty of Engineering, University Putra Malaysia, 43400 Serdang, Selangor Darul Ehsan (Malaysia); Biak, Dayang Radiah Awang, E-mail: dayang@eng.upm.edu.my [Department of Chemical and Environmental Engineering, Faculty of Engineering, University Putra Malaysia, 43400 Serdang, Selangor Darul Ehsan (Malaysia); Idris, Azni, E-mail: azni@eng.upm.edu.my [Department of Chemical and Environmental Engineering, Faculty of Engineering, University Putra Malaysia, 43400 Serdang, Selangor Darul Ehsan (Malaysia)

    2013-05-15

Highlights: ► The theory of planned behaviour (TPB) was used to identify the influencing factors for participation in source separation of food waste, using self-administered questionnaires. ► The findings suggest several implications for the development and implementation of a waste-separation-at-home programme. ► The analysis indicates that attitude towards waste separation is the main predictor, which in turn could be a significant predictor of the respondent's actual food waste separation behaviour. ► To date, no similar findings have been reported elsewhere, and this finding will be beneficial to local authorities as an indicator in designing campaigns to promote waste separation programmes and reinforce positive attitudes. - Abstract: Tremendous increases in biodegradable (food) waste generation significantly impact the local authorities, who are responsible for managing, treating and disposing of this waste. Separation of food waste at its generation source is identified as an effective means of reducing the amount of food waste sent to landfill; the separated waste can be reused as feedstock for downstream treatment processes, namely composting or anaerobic digestion. However, these efforts will only succeed with positive attitudes and high participation rates by the public. Thus, a social survey (using questionnaires) analysing the public's views and the factors influencing participation in source separation of food waste in households, based on the theory of planned behaviour (TPB), was performed in June and July 2011 among selected staff in Universiti Putra Malaysia, Serdang, Selangor. The survey demonstrates that the public has positive intentions to participate provided the opportunities, facilities and knowledge on waste separation at source are adequately prepared by the respective local authorities. Furthermore, good moral values and situational factors such as storage convenience and

  7. Quantum theory for 1D X-ray free electron laser

    Science.gov (United States)

    Anisimov, Petr M.

    2018-06-01

Classical 1D X-ray Free Electron Laser (X-ray FEL) theory has stood the test of time by guiding FEL design and development prior to any full-scale analysis. Future X-ray FELs and inverse-Compton sources, where the photon recoil approaches the electron energy spread, push the classical theory to its limits of applicability. After substantial efforts by the community to find what those limits are, there is no universally agreed upon quantum approach to the design and development of future X-ray sources. We offer a new approach to formulating the quantum theory of 1D X-ray FELs that has an obvious connection to the classical theory, which allows for immediate transfer of knowledge between the two regimes. We exploit this connection in order to draw quantum mechanical conclusions about the quantum nature of electrons and generated radiation in terms of FEL variables.

  8. Schwinger's quantum action principle from Dirac’s formulation through Feynman’s path integrals, the Schwinger-Keldysh method, quantum field theory, to source theory

    CERN Document Server

    Milton, Kimball A

    2015-01-01

Starting from the earlier notions of stationary action principles, these tutorial notes show how Schwinger's Quantum Action Principle descended from Dirac's formulation, which independently led Feynman to his path-integral formulation of quantum mechanics. Part I brings out in more detail the connection between the two formulations, and applications are discussed. Then, the Keldysh-Schwinger time-cycle method of extracting matrix elements is described. Part II discusses the variational formulation of quantum electrodynamics and the development of source theory.

  9. The Use of Narrative Paradigm Theory in Assessing Audience Value Conflict in Image Advertising.

    Science.gov (United States)

    Stutts, Nancy B.; Barker, Randolph T.

    1999-01-01

    Presents an analysis of image advertisement developed from Narrative Paradigm Theory. Suggests that the nature of postmodern culture makes image advertising an appropriate external communication strategy for generating stake holder loyalty. Suggests that Narrative Paradigm Theory can identify potential sources of audience conflict by illuminating…

  10. Can one extract source radii from transport theories?

    International Nuclear Information System (INIS)

    Aichelin, J.

    1996-01-01

To know the space-time evolution of a heavy ion reaction is of great interest, especially in cases where the measured spectra do not allow one to ascertain the underlying reaction mechanism. In recent times it has become popular to believe that the comparison of Hanbury-Brown Twiss correlation functions obtained from classical or semiclassical transport theories, like Boltzmann Uehling Uhlenbeck (BUU), Quantum Molecular Dynamics (QMD), VENUS or ARC, with experiments may provide this insight. It is the purpose of this article to show that this is not the case. None of these transport theories provides a reliable time evolution of those quantities which are mandatory for a correct calculation of the correlation function. The reason for this failure is different for the different transport theories. (author)

  11. Can one extract source radii from transport theories?

    Energy Technology Data Exchange (ETDEWEB)

    Aichelin, J.

    1996-12-31

To know the space-time evolution of a heavy ion reaction is of great interest, especially in cases where the measured spectra do not allow one to ascertain the underlying reaction mechanism. In recent times it has become popular to believe that the comparison of Hanbury-Brown Twiss correlation functions obtained from classical or semiclassical transport theories, like Boltzmann Uehling Uhlenbeck (BUU), Quantum Molecular Dynamics (QMD), VENUS or ARC, with experiments may provide this insight. It is the purpose of this article to show that this is not the case. None of these transport theories provides a reliable time evolution of those quantities which are mandatory for a correct calculation of the correlation function. The reason for this failure is different for the different transport theories. (author).

  12. The theory of asymptotic behaviour

    International Nuclear Information System (INIS)

    Ward, B.F.L.; Purdue Univ., Lafayette, IN

    1978-01-01

The Green's functions of renormalizable quantum field theory are shown to violate, in general, Euler's theorem on homogeneous functions, that is to say, to violate naive dimensional analysis. The respective violations are established by explicit calculation with Feynman diagrams. These violations, when incorporated into the renormalization group, then provide the basis for an entirely new approach to asymptotic behaviour in renormalizable field theory. Specifically, the violations add new delta-function sources to the usual partial differential equations of the group when these equations are written in terms of the external momenta of the respective Green's functions. The effect of these sources is illustrated by studying the real part, $\mathrm{Re}\,\Gamma^{6}(\lambda p)$, of the six-point 1PI vertex of the massless scalar field with quartic self-coupling - the simplest of renormalizable situations. Here, $\lambda p$ is symbolic for the six momenta of $\Gamma^{6}$. Briefly, it is found that the usual theory of characteristics is unable to satisfy the boundary condition attendant to the respective dimensional-analysis-violating sources. Thus, the method of characteristics is completely abandoned in favour of the method of separation of variables. A complete solution which satisfies the inhomogeneous group equation and all boundary conditions is then explicitly constructed. This solution possesses Laurent expansions in the scale $\lambda$ of its momentum arguments for all real values of $\lambda^{2}$ except $\lambda^{2} = 0$. For $|\lambda^{2}| \to \infty$ and $|\lambda^{2}| \to 0$, the solution's leading term in its respective Laurent series is proportional to $\lambda^{-2}$. The limits $\lambda^{2} \to 0^{+}$ and $\lambda^{2} \to 0^{-}$ of $\lambda^{2}\,\mathrm{Re}\,\Gamma^{6}$ are both nonzero and unequal. The value of the solution at $\lambda^{2} = 0$ is not simply related to the value of either of these limits. The new approach would appear to be operationally established.

  13. Mechanistic facility safety and source term analysis

    International Nuclear Information System (INIS)

    PLYS, M.G.

    1999-01-01

A PC-based computer program was created for facility safety and source term analysis at Hanford. The program has been successfully applied to mechanistic prediction of source terms from chemical reactions in underground storage tanks, hydrogen combustion in double-contained receiver tanks, and process evaluation including the potential for runaway reactions in spent nuclear fuel processing. Model features include user-defined facility rooms, flow path geometry, and heat conductors; user-defined non-ideal vapor and aerosol species; pressure- and density-driven gas flows; aerosol transport and deposition; and structure to accommodate facility-specific source terms. Example applications are presented here.

  14. Sierra Structural Dynamics Theory Manual

    Energy Technology Data Exchange (ETDEWEB)

    Reese, Garth M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-10-19

Sierra/SD provides a massively parallel implementation of structural dynamics finite element analysis, required for high fidelity, validated models used in modal, vibration, static and shock analysis of structural systems. This manual describes the theory behind many of the constructs in Sierra/SD. For a more detailed description of how to use Sierra/SD, we refer the reader to the Sierra/SD User's Notes. Many of the constructs in Sierra/SD are pulled directly from published material. Where possible, these materials are referenced herein. However, certain functions in Sierra/SD are specific to our implementation. We try to be far more complete in those areas. The theory manual was developed from several sources including general notes, a programmer notes manual, the user's notes and of course the material in the open literature.

  15. A Preliminary ZEUS Lightning Location Error Analysis Using a Modified Retrieval Theory

    Science.gov (United States)

    Elander, Valjean; Koshak, William; Phanord, Dieudonne

    2004-01-01

The ZEUS long-range VLF arrival time difference lightning detection network now covers both Europe and Africa, and there are plans for further expansion into the western hemisphere. In order to fully optimize and assess ZEUS lightning location retrieval errors and to determine the best placement of future receivers expected to be added to the network, a software package is being developed jointly between the NASA Marshall Space Flight Center (MSFC) and the University of Nevada Las Vegas (UNLV). The software package, called the ZEUS Error Analysis for Lightning (ZEAL), will be used to obtain global scale lightning location retrieval error maps using both a Monte Carlo approach and chi-squared curvature matrix theory. At the core of ZEAL will be an implementation of an Iterative Oblate (IO) lightning location retrieval method recently developed at MSFC. The IO method will be appropriately modified to account for variable wave propagation speed, and the new retrieval results will be compared with the current ZEUS retrieval algorithm to assess potential improvements. In this preliminary ZEAL work effort, we defined 5000 source locations evenly distributed across the Earth. We then used the existing (as well as potential future) ZEUS sites to simulate arrival time data between each source and each ZEUS site. A total of 100 sources were considered at each of the 5000 locations, and timing errors were selected from a normal distribution having a mean of 0 seconds and a standard deviation of 20 microseconds. This simulated "noisy" dataset was analyzed using the IO algorithm to estimate source locations. The exact locations were compared with the retrieved locations, and the results are summarized via several color-coded "error maps."
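
    The Monte Carlo step described above can be sketched as follows. The receiver coordinates and the flat-geometry distance model are illustrative assumptions; the actual ZEUS network geometry and the IO retrieval algorithm are not reproduced here.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    C = 3.0e8  # propagation speed (m/s); ZEAL allows this to vary, here constant

    # Hypothetical receiver coordinates (x, y) on a flat plane, in metres.
    receivers = np.array([[0.0, 0.0], [2.0e6, 0.0], [0.0, 2.0e6], [2.0e6, 2.0e6]])
    source = np.array([7.5e5, 1.1e6])  # one of the simulated source locations

    # True arrival times and arrival-time differences relative to receiver 0.
    t = np.linalg.norm(receivers - source, axis=1) / C
    true_atd = t[1:] - t[0]

    # 100 noisy realizations per source: timing errors ~ N(0, 20 microseconds).
    noisy_t = t + rng.normal(0.0, 20e-6, size=(100, len(receivers)))
    noisy_atd = noisy_t[:, 1:] - noisy_t[:, :1]

    print("true ATDs (s):", true_atd)
    print("ATD std (s): ", noisy_atd.std(axis=0))  # ~ sqrt(2) * 20e-6
    ```

    Feeding such noisy arrival-time differences to a retrieval algorithm and comparing the retrieved with the exact locations is what produces the error maps the abstract describes.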

  16. OSSMETER D3.4 – Language-Specific Source Code Quality Analysis

    NARCIS (Netherlands)

    J.J. Vinju (Jurgen); A. Shahi (Ashim); H.J.S. Basten (Bas)

    2014-01-01

This deliverable is part of WP3: Source Code Quality and Activity Analysis. It provides descriptions and prototypes of the tools that are needed for source code quality analysis in open source software projects. It builds upon the results of: • Deliverable 3.1 where infrastructure and

  17. Theory and applications of numerical analysis

    CERN Document Server

    Phillips, G M

    1996-01-01

This text is a self-contained Second Edition, providing an introductory account of the main topics in numerical analysis. The book emphasizes both the theorems which show the underlying rigorous mathematics and the algorithms which define precisely how to program the numerical methods. Both theoretical and practical examples are included. Features include: a unique blend of theory and applications; two brand new chapters on eigenvalues and splines; inclusion of formal algorithms; numerous fully worked examples; and a large number of problems, many with solutions.

  18. Who uses nursing theory? A univariate descriptive analysis of five years' research articles.

    Science.gov (United States)

    Bond, A Elaine; Eshah, Nidal Farid; Bani-Khaled, Mohammed; Hamad, Atef Omar; Habashneh, Samira; Kataua', Hussein; al-Jarrah, Imad; Abu Kamal, Andaleeb; Hamdan, Falastine Rafic; Maabreh, Roqia

    2011-06-01

Since the early 1950s, nursing leaders have worked diligently to build the scientific discipline of nursing, integrating theory, research and practice. Recently, the role of theory has again come into question, with some scientists claiming nurses are not using theory to guide their research and thereby improve practice. The purposes of this descriptive study were to determine: (i) were nursing scientists' research articles in leading nursing journals based on theory? (ii) if so, were the theories nursing theories or borrowed theories? (iii) were the theories integrated into the studies, or were they used as organizing frameworks? Research articles from seven top ISI journals were analysed, excluding regularly featured columns, meta-analyses, secondary analyses, case studies and literature reviews. The authors used King's dynamic interacting systems framework and Goal Attainment Theory as an organizing framework. They developed consensus on how to identify the integration of theory, searching the title, abstract, aims, methods, discussion and conclusion sections of each research article, whether quantitative or qualitative. Of 2857 articles published in the seven journals from 2002 through 2006, 2184 (76%) were research articles. Of the 837 (38%) authors who used theories, 460 (55%) used nursing theories and 377 (45%) used other theories. Of those who used theory, 776 (93%) integrated it into their studies, including qualitative studies, while 51 (7%) reported they used theory as an organizing framework for their studies. Closer analysis revealed theory principles were implicitly implied, even in research reports that did not explicitly report theory usage. Increasing numbers of nursing research articles (though not percentagewise) continue to be guided by theory, and not always by nursing theory. Newer nursing research methods may not explicitly state the use of nursing theory, though it is implicitly implied. © 2010 The Authors. Scandinavian Journal of Caring

  19. Theory, analysis and design of RF interferometric sensors

    CERN Document Server

    Nguyen, Cam

    2012-01-01

Theory, Analysis and Design of RF Interferometric Sensors presents the theory, analysis and design of RF interferometric sensors. RF interferometric sensors are attractive for various sensing applications that require very fine resolution and accuracy as well as fast speed. The book also presents two millimeter-wave interferometric sensors realized using RF integrated circuits. The developed millimeter-wave homodyne sensor shows sub-millimeter resolution on the order of 0.05 mm without correction for the non-linear phase response of the sensor's quadrature mixer. The designed millimeter-wave double-channel homodyne sensor provides a resolution of only 0.01 mm, or 1/840th of the operating wavelength, and can inherently suppress the non-linearity of the sensor's quadrature mixer. The experimental results of displacement and velocity measurement are presented as a way to demonstrate the sensing ability of RF interferometry and to illustrate its many possible applications in sensing. The book is succinct, ye...

  20. A theory of evidence for undeclared nuclear activities

    International Nuclear Information System (INIS)

    King, J.L.

    1995-01-01

    The IAEA has recently explored techniques to augment and improve its existing safeguards information systems as part of Program 93 + 2 in order to address the detection of undeclared activities. Effective utilization of information on undeclared activities requires a formulation of the relationship between the information being gathered and the resulting safeguards assurance. The process of safeguards is represented as the gathering of evidence to provide assurance that no undeclared activities take place. It is shown that the analysis of this process can be represented by a theory grounded in the Dempster-Shafer theory of evidence and the concept of possibility. This paper presents the underlying evidence theory required to support a new information system tool for the analysis of information with respect to undeclared activities. The Dempster-Shafer theory serves as the calculus for the combination of diverse sources of evidence, and when applied to safeguards information, provides a basis for interpreting the result of safeguards indicators and measurements -- safeguards assurance
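
    As an illustration of the calculus the paper builds on, here is a minimal implementation of Dempster's rule of combination. The safeguards evidence sources and mass assignments below are invented for the example, not drawn from the paper.

    ```python
    from itertools import product

    def dempster_combine(m1, m2):
        """Combine two mass functions (dicts mapping frozenset -> mass)
        with Dempster's rule; raises if the evidence fully conflicts."""
        combined, conflict = {}, 0.0
        for (b, mb), (c, mc) in product(m1.items(), m2.items()):
            a = b & c
            if a:                      # non-empty intersection: supported focal set
                combined[a] = combined.get(a, 0.0) + mb * mc
            else:                      # empty intersection: conflicting mass
                conflict += mb * mc
        if conflict >= 1.0:
            raise ValueError("totally conflicting evidence")
        return {a: m / (1.0 - conflict) for a, m in combined.items()}

    # Hypothetical frame of discernment: {declared-only activity, undeclared activity}.
    D, U = "declared_only", "undeclared"
    theta = frozenset({D, U})
    env_sampling = {frozenset({U}): 0.6, theta: 0.4}                      # source 1
    imagery = {frozenset({U}): 0.3, frozenset({D}): 0.2, theta: 0.5}      # source 2

    for focal, mass in dempster_combine(env_sampling, imagery).items():
        print(sorted(focal), round(mass, 4))
    ```

    The residual mass on the whole frame theta quantifies the remaining ignorance, which is what distinguishes this evidence calculus from an ordinary probability assignment.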

  1. Relationship of Source Selection Methods to Contract Outcomes: an Analysis of Air Force Source Selection

    Science.gov (United States)

    2015-12-01

On some occasions, performance is terminated early; this can occur due to either mutual agreement or a breach of contract by one of the parties (Garrett... This December 2015 study by Capt Jacques Lamoureux, USAF, examines the relationship of source selection methods to contract outcomes, focusing on the contract management process, with special emphasis on the source selection methods of tradeoff and lowest price technically acceptable (LPTA

  2. Power System Oscillatory Behaviors: Sources, Characteristics, & Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Follum, James D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Tuffner, Francis K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dosiek, Luke A. [Union College, Schenectady, NY (United States); Pierre, John W. [Univ. of Wyoming, Laramie, WY (United States)

    2017-05-17

    This document is intended to provide a broad overview of the sources, characteristics, and analyses of natural and forced oscillatory behaviors in power systems. These aspects are necessarily linked. Oscillations appear in measurements with distinguishing characteristics derived from the oscillation’s source. These characteristics determine which analysis methods can be appropriately applied, and the results from these analyses can only be interpreted correctly with an understanding of the oscillation’s origin. To describe oscillations both at their source within a physical power system and within measurements, a perspective from the boundary between power system and signal processing theory has been adopted.

  3. Between practice and theory: Melanie Klein, Anna Freud and the development of child analysis.

    Science.gov (United States)

    Donaldson, G

    1996-04-01

    An examination of the early history of child analysis in the writings of Melanie Klein and Anna Freud reveals how two different and opposing approaches to child analysis arose at the same time. The two methods of child analysis are rooted in a differential emphasis on psychoanalytic theory and practice. The Kleinian method derives from the application of technique while the Anna Freudian method is driven by theory. Furthermore, by holding to the Freudian theory of child development Anna Freud was forced to limit the scope of child analysis, while Klein's application of Freudian practice has led to new discoveries about the development of the infant psyche.

  4. Kajian Unified Theory of Acceptance and Use of Technology Dalam Penggunaan Open Source Software Database Management System

    Directory of Open Access Journals (Sweden)

    Michael Sonny

    2016-06-01

Full Text Available Computer software today is developing at a remarkable pace, and this development is not limited to software under particular licenses; open source software is developing as well. This development is of course very encouraging for computer users, especially in education and among students, because users have several choices of applications. Open source software also offers products that are generally free, come with their source code, and grant the freedom to modify and develop them. Research on open-source-based applications covers a wide variety, such as applications for programming (PHP, Gambas), database management systems (MySQL, SQLite) and browsing (Mozilla Firefox, Opera). This study examines the acceptance of DBMS (Database Management System) applications such as MySQL and SQLite using a model developed by Venkatesh (2003), namely UTAUT (Unified Theory of Acceptance and Use of Technology). Certain factors also influence the learning of these open source applications; such factors, called moderating factors, can influence effectiveness and efficiency. The results are thus expected to facilitate smooth learning of these open-source-based applications.   Keywords: open source, Database Management System (DBMS), moderating

  5. Qualitative research in healthcare: an introduction to grounded theory using thematic analysis.

    Science.gov (United States)

    Chapman, A L; Hadfield, M; Chapman, C J

    2015-01-01

    In today's NHS, qualitative research is increasingly important as a method of assessing and improving quality of care. Grounded theory has developed as an analytical approach to qualitative data over the last 40 years. It is primarily an inductive process whereby theoretical insights are generated from data, in contrast to deductive research where theoretical hypotheses are tested via data collection. Grounded theory has been one of the main contributors to the acceptance of qualitative methods in a wide range of applied social sciences. The influence of grounded theory as an approach is, in part, based on its provision of an explicit framework for analysis and theory generation. Furthermore the stress upon grounding research in the reality of participants has also given it credence in healthcare research. As with all analytical approaches, grounded theory has drawbacks and limitations. It is important to have an understanding of these in order to assess the applicability of this approach to healthcare research. In this review we outline the principles of grounded theory, and focus on thematic analysis as the analytical approach used most frequently in grounded theory studies, with the aim of providing clinicians with the skills to critically review studies using this methodology.

  6. Source Signals Separation and Reconstruction Following Principal Component Analysis

    Directory of Open Access Journals (Sweden)

    WANG Cheng

    2014-02-01

Full Text Available For the problem of separating and reconstructing source signals from observed signals, the physical significance of the blind source separation model and of independent component analysis is not very clear, and the solution is not unique. Aiming at these disadvantages, a new linear and instantaneous mixing model and a novel method for separating and reconstructing source signals from observed signals based on principal component analysis (PCA) are put forward. The assumption of this new model is that the source signals are statistically unrelated rather than independent, which differs from the traditional blind source separation model. A one-to-one relationship between the linear and instantaneous mixing matrix of the new model and the linear compound matrix of PCA, and a one-to-one relationship between the unrelated source signals and the principal components, are demonstrated using the concepts of the linear separation matrix and the unrelatedness of source signals. Based on this theoretical link, the source signal separation and reconstruction problem is then changed into a PCA of the observed signals. The theoretical derivation and numerical simulation results show that, despite Gaussian measurement noise, the waveform and amplitude information of an unrelated source signal can be separated and reconstructed by PCA when the linear mixing matrix is column-orthogonal and normalized; only the waveform information can be separated and reconstructed when the mixing matrix is column-orthogonal but not normalized; and an unrelated source signal cannot be separated and reconstructed by PCA when the mixing matrix is not column-orthogonal or the mixing is not linear.
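
    A hedged numerical sketch of the column-orthogonal, normalized case described above: two uncorrelated sources are mixed by a rotation matrix and recovered (up to sign and ordering) as the principal components of the observations. The signals, rotation angle and noise level are illustrative choices, not the paper's simulation setup.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n = 10_000
    t = np.linspace(0.0, 1.0, n)

    # Two zero-correlation source signals with distinct variances.
    s1 = np.sin(2 * np.pi * 5 * t)
    s2 = np.sign(np.sin(2 * np.pi * 3 * t))   # square wave
    S = np.vstack([s1, s2])

    # Column-orthogonal, normalized mixing matrix (a rotation), matching the
    # abstract's condition for full waveform-and-amplitude recovery.
    theta = 0.6
    M = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    X = M @ S + 0.05 * rng.normal(size=(2, n))   # observations + Gaussian noise

    # PCA via eigendecomposition of the observation covariance.
    X0 = X - X.mean(axis=1, keepdims=True)
    w, V = np.linalg.eigh(X0 @ X0.T / n)
    recovered = V.T @ X0   # principal components ~ sources (up to sign/order)

    for r in recovered:
        c = max(abs(np.corrcoef(r, s1)[0, 1]), abs(np.corrcoef(r, s2)[0, 1]))
        print(f"best |correlation| with a true source: {c:.3f}")
    ```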

  7. Theory analysis for Pender's health promotion model (HPM) by Barnum's criteria: a critical perspective.

    Science.gov (United States)

    Khoshnood, Zohreh; Rayyani, Masoud; Tirgari, Batool

    2018-01-13

Background Analysis of nursing theoretical works and their role in knowledge development is presented as an essential process of critical reflection. The health promotion model (HPM) focuses on helping people achieve higher levels of well-being and identifies background factors that influence health behaviors. Objectives This paper aims to evaluate and critique the HPM using Barnum's criteria. Methods The present study reviewed books and articles retrieved from the ProQuest, PubMed and Blackwell databases. The method of evaluation for this model is based on Barnum's criteria for the analysis, application and evaluation of nursing theories. The criteria selected by Barnum embrace both internal and external criticism. Internal criticism deals with how theory components fit with each other (the internal construction of the theory), and external criticism deals with the way in which the theory relates to the extended world (considering the theory in its relationships to human beings, nursing, and health). Results The electronic database search yielded 27,717 titles and abstracts. Following removal of duplicates, 18,963 titles and abstracts were screened using the inclusion criteria and 1278 manuscripts were retrieved. Of these, 80 were specific to the HPM and 23 to the analysis of nursing theories relevant to the aim of this article. After final selection using the inclusion criteria for this review, 28 manuscripts were identified as examining the factors contributing to theory analysis. Evaluation of the health promotion theory showed that the philosophical claims and their content are consistent and clear. The HPM has a logical structure and has been applied to diverse age groups from differing cultures with varying health concerns. Conclusion In conclusion, among the strategies for theory critique, Barnum's approach is structured and accurate, and considers theory in its relationship to human beings, community psychiatric nursing, and health. While according to Pender, nursing assessment, diagnosis and interventions

  8. Salinas. Theory Manual Version 2.8

    Energy Technology Data Exchange (ETDEWEB)

    Reese, Garth M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Walsh, Timothy [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bhardwaj, Manoj K. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2009-02-01

Salinas provides a massively parallel implementation of structural dynamics finite element analysis, required for high fidelity, validated models used in modal, vibration, static and shock analysis of structural systems. This manual describes the theory behind many of the constructs in Salinas. For a more detailed description of how to use Salinas, we refer the reader to the Salinas User's Notes. Many of the constructs in Salinas are pulled directly from published material. Where possible, these materials are referenced herein. However, certain functions in Salinas are specific to our implementation. We try to be far more complete in those areas. The theory manual was developed from several sources including general notes, a programmer notes manual, the user's notes and of course the material in the open literature.

  9. Generalised perturbation theory and source of information through chemical measurements

    International Nuclear Information System (INIS)

    Lelek, V.; Marek, T.

    2001-01-01

It is important to perform all analyses and collect all information from the operation of a new facility (which the transmutation demonstration unit will surely be), both to verify that the operation corresponds to the forecast and to correct the equations describing the facility. The behaviour of the molten salt reactor, and in particular its system of measurement, is very different from that of a solid fuel reactor. Key information for the long-time kinetics could be nearly on-line knowledge of the fuel composition. In this work it is shown how to include this information in the control and how to use such data for the correction of neutron cross-sections of the higher actinides or other characteristics. The problem of safety (the change from a boundary-value problem to an initial-value problem) is also mentioned. The problem is transformed using generalised perturbation theory, in which the adjoint function is obtained through the solution of equations whose right-hand side has the form of a source. Such an approach should be a theoretical basis for the calculation of the sensitivity coefficients. (authors)

  10. A source-initiated on-demand routing algorithm based on the Thorup-Zwick theory for mobile wireless sensor networks.

    Science.gov (United States)

    Mao, Yuxin; Zhu, Ping

    2013-01-01

The unreliability and dynamics of mobile wireless sensor networks make it hard to perform end-to-end communications. This paper presents a novel source-initiated on-demand routing mechanism for efficient data transmission in mobile wireless sensor networks. It explores the Thorup-Zwick theory to achieve source-initiated on-demand routing with time efficiency. The mechanism is able to find the shortest routing path between source and target in a network and transfer data in linear time. The algorithm is easy to implement and perform in resource-constrained mobile wireless sensor networks. We also evaluate the approach by analyzing its cost in detail. The analysis shows that the approach is efficient in supporting data transmission in mobile wireless sensor networks.
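
    The paper's Thorup-Zwick-based construction is not reproduced here; as a stand-in illustration of linear-time, source-initiated route discovery on an unweighted network, a breadth-first search finds a shortest path in O(V + E). The sensor topology below is hypothetical.

    ```python
    from collections import deque

    def shortest_route(adj, source, target):
        """Breadth-first route discovery on an unweighted graph in O(V + E),
        i.e. linear time; the Thorup-Zwick oracle machinery the paper draws
        on is considerably richer than this illustration."""
        parent = {source: None}
        queue = deque([source])
        while queue:
            node = queue.popleft()
            if node == target:          # reconstruct the path back to the source
                path, cur = [], node
                while cur is not None:
                    path.append(cur)
                    cur = parent[cur]
                return path[::-1]
            for nxt in adj.get(node, ()):
                if nxt not in parent:   # first visit fixes the shortest route
                    parent[nxt] = node
                    queue.append(nxt)
        return None  # no route (e.g. the network is partitioned)

    # Hypothetical sensor topology as an adjacency list.
    adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3]}
    print(shortest_route(adj, 0, 4))  # e.g. [0, 1, 3, 4]
    ```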

  11. Activity Analysis: Bridging the Gap between Production Economics Theory and Practical Farm Management Procedures

    OpenAIRE

    Longworth, John W.; Menz, Kenneth M.

    1980-01-01

    This paper is addressed to the traditional problem of demonstrating the relevance of production theory to management-oriented people. Activity analysis, it is argued, is the most appropriate pedagogic framework within which to commence either a production economics or a farm management course. Production economics theory has not been widely accepted as a useful method for the analysis of practical management problems. The theory has been traditionally presented in terms of continuous function...

  12. Stability Analysis for Car Following Model Based on Control Theory

    International Nuclear Information System (INIS)

    Meng Xiang-Pei; Li Zhi-Peng; Ge Hong-Xia

    2014-01-01

Stability analysis is one of the key issues in car-following theory. A stability analysis with a Lyapunov function is conducted for the two-velocity-difference car-following model (TVDM for short), and a control method to suppress traffic congestion is introduced. Numerical simulations are given, and the results are consistent with the theoretical analysis.
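
    A rough numerical illustration of the kind of stability experiment the abstract describes, using a generic optimal-velocity-type car-following model rather than the paper's TVDM (whose exact equations are not given here); the optimal velocity function is an assumed, commonly used form. A final/initial spread ratio below 1 indicates the perturbation decays.

    ```python
    import numpy as np

    def simulate(sensitivity, platoon=20, road=500.0, steps=4000, dt=0.05):
        """Perturb a uniform platoon on a ring road and report how the
        headway spread evolves (a numerical proxy for string stability)."""
        def V(h):   # optimal-velocity function (assumed, generic form)
            return 16.8 * (np.tanh(0.086 * (h - 25.0)) + 0.913)

        x = np.linspace(0.0, road, platoon, endpoint=False)
        v = np.full(platoon, V(road / platoon))   # start at equilibrium speed
        x[0] += 1.0                               # small localized perturbation

        first = None
        for _ in range(steps):
            h = (np.roll(x, -1) - x) % road       # headways on the ring
            v += sensitivity * (V(h) - v) * dt    # relax toward optimal velocity
            x = (x + v * dt) % road
            if first is None:
                first = h.std()
        return h.std() / first

    for a in (1.0, 4.0):
        print(f"sensitivity a={a}: final/initial headway spread = {simulate(a):.3f}")
    ```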

  13. Analysis of graphic representations of activity theory in international journals

    Directory of Open Access Journals (Sweden)

    Marco André Mazzarotto

    2016-05-01

Full Text Available Activity theory is a relevant framework for the design field, and its graphic representations are cognitive artifacts that aid the understanding, use and communication of the theory. However, there is a lack of consistency in the graphics and labels used in these representations. Based on this, the aim of this study was to identify, analyze and evaluate these differences and to propose a representation that aims to be more suitable for the theory. The method is a literature review based on Engeström (2001) and his three generations of visual models, combined with a graphical analysis of representations collected from a hundred papers in international journals.

  14. The use of Theory in Family Therapy Research: Content Analysis and Update.

    Science.gov (United States)

    Chen, Ruoxi; Hughes, Alexandria C; Austin, Jason P

    2017-07-01

    In this study, we evaluated 275 empirical studies from Journal of Marital and Family Therapy and Family Process from 2010 to 2015 on their use of theory, and compared our findings to those of a similar previous analysis (Hawley & Geske, 2000). Overall, theory seems to have become much better incorporated in empirical family therapy research, with only 16.4% of the articles not using theory in either their introductory or discussion sections. Theory appeared better incorporated in the introductory sections than in the discussion sections. Systems theory remained the most commonly used conceptual framework, followed by attachment theory. We discuss areas for improving theory incorporation in family therapy research, and offer suggestions for both family therapy researchers and educators. © 2017 American Association for Marriage and Family Therapy.

  15. Comparative analysis of traditional and alternative energy sources

    Directory of Open Access Journals (Sweden)

    Adriana Csikósová

    2008-11-01

Full Text Available The presented thesis, Comparative analysis of traditional and alternative energy sources, draws on theoretical information sources, research in the firm, internal data, trends in company and market development, a description of the problem and its application. The theoretical part is dedicated to traditional and alternative energy resources, their reserves, trends in their use and development, and their balance in the world, the EU and Slovakia. The analytical part of the thesis reflects the profile of the company and an evaluation of the thermal pump market using the General Electric method. As the company implements, among other products, thermal pumps based on geothermal energy and ambient energy (air), the mission of the comparative analysis is to compare traditional energy resources with the thermal pump from the ecological, utility and economic points of view. The results of the comparative analysis are summarized in a SWOT analysis. The thesis also includes a questionnaire proposal for improving effectiveness and analysing customer satisfaction, as well as the expected possibilities of support for alternative energy resources (benefits from the government and EU funds).

  16. Unique effects and moderators of effects of sources on self-efficacy: A model-based meta-analysis.

    Science.gov (United States)

    Byars-Winston, Angela; Diestelmann, Jacob; Savoy, Julia N; Hoyt, William T

    2017-11-01

Self-efficacy beliefs are strong predictors of academic pursuits, performance, and persistence, and in theory are developed and maintained by 4 classes of experiences Bandura (1986) referred to as sources: performance accomplishments (PA), vicarious learning (VL), social persuasion (SP), and affective arousal (AA). The effects of sources on self-efficacy vary by performance domain and individual difference factors. In this meta-analysis (k = 61 studies of academic self-efficacy; N = 8,965), we employed B. J. Becker's (2009) model-based approach to examine cumulative effects of the sources as a set and unique effects of each source, controlling for the others. Following Becker's recommendations, we used available data to create a correlation matrix for the 4 sources and self-efficacy, then used these meta-analytically derived correlations to test our path model. We further examined moderation of these associations by subject area (STEM vs. non-STEM), grade, sex, and ethnicity. PA showed by far the strongest unique association with self-efficacy beliefs. Subject area was a significant moderator, with sources collectively predicting self-efficacy more strongly in non-STEM (k = 14) compared with STEM (k = 47) subjects (R² = .37 and .22, respectively). Within studies of STEM subjects, grade level was a significant moderator of the coefficients in our path model, as were 2 continuous study characteristics (percent non-White and percent female). Practical implications of the findings and future research directions are discussed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
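
    Becker's model-based approach mentioned above amounts to fitting a path model to a meta-analytically pooled correlation matrix: standardized path coefficients follow from beta = Rxx⁻¹ rxy. The correlations below are invented for illustration (not the paper's data); they merely show how each source's unique effect is obtained by controlling for the others.

    ```python
    import numpy as np

    # Hypothetical pooled correlations among the four sources (PA, VL, SP, AA)
    # and self-efficacy (SE); illustrative values only.
    labels = ["PA", "VL", "SP", "AA"]
    Rxx = np.array([            # source intercorrelations
        [1.00, 0.45, 0.50, 0.35],
        [0.45, 1.00, 0.55, 0.30],
        [0.50, 0.55, 1.00, 0.40],
        [0.35, 0.30, 0.40, 1.00],
    ])
    rxy = np.array([0.55, 0.30, 0.35, 0.30])   # correlations with SE

    # Standardized path coefficients from the synthesized matrix: beta = Rxx^-1 rxy
    beta = np.linalg.solve(Rxx, rxy)
    r_squared = float(rxy @ beta)              # variance explained jointly

    for name, b in zip(labels, beta):
        print(f"unique effect of {name}: beta = {b:+.3f}")
    print(f"R^2 for self-efficacy: {r_squared:.3f}")
    ```

    With these invented numbers, PA's unique effect dominates once the overlap among sources is controlled for, mirroring the qualitative pattern the abstract reports.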

  17. A network analysis of leadership theory : the infancy of integration.

    OpenAIRE

    Meuser, J. D.; Gardner, W. L.; Dinh, J. E.; Hu, J.; Liden, R. C.; Lord, R. G.

    2016-01-01

    We investigated the status of leadership theory integration by reviewing 14 years of published research (2000 through 2013) in 10 top journals (864 articles). The authors of these articles examined 49 leadership approaches/theories, and in 293 articles, 3 or more of these leadership approaches were included in their investigations. Focusing on these articles that reflected relatively extensive integration, we applied an inductive approach and used graphic network analysis as a guide for drawi...

  18. Spot-on or not? : an analysis of Seurat's colour theory

    OpenAIRE

    Marks-Donaldson, Roberta Lynne

    1997-01-01

    An analysis of mid- to late-nineteenth century scientific colour theories sets the stage for the introduction of the artistic style of French painter Georges Seurat. His traditional beaux-arts training, extraordinary skills as a draughtsman, and keen interest in the then existing science theories on colour combined in his person to create a new approach called Divisionism, (also called Pointillisme, pointillism, and melange optique). As Seurat's readings of scientific literature and his pract...

  19. Source apportionment and sensitivity analysis: two methodologies with two different purposes

    Science.gov (United States)

    Clappier, Alain; Belis, Claudio A.; Pernigotti, Denise; Thunis, Philippe

    2017-11-01

    This work reviews the existing methodologies for source apportionment and sensitivity analysis to identify key differences and stress their implicit limitations. The emphasis is laid on the differences between source impacts (sensitivity analysis) and contributions (source apportionment) obtained by using four different methodologies: brute-force top-down, brute-force bottom-up, tagged species and decoupled direct method (DDM). A simple theoretical example to compare these approaches is used highlighting differences and potential implications for policy. When the relationships between concentration and emissions are linear, impacts and contributions are equivalent concepts. In this case, source apportionment and sensitivity analysis may be used indifferently for both air quality planning purposes and quantifying source contributions. However, this study demonstrates that when the relationship between emissions and concentrations is nonlinear, sensitivity approaches are not suitable to retrieve source contributions and source apportionment methods are not appropriate to evaluate the impact of abatement strategies. A quantification of the potential nonlinearities should therefore be the first step prior to source apportionment or planning applications, to prevent any limitations in their use. When nonlinearity is mild, these limitations may, however, be acceptable in the context of the other uncertainties inherent to complex models. Moreover, when using sensitivity analysis for planning, it is important to note that, under nonlinear circumstances, the calculated impacts will only provide information for the exact conditions (e.g. emission reduction share) that are simulated.
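
    A toy calculation makes the nonlinearity point concrete; the concentration function below is invented for illustration. When an interaction term is present, the impact of removing a source (sensitivity analysis) no longer equals its tagged contribution (source apportionment); with the interaction coefficient set to zero, the two numbers coincide.

    ```python
    # Toy nonlinear chemistry: concentration depends on two emission sources
    # with an interaction term (purely illustrative, not a real mechanism).
    def concentration(e1, e2):
        return 0.8 * e1 + 0.5 * e2 + 0.3 * e1 * e2

    e1, e2 = 1.0, 1.0
    base = concentration(e1, e2)

    # Sensitivity analysis (brute-force): impact of removing source 1 entirely.
    impact_1 = base - concentration(0.0, e2)

    # Source apportionment: with tagging, the interaction term is shared
    # between the sources (split equally here, one common convention).
    contribution_1 = 0.8 * e1 + 0.5 * (0.3 * e1 * e2)

    print(f"baseline concentration:   {base:.2f}")        # 1.60
    print(f"impact of source 1:       {impact_1:.2f}")    # 1.10
    print(f"contribution of source 1: {contribution_1:.2f}")  # 0.95
    ```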

  20. Source-system windowing for speech analysis

    NARCIS (Netherlands)

    Yegnanarayana, B.; Satyanarayana Murthy, P.; Eggen, J.H.

    1993-01-01

    In this paper we propose a speech-analysis method to bring out characteristics of the vocal tract system in short segments which are much less than a pitch period. The method performs windowing in the source and system components of the speech signal and recombines them to obtain a signal reflecting

  1. Computerized study of several electrostatic, surface-ionization ion-source configurations

    Energy Technology Data Exchange (ETDEWEB)

    Balestrini, S.J.; Schuster, B.G.

    1984-08-01

A computer-based method is presented whereby the optics of electrostatic, surface-ionization ion-source designs can be analyzed theoretically. The analysis solves for the luminosity and dispersion of a beam of charged particles at the final collimating slit and at locations preceding the slit. The performance of an ion source tested in 1960, as well as some newer optical configurations, is compared with theory.

  2. Open source development

    DEFF Research Database (Denmark)

    Ulhøi, John Parm

    2004-01-01

    This paper addresses innovations based on open source or non-proprietary knowledge. Viewed through the lens of private property theory, such agency appears to be a true anomaly. However, by a further turn of the theoretical kaleidoscope, we will show that there may be perfectly justifiable reasons...... for not regarding open source innovations as anomalies. The paper is based on three sectorial and generic cases of open source innovation, which is an offspring of contemporary theory made possible by combining elements of the model of private agency with those of the model of collective agency. In closing...

  3. Analysis of General Power Counting Rules in Effective Field Theory

    CERN Document Server

    Gavela, B M; Manohar, A V; Merlo, L

    2016-01-01

We derive the general counting rules for a quantum effective field theory (EFT) in $\mathsf{d}$ dimensions. The rules are valid for strongly and weakly coupled theories, and predict that all kinetic energy terms are canonically normalized. They determine the energy dependence of scattering cross sections in the range of validity of the EFT expansion. The size of cross sections is controlled by the $\Lambda$ power counting of EFT, not by chiral counting, even for chiral perturbation theory ($\chi$PT). The relation between $\Lambda$ and $f$ is generalized to $\mathsf{d}$ dimensions. We show that the naive dimensional analysis $4\pi$ counting is related to $\hbar$ counting. The EFT counting rules are applied to $\chi$PT, to Standard Model EFT and to the non-trivial case of Higgs EFT, which combines the $\Lambda$ and chiral counting rules within a single theory.

  4. An analysis of natural gas exploration potential in the Qiongdongnan Basin by use of the theory of “joint control of source rocks and geothermal heat”

    Directory of Open Access Journals (Sweden)

    Zhang Gongcheng

    2014-10-01

Full Text Available The Oligocene Yacheng Fm contains the most important source rocks that have been confirmed by exploratory wells in the Qiongdongnan Basin. The efficiency of these source rocks is the key to a breakthrough in natural gas exploration in the study area. This paper analyzes the hydrocarbon potential of each sag in this basin from the perspective of the joint control of source rocks and geothermal heat. Two types of source rocks occur in the Yacheng Fm, namely mudstone of transitional facies and mudstone of neritic facies. Both are dominated by type-III kerogen, followed by type-II. Their organic matter abundances are controlled by the amount of continental clastic input: the mudstone of transitional facies is commonly higher in organic matter abundance, while that of neritic facies is lower. The coal-measure source rocks of transitional facies were mainly formed in environments such as delta plains, coastal plains and barrier tidal flat-marshes. Due to the control of Cenozoic lithosphere extension and the influence of neotectonism, the geothermal gradient, terrestrial heat flow value (HFV) and level of thermal evolution are generally high in deep water. The hot setting not only determines the predominance of gas generation in the deep-water sags, but can also promote the shallow-buried source rocks in shallow water into the oil window to generate oil. In addition to promoting the hydrocarbon generation of source rocks, the high geothermal gradient and high heat flow value can also speed up the cracking of residual hydrocarbons, thus enhancing hydrocarbon generation efficiency and capacity. According to the theory of joint control of source rocks and geothermal heat on hydrocarbon generation, we comprehensively evaluate and rank the exploration potentials of the major sags in the Qiongdongnan Basin. These sags are divided into 3 types, of which the type-I sags, including Yanan, Lingshui, Baodao, Ledong and Huaguang, have the highest hydrocarbon exploration potential.

  5. Situation-specific theories from the middle-range transitions theory.

    Science.gov (United States)

    Im, Eun-Ok

    2014-01-01

    The purpose of this article was to analyze the theory development process of the situation-specific theories that were derived from the middle-range transitions theory. This analysis aims to provide directions for future development of situation-specific theories. First, transitions theory is concisely described with its history, goal, and major concepts. Then, the approach that was used to retrieve the situation-specific theories derived from transitions theory is described. Next, an analysis of 6 situation-specific theories is presented. Finally, 4 themes reflecting commonalities and variances in the theory development process are discussed with implications for future theoretical development.

  6. Principle-based concept analysis: intentionality in holistic nursing theories.

    Science.gov (United States)

    Aghebati, Nahid; Mohammadi, Eesa; Ahmadi, Fazlollah; Noaparast, Khosrow Bagheri

    2015-03-01

    This is a report of a principle-based concept analysis of intentionality in holistic nursing theories. A principle-based concept analysis method was used to analyze seven holistic theories. The data included eight books and 31 articles (1998-2011), which were retrieved through MEDLINE and CINAHL. Erickson, Kriger, Parse, Watson, and Zahourek define intentionality as a capacity, a focused consciousness, and a pattern of human being. Rogers and Newman do not explicitly mention intentionality; however, they do explain pattern and consciousness (epistemology). Intentionality has been operationalized as a core concept of nurse-client relationships (pragmatic). The theories are consistent on intentionality as a noun and as an attribute of the person-intentionality is different from intent and intention (linguistic). There is ambiguity concerning the boundaries between intentionality and consciousness (logic). Theoretically, intentionality is an evolutionary capacity to integrate human awareness and experience. Because intentionality is an individualized concept, we introduced it as "a matrix of continuous known changes" that emerges in two forms: as a capacity of human being and as a capacity of transpersonal caring. This study has produced a theoretical definition of intentionality and provides a foundation for future research to further investigate intentionality to better delineate its boundaries. © The Author(s) 2014.

  7. Dependence theory via game theory

    NARCIS (Netherlands)

    Grossi, D.; Turrini, P.

    2011-01-01

    In the multi-agent systems community, dependence theory and game theory are often presented as two alternative perspectives on the analysis of social interaction. Up till now no research has been done relating these two approaches. The unification presented provides dependence theory with the sort…

  8. Theory and Analysis of JolTech’s GyroPTO

    DEFF Research Database (Denmark)

    Kurniawan, Adi; Kofoed, Jens Peter; Kramer, Morten Mejlhede

    This report summarizes the work done by Aalborg University (AAU) for the project "Gyro electric energy converter theory and analysis" (Olsen, 2015). The project's objective is to build theoretical knowledge about gyro electric energy conversion systems, particularly for use in connection…

  9. Mathematical theory of compressible viscous fluids analysis and numerics

    CERN Document Server

    Feireisl, Eduard; Pokorný, Milan

    2016-01-01

    This book offers an essential introduction to the mathematical theory of compressible viscous fluids. The main goal is to present analytical methods from the perspective of their numerical applications. Accordingly, we introduce the principal theoretical tools needed to handle well-posedness of the underlying Navier-Stokes system, study the problems of sequential stability, and, lastly, construct solutions by means of an implicit numerical scheme. Offering a unique contribution – by exploring in detail the “synergy” of analytical and numerical methods – the book offers a valuable resource for graduate students in mathematics and researchers working in mathematical fluid mechanics. Mathematical fluid mechanics concerns problems that are closely connected to real-world applications and is also an important part of the theory of partial differential equations and numerical analysis in general. This book highlights the fact that numerical and mathematical analysis are not two separate fields of mathematic...

  10. Contrast and Critique of Two Approaches to Discourse Analysis: Conversation Analysis and Speech Act Theory

    Directory of Open Access Journals (Sweden)

    Nguyen Van Han

    2014-08-01

    Discourse analysis, as Murcia and Olshtain (2000) assume, is a vast study of language in use that extends beyond the sentence level, and it involves a more cognitive and social perspective on language use and communication exchanges. Covering a wide range of phenomena relating language to society, culture and thought, discourse analysis contains various approaches: speech act theory, pragmatics, conversation analysis, variation analysis, and critical discourse analysis. Each approach works in its own domain of discourse. In one dimension, it shares the same assumptions or general problems in discourse analysis with the other approaches: for instance, the explanation of how we organize language into units beyond sentence boundaries, or how language is used to convey information about the world, ourselves and human relationships (Schiffrin 1994: viii). In other dimensions, each approach holds distinctive characteristics contributing to the vastness of discourse analysis. This paper will mainly discuss two approaches to discourse analysis - conversation analysis and speech act theory - and will attempt to point out some similarities as well as contrasting features between the two approaches, followed by a short reflection on their strengths and weaknesses. The organizational and discourse features in the exchanges among three teachers at the College of Finance and Customs in Vietnam will be analysed in terms of conversation analysis and speech act theory.

  11. Developing interprofessional education online: An ecological systems theory analysis.

    Science.gov (United States)

    Bluteau, Patricia; Clouder, Lynn; Cureton, Debra

    2017-07-01

    This article relates the findings of a discourse analysis of an online asynchronous interprofessional learning initiative involving two UK universities. The impact of the initiative is traced over three intensive periods of online interaction, each of several weeks' duration, occurring over a three-year period, through an analysis of a random sample of discussion forum threads. The corpus of rich data drawn from the forums is interpreted using ecological systems theory, which highlights the complexity of interaction of individual, social and cultural elements. Ecological systems theory adopts a life course approach to understand how development occurs through processes of progressively more complex reciprocal interaction between people and their environment. This lens provides a novel approach for analysis and interpretation of findings with respect to the impact of pre-registration interprofessional education and the interaction between the individual and their social and cultural contexts as they progress through the 3-4 years of their programmes. Development is mapped over time (the chronosystem) to highlight the complexity of interaction across microsystems (individual), mesosystems (curriculum and institutional/care settings), exosystems (community/wider local context), and macrosystems (national context and culture). This article illustrates the intricacies of students' interprofessional development over time and the interactive effects of social ecological components in terms of professional knowledge and understanding, wider appreciation of health and social care culture, and identity work. The implications for contemporary pre-registration interprofessional education and the usefulness and applicability of ecological systems theory for future research and development are considered.

  12. Peace Journalism through the Lens of Conflict Theory: Analysis and Practice

    Directory of Open Access Journals (Sweden)

    Samuel Peleg

    2006-10-01

    Peace Journalism is a bold attempt to redefine and reconstruct the role of journalists who cover conflicts. As a new arena of knowledge, Peace Journalism draws upon several theories and disciplines to enrich its validity and applicability. A major source which peace journalism can rely on to bolster its analytical as well as its normative rigor is conflict theory. This article demonstrates how several insights from conflict theory can advance the lucidity of peace journalism and render it a powerful tool in the hands of reporters and their readers to realize the futility of conflict and to bring about its resolution. More specifically, the article introduces the notion of the media as a third party to a conflict. The third party is the facilitator of communication, the mediator or the arbitrator between the two rivaling sides. It is our contention that Peace Journalism as a third side can best enhance prospects for resolution and reconciliation by changing the norms and habits of reporting conflicts. This is succinctly illustrated in three case studies of protracted conflicts, which are described through the lenses of conflict theory. By contrasting regular newspaper coverage with peace journalism coverage, the merits of the latter are revealed.

  13. Theory of safety needs (on the theory of the origin of physical education)

    Directory of Open Access Journals (Sweden)

    V.S. Muntian

    2014-12-01

    Purpose: to examine existing theories of the origin of physical education. Material: an analysis and synthesis of more than 20 literary and Internet sources reflecting the general patterns of the emergence and development of physical education at the birth of civilization. Results: early humans lived in a permanent state of struggle for existence, associated with the satisfaction of primary needs. In the process of obtaining food and ensuring their own safety, people began to use the means of physical education, which resulted in a conscious understanding of the phenomenon and of the effectiveness of preparatory exercise. Conclusions: the theory of safety needs is put forward and substantiated for the first time as one of the most likely causes of the emergence of physical education and sport, since this need arose almost simultaneously with the appearance of the human being.

  14. Assessing Coverage of Maslow's Theory in Educational Psychology Textbooks: A Content Analysis

    Science.gov (United States)

    Wininger, Steven R.; Norman, Antony D.

    2010-01-01

    Although Maslow's hierarchy of needs theory (HNT) is one of the most prevalent theories in psychology, the authors argued that it is also one of the most misinterpreted or misrepresented, particularly in educational psychology textbooks. Therefore, after carefully reading Maslow's writings on HNT they conducted a content analysis of 18 educational…

  15. An Institutional Theory Analysis of Charter Schools: Addressing Institutional Challenges to Scale

    Science.gov (United States)

    Huerta, Luis A.; Zuckerman, Andrew

    2009-01-01

    This article presents a conceptual framework derived from institutional theory in sociology that offers two competing policy contexts in which charter schools operate--a bureaucratic frame versus a decentralized frame. An analysis of evolving charter school types based on three underlying theories of action is considered. As charter school leaders…

  16. Theory of high frequency discharge in gases under low pressures. Experimental investigation of high-frequency type ion sources; Theorie de la decharge haute frequence dans les gaz aux faibles pressions. Etude experimentale des sources d'ions du type haute frequence

    Energy Technology Data Exchange (ETDEWEB)

    Salmon, Jean

    1955-03-02

    The first part of this research thesis addresses the theory of high frequency discharge in gases under low pressures, and first proposes a calculation of the distribution function for electrons present within the gas. The author then studies the evolution of electron density within a discharge tube by assigning the governing role in electron multiplication to the secondary emission of the tube walls. The second part proposes a detailed description of a source operating at 96.5 Mc/s, a discussion of measurements performed on this source, and the search for a theoretical explanation of some of its properties. The author then briefly analyses various existing types of high frequency sources, and finally discusses their use in corpuscular microscopy and in particle accelerators. [French original, in translation] This thesis comprises two parts. The first is devoted to the theory of high frequency discharge in gases at low pressures and begins with the calculation of the distribution function of the electrons present within the gas. We then study the evolution of the electron density inside a discharge tube, attributing the essential role in electron multiplication to the secondary emission from the tube walls. We thus obtain the ignition conditions. Throughout this study, one must carefully distinguish the case where the electron mean free path in the gas is smaller than the dimensions of the vessel from the case where it is larger. The second part comprises a detailed description of a source operating at 96.5 Mc/s, an account of the measurements carried out on it, and the search for a theoretical explanation of some of its properties. We then briefly analyse the various types of high frequency ion sources existing at present, and we conclude by discussing their use in corpuscular microscopy and in particle accelerators.

  17. Linear analysis near a steady-state of biochemical networks: control analysis, correlation metrics and circuit theory

    Directory of Open Access Journals (Sweden)

    Qian Hong

    2008-05-01

    Background: Several approaches, including metabolic control analysis (MCA), flux balance analysis (FBA), correlation metric construction (CMC), and biochemical circuit theory (BCT), have been developed for the quantitative analysis of complex biochemical networks. Here, we present a comprehensive theory of linear analysis for nonequilibrium steady-state (NESS) biochemical reaction networks that unites these disparate approaches in a common mathematical framework and thermodynamic basis. Results: In this theory a number of relationships between key matrices are introduced: the matrix A obtained in the standard, linear-dynamic-stability analysis of the steady state can be decomposed as A = SR^T, where R and S are directly related to the elasticity-coefficient matrices for the fluxes and chemical potentials in MCA, respectively; the control coefficients for the fluxes and chemical potentials can be written in terms of R^T BS and S^T BS, respectively, where the matrix B is the inverse of A; the matrix S is precisely the stoichiometric matrix in FBA; and the matrix e^{At} plays a central role in CMC. Conclusion: One key finding that emerges from this analysis is that the well-known summation theorems in MCA take different forms depending on whether the metabolic steady state is maintained by flux injection or concentration clamping. We demonstrate that if rate-limiting steps exist in a biochemical pathway, they are the steps with the smallest biochemical conductances and the largest flux control coefficients. We hypothesize that biochemical networks for cellular signaling follow a different strategy for minimizing energy waste and being efficient than do biochemical networks for biosynthesis. We also discuss the intimate relationship between MCA and biochemical systems analysis (BSA).

  18. A tutorial on incremental stability analysis using contraction theory

    DEFF Research Database (Denmark)

    Jouffroy, Jerome; Fossen, Thor I.

    2010-01-01

    This paper introduces a methodology for differential nonlinear stability analysis using contraction theory (Lohmiller and Slotine, 1998). The methodology includes four distinct steps: the descriptions of two systems to be compared (the plant and the observer in the case of observer convergence…)… on several simple examples.

  19. Intuitive theories of information: beliefs about the value of redundancy.

    Science.gov (United States)

    Soll, J B

    1999-03-01

    In many situations, quantity estimates from multiple experts or diagnostic instruments must be collected and combined. Normatively, and all else equal, one should value information sources that are nonredundant, in the sense that correlation in forecast errors should be minimized. Past research on the preference for redundancy has been inconclusive. While some studies have suggested that people correctly place higher value on uncorrelated inputs when collecting estimates, others have shown that people either ignore correlation or, in some cases, even prefer it. The present experiments show that the preference for redundancy depends on one's intuitive theory of information. The most common intuitive theory identified is the Error Tradeoff Model (ETM), which explicitly distinguishes between measurement error and bias. According to ETM, measurement error can only be averaged out by consulting the same source multiple times (normatively false), and bias can only be averaged out by consulting different sources (normatively true). As a result, ETM leads people to prefer redundant estimates when the ratio of measurement error to bias is relatively high. Other participants favored different theories. Some adopted the normative model, while others were reluctant to mathematically average estimates from different sources in any circumstance. In a post hoc analysis, science majors were more likely than others to subscribe to the normative model. While tentative, this result lends insight into how intuitive theories might develop and also has potential ramifications for how statistical concepts such as correlation might best be learned and internalized. Copyright 1999 Academic Press.
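
    A minimal numerical sketch (our own illustration, with made-up numbers) of the normative point at issue: the mean of n estimates whose errors share variance sigma^2 and pairwise correlation rho has residual variance sigma^2(1 + (n-1)rho)/n, so redundant (correlated) sources average out less error than independent ones.

```python
def var_of_mean(sigma2, n, rho):
    """Variance of the mean of n estimates whose errors have equal
    variance sigma2 and a common pairwise correlation rho."""
    return sigma2 * (1 + (n - 1) * rho) / n

# three sources with unit error variance: independence beats redundancy
for rho in (0.0, 0.5, 0.9):
    print(f"rho={rho:.1f}  Var(mean)={var_of_mean(1.0, 3, rho):.3f}")
# rho=0.0 -> 0.333, rho=0.5 -> 0.667, rho=0.9 -> 0.933
```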

  1. Application of Dempster-Shafer theory in dose response outcome analysis

    Science.gov (United States)

    Chen, Wenzhou; Cui, Yunfeng; He, Yanyan; Yu, Yan; Galvin, James; Hussaini, Yousuff M.; Xiao, Ying

    2012-09-01

    The Quantitative Analysis of Normal Tissue Effects in the Clinic (QUANTEC) reviews summarize the currently available three-dimensional dose/volume/outcome data from multi-institutions and numerous articles to update and refine the normal tissue dose/volume tolerance guidelines. As pointed out in the review, the data have limitations and even some inconsistency. However, with the help of new physical and statistical techniques, the information in the review could be updated so that patient care can be continually improved. The purpose of this work is to demonstrate the application of a mathematical theory, the Dempster-Shafer theory, in dose/volume/outcome data analysis. We applied this theory to the original data obtained from published clinical studies describing dose response for radiation pneumonitis. Belief and plausibility concepts were introduced for dose response evaluation. We were also able to consider the uncertainty and inconsistency of the data from these studies with Yager's combination rule, a special methodology of Dempster-Shafer theory, to fuse the data at several specific doses. The values of belief and plausibility functions were obtained at the corresponding doses. Then we applied the Lyman-Kutcher-Burman (LKB) model to fit these values and a belief-plausibility range was obtained. This range could be considered as a probability range to assist physicians and treatment planners in determining acceptable dose-volume constraints. Finally, the parameters obtained from the LKB model fitting were compared with those in Emami and Burman's papers and those from other frequentist statistics methods. We found that Emami and Burman's parameters are within the belief-plausibility range we calculated by the Dempster-Shafer theory.

  2. Prequantum classical statistical field theory: background field as a source of everything?

    International Nuclear Information System (INIS)

    Khrennikov, Andrei

    2011-01-01

    Prequantum classical statistical field theory (PCSFT) is a new attempt to consider quantum mechanics (QM) as an emergent phenomenon, cf. De Broglie's 'double solution' approach, Bohmian mechanics, stochastic electrodynamics (SED), Nelson's stochastic QM and its generalization by Davidson, 't Hooft's models and their development by Elze. PCSFT is a comeback to a purely wave viewpoint on QM, cf. early Schrödinger. There are no quantum particles at all, only waves. In particular, photons are simply wave-pulses of the classical electromagnetic field, cf. SED. Moreover, even massive particles are special 'prequantum fields': the electron field, the neutron field, and so on. PCSFT claims that (sooner or later) people will be able to measure components of these fields: components of the 'photonic field' (the classical electromagnetic field of low intensity), the electronic field, the neutronic field, and so on. At the moment we are able to produce quantum correlations as correlations of classical Gaussian random fields. In this paper we are interested in the mathematical and physical reasons for the use of Gaussian fields. We consider prequantum signals (corresponding to quantum systems) as composed of a huge number of wave-pulses (on a very fine prequantum time scale). We speculate that the prequantum background field (the field of 'vacuum fluctuations') might play the role of a source of such pulses, i.e., the source of everything.

  3. Numerically-based ducted propeller design using vortex lattice lifting line theory

    OpenAIRE

    Stubblefield, John M.

    2008-01-01

    CIVINS (Civilian Institutions) thesis document. This thesis used vortex lattice lifting line theory to model an axisymmetric ducted propeller with no gap between the duct and the propeller. The theory required to model the duct and its interaction with the propeller was discussed and implemented in the Open-source Propeller Design and Analysis Program (OpenProp). Two routines for determining the optimum circulation distribution were considered, and a method based on calculus of variation...

  4. A One-Dimensional Thermoelastic Problem due to a Moving Heat Source under Fractional Order Theory of Thermoelasticity

    Directory of Open Access Journals (Sweden)

    Tianhu He

    2014-01-01

    The dynamic response of a one-dimensional problem for a thermoelastic rod with finite length is investigated in the context of the fractional order theory of thermoelasticity. The rod is fixed at both ends and subjected to a moving heat source. The fractional order thermoelastic coupled governing equations for the rod are formulated. The Laplace transform and its numerical inversion are applied to solve the governing equations. The variations of the temperature, displacement, and stress in the rod are obtained and illustrated graphically. The effects of time, the velocity of the moving heat source, and the fractional order parameter on the distributions of these variables are discussed in detail.
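
    To illustrate the numerical-inversion step named in the abstract, here is a self-contained sketch of the Gaver-Stehfest algorithm, one common choice (the paper does not specify which scheme it uses, so this is an assumption for illustration):

```python
import math

def stehfest_coefficients(N):
    """Gaver-Stehfest weights V_k for even N (N around 10-14 in practice)."""
    V = []
    for k in range(1, N + 1):
        s = 0.0
        for j in range((k + 1) // 2, min(k, N // 2) + 1):
            s += (j ** (N // 2) * math.factorial(2 * j)) / (
                math.factorial(N // 2 - j) * math.factorial(j)
                * math.factorial(j - 1) * math.factorial(k - j)
                * math.factorial(2 * j - k))
        V.append((-1) ** (k + N // 2) * s)
    return V

def invert_laplace(F, t, N=12):
    """Approximate f(t) from its Laplace transform F(s), for t > 0."""
    ln2 = math.log(2.0)
    V = stehfest_coefficients(N)
    return (ln2 / t) * sum(V[k - 1] * F(k * ln2 / t) for k in range(1, N + 1))

# sanity check: F(s) = 1/s^2 inverts to f(t) = t
print(invert_laplace(lambda s: 1.0 / s**2, 3.0))  # ~ 3.0
```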

  5. Analysis of jacobian and singularity of planar parallel robots using screw theory

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jung Hyun; Lee, Jeh Won; Lee, Hyuk Jin [Yeungnam Univ., Gyeongsan (Korea, Republic of)

    2012-11-15

    The Jacobian and singularity analysis of parallel robots is necessary for analyzing robot motion. Derivations of the Jacobian matrix and of singular configurations are complicated, and the velocity form of the Jacobian matrix has no geometrical meaning. In this study, screw theory is used to derive the Jacobian of parallel robots. The statics form of the Jacobian has a geometrical meaning, and singularity analysis can be performed using geometrical quantities. Furthermore, this study shows that screw theory is applicable to redundantly actuated robots as well as non-redundant robots.
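
    A small sketch (our construction, not the paper's code) of the screw-theoretic idea for a planar chain: each revolute joint contributes a unit twist column, reduced here to planar coordinates (vx, vy, wz), and a singularity appears where the twist columns lose rank.

```python
import numpy as np

def planar_revolute_twist(q):
    """Unit twist (vx, vy, wz) of a revolute joint through point q with
    axis along +z: the linear part is v = q x z_hat = (qy, -qx)."""
    qx, qy = q
    return np.array([qy, -qx, 1.0])

def screw_jacobian(joint_points):
    """Screw-based Jacobian of a planar serial chain, one twist per joint."""
    return np.column_stack([planar_revolute_twist(q) for q in joint_points])

# hypothetical planar 2R arm: base joint at the origin, second joint at the
# end of a unit link inclined 30 degrees
p1 = np.array([0.0, 0.0])
p2 = p1 + np.array([np.cos(np.deg2rad(30.0)), np.sin(np.deg2rad(30.0))])
J = screw_jacobian([p1, p2])
print(J)
print("singular:", np.linalg.matrix_rank(J) < 2)  # rank drop = singularity
```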

  6. Analysis of the tuning characteristics of microwave plasma source

    International Nuclear Information System (INIS)

    Miotk, Robert; Jasiński, Mariusz; Mizeraczyk, Jerzy

    2016-01-01

    In this paper, we present an analysis of the tuning characteristics of a waveguide-supplied metal-cylinder-based nozzleless microwave plasma source. This analysis has made it possible to estimate the electron concentration n_e and the electron collision frequency ν in the plasma generated in nitrogen and in a mixture of nitrogen and ethanol vapour. The parameters n_e and ν are the basic quantities that characterize the plasma. The presented new plasma diagnostic method is particularly useful when spectroscopic methods fail. The presented plasma source is currently used in research on hydrogen production from liquids.

  7. Motion to innovation: Brand value sources have (not) changed over time

    Directory of Open Access Journals (Sweden)

    Kliestikova Jana

    2017-01-01

    Innovation drives the expansion of economies in a global dimension. This is also why contemporary research indicates a trend of incorporating innovation into the strategic concept of brand value building and management. It has been proven that innovation is a relevant source of brand value perceived by consumers. This trend has been established in reaction to the growing importance of brand value for creating competitive advantage in a global perspective. Currently published scientific contributions mainly highlight the importance of brand innovation in selected sectors of national economies, abstracting from how innovation is perceived given the national socio-psychological profile. For this reason, a universally applicable theory of innovation within brand value building and management is missing. This gap can be addressed by identifying the importance of innovation attributes as brand value sources in the context of market specifics. The aim of this paper is therefore to provide such an identification and to verify the existence of divergences between "foreign" theory and "domestic" practice. To do so, we use a questionnaire, selection analysis and cluster analysis. We detect specifics of brand value perception focusing on innovation and its attributes, comparing theory with the reality of the Slovak environment.

  8. Thought analysis on self-organization theories of MHD plasma

    International Nuclear Information System (INIS)

    Kondoh, Yoshiomi; Sato, Tetsuya.

    1992-08-01

    A thought analysis of the self-organization theories of dissipative MHD plasma is presented, leading to three groups of theories that yield the same relaxed state of ∇ × B = λB, in order to find an essential physical picture embedded in self-organization phenomena due to nonlinear and dissipative processes. The self-organized relaxed state due to dissipation by the Ohm loss is shown to be formulated generally as the state that yields the minimum dissipation rate of global auto- and/or cross-correlations between two quantities among j, B, and A, for their own instantaneous values of the global correlations. (author)

  9. Item response theory analysis applied to the Spanish version of the Personal Outcomes Scale.

    Science.gov (United States)

    Guàrdia-Olmos, J; Carbó-Carreté, M; Peró-Cebollero, M; Giné, C

    2017-11-01

    The study of measurements of quality of life (QoL) is one of the great challenges of modern psychology and psychometric approaches. This issue takes on greater importance when examining QoL in populations that were historically treated on the basis of their deficiency; recently, the focus has shifted to what each person values and desires in their life, as in the case of people with intellectual disability (ID). Many studies of QoL scales applied in this area have attempted to improve the validity and reliability of their components by incorporating various sources of information to achieve consistency in the data obtained. The Spanish adaptation of the Personal Outcomes Scale (POS) has shown excellent psychometric attributes, and its administration draws on three sources of information: self-assessment, practitioner and family. Studying the congruence or incongruence of the observed distributions of each item between sources is therefore essential to ensure a correct interpretation of the measure. The aim of this paper was to analyse the observed distribution of items and dimensions from the three Spanish POS information sources cited earlier, using item response theory. We studied a sample of 529 people with ID and their respective practitioners and family members, and in each case we analysed items and factors using Samejima's model for polytomous ordinal scales. The results indicated a substantial number of items with differential effects across sources and, in some cases, significant differences in the distributions of items, factors and sources of information. We conclude that administering the POS with three sources of information is adequate overall, but correct interpretation requires considering additional information, as well as some specific items in specific dimensions; overall ratings that ignore these caveats could be biased. © 2017.
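
    For orientation (standard psychometric background, not specific to this study), Samejima's graded response model for an item $i$ with ordered categories $k=0,\dots,m$ specifies cumulative category probabilities as two-parameter logistic curves:

$$
P^{*}_{ik}(\theta) \;=\; \frac{1}{1+e^{-a_i(\theta-b_{ik})}},
\qquad
P_{ik}(\theta) \;=\; P^{*}_{ik}(\theta)-P^{*}_{i,k+1}(\theta),
$$

    where $\theta$ is the latent trait (here a QoL dimension), $a_i$ the item discrimination, and $b_{ik}$ the ordered category thresholds, with $P^{*}_{i0}\equiv 1$ and $P^{*}_{i,m+1}\equiv 0$; differential functioning across the three informants shows up as source-dependent $a_i$ or $b_{ik}$.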

  10. Developments of saddle field ion sources and their applications

    International Nuclear Information System (INIS)

    Abdelrahman, M.M.; Helal, A.G.

    2009-01-01

    Ion sources must meet different performance requirements according to the various applications for which they are used, ranging from ion beam production to high energy ion implanters. There are many kinds of ion sources, which produce ion beams with different characteristics. This paper deals with the development and applications of several saddle field ion sources that were designed and constructed in our laboratory. The theory of operation and the types of saddle field ion sources are discussed in detail, and some experimental results are given. Saddle field ion sources operate at low gas pressure and require neither a magnetic field nor a filament. This type of ion source is used for many different applications, such as ion beam machining, sputtering, cleaning and profiling for surface analysis, etc.

  11. Optimal Measurement Conditions for Spatiotemporal EEG/MEG Source Analysis.

    Science.gov (United States)

    Huizenga, Hilde M.; Heslenfeld, Dirk J.; Molenaar, Peter C. M.

    2002-01-01

    Developed a method to determine the required number and position of sensors for human brain electromagnetic source analysis. Studied the method through a simulation study and an empirical study on visual evoked potentials in one adult male. Results indicate the method is fast and reliable and improves source precision. (SLD)

  12. Hazard analysis of typhoon-related external events using extreme value theory

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yo Chan; Jang, Seung Cheol [Integrated Safety Assessment Division, Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Lim, Tae Jin [Dept. of Industrial Information Systems Engineering, Soongsil University, Seoul (Korea, Republic of)

    2015-02-15

    After the Fukushima accident, the importance of hazard analysis for extreme external events was recognized anew. To analyze typhoon-induced hazards, which are among the significant disasters of East Asian countries, a statistical analysis using extreme value theory, a method for estimating the annual exceedance frequency of a rare event, was conducted to estimate occurrence intervals and hazard levels. For four meteorological variables - maximum wind speed, instantaneous wind speed, hourly precipitation, and daily precipitation - the parameters of predictive extreme value theory models were estimated. The 100-year return levels for each variable were predicted using the developed models and compared with previously reported values. Significant long-term climate changes in wind speed and precipitation were also found. A fragility analysis should be conducted to ensure the safety of a nuclear power plant at wind speeds and precipitation levels that exceed those of previous analyses.
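
    A minimal sketch of the return-level computation described above, fitting a generalized extreme value (GEV) distribution to synthetic annual maxima with SciPy (the data and parameter values are stand-ins, not the study's):

```python
import numpy as np
from scipy.stats import genextreme

# synthetic annual-maximum wind speeds (m/s), standing in for observations
annual_max = genextreme.rvs(c=-0.1, loc=30.0, scale=5.0, size=50,
                            random_state=np.random.default_rng(1))

shape, loc, scale = genextreme.fit(annual_max)   # fit GEV to block maxima
ret_100 = genextreme.ppf(1.0 - 1.0 / 100.0, shape, loc=loc, scale=scale)
print(f"100-year return level: {ret_100:.1f} m/s")
```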

  13. Sensitivity theory for reactor burnup analysis based on depletion perturbation theory

    International Nuclear Information System (INIS)

    Yang, Wonsik.

    1989-01-01

    The large computational effort involved in the design and analysis of advanced reactor configurations motivated the development of Depletion Perturbation Theory (DPT) for general fuel cycle analysis. The work here focused on two important advances in the current methods. First, the adjoint equations were developed for using the efficient linear flux approximation to decouple the neutron/nuclide field equations. And second, DPT was extended to the constrained equilibrium cycle which is important for the consistent comparison and evaluation of alternative reactor designs. Practical strategies were formulated for solving the resulting adjoint equations and a computer code was developed for practical applications. In all cases analyzed, the sensitivity coefficients generated by DPT were in excellent agreement with the results of exact calculations. The work here indicates that for a given core response, the sensitivity coefficients to all input parameters can be computed by DPT with a computational effort similar to a single forward depletion calculation

  14. Sensitivity Analysis of Deviation Source for Fast Assembly Precision Optimization

    Directory of Open Access Journals (Sweden)

    Jianjun Tang

    2014-01-01

    Assembly precision optimization of a complex product brings a huge benefit in improving product quality. Because many deviation sources couple with one another, the goal of assembly precision optimization is difficult to pin down accurately. In order to optimize assembly precision accurately and rapidly, a sensitivity analysis of deviation sources is proposed. First, deviation source sensitivity is defined as the ratio of the assembly dimension variation to the deviation source dimension variation. Second, according to assembly constraint relations, assembly sequences and locating, deviation transmission paths are established by locating the joints between adjacent parts and establishing each part's datum reference frame. Third, assembly multidimensional vector loops are created using the deviation transmission paths, and the corresponding scalar equations for each dimension are established. Then, assembly deviation source sensitivity is calculated using a first-order Taylor expansion and a matrix transformation method. Finally, taking the assembly precision optimization of a wing flap rocker as an example, the effectiveness and efficiency of the deviation source sensitivity analysis method are verified.
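
    A minimal numerical sketch of the first-order Taylor step (the closure equation below is hypothetical, standing in for the scalar equations extracted from the vector loops):

```python
import numpy as np

def assembly_dimension(x):
    """Hypothetical scalar closure equation of one assembly dimension as a
    function of the deviation-source dimensions x."""
    return x[0] + 0.5 * x[1] - 0.25 * x[2]

def sensitivities(f, x0, h=1e-6):
    """First-order sensitivities S_i = df/dx_i at the nominal x0, i.e. the
    ratio of assembly-dimension variation to deviation-source variation."""
    x0 = np.asarray(x0, dtype=float)
    S = np.empty_like(x0)
    for i in range(x0.size):
        xp, xm = x0.copy(), x0.copy()
        xp[i] += h
        xm[i] -= h
        S[i] = (f(xp) - f(xm)) / (2.0 * h)
    return S

print(sensitivities(assembly_dimension, [10.0, 20.0, 5.0]))  # [1. 0.5 -0.25]
```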

  15. Time-domain ultra-wideband radar, sensor and components theory, analysis and design

    CERN Document Server

    Nguyen, Cam

    2014-01-01

    This book presents the theory, analysis, and design of ultra-wideband (UWB) radar and sensor systems (in short, UWB systems) and their components. UWB systems find numerous applications in the military, security, civilian, commercial and medicine fields. This book addresses five main topics of UWB systems: System Analysis, Transmitter Design, Receiver Design, Antenna Design and System Integration and Test. The developments of a practical UWB system and its components using microwave integrated circuits, as well as various measurements, are included in detail to demonstrate the theory, analysis and design technique. Essentially, this book will enable the reader to design their own UWB systems and components. In the System Analysis chapter, the UWB principle of operation as well as the power budget analysis and range resolution analysis are presented. In the UWB Transmitter Design chapter, the design, fabrication and measurement of impulse and monocycle pulse generators are covered. The UWB Receiver Design cha...

  16. Antenna theory analysis and design

    CERN Document Server

    Balanis, Constantine A

    2005-01-01

    The discipline of antenna theory has experienced vast technological changes. In response, Constantine Balanis has updated his classic text, Antenna Theory, offering the most recent look at all the necessary topics. New material includes smart antennas and fractal antennas, along with the latest applications in wireless communications. Multimedia material on an accompanying CD presents PowerPoint viewgraphs of lecture notes, interactive review questions, Java animations and applets, and MATLAB features. Like the previous editions, Antenna Theory, Third Edition meets the needs of e

  17. Analysis of Health Behavior Theories for Clustering of Health Behaviors.

    Science.gov (United States)

    Choi, Seung Hee; Duffy, Sonia A

    The objective of this article was to review the utility of established behavior theories, including the Health Belief Model, Theory of Reasoned Action, Theory of Planned Behavior, Transtheoretical Model, and Health Promotion Model, for addressing multiple health behaviors among people who smoke. It is critical to design future interventions for multiple health behavior change tailored to individuals who currently smoke, yet this need has not been addressed. Five health behavior theories/models were analyzed and critically evaluated. A review of the literature included a search of PubMed and Google Scholar from 2010 to 2016. Two hundred sixty-seven articles (252 studies from the initial search and 15 studies from the references of initially identified studies) were included in the analysis. Most of the health behavior theories/models emphasize psychological and cognitive constructs that can be applied only to one specific behavior at a time, making them unsuitable for addressing multiple health behaviors. However, the Health Promotion Model incorporates "related behavior factors" that can explain multiple health behaviors among persons who smoke. Future multiple-behavior interventions guided by the Health Promotion Model are necessary to show the utility and applicability of the model for addressing multiple health behaviors.

  18. Men's passage to fatherhood: an analysis of the contemporary relevance of transition theory

    OpenAIRE

    Draper, Janet

    2003-01-01

    This paper presents a theoretical analysis of men's experiences of pregnancy, birth and early fatherhood. It does so using a framework of ritual transition theory and argues that despite its earlier structural-functionalist roots, transition theory remains a valuable framework, illuminating contemporary transitions across the life course. The paper discusses the historical development of transition or ritual theory and, drawing upon data generated during longitudinal ethnographic interviews w...

  19. Contract Source Selection: An Analysis of Lowest Price Technically Acceptable and Tradeoff Strategies

    Science.gov (United States)

    2016-06-15

    Acquisition Research Program Sponsored Report Series, 15 June 2016, LCDR Jamal M. Osman, USN. Cites Lamoureux, J., Murrow, M., & Walls, C. (2015), Relationship of source selection methods to contract outcomes: an analysis.

  20. Operational analysis and comparative evaluation of embedded Z-Source inverters

    DEFF Research Database (Denmark)

    Blaabjerg, Frede; Gao, F.; Loh, P.C.

    2008-01-01

    This paper presents various embedded Z-source (EZ-source) inverters, broadly classified as shunt or parallel embedded Z-source inverters. Unlike the traditional Z-source inverter, EZ-source inverters are constructed by inserting dc sources into the X-shaped impedance network, with that circuitry connected instead of the generic voltage source inverter (VSI) circuitry, so that the dc input current flows smoothly during the whole switching period. This feature is interesting when PV panels or fuel cells power the load, since the continuous input current flow reduces the control complexity of the dc source and the system design burden. Proceeding further to the topological variation, parallel embedded Z-source inverters are presented with a detailed analysis of the topological configuration and operational principles, showing that they are the superior…

  1. The Interaction between Multimedia Data Analysis and Theory Development in Design Research

    Science.gov (United States)

    van Nes, Fenna; Doorman, Michiel

    2010-01-01

    Mathematics education researchers conducting instruction experiments using a design research methodology are challenged with the analysis of often complex and large amounts of qualitative data. In this paper, we present two case studies that show how multimedia analysis software can greatly support video data analysis and theory development in…

  2. Systemic Functional Theory: A Pickax of Textual Investigation

    Directory of Open Access Journals (Sweden)

    Taofeek Dalamu

    2017-03-01

    The study examines Systemic Functional Theory (SFT) as a tool for examining text - perhaps text of any dimension, as long as it falls within the grammatical organs of the clause. The author explains the theory from its relevant sources. The chronological appreciation involves the efforts of Saussure, Firth, Malinowski, Hjelmslev, etc. However, Halliday's insight is the most prominent, and upon it Systemic Functional Theory has attained the global status it enjoys today. Halliday constructs numerous concepts, e.g. lexicogrammar, processes, cohesion, coherence, system, and system network, with background from traditional grammar and sociological tokens. In addition, the three metafunctions are characterized as its core operational concepts. Of these, the mood system serves as the instrument for the analysis of Psalm One, used here as a case study. Although the clauses fall within the profile of the indicative and imperative, the study reveals that some of the structures are inverted in order to propagate the intended messages. To that end, there are inverted indicative clauses expressed as inverted declarative statements, inverted imperative questions and inverted negativized polarity. In sum, Systemic Functional Theory is a facility for explaining different shapes of texts.

  3. Decision Engines for Software Analysis Using Satisfiability Modulo Theories Solvers

    Science.gov (United States)

    Bjorner, Nikolaj

    2010-01-01

    The area of software analysis, testing and verification is now undergoing a revolution thanks to the use of automated and scalable support for logical methods. A well-recognized premise is that at the core of software analysis engines is invariably a component using logical formulas for describing states and transformations between system states. The process of using this information for discovering and checking program properties (including such important properties as safety and security) amounts to automatic theorem proving. In particular, theorem provers that directly support common software constructs offer a compelling basis. Such provers are commonly called satisfiability modulo theories (SMT) solvers. Z3 is a state-of-the-art SMT solver developed at Microsoft Research. It can be used to check the satisfiability of logical formulas over one or more theories such as arithmetic, bit-vectors, lists, records and arrays. The talk describes some of the technology behind modern SMT solvers, including Z3. Z3 is currently mainly targeted at solving problems that arise in software analysis and verification. It has been applied in various contexts, such as systems for dynamic symbolic simulation (Pex, SAGE, Vigilante), program verification and extended static checking (Spec#/Boogie, VCC, HAVOC), software model checking (Yogi, SLAM), model-based design (FORMULA), security protocol code (F7), and program run-time analysis and invariant generation (VS3). We will describe how it integrates support for a variety of theories that arise naturally in the context of the applications. There are several new promising avenues, and the talk will touch on some of these and the challenges related to SMT solvers.
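
    A toy use of Z3's Python API (the constraints are our own illustration, not from the talk), showing satisfiability checking modulo the theory of linear integer arithmetic:

```python
# pip install z3-solver
from z3 import Ints, Solver, sat

x, y = Ints("x y")
s = Solver()
s.add(x > 2, y < 10, x + 2 * y == 7)  # constraints over integer arithmetic
if s.check() == sat:
    print(s.model())  # one satisfying assignment, e.g. [x = 7, y = 0]
```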

  4. Radioisotope sources for X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Leonowich, J.; Pandian, S.; Preiss, I.L.

    1977-01-01

    Problems involved in developing radioisotope sources and the characteristics of potentially useful radioisotopes for X-ray fluorescence analysis are presented. These include the following. The isotope must be evaluated for the physical and chemical forms available, purity, half-life, specific activity, toxicity, and cost. The radiation hazards of the source must be considered. The type and amount of radiation output of the source must be evaluated. The source construction must be planned. The source should also present an advance over those currently available in order to justify its development. Some isotopes which are not yet in use but look very promising are indicated, and their data are tabulated. A more or less "perfect" source within a given range of interest would exhibit the following characteristics: (1) decay by an isomeric transition with little or no internal conversion; (2) an intense gamma transition near the absorption edge of the element(s) of interest, with no high energy gammas; (3) a sufficiently long half-life (of the order of years) for both economic and calibration reasons; (4) a sufficiently large cross-section for production in a reasonable amount of time. If there are competing reactions, the interfering isotopes should be reasonably short-lived or, if not, separable from the isotope chemically with a minimum of difficulty. (T.G.)

  5. EEGNET: An Open Source Tool for Analyzing and Visualizing M/EEG Connectome.

    Science.gov (United States)

    Hassan, Mahmoud; Shamas, Mohamad; Khalil, Mohamad; El Falou, Wassim; Wendling, Fabrice

    2015-01-01

    The brain is a large-scale complex network often referred to as the "connectome". Exploring the dynamic behavior of the connectome is a challenging issue, as both excellent time and space resolution are required. In this context, magneto/electroencephalography (M/EEG) are effective neuroimaging techniques allowing for analysis of the dynamics of functional brain networks at scalp level and/or at reconstructed sources. However, a tool that can cover all the processing steps of identifying brain networks from M/EEG data has been missing. In this paper, we report a novel software package, called EEGNET, running under MATLAB (MathWorks, Inc.) and allowing for analysis and visualization of functional brain networks from M/EEG recordings. EEGNET is developed to analyze networks either at the level of scalp electrodes or at the level of reconstructed cortical sources. It includes (i) basic steps in preprocessing M/EEG signals, (ii) solution of the inverse problem to localize/reconstruct the cortical sources, (iii) computation of functional connectivity among signals collected at surface electrodes and/or among time courses of reconstructed sources, and (iv) computation of network measures based on graph theory. EEGNET is unique in combining M/EEG functional connectivity analysis with the computation of network measures derived from graph theory. The first version of EEGNET is easy to use, flexible and user friendly. EEGNET is an open source tool and can be freely downloaded from this webpage: https://sites.google.com/site/eegnetworks/.
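
    A minimal sketch of step (iv), computing graph-theoretical measures from a connectivity matrix (the matrix and threshold are made up; EEGNET's own measures and file formats may differ):

```python
import numpy as np
import networkx as nx

# hypothetical connectivity matrix (e.g. correlations between reconstructed
# source time courses), symmetrized and thresholded into an adjacency matrix
rng = np.random.default_rng(0)
C = np.abs(rng.standard_normal((10, 10)))
C = (C + C.T) / 2.0
np.fill_diagonal(C, 0.0)
A = (C > 1.0).astype(int)

G = nx.from_numpy_array(A)
print("density:          ", nx.density(G))
print("avg clustering:   ", nx.average_clustering(G))
print("global efficiency:", nx.global_efficiency(G))
```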

  6. SCALE Sensitivity Calculations Using Contributon Theory

    International Nuclear Information System (INIS)

    Rearden, Bradley T.; Perfetti, Chris; Williams, Mark L.; Petrie, Lester M. Jr.

    2010-01-01

    The SCALE TSUNAMI-3D sensitivity and uncertainty analysis sequence computes the sensitivity of k-eff to each constituent multigroup cross section using adjoint techniques with the KENO Monte Carlo codes. A new technique to simultaneously obtain the product of the forward and adjoint angular flux moments within a single Monte Carlo calculation has been developed and implemented in the SCALE TSUNAMI-3D analysis sequence. A new concept in Monte Carlo theory has been developed for this work, an eigenvalue contributon estimator, which is an extension of previously developed fixed-source contributon estimators. A contributon is a particle for which the forward solution is accumulated, and its importance to the response, which is equivalent to the adjoint solution, is simultaneously accumulated. Thus, the contributon is a particle coupled with its contribution to the response, in this case k-eff. As implemented in SCALE, the contributon provides the importance of a particle exiting at any energy or direction for each location, energy and direction at which the forward flux solution is sampled. Although currently implemented for eigenvalue calculations in multigroup mode in KENO, this technique is directly applicable to continuous-energy calculations for many other responses such as fixed-source sensitivity analysis and quantification of reactor kinetics parameters. This paper provides the physical bases of eigenvalue contributon theory, provides details of implementation into TSUNAMI-3D, and provides results of sample calculations.

  7. Relativistic quantum mechanics and introduction to field theory

    Energy Technology Data Exchange (ETDEWEB)

    Yndurain, F.J. [Universidad Autonoma de Madrid (Spain). Dept. de Fisica Teorica

    1996-12-01

    The following topics were dealt with: relativistic transformations, the Lorentz group, Klein-Gordon equation, spinless particles, spin 1/2 particles, Dirac particle in a potential, massive spin 1 particles, massless spin 1 particles, relativistic collisions, S matrix, cross sections, decay rates, partial wave analysis, electromagnetic field quantization, interaction of radiation with matter, interactions in quantum field theory and relativistic interactions with classical sources.

  9. Quantal density functional theory

    CERN Document Server

    Sahni, Viraht

    2016-01-01

    This book deals with quantal density functional theory (QDFT) which is a time-dependent local effective potential theory of the electronic structure of matter. The treated time-independent QDFT constitutes a special case. In the 2nd edition, the theory is extended to include the presence of external magnetostatic fields. The theory is a description of matter based on the ‘quantal Newtonian’ first and second laws which is in terms of “classical” fields that pervade all space, and their quantal sources. The fields, which are explicitly defined, are separately representative of electron correlations due to the Pauli exclusion principle, Coulomb repulsion, correlation-kinetic, correlation-current-density, and correlation-magnetic effects. The book further describes Schrödinger theory from the new physical perspective of fields and quantal sources. It also describes traditional Hohenberg-Kohn-Sham DFT, and explains via QDFT the physics underlying the various energy functionals and functional derivatives o...

  10. Sociological Analysis of Contemporary Youth Movements: the Strengths of Classical Leadership Theories

    Directory of Open Access Journals (Sweden)

    N V Andrievskaya

    2011-03-01

    The article is devoted to the study of youth socio-political movements in terms of leadership theories. The author examines the development of leadership theories and analyses the activities of two youth organizations' leaders in the context of the theories involved. In order to analyze the efficiency of leadership, the author highlights the qualities essential for an ideal leader of a youth organization and identifies the type of leadership style. Further on, the author considers how far each of the candidates answers the ideal leadership model description.

  11. Analysis of the tuning characteristics of microwave plasma source

    Energy Technology Data Exchange (ETDEWEB)

    Miotk, Robert, E-mail: rmiotk@imp.gda.pl; Jasiński, Mariusz [Centre for Plasma and Laser Engineering, The Szewalski Institute of Fluid-Flow Machinery, Polish Academy of Sciences, Fiszera 14, 80-231 Gdańsk (Poland); Mizeraczyk, Jerzy [Department of Marine Electronics, Gdynia Maritime University, Morska 81-87, 81-225 Gdynia (Poland)

    2016-04-15

    In this paper, we present an analysis of the tuning characteristics of a waveguide-supplied metal-cylinder-based nozzleless microwave plasma source. This analysis has made it possible to estimate the electron concentration n_e and the electron collision frequency ν in the plasma generated in nitrogen and in a mixture of nitrogen and ethanol vapour. The parameters n_e and ν are the basic quantities that characterize the plasma. The presented new plasma diagnostic method is particularly useful when spectroscopic methods fail. The presented plasma source is currently used in research on hydrogen production from liquids.

  12. Dimensional analysis and extended hydrodynamic theory applied to long-rod penetration of ceramics

    Directory of Open Access Journals (Sweden)

    J.D. Clayton

    2016-08-01

    Full Text Available Principles of dimensional analysis are applied in a new interpretation of penetration of ceramic targets subjected to hypervelocity impact. The analysis results in a power series representation – in terms of inverse velocity – of normalized depth of penetration that reduces to the hydrodynamic solution at high impact velocities. Specifically considered are test data from four literature sources involving penetration of confined thick ceramic targets by tungsten long rod projectiles. The ceramics are AD-995 alumina, aluminum nitride, silicon carbide, and boron carbide. Test data can be accurately represented by the linear form of the power series, whereby the same value of a single fitting parameter applies remarkably well for all four ceramics. Comparison of the present model with others in the literature (e.g., Tate's theory) demonstrates a target resistance stress that depends on impact velocity, linearly in the limiting case. Comparison of the present analysis with recent research involving penetration of thin ceramic tiles at lower typical impact velocities confirms the importance of target properties related to fracture and shear strength at the Hugoniot Elastic Limit (HEL) only in the latter. In contrast, in the former (i.e., hypervelocity and thick target experiments), the current analysis demonstrates that penetration depth depends dominantly on target mass density alone. Such comparisons suggest transitions from microstructure-controlled to density-controlled penetration resistance with increasing impact velocity and ceramic target thickness.
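
    The linear form of the power series is simple enough to fit directly. The sketch below is a minimal illustration, assuming the hydrodynamic intercept sqrt(ρp/ρt) for a tungsten rod on an alumina target and made-up velocity/depth pairs rather than the paper's test data; c is the single fitting parameter.

        import numpy as np

        # Linear-in-1/v penetration model (illustrative only):
        #   P/L = sqrt(rho_p / rho_t) - c / v
        # The intercept is the hydrodynamic limit; c is the single fitting parameter.
        rho_p, rho_t = 19300.0, 3890.0       # tungsten rod, alumina target (kg/m^3)
        hydro = (rho_p / rho_t) ** 0.5       # hydrodynamic limit of P/L

        v = np.array([2.0, 2.5, 3.0, 4.0, 5.0])          # impact velocity (km/s), made up
        p_over_l = np.array([0.9, 1.2, 1.4, 1.6, 1.75])  # normalized depth, made up

        # Least squares for c with the intercept fixed at the hydrodynamic limit:
        x = 1.0 / v
        c, *_ = np.linalg.lstsq(x[:, None], hydro - p_over_l, rcond=None)
        print(f"hydrodynamic limit = {hydro:.2f}, fitted c = {c[0]:.2f}")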

  13. Effective equivalence of the Einstein-Cartan and Einstein theories of gravity

    International Nuclear Information System (INIS)

    Nester, J.M.

    1977-01-01

    I prove that, for any choice of minimally coupled source field Lagrangian for the Einstein-Cartan-Sciama-Kibble theory of gravity, there exists a related minimally coupled source field Lagrangian for the Einstein theory which produces the same field equations for the metric and source field. By using a standard first-order form for source Lagrangians, the converse is also demonstrated. This establishes a one-to-one correspondence between source Lagrangians for the two theories which clearly reveals their similarities and their differences. Because of this ''equivalence,'' one can view either theory, in terms of the other, as minimal coupling for a related Minkowski source Lagrangian or as nonminimal coupling for the same Minkowski source Lagrangian. Consequently the two theories are, in this sense, indistinguishable. Some other implications of this ''equivalence'' are discussed

  14. Spallation neutron sources

    International Nuclear Information System (INIS)

    Fraser, J.S.; Bartholomew, G.A.

    1983-01-01

    The principles and theory of spallation neutron sources are outlined and a comparison is given with other types of neutron source. A summary of the available accelerator types for spallation neutron sources and their advantages and disadvantages is presented. Suitable target materials are discussed for specific applications, and typical target assemblies shown. (U.K.)

  15. Advances in complex analysis and operator theory: festschrift in honor of Daniel Alpay’s 60th birthday

    CERN Document Server

    Sabadini, Irene; Struppa, Daniele; Vajiac, Mihaela

    2017-01-01

    This book gathers contributions written by Daniel Alpay’s friends and collaborators. Several of the papers were presented at the International Conference on Complex Analysis and Operator Theory held in honor of Professor Alpay’s 60th birthday at Chapman University in November 2016. The main topics covered are complex analysis, operator theory and other areas of mathematics close to Alpay’s primary research interests. The book is recommended for mathematicians from the graduate level on, working in various areas of mathematical analysis, operator theory, infinite dimensional analysis, linear systems, and stochastic processes.

  16. String theory or field theory?

    International Nuclear Information System (INIS)

    Marshakov, A.V.

    2002-01-01

    The status of string theory is reviewed, and major recent developments - especially those in going beyond perturbation theory in the string theory and quantum field theory frameworks - are analyzed. This analysis helps better understand the role and place of string theory in the modern picture of the physical world. Even though quantum field theory describes a wide range of experimental phenomena, it is emphasized that there are some insurmountable problems inherent in it - notably the impossibility to formulate the quantum theory of gravity on its basis - which prevent it from being a fundamental physical theory of the world of microscopic distances. It is this task, the creation of such a theory, which string theory, currently far from completion, is expected to solve. In spite of its somewhat vague current form, string theory has already led to a number of serious results and greatly contributed to progress in the understanding of quantum field theory. It is these developments which are our concern in this review.

  17. Health Behavior Theory in Popular Calorie Counting Apps: A Content Analysis

    OpenAIRE

    Davis, Siena F; Ellsworth, Marisa A; Payne, Hannah E; Hall, Shelby M; West, Joshua H; Nordhagen, Amber L

    2016-01-01

    Background Although the Health & Fitness category of the Apple App Store features hundreds of calorie counting apps, the extent to which popular calorie counting apps include health behavior theory is unknown. Objective This study evaluates the presence of health behavior theory in calorie counting apps. Methods Data for this study came from an extensive content analysis of the 10 most popular calorie counting apps in the Health & Fitness category of the Apple App Store. Results Each app was ...

  18. Identifying Barriers in Implementing Outcomes-Based Assessment Program Review: A Grounded Theory Analysis

    Science.gov (United States)

    Bresciani, Marilee J.

    2011-01-01

    The purpose of this grounded theory study was to identify the typical barriers encountered by faculty and administrators when implementing outcomes-based assessment program review. An analysis of interviews with faculty and administrators at nine institutions revealed a theory that faculty and administrators' promotion, tenure (if applicable),…

  19. The Swift/UVOT catalogue of NGC 4321 star-forming sources: a case against density wave theory

    Science.gov (United States)

    Ferreras, Ignacio; Cropper, Mark; Kawata, Daisuke; Page, Mat; Hoversten, Erik A.

    2012-08-01

    We study the star-forming regions in the spiral galaxy NGC 4321 (M100). We take advantage of the spatial resolution (2.5 arcsec full width at half-maximum) of the Swift/Ultraviolet/Optical Telescope camera and the availability of three ultraviolet (UV) passbands in the region 1600-3000 Å […] spiral arms. The Hα luminosities of the sources have a strong decreasing radial trend, suggesting more massive star-forming regions in the central part of the galaxy. When segregated with respect to near-UV (NUV)-optical colour, blue sources have a significant excess of flux in the IR at 8 μm, revealing the contribution from polycyclic aromatic hydrocarbons, although the overall reddening of these sources stays below E(B - V) = 0.2 mag. The distribution of distances to the spiral arms is compared for subsamples selected according to Hα luminosity, NUV-optical colour or ages derived from a population synthesis model. An offset would be expected between these subsamples as a function of radius if the pattern speed of the spiral arm were constant - as predicted by classic density wave theory. No significant offsets are found, favouring instead a mechanism where the pattern speed has a radial dependence.

  20. Using Molecular Modeling in Teaching Group Theory Analysis of the Infrared Spectra of Organometallic Compounds

    Science.gov (United States)

    Wang, Lihua

    2012-01-01

    A new method is introduced for teaching group theory analysis of the infrared spectra of organometallic compounds using molecular modeling. The main focus of this method is to enhance student understanding of the symmetry properties of vibrational modes and of the group theory analysis of infrared (IR) spectra by using visual aids provided by…

  1. Asymptotic Analysis of Large Cooperative Relay Networks Using Random Matrix Theory

    Directory of Open Access Journals (Sweden)

    H. Poor

    2008-04-01

    Full Text Available Cooperative transmission is an emerging communication technology that takes advantage of the broadcast nature of wireless channels. In cooperative transmission, the use of relays can create a virtual antenna array so that multiple-input/multiple-output (MIMO) techniques can be employed. Most existing work in this area has focused on the situation in which there are a small number of sources and relays and a destination. In this paper, cooperative relay networks with large numbers of nodes are analyzed, and in particular the asymptotic performance improvement of cooperative transmission over direct transmission and relay transmission is analyzed using random matrix theory. The key idea is to investigate the eigenvalue distributions related to channel capacity and to analyze the moments of this distribution in large wireless networks. A performance upper bound is derived, the performance in the low signal-to-noise-ratio regime is analyzed, and two approximations are obtained for high and low relay-to-destination link qualities, respectively. Finally, simulations are provided to validate the accuracy of the analytical results. The analysis in this paper provides important tools for the understanding and the design of large cooperative wireless networks.
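
    As a toy illustration of the eigenvalue-based analysis, the sketch below estimates the ergodic capacity of an i.i.d. Rayleigh relay-to-destination channel by Monte Carlo over the eigenvalues of HH*. The setup (equal relay and destination counts, an SNR of 10) is assumed for illustration and is not the paper's network model.

        import numpy as np

        rng = np.random.default_rng(0)

        def ergodic_capacity(n_relays, n_dest, snr, trials=1000):
            # Monte Carlo estimate of E[log2 det(I + (snr/n_relays) H H*)]
            # for an i.i.d. Rayleigh channel H; the eigenvalues of H H*
            # are the quantities studied by random matrix theory.
            cap = 0.0
            for _ in range(trials):
                h = (rng.standard_normal((n_dest, n_relays))
                     + 1j * rng.standard_normal((n_dest, n_relays))) / np.sqrt(2.0)
                eig = np.linalg.eigvalsh(h @ h.conj().T)
                cap += np.sum(np.log2(1.0 + snr / n_relays * eig))
            return cap / trials

        # Capacity grows roughly linearly with the number of nodes, as the
        # asymptotic eigenvalue analysis predicts.
        for n in (2, 4, 8, 16):
            print(n, round(ergodic_capacity(n, n, snr=10.0), 2))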

  2. Stability analysis of jointed rock slope by the block theory

    International Nuclear Information System (INIS)

    Yoshinaka, Ryunoshin; Yamabe, Tadashi; Fujita, Tomoo.

    1990-01-01

    The block theory for analyzing three-dimensional stability problems of discontinuous rock masses is applied to an actual discontinuous rock slope. Taking into consideration that the geometrical information about discontinuities generally increases with the progressive steps of rock investigation in the field, the method adopted for analysis is divided into the following two steps: 1) a statistical/probabilistic analysis using information from the primary investigation stage, which mainly consists of the survey of natural rock outcrops, and 2) a deterministic analysis corresponding to the secondary stage using exploration adits. (author)

  3. Conference on Geometric Analysis & Conference on Type Theory, Homotopy Theory and Univalent Foundations: Extended Abstracts Fall 2013

    CERN Document Server

    Yang, Paul; Gambino, Nicola; Kock, Joachim

    2015-01-01

    The two parts of the present volume contain extended conference abstracts corresponding to selected talks given by participants at the "Conference on Geometric Analysis" (thirteen abstracts) and at the "Conference on Type Theory, Homotopy Theory and Univalent Foundations" (seven abstracts), both held at the Centre de Recerca Matemàtica (CRM) in Barcelona from July 1st to 5th, 2013, and from September 23rd to 27th, 2013, respectively. Most of them are brief articles, containing preliminary presentations of new results not yet published in regular research journals. The articles are the result of a direct collaboration between active researchers in the area after working in a dynamic and productive atmosphere. The first part is about Geometric Analysis and Conformal Geometry; this modern field lies at the intersection of many branches of mathematics (Riemannian, Conformal, Complex or Algebraic Geometry, Calculus of Variations, PDE's, etc) and relates directly to the physical world, since many natural phenomena...

  4. [Effects of attitude formation, persuasive message, and source expertise on attitude change: an examination based on the Elaboration Likelihood Model and the Attitude Formation Theory].

    Science.gov (United States)

    Nakamura, M; Saito, K; Wakabayashi, M

    1990-04-01

    The purpose of this study was to investigate how attitude change is generated by the recipient's degree of attitude formation, evaluative-emotional elements contained in the persuasive messages, and source expertise as a peripheral cue in the persuasion context. Hypotheses based on the Attitude Formation Theory of Mizuhara (1982) and the Elaboration Likelihood Model of Petty and Cacioppo (1981, 1986) were examined. Eighty undergraduate students served as subjects in the experiment, the first stage of which involved manipulating the degree of attitude formation with respect to nuclear power development. Then, the experimenter presented persuasive messages with varying combinations of evaluative-emotional elements from a source with either high or low expertise on the subject. Results revealed a significant interaction effect on attitude change among attitude formation, persuasive message, and the expertise of the message source. That is, high attitude formation subjects resisted evaluative-emotional persuasion from the high expertise source while low attitude formation subjects changed their attitude when exposed to the same persuasive message from a low expertise source. Results exceeded initial predictions based on the Attitude Formation Theory and the Elaboration Likelihood Model.

  5. Constructivism theory analysis and application to curricula.

    Science.gov (United States)

    Brandon, Amy F; All, Anita C

    2010-01-01

    Today's nursing programs are struggling to accommodate the changing needs of the health care environment and need to make changes in how students are taught. Using constructivism theory, whereby learning is an active process in which learners construct new ideas or concepts based upon their current or past knowledge, leaders in nursing education can make a paradigm shift toward concept-based curricula. This article presents a summary and analysis of constructivism and an innovative application of its active-learning principles to curriculum development, specifically for the education of nursing students.

  6. Methods of Approximation Theory in Complex Analysis and Mathematical Physics

    CERN Document Server

    Saff, Edward

    1993-01-01

    The book incorporates research papers and surveys written by participants of an International Scientific Programme on Approximation Theory jointly supervised by the Institute for Constructive Mathematics of the University of South Florida at Tampa, USA and the Euler International Mathematical Institute at St. Petersburg, Russia. The aim of the Programme was to present new developments in Constructive Approximation Theory. The topics of the papers are: asymptotic behaviour of orthogonal polynomials, rational approximation of classical functions, quadrature formulas, theory of n-widths, nonlinear approximation in Hardy algebras, numerical results on best polynomial approximations, wavelet analysis. FROM THE CONTENTS: E.A. Rakhmanov: Strong asymptotics for orthogonal polynomials associated with exponential weights on R.- A.L. Levin, E.B. Saff: Exact Convergence Rates for Best Lp Rational Approximation to the Signum Function and for Optimal Quadrature in Hp.- H. Stahl: Uniform Rational Approximation of |x|.- M. Rahman, S.K. ...

  7. Theory of nanolaser devices: Rate equation analysis versus microscopic theory

    DEFF Research Database (Denmark)

    Lorke, Michael; Skovgård, Troels Suhr; Gregersen, Niels

    2013-01-01

    A rate equation theory for quantum-dot-based nanolaser devices is developed. We show that these rate equations are capable of reproducing results of a microscopic semiconductor theory, making them an appropriate starting point for complex device simulations of nanolasers. The input...

  8. An introduction to queueing theory: modeling and analysis in applications

    CERN Document Server

    Bhat, U Narayan

    2015-01-01

    This introductory textbook is designed for a one-semester course on queueing theory that does not require a course on stochastic processes as a prerequisite. By integrating the necessary background on stochastic processes with the analysis of models, the work provides a sound foundational introduction to the modeling and analysis of queueing systems for a wide interdisciplinary audience of students in mathematics, statistics, and applied disciplines such as computer science, operations research, and engineering. This edition includes additional topics in methodology and applications. Key features: • An introductory chapter including a historical account of the growth of queueing theory in more than 100 years. • A modeling-based approach with emphasis on identification of models. • Rigorous treatment of the foundations of basic models commonly used in applications with appropriate references for advanced topics. • Applications in manufacturing, and computer and communication systems. • A chapter on ...
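
    For readers new to the subject, the canonical first model in such a course is the M/M/1 queue. A minimal sketch of its closed-form steady-state metrics (standard textbook results, not specific to this book):

        # Closed-form steady-state metrics of the M/M/1 queue
        # (Poisson arrivals at rate lam, exponential service at rate mu).
        def mm1_metrics(lam: float, mu: float) -> dict:
            assert lam < mu, "queue is unstable unless lam < mu"
            rho = lam / mu                  # server utilization
            return {
                "rho": rho,
                "L": rho / (1 - rho),       # mean number in system
                "W": 1 / (mu - lam),        # mean time in system (Little's law: L = lam * W)
                "Lq": rho**2 / (1 - rho),   # mean queue length
                "Wq": rho / (mu - lam),     # mean waiting time in queue
            }

        print(mm1_metrics(lam=8.0, mu=10.0))   # rho=0.8, L=4.0, W=0.5, Lq=3.2, Wq=0.4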

  9. The quantitative analysis of 163Ho source by PIXE

    International Nuclear Information System (INIS)

    Sera, K.; Ishii, K.; Fujioka, M.; Izawa, G.; Omori, T.

    1984-01-01

    We have been studying electron capture in 163Ho as a method for determining the mass of the electron neutrino. The 163Ho sources were produced with the 164Dy(p,2n) reaction by means of a method of internal irradiation. We applied the PIXE method to determine the total number of 163Ho atoms in the source. Proton beams of 3 MeV and an "external standard" method were employed for nondestructive analysis of the 163Ho source, supplemented by an "internal standard" method. (author)

  10. The determinants of physician attitudes and subjective norms toward drug information sources: modification and test of the theory of reasoned action.

    Science.gov (United States)

    Gaither, C A; Bagozzi, R P; Ascione, F J; Kirking, D M

    1997-10-01

    To improve upon the theory of reasoned action and apply it to pharmaceutical research, we investigated the effects of relevant appraisals, attributes, and past behavior of physicians on the use of drug information sources. We also examined the moderating effects of practice characteristics. A mail questionnaire asked HMO physicians to evaluate seven common sources of drug information on general appraisals (degree of usefulness and ease of use), specific attributes (availability, quality of information on harmful effects and on drug efficacy), and past behavior when searching for information on a new, simulated H2 antagonist agent. Semantic differential scales were used to measure each appraisal, attribute, and past behavior. Information was also collected on practice characteristics. Findings from 108/200 respondents indicated that appraisals and attributes were useful determinants of attitudes and subjective norms toward use. Degree of usefulness and quality of information on harmful effects were important predictors of attitudes toward use for several sources of information. Ease of use and degree of usefulness were important predictors of subjective norms toward use. In many cases, moderating effects of practice characteristics were in opposing directions. Past behavior had significant direct effects on attitudes toward the PDR. The findings suggest ways to improve the usefulness of the theory of reasoned action as a model of decision-making. We also propose practical guidelines that can be used to improve the types of drug information sources used by physicians.

  11. String theory or field theory?

    International Nuclear Information System (INIS)

    Marshakov, Andrei V

    2002-01-01

    The status of string theory is reviewed, and major recent developments - especially those in going beyond perturbation theory in the string theory and quantum field theory frameworks - are analyzed. This analysis helps better understand the role and place of string theory in the modern picture of the physical world. Even though quantum field theory describes a wide range of experimental phenomena, it is emphasized that there are some insurmountable problems inherent in it - notably the impossibility to formulate the quantum theory of gravity on its basis - which prevent it from being a fundamental physical theory of the world of microscopic distances. It is this task, the creation of such a theory, which string theory, currently far from completion, is expected to solve. In spite of its somewhat vague current form, string theory has already led to a number of serious results and greatly contributed to progress in the understanding of quantum field theory. It is these developments which are our concern in this review. (reviews of topical problems)

  12. Isotopic neutron sources for neutron activation analysis

    International Nuclear Information System (INIS)

    Hoste, J.

    1988-06-01

    This User's Manual attempts to provide, for teaching and training purposes, a series of well-thought-out demonstrative experiments in neutron activation analysis based on the utilization of an isotopic neutron source. In some cases, these ideas can be applied to solve practical analytical problems. 19 refs, figs and tabs

  13. Physical analysis of some features of the gauge theories with Higgs sectors

    International Nuclear Information System (INIS)

    Beshtoev, Kh.M.

    1995-01-01

    A physical analysis of some features of the gauge theories with Higgs sectors is made. It is shown that we should assume gauge transformations in the fermion and Higgs sectors to be different (i.e., to have different charges) in order to remove contradictions arising in gauge theories with Higgs sectors. Then, the Higgs mechanism can be interpreted as some mechanism of gauge field shielding. In such a mechanism fermions remain without masses. The conclusion is made that in the standard theory of the development of the Universe, monopoles cannot survive at low temperatures. 15 refs

  14. Structural reliability analysis under evidence theory using the active learning kriging model

    Science.gov (United States)

    Yang, Xufeng; Liu, Yongshou; Ma, Panke

    2017-11-01

    Structural reliability analysis under evidence theory is investigated. It is rigorously proved that a surrogate model providing only correct sign prediction of the performance function can meet the accuracy requirement of evidence-theory-based reliability analysis. Accordingly, a method based on the active learning kriging model which only correctly predicts the sign of the performance function is proposed. Interval Monte Carlo simulation and a modified optimization method based on Karush-Kuhn-Tucker conditions are introduced to make the method more efficient in estimating the bounds of failure probability based on the kriging model. Four examples are investigated to demonstrate the efficiency and accuracy of the proposed method.
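
    A minimal sketch of the sign-focused active learning idea is given below. It uses scikit-learn's Gaussian process regressor, a toy performance function g, and a U-type learning criterion common in this literature that targets points whose sign prediction is least certain; this is an illustration of the general approach, not the authors' algorithm or the interval Monte Carlo machinery.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        rng = np.random.default_rng(1)

        def g(x):
            # Toy performance function (hypothetical); failure when g < 0.
            return x[:, 0] ** 3 + x[:, 1] + 3.0

        pool = rng.uniform(-4.0, 4.0, size=(2000, 2))    # candidate points, e.g. MC samples
        idx = rng.choice(len(pool), size=12, replace=False)
        X, y = pool[idx], g(pool[idx])                   # small initial design

        gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
        for _ in range(30):
            gp.fit(X, y)
            mean, std = gp.predict(pool, return_std=True)
            u = np.abs(mean) / np.maximum(std, 1e-12)    # small where the SIGN is uncertain
            best = int(np.argmin(u))                     # evaluate g only where the sign is in doubt
            X = np.vstack([X, pool[best]])
            y = np.append(y, g(pool[best:best + 1]))

        gp.fit(X, y)
        mean, _ = gp.predict(pool, return_std=True)
        print("predicted failure fraction:", np.mean(mean < 0.0))
        print("true failure fraction:     ", np.mean(g(pool) < 0.0))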

  15. Mechanical system reliability analysis using a combination of graph theory and Boolean function

    International Nuclear Information System (INIS)

    Tang, J.

    2001-01-01

    A new method based on graph theory and Boolean functions for assessing the reliability of mechanical systems is proposed. The procedure for this approach consists of two parts. Using graph theory, a formula for the reliability of a mechanical system that considers the interrelations of subsystems or components is generated. Boolean functions are then used to examine the failure interactions of two particular elements of the system, followed by demonstrations of how to incorporate such failure dependencies into the analysis of larger systems; in this way a constructive algorithm for quantifying the genuine interconnections between the subsystems or components is provided. The combination of graph theory and Boolean functions provides an effective way to evaluate the reliability of a large, complex mechanical system. A numerical example demonstrates that this method is an effective approach to system reliability analysis.

  16. Politics, Security, Theory

    DEFF Research Database (Denmark)

    Wæver, Ole

    2011-01-01

    This article outlines three ways of analysing the ‘politics of securitization’, emphasizing an often-overlooked form of politics practised through theory design. The structure and nature of a theory can have systematic political implications. Analysis of this ‘politics of securitization’ is distinct from both the study of political practices of securitization and explorations of competing concepts of politics among security theories. It means tracking what kinds of analysis the theory can produce and whether such analysis systematically impacts real-life political struggles. Securitization theory is found to ‘act politically’ through three structural features that systematically shape the political effects of using the theory. The article further discusses – on the basis of the preceding articles in the special issue – three emerging debates around securitization theory: ethics…

  17. Use of CITATION code for flux calculation in neutron activation analysis with voluminous sample using an Am-Be source

    International Nuclear Information System (INIS)

    Khelifi, R.; Idiri, Z.; Bode, P.

    2002-01-01

    The CITATION code, based on neutron diffusion theory, was used for flux calculations inside voluminous samples in prompt gamma activation analysis with an isotopic neutron source (Am-Be). The code uses specific parameters related to the source energy spectrum and the irradiation system materials (shielding, reflector). The flux distribution (thermal and fast) was calculated in three-dimensional geometry for the system: air, polyethylene and a cuboidal water sample (50x50x50 cm). The thermal flux was calculated at a series of points inside the sample. The results agreed reasonably well with observed values: the maximum thermal flux was observed at a depth of 3.2 cm, while CITATION gave 3.7 cm. Beyond a depth of 7.2 cm the thermal-to-fast flux ratio increases by up to a factor of two, which allows us to optimise the position of the detection system for in-situ PGAA.
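
    The qualitative result - a thermal-flux maximum a few centimetres below the surface - can be reproduced with a toy two-group diffusion solve, sketched below with illustrative water-like constants rather than the Am-Be/polyethylene data used with CITATION:

        import numpy as np

        # Two-group, 1-D diffusion sketch: fast neutrons entering at the
        # surface slow down and feed the thermal group, producing a
        # thermal-flux maximum below the surface (illustrative constants).
        n, depth = 400, 50.0                 # mesh points, sample depth (cm)
        h = depth / (n + 1)
        x = h * np.arange(1, n + 1)

        def diffusion_matrix(D, sigma):
            # Finite-difference operator for -D phi'' + sigma phi with
            # zero-flux (Dirichlet) boundaries.
            main = np.full(n, 2.0 * D / h**2 + sigma)
            off = np.full(n - 1, -D / h**2)
            return np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

        D1, sig_r = 1.13, 0.042              # fast group: diffusion (cm), removal (1/cm)
        D2, sig_a = 0.16, 0.0197             # thermal group: diffusion (cm), absorption (1/cm)

        src = np.zeros(n)
        src[0] = 1.0 / h                     # fast source concentrated at the surface
        phi_fast = np.linalg.solve(diffusion_matrix(D1, sig_r), src)
        phi_th = np.linalg.solve(diffusion_matrix(D2, sig_a), sig_r * phi_fast)

        print(f"thermal flux peaks at x = {x[np.argmax(phi_th)]:.1f} cm")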

  18. Pattern theory the stochastic analysis of real-world signals

    CERN Document Server

    Mumford, David

    2010-01-01

    Pattern theory is a distinctive approach to the analysis of all forms of real-world signals. At its core is the design of a large variety of probabilistic models whose samples reproduce the look and feel of the real signals, their patterns, and their variability. Bayesian statistical inference then allows you to apply these models in the analysis of new signals. This book treats the mathematical tools, the models themselves, and the computational algorithms for applying statistics to analyze six representative classes of signals of increasing complexity. The book covers patterns in text, sound

  19. Acknowledging the Infrasystem: A Critical Feminist Analysis of Systems Theory.

    Science.gov (United States)

    Creedon, Pamela J.

    1993-01-01

    Examines the absence of a critical feminist perspective in the application of systems theory as a unifying model for public relations. Describes an unacknowledged third system, the infrasystem, that constructs both suprasystem and subsystem interactions. Concludes with a case analysis of sport as illustration. (HB)

  20. Item response theory analysis of the mechanics baseline test

    Science.gov (United States)

    Cardamone, Caroline N.; Abbott, Jonathan E.; Rayyan, Saif; Seaton, Daniel T.; Pawl, Andrew; Pritchard, David E.

    2012-02-01

    Item response theory is useful in both the development and evaluation of assessments and in computing standardized measures of student performance. In item response theory, individual parameters (difficulty, discrimination) for each item or question are fit by item response models. These parameters provide a means for evaluating a test and offer a better measure of student skill than a raw test score, because each skill calculation considers not only the number of questions answered correctly, but the individual properties of all questions answered. Here, we present the results from an analysis of the Mechanics Baseline Test given at MIT during 2005-2010. Using the item parameters, we identify questions on the Mechanics Baseline Test that are not effective in discriminating between MIT students of different abilities. We show that a limited subset of the highest quality questions on the Mechanics Baseline Test returns accurate measures of student skill. We compare student skills as determined by item response theory to the more traditional measurement of the raw score and show that a comparable measure of learning gain can be computed.
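
    A minimal sketch of the underlying machinery, assuming hypothetical item parameters rather than the fitted MBT values: the two-parameter logistic (2PL) model and a grid maximum-likelihood skill estimate that, unlike a raw score, weighs each response by the item's difficulty and discrimination.

        import numpy as np

        def p_correct(theta, a, b):
            # Two-parameter logistic (2PL) model: probability of a correct
            # response given ability theta, discrimination a, difficulty b.
            return 1.0 / (1.0 + np.exp(-a * (theta - b)))

        # Hypothetical item parameters, not the fitted MBT values.
        a = np.array([1.2, 0.4, 1.8, 0.9])      # low a = weakly discriminating item
        b = np.array([-0.5, 0.0, 0.7, 1.5])
        resp = np.array([1, 1, 0, 0])           # one student's right/wrong pattern

        # Grid maximum-likelihood skill estimate from the response pattern.
        grid = np.linspace(-4.0, 4.0, 801)
        P = p_correct(grid[:, None], a, b)
        loglik = np.sum(resp * np.log(P) + (1 - resp) * np.log(1.0 - P), axis=1)
        print("estimated skill:", round(grid[np.argmax(loglik)], 2))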

  1. Analysis of interacting quantum field theory in curved spacetime

    International Nuclear Information System (INIS)

    Birrell, N.D.; Taylor, J.G.

    1980-01-01

    A detailed analysis of interacting quantized fields propagating in a curved background spacetime is given. Reduction formulas for S-matrix elements in terms of vacuum Green's functions are derived, special attention being paid to the possibility that the ''in'' and ''out'' vacuum states may not be equivalent. Green's functions equations are obtained and a diagrammatic representation for them given, allowing a formal, diagrammatic renormalization to be effected. Coordinate space techniques for showing renormalizability are developed in Minkowski space, for lambdaphi 3 /sub() 4,6/ field theories. The extension of these techniques to curved spacetimes is considered. It is shown that the possibility of field theories becoming nonrenormalizable there cannot be ruled out, although, allowing certain modifications to the theory, phi 3 /sub( 4 ) is proven renormalizable in a large class of spacetimes. Finally particle production from the vacuum by the gravitational field is discussed with particular reference to Schwarzschild spacetime. We shed some light on the nonlocalizability of the production process and on the definition of the S matrix for such processes

  2. Alice and Bob meet Banach the interface of asymptotic geometric analysis and quantum information theory

    CERN Document Server

    Aubrun, Guillaume

    2017-01-01

    The quest to build a quantum computer is arguably one of the major scientific and technological challenges of the twenty-first century, and quantum information theory (QIT) provides the mathematical framework for that quest. Over the last dozen or so years, it has become clear that quantum information theory is closely linked to geometric functional analysis (Banach space theory, operator spaces, high-dimensional probability), a field also known as asymptotic geometric analysis (AGA). In a nutshell, asymptotic geometric analysis investigates quantitative properties of convex sets, or other geometric structures, and their approximate symmetries as the dimension becomes large. This makes it especially relevant to quantum theory, where systems consisting of just a few particles naturally lead to models whose dimension is in the thousands, or even in the billions. Alice and Bob Meet Banach is aimed at multiple audiences connected through their interest in the interface of QIT and AGA: at quantum information resea...

  3. Analysis of the environmental behavior of farmers for non-point source pollution control and management in a water source protection area in China.

    Science.gov (United States)

    Wang, Yandong; Yang, Jun; Liang, Jiping; Qiang, Yanfang; Fang, Shanqi; Gao, Minxue; Fan, Xiaoyu; Yang, Gaihe; Zhang, Baowen; Feng, Yongzhong

    2018-08-15

    The environmental behavior of farmers plays an important role in exploring the causes of non-point source pollution and taking scientific control and management measures. Based on the theory of planned behavior (TPB), the present study investigated the environmental behavior of farmers in the Water Source Area of the Middle Route of the South-to-North Water Diversion Project in China. Results showed that TPB could explain farmers' environmental behavior (SMC=0.26) and intention (SMC=0.36) well. Furthermore, the farmers' attitude towards behavior (AB), subjective norm (SN), and perceived behavioral control (PBC) positively and significantly influenced their environmental intention; their environmental intention further impacted their behavior. SN proved to be the main factor indirectly influencing the farmers' environmental behavior, while PBC had no significant direct effect. Moreover, with environmental knowledge as a moderator, gender and age were used as control variables in a moderated mediation analysis of the TPB constructs. The analysis demonstrated that gender had a significant controlling effect on environmental behavior; that is, males engage in more environmentally friendly behaviors. Age, however, showed a significant negative controlling effect on pro-environmental intention and an opposite effect on pro-environmental behavior. In addition, environmental knowledge negatively moderated the relationship between PBC and environmental intention: PBC had a greater impact on the environmental intention of farmers with poor environmental knowledge than on that of farmers with plentiful environmental knowledge. Altogether, the present study provides a theoretical basis for non-point source pollution control and management. Copyright © 2018 Elsevier B.V. All rights reserved.
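
    Path models of this kind are normally estimated with structural equation modelling; the sketch below is a bare two-stage OLS illustration on synthetic TPB-style data (all coefficients invented), showing the AB/SN/PBC → intention → behavior structure described above.

        import numpy as np

        rng = np.random.default_rng(2)
        n = 300

        # Synthetic TPB-style data (invented): attitude (AB), subjective
        # norm (SN) and perceived control (PBC) drive intention, which in
        # turn drives behavior.
        AB, SN, PBC = rng.standard_normal((3, n))
        intention = 0.3 * AB + 0.4 * SN + 0.2 * PBC + 0.5 * rng.standard_normal(n)
        behavior = 0.5 * intention + 0.5 * rng.standard_normal(n)

        def ols(X, y):
            # Ordinary least squares with an intercept column.
            X1 = np.column_stack([np.ones(len(y)), X])
            beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
            return beta[1:]                      # drop the intercept

        print("intention ~ AB+SN+PBC:", np.round(ols(np.column_stack([AB, SN, PBC]), intention), 2))
        print("behavior  ~ intention:", np.round(ols(intention[:, None], behavior), 2))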

  4. Application of depletion perturbation theory to fuel cycle burnup analysis

    International Nuclear Information System (INIS)

    White, J.R.

    1979-01-01

    Over the past several years static perturbation theory methods have been increasingly used for reactor analysis in lieu of more detailed and costly direct computations. Recently, perturbation methods incorporating time dependence have also received attention, and several authors have demonstrated their applicability to fuel burnup analysis. The objective of the work described here is to demonstrate that a time-dependent perturbation method can be easily and accurately applied to realistic depletion problems

  5. Model Theory in Algebra, Analysis and Arithmetic

    CERN Document Server

    Dries, Lou; Macpherson, H Dugald; Pillay, Anand; Toffalori, Carlo; Wilkie, Alex J

    2014-01-01

    Presenting recent developments and applications, the book focuses on four main topics in current model theory: 1) the model theory of valued fields; 2) undecidability in arithmetic; 3) NIP theories; and 4) the model theory of real and complex exponentiation. Young researchers in model theory will particularly benefit from the book, as will more senior researchers in other branches of mathematics.

  6. Methodological tools for the collection and analysis of participant observation data using grounded theory.

    Science.gov (United States)

    Laitinen, Heleena; Kaunonen, Marja; Astedt-Kurki, Päivi

    2014-11-01

    To give clarity to the analysis of participant observation in nursing when implementing the grounded theory method. Participant observation (PO) is a method of collecting data that reveals the reality of daily life in a specific context. In grounded theory, interviews are the primary method of collecting data but PO gives a distinctive insight, revealing what people are really doing, instead of what they say they are doing. However, more focus is needed on the analysis of PO. An observational study carried out to gain awareness of nursing care and its electronic documentation in four acute care wards in hospitals in Finland. Discussion of using the grounded theory method and PO as a data collection tool. The following methodological tools are discussed: an observational protocol, jotting of notes, microanalysis, the use of questioning, constant comparison, and writing and illustrating. Each tool has specific significance in collecting and analysing data, working in constant interaction. Grounded theory and participant observation supplied rich data and revealed the complexity of the daily reality of acute care. In this study, the methodological tools provided a base for the study at the research sites and outside. The process as a whole was challenging. It was time-consuming and it required rigorous and simultaneous data collection and analysis, including reflective writing. Using these methodological tools helped the researcher stay focused from data collection and analysis to building theory. Using PO as a data collection method in qualitative nursing research provides insights. It is not commonly discussed in nursing research and therefore this study can provide insight, which cannot be seen or revealed by using other data collection methods. Therefore, this paper can produce a useful tool for those who intend to use PO and grounded theory in their nursing research.

  7. Transformative Learning: A Case for Using Grounded Theory as an Assessment Analytic

    Science.gov (United States)

    Patterson, Barbara A. B.; Munoz, Leslie; Abrams, Leah; Bass, Caroline

    2015-01-01

    Transformative Learning Theory and pedagogies leverage disruptive experiences as catalysts for learning and teaching. By facilitating processes of critical analysis and reflection that challenge assumptions, transformative learning reframes what counts as knowledge and the sources and processes for gaining and producing it. Students develop a…

  8. REDISCOVERING MISES-HAYEK MONETARY AND BUSINESS CYCLE THEORY IN LIGHT OF THE CURRENT CRISIS: CREDIT EXPANSION AS A SOURCE OF ECONOMIC BOOM AND BUST

    Directory of Open Access Journals (Sweden)

    Marcin Mrowiec

    2013-10-01

    Full Text Available The article starts with a brief description of Mises' monetary theory, with emphasis on the Misesian differentiation of two kinds of credit - commodity and circulation credit - and with a description of the impact of circulation credit expansion on the business cycle. Further on, it is described how Mises' insights constituted the kernel of Austrian Business Cycle Theory, how the same observations on the nature of credit constituted the kernel of the Chicago Plan (though Mises' views on the nature of credit led him to different conclusions than they led the authors of the Chicago Plan), and how this plan is being "rediscovered" now. The following sections deal with the observations of one of the preeminent current macroeconomic researchers, Mr. Claudio Borio, on the elasticity of credit as the source of the current crisis and on the importance of the financial cycle in analysing it. The author of this text demonstrates that Austrian Business Cycle Theory gave the same answer regarding the sources of economic crises that modern macroeconomic theory now seems to be approaching, and that the postulates for successful financial-cycle modeling are already included in the ABCT. Finally, some observations on the current crisis, as well as proposals for avenues of further research, are offered.

  9. A lifting-surface theory solution for the diffraction of internal sound sources by an engine nacelle

    Science.gov (United States)

    Martinez, R.

    1986-07-01

    Lifting-surface theory is used to solve the problem of diffraction by a rigid open-ended pipe of zero thickness and finite length, with application to the prediction of acoustic insertion-loss performance for the encasing structure of a ducted propeller or turbofan. An axisymmetric situation is assumed, and the incident field is that due to a force applied directly to the fluid in the cylinder's axial direction. A virtual-source distribution of unsteady dipoles is found whose integrated component of radial velocity cancels that of the incident field over the surface. The calculated virtual load is verified by checking whether its effect on the near-field input power at the actual source is consistent with the far-field power radiated by the system, a balance which is possible if the no-flow-through boundary condition has been satisfied over the rigid pipe surface such that the velocity component of the acoustic intensity is zero.

  10. Data analysis and source modelling for LISA

    International Nuclear Information System (INIS)

    Shang, Yu

    2014-01-01

    Gravitational waves (GWs) are among the most important predictions of general relativity. Beyond the indirect evidence for their existence, several ground-based detectors (such as LIGO and GEO) are already operating, and a future space mission (LISA) is planned, all aiming to detect GWs directly. A GW carries a large amount of information about its source; extracting this information can reveal the physical properties of the source and even open a new window on the Universe. GW data analysis is therefore a challenging task. In this thesis, I present two works on data analysis for LISA. In the first work, we introduce an extended multimodal genetic algorithm which utilizes the properties of the signal and the detector response function to analyze the data from the third round of the Mock LISA Data Challenge. We found all five sources present in the data and recovered the coalescence time, chirp mass, mass ratio and sky location with reasonable accuracy. As for the orbital angular momentum and the two spins of the black holes, we found a large number of widely separated modes in the parameter space with similar maximum likelihood values. The performance of this method is comparable, if not superior, to already existing algorithms. In the second work, we introduce a new phenomenological waveform model for the extreme-mass-ratio inspiral (EMRI) system. This waveform consists of a set of harmonics with constant amplitude and slowly evolving phase, which we decompose in a Taylor series. We use these phenomenological templates to detect the signal in the simulated data and then, assuming a particular EMRI model, estimate the physical parameters of the binary with high precision. The results show that our phenomenological waveform is well suited to the data analysis of EMRI signals.
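
    A minimal sketch of the second work's waveform idea - harmonics of constant amplitude whose common phase is a truncated Taylor series - with invented amplitudes and phase coefficients:

        import numpy as np

        def phenom_waveform(t, amps, c):
            # Harmonics with constant amplitudes; the common slowly evolving
            # phase is a truncated Taylor series, so harmonic k carries k*phi.
            phi = c[0] + c[1] * t + 0.5 * c[2] * t**2 + c[3] * t**3 / 6.0
            return sum(A * np.cos(k * phi) for k, A in enumerate(amps, start=1))

        t = np.arange(0.0, 1.0e6, 10.0)          # ~11.6 days sampled at 0.1 Hz
        h = phenom_waveform(t,
                            amps=[1.0, 0.5, 0.2],            # invented amplitudes
                            c=[0.0, 2e-3, 1e-11, 1e-19])     # invented phase coefficients
        print(h.shape, h[:3])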

  11. Global Sourcing of Heterogeneous Firms: Theory and Evidence

    DEFF Research Database (Denmark)

    Kohler, Wilhelm; Smolka, Marcel

    […] the Encuesta sobre Estrategias Empresariales (ESEE). We find a pattern of effects whereby productivity stimulates vertical integration in industries of low sourcing intensity, but favors outsourcing in industries of high sourcing intensity. Moreover, we find that productivity boosts offshoring throughout all...

  12. Information theory and rate distortion theory for communications and compression

    CERN Document Server

    Gibson, Jerry

    2013-01-01

    This book is very specifically targeted to problems in communications and compression by providing the fundamental principles and results in information theory and rate distortion theory for these applications and presenting methods that have proved and will prove useful in analyzing and designing real systems. The chapters contain treatments of entropy, mutual information, lossless source coding, channel capacity, and rate distortion theory; however, it is the selection, ordering, and presentation of the topics within these broad categories that is unique to this concise book. While the cover

  13. Recruiting highly educated graduates: a study on the relationship between recruitment information sources, the theory of planned behavior, and actual job pursuit

    NARCIS (Netherlands)

    Jaidi, Y.; van Hooft, E.A.J.; Arends, L.R.

    2011-01-01

    Using the theory of planned behavior, we examined the effects of different recruitment-related information sources on the job pursuit of highly educated graduates. The study was conducted using a real-life longitudinal design. Participants reported on potential employers they were interested in. We

  14. From bed to bench: bridging from informatics practice to theory: an exploratory analysis.

    Science.gov (United States)

    Haux, R; Lehmann, C U

    2014-01-01

    In 2009, Applied Clinical Informatics (ACI)--focused on applications in clinical informatics--was launched as a companion journal to Methods of Information in Medicine (MIM). Both journals are official journals of the International Medical Informatics Association. To explore which congruencies and interdependencies exist in publications from theory to practice and from practice to theory, and to determine existing gaps. Major topics discussed in ACI and MIM were analyzed. We explored whether the intention of publishing companion journals - to provide an information bridge from informatics theory to informatics practice and vice versa - could be supported by this model. In this manuscript we report on congruencies and interdependencies from practice to theory and on major topics in MIM. Retrospective, prolective observational study on recent publications of ACI and MIM. All publications of the years 2012 and 2013 were indexed and analyzed. One hundred and ninety-six publications were analyzed (ACI 87, MIM 109). In MIM publications, modelling aspects as well as methodological and evaluation approaches for the analysis of data, information, and knowledge in biomedicine and health care were frequently raised - and often discussed from an interdisciplinary point of view. Important themes were ambient-assisted living, anatomic spatial relations, biomedical informatics as a scientific discipline, boosting, coding, computerized physician order entry, data analysis, grid and cloud computing, health care systems and services, health-enabling technologies, health information search, health information systems, imaging, knowledge-based decision support, patient records, signal analysis, and web science. Congruencies between the journals could be found in themes, but with a different focus on content. Interdependencies from practice to theory found in these publications were limited. Bridging from informatics theory to practice and vice versa remains a major component of successful...

  15. Transmission from theory to practice: Experiences using open-source code development and a virtual short course to increase the adoption of new theoretical approaches

    Science.gov (United States)

    Harman, C. J.

    2015-12-01

    Even amongst the academic community, new theoretical tools can remain underutilized due to the investment of time and resources required to understand and implement them. This surely limits the frequency that new theory is rigorously tested against data by scientists outside the group that developed it, and limits the impact that new tools could have on the advancement of science. Reducing the barriers to adoption through online education and open-source code can bridge the gap between theory and data, forging new collaborations, and advancing science. A pilot venture aimed at increasing the adoption of a new theory of time-variable transit time distributions was begun in July 2015 as a collaboration between Johns Hopkins University and The Consortium of Universities for the Advancement of Hydrologic Science (CUAHSI). There were four main components to the venture: a public online seminar covering the theory, an open source code repository, a virtual short course designed to help participants apply the theory to their data, and an online forum to maintain discussion and build a community of users. 18 participants were selected for the non-public components based on their responses in an application, and were asked to fill out a course evaluation at the end of the short course, and again several months later. These evaluations, along with participation in the forum and on-going contact with the organizer suggest strengths and weaknesses in this combination of components to assist participants in adopting new tools.

  16. Nonlinear analysis approximation theory, optimization and applications

    CERN Document Server

    2014-01-01

    Many of our daily-life problems can be written in the form of an optimization problem. Therefore, solution methods are needed to solve such problems. Due to the complexity of the problems, it is not always easy to find the exact solution. However, approximate solutions can be found. The theory of the best approximation is applicable in a variety of problems arising in nonlinear functional analysis and optimization. This book highlights interesting aspects of nonlinear analysis and optimization together with many applications in the areas of physical and social sciences including engineering. It is immensely helpful for young graduates and researchers who are pursuing research in this field, as it provides abundant research resources for researchers and post-doctoral fellows. This will be a valuable addition to the library of anyone who works in the field of applied mathematics, economics and engineering.

  17. Utilization of graph theory in security analysis of power grid

    Directory of Open Access Journals (Sweden)

    Dalibor Válek

    2014-12-01

    Full Text Available This paper describes a way to use graph theory in security analysis. The environment considered is a network of power lines and the devices connected to them. The power grid is treated as a system of nodes which together form a graph (network). Fiedler's theory, which is able to select the most important power lines of the whole network, is applied to a simple example. The components related to these lines are logically ordered and assessed by the author's modified analysis. This method has been refined and optimized for risks related to illegal acts. Each power-grid component is associated with a possible kind of attack, and every device is evaluated by five coefficients taking values from 1 to 10. The level of risk is assessed on the basis of these coefficients. In the last phase, the most risky power-grid components are selected and security measures are proposed for them.
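
    A minimal sketch of the spectral step, assuming a toy six-node grid rather than a real network: compute the graph Laplacian, take its second-smallest eigenpair (the Fiedler value and vector), and rank lines by how strongly the Fiedler vector separates their endpoints - a common spectral heuristic for finding the weakest cut, not the author's full risk-scoring scheme.

        import numpy as np

        # Toy grid: 6 substations, edges are power lines (illustrative only).
        edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 0), (1, 4)]
        n = 6
        A = np.zeros((n, n))
        for i, j in edges:
            A[i, j] = A[j, i] = 1.0
        lap = np.diag(A.sum(axis=1)) - A        # graph Laplacian L = D - A

        w, v = np.linalg.eigh(lap)              # eigenvalues in ascending order
        print("algebraic connectivity:", round(w[1], 3))
        fiedler = v[:, 1]                       # Fiedler vector

        # Lines whose endpoints differ most in the Fiedler vector sit on
        # the weakest cut of the network.
        ranked = sorted(edges, key=lambda e: -abs(fiedler[e[0]] - fiedler[e[1]]))
        for e in ranked:
            print(e, round(abs(fiedler[e[0]] - fiedler[e[1]]), 3))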

  18. Inference algorithms and learning theory for Bayesian sparse factor analysis

    International Nuclear Information System (INIS)

    Rattray, Magnus; Sharp, Kevin; Stegle, Oliver; Winn, John

    2009-01-01

    Bayesian sparse factor analysis has many applications; for example, it has been applied to the problem of inferring a sparse regulatory network from gene expression data. We describe a number of inference algorithms for Bayesian sparse factor analysis using a slab and spike mixture prior. These include well-established Markov chain Monte Carlo (MCMC) and variational Bayes (VB) algorithms as well as a novel hybrid of VB and Expectation Propagation (EP). For the case of a single latent factor we derive a theory for learning performance using the replica method. We compare the MCMC and VB/EP algorithm results with simulated data to the theoretical prediction. The results for MCMC agree closely with the theory as expected. Results for VB/EP are slightly sub-optimal but show that the new algorithm is effective for sparse inference. In large-scale problems MCMC is infeasible due to computational limitations and the VB/EP algorithm then provides a very useful computationally efficient alternative.
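
    The generative side of the model is easy to write down; the sketch below draws a loading matrix from a slab-and-spike prior and simulates expression-like data (dimensions and hyperparameters invented). The inference algorithms discussed in the paper - MCMC, VB and the VB/EP hybrid - are what invert this model.

        import numpy as np

        rng = np.random.default_rng(3)
        genes, factors, samples = 50, 3, 100     # invented dimensions

        # Slab-and-spike prior on the loading matrix W: each entry is zero
        # with probability 1 - pi (the spike) or Gaussian (the slab).
        pi, slab_sd = 0.15, 1.0
        mask = rng.random((genes, factors)) < pi
        W = mask * rng.normal(0.0, slab_sd, size=(genes, factors))

        X = rng.standard_normal((factors, samples))              # latent factor activities
        Y = W @ X + 0.1 * rng.standard_normal((genes, samples))  # observed data

        print("fraction of nonzero loadings:", mask.mean())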

  19. Inference algorithms and learning theory for Bayesian sparse factor analysis

    Energy Technology Data Exchange (ETDEWEB)

    Rattray, Magnus; Sharp, Kevin [School of Computer Science, University of Manchester, Manchester M13 9PL (United Kingdom); Stegle, Oliver [Max-Planck-Institute for Biological Cybernetics, Tuebingen (Germany); Winn, John, E-mail: magnus.rattray@manchester.ac.u [Microsoft Research Cambridge, Roger Needham Building, Cambridge, CB3 0FB (United Kingdom)

    2009-12-01

    Bayesian sparse factor analysis has many applications; for example, it has been applied to the problem of inferring a sparse regulatory network from gene expression data. We describe a number of inference algorithms for Bayesian sparse factor analysis using a slab and spike mixture prior. These include well-established Markov chain Monte Carlo (MCMC) and variational Bayes (VB) algorithms as well as a novel hybrid of VB and Expectation Propagation (EP). For the case of a single latent factor we derive a theory for learning performance using the replica method. We compare the MCMC and VB/EP algorithm results with simulated data to the theoretical prediction. The results for MCMC agree closely with the theory as expected. Results for VB/EP are slightly sub-optimal but show that the new algorithm is effective for sparse inference. In large-scale problems MCMC is infeasible due to computational limitations and the VB/EP algorithm then provides a very useful computationally efficient alternative.

  20. Mobile applications for weight management: theory-based content analysis.

    Science.gov (United States)

    Azar, Kristen M J; Lesser, Lenard I; Laing, Brian Y; Stephens, Janna; Aurora, Magi S; Burke, Lora E; Palaniappan, Latha P

    2013-11-01

    The use of smartphone applications (apps) to assist with weight management is increasingly prevalent, but the quality of these apps is not well characterized. The goal of the study was to evaluate diet/nutrition and anthropometric tracking apps based on incorporation of features consistent with theories of behavior change. A comparative, descriptive assessment was conducted of the top-rated free apps in the Health and Fitness category available in the iTunes App Store. Health and Fitness apps (N=200) were evaluated using predetermined inclusion/exclusion criteria and categorized based on commonality in functionality, features, and developer description. Four researchers then evaluated the two most popular apps in each category using two instruments: one based on traditional behavioral theory (score range: 0-100) and the other on the Fogg Behavioral Model (score range: 0-6). Data collection and analysis occurred in November 2012. Eligible apps (n=23) were divided into five categories: (1) diet tracking; (2) healthy cooking; (3) weight/anthropometric tracking; (4) grocery decision making; and (5) restaurant decision making. The mean behavioral theory score was 8.1 (SD=4.2); the mean persuasive technology score was 1.9 (SD=1.7). The top-rated app on both scales was Lose It! by Fitnow Inc. All apps received low overall scores for inclusion of behavioral theory-based strategies. © 2013 American Journal of Preventive Medicine.

  1. Stepped-frequency radar sensors theory, analysis and design

    CERN Document Server

    Nguyen, Cam

    2016-01-01

    This book presents the theory, analysis and design of microwave stepped-frequency radar sensors. Stepped-frequency radar sensors are attractive for various sensing applications that require fine resolution. The book consists of five chapters. The first chapter describes the fundamentals of radar sensors including applications followed by a review of ultra-wideband pulsed, frequency-modulated continuous-wave (FMCW), and stepped-frequency radar sensors. The second chapter discusses a general analysis of radar sensors including wave propagation in media and scattering on targets, as well as the radar equation. The third chapter addresses the analysis of stepped-frequency radar sensors including their principles and design parameters. Chapter 4 presents the development of two stepped-frequency radar sensors at microwave and millimeter-wave frequencies based on microwave integrated circuits (MICs), microwave monolithic integrated circuits (MMICs) and printed-circuit antennas, and discusses their signal processing....

  2. Proposal for a source of polarized protons; Projet de source de protons polarises

    Energy Technology Data Exchange (ETDEWEB)

    Abragam, A.; Winter, J. M. [Commissariat a l' energie atomique et aux energies alternatives - CEA, Centre d' Etudes Nucleaires de Saclay, BP2, Gif-sur-Yvette (France)

    1959-07-01

    Proposal for a source of polarized protons based on the theory of adiabatic fast passage due to F. Bloch. Reprint of a paper published in 'Physical review letters', vol 1, n. 10, 15 Nov 1958, p. 374-375. [Translated from the French: A new method is proposed for realizing a source of polarized protons, based on F. Bloch's theory of adiabatic passage. Reproduction of a paper published in 'Physical review letters', vol 1, n. 10, 15 Nov 1958, p. 374-375.]

  3. Logistics Sourcing Strategies in Supply Chain Design

    OpenAIRE

    Liu, Liwen

    2007-01-01

    A company's logistics sourcing strategy determines whether it structures and organizes logistics within the company or company group or integrates logistics upstream and downstream in the supply chain. First, three different types of logistics sourcing strategies in supply chain design are described and the theoretical background for the development of these strategies, including both transaction cost theory and network theory, is analyzed. Two special cases about logistics sourcing strategy decis...

  4. Chemometric Analysis for Pollution Source Assessment of Harbour Sediments in Arctic Locations

    DEFF Research Database (Denmark)

    Pedersen, Kristine B.; Lejon, Tore; Jensen, Pernille Erland

    2015-01-01

    Pollution levels, pollutant distribution and potential source assessments based on multivariate analysis (chemometrics) were made for harbour sediments from two Arctic locations; Hammerfest in Norway and Sisimiut in Greenland. High levels of heavy metals were detected in addition to organic … pollutants. Preliminary assessments based on principal component analysis (PCA) revealed different sources and pollutant distribution in the sediments of the two harbours. Tributyltin (TBT) was, however, found to originate from point source(s), and the highest concentrations of TBT in both harbours were … indicated relation primarily to German, Russian and American mixtures in Hammerfest; and American, Russian and Japanese mixtures in Sisimiut. PCA was shown to be an important tool for identifying pollutant sources and differences in pollutant composition in relation to sediment characteristics…
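
    As an illustration of the chemometric screening step (not the authors' code; the data below are synthetic stand-ins for measured analyte concentrations), a PCA of an autoscaled samples-by-analytes matrix yields scores that cluster samples by likely source and loadings that show which analytes drive each component:

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        X = rng.lognormal(1.0, 0.5, size=(40, 6))   # 40 sediment samples x 6 analytes (hypothetical)

        Xs = StandardScaler().fit_transform(X)      # autoscale: zero mean, unit variance per analyte
        pca = PCA(n_components=2).fit(Xs)
        scores = pca.transform(Xs)                  # sample scores: clusters hint at shared sources
        print(pca.explained_variance_ratio_)        # variance captured by PC1 and PC2
        print(pca.components_)                      # loadings: analyte contributions per component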

  5. International Conference Modern Stochastics: Theory and Applications III

    CERN Document Server

    Limnios, Nikolaos; Mishura, Yuliya; Sakhno, Lyudmyla; Shevchenko, Georgiy; Modern Stochastics and Applications

    2014-01-01

    This volume presents an extensive overview of all major modern trends in applications of probability and stochastic analysis. It will be a great source of inspiration for designing new algorithms, modeling procedures, and experiments. Accessible to researchers, practitioners, as well as graduate and postgraduate students, this volume presents a variety of new tools, ideas, and methodologies in the fields of optimization, physics, finance, probability, hydrodynamics, reliability, decision making, mathematical finance, mathematical physics, and economics. Contributions to this work include those of selected speakers from the international conference entitled “Modern Stochastics: Theory and Applications III,” held on September 10–14, 2012 at Taras Shevchenko National University of Kyiv, Ukraine. The conference covered the following areas of research in probability theory and its applications: stochastic analysis, stochastic processes and fields, random matrices, optimization methods in probability, st...

  6. Applicability of the diffusion and simplified P3 theories for BWR pin-by-pin core analysis

    International Nuclear Information System (INIS)

    Tada, Kenichi; Yamamoto, Akio; Kitamura, Yasunori; Yamane, Yoshihiro; Watanabe, Masato; Noda, Hiroshi

    2007-01-01

    The pin-by-pin fine mesh core calculation method is considered a candidate next-generation core calculation method for BWR. In this study, the diffusion and the simplified P3 (SP3) theories are applied to pin-by-pin core analysis of BWR. Performances of the diffusion and the SP3 theories for cell-homogeneous pin-by-pin fine mesh BWR core analysis are evaluated through comparison with cell-heterogeneous detailed transport calculation by the method of characteristics (MOC). In this study, a two-dimensional, 2x2 multi-assembly geometry is used to compare the prediction accuracies of the diffusion and the SP3 theories. The 2x2 multi-assembly geometry consists of two types of 9x9 UO2 assembly that have two different enrichment splittings. To mitigate the cell-homogenization error, the SPH method is applied for the pin-by-pin fine mesh calculation. The SPH method is a technique that reproduces a result of heterogeneous calculation by that of homogeneous calculation. The calculation results indicated that the diffusion theory shows larger discrepancies on pin-wise fission rates than the SP3 theory, and that its accuracy would not be sufficient for the pin-by-pin fine mesh calculation. In contrast to the diffusion theory, the SP3 theory shows much better accuracy on pin-wise fission rates. Therefore, if the SP3 theory is applied, the accuracy of the pin-by-pin fine mesh BWR core analysis will be higher and will be sufficient for production calculation. (author)
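
    As a hedged sketch of the SPH correction mentioned above (generic notation, not necessarily the paper's): for each homogenized cell i and group g, the ratio of the heterogeneous reference flux to the homogeneous flux defines a factor that rescales the homogenized cross sections, and the procedure is iterated until the homogeneous calculation reproduces the reference reaction rates:

        \mu_{i,g} = \frac{\bar{\phi}_{i,g}^{\mathrm{het}}}{\bar{\phi}_{i,g}^{\mathrm{hom}}},
        \qquad
        \tilde{\Sigma}_{x,i,g} = \mu_{i,g}\, \Sigma_{x,i,g}

    With the corrected cross sections, the homogeneous pin-by-pin calculation reproduces the heterogeneous reaction rates by construction.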

  7. Finding joy in social work. II: Intrapersonal sources.

    Science.gov (United States)

    Pooler, David Kenneth; Wolfer, Terry; Freeman, Miriam

    2014-07-01

    Despite the social work profession's strengths orientation, research on its workforce tends to focus on problems (for example, depression, problem drinking, compassion fatigue, burnout). In contrast, this study explored ways in which social workers find joy in their work. The authors used an appreciative inquiry approach, semistructured interviews (N = 26), and a collaborative grounded theory method of analysis. Participants identified interpersonal (making connections and making a difference) and intrapersonal (making meaning and making a life) sources of joy and reflected significant personal initiative in the process of finding joy. The authors present findings regarding these intrapersonal sources of joy.

  8. The Use of Modelling for Theory Building in Qualitative Analysis

    Science.gov (United States)

    Briggs, Ann R. J.

    2007-01-01

    The purpose of this article is to exemplify and enhance the place of modelling as a qualitative process in educational research. Modelling is widely used in quantitative research as a tool for analysis, theory building and prediction. Statistical data lend themselves to graphical representation of values, interrelationships and operational…

  9. Facilitating Constructive Alignment in Power Systems Engineering Education Using Free and Open-Source Software

    Science.gov (United States)

    Vanfretti, L.; Milano, F.

    2012-01-01

    This paper describes how the use of free and open-source software (FOSS) can facilitate the application of constructive alignment theory in power systems engineering education by enabling the deep learning approach in power system analysis courses. With this aim, this paper describes the authors' approach in using the Power System Analysis Toolbox…

  10. Fit for Practice: Analysis and Evaluation of Watson's Theory of Human Caring.

    Science.gov (United States)

    Pajnkihar, Majda; McKenna, Hugh P; Štiglic, Gregor; Vrbnjak, Dominika

    2017-07-01

    The aim of the authors of this paper is to analyze Watson's theory of human caring for its usefulness and worth in education, practice, and research. The reason for undertaking this analysis is to evaluate if Watson's theory would be useful for nursing in those countries where such theories were not an established part of the nursing curriculum. Furthermore, in some European countries, their political past or cultural influences led to an unquestioned adoption of the biomedical model. As their political culture changes, many social structures have had to be revisited, and for nursing, this has meant the introduction of theoretical reasoning, teaching, and practice.

  11. Neutronics of the IFMIF neutron source: development and analysis

    International Nuclear Information System (INIS)

    Wilson, P.P.H.

    1999-01-01

    The accurate analysis of this system required the development of a code system and methodology capable of modelling the various physical processes. A generic code system for the neutronics analysis of neutron sources has been created by loosely integrating existing components with new developments: the data processing code NJOY, the Monte Carlo neutron transport code MCNP, and the activation code ALARA were supplemented by a damage data processing program, damChar, and integrated with a number of flexible and extensible modules for the Perl scripting language. Specific advances were required to apply this code system to IFMIF. Based on the ENDF-6 data format requirements of this system, new data evaluations have been implemented for neutron transport and activation. Extensive analysis of the Li(d,xn) reaction has led to a new MCNP source function module, McDeLi, based on physical reaction models and capable of accurate and flexible modelling of the IFMIF neutron source term. In-depth analyses of the neutron flux spectra and spatial distribution throughout the high flux test region permitted a basic validation of the tools and data. The understanding of the features of the neutron flux provided a foundation for the analyses of the other neutron responses. (orig./DGE)

  12. Exploratory analysis of a neutron-rich nuclei source based on photo-fission

    CERN Document Server

    Mirea, M; Clapier, F; Essabaa, S; Groza, L; Ibrahim, F; Kandri-Rody, S; Müller, A C; Pauwels, N; Proust, J

    2003-01-01

    A source of neutron-rich ions can be conceived through the photo-fission process. An exploratory study of such a source is realized. A survey of the radiative electron energy loss theory is reported in order to estimate numerically the bremsstrahlung production of thick targets. The resulting theoretical bremsstrahlung angular and energy distributions delivered from W and UCx thick converters are presented and compared with previous results. Quantities such as the number of fission events produced in the fissionable source and the energy loss in the converters are also reported as a function of the geometry of the combination and the incident electron energy. A comparison with experimental data shows quantitative agreement. This study is focussed on initial kinetic energies of the electron beam in the range 30-60 MeV, suitable for the production of large radiative gamma-ray yields able to induce 238U fission through the giant dipole resonance. A confrontation with the number of fi...

  13. Combinatorial constructions in ergodic theory and dynamics

    CERN Document Server

    Katok, Anatole

    2003-01-01

    Ergodic theory studies measure-preserving transformations of measure spaces. These objects are intrinsically infinite and the notion of an individual point or an orbit makes no sense. Still there is a variety of situations when a measure-preserving transformation (and its asymptotic behavior) can be well described as a limit of certain finite objects (periodic processes). In the first part of this book this idea is developed systematically, genericity of approximation in various categories is explored, and numerous applications are presented, including spectral multiplicity and properties of the maximal spectral type. The second part of the book contains a treatment of various constructions of cohomological nature with an emphasis on obtaining interesting asymptotic behavior from approximate pictures at different time scales. The book presents a view of ergodic theory not found in other expository sources and is suitable for graduate students familiar with measure theory and basic functional analysis.

  14. Acoustic array systems theory, implementation, and application

    CERN Document Server

    Bai, Mingsian R; Benesty, Jacob

    2013-01-01

    Presents a unified framework of far-field and near-field array techniques for noise source identification and sound field visualization, from theory to application. Acoustic Array Systems: Theory, Implementation, and Application provides an overview of microphone array technology with applications in noise source identification and sound field visualization. In the comprehensive treatment of microphone arrays, the topics covered include an introduction to the theory, far-field and near-field array signal processing algorithms, practical implementations, and common applic
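
    As a minimal illustration of the far-field processing such arrays rely on (a textbook delay-and-sum beamformer, not code from the book; geometry and frequency are assumed), the following scans steering angles for a uniform linear microphone array and locates a narrowband source:

        import numpy as np

        c, f = 343.0, 2000.0        # sound speed (m/s) and source frequency (Hz), assumed
        M, d = 8, 0.04              # 8 microphones spaced 4 cm apart (assumed)
        theta_src = np.deg2rad(30)  # hypothetical source bearing

        mics = np.arange(M) * d
        x = np.exp(-2j * np.pi * f * mics * np.sin(theta_src) / c)   # noise-free snapshot

        scan = np.deg2rad(np.linspace(-90, 90, 361))
        steer = np.exp(-2j * np.pi * f * np.outer(np.sin(scan), mics) / c) / M
        power = np.abs(steer.conj() @ x) ** 2                        # beamformer output power
        print(f"estimated bearing: {np.rad2deg(scan[np.argmax(power)]):.1f} deg")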

  15. Insult in Context: Incorporating Speech Act Theory in Doctrinal Legal Analysis of Interpretative Discussions

    NARCIS (Netherlands)

    H.T.M. Kloosterhuis (Harm)

    2015-01-01

    In this article, I want to show that some doctrinal problems of legal interpretation and argumentation can be analysed in a more precise way than a standard doctrinal analysis, when we use insights from speech act theory and argumentation theory. Taking a discussion about the accusation

  16. Theory of Belief Functions for Data Analysis and Machine Learning Applications: Review and Prospects

    Science.gov (United States)

    Denoeux, Thierry

    The Dempster-Shafer theory of belief functions provides a unified framework for handling both aleatory uncertainty, arising from statistical variability in populations, and epistemic uncertainty, arising from incompleteness of knowledge. An overview of both the fundamentals and some recent developments in this theory will first be presented. Several applications in data analysis and machine learning will then be reviewed, including learning under partial supervision, multi-label classification, ensemble clustering and the treatment of pairwise comparisons in sensory or preference analysis.

  17. The liquidity preference theory: a critical analysis

    OpenAIRE

    Giancarlo Bertocco; Andrea Kalajzic

    2014-01-01

    Keynes in the General Theory, explains the monetary nature of the interest rate by means of the liquidity preference theory. The objective of this paper is twofold. First, to point out the limits of the liquidity preference theory. Second, to present an explanation of the monetary nature of the interest rate based on the arguments with which Keynes responded to the criticism levelled at the liquidity preference theory by supporters of the loanable funds theory such as Ohlin and Robertson. It ...

  18. Pteros: fast and easy to use open-source C++ library for molecular analysis.

    Science.gov (United States)

    Yesylevskyy, Semen O

    2012-07-15

    An open-source Pteros library for molecular modeling and analysis of molecular dynamics trajectories for C++ programming language is introduced. Pteros provides a number of routine analysis operations ranging from reading and writing trajectory files and geometry transformations to structural alignment and computation of nonbonded interaction energies. The library features asynchronous trajectory reading and parallel execution of several analysis routines, which greatly simplifies development of computationally intensive trajectory analysis algorithms. Pteros programming interface is very simple and intuitive while the source code is well documented and easily extendible. Pteros is available for free under open-source Artistic License from http://sourceforge.net/projects/pteros/. Copyright © 2012 Wiley Periodicals, Inc.

  19. Enhancing source location protection in wireless sensor networks

    Science.gov (United States)

    Chen, Juan; Lin, Zhengkui; Wu, Di; Wang, Bailing

    2015-12-01

    Wireless sensor networks are widely deployed in the internet of things to monitor valuable objects. Once an object is monitored, the sensor nearest to it, known as the source, periodically informs the base station about the object's information. Attackers can therefore capture the object by localizing the source, and many protocols have been proposed to secure the source location. In this paper, however, we show that typical source location protection protocols generate phantom locations that are not only near the source but also highly localized. As a result, attackers can trace the source easily from these phantom locations. To address these limitations, we propose a protocol to enhance source location protection (SLE). With phantom locations far away from the source and widely distributed, SLE improves source location anonymity significantly. Theoretical analysis and simulation results show that SLE provides strong source location privacy preservation, and the average safety period increases by nearly one order of magnitude compared with existing work, at low communication cost.
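
    The core idea can be sketched in a few lines (a toy stand-in for the SLE protocol, with all geometry and thresholds invented): phantom locations are drawn so that they are both far from the source and widely spread over the field, which is what prevents an adversary from back-tracing them to the real source.

        import math, random

        random.seed(1)
        source = (20.0, 20.0)
        field = 100.0     # square deployment area, side length (assumed)
        r_min = 30.0      # minimum phantom-to-source distance (assumed)

        def draw_phantom():
            # Rejection sampling: keep only candidates far enough from the source.
            while True:
                p = (random.uniform(0, field), random.uniform(0, field))
                if math.dist(p, source) >= r_min:
                    return p

        phantoms = [draw_phantom() for _ in range(1000)]
        d = [math.dist(p, source) for p in phantoms]
        print(f"mean phantom distance from source: {sum(d) / len(d):.1f} (min allowed {r_min})")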

  20. Music from Ground Zero. The coloniality of theory and musical analysis at the university.

    Directory of Open Access Journals (Sweden)

    Pilar Jovanna Holguín

    2018-01-01

    This article examines the influence of the coloniality of knowledge, the cleansing of blood, and the hybris of the zero point on the conception of music theory and analysis as taught in our Latin American universities. The aim is to examine the survival of coloniality in the imaginary of music theory and analysis in order to recognize the predominant hegemonic discourses and propose some options for decolonizing these disciplines of professional music education. The rationale is based on the presentation of some concepts from decolonial studies, research on the conservatory model, and the ideology of theory and analysis. The article is divided into three parts. The first approaches the concept of coloniality and its categories. The second uses those categories to review the ideology of the chosen fields of musical knowledge, and the third proposes some options for decolonizing our conceptions in higher education.

  1. Foreshocks and aftershocks of strong earthquakes in the light of catastrophe theory

    International Nuclear Information System (INIS)

    Guglielmi, A V

    2015-01-01

    In this review, general ideas and specific results from catastrophe theory and the theory of critical phenomena are applied to the analysis of strong earthquakes. Aspects given particular attention are the sharp rise in the fluctuation level, the increased reactivity of dynamical systems in the near-threshold region, and other anomalous phenomena similar to critical opalescence. Given the lack of a sufficiently complete theory of earthquakes, this appears to be a valid approach to the analysis of observations. The study performed brought out some nontrivial properties of a strong-earthquake source that manifest themselves both before and after the main rupture discontinuity forms at the mainshock. In the course of the analysis of the foreshocks and aftershocks, such concepts as the round-the-world seismic echo, the cumulative effect of converging surface waves on the epicentral zone, and global seismicity modulation by Earth's free oscillations are introduced. Further research in this field is likely to be interesting and promising. (methodological notes)

  2. Foreshocks and aftershocks of strong earthquakes in the light of catastrophe theory

    Science.gov (United States)

    Guglielmi, A. V.

    2015-04-01

    In this review, general ideas and specific results from catastrophe theory and the theory of critical phenomena are applied to the analysis of strong earthquakes. Aspects given particular attention are the sharp rise in the fluctuation level, the increased reactivity of dynamical systems in the near-threshold region, and other anomalous phenomena similar to critical opalescence. Given the lack of a sufficiently complete theory of earthquakes, this appears to be a valid approach to the analysis of observations. The study performed brought out some nontrivial properties of a strong-earthquake source that manifest themselves both before and after the main rupture discontinuity forms at the mainshock. In the course of the analysis of the foreshocks and aftershocks, such concepts as the round-the-world seismic echo, the cumulative effect of converging surface waves on the epicentral zone, and global seismicity modulation by Earth's free oscillations are introduced. Further research in this field is likely to be interesting and promising.

  3. EVOLUTIONARY THEORY AND THE MARKET COMPETITION

    Directory of Open Access Journals (Sweden)

    SIRGHI Nicoleta

    2014-12-01

    Evolutionary theory studies the processes that transform the economy, for firms, institutions, industries, employment, production, trade and growth, through the actions of diverse agents learning from experience and interaction, using an evolutionary methodology. Evolutionary theory analyses the unleashing of a process of technological and institutional innovation by generating and testing a diversity of ideas which discover and accumulate more survival value for the costs incurred than competing alternatives. This paper studies the behaviour of firms on the market using evolutionary theory. It presents in full the developments that have led to the re-assessment of theories of the firm, starting from the criticism of Coase's theory based on its lack of testable hypotheses and its non-operative definition of transaction costs. For a long period the literature allotted the study of firms a secondary place; today the new theories of the firm hold a dominant place in the economic analysis of firms. In an article published in 1937, Ronald H. Coase identified the main sources of the cost of using the market mechanism. The theory of the firm is an intensively studied issue in the literature, regarding the survival, competitiveness and innovation of firms on the market. The research of Nelson and Winter, “An Evolutionary Theory of Economic Change” (1982), is the starting point for a modern literature which approaches the theory of the firm from an evolutionary perspective. Nelson and Winter showed that the “orthodox” theory is objectionable primarily because its hypothesis of profit maximization has a normative character and is not valid in every situation. Nelson and Winter reconsidered microeconomic analysis, showing that excessive attention should not be paid to market equilibrium but rather to dynamic processes resulting from irreversible

  4. A Workshop on Algebraic Design Theory and Hadamard Matrices

    CERN Document Server

    2015-01-01

    This volume develops the depth and breadth of the mathematics underlying the construction and analysis of Hadamard matrices and their use in the construction of combinatorial designs. At the same time, it pursues current research in their numerous applications in security and cryptography, quantum information, and communications. Bridges among diverse mathematical threads and extensive applications make this an invaluable source for understanding both the current state of the art and future directions. The existence of Hadamard matrices remains one of the most challenging open questions in combinatorics. Substantial progress on their existence has resulted from advances in algebraic design theory using deep connections with linear algebra, abstract algebra, finite geometry, number theory, and combinatorics. Hadamard matrices arise in a very diverse set of applications. Starting with applications in experimental design theory and the theory of error-correcting codes, they have found unexpected and important ap...
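
    A small concrete anchor for the construction problem the volume discusses: Sylvester's doubling construction produces Hadamard matrices of every order 2^k, and the defining property H H^T = nI is easy to verify (a standard textbook construction, not specific to this volume):

        import numpy as np

        def sylvester(k: int) -> np.ndarray:
            # Repeated doubling: H -> [[H, H], [H, -H]] starting from [[1]].
            H = np.array([[1]])
            for _ in range(k):
                H = np.block([[H, H], [H, -H]])
            return H

        H = sylvester(3)                          # order-8 Hadamard matrix
        n = H.shape[0]
        assert (H @ H.T == n * np.eye(n)).all()   # rows are mutually orthogonal
        print(H)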

  5. Applications of model theory to functional analysis

    CERN Document Server

    Iovino, Jose

    2014-01-01

    During the last two decades, methods that originated within mathematical logic have exhibited powerful applications to Banach space theory, particularly set theory and model theory. This volume constitutes the first self-contained introduction to techniques of model theory in Banach space theory. The area of research has grown rapidly since this monograph's first appearance, but much of this material is still not readily available elsewhere. For instance, this volume offers a unified presentation of Krivine's theorem and the Krivine-Maurey theorem on stable Banach spaces, with emphasis on the

  6. Bispectral pairwise interacting source analysis for identifying systems of cross-frequency interacting brain sources from electroencephalographic or magnetoencephalographic signals

    Science.gov (United States)

    Chella, Federico; Pizzella, Vittorio; Zappasodi, Filippo; Nolte, Guido; Marzetti, Laura

    2016-05-01

    Brain cognitive functions arise through the coordinated activity of several brain regions, which actually form complex dynamical systems operating at multiple frequencies. These systems often consist of interacting subsystems, whose characterization is of importance for a complete understanding of the brain interaction processes. To address this issue, we present a technique, namely the bispectral pairwise interacting source analysis (biPISA), for analyzing systems of cross-frequency interacting brain sources when multichannel electroencephalographic (EEG) or magnetoencephalographic (MEG) data are available. Specifically, the biPISA makes it possible to identify one or many subsystems of cross-frequency interacting sources by decomposing the antisymmetric components of the cross-bispectra between EEG or MEG signals, based on the assumption that interactions are pairwise. Thanks to the properties of the antisymmetric components of the cross-bispectra, biPISA is also robust to spurious interactions arising from mixing artifacts, i.e., volume conduction or field spread, which always affect EEG or MEG functional connectivity estimates. This method is an extension of the pairwise interacting source analysis (PISA), which was originally introduced for investigating interactions at the same frequency, to the study of cross-frequency interactions. The effectiveness of this approach is demonstrated in simulations for up to three interacting source pairs and for real MEG recordings of spontaneous brain activity. Simulations show that the performances of biPISA in estimating the phase difference between the interacting sources are affected by the increasing level of noise rather than by the number of the interacting subsystems. The analysis of real MEG data reveals an interaction between two pairs of sources of central mu and beta rhythms, localizing in the proximity of the left and right central sulci.

  7. GEOSPATIAL ANALYSIS OF ATMOSPHERIC HAZE EFFECT BY SOURCE AND SINK LANDSCAPE

    Directory of Open Access Journals (Sweden)

    T. Yu

    2017-09-01

    Based on a geospatial analysis model, this paper analyzes the relationship between the landscape patterns of source and sink in urban areas and atmospheric haze pollution. Firstly, the classification result and aerosol optical thickness (AOD) of Wuhan are divided into a number of square grids with a side length of 6 km, and the category-level landscape indices (PLAND, PD, COHESION, LPI, FRAC_MN) and AOD of each grid are calculated. Then the source and sink landscapes of atmospheric haze pollution are selected based on the correlation between landscape indices and AOD. Next, to make the following analysis more efficient, the indices selected before are screened using the correlation coefficients between them. Finally, owing to the spatial dependency and spatial heterogeneity of the data used in this paper, a spatial autoregressive model and a geographically weighted regression model are used to analyze the atmospheric haze effect of source and sink landscapes at the global and local levels. The results show that the source landscape of atmospheric haze pollution is building land, and the sink landscapes are shrub and woodland. PLAND, PD and COHESION are suitable for describing the atmospheric haze effect of source and sink landscapes. Comparing these models, the fit of SLM, SEM and GWR is significantly better than that of the OLS model, and the SLM model is superior to the SEM model in this paper. Although the GWR model fits less well than SLM, it expresses more clearly how strongly the influencing factors act on atmospheric haze in different places. From the results of these models, the following conclusions can be drawn: reducing the proportion of source landscape area and increasing its degree of fragmentation could cut down aerosol optical thickness; and distributing the source and sink landscapes evenly and interspersedly could effectively reduce aerosol optical thickness, which represents

  8. Geospatial Analysis of Atmospheric Haze Effect by Source and Sink Landscape

    Science.gov (United States)

    Yu, T.; Xu, K.; Yuan, Z.

    2017-09-01

    Based on a geospatial analysis model, this paper analyzes the relationship between the landscape patterns of source and sink in urban areas and atmospheric haze pollution. Firstly, the classification result and aerosol optical thickness (AOD) of Wuhan are divided into a number of square grids with a side length of 6 km, and the category-level landscape indices (PLAND, PD, COHESION, LPI, FRAC_MN) and AOD of each grid are calculated. Then the source and sink landscapes of atmospheric haze pollution are selected based on the correlation between landscape indices and AOD. Next, to make the following analysis more efficient, the indices selected before are screened using the correlation coefficients between them. Finally, owing to the spatial dependency and spatial heterogeneity of the data used in this paper, a spatial autoregressive model and a geographically weighted regression model are used to analyze the atmospheric haze effect of source and sink landscapes at the global and local levels. The results show that the source landscape of atmospheric haze pollution is building land, and the sink landscapes are shrub and woodland. PLAND, PD and COHESION are suitable for describing the atmospheric haze effect of source and sink landscapes. Comparing these models, the fit of SLM, SEM and GWR is significantly better than that of the OLS model, and the SLM model is superior to the SEM model in this paper. Although the GWR model fits less well than SLM, it expresses more clearly how strongly the influencing factors act on atmospheric haze in different places. From the results of these models, the following conclusions can be drawn: reducing the proportion of source landscape area and increasing its degree of fragmentation could cut down aerosol optical thickness; and distributing the source and sink landscapes evenly and interspersedly could effectively reduce aerosol optical thickness, which represents atmospheric haze
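
    The fishnet-and-correlate step can be sketched with synthetic rasters (invented data, plain correlation only; the paper's SLM/SEM/GWR fits require spatial econometrics tooling not reproduced here). PLAND for a class is simply its area share per grid cell:

        import numpy as np

        rng = np.random.default_rng(0)
        building = rng.random((60, 60)) < 0.3                 # hypothetical "source" class mask
        aod = 0.4 * building + rng.normal(0, 0.2, (60, 60))   # synthetic AOD tied to buildings

        k = 6                                                 # raster cells per grid side (stand-in for 6 km)
        pland = building.reshape(10, k, 10, k).mean(axis=(1, 3))   # class share per grid cell
        aod_grid = aod.reshape(10, k, 10, k).mean(axis=(1, 3))     # mean AOD per grid cell
        r = np.corrcoef(pland.ravel(), aod_grid.ravel())[0, 1]
        print(f"PLAND(building) vs AOD: r = {r:.2f}")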

  9. Stratified source-sampling techniques for Monte Carlo eigenvalue analysis

    International Nuclear Information System (INIS)

    Mohamed, A.

    1998-01-01

    In 1995, at a conference on criticality safety, a special session was devoted to the Monte Carlo "Eigenvalue of the World" problem. Argonne presented a paper at that session in which the anomalies originally observed in that problem were reproduced in a much simplified model-problem configuration, and removed by a version of stratified source-sampling. In this paper, stratified source-sampling techniques are generalized and applied to three different Eigenvalue of the World configurations which take into account real-world statistical noise sources not included in the model problem, but which differ in the amount of neutronic coupling among the constituents of each configuration. It is concluded that, in Monte Carlo eigenvalue analysis of loosely-coupled arrays, the use of stratified source-sampling reduces the probability of encountering an anomalous result relative to conventional source-sampling methods. However, this gain in reliability is substantially less than that observed in the model-problem results.
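
    The gist of stratified source-sampling can be shown in a toy comparison (weights and counts invented): instead of drawing all N source sites multinomially, each constituent is first given its expected integer share deterministically, so only the fractional remainder is left to chance and per-constituent counts cannot fluctuate anomalously:

        import numpy as np

        rng = np.random.default_rng(0)
        weights = np.array([0.5, 0.3, 0.2])   # source fractions of three loosely-coupled units (assumed)
        N = 1000                              # source sites per generation (assumed)

        conventional = rng.multinomial(N, weights, size=2000)   # fully random allocation
        base = np.floor(N * weights).astype(int)                # guaranteed stratified share
        remainder = N - base.sum()                              # only this part is sampled randomly

        print("multinomial std per unit:", conventional.std(axis=0).round(1))
        print("stratified base counts:", base, "+", remainder, "randomly placed sites")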

  10. Statistical Analysis of Designed Experiments Theory and Applications

    CERN Document Server

    Tamhane, Ajit C

    2012-01-01

    An indispensable guide to understanding and designing modern experiments. The tools and techniques of Design of Experiments (DOE) allow researchers to successfully collect, analyze, and interpret data across a wide array of disciplines. Statistical Analysis of Designed Experiments provides a modern and balanced treatment of DOE methodology with thorough coverage of the underlying theory and standard designs of experiments, guiding the reader through applications to research in various fields such as engineering, medicine, business, and the social sciences. The book supplies a foundation for the

  11. Human factors and fuzzy set theory for safety analysis

    International Nuclear Information System (INIS)

    Nishiwaki, Y.

    1987-01-01

    Human reliability and performance is affected by many factors: medical, physiological and psychological, etc. The uncertainty involved in human factors may not necessarily be probabilistic, but fuzzy. Therefore, it is important to develop a theory by which both the non-probabilistic uncertainties, or fuzziness, of human factors and the probabilistic properties of machines can be treated consistently. In reality, randomness and fuzziness are sometimes mixed. From the mathematical point of view, probabilistic measures may be considered a special case of fuzzy measures. Therefore, fuzzy set theory seems to be an effective tool for analysing man-machine systems. The concept 'failure possibility' based on fuzzy sets is suggested as an approach to safety analysis and fault diagnosis of a large complex system. Fuzzy measures and fuzzy integrals are introduced and their possible applications are also discussed. (author)
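
    The 'failure possibility' notion can be given a minimal numerical form (membership functions below are invented for illustration): with a possibility distribution π(x) for the uncertain human-influenced variable and a fuzzy failure set with membership μ(x), the possibility of failure is the sup-min composition of the two:

        import numpy as np

        x = np.linspace(0.0, 10.0, 1001)              # e.g. an operator-stress variable (assumed)
        pi = np.exp(-0.5 * (x - 4.0) ** 2)            # possibility distribution of the actual value
        mu_fail = np.clip((x - 6.0) / 2.0, 0.0, 1.0)  # fuzzy set "value high enough to cause failure"

        possibility_of_failure = float(np.max(np.minimum(pi, mu_fail)))
        print(f"failure possibility: {possibility_of_failure:.3f}")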

  12. Foundations of complex analysis in non locally convex spaces function theory without convexity condition

    CERN Document Server

    Bayoumi, A

    2003-01-01

    All the existing books in Infinite Dimensional Complex Analysis focus on the problems of locally convex spaces. However, the theory without convexity condition is covered for the first time in this book. This shows that we are really working with a new, important and interesting field. Theory of functions and nonlinear analysis problems are widespread in the mathematical modeling of real world systems in a very broad range of applications. During the past three decades many new results from the author have helped to solve multiextreme problems arising from important situations, non-convex and

  13. Science and information theory

    CERN Document Server

    Brillouin, Léon

    1962-01-01

    A classic source for exploring the connections between information theory and physics, this text is geared toward upper-level undergraduates and graduate students. The author, a giant of 20th-century mathematics, applies the principles of information theory to a variety of issues, including Maxwell's demon, thermodynamics, and measurement problems. 1962 edition.
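
    The central quantity throughout is Shannon's entropy, which Brillouin ties to physical entropy and negentropy; for a discrete source it is computed directly (the distribution below is a made-up example):

        import math

        p = [0.5, 0.25, 0.125, 0.125]                # hypothetical symbol probabilities
        H = -sum(pi * math.log2(pi) for pi in p)     # entropy in bits per symbol
        print(f"H = {H} bits")                       # 1.75 bits for this distribution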

  14. Group theory for chemists fundamental theory and applications

    CERN Document Server

    Molloy, K C

    2010-01-01

    The basics of group theory and its applications to themes such as the analysis of vibrational spectra and molecular orbital theory are essential knowledge for the undergraduate student of inorganic chemistry. The second edition of Group Theory for Chemists uses diagrams and problem-solving to help students test and improve their understanding, including a new section on the application of group theory to electronic spectroscopy.Part one covers the essentials of symmetry and group theory, including symmetry, point groups and representations. Part two deals with the application of group theory t
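
    The workhorse of such vibrational analysis is the reduction formula n_i = (1/h) Σ_R g(R) χ(R) χ_i(R). A small sketch in C2v follows; the reducible character is the standard 3N Cartesian representation of H2O, and the B1/B2 labels depend on the axis convention chosen:

        import numpy as np

        g = np.array([1, 1, 1, 1])            # class orders in C2v: E, C2, sigma_v, sigma_v'
        chi_red = np.array([9, -1, 3, 1])     # reducible character (3N of H2O)
        irreps = {"A1": [1, 1, 1, 1], "A2": [1, 1, -1, -1],
                  "B1": [1, -1, 1, -1], "B2": [1, -1, -1, 1]}

        h = g.sum()
        for name, chi in irreps.items():
            n = int(g @ (chi_red * np.array(chi)) / h)
            print(f"n({name}) = {n}")         # gives 3A1 + A2 + 3B1 + 2B2 here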

  15. Your Personal Analysis Toolkit - An Open Source Solution

    Science.gov (United States)

    Mitchell, T.

    2009-12-01

    Open source software is commonly known for its web browsers, word processors and programming languages. However, there is a vast array of open source software focused on geographic information management and geospatial application building in general. As geo-professionals, having easy access to tools for our jobs is crucial. Open source software provides the opportunity to add a tool to your tool belt and carry it with you for your entire career - with no license fees, a supportive community and the opportunity to test, adopt and upgrade at your own pace. OSGeo is a US registered non-profit representing more than a dozen mature geospatial data management applications and programming resources. Tools cover areas such as desktop GIS, web-based mapping frameworks, metadata cataloging, spatial database analysis, image processing and more. Learn about some of these tools as they apply to AGU members, as well as how you can join OSGeo and its members in getting the job done with powerful open source tools. If you haven't heard of OSSIM, MapServer, OpenLayers, PostGIS, GRASS GIS or the many other projects under our umbrella - then you need to hear this talk. Invest in yourself - use open source!

  16. Java Source Code Analysis for API Migration to Embedded Systems

    Energy Technology Data Exchange (ETDEWEB)

    Winter, Victor [Univ. of Nebraska, Omaha, NE (United States); McCoy, James A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Guerrero, Jonathan [Univ. of Nebraska, Omaha, NE (United States); Reinke, Carl Werner [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Perry, James Thomas [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-02-01

    Embedded systems form an integral part of our technological infrastructure and oftentimes play a complex and critical role within larger systems. From the perspective of reliability, security, and safety, strong arguments can be made favoring the use of Java over C in such systems. In part, this argument is based on the assumption that suitable subsets of Java’s APIs and extension libraries are available to embedded software developers. In practice, a number of Java-based embedded processors do not support the full features of the JVM. For such processors, source code migration is a mechanism by which key abstractions offered by APIs and extension libraries can be made available to embedded software developers. The analysis required for Java source code-level library migration is based on the ability to correctly resolve element references to their corresponding element declarations. A key challenge in this setting is how to perform analysis for incomplete source-code bases (e.g., subsets of libraries) from which types and packages have been omitted. This article formalizes an approach that can be used to extend code bases targeted for migration in such a manner that the threats associated with the analysis of incomplete code bases are eliminated.

  17. A nuclear source term analysis for spacecraft power systems

    International Nuclear Information System (INIS)

    McCulloch, W.H.

    1998-01-01

    All US space missions involving on board nuclear material must be approved by the Office of the President. To be approved the mission and the hardware systems must undergo evaluations of the associated nuclear health and safety risk. One part of these evaluations is the characterization of the source terms, i.e., the estimate of the amount, physical form, and location of nuclear material, which might be released into the environment in the event of credible accidents. This paper presents a brief overview of the source term analysis by the Interagency Nuclear Safety Review Panel for the NASA Cassini Space Mission launched in October 1997. Included is a description of the Energy Interaction Model, an innovative approach to the analysis of potential releases from high velocity impacts resulting from launch aborts and reentries

  18. Mind-wandering, cognition, and performance: a theory-driven meta-analysis of attention regulation.

    Science.gov (United States)

    Randall, Jason G; Oswald, Frederick L; Beier, Margaret E

    2014-11-01

    The current meta-analysis accumulates empirical findings on the phenomenon of mind-wandering, integrating and interpreting findings in light of psychological theories of cognitive resource allocation. Cognitive resource theory emphasizes both individual differences in attentional resources and task demands together to predict variance in task performance. This theory motivated our conceptual and meta-analysis framework by introducing moderators indicative of task demands to predict who is more likely to mind-wander under what conditions, and to predict when mind-wandering and task-related thought are more (or less) predictive of task performance. Predictions were tested via a random-effects meta-analysis of correlations obtained from normal adult samples (k = 88) based on measurement of specified episodes of off-task and/or on-task thought frequency and task performance. Results demonstrated that people with fewer cognitive resources tend to engage in more mind-wandering, whereas those with more cognitive resources are more likely to engage in task-related thought. Addressing predictions of resource theory, we found that greater time-on-task, although not greater task complexity, tended to strengthen the negative relation between cognitive resources and mind-wandering. Additionally, increases in mind-wandering were generally associated with decreases in task performance, whereas increases in task-related thought were associated with increased performance. Further supporting resource theory, the negative relation between mind-wandering and performance was more pronounced for more complex tasks, though not longer tasks. Complementarily, the positive association between task-related thought and performance was stronger for more complex tasks and for longer tasks. We conclude by discussing implications and future research directions for mind-wandering as a construct of interest in psychological research. (PsycINFO Database Record (c) 2014 APA, all rights reserved).
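
    The pooling machinery behind such a meta-analysis is compact enough to sketch (study values below are invented, not the paper's k = 88 correlations): correlations are Fisher-z transformed, a DerSimonian-Laird estimate supplies the between-study variance, and the weighted mean is transformed back:

        import numpy as np

        r = np.array([-0.25, -0.18, -0.30, -0.10])   # hypothetical mind-wandering/performance rs
        n = np.array([120, 85, 200, 60])             # hypothetical sample sizes

        z = np.arctanh(r)                 # Fisher z transform
        v = 1.0 / (n - 3)                 # within-study variance of z
        w = 1.0 / v
        Q = np.sum(w * (z - np.sum(w * z) / np.sum(w)) ** 2)
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (Q - (len(r) - 1)) / c)      # DerSimonian-Laird between-study variance

        w_star = 1.0 / (v + tau2)                    # random-effects weights
        z_bar = np.sum(w_star * z) / np.sum(w_star)
        print(f"pooled r = {np.tanh(z_bar):.3f}")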

  19. Quantum biological information theory

    CERN Document Server

    Djordjevic, Ivan B

    2016-01-01

    This book is a self-contained, tutorial-based introduction to quantum information theory and quantum biology. It serves as a single-source reference to the topic for researchers in bioengineering, communications engineering, electrical engineering, applied mathematics, biology, computer science, and physics. The book provides all the essential principles of the quantum biological information theory required to describe the quantum information transfer from DNA to proteins, the sources of genetic noise and genetic errors as well as their effects. Integrates quantum information and quantum biology concepts; Assumes only knowledge of basic concepts of vector algebra at undergraduate level; Provides a thorough introduction to basic concepts of quantum information processing, quantum information theory, and quantum biology; Includes in-depth discussion of the quantum biological channel modelling, quantum biological channel capacity calculation, quantum models of aging, quantum models of evolution, quantum models o...

  20. A Method for the Analysis of Information Use in Source-Based Writing

    Science.gov (United States)

    Sormunen, Eero; Heinstrom, Jannica; Romu, Leena; Turunen, Risto

    2012-01-01

    Introduction: Past research on source-based writing assignments has hesitated to scrutinize how students actually use information afforded by sources. This paper introduces a method for the analysis of text transformations from sources to texts composed. The method is aimed to serve scholars in building a more detailed understanding of how…

  1. Scalar radiation from a radially infalling source into a Schwarzschild black hole in the framework of quantum field theory

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Leandro A. [Campus Salinopolis, Universidade Federal do Para, Salinopolis, Para (Brazil); Universidade Federal do Para, Faculdade de Fisica, Belem, Para (Brazil); Crispino, Luis C.B. [Universidade Federal do Para, Faculdade de Fisica, Belem, Para (Brazil); Higuchi, Atsushi [University of York, Department of Mathematics, Heslington, York (United Kingdom)

    2018-02-15

    We investigate the radiation to infinity of a massless scalar field from a source falling radially towards a Schwarzschild black hole using the framework of the quantum field theory at tree level. When the source falls from infinity, the monopole radiation is dominant for low initial velocities. Higher multipoles become dominant at high initial velocities. It is found that, as in the electromagnetic and gravitational cases, at high initial velocities the energy spectrum for each multipole with l ≥ 1 is approximately constant up to the fundamental quasinormal frequency and then drops to zero. We also investigate the case where the source falls from rest at a finite distance from the black hole. It is found that the monopole and dipole contributions in this case are dominant. This case needs to be carefully distinguished from the unphysical process where the source abruptly appears at rest and starts falling, which would result in radiation of an infinite amount of energy. We also investigate the radiation of a massless scalar field to the horizon of the black hole, finding some features similar to the gravitational case. (orig.)

  2. Analysis on the inbound tourist source market in Fujian Province

    Science.gov (United States)

    YU, Tong

    2017-06-01

    The paper analyzes the development and structure of inbound tourism in Fujian Province using Excel and conducts a cluster analysis of the inbound tourism market using SPSS 23.0, based on the inbound tourism data of Fujian Province from 2006 to 2015. The results show that the rapid development of inbound tourism in Fujian Province and the diversified inbound tourist source countries indicate a stable inbound tourism market; according to the cluster analysis, the inbound tourist source market in Fujian Province can be divided into four categories, and tourists from the United States, Japan, Malaysia, and Singapore are the key to inbound tourism in Fujian Province.
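
    A minimal stand-in for the clustering step (scikit-learn in place of the paper's SPSS workflow; the arrival figures are invented) groups source countries by their standardized arrival trajectories:

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        arrivals = rng.lognormal(3.0, 1.0, size=(12, 10))   # 12 countries x 10 years (2006-2015)

        X = StandardScaler().fit_transform(arrivals)
        labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
        print(labels)   # four market tiers, mirroring the paper's four categories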

  3. Towards a Structurational Theory of Information Systems: a substantive case analysis

    DEFF Research Database (Denmark)

    Rose, Jeremy; Hackney, R. H

    2003-01-01

    This paper employs the analysis of an interpretive case study within a Regional Train Operating Company (RTOC) to arrive at theoretical understandings of Information Systems (IS). Giddens’ ‘structuration theory’ is developed which offers an account of structure and agency; social practices develo...

  4. What can organisational theory offer knowledge translation in healthcare? A thematic and lexical analysis.

    Science.gov (United States)

    Dadich, Ann; Doloswala, Navin

    2018-05-10

    Despite the relative abundance of frameworks and models to guide implementation science, the explicit use of theory is limited. Bringing together two seemingly disparate fields of research, this article asks, what can organisational theory offer implementation science? This is examined by applying a theoretical lens that incorporates agency, institutional, and situated change theories to understand the implementation of healthcare knowledge into practice. Interviews were conducted with 20 general practitioners (GPs) before and after using a resource to facilitate evidence-based sexual healthcare. Research material was analysed using two approaches - researcher-driven thematic coding and lexical analysis, which was relatively less researcher-driven. The theoretical lens elucidated the complex pathways of knowledge translation. More specifically, agency theory revealed tensions between the GP as agent and their organisations and patients as principals. Institutional theory highlighted the importance of GP-embeddedness within their chosen specialty of general practice; their medical profession; and the practice in which they worked. Situated change theory exposed the role of localised adaptations over time - a metamorphosis. This study has theoretical, methodological, and practical implications. Theoretically, it is the first to examine knowledge translation using a lens premised on agency, institutional, and situated change theories. Methodologically, the study highlights the complementary value of researcher-driven and researcher-guided analysis of qualitative research material. Practically, this study signposts opportunities to facilitate knowledge translation - more specifically, it suggests that efforts to shape clinician practices should accommodate the interrelated influence of the agent and the institution, and recognise that change can be ever so subtle.

  5. What kind of theory is music theory? : Epistemological exercises in music theory and analysis

    OpenAIRE

    2008-01-01

    Music theory has long aligned itself with the sciences - particularly with physics, mathematics, and experimental psychology - seeking to cloak itself in the mantle of their epistemological legitimacy. This affinity, which was foreshadowed in music's inclusion in the medieval quadrivium alongside geometry, astronomy, and arithmetic, is evident throughout the history of music theory from the scientific revolution onward. Yet, as eager as music theorists have been to claim the epistemological p...

  6. Fringe pattern analysis for optical metrology theory, algorithms, and applications

    CERN Document Server

    Servin, Manuel; Padilla, Moises

    2014-01-01

    The main objective of this book is to present the basic theoretical principles and practical applications for the classical interferometric techniques and the most advanced methods in the field of modern fringe pattern analysis applied to optical metrology. A major novelty of this work is the presentation of a unified theoretical framework based on the Fourier description of phase shifting interferometry using the Frequency Transfer Function (FTF) along with the theory of Stochastic Process for the straightforward analysis and synthesis of phase shifting algorithms with desired properties such

  7. Transport perturbation theory in nuclear reactor analysis

    International Nuclear Information System (INIS)

    Nishigori, Takeo; Takeda, Toshikazu; Selvi, S.

    1985-01-01

    Perturbation theory is formulated on the basis of transport theory to obtain a formula for the reactivity changes due to possible variations of cross sections. Useful applications to cell homogenization are presented for the whole core calculation in transport and in diffusion theories. (author)

  8. Four-point correlation function of stress-energy tensors in N=4 superconformal theories

    CERN Document Server

    Korchemsky, G P

    2015-01-01

    We derive the explicit expression for the four-point correlation function of stress-energy tensors in four-dimensional N=4 superconformal theory. We show that it has a remarkably simple and suggestive form allowing us to predict a large class of four-point correlation functions involving the stress-energy tensor and other conserved currents. We then apply the obtained results on the correlation functions to computing the energy-energy correlations, which measure the flow of energy in the final states created from the vacuum by a source. We demonstrate that they are given by a universal function independent of the choice of the source. Our analysis relies only on N=4 superconformal symmetry and does not use the dynamics of the theory.

  9. Proposed Sources of Coaching Efficacy: A Meta-Analysis.

    Science.gov (United States)

    Myers, Nicholas D; Park, Sung Eun; Ahn, Soyeon; Lee, Seungmin; Sullivan, Philip J; Feltz, Deborah L

    2017-08-01

    Coaching efficacy refers to the extent to which a coach believes that he or she has the capacity to affect the learning and performance of his or her athletes. The purpose of the current study was to empirically synthesize findings across the extant literature to estimate relationships between the proposed sources of coaching efficacy and each of the dimensions of coaching efficacy. A literature search yielded 20 studies and 278 effect size estimates that met the inclusion criteria. The overall relationship between the proposed sources of coaching efficacy and each dimension of coaching efficacy was positive and ranged from small to medium in size. Coach gender and level coached moderated the overall relationship between the proposed sources of coaching efficacy and each of the dimensions of coaching efficacy. Results from this meta-analysis provided some evidence for both the utility of, and possible revisions to, the conceptual model of coaching efficacy.

  10. Problems in the fingerprints based polycyclic aromatic hydrocarbons source apportionment analysis and a practical solution

    International Nuclear Information System (INIS)

    Zou, Yonghong; Wang, Lixia; Christensen, Erik R.

    2015-01-01

    This work intended to explain the challenges of fingerprint-based source apportionment methods for polycyclic aromatic hydrocarbons (PAH) in the aquatic environment, and to illustrate a practical and robust solution. The PAH data detected in sediment cores from the Illinois River provide the basis of this study. Principal component analysis (PCA) separates PAH compounds into two groups reflecting their possible airborne transport patterns, but it is not able to suggest specific sources. Not all positive matrix factorization (PMF)-determined sources are distinguishable, due to the variability of source fingerprints; however, they constitute useful suggestions for inputs to a Bayesian chemical mass balance (CMB) analysis. The Bayesian CMB analysis takes into account the measurement errors as well as the variations of source fingerprints, and provides a credible source apportionment. Major PAH sources for Illinois River sediments are traffic (35%), coke oven (24%), coal combustion (18%), and wood combustion (14%). - Highlights: • Fingerprint variability poses challenges in PAH source apportionment analysis. • PCA can be used to group compounds or cluster measurements. • PMF requires results validation but is useful for source suggestion. • Bayesian CMB provides a practical and credible solution. - A Bayesian CMB model combined with PMF is a practical and credible fingerprint-based PAH source apportionment method
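
    A PMF-style decomposition can be sketched with non-negative matrix factorization as a stand-in (invented data and factor count; the paper's Bayesian CMB step, which propagates fingerprint uncertainty, is not reproduced here): the concentration matrix factors into per-sample source contributions and non-negative source fingerprints:

        import numpy as np
        from sklearn.decomposition import NMF

        rng = np.random.default_rng(0)
        X = rng.gamma(2.0, 1.0, size=(30, 16))   # 30 core slices x 16 PAH compounds (synthetic)

        model = NMF(n_components=4, init="nndsvda", max_iter=1000, random_state=0)
        G = model.fit_transform(X)           # source contributions per sample
        F = model.components_                # source fingerprints (profiles)
        share = G.sum(axis=0) / G.sum()      # overall apportionment across the 4 factors
        print(np.round(share, 2))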

  11. Communicating Science to Impact Learning? A Phenomenological Inquiry into 4th and 5th Graders' Perceptions of Science Information Sources

    Science.gov (United States)

    Gelmez Burakgazi, Sevinc; Yildirim, Ali; Weeth Feinstein, Noah

    2016-01-01

    Rooted in science education and science communication studies, this study examines 4th and 5th grade students' perceptions of science information sources (SIS) and their use in communicating science to students. It combines situated learning theory with uses and gratifications theory in a qualitative phenomenological analysis. Data were gathered…

  12. Sustainability in Open Source Software Commons: Lessons Learned from an Empirical Study of SourceForge Projects

    Directory of Open Access Journals (Sweden)

    Charles M. Schweik

    2013-01-01

    In this article, we summarize a five-year US National Science Foundation funded study designed to investigate the factors that lead some open source projects to ongoing collaborative success while many others become abandoned. Our primary interest was to conduct a study that was closely representative of the population of open source software projects in the world, rather than focus on the more often studied, high-profile successful cases. After building a large database of projects (n=174,333) and implementing a major survey of open source developers (n=1403), we were able to conduct statistical analyses to investigate over forty theoretically based testable hypotheses. Our data firmly support what we call the conventional theory of open source software, showing that projects start small and, in successful cases, grow slightly larger in terms of team size. We describe the “virtuous circle” supporting the conventional wisdom of open source collaboration that emerges from this analysis, and we discuss two other interesting findings related to developer motivations and how team members find each other. Each of these findings is related to the sustainability of these projects.

  13. Conceptual and critical analysis of the Implicit Leadership Theory

    OpenAIRE

    Hernández Avilés, Omar David; García Ramos, Tania

    2013-01-01

    The purpose of this essay is to present a conceptual and critical analysis of the Implicit Leadership Theory (ILT). The objectives are: 1) explaining the main concepts of the ILT; 2) explaining the main processes of the ILT; 3) identifying constructivist assumptions in the ILT; 4) identifying constructionist assumptions in the ILT, and 5) analyzing critically the theoretical assumptions of the ILT. In analyzing constructivist and constructionist assumptions in the ILT, the constructivist leadersh...

  14. Comparative analysis of methods and sources of financing of the transport organizations activity

    Science.gov (United States)

    Gorshkov, Roman

    2017-10-01

    The article analyzes methods of financing transport organizations under conditions of limited investment resources. A comparative analysis of these methods is carried out, and a classification of investments and of the methods and sources of financial support for projects currently being implemented is presented. In order to select the optimal sources of financing for the projects, various methods of financial management and financial support for the activities of the transport organization were analyzed from the perspective of their advantages and limitations. The result of the study is a set of recommendations on the selection of optimal sources and methods of financing for transport organizations.

  15. Buckling analysis of micro- and nano-rods/tubes based on nonlocal Timoshenko beam theory

    International Nuclear Information System (INIS)

    Wang, C M; Zhang, Y Y; Ramesh, Sai Sudha; Kitipornchai, S

    2006-01-01

    This paper is concerned with the elastic buckling analysis of micro- and nano-rods/tubes based on Eringen's nonlocal elasticity theory and the Timoshenko beam theory. In the former theory, the small scale effect is taken into consideration while the effect of transverse shear deformation is accounted for in the latter theory. The governing equations and the boundary conditions are derived using the principle of virtual work. Explicit expressions for the critical buckling loads are derived for axially loaded rods/tubes with various end conditions. These expressions account for a better representation of the buckling behaviour of micro- and nano-rods/tubes where small scale effect and transverse shear deformation effect are significant. By comparing it with the classical beam theories, the sensitivity of the small scale effect on the buckling loads may be observed
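
    For orientation, a hedged sketch of the form such closed-form results take for the simplest (simply supported) case; these are standard textbook-style expressions, not necessarily the paper's exact ones. The local Euler load is reduced both by the nonlocal parameter and by shear flexibility:

```latex
% Illustrative forms only: simply supported column.
P_E = \frac{\pi^2 E I}{L^2}, \qquad
P_{cr}^{\,\text{nonlocal}} = \frac{P_E}{1 + (e_0 a)^2 \pi^2 / L^2}, \qquad
P_{cr}^{\,\text{Timoshenko}} = \frac{P_E}{1 + P_E/(\kappa G A)}
```

    Combining both corrections compounds the two reductions, which is why buckling loads predicted with the nonlocal Timoshenko theory fall below the classical Euler values.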

  16. Systematic uncertainties in direct reaction theories

    International Nuclear Information System (INIS)

    Lovell, A E; Nunes, F M

    2015-01-01

    Nuclear reactions are common probes to study nuclei and in particular, nuclei at the limits of stability. The data from reaction measurements depend strongly on theory for a reliable interpretation. Even when using state-of-the-art reaction theories, there are a number of sources of systematic uncertainties. These uncertainties are often unquantified or estimated in a very crude manner. It is clear that for theory to be useful, a quantitative understanding of the uncertainties is critical. Here, we discuss major sources of uncertainties in a variety of reaction theories used to analyze (d,p) nuclear reactions in the energy range E_d = 10–20 MeV, and we provide a critical view on how these have been handled in the past and how estimates can be improved. (paper)

  17. All-Source Information Acquisition and Analysis in the IAEA Department of Safeguards

    International Nuclear Information System (INIS)

    Ferguson, Matthew; Norman, Claude

    2010-01-01

    All-source information analysis enables proactive implementation of in-field verification activities, supports the State Evaluation process, and is essential to the IAEA's strengthened safeguards system. Information sources include State-declared nuclear material accounting and facility design information; voluntarily supplied information such as nuclear procurement data; commercial satellite imagery; open source information and information/results from design information verifications (DIVs), inspections and complementary accesses (CAs). The analysis of disparate information sources directly supports inspections, design information verifications and complementary access, and enables both more reliable cross-examination for consistency and completeness as well as in-depth investigation of possible safeguards compliance issues. Comparison of State-declared information against information on illicit nuclear procurement networks, possible trafficking in nuclear materials, and scientific and technical information on nuclear-related research and development programmes provides complementary measures for monitoring nuclear developments and increases Agency capabilities to detect possible undeclared nuclear activities. Likewise, expert analysis of commercial satellite imagery plays a critical role in monitoring un-safeguarded sites and facilities. In sum, the combination of these measures provides early identification of possible undeclared nuclear material or activities, thus enhancing deterrence, supporting a safeguards system that is fully information driven, and increasing confidence in safeguards conclusions. By increasing confidence that nuclear materials and technologies in States under safeguards are used solely for peaceful purposes, information-driven safeguards will strengthen the nuclear non-proliferation system. Key assets for Agency collection, processing, expert analysis, and integration of these information sources are the Information Collection and Analysis

  18. An introduction to Item Response Theory and Rasch Analysis of the Eating Assessment Tool (EAT-10).

    Science.gov (United States)

    Kean, Jacob; Brodke, Darrel S; Biber, Joshua; Gross, Paul

    2018-03-01

    Item response theory has its origins in educational measurement and is now commonly applied in health-related measurement of latent traits, such as function and symptoms. This application is due in large part to gains in the precision of measurement attributable to item response theory and corresponding decreases in response burden, study costs, and study duration. The purpose of this paper is twofold: introduce basic concepts of item response theory and demonstrate this analytic approach in a worked example, a Rasch model (1PL) analysis of the Eating Assessment Tool (EAT-10), a commonly used measure for oropharyngeal dysphagia. The results of the analysis were largely concordant with previous studies of the EAT-10 and illustrate for brain impairment clinicians and researchers how IRT analysis can yield greater precision of measurement.
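
    As a companion to the abstract, a minimal, self-contained sketch of a Rasch (1PL) fit by joint maximum likelihood on simulated binary data; this is illustrative only and is not the EAT-10 study's actual estimation pipeline (which would typically use dedicated IRT software).

```python
# Hypothetical Rasch (1PL) sketch: simulate responses, then jointly estimate
# person abilities and item difficulties by maximum likelihood.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n_persons, n_items = 200, 10
theta_true = rng.normal(0, 1, n_persons)            # person abilities
b_true = np.linspace(-1.5, 1.5, n_items)            # item difficulties
p = 1 / (1 + np.exp(-(theta_true[:, None] - b_true[None, :])))
X = (rng.random(p.shape) < p).astype(float)         # simulated 0/1 responses

def neg_loglik(params):
    theta, b = params[:n_persons], params[n_persons:]
    logit = theta[:, None] - b[None, :]
    # log P(X=1) = logit - log(1 + e^logit); log P(X=0) = -log(1 + e^logit)
    return -(X * logit - np.log1p(np.exp(logit))).sum()

res = minimize(neg_loglik, np.zeros(n_persons + n_items), method="L-BFGS-B")
b_hat = res.x[n_persons:]
b_hat -= b_hat.mean()        # fix the location indeterminacy by centering
print("estimated item difficulties:", np.round(b_hat, 2))
```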

  19. Modular Open-Source Software for Item Factor Analysis

    Science.gov (United States)

    Pritikin, Joshua N.; Hunter, Micheal D.; Boker, Steven M.

    2015-01-01

    This article introduces an item factor analysis (IFA) module for "OpenMx," a free, open-source, and modular statistical modeling package that runs within the R programming environment on GNU/Linux, Mac OS X, and Microsoft Windows. The IFA module offers a novel model specification language that is well suited to programmatic generation…

  20. Transfer of learning between 2D and 3D sources during infancy: Informing theory and practice.

    Science.gov (United States)

    Barr, Rachel

    2010-06-01

    The ability to transfer learning across contexts is an adaptive skill that develops rapidly during early childhood. Learning from television is a specific instance of transfer of learning between a 2-Dimensional (2D) representation and a 3-Dimensional (3D) object. Understanding the conditions under which young children might accomplish this particular kind of transfer is important because by 2 years of age 90% of US children are viewing television on a daily basis. Recent research shows that children can imitate actions presented on television using the corresponding real-world objects, but this same research also shows that children learn less from television than they do from live demonstrations until they are at least 3 years old, a phenomenon termed the video deficit effect. At present, there is no coherent theory to account for the video deficit effect; how learning is disrupted by this change in context is poorly understood. The aims of the present review are (1) to review the conditions under which children transfer learning between 2D images and 3D objects during early childhood, and (2) to integrate developmental theories of memory processing into the transfer of learning from media literature using Hayne's (2004) developmental representational flexibility account. The review concludes that studies on the transfer of learning between 2D and 3D sources have important theoretical implications for general theories of cognitive development, in particular the development of a flexible representational system, as well as policy implications for early education regarding the potential use and limitations of media as effective teaching tools during early childhood.

  1. Quantum theory at the crossroads reconsidering the 1927 Solvay conference

    CERN Document Server

    Bacciagaluppi, Guido

    2009-01-01

    We reconsider the crucial 1927 Solvay conference in the context of current research in the foundations of quantum theory. Contrary to folklore, the interpretation question was not settled at this conference and no consensus was reached; instead, a range of sharply conflicting views were presented and extensively discussed. Today, there is no longer an established or dominant interpretation of quantum theory, so it is important to re-evaluate the historical sources and keep the interpretation debate open. In this spirit, we provide a complete English translation of the original proceedings (lectures and discussions), and give background essays on the three main interpretations presented: de Broglie's pilot-wave theory, Born and Heisenberg's quantum mechanics, and Schroedinger's wave mechanics. We provide an extensive analysis of the lectures and discussions that took place, in the light of current debates about the meaning of quantum theory. The proceedings contain much unexpected material, including extensive...

  2. Quantification of margins and mixed uncertainties using evidence theory and stochastic expansions

    International Nuclear Information System (INIS)

    Shah, Harsheel; Hosder, Serhat; Winter, Tyler

    2015-01-01

    The objective of this paper is to implement Dempster–Shafer Theory of Evidence (DSTE) in the presence of mixed (aleatory and multiple sources of epistemic) uncertainty to the reliability and performance assessment of complex engineering systems through the use of quantification of margins and uncertainties (QMU) methodology. This study focuses on quantifying the simulation uncertainties, both in the design condition and the performance boundaries along with the determination of margins. To address the possibility of multiple sources and intervals for epistemic uncertainty characterization, DSTE is used for uncertainty quantification. An approach to incorporate aleatory uncertainty in Dempster–Shafer structures is presented by discretizing the aleatory variable distributions into sets of intervals. In view of excessive computational costs for large scale applications and repetitive simulations needed for DSTE analysis, a stochastic response surface based on point-collocation non-intrusive polynomial chaos (NIPC) has been implemented as the surrogate for the model response. The technique is demonstrated on a model problem with non-linear analytical functions representing the outputs and performance boundaries of two coupled systems. Finally, the QMU approach is demonstrated on a multi-disciplinary analysis of a high speed civil transport (HSCT). - Highlights: • Quantification of margins and uncertainties (QMU) methodology with evidence theory. • Treatment of both inherent and epistemic uncertainties within evidence theory. • Stochastic expansions for representation of performance metrics and boundaries. • Demonstration of QMU on an analytical problem. • QMU analysis applied to an aerospace system (high speed civil transport)
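
    To make the Dempster-Shafer bookkeeping concrete, a toy sketch with assumed focal elements and masses (unrelated to the paper's HSCT application): belief and plausibility that a performance metric stays below a requirement.

```python
# Minimal illustrative Dempster-Shafer computation: focal elements are
# intervals on a performance metric with basic probability masses.
focal = [((0.0, 0.8), 0.5),   # (interval, mass) -- assumed numbers
         ((0.6, 1.1), 0.3),
         ((0.9, 1.4), 0.2)]
limit = 1.0                   # requirement: metric < limit

bel = sum(m for (lo, hi), m in focal if hi <= limit)   # interval entirely inside A
pl  = sum(m for (lo, hi), m in focal if lo <  limit)   # interval intersects A
print(f"Bel = {bel:.2f}, Pl = {pl:.2f}")               # Bel = 0.50, Pl = 1.00
```

    The gap between Bel and Pl is exactly the epistemic slack that interval-valued evidence leaves open; QMU margins are then assessed against both bounds.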

  3. Methodology of analysis sustainable development of Ukraine by using the theory fuzzy logic

    Directory of Open Access Journals (Sweden)

    2016-02-01

    Full Text Available The objective of the article is to analyse theoretical and methodological aspects of the assessment of sustainable development in times of crisis. A methodical approach to the analysis of the sustainable development of a territory, taking into account the assessment of its level of economic security, is proposed. The need for a complex methodical approach that accounts for indeterminacy and multiple criteria in the task of ensuring economic security, based on fuzzy logic theory (the theory of fuzzy sets), is demonstrated. The results of using the fuzzy sets method to assess the dynamics of sustainable development in Ukraine during the years 2002-2012 are presented.
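
    As an illustration of the fuzzy-set machinery the article invokes, a minimal sketch with assumed membership functions; the cut points and the indicator value below are hypothetical, not taken from the study.

```python
# Triangular membership function of the kind used in fuzzy assessments of
# development/security levels; parameters are assumed for illustration.
def triangular(x, a, b, c):
    """Membership rising on [a, b] and falling on [b, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Degree to which an indicator value of 0.62 is "medium" vs "high"
print(triangular(0.62, 0.3, 0.5, 0.7))  # medium: 0.4
print(triangular(0.62, 0.5, 0.8, 1.0))  # high:   0.4
```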

  4. Development of a theory of implementation and integration: Normalization Process Theory

    Directory of Open Access Journals (Sweden)

    May Carl R

    2009-05-01

    Full Text Available Abstract Background Theories are important tools in the social and natural sciences. The methods by which they are derived are rarely described and discussed. Normalization Process Theory explains how new technologies, ways of acting, and ways of working become routinely embedded in everyday practice, and has applications in the study of implementation processes. This paper describes the process by which it was built. Methods Between 1998 and 2008, we developed a theory. We derived a set of empirical generalizations from analysis of data collected in qualitative studies of healthcare work and organization. We developed an applied theoretical model through analysis of empirical generalizations. Finally, we built a formal theory through a process of extension and implication analysis of the applied theoretical model. Results Each phase of theory development showed that the constructs of the theory did not conflict with each other, had explanatory power, and possessed sufficient robustness for formal testing. As the theory developed, its scope expanded from a set of observed regularities in data with procedural explanations, to an applied theoretical model, to a formal middle-range theory. Conclusion Normalization Process Theory has been developed through procedures that were properly sceptical and critical, and which were opened to review at each stage of development. The theory has been shown to merit formal testing.

  5. Uncertainty theory

    CERN Document Server

    Liu, Baoding

    2015-01-01

    When no samples are available to estimate a probability distribution, we have to invite some domain experts to evaluate the belief degree that each event will happen. Perhaps some people think that the belief degree should be modeled by subjective probability or fuzzy set theory. However, it is usually inappropriate because both of them may lead to counterintuitive results in this case. In order to rationally deal with belief degrees, uncertainty theory was founded in 2007 and subsequently studied by many researchers. Nowadays, uncertainty theory has become a branch of axiomatic mathematics for modeling belief degrees. This is an introductory textbook on uncertainty theory, uncertain programming, uncertain statistics, uncertain risk analysis, uncertain reliability analysis, uncertain set, uncertain logic, uncertain inference, uncertain process, uncertain calculus, and uncertain differential equation. This textbook also shows applications of uncertainty theory to scheduling, logistics, networks, data mining, c...

  6. Statistical analysis of activation and reaction energies with quasi-variational coupled-cluster theory

    Science.gov (United States)

    Black, Joshua A.; Knowles, Peter J.

    2018-06-01

    The performance of quasi-variational coupled-cluster (QV) theory applied to the calculation of activation and reaction energies has been investigated. A statistical analysis of results obtained for six different sets of reactions has been carried out, and the results have been compared to those from standard single-reference methods. In general, the QV methods lead to increased activation energies and larger absolute reaction energies compared to those obtained with traditional coupled-cluster theory.

  7. Spectrographic analysis

    International Nuclear Information System (INIS)

    Quinn, C.A.

    1983-01-01

    The article deals with spectrographic analysis and the analytical methods based on it. The theory of spectrographic analysis is discussed, as well as the layout of a spectrometer system. The infrared absorption spectrum of a compound is probably its most unique property. The absorption of infrared radiation depends on increasing the energy of vibration and rotation associated with a covalent bond. The infrared region is intrinsically low in energy; thus the design of infrared spectrometers is always directed toward maximising energy throughput. The article also considers atomic absorption: flame atomizers, non-flame atomizers and the source of radiation. Under the section on emission spectroscopy, non-electrical energy sources, electrical energy sources and electrical flames are discussed. Digital computers form a part of the development of spectrographic instrumentation.
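
    The quantitative basis for the absorption measurements sketched above is the standard Beer-Lambert law (a textbook relation, not specific to this article):

```latex
A \;=\; -\log_{10}\frac{I}{I_0} \;=\; \varepsilon\, l\, c
```

    where I/I_0 is the transmittance, \varepsilon the molar absorptivity, l the path length and c the concentration; the linearity of absorbance in concentration is what makes the spectrum usable for quantitative analysis.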

  8. Qualitative analysis of cosmological models in Brans-Dicke theory, solutions from non-minimal coupling and viscous universe

    International Nuclear Information System (INIS)

    Romero Filho, C.A.

    1988-01-01

    Using dynamical system theory we investigate homogeneous and isotropic models in Brans-Dicke theory for perfect fluids with general equation of state and arbitrary ω. Phase diagrams are drawn on the Poincare sphere which permits a qualitative analysis of the models. Based on this analysis we construct a method for generating classes of solutions in Brans-Dicke theory. The same technique is used for studying models arising from non-minimal coupling of electromagnetism with gravity. In addition, viscous fluids are considered and non-singular solutions with bulk viscosity are found. (author)

  9. T.I.Tech./K.E.S. Conference on Nonlinear and Convex Analysis in Economic Theory

    CERN Document Server

    Takahashi, Wataru

    1995-01-01

    The papers collected in this volume are contributions to the T.I.Tech./K.E.S. Conference on Nonlinear and Convex Analysis in Economic Theory, which was held at Keio University, July 2-4, 1993. The conference was organized by the Tokyo Institute of Technology (T.I.Tech.) and the Keio Economic Society (K.E.S.), and supported by Nihon Keizai Shimbun Inc. A lot of economic problems can be formulated as constrained optimizations and equilibrations of their solutions. Nonlinear-convex analysis has been supplying economists with indispensable mathematical machinery for these problems arising in economic theory. Conversely, mathematicians working in this discipline of analysis have been stimulated by various mathematical difficulties raised by economic theories. Although our special emphasis was laid upon "nonlinearity" and "convexity" in relation to economic theories, we also incorporated stochastic aspects of financial economics in our project, taking account of the remarkable rapid growth of this dis...

  10. Critical Analysis on Open Source LMSs Using FCA

    Science.gov (United States)

    Sumangali, K.; Kumar, Ch. Aswani

    2013-01-01

    The objective of this paper is to apply Formal Concept Analysis (FCA) to identify the best open source Learning Management System (LMS) for an E-learning environment. FCA is a mathematical framework that represents knowledge derived from a formal context. In constructing the formal context, LMSs are treated as objects and their features as…
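
    A minimal sketch of the FCA step on a toy formal context (the LMS names, features and incidence below are assumed for illustration; the paper's context is larger): enumerate the formal concepts, i.e., the closed (extent, intent) pairs.

```python
# Brute-force formal concept enumeration over a small object/attribute table.
from itertools import combinations

context = {
    "Moodle": {"quizzes", "forums", "scorm"},
    "Sakai":  {"forums", "scorm"},
    "ATutor": {"quizzes", "forums"},
}
objects = set(context)
all_attrs = set().union(*context.values())

def intent(objs):
    """Attributes shared by every object in objs."""
    return set.intersection(*(context[o] for o in objs)) if objs else set(all_attrs)

def extent(attrs):
    """Objects possessing every attribute in attrs."""
    return {o for o in objects if attrs <= context[o]}

# A pair (E, intent(E)) is a formal concept when extent(intent(E)) == E.
concepts = set()
for r in range(len(objects) + 1):
    for combo in combinations(sorted(objects), r):
        E = set(combo)
        I = intent(E)
        if extent(I) == E:
            concepts.add((frozenset(E), frozenset(I)))

for E, I in sorted(concepts, key=lambda c: len(c[0])):
    print(sorted(E), "<->", sorted(I))
```

    Ordering these concepts by extent inclusion yields the concept lattice on which comparisons such as "best LMS for a feature set" can be read off.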

  11. Obsidian sources characterized by neutron-activation analysis.

    Science.gov (United States)

    Gordus, A A; Wright, G A; Griffin, J B

    1968-07-26

    Concentrations of elements such as manganese, scandium, lanthanum, rubidium, samarium, barium, and zirconium in obsidian samples from different flows show ranges of 1000 percent or more, whereas the variation in element content in obsidian samples from a single flow appears to be less than 40 percent. Neutron-activation analysis of these elements, as well as of sodium and iron, provides a means of identifying the geologic source of an archeological artifact of obsidian.

  12. Analysis and Optimal Condition of the Rear-Sound-Aided Control Source in Active Noise Control

    Directory of Open Access Journals (Sweden)

    Karel Kreuter

    2011-01-01

    Full Text Available An active noise control scenario for simple ducts is considered. The previously suggested technique of using a single loudspeaker and its rear sound to cancel the upstream sound is further examined and compared to the bidirectional solution in order to give theoretical proof of its advantage. Firstly, a model with a new approach for taking damping effects into account is derived, based on electrical transmission line theory. By comparison with the old model, the new approach is validated, and the differences that occur are discussed. Moreover, a numerical application with the consideration of damping is implemented for confirmation. The influence of the rear sound strength on the feedback-path system is investigated, and the optimal condition is determined. Finally, it is proven that the proposed source has the advantage of an extended phase lag and a time delay in the feedback-path system, by both frequency-response analysis and numerical calculation of the time response.

  13. Total source charge and charge screening in Yang-Mills theories

    International Nuclear Information System (INIS)

    Campbell, W.B.; Norton, R.E.

    1991-01-01

    New gauge-invariant definitions for the total charge on a static Yang-Mills source are suggested which we argue are better suited for determining when true color screening has occurred. In particular, these new definitions imply that the Abelian Coulomb solution for a simple ''electric'' dipole source made up of two opposite point charges has zero total source charge and therefore no color screening. With the definition of total source charge previously suggested by other authors, such a source would have a total source charge of 2q and therefore a screening charge in the field of -2q, where q is the magnitude of the charge of either point charge. Our definitions for more general solutions are not unique because of the path dependence of the parallel transport of charges. Suggestions for removing this ambiguity are offered, but it is not known if a unique, physically meaningful definition of total source charge in fact exists

  14. Organizational Theories and Analysis: A Feminist Perspective

    Science.gov (United States)

    Irefin, Peace; Ifah, S. S.; Bwala, M. H.

    2012-06-01

    This paper is a critique of organization theories and their failure to come to terms with the fact of the reproduction of labour power within a particular form of the division of labour. It examines feminist theory and its aim of understanding the nature of inequality, focusing on gender, power relations and sexuality. Part of the task of feminists, which organizational theories have neglected, is to offer an account of how the different treatments of the sexes operate in our culture. The paper concludes that gender has been completely neglected within organizational theory, resulting in a rhetorical reproduction of males as the norm and women as others. It is recommended that only a radical form of organization theory can account for the situation of women in organisational settings.

  15. Pragmatism and practice theory

    DEFF Research Database (Denmark)

    Buch, Anders; Elkjær, Bente

    Proponents of the ‘practice turn’ in the social sciences rarely mention American pragmatism as a source of inspiration or refer to pragmatist philosophy. This strikes us as not only odd, but also a disadvantage, since the pragmatist legacy has much to offer practice theory in the study of organizations. In this paper we want to spell out the theoretical similarities and divergences between practice theory and pragmatism to consider whether the two traditions can find common ground when gazing upon organization studies. We suggest that pragmatism should be included in the ‘tool-kit’ of practice...

  16. Analysis of 3-panel and 4-panel microscale ionization sources

    International Nuclear Information System (INIS)

    Natarajan, Srividya; Parker, Charles B.; Glass, Jeffrey T.; Piascik, Jeffrey R.; Gilchrist, Kristin H.; Stoner, Brian R.

    2010-01-01

    Two designs of a microscale electron ionization (EI) source are analyzed herein: a 3-panel design and a 4-panel design. Devices were fabricated using microelectromechanical systems technology. Field emission from carbon nanotubes provided the electrons for the EI source. Ion currents were measured for helium, nitrogen, and xenon at pressures ranging from 10^-4 to 0.1 Torr. A comparison of the performance of both designs is presented. The 4-panel microion source showed a 10x improvement in performance compared to the 3-panel device. An analysis of the various factors affecting the performance of the microion sources is also presented. SIMION, an electron and ion optics software package, was coupled with experimental measurements to analyze the ion current results. The electron current contributing to ionization and the ion collection efficiency are believed to be the primary factors responsible for the higher efficiency of the 4-panel microion source. Other improvements in device design that could lead to higher ion source efficiency in the future are also discussed. These microscale ion sources are expected to find application as stand-alone ion sources as well as in miniature mass spectrometers.

  17. Prospect Theory and the Risks Involved in Decision-Making: Content Analysis in ProQuest Articles

    Directory of Open Access Journals (Sweden)

    Sady Darcy da Silva-Junior

    2016-04-01

    Full Text Available The objective of this study is to perform a content analysis of articles from a reliable database dealing with prospect theory and the risks involved in the decision-making process, evaluating criteria for the theoretical and methodological approaches that allow a joint and comparative analysis. A search of the ProQuest database was performed, which yielded 15 articles; these were submitted to a content analysis process based on the evaluation of nine factors identified by the researchers. Among the results, we highlight a critical attitude toward prospect theory, in contrast to assertions of its capacity to represent real situations and its applicability in various contexts.

  18. Dosimetric analysis of radiation sources for use dermatological lesions

    International Nuclear Information System (INIS)

    Tada, Ariane

    2010-01-01

    Skin lesions undergoing therapy with radiation sources may have different patterns of malignancy. The malignant lesions, or cancers, most commonly found in radiotherapy services are carcinomas. Radiation therapy of skin lesions is performed with low-penetration beams of orthovoltage X-rays, electron beams, and radioactive sources (192Ir, 198Au, and 90Sr) arranged on a surface mold or in a metal applicator. This study aims to analyze the therapeutic radiation dose profiles produced by radiation sources used in radiotherapy procedures for skin lesions. Experimental measurements for the dosimetric analysis of the radiation sources were compared with calculations obtained from a computational system based on the Monte Carlo method. The computational results obtained with the MCNP4C code showed good agreement with the experimental measurements, and both were physically consistent, as expected. Such comparisons between experimental measurements and MCNP4C calculations have been used to validate the calculations obtained by the MCNP code and to provide a reliable medical application for each clinical case. (author)

  19. Relevance analysis and short-term prediction of PM2.5 concentrations in Beijing based on multi-source data

    Science.gov (United States)

    Ni, X. Y.; Huang, H.; Du, W. P.

    2017-02-01

    The PM2.5 problem is proving to be a major public crisis and is of great public concern, requiring an urgent response. Information about, and prediction of, PM2.5 from the perspective of atmospheric dynamics theory is still limited due to the complexity of the formation and development of PM2.5. In this paper, we attempt to carry out relevance analysis and short-term prediction of PM2.5 concentrations in Beijing, China, using multi-source data mining. A correlation analysis model relating PM2.5 to physical data (meteorological data, including regional average rainfall, daily mean temperature, average relative humidity, average wind speed and maximum wind speed, and other pollutant concentration data, including CO, NO2, SO2 and PM10) and social media data (microblog data) is proposed, based on multivariate statistical analysis. The study found that, among these factors, the average wind speed, the concentrations of CO, NO2 and PM10, and the daily number of microblog entries with the key words 'Beijing; Air pollution' show high mathematical correlation with PM2.5 concentrations. The correlation analysis was further studied using a machine learning model, the Back Propagation Neural Network (BPNN), which was found to perform better in correlation mining. Finally, an Autoregressive Integrated Moving Average (ARIMA) time series model was applied to explore short-term prediction of PM2.5. The predicted results were in good agreement with the observed data. This study is useful for realizing real-time monitoring, analysis and early warning of PM2.5, and it also helps to broaden the application of big data and multi-source data mining methods.
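
    A hedged sketch of the final (ARIMA) step using statsmodels; the file name and the ARIMA order are assumptions for illustration, not the paper's exact configuration.

```python
# Short-term PM2.5 forecasting with an ARIMA time-series model.
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# pm25: a daily mean concentration series indexed by date (hypothetical file).
pm25 = pd.read_csv("pm25_beijing_daily.csv", index_col=0, parse_dates=True).squeeze()

model = ARIMA(pm25, order=(1, 1, 1))   # (p, d, q) chosen via AIC in practice
fit = model.fit()
print(fit.forecast(steps=3))           # next three days' predicted PM2.5
```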

  20. All the fundamental massless bosonic fields in superstring theory

    International Nuclear Information System (INIS)

    Manoukian, E.B.

    2012-01-01

    A systematic analysis of all the massless bosonic fields in superstring theory is carried out. Emphasis is put on the derivation of their propagators, their polarization aspects and the investigation of their underlying constraints, as well as their numbers of degrees of freedom. The treatment is given in the presence of external sources, in the celebrated Coulomb gauge, ensuring the positivity of the formalism - a result which is also established in the process. The challenge here is the investigation involved in the self-dual fourth-rank antisymmetric tensor field. No constraints are imposed on the external sources, so that their components may be varied independently and the complete expressions of the propagators may be obtained. As emphasized in our earlier work, the latter condition is an important one in dynamical theories with constraints, which give rise to modifications such as Faddeev-Popov factors. The analysis is carried out in 10 dimensions, not only because of the consistency requirement of the superstrings, but also in order to take into account the self-duality character of the fourth-rank antisymmetric tensor field, as spelled out in the paper. (Copyright 2012 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  1. Risk Route Choice Analysis and the Equilibrium Model under Anticipated Regret Theory

    Directory of Open Access Journals (Sweden)

    Pengcheng Yuan

    2014-02-01

    Full Text Available The assumption about travellers' route choice behaviour has a major influence on traffic flow equilibrium analysis. Previous studies of travellers' route choice were mainly based on expected utility maximization theory. However, with gradually increasing knowledge about the uncertainty of the transportation system, researchers have realized that expected utility maximization theory is quite constrained, because it requires travellers to be 'absolutely rational'; in fact, travellers are not truly 'absolutely rational'. The anticipated regret theory proposes an alternative framework to the traditional risk-taking in route choice behaviour which might be more scientific and reasonable. We have applied the anticipated regret theory to the analysis of the risky route choosing process, and constructed an anticipated regret utility function. For a simple case which includes two parallel routes, the route choosing results as influenced by the risk aversion degree, the regret degree and the environment risk degree are analyzed. Moreover, the user equilibrium model based on the anticipated regret theory is established. The equivalence and the uniqueness of the model are proved; an efficacious algorithm is also proposed to solve the model. Both the model and the algorithm are demonstrated on a real network. In an experiment, the model results were compared with real data. It was found that the model results can be similar to the real data if a proper regret degree parameter is selected. This illustrates that the model can better explain risky route choosing behaviour. Moreover, it was also found that a traveller's regret degree increases when the environment becomes more and more risky.
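
    The abstract does not give the functional form of its regret utility; for orientation, the classic anticipated-regret form in the Loomes-Sugden tradition modifies the utility of a chosen route i by a regret comparison with the foregone route j:

```latex
V(x_i \mid x_j) \;=\; u(x_i) \;+\; R\big(u(x_i) - u(x_j)\big),
\qquad R(0) = 0,\; R' > 0 ,
```

    so that a route whose outcome falls short of the alternative's carries an extra disutility; the curvature of R encodes the traveller's regret degree.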

  2. An open-source solution for advanced imaging flow cytometry data analysis using machine learning.

    Science.gov (United States)

    Hennig, Holger; Rees, Paul; Blasi, Thomas; Kamentsky, Lee; Hung, Jane; Dao, David; Carpenter, Anne E; Filby, Andrew

    2017-01-01

    Imaging flow cytometry (IFC) enables the high throughput collection of morphological and spatial information from hundreds of thousands of single cells. This high content, information rich image data can in theory resolve important biological differences among complex, often heterogeneous biological samples. However, data analysis is often performed in a highly manual and subjective manner using very limited image analysis techniques in combination with conventional flow cytometry gating strategies. This approach is not scalable to the hundreds of available image-based features per cell and thus makes use of only a fraction of the spatial and morphometric information. As a result, the quality, reproducibility and rigour of results are limited by the skill, experience and ingenuity of the data analyst. Here, we describe a pipeline using open-source software that leverages the rich information in digital imagery using machine learning algorithms. Compensated and corrected raw image files (.rif) data files from an imaging flow cytometer (the proprietary .cif file format) are imported into the open-source software CellProfiler, where an image processing pipeline identifies cells and subcellular compartments allowing hundreds of morphological features to be measured. This high-dimensional data can then be analysed using cutting-edge machine learning and clustering approaches using "user-friendly" platforms such as CellProfiler Analyst. Researchers can train an automated cell classifier to recognize different cell types, cell cycle phases, drug treatment/control conditions, etc., using supervised machine learning. This workflow should enable the scientific community to leverage the full analytical power of IFC-derived data sets. It will help to reveal otherwise unappreciated populations of cells based on features that may be hidden to the human eye that include subtle measured differences in label free detection channels such as bright-field and dark-field imagery
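
    A generic supervised-learning sketch for per-cell feature tables of the kind CellProfiler exports; the file name and label column are assumptions for illustration, and this is not the authors' CellProfiler Analyst workflow itself.

```python
# Train a classifier on hundreds of per-cell morphology features.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

cells = pd.read_csv("cell_features.csv")       # hypothetical exported table
X = cells.drop(columns=["label"])              # per-cell morphology features
y = cells["label"]                             # e.g. cell-cycle phase annotations

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
```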

  3. Dimensional analysis, similarity, analogy, and the simulation theory

    International Nuclear Information System (INIS)

    Davis, A.A.

    1978-01-01

    Dimensional analysis, similarity, analogy, and cybernetics are shown to be four consecutive steps in the application of simulation theory. This paper introduces the classes of phenomena which follow the same formal mathematical equations as models of natural laws, and the narrower sphere of constraints on groups of phenomena for which simplified nondimensional mathematical equations can be introduced. Simulation by similarity within a single field of physics, by analogy across two or more different fields of physics, and by cybernetics across two or more fields of mathematics, physics, biology, economics, politics, sociology, etc., appears as a unified theory which permits one to transfer the results of experiments from models, suitably selected to meet the conditions of research, construction, and measurement in the laboratory, to the originals which are the primary objectives of the research. Some interesting conclusions which cannot be avoided in the use of simplified nondimensional mathematical equations as models of natural laws are presented. Interesting limitations on the use of simulation theory based on assumed simplifications are recognized. This paper argues that it is necessary, in scientific research, to write mathematical models of general laws which can be applied to nature in its entirety. The paper proposes extending the second law of thermodynamics, as the generalized law of entropy, to model life and its activities. It shows that the physical study and the philosophical interpretation of phenomena and natural laws cannot be separated in scientific work; they are interconnected, and neither can be put above the other.
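
    A classic textbook illustration (not taken from the paper) of the dimensional analysis step: for a simple pendulum, assume the period depends on length, gravity and mass, T = f(L, g, m); dimensional homogeneity admits a single dimensionless group,

```latex
\pi_1 = T\sqrt{g/L} = \text{const}
\quad\Longrightarrow\quad
T \propto \sqrt{L/g},
```

    with m dropping out because no combination of L and g can cancel its dimension; this is the kind of model-to-original transfer rule that simulation theory builds on.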

  4. PyLDM - An open source package for lifetime density analysis of time-resolved spectroscopic data.

    Directory of Open Access Journals (Sweden)

    Gabriel F Dorlhiac

    2017-05-01

    Full Text Available Ultrafast spectroscopy offers temporal resolution for probing processes in the femto- and picosecond regimes. This has allowed for investigation of energy and charge transfer in numerous photoactive compounds and complexes. However, analysis of the resultant data can be complicated, particularly in more complex biological systems, such as photosystems. Historically, the dual approach of global analysis and target modelling has been used to elucidate kinetic descriptions of the system, and the identity of transient species, respectively. With regard to the former, the technique of lifetime density analysis (LDA) offers an appealing alternative. While global analysis approximates the data by the sum of a small number of exponential decays, typically on the order of 2-4, LDA uses a semi-continuous distribution of 100 lifetimes. This allows for the elucidation of lifetime distributions, which may be expected from investigation of complex systems with many chromophores, as opposed to averages. Furthermore, the inherent assumption of linear combinations of decays in global analysis means the technique is unable to describe dynamic motion, a process which is resolvable with LDA. The technique was introduced to the field of photosynthesis over a decade ago by the Holzwarth group. The analysis has been demonstrated to be an important tool for evaluating complex dynamics such as photosynthetic energy transfer, and complements traditional global and target analysis techniques. Although the theory has been well described, no open source code has so far been available to perform lifetime density analysis. Therefore, we introduce a Python (2.7) based package, PyLDM, to address this need. We furthermore provide a direct comparison of the capabilities of LDA with those of the more familiar global analysis, as well as providing a number of statistical techniques for dealing with the regularization of noisy data.
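
    To show the idea behind LDA (not PyLDM's actual API), a schematic Tikhonov-regularized fit of a lifetime density over an exponential basis; the time axis, lifetime grid and regularization strength below are assumed.

```python
# Ridge-regularized least squares over a basis of exponential decays.
import numpy as np

t = np.linspace(0.05, 100, 400)                 # delay times (ps, assumed)
taus = np.logspace(-1, 2, 100)                  # 100 log-spaced lifetimes
A = np.exp(-t[:, None] / taus[None, :])         # decay basis matrix

# y: measured kinetics at one wavelength (synthesized here for the demo)
y = 0.7 * np.exp(-t / 0.5) + 0.3 * np.exp(-t / 20)

lam = 1e-2                                      # regularization strength
x = np.linalg.solve(A.T @ A + lam * np.eye(len(taus)), A.T @ y)
print("dominant lifetimes (ps):", taus[np.argsort(x)[-2:]])
```

    The regularization spreads each true decay over neighbouring grid lifetimes, which is exactly the "density" picture the method trades for global analysis's small set of discrete exponentials.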

  5. Problems in the fingerprints based polycyclic aromatic hydrocarbons source apportionment analysis and a practical solution.

    Science.gov (United States)

    Zou, Yonghong; Wang, Lixia; Christensen, Erik R

    2015-10-01

    This work intended to explain the challenges of the fingerprints based source apportionment method for polycyclic aromatic hydrocarbons (PAH) in the aquatic environment, and to illustrate a practical and robust solution. The PAH data detected in the sediment cores from the Illinois River provide the basis of this study. Principal component analysis (PCA) separates PAH compounds into two groups reflecting their possible airborne transport patterns; but it is not able to suggest specific sources. Not all positive matrix factorization (PMF) determined sources are distinguishable due to the variability of source fingerprints. However, they constitute useful suggestions for inputs for a Bayesian chemical mass balance (CMB) analysis. The Bayesian CMB analysis takes into account the measurement errors as well as the variations of source fingerprints, and provides a credible source apportionment. Major PAH sources for Illinois River sediments are traffic (35%), coke oven (24%), coal combustion (18%), and wood combustion (14%). Copyright © 2015. Published by Elsevier Ltd.
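
    A minimal sketch of the mass-balance step using non-negative least squares; the fingerprints and the sample profile below are made-up numbers, and the paper's Bayesian CMB additionally propagates measurement and fingerprint uncertainty.

```python
# Apportion a measured PAH profile to candidate source fingerprints.
import numpy as np
from scipy.optimize import nnls

# Columns: assumed PAH fingerprints (fractions over 4 compounds) for
# traffic, coke oven, coal and wood combustion; illustrative values only.
F = np.array([[0.40, 0.10, 0.20, 0.05],
              [0.30, 0.50, 0.20, 0.15],
              [0.20, 0.30, 0.40, 0.30],
              [0.10, 0.10, 0.20, 0.50]])
sample = np.array([0.27, 0.30, 0.28, 0.15])     # measured PAH profile

contrib, resid = nnls(F, sample)                # non-negative source strengths
print("source shares:", np.round(contrib / contrib.sum(), 2))
```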

  6. Mechanics and analysis of beams, columns and cables. A modern introduction to the classic theories

    DEFF Research Database (Denmark)

    Krenk, Steen

    The book illustrates the use of simple mathematical analysis techniques within the area of basic structural mechanics, in particular the elementary theories of beams, columns and cables. The focus is on: i) Identification of the physical background of the theories and their particular mathematical...... properties. ii) Demonstration of mathematical techniques for analysis of simple problems in structural mechanics, and identification of the relevant parameters and properties of the solution. iii) Derivation of the solutions to a number of basic problems of structural mechanics in a form suitable for later...

  7. Modal Analysis of In-Wheel Motor-Driven Electric Vehicle Based on Bond Graph Theory

    Directory of Open Access Journals (Sweden)

    Di Tan

    2017-01-01

    Full Text Available A half-car vibration model of an electric vehicle driven by rear in-wheel motors was developed using bond graph theory and the modular modeling method. Based on the bond graph model, a modal analysis was carried out to study the vibration characteristics of the electric vehicle. To verify the effectiveness of the established model, the results were compared with those computed on the basis of classical modal analysis and Newton's equations. The comparison shows that the vibration model of the electric vehicle based on bond graph theory is not only able to compute the natural frequencies well, but can also easily determine the deformation mode, momentum mode and other isomorphic modes, and can describe the dynamic characteristics of an electric vehicle driven by in-wheel motors more comprehensively than other modal analysis methods.
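
    For readers unfamiliar with the computation, a generic modal-analysis sketch (with assumed two-degree-of-freedom matrices, not the paper's half-car parameters): natural frequencies and mode shapes come from the generalized eigenproblem K v = w^2 M v.

```python
# Natural frequencies and mode shapes from mass/stiffness matrices.
import numpy as np
from scipy.linalg import eigh

M = np.diag([1200.0, 900.0])                    # assumed mass/inertia terms
K = np.array([[60000.0, -8000.0],
              [-8000.0, 45000.0]])              # assumed stiffness coupling

w2, modes = eigh(K, M)                          # solves K v = w^2 M v
print("natural frequencies (Hz):", np.sqrt(w2) / (2 * np.pi))
print("mode shapes (columns):\n", modes)
```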

  8. Contemporary Theories and International Lawmaking

    NARCIS (Netherlands)

    Venzke, I.; Brölmann, C.; Radi, Y.

    2016-01-01

    Many contemporary theories approach international law-making with a shift in emphasis from the sources of law towards the communicative practices in which a plethora of actors use, claim and speak international law. The contribution proceeds by sketching the move from sources to communicative

  9. Soprano and source: A laryngographic analysis

    Science.gov (United States)

    Bateman, Laura Anne

    2005-04-01

    Popular music in the 21st century uses a particular singing quality for female voice that is quite different from the trained classical singing quality. Classical quality has been the subject of a vast body of research, whereas research that deals with non-classical qualities is limited. In order to learn more about these issues, the author chose to do research on singing qualities using a variety of standard voice quality tests. This paper looks at voice qualities found in various different styles of singing: Classical, Belt, Legit, R&B, Jazz, Country, and Pop. The data was elicited from a professional soprano and the voice qualities reflect industry standards. The data set for this paper is limited to samples using the vowel [i]. Laryngographic (LGG) data was generated simultaneously with the audio samples. This paper will focus on the results of the LGG analysis; however, an audio analysis was also performed using Spectrogram, LPC, and FFT. Data from the LGG is used to calculate the contact quotient, speed quotient, and ascending slope. The LGG waveform is also visually assessed. The LGG analysis gives insights into the source vibration for the different singing styles.
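
    As an illustration of one LGG-derived quantity named in the abstract, a sketch of the contact quotient on a synthetic cycle; the 30% peak-to-peak threshold is one common convention and is an assumption here, since criteria vary across studies.

```python
# Contact quotient (CQ) of one laryngograph (LGG/EGG) cycle: the fraction
# of the glottal period spent above a contact threshold.
import numpy as np

def contact_quotient(cycle: np.ndarray, level: float = 0.3) -> float:
    lo, hi = cycle.min(), cycle.max()
    threshold = lo + level * (hi - lo)          # assumed 30% criterion
    return float(np.mean(cycle > threshold))

# Synthetic single cycle standing in for a recorded LGG period
t = np.linspace(0, 1, 500, endpoint=False)
cycle = np.sin(2 * np.pi * t)
print(f"CQ = {contact_quotient(cycle):.2f}")
```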

  10. Social cognitive theories used to explain physical activity behavior in adolescents: a systematic review and meta-analysis.

    Science.gov (United States)

    Plotnikoff, Ronald C; Costigan, Sarah A; Karunamuni, Nandini; Lubans, David R

    2013-05-01

    To systematically review and examine the explanatory power of key social-cognitive theories used to explain physical activity (PA) intention and behavior, among adolescents. A systematic review and meta-analysis of the literature was performed using the electronic databases Medline, Cumulative Index to Nursing and Allied Health Literature, SPORTdiscus, EBSCO and Education Resources Information Center, Proquest Education Journals Collection, Science Direct, Web of Science and Scopus for social-cognitive theories (i.e., Health Promotion Model, Theory of Planned Behavior, Theory of Reasoned Action, Protection Motivation Theory, Social Cognitive Theory/Self-Efficacy Theory, Health Belief Model, Self-Determination Theory, Transtheoretical Model) used to explain PA intention and behavior. Related keywords in titles, abstracts, or indexing fields were searched. Twenty-three studies satisfied the inclusion criteria and were retained for data extraction and analysis; 16 were cross-sectional studies and seven were longitudinal studies. Most studies employed self-report measures. In general, the models explained greater proportions of variance for intention compared to behavior. The meta-analyses revealed 33% and 48% of the variance respectively for PA and intention were explained by social cognitive models. Few studies have tested the predictive capacity of social cognitive theories to explain objectively measured PA. The majority of PA variance remains unexplained and more theoretical research is needed. Copyright © 2013 Elsevier Inc. All rights reserved.

  11. Theoretical foundations of international migration process studies: analysis of key migration theories development

    Directory of Open Access Journals (Sweden)

    Shymanska K.V.

    2017-03-01

    Full Text Available The need to transform Ukraine's migration policy in light of globalized world development trends, and in response to the challenges of European integration, calls for research into the theoretical and methodological basis of migration studies and into the propositions of existing theories of international migration. A bibliometric analysis of scientific publications on international migration in citation indexes found that recent research on these problems has acquired an interdisciplinary character. This necessitates transforming migration study approaches on the basis of a synthesis of economic, social and institutional theories and concepts. The article is devoted to the study of the theoretical propositions of existing international migration theories in the context of the evolution of scientists' views on this phenomenon. The author found that the existing theories of international migration can usefully be divided into three categories (microeconomic, macroeconomic, globalizational), which contributes to their understanding in the context of possible implementation in state migration policy practice. This makes it possible to determine which theories should be used in constructing Ukraine's state migration policy and in eliminating or reducing the negative effects of external migration.

  12. On the relation of the theoretical foundations of quantum theory and general relativity theory

    International Nuclear Information System (INIS)

    Kober, Martin

    2010-01-01

    The specific content of the present thesis is presented in the following way. First, the most important contents of quantum theory and general relativity theory are presented. In connection with general relativity theory, the mathematical property of diffeomorphism invariance plays the decisive role, while for quantum theory, starting from the Copenhagen interpretation, the measurement problem is treated first, before the analysis of concrete phenomena and of the mathematical apparatus of quantum theory brings nonlocality into focus as an important property. This means that both theories suggest a relationalistic view of the nature of space. This analysis of the theoretical foundations of quantum theory and general relativity theory in relation to the nature of space attains its full persuasive power only when Kant's philosophy and his analysis of the concepts of space and time as fundamental forms of perception are included. Then von Weizsaecker's quantum theory of ur-alternatives is presented. Finally, attempts are made to apply the obtained knowledge to the question of the quantum-theoretical formulation of general relativity theory.

  13. Political Discourse Analysis Through Solving Problems of Graph Theory

    Directory of Open Access Journals (Sweden)

    Monica Patrut

    2010-03-01

    Full Text Available In this article, we show how, using graph theory, we can make a content analysis of political discourse. Assumptions of this analysis are:
    - we have a corpus of speech of each party or candidate;
    - we consider that speech conveys economic, political, socio-cultural values, these taking the form of words or word families;
    - we consider that there are interdependences between the values of a political discourse; they are given by the co-occurrence of two values, as words in the text, within a well defined fragment, or they are determined by the internal logic of political discourse;
    - established links between values in a political speech have associated positive numbers indicating the "power" of those links; these "powers" are defined according to both the number of co-occurrences of values, and the internal logic of the discourse where they occur.
    In this context we intend to highlight the following:
    a) which is the dominant value in a political speech;
    b) which groups of values have ties between them and which have no connection with the rest;
    c) in which order political values should be set in order to obtain an equivalent but more synthetic speech compared to the already given one;
    d) which links between values form the "core" of the political speech.
    To solve these problems, we shall use the Political Analyst program. After that, we present the concepts of introductory graph theory needed to understand the analysis performed by the software, and then the operation of the program. This paper extends the previous paper [6].
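
    A hedged sketch of problems a), b) and d) on a toy value graph; the Political Analyst program is specific to the cited work, so networkx with assumed co-occurrence weights stands in here, and problem c) would follow from, e.g., an ordered traversal of the core.

```python
# Co-occurrence graph of political values with assumed link "powers".
import networkx as nx

G = nx.Graph()
edges = [("economy", "jobs", 5), ("economy", "taxes", 3),
         ("jobs", "taxes", 2), ("culture", "education", 4)]
G.add_weighted_edges_from(edges)

# a) dominant value: largest weighted degree
dominant = max(G.degree(weight="weight"), key=lambda kv: kv[1])[0]
# b) groups of values with no ties to the rest: connected components
groups = list(nx.connected_components(G))
# d) a "core" skeleton of the strongest links: maximum spanning tree
core = nx.maximum_spanning_tree(G, weight="weight")
print(dominant, groups, sorted(core.edges()))
```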

  14. Introduction to coding and information theory

    CERN Document Server

    Roman, Steven

    1997-01-01

    This book is intended to introduce coding theory and information theory to undergraduate students of mathematics and computer science. It begins with a review of probability theory as applied to finite sample spaces and a general introduction to the nature and types of codes. The two subsequent chapters discuss information theory: efficiency of codes, the entropy of information sources, and Shannon's Noiseless Coding Theorem. The remaining three chapters deal with coding theory: communication channels, decoding in the presence of errors, the general theory of linear codes, and such specific codes as Hamming codes, the simplex codes, and many others.
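
    The central quantity in the information theory chapters is the entropy of a source; a minimal sketch of the standard definition:

```python
# Shannon entropy of a finite information source, the quantity behind the
# Noiseless Coding Theorem: no lossless code can beat H(X) bits per symbol.
import math

def entropy(probs):
    """H(X) = -sum p_i * log2(p_i), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.25, 0.25]))   # 1.5 bits: optimal code lengths 1, 2, 2
```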

  15. Quantum theory of scattering

    CERN Document Server

    Wu Ta You

    1962-01-01

    This volume addresses the broad formal aspects and applications of the quantum theory of scattering in atomic and nuclear collisions. An encyclopedic source of pioneering work, it serves as a text for students and a reference for professionals in the fields of chemistry, physics, and astrophysics. The self-contained treatment begins with the general theory of scattering of a particle by a central field. Subsequent chapters explore particle scattering by a non-central field, collisions between composite particles, the time-dependent theory of scattering, and nuclear reactions. An examinati

  16. Grounded theory: building a middle-range theory in nursing

    Directory of Open Access Journals (Sweden)

    Maria João Fernandes

    2015-03-01

    Full Text Available The development of nursing as a discipline results from a boom of investigations underway for nearly a century, and from the construction of theories that arose during the 1950s and gained greater relevance from the 1960s onward. Continuing the production of knowledge in nursing, and seeking to contribute to the growing number of theories explaining the functional content of nurses' work, we are interested in answering the question: how can a middle-range theory in nursing be built that explains the nurse-elderly interaction in a successful aging process? We also address the goal of describing the process of building a middle-range theory in nursing. The middle-range theory arises from a study in the qualitative paradigm, using inductive reasoning, developed in the context of primary health care. The information was collected through participant observation and interviews. The grounded theory method of analysis of Corbin and Strauss(1) was followed, utilizing triangulation of data and theoretical sampling. Grounded theory proved to be a method of analysis which facilitates the understanding and explanation of the phenomenon under study. By making clear the nature and process of the nurse-elderly interaction in the selected context, and within the context of successful aging, a middle-range theory proposal emerged.

  17. A New Higher-Order Composite Theory for Analysis and Design of High Speed Tilt-Rotor Blades

    Science.gov (United States)

    McCarthy, Thomas Robert

    1996-01-01

    A higher-order theory is developed to model composite box beams with arbitrary wall thicknesses. The theory, based on a refined displacement field, represents a three-dimensional model which approximates the elasticity solution. Therefore, the cross-sectional properties are not reduced to one-dimensional beam parameters. Both inplane and out-of-plane warping are automatically included in the formulation. The model accurately captures the transverse shear stresses through the thickness of each wall while satisfying all stress-free boundary conditions. Several numerical results are presented to validate the present theory. The developed theory is then used to model the load carrying member of a tilt-rotor blade which has thick-walled sections. The composite structural analysis is coupled with an aerodynamic analysis to compute the aeroelastic stability of the blade. Finally, a multidisciplinary optimization procedure is developed to improve the aerodynamic, structural and aeroelastic performance of the tilt-rotor aircraft. The Kreisselmeier-Steinhauser function is used to formulate the multiobjective function problem and a hybrid approximate analysis is used to reduce the computational effort. The optimum results are compared with the baseline values and show significant improvements in the overall performance of the tilt-rotor blade.
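
    The Kreisselmeier-Steinhauser function named in the abstract is a standard smooth envelope for aggregating multiple objectives or constraints f_i into a single function; its usual numerically stable form (a textbook expression, not necessarily the thesis's notation) is

```latex
\mathrm{KS}(\rho) \;=\; f_{\max} \;+\; \frac{1}{\rho}\,
\ln \sum_i \exp\!\big(\rho\,(f_i - f_{\max})\big),
\qquad f_{\max} = \max_i f_i ,
```

    which bounds max_i f_i from above and approaches it as the draw-down parameter \rho grows, giving the optimizer a differentiable stand-in for the worst-case objective.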

  18. Acoustic source localization : Exploring theory and practice

    NARCIS (Netherlands)

    Wind, Jelmer

    2009-01-01

    Over the past few decades, noise pollution became an important issue in modern society. This has led to an increased effort in the industry to reduce noise. Acoustic source localization methods determine the location and strength of the vibrations which are the cause of sound, based on measurements of

  19. SIX SIGMA FRAMEWORKS: AN ANALYSIS BASED ON ROGERS’ DIFFUSION OF INNOVATION THEORY

    Directory of Open Access Journals (Sweden)

    Kifayah Amar

    2012-06-01

    Full Text Available This paper attempts to analyze frameworks related to Six Sigma and Lean Six Sigma. The basis for analyzing the frameworks is the diffusion of innovation theory. Several criteria were used to analyze the frameworks, e.g. relative advantage, compatibility, complexity, trialability, observability, communication channels, nature of the social system/culture, and extent of change agent. Based on the framework analysis, only one framework fits Rogers' theory on diffusion of innovation. That framework is a Lean Six Sigma framework consisting of elements such as owner/manager commitment and involvement, employee involvement, training, culture change, and external support. Even though these elements are similar to those of other Six Sigma frameworks, they place more attention on culture change and external support. Generally speaking, culture change and external support are the most important elements for the implementation of Six Sigma or other soft approaches, particularly for small organizations.

  1. A Polytomous Item Response Theory Analysis of Social Physique Anxiety Scale

    Science.gov (United States)

    Fletcher, Richard B.; Crocker, Peter

    2014-01-01

    The present study investigated the social physique anxiety scale's factor structure and item properties using confirmatory factor analysis and item response theory. An additional aim was to identify differences in response patterns between groups (gender). A large sample of high school students aged 11-15 years (N = 1,529) consisting of n =…

  2. Theory of error for target factor analysis with applications to mass spectrometry and nuclear magnetic resonance spectrometry

    International Nuclear Information System (INIS)

    Malinowski, E.R.

    1978-01-01

    Based on the theory of error for abstract factor analysis described earlier, a theory of error for target factor analysis is developed. The theory shows how the error in the data matrix mixes with the error in the target test vector. The apparent error in a target test is found to be a vector sum of the real error in the target vector and the real error in the predicted vector. The theory predicts the magnitudes of these errors without requiring any a priori knowledge of the error in the data matrix or the target vector. A reliability function and a spoil function are developed for the purpose of assessing the validity and the worthiness of a target vector. Examples from model data, mass spectrometry and nuclear magnetic resonance spectrometry are presented. (Auth.)
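
    The vector-sum statement in this abstract can be written compactly. A minimal sketch of the relation, using AET, RET and REP as shorthand for the apparent error in the target test, the real error in the target vector, and the real error in the predicted vector (the abbreviations are ours and may differ from Malinowski's exact notation):

```latex
% Apparent error in a target test as a vector sum (shorthand ours):
% AET = apparent error in the target test
% RET = real error in the target vector
% REP = real error in the predicted vector
\[
  \mathrm{AET}^{2} \;=\; \mathrm{RET}^{2} + \mathrm{REP}^{2}
  \qquad\Longrightarrow\qquad
  \mathrm{AET} \;=\; \sqrt{\mathrm{RET}^{2} + \mathrm{REP}^{2}}
\]
```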

  3. THEORY IN RELIGION AND AGING: AN OVERVIEW

    Science.gov (United States)

    Levin, Jeff; Chatters, Linda M.; Taylor, Robert Joseph

    2011-01-01

    This paper provides an overview of theory in religion, aging, and health. It offers both a primer on theory and a roadmap for researchers. Four “tenses” of theory are described—distinct ways that theory comes into play in this field: grand theory, mid-range theory, use of theoretical models, and positing of constructs which mediate or moderate putative religious effects. Examples are given of both explicit and implicit uses of theory. Sources of theory for this field are then identified, emphasizing perspectives of sociologists and psychologists, and discussion is given to limitations of theory. Finally, reflections are offered as to why theory matters. PMID:20087662

  4. Analysis of the Structure Ratios of the Funding Sources

    Directory of Open Access Journals (Sweden)

    Maria Daniela Bondoc

    2014-06-01

    Full Text Available The funding sources of the assets and liabilities in the balance sheet include the equity capital and the debts of the entity. The analysis of the structure rates of the funding sources allows assessments of the funding policy, highlighting the financial autonomy of the entity and how its resources are provided. Drawing on the literature specializing in economic and financial analysis, this paper presents the rates that reflect, on the one hand, the degree of financial dependence (the rate of financial stability, the rate of global financial autonomy, the rate of on-term financial autonomy) and, on the other hand, the debt structure (the rate of short-term debts, the global indebtedness rate, the on-term indebtedness rate). Based on the financial statements of an entity in the Argeş County, I analysed these indicators, drew conclusions and made assessments related to the autonomy, indebtedness and financial stability of the studied entity.

  5. Frequency spectrum analysis of 252Cf neutron source based on LabVIEW

    International Nuclear Information System (INIS)

    Mi Deling; Li Pengcheng

    2011-01-01

    The frequency spectrum analysis of a 252Cf neutron source is an extremely important method in nuclear stochastic signal processing. Focusing on the special '0' and '1' structure of the neutron pulse series, this paper proposes a fast-correlation algorithm to improve the computational rate of the spectrum analysis system. Multi-core processor technology is employed, together with the multi-threaded programming techniques of LabVIEW, to construct a frequency spectrum analysis system for the 252Cf neutron source based on LabVIEW. It obtains not only the auto-correlation and cross-correlation results, but also the auto-power spectrum, cross-power spectrum and ratio of spectral density. The results show that the analysis tools based on LabVIEW improve the operating efficiency of the fast auto-correlation and cross-correlation code by about 25% to 35%, and also verify the feasibility of using LabVIEW for spectrum analysis. (authors)
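
    The abstract does not reproduce the algorithm, but the standard FFT route to fast correlation and power spectra that it alludes to is easy to sketch. A minimal NumPy illustration (the binary pulse trains and the 1 MHz sampling rate below are hypothetical placeholders, not data from the paper):

```python
import numpy as np

def correlation_and_spectra(x, y, fs):
    """FFT-based ('fast') correlation and power spectra of two pulse trains.

    x, y : equal-length 0/1 pulse series sampled at fs (Hz)
    Returns frequencies, both auto-power spectra, the cross-power spectrum
    and the circular cross-correlation obtained via the inverse FFT.
    """
    n = len(x)
    X, Y = np.fft.rfft(x), np.fft.rfft(y)
    sxx = (X * np.conj(X)).real / n          # auto-power spectrum of x
    syy = (Y * np.conj(Y)).real / n          # auto-power spectrum of y
    sxy = X * np.conj(Y) / n                 # cross-power spectrum
    rxy = np.fft.irfft(X * np.conj(Y), n=n)  # correlation theorem
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return freqs, sxx, syy, sxy, rxy

# Hypothetical '0'/'1' pulse trains at 1 MHz sampling; y leads x by 5 samples.
rng = np.random.default_rng(0)
x = (rng.random(2**16) < 0.01).astype(float)
y = np.roll(x, -5)
freqs, sxx, syy, sxy, rxy = correlation_and_spectra(x, y, fs=1e6)
print("correlation peak at lag:", int(np.argmax(rxy)))  # -> 5
```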

  6. Propeller thrust analysis using Prandtl's lifting line theory, a comparison between the experimental thrust and the thrust predicted by Prandtl's lifting line theory

    Science.gov (United States)

    Kesler, Steven R.

    The lifting line theory was first developed by Prandtl and was used primarily in the analysis of airplane wings. Though the theory is about one hundred years old, it is still used in the initial calculations to find the lift of a wing. The question that guided this thesis was, "How closely does Prandtl's lifting line theory predict the thrust of a propeller?" In order to answer this question, an experiment was designed that measured the thrust of a propeller at different speeds, and the measured thrust was compared to what the theory predicted. For this experiment and analysis, a propeller was needed; a walnut wood ultralight propeller was chosen with a 1.30 meter (51 inch) length from tip to tip. In this thesis, Prandtl's lifting line theory was modified to account for the incoming velocity varying with the radial position of the airfoil, and a modified equation was used to reflect these differences. A working code was developed based on this modified equation. A testing rig was built that allowed the propeller to be rotated at high speeds while the thrust was measured. During testing, the rotational speed of the propeller ranged from 13-43 rotations per second. The thrust from the propeller was measured at different speeds and ranged from 16-33 newtons. The test data were then compared to the theoretical results obtained from the lifting line code. A plot in Chapter 5 (the results section) shows the theoretical vs. actual thrust for different rotational speeds. The theory overpredicted the actual thrust of the propeller. Depending on the rotational speed, the error was 36% at low speeds, 84% at low to moderate speeds, and increased to 195% at high speeds. Different reasons for these errors are discussed.
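
    The modification described above, in which each blade section sees an incoming velocity that depends on its radius, can be illustrated with a crude blade-element-style integration. This is a stand-in sketch for the idea only, not the thesis's lifting-line code: the 0.65 m tip radius matches the propeller described, but the chord, pitch angle and lift-curve slope are invented, and induced velocities are neglected (one reason such idealized models overpredict thrust):

```python
import numpy as np

def thrust_estimate(n_rev_s, R_tip=0.65, R_hub=0.08, chord=0.09,
                    beta=np.deg2rad(5.0), V_inf=0.0, a0=2.0 * np.pi,
                    rho=1.225, n_blades=2, n_elems=200):
    """Integrate sectional lift along the blade to estimate thrust.

    The local incoming speed varies with radius,
    V(r) = sqrt(V_inf**2 + (omega*r)**2),
    which is the radial dependence the modified lifting-line equation
    accounts for. Induced flow and profile drag are neglected here.
    """
    omega = 2.0 * np.pi * n_rev_s            # rad/s
    r = np.linspace(R_hub, R_tip, n_elems)   # radial stations (m)
    V = np.hypot(V_inf, omega * r)           # local incoming speed (m/s)
    phi = np.arctan2(V_inf, omega * r)       # inflow angle (rad)
    cl = a0 * (beta - phi)                   # thin-airfoil lift coefficient
    dL = 0.5 * rho * V**2 * chord * cl       # lift per unit span (N/m)
    dT = dL * np.cos(phi)                    # axial (thrust) component
    return n_blades * np.trapz(dT, r)

for n in (13, 28, 43):  # rev/s, the range tested in the thesis
    print(f"{n:2d} rev/s -> ~{thrust_estimate(n):7.1f} N (idealized)")
```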

  7. Decision theory, the context for risk and reliability analysis

    International Nuclear Information System (INIS)

    Kaplan, S.

    1985-01-01

    According to this model of the decision process, then, the optimum decision is the option having the largest expected utility. This is the fundamental model of a decision situation. It is necessary to remark that in order for the model to represent a real-life decision situation, it must include all the options present in that situation, including, for example, the option of not deciding, which is itself a decision, although usually not the optimum one. Similarly, it should include the option of delaying the decision while further information is gathered. Both of these options have probabilities, outcomes, impacts, and utilities like any other option and should be included explicitly in the decision diagram. The reason for doing a quantitative risk or reliability analysis is always that, somewhere underlying it, there is a decision to be made. The decision analysis therefore always forms the context for the risk or reliability analysis, and this context shapes the form and language of that analysis. A brief review of the well-known decision theory diagram is therefore given in this section.
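
    A minimal numeric sketch of the fundamental model described above: each option has probability-weighted outcomes, and the optimum is the option with the largest expected utility. The options, probabilities and utilities are invented for illustration, including the "do not decide" and "delay for information" options the abstract insists on modeling explicitly:

```python
# Expected-utility decision model (illustrative numbers only).
options = {
    "act now":             [(0.7, 10.0), (0.3, -4.0)],
    "do not decide":       [(1.0, -1.0)],                # inaction is a decision too
    "delay, gather info":  [(0.5, 8.0), (0.5, 1.0)],     # information costs time/money
}

def expected_utility(outcomes):
    """Sum of probability * utility over an option's outcomes."""
    return sum(p * u for p, u in outcomes)

best = max(options, key=lambda name: expected_utility(options[name]))
for name, outcomes in options.items():
    print(f"{name:20s} EU = {expected_utility(outcomes):5.2f}")
print("optimum decision:", best)
```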

  8. Lattice field theories: non-perturbative methods of analysis

    International Nuclear Information System (INIS)

    Weinstein, M.

    1978-01-01

    A lecture is given on the possible extraction of interesting physical information from quantum field theories by studying their semiclassical versions. From the beginning the problem of solving for the spectrum states of any given continuum quantum field theory is considered as a giant Schroedinger problem, and then some nonperturbative methods for diagonalizing the Hamiltonian of the theory are explained without recourse to semiclassical approximations. The notion of a lattice appears as an artifice to handle the problems associated with the familiar infrared and ultraviolet divergences of continuum quantum field theory and in fact for all but gauge theories. 18 references

  9. pyAudioAnalysis: An Open-Source Python Library for Audio Signal Analysis.

    Science.gov (United States)

    Giannakopoulos, Theodoros

    2015-01-01

    Audio information plays a rather important role in the increasing digital content that is available today, resulting in a need for methodologies that automatically analyze such content: audio event recognition for home automations and surveillance systems, speech recognition, music information retrieval, multimodal analysis (e.g. audio-visual analysis of online videos for content-based recommendation), etc. This paper presents pyAudioAnalysis, an open-source Python library that provides a wide range of audio analysis procedures including: feature extraction, classification of audio signals, supervised and unsupervised segmentation and content visualization. pyAudioAnalysis is licensed under the Apache License and is available at GitHub (https://github.com/tyiannak/pyAudioAnalysis/). Here we present the theoretical background behind the wide range of the implemented methodologies, along with evaluation metrics for some of the methods. pyAudioAnalysis has been already used in several audio analysis research applications: smart-home functionalities through audio event detection, speech emotion recognition, depression classification based on audio-visual features, music segmentation, multimodal content-based movie recommendation and health applications (e.g. monitoring eating habits). The feedback provided from all these particular audio applications has led to practical enhancement of the library.
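
    As a usage illustration, a minimal feature-extraction call is sketched below. It assumes the post-2019 module layout (audioBasicIO, ShortTermFeatures); earlier releases shipped the same functionality under audioFeatureExtraction, so treat the exact names as version-dependent, and note that sample.wav is a placeholder file:

```python
# Minimal pyAudioAnalysis sketch (module names follow recent releases;
# older versions used audioFeatureExtraction.stFeatureExtraction instead).
from pyAudioAnalysis import audioBasicIO, ShortTermFeatures

fs, signal = audioBasicIO.read_audio_file("sample.wav")  # placeholder path
# 50 ms windows with a 25 ms hop -> matrix of short-term features.
features, feature_names = ShortTermFeatures.feature_extraction(
    signal, fs, int(0.050 * fs), int(0.025 * fs))
print(len(feature_names), "features x", features.shape[1], "frames")
```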

  10. A Qualitative Analysis Framework Using Natural Language Processing and Graph Theory

    Science.gov (United States)

    Tierney, Patrick J.

    2012-01-01

    This paper introduces a method of extending natural language-based processing of qualitative data analysis with the use of a very quantitative tool--graph theory. It is not an attempt to convert qualitative research to a positivist approach with a mathematical black box, nor is it a "graphical solution". Rather, it is a method to help qualitative…
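
    The pairing of NLP-coded qualitative data with graph theory can be illustrated generically: build a graph whose nodes are codes and whose edges count co-occurrence within a text unit, then read off structural measures. A small sketch with networkx showing the general technique, not Tierney's specific pipeline (the coded segments are invented):

```python
import itertools
import networkx as nx

# Toy "coded" interview segments (one set of codes per response).
segments = [
    {"trust", "workload", "autonomy"},
    {"trust", "autonomy"},
    {"workload", "burnout"},
    {"trust", "burnout", "workload"},
]

G = nx.Graph()
for codes in segments:
    for a, b in itertools.combinations(sorted(codes), 2):
        # Edge weight = number of segments in which two codes co-occur.
        w = G.get_edge_data(a, b, default={"weight": 0})["weight"]
        G.add_edge(a, b, weight=w + 1)

# Quantitative readings of the qualitative structure.
print(nx.degree_centrality(G))
print(nx.betweenness_centrality(G, weight="weight"))
```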

  11. Conservation of ecosystems : theory and practice

    CSIR Research Space (South Africa)

    Siegfried, WR

    1982-09-01

    Full Text Available

  12. Costs of Adopting a Common European Currency. Analysis in Terms of the Optimum Currency Areas Theory

    Directory of Open Access Journals (Sweden)

    Aura Gabriela SOCOL

    2011-02-01

    Full Text Available This analysis presents a theoretical approach to the possible costs incurred by a national economy that wishes to join a monetary union. The analysis is made in terms of the classical optimum currency areas theory, which represents the basis of the monetary union process. The objective of this theory was to make monetary union possible. The theory shows that countries can obtain net benefits from sharing a common currency, provided they are able to avoid the associated adjustment problems. As a matter of fact, its great merit is that it identified certain properties of the countries forming a monetary union, properties that act as real alternative adjustment tools once the independence of the monetary policy is lost.

  13. Analysis of coherence properties of 3-rd generation synchrotron sources and free-electron lasers

    Energy Technology Data Exchange (ETDEWEB)

    Vartanyants, I.A.; Singer, A. [HASYLAB at Deutsches Elektronen-Synchrotron DESY, Hamburg (Germany)

    2009-07-15

    A general theoretical approach based on the results of statistical optics is used for the analysis of the transverse coherence properties of 3-rd generation synchrotron sources and X-ray free-electron lasers (XFEL). Correlation properties of the wave fields are calculated at different distances from an equivalent Gaussian Schell-model source. This model is used to describe coherence properties of the five meter undulator source at the synchrotron storage ring PETRA III. In the case of XFEL sources the decomposition of the statistical fields into a sum of independently propagating transverse modes is used for the analysis of the coherence properties of these new sources. A detailed calculation is performed for the parameters of the SASE1 undulator at the European XFEL. It is demonstrated that only a few modes contribute significantly to the total radiation field of that source. (orig.)
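
    The Gaussian Schell-model source used here has a standard closed form, worth recalling; in the sketch below sigma_S is the rms source size and xi the transverse coherence length (our notation, which may differ from the paper's):

```latex
% Cross-spectral density of a Gaussian Schell-model source (notation ours):
% sigma_S = rms source size, xi = transverse coherence length.
\[
  W(x_1, x_2) \;=\; \sqrt{S(x_1)}\,\sqrt{S(x_2)}\;\mu(x_1 - x_2),
  \qquad
  S(x) = S_0\, e^{-x^{2}/(2\sigma_S^{2})},
  \qquad
  \mu(\Delta x) = e^{-\Delta x^{2}/(2\xi^{2})}
\]
```

    The ratio xi/sigma_S fixes how many transverse modes carry appreciable weight, which is why only a few modes dominate the XFEL field in the calculation above.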

  14. Analysis of coherence properties of 3-rd generation synchrotron sources and free-electron lasers

    International Nuclear Information System (INIS)

    Vartanyants, I.A.; Singer, A.

    2009-07-01

    A general theoretical approach based on the results of statistical optics is used for the analysis of the transverse coherence properties of 3-rd generation synchrotron sources and X-ray free-electron lasers (XFEL). Correlation properties of the wave fields are calculated at different distances from an equivalent Gaussian Schell-model source. This model is used to describe coherence properties of the five meter undulator source at the synchrotron storage ring PETRA III. In the case of XFEL sources the decomposition of the statistical fields into a sum of independently propagating transverse modes is used for the analysis of the coherence properties of these new sources. A detailed calculation is performed for the parameters of the SASE1 undulator at the European XFEL. It is demonstrated that only a few modes contribute significantly to the total radiation field of that source. (orig.)

  15. Beamformer source analysis and connectivity on concurrent EEG and MEG data during voluntary movements.

    Science.gov (United States)

    Muthuraman, Muthuraman; Hellriegel, Helge; Hoogenboom, Nienke; Anwar, Abdul Rauf; Mideksa, Kidist Gebremariam; Krause, Holger; Schnitzler, Alfons; Deuschl, Günther; Raethjen, Jan

    2014-01-01

    Electroencephalography (EEG) and magnetoencephalography (MEG) are the two modalities for measuring neuronal dynamics at a millisecond temporal resolution. Different source analysis methods, to locate the dipoles in the brain from which these dynamics originate, have been readily applied to both modalities alone. However, direct comparisons and possible advantages of combining both modalities have rarely been assessed during voluntary movements using coherent source analysis. In the present study, the cortical and sub-cortical network of coherent sources at the finger tapping task frequency (2-4 Hz) and the modes of interaction within this network were analysed in 15 healthy subjects using a beamformer approach called the dynamic imaging of coherent sources (DICS) with subsequent source signal reconstruction and renormalized partial directed coherence analysis (RPDC). MEG and EEG data were recorded simultaneously allowing the comparison of each of the modalities separately to that of the combined approach. We found the identified network of coherent sources for the finger tapping task as described in earlier studies when using only the MEG or combined MEG+EEG whereas the EEG data alone failed to detect single sub-cortical sources. The signal-to-noise ratio (SNR) level of the coherent rhythmic activity at the tapping frequency in MEG and combined MEG+EEG data was significantly higher than EEG alone. The functional connectivity analysis revealed that the combined approach had more active connections compared to either of the modalities during the finger tapping (FT) task. These results indicate that MEG is superior in the detection of deep coherent sources and that the SNR seems to be more vital than the sensitivity to theoretical dipole orientation and the volume conduction effect in the case of EEG.
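
    The DICS spatial filter used in this study has a well-known closed form (Gross et al., 2001): for a source with lead field L and sensor cross-spectral density matrix C(f), the weights are W = (L^H C^-1 L)^-1 L^H C^-1. A minimal NumPy sketch under invented dimensions (no real MEG/EEG data; the regularized inverse stands in for C^-1):

```python
import numpy as np

def dics_filter(L, C, reg=0.05):
    """Frequency-domain beamformer weights W = (L^H Ci L)^-1 L^H Ci.

    L : (n_sensors, 3) lead field of one source location
    C : (n_sensors, n_sensors) sensor cross-spectral density at the
        frequency of interest (the 2-4 Hz tapping band in the study)
    """
    n = C.shape[0]
    Ci = np.linalg.inv(C + reg * (np.trace(C).real / n) * np.eye(n))
    return np.linalg.solve(L.conj().T @ Ci @ L, L.conj().T @ Ci)

def source_power(W, C):
    """Real trace of the source-level cross-spectrum = source power."""
    return np.real(np.trace(W @ C @ W.conj().T))

# Hypothetical toy problem: 64 sensors, random CSD, random lead field.
rng = np.random.default_rng(1)
A = rng.standard_normal((64, 64)) + 1j * rng.standard_normal((64, 64))
C = A @ A.conj().T / 64            # Hermitian positive-definite CSD
L = rng.standard_normal((64, 3))   # free-orientation lead field
W = dics_filter(L, C)
print("source power:", source_power(W, C))
```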

  16. An application of random field theory to analysis of electron trapping sites in disordered media

    International Nuclear Information System (INIS)

    Hilczer, M.; Bartczak, W.M.

    1993-01-01

    The potential energy surface in a disordered medium is considered a random field and described using the concepts of the mathematical theory of random fields. The preexisting traps for excess electrons are identified with certain regions of excursion (extreme regions) of the potential field. The theory provides an analytical method of statistical analysis of these regions. Parameters of the cavity-averaged potential field, which are provided by computer simulation of a given medium, serve as input data for the analysis. The statistics of preexisting traps are obtained for liquid methanol as a numerical example of the random field method. 26 refs., 6 figs

  17. Business analysis methodology in telecommunication industry – the research based on the grounded theory

    Directory of Open Access Journals (Sweden)

    Hana Nenickova

    2013-10-01

    Full Text Available The objective of this article is to present the use of grounded theory in qualitative research as a basis for building a business analysis methodology for the implementation of information systems in telecommunication enterprises in the Czech Republic. In preparing the methodology I have drawn on the current needs of telecommunications companies, which are characterized mainly by a high dependence on information systems. Besides that, this industry is characterized by high flexibility, strong competition and a compressed corporate strategy timeline. The grounded theory of business analysis defines the specifics of the telecommunications industry, focusing on a very specific description of the procedure for collecting the business requirements and following the business strategy.

  18. Analysis of family-wise error rates in statistical parametric mapping using random field theory.

    Science.gov (United States)

    Flandin, Guillaume; Friston, Karl J

    2017-11-01

    This technical report revisits the analysis of family-wise error rates in statistical parametric mapping, using random field theory, reported in (Eklund et al. []: arXiv 1511.01863). Contrary to the understandable spin that these sorts of analyses attract, a review of their results suggests that they endorse the use of parametric assumptions, and random field theory, in the analysis of functional neuroimaging data. We briefly rehearse the advantages parametric analyses offer over nonparametric alternatives and then unpack the implications of (Eklund et al. []: arXiv 1511.01863) for parametric procedures. Hum Brain Mapp, 2017. © 2017 The Authors Human Brain Mapping Published by Wiley Periodicals, Inc.

  19. The Impact of the Photocopier on Peer Review and Nursing Theory.

    Science.gov (United States)

    Nicoll, Leslie H

    Two influential publications in nursing, Nursing Research and Perspectives on Nursing Theory, are used to illustrate how a specific technology change, the invention and marketing of the photocopier, influenced knowledge dissemination and information utilization in nursing, perhaps in ways not immediately apparent. Content analysis and historical comparison, using editorials from Nursing Research, historical reports on technology development, and personal reflections on the genesis of Perspectives on Nursing Theory, are used to build an argument for the role of technology in peer review, information utilization, and knowledge development in nursing. Multiple forces influence nursing science. Scholars should be alert to data inputs from many sources and respond accordingly.

  20. Interviews with the dead: using meta-life qualitative analysis to validate Hippocrates' theory of humours

    Science.gov (United States)

    Secretion, F; Conjur, G S; Attitude, S P

    1998-01-01

    BACKGROUND: Hippocrates devised his theory of the 4 humours (blood, phlegm, black bile and yellow bile) 24 centuries ago. Since then, medicine has evolved into a complex body of confusing and sometimes contradictory facts. The authors, seeing a need to determine the validity of his theory, hired a psychic. METHODS: The psychic interviewed 4 eminent ancient physicians, including Hippocrates. A randomized double-blind cross-over design was used for this meta-life qualitative analysis. RESULTS: All of the interviewees agreed that the theory of humours is an accurate model to explain disease and personality. INTERPRETATION: Hiring a psychic to conduct after-death interviews with key informants is a useful way to validate scientific theories. PMID:9875254

  1. Brewing Bokashi: Strengthening Student Skills in Dilution Theory through Fermentation Analysis

    Directory of Open Access Journals (Sweden)

    Robert E. Zdor

    2016-05-01

    Full Text Available One of the basic microbiological techniques that students should master is that of using dilution theory to calculate the levels of bacteria in a fluid. This tip reports on using a rice water-milk fermentation mixture termed Bokashi as an easily implemented exercise in the basic microbiological lab to give students multiple opportunities to use dilution theory. Due to the shifts in bacterial community composition over time, a variety of microbes can be cultured using selective and nonselective media. Microscopic observation and the use of GEN III microplates to determine the collective phenotypic pattern of the mixture both give additional opportunities for students to hone their skills in bacterial analysis. Due to the decrease in the pH of the mixture over time, the notion of acid tolerance in bacteria can be explored and assessed using the microplate. By performing multiple rounds of serial dilutions and spread plating, students can practice their skill at using dilution theory several times over the course of the exercise.
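
    The dilution-theory arithmetic the exercise drills is compact enough to state directly: the count in the original culture is the colony count divided by the total dilution factor and the plated volume. A small sketch (the counts and dilutions are made-up classroom numbers, not data from the Bokashi exercise):

```python
def cfu_per_ml(colonies, dilution_factor, volume_plated_ml):
    """CFU/mL of the undiluted sample from one countable spread plate."""
    return colonies / (dilution_factor * volume_plated_ml)

# Made-up example: 10-fold serial dilutions, 0.1 mL spread per plate.
# A plate from the 10^-6 tube shows 137 colonies:
estimate = cfu_per_ml(colonies=137, dilution_factor=1e-6, volume_plated_ml=0.1)
print(f"{estimate:.2e} CFU/mL")  # -> 1.37e+09 CFU/mL
```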

  2. FieldTrip: Open source software for advanced analysis of MEG, EEG, and invasive electrophysiological data.

    Science.gov (United States)

    Oostenveld, Robert; Fries, Pascal; Maris, Eric; Schoffelen, Jan-Mathijs

    2011-01-01

    This paper describes FieldTrip, an open source software package that we developed for the analysis of MEG, EEG, and other electrophysiological data. The software is implemented as a MATLAB toolbox and includes a complete set of consistent and user-friendly high-level functions that allow experimental neuroscientists to analyze experimental data. It includes algorithms for simple and advanced analysis, such as time-frequency analysis using multitapers, source reconstruction using dipoles, distributed sources and beamformers, connectivity analysis, and nonparametric statistical permutation tests at the channel and source level. The implementation as toolbox allows the user to perform elaborate and structured analyses of large data sets using the MATLAB command line and batch scripting. Furthermore, users and developers can easily extend the functionality and implement new algorithms. The modular design facilitates the reuse in other software packages.

  3. Game theoretic analysis of congestion, safety and security traffic and transportation theory

    CERN Document Server

    Zhuang, Jun

    2015-01-01

    Maximizing reader insights into the interactions between game theory, excessive crowding and safety and security elements in traffic and transportation theory, this book establishes a new research angle by illustrating linkages between different research approaches and through laying the foundations for subsequent analysis. Congestion (excessive crowding) is defined in this work as all kinds of flows; e.g., road/sea/air traffic, people, data, information, water, electricity, and organisms. Analyzing systems where congestion occurs – which may be in parallel, series, interlinked, or interdependent, with flows one way or both ways – this book puts forward new congestion models, breaking new ground by introducing game theory and safety/security. Addressing the multiple actors who may hold different concerns regarding system reliability; e.g. one or several terrorists, a government, various local or regional government agencies, or others with stakes for or against system reliability, this book describes how ...

  4. CONCEPTS OF ENVY IN THE PSYCHOLOGICAL THEORIES OF PERSONALITY

    Directory of Open Access Journals (Sweden)

    Татьяна Викторовна Бескова

    2014-04-01

    Full Text Available The article analyzes foreign and Russian theories of personality, representing different psychological directions, in which there is a reference to the problem of envy. The problem of envy is discussed in the framework of classical psychoanalysis (S. Freud, M. Klein), individual psychology (A. Adler), analytical psychology (C.G. Jung), the concept of humanistic psychoanalysis (E. Fromm), social-cultural theory (K. Horney), ego-theory (E. Erikson, A. Peeters), the dispositional direction (G. Allport, R. Cattell), humanistic psychology (A. Maslow) and existential psychology (V. Frankl). It is shown that in Russian theories of personality the problem of envy is reflected in the works of A.A. Bodalev, V.N. Myasishchev, V.N. Panferov and A.V. Petrovsky. Purpose. To carry out an analysis of psychological theories of personality in order to identify the specifics of ideas about the psychological essence and sources of envy. Methodology. Theoretical analysis and systematization of scientific data. Results. The separation and heterogeneity of scientific ideas about envy is revealed, which, on the one hand, allows looking at it from different points of view, and on the other, counteracts the integration of knowledge about envy into a uniform theoretical system. Practical implications. The research results can be used in the practice of psychological consultation, the psycho-correction of the envious relation, and the outreach activity of psychologists. DOI: http://dx.doi.org/10.12731/2218-7405-2013-9-68

  5. Global Sourcing Flexibility

    DEFF Research Database (Denmark)

    Ørberg Jensen, Peter D.; Petersen, Bent

    2013-01-01

    the higher costs (but decreased risk for value chain disruption) embedded in a more flexible global sourcing model that allows the firm to replicate and/or relocate activities across multiple locations. We develop a model and propositions on facilitating and constraining conditions of global sourcing...... sourcing flexibility. Here we draw on prior research in the fields of organizational flexibility, international business and global sourcing as well as case examples and secondary studies. In the second part of the paper, we discuss the implications of global sourcing flexibility for firm strategy...... and operations against the backdrop of the theory-based definition of the construct. We discuss in particular the importance of global sourcing flexibility for operational performance stability, and the trade-off between specialization benefits, emerging from location and service provider specialization, versus...

  6. Characterization of sealed radioactive sources. Uncertainty analysis to improve detection methods

    International Nuclear Information System (INIS)

    Cummings, D.G.; Sommers, J.D.; Adamic, M.L.; Jimenez, M.; Giglio, J.J.; Carney, K.P.

    2009-01-01

    A radioactive 137Cs source has been analyzed for the radioactive parent 137Cs and its stable decay daughter 137Ba. The ratio of daughter to parent atoms is used to estimate the date when the Cs was purified prior to source encapsulation (an 'age' since purification). The isotopes were analyzed by inductively coupled plasma mass spectrometry (ICP-MS) after chemical separation. In addition, Ba was analyzed by isotope dilution ICP-MS (ID-ICP-MS). A detailed error analysis of the mass spectrometric work has been undertaken to identify areas of improvement, as well as to quantify the effect the errors have on the 'age' determined. This paper reports an uncertainty analysis identifying areas of improvement and alternative techniques that may reduce the uncertainties. In particular, work on isotope dilution using ICP-MS for the 'age' determination of sealed sources is presented. The results are compared to the original work done using external standards to calibrate the ICP-MS instrument. (author)
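
    The age-dating relation behind the daughter-to-parent ratio is simple decay arithmetic: if all the 137Ba present grew in from 137Cs decay after purification, then N_Ba/N_Cs = exp(lambda*t) - 1. A sketch of the inversion (the half-life is from standard nuclear data tables; the measured ratio below is an invented placeholder, not a value from the paper):

```python
import math

T_HALF_CS137_Y = 30.08            # 137Cs half-life in years (data tables)
LAMBDA = math.log(2) / T_HALF_CS137_Y

def age_since_purification(ba_over_cs_atoms):
    """Years since Cs purification, assuming all 137Ba is radiogenic.

    Growth of the stable daughter: N_Ba / N_Cs = exp(lambda * t) - 1,
    hence t = ln(1 + ratio) / lambda.
    """
    return math.log1p(ba_over_cs_atoms) / LAMBDA

print(f"{age_since_purification(0.60):.1f} y")  # invented ratio -> ~20.4 y
```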

  7. Open source information acquisition, analysis and integration in the IAEA Department of Safeguards

    International Nuclear Information System (INIS)

    Barletta, M.; Zarimpas, N.; Zarucki, R.

    2010-10-01

    Acquisition and analysis of open source information plays an increasingly important role in the IAEA strengthened safeguards system. The Agency's focal point for open source information collection and analysis is the Division of Safeguards Information Management (SGIM) within the IAEA Department of Safeguards. In parallel with the approval of the Model Additional Protocol in 1997, a new centre of information acquisition and analysis expertise was created within SGIM. By acquiring software, developing databases, retraining existing staff and hiring new staff with diverse analytical skills, SGIM is proactively contributing to the future implementation of information-driven safeguards in collaboration with other Divisions within the Department of Safeguards. Open source information support is now fully integrated with core safeguards processes and activities, and has become an effective tool in the work of the Department of Safeguards. This paper provides an overview of progress realized through the acquisition and use of open source information in several thematic areas: evaluation of additional protocol declarations; support to the State Evaluation process; in-depth investigation of safeguards issues, including assisting inspections and complementary access; research on illicit nuclear procurement networks and trafficking; and monitoring nuclear developments. Demands for open source information have steadily grown and are likely to continue to grow in the future. Coupled with the enormous growth and accessibility in the volume and sources of information, new challenges are presented, both technical and analytical. This paper discusses actions taken and future plans for multi-source and multi-disciplinary analytic integration to strengthen confidence in safeguards conclusions - especially regarding the absence of undeclared nuclear materials and activities. (Author)

  8. Gaussian process based independent analysis for temporal source separation in fMRI

    DEFF Research Database (Denmark)

    Hald, Ditte Høvenhoff; Henao, Ricardo; Winther, Ole

    2017-01-01

    Functional Magnetic Resonance Imaging (fMRI) gives us a unique insight into the processes of the brain and opens up the possibility of analyzing the functional activation patterns of the underlying sources. Task-inferred supervised learning, with restrictive assumptions in the regression set-up, restricts the exploratory nature of the analysis. Fully unsupervised independent component analysis (ICA) algorithms, on the other hand, can struggle to detect clear classifiable components on single-subject data. We attribute this shortcoming to inadequate modeling of the fMRI source signals by failing to incorporate its

  9. Analysis of the TMI-2 source range detector response

    International Nuclear Information System (INIS)

    Carew, J.F.; Diamond, D.J.; Eridon, J.M.

    1980-01-01

    In the first few hours following the TMI-2 accident, large variations (factors of 10-100) in the source range (SR) detector response were observed. The purpose of this analysis was to quantify the various effects which could contribute to these large variations. The effects evaluated included the transmission of neutrons and photons from the core to the detector, and the reduction in the multiplication of the Am-Be startup sources, with the subsequent reduction in SR detector response, due to core voiding. A one-dimensional ANISN slab model of the TMI-2 core, core externals, pressure vessel and containment has been constructed for calculation of the SR detector response and is presented.

  10. Dosimetric analysis of radiation sources to use in dermatological lesions

    International Nuclear Information System (INIS)

    Tada, Ariane

    2010-01-01

    Skin lesions undergoing therapy with radiation sources may have different patterns of malignancy. The malignant lesions, or cancers, most commonly found in radiotherapy services are carcinomas. Radiation therapy of skin lesions is performed with low-penetration beams and orthovoltage X-rays, electron beams, and radioactive sources (192Ir, 198Au and 90Sr) arranged in a surface mould or in a metal applicator. This study aims to analyze the therapeutic radiation dose profile produced by the radiation sources used in skin lesion radiotherapy procedures. Experimental measurements for the dosimetric analysis of the radiation sources were compared with calculations obtained from a computer system based on the Monte Carlo method. The computational results were in good agreement with the experimental measurements. The experimental measurements and the MCNP4C computational results have been used to validate the calculations obtained by the MCNP code and to provide a reliable basis for the medical application in each clinical case. (author)

  11. Two-dimensional topological field theories coupled to four-dimensional BF theory

    International Nuclear Information System (INIS)

    Montesinos, Merced; Perez, Alejandro

    2008-01-01

    Four-dimensional BF theory admits a natural coupling to extended sources supported on two-dimensional surfaces or string world sheets. Solutions of the theory are in one to one correspondence with solutions of Einstein equations with distributional matter (cosmic strings). We study new (topological field) theories that can be constructed by adding extra degrees of freedom to the two-dimensional world sheet. We show how two-dimensional Yang-Mills degrees of freedom can be added on the world sheet, producing in this way, an interactive (topological) theory of Yang-Mills fields with BF fields in four dimensions. We also show how a world sheet tetrad can be naturally added. As in the previous case the set of solutions of these theories are contained in the set of solutions of Einstein's equations if one allows distributional matter supported on two-dimensional surfaces. These theories are argued to be exactly quantizable. In the context of quantum gravity, one important motivation to study these models is to explore the possibility of constructing a background-independent quantum field theory where local degrees of freedom at low energies arise from global topological (world sheet) degrees of freedom at the fundamental level

  12. Blind Time-Frequency Analysis for Source Discrimination in Multisensor Array Processing

    National Research Council Canada - National Science Library

    Amin, Moeness

    1999-01-01

    .... We have clearly demonstrated, through analysis and simulations, the offerings of time-frequency distributions in solving key problems in sensor array processing, including direction finding, source...

  13. Uncertainty Analysis of Few Group Cross Sections Based on Generalized Perturbation Theory

    International Nuclear Information System (INIS)

    Han, Tae Young; Lee, Hyun Chul; Noh, Jae Man

    2014-01-01

    In this paper, the methodology of the sensitivity and uncertainty analysis code based on GPT is described, and preliminary verification calculations on the PMR200 pin cell problem were carried out. The results are in good agreement with those produced by TSUNAMI. From this study, it is expected that the MUSAD code based on GPT can produce the uncertainty of the homogenized few-group microscopic cross sections for a core simulator. For sensitivity and uncertainty analyses of general core responses, a two-step method is available: it utilizes the generalized perturbation theory (GPT) for homogenized few-group cross sections in the first step, and a stochastic sampling method for general core responses in the second step. The uncertainty analysis procedure based on GPT in the first step needs the generalized adjoint solution from a cell or lattice code. For this, the generalized adjoint solver was integrated into DeCART in our previous work. In this paper, the MUSAD (Modules of Uncertainty and Sensitivity Analysis for DeCART) code based on the classical perturbation theory was expanded with the capability of sensitivity and uncertainty analysis for few-group cross sections based on GPT. First, the uncertainty analysis method based on GPT is described and, in the next section, the preliminary results of the verification calculation on a VHTR pin cell problem are compared with the results by TSUNAMI of SCALE 6.1.
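
    The GPT-based first step evaluates response uncertainties with the standard first-order "sandwich" rule, which is worth stating explicitly (our notation, not copied from this paper: S is the relative sensitivity vector of response R to the cross sections, M their relative covariance matrix):

```latex
% First-order uncertainty propagation ("sandwich rule", notation ours):
% S = relative sensitivity vector of response R to cross sections sigma_i,
% M = relative covariance matrix of the cross sections.
\[
  \left(\frac{\Delta R}{R}\right)^{2} \;=\; S^{\mathsf{T}} M\, S,
  \qquad
  S_i \;=\; \frac{\sigma_i}{R}\,\frac{\partial R}{\partial \sigma_i}
\]
```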

  14. Beamformer source analysis and connectivity on concurrent EEG and MEG data during voluntary movements.

    Directory of Open Access Journals (Sweden)

    Muthuraman Muthuraman

    Full Text Available Electroencephalography (EEG) and magnetoencephalography (MEG) are the two modalities for measuring neuronal dynamics at a millisecond temporal resolution. Different source analysis methods, to locate the dipoles in the brain from which these dynamics originate, have been readily applied to both modalities alone. However, direct comparisons and possible advantages of combining both modalities have rarely been assessed during voluntary movements using coherent source analysis. In the present study, the cortical and sub-cortical network of coherent sources at the finger tapping task frequency (2-4 Hz) and the modes of interaction within this network were analysed in 15 healthy subjects using a beamformer approach called the dynamic imaging of coherent sources (DICS) with subsequent source signal reconstruction and renormalized partial directed coherence analysis (RPDC). MEG and EEG data were recorded simultaneously allowing the comparison of each of the modalities separately to that of the combined approach. We found the identified network of coherent sources for the finger tapping task as described in earlier studies when using only the MEG or combined MEG+EEG whereas the EEG data alone failed to detect single sub-cortical sources. The signal-to-noise ratio (SNR) level of the coherent rhythmic activity at the tapping frequency in MEG and combined MEG+EEG data was significantly higher than EEG alone. The functional connectivity analysis revealed that the combined approach had more active connections compared to either of the modalities during the finger tapping (FT) task. These results indicate that MEG is superior in the detection of deep coherent sources and that the SNR seems to be more vital than the sensitivity to theoretical dipole orientation and the volume conduction effect in the case of EEG.

  15. Tracing diffuse anthropogenic Pb sources in rural soils by means of Pb isotope analysis

    NARCIS (Netherlands)

    Walraven, N.; Gaans, P.F.M. van; Veer, G. van der; Os, B.J.H. van; Klaver, G.T.; Vriend, S.P.; Middelburg, J.J.; Davies, G.R.

    2013-01-01

    Knowledge of the cause and source of Pb pollution is important to abate environmental Pb pollution by taking source-related actions. Lead isotope analysis is a potentially powerful tool to identify anthropogenic Pb and its sources in the environment. Spatial information on the variation of

  16. Parental hostility and its sources in psychologically abusive mothers: a test of the three-factor theory.

    Science.gov (United States)

    Lesnik-Oberstein, M; Koers, A J; Cohen, L

    1995-01-01

    A revised version of the three-factor theory of child abuse (Lesnik-Oberstein, Cohen, & Koers, 1982) is presented. Further, we report on a study designed to test three main hypotheses derived from Factor I (a high level of hostility in abusive parents) and its sources. The three main hypotheses are: (1) that psychologically abusive mothers have a high level of hostile feelings (Factor I); (2) that the high level of hostile feelings in abusive mothers is associated with low marital coping skills (resulting in affectionless, violent marriages), a negative childhood upbringing (punitive, uncaring, overcontrolling), a high level of stress (objective stress), and a high level of strain (low self-esteem, depression, neurotic symptoms, social anxiety, feelings of being wronged); and (3) that maternal psychological child abuse is associated with low marital coping skills, a negative childhood upbringing, a high level of stress and a high level of strain. Forty-four psychologically abusing mothers, matched for age and educational level, were compared with 128 nonabusing mothers on a variety of measures. All the mothers had children who were hospitalized for medical symptoms. The three hypotheses were supported, with the exception of the component of hypothesis 2 concerning the association between objective stress and maternal hostility. The positive results are consistent with the three-factor theory.

  17. SWOT analysis of the renewable energy sources in Romania - case study: solar energy

    Science.gov (United States)

    Lupu, A. G.; Dumencu, A.; Atanasiu, M. V.; Panaite, C. E.; Dumitrașcu, Gh; Popescu, A.

    2016-08-01

    The evolution of the energy sector worldwide has triggered intense preoccupation with both finding alternative renewable energy sources and environmental issues. Romania is considered to have a technological potential and geographical location suitable for using renewable energy to generate electricity. But this high potential is not fully exploited in the context of the policies and regulations adopted globally and, more specifically, the European Union (EU) environmental and energy strategies and legislation related to renewable energy sources. This SWOT analysis of the solar energy source presents the state of the art, the potential and the future prospects for the development of renewable energy in Romania. The analysis concluded that the development of the solar energy sector in Romania depends largely on: the viability of the legislative framework on renewable energy sources, increased subsidies for solar R&D, a simplified methodology for green certificates, and educating the public, investors, developers and decision-makers.

  18. USING THE METHODS OF WAVELET ANALYSIS AND SINGULAR SPECTRUM ANALYSIS IN THE STUDY OF RADIO SOURCE BL LAC

    OpenAIRE

    Donskykh, G. I.; Ryabov, M. I.; Sukharev, A. I.; Aller, M.

    2014-01-01

    We investigated the monitoring data of the extragalactic source BL Lac. This monitoring was carried out with the University of Michigan 26-meter radio telescope. To study the flux density of the extragalactic source BL Lac at frequencies of 14.5, 8 and 4.8 GHz, wavelet analysis and singular spectrum analysis were used. Calculating the integral wavelet spectra allowed revealing long-term components (~7-8 years) and short-term components (~1-4 years) in BL Lac. Studying of VLBI radio maps (by the program Mojave) ...

  19. Pin-wise Reactor Analysis Based on the Generalized Equivalence Theory

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Hwan Yeal; Heo, Woong; Kim, Yong Hee [KAIST, Daejeon (Korea, Republic of)

    2016-05-15

    In this paper, a pin-wise reactor analysis is performed based on the generalized equivalence theory. From the conventional fuel assembly lattice calculations, pin-wise 2-group cross sections and pin DFs are generated. Based on the numerical results for a small PWR benchmark, it is observed that the pin-wise core analysis provides quite an accurate prediction of the effective multiplication factor, and the peak pin power error is bounded by about 3% in peripheral fuel assemblies facing the baffle-reflector. Also, it was found that relatively large pin power errors occur along the interface between clearly different fuel assemblies. It is expected that the GET-based pin-by-pin core calculation can be further developed into an advanced method for reactor analysis by improving the group constants and discontinuity factors. Recently, high-fidelity multi-dimensional analysis tools have been gaining more attention because of their accurate prediction of local parameters for core design and safety assessment. In terms of accuracy, direct whole-core transport is quite promising. However, it is clear that it is still very costly in terms of computing time and memory requirements. Another possible solution is the pin-by-pin core analysis, in which only small fuel pins are homogenized and the 3-D core analysis is still performed using a low-order operator such as the diffusion theory. In this paper, a pin-by-pin core analysis is performed using the hybrid CMFD (HCMFD) method. Hybrid CMFD is a new global-local iteration method that has been developed for efficient parallel calculation of pin-by-pin heterogeneous core analysis. For the HCMFD method, the one-node CMFD scheme is combined with a local two-node CMFD method in a non-linear way. Since the SPH method is iterative and SPH factors are not direction dependent, it is clear that the SPH method takes more computing cost and cannot take into account the different heterogeneity and transport effects at each pin interface. Unlike the SPH
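
    The generalized equivalence theory underlying the pin-wise model rests on discontinuity factors, whose defining relation is short enough to state (standard GET definition, not notation taken from this specific paper): the homogenized flux is allowed to be discontinuous at a pin surface s so that the heterogeneous surface flux and net current are preserved.

```latex
% Pin discontinuity factor in generalized equivalence theory
% (standard definition; +/- denote the two sides of an interface):
\[
  f_s \;=\; \frac{\phi_s^{\mathrm{het}}}{\phi_s^{\mathrm{hom}}},
  \qquad
  f_s^{+}\,\phi_s^{\mathrm{hom},+} \;=\; f_s^{-}\,\phi_s^{\mathrm{hom},-}
\]
```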

  20. Theory analysis and simple calculation of travelling wave burnup scheme

    International Nuclear Information System (INIS)

    Zhang Jian; Yu Hong; Gang Zhi

    2012-01-01

    Travelling wave burnup scheme is a new burnup scheme that breeds fuel locally just before it burns. Based on a preliminary theoretical analysis, the physical picture was established. Through the calculation of an R-z cylinder travelling wave reactor core with the ERANOS code system, the basic physical characteristics of this new burnup scheme were obtained. The results show that the travelling wave reactor is feasible in physics, and that the scheme has some attractive features in reactor physics. (authors)

  1. Game theory.

    Science.gov (United States)

    Dufwenberg, Martin

    2011-03-01

    Game theory is a toolkit for examining situations where decision makers influence each other. I discuss the nature of game-theoretic analysis, the history of game theory, why game theory is useful for understanding human psychology, and why game theory has played a key role in the recent explosion of interest in the field of behavioral economics. WIREs Cogn Sci 2011 2 167-173 DOI: 10.1002/wcs.119 For further resources related to this article, please visit the WIREs website. Copyright © 2010 John Wiley & Sons, Ltd.

  2. Open Source Parallel Image Analysis and Machine Learning Pipeline, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Continuum Analytics proposes a Python-based open-source data analysis machine learning pipeline toolkit for satellite data processing, weather and climate data...

  3. Theory of quantitative trend analysis and its application to the South African elections

    CSIR Research Space (South Africa)

    Greben, JM

    2006-02-28

    Full Text Available In this paper the author discusses a quantitative theory of trend analysis. Often trends are based on qualitative considerations and subjective assumptions. In the current approach the author makes use of extensive databases to optimise the so...

  4. Error Analysis of CM Data Products Sources of Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Hunt, Brian D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eckert-Gallup, Aubrey Celia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Cochran, Lainy Dromgoole [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kraus, Terrence D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Allen, Mark B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Beal, Bill [National Security Technologies, Joint Base Andrews, MD (United States); Okada, Colin [National Security Technologies, LLC. (NSTec), Las Vegas, NV (United States); Simpson, Mathew [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-01

    The goal of this project is to address the current inability to assess the overall error and uncertainty of data products developed and distributed by DOE’s Consequence Management (CM) Program. This is a widely recognized shortfall, the resolution of which would provide a great deal of value and defensibility to the analysis results, data products, and the decision-making process that follows this work. A global approach to this problem is necessary because multiple sources of error and uncertainty contribute to the ultimate production of CM data products. Therefore, this project will require collaboration with subject matter experts across a wide range of FRMAC skill sets in order to quantify the types of uncertainty that each area of the CM process might contain and to understand how variations in these uncertainty sources contribute to the aggregated uncertainty present in CM data products. The ultimate goal of this project is to quantify the confidence level of CM products to ensure that appropriate public and worker protection decisions are supported by defensible analysis.

  5. A panorama of discrepancy theory

    CERN Document Server

    Srivastav, Anand; Travaglini, Giancarlo

    2014-01-01

    Discrepancy theory concerns the problem of replacing a continuous object with a discrete sampling. Discrepancy theory is currently at a crossroads between number theory, combinatorics, Fourier analysis, algorithms and complexity, probability theory and numerical analysis. There are several excellent books on discrepancy theory but perhaps no one of them actually shows the present variety of points of view and applications covering the areas "Classical and Geometric Discrepancy Theory", "Combinatorial Discrepancy Theory" and "Applications and Constructions". Our book consists of several chapters, written by experts in the specific areas, and focused on the different aspects of the theory. The book should also be an invitation to researchers and students to find a quick way into the different methods and to motivate interdisciplinary research.

  6. Obisdian sourcing by PIXE analysis at AURA2

    International Nuclear Information System (INIS)

    Neve, S.R.; Barker, P.H.; Holroyd, S.; Sheppard, P.J.

    1994-01-01

    The technique of Proton Induced X-ray Emission (PIXE) is a suitable method for the elemental analysis of obsidian samples and artefacts. By comparing the elemental composition of obsidian artefacts with those of known sources of obsidian and identifying similarities, the likely origin of a sample can be discovered and information about resource procurement gained. A PIXE facility has now been established at the Auckland University Research Accelerator Laboratory, AURA2. It offers a rapid, multi-element, non-destructive method of characterisation of obsidian samples ranging from small chips to large pieces. In an extensive survey of Mayor Island obsidian, a discrimination has been made between the different locations of obsidian deposits on the island. In addition, using the database developed at AURA2, artefacts from the site of Opita, Hauraki Plains, have been sourced. (Author). 18 refs., 8 figs., 7 tabs., 1 appendix

  7. Open source information acquisition, analysis and integration in the IAEA Department of Safeguards

    Energy Technology Data Exchange (ETDEWEB)

    Barletta, M.; Zarimpas, N.; Zarucki, R., E-mail: M.Barletta@iaea.or [IAEA, Wagramerstrasse 5, P.O. Box 100, 1400 Vienna (Austria)

    2010-10-15

    Acquisition and analysis of open source information plays an increasingly important role in the IAEA strengthened safeguards system. The Agency's focal point for open source information collection and analysis is the Division of Safeguards Information Management (SGIM) within the IAEA Department of Safeguards. In parallel with the approval of the Model Additional Protocol in 1997, a new centre of information acquisition and analysis expertise was created within SGIM. By acquiring software, developing databases, retraining existing staff and hiring new staff with diverse analytical skills, SGIM is proactively contributing to the future implementation of information-driven safeguards in collaboration with other Divisions within the Department of Safeguards. Open source information support is now fully integrated with core safeguards processes and activities, and has become an effective tool in the work of the Department of Safeguards. This paper provides an overview of progress realized through the acquisition and use of open source information in several thematic areas: evaluation of additional protocol declarations; support to the State Evaluation process; in-depth investigation of safeguards issues, including assisting inspections and complementary access; research on illicit nuclear procurement networks and trafficking; and monitoring nuclear developments. Demands for open source information have steadily grown and are likely to continue to grow in the future. Coupled with the enormous growth and accessibility in the volume and sources of information, new challenges are presented, both technical and analytical. This paper discusses actions taken and future plans for multi-source and multi-disciplinary analytic integration to strengthen confidence in safeguards conclusions - especially regarding the absence of undeclared nuclear materials and activities. (Author)

  8. Time Series Analysis of Monte Carlo Fission Sources - I: Dominance Ratio Computation

    International Nuclear Information System (INIS)

    Ueki, Taro; Brown, Forrest B.; Parsons, D. Kent; Warsa, James S.

    2004-01-01

    In the nuclear engineering community, the error propagation of the Monte Carlo fission source distribution through cycles is known to be a linear Markov process when the number of histories per cycle is sufficiently large. In the statistics community, linear Markov processes with linear observation functions are known to have an autoregressive moving average (ARMA) representation of orders p and p - 1. Therefore, one can perform ARMA fitting of the binned Monte Carlo fission source in order to compute physical and statistical quantities relevant to nuclear criticality analysis. In this work, the ARMA fitting of a binary Monte Carlo fission source has been successfully developed as a method to compute the dominance ratio, i.e., the ratio of the second-largest to the largest eigenvalues. The method is free of binning mesh refinement and does not require the alteration of the basic source iteration cycle algorithm. Numerical results are presented for problems with one-group isotropic, two-group linearly anisotropic, and continuous-energy cross sections. Also, a strategy for the analysis of eigenmodes higher than the second-largest eigenvalue is demonstrated numerically
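
    The fitting step described here can be sketched with standard time-series tooling. Under the paper's result that the binned source behaves as ARMA(p, p-1), the dominance ratio is estimated from the dominant autoregressive root. Below is a minimal illustration with statsmodels on a synthetic AR series; the series and the choice p = 2 are placeholders, not the paper's benchmark cases:

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(2)

# Synthetic stand-in for a binned fission-source series over cycles:
# an AR(2) process whose dominant root plays the role of the dominance ratio.
true_dr = 0.9
a1, a2 = true_dr + 0.3, -(true_dr * 0.3)   # AR roots at 0.9 and 0.3
x = np.zeros(5000)
for t in range(2, len(x)):
    x[t] = a1 * x[t - 1] + a2 * x[t - 2] + rng.standard_normal()

# Fit ARMA(p, p-1) with p = 2 and take the largest AR root modulus.
res = ARIMA(x, order=(2, 0, 1)).fit()
phi1, phi2 = res.arparams
# Characteristic equation z**2 - phi1*z - phi2 = 0.
dr_est = max(abs(np.roots([1.0, -phi1, -phi2])))
print(f"estimated dominance ratio ~ {dr_est:.3f} (true {true_dr})")
```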

  9. Monte Carlo perturbation theory in neutron transport calculations

    International Nuclear Information System (INIS)

    Hall, M.C.G.

    1980-01-01

    The need to obtain sensitivities in complicated geometrical configurations has resulted in the development of Monte Carlo sensitivity estimation. A new method has been developed to calculate energy-dependent sensitivities of any number of responses in a single Monte Carlo calculation with a very small time penalty. This estimation typically increases the tracking time per source particle by about 30%. The method of estimation is explained. Sensitivities obtained are compared with those calculated by discrete ordinates methods. Further theoretical developments, such as second-order perturbation theory and application to k_eff calculations, are discussed. The application of the method to uncertainty analysis and to the analysis of benchmark experiments is illustrated.
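
    The flavour of such same-run sensitivity estimation can be conveyed with a toy problem: transmission through a purely absorbing slab, where a likelihood-ratio score for the derivative with respect to the cross section is tallied by the same histories that estimate the response. A minimal sketch (illustrative only, not the paper's estimator, which handles full energy-dependent transport):

        import numpy as np

        rng = np.random.default_rng(0)
        sigma, L, n = 0.5, 2.0, 200_000  # total XS (1/cm), slab width (cm), histories

        # Analog tracking: a history is transmitted if its first flight
        # exceeds the slab thickness.
        s = rng.exponential(1.0 / sigma, n)
        transmitted = s > L

        # Response: transmission probability T = exp(-sigma * L).
        T_hat = transmitted.mean()

        # Likelihood-ratio score in the same run:
        # d ln P(transmit) / d sigma = -L, scored on transmitted histories.
        dT_hat = (transmitted * (-L)).mean()

        print(f"T       = {T_hat:.4f}  (exact {np.exp(-sigma * L):.4f})")
        print(f"dT/dsig = {dT_hat:.4f}  (exact {-L * np.exp(-sigma * L):.4f})")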

  10. Preservice Biology Teachers' Conceptions About the Tentative Nature of Theories and Models in Biology

    Science.gov (United States)

    Reinisch, Bianca; Krüger, Dirk

    2018-02-01

    In research on the nature of science, there is a need to investigate the role and status of different scientific knowledge forms. Theories and models are two of the most important knowledge forms within biology and are the focus of this study. During interviews, preservice biology teachers (N = 10) were asked about their understanding of theories and models. They were requested to give reasons why they see theories and models as either tentative or certain constructs. Their conceptions were then compared to philosophers' positions (e.g., Popper, Giere). A category system was developed from the qualitative content analysis of the interviews. These categories include 16 conceptions for theories (n_tentative = 11; n_certain = 5) and 18 conceptions for models (n_tentative = 10; n_certain = 8). The analysis of the interviews showed that the preservice teachers gave reasons for the tentativeness or certainty of theories and models either due to their understanding of the terms or due to their understanding of the generation or evaluation of theories and models. Therefore, a variety of terminology from different sources should be used in learning-teaching situations. Additionally, an understanding of which processes lead to the generation, evaluation, and refinement or rejection of theories and models should be discussed with preservice teachers. Within the philosophy of science, there has been a shift in emphasis from theories to models. This should be transferred to educational contexts by firstly highlighting the role of models and also their connections to theories.

  11. An EGR performance evaluation and decision-making approach based on grey theory and grey entropy analysis.

    Science.gov (United States)

    Zu, Xianghuan; Yang, Chuanlei; Wang, Hechun; Wang, Yinyan

    2018-01-01

    Exhaust gas recirculation (EGR) is one of the main methods of reducing NOx emissions and has been widely used in marine diesel engines. This paper proposes an optimized comprehensive assessment method based on multi-objective grey situation decision theory, grey relation theory and grey entropy analysis to evaluate EGR performance and determine the optimal EGR rate, tasks which currently lack clear theoretical guidance. First, multi-objective grey situation decision theory is used to establish the initial decision-making model according to the main EGR parameters. The optimal compromise between diesel engine combustion and emission performance is transformed into a decision-making target weight problem. After establishing the initial model and considering the characteristics of EGR under different conditions, an optimized target weight algorithm based on grey relation theory and grey entropy analysis is applied to generate the comprehensive evaluation and decision-making model. Finally, the proposed method is successfully applied to a TBD234V12 turbocharged diesel engine, and the results clearly illustrate the feasibility of the proposed method for providing theoretical support and a reference for further EGR optimization.
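
    The grey-relational core of such a scheme is compact enough to sketch. In the Python sketch below, candidate EGR rates are rows and performance indicators are columns (all numbers hypothetical); entropy weights derived from the normalized matrix aggregate the grey relational coefficients. This is a generic textbook formulation, not the paper's exact algorithm:

        import numpy as np

        # Rows: candidate EGR rates; columns: cost-type indicators (lower is better):
        # NOx (g/kWh), smoke (FSN), specific fuel consumption (g/kWh).
        data = np.array([
            [9.8, 0.35, 208.0],   # EGR  0 %
            [7.1, 0.52, 211.0],   # EGR 10 %
            [5.2, 0.81, 216.0],   # EGR 20 %
            [4.0, 1.30, 224.0],   # EGR 30 %
        ])

        # 1) Normalize cost-type indicators to [0, 1] (larger normalized = better).
        norm = (data.max(axis=0) - data) / (data.max(axis=0) - data.min(axis=0))

        # 2) Grey relational coefficients against the ideal sequence (all ones).
        delta = np.abs(1.0 - norm)
        rho = 0.5                                   # distinguishing coefficient
        grc = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())

        # 3) Entropy weights computed from the normalized matrix.
        p = (norm + 1e-12) / (norm + 1e-12).sum(axis=0)
        entropy = -(p * np.log(p)).sum(axis=0) / np.log(len(data))
        weights = (1.0 - entropy) / (1.0 - entropy).sum()

        # 4) Weighted grey relational grade; the highest grade is preferred.
        grade = grc @ weights
        print("weights:", np.round(weights, 3))
        print("grades :", np.round(grade, 3), "-> best row:", grade.argmax())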

  12. Constrained Null Space Component Analysis for Semiblind Source Separation Problem.

    Science.gov (United States)

    Hwang, Wen-Liang; Lu, Keng-Shih; Ho, Jinn

    2018-02-01

    The blind source separation (BSS) problem extracts unknown sources from observations of their unknown mixtures. A current trend in BSS is the semiblind approach, which incorporates prior information on the sources or on how the sources are mixed. The constrained independent component analysis (ICA) approach has been studied to impose constraints on the famous ICA framework. We introduce an alternative approach based on the null space component analysis (NCA) framework, referred to as the c-NCA approach. We also present the c-NCA algorithm, which uses signal-dependent semidefinite operators, which are bilinear mappings, as signatures for operator design in the c-NCA approach. Theoretically, we show that the source estimation of the c-NCA algorithm converges with a convergence rate dependent on the decay of the sequence obtained by applying the estimated operators to the corresponding sources. The c-NCA can be formulated as a deterministic constrained optimization method, and thus it can take advantage of solvers developed in the optimization community for solving the BSS problem. As examples, we demonstrate that electroencephalogram interference rejection problems can be solved by the c-NCA with proximal splitting algorithms by incorporating a sparsity-enforcing separation model and considering the case when reference signals are available.
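
    No reference implementation of c-NCA is widely available, but the shape of a semiblind separation experiment is easy to reproduce with a generic BSS solver. A minimal stand-in using scikit-learn's FastICA (plain ICA, no constraints; sources and mixing matrix are synthetic):

        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(0)
        t = np.linspace(0, 8, 2000)

        # Two synthetic sources (a sinusoid and a sawtooth) plus noise,
        # mixed by an unknown 2x2 matrix -- the classic BSS setup.
        S = np.c_[np.sin(2 * t), 2 * (t % 1) - 1]
        S += 0.05 * rng.normal(size=S.shape)
        A = np.array([[1.0, 0.5], [0.4, 1.2]])       # unknown mixing matrix
        X = S @ A.T                                  # observed mixtures

        # Recover the sources up to permutation and scaling.
        ica = FastICA(n_components=2, random_state=0)
        S_hat = ica.fit_transform(X)

        # Correlate recovered and true sources to check separation quality.
        for i in range(2):
            c = max(abs(np.corrcoef(S_hat[:, i], S[:, j])[0, 1]) for j in range(2))
            print(f"component {i}: best |corr| with a true source = {c:.3f}")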

  13. Smooth massless limit of field theories

    International Nuclear Information System (INIS)

    Fronsdal, C.

    1980-01-01

    The massless limit of Fierz-Pauli field theories, describing fields with fixed mass and spin interacting with external sources, is examined. Results are obtained for spins 1, 3/2, 2 and 3 using conventional models, and then for all half-integral spins in a relatively model-independent manner. It is found that the massless limit is smooth provided that the sources satisfy certain conditions. In the massless limit these conditions reduce to the conservation laws required by the internal consistency of massless field theory. Smoothness simply requires that quantities that vanish in the massless case approach zero in a certain well-defined manner. (orig.)
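
    The spin-1 case shows the mechanism in miniature. As a standard illustration consistent with the abstract (not an excerpt from the paper), the source-to-source amplitude of a massive vector field reads, in LaTeX notation,

        W(J) = \frac{1}{2} \int \frac{d^4 k}{(2\pi)^4}\,
               J^{\mu}(k)^{*}\,
               \frac{\eta_{\mu\nu} + k_{\mu} k_{\nu}/m^{2}}{k^{2} + m^{2} - i\epsilon}\,
               J^{\nu}(k).

    The $k_{\mu}k_{\nu}/m^{2}$ term threatens the limit $m \to 0$; it is harmless provided $k_{\mu}J^{\mu}(k)$ vanishes at least as fast as $m$, and for an exactly conserved current ($k_{\mu}J^{\mu} = 0$) the amplitude passes smoothly to the massless photon expression - precisely the pattern of source conditions described in the abstract.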

  14. Examination of Conservatism in Ground-level Source Release Assumption when Performing Consequence Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sung-yeop; Lim, Ho-Gon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    One of the assumptions frequently made in consequence analysis is that of a ground-level source release. The user manual of the consequence analysis software HotSpot states: 'If you cannot estimate or calculate the effective release height, the actual physical release height (height of the stack) or zero for ground-level release should be used. This will usually yield a conservative estimate, (i.e., larger radiation doses for all downwind receptors, etc).' This recommendation is defensible on grounds of conservatism, but a quantitative examination of the effect of this assumption on the results of consequence analysis is necessary. The source terms of the Fukushima Dai-ichi NPP accident have been estimated by several studies using inverse modeling, and one of the biggest sources of the differences between their results was the different effective source release height assumed by each study. This underlines the importance of quantitatively examining the influence of release height. In this study, a sensitivity analysis of the effective release height of radioactive sources was performed and its influence on the total effective dose was quantitatively examined. A difference of more than 20% persists even at longer distances when the dose computed assuming a ground-level release is compared with the results assuming other effective plume heights. This means that the influence of the ground-level source assumption on latent cancer fatality estimates cannot be ignored. In addition, the assumption of a ground-level release fundamentally precludes detailed analysis of the diffusion of the plume from the effective plume height to the ground, even though its influence is relatively small at longer distances. When the influence of surface roughness is additionally considered, the situation could be more serious. The ground-level dose could be highly over-estimated at short downwind distances at NPP sites which have low surface roughness, such as the Barakah site in
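
    The size of the effect is easy to probe with a textbook Gaussian plume model. The sketch below is generic (Briggs-type open-country class-D dispersion coefficients, not HotSpot's implementation): it compares the ground-level centerline concentration for a ground release (H = 0) against an elevated release.

        import numpy as np

        def sigma_yz(x):
            """Briggs open-country, stability class D dispersion coefficients (m)."""
            sy = 0.08 * x / np.sqrt(1 + 0.0001 * x)
            sz = 0.06 * x / np.sqrt(1 + 0.0015 * x)
            return sy, sz

        def ground_conc(x, H, Q=1.0, u=3.0):
            """Ground-level centerline concentration of a Gaussian plume.

            x: downwind distance (m); H: effective release height (m);
            Q: source strength (Bq/s); u: wind speed (m/s).
            """
            sy, sz = sigma_yz(x)
            return Q / (np.pi * u * sy * sz) * np.exp(-H**2 / (2 * sz**2))

        for x in (200.0, 500.0, 1000.0, 5000.0, 10000.0):
            ratio = ground_conc(x, H=0.0) / ground_conc(x, H=50.0)
            print(f"x = {x:7.0f} m   C(H=0)/C(H=50 m) = {ratio:8.2f}")

    Running this shows the ground-release assumption overestimating near-field concentrations by orders of magnitude, with the ratio shrinking toward unity downwind - consistent with the persistent difference noted above.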

  15. A game theory analysis of green infrastructure stormwater management policies

    Science.gov (United States)

    William, Reshmina; Garg, Jugal; Stillwell, Ashlynn S.

    2017-09-01

    Green stormwater infrastructure has been demonstrated as an innovative water resources management approach that addresses multiple challenges facing urban environments. However, there is little consensus on what policy strategies can be used to best incentivize green infrastructure adoption by private landowners. Game theory, an analysis framework that has historically been under-utilized within the context of stormwater management, is uniquely suited to address this policy question. We used a cooperative game theory framework to investigate the potential impacts of different policy strategies used to incentivize green infrastructure installation. The results indicate that municipal regulation leads to the greatest reduction in pollutant loading. However, the choice of the "best" regulatory approach will depend on a variety of different factors including politics and financial considerations. Large, downstream agents have a disproportionate share of bargaining power. Results also reveal that policy impacts are highly dependent on agents' spatial position within the stormwater network, leading to important questions of social equity and environmental justice.
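
    To make the cooperative-game framing concrete, here is a minimal Shapley-value computation for three hypothetical landowners whose joint green-infrastructure investments reduce pollutant loading; the toy characteristic function gives the downstream agent a disproportionate contribution, echoing the bargaining-power result (the paper's model is far richer):

        from itertools import permutations

        # Characteristic function: pollutant-load reduction (kg/yr) achieved by
        # each coalition of landowners A, B, C. Toy numbers; the downstream
        # agent C contributes disproportionately.
        v = {
            frozenset(): 0, frozenset("A"): 10, frozenset("B"): 12,
            frozenset("C"): 25, frozenset("AB"): 28, frozenset("AC"): 45,
            frozenset("BC"): 50, frozenset("ABC"): 72,
        }

        players = "ABC"
        shapley = dict.fromkeys(players, 0.0)

        # Average each player's marginal contribution over all join orders.
        for order in permutations(players):
            coalition = frozenset()
            for p in order:
                shapley[p] += v[coalition | {p}] - v[coalition]
                coalition = coalition | {p}

        n_orders = 6  # 3! join orders
        for p in players:
            print(f"player {p}: Shapley value = {shapley[p] / n_orders:.2f}")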

  16. Theory matters for financial advice!

    OpenAIRE

    Mayer, János; Hens, Thorsten

    2013-01-01

    We show that the optimal asset allocation for an investor depends crucially on the theory with which the investor is modeled. For the same market data and the same client data, different theories lead to different portfolios. The market data we consider is standard asset allocation data. The client data is determined by a standard risk profiling question, and the theories we apply are mean-variance analysis, expected utility analysis and cumulative prospect theory.
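
    The paper's point - same data, different theory, different portfolio - can be reproduced in a few lines. A toy example comparing the mean-variance optimum with an expected-CRRA-utility optimum on the same simulated return sample (all parameters hypothetical; the paper additionally treats cumulative prospect theory):

        import numpy as np

        rng = np.random.default_rng(0)
        mu, sigma, rf, gamma = 0.07, 0.20, 0.02, 3.0   # risky asset, risk-free, risk aversion
        r = rng.normal(mu, sigma, 100_000)             # simulated risky returns

        # Mean-variance: closed-form optimal risky share.
        w_mv = (mu - rf) / (gamma * sigma**2)

        # Expected CRRA utility: grid search over the risky share on the sample.
        def eu(w):
            wealth = 1 + rf + w * (r - rf)
            wealth = np.maximum(wealth, 1e-3)          # guard: CRRA undefined at w <= 0
            return np.mean(wealth**(1 - gamma)) / (1 - gamma)

        grid = np.linspace(0.0, 1.2, 241)
        w_crra = grid[np.argmax([eu(w) for w in grid])]

        print(f"mean-variance risky share        : {w_mv:.3f}")
        print(f"CRRA (gamma={gamma}) risky share : {w_crra:.3f}")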

  17. Adding Theoretical Grounding to Grounded Theory: Toward Multi-Grounded Theory

    OpenAIRE

    Göran Goldkuhl; Stefan Cronholm

    2010-01-01

    The purpose of this paper is to challenge some of the cornerstones of the grounded theory approach and propose an extended and alternative approach for data analysis and theory development, which the authors call multi-grounded theory (MGT). A multi-grounded theory is not only empirically grounded; it is also grounded in other ways. Three different grounding processes are acknowledged: theoretical, empirical, and internal grounding. The authors go beyond the pure inductivist approach in GT an...

  18. FECAL SOURCE TRACKING BY ANTIBIOTIC RESISTANCE ANALYSIS ON A WATERSHED EXHIBITING LOW RESISTANCE

    Science.gov (United States)

    The ongoing development of microbial source tracking has made it possible to identify contamination sources with varying accuracy, depending on the method used. The purpose of this study was to test the efficiency of the antibiotic resistance analysis (ARA) method under low ...

  19. Changing theories of change: strategic shifting in implicit theory endorsement.

    Science.gov (United States)

    Leith, Scott A; Ward, Cindy L P; Giacomin, Miranda; Landau, Enoch S; Ehrlinger, Joyce; Wilson, Anne E

    2014-10-01

    People differ in their implicit theories about the malleability of characteristics such as intelligence and personality. These relatively chronic theories can be experimentally altered, and can be affected by parent or teacher feedback. Little is known about whether people might selectively shift their implicit beliefs in response to salient situational goals. We predicted that, when motivated to reach a desired conclusion, people might subtly shift their implicit theories of change and stability to garner supporting evidence for their desired position. Any motivated context in which a particular lay theory would help people to reach a preferred directional conclusion could elicit shifts in theory endorsement. We examine a variety of motivated situational contexts across 7 studies, finding that people's theories of change shifted in line with goals to protect self and liked others and to cast aspersions on disliked others. Studies 1-3 demonstrate how people regulate their implicit theories to manage self-view by more strongly endorsing an incremental theory after threatening performance feedback or memories of failure. Studies 4-6 revealed that people regulate the implicit theories they hold about favored and reviled political candidates, endorsing an incremental theory to forgive preferred candidates for past gaffes but leaning toward an entity theory to ensure past failings "stick" to opponents. Finally, in Study 7, people who were most threatened by a previously convicted child sex offender (i.e., parents reading about the offender moving to their neighborhood) gravitated most to the entity view that others do not change. Although chronic implicit theories are undoubtedly meaningful, this research reveals a previously unexplored source of fluidity by highlighting the active role people play in managing their implicit theories in response to goals.

  20. Dynamic Stability Analysis of Autonomous Medium-Voltage Mixed-Source Microgrid

    DEFF Research Database (Denmark)

    Zhao, Zhuoli; Yang, Ping; Guerrero, Josep M.

    2015-01-01

    A state-space model of the autonomous MV mixed-source microgrid containing a diesel generator set (DGS), grid-supporting battery energy storage system (BESS), squirrel cage induction generator (SCIG) wind turbine and network is developed. Sensitivity analysis is carried out to reveal the dynamic stability margin...
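
    Small-signal stability studies of this kind reduce to eigenvalue analysis of the linearized state matrix. A minimal sketch (hypothetical 3x3 state matrix; a real DGS/BESS/SCIG microgrid model has dozens of states):

        import numpy as np

        # Hypothetical linearized state matrix dx/dt = A x of a small microgrid model.
        A = np.array([
            [-0.5,  10.0,   0.0],
            [-8.0,  -1.2,   4.0],
            [ 0.0,  -3.0,  -0.8],
        ])

        eigvals = np.linalg.eigvals(A)
        for lam in eigvals:
            wn = abs(lam)                                 # natural frequency (rad/s)
            zeta = -lam.real / wn if wn > 0 else 1.0      # damping ratio
            print(f"lambda = {lam:.3f}  f = {lam.imag / (2 * np.pi):6.3f} Hz  "
                  f"zeta = {zeta:5.3f}")

        # Stable iff every eigenvalue lies in the open left half-plane.
        print("stable:", all(lam.real < 0 for lam in eigvals))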

  1. Elaborations of grounded theory in information research: arenas/social worlds theory, discourse and situational analysis

    OpenAIRE

    Vasconcelos, A.C.; Sen, B.A.; Rosa, A.; Ellis, D.

    2012-01-01

    This paper explores elaborations of Grounded Theory in relation to Arenas/Social Worlds Theory. The notions of arenas and social worlds were present in early applications of Grounded Theory but have not been as much used or recognised as the general Grounded Theory approach, particularly in the information studies field. The studies discussed here are therefore very unusual in information research. The empirical contexts of these studies are those of (1) the role of discourse in the organisat...

  2. Dynamic response analysis of the LBL Advanced Light Source synchrotron radiation storage ring

    International Nuclear Information System (INIS)

    Leung, K.

    1993-05-01

    This paper presents the dynamic response analysis of the photon source synchrotron radiation storage ring excited by ground motion measured at the Lawrence Berkeley Laboratory Advanced Light Source building site. The high spectral brilliance required of the photon beams of the Advanced Light Source storage ring restricts the displacement of the quadrupole focusing magnets to the order of 1 micron in vertical motion. There are 19 magnets supported by a 430-inch steel box beam girder. The girder and all magnets are supported by the kinematic mount system normally used in optical equipment. The kinematic mount, called a six-strut magnet support system, is now considered as an alternative system for supporting SSC magnets in the Super Collider. The six-strut support system is now operating successfully for the Advanced Light Source (ALS) accelerator at the Lawrence Berkeley Laboratory. This paper presents the method of analysis and results of the dynamic motion study at the center of the magnets under the most critical excitation source as recorded at the LBL site

  3. The Rise and Fall of the Cosmic String Theory for Cosmological Perturbations

    International Nuclear Information System (INIS)

    Perivolaropoulos, L.

    2005-01-01

    The cosmic string theory for cosmological fluctuations is a good example of healthy scientific progress in cosmology. It is a well defined, physically motivated model that has been tested by cosmological observations and has been ruled out as a primary source of primordial fluctuations. Until about fifteen years ago, the cosmic string theory of cosmological perturbations provided one of the two physically motivated candidate theories for the generation of primordial perturbations. The cosmological data that appeared during the last decade have been compared with the well defined predictions of the theory and have ruled out cosmic strings as a primary source of primordial cosmological perturbations. Since cosmic strings are predicted to form after inflation in a wide range of microphysical theories (including supersymmetric and fundamental string theories), their observational bounds may serve as a source of serious constraints for these theories. This is a pedagogical review of the historical development, the main predictions of the cosmic string theory, and the constraints that have been imposed on it by cosmological observations. Recent lensing events that could be attributed to lighter cosmic strings are also discussed

  4. Economics of Water Quality Protection from Nonpoint Sources: Theory and Practice

    OpenAIRE

    Ribaudo, Marc; Horan, Richard D.; Smith, Mark E.

    1999-01-01

    Water quality is a major environmental issue. Pollution from nonpoint sources is the single largest remaining source of water quality impairments in the United States. Agriculture is a major source of several nonpoint-source pollutants, including nutrients, sediment, pesticides, and salts. Agricultural nonpoint pollution reduction policies can be designed to induce producers to change their production practices in ways that improve the environmental and related economic consequences of produc...

  5. A system-theory-based model for monthly river runoff forecasting: model calibration and optimization

    Directory of Open Access Journals (Sweden)

    Wu Jianhua

    2014-03-01

    River runoff is not only a crucial part of the global water cycle, but it is also an important source for hydropower and an essential element of water balance. This study presents a system-theory-based model for river runoff forecasting, taking the Hailiutu River as a case study. The forecasting model, designed for the Hailiutu watershed, was calibrated and verified by long-term precipitation observation data and groundwater exploitation data from the study area. Additionally, frequency analysis, used as an optimization technique, was applied to improve prediction accuracy. Following model optimization, the overall relative prediction errors are below 10%. The system-theory-based prediction model is applicable to river runoff forecasting, and following optimization by frequency analysis, the prediction error is acceptable.

  6. Incremental retinal-defocus theory of myopia development--schematic analysis and computer simulation.

    Science.gov (United States)

    Hung, George K; Ciuffreda, Kenneth J

    2007-07-01

    Previous theories of myopia development involved subtle and complex processes such as the sensing and analyzing of chromatic aberration, spherical aberration, the spatial gradient of blur, or the spatial frequency content of the retinal image, but they have not been able to explain satisfactorily the diverse experimental results reported in the literature. On the other hand, our newly proposed incremental retinal-defocus theory (IRDT) has been able to explain all of these results. This theory is based on a relatively simple and direct mechanism for the regulation of ocular growth. It states that a time-averaged decrease in retinal-image defocus area decreases the rate of release of retinal neuromodulators, which decreases the rate of retinal proteoglycan synthesis with an associated decrease in scleral structural integrity. This increases the rate of scleral growth, and in turn the eye's axial length, which leads to myopia. Our schematic analysis has provided a clear explanation for the eye's ability to grow in the appropriate direction under a wide range of experimental conditions. In addition, the theory has been able to explain how repeated cycles of nearwork-induced transient myopia lead to repeated periods of decreased retinal-image defocus, whose cumulative effect over an extended period of time results in an increase in axial growth that leads to permanent myopia. Thus, this unifying theory forms the basis for understanding the underlying retinal and scleral mechanisms of myopia development.

  7. Theory- Building for Iranian Underground Music Using Grounded Theory

    Directory of Open Access Journals (Sweden)

    Masoud Kowsari

    2013-03-01

    Different genres of underground music are important elements of Iranian youth culture. The purpose of this research was to study masculinity in Iranian-Persian rap music. Therefore, Persian rap music, as a part of Iranian popular culture between 2001 and 2011, was analyzed. We used a qualitative research approach. The main method used in this study was "Constructive Grounded Theory". So instead of using existing theories as a theoretical framework, the researcher sought to generate local theory from the research field. Thus, using theoretical sampling, data were compiled from various sources. Multiple data collection techniques such as interviews, observation, online observation, and collecting documents and texts were used. All the data were then coded using open, axial and selective coding methods. Finally, 62 concepts and 16 categories were derived from the data, and "plural masculinity in Iranian-Persian rap music" was defined as the core category. Then, according to the paradigmatic model, the substantive theory that emerged from the data was presented as a "story" and a "visual model". Finally, Strauss and Corbin's seven questions about research experience were used to evaluate the research.

  8. The Theory of Optimal Taxation

    DEFF Research Database (Denmark)

    Sørensen, Peter Birch

    The paper discusses the implications of optimal tax theory for the debates on uniform commodity taxation and neutral capital income taxation. While strong administrative and political economy arguments in favor of uniform and neutral taxation remain, recent advances in optimal tax theory suggest that the information needed to implement the differentiated taxation prescribed by optimal tax theory may be easier to obtain than previously believed. The paper also points to the strong similarity between optimal commodity tax rules and the rules for optimal source-based capital income taxation...

  9. The theory of optimal taxation

    DEFF Research Database (Denmark)

    Sørensen, Peter Birch

    2007-01-01

    The paper discusses the implications of optimal tax theory for the debates on uniform commodity taxation and neutral capital income taxation. While strong administrative and political economy arguments in favor of uniform and neutral taxation remain, recent advances in optimal tax theory suggest that the information needed to implement the differentiated taxation prescribed by optimal tax theory may be easier to obtain than previously believed. The paper also points to the strong similarity between optimal commodity tax rules and the rules for optimal source-based capital income taxation...

  10. Contemporary Theories and International Law-Making

    NARCIS (Netherlands)

    Venzke, I.

    2013-01-01

    Many contemporary theories approach international law-making with a shift in emphasis from the sources of law towards the communicative practices in which a plethora of actors use, claim and speak international law. Whereas earlier approaches would look at the sources as the singular moment of

  11. Unitary unified field theories

    International Nuclear Information System (INIS)

    Sudarshan, E.C.G.

    1976-01-01

    This is an informal exposition of some recent developments. Starting with an examination of the universality of electromagnetic and weak interactions, the attempts at their unification are outlined. The theory of unitary renormalizable self-coupled vector mesons with dynamical sources is formulated for a general group. With masses introduced as variable parameters it is shown that the theory so defined is indeed unitary. Diagrammatic rules are developed in terms of a chosen set of fictitious particles. A number of special examples are outlined, including a theory with strongly interacting vector and axial vector mesons and weak mesons. Applications to weak interactions of strange particles are briefly outlined. (Auth.)

  12. Getting to the Source: a Survey of Quantitative Data Sources Available to the Everyday Librarian: Part 1: Web Server Log Analysis

    Directory of Open Access Journals (Sweden)

    Lisa Goddard

    2007-03-01

    This is the first part of a two-part article that provides a survey of data sources which are likely to be immediately available to the typical practitioner who wishes to engage in statistical analysis of collections and services within his or her own library. Part I outlines the data elements which can be extracted from web server logs, and discusses web log analysis tools. Part II looks at logs, reports, and data sources from proxy servers, resource vendors, link resolvers, federated search engines, institutional repositories, electronic reference services, and the integrated library system.
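
    As a flavour of what web server log analysis involves, here is a minimal Python sketch that parses Apache/NCSA Combined Log Format lines and tallies the most-requested resources (the log path is hypothetical; a real analysis would also handle bots, sessions, and malformed lines):

        import re
        from collections import Counter

        # Apache/NCSA Combined Log Format:
        # host ident user [time] "request" status bytes "referer" "user-agent"
        LOG_RE = re.compile(
            r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
            r'"(?P<method>\S+) (?P<path>\S+) [^"]*" '
            r'(?P<status>\d{3}) (?P<bytes>\S+)'
        )

        hits = Counter()
        with open("access.log") as f:          # hypothetical log file
            for line in f:
                m = LOG_RE.match(line)
                if m and m.group("status").startswith("2"):
                    hits[m.group("path")] += 1

        # Top ten successfully served resources by request count.
        for path, n in hits.most_common(10):
            print(f"{n:8d}  {path}")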

  13. On non-relativistic electron theory

    Energy Technology Data Exchange (ETDEWEB)

    Woolley, R G

    1975-01-01

    A discussion of non-relativistic electron theory, which makes use of the electromagnetic field potentials only as useful working variables in the intermediate stages, is presented. The separation of the (transverse) radiation field from the longitudinal electric field due to the sources is automatic, and as a result, this formalism is often more convenient than the usual Coulomb gauge theory used in molecular physics.

  14. Health Behavior Theory in Physical Activity Game Apps: A Content Analysis.

    Science.gov (United States)

    Payne, Hannah E; Moxley, Victor Ba; MacDonald, Elizabeth

    2015-07-13

    Physical activity games developed for a mobile phone platform are becoming increasingly popular, yet little is known about their content or inclusion of health behavior theory (HBT). The objective of our study was to quantify elements of HBT in physical activity games developed for mobile phones and to assess the relationship between theoretical constructs and various app features. We conducted an analysis of exercise and physical activity game apps in the Apple App Store in the fall of 2014. A total of 52 apps were identified and rated for inclusion of health behavior theoretical constructs using an established theory-based rubric. Each app was coded for 100 theoretical items, containing 5 questions for 20 different constructs. Possible total theory scores ranged from 0 to 100. Descriptive statistics and Spearman correlations were used to describe the HBT score and its association with selected app features, respectively. The average HBT score in the sample was 14.98 out of 100. One outlier, SuperBetter, scored higher than the other apps with a score of 76. Goal setting, self-monitoring, and self-reward were the most-reported constructs found in the sample. There was no association between app price and theory score (P=.5074), or between the number of gamification elements and theory score (P=.5010). However, SuperBetter, with the highest HBT score, was also the most expensive app. There are few content analyses of serious games for health, but a comparison between these findings and previous content analyses of non-game health apps indicates that physical activity mobile phone games demonstrate higher levels of behavior theory. The most common theoretical constructs found in this sample are known to be efficacious elements in physical activity interventions. It is unclear, however, whether app designers consciously design physical activity mobile phone games with specific constructs in mind; it may be that games lend themselves well to inclusion of theory and any

  15. Application of Extreme Value Theory to Crash Data Analysis.

    Science.gov (United States)

    Xu, Lan; Nusholtz, Guy

    2017-11-01

    A parametric model obtained by fitting a set of data to a function generally uses a procedure such as maximum likelihood or least squares. In general this will generate the best estimate for the distribution of the data overall, but will not necessarily generate a reasonable estimate for the tail of the distribution unless the fitted function resembles the underlying distribution function. A fitted distribution can represent the tail data poorly even while the bulk of the data is reasonably represented by its central part. Extreme value theory can be used to improve the predictive capabilities of the fitted function in the tail region. In this study the peak-over-threshold approach from extreme value theory was utilized to show that it is possible to obtain a better fit of the tail of a distribution than with procedures that use the entire distribution only. Additional constraints on the selection of the threshold (an estimate of the beginning of the tail region) are imposed to minimize the sensitivity to individual data samples in the tail section as well as contamination from the central distribution. Once the threshold is determined, the maximum likelihood method is used to fit the exceedances with the Generalized Pareto Distribution to obtain the tail distribution. The approach was then applied to the analysis of airbag inflator pressure data from tank tests, and to the crash velocity and mass distributions from field crash data (NASS). In the examples, the extreme (tail) distributions were better estimated with the Generalized Pareto Distribution than with a single overall distribution, along with the probability of occurrence of a given extreme value, or of a rare observation such as a high-speed crash. It was concluded that the peak-over-threshold approach from extreme value theory can be a useful tool in
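
    A peak-over-threshold fit is only a few lines with SciPy. A minimal sketch (synthetic data standing in for crash-speed observations; threshold choice is the delicate step the authors constrain):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(42)
        speeds = rng.lognormal(mean=3.0, sigma=0.4, size=20_000)  # synthetic "crash speeds"

        # Peak-over-threshold: keep exceedances above a high empirical quantile.
        u = np.quantile(speeds, 0.95)          # threshold (the hard part in practice)
        exceed = speeds[speeds > u] - u

        # Fit the Generalized Pareto Distribution to the exceedances by ML.
        shape, loc, scale = stats.genpareto.fit(exceed, floc=0.0)

        # Tail probability P(X > x) = P(X > u) * P(X - u > x - u | X > u).
        x = 100.0
        p_tail = (exceed.size / speeds.size) * stats.genpareto.sf(x - u, shape, 0.0, scale)
        print(f"threshold u = {u:.1f}, GPD shape = {shape:.3f}, scale = {scale:.2f}")
        print(f"P(speed > {x:.0f}) = {p_tail:.2e}")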

  16. Real analysis a comprehensive course in analysis, part 1

    CERN Document Server

    Simon, Barry

    2015-01-01

    A Comprehensive Course in Analysis by Poincaré Prize winner Barry Simon is a five-volume set that can serve as a graduate-level analysis textbook with a lot of additional bonus information, including hundreds of problems and numerous notes that extend the text and provide important historical background. Depth and breadth of exposition make this set a valuable reference source for almost all areas of classical analysis. Part 1 is devoted to real analysis. From one point of view, it presents the infinitesimal calculus of the twentieth century with the ultimate integral calculus (measure theory)

  17. Deformation due to distributed sources in micropolar thermodiffusive medium

    Directory of Open Access Journals (Sweden)

    Sachin Kaushal

    2010-10-01

    The general solution to the field equations of a micropolar generalized thermodiffusive medium in the context of G-L (Green-Lindsay) theory is investigated by applying Laplace and Fourier transforms for various sources. Distributed normal forces, thermal sources and potential sources are applied to show the utility of the problem. To obtain the solution in physical form, a numerical inversion technique has been applied. The transformed components of stress, temperature distribution and chemical potential for G-L theory and CT theory are depicted graphically, and the results are compared analytically to show the impact of diffusion, relaxation times and micropolarity on these quantities. Some special cases of interest are also deduced from the present investigation.
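
    The abstract does not specify which numerical Laplace inversion was used; the Gaver-Stehfest algorithm is one common choice for this class of problems and fits in a dozen lines. A sketch, validated on the pair F(s) = 1/(s+1), f(t) = exp(-t):

        import math

        def stehfest_invert(F, t, N=12):
            """Gaver-Stehfest numerical inversion of a Laplace transform F(s) at time t.

            N must be even; 10-16 is typical for smooth, non-oscillatory f(t).
            """
            ln2t = math.log(2.0) / t
            total = 0.0
            for k in range(1, N + 1):
                V = 0.0
                for j in range((k + 1) // 2, min(k, N // 2) + 1):
                    V += (j ** (N // 2) * math.factorial(2 * j)) / (
                        math.factorial(N // 2 - j) * math.factorial(j)
                        * math.factorial(j - 1) * math.factorial(k - j)
                        * math.factorial(2 * j - k)
                    )
                V *= (-1) ** (k + N // 2)
                total += V * F(k * ln2t)
            return ln2t * total

        # Check against a transform with a known inverse: L{exp(-t)} = 1/(s+1).
        for t in (0.5, 1.0, 2.0):
            approx = stehfest_invert(lambda s: 1.0 / (s + 1.0), t)
            print(f"t = {t}: {approx:.6f} vs exact {math.exp(-t):.6f}")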

  18. Microwave and RF vacuum electronic power sources

    CERN Document Server

    Carter, Richard G

    2018-01-01

    Do you design and build vacuum electron devices, or work with the systems that use them? Quickly develop a solid understanding of how these devices work with this authoritative guide, written by an author with over fifty years of experience in the field. Rigorous in its approach, it focuses on the theory and design of commercially significant types of gridded, linear-beam, crossed-field and fast-wave tubes. Essential components such as waveguides, resonators, slow-wave structures, electron guns, beams, magnets and collectors are also covered, as well as the integration and reliable operation of devices in microwave and RF systems. Complex mathematical analysis is kept to a minimum, and Mathcad worksheets supporting the book online aid understanding of key concepts and connect the theory with practice. Including coverage of primary sources and current research trends, this is essential reading for researchers, practitioners and graduate students working on vacuum electron devices.

  19. The mathematical theory of signal processing and compression-designs

    Science.gov (United States)

    Feria, Erlan H.

    2006-05-01

    The mathematical theory of signal processing, named processor coding, will be shown to arise inherently as the computational-time dual of Shannon's mathematical theory of communication, which is also known as source coding. Source coding is concerned with signal source memory space compression, while processor coding deals with signal processor computational time compression. Their combination is named compression-designs, referred to as Conde for short. A compelling and pedagogically appealing diagram will be discussed highlighting Conde's remarkably successful application to real-world knowledge-aided (KA) airborne moving target indicator (AMTI) radar.

  20. Off-design performance analysis of Kalina cycle for low temperature geothermal source

    International Nuclear Information System (INIS)

    Li, Hang; Hu, Dongshuai; Wang, Mingkun; Dai, Yiping

    2016-01-01

    Highlights: • The off-design performance analysis of the Kalina cycle is conducted. • The off-design models are established. • The genetic algorithm is used in the design phase. • The sliding pressure control strategy is applied. Abstract: Low temperature geothermal sources with brilliant prospects have attracted increasing attention. A Kalina cycle system using ammonia-water as the working fluid can exploit geothermal energy effectively. In this paper, a quantitative analysis of the off-design performance of a Kalina cycle for a low temperature geothermal source is conducted. Off-design models of the turbine, pump and heat exchangers are established. A genetic algorithm is used to maximize the net power output and determine the thermodynamic parameters in the design phase. The sliding pressure control strategy widely applied in existing Rankine cycle power plants is adopted to respond to variations of the geothermal source mass flow rate ratio (70-120%), geothermal source temperature (116-128 °C) and heat sink temperature (0-35 °C). Within the off-design research scope, guidance for pump rotational speed adjustment is given to provide a reference for off-design operation of geothermal power plants. The required adjustment rate of the pump rotational speed is more sensitive to per unit geothermal source temperature than to per unit heat sink temperature. The heat sink variation has a greater influence than the geothermal source variation on the ranges of net power output and thermal efficiency.

  1. Gadamer's Hermeneutic Contribution to a Theory of Time ...

    African Journals Online (AJOL)

    denise

    The play of a work of art and a festival provide examples of ways in which aesthetic experience is fundamentally a source of our consciousness of time. Drawing on the themes of Gadamer's aesthetic theory, we can thus delineate the basic themes of Gadamer's theory of time-consciousness: a theory that locates Gadamer ...

  2. THE HEURISTIC POTENTIAL OF ANOMIE THEORY IN MODERN CRIMINOLOGY

    Directory of Open Access Journals (Sweden)

    Alexander Vladislavovich Pletnev

    2015-01-01

    This article deals with modern English-language theories of anomie, which can be used in Russian criminology. The main goal of the article is to identify current theories of anomie and define the prospects for their use. As modern theories of anomie are poorly represented in the Russian sociological and criminological literature, the subject of research is topical. This work contains an analysis of the opportunities for adopting modern conceptions of anomie of the individual in Russian practice. The research considers the development of the theory of anomie in the history of sociology. The problem of anomie was already recognized in ancient Greece. Anomie, which today concerns normlessness and is related to alienation, is associated primarily with the works of Durkheim and Merton. Anomia was developed in research by MacIver and Srole as a characteristic of individuals, related to the breakdown of the individual's sense of attachment to society. The results of the theoretical research show that theories of anomie of the personality have the greatest heuristic potential for modern Russian science. Another important conclusion is that anomie can have several sources of emergence. Further study of this subject is necessary because English-language theories of anomie contain a set of theoretical and empirical results which can be used in Russian criminology.

  3. Multicriteria analysis for sources of renewable energy using data from remote sensing

    Science.gov (United States)

    Matejicek, L.

    2015-04-01

    Renewable energy sources are major components of the strategy to reduce harmful emissions and to replace depleting fossil energy resources. Data from remote sensing can provide information for multicriteria analysis for sources of renewable energy. Advanced land cover quantification makes it possible to search for suitable sites. Multicriteria analysis, together with other data, is used to determine the energy potential and socially acceptability of suggested locations. The described case study is focused on an area of surface coal mines in the northwestern region of the Czech Republic, where the impacts of surface mining and reclamation constitute a dominant force in land cover changes. High resolution satellite images represent the main input datasets for identification of suitable sites. Solar mapping, wind predictions, the location of weirs in watersheds, road maps and demographic information complement the data from remote sensing for multicriteria analysis, which is implemented in a geographic information system (GIS). The input spatial datasets for multicriteria analysis in GIS are reclassified to a common scale and processed with raster algebra tools to identify suitable sites for sources of renewable energy. The selection of suitable sites is limited by the CORINE land cover database to mining and agricultural areas. The case study is focused on long term land cover changes in the 1985-2015 period. Multicriteria analysis based on CORINE data shows moderate changes in mapping of suitable sites for utilization of selected sources of renewable energy in 1990, 2000, 2006 and 2012. The results represent map layers showing the energy potential on a scale of a few preference classes (1-7), where the first class is linked to minimum preference and the last class to maximum preference. The attached histograms show the moderate variability of preference classes due to land cover changes caused by mining activities. The results also show a slight increase in the more
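
    The reclassify-and-overlay step is the computational heart of such a GIS workflow. A minimal raster-algebra sketch with NumPy (tiny synthetic rasters; a real analysis would read the CORINE and solar/wind layers with a GIS library such as rasterio):

        import numpy as np

        rng = np.random.default_rng(7)
        shape = (5, 5)                               # tiny synthetic study area

        # Input criteria rasters (arbitrary units), e.g. from remote sensing.
        solar = rng.uniform(900, 1300, shape)        # annual irradiation, kWh/m2
        wind = rng.uniform(2.0, 8.0, shape)          # mean wind speed, m/s
        road_dist = rng.uniform(0, 20, shape)        # distance to roads, km (less is better)
        land_ok = rng.random(shape) > 0.3            # mask: mining/agricultural areas only

        def reclass(x, lo, hi, invert=False):
            """Linearly rescale a raster to preference classes 1..7."""
            t = np.clip((x - lo) / (hi - lo), 0, 1)
            if invert:
                t = 1 - t
            return np.rint(1 + 6 * t).astype(int)

        # Weighted overlay of reclassified criteria, masked by land cover.
        suitability = (0.4 * reclass(solar, 900, 1300)
                       + 0.4 * reclass(wind, 2.0, 8.0)
                       + 0.2 * reclass(road_dist, 0, 20, invert=True))
        suitability = np.where(land_ok, np.rint(suitability).astype(int), 0)
        print(suitability)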

  4. Spatial channel theory: A technique for determining the directional flow of radiation through reactor systems

    International Nuclear Information System (INIS)

    Williams, M.L.; Engle, W.W.

    1977-01-01

    A method is introduced for determining streaming paths through a non-multiplying medium. The concepts of a "response continuum" and a pseudo-particle called a contribution are developed to describe the spatial channels through which response flows from a source to a detector. An example application of channel theory to complex shield analysis is cited.

  5. Capital Structure Analysis Of EBX Group’s Companies: Combining Theory And Practice

    Directory of Open Access Journals (Sweden)

    Matheus da Costa Gomes

    2017-12-01

    According to market timing theory, companies tend to issue stocks or debt in order to exploit windows of opportunity, and this behavior is a significant determinant of capital structure. Based on this assertion and recent evidence found in the Brazilian market, this paper analyzes the market timing behavior of the six publicly-traded companies of the EBX Group listed on BM&FBOVESPA (MPX, MMX, OGX, LLX, OSX and CCX) until mid-2013. For this purpose, the methodology used was a case study, with indicators related to capital structure and the stock market as the main sources of evidence. A panel-data econometric approach was also employed to strengthen the analyses. The results show that the financial decisions made by the managers of the EBX Group indicate attempts to exploit windows of opportunity in the stock market, measured mainly by Shiller's price-to-earnings ratio (Shiller PE), an aggregate market index, as well as by the market-to-book (M/B) ratio of each company analyzed. This paper contributes to the discussion of market timing in Brazil, combining theory and practice in an intuitive and dynamic way and bringing in aspects related to companies and market conditions.

  6. Analysis of Earthquake Source Spectra in Salton Trough

    Science.gov (United States)

    Chen, X.; Shearer, P. M.

    2009-12-01

    Previous studies of the source spectra of small earthquakes in southern California show that average Brune-type stress drops vary among different regions, with particularly low stress drops observed in the Salton Trough (Shearer et al., 2006). The Salton Trough marks the southern end of the San Andreas Fault and is prone to earthquake swarms, some of which are driven by aseismic creep events (Lohman and McGuire, 2007). In order to learn the stress state and understand the physical mechanisms of swarms and slow slip events, we analyze the source spectra of earthquakes in this region. We obtain Southern California Seismic Network (SCSN) waveforms for earthquakes from 1977 to 2009 archived at the Southern California Earthquake Center (SCEC) data center, which includes over 17,000 events. After resampling the data to a uniform 100 Hz sample rate, we compute spectra for both signal and noise windows for each seismogram, and select traces with a P-wave signal-to-noise ratio greater than 5 between 5 Hz and 15 Hz. Using selected displacement spectra, we isolate the source spectra from station terms and path effects using an empirical Green’s function approach. From the corrected source spectra, we compute corner frequencies and estimate moments and stress drops. Finally we analyze spatial and temporal variations in stress drop in the Salton Trough and compare them with studies of swarms and creep events to assess the evolution of faulting and stress in the region. References: Lohman, R. B., and J. J. McGuire (2007), Earthquake swarms driven by aseismic creep in the Salton Trough, California, J. Geophys. Res., 112, B04405, doi:10.1029/2006JB004596 Shearer, P. M., G. A. Prieto, and E. Hauksson (2006), Comprehensive analysis of earthquake source spectra in southern California, J. Geophys. Res., 111, B06303, doi:10.1029/2005JB003979.
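
    The final step, from corner frequency to stress drop, is a one-line formula worth making explicit. A minimal sketch using the Brune (1970) source model (the numbers are illustrative, not from the study):

        import numpy as np

        def brune_stress_drop(M0, fc, beta=3500.0):
            """Brune-model stress drop (Pa).

            M0: seismic moment (N*m); fc: corner frequency (Hz);
            beta: shear-wave speed at the source (m/s).
            """
            r = 2.34 * beta / (2.0 * np.pi * fc)   # Brune source radius (m)
            return 7.0 * M0 / (16.0 * r**3)

        # Illustrative M ~ 3 event: M0 = 10^(1.5*M + 9.1) N*m, fc = 5 Hz.
        M0 = 10 ** (1.5 * 3.0 + 9.1)
        print(f"stress drop: {brune_stress_drop(M0, 5.0) / 1e6:.2f} MPa")

    For the numbers above the result is on the order of 1 MPa, i.e. the low stress drops reported for the Salton Trough fall out of small corner frequencies at fixed moment.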

  7. Equilibrium paths analysis of materials with rheological properties by using the chaos theory

    Science.gov (United States)

    Bednarek, Paweł; Rządkowski, Jan

    2018-01-01

    The numerical analysis of equilibrium paths of a material with random rheological properties using standard procedures and specialist computer programs was not successful. A proper solution for the analysed heuristic model of the material was obtained on the basis of elements of chaos theory and neural networks. The paper discusses the mathematical reasons for the failure of the computer programs used, and elaborates the properties of the attractor employed in the analysis. The results of the numerical analysis are presented in both numerical and graphical form for the procedures used.

  8. MpTheory Java library: a multi-platform Java library for systems biology based on the Metabolic P theory.

    Science.gov (United States)

    Marchetti, Luca; Manca, Vincenzo

    2015-04-15

    MpTheory Java library is an open-source project collecting a set of objects and algorithms for modeling observed dynamics by means of the Metabolic P (MP) theory, a mathematical theory introduced in 2004 for modeling biological dynamics. By means of the library, it is possible to model biological systems both at continuous and at discrete time. Moreover, the library comprises a set of regression algorithms for inferring MP models starting from time series of observations. To enhance the modeling experience, besides pure Java usage, the library can be used directly within the most popular computing environments, such as MATLAB, GNU Octave, Mathematica and R. The library is open-source and licensed under the GNU Lesser General Public License (LGPL) Version 3.0. Source code, binaries and complete documentation are available at http://mptheory.scienze.univr.it.

  9. Ramsey theory for product spaces

    CERN Document Server

    Dodos, Pandelis

    2016-01-01

    Ramsey theory is a dynamic area of combinatorics that has various applications in analysis, ergodic theory, logic, number theory, probability theory, theoretical computer science, and topological dynamics. This book is devoted to one of the most important areas of Ramsey theory-the Ramsey theory of product spaces. It is a culmination of a series of recent breakthroughs by the two authors and their students who were able to lift this theory to the infinite-dimensional case. The book presents many major results and methods in the area, such as Szemerédi's regularity method, the hypergraph removal lemma, and the density Hales-Jewett theorem. This book addresses researchers in combinatorics but also working mathematicians and advanced graduate students who are interested in Ramsey theory. The prerequisites for reading this book are rather minimal: it only requires familiarity, at the graduate level, with probability theory and real analysis. Some familiarity with the basics of Ramsey theory would be beneficial, ...

  10. Analysis of the monitoring system for the spallation neutron source 'SINQ'

    International Nuclear Information System (INIS)

    Badreddin, E.

    1998-01-01

    Petri Net models (PN) and Fault-Tree Analysis (FTA) are employed for the purpose of reliability analysis of the spallation neutron source SINQ. The monitoring and shut-down system (SDS) structure is investigated using a Petri-Net model. The reliability data are processed using a Fault-Tree model of the dominant part. Finally, suggestions for the improvement of system availability are made. (author)
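
    The fault-tree half of such an analysis boils down to propagating basic-event probabilities through AND/OR gates. A minimal sketch (hypothetical gate structure and failure probabilities, assuming independent basic events; not the SINQ tree):

        # Fault-tree gate algebra for independent basic events.
        def AND(*p):
            out = 1.0
            for q in p:
                out *= q
            return out

        def OR(*p):
            out = 1.0
            for q in p:
                out *= (1.0 - q)
            return 1.0 - out

        # Hypothetical basic-event failure probabilities (per demand).
        sensor_a, sensor_b = 1e-3, 1e-3      # redundant trip sensors
        logic_unit = 5e-5                    # shut-down logic
        actuator = 2e-4                      # beam-stop actuator

        # Top event: shut-down system fails to stop the beam on demand.
        # Both sensors must fail, OR the logic fails, OR the actuator fails.
        p_top = OR(AND(sensor_a, sensor_b), logic_unit, actuator)
        print(f"P(top event) = {p_top:.3e}")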

  11. Nuclear structure theory

    CERN Document Server

    Irvine, J M

    1972-01-01

    Nuclear Structure Theory provides a guide to nuclear structure theory. The book comprises 23 chapters organized into four parts; each part covers an aspect of nuclear structure theory. In the first part, the text discusses the experimentally observed phenomena which nuclear structure theories need to address, and details the information that supports those theories. The second part of the book deals with the phenomenological nucleon-nucleon potentials derived from phase shift analysis of nucleon-nucleon scattering. Part III talks about the phenomenological parameters used to de

  12. Topics in phase-shift analysis and higher spin field theory

    International Nuclear Information System (INIS)

    Reisen, J.C.J.M.

    1983-01-01

    The first part of this thesis considers several aspects of the existence of phase-shift ambiguities. The subject is introduced with a few remarks on scattering theory and previous work in this area is discussed. The mathematical restrictions of presenting such problems clearly are considered and the construction of different unitary amplitudes which correspond to the same differential cross section is described. So far, examples of phase-shift ambiguities have only been found for rather special cases but the author shows that these results can be considerably generalized for spinless elastic scattering, leading to properties of phase-shift ambiguities being revealed that were previously absent. These properties are discussed in detail. Phase-shift ambiguities for the spin-0-spin-1/2 elastic scattering are then considered and again generalized. The second part of this thesis is concerned with the investigation of a free field theory for both massive and massless particles with higher spin (1, 2 and 3). A root method has been used which is described and shown to lead to the free field equations and the subsidiary conditions. A field equation and Lagrangian are constructed for massive particles and the former is then used to derive a massless field equation and Lagrangian. The relation between massive and massless field equations is investigated in more detail and particularly the expressions for the amplitude describing exchange of a particle between two external sources are compared. (Auth./C.F.)

  13. Selection of important ecological source patches based on Green Infrastructure theory: A case study of Wuhan city

    Science.gov (United States)

    Ke, Yuanyuan; Yu, Yan; Tong, Yan

    2018-01-01

    Selecting urban ecological patches is of great significance for constructing urban green infrastructure networks and protecting urban biodiversity and the ecological environment. With the support of GIS technology, criteria for selecting source patches were developed according to existing planning. Ecological source patches for terrestrial organisms and for aquatic and amphibious organisms were then selected in Wuhan city. To increase the connectivity of the ecological patches and achieve greater ecological protection benefits, green infrastructure networks in Wuhan city were constructed with the minimum path analysis method. Finally, the characteristics of the ecological source patches were analyzed with landscape metrics, and the ecological protection importance of the source patches was evaluated comprehensively. The results showed that there are 23 important ecological source patches in Wuhan city, among which the Sushan Temple forest patch and the Lu Lake and Shangshe Lake wetland patches are the most important of all patch types for ecological protection. This study can provide a scientific basis for the preservation of urban ecological space, the delineation of natural conservation areas and the protection of biological diversity.

  14. Semantic integration of gene expression analysis tools and data sources using software connectors

    Science.gov (United States)

    2013-01-01

    Background: The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated with the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results: We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. Conclusions: The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools

  15. Renormalization group analysis of the temperature dependent coupling constant in massless theory

    International Nuclear Information System (INIS)

    Yamada, Hirofumi.

    1987-06-01

    A general analysis of finite temperature renormalization group equations for massless theories is presented. It is found that in a direction where momenta and temperature are scaled up with their ratio fixed, the coupling constant behaves in the same manner as at zero temperature, and that asymptotic freedom at short distances is also maintained at finite temperature. (author)

  16. Geophysical Field Theory

    International Nuclear Information System (INIS)

    Eloranta, E.

    2003-11-01

    The geophysical field theory includes the basic principles of electromagnetism, continuum mechanics, and potential theory upon which the computational modelling of geophysical phenomena is based. Vector analysis is the main mathematical tool in the field analyses. Electrostatics, stationary electric current, magnetostatics, and electrodynamics form a central part of electromagnetism in geophysical field theory. Potential theory concerns especially gravity, but also electrostatics and magnetostatics. Solid state mechanics and fluid mechanics are central parts of continuum mechanics. The theories of elastic waves and rock mechanics also belong to geophysical solid state mechanics. The theories of geohydrology and mass transport form one central field theory in geophysical fluid mechanics. Heat transfer is also included in continuum mechanics. (orig.)

  17. Applications of surface analysis and surface theory in tribology

    Science.gov (United States)

    Ferrante, John

    1989-01-01

    Tribology, the study of adhesion, friction and wear of materials, is a complex field which requires a knowledge of solid state physics, surface physics, chemistry, material science, and mechanical engineering. It has been dominated, however, by the more practical need to make equipment work. With the advent of surface analysis and advances in surface and solid-state theory, a new dimension has been added to the analysis of interactions at tribological interfaces. In this paper the applications of tribological studies and their limitations are presented. Examples from research at the NASA Lewis Research Center are given. Emphasis is on fundamental studies involving the effects of monolayer coverage and thick films on friction and wear. A summary of the current status of theoretical calculations of defect energetics is presented. In addition, some new theoretical techniques which enable simplified quantitative calculations of adhesion, fracture, and friction are discussed.

  18. Systematic review of empiricism and theory in domestic minor sex trafficking research.

    Science.gov (United States)

    Twis, Mary K; Shelton, Beth Anne

    2018-01-01

    Empiricism and the application of human behavior theory to inquiry are regarded as markers of high-quality research. Unfortunately, scholars have noted that there are many gaps in theory and empiricism within the human trafficking literature, calling into question the legitimacy of policies and practices that are derived from the available data. To date, there has not been an analysis of the extent to which empirical methods and human behavior theory have been applied to domestic minor sex trafficking (DMST) research as a subcategory of human trafficking inquiry. To fill this gap in the literature, this systematic review was designed to assess the degree to which DMST publications are a) empirical, and b) apply human behavior theory to inquiry. This analysis also focuses on answering research questions related to patterns within DMST study data sources, and patterns of human behavior theory application. The results of this review indicate that a minority of sampled DMST publications are empirical, a minority of those articles that were empirical apply a specific human behavior theory within the research design and reporting of results, a minority of articles utilize data collected directly from DMST victims, and that there are no discernible patterns in the application of human behavior theory to DMST research. This research note suggests that DMST research is limited by the same challenges as the larger body of human trafficking scholarship. Based upon these overarching findings, specific recommendations are offered to DMST researchers who are committed to enhancing the quality of DMST scholarship.

  19. Joint source based analysis of multiple brain structures in studying major depressive disorder

    Science.gov (United States)

    Ramezani, Mahdi; Rasoulian, Abtin; Hollenstein, Tom; Harkness, Kate; Johnsrude, Ingrid; Abolmaesumi, Purang

    2014-03-01

    We propose a joint Source-Based Analysis (jSBA) framework to identify brain structural variations in patients with Major Depressive Disorder (MDD). In this framework, features representing position, orientation and size (i.e. pose), shape, and local tissue composition are extracted. Subsequently, simultaneous analysis of these features within a joint analysis method is performed to generate the basis sources that show significant differences between subjects with MDD and those in the healthy control group. Moreover, in a leave-one-out cross-validation experiment, we use a Fisher Linear Discriminant (FLD) classifier to identify individuals within the MDD group. Results show that we can classify the MDD subjects with an accuracy of 76% solely based on the information gathered from the joint analysis of pose, shape, and tissue composition in multiple brain structures.
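
    The classification step is standard and easy to reproduce. A minimal sketch of a linear discriminant with leave-one-out cross-validation on synthetic feature vectors (random stand-ins for the pose/shape/composition sources; scikit-learn's LDA is used here in place of the authors' FLD implementation):

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import LeaveOneOut, cross_val_score

        rng = np.random.default_rng(0)

        # Synthetic stand-in: 40 subjects x 5 source features; MDD group shifted.
        X_ctrl = rng.normal(0.0, 1.0, size=(20, 5))
        X_mdd = rng.normal(0.8, 1.0, size=(20, 5))
        X = np.vstack([X_ctrl, X_mdd])
        y = np.array([0] * 20 + [1] * 20)    # 0 = control, 1 = MDD

        # Leave-one-out accuracy of the linear discriminant classifier.
        scores = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=LeaveOneOut())
        print(f"leave-one-out accuracy: {scores.mean():.2f}")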

  20. Global Sourcing of Heterogeneous Firms: Theory and Evidence

    DEFF Research Database (Denmark)

    Kohler, Wilhelm; Smolka, Marcel

    2015-01-01

    The share of international trade within firm boundaries varies greatly across countries. This column presents new evidence on how the productivity of a firm affects the choice between vertical integration and outsourcing, as well as between foreign and domestic sourcing. The productivity effects...