General distributions in process algebra
Katoen, Joost P.; d'Argenio, P.R.; Brinksma, Hendrik; Hermanns, H.
2001-01-01
This paper is an informal tutorial on stochastic process algebras, i.e., process calculi where action occurrences may be subject to a delay that is governed by a (mostly continuous) random variable. Whereas most stochastic process algebras consider delays determined by negative exponential
Experiments to Distribute Map Generalization Processes
Berli, Justin; Touya, Guillaume; Lokhat, Imran; Regnauld, Nicolas
2018-05-01
Automatic map generalization requires computationally intensive processes that are often unable to deal with large datasets. Distributing the generalization process is the only way to make it scalable and usable in practice. But map generalization is a highly contextual process: the surroundings of a map feature need to be known in order to generalize the feature, which is a problem because distribution may partition the dataset and parallelize the processing of each part. This paper proposes experiments to evaluate past propositions for distributing map generalization and to identify the main remaining issues. The past propositions to distribute map generalization are first discussed, and then the experiment hypotheses and apparatus are described. The experiments confirmed that regular partitioning was the quickest strategy, but also the least effective at taking context into account. Geographical partitioning, though less effective for now, is quite promising regarding the quality of the results, as it better integrates the geographical context.
General Notes on Processes and Their Spectra
Directory of Open Access Journals (Sweden)
Gustav Cepciansky
2012-01-01
The frequency spectrum is one of the main characteristics of a process. The aim of the paper is to show the coherence between a process and its spectrum, and how the behaviour and properties of a process can be deduced from its spectrum. Processes are categorized, and general principles for calculating and recognizing their spectra are given. The main stress is put on power spectra of electric and optic signals, as these signals are themselves a kind of process. Such spectra can be directly measured, observed and examined by means of spectrum analyzers, and they are very important characteristics that cannot be omitted in transmission techniques in telecommunication technologies. Further, the paper also deals with non-electric processes, mainly processes and spectra in mass servicing, and how these spectra can be utilized in practice.
A generalized integral fluctuation theorem for general jump processes
International Nuclear Information System (INIS)
Liu Fei; Ouyang Zhongcan; Luo Yupin; Huang Mingchang
2009-01-01
Using the Feynman-Kac and Cameron-Martin-Girsanov formulae, we obtain a generalized integral fluctuation theorem (GIFT) for discrete jump processes by constructing a time-invariable inner product. The existing discrete IFTs can be derived as its specific cases. A connection between our approach and the conventional time-reversal method is also established. Unlike the latter approach that has been extensively employed in the existing literature, our approach can naturally bring out the definition of a time reversal of a Markovian stochastic system. Additionally, we find that the robust GIFT usually does not result in a detailed fluctuation theorem. (fast track communication)
OVPD-processed OLED for general lighting
Bösing, Manuel
2012-01-01
Due to continuous advancements in materials for organic light emitting diodes (OLEDs), a new field of application is currently opening up for OLED technology: general lighting. A significant reduction of OLED production cost might be achieved by employing organic vapor phase deposition (OVPD). OVPD is a novel process for depositing organic thin films from the gas phase. In contrast to the well-established process of vacuum thermal evaporation (VTE), OVPD allows much higher deposition rate...
General simulation algorithm for autocorrelated binary processes.
Serinaldi, Francesco; Lombardo, Federico
2017-02-01
The apparent ubiquity of binary random processes in physics and many other fields has attracted considerable attention from the modeling community. However, generation of binary sequences with prescribed autocorrelation is a challenging task owing to the discrete nature of the marginal distributions, which makes the application of classical spectral techniques problematic. We show that such methods can effectively be used if we focus on the parent continuous process of beta distributed transition probabilities rather than on the target binary process. This change of paradigm results in a simulation procedure effectively embedding a spectrum-based iterative amplitude-adjusted Fourier transform method devised for continuous processes. The proposed algorithm is fully general, requires minimal assumptions, and can easily simulate binary signals with power-law and exponentially decaying autocorrelation functions corresponding, for instance, to Hurst-Kolmogorov and Markov processes. An application to rainfall intermittency shows that the proposed algorithm can also simulate surrogate data preserving the empirical autocorrelation.
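The core idea, simulating a continuous parent process and then drawing the binary sequence from it, can be sketched in a few lines. This is only a hedged toy illustration: it uses an AR(1) latent signal and a logistic squashing as stand-ins for the paper's beta-distributed transition probabilities and its iterative amplitude-adjusted Fourier transform.

```python
import numpy as np

rng = np.random.default_rng(0)

def correlated_binary(n, rho=0.9):
    """Binary sequence that inherits autocorrelation from a continuous parent.

    Toy sketch: the latent AR(1) signal and the logistic squashing are
    illustrative assumptions, not the algorithm proposed in the paper.
    """
    z = np.empty(n)
    z[0] = rng.standard_normal()
    for t in range(1, n):
        z[t] = rho * z[t - 1] + np.sqrt(1.0 - rho**2) * rng.standard_normal()
    p = 1.0 / (1.0 + np.exp(-z))              # transition probabilities in (0, 1)
    return (rng.random(n) < p).astype(int)    # conditionally independent draws

x = correlated_binary(20_000)
lag1 = np.corrcoef(x[:-1], x[1:])[0, 1]
print(x.mean(), lag1)  # the binary series inherits positive lag-1 correlation
```

Matching a prescribed autocorrelation function, as the paper does, would require iterating on the parent spectrum rather than fixing rho in advance.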
Quantum thermodynamics of general quantum processes.
Binder, Felix; Vinjanampathy, Sai; Modi, Kavan; Goold, John
2015-03-01
Accurately describing work extraction from a quantum system is a central objective for the extension of thermodynamics to individual quantum systems. The concepts of work and heat are surprisingly subtle when generalizations are made to arbitrary quantum states. We formulate an operational thermodynamics suitable for application to an open quantum system undergoing quantum evolution under a general quantum process by which we mean a completely positive and trace-preserving map. We derive an operational first law of thermodynamics for such processes and show consistency with the second law. We show that heat, from the first law, is positive when the input state of the map majorizes the output state. Moreover, the change in entropy is also positive for the same majorization condition. This makes a strong connection between the two operational laws of thermodynamics.
A general software reliability process simulation technique
Tausworthe, Robert C.
1991-01-01
The structure and rationale of the generalized software reliability process, together with the design and implementation of a computer program that simulates this process are described. Given assumed parameters of a particular project, the users of this program are able to generate simulated status timelines of work products, numbers of injected anomalies, and the progress of testing, fault isolation, repair, validation, and retest. Such timelines are useful in comparison with actual timeline data, for validating the project input parameters, and for providing data for researchers in reliability prediction modeling.
Noise suppression via generalized-Markovian processes
Marshall, Jeffrey; Campos Venuti, Lorenzo; Zanardi, Paolo
2017-11-01
It is by now well established that noise itself can be useful for performing quantum information processing tasks. We present results which show how one can effectively reduce the error rate associated with a noisy quantum channel by counteracting its detrimental effects with another form of noise. In particular, we consider the effect of adding on top of a purely Markovian (Lindblad) dynamics, a more general form of dissipation, which we refer to as generalized-Markovian noise. This noise has an associated memory kernel and the resulting dynamics are described by an integrodifferential equation. The overall dynamics are characterized by decay rates which depend not only on the original dissipative time scales but also on the new integral kernel. We find that one can engineer this kernel such that the overall rate of decay is lowered by the addition of this noise term. We illustrate this technique for the case where the bare noise is described by a dephasing Pauli channel. We analytically solve this model and show that one can effectively double (or even triple) the length of the channel, while achieving the same fidelity, entanglement, and error threshold. We numerically verify this scheme can also be used to protect against thermal Markovian noise (at nonzero temperature), which models spontaneous emission and excitation processes. A physical interpretation of this scheme is discussed, whereby the added generalized-Markovian noise causes the system to become periodically decoupled from the background Markovian noise.
Generalized epidemic process on modular networks.
Chung, Kihong; Baek, Yongjoo; Kim, Daniel; Ha, Meesoon; Jeong, Hawoong
2014-05-01
Social reinforcement and modular structure are two salient features observed in the spreading of behavior through social contacts. In order to investigate the interplay between these two features, we study the generalized epidemic process on modular networks with equal-sized finite communities and adjustable modularity. Using the analytical approach originally applied to clique-based random networks, we show that the system exhibits a bond-percolation type continuous phase transition for weak social reinforcement, whereas a discontinuous phase transition occurs for sufficiently strong social reinforcement. Our findings are numerically verified using the finite-size scaling analysis and the crossings of the bimodality coefficient.
General programmed system for physiological signal processing
Energy Technology Data Exchange (ETDEWEB)
Tournier, E; Monge, J; Magnet, C; Sonrel, C
1975-01-01
Improvements made to the general programmed signal acquisition and processing system, Plurimat S, are described, the aim being to obtain a less specialized system adapted to the biological and medical field. In this modified system the acquisition will be simplified. The standard processings offered will be integrated to a real advanced language which will enable the user to create his own processings, the loss of speed being compensated by a greater flexibility and universality. The observation screen will be large and the quality of the recording very good so that a large signal fraction may be displayed. The data will be easily indexed and filed for subsequent display and processing. This system will be used for two kinds of task: it can either be specialized, as an integral part of measurement and diagnostic preparation equipment used routinely in clinical work (e.g. vectocardiographic examination), or its versatility can be used for studies of limited duration to gain information in a given field or to study new diagnosis or treatment methods.
Negative ion formation processes: A general review
International Nuclear Information System (INIS)
Alton, G.D.
1990-01-01
The principal negative ion formation processes will be briefly reviewed. Primary emphasis will be placed on the more efficient and universal processes of charge transfer and secondary ion formation through non-thermodynamic surface ionization. 86 refs., 20 figs
General Process for Business Idea Generation
Halinen, Anu
2017-01-01
This thesis presents a process for generating ideas with the intent to propagate new business within a micro-company. Utilizing this newly proposed process, generation of new ideas will be initiated allowing for subsequent business plans to be implemented to grow the existing customer base. Cloudberrywind is a family-owned and family-operated micro company in the Finnish region that offers information technology consulting services and support for project management to improve company efficie...
Renewal processes based on generalized Mittag-Leffler waiting times
Cahoy, Dexter O.; Polito, Federico
2013-03-01
The fractional Poisson process has recently attracted experts from several fields of study. Its natural generalization of the ordinary Poisson process made the model more appealing for real-world applications. In this paper, we generalized the standard and fractional Poisson processes through the waiting time distribution, and showed their relations to an integral operator with a generalized Mittag-Leffler function in the kernel. The waiting times of the proposed renewal processes have the generalized Mittag-Leffler and stretched-squashed Mittag-Leffler distributions. Note that the generalizations naturally provide greater flexibility in modeling real-life renewal processes. Algorithms to simulate sample paths and to estimate the model parameters are derived. Note also that these procedures are necessary to make these models more usable in practice. State probabilities and other qualitative or quantitative features of the models are also discussed.
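A sketch of how such a renewal process can be simulated. The inversion-style formula below for Mittag-Leffler waiting times is the one commonly used in the fractional Poisson literature (attributed to Kozubowski and to Cahoy and co-workers); treat the exact parametrization as an assumption.

```python
import numpy as np

rng = np.random.default_rng(1)

def ml_waiting_times(n, alpha, scale=1.0):
    """Draw n Mittag-Leffler(alpha) waiting times, 0 < alpha <= 1.

    For alpha = 1 the formula reduces to exponential waiting times,
    recovering the ordinary Poisson process.
    """
    u = rng.random(n)
    if alpha == 1.0:
        return -scale * np.log(u)
    v = rng.random(n)
    factor = np.sin(alpha * np.pi) / np.tan(alpha * np.pi * v) - np.cos(alpha * np.pi)
    return -scale * np.log(u) * factor ** (1.0 / alpha)

# Sample path of the renewal process: event times are cumulated waiting times
events = np.cumsum(ml_waiting_times(1_000, alpha=0.8))
print(events[:3])
```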
A general model for membrane-based separation processes
DEFF Research Database (Denmark)
Soni, Vipasha; Abildskov, Jens; Jonsson, Gunnar Eigil
2009-01-01
behaviour will play an important role. In this paper, modelling of membrane-based processes for separation of gas and liquid mixtures is considered. Two general models, one for membrane-based liquid separation processes (with phase change) and another for membrane-based gas separation, are presented. The separation processes covered are: membrane-based gas separation processes, pervaporation and various types of membrane distillation processes. The specific model for each type of membrane-based process is generated from the two general models by applying the specific system descriptions and the corresponding ...
20 CFR 405.701 - Expedited appeals process-general.
2010-04-01
20 CFR 405.701: Expedited appeals process-general. Employees' Benefits, Social Security Administration, Administrative Review Process for Adjudicating Initial Disability Claims, Expedited Appeals Process for Constitutional Issues, § 405.701 Expedited...
Learning Theory Estimates with Observations from General Stationary Stochastic Processes.
Hang, Hanyuan; Feng, Yunlong; Steinwart, Ingo; Suykens, Johan A K
2016-12-01
This letter investigates the supervised learning problem with observations drawn from certain general stationary stochastic processes. Here by general, we mean that many stationary stochastic processes can be included. We show that when the stochastic processes satisfy a generalized Bernstein-type inequality, a unified treatment of learning schemes with various mixing processes can be conducted and a sharp oracle inequality for generic regularized empirical risk minimization schemes can be established. The obtained oracle inequality is then applied to derive convergence rates for several learning schemes such as empirical risk minimization (ERM), least squares support vector machines (LS-SVMs) using given generic kernels, and SVMs using Gaussian kernels for both least squares and quantile regression. It turns out that for independent and identically distributed (i.i.d.) processes, our learning rates for ERM recover the optimal rates. For non-i.i.d. processes, including geometrically [Formula: see text]-mixing Markov processes, geometrically [Formula: see text]-mixing processes with restricted decay, [Formula: see text]-mixing processes, and (time-reversed) geometrically [Formula: see text]-mixing processes, our learning rates for SVMs with Gaussian kernels match, up to some arbitrarily small extra term in the exponent, the optimal rates. For the remaining cases, our rates are at least close to the optimal rates. As a by-product, the assumed generalized Bernstein-type inequality also provides an interpretation of the so-called effective number of observations for various mixing processes.
Visual Processing in Generally Gifted and Mathematically Excelling Adolescents
Paz-Baruch, Nurit; Leikin, Roza; Leikin, Mark
2016-01-01
Little empirical data are available concerning the cognitive abilities of gifted individuals in general and especially those who excel in mathematics. We examined visual processing abilities distinguishing between general giftedness (G) and excellence in mathematics (EM). The research population consisted of 190 students from four groups of 10th-…
A general conservative extension theorem in process algebras with inequalities
d'Argenio, P.R.; Verhoef, Chris
1997-01-01
We prove a general conservative extension theorem for transition system based process theories with easy-to-check and reasonable conditions. The core of this result is another general theorem which gives sufficient conditions for a system of operational rules and an extension of it in order to
A General Accelerated Degradation Model Based on the Wiener Process.
Liu, Le; Li, Xiaoyang; Sun, Fuqiang; Wang, Ning
2016-12-06
Accelerated degradation testing (ADT) is an efficient tool to conduct material service reliability and safety evaluations by analyzing performance degradation data. Traditional stochastic process models are mainly for linear or linearization degradation paths. However, those methods are not applicable for the situations where the degradation processes cannot be linearized. Hence, in this paper, a general ADT model based on the Wiener process is proposed to solve the problem for accelerated degradation data analysis. The general model can consider the unit-to-unit variation and temporal variation of the degradation process, and is suitable for both linear and nonlinear ADT analyses with single or multiple acceleration variables. The statistical inference is given to estimate the unknown parameters in both constant stress and step stress ADT. The simulation example and two real applications demonstrate that the proposed method can yield reliable lifetime evaluation results compared with the existing linear and time-scale transformation Wiener processes in both linear and nonlinear ADT analyses.
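A minimal sketch of the kind of model the paper builds on: a Wiener process with drift under a nonlinear time transformation, X(t) = mu*L(t) + sigma*B(L(t)). The power-law time scale L(t) = t**b used here is an assumed illustrative choice, not the paper's fitted model.

```python
import numpy as np

rng = np.random.default_rng(7)

def wiener_degradation(t, mu=1.0, sigma=0.2, b=1.5):
    """Simulate X(t) = mu*L(t) + sigma*B(L(t)) on the time grid t,
    with the illustrative nonlinear time scale L(t) = t**b."""
    lam = t ** b
    dlam = np.diff(lam, prepend=0.0)
    # Independent Gaussian increments of the time-changed Wiener process
    inc = mu * dlam + sigma * np.sqrt(dlam) * rng.standard_normal(len(t))
    return np.cumsum(inc)

t = np.linspace(0.0, 5.0, 500)
x = wiener_degradation(t)
print(x[-1])  # roughly mu * 5**1.5, plus Brownian fluctuation
```

Under this model the observed increments are independent Gaussians, so the parameter estimation for constant-stress or step-stress ADT described in the paper amounts to maximizing an explicit Gaussian likelihood.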
Generalized Poisson processes in quantum mechanics and field theory
International Nuclear Information System (INIS)
Combe, P.; Rodriguez, R.; Centre National de la Recherche Scientifique, 13 - Marseille; Hoegh-Krohn, R.; Centre National de la Recherche Scientifique, 13 - Marseille; Sirugue, M.; Sirugue-Collin, M.; Centre National de la Recherche Scientifique, 13 - Marseille
1981-01-01
In section 2 we describe more carefully the generalized Poisson processes, giving a realization of the underlying probability space, and we characterize these processes by their characteristic functionals. Section 3 is devoted to the proof of the previous formula for quantum mechanical systems, with possibly velocity dependent potentials and in section 4 we give an application of the previous theory to some relativistic Bose field models. (orig.)
General Template for the FMEA Applications in Primary Food Processing.
Özilgen, Sibel; Özilgen, Mustafa
Data on the hazards involved in the primary steps of processing cereals, fruit and vegetables, milk and milk products, meat and meat products, and fats and oils are compiled with a wide-ranging literature survey. After determining the common factors from these data, a general FMEA template is offered, and its use is explained with a case study on pasteurized milk production.
Process error rates in general research applications to the Human ...
African Journals Online (AJOL)
Objective. To examine process error rates in applications for ethics clearance of health research. Methods. Minutes of 586 general research applications made to a human health research ethics committee (HREC) from April 2008 to March 2009 were examined. Rates of approval were calculated and reasons for requiring ...
Weldability of general purpose heat source new-process iridium
International Nuclear Information System (INIS)
Kanne, W.R.
1987-01-01
Weldability tests on General Purpose Heat Source (GPHS) iridium capsules showed that a new iridium fabrication process reduced susceptibility to underbead cracking. Seventeen capsules were welded (a total of 255 welds) in four categories and the number of cracks in each weld was measured
Generalized Ornstein-Uhlenbeck processes and associated self-similar processes
International Nuclear Information System (INIS)
Lim, S C; Muniandy, S V
2003-01-01
We consider three types of generalized Ornstein-Uhlenbeck processes: the stationary process obtained from the Lamperti transformation of fractional Brownian motion, the process with stretched exponential covariance, and the process obtained from the solution of the fractional Langevin equation. These stationary Gaussian processes have many common properties; for example, their local covariances share a similar structure and they exhibit identical spectral densities in the large-frequency limit. In addition, the generalized Ornstein-Uhlenbeck processes can be shown to be locally stationary representations of fractional Brownian motion. Two new self-similar Gaussian processes, in addition to fractional Brownian motion, are obtained by applying the (inverse) Lamperti transformation to the generalized Ornstein-Uhlenbeck processes. We study some of the properties of these self-similar processes, such as long-range dependence. We give a simulation of their sample paths based on numerical Karhunen-Loève expansion.
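As a concrete anchor for the classical case, the ordinary Ornstein-Uhlenbeck process admits an exact discretization; this is standard material, not the paper's generalizations.

```python
import numpy as np

rng = np.random.default_rng(3)

def ou_path(n, dt=0.01, theta=1.0, sigma=1.0, x0=0.0):
    """Exact discretization of dX = -theta*X dt + sigma dB.
    Stationary covariance: (sigma**2 / (2*theta)) * exp(-theta*|t-s|)."""
    a = np.exp(-theta * dt)                            # one-step decay factor
    s = sigma * np.sqrt((1.0 - a**2) / (2.0 * theta))  # one-step noise scale
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        x[i] = a * x[i - 1] + s * rng.standard_normal()
    return x

x = ou_path(100_000)
print(x.var())  # approaches sigma**2 / (2*theta) = 0.5
```

The Lamperti transformation mentioned in the abstract maps such a stationary process X to a self-similar one via Y(t) = t**H * X(ln t), which is how the fractional-Brownian-motion connection arises.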
Domain-General Factors Influencing Numerical and Arithmetic Processing
Directory of Open Access Journals (Sweden)
André Knops
2017-12-01
This special issue contains 18 articles that address the question of how numerical processes interact with domain-general factors. We start the editorial with a discussion of how to define domain-general versus domain-specific factors, and then discuss the contributions to this special issue, grouped into two core numerical domains that are subject to domain-general influences (see Figure 1). The first group of contributions addresses the question of how numbers interact with spatial factors. The second group of contributions is concerned with factors that determine and predict arithmetic understanding, performance and development. This special issue shows that domain-general (Table 1a) as well as domain-specific (Table 1b) abilities influence numerical and arithmetic performance at virtually all levels, and makes it clear that for the field of numerical cognition a sole focus on one or several domain-specific factors, like the approximate number system or spatial-numerical associations, is not sufficient. Vice versa, in most studies that included domain-general and domain-specific variables, domain-specific numerical variables predicted arithmetic performance above and beyond domain-general variables. Therefore, a sole focus on domain-general aspects such as, for example, working memory to explain, predict and foster arithmetic learning is also not sufficient. Based on the articles in this special issue we conclude that both domain-general and domain-specific factors contribute to numerical cognition. But the how, why and when of their contribution still needs to be better understood. We hope that this special issue may be helpful to readers in constraining future theory and model building about the interplay of domain-specific and domain-general factors.
Integrable Floquet dynamics, generalized exclusion processes and "fused" matrix ansatz
Vanicat, Matthieu
2018-04-01
We present a general method for constructing integrable stochastic processes, with two-step discrete time Floquet dynamics, from the transfer matrix formalism. The models can be interpreted as a discrete time parallel update. The method can be applied for both periodic and open boundary conditions. We also show how the stationary distribution can be built as a matrix product state. As an illustration we construct parallel discrete time dynamics associated with the R-matrix of the SSEP and of the ASEP, and provide the associated stationary distributions in matrix product form. We use this general framework to introduce new integrable generalized exclusion processes, where a fixed number of particles is allowed on each lattice site, in contrast to the (single-particle) exclusion process models. They are constructed using the fusion procedure of R-matrices (and K-matrices for open boundary conditions) for the SSEP and ASEP. We develop a new method, which we call the "fused" matrix ansatz, to build the stationary distribution explicitly in matrix product form. We use this algebraic structure to compute physical observables such as correlation functions and the mean particle current.
MOTRIMS as a generalized probe of AMO processes
International Nuclear Information System (INIS)
Bredy, R.; Nguyen, H.; Camp, H.; Flechard, X.; De Paola, B.D.
2003-01-01
Magneto-optical trap recoil ion momentum spectroscopy (MOTRIMS) is one of the newest offshoots of the generalized TRIMS approach to ion-atom collisions. By using lasers instead of the more usual supersonic expansion to cool the target, MOTRIMS has demonstrated two distinct advantages over conventional TRIMS. The first is better resolution, now limited by detectors instead of target temperature. The second is its suitability for use in the study of laser-excited targets. Here we present a third advantage: the use of MOTRIMS as a general-purpose probe of AMO processes in cold atomic clouds of atoms and molecules. Specifically, the projectile ion beam can be used as a probe of processes as diverse as target dressing by femtosecond optical pulses, photo-association (laser-assisted cold collisions), photo-ionization, and electromagnetically induced transparency. We present data for the processes we have investigated, and speculations on what we expect to see for the processes we plan to investigate in the future.
Markov Jump Processes Approximating a Non-Symmetric Generalized Diffusion
International Nuclear Information System (INIS)
Limić, Nedžad
2011-01-01
Consider a non-symmetric generalized diffusion X(·) in ℝ^d determined by the differential operator A(x) = -Σ_{ij} ∂_i a_{ij}(x) ∂_j + Σ_i b_i(x) ∂_i. In this paper the diffusion process is approximated by Markov jump processes X_n(·), on homogeneous and isotropic grids G_n ⊂ ℝ^d, which converge in distribution in the Skorokhod space D([0,∞), ℝ^d) to the diffusion X(·). The generators of X_n(·) are constructed explicitly. Due to the homogeneity and isotropy of the grids, the proposed method for d ≥ 3 can be applied to processes for which the diffusion tensor {a_{ij}(x)}, i,j = 1,…,d, fulfills an additional condition. The proposed construction offers a simple method for simulation of sample paths of a non-symmetric generalized diffusion. Simulations are carried out in terms of the jump processes X_n(·). For piecewise constant functions a_{ij} on ℝ^d and piecewise continuous functions a_{ij} on ℝ² the construction and principal algorithm are described, enabling an easy implementation into a computer code.
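In one dimension the flavor of such a construction can be sketched with standard finite-difference jump rates; this is an illustrative reduction, not the paper's d ≥ 3 scheme. The generator a*d²/dx² + b*d/dx is approximated on the grid h·Z by nearest-neighbour rates r± = a/h² ± b/(2h).

```python
import numpy as np

rng = np.random.default_rng(5)

def jump_path_endpoint(T, h=0.1, a=1.0, b=0.5, x0=0.0):
    """Markov jump process on the grid h*Z approximating dX = b dt + sqrt(2a) dB.
    Nearest-neighbour rates discretize the generator a*d2/dx2 + b*d/dx;
    positivity of the down rate requires h < 2a/|b|."""
    r_up = a / h**2 + b / (2.0 * h)
    r_dn = a / h**2 - b / (2.0 * h)
    total = r_up + r_dn
    x, t = x0, 0.0
    while True:
        t += rng.exponential(1.0 / total)          # exponential holding time
        if t >= T:
            return x                               # position at time T
        x += h if rng.random() < r_up / total else -h

end = np.array([jump_path_endpoint(1.0) for _ in range(2000)])
print(end.mean(), end.var())  # close to b*T = 0.5 and 2*a*T = 2.0
```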
Real-time radar signal processing using GPGPU (general-purpose graphic processing unit)
Kong, Fanxing; Zhang, Yan Rockee; Cai, Jingxiao; Palmer, Robert D.
2016-05-01
This study introduces a practical approach to developing a real-time signal processing chain for a general phased array radar on NVIDIA GPUs (Graphics Processing Units) using CUDA (Compute Unified Device Architecture) libraries such as cuBLAS and cuFFT, which are adopted from open source libraries and optimized for NVIDIA GPUs. The processed results are rigorously verified against those from CPUs. Performance, benchmarked as computation time for various input data cube sizes, is compared across GPUs and CPUs. The analysis demonstrates that real-time GPGPU (general-purpose GPU) processing of array radar data is possible with relatively low-cost commercial GPUs.
Lévy processes on a generalized fractal comb
Sandev, Trifce; Iomin, Alexander; Méndez, Vicenç
2016-09-01
Comb geometry, constituted of a backbone and fingers, is one of the simplest paradigms of a two-dimensional structure where anomalous diffusion can be realized in the framework of Markov processes. However, the intrinsic properties of the structure can destroy this Markovian transport. These effects can be described by memory and spatial kernels. In particular, the fractal structure of the fingers, which is controlled by the spatial kernel in both the real and the Fourier spaces, leads to Lévy processes (Lévy flights) and superdiffusion. This generalization of fractional diffusion is described by the Riesz space fractional derivative. In the framework of this generalized fractal comb model, Lévy processes are considered, and exact solutions for the probability distribution functions are obtained in terms of the Fox H-function for a variety of memory kernels, and the rate of the superdiffusive spreading is studied by calculating the fractional moments. For a special form of the memory kernels, we also observe a competition between long rests and long jumps. Finally, we consider the fractal structure of the fingers controlled by a Weierstrass function, which leads to a power-law kernel in Fourier space. This is a special case in which the second moment exists for superdiffusion in this competition between long rests and long jumps.
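For the free-space ingredient of the model, Lévy flights themselves are easy to generate. The sketch below draws symmetric alpha-stable steps via the Chambers-Mallows-Stuck method in free space only; reproducing the paper's results would additionally require the backbone-and-fingers constraint of the comb.

```python
import numpy as np

rng = np.random.default_rng(9)

def stable_steps(n, alpha=1.5):
    """Symmetric alpha-stable increments (Chambers-Mallows-Stuck method).
    For alpha = 2 this reduces to a Gaussian with variance 2."""
    u = (rng.random(n) - 0.5) * np.pi        # uniform angle on (-pi/2, pi/2)
    w = rng.exponential(1.0, n)              # unit-mean exponential variates
    return (np.sin(alpha * u) / np.cos(u) ** (1.0 / alpha)
            * (np.cos(u - alpha * u) / w) ** ((1.0 - alpha) / alpha))

path = np.cumsum(stable_steps(10_000))       # a one-dimensional Levy flight
print(path[-1])
```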
The role of culture in the general practice consultation process.
Ali, Nasreen; Atkin, Karl; Neal, Richard
2006-11-01
In this paper, we will examine the importance of culture and ethnicity in the general practice consultation process. Good communication is associated with positive health outcomes. We will, by presenting qualitative material from an empirical study, examine the way in which communication within the context of a general practitioner (GP) consultation may be affected by ethnicity and cultural factors. The aim of the study was to provide a detailed understanding of the ways in which white and South Asian patients communicate with white GPs and to explore any similarities and differences in communication. This paper reports on South Asian and white patients' explanations of recent videotaped consultations with their GP. We specifically focus on the ways in which issues of ethnic identity impacted upon the GP consultation process, by exploring how our sample of predominantly white GPs interacted with their South Asian patients and the extent to which the GP listened to the patients' needs, gave patients information, engaged in social conversation and showed friendliness. We then go on to examine patients' suggestions on improvements (if any) to the consultation. We conclude by showing how a non-essentialist understanding of culture helps to comprehend the consultation process when the patients are from Great Britain's ethnicised communities. Our findings, however, raise generic issues of relevance to all multi-racial and multi-ethnic societies.
GENERAL ISSUES CONSIDERING BRAND EQUITY WITHIN THE NATION BRANDING PROCESS
Directory of Open Access Journals (Sweden)
Denisa, COTÎRLEA
2014-11-01
Full Text Available The present paper was written to provide an overview of the intangible values that actively contribute to brand capital formation within the nation branding process; through this article, the author tries to emphasize the differences between brand capital and brand equity in the context of the nation branding process, which has become a widely approached subject in both the national and international literature. The evolution of brand capital and brand equity is also traced, in order to identify and explain their components and their role, by highlighting the entire process of their evolution as a sequence of steps. The results of this paper focus on the identification of a structured flowchart through which the process of nation branding, and brand capital itself, can be perceived as holistic, integrative and inter-correlated concepts that are easily understood. The methodology used to write the present article comprises the methods and techniques appropriate for collecting and processing empirical data and information, namely observing, sorting, correlating, categorizing, comparing and analyzing data, so that the theoretical elements addressed could be grounded; at the center of the qualitative thematic research addressed in the present article lie general elements of Romania's image and identity promotion.
A Poisson process approximation for generalized K-S confidence regions
Arsham, H.; Miller, D. R.
1982-01-01
One-sided confidence regions for continuous cumulative distribution functions are constructed using empirical cumulative distribution functions and the generalized Kolmogorov-Smirnov distance. The width of such regions becomes narrower in the right or left tail of the distribution. To avoid tedious computation of confidence levels and critical values, an approximation based on the Poisson process is introduced. This approximation provides a conservative confidence region; moreover, the approximation error decreases monotonically to 0 as sample size increases. Critical values necessary for implementation are given. Applications are made to the areas of risk analysis, investment modeling, reliability assessment, and analysis of fault-tolerant systems.
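To make the construction concrete, the sketch below builds a one-sided lower confidence band from an empirical CDF, with a weight function that narrows the band in the tail where the weight grows, in the spirit of the generalized Kolmogorov-Smirnov distance. The weight function and the critical value c are placeholders, not the tabulated values from the paper.

```python
import random

def ecdf(sample):
    # Empirical CDF Fn(t) = (# observations <= t) / n.
    xs = sorted(sample)
    n = len(xs)
    return lambda t: sum(1 for x in xs if x <= t) / n

def lower_band(sample, c, weight=lambda u: 1.0):
    # One-sided lower band L(t) = max(0, Fn(t) - c / weight(Fn(t))).
    # A weight that increases with Fn makes the band narrower in the
    # right tail, as the abstract describes.
    Fn = ecdf(sample)
    def L(t):
        u = Fn(t)
        return max(0.0, u - c / weight(u))
    return L

rng = random.Random(0)
sample = [rng.random() for _ in range(200)]
# c = 0.1 is an illustrative placeholder, not a computed critical value.
L = lower_band(sample, 0.1, weight=lambda u: 1.0 + u)
band_value = L(0.5)
print(0.0 <= band_value <= 1.0)
```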
Information in general medical practices: the information processing model.
Crowe, Sarah; Tully, Mary P; Cantrill, Judith A
2010-04-01
The need for effective communication and handling of secondary care information in general practices is paramount. To explore practice processes on receiving secondary care correspondence in a way that integrates the information needs and perceptions of practice staff both clinical and administrative. Qualitative study using semi-structured interviews with a wide range of practice staff (n = 36) in nine practices in the Northwest of England. Analysis was based on the framework approach using N-Vivo software and involved transcription, familiarization, coding, charting, mapping and interpretation. The 'information processing model' was developed to describe the six stages involved in practice processing of secondary care information. These included the amendment or updating of practice records whilst simultaneously or separately actioning secondary care recommendations, using either a 'one-step' or 'two-step' approach, respectively. Many factors were found to influence each stage and impact on the continuum of patient care. The primary purpose of processing secondary care information is to support patient care; this study raises the profile of information flow and usage within practices as an issue requiring further consideration.
Generalized Hofmann quantum process fidelity bounds for quantum filters
Sedlák, Michal; Fiurášek, Jaromír
2016-04-01
We propose and investigate bounds on the quantum process fidelity of quantum filters, i.e., probabilistic quantum operations represented by a single Kraus operator K . These bounds generalize the Hofmann bounds on the quantum process fidelity of unitary operations [H. F. Hofmann, Phys. Rev. Lett. 94, 160504 (2005), 10.1103/PhysRevLett.94.160504] and are based on probing the quantum filter with pure states forming two mutually unbiased bases. Determination of these bounds therefore requires far fewer measurements than full quantum process tomography. We find that it is particularly suitable to construct one of the probe bases from the right eigenstates of K , because in this case the bounds are tight in the sense that if the actual filter coincides with the ideal one, then both the lower and the upper bounds are equal to 1. We theoretically investigate the application of these bounds to a two-qubit optical quantum filter formed by the interference of two photons on a partially polarizing beam splitter. For an experimentally convenient choice of factorized input states and measurements we study the tightness of the bounds. We show that more stringent bounds can be obtained by more sophisticated processing of the data using convex optimization and we compare our methods for different choices of the input probe states.
Use of general purpose graphics processing units with MODFLOW
Hughes, Joseph D.; White, Jeremy T.
2013-01-01
To evaluate the use of general-purpose graphics processing units (GPGPUs) to improve the performance of MODFLOW, an unstructured preconditioned conjugate gradient (UPCG) solver has been developed. The UPCG solver uses a compressed sparse row storage scheme and includes Jacobi, zero fill-in incomplete, and modified-incomplete lower-upper (LU) factorization, and generalized least-squares polynomial preconditioners. The UPCG solver also includes options for sequential and parallel solution on the central processing unit (CPU) using OpenMP. For simulations utilizing the GPGPU, all basic linear algebra operations are performed on the GPGPU; memory copies between the CPU and GPGPU occur prior to the first iteration of the UPCG solver and after satisfying head and flow criteria or exceeding a maximum number of iterations. The efficiency of the UPCG solver for GPGPU and CPU solutions is benchmarked using simulations of a synthetic, heterogeneous unconfined aquifer with tens of thousands to millions of active grid cells. Testing indicates GPGPU speedups on the order of 2 to 8, relative to the standard MODFLOW preconditioned conjugate gradient (PCG) solver, can be achieved when (1) memory copies between the CPU and GPGPU are optimized, (2) the percentage of time performing memory copies between the CPU and GPGPU is small relative to the calculation time, (3) high-performance GPGPU cards are utilized, and (4) CPU-GPGPU combinations are used to execute sequential operations that are difficult to parallelize. Furthermore, UPCG solver testing indicates GPGPU speedups exceed parallel CPU speedups achieved using OpenMP on multicore CPUs for preconditioners that can be easily parallelized.
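The ingredients named above (compressed sparse row storage, a Jacobi preconditioner, conjugate gradients) can be sketched as follows. This is generic textbook PCG, not the UPCG solver itself; the test system is a small 1-D Laplacian chosen for illustration.

```python
def csr_matvec(data, indices, indptr, x):
    # y = A @ x with A in compressed sparse row (CSR) storage.
    y = [0.0] * (len(indptr) - 1)
    for i in range(len(y)):
        for k in range(indptr[i], indptr[i + 1]):
            y[i] += data[k] * x[indices[k]]
    return y

def jacobi_pcg(data, indices, indptr, diag, b, tol=1e-10, maxit=500):
    # Conjugate gradients with a Jacobi (inverse-diagonal) preconditioner,
    # the simplest of the preconditioner options listed for the solver.
    n = len(b)
    x = [0.0] * n
    r = b[:]
    z = [r[i] / diag[i] for i in range(n)]
    p = z[:]
    rz = sum(r[i] * z[i] for i in range(n))
    for _ in range(maxit):
        Ap = csr_matvec(data, indices, indptr, p)
        alpha = rz / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        if sum(ri * ri for ri in r) ** 0.5 < tol:
            break
        z = [r[i] / diag[i] for i in range(n)]
        rz_new = sum(r[i] * z[i] for i in range(n))
        p = [z[i] + (rz_new / rz) * p[i] for i in range(n)]
        rz = rz_new
    return x

# 1-D Laplacian tridiag(-1, 2, -1), n = 4, in CSR form; exact answer is all ones.
data = [2.0, -1.0, -1.0, 2.0, -1.0, -1.0, 2.0, -1.0, -1.0, 2.0]
indices = [0, 1, 0, 1, 2, 1, 2, 3, 2, 3]
indptr = [0, 2, 5, 8, 10]
diag = [2.0, 2.0, 2.0, 2.0]
x = jacobi_pcg(data, indices, indptr, diag, [1.0, 0.0, 0.0, 1.0])
print([round(v, 6) for v in x])  # -> [1.0, 1.0, 1.0, 1.0]
```

The CSR triplet (data, indices, indptr) is also the layout a GPU kernel would traverse; each row's dot product is independent, which is what makes the matrix-vector product easy to parallelize.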
Kolmogorov's refined similarity hypotheses for turbulence and general stochastic processes
International Nuclear Information System (INIS)
Stolovitzky, G.; Sreenivasan, K.R.
1994-01-01
Kolmogorov's refined similarity hypotheses are shown to hold true for a variety of stochastic processes besides high-Reynolds-number turbulent flows, for which they were originally proposed. In particular, just as hypothesized for turbulence, there exists a variable V whose probability density function attains a universal form. Analytical expressions for the probability density function of V are obtained for Brownian motion as well as for the general case of fractional Brownian motion, the latter under some mild assumptions justified a posteriori. The properties of V for the case of antipersistent fractional Brownian motion with a Hurst exponent of 1/3 are similar in many details to those of high-Reynolds-number turbulence in atmospheric boundary layers a few meters above the ground. The one conspicuous difference between turbulence and the antipersistent fractional Brownian motion is that the latter does not possess the required skewness. Broad implications of these results are discussed.
Cortical processes of speech illusions in the general population.
Schepers, E; Bodar, L; van Os, J; Lousberg, R
2016-10-18
There is evidence that experimentally elicited auditory illusions in the general population index risk for psychotic symptoms. As little is known about the underlying cortical mechanisms of auditory illusions, an experiment was conducted to analyze the processing of auditory illusions in a general population sample. In a follow-up design with two measurement moments (baseline and 6 months), participants (n = 83) underwent the White Noise task under simultaneous recording with a 14-lead EEG. An auditory illusion was defined as hearing any speech in a sound fragment containing white noise. A total number of 256 speech illusions (SI) were observed over the two measurements, with a high degree of stability of SI over time. There were 7 main effects of speech illusion on the EEG alpha band, the most significant indicating a decrease in activity at T3 (t = -4.05). Other EEG frequency bands (slow beta, fast beta, gamma, delta, theta) showed no significant associations with SI. SIs are characterized by reduced alpha activity in non-clinical populations. Given the association of SIs with psychosis, follow-up research is required to examine the possibility of reduced alpha activity mediating SIs in high-risk and symptomatic populations.
A general theory for radiative processes in rare earth compounds
International Nuclear Information System (INIS)
Acevedo, R.; Meruane, T.
1998-01-01
The formal theory of radiative processes in centrosymmetric coordination compounds of the Ln X 3+ is a trivalent lanthanide ion and X -1 =Cl -1 , Br -1 ) is put forward based on a symmetry vibronic crystal field-ligand polarisation model. This research considers a truncated basis set for the intermediate states of the central metal ion and have derived general master equations to account for both the overall observed spectral intensities and the measured relative vibronic intensity distributions for parity forbidden but vibronically allowed electronic transitions. In addition, a procedure which includes the closure approximation over the intermediate electronic states is included in order to estimate quantitative crystal field contribution to the total transition dipole moments of various and selected electronic transitions. This formalism is both general and flexible and it may be employed in any electronic excitations involving f N type configurations for the rare earths in centrosymmetric co-ordination compounds in cubic environments and also in doped host crystals belonging to the space group Fm 3m. (author)
Information processing during general anesthesia: Evidence for unconscious memory
A.E. Bonebakker (Annette); B. Bonke (Benno); J. Klein (Jan); G. Wolters (G.); Th. Stijnen (Theo); J. Passchier (Jan); P.M. Merikle (P.)
1996-01-01
Memory for words presented during general anesthesia was studied in two experiments. In Experiment 1, surgical patients (n=80) undergoing elective procedures under general anesthesia were presented shortly before and during surgery with words via headphones. At the earliest convenient
Outcrossings of safe regions by generalized hyperbolic processes
DEFF Research Database (Denmark)
Klüppelberg, Claudia; Rasmussen, Morten Grud
2013-01-01
We present a simple Gaussian mixture model in space and time with generalized hyperbolic marginals. Starting with Rice’s celebrated formula for level upcrossings and outcrossings of safe regions we investigate the consequences of the mean-variance mixture model on such quantities. We obtain...
General framework for adsorption processes on dynamic interfaces
International Nuclear Information System (INIS)
Schmuck, Markus; Kalliadasis, Serafim
2016-01-01
We propose a novel and general variational framework modelling particle adsorption mechanisms on evolving immiscible fluid interfaces. A by-product of our thermodynamic approach is that we systematically obtain analytic adsorption isotherms for given equilibrium interfacial geometries. We validate computationally our mathematical methodology by demonstrating the fundamental properties of decreasing interfacial free energies by increasing interfacial particle densities and of decreasing surface pressure with increasing surface area. (paper)
General birth-death processes: probabilities, inference, and applications
Crawford, Forrest Wrenn
2012-01-01
A birth-death process is a continuous-time Markov chain that counts the number of particles in a system over time. Each particle can give birth to another particle or die, and the rate of births and deaths at any given time depends on how many extant particles there are. Birth-death processes are popular modeling tools in evolution, population biology, genetics, epidemiology, and ecology. Despite the widespread interest in birth-death models, no efficient method exists to evaluate the fini...
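A general birth-death process of the kind described can be simulated exactly with a Gillespie-style algorithm: draw an exponential waiting time from the total event rate, then choose birth or death in proportion to the rates. The linear per-particle rates below are an illustrative choice, not tied to any application in the dissertation.

```python
import random

def simulate_birth_death(lam, mu, x0, t_max, seed=0):
    # Gillespie simulation of a linear birth-death process: each of the
    # x extant particles gives birth at rate lam and dies at rate mu.
    rng = random.Random(seed)
    t, x = 0.0, x0
    trajectory = [(0.0, x0)]
    while t < t_max and x > 0:
        total = (lam + mu) * x          # total event rate for x particles
        t += rng.expovariate(total)     # exponential waiting time
        if t >= t_max:
            break
        x += 1 if rng.random() < lam / (lam + mu) else -1
        trajectory.append((t, x))
    return trajectory

traj = simulate_birth_death(lam=1.0, mu=1.2, x0=10, t_max=5.0)
print(traj[0])  # -> (0.0, 10)
```

Simulation sidesteps the difficulty the abstract points to, evaluating finite-time transition probabilities analytically, at the cost of Monte Carlo error.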
International Nuclear Information System (INIS)
Chidume, C.E.; Ofoedu, E.U.
2007-07-01
In this paper, we introduce a new iteration process and prove that it converges strongly to a common fixed point for a finite family of generalized Lipschitz nonlinear mappings in a real reflexive Banach space E with a uniformly Gâteaux differentiable norm, provided at least one member of the family is pseudo-contractive. We also prove that a slight modification of the process converges to a common zero for a finite family of generalized Lipschitz accretive operators defined on E. Results for nonexpansive families are obtained as easy corollaries. Finally, our new iteration process and our method of proof are of independent interest. (author)
A General Representation Theorem for Integrated Vector Autoregressive Processes
DEFF Research Database (Denmark)
Franchi, Massimo
We study the algebraic structure of an I(d) vector autoregressive process, where d is restricted to be an integer. This is useful to characterize its polynomial cointegrating relations and its moving average representation, that is to prove a version of the Granger representation theorem valid...
Tuned with a tune: Talker normalization via general auditory processes
Directory of Open Access Journals (Sweden)
Erika J C Laing
2012-06-01
Full Text Available Voices have unique acoustic signatures, contributing to the acoustic variability listeners must contend with in perceiving speech, and it has long been proposed that listeners normalize speech perception to information extracted from a talker’s speech. Initial attempts to explain talker normalization relied on extraction of articulatory referents, but recent studies of context-dependent auditory perception suggest that general auditory referents such as the long-term average spectrum (LTAS of a talker’s speech similarly affect speech perception. The present study aimed to differentiate the contributions of articulatory/linguistic versus auditory referents for context-driven talker normalization effects and, more specifically, to identify the specific constraints under which such contexts impact speech perception. Synthesized sentences manipulated to sound like different talkers influenced categorization of a subsequent speech target only when differences in the sentences’ LTAS were in the frequency range of the acoustic cues relevant for the target phonemic contrast. This effect was true both for speech targets preceded by spoken sentence contexts and for targets preceded by nonspeech tone sequences that were LTAS-matched to the spoken sentence contexts. Specific LTAS characteristics, rather than perceived talker, predicted the results suggesting that general auditory mechanisms play an important role in effects considered to be instances of perceptual talker normalization.
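The LTAS referent discussed above is straightforward to compute: average the magnitude spectra of successive frames of a signal. The pure-Python DFT below is for illustration only; the tone frequency, frame length, and signal are arbitrary assumptions standing in for a talker's speech.

```python
import math

def ltas(signal, frame=64):
    # Long-term average spectrum: mean magnitude spectrum over
    # non-overlapping frames, via a direct (O(N^2)) DFT.
    n_bins = frame // 2
    acc = [0.0] * n_bins
    frames = [signal[i:i + frame]
              for i in range(0, len(signal) - frame + 1, frame)]
    for fr in frames:
        for k in range(n_bins):
            re = sum(fr[t] * math.cos(2 * math.pi * k * t / frame)
                     for t in range(frame))
            im = -sum(fr[t] * math.sin(2 * math.pi * k * t / frame)
                      for t in range(frame))
            acc[k] += math.hypot(re, im)
    return [a / len(frames) for a in acc]

# A "talker" whose energy sits in one low band: a pure tone at bin 4 of 64.
sig = [math.sin(2 * math.pi * 4 * t / 64) for t in range(640)]
spec = ltas(sig)
peak_bin = spec.index(max(spec))
print(peak_bin)  # -> 4, the dominant bin
```

Comparing LTAS curves of a context sentence and a target's cue region is the kind of frequency-specific match the study found to be the precondition for the normalization effect.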
Neural Generalized Predictive Control of a non-linear Process
DEFF Research Database (Denmark)
Sørensen, Paul Haase; Nørgård, Peter Magnus; Ravn, Ole
1998-01-01
The use of neural networks in non-linear control is made difficult by the fact that stability and robustness are not guaranteed and that real-time implementation is non-trivial. In this paper we introduce a predictive controller based on a neural network model which has promising stability qualities. The controller is a non-linear version of the well-known generalized predictive controller developed in linear control theory. It involves minimization of a cost function which in the present case has to be done numerically. Therefore, we develop the necessary numerical algorithms in substantial detail and discuss the implementation difficulties. The neural generalized predictive controller is tested on a pneumatic servo system.
DYNSYL: a general-purpose dynamic simulator for chemical processes
International Nuclear Information System (INIS)
Patterson, G.K.; Rozsa, R.B.
1978-01-01
Lawrence Livermore Laboratory is conducting a safeguards program for the Nuclear Regulatory Commission. The goal of the Material Control Project of this program is to evaluate material control and accounting (MCA) methods in plants that handle special nuclear material (SNM). To this end we designed and implemented the dynamic chemical plant simulation program DYNSYL. This program can be used to generate process data or to provide estimates of process performance; it simulates both steady-state and dynamic behavior. The MCA methods that may have to be evaluated range from sophisticated on-line material trackers such as Kalman filter estimators, to relatively simple material balance procedures. This report describes the overall structure of DYNSYL and includes some example problems. The code is still in the experimental stage and revision is continuing
Towards Device-Independent Information Processing on General Quantum Networks
Lee, Ciarán M.; Hoban, Matty J.
2018-01-01
The violation of certain Bell inequalities allows for device-independent information processing secure against nonsignaling eavesdroppers. However, this only holds for the Bell network, in which two or more agents perform local measurements on a single shared source of entanglement. To overcome the practical constraints that entangled systems can only be transmitted over relatively short distances, large-scale multisource networks have been employed. Do there exist analogs of Bell inequalities for such networks, whose violation is a resource for device independence? In this Letter, the violation of recently derived polynomial Bell inequalities will be shown to allow for device independence on multisource networks, secure against nonsignaling eavesdroppers.
International Nuclear Information System (INIS)
Chidume, C.E.; Ofoedu, E.U.
2007-07-01
Let K be a nonempty closed convex subset of a real Banach space E. Let T : K → K be a generalized Lipschitz pseudo-contractive mapping such that F(T) := {x ∈ K : Tx = x} ≠ ∅. Let {αn}n≥1, {λn}n≥1 and {θn}n≥1 be real sequences in (0, 1) such that αn = o(θn), lim n→∞ λn = 0 and λn(αn + θn) ≤ 1. For arbitrary x1 ∈ K, let the sequence {xn}n≥1 be iteratively generated by xn+1 = (1 - λnαn)xn + λnαnTxn - λnθn(xn - x1), n ≥ 1. Then {xn}n≥1 is bounded. Moreover, if E is a reflexive Banach space with a uniformly Gâteaux differentiable norm and if Σn=1∞ λnθn = ∞ is additionally assumed, then, under mild conditions, {xn}n≥1 converges strongly to some x* ∈ F(T). (author)
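The iteration above can be run directly on the real line. The sketch below uses T = cos (a Lipschitz mapping with a fixed point near 0.739) and one admissible choice of parameter sequences satisfying the stated conditions; these sequences are my assumption, not the authors' own, and the sketch only checks the boundedness the theorem guarantees.

```python
import math

def iterate(T, x1, n_steps):
    # Regularized iteration from the abstract:
    #   x_{n+1} = (1 - l_n a_n) x_n + l_n a_n T(x_n) - l_n t_n (x_n - x_1)
    # with a_n = o(t_n), l_n -> 0 and l_n (a_n + t_n) <= 1.
    x = x1
    xs = [x]
    for n in range(1, n_steps + 1):
        a = n ** -0.5            # alpha_n
        t = n ** -0.25           # theta_n
        l = 0.5 * n ** -0.25     # lambda_n; l * (a + t) <= 1 for n >= 1
        x = (1 - l * a) * x + l * a * T(x) - l * t * (x - x1)
        xs.append(x)
    return xs

xs = iterate(math.cos, 1.0, 5000)
print(max(abs(v) for v in xs) < 10)  # -> True: the iterates stay bounded
```

Expanding the update shows each iterate is a convex combination of x_n, T(x_n) and the anchor x_1 whenever λn(αn + θn) ≤ 1, which is why boundedness holds here.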
On the 2-orthogonal polynomials and the generalized birth and death processes
Directory of Open Access Journals (Sweden)
Zerouki Ebtissem
2006-01-01
Full Text Available We discuss the connections between the 2-orthogonal polynomials and the generalized birth and death processes. Afterwards, we find sufficient conditions to give an integral representation of the transition probabilities of these processes.
2010-11-23
... the Attorney General; Certification Process for State Capital Counsel Systems; Removal of Final Rule... only if the Attorney General has certified ``that [the] State has established a mechanism for providing... State to qualify for the special habeas procedures, the Attorney General must determine that ``the State...
2010-05-25
... Office of the Attorney General; Certification Process for State Capital Counsel Systems; Removal of Final Rule AGENCY: Office of the Attorney General, Department of Justice. ACTION: Notice of proposed... the Attorney General has certified ``that [the] State has established a mechanism for providing...
Ekici, Didem Inel
2016-01-01
This study aimed to determine Turkish junior high-school students' perceptions of the general problem-solving process. The Turkish junior high-school students' perceptions of the general problem-solving process were examined in relation to their gender, grade level, age and their grade point with regards to the science course identified in the…
International Nuclear Information System (INIS)
Hirschmann, H.
1983-06-01
The consequences of the basic assumptions of the semi-Markov process, as defined from a homogeneous renewal process with a stationary Markov condition, are reviewed. The notion of the semi-Markov process is generalized by redefining it from a nonstationary Markov renewal process. For both the non-generalized and the generalized case, a representation of the first-order conditional state probabilities is derived in terms of the transition probabilities of the Markov renewal process. Some useful calculation rules (regeneration rules) are derived for the conditional state probabilities of the semi-Markov process. Compared to the semi-Markov process in its usual definition, the generalized process allows the analysis of a larger class of systems. For instance, systems with arbitrarily distributed lifetimes of their components can be described. It is also possible to describe systems that are modified over time by forces or manipulations from outside. (Auth.)
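A semi-Markov process of the kind discussed can be sketched as a Markov jump chain with arbitrary (non-exponential) holding-time distributions, which is exactly what breaks the Markov property in continuous time. The two-state chain and sojourn laws below are illustrative assumptions.

```python
import random

def semi_markov(P, sojourn, t_max, state=0, seed=0):
    # Semi-Markov simulation: jumps follow the embedded Markov matrix P,
    # but the holding time in each state may have any distribution.
    rng = random.Random(seed)
    t, history = 0.0, []
    while t < t_max:
        dwell = sojourn[state](rng)      # arbitrary sojourn distribution
        history.append((state, dwell))
        t += dwell
        u, acc, nxt = rng.random(), 0.0, 0
        for j, pj in enumerate(P[state]):
            acc += pj
            if u < acc:
                nxt = j
                break
        state = nxt
    return history

# Uniform and Weibull sojourns: neither is exponential, so the process
# is semi-Markov rather than a continuous-time Markov chain.
P = [[0.0, 1.0], [0.7, 0.3]]
sojourn = [lambda r: r.uniform(0.5, 1.5),
           lambda r: r.weibullvariate(1.0, 2.0)]
hist = semi_markov(P, sojourn, 20.0)
print(len(hist) > 0)
```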
Toward a General Research Process for Using Dubin's Theory Building Model
Holton, Elwood F.; Lowe, Janis S.
2007-01-01
Dubin developed a widely used methodology for theory building, which describes the components of the theory building process. Unfortunately, he does not define a research process for implementing his theory building model. This article proposes a seven-step general research process for implementing Dubin's theory building model. An example of a…
GENERAL ALGORITHMIC SCHEMA OF THE PROCESS OF THE CHILL AUXILIARIES PROJECTION
Directory of Open Access Journals (Sweden)
A. N. Chichko
2006-01-01
Full Text Available A general algorithmic diagram systematizing the existing approaches to the design process is offered, and the foundation of a computer system for constructing chill-mold tooling is laid.
PECULIARITIES OF GENERALIZATION OF SIMILAR PHENOMENA IN THE PROCESS OF FISH HEAT TREATMENT
Directory of Open Access Journals (Sweden)
V. A. Pokhol’chenko
2015-01-01
Full Text Available The theoretical presuppositions for generalizing and finding similarity in the processes of dehydration and of heating wet materials are studied in this article. It is proposed to generalize these processes using dimensionless similarity numbers. Through detailed analysis of the regularities of fish heat-treatment processes in different modes, a significant amount of experimental material was successfully generalized on the basis of dimensionless simplexes (similarity numbers). Using the dimensionless simplexes made it possible to detect a number of simple mathematical models for the studied phenomena. Generalized kinetic models of fish dehydration, generalized dynamic models (the change of moisture diffusion coefficients) and generalized kinetic models of fish heating (the change of the temperature field in the product's thickness, average volume and center) were found. These generalized mathematical models also showed the relationship between dehydration and heating in the semi-hot and hot smoking (drying and frying) of fish. The dependence of the results on the physical nature of the dehydration process, including the change in the binding energy of moisture with the material as the process proceeds and the impact of shrinkage on the rate of moisture removal, is discussed in the article. The factors influencing changes in the internal structure and properties of the raw material and retarding the dehydration processes are described. The heating rate of fish products was found to depend on the chemical composition, the geometric dimensions of the heated object and the coolant regime parameters. Using the generalized models, combined with empirically derived equations and the engineering calculation technique for these processes, opens a unique opportunity to design rational modes of heat treatment of raw materials and to optimize the performance of thermal equipment.
A generalized fluctuation-dissipation theorem for the one-dimensional diffusion process
International Nuclear Information System (INIS)
Okabe, Y.
1985-01-01
The [α,β,γ]-Langevin equation describes the time evolution of a real stationary process with T-positivity (reflection positivity), originating in axiomatic quantum field theory. For this [α,β,γ]-Langevin equation a generalized fluctuation-dissipation theorem is proved. As an application, we obtain a generalized fluctuation-dissipation theorem for the one-dimensional non-linear diffusion process, which presents one solution of Ryogo Kubo's problem in physics. (orig.)
Integer valued autoregressive processes with generalized discrete Mittag-Leffler marginals
Directory of Open Access Journals (Sweden)
Kanichukattu K. Jose
2013-05-01
Full Text Available In this paper we consider a generalization of discrete Mittag-Leffler distributions. We introduce and study the properties of a new distribution called the geometric generalized discrete Mittag-Leffler distribution. Autoregressive processes with geometric generalized discrete Mittag-Leffler distributions are developed and studied. The distributions are further extended to develop a more general class of geometric generalized discrete semi-Mittag-Leffler distributions. The processes are also extended to higher orders. An application to empirical data on customer arrivals at a bank counter is given. Various areas of potential application, such as human resource development, insect growth, epidemic modeling, industrial risk modeling, insurance and actuarial science, and town planning, are also discussed.
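A minimal INAR(1) sketch with binomial thinning shows the mechanics of such integer-valued autoregressions. Geometric innovations stand in here for the paper's generalized discrete Mittag-Leffler innovations, whose sampler the abstract does not give; the thinning probability and sample size are arbitrary.

```python
import random

def geometric(rng, q=0.5):
    # Geometric innovation: number of failures before the first success.
    k = 0
    while rng.random() > q:
        k += 1
    return k

def inar1(p, n, innov, seed=0):
    # INAR(1): X_t = p o X_{t-1} + e_t, where "p o X" is binomial
    # thinning: each of the previous counts survives independently
    # with probability p, keeping the series integer-valued.
    rng = random.Random(seed)
    x, series = 0, []
    for _ in range(n):
        survivors = sum(1 for _ in range(x) if rng.random() < p)
        x = survivors + innov(rng)
        series.append(x)
    return series

series = inar1(0.6, 500, geometric)
print(len(series))  # -> 500
```

Thinning replaces the scalar multiplication of a Gaussian AR(1), which is what lets such models fit count data like the bank-counter arrivals mentioned above.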
SIMULATION AND PREDICTION OF THE PROCESS BASED ON THE GENERAL LOGISTIC MAPPING
Directory of Open Access Journals (Sweden)
V. V. Skalozub
2013-11-01
Full Text Available Purpose. The aim of the research is to build a model of the generalized logistic mapping and to assess the possibilities of its use for forming mathematical descriptions, as well as operational forecasts, of parameters of complex dynamic processes described by time series. Methodology. The research results are obtained on the basis of mathematical modeling and simulation of nonlinear systems using the tools of chaotic dynamics. Findings. A model of the generalized logistic mapping, which is used to interpret the characteristics of dynamic processes, was proposed. We consider some examples of representations of processes based on the enhanced logistic mapping, varying the values of the model parameters. Procedures were proposed for modeling and interpreting the data on the investigated processes, represented by time series, as well as for operational forecasting of parameters using the generalized logistic mapping model. Originality. The paper proposes an improved mathematical model, the generalized logistic mapping, designed for the study of nonlinear discrete dynamic processes. Practical value. The research carried out using the generalized logistic mapping on railway transport processes, in particular for assessing traffic volume parameters, indicates its great practical potential for solving problems of analysis, modeling and forecasting of complex nonlinear discrete dynamical processes. The proposed model can be used under conditions of uncertainty, irregularity and manifestations of the chaotic nature of technical, economic and other processes, including railway ones.
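The classic logistic mapping underlying the generalized model already supports the kind of operational forecasting described: fit the map's parameter from an observed series by least squares, then predict one step ahead. The generalized mapping's extra shape parameters are not reproduced here; r = 3.7 (a chaotic regime) is an illustrative choice.

```python
def logistic_series(r, x0, n):
    # Classic logistic mapping x_{n+1} = r * x_n * (1 - x_n).
    xs = [x0]
    for _ in range(n - 1):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

def fit_r(xs):
    # Least-squares estimate of r: regress x_{n+1} on g_n = x_n (1 - x_n),
    # i.e. r_hat = sum(x_{n+1} g_n) / sum(g_n^2).
    num = sum(xs[i + 1] * xs[i] * (1 - xs[i]) for i in range(len(xs) - 1))
    den = sum((xs[i] * (1 - xs[i])) ** 2 for i in range(len(xs) - 1))
    return num / den

xs = logistic_series(3.7, 0.2, 300)
r_hat = fit_r(xs)
forecast = r_hat * xs[-1] * (1 - xs[-1])   # operational one-step forecast
print(round(r_hat, 6))  # -> 3.7 on noise-free data
```

On noise-free data the fit recovers r exactly; with real, noisy series the same regression yields the one-step-ahead predictor, while chaos limits how far multi-step forecasts remain useful.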
Chen, H L; Wang, J K; Zhang, L L; Wu, Z Y
2000-04-01
This study determined and compared the contents of total flavonoids in four differently processed products of Epimedium acuminatum, measuring the contents by ultraviolet spectrophotometry. The contents were found in the following sequence: unprocessed product, clearly-fried product, alcohol-broiled product, salt-broiled product, sheep-fat-broiled product. The average recovery rate was 96.01%, with a 0.74% RSD (n = 5). Heating causes the content of total flavonoids in the processed products to decrease. These processed products are still often used in clinical treatment, because the adjuvants feature certain coordinating and promoting functions. The study is to be pursued further.
The Burden of the Fellowship Interview Process on General Surgery Residents and Programs.
Watson, Shawna L; Hollis, Robert H; Oladeji, Lasun; Xu, Shin; Porterfield, John R; Ponce, Brent A
This study evaluated the effect of the fellowship interview process in a cohort of general surgery residents. We hypothesized that the interview process would be associated with significant clinical time lost, monetary expenses, and an increased need for shift coverage. An online anonymous survey link was sent via e-mail to general surgery program directors in June 2014. Program directors distributed an additional survey link to current residents in their program who had completed the fellowship interview process. The setting was United States allopathic general surgery programs; participants were 50 general surgery program directors and 72 general surgery residents. Program directors reported a fellowship application rate of 74.4%. Residents most frequently attended 8 to 12 interviews (35.2%). Most residents (57.7%) reported missing 7 or more days of clinical training to attend interviews; these shifts were largely covered by other residents. Most residents (62.3%) spent over $4000 on the interview process. Program directors rated the fellowship burden at an average of 6.7 on a 1 to 10 scale of disruption, with 10 being a significant disruption. Most of the residents (57.3%) were in favor of a change in the interview process. We identified potential areas for improvement, including options for coordinated interviews and improved content on program websites. The surgical fellowship match is relatively burdensome to residents and programs alike, and merits critical assessment for potential improvement. Published by Elsevier Inc.
International Nuclear Information System (INIS)
Cherstvy, Andrey G; Metzler, Ralf
2015-01-01
We study generalized anomalous diffusion processes whose diffusion coefficient D(x, t) ∼ D_0 |x|^α t^β depends on both the position x of the test particle and the process time t. This process thus combines the features of scaled Brownian motion and heterogeneous diffusion parent processes. We compute the ensemble and time averaged mean squared displacements of this generalized diffusion process. The scaling exponent of the ensemble averaged mean squared displacement is shown to be the product of the critical exponents of the parent processes, and describes both subdiffusive and superdiffusive systems. We quantify the amplitude fluctuations of the time averaged mean squared displacement as function of the length of the time series and the lag time. In particular, we observe a weak ergodicity breaking of this generalized diffusion process: even in the long time limit the ensemble and time averaged mean squared displacements are strictly disparate. When we start to observe this process some time after its initiation we observe distinct features of ageing. We derive a universal ageing factor for the time averaged mean squared displacement containing all information on the ageing time and the measurement time. External confinement is shown to alter the magnitudes and statistics of the ensemble and time averaged mean squared displacements. (paper)
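A minimal numerical sketch of such a process, assuming a simple Euler scheme for the Langevin equation dx = sqrt(2 D(x,t)) dW with D(x, t) = D_0 |x|^α t^β; the small offset x_off is purely a numerical regularization I add so that trajectories started at the origin can move at all, and all parameter values are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)

def ensemble_msd(n_traj=2000, n_steps=500, dt=1e-3,
                 D0=1.0, alpha=0.5, beta=0.3, x_off=1e-2):
    """Euler scheme for dx = sqrt(2 D(x,t)) dW with
    D(x,t) = D0 * (|x| + x_off)**alpha * t**beta (x_off regularizes x = 0)."""
    x = np.zeros(n_traj)
    msd = np.empty(n_steps)
    for i in range(n_steps):
        t = (i + 1) * dt                        # start at t = dt to avoid D = 0
        D = D0 * (np.abs(x) + x_off) ** alpha * t ** beta
        x = x + np.sqrt(2.0 * D * dt) * rng.standard_normal(n_traj)
        msd[i] = np.mean(x ** 2)                # ensemble averaged MSD
    return msd

msd = ensemble_msd()
```

Fitting the slope of log(msd) against log(t) then estimates the combined scaling exponent the abstract describes.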
Generalized Inferences about the Mean Vector of Several Multivariate Gaussian Processes
Directory of Open Access Journals (Sweden)
Pilar Ibarrola
2015-01-01
Full Text Available We consider in this paper the problem of comparing the means of several multivariate Gaussian processes. It is assumed that the means depend linearly on an unknown vector parameter θ and that nuisance parameters appear in the covariance matrices. More precisely, we deal with the problem of testing hypotheses, as well as obtaining confidence regions, for θ. Both methods will be based on the concepts of the generalized p-value and the generalized confidence region adapted to our context.
Using Self-Reflection To Increase Science Process Skills in the General Chemistry Laboratory
Veal, William R.; Taylor, Dawne; Rogers, Amy L.
2009-03-01
Self-reflection is a tool of instruction that has been used in the science classroom. Research has shown great promise in using video as a learning tool in the classroom. However, the integration of self-reflective practice using video in the general chemistry laboratory to help students develop process skills has not been done. Immediate video feedback and direct instruction were employed in a general chemistry laboratory course to improve students' mastery and understanding of basic and advanced process skills. Qualitative results and statistical analysis of quantitative data showed that self-reflection significantly helped students develop basic and advanced process skills, yet it did not seem to influence general understanding of the science content.
A generalized logarithmic image processing model based on the gigavision sensor model.
Deng, Guang
2012-03-01
The logarithmic image processing (LIP) model is a mathematical theory providing generalized linear operations for image processing. The gigavision sensor (GVS) is a new imaging device that can be described by a statistical model. In this paper, by studying these two seemingly unrelated models, we develop a generalized LIP (GLIP) model. With the LIP model being its special case, the GLIP model not only provides new insights into the LIP model but also defines new image representations and operations for solving general image processing problems that are not necessarily related to the GVS. A new parametric LIP model is also developed. To illustrate the application of the new scalar multiplication operation, we propose an energy-preserving algorithm for tone mapping, which is a necessary step in image dehazing. By comparing with results using two state-of-the-art algorithms, we show that the new scalar multiplication operation is an effective tool for tone mapping.
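For background, the classical LIP operations that the paper generalizes can be written down directly. The sketch below uses the standard gray-tone range M = 256 of the classical Jourlin-Pinoli LIP model; it is not the paper's GLIP model itself:

```python
M = 256.0  # gray-tone range of the classical LIP model

def lip_add(f, g):
    """Classical LIP addition: f (+) g = f + g - f*g/M."""
    return f + g - f * g / M

def lip_scalar(c, f):
    """Classical LIP scalar multiplication: c (x) f = M - M*(1 - f/M)**c."""
    return M - M * (1.0 - f / M) ** c
```

A quick consistency check on these definitions: multiplying a gray tone by the scalar 2 must equal LIP-adding it to itself, which holds algebraically since both reduce to 2f - f²/M.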
The process of patient enablement in general practice nurse consultations: a grounded theory study.
Desborough, Jane; Banfield, Michelle; Phillips, Christine; Mills, Jane
2017-05-01
The aim of this study was to gain insight into the process of patient enablement in general practice nursing consultations. Enhanced roles for general practice nurses may benefit patients through a range of mechanisms, one of which may be increasing patient enablement. In studies with general practitioners enhanced patient enablement has been associated with increases in self-efficacy and skill development. This study used a constructivist grounded theory design. In-depth interviews were conducted with 16 general practice nurses and 23 patients from 21 general practices between September 2013 - March 2014. Data generation and analysis were conducted concurrently using constant comparative analysis and theoretical sampling focussing on the process and outcomes of patient enablement. Use of the storyline technique supported theoretical coding and integration of the data into a theoretical model. A clearly defined social process that fostered and optimised patient enablement was constructed. The theory of 'developing enabling healthcare partnerships between nurses and patients in general practice' incorporates three stages: triggering enabling healthcare partnerships, tailoring care and the manifestation of patient enablement. Patient enablement was evidenced through: 1. Patients' understanding of their unique healthcare requirements informing their health seeking behaviours and choices; 2. Patients taking an increased lead in their partnership with a nurse and seeking choices in their care and 3. Patients getting health care that reflected their needs, preferences and goals. This theoretical model is in line with a patient-centred model of health care and is particularly suited to patients with chronic disease. © 2016 John Wiley & Sons Ltd.
Directory of Open Access Journals (Sweden)
Olexandr Tyhorskyy
2015-08-01
Full Text Available Purpose: to improve the method of training highly skilled bodybuilders during the general preparatory phase. Material and Methods: the study involved eight highly skilled athletes, members of the Ukrainian national bodybuilding team. Results: comparative characteristics of the most commonly used training methods in bodybuilding are given. An optimal method of training highly skilled bodybuilders during the general preparatory phase of the preparatory period was developed and substantiated; it can increase athletes' body weight through the muscle component. Conclusions: based on the studies, the optimal method of training highly skilled bodybuilders is recommended depending on the mesocycles and microcycles of the general preparatory phase.
Schenck, Natalya A.; Horvath, Philip A.; Sinha, Amit K.
2018-02-01
While the literature on the price discovery process and information flow between dominant and satellite markets is extensive, most studies have applied an approach that can be traced back to Hasbrouck (1995) or Gonzalo and Granger (1995). In this paper, however, we propose a generalized Langevin process with an asymmetric double-well potential function, with co-integrated time series and interconnected diffusion processes, to model the information flow and price discovery process in two interconnected markets, a dominant and a satellite one. A simulated illustration of the model is also provided.
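A toy sketch of the core ingredient, assuming overdamped Langevin dynamics in an asymmetric double-well potential V(x) = x⁴/4 - x²/2 + tilt·x; the tilt and temperature parameters are hypothetical choices of mine, and the paper's co-integrated two-market construction is considerably richer:

```python
import numpy as np

rng = np.random.default_rng(1)

def langevin_double_well(n_steps=20000, dt=1e-3, temp=0.3, tilt=0.1):
    """Euler-Maruyama integration of dx = -V'(x) dt + sqrt(2*temp) dW
    with the asymmetric double-well V(x) = x**4/4 - x**2/2 + tilt*x."""
    x = 1.0                                  # start in the right-hand well
    path = np.empty(n_steps)
    for i in range(n_steps):
        force = -(x ** 3 - x + tilt)         # -dV/dx
        x = x + force * dt + np.sqrt(2.0 * temp * dt) * rng.standard_normal()
        path[i] = x
    return path

path = langevin_double_well()
```

The two wells play the role of two price regimes; at moderate noise the trajectory dwells in one well and occasionally hops, which is the qualitative behaviour such models exploit.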
Profile of science process skills of Preservice Biology Teacher in General Biology Course
Susanti, R.; Anwar, Y.; Ermayanti
2018-04-01
This study aims to obtain portrayal images of science process skills among preservice biology teacher. This research took place in Sriwijaya University and involved 41 participants. To collect the data, this study used multiple choice test comprising 40 items to measure the mastery of science process skills. The data were then analyzed in descriptive manner. The results showed that communication aspect outperfomed the other skills with that 81%; while the lowest one was identifying variables and predicting (59%). In addition, basic science process skills was 72%; whereas for integrated skills was a bit lower, 67%. In general, the capability of doing science process skills varies among preservice biology teachers.
General induction at companies - between an administrative process and a sociological phenomenon
Directory of Open Access Journals (Sweden)
Héctor L. Bermúdez Restrepo
2012-12-01
Full Text Available Using the example of the general induction process into the organization, and drawing on certain sociological resources, this article shows a paradox facing specialists in human management: their task is to care for the motivation and welfare of workers in order to achieve high performance, loyalty and tenure at the company. However, the current mutations of the social architecture in general, and of work in particular (as a structure of organized action), suggest that organizational loyalty tends to be increasingly unlikely and that, conversely, current personnel administration processes rest on inappropriate notions and appear to contribute directly to the adversities of human beings in organizational settings.
Alloza, Clara; Cox, Simon R; Duff, Barbara; Semple, Scott I; Bastin, Mark E; Whalley, Heather C; Lawrie, Stephen M
2016-08-30
Several authors have proposed that schizophrenia is the result of impaired connectivity between specific brain regions rather than differences in local brain activity. White matter abnormalities have been suggested as the anatomical substrate for this dysconnectivity hypothesis. Information processing speed may act as a key cognitive resource facilitating higher order cognition by allowing multiple cognitive processes to be simultaneously available. However, there is a lack of established associations between these variables in schizophrenia. We hypothesised that the relationship between white matter and general intelligence would be mediated by processing speed. White matter water diffusion parameters were studied using Tract-based Spatial Statistics and computed within 46 regions-of-interest (ROI). Principal component analysis was conducted on these white matter ROI for fractional anisotropy (FA) and mean diffusivity, and on neurocognitive subtests, to extract general factors of white matter structure (gFA, gMD), general intelligence (g) and processing speed (gspeed). There was a positive correlation between g and gFA (r = 0.67, p = 0.001) that was partially and significantly mediated by gspeed (56.22%, CI: 0.10-0.62). These findings suggest a plausible model of structure-function relations in schizophrenia, whereby white matter structure may provide a neuroanatomical substrate for general intelligence, which is partly supported by speed of information processing. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
On-line validation of linear process models using generalized likelihood ratios
International Nuclear Information System (INIS)
Tylee, J.L.
1981-12-01
A real-time method for testing the validity of linear models of nonlinear processes is described and evaluated. Using generalized likelihood ratios, the model dynamics are continually monitored to see if the process has moved far enough away from the nominal linear model operating point to justify generation of a new linear model. The method is demonstrated using a seventh-order model of a natural circulation steam generator.
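As a hedged illustration of the idea (not the paper's seventh-order steam-generator formulation), the textbook GLR statistic for detecting an unknown mean shift in Gaussian model residuals reduces to a simple closed form:

```python
import numpy as np

def glr_mean_shift(residuals, sigma):
    """Twice the log generalized likelihood ratio for an unknown mean
    shift in i.i.d. Gaussian residuals with known sigma; maximizing the
    likelihood over the shift gives 2*log(Lambda) = n * rbar**2 / sigma**2."""
    r = np.asarray(residuals, dtype=float)
    return r.size * r.mean() ** 2 / sigma ** 2
```

While the linear model remains valid the residuals hover around zero and the statistic stays small; a sustained drift away from the operating point inflates it past a threshold, signalling that a new linearization is warranted.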
General definitions of chaos for continuous and discrete-time processes
Vieru, Andrei
2008-01-01
A precise definition of chaos for discrete processes based on iteration already exists. We shall first reformulate it in a more general frame, taking into account the fact that discrete chaotic behavior is neither necessarily based on iteration nor strictly related to compact metric spaces or to bounded functions. Then we shall apply the central idea of this definition to continuous processes. We shall try to see what chaos is, regardless of the way it is generated.
2011-01-25
... and Development (HFM-40), Center for Biologics Evaluation and Research (CBER), Food and Drug...] Guidance for Industry on Process Validation: General Principles and Practices; Availability AGENCY: Food... of Drug Information, Center for Drug Evaluation and Research, Food and Drug Administration, 10903 New...
Henderson, Emily J; Rubin, Greg P
2013-05-01
To evaluate the utility of Isabel, an online diagnostic decision support system developed by Isabel Healthcare primarily for secondary medical care, in the general practice setting. Focus groups were conducted with clinicians to understand why and how they used the system. A modified online post-use survey asked practitioners about its impact on their decision-making. Normalization process theory (NPT) was used as a theoretical framework to determine whether the system could be incorporated into routine clinical practice. The system was introduced by NHS County Durham and Darlington in the UK in selected general practices as a three-month pilot. General practitioners and nurse practitioners who had access to Isabel as part of the Primary Care Trust's pilot. General practitioners' views, experiences and usage of the system. Seven general practices agreed to pilot Isabel. Two practices did not subsequently use it. The remaining five practices conducted searches on 16 patients. Post-use surveys (n = 10) indicated that Isabel had little impact on diagnostic decision-making. Focus group participants stated that, although the diagnoses produced by Isabel in general did not have an impact on their decision-making, they would find the tool useful if it were better tailored to the primary care setting. Our analysis concluded that normalization was not likely to occur in its current form. Isabel was of limited utility in this short pilot study and may need further modification for use in general practice.
Process mapping as a framework for performance improvement in emergency general surgery.
DeGirolamo, Kristin; D'Souza, Karan; Hall, William; Joos, Emilie; Garraway, Naisan; Sing, Chad Kim; McLaughlin, Patrick; Hameed, Morad
2018-02-01
Emergency general surgery conditions are often thought of as being too acute for the development of standardized approaches to quality improvement. However, process mapping, a concept that has been applied extensively in manufacturing quality improvement, is now being used in health care. The objective of this study was to create process maps for small bowel obstruction in an effort to identify potential areas for quality improvement. We used the American College of Surgeons Emergency General Surgery Quality Improvement Program pilot database to identify patients who received nonoperative or operative management of small bowel obstruction between March 2015 and March 2016. This database, patient charts and electronic health records were used to create process maps from the time of presentation to discharge. Eighty-eight patients with small bowel obstruction (33 operative; 55 nonoperative) were identified. Patients who received surgery had a complication rate of 32%. The processes of care from the time of presentation to the time of follow-up were highly elaborate and variable in terms of duration; however, the sequences of care were found to be consistent. We used data visualization strategies to identify bottlenecks in care, and they showed substantial variability in terms of operating room access. Variability in the operative care of small bowel obstruction is high and represents an important improvement opportunity in general surgery. Process mapping can identify common themes, even in acute care, and suggest specific performance improvement measures.
Is general intelligence little more than the speed of higher-order processing?
Schubert, Anna-Lena; Hagemann, Dirk; Frischkorn, Gidon T
2017-10-01
Individual differences in the speed of information processing have been hypothesized to give rise to individual differences in general intelligence. Consistent with this hypothesis, reaction times (RTs) and latencies of event-related potential have been shown to be moderately associated with intelligence. These associations have been explained either in terms of individual differences in some brain-wide property such as myelination, the speed of neural oscillations, or white-matter tract integrity, or in terms of individual differences in specific processes such as the signal-to-noise ratio in evidence accumulation, executive control, or the cholinergic system. Here we show in a sample of 122 participants, who completed a battery of RT tasks at 2 laboratory sessions while an EEG was recorded, that more intelligent individuals have a higher speed of higher-order information processing that explains about 80% of the variance in general intelligence. Our results do not support the notion that individuals with higher levels of general intelligence show advantages in some brain-wide property. Instead, they suggest that more intelligent individuals benefit from a more efficient transmission of information from frontal attention and working memory processes to temporal-parietal processes of memory storage. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Directory of Open Access Journals (Sweden)
Roopa Shivashankar
2016-01-01
Full Text Available Aim: To assess the level of adherence to diabetes care processes, and associated clinic and patient factors, at general practices in Delhi, India. Methods: We interviewed physicians (n = 23) and patients with diabetes (n = 406), and reviewed patient charts at general practices (government = 5; private = 18). We examined diabetes care processes, specifically measurement of weight, blood pressure (BP), glycated hemoglobin (HbA1c) and lipids, electrocardiogram, dilated eye examination, and a foot examination in the last one year. We analyzed clinic and patient factors associated with the number of care processes achieved, using a multilevel Poisson regression model. Results: The average number of clinic visits per patient was 8.8/year (standard deviation = 5.7), and physicians had access to patients' previous records for only 19.7% of patients. Dilated eye exam, foot exam, and electrocardiogram were completed in 7.4%, 15.1%, and 29.1% of patients, respectively. An estimated 51.7%, 88.4%, and 28.1% had ≥1 measurement of HbA1c, BP, and lipids, respectively. Private clinics, physician access to patients' previous records, use of nonphysicians, patient education, and the presence of a diabetes complication were positively associated with the number of care processes in the multivariable model. Conclusion: Adherence to diabetes care processes was suboptimal. Encouraging the implementation of quality improvement strategies, such as Chronic Care Model elements, at general practices may improve diabetes care.
Generalization bounds of ERM-based learning processes for continuous-time Markov chains.
Zhang, Chao; Tao, Dacheng
2012-12-01
Many existing results on statistical learning theory are based on the assumption that samples are independently and identically distributed (i.i.d.). However, the assumption of i.i.d. samples is not suitable for practical application to problems in which samples are time dependent. In this paper, we are mainly concerned with the empirical risk minimization (ERM) based learning process for time-dependent samples drawn from a continuous-time Markov chain. This learning process covers many kinds of practical applications, e.g., the prediction for a time series and the estimation of channel state information. Thus, it is significant to study its theoretical properties including the generalization bound, the asymptotic convergence, and the rate of convergence. It is noteworthy that, since samples are time dependent in this learning process, the concerns of this paper cannot (at least straightforwardly) be addressed by existing methods developed under the sample i.i.d. assumption. We first develop a deviation inequality for a sequence of time-dependent samples drawn from a continuous-time Markov chain and present a symmetrization inequality for such a sequence. By using the resultant deviation inequality and symmetrization inequality, we then obtain the generalization bounds of the ERM-based learning process for time-dependent samples drawn from a continuous-time Markov chain. Finally, based on the resultant generalization bounds, we analyze the asymptotic convergence and the rate of convergence of the learning process.
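For context on what "time-dependent samples drawn from a continuous-time Markov chain" look like, here is a minimal sampler using the standard holding-time / jump-chain construction; the two-state generator matrix below is an arbitrary example, not one from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)

def sample_ctmc(Q, x0, t_end):
    """One trajectory of a continuous-time Markov chain with generator
    matrix Q (non-negative off-diagonal rates, rows summing to zero):
    hold in state x for an Exp(-Q[x,x]) time, then jump proportionally
    to the off-diagonal rates of row x."""
    t, x = 0.0, x0
    times, states = [0.0], [x0]
    while True:
        rate = -Q[x, x]
        if rate <= 0:                    # absorbing state
            break
        t += rng.exponential(1.0 / rate)
        if t >= t_end:
            break
        probs = Q[x].copy()
        probs[x] = 0.0
        x = int(rng.choice(len(probs), p=probs / rate))
        times.append(t)
        states.append(x)
    return np.array(times), np.array(states)

Q = np.array([[-1.0, 1.0],               # an arbitrary two-state generator
              [2.0, -2.0]])
times, states = sample_ctmc(Q, 0, 50.0)
```

Samples read off such a trajectory at successive times are dependent, which is precisely why the i.i.d.-based generalization bounds discussed above do not apply directly.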
Directory of Open Access Journals (Sweden)
Lawrence I. EDET
2015-09-01
Full Text Available The general account of Nigeria's post-independence electoral processes has always been characterized by violence. Nigeria's 2015 general elections marked the fifth multi-party elections in the country and the second handover from one civilian administration to another since the inception of the Fourth Republic democratic experiment in 1999. This account cannot be analyzed without addressing electoral violence. Electoral violence has been a permanent feature of Nigeria's democratic process, except in the 2015 general elections, which international observers described as a "significant improvement" over previous elections in terms of violence-related cases. Election-related violence in the country, particularly in 2011, reached an unprecedented dimension, resulting in the destruction of lives and property worth millions of naira. This paper expatiates on electoral violence and its general implications for the democratization process in the country, with major emphasis on the 2011 and 2015 general elections. The paper argues that the high incidence of pre- and post-election violence in the country within these periods has to do with the way Nigerian politicians regard politics, weak political institutions, a weak electoral management body, and the biased conduct of the security agencies. The paper also examines the general implications of electoral violence for the democratization process and how the country can manage the electoral process to avoid the threats associated with electoral violence. Archival analysis, which widely extracted data from newspapers, journals, workshop papers, books, as well as publications of non-governmental organizations, was adopted for the study. The major significance of this study is to expose the negative implications associated with electoral violence and how it can be curbed. The position canvassed in this paper will serve as useful political literature for political leaders, policy makers and the general reading public.
Toward a model framework of generalized parallel componential processing of multi-symbol numbers.
Huber, Stefan; Cornelsen, Sonja; Moeller, Korbinian; Nuerk, Hans-Christoph
2015-05-01
In this article, we propose and evaluate a new model framework of parallel componential multi-symbol number processing, generalizing the idea of parallel componential processing of multi-digit numbers to the case of negative numbers by considering the polarity signs similar to single digits. In a first step, we evaluated this account by defining and investigating a sign-decade compatibility effect for the comparison of positive and negative numbers, which extends the unit-decade compatibility effect in 2-digit number processing. Then, we evaluated whether the model is capable of accounting for previous findings in negative number processing. In a magnitude comparison task, in which participants had to single out the larger of 2 integers, we observed a reliable sign-decade compatibility effect with prolonged reaction times for incompatible (e.g., -97 vs. +53; in which the number with the larger decade digit has the smaller, i.e., negative polarity sign) as compared with sign-decade compatible number pairs (e.g., -53 vs. +97). Moreover, an analysis of participants' eye fixation behavior corroborated our model of parallel componential processing of multi-symbol numbers. These results are discussed in light of concurrent theoretical notions about negative number processing. On the basis of the present results, we propose a generalized integrated model framework of parallel componential multi-symbol processing. (PsycINFO Database Record (c) 2015 APA, all rights reserved.)
Bio-inspired Artificial Intelligence: A Generalized Net Model of the Regularization Process in MLP
Directory of Open Access Journals (Sweden)
Stanimir Surchev
2013-10-01
Full Text Available Many objects and processes inspired by nature have been recreated by scientists. The inspiration to create the multilayer neural network came from the human brain. The brain possesses a complicated structure that is difficult to recreate, because it involves many processes that require different solving methods. The aim of this paper is to describe one of the methods that improve the learning process of an artificial neural network. The proposed generalized net method models the regularization process in a multilayer neural network. Regularization, which is commonly used in the neural network training process, protects the network from overfitting. Among the many verification methods available, the subject of interest here is the one known as regularization: it contains a function that drives weights and biases toward smaller values in order to protect against overfitting.
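A minimal sketch of the regularization being modeled, assuming the common L2 (weight-decay) variant: the penalty below is added to the data loss so that training prefers smaller weights and biases. The function names and the default lam value are my own illustrative choices, not from the paper.

```python
import numpy as np

def l2_penalty(weights, lam):
    """L2 (weight-decay) penalty: lam times the sum of squared entries
    over all weight arrays of the network."""
    return lam * sum(np.sum(w ** 2) for w in weights)

def regularized_loss(data_loss, weights, lam=1e-3):
    """Total training objective: data loss plus the L2 penalty, so the
    optimizer trades fit against weight magnitude and overfits less."""
    return data_loss + l2_penalty(weights, lam)
```

During gradient descent the penalty contributes a term 2*lam*w to each weight gradient, shrinking the weights at every update.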
Generalization of the photo process window and its application to OPC test pattern design
Eisenmann, Hans; Peter, Kai; Strojwas, Andrzej J.
2003-07-01
From the early development phase up to the production phase, test patterns play a key role in microlithography. The requirement for test patterns is to represent the design well and to cover the space of all process conditions, e.g. to investigate the full process window and all other process parameters. This paper shows that the current state-of-the-art test patterns do not address these requirements sufficiently and makes suggestions for a better selection of test patterns. We present a new methodology to analyze an existing layout (e.g. logic library, test pattern or full chip) for critical layout situations which does not need precise process data. We call this method "process space decomposition", because it is aimed at decomposing the process impact on a layout feature into a sum of single independent contributions, the dimensions of the process space. This is a generalization of the classical process window, which examines the defocus and exposure dependency of given test patterns, e.g. the CD value of dense and isolated lines. In our process space we additionally define the dimensions resist effects, etch effects, mask error and misalignment, which describe the deviation of the printed silicon pattern from its target. We further extend it by the pattern space using a product-based layout (library, full chip or synthetic test patterns). The criticality of patterns is defined by their deviation due to the aerial image, their sensitivity to the respective dimension, or several combinations of these. By exploring the process space for a given design, the method allows finding the most critical patterns independent of specific process parameters. The paper provides examples for different applications of the method: (1) selection of design-oriented test patterns for lithography development (2) test pattern reduction in process characterization (3) verification/optimization of printability and performance of post-processing procedures (like OPC) (4) creation of a sensitive process
Generalized random walk algorithm for the numerical modeling of complex diffusion processes
Vamos, C; Vereecken, H
2003-01-01
A generalized form of the random walk algorithm to simulate diffusion processes is introduced. Unlike the usual approach, at a given time all the particles from a grid node are simultaneously scattered using the Bernoulli repartition. This procedure saves memory and computing time, and no restrictions are imposed on the maximum number of particles to be used in simulations. We prove that for simple diffusion the method generalizes the finite difference scheme and gives the same precision for a large enough number of particles. As an example, simulations of diffusion in a random velocity field are performed and the main features of the stochastic mathematical model are numerically tested.
Generalized random walk algorithm for the numerical modeling of complex diffusion processes
International Nuclear Information System (INIS)
Vamos, Calin; Suciu, Nicolae; Vereecken, Harry
2003-01-01
A generalized form of the random walk algorithm to simulate diffusion processes is introduced. Unlike the usual approach, at a given time all the particles from a grid node are simultaneously scattered using the Bernoulli repartition. This procedure saves memory and computing time, and no restrictions are imposed on the maximum number of particles to be used in simulations. We prove that for simple diffusion the method generalizes the finite difference scheme and gives the same precision for a large enough number of particles. As an example, simulations of diffusion in a random velocity field are performed and the main features of the stochastic mathematical model are numerically tested.
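A rough sketch of the key idea for 1-D simple diffusion: instead of moving particles one by one, all particles on each grid node are scattered at once with a single binomial draw per node. The grid size, particle count, and absorbing boundary treatment are arbitrary choices for illustration, not the paper's exact setup:

```python
import numpy as np

rng = np.random.default_rng(3)

def scattered_walk(counts, n_steps):
    """At every step, one Binomial(n, 1/2) draw per node decides how many
    of its n particles move to the right neighbour; the rest move left.
    Particles stepping off either end are absorbed."""
    c = np.asarray(counts, dtype=np.int64)
    for _ in range(n_steps):
        right = rng.binomial(c, 0.5)     # vectorized: one draw per node
        left = c - right
        new = np.zeros_like(c)
        new[1:] += right[:-1]            # movers to the right neighbour
        new[:-1] += left[1:]             # movers to the left neighbour
        c = new
    return c

counts0 = np.zeros(101, dtype=np.int64)
counts0[50] = 100_000                    # all particles start at the centre node
final = scattered_walk(counts0, 200)
```

The resulting particle distribution is statistically identical to moving each particle independently, but the cost per step scales with the number of grid nodes rather than the number of particles, which is the memory and time saving the abstract describes.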
Energy Technology Data Exchange (ETDEWEB)
Loughry, Thomas A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
2015-02-01
As the volume of data acquired by space-based sensors increases, mission data compression/decompression and forward error correction code processing performance must likewise scale. This competency development effort was explored using the General Purpose Graphics Processing Unit (GPGPU) to accomplish high-rate Rice Decompression and high-rate Reed-Solomon (RS) decoding at the satellite mission ground station. Each algorithm was implemented and benchmarked on a single GPGPU. Distributed processing across one to four GPGPUs was also investigated. The results show that the GPGPU has considerable potential for performing satellite communication Data Signal Processing, with three times or better performance improvements and up to ten times reduction in cost over custom hardware, at least in the case of Rice Decompression and Reed-Solomon Decoding.
General purpose graphic processing unit implementation of adaptive pulse compression algorithms
Cai, Jingxiao; Zhang, Yan
2017-07-01
This study introduces a practical approach to implement real-time signal processing algorithms for general surveillance radar based on NVIDIA graphical processing units (GPUs). The pulse compression algorithms are implemented using compute unified device architecture (CUDA) libraries such as CUDA basic linear algebra subroutines and CUDA fast Fourier transform library, which are adopted from open source libraries and optimized for the NVIDIA GPUs. For more advanced, adaptive processing algorithms such as adaptive pulse compression, customized kernel optimization is needed and investigated. A statistical optimization approach is developed for this purpose without needing much knowledge of the physical configurations of the kernels. It was found that the kernel optimization approach can significantly improve the performance. Benchmark performance is compared with the CPU performance in terms of processing accelerations. The proposed implementation framework can be used in various radar systems including ground-based phased array radar, airborne sense and avoid radar, and aerospace surveillance radar.
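The core pulse-compression operation that such GPU kernels accelerate (with cuFFT in the study above) is a frequency-domain matched filter. A minimal NumPy analogue, with an illustrative chirp and delay rather than any radar-specific waveform, might look like this:

```python
import numpy as np

def pulse_compress(rx, tx):
    """Matched-filter pulse compression via FFT: cross-correlate the
    received samples rx with the transmitted waveform tx in the
    frequency domain (the operation offloaded to the GPU FFT library)."""
    n = len(rx) + len(tx) - 1
    nfft = 1 << (n - 1).bit_length()          # next power of two
    spec = np.fft.fft(rx, nfft) * np.conj(np.fft.fft(tx, nfft))
    return np.fft.ifft(spec)[:n]

# Illustrative linear-FM chirp delayed by 100 samples:
t = np.arange(64)
tx = np.exp(1j * np.pi * 0.01 * t**2)
rx = np.concatenate([np.zeros(100, complex), tx, np.zeros(36, complex)])
out = pulse_compress(rx, tx)                  # peak at index 100
```

On a GPU the same structure holds; the FFTs and the element-wise spectral multiply are the batched, highly parallel steps.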
Audit and account billing process in a private general hospital: a case study
Directory of Open Access Journals (Sweden)
Raquel Silva Bicalho Zunta
2017-12-01
Full Text Available Our study aimed to map, describe, and validate the audit, account billing, and billing report processes in a large private general hospital. This was an exploratory, descriptive case study. We conducted non-participatory observation sessions in the hospital's Internal Audit and Billing Report Sectors in order to map the processes under study. The data obtained were validated by internal and external specialists in hospital bill auditing. The processes, described and illustrated in three flow charts, help professionals rationalize their activities and the time spent on hospital billing, avoiding or minimizing flaws and generating more effective financial results. The mapping, description, and validation of the audit, account billing, and billing report processes gave greater visibility and legitimacy to the actions performed by nurse auditors.
Discrete-Event Execution Alternatives on General Purpose Graphical Processing Units
International Nuclear Information System (INIS)
Perumalla, Kalyan S.
2006-01-01
Graphics cards, traditionally designed as accelerators for computer graphics, have evolved to support more general-purpose computation. General Purpose Graphical Processing Units (GPGPUs) are now being used as highly efficient, cost-effective platforms for executing certain simulation applications. While most of these applications belong to the category of time-stepped simulations, little is known about the applicability of GPGPUs to discrete event simulation (DES). Here, we identify some of the issues and challenges that the GPGPU stream-based interface raises for DES, and present some possible approaches to moving DES to GPGPUs. Initial performance results on simulation of a diffusion process show that DES-style execution on GPGPU runs faster than DES on CPU and also significantly faster than time-stepped simulations on either CPU or GPGPU.
Introduction of a pyramid guiding process for general musculoskeletal physical rehabilitation
Directory of Open Access Journals (Sweden)
Stark Timothy W
2006-06-01
Full Text Available Successful instruction of a subject as complicated as Physical Rehabilitation demands organization. Understanding the principles and processes of such a field demands a hierarchy of steps to achieve the intended outcome. This paper is intended as an introduction to a proposed pyramid scheme of general physical rehabilitation principles. The purpose of the pyramid scheme is to allow a greater understanding for the student and patient. As with the well-known Food Guide Pyramid, the student will further appreciate and apply supported physical rehabilitation principles, and the patient will understand that there is a progressive method to their functional healing process.
Directory of Open Access Journals (Sweden)
Ruslan Skrynkovskyy
2017-12-01
Full Text Available The purpose of the article is to improve the model of the management process of an enterprise (institution, organization) on the basis of general management functions. A graphic model of the management process according to process-structured management is presented. It has been established that in today's business environment the model of the management process should include the following general management functions: (1) controlling the achievement of results; (2) planning based on the main goal; (3) coordination and corrective actions (in the system of organization of work and production); (4) action as a form of act (conscious, volitional, directed); (5) the accounting system (accounting, statistical, operational-technical and managerial); and (6) diagnosis (economic, legal), with subfunctions such as identification of the state and capabilities, analysis (economic, legal, systemic) with argumentation, and assessment of the state, trends and prospects of development. Prospects for further research in this direction are: (1) the formation of a system of interrelations between functions and management methods, taking into account the presented research results; and (2) the development of a model of an effective and efficient communication business process of the enterprise.
MacNamara, Annmarie; Proudfit, Greg Hajcak
2014-01-01
Generalized Anxiety Disorder (GAD) may be characterized by emotion regulation deficits attributable to an imbalance between top-down (i.e., goal-driven) and bottom-up (i.e., stimulus-driven) attention. In prior work, these attentional processes were examined by presenting unpleasant and neutral pictures within a working memory paradigm. The late positive potential (LPP) measured attention toward task-irrelevant pictures. Results from this prior work showed that working memory load reduced the...
The Green-Kubo formula for general Markov processes with a continuous time parameter
International Nuclear Information System (INIS)
Yang Fengxia; Liu Yong; Chen Yong
2010-01-01
For general Markov processes, the Green-Kubo formula is shown to be valid under a mild condition. A class of stochastic evolution equations on a separable Hilbert space and three typical infinite systems of locally interacting diffusions on Z^d (irreversible in most cases) are shown to satisfy the Green-Kubo formula, and the Einstein relations for these stochastic evolution equations are shown explicitly as a corollary.
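As a concrete illustration of the formula being validated, the Green-Kubo relation D = ∫₀^∞ ⟨v(0)v(t)⟩ dt can be checked numerically for an Ornstein-Uhlenbeck velocity process, whose answer is known in closed form. This is a textbook sketch, unrelated to the paper's Hilbert-space setting:

```python
import numpy as np

def green_kubo_D(acf, dt):
    """Green-Kubo transport coefficient D = ∫₀^∞ ⟨v(0)v(t)⟩ dt,
    approximated by the trapezoidal rule on a sampled velocity
    autocorrelation function."""
    return dt * (acf.sum() - 0.5 * (acf[0] + acf[-1]))

# Ornstein-Uhlenbeck velocity: ⟨v(0)v(t)⟩ = (sigma²/(2·gamma))·exp(−gamma·t),
# so the exact diffusivity is D = sigma²/(2·gamma²) = 0.125 here.
gamma, sigma, dt = 2.0, 1.0, 1e-3
t = np.arange(0.0, 10.0, dt)
acf = (sigma**2 / (2.0 * gamma)) * np.exp(-gamma * t)
D = green_kubo_D(acf, dt)
```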
International Nuclear Information System (INIS)
Deschaud, B.; Peyrusse, O.; Rosmej, F.B.
2014-01-01
Generalized atomic processes are proposed to establish a consistent description from the free-atom approach to the heated and even the cold solid. The description is based on a rigorous introduction of Fermi-Dirac statistics and Pauli blocking factors, and on respect of the principle of detailed balance via the introduction of direct and inverse processes. A probability formalism driven by the degeneracy of the free electrons makes it possible to establish atomic rates valid from the heated atom up to the cold solid. This allows the description of photoionization processes in atomic population kinetics and of the subsequent solid-matter heating on a femtosecond time scale. The Auger effect is linked to 3-body recombination via a generalized 3-body recombination that is identified as a key mechanism, along with collisional ionization, following energy deposition by photoionization of inner shells when short, intense, high-energy radiation interacts with matter. Detailed simulations carried out for aluminum highlight the importance of the generalized approach. (authors)
General methodology for exergy balance in ProSimPlus® process simulator
International Nuclear Information System (INIS)
Ghannadzadeh, Ali; Thery-Hetreux, Raphaële; Baudouin, Olivier; Baudet, Philippe; Floquet, Pascal; Joulia, Xavier
2012-01-01
This paper presents a general methodology for exergy balances in chemical and thermal processes integrated in ProSimPlus®, a process simulator well adapted to energy efficiency analysis. In this work, as well as using the general expressions for heat and work streams, the whole exergy balance is computed within a single piece of software in order to fully automate exergy analysis. In addition, after the exergy balance, the essential elements for exergy analysis, such as the sources of irreversibility, are presented to help the user modify either the process or the utility system. The applicability of the proposed methodology in ProSimPlus® is shown through a simple scheme of a Natural Gas Liquids (NGL) recovery process and its steam utility system. The methodology not only provides the user with the exergetic criteria needed to pinpoint the sources of exergy loss, but also helps the user find ways to reduce these losses. These features of the proposed exergy calculator make it preferable for implementation in ProSimPlus® to define the most realistic and profitable retrofit projects for existing chemical and thermal plants. -- Highlights: ► A set of new expressions for the calculation of the exergy of material streams is developed. ► A general methodology for exergy balance in ProSimPlus® is presented. ► A panel of solutions based on exergy analysis is provided to help the user modify process flowsheets. ► The exergy efficiency is chosen as a variable in a bi-criteria optimization.
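For orientation, the per-stream quantity such an exergy balance aggregates is the physical exergy ex = (h − h₀) − T₀(s − s₀) relative to the dead state. A tiny sketch with illustrative, hypothetical numbers (not ProSimPlus code):

```python
def stream_exergy(h, s, h0, s0, T0):
    """Physical (thermo-mechanical) exergy of a material stream,
    ex = (h - h0) - T0*(s - s0), relative to the dead state (h0, s0)
    at ambient temperature T0. Units: h in kJ/kg, s in kJ/(kg*K),
    T0 in K, result in kJ/kg."""
    return (h - h0) - T0 * (s - s0)

# Illustrative steam-like stream (hypothetical property values):
ex = stream_exergy(h=3000.0, s=6.5, h0=100.0, s0=0.35, T0=298.15)
```

Summing such terms over inlet and outlet streams, plus the exergy of heat and work streams, gives the irreversibility of each unit operation.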
ABOUT THE GENERAL CONCEPT OF THE UNIVERSAL STORAGE SYSTEM AND PRACTICE-ORIENTED DATA PROCESSING
Directory of Open Access Journals (Sweden)
L. V. Rudikova
2017-01-01
Full Text Available The evolution of approaches to data accumulation in warehouses and the subsequent use of Data Mining is a promising direction, particularly as the Belarusian segment of such IT development is taking shape. The article describes a general concept for a system of storage and practice-oriented data analysis based on data-warehousing technology. The main design aspect of the universal system, at the storage layer and in working with data, is the use of an extended data warehouse built on a universal stored-data platform. This grants access to storage and subsequent analysis of data of different structures and subject domains, provides connection points (nodes), and offers extended functionality with a choice of data structures for storage and subsequent intra-system integration. The general architecture of the universal system for storage and analysis of practice-oriented data and its structural elements are described. Its main components are: online data sources, the ETL process, the data warehouse, the analysis subsystem, and users. An important place in the system is occupied by the analytical processing of data, information search, document storage, and a software interface for accessing the system's functionality from outside. A universal system based on the described concept will allow the collection of information from different subject domains, the production of analytical summaries, data processing, and the application of appropriate Data Mining methods and algorithms.
Mahomed, Rosemary; St John, Winsome; Patterson, Elizabeth
2012-11-01
To investigate the process of patient satisfaction with nurse-led chronic disease management in Australian general practice. Nurses working in the primary care context of general practice, referred to as practice nurses, are expanding their role in chronic disease management; this is relatively new to Australia. Therefore, determining patient satisfaction with this trend is pragmatically and ethically important. However, the concept of patient satisfaction is not well understood, particularly in relation to care provided by practice nurses. A grounded theory study underpinned by a relativist ontological position and a relativist epistemology. Grounded theory was used to develop a theory from data collected through in-depth interviews with 38 participants between November 2007 and April 2009. Participants were drawn from a larger project that trialled a practice nurse-led, collaborative model of chronic disease management in three Australian general practices. Theoretical sampling, data collection, and analysis were conducted concurrently, consistent with grounded theory methods. Patients undergo a cyclical process of Navigating Care involving three stages: Determining Care Needs, Forming Relationship, and Having Confidence. The latter two processes are inter-related, and a feedback loop from them informs subsequent cycles of Determining Care Needs. If any of these steps fails to develop adequately, patients are likely to opt out of nurse-led care. Navigating Care explains how and why time, communication, continuity, and trust in general practitioners and nurses are important to patient satisfaction. It can be used to identify suitable patients for practice nurse-led care and to inform the practice and organization of practice nurse-led care to enhance patient satisfaction. © 2012 Blackwell Publishing Ltd.
41 CFR 102-37.50 - What is the general process for requesting surplus property for donation?
2010-07-01
... process for requesting surplus property for donation? 102-37.50 Section 102-37.50 Public Contracts and... REGULATION PERSONAL PROPERTY 37-DONATION OF SURPLUS PERSONAL PROPERTY General Provisions Donation Overview § 102-37.50 What is the general process for requesting surplus property for donation? The process for...
Generalized enthalpy model of a high-pressure shift freezing process
Smith, N. A. S.
2012-05-02
High-pressure freezing processes are a novel emerging technology in food processing, offering significant improvements to the quality of frozen foods. To be able to simulate plateau times and thermal history under different conditions, in this work, we present a generalized enthalpy model of the high-pressure shift freezing process. The model includes the effects of pressure on conservation of enthalpy and incorporates the freezing point depression of non-dilute food samples. In addition, the significant heat-transfer effects of convection in the pressurizing medium are accounted for by solving the two-dimensional Navier-Stokes equations. We run the model for several numerical tests where the food sample is agar gel, and find good agreement with experimental data from the literature. © 2012 The Royal Society.
Directory of Open Access Journals (Sweden)
Svitlana G. Lytvynova
2018-04-01
Full Text Available The article analyzes the historical aspect of the formation of computer modeling as one of the promising directions of educational process development. The notion of a “system of computer modeling”, the conceptual model of such a system (SCMod), its components (mathematical, animation, graphic, strategic), functions, principles and purposes of use are grounded. The features of the organization of students' work using SCMod, individual and group work, and the formation of subject competencies are described, and the aspect of students' motivation to learn is considered. It is established that educational institutions can use SCMod at different levels and stages of training and in different contexts, which consist of interrelated physical, social, cultural and technological aspects. It is determined that the use of SCMod in general secondary schools would increase the capacity of teachers to improve the training of students in natural and mathematical subjects and contribute to the individualization of the learning process, in order to meet the pace, educational interests and capabilities of each particular student. It is substantiated that the use of SCMod in the study of natural-mathematical subjects contributes to the formation of subject competencies, develops the skills of analysis and decision-making, increases the level of digital communication, develops vigilance, raises the level of knowledge, and increases the duration of students' attention. Further research requires the justification of the process of forming students' competencies in natural-mathematical subjects and the design of cognitive tasks using SCMod.
Moule, Pam; Clompus, Susan; Fieldhouse, Jon; Ellis-Jones, Julie; Barker, Jacqueline
2018-05-25
Underuse of anticoagulants in atrial fibrillation is known to increase the risk of stroke and is an international problem. The National Institute for Health and Care Excellence guidance CG180 seeks to reduce atrial fibrillation related strokes through prescriptions of Non-vitamin K antagonist Oral Anticoagulants. A quality improvement programme was established by the West of England Academic Health Science Network (West of England AHSN) to implement this guidance in General Practice. A realist evaluation identified whether the quality improvement programme worked, determining how and in what circumstances. Six General Practices in one region became the case study sites. Quality improvement team, doctor, and pharmacist meetings within each of the General Practices were recorded at three stages: initial planning, review, and final. Additionally, 15 interviews conducted with the practice leads explored experiences of the quality improvement process. Observation and interview data were analysed and compared against the initial programme theory. The quality improvement resources available were used variably, with the training being valued by all. The initial programme theories were refined. In particular, local workload pressures and individual General Practitioner experiences and pre-conceived ideas were acknowledged. Where key motivators were in place, such as prior experience, the programme achieved optimal outcomes and secured a lasting quality improvement legacy. The employment of a quality improvement programme can deliver practice change and improvement legacy outcomes when particular mechanisms are employed and in contexts where there is a commitment to improve service. © 2018 John Wiley & Sons, Ltd.
A Dirichlet process mixture of generalized Dirichlet distributions for proportional data modeling.
Bouguila, Nizar; Ziou, Djemel
2010-01-01
In this paper, we propose a clustering algorithm based on both Dirichlet processes and the generalized Dirichlet distribution, which has been shown to be very flexible for proportional data modeling. Our approach can be viewed as an extension of the finite generalized Dirichlet mixture model to the infinite case. The extension is based on nonparametric Bayesian analysis. This clustering algorithm does not require the number of mixture components to be specified in advance and estimates it in a principled manner. Our approach is Bayesian and relies on the estimation of the posterior distribution of clusterings using a Gibbs sampler. Through applications involving real-data classification and image database categorization using visual words, we show that clustering via infinite mixture models offers a more powerful and robust performance than classic finite mixtures.
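The "no fixed number of components" property comes from the Dirichlet process prior, whose partition structure is the Chinese restaurant process. A minimal sketch of drawing cluster assignments from that prior (illustrative only, not the authors' Gibbs sampler, which also conditions on the data likelihood):

```python
import numpy as np

def crp_assignments(n, alpha, rng=None):
    """Draw a partition of n items from the Chinese restaurant process,
    the predictive form of the Dirichlet process prior that lets a
    mixture grow its number of clusters with the data. Item i joins an
    existing cluster with probability proportional to its size, or a
    new cluster with probability proportional to alpha."""
    if rng is None:
        rng = np.random.default_rng()
    counts, labels = [], []
    for _ in range(n):
        probs = np.array(counts + [alpha], dtype=float)
        probs /= probs.sum()
        k = int(rng.choice(len(probs), p=probs))
        if k == len(counts):
            counts.append(1)      # open a new cluster
        else:
            counts[k] += 1
        labels.append(k)
    return labels
```

In the full sampler, each of these prior probabilities is multiplied by the generalized Dirichlet likelihood of the item under the candidate cluster before normalizing.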
Forbidden Raman scattering processes. I. General considerations and E1--M1 scattering
International Nuclear Information System (INIS)
Harney, R.C.
1979-01-01
The generalized theory of forbidden Raman scattering processes is developed in terms of the multipole expansion of the electromagnetic interaction Hamiltonian. Using the general expressions, the theory of electric dipole-magnetic dipole (E1-M1) Raman scattering is derived in detail. The ¹S₀ → ³P₁ E1-M1 Raman scattering cross section in atomic magnesium is calculated for two applicable laser wavelengths using published f-value data. Since resonantly enhanced cross sections larger than 10⁻²⁹ cm²/sr are predicted, it should be possible to experimentally observe this scattering phenomenon. In addition, by measuring the frequency dependence of the cross section near resonance, it may be possible to directly determine the relative magnitudes of the A·p and A·A contributions to the scattering cross section. Finally, possible applications of the effect in atomic and molecular physics are discussed.
Non-rigid ultrasound image registration using generalized relaxation labeling process
Lee, Jong-Ha; Seong, Yeong Kyeong; Park, MoonHo; Woo, Kyoung-Gu; Ku, Jeonghun; Park, Hee-Jun
2013-03-01
This research proposes a novel non-rigid registration method for ultrasound images. The most predominant anatomical features in medical images are tissue boundaries, which appear as edges. In ultrasound images, however, other features can be identified as well due to the specular reflections that appear as bright lines superimposed on the ideal edge location. In this work, an image's local phase information (via the frequency domain) is used to find the ideal edge location. The generalized relaxation labeling process is then formulated to align the feature points extracted from the ideal edge location. In this work, the original relaxation labeling method was generalized by taking n compatibility coefficient values to improve non-rigid registration performance. This contextual information combined with a relaxation labeling process is used to search for a correspondence. Then the transformation is calculated by the thin plate spline (TPS) model. These two processes are iterated until the optimal correspondence and transformation are found. We have tested our proposed method and the state-of-the-art algorithms with synthetic data and bladder ultrasound images of in vivo human subjects. Experiments show that the proposed method improves registration performance significantly, as compared to other state-of-the-art non-rigid registration algorithms.
Crawford, Forrest W.; Suchard, Marc A.
2011-01-01
A birth-death process is a continuous-time Markov chain that counts the number of particles in a system over time. In the general process with n current particles, a new particle is born with instantaneous rate λ_n and a particle dies with instantaneous rate μ_n. Currently no robust and efficient method exists to evaluate the finite-time transition probabilities in a general birth-death process with arbitrary birth and death rates. In this paper, we first revisit the theory of continued fractions to obtain expressions for the Laplace transforms of these transition probabilities and make explicit an important derivation connecting transition probabilities and continued fractions. We then develop an efficient algorithm for computing these probabilities that analyzes the error associated with approximations in the method. We demonstrate that this error-controlled method agrees with known solutions and outperforms previous approaches to computing these probabilities. Finally, we apply our novel method to several important problems in ecology, evolution, and genetics. PMID:21984359
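For comparison with such error-controlled methods, the finite-time transition probabilities of a truncated birth-death chain can be computed by uniformization. The sketch below is a standard alternative to the continued-fraction approach, with illustrative truncation parameters, and recovers known special cases:

```python
import numpy as np

def bd_transition(lam, mu, t, i, j, nmax=200, terms=500):
    """P(X(t) = j | X(0) = i) for a birth-death chain with birth rates
    lam(n) and death rates mu(n), via uniformization on the truncated
    state space 0..nmax (accurate when the process is unlikely to
    reach nmax by time t)."""
    Q = np.zeros((nmax + 1, nmax + 1))
    for n in range(nmax + 1):
        if n < nmax:
            Q[n, n + 1] = lam(n)
        if n > 0:
            Q[n, n - 1] = mu(n)
        Q[n, n] = -Q[n].sum()
    q = -Q.diagonal().min() + 1e-12       # uniformization rate
    P_hat = np.eye(nmax + 1) + Q / q      # discrete skeleton chain
    v = np.zeros(nmax + 1)
    v[i] = 1.0
    out = np.zeros(nmax + 1)
    w = np.exp(-q * t)                    # Poisson(qt) weight, k = 0
    for k in range(terms):
        out += w * v
        v = v @ P_hat
        w *= q * t / (k + 1)
    return out[j]
```

Unlike the continued-fraction method, this brute-force approach needs a state-space cutoff and its cost grows with qt, which is exactly the regime where the paper's algorithm is attractive.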
Kerr-de Sitter spacetime, Penrose process, and the generalized area theorem
Bhattacharya, Sourav
2018-04-01
We investigate various aspects of energy extraction via the Penrose process in the Kerr-de Sitter spacetime. We show that an increase in the value of a positive cosmological constant, Λ, always reduces the efficiency of this process. The Kerr-de Sitter spacetime has two ergospheres, associated with the black hole and the cosmological event horizons. We prove by analyzing turning points of the trajectory that the Penrose process in the cosmological ergoregion is never possible. We next show that in this process the areas of both the black hole and the cosmological event horizon increase, the latter becoming possible when the particle coming from the black hole ergoregion escapes through the cosmological event horizon. We identify a new, local mass function, instead of the mass parameter, to prove this generalized area theorem. This mass function also takes care of the local spacetime energy due to the cosmological constant, including that arising from the frame-dragging effect of spacetime rotation. While the current observed value of Λ is quite small, its effect on this process could be considerable in the early Universe scenario, where its value is much larger and the two horizons could have comparable sizes. In particular, the various results obtained here are also evaluated in a triply degenerate limit of the Kerr-de Sitter spacetime that we identify, in which the radial values of the inner, black hole and cosmological event horizons are nearly coincident.
Directory of Open Access Journals (Sweden)
Olexandr Tyhorskyy
2015-10-01
Full Text Available Purpose: to improve the training methods of highly skilled bodybuilders. Material and Methods: the study involved eight highly skilled athletes, members of the national team of Ukraine in bodybuilding. Results: the most commonly used methods of organizing the training process in bodybuilding are compared. An optimal training method for highly skilled bodybuilders during the general preparatory phase of the preparatory period was developed and substantiated; it can increase athletes' body weight through the muscle component. Conclusions: using the dynamic load factor to raise the intensity of training loads helps to increase the volume of the shoulder muscles.
Correia, J R C C C; Martins, C J A P
2017-10-01
Topological defects unavoidably form at symmetry breaking phase transitions in the early universe. To probe the parameter space of theoretical models and set tighter experimental constraints (exploiting the recent advances in astrophysical observations), one requires more and more demanding simulations, and therefore more hardware resources and computation time. Improving the speed and efficiency of existing codes is essential. Here we present a general purpose graphics-processing-unit implementation of the canonical Press-Ryden-Spergel algorithm for the evolution of cosmological domain wall networks. This is ported to the Open Computing Language standard, and as a consequence significant speedups are achieved both in two-dimensional (2D) and 3D simulations.
Convexity of Ruin Probability and Optimal Dividend Strategies for a General Lévy Process
Directory of Open Access Journals (Sweden)
Chuancun Yin
2015-01-01
Full Text Available We consider the optimal dividends problem for a company whose cash reserves follow a general Lévy process with certain positive jumps and arbitrary negative jumps. The objective is to find a policy which maximizes the expected discounted dividends until the time of ruin. Under appropriate conditions, we use some recent results in the theory of potential analysis of subordinators to obtain the convexity properties of probability of ruin. We present conditions under which the optimal dividend strategy, among all admissible ones, takes the form of a barrier strategy.
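A barrier strategy is easy to illustrate by simulation for the Cramér-Lundberg special case of a Lévy reserve (compound-Poisson claims plus premium drift). The sketch below, with hypothetical parameters and crude Euler time stepping, pays out any surplus above the barrier b as discounted dividends until ruin or a finite horizon:

```python
import numpy as np

def discounted_dividends(x0, b, c, lam, mean_claim, r, T, dt, rng):
    """Monte-Carlo sketch of a barrier dividend strategy for a
    Cramér-Lundberg reserve: premium rate c, Poisson claim rate lam,
    exponential(mean_claim) claim sizes, discount rate r. Any surplus
    above the barrier b is paid out immediately as a discounted
    dividend; the run stops at ruin or at the horizon T."""
    x, total, t = min(x0, b), 0.0, 0.0
    while t < T:
        x += c * dt                                  # premium income
        if rng.random() < lam * dt:                  # a claim arrives
            x -= rng.exponential(mean_claim)
        if x > b:                                    # barrier payout
            total += np.exp(-r * t) * (x - b)
            x = b
        if x < 0:                                    # ruin
            break
        t += dt
    return total
```

Averaging this quantity over many sample paths estimates the value function the paper maximizes analytically; with no claims (lam = 0) the simulation reduces to the deterministic annuity c(1 − e^(−rT))/r.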
Convexity of Ruin Probability and Optimal Dividend Strategies for a General Lévy Process
Yuen, Kam Chuen; Shen, Ying
2015-01-01
We consider the optimal dividends problem for a company whose cash reserves follow a general Lévy process with certain positive jumps and arbitrary negative jumps. The objective is to find a policy which maximizes the expected discounted dividends until the time of ruin. Under appropriate conditions, we use some recent results in the theory of potential analysis of subordinators to obtain the convexity properties of probability of ruin. We present conditions under which the optimal dividend strategy, among all admissible ones, takes the form of a barrier strategy. PMID:26351655
Generalized Fractional Processes with Long Memory and Time Dependent Volatility Revisited
Directory of Open Access Journals (Sweden)
M. Shelton Peiris
2016-09-01
Full Text Available In recent years, fractionally-differenced processes have received a great deal of attention due to their flexibility in financial applications with long memory. This paper revisits the class of generalized fractionally-differenced processes generated by Gegenbauer polynomials and the ARMA structure (GARMA), with both long memory and time-dependent innovation variance. We establish the existence and uniqueness of second-order solutions. We also extend this family with innovations that follow GARCH and stochastic volatility (SV) models. Under certain regularity conditions, we give asymptotic results for the approximate maximum likelihood estimator for the GARMA-GARCH model. We discuss a Monte Carlo likelihood method for the GARMA-SV model and investigate finite sample properties via Monte Carlo experiments. Finally, we illustrate the usefulness of this approach using monthly inflation rates for France, Japan and the United States.
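The Gegenbauer long-memory filter at the heart of GARMA, (1 − 2uB + B²)^(−d), has MA coefficients given by the standard Gegenbauer polynomial recursion. A minimal sketch (u is the cosine of the Gegenbauer frequency; illustrative only):

```python
def gegenbauer_coeffs(d, u, m):
    """Coefficients C_j in the GARMA long-memory expansion
    (1 - 2uB + B^2)^(-d) = sum_j C_j B^j, via the standard Gegenbauer
    polynomial recursion (d: memory parameter, u: cosine of the
    Gegenbauer frequency, m: highest lag returned)."""
    c = [1.0, 2.0 * d * u]
    for j in range(2, m + 1):
        c.append((2.0 * u * (j - 1 + d) * c[j - 1]
                  - (j - 2 + 2.0 * d) * c[j - 2]) / j)
    return c[:m + 1]

# Sanity check: u = 1 collapses (1 - 2B + B^2)^(-d) to (1 - B)^(-2d);
# with d = 0.5 this is (1 - B)^(-1), whose coefficients are all 1.
```

Convolving these coefficients with ARMA innovations (whose variance follows GARCH or SV dynamics, as in the paper) generates sample paths of the process.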
General description of few-body break-up processes at threshold
International Nuclear Information System (INIS)
Barrachina, R.O.
2004-01-01
Full text: In this communication we present a general description of the behavior of fragmentation processes near threshold by analyzing the break-up into two, three and N bodies in steps of increasing complexity. In particular, we describe the effects produced by an N-body threshold behavior in N+1 body break-up processes, as it occurs in situations where one of the fragments acquires almost all the excess energy of the system. Furthermore, we relate the appearance of cusps and discontinuities in single-particle multiply differential cross sections to the threshold behavior of the remaining particles, and apply these ideas to different systems from atomic, molecular and nuclear collision physics. We finally show that, even though the study of ultracold collisions represents the direct way of gathering information on a break-up system near threshold, the analysis of high-energy collisions provides an alternative, and sometimes advantageous, approach
Naylor, Larissa; Coombes, Martin; Sewell, Jack; White, Anissia
2014-05-01
Coastal processes shape the coast into a variety of eye-catching and enticing landforms that attract people to marvel at, relax in and enjoy coastal geomorphology. Field guides explaining these processes (and the geodiversity that results) to the general public and children are few and far between. In contrast, there is a relative wealth of resources and organised activities introducing people to coastal wildlife, especially on rocky shores. These biological resources typically focus on the biology and the climatic controls on species' distribution, rather than on how the biology interacts with its physical habitat. As an outcome of two recent rock coast biogeomorphology projects (www.biogeomorph.org/coastal/coastaldefencedbiodiversity and www.biogeomorph.org/coastal/bioprotection ), we produced the first known guide to understanding how biogeomorphological processes help create coastal landforms. The 'Shore Shapers' guide (www.biogeomorph.org/coastal/shoreshapers) is designed to: a) bring biotic-geomorphic interactions to life and b) introduce some of the geomorphological and geological controls on biogeomorphic processes and landform development. The guide provides scientific information in an accessible and interactive way, to help sustain children's interest and extend their learning. We tested a draft version of the guide with children, the general public and volunteers on rocky shore rambles using social science techniques; of 74 respondents, 75.6% were more interested in understanding how rock pools (i.e. coastal landforms) develop after seeing the guide. Respondents' opinions about key bioprotective species also changed as a result of seeing the guide: 58% of people found barnacles unattractive before they saw the guide, whilst 36% of respondents were more interested in barnacles after seeing it. These results demonstrate considerable interest in more educational materials on coastal biogeomorphology and geodiversity.
Degradation data analysis based on a generalized Wiener process subject to measurement error
Li, Junxing; Wang, Zhihua; Zhang, Yongbo; Fu, Huimin; Liu, Chengrui; Krishnaswamy, Sridhar
2017-09-01
Wiener processes have received considerable attention in degradation modeling over the last two decades. In this paper, we propose a generalized Wiener process degradation model that takes unit-to-unit variation, time-correlated structure and measurement error into consideration simultaneously. The constructed methodology subsumes a series of models studied in the literature as limiting cases. A simple method is given to determine the transformed time scale forms of the Wiener process degradation model. Model parameters can then be estimated by maximum likelihood estimation (MLE). The cumulative distribution function (CDF) and the probability density function (PDF) of the Wiener process with measurement errors are given based on the concept of the first hitting time (FHT). The percentiles of performance degradation (PD) and failure time distribution (FTD) are also obtained. Finally, a comprehensive simulation study is conducted to demonstrate the necessity of incorporating measurement errors in the degradation model and the efficiency of the proposed model. Two illustrative real applications involving the degradation of carbon-film resistors and the wear of sliding metal are given. The comparative results show that the constructed approach derives reasonable results with enhanced inference precision.
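As a rough illustration of this class of model (not the authors' exact formulation; all parameter values below are hypothetical), a Wiener degradation path with a power-law transformed time scale and additive Gaussian measurement error can be simulated as:

```python
import numpy as np

rng = np.random.default_rng(0)

def wiener_degradation(t, mu=1.0, sigma_b=0.3, q=1.2, sigma_eps=0.1):
    """Simulate one path X(t) = mu*L(t) + sigma_b*B(L(t)) with a
    power-law transformed time scale L(t) = t**q, observed through
    additive Gaussian measurement error: Y(t) = X(t) + eps."""
    lam = t ** q                          # transformed time scale L(t)
    dlam = np.diff(lam, prepend=0.0)
    # Brownian increments: variance proportional to the increment of L(t)
    x = np.cumsum(mu * dlam + sigma_b * rng.normal(0.0, np.sqrt(dlam)))
    y = x + rng.normal(0.0, sigma_eps, size=t.size)  # measured values
    return x, y

t = np.linspace(0.01, 10, 200)
x, y = wiener_degradation(t)   # true and measured degradation paths
```

From paths like `y`, the model parameters (mu, sigma_b, q, sigma_eps) would be recovered by MLE, and the FHT of a failure threshold gives the failure time distribution; those steps are omitted here.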
Energy Technology Data Exchange (ETDEWEB)
Allu, Srikanth [ORNL; Velamur Asokan, Badri [Exxon Mobil Research and Engineering; Shelton, William A [Louisiana State University; Philip, Bobby [ORNL; Pannala, Sreekanth [ORNL
2014-01-01
A generalized three-dimensional computational model based on a unified formulation of the electrode-electrolyte-electrode system of an electric double layer supercapacitor has been developed. The model accounts for charge transport across the solid-liquid system. This formulation, based on a volume averaging process, is a widely used concept for multiphase flow equations ([28] [36]) and is analogous to the porous media theory typically employed for electrochemical systems [22] [39] [12]. This formulation is extended to the electrochemical equations for a supercapacitor in a consistent fashion, which allows for a single-domain approach with no need for explicit interfacial boundary conditions as previously employed ([38]). In this model it is easy to introduce spatio-temporal variations and anisotropies of physical properties, and it is also conducive to introducing any upscaled parameters from lower length-scale simulations and experiments. Due to the irregular geometric configurations, including porous electrodes, the charge transport and subsequent performance characteristics of the supercapacitor can be easily captured in higher dimensions. A generalized model of this nature also provides insight into the applicability of 1D models ([38]) and where multidimensional effects need to be considered. In addition, a simple sensitivity analysis on key input parameters is performed in order to ascertain the dependence of the charge and discharge processes on these parameters. Finally, we demonstrated how this new formulation can be applied to non-planar supercapacitors.
Directory of Open Access Journals (Sweden)
Michele Biasutti
2017-06-01
Improvisation is an articulated multidimensional activity based on an extemporaneous creative performance. Practicing improvisation, participants expand sophisticated skills such as sensory and perceptual encoding, memory storage and recall, motor control, and performance monitoring. Improvisation abilities have been developed following several methodologies, mainly with a product-oriented perspective. A model framed under the socio-cultural theory of learning for designing didactic activities focused on processes instead of outcomes is presented in the current paper. The challenge is to overcome the merely instructional dimension of some practices of teaching improvisation by designing activities that stimulate self-regulated learning strategies in the students. In the article this thesis is developed in three possible areas of application: (1) high-level musical learning, (2) musical pedagogy with children, and (3) general pedagogy. The applications in the music field, focusing mainly on an expert's use of improvisation, are discussed. The last section considers how these ideas should transcend music studies, presenting the benefits and the implications of improvisation activities for general learning. Moreover, the application of music education to the following cognitive processes is discussed: anticipation, use of repertoire, emotive communication, feedback, and flow. These characteristics could be used to outline a pedagogical method for teaching music improvisation based on the development of reflection, reasoning, and meta-cognition.
DNA Processing and Reassembly on General Purpose FPGA-based Development Boards
Directory of Open Access Journals (Sweden)
SZÁSZ Csaba
2017-05-01
The great majority of researchers involved in microelectronics generally agree that many scientific challenges in the life sciences carry a powerful computational requirement that must be met before scientific progress can be made. The current trend in Deoxyribonucleic Acid (DNA) computing technologies is to develop special hardware platforms capable of providing the needed processing performance at lower cost. In this endeavor, FPGA-based (Field Programmable Gate Array) configurations aimed at accelerating genome sequencing and reassembly play a leading role. This paper emphasizes the benefits and advantages of using general purpose FPGA-based development boards in DNA reassembly applications beside special hardware architecture solutions. An original approach is unfolded which outlines the versatility of high performance ready-to-use manufacturer development platforms endowed with powerful hardware resources fully optimized for high speed processing applications. The theoretical arguments are supported via an intuitive implementation example in which the designer is relieved of any hardware development effort and can concentrate exclusively on software design issues, greatly reducing application development cycles. The experiments prove that such boards available on the market are suitable for a wide range of DNA sequencing and reassembly applications.
General emotion processing in social anxiety disorder: neural issues of cognitive control.
Brühl, Annette Beatrix; Herwig, Uwe; Delsignore, Aba; Jäncke, Lutz; Rufer, Michael
2013-05-30
Anxiety disorders are characterized by deficient emotion regulation prior to and in anxiety-evoking situations. Patients with social anxiety disorder (SAD) have increased brain activation also during the anticipation and perception of non-specific emotional stimuli, pointing to biased general emotion processing. In the current study we addressed the neural correlates of emotion regulation by cognitive control during the anticipation and perception of non-specific emotional stimuli in patients with SAD. Thirty-two patients with SAD underwent functional magnetic resonance imaging during the announced anticipation and perception of emotional stimuli. Half of them were trained and instructed to apply reality-checking as a control strategy; the others anticipated and perceived the stimuli. Reality checking significantly (p …) … perception of negative emotional stimuli. The medial prefrontal cortex was comparably active in both groups (p>0.50). The results suggest that cognitive control in patients with SAD influences emotion processing structures, supporting the usefulness of emotion regulation training in the psychotherapy of SAD. In contrast to studies in healthy subjects, cognitive control was not associated with increased activation of prefrontal regions in SAD. This points to possibly disturbed general emotion regulating circuits in SAD. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Perusich, Stephen; Moos, Thomas; Muscatello, Anthony
2011-01-01
This innovation provides the user with autonomous on-screen monitoring, embedded computations, and tabulated output for two new processes. The software was originally written for the Continuous Lunar Water Separation Process (CLWSP), but was found to be general enough to be applicable to the Lunar Greenhouse Amplifier (LGA) as well, with minor alterations. The resultant program should have general applicability to many laboratory processes (see figure). The objective for these programs was to create a software application that would provide both autonomous monitoring and data storage, along with manual manipulation. The software also allows operators the ability to input experimental changes and comments in real time without modifying the code itself. Common process elements, such as thermocouples, pressure transducers, and relative humidity sensors, are easily incorporated into the program in various configurations, along with specialized devices such as photodiode sensors. The goal of the CLWSP research project is to design, build, and test a new method to continuously separate, capture, and quantify water from a gas stream. The application is any In-Situ Resource Utilization (ISRU) process that desires to extract or produce water from lunar or planetary regolith. The present work is aimed at circumventing current problems and ultimately producing a system capable of continuous operation at moderate temperatures that can be scaled over a large capacity range depending on the ISRU process. The goal of the LGA research project is to design, build, and test a new type of greenhouse that could be used on the moon or Mars. The LGA uses super greenhouse gases (SGGs) to absorb long-wavelength radiation, thus creating a highly efficient greenhouse at a future lunar or Mars outpost. Silica-based glass, although highly efficient at trapping heat, is heavy, fragile, and not suitable for space greenhouse applications. Plastics are much lighter and resilient, but are not
A national general pediatric clerkship curriculum: the process of development and implementation.
Olson, A L; Woodhead, J; Berkow, R; Kaufman, N M; Marshall, S G
2000-07-01
To describe a new national general pediatrics clerkship curriculum, the development process that built national support for its use, and current progress in implementing the curriculum in pediatric clerkships at US allopathic medical schools. CURRICULUM DEVELOPMENT: A curriculum project team of pediatric clerkship directors and an advisory committee representing professional organizations invested in pediatric student education developed the format and content in collaboration with pediatric educators from the Council on Medical Student Education in Pediatrics (COMSEP) and the Ambulatory Pediatric Association (APA). An iterative process of review by clerkship directors, pediatric departmental chairs, and students finalized the content and built support for the final product. The national dissemination process resulted in consensus among pediatric educators that this curriculum should be used as the national curricular guideline for clerkships. MONITORING IMPLEMENTATION: Surveys were mailed to all pediatric clerkship directors before dissemination (November 1994), and in the first and third academic years after national dissemination (March 1996 and September 1997). The 3 surveys assessed schools' implementation of specific components of the curriculum. The final survey also assessed ways the curriculum was used and barriers to implementation. The final curriculum provided objectives and competencies for attitudes, skills, and 18 knowledge areas of general pediatrics. A total of 216 short clinical cases were also provided as an alternative learning method. An accompanying resource manual provided suggested strategies for implementation, teaching, and evaluation. A total of 103 schools responded to survey 1; 84 schools to survey 2; and 85 schools to survey 3, of the 125 medical schools surveyed. Before dissemination, 16% of schools were already using the clinical cases. In the 1995-1996 academic year, 70% of schools were using some or all of the curricular
Generalized hardware post-processing technique for chaos-based pseudorandom number generators
Barakat, Mohamed L.
2013-06-01
This paper presents a generalized post-processing technique for enhancing the pseudorandomness of digital chaotic oscillators through a nonlinear XOR-based operation with rotation and feedback. The technique allows full utilization of the chaotic output as pseudorandom number generators and improves throughput without a significant area penalty. Digital design of a third-order chaotic system with maximum function nonlinearity is presented with verified chaotic dynamics. The proposed post-processing technique eliminates statistical degradation in all output bits, thus maximizing throughput compared to other processing techniques. Furthermore, the technique is applied to several fully digital chaotic oscillators with performance surpassing previously reported systems in the literature. The enhancement in the randomness is further examined in a simple image encryption application resulting in a better security performance. The system is verified through experiment on a Xilinx Virtex 4 FPGA with throughput up to 15.44 Gbit/s and logic utilization less than 0.84% for 32-bit implementations. © 2013 ETRI.
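A software sketch of the general idea described above: XORing each chaotic output word with a rotated copy of itself and with rotated feedback from the previous processed word. The word width, rotation amounts, and feedback structure here are illustrative assumptions, not the paper's verified hardware design:

```python
def rotl(x, r, width=32):
    """Rotate a width-bit word left by r bits."""
    mask = (1 << width) - 1
    return ((x << r) | (x >> (width - r))) & mask

def postprocess(raw_words, width=32, rot=7):
    """Nonlinear XOR-based post-processing with rotation and feedback:
    each output word mixes the raw word, a rotated copy of it, and
    rotated feedback from the previously emitted word."""
    out = []
    fb = 0  # feedback register, seeded at zero
    for w in raw_words:
        v = w ^ rotl(w, rot, width) ^ rotl(fb, rot // 2, width)
        fb = v
        out.append(v)
    return out
```

In hardware this maps to XOR gates, fixed wiring for the rotations, and one register for the feedback, which is why the area penalty reported in the abstract is small.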
A Business Process Management System based on a General Optimium Criterion
Directory of Open Access Journals (Sweden)
Vasile MAZILESCU
2009-01-01
Business Process Management Systems (BPMS) provide a broad range of facilities to manage operational business processes. These systems should provide support for the complete Business Process Management (BPM) life-cycle [16]: (re)design, configuration, execution, control, and diagnosis of processes. BPMS can be seen as successors of Workflow Management (WFM) systems. However, already in the seventies people were working on office automation systems which are comparable with today's WFM systems. Recently, WFM vendors started to position their systems as BPMS. Our paper's goal is a proposal for a Tasks-to-Workstations Assignment Algorithm (TWAA) for assembly lines, which is a special implementation of a stochastic descent technique, in the context of BPMS, especially at the control level. Both cases, single and mixed-model, are treated. For a family of product models having the same generic structure, the mixed-model assignment problem can be formulated through an equivalent single-model problem. A general optimum criterion is considered. As with assembly line balancing, this kind of optimisation problem leads to a graph partitioning problem meeting precedence and feasibility constraints. The proposed definition for the "neighbourhood" function involves an efficient way of treating the partition and precedence constraints. Moreover, the Stochastic Descent Technique (SDT) allows an implicit treatment of the feasibility constraint. The proposed algorithm converges with probability 1 to an optimal solution.
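The optimisation step can be sketched as a plain stochastic descent over task-to-station assignments that accepts non-worsening neighbour moves. This is a deliberate simplification: the paper's SDT treats feasibility implicitly and converges with probability 1, which requires occasional uphill moves that this sketch omits. All instance data below are made up:

```python
import random

def cost(assign, times, n_stations):
    """Cycle time: the maximum total task time on any station."""
    loads = [0.0] * n_stations
    for task, st in enumerate(assign):
        loads[st] += times[task]
    return max(loads)

def feasible(task, st, assign, prec):
    """Precedence: predecessors at earlier-or-equal stations,
    successors at later-or-equal stations."""
    return (all(assign[p] <= st for p in prec.get(task, ())) and
            all(st <= assign[s] for s, ps in prec.items() if task in ps))

def stochastic_descent(times, n_stations, prec, iters=5000, seed=1):
    rng = random.Random(seed)
    assign = [0] * len(times)       # all tasks on station 0: feasible
    best = cost(assign, times, n_stations)
    for _ in range(iters):
        t = rng.randrange(len(times))
        st = assign[t] + rng.choice((-1, 1))   # random neighbour move
        if not 0 <= st < n_stations or not feasible(t, st, assign, prec):
            continue
        old, assign[t] = assign[t], st
        c = cost(assign, times, n_stations)
        if c <= best:
            best = c                # accept non-worsening moves
        else:
            assign[t] = old         # revert worsening moves
    return assign, best

times = [2.0, 3.0, 2.0, 3.0]        # hypothetical task durations
prec = {1: [0], 3: [2]}             # task -> its predecessors
assign, best = stochastic_descent(times, n_stations=2, prec=prec)
```

Pure descent like this can stall on a plateau above the optimum, which is exactly why the SDT's probabilistic acceptance of worse moves matters for the convergence guarantee.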
International Nuclear Information System (INIS)
Nascimento, M.A.C. do
1992-01-01
A Generalized Multi Structural (GMS) wave function is presented which combines the advantages of the SCF-MO and VB models, preserving the classical chemical structures but optimizing the orbitals in a self-consistent way. This wave function is particularly suitable to treat situations where the description of the molecular state requires localized wave functions. It also provides a very convenient way of treating the electron correlation problem, avoiding large CI expansions. The final wave functions are much more compact and easier to interpret than the ones obtained by the conventional methods, using orthogonal orbitals. Applications of the GMS wave function to the study of the photoelectron spectra of the trans-glyoxal molecule and to electron impact excitation processes in the nitrogen molecule are presented as an illustration of the method. (author)
"Sheiva": a general purpose multi-parameter data acquisition and processing system at VECC
International Nuclear Information System (INIS)
Viyogi, Y.P.; Ganguly, N.K.
1982-01-01
A general purpose interactive software to be used with the PDP-15/76 on-line computer at VEC Centre for the acquisition and processing of data in nuclear physics experiments is described. The program can accommodate a maximum of thirty two inputs although the present hardware limits the number of inputs to eight. Particular emphasis is given to the problems of flexibility and ease of operation, memory optimisation and techniques dealing with experimenter-computer interaction. Various graphical methods for one- and two-dimensional data presentation are discussed. Specific problems of particle identification using detector telescopes have been dealt with carefully to handle experiments using several detector telescopes and those involving light particle-heavy particle coincidence studies. Steps needed to tailor this program towards utilisation for special experiments are also described. (author)
Attention allocation: Relationships to general working memory or specific language processing.
Archibald, Lisa M D; Levee, Tyler; Olino, Thomas
2015-11-01
Attention allocation, updating working memory, and language processing are interdependent cognitive tasks related to the focused direction of limited resources, refreshing and substituting information in the current focus of attention, and receiving/sending verbal communication, respectively. The current study systematically examined the relationship among executive attention, working memory executive skills, and language abilities while adjusting for individual differences in short-term memory. School-age children completed a selective attention task requiring them to recall whether a presented shape was in the same place as a previous target shape shown in an array imposing a low or high working memory load. Results revealed a selective attention cost when working above but not within memory span capacity. Measures of general working memory were positively related to overall task performance, whereas language abilities were related to response time. In particular, higher language skills were associated with faster responses under low load conditions. These findings suggest that attentional control and storage demands have an additive impact on working memory resources but provide only limited evidence for a domain-general mechanism in language learning. Crown Copyright © 2015. Published by Elsevier Inc. All rights reserved.
Yang, Chih-Hao; Huang, Chiung-Chun; Hsu, Kuei-Sen
2011-09-01
Repetitive replay of fear memories may precipitate the occurrence of post-traumatic stress disorder and other anxiety disorders. Hence, the suppression of fear memory retrieval may help prevent and treat these disorders. The formation of fear memories is often linked to multiple environmental cues, and these interconnected cues may act as reminders for the recall of traumatic experiences. However, as a convenience, a simple paradigm of one cue paired with the aversive stimulus is usually used in studies of fear conditioning in animals. Here, we built a more complex fear conditioning model by presenting several environmental stimuli during fear conditioning and characterized the effectiveness of extinction training and of disruption of the reconsolidation process on the expression of learned fear responses. We demonstrate that extinction training with a single paired cue resulted in cue-specific attenuation of fear responses, but responses to other cues were unchanged. The cue-specific nature of the extinction persisted despite training sessions combined with D-cycloserine treatment, revealing a significant weakness in extinction-based treatment. In contrast, the inhibition of the dorsal hippocampus (DH)- but not the basolateral amygdala (BLA)-dependent memory reconsolidation process using either protein synthesis inhibitors or genetic disruption of cAMP-response-element-binding protein-mediated transcription comprehensively disrupted the learned connections between fear responses and all paired environmental cues. These findings emphasize the distinct roles of the DH and the BLA in the reconsolidation process of fear memories and further indicate that disruption of the memory reconsolidation process in the DH may result in generalization of fear inhibition.
A general-purpose process modelling framework for marine energy systems
International Nuclear Information System (INIS)
Dimopoulos, George G.; Georgopoulou, Chariklia A.; Stefanatos, Iason C.; Zymaris, Alexandros S.; Kakalis, Nikolaos M.P.
2014-01-01
Highlights: • Process modelling techniques applied in marine engineering. • Systems engineering approaches to manage the complexity of modern ship machinery. • General purpose modelling framework called COSSMOS. • Mathematical modelling of conservation equations and related chemical – transport phenomena. • Generic library of ship machinery component models. - Abstract: High fuel prices, environmental regulations and current shipping market conditions require ships to operate in a more efficient and greener way. These drivers lead to the introduction of new technologies, fuels, and operations, increasing the complexity of modern ship energy systems. As a means to manage this complexity, in this paper we present the introduction of systems engineering methodologies into marine engineering via the development of a general-purpose process modelling framework for ships, named DNV COSSMOS. Shifting the focus from components – the standard approach in shipping – to systems widens the space of optimal design and operation solutions. The associated computer implementation of COSSMOS is a platform that models, simulates and optimises integrated marine energy systems with respect to energy efficiency, emissions, safety/reliability and costs, under both steady-state and dynamic conditions. DNV COSSMOS can be used in the assessment and optimisation of design and operation problems in existing vessels, new builds as well as new technologies. The main features and our modelling approach are presented, and key capabilities are illustrated via two studies on the thermo-economic design and operation optimisation of a combined cycle system for large bulk carriers, and the transient operation simulation of an electric marine propulsion system.
Generalized Least Energy of Separation for Desalination and Other Chemical Separation Processes
Directory of Open Access Journals (Sweden)
Karan H. Mistry
2013-05-01
Increasing global demand for fresh water is driving the development and implementation of a wide variety of seawater desalination technologies driven by different combinations of heat, work, and chemical energy. This paper develops a consistent basis for comparing the energy consumption of such technologies using Second Law efficiency. The Second Law efficiency for a chemical separation process is defined in terms of the useful exergy output, which is the minimum least work of separation required to extract a unit of product from a feed stream of a given composition. For a desalination process, this is the minimum least work of separation for producing one kilogram of product water from feed of a given salinity. While definitions in terms of work and heat input have been proposed before, this work generalizes the Second Law efficiency to allow for systems that operate on a combination of energy inputs, including fuel. The generalized equation is then evaluated through a parametric study considering work input, heat inputs at various temperatures, and various chemical fuel inputs. Further, since most modern, large-scale desalination plants operate in cogeneration schemes, a methodology for correctly evaluating Second Law efficiency for the desalination plant based on primary energy inputs is demonstrated. It is shown that, from a strictly energetic point of view and based on currently available technology, cogeneration using electricity to power a reverse osmosis system is energetically superior to thermal systems such as multiple effect distillation and multistage flash distillation, despite the very low grade heat input normally applied in those systems.
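To make the comparison concrete, here is a minimal numerical sketch of the generalized Second Law efficiency idea: each energy input is weighted by its exergy content, and the efficiency is the minimum least work of separation divided by the total exergy input. The structure follows the abstract's description, but the specific numbers (a least work of roughly 2.7 kJ/kg for seawater, and the work and heat inputs) are illustrative assumptions, not values from the paper:

```python
def exergy_of_heat(q, t_source, t0=298.15):
    """Exergy content of heat q (kJ) delivered at t_source (K),
    relative to a dead state at t0: q * (1 - T0/T)."""
    return q * (1.0 - t0 / t_source)

def second_law_efficiency(w_least_min, w_in=0.0, heat_in=()):
    """eta_II = minimum least work of separation / total exergy input.
    heat_in is an iterable of (q, t_source) pairs; all energies per
    kg of product water."""
    exergy_in = w_in + sum(exergy_of_heat(q, t) for q, t in heat_in)
    return w_least_min / exergy_in

# Illustrative values (kJ per kg of product water, hypothetical):
W_LEAST = 2.7  # rough minimum least work for typical seawater salinity
eta_work = second_law_efficiency(W_LEAST, w_in=10.8)             # work-driven
eta_heat = second_law_efficiency(W_LEAST, heat_in=[(180.0, 343.15)])  # 70 °C heat
```

Because low-grade heat carries little exergy per kJ, a heat-driven plant needs far more thermal energy than the equivalent work-driven plant, which is the intuition behind the abstract's conclusion about reverse osmosis cogeneration.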
Pomeroy, Sylvia E M; Cant, Robyn P
2010-01-01
The aim of this project was to describe general practitioners' (GPs') decision-making process for reducing nutrition risk in cardiac patients through referring a patient to a dietitian. The setting was primary care practices in Victoria. The method we employed was mixed methods research: in Study 1, 30 GPs were interviewed. Recorded interviews were transcribed and narratives analysed thematically. Study 2 involved a survey of statewide random sample of GPs. Frequencies and analyses of variance were used to explore the impact of demographic variables on decisions to refer. We found that the referral decision involved four elements: (i) synthesising management information; (ii) forecasting outcomes; (iii) planning management; and (iv) actioning referrals. GPs applied cognitive and collaborative strategies to develop a treatment plan. In Study 2, doctors (248 GPs, 30%) concurred with identified barriers/enabling factors for patients' referral. There was no association between GPs' sex, age or hours worked per week and referral factors. We conclude that a GP's judgment to offer a dietetic referral to an adult patient is a four element reasoning process. Attention to how these elements interact may assist clinical decision making. Apart from the sole use of prescribed medications/surgical procedures for cardiac care, patients offered a dietetic referral were those who were considered able to commit to dietary change and who were willing to attend a dietetic consultation. Improvements in provision of patients' nutrition intervention information to GPs are needed. Further investigation is justified to determine how to resolve this practice gap.
The amblyopic deficit for 2nd order processing: Generality and laterality.
Gao, Yi; Reynaud, Alexandre; Tang, Yong; Feng, Lixia; Zhou, Yifeng; Hess, Robert F
2015-09-01
A number of previous reports have suggested that the processing of second-order stimuli by the amblyopic eye (AE) is defective and that the fellow non-amblyopic eye (NAE) also exhibits an anomaly. Second-order stimuli involve extra-striate as well as striate processing and provide a means of exploring the extent of the cortical anomaly in amblyopia using psychophysics. We use a range of different second-order stimuli to investigate how general the deficit is for detecting second-order stimuli in adult amblyopes. We compare these results to our previously published adult normative database using the same stimuli and approach to determine the extent to which the detection of these stimuli is defective for both amblyopic and non-amblyopic eye stimulation. The results suggest that the second-order deficit affects a wide range of second-order stimuli, and by implication a large area of extra-striate cortex, both dorsally and ventrally. The NAE is affected only in motion-defined form judgments, suggesting a difference in the degree to which ocular dominance is disrupted in dorsal and ventral extra-striate regions. Copyright © 2014 Elsevier Ltd. All rights reserved.
A Fast General-Purpose Clustering Algorithm Based on FPGAs for High-Throughput Data Processing
Annovi, A; The ATLAS collaboration; Castegnaro, A; Gatta, M
2012-01-01
We present a fast general-purpose algorithm for high-throughput clustering of data "with a two-dimensional organization". The algorithm is designed to be implemented with FPGAs or custom electronics. The key feature is a processing time that scales linearly with the amount of data to be processed. This means that clustering can be performed in pipeline with the readout, without suffering from combinatorial delays due to looping multiple times through all the data. This feature makes this algorithm especially well suited for problems where the data has high density, e.g. in the case of tracking devices working under high-luminosity conditions such as those of LHC or Super-LHC. The algorithm is organized in two steps: the first step (core) clusters the data; the second step analyzes each cluster of data to extract the desired information. The current algorithm is developed as a clustering device for modern high-energy physics pixel detectors. However, the algorithm has a much broader field of application. In ...
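The first (core) step of the two-step structure above can be illustrated in software. The sketch below implements only the generic clustering semantics — grouping pixel hits into connected components under 8-connectivity using union-find — and makes no attempt to mirror the FPGA pipeline or its linear-time readout scheme:

```python
def cluster_hits(hits):
    """Cluster 2D pixel hits: hits that touch (8-connectivity)
    end up in the same cluster. hits is a list of unique (x, y) tuples."""
    parent = {}

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]   # path halving
            a = parent[a]
        return a

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb

    for h in hits:
        parent[h] = h
        x, y = h
        # link only to already-seen neighbours, so each neighbouring
        # pair is examined exactly once
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                n = (x + dx, y + dy)
                if n != h and n in parent:
                    union(h, n)

    clusters = {}
    for h in hits:
        clusters.setdefault(find(h), []).append(h)
    return list(clusters.values())

clusters = cluster_hits([(0, 0), (0, 1), (1, 1), (5, 5)])
```

The second step (per-cluster analysis, e.g. centroid or charge estimation) would simply iterate over the returned lists.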
TRIO-EF a general thermal hydraulics computer code applied to the Avlis process
International Nuclear Information System (INIS)
Magnaud, J.P.; Claveau, M.; Coulon, N.; Yala, P.; Guilbaud, D.; Mejane, A.
1993-01-01
TRIO-EF is a general purpose Fluid Mechanics 3D Finite Element code. The system capabilities cover areas such as steady state or transient, laminar or turbulent, isothermal or temperature dependent fluid flows; it is applicable to the study of coupled thermo-fluid problems involving heat conduction and possibly radiative heat transfer. It has been used to study the thermal behaviour of the AVLIS process separation module. In this process, a linear electron beam impinges on the free surface of a uranium ingot, generating a two-dimensional curtain emission of vapour from a water-cooled crucible. The energy transferred to the metal causes its partial melting, forming a pool where strong convective motion increases heat transfer towards the crucible. In the upper part of the separation module, the internal structures are devoted to two main functions: vapour containment and reflux, and irradiation and physical separation. They are subjected to very high temperature levels and heat transfer occurs mainly by radiation. Moreover, special attention has to be paid to electron backscattering. These two major points have been simulated numerically with TRIO-EF, and the paper presents and comments on the results of such computations for each of them. After a brief overview of the computer code, two examples of the TRIO-EF capabilities are given: a crucible thermal hydraulics model, and a thermal analysis of the internal structures.
Giménez, Nuria; Pedrazas, David; Redondo, Susana; Quintana, Salvador
2016-10-01
Adequate information for patients and respect for their autonomy are mandatory in research. This article examined insights of researchers, patients and general practitioners (GPs) on the informed consent process in clinical trials, and the role of the GP. A cross-sectional study using three questionnaires, informed consent reviews, medical records, and hospital discharge reports. GPs, researchers and patients involved in clinical trials. Included were 504 GPs, 108 researchers, and 71 patients. Consulting the GP was recommended in 50% of the informed consents. Participation in clinical trials was shown in 33% of the medical records and 3% of the hospital discharge reports. GPs scored 3.54 points (on a 1-10 scale) on the assessment of the information received from the principal investigator. The readability of the informed consent sheet was rated 8.03 points by researchers, and the understanding was rated 7.68 points by patients. Patient satisfaction was positively associated with more time for reflection. GPs were not satisfied with the information received on the participation of patients under their care in clinical trials. Researchers were satisfied with the information they offered to patients, and were aware of the need to improve the information GPs received. Patients collaborated greatly towards biomedical research, expressed satisfaction with the overall process, and minimised the difficulties associated with participation. Copyright © 2015 Elsevier España, S.L.U. All rights reserved.
Jarrold, Christopher; Tam, Helen; Baddeley, Alan D; Harvey, Caroline E
2011-05-01
Two studies that examine whether the forgetting caused by the processing demands of working memory tasks is domain-general or domain-specific are presented. In each, separate groups of adult participants were asked to carry out either verbal or nonverbal operations on exactly the same processing materials while maintaining verbal storage items. The imposition of verbal processing tended to produce greater forgetting even though verbal processing operations took no longer to complete than did nonverbal processing operations. However, nonverbal processing did cause forgetting relative to baseline control conditions, and evidence from the timing of individuals' processing responses suggests that individuals in both processing groups slowed their responses in order to "refresh" the memoranda. Taken together the data suggest that processing has a domain-general effect on working memory performance by impeding refreshment of memoranda but can also cause effects that appear domain-specific and that result from either blocking of rehearsal or interference.
Halcomb, Elizabeth J; Furler, John S; Hermiz, Oshana S; Blackberry, Irene D; Smith, Julie P; Richmond, Robyn L; Zwar, Nicholas A
2015-08-01
Support in primary care can assist smokers to quit successfully, but there are barriers to general practitioners (GPs) providing this support routinely. Practice nurses (PNs) may be able to effectively take on this role. The aim of this study was to perform a process evaluation of a PN-led smoking cessation intervention being tested in a randomized controlled trial in Australian general practice. Process evaluation was conducted by means of semi-structured telephone interviews with GPs and PNs allocated to the intervention arm (Quit with PN) of the Quit in General Practice trial. Interviews focussed on nurse training, content and implementation of the intervention. Twenty-two PNs and 15 GPs participated in the interviews. The Quit with PN intervention was viewed positively. Most PNs were satisfied with the training and the materials provided. Some challenges in managing patient data and follow-up were identified. The Quit with PN intervention was acceptable to participating PNs and GPs. Issues to be addressed in the planning and wider implementation of future trials of nurse-led intervention in general practice include providing ongoing mentoring support, integration into practice management systems and strategies to promote greater collaboration between GP and PN teams in general practice. The ongoing feasibility of the intervention was affected by the funding model supporting PN employment and the competing demands on the PN's time. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Directory of Open Access Journals (Sweden)
Joaquim Bento de Souza Ferreira Filho
1999-12-01
Full Text Available This paper deals with the effects of trade liberalization and the Mercosur integration process upon the Brazilian economy, with emphasis on the agricultural and agroindustrial production sectors, under the hypothesis that those phenomena could be another step in the rural-urban transfer process in Brazil. The analysis is conducted through an applied general equilibrium model. Results suggest that trade liberalization would hardly generate a widespread process of rural-urban transfers, although Brazilian agriculture shows up as a loser in the process. Notwithstanding that fact, there are transfers inside the agricultural sectors, where, besides the losses in the value added of the grain production sectors, there would be gains for the livestock and for the "other crops" sectors. The agroindustry, in contrast, seems to gain both in Brazil and Argentina. Model results also suggest that Brazilian society as a whole would benefit from the integration, despite the losses in the agricultural sector.
Gusev, E Yu; Chereshnev, V A
2013-01-01
Theoretical and methodological approaches to the description of systemic inflammation as a general pathological process are discussed. It is shown that a wide range of research types needs to be integrated to develop a model of systemic inflammation.
Conway, Andrew R. A.; Cowan, Nelson; Bunting, Michael F.; Therriault, David J.; Minkoff, Scott R. B.
2002-01-01
Studied the interrelationships among general fluid intelligence, short-term memory capacity, working memory capacity, and processing speed in 120 young adults and used structural equation modeling to determine the best predictor of general fluid intelligence. Results suggest that working memory capacity, but not short-term memory capacity or…
Verkuyten, Maykel
1988-01-01
Examined lack of differences in general self-esteem between adolescents of ethnic minorities and Dutch adolescents, focusing on reflected appraisal process. Found significant relationship between general self-esteem and perceived evaluation of family members (and no such relationship with nonfamily members) for ethnic minority adolescents;…
Generalized renewal process for repairable systems based on finite Weibull mixture
International Nuclear Information System (INIS)
Veber, B.; Nagode, M.; Fajdiga, M.
2008-01-01
Repairable systems can be brought to one of several possible states following a repair. These states are: 'as good as new', 'as bad as old' and 'better than old but worse than new'. The probabilistic models traditionally used to estimate the expected number of failures account for the first two states, but they do not properly apply to the last one, which is more realistic in practice. In this paper, a probabilistic model that is applicable to all three after-repair states, called the generalized renewal process (GRP), is applied. Simply put, GRP addresses the repair assumption by introducing the concept of virtual age into stochastic point processes, enabling them to represent the full spectrum of repair assumptions. The shape of measured or design life distributions of systems can vary considerably, and therefore frequently cannot be approximated by simple distribution functions. The scope of the paper is to prove that a finite Weibull mixture, with positive component weights only, can be used as the underlying distribution of the time to first failure (TTFF) of the GRP model, on condition that the unknown parameters can be estimated. To support the main idea, three examples are presented. In order to estimate the unknown parameters of the GRP model with an m-fold Weibull mixture, the EM algorithm is applied. The GRP model with m mixture-component distributions is compared to the standard GRP model based on the two-parameter Weibull distribution by calculating the expected number of failures. It can be concluded that the suggested GRP model with a Weibull mixture with an arbitrary but finite number of components is suitable for predicting failures based on the past performance of the system.
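The virtual-age idea can be made concrete with a small simulation. The sketch below assumes a Kijima Type I virtual-age model with a single two-parameter Weibull as the underlying TTFF distribution (the paper's simpler baseline, not its Weibull mixture), and estimates the expected number of failures by Monte Carlo; the function names and the choice of Kijima Type I are illustrative assumptions.

```python
import math
import random

def conditional_weibull(beta, eta, v):
    """Sample a time-to-next-failure for a unit that has survived to
    virtual age v, under an underlying Weibull(beta, eta) via inverse CDF:
    F(x | v) = 1 - exp((v/eta)^beta - ((v + x)/eta)^beta)."""
    u = random.random()
    return eta * ((v / eta) ** beta - math.log(1.0 - u)) ** (1.0 / beta) - v

def expected_failures(beta, eta, q, horizon, n_sims=2000, seed=1):
    """Monte Carlo estimate of E[N(horizon)] under a Kijima Type I GRP.

    q = 0 -> 'as good as new' (ordinary renewal process);
    q = 1 -> 'as bad as old' (minimal repair);
    0 < q < 1 -> 'better than old but worse than new'.
    """
    random.seed(seed)
    total = 0
    for _ in range(n_sims):
        t, v, n = 0.0, 0.0, 0
        while True:
            x = conditional_weibull(beta, eta, v)
            t += x
            if t > horizon:
                break
            n += 1
            v += q * x  # each repair removes a fraction (1 - q) of the accrued age
        total += n
    return total / n_sims

# q = 0.3: repairs restore 70% of the age accrued since the last failure
n_hat = expected_failures(beta=2.0, eta=1.0, q=0.3, horizon=3.0, n_sims=500)
```

For an increasing failure rate (beta > 1), minimal repair (q = 1) yields more expected failures over a horizon than perfect repair (q = 0), with intermediate q interpolating between the two.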
Tong, Seng Fah; Ng, Chirk Jenn; Lee, Verna Kar Mun; Lee, Ping Yein; Ismail, Irmi Zarina; Khoo, Ee Ming; Tahir, Noor Azizah; Idris, Iliza; Ismail, Mastura; Abdullah, Adina
2018-01-01
The participation of general practitioners (GPs) in primary care research is variable and often poor. We aimed to develop a substantive and empirical theoretical framework to explain GPs' decision-making process to participate in research. We used the grounded theory approach to construct a substantive theory to explain the decision-making process of GPs to participate in research activities. Five in-depth interviews and four focus group discussions were conducted among 21 GPs. Purposeful sampling followed by theoretical sampling were used to attempt saturation of the core category. Data were collected using semi-structured open-ended questions. Interviews were recorded, transcribed verbatim and checked prior to analysis. Open line-by-line coding followed by focus coding were used to arrive at a substantive theory. Memoing was used to help bring concepts to higher abstract levels. The GPs' decision to participate in research was attributed to their inner drive and appreciation for primary care research and their confidence in managing their social and research environments. The drive and appreciation for research motivated the GPs to undergo research training to enhance their research knowledge, skills and confidence. However, the critical step in the GPs' decision to participate in research was their ability to align their research agenda with priorities in their social environment, which included personal life goals, clinical practice and organisational culture. Perceived support for research, such as funding and technical expertise, facilitated the GPs' participation in research. In addition, prior experiences participating in research also influenced the GPs' confidence in taking part in future research. The key to GPs deciding to participate in research is whether the research agenda aligns with the priorities in their social environment. Therefore, research training is important, but should be included in further measures and should comply with GPs' social
MacNamara, Annmarie; Proudfit, Greg Hajcak
2014-08-01
Generalized anxiety disorder (GAD) may be characterized by emotion regulation deficits attributable to an imbalance between top-down (i.e., goal-driven) and bottom-up (i.e., stimulus-driven) attention. In prior work, these attentional processes were examined by presenting unpleasant and neutral pictures within a working memory paradigm. The late positive potential (LPP) measured attention toward task-irrelevant pictures. Results from this prior work showed that working memory load reduced the LPP across participants; however, this effect was attenuated for individuals with greater self-reported state anxiety, suggesting reduced top-down control. In the current study, the same paradigm was used with 106 medication-free female participants: 71 with GAD and 35 without GAD. Unpleasant pictures elicited larger LPPs, and working memory load reduced the picture-elicited LPP. Compared with healthy controls, participants with GAD showed large LPPs to unpleasant pictures presented under high working memory load. Self-reported symptoms of anhedonic depression were related to a reduced effect of working memory load on the LPP elicited by neutral pictures. These results indicate that individuals with GAD show less flexible modulation of attention when confronted with unpleasant stimuli. Furthermore, among those with GAD, anhedonic depression may broaden attentional deficits to neutral distracters. (c) 2014 APA, all rights reserved.
Directory of Open Access Journals (Sweden)
Zhen Chen
2016-01-01
Full Text Available Accelerated degradation test (ADT has been widely used to assess highly reliable products’ lifetime. To conduct an ADT, an appropriate degradation model and test plan should be determined in advance. Although many historical studies have proposed quite a few models, there is still room for improvement. Hence we propose a Nonlinear Generalized Wiener Process (NGWP model with consideration of the effects of stress level, product-to-product variability, and measurement errors for a higher estimation accuracy and a wider range of use. Then under the constraints of sample size, test duration, and test cost, the plans of constant-stress ADT (CSADT with multiple stress levels based on the NGWP are designed by minimizing the asymptotic variance of the reliability estimation of the products under normal operation conditions. An optimization algorithm is developed to determine the optimal stress levels, the number of units allocated to each level, inspection frequency, and measurement times simultaneously. In addition, a comparison based on degradation data of LEDs is made to show better goodness-of-fit of the NGWP than that of other models. Finally, optimal two-level and three-level CSADT plans under various constraints and a detailed sensitivity analysis are demonstrated through examples in this paper.
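The core of a nonlinear Wiener degradation model can be sketched in a few lines. The code below simulates paths X(t) = mu*Lambda(t) + sigma*B(Lambda(t)) with a power-law time scale Lambda(t) = t^b, a common choice of nonlinearity, and recovers the drift from the path endpoints. This is an illustrative simplification: the paper's NGWP additionally models stress-level effects, unit-to-unit variability and measurement error, all omitted here, and the function names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_path(mu, sigma, b, t_grid):
    """One nonlinear Wiener degradation path X(t) = mu*t^b + sigma*B(t^b).

    Increments over the transformed time scale are independent Gaussians,
    which is the defining property of the Wiener process.
    """
    lam = t_grid ** b                       # transformed time scale Lambda(t)
    dlam = np.diff(lam, prepend=0.0)
    incr = mu * dlam + sigma * np.sqrt(dlam) * rng.standard_normal(len(t_grid))
    return np.cumsum(incr)

def estimate_drift(paths, t_grid, b):
    """MLE of the drift mu given a known time-scale exponent b:
    X(T) ~ N(mu * T^b, sigma^2 * T^b), so the endpoint mean / T^b is the MLE."""
    lam_end = t_grid[-1] ** b
    return np.mean([p[-1] for p in paths]) / lam_end

t = np.linspace(0.1, 10.0, 100)
paths = [simulate_path(mu=2.0, sigma=0.5, b=1.3, t_grid=t) for _ in range(200)]
mu_hat = estimate_drift(paths, t, b=1.3)
```

In an ADT-planning setting, quantities like the asymptotic variance of a reliability estimate would be computed from the Fisher information of this likelihood, which is what the optimization over stress levels and inspection times in the abstract minimizes.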
Lensky, Vadim; Hagelstein, Franziska; Pascalutsa, Vladimir; Vanderhaeghen, Marc
2018-04-01
We derive two new sum rules for the unpolarized doubly virtual Compton scattering process on a nucleon, which establish novel low-Q^2 relations involving the nucleon's generalized polarizabilities and moments of the nucleon's unpolarized structure functions F_1(x, Q^2) and F_2(x, Q^2). These relations facilitate the determination of some structure constants which can only be accessed in off-forward doubly virtual Compton scattering, not experimentally accessible at present. We perform an empirical determination for the proton and compare our results with a next-to-leading-order chiral perturbation theory prediction. We also show how these relations may be useful for a model-independent determination of the low-Q^2 subtraction function in the Compton amplitude, which enters the two-photon-exchange contribution to the Lamb shift of (muonic) hydrogen. An explicit calculation of the Δ(1232)-resonance contribution to the muonic-hydrogen 2P-2S Lamb shift yields -1 ± 1 μeV, confirming the previously conjectured smallness of this effect.
Gao, Yunjiao; Wong, Dennis S W; Yu, Yanping
2016-01-01
Using a sample of 1,163 adolescents from four middle schools in China, this study explores the intervening process of how adolescent maltreatment is related to delinquency within the framework of general strain theory (GST) by comparing two models. The first model is Agnew's integrated model of GST, which examines the mediating effects of social control, delinquent peer affiliation, state anger, and depression on the relationship between maltreatment and delinquency. Based on this model, with the intent to further explore the mediating effects of state anger and depression and to investigate whether their effects on delinquency can be demonstrated more through delinquent peer affiliation and social control, an extended model (Model 2) is proposed by the authors. The second model relates state anger to delinquent peer affiliation and state depression to social control. By comparing the fit indices and the significance of the hypothesized paths of the two models, the study found that the extended model can better reflect the mechanism of how maltreatment contributes to delinquency, whereas the original integrated GST model only receives partial support because of its failure to find the mediating effects of state negative emotions. © The Author(s) 2014.
Geometric correction of radiographic images using general purpose image processing program
International Nuclear Information System (INIS)
Kim, Eun Kyung; Cheong, Ji Seong; Lee, Sang Hoon
1994-01-01
The present study was undertaken to compare images geometrically corrected by general-purpose image processing programs for the Apple Macintosh II computer (NIH Image, Adobe Photoshop) with images standardized by an individualized, custom-fabricated alignment instrument. Two non-standardized periapical films, with an XCP film holder only, were taken at the lower molar portion of 19 volunteers. Two standardized periapical films, with a customized XCP film holder with impression material on the bite-block, were taken for each person. Geometric correction was performed with Adobe Photoshop and NIH Image. In particular, the arbitrary image rotation function of Adobe Photoshop and the subtraction-with-transparency function of NIH Image were utilized. The standard deviations of grey values of the subtracted images were used to measure image similarity. The average standard deviation of grey values of the subtracted images in the standardized group was slightly lower than that in the corrected group. However, the difference was found to be statistically insignificant (p>0.05). It is considered that NIH Image and Adobe Photoshop can be used for the correction of non-standardized films taken with the XCP film holder at the lower molar portion.
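The rotate-subtract-measure pipeline of the study can be sketched with NumPy. The code below is a crude stand-in for the Photoshop/NIH Image operations the authors used: a nearest-neighbour rotation about the image centre, a pixel-wise subtraction, and the standard deviation of the difference image as the similarity measure. Function names and the nearest-neighbour interpolation are assumptions for illustration.

```python
import numpy as np

def align_and_subtract(reference, follow_up, angle_deg):
    """Rotate `follow_up` by `angle_deg` about its centre (nearest
    neighbour), subtract it from `reference`, and return the difference
    image together with the std of its grey values (lower = more similar)."""
    h, w = follow_up.shape
    theta = np.deg2rad(angle_deg)
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    ys, xs = np.mgrid[0:h, 0:w]
    # inverse-map each output pixel back into the source image
    src_x = np.cos(theta) * (xs - cx) + np.sin(theta) * (ys - cy) + cx
    src_y = -np.sin(theta) * (xs - cx) + np.cos(theta) * (ys - cy) + cy
    sx = np.clip(np.round(src_x).astype(int), 0, w - 1)
    sy = np.clip(np.round(src_y).astype(int), 0, h - 1)
    rotated = follow_up[sy, sx]
    diff = reference.astype(float) - rotated.astype(float)
    return diff, diff.std()
```

A perfectly aligned pair yields a zero difference image (std 0); residual geometric misalignment inflates the std, which is why the authors used it to compare standardized against software-corrected films.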
Murdoch, Jamie; Varley, Anna; Fletcher, Emily; Britten, Nicky; Price, Linnie; Calitri, Raff; Green, Colin; Lattimer, Valerie; Richards, Suzanne H; Richards, David A; Salisbury, Chris; Taylor, Rod S; Campbell, John L
2015-04-10
Telephone triage represents one strategy to manage demand for face-to-face GP appointments in primary care. However, limited evidence exists of the challenges GP practices face in implementing telephone triage. We conducted a qualitative process evaluation alongside a UK-based cluster randomised trial (ESTEEM) which compared the impact of GP-led and nurse-led telephone triage with usual care on primary care workload, cost, patient experience, and safety for patients requesting a same-day GP consultation. The aim of the process study was to provide insights into the observed effects of the ESTEEM trial from the perspectives of staff and patients, and to specify the circumstances under which triage is likely to be successfully implemented. Here we report perspectives of staff. The intervention comprised implementation of either GP-led or nurse-led telephone triage for a period of 2-3 months. A qualitative evaluation was conducted using staff interviews recruited from eight general practices (4 GP triage, 4 nurse triage) in the UK, implementing triage as part of the ESTEEM trial. Qualitative interviews were undertaken with 44 staff members in GP triage and nurse triage practices (16 GPs, 8 nurses, 7 practice managers, 13 administrative staff). Staff reported diverse experiences and perceptions regarding the implementation of telephone triage, its effects on workload, and the benefits of triage. Such diversity was explained by the different ways triage was organised, the staffing models used to support triage, how the introduction of triage was communicated across practice staff, and by how staff roles were reconfigured as a result of implementing triage. The findings from the process evaluation offer insight into the range of ways GP practices participating in ESTEEM implemented telephone triage, and the circumstances under which telephone triage can be successfully implemented beyond the context of a clinical trial. Staff experiences and perceptions of telephone
Directory of Open Access Journals (Sweden)
Rosalind Adam
Full Text Available Delayed cancer diagnosis leads to poorer patient outcomes. During short consultations, General Practitioners (GPs) make quick decisions about likelihood of cancer. Patients' facial cues are processed rapidly and may influence diagnosis. To investigate whether patients' facial characteristics influence immediate perception of cancer risk by GPs. Web-based binary forced choice experiment with GPs from Northeast Scotland. GPs were presented with a series of pairs of face prototypes and asked to quickly select the patient more likely to have cancer. Faces were modified with respect to age, gender, and ethnicity. Choices were analysed using Chi-squared goodness-of-fit statistics with Bonferroni corrections. Eighty-two GPs participated. GPs were significantly more likely to suspect cancer in older patients. Gender influenced GP cancer suspicion, but this was modified by age: the male face was chosen as more likely to have cancer than the female face for young (72% of GPs; 95% CI 61.0-87.0) and middle-aged faces (65.9%; 95% CI 54.7-75.5); but 63.4% (95% CI 52.2-73.3) decided the older female was more likely to have cancer than the older male (p = 0.015). GPs were significantly more likely to suspect cancer in the young Caucasian male (65.9%; 95% CI 54.7-75.5) compared to the young Asian male (p = 0.004). GPs' first impressions about cancer risk are influenced by patient age, gender, and ethnicity. Tackling GP cognitive biases could be a promising way of reducing cancer diagnostic delays, particularly for younger patients.
Monk, J.; Zhu, Y.; Koons, P. O.; Segee, B. E.
2009-12-01
With the introduction of the G8X series of cards by nVidia, an architecture called CUDA was released, and virtually all subsequent video cards have had CUDA support. With this new architecture nVidia provided extensions for C/C++ that create an Application Programming Interface (API) allowing code to be executed on the GPU. Since then the concept of GPGPU (general-purpose graphics processing unit) computing has been growing: the idea that the GPU is very good at algebra and at running things in parallel, so that power should be put to use for other applications. This is highly appealing in the area of geodynamic modeling, as multiple parallel solutions of the same differential equations at different points in space lead to a large speedup in simulation speed. Another benefit of CUDA is a programmatic method of transferring large amounts of data between the computer's main memory and the dedicated GPU memory located on the video card. In addition to being able to compute and render on the video card, the CUDA framework allows for a large speedup in situations, such as with a tiled display wall, where the rendered pixels are to be displayed in a different location than where they are rendered. A CUDA extension for VirtualGL was developed, allowing for faster readback at high resolutions. This paper examines several aspects of rendering OpenGL graphics on large displays using VirtualGL and VNC. It demonstrates how performance can be significantly improved when rendering on a tiled monitor wall. We present a CUDA-enhanced version of VirtualGL as well as the advantages of having multiple VNC servers. It also discusses restrictions caused by readback and blitting rates and how they are affected by different sizes of the virtual displays being rendered.
Plint, Simon; Patterson, Fiona
2010-06-01
The UK national recruitment process into general practice training has been developed over several years, with the incremental introduction of stages which have been piloted and validated. Previously independent processes, which encouraged multiple applications and produced inconsistent outcomes, have been replaced by a robust national process which has high reliability and predictive validity, is perceived to be fair by candidates, and allocates applicants equitably across the country. Best selection practice involves a job analysis which identifies required competencies, then designs reliable assessment methods to measure them, and over the long term ensures that the process has predictive validity against future performance. The general practitioner recruitment process introduced machine-markable shortlisting assessments for the first time in the UK postgraduate recruitment context, and also adopted selection centre workplace simulations. The key success factors have been identified as corporate commitment to the goal of a national process, with gradual convergence maintaining locus of control rather than the imposition of change without perceived legitimate authority.
International Nuclear Information System (INIS)
Kitis, G.; Furetta, C.; Azorin, J.
2003-01-01
Synthetic thermoluminescent (TL) glow peaks, following second- and general-order kinetics, have been generated by computer. The general properties of the peaks so generated have been investigated over several orders of magnitude of simulated dose. Some unusual results which, to the best of the authors' knowledge, are not reported in the literature, are obtained and discussed. (Author)
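A synthetic general-order glow peak of the kind described can be generated from the standard general-order kinetics expression, I(T) = s·n0·exp(-E/kT)·[1 + (b-1)(s/β)∫exp(-E/kT')dT']^(-b/(b-1)). The sketch below evaluates this numerically with a trapezoid rule for the temperature integral; the particular parameter values (E, s, b) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def glow_peak(T, E, s, b, beta=1.0, n0=1.0):
    """Synthetic general-order TL glow peak.

    T    : temperature grid in K (monotone increasing)
    E    : trap depth (activation energy) in eV
    s    : frequency factor in 1/s
    b    : kinetics order, b > 1 (b -> 1 recovers first order, b = 2 second order)
    beta : linear heating rate in K/s
    n0   : initial trapped-charge concentration (arbitrary units)
    """
    k = 8.617e-5                                    # Boltzmann constant, eV/K
    boltz = np.exp(-E / (k * T))
    # cumulative trapezoid rule for Int_T0^T exp(-E/kT') dT'
    integral = np.concatenate(([0.0], np.cumsum(
        0.5 * (boltz[1:] + boltz[:-1]) * np.diff(T))))
    return s * n0 * boltz * (
        1.0 + (b - 1.0) * (s / beta) * integral) ** (-b / (b - 1.0))

T = np.linspace(300.0, 600.0, 3000)
I = glow_peak(T, E=1.0, s=1e12, b=2.0)
```

Sweeping n0 over several orders of magnitude (the simulated-dose axis of the abstract) shifts and reshapes non-first-order peaks, which is the kind of behaviour such computer experiments probe.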
Schellevis, F.G.; Eijk, J.T.M. van; Lisdonk, E.H. van de; Velden, J. van der; Weel, C. van
1994-01-01
In a prospective longitudinal study over 21 months the performance of general practitioners and the disease status of their patients was measured during the formulation and implementation of guidelines on follow-up care. Data on 15 general practitioners and on 613 patients with hypertension, 95 with
UN Secretary-General Normative Capability to Influence The Security Council Decision-Making Process
Directory of Open Access Journals (Sweden)
Dmitry Guennadievich Novik
2016-01-01
Full Text Available The present article studies the interrelation between the senior UN official, the Secretary-General, and the main UN body, the Security Council. The nature of the Secretary-General's role has been ambiguous since the very creation of the UN. On one hand, the Secretary-General leads the Secretariat, the body that carries out technical and subsidiary functions in relation to the other UN main bodies; this is how the position was initially viewed by the UN's authors. On the other hand, the UN Charter contains certain provisions that, under a certain interpretation, give the Secretary-General vigorous powers, including political ones. Since the very beginning of UN operations, Secretaries-General have tried to define the nature of these auxiliary powers and to formalize the practice of their use. A special place among these powers is held by the provisions of Charter Article 99. This article gives the Secretary-General the right to appeal directly to the Security Council and draw its attention to any situation that, in the Secretary-General's opinion, may threaten international peace and security. This right was used by some Secretaries-General during the various crises that occurred after the creation of the UN. The article covers, in turn, the Congo crisis, the Iran hostage crisis and the situation in Lebanon: three situations that forced Secretaries-General Hammarskjold, Waldheim and de Cuellar to explicitly invoke their right to appeal to the Security Council. Other cases in UN history in which the Secretary-General appealed to the Security Council while mentioning Article 99 cannot be considered uses of the article in the full sense of its spirit, as such cases were preceded by appeals to the Council on the same situations by other subjects (notably UN member states) or by other actions that left the Secretary-General merely performing a technical function. The main research problem here is
Generalized enthalpy model of a high-pressure shift freezing process
Smith, N. A. S.; Peppin, S. S. L.; Ramos, A. M.
2012-01-01
High-pressure freezing processes are a novel emerging technology in food processing, offering significant improvements to the quality of frozen foods. To be able to simulate plateau times and thermal history under different conditions, in this work
The theory, practice, and future of process improvement in general thoracic surgery.
Freeman, Richard K
2014-01-01
Process improvement, in its broadest sense, is the analysis of a given set of actions with the aim of elevating quality and reducing costs. The tenets of process improvement have been applied to medicine with increasing frequency for at least the last quarter century, including in thoracic surgery. This review outlines the theory underlying process improvement, the currently available data sources for process improvement, and possible future directions of research. Copyright © 2015 Elsevier Inc. All rights reserved.
Delgado, Francisco
2017-12-01
Quantum information processing should be generated through control of the quantum evolution of the physical systems used as resources, such as superconducting circuits, spin-spin couplings in ions, and artificial anyons in electronic gases. These systems have a quantum dynamics that should be translated into languages more natural for quantum information processing. On this terrain, such a language should make it possible to establish manipulation operations on the associated quantum information states, as classical information processing does. This work shows how a kind of processing operation can be defined and implemented for quantum state design and quantum processing in systems fulfilling an SU(2) reduction in their dynamics.
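On any two-level sector where the dynamics reduces to SU(2), the evolution operator has the closed form exp(-i·a·n·σ) = cos(a)·I - i·sin(a)·n·σ for a rotation axis n and angle 2a. The sketch below uses this identity to build the evolution for a generic Hamiltonian H = (ω/2)·n·σ; it is an illustrative textbook construction, not the specific systems or operations of the paper.

```python
import numpy as np

# Pauli matrices: the generators of SU(2)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def su2_evolution(n, omega, t):
    """U(t) = exp(-i t (omega/2) n.sigma) for a unit axis n = (nx, ny, nz),
    evaluated in closed form via exp(-i a n.sigma) = cos(a) I - i sin(a) n.sigma."""
    n_sigma = n[0] * sx + n[1] * sy + n[2] * sz
    a = 0.5 * omega * t
    return np.cos(a) * np.eye(2) - 1j * np.sin(a) * n_sigma

# Example: a rotation by pi about the x axis acts as a NOT operation
U = su2_evolution((1.0, 0.0, 0.0), omega=1.0, t=np.pi)
state = U @ np.array([1.0, 0.0], dtype=complex)  # start in |0>
```

Composing such rotations about different axes is how manipulation operations on the stored quantum information states are assembled from the native dynamics.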
A General Audiovisual Temporal Processing Deficit in Adult Readers with Dyslexia
Francisco, Ana A.; Jesse, Alexandra; Groen, Margriet A.; McQueen, James M.
2017-01-01
Purpose: Because reading is an audiovisual process, reading impairment may reflect an audiovisual processing deficit. The aim of the present study was to test the existence and scope of such a deficit in adult readers with dyslexia. Method: We tested 39 typical readers and 51 adult readers with dyslexia on their sensitivity to the simultaneity of…
Syncope prevalence in the ED compared to general practice and population: a strong selection process
Olde Nordkamp, Louise R. A.; van Dijk, Nynke; Ganzeboom, Karin S.; Reitsma, Johannes B.; Luitse, Jan S. K.; Dekker, Lukas R. C.; Shen, Win-Kuang; Wieling, Wouter
2009-01-01
Objective: We assessed the prevalence and distribution of the different causes of transient loss of consciousness (TLOC) in the emergency department (ED) and chest pain unit (CPU) and estimated the proportion of persons with syncope in the general population who seek medical attention from either
van Velzen, Joke H.
2018-01-01
There were two purposes for this mixed methods study: to investigate (a) the realistic meaning of awareness and understanding as the underlying constructs of general knowledge of the learning process and (b) a procedure for data consolidation. The participants were 11th-grade high school and first-year university students. Integrated data…
International Nuclear Information System (INIS)
Vnukov, V.S.; Rjazanov, B.G.; Sviridov, V.I.; Frolov, V.V.; Zubkov, Y.N.
1991-01-01
The paper describes the general principles of nuclear criticality safety for the handling, processing, transportation and storage of fissile materials. Measures to limit the consequences of criticality accidents are discussed for fuel processing plants and fissile material storage. The system of scientific and technical measures on nuclear criticality safety, as well as the system of control and state supervision based on rules, limits and requirements, is described. The criticality safety aspects of the various stages of handling nuclear materials are considered. The paper describes the methods and approaches used for criticality risk assessment at processing facilities, plants and storages. (Author)
Uranium tetrafluoride reduction closed bomb. Part I: Reduction process general conditions
International Nuclear Information System (INIS)
Anca Abati, R.; Lopez Rodriguez, M.
1961-01-01
The general conditions of metallothermic reduction in small bombs (250 and 800 g of uranium) have been investigated. Factors such as the kind and granulometry of the magnesium used, the magnesium excess and the preheating temperature, which affect yields and metal quality, have been considered. The magnesium excess increased yields by 15% in the small bomb; as for the preheating temperature, there is a range within which yields and metal quality do not change. All tests have been made with graphite linings. (Author) 18 refs
Wu, S. Q.; Cai, X.
2000-01-01
Four classical laws of black hole thermodynamics are extended from the exterior (event) horizon to the interior (Cauchy) horizon. In particular, the first law of classical thermodynamics for the Kerr-Newman black hole (KNBH) is generalized to a quantum form. Five quantum conservation laws on the KNBH evaporation effect are then derived by virtue of thermodynamic equilibrium conditions. As a by-product, the Bekenstein-Hawking relation $S=A/4$ is exactly recovered.
Johnson, L; Stricker, R B
2009-05-01
Lyme disease is one of the most controversial illnesses in the history of medicine. In 2006 the Connecticut Attorney General launched an antitrust investigation into the Lyme guidelines development process of the Infectious Diseases Society of America (IDSA). In a recent settlement with IDSA, the Attorney General noted important commercial conflicts of interest and suppression of scientific evidence that had tainted the guidelines process. This paper explores two broad ethical themes that influenced the IDSA investigation. The first is the growing problem of conflicts of interest among guidelines developers, and the second is the increasing centralisation of medical decisions by insurance companies, which use treatment guidelines as a means of controlling the practices of individual doctors and denying treatment for patients. The implications of the first-ever antitrust investigation of medical guidelines and the proposed model to remediate the tainted IDSA guidelines process are also discussed.
Variational estimation of process parameters in a simplified atmospheric general circulation model
Lv, Guokun; Koehl, Armin; Stammer, Detlef
2016-04-01
Parameterizations are used to simulate the effects of unresolved sub-grid-scale processes in current state-of-the-art climate models. The values of the process parameters, which determine the model's climatology, are usually adjusted manually to reduce the difference between the model mean state and the observed climatology. This process requires detailed knowledge of the model and its parameterizations. In this work, a variational method was used to estimate process parameters in the Planet Simulator (PlaSim). The adjoint code was generated by automatic differentiation of the source code. Some hydrological processes were switched off to remove the influence of zero-order discontinuities. In addition, the nonlinearity of the model limits the feasible assimilation window to about 1 day, which is too short to tune the model's climatology. To extend the feasible assimilation window, nudging terms for all state variables were added to the model's equations, which essentially suppress all unstable directions. In identical twin experiments, we found that the feasible assimilation window could be extended to over one year and accurate parameters could be retrieved. Although the nudging terms translate into a damping of the adjoint variables and therefore tend to erase the information of the data over time, assimilating climatological information is shown to provide sufficient information on the parameters. The mechanism of this regularization is also discussed.
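The identical-twin estimation idea can be sketched in a toy scalar model. Everything here (linear damping, sinusoidal forcing, the nudging gain, and a bounded 1-D optimizer standing in for adjoint-based descent) is an illustrative assumption, not PlaSim itself:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def run(a, obs=None, gain=0.0, n=200, dt=0.1):
    """Integrate a toy damped, forced scalar model; optionally nudge
    the state toward a reference trajectory `obs` with strength `gain`."""
    x, xs = 1.0, []
    for k in range(n):
        xs.append(x)                                  # state at step k
        nudge = gain * (obs[k] - x) if obs is not None else 0.0
        x += dt * (-a * x + np.sin(0.3 * k) + nudge)
    return np.array(xs)

# Twin experiment: generate synthetic "observations" with a known parameter,
# then try to recover it from the data alone.
truth = run(a=0.5)

def cost(a):
    # Misfit of a nudged model run against the synthetic observations;
    # the nudging keeps the run close to the data over a long window.
    return np.mean((run(a, obs=truth, gain=0.5) - truth) ** 2)

# A bounded scalar optimizer replaces the adjoint-gradient descent here.
res = minimize_scalar(cost, bounds=(0.0, 2.0), method="bounded")
print(round(res.x, 2))  # recovers the true parameter 0.5
```

With the nudging term active the cost surface stays smooth over the whole window, which is the regularization effect the abstract describes.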
Domain general sequence operations contribute to pre-SMA involvement in visuo-spatial processing
Directory of Open Access Journals (Sweden)
E. Charles eLeek
2016-01-01
This study used 3T MRI to elucidate the functional role of the supplementary motor area (SMA) in visuo-spatial processing. A localizer task contrasting sequential number subtraction and repetitive button pressing was used to functionally delineate non-motor sequence processing in the pre-SMA, and activity in SMA-proper associated with motor sequencing. Patterns of BOLD responses in these regions were then contrasted with those from two visuo-spatial processing tasks. In one task, participants performed mental rotation: recognition memory judgments were made on previously memorized 2D novel patterns across image-plane rotations. The other task involved abstract grid navigation, in which observers computed a series of imagined location shifts in response to directional (arrow) cues around a mental grid. The results showed overlapping activation in the pre-SMA for sequential subtraction and both visuo-spatial tasks. These results suggest that visuo-spatial processing is supported by non-motor sequence operations involving the pre-SMA. More broadly, these data further highlight the functional heterogeneity of the pre-SMA and show that its role extends to processes beyond the planning and online control of movement.
Mo, Jian
2005-01-01
A great number of papers have shown that free radicals, like bioactive molecules, can act as mediators in a wide spectrum of biological processes. However, the biological actions and chemical reactivity of free radicals are quite different from those of bioactive molecules; a wide variety of bioactive molecules can easily be modified by free radicals because they carry redox-sensitive functional groups; and the significance of the interaction between free radicals and bioactive molecules in biological processes has been confirmed by in vitro and in vivo studies. Based on this evidence, this article presents a novel theory about the mediators of biological processes. The essentials of the theory are: (a) mediators of biological processes can be classified into general and specific mediators; the general mediators comprise two free radicals, namely superoxide and nitric oxide, while the specific mediators include a wide variety of bioactive molecules such as specific enzymes, transcription factors, cytokines and eicosanoids; (b) a general mediator can modify almost any class of biomolecules and thus act as a mediator in nearly every biological process via diverse mechanisms, whereas a specific mediator always acts selectively on certain classes of biomolecules and may mediate different biological processes via the same mechanism; (c) biological processes are mostly controlled by networks of their mediators, so free radicals can regulate the final outcome of a biological process by modifying some types of bioactive molecules, or in cooperation with these bioactive molecules; the biological actions of superoxide and nitric oxide may be synergistic or antagonistic. According to this theory, keeping the integrity of these networks and the balance between the free radicals and the bioactive molecules as well as the balance between the free radicals and the free radical scavengers
Collection, transport and general processing of clinical specimens in Microbiology laboratory.
Sánchez-Romero, M Isabel; García-Lechuz Moya, Juan Manuel; González López, Juan José; Orta Mira, Nieves
2018-02-06
The interpretation and accuracy of microbiological results still depend to a great extent on the quality of the samples and on their processing within the Microbiology laboratory. The type of specimen, the appropriate time to obtain the sample, the way of sampling, and the storage and transport are critical points in the diagnostic process. The availability of new laboratory techniques for unusual pathogens makes it necessary to review and update all the steps involved in the processing of samples. Nowadays, laboratory automation and the availability of rapid techniques allow the precision and turnaround time necessary to help clinicians in decision making. To be efficient, it is very important to obtain clinical information so as to use the best diagnostic tools. Copyright © 2018 Elsevier España, S.L.U. and Sociedad Española de Enfermedades Infecciosas y Microbiología Clínica. All rights reserved.
Enkhbat, S; Toyota, M; Yasuda, N; Ohara, H
1997-06-01
The objective of this study is to compare the influence on delays in the tuberculosis case-finding process according to the type of medical facility initially visited. The subjects include 107 patients aged 16 years and older who were diagnosed with bacteriologically confirmed pulmonary tuberculosis at nine specialized tuberculosis facilities in Ulaanbaatar, Mongolia from May 1995 to March 1996. Patients were interviewed about their demographic and socioeconomic factors, and their medical records were reviewed to measure delays. Fifty-five patients initially consulted general physicians and the remaining 52 patients initially visited other types of facilities, including specialized tuberculosis facilities. Patients who initially consulted general physicians had shorter patient's delays and longer doctor's delays than those who had visited other facilities first. Since the reduction of patient's delay outweighs the extension of doctor's delay among patients who initially consulted general physicians, their total delay was shorter than that of patients who visited other facilities first. The beneficial influence of consulting general physicians first on the total delay was observed after adjusting for patient's age, sex, residence area, family income and family history of tuberculosis. This finding indicates that general physicians play an important role in improving the passive case-finding process in Mongolia.
Analysis of Queues with Rational Arrival Process Components - A General Approach
DEFF Research Database (Denmark)
Bean, Nigel; Nielsen, Bo Friis
In a previous paper we demonstrated that the well-known matrix-geometric solution of Quasi-Birth-and-Death processes is valid also if we introduce Rational Arrival Process (RAP) components. Here we extend those results and offer an alternative proof using results obtained by Tweedie. We prove … the matrix-geometric form for a certain kind of operators on the stationary measure for discrete-time Markov chains of GI/M/1 type. We apply this result to an embedded chain with RAP components. We then discuss the straightforward modification of the standard algorithms for calculating the matrix R …
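As a minimal sketch of the classical construction this work builds on, the rate matrix R of a discrete-time QBD with level-transition blocks A0 (up), A1 (local), A2 (down) is the minimal solution of R = A0 + R A1 + R^2 A2 and can be found by plain successive substitution. The blocks below are made up for illustration; the paper's modified algorithms for RAP components are not reproduced here:

```python
import numpy as np

# Toy discrete-time QBD transition blocks (each row of A0+A1+A2 sums to 1);
# illustrative values only, not taken from the paper.
A0 = np.array([[0.2, 0.1], [0.0, 0.2]])   # one level up
A1 = np.array([[0.3, 0.1], [0.2, 0.3]])   # same level
A2 = np.array([[0.2, 0.1], [0.1, 0.2]])   # one level down

# Fixed-point iteration for the minimal solution of R = A0 + R A1 + R^2 A2.
R = np.zeros_like(A0)
for _ in range(1000):
    R_new = A0 + R @ A1 + R @ R @ A2
    if np.max(np.abs(R_new - R)) < 1e-12:
        R = R_new
        break
    R = R_new

# Spectral radius of R below 1 indicates a positive recurrent chain, and the
# stationary measure then has the matrix-geometric form pi_k = pi_0 R^k.
print(np.max(np.abs(np.linalg.eigvals(R))) < 1.0)
```

For these blocks the mean downward drift exceeds the upward drift, so the iteration converges to a stable R.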
Westmijze, Mark
2018-01-01
Commercial Off The Shelf (COTS) Chip Multi-Processor (CMP) systems are for cost reasons often used in industry for soft real-time stream processing. COTS CMP systems typically have a low timing predictability, which makes it difficult to develop software applications for these systems with tight
General classification of maturation reaction-norm shape from size-based processes
DEFF Research Database (Denmark)
Christensen, Asbjørn; Andersen, Ken Haste
2011-01-01
… for growth and mortality is based on processes at the level of the individual, and is motivated by the energy budget of fish. MRN shape is a balance between opposing factors and depends on subtle details of the size dependence of growth and mortality. MRNs with both positive and negative slopes are predicted …
Distribution flow: a general process in the top layer of water repellent soils
Ritsema, C.J.; Dekker, L.W.
1995-01-01
Distribution flow is the process of water and solute flowing in a lateral direction over and through the very first millimetre or centimetre of the soil profile. A potassium bromide tracer was applied in two water-repellent sandy soils to follow the actual flow paths of water and solutes in the
Directory of Open Access Journals (Sweden)
Valery E. Tarabanko
2017-11-01
This review discusses principal patterns that govern the processes of lignins' catalytic oxidation into vanillin (3-methoxy-4-hydroxybenzaldehyde) and syringaldehyde (3,5-dimethoxy-4-hydroxybenzaldehyde). It examines the influence of lignin and oxidant nature, temperature, mass transfer, and of other factors on the yield of the aldehydes and the process selectivity. The review reveals that properly organized processes of catalytic oxidation of various lignins are only insignificantly (10–15%) inferior to oxidation by nitrobenzene in terms of yield and selectivity in vanillin and syringaldehyde. Very high consumption of oxygen (and consequently, of alkali) in the process—over 10 mol per mol of obtained vanillin—is highlighted as an unresolved and unexplored problem: the scientific literature reveals almost no studies devoted to the possibilities of decreasing the consumption of oxygen and alkali. Different hypotheses about the mechanism of lignin oxidation into the aromatic aldehydes are discussed, and the mechanism comprising steps of single-electron oxidation of phenolate anions, ending with a retroaldol reaction of a substituted coniferyl aldehyde, is pointed out as the most convincing one. The possibility and development prospects of single-stage oxidative processing of wood into the aromatic aldehydes and cellulose are analyzed.
Soobik, Mart
2014-01-01
The sustainability of technology education is related to a traditional understanding of craft and the methods used to teach it; however, the methods used in the teaching process have been influenced by the innovative changes accompanying the development of technology. In respect to social and economic development, it is important to prepare young…
The protection of fundamental human rights in criminal process General report
Brants, C.; Franken, Stijn
2009-01-01
This contribution examines the effect of the uniform standards of human rights in international conventions on criminal process in different countries and identifies factors inherent in national systems that influence the scope of international standards and the way in which they are implemented in
Macellini, S.; Maranesi, M.; Bonini, L.; Simone, L.; Rozzi, S.; Ferrari, P. F.; Fogassi, L.
2012-01-01
Macaques can efficiently use several tools, but their capacity to discriminate the relevant physical features of a tool and the social factors contributing to their acquisition are still poorly explored. In a series of studies, we investigated macaques' ability to generalize the use of a stick as a tool to new objects having different physical features (study 1), or to new contexts, requiring them to adapt the previously learned motor strategy (study 2). We then assessed whether the observation of a skilled model might facilitate tool-use learning by naive observer monkeys (study 3). Results of study 1 and study 2 showed that monkeys trained to use a tool generalize this ability to tools of different shape and length, and learn to adapt their motor strategy to a new task. Study 3 demonstrated that observing a skilled model increases the observers' manipulations of a stick, thus facilitating the individual discovery of the relevant properties of this object as a tool. These findings support the view that in macaques, the motor system can be modified through tool use and that it has a limited capacity to adjust the learnt motor skills to a new context. Social factors, although important to facilitate the interaction with tools, are not crucial for tool-use learning. PMID:22106424
2007-05-01
… BASED ENVIRONMENTAL IMPACT ANALYSIS PROCESS, LAUGHLIN AIR FORCE BASE, TEXAS. AGENCY: 47th Flying Training Wing (FTW), Laughlin Air Force Base (AFB), Texas. … effects and annoyance, in that very few flight operations and ground engine runs occur between 2200 and 0700 hours. BMPs include restricting the …
Neural responses to ambiguity involve domain-general and domain-specific emotion processing systems.
Neta, Maital; Kelley, William M; Whalen, Paul J
2013-04-01
Extant research has examined the process of decision making under uncertainty, specifically in situations of ambiguity. However, much of this work has been conducted in the context of semantic and low-level visual processing. An open question is whether ambiguity in social signals (e.g., emotional facial expressions) is processed similarly or whether a unique set of processors comes online to resolve ambiguity in a social context. Our work has examined ambiguity using surprised facial expressions, as they have predicted both positive and negative outcomes in the past. Specifically, whereas some people tended to interpret surprise as negatively valenced, others tended toward a more positive interpretation. Here, we examined neural responses to social ambiguity using faces (surprise) and nonface emotional scenes (International Affective Picture System). Moreover, we examined whether these effects are specific to ambiguity resolution (i.e., judgments about the ambiguity) or whether similar effects would be demonstrated for incidental judgments (e.g., nonvalence judgments about ambiguously valenced stimuli). We found that a distinct task control (i.e., cingulo-opercular) network was more active when resolving ambiguity. We also found that activity in the ventral amygdala was greater to faces and scenes that were rated explicitly along the dimension of valence, consistent with findings that the ventral amygdala tracks valence. Taken together, these results point to a complex neural architecture that supports decision making in the presence of ambiguity: (a) a core set of cortical structures engaged for explicit ambiguity processing across stimulus boundaries and (b) other dedicated circuits for biologically relevant learning situations involving faces.
Wang, Rongming; Yang, Wantai; Song, Yuanjun; Shen, Xiaomiao; Wang, Junmei; Zhong, Xiaodi; Li, Shuai; Song, Yujun
2015-01-01
A new methodology based on core alloying and shell gradient-doping is developed for the synthesis of nanohybrids, realized by coupled competitive reactions, or a sequenced reducing-nucleation and co-precipitation reaction of mixed metal salts in a microfluidic and batch-cooling process. The latent time of nucleation and the growth of the nanohybrids can be well controlled owing to the formation of controllable intermediates in the coupled competitive reactions. Thus, spatiotemporal-resolved synthesi...
DEFF Research Database (Denmark)
Østergaard, Jacob; Kramer, Mark A.; Eden, Uri T.
2018-01-01
… are separately applied; understanding the relationships between these modeling approaches remains an area of active research. In this letter, we examine this relationship using simulation. To do so, we first generate spike train data from a well-known dynamical model, the Izhikevich neuron, with a noisy input current. We then fit these spike train data with a statistical model (a generalized linear model, GLM, with multiplicative influences of past spiking). For different levels of noise, we show how the GLM captures both the deterministic features of the Izhikevich neuron and the variability driven by the noise. We conclude that the GLM captures essential features of the simulated spike trains, but for near-deterministic spike trains, goodness-of-fit analyses reveal that the model does not fit very well in a statistical sense; the essential random part of the GLM is not captured.
International Nuclear Information System (INIS)
Wu Shan-Shan; Wang Lei; Yang De-Ren
2011-01-01
The behavior of wafers and solar cells from the border of a multicrystalline silicon (mc-Si) ingot, which contain deteriorated regions, is investigated. It is found that the diffusion length distribution of minority carriers in the cells is uniform, and high efficiency of the solar cells (about 16%) is achieved. It is considered that the quality of the deteriorated regions could be improved to be similar to that of adjacent regions. Moreover, it is indicated that during general solar cell fabrication, phosphorus gettering and hydrogen passivation could significantly improve the quality of deteriorated regions, while aluminum gettering by RTP could not. Therefore, it is suggested that the border of a mc-Si ingot could be used to fabricate high efficiency solar cells, which will increase mc-Si utilization effectively. (condensed matter: structure, mechanical and thermal properties)
Simulations of the general circulation of the Martian atmosphere. I - Polar processes
Pollack, James B.; Haberle, Robert M.; Schaeffer, James; Lee, Hilda
1990-01-01
Numerical simulations of the Martian atmosphere general circulation are carried out for 50 simulated days, using a three-dimensional model, based on the primitive equations of meteorology, which incorporated the radiative effects of atmospheric dust on solar and thermal radiation. A large number of numerical experiments were conducted for alternative choices of seasonal date and dust optical depth. It was found that, as the dust content of the winter polar region increased, the rate of atmospheric CO2 condensation increased sharply. It is shown that the strong seasonal variation in the atmospheric dust content observed might cause a number of hemispheric asymmetries. These asymmetries include the greater prevalence of polar hoods in the northern polar region during winter, the lower albedo of the northern polar cap during spring, and the total dissipation of the northern CO2 ice cap during the warmer seasons.
Generalized role for the cerebellum in encoding internal models: evidence from semantic processing.
Moberget, Torgeir; Gullesen, Eva Hilland; Andersson, Stein; Ivry, Richard B; Endestad, Tor
2014-02-19
The striking homogeneity of cerebellar microanatomy is strongly suggestive of a corresponding uniformity of function. Consequently, theoretical models of the cerebellum's role in motor control should offer important clues regarding cerebellar contributions to cognition. One such influential theory holds that the cerebellum encodes internal models, neural representations of the context-specific dynamic properties of an object, to facilitate predictive control when manipulating the object. The present study examined whether this theoretical construct can shed light on the contribution of the cerebellum to language processing. We reasoned that the cerebellum might perform a similar coordinative function when the context provided by the initial part of a sentence can be highly predictive of the end of the sentence. Using functional MRI in humans we tested two predictions derived from this hypothesis, building on previous neuroimaging studies of internal models in motor control. First, focal cerebellar activation-reflecting the operation of acquired internal models-should be enhanced when the linguistic context leads terminal words to be predictable. Second, more widespread activation should be observed when such predictions are violated, reflecting the processing of error signals that can be used to update internal models. Both predictions were confirmed, with predictability and prediction violations associated with increased blood oxygenation level-dependent signal in the posterior cerebellum (Crus I/II). Our results provide further evidence for cerebellar involvement in predictive language processing and suggest that the notion of cerebellar internal models may be extended to the language domain.
McCaskey, Ursina; von Aster, Michael; O’Gorman Tuura, Ruth; Kucian, Karin
2017-01-01
The link between number and space has been discussed in the literature for some time, resulting in the theory that number, space and time might be part of a generalized magnitude system. To date, several behavioral and neuroimaging findings support the notion of a generalized magnitude system, although contradictory results showing a partial overlap or separate magnitude systems are also found. The possible existence of a generalized magnitude processing area leads to the question how individuals with developmental dyscalculia (DD), known for deficits in numerical-arithmetical abilities, process magnitudes. By means of neuropsychological tests and functional magnetic resonance imaging (fMRI) we aimed to examine the relationship between number and space in typical and atypical development. Participants were 16 adolescents with DD (14.1 years) and 14 typically developing (TD) peers (13.8 years). In the fMRI paradigm participants had to perform discrete (arrays of dots) and continuous magnitude (angles) comparisons as well as a mental rotation task. In the neuropsychological tests, adolescents with dyscalculia performed significantly worse in numerical and complex visuo-spatial tasks. However, they showed similar results to TD peers when making discrete and continuous magnitude decisions during the neuropsychological tests and the fMRI paradigm. A conjunction analysis of the fMRI data revealed commonly activated higher order visual (inferior and middle occipital gyrus) and parietal (inferior and superior parietal lobe) magnitude areas for the discrete and continuous magnitude tasks. Moreover, no differences were found when contrasting both magnitude processing conditions, favoring the possibility of a generalized magnitude system. Group comparisons further revealed that dyscalculic subjects showed increased activation in domain general regions, whilst TD peers activate domain specific areas to a greater extent. In conclusion, our results point to the existence of a
Cincotti, Silvano; Ponta, Linda; Raberto, Marco; Scalas, Enrico
2005-05-01
In this paper, empirical analyses and computational experiments on high-frequency data for a double-auction (book) market are presented. The main objective of the paper is to generalize the order waiting-time process in order to properly model the empirical evidence. The empirical study is performed on the best-bid and best-ask data of 7 U.S. financial markets, for 30-stock time series. In particular, the statistical properties of trading waiting times have been analyzed, and the quality of fits is evaluated by suitable statistical tests, i.e., by comparing empirical distributions with theoretical models. Starting from the statistical studies on real data, attention has been focused on the reproducibility of such results in an artificial market. The computational experiments have been performed within the Genoa Artificial Stock Market. In the market model, heterogeneous agents trade one risky asset in exchange for cash. Agents have zero intelligence and issue random limit or market orders depending on their budget constraints. The price is cleared by means of a limit order book. Order generation is modelled with a renewal process. Based on empirical trading estimation, the distribution of waiting times between two consecutive orders is modelled by a mixture of exponential processes. Results show that the empirical waiting-time distribution can be considered a generalization of a Poisson process. Moreover, the renewal process can approximate real data, and its implementation in the artificial stock market can reproduce the trading activity in a realistic way.
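The waiting-time model can be sketched as follows. Drawing inter-order times from a two-component exponential mixture yields an over-dispersed renewal process that departs from a pure Poisson process, whose waiting times are exponential with a coefficient of variation of exactly 1. The mixture weights and rates below are illustrative, not estimates from the 30-stock data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-component exponential mixture for inter-order waiting
# times: with probability p draw from a "fast" trading regime, otherwise
# from a "slow" one. Parameters are illustrative.
p, lam_fast, lam_slow = 0.7, 2.0, 0.2
n = 100_000
fast = rng.random(n) < p
w = np.where(fast,
             rng.exponential(1 / lam_fast, n),
             rng.exponential(1 / lam_slow, n))

# A pure Poisson process has CV = 1; a nontrivial exponential mixture is
# over-dispersed, so its CV exceeds 1.
cv = w.std() / w.mean()
print(round(cv, 2))
```

For these parameters the theoretical CV is about 1.87, so the simulated renewal process is clearly distinguishable from Poisson order flow.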
International Nuclear Information System (INIS)
Njau, E.C.
1987-10-01
Complete analytical expressions for the distortion signals introduced into analogue signals by sampling and quantization processes are developed. These expressions are made up of terms that are wholly functions of the parameters of the original signals involved and hence are easy to evaluate numerically. It is shown in Parts 2 and 3 of this series that these expressions may be successfully used in the design and development of some electronic devices whose operation depends upon the above-named distortion signals. (author). 7 refs
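The analytical expressions themselves are not reproduced here, but the quantization distortion signal is easy to exhibit numerically. In this minimal sketch (sample rate, tone frequency, and bit depth are arbitrary choices, not values from the paper) a sampled sine wave is quantized, the error signal is formed, and its power is compared with the classical step²/12 approximation.

```python
import numpy as np

fs, f0, nbits = 1000.0, 37.0, 4           # sample rate, tone, quantizer bits (illustrative)
t = np.arange(4096) / fs
x = np.sin(2 * np.pi * f0 * t)

# Uniform mid-tread quantizer over [-1, 1]
step = 2.0 / (2 ** nbits)
xq = step * np.round(x / step)
err = xq - x                              # the quantization distortion signal

# Distortion power vs. the classical step**2 / 12 approximation
p_err = np.mean(err ** 2)
print(p_err, step ** 2 / 12)
```

For a tone that is not synchronous with the sampling clock, the measured error power lands close to the step²/12 estimate, while a spectral analysis of `err` would reveal the harmonic distortion structure the paper derives analytically.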
International Nuclear Information System (INIS)
Boman, R.; Papeleux, L.; Ponthot, J. P.
2007-01-01
In this paper, the Arbitrary Lagrangian Eulerian formalism is used to compute the steady state of a 2D metal cutting operation and a 3D U-shaped cold roll forming process. Compared to the Lagrangian case, this method allows the use of a refined mesh near the tools, leading to an accurate representation of the chip formation (metal cutting) and the bending of the sheet (roll forming) with a limited computational time. The main problem of this kind of simulation is the rezoning of the nodes on the free surfaces of the sheet. A modified iterative isoparametric smoother is used to manage this geometrically complex and CPU-expensive task.
Directory of Open Access Journals (Sweden)
Therdsak Maitaouthong
2011-11-01
This article presents the factors affecting the integration of information literacy into the teaching and learning processes of general education courses at the undergraduate level, where information literacy is used as a tool in the student-centered teaching approach. The research was divided into two phases: (1) a study of factors at the policy level – a qualitative method conducted through in-depth interviews with the Vice President for Academic Affairs and the Director of the General Education Management Center; and (2) a survey of factors in the teaching and learning processes, conducted by questioning lecturers of general education courses and librarians. The qualitative data were analyzed for content, and the quantitative data were analyzed through the use of descriptive statistics, weighted score prioritization and percentages. Two major categories were found to have an impact on integrating information literacy into the teaching and learning of general education courses at the undergraduate level: (1) six factors at the policy level, namely, institutional policy, administrative structure and system, administrators' roles, resources and infrastructures, learning resources and supporting programs, and teacher evaluation and development; and (2) eleven instructional factors: roles of lecturers, roles of librarians, roles of learners, knowledge and understanding of information literacy of lecturers and librarians, cooperation between librarians and lecturers, learning outcomes, teaching plans, teaching methods, teaching activities, teaching aids, and student assessment and evaluation.
Hadzidiakos, Daniel; Horn, Nadja; Degener, Roland; Buchner, Axel; Rehberg, Benno
2009-08-01
There have been reports of memory formation during general anesthesia. The process-dissociation procedure has been used to determine whether these are controlled (explicit/conscious) or automatic (implicit/unconscious) memories. This study used the process-dissociation procedure with both the original measurement model and one which corrected for guessing, to determine whether more accurate results were obtained in this setting. A total of 160 patients scheduled for elective surgery were enrolled. Memory for words presented during propofol and remifentanil general anesthesia was tested postoperatively by using a word-stem completion task in a process-dissociation procedure. To assign possible memory effects to different levels of anesthetic depth, the authors measured depth of anesthesia using the BIS XP monitor (Aspect Medical Systems, Norwood, MA). Word-stem completion performance showed no evidence of memory for intraoperatively presented words. Nevertheless, an evaluation of these data using the original measurement model for process-dissociation data suggested evidence of controlled (C = 0.05; 95% confidence interval [CI] 0.02-0.08) and automatic (A = 0.11; 95% CI 0.09-0.12) memory processes, whereas with the extended measurement model, corrected for guessing, no evidence of memory processes was obtained. The authors report and discuss parallel findings for published data sets that were generated by using the process-dissociation procedure. Patients had no memories for auditory information presented during propofol/remifentanil anesthesia after midazolam premedication. The use of the process-dissociation procedure with the original measurement model erroneously detected memories, whereas the extended model, corrected for guessing, correctly revealed no memory.
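For reference, the original process-dissociation measurement model is commonly written as I = C + A(1 − C) for the inclusion condition and E = A(1 − C) for the exclusion condition, which gives the estimates C = I − E and A = E/(1 − C). The sketch below uses illustrative proportions, not the study's raw data, and deliberately omits the guessing-corrected extension, which adds further parameters.

```python
def process_dissociation(p_inclusion, p_exclusion):
    """Original measurement model: I = C + A(1-C), E = A(1-C)."""
    c = p_inclusion - p_exclusion                             # controlled (conscious) estimate
    a = p_exclusion / (1.0 - c) if c < 1.0 else float("nan")  # automatic (unconscious) estimate
    return c, a

# Illustrative inclusion/exclusion proportions, not the study's data
c, a = process_dissociation(0.16, 0.11)
print(round(c, 3), round(a, 3))
```

With these hypothetical inputs the estimates come out near the C = 0.05 and A = 0.11 values quoted above, illustrating how small inclusion-exclusion differences translate into apparent memory components under the original model.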
The protection of fundamental human rights in criminal process
General report
Directory of Open Access Journals (Sweden)
Chrisje Brants
2009-10-01
This contribution examines the effect of the uniform standards of human rights in international conventions on criminal process in different countries and identifies factors inherent in national systems that influence the scope of international standards and the way in which they are implemented in a national context. Three overarching issues influence the reception of international fundamental rights and freedoms in criminal process: constitutional arrangements, legal tradition and culture, and practical circumstances. There is no such thing as uniform implementation of convention standards; even in Europe, where the European Convention on Human Rights and Fundamental Freedoms and the case law of the European Court play a significant role, there is still much diversity in the actual implementation of international norms due to the influence of legal traditions, which form a counterforce to the weight of convention obligations. An even greater counterforce is at work in practical circumstances that can undermine international norms, most especially global issues of security, crime control and combating terrorism. Although convention norms are still in place, there is a very real risk that they are circumvented or at least diluted in order to increase effective crime control.
Generalization of the Poincare sphere to process 2D displacement signals
Sciammarella, Cesar A.; Lamberti, Luciano
2017-06-01
Traditionally, the multiple-phase method has been considered an essential tool for phase information recovery. The in-quadrature phase method, which theoretically is an alternative pathway to achieve the same goal, failed in actual applications. The authors, in a previous paper dealing with 1D signals, have shown that, properly implemented, the in-quadrature method yields phase values with the same accuracy as the multiple-phase method. The present paper extends the methodology developed in 1D to 2D. This extension is not a straightforward process and requires the introduction of a number of additional concepts and developments. The concept of the monogenic function provides the tools required for the extension process. The monogenic function has a graphic representation through the Poincare sphere, familiar in the field of photoelasticity, and, through the developments introduced in this paper, is connected to the analysis of displacement fringe patterns. The paper is illustrated with examples of application that show that the multiple-phase method and the in-quadrature method are two aspects of the same basic theoretical model.
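A minimal numerical sketch of the monogenic-signal idea (an illustration of the underlying mathematics, not the authors' implementation): the two Riesz-transform components are computed in the frequency domain, after which the local amplitude and local phase of a 2D fringe pattern follow directly.

```python
import numpy as np

def monogenic(img):
    """Monogenic signal via the frequency-domain Riesz transform."""
    f = np.fft.fft2(img)
    u = np.fft.fftfreq(img.shape[0])[:, None]
    v = np.fft.fftfreq(img.shape[1])[None, :]
    r = np.sqrt(u**2 + v**2)
    r[0, 0] = 1.0                                    # avoid division by zero at DC
    r1 = np.real(np.fft.ifft2(f * (-1j * u / r)))    # Riesz component, x direction
    r2 = np.real(np.fft.ifft2(f * (-1j * v / r)))    # Riesz component, y direction
    amplitude = np.sqrt(img**2 + r1**2 + r2**2)
    phase = np.arctan2(np.sqrt(r1**2 + r2**2), img)  # local phase
    return amplitude, phase

# Synthetic vertical fringe pattern standing in for displacement fringes
y, x = np.mgrid[0:128, 0:128]
fringes = np.cos(2 * np.pi * x / 16.0)
amp, ph = monogenic(fringes)
print(amp[64, 64])   # local amplitude of the unit fringe pattern
```

For the pure cosine fringes above, the recovered local amplitude is 1 everywhere, which is the demodulation property that makes phase extraction from fringe patterns possible.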
Ward nurses' experiences of the discharge process between intensive care unit and general ward.
Kauppi, Wivica; Proos, Matilda; Olausson, Sepideh
2018-05-01
Intensive care unit (ICU) discharges are challenging practices that carry risks for patients. Despite the existing body of knowledge, there are still difficulties in clinical practice concerning unplanned ICU discharges, specifically where there is no step-down unit. The aim of this study was to explore general ward nurses' experiences of caring for patients being discharged from an ICU. Data were collected from focus groups and in-depth interviews with a total of 16 nurses from three different hospitals in Sweden. An inductive qualitative design was chosen. The analysis revealed three themes that reflect the challenges in nursing former ICU patients: a vulnerable patient, nurses' powerlessness and organizational structure. The nurses described the challenge of nursing a fragile patient based on several aspects. They expressed feeling unrealistic demands when caring for a fragile former ICU patient. The demands were related to their own profession and knowledge regarding how to care for this group of patients. The organizational structure had an impact on how the nurses' caring practice could be realized. This evoked ethical concerns that the nurses had to cope with as the organization's care guidelines did not always favour the patients. The structure of the organization and its leadership appear to have a significant impact on the nurses' ability to offer patients the care they need. This study sheds light on the need for extended outreach services and intermediate care in order to meet the needs of patients after the intensive care period. © 2018 British Association of Critical Care Nurses.
Song, Yun S; Steinrücken, Matthias
2012-03-01
The transition density function of the Wright-Fisher diffusion describes the evolution of population-wide allele frequencies over time. This function has important practical applications in population genetics, but finding an explicit formula under a general diploid selection model has remained a difficult open problem. In this article, we develop a new computational method to tackle this classic problem. Specifically, our method explicitly finds the eigenvalues and eigenfunctions of the diffusion generator associated with the Wright-Fisher diffusion with recurrent mutation and arbitrary diploid selection, thus allowing one to obtain an accurate spectral representation of the transition density function. Simplicity is one of the appealing features of our approach. Although our derivation involves somewhat advanced mathematical concepts, the resulting algorithm is quite simple and efficient, only involving standard linear algebra. Furthermore, unlike previous approaches based on perturbation, which is applicable only when the population-scaled selection coefficient is small, our method is nonperturbative and is valid for a broad range of parameter values. As a by-product of our work, we obtain the rate of convergence to the stationary distribution under mutation-selection balance.
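The authors work with an exact spectral (eigenfunction) representation; the crude sketch below only illustrates the underlying idea with an upwind finite-difference discretization of the diffusion generator in the neutral-with-mutation case (a diploid selection term would enter the drift b(x) in the same way). The grid size and the scaled mutation rates are arbitrary; the leading eigenvalue is zero (stationarity) and the remaining negative eigenvalues are the decay rates toward mutation-selection balance.

```python
import numpy as np

def wf_generator(n=200, alpha=2.0, beta=2.0):
    """Upwind finite-difference matrix for the Wright-Fisher generator
    L f = (1/2) x(1-x) f'' + (1/2)(alpha(1-x) - beta x) f'  on (0,1),
    with alpha, beta the population-scaled mutation rates."""
    x = (np.arange(n) + 0.5) / n              # cell-centred grid, avoids x = 0, 1
    h = 1.0 / n
    a = 0.5 * x * (1.0 - x)                   # diffusion coefficient
    b = 0.5 * (alpha * (1.0 - x) - beta * x)  # drift coefficient
    L = np.zeros((n, n))
    for i in range(n):
        if i > 0:
            L[i, i - 1] += a[i] / h**2
        if i < n - 1:
            L[i, i + 1] += a[i] / h**2
        L[i, i] -= (int(i > 0) + int(i < n - 1)) * a[i] / h**2
        # upwind drift keeps every row summing to zero (a proper generator)
        if b[i] >= 0 and i < n - 1:
            L[i, i + 1] += b[i] / h
            L[i, i] -= b[i] / h
        elif b[i] < 0 and i > 0:
            L[i, i - 1] += -b[i] / h
            L[i, i] -= -b[i] / h
    return L

eigs = np.sort(np.linalg.eigvals(wf_generator()).real)[::-1]
print(eigs[:3])   # leading eigenvalue ~0, then negative decay rates
```

A spectral representation of the transition density would then sum over these eigenvalues and the corresponding eigenvectors; the polynomial-basis construction of the paper achieves the same decomposition accurately and efficiently.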
NJOY-97, General ENDF/B Processing System for Reactor Design Problems
International Nuclear Information System (INIS)
1999-01-01
1 - Description of program or function: The NJOY nuclear data processing system is a modular computer code used for converting evaluated nuclear data in the ENDF format into libraries useful for applications calculations. Because the Evaluated Nuclear Data File (ENDF) format is used all around the world (e.g., ENDF/B-VI in the US, JEF-2.2 in Europe, JENDL-3.2 in Japan, BROND-2.2 in Russia), NJOY gives its users access to a wide variety of the most up-to-date nuclear data. NJOY provides comprehensive capabilities for processing evaluated data, and it can serve applications ranging from continuous-energy Monte Carlo (MCNP), through deterministic transport codes (DANT, ANISN, DORT), to reactor lattice codes (WIMS, EPRI). NJOY handles a wide variety of nuclear effects, including resonances, Doppler broadening, heating (KERMA), radiation-damage, thermal scattering (even cold moderators), gas production, neutrons and charged particles, photo-atomic interactions, self shielding, probability tables, photon production, and high-energy interactions (to 150 MeV). Output can include printed listings, special library files for applications, and Postscript graphics (plus colour). More information on NJOY is available from the developer's home page at http://t2.lanl.gov. Follow the Tourbus section of the Tour area to find notes from the ICTP lectures held at Trieste in March 1998 on the ENDF format and on the NJOY code. 2 - Methods: NJOY97 consists of a set of modules, each performing a well-defined processing task. Each of these modules is essentially a separate computer program linked together by input and output files and a few common constants. The methods and instructions on how to use them are documented in the LA-12740-M report on NJOY91 and in the 'README' file. No new published document is yet available. NJOY97 is a cleaned up version of NJOY94.105 that features compatibility with a wider variety of compilers and machines, explicit double precision for 32-bit systems, a
Prototype performance studies of a Full Mesh ATCA-based General Purpose Data Processing Board
Okumura, Yasuyuki; Liu, Tiehui Ted; Yin, Hang
2013-01-01
High luminosity conditions at the LHC pose many unique challenges for potential silicon based track trigger systems. One of the major challenges is data formatting, where hits from thousands of silicon modules must first be shared and organized into overlapping eta-phi trigger towers. Communication between nodes requires high bandwidth, low latency, and flexible real time data sharing, for which a full mesh backplane is a natural solution. A custom Advanced Telecommunications Computing Architecture data processing board is designed with the goal of creating a scalable architecture abundant in flexible, non-blocking, high bandwidth board to board communication channels while keeping the design as simple as possible. We have performed the first prototype board testing and our first attempt at designing the prototype system has proven to be successful. Leveraging the experience we gained through designing, building and testing the prototype board system we are in the final stages of laying out the next generatio...
General information on licensing process in Bulgaria and disused sealed sources
International Nuclear Information System (INIS)
Nizamska, M.
2003-01-01
The basic legal framework for radiation protection and the safety of radiation sources is given in the report. The authorisation process is described. Actual data for the system of authorisation of SIR during 2002/2003 are given. The planned activities related to RAW management are: commissioning of the complex for treatment, conditioning and storage of RAW in Kozloduy NPP - by the end of 2003; investigation of the Gabra site for construction of an institutional waste disposal facility - by the end of 2004; implementation of the program for reconstruction and modernisation of the Novi Han Repository - by the end of 2007; and site selection for the national RAW disposal facility - by the end of 2008. The Nuclear Energy Act defines the following future activities: establishment of the State Enterprise 'RAW' in 2004; development of new secondary legislation for the safe management of SF and RAW until July 2004; and update of the National Strategy for Safe Management of SF and RAW until the end of 2003.
General description of few-body break-up processes at threshold
International Nuclear Information System (INIS)
Barrachina, R.O.
2005-01-01
In this communication we describe the effects produced by an N-body threshold behavior in N + 1 body break-up processes, as it occurs in situations where one of the fragments acquires almost all the excess energy of the system. Furthermore, we relate the appearance of discontinuities in single-particle multiply differential cross sections to the threshold behavior of the remaining particles, and describe the applicability of these ideas to different systems from atomic, molecular and nuclear collision physics. We finally show that, even though the study of ultracold collisions represents the direct way of gathering information on a break-up system near threshold, the analysis of high-energy collisions provides an alternative, and sometimes advantageous, approach
Generalization of the Wide-Sense Markov Concept to a Widely Linear Processing
International Nuclear Information System (INIS)
Espinosa-Pulido, Juan Antonio; Navarro-Moreno, Jesús; Fernández-Alcalá, Rosa María; Ruiz-Molina, Juan Carlos; Oya-Lechuga, Antonia; Ruiz-Fuentes, Nuria
2014-01-01
In this paper we show that the classical definition and the associated characterizations of wide-sense Markov (WSM) signals are not valid for improper complex signals. For that, we propose an extension of the concept of WSM to a widely linear (WL) setting and the study of new characterizations. Specifically, we introduce a new class of signals, called widely linear Markov (WLM) signals, and we analyze some of their properties based either on second-order properties or on state-space models from a WL processing standpoint. The study is performed in both the forwards and backwards directions of time. Thus, we provide both forwards and backwards Markovian representations for WLM signals. Finally, different recursive estimation algorithms are obtained for these models.
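The practical point of WL processing can be sketched with a generic estimation example (not the paper's WLM algorithms): for an improper signal — here a real-valued, hence maximally improper, signal observed in proper complex noise — the widely linear estimator w1·x + w2·conj(x) beats the strictly linear estimator w·x.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000

s = rng.standard_normal(N)                       # real signal: maximally improper
noise = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)
x = s + noise                                    # proper complex noise, unit power

# Strictly linear MMSE estimate: s_hat = w * x
w = np.vdot(x, s) / np.vdot(x, x)
mse_sl = np.mean(np.abs(s - w * x) ** 2)

# Widely linear MMSE estimate: s_hat = w1 * x + w2 * conj(x),
# solved through the augmented observation [x, conj(x)]
A = np.column_stack([x, np.conj(x)])
w_wl, *_ = np.linalg.lstsq(A, s.astype(complex), rcond=None)
mse_wl = np.mean(np.abs(s - A @ w_wl) ** 2)

print(f"strictly linear MSE = {mse_sl:.3f}, widely linear MSE = {mse_wl:.3f}")
```

In this setup the strictly linear MSE is 0.5 while the widely linear MSE drops to 1/3: the conjugate term lets the estimator exploit the signal's impropriety, which is exactly the second-order structure the WLM framework builds on.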
A Full Mesh ATCA-based General Purpose Data Processing Board: Pulsar II
Olsen, J; Okumura, Y
2014-01-01
High luminosity conditions at the LHC pose many unique challenges for potential silicon-based track trigger systems. Among those challenges is data formatting, where hits from thousands of silicon modules must first be shared and organized into overlapping trigger towers. Other challenges exist for Level-1 track triggers, where many parallel data paths may be used for high-speed time-multiplexed data transfers. Communication between processing nodes requires high bandwidth, low latency, and flexible real-time data sharing, for which a full mesh backplane is a natural fit. A custom full mesh enabled ATCA board called the Pulsar II has been designed with the goal of creating a scalable architecture abundant in flexible, non-blocking, high-bandwidth board-to-board communication channels while keeping the design as simple as possible.
NJOY91, General ENDF/B Processing System for Reactor Design Problems
International Nuclear Information System (INIS)
MacFarlane, R.E.; Barrett, R.J.; Muir, D.W.; Boicourt, R.M.
1997-01-01
1 - Description of problem or function: The NJOY nuclear data processing system is a comprehensive computer code package for producing pointwise and multigroup neutron, photon, and charged particle cross sections from ENDF/B evaluated nuclear data. NJOY-89 is a substantial upgrade of the previous release. It includes photon production and photon interaction capabilities, heating calculations, covariance processing, and thermal scattering capabilities. It is capable of processing data in ENDF/B-4, ENDF/B-5, and ENDF/B-6 formats for evaluated data (to the extent that the latter have been frozen at the time of this release). NJOY-91.118: This is the last in the NJOY-91 series. It uses the same module structure as the earlier versions and its graphics options depend on DISSPLA. NJOY91.118 includes bug fixes, improvements in several modules, and some new capabilities. Information on the changes is included in the README file. A new test problem was added to test some ENDF/B-6 features, including Reich-Moore resonance reconstruction, energy-angle matrices in GROUPR, and energy-angle distributions in ACER. The 91.118 release is basically configured for UNIX. Short descriptions of the different modules follow: RECONR Reconstructs pointwise (energy-dependent) cross sections from ENDF/B resonance parameters and interpolation schemes. BROADR Doppler broadens and thins pointwise cross sections. UNRESR Computes effective self-shielded pointwise cross sections in the unresolved-resonance region. HEATR Generates pointwise heat production cross sections (KERMA factors) and radiation-damage-energy production cross sections. THERMR Produces incoherent inelastic energy-to-energy matrices for free or bound scatterers, coherent elastic cross sections for hexagonal materials, and incoherent elastic cross sections. GROUPR Generates self-shielded multigroup cross sections, group-to-group neutron scattering matrices, and photon production matrices from pointwise input. GAMINR Calculates
Directory of Open Access Journals (Sweden)
Daniel Alfonso-Robaina
2011-09-01
To improve the process approach in the redesign of the company's organization, six phases must be adapted. The proposal presents the activities of each phase of the organizational redesign procedure for improving the process approach, as well as their inputs and outputs. The procedure proposed in this research is the result of merging several of those studied, taking the procedure of Rummler and Brache (1995) [1] as its basis. Several techniques proved useful in the research, notably interviews, brainstorming and bibliographic search, together with tools such as the Process Map and the General Model of Organization. Using these techniques and tools, the insufficient integrated management of processes was identified as the critical business issue at the Explomat company, which weakens the entity's ability to take advantage of the opportunities offered by its environment and endangers the fulfilment of its mission. Based on the analysis of the level of integration of the management system, derived from the relationship matrices, improvements were projected by drawing up the "should-be" state.
Signal processing and general purpose data acquisition system for on-line tomographic measurements
Murari, A.; Martin, P.; Hemming, O.; Manduchi, G.; Marrelli, L.; Taliercio, C.; Hoffmann, A.
1997-01-01
New analog signal conditioning electronics and data acquisition systems have been developed for the soft x-ray and bolometric tomography diagnostics in the reverse field pinch experiment (RFX). For the soft x-ray detectors, the analog signal processing includes a fully differential current-to-voltage conversion, with up to a 200 kHz bandwidth. For the bolometers, a 50 kHz carrier-frequency amplifier allows a maximum bandwidth of 10 kHz. In both cases the analog signals are digitized with a 1 MHz sampling rate close to the diagnostic and are transmitted via a transparent asynchronous xmitter/receiver interface (TAXI) link to purpose-built Versa Module Europa (VME) modules which perform data acquisition. A software library has been developed for data preprocessing and tomographic reconstruction. It has been written in the C language and is self-contained, i.e., no additional mathematical library is required. The package is therefore platform-independent: in particular, it can perform online analysis in a real-time application, such as continuous display and feedback, and is portable for long-duration fusion or other physical experiments. Due to the modular organization of the library, new preprocessing and analysis modules can be easily integrated into the environment. This software is implemented in RFX over three different platforms: OpenVMS, Digital Unix, and the VME 68040 CPU.
NJOY-94, General ENDF/B Processing System for Reactor Design Problems
International Nuclear Information System (INIS)
1997-01-01
1 - Description of program or function: The NJOY nuclear data processing system is a comprehensive computer code system for producing pointwise and multigroup cross sections and related quantities from ENDF/B evaluated nuclear data in the ENDF format, including the latest US library, ENDF/B-VI. The NJOY code works with neutrons, photons, and charged particles and produces libraries for a wide variety of particle transport and reactor analysis codes. It is capable of processing data in ENDF/B-4, ENDF/B-5, and ENDF/B-6 formats for evaluated data. Short descriptions of the different modules follow: RECONR Reconstructs pointwise cross sections from ENDF/B resonance parameters and interpolation schemes. BROADR Doppler broadens and thins pointwise cross sections. UNRESR Computes effective self-shielded pointwise cross sections in the unresolved-resonance region. HEATR Generates pointwise heat production cross sections and radiation-damage-energy production cross sections. THERMR Produces incoherent inelastic energy-to-energy matrices for free or bound scatterers, coherent elastic cross sections for hexagonal materials, and incoherent elastic cross sections. GROUPR Generates self-shielded multigroup cross sections, group-to-group neutron scattering matrices, and photon production matrices from pointwise input. GAMINR Calculates multigroup photon interaction cross sections and KERMA factors and group-to-group photon scattering matrices. ERRORR Produces multigroup covariance matrices from ENDF/B uncertainties. COVR Reads the output of ERRORR and performs covariance plotting and output-formatting operations. DTFR Formats multigroup data for transport codes such as DTF-IV and ANISN. CCCCR Formats multigroup data for the CCCC standard interface files ISOTXS, BRKOXS, and DLAYXS. MATXSR Formats multigroup data for the MATXS cross section interface file. ACER Prepares libraries for the Los Alamos continuous-energy Monte Carlo code MCNP. POWR Prepares libraries for the EPRI
Chu, Shih-I.; Telnov, Dmitry A.
2004-02-01
The advancement of high-power and short-pulse laser technology in the past two decades has generated considerable interest in the study of multiphoton and very high-order nonlinear optical processes of atomic and molecular systems in intense and superintense laser fields, leading to the discovery of a host of novel strong-field phenomena which cannot be understood by conventional perturbation theory. The Floquet theorem and the time-independent Floquet Hamiltonian method provide a powerful theoretical framework for the study of bound-bound multiphoton transitions driven by periodically time-dependent fields. However, there are a number of significant strong-field processes that cannot be directly treated by the conventional Floquet methods. In this review article, we discuss several recent developments of generalized Floquet theorems, formalisms, and quasienergy methods, beyond the conventional Floquet theorem, for accurate nonperturbative treatment of a broad range of strong-field atomic and molecular processes and phenomena of current interest. Topics covered include (a) the artificial intelligence (AI) most-probable-path approach (MPPA) for effective treatment of ultralarge Floquet matrix problems; (b) non-Hermitian Floquet formalisms and complex quasienergy methods for nonperturbative treatment of bound-free and free-free processes such as multiphoton ionization (MPI) and above-threshold ionization (ATI) of atoms and molecules, multiphoton dissociation (MPD) and above-threshold dissociation (ATD) of molecules, chemical bond softening and hardening, charge-resonance-enhanced ionization (CREI) of molecular ions, and multiple high-order harmonic generation (HHG), etc.; (c) the many-mode Floquet theorem (MMFT) for exact treatment of multiphoton processes in multi-color laser fields with a nonperiodic time-dependent Hamiltonian; (d) the Floquet-Liouville supermatrix (FLSM) formalism for exact nonperturbative treatment of the time-dependent Liouville equation (allowing for relaxations and
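As a minimal sketch of the time-independent Floquet Hamiltonian idea (a generic two-level example, not the review's AI-MPPA or non-Hermitian machinery): the periodic Hamiltonian is expanded in Fourier (photon) blocks, the resulting time-independent matrix is truncated and diagonalized, and its eigenvalues are the quasienergies, defined modulo the photon energy. All parameters below are illustrative.

```python
import numpy as np

# Two-level system: H(t) = (delta/2) sigma_z + (omega_r/2) cos(w t) sigma_x
delta, omega_r, w = 1.0, 0.2, 1.0       # illustrative, resonantly driven
N = 20                                  # Fourier (photon-block) truncation

h0 = np.diag([0.5, -0.5]) * delta                      # static part
v = np.array([[0.0, 1.0], [1.0, 0.0]]) * (omega_r / 4) # cos drive => V/2 per sideband

dim = 2 * (2 * N + 1)
HF = np.zeros((dim, dim))
for n in range(-N, N + 1):
    i = 2 * (n + N)
    HF[i:i+2, i:i+2] = h0 + n * w * np.eye(2)          # diagonal photon block
    if n < N:
        HF[i:i+2, i+2:i+4] = v                         # coupling to block n+1
        HF[i+2:i+4, i:i+2] = v
eigs = np.linalg.eigvalsh(HF)

# Quasienergies are defined modulo w: fold into the first "Brillouin zone"
folded = (eigs + w / 2) % w - w / 2
central = folded[dim // 3: 2 * dim // 3]               # away from truncation edges
print(sorted(set(np.round(central, 2))))               # two quasienergies per zone
```

Away from the truncation edges the folded spectrum collapses onto just two quasienergy values repeated across the photon ladder, which is the bound-bound structure the conventional Floquet theorem captures; the generalized formalisms reviewed above extend this picture to decaying (complex-quasienergy) and multi-color cases.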
Energy Technology Data Exchange (ETDEWEB)
2008-06-01
This report evaluates alternative processes that could be used to produce Pu-238-fueled General Purpose Heat Sources (GPHS) for radioisotope thermoelectric generators (RTGs). The current GPHS fabrication process has remained essentially unchanged since its development in the 1970s. Meanwhile, 30 years of technological advancements have been made in the fields of chemistry, manufacturing, ceramics, and control systems. At the Department of Energy's request, alternative manufacturing methods were compared to current methods to determine whether alternative fabrication processes could reduce the hazards, especially the production of respirable fines, while producing an equivalent GPHS product. An expert committee performed the evaluation with input from four national laboratories experienced in Pu-238 handling.
Ubiquitin-aldehyde: a general inhibitor of ubiquitin-recycling processes
International Nuclear Information System (INIS)
Hershko, A.; Rose, I.A.
1987-01-01
The generation and characterization of ubiquitin (Ub)-aldehyde, a potent inhibitor of Ub C-terminal hydrolase, has previously been reported. The authors examine the action of this compound on the Ub-mediated proteolytic pathway using the system derived from rabbit reticulocytes. Addition of Ub-aldehyde was found to strongly inhibit breakdown of added ¹²⁵I-labeled lysozyme, but inhibition was overcome by increasing concentrations of Ub. The following evidence shows the effect of Ub-aldehyde on protein breakdown to be indirectly caused by its interference with the recycling of Ub, leading to exhaustion of the supply of free Ub: (i) Ub-aldehyde markedly increased the accumulation of Ub-protein conjugates coincident with a much decreased rate of conjugate breakdown; (ii) release of Ub from isolated Ub-protein conjugates in the absence of ATP (and therefore not coupled to protein degradation) is markedly inhibited by Ub-aldehyde. On the other hand, the ATP-dependent degradation of the protein moiety of Ub conjugates, which is an integral part of the proteolytic process, is not inhibited by this agent; (iii) direct measurement of levels of free Ub showed a rapid disappearance caused by the inhibitor. The Ub is found to be distributed in derivatives of a wide range of molecular weight classes. It thus seems that Ub-aldehyde, previously demonstrated to inhibit the hydrolysis of Ub conjugates of small molecules, also inhibits the activity of a series of enzymes that regenerate free Ub from adducts with proteins and intermediates in protein breakdown.
Sotirov, Sotir
2016-01-01
The book offers a comprehensive and timely overview of advanced mathematical tools for both uncertainty analysis and modeling of parallel processes, with a special emphasis on intuitionistic fuzzy sets and generalized nets. The different chapters, written by active researchers in their respective areas, are structured to provide a coherent picture of this interdisciplinary yet still evolving field of science. They describe key tools and give practical insights into and research perspectives on the use of Atanassov's intuitionistic fuzzy sets and logic, and generalized nets for describing and dealing with uncertainty in different areas of science, technology and business, in a single, to date unique book. Here, readers find theoretical chapters, dealing with intuitionistic fuzzy operators, membership functions and algorithms, among other topics, as well as application-oriented chapters, reporting on the implementation of methods and relevant case studies in management science, the IT industry, medicine and/or ...
International Nuclear Information System (INIS)
Peters, H.P.; Renn, O.
1983-01-01
The perception of risk has become a major research field since scientists and politicians recognized that scientific risk studies, like the Rasmussen report on nuclear energy, had little impact on public acceptance. With our surveys we aimed to combine two methodological approaches (object perception and attitude theory) and to develop a technique in which the psychic process by which the general public perceives and assesses risk objects could be followed up and analyzed. Psychological experiments isolating relevant factors of qualitative risk properties, as well as demographic surveys measuring belief structure, were carried out. Our results indicate that, contrary to the common conception among natural scientists, people in general are good at estimating the expected fatality value of different risks. But beyond this estimation of fatalities, people also use other criteria (such as personal control) to rank objects by their riskiness. The perceived risk is but one factor influencing attitude. A simplified model of the acceptance-building process is presented, showing that acceptance-building is not a purely individual process. Individuals are linked together by their social environment, so every individual decision is influenced by the decisions of other people.
Lizarralde, I; Fernández-Arévalo, T; Brouckaert, C; Vanrolleghem, P; Ikumi, D S; Ekama, G A; Ayesa, E; Grau, P
2015-05-01
This paper introduces a new general methodology for incorporating physico-chemical and chemical transformations into multi-phase wastewater treatment process models in a systematic and rigorous way under a Plant-Wide modelling (PWM) framework. The methodology presented in this paper requires the selection of the relevant biochemical, chemical and physico-chemical transformations taking place and the definition of the mass transport for the co-existing phases. As an example a mathematical model has been constructed to describe a system for biological COD, nitrogen and phosphorus removal, liquid-gas transfer, precipitation processes, and chemical reactions. The capability of the model has been tested by comparing simulated and experimental results for a nutrient removal system with sludge digestion. Finally, a scenario analysis has been undertaken to show the potential of the obtained mathematical model to study phosphorus recovery. Copyright © 2015 Elsevier Ltd. All rights reserved.
2010-04-01
17 CFR § 249.510 — Form 10-M, consent to service of process by a nonresident general partner of a broker-dealer firm (Commodity and Securities Exchanges).
2010-04-01
17 CFR (Commodity and Securities Exchanges) — Form ADV-NR, appointment of agent for service of process by non-resident general partner and non-resident managing agent of an...
Adriaensens, Stefanie; Beyers, Wim; Struyf, Elke
2015-01-01
The theory that self-esteem is substantially constructed based on social interactions implies that having a stutter could have a negative impact on self-esteem. Specifically, self-esteem during adolescence, a period of life characterized by increased self-consciousness, could be at risk. In addition to studying mean differences between stuttering and non-stuttering adolescents, this article concentrates on the influence of stuttering severity on domain-specific and general self-esteem. Subsequently, we investigate if covert processes on negative communication attitudes, experienced stigma, non-disclosure of stuttering, and (mal)adaptive perfectionism mediate the relationship between stuttering severity and self-esteem. Our sample comprised 55 stuttering and 76 non-stuttering adolescents. They were asked to fill in a battery of questionnaires, consisting of: Subjective Screening of Stuttering, Self-Perception Profile for Adolescents, Erickson S-24, Multidimensional Perfectionism Scale, and the Stigmatization and Disclosure in Adolescents Who Stutter Scale. SEM (structural equation modeling) analyses showed that stuttering severity negatively influences adolescents' evaluations of social acceptance, school competence, the competence to experience a close friendship, and global self-esteem. Maladaptive perfectionism and especially negative communication attitudes fully mediate the negative influence of stuttering severity on self-esteem. Group comparison showed that the mediation model applies to both stuttering and non-stuttering adolescents. We acknowledge the impact of having a stutter on those domains of the self in which social interactions and communication matter most. We then accentuate that negative attitudes about communication situations and excessive worries about saying things in ways they perceive as wrong are important processes to consider with regard to the self-esteem of adolescents who stutter. Moreover, we provide evidence that these covert
Cervera Peris, Mercedes; Alonso Rorís, Víctor Manuel; Santos Gago, Juan Manuel; Álvarez Sabucedo, Luis; Wanden-Berghe, Carmina; Sanz-Valero, Javier
2018-04-03
Any system applied to the control of parenteral nutrition (PN) ought to prove that the process meets the established requirements and include a repository of records to allow evaluation of the information about PN processes at any time. The goal of the research was to evaluate the mobile health (mHealth) app and validate its effectiveness in monitoring the management of the PN process. We studied the evaluation and validation of the general process of PN using an mHealth app. The units of analysis were the PN bags prepared and administered at the Son Espases University Hospital, Palma, Spain, from June 1 to September 6, 2016. For the evaluation of the app, we used the Poststudy System Usability Questionnaire and subsequent analysis with the Cronbach alpha coefficient. Validation was performed by checking the compliance of control for all operations on each of the stages (validation and transcription of the prescription, preparation, conservation, and administration) and by monitoring the operative control points and critical control points. The results obtained from 387 bags were analyzed, with 30 interruptions of administration. The fulfillment of stages was 100%, including noncritical nonconformities in the storage control. The average deviation in the weight of the bags was less than 5%, and the infusion time did not present deviations greater than 1 hour. The developed app successfully passed the evaluation and validation tests and was implemented to perform the monitoring procedures for the overall PN process. A new mobile solution to manage the quality and traceability of sensitive medicines such as blood-derivative drugs and hazardous drugs derived from this project is currently being deployed. ©Mercedes Cervera Peris, Víctor Manuel Alonso Rorís, Juan Manuel Santos Gago, Luis Álvarez Sabucedo, Carmina Wanden-Berghe, Javier Sanz-Valero. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 03.04.2018.
Dost, A A; Redman, D; Cox, G
2000-08-01
This study assesses the current patterns and levels of exposure to rubber fume and rubber process dust in the British rubber industry and compares and contrasts the data obtained from the general rubber goods (GRG), retread tire (RT) and new tire (NT) sectors. A total of 179 rubber companies were visited and data were obtained from 52 general rubber goods, 29 retread tire and 7 new tire manufacturers. The survey was conducted using a questionnaire and included a walk-through inspection of the workplace to assess the extent of use of control measures and the nature of work practices being employed. The most recent (predominantly 1995-97) exposure monitoring data for rubber fume and rubber process dust were obtained from these companies; no additional sampling was conducted for the purpose of this study. In addition to the assessment of exposure data, evaluation of occupational hygiene reports for the quality of information and advice was also carried out. A comparison of the median exposures for processes showed that the order of exposure to rubber fume (E, in mg m⁻³) is: E(moulding) (0.40) ≈ E(extrusion) (0.33) > E(milling) (0.18) for GRG; E(press) (0.32) > E(extrusion) (0.19) > E(autoclave) (0.10) for RT; and E(press) (0.22) ≈ E(all other) (0.22) for NT. The order of exposure to rubber fume between sectors was E(GRG) (0.40) > E(RT) (0.32) > E(NT) (0.22). Median exposures to rubber process dust in the GRG sector were E(weighing) (4.2) > E(mixing) (1.2) ≈ E(milling) (0.8) ≈ E(extrusion) (0.8), with no significant difference (P=0.31) between the GRG and NT sectors. The findings compare well with the study carried out in the Netherlands [Kromhout et al. (1994), Annals of Occupational Hygiene 38(1), 3-22], and it is suggested that the factors governing the significant differences noted between the three sectors relate principally to the production and task functions and also to the extent of controls employed.
Professional orientation in the formative process for General Unified Baccalaureate students
Directory of Open Access Journals (Sweden)
Darwin Stalin Faz-Delgado
2016-11-01
Full Text Available In Ecuador, primary education aims to develop the abilities, skills and linguistic competence of children and teenagers from 5 years old until they reach high school. High school's main objective is to provide students with a general, interdisciplinary preparation that guides them in elaborating their life projects so that they can fit into society as responsible, critical and solidary human beings. It also aims to develop students' abilities in knowledge acquisition and citizen competence, and to prepare them to work, to learn and to access university; this establishes the importance of an adequate professional orientation that facilitates the conscious selection of their future profession and career. This article presents the theoretical basis of the formation process and professional orientation in high school, with attention to the Ecuadorian context.
Giménez-Alventosa, V; Ballester, F; Vijande, J
2016-12-01
The design and construction of geometries for Monte Carlo calculations is an error-prone, time-consuming, and complex step in simulations describing particle interactions and transport in the field of medical physics. The software VoxelMages has been developed to help the user in this task. It allows the user to design complex geometries and to process DICOM image files for simulations with the general-purpose Monte Carlo code PENELOPE in an easy and straightforward way. VoxelMages also allows DICOM-RT structure contour information, as delivered by a treatment planning system, to be imported. Its main characteristics, usage and performance benchmarking are described in detail. Copyright © 2016 Elsevier Ltd. All rights reserved.
International Nuclear Information System (INIS)
Davies, Scott
2011-01-01
We combine the six-dimensional helicity formalism of Cheung and O'Connell with D-dimensional generalized unitarity to obtain a new formalism for computing one-loop amplitudes in dimensionally regularized QCD. With this procedure, we simultaneously obtain the pieces that are constructible from four-dimensional unitarity cuts and the rational pieces that are missed by them, while retaining a helicity formalism. We illustrate the procedure using four- and five-point one-loop amplitudes in QCD, including examples with external fermions. We also demonstrate the technique's effectiveness in next-to-leading order QCD corrections to Higgs processes by computing the next-to-leading order correction to the Higgs plus three positive-helicity gluons amplitude in the large top-quark mass limit.
Trinidade, A; Yung, M W
2014-04-01
A specialist balance clinic to effectively deal with dizzy patients is recommended by ENT-UK. We audit the patient pathway before and following the introduction of a consultant-led dedicated balance clinic. Process evaluation and audit. ENT outpatients department of a district general hospital. The journey of dizzy patients seen in the general ENT clinic was mapped from case notes and recorded retrospectively. A consultant-led, multidisciplinary balance clinic involving an otologist, a senior audiologist and a neurophysiotherapist was then set up, and the journey was prospectively recorded and compared with that before the change. Of the 44 dizzy patients seen in the general clinic, 41% had further follow-up consultations; 64% were given definitive or provisional diagnoses; 75% were discharged without a management plan. Oculomotor examination was not systematically performed. The mean interval between Visits 1 and 2 was 8.4 weeks and the mean number of visits was 3. In the consultant-led dedicated balance clinic, following Visit 1, only 8% of patients required follow-up; 97% received definitive diagnoses, which guided management; all patients left with definitive management plans in place. In all patients, oculomotor assessment was systematically performed and all patients received consultant and, where necessary, allied healthcare professional input. By standardising the management experience for dizzy patients, appropriate and timely treatment can be achieved, allowing for a more seamless and efficient patient journey from referral to treatment. A multidisciplinary balance clinic led by a consultant otologist is the ideal way to achieve this. © 2014 John Wiley & Sons Ltd.
Shen, C.; Fang, K.
2017-12-01
Deep Learning (DL) methods have made revolutionary strides in recent years. A core value proposition of DL is that abstract notions and patterns can be extracted purely from data, without the need for domain expertise. Process-based models (PBM), on the other hand, can be regarded as repositories of human knowledge or hypotheses about how systems function. Here, through computational examples, we argue that there is merit in integrating PBMs with DL due to the imbalance and lack of data in many situations, especially in hydrology. We trained a deep-in-time neural network, the Long Short-Term Memory (LSTM), to learn soil moisture dynamics from Soil Moisture Active Passive (SMAP) Level 3 product. We show that when PBM solutions are integrated into LSTM, the network is able to better generalize across regions. LSTM is able to better utilize PBM solutions than simpler statistical methods. Our results suggest PBMs have generalization value which should be carefully assessed and utilized. We also emphasize that when properly regularized, the deep network is robust and is of superior testing performance compared to simpler methods.
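The deep-in-time recurrence the abstract relies on can be sketched with a bare-bones LSTM cell. The weights here are random toy values, and feeding a hypothetical PBM soil-moisture estimate alongside the observed forcing is only an illustration of the integration idea, not the authors' implementation:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, U, b):
    """One LSTM time step; W, U, b stack the input, forget,
    output and candidate gates (in that order)."""
    H = h.size
    z = W @ x + U @ h + b
    i = sigmoid(z[:H])          # input gate
    f = sigmoid(z[H:2*H])       # forget gate
    o = sigmoid(z[2*H:3*H])     # output gate
    g = np.tanh(z[3*H:])        # candidate cell update
    c = f * c + i * g           # cell state carries long-term memory
    h = o * np.tanh(c)          # hidden state is the step's output
    return h, c

rng = np.random.default_rng(0)
H, D = 8, 2                     # hidden size; input = [forcing, PBM estimate]
W = rng.normal(0.0, 0.3, (4 * H, D))
U = rng.normal(0.0, 0.3, (4 * H, H))
b = np.zeros(4 * H)

h, c = np.zeros(H), np.zeros(H)
for t in range(30):             # toy sequence: rainfall forcing + PBM guess
    x = np.array([rng.random(), 0.5 + 0.1 * np.sin(t / 5.0)])
    h, c = lstm_step(x, h, c, W, U, b)
```

The cell state `c` is what lets the network remember soil-moisture memory across long dry spells; the PBM channel simply enters as one more input feature.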
Directory of Open Access Journals (Sweden)
Isis Didier Lins
2018-03-01
Full Text Available The Generalized Renewal Process (GRP) is a probabilistic model for repairable systems that can represent the usual states of a system after a repair: as new, as old, or in a condition between new and old. It is often coupled with the Weibull distribution, widely used in the reliability context. In this paper, we develop novel GRP models based on probability distributions that stem from Tsallis' non-extensive entropy, namely the q-Exponential and the q-Weibull distributions. The q-Exponential and Weibull distributions can model decreasing, constant or increasing failure intensity functions. However, the power-law behavior of the q-Exponential probability density function for specific parameter values is an advantage over the Weibull distribution when adjusting data containing extreme values. The q-Weibull probability distribution, in turn, can also fit data with bathtub-shaped or unimodal failure intensities in addition to the behaviors already mentioned. Therefore, the q-Exponential-GRP is an alternative to the Weibull-GRP model and the q-Weibull-GRP generalizes both. The method of maximum likelihood is used for parameter estimation by means of a particle swarm optimization algorithm, and Monte Carlo simulations are performed for the sake of validation. The proposed models and algorithms are applied to examples involving reliability-related data of complex systems, and the obtained results suggest that GRP plus q-distributions are promising techniques for the analysis of repairable systems.
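A minimal numerical sketch of the q-distributions the paper builds on, using one common parameterization of the Tsallis q-exponential; the parameter values and the normalization shown are assumptions for illustration, not the paper's exact model:

```python
import numpy as np

def q_exp(x, q):
    """Tsallis q-exponential; reduces to exp(x) as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return np.exp(x)
    base = 1.0 + (1.0 - q) * x
    return np.where(base > 0.0, np.abs(base) ** (1.0 / (1.0 - q)), 0.0)

def q_weibull_pdf(t, q, beta, eta):
    """q-Weibull density in a common Tsallis-style form (1 <= q < 2);
    substituting u = (t/eta)**beta shows it integrates to 1."""
    u = (t / eta) ** beta
    return (2.0 - q) * (beta / eta) * (t / eta) ** (beta - 1.0) * q_exp(-u, q)

t = np.linspace(1e-6, 50.0, 200_001)
f_q = q_weibull_pdf(t, 1.3, 2.0, 1.0)   # q > 1: power-law (heavy) tail
f_w = q_weibull_pdf(t, 1.0, 2.0, 1.0)   # q = 1: plain Weibull limit
# trapezoidal check that the q-density is properly normalized
area = np.sum(0.5 * (f_q[1:] + f_q[:-1]) * np.diff(t))
```

The heavy power-law tail for q > 1 is what gives the q-distributions their advantage over the plain Weibull when the failure data contain extreme values.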
Directory of Open Access Journals (Sweden)
Nicolae Răzvan Decuseară
2013-01-01
Full Text Available Due to limited resources, a company cannot serve all potential markets in the world in a manner that satisfies all clients and achieves its business goals, which is why the company should select the most appropriate markets. It can focus on a single product market serving many geographic areas, but may also decide to serve different product markets in a group of selected geographic areas. Because of the large number and diversity of markets to choose from, analyzing market attractiveness and selecting the most interesting markets is a complex process. The General Electric/McKinsey Matrix has two dimensions, market attractiveness and the competitive strength of the firm, and aims to analyze the strengths and weaknesses of the company in a variety of areas, allowing the company to identify the most attractive markets and to guide managers in allocating resources to these markets, improve the weaker competitive position of the company in emerging markets, or withdraw the firm from unattractive markets. We can say that it is a very efficient tool, used by international market specialists on the one hand to select foreign markets for the company and, on the other hand, to determine the strategy the firm will use to internationalize in those markets. At the end of this paper we present part of a larger study in which we show how the General Electric/McKinsey Matrix is used specifically to select foreign markets.
Werner, Gerhard
2009-04-01
In this theoretical and speculative essay, I propose that insights into certain aspects of neural system functions can be gained from viewing brain function in terms of the branch of Statistical Mechanics currently referred to as "Modern Critical Theory" [Stanley, H.E., 1987. Introduction to Phase Transitions and Critical Phenomena. Oxford University Press; Marro, J., Dickman, R., 1999. Nonequilibrium Phase Transitions in Lattice Models. Cambridge University Press, Cambridge, UK]. The application of this framework is here explored in two stages: in the first place, its principles are applied to state transitions in global brain dynamics, with benchmarks of Cognitive Neuroscience providing the relevant empirical reference points. The second stage generalizes to suggest in more detail how the same principles could also apply to the relation between other levels of the structural-functional hierarchy of the nervous system and between neural assemblies. In this view, state transitions resulting from the processing at one level are the input to the next, in the image of a 'bucket brigade', with the content of each bucket being passed on along the chain, after having undergone a state transition. The unique features of a process of this kind will be discussed and illustrated.
Salata, Brian M; Sterling, Madeline R; Beecy, Ashley N; Ullal, Ajayram V; Jones, Erica C; Horn, Evelyn M; Goyal, Parag
2018-05-01
Given high rates of heart failure (HF) hospitalizations and widespread adoption of the hospitalist model, patients with HF are often cared for on General Medicine (GM) services. Differences in discharge processes and 30-day readmission rates between patients on GM and those on Cardiology during the contemporary hospitalist era are unknown. The present study compared discharge processes and 30-day readmission rates of patients with HF admitted on GM services and those on Cardiology services. We retrospectively studied 926 patients discharged home after HF hospitalization. The primary outcome was 30-day all-cause readmission after discharge from index hospitalization. Although 60% of patients with HF were admitted to Cardiology services, 40% were admitted to GM services. Prevalence of cardiovascular and noncardiovascular co-morbidities were similar between patients admitted to GM services and Cardiology services. Discharge summaries for patients on GM services were less likely to have reassessments of ejection fraction, new study results, weights, discharge vital signs, discharge physical examinations, and scheduled follow-up cardiologist appointments. In a multivariable regression analysis, patients on GM services were more likely to experience 30-day readmissions compared with those on Cardiology services (odds ratio 1.43 95% confidence interval [1.05 to 1.96], p = 0.02). In conclusion, outcomes are better among those admitted to Cardiology services, signaling the need for studies and interventions focusing on noncardiology hospital providers that care for patients with HF. Copyright © 2018 Elsevier Inc. All rights reserved.
Hama, Hiromitsu; Yamashita, Kazumi
1991-11-01
A new method for video signal processing is described in this paper. The goal is real-time image transformation with low-cost, low-power, small-size hardware, which is impossible without special hardware. Here a generalized digital differential analyzer (DDA) and control memory (CM) play a very important role. The processing causes indentation, called jaggy, on the boundary between the background and the foreground. Jaggies do not occur inside the transformed image because linear interpolation is adopted, but they do occur inherently on the boundary between the background and the transformed image. This deteriorates image quality and must be avoided. There are two well-known ways to improve image quality: blurring and supersampling. The former has little effect, and the latter has a much higher computing cost. To settle this problem, a method is proposed that searches for positions that may give rise to jaggies and smooths those points. Computer simulations based on real data from a VTR, one scene of a movie, are presented to demonstrate the proposed scheme using the DDA and CMs and to confirm its effectiveness on various transformations.
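Of the two standard remedies the abstract compares, supersampling can be illustrated in a few lines; the half-plane scene and grid sizes below are invented for the demonstration and are unrelated to the paper's hardware method:

```python
import numpy as np

def render(n, ss=1):
    """Rasterize the half-plane y > x on an n x n grid.
    ss is the supersampling factor: ss*ss samples per pixel, box-averaged."""
    m = n * ss
    coords = (np.arange(m) + 0.5) / ss          # sub-pixel sample centers
    X, Y = np.meshgrid(coords, coords)
    samples = (Y > X).astype(float)             # 1 = foreground, 0 = background
    # average each ss x ss block back down to one output pixel
    return samples.reshape(n, ss, n, ss).mean(axis=(1, 3))

aliased = render(16, ss=1)    # hard 0/1 edge along the diagonal -> jaggies
smoothed = render(16, ss=4)   # intermediate grays soften the staircase edge
```

The aliased render contains only pure 0/1 pixels, so the diagonal edge shows a staircase; the supersampled render produces fractional coverage values along the boundary, which is exactly the smoothing effect the paper seeks at much lower cost by treating only the detected boundary points.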
Gao, Lili; Zhou, Zai-Fa; Huang, Qing-An
2017-11-08
A microstructure beam is one of the fundamental elements in MEMS devices like cantilever sensors, RF/optical switches, varactors, resonators, etc. It is still difficult to precisely predict the performance of MEMS beams with the currently available simulators due to the inevitable process deviations. Feasible numerical methods are required and can be used to improve the yield and profits of MEMS devices. In this work, process deviations are considered to be stochastic variables, and a newly-developed numerical method, i.e., generalized polynomial chaos (GPC), is applied for the simulation of the MEMS beam. The doubly-clamped polybeam has been utilized to verify the accuracy of GPC, compared with our Monte Carlo (MC) approaches. Performance predictions have been made on the residual stress by obtaining its distributions in GaAs Monolithic Microwave Integrated Circuit (MMIC)-based MEMS beams. The results show that errors are within 1% for the GPC approximations compared with the MC simulations. Appropriate choices of the 4-order GPC expansions with orthogonal terms have also succeeded in reducing the MC simulation labor. The mean value of the residual stress, concluded from experimental tests, differs by about 1.1% from that of the 4-order GPC method. The 4-order GPC approximation attains the mean test value of the residual stress with a probability of about 54.3%. The corresponding yield exceeds 90 percent within two standard deviations of the mean.
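The projection step behind a 4th-order polynomial-chaos expansion can be sketched in one dimension. The "residual stress" response `g` below is a made-up nonlinear function of a single standard-normal process deviation, not the paper's beam model; the point is how few quadrature evaluations reproduce the Monte Carlo statistics:

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval

# Toy response: residual stress as a function of one N(0,1) deviation xi.
g = lambda xi: 100.0 * np.exp(0.1 * xi)

order = 4
nodes, weights = hermegauss(20)             # weight function exp(-x^2/2)
weights = weights / np.sqrt(2.0 * np.pi)    # normalize to the N(0,1) density

# c_k = E[g(xi) He_k(xi)] / k!  (He_k orthogonal with E[He_k^2] = k!)
coeff = []
for k in range(order + 1):
    ek = np.zeros(order + 1)
    ek[k] = 1.0
    coeff.append(np.sum(weights * g(nodes) * hermeval(nodes, ek))
                 / math.factorial(k))

mean_gpc = coeff[0]                                      # E[g] = c_0
var_gpc = sum(c**2 * math.factorial(k)                   # Var = sum c_k^2 k!
              for k, c in enumerate(coeff) if k > 0)

# Monte Carlo reference on the same toy response
rng = np.random.default_rng(1)
samples = g(rng.standard_normal(200_000))
```

Twenty quadrature evaluations recover the mean and variance that Monte Carlo needs hundreds of thousands of samples to estimate, which is the labor reduction the abstract reports.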
Directory of Open Access Journals (Sweden)
Yingjun Zheng
2016-11-01
Full Text Available Patients with schizophrenia exhibit consistent abnormalities in face-evoked N170. However, the relation between face-specific N170 abnormalities in schizophrenic patients and clinical characteristics of schizophrenia, which are probably based on common neural mechanisms, has rarely been investigated. Using event-related potential (ERP) recordings in both schizophrenic patients and healthy controls, the amplitude and latency of N170 were recorded while participants passively watched face and non-face (table) pictures. The results showed a face-specific N170 latency sluggishness in schizophrenic patients, i.e., the N170 latencies of schizophrenic patients were significantly longer than those of healthy controls under both upright-face and inverted-face conditions. Importantly, the face-related N170 latencies at the left temporo-occipital electrodes (P7 and PO7) were positively correlated with negative symptoms and general psychiatric symptoms. Besides the analysis of latencies, the N170 amplitudes became weaker in schizophrenic patients under both inverted-face and inverted-table conditions, with a left-hemisphere dominance. More interestingly, the FIEs (the difference in N170 amplitudes between upright and inverted faces) were absent in schizophrenic patients, which suggests an abnormality of holistic face processing. These results reveal a marked symptom-relevant neural sluggishness of face-specific processing in schizophrenic patients, supporting the demyelinating hypothesis of schizophrenia.
Collins, Michael D; Jackson, Chris J; Walker, Benjamin R; O'Connor, Peter J; Gardiner, Elliroma
2017-01-01
Over the last 40 years or more the personality literature has been dominated by trait models based on the Big Five (B5). Trait-based models describe personality at the between-person level but cannot explain the within-person mental mechanisms responsible for personality. Nor can they adequately account for variations in emotion and behavior experienced by individuals across different situations and over time. An alternative, yet understated, approach to personality architecture can be found in neurobiological theories of personality, most notably reinforcement sensitivity theory (RST). In contrast to static trait-based personality models like the B5, RST provides a more plausible basis for a personality process model, namely, one that explains how emotions and behavior arise from the dynamic interaction between contextual factors and within-person mental mechanisms. In this article, the authors review the evolution of a neurobiologically based personality process model based on RST, the response modulation model and the context-appropriate balanced attention model. They argue that by integrating this complex literature, and by incorporating evidence from personality neuroscience, one can meaningfully explain personality at both the within- and between-person levels. This approach achieves a domain-general architecture based on RST and self-regulation that can be used to align within-person mental mechanisms, neurobiological systems and between-person measurement models. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
International Nuclear Information System (INIS)
Saida, Hiromi
2006-01-01
When a black hole is in an empty space in which there is no matter field except that of the Hawking radiation (Hawking field), then the black hole evaporates and the entropy of the black hole decreases. The generalized second law guarantees the increase of the total entropy of the whole system which consists of the black hole and the Hawking field. That is, the increase of the entropy of the Hawking field is faster than the decrease of the black hole entropy. In a naive sense, one may expect that the entropy increase of the Hawking field is due to the self-interaction among the composite particles of the Hawking field, and that the self-relaxation of the Hawking field results in the entropy increase. Then, when one considers a non-self-interacting matter field as the Hawking field, it is obvious that self-relaxation does not take place, and one may think that the total entropy does not increase. However, using nonequilibrium thermodynamics which has been developed recently, we find for the non-self-interacting Hawking field that the rate of entropy increase of the Hawking field (the entropy emission rate by the black hole) grows faster than the rate of entropy decrease of the black hole during the black hole evaporation in empty space. The origin of the entropy increase of the Hawking field is the increase of the black hole temperature. Hence an understanding of the generalized second law in the context of nonequilibrium thermodynamics is suggested; even if the self-relaxation of the Hawking field does not take place, the temperature increase of the black hole during the evaporation process causes the entropy increase of the Hawking field to result in the increase of the total entropy
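For orientation, the standard Schwarzschild relations behind this argument (in geometric units; a textbook sketch, not the paper's nonequilibrium treatment) are:

```latex
S_{\mathrm{BH}} = \frac{A}{4} = 4\pi M^{2}, \qquad
T_{\mathrm{H}} = \frac{1}{8\pi M}, \qquad
\frac{dS_{\mathrm{BH}}}{dt} = \frac{1}{T_{\mathrm{H}}}\frac{dM}{dt} < 0,
\qquad
\frac{dS_{\mathrm{tot}}}{dt} = \frac{dS_{\mathrm{BH}}}{dt} + \frac{dS_{\mathrm{rad}}}{dt} \ge 0 .
```

As M shrinks, T_H grows, so the emitted radiation carries more entropy per unit energy; this is the temperature-increase mechanism the abstract identifies as the origin of the entropy increase of the Hawking field.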
English, Thomas
2005-01-01
A standard tool of reliability analysis used at NASA-JSC is the event tree. An event tree is simply a probability tree, with the probabilities determining the next step through the tree specified at each node. The nodal probabilities are determined by a reliability study of the physical system at work for a particular node. The reliability study performed at a node is typically referred to as a fault tree analysis, with a fault tree potentially existing for each node on the event tree. When examining an event tree it is obvious why the event tree/fault tree approach has been adopted. Typical event trees are quite complex in nature, and the event tree/fault tree approach provides a systematic and organized approach to reliability analysis. The purpose of this study was twofold. Firstly, we wanted to explore the possibility that a semi-Markov process can create dependencies between sojourn times (the times it takes to transition from one state to the next) that can decrease the uncertainty when estimating times to failure. Using a generalized semi-Markov model, we studied a four-element reliability model and were able to demonstrate such sojourn time dependencies. Secondly, we wanted to study the use of semi-Markov processes to introduce a time variable into the event tree diagrams that are commonly developed in PRA (Probabilistic Risk Assessment) analyses. Event tree end states which change with time are more representative of failure scenarios than are the usual static probability-derived end states.
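The notion of transition-specific sojourn times can be made concrete with a toy semi-Markov simulation. The three-state model, distributions and parameters below are invented for illustration and are unrelated to the study's four-element model:

```python
import numpy as np

# Toy semi-Markov reliability model: OK -> DEGRADED -> FAILED, with a
# repair branch back to OK. Each transition has its own sojourn-time
# distribution, which is what distinguishes a semi-Markov process from
# a plain Markov chain (where sojourns would all be exponential).
rng = np.random.default_rng(42)

def time_to_failure():
    state, t = "OK", 0.0
    while state != "FAILED":
        if state == "OK":
            t += rng.exponential(2.0)          # mean 2 until degradation
            state = "DEGRADED"
        else:  # DEGRADED: repaired with prob 0.3, otherwise fails
            if rng.random() < 0.3:
                t += rng.weibull(1.5)          # Weibull repair sojourn
                state = "OK"
            else:
                t += rng.exponential(3.0)      # mean 3 until failure
                state = "FAILED"
    return t

ttf = np.array([time_to_failure() for _ in range(20_000)])
```

Sampling `ttf` gives an end-state distribution over time rather than a static probability, which is the time-dependent event-tree end state the abstract argues for.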
Energy Technology Data Exchange (ETDEWEB)
Porro, C; Biral, G P [Modena Univ. (Italy). Ist. di Fisiologia Umana; Fonda, S; Baraldi, P [Modena Univ. (Italy). Lab. di Bioingegneria della Clinica Oculistica; Cavazzuti, M [Modena Univ. (Italy). Clinica Neurologica
1984-09-01
A general purpose image processing system is described, including a B/W TV camera, a high resolution image processor and display system (TESAK VDC 501), a computer (DEC PDP 11/23), and monochrome and color monitors. Images may be acquired from a microscope equipped with a TV camera or using the TV in direct viewing; the A/D converter and the image processor provide fast (40 ms) and precise (512x512 data points) digitization of the TV signal with a maximum resolution of 256 gray levels. Computer programs, written in FORTRAN and MACRO 11 Assembly Language, have been developed to perform qualitative and quantitative analyses of autoradiographs obtained with the 2-DG method. They include: (1) procedures designed to recognize errors in acquisition due to possible image shading and correct them via software; (2) routines suitable for qualitative analyses of the whole image or selected regions of it, providing the opportunity for pseudocolor coding, statistics, graphic overlays; (3) programs permitting the conversion of gray levels into metabolic rates of glucose utilization and the display of gray- or color-coded metabolic maps.
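The gray-level-to-metabolic-rate conversion mentioned in point (3) is typically a calibration-curve lookup. The sketch below uses piecewise-linear interpolation over hypothetical calibration pairs; the actual 2-DG calibration values are not given in the abstract:

```python
from bisect import bisect_left

# Hypothetical calibration pairs (gray level, metabolic rate in
# umol/100 g/min) from co-exposed standards; values are illustrative only.
CAL = [(0, 0.0), (64, 20.0), (128, 45.0), (192, 80.0), (255, 120.0)]

def gray_to_rate(g):
    """Piecewise-linear interpolation of the calibration curve."""
    levels = [lv for lv, _ in CAL]
    if g <= levels[0]:
        return CAL[0][1]
    if g >= levels[-1]:
        return CAL[-1][1]
    i = bisect_left(levels, g)
    (g0, r0), (g1, r1) = CAL[i - 1], CAL[i]
    return r0 + (r1 - r0) * (g - g0) / (g1 - g0)

print(gray_to_rate(96))  # midway between levels 64 and 128 -> 32.5
```

Applying `gray_to_rate` pixel by pixel turns a digitized autoradiograph into a metabolic map, which can then be color-coded for display.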
Bouchaud, Jean-Philippe; Sornette, Didier
1994-06-01
The ability to price risks and devise optimal investment strategies in the presence of an uncertain "random" market is the cornerstone of modern finance theory. We first consider the simplest such problem, that of a so-called "European call option", initially solved by Black and Scholes using Ito stochastic calculus for markets modelled by a log-Brownian stochastic process. A simple and powerful formalism is presented which allows us to generalize the analysis to a large class of stochastic processes, such as ARCH, jump or Lévy processes. We also address the case of correlated Gaussian processes, which is shown to be a good description of three different market indices (MATIF, CAC40, FTSE100). Our main result is the introduction of the concept of an optimal strategy in the sense of (functional) minimization of the risk with respect to the portfolio. While the risk may be made to vanish for particular continuous uncorrelated "quasi-Gaussian" stochastic processes (including the Black and Scholes model), this is no longer the case for more general stochastic processes. The value of the residual risk is obtained and suggests the concept of risk-corrected option prices. In the presence of very large deviations such as in Lévy processes, new criteria for rational fixing of the option prices are discussed. We also apply our method to other types of options, "Asian", "American", and discuss new possibilities ("double-decker"...). The inclusion of transaction costs leads to the appearance of a natural characteristic trading time scale.
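For reference, the Black-Scholes benchmark that the paper generalizes can be written down directly. The closed-form call price below is the textbook formula for a log-Brownian market; the parameter values are arbitrary:

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """Black-Scholes European call price.

    S: spot, K: strike, T: time to maturity (years),
    r: risk-free rate, sigma: volatility of the log-Brownian process.
    """
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

print(round(bs_call(100, 100, 1.0, 0.05, 0.2), 4))  # ≈ 10.4506
```

In the Black-Scholes setting the hedging risk can be made to vanish by continuous rebalancing; the paper's point is that for ARCH, jump or Lévy dynamics a residual risk remains, motivating risk-corrected prices.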
Tax, N.; van Zelst, S.J.; Teinemaa, I.; Gulden, Jens; Reinhartz-Berger, Iris; Schmidt, Rainer; Guerreiro, Sérgio; Guédria, Wided; Bera, Palash
2018-01-01
A plethora of automated process discovery techniques have been developed which aim to discover a process model based on event data originating from the execution of business processes. The aim of the discovered process models is to describe the control-flow of the underlying business process. At the
Condon, Steven; Hendrick, Robert; Stark, Michael E.; Steger, Warren
1997-01-01
The Flight Dynamics Division (FDD) of NASA's Goddard Space Flight Center (GSFC) recently embarked on a far-reaching revision of its process for developing and maintaining satellite support software. The new process relies on an object-oriented software development method supported by a domain specific library of generalized components. This Generalized Support Software (GSS) Domain Engineering Process is currently in use at the NASA GSFC Software Engineering Laboratory (SEL). The key facets of the GSS process are (1) an architecture for rapid deployment of FDD applications, (2) a reuse asset library for FDD classes, and (3) a paradigm shift from developing software to configuring software for mission support. This paper describes the GSS architecture and process, results of fielding the first applications, lessons learned, and future directions.
Digital Repository Service at National Institute of Oceanography (India)
Nakamoto, S.; Saito, H.; Muneyama, K.; Sato, T.; PrasannaKumar, S.; Kumar, A.; Frouin, R.
-chemical system that supports steady carbon circulation in geological time scale in the world ocean using Mixed Layer-Isopycnal ocean General Circulation model with remotely sensed Coastal Zone Color Scanner (CZCS) chlorophyll pigment concentration....
Directory of Open Access Journals (Sweden)
Viacheslav Mulyk
2017-06-01
Full Text Available Purpose: substantiation of the methodology of the training process of qualified female athletes engaged in bodybuilding in the general preparatory stage of the preparatory period, taking into account the biological cycle. Material & Methods: 18 qualified female athletes engaged in bodybuilding, members of the Kharkov region bodybuilding team, participated in the study. Results: a comparative characterization of the most frequently used training methodologies in bodybuilding is presented. An optimal methodology for qualified female athletes engaged in bodybuilding has been developed and justified, depending on the initial form of the athlete at the beginning of the general preparatory stage of training. The dependence of the change in the body weight of female athletes on the training process is shown. Conclusion: on the basis of the study, the author suggests an optimal training methodology depending on the mesocycle of training in the preparatory period in the general preparatory stage.
Scott, Felipe; Aroca, Germán; Caballero, José Antonio; Conejeros, Raúl
2017-07-01
The aim of this study is to analyze the techno-economic performance of process configurations for ethanol production involving solid-liquid separators and reactors in the saccharification and fermentation stage, a family of process configurations for which few alternatives have been proposed. Since including these process alternatives creates a large number of possible process configurations, a framework for process synthesis and optimization is proposed. This approach is supported by kinetic models fed with experimental data and a plant-wide techno-economic model. Among 150 process configurations, 40 show an improved MESP (minimum ethanol selling price) compared to a well-documented base case (BC); almost all include solid separators, and some show energy retrieved in products 32% higher compared to the BC. Moreover, 16 of them also show a lower capital investment per unit of ethanol produced per year. Several of the process configurations found in this work have not been reported in the literature. Copyright © 2017 Elsevier Ltd. All rights reserved.
Darabi, Aubteen; Kalyuga, Slava
2012-01-01
The process of improving organizational performance through designing systemic interventions has remarkable similarities to designing instruction for improving learners' performance. Both processes deal with subjects (learners and organizations, respectively) with certain capabilities that are exposed to novel information designed for producing…
Hiromori, Tomohito
2009-01-01
The purpose of this study is to examine a process model of L2 learners' motivation. To investigate the overall process of motivation, the motivation of 148 university students was analyzed. Data were collected on three variables from the pre-decisional phase of motivation (i.e., value, expectancy, and intention) and four variables from the…
Leusink, Peter; Teunissen, Doreth; Lucassen, Peter L.; Laan, Ellen T.; Lagro-Janssen, Antoine L.
2018-01-01
Background: The gap between the relatively high prevalence of provoked vulvodynia (PVD) in the general population and the low incidence in primary care can partly be explained by physicians' lack of knowledge about the assessment and management of PVD. Objectives: To recognize barriers and
Directory of Open Access Journals (Sweden)
Luis Emilio Caro Betancourt
2008-09-01
This article discusses the theoretical referents that sustain the professional pedagogical behavior of the Entire General Professor of Secondary School when using computer science in the teaching-learning process, taking into account the introduction of scientific and technical developments (computer science) into education and the professional's role as derived from the demands of the model conceived for Secondary School Education.
Directory of Open Access Journals (Sweden)
Simon de Lusignan
2006-03-01
Conclusions Routinely collected primary care data could contribute more to the process of health improvement; however, those working with these data need to understand fully the complexity of the context within which data entry takes place.
Directory of Open Access Journals (Sweden)
T. P. Varshanina
2016-01-01
This work substantiates the need to ontologically couple methods of prediction of geospace processes with the fundamental bases of the modern epistemological picture of the world. A method based on a structural mask of power geographical fields is proposed. On this basis, a way of solving the problem of indeterminacy and of overcoming the influence of the nonlinearity of geospace processes is developed, together with methods for their point prediction.
Directory of Open Access Journals (Sweden)
Y.M. Furman
2014-10-01
Purpose: to examine the effect of general physical training of young swimmers on the body under an artificially created state of hypercapnic normobaric hypoxia. Material: the study involved 21 swimmers aged 13-14 years holding the third and second sports categories. Results: an original method of working with young swimmers was developed. Studies were conducted for 16 weeks during the preparatory period of the annual macrocycle. The average index of general endurance, based on 800 m race results, improved by 2.80%. Speed-strength endurance increased by 8.24%, and dynamic strength endurance by 18.77%. During the formative experiment, the indicators of speed, agility, static endurance, flexibility and explosive strength of athletes in the first experimental group did not change significantly. Conclusions: it was found that the use of the proposed technique provides a statistically significant increase in overall endurance, speed-strength endurance and dynamic strength endurance.
Abraini, Jacques H; Marassio, Guillaume; David, Helene N; Vallone, Beatrice; Prangé, Thierry; Colloc'h, Nathalie
2014-11-01
The mechanisms by which general anesthetics, including xenon and nitrous oxide, act are only beginning to be discovered. However, structural approaches have revealed weak but specific protein-gas interactions. To improve knowledge, we performed x-ray crystallography studies under xenon and nitrous oxide pressure in a series of 10 binding sites within four proteins. Whatever the pressure, we show that (1) the hydrophobicity of the gas binding sites has a screening effect on xenon and nitrous oxide binding, with a threshold value of 83% above which xenon, and below which nitrous oxide, binds preferentially to a given site; (2) xenon and nitrous oxide occupancies are significantly correlated, respectively, to the product and the ratio of hydrophobicity by volume, indicating that hydrophobicity and volume are binding parameters that complement and oppose each other's effects; and (3) the ratio of occupancy of xenon to nitrous oxide is significantly correlated to the hydrophobicity of their binding sites. These data demonstrate that xenon and nitrous oxide obey different binding mechanisms, a finding that argues against all unitary hypotheses of narcosis and anesthesia, and indicate that the Meyer-Overton rule of a high correlation between anesthetic potency and solubility in lipids of general anesthetics is often overinterpreted. This study provides evidence that the mechanisms of gas binding to proteins, and therefore of general anesthesia, should be considered as the result of a fully reversible interaction between a drug ligand and a receptor, as occurs in classical pharmacology.
International Nuclear Information System (INIS)
Pretschner, D.P.; Pfeiffer, G.; Deutsches Elektronen-Synchrotron
1981-01-01
In the field of nuclear medicine, BASIC and FORTRAN are currently being favoured as higher-level programming languages for computer-aided signal processing, and most operating systems of so-called ''freely programmable analyzers'' in nuclear wards have compilers for this purpose. However, FORTRAN is not an interactive language and thus not suited for conversational computing as a man-machine interface. BASIC, on the other hand, although a useful starting language for beginners, is not sufficiently sophisticated for complex nuclear medicine problems involving detailed calculations. Integration of new methods of signal acquisition, processing and presentation into an existing system or generation of new systems is difficult in FORTRAN, BASIC or ASSEMBLER and can only be done by system specialists, not by nuclear physicians. This problem may be solved by suitable interactive systems that are easy to learn, flexible, transparent and user-friendly. An interactive system of this type, XDS, was developed in the course of a project on evaluation of radiological image sequences. An XDS-generated command processing system for signal and image processing in nuclear medicine is described. The system is characterized by interactive program development and execution, problem-relevant data types, a flexible procedure concept and an integrated system implementation language for modern image processing algorithms. The advantages of the interactive system are illustrated by an example of diagnosis by nuclear methods. (orig.)
Xia, Chuan
2016-12-30
We demonstrate a versatile top-down ion exchange process, done at ambient temperature, to form epitaxial chalcogenide films and devices, with nanometer scale thickness control. To demonstrate the versatility of our process we have synthesized (1) epitaxial chalcogenide metallic and semiconducting films and (2) free-standing chalcogenide films and (3) completed in situ formation of atomically sharp heterojunctions by selective ion exchange. Epitaxial NiCo2S4 thin films prepared by our process show 115 times higher mobility than NiCo2S4 pellets (23 vs 0.2 cm² V⁻¹ s⁻¹) prepared by previous reports. By controlling the ion exchange process time, we made free-standing epitaxial films of NiCo2S4 and transferred them onto different substrates. We also demonstrate in situ formation of atomically sharp, lateral Schottky diodes based on a NiCo2O4/NiCo2S4 heterojunction, using a single ion exchange step. Additionally, we show that our approach can be easily extended to other chalcogenide semiconductors. Specifically, we used our process to prepare Cu1.8S thin films with mobility that matches single crystal Cu1.8S (25 cm² V⁻¹ s⁻¹), which is ca. 28 times higher than the previously reported Cu1.8S thin film mobility (0.58 cm² V⁻¹ s⁻¹), thus demonstrating the universal nature of our process. This is the first report in which chalcogenide thin films retain the epitaxial nature of the precursor oxide films, an approach that will be useful in many applications.
Eppelbaum, Lev
2016-04-01
the basis of multimodel (Eppelbaum and Yakubov, 2004), informational (Eppelbaum, 2014), or wavelet (Eppelbaum et al., 2011, 2014; Eppelbaum, 2015c) approaches. In Israel, a lot of positive results were derived from magnetic method employment with application of the abovementioned procedures at numerous archaeological sites (e.g., Eppelbaum, 2000; Eppelbaum et al., 2000, 2001; Eppelbaum and Itkis, 2003, 2003a; Eppelbaum et al., 2006, 2010; Eppelbaum, 2010a, 2011a, 2014, 2015a). Similar effective techniques were developed for the interpretation of microgravity anomalies (Eppelbaum, 2009b, 2011b, 2015b), temperature anomalies (Eppelbaum, 2009a, 2013a), self-potential anomalies (Eppelbaum et al., 2003b, 2004), induced polarization anomalies (Khesin et al., 1997; Eppelbaum, 2000), piezoelectric anomalies (Neishtadt and Eppelbaum, 2012), and Very Low Frequency (VLF) anomalies (Eppelbaum, 2000; Eppelbaum and Khesin, 2012). The theoretical analysis indicates that a common interpretation methodology may be applied to all aforementioned geophysical methods. The main peculiarities of the developed non-conventional system for analysis of potential and quasi-potential geophysical fields are presented in Table 1, "Elements of the developed system of geophysical fields processing and interpretation under complicated environments" (on the basis of Khesin et al., 1996; Eppelbaum and Khesin, 2001; Eppelbaum et al., 2000, 2001, 2004; Eppelbaum and Yakubov, 2004; Eppelbaum et al., 2006; Eppelbaum, 2009a, 2009b, 2010a, 2010b; Eppelbaum et al., 2010, 2011; Eppelbaum and Mishne, 2011; Eppelbaum, 2011a, 2011b; Neishtadt and Eppelbaum, 2012; Eppelbaum, 2013a, 2013b, 2014; Eppelbaum and Kutasov, 2014; Eppelbaum et al., 2014; Eppelbaum, 2015a, 2015b, 2015c). [Table 1 cells garbled in extraction and omitted; recoverable headings include time-variation correction, terrain correction, informational/multimodel/wavelet approximation, inverse problem solution under complicated conditions, and 3-D integrated modeling.]
Xia, Chuan; Li, Peng; Li, Jun; Jiang, Qiu; Zhang, Xixiang; Alshareef, Husam N.
2016-01-01
) epitaxial chalcogenide metallic and semiconducting films and (2) free-standing chalcogenide films and (3) completed in situ formation of atomically sharp heterojunctions by selective ion exchange. Epitaxial NiCo2S4 thin films prepared by our process show 115
Directory of Open Access Journals (Sweden)
Thais Rabanea-Souza
2016-03-01
Conclusion: The present findings suggest that executive function deficits are present in chronic schizophrenic patients. In addition, specific executive processes might be associated with symptom remission. Future studies examining prospectively first-episode, drug-naive patients diagnosed with schizophrenia may be especially elucidative.
Willson-Conrad, Angela; Kowalske, Megan Grunert
2018-01-01
Retention of students who major in STEM continues to be a major concern for universities. Many students cite poor teaching and disappointing grades as reasons for dropping out of STEM courses. Current college chemistry courses often assess what a student has learned through summative exams. To understand students' experiences of the exam process,…
Ding, Junyan; Johnson, Edward A.; Martin, Yvonne E.
2018-03-01
The diffusive and advective erosion-created landscapes have similar structure (hillslopes and channels) across different scales regardless of variations in drivers and controls. The relative magnitude of diffusive erosion to advective erosion (D/K ratio) in a landscape development model controls hillslope length, shape, and drainage density, which regulate soil moisture variation, one of the critical resources for plants, through the contributing area (A) and local slope (S), represented by a topographic index (TI). Here we explore the theoretical relation between geomorphic processes, TI, and the abundance and distribution of plants. We derived an analytical model that expresses the TI in terms of D, K, and A. This gives us the relation between soil moisture variation and geomorphic processes. Plant tolerance curves are used to link plant performance to soil moisture. Using the hypothetical tolerance curves of three plants, we show that the abundance and distribution of xeric, mesic, and hydric plants on the landscape are regulated by the D/K ratio. Where diffusive erosion is the major erosion process (large D/K ratio), mesic plants have higher abundance relative to xeric and hydric plants, and the landscape has longer, convex-upward hillslopes and low channel density. Increasing the dominance of advective erosion increases the relative abundance of xeric and hydric plants, and the landscape has short, concave hillslopes and high channel density.
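The topographic index mentioned above is commonly taken in the standard wetness-index form TI = ln(a / tan β), with a the specific contributing area and β the local slope; the paper's analytical expression of TI through D, K, and A is not reproduced here. A minimal sketch of the standard form:

```python
from math import log, tan, radians

def topographic_index(area, slope_deg):
    """Topographic wetness index TI = ln(a / tan(beta)).

    area: specific contributing area (m^2 per unit contour width),
    slope_deg: local slope in degrees. Larger area and gentler slope
    give a higher TI, i.e. wetter conditions.
    """
    return log(area / tan(radians(slope_deg)))

print(topographic_index(1000.0, 2.0))  # gentle hollow: high TI (wet)
print(topographic_index(50.0, 25.0))   # steep hillslope: low TI (dry)
```

Mapping TI over a landscape and comparing it against plant tolerance curves is the basic mechanism by which the D/K ratio (via hillslope shape and drainage density) regulates the distribution of xeric, mesic, and hydric plants.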
Yang, X. I. A.; Marusic, I.; Meneveau, C.
2016-06-01
Townsend [Townsend, The Structure of Turbulent Shear Flow (Cambridge University Press, Cambridge, UK, 1976)] hypothesized that the logarithmic region in high-Reynolds-number wall-bounded flows consists of space-filling, self-similar attached eddies. Invoking this hypothesis, we express streamwise velocity fluctuations in the inertial layer in high-Reynolds-number wall-bounded flows as a hierarchical random additive process (HRAP): u_z^+ = Σ_{i=1}^{N_z} a_i. Here u is the streamwise velocity fluctuation, + indicates normalization in wall units, z is the wall normal distance, and the a_i are independently, identically distributed random additives, each of which is associated with an attached eddy in the wall-attached hierarchy. The number of random additives is N_z ~ ln(δ/z), where δ is the boundary layer thickness and ln is the natural log. Due to its simplified structure, such a process leads to predictions of the scaling behaviors for various turbulence statistics in the logarithmic layer. Besides reproducing known logarithmic scaling of moments, structure functions, and the correlation function ⟨u_z(x) u_z(x+r)⟩, new logarithmic laws in two-point statistics such as ⟨u_z^2(x) u_z^2(x+r)⟩^{1/2}, ⟨u_z^3(x) u_z^3(x+r)⟩^{1/3}, etc., can be derived using the HRAP formalism. Supporting empirical evidence for the logarithmic scaling in such statistics is found from the Melbourne High Reynolds Number Boundary Layer Wind Tunnel measurements. We also show that, at high Reynolds numbers, the above mentioned new logarithmic laws can be derived by assuming the arrival of an attached eddy at a generic point in the flow field to be a Poisson process [Woodcock and Marusic, Phys. Fluids 27, 015104 (2015), 10.1063/1.4905301]. Taken together, the results provide new evidence supporting the essential ingredients of the attached eddy hypothesis to describe streamwise velocity fluctuations of large, momentum transporting eddies in wall-bounded turbulence, while observed deviations suggest the need for further extensions of the
The Development of a General Purpose ARM-based Processing Unit for the ATLAS TileCal sROD
Cox, Mitchell Arij; Reed, Robert; Mellado Garcia, Bruce Rafael
2014-01-01
The Large Hadron Collider at CERN generates enormous amounts of raw data which present a serious computing challenge. After Phase-II upgrades in 2022, the data output from the ATLAS Tile Calorimeter will increase by 200 times to 41 Tb/s! ARM processors are common in mobile devices due to their low cost, low energy consumption and high performance. It is proposed that a cost-effective, high data throughput Processing Unit (PU) can be developed by using several consumer ARM processors in a cluster configuration...
Generalized Superconductivity. Generalized Levitation
International Nuclear Information System (INIS)
Ciobanu, B.; Agop, M.
2004-01-01
In recent papers, gravitational superconductivity has been described. We introduce the concept of generalized superconductivity, observing that any nongeodesic motion and, in particular, the motion in an electromagnetic field, can be transformed into a geodesic motion by a suitable choice of the connection. In the present paper, the gravitoelectromagnetic London equations have been obtained from the generalized Helmholtz vortex theorem using the generalized local equivalence principle. In this context, the gravitoelectromagnetic Meissner effect and, implicitly, the gravitoelectromagnetic levitation are given. (authors)
The development of a general purpose ARM-based processing unit for the ATLAS TileCal sROD
Cox, M. A.; Reed, R.; Mellado, B.
2015-01-01
After Phase-II upgrades in 2022, the data output from the LHC ATLAS Tile Calorimeter will increase significantly. ARM processors are common in mobile devices due to their low cost, low energy consumption and high performance. It is proposed that a cost-effective, high data throughput Processing Unit (PU) can be developed by using several consumer ARM processors in a cluster configuration to allow aggregated processing performance and data throughput while maintaining minimal software design difficulty for the end-user. This PU could be used for a variety of high-level functions on the high-throughput raw data such as spectral analysis and histograms to detect possible issues in the detector at a low level. High-throughput I/O interfaces are not typical in consumer ARM System on Chips but high data throughput capabilities are feasible via the novel use of PCI-Express as the I/O interface to the ARM processors. An overview of the PU is given and the results for performance and throughput testing of four different ARM Cortex System on Chips are presented.
The Development of a General Purpose ARM-based Processing Unit for the ATLAS TileCal sROD
Cox, Mitchell Arij; The ATLAS collaboration; Mellado Garcia, Bruce Rafael
2015-01-01
The Large Hadron Collider at CERN generates enormous amounts of raw data which present a serious computing challenge. After Phase-II upgrades in 2022, the data output from the ATLAS Tile Calorimeter will increase by 200 times to 41 Tb/s! ARM processors are common in mobile devices due to their low cost, low energy consumption and high performance. It is proposed that a cost-effective, high data throughput Processing Unit (PU) can be developed by using several consumer ARM processors in a cluster configuration to allow aggregated processing performance and data throughput while maintaining minimal software design difficulty for the end-user. This PU could be used for a variety of high-level functions on the high-throughput raw data such as spectral analysis and histograms to detect possible issues in the detector at a low level. High-throughput I/O interfaces are not typical in consumer ARM System on Chips but high data throughput capabilities are feasible via the novel use of PCI-Express as the I/O interface ...
The development of a general purpose ARM-based processing unit for the ATLAS TileCal sROD
International Nuclear Information System (INIS)
Cox, M A; Reed, R; Mellado, B
2015-01-01
After Phase-II upgrades in 2022, the data output from the LHC ATLAS Tile Calorimeter will increase significantly. ARM processors are common in mobile devices due to their low cost, low energy consumption and high performance. It is proposed that a cost-effective, high data throughput Processing Unit (PU) can be developed by using several consumer ARM processors in a cluster configuration to allow aggregated processing performance and data throughput while maintaining minimal software design difficulty for the end-user. This PU could be used for a variety of high-level functions on the high-throughput raw data such as spectral analysis and histograms to detect possible issues in the detector at a low level. High-throughput I/O interfaces are not typical in consumer ARM System on Chips but high data throughput capabilities are feasible via the novel use of PCI-Express as the I/O interface to the ARM processors. An overview of the PU is given and the results for performance and throughput testing of four different ARM Cortex System on Chips are presented
The Development of a General Purpose ARM-based Processing Unit for the TileCal sROD
Cox, Mitchell A
2014-01-01
The Large Hadron Collider at CERN generates enormous amounts of raw data which present a serious computing challenge. After planned upgrades in 2022, the data output from the ATLAS Tile Calorimeter will increase by 200 times to 41 Tb/s! ARM processors are common in mobile devices due to their low cost, low energy consumption and high performance. It is proposed that a cost-effective, high data throughput Processing Unit (PU) can be developed by using several consumer ARM processors in a cluster configuration to allow aggregated processing performance and data throughput while maintaining minimal software design difficulty for the end-user. This PU could be used for a variety of high-level functions on the high-throughput raw data such as spectral analysis and histograms to detect possible issues in the detector at a low level. High-throughput I/O interfaces are not typical in consumer ARM System on Chips but high data throughput capabilities are feasible via the novel use of PCI-Express as the I/O interface to the ARM processors...
Hirsh, Jacob B; Galinsky, Adam D; Zhong, Chen-Bo
2011-09-01
Social power, alcohol intoxication, and anonymity all have strong influences on human cognition and behavior. However, the social consequences of each of these conditions can be diverse, sometimes producing prosocial outcomes and other times enabling antisocial behavior. We present a general model of disinhibition to explain how these seemingly contradictory effects emerge from a single underlying mechanism: The decreased salience of competing response options prevents activation of the Behavioral Inhibition System (BIS). As a result, the most salient response in any given situation is expressed, regardless of whether it has prosocial or antisocial consequences. We review three distinct routes through which power, alcohol intoxication, and anonymity reduce the salience of competing response options, namely, through Behavioral Approach System (BAS) activation, cognitive depletion, and reduced social desirability concerns. We further discuss how these states can both reveal and shape the person. Overall, our approach allows for multiple domain-specific models to be unified within a common conceptual framework that explains how both situational and dispositional factors can influence the expression of disinhibited behavior, producing both prosocial and antisocial outcomes. © Association for Psychological Science 2011.
International Nuclear Information System (INIS)
Holder, N.D.; Strand, J.B.; Schwarz, F.A.; Drake, R.N.
1981-11-01
The Federal Republic of Germany (FRG) and the United States (US) are cooperating on certain aspects of gas-cooled reactor technology under an umbrella agreement. Under the spent fuel treatment development section of the agreement, both FRG mixed uranium/thorium and low-enriched uranium fuel spheres have been processed in the Department of Energy-sponsored cold pilot plant for high-temperature gas-cooled reactor (HTGR) fuel processing at General Atomic Company in San Diego, California. The FRG fuel spheres were crushed and burned to recover coated fuel particles suitable for further treatment for uranium recovery. Successful completion of the tests described in this paper demonstrated certain modifications to the US HTGR fuel burning process necessary for FRG fuel treatment. Results of the tests will be used in the design of a US/FRG joint prototype headend facility for HTGR fuel.
International Nuclear Information System (INIS)
Piriou, Pierre-Yves; Faure, Jean-Marc; Lesage, Jean-Jacques
2017-01-01
This paper presents a modeling framework that makes it possible to describe in an integrated manner the structure of the critical system to be analyzed, by using an enriched fault tree; the dysfunctional behavior of its components, by means of Markov processes; and the reconfiguration strategies that have been planned to ensure safety and availability, with Moore machines. This framework has been developed from BDMP (Boolean logic Driven Markov Processes), a previous framework for dynamic repairable systems. First, the contribution is motivated by pinpointing the limitations of BDMP in modeling complex reconfiguration strategies and the failures of the control of these strategies. The syntax and semantics of GBDMP (Generalized Boolean logic Driven Markov Processes) are then formally defined; in particular, an algorithm to analyze the dynamic behavior of a GBDMP model is developed. The modeling capabilities of this framework are illustrated on three representative examples. Lastly, qualitative and quantitative analyses of GBDMP models highlight the benefits of the approach.
International Nuclear Information System (INIS)
Holder, N.D.; Strand, J.B.; Schwarz, F.A.; Tischer, H.E.
1980-11-01
The Federal Republic of Germany (FRG) and the United States (US) are cooperating on certain aspects of gas-cooled reactor technology under an umbrella agreement. Under the spent fuel treatment section of the agreement, FRG fuel spheres were recently sent for processing in the Department of Energy-sponsored cold pilot plant for High-Temperature Gas-Cooled Reactor (HTGR) fuel processing at General Atomic Company in San Diego, California. The FRG fuel spheres were crushed and burned to recover coated fuel particles. These particles were in turn crushed and burned to recover the fuel-bearing kernels for further treatment for uranium recovery. Successful completion of the tests described in this paper demonstrated the applicability of the US HTGR fuel treatment flowsheet to FRG fuel processing. 10 figures
Liu, Chunbo; Pan, Feng; Li, Yun
2016-07-29
Glutamate is of great importance in food and pharmaceutical industries. There is still a lack of effective statistical approaches for fault diagnosis in the fermentation process of glutamate. To date, the statistical approach based on generalized additive model (GAM) and bootstrap has not been used for fault diagnosis in fermentation processes, much less the fermentation process of glutamate with small sample sets. A combined approach of GAM and bootstrap was developed for the online fault diagnosis in the fermentation process of glutamate with small sample sets. GAM was first used to model the relationship between glutamate production and different fermentation parameters using online data from four normal fermentation experiments of glutamate. The fitted GAM with fermentation time, dissolved oxygen, oxygen uptake rate and carbon dioxide evolution rate captured 99.6 % of the variance of glutamate production during the fermentation process. Bootstrap was then used to quantify the uncertainty of the estimated production of glutamate from the fitted GAM using a 95 % confidence interval. The proposed approach was then used for the online fault diagnosis in abnormal fermentation processes of glutamate, and a fault was flagged when the estimated production of glutamate fell outside the 95 % confidence interval. The online fault diagnosis based on the proposed approach identified not only the start of a fault in the fermentation process, but also its end, when the fermentation conditions were back to normal. The proposed approach used only a small sample set from normal fermentation experiments to establish the approach, and then required only online recorded data on fermentation parameters for fault diagnosis in the fermentation process of glutamate. The proposed approach based on GAM and bootstrap provides a new and effective way for fault diagnosis in the fermentation process of glutamate with small sample sets.
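The fault-flagging rule described above can be sketched in a hedged form: bootstrap a 95 % interval of plausible observations around a model estimate and flag values that fall outside it. All numbers below are hypothetical stand-ins; in the study, the estimate and residuals would come from the fitted GAM.

```python
import random

random.seed(0)
# Hypothetical residuals from normal fermentation runs (stand-ins for
# residuals of a GAM fitted to the four normal experiments).
residuals = [random.gauss(0.0, 1.0) for _ in range(200)]

def bootstrap_interval(pred, residuals, n_boot=2000, alpha=0.05):
    """Percentile bootstrap of plausible observations around `pred`."""
    draws = sorted(pred + random.choice(residuals) for _ in range(n_boot))
    return draws[int(n_boot * alpha / 2)], draws[int(n_boot * (1 - alpha / 2)) - 1]

def is_fault(observed, pred, residuals):
    """Flag a fault when the observation falls outside the 95 % interval."""
    lo, hi = bootstrap_interval(pred, residuals)
    return not (lo <= observed <= hi)

pred = 10.0  # hypothetical model-estimated glutamate production
print(is_fault(10.3, pred, residuals), is_fault(15.0, pred, residuals))
```

An observation near the estimate stays inside the interval, while one far outside it is flagged; applied to a time series of online measurements, the same rule marks both the onset of a fault and its end once values return inside the interval.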
Togelius, Julian; Yannakakis, Georgios N.; 2016 IEEE Conference on Computational Intelligence and Games (CIG)
2016-01-01
Arguably the grand goal of artificial intelligence research is to produce machines with general intelligence: the capacity to solve multiple problems, not just one. Artificial intelligence (AI) has investigated the general intelligence capacity of machines within the domain of games more than any other domain given the ideal properties of games for that purpose: controlled yet interesting and computationally hard problems. This line of research, however, has so far focused...
Mander, Johannes V; Jacob, Gitta A; Götz, Lea; Sammet, Isa; Zipfel, Stephan; Teufel, Martin
2015-01-01
The study aimed at analyzing associations between Grawe's general mechanisms of change and Young's early maladaptive schemas (EMS). Therefore, 98 patients completed the Scale for the Multiperspective Assessment of General Change Mechanisms in Psychotherapy (SACiP), the Young Schema Questionnaire-Short Form Revised (YSQ-S3R), and diverse outcome measures at the beginning and end of treatment. Our results are important for clinical applications, as we demonstrated strong predictive effects of change mechanisms on schema domains using regression analyses and cross-lagged panel models. Resource activation experiences seem to be especially crucial in fostering alterations in EMS, as this change mechanism demonstrated significant associations with several schema domains. Future research should investigate these aspects in more detail using observer-based micro-process analyses.
Jia, Ding
2017-12-01
The expected indefinite causal structure in quantum gravity poses a challenge to the notion of entanglement: If two parties are in an indefinite causal relation of being causally connected and not, can they still be entangled? If so, how does one measure the amount of entanglement? We propose to generalize the notions of entanglement and entanglement measure to address these questions. Importantly, the generalization opens the path to study quantum entanglement of states, channels, networks, and processes with definite or indefinite causal structure in a unified fashion, e.g., we show that the entanglement distillation capacity of a state, the quantum communication capacity of a channel, and the entanglement generation capacity of a network or a process are different manifestations of one and the same entanglement measure.
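To fix notation, one standard entanglement measure for states is the relative entropy of entanglement (given here only as background; the paper generalizes such measures so that states, channels, networks and processes are treated uniformly):

```latex
% Relative entropy of entanglement: distance from the nearest separable
% state sigma, measured by the quantum relative entropy.
E_R(\rho) \;=\; \min_{\sigma \in \mathrm{SEP}} S(\rho \,\|\, \sigma),
\qquad
S(\rho \,\|\, \sigma) \;=\; \operatorname{Tr}\!\left[\rho \left(\log \rho - \log \sigma\right)\right].
```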
Ehrlich, Carolyn; Kendall, Elizabeth; St John, Winsome
2013-01-01
The aim of this study was to develop understanding about how a registered nurse-provided care coordination model can "fit" within organisational processes and professional relationships in general practice. In this project, registered nurses were involved in implementation of registered nurse-provided care coordination, which aimed to improve quality of care and support patients with chronic conditions to maintain their care and manage their lifestyle. Focus group interviews were conducted with nurses using a semi-structured interview protocol. Interpretive analysis of interview data was conducted using Normalization Process Theory to structure data analysis and interpretation. Three core themes emerged: (1) pre-requisites for care coordination, (2) the intervention in context, and (3) achieving outcomes. Pre-requisites were adequate funding mechanisms, engaging organisational power-brokers, leadership roles, and utilising and valuing registered nurses' broad skill base. To ensure registered nurse-provided care coordination processes were sustainable and embedded, mentoring and support as well as allocated time were required. Finally, when registered nurse-provided care coordination was supported, positive client outcomes were achievable, and transformation of professional practice and development of advanced nursing roles was possible. Registered nurse-provided care coordination could "fit" within the context of general practice if it was adequately resourced. However, the heterogeneity of general practice can create an impasse that could be addressed through close attention to shared and agreed understandings. Successful development and implementation of registered nurse roles in care coordination requires attention to educational preparation, support of the individual nurse, and attention to organisational structures, financial implications and team member relationships.
Sieng, Sokha; Hurst, Cameron
2017-08-07
This study compares a combination of processes of care and clinical targets among patients with type 2 diabetes mellitus (T2DM) between specialist diabetes clinics (SDCs) and general medical clinics (GMCs), and how differences between these two types of clinics vary with hospital type (community, provincial and regional). Type 2 diabetes mellitus patient medical records were collected from 595 hospitals (499 community, 70 provincial, 26 regional) in Thailand between April 1 and June 30, 2012, resulting in a cross-sectional sample of 26,860 patients. Generalized linear mixed modeling was conducted to examine associations between clinic type and quality of care. The outcome variables of interest were split into clinical targets and processes of care. A subsequent subgroup analysis was conducted to examine whether the nature of clinical target and process of care differences between GMCs and SDCs varied with hospital type (regional, provincial, community). Regardless of the type of hospital (regional, provincial, or community), patients attending SDCs were considerably more likely to have eye and foot exams. In larger hospitals (regional and provincial), patients attending SDCs were more likely to achieve the HbA1c exam, All FACE exam, BP target, and the Num7Q. Interestingly, SDCs performed better than GMCs only at provincial hospitals for the LDL-C target and the All7Q. Finally, patients with T2DM who attended community hospital-GMCs had a better chance of achieving the blood pressure target than patients who attended community hospital-SDCs. Specialized diabetes clinics outperformed general medical clinics at both regional and provincial hospitals for all quality of care indicators, and the number of quality of care indicators achieved was never lower. However, this better performance of SDCs was not observed in community hospitals. Indeed, GMCs outperformed SDCs for some quality of care indicators in the community-level setting.
Zhang, Jing; Liang, Lichen; Anderson, Jon R; Gatewood, Lael; Rottenberg, David A; Strother, Stephen C
2008-01-01
As functional magnetic resonance imaging (fMRI) becomes widely used, the demands for evaluation of fMRI processing pipelines and validation of fMRI analysis results are increasing rapidly. The current NPAIRS package, an IDL-based fMRI processing pipeline evaluation framework, lacks system interoperability and the ability to evaluate general linear model (GLM)-based pipelines using prediction metrics. Thus, it cannot fully evaluate fMRI analytical software modules such as FSL.FEAT and NPAIRS.GLM. In order to overcome these limitations, a Java-based fMRI processing pipeline evaluation system was developed. It integrated YALE (a machine learning environment) into Fiswidgets (an fMRI software environment) to obtain system interoperability and applied an algorithm to measure GLM prediction accuracy. The results demonstrated that the system can evaluate fMRI processing pipelines with univariate GLM and multivariate canonical variates analysis (CVA)-based models on real fMRI data based on prediction accuracy (classification accuracy) and statistical parametric image (SPI) reproducibility. In addition, a preliminary study was performed in which four fMRI processing pipelines with GLM and CVA modules such as FSL.FEAT and NPAIRS.CVA were evaluated with the system. The results indicated that (1) the system can compare different fMRI processing pipelines with heterogeneous models (NPAIRS.GLM, NPAIRS.CVA and FSL.FEAT) and rank their performance by automatic performance scoring, and (2) the rank of pipeline performance is highly dependent on the preprocessing operations. These results suggest that the system will be of value for the comparison, validation, standardization and optimization of functional neuroimaging software packages and fMRI processing pipelines.
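The two evaluation metrics combined in NPAIRS-style frameworks can be sketched as follows (a hedged illustration, not the package's implementation): prediction accuracy is the fraction of scans whose experimental condition is classified correctly, and SPI reproducibility is the Pearson correlation between statistical parametric images computed from two independent data splits. The toy vectors below are hypothetical stand-ins.

```python
# Prediction metric: fraction of correctly classified scans.
def accuracy(predicted, actual):
    return sum(p == a for p, a in zip(predicted, actual)) / len(actual)

# Reproducibility metric: Pearson correlation between two SPIs
# (vectors of per-voxel statistics) from independent data splits.
def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy)

pred_labels = ["rest", "task", "task", "rest", "task"]   # classifier output
true_labels = ["rest", "task", "rest", "rest", "task"]   # actual conditions
spi_split1 = [0.1, 2.3, -0.5, 1.8, 0.0]  # toy voxel statistics, split 1
spi_split2 = [0.2, 2.0, -0.1, 1.5, 0.3]  # toy voxel statistics, split 2

print(accuracy(pred_labels, true_labels))   # prediction metric
print(pearson(spi_split1, spi_split2))      # reproducibility metric
```

A pipeline that scores high on both axes generalizes to new data and produces stable activation maps, which is the basis for the automatic performance scoring mentioned above.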
International Nuclear Information System (INIS)
Maekawa, T.; Tanaka, H.; Uchida, M.; Igami, H.
2003-01-01
General properties of the scattering matrix, which governs the mode conversion process between electron Bernstein (B) waves and external electromagnetic (EM) waves in the presence of a steep density gradient, are theoretically analyzed. Based on the analysis, polarization adjustment of incident EM waves for optimal mode conversion to B waves is possible and effective for a range of density gradients near the upper hybrid resonance that is not covered by the previously proposed schemes of perpendicular injection of the X mode and oblique injection of the O mode. Furthermore, the analysis shows that the polarization of the externally emitted EM waves from B waves is uniquely related to the optimized polarization of incident EM waves for B wave heating, and that the mode conversion rate is the same for both the emission process and the injection process with the optimized polarization
Weis, Daniel; Willems, Helmut
2017-06-01
The article deals with the question of how aggregated data which allow for generalizable insights can be generated from single-case based qualitative investigations. Thereby, two central challenges of qualitative social research are outlined: First, researchers must ensure that the single-case data can be aggregated and condensed so that new collective structures can be detected. Second, they must apply methods and practices to allow for the generalization of the results beyond the specific study. In the following, we demonstrate how and under what conditions these challenges can be addressed in research practice. To this end, the research process of the construction of an empirically based typology is described. A qualitative study, conducted within the framework of the Luxembourg Youth Report, is used to illustrate this process. Specifically, strategies are presented which increase the likelihood of generalizability or transferability of the results, while also highlighting their limitations.
Process generalization in conceptual models
Wieringa, Roelf J.
In conceptual modeling, the universe of discourse (UoD) is divided into classes which have a taxonomic structure. The classes are usually defined in terms of attributes (all objects in a class share attribute names) and possibly of events. For example, the class of employees is the set of objects to
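The taxonomic structure described here can be sketched in ordinary object-oriented code (an illustration only, not the paper's formalism): a subclass shares the attribute names of its superclass and adds its own, which is the usual reading of generalization in conceptual models.

```python
# Generalization in a class taxonomy: Employee IS-A Person, so every
# Employee object carries all Person attribute names plus its own.

class Person:
    def __init__(self, name, birth_year):
        self.name = name
        self.birth_year = birth_year

class Employee(Person):
    def __init__(self, name, birth_year, salary):
        super().__init__(name, birth_year)
        self.salary = salary  # attribute specific to the subclass

e = Employee("Ada", 1815, 50000)
print(isinstance(e, Person), sorted(vars(e)))
```

Every member of the subclass is also a member of the superclass, so queries and events defined for the general class apply to the specialized one as well.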
Yeung, Anna; Hocking, Jane; Guy, Rebecca; Fairley, Christopher K; Smith, Kirsty; Vaisey, Alaina; Donovan, Basil; Imrie, John; Gunn, Jane; Temple-Smith, Meredith
2018-03-28
Chlamydia is the most common notifiable sexually transmissible infection in Australia. Left untreated, it can develop into pelvic inflammatory disease and infertility. The majority of notifications come from general practice and it is ideally situated to test young Australians. The Australian Chlamydia Control Effectiveness Pilot (ACCEPt) was a multifaceted intervention that aimed to reduce chlamydia prevalence by increasing testing in 16- to 29-year-olds attending general practice. GPs were interviewed to describe the effectiveness of the ACCEPt intervention in integrating chlamydia testing into routine practice using Normalization Process Theory (NPT). GPs were purposively selected based on age, gender, geographic location and size of practice at baseline and midpoint. Interview data were analysed regarding the intervention components and results were interpreted using NPT. A total of 44 GPs at baseline and 24 at midpoint were interviewed. Most GPs reported offering a test based on age at midpoint versus offering a test based on symptoms or patient request at baseline. Quarterly feedback was the most significant ACCEPt component for facilitating a chlamydia test. The ACCEPt intervention has been able to moderately normalize chlamydia testing among GPs, although the components had varying levels of effectiveness. NPT can demonstrate the effective implementation of an intervention in general practice and has been valuable in understanding which components are essential and which components can be improved upon.
Directory of Open Access Journals (Sweden)
Simon Gorin
Full Text Available Several models in the verbal domain of short-term memory (STM) consider a dissociation between item and order processing. This view is supported by data demonstrating that different types of time-based interference have a greater effect on memory for the order of to-be-remembered items than on memory for the items themselves. The present study investigated the domain-generality of the item versus serial order dissociation by comparing the differential effects of time-based interfering tasks, such as rhythmic interference and articulatory suppression, on item and order processing in verbal and musical STM domains. In Experiment 1, participants had to maintain sequences of verbal or musical information in STM, followed by a probe sequence, this under different conditions of interference (no-interference, rhythmic interference, articulatory suppression). They were required to decide whether all items of the probe list matched those of the memory list (item condition) or whether the order of the items in the probe sequence matched the order in the memory list (order condition). In Experiment 2, participants performed a serial order probe recognition task for verbal and musical sequences ensuring sequential maintenance processes, under no-interference or rhythmic interference conditions. For Experiment 1, serial order recognition was not significantly more impacted by interfering tasks than was item recognition, this for both verbal and musical domains. For Experiment 2, we observed selective interference of the rhythmic interference condition on both musical and verbal order STM tasks. Overall, the results suggest a similar and selective sensitivity to time-based interference for serial order STM in verbal and musical domains, but only when the STM tasks ensure sequential maintenance processes.
Gorin, Simon; Kowialiewski, Benjamin; Majerus, Steve
2016-01-01
Several models in the verbal domain of short-term memory (STM) consider a dissociation between item and order processing. This view is supported by data demonstrating that different types of time-based interference have a greater effect on memory for the order of to-be-remembered items than on memory for the items themselves. The present study investigated the domain-generality of the item versus serial order dissociation by comparing the differential effects of time-based interfering tasks, such as rhythmic interference and articulatory suppression, on item and order processing in verbal and musical STM domains. In Experiment 1, participants had to maintain sequences of verbal or musical information in STM, followed by a probe sequence, this under different conditions of interference (no-interference, rhythmic interference, articulatory suppression). They were required to decide whether all items of the probe list matched those of the memory list (item condition) or whether the order of the items in the probe sequence matched the order in the memory list (order condition). In Experiment 2, participants performed a serial order probe recognition task for verbal and musical sequences ensuring sequential maintenance processes, under no-interference or rhythmic interference conditions. For Experiment 1, serial order recognition was not significantly more impacted by interfering tasks than was item recognition, this for both verbal and musical domains. For Experiment 2, we observed selective interference of the rhythmic interference condition on both musical and verbal order STM tasks. Overall, the results suggest a similar and selective sensitivity to time-based interference for serial order STM in verbal and musical domains, but only when the STM tasks ensure sequential maintenance processes.
Siegel, Daniel M; Metzger, Brian D
2017-12-08
The merger of binary neutron stars, or of a neutron star and a stellar-mass black hole, can result in the formation of a massive rotating torus around a spinning black hole. In addition to providing collimating media for γ-ray burst jets, unbound outflows from these disks are an important source of mass ejection and rapid neutron capture (r-process) nucleosynthesis. We present the first three-dimensional general-relativistic magnetohydrodynamic (GRMHD) simulations of neutrino-cooled accretion disks in neutron star mergers, including a realistic equation of state valid at low densities and temperatures, self-consistent evolution of the electron fraction, and neutrino cooling through an approximate leakage scheme. After initial magnetic field amplification by magnetic winding, we witness the vigorous onset of turbulence driven by the magnetorotational instability (MRI). The disk quickly reaches a balance between heating from MRI-driven turbulence and neutrino cooling, which regulates the midplane electron fraction to a low equilibrium value Y_{e}≈0.1. Over the 380-ms duration of the simulation, we find that a fraction ≈20% of the initial torus mass is unbound in powerful outflows with asymptotic velocities v≈0.1c and electron fractions Y_{e}≈0.1-0.25. Postprocessing the outflows through a nuclear reaction network shows the production of a robust second- and third-peak r process. Though broadly consistent with the results of previous axisymmetric hydrodynamical simulations, extrapolation of our results to late times suggests that the total ejecta mass from GRMHD disks is significantly higher. Our results provide strong evidence that postmerger disk outflows are an important site for the r process.
Documented Safety Analysis for the B695 Segment
Energy Technology Data Exchange (ETDEWEB)
Laycak, D
2008-09-11
This Documented Safety Analysis (DSA) was prepared for the Lawrence Livermore National Laboratory (LLNL) Building 695 (B695) Segment of the Decontamination and Waste Treatment Facility (DWTF). The report provides comprehensive information on design and operations, including safety programs and safety structures, systems and components to address the potential process-related hazards, natural phenomena, and external hazards that can affect the public, facility workers, and the environment. Consideration is given to all modes of operation, including the potential for both equipment failure and human error. The facilities known collectively as the DWTF are used by LLNL's Radioactive and Hazardous Waste Management (RHWM) Division to store and treat regulated wastes generated at LLNL. RHWM generally processes low-level radioactive waste with no, or extremely low, concentrations of transuranics (e.g., much less than 100 nCi/g). Wastes processed often contain only depleted uranium and beta- and gamma-emitting nuclides, e.g., {sup 90}Sr, {sup 137}Cs, or {sup 3}H. The mission of the B695 Segment centers on container storage, lab-packing, repacking, overpacking, bulking, sampling, waste transfer, and waste treatment. The B695 Segment is used for storage of radioactive waste (including transuranic and low-level), hazardous, nonhazardous, mixed, and other waste. Storage of hazardous and mixed waste in B695 Segment facilities is in compliance with the Resource Conservation and Recovery Act (RCRA). LLNL is operated by the Lawrence Livermore National Security, LLC, for the Department of Energy (DOE). The B695 Segment is operated by the RHWM Division of LLNL. Many operations in the B695 Segment are performed under a Resource Conservation and Recovery Act (RCRA) operation plan, similar to commercial treatment operations with best demonstrated available technologies. The buildings of the B695 Segment were designed and built considering such operations, using proven building
Documented Safety Analysis for the B695 Segment
International Nuclear Information System (INIS)
Laycak, D.
2008-01-01
This Documented Safety Analysis (DSA) was prepared for the Lawrence Livermore National Laboratory (LLNL) Building 695 (B695) Segment of the Decontamination and Waste Treatment Facility (DWTF). The report provides comprehensive information on design and operations, including safety programs and safety structures, systems and components to address the potential process-related hazards, natural phenomena, and external hazards that can affect the public, facility workers, and the environment. Consideration is given to all modes of operation, including the potential for both equipment failure and human error. The facilities known collectively as the DWTF are used by LLNL's Radioactive and Hazardous Waste Management (RHWM) Division to store and treat regulated wastes generated at LLNL. RHWM generally processes low-level radioactive waste with no, or extremely low, concentrations of transuranics (e.g., much less than 100 nCi/g). Wastes processed often contain only depleted uranium and beta- and gamma-emitting nuclides, e.g., 90 Sr, 137 Cs, or 3 H. The mission of the B695 Segment centers on container storage, lab-packing, repacking, overpacking, bulking, sampling, waste transfer, and waste treatment. The B695 Segment is used for storage of radioactive waste (including transuranic and low-level), hazardous, nonhazardous, mixed, and other waste. Storage of hazardous and mixed waste in B695 Segment facilities is in compliance with the Resource Conservation and Recovery Act (RCRA). LLNL is operated by the Lawrence Livermore National Security, LLC, for the Department of Energy (DOE). The B695 Segment is operated by the RHWM Division of LLNL. Many operations in the B695 Segment are performed under a Resource Conservation and Recovery Act (RCRA) operation plan, similar to commercial treatment operations with best demonstrated available technologies. The buildings of the B695 Segment were designed and built considering such operations, using proven building systems, and keeping
International Nuclear Information System (INIS)
Hakanson, Lars; Lindgren, Dan
2009-01-01
In this work a general, process-based mass-balance model for water contaminants in coastal areas at the ecosystem scale (CoastMab) is presented and for the first time tested for radionuclides. The model is dynamic, based on ordinary differential equations, and gives monthly predictions. Connected to the core model there is also a sub-model for contaminant concentrations in fish. CoastMab calculates sedimentation, resuspension, diffusion, mixing, burial and retention of the given contaminant. The model contains both general algorithms, which apply to all contaminants, and substance-specific parts (such as algorithms for the particulate fraction, diffusion, biouptake and biological half-life). CoastMab and the sub-model for fish are simple to apply in practice since all driving variables may be readily accessed from maps or regular monitoring programs. The separation between the surface-water layer and the deep-water layer is done not, as in most traditional models, from water temperature data, but from sedimentological criteria. Previous versions of the models for phosphorus and suspended particulate matter (in the Baltic Sea) have been validated and shown to predict well. This work presents modifications of the model and tests using two tracers, radiocesium and radiostrontium (from the Chernobyl fallout), in the Dnieper-Bug estuary (the Black Sea). Good correlations are shown between modeled and empirical data, except for the month directly after the fallout. We have shown, for example, that: (1) the conditions in the sea outside the bay are important for the concentrations of the substances in water, sediments and fish within the bay; (2) 'biological', 'chemical' and 'water' dilution can be demonstrated; (3) the water chemical conditions in the bay influence the biouptake and the concentrations in fish of the radionuclides; and (4) the feeding behaviour of the coastal fish is very important for the biouptake of the radionuclides
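The general shape of such an ODE-based, monthly-stepped mass balance can be sketched as follows (an assumed minimal form, not the published CoastMab equations): inflow adds contaminant mass to the water compartment, while sedimentation, outflow and radioactive decay remove it at first-order rates.

```python
# Minimal one-compartment mass balance, stepped monthly:
#   dM/dt = inflow - (k_sed + k_out + k_decay) * M
# All rate constants below are illustrative assumptions (per month).

def simulate(inflow, k_sed, k_out, k_decay, months, m0=0.0):
    """Euler-step the mass M month by month; returns the monthly series."""
    m, series = m0, []
    for _ in range(months):
        m += inflow - (k_sed + k_out + k_decay) * m
        series.append(m)
    return series

# Illustrative values only: 100 Bq/month inflow, monthly loss rates.
series = simulate(inflow=100.0, k_sed=0.05, k_out=0.20, k_decay=0.02, months=120)
print(round(series[-1], 1))  # approaches steady state: inflow / total loss rate
```

A real multi-compartment model adds coupled equations of the same form for surface water, deep water, sediments and fish, with resuspension and diffusion moving mass between compartments.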
Directory of Open Access Journals (Sweden)
Ofir Bahar
2014-01-01
Full Text Available Pattern recognition receptors (PRRs) play an important role in detecting invading pathogens and mounting a robust defense response to restrict infection. In rice, one of the best characterized PRRs is XA21, a leucine-rich repeat receptor-like kinase that confers broad-spectrum resistance to multiple strains of the bacterial pathogen Xanthomonas oryzae pv. oryzae (Xoo). In 2009 we reported that an Xoo protein, called Ax21, is secreted by a type I secretion system and that it serves to activate XA21-mediated immunity. This report has recently been retracted. Here we present data that correct our previous model. We first show that Ax21 secretion does not depend on the predicted type I secretion system and that it is processed by the general secretion (Sec) system. We further show that Ax21 is an outer membrane protein, secreted in association with outer membrane vesicles. Finally, we provide data showing that ax21 knockout strains do not overcome XA21-mediated immunity.
International Nuclear Information System (INIS)
1982-09-01
The Fundamental Safety Rules applicable to certain types of nuclear installation are intended to clarify the conditions whose observance, for the type of installation concerned and for the subject that they deal with, is considered equivalent to compliance with regulatory French technical practice. These Rules should facilitate safety analyses and clear understanding between persons interested in matters related to nuclear safety. They in no way reduce the operator's liability and pose no obstacle to statutory provisions in force. For any installation to which a Fundamental Safety Rule applies according to the foregoing paragraph, the operator may be relieved from application of the Rule if he shows proof that the safety objectives set by the Rule are attained by other means that he proposes within the framework of statutory procedures. Furthermore, the Central Service for the Safety of Nuclear Installations reserves the right at all times to alter any Fundamental Safety Rule, as required, should it deem this necessary, while specifying the applicability conditions. This rule is intended to define the general provisions applicable to the production, inspection, processing, packaging and storage of the different types of wastes resulting from the reprocessing of fuels irradiated in a PWR
2013-09-13
Event 1.4.4,” August 7, 2012
AAA Attestation Report A-2010-0187-FFM, “General Fund Enterprise Business System - Federal Financial Management...Improvement Act Compliance. Examination of Requirements Through Test Event 1.4.0,” September 14, 2010
AAA Audit Report A-2009-0232-FFM, “General Fund...September 30, 2009
AAA Audit Report A-2009-0231-FFM, “General Fund Enterprise Business System - Federal Financial Management Improvement Act
Gelfand, I M; Graev, M I; Vilenkin, N Y; Pyatetskii-Shapiro, I I
Volume 1 is devoted to basics of the theory of generalized functions. The first chapter contains the main definitions and most important properties of generalized functions as functionals on the space of smooth functions with compact support. The second chapter treats the Fourier transform of generalized functions. In Chapter 3, definitions and properties of some important classes of generalized functions are discussed; in particular, generalized functions supported on submanifolds of lower dimension, generalized functions associated with quadratic forms, and homogeneous generalized functions are studied in detail. Many simple basic examples make this book an excellent place for a novice to get acquainted with the theory of generalized functions. A long appendix presents basics of generalized functions of complex variables.
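The basic notion the first chapter builds on can be stated in one line, taking the standard example of the Dirac delta (conventions for the Fourier transform vary by text):

```latex
% A generalized function is a continuous linear functional on test
% functions \varphi \in C_c^\infty(\mathbb{R}); the Dirac delta, its
% distributional derivative, and its Fourier transform:
\langle \delta, \varphi \rangle = \varphi(0), \qquad
\langle \delta', \varphi \rangle = -\varphi'(0), \qquad
\widehat{\delta} = 1 .
```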
International Nuclear Information System (INIS)
Hista, J.C.
1982-01-01
This reactor building includes a containment enclosure for the internal structures. These comprise a slab wedged on its periphery against the containment enclosure gusset and resting on the general raft by means of a peripheral bearing ring, a compressible layer being provided between the general raft and the slab [fr]
Energy Technology Data Exchange (ETDEWEB)
Anca Abati, R; Lopez Rodriguez, M
1961-07-01
General conditions of the metallothermic reduction in small bombs (250 and 800 g of uranium) have been investigated. Factors such as the kind and granulometry of the magnesium used, the magnesium excess and the preheating temperature, which affect yields and metal quality, have been considered. The magnesium excess increased yields by 15% in the small bomb; as for the preheating temperature, there is a range within which yields and metal quality do not change. All tests have been made with graphite linings. (Author) 18 refs.
Indian Academy of Sciences (India)
Resonance – Journal of Science Education. Volume 19, Issue 1, January 2014, pp. 1-2: General Editorial on Publication Ethics, R Ramaswamy. Volume 19, Issue 1, January 2014, pp. 3-3 ...
Elwyn, G; Edwards, A; Hood, K; Robling, M; Atwell, C; Russell, I; Wensing, M; Grol, R
2004-08-01
A consulting method known as 'shared decision making' (SDM) has been described and operationalized in terms of several 'competences'. One of these competences concerns the discussion of the risks and benefits of treatment or care options-'risk communication'. Few data exist on clinicians' ability to acquire skills and implement the competences of SDM or risk communication in consultations with patients. The aims of this study were to evaluate the effects of skill development workshops for SDM and the use of risk communication aids on the process of consultations. A cluster randomized trial with crossover was carried out with the participation of 20 recently qualified GPs in urban and rural general practices in Gwent, South Wales. A total of 747 patients with known atrial fibrillation, prostatism, menorrhagia or menopausal symptoms were invited to a consultation to review their condition or treatments. Half the consultations were randomly selected for audio-taping, of which 352 patients attended and were audio-taped successfully. After baseline, participating doctors were randomized to receive training in (i) SDM skills or (ii) the use of simple risk communication aids, using simulated patients. The alternative training was then provided for the final study phase. Patients were allocated randomly to a consultation during baseline or intervention 1 (SDM or risk communication aids) or intervention 2 phases. A randomly selected half of the consultations were audio-taped from each phase. Raters (independent, trained and blinded to study phase) assessed the audio-tapes using a validated scale to assess levels of patient involvement (OPTION: observing patient involvement), and to analyse the nature of risk information discussed. Clinicians completed questionnaires after each consultation, assessing perceived clinician-patient agreement and level of patient involvement in decisions. Multilevel modelling was carried out with the OPTION score as the dependent variable, and
International Nuclear Information System (INIS)
Karitskaya, S.G.; Ruzanov, K.A.; Davletov, V.S.
2005-01-01
The results of work on creating an electronic textbook for the special discipline 'General theory and construction of heat-and-power engineering facilities' are presented. The principles of, and requirements imposed on, literature of this type are outlined. (author)
The Society of Toxicologic Pathology charged a Nervous System Sampling Working Group with devising recommended practices to routinely screen the central and peripheral nervous systems in Good Laboratory Practice-type nonclinical general toxicity studies. Brains should be trimmed ...
Greco, Salvatore; Mesiar, Radko; Rindone, Fabio
2014-01-01
Aggregation functions on [0,1] with annihilator 0 can be seen as a generalized product on [0,1]. We study the generalized product on the bipolar scale [–1,1], stressing the axiomatic point of view. Based on newly introduced bipolar properties, such as the bipolar increasingness, bipolar unit element, bipolar idempotent element, several kinds of generalized bipolar product are introduced and studied. A special stress is put on bipolar semicopulas, bipolar quasi-copulas and bipolar copulas.
DEFF Research Database (Denmark)
Larsen, Christian; Kiesmüller, Gudrun P.
We derive a closed-form cost expression for an (R,s,nQ) inventory control policy where all replenishment orders have a constant lead-time, unfilled demand is backlogged and inter-arrival times of order requests are generalized Erlang distributed.
DEFF Research Database (Denmark)
Larsen, Christian; Kiesmüller, G.P
2007-01-01
We derive a closed-form cost expression for an (R,s,nQ) inventory control policy where all replenishment orders have a constant lead-time, unfilled demand is back-logged and inter-arrival times of order requests are generalized Erlang distributed. For given values of Q and R we show how to compute...
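The abstract above assumes order-request inter-arrival times that are generalized Erlang distributed, i.e. sums of independent exponential phases whose rates may differ. A minimal sketch of sampling such inter-arrival times by Monte Carlo (this is only the demand-arrival building block, not the paper's closed-form cost expression; function names are illustrative):

```python
import random

def gen_erlang(rates, rng=random):
    """One generalized-Erlang sample: the sum of independent
    exponential phases, one phase per rate (rates may differ)."""
    return sum(rng.expovariate(r) for r in rates)

def simulate_arrivals(rates, horizon, rng=random):
    """Arrival epochs of order requests whose inter-arrival
    times are generalized Erlang distributed."""
    t, epochs = 0.0, []
    while True:
        t += gen_erlang(rates, rng)
        if t > horizon:
            return epochs
        epochs.append(t)

rng = random.Random(42)
# Two exponential phases with rates 2.0 and 3.0 per time unit;
# the mean inter-arrival time is 1/2 + 1/3, about 0.83.
arrivals = simulate_arrivals([2.0, 3.0], horizon=1000.0, rng=rng)
mean_gap = arrivals[-1] / len(arrivals)
```

Feeding such arrival streams into a simulated (R,s,nQ) policy is one way to sanity-check a closed-form cost expression of the kind the abstract derives.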
Allen, Johnie J.; Anderson, Craig A.; Bushman, Brad J.
The General Aggression Model (GAM) is a comprehensive, integrative framework for understanding aggression. It considers the role of social, cognitive, personality, developmental, and biological factors in aggression. Proximate processes of GAM detail how person and situation factors influence
International Nuclear Information System (INIS)
Bezak, P.; Daniska, V.; Ondra, F.; Necas, V.
2012-01-01
Conditional release of steels from NPP decommissioning enables controlled reuse of non-negligible volumes of steel. To propose scenarios for steel reuse, the partial elementary activities of the whole process must be identified and evaluated, from conditional release of the steels, through manufacturing of various elements, up to realisation of the scenarios. The scenarios for reuse of conditionally released steel consider steel products such as reinforcements, rails, profiles and sheets for technical constructions such as bridges, tunnels and railways, which guarantee long-term properties over periods of 50-100 years. This also offers the possibility of using this type of steel for particular technical constructions directly usable in nuclear facilities. The paper reviews the activities for manufacturing various steel construction elements made from conditionally released steels and their use both in general and in the nuclear industry. As the starting material for manufacturing of steel elements, ingots or fragments of steel after dismantling in the controlled area can be used. These input materials are re-melted in industrial facilities in order to achieve the required physical and chemical characteristics. The technique most used for manufacturing the steel construction elements is rolling. The products considered in the reuse scenarios are bars for reinforced concrete, rolled steel sheets and other rolled profiles. For the nuclear industry, possibilities include casting of thick-walled steel containers for long-term storage of high-level radioactive components and assembly of stainless steel tanks for storing liquid radioactive waste. Lists of the elementary activities needed for manufacturing selected steel elements are elaborated. These elementary activities are then the base for detailed safety evaluation of external
2018-01-09
As required by Federal Aviation Administration Order 8110.4C, Type Certification Process, the Volpe Center Acoustics Facility (Volpe), in support of the Federal Aviation Administration Office of Environment and Energy (AEE), has completed valid...
2017-08-18
As required by Federal Aviation Administration (FAA) Order 8110.4C: Type Certification Process (most recently revised as Change 5, 20 December, 2011), the Volpe Center Acoustics Facility (Volpe), in support of the FAA Office of Environmen...
International Nuclear Information System (INIS)
Kenyon, I.R.
1990-01-01
General relativity is discussed in this book at a level appropriate to undergraduate students of physics and astronomy. It describes concepts and experimental results, and provides a succinct account of the formalism. A brief review of special relativity is followed by a discussion of the equivalence principle and its implications. Other topics covered include the concepts of curvature and the Schwarzschild metric, test of the general theory, black holes and their properties, gravitational radiation and methods for its detection, the impact of general relativity on cosmology, and the continuing search for a quantum theory of gravity. (author)
International Nuclear Information System (INIS)
2005-01-01
This article presents the general problems as natural disasters, consequences of global climate change, public health, the danger of criminal actions, the availability to information about problems of environment
DEFF Research Database (Denmark)
Jensen, Christian Skov; Lando, David; Pedersen, Lasse Heje
We characterize when physical probabilities, marginal utilities, and the discount rate can be recovered from observed state prices for several future time periods. Our characterization makes no assumptions of the probability distribution, thus generalizing the time-homogeneous stationary model...
The General Conformity requirements ensure that the actions taken by federal agencies in nonattainment and maintenance areas do not interfere with a state’s plans to meet national standards for air quality.
International Nuclear Information System (INIS)
Komatsu, Nobuyoshi; Kiwata, Takahiro; Kimura, Shigeo
2010-01-01
To clarify the nonequilibrium processes of self-gravitating systems, we examine a system enclosed in a spherical container with reflecting walls, by N-body simulations. To simulate nonequilibrium processes, we consider loss of energy through the reflecting wall, i.e., a particle reflected at a non-adiabatic wall is cooled to mimic energy loss. We also consider quasi-equilibrium structures of stellar polytropes to compare with the nonequilibrium process, where the quasi-equilibrium structure is obtained from an extremum-state of Tsallis' entropy. Consequently, we numerically show that, with increasing cooling rates, the dependence of the temperature on energy, i.e., the ε-T curve, varies from that of microcanonical ensembles (or isothermal spheres) to a common curve. The common curve appearing in the nonequilibrium process agrees well with an ε-T curve for a quasi-equilibrium structure of the stellar polytrope, especially for the polytrope index n ∼ 5. In fact, for n > 5, the stellar polytrope within an adiabatic wall exhibits gravothermal instability [Taruya, Sakagami, Physica A, 322 (2003) 285]. The present study indicates that the stellar polytrope with n ∼ 5 likely plays an important role in quasi-attractors of the nonequilibrium process in self-gravitating systems with non-adiabatic walls.
Morrill, Tuuli H; McAuley, J Devin; Dilley, Laura C; Hambrick, David Z
2015-08-01
Do the same mechanisms underlie processing of music and language? Recent investigations of this question have yielded inconsistent results. Likely factors contributing to discrepant findings are use of small samples and failure to control for individual differences in cognitive ability. We investigated the relationship between music and speech prosody processing, while controlling for cognitive ability. Participants (n = 179) completed a battery of cognitive ability tests, the Montreal Battery of Evaluation of Amusia (MBEA) to assess music perception, and a prosody test of pitch peak timing discrimination (early, as in insight vs. late, incite). Structural equation modeling revealed that only music perception was a significant predictor of prosody test performance. Music perception accounted for 34.5% of variance on prosody test performance; cognitive abilities and music training added only about 8%. These results indicate musical pitch and temporal processing are highly predictive of pitch discrimination in speech processing, even after controlling for other possible predictors of this aspect of language processing.
International Nuclear Information System (INIS)
1999-01-01
The document reproduces the text of the letter dated 18 October 1999 sent to the Secretary-General by the Permanent Representative of China to the United Nations in connection with the agenda item 76 (General and complete disarmament) of the 54th session of the General Assembly, First Committee. The letter expresses the position of the Chinese delegation concerning the proposed amendment of the Anti-Ballistic Missile Treaty (ABM Treaty)
International Nuclear Information System (INIS)
Mikhailovskii, A.B.
1986-01-01
Some general problems of the theory of Alfven instabilities in a tokamak with high-energy ions are considered. It is assumed that such ions arise either from ionization of fast neutral atoms injected into the tokamak or from their production under thermonuclear conditions. Small-oscillation equations are derived for the Alfven-type waves, which allow for both the destabilizing effects associated with the high-energy particles and stabilizing ones, such as the effects of shear and bulk-plasma dissipation. The high-energy ion contribution to the growth rate of the Alfven waves is calculated. The author considers the role of trapped-electron collisional dissipation
Van Maldeghem, Hendrik
1998-01-01
Generalized Polygons is the first book to cover, in a coherent manner, the theory of polygons from scratch. In particular, it fills elementary gaps in the literature and gives an up-to-date account of current research in this area, including most proofs, which are often unified and streamlined in comparison to the versions generally known. Generalized Polygons will be welcomed both by the student seeking an introduction to the subject as well as the researcher who will value the work as a reference. In particular, it will be of great value for specialists working in the field of generalized polygons (which are, incidentally, the rank 2 Tits-buildings) or in fields directly related to Tits-buildings, incidence geometry and finite geometry. The approach taken in the book is of geometric nature, but algebraic results are included and proven (in a geometric way!). A noteworthy feature is that the book unifies and generalizes notions, definitions and results that exist for quadrangles, hexagons, octagons - in the ...
International Nuclear Information System (INIS)
Tubiana, M.
1993-01-01
In conclusion, there is a general consensus on a number of points, which the author endeavours to summarize in this article: doctors are an excellent channel for passing on information to the public; doctors feel that they do not know enough about the subject, and training in radiobiology and radiation protection is a necessity for them; communication between doctors and the general public is poor in this field; and research should be encouraged in numerous areas, such as the carcinogenic effect of low doses of radiation, pedagogy, and risk perception
Ana Teresa Fernández Vidal; José Aurelio Díaz Quiñones; Silvia Enrique Vilaplana
2016-01-01
Cuban educators conceive programs and processes using the cultural-historical approach to human development, since this is the theory that, thanks to its founder Lev Semiónovich Vygotsky, could overcome the approaches that fragmented the analysis and understanding of human development. Such currents of thought overemphasized the different conditioning factors of this development and ignored the dialectical relationship between them in terms of personality formation and development, its proc...
Boothe, W. A.; Corman, J. C.; Johnson, G. G.; Cassel, T. A. V.
1976-01-01
Results are presented of an investigation of gasification and clean fuels from coal. Factors discussed include: coal and coal transportation costs; clean liquid and gas fuel process efficiencies and costs; and cost, performance, and environmental intrusion elements of the integrated low-Btu coal gasification system. Cost estimates for the balance-of-plant requirements associated with advanced energy conversion systems utilizing coal or coal-derived fuels are included.
Directory of Open Access Journals (Sweden)
Stefanie E. Grund
2012-03-01
Drosha is a key enzyme in microRNA biogenesis, generating the precursor miRNA (pre-miRNA) by excising the stem-loop embedded in the primary transcripts (pri-miRNA). The specificity for the pri-miRNAs and determination of the cleavage site are provided by its binding partner DGCR8, which is necessary for efficient processing. The crucial Drosha domains for pri-miRNA cleavage are the middle part, the two enzymatic RNase III domains (RIIID), and the dsRNA binding domain (dsRBD) in the C-terminus. Here, we identify alternatively spliced transcripts in human melanoma and NT2 cell lines, encoding C-terminally truncated Drosha proteins lacking part of the RIIIDb and the entire dsRBD. Proteins generated from these alternative splice variants fail to bind to DGCR8 but still interact with Ewing sarcoma protein (EWS). In vitro as well as in vivo, the Drosha splice variants are deficient in pri-miRNA processing. However, the aberrant transcripts in melanoma cells do not consistently reduce mature miRNA levels compared with melanoma cell lines lacking those splice variants, possibly owing to their limited abundance. Our findings show that alternative processing-deficient Drosha splice variants exist in melanoma cells. In elevated amounts, these alternatively spliced transcripts could provide one potential mechanism accounting for the deregulation of miRNAs in cancer cells. On the basis of our results, the search for alternative inactive splice variants might be fruitful in different tumor entities to unravel the molecular basis of the previously observed decreased microRNA processing efficiency in cancer.
Indian Academy of Sciences (India)
Resonance – Journal of Science Education, Volume 8, Issue 2, February 2003, pp 28-41: Supersymmetry, by Akshay Kulkarni and P Ramadevi (General Article). Author affiliation: Physics Department, Indian Institute of Technology, Mumbai 400 076, India.
International Nuclear Information System (INIS)
2003-01-01
This document summarizes the main 2002 energy indicators for France. A first table lists the evolution of general indicators between 1973 and 2002: energy bill, price of imported crude oil, energy independence, primary and final energy consumption. The main 2002 results are detailed separately for natural gas, petroleum and coal (consumption, imports, exports, production, stocks, prices). (J.S.)
DEFF Research Database (Denmark)
Jensen, Christian Skov; Lando, David; Pedersen, Lasse Heje
We characterize when physical probabilities, marginal utilities, and the discount rate can be recovered from observed state prices for several future time periods. We make no assumptions of the probability distribution, thus generalizing the time-homogeneous stationary model of Ross (2015). Recov...
African Journals Online (AJOL)
Department of Surgery, University of Cape Town Health Sciences Faculty, Groote Schuur Hospital, Observatory, Cape Town, South Africa ... included all district, regional and tertiary hospitals in the nine provinces. Clinics and so-called ... large contingency of senior general surgeons from countries such as Cuba, who have ...
African Journals Online (AJOL)
effect of fatigue on patient safety, and owing to increasing emphasis on lifestyle issues .... increasing emphasis on an appropriate work-life balance in professional life.10 ... experience, were the most negative about the EWTD in general.3,13 ...
African Journals Online (AJOL)
GENERAL SURGERY. T du Toit, O C Buchel, S J A Smit, Department of Surgery, University of the Free State, Bloemfontein ... in the endoscopy room. The lack of video instrumentation in developing countries: redundant fibre-optic instruments (the old "eye scope") are still being used. This instrument brings endoscopists ...
Staff Association
2016-01-01
5th April, 2016 – Ordinary General Assembly of the Staff Association! In the first semester of each year, the Staff Association (SA) invites its members to attend and participate in the Ordinary General Assembly (OGA). This year the OGA will be held on Tuesday, April 5th 2016 from 11:00 to 12:00 in BE Auditorium, Meyrin (6-2-024). During the Ordinary General Assembly, the activity and financial reports of the SA are presented and submitted for approval to the members. This is the occasion to get a global view on the activities of the SA, its financial management, and an opportunity to express one’s opinion, including taking part in the votes. Other points are listed on the agenda, as proposed by the Staff Council. Who can vote? Only “ordinary” members (MPE) of the SA can vote. Associated members (MPA) of the SA and/or affiliated pensioners have a right to vote on those topics that are of direct interest to them. Who can give his/her opinion? The Ordinary General Asse...
African Journals Online (AJOL)
could cripple the global economy. Greater attention ... Africa and 5.7 general surgeons per 100 000 in the US.12 One of the key ... 100 000 insured population working in the private sector, which is comparable with the United States (US).
Indian Academy of Sciences (India)
IAS Admin
A q-ary necklace of length n is an equivalence class of q-coloured strings of length n under rotation. In this article, we study various generalizations and derive analytical expressions to count the number of these generalized necklaces.
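The count of ordinary q-ary necklaces of length n is classical: by Burnside's lemma it equals (1/n) Σ_{d|n} φ(d) q^{n/d}, where φ is Euler's totient. A short sketch of this baseline count (the article's generalizations go beyond it; function names are illustrative):

```python
def euler_phi(n):
    """Euler's totient function via trial division."""
    result, d, m = n, 2, n
    while d * d <= m:
        if m % d == 0:
            while m % d == 0:
                m //= d
            result -= result // d
        d += 1
    if m > 1:
        result -= result // m
    return result

def count_necklaces(q, n):
    """Number of q-ary necklaces of length n:
    (1/n) * sum over d|n of phi(d) * q^(n/d)  (Burnside's lemma)."""
    total = sum(euler_phi(d) * q ** (n // d)
                for d in range(1, n + 1) if n % d == 0)
    return total // n

print(count_necklaces(2, 4))  # 6 binary necklaces of length 4
```

For example, the 6 binary necklaces of length 4 are the rotation classes of 0000, 0001, 0011, 0101, 0111, 1111.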
DEFF Research Database (Denmark)
Lando, David; Pedersen, Lasse Heje; Jensen, Christian Skov
We characterize when physical probabilities, marginal utilities, and the discount rate can be recovered from observed state prices for several future time periods. We make no assumptions of the probability distribution, thus generalizing the time-homogeneous stationary model of Ross (2015) ... our model empirically, testing the predictive power of the recovered expected return and other recovered statistics.
Straumann, Norbert
2013-01-01
This book provides a completely revised and expanded version of the previous classic edition ‘General Relativity and Relativistic Astrophysics’. In Part I the foundations of general relativity are thoroughly developed, while Part II is devoted to tests of general relativity and many of its applications. Binary pulsars – our best laboratories for general relativity – are studied in considerable detail. An introduction to gravitational lensing theory is included as well, so as to make the current literature on the subject accessible to readers. Considerable attention is devoted to the study of compact objects, especially to black holes. This includes a detailed derivation of the Kerr solution, Israel’s proof of his uniqueness theorem, and a derivation of the basic laws of black hole physics. Part II ends with Witten’s proof of the positive energy theorem, which is presented in detail, together with the required tools on spin structures and spinor analysis. In Part III, all of the differential geomet...
Benos, Dale J; Vollmer, Sara H
2010-12-01
Modifying images for scientific publication is now quick and easy due to changes in technology. This has created a need for new image processing guidelines and attitudes, such as those offered to the research community by Doug Cromey (Cromey 2010). We suggest that related changes in technology have simplified the task of detecting misconduct for journal editors as well as researchers, and that this simplification has caused a shift in the responsibility for reporting misconduct. We also argue that the concept of best practices in image processing can serve as a general model for education in best practices in research.
Rodriguez-Iturbe, I.; Porporato, A.; Laio, F.; Ridolfi, L.
This series of four papers studies the complex dynamics of water-controlled ecosystems from the hydro-ecological point of view [e.g., I. Rodriguez-Iturbe, Water Resour. Res. 36 (1) (2000) 3-9]. After this general outline, the role of climate, soil, and vegetation is modeled in Part II [F. Laio, A. Porporato, L. Ridolfi, I. Rodriguez-Iturbe, Adv. Water Res. 24 (7) (2001) 707-723] to investigate the probabilistic structure of soil moisture dynamics and the water balance. Particular attention is given to the impact of timing and amount of rainfall, plant physiology, and soil properties. From the statistical characterization of the crossing properties of arbitrary levels of soil moisture, Part III develops an expression for vegetation water stress [A. Porporato, F. Laio, L. Ridolfi, I. Rodriguez-Iturbe, Adv. Water Res. 24 (7) (2001) 725-744]. This measure of stress is then employed to quantify the response of plants to soil moisture deficit as well as to infer plant suitability to given environmental conditions and understand some of the reasons for possible coexistence of different species. Detailed applications of these concepts are developed in Part IV [F. Laio, A. Porporato, C.P. Fernandez-Illescas, I. Rodriguez-Iturbe, Adv. Water Res. 24 (7) (2001) 745-762], where we investigate the dynamics of three different water-controlled ecosystems.
Nikolaev, A. V.; Alymenko, N. I.; Kamenskikh, A. A.; Alymenko, D. N.; Nikolaev, V. A.; Petrov, A. I.
2017-10-01
The article presents measured data on air parameters and volume flow in the shafts and on the surface, collected at BKPRU-2 (Berezniki potash plant and mine 2, «Uralkali» PJSC) in normal operation mode, after shutdown of the main mine fan (GVU), and over the following several hours. The tests established that thermal pressure between the mine shafts acts continuously, regardless of the GVU operation mode or other draught sources. It was also found that the depth of the mine shafts has no impact on the thermal pressure value: given the same difference of shaft elevation marks and the same parameters of the outside air between the shafts, shafts of different depths develop thermal pressure of the same value. The value of the general mine natural draught, defined as the algebraic sum of the thermal pressure values between the shafts, depends only on the difference in temperature and pressure between the outside air and the air at the shaft bottoms, on condition that the air handling system (unit heaters, air conditioning systems) is shut down.
Willard, Stephen
2004-01-01
Among the best available reference introductions to general topology, this volume is appropriate for advanced undergraduate and beginning graduate students. Its treatment encompasses two broad areas of topology: "continuous topology," represented by sections on convergence, compactness, metrization and complete metric spaces, uniform spaces, and function spaces; and "geometric topology," covered by nine sections on connectivity properties, topological characterization theorems, and homotopy theory. Many standard spaces are introduced in the related problems that accompany each section (340
Maldeghem, Hendrik
1998-01-01
This book is intended to be an introduction to the fascinating theory of generalized polygons for both the graduate student and the specialized researcher in the field. It gathers together a lot of basic properties (some of which are usually referred to in research papers as belonging to folklore) and very recent and sometimes deep results. I have chosen a fairly strict geometrical approach, which requires some knowledge of basic projective geometry. Yet, it enables one to prove some typically group-theoretical results such as the determination of the automorphism groups of certain Moufang polygons. As such, some basic group-theoretical knowledge is required of the reader. The notion of a generalized polygon is a relatively recent one. But it is one of the most important concepts in incidence geometry. Generalized polygons are the building bricks of Tits buildings. They are the prototypes and precursors of more general geometries such as partial geometries, partial quadrangles, semi-partial geometries, near...
International Nuclear Information System (INIS)
Dory, A.B.
1982-01-01
This presentation is divided into two main sections. In the first, the author explores the issues of radiation and tailings disposal, and then examines the Canadian nuclear regulatory process from the point of view of jurisdiction, objectives, philosophy and mechanics. The compliance inspection program is outlined, and the author discusses the relationships between the AECB and other regulatory agencies, the public and uranium mine-mill workers. The section concludes with an examination of the stance of the medical profession on nuclear issues. In part two, the radiological hazards for uranium miners are examined: radon daughters, gamma radiation, thoron daughters and uranium dust. The author touches on new regulations being drafted, the assessment of past exposures in mine atmospheres, and the regulatory approach at the surface exploration stage. The presentation concludes with the author's brief observations on the findings of other uranium mining inquiries and on future requirements in the industry's interests
Li, Yun; Wang, Shengpei; Pan, Chuxiong; Xue, Fushan; Xian, Junfang; Huang, Yaqi; Wang, Xiaoyi; Li, Tianzuo; He, Huiguang
2018-01-01
The mechanism of general anesthesia (GA) has been explored for hundreds of years but remains unclear. Previous studies indicated a possible correlation between NREM sleep and GA. The purpose of this study is to compare them through in vivo human brain function, to probe the neuromechanism of consciousness and thereby find a clue to the GA mechanism. 24 healthy participants were equally assigned to a sleep or propofol sedation group by sleeping ability. EEG and the Ramsay Sedation Scale were applied to determine sleep stage and sedation depth, respectively. Resting-state functional magnetic resonance imaging (RS-fMRI) was acquired at each status. Regional homogeneity (ReHo) and seed-based whole-brain functional connectivity maps (WB-FC maps) were compared. During sleep, ReHo primarily weakened in the frontal lobe (especially the preoptic area) but strengthened in the brainstem. During sedation, ReHo changed in various brain areas, including the cingulate, precuneus, thalamus and cerebellum. The cingulate, fusiform and insula were common to sleep and sedation. Compared to sleep, FCs between the cortex and subcortical centers (centralized in the cerebellum) were significantly attenuated under sedation. As sedation deepened, cerebellum-based FC maps diminished, while thalamus- and brainstem-based FC maps increased. There are substantial distinctions in human brain function between sleep and GA. Sleep relies mainly on brainstem and frontal lobe function, while sedation tends to affect a widespread functional network. The most significant differences exist in the precuneus and cingulate, which may play important roles in the mechanisms by which anesthetics induce unconsciousness. Institutional Review Board (IRB) ChiCTR-IOC-15007454.
Diphoton generalized distribution amplitudes
International Nuclear Information System (INIS)
El Beiyad, M.; Pire, B.; Szymanowski, L.; Wallon, S.
2008-01-01
We calculate the leading order diphoton generalized distribution amplitudes by calculating the amplitude of the process γ*γ→γγ in the low energy and high photon virtuality region at the Born order and in the leading logarithmic approximation. As in the case of the anomalous photon structure functions, the γγ generalized distribution amplitudes exhibit a characteristic ln Q² behavior and obey inhomogeneous QCD evolution equations.
International Nuclear Information System (INIS)
Vaturi, Sylvain
1969-01-01
Computerized editing is essential for data processing exploitation. When a more or less complex editing program is required for each task, the need for a general editing program becomes obvious. The aim of this study is to create a general editing program. Universal programs are capable of executing numerous and varied tasks. For a more specific processing task whose execution is frequently required, the use of a specialized program is preferable because, contrary to a universal program, it goes straight to the point [fr
International Nuclear Information System (INIS)
Kwon, Yeong Sik; Lee, Dong Seop; Ryu, Haung Ryong; Jang, Cheol Hyeon; Choi, Bong Jong; Choi, Sang Won
1993-07-01
The book concentrates on the latest general chemistry and is divided into twenty-three chapters. It deals with basic concepts and stoichiometry, the nature of gases, the structure of atoms, quantum mechanics, symbols and electronic structure of ions and molecules, chemical thermodynamics, the nature of solids, changes of state and liquids, properties of solutions, chemical equilibrium, solutions and acid-base chemistry, equilibria of aqueous solutions, electrochemistry, chemical reaction rates, molecular spectroscopy, hydrogen, oxygen and water, metallic atoms (IA, IIA, IIIA), carbon and the group IVA atoms, nonmetal atoms and the inert gases, transition metals, lanthanons and actinoids, nuclear properties and radioactivity, and biochemistry and environmental chemistry.
International Nuclear Information System (INIS)
Gourgoulhon, Eric
2013-01-01
The author proposes a course on general relativity. He first presents the geometrical framework, addressing and discussing the following notions: relativistic space-time, the metric tensor, world lines, observers, the equivalence principle, and geodesics. The next part addresses gravitational fields with spherical symmetry: the Schwarzschild metric, radial light geodesics, the gravitational spectral shift (Einstein effect), orbits of material objects, and photon trajectories. The next parts address the Einstein equation, black holes, gravitational waves, and cosmological solutions. Appendices discuss the relationship between relativity and GPS, propose some problems and their solutions, and provide Sage code
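The gravitational spectral shift (Einstein effect) covered in this course has a compact closed form in the Schwarzschild metric. The short sketch below evaluates it for light leaving the surface of the Sun; the function name and the choice of solar values are ours, not the author's:

```python
import math

def gravitational_redshift(gm, r, c=299_792_458.0):
    """Redshift z of light emitted at Schwarzschild radial coordinate r and
    received at infinity: z = (1 - r_s/r)^(-1/2) - 1, with r_s = 2GM/c^2."""
    rs = 2.0 * gm / c**2
    return 1.0 / math.sqrt(1.0 - rs / r) - 1.0

GM_SUN = 1.32712440018e20   # m^3/s^2, standard gravitational parameter of the Sun
R_SUN = 6.957e8             # m, nominal solar radius
z = gravitational_redshift(GM_SUN, R_SUN)
print(z)   # of order 2e-6, matching the weak-field estimate GM/(r c^2)
```

In the weak-field limit the expression reduces to GM/(r c²), about 2.1×10⁻⁶ for the Sun.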
The Generalized Quantum Statistics
Hwang, WonYoung; Ji, Jeong-Young; Hong, Jongbae
1999-01-01
The concept of wavefunction reduction should be introduced into standard quantum mechanics for any physical process in which effective reduction of the wavefunction occurs, as well as in measurement processes. When the overlap is negligible, each particle obeys Maxwell-Boltzmann statistics even if the particles are in principle described by a totally symmetrized wavefunction [P. R. Holland, The Quantum Theory of Motion, Cambridge University Press, 1993, p. 293]. We generalize the conjecture. That is, par...
Generalized Nonlinear Yule Models
Lansky, Petr; Polito, Federico; Sacerdote, Laura
2016-01-01
With the aim of considering models with persistent memory, we propose a fractional nonlinear modification of the classical Yule model, often studied in the context of macroevolution. Here the model is analyzed and interpreted in the framework of the development of networks such as the World Wide Web. Nonlinearity is introduced by replacing the linear birth process governing the growth of the in-links of each specific webpage with a fractional nonlinear birth process with completely general birth...
Directory of Open Access Journals (Sweden)
John Cossey
2015-03-01
Full Text Available Quasinormal subgroups have been studied for nearly 80 years. In finite groups, questions concerning them invariably reduce to p-groups, and here they have the added interest of being invariant under projectivities, unlike normal subgroups. However, it has been shown recently that certain groups, constructed by Berger and Gross in 1982, of an important universal nature with regard to the existence of core-free quasinormal subgroups generally, have remarkably few such subgroups. Therefore, in order to overcome this misfortune, a generalization of the concept of quasinormality will be defined. It could be the beginning of a lengthy undertaking. But some of the initial findings are encouraging, in particular the fact that this larger class of subgroups also remains invariant under projectivities of finite p-groups, thus connecting group and subgroup lattice structures.
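A subgroup H is quasinormal (permutable) when HK = KH for every subgroup K. As an illustration of the definition only (it does not reproduce the Berger-Gross constructions), the brute-force sketch below finds the permutable subgroups of the symmetric group S3, where they coincide with the normal subgroups:

```python
from itertools import combinations, permutations

# Elements of S3 as permutation tuples: p[i] is the image of i under p.
def compose(p, q):
    """(p o q)(i) = p[q[i]]"""
    return tuple(p[i] for i in q)

def closure(gens, identity):
    """Smallest subgroup containing gens (finite, so products suffice)."""
    elems = {identity} | set(gens)
    while True:
        new = {compose(a, b) for a in elems for b in elems} - elems
        if not new:
            return frozenset(elems)
        elems |= new

S3 = list(permutations(range(3)))
e = (0, 1, 2)
# All subgroups by brute force: closures of every subset (64 subsets, tiny).
subgroups = {closure(s, e) for r in range(len(S3) + 1) for s in combinations(S3, r)}

def setprod(H, K):
    return {compose(h, k) for h in H for k in K}

# H is quasinormal (permutable) iff HK = KH for every subgroup K.
qn = [H for H in subgroups if all(setprod(H, K) == setprod(K, H) for K in subgroups)]
print(sorted(len(H) for H in qn))   # [1, 3, 6]: exactly the normal subgroups of S3
```

The three order-2 subgroups generated by transpositions fail the test, so in S3 permutability adds nothing beyond normality; the interesting behaviour the abstract describes arises only in larger p-groups.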
International Nuclear Information System (INIS)
Nicklisch, F.
1984-01-01
Growing complexity of technical matter has meant that technical expertise is called upon in more and more legal proceedings. The technical expert is, in general terms, the mediator between technology and the law; he is also entrusted with the task of pointing up the differences in approach and in the nature of authority in these two areas, thus paving the way for mutual understanding. The evaluation of the technical expert's opinion is one of the cardinal problems bound up with the role of the expert in legal procedure. After the presentation of the expert's opinion, the judge is supposed to possess so much specialised knowledge that he can assess the opinion itself in scientific and technical respects and put his finger on any errors the expert may have made. This problem can only be solved via an assessment of the opinion. First of all, the opinion can be assessed indirectly via evaluation of the credentials, neutrality and independence of the expert. In direct terms, the opinion can be subjected to a certain - albeit restricted - scrutiny of whether it is generally convincing, as far as the layman is competent to judge. This interpretation alone makes it possible to classify and integrate legally the technical standards and regulations, which represent expert statements on scientific and technical theorems based on the knowledge and experience gained in a given area. They are designed to reflect prevailing opinion among leading representatives of the profession and can thus themselves be regarded as expert opinions. As a rule, these opinions will have such weight that - other than in exceptional cases - they will not be invalidated in procedure by deviating opinions from individual experts. (orig./HSCH) [de
International Nuclear Information System (INIS)
Rady, E.A.; Kozae, A.M.; Abd El-Monsef, M.M.E.
2004-01-01
The process of analyzing data under uncertainty is a main goal in many real-life problems, and statistical analysis of such data is an active area of research. The aim of this paper is to introduce a new method concerning the generalization and modification of the rough set theory introduced by Pawlak [Int. J. Comput. Inform. Sci. 11 (1982) 314
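Pawlak's rough set theory, which this paper sets out to generalize, approximates an arbitrary set from below and above by unions of indiscernibility classes. A minimal sketch with hypothetical data (the partition and target set are invented for illustration):

```python
def rough_approximations(partition, target):
    """Pawlak lower and upper approximations of target with respect to a
    partition of the universe into indiscernibility classes."""
    target = set(target)
    lower, upper = set(), set()
    for block in partition:
        block = set(block)
        if block <= target:       # block entirely inside: certainly in target
            lower |= block
        if block & target:        # block touches target: possibly in target
            upper |= block
    return lower, upper

# Hypothetical universe {1..6} partitioned by indistinguishable attribute values.
partition = [{1, 2}, {3}, {4, 5}, {6}]
X = {1, 2, 3, 4}
low, up = rough_approximations(partition, X)
print(low, up)   # lower {1, 2, 3}; upper {1, 2, 3, 4, 5}; boundary {4, 5}
```

The boundary region (upper minus lower) is exactly the set of objects that the available attributes cannot classify, which is where generalizations of the theory operate.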
Stijnen, Mandy M N; Jansen, Maria W J; Duimel-Peeters, Inge G P; Vrijhoef, Hubertus J M
2014-10-25
Population ageing fosters new models of care delivery for older people that are increasingly integrated into existing care systems. In the Netherlands, a primary-care based preventive home visitation programme has been developed for potentially frail community-dwelling older people (aged ≥75 years), consisting of a comprehensive geriatric assessment during a home visit by a practice nurse followed by targeted interdisciplinary care and follow-up over time. A theory-based process evaluation was designed to examine (1) the extent to which the home visitation programme was implemented as planned and (2) the extent to which general practices successfully redesigned their care delivery. Using a mixed-methods approach, the focus was on fidelity (quality of implementation), dose delivered (completeness), dose received (exposure and satisfaction), reach (participation rate), recruitment, and context. Twenty-four general practices participated, of which 13 implemented the home visitation programme and 11 delivered usual care to older people. Data collection consisted of semi-structured interviews with practice nurses (PNs), general practitioners (GPs), and older people; feedback meetings with PNs; structured registration forms filled out by PNs; and narrative descriptions of the recruitment procedures and registration of inclusion and drop-outs by members of the research team. Fidelity of implementation was acceptable, but time constraints and inadequate reach (i.e., the relatively healthy older people participated) negatively influenced complete delivery of protocol elements, such as interdisciplinary cooperation and follow-up of older people over time. The home visitation programme was judged positively by PNs, GPs, and older people. Useful tools were offered to general practices for organising proactive geriatric care. The home visitation programme did not have major shortcomings in itself, but the delivery offered room for improvement. General practices received
Criteria and Processes for the Certification of Non-Radioactive Hazardous and Non-Hazardous Wastes
International Nuclear Information System (INIS)
Dominick, J.
2008-01-01
This document details Lawrence Livermore National Laboratory's (LLNL) criteria and processes for determining if potentially volumetrically contaminated or potentially surface contaminated wastes are to be managed as material containing residual radioactivity or as non-radioactive. This document updates and replaces UCRL-AR-109662, Criteria and Procedures for the Certification of Nonradioactive Hazardous Waste (Reference 1), also known as 'The Moratorium', and follows the guidance found in the U.S. Department of Energy (DOE) document, Performance Objective for Certification of Non-Radioactive Hazardous Waste (Reference 2). The 1992 Moratorium document (UCRL-AR-109662) is three volumes and 703 pages. The first volume provides an overview of the certification process and lists the key radioanalytical methods and their associated Limits of Sensitivities. Volumes Two and Three contain supporting documents and include over 30 operating procedures, QA plans, training documents and organizational charts that describe the hazardous and radioactive waste management system in place in 1992. This current document is intended to update the previous Moratorium documents and to serve as the top-tier LLNL institutional Moratorium document. The 1992 Moratorium document was restricted to certification of Resource Conservation and Recovery Act (RCRA), State and Toxic Substances Control Act (TSCA) hazardous waste from Radioactive Material Management Areas (RMMA). This still remains the primary focus of the Moratorium; however, this document increases the scope to allow use of this methodology to certify other LLNL wastes and materials destined for off-site disposal, transfer, and re-use including non-hazardous wastes and wastes generated outside of RMMAs with the potential for DOE added radioactivity. The LLNL organization that authorizes off-site transfer/disposal of a material or waste stream is responsible for implementing the requirements of this document. The LLNL Radioactive and
Allen, Johnie J; Anderson, Craig A; Bushman, Brad J
2018-02-01
The General Aggression Model (GAM) is a comprehensive, integrative framework for understanding aggression. It considers the role of social, cognitive, personality, developmental, and biological factors on aggression. Proximate processes of GAM detail how person and situation factors influence cognitions, feelings, and arousal, which in turn affect appraisal and decision processes, which in turn influence aggressive or nonaggressive behavioral outcomes. Each cycle of the proximate processes serves as a learning trial that affects the development and accessibility of aggressive knowledge structures. Distal processes of GAM detail how biological and persistent environmental factors can influence personality through changes in knowledge structures. GAM has been applied to understand aggression in many contexts including media violence effects, domestic violence, intergroup violence, temperature effects, pain effects, and the effects of global climate change. Copyright © 2017 Elsevier Ltd. All rights reserved.
Categorization = Decision Making + Generalization
Seger, Carol A; Peterson, Erik J.
2013-01-01
We rarely, if ever, repeatedly encounter exactly the same situation. This makes generalization crucial for real world decision making. We argue that categorization, the study of generalizable representations, is a type of decision making, and that categorization learning research would benefit from approaches developed to study the neuroscience of decision making. Similarly, methods developed to examine generalization and learning within the field of categorization may enhance decision making research. We first discuss perceptual information processing and integration, with an emphasis on accumulator models. We then examine learning the value of different decision making choices via experience, emphasizing reinforcement learning modeling approaches. Next we discuss how value is combined with other factors in decision making, emphasizing the effects of uncertainty. Finally, we describe how a final decision is selected via thresholding processes implemented by the basal ganglia and related regions. We also consider how memory related functions in the hippocampus may be integrated with decision making mechanisms and contribute to categorization. PMID:23548891
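The accumulator models emphasized above can be sketched in a few lines: evidence for each candidate category accumulates noisily, and the first accumulator to cross a threshold determines both the choice and the decision time. All parameter values below are illustrative, not taken from the paper:

```python
import random

def race_trial(drifts, threshold=30.0, noise=1.0, rng=random):
    """One trial of a racing-accumulator decision: each candidate category
    accumulates noisy evidence; the first to reach threshold is chosen."""
    x = [0.0] * len(drifts)
    t = 0
    while True:
        t += 1
        for i, d in enumerate(drifts):
            x[i] += d + rng.gauss(0.0, noise)
            if x[i] >= threshold:
                return i, t          # chosen category and decision time

random.seed(0)
trials = [race_trial([1.0, 0.2]) for _ in range(200)]
accuracy = sum(choice == 0 for choice, _ in trials) / len(trials)
print(accuracy)   # the higher-drift category wins on most trials
```

Raising the threshold trades speed for accuracy, the kind of thresholding the abstract attributes to the basal ganglia and related regions.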
General relativity and mathematics; Relatividad General y Matematicas
Energy Technology Data Exchange (ETDEWEB)
Mars, M.
2015-07-01
General relativity is more than a theory of gravity: since any physical process occupies space and lasts for a time, any physical theory must be reconciled with the theory that describes the dynamic nature of space-time itself. (Author)
Cancer Investigation in General Practice
DEFF Research Database (Denmark)
Jensen, Jacob Reinholdt; Møller, Henrik; Thomsen, Janus Laust
2014-01-01
Initiation of cancer investigations in general practice. Background: Close to 90% of all cancers are diagnosed because the patient presents symptoms and signs. Of these patients, 85% initiate the diagnostic pathway in general practice. Therefore, the initiation of a diagnostic pathway in general practice becomes extremely important. On average, a general practitioner (GP) is involved in 7500 consultations each year, and in the diagnostic process of 8-10 incident cancers. One half of cancer patients consult their GP with either general symptoms, which are not indicative of cancer, or vague and non-specific symptoms. The other half present with what the GP assesses as alarm symptoms. Three months prior to diagnosis, patients who are later diagnosed with cancer have twice as many GP consultations as a comparable reference population. Thus the complex diagnostic process in general practice requires the GP...
Glauber model and its generalizations
International Nuclear Information System (INIS)
Bialkowski, G.
The physical aspects of the Glauber model are studied: the potential-model, profile-function and Feynman-diagram approaches. Different generalizations of the Glauber model are discussed, in particular higher- and lower-energy processes and large angles [fr
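In the profile-function approach mentioned here, the scattering amplitude is a two-dimensional Fourier-Bessel (Hankel) transform of the profile function Γ(b). A numerical sketch for a Gaussian profile, where the transform is known in closed form; Γ0, B and the quadrature choices are illustrative, not from the paper:

```python
import numpy as np

def trapz(y, dx, axis=-1):
    """Trapezoidal rule on a uniform grid (portable across NumPy versions)."""
    y = np.asarray(y)
    return (y.sum(axis=axis) - 0.5 * (y.take(0, axis=axis) + y.take(-1, axis=axis))) * dx

def J0(x):
    """Bessel J0 from the integral representation (1/pi) * int_0^pi cos(x sin t) dt."""
    th = np.linspace(0.0, np.pi, 1001)
    return trapz(np.cos(np.multiply.outer(x, np.sin(th))), th[1] - th[0]) / np.pi

def eikonal_amplitude(q, gamma0=1.0, B=1.0):
    """F(q)/(ik) = int_0^inf b J0(qb) Gamma(b) db for a Gaussian profile
    Gamma(b) = gamma0 exp(-b^2/2B); the exact result is gamma0 * B * exp(-q^2 B / 2)."""
    b = np.linspace(0.0, 12.0 * np.sqrt(B), 2001)
    profile = gamma0 * np.exp(-b**2 / (2.0 * B))
    return float(trapz(b * J0(q * b) * profile, b[1] - b[0]))

for q in (0.0, 0.5, 1.0, 2.0):
    print(q, eikonal_amplitude(q), float(np.exp(-q**2 / 2.0)))  # numeric vs closed form
```

The Gaussian case is the standard warm-up; realistic profile functions built from nuclear densities are handled by the same quadrature.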
St Clair Gibson, A; Swart, J; Tucker, R
2018-02-01
Either central (brain) or peripheral (body physiological system) control mechanisms, or a combination of these, have been championed in the last few decades in the field of Exercise Sciences as explanations of how physiological activity and fatigue processes are regulated. In this review, we suggest that 'central' and 'peripheral' mechanisms are both artificial constructs that have 'straight-jacketed' research in the field; rather, competition between psychological and physiological homeostatic drives is central to the regulation of both, and governing principles, rather than distinct physical processes, underpin all physical system and exercise regulation. As part of the Integrative Governor theory we develop in this review, we suggest that both psychological and physiological drives and requirements are underpinned by homeostatic principles, and that the relative activity of each is regulated by dynamic negative feedback activity, as the fundamental general operational controller. Because of this competitive, dynamic interplay, we propose that the activity in all systems will oscillate, that these oscillations create information, and that comparison of this oscillatory information with prior information, current activity, or activity templates creates efferent responses that change the activity in the different systems in a similarly dynamic manner. Changes in a particular system are always the result of perturbations occurring outside the system itself, the behavioural causative 'history' of this external activity will be evident in the pattern of the oscillations, and awareness of change occurs as a result of unexpected rather than planned change in physiological activity or psychological state.
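The claim that dynamic negative feedback produces oscillation can be illustrated with a toy delayed-feedback loop: a generic control-theory sketch, not a model from the review, and all constants are invented:

```python
def simulate_feedback(k=1.0, lag=20, dt=0.05, steps=4000):
    """Euler simulation of delayed negative feedback dx/dt = 1 - k*x(t - tau),
    with tau = lag*dt. The reaction lag makes the controlled variable overshoot
    and oscillate before settling at the set point 1/k."""
    x = [0.0] * (lag + 1)              # initial history
    for _ in range(steps):
        x.append(x[-1] + dt * (1.0 - k * x[-1 - lag]))
    return x

trace = simulate_feedback()
print(max(trace), trace[-1])   # overshoots well above 1.0, then settles near 1.0
```

With no lag the approach to the set point is monotone; any delay between a perturbation and the corrective response introduces the oscillations the authors argue carry information.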
DEFF Research Database (Denmark)
Borregaard, Michael K.; Matthews, Thomas J.; Whittaker, Robert James
2016-01-01
Aim: Island biogeography focuses on understanding the processes that underlie a set of well-described patterns on islands, but it lacks a unified theoretical framework for integrating these processes. The recently proposed general dynamic model (GDM) of oceanic island biogeography offers a step towards this goal. Here, we present an analysis of causality within the GDM and investigate its potential for the further development of island biogeographical theory. Further, we extend the GDM to include subduction-based island arcs and continental fragment islands. Location: A conceptual analysis … of evolutionary processes in simulations derived from the mechanistic assumptions of the GDM corresponded broadly to those initially suggested, with the exception of trends in extinction rates. Expanding the model to incorporate different scenarios of island ontogeny and isolation revealed a sensitivity...
Hamlin, J K
2014-01-01
The ability to distinguish friends from foes allows humans to engage in mutually beneficial cooperative acts while avoiding the costs associated with cooperating with the wrong individuals. One way to do so effectively is to observe how unknown individuals behave toward third parties, and to selectively cooperate with those who help others while avoiding those who harm others. Recent research suggests that a preference for prosocial over antisocial individuals emerges by the time that infants are 3 months of age, and by 8 months, but not before, infants evaluate others' actions in context: they prefer those who harm, rather than help, individuals who have previously harmed others. Currently there are at least two reasons for younger infants' failure to show context-dependent social evaluations. First, this failure may reflect fundamental change in infants' social evaluation system over the first year of life, in which infants first prefer helpers in any situation and only later evaluate prosocial and antisocial actors in context. On the other hand, it is possible that this developmental change actually reflects domain-general limitations of younger infants, such as limited memory and processing capacities. To distinguish between these possibilities, 4.5-month-olds in the current studies were habituated, rather than familiarized as in previous work, to one individual helping and another harming a third party, greatly increasing infants' exposure to the characters' actions. Following habituation, 4.5-month-olds displayed context-dependent social preferences, selectively reaching for helpers of prosocial and hinderers of antisocial others. Such results suggest that younger infants' failure to display global social evaluation in previous work reflected domain-general rather than domain-specific limitations.
Directory of Open Access Journals (Sweden)
J. Kiley Hamlin
2014-06-01
Full Text Available The ability to distinguish friends from foes allows humans to engage in mutually beneficial cooperative acts while avoiding the costs associated with cooperating with the wrong individuals. One way to do so effectively is to observe how unknown individuals behave toward third parties, and to selectively cooperate with those who help others while avoiding those who harm others. Recent research suggests that a preference for prosocial over antisocial individuals emerges by the time that infants are 3 months of age, and by 8 months, but not before, infants evaluate others’ actions in context: they prefer those who harm, rather than help, individuals who have previously harmed others. Currently there are at least two reasons for younger infants’ failure to show context-dependent social evaluations. First, this failure may reflect fundamental change in infants’ social evaluation system over the first year of life, in which infants first prefer helpers in any situation and only later evaluate prosocial and antisocial actors in context. On the other hand, it is possible that this developmental change actually reflects domain-general limitations of younger infants, such as limited memory and processing capacities. To distinguish between these possibilities, 4.5-month-olds in the current studies were habituated, rather than familiarized as in previous work, to one individual helping and another harming a third party, greatly increasing infants’ exposure to the characters’ actions. Following habituation, 4.5-month-olds displayed context-dependent social preferences, selectively reaching for helpers of prosocial and hinderers of antisocial others. Such results suggest that younger infants’ failure to display global social evaluation in previous work reflected domain-general rather than domain-specific limitations.
Generalized internal multiple imaging
Zuberi, Mohammad Akbar Hosain
2014-12-04
Various examples are provided for generalized internal multiple imaging (GIMI). In one example, among others, a method includes generating a higher order internal multiple image using a background Green's function and rendering the higher order internal multiple image for presentation. In another example, a system includes a computing device and a generalized internal multiple imaging (GIMI) application executable in the computing device. The GIMI application includes logic that generates a higher order internal multiple image using a background Green's function and logic that renders the higher order internal multiple image for display on a display device. In another example, a non-transitory computer readable medium has a program executable by processing circuitry that generates a higher order internal multiple image using a background Green's function and renders the higher order internal multiple image for display on a display device.
Generalized internal multiple imaging
Zuberi, Mohammad Akbar Hosain; Alkhalifah, Tariq
2014-01-01
Various examples are provided for generalized internal multiple imaging (GIMI). In one example, among others, a method includes generating a higher order internal multiple image using a background Green's function and rendering the higher order internal multiple image for presentation. In another example, a system includes a computing device and a generalized internal multiple imaging (GIMI) application executable in the computing device. The GIMI application includes logic that generates a higher order internal multiple image using a background Green's function and logic that renders the higher order internal multiple image for display on a display device. In another example, a non-transitory computer readable medium has a program executable by processing circuitry that generates a higher order internal multiple image using a background Green's function and renders the higher order internal multiple image for display on a display device.
40 CFR 68.12 - General requirements.
2010-07-01
...) CHEMICAL ACCIDENT PREVENTION PROVISIONS General § 68.12 General requirements. (a) General requirements. The... the five-year accident history for the process as provided in § 68.42 of this part and submit it in... §§ 68.150 to 68.185. The RMP shall include a registration that reflects all covered processes. (b...
Generalized Multiphoton Quantum Interference
Directory of Open Access Journals (Sweden)
Max Tillmann
2015-10-01
Full Text Available Nonclassical interference of photons lies at the heart of optical quantum information processing. Here, we exploit tunable distinguishability to reveal the full spectrum of multiphoton nonclassical interference. We investigate this in theory and experiment by controlling the delay times of three photons injected into an integrated interferometric network. We derive the entire coincidence landscape and identify transition matrix immanants as ideally suited functions to describe the generalized case of input photons with arbitrary distinguishability. We introduce a compact description by utilizing a natural basis that decouples the input state from the interferometric network, thereby providing a useful tool for even larger photon numbers.
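For fully indistinguishable photons, the transition amplitudes in such a network are permanents of submatrices of the interferometer unitary, while fully distinguishable photons contribute permanents of the squared moduli; the transition matrix immanants mentioned above interpolate between these extremes. A minimal sketch of the two limiting cases for Hong-Ou-Mandel interference on a 50:50 beam splitter:

```python
from itertools import permutations

def permanent(M):
    """Naive permanent via the sum over permutations (fine for small networks)."""
    n = len(M)
    total = 0.0
    for sigma in permutations(range(n)):
        p = 1.0
        for i in range(n):
            p *= M[i][sigma[i]]
        total += p
    return total

# 50:50 beam splitter; one photon in each input, coincidence = one in each output.
r = 0.5 ** 0.5
bs = [[r, r],
      [r, -r]]

p_indist = abs(permanent(bs)) ** 2                              # quantum: HOM dip -> 0
p_dist = permanent([[abs(x) ** 2 for x in row] for row in bs])  # classical: 1/2
print(p_indist, p_dist)
```

Tuning the arrival-time delay, as in the experiment above, sweeps the coincidence rate continuously between these two values.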
General Criterion for Harmonicity
Proesmans, Karel; Vandebroek, Hans; Van den Broeck, Christian
2017-10-01
Inspired by Kubo-Anderson Markov processes, we introduce a new class of transfer matrices whose largest eigenvalue is determined by a simple explicit algebraic equation. Applications include the free energy calculation for various equilibrium systems and a general criterion for perfect harmonicity, i.e., a free energy that is exactly quadratic in the external field. As an illustration, we construct a "perfect spring," namely, a polymer with non-Gaussian, exponentially distributed subunits which, nevertheless, remains harmonic until it is fully stretched. This surprising discovery is confirmed by Monte Carlo and Langevin simulations.
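The transfer-matrix route to a free energy is easiest to see in the textbook 1D Ising chain, where the free energy per site is fixed by the largest eigenvalue. This is the standard example, not the new class of matrices introduced in the paper:

```python
import math

def ising_free_energy_per_site(J=1.0, h=0.0, beta=1.0):
    """Free energy per site of the 1D Ising chain, f = -ln(lambda_max)/beta,
    with lambda_max the largest eigenvalue of the 2x2 transfer matrix."""
    a = math.exp(beta * (J + h))      # T[up, up]
    b = math.exp(-beta * J)           # T[up, down] = T[down, up]
    d = math.exp(beta * (J - h))      # T[down, down]
    # Symmetric 2x2 matrix [[a, b], [b, d]]: largest eigenvalue in closed form.
    lam = 0.5 * ((a + d) + math.sqrt((a - d) ** 2 + 4.0 * b * b))
    return -math.log(lam) / beta

# At h = 0 the closed form is lambda_max = 2 cosh(beta J).
print(ising_free_energy_per_site(J=1.0, h=0.0, beta=0.7))
```

A "perfect spring" in the paper's sense would make f exactly quadratic in h; for the Ising chain it is not, which the reader can check by expanding f around h = 0.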
International Nuclear Information System (INIS)
1982-11-01
The Fundamental Safety Rules applicable to certain types of nuclear installation are intended to clarify the conditions whose observance, for the type of installation concerned and for the subject that they deal with, is considered equivalent to compliance with regulatory French technical practice. These Rules should facilitate safety analyses and clear understanding between persons interested in matters related to nuclear safety. They in no way reduce the operator's liability and pose no obstacle to statutory provisions in force. For any installation to which a Fundamental Safety Rule applies according to the foregoing paragraph, the operator may be relieved from application of the Rule if he shows proof that the safety objectives set by the Rule are attained by other means that he proposes within the framework of statutory procedures. Furthermore, the Central Service for the Safety of Nuclear Installations reserves the right at all times to alter any Fundamental Safety Rule as required, should it deem this necessary, while specifying the applicability conditions. This Rule is intended to define the general provisions applicable to the production, inspection, processing, packaging and storage of wastes resulting from the reprocessing of fuels irradiated in a PWR, packaged in the form of glass
Generalized Nonlinear Yule Models
Lansky, Petr; Polito, Federico; Sacerdote, Laura
2016-11-01
With the aim of considering models related to random graphs growth exhibiting persistent memory, we propose a fractional nonlinear modification of the classical Yule model often studied in the context of macroevolution. Here the model is analyzed and interpreted in the framework of the development of networks such as the World Wide Web. Nonlinearity is introduced by replacing the linear birth process governing the growth of the in-links of each specific webpage with a fractional nonlinear birth process with completely general birth rates. Among the main results we derive the explicit distribution of the number of in-links of a webpage chosen uniformly at random recognizing the contribution to the asymptotics and the finite time correction. The mean value of the latter distribution is also calculated explicitly in the most general case. Furthermore, in order to show the usefulness of our results, we particularize them in the case of specific birth rates giving rise to a saturating behaviour, a property that is often observed in nature. The further specialization to the non-fractional case allows us to extend the Yule model accounting for a nonlinear growth.
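The birth-process backbone of these models is straightforward to simulate: waiting times between births are exponential with rate λ_n. The sketch below handles completely general (non-fractional) birth rates, including a saturating choice like the one discussed; the rates and times are illustrative, not from the paper:

```python
import random

def birth_process_at(t_end, rate, n0=1, rng=random):
    """Population size N(t_end) of a pure birth process with general rates
    lambda_n = rate(n), simulated via exponential waiting times (Gillespie)."""
    n, t = n0, 0.0
    while True:
        lam = rate(n)
        if lam <= 0.0:
            return n
        t += rng.expovariate(lam)
        if t > t_end:
            return n
        n += 1

random.seed(1)
linear = lambda n: 1.0 * n                   # classical Yule model, lambda = 1
saturating = lambda n: n / (1.0 + 0.1 * n)   # rates that level off (saturation)
m_lin = sum(birth_process_at(2.0, linear) for _ in range(2000)) / 2000
m_sat = sum(birth_process_at(2.0, saturating) for _ in range(2000)) / 2000
print(m_lin, m_sat)   # Yule mean is e^{lambda t}; saturating rates grow more slowly
```

The fractional version replaces the exponential waiting times with heavy-tailed ones, which is what introduces the persistent memory; that extension is not attempted here.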
Entrepreneurship within General Aviation
Ullmann, Brian M.
1995-01-01
Many modern economic theories place great importance upon entrepreneurship in the economy. Some see the entrepreneur as the individual who bears the risk of operating a business in the face of uncertainty about future conditions and who is rewarded through profits and losses. The 20th-century economist Joseph Schumpeter saw the entrepreneur as the medium by which advancing technology is incorporated into society as businesses seek competitive advantages through more efficient product development processes. Due to the importance that capitalistic systems place upon entrepreneurship, it has become a well-studied subject with many texts discussing how entrepreneurs can succeed in modern society. Many entrepreneurship and business management courses go so far as to discuss the characteristic phases and prominent challenges that fledgling companies face in their efforts to bring a new product into a competitive market. However, even with all of these aids, start-up companies fail at an enormous rate. Indeed, the odds of shepherding a new company through the travails of becoming a well-established company (as measured by the ability to reach Initial Public Offering (IPO)) have been estimated to be six in 1,000,000. Each niche industry has characteristic challenges which act as barriers to entry for new products into that industry. Thus, the applicability of broad generalizations is subject to limitations within niche markets. This paper will discuss entrepreneurship as it relates to general aviation. The goals of this paper will be to: introduce general aviation; discuss the details of marrying entrepreneurship with general aviation; and present a sample business plan which would characterize a possible entrepreneurial venture.
Newman, Michelle G; Castonguay, Louis G; Jacobson, Nicholas C; Moore, Ginger A
2015-10-01
To determine whether baseline dimensions of adult insecure attachment (avoidant and anxious) moderated outcome in a secondary analysis of a randomized controlled trial comparing cognitive-behavioral therapy (CBT) plus supportive listening (CBT + SL) versus CBT plus interpersonal and emotional processing therapy (CBT + I/EP). Eighty-three participants diagnosed with generalized anxiety disorder (GAD) were recruited from the community and assigned randomly to CBT + SL (n = 40) or to CBT + I/EP (n = 43) within a study using an additive design. PhD-level psychologists treated participants. Blind assessors evaluated participants at pretreatment, posttreatment, 6-month, 12-month, and 2-year follow-up with a composite of self-report and assessor-rated GAD symptom measures (Penn State Worry Questionnaire, Hamilton Anxiety Rating Scale, Clinician's Severity Rating). Avoidant and anxious attachment were assessed using self-reported dismissing and angry states of mind, respectively, on the Perceptions of Adult Attachment Questionnaire. Consistent with our prediction, at all assessments higher levels of dismissing styles in those who received CBT + I/EP predicted greater change in GAD symptoms compared with those who received CBT + SL for whom dismissiveness was unrelated to the change. At postassessment, higher angry attachment was associated with less change in GAD symptoms for those receiving CBT + I/EP, compared with CBT + SL, for whom anger was unrelated to change in GAD symptoms. Pretreatment attachment-related anger failed to moderate outcome at other time points and therefore, these moderation effects were more short-lived than the ones for dismissing attachment. When compared with CBT + SL, CBT + I/EP may be better for individuals with GAD who have relatively higher dismissing styles of attachment. (c) 2015 APA, all rights reserved.
Newman, Michelle G.; Castonguay, Louis G.; Jacobson, Nicholas C.; Moore, Ginger A.
2016-01-01
Objective To determine whether baseline dimensions of adult insecure attachment (avoidant and anxious) moderated outcome in a secondary analysis of a randomized controlled trial comparing cognitive–behavioral therapy (CBT) plus supportive listening (CBT + SL) versus CBT plus interpersonal and emotional processing therapy (CBT + I/EP). Method Eighty-three participants diagnosed with generalized anxiety disorder (GAD) were recruited from the community and assigned randomly to CBT + SL (n = 40) or to CBT + I/EP (n = 43) within a study using an additive design. PhD-level psychologists treated participants. Blind assessors evaluated participants at pretreatment, posttreatment, 6-month, 12-month, and 2-year follow-up with a composite of self-report and assessor-rated GAD symptom measures (Penn State Worry Questionnaire, Hamilton Anxiety Rating Scale, Clinician’s Severity Rating). Avoidant and anxious attachment were assessed using self-reported dismissing and angry states of mind, respectively, on the Perceptions of Adult Attachment Questionnaire. Results Consistent with our prediction, at all assessments higher levels of dismissing styles in those who received CBT + I/EP predicted greater change in GAD symptoms compared with those who received CBT + SL for whom dismissiveness was unrelated to the change. At postassessment, higher angry attachment was associated with less change in GAD symptoms for those receiving CBT + I/EP, compared with CBT + SL, for whom anger was unrelated to change in GAD symptoms. Pretreatment attachment-related anger failed to moderate outcome at other time points and therefore, these moderation effects were more short-lived than the ones for dismissing attachment. Conclusions When compared with CBT + SL, CBT + I/EP may be better for individuals with GAD who have relatively higher dismissing styles of attachment. PMID:26052875
A generalized gyrokinetic Poisson solver
International Nuclear Information System (INIS)
Lin, Z.; Lee, W.W.
1995-03-01
A generalized gyrokinetic Poisson solver has been developed, which employs local operations in the configuration space to compute the polarization density response. The new technique is based on the actual physical process of gyrophase-averaging. It is useful for nonlocal simulations using general geometry equilibrium. Since it utilizes local operations rather than the global ones such as FFT, the new method is most amenable to massively parallel algorithms
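The locality of gyrophase-averaging can be illustrated with a toy calculation (an illustration of the principle, not the paper's solver): the gyroaverage of a field at a point needs only field values on a ring of one gyroradius around that point, which is why the operation is local, unlike a global FFT. For a plane-wave field the ring average has the known closed form <cos(kx)> = J0(k*rho) * cos(kx), which the sketch checks numerically.

```python
import math

def bessel_j0(z, terms=30):
    """J0(z) via its power series (adequate for moderate |z|)."""
    total, term = 0.0, 1.0
    for m in range(terms):
        if m > 0:
            term *= -(z / 2.0) ** 2 / m**2
        total += term
    return total

def gyroaverage(field, x, y, rho, n_theta=256):
    """Average `field` over a ring of gyroradius rho centred at (x, y).

    A purely local operation: only field values on the ring are needed,
    in contrast to a global FFT-based evaluation.
    """
    acc = 0.0
    for i in range(n_theta):
        theta = 2.0 * math.pi * i / n_theta
        acc += field(x + rho * math.cos(theta), y + rho * math.sin(theta))
    return acc / n_theta

# Analytic check: the gyroaverage of cos(k*x) is J0(k*rho) * cos(k*x).
k, rho = 1.3, 0.7
phi = lambda x, y: math.cos(k * x)
avg = gyroaverage(phi, x=0.4, y=0.0, rho=rho)
exact = bessel_j0(k * rho) * math.cos(k * 0.4)
```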
Directory of Open Access Journals (Sweden)
Yakelin Rodríguez
2012-09-01
Full Text Available The antioxidant system is involved in arbuscular mycorrhizal symbiosis, but its role during the colonization process is still poorly understood. To gain new insights into the role of the antioxidant system during root colonization by arbuscular mycorrhizal fungi, the activities of key antioxidant enzymes were evaluated in tomato (Solanum lycopersicum L.) roots inoculated with six strains of different genera and species: two Glomus mosseae, Glomus cubense, Glomus intraradices, Glomus sp., and Acaulospora scrobiculata. The Glomus cubense and A. scrobiculata strains reached the highest infectivity levels, with maximum colonization frequencies of 29 and 18% and intensities of 10.88 and 9.20%, respectively; the G. mosseae strains showed intermediate infectivity, both with 15% colonization frequency and maximum intensities of 7.64 and 7.06%, respectively; the infectivity levels of the Glomus sp. and G. intraradices strains were the lowest, with colonization frequencies of 13% and intensities of 5.07 and 5.41%, respectively. Some activity patterns of peroxidase, superoxide dismutase, and polyphenol oxidase enzymes were not specific for early or late colonization stages, nor for the colonization level and type of strain. However, a unique superoxide dismutase band present at early colonization and the low level of guaiacol peroxidase activity at later stages, both found in all inoculated roots, indicate that these antioxidant responses are independent of colonization degree and strain. Taken together, our data suggest that alterations of the antioxidant enzyme activities are not general characteristics of the colonization process by arbuscular mycorrhizal fungi; the specific features of each strain, rather than colonization per se, probably play the key role in those responses.
California Natural Resource Agency — We undertook creating the first ever seamless statewide General Plan map for California. All county general plans and many city general plans were integrated into 1...
Recruitment of general practices
DEFF Research Database (Denmark)
Riis, Allan; Jensen, Cathrine Elgaard; Maindal, Helle Terkildsen
2016-01-01
Introduction: Health service research often involves the active participation of healthcare professionals. However, their ability and commitment to research varies. This can cause recruitment difficulties and thereby prolong the study period and inflate budgets. Solberg has identified seven R-factors as determinants for successfully recruiting healthcare professionals: relationships, reputation, requirements, rewards, reciprocity, resolution, and respect. Method: This is a process evaluation of the seven R-factors. We applied these factors to guide the design of our recruitment strategy as well as to make adjustments when recruiting general practices in a guideline implementation study. In the guideline implementation study, we studied the effect of outreach visits, quality reports, and new patient stratification tools for low back pain patients. Results: During a period of 15 months, we recruited 60 practices...
Generalized Probability Functions
Directory of Open Access Journals (Sweden)
Alexandre Souto Martinez
2009-01-01
Full Text Available From the integration of nonsymmetrical hyperboles, a one-parameter generalization of the logarithmic function is obtained. Inverting this function, one obtains the generalized exponential function. Motivated by the mathematical curiosity, we show that these generalized functions are suitable to generalize some probability density functions (pdfs. A very reliable rank distribution can be conveniently described by the generalized exponential function. Finally, we turn the attention to the generalization of one- and two-tail stretched exponential functions. We obtain, as particular cases, the generalized error function, the Zipf-Mandelbrot pdf, the generalized Gaussian and Laplace pdf. Their cumulative functions and moments were also obtained analytically.
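The construction can be sketched concretely. Assuming the Tsallis-type form ln_q(x) = (x^q - 1)/q for the one-parameter generalized logarithm (the paper's convention may differ by a reparametrization), its inverse is exp_q(x) = (1 + q*x)^(1/q), and both reduce to ln and exp as q -> 0:

```python
import math

def gen_log(x, q):
    """One-parameter generalized logarithm; q -> 0 recovers ln(x).

    Assumes the Tsallis-type form ln_q(x) = (x**q - 1)/q; the paper's
    convention may differ by a sign or reparametrization.
    """
    if abs(q) < 1e-12:
        return math.log(x)
    return (x**q - 1.0) / q

def gen_exp(x, q):
    """Inverse of gen_log; q -> 0 recovers exp(x)."""
    if abs(q) < 1e-12:
        return math.exp(x)
    return (1.0 + q * x) ** (1.0 / q)

# gen_exp inverts gen_log for admissible arguments
val = gen_exp(gen_log(2.5, 0.3), 0.3)
```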
Generalized Linear Covariance Analysis
Carpenter, James R.; Markley, F. Landis
2014-01-01
This talk presents a comprehensive approach to filter modeling for generalized covariance analysis of both batch least-squares and sequential estimators. We review and extend in two directions the results of prior work that allowed for partitioning of the state space into "solve-for" and "consider" parameters, accounted for differences between the formal values and the true values of the measurement noise, process noise, and a priori solve-for and consider covariances, and explicitly partitioned the errors into subspaces containing only the influence of the measurement noise, process noise, and solve-for and consider covariances. In this work, we explicitly add sensitivity analysis to this prior work, and relax an implicit assumption that the batch estimator's epoch time occurs prior to the definitive span. We also apply the method to an integrated orbit and attitude problem, in which gyro and accelerometer errors, though not estimated, influence the orbit determination performance. We illustrate our results using two graphical presentations, which we call the "variance sandpile" and the "sensitivity mosaic," and we compare the linear covariance results to confidence intervals associated with ensemble statistics from a Monte Carlo analysis.
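The comparison of linear covariance results with Monte Carlo ensemble statistics can be shown in miniature (only the core idea, not the paper's solve-for/consider framework): propagate a covariance through a linear model as P1 = F*P0*F^T + Q and check it against sample variances from an ensemble.

```python
import random

random.seed(0)

# State transition and noise for a 2-state linear system (e.g., position/velocity)
F = [[1.0, 0.1], [0.0, 1.0]]
Q_std = 0.1            # std of process noise on the velocity state
P0_std = (0.2, 0.3)    # initial stds, uncorrelated states

# Linear covariance propagation: P1 = F * P0 * F^T + Q
P0 = [[P0_std[0] ** 2, 0.0], [0.0, P0_std[1] ** 2]]
Q = [[0.0, 0.0], [0.0, Q_std**2]]

def propagate(P):
    FP = [[sum(F[i][k] * P[k][j] for k in range(2)) for j in range(2)] for i in range(2)]
    FPFt = [[sum(FP[i][k] * F[j][k] for k in range(2)) for j in range(2)] for i in range(2)]
    return [[FPFt[i][j] + Q[i][j] for j in range(2)] for i in range(2)]

P1 = propagate(P0)

# Monte Carlo ensemble check of the position variance
N = 100_000
var_pos = 0.0
for _ in range(N):
    x = [random.gauss(0, P0_std[0]), random.gauss(0, P0_std[1])]
    x1_pos = F[0][0] * x[0] + F[0][1] * x[1]   # noise enters only the velocity state
    var_pos += x1_pos**2
var_pos /= N
```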
DEFF Research Database (Denmark)
van der Aalst, W.M.P.; Rubin, V.; Verbeek, H.M.W.
2010-01-01
Process mining includes the automated discovery of processes from event logs. Based on observed events (e.g., activities being executed or messages being exchanged) a process model is constructed. One of the essential problems in process mining is that one cannot assume to have seen all possible behavior. At best, one has seen a representative subset. Therefore, classical synthesis techniques are not suitable as they aim at finding a model that is able to exactly reproduce the log. Existing process mining techniques try to avoid such “overfitting” by generalizing the model to allow for more ... support for it). None of the existing techniques enables the user to control the balance between “overfitting” and “underfitting”. To address this, we propose a two-step approach. First, using a configurable approach, a transition system is constructed. Then, using the “theory of regions”, the model...
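The first step of the two-step approach, constructing a transition system from a log under a configurable state abstraction, can be sketched as follows (an illustrative reconstruction, not the authors' implementation; the choice of abstraction is what tunes overfitting versus underfitting):

```python
# Build a transition system from an event log with a configurable state
# abstraction.  Here the abstraction maps a trace prefix to the *set* of
# activities seen so far (other choices: full prefix, multiset, last-k
# events), which controls how much the model generalizes.
def build_transition_system(log, abstraction=frozenset):
    transitions = set()
    for trace in log:
        prefix = []
        for activity in trace:
            src = abstraction(prefix)
            prefix.append(activity)
            dst = abstraction(prefix)
            transitions.add((src, activity, dst))
    return transitions

log = [["a", "b", "c"], ["a", "c", "b"]]
ts = build_transition_system(log)   # 5 distinct transitions under the set abstraction
```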
International Nuclear Information System (INIS)
Anon.
1981-01-01
Many of the measurements and observations made in a nuclear processing facility to monitor processes and product quality can also be used to monitor the location and movements of nuclear materials. In this session information is presented on how to use process monitoring data to enhance nuclear material control and accounting (MC and A). It will be seen that SNM losses can generally be detected with greater sensitivity and timeliness and point of loss localized more closely than by conventional MC and A systems if process monitoring data are applied. The purpose of this session is to enable the participants to: (1) identify process unit operations that could improve control units for monitoring SNM losses; (2) choose key measurement points and formulate a loss indicator for each control unit; and (3) describe how the sensitivities and timeliness of loss detection could be determined for each loss indicator
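A loss indicator of the kind described, formulated for a single control unit, is conventionally a material balance (MUF, material unaccounted for) compared against its combined measurement uncertainty. A minimal sketch with made-up numbers (illustrative only; thresholds and error models in practice are facility-specific):

```python
import math

def muf(begin_inv, inputs, outputs, end_inv):
    """Material unaccounted for (MUF) over one balance period of a control unit."""
    return begin_inv + sum(inputs) - sum(outputs) - end_inv

def alarm(muf_value, sigmas, k=3.0):
    """Flag a possible loss when |MUF| exceeds k times its combined
    measurement uncertainty (independent errors added in quadrature)."""
    sigma_muf = math.sqrt(sum(s**2 for s in sigmas))
    return abs(muf_value) > k * sigma_muf

# Hypothetical balance period (kg); all numbers are illustrative only.
m = muf(begin_inv=102.0, inputs=[40.0], outputs=[38.5], end_inv=103.0)   # 0.5 kg
triggered = alarm(m, sigmas=[0.05, 0.04, 0.04, 0.05])
```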
DEFF Research Database (Denmark)
Vilstrup Pedersen, Klaus
2006-01-01
... i.e. local guidelines. From a knowledge management point of view, this externalization of generalized processes gives the opportunity to learn from, evaluate, and optimize the processes. "Clinical Process Intelligence" (CPI) will denote the goal of getting generalized insight into patient-centered health...
Minimal and careful processing
Nielsen, Thorkild
2004-01-01
In several standards, guidelines and publications, organic food processing is strongly associated with "minimal processing" and "careful processing". The term "minimal processing" is nowadays often used in the general food processing industry and described in literature. The term "careful processing" is used more specifically within organic food processing but is not yet clearly defined. The concept of carefulness seems to fit very well with the processing of organic foods, especially if it i...
40 CFR 425.02 - General definitions.
2010-07-01
... STANDARDS LEATHER TANNING AND FINISHING POINT SOURCE CATEGORY General Provisions § 425.02 General...) “Chrome tan” means the process of converting hide into leather using a form of chromium. (g) “Vegetable tan” means the process of converting hides into leather using chemicals either derived from vegetable...
Generalized Cartan Calculus in general dimension
Wang, Yi-Nan
2015-07-01
We develop the generalized Cartan Calculus for the groups and SO(5 , 5). They are the underlying algebraic structures of d = 9 , 7 , 6 exceptional field theory, respectively. These algebraic identities are needed for the "tensor hierarchy" structure in exceptional field theory. The validity of Poincaré lemmas in this new differential geometry is also discussed. Finally we explore some possible extension of the generalized Cartan calculus beyond the exceptional series.
O(3)-invariant tunneling in general relativity
International Nuclear Information System (INIS)
Berezin, V.A.; Tkachev, I.I.; Kuzmin, V.A.; AN SSSR, Moscow. Inst. Yadernykh Issledovanij)
1987-12-01
We derived a general formula for the action for any O(3)-invariant tunneling processes in false vacuum decay in general relativity. The general classification of the bubble Euclidean trajectories is elaborated and explicit expressions for bounces for some processes like the vacuum creation of a double bubble, in particular in the vicinity of a black hole; the subbarrier creation of the Einstein-Rosen bridge, creation from nothing of two Minkowski worlds connected by a shell etc., are given. (orig.)
Generalized convexity, generalized monotonicity recent results
Martinez-Legaz, Juan-Enrique; Volle, Michel
1998-01-01
A function is convex if its epigraph is convex. This geometrical structure has very strong implications in terms of continuity and differentiability. Separation theorems lead to optimality conditions and duality for convex problems. A function is quasiconvex if its lower level sets are convex. Here again, the geometrical structure of the level sets implies some continuity and differentiability properties for quasiconvex functions. Optimality conditions and duality can be derived for optimization problems involving such functions as well. Over a period of about fifty years, quasiconvex and other generalized convex functions have been considered in a variety of fields including economics, management science, engineering, probability and applied sciences in accordance with the need of particular applications. During the last twenty-five years, an increase of research activities in this field has been witnessed. More recently generalized monotonicity of maps has been studied. It relates to generalized conve...
International Nuclear Information System (INIS)
Dragon, N.
1979-01-01
The possible use of trilinear algebras as symmetry algebras for para-Fermi fields is investigated. The shortcomings of the examples are argued to be a general feature of such generalized algebras. (author)
International Nuclear Information System (INIS)
Leivo, H.P.
1992-01-01
The algebraic approach to quantum groups is generalized to include what may be called an anyonic symmetry, reflecting the appearance of phases more general than ±1 under transposition. (author). 6 refs
Department of Transportation — Delphi general ledger contains the following data elements, but is not limited to the United States Standard General Ledger (USSGL) chart of accounts, stores actual,...
Generalized hypergeometric coherent states
International Nuclear Information System (INIS)
Appl, Thomas; Schiller, Diethard H
2004-01-01
We introduce a large class of holomorphic quantum states by choosing their normalization functions to be given by generalized hypergeometric functions. We call them generalized hypergeometric states in general, and generalized hypergeometric coherent states in particular, if they allow a resolution of unity. Depending on the domain of convergence of the generalized hypergeometric functions, we distinguish generalized hypergeometric states on the plane, the open unit disc and the unit circle. All states are eigenstates of suitably defined lowering operators. We then study their photon number statistics and phase properties as revealed by the Husimi and Pegg-Barnett phase distributions. On the basis of the generalized hypergeometric coherent states we introduce new analytic representations of arbitrary quantum states in Bargmann and Hardy spaces as well as generalized hypergeometric Husimi distributions and corresponding phase distributions
International Nuclear Information System (INIS)
Manus, C.; Mainfray, G.
1980-01-01
The main features of multiphoton processes are described on a somewhat elementary basis. The emphasis is put on multiphoton ionization of atoms where the influence of resonance effects is given through typical examples. The important role played by the coherence of light is shown to produce a very dramatic influence on multiphoton absorption. Different observations concerning molecules, electrons, as well as solid surfaces illustrate the generality of these very non linear interaction between light and matter
Regeneration and general Markov chains
Directory of Open Access Journals (Sweden)
Vladimir V. Kalashnikov
1994-01-01
Full Text Available Ergodicity, continuity, finite approximations and rare visits of general Markov chains are investigated. The obtained results permit further quantitative analysis of characteristics, such as, rates of convergence, continuity (measured as a distance between perturbed and non-perturbed characteristics, deviations between Markov chains, accuracy of approximations and bounds on the distribution function of the first visit time to a chosen subset, etc. The underlying techniques use the embedding of the general Markov chain into a wide sense regenerative process with the help of splitting construction.
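The regenerative structure described above can be demonstrated with a toy example (illustrative only, not the paper's splitting construction): for a two-state chain, visits to one fixed state are regeneration epochs, and the ratio-of-cycle-sums estimator recovers the stationary mean.

```python
import random

random.seed(1)

# Two-state Markov chain; visits to state 0 are regeneration epochs.
p01, p10 = 0.3, 0.2    # transition probabilities 0 -> 1 and 1 -> 0
# Exact stationary probability of state 1: p01 / (p01 + p10) = 0.6

def step(s):
    if s == 0:
        return 1 if random.random() < p01 else 0
    return 0 if random.random() < p10 else 1

# Regenerative (ratio) estimator of the stationary mean of f(s) = s:
# sum of f over i.i.d. cycles divided by total cycle length.
cycle_sum, cycle_len = 0, 0
s = 0
for _ in range(50_000):            # number of regeneration cycles
    while True:                    # one cycle: from state 0 back to state 0
        cycle_sum += s
        cycle_len += 1
        s = step(s)
        if s == 0:
            break
estimate = cycle_sum / cycle_len   # should be close to 0.6
```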
Generalization of Vaidya's radiation metric
Energy Technology Data Exchange (ETDEWEB)
Gleiser, R J; Kozameh, C N [Universidad Nacional de Cordoba (Argentina). Instituto de Matematica, Astronomia y Fisica
1981-11-01
In this paper it is shown that if Vaidya's radiation metric is considered from the point of view of kinetic theory in general relativity, the corresponding phase space distribution function can be generalized in a particular way. The new family of spherically symmetric radiation metrics obtained contains Vaidya's as a limiting situation. The Einstein field equations are solved in a "comoving" coordinate system. Two arbitrary functions of a single variable are introduced in the process of solving these equations. Particular examples considered are a stationary solution, a nonvacuum solution depending on a single parameter, and several limiting situations.
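For reference, the outgoing Vaidya metric being generalized can be written in radiation coordinates with a mass function m(u) (sign conventions vary between authors):

```latex
ds^2 = -\left(1 - \frac{2m(u)}{r}\right) du^2 - 2\, du\, dr
       + r^2 \left( d\theta^2 + \sin^2\theta \, d\varphi^2 \right)
```

Setting m(u) to a constant recovers the Schwarzschild metric in outgoing Eddington-Finkelstein coordinates.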
Generalized Fourier transforms classes
DEFF Research Database (Denmark)
Berntsen, Svend; Møller, Steen
2002-01-01
The Fourier class of integral transforms with kernels $B(\omega r)$ has by definition inverse transforms with kernel $B(-\omega r)$. The space of such transforms is explicitly constructed. A slightly more general class of generalized Fourier transforms is introduced. From the general theory it follows that integral transforms with kernels which are products of a Bessel and a Hankel function, or which are of a certain general hypergeometric type, have inverse transforms of the same structure.
Ridgely, Charles T.
2010-01-01
Many textbooks dealing with general relativity do not demonstrate the derivation of forces in enough detail. The analyses presented herein demonstrate straightforward methods for computing forces by way of general relativity. Covariant divergence of the stress-energy-momentum tensor is used to derive a general expression of the force experienced…
Blastogenetic associations: General considerations.
Lubinsky, Mark
2015-11-01
Associations of anomalies, with VACTERL as the prototype, have been the source of much debate, including questions about the validity and definition of this category. Evidence is presented for a teratologic basis for associations involving interactions between disruptive events and specific vulnerabilities. Because the embryo is organized in time and space, differences in the timing, location, and severity of exposures will create variable sequelae for any specific vulnerability, creating associations. The blastogenetic stage of development involves distinct properties that affect the nature of associations arising during this time, including relatively undifferentiated developmental fields and causally nonspecific malformations. With this, single anomalies can be part of the spectrum of findings that comprise a specific association. A specific defect defines a subset of disturbances, biasing frequencies of other defects. Processes are basic, integrated, and general, so disruptions are often lethal, and can have multiple effects, accounting for high incidences of multiple anomalies, and overlaps between associations. Blastogenetic disturbances also do not affect the late "fine tuning" of minor anomalies, although pathogenetic sequences can occur. This model suggests that certain combinations of congenital anomalies can arise from causally nonspecific teratogenetic fields determined by timing, location, and vulnerabilities, rather than polytopic developmental fields. © 2015 Wiley Periodicals, Inc.
Oxygen general saturation after bronchography under general ...
African Journals Online (AJOL)
Thirty-six patients undergoing bronchography or bronchoscopy under general anaesthesia were continuously monitored by pulse oximetry for 5 hours after these procedures. Significant falls in oxygen saturation were observed in the first hour and were of most clinical relevance in patients with preexisting pulmonary ...
On a Generalized Hankel Type Convolution of Generalized Functions
Indian Academy of Sciences (India)
Generalized Hankel type transformation; Parserval relation; generalized ... The classical generalized Hankel type convolution are defined and extended to a class of generalized functions. ... Proceedings – Mathematical Sciences | News.
International Nuclear Information System (INIS)
Moser, D.R.
1986-01-01
Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs
Generally covariant gauge theories
International Nuclear Information System (INIS)
Capovilla, R.
1992-01-01
A new class of generally covariant gauge theories in four space-time dimensions is investigated. The field variables are taken to be a Lie algebra valued connection 1-form and a scalar density. Modulo an important degeneracy, complex [euclidean] vacuum general relativity corresponds to a special case in this class. A canonical analysis of the generally covariant gauge theories with the same gauge group as general relativity shows that they describe two degrees of freedom per space point, qualifying therefore as a new set of neighbors of general relativity. The modification of the algebra of the constraints with respect to the general relativity case is computed; this is used in addressing the question of how general relativity stands out from its neighbors. (orig.)
Generally representative is generally representative: comment on Shuttleworth-Edwards.
Taylor, Nicola
2016-10-01
The aim of this paper is to provide comment on Shuttleworth-Edwards' criticism of the general population norms created for the South African adaptation of the WAIS-IV. In her criticism, she states that the norms are not applicable for any groups in South Africa, based on the fact that the norms were not stratified according to quality of education. A discussion of some of the key issues that impact on the creation of general population norms in the South African context is provided. Demographic characteristics such as education level, quality of education, urban and rural demarcations, and home language are all considered. While the utility of within-group norms is not denied, the adoption of these without reference to the general population is not advised. To recommend that practitioners simply dispense with the general population norm without evidence that it creates misclassification or does not function effectively for the intended population lacks scientific merit at the current time. The need for clinical studies and further predictive validity research using the South African adaptation of the WAIS-IV is crucial to demonstrate the continued utility of the test in the South African context. Additional reference groups will improve the amount of comparative information available for clinicians to be able to make better informed decisions for diagnosis, but the general population norms will be an important starting point in this process.
Generalization of Gibbs Entropy and Thermodynamic Relation
Park, Jun Chul
2010-01-01
In this paper, we extend Gibbs's approach of quasi-equilibrium thermodynamic processes, and calculate the microscopic expression of entropy for general non-equilibrium thermodynamic processes. Also, we analyze the formal structure of thermodynamic relation in non-equilibrium thermodynamic processes.
How general are general source conditions?
International Nuclear Information System (INIS)
Mathé, Peter; Hofmann, Bernd
2008-01-01
Error analysis of regularization methods in Hilbert spaces is based on smoothness assumptions in terms of source conditions. In the traditional setup, i.e. when smoothness is in a power scale, we see that not all elements in the underlying Hilbert space possess some smoothness with this scale. Our main result asserts that this can be overcome when turning to general source conditions defined in terms of index functions. We conclude with some consequences
Energy Technology Data Exchange (ETDEWEB)
Korff, W
1979-01-01
The investigation arose from a concrete challenge: in the decision conflict over the project in Wyhl/Oberrhein, theological ethics was asked for its opinion. Since present ethical theory offers few strategies concerning efficiency and performance, the author turned to traditional models for relevant points of view. The main point of the investigation lies in working out general criteria that make a reasonable, i.e. controllable, decision possible, rather than in the concrete individual results to which applying the stated preconditions leads.
Teachers' Understanding of Algebraic Generalization
Hawthorne, Casey Wayne
Generalization has been identified as a cornerstone of algebraic thinking (e.g., Lee, 1996; Sfard, 1995) and is at the center of a rich conceptualization of K-8 algebra (Kaput, 2008; Smith, 2003). Moreover, mathematics teachers are being encouraged to use figural-pattern generalizing tasks as a basis of student-centered instruction, whereby teachers respond to and build upon the ideas that arise from students' explorations of these activities. Although more and more teachers are engaging their students in such generalizing tasks, little is known about teachers' understanding of generalization and their understanding of students' mathematical thinking in this domain. In this work, I addressed this gap, exploring the understanding of algebraic generalization of 4 exemplary 8th-grade teachers from multiple perspectives. A significant feature of this investigation is an examination of teachers' understanding of the generalization process, including the use of algebraic symbols. The research consisted of two phases. Phase I was an examination of the teachers' understandings of the underlying quantities and quantitative relationships represented by algebraic notation. In Phase II, I observed the instruction of 2 of these teachers. Using the lens of professional noticing of students' mathematical thinking, I explored the teachers' enacted knowledge of algebraic generalization, characterizing how it supported them to effectively respond to the needs and queries of their students. Results indicated that teachers predominantly see these figural patterns as enrichment activities, disconnected from course content. Furthermore, in my analysis, I identified conceptual difficulties teachers experienced when solving generalization tasks, in particular, connecting multiple symbolic representations with the quantities in the figures. Moreover, while the teachers strived to overcome the challenges of connecting different representations, they invoked both productive and unproductive
DEFF Research Database (Denmark)
Hull Kristensen, Peer; Bojesen, Anders
This paper invites discussion of the processes of individualization and organizing carried out under what we might see as an emerging regime of change. The underlying argument is that in certain processes of change, competence becomes questionable at all times. The hazy characteristics of this regime of change are pursued through a discussion of competencies as opposed to qualifications, illustrated by distinct cases from the Danish public sector in the search for repetitive mechanisms. The cases are put into a general perspective by drawing upon experiences from similar change processes in MNCs. The paper concludes by asking whether we can escape from a regime of competence in a world defined by a rhetoric of change and create a more promising world in which doubt and search serve as a strategy for gaining knowledge and professionalism that improve on our capability for mutualism.
Unsupervised Learning and Generalization
DEFF Research Database (Denmark)
Hansen, Lars Kai; Larsen, Jan
1996-01-01
The concept of generalization is defined for a general class of unsupervised learning machines. The generalization error is a straightforward extension of the corresponding concept for supervised learning, and may be estimated empirically using a test set or by statistical means, in close analogy with supervised learning. The empirical and analytical estimates are compared for principal component analysis and for K-means clustering based density estimation.
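The test-set estimate of unsupervised generalization error can be sketched for the K-means case (a minimal illustration, not the authors' experimental setup): fit centroids on training data, then estimate the distortion, i.e. the average squared distance to the nearest centroid, on an independent test set.

```python
import random

random.seed(2)

# Two well-separated 1-D clusters around 0 and 5.
def sample(n):
    return [random.gauss(0.0, 0.5) if random.random() < 0.5
            else random.gauss(5.0, 0.5) for _ in range(n)]

def kmeans(data, k=2, iters=25):
    centers = [min(data), max(data)]          # simple spread-out initialization
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for x in data:
            j = min(range(k), key=lambda j: (x - centers[j]) ** 2)
            groups[j].append(x)
        centers = [sum(g) / len(g) if g else centers[j]
                   for j, g in enumerate(groups)]
    return centers

def distortion(data, centers):
    """Average squared distance to the nearest centroid (the empirical error)."""
    return sum(min((x - c) ** 2 for c in centers) for x in data) / len(data)

train, test = sample(400), sample(400)
centers = kmeans(train)
train_err = distortion(train, centers)   # training-set (empirical) error
test_err = distortion(test, centers)     # test-set estimate of generalization error
```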
Generalization of concurrence vectors
International Nuclear Information System (INIS)
Yu Changshui; Song Heshan
2004-01-01
In this Letter, based on the generalization of concurrence vectors for bipartite pure states, obtained by employing tensor products of generators of the corresponding rotation groups, we generalize concurrence vectors to the case of mixed states. A new criterion of separability of multipartite pure states is given, for which we define a concurrence vector; we generalize the vector to the case of multipartite mixed states and obtain a good measure of free entanglement.
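For orientation, the scalar concurrence that the concurrence-vector construction generalizes has, for a two-qubit pure state |psi> = a|00> + b|01> + c|10> + d|11>, the closed form C = 2|ad - bc|: a maximally entangled Bell state gives C = 1 and a product state gives C = 0. A quick check:

```python
import math

# Concurrence of a two-qubit pure state |psi> = a|00> + b|01> + c|10> + d|11>.
# For pure two-qubit states the concurrence reduces to C = 2|a*d - b*c|.
def concurrence(a, b, c, d):
    return 2.0 * abs(a * d - b * c)

bell = concurrence(1 / math.sqrt(2), 0, 0, 1 / math.sqrt(2))   # maximally entangled
product = concurrence(1.0, 0.0, 0.0, 0.0)                      # separable |00>
```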
General quantum variational calculus
Directory of Open Access Journals (Sweden)
Artur M. C. Brito da Cruz
2018-02-01
Full Text Available We develop a new variational calculus based in the general quantum difference operator recently introduced by Hamza et al. In particular, we obtain optimality conditions for generalized variational problems where the Lagrangian may depend on the endpoints conditions and a real parameter, for the basic and isoperimetric problems, with and without fixed boundary conditions. Our results provide a generalization to previous results obtained for the $q$- and Hahn-calculus.
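The general quantum difference operator of Hamza et al. contains the Hahn operator D_{q,w} f(x) = (f(qx + w) - f(x)) / ((q - 1)x + w) as a special case; the sketch below (illustrative, not the paper's most general operator) checks it on f(x) = x^2, where the Hahn derivative is exactly (q + 1)x + w, and confirms the classical limit q -> 1, w -> 0.

```python
# Hahn difference operator: D_{q,w} f(x) = (f(q*x + w) - f(x)) / ((q - 1)*x + w),
# defined away from the fixed point x = w / (1 - q).
def hahn_derivative(f, x, q, w):
    denom = (q - 1.0) * x + w
    return (f(q * x + w) - f(x)) / denom

q, w = 0.9, 0.1
f = lambda x: x * x
# For f(x) = x^2 the Hahn derivative is exactly (q + 1)*x + w.
val = hahn_derivative(f, 2.0, q, w)          # (0.9 + 1)*2 + 0.1 = 3.9
# As q -> 1 and w -> 0 it approaches the ordinary derivative f'(2) = 4.
near = hahn_derivative(f, 2.0, 0.9999, 1e-6)
```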
Generalized quantum statistics
International Nuclear Information System (INIS)
Chou, C.
1992-01-01
In the paper, a non-anyonic generalization of quantum statistics is presented, in which Fermi-Dirac statistics (FDS) and Bose-Einstein statistics (BES) appear as two special cases. The new quantum statistics, which is characterized by the dimension of its single particle Fock space, contains three consistent parts, namely the generalized bilinear quantization, the generalized quantum mechanical description and the corresponding statistical mechanics
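The role of the single-particle Fock space dimension can be illustrated with a Gentile-type computation (an illustration consistent with, but not identical to, the paper's formalism): capping the occupation of a mode at p particles and computing the mean occupation from the truncated partition sum reproduces Fermi-Dirac statistics at p = 1 and approaches Bose-Einstein statistics as p grows.

```python
import math

def mean_occupation(x, p):
    """Mean occupation of a mode with Boltzmann factor e^{-x}, x = (E - mu)/kT,
    when at most p particles may occupy it (truncated partition sum)."""
    weights = [math.exp(-n * x) for n in range(p + 1)]
    Z = sum(weights)
    return sum(n * w for n, w in zip(range(p + 1), weights)) / Z

x = 1.0
fd = 1.0 / (math.exp(x) + 1.0)   # Fermi-Dirac: the p = 1 special case
be = 1.0 / (math.exp(x) - 1.0)   # Bose-Einstein: the p -> infinity limit
```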
Generalized quasi variational inequalities
Energy Technology Data Exchange (ETDEWEB)
Noor, M.A. [King Saud Univ., Riyadh (Saudi Arabia)]
1996-12-31
In this paper, we establish the equivalence between generalized quasi variational inequalities and generalized implicit Wiener-Hopf equations, using essentially the projection technique. This equivalence is used to suggest and analyze a number of new iterative algorithms for solving generalized quasi variational inequalities and the related complementarity problems. The convergence criteria are also considered. The results proved in this paper represent a significant improvement and refinement of the previously known results.
DEFF Research Database (Denmark)
Jacobsen, Christian Robert Dahl; Nielsen, Morten; Rasmussen, Morten Grud
2017-01-01
Generalized sampling is a numerically stable framework for obtaining reconstructions of signals in different bases and frames from their samples. For example, one can use wavelet bases for reconstruction given frequency measurements. In this paper, we introduce a carefully documented toolbox for performing generalized sampling in Julia. Julia is a new language for technical computing with a focus on performance, which makes it ideally suited to the large problems often encountered in generalized sampling. The toolbox provides specialized solutions for the setup of Fourier bases and wavelets. The performance of the toolbox is compared to existing implementations of generalized sampling in MATLAB.
Directory of Open Access Journals (Sweden)
Gonzalo R Quintana
2010-11-01
Full Text Available An experiment on human predictive learning investigated whether a visual stimulus composed of 2 elements is processed as a whole (configural processing) or as the aggregation of its elements (elemental processing). The experiment was conducted by means of a computer game in which participants had to learn that certain microorganisms (cues) composed of 2 visual features (elements) produced an allergic reaction in fictitious animals. A total of 38 college students learned that 2 microorganisms caused allergy (positive cues) and 2 did not (negative cues). Subsequently, the predictive value that participants assigned to a new microorganism composed of one element of each positive cue, and to another composed of one element of each negative cue, was examined. The predictive value assigned to the new cues was similar to that assigned to the corresponding trained cues when the elements were perceptually separable (size and angle of the figure), indicating elemental processing, but not when they were integral (brightness and saturation), indicating configural processing. This supports the hypothesis that stimulus characteristics determine the type of processing in learning.
Grant, Andrew J; Vermunt, Jan D; Kinnersley, Paul; Houston, Helen
2007-03-30
Portfolio learning enables students to collect evidence of their learning. Component tasks making up a portfolio can be devised that relate directly to intended learning outcomes. Reflective tasks can stimulate students to recognise their own learning needs. Assessment of portfolios using a rating scale relating to intended learning outcomes offers high content validity. This study evaluated a reflective portfolio used during a final-year attachment in general practice (family medicine). Students were asked to evaluate the portfolio (which used significant event analysis as a basis for reflection) as a learning tool. The validity and reliability of the portfolio as an assessment tool were also measured.

81 final-year medical students completed reflective significant event analyses as part of a portfolio created during a three-week attachment (clerkship) in general practice (family medicine). As well as two reflective significant event analyses, each portfolio contained an audit and a health needs assessment. Portfolios were marked three times: by the student's GP teacher, by the course organiser, and by another teacher in the university department of general practice. Inter-rater reliability between pairs of markers was calculated. A questionnaire enabled the students' experience of portfolio learning to be determined.

Benefits to learning from reflective learning were limited. Students said that they thought more about the patients they wrote up in significant event analyses, but information as to the nature and effect of this was not forthcoming. Moderate inter-rater reliability (Spearman's rho = 0.65) was found between pairs of departmental raters dealing with larger numbers (20-60) of portfolios. Inter-rater reliability of marking involving GP tutors who only marked 1-3 portfolios was very low. Students rated highly their mentoring relationship with their GP teacher but found the portfolio tasks time-consuming. The inter-rater reliability observed in this study should
International Nuclear Information System (INIS)
Ridgely, Charles T
2010-01-01
Many textbooks dealing with general relativity do not demonstrate the derivation of forces in enough detail. The analyses presented herein demonstrate straightforward methods for computing forces by way of general relativity. Covariant divergence of the stress-energy-momentum tensor is used to derive a general expression of the force experienced by an observer in general coordinates. The general force is then applied to the local co-moving coordinate system of a uniformly accelerating observer, leading to an expression of the inertial force experienced by the observer. Next, applying the general force in Schwarzschild coordinates is shown to lead to familiar expressions of the gravitational force. As a more complex demonstration, the general force is applied to an observer in Boyer-Lindquist coordinates near a rotating, Kerr black hole. It is then shown that when the angular momentum of the black hole goes to zero, the force on the observer reduces to the force on an observer held stationary in Schwarzschild coordinates. As a final consideration, the force on an observer moving in rotating coordinates is derived. Expressing the force in terms of Christoffel symbols in rotating coordinates leads to familiar expressions of the centrifugal and Coriolis forces on the observer. It is envisioned that the techniques presented herein will be most useful to graduate level students, as well as those undergraduate students having experience with general relativity and tensor analysis.
Generalizing: The descriptive struggle
Directory of Open Access Journals (Sweden)
Barney G. Glaser, Ph.D.; Hon Ph.D.
2006-11-01
Full Text Available The literature is not kind to the use of descriptive generalizations. Authors struggle and struggle to find and rationalize a way to use them and then fail in spite of trying a myriad of work-arounds. And then we have Lincoln and Guba's famous statement, in referring to qualitative research: "The only generalization is: there is no generalization" (op. cit., p. 110). They are referring to routine QDA yielding extensive descriptions, which tacitly include conceptual generalizations without any real thought of knowledge about them. In this chapter I wish to explore this struggle for the purpose of explaining that the various contra arguments to using descriptive generalizations DO NOT apply to the ease of using conceptual generalizations yielded in SGT and especially FGT. I will not argue for the use of descriptive generalization. I agree with Lincoln and Guba with respect to QDA: "the only generalization is: there is no generalization." It is up to the QDA methodologists, of whom there are many, to continue the struggle, and I wish them well.
International Nuclear Information System (INIS)
Rosen, N.
1979-01-01
A modification of general relativity is proposed involving a second metric tensor describing a space-time of constant curvature and associated with the fundamental rest-frame of the universe. The theory generally agrees with the Einstein theory, but gives cosmological models without singularities which can account for present observation, including helium abundance
International Nuclear Information System (INIS)
Papoyan, V.V.
1989-01-01
A generalized Kerr solution for the stationary axially-symmetric gravitational field of rotating self-gravitating objects is given. For solving the problem, the Einstein equations and their combinations are used. The particular cases of the internal and external Schwarzschild solutions are considered. The external solution of the stationary problem is a generalization of the Kerr solution. 3 refs
Marsee, Stuart
After reviewing definitions of general education and statements regarding its importance found in the literature, this paper presents observations to be considered in updating or developing general education programs. It is observed that many disciplines have developed excessive departmentalization; that administrators tend to view general…
Generalized elementary functions
Czech Academy of Sciences Publication Activity Database
Monteiro, Giselle Antunes; Slavík, A.
2014-01-01
Roč. 411, č. 2 (2014), s. 838-852 ISSN 0022-247X Institutional support: RVO:67985840 Keywords : elementary functions * Kurzweil-Stieltjes integral * generalized linear ordinary differential equations * time scale calculus Subject RIV: BA - General Mathematics Impact factor: 1.120, year: 2014 http://www.sciencedirect.com/science/article/pii/S0022247X13009141
DEFF Research Database (Denmark)
Glückstad, Jesper; Palima, Darwin
Generalized Phase Contrast elevates the phase contrast technique not only to improve phase imaging but also to cross over and interface with diverse and seemingly disparate fields of contemporary optics and photonics. This book presents a comprehensive introduction to the Generalized Phase Contrast...
Quantity Constrained General Equilibrium
Babenko, R.; Talman, A.J.J.
2006-01-01
In a standard general equilibrium model it is assumed that there are no price restrictions and that prices adjust infinitely fast to their equilibrium values. In the case of price restrictions a general equilibrium may not exist, and rationing on net demands or supplies is needed to clear the markets.
Generalized connectivity of graphs
Li, Xueliang
2016-01-01
Noteworthy results, proof techniques, open problems and conjectures in generalized (edge-) connectivity are discussed in this book. Both theoretical and practical analyses for generalized (edge-) connectivity of graphs are provided. Topics covered in this book include: generalized (edge-) connectivity of graph classes, algorithms, computational complexity, sharp bounds, Nordhaus-Gaddum-type results, maximum generalized local connectivity, extremal problems, random graphs, multigraphs, relations with the Steiner tree packing problem and generalizations of connectivity. This book enables graduate students to understand and master a segment of graph theory and combinatorial optimization. Researchers in graph theory, combinatorics, combinatorial optimization, probability, computer science, discrete algorithms, complexity analysis, network design, and the information transferring models will find this book useful in their studies.
2010-07-01
28 CFR 33.60 - General. Judicial Administration; DEPARTMENT OF JUSTICE; BUREAU OF JUSTICE ASSISTANCE GRANT PROGRAMS; Criminal Justice Block Grants; Submission and Review of Applications § 33.60 General. This subpart describes the process and criteria for...
2010-04-01
23 CFR 710.301 - General. Highways; FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION; RIGHT-OF-WAY AND ENVIRONMENT; RIGHT-OF-WAY AND REAL ESTATE; Project Development § 710.301 General. The project development process typically follows a...
Experienced General Music Teachers' Instructional Decision Making
Johnson, Daniel C.; Matthews, Wendy K.
2017-01-01
The purpose of this descriptive study was to explore experienced general music teachers' decision-making processes. Participants included seven experienced, American general music teachers who contributed their views during two phases of data collection: (1) responses to three classroom scenarios; and (2) in-depth, semi-structured, follow-up…
2010-07-01
Judicial Administration; DEPARTMENT OF JUSTICE (CONTINUED); EMERGENCY FEDERAL LAW ENFORCEMENT ASSISTANCE; Submission and Review of Applications § 65.40 General. This subpart describes the process and criteria for the Attorney General's review and approval or disapproval of state applications. The original application, on Standard...
40 CFR 434.11 - General definitions.
2010-07-01
General Provisions § 434.11 General definitions. (a) The term "acid or ferruginous mine drainage" means mine drainage which, before any treatment, either has a pH of less than 6.0 or a total iron... processes within a coal preparation plant. (h) The term "mine drainage" means any drainage, and any water...
1983-05-01
The VDE system developed had the capability of recognizing up to 248 separate words in syntactic structures. The two systems described are isolated... Contents include: ...and Speaker Recognition, by M. J. Hunt; Assessment of Speech Systems, by R. K. Moore; A Survey of Current Equipment and Research, by J. S. Bridle; ...Technology in Navy Training Systems, by R. Breaux, M. Blind and R. Lynchard; General Review of Military Applications of Voice Processing, by Dr. Bruno...
Inductive, Analogical, and Communicative Generalization
Directory of Open Access Journals (Sweden)
Adri Smaling
2003-03-01
Full Text Available Three forms of inductive generalization (statistical generalization, variation-based generalization, and theory-carried generalization) are insufficient for case-to-case generalization, which is a form of analogical generalization. The quality of case-to-case generalization needs to be reinforced by setting up explicit analogical argumentation. Six criteria for evaluating analogical argumentation are discussed. Good analogical reasoning is an indispensable support to forms of communicative generalization, both receptive and responsive (participative) generalization, as well as to exemplary generalization.
Applications of Fourier transforms to generalized functions
Rahman, M
2011-01-01
This book explains how Fourier transforms can be applied to generalized functions. The theory of generalized functions is an important branch of mathematics applicable in many practical fields; its applications to distribution theory and signal processing are especially important. The Fourier transform is a mathematical procedure that can be thought of as transforming a function from its time domain to the frequency domain. The book contains six chapters and three appendices. Chapter 1 deals with preliminary remarks on Fourier series from a general point of view and also contains an introduction to the first generalized function. Chapter 2 is concerned with generalized functions and their Fourier transforms. Chapter 3 contains the Fourier transforms of particular generalized functions. The author has stated and proved 18 formulas dealing with the Fourier transforms of generalized functions, and demonstrated some important problems of practical interest. Chapter 4 deals with the asymptotic esti...
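The time-to-frequency mapping described above, and its extension to generalized functions, can be written out concretely. The sign and normalization convention below is an assumption (the book may use another):

```latex
F(\omega) \;=\; \int_{-\infty}^{\infty} f(t)\, e^{-i\omega t}\, dt ,
\qquad\text{e.g.}\qquad
\mathcal{F}[\delta](\omega) \;=\; \int_{-\infty}^{\infty} \delta(t)\, e^{-i\omega t}\, dt \;=\; 1 .
```

The Dirac delta is the standard first example of a generalized function whose Fourier transform exists only in the distributional sense: its transform is the constant function 1.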
Context-dependent Generalization
Directory of Open Access Journals (Sweden)
Jordan A Taylor
2013-05-01
Full Text Available The pattern of generalization following motor learning can provide a probe of the neural mechanisms underlying learning. For example, the breadth of generalization to untrained regions of space after visuomotor adaptation to targets in a restricted region of space has been attributed to the directional tuning properties of neurons in the motor system. Building on this idea, the effect of different types of perturbations on generalization (e.g., rotation versus visual translation) has been attributed to the selection of differentially tuned populations. Overlooked in this discussion is consideration of how the context of the training environment may constrain generalization. Here, we explore the role of context by having participants learn a visuomotor rotation or a translational shift in two different contexts, one in which the array of targets was presented in a circular arrangement and the other in which it was presented in a rectilinear arrangement. The perturbation and environment were either consistent (e.g., rotation with a circular arrangement) or inconsistent (e.g., rotation with a rectilinear arrangement). The pattern of generalization across the workspace was much more dependent on the context of the environment than on the perturbation, with broad generalization for the rectilinear arrangement for both types of perturbations. Moreover, the generalization pattern for this context was evident even when the perturbation was introduced in a gradual manner, precluding the use of an explicit strategy. We describe how current models of generalization might be modified to incorporate these results, building on the idea that context provides a strong bias for how the motor system infers the nature of the visuomotor perturbation and, in turn, how this information influences the pattern of generalization.
Genesereth, Michael
2014-01-01
General game players are computer systems able to play strategy games based solely on formal game descriptions supplied at "runtime" (in other words, they don't know the rules until the game starts). Unlike specialized game players, such as Deep Blue, general game players cannot rely on algorithms designed in advance for specific games; they must discover such algorithms themselves. General game playing expertise depends on intelligence on the part of the game player and not just intelligence of the programmer of the game player. GGP is an interesting application in its own right. It is intell...
Generalized estimating equations
Hardin, James W
2002-01-01
Although powerful and flexible, the method of generalized linear models (GLM) is limited in its ability to accurately deal with longitudinal and clustered data. Developed specifically to accommodate these data types, the method of Generalized Estimating Equations (GEE) extends the GLM algorithm to accommodate the correlated data encountered in health research, social science, biology, and other related fields.Generalized Estimating Equations provides the first complete treatment of GEE methodology in all of its variations. After introducing the subject and reviewing GLM, the authors examine th
Nagata, J-I
1985-01-01
This classic work has been fundamentally revised to take account of recent developments in general topology. The first three chapters remain unchanged except for numerous minor corrections and additional exercises, but chapters IV-VII and the new chapter VIII cover the rapid changes that have occurred since 1968 when the first edition appeared.The reader will find many new topics in chapters IV-VIII, e.g. theory of Wallmann-Shanin's compactification, realcompact space, various generalizations of paracompactness, generalized metric spaces, Dugundji type extension theory, linearly ordered topolo
Jiang, Shu-Han; Xu, Zhen-Peng; Su, Hong-Yi; Pati, Arun Kumar; Chen, Jing-Ling
2018-01-01
Here, we present the most general framework for n-particle Hardy's paradoxes, which include Hardy's original one and Cereceda's extension as special cases. Remarkably, for any n ≥ 3, we demonstrate that there always exist generalized paradoxes (with the success probability as high as 1/2^(n-1)) that are stronger than the previous ones in showing the conflict of quantum mechanics with local realism. An experimental proposal to observe the stronger paradox is also presented for the case of three qubits. Furthermore, from these paradoxes we can construct the most general Hardy's inequalities, which enable us to detect Bell's nonlocality for more quantum states.
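The quoted success probability 1/2^(n-1) can be tabulated directly; a tiny sketch (the function name is illustrative, the closed form is the one stated in the abstract):

```python
def hardy_success_probability(n: int) -> float:
    """Success probability 1/2**(n-1) of the generalized n-particle
    Hardy paradox, as quoted in the abstract (stated for n >= 3)."""
    if n < 3:
        raise ValueError("the generalized paradox is stated for n >= 3")
    return 1 / 2 ** (n - 1)

for n in range(3, 7):
    print(n, hardy_success_probability(n))
```

For three qubits this gives 1/4, already well above the ~9% maximum of Hardy's original two-particle paradox.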
Generalizations of orthogonal polynomials
Bultheel, A.; Cuyt, A.; van Assche, W.; van Barel, M.; Verdonk, B.
2005-07-01
We give a survey of recent generalizations of orthogonal polynomials. That includes multidimensional (matrix and vector orthogonal polynomials) and multivariate versions, multipole (orthogonal rational functions) variants, and extensions of the orthogonality conditions (multiple orthogonality). Most of these generalizations are inspired by the applications in which they are applied. We also give a glimpse of these applications, which are usually generalizations of applications where classical orthogonal polynomials also play a fundamental role: moment problems, numerical quadrature, rational approximation, linear algebra, recurrence relations, and random matrices.
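As one concrete instance of the recurrence relations mentioned above, the classical Legendre polynomials satisfy the three-term recurrence (n+1) P_{n+1}(x) = (2n+1) x P_n(x) - n P_{n-1}(x). This standard textbook fact (not taken from the survey itself) can be sketched as:

```python
def legendre(n: int, x: float) -> float:
    """Evaluate the Legendre polynomial P_n(x) via the classical
    three-term recurrence (n+1) P_{n+1} = (2n+1) x P_n - n P_{n-1}."""
    if n == 0:
        return 1.0
    p_prev, p = 1.0, x  # P_0 and P_1
    for k in range(1, n):
        p_prev, p = p, ((2 * k + 1) * x * p - k * p_prev) / (k + 1)
    return p

print(legendre(2, 0.5))  # P_2(0.5) = (3*0.25 - 1)/2 = -0.125
```

The generalizations surveyed (matrix, vector, and multiple orthogonality) replace the scalar coefficients in such recurrences with matrix or multi-index analogues.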
Uranium enrichment. Enrichment processes
International Nuclear Information System (INIS)
Alexandre, M.; Quaegebeur, J.P.
2009-01-01
Despite the remarkable progress made in the diversity and efficiency of the different uranium enrichment processes, only two industrial processes remain today that satisfy all enriched uranium needs: gaseous diffusion and centrifugation. This article describes both processes and some others still at the demonstration or laboratory stage of development: 1 - general considerations; 2 - gaseous diffusion: physical principles, implementation, utilisation in the world; 3 - centrifugation: principles, elementary separation factor, flows inside a centrifuge, modeling of separation efficiencies, mechanical design, types of industrial centrifuges, realisation of cascades, main characteristics of the centrifugation process; 4 - aerodynamic processes: vortex process, nozzle process; 5 - chemical exchange separation processes: Japanese ASAHI process, French CHEMEX process; 6 - laser-based processes: SILVA process, SILMO process; 7 - electromagnetic and ionic processes: mass spectrometer and calutron, ion cyclotron resonance, rotating plasmas; 8 - thermal diffusion; 9 - conclusion. (J.S.)
GVS - GENERAL VISUALIZATION SYSTEM
Keith, S. R.
1994-01-01
The primary purpose of GVS (General Visualization System) is to support scientific visualization of data output by the panel method PMARC_12 (inventory number ARC-13362) on the Silicon Graphics Iris computer. GVS allows the user to view PMARC geometries and wakes as wire frames or as light shaded objects. Additionally, geometries can be color shaded according to phenomena such as pressure coefficient or velocity. Screen objects can be interactively translated and/or rotated to permit easy viewing. Keyframe animation is also available for studying unsteady cases. The purpose of scientific visualization is to allow the investigator to gain insight into the phenomena they are examining, therefore GVS emphasizes analysis, not artistic quality. GVS uses existing IRIX 4.0 image processing tools to allow for conversion of SGI RGB files to other formats. GVS is a self-contained program which contains all the necessary interfaces to control interaction with PMARC data. This includes 1) the GVS Tool Box, which supports color histogram analysis, lighting control, rendering control, animation, and positioning, 2) GVS on-line help, which allows the user to access control elements and get information about each control simultaneously, and 3) a limited set of basic GVS data conversion filters, which allows for the display of data requiring simpler data formats. Specialized controls for handling PMARC data include animation and wakes, and visualization of off-body scan volumes. GVS is written in C-language for use on SGI Iris series computers running IRIX. It requires 28Mb of RAM for execution. Two separate hardcopy documents are available for GVS. The basic document price for ARC-13361 includes only the GVS User's Manual, which outlines major features of the program and provides a tutorial on using GVS with PMARC_12 data. Programmers interested in modifying GVS for use with data in formats other than PMARC_12 format may purchase a copy of the draft GVS 3.1 Software Maintenance
Anomaly General Circulation Models.
Navarra, Antonio
The feasibility of the anomaly model is assessed using barotropic and baroclinic models. In the barotropic case, both a stationary and a time-dependent model have been formulated and constructed, whereas only the stationary, linear case is considered for the baroclinic model. Results from the barotropic model indicate that a relation exists between the stationary solution and the time-averaged non-linear solution. The stationary linear baroclinic solution can therefore be considered with some confidence. The linear baroclinic anomaly model poses a formidable mathematical problem because a gigantic linear system must be solved to obtain the solution. A new method for finding solutions of large linear systems, based on a projection onto the Krylov subspace, is shown to be successful when applied to the linearized baroclinic anomaly model. The scheme consists of projecting the original linear system onto the Krylov subspace, thereby reducing the dimensionality of the matrix to be inverted to obtain the solution. With an appropriate setting of the damping parameters, the iterative Krylov method reaches a solution even using a Krylov subspace ten times smaller than the original space of the problem. This generality allows the treatment of the important problem of linear waves in the atmosphere. A larger class of basic states (non-zonally symmetric) can now be treated for the baroclinic primitive equations. These problems lead to large unsymmetric linear systems of order 10,000 and more, which can now be successfully tackled by the Krylov method. The (R7) linear anomaly model is used to investigate extensively the linear response to equatorial and mid-latitude prescribed heating. The results indicate that the solution is deeply affected by the presence of the stationary waves in the basic state. The instability of the asymmetric flows, first pointed out by Simmons et al. (1983), is also active in the baroclinic case. However, the presence of baroclinic processes modifies the
Energy Technology Data Exchange (ETDEWEB)
Kamada, Kohei [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Kobayashi, Tsutomu [Kyoto Univ. (Japan). Hakubi Center; Kyoto Univ. (Japan). Dept. of Physics; Takahashi, Tomo [Saga Univ. (Japan). Dept. of Physics; Yamaguchi, Masahide [Tokyo Institute of Technology (Japan). Dept. of Physics; Yokoyama, Jun'ichi [Tokyo Univ. (Japan). Research Center for the Early Universe (RESCEU); Tokyo Univ., Chiba (Japan). Inst. for the Physics and Mathematics of the Universe (IPMU)]
2012-03-15
We study Higgs inflation in the context of generalized G-inflation, i.e., the most general single-field inflation model with second-order field equations. The four variants of Higgs inflation proposed so far in the literature can be accommodated at one time in our framework. We also propose yet another class of Higgs inflation, the running Einstein inflation model, that can naturally arise from the generalized G-inflation framework. As a result, five Higgs inflation models in all should be discussed on an equal footing. Concise formulas for primordial fluctuations in these generalized Higgs inflation models are provided, which will be helpful to determine which model is favored from the future experiments and observations such as the Large Hadron Collider and the Planck satellite.
Engineering general intelligence
Goertzel, Ben; Geisweiller, Nil
2014-01-01
The work outlines a novel conceptual and theoretical framework for understanding Artificial General Intelligence and based on this framework outlines a practical roadmap for the development of AGI with capability at the human level and ultimately beyond.
Engineering general intelligence
Goertzel, Ben; Geisweiller, Nil
2014-01-01
The work outlines a detailed blueprint for the creation of an Artificial General Intelligence system with capability at the human level and ultimately beyond, according to the Cog Prime AGI design and the Open Cog software architecture.
Generalizing smooth transition autoregressions
DEFF Research Database (Denmark)
Chini, Emilio Zanetti
We introduce a variant of the smooth transition autoregression, the GSTAR model, capable of parametrizing the asymmetry in the tails of the transition equation by using a particular generalization of the logistic function. A General-to-Specific modelling strategy is discussed in detail, with particular emphasis on two different LM-type tests for the null of symmetric adjustment towards a new regime and three diagnostic tests, whose power properties are explored via Monte Carlo experiments. Four classical real datasets illustrate the empirical properties of the GSTAR, jointly to a rolling...
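The abstract does not spell out the particular generalization of the logistic function used by the GSTAR, so the sketch below is only illustrative: a standard symmetric logistic STAR transition plus a hypothetical Richards-type skew exponent `alpha` to show how tail asymmetry can be parametrized. All names and the functional form are assumptions, not the authors' specification.

```python
import math

def logistic_transition(s: float, gamma: float, c: float) -> float:
    """Standard symmetric STAR transition function G(s) in (0, 1):
    gamma controls the speed of transition, c its location."""
    return 1.0 / (1.0 + math.exp(-gamma * (s - c)))

def skewed_transition(s: float, gamma: float, c: float, alpha: float) -> float:
    """Hypothetical asymmetric generalization: raising G to a power
    alpha skews the tails; alpha = 1 recovers the symmetric logistic."""
    return logistic_transition(s, gamma, c) ** alpha
```

With alpha = 1 the adjustment toward the new regime is symmetric (the null of the LM-type tests mentioned above); alpha != 1 makes the approach to one regime slower than to the other.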
Theoretical general relativity: 1979
International Nuclear Information System (INIS)
Bergmann, O.
1979-01-01
The metric and field equations of Einstein's general relativity theory are written down. Solutions to the equations are discussed. Connection is made between relativity theory and elementary particle theory. Possibilities for a unified field theory are considered
Jackson, A. T.
1973-01-01
Reviews theoretical and experimental fundamentals of Einstein's theory of general relativity. Indicates that recent development of the theory of the continually expanding universe may lead to revision of the space-time continuum of the finite and unbounded universe. (CC)
General relativity and experiment
Damour, T.
1994-01-01
The confrontation between Einstein's theory of gravitation and experiment is summarized. Although all current experimental data are compatible with general relativity, the importance of pursuing the quest for possible deviations from Einstein's theory is emphasized.
Chemical Speciation - General Information
This page includes general information about the Chemical Speciation Network that is not covered on the main page. Commonly visited documents, including calendars, site lists, and historical files for the program, are listed here.
Superstability of Generalized Derivations
Directory of Open Access Journals (Sweden)
Ansari-Piri Esmaeil
2010-01-01
Full Text Available We investigate the superstability of the functional equation , where and are the mappings on Banach algebra . We have also proved the superstability of generalized derivations associated to the linear functional equation , where .
Czech Academy of Sciences Publication Activity Database
Zuevsky, Alexander
2016-01-01
Roč. 8, č. 3 (2016), s. 225-231 ISSN 1942-5600 Institutional support: RVO:67985840 Keywords : modular discriminant * Fay's trisecant identities * modular forms Subject RIV: BA - General Mathematics OBOR OECD: Pure mathematics
Read, Andrew F.
2013-01-01
General education must develop in students an appreciation of the power of science, how it works, why it is an effective knowledge generation tool, and what it can deliver. Knowing what science has discovered is desirable but less important.
Tuberculosis: General Information
Tuberculosis (TB) is a disease caused by germs that are spread from person ... (fact sheet from the Division of Tuberculosis Elimination)
General Chemistry for Engineers.
Kybett, B. D.
1982-01-01
Discusses the relationship between molecular structure, intermolecular forces, and tensile strengths of a polymer and suggests that this is a logical way to introduce polymers into a general chemistry course. (Author/JN)
National Research Council Canada - National Science Library
Elias, Bart
2005-01-01
General aviation (GA) -- a catch-all category that includes about 57% of all civilian aviation activity within the United States -- encompasses a wide range of airports, aircraft, and flight operations...
International Nuclear Information System (INIS)
Majumdar, S.
1988-09-01
A simple proof of Magnus' Freiheitssatz for one-relator groups is given, which can easily be applied, with slight modifications, to prove a generalization of the theorem to any number of relators. (author). 10 refs
Generalized concatenated quantum codes
International Nuclear Information System (INIS)
Grassl, Markus; Shor, Peter; Smith, Graeme; Smolin, John; Zeng Bei
2009-01-01
We discuss the concept of generalized concatenated quantum codes. This generalized concatenation method provides a systematical way for constructing good quantum codes, both stabilizer codes and nonadditive codes. Using this method, we construct families of single-error-correcting nonadditive quantum codes, in both binary and nonbinary cases, which not only outperform any stabilizer codes for finite block length but also asymptotically meet the quantum Hamming bound for large block length.
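The quantum Hamming bound mentioned above is easy to state concretely for binary ((n, K)) codes correcting t errors: K multiplied by the number of correctable Pauli error patterns must fit in the 2^n-dimensional Hilbert space. A small checker (a sketch for the binary case only; the paper also treats nonbinary codes):

```python
from math import comb

def quantum_hamming_ok(n: int, K: int, t: int = 1) -> bool:
    """Quantum Hamming bound for a binary ((n, K)) code correcting t errors:
    each of the K codewords needs room for every <= t-qubit Pauli error
    (3 nontrivial single-qubit Paulis), so K * sum_j C(n, j) * 3**j
    must not exceed 2**n."""
    return K * sum(comb(n, j) * 3**j for j in range(t + 1)) <= 2**n
```

The [[5,1,3]] code (n = 5, K = 2, t = 1) meets the bound with equality: 2 * (1 + 15) = 32 = 2^5, which is the "perfect" case the asymptotic statement in the abstract refers to.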
Ray, J. R.
1982-01-01
Two theories of matter in general relativity, the fluid theory and the kinetic theory, were studied. Results include: (1) a discussion of various methods of completing the fluid equations; (2) a method of constructing charged general relativistic solutions in kinetic theory; and (3) a proof and discussion of the incompatibility of perfect fluid solutions in anisotropic cosmologies. Interpretations of NASA gravitational experiments using the above mentioned results were started. Two papers were prepared for publications based on this work.
International Nuclear Information System (INIS)
Jonsson, Rickard; Westman, Hans
2006-01-01
We show that by employing the standard projected curvature as a measure of spatial curvature, we can make a certain generalization of optical geometry (Abramowicz M A and Lasota J-P 1997 Class. Quantum Grav. A 14 23-30). This generalization applies to any spacetime that admits a hypersurface orthogonal shearfree congruence of worldlines. This is a somewhat larger class of spacetimes than the conformally static spacetimes assumed in standard optical geometry. In the generalized optical geometry, which in the generic case is time dependent, photons move with unit speed along spatial geodesics and the sideways force experienced by a particle following a spatially straight line is independent of the velocity. Also gyroscopes moving along spatial geodesics do not precess (relative to the forward direction). Gyroscopes that follow a curved spatial trajectory precess according to a very simple law of three-rotation. We also present an inertial force formalism in coordinate representation for this generalization. Furthermore, we show that by employing a new sense of spatial curvature (Jonsson R 2006 Class. Quantum Grav. 23 1) closely connected to Fermat's principle, we can make a more extensive generalization of optical geometry that applies to arbitrary spacetimes. In general this optical geometry will be time dependent, but still geodesic photons move with unit speed and follow lines that are spatially straight in the new sense. Also, the sideways experienced (comoving) force on a test particle following a line that is straight in the new sense will be independent of the velocity.
A generalized wavelet extrema representation
Energy Technology Data Exchange (ETDEWEB)
Lu, Jian; Lades, M.
1995-10-01
The wavelet extrema representation originated by Stephane Mallat is a unique framework for low-level and intermediate-level (feature) processing. In this paper, we present a new form of wavelet extrema representation generalizing Mallat's original work. The generalized wavelet extrema representation is a feature-based multiscale representation. For a particular choice of wavelet, our scheme can be interpreted as representing a signal or image by its edges, and peaks and valleys at multiple scales. Such a representation is shown to be stable -- the original signal or image can be reconstructed with very good quality. It is further shown that a signal or image can be modeled as piecewise monotonic, with all turning points between monotonic segments given by the wavelet extrema. A new projection operator is introduced to enforce piecewise monotonicity of a signal in its reconstruction. This leads to an enhancement to previously developed algorithms in preventing artifacts in the reconstructed signal.
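Mallat's scheme uses the extrema of a smoothed-derivative wavelet transform and reconstructs by alternating projections; as a toy sketch of the core idea only, here is a naive Haar-style multiscale decomposition whose detail extrema mark edges (the function names and the averaging scheme are illustrative, not the paper's algorithm):

```python
def haar_details(signal, levels=3):
    """Naive multiscale decomposition: first differences (details) of
    successively smoothed versions of the signal, one list per scale."""
    details, approx = [], list(signal)
    for _ in range(levels):
        details.append([approx[i + 1] - approx[i] for i in range(len(approx) - 1)])
        approx = [(approx[i] + approx[i + 1]) / 2 for i in range(len(approx) - 1)]
    return details

def local_extrema(xs):
    """Indices where |xs| peaks locally -- edge candidates at this scale."""
    return [i for i in range(1, len(xs) - 1)
            if abs(xs[i]) > abs(xs[i - 1]) and abs(xs[i]) >= abs(xs[i + 1])]
```

For a step signal, the single detail extremum at each scale sits at the jump, which is the "representation by edges" intuition the abstract appeals to.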
Multivariate Generalized Multiscale Entropy Analysis
Directory of Open Access Journals (Sweden)
Anne Humeau-Heurtier
2016-11-01
Full Text Available Multiscale entropy (MSE was introduced in the 2000s to quantify systems’ complexity. MSE relies on (i a coarse-graining procedure to derive a set of time series representing the system dynamics on different time scales; (ii the computation of the sample entropy for each coarse-grained time series. A refined composite MSE (rcMSE—based on the same steps as MSE—also exists. Compared to MSE, rcMSE increases the accuracy of entropy estimation and reduces the probability of inducing undefined entropy for short time series. The multivariate versions of MSE (MMSE and rcMSE (MrcMSE have also been introduced. In the coarse-graining step used in MSE, rcMSE, MMSE, and MrcMSE, the mean value is used to derive representations of the original data at different resolutions. A generalization of MSE was recently published, using the computation of different moments in the coarse-graining procedure. However, so far, this generalization only exists for univariate signals. We therefore herein propose an extension of this generalized MSE to multivariate data. The multivariate generalized algorithms of MMSE and MrcMSE presented herein (MGMSE and MGrcMSE, respectively are first analyzed through the processing of synthetic signals. We reveal that MGrcMSE shows better performance than MGMSE for short multivariate data. We then study the performance of MGrcMSE on two sets of short multivariate electroencephalograms (EEG available in the public domain. We report that MGrcMSE may show better performance than MrcMSE in distinguishing different types of multivariate EEG data. MGrcMSE could therefore supplement MMSE or MrcMSE in the processing of multivariate datasets.
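The coarse-graining step the abstract describes is simple to state: cut the series into non-overlapping windows of length equal to the scale, and summarize each window by a moment. The classical MSE/MMSE family uses the mean; the generalized variants swap in higher moments such as the variance. A minimal sketch (function names are illustrative):

```python
import statistics

def coarse_grain(x, scale, moment="mean"):
    """Coarse-grain a series at a given scale by a chosen moment.

    moment="mean"      -> classical MSE/MMSE coarse-graining
    moment="variance"  -> generalized coarse-graining by the second moment
    """
    windows = [x[i * scale:(i + 1) * scale] for i in range(len(x) // scale)]
    if moment == "mean":
        return [statistics.fmean(w) for w in windows]
    if moment == "variance":
        return [statistics.pvariance(w) for w in windows]
    raise ValueError(f"unknown moment: {moment}")
```

The sample entropy of each coarse-grained series (one per scale) then gives the multiscale entropy curve; the refined-composite variants additionally average entropy estimates over shifted coarse-grainings to stabilize short-series estimates.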
A generalized perturbation program for CANDU reactor
Energy Technology Data Exchange (ETDEWEB)
Kim, Do Heon; Kim, Jong Kyung [Hanyang University, Seoul (Korea, Republic of); Choi, Hang Bok; Roh, Gyu Hong [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of); Yang, Won Sik [Chosun University, Kwangju (Korea, Republic of)
1999-12-31
A generalized perturbation program has been developed for the purpose of estimating zonal power variation of a CANDU reactor upon refueling operation. The forward and adjoint calculation modules of RFSP code were used to construct the generalized perturbation program. The numerical algorithm for the generalized adjoint flux calculation was verified by comparing the zone power estimates upon refueling with those of forward calculation. It was, however, noticed that the truncation error from the iteration process of the generalized adjoint flux is not negligible. 2 refs., 1 figs., 1 tab. (Author)
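The underlying idea of generalized perturbation theory is to estimate the response to a small operator change from forward and adjoint solutions instead of re-solving the whole problem. For a self-adjoint toy case (where forward and adjoint modes coincide) the first-order eigenvalue shift is the inner product of the eigenvector with the perturbed operator applied to it; a sketch with small dense matrices standing in for the flux equations (this is not the RFSP algorithm):

```python
def power_iteration(A, iters=200):
    """Dominant eigenpair of a small symmetric matrix by power iteration."""
    n = len(A)
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    lam = sum(v[i] * sum(A[i][j] * v[j] for j in range(n)) for i in range(n))
    return lam, v

def first_order_shift(dA, v):
    """First-order eigenvalue change <v, dA v> for a normalized eigenvector v
    (self-adjoint case: forward and adjoint eigenvectors coincide)."""
    n = len(v)
    return sum(v[i] * sum(dA[i][j] * v[j] for j in range(n)) for i in range(n))
```

The truncation error noted in the abstract corresponds, in this simplified picture, to stopping the iterative solve for the (generalized) adjoint before the first-order estimate has converged.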
General knowledge structure for diagnosis
International Nuclear Information System (INIS)
Steinar Brendeford, T.
1996-01-01
At the OECD Halden Reactor Project work has been going on for several years in the field of automatic fault diagnosis for nuclear power plants. Continuing this work, studies are now carried out to combine different diagnostic systems within the same framework. The goal is to establish a general knowledge structure for diagnosis applied to a NPP process. Such a consistent and generic storage of knowledge will lighten the task of combining different diagnosis techniques. An integration like this is expected to increase the robustness and widen the scope of the diagnosis. Further, verification of system reliability and on-line explanations of hypotheses can be helped. Last but not least there is a potential in reuse of both specific and generic knowledge. The general knowledge framework is also a prerequisite for a successful integration of computerized operator support systems within the process supervision and control complex. Consistency, verification and reuse are keywords also in this respect. Systems that should be considered for integration are; automatic control, computerized operator procedures, alarm - and alarm filtering, signal validation, diagnosis and condition based maintenance. This paper presents three prototype diagnosis systems developed at the OECD Halden Reactor Project. A software arrangement for process simulation with these three systems attached in parallel is briefly described. The central part of this setup is a 'blackboard' system to be used for representing shared knowledge. Examples of such knowledge representations are included in the paper. The conclusions so far in this line of work are only tentative. The studies of existing methodologies for diagnosis, however, show a potential for several generalizations to be made in knowledge representation and use. (author). 14 refs, 6 figs
Definition of Nonequilibrium Entropy of General Systems
Mei, Xiaochun
1999-01-01
The definition of nonequilibrium entropy is provided for general nonequilibrium processes by connecting thermodynamics with statistical physics, and the principle of entropy increase in nonequilibrium processes is also proved in the paper. The result shows that the definition of nonequilibrium entropy is not unique.
General principles of the quality management
International Nuclear Information System (INIS)
Koutaniemi, P.
2005-01-01
The objective of the presentation is to outline some general infrastructure of the nuclear industry with regard to quality management; to emphasise the significance of safety management as an integral part of quality management; and to highlight the different steps of the management process in day-to-day work, at an annual level, and as a strategic process.
Directory of Open Access Journals (Sweden)
Glória da Anunciação Alves
2012-03-01
The paper proposes to discuss the process of Parisian metropolization and the resulting socio-spatial transformations, highlighting how the discourse of mixité social is used to justify processes of socio-spatial segregation. To speak about the metropolitan phenomenon today, in the XXI century, implies going beyond the phenomenon of the growth and multiplication of large, generally continuous urban agglomerations. The metropolization of space is a process that today can also be characterized by territorial discontinuity, which articulates cities through productive processes due to the existence of (techno-informational) networks that enable connections among non-continuous spaces. According to Lencioni (2003), there is a limit to this, given by the area of influence and articulation with the metropolis. But to speak about metropolization is also to go beyond the growth and multiplication of agglomeration and wealth. It is to try to understand the contradictory process implicated in this relationship and in the places that configure the traditional metropolis, in other words, its territorial continuity. It is also to discuss the growth of poverty and of spatial differentiation. In the metropolitan expansion, with or without the endorsement of the public authorities, peripheral areas arise which are necessary for the growth of metropolitan wealth. Today the issue of the metropolitan phenomenon is institutionally put. The "Grand Plan Paris" evokes actions to promote the institutionalization of what would be the metropolization of Paris which, according to the state, is necessary to guarantee Paris a place on the list of world cities, keeping it inserted in the worldwide frame of global investments. But what does it mean to institutionalize the metropolis? On one side it implies officially putting into practice a national project of international competitiveness (world scale), and on the other, putting into focus the Paris-banlieues conflicts and relationships
Lectures on general relativity
Papapetrou, Achille
1974-01-01
This book is an elaboration of lecture notes for the graduate course on General Relativity given by the author at Boston University in the spring semester of 1972. It is an introduction to the subject only, as the time available for the course was limited. The author of an introduction to General Relativity is faced from the beginning with the difficult task of choosing which material to include. A general criterion assisting in this choice is provided by the didactic character of the book: Those chapters have to be included in priority, which will be most useful to the reader in enabling him to understand the methods used in General Relativity, the results obtained so far and possibly the problems still to be solved. This criterion is not sufficient to ensure a unique choice. General Relativity has developed to such a degree, that it is impossible to include in an introductory textbook of a reasonable length even a very condensed treatment of all important problems which have been discussed unt...
International Nuclear Information System (INIS)
Hayashi, K.; Shirafuji, T.
1979-01-01
A gravitational theory is formulated on the Weitzenboeck space-time, characterized by the vanishing curvature tensor (absolute parallelism) and by the torsion tensor formed of four parallel vector fields. This theory is called new general relativity, since Einstein in 1928 first gave its original form. New general relativity has three parameters c1, c2, and lambda, besides the Einstein constant kappa. In this paper we choose c1 = c2 = 0, leaving lambda open. We prove, among other things, that (i) a static, spherically symmetric gravitational field is given by the Schwarzschild metric, that (ii) in the weak-field approximation an antisymmetric field of zero mass and zero spin exists, besides gravitons, and that (iii) new general relativity agrees with all the experiments so far carried out.
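The static, spherically symmetric field of point (i) is the familiar Schwarzschild metric, which in Schwarzschild coordinates and geometric units (G = c = 1) reads:

```latex
ds^2 = -\left(1 - \frac{2M}{r}\right) dt^2
       + \left(1 - \frac{2M}{r}\right)^{-1} dr^2
       + r^2 \left( d\theta^2 + \sin^2\theta \, d\varphi^2 \right)
```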
Generalized Gaussian Error Calculus
Grabe, Michael
2010-01-01
For the first time in 200 years Generalized Gaussian Error Calculus addresses a rigorous, complete and self-consistent revision of the Gaussian error calculus. Since experimentalists realized that measurements in general are burdened by unknown systematic errors, the classical, widely used evaluation procedures scrutinizing the consequences of random errors alone turned out to be obsolete. As a matter of course, the error calculus to-be, treating random and unknown systematic errors side by side, should ensure the consistency and traceability of physical units, physical constants and physical quantities at large. The generalized Gaussian error calculus considers unknown systematic errors to spawn biased estimators. Beyond that, random errors are required to conform to the idea of what the author calls well-defined measuring conditions. The approach features the properties of a building kit: any overall uncertainty turns out to be the sum of a contribution due to random errors, to be taken from a confidence inter...
Generalized isothermic lattices
International Nuclear Information System (INIS)
Doliwa, Adam
2007-01-01
We study multi-dimensional quadrilateral lattices satisfying simultaneously two integrable constraints: a quadratic constraint and the projective Moutard constraint. When the lattice is two dimensional and the quadric under consideration is the Moebius sphere one obtains, after the stereographic projection, the discrete isothermic surfaces defined by Bobenko and Pinkall by an algebraic constraint imposed on the (complex) cross-ratio of the circular lattice. We derive the analogous condition for our generalized isothermic lattices using Steiner's projective structure of conics, and we present basic geometric constructions which encode integrability of the lattice. In particular, we introduce the Darboux transformation of the generalized isothermic lattice and we derive the corresponding Bianchi permutability principle. Finally, we study two-dimensional generalized isothermic lattices, in particular geometry of their initial boundary value problem
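The cross-ratio constraint mentioned here is concrete: four complex numbers z1..z4 (quadrilateral vertices after stereographic projection) have cross-ratio (z1-z2)(z3-z4) / ((z2-z3)(z4-z1)), and in the Bobenko-Pinkall definition each elementary quadrilateral of a discrete isothermic surface has real negative cross-ratio. A quick check (conventions for ordering the four points vary; this is one common one):

```python
def cross_ratio(z1, z2, z3, z4):
    """Complex cross-ratio of four points, convention
    q = (z1 - z2)(z3 - z4) / ((z2 - z3)(z4 - z1))."""
    return ((z1 - z2) * (z3 - z4)) / ((z2 - z3) * (z4 - z1))
```

A unit square traversed in order has q = -1: real and negative, the model case of the isothermic condition.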
Glückstad, Jesper
2009-01-01
Generalized Phase Contrast elevates the phase contrast technique not only to improve phase imaging but also to cross over and interface with diverse and seemingly disparate fields of contemporary optics and photonics. This book presents a comprehensive introduction to the Generalized Phase Contrast (GPC) method including an overview of the range of current and potential applications of GPC in wavefront sensing and phase imaging, structured laser illumination and image projection, optical trapping and manipulation, and optical encryption and decryption. The GPC method goes further than the restrictive assumptions of conventional Zernike phase contrast analysis and achieves an expanded range of validity beyond weak phase perturbations. The generalized analysis yields design criteria for tuning experimental parameters to achieve optimal performance in terms of accuracy, fidelity and light efficiency. Optimization can address practical issues, such as finding an optimal spatial filter for the chosen application, ...
The evolution of robotic general surgery.
Wilson, E B
2009-01-01
Surgical robotics in general surgery has a relatively short but very interesting evolution. Just as minimally invasive and laparoscopic techniques have radically changed general surgery and fractionated it into subspecialization, robotic technology is likely to repeat the process of fractionation even further. Though it appears that robotics is growing more quickly in other specialties, the changes digital platforms are causing in the general surgical arena are likely to permanently alter general surgery. This review examines the evolution of robotics in minimally invasive general surgery looking forward to a time where robotics platforms will be fundamental to elective general surgery. Learning curves and adoption techniques are explored. Foregut, hepatobiliary, endocrine, colorectal, and bariatric surgery will be examined as growth areas for robotics, as well as revealing the current uses of this technology.
Generalized Jacobi identities in gauge theories
International Nuclear Information System (INIS)
Chaves, F.M.P.
1990-01-01
A spatial generalized Jacobi identity obeyed by the polarization-dependent factors of the vertices in a q q-bar - Wγ process is studied. The amplitude of gluon-gluon scattering with five particles is worked out. By reorganizing this amplitude in analogy with a photon-pion interaction process, it is shown that the spatial generalized Jacobi identity does not exist; instead there are many spatial partial identities which, in the case of a four-particle process, combine into one single identity. A process with four particles, three of them scalar fields, is also studied in the one-loop approximation. In this case too, the non-existence of the spatial generalized Jacobi identity is demonstrated. (author)
International Nuclear Information System (INIS)
McGarrie, Moritz
2012-07-01
We extend the framework of general gauge mediation to cases where the mediating fields have a nontrivial spectral function, as might arise from strong dynamics. We demonstrate through examples that this setup describes a broad class of possible models of gauge mediated supersymmetry breaking. A main emphasis is to give general formulas for cross sections for σ(visible → hidden) in these resonance models. We will also give formulas for soft masses, A-terms and demonstrate the framework with a holographic setup.
Fractional Order Generalized Information
Directory of Open Access Journals (Sweden)
José Tenreiro Machado
2014-04-01
Full Text Available This paper formulates a novel expression for entropy inspired in the properties of Fractional Calculus. The characteristics of the generalized fractional entropy are tested both in standard probability distributions and real world data series. The results reveal that tuning the fractional order allow an high sensitivity to the signal evolution, which is useful in describing the dynamics of complex systems. The concepts are also extended to relative distances and tested with several sets of data, confirming the goodness of the generalization.
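The abstract does not reproduce the fractional-order entropy expression itself, and no attempt is made to reconstruct it here. As a generic illustration of how a one-parameter entropy family can recover the classical case in a limit while its order tunes sensitivity to the distribution's shape, here is Tsallis entropy, a different, well-known generalization (not the paper's):

```python
import math

def shannon(p):
    """Shannon entropy in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def tsallis(p, q):
    """Tsallis entropy S_q = (1 - sum p_i^q) / (q - 1).

    S_q -> Shannon entropy as q -> 1; q is the tunable 'order'."""
    if abs(q - 1.0) < 1e-12:
        return shannon(p)
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)
```

Tuning q reweights rare versus common outcomes, which is the same kind of adjustable sensitivity the fractional order provides in the paper's formulation.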
International Nuclear Information System (INIS)
Sladkowski, J.
1991-01-01
Various attempts to formulate the fundamental physical interactions in the framework of unified geometric theories have recently gained considerable success (Kaluza, 1921; Klein, 1926; Trautmann, 1970; Cho, 1975). Symmetries of the spacetime and so-called internal spaces seem to play a key role in investigating both the fundamental interactions and the abundance of elementary particles. The author presents a category-theoretic description of a generalization of the G-theory concept and its application to geometric compactification and dimensional reduction. The main reasons for using categories and functors as tools are the clearness and the level of generalization one can obtain
General minisum circle location
DEFF Research Database (Denmark)
Körner, Mark; Brimberg, Jack; Juel, Henrik
2009-01-01
In our paper we approximate a set of given points by a general circle. More precisely, we consider the problem of locating and scaling the unit ball of some given norm k1 with respect to fixed points on the plane such that the sum of weighted distances between the circle and the fixed points is minim...
Implementing general gauge mediation
International Nuclear Information System (INIS)
Carpenter, Linda M.; Dine, Michael; Festuccia, Guido; Mason, John D.
2009-01-01
Recently there has been much progress in building models of gauge mediation, often with predictions different than those of minimal gauge mediation. Meade, Seiberg, and Shih have characterized the most general spectrum which can arise in gauge-mediated models. We discuss some of the challenges of building models of general gauge mediation, especially the problem of messenger parity and issues connected with R symmetry breaking and CP violation. We build a variety of viable, weakly coupled models which exhibit some or all of the possible low energy parameters.
Morita, K
1989-01-01
Being an advanced account of certain aspects of general topology, the primary purpose of this volume is to provide the reader with an overview of recent developments.The papers cover basic fields such as metrization and extension of maps, as well as newly-developed fields like categorical topology and topological dynamics. Each chapter may be read independently of the others, with a few exceptions. It is assumed that the reader has some knowledge of set theory, algebra, analysis and basic general topology.
Potvin, Guy
2015-10-01
We examine how the Rytov approximation describing log-amplitude and phase fluctuations of a wave propagating through weak uniform turbulence can be generalized to the case of turbulence with a large-scale nonuniform component. We show how the large-scale refractive index field creates Fermat rays using the path integral formulation for paraxial propagation. We then show how the second-order derivatives of the Fermat ray action affect the Rytov approximation, and we discuss how a numerical algorithm would model the general Rytov approximation.
General Drafting. Technical Manual.
Department of the Army, Washington, DC.
The manual provides instructional guidance and reference material in the principles and procedures of general drafting and constitutes the primary study text for personnel in drafting as a military occupational specialty. Included is information on drafting equipment and its use; line weights, conventions and formats; lettering; engineering charts…
Towards General Temporal Aggregation
DEFF Research Database (Denmark)
Boehlen, Michael H.; Gamper, Johann; Jensen, Christian Søndergaard
2008-01-01
associated with the management of temporal data. Indeed, temporal aggregation is complex and among the most difficult, and thus interesting, temporal functionality to support. This paper presents a general framework for temporal aggregation that accommodates existing kinds of aggregation, and it identifies...
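Instant temporal aggregation, one of the kinds such a general framework must accommodate, can be sketched directly: split the timeline at every interval endpoint into "constant intervals" and aggregate the tuples alive on each. A minimal count aggregate over half-open validity periods (names illustrative, not the paper's formalism):

```python
def temporal_count(intervals):
    """Count tuples alive on each constant subinterval of the timeline.

    `intervals` holds half-open validity periods [start, end); the timeline
    is split at every endpoint, and within each resulting piece the count
    is constant by construction."""
    points = sorted({t for iv in intervals for t in iv})
    return [((a, b), sum(1 for s, e in intervals if s <= a and b <= e))
            for a, b in zip(points, points[1:])]
```

Other aggregates (sum, max, ...) follow the same split-then-aggregate pattern; what makes temporal aggregation hard in practice is controlling the potentially quadratic number of constant intervals.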
Uniqueness of generalized parafermions
International Nuclear Information System (INIS)
Bougourzi, A.H.; Ho-Kim, Q.; Kikuchi, Y.; Lam, C.S.
1991-01-01
This paper gives an explicit construction of the Feigin-Fuchs representations of the generalized parafermions associated with SU(n) and writes down the screening charges for the parafermionic model of SU(3). The authors show that the two representations they use are equivalent to each other and to two other representations recently proposed
Interrogation: General vs. Local.
Johnson, Jeannette
This paper proposes a set of hypotheses on the nature of interrogration as a possible language universal. Examples and phrase structure rules and diagrams are given. Examining Tamazight and English, genetically unrelated languages with almost no contact, the author distinguishes two types of interrogation: (1) general, querying acceptability to…
Generalized interpolative quantum statistics
International Nuclear Information System (INIS)
Ramanathan, R.
1992-01-01
A generalized interpolative quantum statistics is presented by conjecturing a certain reordering of phase space due to the presence of possible exotic objects other than bosons and fermions. Such an interpolation achieved through a Bose-counting strategy predicts the existence of an infinite quantum Boltzmann-Gibbs statistics akin to the one discovered by Greenberg recently
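Ramanathan's Bose-counting construction is not reproduced here, but the textbook way to see what "interpolating" between the quantum statistics means is the one-parameter occupation number n(x) = 1/(e^x - kappa): kappa = +1 gives Bose-Einstein, kappa = -1 Fermi-Dirac, and kappa = 0 the classical Maxwell-Boltzmann limit (a standard illustration, not the paper's scheme):

```python
import math

def occupation(x, kappa):
    """Mean occupation number n(x) = 1 / (exp(x) - kappa),
    with x = (E - mu) / kT.

    kappa = +1 -> Bose-Einstein, kappa = -1 -> Fermi-Dirac,
    kappa =  0 -> Maxwell-Boltzmann."""
    return 1.0 / (math.exp(x) - kappa)
```

At x = 0 the Fermi-Dirac case gives exactly 1/2, the familiar occupation at the chemical potential; intermediate kappa values smoothly interpolate between the two quantum statistics.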