WorldWideScience

Sample records for rhwm generally processes

  1. Documented Safety Analysis for the B695 Segment

    Energy Technology Data Exchange (ETDEWEB)

    Laycak, D

    2008-09-11

    This Documented Safety Analysis (DSA) was prepared for the Lawrence Livermore National Laboratory (LLNL) Building 695 (B695) Segment of the Decontamination and Waste Treatment Facility (DWTF). The report provides comprehensive information on design and operations, including safety programs and safety structures, systems and components to address the potential process-related hazards, natural phenomena, and external hazards that can affect the public, facility workers, and the environment. Consideration is given to all modes of operation, including the potential for both equipment failure and human error. The facilities known collectively as the DWTF are used by LLNL's Radioactive and Hazardous Waste Management (RHWM) Division to store and treat regulated wastes generated at LLNL. RHWM generally processes low-level radioactive waste with no, or extremely low, concentrations of transuranics (e.g., much less than 100 nCi/g). Wastes processed often contain only depleted uranium and beta- and gamma-emitting nuclides, e.g., ⁹⁰Sr, ¹³⁷Cs, or ³H. The mission of the B695 Segment centers on container storage, lab-packing, repacking, overpacking, bulking, sampling, waste transfer, and waste treatment. The B695 Segment is used for storage of radioactive waste (including transuranic and low-level), hazardous, nonhazardous, mixed, and other waste. Storage of hazardous and mixed waste in B695 Segment facilities is in compliance with the Resource Conservation and Recovery Act (RCRA). LLNL is operated by Lawrence Livermore National Security, LLC, for the Department of Energy (DOE). The B695 Segment is operated by the RHWM Division of LLNL. Many operations in the B695 Segment are performed under an RCRA operation plan, similar to commercial treatment operations with best demonstrated available technologies. The buildings of the B695 Segment were designed and built considering such operations, using proven building …

  2. Documented Safety Analysis for the B695 Segment

    International Nuclear Information System (INIS)

    Laycak, D.

    2008-01-01

    This Documented Safety Analysis (DSA) was prepared for the Lawrence Livermore National Laboratory (LLNL) Building 695 (B695) Segment of the Decontamination and Waste Treatment Facility (DWTF). The report provides comprehensive information on design and operations, including safety programs and safety structures, systems and components to address the potential process-related hazards, natural phenomena, and external hazards that can affect the public, facility workers, and the environment. Consideration is given to all modes of operation, including the potential for both equipment failure and human error. The facilities known collectively as the DWTF are used by LLNL's Radioactive and Hazardous Waste Management (RHWM) Division to store and treat regulated wastes generated at LLNL. RHWM generally processes low-level radioactive waste with no, or extremely low, concentrations of transuranics (e.g., much less than 100 nCi/g). Wastes processed often contain only depleted uranium and beta- and gamma-emitting nuclides, e.g., ⁹⁰Sr, ¹³⁷Cs, or ³H. The mission of the B695 Segment centers on container storage, lab-packing, repacking, overpacking, bulking, sampling, waste transfer, and waste treatment. The B695 Segment is used for storage of radioactive waste (including transuranic and low-level), hazardous, nonhazardous, mixed, and other waste. Storage of hazardous and mixed waste in B695 Segment facilities is in compliance with the Resource Conservation and Recovery Act (RCRA). LLNL is operated by Lawrence Livermore National Security, LLC, for the Department of Energy (DOE). The B695 Segment is operated by the RHWM Division of LLNL. Many operations in the B695 Segment are performed under an RCRA operation plan, similar to commercial treatment operations with best demonstrated available technologies. The buildings of the B695 Segment were designed and built considering such operations, using proven building systems, and keeping …

  3. Generalized Ornstein-Uhlenbeck processes and associated self-similar processes

    CERN Document Server

    Lim, S C

    2003-01-01

    We consider three types of generalized Ornstein-Uhlenbeck processes: the stationary process obtained from the Lamperti transformation of fractional Brownian motion, the process with stretched exponential covariance and the process obtained from the solution of the fractional Langevin equation. These stationary Gaussian processes have many common properties, such as the fact that their local covariances share a similar structure and they exhibit identical spectral densities in the large frequency limit. In addition, the generalized Ornstein-Uhlenbeck processes can be shown to be locally stationary representations of fractional Brownian motion. Two new self-similar Gaussian processes, in addition to fractional Brownian motion, are obtained by applying the (inverse) Lamperti transformation to the generalized Ornstein-Uhlenbeck processes. We study some of the properties of these self-similar processes such as the long-range dependence. We give a simulation of their sample paths based on numerical Karhunen-Loève expansion.

  4. Generalized Ornstein-Uhlenbeck processes and associated self-similar processes

    International Nuclear Information System (INIS)

    Lim, S C; Muniandy, S V

    2003-01-01

    We consider three types of generalized Ornstein-Uhlenbeck processes: the stationary process obtained from the Lamperti transformation of fractional Brownian motion, the process with stretched exponential covariance and the process obtained from the solution of the fractional Langevin equation. These stationary Gaussian processes have many common properties, such as the fact that their local covariances share a similar structure and they exhibit identical spectral densities in the large frequency limit. In addition, the generalized Ornstein-Uhlenbeck processes can be shown to be locally stationary representations of fractional Brownian motion. Two new self-similar Gaussian processes, in addition to fractional Brownian motion, are obtained by applying the (inverse) Lamperti transformation to the generalized Ornstein-Uhlenbeck processes. We study some of the properties of these self-similar processes such as the long-range dependence. We give a simulation of their sample paths based on numerical Karhunen-Loève expansion.
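
    A minimal, hedged numerical sketch of the Lamperti construction mentioned above: fractional Brownian motion (fBm) is sampled by a Cholesky factorization of its covariance and mapped to a stationary, generalized Ornstein-Uhlenbeck-type process via Y(u) = exp(-Hu) B_H(exp(u)). The Hurst exponent, grid and check are illustrative choices, not the paper's Karhunen-Loève scheme.

```python
import numpy as np

# Hedged sketch: a stationary process from the Lamperti transformation of
# fractional Brownian motion (fBm).  If B_H is H-self-similar, then
# Y(u) = exp(-H*u) * B_H(exp(u)) is stationary.  fBm is sampled here by a
# Cholesky factorization of its covariance; grid and H are illustrative.

def fbm_on_grid(times, hurst, rng):
    """Sample fBm at the given strictly positive times via Cholesky."""
    s, t = times[:, None], times[None, :]
    cov = 0.5 * (s ** (2 * hurst) + t ** (2 * hurst) - np.abs(t - s) ** (2 * hurst))
    chol = np.linalg.cholesky(cov + 1e-10 * np.eye(times.size))
    return chol @ rng.standard_normal(times.size)

rng = np.random.default_rng(0)
H = 0.7
u = np.linspace(-3.0, 3.0, 400)       # time of the stationary process
bh = fbm_on_grid(np.exp(u), H, rng)   # fBm observed at t = exp(u)
y = np.exp(-H * u) * bh               # Lamperti transform: stationary Gaussian process

# Crude stationarity check: the pointwise variance of Y is 1 by construction.
print("sample variance, first/second half:", np.var(y[:200]), np.var(y[200:]))
```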

  5. Experiments to Distribute Map Generalization Processes

    Science.gov (United States)

    Berli, Justin; Touya, Guillaume; Lokhat, Imran; Regnauld, Nicolas

    2018-05-01

    Automatic map generalization requires the use of computationally intensive processes that are often unable to deal with large datasets. Distributing the generalization process is the only way to make it scalable and usable in practice. But map generalization is a highly contextual process: the surroundings of a map feature need to be known in order to generalize the feature, which is a problem because distribution may partition the dataset and parallelize the processing of each part. This paper proposes experiments to evaluate past propositions for distributing map generalization and to identify the main remaining issues. The past propositions to distribute map generalization are first discussed, and then the experiment hypotheses and apparatus are described. The experiments confirmed that regular partitioning was the quickest strategy, but also the least effective in taking context into account. Geographical partitioning, though less effective for now, is quite promising regarding the quality of the results, as it better integrates the geographical context.
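
    A hedged sketch of the "regular partitioning" strategy evaluated above: features are cut into rectangular tiles that can be processed in parallel, and each tile also collects the features falling in a buffer around it so that some geographical context remains available. The point features, tile size and buffer width are illustrative assumptions, not the paper's data or software.

```python
import numpy as np

# Hedged sketch of regular partitioning with a context buffer for
# distributed map generalization.  All inputs are illustrative.

def regular_partition(points, cell_size, buffer):
    """Return {cell index: (own point ids, extra context point ids)}."""
    cells = {}
    cell_of = np.floor(points / cell_size).astype(int)
    for idx, c in enumerate(map(tuple, cell_of)):
        cells.setdefault(c, []).append(idx)
    partition = {}
    for c, own in cells.items():
        lo = np.array(c) * cell_size - buffer
        hi = (np.array(c) + 1) * cell_size + buffer
        in_buf = np.all((points >= lo) & (points < hi), axis=1)
        context = [i for i in np.nonzero(in_buf)[0] if i not in own]
        partition[c] = (own, context)
    return partition

rng = np.random.default_rng(1)
pts = rng.uniform(0, 100, size=(500, 2))          # toy map features
parts = regular_partition(pts, cell_size=25.0, buffer=5.0)
print(len(parts), "tiles; (own, context) sizes of three tiles:",
      [(len(o), len(ctx)) for o, ctx in list(parts.values())[:3]])
```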

  6. A general model for membrane-based separation processes

    DEFF Research Database (Denmark)

    Soni, Vipasha; Abildskov, Jens; Jonsson, Gunnar Eigil

    2009-01-01

    … behaviour will play an important role. In this paper, modelling of membrane-based processes for separation of gas and liquid mixtures is considered. Two general models, one for membrane-based liquid separation processes (with phase change) and another for membrane-based gas separation, are presented. The separation processes covered are: membrane-based gas separation processes, pervaporation and various types of membrane distillation processes. The specific model for each type of membrane-based process is generated from the two general models by applying the specific system descriptions and the corresponding …

  7. Renewal processes based on generalized Mittag-Leffler waiting times

    Science.gov (United States)

    Cahoy, Dexter O.; Polito, Federico

    2013-03-01

    The fractional Poisson process has recently attracted experts from several fields of study. Its natural generalization of the ordinary Poisson process made the model more appealing for real-world applications. In this paper, we generalized the standard and fractional Poisson processes through the waiting time distribution, and showed their relations to an integral operator with a generalized Mittag-Leffler function in the kernel. The waiting times of the proposed renewal processes have the generalized Mittag-Leffler and stretched-squashed Mittag-Leffler distributions. Note that the generalizations naturally provide greater flexibility in modeling real-life renewal processes. Algorithms to simulate sample paths and to estimate the model parameters are derived. Note also that these procedures are necessary to make these models more usable in practice. State probabilities and other qualitative or quantitative features of the models are also discussed.
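
    A hedged sketch of simulating such a renewal process: waiting times are drawn from a Mittag-Leffler law of order beta using a commonly quoted inversion formula (treated here as an assumption, not the paper's exact algorithm), and arrival times are accumulated; beta = 1 recovers the ordinary Poisson process.

```python
import numpy as np

# Hedged sketch: renewal process with Mittag-Leffler waiting times of order
# beta (a fractional-Poisson-type process).  The inversion formula below is
# a commonly quoted one; treat it, and the parameter values, as assumptions.

def mittag_leffler_waiting_times(n, beta, rate, rng):
    u = rng.uniform(size=n)
    v = rng.uniform(size=n)
    factor = np.sin(beta * np.pi) / np.tan(beta * np.pi * v) - np.cos(beta * np.pi)
    return (-np.log(u) / rate) * factor ** (1.0 / beta)

def event_times(t_max, beta, rate, rng, chunk=10_000):
    """Arrival times up to t_max; N(t) is then np.searchsorted(times, t)."""
    waits = mittag_leffler_waiting_times(chunk, beta, rate, rng)
    times = np.cumsum(waits)
    return times[times <= t_max]

rng = np.random.default_rng(2)
for beta in (1.0, 0.9, 0.7):
    n_events = event_times(50.0, beta, rate=1.0, rng=rng).size
    print(f"beta={beta}: {n_events} events by t=50")
# beta = 1.0 reproduces the ordinary Poisson process with the given rate.
```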

  8. Real-time radar signal processing using GPGPU (general-purpose graphic processing unit)

    Science.gov (United States)

    Kong, Fanxing; Zhang, Yan Rockee; Cai, Jingxiao; Palmer, Robert D.

    2016-05-01

    This study introduces a practical approach to developing a real-time signal processing chain for a general phased array radar on NVIDIA GPUs (Graphical Processing Units) using CUDA (Compute Unified Device Architecture) libraries such as cuBLAS and cuFFT, which are adopted from open source libraries and optimized for NVIDIA GPUs. The processed results are rigorously verified against those from the CPUs. Performance, benchmarked by computation time for various input data cube sizes, is compared across GPUs and CPUs. Through the analysis, it is demonstrated that GPGPU (general-purpose GPU) real-time processing of the array radar data is possible with relatively low-cost commercial GPUs.
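
    A hedged CPU stand-in for one stage of such a chain, FFT-based pulse compression (matched filtering); on the GPU this is the kind of FFT-heavy kernel that libraries like cuFFT accelerate, but plain NumPy is used here so the sketch stays self-contained. Chirp parameters and data-cube sizes are illustrative.

```python
import numpy as np

# Hedged sketch: pulse compression of a range line by FFT-based matched
# filtering.  On a GPU the FFTs would be delegated to cuFFT; NumPy is used
# here as a self-contained CPU stand-in.  All parameters are illustrative.

def lfm_chirp(n, bandwidth, fs):
    t = np.arange(n) / fs
    k = bandwidth / (n / fs)                       # chirp rate (Hz/s)
    return np.exp(1j * np.pi * k * t ** 2)

def pulse_compress(echoes, chirp):
    """Matched-filter each row (range line) of `echoes` with the chirp."""
    nfft = echoes.shape[-1]
    ref = np.conj(np.fft.fft(chirp, nfft))
    return np.fft.ifft(np.fft.fft(echoes, nfft, axis=-1) * ref, axis=-1)

fs, n = 10e6, 1024
chirp = lfm_chirp(256, bandwidth=2e6, fs=fs)
rng = np.random.default_rng(3)
cube = rng.standard_normal((64, n)) + 1j * rng.standard_normal((64, n))
cube[:, 300:300 + 256] += 5 * chirp                # bury an echo at range bin 300
compressed = pulse_compress(cube, chirp)
print("peak range bin:", int(np.argmax(np.abs(compressed).mean(axis=0))))
```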

  9. Quantum thermodynamics of general quantum processes.

    Science.gov (United States)

    Binder, Felix; Vinjanampathy, Sai; Modi, Kavan; Goold, John

    2015-03-01

    Accurately describing work extraction from a quantum system is a central objective for the extension of thermodynamics to individual quantum systems. The concepts of work and heat are surprisingly subtle when generalizations are made to arbitrary quantum states. We formulate an operational thermodynamics suitable for application to an open quantum system undergoing quantum evolution under a general quantum process by which we mean a completely positive and trace-preserving map. We derive an operational first law of thermodynamics for such processes and show consistency with the second law. We show that heat, from the first law, is positive when the input state of the map majorizes the output state. Moreover, the change in entropy is also positive for the same majorization condition. This makes a strong connection between the two operational laws of thermodynamics.
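
    A hedged numerical check of the majorization statement above for a simple unital CPTP map (qubit dephasing): the input spectrum majorizes the output spectrum and the von Neumann entropy change is non-negative. The channel, input state and strength are illustrative choices, not the paper's example.

```python
import numpy as np

# Hedged sketch: check majorization and entropy change for a qubit
# dephasing channel (a unital CPTP map).  State and strength are arbitrary.

def vn_entropy(rho):
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-(w * np.log2(w)).sum())

def dephase(rho, p):
    """Qubit dephasing: off-diagonal elements shrink by a factor (1 - p)."""
    z = np.diag([1.0, -1.0])
    return (1 - p / 2) * rho + (p / 2) * z @ rho @ z

def majorizes(a, b):
    """True if the spectrum of a majorizes the spectrum of b."""
    wa = np.sort(np.linalg.eigvalsh(a))[::-1]
    wb = np.sort(np.linalg.eigvalsh(b))[::-1]
    return bool(np.all(np.cumsum(wa) >= np.cumsum(wb) - 1e-12))

theta = 0.6
psi = np.array([np.cos(theta), np.sin(theta)])
rho_in = 0.9 * np.outer(psi, psi) + 0.1 * np.eye(2) / 2    # slightly mixed input
rho_out = dephase(rho_in, p=0.5)

print("input majorizes output:", majorizes(rho_in, rho_out))
print("entropy change:", vn_entropy(rho_out) - vn_entropy(rho_in))   # >= 0
```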

  10. 20 CFR 405.701 - Expedited appeals process-general.

    Science.gov (United States)

    2010-04-01

    Title 20 (Employees' Benefits), Vol. 2, revised as of 2010-04-01; Social Security Administration; Administrative Review Process for Adjudicating Initial Disability Claims; Expedited Appeals Process for Constitutional Issues; § 405.701 Expedited appeals process-general.

  11. OVPD-processed OLED for general lighting

    OpenAIRE

    Bösing, Manuel

    2012-01-01

    Due to continuous advancements in materials for organic light emitting diodes (OLED), a new field of application is currently opening up for OLED technology: general lighting. A significant reduction of OLED production cost might be achieved by employing organic vapor phase deposition (OVPD). OVPD is a novel process for depositing organic thin films from the gas phase. In contrast to the well-established process of vacuum thermal evaporation (VTE), OVPD allows much higher deposition rates to be achieved …

  12. Learning Theory Estimates with Observations from General Stationary Stochastic Processes.

    Science.gov (United States)

    Hang, Hanyuan; Feng, Yunlong; Steinwart, Ingo; Suykens, Johan A K

    2016-12-01

    This letter investigates the supervised learning problem with observations drawn from certain general stationary stochastic processes. Here by general, we mean that many stationary stochastic processes can be included. We show that when the stochastic processes satisfy a generalized Bernstein-type inequality, a unified treatment on analyzing the learning schemes with various mixing processes can be conducted and a sharp oracle inequality for generic regularized empirical risk minimization schemes can be established. The obtained oracle inequality is then applied to derive convergence rates for several learning schemes such as empirical risk minimization (ERM), least squares support vector machines (LS-SVMs) using given generic kernels, and SVMs using gaussian kernels for both least squares and quantile regression. It turns out that for independent and identically distributed (i.i.d.) processes, our learning rates for ERM recover the optimal rates. For non-i.i.d. processes, including geometrically [Formula: see text]-mixing Markov processes, geometrically [Formula: see text]-mixing processes with restricted decay, [Formula: see text]-mixing processes, and (time-reversed) geometrically [Formula: see text]-mixing processes, our learning rates for SVMs with gaussian kernels match, up to some arbitrarily small extra term in the exponent, the optimal rates. For the remaining cases, our rates are at least close to the optimal rates. As a by-product, the assumed generalized Bernstein-type inequality also provides an interpretation of the so-called effective number of observations for various mixing processes.

  13. Noise suppression via generalized-Markovian processes

    Science.gov (United States)

    Marshall, Jeffrey; Campos Venuti, Lorenzo; Zanardi, Paolo

    2017-11-01

    It is by now well established that noise itself can be useful for performing quantum information processing tasks. We present results which show how one can effectively reduce the error rate associated with a noisy quantum channel by counteracting its detrimental effects with another form of noise. In particular, we consider the effect of adding on top of a purely Markovian (Lindblad) dynamics, a more general form of dissipation, which we refer to as generalized-Markovian noise. This noise has an associated memory kernel and the resulting dynamics are described by an integrodifferential equation. The overall dynamics are characterized by decay rates which depend not only on the original dissipative time scales but also on the new integral kernel. We find that one can engineer this kernel such that the overall rate of decay is lowered by the addition of this noise term. We illustrate this technique for the case where the bare noise is described by a dephasing Pauli channel. We analytically solve this model and show that one can effectively double (or even triple) the length of the channel, while achieving the same fidelity, entanglement, and error threshold. We numerically verify this scheme can also be used to protect against thermal Markovian noise (at nonzero temperature), which models spontaneous emission and excitation processes. A physical interpretation of this scheme is discussed, whereby the added generalized-Markovian noise causes the system to become periodically decoupled from the background Markovian noise.

  14. MOTRIMS as a generalized probe of AMO processes

    International Nuclear Information System (INIS)

    Bredy, R.; Nguyen, H.; Camp, H.; Flechard, X.; De Paola, B.D.

    2003-01-01

    Magneto-optical trap recoil ion momentum spectroscopy (MOTRIMS) is one of the newest offshoots of the generalized TRIMS approach to ion-atom collisions. By using lasers instead of the more usual supersonic expansion to cool the target, MOTRIMS has demonstrated two distinct advantages over conventional TRIMS. The first is better resolution, now limited by detectors instead of target temperature. The second is its suitability for use in the study of laser-excited targets. Here we present a third advantage: the use of MOTRIMS as a general-purpose probe of AMO processes in cold clouds of atoms and molecules. Specifically, the projectile ion beam can be used as a probe of processes as diverse as target dressing by femtosecond optical pulses, photo-association (laser-assisted cold collisions), photo-ionization, and electromagnetically induced transparency. We present data for the processes we have investigated, and speculations on what we expect to see for the processes we plan to investigate in the future.

  15. Integer valued autoregressive processes with generalized discrete Mittag-Leffler marginals

    Directory of Open Access Journals (Sweden)

    Kanichukattu K. Jose

    2013-05-01

    In this paper we consider a generalization of discrete Mittag-Leffler distributions. We introduce and study the properties of a new distribution called the geometric generalized discrete Mittag-Leffler distribution. Autoregressive processes with geometric generalized discrete Mittag-Leffler distributions are developed and studied. The distributions are further extended to develop a more general class of geometric generalized discrete semi-Mittag-Leffler distributions. The processes are also extended to higher orders. An application to empirical data on customer arrivals at a bank counter is given. Various areas of potential application, such as human resource development, insect growth, epidemic modeling, industrial risk modeling, insurance and actuarial science, and town planning, are also discussed.

  16. General Notes on Processes and Their Spectra

    Directory of Open Access Journals (Sweden)

    Gustav Cepciansky

    2012-01-01

    The frequency spectrum is one of the main characteristics of a process. The aim of the paper is to show the relationship between a process and its spectrum and how the behaviour and properties of a process can be deduced from its spectrum. Processes are categorized, and general principles for calculating and recognizing their spectra are given. The main stress is put on power spectra of electric and optic signals, as these signals are themselves a kind of process. These spectra can be directly measured, observed and examined by means of spectral analyzers, and they are very important characteristics that cannot be ignored in transmission techniques in telecommunication technologies. Further, the paper also deals with non-electric processes, mainly processes and spectra in mass servicing (queueing) and how these spectra can be utilized in practice.

  17. A generalized integral fluctuation theorem for general jump processes

    International Nuclear Information System (INIS)

    Liu Fei; Ouyang Zhongcan; Luo Yupin; Huang Mingchang

    2009-01-01

    Using the Feynman-Kac and Cameron-Martin-Girsanov formulae, we obtain a generalized integral fluctuation theorem (GIFT) for discrete jump processes by constructing a time-invariable inner product. The existing discrete IFTs can be derived as its specific cases. A connection between our approach and the conventional time-reversal method is also established. Unlike the latter approach that has been extensively employed in the existing literature, our approach can naturally bring out the definition of a time reversal of a Markovian stochastic system. Additionally, we find that the robust GIFT usually does not result in a detailed fluctuation theorem. (fast track communication)

  18. Visual Processing in Generally Gifted and Mathematically Excelling Adolescents

    Science.gov (United States)

    Paz-Baruch, Nurit; Leikin, Roza; Leikin, Mark

    2016-01-01

    Little empirical data are available concerning the cognitive abilities of gifted individuals in general and especially those who excel in mathematics. We examined visual processing abilities distinguishing between general giftedness (G) and excellence in mathematics (EM). The research population consisted of 190 students from four groups of 10th-…

  19. A general software reliability process simulation technique

    Science.gov (United States)

    Tausworthe, Robert C.

    1991-01-01

    The structure and rationale of the generalized software reliability process, together with the design and implementation of a computer program that simulates this process, are described. Given assumed parameters of a particular project, the users of this program are able to generate simulated status timelines of work products, numbers of injected anomalies, and the progress of testing, fault isolation, repair, validation, and retest. Such timelines are useful in comparison with actual timeline data, for validating the project input parameters, and for providing data for researchers in reliability prediction modeling.

  20. Technical Safety Requirements for the Waste Storage Facilities May 2014

    Energy Technology Data Exchange (ETDEWEB)

    Laycak, D. T. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-04-16

    This document contains the Technical Safety Requirements (TSR) for the Radioactive and Hazardous Waste Management (RHWM) WASTE STORAGE FACILITIES, which include Area 625 (A625) and the Building 693 (B693) Yard Area of the Decontamination and Waste Treatment Facility (DWTF) at LLNL. The TSRs constitute requirements for safe operation of the WASTE STORAGE FACILITIES. These TSRs are derived from the Documented Safety Analyses for the Waste Storage Facilities (DSA) (LLNL 2011). The analysis presented therein concluded that the WASTE STORAGE FACILITIES are low-chemical hazard, Hazard Category 2 non-reactor nuclear facilities. The TSRs consist primarily of inventory limits and controls to preserve the underlying assumptions in the hazard and accident analyses. Further, appropriate commitments to safety programs are presented in the administrative controls sections of the TSRs. The WASTE STORAGE FACILITIES are used by RHWM to handle and store hazardous waste, TRANSURANIC (TRU) WASTE, LOW-LEVEL WASTE (LLW), mixed waste, California combined waste, nonhazardous industrial waste, and conditionally accepted waste generated at LLNL as well as small amounts of waste from other DOE facilities, as described in the DSA. In addition, several minor treatments (e.g., size reduction and decontamination) are carried out in these facilities.

  1. Technical Safety Requirements for the Waste Storage Facilities May 2014

    International Nuclear Information System (INIS)

    Laycak, D. T.

    2014-01-01

    This document contains the Technical Safety Requirements (TSR) for the Radioactive and Hazardous Waste Management (RHWM) WASTE STORAGE FACILITIES, which include Area 625 (A625) and the Building 693 (B693) Yard Area of the Decontamination and Waste Treatment Facility (DWTF) at LLNL. The TSRs constitute requirements for safe operation of the WASTE STORAGE FACILITIES. These TSRs are derived from the Documented Safety Analyses for the Waste Storage Facilities (DSA) (LLNL 2011). The analysis presented therein concluded that the WASTE STORAGE FACILITIES are low-chemical hazard, Hazard Category 2 non-reactor nuclear facilities. The TSRs consist primarily of inventory limits and controls to preserve the underlying assumptions in the hazard and accident analyses. Further, appropriate commitments to safety programs are presented in the administrative controls sections of the TSRs. The WASTE STORAGE FACILITIES are used by RHWM to handle and store hazardous waste, TRANSURANIC (TRU) WASTE, LOW-LEVEL WASTE (LLW), mixed waste, California combined waste, nonhazardous industrial waste, and conditionally accepted waste generated at LLNL as well as small amounts of waste from other DOE facilities, as described in the DSA. In addition, several minor treatments (e.g., size reduction and decontamination) are carried out in these facilities.

  2. General simulation algorithm for autocorrelated binary processes.

    Science.gov (United States)

    Serinaldi, Francesco; Lombardo, Federico

    2017-02-01

    The apparent ubiquity of binary random processes in physics and many other fields has attracted considerable attention from the modeling community. However, generation of binary sequences with prescribed autocorrelation is a challenging task owing to the discrete nature of the marginal distributions, which makes the application of classical spectral techniques problematic. We show that such methods can effectively be used if we focus on the parent continuous process of beta distributed transition probabilities rather than on the target binary process. This change of paradigm results in a simulation procedure effectively embedding a spectrum-based iterative amplitude-adjusted Fourier transform method devised for continuous processes. The proposed algorithm is fully general, requires minimal assumptions, and can easily simulate binary signals with power-law and exponentially decaying autocorrelation functions corresponding, for instance, to Hurst-Kolmogorov and Markov processes. An application to rainfall intermittency shows that the proposed algorithm can also simulate surrogate data preserving the empirical autocorrelation.
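
    A heavily simplified, hedged stand-in for the idea described above: simulate a correlated continuous parent process, map it to beta-distributed probabilities, and only then draw the binary sequence. The published algorithm embeds an iterative amplitude-adjusted Fourier transform (IAAFT) step; the one-shot spectral filter, spectrum and beta parameters below are illustrative assumptions.

```python
import numpy as np
from scipy import stats

# Hedged, simplified sketch: binary sequence generated from a correlated
# *continuous* parent process of beta-distributed probabilities.  The
# spectral filter replaces the paper's IAAFT step and is only illustrative.

def correlated_gaussian(n, slope, rng):
    """FFT-filtered white noise with an approximate power-law spectrum."""
    freqs = np.fft.rfftfreq(n)
    freqs[0] = freqs[1]                               # avoid division by zero
    amp = freqs ** (-slope / 2.0)
    phases = np.exp(2j * np.pi * rng.uniform(size=freqs.size))
    x = np.fft.irfft(amp * phases, n)
    return (x - x.mean()) / x.std()

rng = np.random.default_rng(4)
n = 4096
z = correlated_gaussian(n, slope=0.8, rng=rng)         # parent Gaussian process
u = stats.norm.cdf(z)                                  # uniforms, correlation kept
p = stats.beta.ppf(u, a=0.5, b=1.5)                    # beta-distributed probabilities
x = (rng.uniform(size=n) < p).astype(int)              # binary sequence

def acf(series, lag):
    s = series - series.mean()
    return float(np.dot(s[:-lag], s[lag:]) / np.dot(s, s))

print("lag-1/5/20 autocorrelation of binary series:",
      [round(acf(x, k), 3) for k in (1, 5, 20)])
```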

  3. General simulation algorithm for autocorrelated binary processes

    Science.gov (United States)

    Serinaldi, Francesco; Lombardo, Federico

    2017-02-01

    The apparent ubiquity of binary random processes in physics and many other fields has attracted considerable attention from the modeling community. However, generation of binary sequences with prescribed autocorrelation is a challenging task owing to the discrete nature of the marginal distributions, which makes the application of classical spectral techniques problematic. We show that such methods can effectively be used if we focus on the parent continuous process of beta distributed transition probabilities rather than on the target binary process. This change of paradigm results in a simulation procedure effectively embedding a spectrum-based iterative amplitude-adjusted Fourier transform method devised for continuous processes. The proposed algorithm is fully general, requires minimal assumptions, and can easily simulate binary signals with power-law and exponentially decaying autocorrelation functions corresponding, for instance, to Hurst-Kolmogorov and Markov processes. An application to rainfall intermittency shows that the proposed algorithm can also simulate surrogate data preserving the empirical autocorrelation.

  4. A new iteration process for finite families of generalized lipschitz pseudo-contractive and generalized lipschitz accretive mappings

    International Nuclear Information System (INIS)

    Chidume, C.E.; Ofoedu, E.U.

    2007-07-01

    In this paper, we introduce a new iteration process and prove that it converges strongly to a common fixed point for a finite family of generalized Lipschitz nonlinear mappings in a real reflexive Banach space E with a uniformly Gâteaux differentiable norm, if at least one member of the family is pseudo-contractive. We also prove that a slight modification of the process converges to a common zero of a finite family of generalized Lipschitz accretive operators defined on E. Results for nonexpansive families are obtained as easy corollaries. Finally, our new iteration process and our method of proof are of independent interest. (author)

  5. Markov Jump Processes Approximating a Non-Symmetric Generalized Diffusion

    International Nuclear Information System (INIS)

    Limić, Nedžad

    2011-01-01

    Consider a non-symmetric generalized diffusion X(⋅) in ℝ^d determined by the differential operator A(x) = −Σ_ij ∂_i a_ij(x) ∂_j + Σ_i b_i(x) ∂_i. In this paper the diffusion process is approximated by Markov jump processes X_n(⋅), in homogeneous and isotropic grids G_n ⊂ ℝ^d, which converge in distribution in the Skorokhod space D([0,∞), ℝ^d) to the diffusion X(⋅). The generators of X_n(⋅) are constructed explicitly. Due to the homogeneity and isotropy of grids, the proposed method for d ≥ 3 can be applied to processes for which the diffusion tensor {a_ij(x), i, j = 1, …, d} fulfills an additional condition. The proposed construction offers a simple method for simulation of sample paths of non-symmetric generalized diffusion. Simulations are carried out in terms of jump processes X_n(⋅). For piece-wise constant functions a_ij on ℝ^d and piece-wise continuous functions a_ij on ℝ² the construction and principal algorithm are described enabling an easy implementation into a computer code.

  6. PECULIARITIES OF GENERALIZATION OF SIMILAR PHENOMENA IN THE PROCESS OF FISH HEAT TREATMENT

    Directory of Open Access Journals (Sweden)

    V. A. Pokhol’chenko

    2015-01-01

    The theoretical prerequisites for generalizing, and establishing similarity in, the processes of dehydration and heating of wet materials are studied in this article. It is proposed to generalize these processes using dimensionless similarity numbers. A detailed analysis of the regularities of fish heat-treatment processes under different modes allowed a significant amount of experimental material to be generalized on the basis of dimensionless simplexes (similarity numbers). Using dimensionless simplexes made it possible to obtain a number of simple mathematical models of the phenomena studied. Generalized kinetic models of fish dehydration, generalized dynamic models (changes in moisture diffusion coefficients) and generalized kinetic models of fish heating (changes of the temperature field in the product thickness, the volume average and the centre) were obtained. These generalized mathematical models also showed the relationship between dehydration and heating in the semi-hot and hot smoking (drying and frying) of fish. The article relates the results to the physical nature of the dehydration process, including the change in the binding energy of moisture with the material as the process proceeds and the effect of shrinkage on the rate of moisture removal from the product. The factors influencing changes in the internal structure and properties of the raw material and retarding the dehydration processes are described. The heating rate of fish products was found to depend on the chemical composition, the geometric dimensions of the heated object and the regime parameters of the heating medium. Using the generalized models, combined with empirically derived equations and an engineering calculation technique for these processes, opens a unique opportunity to design rational heat-treatment modes for raw materials and to optimize the performance of thermal equipment.

  7. A general conservative extension theorem in process algebras with inequalities

    NARCIS (Netherlands)

    d' Argenio, P.R.; Verhoef, Chris

    1997-01-01

    We prove a general conservative extension theorem for transition-system-based process theories with easy-to-check and reasonable conditions. The core of this result is another general theorem which gives sufficient conditions for a system of operational rules and an extension of it in order to …

  8. A General Accelerated Degradation Model Based on the Wiener Process.

    Science.gov (United States)

    Liu, Le; Li, Xiaoyang; Sun, Fuqiang; Wang, Ning

    2016-12-06

    Accelerated degradation testing (ADT) is an efficient tool to conduct material service reliability and safety evaluations by analyzing performance degradation data. Traditional stochastic process models are mainly for linear or linearized degradation paths. However, those methods are not applicable for situations where the degradation processes cannot be linearized. Hence, in this paper, a general ADT model based on the Wiener process is proposed to solve the problem of accelerated degradation data analysis. The general model can consider the unit-to-unit variation and temporal variation of the degradation process, and is suitable for both linear and nonlinear ADT analyses with single or multiple acceleration variables. The statistical inference is given to estimate the unknown parameters in both constant stress and step stress ADT. The simulation example and two real applications demonstrate that the proposed method can yield reliable lifetime evaluation results compared with the existing linear and time-scale transformation Wiener processes in both linear and nonlinear ADT analyses.
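
    A hedged sketch of the kind of nonlinear Wiener degradation path such a model covers: X(t) = mu*Lambda(t) + sigma*B(Lambda(t)) with a nonlinear time scale Lambda(t) = t^q and a drift accelerated by temperature through an Arrhenius-type link. Functional forms, parameter values and the failure threshold are illustrative assumptions, not the paper's fitted model.

```python
import numpy as np

# Hedged sketch: nonlinear Wiener degradation paths under accelerated
# stress.  All functional forms, parameters and the threshold are
# illustrative assumptions.

def simulate_path(t, mu, sigma, q, rng):
    lam = t ** q                                       # nonlinear time scale
    dlam = np.diff(lam, prepend=0.0)
    return mu * lam + sigma * np.cumsum(np.sqrt(dlam) * rng.standard_normal(t.size))

def arrhenius_drift(temp_kelvin, a=500.0, b=3000.0):
    return a * np.exp(-b / temp_kelvin)

rng = np.random.default_rng(5)
t = np.linspace(0.01, 1000.0, 2000)
threshold = 10.0

for temp in (323.0, 348.0, 373.0):                     # accelerated stress levels
    mu = arrhenius_drift(temp)
    hits = []
    for _ in range(200):
        path = simulate_path(t, mu, sigma=0.3, q=0.8, rng=rng)
        idx = np.argmax(path >= threshold)
        if path[idx] >= threshold:
            hits.append(t[idx])
    if hits:
        print(f"T={temp:.0f} K: {len(hits)}/200 paths failed, "
              f"median pseudo-failure time {np.median(hits):.1f}")
    else:
        print(f"T={temp:.0f} K: no failures by t={t[-1]:.0f}")
```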

  9. A General Accelerated Degradation Model Based on the Wiener Process

    Directory of Open Access Journals (Sweden)

    Le Liu

    2016-12-01

    Accelerated degradation testing (ADT) is an efficient tool to conduct material service reliability and safety evaluations by analyzing performance degradation data. Traditional stochastic process models are mainly for linear or linearized degradation paths. However, those methods are not applicable for situations where the degradation processes cannot be linearized. Hence, in this paper, a general ADT model based on the Wiener process is proposed to solve the problem of accelerated degradation data analysis. The general model can consider the unit-to-unit variation and temporal variation of the degradation process, and is suitable for both linear and nonlinear ADT analyses with single or multiple acceleration variables. The statistical inference is given to estimate the unknown parameters in both constant stress and step stress ADT. The simulation example and two real applications demonstrate that the proposed method can yield reliable lifetime evaluation results compared with the existing linear and time-scale transformation Wiener processes in both linear and nonlinear ADT analyses.

  10. A generalized fluctuation-dissipation theorem for the one-dimensional diffusion process

    International Nuclear Information System (INIS)

    Okabe, Y.

    1985-01-01

    The [α,β,γ]-Langevin equation describes the time evolution of a real stationary process with T-positivity (reflection positivity) originating in the axiomatic quantum field theory. For this [α,β,γ]-Langevin equation a generalized fluctuation-dissipation theorem is proved. We shall obtain, as its application, a generalized fluctuation-dissipation theorem for the one-dimensional non-linear diffusion process, which presents one solution of Ryogo Kubo's problem in physics. (orig.)

  11. SIMULATION AND PREDICTION OF THE PROCESS BASED ON THE GENERAL LOGISTIC MAPPING

    Directory of Open Access Journals (Sweden)

    V. V. Skalozub

    2013-11-01

    Purpose. The aim of the research is to build a model of the generalized logistic mapping and to assess the possibilities of its use for forming mathematical descriptions, as well as operational forecasts, of the parameters of complex dynamic processes described by time series. Methodology. The results are obtained on the basis of mathematical modelling and simulation of nonlinear systems using the tools of chaotic dynamics. Findings. A model of the generalized logistic mapping, used to interpret the characteristics of dynamic processes, is proposed. Several examples of representing processes with the enhanced logistic mapping under varying values of the model parameters are considered. Procedures are proposed for modelling and interpreting data on the investigated processes, represented by time series, as well as for operational forecasting of parameters using the generalized logistic-mapping model. Originality. The paper proposes an improved mathematical model, the generalized logistic mapping, designed for the study of nonlinear discrete dynamic processes. Practical value. The research carried out with the generalized logistic mapping on railway transport processes, in particular on the assessment of traffic-volume parameters, indicates its great potential for practical use in problems of analysis, modelling and forecasting of complex nonlinear discrete dynamic processes. The proposed model can be used under conditions of uncertainty, irregularity and manifestations of the chaotic nature of technical, economic and other processes, including railway ones.
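
    A hedged sketch of the kind of use described above, with a hypothetical generalized logistic map x_{n+1} = r x_n (1 - x_n^nu): its parameters are fitted to an observed series by a one-step least-squares grid search and a one-step-ahead forecast is issued. The functional form, grid and data are illustrative assumptions; the paper's exact mapping is not reproduced.

```python
import numpy as np

# Hedged sketch: fit a *hypothetical* generalized logistic map to a time
# series and forecast one step ahead.  Form, grid and data are illustrative.

def glog_step(x, r, nu):
    return r * x * (1.0 - x ** nu)

def fit_glog(series, r_grid, nu_grid):
    """Grid search for (r, nu) minimising one-step prediction error."""
    best = (np.inf, None, None)
    for r in r_grid:
        for nu in nu_grid:
            err = float(np.mean((series[1:] - glog_step(series[:-1], r, nu)) ** 2))
            if err < best[0]:
                best = (err, r, nu)
    return best

# Synthetic "observed" series from the same family plus small noise.
rng = np.random.default_rng(6)
true_r, true_nu = 3.3, 1.3
x = np.empty(200)
x[0] = 0.4
for i in range(1, x.size):
    x[i] = np.clip(glog_step(x[i - 1], true_r, true_nu)
                   + 0.002 * rng.standard_normal(), 0.0, 1.0)

err, r_hat, nu_hat = fit_glog(x, np.linspace(2.5, 3.5, 41), np.linspace(0.8, 1.8, 41))
print(f"fitted r={r_hat:.3f}, nu={nu_hat:.3f}, mse={err:.2e}, "
      f"one-step forecast={glog_step(x[-1], r_hat, nu_hat):.3f}")
```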

  12. Generalized epidemic process on modular networks.

    Science.gov (United States)

    Chung, Kihong; Baek, Yongjoo; Kim, Daniel; Ha, Meesoon; Jeong, Hawoong

    2014-05-01

    Social reinforcement and modular structure are two salient features observed in the spreading of behavior through social contacts. In order to investigate the interplay between these two features, we study the generalized epidemic process on modular networks with equal-sized finite communities and adjustable modularity. Using the analytical approach originally applied to clique-based random networks, we show that the system exhibits a bond-percolation type continuous phase transition for weak social reinforcement, whereas a discontinuous phase transition occurs for sufficiently strong social reinforcement. Our findings are numerically verified using the finite-size scaling analysis and the crossings of the bimodality coefficient.
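
    A hedged simulation sketch of a generalized epidemic process with social reinforcement on a modular random graph: infection succeeds with probability p1 at a node's first exposure and with a larger probability p2 at any later exposure. Community sizes, edge densities, (p1, p2) and the seeding are illustrative assumptions; the paper's analytical clique-based treatment and finite-size scaling analysis are not reproduced.

```python
import numpy as np

# Hedged sketch: generalized epidemic process (GEP) with social
# reinforcement on a modular random graph.  All parameters are illustrative.

def modular_graph(n_comm, comm_size, p_in, p_out, rng):
    n = n_comm * comm_size
    comm = np.repeat(np.arange(n_comm), comm_size)
    prob = np.where(comm[:, None] == comm[None, :], p_in, p_out)
    upper = np.triu(rng.uniform(size=(n, n)) < prob, 1)
    return upper | upper.T

def run_gep(adj, p1, p2, seeds, rng):
    n = adj.shape[0]
    state = np.zeros(n, dtype=int)            # 0 susceptible, 1 infectious, 2 removed
    exposures = np.zeros(n, dtype=int)
    state[list(seeds)] = 1
    while (state == 1).any():
        new = []
        for i in np.nonzero(state == 1)[0]:
            for j in np.nonzero(adj[i])[0]:
                if state[j] == 0:
                    if rng.uniform() < (p1 if exposures[j] == 0 else p2):
                        new.append(j)
                    exposures[j] += 1
            state[i] = 2                       # infectious for one generation only
        if new:
            state[np.array(new)] = 1
    return float((state == 2).mean())          # final outbreak size (fraction removed)

rng = np.random.default_rng(7)
adj = modular_graph(n_comm=10, comm_size=50, p_in=0.2, p_out=0.01, rng=rng)
sizes = [run_gep(adj, p1=0.1, p2=0.5, seeds=range(10), rng=rng) for _ in range(20)]
print("mean final outbreak size (fraction of nodes):", round(float(np.mean(sizes)), 3))
```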

  13. Generalized Poisson processes in quantum mechanics and field theory

    International Nuclear Information System (INIS)

    Combe, P.; Rodriguez, R. (Centre National de la Recherche Scientifique, 13 - Marseille); Hoegh-Krohn, R. (Centre National de la Recherche Scientifique, 13 - Marseille); Sirugue, M.; Sirugue-Collin, M. (Centre National de la Recherche Scientifique, 13 - Marseille)

    1981-01-01

    In section 2 we describe more carefully the generalized Poisson processes, giving a realization of the underlying probability space, and we characterize these processes by their characteristic functionals. Section 3 is devoted to the proof of the previous formula for quantum mechanical systems, with possibly velocity-dependent potentials, and in section 4 we give an application of the previous theory to some relativistic Bose field models. (orig.)

  14. General programmed system for physiological signal processing

    Energy Technology Data Exchange (ETDEWEB)

    Tournier, E; Monge, J; Magnet, C; Sonrel, C

    1975-01-01

    Improvements made to the general programmed signal acquisition and processing system, Plurimat S, are described, the aim being to obtain a less specialized system adapted to the biological and medical field. In this modified system the acquisition will be simplified. The standard processing functions offered will be integrated into a true high-level language which will enable the user to create his own processing, the loss of speed being compensated by greater flexibility and universality. The observation screen will be large and the quality of the recording very good, so that a large signal fraction may be displayed. The data will be easily indexed and filed for subsequent display and processing. This system will be used for two kinds of tasks: it can either be specialized, as an integral part of measurement and diagnostic preparation equipment used routinely in clinical work (e.g. vectorcardiographic examination), or its versatility can be exploited for studies of limited duration, to gain information in a given field or to study new diagnostic or treatment methods.

  15. Domain-General Factors Influencing Numerical and Arithmetic Processing

    Directory of Open Access Journals (Sweden)

    André Knops

    2017-12-01

    This special issue contains 18 articles that address the question of how numerical processes interact with domain-general factors. We start the editorial with a discussion of how to define domain-general versus domain-specific factors and then discuss the contributions to this special issue, grouped into two core numerical domains that are subject to domain-general influences (see Figure 1). The first group of contributions addresses the question of how numbers interact with spatial factors. The second group of contributions is concerned with factors that determine and predict arithmetic understanding, performance and development. This special issue shows that domain-general (Table 1a) as well as domain-specific (Table 1b) abilities influence numerical and arithmetic performance at virtually all levels and makes it clear that, for the field of numerical cognition, a sole focus on one or several domain-specific factors like the approximate number system or spatial-numerical associations is not sufficient. Vice versa, in most studies that included domain-general and domain-specific variables, domain-specific numerical variables predicted arithmetic performance above and beyond domain-general variables. Therefore, a sole focus on domain-general aspects such as, for example, working memory, to explain, predict and foster arithmetic learning is also not sufficient. Based on the articles in this special issue we conclude that both domain-general and domain-specific factors contribute to numerical cognition. But the how, why and when of their contribution still needs to be better understood. We hope that this special issue may be helpful to readers in constraining future theory and model building about the interplay of domain-specific and domain-general factors.

  16. Integrable Floquet dynamics, generalized exclusion processes and "fused" matrix ansatz

    Science.gov (United States)

    Vanicat, Matthieu

    2018-04-01

    We present a general method for constructing integrable stochastic processes, with two-step discrete time Floquet dynamics, from the transfer matrix formalism. The models can be interpreted as a discrete time parallel update. The method can be applied for both periodic and open boundary conditions. We also show how the stationary distribution can be built as a matrix product state. As an illustration we construct parallel discrete time dynamics associated with the R-matrix of the SSEP and of the ASEP, and provide the associated stationary distributions in a matrix product form. We use this general framework to introduce new integrable generalized exclusion processes, where a fixed number of particles is allowed on each lattice site in opposition to the (single particle) exclusion process models. They are constructed using the fusion procedure of R-matrices (and K-matrices for open boundary conditions) for the SSEP and ASEP. We develop a new method, that we named "fused" matrix ansatz, to build explicitly the stationary distribution in a matrix product form. We use this algebraic structure to compute physical observables such as the correlation functions and the mean particle current.

  17. Lévy processes on a generalized fractal comb

    Science.gov (United States)

    Sandev, Trifce; Iomin, Alexander; Méndez, Vicenç

    2016-09-01

    Comb geometry, constituted of a backbone and fingers, is one of the simplest paradigms of a two-dimensional structure in which anomalous diffusion can be realized in the framework of Markov processes. However, the intrinsic properties of the structure can destroy this Markovian transport. These effects can be described by memory and spatial kernels. In particular, the fractal structure of the fingers, which is controlled by the spatial kernel in both the real and the Fourier spaces, leads to Lévy processes (Lévy flights) and superdiffusion. This generalization of fractional diffusion is described by the Riesz space fractional derivative. In the framework of this generalized fractal comb model, Lévy processes are considered, and exact solutions for the probability distribution functions are obtained in terms of the Fox H-function for a variety of memory kernels, and the rate of the superdiffusive spreading is studied by calculating the fractional moments. For a special form of the memory kernels, we also observe a competition between long rests and long jumps. Finally, we consider the fractal structure of the fingers controlled by a Weierstrass function, which leads to a power-law kernel in the Fourier space. This is a special case in which the second moment exists for superdiffusion in this competition between long rests and long jumps.
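
    A schematic, hedged summary of the setting in standard notation (not the paper's exact equations): the classical comb Fokker-Planck equation confines backbone transport to y = 0, and the generalization described above replaces the backbone Laplacian by the Riesz space-fractional derivative, defined through its Fourier symbol, with an additional memory kernel accounting for long rests.

```latex
% Hedged, schematic summary only -- not the paper's exact equations.
% Classical two-dimensional comb model (backbone transport confined to y = 0):
\[
  \partial_t P(x,y,t) \;=\; D_x\,\delta(y)\,\partial_x^{2} P \;+\; D_y\,\partial_y^{2} P .
\]
% The generalization discussed above replaces the backbone Laplacian
% \partial_x^2 by the Riesz space-fractional derivative, defined through its
% Fourier symbol,
\[
  \mathcal{F}\!\left[\frac{\partial^{\alpha}}{\partial |x|^{\alpha}}\,f\right](k)
  \;=\; -\,|k|^{\alpha}\,\hat{f}(k), \qquad 0 < \alpha \le 2 ,
\]
% so that motion along the backbone becomes a L\'evy flight, while a memory
% kernel acting on the time derivative accounts for long resting times.
```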

  18. Lévy processes on a generalized fractal comb

    International Nuclear Information System (INIS)

    Sandev, Trifce; Iomin, Alexander; Méndez, Vicenç

    2016-01-01

    Comb geometry, constituted of a backbone and fingers, is one of the simplest paradigms of a two-dimensional structure in which anomalous diffusion can be realized in the framework of Markov processes. However, the intrinsic properties of the structure can destroy this Markovian transport. These effects can be described by memory and spatial kernels. In particular, the fractal structure of the fingers, which is controlled by the spatial kernel in both the real and the Fourier spaces, leads to Lévy processes (Lévy flights) and superdiffusion. This generalization of fractional diffusion is described by the Riesz space fractional derivative. In the framework of this generalized fractal comb model, Lévy processes are considered, and exact solutions for the probability distribution functions are obtained in terms of the Fox H-function for a variety of memory kernels, and the rate of the superdiffusive spreading is studied by calculating the fractional moments. For a special form of the memory kernels, we also observe a competition between long rests and long jumps. Finally, we consider the fractal structure of the fingers controlled by a Weierstrass function, which leads to a power-law kernel in the Fourier space. This is a special case in which the second moment exists for superdiffusion in this competition between long rests and long jumps. (paper)

  19. Toward a General Research Process for Using Dubin's Theory Building Model

    Science.gov (United States)

    Holton, Elwood F.; Lowe, Janis S.

    2007-01-01

    Dubin developed a widely used methodology for theory building, which describes the components of the theory building process. Unfortunately, he does not define a research process for implementing his theory building model. This article proposes a seven-step general research process for implementing Dubin's theory building model. An example of a…

  20. Weldability of general purpose heat source new-process iridium

    International Nuclear Information System (INIS)

    Kanne, W.R.

    1987-01-01

    Weldability tests on General Purpose Heat Source (GPHS) iridium capsules showed that a new iridium fabrication process reduced susceptibility to underbead cracking. Seventeen capsules were welded (a total of 255 welds) in four categories, and the number of cracks in each weld was measured.

  1. The Burden of the Fellowship Interview Process on General Surgery Residents and Programs.

    Science.gov (United States)

    Watson, Shawna L; Hollis, Robert H; Oladeji, Lasun; Xu, Shin; Porterfield, John R; Ponce, Brent A

    This study evaluated the effect of the fellowship interview process in a cohort of general surgery residents. We hypothesized that the interview process would be associated with significant clinical time lost, monetary expenses, and an increased need for shift coverage. An online anonymous survey link was sent via e-mail to general surgery program directors in June 2014. Program directors distributed an additional survey link to current residents in their program who had completed the fellowship interview process. The setting was United States allopathic general surgery programs; participants were 50 general surgery program directors and 72 general surgery residents. Program directors reported a fellowship application rate of 74.4%. Residents most frequently attended 8 to 12 interviews (35.2%). Most residents (57.7%) reported missing 7 or more days of clinical training to attend interviews; these shifts were largely covered by other residents. Most residents (62.3%) spent over $4000 on the interview process. Program directors rated fellowship burden as an average of 6.7 on a 1 to 10 scale of disruption, with 10 being a significant disruption. Most residents (57.3%) were in favor of a change in the interview process. We identified potential areas for improvement, including options for coordinated interviews and improved content on program websites. The surgical fellowship match is relatively burdensome to residents and programs alike, and merits critical assessment for potential improvement.

  2. Discrete-Event Execution Alternatives on General Purpose Graphical Processing Units

    International Nuclear Information System (INIS)

    Perumalla, Kalyan S.

    2006-01-01

    Graphics cards, traditionally designed as accelerators for computer graphics, have evolved to support more general-purpose computation. General Purpose Graphical Processing Units (GPGPUs) are now being used as highly efficient, cost-effective platforms for executing certain simulation applications. While most of these applications belong to the category of time-stepped simulations, little is known about the applicability of GPGPUs to discrete event simulation (DES). Here, we identify some of the issues and challenges that the GPGPU stream-based interface raises for DES, and present some possible approaches to moving DES to GPGPUs. Initial performance results on simulation of a diffusion process show that DES-style execution on GPGPU runs faster than DES on CPU and also significantly faster than time-stepped simulations on either CPU or GPGPU.

  3. Using Self-Reflection To Increase Science Process Skills in the General Chemistry Laboratory

    Science.gov (United States)

    Veal, William R.; Taylor, Dawne; Rogers, Amy L.

    2009-03-01

    Self-reflection is a tool of instruction that has been used in the science classroom. Research has shown great promise in using video as a learning tool in the classroom. However, the integration of self-reflective practice using video in the general chemistry laboratory to help students develop process skills has not been done. Immediate video feedback and direct instruction were employed in a general chemistry laboratory course to improve students' mastery and understanding of basic and advanced process skills. Qualitative results and statistical analysis of quantitative data proved that self-reflection significantly helped students develop basic and advanced process skills, yet did not seem to influence the general understanding of the science content.

  4. Is general intelligence little more than the speed of higher-order processing?

    Science.gov (United States)

    Schubert, Anna-Lena; Hagemann, Dirk; Frischkorn, Gidon T

    2017-10-01

    Individual differences in the speed of information processing have been hypothesized to give rise to individual differences in general intelligence. Consistent with this hypothesis, reaction times (RTs) and latencies of event-related potential have been shown to be moderately associated with intelligence. These associations have been explained either in terms of individual differences in some brain-wide property such as myelination, the speed of neural oscillations, or white-matter tract integrity, or in terms of individual differences in specific processes such as the signal-to-noise ratio in evidence accumulation, executive control, or the cholinergic system. Here we show in a sample of 122 participants, who completed a battery of RT tasks at 2 laboratory sessions while an EEG was recorded, that more intelligent individuals have a higher speed of higher-order information processing that explains about 80% of the variance in general intelligence. Our results do not support the notion that individuals with higher levels of general intelligence show advantages in some brain-wide property. Instead, they suggest that more intelligent individuals benefit from a more efficient transmission of information from frontal attention and working memory processes to temporal-parietal processes of memory storage. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  5. A generalized logarithmic image processing model based on the gigavision sensor model.

    Science.gov (United States)

    Deng, Guang

    2012-03-01

    The logarithmic image processing (LIP) model is a mathematical theory providing generalized linear operations for image processing. The gigavision sensor (GVS) is a new imaging device that can be described by a statistical model. In this paper, by studying these two seemingly unrelated models, we develop a generalized LIP (GLIP) model. With the LIP model being its special case, the GLIP model not only provides new insights into the LIP model but also defines new image representations and operations for solving general image processing problems that are not necessarily related to the GVS. A new parametric LIP model is also developed. To illustrate the application of the new scalar multiplication operation, we propose an energy-preserving algorithm for tone mapping, which is a necessary step in image dehazing. By comparing with results using two state-of-the-art algorithms, we show that the new scalar multiplication operation is an effective tool for tone mapping.
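
    A hedged sketch of the classical LIP operations that the generalized model builds on: gray tones live in [0, M), and addition and scalar multiplication are defined so that results always stay in that range. The paper's new GLIP operations and its energy-preserving tone-mapping algorithm are not reproduced; the toy image and factors are illustrative.

```python
import numpy as np

# Hedged sketch of the *classical* LIP operations (the GLIP model above
# generalizes these).  Toy image and factors are illustrative.

M = 256.0

def lip_add(f, g):
    """LIP addition: f (+) g = f + g - f*g/M (result stays below M)."""
    return f + g - f * g / M

def lip_scalar(lam, f):
    """LIP scalar multiplication: lam (x) f = M - M*(1 - f/M)**lam."""
    return M - M * (1.0 - f / M) ** lam

rng = np.random.default_rng(8)
img = rng.uniform(0, 255, size=(4, 4))            # toy gray-tone "image"

up = lip_scalar(2.0, img)       # lam > 1 pushes gray tones toward M
down = lip_scalar(0.5, img)     # 0 < lam < 1 pulls gray tones toward 0
combined = lip_add(img, down)

for name, arr in [("original", img), ("lam=2.0", up),
                  ("lam=0.5", down), ("added", combined)]:
    print(f"{name:8s} min={arr.min():7.2f} max={arr.max():7.2f}")
# Unlike ordinary addition/multiplication, every result remains in [0, M).
```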

  6. On the 2-orthogonal polynomials and the generalized birth and death processes

    Directory of Open Access Journals (Sweden)

    Zerouki Ebtissem

    2006-01-01

    We discuss the connections between the 2-orthogonal polynomials and the generalized birth and death processes. Afterwards, we find sufficient conditions for giving an integral representation of the transition probabilities of these processes.

  7. Appendix D-12A Building 332C Waste Accumulation Area

    International Nuclear Information System (INIS)

    Chase, D

    2005-01-01

    This appendix is designed to provide information specific to the Building 332C Waste Accumulation Area (B-332C WAA), a waste storage area. This appendix is not designed to be used as a sole source of information. All general information that is not specific to the B-332C WAA is included in the Contingency Plan for Waste Accumulation Areas, dated July 2004, and should be referenced. The B-332C WAA is located in the southwest quadrant of the LLNL Main Site in Building 332, Room 1330. Hazardous and mixed wastes may be stored at the B-332C WAA for 90 days or less, until transferred to the appropriate Radioactive and Hazardous Waste Management (RHWM) facility or other permitted treatment, storage or disposal facility (TSDF). Radioactive waste may also be stored at the WAA. The design storage capacity of this WAA is 2,200 gallons

  8. The semi-Markov process. Generalizations and calculation rules for application in the analysis of systems

    International Nuclear Information System (INIS)

    Hirschmann, H.

    1983-06-01

    The consequences of the basic assumptions of the semi-Markov process as defined from a homogeneous renewal process with a stationary Markov condition are reviewed. The notion of the semi-Markov process is generalized by its redefinition from a nonstationary Markov renewal process. For both the nongeneralized and the generalized case a representation of the first order conditional state probabilities is derived in terms of the transition probabilities of the Markov renewal process. Some useful calculation rules (regeneration rules) are derived for the conditional state probabilities of the semi-Markov process. Compared to the semi-Markov process in its usual definition the generalized process allows the analysis of a larger class of systems. For instance systems with arbitrarily distributed lifetimes of their components can be described. There is also a chance to describe systems which are modified during time by forces or manipulations from outside. (Auth.)
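
    A hedged sketch of simulating a homogeneous semi-Markov process: an embedded Markov chain picks the next state and the holding time may depend on both the current and the next state. The transition matrix and holding-time laws are illustrative assumptions; the generalized (nonstationary) case treated in the report is not reproduced.

```python
import numpy as np

# Hedged sketch: simulate a homogeneous semi-Markov process and estimate
# long-run state occupation fractions.  All parameters are illustrative.

rng = np.random.default_rng(9)

P = np.array([[0.0, 0.7, 0.3],          # embedded transition probabilities
              [0.4, 0.0, 0.6],
              [0.5, 0.5, 0.0]])

def holding_time(i, j, rng):
    """Weibull holding time whose scale depends on the transition (i, j)."""
    scale = 1.0 + 0.5 * i + 0.25 * j
    return scale * rng.weibull(1.5)

def simulate(t_max, start=0):
    t, state = 0.0, start
    occupation = np.zeros(P.shape[0])
    while t < t_max:
        nxt = rng.choice(P.shape[0], p=P[state])
        dt = holding_time(state, nxt, rng)
        occupation[state] += min(dt, t_max - t)
        t += dt
        state = nxt
    return occupation / t_max

print("long-run fraction of time in each state:", np.round(simulate(50_000.0), 3))
```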

  9. Examination of Turkish Junior High-School Students' Perceptions of the General Problem-Solving Process

    Science.gov (United States)

    Ekici, Didem Inel

    2016-01-01

    This study aimed to determine Turkish junior high-school students' perceptions of the general problem-solving process. The Turkish junior high-school students' perceptions of the general problem-solving process were examined in relation to their gender, grade level, age and their grade point with regards to the science course identified in the…

  10. General definitions of chaos for continuous and discrete-time processes

    OpenAIRE

    Vieru, Andrei

    2008-01-01

    A precise definition of chaos for discrete processes based on iteration already exists. We shall first reformulate it in a more general frame, taking into account the fact that discrete chaotic behavior is neither necessarily based on iteration nor strictly related to compact metric spaces or to bounded functions. Then we shall apply the central idea of this definition to continuous processes. We shall try to see what chaos is, regardless of the way it is generated.

  11. [Contents of general flavonoids in Epimedium acuminatum Franch. and its differently-processed products].

    Science.gov (United States)

    Chen, H L; Wang, J K; Zhang, L L; Wu, Z Y

    2000-04-01

    To determine and compare the contents of general flavonoids in four kinds of differently-processed products of Epimedium acuminatum, the contents were determined by ultraviolet spectrophotometry. The contents were found in the following sequence: unprocessed product, clearly-fried product, alcohol-broiled product, salt-broiled product, sheep-fat-broiled product. The average recovery rate was 96.01%, with a 0.74% RSD (n = 5). Heating causes the contents of general flavonoids in the processed products to decrease. These processed products are still often used in clinical treatment because the adjuvants feature certain coordinating and promoting functions. The study is to be pursued further.

  12. Process mapping as a framework for performance improvement in emergency general surgery.

    Science.gov (United States)

    DeGirolamo, Kristin; D'Souza, Karan; Hall, William; Joos, Emilie; Garraway, Naisan; Sing, Chad Kim; McLaughlin, Patrick; Hameed, Morad

    2018-02-01

    Emergency general surgery conditions are often thought of as being too acute for the development of standardized approaches to quality improvement. However, process mapping, a concept that has been applied extensively in manufacturing quality improvement, is now being used in health care. The objective of this study was to create process maps for small bowel obstruction in an effort to identify potential areas for quality improvement. We used the American College of Surgeons Emergency General Surgery Quality Improvement Program pilot database to identify patients who received nonoperative or operative management of small bowel obstruction between March 2015 and March 2016. This database, patient charts and electronic health records were used to create process maps from the time of presentation to discharge. Eighty-eight patients with small bowel obstruction (33 operative; 55 nonoperative) were identified. Patients who received surgery had a complication rate of 32%. The processes of care from the time of presentation to the time of follow-up were highly elaborate and variable in terms of duration; however, the sequences of care were found to be consistent. We used data visualization strategies to identify bottlenecks in care, and they showed substantial variability in terms of operating room access. Variability in the operative care of small bowel obstruction is high and represents an important improvement opportunity in general surgery. Process mapping can identify common themes, even in acute care, and suggest specific performance improvement measures.

  13. Generalization bounds of ERM-based learning processes for continuous-time Markov chains.

    Science.gov (United States)

    Zhang, Chao; Tao, Dacheng

    2012-12-01

    Many existing results on statistical learning theory are based on the assumption that samples are independently and identically distributed (i.i.d.). However, the assumption of i.i.d. samples is not suitable for practical application to problems in which samples are time dependent. In this paper, we are mainly concerned with the empirical risk minimization (ERM) based learning process for time-dependent samples drawn from a continuous-time Markov chain. This learning process covers many kinds of practical applications, e.g., the prediction for a time series and the estimation of channel state information. Thus, it is significant to study its theoretical properties including the generalization bound, the asymptotic convergence, and the rate of convergence. It is noteworthy that, since samples are time dependent in this learning process, the concerns of this paper cannot (at least straightforwardly) be addressed by existing methods developed under the sample i.i.d. assumption. We first develop a deviation inequality for a sequence of time-dependent samples drawn from a continuous-time Markov chain and present a symmetrization inequality for such a sequence. By using the resultant deviation inequality and symmetrization inequality, we then obtain the generalization bounds of the ERM-based learning process for time-dependent samples drawn from a continuous-time Markov chain. Finally, based on the resultant generalization bounds, we analyze the asymptotic convergence and the rate of convergence of the learning process.

  14. Process error rates in general research applications to the Human ...

    African Journals Online (AJOL)

    Objective. To examine process error rates in applications for ethics clearance of health research. Methods. Minutes of 586 general research applications made to a human health research ethics committee (HREC) from April 2008 to March 2009 were examined. Rates of approval were calculated and reasons for requiring ...

  15. GENERAL ALGORITHMIC SCHEMA OF THE PROCESS OF THE CHILL AUXILIARIES PROJECTION

    Directory of Open Access Journals (Sweden)

    A. N. Chichko

    2006-01-01

    Full Text Available A general algorithmic diagram systematizing the existing approaches to the design process is offered, and the foundation is laid for a computer system for the construction of chill mold tooling.

  16. Non-rigid ultrasound image registration using generalized relaxation labeling process

    Science.gov (United States)

    Lee, Jong-Ha; Seong, Yeong Kyeong; Park, MoonHo; Woo, Kyoung-Gu; Ku, Jeonghun; Park, Hee-Jun

    2013-03-01

    This research proposes a novel non-rigid registration method for ultrasound images. The most predominant anatomical features in medical images are tissue boundaries, which appear as edges. In ultrasound images, however, other features can be identified as well due to the specular reflections that appear as bright lines superimposed on the ideal edge location. In this work, an image's local phase information (via the frequency domain) is used to find the ideal edge location. The generalized relaxation labeling process is then formulated to align the feature points extracted from the ideal edge location. In this work, the original relaxation labeling method was generalized by taking n compatibility coefficient values to improve non-rigid registration performance. This contextual information combined with a relaxation labeling process is used to search for a correspondence. Then the transformation is calculated by the thin plate spline (TPS) model. These two processes are iterated until the optimal correspondence and transformation are found. We have tested our proposed method and the state-of-the-art algorithms with synthetic data and bladder ultrasound images of in vivo human subjects. Experiments show that the proposed method improves registration performance significantly, as compared to other state-of-the-art non-rigid registration algorithms.

  17. General induction at companies - between an administrative process and a sociological phenomenon

    Directory of Open Access Journals (Sweden)

    Héctor L. Bermúdez Restrepo

    2012-12-01

    Full Text Available Starting from the example of the process of general induction into the organization, and drawing on certain sociological resources, this paper shows the paradox facing specialists in human management: they are expected to care for the motivation and welfare of workers in order to achieve high performance, loyalty and tenure at the company; however, current mutations of the social architecture in general, and of work in particular (as a structure of organized action), suggest that organizational loyalty tends to be increasingly unlikely and that, conversely, current personnel administration processes appear to be built on inappropriate notions and to contribute directly to the adversities of human beings in organizational settings.

  18. 41 CFR 102-37.50 - What is the general process for requesting surplus property for donation?

    Science.gov (United States)

    2010-07-01

    ... process for requesting surplus property for donation? 102-37.50 Section 102-37.50 Public Contracts and... REGULATION PERSONAL PROPERTY 37-DONATION OF SURPLUS PERSONAL PROPERTY General Provisions Donation Overview § 102-37.50 What is the general process for requesting surplus property for donation? The process for...

  19. General methodology for exergy balance in ProSimPlus® process simulator

    International Nuclear Information System (INIS)

    Ghannadzadeh, Ali; Thery-Hetreux, Raphaële; Baudouin, Olivier; Baudet, Philippe; Floquet, Pascal; Joulia, Xavier

    2012-01-01

    This paper presents a general methodology for exergy balance in chemical and thermal processes integrated in ProSimPlus ® as a well-adapted process simulator for energy efficiency analysis. In this work, as well as using the general expressions for heat and work streams, the whole exergy balance is presented within only one software in order to fully automate exergy analysis. In addition, after exergy balance, the essential elements such as source of irreversibility for exergy analysis are presented to help the user for modifications on either process or utility system. The applicability of the proposed methodology in ProSimPlus ® is shown through a simple scheme of Natural Gas Liquids (NGL) recovery process and its steam utility system. The methodology does not only provide the user with necessary exergetic criteria to pinpoint the source of exergy losses, it also helps the user to find the way to reduce the exergy losses. These features of the proposed exergy calculator make it preferable for its implementation in ProSimPlus ® to define the most realistic and profitable retrofit projects on the existing chemical and thermal plants. -- Highlights: ► A set of new expressions for calculation of exergy of material streams is developed. ► A general methodology for exergy balance in ProSimPlus ® is presented. ► A panel of solutions based on exergy analysis is provided to help the user for modifications on process flowsheets. ► The exergy efficiency is chosen as a variable in a bi-criteria optimization.
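    For orientation, the exergy bookkeeping referred to above usually rests on the generic textbook expressions below; the paper's ProSimPlus® implementation adds chemical exergy terms and stream-specific details not shown here.

        % Physical exergy of a material stream relative to the environment (T_0, p_0):
        \[
          ex_{\mathrm{ph}} \;=\; (h - h_0) \;-\; T_0\,(s - s_0)
        \]
        % Unit-level exergy balance; the destruction term quantifies irreversibility:
        \[
          \dot{E}x_{\mathrm{destroyed}} \;=\; \sum_{\mathrm{in}} \dot{E}x \;-\; \sum_{\mathrm{out}} \dot{E}x \;\ge\; 0,
          \qquad
          \eta_{\mathrm{ex}} \;=\; \frac{\sum_{\mathrm{out,\,useful}} \dot{E}x}{\sum_{\mathrm{in}} \dot{E}x}
        \]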

  20. On-line validation of linear process models using generalized likelihood ratios

    International Nuclear Information System (INIS)

    Tylee, J.L.

    1981-12-01

    A real-time method for testing the validity of linear models of nonlinear processes is described and evaluated. Using generalized likelihood ratios, the model dynamics are continually monitored to see if the process has moved far enough away from the nominal linear model operating point to justify generation of a new linear model. The method is demonstrated using a seventh-order model of a natural circulation steam generator
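    The sketch below shows, under simplifying assumptions (Gaussian residuals with known variance and a possible constant bias), how a generalized likelihood ratio statistic can flag that a linearized model has drifted from its operating point; the paper's formulation for the seventh-order steam-generator model is more elaborate.

        import numpy as np

        def glr_statistic(residuals, sigma2):
            # Under H0 the nominal linear model is valid (zero-mean residuals);
            # under H1 the residual mean is an unknown constant bias.  For Gaussian
            # residuals, 2*log(GLR) reduces to N * mean(r)**2 / sigma2.
            r = np.asarray(residuals, dtype=float)
            return len(r) * r.mean() ** 2 / sigma2

        # Hypothetical monitoring loop: re-linearize once the statistic crosses a
        # chi-square(1) threshold (3.84 at the 95% level).
        rng = np.random.default_rng(1)
        drifted = rng.normal(0.3, 1.0, size=200)
        if glr_statistic(drifted, sigma2=1.0) > 3.84:
            print("process left the nominal operating point; generate a new linear model")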

  1. Use of general purpose graphics processing units with MODFLOW

    Science.gov (United States)

    Hughes, Joseph D.; White, Jeremy T.

    2013-01-01

    To evaluate the use of general-purpose graphics processing units (GPGPUs) to improve the performance of MODFLOW, an unstructured preconditioned conjugate gradient (UPCG) solver has been developed. The UPCG solver uses a compressed sparse row storage scheme and includes Jacobi, zero fill-in incomplete, and modified-incomplete lower-upper (LU) factorization, and generalized least-squares polynomial preconditioners. The UPCG solver also includes options for sequential and parallel solution on the central processing unit (CPU) using OpenMP. For simulations utilizing the GPGPU, all basic linear algebra operations are performed on the GPGPU; memory copies between the CPU and GPGPU occur prior to the first iteration of the UPCG solver and after satisfying head and flow criteria or exceeding a maximum number of iterations. The efficiency of the UPCG solver for GPGPU and CPU solutions is benchmarked using simulations of a synthetic, heterogeneous unconfined aquifer with tens of thousands to millions of active grid cells. Testing indicates GPGPU speedups on the order of 2 to 8, relative to the standard MODFLOW preconditioned conjugate gradient (PCG) solver, can be achieved when (1) memory copies between the CPU and GPGPU are optimized, (2) the percentage of time performing memory copies between the CPU and GPGPU is small relative to the calculation time, (3) high-performance GPGPU cards are utilized, and (4) CPU-GPGPU combinations are used to execute sequential operations that are difficult to parallelize. Furthermore, UPCG solver testing indicates GPGPU speedups exceed parallel CPU speedups achieved using OpenMP on multicore CPUs for preconditioners that can be easily parallelized.
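    A plain-CPU reference of the preconditioned conjugate gradient iteration at the core of such a solver is sketched below with a Jacobi preconditioner; in the GPGPU version described above, the matrix-vector products and vector updates are the operations offloaded to the card, and dense NumPy arrays stand in for the compressed sparse row storage.

        import numpy as np

        def jacobi_pcg(A, b, tol=1e-8, maxiter=500):
            # Jacobi-preconditioned conjugate gradient; on a GPGPU the dot products,
            # axpy updates and matrix-vector products below are the parts executed
            # on the card.
            minv = 1.0 / np.diag(A)          # Jacobi preconditioner M^-1
            x = np.zeros_like(b)
            r = b - A @ x
            z = minv * r
            p = z.copy()
            rz = r @ z
            for _ in range(maxiter):
                Ap = A @ p
                alpha = rz / (p @ Ap)
                x = x + alpha * p
                r = r - alpha * Ap
                if np.linalg.norm(r) < tol:
                    break
                z = minv * r
                rz_new = r @ z
                p = z + (rz_new / rz) * p
                rz = rz_new
            return x

        A = np.array([[4.0, 1.0], [1.0, 3.0]])   # toy SPD system
        b = np.array([1.0, 2.0])
        print(jacobi_pcg(A, b))                  # approx. [0.0909, 0.6364]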

  2. Information processing speed mediates the relationship between white matter and general intelligence in schizophrenia.

    Science.gov (United States)

    Alloza, Clara; Cox, Simon R; Duff, Barbara; Semple, Scott I; Bastin, Mark E; Whalley, Heather C; Lawrie, Stephen M

    2016-08-30

    Several authors have proposed that schizophrenia is the result of impaired connectivity between specific brain regions rather than differences in local brain activity. White matter abnormalities have been suggested as the anatomical substrate for this dysconnectivity hypothesis. Information processing speed may act as a key cognitive resource facilitating higher order cognition by allowing multiple cognitive processes to be simultaneously available. However, there is a lack of established associations between these variables in schizophrenia. We hypothesised that the relationship between white matter and general intelligence would be mediated by processing speed. White matter water diffusion parameters were studied using Tract-based Spatial Statistics and computed within 46 regions-of-interest (ROI). Principal component analysis was conducted on these white matter ROI for fractional anisotropy (FA) and mean diffusivity, and on neurocognitive subtests to extract general factors of white matter structure (gFA, gMD), general intelligence (g) and processing speed (gspeed). There was a positive correlation between g and gFA (r = 0.67, p = 0.001) that was partially and significantly mediated by gspeed (56.22%; CI: 0.10-0.62). These findings suggest a plausible model of structure-function relations in schizophrenia, whereby white matter structure may provide a neuroanatomical substrate for general intelligence, which is partly supported by speed of information processing. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  3. Generalized Inferences about the Mean Vector of Several Multivariate Gaussian Processes

    Directory of Open Access Journals (Sweden)

    Pilar Ibarrola

    2015-01-01

    Full Text Available We consider in this paper the problem of comparing the means of several multivariate Gaussian processes. It is assumed that the means depend linearly on an unknown vector parameter θ and that nuisance parameters appear in the covariance matrices. More precisely, we deal with the problem of testing hypotheses, as well as obtaining confidence regions for θ. Both methods will be based on the concepts of generalized p value and generalized confidence region adapted to our context.

  4. Information in general medical practices: the information processing model.

    Science.gov (United States)

    Crowe, Sarah; Tully, Mary P; Cantrill, Judith A

    2010-04-01

    The need for effective communication and handling of secondary care information in general practices is paramount. To explore practice processes on receiving secondary care correspondence in a way that integrates the information needs and perceptions of practice staff both clinical and administrative. Qualitative study using semi-structured interviews with a wide range of practice staff (n = 36) in nine practices in the Northwest of England. Analysis was based on the framework approach using N-Vivo software and involved transcription, familiarization, coding, charting, mapping and interpretation. The 'information processing model' was developed to describe the six stages involved in practice processing of secondary care information. These included the amendment or updating of practice records whilst simultaneously or separately actioning secondary care recommendations, using either a 'one-step' or 'two-step' approach, respectively. Many factors were found to influence each stage and impact on the continuum of patient care. The primary purpose of processing secondary care information is to support patient care; this study raises the profile of information flow and usage within practices as an issue requiring further consideration.

  5. Generalized enthalpy model of a high-pressure shift freezing process

    KAUST Repository

    Smith, N. A. S.

    2012-05-02

    High-pressure freezing processes are a novel emerging technology in food processing, offering significant improvements to the quality of frozen foods. To be able to simulate plateau times and thermal history under different conditions, in this work, we present a generalized enthalpy model of the high-pressure shift freezing process. The model includes the effects of pressure on conservation of enthalpy and incorporates the freezing point depression of non-dilute food samples. In addition, the significant heat-transfer effects of convection in the pressurizing medium are accounted for by solving the two-dimensional Navier-Stokes equations. We run the model for several numerical tests where the food sample is agar gel, and find good agreement with experimental data from the literature. © 2012 The Royal Society.

  6. The role of culture in the general practice consultation process.

    Science.gov (United States)

    Ali, Nasreen; Atkin, Karl; Neal, Richard

    2006-11-01

    In this paper, we will examine the importance of culture and ethnicity in the general practice consultation process. Good communication is associated with positive health outcomes. We will, by presenting qualitative material from an empirical study, examine the way in which communication within the context of a general practitioner (GP) consultation may be affected by ethnicity and cultural factors. The aim of the study was to provide a detailed understanding of the ways in which white and South Asian patients communicate with white GPs and to explore any similarities and differences in communication. This paper reports on South Asian and white patients' explanations of recent videotaped consultations with their GP. We specifically focus on the ways in which issues of ethnic identity impacted upon the GP consultation process, by exploring how our sample of predominantly white GPs interacted with their South Asian patients and the extent to which the GP listened to the patients' needs, gave patients information, engaged in social conversation and showed friendliness. We then go on to examine patients' suggestions on improvements (if any) to the consultation. We conclude, by showing how a non-essentialist understanding of culture helps to comprehend the consultation process when the patients are from Great Britain's ethnicised communities. Our findings, however, raise generic issues of relevance to all multi-racial and multi-ethnic societies.

  7. Understanding price discovery in interconnected markets: Generalized Langevin process approach and simulation

    Science.gov (United States)

    Schenck, Natalya A.; Horvath, Philip A.; Sinha, Amit K.

    2018-02-01

    While the literature on price discovery process and information flow between dominant and satellite market is exhaustive, most studies have applied an approach that can be traced back to Hasbrouck (1995) or Gonzalo and Granger (1995). In this paper, however, we propose a Generalized Langevin process with asymmetric double-well potential function, with co-integrated time series and interconnected diffusion processes to model the information flow and price discovery process in two, a dominant and a satellite, interconnected markets. A simulated illustration of the model is also provided.
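    A minimal numerical sketch of the modelling idea, not the authors' calibrated model, is given below: an Euler-Maruyama discretization of a dominant price diffusing in an asymmetric double-well potential, with a satellite price mean-reverting toward it. All parameter values and the coupling form are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(2)

        def dV(x, a=1.0, b=0.3):
            # Gradient of an asymmetric double-well V(x) = x**4/4 - a*x**2/2 + b*x
            return x**3 - a * x + b

        def simulate(n_steps=10_000, dt=1e-3, kappa=0.5, sigma=0.2):
            # Euler-Maruyama discretization of two interconnected diffusions:
            # x is the dominant market, y the satellite pulled toward x.
            x = np.empty(n_steps)
            y = np.empty(n_steps)
            x[0] = y[0] = 0.0
            for t in range(1, n_steps):
                dw1, dw2 = rng.normal(0.0, np.sqrt(dt), size=2)
                x[t] = x[t - 1] - dV(x[t - 1]) * dt + sigma * dw1
                y[t] = y[t - 1] + kappa * (x[t - 1] - y[t - 1]) * dt + sigma * dw2
            return x, y

        x, y = simulate()
        print(np.corrcoef(np.diff(x), np.diff(y))[0, 1])   # co-movement of increments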

  8. General Template for the FMEA Applications in Primary Food Processing.

    Science.gov (United States)

    Özilgen, Sibel; Özilgen, Mustafa

    Data on the hazards involved in the primary steps of processing cereals, fruit and vegetables, milk and milk products, meat and meat products, and fats and oils are compiled with a wide-ranging literature survey. After determining the common factors from these data, a general FMEA template is offered, and its use is explained with a case study on pasteurized milk production.

  9. Ergodicity breaking, ageing, and confinement in generalized diffusion processes with position and time dependent diffusivity

    International Nuclear Information System (INIS)

    Cherstvy, Andrey G; Metzler, Ralf

    2015-01-01

    We study generalized anomalous diffusion processes whose diffusion coefficient D(x, t) ∼ D_0|x|^α t^β depends on both the position x of the test particle and the process time t. This process thus combines the features of scaled Brownian motion and heterogeneous diffusion parent processes. We compute the ensemble and time averaged mean squared displacements of this generalized diffusion process. The scaling exponent of the ensemble averaged mean squared displacement is shown to be the product of the critical exponents of the parent processes, and describes both subdiffusive and superdiffusive systems. We quantify the amplitude fluctuations of the time averaged mean squared displacement as a function of the length of the time series and the lag time. In particular, we observe a weak ergodicity breaking of this generalized diffusion process: even in the long time limit the ensemble and time averaged mean squared displacements are strictly disparate. When we start to observe this process some time after its initiation, we observe distinct features of ageing. We derive a universal ageing factor for the time averaged mean squared displacement containing all information on the ageing time and the measurement time. External confinement is shown to alter the magnitudes and statistics of the ensemble and time averaged mean squared displacements. (paper)
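    The disparity between ensemble- and time-averaged mean squared displacements described above can be probed qualitatively with the simple Langevin sketch below, which integrates x' = sqrt(2*D(x,t))*xi(t) with D(x,t) = D_0|x|^α t^β; the Ito interpretation and all parameter values are assumptions of the illustration.

        import numpy as np

        rng = np.random.default_rng(3)

        def trajectories(n_traj=500, n_steps=2000, dt=1e-2, d0=1.0, alpha=0.5, beta=0.3):
            # Euler (Ito) integration of x' = sqrt(2*D(x,t)) * xi(t) with
            # D(x,t) = d0*|x|**alpha * t**beta; started slightly off the origin.
            x = np.full(n_traj, 1e-3)
            out = np.empty((n_steps, n_traj))
            for i in range(n_steps):
                t = (i + 1) * dt
                diff = d0 * np.abs(x) ** alpha * t ** beta
                x = x + np.sqrt(2.0 * diff * dt) * rng.normal(size=n_traj)
                out[i] = x
            return out

        x = trajectories()
        eamsd = np.mean((x - x[0]) ** 2, axis=1)            # ensemble-averaged MSD
        lag = 100
        tamsd = np.mean((x[lag:] - x[:-lag]) ** 2, axis=0)  # time-averaged MSD per trajectory
        print(eamsd[-1], tamsd.mean())                      # disparity signals ergodicity breaking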

  10. The process of patient enablement in general practice nurse consultations: a grounded theory study.

    Science.gov (United States)

    Desborough, Jane; Banfield, Michelle; Phillips, Christine; Mills, Jane

    2017-05-01

    The aim of this study was to gain insight into the process of patient enablement in general practice nursing consultations. Enhanced roles for general practice nurses may benefit patients through a range of mechanisms, one of which may be increasing patient enablement. In studies with general practitioners enhanced patient enablement has been associated with increases in self-efficacy and skill development. This study used a constructivist grounded theory design. In-depth interviews were conducted with 16 general practice nurses and 23 patients from 21 general practices between September 2013 - March 2014. Data generation and analysis were conducted concurrently using constant comparative analysis and theoretical sampling focussing on the process and outcomes of patient enablement. Use of the storyline technique supported theoretical coding and integration of the data into a theoretical model. A clearly defined social process that fostered and optimised patient enablement was constructed. The theory of 'developing enabling healthcare partnerships between nurses and patients in general practice' incorporates three stages: triggering enabling healthcare partnerships, tailoring care and the manifestation of patient enablement. Patient enablement was evidenced through: 1. Patients' understanding of their unique healthcare requirements informing their health seeking behaviours and choices; 2. Patients taking an increased lead in their partnership with a nurse and seeking choices in their care and 3. Patients getting health care that reflected their needs, preferences and goals. This theoretical model is in line with a patient-centred model of health care and is particularly suited to patients with chronic disease. © 2016 John Wiley & Sons Ltd.

  11. Technical Safety Requirements for the B695 Segment

    Energy Technology Data Exchange (ETDEWEB)

    Laycak, D

    2008-09-11

    This document contains Technical Safety Requirements (TSRs) for the Radioactive and Hazardous Waste Management (RHWM) Division's B695 Segment of the Decontamination and Waste Treatment Facility (DWTF) at Lawrence Livermore National Laboratory (LLNL). The TSRs constitute requirements regarding the safe operation of the B695 Segment. The TSRs are derived from the Documented Safety Analysis (DSA) for the B695 Segment (LLNL 2007). The analysis presented there determined that the B695 Segment is a low-chemical hazard, Hazard Category 3, nonreactor nuclear facility. The TSRs consist primarily of inventory limits as well as controls to preserve the underlying assumptions in the hazard analyses. Furthermore, appropriate commitments to safety programs are presented in the administrative controls section of the TSRs. The B695 Segment (B695 and the west portion of B696) is a waste treatment and storage facility located in the northeast quadrant of the LLNL main site. The approximate area and boundary of the B695 Segment are shown in the B695 Segment DSA. Activities typically conducted in the B695 Segment include container storage, lab-packing, repacking, overpacking, bulking, sampling, waste transfer, and waste treatment. B695 is used to store and treat radioactive, mixed, and hazardous waste, and it also contains equipment used in conjunction with waste processing operations to treat various liquid and solid wastes. The portion of the building called Building 696 Solid Waste Processing Area (SWPA), also referred to as B696S in this report, is used primarily to manage solid radioactive, mixed, and hazardous waste. Operations specific to the SWPA include sorting and segregating waste, lab-packing, sampling, and crushing empty drums that previously contained waste. Furthermore, a Waste Packaging Unit will be permitted to treat hazardous and mixed waste. RHWM generally processes LLW with no, or extremely low, concentrations of transuranics (i.e., much less than 100 n

  12. Technical Safety Requirements for the B695 Segment

    International Nuclear Information System (INIS)

    Laycak, D.

    2008-01-01

    This document contains Technical Safety Requirements (TSRs) for the Radioactive and Hazardous Waste Management (RHWM) Division's B695 Segment of the Decontamination and Waste Treatment Facility (DWTF) at Lawrence Livermore National Laboratory (LLNL). The TSRs constitute requirements regarding the safe operation of the B695 Segment. The TSRs are derived from the Documented Safety Analysis (DSA) for the B695 Segment (LLNL 2007). The analysis presented there determined that the B695 Segment is a low-chemical hazard, Hazard Category 3, nonreactor nuclear facility. The TSRs consist primarily of inventory limits as well as controls to preserve the underlying assumptions in the hazard analyses. Furthermore, appropriate commitments to safety programs are presented in the administrative controls section of the TSRs. The B695 Segment (B695 and the west portion of B696) is a waste treatment and storage facility located in the northeast quadrant of the LLNL main site. The approximate area and boundary of the B695 Segment are shown in the B695 Segment DSA. Activities typically conducted in the B695 Segment include container storage, lab-packing, repacking, overpacking, bulking, sampling, waste transfer, and waste treatment. B695 is used to store and treat radioactive, mixed, and hazardous waste, and it also contains equipment used in conjunction with waste processing operations to treat various liquid and solid wastes. The portion of the building called Building 696 Solid Waste Processing Area (SWPA), also referred to as B696S in this report, is used primarily to manage solid radioactive, mixed, and hazardous waste. Operations specific to the SWPA include sorting and segregating waste, lab-packing, sampling, and crushing empty drums that previously contained waste. Furthermore, a Waste Packaging Unit will be permitted to treat hazardous and mixed waste. RHWM generally processes LLW with no, or extremely low, concentrations of transuranics (i.e., much less than 100 n

  13. Toward a model framework of generalized parallel componential processing of multi-symbol numbers.

    Science.gov (United States)

    Huber, Stefan; Cornelsen, Sonja; Moeller, Korbinian; Nuerk, Hans-Christoph

    2015-05-01

    In this article, we propose and evaluate a new model framework of parallel componential multi-symbol number processing, generalizing the idea of parallel componential processing of multi-digit numbers to the case of negative numbers by considering the polarity signs similar to single digits. In a first step, we evaluated this account by defining and investigating a sign-decade compatibility effect for the comparison of positive and negative numbers, which extends the unit-decade compatibility effect in 2-digit number processing. Then, we evaluated whether the model is capable of accounting for previous findings in negative number processing. In a magnitude comparison task, in which participants had to single out the larger of 2 integers, we observed a reliable sign-decade compatibility effect with prolonged reaction times for incompatible (e.g., -97 vs. +53; in which the number with the larger decade digit has the smaller, i.e., negative polarity sign) as compared with sign-decade compatible number pairs (e.g., -53 vs. +97). Moreover, an analysis of participants' eye fixation behavior corroborated our model of parallel componential processing of multi-symbol numbers. These results are discussed in light of concurrent theoretical notions about negative number processing. On the basis of the present results, we propose a generalized integrated model framework of parallel componential multi-symbol processing. (c) 2015 APA, all rights reserved).

  14. Process evaluation of a practice nurse-led smoking cessation trial in Australian general practice: views of general practitioners and practice nurses.

    Science.gov (United States)

    Halcomb, Elizabeth J; Furler, John S; Hermiz, Oshana S; Blackberry, Irene D; Smith, Julie P; Richmond, Robyn L; Zwar, Nicholas A

    2015-08-01

    Support in primary care can assist smokers to quit successfully, but there are barriers to general practitioners (GPs) providing this support routinely. Practice nurses (PNs) may be able to effectively take on this role. The aim of this study was to perform a process evaluation of a PN-led smoking cessation intervention being tested in a randomized controlled trial in Australian general practice. Process evaluation was conducted by means of semi-structured telephone interviews with GPs and PNs allocated in the intervention arm (Quit with PN) of the Quit in General Practice trial. Interviews focussed on nurse training, content and implementation of the intervention. Twenty-two PNs and 15 GPs participated in the interviews. The Quit with PN intervention was viewed positively. Most PNs were satisfied with the training and the materials provided. Some challenges in managing patient data and follow-up were identified. The Quit with PN intervention was acceptable to participating PNs and GPs. Issues to be addressed in the planning and wider implementation of future trials of nurse-led intervention in general practice include providing ongoing mentoring support, integration into practice management systems and strategies to promote greater collaboration in GPs and PN teams in general practice. The ongoing feasibility of the intervention was impacted by the funding model supporting PN employment and the competing demands on the PNs time. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  15. 75 FR 29217 - Office of the Attorney General; Certification Process for State Capital Counsel Systems; Removal...

    Science.gov (United States)

    2010-05-25

    ... Office of the Attorney General; Certification Process for State Capital Counsel Systems; Removal of Final Rule AGENCY: Office of the Attorney General, Department of Justice. ACTION: Notice of proposed... the Attorney General has certified ``that [the] State has established a mechanism for providing...

  16. Electoral Violence and Democratization Process in Nigeria: A Reference of 2011 and 2015 General Elections

    Directory of Open Access Journals (Sweden)

    Lawrence I. EDET

    2015-09-01

    Full Text Available The general account of Nigeria’s post-independence electoral processes has always been characterized by violence. Nigeria’s 2015 general elections marked the fifth multi-party elections in the country and the second handover between civilian administrations since the inception of the Fourth Republic democratic experiment in 1999. This account cannot be given without addressing electoral violence. Electoral violence has been a permanent feature of Nigeria’s democratic process, except for the 2015 general elections, which international observers described as a “significant improvement” over previous elections in terms of violence-related cases. Election-related violence in the country, particularly in 2011, reached an unprecedented dimension, resulting in the destruction of lives and property worth millions of naira. This paper expatiates on electoral violence and its general implications for the democratization process in the country, with major emphasis on the 2011 and 2015 general elections. The paper argues that the high incidence of pre- and post-election violence in the country within these periods has to do with the way Nigerian politicians regard politics, weak political institutions, a weak electoral management body and the biased nature of the security agencies. The paper also examines the general implications of electoral violence for the democratization process and how the country can manage the electoral process to avoid the threats associated with electoral violence. Archival analysis, drawing widely on data from newspapers, journals, workshop papers, books and publications of non-governmental organizations, was adopted for the study. The major significance of this study is to expose the negative implications associated with electoral violence and how it can be curbed. The position canvassed in this paper will serve as useful political literature for political leaders, policy makers and the general reading public who

  17. Generalized Hofmann quantum process fidelity bounds for quantum filters

    Science.gov (United States)

    Sedlák, Michal; Fiurášek, Jaromír

    2016-04-01

    We propose and investigate bounds on the quantum process fidelity of quantum filters, i.e., probabilistic quantum operations represented by a single Kraus operator K . These bounds generalize the Hofmann bounds on the quantum process fidelity of unitary operations [H. F. Hofmann, Phys. Rev. Lett. 94, 160504 (2005), 10.1103/PhysRevLett.94.160504] and are based on probing the quantum filter with pure states forming two mutually unbiased bases. Determination of these bounds therefore requires far fewer measurements than full quantum process tomography. We find that it is particularly suitable to construct one of the probe bases from the right eigenstates of K , because in this case the bounds are tight in the sense that if the actual filter coincides with the ideal one, then both the lower and the upper bounds are equal to 1. We theoretically investigate the application of these bounds to a two-qubit optical quantum filter formed by the interference of two photons on a partially polarizing beam splitter. For an experimentally convenient choice of factorized input states and measurements we study the tightness of the bounds. We show that more stringent bounds can be obtained by more sophisticated processing of the data using convex optimization and we compare our methods for different choices of the input probe states.
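    For reference, the original Hofmann construction that the cited work generalizes bounds the process fidelity F of an operation probed with two mutually unbiased input bases, with average output-state fidelities F_1 and F_2, as

        \[
          F_1 + F_2 - 1 \;\le\; F \;\le\; \min(F_1, F_2).
        \]

    In the filter case treated above, choosing one probe basis from the right eigenstates of the Kraus operator K is what makes the analogous bounds tight when the actual filter coincides with the ideal one.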

  18. 75 FR 71353 - Office of the Attorney General; Certification Process for State Capital Counsel Systems; Removal...

    Science.gov (United States)

    2010-11-23

    ... the Attorney General; Certification Process for State Capital Counsel Systems; Removal of Final Rule... only if the Attorney General has certified ``that [the] State has established a mechanism for providing... State to qualify for the special habeas procedures, the Attorney General must determine that ``the State...

  19. GENERAL ISSUES CONSIDERING BRAND EQUITY WITHIN THE NATION BRANDING PROCESS

    Directory of Open Access Journals (Sweden)

    Denisa, COTÎRLEA

    2014-11-01

    Full Text Available This paper was written to provide an overview of the intangible values that actively contribute to brand capital formation within the nation branding process. Through this article, the author tries to emphasize the differences between brand capital and brand equity within the context of the nation branding process, which has become a widely approached subject in both the national and international literature. The evolution of brand capital and brand equity is also addressed, in order to identify and explain their components and their role, by highlighting the entire process of their evolution as a sequence of steps. The results of this paper focus on the identification of a structured flowchart through which the process of nation branding (and brand capital itself) can be perceived as holistic, integrative and inter-correlated concepts that are easily understood. The methodology used to write the present article comprises the appropriate methods and techniques for collecting and processing empirical data and information, namely observing, sorting, correlating, categorizing, comparing and analyzing data, so that the theoretical elements addressed could be grounded; at the center of the qualitative thematic research addressed in the present article lie general elements of Romania's image and identity promotion.

  20. Introduction of a pyramid guiding process for general musculoskeletal physical rehabilitation

    Directory of Open Access Journals (Sweden)

    Stark Timothy W

    2006-06-01

    Full Text Available Abstract Successful instruction of a complicated subject as Physical Rehabilitation demands organization. To understand principles and processes of such a field demands a hierarchy of steps to achieve the intended outcome. This paper is intended to be an introduction to a proposed pyramid scheme of general physical rehabilitation principles. The purpose of the pyramid scheme is to allow for a greater understanding for the student and patient. As the respected Food Guide Pyramid accomplishes, the student will further appreciate and apply supported physical rehabilitation principles and the patient will understand that there is a progressive method to their functional healing process.

  1. General purpose graphic processing unit implementation of adaptive pulse compression algorithms

    Science.gov (United States)

    Cai, Jingxiao; Zhang, Yan

    2017-07-01

    This study introduces a practical approach to implement real-time signal processing algorithms for general surveillance radar based on NVIDIA graphical processing units (GPUs). The pulse compression algorithms are implemented using compute unified device architecture (CUDA) libraries such as CUDA basic linear algebra subroutines and CUDA fast Fourier transform library, which are adopted from open source libraries and optimized for the NVIDIA GPUs. For more advanced, adaptive processing algorithms such as adaptive pulse compression, customized kernel optimization is needed and investigated. A statistical optimization approach is developed for this purpose without needing much knowledge of the physical configurations of the kernels. It was found that the kernel optimization approach can significantly improve the performance. Benchmark performance is compared with the CPU performance in terms of processing accelerations. The proposed implementation framework can be used in various radar systems including ground-based phased array radar, airborne sense and avoid radar, and aerospace surveillance radar.
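    The core of non-adaptive pulse compression is a matched filter applied in the frequency domain, which is the FFT/multiply/inverse-FFT chain that the GPU libraries above accelerate. A NumPy stand-in is sketched below with a hypothetical linear-FM waveform; on the GPU, cuFFT and cuBLAS calls would replace the NumPy operations, and the adaptive variants additionally need the customized kernels discussed in the abstract.

        import numpy as np

        def pulse_compress(rx, tx):
            # Frequency-domain matched filter: FFT the received signal, multiply by
            # the conjugate spectrum of the transmitted waveform, inverse FFT.
            n = len(rx) + len(tx) - 1
            return np.fft.ifft(np.fft.fft(rx, n) * np.conj(np.fft.fft(tx, n)))

        # Hypothetical linear-FM (chirp) waveform and a noisy echo delayed by 200 samples.
        fs, T, B = 1e6, 1e-3, 100e3
        t = np.arange(0.0, T, 1.0 / fs)
        tx = np.exp(1j * np.pi * (B / T) * t ** 2)
        rx = np.zeros(4096, dtype=complex)
        rx[200:200 + len(tx)] += tx
        rx += 0.1 * (np.random.randn(4096) + 1j * np.random.randn(4096))
        out = np.abs(pulse_compress(rx, tx))
        print(out.argmax())        # compressed peak appears near the 200-sample delay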

  2. Adherence to diabetes care processes at general practices in the National Capital Region-Delhi, India

    Directory of Open Access Journals (Sweden)

    Roopa Shivashankar

    2016-01-01

    Full Text Available Aim: To assess the level of adherence to diabetes care processes, and associated clinic and patient factors at general practices in Delhi, India. Methods: We interviewed physicians (n = 23) and patients with diabetes (n = 406), and reviewed patient charts at general practices (government = 5; private = 18). We examined diabetes care processes, specifically measurement of weight, blood pressure (BP), glycated hemoglobin (HbA1c), lipids, electrocardiogram, dilated eye, and a foot examination in the last one year. We analyzed clinic and patient factors associated with the number of care processes achieved using a multilevel Poisson regression model. Results: The average number of clinic visits per patient was 8.8/year (standard deviation = 5.7), and physicians had access to patient's previous records in only 19.7% of patients. Dilated eye exam, foot exam, and electrocardiogram were completed in 7.4%, 15.1%, and 29.1% of patients, respectively. An estimated 51.7%, 88.4%, and 28.1% had ≥1 measurement of HbA1c, BP, and lipids, respectively. Private clinics, physician access to patient's previous records, use of nonphysicians, patient education, and the presence of diabetes complications were positively associated with the number of care processes in the multivariable model. Conclusion: Adherence to diabetes care processes was suboptimal. Encouraging implementation of quality improvement strategies like Chronic Care Model elements at general practices may improve diabetes care.

  3. General distributions in process algebra

    NARCIS (Netherlands)

    Katoen, Joost P.; d' Argenio, P.R.; Brinksma, Hendrik; Hermanns, H.; Katoen, Joost P.

    2001-01-01

    This paper is an informal tutorial on stochastic process algebras, i.e., process calculi where action occurrences may be subject to a delay that is governed by a (mostly continuous) random variable. Whereas most stochastic process algebras consider delays determined by negative exponential

  4. Profile of science process skills of Preservice Biology Teacher in General Biology Course

    Science.gov (United States)

    Susanti, R.; Anwar, Y.; Ermayanti

    2018-04-01

    This study aims to obtain a profile of science process skills among preservice biology teachers. The research took place at Sriwijaya University and involved 41 participants. To collect the data, the study used a multiple-choice test comprising 40 items to measure mastery of science process skills. The data were then analyzed in a descriptive manner. The results showed that the communication aspect outperformed the other skills at 81%, while the lowest were identifying variables and predicting (59%). In addition, basic science process skills scored 72%, whereas integrated skills were somewhat lower at 67%. In general, the capability in science process skills varies among preservice biology teachers.

  5. Generalized atomic processes for interaction of intense femtosecond XUV- and X-ray radiation with solids

    International Nuclear Information System (INIS)

    Deschaud, B.; Peyrusse, O.; Rosmej, F.B.

    2014-01-01

    Generalized atomic processes are proposed to establish a consistent description from the free-atom approach to the heated and even up to the cold solid. It is based on a rigorous introduction of the Fermi-Dirac statistics, Pauli blocking factors and on the respect of the principle of detailed balance via the introduction of direct and inverse processes. A probability formalism driven by the degeneracy of the free electrons enables to establish a link of atomic rates valid from the heated atom up to the cold solid. This allows to describe photoionization processes in atomic population kinetics and subsequent solid matter heating on a femtosecond time scale. The Auger effect is linked to the 3-body recombination via a generalized 3-body recombination that is identified as a key mechanism, along with the collisional ionization, that follows energy deposition by photoionization of inner shells when short, intense and high-energy radiation interacts with matter. Detailed simulations are carried out for aluminum that highlight the importance of the generalized approach. (authors)

  6. Generalized random walk algorithm for the numerical modeling of complex diffusion processes

    CERN Document Server

    Vamos, C; Vereecken, H

    2003-01-01

    A generalized form of the random walk algorithm to simulate diffusion processes is introduced. Unlike the usual approach, at a given time all the particles from a grid node are simultaneously scattered using the Bernoulli repartition. This procedure saves memory and computing time and no restrictions are imposed for the maximum number of particles to be used in simulations. We prove that for simple diffusion the method generalizes the finite difference scheme and gives the same precision for large enough number of particles. As an example, simulations of diffusion in random velocity field are performed and the main features of the stochastic mathematical model are numerically tested.

  7. Generalized random walk algorithm for the numerical modeling of complex diffusion processes

    International Nuclear Information System (INIS)

    Vamos, Calin; Suciu, Nicolae; Vereecken, Harry

    2003-01-01

    A generalized form of the random walk algorithm to simulate diffusion processes is introduced. Unlike the usual approach, at a given time all the particles from a grid node are simultaneously scattered using the Bernoulli repartition. This procedure saves memory and computing time and no restrictions are imposed for the maximum number of particles to be used in simulations. We prove that for simple diffusion the method generalizes the finite difference scheme and gives the same precision for large enough number of particles. As an example, simulations of diffusion in random velocity field are performed and the main features of the stochastic mathematical model are numerically tested
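    The key step of the algorithm, scattering all particles of a grid node at once through binomial draws instead of moving them one by one, can be sketched for simple 1-D diffusion as follows; the grid size, boundary treatment and parameter values are assumptions of the example.

        import numpy as np

        rng = np.random.default_rng(4)

        def grw_step(n, r):
            # One global-random-walk step on a 1-D grid: n[i] holds the particle
            # count at node i, and all particles of a node are scattered at once by
            # binomial draws ("Bernoulli repartition") instead of one by one.
            # r = D*dt/dx**2 <= 1/2 is the jump probability to each neighbour for
            # simple diffusion; boundaries are absorbing in this sketch.
            moved = rng.binomial(n, 2.0 * r)     # particles that leave each node
            left = rng.binomial(moved, 0.5)      # split the movers left/right
            right = moved - left
            new = n - moved
            new[:-1] += left[1:]                 # left-movers land one node down
            new[1:] += right[:-1]                # right-movers land one node up
            return new

        n = np.zeros(101, dtype=np.int64)
        n[50] = 1_000_000                        # release all particles at the centre
        for _ in range(200):
            n = grw_step(n, r=0.25)
        print(n[45:56])                          # approximately Gaussian profile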

  8. DNA Processing and Reassembly on General Purpose FPGA-based Development Boards

    Directory of Open Access Journals (Sweden)

    SZÁSZ Csaba

    2017-05-01

    Full Text Available The great majority of researchers involved in microelectronics generally agree that many scientific challenges in the life sciences carry with them a powerful computational requirement that must be met before scientific progress can be made. The current trend in Deoxyribonucleic Acid (DNA) computing technologies is to develop special hardware platforms capable of providing the needed processing performance at lower cost. In this endeavor, FPGA-based (Field Programmable Gate Array) configurations aimed at accelerating genome sequencing and reassembly play a leading role. This paper emphasizes the benefits and advantages of using general-purpose FPGA-based development boards in DNA reassembly applications, alongside special hardware architecture solutions. An original approach is unfolded which outlines the versatility of high-performance, ready-to-use manufacturer development platforms endowed with powerful hardware resources fully optimized for high-speed processing applications. The theoretical arguments are supported by an intuitive implementation example in which the designer is relieved of any hardware development effort and can concentrate exclusively on software design issues, leading to greatly reduced application development cycles. The experiments prove that such boards available on the market are suitable for a wide range of DNA sequencing and reassembly applications.

  9. Transition probabilities for general birth-death processes with applications in ecology, genetics, and evolution

    Science.gov (United States)

    Crawford, Forrest W.; Suchard, Marc A.

    2011-01-01

    A birth-death process is a continuous-time Markov chain that counts the number of particles in a system over time. In the general process with n current particles, a new particle is born with instantaneous rate λn and a particle dies with instantaneous rate μn. Currently no robust and efficient method exists to evaluate the finite-time transition probabilities in a general birth-death process with arbitrary birth and death rates. In this paper, we first revisit the theory of continued fractions to obtain expressions for the Laplace transforms of these transition probabilities and make explicit an important derivation connecting transition probabilities and continued fractions. We then develop an efficient algorithm for computing these probabilities that analyzes the error associated with approximations in the method. We demonstrate that this error-controlled method agrees with known solutions and outperforms previous approaches to computing these probabilities. Finally, we apply our novel method to several important problems in ecology, evolution, and genetics. PMID:21984359
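    As a brute-force cross-check of what the paper computes far more efficiently through continued fractions, the finite-time transition probabilities of a birth-death process with arbitrary rates can be obtained from the matrix exponential of a truncated generator, as sketched below; the truncation level and the example rates are assumptions.

        import numpy as np
        from scipy.linalg import expm

        def transition_probs(birth, death, t, n_max=200):
            # Generator of the truncated birth-death chain on states 0..n_max:
            # Q[n, n+1] = lambda_n, Q[n, n-1] = mu_n, rows sum to zero.
            Q = np.zeros((n_max + 1, n_max + 1))
            for n in range(n_max + 1):
                lam = birth(n) if n < n_max else 0.0   # no births past the truncation boundary
                mu = death(n) if n > 0 else 0.0
                if n < n_max:
                    Q[n, n + 1] = lam
                if n > 0:
                    Q[n, n - 1] = mu
                Q[n, n] = -(lam + mu)
            return expm(Q * t)                         # P(t) = exp(Q t)

        # Linear birth-death rates lambda_n = 0.5*n, mu_n = 0.3*n (illustrative).
        P = transition_probs(lambda n: 0.5 * n, lambda n: 0.3 * n, t=2.0)
        print(P[10, :5])        # probability of going from 10 particles to 0..4 by time t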

  10. General description of few-body break-up processes at threshold

    International Nuclear Information System (INIS)

    Barrachina, R.O.

    2004-01-01

    Full text: In this communication we present a general description of the behavior of fragmentation processes near threshold by analyzing the break-up into two, three and N bodies in steps of increasing complexity. In particular, we describe the effects produced by an N-body threshold behavior in N+1 body break-up processes, as it occurs in situations where one of the fragments acquires almost all the excess energy of the system. Furthermore, we relate the appearance of cusps and discontinuities in single-particle multiply differential cross sections to the threshold behavior of the remaining particles, and apply these ideas to different systems from atomic, molecular and nuclear collision physics. We finally show that, even though the study of ultracold collisions represents the direct way of gathering information on a break-up system near threshold, the analysis of high-energy collisions provides an alternative, and sometimes advantageous, approach

  11. Cortical processes of speech illusions in the general population.

    Science.gov (United States)

    Schepers, E; Bodar, L; van Os, J; Lousberg, R

    2016-10-18

    There is evidence that experimentally elicited auditory illusions in the general population index risk for psychotic symptoms. As little is known about underlying cortical mechanisms of auditory illusions, an experiment was conducted to analyze processing of auditory illusions in a general population sample. In a follow-up design with two measurement moments (baseline and 6 months), participants (n = 83) underwent the White Noise task under simultaneous recording with a 14-lead EEG. An auditory illusion was defined as hearing any speech in a sound fragment containing white noise. A total number of 256 speech illusions (SI) were observed over the two measurements, with a high degree of stability of SI over time. There were 7 main effects of speech illusion on the EEG alpha band-the most significant indicating a decrease in activity at T3 (t = -4.05). Other EEG frequency bands (slow beta, fast beta, gamma, delta, theta) showed no significant associations with SI. SIs are characterized by reduced alpha activity in non-clinical populations. Given the association of SIs with psychosis, follow-up research is required to examine the possibility of reduced alpha activity mediating SIs in high risk and symptomatic populations.

  12. A general-purpose process modelling framework for marine energy systems

    International Nuclear Information System (INIS)

    Dimopoulos, George G.; Georgopoulou, Chariklia A.; Stefanatos, Iason C.; Zymaris, Alexandros S.; Kakalis, Nikolaos M.P.

    2014-01-01

    Highlights: • Process modelling techniques applied in marine engineering. • Systems engineering approaches to manage the complexity of modern ship machinery. • General purpose modelling framework called COSSMOS. • Mathematical modelling of conservation equations and related chemical – transport phenomena. • Generic library of ship machinery component models. - Abstract: High fuel prices, environmental regulations and current shipping market conditions impose ships to operate in a more efficient and greener way. These drivers lead to the introduction of new technologies, fuels, and operations, increasing the complexity of modern ship energy systems. As a means to manage this complexity, in this paper we present the introduction of systems engineering methodologies in marine engineering via the development of a general-purpose process modelling framework for ships named as DNV COSSMOS. Shifting the focus from components – the standard approach in shipping- to systems, widens the space for optimal design and operation solutions. The associated computer implementation of COSSMOS is a platform that models, simulates and optimises integrated marine energy systems with respect to energy efficiency, emissions, safety/reliability and costs, under both steady-state and dynamic conditions. DNV COSSMOS can be used in assessment and optimisation of design and operation problems in existing vessels, new builds as well as new technologies. The main features and our modelling approach are presented and key capabilities are illustrated via two studies on the thermo-economic design and operation optimisation of a combined cycle system for large bulk carriers, and the transient operation simulation of an electric marine propulsion system

  13. Generalized hardware post-processing technique for chaos-based pseudorandom number generators

    KAUST Repository

    Barakat, Mohamed L.

    2013-06-01

    This paper presents a generalized post-processing technique for enhancing the pseudorandomness of digital chaotic oscillators through a nonlinear XOR-based operation with rotation and feedback. The technique allows full utilization of the chaotic output as pseudorandom number generators and improves throughput without a significant area penalty. Digital design of a third-order chaotic system with maximum function nonlinearity is presented with verified chaotic dynamics. The proposed post-processing technique eliminates statistical degradation in all output bits, thus maximizing throughput compared to other processing techniques. Furthermore, the technique is applied to several fully digital chaotic oscillators with performance surpassing previously reported systems in the literature. The enhancement in the randomness is further examined in a simple image encryption application resulting in a better security performance. The system is verified through experiment on a Xilinx Virtex 4 FPGA with throughput up to 15.44 Gbit/s and logic utilization less than 0.84% for 32-bit implementations. © 2013 ETRI.
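    In the spirit of the technique described above, the sketch below XORs each raw output word with a rotated copy of itself and with the previous post-processed word (feedback); the rotation distance, word width and feedback wiring are assumptions of the illustration, not the exact circuit of the cited hardware design.

        def rotl(x, r, width=32):
            # Rotate a `width`-bit word left by r bits.
            r %= width
            mask = (1 << width) - 1
            return ((x << r) | (x >> (width - r))) & mask

        def postprocess(words, width=32):
            # XOR each raw chaotic output word with a rotated copy of itself and with
            # the previous post-processed word (feedback) to whiten the bit stream.
            out, feedback = [], 0
            for w in words:
                y = w ^ rotl(w, 13, width) ^ feedback
                feedback = y
                out.append(y)
            return out

        raw = [0xDEADBEEF, 0x12345678, 0x0BADF00D]   # stand-ins for raw oscillator output
        print([hex(v) for v in postprocess(raw)])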

  14. A general theory for radiative processes in rare earth compounds

    International Nuclear Information System (INIS)

    Acevedo, R.; Meruane, T.

    1998-01-01

    The formal theory of radiative processes in centrosymmetric coordination compounds of trivalent lanthanide ions (Ln 3+ ) with halide ligands (X -1 = Cl -1 , Br -1 ) is put forward, based on a symmetry vibronic crystal field-ligand polarisation model. This research considers a truncated basis set for the intermediate states of the central metal ion and derives general master equations to account for both the overall observed spectral intensities and the measured relative vibronic intensity distributions for parity-forbidden but vibronically allowed electronic transitions. In addition, a procedure which includes the closure approximation over the intermediate electronic states is included in order to estimate the quantitative crystal field contribution to the total transition dipole moments of various selected electronic transitions. This formalism is both general and flexible, and it may be employed for any electronic excitations involving f N type configurations of the rare earths in centrosymmetric co-ordination compounds in cubic environments, and also in doped host crystals belonging to the space group Fm 3m. (author)

  15. A Poisson process approximation for generalized K-5 confidence regions

    Science.gov (United States)

    Arsham, H.; Miller, D. R.

    1982-01-01

    One-sided confidence regions for continuous cumulative distribution functions are constructed using empirical cumulative distribution functions and the generalized Kolmogorov-Smirnov distance. The band width of such regions becomes narrower in the right or left tail of the distribution. To avoid tedious computation of confidence levels and critical values, an approximation based on the Poisson process is introduced. This approximation provides a conservative confidence region; moreover, the approximation error decreases monotonically to 0 as sample size increases. Critical values necessary for implementation are given. Applications are made to the areas of risk analysis, investment modeling, reliability assessment, and analysis of fault tolerant systems.
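
    To fix ideas, the sketch below builds a plain one-sided lower confidence band for a CDF from the empirical CDF and the one-sided Dvoretzky-Kiefer-Wolfowitz bound. It is the uniform-width band, not the paper's generalized (tail-weighted) statistic or its Poisson-process approximation; sample data and the confidence level are illustrative assumptions.

      # Minimal sketch: one-sided lower confidence band from the empirical CDF
      # and the one-sided DKW inequality (uniform width, for illustration only).
      import numpy as np

      def lower_band(sample, alpha=0.05):
          x = np.sort(sample)
          n = len(x)
          ecdf = np.arange(1, n + 1) / n
          eps = np.sqrt(np.log(1.0 / alpha) / (2.0 * n))  # one-sided DKW width
          return x, np.clip(ecdf - eps, 0.0, 1.0)

      rng = np.random.default_rng(1)
      xs, band = lower_band(rng.exponential(size=200))
      print(band[:5], band[-5:])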

  16. General Self-Esteem of Adolescents from Ethnic Minorities in the Netherlands and the Reflected Appraisal Process.

    Science.gov (United States)

    Verkuyten, Maykel

    1988-01-01

    Examined lack of differences in general self-esteem between adolescents of ethnic minorities and Dutch adolescents, focusing on reflected appraisal process. Found significant relationship between general self-esteem and perceived evaluation of family members (and no such relationship with nonfamily members) for ethnic minority adolescents;…

  17. Technical Safety Requirements for the B695 Segment of the Decontamination and Waste Treatment Facility

    International Nuclear Information System (INIS)

    Larson, H L

    2007-01-01

    waste. Upon approval of the permit modification, B696S rooms 1007, 1008, and 1009 will be able to store hazardous and mixed waste for up to 1 year. Furthermore, an additional drum crusher and a Waste Packaging Unit will be permitted to treat hazardous and mixed waste. RHWM generally processes LLW with no, or extremely low, concentrations of transuranics (i.e., much less than 100 nCi/g). Wastes processed often contain only depleted uranium and beta- and gamma-emitting nuclides, e.g., 90 Sr, 137 Cs, 3 H. Chapter 5 of the DSA documents the derivation of TSRs and develops the operational limits that protect the safety envelope defined for this facility. The DSA is applicable to the handling of radioactive waste stored and treated in the B695 Segment of the DWTF. Section 5 of the TSR, Administrative Controls, contains those Administrative Controls necessary to ensure safe operation of the B695 Segment of the DWTF. A basis explanation follows each of the requirements described in Section 5.5, Specific Administrative Controls. The basis explanation does not constitute an additional requirement, but is intended as an expansion of the logic and reasoning behind development of the requirement. Programmatic Administrative Controls are addressed in Section 5.6

  18. General emotion processing in social anxiety disorder: neural issues of cognitive control.

    Science.gov (United States)

    Brühl, Annette Beatrix; Herwig, Uwe; Delsignore, Aba; Jäncke, Lutz; Rufer, Michael

    2013-05-30

    Anxiety disorders are characterized by deficient emotion regulation prior to and in anxiety-evoking situations. Patients with social anxiety disorder (SAD) also show increased brain activation during the anticipation and perception of non-specific emotional stimuli, pointing to biased general emotion processing. In the current study we addressed the neural correlates of emotion regulation by cognitive control during the anticipation and perception of non-specific emotional stimuli in patients with SAD. Thirty-two patients with SAD underwent functional magnetic resonance imaging during the announced anticipation and perception of emotional stimuli. Half of them were trained and instructed to apply reality checking as a control strategy, while the others simply anticipated and perceived the stimuli. Reality checking significantly altered activation during the anticipation and perception of negative emotional stimuli. The medial prefrontal cortex was comparably active in both groups (p>0.50). The results suggest that cognitive control in patients with SAD influences emotion processing structures, supporting the usefulness of emotion regulation training in the psychotherapy of SAD. In contrast to studies in healthy subjects, cognitive control was not associated with increased activation of prefrontal regions in SAD. This points to possibly disturbed general emotion regulating circuits in SAD. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  19. Improvement of the Model of Enterprise Management Process on the Basis of General Management Functions

    Directory of Open Access Journals (Sweden)

    Ruslan Skrynkovskyy

    2017-12-01

    The purpose of the article is to improve the model of the enterprise (institution, organization) management process on the basis of general management functions. A graphic model of the management process according to process-structured management is presented. It has been established that in today's business environment the model of the management process should include such general management functions as: 1) controlling the achievement of results; 2) planning based on the main goal; 3) coordination and corrective actions (in the system of organization of work and production); 4) action as a form of act (conscious, volitional, directed); 5) accounting system (accounting, statistical, operational-technical and managerial); 6) diagnosis (economic, legal), with such subfunctions as: identification of the state and capabilities; analysis (economic, legal, systemic) with argumentation; assessment of the state, trends and prospects of development. The prospect of further research in this direction is: 1) the formation of a system of interrelation of functions and management methods, taking into account the presented research results; 2) the development of a model of an effective and efficient communication business process for the enterprise.

  20. Audit and account billing process in a private general hospital: a case study

    Directory of Open Access Journals (Sweden)

    Raquel Silva Bicalho Zunta

    2017-12-01

    Our study aimed to map, describe and validate the audit, account billing and billing report processes in a large, private general hospital. This was an exploratory, descriptive case study. We conducted non-participatory observation in the hospital's Internal Audit and Billing Report sectors, aiming to map the processes under study. The data obtained were validated by internal and external specialists in hospital bill auditing. The processes, described and illustrated in three flow charts, help professionals to rationalize their activities and the time spent on hospital billing, avoiding or minimizing the occurrence of flaws and generating more effective financial results. The mapping, description and validation of the audit and billing processes and of the billing reports gave more visibility and legitimacy to the actions developed by auditor nurses.

  1. Improving the training process of highly skilled bodybuilders in the preparatory period, general preparatory phase

    Directory of Open Access Journals (Sweden)

    Olexandr Tyhorskyy

    2015-08-01

    Purpose: to improve the method of training highly skilled bodybuilders during the general preparatory phase. Material and Methods: the study involved eight highly skilled athletes, members of the Ukrainian national bodybuilding team. Results: the most commonly used methods of organizing the training process in bodybuilding were compared. An optimal method of training highly skilled bodybuilders during the general preparatory phase of the preparatory period was developed and substantiated, which can increase athletes' body weight through its muscle component. Conclusions: based on the studies, the optimal method of training highly skilled bodybuilders is recommended, depending on the mesocycles and microcycles of the general preparatory phase.

  2. How does processing affect storage in working memory tasks? Evidence for both domain-general and domain-specific effects.

    Science.gov (United States)

    Jarrold, Christopher; Tam, Helen; Baddeley, Alan D; Harvey, Caroline E

    2011-05-01

    Two studies that examine whether the forgetting caused by the processing demands of working memory tasks is domain-general or domain-specific are presented. In each, separate groups of adult participants were asked to carry out either verbal or nonverbal operations on exactly the same processing materials while maintaining verbal storage items. The imposition of verbal processing tended to produce greater forgetting even though verbal processing operations took no longer to complete than did nonverbal processing operations. However, nonverbal processing did cause forgetting relative to baseline control conditions, and evidence from the timing of individuals' processing responses suggests that individuals in both processing groups slowed their responses in order to "refresh" the memoranda. Taken together the data suggest that processing has a domain-general effect on working memory performance by impeding refreshment of memoranda but can also cause effects that appear domain-specific and that result from either blocking of rehearsal or interference.

  3. Generalized Fractional Processes with Long Memory and Time Dependent Volatility Revisited

    Directory of Open Access Journals (Sweden)

    M. Shelton Peiris

    2016-09-01

    In recent years, fractionally-differenced processes have received a great deal of attention due to their flexibility in financial applications with long memory. This paper revisits the class of generalized fractionally-differenced processes generated by Gegenbauer polynomials and the ARMA structure (GARMA) with both long memory and time-dependent innovation variance. We establish the existence and uniqueness of second-order solutions. We also extend this family with innovations that follow GARCH and stochastic volatility (SV) models. Under certain regularity conditions, we give asymptotic results for the approximate maximum likelihood estimator for the GARMA-GARCH model. We discuss a Monte Carlo likelihood method for the GARMA-SV model and investigate finite sample properties via Monte Carlo experiments. Finally, we illustrate the usefulness of this approach using monthly inflation rates for France, Japan and the United States.
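
    As a rough sketch of the Gegenbauer long-memory building block (without the GARCH or stochastic-volatility innovations, and not the paper's estimator), one can simulate a pure GARMA series x_t = (1 - 2uB + B^2)^(-d) e_t by truncating its moving-average expansion, whose weights are the Gegenbauer polynomial coefficients C_j^(d)(u) obtained from the standard three-term recursion. The parameter values and truncation length below are illustrative assumptions.

      # Minimal sketch: simulate a pure Gegenbauer (GARMA) long-memory process by
      # truncating the MA(infinity) expansion with Gegenbauer polynomial weights.
      import numpy as np

      def gegenbauer_weights(d, u, m):
          """First m coefficients C_j^{(d)}(u) via the standard recursion."""
          c = np.empty(m)
          c[0] = 1.0
          if m > 1:
              c[1] = 2.0 * d * u
          for j in range(2, m):
              c[j] = (2.0 * u * (j + d - 1.0) * c[j - 1]
                      - (j + 2.0 * d - 2.0) * c[j - 2]) / j
          return c

      def simulate_garma(n, d=0.3, u=0.8, m=400, seed=0):
          """x_t = sum_j C_j^{(d)}(u) e_{t-j}, truncated at m terms."""
          rng = np.random.default_rng(seed)
          e = rng.standard_normal(n + m)
          w = gegenbauer_weights(d, u, m)
          return np.convolve(w, e, mode="full")[m - 1 : m - 1 + n]

      series = simulate_garma(2000)
      print(series[:5])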

  4. A national general pediatric clerkship curriculum: the process of development and implementation.

    Science.gov (United States)

    Olson, A L; Woodhead, J; Berkow, R; Kaufman, N M; Marshall, S G

    2000-07-01

    To describe a new national general pediatrics clerkship curriculum, the development process that built national support for its use, and current progress in implementing the curriculum in pediatric clerkships at US allopathic medical schools. CURRICULUM DEVELOPMENT: A curriculum project team of pediatric clerkship directors and an advisory committee representing professional organizations invested in pediatric student education developed the format and content in collaboration with pediatric educators from the Council on Medical Student Education in Pediatrics (COMSEP) and the Ambulatory Pediatric Association (APA). An iterative process or review by clerkship directors, pediatric departmental chairs, and students finalized the content and built support for the final product. The national dissemination process resulted in consensus among pediatric educators that this curriculum should be used as the national curricular guideline for clerkships. MONITORING IMPLEMENTATION: Surveys were mailed to all pediatric clerkship directors before dissemination (November 1994), and in the first and third academic years after national dissemination (March 1996 and September 1997). The 3 surveys assessed schools' implementation of specific components of the curriculum. The final survey also assessed ways the curriculum was used and barriers to implementation. The final curriculum provided objectives and competencies for attitudes, skills, and 18 knowledge areas of general pediatrics. A total of 216 short clinical cases were also provided as an alternative learning method. An accompanying resource manual provided suggested strategies for implementation, teaching, and evaluation. A total of 103 schools responded to survey 1; 84 schools to survey 2; and 85 schools responded to survey 3 from the 125 medical schools surveyed. Before dissemination, 16% of schools were already using the clinical cases. In the 1995-1996 academic year, 70% of schools were using some or all of the curricular

  5. Poisson-process generalization for the trading waiting-time distribution in a double-auction mechanism

    Science.gov (United States)

    Cincotti, Silvano; Ponta, Linda; Raberto, Marco; Scalas, Enrico

    2005-05-01

    In this paper, empirical analyses and computational experiments are presented on high-frequency data for a double-auction (book) market. Main objective of the paper is to generalize the order waiting time process in order to properly model such empirical evidences. The empirical study is performed on the best bid and best ask data of 7 U.S. financial markets, for 30-stock time series. In particular, statistical properties of trading waiting times have been analyzed and quality of fits is evaluated by suitable statistical tests, i.e., comparing empirical distributions with theoretical models. Starting from the statistical studies on real data, attention has been focused on the reproducibility of such results in an artificial market. The computational experiments have been performed within the Genoa Artificial Stock Market. In the market model, heterogeneous agents trade one risky asset in exchange for cash. Agents have zero intelligence and issue random limit or market orders depending on their budget constraints. The price is cleared by means of a limit order book. The order generation is modelled with a renewal process. Based on empirical trading estimation, the distribution of waiting times between two consecutive orders is modelled by a mixture of exponential processes. Results show that the empirical waiting-time distribution can be considered as a generalization of a Poisson process. Moreover, the renewal process can approximate real data and implementation on the artificial stocks market can reproduce the trading activity in a realistic way.
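
    A minimal sketch of the modelling idea (mixture weights and rates are hypothetical, not the paper's fitted values): draw interorder waiting times from a two-component exponential mixture, a simple non-Poisson renewal process, and compare its upper quantiles to a single exponential with the same mean.

      # Minimal sketch: waiting times from a two-component exponential mixture
      # versus a single-exponential (Poisson) benchmark with the same mean.
      import numpy as np

      rng = np.random.default_rng(2)
      n = 100_000
      weights = [0.7, 0.3]               # mixture weights (hypothetical)
      rates = [1.0 / 5.0, 1.0 / 60.0]    # fast and slow components, in 1/seconds

      comp = rng.choice(len(weights), size=n, p=weights)
      wait = rng.exponential(scale=1.0 / np.asarray(rates)[comp])

      poisson_wait = rng.exponential(scale=wait.mean(), size=n)  # same mean rate
      for q in (0.5, 0.9, 0.99):
          print(f"q={q}: mixture {np.quantile(wait, q):8.1f}s  "
                f"poisson {np.quantile(poisson_wait, q):8.1f}s")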

  6. A Dirichlet process mixture of generalized Dirichlet distributions for proportional data modeling.

    Science.gov (United States)

    Bouguila, Nizar; Ziou, Djemel

    2010-01-01

    In this paper, we propose a clustering algorithm based on both Dirichlet processes and the generalized Dirichlet distribution, which has been shown to be very flexible for proportional data modeling. Our approach can be viewed as an extension of the finite generalized Dirichlet mixture model to the infinite case. The extension is based on nonparametric Bayesian analysis. This clustering algorithm does not require the number of mixture components to be specified in advance and estimates it in a principled manner. Our approach is Bayesian and relies on the estimation of the posterior distribution of clusterings using a Gibbs sampler. Through some applications involving real-data classification and image database categorization using visual words, we show that clustering via infinite mixture models offers a more powerful and robust performance than classic finite mixtures.
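
    For readers unfamiliar with the base distribution, the sketch below draws one vector from a generalized Dirichlet distribution via its Beta "stick-breaking" construction. It illustrates only the component density used by such a mixture, not the paper's Dirichlet process Gibbs sampler; the shape parameters are hypothetical.

      # Minimal sketch: one draw from a generalized Dirichlet GD(a, b) via
      # independent Beta fractions of the remaining stick (Connor-Mosimann form).
      import numpy as np

      def sample_generalized_dirichlet(a, b, rng):
          a, b = np.asarray(a, float), np.asarray(b, float)
          v = rng.beta(a, b)                                   # Beta fractions
          remaining = np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))
          p = v * remaining                                     # stick portions
          return np.append(p, 1.0 - p.sum())                    # last component

      rng = np.random.default_rng(3)
      print(sample_generalized_dirichlet([2.0, 3.0, 1.5], [5.0, 2.0, 4.0], rng))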

  7. 76 FR 4360 - Guidance for Industry on Process Validation: General Principles and Practices; Availability

    Science.gov (United States)

    2011-01-25

    ... and Development (HFM-40), Center for Biologics Evaluation and Research (CBER), Food and Drug...] Guidance for Industry on Process Validation: General Principles and Practices; Availability AGENCY: Food... of Drug Information, Center for Drug Evaluation and Research, Food and Drug Administration, 10903 New...

  8. A novel theory: biological processes mostly involve two types of mediators, namely general and specific mediators. Endogenous small radicals such as superoxide and nitric oxide may play the role of general mediator in biological processes.

    Science.gov (United States)

    Mo, Jian

    2005-01-01

    A great number of papers have shown that free radicals, as well as bioactive molecules, can play the role of mediator in a wide spectrum of biological processes. However, the biological actions and chemical reactivity of the free radicals are quite different from those of the bioactive molecules; a wide variety of bioactive molecules can be easily modified by free radicals because they carry redox-sensitive functional groups; and the significance of the interaction between free radicals and bioactive molecules in biological processes has been confirmed by the results of in vitro and in vivo studies. Based on this evidence, this article presents a novel theory about the mediators of biological processes. The essentials of the theory are: (a) mediators of biological processes can be classified into general and specific mediators; the general mediators include two types of free radicals, namely superoxide and nitric oxide; the specific mediators include a wide variety of bioactive molecules, such as specific enzymes, transcription factors, cytokines and eicosanoids; (b) a general mediator can modify almost any class of biomolecules, and thus play the role of mediator in nearly every biological process via diverse mechanisms; a specific mediator always acts selectively on certain classes of biomolecules, and may play the role of mediator in different biological processes via the same mechanism; (c) biological processes are mostly controlled by networks of their mediators, so the free radicals can regulate the final outcome of a biological process by modifying some types of bioactive molecules, or in cooperation with these bioactive molecules; the biological actions of superoxide and nitric oxide may be synergistic or antagonistic. According to this theory, keeping the integrity of these networks and the balance between the free radicals and the bioactive molecules, as well as the balance between the free radicals and the free radical scavengers

  9. ABOUT THE GENERAL CONCEPT OF THE UNIVERSAL STORAGE SYSTEM AND PRACTICE-ORIENTED DATA PROCESSING

    Directory of Open Access Journals (Sweden)

    L. V. Rudikova

    2017-01-01

    The evolution of approaches to, and the concept of, data accumulation in warehouses with subsequent use of Data Mining is promising, particularly as the Belarusian segment of such IT development is now taking shape. The article describes a general concept for creating a system for the storage and practice-oriented analysis of data, based on data warehousing technology. The main aspect of the universal system design, at the layer of storing and working with data, is the use of an extended data warehouse built on a universal platform of stored data: it grants access to the storage and subsequent analysis of data of different structures and subject domains, has connection points (nodes) and extended functionality with a choice of data structures for storage, and supports subsequent intrasystem integration. The general architecture and structural elements of the universal system for the storage and analysis of practice-oriented data are described. The main components of the universal system for storing and processing practice-oriented data are: online data sources, an ETL process, a data warehouse, an analysis subsystem, and users. An important place in the system is occupied by the analytical processing of data, information search, document storage, and a software interface providing access to the functionality of the system from the outside. A universal system based on the described concept will allow information from different subject domains to be collected, analytical summaries to be obtained, and appropriate Data Mining methods and algorithms to be applied.

  10. Understanding the process of patient satisfaction with nurse-led chronic disease management in general practice.

    Science.gov (United States)

    Mahomed, Rosemary; St John, Winsome; Patterson, Elizabeth

    2012-11-01

      To investigate the process of patient satisfaction with nurse-led chronic disease management in Australian general practice.   Nurses working in the primary care context of general practice, referred to as practice nurses, are expanding their role in chronic disease management; this is relatively new to Australia. Therefore, determining patient satisfaction with this trend is pragmatically and ethically important. However, the concept of patient satisfaction is not well understood particularly in relation to care provided by practice nurses.   A grounded theory study underpinned by a relativist ontological position and a relativist epistemology.   Grounded theory was used to develop a theory from data collected through in-depth interviews with 38 participants between November 2007-April 2009. Participants were drawn from a larger project that trialled a practice nurse-led, collaborative model of chronic disease management in three Australian general practices. Theoretical sampling, data collection, and analysis were conducted concurrently consistent with grounded theory methods.   Patients undergo a cyclical process of Navigating Care involving three stages, Determining Care Needs, Forming Relationship, and Having Confidence. The latter two processes are inter-related and a feedback loop from them informs subsequent cycles of Determining Care Needs. If any of these steps fails to develop adequately, patients are likely to opt out of nurse-led care.   Navigating Care explains how and why time, communication, continuity, and trust in general practitioners and nurses are important to patient satisfaction. It can be used in identifying suitable patients for practice nurse-led care and to inform the practice and organization of practice nurse-led care to enhance patient satisfaction. © 2012 Blackwell Publishing Ltd.

  11. Kerr-de Sitter spacetime, Penrose process, and the generalized area theorem

    Science.gov (United States)

    Bhattacharya, Sourav

    2018-04-01

    We investigate various aspects of energy extraction via the Penrose process in the Kerr-de Sitter spacetime. We show that the increase in the value of a positive cosmological constant, Λ , always reduces the efficiency of this process. The Kerr-de Sitter spacetime has two ergospheres associated with the black hole and the cosmological event horizons. We prove by analyzing turning points of the trajectory that the Penrose process in the cosmological ergoregion is never possible. We next show that in this process both the black hole and cosmological event horizons' areas increase, and the latter becomes possible when the particle coming from the black hole ergoregion escapes through the cosmological event horizon. We identify a new, local mass function instead of the mass parameter, to prove this generalized area theorem. This mass function takes care of the local spacetime energy due to the cosmological constant as well, including that which arises due to the frame-dragging effect due to spacetime rotation. While the current observed value of Λ is quite small, its effect in this process could be considerable in the early Universe scenario where its value is much larger, where the two horizons could have comparable sizes. In particular, the various results we obtain here are also evaluated in a triply degenerate limit of the Kerr-de Sitter spacetime we find, in which radial values of the inner, the black hole and the cosmological event horizons are nearly coincident.
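
    For orientation (and hedged, since sign and normalization conventions differ between papers), in Boyer-Lindquist-type coordinates the horizons of the Kerr-de Sitter geometry are the positive roots of the radial function

      \Delta_r \;=\; \bigl(r^{2}+a^{2}\bigr)\Bigl(1-\frac{\Lambda r^{2}}{3}\Bigr)\;-\;2Mr,
      \qquad \Delta_r(r_{-}) \;=\; \Delta_r(r_{+}) \;=\; \Delta_r(r_{c}) \;=\; 0,

    with r_- < r_+ < r_c the inner, black hole and cosmological event horizons; the "triply degenerate" limit mentioned in the abstract corresponds to these three positive roots nearly coinciding.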

  12. General Purpose Graphics Processing Unit Based High-Rate Rice Decompression and Reed-Solomon Decoding

    Energy Technology Data Exchange (ETDEWEB)

    Loughry, Thomas A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]

    2015-02-01

    As the volume of data acquired by space-based sensors increases, mission data compression/decompression and forward error correction code processing performance must likewise scale. This competency development effort was explored using the General Purpose Graphics Processing Unit (GPGPU) to accomplish high-rate Rice Decompression and high-rate Reed-Solomon (RS) decoding at the satellite mission ground station. Each algorithm was implemented and benchmarked on a single GPGPU. Distributed processing across one to four GPGPUs was also investigated. The results show that the GPGPU has considerable potential for performing satellite communication Data Signal Processing, with three times or better performance improvements and up to ten times reduction in cost over custom hardware, at least in the case of Rice Decompression and Reed-Solomon Decoding.
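
    As a minimal sketch of the per-sample work that such a decoder parallelizes (the CCSDS Rice algorithm flown on actual missions is adaptive and block-oriented, and the GPGPU kernels are far more involved), the snippet below encodes and decodes plain Golomb-Rice code words: a unary quotient followed by k remainder bits. The parameter k and the sample data are illustrative assumptions.

      # Minimal sketch: plain Golomb-Rice encode/decode (unary quotient + k bits).
      # Illustrative only; the CCSDS adaptive Rice algorithm is more elaborate.
      def rice_encode(values, k):
          s = []
          for v in values:
              s.append("1" * (v >> k) + "0" + format(v & ((1 << k) - 1), f"0{k}b"))
          return "".join(s)

      def rice_decode(bits, k, count):
          """bits: iterable of '0'/'1' characters; decodes `count` samples."""
          it = iter(bits)
          out = []
          for _ in range(count):
              q = 0
              while next(it) == "1":   # unary quotient: run of 1s ended by a 0
                  q += 1
              r = 0
              for _ in range(k):       # k-bit binary remainder, MSB first
                  r = (r << 1) | (next(it) == "1")
              out.append((q << k) | r)
          return out

      data = [3, 18, 0, 7, 42]
      encoded = rice_encode(data, k=3)
      print(rice_decode(encoded, k=3, count=len(data)) == data)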

  13. Adolescents with Developmental Dyscalculia Do Not Have a Generalized Magnitude Deficit - Processing of Discrete and Continuous Magnitudes.

    Science.gov (United States)

    McCaskey, Ursina; von Aster, Michael; O'Gorman Tuura, Ruth; Kucian, Karin

    2017-01-01

    The link between number and space has been discussed in the literature for some time, resulting in the theory that number, space and time might be part of a generalized magnitude system. To date, several behavioral and neuroimaging findings support the notion of a generalized magnitude system, although contradictory results showing a partial overlap or separate magnitude systems are also found. The possible existence of a generalized magnitude processing area leads to the question how individuals with developmental dyscalculia (DD), known for deficits in numerical-arithmetical abilities, process magnitudes. By means of neuropsychological tests and functional magnetic resonance imaging (fMRI) we aimed to examine the relationship between number and space in typical and atypical development. Participants were 16 adolescents with DD (14.1 years) and 14 typically developing (TD) peers (13.8 years). In the fMRI paradigm participants had to perform discrete (arrays of dots) and continuous magnitude (angles) comparisons as well as a mental rotation task. In the neuropsychological tests, adolescents with dyscalculia performed significantly worse in numerical and complex visuo-spatial tasks. However, they showed similar results to TD peers when making discrete and continuous magnitude decisions during the neuropsychological tests and the fMRI paradigm. A conjunction analysis of the fMRI data revealed commonly activated higher order visual (inferior and middle occipital gyrus) and parietal (inferior and superior parietal lobe) magnitude areas for the discrete and continuous magnitude tasks. Moreover, no differences were found when contrasting both magnitude processing conditions, favoring the possibility of a generalized magnitude system. Group comparisons further revealed that dyscalculic subjects showed increased activation in domain general regions, whilst TD peers activate domain specific areas to a greater extent. In conclusion, our results point to the existence of a

  14. Identifying critical success factors for designing selection processes into postgraduate specialty training: the case of UK general practice.

    Science.gov (United States)

    Plint, Simon; Patterson, Fiona

    2010-06-01

    The UK national recruitment process into general practice training has been developed over several years, with incremental introduction of stages which have been piloted and validated. Previously independent processes, which encouraged multiple applications and produced inconsistent outcomes, have been replaced by a robust national process which has high reliability and predictive validity, and is perceived to be fair by candidates and allocates applicants equitably across the country. Best selection practice involves a job analysis which identifies required competencies, then designs reliable assessment methods to measure them, and over the long term ensures that the process has predictive validity against future performance. The general practitioner recruitment process introduced machine markable short listing assessments for the first time in the UK postgraduate recruitment context, and also adopted selection centre workplace simulations. The key success factors have been identified as corporate commitment to the goal of a national process, with gradual convergence maintaining locus of control rather than the imposition of change without perceived legitimate authority.

  15. The Green-Kubo formula for general Markov processes with a continuous time parameter

    International Nuclear Information System (INIS)

    Yang Fengxia; Liu Yong; Chen Yong

    2010-01-01

    For general Markov processes, the Green-Kubo formula is shown to be valid under a mild condition. A class of stochastic evolution equations on a separable Hilbert space and three typical infinite systems of locally interacting diffusions on Z d (irreversible in most cases) are shown to satisfy the Green-Kubo formula, and the Einstein relations for these stochastic evolution equations are shown explicitly as a corollary.
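
    For readers unfamiliar with the formula, its schematic textbook form expresses a transport coefficient as the time integral of a stationary current autocorrelation function; the paper's statement for general Markov processes is more abstract, so the following (with normalization constants omitted and notation that is ours, not the paper's) is only the familiar special form:

      \kappa \;=\; \int_{0}^{\infty} \mathbb{E}\bigl[\, J(X_{0})\, J(X_{t}) \,\bigr]\, \mathrm{d}t,

    where (X_t) is the stationary Markov process and J is a mean-zero, current-like observable.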

  16. Bio-inspired Artificial Intelligence: А Generalized Net Model of the Regularization Process in MLP

    Directory of Open Access Journals (Sweden)

    Stanimir Surchev

    2013-10-01

    Many objects and processes inspired by nature have been recreated by scientists. The inspiration to create a Multilayer Neural Network came from the human brain. It possesses a complicated structure that is difficult to recreate, because too many processes are involved, each requiring different solving methods. The aim of the present paper is to describe one of the methods that improve the learning process of an Artificial Neural Network. The proposed generalized net model presents the regularization process in a Multilayer Neural Network. The purpose of this verification step is to protect the neural network from overfitting, and it is commonly used in the neural network training process. Of the many verification methods available, the subject of interest here is the one known as regularization: it adds a penalty function so that weights and biases are kept at smaller values, protecting the network from overfitting.
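
    The sketch below illustrates the regularization idea itself, plain L2 weight decay on a toy two-layer network trained by gradient descent; it is not the generalized-net formalism of the paper, and the network size, data and hyperparameters are illustrative assumptions.

      # Minimal sketch of L2 regularization (weight decay) in an MLP training loop:
      # the penalty keeps weights small to reduce overfitting. Toy data and sizes.
      import numpy as np

      rng = np.random.default_rng(4)
      X = rng.standard_normal((64, 10))             # toy batch of inputs
      y = rng.standard_normal((64, 1))              # toy targets
      W1 = rng.standard_normal((10, 16)) * 0.1
      W2 = rng.standard_normal((16, 1)) * 0.1
      lam, lr = 1e-3, 1e-2                          # penalty strength, step size

      for step in range(200):
          h = np.tanh(X @ W1)
          pred = h @ W2
          err = pred - y
          loss = (err ** 2).mean() + lam * ((W1 ** 2).sum() + (W2 ** 2).sum())
          # gradients of the data term plus the L2 penalty term
          gW2 = h.T @ (2 * err / len(X)) + 2 * lam * W2
          gh = (2 * err / len(X)) @ W2.T
          gW1 = X.T @ (gh * (1 - h ** 2)) + 2 * lam * W1
          W1 -= lr * gW1
          W2 -= lr * gW2
      print(f"final penalized loss: {loss:.4f}")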

  17. Shore Shapers: Introducing children and the general public to biogeomorphological processes and geodiversity

    Science.gov (United States)

    Naylor, Larissa; Coombes, Martin; Sewell, Jack; White, Anissia

    2014-05-01

    Coastal processes shape the coast into a variety of eye-catching and enticing landforms that attract people to marvel at, relax and enjoy coastal geomorphology. Field guides explaining these processes (and the geodiversity that results) to the general public and children are few and far between. In contrast, there is a relative wealth of resources and organised activities introducing people to coastal wildlife, especially on rocky shores. These biological resources typically focus on the biology and the climatic controls on its distribution, rather than on how the biology interacts with its physical habitat. As an outcome of two recent rock coast biogeomorphology projects (www.biogeomorph.org/coastal/coastaldefencedbiodiversity and www.biogeomorph.org/coastal/bioprotection ), we produced the first known guide to understanding how biogeomorphological processes help create coastal landforms. The 'Shore Shapers' guide (www.biogeomorph.org/coastal/shoreshapers) is designed to: a) bring biotic-geomorphic interactions to life and b) introduce some of the geomorphological and geological controls on biogeomorphic processes and landform development. The guide provides scientific information in an accessible and interactive way, to help sustain children's interest and extend their learning. We tested a draft version of our guide with children, the general public and volunteers on rocky shore rambles using social science techniques; of 74 respondents, 75.6% were more interested in understanding how rock pools (i.e. coastal landforms) develop after seeing the guide. Respondents' opinions about key bioprotective species also changed as a result of seeing the guide: 58% of people found barnacles unattractive before they saw the guide, whilst 36% of respondents were more interested in barnacles after seeing the guide. These results demonstrate that there is considerable interest in more educational materials on coastal biogeomorphology and geodiversity.

  18. A Business Process Management System based on a General Optimium Criterion

    Directory of Open Access Journals (Sweden)

    Vasile MAZILESCU

    2009-01-01

    Business Process Management Systems (BPMS) provide a broad range of facilities to manage operational business processes. These systems should provide support for the complete Business Process Management (BPM) life-cycle [16]: (re)design, configuration, execution, control, and diagnosis of processes. BPMS can be seen as successors of Workflow Management (WFM) systems. However, already in the seventies people were working on office automation systems which are comparable with today's WFM systems. Recently, WFM vendors started to position their systems as BPMS. Our paper's goal is a proposal for a Tasks-to-Workstations Assignment Algorithm (TWAA) for assembly lines, which is a special implementation of a stochastic descent technique, in the context of BPMS, especially at the control level. Both cases, single and mixed-model, are treated. For a family of product models having the same generic structure, the mixed-model assignment problem can be formulated through an equivalent single-model problem. A general optimum criterion is considered. As with assembly line balancing, this kind of optimisation problem leads to a graph partitioning problem meeting precedence and feasibility constraints. The proposed definition for the "neighbourhood" function involves an efficient way of treating the partition and precedence constraints. Moreover, the Stochastic Descent Technique (SDT) allows an implicit treatment of the feasibility constraint. The proposed algorithm converges with probability 1 to an optimal solution.

  19. Attention allocation: Relationships to general working memory or specific language processing.

    Science.gov (United States)

    Archibald, Lisa M D; Levee, Tyler; Olino, Thomas

    2015-11-01

    Attention allocation, updating working memory, and language processing are interdependent cognitive tasks related to the focused direction of limited resources, refreshing and substituting information in the current focus of attention, and receiving/sending verbal communication, respectively. The current study systematically examined the relationship among executive attention, working memory executive skills, and language abilities while adjusting for individual differences in short-term memory. School-age children completed a selective attention task requiring them to recall whether a presented shape was in the same place as a previous target shape shown in an array imposing a low or high working memory load. Results revealed a selective attention cost when working above but not within memory span capacity. Measures of general working memory were positively related to overall task performance, whereas language abilities were related to response time. In particular, higher language skills were associated with faster responses under low load conditions. These findings suggest that attentional control and storage demands have an additive impact on working memory resources but provide only limited evidence for a domain-general mechanism in language learning. Crown Copyright © 2015. Published by Elsevier Inc. All rights reserved.

  20. A Latent Variable Analysis of Working Memory Capacity, Short-Term Memory Capacity, Processing Speed, and General Fluid Intelligence.

    Science.gov (United States)

    Conway, Andrew R. A.; Cowan, Nelsin; Bunting, Michael F.; Therriault, David J.; Minkoff, Scott R. B.

    2002-01-01

    Studied the interrelationships among general fluid intelligence, short-term memory capacity, working memory capacity, and processing speed in 120 young adults and used structural equation modeling to determine the best predictor of general fluid intelligence. Results suggest that working memory capacity, but not short-term memory capacity or…

  1. Adolescents with Developmental Dyscalculia Do Not Have a Generalized Magnitude Deficit – Processing of Discrete and Continuous Magnitudes

    Science.gov (United States)

    McCaskey, Ursina; von Aster, Michael; O’Gorman Tuura, Ruth; Kucian, Karin

    2017-01-01

    The link between number and space has been discussed in the literature for some time, resulting in the theory that number, space and time might be part of a generalized magnitude system. To date, several behavioral and neuroimaging findings support the notion of a generalized magnitude system, although contradictory results showing a partial overlap or separate magnitude systems are also found. The possible existence of a generalized magnitude processing area leads to the question how individuals with developmental dyscalculia (DD), known for deficits in numerical-arithmetical abilities, process magnitudes. By means of neuropsychological tests and functional magnetic resonance imaging (fMRI) we aimed to examine the relationship between number and space in typical and atypical development. Participants were 16 adolescents with DD (14.1 years) and 14 typically developing (TD) peers (13.8 years). In the fMRI paradigm participants had to perform discrete (arrays of dots) and continuous magnitude (angles) comparisons as well as a mental rotation task. In the neuropsychological tests, adolescents with dyscalculia performed significantly worse in numerical and complex visuo-spatial tasks. However, they showed similar results to TD peers when making discrete and continuous magnitude decisions during the neuropsychological tests and the fMRI paradigm. A conjunction analysis of the fMRI data revealed commonly activated higher order visual (inferior and middle occipital gyrus) and parietal (inferior and superior parietal lobe) magnitude areas for the discrete and continuous magnitude tasks. Moreover, no differences were found when contrasting both magnitude processing conditions, favoring the possibility of a generalized magnitude system. Group comparisons further revealed that dyscalculic subjects showed increased activation in domain general regions, whilst TD peers activate domain specific areas to a greater extent. In conclusion, our results point to the existence of a

  2. Technical Safety Requirements for the B695 Segment of the Decontamination and Waste Treatment Facility

    Energy Technology Data Exchange (ETDEWEB)

    Larson, H L

    2007-09-07

    mixed waste. Upon approval of the permit modification, B696S rooms 1007, 1008, and 1009 will be able to store hazardous and mixed waste for up to 1 year. Furthermore, an additional drum crusher and a Waste Packaging Unit will be permitted to treat hazardous and mixed waste. RHWM generally processes LLW with no, or extremely low, concentrations of transuranics (i.e., much less than 100 nCi/g). Wastes processed often contain only depleted uranium and beta- and gamma-emitting nuclides, e.g., {sup 90}Sr, {sup 137}Cs, {sup 3}H. Chapter 5 of the DSA documents the derivation of TSRs and develops the operational limits that protect the safety envelope defined for this facility. The DSA is applicable to the handling of radioactive waste stored and treated in the B695 Segment of the DWTF. Section 5 of the TSR, Administrative Controls, contains those Administrative Controls necessary to ensure safe operation of the B695 Segment of the DWTF. A basis explanation follows each of the requirements described in Section 5.5, Specific Administrative Controls. The basis explanation does not constitute an additional requirement, but is intended as an expansion of the logic and reasoning behind development of the requirement. Programmatic Administrative Controls are addressed in Section 5.6.

  3. 17 CFR 279.4 - Form ADV-NR, appointment of agent for service of process by non-resident general partner and non...

    Science.gov (United States)

    2010-04-01

    ... 17 Commodity and Securities Exchanges 3 2010-04-01 2010-04-01 false Form ADV-NR, appointment of agent for service of process by non-resident general partner and non-resident managing agent of an... agent for service of process by non-resident general partner and non-resident managing agent of an...

  4. Attorney General forces Infectious Diseases Society of America to redo Lyme guidelines due to flawed development process.

    Science.gov (United States)

    Johnson, L; Stricker, R B

    2009-05-01

    Lyme disease is one of the most controversial illnesses in the history of medicine. In 2006 the Connecticut Attorney General launched an antitrust investigation into the Lyme guidelines development process of the Infectious Diseases Society of America (IDSA). In a recent settlement with IDSA, the Attorney General noted important commercial conflicts of interest and suppression of scientific evidence that had tainted the guidelines process. This paper explores two broad ethical themes that influenced the IDSA investigation. The first is the growing problem of conflicts of interest among guidelines developers, and the second is the increasing centralisation of medical decisions by insurance companies, which use treatment guidelines as a means of controlling the practices of individual doctors and denying treatment for patients. The implications of the first-ever antitrust investigation of medical guidelines and the proposed model to remediate the tainted IDSA guidelines process are also discussed.

  5. The Generalized Support Software (GSS) Domain Engineering Process: An Object-Oriented Implementation and Reuse Success at Goddard Space Flight Center

    Science.gov (United States)

    Condon, Steven; Hendrick, Robert; Stark, Michael E.; Steger, Warren

    1997-01-01

    The Flight Dynamics Division (FDD) of NASA's Goddard Space Flight Center (GSFC) recently embarked on a far-reaching revision of its process for developing and maintaining satellite support software. The new process relies on an object-oriented software development method supported by a domain specific library of generalized components. This Generalized Support Software (GSS) Domain Engineering Process is currently in use at the NASA GSFC Software Engineering Laboratory (SEL). The key facets of the GSS process are (1) an architecture for rapid deployment of FDD applications, (2) a reuse asset library for FDD classes, and (3) a paradigm shift from developing software to configuring software for mission support. This paper describes the GSS architecture and process, results of fielding the first applications, lessons learned, and future directions

  6. Analysis of Memory Formation during General Anesthesia (Propofol/Remifentanil) for Elective Surgery Using the Process-dissociation Procedure.

    Science.gov (United States)

    Hadzidiakos, Daniel; Horn, Nadja; Degener, Roland; Buchner, Axel; Rehberg, Benno

    2009-08-01

    There have been reports of memory formation during general anesthesia. The process-dissociation procedure has been used to determine whether these are controlled (explicit/conscious) or automatic (implicit/unconscious) memories. This study used the process-dissociation procedure with the original measurement model and with one that corrected for guessing, to determine if more accurate results were obtained in this setting. A total of 160 patients scheduled for elective surgery were enrolled. Memory for words presented during propofol and remifentanil general anesthesia was tested postoperatively by using a word-stem completion task in a process-dissociation procedure. To assign possible memory effects to different levels of anesthetic depth, the authors measured depth of anesthesia using the BIS XP monitor (Aspect Medical Systems, Norwood, MA). Word-stem completion performance showed no evidence of memory for intraoperatively presented words. Nevertheless, an evaluation of these data using the original measurement model for process-dissociation data suggested evidence of controlled (C = 0.05; 95% confidence interval [CI] 0.02-0.08) and automatic (A = 0.11; 95% CI 0.09-0.12) memory processes, whereas with the model corrected for guessing no evidence of memory processes was obtained. The authors report and discuss parallel findings for published data sets that were generated by using the process-dissociation procedure. Patients had no memories for auditory information presented during propofol/remifentanil anesthesia after midazolam premedication. The use of the process-dissociation procedure with the original measurement model erroneously detected memories, whereas the extended model, corrected for guessing, correctly revealed no memory.
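
    For context, the "original measurement model" refers to the standard process-dissociation equations, in which inclusion performance I = C + A(1 - C) and exclusion performance E = A(1 - C), so that C = I - E and A = E / (1 - C). The sketch below applies these equations to hypothetical hit rates; the extended, guessing-corrected model used by the study adds further parameters and is not reproduced here.

      # Minimal sketch of the original process-dissociation estimates:
      # C = inclusion - exclusion, A = exclusion / (1 - C). Hit rates are
      # hypothetical, not the study's data.
      def process_dissociation(p_inclusion, p_exclusion):
          c = p_inclusion - p_exclusion                         # controlled
          a = p_exclusion / (1.0 - c) if c < 1.0 else float("nan")  # automatic
          return c, a

      c, a = process_dissociation(0.16, 0.105)
      print(f"C = {c:.3f}, A = {a:.3f}")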

  7. Generalized Least Energy of Separation for Desalination and Other Chemical Separation Processes

    Directory of Open Access Journals (Sweden)

    Karan H. Mistry

    2013-05-01

    Increasing global demand for fresh water is driving the development and implementation of a wide variety of seawater desalination technologies driven by different combinations of heat, work, and chemical energy. This paper develops a consistent basis for comparing the energy consumption of such technologies using Second Law efficiency. The Second Law efficiency for a chemical separation process is defined in terms of the useful exergy output, which is the minimum least work of separation required to extract a unit of product from a feed stream of a given composition. For a desalination process, this is the minimum least work of separation for producing one kilogram of product water from feed of a given salinity. While definitions in terms of work and heat input have been proposed before, this work generalizes the Second Law efficiency to allow for systems that operate on a combination of energy inputs, including fuel. The generalized equation is then evaluated through a parametric study considering work input, heat inputs at various temperatures, and various chemical fuel inputs. Further, since most modern, large-scale desalination plants operate in cogeneration schemes, a methodology for correctly evaluating Second Law efficiency for the desalination plant based on primary energy inputs is demonstrated. It is shown that, from a strictly energetic point of view and based on currently available technology, cogeneration using electricity to power a reverse osmosis system is energetically superior to thermal systems such as multiple effect distillation and multistage flash distillation, despite the very low grade heat input normally applied in those systems.
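
    Schematically, and hedging that the paper's exact definition may differ in detail, a Second Law efficiency generalized to mixed energy inputs can be written as the minimum least work of separation for the product stream divided by the total exergy supplied as work, heat weighted by its Carnot factor, and fuel chemical exergy; the notation below is illustrative rather than the paper's:

      \eta_{II} \;=\;
      \frac{\dot m_{p}\, w_{\mathrm{least}}^{\min}}
           {\dot W_{\mathrm{in}}
            \;+\; \sum_{i}\dot Q_{i}\Bigl(1-\frac{T_{0}}{T_{i}}\Bigr)
            \;+\; \sum_{j}\dot n_{f,j}\, e^{\mathrm{ch}}_{f,j}},

    with T_0 the dead-state (environment) temperature and e^ch the molar chemical exergy of each fuel stream.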

  8. Teaching Improvisation through Processes. Applications in Music Education and Implications for General Education

    Directory of Open Access Journals (Sweden)

    Michele Biasutti

    2017-06-01

    Improvisation is an articulated multidimensional activity based on an extemporaneous creative performance. Practicing improvisation, participants expand sophisticated skills such as sensory and perceptual encoding, memory storage and recall, motor control, and performance monitoring. Improvisation abilities have been developed following several methodologies mainly with a product-oriented perspective. A model framed under the socio-cultural theory of learning for designing didactic activities on processes instead of outcomes is presented in the current paper. The challenge is to overcome the mere instructional dimension of some practices of teaching improvisation by designing activities that stimulate self-regulated learning strategies in the students. In the article the present thesis is declined in three ways, concerning the following three possible areas of application: (1) high-level musical learning, (2) musical pedagogy with children, (3) general pedagogy. The applications in the music field focusing mainly on an expert's use of improvisation are discussed. The last section considers how these ideas should transcend music studies, presenting the benefits and the implications of improvisation activities for general learning. Moreover, the application of music education to the following cognitive processes are discussed: anticipation, use of repertoire, emotive communication, feedback and flow. These characteristics could be used to outline a pedagogical method for teaching music improvisation based on the development of reflection, reasoning, and meta-cognition.

  9. Teaching Improvisation through Processes. Applications in Music Education and Implications for General Education.

    Science.gov (United States)

    Biasutti, Michele

    2017-01-01

    Improvisation is an articulated multidimensional activity based on an extemporaneous creative performance. Practicing improvisation, participants expand sophisticated skills such as sensory and perceptual encoding, memory storage and recall, motor control, and performance monitoring. Improvisation abilities have been developed following several methodologies mainly with a product-oriented perspective. A model framed under the socio-cultural theory of learning for designing didactic activities on processes instead of outcomes is presented in the current paper. The challenge is to overcome the mere instructional dimension of some practices of teaching improvisation by designing activities that stimulate self-regulated learning strategies in the students. In the article the present thesis is declined in three ways, concerning the following three possible areas of application: (1) high-level musical learning, (2) musical pedagogy with children, (3) general pedagogy. The applications in the music field focusing mainly on an expert's use of improvisation are discussed. The last section considers how these ideas should transcend music studies, presenting the benefits and the implications of improvisation activities for general learning. Moreover, the application of music education to the following cognitive processes are discussed: anticipation, use of repertoire, emotive communication, feedback and flow. These characteristics could be used to outline a pedagogical method for teaching music improvisation based on the development of reflection, reasoning, and meta-cognition.

  10. Degradation data analysis based on a generalized Wiener process subject to measurement error

    Science.gov (United States)

    Li, Junxing; Wang, Zhihua; Zhang, Yongbo; Fu, Huimin; Liu, Chengrui; Krishnaswamy, Sridhar

    2017-09-01

    Wiener processes have received considerable attention in degradation modeling over the last two decades. In this paper, we propose a generalized Wiener process degradation model that takes unit-to-unit variation, time-correlated structure and measurement error into considerations simultaneously. The constructed methodology subsumes a series of models studied in the literature as limiting cases. A simple method is given to determine the transformed time scale forms of the Wiener process degradation model. Then model parameters can be estimated based on a maximum likelihood estimation (MLE) method. The cumulative distribution function (CDF) and the probability distribution function (PDF) of the Wiener process with measurement errors are given based on the concept of the first hitting time (FHT). The percentiles of performance degradation (PD) and failure time distribution (FTD) are also obtained. Finally, a comprehensive simulation study is accomplished to demonstrate the necessity of incorporating measurement errors in the degradation model and the efficiency of the proposed model. Two illustrative real applications involving the degradation of carbon-film resistors and the wear of sliding metal are given. The comparative results show that the constructed approach can derive a reasonable result and an enhanced inference precision.
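
    One common way to write such a generalized Wiener degradation model (the exact parameterization in the paper may differ) combines a transformed time scale, a unit-specific drift, Brownian noise and an additive measurement error:

      Y_{ij} \;=\; x_{0} \;+\; \mu_{i}\,\Lambda(t_{ij};\gamma)
      \;+\; \sigma_{B}\, B\bigl(\Lambda(t_{ij};\gamma)\bigr) \;+\; \varepsilon_{ij},
      \qquad \varepsilon_{ij}\sim\mathcal{N}\bigl(0,\sigma_{\varepsilon}^{2}\bigr),

    where Λ(·; γ) is the transformed time scale, μ_i varies across units (unit-to-unit variation), B(·) is standard Brownian motion (time-correlated structure), and ε_ij is measurement error; the first hitting time of a fixed threshold by the error-free process then follows an inverse-Gaussian-type distribution, which is what the FHT-based CDF and PDF expressions build on.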

  11. Differing influence on delays in the case-finding process for tuberculosis between general physicians and specialists in Mongolia.

    Science.gov (United States)

    Enkhbat, S; Toyota, M; Yasuda, N; Ohara, H

    1997-06-01

    The objective of this study is to compare the influence on delays in the tuberculosis case-finding process according to the types of medical facilities initially visited. The subjects include 107 patients 16 years and older who were diagnosed with bacteriologically confirmed pulmonary tuberculosis at nine tuberculosis specialized facilities in Ulaanbaatar, Mongolia from May 1995 to March 1996. Patients were interviewed about their demographic and socioeconomic factors and their medical records were reviewed for measuring delays. Fifty-five patients initially consulted general physicians and the remaining 52 patients initially visited other types of facilities including tuberculosis specialized facilities. Patients who initially consulted general physicians had shorter patient's delays and longer doctor's delays than those who had visited other facilities first. Since the reduction of patient's delay outweighs the extension of doctor's delay among patients who initially consulted general physicians, their total delay was shorter than that of patients who visited other facilities first. The beneficial influence of consulting general physicians first on the total delay was observed after adjusting for patient's age, sex, residence area, family income and family history of tuberculosis. This finding indicates that general physicians play an important role in improving the passive case-finding process in Mongolia.

  12. Forbidden Raman scattering processes. I. General considerations and E1--M1 scattering

    International Nuclear Information System (INIS)

    Harney, R.C.

    1979-01-01

    The generalized theory of forbidden Raman scattering processes is developed in terms of the multipole expansion of the electromagnetic interaction Hamiltonian. Using the general expressions, the theory of electric dipole--magnetic dipole (E1--M1) Raman scattering is derived in detail. The 1 S 0 → 3 P 1 E1--M1 Raman scattering cross section in atomic magnesium is calculated for two applicable laser wavelengths using published f-value data. Since resonantly enhanced cross sections larger than 10 -29 cm 2 /sr are predicted, it should be possible to observe this scattering phenomenon experimentally. In addition, by measuring the frequency dependence of the cross section near resonance, it may be possible to directly determine the relative magnitudes of the A·p and A·A contributions to the scattering cross section. Finally, possible applications of the effect in atomic and molecular physics are discussed.

  13. Kolmogorov's refined similarity hypotheses for turbulence and general stochastic processes

    International Nuclear Information System (INIS)

    Stolovitzky, G.; Sreenivasan, K.R.

    1994-01-01

    Kolmogorov's refined similarity hypotheses are shown to hold true for a variety of stochastic processes besides high-Reynolds-number turbulent flows, for which they were originally proposed. In particular, just as hypothesized for turbulence, there exists a variable V whose probability density function attains a universal form. Analytical expressions for the probability density function of V are obtained for Brownian motion as well as for the general case of fractional Brownian motion---the latter under some mild assumptions justified a posteriori. The properties of V for the case of antipersistent fractional Brownian motion with the Hurst exponent of 1/3 are similar in many details to those of high-Reynolds-number turbulence in atmospheric boundary layers a few meters above the ground. The one conspicuous difference between turbulence and the antipersistent fractional Brownian motion is that the latter does not possess the required skewness. Broad implications of these results are discussed
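
    For reference, in the turbulence setting the first refined similarity hypothesis is usually written as

      \delta u(r) \;=\; V\,\bigl(r\,\varepsilon_{r}\bigr)^{1/3},

    where δu(r) is the velocity increment across a separation r, ε_r is the dissipation rate averaged over a region of size r, and V is the stochastic variable whose probability density is hypothesized to be universal; the abstract's claim is that an analogous variable V with a universal density also exists for Brownian and fractional Brownian motion.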

  14. The amblyopic deficit for 2nd order processing: Generality and laterality.

    Science.gov (United States)

    Gao, Yi; Reynaud, Alexandre; Tang, Yong; Feng, Lixia; Zhou, Yifeng; Hess, Robert F

    2015-09-01

    A number of previous reports have suggested that the processing of second-order stimuli by the amblyopic eye (AE) is defective and that the fellow non-amblyopic eye (NAE) also exhibits an anomaly. Second-order stimuli involve extra-striate as well as striate processing and provide a means of exploring the extent of the cortical anomaly in amblyopia using psychophysics. We use a range of different second-order stimuli to investigate how general the deficit is for detecting second-order stimuli in adult amblyopes. We compare these results to our previously published adult normative database using the same stimuli and approach to determine the extent to which the detection of these stimuli is defective for both amblyopic and non-amblyopic eye stimulation. The results suggest that the second-order deficit affects a wide range of second-order stimuli, and by implication a large area of extra-striate cortex, both dorsally and ventrally. The NAE is affected only in motion-defined form judgments, suggesting a difference in the degree to which ocular dominance is disrupted in dorsal and ventral extra-striate regions. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. General Process for Business Idea Generation

    OpenAIRE

    Halinen, Anu

    2017-01-01

    This thesis presents a process for generating ideas with the intent to propagate new business within a micro-company. Utilizing this newly proposed process, generation of new ideas will be initiated allowing for subsequent business plans to be implemented to grow the existing customer base. Cloudberrywind is a family-owned and family-operated micro company in the Finnish region that offers information technology consulting services and support for project management to improve company efficie...

  16. How Does Processing Affect Storage in Working Memory Tasks? Evidence for Both Domain-General and Domain-Specific Effects

    Science.gov (United States)

    Jarrold, Christopher; Tam, Helen; Baddeley, Alan D.; Harvey, Caroline E.

    2011-01-01

    Two studies that examine whether the forgetting caused by the processing demands of working memory tasks is domain-general or domain-specific are presented. In each, separate groups of adult participants were asked to carry out either verbal or nonverbal operations on exactly the same processing materials while maintaining verbal storage items.…

  17. 17 CFR 249.510 - Form 10-M, consent to service of process by a nonresident general partner of a broker-dealer firm.

    Science.gov (United States)

    2010-04-01

    Form 10-M, consent to service of process by a nonresident general partner of a broker-dealer firm. This form shall be... (Title 17, Commodity and Securities Exchanges, § 249.510, 2010-04-01 edition.)

  18. General purpose graphics-processing-unit implementation of cosmological domain wall network evolution.

    Science.gov (United States)

    Correia, J R C C C; Martins, C J A P

    2017-10-01

    Topological defects unavoidably form at symmetry breaking phase transitions in the early universe. To probe the parameter space of theoretical models and set tighter experimental constraints (exploiting the recent advances in astrophysical observations), one requires more and more demanding simulations, and therefore more hardware resources and computation time. Improving the speed and efficiency of existing codes is essential. Here we present a general purpose graphics-processing-unit implementation of the canonical Press-Ryden-Spergel algorithm for the evolution of cosmological domain wall networks. This is ported to the Open Computing Language standard, and as a consequence significant speedups are achieved both in two-dimensional (2D) and 3D simulations.

  19. General purpose graphics-processing-unit implementation of cosmological domain wall network evolution

    Science.gov (United States)

    Correia, J. R. C. C. C.; Martins, C. J. A. P.

    2017-10-01

    Topological defects unavoidably form at symmetry breaking phase transitions in the early universe. To probe the parameter space of theoretical models and set tighter experimental constraints (exploiting the recent advances in astrophysical observations), one requires more and more demanding simulations, and therefore more hardware resources and computation time. Improving the speed and efficiency of existing codes is essential. Here we present a general purpose graphics-processing-unit implementation of the canonical Press-Ryden-Spergel algorithm for the evolution of cosmological domain wall networks. This is ported to the Open Computing Language standard, and as a consequence significant speedups are achieved both in two-dimensional (2D) and 3D simulations.
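
    As a rough illustration of the kind of evolution step a Press-Ryden-Spergel (PRS) type simulation performs, the sketch below advances a 2D scalar field with a damped leapfrog update on a uniform grid. It is a schematic reading of this class of algorithm, not the authors' code: the damping coefficient, the double-well potential and the grid and step sizes are illustrative choices (the exact PRS prescription additionally rescales terms so that the wall thickness stays resolved in comoving coordinates), and a GPU port such as the one described above would express the same update as a per-site OpenCL kernel.

        import numpy as np

        def prs_step(phi, dphi, dx, deta, damping, lam=1.0):
            """One damped leapfrog update of a 2D scalar field.

            phi, dphi : field and its time derivative on an N x N periodic grid
            damping   : Hubble-like friction coefficient (illustrative value,
                        not the exact PRS prescription)
            """
            # 5-point Laplacian with periodic boundaries
            lap = (np.roll(phi, 1, 0) + np.roll(phi, -1, 0) +
                   np.roll(phi, 1, 1) + np.roll(phi, -1, 1) - 4.0 * phi) / dx**2
            # derivative of the double-well potential V(phi) = (lam/4) (phi^2 - 1)^2
            dV = lam * phi * (phi**2 - 1.0)
            accel = lap - dV - damping * dphi
            dphi_new = dphi + deta * accel
            phi_new = phi + deta * dphi_new
            return phi_new, dphi_new

        # toy usage: random initial conditions relax into domains separated by walls
        rng = np.random.default_rng(0)
        phi = rng.uniform(-0.1, 0.1, (256, 256))
        dphi = np.zeros_like(phi)
        for step in range(1000):
            phi, dphi = prs_step(phi, dphi, dx=1.0, deta=0.2, damping=0.5)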

  20. Information processing during general anesthesia: Evidence for unconscious memory

    NARCIS (Netherlands)

    A.E. Bonebakker (Annette); B. Bonke (Benno); J. Klein (Jan); G. Wolters (G.); Th. Stijnen (Theo); J. Passchier (Jan); P.M. Merikle (P.)

    1996-01-01

    textabstractMemory for words presented during general anesthesia was studied in two experiments. In Experiment 1, surgical patients (n=80) undergoing elective procedures under general anesthesia were presented shortly before and during surgery with words via headphones. At the earliest convenient

  1. Cognitive load and emotional processing in Generalized Anxiety Disorder: Electrocortical evidence for increased distractibility

    OpenAIRE

    MacNamara, Annmarie; Proudfit, Greg Hajcak

    2014-01-01

    Generalized Anxiety Disorder (GAD) may be characterized by emotion regulation deficits attributable to an imbalance between top-down (i.e., goal-driven) and bottom-up (i.e., stimulus-driven) attention. In prior work, these attentional processes were examined by presenting unpleasant and neutral pictures within a working memory paradigm. The late positive potential (LPP) measured attention toward task-irrelevant pictures. Results from this prior work showed that working memory load reduced the...

  2. Generalization of fear inhibition by disrupting hippocampal protein synthesis-dependent reconsolidation process.

    Science.gov (United States)

    Yang, Chih-Hao; Huang, Chiung-Chun; Hsu, Kuei-Sen

    2011-09-01

    Repetitive replay of fear memories may precipitate the occurrence of post-traumatic stress disorder and other anxiety disorders. Hence, the suppression of fear memory retrieval may help prevent and treat these disorders. The formation of fear memories is often linked to multiple environmental cues and these interconnected cues may act as reminders for the recall of traumatic experiences. However, as a convenience, a simple paradigm of one cue pairing with the aversive stimulus is usually used in studies of fear conditioning in animals. Here, we built a more complex fear conditioning model by presenting several environmental stimuli during fear conditioning and characterize the effectiveness of extinction training and the disruption of reconsolidation process on the expression of learned fear responses. We demonstrate that extinction training with a single-paired cue resulted in cue-specific attenuation of fear responses but responses to other cures were unchanged. The cue-specific nature of the extinction persisted despite training sessions combined with D-cycloserine treatment reveals a significant weakness in extinction-based treatment. In contrast, the inhibition of the dorsal hippocampus (DH) but not the basolateral amygdala (BLA)-dependent memory reconsolidation process using either protein synthesis inhibitors or genetic disruption of cAMP-response-element-binding protein-mediated transcription comprehensively disrupted the learned connections between fear responses and all paired environmental cues. These findings emphasize the distinct role of the DH and the BLA in the reconsolidation process of fear memories and further indicate that the disruption of memory reconsolidation process in the DH may result in generalization of fear inhibition.

  3. The utility of an online diagnostic decision support system (Isabel) in general practice: a process evaluation.

    Science.gov (United States)

    Henderson, Emily J; Rubin, Greg P

    2013-05-01

    To evaluate the utility of Isabel, an online diagnostic decision support system developed by Isabel Healthcare primarily for secondary medical care, in the general practice setting. Focus groups were conducted with clinicians to understand why and how they used the system. A modified online post-use survey asked practitioners about its impact on their decision-making. Normalization process theory (NPT) was used as a theoretical framework to determine whether the system could be incorporated into routine clinical practice. The system was introduced by NHS County Durham and Darlington in the UK in selected general practices as a three-month pilot. General practitioners and nurse practitioners who had access to Isabel as part of the Primary Care Trust's pilot. General practitioners' views, experiences and usage of the system. Seven general practices agreed to pilot Isabel. Two practices did not subsequently use it. The remaining five practices conducted searches on 16 patients. Post-use surveys (n = 10) indicated that Isabel had little impact on diagnostic decision-making. Focus group participants stated that, although the diagnoses produced by Isabel in general did not have an impact on their decision-making, they would find the tool useful if it were better tailored to the primary care setting. Our analysis concluded that normalization was not likely to occur in its current form. Isabel was of limited utility in this short pilot study and may need further modification for use in general practice.

  4. A generalized multi-dimensional mathematical model for charging and discharging processes in a supercapacitor

    Energy Technology Data Exchange (ETDEWEB)

    Allu, Srikanth [ORNL]; Velamur Asokan, Badri [Exxon Mobil Research and Engineering]; Shelton, William A [Louisiana State University]; Philip, Bobby [ORNL]; Pannala, Sreekanth [ORNL]

    2014-01-01

    A generalized three-dimensional computational model based on a unified formulation of the electrode-electrolyte-electrode system of an electric double layer supercapacitor has been developed. The model accounts for charge transport across the solid-liquid system. This formulation, based on a volume averaging process, is a widely used concept for the multiphase flow equations ([28] [36]) and is analogous to porous media theory typically employed for electrochemical systems [22] [39] [12]. This formulation is extended to the electrochemical equations for a supercapacitor in a consistent fashion, which allows for a single-domain approach with no need for explicit interfacial boundary conditions as previously employed ([38]). In this model it is easy to introduce spatio-temporal variations and anisotropies of physical properties, and it is also conducive to introducing any upscaled parameters from lower length-scale simulations and experiments. Due to the irregular geometric configurations, including the porous electrode, the charge transport and subsequent performance characteristics of the supercapacitor can be easily captured in higher dimensions. A generalized model of this nature also provides insight into the applicability of 1D models ([38]) and where multidimensional effects need to be considered. In addition, a simple sensitivity analysis on key input parameters is performed in order to ascertain the dependence of the charge and discharge processes on these parameters. Finally, we demonstrate how this new formulation can be applied to non-planar supercapacitors
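
    To make the kind of formulation described above concrete, volume-averaged (single-domain) descriptions of an electric double layer supercapacitor commonly couple charge conservation in the solid matrix and in the electrolyte through the double-layer charging current. The equations below are a generic schematic of that class of model, supplied here for orientation and not reproduced from the paper:

        \[
          \nabla\cdot\bigl(\sigma_{\mathrm{eff}}\,\nabla\phi_s\bigr)
            \;=\; a\,C_{dl}\,\frac{\partial(\phi_s-\phi_e)}{\partial t},
          \qquad
          \nabla\cdot\bigl(\kappa_{\mathrm{eff}}\,\nabla\phi_e\bigr)
            \;=\; -\,a\,C_{dl}\,\frac{\partial(\phi_s-\phi_e)}{\partial t},
        \]

    where φ_s and φ_e are the solid- and electrolyte-phase potentials, σ_eff and κ_eff the effective (volume-averaged) conductivities, a the interfacial area per unit volume and C_dl the double-layer capacitance per unit area. In a single-domain treatment these coefficients simply vary in space across electrodes and separator, rather than being matched through explicit interfacial boundary conditions.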

  5. Factors Affecting the Integration of Information Literacy in the Teaching and Learning Processes of General Education Courses

    Directory of Open Access Journals (Sweden)

    Therdsak Maitaouthong

    2011-11-01

    Full Text Available This article presents the factors affecting the integration of information literacy in the teaching and learning processes of general education courses at an undergraduate level, where information literacy is used as a tool in the student-centered teaching approach. The research was divided into two phases: (1) the study of factors affecting at a policy level, a qualitative method conducted through in-depth interviews with the vice president for academic affairs and the Director of the General Education Management Center, and (2) the survey of factors affecting the teaching and learning processes, conducted by questioning lecturers of general education courses and librarians. The qualitative data were analyzed for content, and the quantitative data were analyzed through the use of descriptive statistics, weighted score prioritization and percentage. Two major categories were found to have an impact on integrating information literacy in the teaching and learning of general education courses at an undergraduate level: (1) six factors at a policy level, namely institutional policy, administrative structure and system, administrators' roles, resources and infrastructures, learning resources and supporting programs, and teacher evaluation and development; (2) eleven instructional factors: roles of lecturers, roles of librarians, roles of learners, knowledge and understanding of information literacy of lecturers and librarians, cooperation between librarians and lecturers, learning outcomes, teaching plans, teaching methods, teaching activities, teaching aids, and student assessment and evaluation.

  6. General principles of the nuclear criticality safety for handling, processing and transportation fissile materials in the USSR

    International Nuclear Information System (INIS)

    Vnukov, V.S.; Rjazanov, B.G.; Sviridov, V.I.; Frolov, V.V.; Zubkov, Y.N.

    1991-01-01

    The paper describes the general principles of nuclear criticality safety for the handling, processing, transportation and storage of fissile materials. Measures to limit the consequences of criticality accidents are discussed for fuel processing plants and fissile material storage. The system of scientific and technical measures on nuclear criticality safety, as well as the system of control and state supervision based on rules, limits and requirements, is described. The criticality safety aspects for various stages of handling nuclear materials are considered. The paper gives descriptions of the methods and approaches for criticality risk assessments for the processing facilities, plants and storages. (Author)

  7. Beyond the Floquet theorem: generalized Floquet formalisms and quasienergy methods for atomic and molecular multiphoton processes in intense laser fields

    International Nuclear Information System (INIS)

    Chu, S.-I.; Telnov, D.A.

    2004-01-01

    The advancement of high-power and short-pulse laser technology in the past two decades has generated considerable interest in the study of multiphoton and very high-order nonlinear optical processes of atomic and molecular systems in intense and superintense laser fields, leading to the discovery of a host of novel strong-field phenomena which cannot be understood by conventional perturbation theory. The Floquet theorem and the time-independent Floquet Hamiltonian method provide a powerful theoretical framework for the study of bound-bound multiphoton transitions driven by periodically time-dependent fields. However, there are a number of significant strong-field processes that cannot be directly treated by the conventional Floquet methods. In this review article, we discuss several recent developments of generalized Floquet theorems, formalisms, and quasienergy methods, beyond the conventional Floquet theorem, for accurate nonperturbative treatment of a broad range of strong-field atomic and molecular processes and phenomena of current interest. Topics covered include (a) the artificial intelligence (AI) most-probable-path approach (MPPA) for effective treatment of ultralarge Floquet matrix problems; (b) non-Hermitian Floquet formalisms and complex quasienergy methods for nonperturbative treatment of bound-free and free-free processes such as multiphoton ionization (MPI) and above-threshold ionization (ATI) of atoms and molecules, multiphoton dissociation (MPD) and above-threshold dissociation (ATD) of molecules, chemical bond softening and hardening, charge-resonance enhanced ionization (CREI) of molecular ions, and multiple high-order harmonic generation (HHG), etc.; (c) the many-mode Floquet theorem (MMFT) for exact treatment of multiphoton processes in multi-color laser fields with non-periodic time-dependent Hamiltonians; (d) the Floquet-Liouville supermatrix (FLSM) formalism for exact nonperturbative treatment of the time-dependent Liouville equation (allowing for relaxations and

  8. Beyond the Floquet theorem: generalized Floquet formalisms and quasienergy methods for atomic and molecular multiphoton processes in intense laser fields

    Science.gov (United States)

    Chu, Shih-I.; Telnov, Dmitry A.

    2004-02-01

    The advancement of high-power and short-pulse laser technology in the past two decades has generated considerable interest in the study of multiphoton and very high-order nonlinear optical processes of atomic and molecular systems in intense and superintense laser fields, leading to the discovery of a host of novel strong-field phenomena which cannot be understood by conventional perturbation theory. The Floquet theorem and the time-independent Floquet Hamiltonian method provide a powerful theoretical framework for the study of bound-bound multiphoton transitions driven by periodically time-dependent fields. However, there are a number of significant strong-field processes that cannot be directly treated by the conventional Floquet methods. In this review article, we discuss several recent developments of generalized Floquet theorems, formalisms, and quasienergy methods, beyond the conventional Floquet theorem, for accurate nonperturbative treatment of a broad range of strong-field atomic and molecular processes and phenomena of current interest. Topics covered include (a) the artificial intelligence (AI) most-probable-path approach (MPPA) for effective treatment of ultralarge Floquet matrix problems; (b) non-Hermitian Floquet formalisms and complex quasienergy methods for nonperturbative treatment of bound-free and free-free processes such as multiphoton ionization (MPI) and above-threshold ionization (ATI) of atoms and molecules, multiphoton dissociation (MPD) and above-threshold dissociation (ATD) of molecules, chemical bond softening and hardening, charge-resonance enhanced ionization (CREI) of molecular ions, and multiple high-order harmonic generation (HHG), etc.; (c) the many-mode Floquet theorem (MMFT) for exact treatment of multiphoton processes in multi-color laser fields with non-periodic time-dependent Hamiltonians; (d) the Floquet-Liouville supermatrix (FLSM) formalism for exact nonperturbative treatment of the time-dependent Liouville equation (allowing for relaxations and
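
    As background to the conventional result that these generalizations extend, the Floquet theorem for a Hamiltonian periodic in time, H(t+T) = H(t), can be stated in its standard textbook form (added here for context):

        \[
          \psi(t) \;=\; e^{-i\varepsilon t/\hbar}\,\phi(t), \qquad \phi(t+T)=\phi(t),
          \qquad
          \Bigl[\,H(t) - i\hbar\,\tfrac{\partial}{\partial t}\,\Bigr]\phi(t) \;=\; \varepsilon\,\phi(t),
        \]

    so the quasienergy ε is an eigenvalue of the time-independent Floquet Hamiltonian acting in the extended Hilbert space of T-periodic states; the non-Hermitian, many-mode and Liouville-space formalisms surveyed above generalize this eigenvalue problem to decaying states, multi-color driving and dissipative dynamics.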

  9. Negative ion formation processes: A general review

    International Nuclear Information System (INIS)

    Alton, G.D.

    1990-01-01

    The principal negative ion formation processes will be briefly reviewed. Primary emphasis will be placed on the more efficient and universal processes of charge transfer and secondary ion formation through non-thermodynamic surface ionization. 86 refs., 20 figs

  10. Generalization of the photo process window and its application to OPC test pattern design

    Science.gov (United States)

    Eisenmann, Hans; Peter, Kai; Strojwas, Andrzej J.

    2003-07-01

    From the early development phase up to the production phase, test patterns play a key role in microlithography. The requirement for test patterns is to represent the design well and to cover the space of all process conditions, e.g. to investigate the full process window and all other process parameters. This paper shows that the current state-of-the-art test patterns do not address these requirements sufficiently and makes suggestions for a better selection of test patterns. We present a new methodology to analyze an existing layout (e.g. logic library, test patterns or full chip) for critical layout situations which does not need precise process data. We call this method "process space decomposition", because it is aimed at decomposing the process impact on a layout feature into a sum of single independent contributions, the dimensions of the process space. This is a generalization of the classical process window, which examines the defocus and exposure dependency of given test patterns, e.g. the CD value of dense and isolated lines. In our process space we additionally define the dimensions resist effects, etch effects, mask error and misalignment, which describe the deviation of the printed silicon pattern from its target. We further extend it by the pattern space using a product-based layout (library, full chip or synthetic test patterns). The criticality of patterns is defined by their deviation due to the aerial image, their sensitivity to the respective dimension, or several combinations of these. By exploring the process space for a given design, the method makes it possible to find the most critical patterns independent of specific process parameters. The paper provides examples for different applications of the method: (1) selection of design-oriented test patterns for lithography development, (2) test pattern reduction in process characterization, (3) verification/optimization of printability and performance of post-processing procedures (like OPC), (4) creation of a sensitive process

  11. Convexity of Ruin Probability and Optimal Dividend Strategies for a General Lévy Process

    Directory of Open Access Journals (Sweden)

    Chuancun Yin

    2015-01-01

    Full Text Available We consider the optimal dividends problem for a company whose cash reserves follow a general Lévy process with certain positive jumps and arbitrary negative jumps. The objective is to find a policy which maximizes the expected discounted dividends until the time of ruin. Under appropriate conditions, we use some recent results in the theory of potential analysis of subordinators to obtain the convexity properties of probability of ruin. We present conditions under which the optimal dividend strategy, among all admissible ones, takes the form of a barrier strategy.

  12. Convexity of Ruin Probability and Optimal Dividend Strategies for a General Lévy Process

    Science.gov (United States)

    Yuen, Kam Chuen; Shen, Ying

    2015-01-01

    We consider the optimal dividends problem for a company whose cash reserves follow a general Lévy process with certain positive jumps and arbitrary negative jumps. The objective is to find a policy which maximizes the expected discounted dividends until the time of ruin. Under appropriate conditions, we use some recent results in the theory of potential analysis of subordinators to obtain the convexity properties of probability of ruin. We present conditions under which the optimal dividend strategy, among all admissible ones, takes the form of a barrier strategy. PMID:26351655
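
    For orientation, the optimization problem described in both records above is commonly written as follows; the notation is the standard de Finetti dividend formulation and is supplied here for context, not quoted from the papers:

        \[
          V(x) \;=\; \sup_{\pi}\; \mathbb{E}_x\!\left[\int_0^{\tau^{\pi}} e^{-q t}\,\mathrm{d}L_t^{\pi}\right],
          \qquad
          \tau^{\pi} \;=\; \inf\{\, t \ge 0 : X_t - L_t^{\pi} < 0 \,\},
        \]

    where X is the Lévy process of cash reserves, L^π the cumulative dividends paid under an admissible strategy π, q > 0 the discount rate and τ^π the ruin time of the controlled reserve. A barrier strategy at level b pays out exactly the overflow above b, L_t^{b} = \bigl(\sup_{0 \le s \le t} X_s - b\bigr)^{+}, so that the controlled process is reflected at the barrier; this is the type of strategy the records above show to be optimal under their conditions.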

  13. Evaluation of Aqueous and Powder Processing Techniques for Production of Pu-238-Fueled General Purpose Heat Sources

    Energy Technology Data Exchange (ETDEWEB)

    2008-06-01

    This report evaluates alternative processes that could be used to produce Pu-238 fueled General Purpose Heat Sources (GPHS) for radioisotope thermoelectric generators (RTG). The current process for fabricating GPHSs has remained essentially unchanged since its development in the 1970s. Meanwhile, 30 years of technological advancements have been made in the fields of chemistry, manufacturing, ceramics, and control systems. At the Department of Energy’s request, alternate manufacturing methods were compared to current methods to determine if alternative fabrication processes could reduce the hazards, especially the production of respirable fines, while producing an equivalent GPHS product. An expert committee performed the evaluation with input from four national laboratories experienced in Pu-238 handling.

  14. Outcrossings of safe regions by generalized hyperbolic processes

    DEFF Research Database (Denmark)

    Klüppelberg, Claudia; Rasmussen, Morten Grud

    2013-01-01

    We present a simple Gaussian mixture model in space and time with generalized hyperbolic marginals. Starting with Rice’s celebrated formula for level upcrossings and outcrossings of safe regions we investigate the consequences of the mean-variance mixture model on such quantities. We obtain...
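
    Rice's formula, which the record above takes as its starting point, gives the expected rate of upcrossings of a level u by a stationary, differentiable process X(t); its standard form is included here for context:

        \[
          \mathbb{E}\bigl[N_u^{+}(0,T)\bigr]
          \;=\; T \int_0^{\infty} v\; f_{X(0),\dot X(0)}(u, v)\,\mathrm{d}v,
        \]

    where N_u^+(0,T) counts the upcrossings of level u in [0,T] and f_{X,Ẋ} is the joint density of the process and its derivative. For outcrossings of a safe region, the level is replaced by the region's boundary and the one-dimensional integral by a surface integral involving the joint density of the process and its outward normal velocity, which is the setting the mean-variance mixture model above is applied to.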

  15. General edition program

    International Nuclear Information System (INIS)

    Vaturi, Sylvain

    1969-01-01

    Computerized edition is essential for data processing exploitation. When a more or less complex edition program is required for each task, the need for a general edition program becomes obvious. The aim of this study is to create a general edition program. Universal programs are capable of executing numerous and varied tasks. For a more specific type of processing whose execution is frequently required, the use of a specialized program is preferable because, in contrast to a universal program, it goes straight to the point [fr]

  16. A Fast General-Purpose Clustering Algorithm Based on FPGAs for High-Throughput Data Processing

    CERN Document Server

    Annovi, A; The ATLAS collaboration; Castegnaro, A; Gatta, M

    2012-01-01

    We present a fast general-purpose algorithm for high-throughput clustering of data with a two-dimensional organization. The algorithm is designed to be implemented with FPGAs or custom electronics. The key feature is a processing time that scales linearly with the amount of data to be processed. This means that clustering can be performed in pipeline with the readout, without suffering from combinatorial delays due to looping multiple times through all the data. This feature makes this algorithm especially well suited for problems where the data have high density, e.g. in the case of tracking devices working under high-luminosity conditions such as those of the LHC or Super-LHC. The algorithm is organized in two steps: the first step (core) clusters the data; the second step analyzes each cluster of data to extract the desired information. The current algorithm is developed as a clustering device for modern high-energy physics pixel detectors. However, the algorithm has a much broader field of applications. In ...
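
    The two-step structure described above (a core that groups hits into clusters, followed by per-cluster analysis) can be illustrated in software by a simple connected-component pass over pixel hits. The sketch below is only an illustrative sequential analogue, not the FPGA implementation; the 8-neighbour connectivity and the centroid as the extracted quantity are assumptions made for the example.

        from collections import deque

        def cluster_hits(hits):
            """Group 2D pixel hits into clusters of 8-connected pixels.

            hits: iterable of (column, row) integer coordinates.
            Returns a list of clusters, each a list of (column, row) hits.
            """
            remaining = set(hits)
            clusters = []
            while remaining:
                seed = remaining.pop()
                queue = deque([seed])
                cluster = [seed]
                while queue:                      # flood fill: the "core" clustering step
                    c, r = queue.popleft()
                    for dc in (-1, 0, 1):
                        for dr in (-1, 0, 1):
                            n = (c + dc, r + dr)
                            if n in remaining:
                                remaining.remove(n)
                                queue.append(n)
                                cluster.append(n)
                clusters.append(cluster)
            return clusters

        def centroid(cluster):
            """Second step: extract per-cluster information (here, the centroid)."""
            n = len(cluster)
            return (sum(c for c, _ in cluster) / n, sum(r for _, r in cluster) / n)

        hits = [(10, 10), (10, 11), (11, 11), (40, 7), (41, 7)]
        print([centroid(cl) for cl in cluster_hits(hits)])   # two clusters expected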

  17. 40 CFR 68.12 - General requirements.

    Science.gov (United States)

    2010-07-01

    Chemical Accident Prevention Provisions, General, § 68.12 General requirements. (a) General requirements. The... the five-year accident history for the process as provided in § 68.42 of this part and submit it in... §§ 68.150 to 68.185. The RMP shall include a registration that reflects all covered processes. (b...

  18. [Systemic inflammation: theoretical and methodological approaches to description of general pathological process model. Part 3. Background for nonsyndromic approach].

    Science.gov (United States)

    Gusev, E Yu; Chereshnev, V A

    2013-01-01

    Theoretical and methodological approaches to the description of systemic inflammation as a general pathological process are discussed. It is shown that integration of a wide range of research types is needed to develop a model of systemic inflammation.

  19. Geometric correction of radiographic images using general purpose image processing program

    International Nuclear Information System (INIS)

    Kim, Eun Kyung; Cheong, Ji Seong; Lee, Sang Hoon

    1994-01-01

    The present study was undertaken to compare geometrically corrected images produced by general-purpose image processing programs for the Apple Macintosh II computer (NIH Image, Adobe Photoshop) with standardized images obtained with an individualized, custom-fabricated alignment instrument. Two non-standardized periapical films with an XCP film holder only were taken at the lower molar region of 19 volunteers. Two standardized periapical films with a customized XCP film holder with impression material on the bite-block were taken for each person. Geometric correction was performed with Adobe Photoshop and the NIH Image program. Specifically, the arbitrary image rotation function of 'Adobe Photoshop' and the subtraction-with-transparency function of 'NIH Image' were utilized. The standard deviations of grey values of subtracted images were used to measure image similarity. The average standard deviation of grey values of subtracted images in the standardized group was slightly lower than that of the corrected group. However, the difference was found to be statistically insignificant (p>0.05). It is considered that the 'NIH Image' and 'Adobe Photoshop' programs can be used for correction of non-standardized films taken with an XCP film holder at the lower molar region.

  20. Results of comparative evaluations concerning the psychic process of perceiving and assessing risk-objects by the general public

    International Nuclear Information System (INIS)

    Peters, H.P.; Renn, O.

    1983-01-01

    The perception of risk has become a major research field after scientists and politicians recognized that scientific risk studies like the Rasmussen Report on nuclear energy had no large impact on public acceptance. With our surveys we aimed to combine two methodological approaches (object perception and attitude theory) and to develop a technique in which the psychic process of perceiving and assessing risk objects by the general public was followed up and analyzed. Psychological experiments aimed at isolating relevant factors of qualitative risk properties, as well as demographic surveys for the measurement of the belief structure, were carried out. Our results indicate that, contrary to the common conception among natural scientists, people in general have a good estimative ability to judge the expected value of different risks. But beyond this estimation of fatalities, people also use other criteria (like personal control) to order different objects with respect to their riskiness. The perceived risk is but one factor influencing attitude. A simplified model of the acceptance-building process is presented, showing that acceptance-building is not a purely individual process. Individuals are linked together by their social environment, so that every individual decision is influenced by the decisions of other people

  1. Improving the training process of highly skilled bodybuilders in the preparatory period, general preparatory phase

    Directory of Open Access Journals (Sweden)

    Olexandr Tyhorskyy

    2015-10-01

    Full Text Available Purpose: to improve the method of training highly skilled bodybuilders. Material and Methods: the study involved eight highly skilled athletes, members of the Ukraine national bodybuilding team. Results: the most commonly used methods of organizing the training process in bodybuilding were compared. An optimal method of training highly skilled bodybuilders during the general preparatory phase of the preparatory period was developed and substantiated, which can increase athletes' body weight through its muscle component. Conclusions: using the dynamic load factor to raise the intensity of training loads helps to increase the volume of the shoulder muscles

  2. Generalizing entanglement

    Science.gov (United States)

    Jia, Ding

    2017-12-01

    The expected indefinite causal structure in quantum gravity poses a challenge to the notion of entanglement: If two parties are in an indefinite causal relation of being causally connected and not, can they still be entangled? If so, how does one measure the amount of entanglement? We propose to generalize the notions of entanglement and entanglement measure to address these questions. Importantly, the generalization opens the path to study quantum entanglement of states, channels, networks, and processes with definite or indefinite causal structure in a unified fashion, e.g., we show that the entanglement distillation capacity of a state, the quantum communication capacity of a channel, and the entanglement generation capacity of a network or a process are different manifestations of one and the same entanglement measure.

  3. General relativity and mathematics; Relatividad General y Matematicas

    Energy Technology Data Exchange (ETDEWEB)

    Mars, M.

    2015-07-01

    General relativity is more than a theory of gravity: since any physical process occupies space and lasts for a time, every physical theory must be reconciled with the theory that describes the dynamical nature of space-time itself. (Author)

  4. Cancer Investigation in General Practice

    DEFF Research Database (Denmark)

    Jensen, Jacob Reinholdt; Møller, Henrik; Thomsen, Janus Laust

    2014-01-01

    Initiation of cancer investigations in general practice. Background: Close to 90% of all cancers are diagnosed because the patient presents symptoms and signs. Of these patients, 85% initiate the diagnostic pathway in general practice. Therefore, the initiation of a diagnostic pathway in general practice becomes extremely important. On average, a general practitioner (GP) is involved in 7500 consultations each year, and in the diagnostic process of 8-10 incident cancers. One half of cancer patients consult their GP with either general symptoms, which are not indicative of cancer, or vague and non-specific symptoms. The other half present with what the GP assesses as alarm symptoms. Three months prior to diagnosis, patients who are later diagnosed with cancer have twice as many GP consultations as a comparable reference population. Thus the complex diagnostic process in general practice requires the GP

  5. Generalized renewal process for repairable systems based on finite Weibull mixture

    International Nuclear Information System (INIS)

    Veber, B.; Nagode, M.; Fajdiga, M.

    2008-01-01

    Repairable systems can be brought to one of several possible states following a repair. These states are: 'as good as new', 'as bad as old' and 'better than old but worse than new'. The probabilistic models traditionally used to estimate the expected number of failures account for the first two states, but they do not properly apply to the last one, which is more realistic in practice. In this paper, a probabilistic model that is applicable to all three after-repair states, called the generalized renewal process (GRP), is applied. Simply put, GRP addresses the repair assumption by introducing the concept of virtual age into stochastic point processes to enable them to represent the full spectrum of repair assumptions. The shape of measured or design life distributions of systems can vary considerably, and therefore frequently cannot be approximated by simple distribution functions. The scope of the paper is to prove that a finite Weibull mixture, with positive component weights only, can be used as the underlying distribution of the time to first failure (TTFF) of the GRP model, on condition that the unknown parameters can be estimated. To support the main idea, three examples are presented. In order to estimate the unknown parameters of the GRP model with an m-fold Weibull mixture, the EM algorithm is applied. The GRP model with an m-component mixture distribution is compared to the standard GRP model based on the two-parameter Weibull distribution by calculating the expected number of failures. It can be concluded that the suggested GRP model with a Weibull mixture with an arbitrary but finite number of components is suitable for predicting failures based on the past performance of the system
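
    The virtual-age idea mentioned above is commonly formalized through the Kijima models; the expressions below are the standard ones and are added here for context rather than quoted from the paper:

        \[
          \text{Kijima I:}\quad v_i = v_{i-1} + q\,x_i,
          \qquad
          \text{Kijima II:}\quad v_i = q\,(v_{i-1} + x_i),
        \]

    where x_i is the i-th time between failures, v_i the virtual age after the i-th repair and q ∈ [0,1] the repair effectiveness (q = 0 gives "as good as new", q = 1 "as bad as old", intermediate values "better than old but worse than new"). Conditional on a virtual age v, the next inter-failure time is distributed as

        \[
          F(x \mid v) \;=\; \frac{F(x+v) - F(v)}{1 - F(v)},
        \]

    with F the underlying time-to-first-failure distribution, here a finite Weibull mixture.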

  6. ''Sheiva'' : a general purpose multi-parameter data acquisition and processing system at VECC

    International Nuclear Information System (INIS)

    Viyogi, Y.P.; Ganguly, N.K.

    1982-01-01

    A general purpose interactive software to be used with the PDP-15/76 on-line computer at VEC Centre for the acquisition and processing of data in nuclear physics experiments is described. The program can accommodate a maximum of thirty two inputs although the present hardware limits the number of inputs to eight. Particular emphasis is given to the problems of flexibility and ease of operation, memory optimisation and techniques dealing with experimenter-computer interaction. Various graphical methods for one- and two-dimensional data presentation are discussed. Specific problems of particle identification using detector telescopes have been dealt with carefully to handle experiments using several detector telescopes and those involving light particle-heavy particle coincidence studies. Steps needed to tailor this program towards utilisation for special experiments are also described. (author)

  7. SYSTEM OF COMPUTER MODELING OBJECTS AND PROCESSES AND FEATURES OF ITS USE IN THE EDUCATIONAL PROCESS OF GENERAL SECONDARY EDUCATION

    Directory of Open Access Journals (Sweden)

    Svitlana G. Lytvynova

    2018-04-01

    Full Text Available The article analyzes the historical aspect of the formation of computer modeling as one of the promising directions of educational process development. The notion of "system of computer modeling", the conceptual model of a system of computer modeling (SCMod), and its components (mathematical, animation, graphic, strategic), functions, principles and purposes of use are grounded. The features of the organization of students' work using SCMod, individual and group work, and the formation of subject competencies are described; the aspect of students' motivation for learning is considered. It is established that educational institutions can use SCMod at different levels and stages of training and in different contexts, which consist of interrelated physical, social, cultural and technological aspects. It is determined that the use of SCMod in general secondary school would increase the capacity of teachers to improve the training of students in natural and mathematical subjects and contribute to the individualization of the learning process, in order to meet the pace, educational interests and capabilities of each particular student. It is substantiated that the use of SCMod in the study of natural-mathematical subjects contributes to the formation of subject competencies, develops the skills of analysis and decision-making, increases the level of digital communication, develops vigilance, raises the level of knowledge, and increases the duration of students' attention. Further research is required to justify the process of forming students' competencies in natural-mathematical subjects and designing cognitive tasks using SCMod.

  8. Generalized Jacobi identities in gauge theories

    International Nuclear Information System (INIS)

    Chaves, F.M.P.

    1990-01-01

    A spatial generalized Jacobi identity obeyed by the polarization-dependent factors of the vertices in a q q-bar → Wγ process is studied. The amplitude for gluon-gluon scattering with five particles is worked out. By reorganizing this amplitude in analogy with a photon-pion interaction process, it is shown that the spatial generalized Jacobi identity does not exist; instead, there are many spatial partial identities which, in the case of a four-particle process, combine into one single identity. A process with four particles, three of them scalar fields, is also studied in the one-loop approximation. In this case too, the non-existence of the spatial generalized Jacobi identity is demonstrated. (author)

  9. Evaluating the implementation of a quality improvement process in General Practice using a realist evaluation framework.

    Science.gov (United States)

    Moule, Pam; Clompus, Susan; Fieldhouse, Jon; Ellis-Jones, Julie; Barker, Jacqueline

    2018-05-25

    Underuse of anticoagulants in atrial fibrillation is known to increase the risk of stroke and is an international problem. The National Institute for Health and Care Excellence guidance CG180 seeks to reduce atrial fibrillation related strokes through prescriptions of Non-vitamin K antagonist Oral Anticoagulants. A quality improvement programme was established by the West of England Academic Health Science Network (West of England AHSN) to implement this guidance into General Practice. A realist evaluation identified whether the quality improvement programme worked, determining how and in what circumstances. Six General Practices in one region became the case study sites. Quality improvement team, doctor, and pharmacist meetings within each of the General Practices were recorded at 3 stages: initial planning, review, and final. Additionally, 15 interviews conducted with the practice leads explored experiences of the quality improvement process. Observation and interview data were analysed and compared against the initial programme theory. The quality improvement resources available were used variably, with the training being valued by all. The initial programme theories were refined. In particular, local workload pressures and individual General Practitioner experiences and pre-conceived ideas were acknowledged. Where key motivators were in place, such as prior experience, the programme achieved optimal outcomes and secured a lasting quality improvement legacy. The employment of a quality improvement programme can deliver practice change and improvement legacy outcomes when particular mechanisms are employed and in contexts where there is a commitment to improve service. © 2018 John Wiley & Sons, Ltd.

  10. General framework for adsorption processes on dynamic interfaces

    International Nuclear Information System (INIS)

    Schmuck, Markus; Kalliadasis, Serafim

    2016-01-01

    We propose a novel and general variational framework modelling particle adsorption mechanisms on evolving immiscible fluid interfaces. A by-product of our thermodynamic approach is that we systematically obtain analytic adsorption isotherms for given equilibrium interfacial geometries. We validate computationally our mathematical methodology by demonstrating the fundamental properties of decreasing interfacial free energies by increasing interfacial particle densities and of decreasing surface pressure with increasing surface area. (paper)

  11. PROCEDIMIENTO GENERAL DE REDISEÑO ORGANIZACIONAL PARA MEJORAR EL ENFOQUE A PROCESOS / GENERAL PROCEDURE OF ORGANIZATIONAL REDESIGN TO IMPROVE THE PROCESS APPROACH

    Directory of Open Access Journals (Sweden)

    Daniel Alfonso-Robaina

    2011-09-01

    Full Text Available

    To improve the process approach in redesigning the organization of the enterprise, the adaptation of 6 phases is necessary. The proposal presents the activities of each phase of the Procedure of organizational redesign to improve the process approach, as well as their inputs and outputs. The procedure proposed in this research is the result of merging several of those studied, taking the procedure of Rummler and Brache (1995) [1] as its basis. Techniques such as interviews, brainstorming and bibliographic search were useful in the research, in addition to tools such as the Process Map and the General Model of Organization. Using these techniques and tools, the insufficient integrated management of processes was identified as the critical business issue at the Explomat company, which weakens the entity's ability to take advantage of the opportunities offered by its environment and endangers the fulfillment of its mission. Based on the analysis of the level of integration of the management system, using the relationship matrices, improvements were projected by drawing up the "should be" state.

    Abstract

    In order to improve the process approach in organizational redesign, the adaptation of six phases is necessary. In the proposal, the activities of each phase of the Procedure of organizational redesign to improve the process approach, as well as its inputs and outputs, are presented. The procedure proposed in this investigation is the result of the merger of several of those studied, taking as a starting point the procedure of Rummler and Brache (1995) [1]. In this investigation, techniques such as interviews, brainstorming and bibliographical search were useful, besides the employment of tools like the Processes Map and the General Model of

  12. 40 CFR 425.02 - General definitions.

    Science.gov (United States)

    2010-07-01

    Standards, Leather Tanning and Finishing Point Source Category, General Provisions, § 425.02 General definitions. ... "Chrome tan" means the process of converting hide into leather using a form of chromium. (g) "Vegetable tan" means the process of converting hides into leather using chemicals either derived from vegetable...

  13. Students' General Knowledge of the Learning Process: A Mixed Methods Study Illustrating Integrated Data Collection and Data Consolidation

    Science.gov (United States)

    van Velzen, Joke H.

    2018-01-01

    There were two purposes for this mixed methods study: to investigate (a) the realistic meaning of awareness and understanding as the underlying constructs of general knowledge of the learning process and (b) a procedure for data consolidation. The participants were 11th-grade high school and first-year university students. Integrated data…

  14. TRIO-EF a general thermal hydraulics computer code applied to the Avlis process

    International Nuclear Information System (INIS)

    Magnaud, J.P.; Claveau, M.; Coulon, N.; Yala, P.; Guilbaud, D.; Mejane, A.

    1993-01-01

    TRIO-EF is a general purpose Fluid Mechanics 3D Finite Element Code. The system capabilities cover areas such as steady state or transient, laminar or turbulent, isothermal or temperature dependent fluid flows; it is applicable to the study of coupled thermo-fluid problems involving heat conduction and possibly radiative heat transfer. It has been used to study the thermal behaviour of the AVLIS process separation module. In this process, a linear electron beam impinges on the free surface of a uranium ingot, generating a two-dimensional curtain emission of vapour from a water-cooled crucible. The energy transferred to the metal causes its partial melting, forming a pool where strong convective motion increases heat transfer towards the crucible. In the upper part of the Separation Module, the internal structures are devoted to two main functions: vapour containment and reflux, and irradiation and physical separation. They are subjected to very high temperature levels and heat transfer occurs mainly by radiation. Moreover, special attention has to be paid to electron backscattering. These two major points have been simulated numerically with TRIO-EF and the paper presents and comments on the results of such a computation for each of them. After a brief overview of the computer code, two examples of the TRIO-EF capabilities are given: a crucible thermal hydraulics model, and a thermal analysis of the internal structures

  15. [Informed consent process in clinical trials: Insights of researchers, patients and general practitioners].

    Science.gov (United States)

    Giménez, Nuria; Pedrazas, David; Redondo, Susana; Quintana, Salvador

    2016-10-01

    Adequate information for patients and respect for their autonomy are mandatory in research. This article examined the insights of researchers, patients and general practitioners (GPs) on the informed consent process in clinical trials, and the role of the GP. A cross-sectional study using three questionnaires, informed consent reviews, medical records, and hospital discharge reports. GPs, researchers and patients involved in clinical trials. Included, 504 GPs, 108 researchers, and 71 patients. Consulting the GP was recommended in 50% of the informed consents. Participation in clinical trials was shown in 33% of the medical records and 3% of the hospital discharge reports. GPs scored 3.54 points (on a 1-10 scale) on the assessment of the information received from the principal investigator. The readability of the informed consent sheet was rated 8.03 points by researchers, and the understanding was rated 7.68 points by patients. Patient satisfaction was positively associated with more time for reflection. GPs were not satisfied with the information received on the participation of patients under their care in clinical trials. Researchers were satisfied with the information they offered to patients, and were aware of the need to improve the information GPs received. Patients collaborated greatly towards biomedical research, expressed satisfaction with the overall process, and minimised the difficulties associated with participation. Copyright © 2015 Elsevier España, S.L.U. All rights reserved.

  16. Neural Generalized Predictive Control of a non-linear Process

    DEFF Research Database (Denmark)

    Sørensen, Paul Haase; Nørgård, Peter Magnus; Ravn, Ole

    1998-01-01

    The use of neural networks in non-linear control is made difficult by the fact that stability and robustness are not guaranteed and that the implementation in real time is non-trivial. In this paper we introduce a predictive controller based on a neural network model which has promising stability qualities. The controller is a non-linear version of the well-known generalized predictive controller developed in linear control theory. It involves minimization of a cost function which in the present case has to be done numerically. Therefore, we develop the necessary numerical algorithms in substantial detail and discuss the implementation difficulties. The neural generalized predictive controller is tested on a pneumatic servo system.
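
    For context, the cost that a generalized predictive controller minimizes at each sample is usually written as below; this is the standard textbook form, with generic horizons and weights rather than values from the paper, and the neural network model supplies the predictions ŷ:

        \[
          J \;=\; \sum_{j=N_1}^{N_2} \bigl[r(t+j) - \hat y(t+j)\bigr]^2
                \;+\; \rho \sum_{j=1}^{N_u} \bigl[\Delta u(t+j-1)\bigr]^2,
        \]

    where r is the reference, ŷ the model-predicted output, Δu the control increments, N_1 and N_2 the prediction horizon, N_u the control horizon and ρ a penalty weight. With a linear model the minimizer is available in closed form, whereas with a neural network model the minimization must be carried out numerically, which is the implementation issue discussed above.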

  17. General birth-death processes: probabilities, inference, and applications

    OpenAIRE

    Crawford, Forrest Wrenn

    2012-01-01

    A birth-death process is a continuous-time Markov chain that counts the number of particles in a system over time. Each particle can give birth to another particle or die, and the rate of births and deaths at any given time depends on how many extant particles there are. Birth-death processes are popular modeling tools in evolution, population biology, genetics, epidemiology, and ecology. Despite the widespread interest in birth-death models, no efficient method exists to evaluate the fini...
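
    In the notation usually attached to such models (supplied here for orientation), a birth-death process X(t) with birth rates λ_n and death rates μ_n satisfies, for small dt,

        \[
          \Pr\{X(t+dt) = n+1 \mid X(t) = n\} = \lambda_n\,dt + o(dt), \qquad
          \Pr\{X(t+dt) = n-1 \mid X(t) = n\} = \mu_n\,dt + o(dt),
        \]

    and the finite-time transition probabilities P_{mn}(t) solve the Kolmogorov forward equations

        \[
          \frac{\mathrm{d}}{\mathrm{d}t} P_{mn}(t)
            \;=\; \lambda_{n-1} P_{m,n-1}(t) + \mu_{n+1} P_{m,n+1}(t)
                  - (\lambda_n + \mu_n) P_{mn}(t);
        \]

    evaluating these transition probabilities efficiently for general rate sequences is the computational problem the abstract refers to.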

  18. A new general methodology for incorporating physico-chemical transformations into multi-phase wastewater treatment process models.

    Science.gov (United States)

    Lizarralde, I; Fernández-Arévalo, T; Brouckaert, C; Vanrolleghem, P; Ikumi, D S; Ekama, G A; Ayesa, E; Grau, P

    2015-05-01

    This paper introduces a new general methodology for incorporating physico-chemical and chemical transformations into multi-phase wastewater treatment process models in a systematic and rigorous way under a Plant-Wide modelling (PWM) framework. The methodology presented in this paper requires the selection of the relevant biochemical, chemical and physico-chemical transformations taking place and the definition of the mass transport for the co-existing phases. As an example a mathematical model has been constructed to describe a system for biological COD, nitrogen and phosphorus removal, liquid-gas transfer, precipitation processes, and chemical reactions. The capability of the model has been tested by comparing simulated and experimental results for a nutrient removal system with sludge digestion. Finally, a scenario analysis has been undertaken to show the potential of the obtained mathematical model to study phosphorus recovery. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. UN Secretary-General Normative Capability to Influence The Security Council Decision-Making Process

    Directory of Open Access Journals (Sweden)

    Dmitry Guennadievich Novik

    2016-01-01

    Full Text Available The present article studies the interrelation between the senior UN official, the Secretary-General, and the main UN body, the Security Council. The nature of the Secretary-General's role has been ambiguous since the very creation of the UN. On one hand, the Secretary-General leads the Secretariat, the body that carries out technical and subsidiary functions in relation to the other UN Main Bodies; this is how the position was initially viewed by the UN's authors. On the other hand, the UN Charter contains provisions that, under a certain reading, give the Secretary-General vigorous powers, including political ones. Since the very beginning of the UN's operation, Secretaries-General have tried to define the nature of these auxiliary powers and to formalize the practice of their use. A special place among these powers belongs to the provisions of Article 99 of the Charter. This article gives the Secretary-General the right to appeal directly to the Security Council and draw its attention to any situation that, in his opinion, may threaten international peace and security. This right was used by several Secretaries-General during crises that occurred after the creation of the UN. The article covers, consecutively, the crisis in the Congo, the Iran hostage crisis and the situation in Lebanon: three situations that forced Secretaries-General Hammarskjold, Waldheim and de Cuellar to explicitly invoke their right to appeal to the Security Council. Other cases in UN history in which the Secretary-General appealed to the Security Council while mentioning Article 99 cannot be considered uses of this article in the full sense of its spirit, since they were preceded by other appeals to the Council on the same situations by other subjects (notably UN member states) or by other actions that left the Secretary-General merely performing a technical function. The main research problem here is

  20. Evaluating the Generalization Value of Process-based Models in a Deep-in-time Machine Learning framework

    Science.gov (United States)

    Shen, C.; Fang, K.

    2017-12-01

    Deep Learning (DL) methods have made revolutionary strides in recent years. A core value proposition of DL is that abstract notions and patterns can be extracted purely from data, without the need for domain expertise. Process-based models (PBM), on the other hand, can be regarded as repositories of human knowledge or hypotheses about how systems function. Here, through computational examples, we argue that there is merit in integrating PBMs with DL due to the imbalance and lack of data in many situations, especially in hydrology. We trained a deep-in-time neural network, the Long Short-Term Memory (LSTM), to learn soil moisture dynamics from Soil Moisture Active Passive (SMAP) Level 3 product. We show that when PBM solutions are integrated into LSTM, the network is able to better generalize across regions. LSTM is able to better utilize PBM solutions than simpler statistical methods. Our results suggest PBMs have generalization value which should be carefully assessed and utilized. We also emphasize that when properly regularized, the deep network is robust and is of superior testing performance compared to simpler methods.
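
    As a rough sketch of the kind of deep-in-time network described above, the snippet below defines a small LSTM that maps a window of meteorological forcings (optionally with a process-based model's output appended as an extra input channel) to a soil-moisture estimate. It is an illustrative stand-in rather than the authors' architecture; the input size, hidden size and training details are assumptions made for the example.

        import torch
        import torch.nn as nn

        class SoilMoistureLSTM(nn.Module):
            """LSTM regressor: a sequence of forcings -> soil moisture at the last step."""
            def __init__(self, n_inputs=7, n_hidden=64):
                super().__init__()
                # n_inputs counts forcing variables; appending a PBM-simulated moisture
                # series as one more channel is one way a process-based solution can be
                # integrated into the data-driven model in the sense discussed above.
                self.lstm = nn.LSTM(input_size=n_inputs, hidden_size=n_hidden,
                                    num_layers=1, batch_first=True)
                self.head = nn.Linear(n_hidden, 1)

            def forward(self, x):                 # x: (batch, time, n_inputs)
                out, _ = self.lstm(x)
                return self.head(out[:, -1, :])   # predict at the final time step

        model = SoilMoistureLSTM()
        x = torch.randn(32, 30, 7)                # 32 sites, 30-day windows, 7 forcings
        loss = nn.functional.mse_loss(model(x), torch.rand(32, 1))
        loss.backward()                           # one illustrative gradient step (optimizer omitted)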

  1. Clinical Process Intelligence

    DEFF Research Database (Denmark)

    Vilstrup Pedersen, Klaus

    2006-01-01

    i.e. local guidelines. From a knowledge management point of view, this externalization of generalized processes gives the opportunity to learn from, evaluate and optimize the processes. "Clinical Process Intelligence" (CPI) will denote the goal of getting generalized insight into patient-centered health...

  2. Generalized Software Architecture Applied to the Continuous Lunar Water Separation Process and the Lunar Greenhouse Amplifier

    Science.gov (United States)

    Perusich, Stephen; Moos, Thomas; Muscatello, Anthony

    2011-01-01

    This innovation provides the user with autonomous on-screen monitoring, embedded computations, and tabulated output for two new processes. The software was originally written for the Continuous Lunar Water Separation Process (CLWSP), but was found to be general enough to be applicable to the Lunar Greenhouse Amplifier (LGA) as well, with minor alterations. The resultant program should have general applicability to many laboratory processes (see figure). The objective for these programs was to create a software application that would provide both autonomous monitoring and data storage, along with manual manipulation. The software also allows operators the ability to input experimental changes and comments in real time without modifying the code itself. Common process elements, such as thermocouples, pressure transducers, and relative humidity sensors, are easily incorporated into the program in various configurations, along with specialized devices such as photodiode sensors. The goal of the CLWSP research project is to design, build, and test a new method to continuously separate, capture, and quantify water from a gas stream. The application is any In-Situ Resource Utilization (ISRU) process that desires to extract or produce water from lunar or planetary regolith. The present work is aimed at circumventing current problems and ultimately producing a system capable of continuous operation at moderate temperatures that can be scaled over a large capacity range depending on the ISRU process. The goal of the LGA research project is to design, build, and test a new type of greenhouse that could be used on the moon or Mars. The LGA uses super greenhouse gases (SGGs) to absorb long-wavelength radiation, thus creating a highly efficient greenhouse at a future lunar or Mars outpost. Silica-based glass, although highly efficient at trapping heat, is heavy, fragile, and not suitable for space greenhouse applications. Plastics are much lighter and resilient, but are not

  3. How does care coordination provided by registered nurses "fit" within the organisational processes and professional relationships in the general practice context?

    Science.gov (United States)

    Ehrlich, Carolyn; Kendall, Elizabeth; St John, Winsome

    2013-01-01

    The aim of this study was to develop understanding about how a registered nurse-provided care coordination model can "fit" within organisational processes and professional relationships in general practice. In this project, registered nurses were involved in implementation of registered nurse-provided care coordination, which aimed to improve quality of care and support patients with chronic conditions to maintain their care and manage their lifestyle. Focus group interviews were conducted with nurses using a semi-structured interview protocol. Interpretive analysis of interview data was conducted using Normalization Process Theory to structure data analysis and interpretation. Three core themes emerged: (1) pre-requisites for care coordination, (2) the intervention in context, and (3) achieving outcomes. Pre-requisites were adequate funding mechanisms, engaging organisational power-brokers, leadership roles, and utilising and valuing registered nurses' broad skill base. To ensure registered nurse-provided care coordination processes were sustainable and embedded, mentoring and support as well as allocated time were required. Finally, when registered nurse-provided care coordination was supported, positive client outcomes were achievable, and transformation of professional practice and development of advanced nursing roles was possible. Registered nurse-provided care coordination could "fit" within the context of general practice if it was adequately resourced. However, the heterogeneity of general practice can create an impasse that could be addressed through close attention to shared and agreed understandings. Successful development and implementation of registered nurse roles in care coordination requires attention to educational preparation, support of the individual nurse, and attention to organisational structures, financial implications and team member relationships.

  4. 28 CFR 65.40 - General.

    Science.gov (United States)

    2010-07-01

    ... Administration DEPARTMENT OF JUSTICE (CONTINUED) EMERGENCY FEDERAL LAW ENFORCEMENT ASSISTANCE Submission and Review of Applications § 65.40 General. This subpart describes the process and criteria for the Attorney General's review and approval or disapproval of state applications. The original application, on Standard...

  5. Combining Generalized Renewal Processes with Non-Extensive Entropy-Based q-Distributions for Reliability Applications

    Directory of Open Access Journals (Sweden)

    Isis Didier Lins

    2018-03-01

    Full Text Available The Generalized Renewal Process (GRP is a probabilistic model for repairable systems that can represent the usual states of a system after a repair: as new, as old, or in a condition between new and old. It is often coupled with the Weibull distribution, widely used in the reliability context. In this paper, we develop novel GRP models based on probability distributions that stem from the Tsallis’ non-extensive entropy, namely the q-Exponential and the q-Weibull distributions. The q-Exponential and Weibull distributions can model decreasing, constant or increasing failure intensity functions. However, the power law behavior of the q-Exponential probability density function for specific parameter values is an advantage over the Weibull distribution when adjusting data containing extreme values. The q-Weibull probability distribution, in turn, can also fit data with bathtub-shaped or unimodal failure intensities in addition to the behaviors already mentioned. Therefore, the q-Exponential-GRP is an alternative for the Weibull-GRP model and the q-Weibull-GRP generalizes both. The method of maximum likelihood is used for their parameters’ estimation by means of a particle swarm optimization algorithm, and Monte Carlo simulations are performed for the sake of validation. The proposed models and algorithms are applied to examples involving reliability-related data of complex systems and the obtained results suggest GRP plus q-distributions are promising techniques for the analyses of repairable systems.
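
    The sketch below shows, under stated assumptions, how a q-Weibull density and survival function can be combined with a Kijima type-I virtual age to evaluate a GRP log-likelihood for a sequence of times between failures. The data, the parameter values and the Kijima-I choice are illustrative; a real analysis would maximize this likelihood (the paper does so with particle swarm optimization).

    # Sketch of a q-Weibull GRP log-likelihood with a Kijima type-I virtual age;
    # 'rejuv' is the repair-effectiveness parameter (illustrative assumption).
    import numpy as np

    def qweibull_pdf(t, q, beta, eta):
        u = (t / eta) ** beta
        core = np.maximum(1.0 - (1.0 - q) * u, 0.0)
        return (2.0 - q) * (beta / eta) * (t / eta) ** (beta - 1.0) * core ** (1.0 / (1.0 - q))

    def qweibull_sf(t, q, beta, eta):
        u = (t / eta) ** beta
        core = np.maximum(1.0 - (1.0 - q) * u, 0.0)
        return core ** ((2.0 - q) / (1.0 - q))

    def grp_loglik(times_between_failures, q, beta, eta, rejuv):
        loglik, v = 0.0, 0.0            # v is the virtual age after the last repair
        for x in times_between_failures:
            # conditional density of the next inter-failure time given virtual age v
            loglik += np.log(qweibull_pdf(v + x, q, beta, eta) / qweibull_sf(v, q, beta, eta))
            v = v + rejuv * x           # Kijima type-I virtual age update
        return loglik

    print(grp_loglik([120.0, 95.0, 60.0, 40.0], q=1.2, beta=1.5, eta=100.0, rejuv=0.3))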

  6. VoxelMages: a general-purpose graphical interface for designing geometries and processing DICOM images for PENELOPE.

    Science.gov (United States)

    Giménez-Alventosa, V; Ballester, F; Vijande, J

    2016-12-01

    The design and construction of geometries for Monte Carlo calculations is an error-prone, time-consuming, and complex step in simulations describing particle interactions and transport in the field of medical physics. The software VoxelMages has been developed to help the user in this task. It allows the user to design complex geometries and to process DICOM image files for simulations with the general-purpose Monte Carlo code PENELOPE in an easy and straightforward way. VoxelMages also allows the user to import DICOM-RT structure contour information as delivered by a treatment planning system. Its main characteristics, usage and performance benchmarking are described in detail. Copyright © 2016 Elsevier Ltd. All rights reserved.
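
    The snippet below is a rough sketch, not VoxelMages itself, of the kind of DICOM preprocessing such a tool automates: reading a CT slice with pydicom, converting pixel values to Hounsfield units, and binning voxels into a few material indices for a Monte Carlo voxel geometry. The file path and the HU thresholds are placeholder assumptions.

    # Crude illustration of DICOM-to-voxel-material preprocessing (not VoxelMages code).
    import numpy as np
    import pydicom

    ds = pydicom.dcmread("slice_0001.dcm")                    # placeholder path
    hu = ds.pixel_array * float(ds.RescaleSlope) + float(ds.RescaleIntercept)

    # Illustrative material binning: air / soft tissue / bone.
    materials = np.digitize(hu, bins=[-500.0, 300.0])         # values 0, 1, 2
    print(materials.shape, np.bincount(materials.ravel()))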

  7. 23 CFR 710.301 - General.

    Science.gov (United States)

    2010-04-01

    ... 23 Highways 1 2010-04-01 2010-04-01 false General. 710.301 Section 710.301 Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION RIGHT-OF-WAY AND ENVIRONMENT RIGHT-OF-WAY AND REAL ESTATE Project Development § 710.301 General. The project development process typically follows a...

  8. O(3)-invariant tunneling in general relativity

    International Nuclear Information System (INIS)

    Berezin, V.A.; Tkachev, I.I.; Kuzmin, V.A.; AN SSSR, Moscow. Inst. Yadernykh Issledovanij)

    1987-12-01

    We derived a general formula for the action for any O(3)-invariant tunneling processes in false vacuum decay in general relativity. The general classification of the bubble Euclidean trajectories is elaborated and explicit expressions for bounces for some processes like the vacuum creation of a double bubble, in particular in the vicinity of a black hole; the subbarrier creation of the Einstein-Rosen bridge, creation from nothing of two Minkowski worlds connected by a shell etc., are given. (orig.)

  9. Processing of FRG mixed oxide fuel elements at General Atomic under the US/FRG cooperative agreement for spent fuel elements

    International Nuclear Information System (INIS)

    Holder, N.D.; Strand, J.B.; Schwarz, F.A.; Tischer, H.E.

    1980-11-01

    The Federal Republic of Germany (FRG) and the United States (US) are cooperating on certain aspects of gas-cooled reactor technology under an umbrella agreement. Under the spent fuel treatment section of the agreement, FRG fuel spheres were recently sent for processing in the Department of Energy sponsored cold pilot plant for High-Temperature Gas-Cooled Reactor (HTGR) fuel processing at General Atomic Company in San Diego, California. The FRG fuel spheres were crushed and burned to recover coated fuel particles. These particles were in turn crushed and burned to recover the fuel-bearing kernels for further treatment for uranium recovery. Successful completion of the tests described in this paper demonstrated the applicability of the US HTGR fuel treatment flowsheet to FRG fuel processing. 10 figures

  10. 28 CFR 33.60 - General.

    Science.gov (United States)

    2010-07-01

    ... 28 Judicial Administration 1 2010-07-01 2010-07-01 false General. 33.60 Section 33.60 Judicial Administration DEPARTMENT OF JUSTICE BUREAU OF JUSTICE ASSISTANCE GRANT PROGRAMS Criminal Justice Block Grants Submission and Review of Applications § 33.60 General. This subpart describes the process and criteria for...

  11. Imprecision and uncertainty in information representation and processing new tools based on intuitionistic fuzzy sets and generalized nets

    CERN Document Server

    Sotirov, Sotir

    2016-01-01

    The book offers a comprehensive and timely overview of advanced mathematical tools for both uncertainty analysis and modeling of parallel processes, with a special emphasis on intuitionistic fuzzy sets and generalized nets. The different chapters, written by active researchers in their respective areas, are structured to provide a coherent picture of this interdisciplinary yet still evolving field of science. They describe key tools and give practical insights into and research perspectives on the use of Atanassov's intuitionistic fuzzy sets and logic, and generalized nets for describing and dealing with uncertainty in different areas of science, technology and business, in a single, to date unique book. Here, readers find theoretical chapters, dealing with intuitionistic fuzzy operators, membership functions and algorithms, among other topics, as well as application-oriented chapters, reporting on the implementation of methods and relevant case studies in management science, the IT industry, medicine and/or ...

  12. Optimal Constant-Stress Accelerated Degradation Test Plans Using Nonlinear Generalized Wiener Process

    Directory of Open Access Journals (Sweden)

    Zhen Chen

    2016-01-01

    Full Text Available Accelerated degradation test (ADT has been widely used to assess highly reliable products’ lifetime. To conduct an ADT, an appropriate degradation model and test plan should be determined in advance. Although many historical studies have proposed quite a few models, there is still room for improvement. Hence we propose a Nonlinear Generalized Wiener Process (NGWP model with consideration of the effects of stress level, product-to-product variability, and measurement errors for a higher estimation accuracy and a wider range of use. Then under the constraints of sample size, test duration, and test cost, the plans of constant-stress ADT (CSADT with multiple stress levels based on the NGWP are designed by minimizing the asymptotic variance of the reliability estimation of the products under normal operation conditions. An optimization algorithm is developed to determine the optimal stress levels, the number of units allocated to each level, inspection frequency, and measurement times simultaneously. In addition, a comparison based on degradation data of LEDs is made to show better goodness-of-fit of the NGWP than that of other models. Finally, optimal two-level and three-level CSADT plans under various constraints and a detailed sensitivity analysis are demonstrated through examples in this paper.
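
    A minimal simulation sketch of a generalized Wiener degradation path under an accelerated stress is given below; the drift-stress relation, the power-law time transformation and all parameter values are assumptions for illustration and not the paper's exact NGWP specification.

    # Assumed form: X(t) = mu(s) * t**b + sigma * W(t**b), observed with measurement error.
    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_path(stress, t_grid, a0=-2.0, a1=0.8, b=0.6, sigma=0.05, meas_sd=0.01):
        mu = np.exp(a0 + a1 * stress)                # stress-dependent drift (assumption)
        lam = t_grid ** b                            # nonlinear time transformation
        dW = rng.normal(0.0, np.sqrt(np.diff(lam, prepend=0.0)))
        x = mu * lam + sigma * np.cumsum(dW)         # latent degradation path
        return x + rng.normal(0.0, meas_sd, size=t_grid.size)   # add measurement error

    t = np.linspace(0.0, 500.0, 51)[1:]              # inspection times (skip t = 0)
    paths = np.array([simulate_path(stress=1.5, t_grid=t) for _ in range(10)])
    print(paths.shape)                               # (10, 50) degradation measurements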

  13. Improvement of the training process of qualified female athletes engaged in bodybuilding in the general preparatory stage of the preparatory period, taking into account the biological cycle

    Directory of Open Access Journals (Sweden)

    Viacheslav Mulyk

    2017-06-01

    Full Text Available Purpose: substantiation of the methodology of the training process of qualified female athletes engaged in bodybuilding in the general preparatory stage of the preparatory period, taking into account the biological cycle. Material & Methods: 18 qualified female athletes engaged in bodybuilding, members of the Kharkov region bodybuilding team, participated in the study. Results: a comparative characteristic of the most frequently used training methodologies in bodybuilding is shown. An optimal methodology for qualified female athletes engaged in bodybuilding has been developed and justified, depending on the initial form of the athlete at the beginning of the general preparatory stage of training. The dependence of the change in the body weight of female athletes on the training process is shown. Conclusion: on the basis of the study, the author suggests an optimal training methodology depending on the mesocycle of training in the preparatory period in the general preparatory stage.

  14. 21 CFR 880.6890 - General purpose disinfectants.

    Science.gov (United States)

    2010-04-01

    ... (CONTINUED) MEDICAL DEVICES GENERAL HOSPITAL AND PERSONAL USE DEVICES General Hospital and Personal Use... disinfectant is a germicide intended to process noncritical medical devices and equipment surfaces. A general... prior to terminal sterilization or high level disinfection. Noncritical medical devices make only...

  15. Impact of stuttering severity on adolescents' domain-specific and general self-esteem through cognitive and emotional mediating processes.

    Science.gov (United States)

    Adriaensens, Stefanie; Beyers, Wim; Struyf, Elke

    2015-01-01

    The theory that self-esteem is substantially constructed based on social interactions implies that having a stutter could have a negative impact on self-esteem. Specifically, self-esteem during adolescence, a period of life characterized by increased self-consciousness, could be at risk. In addition to studying mean differences between stuttering and non-stuttering adolescents, this article concentrates on the influence of stuttering severity on domain-specific and general self-esteem. Subsequently, we investigate if covert processes on negative communication attitudes, experienced stigma, non-disclosure of stuttering, and (mal)adaptive perfectionism mediate the relationship between stuttering severity and self-esteem. Our sample comprised 55 stuttering and 76 non-stuttering adolescents. They were asked to fill in a battery of questionnaires, consisting of: Subjective Screening of Stuttering, Self-Perception Profile for Adolescents, Erickson S-24, Multidimensional Perfectionism Scale, and the Stigmatization and Disclosure in Adolescents Who Stutter Scale. SEM (structural equation modeling) analyses showed that stuttering severity negatively influences adolescents' evaluations of social acceptance, school competence, the competence to experience a close friendship, and global self-esteem. Maladaptive perfectionism and especially negative communication attitudes fully mediate the negative influence of stuttering severity on self-esteem. Group comparison showed that the mediation model applies to both stuttering and non-stuttering adolescents. We acknowledge the impact of having a stutter on those domains of the self in which social interactions and communication matter most. We then accentuate that negative attitudes about communication situations and excessive worries about saying things in ways they perceive as wrong are important processes to consider with regard to the self-esteem of adolescents who stutter. Moreover, we provide evidence that these covert

  16. Diphoton generalized distribution amplitudes

    International Nuclear Information System (INIS)

    El Beiyad, M.; Pire, B.; Szymanowski, L.; Wallon, S.

    2008-01-01

    We calculate the leading order diphoton generalized distribution amplitudes by calculating the amplitude of the process γ*γ→γγ in the low energy and high photon virtuality region at the Born order and in the leading logarithmic approximation. As in the case of the anomalous photon structure functions, the γγ generalized distribution amplitudes exhibit a characteristic ln Q² behavior and obey inhomogeneous QCD evolution equations.

  17. Management of the General Process of Parenteral Nutrition Using mHealth Technologies: Evaluation and Validation Study.

    Science.gov (United States)

    Cervera Peris, Mercedes; Alonso Rorís, Víctor Manuel; Santos Gago, Juan Manuel; Álvarez Sabucedo, Luis; Wanden-Berghe, Carmina; Sanz-Valero, Javier

    2018-04-03

    Any system applied to the control of parenteral nutrition (PN) ought to prove that the process meets the established requirements and include a repository of records to allow evaluation of the information about PN processes at any time. The goal of the research was to evaluate the mobile health (mHealth) app and validate its effectiveness in monitoring the management of the PN process. We studied the evaluation and validation of the general process of PN using an mHealth app. The units of analysis were the PN bags prepared and administered at the Son Espases University Hospital, Palma, Spain, from June 1 to September 6, 2016. For the evaluation of the app, we used the Poststudy System Usability Questionnaire and subsequent analysis with the Cronbach alpha coefficient. Validation was performed by checking the compliance of control for all operations on each of the stages (validation and transcription of the prescription, preparation, conservation, and administration) and by monitoring the operative control points and critical control points. The results obtained from 387 bags were analyzed, with 30 interruptions of administration. The fulfillment of stages was 100%, including noncritical nonconformities in the storage control. The average deviation in the weight of the bags was less than 5%, and the infusion time did not present deviations greater than 1 hour. The developed app successfully passed the evaluation and validation tests and was implemented to perform the monitoring procedures for the overall PN process. A new mobile solution to manage the quality and traceability of sensitive medicines such as blood-derivative drugs and hazardous drugs derived from this project is currently being deployed. ©Mercedes Cervera Peris, Víctor Manuel Alonso Rorís, Juan Manuel Santos Gago, Luis Álvarez Sabucedo, Carmina Wanden-Berghe, Javier Sanz-Valero. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 03.04.2018.
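
    The Cronbach alpha coefficient mentioned for the usability questionnaire can be computed as in the short sketch below; the item scores are made-up stand-ins (rows are respondents, columns are questionnaire items).

    import numpy as np

    def cronbach_alpha(scores):
        scores = np.asarray(scores, dtype=float)
        k = scores.shape[1]                           # number of items
        item_vars = scores.var(axis=0, ddof=1).sum()  # sum of item variances
        total_var = scores.sum(axis=1).var(ddof=1)    # variance of the total scores
        return (k / (k - 1.0)) * (1.0 - item_vars / total_var)

    scores = [[5, 4, 5, 4],
              [3, 3, 4, 3],
              [4, 4, 4, 5],
              [2, 3, 2, 3]]
    print(round(cronbach_alpha(scores), 3))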

  18. A generalized additive regression model for survival times

    DEFF Research Database (Denmark)

    Scheike, Thomas H.

    2001-01-01

    Additive Aalen model; counting process; disability model; illness-death model; generalized additive models; multiple time-scales; non-parametric estimation; survival data; varying-coefficient models

  19. Exposure to rubber fume and rubber process dust in the general rubber goods, tyre manufacturing and retread industries.

    Science.gov (United States)

    Dost, A A; Redman, D; Cox, G

    2000-08-01

    This study assesses the current patterns and levels of exposure to rubber fume and rubber process dust in the British rubber industry and compares and contrasts the data obtained from the general rubber goods (GRG), retread tire (RT) and new tire (NT) sectors. A total of 179 rubber companies were visited and data were obtained from 52 general rubber goods, 29 retread tire and 7 new tire manufacturers. The survey was conducted using a questionnaire and included a walk-through inspection of the workplace to assess the extent of use of control measures and the nature of work practices being employed. The most recent (predominantly 1995-97) exposure monitoring data for rubber fume and rubber process dust were obtained from these companies; no additional sampling was conducted for the purpose of this study. In addition to the assessment of exposure data, evaluation of occupational hygiene reports for the quality of information and advice was also carried out. A comparison of the median exposures for processes showed that the order of exposure to rubber fume (E, in mg m⁻³) is: E(moulding) (0.40) ≈ E(extrusion) (0.33) > E(milling) (0.18) for GRG; E(press) (0.32) > E(extrusion) (0.19) > E(autoclave) (0.10) for RT; and E(press) (0.22) ≈ E(all other) (0.22) for NT. The order of exposure to rubber fume between sectors was E(GRG) (0.40) > E(RT) (0.32) > E(NT) (0.22). Median exposures to rubber process dust in the GRG sector were E(weighing) (4.2) > E(mixing) (1.2) ≈ E(milling) (0.8) ≈ E(extrusion) (0.8), with no significant difference (P=0.31) between the GRG and NT sectors. The findings compare well with the study carried out in the Netherlands [Kromhout et al. (1994), Annals of Occupational Hygiene 38(1), 3-22], and it is suggested that the factors governing the significant differences noted between the three sectors relate principally to the production and task functions and also to the extent of controls employed. Evaluation of occupational

  20. Generally representative is generally representative: comment on Shuttleworth-Edwards.

    Science.gov (United States)

    Taylor, Nicola

    2016-10-01

    The aim of this paper is to provide comment on Shuttleworth-Edwards' criticism of the general population norms created for the South African adaptation of the WAIS-IV. In her criticism, she states that the norms are not applicable for any groups in South Africa, based on the fact that the norms were not stratified according to quality of education. A discussion of some of the key issues that impact on the creation of general population norms in the South African context is provided. Demographic characteristics such as education level, quality of education, urban and rural demarcations, and home language are all considered. While the utility of within-group norms is not denied, the adoption of these without reference to the general population is not advised. To recommend that practitioners simply dispense with the general population norm without evidence that it creates misclassification or does not function effectively for the intended population lacks scientific merit at the current time. The need for clinical studies and further predictive validity research using the South African adaptation of the WAIS-IV is crucial to demonstrate the continued utility of the test in the South African context. Additional reference groups will improve the amount of comparative information available for clinicians to be able to make better informed decisions for diagnosis, but the general population norms will be an important starting point in this process.

  1. Generalized enthalpy model of a high-pressure shift freezing process

    KAUST Repository

    Smith, N. A. S.; Peppin, S. S. L.; Ramos, A. M.

    2012-01-01

    High-pressure freezing processes are a novel emerging technology in food processing, offering significant improvements to the quality of frozen foods. To be able to simulate plateau times and thermal history under different conditions, in this work

  2. One-loop QCD and Higgs bosons to partons processes using six-dimensional helicity and generalized unitarity

    International Nuclear Information System (INIS)

    Davies, Scott

    2011-01-01

    We combine the six-dimensional helicity formalism of Cheung and O'Connell with D-dimensional generalized unitarity to obtain a new formalism for computing one-loop amplitudes in dimensionally regularized QCD. With this procedure, we simultaneously obtain the pieces that are constructible from four-dimensional unitarity cuts and the rational pieces that are missed by them, while retaining a helicity formalism. We illustrate the procedure using four- and five-point one-loop amplitudes in QCD, including examples with external fermions. We also demonstrate the technique's effectiveness in next-to-leading order QCD corrections to Higgs processes by computing the next-to-leading order correction to the Higgs plus three positive-helicity gluons amplitude in the large top-quark mass limit.

  3. Real-time video signal processing by generalized DDA and control memories: three-dimensional rotation and mapping

    Science.gov (United States)

    Hama, Hiromitsu; Yamashita, Kazumi

    1991-11-01

    A new method for video signal processing is described in this paper. The purpose is real-time image transformation at low cost, with low power and small hardware, which is impossible without special hardware. Here the generalized digital differential analyzer (DDA) and control memories (CM) play a very important role. Indentation, known as jaggies, appears on the boundary between the background and the foreground as a result of the processing. Jaggies do not occur inside the transformed image because linear interpolation is adopted, but they do occur inherently on the boundary of the background and the transformed images. They degrade image quality and must be avoided. There are two well-known ways to improve image quality, blurring and supersampling: the former does not have much effect, and the latter has a much higher computing cost. To settle this problem, a method is proposed that searches for positions where jaggies may arise and smooths such points. Computer simulations based on real data from a VTR, one scene of a movie, are presented to demonstrate the proposed scheme using a DDA and CMs and to confirm its effectiveness on various transformations.

  4. Perturbed GUE Minor Process and Warren's Process with Drifts

    Science.gov (United States)

    Ferrari, Patrik L.; Frings, René

    2014-01-01

    We consider the minor process of (Hermitian) matrix diffusions with constant diagonal drifts. At any given time, this process is determinantal and we provide an explicit expression for its correlation kernel. This is a measure on the Gelfand-Tsetlin pattern that also appears in a generalization of Warren's process (Electron. J. Probab. 12:573-590, 2007), in which Brownian motions have level-dependent drifts. Finally, we show that this process arises in a diffusion scaling limit from an interacting particle system in the anisotropic KPZ class in 2+1 dimensions introduced in Borodin and Ferrari (Commun. Math. Phys., 2008). Our results generalize the known results for the zero drift situation.
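
    The sketch below is an illustrative simulation, not taken from the paper, of a Hermitian matrix diffusion with constant diagonal drifts observed at one fixed time, together with the eigenvalues of its top-left minors (one level of the Gelfand-Tsetlin pattern per size k); the drifts and dimensions are arbitrary.

    import numpy as np

    rng = np.random.default_rng(1)

    def gue_sample(n):
        # A GUE matrix; Brownian motion in Hermitian matrices at time t is sqrt(t) times this.
        g = (rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))) / np.sqrt(2.0)
        return (g + g.conj().T) / np.sqrt(2.0)

    n, t = 5, 1.0
    drifts = np.array([2.0, 1.0, 0.0, -1.0, -2.0])            # constant diagonal drifts
    H_t = np.diag(drifts) * t + np.sqrt(t) * gue_sample(n)    # matrix diffusion at time t

    for k in range(1, n + 1):
        print(k, np.round(np.linalg.eigvalsh(H_t[:k, :k]), 3))  # interlacing minor eigenvalues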

  5. Genetic Process Mining: Alignment-based Process Model Mutation

    NARCIS (Netherlands)

    Eck, van M.L.; Buijs, J.C.A.M.; Dongen, van B.F.; Fournier, F.; Mendling, J.

    2015-01-01

    The Evolutionary Tree Miner (ETM) is a genetic process discovery algorithm that enables the user to guide the discovery process based on preferences with respect to four process model quality dimensions: replay fitness, precision, generalization and simplicity. Traditionally, the ETM algorithm uses

  6. Generalization of Gibbs Entropy and Thermodynamic Relation

    OpenAIRE

    Park, Jun Chul

    2010-01-01

    In this paper, we extend Gibbs's approach of quasi-equilibrium thermodynamic processes, and calculate the microscopic expression of entropy for general non-equilibrium thermodynamic processes. Also, we analyze the formal structure of thermodynamic relation in non-equilibrium thermodynamic processes.

  7. The professional’s orientation in the formative process for the bachelor’s general united students

    Directory of Open Access Journals (Sweden)

    Darwin Stalin Faz-Delgado

    2016-11-01

    Full Text Available In Ecuador, Primary Education has the goal of developing abilities, skills and linguistic competence in children and teenagers from 5 years old until they reach High School. High School's main objective is to provide students with a general and interdisciplinary preparation that guides them in elaborating their life projects so that they can fit into society as responsible, critical and solidary human beings. It also intends to develop students' abilities in knowledge acquisition and citizen competence and to prepare them to work, to learn and to access University; this establishes the importance of an adequate professional orientation that facilitates the conscious selection of their future profession and career. This article contains the theoretical basis of the process of formation and professional orientation in High School, with attention to the Ecuadorian context.

  8. General practitioners' decision to refer patients to dietitians: insight into the clinical reasoning process.

    Science.gov (United States)

    Pomeroy, Sylvia E M; Cant, Robyn P

    2010-01-01

    The aim of this project was to describe general practitioners' (GPs') decision-making process for reducing nutrition risk in cardiac patients through referring a patient to a dietitian. The setting was primary care practices in Victoria. The method we employed was mixed methods research: in Study 1, 30 GPs were interviewed. Recorded interviews were transcribed and narratives analysed thematically. Study 2 involved a survey of statewide random sample of GPs. Frequencies and analyses of variance were used to explore the impact of demographic variables on decisions to refer. We found that the referral decision involved four elements: (i) synthesising management information; (ii) forecasting outcomes; (iii) planning management; and (iv) actioning referrals. GPs applied cognitive and collaborative strategies to develop a treatment plan. In Study 2, doctors (248 GPs, 30%) concurred with identified barriers/enabling factors for patients' referral. There was no association between GPs' sex, age or hours worked per week and referral factors. We conclude that a GP's judgment to offer a dietetic referral to an adult patient is a four element reasoning process. Attention to how these elements interact may assist clinical decision making. Apart from the sole use of prescribed medications/surgical procedures for cardiac care, patients offered a dietetic referral were those who were considered able to commit to dietary change and who were willing to attend a dietetic consultation. Improvements in provision of patients' nutrition intervention information to GPs are needed. Further investigation is justified to determine how to resolve this practice gap.

  9. The General Aggression Model

    NARCIS (Netherlands)

    Allen, Johnie J.; Anderson, Craig A.; Bushman, Brad J.

    The General Aggression Model (GAM) is a comprehensive, integrative, framework for understanding aggression. It considers the role of social, cognitive, personality, developmental, and biological factors on aggression. Proximate processes of GAM detail how person and situation factors influence

  10. Application of the generalized multi structural (GMS) wave function to photoelectron spectra and electron scattering processes

    International Nuclear Information System (INIS)

    Nascimento, M.A.C. do

    1992-01-01

    A Generalized Multi Structural (GMS) wave function is presented which combines the advantages of the SCF-MO and VB models, preserving the classical chemical structures but optimizing the orbitals in a self-consistent way. This wave function is particularly suitable to treat situations where the description of the molecular state requires localized wave functions. It also provides a very convenient way of treating the electron correlation problem, avoiding large CI expansions. The final wave functions are much more compact and easier to interpret than the ones obtained by the conventional methods, using orthogonal orbitals. Applications of the GMS wave function to the study of the photoelectron spectra of the trans-glyoxal molecule and to electron impact excitation processes in the nitrogen molecule are presented as an illustration of the method. (author)

  11. Associations between Grawe's general mechanisms of change and Young's early maladaptive schemas in psychotherapy research: a comparative study of change processes.

    Science.gov (United States)

    Mander, Johannes V; Jacob, Gitta A; Götz, Lea; Sammet, Isa; Zipfel, Stephan; Teufel, Martin

    2015-01-01

    The study aimed at analyzing associations between Grawe's general mechanisms of change and Young's early maladaptive schemas (EMS). Therefore, 98 patients completed the Scale for the Multiperspective Assessment of General Change Mechanisms in Psychotherapy (SACiP), the Young Schema Questionnaire-Short Form Revised (YSQ S3R), and diverse outcome measures at the beginning and end of treatment. Our results are important for clinical applications, as we demonstrated strong predictive effects of change mechanisms on schema domains using regression analyses and cross-lagged panel models. Resource activation experiences seem to be especially crucial in fostering alterations in EMS, as this change mechanism demonstrated significant associations with several schema domains. Future research should investigate these aspects in more detail using observer-based micro-process analyses.

  12. The Generalized Quantum Statistics

    OpenAIRE

    Hwang, WonYoung; Ji, Jeong-Young; Hong, Jongbae

    1999-01-01

    The concept of wavefunction reduction should be introduced to standard quantum mechanics in any physical processes where effective reduction of wavefunction occurs, as well as in the measurement processes. When the overlap is negligible, each particle obeys Maxwell-Boltzmann statistics even if the particles are in principle described by a totally symmetrized wavefunction [P.R. Holland, The Quantum Theory of Motion, Cambridge University Press, 1993, p293]. We generalize the conjecture. That is, par...

  13. A generalized gyrokinetic Poisson solver

    International Nuclear Information System (INIS)

    Lin, Z.; Lee, W.W.

    1995-03-01

    A generalized gyrokinetic Poisson solver has been developed, which employs local operations in the configuration space to compute the polarization density response. The new technique is based on the actual physical process of gyrophase-averaging. It is useful for nonlocal simulations using general geometry equilibrium. Since it utilizes local operations rather than the global ones such as FFT, the new method is most amenable to massively parallel algorithms

  14. Tuned with a tune: Talker normalization via general auditory processes

    Directory of Open Access Journals (Sweden)

    Erika J C Laing

    2012-06-01

    Full Text Available Voices have unique acoustic signatures, contributing to the acoustic variability listeners must contend with in perceiving speech, and it has long been proposed that listeners normalize speech perception to information extracted from a talker’s speech. Initial attempts to explain talker normalization relied on extraction of articulatory referents, but recent studies of context-dependent auditory perception suggest that general auditory referents such as the long-term average spectrum (LTAS of a talker’s speech similarly affect speech perception. The present study aimed to differentiate the contributions of articulatory/linguistic versus auditory referents for context-driven talker normalization effects and, more specifically, to identify the specific constraints under which such contexts impact speech perception. Synthesized sentences manipulated to sound like different talkers influenced categorization of a subsequent speech target only when differences in the sentences’ LTAS were in the frequency range of the acoustic cues relevant for the target phonemic contrast. This effect was true both for speech targets preceded by spoken sentence contexts and for targets preceded by nonspeech tone sequences that were LTAS-matched to the spoken sentence contexts. Specific LTAS characteristics, rather than perceived talker, predicted the results suggesting that general auditory mechanisms play an important role in effects considered to be instances of perceptual talker normalization.
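
    A long-term average spectrum of the kind invoked here can be approximated by Welch-averaged power spectra, as in the sketch below; the stand-in signal, the sampling rate and the frequency band are placeholder assumptions.

    import numpy as np
    from scipy.signal import welch

    def long_term_average_spectrum(signal, fs, nperseg=1024):
        # Welch averaging of short-window power spectra approximates the LTAS.
        freqs, psd = welch(signal, fs=fs, nperseg=nperseg)
        return freqs, 10.0 * np.log10(psd + 1e-12)            # in dB

    fs = 16000
    t = np.arange(0, 2.0, 1.0 / fs)
    fake_sentence = np.random.randn(t.size) * np.hanning(t.size)   # stand-in audio
    freqs, ltas_db = long_term_average_spectrum(fake_sentence, fs)

    # e.g., compare mean LTAS level in the band carrying the target's acoustic cues
    band = (freqs >= 300) & (freqs <= 900)
    print(ltas_db[band].mean())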

  15. Teachers' Understanding of Algebraic Generalization

    Science.gov (United States)

    Hawthorne, Casey Wayne

    Generalization has been identified as a cornerstone of algebraic thinking (e.g., Lee, 1996; Sfard, 1995) and is at the center of a rich conceptualization of K-8 algebra (Kaput, 2008; Smith, 2003). Moreover, mathematics teachers are being encouraged to use figural-pattern generalizing tasks as a basis of student-centered instruction, whereby teachers respond to and build upon the ideas that arise from students' explorations of these activities. Although more and more teachers are engaging their students in such generalizing tasks, little is known about teachers' understanding of generalization and their understanding of students' mathematical thinking in this domain. In this work, I addressed this gap, exploring the understanding of algebraic generalization of 4 exemplary 8th-grade teachers from multiple perspectives. A significant feature of this investigation is an examination of teachers' understanding of the generalization process, including the use of algebraic symbols. The research consisted of two phases. Phase I was an examination of the teachers' understandings of the underlying quantities and quantitative relationships represented by algebraic notation. In Phase II, I observed the instruction of 2 of these teachers. Using the lens of professional noticing of students' mathematical thinking, I explored the teachers' enacted knowledge of algebraic generalization, characterizing how it supported them to effectively respond to the needs and queries of their students. Results indicated that teachers predominantly see these figural patterns as enrichment activities, disconnected from course content. Furthermore, in my analysis, I identified conceptual difficulties teachers experienced when solving generalization tasks, in particular, connecting multiple symbolic representations with the quantities in the figures. Moreover, while the teachers strived to overcome the challenges of connecting different representations, they invoked both productive and unproductive

  16. 40 CFR 434.11 - General definitions.

    Science.gov (United States)

    2010-07-01

    ... General Provisions § 434.11 General definitions. (a) The term “acid or ferruginous mine drainage” means mine drainage which, before any treatment, either has a pH of less than 6.0 or a total iron... processes within a coal preparation plant. (h) The term “mine drainage” means any drainage, and any water...

  17. Time-changed Ornstein–Uhlenbeck process

    International Nuclear Information System (INIS)

    Gajda, Janusz; Wyłomańska, Agnieszka

    2015-01-01

    The Ornstein–Uhlenbeck process is one of the most popular systems used for financial data description. However, this process has also been examined in the context of many other phenomena. In this paper we consider the so-called time-changed Ornstein–Uhlenbeck process, in which time is replaced by an inverse subordinator of general infinite divisible distribution. Time-changed processes nowadays play an important role in various fields of mathematical physics, chemistry, and biology as well as in finance. In this paper we examine the main characteristics of the time-changed Ornstein–Uhlenbeck process, such as the covariance function. Moreover, we also prove the formula for a generalized fractional Fokker–Planck equation that describes the one-dimensional probability density function of the analyzed system. For three cases of subordinators we show the special forms of obtained general formulas. Furthermore, we mention how to simulate the trajectory of the Ornstein–Uhlenbeck process delayed by a general inverse subordinator. (paper)
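
    The sketch below simulates one concrete case of the construction: an Ornstein-Uhlenbeck process subordinated by the inverse of an alpha-stable subordinator, whose increments are generated with Kanter's method. The parameter values and the discretization are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(2)

    def stable_increments(alpha, dt, size):
        # Kanter's method for positive (totally skewed) alpha-stable variables, 0 < alpha < 1.
        theta = rng.uniform(0.0, np.pi, size)
        w = rng.exponential(1.0, size)
        s = (np.sin(alpha * theta) / np.sin(theta) ** (1.0 / alpha)) * \
            (np.sin((1.0 - alpha) * theta) / w) ** ((1.0 - alpha) / alpha)
        return (dt ** (1.0 / alpha)) * s

    def time_changed_ou(T=10.0, n=2000, alpha=0.7, kappa=1.0, sigma=0.5):
        dtau = T / n
        x = np.zeros(n + 1)                           # OU path in operational time tau
        for i in range(n):
            x[i + 1] = x[i] - kappa * x[i] * dtau + sigma * np.sqrt(dtau) * rng.normal()
        # stable subordinator D(tau) and its inverse E(t) = inf{tau : D(tau) > t}
        D = np.concatenate(([0.0], np.cumsum(stable_increments(alpha, dtau, n))))
        t_grid = np.linspace(0.0, 0.9 * D[-1], n)
        idx = np.searchsorted(D, t_grid, side='right') - 1    # discretized inverse time
        return t_grid, x[idx]                          # Y(t) = X(E(t))

    t_grid, y = time_changed_ou()
    print(t_grid[:3], y[:3])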

  18. Generalized Nonlinear Yule Models

    OpenAIRE

    Lansky, Petr; Polito, Federico; Sacerdote, Laura

    2016-01-01

    With the aim of considering models with persistent memory we propose a fractional nonlinear modification of the classical Yule model often studied in the context of macroevolution. Here the model is analyzed and interpreted in the framework of the development of networks such as the World Wide Web. Nonlinearity is introduced by replacing the linear birth process governing the growth of the in-links of each specific webpage with a fractional nonlinear birth process with completely general birth...
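
    For intuition, the sketch below simulates the classical (non-fractional) nonlinear growth of a page's in-links, in which a page with k in-links acquires the next one at a rate proportional to k**gamma; in the fractional model of the abstract the exponential waiting times would be replaced by heavier-tailed ones, which this sketch does not attempt.

    import numpy as np

    rng = np.random.default_rng(3)

    def nonlinear_yule_links(t_max, lam=1.0, gamma=0.8, k0=1):
        k, t = k0, 0.0
        while True:
            t += rng.exponential(1.0 / (lam * k ** gamma))   # waiting time to the next in-link
            if t > t_max:
                return k
            k += 1

    sizes = [nonlinear_yule_links(t_max=5.0) for _ in range(1000)]
    print(np.mean(sizes), np.max(sizes))                     # mean and maximum in-degree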

  19. DYNSYL: a general-purpose dynamic simulator for chemical processes

    International Nuclear Information System (INIS)

    Patterson, G.K.; Rozsa, R.B.

    1978-01-01

    Lawrence Livermore Laboratory is conducting a safeguards program for the Nuclear Regulatory Commission. The goal of the Material Control Project of this program is to evaluate material control and accounting (MCA) methods in plants that handle special nuclear material (SNM). To this end we designed and implemented the dynamic chemical plant simulation program DYNSYL. This program can be used to generate process data or to provide estimates of process performance; it simulates both steady-state and dynamic behavior. The MCA methods that may have to be evaluated range from sophisticated on-line material trackers such as Kalman filter estimators, to relatively simple material balance procedures. This report describes the overall structure of DYNSYL and includes some example problems. The code is still in the experimental stage and revision is continuing

  20. General and special education teachers' relations within teamwork ...

    African Journals Online (AJOL)

    and instruction, assessment and evaluation, and classroom management and behavior. Typically, the ... teaching techniques and learning processes. ... general objective of this research is to establish the relations of general and special.

  1. Referents that support the Pedagogic Professional Performance of the Integral General Professor of Secondary School in the use of informatics in the teaching-learning process

    Directory of Open Access Journals (Sweden)

    Luis Emilio Caro Betancourt

    2008-09-01

    Full Text Available This article addresses the theoretical referents that sustain the professional pedagogical performance of the Integral General Professor of Secondary School in using computer science in the teaching-learning process, taking into account the introduction of scientific and technical developments (computer science) in education and the professional's role arising from the demands of the model conceived for Secondary School Education.

  2. The evolution of robotic general surgery.

    Science.gov (United States)

    Wilson, E B

    2009-01-01

    Surgical robotics in general surgery has a relatively short but very interesting evolution. Just as minimally invasive and laparoscopic techniques have radically changed general surgery and fractionated it into subspecialization, robotic technology is likely to repeat the process of fractionation even further. Though it appears that robotics is growing more quickly in other specialties, the changes digital platforms are causing in the general surgical arena are likely to permanently alter general surgery. This review examines the evolution of robotics in minimally invasive general surgery looking forward to a time where robotics platforms will be fundamental to elective general surgery. Learning curves and adoption techniques are explored. Foregut, hepatobiliary, endocrine, colorectal, and bariatric surgery will be examined as growth areas for robotics, as well as revealing the current uses of this technology.

  3. Trade liberalization, the Mercosur integration process and the agriculture-industry transfers: a general equilibrium analysis

    Directory of Open Access Journals (Sweden)

    Joaquim Bento de Souza Ferreira Filho

    1999-12-01

    Full Text Available This paper deals with the effects of trade liberalization and the Mercosur integration process upon the Brazilian economy, with emphasis on the agricultural and agroindustrial production sectors, under the hypothesis that those phenomena could be another step in the rural-urban transfer process in Brazil. The analysis is conducted through an applied general equilibrium model. Results suggest that trade liberalization would hardly generate a widespread process of rural-urban transfers, although Brazilian agriculture shows up as a loser in the process. Notwithstanding that fact, there are transfers inside the agricultural sectors, where, besides the losses in the value added of the grain production sectors, there would be gains for the livestock and for the "other crops" sectors. The agroindustry, in contrast, seems to gain both in Brazil and Argentina. Model results also suggest that Brazilian society as a whole would benefit from the integration, despite the losses in the agricultural sector.

  4. Generalized rough sets

    International Nuclear Information System (INIS)

    Rady, E.A.; Kozae, A.M.; Abd El-Monsef, M.M.E.

    2004-01-01

    The process of analyzing data under uncertainty is a main goal for many real life problems. Statistical analysis for such data is an interested area for research. The aim of this paper is to introduce a new method concerning the generalization and modification of the rough set theory introduced early by Pawlak [Int. J. Comput. Inform. Sci. 11 (1982) 314

  5. A new iteration process for generalized lipschitz pseudo-contractive and generalized lipschitz accretive mappings

    International Nuclear Information System (INIS)

    Chidume, C.E.; Ofoedu, E.U.

    2007-07-01

    Let K be a nonempty closed convex subset of a real Banach space E. Let T : K → K be a generalized Lipschitz pseudo-contractive mapping such that F(T) := {x ∈ K : Tx = x} ≠ ∅. Let {α_n}, {λ_n} and {θ_n} be real sequences in (0, 1) such that α_n = o(θ_n), lim_{n→∞} λ_n = 0 and λ_n(α_n + θ_n) ≤ 1. For arbitrary x_1 ∈ K, let the sequence {x_n} be iteratively generated by x_{n+1} = (1 − λ_n α_n) x_n + λ_n α_n T x_n − λ_n θ_n (x_n − x_1), n ≥ 1. Then {x_n} is bounded. Moreover, if E is a reflexive Banach space with uniformly Gâteaux differentiable norm and if Σ_{n=1}^∞ λ_n θ_n = ∞ is additionally assumed, then, under mild conditions, {x_n} converges strongly to some x* ∈ F(T). (author)
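
    A small numerical sketch of the stated iteration is given below; the map T and the parameter sequences are example choices satisfying the stated conditions, not taken from the paper, and the printout only checks the boundedness conclusion.

    # x_{n+1} = (1 - lam_n*alpha_n)*x_n + lam_n*alpha_n*T(x_n) - lam_n*theta_n*(x_n - x_1)
    import math

    def T(x):
        return math.cos(x)                    # example nonexpansive (hence pseudo-contractive) map

    x1 = 2.0
    x = x1
    trajectory = [x]
    for n in range(1, 2001):
        alpha_n = 1.0 / (n + 1) ** 0.9        # alpha_n = o(theta_n)
        theta_n = 1.0 / (n + 1) ** 0.5
        lam_n = 0.5 / (n + 1) ** 0.1          # lam_n -> 0 and lam_n*(alpha_n + theta_n) <= 1
        x = (1 - lam_n * alpha_n) * x + lam_n * alpha_n * T(x) - lam_n * theta_n * (x - x1)
        trajectory.append(x)

    # The first conclusion of the theorem is that {x_n} is bounded.
    print(min(trajectory), max(trajectory), trajectory[-1])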

  6. On Generalization in Qualitatively Oriented Research

    Directory of Open Access Journals (Sweden)

    Philipp Mayring

    2007-09-01

    Full Text Available In this article, I open a debate about the importance and possibilities of generalization in qualitatively oriented research. Generalization traditionally is seen as a central aim of science, as a process of theory formulation for further applications. Others criticize the concept in general, either because of the insufficiency of inductive arguments (POPPER, 1959) or because of the context specificity of all scientific findings (LINCOLN & GUBA, 1985). In this paper, I argue that generalization is necessary in qualitative research, but we have to differentiate different aims of generalization: laws, rules, context specific statements, similarities and differences, and procedures. There are different possibilities to arrive at a generalization: analysis of a total population, falsification, random or stratified samples, argumentative generalization, theoretical sampling, variation, and triangulation. Depending on the type of research or research design, some of these strategies of generalization can be important for qualitatively oriented research. This is discussed especially with respect to single case analysis. URN: urn:nbn:de:0114-fqs0703262

  7. Categorization = Decision Making + Generalization

    Science.gov (United States)

    Seger, Carol A; Peterson, Erik J.

    2013-01-01

    We rarely, if ever, repeatedly encounter exactly the same situation. This makes generalization crucial for real world decision making. We argue that categorization, the study of generalizable representations, is a type of decision making, and that categorization learning research would benefit from approaches developed to study the neuroscience of decision making. Similarly, methods developed to examine generalization and learning within the field of categorization may enhance decision making research. We first discuss perceptual information processing and integration, with an emphasis on accumulator models. We then examine learning the value of different decision making choices via experience, emphasizing reinforcement learning modeling approaches. Next we discuss how value is combined with other factors in decision making, emphasizing the effects of uncertainty. Finally, we describe how a final decision is selected via thresholding processes implemented by the basal ganglia and related regions. We also consider how memory related functions in the hippocampus may be integrated with decision making mechanisms and contribute to categorization. PMID:23548891
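
    As a toy illustration of the accumulator view of categorization-as-decision-making, the sketch below runs a simple evidence-race model with a response threshold; all parameters are arbitrary.

    import numpy as np

    rng = np.random.default_rng(4)

    def race_trial(drifts=(0.12, 0.08), noise=0.3, threshold=10.0, dt=1.0):
        acc = np.zeros(len(drifts))
        t = 0
        while acc.max() < threshold:
            acc += np.array(drifts) * dt + noise * np.sqrt(dt) * rng.normal(size=len(drifts))
            acc = np.maximum(acc, 0.0)        # evidence is kept non-negative
            t += 1
        return int(np.argmax(acc)), t         # chosen category and response time

    choices, rts = zip(*(race_trial() for _ in range(500)))
    print(np.mean(np.array(choices) == 0), np.mean(rts))   # choice proportion and mean RT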

  8. Consultant-led, multidisciplinary balance clinic: process evaluation of a specialist model of care in a district general hospital.

    Science.gov (United States)

    Trinidade, A; Yung, M W

    2014-04-01

    A specialist balance clinic to effectively deal with dizzy patients is recommended by ENT-UK. We audit the patient pathway before and following the introduction of a consultant-led dedicated balance clinic. Process evaluation and audit. ENT outpatients department of a district general hospital. The journey of dizzy patients seen in the general ENT clinic was mapped from case notes and recorded retrospectively. A consultant-led, multidisciplinary balance clinic involving an otologist, a senior audiologist and a neurophysiotherapist was then set up, and the journey was prospectively recorded and compared with that before the change. Of the 44 dizzy patients seen in the general clinic, 41% had further follow-up consultations; 64% were given definitive or provisional diagnoses; 75% were discharged without a management plan. Oculomotor examination was not systematically performed. The mean interval between Visits 1 and 2 was 8.4 weeks and the mean number of visits was 3. In the consultant-led dedicated balance clinic, following Visit 1, only 8% of patients required follow-up; 97% received definitive diagnoses, which guided management; all patients left with definitive management plans in place. In all patients, oculomotor assessment was systematically performed and all patients received consultant and, where necessary, allied healthcare professional input. By standardising the management experience for dizzy patients, appropriate and timely treatment can be achieved, allowing for a more seamless and efficient patient journey from referral to treatment. A multidisciplinary balance clinic led by a consultant otologist is the ideal way to achieve this. © 2014 John Wiley & Sons Ltd.

  9. Experienced General Music Teachers' Instructional Decision Making

    Science.gov (United States)

    Johnson, Daniel C.; Matthews, Wendy K.

    2017-01-01

    The purpose of this descriptive study was to explore experienced general music teachers' decision-making processes. Participants included seven experienced, American general music teachers who contributed their views during two phases of data collection: (1) responses to three classroom scenarios; and (2) in-depth, semi-structured, follow-up…

  10. A General Representation Theorem for Integrated Vector Autoregressive Processes

    DEFF Research Database (Denmark)

    Franchi, Massimo

    We study the algebraic structure of an I(d) vector autoregressive process, where d is restricted to be an integer. This is useful to characterize its polynomial cointegrating relations and its moving average representation, that is to prove a version of the Granger representation theorem valid...

  11. Glauber model and its generalizations

    International Nuclear Information System (INIS)

    Bialkowski, G.

    The physical aspects of Glauber model problems are studied: the potential model, profile function and Feynman diagram approaches. Different generalizations of the Glauber model are discussed, particularly higher and lower energy processes and large angles.

  12. A generalized perturbation program for CANDU reactor

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Do Heon; Kim, Jong Kyung [Hanyang University, Seoul (Korea, Republic of); Choi, Hang Bok; Roh, Gyu Hong [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of); Yang, Won Sik [Chosun University, Kwangju (Korea, Republic of)

    1999-12-31

    A generalized perturbation program has been developed for the purpose of estimating zonal power variation of a CANDU reactor upon refueling operation. The forward and adjoint calculation modules of RFSP code were used to construct the generalized perturbation program. The numerical algorithm for the generalized adjoint flux calculation was verified by comparing the zone power estimates upon refueling with those of forward calculation. It was, however, noticed that the truncation error from the iteration process of the generalized adjoint flux is not negligible. 2 refs., 1 figs., 1 tab. (Author)

  13. Generalized Boolean logic Driven Markov Processes: A powerful modeling framework for Model-Based Safety Analysis of dynamic repairable and reconfigurable systems

    International Nuclear Information System (INIS)

    Piriou, Pierre-Yves; Faure, Jean-Marc; Lesage, Jean-Jacques

    2017-01-01

    This paper presents a modeling framework that makes it possible to describe, in an integrated manner, the structure of the critical system to analyze (using an enriched fault tree), the dysfunctional behavior of its components (by means of Markov processes), and the reconfiguration strategies planned to ensure safety and availability (with Moore machines). This framework has been developed from BDMP (Boolean logic Driven Markov Processes), a previous framework for dynamic repairable systems. First, the contribution is motivated by pinpointing the limitations of BDMP to model complex reconfiguration strategies and the failures of the control of these strategies. The syntax and semantics of GBDMP (Generalized Boolean logic Driven Markov Processes) are then formally defined; in particular, an algorithm to analyze the dynamic behavior of a GBDMP model is developed. The modeling capabilities of this framework are illustrated on three representative examples. Last, qualitative and quantitative analyses of GBDMP models highlight the benefits of the approach.
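
    To make the ingredients concrete, the sketch below is a toy Monte Carlo in the spirit of BDMP-style models, not GBDMP itself: two repairable components with exponential failure and repair rates feed an AND gate, and the probability that both are simultaneously down at a chosen time is estimated. All rates are arbitrary.

    import numpy as np

    rng = np.random.default_rng(5)

    def simulate_component(t_end, fail_rate, repair_rate):
        # Alternating up/down renewal process; returns the list of down intervals.
        t, up, down_intervals = 0.0, True, []
        while t < t_end:
            dwell = rng.exponential(1.0 / (fail_rate if up else repair_rate))
            if not up:
                down_intervals.append((t, min(t + dwell, t_end)))
            t += dwell
            up = not up
        return down_intervals

    def both_down_at(t, intervals_a, intervals_b):
        down = lambda iv: any(a <= t < b for a, b in iv)
        return down(intervals_a) and down(intervals_b)

    t_obs, n_runs, hits = 100.0, 2000, 0
    for _ in range(n_runs):
        ia = simulate_component(t_obs + 1.0, fail_rate=0.01, repair_rate=0.2)
        ib = simulate_component(t_obs + 1.0, fail_rate=0.02, repair_rate=0.2)
        hits += both_down_at(t_obs, ia, ib)
    print(hits / n_runs)                      # estimated AND-gate unavailability at t = 100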

  14. Quantum Networks: General theory and applications

    International Nuclear Information System (INIS)

    Bisio, A.; D'Ariano, G. M.; Perinotti, P.; Chiribella, G.

    2011-01-01

    In this work we present a general mathematical framework to deal with Quantum Networks, i.e. networks resulting from the interconnection of elementary quantum circuits. The cornerstone of our approach is a generalization of the Choi isomorphism that allows one to efficiently represent any given Quantum Network in terms of a single positive operator. Our formalism allows one to face and solve many quantum information processing problems that would be hardly manageable otherwise, the most relevant of which are reviewed in this work: quantum process tomography, quantum cloning and learning of transformations, inversion of a unitary gate, information-disturbance tradeoff in estimating a unitary transformation, cloning and learning of a measurement device (Authors)

  15. Generalized File Management System or Proto-DBMS?

    Science.gov (United States)

    Braniff, Tom

    1979-01-01

    The use of a data base management system (DBMS) as opposed to traditional data processing is discussed. The generalized file concept is viewed as an entry level step to the DBMS. The transition process from one system to the other is detailed. (SF)

  17. Accreditation in general practice in Denmark

    DEFF Research Database (Denmark)

    Andersen, Merethe K; Pedersen, Line B; Siersma, Volkert

    2017-01-01

    Background: Accreditation is used increasingly in health systems worldwide. However, there is a lack of evidence on the effects of accreditation, particularly in general practice. In 2016 a mandatory accreditation scheme was initiated in Denmark, and during a 3-year period all practices, as default...... general practitioners in Denmark. Practices allocated to accreditation in 2016 serve as the intervention group, and practices allocated to accreditation in 2018 serve as controls. The selected outcomes should meet the following criteria: (1) a high degree of clinical relevance; (2) the possibility...... practice and mortality. All outcomes relate to quality indicators included in the Danish Healthcare Quality Program, which is based on general principles for accreditation. Discussion: The consequences of accreditation and standard-setting processes are generally under-researched, particularly in general...

  18. General principles of the quality management

    International Nuclear Information System (INIS)

    Koutaniemi, P.

    2005-01-01

    The objective of the presentation is to outline some general infrastructure of the nuclear industry with regard to quality management; to emphasise the significance of safety management as an integral part of quality management; and to highlight different steps of the management process in near-term working, at an annual level, and as a strategic process.

  19. A general audiovisual temporal processing deficit in adult readers with dyslexia

    NARCIS (Netherlands)

    Francisco, A.A.; Jesse, A.; Groen, M.A.; McQueen, J.M.

    2017-01-01

    Purpose: Because reading is an audiovisual process, reading impairment may reflect an audiovisual processing deficit. The aim of the present study was to test the existence and scope of such a deficit in adult readers with dyslexia. Method: We tested 39 typical readers and 51 adult readers with

  20. Regeneration and general Markov chains

    Directory of Open Access Journals (Sweden)

    Vladimir V. Kalashnikov

    1994-01-01

    Full Text Available Ergodicity, continuity, finite approximations and rare visits of general Markov chains are investigated. The obtained results permit further quantitative analysis of characteristics such as rates of convergence, continuity (measured as a distance between perturbed and non-perturbed characteristics), deviations between Markov chains, accuracy of approximations and bounds on the distribution function of the first visit time to a chosen subset, etc. The underlying techniques use the embedding of the general Markov chain into a wide-sense regenerative process with the help of the splitting construction.

  1. The theory, practice, and future of process improvement in general thoracic surgery.

    Science.gov (United States)

    Freeman, Richard K

    2014-01-01

    Process improvement, in its broadest sense, is the analysis of a given set of actions with the aim of elevating quality and reducing costs. The tenets of process improvement have been applied to medicine with increasing frequency for at least the last quarter century, including in thoracic surgery. This review outlines the theory underlying process improvement, the currently available data sources for process improvement and possible future directions of research. Copyright © 2015 Elsevier Inc. All rights reserved.

  2. Cognitive load and emotional processing in generalized anxiety disorder: electrocortical evidence for increased distractibility.

    Science.gov (United States)

    MacNamara, Annmarie; Proudfit, Greg Hajcak

    2014-08-01

    Generalized anxiety disorder (GAD) may be characterized by emotion regulation deficits attributable to an imbalance between top-down (i.e., goal-driven) and bottom-up (i.e., stimulus-driven) attention. In prior work, these attentional processes were examined by presenting unpleasant and neutral pictures within a working memory paradigm. The late positive potential (LPP) measured attention toward task-irrelevant pictures. Results from this prior work showed that working memory load reduced the LPP across participants; however, this effect was attenuated for individuals with greater self-reported state anxiety, suggesting reduced top-down control. In the current study, the same paradigm was used with 106 medication-free female participants-71 with GAD and 35 without GAD. Unpleasant pictures elicited larger LPPs, and working memory load reduced the picture-elicited LPP. Compared with healthy controls, participants with GAD showed large LPPs to unpleasant pictures presented under high working memory load. Self-reported symptoms of anhedonic depression were related to a reduced effect of working memory load on the LPP elicited by neutral pictures. These results indicate that individuals with GAD show less flexible modulation of attention when confronted with unpleasant stimuli. Furthermore, among those with GAD, anhedonic depression may broaden attentional deficits to neutral distracters. (c) 2014 APA, all rights reserved.

  3. Nigerian Journal of General Practice: About this journal

    African Journals Online (AJOL)


  4. 'It Opened My Eyes'-examining the impact of a multifaceted chlamydia testing intervention on general practitioners using Normalization Process Theory.

    Science.gov (United States)

    Yeung, Anna; Hocking, Jane; Guy, Rebecca; Fairley, Christopher K; Smith, Kirsty; Vaisey, Alaina; Donovan, Basil; Imrie, John; Gunn, Jane; Temple-Smith, Meredith

    2018-03-28

    Chlamydia is the most common notifiable sexually transmissible infection in Australia. Left untreated, it can develop into pelvic inflammatory disease and infertility. The majority of notifications come from general practice and it is ideally situated to test young Australians. The Australian Chlamydia Control Effectiveness Pilot (ACCEPt) was a multifaceted intervention that aimed to reduce chlamydia prevalence by increasing testing in 16- to 29-year-olds attending general practice. GPs were interviewed to describe the effectiveness of the ACCEPt intervention in integrating chlamydia testing into routine practice using Normalization Process Theory (NPT). GPs were purposively selected based on age, gender, geographic location and size of practice at baseline and midpoint. Interview data were analysed regarding the intervention components and results were interpreted using NPT. A total of 44 GPs at baseline and 24 at midpoint were interviewed. Most GPs reported offering a test based on age at midpoint versus offering a test based on symptoms or patient request at baseline. Quarterly feedback was the most significant ACCEPt component for facilitating a chlamydia test. The ACCEPt intervention has been able to moderately normalize chlamydia testing among GPs, although the components had varying levels of effectiveness. NPT can demonstrate the effective implementation of an intervention in general practice and has been valuable in understanding which components are essential and which components can be improved upon.

  5. Definition of Nonequilibrium Entropy of General Systems

    OpenAIRE

    Mei, Xiaochun

    1999-01-01

    The definition of nonequilibrium entropy is provided for the general nonequilibrium processes by connecting thermodynamics with statistical physics, and the principle of entropy increment in the nonequilibrium processes is also proved in the paper. The result shows that the definition of nonequilibrium entropy is not unique.

  6. CO-PRODUCT ENHANCEMENT AND DEVELOPMENT FOR THE MASADA OXYNOL PROCESS

    Energy Technology Data Exchange (ETDEWEB)

    Donald V. Watkins

    2010-06-14

    The focus of this project was overall process improvement through the enhancement of the co-product streams. Enhancing the process operations and co-products will increase both ethanol production and the value of other process outputs and reduce the amount of waste byproducts. This leads to a more economical and environmentally sound alternative to landfill disposal of municipal solid waste (MSW). These enhancements can greatly increase the commercial potential for the production of ethanol from MSW by the Masada CES OxyNol process. Both technological and economic issues were considered for steps throughout the conversion process. The research efforts of this project are varied but synergistic. The project investigated many of the operations involved in the Masada process with the overall goal of process improvement. The general goal of the testing was to improve co-product quality, improve conversion efficiencies, minimize process losses, increase energy efficiency, and mitigate process and commercialization risks. The project was divided into 16 subtasks as described in general terms below. All these tasks are interrelated but not necessarily interdependent.

  7. A Generalized Polynomial Chaos-Based Approach to Analyze the Impacts of Process Deviations on MEMS Beams.

    Science.gov (United States)

    Gao, Lili; Zhou, Zai-Fa; Huang, Qing-An

    2017-11-08

    A microstructure beam is one of the fundamental elements in MEMS devices like cantilever sensors, RF/optical switches, varactors, resonators, etc. It is still difficult to precisely predict the performance of MEMS beams with the currently available simulators due to the inevitable process deviations. Feasible numerical methods are required and can be used to improve the yield and profits of MEMS devices. In this work, process deviations are considered to be stochastic variables, and a newly developed numerical method, i.e., generalized polynomial chaos (GPC), is applied for the simulation of the MEMS beam. The doubly clamped polybeam has been utilized to verify the accuracy of GPC, compared with our Monte Carlo (MC) approaches. Performance predictions have been made on the residual stress by obtaining its distributions in GaAs Monolithic Microwave Integrated Circuit (MMIC)-based MEMS beams. The results show that errors are within 1% for the GPC approximations compared with the MC simulations. Appropriate choices of the 4th-order GPC expansions with orthogonal terms have also succeeded in reducing the MC simulation labor. The mean value of the residual stress, concluded from experimental tests, shares an error of about 1.1% with that of the 4th-order GPC method. There is a probability of around 54.3% for the 4th-order GPC approximation to attain the mean test value of the residual stress. The corresponding yield is over 90 percent within two standard deviations of the mean.
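
    The following sketch illustrates the GPC-versus-Monte-Carlo workflow on a stand-in scalar response; the response function, its coefficients and the choice of a 4th-order expansion are assumptions made only for illustration and do not reproduce the authors' MEMS beam model.

```python
import numpy as np
from numpy.polynomial import hermite_e as He
from math import factorial, sqrt, pi

def response(xi):
    # hypothetical nonlinear dependence of a "residual stress"-like quantity
    # on one standardized process deviation xi ~ N(0, 1)
    return 100.0 + 8.0 * xi + 1.5 * xi**2 + 0.3 * np.sin(xi)

ORDER = 4
nodes, weights = He.hermegauss(30)        # Gauss quadrature for weight exp(-x^2/2)
norm = sqrt(2.0 * pi)

# Project the response onto probabilists' Hermite polynomials He_n:
# c_n = E[y(xi) He_n(xi)] / n!
coeffs = []
for n in range(ORDER + 1):
    basis_n = He.hermeval(nodes, [0] * n + [1])
    c_n = np.sum(weights * response(nodes) * basis_n) / (norm * factorial(n))
    coeffs.append(c_n)

# GPC statistics: mean = c_0, variance = sum_{n>=1} n! c_n^2
gpc_mean = coeffs[0]
gpc_var = sum(factorial(n) * coeffs[n] ** 2 for n in range(1, ORDER + 1))

# Plain Monte Carlo reference on the same stand-in response
rng = np.random.default_rng(0)
samples = response(rng.standard_normal(200_000))
print("GPC mean/std:", gpc_mean, np.sqrt(gpc_var))
print("MC  mean/std:", samples.mean(), samples.std())
```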

  8. A Generalized Polynomial Chaos-Based Approach to Analyze the Impacts of Process Deviations on MEMS Beams

    Directory of Open Access Journals (Sweden)

    Lili Gao

    2017-11-01

    Full Text Available A microstructure beam is one of the fundamental elements in MEMS devices like cantilever sensors, RF/optical switches, varactors, resonators, etc. It is still difficult to precisely predict the performance of MEMS beams with the currently available simulators due to the inevitable process deviations. Feasible numerical methods are required and can be used to improve the yield and profits of MEMS devices. In this work, process deviations are considered to be stochastic variables, and a newly developed numerical method, i.e., generalized polynomial chaos (GPC), is applied for the simulation of the MEMS beam. The doubly clamped polybeam has been utilized to verify the accuracy of GPC, compared with our Monte Carlo (MC) approaches. Performance predictions have been made on the residual stress by obtaining its distributions in GaAs Monolithic Microwave Integrated Circuit (MMIC)-based MEMS beams. The results show that errors are within 1% for the GPC approximations compared with the MC simulations. Appropriate choices of the 4th-order GPC expansions with orthogonal terms have also succeeded in reducing the MC simulation labor. The mean value of the residual stress, concluded from experimental tests, shares an error of about 1.1% with that of the 4th-order GPC method. There is a probability of around 54.3% for the 4th-order GPC approximation to attain the mean test value of the residual stress. The corresponding yield is over 90 percent within two standard deviations of the mean.

  9. The General Aggression Model.

    Science.gov (United States)

    Allen, Johnie J; Anderson, Craig A; Bushman, Brad J

    2018-02-01

    The General Aggression Model (GAM) is a comprehensive, integrative, framework for understanding aggression. It considers the role of social, cognitive, personality, developmental, and biological factors on aggression. Proximate processes of GAM detail how person and situation factors influence cognitions, feelings, and arousal, which in turn affect appraisal and decision processes, which in turn influence aggressive or nonaggressive behavioral outcomes. Each cycle of the proximate processes serves as a learning trial that affects the development and accessibility of aggressive knowledge structures. Distal processes of GAM detail how biological and persistent environmental factors can influence personality through changes in knowledge structures. GAM has been applied to understand aggression in many contexts including media violence effects, domestic violence, intergroup violence, temperature effects, pain effects, and the effects of global climate change. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. A General Audiovisual Temporal Processing Deficit in Adult Readers with Dyslexia

    Science.gov (United States)

    Francisco, Ana A.; Jesse, Alexandra; Groen, Margriet A.; McQueen, James M.

    2017-01-01

    Purpose: Because reading is an audiovisual process, reading impairment may reflect an audiovisual processing deficit. The aim of the present study was to test the existence and scope of such a deficit in adult readers with dyslexia. Method: We tested 39 typical readers and 51 adult readers with dyslexia on their sensitivity to the simultaneity of…

  11. 19 CFR 210.27 - General provisions governing discovery.

    Science.gov (United States)

    2010-04-01

    Title 19 (Customs Duties), United States International Trade Commission, Investigations of Unfair Practices in Import Trade, Adjudication and Enforcement, Discovery and Compulsory Process: § 210.27 General provisions governing discovery.

  12. Processing of FRG high-temperature gas-cooled reactor fuel elements at General Atomic under the US/FRG cooperative agreement for spent fuel elements

    International Nuclear Information System (INIS)

    Holder, N.D.; Strand, J.B.; Schwarz, F.A.; Drake, R.N.

    1981-11-01

    The Federal Republic of Germany (FRG) and the United States (US) are cooperating on certain aspects of gas-cooled reactor technology under an umbrella agreement. Under the spent fuel treatment development section of the agreement, both FRG mixed uranium/thorium and low-enriched uranium fuel spheres have been processed in the Department of Energy-sponsored cold pilot plant for high-temperature gas-cooled reactor (HTGR) fuel processing at General Atomic Company in San Diego, California. The FRG fuel spheres were crushed and burned to recover coated fuel particles suitable for further treatment for uranium recovery. Successful completion of the tests described in this paper demonstrated certain modifications to the US HTGR fuel burning process necessary for FRG fuel treatment. Results of the tests will be used in the design of a US/FRG joint prototype headend facility for HTGR fuel

  13. Maltreatment and Delinquency in China: Examining and Extending the Intervening Process of General Strain Theory.

    Science.gov (United States)

    Gao, Yunjiao; Wong, Dennis S W; Yu, Yanping

    2016-01-01

    Using a sample of 1,163 adolescents from four middle schools in China, this study explores the intervening process of how adolescent maltreatment is related to delinquency within the framework of general strain theory (GST) by comparing two models. The first model is Agnew's integrated model of GST, which examines the mediating effects of social control, delinquent peer affiliation, state anger, and depression on the relationship between maltreatment and delinquency. Based on this model, with the intent to further explore the mediating effects of state anger and depression and to investigate whether their effects on delinquency can be demonstrated more through delinquent peer affiliation and social control, an extended model (Model 2) is proposed by the authors. The second model relates state anger to delinquent peer affiliation and state depression to social control. By comparing the fit indices and the significance of the hypothesized paths of the two models, the study found that the extended model can better reflect the mechanism of how maltreatment contributes to delinquency, whereas the original integrated GST model only receives partial support because of its failure to find the mediating effects of state negative emotions. © The Author(s) 2014.

  14. General Properties of Scattering Matrix for Mode Conversion Process between B Waves and External EM Waves and Their Consequence to Experiments

    International Nuclear Information System (INIS)

    Maekawa, T.; Tanaka, H.; Uchida, M.; Igami, H.

    2003-01-01

    General properties of the scattering matrix, which governs the mode conversion process between electron Bernstein (B) waves and external electromagnetic (EM) waves in the presence of a steep density gradient, are theoretically analyzed. Based on the analysis, polarization adjustment of incident EM waves for optimal mode conversion to B waves is possible and effective for a range of density gradients near the upper hybrid resonance that is not covered by the previously proposed schemes of perpendicular injection of the X mode and oblique injection of the O mode. Furthermore, the analysis shows that the polarization of the externally emitted EM waves from B waves is uniquely related to the optimized polarization of incident EM waves for B wave heating, and that the mode conversion rate is the same for both the emission process and the injection with the optimized polarization.

  15. A combined approach of generalized additive model and bootstrap with small sample sets for fault diagnosis in fermentation process of glutamate.

    Science.gov (United States)

    Liu, Chunbo; Pan, Feng; Li, Yun

    2016-07-29

    Glutamate is of great importance in food and pharmaceutical industries. There is still a lack of effective statistical approaches for fault diagnosis in the fermentation process of glutamate. To date, the statistical approach based on generalized additive model (GAM) and bootstrap has not been used for fault diagnosis in fermentation processes, much less in the fermentation process of glutamate with small sample sets. A combined approach of GAM and bootstrap was developed for the online fault diagnosis in the fermentation process of glutamate with small sample sets. GAM was first used to model the relationship between glutamate production and different fermentation parameters using online data from four normal fermentation experiments of glutamate. The fitted GAM with fermentation time, dissolved oxygen, oxygen uptake rate and carbon dioxide evolution rate captured 99.6 % of the variance of glutamate production during the fermentation process. Bootstrap was then used to quantify the uncertainty of the estimated production of glutamate from the fitted GAM using a 95 % confidence interval. The proposed approach was then used for the online fault diagnosis in the abnormal fermentation processes of glutamate, and a fault was defined as the estimated production of glutamate falling outside the 95 % confidence interval. The online fault diagnosis based on the proposed approach identified not only the start of the fault in the fermentation process, but also the end of the fault when the fermentation conditions were back to normal. The proposed approach used only small sample sets from normal fermentation experiments to establish the model, and then required only online recorded data on fermentation parameters for fault diagnosis in the fermentation process of glutamate. The proposed approach based on GAM and bootstrap provides a new and effective way for the fault diagnosis in the fermentation process of glutamate with small sample sets.
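
    The fault-flagging logic described above can be sketched with a much simpler stand-in model: a one-variable polynomial fit replaces the multi-parameter GAM, and a residual bootstrap provides an approximate 95 % band outside of which observations are flagged. All data and parameters below are synthetic assumptions, not the authors' fermentation data.

```python
import numpy as np

rng = np.random.default_rng(42)
t = np.linspace(0, 30, 120)                        # fermentation time [h], assumed
true_curve = 80 / (1 + np.exp(-(t - 15) / 3))      # hypothetical production curve
y_normal = true_curve + rng.normal(0, 2, t.size)   # synthetic "normal" training run

def fit_predict(x, y, x_new, degree=5):
    # stand-in smoother: a polynomial fit instead of the paper's GAM
    return np.polyval(np.polyfit(x, y, degree), x_new)

# Residual bootstrap: refit on resampled residuals and add observation noise
# back in, giving an approximate 95 % prediction band for new observations.
fitted = fit_predict(t, y_normal, t)
residuals = y_normal - fitted
boot_preds = []
for _ in range(500):
    y_boot = fitted + rng.choice(residuals, size=t.size, replace=True)
    pred = fit_predict(t, y_boot, t)
    boot_preds.append(pred + rng.choice(residuals, size=t.size, replace=True))
lo, hi = np.percentile(boot_preds, [2.5, 97.5], axis=0)

# A new (possibly abnormal) run is flagged wherever it leaves the band.
# Isolated flags are expected at roughly the 5 % level; a sustained excursion
# indicates a fault.
y_new = true_curve + rng.normal(0, 2, t.size)
y_new[70:90] -= 15                                 # injected fault period
fault = (y_new < lo) | (y_new > hi)
print(f"{fault.sum()} of {fault.size} time points flagged as outside the 95 % band")
```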

  16. Structuring diabetes care in general practices: many improvements, remaining challenges.

    LENUS (Irish Health Repository)

    Jennings, S

    2009-08-07

    BACKGROUND: For people with type 2 diabetes to enjoy improved longevity and quality of life, care needs to be organised in a systematic way. AIM: To test if processes and intermediate outcomes for patients with type 2 diabetes changed with the move to structured care in general practice shared with secondary care. METHODS: An audit of process and intermediate outcomes for patients with type 2 diabetes before and after the change to structured care in 10 Dublin general practices shared with secondary care, four years on. RESULTS: Structured diabetes care in general practice has led to more dedicated clinics, improved processes of care and increased access to multidisciplinary expertise. Improvements in blood pressure control, the use of aspirin and the use of lipid-lowering agents indicate a significant decrease in absolute risk of vascular events for this population. CONCLUSIONS: Structured care in general practice improves intermediate outcomes for people with type 2 diabetes. Further improvements need to be made to reach international targets.

  17. Implementing telephone triage in general practice: a process evaluation of a cluster randomised controlled trial.

    Science.gov (United States)

    Murdoch, Jamie; Varley, Anna; Fletcher, Emily; Britten, Nicky; Price, Linnie; Calitri, Raff; Green, Colin; Lattimer, Valerie; Richards, Suzanne H; Richards, David A; Salisbury, Chris; Taylor, Rod S; Campbell, John L

    2015-04-10

    Telephone triage represents one strategy to manage demand for face-to-face GP appointments in primary care. However, limited evidence exists of the challenges GP practices face in implementing telephone triage. We conducted a qualitative process evaluation alongside a UK-based cluster randomised trial (ESTEEM) which compared the impact of GP-led and nurse-led telephone triage with usual care on primary care workload, cost, patient experience, and safety for patients requesting a same-day GP consultation. The aim of the process study was to provide insights into the observed effects of the ESTEEM trial from the perspectives of staff and patients, and to specify the circumstances under which triage is likely to be successfully implemented. Here we report perspectives of staff. The intervention comprised implementation of either GP-led or nurse-led telephone triage for a period of 2-3 months. A qualitative evaluation was conducted using interviews with staff recruited from eight general practices (4 GP triage, 4 Nurse triage) in the UK, implementing triage as part of the ESTEEM trial. Qualitative interviews were undertaken with 44 staff members in GP triage and nurse triage practices (16 GPs, 8 nurses, 7 practice managers, 13 administrative staff). Staff reported diverse experiences and perceptions regarding the implementation of telephone triage, its effects on workload, and on the benefits of triage. Such diversity was explained by the different ways triage was organised, the staffing models used to support triage, how the introduction of triage was communicated across practice staff, and by how staff roles were reconfigured as a result of implementing triage. The findings from the process evaluation offer insight into the range of ways GP practices participating in ESTEEM implemented telephone triage, and the circumstances under which telephone triage can be successfully implemented beyond the context of a clinical trial. Staff experiences and perceptions of telephone

  18. 22 CFR 510.1 - Service of process.

    Science.gov (United States)

    2010-04-01

    Title 22 (Foreign Relations), Broadcasting Board of Governors, Service of Process: § 510.1 Service of process. (a) The General... accepting service of process for an employee in his/her official capacity, the General Counsel or his/her...

  19. The general dynamic model

    DEFF Research Database (Denmark)

    Borregaard, Michael K.; Matthews, Thomas J.; Whittaker, Robert James

    2016-01-01

    Aim: Island biogeography focuses on understanding the processes that underlie a set of well-described patterns on islands, but it lacks a unified theoretical framework for integrating these processes. The recently proposed general dynamic model (GDM) of oceanic island biogeography offers a step...... towards this goal. Here, we present an analysis of causality within the GDM and investigate its potential for the further development of island biogeographical theory. Further, we extend the GDM to include subduction-based island arcs and continental fragment islands. Location: A conceptual analysis...... of evolutionary processes in simulations derived from the mechanistic assumptions of the GDM corresponded broadly to those initially suggested, with the exception of trends in extinction rates. Expanding the model to incorporate different scenarios of island ontogeny and isolation revealed a sensitivity...

  20. Monte Carlo Simulation of Markov, Semi-Markov, and Generalized Semi- Markov Processes in Probabilistic Risk Assessment

    Science.gov (United States)

    English, Thomas

    2005-01-01

    A standard tool of reliability analysis used at NASA-JSC is the event tree. An event tree is simply a probability tree, with the probabilities determining the next step through the tree specified at each node. The nodal probabilities are determined by a reliability study of the physical system at work for a particular node. The reliability study performed at a node is typically referred to as a fault tree analysis, with the potential of a fault tree existing for each node on the event tree. When examining an event tree it is obvious why the event tree/fault tree approach has been adopted. Typical event trees are quite complex in nature, and the event tree/fault tree approach provides a systematic and organized approach to reliability analysis. The purpose of this study was twofold. Firstly, we wanted to explore the possibility that a semi-Markov process can create dependencies between sojourn times (the times it takes to transition from one state to the next) that can decrease the uncertainty when estimating times to failure. Using a generalized semi-Markov model, we studied a four-element reliability model and were able to demonstrate such sojourn time dependencies. Secondly, we wanted to study the use of semi-Markov processes to introduce a time variable into the event tree diagrams that are commonly developed in PRA (Probabilistic Risk Assessment) analyses. Event tree end states which change with time are more representative of failure scenarios than are the usual static probability-derived end states.
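
    A minimal sketch of the kind of semi-Markov Monte Carlo simulation discussed above is given below; the states, transition probabilities and Weibull sojourn-time parameters are invented examples and are unrelated to the NASA-JSC models.

```python
import random

# Small semi-Markov process: non-exponential (Weibull) sojourn times whose
# parameters depend on the state, with an absorbing "failed" state.
P = {                       # transition probabilities between states (assumed)
    "nominal":  [("degraded", 0.7), ("failed", 0.3)],
    "degraded": [("nominal", 0.4), ("failed", 0.6)],
}
SOJOURN = {                 # Weibull (shape, scale) of time spent in a state (assumed)
    "nominal":  (1.5, 500.0),
    "degraded": (0.8, 100.0),
}

def sample_time_to_failure(rng):
    state, t = "nominal", 0.0
    while state != "failed":
        shape, scale = SOJOURN[state]
        t += rng.weibullvariate(scale, shape)   # random.weibullvariate(scale, shape)
        r, acc = rng.random(), 0.0
        for nxt, p in P[state]:
            acc += p
            if r <= acc:
                state = nxt
                break
    return t

rng = random.Random(7)
ttf = [sample_time_to_failure(rng) for _ in range(50_000)]
print("estimated mean time to failure:", sum(ttf) / len(ttf))
```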

  1. From General Game Descriptions to a Market Specification Language for General Trading Agents

    Science.gov (United States)

    Thielscher, Michael; Zhang, Dongmo

    The idea behind General Game Playing is to build systems that, instead of being programmed for one specific task, are intelligent and flexible enough to negotiate an unknown environment solely on the basis of the rules which govern it. In this paper, we argue that this principle has great potential to bring artificially intelligent systems in other application areas to a new level as well. Our specific interest lies in General Trading Agents, which are able to understand the rules of unknown markets and then to actively participate in them without human intervention. To this end, we extend the general Game Description Language into a language that allows one to formally describe arbitrary markets in such a way that these specifications can be automatically processed by a computer. We present both the syntax and a transition-based semantics for this Market Specification Language and illustrate its expressive power by presenting axiomatizations of several well-known auction types.

  2. Implementing guidelines in general practice: evaluation of process and outcome of care in chronic diseases.

    NARCIS (Netherlands)

    Schellevis, F.G.; Eijk, J.T.M. van; Lisdonk, E.H. van de; Velden, J. van der; Weel, C. van

    1994-01-01

    In a prospective longitudinal study over 21 months, the performance of general practitioners and the disease status of their patients were measured during the formulation and implementation of guidelines on follow-up care. Data on 15 general practitioners and on 613 patients with hypertension, 95 with

  3. Applications of Fourier transforms to generalized functions

    CERN Document Server

    Rahman, M

    2011-01-01

    This book explains how Fourier transforms can be applied to generalized functions. The generalized function is one of the important branches of mathematics and is applicable in many practical fields. Its applications to the theory of distribution and signal processing are especially important. The Fourier transform is a mathematical procedure that can be thought of as transforming a function from its time domain to the frequency domain.The book contains six chapters and three appendices. Chapter 1 deals with preliminary remarks on Fourier series from a general point of view and also contains an introduction to the first generalized function. Chapter 2 is concerned with the generalized functions and their Fourier transforms. Chapter 3 contains the Fourier transforms of particular generalized functions. The author has stated and proved 18 formulas dealing with the Fourier transforms of generalized functions, and demonstrated some important problems of practical interest. Chapter 4 deals with the asymptotic esti...
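
    For orientation, a few of the standard distributional Fourier-transform pairs treated in texts of this kind are written below, under the convention F[f](ω) = ∫ f(t) e^{-iωt} dt (conventions differ between books):

```latex
% Convention: \mathcal{F}[f](\omega) = \int_{-\infty}^{\infty} f(t)\, e^{-i\omega t}\, dt
\[
\mathcal{F}[\delta(t)] = 1, \qquad
\mathcal{F}[1] = 2\pi\,\delta(\omega), \qquad
\mathcal{F}[\operatorname{sgn}(t)] = \frac{2}{i\omega}, \qquad
\mathcal{F}[H(t)] = \pi\,\delta(\omega) + \frac{1}{i\omega},
\]
% with H the Heaviside step function and the 1/\omega terms understood as
% Cauchy principal values.
```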

  4. Uranium tetrafluoride reduction closed bomb. Part I: Reduction process general conditions

    International Nuclear Information System (INIS)

    Anca Abati, R.; Lopez Rodriguez, M.

    1961-01-01

    General conditions of the metallothermic reduction in small bombs (250 and 800 g of uranium) have been investigated. Factors such as the kind and granulometry of the magnesium used, the magnesium excess and the preheating temperature, which affect yields and metal quality, have been considered. The magnesium excess increased yields by about 15% in the small bomb; as for the preheating temperature, there is a range within which yields and metal quality do not change. All tests have been made with graphite linings. (Author) 18 refs.

  5. Generalization of Vaidya's radiation metric

    Energy Technology Data Exchange (ETDEWEB)

    Gleiser, R J; Kozameh, C N [Universidad Nacional de Cordoba (Argentina). Instituto de Matematica, Astronomia y Fisica

    1981-11-01

    In this paper it is shown that if Vaidya's radiation metric is considered from the point of view of kinetic theory in general relativity, the corresponding phase space distribution function can be generalized in a particular way. The new family of spherically symmetric radiation metrics obtained contains Vaidya's as a limiting situation. The Einstein field equations are solved in a ''comoving'' coordinate system. Two arbitrary functions of a single variable are introduced in the process of solving these equations. Particular examples considered are a stationary solution, a nonvacuum solution depending on a single parameter, and several limiting situations.
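
    For reference, the standard outgoing Vaidya line element, to which the generalized family of the paper reduces in a limiting situation, can be written as follows (sign and coordinate conventions vary):

```latex
% Outgoing Vaidya metric in retarded time u with mass function m(u); the
% Schwarzschild metric is recovered for constant m.
\[
ds^2 = -\left(1 - \frac{2m(u)}{r}\right) du^2 - 2\, du\, dr
       + r^2 \left( d\theta^2 + \sin^2\theta \, d\varphi^2 \right).
\]
```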

  6. Sum rules across the unpolarized Compton processes involving generalized polarizabilities and moments of nucleon structure functions

    Science.gov (United States)

    Lensky, Vadim; Hagelstein, Franziska; Pascalutsa, Vladimir; Vanderhaeghen, Marc

    2018-04-01

    We derive two new sum rules for the unpolarized doubly virtual Compton scattering process on a nucleon, which establish novel low-Q2 relations involving the nucleon's generalized polarizabilities and moments of the nucleon's unpolarized structure functions F1(x ,Q2) and F2(x ,Q2). These relations facilitate the determination of some structure constants which can only be accessed in off-forward doubly virtual Compton scattering, not experimentally accessible at present. We perform an empirical determination for the proton and compare our results with a next-to-leading-order chiral perturbation theory prediction. We also show how these relations may be useful for a model-independent determination of the low-Q2 subtraction function in the Compton amplitude, which enters the two-photon-exchange contribution to the Lamb shift of (muonic) hydrogen. An explicit calculation of the Δ (1232 )-resonance contribution to the muonic-hydrogen 2 P -2 S Lamb shift yields -1 ±1 μ eV , confirming the previously conjectured smallness of this effect.

  7. 27 CFR 20.112 - Special industrial solvents general-use formula.

    Science.gov (United States)

    2010-04-01

    Title 27 (Alcohol, Tobacco Products and Firearms), Alcohol and... and Rum, Formulas and Statements of Process, General-Use Formulas: § 20.112 Special industrial solvents general-use formula. (a) A special industrial solvent is any article made with any other ingredients...

  8. USING THE GENERAL ELECTRIC / MCKINSEY MATRIX IN THE PROCESS OF SELECTING THE CENTRAL AND EAST EUROPEAN MARKETS

    Directory of Open Access Journals (Sweden)

    Nicolae Răzvan Decuseară

    2013-01-01

    Full Text Available Due to limited resources, a company cannot serve all potential markets in the world in a way that satisfies all clients and achieves its business goals, which is why the company should select the most appropriate markets. It can focus on a single product market serving many geographic areas, but it may also decide to serve different product markets in a group of selected geographic areas. Due to the large number and diversity of markets that can be chosen, analyzing market attractiveness and selecting the most interesting markets is a complex process. The General Electric/McKinsey Matrix has two dimensions, market attractiveness and the competitive strength of the firm, and aims to analyze the strengths and weaknesses of the company in a variety of areas, allowing the company to identify the most attractive markets and guiding managers in allocating resources to these markets, improving the weaker competitive position of the company in emerging markets, or withdrawing the firm from unattractive markets. We can say that it is a very efficient tool for the company, used by international market specialists on the one hand to select foreign markets for the company and on the other hand to determine the strategy that the firm will use to internationalize in those markets. At the end of this paper we present part of a larger study in which we showed how the General Electric/McKinsey Matrix is used specifically to select foreign markets.
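
    The scoring step behind such a matrix can be sketched as a weighted-factor calculation followed by placement in one of the nine cells; the factors, weights, scores and cell boundaries below are purely illustrative assumptions, not values from the study.

```python
# Illustrative GE/McKinsey-style scoring: weighted factor scores for market
# attractiveness and competitive strength place each market in a 3x3 cell.

ATTRACT_WEIGHTS = {"market size": 0.4, "growth rate": 0.3, "entry barriers": 0.3}
STRENGTH_WEIGHTS = {"market share": 0.5, "brand": 0.2, "distribution": 0.3}

def weighted_score(weights, scores):       # scores assumed on a 1-9 scale
    return sum(weights[f] * scores[f] for f in weights)

def cell(attractiveness, strength):
    def band(x):
        return "low" if x < 3.33 else ("medium" if x < 6.67 else "high")
    return band(attractiveness), band(strength)

markets = {
    "Market A": ({"market size": 8, "growth rate": 7, "entry barriers": 5},
                 {"market share": 6, "brand": 7, "distribution": 8}),
    "Market B": ({"market size": 4, "growth rate": 3, "entry barriers": 6},
                 {"market share": 2, "brand": 3, "distribution": 4}),
}

for name, (attract_scores, strength_scores) in markets.items():
    a = weighted_score(ATTRACT_WEIGHTS, attract_scores)
    s = weighted_score(STRENGTH_WEIGHTS, strength_scores)
    print(f"{name}: attractiveness={a:.1f}, strength={s:.1f}, cell={cell(a, s)}")
```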

  9. General knowledge structure for diagnosis

    International Nuclear Information System (INIS)

    Steinar Brendeford, T.

    1996-01-01

    At the OECD Halden Reactor Project work has been going on for several years in the field of automatic fault diagnosis for nuclear power plants. Continuing this work, studies are now carried out to combine different diagnostic systems within the same framework. The goal is to establish a general knowledge structure for diagnosis applied to an NPP process. Such a consistent and generic storage of knowledge will lighten the task of combining different diagnosis techniques. An integration like this is expected to increase the robustness and widen the scope of the diagnosis. Further, verification of system reliability and on-line explanations of hypotheses can be helped. Last but not least there is a potential in reuse of both specific and generic knowledge. The general knowledge framework is also a prerequisite for a successful integration of computerized operator support systems within the process supervision and control complex. Consistency, verification and reuse are keywords also in this respect. Systems that should be considered for integration are: automatic control, computerized operator procedures, alarms and alarm filtering, signal validation, diagnosis and condition-based maintenance. This paper presents three prototype diagnosis systems developed at the OECD Halden Reactor Project. A software arrangement for process simulation with these three systems attached in parallel is briefly described. The central part of this setup is a 'blackboard' system to be used for representing shared knowledge. Examples of such knowledge representations are included in the paper. The conclusions so far in this line of work are only tentative. The studies of existing methodologies for diagnosis, however, show a potential for several generalizations to be made in knowledge representation and use. (author). 14 refs, 6 figs
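
    A generic, heavily simplified sketch of the 'blackboard' idea is given below: independent knowledge sources read shared process data and post hypotheses to a common store. The module names, signals and thresholds are invented here and do not correspond to the Halden prototypes.

```python
# Minimal blackboard pattern: diagnosis modules share data and hypotheses.

class Blackboard:
    def __init__(self):
        self.data = {}          # latest validated process signals
        self.hypotheses = []    # (source, hypothesis, confidence)

    def post(self, source, hypothesis, confidence):
        self.hypotheses.append((source, hypothesis, confidence))

class LevelMonitor:
    def run(self, bb):
        if bb.data.get("tank_level", 1.0) < 0.2:
            bb.post("LevelMonitor", "possible leak in tank circuit", 0.6)

class ValveChecker:
    def run(self, bb):
        if bb.data.get("valve_cmd") == "open" and bb.data.get("flow", 0.0) < 0.01:
            bb.post("ValveChecker", "valve V-101 stuck closed", 0.8)

bb = Blackboard()
bb.data.update({"tank_level": 0.15, "valve_cmd": "open", "flow": 0.0})
for source in (LevelMonitor(), ValveChecker()):
    source.run(bb)
for src, hyp, conf in sorted(bb.hypotheses, key=lambda h: -h[2]):
    print(f"[{conf:.1f}] {src}: {hyp}")
```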

  10. General knowledge structure for diagnosis

    Energy Technology Data Exchange (ETDEWEB)

    Steinar Brendeford, T [Institutt for Energiteknikk, Halden (Norway). OECD Halden Reaktor Projekt

    1997-12-31

    At the OECD Halden Reactor Project work has been going on for several years in the field of automatic fault diagnosis for nuclear power plants. Continuing this work, studies are now carried out to combine different diagnostic systems within the same framework. The goal is to establish a general knowledge structure for diagnosis applied to an NPP process. Such a consistent and generic storage of knowledge will lighten the task of combining different diagnosis techniques. An integration like this is expected to increase the robustness and widen the scope of the diagnosis. Further, verification of system reliability and on-line explanations of hypotheses can be helped. Last but not least there is a potential in reuse of both specific and generic knowledge. The general knowledge framework is also a prerequisite for a successful integration of computerized operator support systems within the process supervision and control complex. Consistency, verification and reuse are keywords also in this respect. Systems that should be considered for integration are: automatic control, computerized operator procedures, alarms and alarm filtering, signal validation, diagnosis and condition-based maintenance. This paper presents three prototype diagnosis systems developed at the OECD Halden Reactor Project. A software arrangement for process simulation with these three systems attached in parallel is briefly described. The central part of this setup is a 'blackboard' system to be used for representing shared knowledge. Examples of such knowledge representations are included in the paper. The conclusions so far in this line of work are only tentative. The studies of existing methodologies for diagnosis, however, show a potential for several generalizations to be made in knowledge representation and use. (author). 14 refs, 6 figs.

  11. Generalized Nonlinear Yule Models

    Science.gov (United States)

    Lansky, Petr; Polito, Federico; Sacerdote, Laura

    2016-11-01

    With the aim of considering models related to random graphs growth exhibiting persistent memory, we propose a fractional nonlinear modification of the classical Yule model often studied in the context of macroevolution. Here the model is analyzed and interpreted in the framework of the development of networks such as the World Wide Web. Nonlinearity is introduced by replacing the linear birth process governing the growth of the in-links of each specific webpage with a fractional nonlinear birth process with completely general birth rates. Among the main results we derive the explicit distribution of the number of in-links of a webpage chosen uniformly at random recognizing the contribution to the asymptotics and the finite time correction. The mean value of the latter distribution is also calculated explicitly in the most general case. Furthermore, in order to show the usefulness of our results, we particularize them in the case of specific birth rates giving rise to a saturating behaviour, a property that is often observed in nature. The further specialization to the non-fractional case allows us to extend the Yule model accounting for a nonlinear growth.
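
    The non-fractional special case can be illustrated with a direct simulation in which each new in-link attaches to a page with probability proportional to a general (here saturating) birth rate; the rate function and the simulation sizes are arbitrary choices for illustration only.

```python
import random

def birth_rate(degree, cap=20.0):
    # saturating nonlinear rate: grows with in-degree but levels off at `cap`
    return min(1.0 + degree, cap)

def simulate(n_links=10_000, n_pages=300, seed=3):
    rng = random.Random(seed)
    degree = [0] * n_pages
    for _ in range(n_links):
        rates = [birth_rate(d) for d in degree]
        total = sum(rates)
        r, acc, target = rng.random() * total, 0.0, 0
        for i, w in enumerate(rates):
            acc += w
            if r <= acc:
                target = i
                break
        degree[target] += 1       # the chosen page receives the new in-link
    return degree

deg = simulate()
print("max in-degree:", max(deg), " pages with zero in-links:", deg.count(0))
```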

  12. Energy optimization of integrated process plants

    Energy Technology Data Exchange (ETDEWEB)

    Sandvig Nielsen, J

    1996-10-01

    A general approach for viewing process synthesis as an evolutionary process is proposed. Each step is taken according to the present level of information and knowledge. This is formulated in a Process Synthesis Cycle. Initially the synthesis is conducted at a high abstraction level, maximizing the use of heuristics (prior experience, rules of thumb, etc.). When further knowledge and information are available, heuristics will gradually be replaced by exact problem formulations. The principles of the Process Synthesis Cycle are used to develop a general procedure for energy synthesis, based on available tools. The procedure is based on efficient use of process simulators with integrated Pinch capabilities (energy targeting). The proposed general procedure is tailored to three specific problems (Humid Air Turbine power plant synthesis, Nitric Acid process synthesis and Sulphuric Acid synthesis). Using the procedure reduces the problem dimension considerably and thus allows for faster evaluation of more alternatives. At a more detailed level a new framework for the Heat Exchanger Network synthesis problem is proposed. The new framework is object-oriented, based on a general functional description of all elements potentially present in the heat exchanger network (streams, exchangers, pumps, furnaces etc.). (LN) 116 refs.
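
    The energy-targeting (Pinch) step that the procedure relies on can be sketched with the classical problem table algorithm; the stream data and the minimum approach temperature below are invented example values.

```python
# Problem table algorithm: minimum hot/cold utility targets and pinch location.

DT_MIN = 10.0
# (name, supply T [C], target T [C], heat-capacity flowrate CP [kW/K]) - assumed data
streams = [
    ("H1", 180.0,  60.0, 3.0),   # hot stream (cooling down)
    ("H2", 150.0,  30.0, 1.5),
    ("C1",  20.0, 135.0, 2.0),   # cold stream (heating up)
    ("C2",  80.0, 140.0, 4.0),
]

def shifted(ts, tt):
    hot = ts > tt
    shift = -DT_MIN / 2 if hot else DT_MIN / 2
    return ts + shift, tt + shift, hot

shifted_streams = [(shifted(ts, tt), cp) for _, ts, tt, cp in streams]
bounds = sorted({t for (ts, tt, _), _ in shifted_streams for t in (ts, tt)}, reverse=True)

surplus = []
for hi, lo in zip(bounds, bounds[1:]):
    net_cp = 0.0
    for (ts, tt, hot), cp in shifted_streams:
        if min(ts, tt) <= lo and max(ts, tt) >= hi:   # stream spans the interval
            net_cp += cp if hot else -cp
    surplus.append(net_cp * (hi - lo))                # >0 means heat surplus

# Cascade heat downwards; the most negative cumulative value sets Q_H,min.
cascade, running = [0.0], 0.0
for s in surplus:
    running += s
    cascade.append(running)
q_hot_min = max(0.0, -min(cascade))
q_cold_min = cascade[-1] + q_hot_min
pinch_T = bounds[cascade.index(min(cascade))]

print(f"minimum hot utility : {q_hot_min:.1f} kW")
print(f"minimum cold utility: {q_cold_min:.1f} kW")
print(f"pinch (shifted) temperature: {pinch_T:.1f} C")
```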

  13. Collection, transport and general processing of clinical specimens in Microbiology laboratory.

    Science.gov (United States)

    Sánchez-Romero, M Isabel; García-Lechuz Moya, Juan Manuel; González López, Juan José; Orta Mira, Nieves

    2018-02-06

    The interpretation and the accuracy of the microbiological results still depend to a great extent on the quality of the samples and their processing within the Microbiology laboratory. The type of specimen, the appropriate time to obtain the sample, the way of sampling, and the storage and transport are critical points in the diagnostic process. The availability of new laboratory techniques for unusual pathogens makes it necessary to review and update all the steps involved in the processing of the samples. Nowadays, laboratory automation and the availability of rapid techniques allow the precision and turnaround time necessary to help clinicians in decision making. In order to be efficient, it is very important to obtain clinical information so as to use the best diagnostic tools. Copyright © 2018 Elsevier España, S.L.U. and Sociedad Española de Enfermedades Infecciosas y Microbiología Clínica. All rights reserved.

  14. Domain-Generality of Timing-Based Serial Order Processes in Short-Term Memory: New Insights from Musical and Verbal Domains.

    Directory of Open Access Journals (Sweden)

    Simon Gorin

    Full Text Available Several models in the verbal domain of short-term memory (STM) consider a dissociation between item and order processing. This view is supported by data demonstrating that different types of time-based interference have a greater effect on memory for the order of to-be-remembered items than on memory for the items themselves. The present study investigated the domain-generality of the item versus serial order dissociation by comparing the differential effects of time-based interfering tasks, such as rhythmic interference and articulatory suppression, on item and order processing in verbal and musical STM domains. In Experiment 1, participants had to maintain sequences of verbal or musical information in STM, followed by a probe sequence, this under different conditions of interference (no-interference, rhythmic interference, articulatory suppression). They were required to decide whether all items of the probe list matched those of the memory list (item condition) or whether the order of the items in the probe sequence matched the order in the memory list (order condition). In Experiment 2, participants performed a serial order probe recognition task for verbal and musical sequences ensuring sequential maintenance processes, under no-interference or rhythmic interference conditions. For Experiment 1, serial order recognition was not significantly more impacted by interfering tasks than was item recognition, this for both verbal and musical domains. For Experiment 2, we observed selective interference of the rhythmic interference condition on both musical and verbal order STM tasks. Overall, the results suggest a similar and selective sensitivity to time-based interference for serial order STM in verbal and musical domains, but only when the STM tasks ensure sequential maintenance processes.

  15. Domain-Generality of Timing-Based Serial Order Processes in Short-Term Memory: New Insights from Musical and Verbal Domains.

    Science.gov (United States)

    Gorin, Simon; Kowialiewski, Benjamin; Majerus, Steve

    2016-01-01

    Several models in the verbal domain of short-term memory (STM) consider a dissociation between item and order processing. This view is supported by data demonstrating that different types of time-based interference have a greater effect on memory for the order of to-be-remembered items than on memory for the items themselves. The present study investigated the domain-generality of the item versus serial order dissociation by comparing the differential effects of time-based interfering tasks, such as rhythmic interference and articulatory suppression, on item and order processing in verbal and musical STM domains. In Experiment 1, participants had to maintain sequences of verbal or musical information in STM, followed by a probe sequence, this under different conditions of interference (no-interference, rhythmic interference, articulatory suppression). They were required to decide whether all items of the probe list matched those of the memory list (item condition) or whether the order of the items in the probe sequence matched the order in the memory list (order condition). In Experiment 2, participants performed a serial order probe recognition task for verbal and musical sequences ensuring sequential maintenance processes, under no-interference or rhythmic interference conditions. For Experiment 1, serial order recognition was not significantly more impacted by interfering tasks than was item recognition, this for both verbal and musical domains. For Experiment 2, we observed selective interference of the rhythmic interference condition on both musical and verbal order STM tasks. Overall, the results suggest a similar and selective sensitivity to time-based interference for serial order STM in verbal and musical domains, but only when the STM tasks ensure sequential maintenance processes.

  16. Observations on the properties of second and general-order kinetics equations describing the thermoluminescence processes

    International Nuclear Information System (INIS)

    Kitis, G.; Furetta, C.; Azorin, J.

    2003-01-01

    Synthetic thermoluminescent (TL) glow peaks, following second- and general-order kinetics, have been generated by computer. The general properties of the peaks so generated have been investigated over several orders of magnitude of simulated dose. Some unusual results which, to the best knowledge of the authors, are not reported in the literature, are obtained and discussed. (Author)
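
    The kinetics referred to above are commonly written in the following standard forms, with n(t) the trapped-charge concentration, E the activation energy, k the Boltzmann constant, T the temperature and s, s', s'' frequency factors (this is textbook notation, not a reproduction of the paper's equations):

```latex
\[
I = -\frac{dn}{dt} = s\, n\, e^{-E/kT} \quad \text{(first order)}, \qquad
I = -\frac{dn}{dt} = s'\, n^{2}\, e^{-E/kT} \quad \text{(second order)},
\]
\[
I = -\frac{dn}{dt} = s''\, n^{b}\, e^{-E/kT}, \quad 1 < b \le 2 \quad \text{(general order)}.
\]
```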

  17. Generalized Bessel functions in tunnelling ionization

    International Nuclear Information System (INIS)

    Reiss, H R; Krainov, V P

    2003-01-01

    We develop two new approximations for the generalized Bessel function that frequently arises in the analytical treatment of strong-field processes, especially in non-perturbative multiphoton ionization theories. Both these new forms are applicable to the tunnelling environment in atomic ionization, and are analytically much simpler than the currently used low-frequency asymptotic approximation for the generalized Bessel function. The second of the new forms is an approximation to the first, and it is the second new form that exhibits the well-known tunnelling exponential
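
    The generalized Bessel function referred to here is usually the two-argument function of strong-field theory, commonly defined by the series below (conventions for the sign of the second argument vary between authors):

```latex
% Two-argument generalized Bessel function built from ordinary Bessel
% functions J_n, with its generating expansion.
\[
J_n(u, v) \;=\; \sum_{k=-\infty}^{\infty} J_{n-2k}(u)\, J_k(v),
\qquad
e^{\,i\left(u \sin\theta + v \sin 2\theta\right)}
  \;=\; \sum_{n=-\infty}^{\infty} J_n(u, v)\, e^{\,i n \theta}.
\]
```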

  18. A Stochastic Maximum Principle for General Mean-Field Systems

    International Nuclear Information System (INIS)

    Buckdahn, Rainer; Li, Juan; Ma, Jin

    2016-01-01

    In this paper we study the optimal control problem for a class of general mean-field stochastic differential equations, in which the coefficients depend, nonlinearly, on both the state process and its law. In particular, we assume that the control set is a general open set that is not necessarily convex, and the coefficients are only continuous in the control variable without any further regularity or convexity. We validate the approach of Peng (SIAM J Control Optim 2(4):966–979, 1990) by considering the second order variational equations and the corresponding second order adjoint process in this setting, and we extend the Stochastic Maximum Principle of Buckdahn et al. (Appl Math Optim 64(2):197–216, 2011) to this general case.
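
    Schematically, the class of controlled mean-field dynamics and cost functionals considered is of the following form, where P_{X_t} denotes the law of X_t, u_t the control and W a Brownian motion (this is a generic statement of the setting, not the paper's precise assumptions):

```latex
\[
dX_t = b\bigl(t, X_t, P_{X_t}, u_t\bigr)\, dt
     + \sigma\bigl(t, X_t, P_{X_t}, u_t\bigr)\, dW_t, \qquad X_0 = x_0,
\]
\[
J(u) = \mathbb{E}\!\left[ \int_0^T f\bigl(t, X_t, P_{X_t}, u_t\bigr)\, dt
       + \Phi\bigl(X_T, P_{X_T}\bigr) \right].
\]
```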

  19. A Stochastic Maximum Principle for General Mean-Field Systems

    Energy Technology Data Exchange (ETDEWEB)

    Buckdahn, Rainer, E-mail: Rainer.Buckdahn@univ-brest.fr [Université de Bretagne-Occidentale, Département de Mathématiques (France); Li, Juan, E-mail: juanli@sdu.edu.cn [Shandong University, Weihai, School of Mathematics and Statistics (China); Ma, Jin, E-mail: jinma@usc.edu [University of Southern California, Department of Mathematics (United States)

    2016-12-15

    In this paper we study the optimal control problem for a class of general mean-field stochastic differential equations, in which the coefficients depend, nonlinearly, on both the state process and its law. In particular, we assume that the control set is a general open set that is not necessarily convex, and the coefficients are only continuous in the control variable without any further regularity or convexity. We validate the approach of Peng (SIAM J Control Optim 2(4):966–979, 1990) by considering the second order variational equations and the corresponding second order adjoint process in this setting, and we extend the Stochastic Maximum Principle of Buckdahn et al. (Appl Math Optim 64(2):197–216, 2011) to this general case.

  20. Process Intensification: A Perspective on Process Synthesis

    DEFF Research Database (Denmark)

    Lutze, Philip; Gani, Rafiqul; Woodley, John

    2010-01-01

    In recent years, process intensification (PI) has attracted considerable academic interest as a potential means for process improvement, to meet the increasing demands for sustainable production. A variety of intensified operations developed in academia and industry creates a large number...... of options to potentially improve the process but to identify the set of feasible solutions for PI in which the optimal can be found takes considerable resources. Hence, a process synthesis tool to achieve PI would potentially assist in the generation and evaluation of PI options. Currently, several process...... design tools with a clear focus on specific PI tasks exist. Therefore, in this paper, the concept of a general systematic framework for synthesis and design of PI options in hierarchical steps through analyzing an existing process, generating PI options in a superstructure and evaluating intensified...

  1. Gravitational Couplings for Generalized Orientifold Planes

    OpenAIRE

    Giraldo, Juan Fernando Ospina

    2000-01-01

    The Wess-Zumino action for generalized orientifold planes (GOp-planes) is presented and a power series expansion is carried out, from which processes that involve GOp-planes, RR-forms, gravitons and gaugeons are obtained. Finally, non-standard GOp-planes are shown.

  2. Multivariate Generalized Multiscale Entropy Analysis

    Directory of Open Access Journals (Sweden)

    Anne Humeau-Heurtier

    2016-11-01

    Full Text Available Multiscale entropy (MSE) was introduced in the 2000s to quantify systems’ complexity. MSE relies on (i) a coarse-graining procedure to derive a set of time series representing the system dynamics on different time scales; (ii) the computation of the sample entropy for each coarse-grained time series. A refined composite MSE (rcMSE)—based on the same steps as MSE—also exists. Compared to MSE, rcMSE increases the accuracy of entropy estimation and reduces the probability of inducing undefined entropy for short time series. The multivariate versions of MSE (MMSE) and rcMSE (MrcMSE) have also been introduced. In the coarse-graining step used in MSE, rcMSE, MMSE, and MrcMSE, the mean value is used to derive representations of the original data at different resolutions. A generalization of MSE was recently published, using the computation of different moments in the coarse-graining procedure. However, so far, this generalization only exists for univariate signals. We therefore herein propose an extension of this generalized MSE to multivariate data. The multivariate generalized algorithms of MMSE and MrcMSE presented herein (MGMSE and MGrcMSE, respectively) are first analyzed through the processing of synthetic signals. We reveal that MGrcMSE shows better performance than MGMSE for short multivariate data. We then study the performance of MGrcMSE on two sets of short multivariate electroencephalograms (EEG) available in the public domain. We report that MGrcMSE may show better performance than MrcMSE in distinguishing different types of multivariate EEG data. MGrcMSE could therefore supplement MMSE or MrcMSE in the processing of multivariate datasets.
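
    A univariate sketch of the two building blocks named above, coarse-graining (here with either the mean or the variance as the generalizing moment) followed by sample entropy, is given below; the parameters m, r and the test signal are arbitrary illustrative choices, and the multivariate MMSE/MGrcMSE algorithms of the paper go well beyond this.

```python
import numpy as np

def coarse_grain(x, scale, moment="mean"):
    # non-overlapping windows summarized by a chosen moment (mean or variance)
    n = len(x) // scale
    windows = np.reshape(x[:n * scale], (n, scale))
    return windows.mean(axis=1) if moment == "mean" else windows.var(axis=1)

def sample_entropy(x, m=2, r=None):
    # simplified O(N^2) sample entropy with Chebyshev distance
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.15 * x.std()
    def count_matches(length):
        templates = np.array([x[i:i + length] for i in range(len(x) - length)])
        count = 0
        for i in range(len(templates)):
            dist = np.max(np.abs(templates - templates[i]), axis=1)
            count += np.sum(dist <= r) - 1          # exclude self-match
        return count
    B, A = count_matches(m), count_matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

rng = np.random.default_rng(0)
signal = rng.standard_normal(2000)
for scale in (1, 2, 5):
    cg = coarse_grain(signal, scale, moment="mean")
    print(f"scale {scale}: SampEn = {sample_entropy(cg):.3f}")
```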

  3. Minimal and careful processing

    OpenAIRE

    Nielsen, Thorkild

    2004-01-01

    In several standards, guidelines and publications, organic food processing is strongly associated with "minimal processing" and "careful processing". The term "minimal processing" is nowadays often used in the general food processing industry and described in literature. The term "careful processing" is used more specifically within organic food processing but is not yet clearly defined. The concept of carefulness seems to fit very well with the processing of organic foods, especially if it i...

  4. Feller processes: the next generation in modeling. Brownian motion, Lévy processes and beyond.

    Directory of Open Access Journals (Sweden)

    Björn Böttcher

    Full Text Available We present a simple construction method for Feller processes and a framework for the generation of sample paths of Feller processes. The construction is based on state space dependent mixing of Lévy processes. Brownian motion is one of the most frequently used continuous time Markov processes in applications. In recent years, Lévy processes, of which Brownian motion is a special case, have also become increasingly popular. Lévy processes are spatially homogeneous, but empirical data often suggest the use of spatially inhomogeneous processes. Thus it seems necessary to go to the next level of generalization: Feller processes. These include Lévy processes and in particular Brownian motion as special cases but allow spatial inhomogeneities. Many properties of Feller processes are known, but proving the very existence is, in general, very technical. Moreover, an applicable framework for the generation of sample paths of a Feller process was missing. We explain, with practitioners in mind, how to overcome both of these obstacles. In particular our simulation technique allows one to apply Monte Carlo methods to Feller processes.
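
    The idea of state-space-dependent mixing can be caricatured with a crude Euler-type simulation in which the increment distribution depends on the current position; the coefficients below are invented, and this is not the construction or simulation framework of the paper.

```python
import numpy as np

def sigma(x):                  # diffusive scale depends on the state (assumed form)
    return 0.2 + 0.1 * np.tanh(x)

def jump_intensity(x):         # jumps become more frequent for large |x| (assumed form)
    return 0.5 + 0.5 * np.abs(np.tanh(x))

def simulate_path(x0=0.0, T=10.0, dt=1e-3, seed=0):
    rng = np.random.default_rng(seed)
    n = int(T / dt)
    path = np.empty(n + 1)
    path[0] = x = x0
    for k in range(n):
        # Gaussian increment with state-dependent volatility ...
        x += sigma(x) * np.sqrt(dt) * rng.standard_normal()
        # ... plus an occasional compound-Poisson jump with state-dependent intensity
        if rng.random() < jump_intensity(x) * dt:
            x += rng.normal(0.0, 0.5)
        path[k + 1] = x
    return path

path = simulate_path()
print("terminal value:", path[-1], " max excursion:", np.abs(path).max())
```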

  5. When and why do doctors decide to become general practitioners? Implications for recruitment into UK general practice specialty training.

    Science.gov (United States)

    Irish, Bill; Lake, Jonathan

    2011-01-01

    All applicants to round 1 of national recruitment into the general practice specialty recruitment process were surveyed as to the reasons for, and the timing of, their career choices. Most applicants reported decision making after completing undergraduate training, citing variety, continuity of care and work-life balance as their main drivers for a career in general practice. Applicants were statistically more likely to have undertaken a Foundation placement in general practice than their peers on a Foundation programme. Reasons for choice of deanery were largely related to location and social ties, rather than to the educational 'reputation' of its programmes.

  6. The evolution of primate general and cultural intelligence.

    Science.gov (United States)

    Reader, Simon M; Hager, Yfke; Laland, Kevin N

    2011-04-12

    There are consistent individual differences in human intelligence, attributable to a single 'general intelligence' factor, g. The evolutionary basis of g and its links to social learning and culture remain controversial. Conflicting hypotheses regard primate cognition as divided into specialized, independently evolving modules versus a single general process. To assess how processes underlying culture relate to one another and other cognitive capacities, we compiled ecologically relevant cognitive measures from multiple domains, namely reported incidences of behavioural innovation, social learning, tool use, extractive foraging and tactical deception, in 62 primate species. All exhibited strong positive associations in principal component and factor analyses, after statistically controlling for multiple potential confounds. This highly correlated composite of cognitive traits suggests social, technical and ecological abilities have coevolved in primates, indicative of an across-species general intelligence that includes elements of cultural intelligence. Our composite species-level measure of general intelligence, 'primate g(S)', covaried with both brain volume and captive learning performance measures. Our findings question the independence of cognitive traits and do not support 'massive modularity' in primate cognition, nor an exclusively social model of primate intelligence. High general intelligence has independently evolved at least four times, with convergent evolution in capuchins, baboons, macaques and great apes.

  7. Intact Prototype Formation but Impaired Generalization in Autism

    Science.gov (United States)

    Froehlich, A. L.; Anderson, J. S.; Bigler, E. D.; Miller, J. S.; Lange, N. T.; DuBray, M. B.; Cooperrider, J. R.; Cariello, A.; Nielsen, J. A.; Lainhart, J. E.

    2012-01-01

    Cognitive processing in autism has been characterized by a difficulty with the abstraction of information across multiple stimuli or situations and subsequent generalization to new stimuli or situations. This apparent difficulty leads to the suggestion that prototype formation, a process of creating a mental summary representation of multiple…

  8. Continuous affine processes

    DEFF Research Database (Denmark)

    Buchardt, Kristian

    2016-01-01

    Affine processes possess the property that expectations of exponential affine transformations are given by a set of Riccati differential equations, which is the main feature of this popular class of processes. In this paper we generalise these results for expectations of more general transformati...
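    As a standard single-factor illustration of this property (not taken from the paper, which treats more general transformations), for the CIR-type affine diffusion dX_t = κ(θ − X_t)dt + σ√X_t dW_t the exponential affine transform and its Riccati system read

      \[
        \mathbb{E}\left[ e^{u X_T} \mid X_t = x \right] = e^{\phi(\tau) + \psi(\tau) x}, \qquad \tau = T - t,
      \]
      \[
        \psi'(\tau) = -\kappa\,\psi(\tau) + \tfrac{1}{2}\sigma^2 \psi(\tau)^2, \qquad
        \phi'(\tau) = \kappa\theta\,\psi(\tau), \qquad \psi(0) = u, \ \phi(0) = 0 .
      \]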

  9. 27 CFR 20.113 - Proprietary solvents general-use formula.

    Science.gov (United States)

    2010-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2010-04-01 2010-04-01 false Proprietary solvents... Formulas and Statements of Process General-Use Formulas § 20.113 Proprietary solvents general-use formula. (a) A proprietary solvent is any article made with any other ingredients combined with the...

  10. Analysis of Queues with Rational Arrival Process Components - A General Approach

    DEFF Research Database (Denmark)

    Bean, Nigel; Nielsen, Bo Friis

    In a previous paper we demonstrated that the well known matrix-geometric solution of Quasi-Birth-and-Death processes is valid also if we introduce Rational Arrival Process (RAP) components. Here we extend those results and we offer an alternative proof by using results obtained by Tweedie. We prove the matrix-geometric form for a certain kind of operators on the stationary measure for discrete time Markov chains of GI/M/1 type. We apply this result to an embedded chain with RAP components. We then discuss the straightforward modification of the standard algorithms for calculating the matrix R...
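    For the plain discrete-time QBD case (without the RAP components the paper adds), the matrix R in the matrix-geometric form pi_{k+1} = pi_k R is the minimal nonnegative solution of R = A0 + R A1 + R^2 A2 and can be obtained by successive substitution. The sketch below shows only that standard algorithm; the 2x2 transition blocks are hypothetical.

      import numpy as np

      def qbd_rate_matrix(A0, A1, A2, tol=1e-12, max_iter=10_000):
          """Minimal nonnegative solution R of R = A0 + R A1 + R^2 A2
          (discrete-time QBD), by successive substitution from R = 0."""
          R = np.zeros_like(A0)
          for _ in range(max_iter):
              R_next = A0 + R @ A1 + R @ R @ A2
              if np.max(np.abs(R_next - R)) < tol:
                  return R_next
              R = R_next
          raise RuntimeError("iteration did not converge")

      # hypothetical transition blocks of a positive recurrent QBD (rows of A0+A1+A2 sum to 1)
      A0 = np.array([[0.10, 0.05], [0.05, 0.10]])   # level up
      A1 = np.array([[0.30, 0.10], [0.10, 0.30]])   # same level
      A2 = np.array([[0.30, 0.15], [0.20, 0.25]])   # level down
      R = qbd_rate_matrix(A0, A1, A2)
      print("spectral radius of R:", max(abs(np.linalg.eigvals(R))))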

  11. Dynamical and hamiltonian dilations of stochastic processes

    International Nuclear Information System (INIS)

    Baumgartner, B.; Gruemm, H.-R.

    1982-01-01

    This is a study of the problem of which stochastic processes could arise from dynamical systems by loss of information. The notions of 'dilation' and 'approximate dilation' of a stochastic process are introduced to give exact definitions of this particular relationship. It is shown that every generalized stochastic process is approximately dilatable by a sequence of dynamical systems, but for stochastic processes in full generality one needs nets. (Author)

  12. 48 CFR 7.104 - General procedures.

    Science.gov (United States)

    2010-10-01

    ..., since it generally restricts competition and increases prices. Early in the planning process, the planner should consult with requirements and logistics personnel who determine type, quality, quantity... competition when awarding a contract, the plan shall also be coordinated with the cognizant competition...

  13. 48 CFR 239.7102-1 - General.

    Science.gov (United States)

    2010-10-01

    ..., DEPARTMENT OF DEFENSE SPECIAL CATEGORIES OF CONTRACTING ACQUISITION OF INFORMATION TECHNOLOGY Security and Privacy for Computer Systems 239.7102-1 General. (a) Agencies shall ensure that information assurance is... Telecommunications and Information Systems Security Policy No. 11; (4) Federal Information Processing Standards; (5...

  14. Generalized Superconductivity. Generalized Levitation

    International Nuclear Information System (INIS)

    Ciobanu, B.; Agop, M.

    2004-01-01

    In recent papers, gravitational superconductivity has been described. We introduce the concept of generalized superconductivity, observing that any nongeodesic motion and, in particular, the motion in an electromagnetic field, can be transformed into a geodesic motion by a suitable choice of the connection. In the present paper, the gravitoelectromagnetic London equations have been obtained from the generalized Helmholtz vortex theorem using the generalized local equivalence principle. In this context, the gravitoelectromagnetic Meissner effect and, implicitly, the gravitoelectromagnetic levitation are given. (authors)

  15. General purpose computers in real time

    International Nuclear Information System (INIS)

    Biel, J.R.

    1989-01-01

    I see three main trends in the use of general purpose computers in real time. The first is more processing power. The second is the use of higher speed interconnects between computers (allowing more data to be delivered to the processors). The third is the use of larger programs running in the computers. Although there is still work that needs to be done, I believe that all indications are that general purpose computers will be able to meet the online needs of the SCC and LHC machines. 2 figs

  16. Nonequilibrium Statistical Operator Method and Generalized Kinetic Equations

    Science.gov (United States)

    Kuzemsky, A. L.

    2018-01-01

    We consider some principal problems of nonequilibrium statistical thermodynamics in the framework of the Zubarev nonequilibrium statistical operator approach. We present a brief comparative analysis of some approaches to describing irreversible processes based on the concept of nonequilibrium Gibbs ensembles and their applicability to describing nonequilibrium processes. We discuss the derivation of generalized kinetic equations for a system in a heat bath. We obtain and analyze a damped Schrödinger-type equation for a dynamical system in a heat bath. We study the dynamical behavior of a particle in a medium taking the dissipation effects into account. We consider the scattering problem for neutrons in a nonequilibrium medium and derive a generalized Van Hove formula. We show that the nonequilibrium statistical operator method is an effective, convenient tool for describing irreversible processes in condensed matter.

  17. Variational estimation of process parameters in a simplified atmospheric general circulation model

    Science.gov (United States)

    Lv, Guokun; Koehl, Armin; Stammer, Detlef

    2016-04-01

    Parameterizations are used to simulate effects of unresolved sub-grid-scale processes in current state-of-the-art climate models. The values of the process parameters, which determine the model's climatology, are usually manually adjusted to reduce the difference between the model mean state and the observed climatology. This process requires detailed knowledge of the model and its parameterizations. In this work, a variational method was used to estimate process parameters in the Planet Simulator (PlaSim). The adjoint code was generated using automatic differentiation of the source code. Some hydrological processes were switched off to remove the influence of zero-order discontinuities. In addition, the nonlinearity of the model limits the feasible assimilation window to about 1 day, which is too short to tune the model's climatology. To extend the feasible assimilation window, nudging terms for all state variables were added to the model's equations, which essentially suppress all unstable directions. In identical twin experiments, we found that the feasible assimilation window could be extended to over 1 year and accurate parameters could be retrieved. Although the nudging terms translate into a damping of the adjoint variables and therefore tend to erase the information of the data over time, assimilating climatological information is shown to provide sufficient information on the parameters. Moreover, the mechanism of this regularization is discussed.
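    The basic approach, estimating a process parameter by minimizing a model-data misfit while a nudging (relaxation) term suppresses unstable growth, can be sketched on a toy ODE. This is an illustration of the general idea only; PlaSim, its adjoint code, and its parameters are far more complex, and the gradient here is handled by a derivative-free optimizer rather than adjoint code.

      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(0)
      dt, n_steps, gamma = 0.01, 500, 0.5            # gamma: hypothetical nudging strength

      def run_model(p, obs=None):
          """Toy 'climate' model dx/dt = -p*x + forcing, optionally nudged toward obs."""
          x = np.empty(n_steps); x[0] = 1.0
          for k in range(n_steps - 1):
              forcing = np.sin(0.02 * k)
              nudge = gamma * (obs[k] - x[k]) if obs is not None else 0.0
              x[k + 1] = x[k] + dt * (-p * x[k] + forcing + nudge)
          return x

      # synthetic "observations" produced with the true parameter value (identical twin setup)
      p_true = 2.0
      obs = run_model(p_true) + 0.01 * rng.standard_normal(n_steps)

      def cost(p):
          """Misfit between the nudged model trajectory and the observations."""
          return np.sum((run_model(p[0], obs=obs) - obs) ** 2)

      result = minimize(cost, x0=[0.5], method="Nelder-Mead")
      print("estimated parameter:", result.x[0])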

  18. Composing Models of Geographic Physical Processes

    Science.gov (United States)

    Hofer, Barbara; Frank, Andrew U.

    Processes are central to geographic information science; yet geographic information systems (GIS) lack capabilities to represent process related information. A prerequisite to including processes in GIS software is a general method to describe geographic processes independently of application disciplines. This paper presents such a method, namely a process description language. The vocabulary of the process description language is derived formally from mathematical models. Physical processes in geography can be described in two equivalent languages: partial differential equations or partial difference equations, where the latter can be shown graphically and used as a method for application specialists to enter their process models. The vocabulary of the process description language comprises components for describing the general behavior of prototypical geographic physical processes. These process components can be composed into basic models of geographic physical processes, which is shown by means of an example.
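    The difference-equation view of a geographic physical process can be made concrete with a toy example: one-dimensional diffusion of a quantity (heat, a pollutant) over a grid of cells, the discrete counterpart of the diffusion PDE. This sketch is not the process description language of the paper, only an illustration of the difference-equation formulation; the diffusivity value is arbitrary.

      import numpy as np

      def diffuse(u, alpha=0.2, steps=100):
          """Explicit partial difference equation for 1-D diffusion:
          u[i, t+1] = u[i, t] + alpha * (u[i-1, t] - 2*u[i, t] + u[i+1, t])."""
          u = u.astype(float).copy()
          for _ in range(steps):
              u[1:-1] = u[1:-1] + alpha * (u[:-2] - 2 * u[1:-1] + u[2:])
          return u

      grid = np.zeros(50)
      grid[25] = 100.0          # point source in the middle of the grid
      print(diffuse(grid).round(2))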

  19. 21 CFR 179.25 - General provisions for food irradiation.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 3 2010-04-01 2009-04-01 true General provisions for food irradiation. 179.25... (CONTINUED) FOOD FOR HUMAN CONSUMPTION (CONTINUED) IRRADIATION IN THE PRODUCTION, PROCESSING AND HANDLING OF FOOD Radiation and Radiation Sources § 179.25 General provisions for food irradiation. For the purposes...

  20. A combination of process of care and clinical target among type 2 diabetes mellitus patients in general medical clinics and specialist diabetes clinics at hospital levels.

    Science.gov (United States)

    Sieng, Sokha; Hurst, Cameron

    2017-08-07

    This study compares a combination of processes of care and clinical targets among patients with type 2 diabetes mellitus (T2DM) between specialist diabetes clinics (SDCs) and general medical clinics (GMCs), and how differences between these two types of clinics differ with hospital type (community, provincial and regional). Type 2 diabetes mellitus patient medical records were collected from 595 hospitals (499 community, 70 provincial, 26 regional) in Thailand between April 1 and June 30, 2012, resulting in a cross-sectional sample of 26,860 patients. Generalized linear mixed modeling was conducted to examine associations between clinic type and quality of care. The outcome variables of interest were split into clinical targets and processes of care. A subsequent subgroup analysis was conducted to examine if the nature of clinical target and process of care differences between GMCs and SDCs varied with hospital type (regional, provincial, community). Regardless of the type of hospital (regional, provincial, or community), patients attending SDCs were considerably more likely to have eye and foot exams. In larger hospitals (regional and provincial), patients attending SDCs were more likely to achieve the HbA1c exam, All FACE exam, BP target, and the Num7Q. Interestingly, SDCs performed better than GMCs only at provincial hospitals for the LDL-C target and the All7Q. Finally, patients with T2DM who attended community hospital-GMCs had a better chance of achieving the blood pressure target than patients who attended community hospital-SDCs. Specialist diabetes clinics outperform general medical clinics at both regional and provincial hospitals for all quality of care indicators, and the number of quality of care indicators achieved was never lower. However, this better performance of SDCs was not observed in community hospitals. Indeed, GMCs outperformed SDCs for some quality of care indicators in the community-level setting.

  1. A critical incident study of general practice trainees in their basic general practice term.

    Science.gov (United States)

    Diamond, M R; Kamien, M; Sim, M G; Davis, J

    1995-03-20

    To obtain information on the experiences of general practice (GP) trainees during their first general practice (GP) attachment. Critical incident technique--a qualitative analysis of open-ended interviews about incidents which describe competent or poor professional practice. Thirty-nine Western Australian doctors from the Royal Australian College of General Practitioners' (RACGP) Family Medicine Program who were completing their first six months of general practice in 1992. Doctors reported 180 critical incidents, of which just over 50% involved problems (and sometimes successes) with: difficult patients; paediatrics; the doctor-patient relationship; counselling skills; obstetrics and gynaecology; relationships with other health professionals and practice staff; and cardiovascular disorders. The major skills associated with both positive and negative critical incidents were: the interpersonal skills of rapport and listening; the diagnostic skills of thorough clinical assessment and the appropriate use of investigations; and the management skills of knowing when and how to obtain help from supervisors, hospitals and specialists. Doctors reported high levels of anxiety over difficult management decisions and feelings of guilt over missed diagnoses and inadequate management. The initial GP term is a crucial transition period in the development of the future general practitioner. An analysis of commonly recurring positive and negative critical incidents can be used by the RACGP Training Program to accelerate the learning process of doctors in vocational training and has implications for the planning of undergraduate curricula.

  2. Integrating the context-appropriate balanced attention model and reinforcement sensitivity theory: Towards a domain-general personality process model.

    Science.gov (United States)

    Collins, Michael D; Jackson, Chris J; Walker, Benjamin R; O'Connor, Peter J; Gardiner, Elliroma

    2017-01-01

    Over the last 40 years or more the personality literature has been dominated by trait models based on the Big Five (B5). Trait-based models describe personality at the between-person level but cannot explain the within-person mental mechanisms responsible for personality. Nor can they adequately account for variations in emotion and behavior experienced by individuals across different situations and over time. An alternative, yet understated, approach to personality architecture can be found in neurobiological theories of personality, most notably reinforcement sensitivity theory (RST). In contrast to static trait-based personality models like the B5, RST provides a more plausible basis for a personality process model, namely, one that explains how emotions and behavior arise from the dynamic interaction between contextual factors and within-person mental mechanisms. In this article, the authors review the evolution of a neurobiologically based personality process model based on RST, the response modulation model and the context-appropriate balanced attention model. They argue that by integrating this complex literature, and by incorporating evidence from personality neuroscience, one can meaningfully explain personality at both the within- and between-person levels. This approach achieves a domain-general architecture based on RST and self-regulation that can be used to align within-person mental mechanisms, neurobiological systems and between-person measurement models. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  3. General practice ethnicity data: evaluation of a tool

    Directory of Open Access Journals (Sweden)

    Neuwelt P

    2014-03-01

    Full Text Available INTRODUCTION: There is evidence that the collection of ethnicity data in New Zealand primary care is variable and that data recording in practices does not always align with the procedures outlined in the Ethnicity Data Protocols for the Health and Disability Sector. In 2010, the Ministry of Health funded the development of a tool to audit the collection of ethnicity data in primary care. The aim of this study was to pilot the Ethnicity Data Audit Tool (EAT) in general practice. The goal was to evaluate the tool and identify recommendations for its improvement. METHODS: Eight general practices in the Waitemata District Health Board region participated in the EAT pilot. Feedback about the pilot process was gathered by questionnaires and interviews, to gain an understanding of practices’ experiences in using the tool. Questionnaire and interview data were analysed using a simple analytical framework and a general inductive method. FINDINGS: General practice receptionists, practice managers and general practitioners participated in the pilot. Participants found the pilot process challenging but enlightening. The majority felt that the EAT was a useful quality improvement tool for handling patient ethnicity data. Larger practices were the most positive about the tool. CONCLUSION: The findings suggest that, with minor improvements to the toolkit, the EAT has the potential to lead to significant improvements in the quality of ethnicity data collection and recording in New Zealand general practices. Other system-level factors also need to be addressed.

  4. 45 CFR 98.14 - Plan process.

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 1 2010-10-01 2010-10-01 false Plan process. 98.14 Section 98.14 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION CHILD CARE AND DEVELOPMENT FUND General Application Procedures § 98.14 Plan process. In the development of each Plan, as required pursuant to § 98.17...

  5. Towards Device-Independent Information Processing on General Quantum Networks

    Science.gov (United States)

    Lee, Ciarán M.; Hoban, Matty J.

    2018-01-01

    The violation of certain Bell inequalities allows for device-independent information processing secure against nonsignaling eavesdroppers. However, this only holds for the Bell network, in which two or more agents perform local measurements on a single shared source of entanglement. To overcome the practical constraints that entangled systems can only be transmitted over relatively short distances, large-scale multisource networks have been employed. Do there exist analogs of Bell inequalities for such networks, whose violation is a resource for device independence? In this Letter, the violation of recently derived polynomial Bell inequalities will be shown to allow for device independence on multisource networks, secure against nonsignaling eavesdroppers.
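    For reference, the prototypical Bell inequality for the single-source (Bell network) scenario is the CHSH inequality; the network inequalities discussed in the Letter are polynomial generalizations of this kind of constraint. The bounds below are the standard textbook values, quoted here only for orientation.

      \[
        \left| E(a,b) + E(a,b') + E(a',b) - E(a',b') \right| \le 2 \quad \text{(local hidden variables)}, \qquad
        \le 2\sqrt{2} \quad \text{(quantum mechanics)} .
      \]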

  6. Aggregation, Validation, and Generalization of Qualitative Data - Methodological and Practical Research Strategies Illustrated by the Research Process of an empirically Based Typology.

    Science.gov (United States)

    Weis, Daniel; Willems, Helmut

    2017-06-01

    The article deals with the question of how aggregated data which allow for generalizable insights can be generated from single-case based qualitative investigations. Thereby, two central challenges of qualitative social research are outlined: First, researchers must ensure that the single-case data can be aggregated and condensed so that new collective structures can be detected. Second, they must apply methods and practices to allow for the generalization of the results beyond the specific study. In the following, we demonstrate how and under what conditions these challenges can be addressed in research practice. To this end, the research process of the construction of an empirically based typology is described. A qualitative study, conducted within the framework of the Luxembourg Youth Report, is used to illustrate this process. Specifically, strategies are presented which increase the likelihood of generalizability or transferability of the results, while also highlighting their limitations.

  7. 9 CFR 166.2 - General restrictions.

    Science.gov (United States)

    2010-01-01

    ... of any of the following: Processed products; rendered products; bakery waste; candy waste; eggs... 9 Animals and Animal Products 1 2010-01-01 2010-01-01 false General restrictions. 166.2 Section 166.2 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF...

  8. Statement of the Director General to the forty-second regular session of the General Conference of the International Atomic Energy Agency

    International Nuclear Information System (INIS)

    1998-01-01

    In his Statement on the forty-second regular session of the General Conference of the IAEA, the Director General of the Agency highlighted the role of the IAEA in four areas: nuclear power and the fuel cycle, nuclear and radiation safety, nuclear verification and the security of material, and transfer of technology. The final part of the Statement is devoted to the process of programme and management review initiated by the Director General to ensure that the Agency maintains and enhances its record as an effective and efficient organization

  9. Decision making process and factors contributing to research participation among general practitioners: A grounded theory study.

    Science.gov (United States)

    Tong, Seng Fah; Ng, Chirk Jenn; Lee, Verna Kar Mun; Lee, Ping Yein; Ismail, Irmi Zarina; Khoo, Ee Ming; Tahir, Noor Azizah; Idris, Iliza; Ismail, Mastura; Abdullah, Adina

    2018-01-01

    The participation of general practitioners (GPs) in primary care research is variable and often poor. We aimed to develop a substantive and empirical theoretical framework to explain GPs' decision-making process to participate in research. We used the grounded theory approach to construct a substantive theory to explain the decision-making process of GPs to participate in research activities. Five in-depth interviews and four focus group discussions were conducted among 21 GPs. Purposeful sampling followed by theoretical sampling were used to attempt saturation of the core category. Data were collected using semi-structured open-ended questions. Interviews were recorded, transcribed verbatim and checked prior to analysis. Open line-by-line coding followed by focus coding were used to arrive at a substantive theory. Memoing was used to help bring concepts to higher abstract levels. The GPs' decision to participate in research was attributed to their inner drive and appreciation for primary care research and their confidence in managing their social and research environments. The drive and appreciation for research motivated the GPs to undergo research training to enhance their research knowledge, skills and confidence. However, the critical step in the GPs' decision to participate in research was their ability to align their research agenda with priorities in their social environment, which included personal life goals, clinical practice and organisational culture. Perceived support for research, such as funding and technical expertise, facilitated the GPs' participation in research. In addition, prior experiences participating in research also influenced the GPs' confidence in taking part in future research. The key to GPs deciding to participate in research is whether the research agenda aligns with the priorities in their social environment. Therefore, research training is important, but should be included in further measures and should comply with GPs' social

  10. Domain general constraints on statistical learning.

    Science.gov (United States)

    Thiessen, Erik D

    2011-01-01

    All theories of language development suggest that learning is constrained. However, theories differ on whether these constraints arise from language-specific processes or have domain-general origins such as the characteristics of human perception and information processing. The current experiments explored constraints on statistical learning of patterns, such as the phonotactic patterns of an infant's native language. Infants in these experiments were presented with a visual analog of a phonotactic learning task used by J. R. Saffran and E. D. Thiessen (2003). Saffran and Thiessen found that infants' phonotactic learning was constrained such that some patterns were learned more easily than other patterns. The current results indicate that infants' learning of visual patterns shows the same constraints as infants' learning of phonotactic patterns. This is consistent with theories suggesting that constraints arise from domain-general sources and, as such, should operate over many kinds of stimuli in addition to linguistic stimuli. © 2011 The Author. Child Development © 2011 Society for Research in Child Development, Inc.

  11. Generalized network improvement and packing problems

    CERN Document Server

    Holzhauser, Michael

    2016-01-01

    Michael Holzhauser discusses generalizations of well-known network flow and packing problems by additional or modified side constraints. By exploiting the inherent connection between the two problem classes, the author investigates the complexity and approximability of several novel network flow and packing problems and presents combinatorial solution and approximation algorithms. Contents: Fractional Packing and Parametric Search Frameworks; Budget-Constrained Minimum Cost Flows: The Continuous Case; Budget-Constrained Minimum Cost Flows: The Discrete Case; Generalized Processing Networks; Convex Generalized Flows. Target Groups: Researchers and students in the fields of mathematics, computer science, and economics; practitioners in operations research and logistics. The Author: Dr. Michael Holzhauser studied computer science at the University of Kaiserslautern and is now a research fellow in the Optimization Research Group at the Department of Mathematics of the University of Kaiserslautern.

  12. The Black-Scholes option pricing problem in mathematical finance: generalization and extensions for a large class of stochastic processes

    Science.gov (United States)

    Bouchaud, Jean-Philippe; Sornette, Didier

    1994-06-01

    The ability to price risks and devise optimal investment strategies in the presence of an uncertain "random" market is the cornerstone of modern finance theory. We first consider the simplest such problem of a so-called "European call option" initially solved by Black and Scholes using Ito stochastic calculus for markets modelled by a log-Brownian stochastic process. A simple and powerful formalism is presented which allows us to generalize the analysis to a large class of stochastic processes, such as ARCH, jump or Lévy processes. We also address the case of correlated Gaussian processes, which is shown to be a good description of three different market indices (MATIF, CAC40, FTSE100). Our main result is the introduction of the concept of an optimal strategy in the sense of (functional) minimization of the risk with respect to the portfolio. If the risk may be made to vanish for particular continuous uncorrelated 'quasi-Gaussian' stochastic processes (including the Black and Scholes model), this is no longer the case for more general stochastic processes. The value of the residual risk is obtained and suggests the concept of risk-corrected option prices. In the presence of very large deviations such as in Lévy processes, new criteria for rational fixing of the option prices are discussed. We also apply our method to other types of options, `Asian', `American', and discuss new possibilities (`doubledecker'...). The inclusion of transaction costs leads to the appearance of a natural characteristic trading time scale.
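    As a reference point for the zero-residual-risk case mentioned above, the classical Black-Scholes price of a European call under a log-Brownian market can be computed directly. The sketch below implements only that textbook formula, not the generalized risk-minimization framework of the paper; the example inputs are arbitrary.

      import numpy as np
      from scipy.stats import norm

      def black_scholes_call(S, K, T, r, sigma):
          """Black-Scholes price of a European call: S*N(d1) - K*exp(-r*T)*N(d2)."""
          d1 = (np.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * np.sqrt(T))
          d2 = d1 - sigma * np.sqrt(T)
          return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

      # example: at-the-money call, 1 year to maturity, 5% rate, 20% volatility
      print(black_scholes_call(S=100.0, K=100.0, T=1.0, r=0.05, sigma=0.20))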

  13. Computer-assisted analyses of [14C]2-DG autoradiographs employing a general purpose image processing system

    Energy Technology Data Exchange (ETDEWEB)

    Porro, C; Biral, G P [Modena Univ. (Italy). Ist. di Fisiologia Umana; Fonda, S; Baraldi, P [Modena Univ. (Italy). Lab. di Bioingegneria della Clinica Oculistica; Cavazzuti, M [Modena Univ. (Italy). Clinica Neurologica

    1984-09-01

    A general purpose image processing system is described, including a B/W TV camera, a high resolution image processor and display system (TESAK VDC 501), a computer (DEC PDP 11/23) and monochrome and color monitors. Images may be acquired from a microscope equipped with a TV camera or using the TV in direct viewing; the A/D converter and the image processor provide fast (40 ms) and precise (512x512 data points) digitization of the TV signal with a maximum resolution of 256 gray levels. Computer programs, written in FORTRAN and MACRO 11 Assembly Language, have been developed to perform qualitative and quantitative analyses of autoradiographs obtained with the 2-DG method. They include: (1) procedures designed to recognize errors in acquisition due to possible image shading and correct them via software; (2) routines suitable for qualitative analyses of the whole image or selected regions of it, providing the opportunity for pseudocolor coding, statistics, graphic overlays; (3) programs permitting the conversion of gray levels into metabolic rates of glucose utilization and the display of gray- or color-coded metabolic maps.

  14. Process correlation analysis model for process improvement identification.

    Science.gov (United States)

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by process assessment identifying strengths and weaknesses and, based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what is to be carried out in process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort required and the lack of expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.
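    The basic idea of deriving correlations of process elements from assessment results can be sketched with a small pandas example: each row is one assessed project, each column the rating of a process element, and the correlation matrix highlights elements that tend to move together. The element names and ratings below are hypothetical and are not taken from the paper or from CMMI data.

      import pandas as pd

      # hypothetical assessment ratings (0-10) for four process elements across six projects
      ratings = pd.DataFrame({
          "requirements_mgmt": [7, 5, 8, 4, 6, 9],
          "project_planning":  [6, 5, 7, 4, 6, 8],
          "verification":      [8, 4, 7, 3, 5, 9],
          "risk_mgmt":         [3, 6, 4, 7, 5, 2],
      })

      # pairwise correlations of process elements; strongly correlated pairs are
      # candidates for joint treatment in an improvement plan
      print(ratings.corr().round(2))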

  15. Criteria and Processes for the Certification of Non-Radioactive Hazardous and Non-Hazardous Wastes

    International Nuclear Information System (INIS)

    Dominick, J.

    2008-01-01

    This document details Lawrence Livermore National Laboratory's (LLNL) criteria and processes for determining if potentially volumetrically contaminated or potentially surface contaminated wastes are to be managed as material containing residual radioactivity or as non-radioactive. This document updates and replaces UCRL-AR-109662, Criteria and Procedures for the Certification of Nonradioactive Hazardous Waste (Reference 1), also known as 'The Moratorium', and follows the guidance found in the U.S. Department of Energy (DOE) document, Performance Objective for Certification of Non-Radioactive Hazardous Waste (Reference 2). The 1992 Moratorium document (UCRL-AR-109662) is three volumes and 703 pages. The first volume provides an overview of the certification process and lists the key radioanalytical methods and their associated Limits of Sensitivities. Volumes Two and Three contain supporting documents and include over 30 operating procedures, QA plans, training documents and organizational charts that describe the hazardous and radioactive waste management system in place in 1992. This current document is intended to update the previous Moratorium documents and to serve as the top-tier LLNL institutional Moratorium document. The 1992 Moratorium document was restricted to certification of Resource Conservation and Recovery Act (RCRA), State and Toxic Substances Control Act (TSCA) hazardous waste from Radioactive Material Management Areas (RMMA). This still remains the primary focus of the Moratorium; however, this document increases the scope to allow use of this methodology to certify other LLNL wastes and materials destined for off-site disposal, transfer, and re-use including non-hazardous wastes and wastes generated outside of RMMAs with the potential for DOE added radioactivity. The LLNL organization that authorizes off-site transfer/disposal of a material or waste stream is responsible for implementing the requirements of this document. The LLNL Radioactive and

  16. Generalized internal multiple imaging

    KAUST Repository

    Zuberi, Mohammad Akbar Hosain

    2014-12-04

    Various examples are provided for generalized internal multiple imaging (GIMI). In one example, among others, a method includes generating a higher order internal multiple image using a background Green's function and rendering the higher order internal multiple image for presentation. In another example, a system includes a computing device and a generalized internal multiple imaging (GIMI) application executable in the computing device. The GIMI application includes logic that generates a higher order internal multiple image using a background Green's function and logic that renders the higher order internal multiple image for display on a display device. In another example, a non-transitory computer readable medium has a program executable by processing circuitry that generates a higher order internal multiple image using a background Green's function and renders the higher order internal multiple image for display on a display device.

  17. Generalized internal multiple imaging

    KAUST Repository

    Zuberi, Mohammad Akbar Hosain; Alkhalifah, Tariq

    2014-01-01

    Various examples are provided for generalized internal multiple imaging (GIMI). In one example, among others, a method includes generating a higher order internal multiple image using a background Green's function and rendering the higher order internal multiple image for presentation. In another example, a system includes a computing device and a generalized internal multiple imaging (GIMI) application executable in the computing device. The GIMI application includes logic that generates a higher order internal multiple image using a background Green's function and logic that renders the higher order internal multiple image for display on a display device. In another example, a non-transitory computer readable medium has a program executable by processing circuitry that generates a higher order internal multiple image using a background Green's function and renders the higher order internal multiple image for display on a display device.

  18. The generalized second law and the black hole evaporation in an empty space as a nonequilibrium process

    International Nuclear Information System (INIS)

    Saida, Hiromi

    2006-01-01

    When a black hole is in an empty space in which there is no matter field except that of the Hawking radiation (Hawking field), then the black hole evaporates and the entropy of the black hole decreases. The generalized second law guarantees the increase of the total entropy of the whole system which consists of the black hole and the Hawking field. That is, the increase of the entropy of the Hawking field is faster than the decrease of the black hole entropy. In a naive sense, one may expect that the entropy increase of the Hawking field is due to the self-interaction among the composite particles of the Hawking field, and that the self-relaxation of the Hawking field results in the entropy increase. Then, when one considers a non-self-interacting matter field as the Hawking field, it is obvious that self-relaxation does not take place, and one may think that the total entropy does not increase. However, using nonequilibrium thermodynamics which has been developed recently, we find for the non-self-interacting Hawking field that the rate of entropy increase of the Hawking field (the entropy emission rate by the black hole) grows faster than the rate of entropy decrease of the black hole during the black hole evaporation in empty space. The origin of the entropy increase of the Hawking field is the increase of the black hole temperature. Hence an understanding of the generalized second law in the context of nonequilibrium thermodynamics is suggested; even if the self-relaxation of the Hawking field does not take place, the temperature increase of the black hole during the evaporation process causes the entropy increase of the Hawking field to result in the increase of the total entropy
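    The statement discussed above can be written compactly: with the Bekenstein-Hawking entropy of the hole, the generalized second law requires the total entropy of black hole plus Hawking field to be non-decreasing, so during evaporation the entropy emission rate of the Hawking field must exceed the magnitude of the black hole's entropy decrease. The formulas below are the standard textbook expressions, given here only for orientation.

      \[
        S_{\mathrm{BH}} = \frac{k_B c^3 A}{4 \hbar G}, \qquad
        \frac{d}{dt}\left( S_{\mathrm{BH}} + S_{\mathrm{rad}} \right) \ge 0 .
      \]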

  19. 21 CFR 820.180 - General requirements.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false General requirements. 820.180 Section 820.180 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL... minimize deterioration and to prevent loss. Those records stored in automated data processing systems shall...

  20. An Overview of Generalized Gamma Mittag–Leffler Model and Its Applications

    Directory of Open Access Journals (Sweden)

    Seema S. Nair

    2015-08-01

    Full Text Available Recently, probability models with thicker or thinner tails have gained more importance among statisticians and physicists because of their vast applications in random walks, Lévy flights, financial modeling, etc. In this connection, we introduce here a new family of generalized probability distributions associated with the Mittag–Leffler function. This family gives an extension to the generalized gamma family, opens up a vast area of potential applications and establishes connections to the topics of fractional calculus, nonextensive statistical mechanics, Tsallis statistics, superstatistics, the Mittag–Leffler stochastic process, the Lévy process and time series. Apart from examining the properties, the matrix-variate analogue and the connection to fractional calculus are also explained. By using the pathway model of Mathai, the model is further generalized. Connections to Mittag–Leffler distributions and corresponding autoregressive processes are also discussed.
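    For orientation, the two-parameter Mittag–Leffler function underlying this family is the standard series

      \[
        E_{\alpha,\beta}(z) = \sum_{k=0}^{\infty} \frac{z^k}{\Gamma(\alpha k + \beta)}, \qquad \alpha > 0, \ \beta > 0,
      \]

    which reduces to the exponential function for alpha = beta = 1.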

  1. Viewing brain processes as Critical State Transitions across levels of organization: Neural events in Cognition and Consciousness, and general principles.

    Science.gov (United States)

    Werner, Gerhard

    2009-04-01

    In this theoretical and speculative essay, I propose that insights into certain aspects of neural system functions can be gained from viewing brain function in terms of the branch of Statistical Mechanics currently referred to as "Modern Critical Theory" [Stanley, H.E., 1987. Introduction to Phase Transitions and Critical Phenomena. Oxford University Press; Marro, J., Dickman, R., 1999. Nonequilibrium Phase Transitions in Lattice Models. Cambridge University Press, Cambridge, UK]. The application of this framework is here explored in two stages: in the first place, its principles are applied to state transitions in global brain dynamics, with benchmarks of Cognitive Neuroscience providing the relevant empirical reference points. The second stage generalizes to suggest in more detail how the same principles could also apply to the relation between other levels of the structural-functional hierarchy of the nervous system and between neural assemblies. In this view, state transitions resulting from the processing at one level are the input to the next, in the image of a 'bucket brigade', with the content of each bucket being passed on along the chain, after having undergone a state transition. The unique features of a process of this kind will be discussed and illustrated.

  2. Davidson's generalization of the Fenyes-Nelson stochastic model of quantum mechanics

    International Nuclear Information System (INIS)

    Shucker, D.S.

    1980-01-01

    Davidson's generalization of the Fenyes-Nelson stochastic model of quantum mechanics is discussed. It is shown that this author's previous results concerning the Fenyes-Nelson process extend to the more general theory of Davidson. (orig.)

  3. On a Fractional Binomial Process

    Science.gov (United States)

    Cahoy, Dexter O.; Polito, Federico

    2012-02-01

    The classical binomial process has been studied by Jakeman (J. Phys. A 23:2815-2825, 1990) (and the references therein) and has been used to characterize a series of radiation states in quantum optics. In particular, he studied a classical birth-death process where the chance of birth is proportional to the difference between a larger fixed number and the number of individuals present. It is shown that at large times, an equilibrium is reached which follows a binomial process. In this paper, the classical binomial process is generalized using the techniques of fractional calculus and is called the fractional binomial process. The fractional binomial process is shown to preserve the binomial limit at large times while expanding the class of models that include non-binomial fluctuations (non-Markovian) at regular and small times. As a direct consequence, the generality of the fractional binomial model makes the proposed model more desirable than its classical counterpart in describing real physical processes. More statistical properties are also derived.

  4. Modalities of Generalization Through Single Case Studies.

    Science.gov (United States)

    Zittoun, Tania

    2017-06-01

    The value of case studies for theory building is still doubted in psychology. The paper argues for the importance of case studies and the possibility of generalizing from these for a specific sociocultural understanding of human development. The paper first clarifies the notion of abduction within case studies, drawing on pragmatists James and Peirce and expanding it with the work of Lewin, and argues that it is the core mechanism that allows generalization from case studies. The second section presents the possibility of generalizing from individual single case studies, for which not only the subjective perspective, but also the dynamics by which the social and cultural environment guide and enable the person's development, have to be accounted for. The third section elaborates the question of institutional case studies, where the challenge is to account both for institutional dynamics, and for persons' trajectories within; this is exemplified with an ongoing study on the process of obtaining citizenship in Switzerland. The paper briefly concludes by highlighting two possible implications of the paper, one concerning the process of theoretical reasoning, the other, the fact that sociocultural psychology could itself be seen as an institution in-the-making.

  5. Initialization-free generalized Deutsch-Jozsa algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Chi, Dong Pyo [School of Mathematical Sciences, Seoul National University, Seoul (Korea, Republic of)]. E-mail: dpchi@math.snu.ac.kr; Kim, Jinsoo [School of Electrical Engineering and Computer Science, Seoul National University, Seoul (Korea)]. E-mail: jkim@ee.snu.ac.kr; Lee, Soojoon [School of Mathematical Sciences, Seoul National University, Seoul (Korea)]. E-mail: level@math.snu.ac.kr

    2001-06-29

    We generalize the Deutsch-Jozsa algorithm by exploiting summations of the roots of unity. The generalized algorithm distinguishes a wider class of functions promised to be either constant or many-to-one and onto an evenly spaced range. As before, the generalized quantum algorithm solves this problem using a single functional evaluation. We also consider the problem of distinguishing constant and evenly balanced functions and present a quantum algorithm for this problem that does not require any initialization of an auxiliary register involved in the process of functional evaluation and that, after solving the problem, recovers the initial state of the auxiliary register. (author)
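    The standard (ungeneralized) Deutsch-Jozsa algorithm can be simulated classically on a handful of qubits using a phase oracle: after Hadamards, the oracle, and Hadamards again, the all-zeros outcome has probability 1 for a constant function and 0 for a balanced one. This sketch shows only that textbook case, not the initialization-free generalization of the paper; the example oracles are arbitrary.

      import numpy as np
      from itertools import product

      def deutsch_jozsa(f, n):
          """Probability of measuring |0...0> for a boolean oracle f on n bits:
          1.0 if f is constant, 0.0 if f is balanced."""
          xs = list(product([0, 1], repeat=n))
          state = np.full(2 ** n, 2 ** (-n / 2))                 # H^n applied to |0...0>
          state = state * np.array([(-1) ** f(x) for x in xs])   # phase oracle
          amp0 = state.sum() * 2 ** (-n / 2)                     # <0...0| H^n |state>
          return abs(amp0) ** 2

      constant = lambda x: 0
      balanced = lambda x: x[0]          # balanced: output equals the first bit
      print(deutsch_jozsa(constant, 3))  # -> 1.0
      print(deutsch_jozsa(balanced, 3))  # -> 0.0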

  6. Gamma irradiators for radiation processing

    International Nuclear Information System (INIS)

    2006-01-01

    Radiation technology is one of the most important fields which the IAEA supports and promotes, and has several programmes that facilitate its use in the developing Member States. In view of this mandate, this Booklet on 'Gamma Irradiators for Radiation Processing' has been prepared, describing the variety of gamma irradiators that can be used for radiation processing applications. It is intended to present a description of the general principles of design and operation of the gamma irradiators currently available for industrial use. It aims at providing information to industrial end users to familiarise them with the technology, with the hope that the information contained here will assist them in selecting the optimum irradiator for their needs. Correct selection affects not only the ease of operation but also yields higher efficiency, and thus improved economy. The Booklet is also intended for promoting radiation processing in general to governments and the general public.

  7. [Pregnancy in the context of general adaptation syndrome].

    Science.gov (United States)

    Gur'ianov, V A; Pyregov, A V; Tolmachev, G N; Volodin, A V

    2007-01-01

    Based on their own findings and the data available in the literature on pregnancy, including pregnancy complicated by gestosis, the authors consider these conditions in the context of Selye's general adaptation syndrome. They identify its basic links (the autonomic nervous and cardiovascular systems), the function of which is affected by all the physiological and pathophysiological processes involved in its development. There is a high likelihood of baseline impaired adaptation processes in these links, which may lead to an inability to accommodate (dysadaptation) by the moment of delivery. The paper gives the current interpretation of functional disorders, called Zangemeister's triad in 1913, from the present-day point of view of the evaluation of pregnancy as the systemic inflammatory response syndrome and, probably, adaptation disease. Based on the results of analyzing the data available in the literature, the authors indicate the basic physiological trends in the modulation of impaired development processes of the general adaptation syndrome towards the completion of pregnancy and surgical delivery.

  8. Smooth generalized linear models for aggregated data

    OpenAIRE

    Ayma Anza, Diego Armando

    2016-01-01

    International Mention in the doctoral degree. Aggregated data commonly appear in areas such as epidemiology, demography, and public health. Generally, the aggregation process is done to protect the privacy of patients, to facilitate compact presentation, or to make it comparable with other coarser datasets. However, this process may hinder the visualization of the distribution underlying the data. Also, it prohibits the direct analysis of relationships between ag...

  9. Hypothermic general cold adaptation induced by local cold acclimation.

    Science.gov (United States)

    Savourey, G; Barnavol, B; Caravel, J P; Feuerstein, C; Bittel, J H

    1996-01-01

    To study relationships between local cold adaptation of the lower limbs and general cold adaptation, eight subjects were submitted both to a cold foot test (CFT, 5 degrees C water immersion, 5 min) and to a whole-body standard cold air test (SCAT, 1 degree C, 2 h, nude at rest) before and after a local cold acclimation (LCA) of the lower limbs effected by repeated cold water immersions. The LCA induced a local cold adaptation confirmed by higher skin temperatures of the lower limbs during CFT and a hypothermic insulative general cold adaptation (decreased rectal temperature and mean skin temperature P adaptation was related to the habituation process confirmed by decreased plasma concentrations of noradrenaline (NA) during LCA (P general cold adaptation was unrelated either to local cold adaptation or to the habituation process, because an increased NA during SCAT after LCA (P syndrome" occurring during LCA.

  10. Process energy analysis

    International Nuclear Information System (INIS)

    Kaiser, V.

    1993-01-01

    In Chapter 2 process energy cost analysis for chemical processing is treated in a general way, independent of the specific form of energy and power production. Especially, energy data collection and data treatment, energy accounting (metering, balance setting), specific energy input, and utility energy costs and prices are discussed. (R.P.) 14 refs., 4 figs., 16 tabs

  11. Three-Dimensional General-Relativistic Magnetohydrodynamic Simulations of Remnant Accretion Disks from Neutron Star Mergers: Outflows and r-Process Nucleosynthesis.

    Science.gov (United States)

    Siegel, Daniel M; Metzger, Brian D

    2017-12-08

    The merger of binary neutron stars, or of a neutron star and a stellar-mass black hole, can result in the formation of a massive rotating torus around a spinning black hole. In addition to providing collimating media for γ-ray burst jets, unbound outflows from these disks are an important source of mass ejection and rapid neutron capture (r-process) nucleosynthesis. We present the first three-dimensional general-relativistic magnetohydrodynamic (GRMHD) simulations of neutrino-cooled accretion disks in neutron star mergers, including a realistic equation of state valid at low densities and temperatures, self-consistent evolution of the electron fraction, and neutrino cooling through an approximate leakage scheme. After initial magnetic field amplification by magnetic winding, we witness the vigorous onset of turbulence driven by the magnetorotational instability (MRI). The disk quickly reaches a balance between heating from MRI-driven turbulence and neutrino cooling, which regulates the midplane electron fraction to a low equilibrium value Y_{e}≈0.1. Over the 380-ms duration of the simulation, we find that a fraction ≈20% of the initial torus mass is unbound in powerful outflows with asymptotic velocities v≈0.1c and electron fractions Y_{e}≈0.1-0.25. Postprocessing the outflows through a nuclear reaction network shows the production of a robust second- and third-peak r process. Though broadly consistent with the results of previous axisymmetric hydrodynamical simulations, extrapolation of our results to late times suggests that the total ejecta mass from GRMHD disks is significantly higher. Our results provide strong evidence that postmerger disk outflows are an important site for the r process.

  12. Explanation and inference: Mechanistic and functional explanations guide property generalization

    Directory of Open Access Journals (Sweden)

    Tania eLombrozo

    2014-09-01

    Full Text Available The ability to generalize from the known to the unknown is central to learning and inference. Two experiments explore the relationship between how a property is explained and how that property is generalized to novel species and artifacts. The experiments contrast the consequences of explaining a property mechanistically, by appeal to parts and processes, with the consequences of explaining the property functionally, by appeal to functions and goals. The findings suggest that properties that are explained functionally are more likely to be generalized on the basis of shared functions, with a weaker relationship between mechanistic explanations and generalization on the basis of shared parts and processes. The influence of explanation type on generalization holds even though all participants are provided with the same mechanistic and functional information, and whether an explanation type is freely generated (Experiment 1), experimentally provided (Experiment 2), or experimentally induced (Experiment 2). The experiments also demonstrate that explanations and generalizations of a particular type (mechanistic or functional) can be experimentally induced by providing sample explanations of that type, with a comparable effect when the sample explanations come from the same domain or from a different domain. These results suggest that explanations serve as a guide to generalization, and contribute to a growing body of work supporting the value of distinguishing mechanistic and functional explanations.

  13. Explanation and inference: mechanistic and functional explanations guide property generalization.

    Science.gov (United States)

    Lombrozo, Tania; Gwynne, Nicholas Z

    2014-01-01

    The ability to generalize from the known to the unknown is central to learning and inference. Two experiments explore the relationship between how a property is explained and how that property is generalized to novel species and artifacts. The experiments contrast the consequences of explaining a property mechanistically, by appeal to parts and processes, with the consequences of explaining the property functionally, by appeal to functions and goals. The findings suggest that properties that are explained functionally are more likely to be generalized on the basis of shared functions, with a weaker relationship between mechanistic explanations and generalization on the basis of shared parts and processes. The influence of explanation type on generalization holds even though all participants are provided with the same mechanistic and functional information, and whether an explanation type is freely generated (Experiment 1), experimentally provided (Experiment 2), or experimentally induced (Experiment 2). The experiments also demonstrate that explanations and generalizations of a particular type (mechanistic or functional) can be experimentally induced by providing sample explanations of that type, with a comparable effect when the sample explanations come from the same domain or from a different domain. These results suggest that explanations serve as a guide to generalization, and contribute to a growing body of work supporting the value of distinguishing mechanistic and functional explanations.

  14. Domain general sequence operations contribute to pre-SMA involvement in visuo-spatial processing

    Directory of Open Access Journals (Sweden)

    E. Charles eLeek

    2016-01-01

    This study used 3T fMRI to elucidate the functional role of the supplementary motor area (SMA) in relation to visuo-spatial processing. A localizer task contrasting sequential number subtraction and repetitive button pressing was used to functionally delineate non-motor sequence processing in pre-SMA, and activity in SMA-proper associated with motor sequencing. Patterns of BOLD responses in these regions were then contrasted to those from two tasks of visuo-spatial processing. In one task participants performed mental rotation, in which recognition memory judgments were made to previously memorized 2D novel patterns across image-plane rotations. The other task involved abstract grid navigation, in which observers computed a series of imagined location shifts in response to directional (arrow) cues around a mental grid. The results showed overlapping activation in pre-SMA for sequential subtraction and both visuo-spatial tasks. These results suggest that visuo-spatial processing is supported by non-motor sequence operations that involve pre-SMA. More broadly, these data further highlight the functional heterogeneity of pre-SMA, and show that its role extends to processes beyond the planning and online control of movement.

  15. 29 CFR 780.128 - General statement on “secondary” agriculture.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 3 2010-07-01 2010-07-01 false General statement on “secondary” agriculture. 780.128... APPLICABLE TO AGRICULTURE, PROCESSING OF AGRICULTURAL COMMODITIES, AND RELATED SUBJECTS UNDER THE FAIR LABOR STANDARDS ACT General Scope of Agriculture Practices Exempt Under “secondary” Meaning of Agriculture...

  16. OPTIMAL PROCESSES IN IRREVERSIBLE THERMODYNAMICS AND MICROECONOMICS

    Directory of Open Access Journals (Sweden)

    Vladimir A. Kazakov

    2004-06-01

    This paper describes a general methodology that allows one to extend the Carnot efficiency of classical thermodynamics for zero-rate processes to thermodynamic systems operating at finite rate. We define the class of minimal dissipation processes and show that it represents a generalization of reversible processes and determines the limiting possibilities of finite-rate systems. The described methodology is then applied to microeconomic exchange systems, yielding novel estimates of limiting efficiencies for such systems.

  17. A Java-based fMRI processing pipeline evaluation system for assessment of univariate general linear model and multivariate canonical variate analysis-based pipelines.

    Science.gov (United States)

    Zhang, Jing; Liang, Lichen; Anderson, Jon R; Gatewood, Lael; Rottenberg, David A; Strother, Stephen C

    2008-01-01

    As functional magnetic resonance imaging (fMRI) becomes widely used, the demands for evaluation of fMRI processing pipelines and validation of fMRI analysis results are increasing rapidly. The current NPAIRS package, an IDL-based fMRI processing pipeline evaluation framework, lacks system interoperability and the ability to evaluate general linear model (GLM)-based pipelines using prediction metrics. Thus, it cannot fully evaluate fMRI analytical software modules such as FSL.FEAT and NPAIRS.GLM. In order to overcome these limitations, a Java-based fMRI processing pipeline evaluation system was developed. It integrated YALE (a machine learning environment) into Fiswidgets (an fMRI software environment) to obtain system interoperability and applied an algorithm to measure GLM prediction accuracy. The results demonstrated that the system can evaluate fMRI processing pipelines with univariate GLM and multivariate canonical variates analysis (CVA)-based models on real fMRI data based on prediction accuracy (classification accuracy) and statistical parametric image (SPI) reproducibility. In addition, a preliminary study was performed where four fMRI processing pipelines with GLM and CVA modules such as FSL.FEAT and NPAIRS.CVA were evaluated with the system. The results indicated that (1) the system can compare different fMRI processing pipelines with heterogeneous models (NPAIRS.GLM, NPAIRS.CVA and FSL.FEAT) and rank their performance by automatic performance scoring, and (2) the rank of pipeline performance is highly dependent on the preprocessing operations. These results suggest that the system will be of value for the comparison, validation, standardization and optimization of functional neuroimaging software packages and fMRI processing pipelines.
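
    The two evaluation criteria named above, prediction (classification) accuracy and statistical parametric image (SPI) reproducibility, can be illustrated with a minimal split-half sketch in Python. This is only a rough illustration of the NPAIRS-style idea, not the actual package: the callables fit_predict and make_spi, the toy split-half resampling, and the correlation-based reproducibility measure are all assumptions for illustration.

      import numpy as np

      def split_half_metrics(X, y, fit_predict, make_spi, seed=0):
          """Rough NPAIRS-style evaluation sketch (illustrative only).

          X           : (n_scans, n_voxels) data matrix
          y           : (n_scans,) class labels (e.g., task conditions)
          fit_predict : callable(train_X, train_y, test_X) -> predicted labels
          make_spi    : callable(train_X, train_y) -> (n_voxels,) statistical image
          Returns (prediction accuracy, SPI reproducibility).
          """
          rng = np.random.default_rng(seed)
          idx = rng.permutation(len(y))
          half1, half2 = idx[: len(y) // 2], idx[len(y) // 2:]

          # Prediction accuracy: each half predicts the labels of the other half
          acc1 = np.mean(fit_predict(X[half1], y[half1], X[half2]) == y[half2])
          acc2 = np.mean(fit_predict(X[half2], y[half2], X[half1]) == y[half1])

          # Reproducibility: correlation between the two halves' statistical images
          spi1 = make_spi(X[half1], y[half1])
          spi2 = make_spi(X[half2], y[half2])
          reproducibility = np.corrcoef(spi1, spi2)[0, 1]

          return (acc1 + acc2) / 2.0, reproducibility

    Pipelines can then be ranked by scoring the resulting (accuracy, reproducibility) pairs, which is the spirit of the automatic performance scoring mentioned in the record.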

  18. General anesthetics in children: neurotoxic or neuroprotective?

    Directory of Open Access Journals (Sweden)

    Jéssica Farias Rebouças

    2017-02-01

    Introduction: General anesthetics are involved in neuroprotection in adults after ischemic events, but also in cognitive impairment; they may likewise be associated with learning disorders in children exposed to them before three years of age. Objective: To describe the neurotoxic effects of general anesthetics in experimental animals and children. Method: This is a systematic review, performed by searching databases and PubMed using the keywords "neurotoxicity" and "general anesthetics", as well as "general anesthetics", "neurotoxicity", "children", "young child" and "pediatric". Results: The search returned 185 articles, of which 78 met our inclusion criteria. We found significant evidence of neurotoxicity induced by general anesthetics in newborn experimental animals, resulting in late and permanent cognitive deficits. This effect was associated with multiple exposures, length of exposure, and combinations of drugs. However, some studies found cognitive impairment after a single exposure to an anesthetic. Conclusion: There is insufficient evidence to state that general anesthetics are neurotoxic and have the potential to trigger learning and behavior disabilities in children. However, we suggest caution in indicating surgery in children under three years old, weighing risks and benefits and involving the family in the decision process. Keywords: Neurotoxicity; Neuroprotection; Cognitive Impairment; Children; General Anesthetics

  19. The Sluggishness of Early-Stage Face Processing (N170) is Correlated with Negative and General Psychiatric Symptoms in Schizophrenia

    Directory of Open Access Journals (Sweden)

    Yingjun Zheng

    2016-11-01

    Patients with schizophrenia exhibit consistent abnormalities in face-evoked N170. However, the relation between face-specific N170 abnormalities and the clinical characteristics of schizophrenia, which is probably based on common neural mechanisms, has rarely been investigated. Using event-related potential (ERP) recordings in both schizophrenic patients and healthy controls, the amplitude and latency of N170 were measured while participants passively watched face and non-face (table) pictures. The results showed face-specific N170 latency sluggishness in schizophrenic patients, i.e., the N170 latencies of schizophrenic patients were significantly longer than those of healthy controls under both upright-face and inverted-face conditions. Importantly, the face-related N170 latencies at the left temporo-occipital electrodes (P7 and PO7) were positively correlated with negative symptoms and general psychiatric symptoms. Beyond the latency analysis, N170 amplitudes were weaker in schizophrenic patients under both inverted-face and inverted-table conditions, with left-hemisphere dominance. More interestingly, the face inversion effects (the differences in N170 amplitude between upright and inverted faces) were absent in schizophrenic patients, suggesting abnormal holistic face processing. These results reveal a marked, symptom-relevant neural sluggishness of face-specific processing in schizophrenic patients, supporting the demyelinating hypothesis of schizophrenia.

  20. Small business, cash budgets and general practice.

    Science.gov (United States)

    Jackson, A R

    1991-01-01

    In practice management, general practice falls into the category of small business with all its attendant generic problems. Disciplined planning and good financial management are not often seen in small business. These are required if general practitioners are to continue (or return to) the provision of high quality medical services. An effective budget process, especially cash-flow budgeting, is the key to successful planning and financial management. Budgeting will bring Control, Co-ordination, and Credibility to your practice. It will enable you to set goals and to achieve them.

  1. The protection of fundamental human rights in criminal process General report

    NARCIS (Netherlands)

    Brants, C.; Franken, Stijn

    2009-01-01

    This contribution examines the effect of the uniform standards of human rights in international conventions on criminal process in different countries and identifies factors inherent in national systems that influence the scope of international standards and the way in which they are implemented in

  2. Surprise! Infants Consider Possible Bases of Generalization for a Single Input Example

    Science.gov (United States)

    Gerken, LouAnn; Dawson, Colin; Chatila, Razanne; Tenenbaum, Josh

    2015-01-01

    Infants have been shown to generalize from a small number of input examples. However, existing studies allow two possible means of generalization. One is via a process of noting similarities shared by several examples. Alternatively, generalization may reflect an implicit desire to explain the input. The latter view suggests that generalization…

  3. Inspector General, DOD, Oversight of the Naval Audit Service Audit of the Navy General Fund Financial Statements for FY 1998

    National Research Council Canada - National Science Library

    1999-01-01

    .... The audit objective was to determine the accuracy and completeness of the Naval Audit Service audit of the Navy General Fund Financial Statements for Fiscal Year 1998. See Appendix A for a discussion of the audit process.

  4. Representations and processes of human spatial competence.

    Science.gov (United States)

    Gunzelmann, Glenn; Lyon, Don R

    2011-10-01

    This article presents an approach to understanding human spatial competence that focuses on the representations and processes of spatial cognition and how they are integrated with cognition more generally. The foundational theoretical argument for this research is that spatial information processing is central to cognition more generally, in the sense that it is brought to bear ubiquitously to improve the adaptivity and effectiveness of perception, cognitive processing, and motor action. We describe research spanning multiple levels of complexity to understand both the detailed mechanisms of spatial cognition, and how they are utilized in complex, naturalistic tasks. In the process, we discuss the critical role of cognitive architectures in developing a consistent account that spans this breadth, and we note some areas in which the current version of a popular architecture, ACT-R, may need to be augmented. Finally, we suggest a framework for understanding the representations and processes of spatial competence and their role in human cognition generally. Copyright © 2011 Cognitive Science Society, Inc.

  5. Food-Processing Wastes.

    Science.gov (United States)

    Frenkel, Val S; Cummings, Gregg A; Maillacheruvu, K Y; Tang, Walter Z

    2017-10-01

    Literature published in 2016 and early 2017 related to food processing wastes treatment for industrial applications are reviewed. This review is a subsection of the Treatment Systems section of the annual Water Environment Federation literature review and covers the following food processing industries and applications: general, meat and poultry, fruits and vegetables, dairy and beverage, and miscellaneous treatment of food wastes.

  6. A generalization of Takane's algorithm for DEDICOM

    NARCIS (Netherlands)

    Kiers, Henk A.L.; ten Berge, Jos M.F.; Takane, Yoshio; de Leeuw, Jan

    An algorithm is described for fitting the DEDICOM model for the analysis of asymmetric data matrices. This algorithm generalizes an algorithm suggested by Takane in that it uses a damping parameter in the iterative process. Takane's algorithm does not always converge monotonically. Based on the

  7. Efficient coding explains the universal law of generalization in human perception.

    Science.gov (United States)

    Sims, Chris R

    2018-05-11

    Perceptual generalization and discrimination are fundamental cognitive abilities. For example, if a bird eats a poisonous butterfly, it will learn to avoid preying on that species again by generalizing its past experience to new perceptual stimuli. In cognitive science, the "universal law of generalization" seeks to explain this ability and states that generalization between stimuli will follow an exponential function of their distance in "psychological space." Here, I challenge existing theoretical explanations for the universal law and offer an alternative account based on the principle of efficient coding. I show that the universal law emerges inevitably from any information processing system (whether biological or artificial) that minimizes the cost of perceptual error subject to constraints on the ability to process or transmit information. Copyright © 2018 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works.

  8. 32 CFR 865.110 - Decision process.

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 6 2010-07-01 2010-07-01 false Decision process. 865.110 Section 865.110...-GENERAL PERSONNEL REVIEW BOARDS Air Force Discharge Review Board § 865.110 Decision process. (a) The DRB... decision making process. ...

  9. Using fuzzy logic in image processing

    International Nuclear Information System (INIS)

    Ashabrawy, M.A.F.

    2002-01-01

    Due to the unavoidable merging of computing and mathematics, signal processing in general, and image processing in particular, have greatly improved and advanced. Signal processing deals with the processing of any signal data for use by a computer, while image processing deals specifically with images. Image processing involves the manipulation of image data for better appearance and viewing by people; consequently, it is a rapidly growing and exciting field to be involved in today. This work takes an applications-oriented approach to image processing. The applications are the maps and documents of the first Egyptian research reactor (ETRR-1), x-ray medical images, and fingerprint images. Since filters generally work on continuous ranges rather than discrete values, fuzzy logic techniques are convenient; these techniques are powerful in image processing and can deal with both one-dimensional (1-D) and two-dimensional (2-D) images.
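
    As a rough illustration of the kind of fuzzy technique the record describes (not the author's actual method), the sketch below applies the classical fuzzy intensification (INT) operator to a grayscale image: pixel intensities are fuzzified to the range [0, 1], memberships are stretched away from the 0.5 crossover point, and the result is mapped back to [0, 255]. The test image and the number of passes are invented.

      import numpy as np

      def fuzzy_intensify(image, passes=1):
          """Contrast enhancement with the classical fuzzy INT operator."""
          # Fuzzification: map intensities to membership values in [0, 1]
          mu = image.astype(float) / 255.0
          for _ in range(passes):
              # INT operator: push memberships away from the crossover point 0.5
              mu = np.where(mu < 0.5, 2.0 * mu ** 2, 1.0 - 2.0 * (1.0 - mu) ** 2)
          # Defuzzification: back to the 0-255 intensity range
          return np.clip(mu * 255.0, 0, 255).astype(np.uint8)

      # Hypothetical low-contrast test image
      rng = np.random.default_rng(0)
      img = rng.normal(128, 20, size=(64, 64)).clip(0, 255).astype(np.uint8)
      enhanced = fuzzy_intensify(img, passes=2)
      print(img.std(), enhanced.std())  # the spread (contrast) should increase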

  10. A fast butterfly algorithm for generalized Radon transforms

    KAUST Repository

    Hu, Jingwei; Fomel, Sergey; Demanet, Laurent; Ying, Lexing

    2013-01-01

    Generalized Radon transforms, such as the hyperbolic Radon transform, cannot be implemented as efficiently in the frequency domain as convolutions, thus limiting their use in seismic data processing. We have devised a fast butterfly algorithm

  11. Rational Unified Process

    OpenAIRE

    Kopal, Nils

    2016-01-01

    In this German seminar paper, which was written in the year 2011 at the University of Duisburg for a Bachelor Colloquium in Applied computer science, we show a brief overview of the Rational Unified Process (RUP). Thus, interested students or generally interested people in software development gain a first impression of RUP. The paper includes a survey and overview of the underlying process structure, the phases of the process, its workflows, and describes the always by the RUP developers pos...

  12. Weak convergence of marked point processes generated by crossings of multivariate jump processes

    DEFF Research Database (Denmark)

    Tamborrino, Massimiliano; Sacerdote, Laura; Jacobsen, Martin

    2014-01-01

    We consider the multivariate point process determined by the crossing times of the components of a multivariate jump process through a multivariate boundary, assuming to reset each component to an initial value after its boundary crossing. We prove that this point process converges weakly...... process converging to a multivariate Ornstein–Uhlenbeck process is discussed as a guideline for applying diffusion limits for jump processes. We apply our theoretical findings to neural network modeling. The proposed model gives a mathematical foundation to the generalization of the class of Leaky...

  13. General general game AI

    OpenAIRE

    Togelius, Julian; Yannakakis, Georgios N.; 2016 IEEE Conference on Computational Intelligence and Games (CIG)

    2016-01-01

    Arguably the grand goal of artificial intelligence research is to produce machines with general intelligence: the capacity to solve multiple problems, not just one. Artificial intelligence (AI) has investigated the general intelligence capacity of machines within the domain of games more than any other domain given the ideal properties of games for that purpose: controlled yet interesting and computationally hard problems. This line of research, however, has so far focuse...

  14. General Overview of Desalination Technology

    International Nuclear Information System (INIS)

    Ari-Nugroho

    2004-01-01

    Desalination, as discussed in this journal, refers to a water treatment process that removes salts from water. Desalination can be done in a number of ways, but the result is always the same: fresh water is produced from brackish water or seawater. The quality of the distillate is indicated by its Total Dissolved Solids (TDS) content; the lower the TDS content, the higher the quality of the water. This article describes a general analysis of desalination technologies, the varieties of water, plant operation and maintenance, and a general comparison between desalination technologies. Basically, two common classes of technology are in use, thermal and membrane desalination, represented by Multi Effect Distillation (MED) and Multi Stage Flash (MSF) on the thermal side and Reverse Osmosis (RO) on the membrane side. The two classes differ in their energy source: thermal desalination needs a heat source from the power plant, while membrane desalination needs only electricity to run the pumps. In thermal desalination, the vapour from boiling feedwater is condensed; this process produces the least saline water, about 10 parts per million (ppm). The membrane technology uses a semipermeable membrane to separate fresh water from dissolved salts and produces fresh water of about 350-500 ppm. (author)

  15. A Measurable Model of the Creative Process in the Context of a Learning Process

    Science.gov (United States)

    Ma, Min; Van Oystaeyen, Fred

    2016-01-01

    The authors' aim was to arrive at a measurable model of the creative process by putting creativity in the context of a learning process. The authors aimed to provide a rather detailed description of how creative thinking fits in a general description of the learning process without trying to go into an analysis of a biological description of the…

  16. A generalized model via random walks for information filtering

    Science.gov (United States)

    Ren, Zhuo-Ming; Kong, Yixiu; Shang, Ming-Sheng; Zhang, Yi-Cheng

    2016-08-01

    There could exist a simple general mechanism lurking beneath collaborative filtering and the interdisciplinary physics approaches which have been successfully applied to online E-commerce platforms. Motivated by this idea, we propose a generalized model employing the dynamics of a random walk on bipartite networks. Taking degree information into account, the proposed generalized model can recover collaborative filtering, the interdisciplinary physics approaches, and even substantial extensions of them. Furthermore, we analyze the generalized model with single and hybrid degree information in the random-walk process on bipartite networks, and propose a possible strategy that uses hybrid degree information for objects of different popularity to achieve promising recommendation precision.
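
    A minimal sketch of the kind of degree-weighted random walk described above is given below: a binary user-object matrix drives a two-step resource redistribution between objects, with a hybrid exponent interpolating between mass-diffusion-like and heat-conduction-like normalizations. This follows the spirit of standard hybrid recommenders; the exact weighting used in the cited generalized model may differ, and the toy matrix is invented.

      import numpy as np

      def hybrid_random_walk_scores(A, lam=0.5):
          """Score unseen objects for every user via a two-step random walk.

          A   : (n_users, n_objects) binary interaction matrix
          lam : hybrid parameter; lam=1 ~ mass diffusion, lam=0 ~ heat conduction
          """
          k_user = A.sum(axis=1)
          k_obj = A.sum(axis=0)
          k_user[k_user == 0] = 1.0
          k_obj[k_obj == 0] = 1.0

          # Object-to-object redistribution: spread over users, collect on objects
          W = (A / k_user[:, None]).T @ A
          W = W / (k_obj[:, None] ** (1.0 - lam) * k_obj[None, :] ** lam)

          scores = A @ W.T          # each user's resource starts on collected objects
          scores[A > 0] = -np.inf   # do not re-recommend already collected objects
          return scores

      # Toy usage: rank candidate objects for user 0
      A = np.array([[1, 1, 0, 0],
                    [0, 1, 1, 0],
                    [1, 0, 1, 1]], dtype=float)
      print(np.argsort(-hybrid_random_walk_scores(A)[0]))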

  17. General classification of maturation reaction-norm shape from size-based processes

    DEFF Research Database (Denmark)

    Christensen, Asbjørn; Andersen, Ken Haste

    2011-01-01

    for growth and mortality is based on processes at the level of the individual, and is motivated by the energy budget of fish. MRN shape is a balance between opposing factors and depends on subtle details of size dependence of growth and mortality. MRNs with both positive and negative slopes are predicted...

  18. Energy and Uncertainty in General Relativity

    Science.gov (United States)

    Cooperstock, F. I.; Dupre, M. J.

    2018-03-01

    The issue of energy and its potential localizability in general relativity has challenged physicists for more than a century. Many non-invariant measures were proposed over the years but an invariant measure was never found. We discovered the invariant localized energy measure by expanding the domain of investigation from space to spacetime. We note from relativity that the finiteness of the velocity of propagation of interactions necessarily induces indefiniteness in measurements. This is because the elements of actual physical systems being measured as well as their detectors are characterized by entire four-velocity fields, which necessarily leads to information from a measured system being processed by the detector in a spread of time. General relativity adds additional indefiniteness because of the variation in proper time between elements. The uncertainty is encapsulated in a generalized uncertainty principle, in parallel with that of Heisenberg, which incorporates the localized contribution of gravity to energy. This naturally leads to a generalized uncertainty principle for momentum as well. These generalized forms and the gravitational contribution to localized energy would be expected to be of particular importance in the regimes of ultra-strong gravitational fields. We contrast our invariant spacetime energy measure with the standard 3-space energy measure which is familiar from special relativity, appreciating why general relativity demands a measure in spacetime as opposed to 3-space. We illustrate the misconceptions by certain authors of our approach.

  19. Generalized functions

    CERN Document Server

    Gelfand, I M; Graev, M I; Vilenkin, N Y; Pyatetskii-Shapiro, I I

    Volume 1 is devoted to the basics of the theory of generalized functions. The first chapter contains the main definitions and most important properties of generalized functions as functionals on the space of smooth functions with compact support. The second chapter discusses the Fourier transform of generalized functions. In Chapter 3, definitions and properties of some important classes of generalized functions are discussed; in particular, generalized functions supported on submanifolds of lower dimension, generalized functions associated with quadratic forms, and homogeneous generalized functions are studied in detail. Many simple basic examples make this book an excellent place for a novice to get acquainted with the theory of generalized functions. A long appendix presents basics of generalized functions of complex variables.

  20. Utilizing General Purpose Graphics Processing Units to Improve Performance of Computer Modelling and Visualization

    Science.gov (United States)

    Monk, J.; Zhu, Y.; Koons, P. O.; Segee, B. E.

    2009-12-01

    With the introduction of the G8X series of cards by nVidia, an architecture called CUDA was released, and virtually all subsequent video cards have had CUDA support. With this new architecture nVidia provided extensions for C/C++ that create an Application Programming Interface (API) allowing code to be executed on the GPU. Since then the concept of GPGPU (general-purpose graphics processing unit) computing has been growing: the GPU is very good at algebra and at running things in parallel, so that power should be put to use for other applications. This is highly appealing in the area of geodynamic modeling, as multiple parallel solutions of the same differential equations at different points in space lead to a large speedup in simulation. Another benefit of CUDA is a programmatic method of transferring large amounts of data between the computer's main memory and the dedicated GPU memory located on the video card. In addition to enabling computation and rendering on the video card, the CUDA framework allows for a large speedup in situations, such as with a tiled display wall, where the rendered pixels are to be displayed in a different location than where they are rendered. A CUDA extension for VirtualGL was developed, allowing for faster readback at high resolutions. This paper examines several aspects of rendering OpenGL graphics on large displays using VirtualGL and VNC. It demonstrates how performance can be significantly improved when rendering on a tiled monitor wall. We present a CUDA-enhanced version of VirtualGL as well as the advantages of having multiple VNC servers, and discuss restrictions caused by readback and blitting rates and how they are affected by different sizes of virtual displays being rendered.

  1. General Criterion for Harmonicity

    Science.gov (United States)

    Proesmans, Karel; Vandebroek, Hans; Van den Broeck, Christian

    2017-10-01

    Inspired by Kubo-Anderson Markov processes, we introduce a new class of transfer matrices whose largest eigenvalue is determined by a simple explicit algebraic equation. Applications include the free energy calculation for various equilibrium systems and a general criterion for perfect harmonicity, i.e., a free energy that is exactly quadratic in the external field. As an illustration, we construct a "perfect spring," namely, a polymer with non-Gaussian, exponentially distributed subunits which, nevertheless, remains harmonic until it is fully stretched. This surprising discovery is confirmed by Monte Carlo and Langevin simulations.

  2. [Correlation between iridology and general pathology].

    Science.gov (United States)

    Demea, Sorina

    2002-01-01

    The aim of the research is to evaluate the association between certain iris signs and the general pathology of the studied patients. Fifty-seven hospitalized patients were studied; iris images were taken for all of them and analyzed through iridological protocols; at the same time, the pathology of these patients was noted from their hospital records, in accordance with the clinical diagnosis; all this information was included in a database for computerised processing. The resulting correlations show a strong connection between the iris constitution established through iridological criteria and the existing pathology. Iris examination can be very useful for the diagnosis of certain general pathology, in a holistic approach to the patient.

  3. Catastrophe Insurance Modeled by Shot-Noise Processes

    Directory of Open Access Journals (Sweden)

    Thorsten Schmidt

    2014-02-01

    Shot-noise processes generalize compound Poisson processes in the following way: a jump (the shot) is followed by a decline (the noise). This constitutes a useful model for insurance claims in many circumstances; claims due to natural disasters or self-exciting processes exhibit similar features. We give a general account of shot-noise processes with time-inhomogeneous drivers, inspired by recent results in credit risk. Moreover, we derive a number of useful results for modeling and pricing with shot-noise processes. Besides this, we obtain some highly tractable examples that constitute a useful modeling tool for dynamic claims processes. The results can in particular be used for pricing Catastrophe Bonds (CAT bonds), a traded risk-linked security. Additionally, current results regarding the estimation of shot-noise processes are reviewed.
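
    As a hedged illustration of the "jump followed by a decline" dynamics described above, the sketch below simulates a generic shot-noise intensity with exponentially decaying shots driven by a homogeneous Poisson process. This is not the time-inhomogeneous model of the paper, and all parameter values are invented.

      import numpy as np

      def simulate_shot_noise(T=10.0, rate=2.0, mean_shot=1.0, decay=0.8,
                              dt=0.01, seed=1):
          """Simulate lambda(t) = sum_i Y_i * exp(-decay * (t - t_i)) for t >= t_i."""
          rng = np.random.default_rng(seed)
          n_shots = rng.poisson(rate * T)                    # number of shots in [0, T]
          shot_times = np.sort(rng.uniform(0.0, T, n_shots))
          shot_sizes = rng.exponential(mean_shot, n_shots)   # jump heights

          t = np.arange(0.0, T, dt)
          lam = np.zeros_like(t)
          for ti, yi in zip(shot_times, shot_sizes):
              # Each shot contributes its size, decayed exponentially after arrival
              lam += yi * np.exp(-decay * (t - ti)) * (t >= ti)
          return t, lam

      t, lam = simulate_shot_noise()
      print(lam.max(), lam.mean())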

  4. A Generalized QMRA Beta-Poisson Dose-Response Model.

    Science.gov (United States)

    Xie, Gang; Roiko, Anne; Stratton, Helen; Lemckert, Charles; Dunn, Peter K; Mengersen, Kerrie

    2016-10-01

    Quantitative microbial risk assessment (QMRA) is widely accepted for characterizing the microbial risks associated with food, water, and wastewater. Single-hit dose-response models are the most commonly used dose-response models in QMRA. Denoting PI(d) as the probability of infection at a given mean dose d, a three-parameter generalized QMRA beta-Poisson dose-response model, PI(d|α,β,r*), is proposed in which the minimum number of organisms required for causing infection, K_min, is not fixed but is a random variable following a geometric distribution with parameter 0 < r* ≤ 1. The classical beta-Poisson model, PI(d|α,β), is a special case of the generalized model with K_min = 1 (which implies r* = 1). The generalized beta-Poisson model is based on a conceptual model with greater detail in the dose-response mechanism. Since a maximum likelihood solution is not easily available, a likelihood-free approximate Bayesian computation (ABC) algorithm is employed for parameter estimation. By fitting the generalized model to four experimental data sets from the literature, this study reveals that the posterior median r* estimates produced fall short of meeting the required condition of r* = 1 for the single-hit assumption. However, three out of four data sets fitted by the generalized models could not achieve an improvement in goodness of fit. These combined results imply that, at least in some cases, a single-hit assumption for characterizing the dose-response process may not be appropriate, but that the more complex models may be difficult to support especially if the sample size is small. The three-parameter generalized model provides a possibility to investigate the mechanism of a dose-response process in greater detail than is possible under a single-hit model. © 2016 Society for Risk Analysis.
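
    The mechanism described, a Poisson number of ingested organisms, beta-distributed host susceptibility, and a geometric threshold K_min, can be sketched with a simple Monte Carlo estimate of PI(d). This is only an illustration of the conceptual dose-response model as summarized above, not the paper's ABC fitting procedure, and all parameter values are invented.

      import numpy as np

      def p_infection(dose, alpha, beta, r_star, n_sim=200_000, seed=42):
          """Monte Carlo estimate of PI(d | alpha, beta, r*) for the sketched mechanism."""
          rng = np.random.default_rng(seed)
          p = rng.beta(alpha, beta, n_sim)         # host-specific survival probability
          ingested = rng.poisson(dose, n_sim)      # organisms ingested at mean dose d
          survivors = rng.binomial(ingested, p)    # organisms reaching the infection site
          k_min = rng.geometric(r_star, n_sim)     # threshold K_min on {1, 2, ...}
          return np.mean(survivors >= k_min)

      # Classical single-hit beta-Poisson (r* = 1) versus a generalized variant
      for r in (1.0, 0.5):
          print(r, p_infection(dose=50, alpha=0.3, beta=10.0, r_star=r))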

  5. Generalization of the Poincare sphere to process 2D displacement signals

    Science.gov (United States)

    Sciammarella, Cesar A.; Lamberti, Luciano

    2017-06-01

    Traditionally, the multiple phase method has been considered an essential tool for phase information recovery. The in-quadrature phase method, which in theory is an alternative pathway to the same goal, has failed in actual applications. In a previous paper dealing with 1D signals, the authors showed that, properly implemented, the in-quadrature method yields phase values with the same accuracy as the multiple phase method. The present paper extends the methodology developed in 1D to 2D. This extension is not a straightforward process and requires the introduction of a number of additional concepts and developments. The concept of the monogenic function provides the tools required for the extension process. The monogenic function has a graphic representation through the Poincare sphere, familiar in the field of photoelasticity and, through the developments introduced in this paper, connected to the analysis of displacement fringe patterns. The paper is illustrated with examples of application showing that the multiple phase method and the in-quadrature method are two aspects of the same basic theoretical model.

  6. Factors that predict consumer acceptance of enriched processed meats.

    Science.gov (United States)

    Shan, Liran C; Henchion, Maeve; De Brún, Aoife; Murrin, Celine; Wall, Patrick G; Monahan, Frank J

    2017-11-01

    The study aimed to understand predictors of consumers' purchase intention towards processed meat-based functional foods (i.e. enriched processed meat). A cross-sectional survey was conducted with 486 processed meat consumers in spring 2016. Results showed that processed meats were perceived differently in healthiness, with sausage-type products perceived as less healthy than cured meat products. Consumers were in general more uncertain than positive about enriched processed meat, but differences existed in attitudes and purchase intention. Following regression analysis, consumers' purchase intention towards enriched processed meat was primarily driven by their attitudes towards the product concept. Perceived healthiness of existing products and eating frequency of processed meat were also positively associated with purchase intention. Other factors, such as general food choice motives, socio-demographic characteristics, consumer health, and the consumption of functional foods and dietary supplements in general, were not significant predictors of purchase intention for enriched processed meat. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. O(3)-symmetric tunneling at false vacuum decay in general relativity

    International Nuclear Information System (INIS)

    Berezin, V.A.; Tkachev, I.I.; Kuzmin, V.A.; AN SSSR, Moscow. Inst. Yadernykh Issledovanij)

    1987-12-01

    The O(3)-symmetric vacuum decay is investigated in general relativity in thin-wall approximation. The following processes are studied: the spontaneous nucleation of a new phase bubble containing a remnant of an old phase inside; a subbarrier transition of a new phase bubble with the non-vanishing total energy from a subcritical state to the infinite expansion; the vacuum decay in the vicinity of a black hole; the creation from nothing of the Universe containing a bubble. General formulae for bounces for all these processes are derived. (orig.)

  8. Informal and Formal Learning of General Practitioners

    Science.gov (United States)

    Spaan, Nadia Roos; Dekker, Anne R. J.; van der Velden, Alike W.; de Groot, Esther

    2016-01-01

    Purpose: The purpose of this study is to understand the influence of formal learning from a web-based training and informal (workplace) learning afterwards on the behaviour of general practitioners (GPs) with respect to prescription of antibiotics. Design/methodology/approach: To obtain insight in various learning processes, semi-structured…

  9. General practitioners as supervisors in postgraduate clinical education

    DEFF Research Database (Denmark)

    Wearne, Susan; Dornan, Tim; Teunissen, Pim W.

    2012-01-01

    Context General practice supervisors are said to serve as the cornerstones of general practice postgraduate education and therefore it is important to clearly define their roles and what makes them effective. The commonly used definition of a supervisor is not primarily based on general practice...... with resident doctors that provided a foundation for learning. Residents needed a balance of challenge, usually provided by patients, and support, provided by supervisors. Supervisors established learning environments, assessed residents' learning needs, facilitated learning, monitored the content and process...... of learning and the well-being of residents, and summarised learning in ways that turned 'know that' into 'know how'. Conclusions General practice must be expert in ensuring patients are well cared for 'by proxy' and in giving residents just the right amount of support they need to face the challenges posed...

  10. Using Unified Modelling Language (UML) as a process-modelling technique for clinical-research process improvement.

    Science.gov (United States)

    Kumarapeli, P; De Lusignan, S; Ellis, T; Jones, B

    2007-03-01

    The Primary Care Data Quality programme (PCDQ) is a quality-improvement programme which processes routinely collected general practice computer data. Patient data collected from a wide range of different brands of clinical computer systems are aggregated, processed, and fed back to practices in an educational context to improve the quality of care. Process modelling is a well-established approach used to gain understanding, enable systematic appraisal, and identify areas for improvement of a business process. Unified modelling language (UML) is a general-purpose modelling technique used for this purpose. We used UML to appraise the PCDQ process to see if the efficiency and predictability of the process could be improved. Activity analysis and thinking-aloud sessions were used to collect data to generate UML diagrams. The UML model highlighted the sequential nature of the current process as a barrier to efficiency gains. It also identified the uneven distribution of process controls, lack of symmetric communication channels, critical dependencies among processing stages, and failure to implement all the lessons learned in the piloting phase. It also suggested that improved structured reporting at each stage - especially from the pilot phase, parallel processing of data and correctly positioned process controls - should improve the efficiency and predictability of research projects. Process modelling provided a rational basis for the critical appraisal of a clinical data processing system; its potential may be underutilized within health care.

  11. Discharge Processes and 30-Day Readmission Rates of Patients Hospitalized for Heart Failure on General Medicine and Cardiology Services.

    Science.gov (United States)

    Salata, Brian M; Sterling, Madeline R; Beecy, Ashley N; Ullal, Ajayram V; Jones, Erica C; Horn, Evelyn M; Goyal, Parag

    2018-05-01

    Given high rates of heart failure (HF) hospitalizations and widespread adoption of the hospitalist model, patients with HF are often cared for on General Medicine (GM) services. Differences in discharge processes and 30-day readmission rates between patients on GM and those on Cardiology during the contemporary hospitalist era are unknown. The present study compared discharge processes and 30-day readmission rates of patients with HF admitted on GM services and those on Cardiology services. We retrospectively studied 926 patients discharged home after HF hospitalization. The primary outcome was 30-day all-cause readmission after discharge from index hospitalization. Although 60% of patients with HF were admitted to Cardiology services, 40% were admitted to GM services. Prevalence of cardiovascular and noncardiovascular co-morbidities was similar between patients admitted to GM services and Cardiology services. Discharge summaries for patients on GM services were less likely to have reassessments of ejection fraction, new study results, weights, discharge vital signs, discharge physical examinations, and scheduled follow-up cardiologist appointments. In a multivariable regression analysis, patients on GM services were more likely to experience 30-day readmissions compared with those on Cardiology services (odds ratio 1.43, 95% confidence interval [1.05 to 1.96], p = 0.02). In conclusion, outcomes are better among those admitted to Cardiology services, signaling the need for studies and interventions focusing on noncardiology hospital providers that care for patients with HF. Copyright © 2018 Elsevier Inc. All rights reserved.

  12. Pyro-processes and the wastes

    International Nuclear Information System (INIS)

    Kurata, Masaki; Tokiwai, Moriyasu; Inoue, Tadashi; Nishimura, Tomohiro

    2000-01-01

    Reprocessing using pyrometallurgical processes is generally considered to have economic benefits compared with conventional aqueous processes because of the combination of simpler processes and equipment, fewer criticality concerns, and more compact facilities. On the other hand, pyrometallurgical processes generate their own peculiar wastes, and R and D on those wastes lags slightly behind that on the main processes. In this paper, the process flows of the major pyrometallurgical processes are first summarized, and the present state of R and D on the wastes is then described. (author)

  13. Some probabilistic properties of fractional point processes

    KAUST Repository

    Garra, Roberto

    2017-05-16

    In this article, the first hitting times of generalized Poisson processes N^f(t), related to Bernstein functions f, are studied. For the space-fractional Poisson processes N^α(t), t > 0 (corresponding to f = x^α), the hitting probabilities P{T_k^α < ∞} are explicitly obtained and analyzed. The processes N^f(t) are time-changed Poisson processes N(H^f(t)) with subordinators H^f(t), and here we study N(Σ_{j=1}^n H^{f_j}(t)) and obtain probabilistic features of these extended counting processes. A section of the paper is devoted to processes of the form N(G_{H,v}(t)), where G_{H,v}(t) are generalized grey Brownian motions. This involves the theory of time-dependent fractional operators of the McBride form. While the time-fractional Poisson process is a renewal process, we prove that the space-time Poisson process is no longer a renewal process.

  14. Generalized versus non-generalized neural network model for multi-lead inflow forecasting at Aswan High Dam

    Directory of Open Access Journals (Sweden)

    A. El-Shafie

    2011-03-01

    Artificial neural networks (ANN) have been found efficient, particularly in problems where characteristics of the processes are stochastic and difficult to describe using explicit mathematical models. However, time series prediction based on ANN algorithms is fundamentally difficult and faces problems. One of the major shortcomings is the search for the optimal input pattern in order to enhance the forecasting capabilities of the output. The second challenge is over-fitting during the training procedure, which occurs when an ANN loses its ability to generalize. In this research, autocorrelation and cross-correlation analyses are suggested as a method for searching for the optimal input pattern. In addition, two generalized methods, namely Regularized Neural Network (RNN) and Ensemble Neural Network (ENN) models, are developed to overcome the drawbacks of classical ANN models. Using a Generalized Neural Network (GNN) helped avoid over-fitting of training data, which was observed as a limitation of classical ANN models. Real inflow data collected over the last 130 years at Lake Nasser were used to train, test and validate the proposed model. Results show that the proposed GNN model outperforms non-generalized neural network and conventional auto-regressive models and could provide accurate inflow forecasting.
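
    A minimal sketch of the general idea, lag-based inputs chosen by correlation analysis feeding an ensemble of small, L2-regularized networks whose predictions are averaged, is shown below. It uses scikit-learn's MLPRegressor as a stand-in and a synthetic inflow series; it is not the RNN/ENN implementation of the paper, and the lags and hyperparameters are invented.

      import numpy as np
      from sklearn.neural_network import MLPRegressor

      def make_lagged(series, lags):
          """Build an input matrix from lags suggested by (auto)correlation analysis."""
          m = max(lags)
          X = np.column_stack([series[m - l: len(series) - l] for l in lags])
          return X, series[m:]

      def ensemble_forecast(series, lags=(1, 2, 12), n_members=10, l2=1e-2):
          X, y = make_lagged(np.asarray(series, dtype=float), lags)
          members = []
          for seed in range(n_members):
              net = MLPRegressor(hidden_layer_sizes=(8,), alpha=l2,  # alpha = L2 penalty
                                 max_iter=2000, random_state=seed)
              members.append(net.fit(X[:-1], y[:-1]))                # hold out last point
          pred = np.mean([m.predict(X[-1:]) for m in members], axis=0)
          return pred[0], y[-1]                                      # forecast vs. actual

      # Hypothetical monthly inflow series with a seasonal cycle
      rng = np.random.default_rng(3)
      inflow = 100 + 20 * np.sin(np.arange(200) * 2 * np.pi / 12) + rng.normal(0, 5, 200)
      print(ensemble_forecast(inflow))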

  15. The consistency assessment of topological relations in cartographic generalization

    Science.gov (United States)

    Zheng, Chunyan; Guo, Qingsheng; Du, Xiaochu

    2006-10-01

    Research on generalization assessment has received less attention than the generalization process itself, yet maintaining the consistency of topological relations is very important for generalization quality. This paper proposes a methodology for assessing the quality of a generalized map in terms of topological-relation consistency. Taking roads (including railways) and residential areas as examples, some issues of topological consistency at different map scales are analyzed from the viewpoint of spatial cognition. Statistical information about inconsistent topological relations can be obtained by comparing two matrices: one is the matrix of topological relations in the generalized map; the other is the theoretical matrix of topological relations that should be maintained after generalization. Based on fuzzy set theory and the classification of map object types, a consistency evaluation model for topological relations is established. Finally, the paper demonstrates the feasibility of the method with an example evaluating the local topological relations between simple roads and a residential area.
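
    A minimal sketch of the matrix comparison step described above: the topological relations among map objects are coded in two matrices, one holding the relations that should be maintained after generalization and one read from the generalized map, and every differing cell is counted as an inconsistency. The relation labels and the simple consistency ratio are assumptions for illustration; the paper's fuzzy evaluation model weights inconsistencies by object type.

      import numpy as np

      # Relations to be maintained after generalization (theoretical matrix)
      theoretical = np.array([["disjoint", "touch"],
                              ["touch", "disjoint"]])
      # Relations measured in the generalized map
      generalized = np.array([["disjoint", "overlap"],
                              ["touch", "disjoint"]])

      inconsistent = theoretical != generalized
      consistency = 1.0 - inconsistent.sum() / inconsistent.size
      print(f"{inconsistent.sum()} inconsistent relation(s), consistency = {consistency:.2f}")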

  16. Consciousness Is a Thing, Not a Process

    Directory of Open Access Journals (Sweden)

    Susan Pockett

    2017-12-01

    The central dogma of cognitive psychology is ‘consciousness is a process, not a thing’. Hence, the main task of cognitive neuroscientists is generally seen as working out what kinds of neural processing are conscious and what kinds are not. I argue here that the central dogma is simply wrong. All neural processing is unconscious. The illusion that some of it is conscious results largely from a failure to separate consciousness per se from a number of unconscious processes that normally accompany it—most particularly focal attention. Conscious sensory experiences are not processes at all. They are things: specifically, spatial electromagnetic (EM) patterns, which are presently generated only by ongoing unconscious processing at certain times and places in the mammalian brain, but which in principle could be generated by hardware rather than wetware. The neurophysiological mechanisms by which putatively conscious EM patterns are generated, the features that may distinguish conscious from unconscious patterns, the general principles that distinguish the conscious patterns of different sensory modalities and the general features that distinguish the conscious patterns of different experiences within any given sensory modality are all described. Suggestions for further development of this paradigm are provided.

  17. Guidelines for computer security in general practice

    Directory of Open Access Journals (Sweden)

    Peter Schattner

    2007-06-01

    Conclusions: This paper maps out a process for developing computer security guidelines for general practice. The specific content will vary in different countries according to their levels of adoption of IT, and cultural, technical and other health service factors. Making these guidelines relevant to local contexts should help maximise their uptake.

  18. Alarm processing system using AI techniques for nuclear power plant

    International Nuclear Information System (INIS)

    Yang, Joon On; Chang, Soon Heung

    1990-01-01

    An alarm processing system (APS) has been developed using artificial intelligence (AI) techniques. The alarms of nuclear power plants (NPPs) are classified into generalized and special alarms, and the generalized alarms are further classified into global and local alarms. For each type of alarm, specific processing rules are applied to filter and suppress unnecessary and potentially misleading alarms. Local processing is based on model-based reasoning, while global and special alarms are processed using general cause-consequence check rules. The priorities of alarms are determined according to the plant state and the consistencies between them.
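
    To give a flavour of the cause-consequence filtering described above (not the APS's actual rule base), the sketch below suppresses a consequence alarm whenever its known cause is already active and then orders the remaining alarms by priority. The alarm names, rules, and priorities are invented.

      # Hypothetical cause-consequence suppression rules: consequence alarm -> cause alarm
      SUPPRESSION_RULES = {
          "LOW_STEAM_PRESSURE": "TURBINE_TRIP",
          "LOW_FEEDWATER_FLOW": "FEED_PUMP_TRIP",
      }
      PRIORITY = {"TURBINE_TRIP": 1, "FEED_PUMP_TRIP": 1,
                  "LOW_STEAM_PRESSURE": 2, "LOW_FEEDWATER_FLOW": 2}

      def process_alarms(active):
          """Suppress consequence alarms whose cause is active, then sort by priority."""
          kept = [a for a in active if SUPPRESSION_RULES.get(a) not in active]
          return sorted(kept, key=lambda a: PRIORITY.get(a, 99))

      print(process_alarms({"TURBINE_TRIP", "LOW_STEAM_PRESSURE", "LOW_FEEDWATER_FLOW"}))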

  19. Technical Safety Requirements for the Waste Storage Facilities

    International Nuclear Information System (INIS)

    Laycak, D.T.

    2010-01-01

    This document contains Technical Safety Requirements (TSR) for the Radioactive and Hazardous Waste Management (RHWM) WASTE STORAGE FACILITIES, which include Area 625 (A625) and the Decontamination and Waste Treatment Facility (DWTF) Storage Area at Lawrence Livermore National Laboratory (LLNL). The TSRs constitute requirements regarding the safe operation of the WASTE STORAGE FACILITIES. These TSRs are derived from the Documented Safety Analysis for the Waste Storage Facilities (DSA) (LLNL 2009). The analysis presented therein determined that the WASTE STORAGE FACILITIES are low-chemical hazard, Hazard Category 2 non-reactor nuclear facilities. The TSRs consist primarily of inventory limits and controls to preserve the underlying assumptions in the hazard and accident analyses. Further, appropriate commitments to safety programs are presented in the administrative controls sections of the TSRs. The WASTE STORAGE FACILITIES are used by RHWM to handle and store hazardous waste, TRANSURANIC (TRU) WASTE, LOW-LEVEL WASTE (LLW), mixed waste, California combined waste, nonhazardous industrial waste, and conditionally accepted waste generated at LLNL as well as small amounts from other U.S. Department of Energy (DOE) facilities, as described in the DSA. In addition, several minor treatments (e.g., size reduction and decontamination) are carried out in these facilities. The WASTE STORAGE FACILITIES are located in two portions of the LLNL main site. A625 is located in the southeast quadrant of LLNL. The A625 fenceline is approximately 225 m west of Greenville Road. The DWTF Storage Area, which includes Building 693 (B693), Building 696 Radioactive Waste Storage Area (B696R), and associated yard areas and storage areas within the yard, is located in the northeast quadrant of LLNL in the DWTF complex. The DWTF Storage Area fenceline is approximately 90 m west of Greenville Road. A625 and the DWTF Storage Area are subdivided into various facilities and storage areas, consisting

  20. Technical Safety Requirements for the Waste Storage Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Laycak, D T

    2008-06-16

    This document contains Technical Safety Requirements (TSR) for the Radioactive and Hazardous Waste Management (RHWM) WASTE STORAGE FACILITIES, which include Area 625 (A625) and the Decontamination and Waste Treatment Facility (DWTF) Storage Area at Lawrence Livermore National Laboratory (LLNL). The TSRs constitute requirements regarding the safe operation of the WASTE STORAGE FACILITIES. These TSRs are derived from the 'Documented Safety Analysis for the Waste Storage Facilities' (DSA) (LLNL 2008). The analysis presented therein determined that the WASTE STORAGE FACILITIES are low-chemical hazard, Hazard Category 2 non-reactor nuclear facilities. The TSRs consist primarily of inventory limits and controls to preserve the underlying assumptions in the hazard and accident analyses. Further, appropriate commitments to safety programs are presented in the administrative controls sections of the TSRs. The WASTE STORAGE FACILITIES are used by RHWM to handle and store hazardous waste, TRANSURANIC (TRU) WASTE, LOW-LEVEL WASTE (LLW), mixed waste, California combined waste, nonhazardous industrial waste, and conditionally accepted waste generated at LLNL as well as small amounts from other U.S. Department of Energy (DOE) facilities, as described in the DSA. In addition, several minor treatments (e.g., size reduction and decontamination) are carried out in these facilities. The WASTE STORAGE FACILITIES are located in two portions of the LLNL main site. A625 is located in the southeast quadrant of LLNL. The A625 fenceline is approximately 225 m west of Greenville Road. The DWTF Storage Area, which includes Building 693 (B693), Building 696 Radioactive Waste Storage Area (B696R), and associated yard areas and storage areas within the yard, is located in the northeast quadrant of LLNL in the DWTF complex. The DWTF Storage Area fenceline is approximately 90 m west of Greenville Road. A625 and the DWTF Storage Area are subdivided into various facilities and storage areas

  1. 42 CFR 423.32 - Enrollment process.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 3 2010-10-01 2010-10-01 false Enrollment process. 423.32 Section 423.32 Public... Enrollment process. (a) General rule. A Part D eligible individual who wishes to enroll in a PDP may enroll... approved by CMS. (c) Timely process an individual's enrollment request. A PDP sponsor must timely process...

  2. Safe Distribution of Declarative Processes

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Mukkamala, Raghava Rao; Slaats, Tijs

    2011-01-01

    of projections that covers a DCR Graph that the network of synchronously communicating DCR Graphs given by the projections is bisimilar to the original global process graph. We exemplify the distribution technique on a process identified in a case study of a cross-organizational case management system carried...... process model generalizing labelled prime event structures to a systems model able to finitely represent ω-regular languages. An operational semantics given as a transition semantics between markings of the graph allows DCR Graphs to be conveniently used as both specification and execution model.... The technique for distribution is based on a new general notion of projection of DCR Graphs relative to a subset of labels and events identifying the set of external events that must be communicated from the other processes in the network in order for the distribution to be safe. We prove that for any vector

  3. Generalized modified gravity in large extra dimensions

    International Nuclear Information System (INIS)

    Aslan, Onder; Demir, Durmus A.

    2006-01-01

    We discuss effective interactions among brane matter induced by modifications of higher-dimensional Einstein gravity through the replacement of the Einstein-Hilbert term with a generic function f(R, R_{AB}R^{AB}, R_{ABCD}R^{ABCD}) of the curvature tensors. We determine the gravi-particle spectrum of the theory, and perform a comparative analysis of its predictions against those of Einstein gravity within the Arkani-Hamed-Dimopoulos-Dvali (ADD) setup. We find that this general higher-curvature quantum gravity theory contributes to scatterings among both massive and massless brane matter (in contrast to the much simpler generalization of Einstein gravity, f(R), which influences only the massive matter), and can therefore be probed via various scattering processes at present and future colliders and directly confronted with the ADD expectations. In addition to collision processes which proceed with tree-level gravi-particle exchange, effective interactions among brane matter are found to exhibit a strong sensitivity to higher-curvature gravity via gravi-particle loops. Furthermore, particle collisions with missing energy in their final states are found to be sensitive to additional gravi-particles not present in Einstein gravity. In general, the road to a correct description of quantum gravity above Fermi energies depends crucially on whether collider and other search methods end up with a negative or positive answer for the presence of higher-curvature gravitational interactions.

  4. Entrepreneurship within General Aviation

    Science.gov (United States)

    Ullmann, Brian M.

    1995-01-01

    Many modern economic theories place great importance upon entrepreneurship in the economy. Some see the entrepreneur as the individual who bears the risk of operating a business in the face of uncertainty about future conditions and who is rewarded through profits and losses. The 20th-century economist Joseph Schumpeter saw the entrepreneur as the medium by which advancing technology is incorporated into society as businesses seek competitive advantages through more efficient product development processes. Due to the importance that capitalistic systems place upon entrepreneurship, it has become a well-studied subject, with many texts discussing how entrepreneurs can succeed in modern society. Many entrepreneurship and business management courses go so far as to discuss the characteristic phases and prominent challenges that fledgling companies face in their efforts to bring a new product into a competitive market. However, even with all of these aids, start-up companies fail at an enormous rate. Indeed, the odds of shepherding a new company through the travails of becoming a well-established company (as measured by the ability to reach Initial Public Offering (IPO)) have been estimated to be six in 1,000,000. Each niche industry has characteristic challenges which act as barriers to entry for new products into that industry. Thus, the applicability of broad generalizations is subject to limitations within niche markets. This paper will discuss entrepreneurship as it relates to general aviation. The goals of this paper will be to: introduce general aviation; discuss the details of marrying entrepreneurship with general aviation; and present a sample business plan which would characterize a possible entrepreneurial venture.

  5. The General Factor of Personality: A General Critique.

    Science.gov (United States)

    Revelle, William; Wilt, Joshua

    2013-10-01

    Recently, it has been proposed that all non-cognitive measures of personality share a general factor of personality. A problem with many of these studies is a lack of clarity in defining a general factor. In this paper we address the multiple ways in which a general factor has been identified and argue that many of these approaches find factors that are not in fact general. Through the use of artificial examples, we show that a general factor is not: (1) the first factor or component of a correlation or covariance matrix; (2) the first factor resulting from a bifactor rotation or biquartimin transformation; nor (3) necessarily the result of a confirmatory factor analysis forcing a bifactor solution. We consider how the definition of what constitutes a general factor can lead to confusion, and we demonstrate alternative ways of estimating the general factor saturation that are more appropriate.

  6. A generalized theory of chromatography and multistep liquid extraction

    Science.gov (United States)

    Chizhkov, V. P.; Boitsov, V. N.

    2017-03-01

    A generalized theory of chromatography and multistep liquid extraction is developed. The principles of highly efficient processes for fine preparative separation of binary mixture components on a fixed sorbent layer are discussed.

  7. The General Factor of Personality: A General Critique

    OpenAIRE

    Revelle, William; Wilt, Joshua

    2013-01-01

    Recently, it has been proposed that all non-cognitive measures of personality share a general factor of personality. A problem with many of these studies is a lack of clarity in defining a general factor. In this paper we address the multiple ways in which a general factor has been identified and argue that many of these approaches find factors that are not in fact general. Through the use of artificial examples, we show that a general factor is not: The first factor or component of a correla...

  8. On a Generalized Hankel Type Convolution of Generalized Functions

    Indian Academy of Sciences (India)

    Generalized Hankel type transformation; Parseval relation; generalized ... The classical generalized Hankel type convolution is defined and extended to a class of generalized functions. ...

  9. Teaching children generalized imitation skills: a case report.

    Science.gov (United States)

    Brown, Freddy Jackson; Peace, Natalie; Parsons, Rachel

    2009-03-01

    Generalized imitation plays an important role in the acquisition of new skills, in particular language and communication. In this case report a multiple exemplar training procedure, with an errorless learning phase, was used to teach Ben, a 13-year-old child with severe intellectual disabilities, to imitate behaviours modelled by an adult instructor. After exposure to seven multiple exemplars, Ben learned to imitate novel actions to criterion (i.e. generalized imitation). These skills were maintained at 90 percent at 6 week and 18 week follow-up. In line with earlier research, this article provides some further support for the finding that multiple exemplar training can facilitate the reliable emergence of generalized imitation skills. Topographically similar behaviours during the learning phase can be difficult to discriminate and hence can slow the learning process. Future research could explore how generalized imitation supports the development of basic communication and activity skills.

  10. Cognitive Risk Factors for Specific Learning Disorder: Processing Speed, Temporal Processing, and Working Memory.

    Science.gov (United States)

    Moll, Kristina; Göbel, Silke M; Gooch, Debbie; Landerl, Karin; Snowling, Margaret J

    2016-01-01

    High comorbidity rates between reading disorder (RD) and mathematics disorder (MD) indicate that, although the cognitive core deficits underlying these disorders are distinct, additional domain-general risk factors might be shared between the disorders. Three domain-general cognitive abilities were investigated in children with RD and MD: processing speed, temporal processing, and working memory. Since attention problems frequently co-occur with learning disorders, the study examined whether these three factors, which are known to be associated with attention problems, account for the comorbidity between these disorders. The sample comprised 99 primary school children in four groups: children with RD, children with MD, children with both disorders (RD+MD), and typically developing children (TD controls). Measures of processing speed, temporal processing, and memory were analyzed in a series of ANCOVAs including attention ratings as covariate. All three risk factors were associated with poor attention. After controlling for attention, associations with RD and MD differed: Although deficits in verbal memory were associated with both RD and MD, reduced processing speed was related to RD, but not MD; and the association with RD was restricted to processing speed for familiar nameable symbols. In contrast, impairments in temporal processing and visuospatial memory were associated with MD, but not RD. © Hammill Institute on Disabilities 2014.

  11. The General Laws of Chemical Elements Composition Dynamics in the Biosphere

    Science.gov (United States)

    Korzh, Vyacheslav D.

    2013-04-01

    The key point in investigating the specificity of the formation of the biosphere's elemental composition is the determination of patterns of redistribution of average elemental concentrations among various phases, such as solid - liquid (the lithosphere - the hydrosphere), which occurs as a result of a global, continuous processing of inert matter by living substances. Our task here is to investigate this process in the system "lithosphere - hydrosphere" in view of the integrated involvement of living material in it. This process is most active at biogeochemical barriers, i.e. in places of "life condensation", and follows a previously unknown nonlinear regularity. It is established that this process results in a general relative increase in the concentrations of chemical elements in the solid phase as their prevalence in the environment decreases. Running in various natural systems, the process has practically the same nonlinearity parameter (v), approximately equal to 0.7: for proto-lithosphere - "living material" - soil, v = 0.75; for river - "living material" - ocean, v = 0.67. At the contemporary level of factual knowledge, the differences between these estimates of the nonlinearity index are practically negligible. Hence, for the first time, the existence of a universal constant of nonlinearity of elemental composition evolution in the biosphere has been demonstrated and its quantitative evaluation has been made. REFERENCES: 1. Korzh V.D. 1974. Some general laws governing the turnover of substance within the ocean-atmosphere-continent-ocean cycle. Journal de Recherches Atmospheriques, Vol. 8, P. 653-660. 2. Korzh V.D. 2008. The general laws in the formation of the elemental composition of the Hydrosphere and Biosphere. J. Ecologica, Vol. XV, P. 13-21. 3. Korzh V.D. 2012. Determination of general laws of elemental composition in Hydrosphere. Water: Chemistry & Ecology, Journal of water science and its practical application, No. 1, P. 56-62.

  12. General description of few-body break-up processes at threshold

    International Nuclear Information System (INIS)

    Barrachina, R.O.

    2005-01-01

    In this communication we describe the effects produced by an N-body threshold behavior in N + 1 body break-up processes, as it occurs in situations where one of the fragments acquires almost all the excess energy of the system. Furthermore, we relate the appearance of discontinuities in single-particle multiply differential cross sections to the threshold behavior of the remaining particles, and describe the applicability of these ideas to different systems from atomic, molecular and nuclear collision physics. We finally show that, even though the study of ultracold collisions represents the direct way of gathering information on a break-up system near threshold, the analysis of high-energy collisions provides an alternative, and sometimes advantageous, approach

  13. Radwaste volume reduction and solidification by General Electric

    International Nuclear Information System (INIS)

    Green, T.A.; Weech, M.E.; Miller, G.P.; Eberle, J.W.

    1982-01-01

    Since 1978 General Electric has been actively engaged in developing a volume reduction and solidification system for the treatment of radwaste generated in commercial nuclear power plants. The studies have been aimed at defining an integrated system that would be directly responsive to the rapidly evolving needs of the industry for the volume reduction and solidification of low-level radwaste. The resulting General Electric Volume Reduction System (GEVRS) is an integrated system based on two processes: the first uses azeotropic distillation technology and is called AZTECH, and the second is controlled-air incineration, called INCA. The AZTECH process serves to remove water from concentrated salt solutions, ion exchange resins and filter sludge slurries and then encapsulates the dried solids into a dense plastic product. The INCA unit serves to reduce combustible wastes to ashes suitable for encapsulation into the same plastic product produced by AZTECH.

  14. Process mining

    DEFF Research Database (Denmark)

    van der Aalst, W.M.P.; Rubin, V.; Verbeek, H.M.W.

    2010-01-01

    Process mining includes the automated discovery of processes from event logs. Based on observed events (e.g., activities being executed or messages being exchanged) a process model is constructed. One of the essential problems in process mining is that one cannot assume to have seen all possible...... behavior. At best, one has seen a representative subset. Therefore, classical synthesis techniques are not suitable as they aim at finding a model that is able to exactly reproduce the log. Existing process mining techniques try to avoid such “overfitting” by generalizing the model to allow for more...... support for it). None of the existing techniques enables the user to control the balance between “overfitting” and “underfitting”. To address this, we propose a two-step approach. First, using a configurable approach, a transition system is constructed. Then, using the “theory of regions”, the model...
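
    As a toy illustration of the first step of the two-step approach sketched in this abstract (our own example, not the authors' implementation), the following Python builds a transition system from a small event log using one simple configurable state abstraction, the set of activities observed so far in a trace.

      # Toy transition-system construction from an event log.
      # State abstraction used here: the frozenset of activities seen so far
      # (one of several possible abstractions; sequences or multisets would also work).

      event_log = [
          ["register", "check", "decide", "notify"],
          ["register", "decide", "check", "notify"],
          ["register", "check", "notify"],
      ]

      states = set()
      transitions = set()  # (source_state, activity, target_state)

      for trace in event_log:
          seen = frozenset()
          states.add(seen)
          for activity in trace:
              target = seen | {activity}
              transitions.add((seen, activity, target))
              states.add(target)
              seen = target

      for src, act, dst in sorted(transitions, key=str):
          print(sorted(src), "--", act, "->", sorted(dst))

    In the approach described above, a second step (the "theory of regions") would then synthesize a process model from such a transition system; the sketch stops at the first step.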

  15. Process monitoring

    International Nuclear Information System (INIS)

    Anon.

    1981-01-01

    Many of the measurements and observations made in a nuclear processing facility to monitor processes and product quality can also be used to monitor the location and movements of nuclear materials. In this session information is presented on how to use process monitoring data to enhance nuclear material control and accounting (MC&A). It will be seen that SNM losses can generally be detected with greater sensitivity and timeliness, and the point of loss localized more closely, than by conventional MC&A systems if process monitoring data are applied. The purpose of this session is to enable the participants to: (1) identify process unit operations that could improve control units for monitoring SNM losses; (2) choose key measurement points and formulate a loss indicator for each control unit; and (3) describe how the sensitivities and timeliness of loss detection could be determined for each loss indicator
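
    As a minimal numerical illustration of a loss indicator for a single control unit (the quantities and the three-sigma alarm rule below are illustrative assumptions, not material from the session), a material-balance check might look like the following Python snippet.

      import math

      # Illustrative material balance for one control unit (all values in kg SNM).
      beginning_inventory = 12.40
      receipts = 3.10
      removals = 2.95
      ending_inventory = 12.47

      # Inventory difference (often called MUF, material unaccounted for).
      inventory_difference = beginning_inventory + receipts - removals - ending_inventory

      # Assume a combined measurement standard deviation for the balance period.
      sigma_id = 0.05
      alarm_threshold = 3.0 * sigma_id  # a common rule-of-thumb threshold

      print(f"ID = {inventory_difference:+.3f} kg, threshold = {alarm_threshold:.3f} kg")
      print("alarm" if abs(inventory_difference) > alarm_threshold else "no alarm")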

  16. Quantum description of light propagation in generalized media

    DEFF Research Database (Denmark)

    Häyrynen, Teppo; Oksanen, Jani

    2016-01-01

    (TW) approach, we generalize the linear material model to simultaneously account for both the emission and absorption processes and to have point-wise defined noise field statistics and intensity dependent interaction strengths. Thus, our approach describes the quantum input-output relations of linear...... the approach to investigate media in nonuniform states which can be e.g. consequences of a temperature gradient over the medium or a position dependent inversion of the amplifier. Furthermore, by using the generalized model we investigate devices with intensity dependent interactions and show how an initial...

  17. Generalized Bell states map physical systems’ quantum evolution into a grammar for quantum information processing

    Science.gov (United States)

    Delgado, Francisco

    2017-12-01

    Quantum information processing should be generated through control of quantum evolution for the physical systems being used as resources, such as superconducting circuits, spin-spin couplings in ions and artificial anyons in electronic gases. They have a quantum dynamics which should be translated into languages more natural for quantum information processing. On this terrain, such a language should make it possible to establish manipulation operations on the associated quantum information states, as classical information processing does. This work shows how a kind of processing operations can be settled and implemented for quantum state design and quantum processing for systems fulfilling an SU(2) reduction in their dynamics.

  18. Refractometry in process engineering

    Energy Technology Data Exchange (ETDEWEB)

    Roepscher, H

    1980-02-01

    Following a brief historical introduction to general refractometry, the limiting angle refractometer is dealt with in the first section and the differential refractometer in the second section, and process engineering information on this measuring method is given. An extensive, close-to-practice description attempts to introduce planners and technicians to this physical measuring method in process engineering so that they can use it themselves if necessary. When properly applied, it can be a valuable aid to process control in the context of process automation.

  19. Ocean bio-geophysical modeling using mixed layer-isopycnal general circulation model coupled with photosynthesis process

    Digital Repository Service at National Institute of Oceanography (India)

    Nakamoto, S.; Saito, H.; Muneyama, K.; Sato, T.; PrasannaKumar, S.; Kumar, A.; Frouin, R.

    -chemical system that supports steady carbon circulation in geological time scale in the world ocean using Mixed Layer-Isopycnal ocean General Circulation model with remotely sensed Coastal Zone Color Scanner (CZCS) chlorophyll pigment concentration....

  20. How General-Purpose can a GPU be?

    Directory of Open Access Journals (Sweden)

    Philip Machanick

    2015-12-01

    The use of graphics processing units (GPUs) in general-purpose computation (GPGPU) is a growing field. GPU instruction sets, while implementing a graphics pipeline, draw from a range of single instruction multiple datastream (SIMD) architectures characteristic of the heyday of supercomputers. Yet only one of these SIMD instruction sets has been applied to a wide enough range of problems to survive the era when the full range of supercomputer design variants was being explored: vector instructions. This paper proposes a reconceptualization of the GPU as a multicore design with minimal exotic modes of parallelism so as to make GPGPU truly general.

  1. Cyclic Processing for Context Fusion

    DEFF Research Database (Denmark)

    Kjærgaard, Mikkel Baun

    2007-01-01

    Many machine-learning techniques use feedback information. However, current context fusion systems do not support this because they constrain processing to be structured as acyclic processing. This paper proposes a generalization which enables the use of cyclic processing in context fusion systems....... A solution is proposed to the inherent problem of how to avoid uncontrollable looping during cyclic processing. The solution is based on finding cycles using graph-coloring and breaking cycles using time constraints....
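
    The cycle-finding step mentioned in this abstract can be illustrated with a standard depth-first search that colours nodes white/grey/black; the dependency graph and the code below are our own toy example rather than the paper's algorithm.

      # Detect cycles in a directed graph of fusion components using DFS colouring.
      WHITE, GREY, BLACK = 0, 1, 2

      graph = {              # component -> components it feeds into (illustrative)
          "sensor": ["estimator"],
          "estimator": ["classifier"],
          "classifier": ["estimator", "output"],  # feedback edge creates a cycle
          "output": [],
      }

      def find_cycle(graph):
          colour = {node: WHITE for node in graph}

          def dfs(node, path):
              colour[node] = GREY
              for succ in graph[node]:
                  if colour[succ] == GREY:               # back edge -> cycle found
                      return path[path.index(succ):] + [succ]
                  if colour[succ] == WHITE:
                      cycle = dfs(succ, path + [succ])
                      if cycle:
                          return cycle
              colour[node] = BLACK
              return None

          for node in graph:
              if colour[node] == WHITE:
                  cycle = dfs(node, [node])
                  if cycle:
                      return cycle
          return None

      print(find_cycle(graph))  # e.g. ['estimator', 'classifier', 'estimator']

    Breaking a detected cycle with time constraints, as the abstract proposes, would then amount to cutting one of its edges and reusing the value computed in the previous processing round.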

  2. A review of governance of maternity services at South Tipperary general hospital

    LENUS (Irish Health Repository)

    Flory, David

    2015-09-01

    This review of the governance of maternity services at South Tipperary General Hospital has focussed on the systems and processes for assurance of service quality, risk management and patient safety primarily inside the hospital but also in the Hospital Group structure within which it operates. The effectiveness of the governance arrangements is largely determined by the quality of the leadership and management – both clinical and general – which designs, implements, and oversees those systems and processes and is ultimately responsible and accountable.

  3. Stationary and related stochastic processes sample function properties and their applications

    CERN Document Server

    Cramér, Harald

    2004-01-01

    This graduate-level text offers a comprehensive account of the general theory of stationary processes, with special emphasis on the properties of sample functions. Assuming a familiarity with the basic features of modern probability theory, the text develops the foundations of the general theory of stochastic processes, examines processes with a continuous-time parameter, and applies the general theory to procedures key to the study of stationary processes. Additional topics include analytic properties of the sample functions and the problem of time distribution of the intersections between a

  4. Systems analysis as a tool for optimal process strategy

    International Nuclear Information System (INIS)

    Ditterich, K.; Schneider, J.

    1975-09-01

    For the description and the optimal treatment of complex processes, the methods of Systems Analysis are used as the most promising approach in recent times. In general every process should be optimised with respect to reliability, safety, economy and environmental pollution. In this paper the complex relations between these general optimisation postulates are established in qualitative form. These general trend relations have to be quantified for every particular system studied in practice

  5. Modeling Suspension and Continuation of a Process

    Directory of Open Access Journals (Sweden)

    Oleg Svatos

    2012-04-01

    This work focuses on the difficulties an analyst encounters when modeling suspension and continuation of a process in contemporary process modeling languages. As a basis, a general lifecycle of an activity is introduced and then compared to the activity lifecycles supported by individual process modeling languages. The comparison shows that the contemporary process modeling languages cover the defined general lifecycle of an activity only partially. Two popular process modeling languages are then selected and a real example is modeled, reviewing how these languages cope with their lack of native support for suspension and continuation of an activity. Given the unsatisfactory results of the contemporary process modeling languages in the modeled example, a new process modeling language is presented which, as demonstrated, is capable of capturing suspension and continuation of an activity in a much simpler and more precise way.
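
    For orientation, a general activity lifecycle of the kind discussed here, including explicit suspension and continuation, can be sketched as a tiny state machine; the state and event names below are illustrative guesses, not the paper's exact definitions.

      # Illustrative activity lifecycle with explicit suspend/resume transitions.
      TRANSITIONS = {
          ("ready",     "start"):    "running",
          ("running",   "suspend"):  "suspended",
          ("suspended", "resume"):   "running",   # continuation of the activity
          ("running",   "complete"): "completed",
          ("running",   "abort"):    "aborted",
          ("suspended", "abort"):    "aborted",
      }

      def fire(state, event):
          try:
              return TRANSITIONS[(state, event)]
          except KeyError:
              raise ValueError(f"event '{event}' not allowed in state '{state}'")

      state = "ready"
      for event in ["start", "suspend", "resume", "complete"]:
          state = fire(state, event)
          print(event, "->", state)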

  6. 28 CFR 30.10 - How does the Attorney General make efforts to accommodate intergovernmental concerns?

    Science.gov (United States)

    2010-07-01

    ... 28 Judicial Administration 1 2010-07-01 2010-07-01 false How does the Attorney General make... the Attorney General make efforts to accommodate intergovernmental concerns? (a) If a state process... form as the Attorney General in his or her discretion deems appropriate. The Attorney General may also...

  7. Processed and ultra-processed foods are associated with lower-quality nutrient profiles in children from Colombia.

    Science.gov (United States)

    Cornwell, Brittany; Villamor, Eduardo; Mora-Plazas, Mercedes; Marin, Constanza; Monteiro, Carlos A; Baylin, Ana

    2018-01-01

    Objective: To determine if processed and ultra-processed foods consumed by children in Colombia are associated with lower-quality nutrition profiles than less processed foods. Design: We obtained information on sociodemographic and anthropometric variables and dietary information through dietary records and 24 h recalls from a convenience sample of the Bogotá School Children Cohort. Foods were classified into three categories: (i) unprocessed and minimally processed foods, (ii) processed culinary ingredients and (iii) processed and ultra-processed foods. We also examined the combination of unprocessed foods and processed culinary ingredients. Setting: Representative sample of children from low- to middle-income families in Bogotá, Colombia. Subjects: Children aged 5-12 years in the 2011 Bogotá School Children Cohort. Results: We found that processed and ultra-processed foods are of lower dietary quality in general. Nutrients that were lower in processed and ultra-processed foods following adjustment for total energy intake included: n-3 PUFA, vitamins A, B12, C and E, Ca and Zn. Nutrients that were higher in energy-adjusted processed and ultra-processed foods compared with unprocessed foods included: Na, sugar and trans-fatty acids, although we also found that some healthy nutrients, including folate and Fe, were higher in processed and ultra-processed foods compared with unprocessed and minimally processed foods. Conclusions: Processed and ultra-processed foods generally have unhealthy nutrition profiles. Our findings suggest the categorization of foods based on processing characteristics is promising for understanding the influence of food processing on children's dietary quality. More studies accounting for the type and degree of food processing are needed.

  8. Primordial Evolution in the Finitary Process Soup

    Science.gov (United States)

    Görnerup, Olof; Crutchfield, James P.

    A general and basic model of primordial evolution—a soup of reacting finitary and discrete processes—is employed to identify and analyze fundamental mechanisms that generate and maintain complex structures in prebiotic systems. The processes—ɛ-machines as defined in computational mechanics—and their interaction networks both provide well defined notions of structure. This enables us to quantitatively demonstrate hierarchical self-organization in the soup in terms of complexity. We found that replicating processes evolve the strategy of successively building higher levels of organization by autocatalysis. Moreover, this is facilitated by local components that have low structural complexity, but high generality. In effect, the finitary process soup spontaneously evolves a selection pressure that favors such components. In light of the finitary process soup's generality, these results suggest a fundamental law of hierarchical systems: global complexity requires local simplicity.

  9. Second order elasticity at hypersonic frequencies of reactive polyurethanes as seen by generalized Cauchy relations

    International Nuclear Information System (INIS)

    Philipp, M; Vergnat, C; Mueller, U; Sanctuary, R; Baller, J; Krueger, J K; Possart, W; Alnot, P

    2009-01-01

    The non-equilibrium process of polymerization of reactive polymers can be accompanied by transition phenomena like gelation or the chemical glass transition. The sensitivity of the mechanical properties at hypersonic frequencies-including the generalized Cauchy relation-to these transition phenomena is studied for three different polyurethanes using Brillouin spectroscopy. As for epoxies, the generalized Cauchy relation surprisingly holds true for the non-equilibrium polymerization process and for the temperature dependence of polyurethanes. Neither the sol-gel transition nor the chemical and thermal glass transitions are visible in the representation of the generalized Cauchy relation. Taking into account the new results and combining them with general considerations about the elastic properties of the isotropic state, an improved physical foundation of the generalized Cauchy relation is proposed.

  10. Second order elasticity at hypersonic frequencies of reactive polyurethanes as seen by generalized Cauchy relations

    Energy Technology Data Exchange (ETDEWEB)

    Philipp, M; Vergnat, C; Mueller, U; Sanctuary, R; Baller, J; Krueger, J K [Laboratoire de Physique des Materiaux, Universite du Luxembourg, 162A, avenue de la Faiencerie, L-1511 Luxembourg (Luxembourg); Possart, W [Fachbereich Werkstoffwissenschaften, Universitaet des Saarlandes, D-66123 Saarbruecken (Germany); Alnot, P [LPMI, Universite Nancy (France)], E-mail: martine.philipp@uni.lu

    2009-01-21

    The non-equilibrium process of polymerization of reactive polymers can be accompanied by transition phenomena like gelation or the chemical glass transition. The sensitivity of the mechanical properties at hypersonic frequencies-including the generalized Cauchy relation-to these transition phenomena is studied for three different polyurethanes using Brillouin spectroscopy. As for epoxies, the generalized Cauchy relation surprisingly holds true for the non-equilibrium polymerization process and for the temperature dependence of polyurethanes. Neither the sol-gel transition nor the chemical and thermal glass transitions are visible in the representation of the generalized Cauchy relation. Taking into account the new results and combining them with general considerations about the elastic properties of the isotropic state, an improved physical foundation of the generalized Cauchy relation is proposed.

  11. Second order elasticity at hypersonic frequencies of reactive polyurethanes as seen by generalized Cauchy relations.

    Science.gov (United States)

    Philipp, M; Vergnat, C; Müller, U; Sanctuary, R; Baller, J; Possart, W; Alnot, P; Krüger, J K

    2009-01-21

    The non-equilibrium process of polymerization of reactive polymers can be accompanied by transition phenomena like gelation or the chemical glass transition. The sensitivity of the mechanical properties at hypersonic frequencies-including the generalized Cauchy relation-to these transition phenomena is studied for three different polyurethanes using Brillouin spectroscopy. As for epoxies, the generalized Cauchy relation surprisingly holds true for the non-equilibrium polymerization process and for the temperature dependence of polyurethanes. Neither the sol-gel transition nor the chemical and thermal glass transitions are visible in the representation of the generalized Cauchy relation. Taking into account the new results and combining them with general considerations about the elastic properties of the isotropic state, an improved physical foundation of the generalized Cauchy relation is proposed.

  12. A big picture prospective for wet waste processing management

    International Nuclear Information System (INIS)

    Gibson, J.D.

    1996-01-01

    This paper provides an overview of general observations made relative to the technical and economical considerations being evaluated by many commercial nuclear power plants involving their decision making process for implementation of several new wet waste management technologies. The waste management processes reviewed include the use of Reverse Osmosis, Non-Precoat Filters, Resin Stripping & Recycling, Evaporation & Calcination (RVR™, ROVER™ & Thermax™), Compression Dewatering (PressPak™), Incineration (Resin Express™), Survey & Free Release (Green Is Clean) and Quantum Catalytic Extraction Processing (QCEP™). These waste management processes are reviewed relative to their general advantages and disadvantages associated with the processing of various wet waste streams including: reactor make-up water, floor drain sludges and other liquid waste streams such as boric acid concentrates and steam generator cleaning solutions. A summary of the conclusions generally being derived by most utilities associated with the use of these waste management processes is also provided

  13. Special procedural measures and the protection of human rights General report

    NARCIS (Netherlands)

    Vervaele, J.A.E.

    2009-01-01

    The aim of the general report is to conduct a comparative analysis of the national reports in order to trace transformation processes in domestic criminal justice systems, in particular criminal process, as special procedural measures are introduced to deal with terrorism and organised crime, and to

  14. A Generalized Evolution Criterion in Nonequilibrium Convective Systems

    Science.gov (United States)

    Ichiyanagi, Masakazu; Nisizima, Kunisuke

    1989-04-01

    A general evolution criterion, applicable to transport processes such as the conduction of heat and mass diffusion, is obtained as a direct version of the Le Chatelier-Braun principle for stationary states. The present theory is not based on any radical departure from the conventional one. The generalized theory is made determinate by proposing the balance equations for extensive thermodynamic variables which will reflect the character of convective systems under the assumption of local equilibrium. As a consequence of the introduction of source terms in the balance equations, there appear additional terms in the expression of the local entropy production, which are bilinear in terms of the intensive variables and the sources. In the present paper, we show that we can construct a dissipation function for such general cases, in which the premises of the Glansdorff-Prigogine theory are accumulated. The new dissipation function permits us to formulate a generalized evolution criterion for convective systems.

  15. NJOY-97, General ENDF/B Processing System for Reactor Design Problems

    International Nuclear Information System (INIS)

    1999-01-01

    1 - Description of program or function: The NJOY nuclear data processing system is a modular computer code used for converting evaluated nuclear data in the ENDF format into libraries useful for applications calculations. Because the Evaluated Nuclear Data File (ENDF) format is used all around the world (e.g., ENDF/B-VI in the US, JEF-2.2 in Europe, JENDL-3.2 in Japan, BROND-2.2 in Russia), NJOY gives its users access to a wide variety of the most up-to-date nuclear data. NJOY provides comprehensive capabilities for processing evaluated data, and it can serve applications ranging from continuous-energy Monte Carlo (MCNP), through deterministic transport codes (DANT, ANISN, DORT), to reactor lattice codes (WIMS, EPRI). NJOY handles a wide variety of nuclear effects, including resonances, Doppler broadening, heating (KERMA), radiation-damage, thermal scattering (even cold moderators), gas production, neutrons and charged particles, photo-atomic interactions, self shielding, probability tables, photon production, and high-energy interactions (to 150 MeV). Output can include printed listings, special library files for applications, and Postscript graphics (plus colour). More information on NJOY is available from the developer's home page at http://t2.lanl.gov. Follow the Tourbus section of the Tour area to find notes from the ICTP lectures held at Trieste in March 1998 on the ENDF format and on the NJOY code. 2 - Methods: NJOY97 consists of a set of modules, each performing a well-defined processing task. Each of these modules is essentially a separate computer program linked together by input and output files and a few common constants. The methods and instructions on how to use them are documented in the LA-12740-M report on NJOY91 and in the 'README' file. No new published document is yet available. NJOY97 is a cleaned up version of NJOY94.105 that features compatibility with a wider variety of compilers and machines, explicit double precision for 32-bit systems, a

  16. General H-theorem and Entropies that Violate the Second Law

    Directory of Open Access Journals (Sweden)

    Alexander N. Gorban

    2014-04-01

    H-theorem states that the entropy production is nonnegative and, therefore, the entropy of a closed system should change monotonically in time. In information processing, the entropy production is positive for random transformation of signals (the information processing lemma). Originally, the H-theorem and the information processing lemma were proved for the classical Boltzmann-Gibbs-Shannon entropy and for the corresponding divergence (the relative entropy). Many new entropies and divergences have been proposed during the last decades and for all of them the H-theorem is needed. This note proposes a simple and general criterion to check whether the H-theorem is valid for a convex divergence H and demonstrates that some of the popular divergences obey no H-theorem. We consider systems with n states A_i that obey first-order kinetics (master equation). A convex function H is a Lyapunov function for all master equations with a given equilibrium if and only if its conditional minima properly describe the equilibria of pair transitions A_i ⇌ A_j. This theorem does not depend on the principle of detailed balance and is valid for general Markov kinetics. Elementary analysis of pair equilibria demonstrates that the popular Bregman divergences like the Euclidean distance or the Itakura-Saito distance in the space of distributions cannot be universal Lyapunov functions for first-order kinetics and can increase in Markov processes. Therefore, they violate the second law and the information processing lemma. In particular, for these measures of information (divergences), random manipulation of data may add information to the data. The main results are extended to nonlinear generalized mass action law kinetic equations.
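
    For concreteness, a minimal LaTeX rendering of the setting described in this abstract (first-order Markov kinetics and a candidate Lyapunov divergence) might read as follows; the particular functional family shown for H is the standard Csiszár-Morimoto form and is an illustrative choice on our part, not necessarily the one singled out in the paper.

      % Master equation for n states A_i with rate constants k_{ij} >= 0
      \frac{dp_i}{dt} = \sum_{j \neq i} \left( k_{ij}\, p_j - k_{ji}\, p_i \right),
      \qquad i = 1, \dots, n ,

      % A convex divergence from the equilibrium distribution p^*
      H(p \,\|\, p^*) = \sum_{i=1}^{n} p_i^{*}\, h\!\left( \frac{p_i}{p_i^{*}} \right),
      \qquad h \ \text{convex},

      % H-theorem: H is a Lyapunov function for the kinetics
      \frac{d}{dt}\, H\big(p(t) \,\|\, p^*\big) \le 0 .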

  17. Uranium enrichment. Enrichment processes

    International Nuclear Information System (INIS)

    Alexandre, M.; Quaegebeur, J.P.

    2009-01-01

    Despite the remarkable progress made in the diversity and efficiency of the different uranium enrichment processes, only two industrial processes remain today that satisfy all enriched uranium needs: gaseous diffusion and centrifugation. This article describes both processes as well as some others still at the demonstration or laboratory stage of development: 1 - general considerations; 2 - gaseous diffusion: physical principles, implementation, utilisation in the world; 3 - centrifugation: principles, elementary separation factor, flows inside a centrifuge, modeling of separation efficiencies, mechanical design, types of industrial centrifuges, realisation of cascades, main characteristics of the centrifugation process; 4 - aerodynamic processes: vortex process, nozzle process; 5 - chemical exchange separation processes: Japanese ASAHI process, French CHEMEX process; 6 - laser-based processes: SILVA process, SILMO process; 7 - electromagnetic and ionic processes: mass spectrometer and calutron, ion cyclotron resonance, rotating plasmas; 8 - thermal diffusion; 9 - conclusion. (J.S.)
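
    As a small worked example of the elementary separation factor mentioned for these processes, the textbook ideal single-stage factor for gaseous diffusion of UF6 follows directly from the molecular masses; this standard relation is quoted here for illustration only and is not taken from the article itself.

      \alpha_0 = \sqrt{\frac{M(^{238}\mathrm{UF}_6)}{M(^{235}\mathrm{UF}_6)}}
               = \sqrt{\frac{352}{349}} \approx 1.0043

    The smallness of this factor is why gaseous diffusion plants require cascades of many hundreds of stages, whereas a single centrifuge achieves a substantially larger elementary separation factor.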

  18. 28 CFR 51.46 - Reconsideration of objection at the instance of the Attorney General.

    Science.gov (United States)

    2010-07-01

    ... instance of the Attorney General. 51.46 Section 51.46 Judicial Administration DEPARTMENT OF JUSTICE... Processing of Submissions § 51.46 Reconsideration of objection at the instance of the Attorney General. (a... may be reconsidered, if it is deemed appropriate, at the instance of the Attorney General. (b) Notice...

  19. Dissipative quantum mechanics: The generalization of the canonical quantization and von Neumann equation

    International Nuclear Information System (INIS)

    Tarasov, V.E.

    1994-07-01

    The Sedov variational principle, which is the generalization of the least action principle to dissipative processes, is used to generalize the canonical quantization and the von Neumann equation for dissipative systems (particles and strings). (author). 66 refs, 1 fig

  20. A generalized model via random walks for information filtering

    Energy Technology Data Exchange (ETDEWEB)

    Ren, Zhuo-Ming, E-mail: zhuomingren@gmail.com [Department of Physics, University of Fribourg, Chemin du Musée 3, CH-1700, Fribourg (Switzerland); Chongqing Institute of Green and Intelligent Technology, Chinese Academy of Sciences, ChongQing, 400714 (China); Kong, Yixiu [Department of Physics, University of Fribourg, Chemin du Musée 3, CH-1700, Fribourg (Switzerland); Shang, Ming-Sheng, E-mail: msshang@cigit.ac.cn [Chongqing Institute of Green and Intelligent Technology, Chinese Academy of Sciences, ChongQing, 400714 (China); Zhang, Yi-Cheng [Department of Physics, University of Fribourg, Chemin du Musée 3, CH-1700, Fribourg (Switzerland)

    2016-08-06

    There could exist a simple general mechanism lurking beneath collaborative filtering and the interdisciplinary physics approaches which have been successfully applied to online E-commerce platforms. Motivated by this idea, we propose a generalized model employing the dynamics of the random walk in bipartite networks. Taking into account the degree information, the proposed generalized model can deduce the collaborative filtering and interdisciplinary physics approaches, and even enormous expansions of them. Furthermore, we analyze the generalized model with single and hybrid degree information on the process of random walk in bipartite networks, and propose a possible strategy of using hybrid degree information for objects of different popularity toward promising precision of the recommendation. - Highlights: • We propose a generalized recommendation model employing the random walk dynamics. • The proposed model with single and hybrid degree information is analyzed. • A strategy with hybrid degree information improves precision of recommendation.
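
    A minimal Python sketch of the kind of random-walk (mass-diffusion) scoring on a user-object bipartite network that such a model generalizes is given below; the tiny adjacency matrix and the plain ProbS-style weighting are illustrative assumptions of ours, not the authors' generalized model.

      import numpy as np

      # Rows = users, columns = objects; 1 means the user has collected the object.
      A = np.array([
          [1, 1, 0, 0],
          [1, 0, 1, 0],
          [0, 1, 1, 1],
      ], dtype=float)

      k_user = A.sum(axis=1)    # user degrees
      k_obj = A.sum(axis=0)     # object degrees

      def probs_scores(A, user):
          """Two-step random walk (objects -> users -> objects), mass-diffusion style."""
          resource = A[user].copy()                    # unit resource on collected objects
          to_users = (A / k_obj) @ resource            # spread to users through object degree
          back = (A.T / k_user) @ to_users             # spread back through user degree
          back[A[user] > 0] = 0.0                      # do not recommend already-collected items
          return back

      print(np.round(probs_scores(A, user=0), 3))      # scores for objects user 0 lacks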

  1. A generalized model via random walks for information filtering

    International Nuclear Information System (INIS)

    Ren, Zhuo-Ming; Kong, Yixiu; Shang, Ming-Sheng; Zhang, Yi-Cheng

    2016-01-01

    There could exist a simple general mechanism lurking beneath collaborative filtering and the interdisciplinary physics approaches which have been successfully applied to online E-commerce platforms. Motivated by this idea, we propose a generalized model employing the dynamics of the random walk in bipartite networks. Taking into account the degree information, the proposed generalized model can deduce the collaborative filtering and interdisciplinary physics approaches, and even enormous expansions of them. Furthermore, we analyze the generalized model with single and hybrid degree information on the process of random walk in bipartite networks, and propose a possible strategy of using hybrid degree information for objects of different popularity toward promising precision of the recommendation. - Highlights: • We propose a generalized recommendation model employing the random walk dynamics. • The proposed model with single and hybrid degree information is analyzed. • A strategy with hybrid degree information improves precision of recommendation.

  2. Self-Intersection Local Times of Generalized Mixed Fractional Brownian Motion as White Noise Distributions

    International Nuclear Information System (INIS)

    Suryawan, Herry P.; Gunarso, Boby

    2017-01-01

    The generalized mixed fractional Brownian motion is defined by taking linear combinations of a finite number of independent fractional Brownian motions with different Hurst parameters. It is a Gaussian process with stationary increments, possesses the self-similarity property, and, in general, is neither a Markov process nor a martingale. In this paper we study the generalized mixed fractional Brownian motion within the white noise analysis framework. As a main result, we prove that for any spatial dimension and for arbitrary Hurst parameters the self-intersection local times of the generalized mixed fractional Brownian motions, after a suitable renormalization, are well-defined as Hida white noise distributions. The chaos expansions of the self-intersection local times in terms of Wick powers of white noises are also presented. (paper)
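
    Since the abstract defines the process as a finite linear combination of independent fractional Brownian motions, a compact LaTeX statement of the definition and of the resulting covariance (an immediate consequence of independence; the symbols a_i and H_i are our own notation) is:

      M(t) = \sum_{i=1}^{m} a_i\, B^{H_i}(t),
      \qquad a_i \in \mathbb{R}, \; H_i \in (0,1),

      \operatorname{Cov}\big(M(t), M(s)\big)
        = \sum_{i=1}^{m} \frac{a_i^{2}}{2}
          \left( t^{2H_i} + s^{2H_i} - |t-s|^{2H_i} \right),
      \qquad t, s \ge 0 .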

  3. An experimental evaluation of the generalizing capabilities of process discovery techniques and black-box sequence models

    NARCIS (Netherlands)

    Tax, N.; van Zelst, S.J.; Teinemaa, I.; Gulden, Jens; Reinhartz-Berger, Iris; Schmidt, Rainer; Guerreiro, Sérgio; Guédria, Wided; Bera, Palash

    2018-01-01

    A plethora of automated process discovery techniques have been developed which aim to discover a process model based on event data originating from the execution of business processes. The aim of the discovered process models is to describe the control-flow of the underlying business process. At the

  4. Reliability assessment of competing risks with generalized mixed shock models

    International Nuclear Information System (INIS)

    Rafiee, Koosha; Feng, Qianmei; Coit, David W.

    2017-01-01

    This paper investigates reliability modeling for systems subject to dependent competing risks considering the impact from a new generalized mixed shock model. Two dependent competing risks are soft failure due to a degradation process, and hard failure due to random shocks. The shock process contains fatal shocks that can cause hard failure instantaneously, and nonfatal shocks that impact the system in three different ways: 1) damaging the unit by immediately increasing the degradation level, 2) speeding up the deterioration by accelerating the degradation rate, and 3) weakening the unit strength by reducing the hard failure threshold. While the first impact from nonfatal shocks comes from each individual shock, the other two impacts are realized when the condition for a new generalized mixed shock model is satisfied. Unlike most existing mixed shock models that consider a combination of two shock patterns, our new generalized mixed shock model includes three classic shock patterns. According to the proposed generalized mixed shock model, the degradation rate and the hard failure threshold can simultaneously shift multiple times, whenever the condition for one of these three shock patterns is satisfied. An example using micro-electro-mechanical systems devices illustrates the effectiveness of the proposed approach with sensitivity analysis. - Highlights: • A rich reliability model for systems subject to dependent failures is proposed. • The degradation rate and the hard failure threshold can shift simultaneously. • The shift is triggered by a new generalized mixed shock model. • The shift can occur multiple times under the generalized mixed shock model.
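
    A rough Monte Carlo sketch of the competing-risk mechanism described above is given below in Python; the distributions, parameter values and the simple trigger rule ("after the first nonfatal shock, the degradation rate and the hard-failure threshold shift") are illustrative stand-ins of ours, not the paper's generalized mixed shock model.

      import random

      def simulate_lifetime(horizon=20.0, dt=0.01):
          """One sample path of a unit under continuous degradation plus random shocks."""
          rate, threshold_soft = 0.08, 2.0      # degradation rate, soft-failure threshold
          threshold_hard = 6.0                  # initial hard-failure threshold (shock size)
          shock_rate, p_fatal = 0.3, 0.05       # Poisson shock arrivals, fatal-shock prob.
          degradation, t, shocks_seen = 0.0, 0.0, 0

          while t < horizon:
              t += dt
              degradation += rate * dt
              if random.random() < shock_rate * dt:            # a shock arrives
                  if random.random() < p_fatal:
                      return t, "hard (fatal shock)"
                  size = random.expovariate(1.0)                # nonfatal shock magnitude
                  degradation += 0.2 * size                     # (1) damage: jump in degradation
                  if size > threshold_hard:
                      return t, "hard (shock above threshold)"
                  shocks_seen += 1
                  if shocks_seen == 1:                          # illustrative trigger condition
                      rate *= 1.5                               # (2) accelerated degradation
                      threshold_hard *= 0.8                     # (3) weakened unit strength
              if degradation >= threshold_soft:
                  return t, "soft (degradation)"
          return horizon, "survived"

      random.seed(1)
      results = [simulate_lifetime() for _ in range(2000)]
      print(sum(1 for _, mode in results if mode.startswith("soft")) / len(results),
            "fraction of soft failures")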

  5. REQUIREMENTS FOR A GENERAL INTERPRETATION THEORY

    Directory of Open Access Journals (Sweden)

    Anda Laura Lungu Petruescu

    2013-06-01

    Time has proved that Economic Analysis is not enough to meet all the needs of the economic field. The present study proposes a new method of approaching economic phenomena and processes, based on research made outside the economic space - a new general interpretation theory - which is centered on the human being as the basic actor of the economy. A general interpretation theory must assure the interpretation of the causalities among economic phenomena and processes - causal interpretation; the interpretation of the correlations and dependencies among indicators - normative interpretation; the interpretation of social and communicational processes in economic organizations - social and communicational interpretation; the interpretation of the community status of companies - transsocial interpretation; the interpretation of the purposes of human activities and their coherency - teleological interpretation; and the interpretation of equilibrium/disequilibrium inside economic systems - optimality interpretation. In order to respond to such demands, rigor, pragmatism, praxiology and contextual connectors are required. In order to progress, economic science must improve its language, both its syntax and its semantics. Clarity of exposition requires clarity of language, and progress in scientific theory requires hypotheses in the building of theories. The switch from the common language to the symbolic one means the switch from ambiguity to rigor and rationality, that is, order in thinking. But order implies structure, which implies formalization. Our paper is a plea for these requirements, requirements which should be fulfilled by a modern interpretation theory.

  6. Decomposition of variance for spatial Cox processes

    DEFF Research Database (Denmark)

    Jalilian, Abdollah; Guan, Yongtao; Waagepetersen, Rasmus

    Spatial Cox point processes is a natural framework for quantifying the various sources of variation governing the spatial distribution of rain forest trees. We introduce a general criterion for variance decomposition for spatial Cox processes and apply it to specific Cox process models...

  7. Decomposition of variance for spatial Cox processes

    DEFF Research Database (Denmark)

    Jalilian, Abdollah; Guan, Yongtao; Waagepetersen, Rasmus

    2013-01-01

    Spatial Cox point processes is a natural framework for quantifying the various sources of variation governing the spatial distribution of rain forest trees. We introduce a general criterion for variance decomposition for spatial Cox processes and apply it to specific Cox process models...

  8. Neural responses to ambiguity involve domain-general and domain-specific emotion processing systems.

    Science.gov (United States)

    Neta, Maital; Kelley, William M; Whalen, Paul J

    2013-04-01

    Extant research has examined the process of decision making under uncertainty, specifically in situations of ambiguity. However, much of this work has been conducted in the context of semantic and low-level visual processing. An open question is whether ambiguity in social signals (e.g., emotional facial expressions) is processed similarly or whether a unique set of processors come on-line to resolve ambiguity in a social context. Our work has examined ambiguity using surprised facial expressions, as they have predicted both positive and negative outcomes in the past. Specifically, whereas some people tended to interpret surprise as negatively valenced, others tended toward a more positive interpretation. Here, we examined neural responses to social ambiguity using faces (surprise) and nonface emotional scenes (International Affective Picture System). Moreover, we examined whether these effects are specific to ambiguity resolution (i.e., judgments about the ambiguity) or whether similar effects would be demonstrated for incidental judgments (e.g., nonvalence judgments about ambiguously valenced stimuli). We found that a distinct task control (i.e., cingulo-opercular) network was more active when resolving ambiguity. We also found that activity in the ventral amygdala was greater to faces and scenes that were rated explicitly along the dimension of valence, consistent with findings that the ventral amygdala tracks valence. Taken together, there is a complex neural architecture that supports decision making in the presence of ambiguity: (a) a core set of cortical structures engaged for explicit ambiguity processing across stimulus boundaries and (b) other dedicated circuits for biologically relevant learning situations involving faces.

  9. Theory of generalized Bessel functions

    International Nuclear Information System (INIS)

    Dattoli, G.; Giannessi, L.; Mezi, L.; Torre, A.

    1990-01-01

    In this paper the theory of generalized Bessel functions, which are of noticeable importance in the analysis of scattering processes for which the dipole approximation cannot be used, is discussed. These functions have been introduced in their standard form and in their modified version. The relevant generating functions and Graf-type addition theorems have been stated. The usefulness of the results to construct a fast algorithm for their quantitative computation is also shown. The possibility of obtaining two-index generalized Bessel functions is commented on, e.g. in the study of sum rules of the type Σ_{n=-∞}^{∞} t^n J_n^3(x), where J_n is the cylindrical Bessel function of the first kind. The usefulness of the results for problems of practical interest is finally commented on. It is shown that a modified Anger function can be advantageously introduced to get an almost straightforward computation of the Bernstein sum rule in the theory of ion waves.
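
    The sum rule quoted in this abstract can be checked numerically with only a short truncation, because J_n(x) decays very rapidly once |n| exceeds x; the following Python sketch (our own illustration, using SciPy's standard Bessel routine) evaluates the truncated sum for a few values of t.

      import numpy as np
      from scipy.special import jv  # cylindrical Bessel function of the first kind J_n(x)

      def bessel_cube_sum(t, x, nmax=40):
          """Truncated evaluation of sum_{n=-nmax}^{nmax} t**n * J_n(x)**3."""
          n = np.arange(-nmax, nmax + 1)
          return np.sum(np.power(float(t), n) * jv(n, x) ** 3)

      x = 2.0
      for t in (0.5, 1.0, 2.0):
          print(f"t = {t}: sum ≈ {bessel_cube_sum(t, x):.10f}")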

  10. A generalized wavelet extrema representation

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Jian; Lades, M.

    1995-10-01

    The wavelet extrema representation originated by Stephane Mallat is a unique framework for low-level and intermediate-level (feature) processing. In this paper, we present a new form of wavelet extrema representation generalizing Mallat's original work. The generalized wavelet extrema representation is a feature-based multiscale representation. For a particular choice of wavelet, our scheme can be interpreted as representing a signal or image by its edges, and peaks and valleys at multiple scales. Such a representation is shown to be stable -- the original signal or image can be reconstructed with very good quality. It is further shown that a signal or image can be modeled as piecewise monotonic, with all turning points between monotonic segments given by the wavelet extrema. A new projection operator is introduced to enforce piecewise monotonicity of a signal in its reconstruction. This leads to an enhancement to previously developed algorithms in preventing artifacts in the reconstructed signal.
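
    A bare-bones illustration of locating multiscale extrema is given below in Python; it uses a derivative-of-Gaussian convolution as the wavelet (a common choice for edge-like extrema, assumed here purely for illustration) rather than the specific wavelets of the report.

      import numpy as np

      def dog_wavelet(scale, width=6):
          """Derivative-of-Gaussian kernel at the given scale (edge detector)."""
          x = np.arange(-width * scale, width * scale + 1, dtype=float)
          g = np.exp(-x**2 / (2.0 * scale**2))
          return -x / scale**2 * g        # derivative of the Gaussian

      def wavelet_extrema(signal, scales=(1, 2, 4)):
          """Return, per scale, the indices where |W(signal)| has a local maximum."""
          extrema = {}
          for s in scales:
              w = np.convolve(signal, dog_wavelet(s), mode="same")
              mag = np.abs(w)
              idx = np.where((mag[1:-1] > mag[:-2]) & (mag[1:-1] > mag[2:]))[0] + 1
              extrema[s] = idx
          return extrema

      # Step edge plus a small bump: the extrema mark the edge and the bump at each scale.
      t = np.linspace(0, 1, 256)
      signal = (t > 0.5).astype(float) + 0.3 * np.exp(-((t - 0.25) / 0.02) ** 2)
      for s, idx in wavelet_extrema(signal).items():
          print(f"scale {s}: extrema near samples {list(idx[:5])} ...")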

  11. Process monitoring for intelligent manufacturing processes - Methodology and application to Robot Assisted Polishing

    DEFF Research Database (Denmark)

    Pilny, Lukas

    Process monitoring provides important information on the product, process and manufacturing system during part manufacturing. Such information can be used for process optimization and detection of undesired processing conditions to initiate timely actions for avoidance of defects, thereby improving...... quality assurance. This thesis is aimed at a systematic development of process monitoring solutions, constituting a key element of intelligent manufacturing systems towards zero defect manufacturing. A methodological approach of general applicability is presented in this concern.The approach consists...... of six consecutive steps for identification of product Vital Quality Characteristics (VQCs) and Key Process Variables (KPVs), selection and characterization of sensors, optimization of sensors placement, validation of the monitoring solutions, definition of the reference manufacturing performance...

  12. Generalized 2-vector spaces and general linear 2-groups

    OpenAIRE

    Elgueta, Josep

    2008-01-01

    In this paper a notion of generalized 2-vector space is introduced which includes Kapranov and Voevodsky 2-vector spaces. Various kinds of generalized 2-vector spaces are considered and examples are given. The existence of non-free generalized 2-vector spaces and of generalized 2-vector spaces which are non-Karoubian (hence, non-abelian) categories is discussed, and it is shown how any generalized 2-vector space can be identified with a full subcategory of an (abelian) functor category ...

  13. Decomposition of variance for spatial Cox processes

    DEFF Research Database (Denmark)

    Jalilian, Abdollah; Guan, Yongtao; Waagepetersen, Rasmus

    Spatial Cox point processes is a natural framework for quantifying the various sources of variation governing the spatial distribution of rain forest trees. We introducea general criterion for variance decomposition for spatial Cox processes and apply it to specific Cox process models with additive...

  14. THE INTERNAL AUDIT AS COGNITIVE PROCESS

    Directory of Open Access Journals (Sweden)

    D. Petrascu

    2016-11-01

    The term AUDIT generally comes from the Latin word "audire", to listen and to inform others; in today's Anglo-Saxon countries, this term denotes a revision of the accounting information, and of information of a different nature, carried out by an independent professional with a view to expressing an opinion regarding the regularity and honesty of the audited information (Ştefan Crăciun, Audit financiar şi audit intern, The Economic Publishing House, Bucharest, 2004, page 22). In general terms, an audit aims to provide an entity with added value through a systematic and methodical approach, evaluating the risk management processes, the control processes and the governance processes, all of which are materialized within an objective and professional report.

  15. Generalized Multiphoton Quantum Interference

    Directory of Open Access Journals (Sweden)

    Max Tillmann

    2015-10-01

    Nonclassical interference of photons lies at the heart of optical quantum information processing. Here, we exploit tunable distinguishability to reveal the full spectrum of multiphoton nonclassical interference. We investigate this in theory and experiment by controlling the delay times of three photons injected into an integrated interferometric network. We derive the entire coincidence landscape and identify transition matrix immanants as ideally suited functions to describe the generalized case of input photons with arbitrary distinguishability. We introduce a compact description by utilizing a natural basis that decouples the input state from the interferometric network, thereby providing a useful tool for even larger photon numbers.
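
    To give a feel for why matrix functions such as permanents and immanants enter, the Python sketch below (our own illustration for three fully indistinguishable versus fully distinguishable photons in a 3x3 Fourier interferometer, i.e. only the two extreme points of the distinguishability landscape studied here) compares the coincidence probability computed from the permanent of the transfer submatrix with the classical value.

      import numpy as np
      from itertools import permutations

      def permanent(M):
          """Naive permanent via permutation expansion (fine for 3x3 matrices)."""
          n = M.shape[0]
          return sum(np.prod([M[i, p[i]] for i in range(n)]) for p in permutations(range(n)))

      # 3x3 Fourier ("tritter") transfer matrix, one photon in each input mode.
      n = 3
      U = np.array([[np.exp(2j * np.pi * j * k / n) for k in range(n)]
                    for j in range(n)]) / np.sqrt(n)

      sub = U[np.ix_([0, 1, 2], [0, 1, 2])]   # rows: occupied inputs, columns: monitored outputs

      p_indistinguishable = abs(permanent(sub)) ** 2          # quantum (bosonic) case
      p_distinguishable = permanent(np.abs(sub) ** 2).real    # classical probabilities

      print(f"coincidence, indistinguishable photons: {p_indistinguishable:.4f}")
      print(f"coincidence, distinguishable photons:  {p_distinguishable:.4f}")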

  16. Experimental studies of generalized parton distributions

    International Nuclear Information System (INIS)

    Kabuss, E.M.

    2014-01-01

    Generalized parton distributions (GPD) provide a new way to study the nucleon structure. Experimentally they can be accessed using hard exclusive processes such as deeply virtual Compton scattering and meson production. First insights into GPDs were already obtained from measurements at DESY, JLAB and CERN, while new ambitious studies are planned at the upgraded JLAB at 12 GeV and at CERN. Here, some emphasis will be put onto the planned COMPASS II programme. (author)

  17. Syncope prevalence in the ED compared to general practice and population: a strong selection process

    NARCIS (Netherlands)

    Olde Nordkamp, Louise R. A.; van Dijk, Nynke; Ganzeboom, Karin S.; Reitsma, Johannes B.; Luitse, Jan S. K.; Dekker, Lukas R. C.; Shen, Win-Kuang; Wieling, Wouter

    2009-01-01

    Objective: We assessed the prevalence and distribution of the different causes of transient loss of consciousness (TLOC) in the emergency department (ED) and chest pain unit (CPU) and estimated the proportion of persons with syncope in the general population who seek medical attention from either

  18. Plasma processing for VLSI

    CERN Document Server

    Einspruch, Norman G

    1984-01-01

    VLSI Electronics: Microstructure Science, Volume 8: Plasma Processing for VLSI (Very Large Scale Integration) discusses the utilization of plasmas for general semiconductor processing. It also includes expositions on advanced deposition of materials for metallization, lithographic methods that use plasmas as exposure sources and for multiple resist patterning, and device structures made possible by anisotropic etching. This volume is divided into four sections. It begins with the history of plasma processing, a discussion of some of the early developments and trends for VLSI. The second section

  19. Introduction to electron beam processing

    Energy Technology Data Exchange (ETDEWEB)

    Kawakami, Waichiro [Japan Atomic Energy Research Inst., Takasaki, Gunma (Japan). Takasaki Radiation Chemistry Research Establishment

    1994-12-31

    The contents are: general features in the irradiation of polymers; electron beam machines - low energy, medium energy, high energy; application of EB machines in industries; engineering of EB processing; dosimetry of EB (electron beam); safe operation of EB machines; recent topics on EB processing under development. 3 tabs., 4 figs., 17 refs.

  20. Introduction to electron beam processing

    International Nuclear Information System (INIS)

    Waichiro Kawakami

    1994-01-01

    The contents are: general features in the irradiation of polymers; electron beam machines - low energy, medium energy, high energy; application of EB machines in industries; engineering of EB processing; dosimetry of EB (electron beam); safe operation of EB machines; recent topics on EB processing under development. 3 tabs., 4 figs., 17 refs

  1. Matérn thinned Cox processes

    DEFF Research Database (Denmark)

    Andersen, Ina Trolle; Hahn, Ute

    2016-01-01

    and hard core behaviour can be achieved by applying a dependent Matérn thinning to a Cox process. An exact formula for the intensity of a Matérn thinned shot noise Cox process is derived from the Palm distribution. For the more general class of Matérn thinned Cox processes, formulae for the intensity...

  2. Matérn thinned Cox processes

    DEFF Research Database (Denmark)

    Andersen, Ina Trolle; Hahn, Ute

    of clustering and hard core behaviour can be achieved by applying a dependent Matérn thinning to a Cox process. An exact formula for the intensity of a Matérn thinned shot noise Cox process is derived from the Palm distribution. For the more general class of Matérn thinned Cox processes, formulae...

  3. Do patients' faces influence General Practitioners' cancer suspicions? A test of automatic processing of sociodemographic information.

    Directory of Open Access Journals (Sweden)

    Rosalind Adam

    Full Text Available Delayed cancer diagnosis leads to poorer patient outcomes. During short consultations, General Practitioners (GPs) make quick decisions about likelihood of cancer. Patients' facial cues are processed rapidly and may influence diagnosis. To investigate whether patients' facial characteristics influence immediate perception of cancer risk by GPs. Web-based binary forced choice experiment with GPs from Northeast Scotland. GPs were presented with a series of pairs of face prototypes and asked to quickly select the patient more likely to have cancer. Faces were modified with respect to age, gender, and ethnicity. Choices were analysed using Chi-squared goodness-of-fit statistics with Bonferroni corrections. Eighty-two GPs participated. GPs were significantly more likely to suspect cancer in older patients. Gender influenced GP cancer suspicion, but this was modified by age: the male face was chosen as more likely to have cancer than the female face for young (72% of GPs; 95% CI 61.0-87.0) and middle-aged faces (65.9%; 95% CI 54.7-75.5); but 63.4% (95% CI 52.2-73.3) decided the older female was more likely to have cancer than the older male (p = 0.015). GPs were significantly more likely to suspect cancer in the young Caucasian male (65.9%; 95% CI 54.7, 75.5) compared to the young Asian male (p = 0.004). GPs' first impressions about cancer risk are influenced by patient age, gender, and ethnicity. Tackling GP cognitive biases could be a promising way of reducing cancer diagnostic delays, particularly for younger patients.

  4. General dental practitioner's views on dental general anaesthesia services.

    Science.gov (United States)

    Threlfall, A G; King, D; Milsom, K M; Blinkhorn, A S; Tickle, M

    2007-06-01

    Policy has recently changed on provision of dental general anaesthetic services in England. The aim of this study was to investigate general dental practitioners' views about dental general anaesthetics, the reduction in its availability and the impact on care of children with toothache. Qualitative study using semi-structured interviews and clinical case scenarios. General dental practitioners providing NHS services in the North West of England. 93 general dental practitioners were interviewed and 91 answered a clinical case scenario about the care they would provide for a 7-year-old child with multiple decayed teeth presenting with toothache. Scenario responses showed variation; 8% would immediately refer for general anaesthesia, 25% would initially prescribe antibiotics, but the majority would attempt to either restore or extract the tooth causing pain. Interview responses also demonstrated variation in care, however most dentists agree general anaesthesia has a role for nervous children but only refer as a last resort. The responses indicated an increase in inequalities, and that access to services did not match population needs, leaving some children waiting in pain. Most general dental practitioners support moving dental general anaesthesia into hospitals but some believe that it has widened health inequalities and there is also a problem associated with variation in treatment provision. Additional general anaesthetic services in some areas with high levels of tooth decay are needed and evidence based guidelines about caring for children with toothache are required.

  5. Generalized Laws of Black Hole Thermodynamics and Quantum Conservation Laws on Hawking Radiation Process

    OpenAIRE

    Wu, S. Q.; Cai, X.

    2000-01-01

    Four classical laws of black hole thermodynamics are extended from exterior (event) horizon to interior (Cauchy) horizon. Especially, the first law of classical thermodynamics for Kerr-Newman black hole (KNBH) is generalized to those in quantum form. Then five quantum conservation laws on the KNBH evaporation effect are derived in virtue of thermodynamical equilibrium conditions. As a by-product, Bekenstein-Hawking's relation $ S=A/4 $ is exactly recovered.

  6. Generalized laws of black-hole thermodynamics and quantum conservation laws on Hawking radiation process

    International Nuclear Information System (INIS)

    Wu, S.Q.; Cai, X.

    2000-01-01

    Four classical laws of black-hole thermodynamics are extended from exterior (event) horizon to interior (Cauchy) horizon. Especially, the first law of classical thermodynamics for Kerr-Newman black hole (KNBH) is generalized to those in quantum form. Then five quantum conservation laws on the KNBH evaporation effect are derived in virtue of thermodynamical equilibrium conditions. As a by-product, Bekenstein-Hawking's relation S=A/4 is exactly recovered.

  7. 7 CFR 1710.119 - Loan processing priorities.

    Science.gov (United States)

    2010-01-01

    ... and Basic Policies § 1710.119 Loan processing priorities. (a) Generally loans are processed in... in effect at the time the facilities were originally constructed; (3) To finance the capital needs of...

  8. Guidelines for computer security in general practice.

    Science.gov (United States)

    Schattner, Peter; Pleteshner, Catherine; Bhend, Heinz; Brouns, Johan

    2007-01-01

    As general practice becomes increasingly computerised, data security becomes increasingly important for both patient health and the efficient operation of the practice. To develop guidelines for computer security in general practice based on a literature review, an analysis of available information on current practice and a series of key stakeholder interviews. While the guideline was produced in the context of Australian general practice, we have developed a template that is also relevant for other countries. Current data on computer security measures was sought from Australian divisions of general practice. Semi-structured interviews were conducted with general practitioners (GPs), the medical software industry, senior managers within government responsible for health IT (information technology) initiatives, technical IT experts, divisions of general practice and a member of a health information consumer group. The respondents were asked to assess both the likelihood and the consequences of potential risks in computer security being breached. The study suggested that the most important computer security issues in general practice were: the need for a nominated IT security coordinator; having written IT policies, including a practice disaster recovery plan; controlling access to different levels of electronic data; doing and testing backups; protecting against viruses and other malicious codes; installing firewalls; undertaking routine maintenance of hardware and software; and securing electronic communication, for example via encryption. This information led to the production of computer security guidelines, including a one-page summary checklist, which were subsequently distributed to all GPs in Australia. This paper maps out a process for developing computer security guidelines for general practice. The specific content will vary in different countries according to their levels of adoption of IT, and cultural, technical and other health service factors. Making

  9. Cellular registration without behavioral recall of olfactory sensory input under general anesthesia.

    Science.gov (United States)

    Samuelsson, Andrew R; Brandon, Nicole R; Tang, Pei; Xu, Yan

    2014-04-01

    Previous studies suggest that sensory information is "received" but not "perceived" under general anesthesia. Whether and to what extent the brain continues to process sensory inputs in a drug-induced unconscious state remain unclear. One hundred seven rats were randomly assigned to 12 different anesthesia and odor exposure paradigms. The immunoreactivities of the immediate early gene products c-Fos and Egr1 as neural activity markers were combined with behavioral tests to assess the integrity and relationship of cellular and behavioral responsiveness to olfactory stimuli under a surgical plane of ketamine-xylazine general anesthesia. The olfactory sensory processing centers could distinguish the presence or absence of experimental odorants even when animals were fully anesthetized. In the anesthetized state, the c-Fos immunoreactivity in the higher olfactory cortices revealed a difference between novel and familiar odorants similar to that seen in the awake state, suggesting that the anesthetized brain functions beyond simply receiving external stimulation. Reexposing animals to odorants previously experienced only under anesthesia resulted in c-Fos immunoreactivity, which was similar to that elicited by familiar odorants, indicating that previous registration had occurred in the anesthetized brain. Despite the "cellular memory," however, odor discrimination and forced-choice odor-recognition tests showed absence of behavioral recall of the registered sensations, except for a longer latency in odor recognition tests. Histologically distinguishable registration of sensory processing continues to occur at the cellular level under ketamine-xylazine general anesthesia despite the absence of behavioral recognition, consistent with the notion that general anesthesia causes disintegration of information processing without completely blocking cellular communications.

  10. On generalization of electric field strength in longitudinally blown arcs

    OpenAIRE

    Yas'ko, O.I.; Esipchuk, A.M.; Qing, Z.; Schram, D.C.; Fauchais, P.

    1997-01-01

    Generalization of the average electric field strength for different discharge conditions in longitudinally blown arcs is considered. Experimental data for distinctive devices and different gases were used for physical modeling. Analysis showed that heat transfer processes are responsible for the formation of the I-E characteristic. Turbulent heat transfer is the most effective for atmospheric-pressure discharges, while convection plays the main role in vacuum arcs. A generalized I-E characteristic was obtained. [on SciFinder (R)]

  11. Process modelling on a canonical basis [Process modelling; Canonical modelling]

    Energy Technology Data Exchange (ETDEWEB)

    Siepmann, Volker

    2006-12-20

    Based on an equation-oriented solving strategy, this thesis investigates a new approach to process modelling. Homogeneous thermodynamic state functions represent consistent mathematical models of thermodynamic properties. Such state functions of solely extensive canonical state variables are the basis of this work, as they are natural objective functions in optimisation nodes to calculate thermodynamic equilibrium regarding phase-interaction and chemical reactions. Analytical state function derivatives are utilised within the solution process as well as interpreted as physical properties. By this approach, only a limited range of imaginable process constraints is considered, namely linear balance equations of state variables. A second-order update of source contributions to these balance equations is obtained by an additional constitutive equation system. These equations are generally dependent on state variables and first-order sensitivities, and therefore cover practically all potential process constraints. Symbolic computation technology efficiently provides sparsity and derivative information of active equations to avoid performance problems regarding robustness and computational effort. A benefit of detaching the constitutive equation system is that the structure of the main equation system remains unaffected by these constraints, and a priori information allows the implementation of an efficient solving strategy and a concise error diagnosis. A tailor-made linear algebra library handles the sparse recursive block structures efficiently. The optimisation principle for single modules of thermodynamic equilibrium is extended to host entire process models. State variables of different modules interact through balance equations, representing material flows from one module to the other. To account for reusability and encapsulation of process module details, modular process modelling is supported by a recursive module structure. The second-order solving algorithm makes it

  12. Generalizing on best practices in image processing: a model for promoting research integrity: Commentary on: Avoiding twisted pixels: ethical guidelines for the appropriate use and manipulation of scientific digital images.

    Science.gov (United States)

    Benos, Dale J; Vollmer, Sara H

    2010-12-01

    Modifying images for scientific publication is now quick and easy due to changes in technology. This has created a need for new image processing guidelines and attitudes, such as those offered to the research community by Doug Cromey (Cromey 2010). We suggest that related changes in technology have simplified the task of detecting misconduct for journal editors as well as researchers, and that this simplification has caused a shift in the responsibility for reporting misconduct. We also argue that the concept of best practices in image processing can serve as a general model for education in best practices in research.

  13. 39 CFR 2.2 - Agent for receipt of process.

    Science.gov (United States)

    2010-07-01

    ... 39 Postal Service 1 2010-07-01 2010-07-01 false Agent for receipt of process. 2.2 Section 2.2 Postal Service UNITED STATES POSTAL SERVICE THE BOARD OF GOVERNORS OF THE U.S. POSTAL SERVICE GENERAL AND TECHNICAL PROVISIONS (ARTICLE II) § 2.2 Agent for receipt of process. The General Counsel of the Postal...

  14. Connecting Achievement Motivation to Performance in General Chemistry

    Science.gov (United States)

    Ferrell, Brent; Phillips, Michael M.; Barbera, Jack

    2016-01-01

    Student success in chemistry is inherently tied to motivational and other affective processes. We investigated three distinct constructs tied to motivation: self-efficacy, interest, and effort beliefs. These variables were measured twice over the course of a semester in three sections of a first-semester general chemistry course (n = 170). We…

  15. In-process and post-process measurements of drill wear for control of the drilling process

    Science.gov (United States)

    Liu, Tien-I.; Liu, George; Gao, Zhiyu

    2011-12-01

    Optical inspection was used in this research for the post-process measurement of drill wear; a precision toolmakers' microscope was used. An indirect index, the cutting force, is used for in-process drill wear measurement. Using in-process measurements to estimate drill wear for control purposes can decrease the operation cost and enhance product quality and safety. The challenge is to correlate the in-process cutting force measurements with the post-process optical inspection of drill wear. To find the most important feature, the energy principle was used in this research: it is necessary to select only the cutting force feature which shows the highest sensitivity to drill wear. The best feature selected is the peak torque in the drilling process. Neuro-fuzzy systems were used for correlation purposes. The Adaptive-Network-Based Fuzzy Inference System (ANFIS) can construct fuzzy rules with membership functions to generate an input-output mapping. A 1x6 ANFIS architecture with product-of-sigmoid membership functions can measure drill wear in-process with an error as low as 0.15%, which is extremely important for control of the drilling process. Furthermore, the measurement of drill wear was performed under different drilling conditions, showing that ANFIS has the capability of generalization.
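
    As an illustration of the kind of correlation the record describes, the following is a minimal sketch (not the authors' implementation) of a one-input, six-rule Sugeno-type fuzzy model with product-of-sigmoid memberships and rule consequents fitted by least squares. The peak-torque and wear values are invented for the example, and a full ANFIS would additionally tune the membership parameters.

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical training data: peak drilling torque (N*m) vs. measured drill wear (mm).
        torque = np.linspace(1.0, 3.0, 21)
        wear = 0.05 + 0.12 * (torque - 1.0) ** 1.5 + 0.005 * rng.normal(size=torque.size)

        def sigmoid(x, a, c):
            return 1.0 / (1.0 + np.exp(-a * (x - c)))

        # Six rule premises: each membership is a product of two sigmoids (a soft band over torque).
        edges = np.linspace(1.0, 3.0, 7)
        premises = [(10.0, edges[i], -10.0, edges[i + 1]) for i in range(6)]

        def firing_strengths(x):
            w = np.array([sigmoid(x, a1, c1) * sigmoid(x, a2, c2) for a1, c1, a2, c2 in premises])
            return w / w.sum(axis=0)

        # First-order Sugeno consequents y_i = p_i * x + q_i, fitted by linear least squares
        # (a one-pass stand-in for ANFIS's hybrid gradient / least-squares training).
        W = firing_strengths(torque)                 # shape (6, n)
        design = np.vstack([W * torque, W]).T        # columns: w_i * x, then w_i
        coef, *_ = np.linalg.lstsq(design, wear, rcond=None)

        def predict(x):
            w = firing_strengths(np.atleast_1d(np.asarray(x, dtype=float)))
            return (w * x).T @ coef[:6] + w.T @ coef[6:]

        print("predicted wear at 2.2 N*m peak torque:", float(predict(2.2)))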

  16. Generalized kinetic model of reduction of molecular oxidant by metal containing redox

    International Nuclear Information System (INIS)

    Kravchenko, T.A.

    1986-01-01

    The present work is devoted to the kinetics of reduction of a molecular oxidant by a metal-containing redox agent. The constructed generalized kinetic model of the redox process in the system solid redox agent - reagent solution allows a general theoretical approach to the research and yields new results on the kinetics and mechanism of the interaction of the redox agent with oxidants.

  17. NJOY91, General ENDF/B Processing System for Reactor Design Problems

    International Nuclear Information System (INIS)

    MacFarlane, R.E.; Barrett, R.J.; Muir, D.W.; Boicourt, R.M.

    1997-01-01

    1 - Description of problem or function: The NJOY nuclear data processing system is a comprehensive computer code package for producing pointwise and multigroup neutron, photon, and charged particle cross sections from ENDF/B evaluated nuclear data. NJOY-89 is a substantial upgrade of the previous release. It includes photon production and photon interaction capabilities, heating calculations, covariance processing, and thermal scattering capabilities. It is capable of processing data in ENDF/B-4, ENDF/B-5, and ENDF/B-6 formats for evaluated data (to the extent that the latter have been frozen at the time of this release). NJOY-91.118: This is the last in the NJOY-91 series. It uses the same module structure as the earlier versions and its graphics options depend on DISSPLA. NJOY91.118 includes bug fixes, improvements in several modules, and some new capabilities. Information on the changes is included in the README file. A new test problem was added to test some ENDF/B-6 features, including Reich-Moore resonance reconstruction, energy-angle matrices in GROUPR, and energy-angle distributions in ACER. The 91.118 release is basically configured for UNIX. Short descriptions of the different modules follow:
    RECONR - Reconstructs pointwise (energy-dependent) cross sections from ENDF/B resonance parameters and interpolation schemes.
    BROADR - Doppler broadens and thins pointwise cross sections.
    UNRESR - Computes effective self-shielded pointwise cross sections in the unresolved-resonance region.
    HEATR - Generates pointwise heat production cross sections (KERMA factors) and radiation-damage-energy production cross sections.
    THERMR - Produces incoherent inelastic energy-to-energy matrices for free or bound scatterers, coherent elastic cross sections for hexagonal materials, and incoherent elastic cross sections.
    GROUPR - Generates self-shielded multigroup cross sections, group-to-group neutron scattering matrices, and photon production matrices from pointwise input.
    GAMINR - Calculates

  18. Simulations of the general circulation of the Martian atmosphere. I - Polar processes

    Science.gov (United States)

    Pollack, James B.; Haberle, Robert M.; Schaeffer, James; Lee, Hilda

    1990-01-01

    Numerical simulations of the Martian atmosphere general circulation are carried out for 50 simulated days, using a three-dimensional model, based on the primitive equations of meteorology, which incorporated the radiative effects of atmospheric dust on solar and thermal radiation. A large number of numerical experiments were conducted for alternative choices of seasonal date and dust optical depth. It was found that, as the dust content of the winter polar region increased, the rate of atmospheric CO2 condensation increased sharply. It is shown that the strong seasonal variation in the atmospheric dust content observed might cause a number of hemispheric asymmetries. These asymmetries include the greater prevalence of polar hoods in the northern polar region during winter, the lower albedo of the northern polar cap during spring, and the total dissipation of the northern CO2 ice cap during the warmer seasons.

  19. A Generalized Analytic Operator-Valued Function Space Integral and a Related Integral Equation

    International Nuclear Information System (INIS)

    Chang, K.S.; Kim, B.S.; Park, C.H.; Ryu, K.S.

    2003-01-01

    We introduce a generalized Wiener measure associated with a Gaussian Markov process and define a generalized analytic operator-valued function space integral as a bounded linear operator from L_p into L_{p'} (1 < p ≤ 2) by the analytic continuation of the generalized Wiener integral. We prove the existence of the integral for certain functionals which involve some Borel measures. Also we show that the generalized analytic operator-valued function space integral satisfies an integral equation related to the generalized Schroedinger equation. The resulting theorems extend the theory of operator-valued function space integrals substantially, and previous theorems about these integrals are generalized by our results

  20. Catalytic Oxidation of Lignins into the Aromatic Aldehydes: General Process Trends and Development Prospects

    Science.gov (United States)

    Tarabanko, Valery E.; Tarabanko, Nikolay

    2017-01-01

    This review discusses principal patterns that govern the processes of lignins’ catalytic oxidation into vanillin (3-methoxy-4-hydroxybenzaldehyde) and syringaldehyde (3,5-dimethoxy-4-hydroxybenzaldehyde). It examines the influence of lignin and oxidant nature, temperature, mass transfer, and of other factors on the yield of the aldehydes and the process selectivity. The review reveals that properly organized processes of catalytic oxidation of various lignins are only insignificantly (10–15%) inferior to oxidation by nitrobenzene in terms of yield and selectivity in vanillin and syringaldehyde. Very high consumption of oxygen (and consequentially, of alkali) in the process—over 10 mol per mol of obtained vanillin—is highlighted as an unresolved and unexplored problem: scientific literature reveals almost no studies devoted to the possibilities of decreasing the consumption of oxygen and alkali. Different hypotheses about the mechanism of lignin oxidation into the aromatic aldehydes are discussed, and the mechanism comprising the steps of single-electron oxidation of phenolate anions, and ending with retroaldol reaction of a substituted coniferyl aldehyde was pointed out as the most convincing one. The possibility and development prospects of single-stage oxidative processing of wood into the aromatic aldehydes and cellulose are analyzed. PMID:29140301

  1. Generalized outcome-based strategy classification: comparing deterministic and probabilistic choice models.

    Science.gov (United States)

    Hilbig, Benjamin E; Moshagen, Morten

    2014-12-01

    Model comparisons are a vital tool for disentangling which of several strategies a decision maker may have used--that is, which cognitive processes may have governed observable choice behavior. However, previous methodological approaches have been limited to models (i.e., decision strategies) with deterministic choice rules. As such, psychologically plausible choice models--such as evidence-accumulation and connectionist models--that entail probabilistic choice predictions could not be considered appropriately. To overcome this limitation, we propose a generalization of Bröder and Schiffer's (Journal of Behavioral Decision Making, 19, 361-380, 2003) choice-based classification method, relying on (1) parametric order constraints in the multinomial processing tree framework to implement probabilistic models and (2) minimum description length for model comparison. The advantages of the generalized approach are demonstrated through recovery simulations and an experiment. In explaining previous methods and our generalization, we maintain a nontechnical focus--so as to provide a practical guide for comparing both deterministic and probabilistic choice models.

  2. Lasers in chemical processing

    International Nuclear Information System (INIS)

    Davis, J.I.

    1982-01-01

    The high cost of laser energy is the crucial issue in any potential laser-processing application. It is expensive relative to other forms of energy and to most bulk chemicals. We show those factors that have previously frustrated attempts to find commercially viable laser-induced processes for the production of materials. Having identified the general criteria to be satisfied by an economically successful laser process and shown how these imply the laser-system requirements, we present a status report on the uranium laser isotope separation (LIS) program at the Lawrence Livermore National Laboratory

  3. Narrative infrastructure in product creation processes

    NARCIS (Netherlands)

    Deuten, Jasper; Rip, Arie

    2000-01-01

    In product creation processes, perhaps even more than in organization processes in general, uncertainties are addressed and complexity is reduced. In retrospect, linearized success stories are told. The history of a product innovation in a biotechnology firm is used to show how actually, over time,

  4. Integrated logistic support studies using behavioral Monte Carlo simulation, supported by Generalized Stochastic Petri Nets

    International Nuclear Information System (INIS)

    Garnier, Robert; Chevalier, Marcel

    2000-01-01

    Studying large and complex industrial sites requires more and more accuracy in modeling. In particular, when considering Spares, Maintenance and Repair / Replacement processes, determining optimal Integrated Logistic Support policies requires a high-level modeling formalism, in order to make the model as close as possible to the real processes considered. Generally, numerical methods are used to process this kind of study. In this paper, we propose an alternative way to determine optimal Integrated Logistic Support policies when dealing with large, complex and distributed multi-policies industrial sites. This method is based on the use of behavioral Monte Carlo simulation, supported by Generalized Stochastic Petri Nets. (author)

  5. Efficient probabilistic model checking on general purpose graphic processors

    NARCIS (Netherlands)

    Bosnacki, D.; Edelkamp, S.; Sulewski, D.; Pasareanu, C.S.

    2009-01-01

    We present algorithms for parallel probabilistic model checking on general purpose graphic processing units (GPGPUs). For this purpose we exploit the fact that some of the basic algorithms for probabilistic model checking rely on matrix vector multiplication. Since this kind of linear algebraic

  6. Looking Backward: James Madison University's General Education Reform.

    Science.gov (United States)

    Reynolds, Charles W.; Allain, Violet Anselmini; Erwin, T. Dary; Halpern, Linda Cabe; McNallie, Robin; Ross, Martha K.

    1998-01-01

    Describes the new general education program at James Madison University (Virginia) and the process by which it was developed. Indicates that the program is organized by five broad areas of knowledge that are defined by interdisciplinary clusters of learning objectives, which in turn were developed using input from every academic department on…

  7. Identifying multiple influential spreaders based on generalized closeness centrality

    Science.gov (United States)

    Liu, Huan-Li; Ma, Chuang; Xiang, Bing-Bing; Tang, Ming; Zhang, Hai-Feng

    2018-02-01

    To maximize the spreading influence of multiple spreaders in complex networks, one important fact cannot be ignored: the multiple spreaders should be dispersively distributed in networks, which can effectively reduce the redundance of information spreading. For this purpose, we define a generalized closeness centrality (GCC) index by generalizing the closeness centrality index to a set of nodes. The problem converts to how to identify multiple spreaders such that an objective function has the minimal value. By comparing with the K-means clustering algorithm, we find that the optimization problem is very similar to the problem of minimizing the objective function in the K-means method. Therefore, how to find multiple nodes with the highest GCC value can be approximately solved by the K-means method. Two typical transmission dynamics-epidemic spreading process and rumor spreading process are implemented in real networks to verify the good performance of our proposed method.
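
    A rough illustration of the idea, assuming networkx is available: here the generalized closeness of a spreader set is taken to be the mean distance from every node to its nearest spreader (one plausible reading of the GCC objective), and the set is built greedily rather than by the K-means-style procedure the abstract refers to.

        import networkx as nx

        def set_closeness_objective(G, spreaders, dist):
            # Mean shortest-path distance from each node to its nearest spreader:
            # small values mean the spreaders are both central and well dispersed.
            return sum(min(dist[s][v] for s in spreaders) for v in G) / G.number_of_nodes()

        def greedy_spreaders(G, k):
            dist = dict(nx.all_pairs_shortest_path_length(G))
            chosen, candidates = [], set(G.nodes())
            for _ in range(k):
                best = min(candidates, key=lambda u: set_closeness_objective(G, chosen + [u], dist))
                chosen.append(best)
                candidates.remove(best)
            return chosen

        G = nx.karate_club_graph()
        print("3 dispersed spreaders:", greedy_spreaders(G, 3))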

  8. Summary of 4th general plan of radioactive wastes

    International Nuclear Information System (INIS)

    Veganzones, A.

    1995-01-01

    On 9 December 1994, the Council of Ministers approved the Fourth General Plan of Radioactive Waste (PGRR). The Fourth Plan updates former texts, taking into account new circumstances, both technical and economic, affecting radioactive waste. Some of the steps that make up the global waste management process have been revised in the light of the Spanish experience, but also considering the evolution and trends in other countries. In this work some of the most important aspects included in the Fourth General Plan are reviewed. (Author)

  9. Allocation base of general production costs as optimization of prime costs

    Directory of Open Access Journals (Sweden)

    Levytska I.O.

    2017-03-01

    Full Text Available Qualified management aimed at optimizing financial results is a key factor in today's economy. Effective management decisions depend on having the necessary information about the costs of the production process in all its aspects – their structure, types, and the accounting policies used to reflect them. General production costs – the so-called indirect costs that are not directly related to the production process but support its functioning through the supporting structural divisions and create the necessary conditions for production – play a significant role in calculating the prime cost of goods (works, services). An accurate estimate of the prime cost of goods (works, services) therefore requires that the value of indirect costs (in other words, general production costs) be determined together with a properly chosen base for their allocation. The choice of the allocation base for general production costs is a significant decision, depending on the nature of the business, and it must guarantee a fair distribution with respect to the largest share of direct expenses in the total structure of production costs. The study clarifies the essence of general production costs based on an analysis of key definitions by leading Ukrainian economists. The optimal allocation approach for general production costs is to calculate these costs as direct production costs within each subsidiary division (department) separately, without selecting a single base applied to their total amount.
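
    The numerical logic of such an allocation is simple; the following purely illustrative sketch (all figures invented) allocates one division's general production costs to products in proportion to a chosen base, here direct labour cost.

        # Allocate a division's general production (overhead) costs to products
        # in proportion to an allocation base -- here, direct labour cost (figures invented).
        division_overhead = 120_000.0

        direct_labour = {
            "product_A": 300_000.0,
            "product_B": 180_000.0,
            "product_C": 120_000.0,
        }

        rate = division_overhead / sum(direct_labour.values())   # overhead per unit of base

        for product, base in direct_labour.items():
            allocated = base * rate
            print(f"{product}: base {base:>9,.0f}  allocated overhead {allocated:>9,.0f}")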

  10. Direct and inverse problems for the generalized relativistic Toda lattice and the connection with general orthogonal polynomials

    International Nuclear Information System (INIS)

    Gago-Alonso, A; Santiago-Moreno, L; Piñeiro-Díaz, L R

    2008-01-01

    We study finite nonlinear dynamical systems that are somehow more general and complex than the relativistic Toda lattice. Our dynamical systems have a matrix representation very similar to the ones that were previously studied. It is defined in terms of a one-parameter family (D(x), M(x)) of matrices, where D(x) is a Hessenberg matrix and M(x) is a lower triangular matrix. The Jordan matrix associated with M^{-1}(x)D(x) is a constant of motion and the auxiliary spectral data have explicit time evolution. Using the connection between Hessenberg matrices and general orthogonal polynomials we associated to our system a one-parameter family of scalar products that we use to prove the integrability of the system. In particular the inverse transform is given by an orthogonalization process on a given scalar product

  11. General Base-General Acid Catalysis in Human Histone Deacetylase 8.

    Science.gov (United States)

    Gantt, Sister M Lucy; Decroos, Christophe; Lee, Matthew S; Gullett, Laura E; Bowman, Christine M; Christianson, David W; Fierke, Carol A

    2016-02-09

    Histone deacetylases (HDACs) regulate cellular processes such as differentiation and apoptosis and are targeted by anticancer therapeutics in development and in the clinic. HDAC8 is a metal-dependent class I HDAC and is proposed to use a general acid-base catalytic pair in the mechanism of amide bond hydrolysis. Here, we report site-directed mutagenesis and enzymological measurements to elucidate the catalytic mechanism of HDAC8. Specifically, we focus on the catalytic function of Y306 and the histidine-aspartate dyads H142-D176 and H143-D183. Additionally, we report X-ray crystal structures of four representative HDAC8 mutants: D176N, D176N/Y306F, D176A/Y306F, and H142A/Y306F. These structures provide a useful framework for understanding enzymological measurements. The pH dependence of kcat/KM for wild-type Co(II)-HDAC8 is bell-shaped with two pKa values of 7.4 and 10.0. The upper pKa reflects the ionization of the metal-bound water molecule and shifts to 9.1 in Zn(II)-HDAC8. The H142A mutant has activity 230-fold lower than that of wild-type HDAC8, but the pKa1 value is not altered. Y306F HDAC8 is 150-fold less active than the wild-type enzyme; crystal structures show that Y306 hydrogen bonds with the zinc-bound substrate carbonyl, poised for transition state stabilization. The H143A and H142A/H143A mutants exhibit activity that is >80000-fold lower than that of wild-type HDAC8; the buried D176N and D176A mutants have significant catalytic effects, with more subtle effects caused by D183N and D183A. These enzymological and structural studies strongly suggest that H143 functions as a single general base-general acid catalyst, while H142 remains positively charged and serves as an electrostatic catalyst for transition state stabilization.
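
    The bell-shaped pH profile with two apparent pKa values is conventionally described by a double-ionization model; the exact fitting equation used in the study is not reproduced in the abstract, so the standard form is shown here with the Co(II)-HDAC8 values quoted above.

        \[
        \frac{k_{\mathrm{cat}}}{K_{\mathrm{M}}}(\mathrm{pH}) =
        \frac{(k_{\mathrm{cat}}/K_{\mathrm{M}})_{\max}}
             {1 + 10^{\,\mathrm{p}K_{a1}-\mathrm{pH}} + 10^{\,\mathrm{pH}-\mathrm{p}K_{a2}}},
        \qquad \mathrm{p}K_{a1} = 7.4,\quad \mathrm{p}K_{a2} = 10.0 .
        \]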

  12. General Guidelines for Remote Operation and Maintenance of Pyroprocess Equipment

    Energy Technology Data Exchange (ETDEWEB)

    Kim, S. H.; Park, B. S.; Park, H. S.; Lee, H. J.; Choi, C. W.; Lee, J. K

    2007-12-15

    As the pyroprocess handles highly radioactive materials, a facility for handling such materials requires a high level of safety, radiation shielding, strict quality control, and technologically advanced remote handling equipment. This report describes guidelines for the pyroprocess based on the design guides for radioactive material handling facilities and equipment from the American Nuclear Society (ANS), the design guidelines for remotely maintained equipment from Oak Ridge National Laboratory (ORNL), and the experience gained from the design of the ACP equipment installed at the ACPF (Advanced Conditioning Process Facility). The general guidelines in this report cover: the pyroprocess, remote handling equipment for the pyroprocess, general guidance for remote operation and maintenance, general guidelines for the design of remotely operated and maintained equipment, and estimation and analysis for remote maintenance.

  13. Handing over patients from the ICU to the general ward

    DEFF Research Database (Denmark)

    Bunkenborg, Gitte; Bitsch Hansen, Tina; Hølge-Hazelton, Bibi

    2017-01-01

    AIM: To explore nursing practice and perception of engaging in communicative interaction when handing over multi-morbid patients from the ICU to general medical or surgical wards. BACKGROUND: Communication failures impose risks to patient safety. ICU and general ward nurses communicate in writing...... focused ethnography was applied to the study. METHODS: Participant observation of 22 clinical situations of handing over patients from the ICU to general wards was conducted in November and December 2015, followed by five focus group interviews, three interviews with general ward nurses and two with ICU...... towards patient status and the handing over process" emerged from observation notes. From transcribed focus group interviews, the theme "Balancing and negotiating when passing on, consuming and adapting knowledge" was identified. CONCLUSION: A lack of shared goals regarding handing over patients from...

  14. Intertwining of birth-and-death processes

    Czech Academy of Sciences Publication Activity Database

    Swart, Jan M.

    2011-01-01

    Roč. 47, č. 1 (2011), s. 1-14 ISSN 0023-5954 R&D Projects: GA ČR GA201/09/1931 Institutional research plan: CEZ:AV0Z10750506 Keywords : Intertwining of Markov processes * birth and death process * averaged Markov process * first passage time * coupling * eigenvalues Subject RIV: BA - General Mathematics Impact factor: 0.454, year: 2011 http://library.utia.cas.cz/separaty/2011/SI/swart-intertwining of birth-and- death processes.pdf

  15. Statistical Data Processing with R – Metadata Driven Approach

    Directory of Open Access Journals (Sweden)

    Rudi SELJAK

    2016-06-01

    Full Text Available In recent years the Statistical Office of the Republic of Slovenia has put a lot of effort into re-designing its statistical process. We replaced the classical stove-pipe-oriented production system with general software solutions, based on the metadata-driven approach. This means that one general program code, which is parametrized with process metadata, is used for data processing for a particular survey. Currently, the general program code is entirely based on SAS macros, but in the future we would like to explore how successfully the statistical software R can be used for this approach. The paper describes the metadata-driven principle for data validation, the generic software solution, and the main issues connected with the use of the statistical software R for this approach.
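
    The metadata-driven principle — one generic routine parameterized by survey-specific metadata — can be sketched in a few lines; the example below is an illustration in Python rather than SAS or R, and the rule names and records are invented.

        # One generic validation routine, driven entirely by metadata (illustrative only).
        records = [
            {"id": 1, "turnover": 1200, "employees": 10},
            {"id": 2, "turnover": -5, "employees": 3},
            {"id": 3, "turnover": 800, "employees": 0},
        ]

        validation_metadata = [                 # survey-specific rules kept as data, not code
            {"variable": "turnover", "rule": "non_negative"},
            {"variable": "employees", "rule": "positive"},
        ]

        RULES = {
            "non_negative": lambda v: v >= 0,
            "positive": lambda v: v > 0,
        }

        def validate(records, metadata):
            return [(r["id"], m["variable"], m["rule"])
                    for r in records for m in metadata
                    if not RULES[m["rule"]](r[m["variable"]])]

        print(validate(records, validation_metadata))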

  16. Generalized Probability Functions

    Directory of Open Access Journals (Sweden)

    Alexandre Souto Martinez

    2009-01-01

    Full Text Available From the integration of nonsymmetrical hyperboles, a one-parameter generalization of the logarithmic function is obtained. Inverting this function, one obtains the generalized exponential function. Motivated by mathematical curiosity, we show that these generalized functions are suitable to generalize some probability density functions (pdfs). A very reliable rank distribution can be conveniently described by the generalized exponential function. Finally, we turn our attention to the generalization of one- and two-tail stretched exponential functions. We obtain, as particular cases, the generalized error function, the Zipf-Mandelbrot pdf, the generalized Gaussian and Laplace pdf. Their cumulative functions and moments were also obtained analytically.
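
    Assuming the one-parameter generalization referred to is the familiar q-deformed logarithm and its inverse (the usual convention for this construction; an assumption, since the abstract does not spell out the formula), a minimal numerical sketch is:

        import numpy as np

        def gen_log(x, q):
            # (x**(1-q) - 1) / (1 - q); reduces to the ordinary logarithm as q -> 1.
            x = np.asarray(x, dtype=float)
            return np.log(x) if np.isclose(q, 1.0) else (x ** (1.0 - q) - 1.0) / (1.0 - q)

        def gen_exp(y, q):
            # Inverse of gen_log: (1 + (1-q) * y)**(1/(1-q)); ordinary exponential as q -> 1.
            y = np.asarray(y, dtype=float)
            if np.isclose(q, 1.0):
                return np.exp(y)
            return np.maximum(1.0 + (1.0 - q) * y, 0.0) ** (1.0 / (1.0 - q))

        x = np.array([0.5, 1.0, 2.0, 4.0])
        for q in (0.5, 1.0, 1.5):
            assert np.allclose(gen_exp(gen_log(x, q), q), x)    # round trip
        print("generalized log at q = 0.5:", gen_log(x, 0.5))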

  17. 40 CFR 130.5 - Continuing planning process.

    Science.gov (United States)

    2010-07-01

    ... QUALITY PLANNING AND MANAGEMENT § 130.5 Continuing planning process. (a) General. Each State shall establish and maintain a continuing planning process (CPP) as described under section 303(e)(3)(A)-(H) of... 40 Protection of Environment 21 2010-07-01 2010-07-01 false Continuing planning process. 130.5...

  18. 44 CFR 9.6 - Decision-making process.

    Science.gov (United States)

    2010-10-01

    ... 44 Emergency Management and Assistance 1 2010-10-01 2010-10-01 false Decision-making process. 9.6... HOMELAND SECURITY GENERAL FLOODPLAIN MANAGEMENT AND PROTECTION OF WETLANDS § 9.6 Decision-making process... protection decision-making process to be followed by the Agency in applying the Orders to its actions. While...

  19. Some remarks regarding the procedure of the appointment of the secretary general of the United Nations

    OpenAIRE

    Novaković Marko

    2016-01-01

    Appointing the Secretary-General is a process that has always been shrouded in secrecy. In 2016, owing to reforms in the appointment process instigated by the President of the General Assembly, Mogens Lykketoft, more inclusion and transparency were achieved, with non-state actors being much more involved in the process. In the procedure itself, the first five straw polls suggested that Antonio Guterres would be the new Secretary-General, and this proved to be true. Will this more transparent sy...

  20. Catalytic Oxidation of Lignins into the Aromatic Aldehydes: General Process Trends and Development Prospects

    Directory of Open Access Journals (Sweden)

    Valery E. Tarabanko

    2017-11-01

    Full Text Available This review discusses principal patterns that govern the processes of lignins’ catalytic oxidation into vanillin (3-methoxy-4-hydroxybenzaldehyde) and syringaldehyde (3,5-dimethoxy-4-hydroxybenzaldehyde). It examines the influence of lignin and oxidant nature, temperature, mass transfer, and of other factors on the yield of the aldehydes and the process selectivity. The review reveals that properly organized processes of catalytic oxidation of various lignins are only insignificantly (10–15%) inferior to oxidation by nitrobenzene in terms of yield and selectivity in vanillin and syringaldehyde. Very high consumption of oxygen (and consequentially, of alkali) in the process—over 10 mol per mol of obtained vanillin—is highlighted as an unresolved and unexplored problem: scientific literature reveals almost no studies devoted to the possibilities of decreasing the consumption of oxygen and alkali. Different hypotheses about the mechanism of lignin oxidation into the aromatic aldehydes are discussed, and the mechanism comprising the steps of single-electron oxidation of phenolate anions, and ending with retroaldol reaction of a substituted coniferyl aldehyde was pointed out as the most convincing one. The possibility and development prospects of single-stage oxidative processing of wood into the aromatic aldehydes and cellulose are analyzed.

  1. Approximate simulation of Hawkes processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Rasmussen, Jakob Gulddahl

    2006-01-01

    Hawkes processes are important in point process theory and its applications, and simulation of such processes is often needed for various statistical purposes. This article concerns a simulation algorithm for unmarked and marked Hawkes processes, exploiting that the process can be constructed...... as a Poisson cluster process. The algorithm suffers from edge effects but is much faster than the perfect simulation algorithm introduced in our previous work Møller and Rasmussen (2004). We derive various useful measures for the error committed when using the algorithm, and we discuss various empirical...... results for the algorithm compared with perfect simulations. Extensions of the algorithm and the results to more general types of marked point processes are also discussed....
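
    A minimal sketch of the cluster construction the abstract exploits (not the authors' algorithm, and with their error analysis omitted): an unmarked Hawkes process on [0, T] with constant background rate mu and exponential offspring kernel, generated generation by generation and simply truncated at T.

        import numpy as np

        rng = np.random.default_rng(1)

        def simulate_hawkes_cluster(mu, alpha, beta, T):
            # Immigrants form a Poisson(mu) process on [0, T]; every event independently
            # produces Poisson(alpha) offspring at Exponential(beta) delays (branching ratio
            # alpha < 1); offspring falling beyond T are discarded, so edge effects are ignored.
            immigrants = rng.uniform(0.0, T, rng.poisson(mu * T))
            events, generation = [], list(immigrants)
            while generation:
                events.extend(generation)
                children = []
                for t in generation:
                    delays = rng.exponential(1.0 / beta, rng.poisson(alpha))
                    children.extend(t + d for d in delays if t + d <= T)
                generation = children
            return np.sort(np.array(events))

        times = simulate_hawkes_cluster(mu=0.5, alpha=0.8, beta=2.0, T=100.0)
        print(len(times), "events; expected roughly", 0.5 * 100.0 / (1 - 0.8))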

  2. Simulation of the Nitriding Process

    Science.gov (United States)

    Krukovich, M. G.

    2004-01-01

    Simulation of the nitriding process makes it possible to solve many practical problems of process control, prediction of results, and development of new treatment modes and treated materials. The presented classification systematizes nitriding processes and processes based on nitriding, enables consideration of the theory and practice of an individual process in interrelation with other phenomena, outlines ways for intensification of various process variants, and gives grounds for development of recommendations for controlling the structure and properties of the obtained layers. The general rules for conducting the process and formation of phases in the layer and properties of the treated surfaces are used to create a prediction computational model based on analytical, numerical, and empirical approaches.

  3. From point process observations to collective neural dynamics: Nonlinear Hawkes process GLMs, low-dimensional dynamics and coarse graining.

    Science.gov (United States)

    Truccolo, Wilson

    2016-11-01

    This review presents a perspective on capturing collective dynamics in recorded neuronal ensembles based on multivariate point process models, inference of low-dimensional dynamics and coarse graining of spatiotemporal measurements. A general probabilistic framework for continuous-time point processes is reviewed, with an emphasis on multivariate nonlinear Hawkes processes with exogenous inputs. A point process generalized linear model (PP-GLM) framework for the estimation of discrete time multivariate nonlinear Hawkes processes is described. The approach is illustrated with the modeling of collective dynamics in neocortical neuronal ensembles recorded in human and non-human primates, and prediction of single-neuron spiking. A complementary approach to capture collective dynamics based on low-dimensional dynamics ("order parameters") inferred via latent state-space models with point process observations is presented. The approach is illustrated by inferring and decoding low-dimensional dynamics in primate motor cortex during naturalistic reach and grasp movements. Finally, we briefly review hypothesis tests based on conditional inference and spatiotemporal coarse graining for assessing collective dynamics in recorded neuronal ensembles. Published by Elsevier Ltd.
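
    As a toy version of the discrete-time PP-GLM idea (a single neuron with a log-linear dependence on its own recent spike history, fitted by Poisson regression; statsmodels is assumed to be available, and all parameters are invented):

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)

        # Simulate a self-exciting spike train in 1 ms bins (parameters invented).
        n_bins, n_lags = 20_000, 5
        base, history = -4.0, np.array([1.2, 0.6, 0.3, 0.1, 0.05])
        spikes = np.zeros(n_bins, dtype=int)
        for t in range(n_lags, n_bins):
            eta = base + history @ spikes[t - n_lags:t][::-1]      # most recent bin first
            spikes[t] = rng.poisson(min(np.exp(eta), 1.0))         # cap the per-bin rate

        # Design matrix: spike counts in the previous 1..n_lags bins; Poisson regression
        # with a log link then recovers the baseline and the history filter.
        X = np.column_stack([spikes[n_lags - k:n_bins - k] for k in range(1, n_lags + 1)])
        y = spikes[n_lags:]
        fit = sm.GLM(y, sm.add_constant(X), family=sm.families.Poisson()).fit()
        print("estimated baseline and history filter:", np.round(fit.params, 2))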

  4. Initial design process of the repository

    International Nuclear Information System (INIS)

    Osmanlioglu, A.E.

    2001-01-01

    The concept of the final disposal of high level wastes is to isolate the waste from the biosphere for extremely long periods of time by emplacement of wastes into deep stable geological formations. Several geological formations have been considered as candidate host environments for high level waste disposal and several techniques have been developed for repository design. In this study, interrelationships of main parameters of a general repository design have been defined and effective parameters are shown at each step. Initial design process is based on the long term stability of underground openings as disposal galleries. For this reason, this design process includes two main analyses: mechanical analysis and thermal analysis. Each of the analysis systems is directly related to each other by technical precautions. As a result of this design process, general information about the acceptable depth of the repository, layout and emplacement pattern can be taken. Final design study can be established on the result of initial design process. (author)

  5. Stochastic foundations of undulatory transport phenomena: generalized Poisson–Kac processes—part I basic theory

    International Nuclear Information System (INIS)

    Giona, Massimiliano; Brasiello, Antonio; Crescitelli, Silvestro

    2017-01-01

    This article introduces the notion of generalized Poisson–Kac (GPK) processes which generalize the class of ‘telegrapher’s noise dynamics’ introduced by Kac (1974 Rocky Mount. J. Math. 4 497) in 1974, using Poissonian stochastic perturbations. In GPK processes the stochastic perturbation acts as a switching amongst a set of stochastic velocity vectors controlled by a Markov-chain dynamics. GPK processes possess trajectory regularity (almost everywhere) and asymptotic Kac limit, namely the convergence towards Brownian motion (and to stochastic dynamics driven by Wiener perturbations), which characterizes also the long-term/long-distance properties of these processes. In this article we introduce the structural properties of GPK processes, leaving all the physical implications to part II and part III (Giona et al 2016a J. Phys. A: Math. Theor., 2016b J. Phys. A: Math. Theor.). (paper)
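
    The simplest member of the family is the classical one-dimensional telegrapher's (Kac) process: motion at speed c whose direction flips at the events of a Poisson process of rate lambda; GPK processes replace the two velocities +/-c with a finite set of velocity vectors switched by a Markov chain. A rough simulation sketch of the two-state case and its Kac limit (endpoint variance growing like 2*D*t with D = c^2/(2*lambda)):

        import numpy as np

        rng = np.random.default_rng(42)

        def telegraph_endpoint(c, lam, T, dt):
            # Two-state Poisson-Kac (telegrapher's) motion: speed +/- c, direction flipping
            # at Poisson(lam) events; simulated on a fixed time grid of step dt.
            x, sign = 0.0, 1.0
            for _ in range(int(T / dt)):
                if rng.random() < lam * dt:        # probability of a switch in (t, t + dt]
                    sign = -sign
                x += sign * c * dt
            return x

        c, lam, T, dt = 10.0, 50.0, 5.0, 1e-3
        endpoints = np.array([telegraph_endpoint(c, lam, T, dt) for _ in range(500)])
        D = c**2 / (2.0 * lam)                     # diffusivity in the Kac (Brownian) limit
        print("empirical variance:", round(float(endpoints.var()), 2), " 2*D*T:", 2 * D * T)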

  6. Generalized Cartan Calculus in general dimension

    Science.gov (United States)

    Wang, Yi-Nan

    2015-07-01

    We develop the generalized Cartan Calculus for the groups and SO(5 , 5). They are the underlying algebraic structures of d = 9 , 7 , 6 exceptional field theory, respectively. These algebraic identities are needed for the "tensor hierarchy" structure in exceptional field theory. The validity of Poincaré lemmas in this new differential geometry is also discussed. Finally we explore some possible extension of the generalized Cartan calculus beyond the exceptional series.

  7. QUALITY IMPROVEMENT MODEL AT THE MANUFACTURING PROCESS PREPARATION LEVEL

    Directory of Open Access Journals (Sweden)

    Dusko Pavletic

    2009-12-01

    Full Text Available The paper presents the basis for an operational quality improvement model at the manufacturing process preparation level. Numerous appropriate quality assurance and improvement methods and tools are identified. The main manufacturing process principles are investigated in order to scrutinize one general model of the manufacturing process and to define the manufacturing process preparation level. Development and introduction of the operational quality improvement model are based on research conducted into the application possibilities of these methods and tools in real manufacturing processes in the shipbuilding and automotive industries. The basic model structure is described and presented by an appropriate general algorithm. The operational quality improvement model developed lays down the main guidelines for the practical and systematic application of quality improvement methods and tools.

  8. Sensors, Volume 1, Fundamentals and General Aspects

    Science.gov (United States)

    Grandke, Thomas; Ko, Wen H.

    1996-12-01

    'Sensors' is the first self-contained series to deal with the whole area of sensors. It describes general aspects, technical and physical fundamentals, construction, function, applications and developments of the various types of sensors. This volume deals with the fundamentals and common principles of sensors and covers the wide areas of principles, technologies, signal processing, and applications. Contents include: Sensor Fundamentals, e.g. Sensor Parameters, Modeling, Design and Packaging; Basic Sensor Technologies, e.g. Thin and Thick Films, Integrated Magnetic Sensors, Optical Fibres and Integrated Optics, Ceramics and Oxides; Sensor Interfaces, e.g. Signal Processing, Multisensor Signal Processing, Smart Sensors, Interface Systems; Sensor Applications, e.g. Automotive: On-board Sensors, Traffic Surveillance and Control, Home Appliances, Environmental Monitoring, etc. This volume is an indispensable reference work and textbook for both specialists and newcomers, researchers and developers.

  9. Generalized waste package containment model

    International Nuclear Information System (INIS)

    Liebetrau, A.M.; Apted, M.J.

    1985-02-01

    The US Department of Energy (DOE) is developing a performance assessment strategy to demonstrate compliance with standards and technical requirements of the Environmental Protection Agency (EPA) and the Nuclear Regulatory Commission (NRC) for the permanent disposal of high-level nuclear wastes in geologic repositories. One aspect of this strategy is the development of a unified performance model of the entire geologic repository system. Details of a generalized waste package containment (WPC) model and its relationship with other components of an overall repository model are presented in this paper. The WPC model provides stochastically determined estimates of the distributions of times-to-failure of the barriers of a waste package by various corrosion mechanisms and degradation processes. The model consists of a series of modules which employ various combinations of stochastic (probabilistic) and mechanistic process models, and which are individually designed to reflect the current state of knowledge. The WPC model is designed not only to take account of various site-specific conditions and processes, but also to deal with a wide range of site, repository, and waste package configurations. 11 refs., 3 figs., 2 tabs

  10. Manganese Exposure in the General Population in a Mining District ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Manganese Exposure in the General Population in a Mining District (Mexico) ... in a population living close to a mine and mineral processing plant in Mexico ...

  11. Behavioral pattern separation and its link to the neural mechanisms of fear generalization.

    Science.gov (United States)

    Lange, Iris; Goossens, Liesbet; Michielse, Stijn; Bakker, Jindra; Lissek, Shmuel; Papalini, Silvia; Verhagen, Simone; Leibold, Nicole; Marcelis, Machteld; Wichers, Marieke; Lieverse, Ritsaert; van Os, Jim; van Amelsvoort, Therese; Schruers, Koen

    2017-11-01

    Fear generalization is a prominent feature of anxiety disorders and post-traumatic stress disorder (PTSD). It is defined as enhanced fear responding to a stimulus that bears similarities, but is not identical to a threatening stimulus. Pattern separation, a hippocampal-dependent process, is critical for stimulus discrimination; it transforms similar experiences or events into non-overlapping representations. This study is the first in humans to investigate the extent to which fear generalization relies on behavioral pattern separation abilities. Participants (N = 46) completed a behavioral task taxing pattern separation, and a neuroimaging fear conditioning and generalization paradigm. Results show an association between lower behavioral pattern separation performance and increased generalization in shock expectancy scores, but not in fear ratings. Furthermore, lower behavioral pattern separation was associated with diminished recruitment of the subcallosal cortex during presentation of generalization stimuli. This region showed functional connectivity with the orbitofrontal cortex and ventromedial prefrontal cortex. Together, the data provide novel experimental evidence that pattern separation is related to generalization of threat expectancies, and reduced fear inhibition processes in frontal regions. Deficient pattern separation may be critical in overgeneralization and therefore may contribute to the pathophysiology of anxiety disorders and PTSD. © The Author (2017). Published by Oxford University Press.

  12. Toward a new “Fractals-General Science”

    Directory of Open Access Journals (Sweden)

    Hassen Taher Dorrah

    2014-09-01

    Full Text Available A recent study has shown that real systems everywhere follow a common “fractals-general stacking behavior” during their change pathways (or evolutionary life cycles). This fact leads to the emergence of the new discipline “Fractals-General Science” as a mother discipline (acting as an upper umbrella of existing natural and applied sciences) to commonly handle their fractals-general change behavior. The main targets of this short communication are, therefore, to present the motives, objectives, relations with other existing sciences, and the development map of such a new science. Many foreseen illustrative applications in geology, archeology, astronomy, life sciences, ecology, environmental science, hydrology, agronomy, engineering, materials sciences, chemistry, nanotechnology, biology, medicine, psychiatry, sociology, humanities, education, and arts could effectively drive the implementation and experimental testing of the new science. It is highlighted that the new “Fractals-General Science” could provide, through multi-stacking representations, the necessary platforms for investigating interactions and mutual changes between real-life systems belonging to several sciences and disciplines. Examples are handling problems of the formation and change of matter and substances, the propagation of combined corrosion, creep, fatigue and sedimentation in engineering and industrial systems, and the progression of humans’ evolutionary life cycles.

  13. General Editorial

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education. General Editorial. Articles in Resonance – Journal of Science Education. Volume 19 Issue 1 January 2014 pp 1-2 General Editorial. General Editorial on Publication Ethics · R Ramaswamy · More Details Fulltext PDF. Volume 19 Issue 1 January 2014 pp 3-3 ...

  14. The protection of fundamental human rights in criminal process: General report

    Directory of Open Access Journals (Sweden)

    Chrisje Brants

    2009-10-01

    Full Text Available This contribution examines the effect of the uniform standards of human rights in international conventions on criminal process in different countries and identifies factors inherent in national systems that influence the scope of international standards and the way in which they are implemented in a national context. Three overarching issues influence the reception of international fundamental rights and freedoms in criminal process: constitutional arrangements, legal tradition and culture, and practical circumstances. There is no such thing as the uniform implementation of convention standards; even in Europe where the European Convention on Human Rights and Fundamental Freedoms and the case law of the European Court play a significant role, there is still much diversity in the actual implementation of international norms due to the influence of legal traditions which form a counterforce to the weight of convention obligations. An even greater counterforce is at work in practical circumstances that can undermine international norms, most especially global issues of security, crime control and combating terrorism. Although convention norms are still in place, there is a very real risk that they are circumvented or at least diluted in order to increase effective crime control.

  15. Hanford general employee training - A million dollar cost beneficial program

    International Nuclear Information System (INIS)

    Gardner, P.R.

    1991-02-01

    In January 1990, Westinghouse Hanford Company implemented an interactive videodisc training program entitled Hanford General Employee Training. Covering all Institute of Nuclear Power Operations general employee training objectives, training mandated by US Department of Energy orders, and training prescribed by internal Westinghouse Hanford Company policies, Hanford General Employee Training presents and manages engaging training programs individually tailored to each of the 9,000 employees. Development costs for a sophisticated program such as Hanford General Employee Training were high compared with the cost of developing ''equivalent'' traditional training: hardware ($500,000) and labor ($400,000) totaled $900,000. Annual maintenance costs, equipment plus labor, total about $200,000. On the benefit side, by consolidating some 17 previous Westinghouse Hanford Company courses and managing the instructional process more effectively, Hanford General Employee Training reduced the average student training time from over 11 hours to just under 4 hours. For 9,000 employees, the computed net annual savings exceed $1.3 million. 2 refs
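
    The figures quoted in this record are enough to reconstruct the rough cost-benefit arithmetic. The Python sketch below recomputes the claimed net annual savings; the record does not state a labor rate, so the fully burdened rate used here (about $24 per employee-hour, plausible for 1991) is an assumption, as are the variable names.

        # Rough cost-benefit reconstruction for the Hanford General Employee
        # Training figures quoted above. The hourly labor rate is an assumption
        # (not given in the record); the other inputs come from the abstract.
        EMPLOYEES = 9_000             # workforce trained each year
        OLD_HOURS = 11.0              # "over 11 hours" per employee (lower bound)
        NEW_HOURS = 4.0               # "just under 4 hours" (upper bound)
        ANNUAL_MAINTENANCE = 200_000  # equipment plus labor, per year (USD)
        ASSUMED_LABOR_RATE = 24.0     # USD per employee-hour (assumption)

        hours_saved = EMPLOYEES * (OLD_HOURS - NEW_HOURS)
        gross_savings = hours_saved * ASSUMED_LABOR_RATE
        net_savings = gross_savings - ANNUAL_MAINTENANCE

        print(f"Employee-hours saved per year: {hours_saved:,.0f}")
        print(f"Gross annual savings:          ${gross_savings:,.0f}")
        print(f"Net annual savings:            ${net_savings:,.0f}")

    With these conservative inputs the net figure lands just above the "$1.3 million" claim, and the one-time $900,000 development cost would be recovered in well under a year.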

  16. The Xanthomonas Ax21 protein is processed by the general secretory system and is secreted in association with outer membrane vesicles

    Directory of Open Access Journals (Sweden)

    Ofir Bahar

    2014-01-01

    Full Text Available Pattern recognition receptors (PRRs) play an important role in detecting invading pathogens and mounting a robust defense response to restrict infection. In rice, one of the best characterized PRRs is XA21, a leucine-rich repeat receptor-like kinase that confers broad-spectrum resistance to multiple strains of the bacterial pathogen Xanthomonas oryzae pv. oryzae (Xoo). In 2009 we reported that an Xoo protein, called Ax21, is secreted by a type I secretion system and that it serves to activate XA21-mediated immunity. This report has recently been retracted. Here we present data that correct our previous model. We first show that Ax21 secretion does not depend on the predicted type I secretion system and that it is processed by the general secretory (Sec) system. We further show that Ax21 is an outer membrane protein, secreted in association with outer membrane vesicles. Finally, we provide data showing that ax21 knockout strains do not overcome XA21-mediated immunity.

  17. JIT supply chain; an investigation through general system theory

    Directory of Open Access Journals (Sweden)

    O P Mishra

    2013-03-01

    Full Text Available This paper explains the theoretical approach of the four theories of General System Theory (GST) developed by Yourdon (1989) [Yourdon, E. (1989). Modern Structured Analysis. Yourdon Press, Prentice-Hall International, Englewood Cliffs, New Jersey] for information technology, and subsequently applied by Caddy and Helou (2007) [Caddy, I. N., & Helou, M. M. (2007). Supply chains and their management: Application of general systems theory. Journal of Retailing and Consumer Services, 14, 319–327] to the field of supply chain management. The JIT philosophy in the core activities of the supply chain, i.e. procurement, production processes, and logistics, is discussed through general system theory. The growing structure of the supply chain imposes restrictions and requires a heavy support system, so compromises are often made when implementing JIT. The study is useful for understanding the general trends that arise naturally in the adoption of the JIT philosophy in the supply chain.

  18. The Nursing Process

    Directory of Open Access Journals (Sweden)

    M. Hammond

    1978-09-01

    Full Text Available The essence of the nursing process can be summed up in this quotation from Sir Francis Bacon: “Human knowledge and human powers meet in one; for where the cause is not known the effect cannot be produced.” Arriving at a concise, accurate definition of the nursing process was, for me, an impossible task. It is altogether too vast and too personal a topic to contract down into a nifty-looking, we-pay-lip-service-to-it cliché. So what I propose to do is to present my understanding of the nursing process throughout this essay, and then to leave the reader with some overall, general impression of what it all entails.

  19. A Generalized Cauchy Distribution Framework for Problems Requiring Robust Behavior

    Directory of Open Access Journals (Sweden)

    Rafael E. Carrillo

    2010-01-01

    Full Text Available Statistical modeling is at the heart of many engineering problems. The importance of statistical modeling emanates not only from the desire to accurately characterize stochastic events, but also from the fact that distributions are the central models utilized to derive sample processing theories and methods. The generalized Cauchy distribution (GCD) family has a closed-form pdf expression across the whole family as well as algebraic tails, which makes it suitable for modeling many real-life impulsive processes. This paper develops a GCD theory-based approach that allows challenging problems to be formulated in a robust fashion. Notably, the proposed framework subsumes generalized Gaussian distribution (GGD) family-based developments, thereby guaranteeing performance improvements over traditional GCD-based problem formulation techniques. This robust framework can be adapted to a variety of applications in signal processing. As examples, we formulate four practical applications under this framework: (1) filtering for power line communications, (2) estimation in sensor networks with noisy channels, (3) reconstruction methods for compressed sensing, and (4) fuzzy clustering.
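
    To make the GCD family concrete, a minimal numerical sketch in Python follows. It assumes the density form commonly used for this family, f(x) = a·sigma / (sigma^p + |x − delta|^p)^(2/p) with a = p·Γ(2/p) / (2·Γ(1/p)^2), which reduces to the Cauchy pdf at p = 2 and the meridian pdf at p = 1; the exact parameterization in the cited paper should be checked against the source, and the function name is illustrative.

        # Minimal sketch of a generalized Cauchy distribution (GCD) density.
        # Parameterization assumed from the common GCD literature, not taken
        # verbatim from the cited paper.
        import math

        def gcd_pdf(x: float, p: float, sigma: float = 1.0, delta: float = 0.0) -> float:
            """GCD density with tail constant p, scale sigma, location delta."""
            a = p * math.gamma(2.0 / p) / (2.0 * math.gamma(1.0 / p) ** 2)
            return a * sigma / (sigma ** p + abs(x - delta) ** p) ** (2.0 / p)

        # Sanity check: p = 2 recovers the standard Cauchy pdf.
        x = 0.7
        assert abs(gcd_pdf(x, p=2.0) - 1.0 / (math.pi * (1.0 + x * x))) < 1e-12

        # Crude check that the density integrates to roughly 1; the algebraic
        # tails make the truncated sum fall slightly short of unity.
        for p in (1.0, 1.5, 2.0):
            total = sum(gcd_pdf(-500.0 + 0.01 * i, p) for i in range(100_001)) * 0.01
            print(f"p = {p}: integral over [-500, 500] is about {total:.3f}")

    The heavier-than-Gaussian, algebraic tails visible in this density are what make GCD-based estimators robust to the impulsive processes discussed in the abstract.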

  20. Timelike Compton scattering off the neutron and generalized parton distributions

    Energy Technology Data Exchange (ETDEWEB)

    Boer, M.; Guidal, M. [CNRS-IN2P3, Universite Paris-Sud, Institut de Physique Nucleaire d'Orsay, Orsay (France); Vanderhaeghen, M. [Johannes Gutenberg Universitaet, Institut fuer Kernphysik and PRISMA Cluster of Excellence, Mainz (Germany)

    2016-02-15

    We study the exclusive photoproduction of an electron-positron pair on a neutron target in the Jefferson Lab energy domain. The reaction consists of two processes: the Bethe-Heitler process and Timelike Compton Scattering. The latter process potentially provides access to the Generalized Parton Distributions (GPDs) of the nucleon. We calculate all the unpolarized, single- and double-spin observables of the reaction and study their sensitivities to the GPDs. (orig.)