WorldWideScience

Sample records for leading log approximation

  1. The Log-Linear Return Approximation, Bubbles, and Predictability

    DEFF Research Database (Denmark)

    Engsted, Tom; Pedersen, Thomas Quistgaard; Tanggaard, Carsten

    2012-01-01

    We study in detail the log-linear return approximation introduced by Campbell and Shiller (1988a). First, we derive an upper bound for the mean approximation error, given stationarity of the log dividend-price ratio. Next, we simulate various rational bubbles which have explosive conditional....... Finally, we show that a bubble model in which expected returns are constant can explain the predictability of stock returns from the dividend-price ratio that many previous studies have documented....
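
    For reference, the approximation in question is usually stated as follows (a standard textbook form of the Campbell-Shiller log-linearization, written here from the general literature rather than from this paper's own notation; p_t and d_t denote log price and log dividend):

```latex
% Campbell-Shiller log-linear return approximation (standard form, not this paper's derivation)
r_{t+1} \;=\; \log(P_{t+1}+D_{t+1}) - \log P_t
        \;\approx\; k + \rho\, p_{t+1} + (1-\rho)\, d_{t+1} - p_t ,
\qquad
\rho = \frac{1}{1+\exp\!\big(\overline{d-p}\big)} ,
\quad
k = -\log\rho - (1-\rho)\log\!\left(\tfrac{1}{\rho}-1\right).
```

    The mean approximation error bounded in this record is the expectation of the difference between the exact and linearized expressions above.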

  2. On Rational Approximations to Euler's Constant γ and to γ + log(a/b)

    Directory of Open Access Journals (Sweden)

    Carsten Elsner

    2009-01-01

    coefficients such that rationals are given by explicit formulae which approximate and +log. It is shown that for every ∈ℚ>0 and every integer ≥42 there are infinitely many rationals / for =1,2,… such that |+log−/|≪((1−1//(−14 and ∣ with log∼1222 for tending to infinity.

  3. Leading Log Solution for Inflationary Yukawa

    CERN Document Server

    Miao, Shun-Pei

    2006-01-01

    We generalize Starobinskii's stochastic technique to the theory of a massless, minimally coupled scalar interacting with a massless fermion in a locally de Sitter geometry. The scalar is an "active" field that can engender infrared logarithms. The fermion is a "passive" field that cannot cause infrared logarithms but which can carry them, and which can also induce new interactions between the active fields. The procedure for dealing with passive fields is to integrate them out, then stochastically simplify the resulting effective action following Starobinskii. Because Yukawa theory is quadratic in the fermion this can be done explicitly using the classic solution of Candelas and Raine. We check the resulting stochastic formulation against an explicit two loop computation. We also derive a nonperturbative, leading log result for the stress tensor. Because the scalar effective potential induced by fermions is unbounded below, back-reaction from this model might dynamically cancel an arbitrarily large c...

  4. Gluon saturation beyond (naive) leading logs

    Energy Technology Data Exchange (ETDEWEB)

    Beuf, Guillaume

    2014-12-15

    An improved version of the Balitsky–Kovchegov equation is presented, with a consistent treatment of kinematics. This improvement makes it possible to resum the most severe of the large higher-order corrections which plague the conventional versions of high-energy evolution equations with approximate kinematics. This result represents a further step towards having high-energy QCD scattering processes under control beyond strict Leading Logarithmic accuracy and with gluon saturation effects.

  5. Leading log expansion of combinatorial Dyson Schwinger equations

    CERN Document Server

    Delage, Lucas

    2016-01-01

    We study combinatorial Dyson Schwinger equations, expressed in the Hopf algebra of words with a quasi shuffle product. We map them into an algebra of polynomials in one indeterminate L and show that the leading log expansions one obtains with such a mapping are simple power-law-like expressions.

  6. Computational error estimates for Monte Carlo finite element approximation with log normal diffusion coefficients

    KAUST Repository

    Sandberg, Mattias

    2015-01-07

    The Monte Carlo (and Multi-level Monte Carlo) finite element method can be used to approximate observables of solutions to diffusion equations with log-normally distributed diffusion coefficients, e.g. modelling ground water flow. Typical models use log-normal diffusion coefficients with Hölder regularity of order up to 1/2 a.s. This low regularity implies that the high frequency finite element approximation error (i.e. the error from frequencies larger than the mesh frequency) is not negligible and can be larger than the computable low frequency error. This talk will address how the total error can be estimated by the computable error.
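
    As a toy illustration of this setting (a hypothetical 1D model problem with made-up parameters, not the method or error estimator of the talk), the sketch below solves -(a u')' = 1 with a log-normal coefficient a = exp(g) on a uniform mesh and Monte Carlo averages a simple observable; running it on two mesh widths gives a crude feel for the interplay of sampling and discretization error discussed above.

```python
# Hypothetical 1D illustration: Monte Carlo P1 finite elements with a log-normal diffusion coefficient.
import numpy as np

def solve_fem(a_mid, h):
    """P1 FEM for -(a u')' = 1 on (0,1), u(0)=u(1)=0; a_mid holds a(x) at element midpoints."""
    n = len(a_mid)                                    # number of elements
    diag = (a_mid[:-1] + a_mid[1:]) / h               # interior node i couples elements i and i+1
    off = -a_mid[1:-1] / h
    A = np.diag(diag) + np.diag(off, 1) + np.diag(off, -1)
    b = np.full(n - 1, h)                             # load vector for f = 1
    return np.linalg.solve(A, b)                      # interior nodal values of u

def lognormal_field(x, rng, modes=20, decay=1.5):
    """Rough Gaussian field from a random Fourier series with decaying amplitudes, exponentiated."""
    g = np.zeros_like(x)
    for k in range(1, modes + 1):
        g += rng.normal() * np.sin(np.pi * k * x) / k**decay
    return np.exp(g)

def mc_observable(n_elem, n_samples, seed=0):
    rng = np.random.default_rng(seed)
    h = 1.0 / n_elem
    mid = (np.arange(n_elem) + 0.5) * h
    vals = [h * solve_fem(lognormal_field(mid, rng), h).sum() for _ in range(n_samples)]
    return np.mean(vals), np.std(vals) / np.sqrt(n_samples)   # observable: integral of u

print("coarse mesh:", mc_observable(64, 200), " fine mesh:", mc_observable(128, 200))
```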

  7. Approximations of the Generalized Log-Logistic Distribution to the Chi-Square Distribution

    Directory of Open Access Journals (Sweden)

    Kartika Candra Buana

    2014-12-01

    Full Text Available The main purpose of this article is to approximate, graphically and mathematically, the four-parameter generalized log-logistic distribution, denoted by G4LL(α,β,m_1,m_2), by the one-parameter Chi-square distribution with v degrees of freedom. To achieve this purpose, the article plots the probability density functions of both distributions and derives mathematically the MGFs of both distributions. To prove that the MGF of the Chi-square distribution is a special case of the MGF of the G4LL distribution, an expansion in a MacLaurin series is utilized. The results show that, graphically, the Chi-square distribution can be approximated by the generalized log-logistic distribution. Moreover, by letting α=1, β=-ln(2m_2), m_1=v/2 and m_2→∞, the MGF of the G4LL distribution can be written in the form of the MGF of the Chi-square distribution. Thus, the Chi-square distribution is a limiting or special case of the generalized log-logistic distribution.

  8. The approximation of generalized Log-aesthetic curves using Quintic Bezier curves

    Science.gov (United States)

    Albayari, Diya’ J.; Gobithaasan, R. U.; Miura, Kenjiro T.

    2017-09-01

    Generalized Log Aesthetic Curve segments (GLAC) are aesthetic curves that have a monotonic curvature profile and hence are considered fair. In the field of Computer-Aided Design (CAD), there is a demand for constructing fair curves for various design intents. However, GLAC cannot be implemented directly in CAD systems, partly due to its transcendental form. A viable solution is to approximate GLACs with a quintic polynomial curve in Bezier form using a curvature error measure. The problem with this approach is that it requires a formidable amount of computation due to arc-length reparametrization. In this paper, we introduce a new method of calculating the curvature error measure using a natural spline interpolation function to minimize the computational effort while preserving accuracy. The final section shows numerical examples depicting the proposed approximation for two types of GLAC, which clearly indicate the efficiency of the proposed method.
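
    As background for the curvature-error idea, the sketch below (a generic illustration with a made-up control polygon, not the authors' GLAC fitting code) evaluates a quintic Bezier curve and its signed curvature, the quantity whose deviation from the target GLAC curvature such an approximation minimizes.

```python
# Generic sketch: signed curvature of a quintic (degree-5) Bezier curve.
import numpy as np
from math import comb

def bezier(ctrl, t):
    """Evaluate a Bezier curve with control points ctrl (shape (n+1, 2)) at parameter values t."""
    n = len(ctrl) - 1
    t = np.atleast_1d(t)
    basis = np.stack([comb(n, i) * t**i * (1 - t)**(n - i) for i in range(n + 1)], axis=1)
    return basis @ ctrl

def hodograph(ctrl):
    """Control points of the derivative curve."""
    n = len(ctrl) - 1
    return n * (ctrl[1:] - ctrl[:-1])

def signed_curvature(ctrl, t):
    d1 = bezier(hodograph(ctrl), t)                   # first derivative
    d2 = bezier(hodograph(hodograph(ctrl)), t)        # second derivative
    cross = d1[:, 0] * d2[:, 1] - d1[:, 1] * d2[:, 0]
    return cross / (d1[:, 0]**2 + d1[:, 1]**2)**1.5   # kappa = (x'y'' - y'x'') / (x'^2 + y'^2)^(3/2)

# Hypothetical control polygon of a quintic segment; a real fit would tune it to match GLAC curvature.
P = np.array([[0.0, 0.0], [0.2, 0.1], [0.45, 0.15], [0.7, 0.35], [0.9, 0.7], [1.0, 1.0]])
print(signed_curvature(P, np.linspace(0.0, 1.0, 11)))
```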

  9. An O(log*n) approximation algorithm for the asymmetric p-center problem

    Energy Technology Data Exchange (ETDEWEB)

    Vishwanathan, S. [Indian Inst. of Technology, Bombay (India)]

    1996-12-31

    The input to the asymmetric p-center problem consists of an integer p and an n x n distance matrix D defined on a vertex set V of size n, where d_ij gives the distance from i to j. The distances are assumed to obey the triangle inequality. For a subset S ⊆ V the radius of S is the minimum distance R such that every point in V is at a distance at most R from some point in S. The p-center problem consists of picking a set S ⊆ V of size p to minimize the radius. This problem is known to be NP-complete. For the symmetric case, when d_ij = d_ji, approximation algorithms that deliver a solution to within 2 of the optimal are known. David Shmoys, in his article, mentions that nothing was known about the asymmetric case. Rina Panigrahy recently gave a simple O(log n) approximation algorithm. We improve this substantially: our algorithm achieves a factor of O(log* n).
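
    For comparison, the factor-2 algorithm known for the symmetric case mentioned above is the farthest-point greedy heuristic; the sketch below implements that baseline on a distance matrix (it is not the O(log* n) asymmetric algorithm of this record).

```python
# Farthest-point greedy: the classical 2-approximation for the *symmetric* p-center problem.
import numpy as np

def greedy_p_center(D, p, start=0):
    """D: symmetric n x n distance matrix obeying the triangle inequality; returns p center indices."""
    centers = [start]
    nearest = D[start].copy()                 # distance of every vertex to its nearest chosen center
    while len(centers) < p:
        c = int(np.argmax(nearest))           # the farthest vertex becomes the next center
        centers.append(c)
        nearest = np.minimum(nearest, D[c])
    return centers, nearest.max()             # centers and their covering radius

# Small random metric example (Euclidean distances between random points).
rng = np.random.default_rng(1)
pts = rng.random((20, 2))
D = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
print(greedy_p_center(D, p=3))
```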

  10. Piecewise log-normal approximation of size distributions for aerosol modelling

    Directory of Open Access Journals (Sweden)

    K. von Salzen

    2006-01-01

    Full Text Available An efficient and accurate method for the representation of particle size distributions in atmospheric models is proposed. The method can be applied, but is not necessarily restricted, to aerosol mass and number size distributions. A piecewise log-normal approximation of the number size distribution within sections of the particle size spectrum is used. Two of the free parameters of the log-normal approximation are obtained from the integrated number and mass concentration in each section. The remaining free parameter is prescribed. The method is efficient in the sense that only relatively few calculations are required for applications of the method in atmospheric models. Applications of the method in simulations of particle growth by condensation and simulations with a single column model for nucleation, condensation, gravitational settling, wet deposition, and mixing are described. The results are compared to results from simulations employing single- and double-moment bin methods that are frequently used in aerosol modelling. According to these comparisons, the accuracy of the method is noticeably higher than the accuracy of the other methods.
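
    To make the parameter recovery concrete, here is a hedged sketch of obtaining the count median radius of a section's log-normal from its integrated number and mass concentrations with a prescribed geometric standard deviation; for simplicity it uses full (untruncated) log-normal moments and a made-up particle density, whereas the actual PLA works with integrals restricted to the section.

```python
# Hedged sketch: log-normal parameters from integrated number (N) and mass (M) concentrations.
# Uses full log-normal moments; the actual PLA evaluates moments only within the section bounds.
import numpy as np

def count_median_radius(N, M, sigma_g, rho_p=1000.0):
    """
    N       : number concentration in the section [m^-3]
    M       : mass concentration in the section [kg m^-3]
    sigma_g : prescribed geometric standard deviation (the prescribed third parameter)
    rho_p   : particle density [kg m^-3] (hypothetical value)
    Returns the count median radius r0 [m]; (N, r0, sigma_g) then determine the log-normal.
    """
    ln_sg = np.log(sigma_g)
    mean_particle_volume = M / (rho_p * N)    # [m^3]
    # spheres: (4/3) pi r0^3 exp(4.5 ln^2 sigma_g) equals the mean particle volume
    return (3.0 * mean_particle_volume / (4.0 * np.pi)) ** (1.0 / 3.0) * np.exp(-1.5 * ln_sg**2)

# Example with made-up section values.
print(count_median_radius(N=1.0e8, M=2.0e-9, sigma_g=1.6))
```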

  11. Studies of Approximated Log-MAP Turbo Decoders

    Institute of Scientific and Technical Information of China (English)

    张琳; 刘星成; 张晓瑜; 张光昭

    2005-01-01

    This paper studies the application of approximated Log-MAP algorithms in turbo decoders. The error-correction performance of a WCDMA turbo decoder based on the approximated algorithms is simulated over the AWGN channel and a flat, slowly fading Rayleigh channel. The simulation results show that the second-order approximated Log-MAP turbo decoder is equivalent in performance to the MAP turbo decoder and outperforms the SOVA turbo decoder by 0.7-0.9 dB.

  12. Approximate relationship of coal bed methane and magnetic characteristics of rock via magnetic susceptibility logging

    Science.gov (United States)

    Zhao, Yonghui; Wu, Jiansheng; Zhang, Pingsong; Xiao, Pengfei

    2012-02-01

    In coal bed methane (CBM) exploration, improving the accuracy of locating and evaluating CBM deposits is still a problem due to the rarity of occurrence of CBM. Combined with the distribution of the CBM content in the Huainan coalfield, the approximate relationship between the occurrence of CBM and the magnetic properties of the coal bed and adjacent mudstone has been examined by magnetic susceptibility logging. Experimental results show that the magnetic susceptibility of the coal bed and adjacent mudstone clearly increases with the CBM content of a coal bed. Based on these experimental results, predictions of the CBM content were made for different coal beds, and the results are consistent with the distribution of the CBM content throughout the whole coalfield. Preliminary data analysis reveals that there is indeed a correlation between changes of magnetic rock characteristics and the occurrence of CBM, and this finding may shed some light on the evaluation of CBM.

  13. Determination of log P values of new cyclen based antimalarial drug leads using RP-HPLC.

    Science.gov (United States)

    Rudraraju, A V; Amoyaw, P N A; Hubin, T J; Khan, M O F

    2014-09-01

    Lipophilicity, expressed by log P, is an important physicochemical property of drugs that affects many biological processes, including drug absorption and distribution. The main purpose of this study was to determine the log P values of newly discovered drug leads using reversed-phase high-performance liquid chromatography (RP-HPLC). The reference standards, spanning a range of polarities, were dissolved in methanol and analyzed by RP-HPLC using a C18 column. The mobile phase consisted of a mixture of acetonitrile, methanol and water in a gradient elution mode. A calibration curve was plotted between the experimental log P values and the obtained log k values of the reference standard compounds, and a best-fit line was obtained. The log k values of the new drug leads were determined in the same solvent system and were used to calculate the respective log P values from the best-fit equation. The log P vs. log k data gave a best-fit linear curve with an R^2 of 0.9786, with P values for the intercept and slope of 1.19 x 10^-6 and 1.56 x 10^-10, respectively, at the 0.05 level of significance. Log P values of 15 new drug leads and related compounds, all of which are derivatives of macrocyclic polyamines and their metal complexes, were determined. The values obtained are closely related to the calculated log P (Clog P) values from ChemDraw Ultra 12.0. This experiment provided efficient, fast and reasonable estimates of the log P values of the new drug leads by RP-HPLC.
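
    The calibration logic described here (regress the known log P of the standards on their measured log k, then apply the fitted line to new compounds) can be sketched as follows; all numbers below are placeholders, not data from the study.

```python
# Sketch of the RP-HPLC log P calibration: fit log P vs. log k for standards, predict new leads.
import numpy as np

# Placeholder calibration data: measured retention factors (log k) and known log P of standards.
log_k_std = np.array([-0.35, 0.10, 0.48, 0.92, 1.30])
log_p_std = np.array([ 0.8,  1.7,  2.5,  3.4,  4.2])

slope, intercept = np.polyfit(log_k_std, log_p_std, 1)    # best-fit line: log P = slope*log k + intercept
r2 = np.corrcoef(log_k_std, log_p_std)[0, 1] ** 2

# Predict log P for new drug leads from their measured log k values (placeholders).
log_k_new = np.array([0.25, 0.75, 1.10])
print(f"R^2 = {r2:.4f}", slope * log_k_new + intercept)
```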

  14. An $O(\log n)$-approximation for the Set Cover Problem with Set Ownership

    CERN Document Server

    Gonen, Mira

    2008-01-01

    In highly distributed Internet measurement systems distributed agents periodically measure the Internet using a tool called traceroute, which discovers a path in the network graph. Each agent performs many traceroute measurements to a set of destinations in the network, and thus reveals a portion of the Internet graph as it is seen from the agent locations. In every period we need to check whether previously discovered edges still exist in this period, a process termed validation. To this end we maintain a database of all the different measurements performed by each agent. Our aim is to be able to validate the existence of all previously discovered edges in the minimum possible time. In this work we formulate the validation problem as a generalization of the well known set cover problem. We reduce the set cover problem to the validation problem, thus proving that the validation problem is NP-hard. We present an $O(\log n)$-approximation algorithm for the validation problem, where $n$ i...
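
    Since the validation problem generalizes set cover, the standard greedy O(log n)-approximation is the natural reference point; below is a minimal sketch of that greedy rule on a toy instance (not the paper's validation-specific algorithm).

```python
# Greedy set cover: repeatedly pick the set covering the most still-uncovered elements.
# This achieves the classical H_n <= ln(n) + 1 approximation factor.
def greedy_set_cover(universe, sets):
    uncovered = set(universe)
    chosen = []
    while uncovered:
        best = max(sets, key=lambda s: len(uncovered & s))
        if not uncovered & best:
            raise ValueError("remaining elements cannot be covered")
        chosen.append(best)
        uncovered -= best
    return chosen

# Toy instance: agents' traceroute measurements (sets of edges) covering the edges to validate.
edges = range(1, 10)
measurements = [frozenset({1, 2, 3, 8}), frozenset({2, 4, 5}), frozenset({3, 6, 7}),
                frozenset({5, 7, 8, 9}), frozenset({1, 9})]
print(greedy_set_cover(edges, measurements))
```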

  15. An approximate 3D computational method for real-time computation of induction logging responses

    NARCIS (Netherlands)

    Bensdorp, S.; Petersen, S.A.; Van den Berg, P.M.; Fokkema, J.T.

    2014-01-01

    Over many years, induction logging systems have been used to create well formation logs. The major drawback for the utilization of these tools is the long simulation time for a single forward computation. We proposed an efficient computational method based on a contrast-type of integral-equation for

  16. Filtrations in Dyson-Schwinger equations: next-to^{j}-leading log expansions systematically

    CERN Document Server

    Krueger, Olaf

    2014-01-01

    Dyson-Schwinger equations determine the Green functions $G^r(\alpha,L)$ in quantum field theory. Their solutions are triangular series in a coupling constant $\alpha$ and an external scale parameter $L$ for a chosen amplitude $r$, with the order in $L$ bounded by the order in the coupling. Perturbation theory calculates the first few orders in $\alpha$. On the other hand, Dyson-Schwinger equations determine next-to$^{j}$-leading log expansions, $G^r(\alpha,L) = 1 + \sum_{j=0}^\infty \sum_{\mathcal{M}} p_j^{\mathcal{M}}\alpha^j \mathcal{M}(u)$, where $\sum_{\mathcal{M}}$ sums a finite number of functions $\mathcal{M}$ in $u = \alpha L/2$. The leading logs come from the trivial representation $\mathcal{M}(u) = [\bullet](u)$ at $j=0$ with $p_0^{[\bullet]} = 1$. All non-leading logs are organized by the suppression in powers $\alpha^j$. We describe an algebraic method to derive all next-to$^{j}$-leading log terms from the...

  17. Effects of First-Order Approximations on Head and Specific Discharge Covariances in High-Contrast Log Conductivity

    Science.gov (United States)

    van Lent, Thomas; Kitanidis, Peter K.

    1996-05-01

    The hydraulic head and the specific discharge fluctuations depend nonlinearly on the hydraulic conductivity. However, the methods most commonly used in the stochastic analysis of groundwater flow are based upon the linearization of these relations. In this paper we apply a numerical spectral approach to investigate the range of validity of the small perturbation approximation for head and specific discharge moments in two-dimensional finite domains. We find that the small perturbation approximation tends to underestimate the variance of large-scale head and specific discharge fluctuations, and that the error increases with increasing log-conductivity variance and increasing domain size. The head fluctuations do not appear ergodic even when the small perturbation approximation predicts they will be ergodic. The specific discharge fluctuations, on the other hand, do appear ergodic. The small perturbation approximation performs well in estimating specific discharge variance in the longitudinal direction but significantly underestimates transverse specific discharge variance.

  18. A $O(\log m)$, deterministic, polynomial-time computable approximation of Lewis Carroll's scoring rule

    CERN Document Server

    Covey, Jason

    2008-01-01

    We provide deterministic, polynomial-time computable voting rules that approximate Dodgson's and (the "minimization version" of) Young's scoring rules to within a logarithmic factor. Our approximation of Dodgson's rule is tight up to a constant factor, as Dodgson's rule is NP-hard to approximate to within some logarithmic factor. The "maximization version" of Young's rule is known to be NP-hard to approximate by any constant factor. Both approximations are simple, and natural as rules in their own right: Given a candidate we wish to score, we can regard either its Dodgson or Young score as the edit distance between a given set of voter preferences and one in which the candidate to be scored is the Condorcet winner. (The difference between the two scoring rules is the type of edits allowed.) We regard the marginal cost of a sequence of edits to be the number of edits divided by the number of reductions (in the candidate's deficit against any of its opponents in the pairwise race against that opponent...

  19. Simulation of mineral dust aerosol with piecewise log-normal approximation (PLA) in CanAM4-PAM

    Directory of Open Access Journals (Sweden)

    Y. Peng

    2011-09-01

    Full Text Available A new size-resolved dust scheme based on the numerical method of piecewise log-normal approximation (PLA) was developed and implemented in the fourth generation of the Canadian Atmospheric Global Climate Model with the PLA Aerosol Module (CanAM4-PAM). The total simulated annual mean dust burden is 37.8 mg m−2 for year 2000, which is consistent with estimates from other models. Results from simulations are compared with multiple surface measurements near and away from dust source regions, validating the generation, transport and deposition of dust in the model. Most discrepancies between model results and surface measurements are due to unresolved aerosol processes. Radiative properties of dust aerosol are derived from approximated parameters in two size modes using Mie theory. The simulated aerosol optical depth (AOD) is compared with several satellite observations and shows good agreement. The model yields a dust AOD of 0.042 and total AOD of 0.126 for the year 2000. The simulated aerosol direct radiative forcings (ADRF) of dust and total aerosol over ocean are −1.24 W m−2 and −4.76 W m−2 respectively, which show good consistency with satellite estimates for the year 2001.

  20. Simulation of mineral dust aerosol with Piecewise Log-normal Approximation (PLA) in CanAM4-PAM

    Directory of Open Access Journals (Sweden)

    Y. Peng

    2012-08-01

    Full Text Available A new size-resolved dust scheme based on the numerical method of piecewise log-normal approximation (PLA) was developed and implemented in the fourth generation of the Canadian Atmospheric Global Climate Model with the PLA Aerosol Model (CanAM4-PAM). The total simulated annual global dust emission is 2500 Tg yr−1, and the dust mass load is 19.3 Tg for year 2000. Both are consistent with estimates from other models. Results from simulations are compared with multiple surface measurements near and away from dust source regions, validating the generation, transport and deposition of dust in the model. Most discrepancies between model results and surface measurements are due to unresolved aerosol processes. Biases in long-range transport also contribute. Radiative properties of dust aerosol are derived from approximated parameters in two size modes using Mie theory. The simulated aerosol optical depth (AOD) is compared with satellite and surface remote sensing measurements and shows general agreement in terms of the dust distribution around sources. The model yields a dust AOD of 0.042 and a dust aerosol direct radiative forcing (ADRF) of −1.24 W m−2, which show good consistency with model estimates from other studies.

  1. NPH Log: Validation of a New Assessment Tool Leading to Earlier Diagnosis of Normal Pressure Hydrocephalus.

    Science.gov (United States)

    Jusué-Torres, Ignacio; Lu, Jennifer; Robison, Jamie; Hoffberger, Jamie B; Hulbert, Alicia; Sanyal, Abanti; Wemmer, Jan; Elder, Benjamin D; Rigamonti, Daniele

    2016-06-27

    Early treatment of normal pressure hydrocephalus (NPH) yields better postoperative outcomes. Our current tests often fail to detect significant changes at early stages. We developed a new scoring system (LP log score) to determine if this tool is more sensitive in detecting clinical differences than current tests. Sixty-two consecutive new patients with suspected idiopathic NPH were studied. Secondary, previously treated and obstructive cases were not included. We collected age, pre- and post-lumbar puncture (LP) Tinetti, Timed Up and Go (TUG) Test, European NPH scale, and LP log scores. The LP log score is recorded at baseline and for seven consecutive days after removing 40 cc of cerebrospinal fluid (CSF) via LP. We studied the diagnostic accuracy of the tests for surgical indication. The post-LP log showed improvement in 90% of people with good baseline gait tests and in 93% of people who did not show any pre-LP to post-LP change in gait tests. Sensitivity, specificity, and accuracy for detecting the intention to treat on positive post-LP improvement were 4%, 100%, and 24%, respectively, for TUG; 21%, 86%, and 34% for the Tinetti Mobility Test; 66%, 29%, and 58% for Medical College of Virginia (MCV) grade; and 98%, 33%, and 85% for the LP log score. Pre-LP to post-LP TUG improvement and pre-LP to post-LP Tinetti improvement were not associated with a surgical indication (p > 0.05). LP log improvement was associated with surgical indication, odds ratio (OR) 24.5, 95% CI 2.4-248.12 (p = 0.007). The LP log showed better sensitivity, diagnostic accuracy, and association with surgical indication than the current diagnostic approach. An LP log may be useful in detecting NPH patients at earlier stages and, therefore, yield better surgical outcomes.
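
    For readers unfamiliar with the reported metrics, the short sketch below shows how sensitivity, specificity, and accuracy follow from a 2x2 confusion matrix; the counts are illustrative only, not the study's data.

```python
# Illustrative only: sensitivity, specificity, and accuracy from a 2x2 confusion matrix.
def diagnostic_metrics(tp, fn, fp, tn):
    sensitivity = tp / (tp + fn)               # true positive rate
    specificity = tn / (tn + fp)               # true negative rate
    accuracy = (tp + tn) / (tp + fn + fp + tn)
    return sensitivity, specificity, accuracy

# Hypothetical counts for a test judged against the surgical-indication reference standard.
print(diagnostic_metrics(tp=45, fn=1, fp=10, tn=5))
```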

  2. NPH Log: Validation of a New Assessment Tool Leading to Earlier Diagnosis of Normal Pressure Hydrocephalus

    Science.gov (United States)

    Lu, Jennifer; Robison, Jamie; Hoffberger, Jamie B; Hulbert, Alicia; Sanyal, Abanti; Wemmer, Jan; Elder, Benjamin D; Rigamonti, Daniele

    2016-01-01

    Introduction: Early treatment of normal pressure hydrocephalus (NPH) yields better postoperative outcomes. Our current tests often fail to detect significant changes at early stages. We developed a new scoring system (LP log score) to determine if this tool is more sensitive in detecting clinical differences than current tests. Material and Methods: Sixty-two consecutive new patients with suspected idiopathic NPH were studied. Secondary, previously treated and obstructive cases were not included. We collected age, pre- and post-lumbar puncture (LP) Tinetti, Timed Up and Go (TUG) Test, European NPH scale, and LP log scores. The LP log score is recorded at baseline and for seven consecutive days after removing 40 cc of cerebrospinal fluid (CSF) via LP. We studied the diagnostic accuracy of the tests for surgical indication. Results: The post-LP log showed improvement in 90% of people with good baseline gait tests and in 93% of people who did not show any pre-LP to post-LP change in gait tests. Sensitivity, specificity, and accuracy for detecting the intention to treat on positive post-LP improvement were 4%, 100%, and 24%, respectively, for TUG; 21%, 86%, and 34% for the Tinetti Mobility Test; 66%, 29%, and 58% for Medical College of Virginia (MCV) grade; and 98%, 33%, and 85% for the LP log score. Pre-LP to post-LP TUG improvement and pre-LP to post-LP Tinetti improvement were not associated with a surgical indication (p > 0.05). LP log improvement was associated with surgical indication, odds ratio (OR) 24.5, 95% CI 2.4-248.12 (p = 0.007). Conclusions: The LP log showed better sensitivity, diagnostic accuracy, and association with surgical indication than the current diagnostic approach. An LP log may be useful in detecting NPH patients at earlier stages and, therefore, yield better surgical outcomes. PMID:27489752

  3. Some thoughts on how to match Leading Log Parton Showers with NLO Matrix Elements

    CERN Document Server

    Friberg, C; Friberg, Christer; Sjöstrand, Torbjörn

    1999-01-01

    We propose a scheme that could offer a convenient Monte Carlo sampling of next-to-leading-order matrix elements and, at the same time, allow the interfacing of such parton configurations with a parton-shower approach for the estimation of higher-order effects. No actual implementation exists so far, so this note should only be viewed as the outline of a possible road for the future, submitted for discussion.

  4. Leading corrections to local approximations. II. The case with turning points

    Science.gov (United States)

    Ribeiro, Raphael F.; Burke, Kieron

    2017-03-01

    Quantum corrections to Thomas-Fermi (TF) theory are investigated for noninteracting one-dimensional fermions with known uniform semiclassical approximations to the density and kinetic energy. Their structure is analyzed, and contributions from distinct phase space regions (classically-allowed versus forbidden at the Fermi energy) are derived analytically. Universal formulas are derived for both particle numbers and energy components in each region. For example, in the semiclassical limit, exactly 1/(6π√3) of a particle leaks into the evanescent region beyond a turning point. The correct normalization of semiclassical densities is proven analytically in the semiclassical limit. Energies and densities are tested numerically in a variety of one-dimensional potentials, especially in the limit where TF theory becomes exact. The subtle relation between the pointwise accuracy of the semiclassical approximation and integrated expectation values is explored. The limitations of the semiclassical formulas are also investigated when the potential varies too rapidly. The approximations are shown to work for multiple wells, except right at the mid-phase point of the evanescent regions. The implications for density functional approximations are discussed.

  5. Two-loop Bhabha scattering at high energy beyond leading power approximation

    Directory of Open Access Journals (Sweden)

    Alexander A. Penin

    2016-09-01

    Full Text Available We evaluate the two-loop O(m_e^2/s) contribution to wide-angle high-energy electron–positron scattering in the double-logarithmic approximation. The origin and the general structure of the power-suppressed double-logarithmic corrections are discussed in detail.

  6. $O(N)$ model in Euclidean de Sitter space: beyond the leading infrared approximation

    CERN Document Server

    Nacir, Diana López; Trombetta, Leonardo G

    2016-01-01

    We consider an $O(N)$ scalar field model with quartic interaction in $d$-dimensional Euclidean de Sitter space. In order to avoid the problems of the standard perturbative calculations for light and massless fields, we generalize to the $O(N)$ theory a systematic method introduced previously for a single field, which treats the zero modes exactly and the nonzero modes perturbatively. We compute the two-point functions taking into account not only the leading infrared contribution, coming from the self-interaction of the zero modes, but also corrections due to the interaction of the ultraviolet modes. For the model defined in the corresponding Lorentzian de Sitter spacetime, we obtain the two-point functions by analytical continuation. We point out that a partial resummation of the leading secular terms (which necessarily involves nonzero modes) is required to obtain a decay at large distances for massless fields. We implement this resummation along with a systematic double expansion in an effective coupling c...

  7. Optimisation of spectrometric gamma-gamma probe configuration using very low radioactivity sources for lead and zinc grade determination in borehole logging

    CERN Document Server

    Asfahani, J

    1999-01-01

    The suitability of spectrometric backscattered gamma-gamma well logging measurements for predicting lead and zinc metal equivalent content is demonstrated. A centralised tool employing a gamma-ray source of very low radioactivity (1.8 MBq) is used. The logging tool is tested using ¹³³Ba and ¹³⁷Cs sources with a 37 mm (diameter) x 75 mm NaI(Tl) scintillation detector. Five source-to-detector configurations were analysed for 18 geophysical models, 13 of which had a borehole diameter of 130 mm and the other 5 had a borehole diameter of 160 mm. Regression analysis was carried out on the laboratory logging data for each configuration in order to establish the calibration equation for lead (Pb) and zinc metal equivalent (ZME) prediction. The optimum configuration for the logging probe using a ¹³³Ba source was determined to be 52 mm source-to-detector spacing. This configuration gives the best results for both Pb and ZME grade. The rms deviations for Pb and ZME were 0.33 and ...

  8. Asymptotic solution of the diffusion equation in slender impermeable tubes of revolution. I. The leading-term approximation

    Energy Technology Data Exchange (ETDEWEB)

    Traytak, Sergey D., E-mail: sergtray@mail.ru [Centre de Biophysique Moléculaire, CNRS-UPR4301, Rue C. Sadron, 45071 Orléans (France); Le STUDIUM (Loire Valley Institute for Advanced Studies), 3D av. de la Recherche Scientifique, 45071 Orléans (France); Semenov Institute of Chemical Physics RAS, 4 Kosygina St., 117977 Moscow (Russian Federation)]

    2014-06-14

    The anisotropic 3D equation describing the pointlike particles diffusion in slender impermeable tubes of revolution with cross section smoothly depending on the longitudinal coordinate is the object of our study. We use singular perturbations approach to find the rigorous asymptotic expression for the local particles concentration as an expansion in the ratio of the characteristic transversal and longitudinal diffusion relaxation times. The corresponding leading-term approximation is a generalization of well-known Fick-Jacobs approximation. This result allowed us to delineate the conditions on temporal and spatial scales under which the Fick-Jacobs approximation is valid. A striking analogy between solution of our problem and the method of inner-outer expansions for low Knudsen numbers gas kinetic theory is established. With the aid of this analogy we clarify the physical and mathematical meaning of the obtained results.
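
    For context, the Fick-Jacobs approximation that the leading term generalizes is usually written as the following effective 1D diffusion equation for the reduced concentration c(x,t) in a tube of slowly varying cross-sectional area A(x) (a textbook form, not the paper's generalized result):

```latex
% Fick-Jacobs equation: effective 1D diffusion in a tube with slowly varying cross section A(x).
\frac{\partial c(x,t)}{\partial t}
  \;=\;
\frac{\partial}{\partial x}\!\left[\, D\, A(x)\,
      \frac{\partial}{\partial x}\!\left( \frac{c(x,t)}{A(x)} \right) \right],
\qquad
c(x,t) \;=\; \int_{A(x)} C(x,y,z,t)\, \mathrm{d}y\, \mathrm{d}z .
```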

  9. Asymptotic solution of the diffusion equation in slender impermeable tubes of revolution. I. The leading-term approximation.

    Science.gov (United States)

    Traytak, Sergey D

    2014-06-14

    The anisotropic 3D equation describing the pointlike particles diffusion in slender impermeable tubes of revolution with cross section smoothly depending on the longitudinal coordinate is the object of our study. We use singular perturbations approach to find the rigorous asymptotic expression for the local particles concentration as an expansion in the ratio of the characteristic transversal and longitudinal diffusion relaxation times. The corresponding leading-term approximation is a generalization of well-known Fick-Jacobs approximation. This result allowed us to delineate the conditions on temporal and spatial scales under which the Fick-Jacobs approximation is valid. A striking analogy between solution of our problem and the method of inner-outer expansions for low Knudsen numbers gas kinetic theory is established. With the aid of this analogy we clarify the physical and mathematical meaning of the obtained results.

  10. Asymptotic solution of the diffusion equation in slender impermeable tubes of revolution. I. The leading-term approximation

    CERN Document Server

    Traytak, Sergey D

    2013-01-01

    The anisotropic 3D equation describing the pointlike particles diffusion in slender impermeable tubes of revolution with cross section smoothly depending on the longitudinal coordinate is the object of our study. We use singular perturbations approach to find the rigorous asymptotic expression for the local particles concentration as an expansion in the ratio of the characteristic transversal and longitudinal diffusion relaxation times. The corresponding leading-term approximation is a generalization of well-known Fick-Jacobs approximation. This result allowed us to delineate the conditions on temporal and spatial scales under which the Fick-Jacobs approximation is valid. A striking analogy between solution of our problem and the method of inner-outer expansions for low Knudsen numbers gas kinetic theory is established. With the aid of this analogy we clarify the physical and mathematical meaning of the obtained results.

  11. Power Series Approximation for the Correlation Kernel Leading to Kohn-Sham Methods Combining Accuracy, Computational Efficiency, and General Applicability

    Science.gov (United States)

    Erhard, Jannis; Bleiziffer, Patrick; Görling, Andreas

    2016-09-01

    A power series approximation for the correlation kernel of time-dependent density-functional theory is presented. Using this approximation in the adiabatic-connection fluctuation-dissipation (ACFD) theorem leads to a new family of Kohn-Sham methods. The new methods yield reaction energies and barriers of unprecedented accuracy and enable a treatment of static (strong) correlation with an accuracy of high-level multireference configuration interaction methods but are single-reference methods allowing for a black-box-like handling of static correlation. The new methods exhibit a better scaling of the computational effort with the system size than rivaling wave-function-based electronic structure methods. Moreover, the new methods do not suffer from the problem of singularities in response functions plaguing previous ACFD methods and therefore are applicable to any type of electronic system.

  12. Greedy scheduling of cellular self-replication leads to optimal doubling times with a log-Frechet distribution.

    Science.gov (United States)

    Pugatch, Rami

    2015-02-24

    Bacterial self-replication is a complex process composed of many de novo synthesis steps catalyzed by a myriad of molecular processing units, e.g., the transcription-translation machinery, metabolic enzymes, and the replisome. Successful completion of all production tasks requires a schedule: a temporal assignment of each of the production tasks to its respective processing units that respects ordering and resource constraints. Most intracellular growth processes are well characterized. However, the manner in which they are coordinated under the control of a scheduling policy is not well understood. When fast replication is favored, a schedule that minimizes the completion time is desirable. However, if resources are scarce, it is typically computationally hard to find such a schedule, in the worst case. Here, we show that optimal scheduling naturally emerges in cellular self-replication. Optimal doubling time is obtained by maintaining a sufficiently large inventory of intermediate metabolites and processing units required for self-replication and additionally requiring that these processing units be "greedy," i.e., not idle if they can perform a production task. We calculate the distribution of doubling times of such optimally scheduled self-replicating factories, and find it has a universal form, log-Frechet, not sensitive to many microscopic details. Analyzing two recent datasets of Escherichia coli growing in a stationary medium, we find excellent agreement between the observed doubling-time distribution and the predicted universal distribution, suggesting E. coli is optimally scheduling its replication. Greedy scheduling appears as a simple generic route to optimal scheduling when speed is the optimization criterion. Other criteria such as efficiency require more elaborate scheduling policies and tighter regulation.

  13. Shear and Bulk Viscosities of a Weakly Coupled Quark Gluon Plasma with Finite Chemical Potential and Temperature---Leading-Log Results

    CERN Document Server

    Chen, Jiunn-Wei; Song, Yu-Kun; Wang, Qun

    2012-01-01

    We calculate the shear (eta) and bulk (zeta) viscosities of a weakly coupled quark gluon plasma at the leading-log order with finite temperature T and quark chemical potential mu. We find that the shear viscosity to entropy density ratio eta/s increases monotonically with mu and eventually scales as (mu/T)^2 at large mu. By contrast, zeta/s is insensitive to mu. Both eta/s and zeta/s are monotonically decreasing functions of the quark flavor number N_f when N_f >= 2. This property is also observed in pion gas systems. Our perturbative calculation suggests that QCD becomes the most perfect (i.e. with the smallest eta/s) at mu=0 and N_f = 16 (the maximum N_f with asymptotic freedom). It would be interesting to test whether the currently smallest eta/s, computed close to the phase transition with mu=0 and N_f = 0, can be further reduced by increasing N_f.

  14. Some peculiarities of averaging of functions leading to two-dimensional MHD problems in the zero-induction approximation

    Energy Technology Data Exchange (ETDEWEB)

    Birzvalk, Yu.A.

    1977-10-01

    The peculiarities of averaging of a function with respect to one of its coordinates are studied, resulting in the formulation of two-dimensional MHD problems in the zero-induction approximation. The transition to the two-dimensional approximation is achieved by averaging all of the functions analyzed with respect to one of the coordinates. It is shown that when there is symmetry in the Poisson equation for the potential, components of the scalar product v·rot B appear, as a result of the fact that rot B = 0. However, their appearance can also be explained by a clearer, though less strict, method. The importance of consideration of these components must be estimated in each specific problem. An elementary modeling problem is solved allowing the relative significance of the current density component in the direction with respect to which averaging is performed to be estimated. 2 references, 4 figures.

  15. Lead

    Science.gov (United States)

  16. Lead

    Science.gov (United States)

  17. The magnetic lead field theorem in the quasi-static approximation and its use for magnetoencephalography forward calculation in realistic volume conductors

    Energy Technology Data Exchange (ETDEWEB)

    Nolte, Guido [Human Motor Control Section, NINDS, NIH, Bethesda, MD (United States)]

    2003-11-21

    The equation for the magnetic lead field for a given magnetoencephalography (MEG) channel is well known for arbitrary frequencies but is not directly applicable to MEG in the quasi-static approximation. In this paper we derive an equation starting from the very definition of the lead field instead of using Helmholtz's reciprocity theorems. The results are (a) the transpose of the conductivity times the lead field is divergence-free, and (b) the lead field differs from the one in any other volume conductor by a gradient of a scalar function. Consequently, for a piecewise homogeneous and isotropic volume conductor, the lead field is always tangential at the outermost surface. Based on this theoretical result, we formulated a simple and fast method for the MEG forward calculation for one shell of arbitrary shape: we correct the corresponding lead field for a spherical volume conductor by a superposition of basis functions, gradients of harmonic functions constructed here from spherical harmonics, with coefficients fitted to the boundary conditions. The algorithm was tested for a prolate spheroid of realistic shape for which the analytical solution is known. For high order in the expansion, we found the solutions to be essentially exact and for reasonable accuracies much fewer multiplications are needed than in typical implementations of the boundary element methods. The generalization to more shells is straightforward.

  18. Approximate Matching of Hierarchial Data

    DEFF Research Database (Denmark)

    Augsten, Nikolaus

    The goal of this thesis is to design, develop, and evaluate new methods for the approximate matching of hierarchical data represented as labeled trees. In approximate matching scenarios two items should be matched if they are similar. Computing the similarity between labeled trees is hard...... as in addition to the data values also the structure must be considered. A well-known measure for comparing trees is the tree edit distance. It is computationally expensive and leads to a prohibitively high run time. Our solution for the approximate matching of hierarchical data is pq-grams. The pq...... formally prove that the pq-gram index can be incrementally updated based on the log of edit operations without reconstructing intermediate tree versions. The incremental update is independent of the data size and scales to a large number of changes in the data. We introduce windowed pq...
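
    A compact sketch of the pq-gram construction and the bag-based pq-gram distance is given below, following the shift-register formulation from the pq-gram literature; the minimal Node class is a hypothetical stand-in for whatever tree representation is used.

```python
# Sketch of pq-gram extraction for a labeled ordered tree (shift-register formulation).
from collections import Counter

class Node:                                   # hypothetical minimal tree node
    def __init__(self, label, children=()):
        self.label, self.children = label, list(children)

def pq_grams(root, p=2, q=3):
    grams = []
    def visit(node, anc):
        anc = (anc + [node.label])[-p:]       # shift the ancestor register
        sib = ['*'] * q
        if not node.children:
            grams.append(tuple(anc + sib))
        else:
            for child in node.children:
                sib = (sib + [child.label])[-q:]
                grams.append(tuple(anc + sib))
                visit(child, anc)
            for _ in range(q - 1):            # trailing dummy siblings
                sib = (sib + ['*'])[-q:]
                grams.append(tuple(anc + sib))
    visit(root, ['*'] * p)
    return Counter(grams)                     # bag (multiset) of pq-grams

def pq_gram_distance(t1, t2, p=2, q=3):
    i1, i2 = pq_grams(t1, p, q), pq_grams(t2, p, q)
    shared = sum((i1 & i2).values())
    return 1.0 - 2.0 * shared / (sum(i1.values()) + sum(i2.values()))

a = Node('a', [Node('b'), Node('c', [Node('d')])])
b = Node('a', [Node('b'), Node('c', [Node('e')])])
print(pq_gram_distance(a, b))
```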

  19. Análise do lead time nos processos logísticos de uma rede varejista de flores/Analysis of the lead time in the logistics processes of a retail flower network

    National Research Council Canada - National Science Library

    Luciana Torres Correia de Mello; Hugo Carlos Mansano Dornfeld; Givaldo Guilherme dos Santos; Débora Passos; Rafael Ribeiro; Moacir Godinho Filho

    2016-01-01

    .... The aim of this study was to analyze the lead time of logistics processes of a flower retailer network with headquarters in Campinas/SP using the tool Manufacturing Critical-path Time of the Quick...

  20. Application of the Mixed-language Technology to LEAD3.0 Logging Software

    Institute of Scientific and Technical Information of China (English)

    张娟; 周军; 余春昊; 李国军

    2011-01-01

    Mixed-language programming is the process of building programs in which the source code is written in two or more languages. Although mixed-language programming presents some additional challenges, it is worthwhile because it makes it possible to call existing code that may be written in another language. With the development of well logging and computer science, the traditional single system and single application can no longer fully meet the needs of users. Multi-system integration, data sharing, and parallel processing of multiple classes of applications have become the development trend of logging processing and management software. Developing data-source-based desktop and network applications is also a main direction of logging software development. Mixed-language programming technology can be used to develop different types of application programs on one data source; to integrate modules developed in multiple languages into one system, reducing the cost and risk of module reconstruction; and to exchange data among different logging systems, facilitating data flow between systems and reducing data loss and anomalies during transmission and conversion. A bottom-layer language (e.g., assembly language) is used to implement small sections of time-critical code in order to enhance certain functions of a more efficient high-level development system. The prospects of mixed-language programming in logging software applications are discussed. The mixed-language technology has been applied in the LEAD 3.0 system with good results.

  1. http Log Analysis

    DEFF Research Database (Denmark)

    Bøving, Kristian Billeskov; Simonsen, Jesper

    2004-01-01

    This article documents how log analysis can inform qualitative studies concerning the usage of web-based information systems (WIS). No prior research has used http log files as data to study collaboration between multiple users in organisational settings. We investigate how to perform http log...... analysis; what http log analysis says about the nature of collaborative WIS use; and how results from http log analysis may support other data collection methods such as surveys, interviews, and observation. The analysis of log files initially lends itself to research designs, which serve to test...... hypotheses using a quantitative methodology. We show that http log analysis can also be valuable in qualitative research such as case studies. The results from http log analysis can be triangulated with other data sources and for example serve as a means of supporting the interpretation of interview data...

  3. Log N-log S is inconclusive

    Science.gov (United States)

    Klebesadel, R. W.; Fenimore, E. E.; Laros, J.

    1983-01-01

    The log N-log S data acquired by the Pioneer Venus Orbiter Gamma Burst Detector (PVO) are presented and compared to similar data from the Soviet KONUS experiment. Although the PVO data are consistent with and suggestive of a -3/2 power law distribution, the results are not adequate at this state of observations to differentiate between a -3/2 and a -1 power law slope.

  4. BFKL equation for the adjoint representation of the gauge group in the next-to-leading approximation at N=4 SUSY

    Energy Technology Data Exchange (ETDEWEB)

    Fadin, V.S. [Hamburg Univ. (Germany). 2. Inst. fuer Theoretische Physik; Budker Nuclear Physics Institute, Novosibirsk (Russian Federation); Novosibirskij Gosudarstvennyj Univ., Novosibirsk (Russian Federation)]; Lipatov, L.N. [Hamburg Univ. (Germany). 2. Inst. fuer Theoretische Physik; Petersburg Nuclear Physics Institute, Gatchina (Russian Federation); St. Petersburg State Univ., Gatchina (Russian Federation)]

    2011-12-15

    We calculate the eigenvalues of the next-to-leading kernel for the BFKL equation in the adjoint representation of the gauge group SU(N_c) in the N=4 supersymmetric Yang-Mills model. These eigenvalues are used to obtain the high energy behavior of the remainder function for the 6-point scattering amplitude with the maximal helicity violation in the kinematical regions containing the Mandelstam cut contribution. The leading and next-to-leading singularities of the corresponding collinear anomalous dimension are calculated in all orders of perturbation theory. We compare our result with the known collinear limit and with the recently suggested ansatz for the remainder function in three loops and obtain full agreement, provided that the numerical parameters in this ansatz are chosen in an appropriate way.

  5. Selective logging in the Brazilian Amazon.

    Science.gov (United States)

    Asner, Gregory P; Knapp, David E; Broadbent, Eben N; Oliveira, Paulo J C; Keller, Michael; Silva, Jose N

    2005-10-21

    Amazon deforestation has been measured by remote sensing for three decades. In comparison, selective logging has been mostly invisible to satellites. We developed a large-scale, high-resolution, automated remote-sensing analysis of selective logging in the top five timber-producing states of the Brazilian Amazon. Logged areas ranged from 12,075 to 19,823 square kilometers per year (+/-14%) between 1999 and 2002, equivalent to 60 to 123% of previously reported deforestation area. Up to 1200 square kilometers per year of logging were observed on conservation lands. Each year, 27 million to 50 million cubic meters of wood were extracted, and a gross flux of approximately 0.1 billion metric tons of carbon was destined for release to the atmosphere by logging.

  6. CCSD Well Logging Engineering Program

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    This paper briefly introduces the tasks and characteristics of China Continent Science Drilling (CCSD) well logging engineering, the logging methods used in CCSD, the quality control of original logging information, the CCSD logging plan, the CCSD logging engineering management, and the logging interpretation, results and reports produced in CCSD.

  7. Power to the logs!

    CERN Document Server

    CERN. Geneva; MACMAHON, Joseph

    2015-01-01

    Are you tired of using grep, vi and emacs to read your logs? Do you feel like you’re missing the big picture? Does the word "statistics" put a smile on your face? Then it’s time to give power to the logs!

  8. The Karlqvist approximation revisited

    CERN Document Server

    Tannous, C

    2015-01-01

    The Karlqvist approximation, which marks the historical beginning of magnetic recording head theory, is reviewed and compared to various approaches, progressing from Green function to Fourier to conformal mapping methods; the latter obeys the Sommerfeld edge condition at angular points and leads to exact results.

  9. Calculation of the Longitudinal Structure Function from Regge-Like Behaviour of the Gluon Distribution Function in Leading Order Approximation at Low x

    Institute of Scientific and Technical Information of China (English)

    G.R. Boroun; B. Rezaie

    2007-01-01

    We present calculations of the longitudinal structure function FL from the DGLAP evolution equation at leading order (LO) at low x, assuming Regge-like behaviour of the gluon distribution in this limit. The calculated results are compared with the H1 data and a QCD fit. It is shown that the obtained results are very close to those of the mentioned methods. The proposed simple analytical relation for FL provides a t-evolution equation for the determination of the longitudinal structure function at low x. All the results can consistently be described within the framework of perturbative QCD and essentially show an increase as x decreases.

  10. Randomized approximate nearest neighbors algorithm.

    Science.gov (United States)

    Jones, Peter Wilcox; Osipov, Andrei; Rokhlin, Vladimir

    2011-09-20

    We present a randomized algorithm for the approximate nearest neighbor problem in d-dimensional Euclidean space. Given N points {x_j} in R^d, the algorithm attempts to find k nearest neighbors for each of x_j, where k is a user-specified integer parameter. The algorithm is iterative, and its running time requirements are proportional to T·N·(d·(log d) + k·(d + log k)·(log N)) + N·k^2·(d + log k), with T the number of iterations performed. The memory requirements of the procedure are of the order N·(d + k). A by-product of the scheme is a data structure, permitting a rapid search for the k nearest neighbors among {x_j} for an arbitrary point x ∈ R^d. The cost of each such query is proportional to T·(d·(log d) + log(N/k)·k·(d + log k)), and the memory requirements for the requisite data structure are of the order N·(d + k) + T·(d + N). The algorithm utilizes random rotations and a basic divide-and-conquer scheme, followed by a local graph search. We analyze the scheme's behavior for certain types of distributions of {x_j} and illustrate its performance via several numerical examples.
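
    A loose, hypothetical illustration of the random-rotation idea (not the authors' algorithm, which adds a proper divide-and-conquer stage and a local graph search): apply a random orthogonal transform, sort along one rotated coordinate, take candidate neighbors from a small window, and refine by exact distances.

```python
# Hypothetical illustration of random-rotation-based approximate k-nearest neighbors.
import numpy as np

def ann_candidates(X, k, n_iter=4, window=8, seed=0):
    rng = np.random.default_rng(seed)
    N, d = X.shape
    cand = {i: {} for i in range(N)}                     # point index -> {candidate: distance}
    for _ in range(n_iter):
        Q, _ = np.linalg.qr(rng.normal(size=(d, d)))     # random orthogonal transform
        order = np.argsort((X @ Q)[:, 0])                # sort by the first rotated coordinate
        for pos, i in enumerate(order):
            for j in order[max(0, pos - window): pos + window + 1]:
                if j != i:
                    cand[i][j] = np.linalg.norm(X[i] - X[j])
    # keep the k closest candidates found for each point
    return {i: sorted(c, key=c.get)[:k] for i, c in cand.items()}

X = np.random.default_rng(1).random((200, 5))
print(ann_candidates(X, k=3)[0])                         # approximate neighbors of point 0
```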

  11. Canonical Angles In A Compact Binary Star System With Spinning Components: Approximative Solution Through Next-To-Leading-Order Spin-Orbit Interaction for Circular Orbits

    CERN Document Server

    Tessmer, Manuel; Schäfer, Gerhard

    2013-01-01

    This publication will deal with an explicit determination of the time evolution of the spin orientation axes and the evolution of the orbital phase in the case of circular orbits under next-to-leading order spin-orbit interactions. We modify the method of Schneider and Cui proposed in ["Theoreme über Bewegungsintegrale und ihre Anwendungen in Bahntheorien", Verlag der Bayerischen Akademie der Wissenschaften, volume 212, 2005] to iteratively remove oscillatory terms in the equations of motion for different masses that were not present in the case of equal masses. Our smallness parameter is chosen to be the difference of the symmetric mass ratio from the value 1/4. Before the first Lie transformation, the set of conserved quantities consists of the total angular momentum and the amplitudes of the orbital angular momentum and of the spins, $L, S_1,$ and $S_2$. In contrast, the magnitude of the total spin $S=|S_1+S_2|$ is not conserved, and we wish to shift its non-conservation to higher orders of the smallness para...

  12. On log surfaces

    CERN Document Server

    Fujino, Osamu

    2012-01-01

    This paper is an announcement of the minimal model theory for log surfaces in all characteristics and contains some related results including a simplified proof of the Artin-Keel contraction theorem in the surface case.

  13. Application of well log normalization in coalfield seismic inversion

    Institute of Scientific and Technical Information of China (English)

    Qing-Xi LIN; Su-Zhen SHI; Shan-Shan LI; Li LUO; Juan LI; Zi-Liang YU

    2013-01-01

    During the process of coal prospecting and exploration, different measurement times and different logging instruments and series can lead to systematic errors in well logs. Accordingly, all logging curves in a mining area need to be normalized. By studying well-logging normalization methods and focusing on the characteristics of the coalfield, the frequency histogram method was used in accordance with the conditions of the Guqiao Coal Mine. In this way, the density and sonic velocity at the marker bed in the non-key wells were made to approach, and eventually equal, those in the key well. Well log normalization was completed when this method was applied to the entire set of logging curves. The results show that the scales of the logging data were unified by normalizing the coal logging curves, and the logging data became consistent with the wave impedance inversion data. A satisfactory inversion effect was obtained.
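
    The normalization idea (rescale the non-key well's readings so that its marker-bed statistics match the key well's) can be sketched as a simple linear match of mean and spread; this is a generic illustration with placeholder values, not the paper's exact frequency-histogram workflow.

```python
# Generic sketch of well log normalization against a key well at a marker bed.
import numpy as np

def normalize_log(curve, marker_values, key_marker_values):
    """Linearly rescale 'curve' so that its marker-bed statistics match those of the key well."""
    m, s = np.mean(marker_values), np.std(marker_values)
    mk, sk = np.mean(key_marker_values), np.std(key_marker_values)
    return (curve - m) / s * sk + mk

# Placeholder density readings (g/cm^3) at the marker bed in the key and non-key wells.
key_marker = np.array([2.68, 2.70, 2.69, 2.71, 2.70])
nonkey_marker = np.array([2.60, 2.62, 2.61, 2.63, 2.62])
nonkey_curve = np.array([2.30, 2.45, 2.61, 2.62, 2.55, 2.40])   # full curve to be normalized

print(normalize_log(nonkey_curve, nonkey_marker, key_marker))
```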

  14. Well Logging Symposium News

    Institute of Scientific and Technical Information of China (English)

    Yang Chunsheng

    1996-01-01

    @@ The ‘96 International Symposium on Well Logging Techniques for Oilfield Development under Waterflood was held on 17-21 September, 1996 in Beijing. The symposium drew more than 160 experts and scholars in the well logging circle from Russia, the United States, France, Britain, Indonesia and China. About 80 papers were presented during the symposium. Mr. Zhang Yongyi, Vice President of CNPC, delivered the opening remarks.

  15. Hierarchical matrix approximation of large covariance matrices

    KAUST Repository

    Litvinenko, Alexander

    2015-11-30

    We approximate large non-structured Matérn covariance matrices of size n×n in the H-matrix format with a log-linear computational cost and storage O(kn log n), where rank k ≪ n is a small integer. Applications are: spatial statistics, machine learning and image analysis, kriging and optimal design.
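
    A small illustration of why hierarchical (H-matrix) compression works for such kernels: an off-diagonal block of a Matérn covariance between two well-separated point clusters has rapidly decaying singular values, so a rank-k factorization stores it in roughly k·n numbers. The ν = 3/2 Matérn kernel, cluster geometry, and tolerance below are illustrative choices, not taken from the record.

```python
import numpy as np

def matern32(r, ell=0.3, sigma2=1.0):
    """Matern covariance with smoothness nu = 3/2."""
    a = np.sqrt(3.0) * r / ell
    return sigma2 * (1.0 + a) * np.exp(-a)

rng = np.random.default_rng(0)
# two well-separated clusters of points in the unit square
X = rng.uniform(0.0, 0.2, size=(300, 2))
Y = rng.uniform(0.8, 1.0, size=(300, 2))

# off-diagonal covariance block between the clusters
R = np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=-1)
B = matern32(R)

# truncated SVD: a rank-k factorization U_k S_k V_k^T, the building block
# of an H-matrix representation of the full covariance matrix
U, s, Vt = np.linalg.svd(B, full_matrices=False)
k = int(np.searchsorted(np.cumsum(s**2) / np.sum(s**2), 0.999999)) + 1
Bk = (U[:, :k] * s[:k]) @ Vt[:k]
print("rank needed:", k,
      "relative error:", np.linalg.norm(B - Bk) / np.linalg.norm(B))
```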

  16. Approximate Representations and Approximate Homomorphisms

    CERN Document Server

    Moore, Cristopher

    2010-01-01

    Approximate algebraic structures play a defining role in arithmetic combinatorics and have found remarkable applications to basic questions in number theory and pseudorandomness. Here we study approximate representations of finite groups: functions f:G -> U_d such that Pr[f(xy) = f(x) f(y)] is large, or more generally Exp_{x,y} ||f(xy) - f(x)f(y)||^2$ is small, where x and y are uniformly random elements of the group G and U_d denotes the unitary group of degree d. We bound these quantities in terms of the ratio d / d_min where d_min is the dimension of the smallest nontrivial representation of G. As an application, we bound the extent to which a function f : G -> H can be an approximate homomorphism where H is another finite group. We show that if H's representations are significantly smaller than G's, no such f can be much more homomorphic than a random function. We interpret these results as showing that if G is quasirandom, that is, if d_min is large, then G cannot be embedded in a small number of dimensi...
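
    A toy numerical illustration of the defect quantity Exp_{x,y} ||f(xy) - f(x)f(y)||^2 in the simplest setting d = 1 and G = Z_n, where an exact character has zero defect and a randomly perturbed one does not. The group, perturbation, and sample size are arbitrary choices for illustration, not taken from the paper.

```python
import numpy as np

def defect(f, n, num_samples=200000, rng=None):
    """Monte Carlo estimate of E_{x,y} |f((x+y) mod n) - f(x) f(y)|^2
    for a map f: Z_n -> U(1) stored as an array of unit complex numbers."""
    rng = np.random.default_rng(rng)
    x = rng.integers(0, n, num_samples)
    y = rng.integers(0, n, num_samples)
    return np.mean(np.abs(f[(x + y) % n] - f[x] * f[y]) ** 2)

n, m = 97, 5
chars = np.exp(2j * np.pi * m * np.arange(n) / n)   # exact character: defect ~ 0
noisy = chars * np.exp(1j * np.random.default_rng(0).normal(0, 0.1, n))  # perturbed map

print("exact character:", defect(chars, n, rng=1))
print("perturbed map  :", defect(noisy, n, rng=1))
```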

  17. Approximate Likelihood

    CERN Document Server

    CERN. Geneva

    2015-01-01

    Most physics results at the LHC end in a likelihood ratio test. This includes discovery and exclusion for searches as well as mass, cross-section, and coupling measurements. The use of Machine Learning (multivariate) algorithms in HEP is mainly restricted to searches, which can be reduced to classification between two fixed distributions: signal vs. background. I will show how we can extend the use of ML classifiers to distributions parameterized by physical quantities like masses and couplings as well as nuisance parameters associated to systematic uncertainties. This allows for one to approximate the likelihood ratio while still using a high dimensional feature vector for the data. Both the MEM and ABC approaches mentioned above aim to provide inference on model parameters (like cross-sections, masses, couplings, etc.). ABC is fundamentally tied Bayesian inference and focuses on the “likelihood free” setting where only a simulator is available and one cannot directly compute the likelihood for the dat...
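
    A minimal sketch of the classifier-based likelihood-ratio idea alluded to above (the "likelihood-ratio trick"): with balanced training samples from two hypotheses, a calibrated classifier score s(x) ≈ P(y=1|x) gives p1(x)/p0(x) ≈ s/(1-s). The one-dimensional Gaussian toy data and the scikit-learn logistic regression are stand-ins for a real simulator and classifier, not the talk's actual setup.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# samples from the two hypotheses (1-D Gaussians standing in for simulator output)
x0 = rng.normal(0.0, 1.0, 20000)    # theta_0 ("background")
x1 = rng.normal(0.5, 1.0, 20000)    # theta_1 ("signal")

X = np.concatenate([x0, x1]).reshape(-1, 1)
y = np.concatenate([np.zeros_like(x0), np.ones_like(x1)])

# for equal-variance Gaussians the log-likelihood ratio is linear in x,
# so a logistic regression can represent s(x) = P(y=1|x) well
clf = LogisticRegression().fit(X, y)

x_test = np.linspace(-2.0, 2.0, 5).reshape(-1, 1)
s = clf.predict_proba(x_test)[:, 1]
lr_from_classifier = s / (1.0 - s)              # approx p1(x)/p0(x)
lr_exact = np.exp(0.5 * x_test[:, 0] - 0.125)   # exact ratio for this toy problem
print(np.round(lr_from_classifier, 3))
print(np.round(lr_exact, 3))
```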

  18. On a modified non-singular log-conformation formulation for Johnson-Segalman viscoelastic fluids

    OpenAIRE

    Saramito, Pierre

    2014-01-01

    A modified log-conformation formulation of viscoelastic fluid flows is presented in this paper. This new formulation is non-singular for vanishing Weissenberg numbers and allows a direct steady numerical resolution by a Newton method. Moreover, an exact computation of all the terms of the linearized problem is provided. The use of an exact divergence-free finite element method for velocity-pressure approximation and a discontinuous Galerkin upwinding treatment for stresses leads to a robust d...

  19. NMR logging apparatus

    Energy Technology Data Exchange (ETDEWEB)

    Walsh, David O; Turner, Peter

    2014-05-27

    Technologies including NMR logging apparatus and methods are disclosed. Example NMR logging apparatus may include surface instrumentation and one or more downhole probes configured to fit within an earth borehole. The surface instrumentation may comprise a power amplifier, which may be coupled to the downhole probes via one or more transmission lines, and a controller configured to cause the power amplifier to generate a NMR activating pulse or sequence of pulses. Impedance matching means may be configured to match an output impedance of the power amplifier through a transmission line to a load impedance of a downhole probe. Methods may include deploying the various elements of disclosed NMR logging apparatus and using the apparatus to perform NMR measurements.

  20. NMR logging apparatus

    Science.gov (United States)

    Walsh, David O; Turner, Peter

    2014-05-27

    Technologies including NMR logging apparatus and methods are disclosed. Example NMR logging apparatus may include surface instrumentation and one or more downhole probes configured to fit within an earth borehole. The surface instrumentation may comprise a power amplifier, which may be coupled to the downhole probes via one or more transmission lines, and a controller configured to cause the power amplifier to generate a NMR activating pulse or sequence of pulses. Impedance matching means may be configured to match an output impedance of the power amplifier through a transmission line to a load impedance of a downhole probe. Methods may include deploying the various elements of disclosed NMR logging apparatus and using the apparatus to perform NMR measurements.

  1. Log4J

    CERN Document Server

    Perry, Steven

    2009-01-01

    Log4j has been around for a while now, and it seems like so many applications use it. I've used it in my applications for years now, and I'll bet you have too. But every time I need to do something with log4j that I've never done before, I find myself searching for examples of how to do whatever that is, and I don't usually have much luck. I believe the reason for this is that there is not a great deal of useful information about log4j, either in print or on the Internet. The information is too simple to be of real-world use, too complicated to be distilled quickly (which is what most developers

  2. Approximate Distance Oracles with Improved Query Time

    CERN Document Server

    Wulff-Nilsen, Christian

    2012-01-01

    Given an undirected graph $G$ with $m$ edges, $n$ vertices, and non-negative edge weights, and given an integer $k\\geq 2$, we show that a $(2k-1)$-approximate distance oracle for $G$ of size $O(kn^{1 + 1/k})$ and with $O(\\log k)$ query time can be constructed in $O(\\min\\{kmn^{1/k},\\sqrt km + kn^{1 + c/\\sqrt k}\\})$ time for some constant $c$. This improves the $O(k)$ query time of Thorup and Zwick. For any $0 0$ and $k = O(\\log n/\\log\\log n)$.

  3. Logging on to Learn

    Science.gov (United States)

    Butler, Kevin

    2010-01-01

    A classroom lecture at Capistrano Connections Academy in Southern California involves booting up the home computer, logging on to a Web site, and observing a teacher conducting a PowerPoint presentation of that day's lesson entirely online. Through microphone headsets, students can watch on their home computers, respond to the teacher's questions,…

  4. Local spline approximants

    OpenAIRE

    Norton, Andrew H.

    1991-01-01

    Local spline approximants offer a means for constructing finite difference formulae for numerical solution of PDEs. These formulae seem particularly well suited to situations in which the use of conventional formulae leads to non-linear computational instability of the time integration. This is explained in terms of frequency responses of the FDF.

  5. Approximation by Cylinder Surfaces

    DEFF Research Database (Denmark)

    Randrup, Thomas

    1997-01-01

    We present a new method for approximation of a given surface by a cylinder surface. It is a constructive geometric method, leading to a monorail representation of the cylinder surface. By use of a weighted Gaussian image of the given surface, we determine a projection plane. In the orthogonal...

  6. Convergence of posteriors for discretized log Gaussian Cox processes

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus Plenge

    2004-01-01

    In Markov chain Monte Carlo posterior computation for log Gaussian Cox processes (LGCPs) a discretization of the continuously indexed Gaussian field is required. It is demonstrated that approximate posterior expectations computed from discretized LGCPs converge to the exact posterior expectations...

  7. Regularized Multitask Learning for Multidimensional Log-Density Gradient Estimation.

    Science.gov (United States)

    Yamane, Ikko; Sasaki, Hiroaki; Sugiyama, Masashi

    2016-07-01

    Log-density gradient estimation is a fundamental statistical problem and possesses various practical applications such as clustering and measuring nongaussianity. A naive two-step approach of first estimating the density and then taking its log gradient is unreliable because an accurate density estimate does not necessarily lead to an accurate log-density gradient estimate. To cope with this problem, a method to directly estimate the log-density gradient without density estimation has been explored and demonstrated to work much better than the two-step method. The objective of this letter is to improve the performance of this direct method in multidimensional cases. Our idea is to regard the problem of log-density gradient estimation in each dimension as a task and apply regularized multitask learning to the direct log-density gradient estimator. We experimentally demonstrate the usefulness of the proposed multitask method in log-density gradient estimation and mode-seeking clustering.

  8. Approximability and Parameterized Complexity of Minmax Values

    DEFF Research Database (Denmark)

    Hansen, Kristoffer Arnsfelt; Hansen, Thomas Dueholm; Miltersen, Peter Bro;

    2008-01-01

    We consider approximating the minmax value of a multi player game in strategic form. Tightening recent bounds by Borgs et al., we observe that approximating the value with a precision of ε log n digits (for any constant ε > 0) is NP-hard, where n is the size of the game. On the other hand...

  9. China Offshore Oil Logging Technology

    Institute of Scientific and Technical Information of China (English)

    Duan Kang

    1996-01-01

    @@ China's offshore oil logging business entered a faster developing stage since 1982, with the beginning of international cooperation in its offshore oil exploration. Nearly 90% of the logging experts of China National Offshore Oil Corporation (CNOOC) are in China Offshore Oil Logging Company (COOLC), headquartered in Yanjiao, Hebei Province.

  10. Grid Logging: Best Practices Guide

    Energy Technology Data Exchange (ETDEWEB)

    Tierney, Brian L; Tierney, Brian L; Gunter, Dan

    2008-04-01

    The purpose of this document is to help developers of Grid middleware and application software generate log files that will be useful to Grid administrators, users, developers and Grid middleware itself. Currently, most generated log files are only useful to the author of the program. Good logging practices are instrumental to performance analysis, problem diagnosis, and security auditing tasks such as incident tracing and damage assessment. This document does not discuss the issue of a logging API. It is assumed that a standard log API such as syslog (C), log4j (Java), or logger (Python) is being used. Other custom logging APIs or even printf could be used. The key point is that the logs must contain the required information in the required format. At a high level of abstraction, the best practices for Grid logging are: (1) Consistently structured, typed, log events; (2) A standard high-resolution timestamp; (3) Use of logging levels and categories to separate logs by detail and purpose; (4) Consistent use of global and local identifiers; and (5) Use of some regular, newline-delimited ASCII text format. The rest of this document describes each of these recommendations in detail.
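
    A hedged sketch, using Python's standard logging module (one of the APIs the report names), of a few of the listed practices: newline-delimited ASCII events with a consistent key=value structure, timestamps, explicit levels and categories, and a global identifier carried across related events. The component name, event names, and file path below are made up for illustration.

```python
import logging
import uuid

# sketch of the best practices: newline-delimited ASCII, timestamped events,
# explicit level/category, and a consistent key=value event structure
logging.basicConfig(
    format="ts=%(asctime)s level=%(levelname)s component=%(name)s %(message)s",
    datefmt="%Y-%m-%dT%H:%M:%S",
    level=logging.INFO,
)
log = logging.getLogger("gridftp.transfer")   # illustrative component name

# a global identifier carried through all events of one logical operation
transfer_id = uuid.uuid4()
log.info("event=transfer.start id=%s file=%s bytes=%d",
         transfer_id, "/data/run42.dat", 1_048_576)
log.info("event=transfer.end   id=%s status=%s seconds=%.3f",
         transfer_id, "OK", 2.417)
```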

  11. Dynamic Planar Convex Hull with Optimal Query Time and O(log n · log log n ) Update Time

    DEFF Research Database (Denmark)

    Brodal, Gerth Stølting; Jakob, Riko

    2000-01-01

    The dynamic maintenance of the convex hull of a set of points in the plane is one of the most important problems in computational geometry. We present a data structure supporting point insertions in amortized O(log n · log log log n) time, point deletions in amortized O(log n · log log n) time...

  12. Hierarchical matrix approximation of large covariance matrices

    KAUST Repository

    Litvinenko, Alexander

    2015-01-07

    We approximate large non-structured covariance matrices in the H-matrix format with a log-linear computational cost and storage O(n log n). We compute inverse, Cholesky decomposition and determinant in H-format. As an example we consider the class of Matern covariance functions, which are very popular in spatial statistics, geostatistics, machine learning and image analysis. Applications are: kriging and optimal design

  13. Well logging utilizing superposition of step-profile responses of logging tools to improve logs

    Energy Technology Data Exchange (ETDEWEB)

    Edmundson, H. N.

    1984-11-20

    Disclosed is a method of improving well logs, such as induction logs and laterologs, by taking into account the effect on a log measurement both of the particular bed in which it is taken and of one or more other beds. In one example the process starts with an original induction log and a provisional layered formation which is based thereon and is characterized by bed boundaries and constant induction log levels within a bed. A provisional reconstructed log is built up by applying the tool response to the layered formation by a new technique which makes this expedient enough to be practicable. The reconstructed log is matched against the original log, and the layered formation is refined accordingly, by adding and/or shifting boundaries and/or by changing the measurement levels within beds, until the match is satisfactory. The original log is then converted to an improved log on the basis of the latest layered formation. Modifications include use of laterolog measurements to improve laterologs and to improve induction logs.

  14. Interpretation of horizontal well production logs: influence of logging tool

    Energy Technology Data Exchange (ETDEWEB)

    Ozkan, E. [Colorado School of Mines, Boulder, CO (United States); Sarica, C. [Pennsylvania State Univ., College Park, PA (United States); Haci, M. [Drilling Measurements, Inc (United States)

    1998-12-31

    The influence of a production-logging tool on wellbore flow rate and pressure measurements was investigated, focusing on the disturbance caused by the production-logging tool and the coiled tubing to the original flow conditions in the wellbore. The investigation was carried out using an analytical model, and single-phase liquid flow was assumed. Results showed that the production-logging tool influences the measurements, as shown by deviations from the original flow-rate and pressure profiles, particularly in low-conductivity wellbores. High production rates increase the effect of the production-logging tool. Recovering or inferring the original flow conditions in the wellbore from the production-logging data is a very complex process which cannot be solved easily. For this reason, the conditions under which the information obtained by production logging is meaningful are of considerable practical interest. 7 refs., 2 tabs., 15 figs.

  15. Well logging: utilizing superposition of step-profile responses of logging tools to improve logs

    Energy Technology Data Exchange (ETDEWEB)

    Lacour-Gayet, Ph. J.

    1984-12-04

    Disclosed is a method of improving well logs, such as induction logs and LATEROLOGS, by taking into account the effect on a log measurement both of the particular bed in which it is taken and of one or more other beds. In one example the process starts with an original induction log and a provisional layered formation which is based thereon and is characterized by bed boundaries and constant induction log levels within a bed. A provisional reconstructed log is built up by applying the tool response to the layered formation by a new technique which makes this expedient enough to be practicable. The reconstructed log is matched against the original log, and the layered formation is refined accordingly, by adding and/or shifting boundaries and/or by changing the measurement levels within beds, until the match is satisfactory. The original log is then converted to an improved log on the basis of the latest layered formation. Modifications include use of LATEROLOG measurements to improve LATEROLOGS and to improve induction logs.

  16. Well logging utilizing superposition of step-profile responses of logging tools to improve logs

    Energy Technology Data Exchange (ETDEWEB)

    Minne, J.-C.

    1984-11-13

    Disclosed is a method of improving well logs, such as induction logs and LATEROLOGS, by taking into account the effect on a log measurement both of the particular bed in which it is taken and of one or more other beds. In one example the process starts with an original induction log and a provisional layered formation which is based thereon and is characterized by bed boundaries and constant induction log levels within a bed. A provisional reconstructed log is built up by applying the tool response to the layered formation by a new technique which makes this expedient enough to be practicable. The reconstructed log is matched against the original log, and the layered formation is refined accordingly, by adding and/or shifting boundaries and/or by changing the measurement levels within beds, until the match is satisfactory. The original log is then converted to an improved log on the basis of the latest layered formation. Modifications include use of LATEROLOG measurements to improve LATEROLOGS and to improve induction logs.

  17. Laser scanning measurements on trees for logging harvesting operations.

    Science.gov (United States)

    Zheng, Yili; Liu, Jinhao; Wang, Dian; Yang, Ruixi

    2012-01-01

    Logging harvesters represent a set of high-performance modern forestry machinery, which can finish a series of continuous operations such as felling, delimbing, peeling, bucking and so forth with human intervention. It is found by experiment that during the process of aligning the harvesting head to capture the trunk, the operator needs a lot of observation, judgment and repeated operations, which leads to time and fuel losses. In order to improve the operation efficiency and reduce the operating costs, the point clouds for standing trees are collected with a low-cost 2D laser scanner. A cluster extracting algorithm and a filtering algorithm are used to classify each trunk from the point cloud. On the assumption that every cross section of the target trunk is approximately a standard circle, and combining the information of an Attitude and Heading Reference System, the radii and center locations of the trunks in the scanning range are calculated by the Fletcher-Reeves conjugate gradient algorithm. The method is validated through experiments in an aspen forest, and the optimized calculation time consumption is compared with the previous work of other researchers. Moreover, the implementation of the calculation result for automatic capture of trunks by the harvesting head during the logging operation is discussed in particular.
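
    A rough sketch of the cross-section fitting step: estimate a trunk's center and radius from 2D laser returns by least-squares minimization with a nonlinear conjugate-gradient routine (SciPy's 'CG', a Polak-Ribière variant, standing in here for the Fletcher-Reeves method named above). The scan geometry, noise level, and initial guess are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

def fit_circle_cg(points):
    """Fit a circle (a, b, r) to 2-D points by conjugate-gradient minimization
    of the sum of squared radial residuals sum_i (|p_i - (a, b)| - r)^2."""
    def cost(params):
        a, b, r = params
        d = np.hypot(points[:, 0] - a, points[:, 1] - b)
        return np.sum((d - r) ** 2)

    # crude initial guess: centroid and mean distance to it
    a0, b0 = points.mean(axis=0)
    r0 = np.hypot(points[:, 0] - a0, points[:, 1] - b0).mean()
    res = minimize(cost, x0=[a0, b0, r0], method="CG")
    return res.x

# synthetic scan of part of a trunk of radius 0.18 m centred at (2.0, 0.5)
rng = np.random.default_rng(0)
theta = rng.uniform(np.pi / 3, np.pi, 200)
pts = np.c_[2.0 + 0.18 * np.cos(theta), 0.5 + 0.18 * np.sin(theta)]
pts += rng.normal(0, 0.004, pts.shape)    # ranging noise
print(fit_circle_cg(pts))                 # approximately [2.0, 0.5, 0.18]
```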

  18. Laser Scanning Measurements on Trees for Logging Harvesting Operations

    Directory of Open Access Journals (Sweden)

    Ruixi Yang

    2012-07-01

    Full Text Available Logging harvesters represent a set of high-performance modern forestry machinery, which can finish a series of continuous operations such as felling, delimbing, peeling, bucking and so forth with human intervention. It is found by experiment that during the process of aligning the harvesting head to capture the trunk, the operator needs a lot of observation, judgment and repeated operations, which leads to time and fuel losses. In order to improve the operation efficiency and reduce the operating costs, the point clouds for standing trees are collected with a low-cost 2D laser scanner. A cluster extracting algorithm and a filtering algorithm are used to classify each trunk from the point cloud. On the assumption that every cross section of the target trunk is approximately a standard circle, and combining the information of an Attitude and Heading Reference System, the radii and center locations of the trunks in the scanning range are calculated by the Fletcher-Reeves conjugate gradient algorithm. The method is validated through experiments in an aspen forest, and the optimized calculation time consumption is compared with the previous work of other researchers. Moreover, the implementation of the calculation result for automatic capture of trunks by the harvesting head during the logging operation is discussed in particular.

  19. Minimal Log Gravity

    CERN Document Server

    Giribet, Gaston

    2014-01-01

    Minimal Massive Gravity (MMG) is an extension of three-dimensional Topologically Massive Gravity that, when formulated about Anti-de Sitter space, accomplishes to solve the tension between bulk and boundary unitarity that other models in three dimensions suffer from. We study this theory at the chiral point, i.e. at the point of the parameter space where one of the central charges of the dual conformal field theory vanishes. We investigate the non-linear regime of the theory, meaning that we study exact solutions to the MMG field equations that are not Einstein manifolds. We exhibit a large class of solutions of this type, which behave asymptotically in different manners. In particular, we find analytic solutions that represent two-parameter deformations of extremal Banados-Teitelboim-Zanelli (BTZ) black holes. These geometries behave asymptotically as solutions of the so-called Log Gravity, and, despite the weakened falling-off close to the boundary, they have finite mass and finite angular momentum, which w...

  20. Minimal log gravity

    Science.gov (United States)

    Giribet, Gaston; Vásquez, Yerko

    2015-01-01

    Minimal massive gravity (MMG) is an extension of three-dimensional topologically massive gravity that, when formulated about anti-de Sitter space, accomplishes solving the tension between bulk and boundary unitarity that other models in three dimensions suffer from. We study this theory at the chiral point, i.e. at the point of the parameter space where one of the central charges of the dual conformal field theory vanishes. We investigate the nonlinear regime of the theory, meaning that we study exact solutions to the MMG field equations that are not Einstein manifolds. We exhibit a large class of solutions of this type, which behave asymptotically in different manners. In particular, we find analytic solutions that represent two-parameter deformations of extremal Bañados-Teitelboim-Zanelli black holes. These geometries behave asymptotically as solutions of the so-called log gravity, and, despite the weakened falling off close to the boundary, they have finite mass and finite angular momentum, which we compute. We also find time-dependent deformations of Bañados-Teitelboim-Zanelli that obey Brown-Henneaux asymptotic boundary conditions. The existence of such solutions shows that the Birkhoff theorem does not hold in MMG at the chiral point. Other peculiar features of the theory at the chiral point, such as the degeneracy it exhibits in the decoupling limit, are discussed.

  1. Approximation for Bayesian Ability Estimation.

    Science.gov (United States)

    1987-02-18

    The marginal posterior pdfs of the ability and item parameters are obtained by integrating the joint posterior over the remaining parameters (equations (4) and (5)). As shown in Tsutakawa and Lin, the marginal posterior pdf of the ability parameter is then approximated, under regularity conditions, using the inverse Hessian of the log posterior evaluated at its mode. (Journal of Educational Statistics, 11, 33-56; Lindley, D.V. (1980). Approximate Bayesian methods. Trabajos de Estadística, 31.)

  2. SNG-logs at Skjern

    DEFF Research Database (Denmark)

    Korsbech, Uffe C C; Petersen, Jesper; Aage, Helle Karina

    1998-01-01

    Spectral Natural Gamma-ray logs have been run in two water supply borings at Skjern. The log data have been examined by a new technique - Noise Adjusted Singular Value Decomposition - in order to get a detailed and reliable picture of the distribution of uranium and thorium gamma-rays from heavy...

  3. Approximating Frequent Items in Asynchronous Data Stream over a Sliding Window

    DEFF Research Database (Denmark)

    Ting, Hing-Fung; Lee, Lap Kei; Chan, Ho-Leung;

    2011-01-01

    In an asynchronous data stream, the data items may be out of order with respect to their original timestamps. This paper studies the space complexity required by a data structure to maintain such a data stream so that it can approximate the set of frequent items over a sliding time window with sufficient accuracy. Prior to our work, the best solution is given by Cormode et al. [1], who gave an O((1/ε) log W log(εB/log W) min{log W, 1/ε} log |U|)-space data structure that can approximate the frequent items within an ε error bound, where W and B are parameters of the sliding window, and U is the set of all possible item names. We gave a more space-efficient data structure that only requires O((1/ε) log W log(εB/log W) log log W) space.

  4. Acoustic Logging Modeling by Refined Biot's Equations

    Science.gov (United States)

    Plyushchenkov, Boris D.; Turchaninov, Victor I.

    An explicit uniform completely conservative finite difference scheme for the refined Biot's equations is proposed. This system is modified according to the modern theory of dynamic permeability and tortuosity in a fluid-saturated elastic porous media. The approximate local boundary transparency conditions are constructed. The acoustic logging device is simulated by the choice of appropriate boundary conditions on its external surface. This scheme and these conditions are satisfactory for exploring borehole acoustic problems in permeable formations in a real axial-symmetrical situation. The developed approach can be adapted for a nonsymmetric case also.

  5. Preserving Privacy in Transparency Logging

    OpenAIRE

    Pulls, Tobias

    2015-01-01

    The subject of this dissertation is the construction of privacy-enhancing technologies (PETs) for transparency logging, a technology at the intersection of privacy, transparency, and accountability. Transparency logging facilitates the transportation of data from service providers to users of services and is therefore a key enabler for ex-post transparency-enhancing tools (TETs). Ex-post transparency provides information to users about how their personal data have been processed by service pr...

  6. Approximation properties of haplotype tagging

    Directory of Open Access Journals (Sweden)

    Dreiseitl Stephan

    2006-01-01

    Full Text Available Abstract Background Single nucleotide polymorphisms (SNPs) are locations at which the genomic sequences of population members differ. Since these differences are known to follow patterns, disease association studies are facilitated by identifying SNPs that allow the unique identification of such patterns. This process, known as haplotype tagging, is formulated as a combinatorial optimization problem and analyzed in terms of complexity and approximation properties. Results It is shown that the tagging problem is NP-hard but approximable within 1 + ln((n^2 - n)/2) for n haplotypes, but not approximable within (1 - ε) ln(n/2) for any ε > 0 unless NP ⊂ DTIME(n^(log log n)). A simple, very easily implementable algorithm that exhibits the above upper bound on solution quality is presented. This algorithm has running time O((2m - p + 1)...) ≤ O(m(n^2 - n)/2), where p ≤ min(n, m), for n haplotypes of size m. As we show that the approximation bound is asymptotically tight, the algorithm presented is optimal with respect to this asymptotic bound. Conclusion The haplotype tagging problem is hard, but approachable with a fast, practical, and surprisingly simple algorithm that cannot be significantly improved upon on a single processor machine. Hence, significant improvement in computational efforts expended can only be expected if the computational effort is distributed and done in parallel.

  7. Weighted approximation with varying weight

    CERN Document Server

    Totik, Vilmos

    1994-01-01

    A new construction is given for approximating a logarithmic potential by a discrete one. This yields a new approach to approximation with weighted polynomials of the form w^n P_n. The new technique settles several open problems, and it leads to a simple proof for the strong asymptotics on some L_p extremal problems on the real line with exponential weights, which, for the case p=2, are equivalent to power-type asymptotics for the leading coefficients of the corresponding orthogonal polynomials. The method is also modified to yield (in a sense) uniformly good approximation on the whole support. This allows one to deduce strong asymptotics in some L_p extremal problems with varying weights. Applications are given, relating to fast decreasing polynomials, asymptotic behavior of orthogonal polynomials and multipoint Pade approximation. The approach is potential-theoretic, but the text is self-contained.

  8. Approximation by Cylinder Surfaces

    DEFF Research Database (Denmark)

    Randrup, Thomas

    1997-01-01

    We present a new method for approximation of a given surface by a cylinder surface. It is a constructive geometric method, leading to a monorail representation of the cylinder surface. By use of a weighted Gaussian image of the given surface, we determine a projection plane. In the orthogonal...... projection of the surface onto this plane, a reference curve is determined by use of methods for thinning of binary images. Finally, the cylinder surface is constructed as follows: the directrix of the cylinder surface is determined by a least squares method minimizing the distance to the points...... in the projection within a tolerance given by the reference curve, and the rulings are lines perpendicular to the projection plane. Application of the method in ship design is given....

  9. A Modified max-log-MAP Decoding Algorithm for Turbo Decoding

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Turbo decoding is iterative decoding, and the MAP algorithm is optimal in terms of performance in Turbo decoding. The log-MAP algorithm is the MAP executed in the logarithmic domain, so it is also optimal. Both the MAP and the log-MAP algorithm are complicated to implement. The max-log-MAP algorithm is derived from the log-MAP with an approximation; it is simpler than the log-MAP algorithm but is suboptimal in terms of performance. A modified max-log-MAP algorithm is presented in this paper, based on the Taylor series of the logarithm and the exponent. Analysis and simulation results show that the modified max-log-MAP algorithm outperforms the max-log-MAP algorithm with almost the same complexity.
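
    A numeric sketch of the approximations being compared: the exact Jacobian logarithm max*(a,b) = max(a,b) + ln(1 + e^(-|a-b|)) used by log-MAP, the bare max used by max-log-MAP, and a cheap clipped-linear correction standing in for the paper's Taylor-series-based modification (the paper's exact formula is not reproduced here).

```python
import numpy as np

def max_star(a, b):
    """Exact Jacobian logarithm ln(e^a + e^b), as used by log-MAP."""
    return np.maximum(a, b) + np.log1p(np.exp(-np.abs(a - b)))

def max_log(a, b):
    """max-log-MAP approximation: drop the correction term entirely."""
    return np.maximum(a, b)

def max_log_corrected(a, b):
    """Cheap correction in the spirit of a 'modified' max-log-MAP:
    approximate ln(1 + e^{-x}) by a clipped linear function of x = |a-b|.
    (Illustrative stand-in, not the paper's exact Taylor-series formula.)"""
    x = np.abs(a - b)
    return np.maximum(a, b) + np.maximum(0.0, np.log(2.0) - 0.25 * x)

a = np.linspace(-4, 4, 9)
b = np.zeros_like(a)
for f in (max_star, max_log, max_log_corrected):
    print(f.__name__, np.round(f(a, b), 3))
```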

  10. Log-periodic Critical Amplitudes: A Perturbative Approach

    Science.gov (United States)

    Derrida, Bernard; Giacomin, Giambattista

    2013-06-01

    Log-periodic amplitudes appear in the critical behavior of a large class of systems, in particular when a discrete scale invariance is present. Here we show how to compute these critical amplitudes perturbatively when they originate from a renormalization map which is close to a monomial. In this case, the log-periodic amplitudes of the subdominant corrections to the leading critical behavior can also be calculated.

  11. User Behavior Analysis from Web Log using Log Analyzer Tool

    Directory of Open Access Journals (Sweden)

    Brijesh Bakariya

    2013-11-01

    Full Text Available Nowadays, the internet plays the role of a huge database in which many websites, information sources and search engines are available. But due to the unstructured and semi-structured data in webpages, it has become a challenging task to extract relevant information. The main reason is that traditional knowledge-based techniques cannot utilize this knowledge efficiently, because it consists of many discovered patterns and contains a lot of noise and uncertainty. In this paper, web usage mining is analyzed with the help of web log data, using the web log analyzer tool "Deep Log Analyzer" to extract summary information from a particular server; we also try to determine user behavior and develop an ontology describing the relations among the elements of web usage mining.

  12. Diophantine approximation and badly approximable sets

    DEFF Research Database (Denmark)

    Kristensen, S.; Thorn, R.; Velani, S.

    2006-01-01

    Let (X,d) be a metric space and (Omega, d) a compact subspace of X which supports a non-atomic finite measure m. We consider `natural' classes of badly approximable subsets of Omega. Loosely speaking, these consist of points in Omega which `stay clear' of some given set of points in X. The classical set Bad of `badly approximable' numbers in the theory of Diophantine approximation falls within our framework as do the sets Bad(i,j) of simultaneously badly approximable numbers. Under various natural conditions we prove that the badly approximable subsets of Omega have full Hausdorff dimension

  13. Mean shift based log-Gabor wavelet image coding

    Institute of Scientific and Technical Information of China (English)

    LI Ji-liang; FANG Xiang-zhong; HOU Jun

    2007-01-01

    In this paper, we propose a sparse overcomplete image approximation method based on the ideas of overcomplete log-Gabor wavelet, mean shift and energy concentration. The proposed approximation method selects the necessary wavelet coefficients with a mean shift based algorithm, and concentrates energy on the selected coefficients. It can sparsely approximate the original image, and converges faster than the existing local competition based method. Then, we propose a new compression scheme based on the above approximation method. The scheme has compression performance similar to JPEG 2000. The images decoded with the proposed compression scheme appear more pleasant to the human eyes than those with JPEG 2000.

  14. Log-periodic route to fractal functions.

    Science.gov (United States)

    Gluzman, S; Sornette, D

    2002-03-01

    Log-periodic oscillations have been found to decorate the usual power-law behavior found to describe the approach to a critical point, when the continuous scale-invariance symmetry is partially broken into a discrete-scale invariance symmetry. For Ising or Potts spins with ferromagnetic interactions on hierarchical systems, the relative magnitude of the log-periodic corrections are usually very small, of order 10^(-5). In growth processes [diffusion limited aggregation (DLA)], rupture, earthquake, and financial crashes, log-periodic oscillations with amplitudes of the order of 10% have been reported. We suggest a "technical" explanation for this 4 order-of-magnitude difference based on the property of the "regular function" g(x) embodying the effect of the microscopic degrees of freedom summed over in a renormalization group (RG) approach F(x) = g(x) + mu^(-1) F(gamma x) of an observable F as a function of a control parameter x. For systems for which the RG equation has not been derived, the previous equation can be understood as a Jackson q-integral, which is the natural tool for describing discrete-scale invariance. We classify the "Weierstrass-type" solutions of the RG into two classes characterized by the amplitudes A_n of the power-law series expansion. These two classes are separated by a novel "critical" point. Growth processes (DLA), rupture, earthquake, and financial crashes thus seem to be characterized by oscillatory or bounded regular microscopic functions that lead to a slow power-law decay of A_n, giving strong log-periodic amplitudes. If in addition, the phases of A_n are ergodic and mixing, the observable presents self-affine nondifferentiable properties. In contrast, the regular function of statistical physics models with "ferromagnetic"-type interactions at equilibrium involves unbound logarithms of polynomials of the control variable that lead to a fast exponential decay of A_n, giving weak log-periodic amplitudes and smoothed observables.

  15. Testing and analysis of internal hardwood log defect prediction models

    Science.gov (United States)

    R. Edward. Thomas

    2011-01-01

    The severity and location of internal defects determine the quality and value of lumber sawn from hardwood logs. Models have been developed to predict the size and position of internal defects based on external defect indicator measurements. These models were shown to predict approximately 80% of all internal knots based on external knot indicators. However, the size...

  16. Mud Logging; Control geologico en perforaciones petroliferas (Mud Logging)

    Energy Technology Data Exchange (ETDEWEB)

    Pumarega Lafuente, J.C.

    1994-12-31

    Mud logging is an important activity in the oil field and a key job in drilling operations. Our duties are the acquisition, collection and interpretation of the geological and engineering data at the wellsite, as well as informing the client immediately of any significant changes in the well. (Author)

  17. Counting independent sets using the Bethe approximation

    Energy Technology Data Exchange (ETDEWEB)

    Chertkov, Michael [Los Alamos National Laboratory]; Chandrasekaran, V. [MIT]; Gamarnik, D. [MIT]; Shah, D. [MIT]; Shin, J. [MIT]

    2009-01-01

    The authors consider the problem of counting the number of independent sets, or the partition function of a hard-core model, in a graph. The problem in general is computationally hard (#P-hard). They study the quality of the approximation provided by the Bethe free energy. Belief propagation (BP) is a message-passing algorithm that can be used to compute fixed points of the Bethe approximation; however, BP is not always guaranteed to converge. As the first result, they propose a simple message-passing algorithm that converges to a BP fixed point for any graph. They find that their algorithm converges within a multiplicative error 1 + ε of a fixed point in O(n^2 ε^(-4) log^3(n ε^(-1))) iterations for any bounded-degree graph of n nodes. In a nutshell, the algorithm can be thought of as a modification of BP with 'time-varying' message-passing. Next, they analyze the resulting error in the number of independent sets provided by such a fixed point of the Bethe approximation. Using the recently developed loop calculus approach by Chertkov and Chernyak, they establish that for any bounded-degree graph with large enough girth, the error is O(n^(-γ)) for some γ > 0. As an application, they find that for a random 3-regular graph, the Bethe approximation of the log-partition function (the log of the number of independent sets) is within o(1) of the correct log-partition function - this is quite surprising as previous physics-based predictions were expecting an error of o(n). In sum, their results provide a systematic way to find Bethe fixed points for any graph quickly and allow for estimating the error in the Bethe approximation using novel combinatorial techniques.
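
    A minimal sketch of the Bethe approximation for this counting problem: run plain loopy BP (not the authors' provably convergent variant) for the hard-core model with activity 1 on a small graph, form node and edge beliefs, evaluate the Bethe free energy, and compare the resulting estimate of log(#independent sets) with brute-force enumeration. The 4x4 grid, damping, and iteration count are arbitrary illustrative choices.

```python
import itertools
import math
import numpy as np

def bethe_log_count(n, edges, iters=1000, damp=0.5):
    """Bethe estimate of ln(number of independent sets) via loopy BP for the
    hard-core model with activity 1: node weights (1, 1), psi(1, 1) = 0."""
    psi = np.array([[1.0, 1.0], [1.0, 0.0]])
    nbrs = {i: set() for i in range(n)}
    for u, v in edges:
        nbrs[u].add(v); nbrs[v].add(u)
    # directed messages m[(u, v)]: from u to v, indexed by the state of v
    m = {(u, v): np.array([0.5, 0.5])
         for u, v in itertools.permutations(range(n), 2) if v in nbrs[u]}

    for _ in range(iters):
        new = {}
        for (u, v) in m:
            prod = np.ones(2)
            for w in nbrs[u] - {v}:
                prod *= m[(w, u)]
            msg = psi.T @ prod        # sum over the state of u
            msg /= msg.sum()
            new[(u, v)] = damp * m[(u, v)] + (1 - damp) * msg
        m = new

    # node beliefs from the (approximately) converged messages
    b = {}
    for i in range(n):
        bi = np.ones(2)
        for w in nbrs[i]:
            bi *= m[(w, i)]
        b[i] = bi / bi.sum()

    # Bethe free energy: edge terms minus (degree - 1) node entropy terms
    F = 0.0
    for u, v in edges:
        prod_u = np.ones(2); prod_v = np.ones(2)
        for w in nbrs[u] - {v}: prod_u *= m[(w, u)]
        for w in nbrs[v] - {u}: prod_v *= m[(w, v)]
        buv = psi * np.outer(prod_u, prod_v)
        buv /= buv.sum()
        nz = buv > 0
        F += np.sum(buv[nz] * np.log(buv[nz] / psi[nz]))
    for i in range(n):
        F -= (len(nbrs[i]) - 1) * np.sum(b[i] * np.log(b[i]))
    return -F

def exact_log_count(n, edges):
    count = sum(1 for s in itertools.product([0, 1], repeat=n)
                if all(not (s[u] and s[v]) for u, v in edges))
    return math.log(count)

# 4x4 grid graph: BP is not exact here, but the Bethe estimate is close
n = 16
edges = [(r*4+c, r*4+c+1) for r in range(4) for c in range(3)] + \
        [(r*4+c, (r+1)*4+c) for r in range(3) for c in range(4)]
print("Bethe:", round(bethe_log_count(n, edges), 4),
      " exact:", round(exact_log_count(n, edges), 4))
```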

  18. Detecting Botnets Through Log Correlation

    CERN Document Server

    Al-Hammadi, Yousof

    2010-01-01

    Botnets, which consist of thousands of compromised machines, can pose significant threats to other systems by launching Distributed Denial of Service (DDoS) attacks, keylogging, and installing backdoors. In response to these threats, new effective techniques are needed to detect the presence of botnets. In this paper, we have used an interception technique to monitor Windows Application Programming Interface (API) function calls made by communication applications and store these calls with their arguments in log files. Our algorithm detects botnets based on monitoring abnormal activity by correlating the changes in log file sizes from different hosts.

  19. Approximation of Surfaces by Cylinders

    DEFF Research Database (Denmark)

    Randrup, Thomas

    1998-01-01

    We present a new method for approximation of a given surface by a cylinder surface. It is a constructive geometric method, leading to a monorail representation of the cylinder surface. By use of a weighted Gaussian image of the given surface, we determine a projection plane. In the orthogonal...

  20. Optimal Belief Approximation

    CERN Document Server

    Leike, Reimar H

    2016-01-01

    In Bayesian statistics probability distributions express beliefs. However, for many problems the beliefs cannot be computed analytically and approximations of beliefs are needed. We seek a ranking function that quantifies how "embarrassing" it is to communicate a given approximation. We show that there is only one ranking under the requirements that (1) the best ranked approximation is the non-approximated belief and (2) that the ranking judges approximations only by their predictions for actual outcomes. We find that this ranking is equivalent to the Kullback-Leibler divergence that is frequently used in the literature. However, there seems to be confusion about the correct order in which its functional arguments, the approximated and non-approximated beliefs, should be used. We hope that our elementary derivation settles the apparent confusion. We show for example that when approximating beliefs with Gaussian distributions the optimal approximation is given by moment matching. This is in contrast to many su...

  1. Approximating Graphic TSP by Matchings

    CERN Document Server

    Mömke, Tobias

    2011-01-01

    We present a framework for approximating the metric TSP based on a novel use of matchings. Traditionally, matchings have been used to add edges in order to make a given graph Eulerian, whereas our approach also allows for the removal of certain edges leading to a decreased cost. For the TSP on graphic metrics (graph-TSP), the approach yields a 1.461-approximation algorithm with respect to the Held-Karp lower bound. For graph-TSP restricted to a class of graphs that contains degree three bounded and claw-free graphs, we show that the integrality gap of the Held-Karp relaxation matches the conjectured ratio 4/3. The framework allows for generalizations in a natural way and also leads to a 1.586-approximation algorithm for the traveling salesman path problem on graphic metrics where the start and end vertices are prespecified.

  2. Exact and Approximate Unitary 2-Designs: Constructions and Applications

    CERN Document Server

    Dankert, C; Emerson, J; Livine, E; Dankert, Christoph; Cleve, Richard; Emerson, Joseph; Livine, Etera

    2006-01-01

    We consider an extension of the concept of spherical t-designs to the unitary group in order to develop a unified framework for analyzing the resource requirements of randomized quantum algorithms. We show that certain protocols based on twirling require a unitary 2-design. We describe an efficient construction for an exact unitary 2-design based on the Clifford group, and then develop a method for generating an epsilon-approximate unitary 2-design that requires only O(n log(1/epsilon)) gates, where n is the number of qubits and epsilon is an appropriate measure of precision. These results lead to a protocol with exponential resource savings over existing experimental methods for estimating the characteristic fidelities of physical quantum processes.

  3. A forward modeling approach for interpreting impeller flow logs.

    Science.gov (United States)

    Parker, Alison H; West, L Jared; Odling, Noelle E; Bown, Richard T

    2010-01-01

    A rigorous and practical approach for interpretation of impeller flow log data to determine vertical variations in hydraulic conductivity is presented and applied to two well logs from a Chalk aquifer in England. Impeller flow logging involves measuring vertical flow speed in a pumped well and using changes in flow with depth to infer the locations and magnitudes of inflows into the well. However, the measured flow logs are typically noisy, which leads to spurious hydraulic conductivity values where simplistic interpretation approaches are applied. In this study, a new method for interpretation is presented, which first defines a series of physical models for hydraulic conductivity variation with depth and then fits the models to the data, using a regression technique. Some of the models will be rejected as they are physically unrealistic. The best model is then selected from the remaining models using a maximum likelihood approach. This balances model complexity against fit, for example, using Akaike's Information Criterion.
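
    A toy version of the model-selection idea described above: fit piecewise-constant inflow profiles of increasing complexity to a noisy synthetic flow log and choose among them with Akaike's Information Criterion (for least squares, AIC = n·ln(RSS/n) + 2k). The equally spaced break points and synthetic profile are illustrative simplifications of the physically motivated models used in the paper.

```python
import numpy as np

def fit_piecewise_constant(depth, flow, n_segments):
    """Least-squares fit of a piecewise-constant flow profile with equally
    spaced break points; returns fitted values and the parameter count."""
    edges = np.linspace(depth.min(), depth.max(), n_segments + 1)
    fitted = np.empty_like(flow)
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (depth >= lo) & (depth <= hi)
        fitted[mask] = flow[mask].mean()
    return fitted, n_segments

def aic_least_squares(y, fitted, k):
    rss = np.sum((y - fitted) ** 2)
    n = len(y)
    return n * np.log(rss / n) + 2 * k

# synthetic flow log: two inflow zones plus measurement noise
rng = np.random.default_rng(0)
depth = np.linspace(0.0, 100.0, 400)
true = np.where(depth < 40, 3.0, np.where(depth < 75, 1.8, 0.0))
flow = true + rng.normal(0.0, 0.25, depth.size)

scores = {}
for n_seg in range(1, 9):
    fitted, k = fit_piecewise_constant(depth, flow, n_seg)
    scores[n_seg] = aic_least_squares(flow, fitted, k)
best = min(scores, key=scores.get)
print({k: round(v, 1) for k, v in scores.items()}, "-> best:", best)
```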

  4. Log-periodic dipole antenna with low cross-polarization

    DEFF Research Database (Denmark)

    Pivnenko, Sergey

    2006-01-01

    In this work, log-periodic antennas with improved cross-polarization level were studied. It was found that some modifications of the traditional design lead to an essential improvement of the co-to-cross polarization ratio up to 40 dB. An improved design of a log-periodic dipole antenna with low cross-polarization level is described. Some recommendations regarding improvement of the polarization characteristics of log-periodic antennas in general are also given. It was also found that log-periodic antennas can be attributed to the class of so-called first-order (m = ±1) antennas, which is an important requirement for probes in spherical near-field antenna measurements.

  5. Avoid Logs to Avoid Ticks

    Institute of Scientific and Technical Information of China (English)

    莫文佳

    2004-01-01

    Ticks are the chief culprit behind Lyme arthritis. To find out where ticks are most rampant, researchers went so far as to use themselves as bait, and they found: The ticks were all over the log surface. Hence the advice: Avoid sitting on logs.

  6. Decomposable log-linear models

    DEFF Research Database (Denmark)

    Eriksen, Poul Svante

    can be characterized by a structured set of conditional independencies between some variables given some other variables. We term the new model class decomposable log-linear models, which is illustrated to be a much richer class than decomposable graphical models.It covers a wide range of non...

  7. 29 CFR 1918.88 - Log operations.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 7 2010-07-01 2010-07-01 false Log operations. 1918.88 Section 1918.88 Labor Regulations...) SAFETY AND HEALTH REGULATIONS FOR LONGSHORING Handling Cargo § 1918.88 Log operations. (a) Working in holds. When loading logs into the holds of vessels and using dumper devices to roll logs into the...

  8. Hardwood log supply: a broader perspective

    Science.gov (United States)

    Iris Montague; Adri Andersch; Jan Wiedenbeck; Urs. Buehlmann

    2015-01-01

    At regional and state meetings we talk with others in our business about the problems we face: log exports, log quality, log markets, logger shortages, cash flow problems, the weather. These are familiar talking points and real and persistent problems. But what is the relative importance of these problems for log procurement in different regions of...

  9. Approximate flavor symmetries

    OpenAIRE

    Rašin, Andrija

    1994-01-01

    We discuss the idea of approximate flavor symmetries. Relations between approximate flavor symmetries and natural flavor conservation and democracy models are explored. Implications for neutrino physics are also discussed.

  10. On Element SDD Approximability

    CERN Document Server

    Avron, Haim; Toledo, Sivan

    2009-01-01

    This short communication shows that in some cases scalar elliptic finite element matrices cannot be approximated well by an SDD matrix. We also give a theoretical analysis of a simple heuristic method for approximating an element by an SDD matrix.

  11. Bi-log-concave Distribution Functions

    DEFF Research Database (Denmark)

    Dümbgen, Lutz; Kolesnyk, Petro; Wilke, Ralf

    2017-01-01

    Nonparametric statistics for distribution functions F or densities f=F′ under qualitative shape constraints constitutes an interesting alternative to classical parametric or entirely nonparametric approaches. We contribute to this area by considering a new shape constraint: F is said to be bi-log-concave if both log F and log(1−F) are concave. Many commonly considered distributions are compatible with this constraint. For instance, any c.d.f. F with log-concave density f=F′ is bi-log-concave. But in contrast to log-concavity of f, bi-log-concavity of F allows for multimodal densities. We provide various...
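
    A quick numerical check of the constraint for a familiar example (a sketch, not from the paper): for the standard normal distribution function, both log F and log(1−F) should be concave, so their second differences on a grid should be non-positive up to rounding.

```python
import numpy as np
from scipy.stats import norm

# numerically check bi-log-concavity of the standard normal c.d.f.:
# both log F and log(1 - F) should be concave (non-positive second differences)
x = np.linspace(-6, 6, 2001)
h = x[1] - x[0]
for name, g in [("log F", norm.logcdf(x)), ("log(1-F)", norm.logsf(x))]:
    second_diff = (g[2:] - 2 * g[1:-1] + g[:-2]) / h**2
    print(name, "max second difference:", second_diff.max())
```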

  12. Approximate iterative algorithms

    CERN Document Server

    Almudevar, Anthony Louis

    2014-01-01

    Iterative algorithms often rely on approximate evaluation techniques, which may include statistical estimation, computer simulation or functional approximation. This volume presents methods for the study of approximate iterative algorithms, providing tools for the derivation of error bounds and convergence rates, and for the optimal design of such algorithms. Techniques of functional analysis are used to derive analytical relationships between approximation methods and convergence properties for general classes of algorithms. This work provides the necessary background in functional analysis a

  13. Hierarchical matrix approximation of large covariance matrices

    KAUST Repository

    Litvinenko, Alexander

    2015-01-05

    We approximate large non-structured covariance matrices in the H-matrix format with a log-linear computational cost and storage O(n log n). We compute the inverse, Cholesky decomposition and determinant in H-format. As an example we consider the class of Matern covariance functions, which are very popular in spatial statistics, geostatistics, machine learning and image analysis. Applications are: kriging and optimal design.

  14. Approximation of distributed delays

    CERN Document Server

    Lu, Hao; Eberard, Damien; Simon, Jean-Pierre

    2010-01-01

    We address in this paper the approximation problem of distributed delays. Such elements are convolution operators with kernel having bounded support, and appear in the control of time-delay systems. From the rich literature on this topic, we propose a general methodology to achieve such an approximation. For this, we enclose the approximation problem in the graph topology, and work with the norm defined over the convolution Banach algebra. The class of rational approximates is described, and a constructive approximation is proposed. Analysis in time and frequency domains is provided. This methodology is illustrated on the stabilization control problem, for which simulations results show the effectiveness of the proposed methodology.

  15. Diophantine approximation and badly approximable sets

    DEFF Research Database (Denmark)

    Kristensen, S.; Thorn, R.; Velani, S.

    2006-01-01

    Let (X,d) be a metric space and (Omega, d) a compact subspace of X which supports a non-atomic finite measure m. We consider `natural' classes of badly approximable subsets of Omega. Loosely speaking, these consist of points in Omega which `stay clear' of some given set of points in X...

  16. Sparse approximation with bases

    CERN Document Server

    2015-01-01

    This book systematically presents recent fundamental results on greedy approximation with respect to bases. Motivated by numerous applications, the last decade has seen great successes in studying nonlinear sparse approximation. Recent findings have established that greedy-type algorithms are suitable methods of nonlinear approximation in both sparse approximation with respect to bases and sparse approximation with respect to redundant systems. These insights, combined with some previous fundamental results, form the basis for constructing the theory of greedy approximation. Taking into account the theoretical and practical demand for this kind of theory, the book systematically elaborates a theoretical framework for greedy approximation and its applications.  The book addresses the needs of researchers working in numerical mathematics, harmonic analysis, and functional analysis. It quickly takes the reader from classical results to the latest frontier, but is written at the level of a graduate course and do...

  17. Compressed 'energy logs'; Energiatukit tulevat

    Energy Technology Data Exchange (ETDEWEB)

    Fredriksson, T.

    1999-07-01

    hydraulic pressure pump has to be 250-300 l/min at the pressure range of 300 bar. The productivity, weight and price (1.2 million SEK) of the Wood Pac are approximately the same as those of the Fiberpac machine. Logging residues are fed into a chamber formed of eight rollers. The size of the 'log' is almost the same as that of the Fiberpac, but the amount of wood is somewhat lower. A single Wood Pac -log contains a little over 1.0 m{sup 3} of chips, while that of the Fiberpac log is about 1.5 m{sup 3}.

  18. Summing threshold logs in a parton shower

    CERN Document Server

    Nagy, Zoltan

    2016-01-01

    When parton distributions are falling steeply as the momentum fractions of the partons increase, there are effects that occur at each order in $\\alpha_s$ that combine to affect hard scattering cross sections and need to be summed. We show how to accomplish this in a leading approximation in the context of a parton shower Monte Carlo event generator.

  19. Summing threshold logs in a parton shower

    Energy Technology Data Exchange (ETDEWEB)

    Nagy, Zoltan [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Soper, Davison E. [Oregon Univ., Eugene, OR (United States). Inst. of Theoretical Science

    2016-05-15

    When parton distributions are falling steeply as the momentum fractions of the partons increase, there are effects that occur at each order in α_s that combine to affect hard scattering cross sections and need to be summed. We show how to accomplish this in a leading approximation in the context of a parton shower Monte Carlo event generator.

  20. Assessing the Utility of a Daily Log for Measuring Principal Leadership Practice

    Science.gov (United States)

    Camburn, Eric M.; Spillane, James P.; Sebastian, James

    2010-01-01

    Purpose: This study examines the feasibility and utility of a daily log for measuring principal leadership practice. Setting and Sample: The study was conducted in an urban district with approximately 50 principals. Approach: The log was assessed against two criteria: (a) Is it feasible to induce strong cooperation and high response rates among…

  1. Fully Retroactive Approximate Range and Nearest Neighbor Searching

    CERN Document Server

    Goodrich, Michael T

    2011-01-01

    We describe fully retroactive dynamic data structures for approximate range reporting and approximate nearest neighbor reporting. We show how to maintain, for any positive constant $d$, a set of $n$ points in $\\R^d$ indexed by time such that we can perform insertions or deletions at any point in the timeline in $O(\\log n)$ amortized time. We support, for any small constant $\\epsilon>0$, $(1+\\epsilon)$-approximate range reporting queries at any point in the timeline in $O(\\log n + k)$ time, where $k$ is the output size. We also show how to answer $(1+\\epsilon)$-approximate nearest neighbor queries for any point in the past or present in $O(\\log n)$ time.

  2. Accurately determining log and bark volumes of saw logs using high-resolution laser scan data

    Science.gov (United States)

    R. Edward Thomas; Neal D. Bennett

    2014-01-01

    Accurately determining the volume of logs and bark is crucial to estimating the total expected value recovery from a log. Knowing the correct size and volume of a log helps to determine which processing method, if any, should be used on a given log. However, applying volume estimation methods consistently can be difficult. Errors in log measurement and oddly shaped...

  3. Chemical logging of geothermal wells

    Science.gov (United States)

    Allen, C.A.; McAtee, R.E.

    The presence of geothermal aquifers can be detected while drilling in geothermal formations by maintaining a chemical log of the ratio of the concentrations of calcium to carbonate and bicarbonate ions in the return drilling fluid. A continuous increase in the ratio of the concentrations of calcium to carbonate and bicarbonate ions is indicative of the existence of a warm or hot geothermal aquifer at some increased depth.
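
    As a rough illustration of the ratio tracking described above, the sketch below computes the calcium-to-carbonate/bicarbonate ratio per sampled interval and flags a sustained rise. The sample values, units and window length are assumptions for illustration, not data from the patent record.

        # Illustrative sketch: track the Ca / (CO3 + HCO3) ratio in return drilling
        # fluid and flag a sustained increase, which the record above associates
        # with an approaching geothermal aquifer. Concentrations (assumed meq/L)
        # and the trend window are made-up example values.

        def carbonate_ratio(ca, co3, hco3):
            """Ratio of calcium to total carbonate + bicarbonate concentration."""
            return ca / (co3 + hco3)

        def rising_trend(ratios, window=5):
            """True if the ratio increased monotonically over the last `window` samples."""
            recent = ratios[-window:]
            return len(recent) == window and all(a < b for a, b in zip(recent, recent[1:]))

        # One reading per sampled depth interval: (Ca, CO3, HCO3)
        readings = [(2.0, 1.0, 3.0), (2.2, 1.0, 3.0), (2.6, 0.9, 2.9),
                    (3.1, 0.9, 2.8), (3.9, 0.8, 2.7), (4.8, 0.8, 2.6)]

        ratios = [carbonate_ratio(*r) for r in readings]
        if rising_trend(ratios):
            print("Continuous increase in Ca/(CO3+HCO3) ratio -> possible aquifer below")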

  4. Business logistics (Logística empresarial)

    Directory of Open Access Journals (Sweden)

    Feres Sahid

    1987-04-01

    Full Text Available ABSTRACT The logistics concept can be traced accurately, from an etymological and historical point of view, through the journal of the E.A.N.; it has a certain military character that carries over into business management, and from this a definitive debate on the concept is formulated.

  5. AUTOMATED TECHNIQUE FOR CREATING LITHOLOGIC LOG PLOTS

    Directory of Open Access Journals (Sweden)

    Kristijan Posavec

    2006-12-01

    Full Text Available The paper presents an automated technique for creating lithologic log plots. The technique is based on three computer tools: the Microsoft (MS) Access program, the LogPlot program, and Visual Basic (VB) macros for MS Excel. MS Access ensures professional storage of lithologic data, which can in that way be entered, searched, updated and used for different purposes more easily and quickly, while LogPlot provides tools for creating lithologic log plots. VB macros enable the transfer of lithologic data from MS Access to LogPlot. Data stored in MS Access are exported to ASCII files which are later used by LogPlot to create lithologic log plots. The presented concept facilitates the creation of lithologic log plots, and the automated technique enables the processing of large amounts of data, i.e. the creation of a large number of lithologic log plots in a short period of time (the paper is published in Croatian).

  6. Chiral Logs in Quenched QCD

    CERN Document Server

    Dong, S J; Horváth, I; Lee, F X; Liu, K F; Mathur, N; Zhang, J B

    2003-01-01

    The quenched chiral logs are examined on a $16^3 \times 28$ lattice with Iwasaki gauge action and overlap fermions. The pion decay constant $f_{\pi}$ is used to set the lattice spacing, $a = 0.200(3)$ fm. With pion mass as low as $\sim 180 {\rm MeV}$, we see the quenched chiral logs clearly in $m_{\pi}^2/m$ and $f_P$, the pseudoscalar decay constant. We analyze the data to determine how low the pion mass needs to be in order for the quenched one-loop chiral perturbation theory ($\chi$PT) to apply. With the constrained curve fitting, we are able to extract the quenched chiral log parameter $\delta$ together with the chiral cutoff $\Lambda_{\chi}$ and other parameters. Only for $m_{\pi} \leq 300 {\rm MeV}$ do we obtain a consistent and stable fit with a constant $\delta$ which we determine to be 0.23(2). By comparing to the $12^3 \times 28$ lattice, we estimate the finite volume effect to be about 1.8% for the smallest pion mass. We also study the quenched non-analytic terms in the nucleon and the $\rho$ masses...

  7. Approximation techniques for engineers

    CERN Document Server

    Komzsik, Louis

    2006-01-01

    Presenting numerous examples, algorithms, and industrial applications, Approximation Techniques for Engineers is your complete guide to the major techniques used in modern engineering practice. Whether you need approximations for discrete data of continuous functions, or you're looking for approximate solutions to engineering problems, everything you need is nestled between the covers of this book. Now you can benefit from Louis Komzsik's years of industrial experience to gain a working knowledge of a vast array of approximation techniques through this complete and self-contained resource.

  8. Theory of approximation

    CERN Document Server

    Achieser, N I

    2004-01-01

    A pioneer of many modern developments in approximation theory, N. I. Achieser designed this graduate-level text from the standpoint of functional analysis. The first two chapters address approximation problems in linear normalized spaces and the ideas of P. L. Tchebysheff. Chapter III examines the elements of harmonic analysis, and Chapter IV, integral transcendental functions of the exponential type. The final two chapters explore the best harmonic approximation of functions and Wiener's theorem on approximation. Professor Achieser concludes this exemplary text with an extensive section of pr

  9. Validity of the eikonal approximation

    CERN Document Server

    Kabat, D

    1992-01-01

    We summarize results on the reliability of the eikonal approximation in obtaining the high energy behavior of a two particle forward scattering amplitude. Reliability depends on the spin of the exchanged field. For scalar fields the eikonal fails at eighth order in perturbation theory, when it misses the leading behavior of the exchange-type diagrams. In a vector theory the eikonal gets the exchange diagrams correctly, but fails by ignoring certain non-exchange graphs which dominate the asymptotic behavior of the full amplitude. For spin-2 tensor fields the eikonal captures the leading behavior of each order in perturbation theory, but the sum of eikonal terms is subdominant to graphs neglected by the approximation. We also comment on the eikonal for Yang-Mills vector exchange, where the additional complexities of the non-abelian theory may be absorbed into Regge-type modifications of the gauge boson propagators.

  10. Validity of the Eikonal Approximation

    OpenAIRE

    Kabat, Daniel

    1992-01-01

    We summarize results on the reliability of the eikonal approximation in obtaining the high energy behavior of a two particle forward scattering amplitude. Reliability depends on the spin of the exchanged field. For scalar fields the eikonal fails at eighth order in perturbation theory, when it misses the leading behavior of the exchange-type diagrams. In a vector theory the eikonal gets the exchange diagrams correctly, but fails by ignoring certain non-exchange graphs which dominate the asymp...

  11. Determining Partition Coefficient (Log P), Distribution Coefficient (Log D) and Ionization Constant (pKa) in Early Drug Discovery.

    Science.gov (United States)

    Bharate, Sonali S; Kumar, Vikas; Vishwakarma, Ram A

    2016-01-01

    An early prediction of physicochemical properties is highly desirable during drug discovery to identify a viable lead candidate. Although there are several methods available to determine the partition coefficient (log P), distribution coefficient (log D) and ionization constant (pKa), none of them involves simple, fixed, miniaturized protocols for a diverse set of compounds. Therefore, it is necessary to establish simple, uniform and medium-throughput protocols requiring small sample quantities for the determination of these physicochemical properties. Log P and log D were determined by the shake-flask method, wherein the compound was partitioned between presaturated n-octanol and an aqueous phase (water/PBS pH 7.4) and the concentration of the compound in each phase was determined by HPLC. The pKa determination made use of UV spectrophotometric analysis in a 96-well microtiter plate containing a series of aqueous buffers ranging from pH 1.0 to 13.0. The medium-throughput miniaturized protocols described herein, for the determination of log P, log D and pKa, are straightforward to set up and require very small quantities of sample (< 5 mg for all three properties). All established protocols were validated using a diverse set of compounds.
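
    For orientation, the quantities measured by the shake-flask protocol can be written compactly as below; the pH 7.4 relation for a monoprotic acid is the standard textbook expression and is given only as a reminder, not as part of the authors' protocol.

        \[
        \log P = \log_{10}\frac{[\mathrm{X}]_{n\text{-octanol}}}{[\mathrm{X}]_{\text{water}}},
        \qquad
        \log D_{7.4} = \log_{10}\frac{[\mathrm{X}]_{n\text{-octanol}}}{[\mathrm{X}]_{\text{buffer, pH }7.4}},
        \]
        \[
        \text{monoprotic acid:}\quad
        \log D_{\mathrm{pH}} = \log P - \log_{10}\!\left(1 + 10^{\,\mathrm{pH}-\mathrm{p}K_a}\right).
        \]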

  12. Expectation Consistent Approximate Inference

    DEFF Research Database (Denmark)

    Opper, Manfred; Winther, Ole

    2005-01-01

    We propose a novel framework for approximations to intractable probabilistic models which is based on a free energy formulation. The approximation can be understood from replacing an average over the original intractable distribution with a tractable one. It requires two tractable probability dis...

  13. Ordered cones and approximation

    CERN Document Server

    Keimel, Klaus

    1992-01-01

    This book presents a unified approach to Korovkin-type approximation theorems. It includes classical material on the approximation of real-valued functions as well as recent and new results on set-valued functions and stochastic processes, and on weighted approximation. The results are not only of qualitative nature, but include quantitative bounds on the order of approximation. The book is addressed to researchers in functional analysis and approximation theory as well as to those that want to apply these methods in other fields. It is largely self-contained, but the reader should have a solid background in abstract functional analysis. The unified approach is based on a new notion of locally convex ordered cones that are not embeddable in vector spaces but allow Hahn-Banach type separation and extension theorems. This concept seems to be of independent interest.

  14. Approximate Modified Policy Iteration

    CERN Document Server

    Scherrer, Bruno; Ghavamzadeh, Mohammad; Geist, Matthieu

    2012-01-01

    Modified policy iteration (MPI) is a dynamic programming (DP) algorithm that contains the two celebrated policy and value iteration methods. Despite its generality, MPI has not been thoroughly studied, especially its approximation form which is used when the state and/or action spaces are large or infinite. In this paper, we propose three approximate MPI (AMPI) algorithms that are extensions of the well-known approximate DP algorithms: fitted-value iteration, fitted-Q iteration, and classification-based policy iteration. We provide an error propagation analysis for AMPI that unifies those for approximate policy and value iteration. We also provide a finite-sample analysis for the classification-based implementation of AMPI (CBMPI), which is more general than (and in some sense contains) the analyses of the other presented AMPI algorithms. An interesting observation is that MPI's parameter allows us to control the balance of errors (in value function approximation and in estimating the greedy policy) in the fina...

  15. The Log Log Prior for the Frequency of Extraterrestrial Intelligences

    CERN Document Server

    Lacki, Brian C

    2016-01-01

    It is unclear how frequently life and intelligence arise on planets. I consider a Bayesian prior for the probability P(ETI) that intelligence evolves at a suitable site, with weight distributed evenly over ln(1 - ln P(ETI)). This log log prior can handle a very wide range of P(ETI) values, from 1 to 10^(-10^122), while remaining responsive to evidence about extraterrestrial societies. It is motivated by our uncertainty in the number of conditions that must be fulfilled for intelligence to arise, and it is related to considerations of information, entropy, and state space dimensionality. After setting a lower limit to P(ETI) from the number of possible genome sequences, I calculate a Bayesian confidence of 18% that aliens exist within the observable Universe. With different assumptions about the minimum P(ETI) and the number of times intelligence can appear on a planet, this value falls between 1.4% and 47%. Overall, the prior leans towards our being isolated from extraterrestrial intelligences, but indicates ...
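
    A rough numerical sketch of drawing P(ETI) from a prior that is uniform in ln(1 - ln P) is given below. The lower cutoff matches the 10^(-10^122) figure quoted in the abstract, but the number of candidate sites and the "N·P > 1" criterion are illustrative assumptions; the sketch does not reproduce the paper's 18% confidence calculation.

        # Illustrative sketch: sample P(ETI) from a prior uniform in
        # u = ln(1 - ln P), i.e. P = exp(1 - e^u), and ask how often an assumed
        # number of suitable sites would imply N * P(ETI) > 1.
        import math
        import random

        LOG10_P_MIN = -1e122   # lower cutoff 10^(-10^122), as quoted in the abstract
        N_SITES = 1e22         # assumed number of suitable sites (illustrative)

        u_max = math.log(1.0 - LOG10_P_MIN * math.log(10.0))   # ln(1 - ln P_min)

        def draw_ln_p():
            """One draw of ln P(ETI) from the log log prior."""
            u = random.uniform(0.0, u_max)
            return 1.0 - math.exp(u)

        draws = 100_000
        hits = sum(1 for _ in range(draws)
                   if draw_ln_p() > -math.log(N_SITES))   # roughly N * P > 1
        print(f"fraction of prior mass with N*P(ETI) > 1: {hits / draws:.3f}")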

  16. Coal quality estimation using geophysical logging without radioactive sources

    Energy Technology Data Exchange (ETDEWEB)

    Beretta, F.; De Souza, V.C.G.; Salvadoretti, P.; Costa, J.F.C.L.; Koppe, J.C. [Univ. Federal do Rio Grando do Sul, Rio Grando do Sul (Brazil). Dept. of Mining Engineering; Bastiani, G.A.; Carvalho, J.A.Jr. [Copelmi Mineracao, Rio Grando do Sul (Brazil)

    2009-07-01

    Drill hole logging is widely used in mining and mineral exploration to determine the physical and chemical properties of ore. Geophysical probes are used to delineate coal seams and measure coal seam density. Gamma-gamma logging of the seams is used to determine correlations in ash content and coal density. This study evaluated the accuracy of geophysical logging techniques in predicting coal quality. Seventeen holes in the study were drilled in an irregular grid with spaces ranging from 200 to 600 m. The average recorded values of natural gamma and the resistivity from the logs were obtained. Differences between the coal seams in the deposit were analyzed statistically. The study showed a strong correlation between natural gamma and ash content in the deposit. Relative errors were approximately 10 per cent for a confidence interval of 99.99 per cent. It was concluded that natural gamma logging can be used to accurately measure the ash content in coal seams. 19 refs., 2 tabs., 3 figs.
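
    A minimal sketch of the kind of calibration the study describes, fitting ash content against natural-gamma readings by least squares, is shown below; the data pairs are invented for illustration and are not the deposit data analyzed in the paper.

        # Illustrative sketch: least-squares calibration of ash content (%) against
        # natural-gamma log response (API units). The pairs below are made up for
        # illustration; the study's deposit data are not reproduced here.
        import numpy as np

        gamma = np.array([40.0, 55.0, 70.0, 85.0, 100.0, 120.0])   # API
        ash   = np.array([12.0, 18.0, 25.0, 31.0, 38.0, 47.0])     # % ash

        slope, intercept = np.polyfit(gamma, ash, 1)
        predicted = slope * gamma + intercept
        r = np.corrcoef(gamma, ash)[0, 1]
        rel_err = np.abs(predicted - ash) / ash

        print(f"ash ~ {slope:.3f} * gamma + {intercept:.2f}, r = {r:.3f}")
        print(f"mean relative error: {100 * rel_err.mean():.1f} %")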

  17. Data Mining of Network Logs

    Science.gov (United States)

    Collazo, Carlimar

    2011-01-01

    The purpose is to analyze network monitoring logs to support the computer incident response team. Specifically, to gain a clear understanding of the Uniform Resource Locator (URL) and its structure, and to provide a way to break down a URL based on protocol, host name, domain name, path, and other attributes. Finally, to provide a method to perform data reduction by identifying the different types of advertisements shown on a webpage for incident data analysis. The procedure used for analysis and data reduction is a computer program which analyzes the URL and distinguishes advertisement links from the actual content links.
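
    A minimal sketch of this kind of URL breakdown, using Python's standard urllib.parse, is shown below; the example URL and the advertisement-matching substrings are assumptions for illustration, not the rules used in the program described above.

        # Illustrative sketch: break a URL into protocol, host, path and query, and
        # flag likely advertisement links by simple substring rules (assumed rules).
        from urllib.parse import urlparse, parse_qs

        AD_HINTS = ("doubleclick", "adserver", "/ads/", "banner")   # assumed patterns

        def describe(url):
            parts = urlparse(url)
            return {
                "protocol": parts.scheme,
                "host": parts.hostname,
                "path": parts.path,
                "query": parse_qs(parts.query),
                "is_ad": any(h in url.lower() for h in AD_HINTS),
            }

        print(describe("http://adserver.example.com/ads/banner.gif?id=42&size=728x90"))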

  18. The Closed Form on a Kind of Log-cosine and Log-sine Integral

    Institute of Scientific and Technical Information of China (English)

    商妮娜; 秦惠增

    2012-01-01

    We consider some recursion formulas for the partial derivatives of the Beta function. Using these recursion formulas, we give closed forms for a class of log-cosine and log-sine integrals.
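
    As a concrete instance of the method the abstract refers to, differentiating the Beta-integral representation once yields the classical log-sine value; this is a standard textbook computation (B is the Beta function, ψ the digamma function, γ Euler's constant), shown here only for orientation.

        \[
        \tfrac{1}{2}B(a,b)=\int_{0}^{\pi/2}\sin^{2a-1}\theta\,\cos^{2b-1}\theta\,d\theta .
        \]
        Differentiating with respect to $a$ and setting $a=b=\tfrac12$, with
        $B(\tfrac12,\tfrac12)=\pi$, $\psi(\tfrac12)=-\gamma-2\ln 2$ and $\psi(1)=-\gamma$, gives
        \[
        \int_{0}^{\pi/2}\ln(\sin\theta)\,d\theta
        =\tfrac{1}{4}B(\tfrac12,\tfrac12)\bigl[\psi(\tfrac12)-\psi(1)\bigr]
        =-\tfrac{\pi}{2}\ln 2 .
        \]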

  19. Approximate calculation of integrals

    CERN Document Server

    Krylov, V I

    2006-01-01

    A systematic introduction to the principal ideas and results of the contemporary theory of approximate integration, this volume approaches its subject from the viewpoint of functional analysis. In addition, it offers a useful reference for practical computations. Its primary focus lies in the problem of approximate integration of functions of a single variable, rather than the more difficult problem of approximate integration of functions of more than one variable.The three-part treatment begins with concepts and theorems encountered in the theory of quadrature. The second part is devoted to t

  20. Approximate and renormgroup symmetries

    Energy Technology Data Exchange (ETDEWEB)

    Ibragimov, Nail H. [Blekinge Institute of Technology, Karlskrona (Sweden). Dept. of Mathematics Science; Kovalev, Vladimir F. [Russian Academy of Sciences, Moscow (Russian Federation). Inst. of Mathematical Modeling

    2009-07-01

    "Approximate and Renormgroup Symmetries" deals with approximate transformation groups, symmetries of integro-differential equations and renormgroup symmetries. It includes a concise and self-contained introduction to basic concepts and methods of Lie group analysis, and provides an easy-to-follow introduction to the theory of approximate transformation groups and symmetries of integro-differential equations. The book is designed for specialists in nonlinear physics - mathematicians and non-mathematicians - interested in methods of applied group analysis for investigating nonlinear problems in physical science and engineering. (orig.)

  1. Approximating Stationary Statistical Properties

    Institute of Scientific and Technical Information of China (English)

    Xiaoming WANG

    2009-01-01

    It is well-known that physical laws for large chaotic dynamical systems are revealed statistically. Many times these statistical properties of the system must be approximated numerically. The main contribution of this manuscript is to provide simple and natural criteria on numerical methods (temporal and spatial discretization) that are able to capture the stationary statistical properties of the underlying dissipative chaotic dynamical systems asymptotically. The result on temporal approximation is a recent finding of the author, and the result on spatial approximation is a new one. Applications to the infinite Prandtl number model for convection and the barotropic quasi-geostrophic model are also discussed.

  2. Non-Linear Logging Parameters Inversion

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    Non-linear logging parameter inversion is based on field theory, information optimization and prediction theory. It uses seismic characters, a geological model and logging data as constraints to invert 2D and 3D logging-parameter data volumes. Using this method,

  3. 29 CFR 1917.18 - Log handling.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 7 2010-07-01 2010-07-01 false Log handling. 1917.18 Section 1917.18 Labor Regulations...) MARINE TERMINALS Marine Terminal Operations § 1917.18 Log handling. (a) The employer shall ensure that structures (bunks) used to contain logs have rounded corners and rounded structural parts to avoid...

  4. 47 CFR 73.1820 - Station log.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 4 2010-10-01 2010-10-01 false Station log. 73.1820 Section 73.1820... Rules Applicable to All Broadcast Stations § 73.1820 Station log. (a) Entries must be made in the station log either manually by a person designated by the licensee who is in actual charge of...

  5. 47 CFR 87.109 - Station logs.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 5 2010-10-01 2010-10-01 false Station logs. 87.109 Section 87.109... Operating Requirements and Procedures Operating Procedures § 87.109 Station logs. (a) A station at a fixed location in the international aeronautical mobile service must maintain a log in accordance with Annex...

  6. 10 CFR 34.71 - Utilization logs.

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Utilization logs. 34.71 Section 34.71 Energy NUCLEAR... RADIOGRAPHIC OPERATIONS Recordkeeping Requirements § 34.71 Utilization logs. (a) Each licensee shall maintain utilization logs showing for each sealed source the following information: (1) A description, including...

  7. Selective logging in the Brazilian Amazon.

    Science.gov (United States)

    G. P. Asner; D. E. Knapp; E. N. Broadbent; P. J. C. Oliveira; M Keller; J. N. Silva

    2005-01-01

    Amazon deforestation has been measured by remote sensing for three decades. In comparison, selective logging has been mostly invisible to satellites. We developed a large-scale, high-resolution, automated remote-sensing analysis of selective logging in the top five timber-producing states of the Brazilian Amazon. Logged areas ranged from 12,075 to 19,823 square...

  8. Research on the Operation Mode of the Receiving-Visiting System of Leading Cadres: An Analysis of Petition Logs

    Institute of Scientific and Technical Information of China (English)

    方付建

    2015-01-01

    In order to analyze how the system of leading cadres receiving petitioners, promoted by the central government, operates at the local level, this paper takes cadres' petition logs as its material and uses statistical analysis and inductive methods to examine the forms of reception, the matters received, and the ways those matters are handled. The results show that the system operates through scheduled receptions, appointment visits and investigative visits, and that the matters received are highly diverse. In handling them, leading cadres assign tasks, offer advice, mobilize resources, or report to higher authorities. In terms of orientation, although the reception system helps to identify and resolve conflicts, narrow the distance between cadres and the masses, promote public participation and increase public services, it relies on the personal authority of leaders to function, suffers from "firefighting"-style stability maintenance, and, as a measure that treats symptoms rather than root causes, can fall into a dilemma.

  9. Squashed entanglement and approximate private states

    Science.gov (United States)

    Wilde, Mark M.

    2016-09-01

    The squashed entanglement is a fundamental entanglement measure in quantum information theory, finding application as an upper bound on the distillable secret key or distillable entanglement of a quantum state or a quantum channel. This paper simplifies proofs that the squashed entanglement is an upper bound on distillable key for finite-dimensional quantum systems and solidifies such proofs for infinite-dimensional quantum systems. More specifically, this paper establishes that the logarithm of the dimension of the key system (call it log₂ K) in an ɛ-approximate private state is bounded from above by the squashed entanglement of that state plus a term that depends only on ɛ and log₂ K. Importantly, the extra term does not depend on the dimension of the shield systems of the private state. The result holds for the bipartite squashed entanglement, and an extension of this result is established for two different flavors of the multipartite squashed entanglement.

  10. Approximation of irrationals

    Directory of Open Access Journals (Sweden)

    Malvina Baica

    1985-01-01

    Full Text Available The author uses a new modification of the Jacobi-Perron Algorithm which holds for complex fields of any degree (abbr. ACF), and defines it as the Generalized Euclidean Algorithm (abbr. GEA) to approximate irrationals.

  11. Approximations in Inspection Planning

    DEFF Research Database (Denmark)

    Engelund, S.; Sørensen, John Dalsgaard; Faber, M. H.

    2000-01-01

    Planning of inspections of civil engineering structures may be performed within the framework of Bayesian decision analysis. The effort involved in a full Bayesian decision analysis is relatively large. Therefore, the actual inspection planning is usually performed using a number of approximations. One of the more important of these approximations is the assumption that all inspections will reveal no defects. Using this approximation the optimal inspection plan may be determined on the basis of conditional probabilities, i.e. the probability of failure given no defects have been found by the inspection. In this paper the quality of this approximation is investigated. The inspection planning is formulated both as a full Bayesian decision problem and on the basis of the assumption that the inspection will reveal no defects.

  13. Approximation and Computation

    CERN Document Server

    Gautschi, Walter; Rassias, Themistocles M

    2011-01-01

    Approximation theory and numerical analysis are central to the creation of accurate computer simulations and mathematical models. Research in these areas can influence the computational techniques used in a variety of mathematical and computational sciences. This collection of contributed chapters, dedicated to the renowned mathematician Gradimir V. Milovanović, represents the recent work of experts in the fields of approximation theory and numerical analysis. These invited contributions describe new trends in these important areas of research including theoretic developments, new computational alg

  14. Approximation Behooves Calibration

    DEFF Research Database (Denmark)

    da Silva Ribeiro, André Manuel; Poulsen, Rolf

    2013-01-01

    Calibration based on an expansion approximation for option prices in the Heston stochastic volatility model gives stable, accurate, and fast results for S&P500-index option data over the period 2005–2009.

  15. Quality control for quantitative geophysical logging

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Sang Kyu; Hwang, Se Ho; Hwang, Hak Soo; Park, In Hwa [Korea Institute of Geology Mining and Materials, Taejon (Korea)

    1998-12-01

    Despite the great availability of geophysical data obtained from boreholes, their interpretation is subject to significant uncertainties. Obtaining more accurate data with smaller statistical uncertainties requires more quantitative techniques in both log acquisition and interpretation. The long-term objective of this project is the development of techniques for both quality control of log measurements and quantitative interpretation. In the first year, the goals of the project include establishing a log-acquisition procedure using various tests, analysing the effect of changes in logging velocity on the data, examining repeatability and reproducibility, analysing the effect of filtering on the measurements, and finally the zonation and correlation of single- and inter-well log data. To establish the logging procedure, we tested the multiple factors affecting depth accuracy. These factors fall into two groups, human and mechanical, and include the zero setting of depth, the calculation of the sonde offset, cable stretch, and measuring-wheel accuracy. We conclude that the error in depth setting results primarily from human factors and in part from cable stretch. The statistical fluctuation of the log measurements increases with logging speed in zones of low natural gamma. Although logging speed is a minor issue for resource-exploration applications, logging should be run more slowly to reduce the statistical fluctuation of natural gamma when lithologic correlation is intended. The repeatability and reproducibility of the logging measurements were also tested. The repeatability results for the natural gamma sonde are qualitatively acceptable; in the reproducibility test, errors occur between two operators and between successive trials. We conclude that the errors result from the

  16. Approximate kernel competitive learning.

    Science.gov (United States)

    Wu, Jian-Sheng; Zheng, Wei-Shi; Lai, Jian-Huang

    2015-03-01

    Kernel competitive learning has been successfully used to achieve robust clustering. However, kernel competitive learning (KCL) is not scalable for large scale data processing, because (1) it has to calculate and store the full kernel matrix that is too large to be calculated and kept in memory and (2) it cannot be computed in parallel. In this paper we develop a framework of approximate kernel competitive learning for processing large scale datasets. The proposed framework consists of two parts. First, it derives an approximate kernel competitive learning (AKCL), which learns kernel competitive learning in a subspace via sampling. We provide solid theoretical analysis on why the proposed approximation modelling would work for kernel competitive learning, and furthermore, we show that the computational complexity of AKCL is largely reduced. Second, we propose a pseudo-parallelled approximate kernel competitive learning (PAKCL) based on a set-based kernel competitive learning strategy, which overcomes the obstacle of using parallel programming in kernel competitive learning and significantly accelerates the approximate kernel competitive learning for large scale clustering. The empirical evaluation on publicly available datasets shows that the proposed AKCL and PAKCL can perform comparably to KCL, with a large reduction in computational cost. Also, the proposed methods achieve more effective clustering performance in terms of clustering precision against related approximate clustering approaches.

  17. Approximation and supposition

    Directory of Open Access Journals (Sweden)

    Maksim Duškin

    2015-11-01

    Full Text Available Approximation and supposition. This article compares exponents of approximation (expressions like Russian около, примерно, приблизительно, более, свыше) with words expressing supposition (for example Russian скорее всего, наверное, возможно). These words are often confused in research; in particular, researchers often cite exponents of supposition when discussing exponents of approximation. Such an approach raises some objections. The author intends to demonstrate in this article a notional difference between approximation and supposition, and therefore a difference between the exponents of these two notions. The difference can be described in terms of the different relation of approximation and supposition to the notion of knowledge. Supposition implies the speaker's ignorance of the exact number, while approximation does not imply such ignorance. The article offers examples proving this point of view.

  18. Local regularity analysis of strata heterogeneities from sonic logs

    Directory of Open Access Journals (Sweden)

    S. Gaci

    2010-09-01

    Full Text Available Borehole logs provide geological information about the rocks crossed by the wells. Several properties of rocks can be interpreted in terms of lithology, type and quantity of the fluid filling the pores and fractures.

    Here, the logs are assumed to be nonhomogeneous Brownian motions (nhBms), which are generalized fractional Brownian motions (fBms) indexed by depth-dependent Hurst parameters H(z). Three techniques, the local wavelet approach (LWA), the average-local wavelet approach (ALWA), and the Peltier Algorithm (PA), are suggested to estimate the Hurst functions (or regularity profiles) from the logs.

    First, two synthetic sonic logs with different parameters, shaped by the successive random additions (SRA) algorithm, are used to demonstrate the potential of the proposed methods. The obtained Hurst functions are close to the theoretical Hurst functions. Besides, the transitions between the modeled layers are marked by discontinuities in the Hurst values. It is also shown that PA leads to the best Hurst value estimations.

    Second, we investigate the multifractional property of sonic log data recorded at two scientific deep boreholes: the pilot hole VB and the ultra-deep main hole HB, drilled for the German Continental Deep Drilling Program (KTB). All the regularity profiles independently obtained for the logs provide a clear correlation with lithology, and from each regularity profile we derive a similar segmentation in terms of lithological units. The lithological discontinuities (strata bounds and fault contacts) are located at the local extrema of the Hurst functions. Moreover, the regularity profiles are compared with the KTB estimated porosity logs, showing a significant relation between the local extrema of the Hurst functions and the fluid-filled fractures. The Hurst function may then constitute a tool to characterize underground heterogeneities.
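
    A rough sketch of a windowed Hurst estimate is given below. It uses the increment-variance scaling of an fBm, E[(x(z+τ)-x(z))²] ~ τ^(2H), rather than the LWA/ALWA or Peltier estimators of the paper, and the synthetic "log" is just cumulated Gaussian noise for illustration; window, step and lag choices are assumptions.

        # Illustrative sketch: sliding-window Hurst estimate from the scaling of
        # increment variances. Generic estimator, not the paper's LWA/ALWA/PA methods.
        import numpy as np

        def hurst_window(x, lags=(1, 2, 4, 8, 16)):
            """Estimate H from log-log regression of increment variance vs lag."""
            lags = [l for l in lags if l < len(x) // 2]
            v = [np.mean((x[l:] - x[:-l]) ** 2) for l in lags]
            slope, _ = np.polyfit(np.log(lags), np.log(v), 1)
            return slope / 2.0          # Var(increment) ~ lag^(2H)

        def hurst_profile(series, window=256, step=64):
            """H(z) estimated in overlapping windows along the log."""
            centers, hs = [], []
            for start in range(0, len(series) - window, step):
                centers.append(start + window // 2)
                hs.append(hurst_window(series[start:start + window]))
            return np.array(centers), np.array(hs)

        # Synthetic stand-in for a sonic log: cumulative sum of white noise (H ~ 0.5).
        rng = np.random.default_rng(0)
        synthetic_log = np.cumsum(rng.standard_normal(4096))
        depth_index, H = hurst_profile(synthetic_log)
        print(f"mean estimated H: {H.mean():.2f} (expected ~0.5 for Brownian motion)")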

  19. Porosity Log Prediction Using Artificial Neural Network

    Science.gov (United States)

    Dwi Saputro, Oki; Lazuardi Maulana, Zulfikar; Dzar Eljabbar Latief, Fourier

    2016-08-01

    Well logging is important in oil and gas exploration. Many physical parameters of a reservoir are derived from well logging measurements. Geophysicists often use well logging to obtain reservoir properties such as porosity, water saturation and permeability. Most of the time, measuring these reservoir properties is considered expensive. One method to substitute for the measurement is to make a prediction using an artificial neural network. In this paper, an artificial neural network is used to predict porosity log data from other log data. Three wells from the ‘yy’ field are used for the prediction experiment. The log data are sonic, gamma ray, and porosity logs. One of the three wells is used as training data for the artificial neural network, which employs the Levenberg-Marquardt backpropagation algorithm. Through several trials, we find that the most optimal training input is sonic log data and gamma ray log data with 10 hidden layers. The prediction in well 1 has a correlation of 0.92 and a mean squared error of 5.67 × 10⁻⁴. The trained network was then applied to the other wells. The results show that the correlations in well 2 and well 3 are 0.872 and 0.9077, respectively, and the mean squared errors are 11 × 10⁻⁴ and 9.539 × 10⁻⁴. From these results we conclude that sonic and gamma ray logs can be a good combination for predicting porosity with a neural network.
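
    A minimal sketch of this workflow is given below using scikit-learn's MLPRegressor; note that scikit-learn optimizes with L-BFGS or Adam rather than the Levenberg-Marquardt backpropagation used in the paper, and the synthetic log values are illustrative stand-ins, not the 'yy'-field data.

        # Illustrative sketch: predict a porosity log from sonic and gamma-ray logs
        # with a small neural network. Synthetic data stand in for the field logs;
        # MLPRegressor (L-BFGS) replaces the paper's Levenberg-Marquardt training.
        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(1)
        n = 1000
        sonic = rng.uniform(60, 120, n)          # us/ft, assumed range
        gamma = rng.uniform(20, 150, n)          # API, assumed range
        porosity = 0.002 * sonic - 0.0005 * gamma + 0.05 + rng.normal(0, 0.01, n)

        X = np.column_stack([sonic, gamma])
        train, test = slice(0, 700), slice(700, n)

        model = MLPRegressor(hidden_layer_sizes=(10,), solver="lbfgs",
                             max_iter=5000, random_state=0)
        model.fit(X[train], porosity[train])

        pred = model.predict(X[test])
        corr = np.corrcoef(pred, porosity[test])[0, 1]
        mse = np.mean((pred - porosity[test]) ** 2)
        print(f"correlation: {corr:.3f}, mean squared error: {mse:.2e}")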

  20. LHCb: Universal Logging System for LHCb

    CERN Multimedia

    Nikolaidis, F; Garnier, J-C; Neufeld, N

    2010-01-01

    In a large-scale IT infrastructure such as the LHCb Online system, many applications run on thousands of machines and produce many GBs of logs every day. Although most of the logs are just routine logs, some of them may indicate an attack or a malfunction, or provide vital debugging information. Due to their volume, only automation of the log analysis can provide an efficient way to handle all of these logs, ensuring that even the rarest logs are processed. We present a centralized logging system which allows us to do in-depth analysis of every log. The description of the architecture includes how we integrate logging from many devices to a centralized server using syslog and, in particular, how a correlation can indicate an attack. Special emphasis is given both to security monitoring and to logs that indicate developing malfunctions. To secure our network we have deployed the best-known HIDS, NIDS and LIDS (host, network and log intrusion detection systems). Each ...

  1. Tight Lower Bounds on Envy-Free Makespan Approximation

    CERN Document Server

    Fiat, Amos

    2012-01-01

    In this work we give a tight lower bound on makespan approximations for envy-free allocation mechanisms dedicated to scheduling tasks on unrelated machines. Specifically, we show that no mechanism exists that can guarantee an envy-free allocation of jobs to $m$ machines with a makespan of less than a factor of $O(\log m)$ of the minimal makespan. Combined with previous results, this paper definitively proves that the optimal algorithm for obtaining a minimal makespan for any envy-free division can at best approximate the makespan to a factor of $O(\log m)$.

  2. Fixed-dimensional parallel linear programming via relative ε-approximations

    Energy Technology Data Exchange (ETDEWEB)

    Goodrich, M.T.

    1996-12-31

    We show that linear programming in R^d can be solved deterministically in O((log log n)^d) time using linear work in the PRAM model of computation, for any fixed constant d. Our method is developed for the CRCW variant of the PRAM parallel computation model, and can be easily implemented to run in O(log n (log log n)^(d-1)) time using linear work on an EREW PRAM. A key component in these algorithms is a new, efficient parallel method for constructing ε-nets and ε-approximations (which have wide applicability in computational geometry). In addition, we introduce a new deterministic set approximation for range spaces with finite VC-exponent, which we call the δ-relative ε-approximation, and we show how such approximations can be efficiently constructed in parallel.

  3. Log-ω-hyponormal Operators

    Institute of Scientific and Technical Information of China (English)

    王斌; 张敏

    2008-01-01

    Let T be an operator on a separable Hilbert space H and let T = U|T| be its polar decomposition. T is said to be log-ω-hyponormal if log|T̃| ≥ log|T| ≥ log|(T̃)*|, where T̃ denotes the Aluthge transform of T. In this paper we prove that the point spectrum of T is equal to its joint point spectrum if T is log-ω-hyponormal. We also prove that a log-ω-hyponormal operator is normaloid, i.e., r(T) = ‖T‖. Finally, we obtain Putnam's theorem for log-ω-hyponormal operators.

  4. Analysis of Web Proxy Logs

    Science.gov (United States)

    Fei, Bennie; Eloff, Jan; Olivier, Martin; Venter, Hein

    Network forensics involves capturing, recording and analysing network audit trails. A crucial part of network forensics is to gather evidence at the server level, proxy level and from other sources. A web proxy relays URL requests from clients to a server. Analysing web proxy logs can give unobtrusive insights to the browsing behavior of computer users and provide an overview of the Internet usage in an organisation. More importantly, in terms of network forensics, it can aid in detecting anomalous browsing behavior. This paper demonstrates the use of a self-organising map (SOM), a powerful data mining technique, in network forensics. In particular, it focuses on how a SOM can be used to analyse data gathered at the web proxy level.

  5. Quirks of Stirling's Approximation

    Science.gov (United States)

    Macrae, Roderick M.; Allgeier, Benjamin M.

    2013-01-01

    Stirling's approximation to ln n! is typically introduced to physical chemistry students as a step in the derivation of the statistical expression for the entropy. However, naive application of this approximation leads to incorrect conclusions. In this article, the problem is first illustrated using a familiar "toy…
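
    A quick numerical check of the point: the exact ln n! (from the log-gamma function) is compared with the crude form n ln n − n, which is noticeably off for small n, and with the next-order form that adds ½ ln(2πn). The n values are arbitrary illustrative choices.

        # Compare ln(n!) with the crude Stirling form n*ln(n) - n and with the
        # next-order form that includes the 0.5*ln(2*pi*n) correction.
        import math

        for n in (5, 10, 100, 1000):
            exact = math.lgamma(n + 1)                       # ln(n!)
            crude = n * math.log(n) - n
            better = crude + 0.5 * math.log(2 * math.pi * n)
            print(f"n={n:5d}  ln n! = {exact:12.4f}  "
                  f"crude error = {exact - crude:8.4f}  "
                  f"corrected error = {exact - better:8.5f}")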

  6. Approximate Inference for Wireless Communications

    DEFF Research Database (Denmark)

    Hansen, Morten

    This thesis investigates signal processing techniques for wireless communication receivers. The aim is to improve the performance or reduce the computational complexity of these, where the primary focus area is cellular systems such as the Global System for Mobile communications (GSM) and its extensions. … complexity can potentially lead to limited power consumption, which translates into longer battery life-time in the handsets. The scope of the thesis is more specifically to investigate approximate (near-optimal) detection methods that can reduce the computational complexity significantly compared … to the optimal one, which usually requires an unacceptably high complexity. Some of the treated approximate methods are based on QL-factorization of the channel matrix. In the work presented in this thesis it is proven how the QL-factorization of frequency-selective channels asymptotically provides the minimum

  7. Uniform Approximate Estimation for Nonlinear Nonhomogenous Stochastic System with Unknown Parameter

    OpenAIRE

    2012-01-01

    The error bound in probability between the approximate maximum likelihood estimator (AMLE) and the continuous maximum likelihood estimator (MLE) is investigated for a nonlinear nonhomogeneous stochastic system with an unknown parameter. The rates of convergence of the approximations for the Itô and ordinary integrals are introduced under some regularity assumptions. Based on these results, the in-probability rate of convergence of the approximate log-likelihood function to the true continuous log-likelihoo...

  8. Covariant approximation averaging

    CERN Document Server

    Shintani, Eigo; Blum, Thomas; Izubuchi, Taku; Jung, Chulwoo; Lehner, Christoph

    2014-01-01

    We present a new class of statistical error reduction techniques for Monte-Carlo simulations. Using covariant symmetries, we show that correlation functions can be constructed from inexpensive approximations without introducing any systematic bias in the final result. We introduce a new class of covariant approximation averaging techniques, known as all-mode averaging (AMA), in which the approximation takes account of contributions of all eigenmodes through the inverse of the Dirac operator computed from the conjugate gradient method with a relaxed stopping condition. In this paper we compare the performance and computational cost of our new method with traditional methods using correlation functions and masses of the pion, nucleon, and vector meson in $N_f=2+1$ lattice QCD using domain-wall fermions. This comparison indicates that AMA significantly reduces statistical errors in Monte-Carlo calculations over conventional methods for the same cost.

  9. Diophantine approximations on fractals

    CERN Document Server

    Einsiedler, Manfred; Shapira, Uri

    2009-01-01

    We exploit dynamical properties of diagonal actions to derive results in Diophantine approximations. In particular, we prove that the continued fraction expansion of almost any point on the middle third Cantor set (with respect to the natural measure) contains all finite patterns (hence is well approximable). Similarly, we show that for a variety of fractals in [0,1]^2, possessing some symmetry, almost any point is not Dirichlet improvable (hence is well approximable) and has property C (after Cassels). We then settle by similar methods a conjecture of M. Boshernitzan saying that there are no irrational numbers x in the unit interval such that the continued fraction expansions of {nx mod1 : n is a natural number} are uniformly eventually bounded.

  10. Development of a clinically feasible logMAR alternative to the Snellen chart: performance of the "compact reduced logMAR" visual acuity chart in amblyopic children.

    Science.gov (United States)

    Laidlaw, D A H; Abbott, A; Rosser, D A

    2003-10-01

    The "compact reduced logMAR" (cRLM) chart is being developed as a logMAR alternative to the Snellen chart. It is closer spaced and has fewer letters per line than conventional logMAR charts. Information regarding the performance of such a chart in amblyopes and children is therefore required. This study aimed to investigate the performance of the cRLM chart in amblyopic children. Timed test and retest measurements using two versions of each chart design were obtained on the amblyopic eye of 43 children. Using the methods of Bland and Altman the agreement, test-retest variability (95% confidence limits for agreement, TRV) and test time of the cRLM and the current clinical standard Snellen chart were compared to the gold standard ETDRS logMAR chart. No systematic bias between chart designs was found. For line assignment scoring the respective TRVs were 0.20 logMAR, 0.20 logMAR, and 0.30 logMAR. Single letter scoring TRVs were cRLM (95% CL 0.17) logMAR, ETDRS (95% CL 0.14) logMAR, and Snellen (95% CL 0.29) logMAR. Median testing times were ETDRS 60 seconds, cRLM 40 seconds, Snellen 30 seconds. The sensitivity to change of the cRLM equalled or approached that of the gold standard ETDRS and was at least 50% better than that of Snellen. This enhanced sensitivity to change was at the cost of only a 10 second time penalty compared to Snellen. The cRLM chart was approximately half the width of the ETDRS chart. The cRLM chart may represent a clinically acceptable compromise between the desire to obtain logMAR acuities of reasonable and known sensitivity to change, chart size, and testing time.

  11. Monotone Boolean approximation

    Energy Technology Data Exchange (ETDEWEB)

    Hulme, B.L.

    1982-12-01

    This report presents a theory of approximation of arbitrary Boolean functions by simpler, monotone functions. Monotone increasing functions can be expressed without the use of complements. Nonconstant monotone increasing functions are important in their own right since they model a special class of systems known as coherent systems. It is shown here that when Boolean expressions for noncoherent systems become too large to treat exactly, then monotone approximations are easily defined. The algorithms proposed here not only provide simpler formulas but also produce best possible upper and lower monotone bounds for any Boolean function. This theory has practical application for the analysis of noncoherent fault trees and event tree sequences.
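
    For a toy truth table, the best monotone bounds described above can be built directly: the tightest monotone lower bound is g(x) = min over y ≥ x of f(y), and the tightest monotone upper bound is h(x) = max over y ≤ x of f(y). The brute-force sketch below uses these closed forms and is not the report's algorithms; the XOR example is an arbitrary noncoherent structure function chosen for illustration.

        # Illustrative brute-force construction (not the report's algorithms) of the
        # best monotone increasing lower and upper bounds of f: {0,1}^n -> {0,1}:
        #   lower(x) = min_{y >= x} f(y)   (largest monotone function <= f)
        #   upper(x) = max_{y <= x} f(y)   (smallest monotone function >= f)
        from itertools import product

        def leq(y, x):
            """Componentwise y <= x on 0/1 tuples."""
            return all(a <= b for a, b in zip(y, x))

        def monotone_bounds(f, n):
            points = list(product((0, 1), repeat=n))
            lower = {x: min(f(y) for y in points if leq(x, y)) for x in points}
            upper = {x: max(f(y) for y in points if leq(y, x)) for x in points}
            return lower, upper

        # Toy noncoherent structure function: XOR of two components.
        f = lambda x: x[0] ^ x[1]
        lower, upper = monotone_bounds(f, 2)
        for x in sorted(lower):
            print(x, "f =", f(x), " lower =", lower[x], " upper =", upper[x])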

  12. Dynamic and approximate pattern matching in 2D

    DEFF Research Database (Denmark)

    Clifford, Raphaël; Fontaine, Allyx; Starikovskaya, Tatiana

    2016-01-01

    updates can be performed in O(log² n) time and queries in O(log² m) time. - We then consider a model where an update is a new 2D pattern and a query is a location in the text. For this setting we show that Hamming distance queries can be answered in O(log m + H) time, where H is the relevant Hamming distance. - Extending this work to allow approximation, we give an efficient algorithm which returns a (1+ε) approximation of the Hamming distance at a given location in O(ε⁻² log² m log log n) time. Finally, we consider a different setting inspired by previous work on locality sensitive hashing (LSH). Given a threshold k and after building the 2D text index and receiving a 2D query pattern, we must output a location where the Hamming distance is at most (1 + ε)k as long as there exists a location where the Hamming distance is at most k. - For our LSH inspired 2D indexing problem, the text can

  13. logR-logT Figure and Unknown Solar System Planets

    Institute of Scientific and Technical Information of China (English)

    潘彩娟; 王小波; 韦鸿铭

    2011-01-01

    Using the logR-logT figure to study planet-satellite systems in the Solar System, we find that the motion of the satellites accords with Kepler's Third Law, and we derive the relationship between the intercept of the logR-logT line of a planet-satellite system and the mass of the central body. By plotting the logR-logT figures of an unknown planet-satellite system, a Mercury-satellite system and a Venus-satellite system, we estimate the mass and orbital period of the unknown planet, as well as the possible locations and periods of satellites of Mercury or Venus.

  14. Prestack wavefield approximations

    KAUST Repository

    Alkhalifah, Tariq

    2013-09-01

    The double-square-root (DSR) relation offers a platform to perform prestack imaging using an extended single wavefield that honors the geometrical configuration between sources, receivers, and the image point, or in other words, prestack wavefields. Extrapolating such wavefields, nevertheless, suffers from limitations. Chief among them is the singularity associated with horizontally propagating waves. I have devised approximations that are free of such singularities and highly accurate. Specifically, I use Padé expansions with denominators given by a power series that is an order lower than that of the numerator, and thus, introduce a free variable to balance the series order and normalize the singularity. For the higher-order Padé approximation, the errors are negligible. Additional simplifications, like recasting the DSR formula as a function of scattering angle, allow for a singularity free form that is useful for constant-angle-gather imaging. A dynamic form of this DSR formula can be supported by kinematic evaluations of the scattering angle to provide efficient prestack wavefield construction. Applying a similar approximation to the dip angle yields an efficient 1D wave equation with the scattering and dip angles extracted from, for example, DSR ray tracing. Application to the complex Marmousi data set demonstrates that these approximations, although they may provide less than optimal results, allow for efficient and flexible implementations. © 2013 Society of Exploration Geophysicists.

  15. On Convex Quadratic Approximation

    NARCIS (Netherlands)

    den Hertog, D.; de Klerk, E.; Roos, J.

    2000-01-01

    In this paper we prove the counterintuitive result that the quadratic least squares approximation of a multivariate convex function in a finite set of points is not necessarily convex, even though it is convex for a univariate convex function. This result has many consequences both for the field of

  17. Log-periodic behavior in a forest-fire model

    Directory of Open Access Journals (Sweden)

    B. D. Malamud

    2005-01-01

    Full Text Available This paper explores log-periodicity in a forest-fire cellular-automata model. At each time step of this model a tree is dropped on a randomly chosen site; if the site is unoccupied, the tree is planted. Then, for a given sparking frequency, matches are dropped on a randomly chosen site; if the site is occupied by a tree, the tree ignites and an 'instantaneous' model fire consumes that tree and all adjacent trees. The resultant frequency-area distribution for the small and medium model fires is a power-law. However, if we consider very small sparking frequencies, the large model fires that span the square grid are dominant, and we find that the peaks in the frequency-area distribution of these large fires satisfy log-periodic scaling to a good approximation. This behavior can be examined using a simple mean-field model, where in time, the density of trees on the grid exponentially approaches unity. This exponential behavior coupled with a periodic or near-periodic sparking frequency also generates a sequence of peaks in the frequency-area distribution of large fires that satisfy log-periodic scaling. We conclude that the forest-fire model might provide a relatively simple explanation for the log-periodic behavior often seen in nature.
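
    A compact sketch of the cellular-automaton rules summarized above (one tree drop per step, a match every fixed number of tree drops, instantaneous burning of the connected cluster) is given below; grid size, sparking frequency and number of steps are illustrative choices, not the values used in the paper.

        # Illustrative sketch of the forest-fire cellular automaton described above:
        # each step a tree is dropped on a random site (planted if empty); every
        # MATCH_INTERVAL tree drops a match is dropped on a random site, and if it
        # hits a tree the whole 4-connected cluster burns instantaneously.
        import random
        from collections import Counter, deque

        SIZE = 64               # grid is SIZE x SIZE
        MATCH_INTERVAL = 200    # one match per 200 tree drops (sparking frequency 1/200)
        STEPS = 200_000

        grid = [[0] * SIZE for _ in range(SIZE)]
        fire_sizes = []

        def burn(i, j):
            """Burn the 4-connected cluster of trees containing (i, j); return its size."""
            queue, burned = deque([(i, j)]), 0
            grid[i][j] = 0
            while queue:
                a, b = queue.popleft()
                burned += 1
                for da, db in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    x, y = a + da, b + db
                    if 0 <= x < SIZE and 0 <= y < SIZE and grid[x][y]:
                        grid[x][y] = 0
                        queue.append((x, y))
            return burned

        for step in range(1, STEPS + 1):
            i, j = random.randrange(SIZE), random.randrange(SIZE)
            grid[i][j] = 1      # plant a tree; an already occupied site is unchanged
            if step % MATCH_INTERVAL == 0:
                i, j = random.randrange(SIZE), random.randrange(SIZE)
                if grid[i][j]:
                    fire_sizes.append(burn(i, j))

        # Crude log-binned frequency-area statistics (one bin per decade of fire size).
        decades = Counter(len(str(s)) for s in fire_sizes)
        for d in sorted(decades):
            print(f"fires with {10 ** (d - 1)}-{10 ** d - 1} trees: {decades[d]}")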

  18. Face logging in Copenhagen Limestone, Denmark

    DEFF Research Database (Denmark)

    Jakobsen, Lisa; Foged, Niels Nielsen; Erichsen, Lars;

    2015-01-01

    The requirement for excavation support can be assessed from face logging. Face logs can also improve our knowledge of lithological and structural conditions within bedrock and supplement information from boreholes and geophysical logs. During the construction of an 8 km metro tunnel and a 4 km heating tunnel in Copenhagen, more than 2.5 km of face logs were made at 467 locations at underground stations, shafts, caverns and along bored tunnels. Over 160 geotechnical boreholes, many with geophysical logging, were executed prior to the construction works. The bedrock consists of Paleogene "Copenhagen limestone" and face logs show a sub-horizontally layered structure, with alternating extremely weak to extremely strong beds of variable thickness. The rhythmicity is thought to be climatically controlled. Stronger beds represent reduced sedimentation rates related to climatic deterioration while weaker beds result from

  19. MultiLog: a tool for the control and output merging of multiple logging applications.

    Science.gov (United States)

    Woodruff, Jonathan; Alexander, Jason

    2016-12-01

    MultiLog is a logging tool that controls, gathers, and combines the output, on-the-fly, from existing research and commercial logging applications or "loggers." Loggers record a specific set of user actions on a computing device, helping researchers to better understand environments or interactions, guiding the design of new or improved interfaces and applications. MultiLog reduces researchers' required implementation effort by simplifying the set-up of multiple loggers and seamlessly combining their output. This in turn increases the availability of logging systems to non-technical experimenters for both short-term and longitudinal observation studies. MultiLog supports two operating modes: "researcher mode" where experimenters configure multiple logging systems, and "deployment mode" where the system is deployed to user-study participants' systems. Researcher mode allows researchers to install, configure log filtering and obfuscation, observe near real-time event streams, and save configuration files ready for deployment. Deployment mode simplifies data collection from multiple loggers by running in the system tray at user log-in, starting loggers, combining their output, and securely uploading the data to a web-server. It also supports real-time browsing of log data, pausing of logging, and removal of log lines. Performance evaluations show that MultiLog does not adversely affect system performance, even when simultaneously running several logging systems. Initial studies show the system runs reliably over a period of 10 weeks.

  20. Bounds and approximations for sums of dependent log-elliptical random variables

    NARCIS (Netherlands)

    Valdez, E.A.; Dhaene, J.; Maj, M.; Vanduffel, S.

    2009-01-01

    Dhaene, Denuit, Goovaerts, Kaas and Vyncke [Dhaene, J., Denuit, M., Goovaerts, M.J., Kaas, R., Vyncke, D., 2002a. The concept of comonotonicity in actuarial science and finance: theory. Insurance Math. Econom. 31 (1), 3-33; Dhaene, J., Denuit, M., Goovaerts, M.J., Kaas, R., Vyncke, D., 2002b. The co

  1. Logging data representation based on XML

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    As an open standard for data representation, XML breathes new energy into Web applications and network computing. The development, advantages and status of XML and some standards relating to XML are reviewed. In addition, the authors put forward a method for representing logging data, use UML to establish the conceptual and logical models of logging data and, using a set of logging data, explain how to establish the model as well as how to use XML to display and process geological data.

  2. Topology, calculus and approximation

    CERN Document Server

    Komornik, Vilmos

    2017-01-01

    Presenting basic results of topology, calculus of several variables, and approximation theory which are rarely treated in a single volume, this textbook includes several beautiful, but almost forgotten, classical theorems of Descartes, Erdős, Fejér, Stieltjes, and Turán. The exposition style of Topology, Calculus and Approximation follows the Hungarian mathematical tradition of Paul Erdős and others. In the first part, the classical results of Alexandroff, Cantor, Hausdorff, Helly, Peano, Radon, Tietze and Urysohn illustrate the theories of metric, topological and normed spaces. Following this, the general framework of normed spaces and Carathéodory's definition of the derivative are shown to simplify the statement and proof of various theorems in calculus and ordinary differential equations. The third and final part is devoted to interpolation, orthogonal polynomials, numerical integration, asymptotic expansions and the numerical solution of algebraic and differential equations. Students of both pure an...

  3. Prestack traveltime approximations

    KAUST Repository

    Alkhalifah, Tariq Ali

    2011-01-01

    Most prestack traveltime relations we tend to work with are based on homogeneous (or semi-homogeneous, possibly effective) media approximations. This includes the multi-focusing or double square-root (DSR) and the common reflection stack (CRS) equations. Using the DSR equation, I analyze the associated eikonal form in the general source-receiver domain. Like its wave-equation counterpart, it suffers from a critical singularity for horizontally traveling waves. As a result, I derive expansion-based solutions of this eikonal using polynomial expansions in terms of the reflection and dip angles in a generally inhomogeneous background medium. These approximate solutions are free of singularities and can be used to estimate traveltimes for small to moderate offsets (or reflection angles) in a generally inhomogeneous medium. A Marmousi example demonstrates the usefulness of the approach. © 2011 Society of Exploration Geophysicists.
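
    For orientation, the double square-root relation mentioned above takes the following standard form in a homogeneous medium of velocity v (a textbook sketch, not the inhomogeneous eikonal actually analyzed in the paper); t_0 is the one-way vertical time to the image point at lateral position y, and x_s, x_r are the source and receiver positions:

        \begin{equation}
          t(x_s, x_r; y) = \sqrt{t_0^2 + \frac{(x_s - y)^2}{v^2}}
                         + \sqrt{t_0^2 + \frac{(x_r - y)^2}{v^2}}
        \end{equation}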

  4. Flow rate logging seepage meter

    Science.gov (United States)

    Reay, William G. (Inventor); Walthall, Harry G. (Inventor)

    1996-01-01

    An apparatus for remotely measuring and logging the flow rate of groundwater seepage into surface water bodies. As groundwater seeps into a cavity created by a bottomless housing, it displaces water through an inlet and into a waterproof sealed upper compartment, at which point, the water is collected by a collection bag, which is contained in a bag chamber. A magnet on the collection bag approaches a proximity switch as the collection bag fills, and eventually enables the proximity switch to activate a control circuit. The control circuit then rotates a three-way valve from the collection path to a discharge path, enables a data logger to record the time, and enables a pump, which discharges the water from the collection bag, through the three-way valve and pump, and into the sea. As the collection bag empties, the magnet leaves the proximity of the proximity switch, and the control circuit turns off the pump, resets the valve to provide a collection path, and restarts the collection cycle.

  5. Likelihood Approximation With Hierarchical Matrices For Large Spatial Datasets

    KAUST Repository

    Litvinenko, Alexander

    2017-09-03

    We use available measurements to estimate the unknown parameters (variance, smoothness parameter, and covariance length) of a covariance function by maximizing the joint Gaussian log-likelihood function. To overcome cubic complexity in the linear algebra, we approximate the discretized covariance function in the hierarchical (H-) matrix format. The H-matrix format has a log-linear computational cost and storage O(kn log n), where the rank k is a small integer and n is the number of locations. The H-matrix technique allows us to work with general covariance matrices in an efficient way, since H-matrices can approximate inhomogeneous covariance functions, with a fairly general mesh that is not necessarily axes-parallel, and neither the covariance matrix itself nor its inverse have to be sparse. We demonstrate our method with Monte Carlo simulations and an application to soil moisture data. The C, C++ codes and data are freely available.
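
    The objective being maximized is the standard Gaussian log-likelihood, shown here for context for zero-mean data; C(θ) is the covariance matrix assembled from the covariance function with parameters θ, and z is the vector of n observations:

        \begin{equation}
          \ell(\theta) = -\frac{n}{2}\log(2\pi)
                         - \frac{1}{2}\log\det C(\theta)
                         - \frac{1}{2}\, z^{\top} C(\theta)^{-1} z
        \end{equation}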

  6. Optimization and approximation

    CERN Document Server

    Pedregal, Pablo

    2017-01-01

    This book provides a basic, initial resource, introducing science and engineering students to the field of optimization. It covers three main areas: mathematical programming, calculus of variations and optimal control, highlighting the ideas and concepts and offering insights into the importance of optimality conditions in each area. It also systematically presents affordable approximation methods. Exercises at various levels have been included to support the learning process.

  7. Topics in Metric Approximation

    Science.gov (United States)

    Leeb, William Edward

    This thesis develops effective approximations of certain metrics that occur frequently in pure and applied mathematics. We show that distances that often arise in applications, such as the Earth Mover's Distance between two probability measures, can be approximated by easily computed formulas for a wide variety of ground distances. We develop simple and easily computed characterizations both of norms measuring a function's regularity -- such as the Lipschitz norm -- and of their duals. We are particularly concerned with the tensor product of metric spaces, where the natural notion of regularity is not the Lipschitz condition but the mixed Lipschitz condition. A theme that runs throughout this thesis is that snowflake metrics (metrics raised to a power less than 1) are often better-behaved than ordinary metrics. For example, we show that snowflake metrics on finite spaces can be approximated by the average of tree metrics with a distortion bounded by intrinsic geometric characteristics of the space and not the number of points. Many of the metrics for which we characterize the Lipschitz space and its dual are snowflake metrics. We also present applications of the characterization of certain regularity norms to the problem of recovering a matrix that has been corrupted by noise. We are able to achieve an optimal rate of recovery for certain families of matrices by exploiting the relationship between mixed-variable regularity conditions and the decay of a function's coefficients in a certain orthonormal basis.

  8. Spin Polarized Photons from Axially Charged Plasma at Weak Coupling: Complete Leading Order

    CERN Document Server

    Mamo, Kiminad A

    2015-01-01

    In the presence of (approximately conserved) axial charge in the QCD plasma at finite temperature, the emitted photons are spin-aligned, which is a unique P- and CP-odd signature of axial charge in the photon emission observables. We compute this "P-odd photon emission rate" in the weak-coupling regime in the high-temperature limit to complete leading order in the QCD coupling constant: the leading log as well as the constant under the log. As for the P-even total emission rate in the literature, the computation of the P-odd emission rate at leading order consists of three parts: 1) Compton and Pair Annihilation processes with hard momentum exchange, 2) soft t- and u-channel contributions with Hard Thermal Loop re-summation, 3) Landau-Pomeranchuk-Migdal (LPM) re-summation of collinear Bremsstrahlung and Pair Annihilation. We present analytical and numerical evaluations of these contributions to our P-odd photon emission rate observable.

  9. Exact and Approximate Sizes of Convex Datacubes

    Science.gov (United States)

    Nedjar, Sébastien

    In various approaches, data cubes are pre-computed in order to answer OLAP queries efficiently. The notion of data cube has been explored in various ways: iceberg cubes, range cubes, differential cubes or emerging cubes. Previously, we have introduced the concept of convex cube, which generalizes all the quoted variants of cubes. More precisely, the convex cube captures all the tuples satisfying a monotone and/or antimonotone constraint combination. This paper is dedicated to a study of the convex cube size. Actually, knowing the size of such a cube even before computing it has various advantages. First of all, free space can be saved for its storage and the data warehouse administration can be improved. However, the main interest of this size knowledge is to choose at best the constraints to apply in order to get a workable result. To aid the calibration of constraints, we propose a sound characterization, based on the inclusion-exclusion principle, of the exact size of the convex cube, as well as an upper bound which can be yielded very quickly. Moreover, we adapt the nearly optimal algorithm HyperLogLog in order to provide a very good approximation of the exact size of convex cubes. Our analytical results are confirmed by experiments: the approximated size of convex cubes is really close to their exact size and can be computed quasi-immediately.
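
    As an illustration of the counting idea mentioned above, the following is a minimal, self-contained HyperLogLog-style estimator in Python (a sketch of the standard algorithm, not the adaptation used in the paper; the function name and register count are arbitrary):

        import hashlib
        import math

        def hyperloglog_estimate(items, b=10):
            """Minimal HyperLogLog-style cardinality estimate (illustrative sketch)."""
            m = 1 << b                       # number of registers
            registers = [0] * m
            for item in items:
                h = int(hashlib.sha1(str(item).encode()).hexdigest(), 16)  # 160-bit hash
                idx = h & (m - 1)            # low b bits select a register
                w = h >> b                   # remaining bits feed the rank statistic
                # rank = 1-based position of the lowest set bit (geometric under a uniform hash)
                rank = (w & -w).bit_length() if w else (160 - b + 1)
                registers[idx] = max(registers[idx], rank)
            alpha = 0.7213 / (1.0 + 1.079 / m)             # bias correction, valid for m >= 128
            raw = alpha * m * m / sum(2.0 ** -r for r in registers)
            zeros = registers.count(0)
            if raw <= 2.5 * m and zeros:                   # small-range (linear counting) correction
                return m * math.log(m / zeros)
            return raw

        # e.g. hyperloglog_estimate(f"key-{i % 50000}" for i in range(200000)) is close to 50000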

  10. Approximate option pricing

    Energy Technology Data Exchange (ETDEWEB)

    Chalasani, P.; Saias, I. [Los Alamos National Lab., NM (United States); Jha, S. [Carnegie Mellon Univ., Pittsburgh, PA (United States)

    1996-04-08

    As increasingly large volumes of sophisticated options (called derivative securities) are traded in world financial markets, determining a fair price for these options has become an important and difficult computational problem. Many valuation codes use the binomial pricing model, in which the stock price is driven by a random walk. In this model, the value of an n-period option on a stock is the expected time-discounted value of the future cash flow on an n-period stock price path. Path-dependent options are particularly difficult to value since the future cash flow depends on the entire stock price path rather than on just the final stock price. Currently such options are approximately priced by Monte Carlo methods with error bounds that hold only with high probability and which are reduced by increasing the number of simulation runs. In this paper the authors show that pricing an arbitrary path-dependent option is #P-hard. They show that certain types of path-dependent options can be valued exactly in polynomial time. Asian options are path-dependent options that are particularly hard to price, and for these they design deterministic polynomial-time approximation algorithms. They show that the value of a perpetual American put option (which can be computed in constant time) is in many cases a good approximation to the value of an otherwise identical n-period American put option. In contrast to Monte Carlo methods, the algorithms have guaranteed error bounds that are polynomially small (and in some cases exponentially small) in the maturity n. For the error analysis they derive large-deviation results for random walks that may be of independent interest.
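
    As a concrete illustration of Monte Carlo pricing of a path-dependent (Asian) option in the binomial model described above, here is a small sketch; the function name and parameters are hypothetical, and p_up should be the risk-neutral up-probability for an arbitrage-free price:

        import random

        def asian_call_binomial_mc(s0, strike, up, down, p_up, rate,
                                   n_periods, n_paths=100000, seed=0):
            """Monte Carlo price of an arithmetic-average Asian call in an n-period binomial model."""
            rng = random.Random(seed)
            discount = (1.0 + rate) ** (-n_periods)
            total_payoff = 0.0
            for _ in range(n_paths):
                price, path_sum = s0, 0.0
                for _ in range(n_periods):
                    price *= up if rng.random() < p_up else down   # one binomial step
                    path_sum += price
                average = path_sum / n_periods                     # path-dependent quantity
                total_payoff += max(average - strike, 0.0)
            return discount * total_payoff / n_paths

        # e.g. asian_call_binomial_mc(100.0, 100.0, 1.01, 0.99, 0.5, 0.0005, 60)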

  11. Numerical Modeling of Electroacoustic Logging Including Joule Heating

    Science.gov (United States)

    Plyushchenkov, Boris D.; Nikitin, Anatoly A.; Turchaninov, Victor I.

    It is well known that an electromagnetic field excites acoustic waves in a porous elastic medium saturated with a fluid electrolyte, due to the electrokinetic conversion effect. Pride's equations describing this process are written in the isothermal approximation. An update of these equations, which takes the influence of Joule heating on acoustic wave propagation into account, is proposed here. This update includes terms describing the initiation of additional acoustic waves excited by thermoelastic stresses, and the heat conduction equation with a right-hand side defined by Joule heating. Results of numerical modeling of several problems of propagation of acoustic waves excited by an electric field source, with and without consideration of the Joule heating effect, are presented. From these results, it follows that the influence of Joule heating should be taken into account in the numerical simulation of electroacoustic logging and in the interpretation of its log data.
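
    A generic form of the heat-conduction equation with a Joule-heating source, of the kind described above (a sketch only; the paper's exact coefficients and coupling to Pride's equations are not reproduced here), is

        \begin{equation}
          \rho C_p \,\frac{\partial T}{\partial t}
          = \nabla \cdot \left( \kappa \nabla T \right) + \sigma \lvert \mathbf{E} \rvert^{2},
        \end{equation}

    where σ|E|² is the Joule heating produced by the electric field E in a medium of electrical conductivity σ, κ is the thermal conductivity, and ρC_p the volumetric heat capacity.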

  12. Frankenstein's Glue: Transition functions for approximate solutions

    CERN Document Server

    Yunes, N

    2006-01-01

    Approximations are commonly employed to find approximate solutions to the Einstein equations. These solutions, however, are usually only valid in some specific spacetime region. A global solution can be constructed by gluing approximate solutions together, but this procedure is difficult because discontinuities can arise, leading to large violations of the Einstein equations. In this paper, we provide an attempt to formalize this gluing scheme by studying transition functions that join approximate solutions together. In particular, we propose certain sufficient conditions on these functions and prove that these conditions guarantee that the joined solution still satisfies the Einstein equations to the same order as the approximate ones. An example is also provided for a binary system of non-spinning black holes, where the approximate solutions are taken to be given by a post-Newtonian expansion and a perturbed Schwarzschild solution. For this specific case, we show that if the transition functions satisfy the...

  13. Finite elements and approximation

    CERN Document Server

    Zienkiewicz, O C

    2006-01-01

    A powerful tool for the approximate solution of differential equations, the finite element is extensively used in industry and research. This book offers students of engineering and physics a comprehensive view of the principles involved, with numerous illustrative examples and exercises.Starting with continuum boundary value problems and the need for numerical discretization, the text examines finite difference methods, weighted residual methods in the context of continuous trial functions, and piecewise defined trial functions and the finite element method. Additional topics include higher o

  14. Diffusion approximation of Lévy processes with a view towards finance

    KAUST Repository

    Kiessling, Jonas

    2011-01-01

    Let the (log-)prices of a collection of securities be given by a d-dimensional Lévy process X_t having infinite activity and a smooth density. The value of a European contract with payoff g(x) maturing at T is determined by E[g(X_T)]. Let X̄_T be a finite activity approximation to X_T, where diffusion is introduced to approximate jumps smaller than a given truncation level ε > 0. The main result of this work is a derivation of an error expansion for the resulting model error, E[g(X_T) - g(X̄_T)], with computable leading order term. Our estimate depends both on the choice of truncation level ε and the contract payoff g, and it is valid even when g is not continuous. Numerical experiments confirm that the error estimate is indeed a good approximation of the model error. Using similar techniques we indicate how to construct an adaptive truncation type approximation. Numerical experiments indicate that a substantial amount of work is to be gained from such adaptive approximation. Finally, we extend the previous model error estimates to the case of Barrier options, which have a particular path dependent structure. © de Gruyter 2011.

  15. 40 CFR 91.412 - Data logging.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Data logging. 91.412 Section 91.412 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CONTROL OF EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test Procedures § 91.412 Data logging. (a) A computer or any other automatic...

  16. 40 CFR 89.409 - Data logging.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Data logging. 89.409 Section 89.409 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CONTROL OF EMISSIONS FROM NEW AND IN-USE NONROAD COMPRESSION-IGNITION ENGINES Exhaust Emission Test Procedures § 89.409 Data logging. (a) A computer or...

  17. 40 CFR 90.412 - Data logging.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Data logging. 90.412 Section 90.412 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CONTROL OF EMISSIONS FROM NONROAD SPARK-IGNITION ENGINES AT OR BELOW 19 KILOWATTS Gaseous Exhaust Test Procedures § 90.412 Data logging. (a) A computer...

  18. New Achievements in Well Logging Technology

    Institute of Scientific and Technical Information of China (English)

    Tan Tingdong

    1996-01-01

    In the first five years of the 1990s, new achievements in China's onshore well logging technology enhanced the benefits of exploration and development for complex oil and gas reservoirs, and thus indicated the direction in which China's well logging technology would develop towards the end of the century.

  19. LHCb Online Log Analysis and Maintenance System

    CERN Document Server

    Garnier, J-C; Neufeld, N; Nikolaidis, F

    2011-01-01

    History has shown, many times, that computer logs are the only information an administrator may have about an incident, which could be caused either by a malfunction or by an attack. Due to the huge amount of logs that are produced by large-scale IT infrastructures, such as LHCb Online, critical information may be overlooked or simply drowned in a sea of other messages. This clearly demonstrates the need for an automatic system for long-term maintenance and real-time analysis of the logs. We have constructed a low-cost, fault-tolerant centralized logging system which is able to do in-depth analysis and cross-correlation of every log. This system is capable of handling O(10000) different log sources and numerous formats, while trying to keep the overhead as low as possible. It provides log gathering and management, offline analysis and online analysis. We call offline analysis the procedure of analyzing old logs for critical information, while online analysis refers to the procedure of early alerting and reacting. ...

  20. Chernoff's density is log-concave.

    Science.gov (United States)

    Balabdaoui, Fadoua; Wellner, Jon A

    2014-02-01

    We show that the density of Z = argmax_t {W(t) - t^2}, sometimes known as Chernoff's density, is log-concave. We conjecture that Chernoff's density is strongly log-concave or "super-Gaussian", and provide evidence in support of the conjecture.

  1. Palm distributions for log Gaussian Cox processes

    DEFF Research Database (Denmark)

    Coeurjolly, Jean-Francois; Møller, Jesper; Waagepetersen, Rasmus

    This paper reviews useful results related to Palm distributions of spatial point processes and provides a new result regarding the characterization of Palm distributions for the class of log Gaussian Cox processes. This result is used to study functional summary statistics for a log Gaussian Cox...

  2. Aggregation of log-linear risks

    DEFF Research Database (Denmark)

    Embrechts, Paul; Hashorva, Enkeleijd; Mikosch, Thomas Valentin

    2014-01-01

    In this paper we work in the framework of a k-dimensional vector of log-linear risks. Under weak conditions on the marginal tails and the dependence structure of a vector of positive risks, we derive the asymptotic tail behaviour of the aggregated risk and present an application concerning log...

  3. Approximate Bayesian computation.

    Directory of Open Access Journals (Sweden)

    Mikael Sunnåker

    Full Text Available Approximate Bayesian computation (ABC) constitutes a class of computational methods rooted in Bayesian statistics. In all model-based statistical inference, the likelihood function is of central importance, since it expresses the probability of the observed data under a particular statistical model, and thus quantifies the support data lend to particular values of parameters and to choices among different models. For simple models, an analytical formula for the likelihood function can typically be derived. However, for more complex models, an analytical formula might be elusive or the likelihood function might be computationally very costly to evaluate. ABC methods bypass the evaluation of the likelihood function. In this way, ABC methods widen the realm of models for which statistical inference can be considered. ABC methods are mathematically well-founded, but they inevitably make assumptions and approximations whose impact needs to be carefully assessed. Furthermore, the wider application domain of ABC exacerbates the challenges of parameter estimation and model selection. ABC has rapidly gained popularity over recent years, in particular for the analysis of complex problems arising in the biological sciences (e.g., in population genetics, ecology, epidemiology, and systems biology).
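
    A minimal ABC rejection sampler, to make the "bypass the likelihood" idea concrete (a generic sketch; the callables, tolerance and toy example below are illustrative, not tied to any particular ABC software):

        import random

        def abc_rejection(observed_summary, simulate, summarize, prior_sample,
                          n_draws=10000, tolerance=0.1):
            """Basic ABC rejection sampling over user-supplied callables."""
            accepted = []
            for _ in range(n_draws):
                theta = prior_sample()            # draw a parameter from the prior
                data = simulate(theta)            # simulate data under the model
                if abs(summarize(data) - observed_summary) <= tolerance:
                    accepted.append(theta)        # keep parameters that reproduce the summary
            return accepted

        # Toy usage: infer the mean of a normal with known sd = 1 from its sample mean.
        posterior_draws = abc_rejection(
            observed_summary=0.8,
            simulate=lambda mu: [random.gauss(mu, 1.0) for _ in range(50)],
            summarize=lambda xs: sum(xs) / len(xs),
            prior_sample=lambda: random.uniform(-5.0, 5.0),
        )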

  4. Write-Combined Logging: An Optimized Logging for Consistency in NVRAM

    OpenAIRE

    Wenzhe Zhang; Kai Lu; Mikel Luján; Xiaoping Wang; Xu Zhou

    2015-01-01

    Nonvolatile memory (e.g., Phase Change Memory) blurs the boundary between memory and storage and it could greatly facilitate the construction of in-memory durable data structures. Data structures can be processed and stored directly in NVRAM. To maintain the consistency of persistent data, logging is a widely adopted mechanism. However, logging introduces write-twice overhead. This paper introduces an optimized write-combined logging to reduce the writes to NVRAM log. By leveraging the fast-r...

  5. Designing and Piloting a Leadership Daily Practice Log: Using Logs to Study the Practice of Leadership

    Science.gov (United States)

    Spillane, James P.; Zuberi, Anita

    2009-01-01

    Purpose: This article aims to validate the Leadership Daily Practice (LDP) log, an instrument for conducting research on leadership in schools. Research Design: Using a combination of data sources--namely, a daily practice log, observations, and open-ended cognitive interviews--the authors evaluate the validity of the LDP log. Participants: Formal…

  6. Dynamical evolution of active detached binaries on log Jo - log M diagram and contact binary formation

    CERN Document Server

    Eker, Z; Bilir, S; Karatas, Y

    2006-01-01

    Orbital angular momentum (Jo), systemic mass (M) and orbital period (P) distributions of chromospherically active binaries (CAB) and W Ursae Majoris (W UMa) systems were investigated. The diagrams of log Jo - log P, log M - log P and log Jo - log M were formed from 119 CAB and 102 W UMa stars. The log Jo - log M diagram is found to be most meaningful in demonstrating the dynamical evolution of binary star orbits. A slightly curved borderline (contact border) separating the detached and the contact systems was discovered on the log Jo - log M diagram. Since the orbital size (a) and period (P) of binaries are determined by their current Jo, M and mass ratio q, the rates of orbital angular momentum loss (dlog Jo/dt) and mass loss (dlog M/dt) are the primary parameters determining the direction and the speed of the dynamical evolution. A detached system becomes a contact system if its own dynamical evolution enables it to pass the contact border on the log Jo - log M diagram. The evolution of q for a mass-losing detached system is...

  7. A Resampling-Based Stochastic Approximation Method for Analysis of Large Geostatistical Data

    KAUST Repository

    Liang, Faming

    2013-03-01

    The Gaussian geostatistical model has been widely used in modeling of spatial data. However, it is challenging to computationally implement this method because it requires the inversion of a large covariance matrix, particularly when there is a large number of observations. This article proposes a resampling-based stochastic approximation method to address this challenge. At each iteration of the proposed method, a small subsample is drawn from the full dataset, and then the current estimate of the parameters is updated accordingly under the framework of stochastic approximation. Since the proposed method makes use of only a small proportion of the data at each iteration, it avoids inverting large covariance matrices and thus is scalable to large datasets. The proposed method also leads to a general parameter estimation approach, maximum mean log-likelihood estimation, which includes the popular maximum (log)-likelihood estimation (MLE) approach as a special case and is expected to play an important role in analyzing large datasets. Under mild conditions, it is shown that the estimator resulting from the proposed method converges in probability to a set of parameter values of equivalent Gaussian probability measures, and that the estimator is asymptotically normally distributed. To the best of the authors' knowledge, the present study is the first one on asymptotic normality under infill asymptotics for general covariance functions. The proposed method is illustrated with large datasets, both simulated and real. Supplementary materials for this article are available online. © 2013 American Statistical Association.
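
    A generic Robbins-Monro-style sketch of the subsample-and-update idea described above (illustrative only; the paper's update for the Gaussian geostatistical model, which works with subsampled covariance matrices, is more involved):

        import numpy as np

        def resampled_stochastic_approximation(data, grad_loglik, theta0,
                                               n_iter=2000, subsample=50, seed=0):
            """Update a parameter estimate from random subsamples with a decreasing gain."""
            rng = np.random.default_rng(seed)
            theta = np.asarray(theta0, dtype=float)
            for k in range(1, n_iter + 1):
                batch = data[rng.choice(len(data), size=subsample, replace=False)]
                gain = 1.0 / k                            # Robbins-Monro gain sequence
                theta = theta + gain * grad_loglik(theta, batch)
            return theta

        # Toy usage: estimate the mean of N(mu, 1); the (scaled) score is mean(x - mu).
        data = np.random.default_rng(1).normal(3.0, 1.0, size=10000)
        mu_hat = resampled_stochastic_approximation(
            data, lambda th, x: np.array([np.mean(x - th[0])]), theta0=[0.0])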

  8. Approximate strip exchanging.

    Science.gov (United States)

    Roy, Swapnoneel; Thakur, Ashok Kumar

    2008-01-01

    Genome rearrangements have been modelled by a variety of primitives such as reversals, transpositions, block moves and block interchanges. We consider one such genome rearrangement primitive, strip exchanges. Given a permutation, the challenge is to sort it using the minimum number of strip exchanges. A strip-exchanging move interchanges the positions of two chosen strips so that they merge with other strips. The strip exchange problem is to sort a permutation using the minimum number of strip exchanges. We present here the first non-trivial 2-approximation algorithm for this problem. We also observe that sorting by strip exchanges is fixed-parameter tractable. Lastly, we discuss the application of strip exchanges in a different area, Optical Character Recognition (OCR), with an example.

  9. S-Approximation: A New Approach to Algebraic Approximation

    Directory of Open Access Journals (Sweden)

    M. R. Hooshmandasl

    2014-01-01

    Full Text Available We intend to study a new class of algebraic approximations, called S-approximations, and their properties. We have shown that S-approximations can be used for applied problems which cannot be modeled by inclusion-based approximations. Also, in this work, we studied a subclass of S-approximations, called Sℳ-approximations, and showed that this subclass preserves most of the properties of inclusion-based approximations but is not necessarily inclusion-based. The paper concludes by studying some basic operations on S-approximations and counting the number of S-min functions.

  10. Comparison of formation and fluid-column logs in a heterogeneous basalt aquifer.

    Science.gov (United States)

    Paillet, F L; Williams, J H; Oki, D S; Knutson, K D

    2002-01-01

    Deep observation boreholes in the vicinity of active production wells in Honolulu, Hawaii, exhibit the anomalous condition that fluid-column electrical conductivity logs and apparent profiles of pore-water electrical conductivity derived from induction conductivity logs are nearly identical if a formation factor of 12.5 is assumed. This condition is documented in three boreholes where fluid-column logs clearly indicate the presence of strong borehole flow induced by withdrawal from partially penetrating water-supply wells. This result appears to contradict the basic principles of conductivity-log interpretation. Flow conditions in one of these boreholes were investigated in detail by obtaining flow profiles under two water production conditions using the electromagnetic flowmeter. The flow-log interpretation demonstrates that the fluid-column log resembles the induction log because the amount of inflow to the borehole increases systematically upward through the transition zone between deeper salt water and shallower fresh water. This condition allows the properties of the fluid column to approximate the properties of water entering the borehole as soon as the upflow stream encounters that producing zone. Because this condition occurs in all three boreholes investigated, the similarity of induction and fluid-column logs is probably not a coincidence, and may relate to aquifer response under the influence of pumping from production wells.
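
    The near-identity noted above rests on Archie's relation between bulk and pore-water conductivity, C_w = F · C_o; a minimal sketch of the conversion applied to an induction log (function name and units are illustrative assumptions):

        def pore_water_conductivity(induction_log, formation_factor=12.5):
            """Apparent pore-water conductivity from a bulk (induction) conductivity log,
            via Archie's relation C_w = F * C_o; units follow the input log (e.g. mS/m)."""
            return [formation_factor * c for c in induction_log]

        # e.g. pore_water_conductivity([40.0, 120.0, 300.0]) -> [500.0, 1500.0, 3750.0]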

  11. Exploratory data analysis of the dependencies between skin permeability, molecular weight and log P.

    Science.gov (United States)

    Kilian, D; Lemmer, H J R; Gerber, M; du Preez, J L; du Plessis, J

    2016-06-01

    Molecular weight and log P remain the most frequently used physicochemical properties in models that predict skin permeability. However, several reports over the past two decades have suggested that predictions made by these models may not be sufficiently accurate. In this study, exploratory data analysis of the probabilistic dependencies between molecular weight, log P and log Kp was performed on a dataset constructed from the combination of several popular datasets. The results suggest that, in general, molecular weight and log P are poorly correlated to log Kp. However, after employing several exploratory data analysis techniques, regions within the dataset of statistically significant dependence were identified. As an example of the applicability of the information extracted from the exploratory data analyses, a multiple linear regression model was constructed, bounded by the ranges of dependence. This model gave reasonable approximations to log Kp values obtained from skin permeability studies of selected non-steroidal anti-inflammatory drugs (NSAIDs) administered from a buffer solution and a lipid-based drug delivery system. A method of testing whether a given drug falls within the regions of statistical dependence was also presented. Knowing the ranges within which molecular weight and log P are statistically related to log Kp can supplement existing methods of screening, risk analysis or early drug development decision making to add confidence to predictions made regarding skin permeability.
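
    A minimal sketch of the kind of multiple linear regression described above, fitting log Kp to molecular weight and log P with ordinary least squares; the numbers below are hypothetical placeholders, not the paper's dataset or fitted coefficients:

        import numpy as np

        # Hypothetical training rows: molecular weight, log P, and measured log Kp.
        mw    = np.array([152.0, 206.3, 230.3, 254.3])
        logp  = np.array([  3.2,   4.0,   3.5,   3.1])
        logkp = np.array([ -5.0,  -5.6,  -5.8,  -6.1])

        # log Kp ≈ b0 + b1*MW + b2*logP, fitted by least squares.
        X = np.column_stack([np.ones_like(mw), mw, logp])
        coef, *_ = np.linalg.lstsq(X, logkp, rcond=None)

        def predict_logkp(mw_value, logp_value):
            """Prediction is only meaningful inside the MW/log P ranges used for fitting."""
            return coef[0] + coef[1] * mw_value + coef[2] * logp_value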

  12. Diophantine approximations and Diophantine equations

    CERN Document Server

    Schmidt, Wolfgang M

    1991-01-01

    "This book by a leading researcher and masterly expositor of the subject studies diophantine approximations to algebraic numbers and their applications to diophantine equations. The methods are classical, and the results stressed can be obtained without much background in algebraic geometry. In particular, Thue equations, norm form equations and S-unit equations, with emphasis on recent explicit bounds on the number of solutions, are included. The book will be useful for graduate students and researchers." (L'Enseignement Mathematique) "The rich Bibliography includes more than hundred references. The book is easy to read, it may be a useful piece of reading not only for experts but for students as well." Acta Scientiarum Mathematicarum

  13. Approximation of Surfaces by Cylinders

    DEFF Research Database (Denmark)

    Randrup, Thomas

    1998-01-01

    We present a new method for approximation of a given surface by a cylinder surface. It is a constructive geometric method, leading to a monorail representation of the cylinder surface. By use of a weighted Gaussian image of the given surface, we determine a projection plane. In the orthogonal projection of the surface onto this plane, a reference curve is determined by use of methods for thinning of binary images. Finally, the cylinder surface is constructed as follows: the directrix of the cylinder surface is determined by a least squares method minimizing the distance to the points in the projection within a tolerance given by the reference curve, and the rulings are lines perpendicular to the projection plane. Application of the method in ship design is given.

  14. Approximate Flavor Symmetry in Supersymmetric Model

    OpenAIRE

    Tao, Zhijian

    1998-01-01

    We investigate the maximal approximate flavor symmetry in the framework of the generic minimal supersymmetric standard model. We consider the low energy effective theory of flavor physics with all the possible operators included. Spontaneous flavor symmetry breaking leads to the approximate flavor symmetry in the Yukawa sector and the supersymmetry breaking sector. Fermion mass and mixing hierarchies are the results of the hierarchy of the flavor symmetry breaking. It is found that in this theory i...

  15. Intuitionistic Fuzzy Automaton for Approximate String Matching

    Directory of Open Access Journals (Sweden)

    K.M. Ravi

    2014-03-01

    Full Text Available This paper introduces an intuitionistic fuzzy automaton model for computing the similarity between pairs of strings. The model details the possible edit operations needed to transform any input (observed) string into a target (pattern) string by providing a membership and a non-membership value between them. In the end, an algorithm is given for approximate string matching, and the proposed model computes the similarity and dissimilarity between the pair of strings, leading to better approximation.
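
    For reference, the classical dynamic-programming recursion over the insert/delete/substitute operations that the automaton grades (a plain Levenshtein sketch; the intuitionistic membership and non-membership values of the paper are not modelled here):

        def edit_distance(source, target):
            """Levenshtein distance over insertions, deletions and substitutions."""
            m, n = len(source), len(target)
            dist = [[0] * (n + 1) for _ in range(m + 1)]
            for i in range(m + 1):
                dist[i][0] = i                    # delete all of source[:i]
            for j in range(n + 1):
                dist[0][j] = j                    # insert all of target[:j]
            for i in range(1, m + 1):
                for j in range(1, n + 1):
                    cost = 0 if source[i - 1] == target[j - 1] else 1
                    dist[i][j] = min(dist[i - 1][j] + 1,         # deletion
                                     dist[i][j - 1] + 1,         # insertion
                                     dist[i - 1][j - 1] + cost)  # substitution / match
            return dist[m][n]

        # e.g. edit_distance("kitten", "sitting") == 3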

  16. The fluid-compensated cement bond log

    Energy Technology Data Exchange (ETDEWEB)

    Nayfeh, T.H.; Leslie, H.D.; Wheelis, W.B.

    1984-09-01

    An experimental and numerical wave mechanics study of cement bond logs demonstrated that wellsite computer processing can now segregate wellbore fluid effects from the sonic signal response to changing cement strength. Traditionally, cement logs have been interpreted as if water were in the wellbore, without consideration of wellbore fluid effects. These effects were assumed to be negligible. However, with the increasing number of logs being run in completion fluids such as CaCl2, ZnBr2, and CaBr2, large variations in cement bond logs became apparent. A Schlumberger internal paper showing that bond log amplitude is related to the acoustic impedance of the fluid in which the tool is run led to a comprehensive study of wellbore fluid effects. Numerical and experimental models were developed simulating wellbore geometry. Measurements were conducted in 5-, 7-, and 9 5/8-in. casings by varying the wellbore fluid densities, viscosities, and fluid types (acoustic impedance). Parallel numerical modeling was undertaken using similar parameters. The results showed that the bond log amplitude varied dramatically with the wellbore fluid's acoustic impedance; for example, there was a 70 percent increase in the signal amplitude for 11.5-lb/gal CaCl2 over the signal amplitude in water. This led to the development of a Fluid-Compensated Bond log that corrects the amplitude for acoustic impedance of varying wellbore fluids, thereby making the measurements more directly related to the cement quality.

  17. Approximate distance oracles for planar graphs with improved query time-space tradeoff

    DEFF Research Database (Denmark)

    Wulff-Nilsen, Christian

    2016-01-01

    We consider approximate distance oracles for edge-weighted n-vertex undirected planar graphs. Given fixed ϵ > 0, we present a (1 + ϵ)-approximate distance oracle with O(n (log log n)^2) space and O((log log n)^3) query time. This improves the previous best product of query time and space of the oracles of Thorup (FOCS 2001, J. ACM 2004) and Klein (SODA 2002) from O(n log n) to O(n (log log n)^5).

  18. How Low Can Approximate Degree and Quantum Query Complexity be for Total Boolean Functions?

    CERN Document Server

    Ambainis, Andris

    2012-01-01

    It has long been known that any Boolean function that depends on n input variables has both degree and exact quantum query complexity of Omega(log n), and that this bound is achieved for some functions. In this paper we study the case of approximate degree and bounded-error quantum query complexity. We show that for these measures the correct lower bound is Omega(log n / log log n), and we exhibit quantum algorithms for two functions where this bound is achieved.

  19. SAR image regularization with fast approximate discrete minimization.

    Science.gov (United States)

    Denis, Loïc; Tupin, Florence; Darbon, Jérôme; Sigelle, Marc

    2009-07-01

    Synthetic aperture radar (SAR) images, like other coherent imaging modalities, suffer from speckle noise. The presence of this noise makes the automatic interpretation of images a challenging task, and noise reduction is often a prerequisite for successful use of classical image processing algorithms. Numerous approaches have been proposed to filter speckle noise. Markov random field (MRF) modeling provides a convenient way to express both data fidelity constraints and desirable properties of the filtered image. In this context, total variation minimization has been extensively used to constrain the oscillations in the regularized image while preserving its edges. Speckle noise follows heavy-tailed distributions, and the MRF formulation leads to a minimization problem involving nonconvex log-likelihood terms. Such a minimization can be performed efficiently by computing minimum cuts on weighted graphs. Due to memory constraints, exact minimization, although theoretically possible, is not achievable on the large images required by remote sensing applications. The computational burden of the state-of-the-art algorithm for approximate minimization (namely the alpha-expansion) is too heavy, especially when considering joint regularization of several images. We show that a satisfying solution can be reached, in a few iterations, by performing a graph-cut-based combinatorial exploration of large trial moves. This algorithm is applied to joint regularization of the amplitude and interferometric phase in urban area SAR images.
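
    As a simple convex stand-in for the regularization idea above (not the paper's graph-cut minimization of the non-convex MRF energy), total-variation denoising can be applied to the log-amplitude, which turns multiplicative speckle into approximately additive noise; the speckled test image below is synthetic:

        import numpy as np
        from skimage.restoration import denoise_tv_chambolle

        amplitude = np.random.rayleigh(scale=1.0, size=(128, 128))  # synthetic speckled amplitude
        log_amplitude = np.log(amplitude + 1e-6)                    # additive-noise domain
        regularized = np.exp(denoise_tv_chambolle(log_amplitude, weight=0.3))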

  20. On the log-normal distribution of network traffic

    Science.gov (United States)

    Antoniou, I.; Ivanov, V. V.; Ivanov, Valery V.; Zrelov, P. V.

    2002-07-01

    A detailed analysis of traffic measurements shows that the aggregation of these measurements forms a statistical distribution, which is approximated with high accuracy by the log-normal distribution. The inter-arrival times and packet sizes, contributing to the formation of network traffic, can be considered as independent. Applying the wavelet transform to traffic measurements, we demonstrate the multiplicative character of traffic series. This result confirms that the scheme, developed by Kolmogorov [Dokl. Akad. Nauk SSSR 31 (1941) 99] for the homogeneous fragmentation of grains, applies also to network traffic.
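
    A small sketch of how such a log-normal fit can be checked in practice with SciPy; the traffic series below is synthetic, standing in for real aggregated measurements:

        import numpy as np
        from scipy import stats

        traffic = np.random.lognormal(mean=8.0, sigma=1.2, size=5000)  # synthetic bytes/interval

        # Fit a log-normal (loc fixed at 0) and test the goodness of fit.
        shape, loc, scale = stats.lognorm.fit(traffic, floc=0)
        ks_stat, p_value = stats.kstest(traffic, 'lognorm', args=(shape, loc, scale))
        print(f"sigma = {shape:.3f}, median = {scale:.1f}, KS p-value = {p_value:.3f}")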

  1. Prestack traveltime approximations

    KAUST Repository

    Alkhalifah, Tariq Ali

    2012-05-01

    Many of the explicit prestack traveltime relations used in practice are based on homogeneous (or semi-homogeneous, possibly effective) media approximations. This includes the multifocusing, based on the double square-root (DSR) equation, and the common reflection stack (CRS) approaches. Using the DSR equation, I constructed the associated eikonal form in the general source-receiver domain. Like its wave-equation counterpart, it suffers from a critical singularity for horizontally traveling waves. As a result, I recast the eikonal in terms of the reflection angle, and thus derived expansion-based solutions of this eikonal in terms of the difference between the source and receiver velocities in a generally inhomogeneous background medium. The zero-order term solution, corresponding to ignoring the lateral velocity variation in estimating the prestack part, is free of singularities and can be used to estimate traveltimes for small to moderate offsets (or reflection angles) in a generally inhomogeneous medium. The higher-order terms include limitations for horizontally traveling waves; however, we can readily enforce stability constraints to avoid such singularities. In fact, another expansion over reflection angle can help us avoid these singularities by requiring the source and receiver velocities to be different. On the other hand, expansions in terms of reflection angles result in singularity-free equations. For a homogeneous background medium, as a test, the solutions are reasonably accurate to large reflection and dip angles. A Marmousi example demonstrated the usefulness and versatility of the formulation. © 2012 Society of Exploration Geophysicists.

  2. Computer vision technology in log volume inspection

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Log volume inspection is very important in forestry research and papermaking engineering. This paper proposes a novel approach based on computer vision technology to cope with log volume inspection. The required hardware system is analyzed and the details of the inspection algorithms are given. A fuzzy-entropy-based image enhancement algorithm is presented for enhancing the image of the log cross-section. In many practical applications the cross-section is often partially invisible, and this is the major obstacle to correct inspection. To solve this problem, a robust Hausdorff distance method is proposed to recover the whole cross-section. Experimental results showed that this method is efficient.

  3. Slim hole logging in shallow boreholes

    Directory of Open Access Journals (Sweden)

    R. Monnet

    2000-06-01

    Full Text Available While well logging, a continuous recording of physical parameters down a borehole, is employed systematically in petroleum exploration, its application in environmental investigations, such as hydrogeology or civil engineering, has been very limited. This deficiency is partly due to the fact that the logging probes used in this kind of borehole are generally not calibrated, so the results are more or less qualitative. The purpose of this lecture is to show that it is possible to calibrate these tools in order to obtain quantitative results, and to make available to geologists, engineers and technicians engaged in shallow exploration the information required for effectively applying the well-logging method.

  4. Sample Log For International Mudlogging Projects

    Institute of Scientific and Technical Information of China (English)

    Wei Xinghua; Yang Haibo; Andrew Romolliwa

    2000-01-01

    Accurate sample logging is an essential part of mud logging at the wellsite. In logging work on international cooperative mudlogging projects, the description of drilling cuttings (samples) required by foreign companies is quite different from domestic practice. This paper is intended to give some guidance on describing samples at the wellsite, with reference to the standards of several foreign companies and the working experience of the author, and also notes some problems that geologists should pay attention to when describing samples at the wellsite.

  5. Recognizing Patterns In Log-Polar Coordinates

    Science.gov (United States)

    Weiman, Carl F. R.

    1992-01-01

    The log-Hough transform is the basis of an improved method for the recognition of patterns - particularly straight lines - in noisy images. It takes advantage of the rotational and scale invariance of the mapping from Cartesian to log-polar coordinates, and offers economy of representation and computation. The unification of the iconic and Hough domains simplifies computations in recognition and eliminates the erroneous quantization of slopes attributable to the finite spacing of the Cartesian coordinate grid of the classical Hough transform. The method is equally efficient at recognizing curves. The log-Hough transform is more amenable to massively parallel computing architectures than the traditional Cartesian Hough transform. Its "in-place" nature makes it possible to apply local pixel-neighborhood processing.
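
    The underlying coordinate mapping is simple; a sketch of the forward transform (eps guards against log(0); scaling by s and rotation by phi become pure translations u -> u + log s, theta -> theta + phi):

        import numpy as np

        def to_log_polar(x, y, eps=1e-12):
            """Map Cartesian coordinates to log-polar (u, theta) = (log r, atan2(y, x))."""
            r = np.hypot(x, y)
            return np.log(r + eps), np.arctan2(y, x)

        # e.g. to_log_polar(np.array([1.0, 2.0]), np.array([0.0, 2.0]))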

  6. Atypical Log D profile of rifampicin

    Directory of Open Access Journals (Sweden)

    Mariappan T

    2007-01-01

    Full Text Available The distribution coefficient (log D) values of rifampicin, an essential first-line antitubercular drug, at gastrointestinal pH conditions are not reported in the literature. Hence, determinations were made using n-octanol and buffers ranging between pH 1 and 7. Log D values were also predicted using Prolog D. The two determinations showed opposite behaviour. The atypical experimental log D profile of rifampicin could be attributed to its surface-active properties, which also explain the reported permeability behaviour of the drug in various gastrointestinal tract segments.

  7. Logging-while-coring method and apparatus

    Science.gov (United States)

    Goldberg, David S.; Myers, Gregory J.

    2007-11-13

    A method and apparatus for downhole coring while receiving logging-while-drilling tool data. The apparatus includes a core collar and a retrievable core barrel. The retrievable core barrel receives core from a borehole, which is sent to the surface for analysis via wireline and latching tool. The core collar includes logging-while-drilling tools for the simultaneous measurement of formation properties during the core excavation process. Examples of logging-while-drilling tools include nuclear sensors, resistivity sensors, gamma ray sensors, and bit resistivity sensors. The disclosed method allows for precise core-log depth calibration and core orientation within a single borehole, and without a pipe trip, providing both time savings and unique scientific advantages.

  8. Well Logging Equipment Updated in China

    Institute of Scientific and Technical Information of China (English)

    Xu Lili

    1996-01-01

    As one of the ten principal disciplines in the petroleum industry, well logging has been developed for about 55 years in China and is playing an increasingly important role in the country's oil and gas exploration and development.

  9. Supply of Rubber Wood Log in Malaysia

    Directory of Open Access Journals (Sweden)

    A. W. Noraida

    2014-06-01

    Full Text Available The shortage of raw material for wood processing has been eased by the adoption of rubber wood logs as a substitute for natural logs. This paper examines the supply of rubber wood logs in Malaysia. We employ the ARDL bounds testing approach and time series data from 1980 to 2010, covering the whole of Malaysia, to achieve the established objectives. The results show that, in the long run, harvested area and wages are significant at the 1% and 10% levels, respectively, while in the short run only harvested area has an impact, at the 1% significance level. This indicates that harvested area has the greatest impact on the supply of rubber wood logs in both the short run and the long run, whereas wages, as an input cost, have less impact and are therefore less of a burden to producers.

  10. LATTE - Log and Time Tracking for Elections

    Data.gov (United States)

    Office of Personnel Management — LATTE - Log and Time Tracking for Elections is a time tracking and voucher preparation system used to schedule employees to cover elections, to document their time...

  11. Slim hole logging in shallow boreholes

    OpenAIRE

    Monnet, R.; L. Baron; Chapellier, D. M.

    2000-01-01

    While well logging, a continuous recording of the physical parameters down a borehole, is employed systematically in petroleum exploration, its application in environmental investigations, such as hydrogeology or civil engineering, has been very limited. This deficiency is partly due to the fact that logging probes used in this kind of boreholes are generally not calibrated and the results are more or less qualitative. The purpose of this lecture is to show that it is possible to calibrate these ...

  12. LOG PERIODIC DIPOLE ARRAY WITH PARASITIC ELEMENTS

    Science.gov (United States)

    The design and measured characteristics of dipole and monopole versions of a log periodic array with parasitic elements are discussed. In a dipole...array with parasitic elements, these elements are used in place of every alternate dipole, thereby eliminating the need for a twisted feed arrangement...for the elements to obtain log periodic performance of the antenna. This design with parasitic elements lends itself to a monopole version of the

  13. Conversation Threads Hidden within Email Server Logs

    Science.gov (United States)

    Palus, Sebastian; Kazienko, Przemysław

    Email server logs contain records of all email exchanged through the server. Often we would like to analyze those emails not separately but in conversation threads, especially when we need to analyze a social network extracted from the email logs. Unfortunately, each mail is in a different record, and those records are not tied to each other in any obvious way. In this paper a method for discussion thread extraction is proposed, together with experiments on two different data sets - Enron and WrUT.

  14. Selective Logging, Fire, and Biomass in Amazonia

    Science.gov (United States)

    Houghton, R. A.

    1999-01-01

    Biomass and rates of disturbance are major factors in determining the net flux of carbon between terrestrial ecosystems and the atmosphere, and neither of them is well known for most of the earth's surface. Satellite data over large areas are beginning to be used systematically to measure rates of two of the most important types of disturbance, deforestation and reforestation, but these are not the only types of disturbance that affect carbon storage. Other examples include selective logging and fire. In northern mid-latitude forests, logging and subsequent regrowth of forests have, in recent decades, contributed more to the net flux of carbon between terrestrial ecosystems and the atmosphere than any other type of land use. In the tropics logging is also becoming increasingly important. According to the FAO/UNEP assessment of tropical forests, about 25% of the total area of productive forests had been logged one or more times in the 60-80 years before 1980. The fraction must be considerably greater at present. Thus, deforestation by itself accounts for only a portion of the emissions of carbon from land. Furthermore, as rates of deforestation become more accurately measured with satellites, uncertainty in biomass will become the major factor accounting for the remaining uncertainty in estimates of carbon flux. An approach is needed for determining the biomass of terrestrial ecosystems. Selective logging is increasingly important in Amazonia, yet it has not been included in region-wide, satellite-based assessments of land-cover change, in part because it is not as striking as deforestation. Nevertheless, logging affects terrestrial carbon storage both directly and indirectly. Besides the losses of carbon directly associated with selective logging, logging also increases the likelihood of fire.

  15. 32 CFR 700.845 - Maintenance of logs.

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 5 2010-07-01 2010-07-01 false Maintenance of logs. 700.845 Section 700.845... Commanding Officers Afloat § 700.845 Maintenance of logs. (a) A deck log and an engineering log shall be... Naval Operations. (b) A compass record shall be maintained as an adjunct to the deck log. An...

  16. 29 CFR 42.7 - Complaint/directed action logs.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 1 2010-07-01 2010-07-01 true Complaint/directed action logs. 42.7 Section 42.7 Labor Office of the Secretary of Labor COORDINATED ENFORCEMENT § 42.7 Complaint/directed action logs. (a) To... operation of a system of coordinated Complaint/Directed Action Logs (logs). The logs shall be maintained...

  17. Computer analysis of digital well logs

    Science.gov (United States)

    Scott, James H.

    1984-01-01

    A comprehensive system of computer programs has been developed by the U.S. Geological Survey for analyzing digital well logs. The programs are operational on a minicomputer in a research well-logging truck, making it possible to analyze and replot the logs while at the field site. The minicomputer also serves as a controller of digitizers, counters, and recorders during acquisition of well logs. The analytical programs are coordinated with the data acquisition programs in a flexible system that allows the operator to make changes quickly and easily in program variables such as calibration coefficients, measurement units, and plotting scales. The programs are designed to analyze the following well-logging measurements: natural gamma-ray, neutron-neutron, dual-detector density with caliper, magnetic susceptibility, single-point resistance, self potential, resistivity (normal and Wenner configurations), induced polarization, temperature, sonic delta-t, and sonic amplitude. The computer programs are designed to make basic corrections for depth displacements, tool response characteristics, hole diameter, and borehole fluid effects (when applicable). Corrected well-log measurements are output to magnetic tape or plotter with measurement units transformed to petrophysical and chemical units of interest, such as grade of uranium mineralization in percent eU3O8, neutron porosity index in percent, and sonic velocity in kilometers per second.

  18. Researching Illegal Logging and Deforestation

    Directory of Open Access Journals (Sweden)

    Tim Boekhout van Solinge

    2014-08-01

    Full Text Available Tropical deforestation, such as in the Amazon, can be studied well from a green criminological perspective. Ethnographic research methods form a useful way to gain insight into the dynamics and complexity of tropical deforestation, which often is illegal. This article gives an account of various ethnographic visits to the rainforests of the Amazon in the period 2003-2014. Ethnographic methods provide insight into the overlap between the legal and the illegal, the functioning (or not) of state institutions, the power of (corporate) lobbies, and why tropical deforestation correlates with crimes such as corruption and violence. The use of ethnographic methods in forest areas where trustworthy state actors and institutions are largely absent can also present danger and raise ethical issues (such as when the researcher, for reasons of safety, does not present as a criminological researcher). However, a large advantage of ethnographic visits to tropical rainforests is that they allow the gathering of local views and voices, which rarely reach the international level. These local views lead to interesting contradictions at the international level, where corporate views and lobbies dominate.

  19. Operators of Approximations and Approximate Power Set Spaces

    Institute of Scientific and Technical Information of China (English)

    ZHANG Xian-yong; MO Zhi-wen; SHU Lan

    2004-01-01

    Boundary inner and outer operators are introduced, and the union, intersection and complement operators of approximations are redefined. The approximation operators have the good property of preserving union, intersection and complement operators, so rough set theory is enriched from both the operator-oriented and set-oriented views. Approximate power set spaces are defined, and it is proved that the approximation operators are epimorphisms from the power set space to approximate power set spaces. Some basic properties of approximate power set spaces are obtained via these epimorphisms, in contrast to the power set space.

  20. Approximation Algorithms for Directed Width Parameters

    CERN Document Server

    Kintali, Shiva; Kumar, Akash

    2011-01-01

    Treewidth of an undirected graph measures how close the graph is to being a tree. Several problems that are NP-hard on general graphs are solvable in polynomial time on graphs with bounded treewidth. Motivated by the success of treewidth, several directed analogues of treewidth have been introduced to measure the similarity of a directed graph to a directed acyclic graph (DAG). Directed treewidth, D-width, DAG-width, Kelly-width and directed pathwidth are some such parameters. In this paper, we present the first approximation algorithms for all five of these directed width parameters. For directed treewidth and D-width we achieve an approximation factor of O(sqrt(log n)). For DAG-width, Kelly-width and directed pathwidth we achieve an O(log^{3/2} n) approximation factor. Our algorithms are constructive, i.e., they construct the decompositions associated with these parameters. The width of these decompositions is within the above mentioned factor of the corresponding optimal width.

  1. Entropy Approximation in Lossy Source Coding Problem

    Directory of Open Access Journals (Sweden)

    Marek Śmieja

    2015-05-01

    Full Text Available In this paper, we investigate a lossy source coding problem, where an upper limit on the permitted distortion is defined for every dataset element. It can be seen as an alternative approach to rate-distortion theory, where a bound on the allowed average error is specified. In order to find the entropy, which gives the statistical length of a source code compatible with a fixed distortion bound, a corresponding optimization problem has to be solved. First, we show how to simplify this general optimization by removing the coding partitions that are irrelevant for the entropy calculation. In our main result, we present a fast greedy algorithm, feasible for implementation, which allows one to approximate the entropy within an additive error term of log_2 e. The proof is based on the minimum entropy set cover problem, for which a similar bound was obtained.

  2. Lead Poisoning

    Science.gov (United States)

    Lead is a metal that occurs naturally in the earth's crust. Lead can be found in all parts of our ... from human activities such as mining and manufacturing. Lead used to be in paint; older houses may ...

  3. UiLog:Improving Log-Based Fault Diagnosis by Log Analysis

    Institute of Scientific and Technical Information of China (English)

    De-Qing Zou; Hao Qin; Hai Jin

    2016-01-01

    In modern computer systems, system event logs have always been the primary source for checking system status. As computer systems become more and more complex, interactions between software and hardware become more frequent. The components generate enormous amounts of log information, including running reports and fault information. The sheer quantity of data is a great challenge for analysis relying on manual methods. In this paper, we implement a management and analysis system for log information, which can help system administrators understand the real-time status of the entire system, classify logs into different fault types, and determine the root cause of the faults. In addition, we improve the existing fault correlation analysis method based on the results of system log classification. We apply the system in a cloud computing environment for evaluation. The results show that our system can classify fault logs automatically and effectively. With the proposed system, administrators can easily detect the root cause of faults.
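
    The classification step described above can be pictured with a minimal keyword-based classifier; the fault categories, regular expressions, and sample log lines below are hypothetical and far simpler than the system evaluated in the paper.

```python
import re

# Hypothetical fault categories and patterns; a production system would learn
# or refine such rules rather than hard-code them.
FAULT_PATTERNS = {
    "disk":    re.compile(r"i/o error|disk failure|bad sector", re.IGNORECASE),
    "memory":  re.compile(r"out of memory|outofmemory|oom", re.IGNORECASE),
    "network": re.compile(r"connection (refused|timed out)|link down", re.IGNORECASE),
}

def classify(line):
    """Return the first fault category whose pattern matches the log line."""
    for category, pattern in FAULT_PATTERNS.items():
        if pattern.search(line):
            return category
    return "unknown"

sample_log = [
    "kernel: I/O error, dev sda, sector 12345",
    "java.lang.OutOfMemoryError: Java heap space",
    "sshd: Connection timed out by 10.0.0.7",
]
for line in sample_log:
    print(classify(line), "<-", line)
```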

  4. Impact of logging on aboveground biomass stocks in lowland rain forest, Papua New Guinea.

    Science.gov (United States)

    Bryan, Jane; Shearman, Phil; Ash, Julian; Kirkpatrick, J B

    2010-12-01

    Greenhouse-gas emissions resulting from logging are poorly quantified across the tropics. There is a need for robust measurement of rain forest biomass and the impacts of logging from which carbon losses can be reliably estimated at regional and global scales. We used a modified Bitterlich plotless technique to measure aboveground live biomass at six unlogged and six logged rain forest areas (coupes) across two approximately 3000-ha regions at the Makapa concession in lowland Papua New Guinea. "Reduced-impact logging" is practiced at Makapa. We found the mean unlogged aboveground biomass in the two regions to be 192.96 +/- 4.44 Mg/ha and 252.92 +/- 7.00 Mg/ha (mean +/- SE), which was reduced by logging to 146.92 +/- 4.58 Mg/ha and 158.84 +/- 4.16, respectively. Killed biomass was not a fixed proportion, but varied with unlogged biomass, with 24% killed in the lower-biomass region, and 37% in the higher-biomass region. Across the two regions logging resulted in a mean aboveground carbon loss of 35 +/- 2.8 Mg/ha. The plotless technique proved efficient at estimating mean aboveground biomass and logging damage. We conclude that substantial bias is likely to occur within biomass estimates derived from single unreplicated plots.

  5. Factor analysis of borehole logs for evaluating formation shaliness: a hydrogeophysical application for groundwater studies

    Science.gov (United States)

    Szabó, Norbert Péter; Dobróka, Mihály; Turai, Endre; Szűcs, Péter

    2014-05-01

    The calculation of groundwater reserves in shaly sand aquifers requires a reliable estimation of effective porosity and permeability; the amount of shaliness as a related quantity can be determined from well log analysis. The conventionally used linear model, connecting the natural gamma-ray index to shale content, often gives only a rough estimate of shale volume. A non-linear model is suggested, which is derived from the factor analysis of well-logging data. An earlier study of hydrocarbon wells revealed an empirical relationship between the factor scores and shale volume, independent of the well site. Borehole logs from three groundwater wells drilled in the northeastern Great Hungarian Plain are analyzed to derive depth logs of factor variables, which are then correlated with shale volumes given from the method of Larionov. Shale volume logs derived by the statistical procedure are in close agreement with those derived from Larionov's formula, which confirms the validity of the non-linear approximation. The statistical results are in good accordance with laboratory measurements made on core samples. Whereas conventional methods normally use a single well log as input, factor analysis processes all available logs to provide groundwater exploration with reliable estimations of shale volume.
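
    For reference, the conventional linear gamma-ray index and the non-linear Larionov relationship mentioned above can be written down directly; the clean-sand and shale baselines in the example are assumed values, not data from the Hungarian wells analyzed in the paper.

```python
def gamma_ray_index(gr, gr_clean, gr_shale):
    """Linear gamma-ray index IGR from a log reading and clean/shale baselines."""
    return (gr - gr_clean) / (gr_shale - gr_clean)

def shale_volume_larionov(igr, tertiary=True):
    """Non-linear Larionov shale volume from the gamma-ray index.

    tertiary=True uses the Tertiary (unconsolidated) coefficients,
    otherwise the older-rock form is applied.
    """
    if tertiary:
        return 0.083 * (2.0 ** (3.7 * igr) - 1.0)
    return 0.33 * (2.0 ** (2.0 * igr) - 1.0)

# Assumed example: GR reading of 75 API with baselines of 20 (clean) and 120 (shale).
igr = gamma_ray_index(gr=75.0, gr_clean=20.0, gr_shale=120.0)
print(round(igr, 3), round(shale_volume_larionov(igr), 3))
```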

  6. LOG2MARKUP: State module to transform a Stata text log into a markup document

    DEFF Research Database (Denmark)

    2016-01-01

    log2markup extracts parts of the text version of the log produced by the Stata log command and transforms the logfile into a markup-based document with the same name, but with the extension markup (or as otherwise specified in the option extension) instead of log. The author usually uses markdown for writing documents. However, other users may decide on all sorts of markup languages, e.g. HTML or LaTeX. The key is that markup of Stata code and Stata output can be set by the options.

  7. Turbo Equalization Using Partial Gaussian Approximation

    DEFF Research Database (Denmark)

    Zhang, Chuanzong; Wang, Zhongyong; Manchón, Carles Navarro

    2016-01-01

    returned by the equalizer by using a partial Gaussian approximation (PGA). We exploit the specific structure of the ISI channel model to compute the latter messages from the beliefs obtained using a Kalman smoother/equalizer. Doing so leads to a significant complexity reduction compared to the initial PGA...

  8. International Conference Approximation Theory XV

    CERN Document Server

    Schumaker, Larry

    2017-01-01

    These proceedings are based on papers presented at the international conference Approximation Theory XV, which was held May 22–25, 2016 in San Antonio, Texas. The conference was the fifteenth in a series of meetings in Approximation Theory held at various locations in the United States, and was attended by 146 participants. The book contains longer survey papers by some of the invited speakers covering topics such as compressive sensing, isogeometric analysis, and scaling limits of polynomials and entire functions of exponential type. The book also includes papers on a variety of current topics in Approximation Theory drawn from areas such as advances in kernel approximation with applications, approximation theory and algebraic geometry, multivariate splines for applications, practical function approximation, approximation of PDEs, wavelets and framelets with applications, approximation theory in signal processing, compressive sensing, rational interpolation, spline approximation in isogeometric analysis, a...

  9. Well log characterization of natural gas hydrates

    Science.gov (United States)

    Collett, Timothy S.; Lee, Myung W.

    2011-01-01

    In the last 25 years we have seen significant advancements in the use of downhole well logging tools to acquire detailed information on the occurrence of gas hydrate in nature: From an early start of using wireline electrical resistivity and acoustic logs to identify gas hydrate occurrences in wells drilled in Arctic permafrost environments to today where wireline and advanced logging-while-drilling tools are routinely used to examine the petrophysical nature of gas hydrate reservoirs and the distribution and concentration of gas hydrates within various complex reservoir systems. The most established and well known use of downhole log data in gas hydrate research is the use of electrical resistivity and acoustic velocity data (both compressional- and shear-wave data) to make estimates of gas hydrate content (i.e., reservoir saturations) in various sediment types and geologic settings. New downhole logging tools designed to make directionally oriented acoustic and propagation resistivity log measurements have provided the data needed to analyze the acoustic and electrical anisotropic properties of both highly inter-bedded and fracture dominated gas hydrate reservoirs. Advancements in nuclear-magnetic-resonance (NMR) logging and wireline formation testing have also allowed for the characterization of gas hydrate at the pore scale. Integrated NMR and formation testing studies from northern Canada and Alaska have yielded valuable insight into how gas hydrates are physically distributed in sediments and the occurrence and nature of pore fluids (i.e., free-water along with clay and capillary bound water) in gas-hydrate-bearing reservoirs. Information on the distribution of gas hydrate at the pore scale has provided invaluable insight on the mechanisms controlling the formation and occurrence of gas hydrate in nature along with data on gas hydrate reservoir properties (i.e., permeabilities) needed to accurately predict gas production rates for various gas hydrate
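
    As a rough sketch of the resistivity-based saturation estimate described above, the standard Archie relation can be applied, treating gas hydrate as the non-conducting pore fill; this is a generic textbook approach, not the specific workflow of the paper, and the Archie constants and example resistivity, water resistivity, and porosity values are assumptions for illustration only.

```python
def water_saturation_archie(rt, rw, porosity, a=1.0, m=2.0, n=2.0):
    """Archie water saturation from formation resistivity Rt (ohm-m),
    connate water resistivity Rw (ohm-m), and porosity (fraction)."""
    return (a * rw / (porosity ** m * rt)) ** (1.0 / n)

def hydrate_saturation(rt, rw, porosity, **archie_constants):
    """Gas-hydrate saturation taken as the non-water fraction of the pore space."""
    sw = water_saturation_archie(rt, rw, porosity, **archie_constants)
    return max(0.0, 1.0 - min(sw, 1.0))

# Assumed example values: Rt = 30 ohm-m, Rw = 0.3 ohm-m, porosity = 0.35.
print(round(hydrate_saturation(rt=30.0, rw=0.3, porosity=0.35), 2))   # ~0.71
```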

  10. La logística integral como ventaja competitiva y sistema logístico

    Directory of Open Access Journals (Sweden)

    Mario Anselmo Martínez gallardo

    2013-07-01

    Full Text Available This study analyzes integrated logistics as a competitive advantage, together with the logistics system. In the authors' opinion, integrated logistics is essential for the exchange of goods, since it reduces costs and makes commercial activity more agile. The use of this kind of logistics represents a competitive advantage over other companies, through the optimization of capital and savings in time.

  11. Lead Toxicity

    Science.gov (United States)

    ... including some imported jewelry. What are the health effects of lead? • More commonly, lower levels of lead in children over time may lead to reduced IQ, slow learning, Attention Deficit Hyperactivity Disorder (ADHD), or behavioral issues. • Lead also affects other ...

  12. Hierarchical low-rank approximation for high dimensional approximation

    KAUST Repository

    Nouy, Anthony

    2016-01-07

    Tensor methods are among the most prominent tools for the numerical solution of high-dimensional problems where functions of multiple variables have to be approximated. Such high-dimensional approximation problems naturally arise in stochastic analysis and uncertainty quantification. In many practical situations, the approximation of high-dimensional functions is made computationally tractable by using rank-structured approximations. In this talk, we present algorithms for the approximation in hierarchical tensor format using statistical methods. Sparse representations in a given tensor format are obtained with adaptive or convex relaxation methods, with a selection of parameters using cross-validation methods.
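
    In the simplest (two-dimensional) setting, a rank-structured approximation is just a truncated singular value decomposition; the sketch below is that matrix special case and is only meant to convey the idea, since the hierarchical tensor formats discussed in the talk generalize it to many variables. The test function and rank are arbitrary choices.

```python
import numpy as np

def low_rank_approximation(a, rank):
    """Best rank-r approximation of a matrix in the Frobenius norm (truncated SVD)."""
    u, s, vt = np.linalg.svd(a, full_matrices=False)
    return u[:, :rank] @ np.diag(s[:rank]) @ vt[:rank, :]

# A smooth bivariate function sampled on a grid is numerically low-rank.
x = np.linspace(0.0, 1.0, 200)
a = np.exp(-np.subtract.outer(x, x) ** 2)          # 200 x 200 samples of exp(-(x - y)^2)
a5 = low_rank_approximation(a, rank=5)
print(np.linalg.norm(a - a5) / np.linalg.norm(a))  # small relative error
```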

  13. Unconventional neutron sources for oil well logging

    Energy Technology Data Exchange (ETDEWEB)

    Frankle, C.M., E-mail: cfrankle@lanl.gov; Dale, G.E.

    2013-09-21

    Americium–Beryllium (AmBe) radiological neutron sources have been widely used in the petroleum industry for well logging purposes. There is strong desire on the part of various governmental and regulatory bodies to find alternate sources due to the high activity and small size of AmBe sources. Other neutron sources are available, both radiological (²⁵²Cf) and electronic accelerator driven (D–D and D–T). All of these, however, have substantially different neutron energy spectra from AmBe and thus cause significantly different responses in well logging tools. We report on simulations performed using unconventional sources and techniques to attempt to better replicate the porosity and carbon/oxygen ratio responses a well logging tool would see from AmBe neutrons. The AmBe response of these two types of tools is compared to the response from ²⁵²Cf, D–D, D–T, filtered D–T, and T–T sources. -- Highlights: • AmBe sources are widely used for well logging purposes. • Governmental bodies would prefer to minimize AmBe use. • Other neutron sources are available, both radiological and electronic. • Tritium–tritium spectrum neutrons have similar logging tool response to AmBe. • A tritium–tritium neutron generator may be a viable AmBe replacement.

  14. Event Normalization Through Dynamic Log Format Detection

    Institute of Scientific and Technical Information of China (English)

    Christoph Meinel

    2014-01-01

    The analytical and monitoring capabilities of central event repositories, such as log servers and intrusion detection systems, are limited by the amount of structured information extracted from the events they receive. Diverse networks and applications log their events in many different formats, and this makes it difficult to identify the type of logs being received by the central repository. The way events are logged by IT systems is problematic for developers of intrusion-detection systems (specifically, host-based systems), developers of security-information systems, and developers of event-management systems. These problems preclude the development of more accurate, intrusive security solutions that obtain results from data included in the logs being processed. We propose a new method for dynamically normalizing events into a unified super-event that is loosely based on the Common Event Expression standard developed by Mitre Corporation. We explain how our solution can normalize seemingly unrelated events into a single, unified format.
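
    A minimal picture of such normalization is a set of per-format parsers that all map onto one unified event dictionary; the two input formats, field names, and regular expressions below are hypothetical and much simpler than the super-event format proposed in the article.

```python
import re

# Two hypothetical source formats: an Apache-style access line and a syslog-style line.
PARSERS = [
    ("apache", re.compile(r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] "(?P<msg>[^"]*)"')),
    ("syslog", re.compile(r'(?P<time>\w{3} +\d+ [\d:]+) (?P<host>\S+) (?P<msg>.*)')),
]

def normalize(line):
    """Map a raw log line onto a unified event dict, or mark it as unparsed."""
    for source, pattern in PARSERS:
        match = pattern.match(line)
        if match:
            return {"source": source, **match.groupdict()}
    return {"source": "unknown", "msg": line}

print(normalize('10.0.0.1 - - [12/Mar/2014:10:00:00 +0000] "GET /index.html HTTP/1.1"'))
print(normalize("Mar 12 10:00:01 host1 sshd[123]: Accepted publickey for root"))
```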

  15. Fractal Correction of Well Logging Curves

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Obtaining more accurate physical properties in thin and interbedded layers is always significant for assessing and evaluating oil-bearing layers, especially for well logging data processing and the interpretation of non-marine oil beds. This paper presents a definition of measures in which the measure has a power-law relation with the corresponding scale, as described by fractal theory. Logging curves can thus be reconstructed according to this power-law relation. The method uses the local structure near the current points to compensate for the averaging effect of logging probes and for measurement errors. As an example, medium and deep induction conductivity (IMPH and IDPH) curves in ODP Leg 127 Hole 797C are reconstructed, or corrected. Comparison of the corrected curves with the original ones shows that the corrected curves suffer less from adjacent-bed effects. In addition, the power spectra of the corrected well logging curves contain more resolution components than the original ones. Fractal correction thus makes well logging more capable of resolving thin beds.
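
    The power-law relation between measure and scale that underlies the method is conveniently estimated as a straight-line fit in log-log coordinates; the synthetic data and exponent below are illustrative only and do not reproduce the paper's local correction procedure.

```python
import numpy as np

# Synthetic measure values obeying measure ~ scale**(-d) with multiplicative noise.
rng = np.random.default_rng(0)
scale = np.logspace(-2, 0, 20)
measure = scale ** -1.6 * np.exp(rng.normal(0.0, 0.05, scale.size))

# A power law is a straight line in log-log space, so fit log(measure) vs log(scale).
slope, intercept = np.polyfit(np.log(scale), np.log(measure), 1)
print(round(-slope, 2))   # recovered exponent, close to the 1.6 used above
```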

  16. Frankenstein's glue: transition functions for approximate solutions

    Science.gov (United States)

    Yunes, Nicolás

    2007-09-01

    Approximations are commonly employed to find approximate solutions to the Einstein equations. These solutions, however, are usually only valid in some specific spacetime region. A global solution can be constructed by gluing approximate solutions together, but this procedure is difficult because discontinuities can arise, leading to large violations of the Einstein equations. In this paper, we provide an attempt to formalize this gluing scheme by studying transition functions that join approximate analytic solutions together. In particular, we propose certain sufficient conditions on these functions and prove that these conditions guarantee that the joined solution still satisfies the Einstein equations analytically to the same order as the approximate ones. An example is also provided for a binary system of non-spinning black holes, where the approximate solutions are taken to be given by a post-Newtonian expansion and a perturbed Schwarzschild solution. For this specific case, we show that if the transition functions satisfy the proposed conditions, then the joined solution does not contain any violations to the Einstein equations larger than those already inherent in the approximations. We further show that if these functions violate the proposed conditions, then the matter content of the spacetime is modified by the introduction of a matter shell, whose stress energy tensor depends on derivatives of these functions.

  17. Entanglement in the Born-Oppenheimer Approximation

    CERN Document Server

    Izmaylov, Artur F

    2016-01-01

    The role of electron-nuclear entanglement on the validity of the Born-Oppenheimer (BO) approximation is investigated. While nonadiabatic couplings generally lead to entanglement and to a failure of the BO approximation, surprisingly the degree of electron-nuclear entanglement is found to be uncorrelated with the degree of validity of the BO approximation. This is because while the degree of entanglement of BO states is determined by their deviation from the corresponding states in the crude BO approximation, the accuracy of the BO approximation is dictated, instead, by the deviation of the BO states from the exact electron-nuclear states. In fact, in the context of a minimal avoided crossing model, extreme cases are identified where an adequate BO state is seen to be maximally entangled, and where the BO approximation fails but the associated BO state remains approximately unentangled. Further, the BO states are found to not preserve the entanglement properties of the exact electron-nuclear eigenstates, and t...

  18. Sublinear Time Approximate Sum via Uniform Random Sampling

    CERN Document Server

    Fu, Bin; Peng, Zhiyong

    2012-01-01

    We investigate the approximation for computing the sum $a_1+...+a_n$ with an input of a list of nonnegative elements $a_1,..., a_n$. If all elements are in the range $[0,1]$, there is a randomized algorithm that can compute an $(1+\\epsilon)$-approximation for the sum problem in time ${O({n(\\log\\log n)\\over\\sum_{i=1}^n a_i})}$, where $\\epsilon$ is a constant in $(0,1)$. Our randomized algorithm is based on the uniform random sampling, which selects one element with equal probability from the input list each time. We also prove a lower bound $\\Omega({n\\over \\sum_{i=1}^n a_i})$, which almost matches the upper bound, for this problem.
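
    The estimator at the heart of such sampling schemes is simple: draw indices uniformly and rescale the sample mean by n. The sketch below is this basic fixed-sample-size estimator on synthetic data, not the adaptive algorithm or the exact sample-size bound from the paper.

```python
import random

def approximate_sum(values, samples, seed=0):
    """Estimate sum(values) by uniform random sampling with replacement."""
    rng = random.Random(seed)
    n = len(values)
    picked = [values[rng.randrange(n)] for _ in range(samples)]
    return n * sum(picked) / samples

gen = random.Random(1)
data = [gen.random() for _ in range(100_000)]      # nonnegative inputs in [0, 1]
print(sum(data), approximate_sum(data, samples=2_000))
```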

  19. Nonlinear Approximation Using Gaussian Kernels

    CERN Document Server

    Hangelbroek, Thomas

    2009-01-01

    It is well-known that non-linear approximation has an advantage over linear schemes in the sense that it provides comparable approximation rates to those of the linear schemes, but to a larger class of approximands. This was established for spline approximations and for wavelet approximations, and more recently for homogeneous radial basis function (surface spline) approximations. However, no such results are known for the Gaussian function. The crux of the difficulty lies in the necessity to vary the tension parameter in the Gaussian function spatially according to local information about the approximand: error analysis of Gaussian approximation schemes with varying tension is, by and large, an elusive target for approximators. We introduce and analyze in this paper a new algorithm for approximating functions using translates of Gaussian functions with varying tension parameters. Our scheme is sophisticated in that it employs Gaussians with varying tensions even locally, and in that it resolves local ...

  20. Forms of Approximate Radiation Transport

    CERN Document Server

    Brunner, G

    2002-01-01

    Photon radiation transport is described by the Boltzmann equation. Because this equation is difficult to solve, many different approximate forms have been implemented in computer codes. Several of the most common approximations are reviewed, and test problems illustrate the characteristics of each of the approximations. This document is designed as a tutorial so that code users can make an educated choice about which form of approximate radiation transport to use for their particular simulation.

  1. Approximation by Multivariate Singular Integrals

    CERN Document Server

    Anastassiou, George A

    2011-01-01

    Approximation by Multivariate Singular Integrals is the first monograph to illustrate the approximation of multivariate singular integrals to the identity-unit operator. The basic approximation properties of the general multivariate singular integral operators is presented quantitatively, particularly special cases such as the multivariate Picard, Gauss-Weierstrass, Poisson-Cauchy and trigonometric singular integral operators are examined thoroughly. This book studies the rate of convergence of these operators to the unit operator as well as the related simultaneous approximation. The last cha

  2. Oil spill characterization in the hybrid polarity SAR domain using log-cumulants

    Science.gov (United States)

    Espeseth, Martine M.; Skrunes, Stine; Brekke, Camilla; Salberg, Arnt-Børre; Jones, Cathleen E.; Holt, Benjamin

    2016-10-01

    Log-cumulants have proven to be an interesting tool for evaluating the statistical properties of potential oil spills in polarimetric Synthetic Aperture Radar (SAR) data within the common horizontal (H) and vertical (V) polarization basis. The use of first, second, and third order sample log-cumulants has shown potential for evaluating the texture and the statistical distributions, as well as for discriminating oil from look-alikes. Log-cumulants are cumulants derived in the log-domain and can be applied to both single-polarization and multipolarization SAR data. This study is the first to investigate the differences between hybrid-polarity (HP) and full-polarimetric (FP) modes based on the sample log-cumulants of various oil slicks and open water from nine Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR) scenes acquired off the coast of Norway in 2015. The sample log-cumulants calculated from the HP intensities show similar statistical behavior to the FP ones, resulting in a similar interpretation of the sample log-cumulants from HP and FP. Approximately eight hours after release, the sample log-cumulants of the emulsion slicks had become more similar to those of open water than those of plant oil had. We find that the sample log-cumulants of the various oil slicks and open water vary between the scenes and also between the slicks and open water. This might be due to changes in ocean and wind conditions, the initial slick properties, and/or differences in the weathering process of the oil slicks.
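
    The first three sample log-cumulants used in such analyses are simply the mean, variance, and third central moment of the log-intensities; the sketch below computes them for synthetic gamma-distributed data standing in for SAR intensities (the UAVSAR scenes themselves are not reproduced here).

```python
import numpy as np

def sample_log_cumulants(intensity):
    """First three sample log-cumulants of positive-valued intensity data."""
    log_x = np.log(intensity)
    k1 = log_x.mean()
    k2 = ((log_x - k1) ** 2).mean()
    k3 = ((log_x - k1) ** 3).mean()
    return k1, k2, k3

# Synthetic gamma-distributed intensities as a stand-in for SAR samples.
rng = np.random.default_rng(0)
intensity = rng.gamma(shape=3.0, scale=0.5, size=10_000)
print([round(k, 3) for k in sample_log_cumulants(intensity)])
```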

  3. Approximations of fractional Brownian motion

    CERN Document Server

    Li, Yuqiang; 10.3150/10-BEJ319

    2012-01-01

    Approximations of fractional Brownian motion using Poisson processes whose parameter sets have the same dimensions as the approximated processes have been studied in the literature. In this paper, a special approximation to the one-parameter fractional Brownian motion is constructed using a two-parameter Poisson process. The proof involves the tightness and identification of finite-dimensional distributions.

  4. Approximation by planar elastic curves

    DEFF Research Database (Denmark)

    Brander, David; Gravesen, Jens; Nørbjerg, Toke Bjerge

    2016-01-01

    We give an algorithm for approximating a given plane curve segment by a planar elastic curve. The method depends on an analytic representation of the space of elastic curve segments, together with a geometric method for obtaining a good initial guess for the approximating curve. A gradient......-driven optimization is then used to find the approximating elastic curve....

  5. Web Log Clustering Approaches – A Survey

    Directory of Open Access Journals (Sweden)

    G. Sudhamathy,

    2011-07-01

    Full Text Available As more organizations rely on the Internet and the World Wide Web to conduct business, the proposed strategies and techniques for market analysis need to be revisited in this context. We therefore present a survey of the most recent work in the field of Web usage mining, focusing on three different approaches to web log clustering. Cluster analysis is a widely used data mining technique that partitions a set of data objects into a number of clusters, where each data object shares high similarity with the other objects within the same cluster but is quite dissimilar to objects in other clusters. In this work we discuss three different approaches to web log clustering and analyze their benefits and drawbacks. We finally identify the most efficient algorithm based on the results of experiments conducted with various web log files.
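
    A minimal version of the clustering step such surveys compare is k-means over per-session feature vectors; the session features and values below are made up, and real web-log clustering typically uses much richer session representations.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical per-session features: [pages viewed, session length (min), distinct sections].
sessions = np.array([
    [3, 2.0, 1], [4, 3.0, 2], [2, 1.5, 1],        # short, shallow visits
    [25, 40.0, 8], [30, 55.0, 9], [22, 35.0, 7],  # long, deep visits
], dtype=float)

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(sessions)
print(kmeans.labels_)           # cluster assignment for each session
print(kmeans.cluster_centers_)  # centroid of each cluster
```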

  6. The Life Between Big Data Log Events

    Directory of Open Access Journals (Sweden)

    George Veletsianos

    2016-06-01

    Full Text Available Big data from massive open online courses (MOOCs) have enabled researchers to examine learning processes at almost infinite levels of granularity. Yet, such data sets do not track every important element in the learning process. Many strategies that MOOC learners use to overcome learning challenges are not captured in clickstream and log data. In this study, we interviewed 92 MOOC learners to better understand their worlds, investigate possible mechanisms of student attrition, and extend conversations about the use of big data in education. Findings reveal three important domains of the experience of MOOC students that are absent from MOOC tracking logs: the practices at learners’ workstations, learners’ activities online but off-platform, and the wider social context of their lives beyond the MOOC. These findings enrich our understanding of learner agency in MOOCs, clarify the spaces in-between recorded tracking log events, and challenge the view that MOOC learners are disembodied autodidacts.

  7. Financial feasibility of a log sort yard handling small-diameter logs: A preliminary study

    Science.gov (United States)

    Han-Sup Han; E. M. (Ted) Bilek; John (Rusty) Dramm; Dan Loeffler; Dave Calkin

    2011-01-01

    The value and use of the trees removed in fuel reduction thinning and restoration treatments could be enhanced if the wood were effectively evaluated and sorted for quality and highest value before delivery to the next manufacturing destination. This article summarizes a preliminary financial feasibility analysis of a log sort yard that would serve as a log market to...

  8. Arabidopsis lonely guy (LOG) multiple mutants reveal a central role of the LOG-dependent pathway in cytokinin activation.

    Science.gov (United States)

    Tokunaga, Hiroki; Kojima, Mikiko; Kuroha, Takeshi; Ishida, Takashi; Sugimoto, Keiko; Kiba, Takatoshi; Sakakibara, Hitoshi

    2012-01-01

    Cytokinins are phytohormones that play key roles in the maintenance of stem cell activity in plants. Although alternative single-step and two-step activation pathways for cytokinin have been proposed, the significance of the single-step pathway, which is catalyzed by LONELY GUY (LOG), is not fully understood. We analyzed the metabolic flow of cytokinin activation in Arabidopsis log multiple mutants using stable isotope-labeled tracers and characterized the mutants' morphological and developmental phenotypes. In tracer experiments, cytokinin activation was inhibited most pronouncedly by log7, while the other log mutations had cumulative effects. Although sextuple or lower-order mutants did not show drastic phenotypes in vegetative growth, the log1 log2 log3 log4 log5 log7 log8 septuple T-DNA insertion mutant, in which the LOG-dependent pathway is impaired, displayed severe retardation of shoot and root growth with defects in the maintenance of the apical meristems. Detailed observation of the mutants showed that LOG7 was required for the maintenance of shoot apical meristem size. LOG7 was also suggested to play a role in normal primary root growth together with LOG3 and LOG4. These results suggest a dominant role of the single-step activation pathway mediated by LOGs in cytokinin production, and overlapping but differentiated functions of the members of the LOG gene family in growth and development.

  9. Drilling, construction, geophysical log data, and lithologic log for boreholes USGS 142 and USGS 142A, Idaho National Laboratory, Idaho

    Science.gov (United States)

    Twining, Brian V.; Hodges, Mary K.V.; Schusler, Kyle; Mudge, Christopher

    2017-07-27

    Starting in 2014, the U.S. Geological Survey, in cooperation with the U.S. Department of Energy, drilled and constructed boreholes USGS 142 and USGS 142A for stratigraphic framework analyses and long-term groundwater monitoring of the eastern Snake River Plain aquifer at the Idaho National Laboratory in southeast Idaho. Borehole USGS 142 initially was cored to collect rock and sediment core, then re-drilled to complete construction as a screened water-level monitoring well. Borehole USGS 142A was drilled and constructed as a monitoring well after construction problems with borehole USGS 142 prevented access to the upper 100 feet (ft) of the aquifer. Boreholes USGS 142 and USGS 142A are separated by about 30 ft and have similar geology and hydrologic characteristics. Groundwater was first measured near 530 feet below land surface (ft BLS) at both borehole locations. Water levels measured through piezometers, separated by almost 1,200 ft, in borehole USGS 142 indicate upward hydraulic gradients at this location. Following construction and data collection, screened water-level access lines were placed in boreholes USGS 142 and USGS 142A to allow for recurring water level measurements. Borehole USGS 142 was cored continuously, starting at the first basalt contact (about 4.9 ft BLS) to a depth of 1,880 ft BLS. Excluding surface sediment, recovery of basalt, rhyolite, and sediment core at borehole USGS 142 was approximately 89 percent or 1,666 ft of total core recovered. Based on visual inspection of core and geophysical data, material examined from 4.9 to 1,880 ft BLS in borehole USGS 142 consists of approximately 45 basalt flows, 16 significant sediment and (or) sedimentary rock layers, and rhyolite welded tuff. Rhyolite was encountered at approximately 1,396 ft BLS. Sediment layers comprise a large percentage of the borehole between 739 and 1,396 ft BLS with grain sizes ranging from clay and silt to cobble size. Sedimentary rock layers had calcite cement. Basalt flows

  10. Development of pulsed neutron uranium logging instrument

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Xin-guang, E-mail: wangxg@upc.edu.cn [School of Geosciences, China University of Petroleum, Qingdao 266580 (China); Engineering Research Center of Nuclear Technology Application (East China Institute of Technology), Ministry of Education, Nanchang 330013 (China); Liu, Dan [China Institute of Atomic Energy, Beijing 102413 (China); Zhang, Feng [School of Geosciences, China University of Petroleum, Qingdao 266580 (China)

    2015-03-15

    This article introduces the development of a pulsed neutron uranium logging instrument. By analyzing the temporal distribution of epithermal neutrons generated from the thermal fission of ²³⁵U, we propose a new method with a uranium-bearing index to calculate the uranium content of the formation. An instrument employing a D-T neutron generator and two epithermal neutron detectors has been developed. The logging response is studied using Monte Carlo simulation and experiments in calibration wells. The simulation and experimental results show that the uranium-bearing index is linearly correlated with the uranium content, and that the porosity and thermal neutron lifetime of the formation can be acquired simultaneously.

  11. Geothermal well log interpretation midterm report

    Energy Technology Data Exchange (ETDEWEB)

    Sanyal, S.K.; Wells, L.E.; Bickham, R.E.

    1979-02-01

    Reservoir types are defined according to fluid phase and temperature, lithology, geologic province, pore geometry, and salinity and fluid chemistry. Improvements are needed in lithology and porosity definition, fracture detection, and thermal evaluation for more accurate interpretation. Further efforts are directed toward improving diagnostic techniques for relating rock characteristics and log response, developing petrophysical models for geothermal systems, and developing thermal evaluation techniques. The Geothermal Well Log Interpretation study and report has concentrated only on hydrothermal geothermal reservoirs. Other geothermal reservoirs (hot dry rock, geopressured, etc.) are not considered.

  12. Perfil logístico de Colombia

    OpenAIRE

    Cardozo Maglioni, María Victoria; Lozano Suarez, María Goretty

    2012-01-01

    This document offers a study of the logistics profile of Colombia, with the aim of presenting the state of the art of its cities, ports, and main infrastructure in order to facilitate decision-making by business leaders. For the development of this work, the main cities and ports were determined on the basis of a competitive analysis of the country, establishing an inventory of machinery, equipment, and logistics infrastructure, framed within challenges already established in the ...

  13. Kaizen aplicado à logística

    OpenAIRE

    Santos, Ana Catarina Almeida

    2011-01-01

    This work aims to achieve an increase in productivity, with a consequent reduction of costs and waste, by helping the logistics department of the company Revigrés – Industria de Revestimentos de Grés, Lda, to become more efficient through the application of the kaizen philosophy, which proposes performance improvements through the implementation of small actions. The objective of this project is the improvement of the logistics department as a whole; to this end, research and surveys were carried out ...

  14. International Conference Approximation Theory XIV

    CERN Document Server

    Schumaker, Larry

    2014-01-01

    This volume developed from papers presented at the international conference Approximation Theory XIV,  held April 7–10, 2013 in San Antonio, Texas. The proceedings contains surveys by invited speakers, covering topics such as splines on non-tensor-product meshes, Wachspress and mean value coordinates, curvelets and shearlets, barycentric interpolation, and polynomial approximation on spheres and balls. Other contributed papers address a variety of current topics in approximation theory, including eigenvalue sequences of positive integral operators, image registration, and support vector machines. This book will be of interest to mathematicians, engineers, and computer scientists working in approximation theory, computer-aided geometric design, numerical analysis, and related approximation areas.

  15. Exact constants in approximation theory

    CERN Document Server

    Korneichuk, N

    1991-01-01

    This book is intended as a self-contained introduction for non-specialists, or as a reference work for experts, to the particular area of approximation theory that is concerned with exact constants. The results apply mainly to extremal problems in approximation theory, which in turn are closely related to numerical analysis and optimization. The book encompasses a wide range of questions and problems: best approximation by polynomials and splines; linear approximation methods, such as spline-approximation; optimal reconstruction of functions and linear functionals. Many of the results are base

  16. Solving SDD linear systems in time $\\tilde{O}(m\\log{n}\\log(1/\\epsilon))$

    CERN Document Server

    Koutis, Ioannis; Peng, Richard

    2011-01-01

    We present an algorithm that on input of an $n\times n$ symmetric diagonally dominant matrix $A$ with $m$ non-zero entries constructs in time ${\tilde O}(m\log n)$ in the RAM model a solver which on input of a vector $b$ computes a vector ${x}$ satisfying $||{x}-A^{+}b||_A<\epsilon ||A^{+}b||_A$ in time ${\tilde O}(m\log n \log (1/\epsilon))$. The $\tilde{O}$ notation hides a $(\log\log n)^2$ factor. The new algorithm exploits previously unknown structural properties of the output of the incremental sparsification algorithm given in \cite{KoutisMP10}. It is also based on an accelerated construction of low-stretch spanning trees in the RAM model by substituting Fibonacci heaps with RAM-based priority queues.
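
    The guarantee above refers to the authors' near-linear-time solver; purely as a point of comparison, the sketch below solves a small SDD system (a path-graph Laplacian plus a diagonal shift) with SciPy's standard conjugate gradient routine, which is a different and asymptotically slower method.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import cg

# Build a small SDD matrix: a path-graph Laplacian plus a small diagonal shift.
n = 1_000
main = 2.0 * np.ones(n)
off = -1.0 * np.ones(n - 1)
A = sp.diags([off, main, off], offsets=[-1, 0, 1], format="csc") + 0.01 * sp.identity(n, format="csc")

b = np.ones(n)
x, info = cg(A, b)               # info == 0 signals convergence to the default tolerance
print(info, np.linalg.norm(A @ x - b))
```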

  17. Log-cubic method for generation of soil particle size distribution curve.

    Science.gov (United States)

    Shang, Songhao

    2013-01-01

    Particle size distribution (PSD) is a fundamental physical property of soils. Traditionally, the PSD curve was generated by hand from limited particle size analysis data, which is subjective and may lead to significant uncertainty in the freehand PSD curve and in graphically estimated cumulative particle percentages. To overcome these problems, a log-cubic method was proposed for the generation of the PSD curve, based on a monotone piecewise cubic interpolation method. The log-cubic method and the commonly used log-linear and log-spline methods were evaluated by the leave-one-out cross-validation method for 394 soil samples extracted from the UNSODA database. The mean error and root mean square error of the cross-validation show that the log-cubic method outperforms the two other methods. More importantly, the PSD curve generated by the log-cubic method meets the essential requirements of a PSD curve; that is, it passes through all measured data points and is both smooth and monotone. The proposed log-cubic method provides an objective and reliable way to generate a PSD curve from limited soil particle analysis data. This method and the generated PSD curve can be used in the conversion between different soil texture schemes, assessment of grading pattern, and estimation of soil hydraulic parameters and the erodibility factor.
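
    Monotone piecewise cubic interpolation in the log-size domain is available directly in SciPy as the PCHIP interpolator; the sieve diameters and cumulative percentages below are illustrative and are not data from the UNSODA samples used in the paper.

```python
import numpy as np
from scipy.interpolate import PchipInterpolator

# Illustrative particle-size analysis data: diameters (mm) and cumulative percentage passing.
diameter = np.array([0.002, 0.02, 0.05, 0.25, 2.0])
passing = np.array([12.0, 30.0, 45.0, 80.0, 100.0])

# Interpolate cumulative percentage against log10(diameter) with a monotone
# piecewise cubic (PCHIP), which keeps the curve smooth and non-decreasing.
psd_curve = PchipInterpolator(np.log10(diameter), passing)

query = np.array([0.01, 0.1, 1.0])                 # diameters of interest (mm)
print(np.round(psd_curve(np.log10(query)), 1))
```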

  18. Huffman Coding with Letter Costs: A Linear-Time Approximation Scheme

    OpenAIRE

    Golin, Mordecai; Mathieu, Claire; Young, Neal E.

    2002-01-01

    We give a polynomial-time approximation scheme for the generalization of Huffman Coding in which codeword letters have non-uniform costs (as in Morse code, where the dash is twice as long as the dot). The algorithm computes a (1+epsilon)-approximate solution in time O(n + f(epsilon) log^3 n), where n is the input size.

  19. Approximation algorithms for indefinite complex quadratic maximization problems

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    In this paper, we consider the following indefinite complex quadratic maximization problem: maximize z^H Q z, subject to z_k ∈ ℂ and z_k^m = 1, k = 1, ..., n, where Q is a Hermitian matrix with tr Q = 0, z ∈ ℂ^n is the decision vector, and m ≥ 3. An Ω(1/log n) approximation algorithm is presented for this problem. Furthermore, we consider the above problem where the objective matrix Q is in bilinear form, in which case a 0.7118 cos²(π/m) approximation algorithm can be constructed. In the context of quadratic optimization, various extensions and connections of the model are discussed.

  20. Refining Approximating Betweenness Centrality Based on Samplings

    CERN Document Server

    Ji, Shiyu

    2016-01-01

    Betweenness Centrality (BC) is an important measure used widely in complex network analysis, such as social network, web page search, etc. Computing the exact BC values is highly time consuming. Currently the fastest exact BC determining algorithm is given by Brandes, taking $O(nm)$ time for unweighted graphs and $O(nm+n^2\\log n)$ time for weighted graphs, where $n$ is the number of vertices and $m$ is the number of edges in the graph. Due to the extreme difficulty of reducing the time complexity of exact BC determining problem, many researchers have considered the possibility of any satisfactory BC approximation algorithms, especially those based on samplings. Bader et al. give the currently best BC approximation algorithm, with a high probability to successfully estimate the BC of one vertex within a factor of $1/\\varepsilon$ using $\\varepsilon t$ samples, where $t$ is the ratio between $n^2$ and the BC value of the vertex. However, some of the algorithmic parameters in Bader's work are not yet tightly boun...
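
    The pivot-sampling idea is exposed directly in NetworkX: passing k to betweenness_centrality estimates BC from k sampled source vertices instead of all n. The random graph and sample size below are arbitrary choices for illustration and do not reproduce the estimator analyzed in the paper.

```python
import networkx as nx

G = nx.gnm_random_graph(n=500, m=2_000, seed=0)

exact = nx.betweenness_centrality(G)                  # Brandes' O(nm) algorithm
approx = nx.betweenness_centrality(G, k=50, seed=1)   # estimate from 50 sampled pivots

node = max(exact, key=exact.get)                      # most central vertex
print(round(exact[node], 4), round(approx[node], 4))
```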

  1. Lead Poisoning

    Science.gov (United States)

    ... lead is of microscopic size, invisible to the naked eye. More often than not, children with elevated ... majority of the childhood lead poisoning cases we see today. Children and adults too can get seriously ...

  2. Logging impacts on avian species richness and composition differ across latitudes and foraging and breeding habitat preferences.

    Science.gov (United States)

    LaManna, Joseph A; Martin, Thomas E

    2016-10-10

    Understanding the causes underlying changes in species diversity is a fundamental pursuit of ecology. Animal species richness and composition often change with decreased forest structural complexity associated with logging. Yet differences in latitude and forest type may strongly influence how species diversity responds to logging. We performed a meta-analysis of logging effects on local species richness and composition of birds across the world and assessed responses by different guilds (nesting strata, foraging strata, diet, and body size). This approach allowed identification of species attributes that might underlie responses to this anthropogenic disturbance. We only examined studies that allowed forests to regrow naturally following logging, and accounted for logging intensity, spatial extent, successional regrowth after logging, and the change in species composition expected due to random assembly from regional species pools. Selective logging in the tropics and clearcut logging in temperate latitudes caused loss of species from nearly all forest strata (ground to canopy), leading to substantial declines in species richness (up to 27% of species). Few species were lost or gained following any intensity of logging in lower-latitude temperate forests, but the relative abundances of these species changed substantially. Selective logging at higher-temperate latitudes generally replaced late-successional specialists with early-successional specialists, leading to no net changes in species richness but large changes in species composition. Removing less basal area during logging mitigated the loss of avian species from all forests and, in some cases, increased diversity in temperate forests. This meta-analysis provides insights into the important role of habitat specialization in determining differential responses of animal communities to logging across tropical and temperate latitudes.

  3. Logging impacts on avian species richness and composition differ across latitudes relative to foraging and breeding habitat preferences

    Science.gov (United States)

    LaManna, Joseph A.; Martin, Thomas E.

    2017-01-01

    Understanding the causes underlying changes in species diversity is a fundamental pursuit of ecology. Animal species richness and composition often change with decreased forest structural complexity associated with logging. Yet differences in latitude and forest type may strongly influence how species diversity responds to logging. We performed a meta-analysis of logging effects on local species richness and composition of birds across the world and assessed responses by different guilds (nesting strata, foraging strata, diet, and body size). This approach allowed identification of species attributes that might underlie responses to this anthropogenic disturbance. We only examined studies that allowed forests to regrow naturally following logging, and accounted for logging intensity, spatial extent, successional regrowth after logging, and the change in species composition expected due to random assembly from regional species pools. Selective logging in the tropics and clearcut logging in temperate latitudes caused loss of species from nearly all forest strata (ground to canopy), leading to substantial declines in species richness (up to 27% of species). Few species were lost or gained following any intensity of logging in lower-latitude temperate forests, but the relative abundances of these species changed substantially. Selective logging at higher-temperate latitudes generally replaced late-successional specialists with early-successional specialists, leading to no net changes in species richness but large changes in species composition. Removing less basal area during logging mitigated the loss of avian species from all forests and, in some cases, increased diversity in temperate forests. This meta-analysis provides insights into the important role of habitat specialization in determining differential responses of animal communities to logging across tropical and temperate latitudes.

  4. Three-Way Channels With Multiple Unicast Sessions: Capacity Approximation via Network Transformation

    KAUST Repository

    Chaaban, Anas

    2016-09-28

    A network of three nodes mutually communicating with each other is studied. This multi-way network is a suitable model for three-user device-to-device communications. The main goal of this paper is to characterize the capacity region of the underlying Gaussian three-way channel (3WC) within a constant gap. To this end, a capacity outer bound is derived using cut-set bounds and genie-aided bounds. For achievability, the 3WC is first transformed into an equivalent star channel. The latter is then decomposed into a set of “successive” sub-channels, leading to a sub-channel allocation problem. Using backward decoding, interference neutralization, and known results on the capacity of the star channel relying on physical-layer network coding, an achievable rate region for the 3WC is obtained. It is then shown that the achievable rate region is within a constant gap of the developed outer bound, leading to the desired capacity approximation. Interestingly, in contrast to the Gaussian two-way channel (TWC), adaptation is necessary in the 3WC. Furthermore, message splitting is another ingredient of the developed scheme for the 3WC, which is not required in the TWC. The two setups are, however, similar in terms of their sum-capacity pre-log, which is equal to 2. Finally, some interesting networks and their approximate capacities are recovered as special cases of the 3WC, such as the cooperative broadcast channel and multiple access channel.

  5. Relational Leading

    DEFF Research Database (Denmark)

    2015-01-01

    This first chapter presents the exploratory and curious approach to leading as relational processes – an approach that pervades the entire book. We explore leading from a perspective that emphasises the unpredictable challenges and triviality of everyday life, which we consider an interesting, relevant and realistic way to examine leading. The chapter brings up a number of concepts and contexts as formulated by researchers within the field, and in this way seeks to construct a first understanding of relational leading.

  6. Singular and combined effects of blowdown, salvage logging, and wildfire on forest floor and soil mercury pools.

    Science.gov (United States)

    Mitchell, Carl P J; Kolka, Randall K; Fraver, Shawn

    2012-08-07

    A number of factors influence the amount of mercury (Hg) in forest floors and soils, including deposition, volatile emission, leaching, and disturbances such as fire. Currently the impact on soil Hg pools from other widespread forest disturbances such as blowdown and management practices like salvage logging is unknown. Moreover, ecological and biogeochemical responses to disturbances are generally investigated within a single-disturbance context, with little currently known about the impact of multiple disturbances occurring in rapid succession. In this study we capitalize on a combination of blowdown, salvage logging and fire events in the sub-boreal region of northern Minnesota to assess both the singular and combined effects of these disturbances on forest floor and soil total Hg concentrations and pools. Although none of the disturbance combinations affected Hg in mineral soil, we did observe significant effects on both Hg concentrations and pools in the forest floor. Blowdown increased the mean Hg pool in the forest floor by 0.76 mg Hg m⁻² (223%). Salvage logging following blowdown created conditions leading to a significantly more severe forest floor burn during wildfire, which significantly enhanced Hg emission. This sequence of combined events resulted in a mean loss of approximately 0.42 mg Hg m⁻² (68% of pool) from the forest floor, after conservatively accounting for potential losses via enhanced soil leaching and volatile emissions between the disturbance and sampling dates. Fire alone or blowdown followed by fire did not significantly affect the total Hg concentrations or pools in the forest floor. Overall, unexpected consequences for soil Hg accumulation and, by extension, atmospheric Hg emission and risk to aquatic biota, may result when combined impacts are considered in addition to singular forest floor and soil disturbances.

  7. Theoretical studies of permeability inversion from seismoelectric logs

    Science.gov (United States)

    Hu, H.; Guan, W.; Zhao, W.

    2012-04-01

    Permeability is one of the most important parameters for evaluating the level of difficulty in oil and gas exploitation. A quick, continuous and accurate in-situ estimate of reservoir permeability is highly significant. Stoneley wave logs have been used to determine formation permeability (Tang and Cheng, 1996). However, the inversion errors of this method are too big in low-permeability formations, especially in high-porosity and low-permeability formations resulting from the high clay content in pores. In this study, we propose to invert permeability by using the full waveforms of seismoelectric logs with low frequencies. This method is based on the relationship of permeability with the ratio of the electric excitation intensity to the pressure field's (REP) with respect to the Stoneley wave in seismoelectric logs. By solving the governing equations for electrokinetic coupled wavefields in homogeneous fluid-saturated porous media (Pride, 1994), we calculate the full waveforms of the borehole seismoelectric wavefields excited by a point pressure source and investigate frequency-dependent excitation intensities of the mode waves and excitation intensities of the real branch points in seismoelectric logs. It is found that the REP's phase, which reflects the phase discrepancy between the Stoneley-wave-induced electric field and the acoustic pressure, is sensitive to formation permeability. To check the relation between permeability and REP's phase qualitatively, an approximate expression of the tangent of the REP's argument is derived theoretically as tan(θ_EP) ≈ -ω_c/ω = -φη/(2πf α_∞ ρ_f κ_0), where θ_EP denotes the arguments of the REP and their principal value is the REP's phase, ω is the angular frequency, ω_c is a critical angular frequency that separates the low-frequency viscous flow from the high-frequency inertial flow, φ is the porosity, α_∞ is the tortuosity, κ_0 is the Darcy permeability, ρ_f and η are the density and the viscosity of the pore
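
    The approximate expression quoted above is easy to evaluate numerically; the rock and fluid parameters below are assumed, order-of-magnitude values chosen only to show how the REP phase depends on permeability.

```python
import math

def tan_theta_ep(frequency, porosity, viscosity, tortuosity, fluid_density, permeability):
    """Approximate tangent of the REP argument:
    tan(theta_EP) ~ -phi * eta / (2 * pi * f * alpha_inf * rho_f * kappa_0)."""
    return -(porosity * viscosity) / (
        2.0 * math.pi * frequency * tortuosity * fluid_density * permeability
    )

# Assumed values: 1 kHz source, porosity 0.20, water viscosity 1e-3 Pa*s,
# tortuosity 3, water density 1000 kg/m^3, permeability ~1 darcy (1e-12 m^2).
print(tan_theta_ep(1e3, 0.20, 1e-3, 3.0, 1e3, 1e-12))
```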

  8. Neutron generator for the array borehole logging

    Institute of Scientific and Technical Information of China (English)

    LuHong-Bo; ZhongZhen-Qian; 等

    1998-01-01

    The performance mechanism of the array neutron generator to be used for porosity logging is presented. The neutron generator utilizes a drive-in target ceramic neutron tube, which produces neutron bursts with selectable fast and slow periods. Pressure regulation of the neutron tube is accomplished by pulse width modulation. The high voltage power supply is operated at its optimum frequency.

  9. Precision Prediction of the Log Power Spectrum

    CERN Document Server

    Repp, Andrew

    2016-01-01

    At translinear scales, the log power spectrum captures significantly more cosmological information than the standard power spectrum. At high wavenumbers $k$, the cosmological information in the standard power spectrum $P(k)$ fails to increase in proportion to $k$ due to correlations between large- and small-scale modes. As a result, $P(k)$ suffers from an information plateau on these translinear scales, so that analysis with the standard power spectrum cannot access the information contained in these small-scale modes. The log power spectrum $P_A(k)$, on the other hand, captures the majority of this otherwise lost information. Until now there has been no means of predicting the amplitude of the log power spectrum apart from cataloging the results of simulations. We here present a cosmology-independent prescription for the log power spectrum, and we find this prescription to display accuracy comparable to that of Smith et al. (2003), over a range of redshifts and smoothing scales, and for wavenumbers up to $1....
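
    The log transform itself is a one-liner: the log-density field is A = ln(1 + delta), and its power spectrum is measured exactly as for delta. The sketch below does this on a synthetic one-dimensional lognormal field; it illustrates the definition only and does not reproduce the paper's prescription for predicting the amplitude of P_A(k).

```python
import numpy as np

rng = np.random.default_rng(0)
g = 0.5 * rng.standard_normal(4096)
delta = np.exp(g - g.var() / 2.0) - 1.0        # lognormal overdensity field, 1 + delta > 0

def power_spectrum(field):
    """Power spectrum of a 1-D field from squared FFT amplitudes."""
    fk = np.fft.rfft(field - field.mean())
    return np.abs(fk) ** 2 / field.size

p_delta = power_spectrum(delta)                # standard power spectrum P(k)
p_log = power_spectrum(np.log(1.0 + delta))    # log power spectrum P_A(k)
print(p_delta[1:6].round(4), p_log[1:6].round(4))
```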

  10. The fluid-compensated cement bond log

    Energy Technology Data Exchange (ETDEWEB)

    Nayfeh, T.H.; Wheelis, W.B. Jr.; Leslie, H.D.

    1986-08-01

    Simulations of cement bond logging (CBL) have shown that wellbore fluid effects can be segregated from sonic-signal response to changing cement strengths. Traditionally, the effects have been considered negligible and the CBL's have been interpreted as if water were in the wellbore. However, large variations in CBL's have become apparent with the increasing number of logs run in completion fluids, such as CaCl₂, ZnBr₂, and CaBr₂. To study wellbore fluid effects, physical and numerical models were developed that simulated the wellbore geometry. Measurements were conducted in 5-, 7-, and 9 5/8-in. casings for a range of wellbore fluid types and for both densities and viscosities. Parallel numerical modeling used similar parameters. Results show that bond-log amplitudes varied dramatically with the wellbore fluid acoustic impedance; i.e., there was a 70% increase in signal amplitudes for 11.5-lbm/gal (1370-kg/m³) CaCl₂ over the signal amplitude in water. This led to the development of a fluid-compensated bond log that corrects the amplitude for acoustic impedance of various wellbore fluids, thereby making the measurements more directly related to the cement quality.

  11. Apache Flume distributed log collection for Hadoop

    CERN Document Server

    D'Souza, Subas

    2013-01-01

    A starter guide that covers Apache Flume in detail. Apache Flume: Distributed Log Collection for Hadoop is intended for people who are responsible for moving datasets into Hadoop in a timely and reliable manner, such as software engineers, database administrators, and data warehouse administrators.

  12. Smartphone log data in a qualitative perspective

    DEFF Research Database (Denmark)

    Ørmen, Jacob; Thorhauge, Anne Mette

    2015-01-01

    Log data from smartphones have primarily been used in large-scale research designs to draw statistical inferences from hundreds or even thousands of participants. In this article, we argue that more qualitatively oriented designs can also benefit greatly from integrating these rich data sources...

  13. Discovering the Local Landscape: Pioneer Log Buildings.

    Science.gov (United States)

    Douglas, Bob; And Others

    Building structures made from logs appeared in the eastern United States during the late 17th century, and immigrants from Sweden, Finland, and Germany are credited with their construction. There were two types of structures: the horizontal design introduced by the Scandinavians and the German or Pennsylvania Dutch model that was used by the…

  14. SOME NONLINEAR APPROXIMATIONS FOR MATRIX-VALUED FUNCTIONS

    Institute of Scientific and Technical Information of China (English)

    Guo-liang Xu

    2003-01-01

    Some nonlinear approximants, i.e., exponential-sum interpolation with equal distance or at the origin, and (0,1)-type, (0,2)-type and (1,2)-type fraction-sum approximations, for matrix-valued functions are introduced. All these approximation problems lead to a system of nonlinear equations of the same form. Solving methods for the nonlinear system are discussed. Conclusions on the uniqueness and convergence of the approximants for certain classes of functions are given.

  15. CNPC makes major breakthrough in array lateral logging technology

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    On October 12, 2011, the array lateral logging tool independently developed by CNPC Well Logging Company was successfully put to use at two production wells at Changqing Oilfield, obtaining high quality logging information. CNPC thus became the second company in the world, after Schlumberger, to master array lateral logging technology, which can effectively identify layers as thin as 0.3 meter.

  16. 32 CFR 700.846 - Status of logs.

    Science.gov (United States)

    2010-07-01

    32 CFR 700.846 (2010-07-01): Status of logs. National Defense; Department of Defense (Continued); Department of the Navy; United States Navy Regulations and ...; Officers Afloat. § 700.846 Status of logs: The deck log, the engineering log, the compass record, ...

  17. 47 CFR 73.782 - Retention of logs.

    Science.gov (United States)

    2010-10-01

    47 CFR 73.782 (2010-10-01): Retention of logs. Telecommunication; ...; International Broadcast Stations. § 73.782 Retention of logs: Logs of international broadcast stations shall be retained by the licensee or permittee for a period of two years: Provided, however, That logs ...

  18. 47 CFR 73.1840 - Retention of logs.

    Science.gov (United States)

    2010-10-01

    47 CFR 73.1840 (2010-10-01): Retention of logs. Telecommunication; ...; Rules Applicable to All Broadcast Stations. § 73.1840 Retention of logs: (a) Any log required to be kept by station licensees shall be retained by them for a period of 2 years. However, logs ...

  19. Performance of sampling methods to estimate log characteristics for wildlife.

    Science.gov (United States)

    Lisa J. Bate; Torolf R. Torgersen; Michael J. Wisdom; Edward O. Garton

    2004-01-01

    Accurate estimation of the characteristics of log resources, or coarse woody debris (CWD), is critical to effective management of wildlife and other forest resources. Despite the importance of logs as wildlife habitat, methods for sampling logs have traditionally focused on silvicultural and fire applications. These applications have emphasized estimates of log volume...

  20. Displacement length and velocity of tagged logs in the tagliamento river

    Directory of Open Access Journals (Sweden)

    Diego Ravazzolo

    2013-09-01

    Full Text Available Large wood enhances the dynamics of geomorphic processes in river systems, increases the morphological complexity of the channel bed, and provides habitats for fish and invertebrates. On the other hand, if transported during high-magnitude events, large wood pieces can increase flood risks at sensitive places such as bridges and narrow cross sections prone to overbank flows. However, the dynamics and mobility of logs in rivers are poorly understood, especially in wide gravel-bed rivers. Recent studies have employed fixed video cameras to assess log velocity, but little evidence is available about travel length during flood events of different magnitude. This study was conducted in a valley reach of the Tagliamento river, located in the north-east of Italy. The Tagliamento river is approximately 800 m wide in the study area and is characterized by relatively natural conditions and complex fluvial dynamics. Log mobility was studied from June 2010 to October 2011, a period characterized by a relatively high-magnitude flood in November 2010. Log mobility and displacement during floods were measured by implanting active radio transmitters (RFID) in 113 logs and GPS track devices in 42 logs. The first devices allow the logs to be recovered after flood events by using a portable antenna, and so to derive the displacement length over the monitoring period, whereas the second devices allow instantaneous (1 s) and average velocities of moving logs to be calculated. The recovery rate of logs equipped with RFID and GPS was about 50% and 60%, respectively. A preliminary analysis of the data collected indicates that there is a positive relationship between displacement length and the peak of flood events, as well as a positive relationship between log velocity and flood magnitude. Also, a critical flow rate over which logs stranded on active bars can be transported has been identified. The ability to predict wood mobility in gravel-bed rivers could

  1. Fast inference in generalized linear models via expected log-likelihoods.

    Science.gov (United States)

    Ramirez, Alexandro D; Paninski, Liam

    2014-04-01

    Generalized linear models play an essential role in a wide variety of statistical applications. This paper discusses an approximation of the likelihood in these models that can greatly facilitate computation. The basic idea is to replace a sum that appears in the exact log-likelihood by an expectation over the model covariates; the resulting "expected log-likelihood" can in many cases be computed significantly faster than the exact log-likelihood. In many neuroscience experiments the distribution over model covariates is controlled by the experimenter and the expected log-likelihood approximation becomes particularly useful; for example, estimators based on maximizing this expected log-likelihood (or a penalized version thereof) can often be obtained with orders of magnitude computational savings compared to the exact maximum likelihood estimators. A risk analysis establishes that these maximum EL estimators often come with little cost in accuracy (and in some cases even improved accuracy) compared to standard maximum likelihood estimates. Finally, we find that these methods can significantly decrease the computation time of marginal likelihood calculations for model selection and of Markov chain Monte Carlo methods for sampling from the posterior parameter distribution. We illustrate our results by applying these methods to a computationally-challenging dataset of neural spike trains obtained via large-scale multi-electrode recordings in the primate retina.
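
    As a concrete illustration of the idea described above, the following minimal sketch (not code from the paper; the data, dimensions, and the Gaussian-covariate assumption are invented for the example) fits a Poisson GLM with an exponential link by maximizing an expected log-likelihood in which the sum of exp(theta·x_i) is replaced by its closed-form expectation under Gaussian covariates:

      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(0)

      # Synthetic Poisson GLM data with Gaussian covariates (hypothetical example).
      n, d = 5000, 3
      mu, C = np.zeros(d), 0.1 * np.eye(d)
      X = rng.multivariate_normal(mu, C, size=n)
      theta_true = np.array([0.5, -0.3, 0.2])
      y = rng.poisson(np.exp(X @ theta_true))

      def exact_negloglik(theta):
          # Exact Poisson log-likelihood (up to constants), negated for minimize().
          eta = X @ theta
          return -(y @ eta - np.exp(eta).sum())

      def expected_negloglik(theta):
          # Replace sum_i exp(theta.x_i) by n * E[exp(theta.x)] for Gaussian x,
          # which has the closed form exp(theta.mu + 0.5 theta' C theta).
          expectation = np.exp(theta @ mu + 0.5 * theta @ C @ theta)
          return -(y @ (X @ theta) - n * expectation)

      theta_el = minimize(expected_negloglik, np.zeros(d)).x
      theta_ml = minimize(exact_negloglik, np.zeros(d)).x
      print(theta_el, theta_ml)   # both estimates should land near theta_true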

  2. On the orbital stability of Gaussian solitary waves in the log-KdV equation

    Science.gov (United States)

    Carles, Rémi; Pelinovsky, Dmitry

    2014-12-01

    We consider the logarithmic Korteweg-de Vries (log-KdV) equation, which models solitary waves in anharmonic chains with Hertzian interaction forces. By using an approximating sequence of global solutions of the regularized generalized KdV equation in H^1({R}) with conserved L2 norm and energy, we construct a weak global solution of the log-KdV equation in a subset of H^1({R}) . This construction yields conditional orbital stability of Gaussian solitary waves of the log-KdV equation, provided that uniqueness and continuous dependence of the constructed solution holds. Furthermore, we study the linearized log-KdV equation at the Gaussian solitary wave and prove that the associated linearized operator has a purely discrete spectrum consisting of simple purely imaginary eigenvalues in addition to the double zero eigenvalue. The eigenfunctions, however, do not decay like Gaussian functions but have algebraic decay. Using numerical approximations, we show that the Gaussian initial data do not spread out but produce visible radiation at the left slope of the Gaussian-like pulse in the time evolution of the linearized log-KdV equation.
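
    For orientation, the Gaussian shape of the solitary waves can be checked directly under one common normalization of the log-KdV equation (the paper's signs and scalings may differ):

      \[ u_t + u_{xxx} + \partial_x\big( u \, \ln|u| \big) = 0 . \]

    A traveling wave $u(x,t) = V(\xi)$ with $\xi = x - ct$ that decays at infinity satisfies, after one integration in $\xi$,

      \[ -c\,V + V'' + V\,\ln|V| = 0 , \]

    and the Gaussian ansatz $V(\xi) = \exp\!\big(c + \tfrac{1}{2} - \tfrac{\xi^{2}}{4}\big)$ solves it, since $V'' = \big(\tfrac{\xi^{2}}{4} - \tfrac{1}{2}\big)V$ and $\ln V = c + \tfrac{1}{2} - \tfrac{\xi^{2}}{4}$.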

  3. A Brief Introduction of Jianghan Well Logging Institute

    Institute of Scientific and Technical Information of China (English)

    Hu Yiliang

    1996-01-01

    Jianghan Well Logging Institute (JHWLI), situated in Qianjiang city of Hubei Province and founded in 1979, is the only specialized well logging institute affiliated to China National Petroleum Corporation (CNPC). It has a basic logging method research department, an image logging research department, a technological support department, a new-tech development & promotion department, a well logging service department, and a diversification company. These are engaged in the research and development of well logging technology and its application, in addition to on-site well logging services.

  4. Requirements-Driven Log Analysis Extended Abstract

    Science.gov (United States)

    Havelund, Klaus

    2012-01-01

    Imagine that you are tasked to help a project improve their testing effort. In a realistic scenario it will quickly become clear that having an impact is difficult. First of all, it will likely be a challenge to suggest an alternative approach which is significantly more automated and/or more effective than current practice. The reality is that an average software system has a complex input/output behavior. An automated testing approach will have to auto-generate test cases, each being a pair (i, o) consisting of a test input i and an oracle o. The test input i has to be somewhat meaningful, and the oracle o can be very complicated to compute. Second, even in the case where some testing technology has been developed that might improve current practice, it is likely difficult to completely change the current behavior of the testing team unless the technique is obviously superior and does everything already done by existing technology. So is there an easier way to incorporate formal-methods-based approaches than the full-fledged test revolution? Fortunately the answer is affirmative. A relatively simple approach is to benefit from possibly already existing logging infrastructure, which after all is part of most systems put in production. A log is a sequence of events, generated by special log recording statements, most often manually inserted in the code by the programmers. An event can be considered as a data record: a mapping from field names to values. We can analyze such a log using formal methods, for example checking it against a formal specification. This separates running the system from analyzing its behavior. It is not meant as an alternative to testing since it does not address the important input generation problem. However, it offers a solution which testing teams might accept since it has low impact on the existing process. A single person might be assigned to perform such log analysis, compared to the entire testing team changing behavior.
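
    As a small illustration of checking a log against a requirement (an invented example, not the author's tooling): events are mappings from field names to values, and the property checked is that every "open" is eventually followed by a matching "close":

      # Minimal log-analysis sketch: report resources that are opened but never closed.
      log = [  # a log is a sequence of events; each event maps field names to values
          {"type": "open", "resource": "fileA"},
          {"type": "write", "resource": "fileA"},
          {"type": "close", "resource": "fileA"},
          {"type": "open", "resource": "fileB"},
      ]

      def check_open_close(events):
          open_resources = set()
          for e in events:
              if e["type"] == "open":
                  open_resources.add(e["resource"])
              elif e["type"] == "close":
                  open_resources.discard(e["resource"])
          return open_resources   # anything left open violates the requirement

      print(check_open_close(log))   # {'fileB'} -> violation reported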

  5. Seasonal logging, process response, and geomorphic work

    Science.gov (United States)

    Mohr, C. H.; Zimmermann, A.; Korup, O.; Iroumé, A.; Francke, T.; Bronstert, A.

    2013-09-01

    Deforestation is a prominent anthropogenic cause of erosive overland flow and slope instability, boosting rates of soil erosion and concomitant sediment flux. Conventional methods of gauging or estimating post-logging sediment flux focus on annual timescales, but potentially overlook important geomorphic responses on shorter time scales immediately following timber harvest. Sediment fluxes are commonly estimated from linear regression of intermittent measurements of water and sediment discharge using sediment rating curves (SRCs). However, these often unsatisfactorily reproduce non-linear effects such as discharge-load hystereses. We resolve such important dynamics from non-parametric Quantile Regression Forests (QRF) of high-frequency (3 min) measurements of stream discharge and sediment concentrations in similar-sized (~ 0.1 km2) forested Chilean catchments that were logged during either the rainy or the dry season. The method of QRF builds on the Random Forest (RF) algorithm, and combines quantile regression with repeated random sub-sampling of both cases and predictors. The algorithm belongs to the family of decision-tree classifiers, which allow quantifying relevant predictors in high-dimensional parameter space. We find that, where no logging occurred, ~ 80% of the total sediment load was transported during rare but high-magnitude runoff events during only 5% of the monitoring period. The variability of sediment flux of these rare events spans four orders of magnitude. In particular, dry-season logging dampened the role of these rare, extreme sediment-transport events by increasing load efficiency during more moderate events. We show that QRF outperforms traditional SRCs in terms of accurately simulating short-term dynamics of sediment flux, and conclude that QRF may reliably support forest management recommendations by providing robust simulations of post-logging response of water and sediment discharge at high temporal resolution.
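
    A rough stand-in for the quantile-regression-forest idea (not the authors' implementation: it simply reads quantiles off per-tree predictions of a scikit-learn random forest, and the discharge/sediment data are synthetic) looks like this:

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor

      # Synthetic discharge/sediment data; parameter values are illustrative only.
      rng = np.random.default_rng(1)
      discharge = rng.lognormal(mean=0.0, sigma=1.0, size=2000)
      sediment = 5.0 * discharge**1.6 * rng.lognormal(0.0, 0.5, size=2000)

      X = np.log(discharge).reshape(-1, 1)
      y = np.log(sediment)

      rf = RandomForestRegressor(n_estimators=200, min_samples_leaf=10, random_state=0)
      rf.fit(X, y)

      # Aggregate per-tree predictions and read off quantiles (a crude proxy for QRF).
      X_new = np.log(np.array([[0.5], [2.0], [10.0]]))
      per_tree = np.stack([tree.predict(X_new) for tree in rf.estimators_])
      q10, q50, q90 = np.exp(np.quantile(per_tree, [0.1, 0.5, 0.9], axis=0))
      print(q50)        # median sediment-load estimate per discharge level
      print(q10, q90)   # spread, in contrast to a single rating-curve fit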

  7. Log-Determinant Divergences Revisited: Alpha-Beta and Gamma Log-Det Divergences

    Directory of Open Access Journals (Sweden)

    Andrzej Cichocki

    2015-05-01

    Full Text Available This work reviews and extends a family of log-determinant (log-det divergences for symmetric positive definite (SPD matrices and discusses their fundamental properties. We show how to use parameterized Alpha-Beta (AB and Gamma log-det divergences to generate many well-known divergences; in particular, we consider the Stein’s loss, the S-divergence, also called Jensen-Bregman LogDet (JBLD divergence, Logdet Zero (Bhattacharyya divergence, Affine Invariant Riemannian Metric (AIRM, and other divergences. Moreover, we establish links and correspondences between log-det divergences and visualise them on an alpha-beta plane for various sets of parameters. We use this unifying framework to interpret and extend existing similarity measures for semidefinite covariance matrices in finite-dimensional Reproducing Kernel Hilbert Spaces (RKHS. This paper also shows how the Alpha-Beta family of log-det divergences relates to the divergences of multivariate and multilinear normal distributions. Closed form formulas are derived for Gamma divergences of two multivariate Gaussian densities; the special cases of the Kullback-Leibler, Bhattacharyya, Rényi, and Cauchy-Schwartz divergences are discussed. Symmetrized versions of log-det divergences are also considered and briefly reviewed. Finally, a class of divergences is extended to multiway divergences for separable covariance (or precision matrices.
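
    Two members of this family have simple closed forms; the following numpy sketch (illustrative, using formulas standard in the log-det literature rather than anything specific to this paper) evaluates Stein's loss and the Jensen-Bregman LogDet (S-) divergence for arbitrary SPD test matrices:

      import numpy as np

      def _logdet(M):
          sign, val = np.linalg.slogdet(M)
          assert sign > 0, "matrix must be positive definite"
          return val

      def stein_loss(A, B):
          # Stein's loss between SPD matrices A and B.
          n = A.shape[0]
          ABinv = A @ np.linalg.inv(B)
          return np.trace(ABinv) - _logdet(ABinv) - n

      def s_divergence(A, B):
          # Jensen-Bregman LogDet (S-) divergence.
          return _logdet((A + B) / 2.0) - 0.5 * (_logdet(A) + _logdet(B))

      # Small SPD test matrices (arbitrary choices for the example).
      rng = np.random.default_rng(2)
      X = rng.standard_normal((4, 4)); A = X @ X.T + 4 * np.eye(4)
      Y = rng.standard_normal((4, 4)); B = Y @ Y.T + 4 * np.eye(4)
      print(stein_loss(A, B), s_divergence(A, B), s_divergence(A, A))  # last is 0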

  8. BDD Minimization for Approximate Computing

    OpenAIRE

    Soeken, Mathias; Grosse, Daniel; Chandrasekharan, Arun; Drechsler, Rolf

    2016-01-01

    We present Approximate BDD Minimization (ABM) as a problem that has application in approximate computing. Given a BDD representation of a multi-output Boolean function, ABM asks whether there exists another function that has a smaller BDD representation but meets a threshold w.r.t. an error metric. We present operators to derive approximated functions and present algorithms to exactly compute the error metrics directly on the BDD representation. An experimental evaluation demonstrates the app...

  9. Tree wavelet approximations with applications

    Institute of Scientific and Technical Information of China (English)

    XU Yuesheng; ZOU Qingsong

    2005-01-01

    We construct a tree wavelet approximation by using a constructive greedy scheme(CGS). We define a function class which contains the functions whose piecewise polynomial approximations generated by the CGS have a prescribed global convergence rate and establish embedding properties of this class. We provide sufficient conditions on a tree index set and on bi-orthogonal wavelet bases which ensure optimal order of convergence for the wavelet approximations encoded on the tree index set using the bi-orthogonal wavelet bases. We then show that if we use the tree index set associated with the partition generated by the CGS to encode a wavelet approximation, it gives optimal order of convergence.

  10. Impacts of Unsustainable Mahogany Logging in Bolivia and Peru

    Directory of Open Access Journals (Sweden)

    Roberto F. Kometter

    2004-06-01

    Full Text Available Although bigleaf mahogany [Swietenia macrophylla King (Meliaceae)] is the premier timber species of Latin America, its exploitation is unsustainable because of a pattern of local depletion and shifting supply. We surveyed experts on the status of mahogany in Bolivia and Peru, the world's past and present largest exporters. Bolivia no longer has commercially viable mahogany (trees > 60 cm diameter at breast height) across 79% of its range. In Peru, mahogany's range has shrunk by 50%, and, within a decade, a further 28% will be logged out. Approximately 15% of the mahogany range in these two countries is protected, but low densities and illegal logging mean that this overestimates the extent of mahogany under protection. The international community can support mahogany conservation by funding park management and by encouraging independent verification of the legality of mahogany in trade. Our findings demonstrate that a systematic expert survey can generate reliable and cost-effective information on the status of widespread species of concern and help to inform appropriate management policy.

  11. Parameter estimation and forecasting for multiplicative log-normal cascades.

    Science.gov (United States)

    Leövey, Andrés E; Lux, Thomas

    2012-04-01

    We study the well-known multiplicative log-normal cascade process in which the multiplication of Gaussian and log normally distributed random variables yields time series with intermittent bursts of activity. Due to the nonstationarity of this process and the combinatorial nature of such a formalism, its parameters have been estimated mostly by fitting the numerical approximation of the associated non-Gaussian probability density function to empirical data, cf. Castaing et al. [Physica D 46, 177 (1990)]. More recently, alternative estimators based upon various moments have been proposed by Beck [Physica D 193, 195 (2004)] and Kiyono et al. [Phys. Rev. E 76, 041113 (2007)]. In this paper, we pursue this moment-based approach further and develop a more rigorous generalized method of moments (GMM) estimation procedure to cope with the documented difficulties of previous methodologies. We show that even under uncertainty about the actual number of cascade steps, our methodology yields very reliable results for the estimated intermittency parameter. Employing the Levinson-Durbin algorithm for best linear forecasts, we also show that estimated parameters can be used for forecasting the evolution of the turbulent flow. We compare forecasting results from the GMM and Kiyono et al.'s procedure via Monte Carlo simulations. We finally test the applicability of our approach by estimating the intermittency parameter and forecasting of volatility for a sample of financial data from stock and foreign exchange markets.
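
    The process itself is easy to simulate; the sketch below (illustrative parameter values, not taken from the paper) builds a discrete multiplicative cascade with log-normal weights, whose output shows the intermittent bursts described above:

      import numpy as np

      # Discrete multiplicative log-normal cascade: at each of k steps every segment
      # is split in two and multiplied by an independent log-normal weight.
      rng = np.random.default_rng(3)
      k, lam2 = 12, 0.1          # number of cascade steps and intermittency parameter
      field = np.ones(1)
      for _ in range(k):
          field = np.repeat(field, 2)
          # log-normal multipliers with log-variance lam2, mean chosen so E[w] = 1
          field *= rng.lognormal(mean=-lam2 / 2.0, sigma=np.sqrt(lam2), size=field.size)

      print(field.size, field.mean(), field.max() / field.mean())   # 4096 segments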

  12. Neutron logging tool readings and neutron parameters of formations

    Science.gov (United States)

    Czubek, Jan A.

    1995-03-01

    A case history of the calibration of neutron porosity tools is given in the paper. The calibration of neutron porosity tools is one of the most difficult, complicated, and time-consuming tasks in well logging operations in geophysics. A semi-empirical approach to this problem is given in the paper. It is based on the correlation of the tool readings observed in known environments with the apparent neutron parameters sensed by the tools. The apparent neutron parameters are functions of the true neutron parameters of geological formations and of the borehole material, borehole diameter, and the tool position inside the borehole. The true integral neutron transport parameters are obtained by the multigroup diffusion approximation for slowing down of neutrons and by one thermal neutron group for the diffusion. In the latter, the effective neutron temperature is taken into account. The problem of the thermal neutron absorption cross section of rocks is discussed in detail from the point of view of its importance for the well logging results and for the experimental techniques being used.

  13. Implementing regularization implicitly via approximate eigenvector computation

    CERN Document Server

    Mahoney, Michael W

    2010-01-01

    Regularization is a powerful technique for extracting useful information from noisy data. Typically, it is implemented by adding some sort of norm constraint to an objective function and then exactly optimizing the modified objective function. This procedure typically leads to optimization problems that are computationally more expensive than the original problem, a fact that is clearly problematic if one is interested in large-scale applications. On the other hand, a large body of empirical work has demonstrated that heuristics, and in some cases approximation algorithms, developed to speed up computations sometimes have the side-effect of performing regularization implicitly. Thus, we consider the question: What is the regularized optimization objective that an approximation algorithm is exactly optimizing? We address this question in the context of computing approximations to the smallest nontrivial eigenvector of a graph Laplacian; and we consider three random-walk-based procedures: one based on the heat ...

  14. Monte Carlo Numerical Models for Nuclear Logging Applications

    OpenAIRE

    Fusheng Li; Xiaogang Han

    2012-01-01

    Nuclear logging is one of the most important logging services provided by many oil service companies. The main parameters of interest are formation porosity, bulk density, and natural radiation. Other services, such as formation lithology/mineralogy, are also provided using complex nuclear logging tools. Some parameters can be measured by using neutron logging tools and some can only be measured by using a gamma ray tool. To understand the response of nuclear logging tools, the neutron t...

  15. Topics in multivariate approximation and interpolation

    CERN Document Server

    Jetter, Kurt

    2005-01-01

    This book is a collection of eleven articles, written by leading experts and dealing with special topics in Multivariate Approximation and Interpolation. The material discussed here has far-reaching applications in many areas of Applied Mathematics, such as in Computer Aided Geometric Design, in Mathematical Modelling, in Signal and Image Processing and in Machine Learning, to mention a few. The book aims at giving a comprehensive information leading the reader from the fundamental notions and results of each field to the forefront of research. It is an ideal and up-to-date introduction for gr

  16. Securing a cyber physical system in nuclear power plants using least square approximation and computational geometric approach

    Energy Technology Data Exchange (ETDEWEB)

    Gawand, Hemangi Laxman [Homi Bhabha National Institute, Computer Section, BARC, Mumbai (India); Bhattacharjee, A. K. [Reactor Control Division, BARC, Mumbai (India); Roy, Kallol [BHAVINI, Kalpakkam (India)

    2017-04-15

    In industrial plants such as nuclear power plants, system operations are performed by embedded controllers orchestrated by Supervisory Control and Data Acquisition (SCADA) software. A targeted attack (also termed a control-aware attack) on the controller/SCADA software can lead a control system to operate in an unsafe mode or sometimes to complete shutdown of the plant. Such malware attacks can result in tremendous cost to the organization for recovery, cleanup, and maintenance activity. SCADA systems in operational mode generate huge log files. These files are useful in analysis of the plant behavior and diagnostics during an ongoing attack. However, they are bulky and difficult for manual inspection. Data mining techniques such as least squares approximation and computational geometric methods can be used in the analysis of logs and to take proactive actions when required. This paper explores methodologies and algorithms for developing an effective monitoring scheme against control-aware cyber attacks. It also explains soft computation techniques such as the computational geometric method and least squares approximation that can be effective in monitor design. The paper provides insights into diagnostic monitoring, demonstrating its effectiveness through attack simulations on a four-tank model diagnosed with these computational techniques. Cyber security of instrumentation and control systems used in nuclear power plants is of paramount importance and hence could be a possible target of such applications.
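
    One very simple instance of the least-squares monitoring idea (illustrative only, not the paper's algorithm; the tank-level series and the threshold are invented) is sketched below:

      import numpy as np

      # Fit a straight line to each window of tank-level readings recovered from
      # SCADA logs and flag samples whose residual exceeds a fixed threshold.
      def flag_anomalies(t, level, window=100, threshold=1.0):
          flags = np.zeros_like(level, dtype=bool)
          for start in range(0, len(level) - window + 1, window):
              sl = slice(start, start + window)
              slope, intercept = np.polyfit(t[sl], level[sl], deg=1)
              residual = level[sl] - (slope * t[sl] + intercept)
              flags[sl] = np.abs(residual) > threshold
          return flags

      # Synthetic log: a slow linear drain plus an injected spurious jump
      # (all numbers are hypothetical).
      rng = np.random.default_rng(4)
      t = np.arange(500.0)
      level = 80.0 - 0.05 * t + rng.normal(0.0, 0.1, t.size)
      level[300:302] += 5.0
      print(np.where(flag_anomalies(t, level))[0])   # indices of the injected jump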

  17. Structural basis for cytokinin production by LOG from Corynebacterium glutamicum.

    Science.gov (United States)

    Seo, Hogyun; Kim, Sangwoo; Sagong, Hye-Young; Son, Hyeoncheol Francis; Jin, Kyeong Sik; Kim, Il-Kwon; Kim, Kyung-Jin

    2016-08-10

    "Lonely guy" (LOG) has been identified as a cytokinin-producing enzyme in plants and plant-interacting fungi. The gene product of Cg2612 from the soil-dwelling bacterium Corynebacterium glutamicum was annotated as an LDC. However, the facts that C. glutamicum lacks an LDC and Cg2612 has high amino acid similarity with LOG proteins suggest that Cg2612 is possibly an LOG protein. To investigate the function of Cg2612, we determined its crystal structure at a resolution of 2.3 Å. Cg2612 functions as a dimer and shows an overall structure similar to other known LOGs, such as LOGs from Arabidopsis thaliana (AtLOG), Claviceps purpurea (CpLOG), and Mycobacterium marinum (MmLOG). Cg2612 also contains a "PGGXGTXXE" motif that contributes to the formation of an active site similar to other LOGs. Moreover, biochemical studies on Cg2612 revealed that the protein has phosphoribohydrolase activity but not LDC activity. Based on these structural and biochemical studies, we propose that Cg2612 is not an LDC family enzyme, but instead belongs to the LOG family. In addition, the prenyl-binding site of Cg2612 (CgLOG) comprised residues identical to those seen in AtLOG and CpLOG, albeit dissimilar to those in MmLOG. The work provides structural and functional implications for LOG-like proteins from other microorganisms.

  18. Development of New Density Functional Approximations

    Science.gov (United States)

    Su, Neil Qiang; Xu, Xin

    2017-05-01

    Kohn-Sham density functional theory has become the leading electronic structure method for atoms, molecules, and extended systems. It is in principle exact, but any practical application must rely on density functional approximations (DFAs) for the exchange-correlation energy. Here we emphasize four aspects of the subject: (a) philosophies and strategies for developing DFAs; (b) classification of DFAs; (c) major sources of error in existing DFAs; and (d) some recent developments and future directions.

  19. Galaxy rotation curves with log-normal density distribution

    CERN Document Server

    Marr, John H

    2015-01-01

    The log-normal distribution represents the probability of finding randomly distributed particles in a microcanonical ensemble with high entropy. To a first approximation, a modified form of this distribution with a truncated termination may represent an isolated galactic disk, and this disk density distribution model was therefore run to give the best fit to the observational rotation curves for 37 representative galaxies. The resultant curves closely matched the observational data for a wide range of velocity profiles and galaxy types with rising, flat or descending curves, in agreement with Verheijen's classification of 'R', 'F' and 'D' type curves, and the corresponding theoretical total disk masses could be fitted to a baryonic Tully-Fisher relation (bTFR). Nine of the galaxies were matched to galaxies with previously published masses, suggesting a mean excess dynamic disk mass of dex 0.61+/-0.26 over the baryonic masses. Although questionable with regard to other measurements of the shape of disk galaxy g...
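
    For reference, the underlying log-normal radial density that such a disk model presumably builds on (the paper's truncated, disk-specific parameterization may differ) is

      \[ \rho(r) \;=\; \frac{\rho_{0}}{r\,\sigma\sqrt{2\pi}} \, \exp\!\left( -\frac{(\ln r - \mu)^{2}}{2\sigma^{2}} \right), \]

    with the "truncated termination" imposed by cutting the profile off beyond some outer radius.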

  20. Log-supermodular functions, functional clones and counting CSPs

    CERN Document Server

    Bulatov, Andrei A; Goldberg, Leslie Ann; Jerrum, Mark

    2011-01-01

    Motivated by a desire to understand the computational complexity of counting constraint satisfaction problems (counting CSPs), particularly the complexity of approximation, we study functional clones of functions on the Boolean domain, which are analogous to the familiar relational clones constituting Post's lattice. One of these clones is the collection of log-supermodular (lsm) functions, which turns out to play a significant role in classifying counting CSPs. In our study, we assume that non-negative unary functions (weights) are available. Given this, we prove that there are no functional clones lying strictly between the clone of lsm functions and the total clone (containing all functions). Thus, any counting CSP that contains a single non-lsm function is computationally as hard as any problem in #P. Furthermore, any non-trivial functional clone (in a sense that will be made precise below) contains the binary function "implies". As a consequence, all non-trivial counting CSPs (with non-negative unary wei...
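
    For readers new to the terminology, a function $f:\{0,1\}^n \to \mathbb{R}_{\geq 0}$ is log-supermodular (lsm) when

      \[ f(x \vee y)\, f(x \wedge y) \;\geq\; f(x)\, f(y) \qquad \text{for all } x, y \in \{0,1\}^n , \]

    where $\vee$ and $\wedge$ act coordinate-wise. For instance, the binary implication function (value 1 on $(0,0)$, $(0,1)$, $(1,1)$ and 0 on $(1,0)$) is lsm: the only nontrivial check is $x=(1,0)$, $y=(0,1)$, where $f(1,1)\,f(0,0) = 1 \geq f(1,0)\,f(0,1) = 0$.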

  1. Selecting Aquifer Wells for Planned Gyroscopic Logging

    Energy Technology Data Exchange (ETDEWEB)

    Rohe, Michael James; Studley, Gregory Wayne

    2002-04-01

    Understanding the configuration of the eastern Snake River Plain aquifer's water table is made difficult, in part, by borehole deviation in aquifer wells. A borehole has deviation if it is not vertical or straight. Deviation impairs the analysis of water table elevation measurements because it results in measurements that are greater than the true distance from the top of the well to the water table. Conceptual models of the water table configuration are important to environmental management decision-making at the INEEL; these models are based on measurements of depth to the water table taken from aquifer wells at or near the INEEL. When accurate data on the amount of deviation in any given borehole are acquired, measurements of depth-to-water can be adjusted to reflect the true depth so that more accurate conceptual models can be developed. Collection of additional borehole deviation data with gyroscopic logging is planned for selected wells to further our confidence in the quality of water level measurements. Selection of wells for the planned logging is based on qualitative and quantitative screening criteria. An existing data set from magnetic deviation logs was useful in establishing these criteria; however, magnetic deviation logs are considered less accurate than gyroscopic deviation logs under certain conditions. Population distributions for 128 aquifer wells with magnetic deviation data were used to establish three quantitative screening thresholds. Qualitative criteria consisted of administrative controls, accessibility issues, and drilling methods. Qualitative criteria eliminated all but 116 of the 337 aquifer wells, in the vicinity of the INEEL, that were initially examined in this screening effort. Of these, 72 have associated magnetic deviation data; 44 do not. Twenty-five (25) of the 72 wells with magnetic deviation data have deviation greater than one of the three quantitative screening thresholds. These 25 are recommended for the planned gyroscopic borehole deviation

  2. Diophantine approximation and automorphic spectrum

    CERN Document Server

    Ghosh, Anish; Nevo, Amos

    2010-01-01

    The present paper establishes quantitative estimates on the rate of diophantine approximation in homogeneous varieties of semisimple algebraic groups. The estimates established generalize and improve previous ones, and are sharp in a number of cases. We show that the rate of diophantine approximation is controlled by the spectrum of the automorphic representation, and is thus subject to the generalised Ramanujan conjectures.

  3. Some results in Diophantine approximation

    DEFF Research Database (Denmark)

    the basic concepts on which the papers build. Among other things, it introduces metric Diophantine approximation, Mahler's approach to algebraic approximation, the Hausdorff measure, and properties of the formal Laurent series over Fq. The introduction ends with a discussion of Mahler's problem when considered...

  4. Beyond the random phase approximation

    DEFF Research Database (Denmark)

    Olsen, Thomas; Thygesen, Kristian S.

    2013-01-01

    We assess the performance of a recently proposed renormalized adiabatic local density approximation (rALDA) for ab initio calculations of electronic correlation energies in solids and molecules. The method is an extension of the random phase approximation (RPA) derived from time-dependent density...

  5. Uniform approximation by (quantum) polynomials

    NARCIS (Netherlands)

    Drucker, A.; de Wolf, R.

    2011-01-01

    We show that quantum algorithms can be used to re-prove a classical theorem in approximation theory, Jackson's Theorem, which gives a nearly-optimal quantitative version of Weierstrass's Theorem on uniform approximation of continuous functions by polynomials. We provide two proofs, based respectivel

  6. Global approximation of convex functions

    CERN Document Server

    Azagra, D

    2011-01-01

    We show that for every (not necessarily bounded) open convex subset $U$ of $\R^n$, every (not necessarily Lipschitz or strongly) convex function $f:U\to\R$ can be approximated by real analytic convex functions, uniformly on all of $U$. In doing so we provide a technique which transfers results on uniform approximation on bounded sets to results on uniform approximation on unbounded sets, in such a way that not only convexity and $C^k$ smoothness, but also local Lipschitz constants, minimizers, order, and strict or strong convexity, are preserved. This transfer method is quite general and it can also be used to obtain new results on approximation of convex functions defined on Riemannian manifolds or Banach spaces. We also provide a characterization of the class of convex functions which can be uniformly approximated on $\R^n$ by strongly convex functions.

  7. Approximate circuits for increased reliability

    Energy Technology Data Exchange (ETDEWEB)

    Hamlet, Jason R.; Mayo, Jackson R.

    2015-08-18

    Embodiments of the invention describe a Boolean circuit having a voter circuit and a plurality of approximate circuits each based, at least in part, on a reference circuit. The approximate circuits are each to generate one or more output signals based on values of received input signals. The voter circuit is to receive the one or more output signals generated by each of the approximate circuits, and is to output one or more signals corresponding to a majority value of the received signals. At least some of the approximate circuits are to generate an output value different than the reference circuit for one or more input signal values; however, for each possible input signal value, the majority values of the one or more output signals generated by the approximate circuits and received by the voter circuit correspond to output signal result values of the reference circuit.

  8. Approximate circuits for increased reliability

    Energy Technology Data Exchange (ETDEWEB)

    Hamlet, Jason R.; Mayo, Jackson R.

    2015-12-22

    Embodiments of the invention describe a Boolean circuit having a voter circuit and a plurality of approximate circuits each based, at least in part, on a reference circuit. The approximate circuits are each to generate one or more output signals based on values of received input signals. The voter circuit is to receive the one or more output signals generated by each of the approximate circuits, and is to output one or more signals corresponding to a majority value of the received signals. At least some of the approximate circuits are to generate an output value different than the reference circuit for one or more input signal values; however, for each possible input signal value, the majority values of the one or more output signals generated by the approximate circuits and received by the voter circuit correspond to output signal result values of the reference circuit.
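
    As a toy illustration of the voting idea (not the patented circuit itself), the following sketch builds three approximate implementations of a small reference Boolean function and checks that the majority vote reproduces the reference on every input:

      from itertools import product

      def reference(a, b, c):                       # reference circuit: a AND (b OR c)
          return a and (b or c)

      approx1 = lambda a, b, c: a and b             # wrong when b=0, c=1
      approx2 = lambda a, b, c: a and c             # wrong when b=1, c=0
      approx3 = lambda a, b, c: a and (b or c)      # exact copy

      def voter(bits):
          return sum(bits) >= 2                     # majority of three

      for a, b, c in product([False, True], repeat=3):
          outs = [f(a, b, c) for f in (approx1, approx2, approx3)]
          assert voter(outs) == reference(a, b, c)
      print("majority output matches the reference on all inputs")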

  9. An Approach to Log Management: Prototyping a Design of Agent for Log Harvesting

    CERN Document Server

    Reinaldo, Mayol Arnao; Antonio, Lobo

    2011-01-01

    This paper describes work in progress implementing a solution for harvesting and transporting information logs from network devices in an e-science environment. The system is composed of servers, agents, active devices, and a transport protocol. This document describes the state of development of the agents. Agents capture logs from devices, then normalize, reduce, and catalogue them using metadata. Once all these processes are done, they transmit the catalogued data to a warehouse server using the transport protocol. An agent also uses orchestration parameters to transmit modified logs to a data warehouse server. These parameters can be received from orchestration applications such as Taverna. The operation of the agents and the communication protocol solves some of the deficiencies of traditional log management protocols. Finally, we show some tests performed on the new prototype.

  10. Development of a clinically feasible logMAR alternative to the Snellen chart: performance of the “compact reduced logMAR” visual acuity chart in amblyopic children

    Science.gov (United States)

    Laidlaw, D A H; Abbott, A; Rosser, D A

    2003-01-01

    Background/aim: The “compact reduced logMAR” (cRLM) chart is being developed as a logMAR alternative to the Snellen chart. It is closer spaced and has fewer letters per line than conventional logMAR charts. Information regarding the performance of such a chart in amblyopes and children is therefore required. This study aimed to investigate the performance of the cRLM chart in amblyopic children. Methods: Timed test and retest measurements using two versions of each chart design were obtained on the amblyopic eye of 43 children. Using the methods of Bland and Altman the agreement, test-retest variability (95% confidence limits for agreement, TRV) and test time of the cRLM and the current clinical standard Snellen chart were compared to the gold standard ETDRS logMAR chart. Results: No systematic bias between chart designs was found. For line assignment scoring the respective TRVs were 0.20 logMAR, 0.20 logMAR, and 0.30 logMAR. Single letter scoring TRVs were cRLM (95% CL 0.17) logMAR, ETDRS (95% CL 0.14) logMAR, and Snellen (95% CL 0.29) logMAR. Median testing times were ETDRS 60 seconds, cRLM 40 seconds, Snellen 30 seconds. Conclusion: The sensitivity to change of the cRLM equalled or approached that of the gold standard ETDRS and was at least 50% better than that of Snellen. This enhanced sensitivity to change was at the cost of only a 10 second time penalty compared to Snellen. The cRLM chart was approximately half the width of the ETDRS chart. The cRLM chart may represent a clinically acceptable compromise between the desire to obtain logMAR acuities of reasonable and known sensitivity to change, chart size, and testing time. PMID:14507755

  11. The X-ray log N-log S relation. [background radiation in extragalactic media

    Science.gov (United States)

    Boldt, Elihu

    1989-01-01

    Results from various surveys are reviewed as regards X-ray source counts at high galactic latitudes and the luminosity functions determined for extragalactic sources. Constraints on the associated log N-log S relation provided by the extragalactic X-ray background are emphasized in terms of its spatial fluctuations and spectrum as well as absolute flux level. The large number of sources required for this background suggests that there is not a sharp boundary in the redshift distribution of visible matter.

  12. INSPIRE and SPIRES Log File Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Adams, Cole; /Wheaton Coll. /SLAC

    2012-08-31

    SPIRES, an aging high-energy physics publication database, is in the process of being replaced by INSPIRE. In order to ease the transition from SPIRES to INSPIRE it is important to understand user behavior and the drivers for adoption. The goal of this project was to address some questions regarding the presumed two-thirds of users still using SPIRES. These questions are answered through analysis of the log files from both websites. A series of scripts was developed to collect and interpret the data contained in the log files. Common search patterns are identified, usage comparisons are made between INSPIRE and SPIRES, and a method for detecting user frustration is presented. The analysis reveals a more even split than originally thought, as well as the expected trend of user transition to INSPIRE.
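
    As an illustration of the kind of heuristic such scripts might use (this is not the project's actual code; the log format, thresholds, and sample lines are invented), the sketch below parses web-server log lines and flags a client that repeats essentially the same search several times within a short window:

      import re
      from collections import defaultdict
      from datetime import datetime, timedelta

      LOG_RE = re.compile(r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] "GET (?P<url>\S+)')

      def frustrated_clients(lines, repeats=3, window=timedelta(minutes=2)):
          history = defaultdict(list)
          flagged = set()
          for line in lines:
              m = LOG_RE.match(line)
              if not m or "find" not in m.group("url"):   # only look at search requests
                  continue
              ts = datetime.strptime(m.group("ts").split()[0], "%d/%b/%Y:%H:%M:%S")
              history[m.group("ip")].append((ts, m.group("url")))
              recent = [u for t, u in history[m.group("ip")] if ts - t <= window]
              if recent.count(m.group("url")) >= repeats:
                  flagged.add(m.group("ip"))
          return flagged

      sample = ['1.2.3.4 - - [01/Aug/2012:10:00:%02d +0000] "GET /spires/find?p=title HTTP/1.1" 200 512' % s
                for s in (1, 20, 40)]
      print(frustrated_clients(sample))   # {'1.2.3.4'}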

  13. Forecasting Monthly Prices of Japanese Logs

    Directory of Open Access Journals (Sweden)

    Tetsuya Michinaka

    2016-04-01

    Full Text Available Forecasts of prices can help industries in their risk management. This is especially true for Japanese logs, which experience sharp fluctuations in price. In this research, the authors used an exponential smoothing method (ETS) and autoregressive integrated moving average (ARIMA) models to forecast the monthly prices of domestic logs of three of the most important species in Japan: sugi (Japanese cedar, Cryptomeria japonica D. Don), hinoki (Japanese cypress, Chamaecyparis obtusa (Sieb. et Zucc.) Endl.), and karamatsu (Japanese larch, Larix kaempferi (Lamb.) Carr.). For the 12-month forecasting periods, forecasting intervals of 80% and 95% were given. By measuring the accuracy of forecasts over 12- and 6-month forecasting periods, it was found that ARIMA gave better results than the ETS in the majority of cases. However, the combined method of averaging ETS and ARIMA forecasts gave the best results for hinoki in several cases.
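
    A minimal sketch of the ETS / ARIMA / averaged-forecast comparison is shown below using statsmodels on a synthetic monthly series; the model orders, seasonal settings, and data are placeholders, not the paper's choices:

      import numpy as np
      import pandas as pd
      from statsmodels.tsa.arima.model import ARIMA
      from statsmodels.tsa.holtwinters import ExponentialSmoothing

      # Synthetic monthly log-price series (hypothetical numbers).
      rng = np.random.default_rng(5)
      idx = pd.date_range("2000-01", periods=180, freq="MS")
      price = pd.Series(12000 + 50 * np.sin(np.arange(180) * 2 * np.pi / 12)
                        + rng.normal(0, 300, 180).cumsum() * 0.1, index=idx)

      ets = ExponentialSmoothing(price, trend="add", seasonal="add",
                                 seasonal_periods=12).fit()
      arima = ARIMA(price, order=(1, 1, 1),
                    seasonal_order=(1, 0, 1, 12)).fit()

      h = 12
      combined = (ets.forecast(h) + arima.forecast(h)) / 2.0   # simple average
      print(combined.round(1))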

  14. CS model coil experimental log book

    Energy Technology Data Exchange (ETDEWEB)

    Nishijima, Gen; Sugimoto, Makoto; Nunoya, Yoshihiko; Wakabayashi, Hiroshi; Tsuji, Hiroshi [Japan Atomic Energy Research Inst., Naka, Ibaraki (Japan). Naka Fusion Research Establishment

    2001-02-01

    Charging test of the ITER CS Model Coil which is the world's largest superconducting pulse coil and the CS Insert Coil had started at April 11, 2000 and had completed at August 18, 2000. In the campaign, total shot numbers were 356 and the size of the data file in the DAS (Data Acquisition System) was over 20 GB. This report is a database that consists of the log list and the log sheets of every shot. One can access the database, make a search, and browse results via Internet (http://1ogwww.naka.jaeri.go.jp). The database will be useful to quick search to choose necessary shots. (author)

  15. SNG-log in borehole Ermelund-208

    DEFF Research Database (Denmark)

    Korsbech, Uffe C C

    1996-01-01

    A Spectral Natural Gamma-ray log has been run in a borehole in Ermelunden. The vertical distribution of Th, U, and K is similar to that observed in neighbouring boreholes. A new measuring and data processing technique was used and the probe's own background signal was determined. Surprisingly, a si... -rays are emitted and detected by the probe. The intensity of cosmic radiation varies with depth, and therefore slightly influences the accuracy of the thorium concentration determination.

  16. Lead Test

    Science.gov (United States)

    ... months, and at 3, 4, 5, and 6 years of age. A blood lead level test should be done only if the risk ... recommended if the person is symptomatic at any level below 70 mcg/dL. Because lead will pass through the blood to an unborn child, pregnant ...

  17. Effects of Selection Logging on Rainforest Productivity

    OpenAIRE

    Vanclay, Jerome K.

    2006-01-01

    An analysis of data from 212 permanent sample plots provided no evidence of any decline in rainforest productivity after three cycles of selection logging in the tropical rainforests of north Queensland. Relative productivity was determined as the difference between observed diameter increments and increments predicted from a diameter increment function which incorporated tree size, stand density and site quality. Analyses of variance and regression analyses revealed no significant decline in...

  18. Using Web Logs in the Science Classroom

    Science.gov (United States)

    Duplichan, Staycle C.

    2009-01-01

    As educators we must ask ourselves if we are meeting the needs of today's students. The science world is adapting to our ever-changing society; are the methodology and philosophy of our educational system keeping up? In this article, you'll learn why web logs (also called blogs) are an important Web 2.0 tool in your science classroom and how they…

  19. Log-scaling magnitude modulated watermarking scheme

    Institute of Scientific and Technical Information of China (English)

    LING HeFei; YUAN WuGang; ZOU FuHao; LU ZhengDing

    2007-01-01

    A real-time watermarking scheme with high robustness and security has been proposed based on modulating the log-scaling magnitudes of DCT coefficients,which is most suitable for JPEG images and MPEG streams. The watermark bit is encoded as the sign of the difference between the individual log-scaling magnitude of a group-region and the average one of all group-regions. The log-scaling magnitude can be modulated by modifying the low and middle frequency DCT coefficients imperceptibly. The robustness of scheme is not only dependent on those largest coefficients, but also on the other coefficients with the same proportion. It can embed 512 bits into an image with a size of 512×512, which can satisfy the payload requirement of most video watermarking applications. Moreover, the watermark embedding process only requires one-sixth of the time consumed during normal playing of video, and the watermark detection only requires one-twelfth of that, which can meet the real-time requirements of most video watermarking applications. Furthermore, the experimental results show that the presented scheme is transparent and robust to significant valumetric distortions (including additive noise, low-pass filtering, lossy compression and valumetric scaling) and a part of geometric distortions. It performs much better than the EMW algorithm in resisting all kinds of distortions except Gaussian noise with a larger deviation.
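
    The sign-of-difference idea can be sketched in a heavily simplified form (this is not the paper's embedder; the block size, frequency band, and margin are arbitrary choices): a bit is carried by whether a block's mean log-magnitude of mid-frequency DCT coefficients sits above or below the image-wide average, and embedding rescales those coefficients accordingly:

      import numpy as np
      from scipy.fft import dctn, idctn

      def block_logmag(block):
          coeffs = dctn(block, norm="ortho")
          band = np.abs(coeffs[1:4, 1:4]) + 1e-6           # low/middle frequencies
          return np.log(band).mean()

      def embed_bit(block, bit, global_avg, margin=0.3):
          coeffs = dctn(block, norm="ortho")
          target = global_avg + margin if bit else global_avg - margin
          scale = np.exp(target - block_logmag(block))     # multiplicative adjustment
          coeffs[1:4, 1:4] *= scale
          return idctn(coeffs, norm="ortho")

      rng = np.random.default_rng(6)
      img_blocks = [rng.normal(128, 20, (8, 8)) for _ in range(64)]
      avg = np.mean([block_logmag(b) for b in img_blocks])
      marked = embed_bit(img_blocks[0], 1, avg)
      print(block_logmag(marked) > avg)    # True -> bit 1 detected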

  20. Gerencia logística y global

    Directory of Open Access Journals (Sweden)

    Pablo Cesar Ocampo Vélez

    2009-04-01

    Full Text Available Logistics is a philosophy concerned with applying good practices to the macro-processes, processes, procedures, tasks, and products within an organization in order to satisfy the needs of the customer, who is the company's reason for being. International bodies ensure that companies around the world use and disseminate the various logistics disciplines across the entire value network and share a common strategic approach. The aim of this article is to explain the existence of world-class processes and macro-processes in logistics, which offer Colombian entrepreneurs a great opportunity for improvement by restructuring the strategic, tactical, and operational processes of each business unit, so that their organizations become more competitive, able to face free trade agreements, and deliver greater sustainability and an optimal level of service.

  1. Analysis of Web Logs And Web User In Web Mining

    Directory of Open Access Journals (Sweden)

    L.K. Joshila Grace

    2011-01-01

    Full Text Available Log files contain information about User Name, IP Address, Time Stamp, Access Request, number of Bytes Transferred, Result Status, Referring URL, and User Agent. The log files are maintained by the web servers. Analysing these log files gives a clear picture of the user. This paper gives a detailed discussion of these log files, their formats, their creation, access procedures, their uses, the various algorithms used, and the additional parameters that can be included in the log files, which in turn enables effective mining. It also presents the idea of creating an extended log file and learning user behaviour.

  2. Analysis of Web Logs and Web User in Web Mining

    CERN Document Server

    Grace, L K Joshila; Nagamalai, Dhinaharan

    2011-01-01

    Log files contain information about User Name, IP Address, Time Stamp, Access Request, number of Bytes Transferred, Result Status, Referring URL, and User Agent. The log files are maintained by the web servers. Analysing these log files gives a clear picture of the user. This paper gives a detailed discussion of these log files, their formats, their creation, access procedures, their uses, the various algorithms used, and the additional parameters that can be included in the log files, which in turn enables effective mining. It also presents the idea of creating an extended log file and learning user behaviour.
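
    As an illustration of the fields listed above, the sketch below parses one line in the common "combined" web-server log format into a field-name-to-value mapping; the sample line and field names are invented for the example:

      import re

      COMBINED = re.compile(
          r'(?P<ip>\S+) \S+ (?P<user>\S+) \[(?P<timestamp>[^\]]+)\] '
          r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<bytes>\S+) '
          r'"(?P<referrer>[^"]*)" "(?P<user_agent>[^"]*)"')

      line = ('203.0.113.7 - alice [10/Oct/2011:13:55:36 -0700] '
              '"GET /index.html HTTP/1.1" 200 2326 '
              '"http://example.com/start" "Mozilla/5.0"')

      entry = COMBINED.match(line).groupdict()
      entry["bytes"] = 0 if entry["bytes"] == "-" else int(entry["bytes"])
      print(entry["ip"], entry["request"], entry["bytes"])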

  3. Unification of acoustic drillhole logging data

    Energy Technology Data Exchange (ETDEWEB)

    Oehman, I.; Palmen, J.; Heikkinen, E. (Poeyry Environment Oy, Vantaa (Finland))

    2009-04-15

    Posiva Oy prepares for disposal of spent nuclear fuel in bedrock in Olkiluoto, Eurajoki. This is in accordance with the application filed in 1999, the Decision-in-Principle of the State Council in 2000, and ratification by the Parliament in 2001. The site characterization at Olkiluoto has included comprehensive geological, hydrological, geochemical and geophysical investigations airborne, on ground and in drillholes since 1988. One of the key techniques in geophysical drillhole surveys has been acoustic full waveform logging, which has been implemented since 1994. Various tools have been used in acquisition of acoustic data and several processing techniques have been applied. The logging work and processing to P and S wave velocities has previously been carried out on a single-drillhole basis. Comparisons to actual values and levels have not been made, and the results have not been calibrated. Therefore results for different drillholes have not been comparable. Resolution of the P and S wave velocity has been rather coarse, and depth correlation to the core data has been at a tentative level. As the investigation data have been accumulating, it has become possible to correlate the results to geological and laboratory control data and to calibrate the results of separate measurement campaigns and different drillholes onto the same reference level and resolution. The presented technique has been applied from drillhole OL-KR29 onwards and has set the processing standard, settings and reference levels for later surveys. This approach will further assist the application of the method for mapping and numerical description of lithology variation and the possible effect of alteration and deformation on it. Further on, the P and S wave velocity data together with density can be used in computing dynamic in situ rock mechanical parameters, and possibly in correlating rock strength laboratory data to P and S wave velocity logging data. The acoustic logging data from drillholes OL-KR1

  4. Approximation Algorithms for Dominating Set in Disk Graphs

    CERN Document Server

    Gibson, Matt

    2010-01-01

    We consider the problem of finding a lowest cost dominating set in a given disk graph containing $n$ disks. The problem has been extensively studied on subclasses of disk graphs, yet the best known approximation for disk graphs has remained $O(\log n)$ -- a bound that is asymptotically no better than the general case. We improve the status quo in two ways: for the unweighted case, we show how to obtain a PTAS using the framework recently proposed (independently) by Mustafa and Ray [SoCG 09] and by Chan and Har-Peled [SoCG 09]; for the weighted case where each input disk has an associated rational weight with the objective of finding a minimum cost dominating set, we give a randomized algorithm that obtains a dominating set whose weight is within a factor $2^{O(\log^* n)}$ of a minimum cost solution, with high probability -- the technique follows the framework proposed recently by Varadarajan [STOC 10].

  5. Rytov approximation in electron scattering

    Science.gov (United States)

    Krehl, Jonas; Lubk, Axel

    2017-06-01

    In this work we introduce the Rytov approximation in the scope of high-energy electron scattering with the motivation of developing better linear models for electron scattering. Such linear models play an important role in tomography and similar reconstruction techniques. Conventional linear models, such as the phase grating approximation, have reached their limits in current and foreseeable applications, most importantly in achieving three-dimensional atomic resolution using electron holographic tomography. The Rytov approximation incorporates propagation effects, which are the most pressing limitation of conventional models. While predominantly used in the weak-scattering regime of light microscopy, we show that the Rytov approximation can give reasonable results in the inherently strong-scattering regime of transmission electron microscopy.

  6. Rollout sampling approximate policy iteration

    NARCIS (Netherlands)

    Dimitrakakis, C.; Lagoudakis, M.G.

    2008-01-01

    Several researchers have recently investigated the connection between reinforcement learning and classification. We are motivated by proposals of approximate policy iteration schemes without value functions, which focus on policy representation using classifiers and address policy learning as a

  7. Approximate common divisors via lattices

    CERN Document Server

    Cohn, Henry

    2011-01-01

    We analyze the multivariate generalization of Howgrave-Graham's algorithm for the approximate common divisor problem. In the m-variable case with modulus N and approximate common divisor of size N^beta, this improves the size of the error tolerated from N^(beta^2) to N^(beta^((m+1)/m)), under a commonly used heuristic assumption. This gives a more detailed analysis of the hardness assumption underlying the recent fully homomorphic cryptosystem of van Dijk, Gentry, Halevi, and Vaikuntanathan. While these results do not challenge the suggested parameters, a 2^sqrt(n) approximation algorithm for lattice basis reduction in n dimensions could be used to break these parameters. We have implemented our algorithm, and it performs better in practice than the theoretical analysis suggests. Our results fit into a broader context of analogies between cryptanalysis and coding theory. The multivariate approximate common divisor problem is the number-theoretic analogue of noisy multivariate polynomial interpolation, and we ...

  8. Approximate Implicitization Using Linear Algebra

    Directory of Open Access Journals (Sweden)

    Oliver J. D. Barrowclough

    2012-01-01

    Full Text Available We consider a family of algorithms for approximate implicitization of rational parametric curves and surfaces. The main approximation tool in all of the approaches is the singular value decomposition, and they are therefore well suited to floating-point implementation in computer-aided geometric design (CAGD systems. We unify the approaches under the names of commonly known polynomial basis functions and consider various theoretical and practical aspects of the algorithms. We offer new methods for a least squares approach to approximate implicitization using orthogonal polynomials, which tend to be faster and more numerically stable than some existing algorithms. We propose several simple propositions relating the properties of the polynomial bases to their implicit approximation properties.
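
    A minimal SVD-based variant of this idea (a generic least-squares implicitization, not any specific algorithm from the paper) recovers the implicit equation of a circle from samples of its parametric form:

      import numpy as np

      # Sample a parametric curve, build the matrix of degree-2 monomials, and take
      # the right singular vector of the smallest singular value as the implicit
      # polynomial coefficients.
      t = np.linspace(0.0, 2.0 * np.pi, 200)
      x, y = np.cos(t), np.sin(t)                    # parametric unit circle

      D = np.column_stack([np.ones_like(x), x, y, x * x, x * y, y * y])
      _, _, Vt = np.linalg.svd(D, full_matrices=False)
      coeffs = Vt[-1]                                # basis: 1, x, y, x^2, xy, y^2

      coeffs /= coeffs[3]                            # normalise so the x^2 term is 1
      print(np.round(coeffs, 6))   # approximately [-1, 0, 0, 1, 0, 1] -> x^2 + y^2 - 1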

  9. Binary nucleation beyond capillarity approximation

    NARCIS (Netherlands)

    Kalikmanov, V.I.

    2010-01-01

    Large discrepancies between binary classical nucleation theory (BCNT) and experiments result from adsorption effects and inability of BCNT, based on the phenomenological capillarity approximation, to treat small clusters. We propose a model aimed at eliminating both of these deficiencies. Adsorption

  10. Nonlinear approximation with redundant dictionaries

    DEFF Research Database (Denmark)

    Borup, Lasse; Nielsen, M.; Gribonval, R.

    2005-01-01

    In this paper we study nonlinear approximation and data representation with redundant function dictionaries. In particular, approximation with redundant wavelet bi-frame systems is studied in detail. Several results for orthonormal wavelets are generalized to the redundant case. In general......, for a wavelet bi-frame system the approximation properties are limited by the number of vanishing moments of the system. In some cases this can be overcome by oversampling, but at a price of replacing the canonical expansion by another linear expansion. Moreover, for special non-oversampled wavelet bi-frames we...... can obtain good approximation properties not restricted by the number of vanishing moments, but again without using the canonical expansion....

  11. Mathematical algorithms for approximate reasoning

    Science.gov (United States)

    Murphy, John H.; Chay, Seung C.; Downs, Mary M.

    1988-01-01

    Most state-of-the-art expert system environments contain a single and often ad hoc strategy for approximate reasoning. Some environments provide facilities to program the approximate reasoning algorithms. However, the next generation of expert systems should have an environment which contains a choice of several mathematical algorithms for approximate reasoning. To meet the need for validatable and verifiable coding, the expert system environment must no longer depend upon ad hoc reasoning techniques but instead must include mathematically rigorous techniques for approximate reasoning. Popular approximate reasoning techniques are reviewed, including: certainty factors, belief measures, Bayesian probabilities, fuzzy logic, and Shafer-Dempster techniques for reasoning. The focus is on a group of mathematically rigorous algorithms for approximate reasoning that could form the basis of a next-generation expert system environment. These algorithms are based upon the axioms of set theory and probability theory. To distinguish these algorithms for approximate reasoning, various conditions of mutual exclusivity and independence are imposed upon the assertions. Approximate reasoning algorithms presented include: reasoning with statistically independent assertions, reasoning with mutually exclusive assertions, reasoning with assertions that exhibit minimum overlap within the state space, reasoning with assertions that exhibit maximum overlap within the state space (i.e. fuzzy logic), pessimistic reasoning (i.e. worst case analysis), optimistic reasoning (i.e. best case analysis), and reasoning with assertions with absolutely no knowledge of the possible dependency among the assertions. A robust environment for expert system construction should include the two modes of inference: modus ponens and modus tollens. Modus ponens inference is based upon reasoning towards the conclusion in a statement of logical implication, whereas modus tollens inference is based upon reasoning away
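
    As a toy illustration of how the imposed conditions change the combination rule (my own example, not from the record): for two assertions with probabilities 0.6 and 0.5, under statistical independence the chance that at least one holds is 1 - (1 - 0.6)(1 - 0.5) = 0.8, under mutual exclusivity it is 0.6 + 0.5 capped at 1, and a fuzzy (maximum-overlap) treatment takes max(0.6, 0.5) = 0.6.

        def or_independent(p, q):
            # At least one of two statistically independent assertions holds.
            return 1.0 - (1.0 - p) * (1.0 - q)

        def or_mutually_exclusive(p, q):
            # The assertions cannot hold simultaneously; probabilities add (capped at 1).
            return min(1.0, p + q)

        def or_fuzzy(p, q):
            # Maximum-overlap ("fuzzy logic") combination uses max for disjunction.
            return max(p, q)

        p, q = 0.6, 0.5
        print(or_independent(p, q), or_mutually_exclusive(p, q), or_fuzzy(p, q))  # 0.8 1.0 0.6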

  12. Lead Poisoning

    Science.gov (United States)

    Lead has recently been found in some plastic mini-blinds and vertical blinds which were made ...

  13. Console Log Keeping Made Easier - Tools and Techniques for Improving Quality of Flight Controller Activity Logs

    Science.gov (United States)

    Scott, David W.; Underwood, Debrah (Technical Monitor)

    2002-01-01

    At the Marshall Space Flight Center's (MSFC) Payload Operations Integration Center (POIC) for International Space Station (ISS), each flight controller maintains detailed logs of activities and communications at their console position. These logs are critical for accurately controlling flight in real-time as well as providing a historical record and troubleshooting tool. This paper describes logging methods and electronic formats used at the POIC and provides food for thought on their strengths and limitations, plus proposes some innovative extensions. It also describes an inexpensive PC-based scheme for capturing and/or transcribing audio clips from communications consoles. Flight control activity (e.g. interpreting computer displays, entering data/issuing electronic commands, and communicating with others) can become extremely intense. It's essential to document it well, but the effort to do so may conflict with actual activity. This can be more than just annoying, as what's in the logs (or just as importantly not in them) often feeds back directly into the quality of future operations, whether short-term or long-term. In earlier programs, such as Spacelab, log keeping was done on paper, often using position-specific shorthand, and any other reader was at the mercy of the writer's penmanship. Today, user-friendly software solves the legibility problem and can automate date/time entry, but some content may take longer to finish due to individual typing speed and less use of symbols. File layout can be used to great advantage in making types of information easy to find, and creating searchable master logs for a given position is very easy and a real lifesaver in reconstructing events or researching a given topic. We'll examine log formats from several console positions, and the types of information that are included and (just as importantly) excluded. We'll also look at when a summary or synopsis is effective, and when extensive detail is needed.

  14. Thresholds of logging intensity to maintain tropical forest biodiversity.

    Science.gov (United States)

    Burivalova, Zuzana; Sekercioğlu, Cağan Hakkı; Koh, Lian Pin

    2014-08-18

    Primary tropical forests are lost at an alarming rate, and much of the remaining forest is being degraded by selective logging. Yet, the impacts of logging on biodiversity remain poorly understood, in part due to the seemingly conflicting findings of case studies: about as many studies have reported increases in biodiversity after selective logging as have reported decreases. Consequently, meta-analytical studies that treat selective logging as a uniform land use tend to conclude that logging has negligible effects on biodiversity. However, selectively logged forests might not all be the same. Through a pantropical meta-analysis and using an information-theoretic approach, we compared and tested alternative hypotheses for key predictors of the richness of tropical forest fauna in logged forest. We found that the species richness of invertebrates, amphibians, and mammals decreases as logging intensity increases and that this effect varies with taxonomic group and continental location. In particular, mammals and amphibians would suffer a halving of species richness at logging intensities of 38 m³ ha⁻¹ and 63 m³ ha⁻¹, respectively. Birds exhibit an opposing trend as their total species richness increases with logging intensity. An analysis of forest bird species, however, suggests that this pattern is largely due to an influx of habitat generalists into heavily logged areas while forest specialist species decline. Our study provides a quantitative analysis of the nuanced responses of species along a gradient of logging intensity, which could help inform evidence-based sustainable logging practices from the perspective of biodiversity conservation.

  15. Twisted inhomogeneous Diophantine approximation and badly approximable sets

    CERN Document Server

    Harrap, Stephen

    2010-01-01

    For any real pair i, j >= 0 with i+j=1 let Bad(i, j) denote the set of (i, j)-badly approximable pairs. That is, Bad(i, j) consists of irrational vectors x:=(x_1, x_2) in R^2 for which there exists a positive constant c(x) such that max {||qx_1||^(1/i), ||qx_2||^(1/j)} > c(x)/q for all q in N. Building on a result of Kurzweil, a new characterization of the set Bad(i, j) in terms of `well-approximable' vectors in the area of `twisted' inhomogeneous Diophantine approximation is established. In addition, it is shown that Bad^x(i, j), the `twisted' inhomogeneous analogue of Bad(i, j), has full Hausdorff dimension 2 when x is chosen from the set Bad(i, j).

  16. The Research of Through-casing Resistivity Logging Logging Calibration System Leakage Current Measurement Method

    Directory of Open Access Journals (Sweden)

    ZHANG Jiatian

    2013-07-01

    Full Text Available This paper introduces the logging principle of through-casing resistivity logging technology and notes that the leakage current measurements are susceptible to interference. The through-casing resistivity logging technologies of Russia and of Schlumberger are studied, and a calibration system for through-casing resistivity logging is established to improve the accuracy of calibrating, testing and measuring the instrument. In this system the distributed parameters of the formation are replaced by lumped parameters: a precision resistor array simulates the formation leakage current and a calibration pool simulates formations of different resistivity, which extends the dynamic range of the simulated formation resistivity to 1-300 Ω·m and covers the measurement range required by through-casing resistivity logging technology, 1 Ω·m to 100 Ω·m. Since the signals acquired and processed by the calibration system are extremely weak and the computed signals must be resolved at the nV (nanovolt) level, a high-accuracy 24-bit data acquisition system is applied.

  17. Nut Production in Bertholletia excelsa across a Logged Forest Mosaic: Implications for Multiple Forest Use.

    Science.gov (United States)

    Rockwell, Cara A; Guariguata, Manuel R; Menton, Mary; Arroyo Quispe, Eriks; Quaedvlieg, Julia; Warren-Thomas, Eleanor; Fernandez Silva, Harol; Jurado Rojas, Edwin Eduardo; Kohagura Arrunátegui, José Andrés Hideki; Meza Vega, Luis Alberto; Revilla Vera, Olivia; Quenta Hancco, Roger; Valera Tito, Jonatan Frank; Villarroel Panduro, Betxy Tabita; Yucra Salas, Juan José

    2015-01-01

    Although many examples of multiple-use forest management may be found in tropical smallholder systems, few studies provide empirical support for the integration of selective timber harvesting with non-timber forest product (NTFP) extraction. Brazil nut (Bertholletia excelsa, Lecythidaceae) is one of the world's most economically-important NTFP species extracted almost entirely from natural forests across the Amazon Basin. An obligate out-crosser, Brazil nut flowers are pollinated by large-bodied bees, a process resulting in a hard round fruit that takes up to 14 months to mature. As many smallholders turn to the financial security provided by timber, Brazil nut fruits are increasingly being harvested in logged forests. We tested the influence of tree and stand-level covariates (distance to nearest cut stump and local logging intensity) on total nut production at the individual tree level in five recently logged Brazil nut concessions covering about 4000 ha of forest in Madre de Dios, Peru. Our field team accompanied Brazil nut harvesters during the traditional harvest period (January-April 2012 and January-April 2013) in order to collect data on fruit production. Three hundred and ninety-nine (approximately 80%) of the 499 trees included in this study were at least 100 m from the nearest cut stump, suggesting that concessionaires avoid logging near adult Brazil nut trees. Yet even for those trees on the edge of logging gaps, distance to nearest cut stump and local logging intensity did not have a statistically significant influence on Brazil nut production at the applied logging intensities (typically 1-2 timber trees removed per ha). In one concession where at least 4 trees ha-1 were removed, however, the logging intensity covariate resulted in a marginally significant (0.09) P value, highlighting a potential risk for a drop in nut production at higher intensities. While we do not suggest that logging activities should be completely avoided in Brazil nut rich

  18. Nut Production in Bertholletia excelsa across a Logged Forest Mosaic: Implications for Multiple Forest Use.

    Directory of Open Access Journals (Sweden)

    Cara A Rockwell

    Full Text Available Although many examples of multiple-use forest management may be found in tropical smallholder systems, few studies provide empirical support for the integration of selective timber harvesting with non-timber forest product (NTFP) extraction. Brazil nut (Bertholletia excelsa, Lecythidaceae) is one of the world's most economically-important NTFP species extracted almost entirely from natural forests across the Amazon Basin. An obligate out-crosser, Brazil nut flowers are pollinated by large-bodied bees, a process resulting in a hard round fruit that takes up to 14 months to mature. As many smallholders turn to the financial security provided by timber, Brazil nut fruits are increasingly being harvested in logged forests. We tested the influence of tree and stand-level covariates (distance to nearest cut stump and local logging intensity) on total nut production at the individual tree level in five recently logged Brazil nut concessions covering about 4000 ha of forest in Madre de Dios, Peru. Our field team accompanied Brazil nut harvesters during the traditional harvest period (January-April 2012 and January-April 2013) in order to collect data on fruit production. Three hundred and ninety-nine (approximately 80%) of the 499 trees included in this study were at least 100 m from the nearest cut stump, suggesting that concessionaires avoid logging near adult Brazil nut trees. Yet even for those trees on the edge of logging gaps, distance to nearest cut stump and local logging intensity did not have a statistically significant influence on Brazil nut production at the applied logging intensities (typically 1-2 timber trees removed per ha). In one concession where at least 4 trees ha-1 were removed, however, the logging intensity covariate resulted in a marginally significant (0.09) P value, highlighting a potential risk for a drop in nut production at higher intensities. While we do not suggest that logging activities should be completely avoided in Brazil

  19. CNPC Sees Rapid Growth in Overseas Oil Logging Service Business

    Institute of Scientific and Technical Information of China (English)

    2004-01-01

    CNPC Logging Technology Service Co Ltd made a breakthrough in market development in the first two months of this year by signing a logging technology service contract with Repsol in Libya, creating a good start for 2004.

  20. Asymptotic Analysis of the Paradox in Log-Stretch Dip Moveout

    CERN Document Server

    Yang, Xin-She

    2010-01-01

    There exists a paradox in dip moveout (DMO) in seismic data processing. The paradox is twofold: why can Notfors and Godfrey's approximate time log-stretched DMO produce better impulse responses than the full log DMO, and why is Hale's f-k DMO correct although it was based on two inaccurate assumptions for the midpoint repositioning and the DMO time relationship? Based on the asymptotic analysis of the DMO algorithms, we find that any form of correctly formulated DMO must handle both space and time coordinates properly in order to deal with all dips accurately. The surprising improvement of Notfors and Godfrey's log DMO over Bale and Jakubowicz's full log DMO was due to the equivalent midpoint repositioning by transforming the time-related phase shift to the space-related phase shift. The explanation of why Hale's f-k DMO is correct although it was based on two inaccurate assumptions is that the two approximations exactly cancel each other in the f-k domain to give the correct final result.

  1. Distributions and Losses of Logging Residues at Clear-Felled Areas during Extraction for Bioenergy: Comparing Dried- and Fresh-Stacked Method

    Directory of Open Access Journals (Sweden)

    Bengt Nilsson

    2015-11-01

    Full Text Available It is well known that a large proportion of available logging residues intended for extraction will not reach the energy-conversion industry, because some are lost during transportation or left on the clear-felled area. However, there is little understanding of where logging residue losses occur in the supply chain. In this study, the distribution of logging residues was studied for two extraction methods (dried-stacked and fresh-stacked) in one clear-felled area. In addition, residue fractions were examined in a detailed comparison. Even though the fresh-stacked method left somewhat more logging residues at the clear-felled area, the differences between the methods are small. Approximately 30% of the total amount of logging residues was left behind between the harvester heaps, with an additional 10%–15% under these heaps and approximately 2%–3% beneath the windrows. The final product that was delivered to the energy-conversion industry was very similar, regardless of the extraction method used. The delivered chipped logging residues had moisture contents of 37% and 36% following fresh- and dried-stacked methods respectively, and in both cases the needle content in the processed logging residues was approximately 10%. However, the total amount of fine fractions (needles and fines) was slightly higher following dried-stacking.

  2. Log amplifier with pole-zero compensation

    Science.gov (United States)

    Brookshier, William

    1987-01-01

    A logarithmic amplifier circuit provides pole-zero compensation for improved stability and response time over 6-8 decades of input signal frequency. The amplifier circuit includes a first operational amplifier with a first feedback loop which includes a second, inverting operational amplifier in a second feedback loop. The compensated output signal is provided by the second operational amplifier with the log elements, i.e., resistors, and the compensating capacitors in each of the feedback loops having equal values so that each break point or pole is offset by a compensating break point or zero.
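
    The general idea of pole-zero compensation (not this specific circuit) can be illustrated numerically: a compensating zero placed at the same frequency as a troublesome pole cancels its roll-off and flattens the response. A small numpy sketch with assumed break frequencies:

        import numpy as np

        f = np.logspace(0, 7, 500)          # frequency sweep, 1 Hz to 10 MHz
        s = 2j * np.pi * f
        f_pole = 1.0e3                      # assumed troublesome pole at 1 kHz
        f_zero = 1.0e3                      # compensating zero placed at the same frequency

        uncompensated = 1.0 / (1.0 + s / (2.0 * np.pi * f_pole))
        compensated = (1.0 + s / (2.0 * np.pi * f_zero)) * uncompensated

        # The uncompensated response has fallen by orders of magnitude at 10 MHz,
        # while the compensated one stays flat (magnitude ~= 1).
        print(abs(uncompensated[-1]), abs(compensated[-1]))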

  3. Blood lead levels and risk factors for lead poisoning among children in Jakarta, Indonesia

    Energy Technology Data Exchange (ETDEWEB)

    Albalak, Rachel; Noonan, Gary; Buchanan, Sharunda; Flanders, W. Dana; Gotway-Crawford, Carol; Blumenthal, Wendy; Curtis, Gerald; McGeehin, Michael A. [Division of Environmental Hazards and Health Effects, National Center for Environmental Health, Centers for Disease Control and Prevention, 1600 Clifton Rd. Mailstop E-19, Atlanta, GA 30333 (United States); Kim, Dennis; Tan, Regina [Epidemic Intelligence Service, Epidemiology Program Office, Centers for Disease Control and Prevention, 1600 Clifton Rd. Mailstop D-18, Atlanta, GA 30333 (United States); Jones, Robert L. [Division of Laboratory Sciences, National Center for Environmental Health, Centers for Disease Control and Prevention, 1600 Clifton Rd. Mailstop F-18, Atlanta, GA 30333 (United States); Sulaiman, Rini [Swisscontact, Jl. Wijaya XII No. 44, Jakarta (Indonesia) 12160

    2003-01-01

    The phase-out of leaded gasoline began in Jakarta, Indonesia on July 1, 2001. We evaluated mean blood lead levels (BLLs) and the prevalence of elevated BLLs of Jakarta school children and assessed risk factors for lead exposure in these children before the beginning of the phase-out activities. The study involved a population-based, cross-sectional blood lead survey that included capillary blood lead sampling and a brief questionnaire on risk factors for lead poisoning. A cluster survey design was used. Forty clusters, defined as primary schools in Jakarta, and 15 2nd- and 3rd-grade children in each cluster were randomly selected for participation in the study. The average age of children in this study was 8.6 years (range 6-12) and the geometric mean BLL of the children was 8.6 µg/dl (median: 8.6 µg/dl; range: 2.6-24.1 µg/dl) (n=397). Thirty-five percent of children had BLLs ≥10 µg/dl and 2.4% had BLLs ≥20 µg/dl. Approximately one-fourth of children had BLLs 10-14.9 µg/dl. In multivariate models, level of education of the child's primary caregiver, water collection method, home varnishing and occupational recycling of metals, other than lead, by a family member were predictors of log BLLs after adjustment for age and sex. BLLs of children who lived near a highway or major intersection were significantly higher than those of children who lived near a street with little or no traffic when level of education was not included in the model. Water collection method was a significant predictor of BLLs ≥10 µg/dl after adjustment for age and sex. BLLs in children in this study were moderately high and consistent with BLLs of children in other countries where leaded gasoline is used. With the phase-out of leaded gasoline, BLLs of children in Jakarta are expected to rapidly decline as they have in other countries that have phased lead out of gasoline.

  4. Second derivatives for approximate spin projection methods.

    Science.gov (United States)

    Thompson, Lee M; Hratchian, Hrant P

    2015-02-07

    The use of broken-symmetry electronic structure methods is required in order to obtain correct behavior of electronically strained open-shell systems, such as transition states, biradicals, and transition metals. This approach often has issues with spin contamination, which can lead to significant errors in predicted energies, geometries, and properties. Approximate projection schemes are able to correct for spin contamination and can often yield improved results. To fully make use of these methods and to carry out exploration of the potential energy surface, it is desirable to develop an efficient second energy derivative theory. In this paper, we formulate the analytical second derivatives for the Yamaguchi approximate projection scheme, building on recent work that has yielded an efficient implementation of the analytical first derivatives.

  5. Ecotoxicology: Lead

    Science.gov (United States)

    Scheuhammer, A.M.; Beyer, W.N.; Schmitt, C.J.; Jorgensen, Sven Erik; Fath, Brian D.

    2008-01-01

    Lead (Pb) is a naturally occurring metallic element; trace concentrations are found in all environmental media and in all living things. However, certain human activities, especially base metal mining and smelting; combustion of leaded gasoline; the use of Pb in hunting, target shooting, and recreational angling; the use of Pb-based paints; and the uncontrolled disposal of Pb-containing products such as old vehicle batteries and electronic devices have resulted in increased environmental levels of Pb, and have created risks for Pb exposure and toxicity in invertebrates, fish, and wildlife in some ecosystems.

  6. Reinforcement Learning via AIXI Approximation

    CERN Document Server

    Veness, Joel; Hutter, Marcus; Silver, David

    2010-01-01

    This paper introduces a principled approach for the design of a scalable general reinforcement learning agent. This approach is based on a direct approximation of AIXI, a Bayesian optimality notion for general reinforcement learning agents. Previously, it has been unclear whether the theory of AIXI could motivate the design of practical algorithms. We answer this hitherto open question in the affirmative, by providing the first computationally feasible approximation to the AIXI agent. To develop our approximation, we introduce a Monte Carlo Tree Search algorithm along with an agent-specific extension of the Context Tree Weighting algorithm. Empirically, we present a set of encouraging results on a number of stochastic, unknown, and partially observable domains.
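
    The Monte Carlo Tree Search component relies on a bandit-style selection rule at each tree node. The generic UCB1 score below is an illustrative sketch (not necessarily the exact variant used in the paper) that trades off the empirical mean reward against an exploration bonus.

        import math

        def ucb1(total_reward, visits, parent_visits, c=math.sqrt(2)):
            """Score used to pick which child action to explore next in MCTS."""
            if visits == 0:
                return float("inf")           # always try unvisited actions first
            exploit = total_reward / visits   # empirical mean reward of this action
            explore = c * math.sqrt(math.log(parent_visits) / visits)
            return exploit + explore

        # The child with the highest score gets expanded/simulated next.
        scores = [ucb1(r, n, 100) for r, n in [(40.0, 50), (10.0, 10), (0.0, 0)]]
        print(scores)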

  7. Concept Approximation between Fuzzy Ontologies

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Fuzzy ontologies are efficient tools to handle fuzzy and uncertain knowledge on the semantic web; but there are heterogeneity problems when gaining interoperability among different fuzzy ontologies. This paper uses concept approximation between fuzzy ontologies based on instances to solve the heterogeneity problems. It firstly proposes an instance selection technology based on instance clustering and weighting to unify the fuzzy interpretation of different ontologies and reduce the number of instances to increase the efficiency. Then the paper resolves the problem of computing the approximations of concepts into the problem of computing the least upper approximations of atom concepts. It optimizes the search strategies by extending atom concept sets and defining the least upper bounds of concepts to reduce the searching space of the problem. An efficient algorithm for searching the least upper bounds of concept is given.

  8. Diophantine approximation and Dirichlet series

    CERN Document Server

    Queffélec, Hervé

    2013-01-01

    This self-contained book will benefit beginners as well as researchers. It is devoted to Diophantine approximation, the analytic theory of Dirichlet series, and some connections between these two domains, which often occur through the Kronecker approximation theorem. Accordingly, the book is divided into seven chapters, the first three of which present tools from commutative harmonic analysis, including a sharp form of the uncertainty principle, ergodic theory and Diophantine approximation to be used in the sequel. A presentation of continued fraction expansions, including the mixing property of the Gauss map, is given. Chapters four and five present the general theory of Dirichlet series, with classes of examples connected to continued fractions, the famous Bohr point of view, and then the use of random Dirichlet series to produce non-trivial extremal examples, including sharp forms of the Bohnenblust-Hille theorem. Chapter six deals with Hardy-Dirichlet spaces, which are new and useful Banach spaces of anal...

  9. Approximation methods in probability theory

    CERN Document Server

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  10. Approximate Sparse Regularized Hyperspectral Unmixing

    Directory of Open Access Journals (Sweden)

    Chengzhi Deng

    2014-01-01

    Full Text Available Sparse regression based unmixing has been recently proposed to estimate the abundance of materials present in each hyperspectral image pixel. In this paper, a novel sparse unmixing optimization model based on approximate sparsity, namely approximate sparse unmixing (ASU), is first proposed to perform the unmixing task for hyperspectral remote sensing imagery. Then, a variable splitting and augmented Lagrangian algorithm is introduced to tackle the optimization problem. In ASU, approximate sparsity is used as a regularizer for sparse unmixing, which is sparser than the l1 regularizer and much easier to solve than the l0 regularizer. Three simulated hyperspectral images and one real hyperspectral image were used to evaluate the performance of the proposed algorithm in comparison to the l1 regularizer. Experimental results demonstrate that the proposed algorithm is more effective and accurate for hyperspectral unmixing than the state-of-the-art l1 regularizer.

  11. HMR Log Analyzer: Analyze Web Application Logs Over Hadoop MapReduce

    Directory of Open Access Journals (Sweden)

    Sayalee Narkhede

    2013-07-01

    Full Text Available In today’s Internet world, log file analysis is becoming a necessary task for analyzing the customer’s behavior in order to improve advertising and sales; for datasets in domains like the environment, medicine, and banking it is likewise important to analyze the log data to extract the required knowledge from it. Web mining is the process of discovering knowledge from web data. Log files are generated very fast, at a rate of 1-10 Mb/s per machine, and a single data center can generate tens of terabytes of log data in a day. These datasets are huge. In order to analyze such large datasets we need a parallel processing system and a reliable data storage mechanism. A virtual database system is an effective solution for integrating the data, but it becomes inefficient for large datasets. The Hadoop framework provides reliable data storage through the Hadoop Distributed File System and the MapReduce programming model, which is a parallel processing system for large datasets. The Hadoop Distributed File System breaks up input data and sends fractions of the original data to several machines in the Hadoop cluster to hold blocks of data. This mechanism helps to process log data in parallel using all the machines in the Hadoop cluster and to compute the result efficiently. The dominant approach provided by Hadoop, “store first, query later”, loads the data into the Hadoop Distributed File System and then executes queries written in Pig Latin. This approach reduces the response time as well as the load on the end system. This paper proposes a log analysis system using Hadoop MapReduce which will provide accurate results in minimum response time.
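
    A minimal sketch of the map/reduce logic behind such a log analyzer (Python in the style of Hadoop Streaming, illustrative only; the system described above targets the Hadoop/Pig APIs, and the access-log field layout assumed here is my own): the mapper emits one key per log line and the reducer sums the counts per key.

        import sys
        from itertools import groupby

        def mapper(lines):
            # Emit (status_code, 1) for each access-log line; the field position is an
            # assumption about a combined-log-format-like layout, not the paper's format.
            for line in lines:
                parts = line.split()
                if len(parts) > 8:
                    yield parts[8], 1          # HTTP status code field

        def reducer(pairs):
            # Hadoop delivers mapper output grouped and sorted by key between the two
            # phases; here we sort/group locally and sum the counts for each key.
            for key, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
                yield key, sum(count for _, count in group)

        if __name__ == "__main__":
            for status, hits in reducer(mapper(sys.stdin)):
                print(f"{status}\t{hits}")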

  12. 10 CFR 39.13 - Specific licenses for well logging.

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Specific licenses for well logging. 39.13 Section 39.13 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY REQUIREMENTS FOR WELL LOGGING Specific Licensing Requirements § 39.13 Specific licenses for well logging. The Commission will approve an application for a specific license for the...

  13. Intensifying the Group Member's Experience Using the Group Log.

    Science.gov (United States)

    Valine, Warren J.

    1983-01-01

    Presents the use of a group log in which members analyze the content and process of each session using a suggested format. The log promotes dialogue between the leader and each group member and involves members more fully in the group process. Feedback indicates the log is valuable. (JAC)

  14. Data Cleaning Methods for Client and Proxy Logs

    NARCIS (Netherlands)

    Weinreich, H.; Obendorf, H.; Herder, E.; Edmonds, A.; Hawkey, K.; Kellar, M.; Turnbull, D.

    2006-01-01

    In this paper we present our experiences with the cleaning of Web client and proxy usage logs, based on a long-term browsing study with 25 participants. A detailed clickstream log, recorded using a Web intermediary, was combined with a second log of user interface actions, which was captured by a mo

  15. 14 CFR 121.701 - Maintenance log: Aircraft.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 3 2010-01-01 2010-01-01 false Maintenance log: Aircraft. 121.701 Section... REQUIREMENTS: DOMESTIC, FLAG, AND SUPPLEMENTAL OPERATIONS Records and Reports § 121.701 Maintenance log... have made, a record of that action in the airplane's maintenance log. (b) Each certificate holder...

  16. 31 CFR 593.309 - Round log or timber product.

    Science.gov (United States)

    2010-07-01

    ... 31 Money and Finance: Treasury 3 2010-07-01 2010-07-01 false Round log or timber product. 593.309 Section 593.309 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) OFFICE... SANCTIONS REGULATIONS General Definitions § 593.309 Round log or timber product. The term round log...

  17. 47 CFR 80.1153 - Station log and radio watches.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 5 2010-10-01 2010-10-01 false Station log and radio watches. 80.1153 Section... SERVICES STATIONS IN THE MARITIME SERVICES Voluntary Radio Installations General § 80.1153 Station log and radio watches. (a) Licensees of voluntary ships are not required to maintain radio station logs....

  18. 21 CFR 211.182 - Equipment cleaning and use log.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 4 2010-04-01 2010-04-01 false Equipment cleaning and use log. 211.182 Section... Reports § 211.182 Equipment cleaning and use log. A written record of major equipment cleaning... individual equipment logs that show the date, time, product, and lot number of each batch processed....

  19. 14 CFR 125.407 - Maintenance log: Airplanes.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 3 2010-01-01 2010-01-01 false Maintenance log: Airplanes. 125.407 Section... Maintenance log: Airplanes. (a) Each person who takes corrective action or defers action concerning a reported... record the action taken in the airplane maintenance log in accordance with part 43 of this chapter....

  20. The Learning Log as an Integrated Instructional Assessment Tool.

    Science.gov (United States)

    Topaz, Beverley

    1997-01-01

    Use of student learning logs is recommended as a means for both students and teacher to assess second-language learning. The approach encourages learners to analyze their learning difficulties and plan for overcoming them. Incorporated into portfolios, logs can be used to analyze progress. Sample log sheet and chart used as a framework for…

  1. Why, What, and How to Log? Lessons from LISTEN

    Science.gov (United States)

    Mostow, Jack; Beck, Joseph E.

    2009-01-01

    The ability to log tutorial interactions in comprehensive, longitudinal, fine-grained detail offers great potential for educational data mining--but what data is logged, and how, can facilitate or impede the realization of that potential. We propose guidelines gleaned over 15 years of logging, exploring, and analyzing millions of events from…

  2. A survey of animal-powered logging in Alabama

    Science.gov (United States)

    Christopher W. Toms; Mark R. Dubois; John C. Bliss; John H. Wilhoit; Robert B. Rummer

    2001-01-01

    In a state with a very large, highly mechanized timber harvesting industry, animal-powered logging still occupies a niche in Alabama as a small-scale harvesting alternative. This article summarizes the results from a study that examined the extent of animal logging in Alabama. We investigated this topic by asking who is logging with animals, where are they working,...

  3. RT-PLG: Real Time Process Log Generator

    DEFF Research Database (Denmark)

    Yahya, Bernardo Nugroho; Khosiawan, Yohanes; Choi, Woosik;

    2016-01-01

    . This paper aims to develop a real time process log generator for the usage of streaming process mining tool. The real time process log generator (RT-PLG) is constructed in an independent tool. Afterward, the RT-PLG is utilized to generate a synthetic log for streaming process mining. The tool has been...... evaluated using an existing simulation model....

  4. U.S. Hardwood Sawmill Log Procurement Practices

    Directory of Open Access Journals (Sweden)

    Adrienn Andersch

    2015-01-01

    Full Text Available U.S. hardwood sawmill log procurement practices are evolving because of the recent economic recession, market and supply chain shifts, and changing landowner objectives, among other factors. The objective of this study was to characterize the log procurement practices of hardwood sawmills and to characterize the role that log brokers play in supplying the sawmill industry with raw material. To meet this objective, a mail survey on hardwood log procurement practices in the U.S. hardwood sawmill industry was conducted. Survey respondents highlighted several factors that had major effects on their businesses, including “Increasing fuel and trucking cost,” “High logging cost,” “Unpredictable log supply,” “Log shortages,” “Logger shortages,” and “Low log quality,” among others. Results showed that large sawmills tend to rely more on gatewood from loggers and stumpage harvested by company contract loggers than do small- and medium-sized sawmills. This study failed to find an increase in the role of log brokers as an intermediary between landowners and hardwood sawmills during the last decade. Moreover, sawmills indicated only a limited demand for log broker services, with log delivery and the procurement of specialty logs identified as being the most highly demanded broker services.

  5. Transfinite Approximation of Hindman's Theorem

    CERN Document Server

    Beiglböck, Mathias

    2010-01-01

    Hindman's Theorem states that in any finite coloring of the integers, there is an infinite set all of whose finite sums belong to the same color. This is much stronger than the corresponding finite form, stating that in any finite coloring of the integers there are arbitrarily long finite sets with the same property. We extend the finite form of Hindman's Theorem to a "transfinite" version for each countable ordinal, and show that Hindman's Theorem is equivalent to the appropriate transfinite approximation holding for every countable ordinal. We then give a proof of Hindman's Theorem by directly proving these transfinite approximations.

  6. Fast approximate convex decomposition using relative concavity

    KAUST Repository

    Ghosh, Mukulika

    2013-02-01

    Approximate convex decomposition (ACD) is a technique that partitions an input object into approximately convex components. Decomposition into approximately convex pieces is both more efficient to compute than exact convex decomposition and can also generate a more manageable number of components. It can be used as a basis of divide-and-conquer algorithms for applications such as collision detection, skeleton extraction and mesh generation. In this paper, we propose a new method called Fast Approximate Convex Decomposition (FACD) that improves the quality of the decomposition and reduces the cost of computing it for both 2D and 3D models. In particular, we propose a new strategy for evaluating potential cuts that aims to reduce the relative concavity, rather than absolute concavity. As shown in our results, this leads to more natural and smaller decompositions that include components for small but important features such as toes or fingers while not decomposing larger components, such as the torso, that may have concavities due to surface texture. Second, instead of decomposing a component into two pieces at each step, as in the original ACD, we propose a new strategy that uses a dynamic programming approach to select a set of n_c non-crossing (independent) cuts that can be simultaneously applied to decompose the component into n_c + 1 components. This reduces the depth of recursion and, together with a more efficient method for computing the concavity measure, leads to significant gains in efficiency. We provide comparative results for 2D and 3D models illustrating the improvements obtained by FACD over ACD and we compare with the segmentation methods in the Princeton Shape Benchmark by Chen et al. (2009) [31]. © 2012 Elsevier Ltd. All rights reserved.

  7. Selected borehole geophysical logs and drillers' logs, northern coastal plain of New Jersey

    Science.gov (United States)

    Murashige, J.E.; Birkelo, B.A.; Pucci, A.A.

    1989-01-01

    This report presents lithologic data compiled during the initial phase of a cooperative study by the U.S. Geological Survey and the New Jersey Department of Environmental Protection, Division of Water Resources to assess the hydrogeology of the Potomac-Raritan-Magothy aquifer system in the northern Coastal Plain of New Jersey. The report includes 109 geophysical logs and 328 drillers' logs that were selected as representative of the Potomac-Raritan-Magothy aquifer system. A description of the Potomac-Raritan-Magothy aquifer system also is given. (USGS)

  8. Leading men

    DEFF Research Database (Denmark)

    Bekker-Nielsen, Tønnes

    2016-01-01

    Through a systematic comparison of c. 50 careers leading to the koinarchate or high priesthood of Asia, Bithynia, Galatia, Lycia, Macedonia and coastal Pontus, as described in funeral or honorary inscriptions of individual koinarchs, it is possible to identify common denominators but also...

  9. Lead grids

    CERN Multimedia

    1974-01-01

    One of the 150 lead grids used in the multiwire proportional chamber g-ray detector. The 0.75 mm diameter holes are spaced 1 mm centre to centre. The grids were made by chemical cutting techniques in the Godet Workshop of the SB Physics.

  10. Tree wavelet approximations with applications

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

  11. Logging Data High-Resolution Sequence Stratigraphy

    Institute of Scientific and Technical Information of China (English)

    Li Hongqi; Xie Yinfu; Sun Zhongchun; Luo Xingping

    2006-01-01

    The recognition and correlation of bed sets within parasequences is difficult in high-resolution sequence stratigraphy of terrestrial basins. This study puts forward new methods for identifying and correlating bed-set boundaries on the basis of multiple kinds of logging data. The formation of calcareous interbeds, shale resistivity differences and the relation of reservoir resistivity to altitude are considered on the basis of log-curve morphology, core observation, cast thin sections, X-ray diffraction and scanning electron microscopy. The results show that the thickness of the calcareous interbeds is between 0.5 m and 2 m, increasing at weathering crusts and faults. Calcareous interbeds occur at the bottom of the bed sets; reservoir resistivity increases with altitude. Calcareous interbeds may therefore serve as markers for recognizing bed-set boundaries and for isochronously correlating bed sets, and shale resistivity differences may confirm the stacking relation and connectivity of bed sets. Based on this, a high-resolution chronostratigraphic framework of the Xi-1 segment in the Shinan area, Junggar basin is presented, and the connectivity of bed sets and the oil-water contact is confirmed. In this chronostratigraphic framework, the growth order, stacking mode and spatial shape of the bed sets are qualitatively and quantitatively described.

  12. Public PCs: Log Out or Lose Out

    CERN Multimedia

    Computer Security Team

    2013-01-01

    Do you regularly use one of the public Windows or Linux terminals in the CERN library or in front of the Users' Office? Or do you often give presentations or run meetings, workshops or conferences? Did you recently attend a training session in the CERN Training Centre? If you answered “yes” at least once, we have a plea for you: LOG OUT when done in order to protect your data! You might recall that CERN considers that “Your Privacy is Paramount”. But this does not come for free. In the past few months, we have received several reports from vigilant people who have spotted open user sessions on public PCs at CERN. Those users simply forgot to log out once their work, training or meeting was over. Their session continued without them being present. Worse, with CERN using a central Single Sign-On (SSO) portal, their open session would allow a malicious person at CERN to access that user’s mailbox, DFS ...

  13. Evaluation of the use of salivary lead levels as a surrogate of blood lead or plasma lead levels in lead exposed subjects

    Energy Technology Data Exchange (ETDEWEB)

    Barbosa, Fernando [Universidade de Sao Paulo, Departamento de Analises Clinicas, Toxicologicas e Bromatologicas, Faculdade de Ciencias Farmaceuticas de Ribeirao Preto, Ribeirao Preto, SP (Brazil); Correa Rodrigues, Maria H.; Buzalaf, Maria R. [Universidade de Sao Paulo, Departamento de Ciencias Biologicas/Bioquimica, Faculdade de Odontologia de Bauru, Bauru, SP (Brazil); Krug, Francisco J. [Universidade de Sao Paulo, Centro de Energia Nuclear na Agricultura, Piracicaba, SP (Brazil); Gerlach, Raquel F. [Universidade de Sao Paulo, Departamento de Morfologia, Estomatologia e Fisiologia, Faculdade de Odontologia de Ribeirao Preto, Ribeirao Preto, SP (Brazil); Tanus-Santos, Jose E. [Universidade de Sao Paulo, Departamento de Farmacologia, Faculdade de Medicina de Ribeirao Preto, Ribeirao Preto, SP (Brazil)

    2006-10-15

    We conducted a study to evaluate the use of parotid salivary lead (Pb-saliva) levels as a surrogate of the blood lead (Pb-B) or plasma lead levels (Pb-P) to diagnose lead exposure. The relationship between these biomarkers was assessed in a lead exposed population. Pb-saliva and Pb-P were determined by inductively coupled plasma mass spectrometry, while in whole blood lead was determined by graphite furnace atomic absorption spectrometry. We studied 88 adults (31 men and 57 women) from 18 to 60 years old. Pb-saliva levels varied from 0.05 to 4.4 µg/l, with a mean of 0.85 µg/l. Blood lead levels varied from 32.0 to 428.0 µg/l in men (mean 112.3 µg/l) and from 25.0 to 263.0 µg/l (mean 63.5 µg/l) in women. Corresponding Pb-Ps were 0.02-2.50 µg/l (mean 0.77 µg/l) and 0.03-1.6 µg/l (mean 0.42 µg/l) in men and women, respectively. A weak correlation was found between Log Pb-saliva and Log Pb-B (r=0.277, P<0.008), and between Log Pb-saliva and Log Pb-P (r=0.280, P=0.006). The Pb-saliva/Pb-P ratio ranged from 0.20 to 18.0. Age or gender does not affect Pb-saliva levels or Pb-saliva/Pb-P ratio. Taken together, these results suggest that salivary lead may not be used as a biomarker to diagnose lead exposure nor as a surrogate of plasma lead levels at least for low to moderately lead exposed population. (orig.)

  14. Financial Anti-Bubbles Log-Periodicity in Gold and Nikkei Collapses

    Science.gov (United States)

    Johansen, A.; Sornette, D.

    We propose that the herding behavior of traders leads not only to speculative bubbles with accelerating over-valuations of financial markets possibly followed by crashes, but also to "anti-bubbles" with decelerating market devaluations following all-time highs. For this, we propose a simple market dynamics model in which the demand decreases slowly with barriers that progressively quench in, leading to a power law decay of the market price characterized by decelerating log-periodic oscillations. We document this behavior of the Japanese Nikkei stock index from 1990 to present and of the gold future prices after 1980, both after their all-time highs. We perform simultaneously parametric and nonparametric analyses that are fully consistent with each other. We extend the parametric approach to the next order of perturbation, comparing the log-periodic fits with one, two and three log-frequencies, the latter providing a prediction for the general trend in the coming years. The nonparametric power spectrum analysis shows the existence of log-periodicity with high statistical significance, with a preferred scale ratio of λ≈3.5 for the Nikkei index and λ≈1.9 for the Gold future prices, comparable to the values obtained for speculative bubbles leading to crashes.
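
    For reference, a decelerating log-periodic decay of the kind described above is commonly parameterized in the related literature (the study's exact fitting functions are not reproduced here) as p(t) ≈ A + B (t - t_c)^m [1 + C cos(ω log(t - t_c) + φ)] for t > t_c, where t_c is the time of the all-time high. A small numpy sketch with made-up parameter values:

        import numpy as np

        def log_periodic(t, tc, A, B, m, C, omega, phi):
            """Generic log-periodic power law for an 'anti-bubble' (valid for t > tc)."""
            dt = t - tc
            return A + B * dt**m * (1.0 + C * np.cos(omega * np.log(dt) + phi))

        # Made-up parameters purely for illustration (not fitted to Nikkei or gold data).
        t = np.linspace(1990.1, 2000.0, 50)
        prices = log_periodic(t, tc=1990.0, A=20000.0, B=-6000.0, m=0.4,
                              C=0.1, omega=6.5, phi=0.0)
        print(prices[:3])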

  15. Theory of Casimir Forces without the Proximity-Force Approximation.

    Science.gov (United States)

    Lapas, Luciano C; Pérez-Madrid, Agustín; Rubí, J Miguel

    2016-03-18

    We analyze both the attractive and repulsive Casimir-Lifshitz forces recently reported in experimental investigations. By using a kinetic approach, we obtain the Casimir forces from the power absorbed by the materials. We consider collective material excitations through a set of relaxation times distributed in frequency according to a log-normal function. A generalized expression for these forces for arbitrary values of temperature is obtained. We compare our results with experimental measurements and conclude that the model goes beyond the proximity-force approximation.

  16. WKB Approximation in Noncommutative Gravity

    Directory of Open Access Journals (Sweden)

    Maja Buric

    2007-12-01

    Full Text Available We consider the quasi-commutative approximation to a noncommutative geometry defined as a generalization of the moving frame formalism. The relation which exists between noncommutativity and geometry is used to study the properties of the high-frequency waves on the flat background.

  17. Truthful approximations to range voting

    DEFF Research Database (Denmark)

    Filos-Ratsika, Aris; Miltersen, Peter Bro

    We consider the fundamental mechanism design problem of approximate social welfare maximization under general cardinal preferences on a finite number of alternatives and without money. The well-known range voting scheme can be thought of as a non-truthful mechanism for exact social welfare...

  18. Approximate Reasoning with Fuzzy Booleans

    NARCIS (Netherlands)

    Broek, van den P.M.; Noppen, J.A.R.

    2004-01-01

    This paper introduces, in analogy to the concept of fuzzy numbers, the concept of fuzzy booleans, and examines approximate reasoning with the compositional rule of inference using fuzzy booleans. It is shown that each set of fuzzy rules is equivalent to a set of fuzzy rules with singleton crisp ante

  19. Ultrafast Approximation for Phylogenetic Bootstrap

    NARCIS (Netherlands)

    Bui Quang Minh, [No Value; Nguyen, Thi; von Haeseler, Arndt

    2013-01-01

    Nonparametric bootstrap has been a widely used tool in phylogenetic analysis to assess the clade support of phylogenetic trees. However, with the rapidly growing amount of data, this task remains a computational bottleneck. Recently, approximation methods such as the RAxML rapid bootstrap (RBS) and

  20. On badly approximable complex numbers

    DEFF Research Database (Denmark)

    Esdahl-Schou, Rune; Kristensen, S.

    We show that the set of complex numbers which are badly approximable by ratios of elements of , where has maximal Hausdorff dimension. In addition, the intersection of these sets is shown to have maximal dimension. The results remain true when the sets in question are intersected with a suitably...

  1. Rational approximation of vertical segments

    Science.gov (United States)

    Salazar Celis, Oliver; Cuyt, Annie; Verdonk, Brigitte

    2007-08-01

    In many applications, observations are prone to imprecise measurements. When constructing a model based on such data, an approximation rather than an interpolation approach is needed. Very often a least squares approximation is used. Here we follow a different approach. A natural way for dealing with uncertainty in the data is by means of an uncertainty interval. We assume that the uncertainty in the independent variables is negligible and that for each observation an uncertainty interval can be given which contains the (unknown) exact value. To approximate such data we look for functions which intersect all uncertainty intervals. In the past this problem has been studied for polynomials, or more generally for functions which are linear in the unknown coefficients. Here we study the problem for a particular class of functions which are nonlinear in the unknown coefficients, namely rational functions. We show how to reduce the problem to a quadratic programming problem with a strictly convex objective function, yielding a unique rational function which intersects all uncertainty intervals and satisfies some additional properties. Compared to rational least squares approximation which reduces to a nonlinear optimization problem where the objective function may have many local minima, this makes the new approach attractive.
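
    A simplified feasibility version of this idea (linear programming rather than the strictly convex quadratic program the authors use, so the solution is not unique; illustrative only) looks for a rational function p/q whose graph passes through every interval [l_i, u_i]: the constraints l_i q(x_i) <= p(x_i) <= u_i q(x_i) and q(x_i) >= 1 are linear in the coefficients of p and q.

        import numpy as np
        from scipy.optimize import linprog

        def interval_rational(x, lo, hi, dp=2, dq=1):
            """Find p/q (deg p = dp, deg q = dq) with lo[i] <= p(x[i])/q(x[i]) <= hi[i]."""
            Vp = np.vander(x, dp + 1)                  # rows: [x^dp, ..., x, 1]
            Vq = np.vander(x, dq + 1)
            # Variables: coefficients of p followed by coefficients of q.
            # Constraints (all written as A @ c <= b):
            #   lo_i*q(x_i) - p(x_i) <= 0,  p(x_i) - hi_i*q(x_i) <= 0,  -q(x_i) <= -1
            A = np.vstack([np.hstack([-Vp,  lo[:, None] * Vq]),
                           np.hstack([ Vp, -hi[:, None] * Vq]),
                           np.hstack([np.zeros_like(Vp), -Vq])])
            b = np.concatenate([np.zeros(2 * len(x)), -np.ones(len(x))])
            res = linprog(c=np.zeros(dp + dq + 2), A_ub=A, b_ub=b,
                          bounds=[(None, None)] * (dp + dq + 2))
            return (res.x[:dp + 1], res.x[dp + 1:]) if res.success else None

        x = np.linspace(0.0, 2.0, 9)
        mid = 1.0 / (1.0 + x)                          # underlying "true" values
        p, q = interval_rational(x, mid - 0.05, mid + 0.05)
        print(np.polyval(p, 1.0) / np.polyval(q, 1.0))  # within 0.05 of 0.5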

  2. Approximation on the complex sphere

    OpenAIRE

    Alsaud, Huda; Kushpel, Alexander; Levesley, Jeremy

    2012-01-01

    We develop new elements of harmonic analysis on the complex sphere on the basis of which Bernstein's, Jackson's and Kolmogorov's inequalities are established. We apply these results to get order sharp estimates of $m$-term approximations. The results obtained is a synthesis of new results on classical orthogonal polynomials, harmonic analysis on manifolds and geometric properties of Euclidean spaces.

  4. Pythagorean Approximations and Continued Fractions

    Science.gov (United States)

    Peralta, Javier

    2008-01-01

    In this article, we will show that the Pythagorean approximations of [the square root of] 2 coincide with those achieved in the 16th century by means of continued fractions. Assuming this fact and the known relation that connects the Fibonacci sequence with the golden section, we shall establish a procedure to obtain sequences of rational numbers…
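
    The continued fraction of the square root of 2 is [1; 2, 2, 2, ...], and its convergents reproduce the classical Pythagorean ("side and diagonal number") approximations 1, 3/2, 7/5, 17/12, ... The recurrence below is a standard convergent computation (Python, illustrative only):

        from fractions import Fraction

        def sqrt2_convergents(n):
            """First n convergents of sqrt(2) = [1; 2, 2, 2, ...]."""
            h_prev, h = 1, 1          # numerators:   h_k = 2*h_{k-1} + h_{k-2}
            k_prev, k = 0, 1          # denominators: same recurrence
            out = [Fraction(h, k)]
            for _ in range(n - 1):
                h_prev, h = h, 2 * h + h_prev
                k_prev, k = k, 2 * k + k_prev
                out.append(Fraction(h, k))
            return out

        print([str(f) for f in sqrt2_convergents(5)])   # ['1', '3/2', '7/5', '17/12', '41/29']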

  5. Approximate Reanalysis in Topology Optimization

    DEFF Research Database (Denmark)

    Amir, Oded; Bendsøe, Martin P.; Sigmund, Ole

    2009-01-01

    In the nested approach to structural optimization, most of the computational effort is invested in the solution of the finite element analysis equations. In this study, the integration of an approximate reanalysis procedure into the framework of topology optimization of continuum structures...

  6. Low Rank Approximation in $G_0W_0$ Approximation

    CERN Document Server

    Shao, Meiyue; Yang, Chao; Liu, Fang; da Jornada, Felipe H; Deslippe, Jack; Louie, Steven G

    2016-01-01

    The single particle energies obtained in a Kohn--Sham density functional theory (DFT) calculation are generally known to be poor approximations to electron excitation energies that are measured in transport, tunneling and spectroscopic experiments such as photo-emission spectroscopy. The correction to these energies can be obtained from the poles of a single particle Green's function derived from a many-body perturbation theory. From a computational perspective, the accuracy and efficiency of such an approach depends on how a self energy term that properly accounts for dynamic screening of electrons is approximated. The $G_0W_0$ approximation is a widely used technique in which the self energy is expressed as the convolution of a non-interacting Green's function ($G_0$) and a screened Coulomb interaction ($W_0$) in the frequency domain. The computational cost associated with such a convolution is high due to the high complexity of evaluating $W_0$ at multiple frequencies. In this paper, we discuss how the cos...

  7. Min-cuts and Shortest Cycles in Planar Graphs in O(n log log n) Time

    CERN Document Server

    Łącki, Jakub

    2011-01-01

    We present a deterministic O(n log log n) time algorithm for finding shortest cycles and minimum cuts in planar graphs. The algorithm improves the previously known fastest algorithm by Italiano et al. in STOC'11 by a factor of log n. This speedup is obtained through the use of dense distance graphs combined with a divide-and-conquer approach.

  8. Coal log pipeline for twenty-first century coal transportation

    Energy Technology Data Exchange (ETDEWEB)

    Marrero, T.R.; Liu, H.; Wilkinson, J.E. [Univ. of Missouri, Columbia, MO (United States)]

    1998-12-31

    During the first years of the 21st century coal log pipeline (CLP) technology will be available for long-distance coal transportation. The purpose of this report is to present the state-of-the-art of coal log pipeline technology. Some recent developments are as follows: optimization of coal log compaction procedures, construction of a unique coal log prototype manufacturing machine, and its testing. Coal log abrasion while transported in water-filled pipelines is also discussed. A CLP pilot plant is currently under construction at the University of Missouri. For certain routes in the US a CLP system appears to be cost-competitive.

  9. An additive combinatorics approach to the log-rank conjecture in communication complexity

    CERN Document Server

    Ben-Sasson, Eli; Zewi, Noga

    2011-01-01

    For a $\\{0,1\\}$-valued matrix $M$ let $\\rm{CC}(M)$ denote the deterministic communication complexity of the boolean function associated with $M$. The log-rank conjecture of Lov\\'{a}sz and Saks [FOCS 1988] states that $\\rm{CC}(M) \\leq \\log^c(\\rm{rank}(M))$ for some absolute constant $c$ where $\\rm{rank}(M)$ denotes the rank of $M$ over the field of real numbers. We show that $\\rm{CC}(M)\\leq c \\cdot \\rm{rank}(M)/\\log \\rm{rank}(M)$ for some absolute constant $c$, assuming a well-known conjecture from additive combinatorics known as the Polynomial Freiman-Ruzsa (PFR) conjecture. Our proof is based on the study of the "approximate duality conjecture" which was recently suggested by Ben-Sasson and Zewi [STOC 2011] and studied there in connection to the PFR conjecture. First we improve the bounds on approximate duality assuming the PFR conjecture. Then we use the approximate duality conjecture (with improved bounds) to get the aforementioned upper bound on the communication complexity of low-rank matrices, where thi...

  10. Providing clinicians with information on laboratory test costs leads to ...

    African Journals Online (AJOL)

    Providing clinicians with information on laboratory test costs leads to ... For the intervention and control groups, pre- and postintervention cost and days in hospital were estimated.

  11. A Sublogarithmic Approximation for Highway and Tollbooth Pricing

    CERN Document Server

    Gamzu, Iftah

    2010-01-01

    An instance of the tollbooth problem consists of an undirected network and a collection of single-minded customers, each of which is interested in purchasing a fixed path subject to an individual budget constraint. The objective is to assign a per-unit price to each edge in a way that maximizes the collective revenue obtained from all customers. The revenue generated by any customer is equal to the overall price of the edges in her desired path, when this cost falls within her budget; otherwise, that customer will not purchase any edge. Our main result is a deterministic algorithm for the tollbooth problem on trees whose approximation ratio is O(log m / log log m), where m denotes the number of edges in the underlying graph. This finding improves on the currently best performance guarantees for trees, due to Elbassioni et al. (SAGT '09), as well as for paths (commonly known as the highway problem), due to Balcan and Blum (EC '06). An additional interesting consequence is a computational separation between tol...

  12. Write-Combined Logging: An Optimized Logging for Consistency in NVRAM

    Directory of Open Access Journals (Sweden)

    Wenzhe Zhang

    2015-01-01

    Nonvolatile memory (e.g., Phase Change Memory) blurs the boundary between memory and storage and it could greatly facilitate the construction of in-memory durable data structures. Data structures can be processed and stored directly in NVRAM. To maintain the consistency of persistent data, logging is a widely adopted mechanism. However, logging introduces write-twice overhead. This paper introduces an optimized write-combined logging to reduce the writes to the NVRAM log. By leveraging the fast-read and byte-addressable features of NVRAM, we can perform a read-and-compare operation before writes and thus issue writes in a finer-grained way. We tested our system on the benchmark suite STAMP, which contains real-world applications. Experiment results show that our system can reduce the writes to NVRAM by 33%–34%, which can help extend the lifetime of NVRAM and improve performance. On average our system can improve performance by 7%–11%.
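
    The following is a hypothetical sketch of the read-and-compare idea described above, not the paper's implementation: the old contents are read back from (fast, byte-addressable) NVRAM first, and only the byte ranges that actually change are appended to the log. The data structures, the undo-log style, and the byte granularity are assumptions for illustration.

```python
# Log only the sub-ranges whose content actually changes, shrinking the
# write-twice overhead of naive logging.
def write_combined_log_entry(nvram, log, addr, new_bytes):
    """Append old values of changed byte ranges to `log`, then apply the update."""
    old_bytes = nvram[addr:addr + len(new_bytes)]   # NVRAM reads are cheap
    run_start = None
    for i, (old, new) in enumerate(zip(old_bytes, new_bytes)):
        if old != new and run_start is None:
            run_start = i                            # a dirty run begins
        elif old == new and run_start is not None:
            log.append((addr + run_start, bytes(old_bytes[run_start:i])))
            run_start = None
        # unchanged bytes inside a clean run are skipped entirely
    if run_start is not None:
        log.append((addr + run_start, bytes(old_bytes[run_start:])))
    nvram[addr:addr + len(new_bytes)] = new_bytes    # then apply the update

nvram = bytearray(b"hello world")
undo_log = []
write_combined_log_entry(nvram, undo_log, 0, b"hellO worlD")
print(undo_log)   # only the two changed bytes were logged: [(4, b'o'), (10, b'd')]
```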

  13. Use of historical logging patterns to identify disproportionately logged ecosystems within temperate rainforests of southeastern Alaska.

    Science.gov (United States)

    Albert, David M; Schoen, John W

    2013-08-01

    The forests of southeastern Alaska remain largely intact and contain a substantial proportion of Earth's remaining old-growth temperate rainforest. Nonetheless, industrial-scale logging has occurred since the 1950s within a relatively narrow range of forest types that has never been quantified at a regional scale. We analyzed historical patterns of logging from 1954 through 2004 and compared the relative rates of change among forest types, landform associations, and biogeographic provinces. We found a consistent pattern of disproportionate logging at multiple scales, including large-tree stands and landscapes with contiguous productive old-growth forests. The highest rates of change were among landform associations and biogeographic provinces that originally contained the largest concentrations of productive old growth (i.e., timber volume >46.6 m³/ha). Although only 11.9% of productive old-growth forests have been logged region wide, large-tree stands have been reduced by at least 28.1%, karst forests by 37%, and landscapes with the highest volume of contiguous old growth by 66.5%. Within some island biogeographic provinces, loss of rare forest types may place local viability of species dependent on old growth at risk of extirpation. Examination of historical patterns of change among ecological forest types can facilitate planning for conservation of biodiversity and sustainable use of forest resources. © 2013 Society for Conservation Biology.

  14. KFM 01A. Q-logging

    Energy Technology Data Exchange (ETDEWEB)

    Barton, Nick [Nick Barton and Associates (Norway)

    2003-03-01

    The first Forsmark potential repository site borehole KFM 01A provided core from 101.8 to 1000.7 m depth. This was independently Q-logged during a two-day period (19th-20th February, 2003), without access to BORMAP results or regional jointing frequencies or orientations. The Q-logging was intended to be an independent check for subsequent BORMAP-derived Q-parameter estimation. The Q-logging was accomplished using the manually-recorded 'histogram method' which allows the logger to enter Q-parameter ranges and depths directly into the appropriate histograms, which facilitates subsequent data processing using Excel spreadsheets. Successive pairs of core boxes, which contain an average of 11 meters of core in ten rows, were the source of ten opinions of each of the six Q-parameters, giving a total of 4920 recordings of Q-parameter values for the 164 core boxes. Data processing was divided into several parts, with successively increasing detail. The report therefore contains Q-histograms for the whole core, for four identified fracture(d) zones combined as if one unit, and then for the whole core minus these fracture(d) zones. This background rock mass quality is subsequently divided into nine depth zones or slices, and trends of variation with depth are tabulated. The four identified fracture(d) zones, which are actually of reasonable quality, are also analysed separately, and similarities and subtle differences are discerned between them. The overall quality of this first core is very good to excellent, with Q(mean) of 48.4, and a most frequent Q-value of 100. The range of quality is from 2.1 to 2130, which is the complete upper half of the six order of magnitude Q scale. Even the relatively fracture(d) zones, representing some 13% of the 900 m cored, have a combined Q(mean) of 13.9 and a range of quality of 2.1 to 150.

  15. Who Leads China's Leading Universities?

    Science.gov (United States)

    Huang, Futao

    2017-01-01

    This study attempts to identify the major characteristics of two different groups of institutional leaders in China's leading universities. The study begins with a review of relevant literature and theory. Then, there is a brief introduction to the selection of party secretaries, deputy secretaries, presidents and vice presidents in leading…

  16. Methods for interpretation of tensor induction well logging in layered anisotropic formations

    Science.gov (United States)

    Peksen, Ertan

    One of the most challenging problems in the field of electromagnetic well logging is the development of interpretation methods for the characterization of conductivity anisotropy in an earth formation. Response of a triaxial electromagnetic induction well logging instrument is examined. This instrument detects three components of the magnetic field due to each of three transmitters for a total of nine signals. The conductivity anisotropy of the medium can be resolved from the instrument response. This information includes not only the vertical and horizontal conductivities, but also the orientation of the logging instrument axis with respect to the principal tensor axes. Formulas for the apparent horizontal and vertical conductivities, the apparent anisotropy coefficient, and the apparent relative deviation angle are introduced. A new method of induction logging based on electrical measurements is investigated. Electrical tensor components are studied in an unbounded, homogeneous, transversely isotropic, conductive medium. Low frequency asymptotic approximations of the analytical solution are derived. The important result is that by measuring the in-phase components of the electrical tensor, the principal values of the conductivity tensor can be obtained. The basic principles of tensor induction logging two-, three-, and multilayer anisotropic formations in vertical and deviated wells are examined by using numerical simulation of the tensor logs. A technique for correct reconstruction of the apparent conductivities of the anisotropic formations is introduced, based on application of a regularized Newton method. The method is fast and provides real time interpretation. The practical effectiveness of this technique for tensor induction log interpretation is illustrated using results of numerical experiments. The theoretical formulas for the tensor apparent conductivities of the transversely isotropic medium are studied and developed for an ideal tensor induction

  17. On Log Data Link and Integrated Net Logging

    Institute of Scientific and Technical Information of China (English)

    林德强; 陈浩军; 徐秋贞; 钮顺

    2011-01-01

    Log data link is a new concept that uses advanced network technology and system integration technology to connect previously independent logging business units, such as the logging acquisition system (EILog), the digital core system, processing and interpretation systems (LEAD), production management, and ERP, so that all business units work in a coordinated way and quick solutions can be provided for oil and gas evaluation. This paper proposes the concept of the log data link, discusses the new-generation logging technology system based on it and its basic characteristics, and briefly describes the underlying network, the system architecture, and future applications.

  18. Does logging and forest conversion to oil palm agriculture alter functional diversity in a biodiversity hotspot?

    Science.gov (United States)

    Edwards, F A; Edwards, D P; Larsen, T H; Hsu, W W; Benedick, S; Chung, A; Vun Khen, C; Wilcove, D S; Hamer, K C

    2014-01-01

    Forests in Southeast Asia are rapidly being logged and converted to oil palm. These changes in land-use are known to affect species diversity but consequences for the functional diversity of species assemblages are poorly understood. Environmental filtering of species with similar traits could lead to disproportionate reductions in trait diversity in degraded habitats. Here, we focus on dung beetles, which play a key role in ecosystem processes such as nutrient recycling and seed dispersal. We use morphological and behavioural traits to calculate a variety of functional diversity measures across a gradient of disturbance from primary forest through intensively logged forest to oil palm. Logging caused significant shifts in community composition but had very little effect on functional diversity, even after a repeated timber harvest. These data provide evidence for functional redundancy of dung beetles within primary forest and emphasize the high value of logged forests as refugia for biodiversity. In contrast, conversion of forest to oil palm greatly reduced taxonomic and functional diversity, with a marked decrease in the abundance of nocturnal foragers, a higher proportion of species with small body sizes and the complete loss of telecoprid species (dung-rollers), all indicating a decrease in the functional capacity of dung beetles within plantations. These changes also highlight the vulnerability of community functioning within logged forests in the event of further environmental degradation. PMID:25821399

  19. Does logging and forest conversion to oil palm agriculture alter functional diversity in a biodiversity hotspot?

    Science.gov (United States)

    Edwards, F A; Edwards, D P; Larsen, T H; Hsu, W W; Benedick, S; Chung, A; Vun Khen, C; Wilcove, D S; Hamer, K C

    2014-04-01

    Forests in Southeast Asia are rapidly being logged and converted to oil palm. These changes in land-use are known to affect species diversity but consequences for the functional diversity of species assemblages are poorly understood. Environmental filtering of species with similar traits could lead to disproportionate reductions in trait diversity in degraded habitats. Here, we focus on dung beetles, which play a key role in ecosystem processes such as nutrient recycling and seed dispersal. We use morphological and behavioural traits to calculate a variety of functional diversity measures across a gradient of disturbance from primary forest through intensively logged forest to oil palm. Logging caused significant shifts in community composition but had very little effect on functional diversity, even after a repeated timber harvest. These data provide evidence for functional redundancy of dung beetles within primary forest and emphasize the high value of logged forests as refugia for biodiversity. In contrast, conversion of forest to oil palm greatly reduced taxonomic and functional diversity, with a marked decrease in the abundance of nocturnal foragers, a higher proportion of species with small body sizes and the complete loss of telecoprid species (dung-rollers), all indicating a decrease in the functional capacity of dung beetles within plantations. These changes also highlight the vulnerability of community functioning within logged forests in the event of further environmental degradation.

  20. Hydrogen Beyond the Classic Approximation

    CERN Document Server

    Scivetti, I

    2003-01-01

    The classical nucleus approximation is the most frequently used approach for the resolution of problems in condensed matter physics. However, there are systems in nature where it is necessary to introduce the nuclear degrees of freedom to obtain a correct description of the properties. Examples of this are systems containing hydrogen. In this work, we have studied the resolution of the quantum nuclear problem for the particular case of the water molecule. The Hartree approximation has been used, i.e. we have considered that the nuclei are distinguishable particles. In addition, we have proposed a model to solve the tunneling process, which involves the resolution of the nuclear problem for configurations of the system away from its equilibrium position.

  1. Approximate Privacy: Foundations and Quantification

    CERN Document Server

    Feigenbaum, Joan; Schapira, Michael

    2009-01-01

    Increasing use of computers and networks in business, government, recreation, and almost all aspects of daily life has led to a proliferation of online sensitive data about individuals and organizations. Consequently, concern about the privacy of these data has become a top priority, particularly those data that are created and used in electronic commerce. There have been many formulations of privacy and, unfortunately, many negative results about the feasibility of maintaining privacy of sensitive data in realistic networked environments. We formulate communication-complexity-based definitions, both worst-case and average-case, of a problem's privacy-approximation ratio. We use our definitions to investigate the extent to which approximate privacy is achievable in two standard problems: the second-price Vickrey auction and the millionaires problem of Yao. For both the second-price Vickrey auction and the millionaires problem, we show that not only is perfect privacy impossible or infeasibly costly to achieve...

  2. Simulation Control Graphical User Interface Logging Report

    Science.gov (United States)

    Hewling, Karl B., Jr.

    2012-01-01

    One of the many tasks of my project was to revise the code of the Simulation Control Graphical User Interface (SIM GUI) to enable logging functionality to a file. I was also tasked with developing a script that directed the startup and initialization flow of the various LCS software components. This makes sure that a software component will not spin up until all the appropriate dependencies have been configured properly. Also I was able to assist hardware modelers in verifying the configuration of models after they have been upgraded to a new software version. I developed some code that analyzes the MDL files to determine if any errors were generated due to the upgrade process. Another one of the projects assigned to me was supporting the End-to-End Hardware/Software Daily Tag-up meeting.

  3. Finite element methods in resistivity logging

    Science.gov (United States)

    Lovell, J. R.

    1993-09-01

    Resistivity measurements are used in geophysical logging to help determine hydrocarbon reserves. The derivation of formation parameters from resistivity measurements is a complicated nonlinear procedure often requiring additional geological information. This requires an excellent understanding of tool physics, both to design new tools and interpret the measurements of existing tools. The Laterolog measurements in particular are difficult to interpret because the response is very nonlinear as a function of electrical conductivity, unlike Induction measurements. Forward modeling of the Laterolog is almost invariably done with finite element codes which require the inversion of large sparse matrices. Modern techniques can be used to accelerate this inversion. Moreover, an understanding of the tool physics can help refine these numerical techniques.

  4. Families of Log Canonically Polarized Varieties

    CERN Document Server

    Dundon, Ariana

    2011-01-01

    Determining the number of singular fibers in a family of varieties over a curve is a generalization of Shafarevich's Conjecture and has implications for the types of subvarieties that can appear in the corresponding moduli stack. We consider families of log canonically polarized varieties over $\\mathbb{P}^1$, i.e. families $g:(Y,D)\\to \\mathbb{P}^1$ where $D$ is an effective snc divisor and the sheaf $\\omega_{Y/\\mathbb{P}^1}(D)$ is $g$-ample. After first defining what it means for fibers of such a family to be singular, we show that with the addition of certain mild hypotheses (the fibers have finite automorphism group, $\\mathcal{O}_Y(D)$ is semi-ample, and the components of $D$ must avoid the singular locus of the fibers and intersect the fibers transversely), such a family must either be isotrivial or contain at least 3 singular fibers.

  5. Approximate Counting of Graphical Realizations.

    Science.gov (United States)

    Erdős, Péter L; Kiss, Sándor Z; Miklós, István; Soukup, Lajos

    2015-01-01

    In 1999 Kannan, Tetali and Vempala proposed a MCMC method to uniformly sample all possible realizations of a given graphical degree sequence and conjectured its rapidly mixing nature. Recently their conjecture was proved affirmative for regular graphs (by Cooper, Dyer and Greenhill, 2007), for regular directed graphs (by Greenhill, 2011) and for half-regular bipartite graphs (by Miklós, Erdős and Soukup, 2013). Several heuristics on counting the number of possible realizations exist (via sampling processes), and while they work well in practice, so far no approximation guarantees exist for such an approach. This paper is the first to develop a method for counting realizations with provable approximation guarantee. In fact, we solve a slightly more general problem; besides the graphical degree sequence a small set of forbidden edges is also given. We show that for the general problem (which contains the Greenhill problem and the Miklós, Erdős and Soukup problem as special cases) the derived MCMC process is rapidly mixing. Further, we show that this new problem is self-reducible therefore it provides a fully polynomial randomized approximation scheme (a.k.a. FPRAS) for counting of all realizations.
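
    For readers unfamiliar with the underlying chain, the sketch below shows the basic degree-preserving double edge swap move on a simple graph. It is a bare illustration, not the authors' chain: in particular it ignores forbidden edges and makes no claim about mixing time.

```python
# Degree-preserving double edge swaps: pick edges (a,b), (c,d) and rewire to
# (a,d), (c,b) whenever the result remains a simple graph.
import random

def double_edge_swap_mcmc(edges, steps, rng=random.Random(0)):
    edges = {frozenset(e) for e in edges}            # simple undirected graph
    for _ in range(steps):
        (a, b), (c, d) = rng.sample(sorted(map(tuple, edges)), 2)
        if len({a, b, c, d}) < 4:
            continue                                  # swap would create a loop
        new1, new2 = frozenset((a, d)), frozenset((c, b))
        if new1 in edges or new2 in edges:
            continue                                  # swap would create a multi-edge
        edges -= {frozenset((a, b)), frozenset((c, d))}
        edges |= {new1, new2}
    return edges

g = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]          # degree sequence 3, 2, 3, 2
print(sorted(tuple(sorted(e)) for e in double_edge_swap_mcmc(g, 1000)))
```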

  6. Approximate Counting of Graphical Realizations.

    Directory of Open Access Journals (Sweden)

    Péter L Erdős

    In 1999 Kannan, Tetali and Vempala proposed a MCMC method to uniformly sample all possible realizations of a given graphical degree sequence and conjectured its rapidly mixing nature. Recently their conjecture was proved affirmative for regular graphs (by Cooper, Dyer and Greenhill, 2007), for regular directed graphs (by Greenhill, 2011) and for half-regular bipartite graphs (by Miklós, Erdős and Soukup, 2013). Several heuristics on counting the number of possible realizations exist (via sampling processes), and while they work well in practice, so far no approximation guarantees exist for such an approach. This paper is the first to develop a method for counting realizations with provable approximation guarantee. In fact, we solve a slightly more general problem; besides the graphical degree sequence a small set of forbidden edges is also given. We show that for the general problem (which contains the Greenhill problem and the Miklós, Erdős and Soukup problem as special cases) the derived MCMC process is rapidly mixing. Further, we show that this new problem is self-reducible therefore it provides a fully polynomial randomized approximation scheme (a.k.a. FPRAS) for counting of all realizations.

  7. Many Faces of Boussinesq Approximations

    CERN Document Server

    Vladimirov, Vladimir A

    2016-01-01

    The \\emph{equations of Boussinesq approximation} (EBA) for an incompressible and inhomogeneous in density fluid are analyzed from a viewpoint of the asymptotic theory. A systematic scaling shows that there is an infinite number of related asymptotic models. We have divided them into three classes: `poor', `reasonable' and `good' Boussinesq approximations. Each model can be characterized by two parameters $q$ and $k$, where $q =1, 2, 3, \\dots$ and $k=0, \\pm 1, \\pm 2,\\dots$. Parameter $q$ is related to the `quality' of approximation, while $k$ gives us an infinite set of possible scales of velocity, time, viscosity, \\emph{etc.} Increasing $q$ improves the quality of a model, but narrows the limits of its applicability. Parameter $k$ allows us to vary the scales of time, velocity and viscosity and gives us the possibility to consider any initial and boundary conditions. In general, we discover and classify a rich variety of possibilities and restrictions, which are hidden behind the routine use of the Boussinesq...

  8. CMB-lensing beyond the Born approximation

    Science.gov (United States)

    Marozzi, Giovanni; Fanizza, Giuseppe; Di Dio, Enea; Durrer, Ruth

    2016-09-01

    We investigate the weak lensing corrections to the cosmic microwave background temperature anisotropies considering effects beyond the Born approximation. To this aim, we use the small deflection angle approximation, to connect the lensed and unlensed power spectra, via expressions for the deflection angles up to third order in the gravitational potential. While the small deflection angle approximation has the drawback to be reliable only for multipoles ℓ ≲ 2500, it allows us to consistently take into account the non-Gaussian nature of cosmological perturbation theory beyond the linear level. The contribution to the lensed temperature power spectrum coming from the non-Gaussian nature of the deflection angle at higher order is a new effect which has not been taken into account in the literature so far. It turns out to be the leading contribution among the post-Born lensing corrections. On the other hand, the effect is smaller than corrections coming from non-linearities in the matter power spectrum, and its imprint on CMB lensing is too small to be seen in present experiments.

  9. CMB-lensing beyond the Born approximation

    CERN Document Server

    Marozzi, Giovanni; Di Dio, Enea; Durrer, Ruth

    2016-01-01

    We investigate the weak lensing corrections to the cosmic microwave background temperature anisotropies considering effects beyond the Born approximation. To this aim, we use the small deflection angle approximation, to connect the lensed and unlensed power spectra, via expressions for the deflection angles up to third order in the gravitational potential. While the small deflection angle approximation has the drawback to be reliable only for multipoles $\\ell\\lesssim 2500$, it allows us to consistently take into account the non-Gaussian nature of cosmological perturbation theory beyond the linear level. The contribution to the lensed temperature power spectrum coming from the non-Gaussian nature of the deflection angle at higher order is a new effect which has not been taken into account in the literature so far. It turns out to be the leading contribution among the post-Born lensing corrections. On the other hand, the effect is smaller than corrections coming from non-linearities in the matter power spectrum...

  10. Leading Cities

    DEFF Research Database (Denmark)

    Pogner, Karl-Heinz

    2017-01-01

    and technical engineering; Smart Cities) is very prominent in the traditional mass media discourse, in PR / PA of tech companies and traditional municipal administrations; whereas the second one (participation; Livable Cities) is mostly enacted in social media, (local) initiatives, movements, (virtual......) communities, new forms of urban governance in municipal administration and co-competitive city networks. Both forms seem to struggle for getting voice and power in the discourses, negotiations, struggles, and conflicts in Urban Governance about the question how to manage or lead (in) a city. Talking about...

  11. Cooperativity and saturation in biochemical networks: a saturable formalism using Taylor series approximations.

    Science.gov (United States)

    Sorribas, Albert; Hernández-Bermejo, Benito; Vilaprinyo, Ester; Alves, Rui

    2007-08-01

    Cooperative and saturable systems are common in molecular biology. Nevertheless, common canonical formalisms for kinetic modeling that are theoretically well justified do not have a saturable form. Modeling and fitting data from saturable systems are widely done using Hill-like equations. In practice, there is no theoretical justification for the generalized use of these equations, other than their ability to fit experimental data. Thus it is important to find a canonical formalism that is (a) theoretically well supported, (b) has a saturable functional form, and (c) can be justifiably applicable to any biochemical network. Here we derive such a formalism using Taylor approximations in a special transformation space defined by power-inverses and logarithms of power-inverses. This formalism is generalized for processes with n-variables, leading to a useful mathematical representation for molecular biology: the Saturable and Cooperative Formalism (SC formalism). This formalism provides an appropriate representation that can be used for modeling processes with cooperativity and saturation. We also show that the Hill equation can be seen as a special case within this formalism. Parameter estimation for the SC formalism requires information that is also necessary to build Power-Law models, Metabolic Control Analysis descriptions or (log)linear and Lin-log models. In addition, the saturation fraction of the relevant processes at the operating point needs to be considered. The practical use of the SC formalism for modeling is illustrated with a few examples. Similar models are built using different formalisms and compared to emphasize advantages and limitations of the different approaches.
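
    The SC formalism itself is not reproduced here. Purely as context for the saturation behaviour discussed above, the snippet below fits the Hill-like rate law (the special case mentioned in the abstract) to synthetic data; all numbers are made up.

```python
# Fit a Hill-type saturable, cooperative rate law to synthetic data.
import numpy as np
from scipy.optimize import curve_fit

def hill(s, vmax, k, n):
    """v = Vmax * s^n / (K^n + s^n): saturable and, for n > 1, cooperative."""
    return vmax * s**n / (k**n + s**n)

s = np.linspace(0.1, 10.0, 30)
v = hill(s, 2.0, 3.0, 2.5) + 0.02 * np.random.default_rng(1).normal(size=s.size)
(vmax, k, n), _ = curve_fit(hill, s, v, p0=(1.0, 1.0, 1.0), bounds=(0, np.inf))
print(round(vmax, 2), round(k, 2), round(n, 2))   # recovers roughly 2.0, 3.0, 2.5
```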

  12. Morphometric analyses of hominoid crania, probabilities of conspecificity and an approximation of a biological species constant.

    Science.gov (United States)

    Thackeray, J F; Dykes, S

    2016-02-01

    Thackeray has previously explored the possibility of using a morphometric approach to quantify the "amount" of variation within species and to assess probabilities of conspecificity when two fossil specimens are compared, instead of "pigeon-holing" them into discrete species. In an attempt to obtain a statistical (probabilistic) definition of a species, Thackeray has recognized an approximation of a biological species constant (T=-1.61) based on the log-transformed standard error of the coefficient m (log sem) in regression analysis of cranial and other data from pairs of specimens of conspecific extant species, associated with regression equations of the form y=mx+c where m is the slope and c is the intercept, using measurements of any specimen A (x axis), and any specimen B of the same species (y axis). The log-transformed standard error of the co-efficient m (log sem) is a measure of the degree of similarity between pairs of specimens, and in this study shows central tendency around a mean value of -1.61 and standard deviation 0.10 for modern conspecific specimens. In this paper we focus attention on the need to take into account the range of difference in log sem values (Δlog sem or "delta log sem") obtained from comparisons when specimen A (x axis) is compared to B (y axis), and secondly when specimen A (y axis) is compared to B (x axis). Thackeray's approach can be refined to focus on high probabilities of conspecificity for pairs of specimens for which log sem is less than -1.61 and for which Δlog sem is less than 0.03. We appeal for the adoption of a concept here called "sigma taxonomy" (as opposed to "alpha taxonomy"), recognizing that boundaries between species are not always well defined. Copyright © 2015 Elsevier GmbH. All rights reserved.
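
    The statistic is simple to reproduce on hypothetical data: regress the measurements of one specimen on the other, take the base-10 logarithm of the standard error of the slope, and repeat with the axes swapped to obtain Δlog sem. The sketch below uses invented measurements; the thresholds quoted in the comment are the ones stated in the abstract.

```python
# Compute "log sem" in both regression directions for two specimens.
import numpy as np
from scipy import stats

def log_sem(x, y):
    res = stats.linregress(x, y)          # fits y = m*x + c
    return np.log10(res.stderr)           # stderr is the standard error of m

rng = np.random.default_rng(0)
spec_a = rng.uniform(20, 80, size=12)            # 12 cranial measurements (mm), invented
spec_b = spec_a * 1.03 + rng.normal(0, 1.0, 12)  # a conspecific-like second specimen

ab, ba = log_sem(spec_a, spec_b), log_sem(spec_b, spec_a)
print(round(ab, 2), round(ba, 2), round(abs(ab - ba), 3))
# conspecificity is argued when log sem < -1.61 and Delta log sem < 0.03
```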

  13. Quasi-greedy triangulations approximating the minimum weight triangulation

    Energy Technology Data Exchange (ETDEWEB)

    Levcopoulos, C.; Krznaric, D. [Lund Univ. (Sweden)]

    1996-12-31

    This paper settles the following two open problems: (1) What is the worst-case approximation ratio between the greedy and the minimum weight triangulation? (2) Is there a polynomial time algorithm that always produces a triangulation whose length is within a constant factor from the minimum? The answer to the first question is that the known Ω(√n) lower bound is tight. The second question is answered in the affirmative by using a slight modification of an O(n log n) algorithm for the greedy triangulation. We also derive some other interesting results. For example, we show that a constant-factor approximation of the minimum weight convex partition can be obtained within the same time bounds.

  14. Approximating the Minimum Tour Cover of a Digraph

    Directory of Open Access Journals (Sweden)

    Viet Hung Nguyen

    2011-04-01

    Given a directed graph G with non-negative cost on the arcs, a directed tour cover T of G is a cycle (not necessarily simple) in G such that either head or tail (or both of them) of every arc in G is touched by T. The minimum directed tour cover problem (DToCP), which is to find a directed tour cover of minimum cost, is NP-hard. It is thus interesting to design approximation algorithms with performance guarantee to solve this problem. Although its undirected counterpart (ToCP) has been studied in recent years, to our knowledge the DToCP remains widely open. In this paper, we give a 2 log2(n)-approximation algorithm for the DToCP.

  15. Development of a log-quadratic model to describe microbial inactivation, illustrated by thermal inactivation of Clostridium botulinum.

    Science.gov (United States)

    Stone, G; Chapman, B; Lovell, D

    2009-11-01

    In the commercial food industry, demonstration of microbiological safety and thermal process equivalence often involves a mathematical framework that assumes log-linear inactivation kinetics and invokes concepts of decimal reduction time (D(T)), z values, and accumulated lethality. However, many microbes, particularly spores, exhibit inactivation kinetics that are not log linear. This has led to alternative modeling approaches, such as the biphasic and Weibull models, that relax strong log-linear assumptions. Using a statistical framework, we developed a novel log-quadratic model, which approximates the biphasic and Weibull models and provides additional physiological interpretability. As a statistical linear model, the log-quadratic model is relatively simple to fit and straightforwardly provides confidence intervals for its fitted values. It allows a D(T)-like value to be derived, even from data that exhibit obvious "tailing." We also showed how existing models of non-log-linear microbial inactivation, such as the Weibull model, can fit into a statistical linear model framework that dramatically simplifies their solution. We applied the log-quadratic model to thermal inactivation data for the spore-forming bacterium Clostridium botulinum and evaluated its merits compared with those of popular previously described approaches. The log-quadratic model was used as the basis of a secondary model that can capture the dependence of microbial inactivation kinetics on temperature. This model, in turn, was linked to models of spore inactivation of Sapru et al. and Rodriguez et al. that posit different physiological states for spores within a population. We believe that the log-quadratic model provides a useful framework in which to test vitalistic and mechanistic hypotheses of inactivation by thermal and other processes.
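
    As a small illustration of why the model stays easy to fit (it is linear in its parameters even though it is quadratic in time), the sketch below fits log10 N(t) = a + b t + c t² to invented survivor counts by ordinary least squares; the D(T)-like reading at the end follows the abstract only loosely.

```python
# Fit a log-quadratic survival curve: linear in (a, b, c), so plain least squares works.
import numpy as np

t = np.array([0, 2, 4, 6, 8, 10, 12], dtype=float)      # minutes at temperature (toy data)
logN = np.array([6.0, 4.9, 4.0, 3.3, 2.8, 2.5, 2.3])    # log10 survivors, showing "tailing"

c, b, a = np.polyfit(t, logN, deg=2)        # polyfit returns highest power first
print(f"a={a:.2f}, b={b:.2f} per min, c={c:.4f} per min^2")
# a D_T-like value can be read off as the initial time for a 1-log10 drop:
print("initial 1-log time ~", round(-1.0 / b, 2), "min")
```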

  16. Space-Time Approximation with Sparse Grids

    Energy Technology Data Exchange (ETDEWEB)

    Griebel, M; Oeltz, D; Vassilevski, P S

    2005-04-14

    In this article we introduce approximation spaces for parabolic problems which are based on the tensor product construction of a multiscale basis in space and a multiscale basis in time. Proper truncation then leads to so-called space-time sparse grid spaces. For a uniform discretization of the spatial space of dimension d with O(N^d) degrees of freedom, these spaces involve for d > 1 also only O(N^d) degrees of freedom for the discretization of the whole space-time problem. But they provide the same approximation rate as classical space-time Finite Element spaces which need O(N^{d+1}) degrees of freedom. This makes these approximation spaces well suited for conventional parabolic and for time-dependent optimization problems. We analyze the approximation properties and the dimension of these sparse grid space-time spaces for general stable multiscale bases. We then restrict ourselves to an interpolatory multiscale basis, i.e. a hierarchical basis. Here, to be able to handle also complicated spatial domains Ω, we construct the hierarchical basis from a given spatial Finite Element basis as follows: First we determine coarse grid points recursively over the levels by the coarsening step of the algebraic multigrid method. Then, we derive interpolatory prolongation operators between the respective coarse and fine grid points by a least squares approach. This way we obtain an algebraic hierarchical basis for the spatial domain which we then use in our space-time sparse grid approach. We give numerical results on the convergence rate of the interpolation error of these spaces for various space-time problems with two spatial dimensions. Also implementational issues, data structures and questions of adaptivity are addressed to some extent.

  17. Efficient and Robust Signal Approximations

    Science.gov (United States)

    2009-05-01

    frenzy. The representational advantage and the low computational cost of applying the Discrete Wavelet Transform have led to the design of current...107] has been shown to produce a representation that is relevant to modeling the auditory nerve. However, the spikes in the representation are computed

  18. Rollout Sampling Approximate Policy Iteration

    CERN Document Server

    Dimitrakakis, Christos

    2008-01-01

    Several researchers have recently investigated the connection between reinforcement learning and classification. We are motivated by proposals of approximate policy iteration schemes without value functions which focus on policy representation using classifiers and address policy learning as a supervised learning problem. This paper proposes variants of an improved policy iteration scheme which addresses the core sampling problem in evaluating a policy through simulation as a multi-armed bandit machine. The resulting algorithm offers performance comparable to that of the previous algorithm, achieved, however, with significantly less computational effort. An order of magnitude improvement is demonstrated experimentally in two standard reinforcement learning domains: inverted pendulum and mountain-car.

  19. Approximate Deconvolution Reduced Order Modeling

    CERN Document Server

    Xie, Xuping; Wang, Zhu; Iliescu, Traian

    2015-01-01

    This paper proposes a large eddy simulation reduced order model (LES-ROM) framework for the numerical simulation of realistic flows. In this LES-ROM framework, the proper orthogonal decomposition (POD) is used to define the ROM basis and a POD differential filter is used to define the large ROM structures. An approximate deconvolution (AD) approach is used to solve the ROM closure problem and develop a new AD-ROM. This AD-ROM is tested in the numerical simulation of the one-dimensional Burgers equation with a small diffusion coefficient (10^{-3})

  20. C# Log File Removal Function

    Institute of Scientific and Technical Information of China (English)

    刘德军

    2014-01-01

    A log file records system operation events: an operating system has operating system log files, and a database system has database system log files. A system log file contains system messages, covering the kernel, services, and the applications running on the system; different log files record different information. This paper addresses a prominent problem in hospital information systems, where application log files accumulate over long periods until disk space is exhausted, and proposes a method for clearing the specific log files involved.
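
    The article's implementation is in C# and is not reproduced here; the snippet below is only a generic Python sketch of the same housekeeping idea (delete application log files older than a retention window). The directory, file pattern, and retention period are placeholders.

```python
# Remove log files older than a retention window to keep the disk from filling up.
import os
import time

def purge_old_logs(log_dir, suffix=".log", max_age_days=30, dry_run=True):
    """Return the paths that would be (or were) deleted."""
    cutoff = time.time() - max_age_days * 86400
    removed = []
    for name in os.listdir(log_dir):
        path = os.path.join(log_dir, name)
        if name.endswith(suffix) and os.path.isfile(path) and os.path.getmtime(path) < cutoff:
            if not dry_run:
                os.remove(path)              # actually delete only outside dry-run mode
            removed.append(path)
    return removed

print(purge_old_logs(".", max_age_days=30, dry_run=True))
```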

  1. Rill erosion in burned and salvage logged western montane forests: Effects of logging equipment type, traffic level, and slash treatment

    Science.gov (United States)

    Wagenbrenner, J. W.; Robichaud, P. R.; Brown, R. E.

    2016-10-01

    Following wildfires, forest managers often consider salvage logging burned trees to recover monetary value of timber, reduce fuel loads, or to meet other objectives. Relatively little is known about the cumulative hydrologic effects of wildfire and subsequent timber harvest using logging equipment. We used controlled rill experiments in logged and unlogged (control) forests burned at high severity in northern Montana, eastern Washington, and southern British Columbia to quantify rill overland flow and sediment production rates (fluxes) after ground-based salvage logging. We tested different types of logging equipment-feller-bunchers, tracked and wheeled skidders, and wheeled forwarders-as well as traffic levels and the addition of slash to skid trails as a best management practice. Rill experiments were done at each location in the first year after the fire and repeated in subsequent years. Logging was completed in the first or second post-fire year. We found that ground-based logging using heavy equipment compacted soil, reduced soil water repellency, and reduced vegetation cover. Vegetation recovery rates were slower in most logged areas than the controls. Runoff rates were higher in the skidder and forwarder plots than their respective controls in the Montana and Washington sites in the year that logging occurred, and the difference in runoff between the skidder and control plots at the British Columbia site was nearly significant (p = 0.089). Most of the significant increases in runoff in the logged plots persisted for subsequent years. The type of skidder, the addition of slash, and the amount of forwarder traffic did not significantly affect the runoff rates. Across the three sites, rill sediment fluxes were 5-1900% greater in logged plots than the controls in the year of logging, and the increases were significant for all logging treatments except the low use forwarder trails. There was no difference in the first-year sediment fluxes between the feller

  2. Plasma Physics Approximations in Ares

    Energy Technology Data Exchange (ETDEWEB)

    Managan, R. A.

    2015-01-08

    Lee & More derived analytic forms for the transport properties of a plasma. Many hydro-codes use their formulae for electrical and thermal conductivity. The coefficients are complex functions of Fermi-Dirac integrals, F_n(μ/θ), the chemical potential, μ or ζ = ln(1 + e^{μ/θ}), and the temperature, θ = kT. Since these formulae are expensive to compute, rational function approximations were fit to them. Approximations are also used to find the chemical potential, either μ or ζ. The fits use ζ as the independent variable instead of μ/θ. New fits are provided for A_α(ζ), A_β(ζ), ζ, f(ζ) = (1 + e^{-μ/θ})F_{1/2}(μ/θ), F'_{1/2}/F_{1/2}, F_{cα}, and F_{cβ}. In each case the relative error of the fit is minimized since the functions can vary by many orders of magnitude. The new fits are designed to exactly preserve the limiting values in the non-degenerate and highly degenerate limits or as ζ → 0 or ∞. The original fits due to Lee & More and George Zimmerman are presented for comparison.

  3. Rational approximations to fluid properties

    Science.gov (United States)

    Kincaid, J. M.

    1990-05-01

    The purpose of this report is to summarize some results that were presented at the Spring AIChE meeting in Orlando, Florida (20 March 1990). We report on recent attempts to develop a systematic method, based on the technique of rational approximation, for creating mathematical models of real-fluid equations of state and related properties. Equation-of-state models for real fluids are usually created by selecting a function p̃(T,ρ) that contains a set of parameters {γ_i}; the {γ_i} are chosen such that p̃(T,ρ) provides a good fit to the experimental data. (Here p is the pressure, T the temperature and ρ is the density.) In most cases, a nonlinear least-squares numerical method is used to determine {γ_i}. There are several drawbacks to this method: one has essentially to guess what p̃(T,ρ) should be; the critical region is seldom fit very well; and nonlinear numerical methods are time consuming and sometimes not very stable. The rational approximation approach we describe may eliminate all of these drawbacks. In particular, it lets the data choose the function p̃(T,ρ), and its numerical implementation involves only linear algorithms.

  4. The Power of the Log Rank Test

    Institute of Scientific and Technical Information of China (English)

    朱斌; 王曙炎; 赵国龙

    2006-01-01

    Objective: The log rank test is the standard method for comparing survival data, but there is no matching method for determining sample size. This paper examines the power of the test to provide a basis for sample size research. Methods: The expected power calculated by the Lachin-Foulkes method was used as a reference; three forms of the log rank test were reviewed, their observed power was calculated by the Monte Carlo method, and the results were compared. Results: The observed power was lower than the expected power in most trial sets. Compared with the upper half of the life table, both the expected and the observed power were lower in the lower half. The observed power differed across censoring levels and survival distributions. Conclusion: The sample sizes produced by the Lachin-Foulkes method are too small to achieve the intended power of the log rank test. The sample size required by the log rank test varies with the censoring level, survival time, and survival distribution; the Lachin-Foulkes method ignores these facts and cannot give realistic determinations. A sample size determination method matched to this test is therefore needed.
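
    The Monte Carlo notion of observed power can be illustrated in a few lines: simulate survival data under a given hazard ratio and censoring pattern, apply the two-sample log rank test, and record the rejection rate. The scenario below (exponential survival, uniform censoring, 60 patients per arm, hazard ratio 1.8) is invented for illustration and is not taken from the paper; the Lachin-Foulkes comparison is not reproduced.

```python
# Monte Carlo estimate of the observed power of the two-sample log-rank test.
import numpy as np

def logrank_chi2(time, event, group):
    """Standard two-group log-rank chi-square statistic (1 degree of freedom)."""
    o1 = e1 = v = 0.0
    for t in np.unique(time[event == 1]):            # distinct event times
        at_risk = time >= t
        n, n1 = at_risk.sum(), (at_risk & (group == 1)).sum()
        d = ((time == t) & (event == 1)).sum()
        d1 = ((time == t) & (event == 1) & (group == 1)).sum()
        o1 += d1
        e1 += d * n1 / n
        if n > 1:
            v += d * (n1 / n) * (1 - n1 / n) * (n - d) / (n - 1)
    return (o1 - e1) ** 2 / v

def observed_power(hazard_ratio, n_per_arm=60, n_sim=1000, seed=0):
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(n_sim):
        t0 = rng.exponential(1.0, n_per_arm)                 # control arm
        t1 = rng.exponential(1.0 / hazard_ratio, n_per_arm)  # treatment arm (higher hazard)
        cens = rng.uniform(0.5, 3.0, 2 * n_per_arm)          # administrative censoring
        t = np.concatenate([t0, t1])
        time = np.minimum(t, cens)
        event = (t <= cens).astype(int)
        group = np.repeat([0, 1], n_per_arm)
        rejections += logrank_chi2(time, event, group) > 3.841   # chi2(1), alpha = 0.05
    return rejections / n_sim

print(observed_power(hazard_ratio=1.8))
```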

  5. Aespoe Hard Rock Laboratory. BIPS logging in borehole KAS09

    Energy Technology Data Exchange (ETDEWEB)

    Gustafsson, Jaana; Gustafsson, Christer (Malaa Geoscience AB (Sweden))

    2010-01-15

    This report includes the data gained in BIPS logging performed at the Aespoe Hard Rock Laboratory. The logging operation presented here includes BIPS logging in the core drilled borehole KAS09. The objective for the BIPS logging was to observe the condition of KAS09 in order to restore the borehole in the hydrogeological monitoring programme. All measurements were conducted by Malaa Geoscience AB on October 9th 2009. The objective of the BIPS logging is to obtain information on the borehole, including the occurrence of rock types as well as the fracture distribution and orientation. This report describes the equipment used as well as the measurement procedures and data gained. For the BIPS survey, the result is presented as images. The basic conditions of the BIPS logging for geological mapping and orientation of structures are satisfying for borehole KAS09, although induced effects from the drilling on the borehole walls limit the visibility.

  6. Penerapan Reduced Impact Logging Menggunakan Monocable Winch (Pancang Tarik) (Implementing Reduced Impact Logging with Monocable Winch)

    Directory of Open Access Journals (Sweden)

    Yosep Ruslim

    2012-01-01

    Forest harvesting still encounters many problems, especially concerning impacts on the residual stand and environmental damage. Implementing the reduced impact monocable winch and planning good skid trails should have a positive impact on work efficiency, as well as reducing damage to the residual stand and soil during felling and skidding activities. Reduced impact logging (RIL) with a monocable winch (Pancang Tarik) system has been tried in several IUPHHKs, and it can be concluded that the RIL monocable winch system can be applied practically and reduces impacts on the residual stand and soil. Using this technology has many advantages, among others: cost efficiency, local manufacture, environmental friendliness, and high local community participation. Application of the monocable winch system in reduced impact logging is an effort to reduce economic and environmental damage compared to the conventional system of ground-based skidding with bulldozers. The aim of this research is to verify the efficiency (operational cost) and effectiveness (productivity and time consumption) of the monocable winch system. The results indicate that implementation of the monocable winch system reduced soil damage by as much as 8% ha-1. The skidding cost with the monocable system is Rp95,000 m-3. This figure is significantly cheaper compared with ground-based skidding with bulldozers, for which the skidding cost is around Rp165,000 m-3. Keywords: monocable winch, productivity, skidding cost, reduced impact logging, local community

  7. Volcanic stratigraphy of DSDP/ODP Hole 395A: An interpretation using well-logging data

    Science.gov (United States)

    Bartetzko, Anne; Pezard, Philippe; Goldberg, David; Sun, Yue-Feng; Becker, Keir

    2001-03-01

    Deep Sea Drilling Project/Ocean Drilling Program Hole 395A was drilled approximately 500 m deep into young oceanic crust west of the Mid-Atlantic Ridge. Core recovery is very poor in this hole and therefore continuous downhole measurements are important to understand the drilled lithology. Geophysical downhole measurements were carried out during several cruises. A new set of logs was recorded during Leg 174B in summer 1997. The new logging data show a significant improvement in data quality compared to older measurements from Leg 109. The lithostratigraphy established from cores gives only limited information because of the poor core recovery in this hole. The gaps in the core lithostratigraphy are filled by reconstructing a synthetic lithological profile using the standard well-logging data. Three types of lava morphologies, massive basalts, altered lava flows, and pillow basalts, may be distinguished using the logs because the lava morphologies show differences in their physical properties due to differences in fracturing and alteration. The synthetic lithological profile gives a more detailed and precise vertical definition of single layers than the core profile. The integration of further logging and core data enables a detailed reconstruction of the accretion history at the drill site. Cyclic, upward decreasing trends in the resistivity logs were already observed during earlier cruises and were referred to magmatic cycles. Similar trends occur in the density log and, inversely, in the total gamma ray log. The trends reflect gradual changes in fracturing, porosity, permeability, and alteration and cover depth intervals of several tens of meters. Boundaries between cycles are interpreted to correspond to periods of volcanic quiescence. Two types of boundaries may be identified. Boundaries correlating with reversals in the magnetic field and/or changes in the geochemical composition of the basalts are interpreted as long pauses. Basalts separated by these

  8. Development of a Single-Borehole Radar for Well Logging

    OpenAIRE

    Zheng-ou Zhou; Qing Zhao; Haining Yang; Tingjun Li

    2012-01-01

    An impulse-based single-borehole radar prototype has been developed for well logging. The borehole radar comprises a subsurface sonde and surface equipment. An armored 7-conductor well logging cable, which is fully compatible with other well logging instruments, connects the subsurface sonde and the surface equipment. The performance experiments of the prototype have been conducted in a test field. The results show that the prototype system is capable of detecting the target which is 8...

  9. Western tight gas sands advanced logging workshop proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Jennings, J B; Carroll, Jr, H B [eds.

    1982-04-01

    An advanced logging research program is one major aspect of the Western Tight Sands Program. Purpose of this workshop is to help BETC define critical logging needs for tight gas sands and to allow free interchange of ideas on all aspects of the current logging research program. Sixteen papers and abstracts are included together with discussions. Separate abstracts have been prepared for the 12 papers. (DLC)

  10. Mining Interesting Knowledge from Web-Log

    Institute of Scientific and Technical Information of China (English)

    ZHOU Hong-fang; FENG Bo-qin; HEI Xin-hong; LU Lin-tao

    2004-01-01

    Web logs contain a lot of information related to user activities on the Internet. How to mine user browsing interest patterns effectively is an important and challenging research topic. Based on an analysis of the advantages and disadvantages of existing algorithms, we propose a new concept: support-interest. Its key insight is that visitors will backtrack if they do not find the information where they expect it, and the point from which they backtrack is the expected location for the page. We present the User Access Matrix and the corresponding algorithm for discovering such expected locations, which can handle page caching by the browser. Since the URL-URL matrix is a sparse matrix that can be represented by a list of 3-tuples, we can mine user preferred sub-paths from the computation of this matrix. Accordingly, all the sub-paths are merged, and user preferred paths are formed. Experiments showed that the method is accurate and scalable, and it is suitable for website-based applications, such as optimizing a website's topological structure or designing personalized services.
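
    A toy version of the backtrack heuristic described above is easy to state: whenever the visitor leaves a page and immediately returns via the back button, that page is recorded as a candidate; the first page reached afterwards without backtracking is treated as the target, and each candidate becomes an expected location for it. The sketch below is illustrative only and does not build the User Access Matrix or handle browser caching.

```python
# Detect backtrack points in one click path and record (expected_location, target) pairs.
from collections import Counter

def expected_locations(session):
    pairs = Counter()
    abandoned = []                                    # pages left via the back button
    for i in range(1, len(session)):
        if i + 1 < len(session) and session[i + 1] == session[i - 1]:
            abandoned.append(session[i])              # visitor backed out of this page
        elif i >= 2 and session[i] == session[i - 2]:
            continue                                  # this hit is the back-button return itself
        elif abandoned:
            for page in abandoned:
                pairs[(page, session[i])] += 1        # visitor expected the target at `page`
            abandoned = []
    return pairs

clicks = ["home", "products", "home", "support", "home", "faq"]
print(expected_locations(clicks))
# Counter({('products', 'faq'): 1, ('support', 'faq'): 1}): the visitor expected
# the FAQ under /products and /support before finally finding it via /home.
```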

  11. Efficient Preprocessing technique using Web log mining

    Science.gov (United States)

    Raiyani, Sheetal A.; jain, Shailendra

    2012-11-01

    Web usage mining can be described as the discovery and analysis of user access patterns through mining of log files and associated data from a particular website. Large numbers of visitors interact daily with web sites around the world; enormous amounts of data are being generated, and this information can be very valuable to a company for understanding customer behaviour. This paper presents a complete preprocessing approach comprising data cleaning and user and session identification activities to improve the quality of the data. User identification, a key issue in the preprocessing phase, aims to identify unique web users. Traditional user identification is based on the site structure, supported by some heuristic rules, which reduces the efficiency of user identification. To address this difficulty we introduce a proposed technique, DUI (Distinct User Identification), based on IP address, agent, session time, and referred pages within the desired session time. It can be used in counter-terrorism, fraud detection, and detection of unusual access to secure data, and detection of users' regular access behaviour can improve the overall design and performance of subsequent preprocessing results.
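
    A hypothetical sketch of the distinct-user-identification step described above: group raw log records by (IP address, user agent) and start a new session whenever the gap between requests exceeds a timeout. Field names and the 30-minute threshold are assumptions for illustration, not the paper's parameters.

```python
# Group web-log records into users by (IP, agent), then split each user's hits into sessions.
from collections import defaultdict
from datetime import datetime, timedelta

SESSION_TIMEOUT = timedelta(minutes=30)   # assumed threshold

def identify_users_and_sessions(records):
    """records: iterable of dicts with 'ip', 'agent', 'time', 'url'."""
    users = defaultdict(list)                         # (ip, agent) -> time-ordered hits
    for r in sorted(records, key=lambda r: r["time"]):
        users[(r["ip"], r["agent"])].append(r)

    sessions = []
    for user, hits in users.items():
        current = [hits[0]]
        for prev, cur in zip(hits, hits[1:]):
            if cur["time"] - prev["time"] > SESSION_TIMEOUT:
                sessions.append((user, current))      # close the session
                current = []
            current.append(cur)
        sessions.append((user, current))
    return sessions

log = [
    {"ip": "10.0.0.1", "agent": "Mozilla", "time": datetime(2012, 1, 1, 9, 0), "url": "/a"},
    {"ip": "10.0.0.1", "agent": "Mozilla", "time": datetime(2012, 1, 1, 9, 10), "url": "/b"},
    {"ip": "10.0.0.1", "agent": "Mozilla", "time": datetime(2012, 1, 1, 11, 0), "url": "/a"},
    {"ip": "10.0.0.1", "agent": "Opera",   "time": datetime(2012, 1, 1, 9, 5), "url": "/a"},
]
print(len(identify_users_and_sessions(log)))   # 2 distinct users, 3 sessions in total
```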

  12. Real Time Face Quality Assessment for Face Log Generation

    DEFF Research Database (Denmark)

    Kamal, Nasrollahi; Moeslund, Thomas B.

    2009-01-01

    Summarizing a long surveillance video to just a few best quality face images of each subject, a face-log, is of great importance in surveillance systems. Face quality assessment is the backbone for face log generation, and improving the quality assessment makes the face logs more reliable. Developing a real time face quality assessment system using the most important facial features and employing it for face log generation are the concerns of this paper. Extensive tests using four databases are carried out to validate the usability of the system.

  13. Rock mass characterization for Copenhagen Metro using face logs

    DEFF Research Database (Denmark)

    Hansen, Sanne Louise; Galsgaard, Jens; Foged, Niels Nielsen

    2015-01-01

    of relevant rock mass properties for tunnelling in Danian limestone has previously been difficult, as core logging shows a high degree of induced fracturing and core loss due to drilling disturbance, with an underestimation of the RQD values, and other rock mass properties, compared to face logging. However......, describing rock mass characteristics using detailed face logging with geological description and recording of induration and fracturing, giving a field RQD value during excavation, combined with televiewer logs, when available, has shown to be a valuable tool for rock mass characterization compared...

  14. Core-log integration for a Saudi Arabian sandstone reservoir

    Energy Technology Data Exchange (ETDEWEB)

    Saha, S.; Al-Kaabi, A.U.; Amabeoku, M.O.; Al-Fossail, K.

    1995-10-01

    For a detailed characterization of a reservoir, core-log integration is essential. In this paper, data integration from logs and cores of a Saudi Arabian sandstone reservoir is discussed with particular attention to effects of clay on resistivity logs and water saturation. There are four sources of data, namely, core resistivity measurements, clay studies from cores (XRD, CEC), spectral core gamma ray, and well logs. In order to generate continuous cation exchange capacity (CEC) with depth, spectral gamma ray measurements (both from core and downhole log) and CEC from cores are correlated. Qv (CEC per unit pore volume) values are calculated utilizing only well logs by applying the Waxman-Smits equation in the water bearing zone. Log-derived Qv values from the water zone were then correlated with porosity to generate Qv values in the oil column and compared with core-derived Qv. Finally, data from well logs (porosity, resistivity and Qv) and cores (resistivity parameters m, n, and Qv) were integrated for more accurate water saturation calculation. The core-log correlation can be applied to other wells, avoiding expensive core analysis, and the technique developed in this project can be used in other sandstone reservoirs.

  15. Correlation and persistence of hunting and logging impacts on tropical rainforest mammals.

    Science.gov (United States)

    Brodie, Jedediah F; Giordano, Anthony J; Zipkin, Elise F; Bernard, Henry; Mohd-Azlan, Jayasilan; Ambu, Laurentius

    2015-02-01

    Humans influence tropical rainforest animals directly via exploitation and indirectly via habitat disturbance. Bushmeat hunting and logging occur extensively in tropical forests and have large effects on particular species. But how they alter animal diversity across landscape scales and whether their impacts are correlated across species remain less known. We used spatially widespread measurements of mammal occurrence across Malaysian Borneo and recently developed multispecies hierarchical models to assess the species richness of medium- to large-bodied terrestrial mammals while accounting for imperfect detection of all species. Hunting was associated with 31% lower species richness. Moreover, hunting remained high even where richness was very low, highlighting that hunting pressure persisted even in chronically overhunted areas. Newly logged sites had 11% lower species richness than unlogged sites, but sites logged >10 years previously had richness levels similar to those in old-growth forest. Hunting was a more serious long-term threat than logging for 91% of primate and ungulate species. Hunting and logging impacts across species were not correlated across taxa. Negative impacts of hunting were the greatest for common mammalian species, but commonness versus rarity was not related to species-specific impacts of logging. Direct human impacts appeared highly persistent and lead to defaunation of certain areas. These impacts were particularly severe for species of ecological importance as seed dispersers and herbivores. Indirect impacts were also strong but appeared to attenuate more rapidly than previously thought. The lack of correlation between direct and indirect impacts across species highlights that multifaceted conservation strategies may be needed for mammal conservation in tropical rainforests, Earth's most biodiverse ecosystems.

  16. Exact Hypothesis Tests for Log-linear Models with exactLoglinTest

    Directory of Open Access Journals (Sweden)

    Brian Caffo

    2006-11-01

    Full Text Available This manuscript overviews exact testing of goodness of fit for log-linear models using the R package exactLoglinTest. This package evaluates model fit for Poisson log-linear models by conditioning on minimal sufficient statistics to remove nuisance parameters. A Monte Carlo algorithm is proposed to estimate P values from the resulting conditional distribution. In particular, this package implements a sequentially rounded normal approximation and importance sampling to approximate probabilities from the conditional distribution. Usually, this results in a high percentage of valid samples. However, in instances where this is not the case, a Metropolis Hastings algorithm can be implemented that makes more localized jumps within the reference set. The manuscript details how some conditional tests for binomial logit models can also be viewed as conditional Poisson log-linear models and hence can be performed via exactLoglinTest. A diverse battery of examples is considered to highlight use, features and extensions of the software. Notably, potential extensions to evaluating disclosure risk are also considered.
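
    The sketch below is not exactLoglinTest itself (that is an R package); it is a toy Python illustration of the underlying idea for the simplest case: testing independence in a two-way table by conditioning on the margins, with the conditional p-value of the deviance estimated by Monte Carlo label permutation.

      # Toy Monte Carlo "exact" test of independence for a two-way table:
      # condition on the row/column margins (the sufficient statistics) by permuting
      # individual-level column labels, and estimate the p-value of the deviance G2.
      import numpy as np

      def deviance(table):
          table = np.asarray(table, dtype=float)
          expected = np.outer(table.sum(1), table.sum(0)) / table.sum()
          mask = table > 0
          return 2.0 * np.sum(table[mask] * np.log(table[mask] / expected[mask]))

      def mc_independence_test(table, n_sim=5000, seed=0):
          rng = np.random.default_rng(seed)
          table = np.asarray(table, dtype=int)
          rows = np.repeat(np.arange(table.shape[0]), table.sum(1))
          cols = np.repeat(np.arange(table.shape[1]), table.sum(0))
          g2_obs = deviance(table)
          hits = 0
          for _ in range(n_sim):
              perm = rng.permutation(cols)          # preserves both margins exactly
              sim = np.zeros_like(table)
              np.add.at(sim, (rows, perm), 1)
              hits += deviance(sim) >= g2_obs
          return g2_obs, (hits + 1) / (n_sim + 1)

      g2, p = mc_independence_test([[12, 5], [3, 14]])
      print(f"G2 = {g2:.2f}, Monte Carlo p = {p:.4f}")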

  17. Dodgson's Rule Approximations and Absurdity

    CERN Document Server

    McCabe-Dansted, John C

    2010-01-01

    With the Dodgson rule, cloning the electorate can change the winner, which Young (1977) considers an "absurdity". Removing this absurdity results in a new rule (Fishburn, 1977) for which we can compute the winner in polynomial time (Rothe et al., 2003), unlike the traditional Dodgson rule. We call this rule DC and introduce two new related rules (DR and D&). Dodgson did not explicitly propose the "Dodgson rule" (Tideman, 1987); we argue that DC and DR are better realizations of the principle behind the Dodgson rule than the traditional Dodgson rule. These rules, especially D&, are also effective approximations to the traditional Dodgson's rule. We show that, unlike the rules we have considered previously, the DC, DR and D& scores differ from the Dodgson score by no more than a fixed amount given a fixed number of alternatives, and thus these new rules converge to Dodgson under any reasonable assumption on voter behaviour, including the Impartial Anonymous Culture assumption.

  18. Approximation by double Walsh polynomials

    Directory of Open Access Journals (Sweden)

    Ferenc Móricz

    1992-01-01

    Full Text Available We study the rate of approximation by rectangular partial sums, Cesàro means, and de la Vallée Poussin means of double Walsh-Fourier series of a function in a homogeneous Banach space X. In particular, X may be Lp(I²), where 1≦p<∞ and I²=[0,1]×[0,1], or CW(I²), the latter being the collection of uniformly W-continuous functions on I². We extend the results by Watari, Fine, Yano, Jastrebova, Bljumin, Esfahanizadeh and Siddiqi from univariate to multivariate cases. As by-products, we deduce sufficient conditions for convergence in Lp(I²)-norm and uniform convergence on I² as well as characterizations of Lipschitz classes of functions. At the end, we raise three problems.

  19. Interplay of approximate planning strategies.

    Science.gov (United States)

    Huys, Quentin J M; Lally, Níall; Faulkner, Paul; Eshel, Neir; Seifritz, Erich; Gershman, Samuel J; Dayan, Peter; Roiser, Jonathan P

    2015-03-10

    Humans routinely formulate plans in domains so complex that even the most powerful computers are taxed. To do so, they seem to avail themselves of many strategies and heuristics that efficiently simplify, approximate, and hierarchically decompose hard tasks into simpler subtasks. Theoretical and cognitive research has revealed several such strategies; however, little is known about their establishment, interaction, and efficiency. Here, we use model-based behavioral analysis to provide a detailed examination of the performance of human subjects in a moderately deep planning task. We find that subjects exploit the structure of the domain to establish subgoals in a way that achieves a nearly maximal reduction in the cost of computing values of choices, but then combine partial searches with greedy local steps to solve subtasks, and maladaptively prune the decision trees of subtasks in a reflexive manner upon encountering salient losses. Subjects come idiosyncratically to favor particular sequences of actions to achieve subgoals, creating novel complex actions or "options."

  20. Approximate reduction of dynamical systems

    CERN Document Server

    Tabuada, Paulo; Julius, Agung; Pappas, George J

    2007-01-01

    The reduction of dynamical systems has a rich history, with many important applications related to stability, control and verification. Reduction of nonlinear systems is typically performed in an exact manner - as is the case with mechanical systems with symmetry--which, unfortunately, limits the type of systems to which it can be applied. The goal of this paper is to consider a more general form of reduction, termed approximate reduction, in order to extend the class of systems that can be reduced. Using notions related to incremental stability, we give conditions on when a dynamical system can be projected to a lower dimensional space while providing hard bounds on the induced errors, i.e., when it is behaviorally similar to a dynamical system on a lower dimensional space. These concepts are illustrated on a series of examples.

  1. Truthful approximations to range voting

    DEFF Research Database (Denmark)

    Filos-Ratsika, Aris; Miltersen, Peter Bro

    We consider the fundamental mechanism design problem of approximate social welfare maximization under general cardinal preferences on a finite number of alternatives and without money. The well-known range voting scheme can be thought of as a non-truthful mechanism for exact social welfare...... maximization in this setting. With m being the number of alternatives, we exhibit a randomized truthful-in-expectation ordinal mechanism implementing an outcome whose expected social welfare is at least an Omega(m^{-3/4}) fraction of the social welfare of the socially optimal alternative. On the other hand, we...... show that for sufficiently many agents and any truthful-in-expectation ordinal mechanism, there is a valuation profile where the mechanism achieves at most an O(m^{-2/3}) fraction of the optimal social welfare in expectation. We get tighter bounds for the natural special case of m = 3...

  2. Analytical approximations for spiral waves

    Energy Technology Data Exchange (ETDEWEB)

    Löber, Jakob, E-mail: jakob@physik.tu-berlin.de; Engel, Harald [Institut für Theoretische Physik, Technische Universität Berlin, Hardenbergstrasse 36, EW 7-1, 10623 Berlin (Germany)

    2013-12-15

    We propose a non-perturbative attempt to solve the kinematic equations for spiral waves in excitable media. From the eikonal equation for the wave front we derive an implicit analytical relation between rotation frequency Ω and core radius R_0. For free, rigidly rotating spiral waves our analytical prediction is in good agreement with numerical solutions of the linear eikonal equation not only for very large but also for intermediate and small values of the core radius. An equivalent Ω(R_+) dependence improves the result by Keener and Tyson for spiral waves pinned to a circular defect of radius R_+ with Neumann boundaries at the periphery. Simultaneously, analytical approximations for the shape of free and pinned spirals are given. We discuss the reasons why the ansatz fails to correctly describe the dependence of the rotation frequency on the excitability of the medium.

  3. On quantum and approximate privacy

    CERN Document Server

    Klauck, H

    2001-01-01

    This paper studies privacy in communication complexity. The focus is on quantum versions of the model and on protocols with only approximate privacy against honest players. We show that the privacy loss (the minimum divulged information) in computing a function can be decreased exponentially by using quantum protocols, while the class of privately computable functions (i.e., those with privacy loss 0) is not increased by quantum protocols. Quantum communication combined with small information leakage on the other hand makes certain functions computable (almost) privately which are not computable using quantum communication without leakage or using classical communication with leakage. We also give an example of an exponential reduction of the communication complexity of a function by allowing a privacy loss of o(1) instead of privacy loss 0.

  4. IONIS: Approximate atomic photoionization intensities

    Science.gov (United States)

    Heinäsmäki, Sami

    2012-02-01

    A program to compute relative atomic photoionization cross sections is presented. The code applies the output of the multiconfiguration Dirac-Fock method for atoms in the single active electron scheme, by computing the overlap of the bound electron states in the initial and final states. The contribution from the single-particle ionization matrix elements is assumed to be the same for each final state. This method gives rather accurate relative ionization probabilities provided the single-electron ionization matrix elements do not depend strongly on energy in the region considered. The method is especially suited for open shell atoms where electronic correlation in the ionic states is large.
    Program summary:
    Program title: IONIS
    Catalogue identifier: AEKK_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEKK_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 1149
    No. of bytes in distributed program, including test data, etc.: 12 877
    Distribution format: tar.gz
    Programming language: Fortran 95
    Computer: Workstations
    Operating system: GNU/Linux, Unix
    Classification: 2.2, 2.5
    Nature of problem: Photoionization intensities for atoms.
    Solution method: The code applies the output of the multiconfiguration Dirac-Fock codes Grasp92 [1] or Grasp2K [2], to compute approximate photoionization intensities. The intensity is computed within the one-electron transition approximation and by assuming that the sum of the single-particle ionization probabilities is the same for all final ionic states.
    Restrictions: The program gives nonzero intensities for those transitions where only one electron is removed from the initial configuration(s). Shake-type many-electron transitions are not computed. The ionized shell must be closed in the initial state.
    Running time: Few seconds for a

  5. The high-energy radiation pattern from BFKLex with double-log collinear contributions

    CERN Document Server

    Chachamis, G

    2015-01-01

    We study high-energy jet production in the multi-Regge limit making use of the Monte Carlo event generator BFKLex which includes collinear improvements in the form of double-log contributions as presented in [1]. Making use of the anti-kt jet algorithm in the FastJet implementation, we present results for the average transverse momentum and azimuthal angle of the produced jets when two tagged forward/backward jets are present in the final state. We also introduce a new observable which accounts for the average rapidity separation among subsequent emissions. Results are presented, for comparison, at leading order and next-to-leading order, with the resummation of collinear double logs proposed in [2].
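
    For readers unfamiliar with the anti-kt algorithm mentioned above, the following toy sketch (not BFKLex or FastJet, just a plain O(n^3) implementation of the anti-kt distance measure with E-scheme recombination) clusters a short list of (pt, rapidity, phi) pseudo-particles with radius parameter R.

      # Toy anti-kt clustering, purely illustrative; real analyses would use FastJet.
      import math

      def to_p4(pt, y, phi):
          return [pt * math.cos(phi), pt * math.sin(phi), pt * math.sinh(y), pt * math.cosh(y)]

      def to_ptyphi(p):
          px, py, pz, e = p
          return math.hypot(px, py), 0.5 * math.log((e + pz) / (e - pz)), math.atan2(py, px)

      def delta_r2(a, b):
          dphi = abs(a[2] - b[2])
          dphi = min(dphi, 2 * math.pi - dphi)
          return (a[1] - b[1]) ** 2 + dphi ** 2

      def antikt(particles, R=0.4):
          objs = [to_p4(*p) for p in particles]
          jets = []
          while objs:
              kin = [to_ptyphi(p) for p in objs]
              # beam distances d_iB = 1/pt_i^2 and pairwise anti-kt distances d_ij
              best, merge = min((1.0 / kin[i][0] ** 2, (i, None)) for i in range(len(objs)))
              for i in range(len(objs)):
                  for j in range(i + 1, len(objs)):
                      dij = min(kin[i][0] ** -2, kin[j][0] ** -2) * delta_r2(kin[i], kin[j]) / R ** 2
                      if dij < best:
                          best, merge = dij, (i, j)
              i, j = merge
              if j is None:                      # beam distance smallest: declare a jet
                  jets.append(to_ptyphi(objs.pop(i)))
              else:                              # merge the pair and continue
                  objs[i] = [a + b for a, b in zip(objs[i], objs[j])]
                  objs.pop(j)
          return sorted(jets, reverse=True)      # jets as (pt, y, phi), hardest first

      print(antikt([(50.0, 0.0, 0.0), (10.0, 0.1, 0.05), (30.0, 2.0, 1.5)]))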

  6. Approximate analytic solutions to the NPDD: Short exposure approximations

    Science.gov (United States)

    Close, Ciara E.; Sheridan, John T.

    2014-04-01

    There have been many attempts to accurately describe the photochemical processes that take places in photopolymer materials. As the models have become more accurate, solving them has become more numerically intensive and more 'opaque'. Recent models incorporate the major photochemical reactions taking place as well as the diffusion effects resulting from the photo-polymerisation process, and have accurately described these processes in a number of different materials. It is our aim to develop accessible mathematical expressions which provide physical insights and simple quantitative predictions of practical value to material designers and users. In this paper, starting with the Non-Local Photo-Polymerisation Driven Diffusion (NPDD) model coupled integro-differential equations, we first simplify these equations and validate the accuracy of the resulting approximate model. This new set of governing equations are then used to produce accurate analytic solutions (polynomials) describing the evolution of the monomer and polymer concentrations, and the grating refractive index modulation, in the case of short low intensity sinusoidal exposures. The physical significance of the results and their consequences for holographic data storage (HDS) are then discussed.

  7. Does logging and forest conversion to oil palm agriculture alter functional diversity in a biodiversity hotspot?

    OpenAIRE

    Edwards, F.A.; D. P. Edwards; Larsen, T H; Hsu, W W; Benedick, S; Chung, A.; Vun Khen, C; Wilcove, D S; Hamer, K C

    2013-01-01

    Forests in Southeast Asia are rapidly being logged and converted to oil palm. These changes in land-use are known to affect species diversity but consequences for the functional diversity of species assemblages are poorly understood. Environmental filtering of species with similar traits could lead to disproportionate reductions in trait diversity in degraded habitats. Here, we focus on dung beetles, which play a key role in ecosystem processes such as nutrient recycling and seed dispersal. W...

  8. Logging and Fire Effects in Siberian Boreal Forests

    Science.gov (United States)

    Kukavskaya, E.; Buryak, L.; Ivanova, G.; Kalenskaya, O.; Bogorodskaya, A.; Zhila, S.; McRae, D.; Conard, S. G.

    2013-12-01

    The Russian boreal zone supports a huge terrestrial carbon pool. Moreover, it is a tremendous reservoir of wood products concentrated mainly in Siberia. The main natural disturbance in these forests is wildfire, which modifies the carbon budget and has potentially important climate feedbacks. In addition, both legal and illegal logging increase landscape complexity and fire hazard. We investigated a number of sites in different regions of Siberia to evaluate the impacts of fire and logging on fuel loads, carbon emissions, tree regeneration, soil respiration, and microbocenosis. We found large variations of fire and logging effects among regions depending on growing conditions and type of logging activity. Partial logging had no negative impact on forest conditions and carbon cycle. Illegal logging resulted in increase of fire hazard, and higher carbon emissions than legal logging. The highest fuel loads and carbon emissions were found on repeatedly burned unlogged sites where first fire resulted in total tree mortality. Repeated fires together with logging activities in drier conditions and on large burned sites resulted in insufficient regeneration, or even total lack of tree seedlings. Soil respiration was less on both burned and logged areas than in undisturbed forest. The highest structural and functional disturbances of the soil microbocenosis were observed on logged burned sites. Understanding current interactions between fire and logging is important for modeling ecosystem processes and for managers to develop strategies of sustainable forest management. Changing patterns in the harvest of wood products increase landscape complexity and can be expected to increase emissions and ecosystem damage from wildfires, inhibit recovery of natural ecosystems, and exacerbate impacts of wildland fire on changing climate and air quality. The research was supported by NASA LCLUC Program, RFBR grant # 12-04-31258, and Russian Academy of Sciences.

  9. On a method of approximation by Jacobi polynomials

    OpenAIRE

    Dubey,R.K.; Pandey, R. K.

    2005-01-01

    The convolution structure for Jacobi series allows end-point summability of Fourier-Jacobi expansions to lead to an approximation of a function by a linear combination of Jacobi polynomials. Thus, using Cesàro summability of some orders $>1$ at $x=1$, we prove a result on the approximation of functions on $[-1,1]$ by operators involving Jacobi polynomials. Precisely, we pick functions from a Lebesgue integrable space and then study their representation by Jacobi polynomials und...

  10. Sampling, Filtering and Sparse Approximations on Combinatorial Graphs

    CERN Document Server

    Pesenson, Isaac Z

    2011-01-01

    In this paper we address sampling and approximation of functions on combinatorial graphs. We develop filtering on graphs by using Schr\\"odinger's group of operators generated by combinatorial Laplace operator. Then we construct a sampling theory by proving Poincare and Plancherel-Polya-type inequalities for functions on graphs. These results lead to a theory of sparse approximations on graphs and have potential applications to filtering, denoising, data dimension reduction, image processing, image compression, computer graphics, visualization and learning theory.

  11. Calculating resonance positions and widths using the Siegert approximation method

    CERN Document Server

    Rapedius, Kevin

    2011-01-01

    Here we present complex resonance states (or Siegert states), that describe the tunneling decay of a trapped quantum particle, from an intuitive point of view which naturally leads to the easily applicable Siegert approximation method that can be used for analytical and numerical calculations of complex resonances of both the linear and nonlinear Schr\\"odinger equation. Our approach thus complements other treatments of the subject that mostly focus on methods based on continuation in the complex plane or on semiclassical approximations.

  12. A general approach for cache-oblivious range reporting and approximate range counting

    DEFF Research Database (Denmark)

    Afshani, Peyman; Hamilton, Chris; Zeh, Norbert

    2010-01-01

    We present cache-oblivious solutions to two important variants of range searching: range reporting and approximate range counting. Our main contribution is a general approach for constructing cache-oblivious data structures that provide relative (1+ε)-approximations for a general class of range...... counting queries. This class includes three-sided range counting in the plane, 3-d dominance counting, and 3-d halfspace range counting. The constructed data structures use linear space and answer queries in the optimal query bound of O(log_B(N/K)) block transfers in the worst case, where K is the number...... of points in the query range. As a corollary, we also obtain the first approximate 3-d halfspace range counting and 3-d dominance counting data structures with a worst-case query time of O(log(N/K)) in internal memory. An easy but important consequence of our main result is the existence of -space cache...

  13. Computer log evaluation in tertiary coal basins

    Energy Technology Data Exchange (ETDEWEB)

    Mares, S.; Krestan, J.

    1984-01-01

    The technology of working a lignite quarry in north Czechoslovakia includes use of AR, GK, GGKP, NNK and a cavernometer with digital processing of the material on an analog-digital transformer of the CSSR in the field and subsequent computer processing on an Eclipse C300. Hungarian equipment was used with recording scale: 1:100, quantization spacing was 0.1 m. The NNK probe was L = 50 cm, the NNK and GGKP calibrations were made on a rock block with known parameters. The task of logging is to determine the base of the productive interval, as well as the ash content (ASH) and calorific value (QD) of the coal in the shaft. The processing graph includes editing (translation of the conventional units into physical quantities) and construction of statistical distributions of parameters. The clay content K_cl is defined as the minimum quantity for double differential parameter GK and for NNK. The cross plot of NNK (NPOR)-GGKP (DEN) determines the effective porosity EPOR, as well as the maximum points for porosity (48-53%) and density (1.28-1.75 g/cm³) that characterize the coals. In order to determine QD and ASH, regression is used (linear, exponential, logarithmic, parabolic and hyperbolic). The clay type is determined by cross plot GK-GGKP. Examples are given of computer constructions, as well as the summary interpretation document which characterizes the initial and definite parameters in the bed mode (DEN, NPOR, GR, RESN, DS, VSH, EPOR, QD, ASH). The interpretation system is called SG.

  14. [Numerical assessment of impeller features of centrifugal blood pump based on fast hemolysis approximation model].

    Science.gov (United States)

    Shou, Chen; Guo, Yongjun; Su, Lei; Li, Yongqian

    2014-12-01

    The impeller profile is one of the most important factors determining the shear stress that leads to blood hemolysis in the internal flow of a centrifugal blood pump. Investigation of the internal flow field in centrifugal blood pumps and estimation of the hemolysis produced by different impeller profiles provide information to improve blood pump performance. The SST kappa-omega model with low-Reynolds correction was used in our laboratory to study the internal flow fields for four kinds of centrifugal blood pump impellers. The flow fields included distributions of the pressure field, velocity field and shear stress field. In addition, a fast numerical hemolysis approximation was adopted to calculate the normalized index of hemolysis (NIH). The results indicated that the pressure field distribution in all the pumps was reasonable, but that for the log-spiral impeller pump the vortex and backflow were much lower than in the other pumps, the high shear stress zone was just about 0.004%, and the NIH was 0.0089.
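
    For context on the normalized index of hemolysis mentioned above, hemolysis estimates of this kind are commonly built on a power-law damage correlation of the form dHb/Hb = C * tau^a * t^b accumulated along particle paths. The sketch below shows only that bookkeeping; the constants C, a, b are placeholders, and neither the paper's fast approximation nor its coefficients are reproduced here.

      # Illustrative only: accumulate a power-law blood-damage index along a particle
      # path given a scalar shear-stress history tau(t).  The constants C, a, b are
      # placeholders; the paper's fast hemolysis approximation is not reproduced here.
      import numpy as np

      def damage_index(tau, dt, C=1.0e-6, a=2.0, b=0.8):
          """Linearized accumulation: D = sum C * tau_i**a * dt**b over path segments."""
          tau = np.asarray(tau, dtype=float)
          return float(np.sum(C * tau ** a * dt ** b))

      # Placeholder shear-stress history (Pa) sampled every millisecond:
      tau_history = np.array([50.0, 120.0, 400.0, 150.0, 60.0])
      print(f"accumulated damage index = {damage_index(tau_history, dt=1e-3):.3e}")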

  15. Correlation between the phase and the log-amplitude of a wave through the vertical atmospheric propagation

    CERN Document Server

    Molodij, Guillaume

    2014-01-01

    I present expressions of the correlation between the log-amplitude and the phase of a wavefront propagating through the atmospheric turbulence. The properties of the angular correlation functions are discussed using usual synthetic turbulence profiles. The theoretical study is completed by practical implementations that can be envisioned to determine and eventually compensate the effects of the fluctuations of the intensity during the astronomical observations. The close formulation between the phase and the log-amplitude allows an analytic formulation in the Rytov approximation. Equations contain the product of an arbitrary number of hypergeometric functions that are evaluated using the Mellin transforms integration method.

  16. Hydrologic properties of coal beds in the Powder River Basin, Montana I. Geophysical log analysis

    Science.gov (United States)

    Morin, R.H.

    2005-01-01

    As part of a multidisciplinary investigation designed to assess the implications of coal-bed methane development on water resources for the Powder River Basin of southeastern Montana, six wells were drilled through Paleocene-age coal beds along a 31-km east-west transect within the Tongue River drainage basin. Analysis of geophysical logs obtained in these wells provides insight into the hydrostratigraphic characteristics of the coal and interbedded siliciclastic rocks and their possible interaction with the local stress field. Natural gamma and electrical resistivity logs were effective in distinguishing individual coal beds. Full-waveform sonic logs were used to determine elastic properties of the coal and an attendant estimate of aquifer storage is in reasonable agreement with that computed from a pumping test. Inspection of magnetically oriented images of the borehole walls generated from both acoustic and optical televiewers and comparison with coal cores infer a face cleat orientation of approximately N33??E, in close agreement with regional lineament patterns and the northeast trend of the nearby Tongue River. The local tectonic stress field in this physiographic province as inferred from a nearby 1984 earthquake denotes an oblique strike-slip faulting regime with dominant east-west compression and north-south extension. These stress directions are coincident with those of the primary fracture sets identified from the televiewer logs and also with the principle axes of the drawdown ellipse produced from a complementary aquifer test, but oblique to apparent cleat orientation. Consequently, examination of these geophysical logs within the context of local hydrologic characteristics indicates that transverse transmissivity anisotropy in these coals is predominantly controlled by bedding configuration and perhaps a mechanical response to the contemporary stress field rather than solely by cleat structure.

  17. Approximability of the discrete Fréchet distance

    Directory of Open Access Journals (Sweden)

    Karl Bringmann

    2015-12-01

    Full Text Available The Fréchet distance is a popular and widespread distance measure for point sequences and for curves. About two years ago, Agarwal et al. [SIAM J. Comput. 2014] presented a new (mildly) subquadratic algorithm for the discrete version of the problem. This spawned a flurry of activity that has led to several new algorithms and lower bounds. In this paper, we study the approximability of the discrete Fréchet distance. Building on a recent result by Bringmann [FOCS 2014], we present a new conditional lower bound showing that strongly subquadratic algorithms for the discrete Fréchet distance are unlikely to exist, even in the one-dimensional case and even if the solution may be approximated up to a factor of 1.399. This raises the question of how well we can approximate the Fréchet distance (of two given $d$-dimensional point sequences of length $n$) in strongly subquadratic time. Previously, no general results were known. We present the first such algorithm by analysing the approximation ratio of a simple, linear-time greedy algorithm to be $2^{\Theta(n)}$. Moreover, we design an $\alpha$-approximation algorithm that runs in time $O(n\log n + n^2/\alpha)$, for any $\alpha\in [1, n]$. Hence, an $n^\varepsilon$-approximation of the Fréchet distance can be computed in strongly subquadratic time, for any $\varepsilon > 0$.
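
    For reference, the sketch below gives the standard quadratic dynamic program for the discrete Fréchet distance, the baseline that the conditional lower bound above concerns; the linear-time greedy heuristic and the alpha-approximation algorithm analysed in the paper are not reproduced here.

      # Standard O(n*m) dynamic program (Eiter-Mannila style) for the discrete
      # Fréchet distance between two point sequences P and Q.
      import math

      def discrete_frechet(P, Q):
          n, m = len(P), len(Q)
          dp = [[0.0] * m for _ in range(n)]
          for i in range(n):
              for j in range(m):
                  d = math.dist(P[i], Q[j])
                  if i == 0 and j == 0:
                      dp[i][j] = d
                  elif i == 0:
                      dp[i][j] = max(dp[i][j - 1], d)
                  elif j == 0:
                      dp[i][j] = max(dp[i - 1][j], d)
                  else:
                      dp[i][j] = max(min(dp[i - 1][j], dp[i][j - 1], dp[i - 1][j - 1]), d)
          return dp[-1][-1]

      P = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
      Q = [(0.0, 1.0), (1.0, 1.5), (2.0, 1.0)]
      print(round(discrete_frechet(P, Q), 3))  # 1.5 for these toy curves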

  18. 33 CFR 207.370 - Big Fork River, Minn.; logging.

    Science.gov (United States)

    2010-07-01

    ... 33 Navigation and Navigable Waters 3 2010-07-01 2010-07-01 false Big Fork River, Minn.; logging. 207.370 Section 207.370 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE NAVIGATION REGULATIONS § 207.370 Big Fork River, Minn.; logging. (a) During the season of navigation, parties engaged in...

  19. New Data-Logging Tools--New Investigations.

    Science.gov (United States)

    Rogers, Laurence

    1997-01-01

    Presents examples of the types of investigations which exploit the tools now commonly featured in data-logging software. Emphasizes the importance of designing tasks that encourage pupils to think about the data and the principles which underpin worthwhile data-logging tasks. (Author/ASK)

  20. Web Log Analysis: A Study of Instructor Evaluations Done Online

    Science.gov (United States)

    Klassen, Kenneth J.; Smith, Wayne

    2004-01-01

    This paper focuses on developing a relatively simple method for analyzing web-logs. It also explores the challenges and benefits of web-log analysis. The study of student behavior on this site provides insights into website design and the effectiveness of this site in particular. Another benefit realized from the paper is the ease with which these…

  1. Log-concave Probability Distributions: Theory and Statistical Testing

    DEFF Research Database (Denmark)

    An, Mark Yuing

    1996-01-01

    This paper studies the broad class of log-concave probability distributions that arise in economics of uncertainty and information. For univariate, continuous, and log-concave random variables we prove useful properties without imposing the differentiability of density functions. Discrete and mul...
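
    For readers unfamiliar with the term, a density f is log-concave when log f is concave; equivalently (a standard textbook definition, not specific to this paper),

      f(\lambda x + (1-\lambda) y) \;\ge\; f(x)^{\lambda}\, f(y)^{1-\lambda}, \qquad \lambda \in [0,1],

    which holds, for example, for the normal, exponential and logistic densities.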

  2. Ubiquitous Learning Project Using Life-Logging Technology in Japan

    Science.gov (United States)

    Ogata, Hiroaki; Hou, Bin; Li, Mengmeng; Uosaki, Noriko; Mouri, Kosuke; Liu, Songran

    2014-01-01

    A Ubiquitous Learning Log (ULL) is defined as a digital record of what a learner has learned in daily life using ubiquitous computing technologies. In this paper, a project which developed a system called SCROLL (System for Capturing and Reusing Of Learning Log) is presented. The aim of developing SCROLL is to help learners record, organize,…

  3. 46 CFR 78.37-10 - Official log entries.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 3 2010-10-01 2010-10-01 false Official log entries. 78.37-10 Section 78.37-10 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) PASSENGER VESSELS OPERATIONS Logbook Entries § 78.37-10 Official log entries. (a) In addition to other items required to be entered in the...

  4. 12 CFR 27.4 - Inquiry/Application Log.

    Science.gov (United States)

    2010-01-01

    ... of the Federal Reserve Board, 12 CFR part 203) indicates a pattern of significant variation in the... 12 Banks and Banking 1 2010-01-01 2010-01-01 false Inquiry/Application Log. 27.4 Section 27.4... SYSTEM § 27.4 Inquiry/Application Log. (a) The Comptroller, among other things, may require a bank...

  5. Condition and fate of logged forests in the Brazilian Amazon.

    Science.gov (United States)

    Asner, Gregory P; Broadbent, Eben N; Oliveira, Paulo J C; Keller, Michael; Knapp, David E; Silva, José N M

    2006-08-22

    The long-term viability of a forest industry in the Amazon region of Brazil depends on the maintenance of adequate timber volume and growth in healthy forests. Using extensive high-resolution satellite analyses, we studied the forest damage caused by recent logging operations and the likelihood that logged forests would be cleared within 4 years after timber harvest. Across 2,030,637 km² of the Brazilian Amazon from 1999 to 2004, at least 76% of all harvest practices resulted in high levels of canopy damage sufficient to leave forests susceptible to drought and fire. We found that 16 ± 1% of selectively logged areas were deforested within 1 year of logging, with a subsequent annual deforestation rate of 5.4% for 4 years after timber harvests. Nearly all logging occurred within 25 km of main roads, and within that area, the probability of deforestation for a logged forest was up to four times greater than for unlogged forests. In combination, our results show that logging in the Brazilian Amazon is dominated by highly damaging operations, often followed rapidly by deforestation decades before forests can recover sufficiently to produce timber for a second harvest. Under the management regimes in effect at the time of our study in the Brazilian Amazon, selective logging would not be sustained.

  6. Teaching an Old Log New Tricks with Machine Learning.

    Science.gov (United States)

    Schnell, Krista; Puri, Colin; Mahler, Paul; Dukatz, Carl

    2014-03-01

    To most people, the log file would not be considered an exciting area in technology today. However, these relatively benign, slowly growing data sources can drive large business transformations when combined with modern-day analytics. Accenture Technology Labs has built a new framework that helps to expand existing vendor solutions to create new methods of gaining insights from these benevolent information springs. This framework provides a systematic and effective machine-learning mechanism to understand, analyze, and visualize heterogeneous log files. These techniques enable an automated approach to analyzing log content in real time, learning relevant behaviors, and creating actionable insights applicable in traditionally reactive situations. Using this approach, companies can now tap into a wealth of knowledge residing in log file data that is currently being collected but underutilized because of its overwhelming variety and volume. By using log files as an important data input into the larger enterprise data supply chain, businesses have the opportunity to enhance their current operational log management solution and generate entirely new business insights-no longer limited to the realm of reactive IT management, but extending from proactive product improvement to defense from attacks. As we will discuss, this solution has immediate relevance in the telecommunications and security industries. However, the most forward-looking companies can take it even further. How? By thinking beyond the log file and applying the same machine-learning framework to other log file use cases (including logistics, social media, and consumer behavior) and any other transactional data source.
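
    A minimal sketch of one common way to begin such log analysis, assuming scikit-learn is available (this is a generic TF-IDF and k-means pipeline, not the framework described in the paper): vectorize raw log lines and group them into rough message types.

      # Generic illustration: cluster heterogeneous log lines into rough message types
      # using TF-IDF features and k-means.  Not the framework described in the paper.
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.cluster import KMeans

      log_lines = [
          "ERROR db connection timeout host=db01",
          "ERROR db connection timeout host=db02",
          "INFO user login success id=1842",
          "INFO user login success id=2201",
          "WARN disk usage 91 percent on /var",
          "WARN disk usage 95 percent on /var",
      ]

      features = TfidfVectorizer(token_pattern=r"[A-Za-z]+").fit_transform(log_lines)
      labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)

      for label, line in sorted(zip(labels, log_lines)):
          print(label, line)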

  7. Prediction of Log "P": ALOGPS Application in Medicinal Chemistry Education

    Science.gov (United States)

    Kujawski, Jacek; Bernard, Marek K.; Janusz, Anna; Kuzma, Weronika

    2012-01-01

    Molecular hydrophobicity (lipophilicity), usually quantified as log "P" where "P" is the partition coefficient, is an important molecular characteristic in medicinal chemistry and drug design. The log "P" coefficient is one of the principal parameters for the estimation of lipophilicity of chemical compounds and pharmacokinetic properties. The…
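
    For reference, the partition coefficient that log P summarizes is conventionally defined for the octanol/water system of an un-ionized solute at equilibrium as (a standard textbook definition, not specific to ALOGPS)

      \log P = \log_{10} \frac{[\text{solute}]_{\text{octanol}}}{[\text{solute}]_{\text{water}}}.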

  8. Butt-log grade distributions for five Appalachian hardwood species

    Science.gov (United States)

    John R. Myers; Gary W. Miller; Harry V., Jr. Wiant; Joseph E. Barnard; Joseph E. Barnard

    1986-01-01

    Tree quality is an important factor in determining the market value of hardwood timber stands, but many forest inventories do not include estimates of tree quality. Butt-log grade distributions were developed for northern red oak, black oak, white oak, chestnut oak, and yellow-poplar using USDA Forest Service log grades on more than 4,700 trees in West Virginia. Butt-...

  10. Fates of trees damaged by logging in Amazonian Bolivia

    NARCIS (Netherlands)

    Shenkin, A.; Bolker, B.; Peña Claros, M.; Licona, J.C.; Putz, F.E.

    2015-01-01

    Estimation of carbon losses from trees felled and incidentally-killed during selective logging of tropical forests is relatively straightforward and well-documented, but less is known about the fates of collaterally-damaged trees that initially survive. Tree response to logging damage is an importan

  11. Mining the SDSS SkyServer SQL queries log

    Science.gov (United States)

    Hirota, Vitor M.; Santos, Rafael; Raddick, Jordan; Thakar, Ani

    2016-05-01

    SkyServer, the Internet portal for the Sloan Digital Sky Survey (SDSS) astronomical catalog, provides a set of tools that allows data access for astronomers and scientific education. One of SkyServer's data access interfaces allows users to enter ad-hoc SQL statements to query the catalog. SkyServer also presents some template queries that can be used as a basis for more complex queries. This interface has logged over 330 million queries submitted since 2001. It is expected that analysis of this data can be used to investigate usage patterns, identify potential new classes of queries, find similar queries, etc., and to shed some light on how users interact with the Sloan Digital Sky Survey data and how scientists have adopted the new paradigm of e-Science, which could in turn lead to enhancements to the user interfaces and experience in general. In this paper we review some approaches to SQL query mining, apply the traditional techniques used in the literature and present lessons learned, namely, that the general text mining approach for feature extraction and clustering does not seem to be adequate for this type of data, and, most importantly, we find that this type of analysis can result in very different queries being clustered together.

  12. The Effects of Selective Logging Behaviors on Forest Fragmentation and Recovery

    Directory of Open Access Journals (Sweden)

    Xanic J. Rondon

    2012-01-01

    Full Text Available To study the impacts of selective logging behaviors on a forest landscape, we developed an intermediate-scale spatial model to link cross-scale interactions of timber harvesting, a fine-scale human activity, with coarse-scale landscape impacts. We used the Lotka-Volterra predator-prey model with Holling’s functional response II to simulate selective logging, coupled with a cellular automaton model to simulate logger mobility and forest fragmentation. Three logging scenarios were simulated, each varying in timber harvesting preference and logger mobility. We quantified forest resilience by evaluating (1) the spatial patterns of forest fragmentation, (2) the time until the system crossed a threshold into a deforested state, and (3) recovery time. Our simulations showed that logging behaviors involving decisions made about harvesting timber and mobility can lead to different spatial patterns of forest fragmentation. They can, together with forest management practices, significantly delay or accelerate the transition of a forest landscape to a deforested state and its return to a recovered state. Intermediate-scale models emerge as useful tools for understanding cross-scale interactions between human activities and the spatial patterns that are created by anthropogenic land use.
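
    A minimal sketch of the predator-prey component named above, assuming a standard Rosenzweig-MacArthur-type formulation of Lotka-Volterra dynamics with a Holling type II functional response; the parameter values are arbitrary placeholders and the paper's coupling to a cellular automaton is not reproduced.

      # Hedged sketch: logistic prey with a Holling type II predator response,
      # integrated with SciPy.  Parameters are arbitrary placeholders, not calibrated
      # to the paper's logging scenarios.
      import numpy as np
      from scipy.integrate import solve_ivp

      def lv_holling2(t, state, r=1.0, K=100.0, a=0.5, h=0.1, e=0.2, m=0.3):
          N, P = state                                  # prey (N) and predator (P)
          uptake = a * N / (1.0 + a * h * N)            # Holling type II functional response
          dN = r * N * (1.0 - N / K) - uptake * P
          dP = e * uptake * P - m * P
          return [dN, dP]

      sol = solve_ivp(lv_holling2, (0.0, 100.0), [50.0, 5.0], dense_output=True)
      t = np.linspace(0.0, 100.0, 5)
      print(np.round(sol.sol(t), 2))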

  13. High Risk Posture on Motor-Manual Short Wood Logging System in Acacia mangium Plantation

    Directory of Open Access Journals (Sweden)

    Efi Yuliati Yovi

    2015-06-01

    Full Text Available Motor-manual logging is considered the dominant logging system on Java Island, Indonesia. The system, which consists of felling, delimbing, bucking, hauling, and transporting activities, involves a combination of stress factors, e.g. difficult work postures, generation of force, and lifting techniques. On the other hand, the combination of the three is strongly associated with a high risk of work-related musculoskeletal injuries (MSIs), including musculoskeletal disorders. This research aimed to assess difficult work postures in felling, delimbing, bucking, and manual short-wood hauling by employing the rapid entire body assessment (REBA) technique and muscular pain scoring based on the workers' perception. It was revealed that felling and manual hauling scored 4 on the REBA action level, indicating a very high MSI risk level and categorized as “necessary now” for injury risk preventive action. The workers' pain scoring indicated that low back (and spine in general) disorders resulting in low back pain are among the leading safety issues in felling and manual hauling. Regardless of the complex mechanisms by which personal risk and environmental factors are associated with manual material handling injuries, a job-related factors approach should be emphasized in MSI prevention initiatives in motor-manual logging. Keywords: motor-manual logging, difficult work posture, REBA, MSIs, low back pain

  14. A time-driven transmission method for well logging networks

    Institute of Scientific and Technical Information of China (English)

    Wu Ruiqing; Chen Wei; Chen Tianqi; Li Qun

    2009-01-01

    Long delays and poor real-time transmission are disadvantageous to well logging networks consisting of multiple subnets. In this paper, we propose a time-driven transmission method (TDTM) to improve the efficiency and precision of logging networks. Using the TDTM, we obtain well logging curves by fusing the depth acquired on the surface and the data acquired in downhole instruments, based on the synchronization timestamp. For the TDTM, the precision of time synchronization and the data fusion algorithm are the two main factors influencing system errors. A piecewise fractal interpolation is proposed to fuse data rapidly in each interval of the logging curves. Intervals with similar characteristics in the curves are extracted based on the change in the histogram of the interval. The TDTM is evaluated with a sonic curve as an example. Experimental results show that the fused data have little error, and that the TDTM is effective and suitable for logging networks.
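
    A toy illustration of the timestamp-based fusion step described above, with simple linear interpolation standing in for the paper's piecewise fractal interpolation and all numbers invented: downhole measurements are mapped to depth by matching their timestamps against the surface depth-time record.

      # Toy depth-time fusion: the surface unit logs (time, depth); the downhole tool
      # logs (time, measurement).  Interpolate depth at the tool timestamps to build a
      # depth-indexed curve.  Linear interpolation stands in for the paper's piecewise
      # fractal interpolation; all values are invented for illustration.
      import numpy as np

      surface_time = np.array([0.0, 10.0, 20.0, 30.0, 40.0])              # s
      surface_depth = np.array([1000.0, 1002.5, 1005.0, 1007.5, 1010.0])  # m

      tool_time = np.array([2.0, 12.0, 22.0, 32.0])                       # s (synchronized clock)
      tool_sonic = np.array([210.0, 215.0, 208.0, 220.0])                 # us/m, placeholder values

      depth_at_tool = np.interp(tool_time, surface_time, surface_depth)
      for d, v in zip(depth_at_tool, tool_sonic):
          print(f"{d:8.2f} m  ->  {v:6.1f} us/m")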

  15. What's new in well logging and formation evaluation

    Science.gov (United States)

    Prensky, S.

    2011-01-01

    A number of significant new developments are emerging in well logging and formation evaluation. Some of the new developments include an ultrasonic wireline imager, an electromagnetic free-point indicator, wired and fiber-optic coiled tubing systems, and extreme-temperature logging-while-drilling (LWD) tools. The continued consolidation of logging and petrophysical service providers in 2010 means that these innovations are increasingly being provided by a few large companies. Weatherford International has launched a slimhole cross-dipole tool as part of the company's line of compact logging tools. The 26-ft-long Compact Cross-Dipole Sonic (CXD) tool can be run as part of a quad-combo compact logging string. Halliburton has introduced a version of its circumferential acoustic scanning tool (CAST) that runs on monoconductor cable (CAST-M) to provide high-resolution images in open hole and in cased hole for casing and cement evaluation.

  16. Cased-hole log analysis and reservoir performance monitoring

    CERN Document Server

    Bateman, Richard M

    2015-01-01

    This book addresses vital issues, such as the evaluation of shale gas reservoirs and their production. Topics include the cased-hole logging environment; reservoir fluid properties; flow regimes; temperature, noise, cement bond, and pulsed neutron logging; and casing inspection. Production logging charts and tables are included in the appendices. The work serves as a comprehensive reference for production engineers with upstream E&P companies, well logging service company employees, university students, and petroleum industry training professionals. This book also: provides methods of conveying production logging tools along horizontal well segments as well as measurements of formation electrical resistivity through casing; covers new information on fluid flow characteristics in inclined pipe and provides new and improved nuclear tool measurements in cased wells; and includes updates on cased-hole wireline formation testing.

  17. A Logical Method for Policy Enforcement over Evolving Audit Logs

    CERN Document Server

    Garg, Deepak; Datta, Anupam

    2011-01-01

    We present an iterative algorithm for enforcing policies represented in a first-order logic, which can, in particular, express all transmission-related clauses in the HIPAA Privacy Rule. The logic has three features that raise challenges for enforcement --- uninterpreted predicates (used to model subjective concepts in privacy policies), real-time temporal properties, and quantification over infinite domains (such as the set of messages containing personal information). The algorithm operates over audit logs that are inherently incomplete and evolve over time. In each iteration, the algorithm provably checks as much of the policy as possible over the current log and outputs a residual policy that can only be checked when the log is extended with additional information. We prove correctness and termination properties of the algorithm. While these results are developed in a general form, accounting for many different sources of incompleteness in audit logs, we also prove that for the special case of logs that m...
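
    A toy illustration of the iterate-and-residual idea, not the paper's first-order-logic algorithm: policy clauses here are ordinary Python callables that may return True, False, or None for "cannot be decided on the current log", and undecided clauses are carried forward as the residual policy.

      # Toy residual-policy audit: run each clause over the current log; clauses that
      # cannot yet be decided (return None) are carried forward as the residual policy.
      def audit(log, clauses):
          violations, residual = [], []
          for name, clause in clauses:
              verdict = clause(log)
              if verdict is False:
                  violations.append(name)
              elif verdict is None:          # needs information not yet in the log
                  residual.append((name, clause))
          return violations, residual

      # Example clauses over a list of event dicts (purely illustrative):
      def no_disclosure_without_consent(log):
          for e in log:
              if e["action"] == "disclose" and not e.get("consent"):
                  return False
          return True

      def every_request_eventually_answered(log):
          pending = {e["id"] for e in log if e["action"] == "request"}
          pending -= {e["id"] for e in log if e["action"] == "answer"}
          return None if pending else True   # undecided until the log grows

      log = [{"id": 1, "action": "request"}, {"id": 2, "action": "disclose", "consent": True}]
      print(audit(log, [("consent", no_disclosure_without_consent),
                        ("answered", every_request_eventually_answered)]))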

  18. Differentially Private Search Log Sanitization with Optimal Output Utility

    CERN Document Server

    Hong, Yuan; Lu, Haibing; Wu, Mingrui

    2011-01-01

    Web search logs contain extremely sensitive data, as evidenced by the recent AOL incident. However, storing and analyzing search logs can be very useful for many purposes (i.e. investigating human behavior). Thus, an important research question is how to privately sanitize search logs. Although several search log anonymization techniques have been proposed with concrete privacy models, the output utility of most techniques is merely evaluated but not necessarily maximized. Indeed, when applying any privacy standard to the search log anonymization, the optimal (maximum utility) output can be derived according to the inter-relation between privacy and utility. In this paper, we take a first step towards tackling this problem by formulating utility-maximizing optimization problems based on the rigorous privacy standard of differential privacy. Specifically, we utilize optimization models to maximize the output utility of the sanitization for different applications, while ensuring that the production process sati...
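
    The sketch below is not the utility-maximizing optimization formulated in the paper; it is a minimal illustration of the differential-privacy building block it builds on, releasing noisy query counts from a search log via the Laplace mechanism, with a sensitivity of one count per query assumed for illustration.

      # Minimal Laplace-mechanism sketch for releasing query counts from a search log.
      # Assumes each user contributes at most one count per query (sensitivity = 1);
      # this is the basic DP primitive, not the paper's utility-maximizing sanitizer.
      import numpy as np

      def dp_query_counts(counts, epsilon=0.5, seed=0):
          rng = np.random.default_rng(seed)
          noisy = {}
          for query, count in counts.items():
              noise = rng.laplace(loc=0.0, scale=1.0 / epsilon)   # Lap(sensitivity/epsilon)
              noisy[query] = max(0, round(count + noise))          # clamp for readability
          return noisy

      true_counts = {"flu symptoms": 420, "rare disease X": 3, "weather": 1580}
      print(dp_query_counts(true_counts, epsilon=0.5))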

  19. Coal log pipeline research at the University of Missouri

    Energy Technology Data Exchange (ETDEWEB)

    Liu, H.

    1992-03-01

    Project tasks: Perform the necessary testing and development to demonstrate that the amount of binder in coal logs can be reduced to 8% or lower to produce logs with adequate strength to eliminate breakage during pipeline transportation, under conditions experienced in long distance pipeline systems. Prior to conducting any testing and demonstration, grantee shall perform an information search and make full determination of all previous attempts to extrude or briquette coal, upon which the testing and demonstration shall be based. Perform the necessary development to demonstrate a small model of the most promising injection system for coal-logs, and test the logs produced from Task 1. Conduct economic analysis of coal-log pipeline, based upon the work to date. Refine and complete the economic model. Prepare a final report for DOE.

  20. Maintaining ecosystem function and services in logged tropical forests.

    Science.gov (United States)

    Edwards, David P; Tobias, Joseph A; Sheil, Douglas; Meijaard, Erik; Laurance, William F

    2014-09-01

    Vast expanses of tropical forests worldwide are being impacted by selective logging. We evaluate the environmental impacts of such logging and conclude that natural timber-production forests typically retain most of their biodiversity and associated ecosystem functions, as well as their carbon, climatic, and soil-hydrological ecosystem services. Unfortunately, the value of production forests is often overlooked, leaving them vulnerable to further degradation including post-logging clearing, fires, and hunting. Because logged tropical forests are extensive, functionally diverse, and provide many ecosystem services, efforts to expand their role in conservation strategies are urgently needed. Key priorities include improving harvest practices to reduce negative impacts on ecosystem functions and services, and preventing the rapid conversion and loss of logged forests.