DUAL TIMELIKE NORMAL AND DUAL TIMELIKE SPHERICAL CURVES IN DUAL MINKOWSKI SPACE
ÖNDER, Mehmet
2009-01-01
Abstract: In this paper, we give characterizations of dual timelike normal and dual timelike spherical curves in the dual Minkowski 3-space, and we show that every dual timelike normal curve is also a dual timelike spherical curve. Keywords: Normal curves, Dual Minkowski 3-Space, Dual Timelike curves. Mathematics Subject Classifications (2000): 53C50, 53C40.
Morse theory on timelike and causal curves
International Nuclear Information System (INIS)
Everson, J.; Talbot, C.J.
1976-01-01
It is shown that the set of timelike curves in a globally hyperbolic space-time manifold can be given the structure of a Hilbert manifold under a suitable definition of 'timelike.' The causal curves are the topological closure of this manifold. The Lorentzian energy (corresponding to Milnor's energy, except that the Lorentzian inner product is used) is shown to be a Morse function for the space of causal curves. A fixed end point index theorem is obtained in which a lower bound for the index of the Hessian of the Lorentzian energy is given in terms of the sum of the orders of the conjugate points between the end points. (author)
Experimental simulation of closed timelike curves.
Ringbauer, Martin; Broome, Matthew A; Myers, Casey R; White, Andrew G; Ralph, Timothy C
2014-06-19
Closed timelike curves are among the most controversial features of modern physics. As legitimate solutions to Einstein's field equations, they allow for time travel, which instinctively seems paradoxical. However, in the quantum regime these paradoxes can be resolved, leaving closed timelike curves consistent with relativity. The study of these systems therefore provides valuable insight into nonlinearities and the emergence of causal structures in quantum mechanics, which is essential for any formulation of a quantum theory of gravity. Here we experimentally simulate the nonlinear behaviour of a qubit interacting unitarily with an older version of itself, addressing some of the fascinating effects that arise in systems traversing a closed timelike curve. These include perfect discrimination of non-orthogonal states and, most intriguingly, the ability to distinguish nominally equivalent ways of preparing pure quantum states. Finally, we examine the dependence of these effects on the initial qubit state, the form of the unitary interaction and the influence of decoherence.
Dual Smarandache Curves of a Timelike Curve lying on Unit dual Lorentzian Sphere
Kahraman, Tanju; Hüseyin Ugurlu, Hasan
2016-01-01
In this paper, we give a Darboux approximation for the dual Smarandache curves of a timelike curve on the unit dual Lorentzian sphere. Firstly, we define the four types of dual Smarandache curves of a timelike curve lying on the dual Lorentzian sphere.
Closed Timelike Curves in Type II Non-Vacuum Spacetime
International Nuclear Information System (INIS)
Ahmed, Faizuddin
2017-01-01
Here we present a cyclically symmetric non-vacuum spacetime admitting closed timelike curves (CTCs) which appear after a certain instant of time, i.e., a time-machine spacetime. The spacetime is asymptotically flat, free from curvature singularities, and a four-dimensional extension of the Misner space in curved spacetime. The spacetime is of type II in the Petrov classification scheme, and the matter field, a pure radiation field, satisfies the energy condition. (paper)
Closed timelike curves in asymmetrically warped brane universes
Päs, Heinrich; Pakvasa, Sandip; Dent, James; Weiler, Thomas J.
2009-08-01
In asymmetrically-warped spacetimes different warp factors are assigned to space and to time. We discuss causality properties of these warped brane universes and argue that scenarios with two extra dimensions may allow for timelike curves which can be closed via paths in the extra-dimensional bulk. In particular, necessary and sufficient conditions on the metric for the existence of closed timelike curves are presented. We find a six-dimensional warped metric which satisfies the CTC conditions, and where the null, weak and dominant energy conditions are satisfied on the brane (although only the former remains satisfied in the bulk). Such scenarios are interesting, since they open the possibility of experimentally testing the chronology protection conjecture by manipulating on our brane initial conditions of gravitons or hypothetical gauge-singlet fermions (“sterile neutrinos”) which then propagate in the extra dimensions.
Supertube domain walls and elimination of closed timelike curves in string theory
International Nuclear Information System (INIS)
Drukker, Nadav
2004-01-01
We show that some novel physics of supertubes removes closed timelike curves from many supersymmetric spaces which naively suffer from this problem. The main claim is that supertubes naturally form domain walls, so while analytic continuation of the metric would lead to closed timelike curves, across the domain wall the metric is nondifferentiable, and the closed timelike curves are eliminated. In the examples we study, the metric inside the domain wall is always of the Gödel type, while outside the shell it looks like a localized rotating object, often a rotating black hole. Thus this mechanism prevents the appearance of closed timelike curves behind the horizons of certain rotating black holes.
On Closed Timelike Curves and Warped Brane World Models
Directory of Open Access Journals (Sweden)
Slagter Reinoud Jan
2013-09-01
At first glance, it seems possible to construct causality-violating solutions in general relativity theory. The most striking one is the Gott spacetime: two cosmic strings, approaching each other with high velocity, could produce closed timelike curves. It was quickly recognized that this solution violates physical boundary conditions: the effective one-particle generator becomes hyperbolic, so the center of mass is tachyonic. On a 5-dimensional warped spacetime, it seems possible to get an elliptic generator, so no obstruction is encountered and the velocity of the center of mass of the effective particle has an overlap with the Gott region. So a CTC could, in principle, be constructed. However, from the effective 4D field equations on the brane, which are influenced by the projection of the bulk Weyl tensor onto the brane, it follows that no asymptotically conical spacetime is found, so there is no angle deficit as in the 4D counterpart model. This could also explain why we do not observe cosmic strings.
Exact string theory model of closed timelike curves and cosmological singularities
International Nuclear Information System (INIS)
Johnson, Clifford V.; Svendsen, Harald G.
2004-01-01
We study an exact model of string theory propagating in a space-time containing regions with closed timelike curves (CTCs) separated from a finite cosmological region bounded by a big bang and a big crunch. The model is a nontrivial embedding of the Taub-NUT geometry into heterotic string theory with a full conformal field theory (CFT) definition, discovered over a decade ago as a heterotic coset model. Having a CFT definition makes this an excellent laboratory for the study of the stringy fate of CTCs, the Taub cosmology, and the Milne/Misner-type chronology horizon which separates them. In an effort to uncover the role of stringy corrections to such geometries, we calculate the complete set of α′ corrections to the geometry. We observe that the key features of Taub-NUT persist in the exact theory, together with the emergence of a region of space with Euclidean signature bounded by timelike curvature singularities. Although such remarks are necessarily premature, the persistence of these features in the exact geometry suggests that string theory is able to make physical sense of the Milne/Misner singularities and the CTCs, despite their pathological character in general relativity. This may also support the possibility that CTCs may be viable in some physical situations, and may be a natural ingredient in pre-big bang cosmological scenarios.
Quantum field theory in spaces with closed time-like curves
International Nuclear Information System (INIS)
Boulware, D.G.
1992-01-01
Gott spacetime has closed timelike curves, but no locally anomalous stress-energy. A complete orthonormal set of eigenfunctions of the wave operator is found in the special case of a spacetime in which the total deficit angle is 2π. A scalar quantum field theory is constructed using these eigenfunctions. The resultant interacting quantum field theory is not unitary because the field operators can create real, on-shell particles in the acausal region. These particles propagate for finite proper time, accumulating an arbitrary phase, before being annihilated at the same spacetime point as that at which they were created. As a result, the effective potential within the acausal region is complex, and probability is not conserved. The stress tensor of the scalar field is evaluated in the neighborhood of the Cauchy horizon; in the case of a sufficiently small Compton wavelength of the field, the stress tensor is regular and cannot prevent the formation of the Cauchy horizon.
Timelike Killing spinors in seven dimensions
International Nuclear Information System (INIS)
Cariglia, Marco; Conamhna, Oisin A.P. Mac
2004-01-01
We employ the G-structure formalism to study supersymmetric solutions of minimal and SU(2) gauged supergravities in seven dimensions admitting Killing spinors with an associated timelike Killing vector. The most general such Killing spinor defines an SU(3) structure. We deduce necessary and sufficient conditions for the existence of a timelike Killing spinor on the bosonic fields of the theories, and find that such configurations generically preserve one out of 16 supersymmetries. Using our general supersymmetric ansatz we obtain numerous new solutions, including squashed or deformed anti-de Sitter solutions of the gauged theory, and a large class of Gödel-like solutions with closed timelike curves.
Transversal Surfaces of Timelike Ruled Surfaces in Minkowski 3-Space
Önder, Mehmet
2012-01-01
In this study we give definitions and characterizations of transversal surfaces of timelike ruled surfaces. We study some special cases, such as those in which the striction curve is a geodesic, an asymptotic line, or a line of curvature. Moreover, we obtain conditions for transversal surfaces of a timelike ruled surface to be developable.
International Nuclear Information System (INIS)
Goswami, Rituparno; Joshi, Pankaj S.; Vaz, Cenalo; Witten, Louis
2004-01-01
We construct a class of spherically symmetric collapse models in which a naked singularity may develop as the end state of collapse. The matter distribution considered has negative radial and tangential pressures, but the weak energy condition is obeyed throughout. The singularity forms at the center of the collapsing cloud and continues to be visible for a finite time. The duration of visibility depends on the nature of energy distribution. Hence the causal structure of the resulting singularity depends on the nature of the mass function chosen for the cloud. We present a general model in which the naked singularity formed is timelike, neither pointlike nor null. Our work represents a step toward clarifying the necessary conditions for the validity of the Cosmic Censorship Conjecture
Projection-based curve clustering
International Nuclear Information System (INIS)
Auder, Benjamin; Fischer, Aurelie
2012-01-01
This paper focuses on unsupervised curve classification in the context of nuclear industry. At the Commissariat a l'Energie Atomique (CEA), Cadarache (France), the thermal-hydraulic computer code CATHARE is used to study the reliability of reactor vessels. The code inputs are physical parameters and the outputs are time evolution curves of a few other physical quantities. As the CATHARE code is quite complex and CPU time-consuming, it has to be approximated by a regression model. This regression process involves a clustering step. In the present paper, the CATHARE output curves are clustered using a k-means scheme, with a projection onto a lower dimensional space. We study the properties of the empirically optimal cluster centres found by the clustering method based on projections, compared with the 'true' ones. The choice of the projection basis is discussed, and an algorithm is implemented to select the best projection basis among a library of orthonormal bases. The approach is illustrated on a simulated example and then applied to the industrial problem. (authors)
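The projection-then-cluster scheme described in this record can be sketched as follows. The cosine basis, the farthest-point seeding, and all sizes are illustrative assumptions, not the method actually applied to the CATHARE outputs:

```python
import numpy as np

def cluster_curves(curves, n_basis=6, k=3, n_iter=50):
    """Cluster sampled curves by k-means on their coordinates in a
    low-dimensional orthonormal basis (projection step, then clustering).

    curves: (n_curves, n_points) array on a common grid.  The DCT-II-style
    cosine basis and the deterministic seeding are illustrative choices."""
    n, m = curves.shape
    # Orthonormal cosine basis: rows ~ cos(pi*j*(i+0.5)/m), scaled to unit norm.
    t = (np.arange(m) + 0.5) / m
    basis = np.array([np.cos(np.pi * j * t) for j in range(n_basis)])
    basis[0] /= np.sqrt(m)
    basis[1:] *= np.sqrt(2.0 / m)
    coeffs = curves @ basis.T                     # projection onto R^n_basis

    # Deterministic farthest-point seeding, then plain Lloyd iterations.
    centres = [coeffs[0]]
    for _ in range(k - 1):
        d = np.min([((coeffs - c) ** 2).sum(1) for c in centres], axis=0)
        centres.append(coeffs[d.argmax()])
    centres = np.array(centres)
    for _ in range(n_iter):
        d = ((coeffs[:, None, :] - centres[None]) ** 2).sum(-1)
        labels = d.argmin(1)
        for j in range(k):
            if (labels == j).any():
                centres[j] = coeffs[labels == j].mean(0)
    return labels, centres @ basis                # centres back in curve space
```

The returned cluster centres live in the projected space, so mapping them back through the basis gives representative curves that can be compared with the "true" ones, as in the study.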
Space- and time-like superselection rules in conformal quantum field theory
International Nuclear Information System (INIS)
Schroer, Bert
2000-11-01
In conformally invariant quantum field theories one encounters besides the standard DHR superselection theory based on spacelike (Einstein-causal) commutation relations and their Haag duality another timelike (Huygens) based superselection structure. Whereas the DHR theory based on spacelike causality of observables confirmed the Lagrangian internal symmetry picture on the level of the physical principles of local quantum physics, the attempts to understand the timelike based superselection charges associated with the center of the conformal covering group in terms of timelike localized charges lead to a more dynamical role of charges outside the DR theorem and even outside the Coleman-Mandula setting. The ensuing plektonic timelike structure of conformal theories explains the spectrum of the anomalous scale dimensions in terms of admissible braid group representations, similar to the explanation of the possible anomalous spin spectrum expected from the extension of the DHR theory to stringlike d=1+2 plektonic fields. (author)
Timelike Completeness as an Obstruction to C^0-Extensions
Galloway, Gregory J.; Ling, Eric; Sbierski, Jan
2017-11-01
The study of low regularity (in-)extendibility of Lorentzian manifolds is motivated by the question whether a given solution to the Einstein equations can be extended (or is maximal) as a weak solution. In this paper we show that a timelike complete and globally hyperbolic Lorentzian manifold is C^0-inextendible. For the proof we make use of the result, recently established by Sämann (Ann Henri Poincaré 17(6):1429-1455, 2016), that even for continuous Lorentzian manifolds that are globally hyperbolic, there exists a length-maximizing causal curve between any two causally related points.
On the (1 + 3) threading of spacetime with respect to an arbitrary timelike vector field
Energy Technology Data Exchange (ETDEWEB)
Bejancu, Aurel [Kuwait University, Department of Mathematics, P.O. Box 5969, Safat (Kuwait); Calin, Constantin [Technical University "Gh. Asachi", Department of Mathematics, Iasi (Romania)
2015-04-15
We develop a new approach to the (1 + 3) threading of spacetime (M, g) with respect to a congruence of curves defined by an arbitrary timelike vector field. The study is based on spatial tensor fields and on the Riemannian spatial connection ∇*, which behave as 3D geometric objects. We obtain new formulas for the local components of the Ricci tensor field of (M, g) with respect to the threading frame field, in terms of the Ricci tensor field of ∇* and of kinematic quantities. Also, new expressions for time covariant derivatives of kinematic quantities are stated. In particular, a new form of Raychaudhuri's equation enables us to prove Lemma 6.3, which completes a well-known lemma used in the proof of the Penrose-Hawking singularity theorems. Finally, we apply the new (1 + 3) formalism to the study of the dynamics of a Kerr-Newman black hole. (orig.)
MICA: Multiple interval-based curve alignment
Mann, Martin; Kahle, Hans-Peter; Beck, Matthias; Bender, Bela Johannes; Spiecker, Heinrich; Backofen, Rolf
2018-01-01
MICA enables the automatic synchronization of discrete data curves. To this end, characteristic points of the curves' shapes are identified. These landmarks are used within a heuristic curve registration approach to align profile pairs by mapping similar characteristics onto each other. In combination with a progressive alignment scheme, this enables the computation of multiple curve alignments. Multiple curve alignments are needed to derive meaningful representative consensus data of measured time or data series. MICA was already successfully applied to generate representative profiles of tree growth data based on intra-annual wood density profiles or cell formation data. The MICA package provides a command-line and graphical user interface. The R interface enables the direct embedding of multiple curve alignment computation into larger analyses pipelines. Source code, binaries and documentation are freely available at https://github.com/BackofenLab/MICA
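The core of landmark-based curve registration, as described in this record, is a warp that maps one curve's characteristic points onto another's. A toy sketch of that single step (MICA's heuristic landmark detection and progressive multiple alignment are not reproduced here):

```python
import numpy as np

def align_to_reference(x, landmarks, ref_landmarks):
    """Warp sample positions x by the piecewise-linear map that sends each
    landmark (characteristic point of the curve's shape) onto the
    corresponding reference landmark.  Landmarks must be sorted and
    correspond one-to-one; this is the mapping step of landmark-based
    curve registration, sketched with np.interp."""
    return np.interp(x, landmarks, ref_landmarks)
```

Applying such warps pairwise, then merging the aligned profiles, is what allows a representative consensus curve to be computed from a set of measured series.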
Timelike single-logarithm-resummed splitting functions
Energy Technology Data Exchange (ETDEWEB)
Albino, S.; Bolzoni, P.; Kniehl, B.A. [Hamburg Univ. (Germany). 2. Inst. fuer Theoretische Physik; Kotikov, A.V. [Hamburg Univ. (Germany). 2. Inst. fuer Theoretische Physik; Joint Inst. of Nuclear Research, Moscow (Russian Federation). Bogoliubov Lab. of Theoretical Physics
2011-08-15
We calculate the single-logarithmic contributions to the quark singlet and gluon matrix of timelike splitting functions at all orders in the modified minimal-subtraction (MS-bar) scheme. We fix two of the degrees of freedom of this matrix from the analogous results in the massive-gluon regularization scheme by using the relation between that scheme and the MS-bar scheme. We determine this scheme transformation from the double-logarithmic contributions to the timelike splitting functions and the coefficient functions of inclusive particle production in e+e- annihilation, now available in both schemes. The remaining two degrees of freedom are fixed by reasonable physical assumptions. The results agree with the fixed-order results at next-to-next-to-leading order in the literature. (orig.)
NNLO splitting and coefficient functions with time-like kinematics
International Nuclear Information System (INIS)
Mitov, A.; Moch, S.; Vogt, A.; Liverpool Univ.
2006-09-01
We discuss recent results on the three-loop (next-to-next-to-leading order, NNLO) time-like splitting functions of QCD and the two-loop (NNLO) coefficient functions in one-particle-inclusive e+e- annihilation. These results form the basis for extracting fragmentation functions for light and heavy flavors with NNLO accuracy that will be needed at the LHC and ILC. The two-loop calculations have been performed in Mellin space based on a new method, the main features of which we also describe briefly. (orig.)
Linear Titration Curves of Acids and Bases.
Joseph, N R
1959-05-29
The Henderson-Hasselbalch equation, by a simple transformation, becomes pH - pK = pA - pB, where pA and pB are the negative logarithms of acid and base concentrations. Sigmoid titration curves then reduce to straight lines; titration curves of polyelectrolytes, to families of straight lines. The method is applied to the titration of the dipeptide glycyl aminotricarballylic acid, with four titrable groups. Results are expressed as Cartesian and d'Ocagne nomograms. The latter is of a general form applicable to polyelectrolytes of any degree of complexity.
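The transformation in this abstract is easy to check numerically: with pA = -log10[acid] and pB = -log10[base], the Henderson-Hasselbalch relation pH = pK + log10([base]/[acid]) becomes the straight line pH - pK = pA - pB. A minimal sketch (the pK value below is the familiar one for acetic acid; concentrations are illustrative):

```python
import math

def titration_point(pK, acid, base):
    """Return (pA - pB, pH) for one titration point, using the linear form
    pH - pK = pA - pB of the Henderson-Hasselbalch equation."""
    pA = -math.log10(acid)   # negative log of acid concentration
    pB = -math.log10(base)   # negative log of base concentration
    return pA - pB, pK + (pA - pB)

# Equal concentrations give pA - pB = 0, hence pH = pK.
x0, pH0 = titration_point(4.76, 0.05, 0.05)
```

Plotting pH against pA - pB for a series of such points yields the straight line of unit slope that replaces the usual sigmoid titration curve.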
Repulsive and attractive timelike singularities in vacuum cosmologies
International Nuclear Information System (INIS)
Miller, B.D.
1979-01-01
Spherically symmetric cosmologies whose big bang is partially spacelike and partially timelike are constrained to occur only in the presence of certain types of matter, and in such cosmologies the timelike part of the big bang is a negative-mass singularity. In this paper examples are given of cylindrically symmetric cosmologies whose big bang is partially spacelike and partially timelike. These cosmologies are vacuum. In some of them, the timelike part of the big bang is clearly a (generalized) negative-mass singularity, while in others it is a (generalized) positive-mass singularity
Class of continuous timelike curves determines the topology of spacetime
International Nuclear Information System (INIS)
Malament, D.B.
1977-01-01
The title assertion is proven, and two corollaries are established. First, the topology of every past and future distinguishing spacetime is determined by its causal structure. Second, in every spacetime the path topology of Hawking, King, and McCarthy encodes topological, differential, and conformal structure.
Noncrossing timelike singularities of irrotational dust collapse
International Nuclear Information System (INIS)
Liang, E.P.T.
1979-01-01
Known naked singularities in spherical dust collapse are either due to shell crossing or localized to the central world line. They will probably be destroyed by pressure gradients or blue-shift instabilities. To violate the cosmic censorship hypothesis in a more convincing and general context, collapse solutions with naked singularities that are at least non-shell-crossing and nonlocalized need to be constructed. Some results concerning the probable structure of a class of non-shell-crossing and nonlocalized timelike singularities are reviewed. The cylindrical dust model is considered, but this model is not asymptotically flat. To make these noncrossing singularities viable counterexamples to the cosmic censorship hypothesis, the occurrence of such singularities in asymptotically flat collapse needs to be demonstrated. (UK)
On timelike supersymmetric solutions of gauged minimal 5-dimensional supergravity
Energy Technology Data Exchange (ETDEWEB)
Chimento, Samuele; Ortín, Tomás [Instituto de Física Teórica UAM/CSIC,C/Nicolás Cabrera, 13-15, C.University Cantoblanco, E-28049 Madrid (Spain)
2017-04-04
We analyze the timelike supersymmetric solutions of minimal gauged 5-dimensional supergravity for the case in which the Kähler base manifold admits a holomorphic isometry and depends on two real functions satisfying a simple second-order differential equation. Using this general form of the base space, the equations satisfied by the building blocks of the solutions become of, at most, fourth degree and can be solved by simple polynomial ansätze. In this way we construct two 3-parameter families of solutions that contain almost all the timelike supersymmetric solutions of this theory with one angular momentum known so far, and a few more: the (singular) supersymmetric Reissner-Nordström-AdS solutions, the three exact supersymmetric solutions describing the three near-horizon geometries found by Gutowski and Reall, three 1-parameter asymptotically-AdS_5 black-hole solutions with those three near-horizon geometries (Gutowski and Reall's black hole being one of them), three generalizations of the Gödel universe, and a few potentially homogeneous solutions. A key role in finding these solutions is played by our ability to write AdS_5's Kähler base space (overline{CP}^2, or SU(1,2)/U(2)) in three different, yet simple, forms associated with three different isometries. Furthermore, our ansatz for the Kähler metric also allows us to study the dimensional compactification of the theory and its solutions in a systematic way.
NNLO time-like splitting functions in QCD
International Nuclear Information System (INIS)
Moch, S.; Vogt, A.
2008-07-01
We review the status of the calculation of the time-like splitting functions for the evolution of fragmentation functions to the next-to-next-to-leading order in perturbative QCD. By employing relations between space-like and time-like deep-inelastic processes, all quark-quark and the gluon-gluon time-like splitting functions have been obtained to three loops. The corresponding quantities for the quark-gluon and gluon-quark splitting at this order are presently still unknown except for their second Mellin moments. (orig.)
Analysis of velocity planning interpolation algorithm based on NURBS curve
Zhang, Wanjun; Gao, Shanping; Cheng, Xiyan; Zhang, Feng
2017-04-01
NURBS (Non-Uniform Rational B-Spline) interpolation with velocity planning can suffer from long interpolation times and large maximum interpolation errors. To address this, this paper proposes a velocity-planning interpolation algorithm based on the NURBS curve. Firstly, a second-order Taylor expansion is applied to the parameter of the NURBS curve representation. Then, the velocity-planning scheme is combined with NURBS curve interpolation. Finally, simulation results show that the proposed NURBS curve interpolator meets the high-speed and high-accuracy interpolation requirements of CNC systems.
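The parameter update behind such interpolators is the standard second-order Taylor step; a generic sketch for any parametric curve C(u), at constant commanded feedrate v and sampling period T (the paper's own planner and NURBS evaluation details are assumptions not reproduced here):

```python
import numpy as np

def taylor_step(u, v, T, dC, ddC):
    """One second-order Taylor parameter update for interpolating a
    parametric curve C(u) at constant commanded feedrate v with sampling
    period T.  dC and ddC return the first and second derivatives of C.

    u_{k+1} = u_k + vT/|C'| - (vT)^2 (C'.C'') / (2 |C'|^4)
    """
    c1 = np.asarray(dC(u), dtype=float)
    c2 = np.asarray(ddC(u), dtype=float)
    s1 = np.linalg.norm(c1)                              # |C'(u)|
    du = v * T / s1                                      # first-order term
    du -= (v * T) ** 2 * np.dot(c1, c2) / (2.0 * s1**4)  # second-order term
    return u + du
```

On a unit circle |C'| = 1 and C'·C'' = 0, so the step reduces exactly to u + vT; on curves of varying parametric speed, the second-order term keeps the chord traversed per period close to the commanded vT.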
Structural Acoustic Physics Based Modeling of Curved Composite Shells
2017-09-19
NUWC-NPT Technical Report 12,236, 19 September 2017: Structural Acoustic Physics-Based Modeling of Curved Composite Shells, by Rachel E. Hesse. The objective of this study was to use physics-based modeling (PBM) to investigate wave propagation through curved shells that are subjected to acoustic excitation.
Point- and curve-based geometric conflation
López-Vázquez, C.; Manso Callejo, M.A.
2013-01-01
Geometric conflation is the process undertaken to modify the coordinates of features in dataset A in order to match corresponding ones in dataset B. The overwhelming majority of the literature considers the use of points as features to define the transformation. In this article we present a procedure to consider one-dimensional curves also, which are commonly available as Global Navigation Satellite System (GNSS) tracks, routes, coastlines, and so on, in order to define the estimate of the displacements to be applied to each object in A. The procedure involves three steps, including the partial matching of corresponding curves, the computation of some analytical expression, and the addition of a correction term in order to satisfy basic cartographic rules. A numerical example is presented.
Development of a statistically-based lower bound fracture toughness curve (K_IR curve)
International Nuclear Information System (INIS)
Wullaert, R.A.; Server, W.L.; Oldfield, W.; Stahlkopf, K.E.
1977-01-01
A program of initiation fracture toughness measurements on fifty heats of nuclear pressure vessel production materials (including weldments) was used to develop a methodology for establishing a revised reference toughness curve. The new methodology was statistically developed and provides a predefined confidence limit (or tolerance limit) for fracture toughness based upon many heats of a particular type of material. Overall reference curves were developed for seven specific materials using large specimen static and dynamic fracture toughness results. The heat-to-heat variation was removed by normalizing both the fracture toughness and temperature data with the precracked Charpy tanh curve coefficients for each particular heat. The variance and distribution about the curve were determined, and lower bounds of predetermined statistical significance were drawn based upon a Pearson distribution in the lower shelf region (since the data were skewed to high values) and a t-distribution in the transition temperature region (since the data were normally distributed)
Matrix model and time-like linear dilaton matter
International Nuclear Information System (INIS)
Takayanagi, Tadashi
2004-01-01
We consider a matrix model description of the 2d string theory whose matter part is given by a time-like linear dilaton CFT. This is equivalent to the c=1 matrix model with a deformed, but very simple, Fermi surface. Indeed, after a Lorentz transformation, the corresponding 2d spacetime is a conventional linear dilaton background with a time-dependent tachyon field. We show that the tree-level scattering amplitudes in the matrix model perfectly agree with those computed in the world-sheet theory. The classical trajectories of fermions correspond to the decaying D-branes in the time-like linear dilaton CFT. We also discuss the ground ring structure. Furthermore, we study the properties of the time-like Liouville theory by applying this matrix model description. We find that its ground ring structure is very similar to that of the minimal string. (author)
Compact Hilbert Curve Index Algorithm Based on Gray Code
Directory of Open Access Journals (Sweden)
CAO Xuefeng
2016-12-01
The Hilbert curve has the best clustering among the various kinds of space-filling curves, and has been used as an important tool in discrete global grid spatial index design. But there is a lot of redundancy in the standard Hilbert curve index when the data set has large differences between dimensions. In this paper, the construction features of the Hilbert curve are analyzed based on Gray code, and a compact Hilbert curve index algorithm is put forward, which avoids the redundancy problem while preserving the Hilbert curve's clustering. Finally, experimental results show that the compact Hilbert curve index outperforms the standard Hilbert index: their computational complexity is nearly equivalent, but tests on a real data set show that coding time and storage space decrease by about 40%, and sorting speed improves by a factor of nearly 4.3.
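The Gray-code machinery underlying Hilbert curve indexing is small enough to sketch. This is the standard binary-reflected Gray code, in which consecutive integers differ in exactly one bit, and not the paper's compact index algorithm itself:

```python
def gray_encode(n: int) -> int:
    """Binary-reflected Gray code of n: adjacent integers map to codes
    differing in exactly one bit, the property Hilbert curve index
    constructions rely on."""
    return n ^ (n >> 1)

def gray_decode(g: int) -> int:
    """Inverse transform: recover n from its Gray code by cumulative XOR
    of all right shifts."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n
```

Per-dimension Gray codes like these are composed, bit level by bit level, with rotations and reflections to produce the Hilbert index; the compact variant of the paper additionally drops the bit levels that unused dimension ranges would waste.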
Comparison of wind turbines based on power curve analysis
Energy Technology Data Exchange (ETDEWEB)
NONE
1998-02-01
In the study measured power curves for 46 wind turbines were analyzed with the purpose of establishing the basis for a consistent comparison of the efficiency of the wind turbines. Emphasis is on wind turbines above 500 kW rated power, with power curves measured after 1994 according to international recommendations. The available power curves fulfilling these requirements were smoothened according to a procedure developed for the purpose, in such a way that the smoothened power curves are equally as representative as the measured curves. The resulting smoothened power curves are presented in a standardized format for the subsequent processing. Using wind turbine data from the power curve documentation, the analysis results in curves for specific energy production (kWh/m²/yr) versus specific rotor load (kW/m²) for a range of mean wind speeds. On this basis generalized curves for specific annual energy production versus specific rotor load are established for a number of generalized wind turbine concepts. The 46 smoothened standardized power curves presented in the report, the procedure developed to establish them, and the results of the analysis based on them are aimed at providers of measured power curves as well as users of them, including manufacturers, advisors and decision makers. (au)
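The specific-energy figure of merit in such analyses can be sketched as follows: weight a (smoothed) power curve by a wind-speed distribution, sum over bins, and divide by rotor area. The Rayleigh distribution, the 1 m/s bin width, and all numbers below are illustrative assumptions, not the report's procedure:

```python
import math

def specific_energy(power_curve, rotor_area, mean_speed, hours=8760.0):
    """Specific annual energy production (kWh/m^2/yr) from a power curve
    given as {wind speed m/s: power kW} in 1 m/s bins, weighted by a
    Rayleigh distribution with the given mean wind speed."""
    aep = 0.0
    for v, p in power_curve.items():
        # Rayleigh pdf with mean mu: (pi*v / (2*mu^2)) * exp(-pi*v^2 / (4*mu^2))
        pdf = (math.pi * v / (2.0 * mean_speed**2)) * math.exp(
            -math.pi * v**2 / (4.0 * mean_speed**2))
        aep += p * pdf * hours      # kWh/yr contributed by this 1 m/s bin
    return aep / rotor_area
```

Evaluating this for each turbine at several mean wind speeds, and plotting the result against specific rotor load (rated power over rotor area, kW/m²), reproduces the kind of comparison curves the report derives.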
Kaon transverse charge density from space- and timelike data
Mecholsky, N. A.; Meija-Ott, J.; Carmignotto, M.; Horn, T.; Miller, G. A.; Pegg, I. L.
2017-12-01
We used the world data on the kaon form factor to extract the transverse kaon charge density using a dispersion integral of the imaginary part of the kaon form factor in the timelike region. Our analysis includes recent data from e+e- annihilation measurements extending the kinematic reach of the data into the region of high momentum transfers conjugate to the region of short transverse distances. To calculate the transverse density we created a superset of both timelike and spacelike data and developed an empirical parameterization of the kaon form factor. The spacelike set includes two new data points we extracted from existing cross section data. We estimate the uncertainty on the resulting transverse density to be 5% at b = 0.025 fm and significantly better at large distances. New kaon data planned with the 12 GeV Jefferson Lab may have a significant impact on the charge density at distances of b < 0.1 fm.
Form factors and QCD in spacelike and timelike region
International Nuclear Information System (INIS)
A.P. Bakulev; A.V. Radyushkin; N.G. Stefanis
2000-01-01
The authors analyze the basic hard exclusive processes, the πγ*γ transition and the pion and nucleon electromagnetic form factors, and discuss the analytic continuation of QCD formulas from the spacelike region q^2 < 0 to the timelike region q^2 > 0 of the relevant momentum transfers. They describe the construction of the timelike version of the coupling constant α_s. They show that, due to the analytic continuation of the collinear logarithms, each eigenfunction of the evolution equation acquires a phase factor, and they investigate the resulting interference effects, which are shown to be very small. They find no sources for K-factor-type enhancements in the perturbative QCD contribution to the hadronic form factors. To study the soft part of the pion electromagnetic form factor, they use a QCD sum rule inspired model and show that there are non-canceling Sudakov double logarithms which result in a K-factor-type enhancement in the timelike region.
Form factors and QCD in spacelike and timelike regions
International Nuclear Information System (INIS)
Bakulev, A. P.; Radyushkin, A. V.; Stefanis, N. G.
2000-01-01
We analyze the basic hard exclusive processes, the πγ*γ transition and the pion and nucleon electromagnetic form factors, and discuss the analytic continuation of QCD formulas from the spacelike region q^2 < 0 to the timelike region q^2 > 0 of the relevant momentum transfers. We describe the construction of the timelike version of the coupling constant α_s. We show that, due to the analytic continuation of the collinear logarithms, each eigenfunction of the evolution equation acquires a phase factor, and we investigate the resulting interference effects, which are shown to be very small. We find no sources for K-factor-type enhancements in the perturbative QCD contribution to the hadronic form factors. To study the soft part of the pion electromagnetic form factor, we use a QCD sum rule inspired model and show that there are noncanceling Sudakov double logarithms which result in a K-factor-type enhancement in the timelike region.
Timelike Compton scattering off the neutron and generalized parton distributions
Energy Technology Data Exchange (ETDEWEB)
Boer, M.; Guidal, M. [CNRS-IN2P3, Universite Paris-Sud, Institut de Physique Nucleaire d' Orsay, Orsay (France); Vanderhaeghen, M. [Johannes Gutenberg Universitaet, Institut fuer Kernphysik and PRISMA Cluster of Excellence, Mainz (Germany)
2016-02-15
We study the exclusive photoproduction of an electron-positron pair on a neutron target in the Jefferson Lab energy domain. The reaction consists of two processes: Bethe-Heitler and Timelike Compton Scattering. The latter process potentially provides access to the Generalized Parton Distributions (GPDs) of the nucleon. We calculate all the unpolarized, single- and double-spin observables of the reaction and study their sensitivities to GPDs. (orig.)
Reference results for time-like evolution up to next-to-next-to-leading order
Bertone, Valerio; Carrazza, Stefano; Nocera, Emanuele R.
2015-03-01
We present high-precision numerical results for time-like Dokshitzer-Gribov-Lipatov-Altarelli-Parisi evolution in the factorisation scheme, for the first time up to next-to-next-to-leading order accuracy in quantum chromodynamics. First, we scrutinise the analytical expressions of the splitting functions available in the literature, in both x and N space, and check their mutual consistency. Second, we implement time-like evolution in two publicly available, entirely independent and conceptually different numerical codes, in x and N space respectively: the already existing APFEL code, which has been updated with time-like evolution, and the new MELA code, which has been specifically developed to perform the study in this work. Third, by means of a model for fragmentation functions, we provide results for the evolution in different factorisation schemes, for different ratios between renormalisation and factorisation scales and at different final scales. Our results are collected in the format of benchmark tables, which could be used as a reference for global determinations of fragmentation functions in the future.
Design of airborne imaging spectrometer based on curved prism
Nie, Yunfeng; Xiangli, Bin; Zhou, Jinsong; Wei, Xiaoxiao
2011-11-01
A novel moderate-resolution imaging spectrometer spanning the visible to near-infrared wavelength range with a spectral resolution of 10 nm, which combines curved prisms with the Offner configuration, is introduced. Compared to conventional imaging spectrometers based on a dispersive prism or a diffractive grating, this design possesses characteristics of small size, compact structure, low mass, as well as little spectral line curvature (smile) and spectral band curvature (keystone or frown). Besides, the use of compound curved prisms with two or more different materials can greatly reduce the nonlinearity inevitably introduced by prismatic dispersion. The utilization ratio of light radiation is much higher than that of imaging spectrometers of the same type based on the combination of a diffractive grating and concentric optics. In this paper, the Seidel aberration theory of curved prisms and the optical principles of the Offner configuration are explained first. Then the optical design layout of the spectrometer is presented, and the performance of this design, including spot diagrams and MTF, is analyzed. Furthermore, several types of telescope matching this system are provided. This work provides an innovative perspective on the optical system design of airborne spectral imagers and can therefore offer theoretical guidance for imaging spectrometers of the same kind.
Using Spreadsheets to Produce Acid-Base Titration Curves.
Cawley, Martin James; Parkinson, John
1995-01-01
Describes two spreadsheets for producing acid-base titration curves: one uses relatively simple cell formulae that can be written into the spreadsheet by inexperienced students, and the second uses more complex formulae that are best written by the teacher. (JRH)
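The underlying chemistry that such spreadsheet formulae encode can be sketched directly. This is a minimal illustration (not the article's spreadsheet formulae) for a strong acid titrated with a strong base, solving the exact charge balance [H+] - [OH-] = (Ca·Va - Cb·Vb)/(Va + Vb).

```python
import math

# Minimal sketch of a strong acid / strong base titration curve; the exact
# charge balance gives a quadratic in [H+], solved here in closed form.

KW = 1.0e-14  # water autoionisation constant at 25 °C

def ph_strong_strong(ca, va_ml, cb, vb_ml):
    """pH after adding vb_ml of strong base (cb M) to va_ml of strong acid (ca M)."""
    d = (ca * va_ml - cb * vb_ml) / (va_ml + vb_ml)  # net analytical [H+]
    h = (d + math.sqrt(d * d + 4.0 * KW)) / 2.0      # positive root of h^2 - d*h - Kw = 0
    return -math.log10(h)

# 25 mL of 0.1 M HCl titrated with 0.1 M NaOH: acidic start, pH 7 at the
# equivalence point, basic past it
for vb in (0.0, 12.5, 25.0, 37.5):
    print(vb, round(ph_strong_strong(0.1, 25.0, 0.1, vb), 2))
```

The same closed-form expression is exactly what a "complex formula" spreadsheet cell would compute for each row of added titrant volume.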
Qualitative Comparison of Contraction-Based Curve Skeletonization Methods
Sobiecki, André; Yasan, Haluk C.; Jalba, Andrei C.; Telea, Alexandru C.
2013-01-01
In recent years, many new methods have been proposed for extracting curve skeletons of 3D shapes, using a mesh-contraction principle. However, it is still unclear how these methods perform with respect to each other, and with respect to earlier voxel-based skeletonization methods, from the viewpoint
Prediction of flow boiling curves based on artificial neural network
International Nuclear Information System (INIS)
Wu Junmei; Xi'an Jiaotong Univ., Xi'an; Su Guanghui
2007-01-01
The effects of the main system parameters on flow boiling curves were analyzed by using an artificial neural network (ANN) based on a database selected from the 1960s. The input parameters of the ANN are system pressure, mass flow rate, inlet subcooling, wall superheat and steady/transition boiling, and the output parameter is heat flux. The results obtained by the ANN show that the heat flux increases with increasing inlet subcooling for all heat transfer modes. Mass flow rate has no significant effect on nucleate boiling curves. The transition boiling and film boiling heat fluxes increase with increasing mass flow rate. The pressure plays a predominant role and improves heat transfer in all boiling regions except film boiling. There are slight differences between the steady and the transient boiling curves in all boiling regions except the nucleate one. (authors)
Nucleon electromagnetic structure studies in the spacelike and timelike regions
Energy Technology Data Exchange (ETDEWEB)
Guttmann, Julia
2013-07-23
The thesis investigates the nucleon structure probed by the electromagnetic interaction. Among the most basic observables reflecting the electromagnetic structure of the nucleon are the form factors, which have been studied by means of elastic electron-proton scattering with ever increasing precision for several decades. In the timelike region, corresponding to proton-antiproton annihilation into an electron-positron pair, the present experimental information is much less accurate. However, high-precision form factor measurements are planned for the near future. About 50 years after the first pioneering measurements of the electromagnetic form factors, polarization experiments stirred up the field, since their results were found to be in striking contradiction to the findings of previous form factor investigations from unpolarized measurements. Triggered by the conflicting results, a whole new field emerged studying the influence of two-photon exchange corrections to elastic electron-proton scattering, which appeared to be the most likely explanation of the discrepancy. The main part of this thesis deals with theoretical studies of two-photon exchange, which is investigated particularly with regard to form factor measurements in the spacelike as well as the timelike region. An extraction of the two-photon amplitudes in the spacelike region through a combined analysis using the results of unpolarized cross section measurements and polarization experiments is presented. Furthermore, predictions of the two-photon exchange effects on the e⁺p/e⁻p cross section ratio are given for several new experiments, which are currently ongoing. The two-photon exchange corrections are also investigated in the timelike region, in the process p p̄ → e⁺e⁻, by means of two factorization approaches. These corrections are found to be smaller than those obtained for the spacelike scattering process. The influence of the two-photon exchange corrections on
Nucleon electromagnetic structure studies in the spacelike and timelike regions
International Nuclear Information System (INIS)
Guttmann, Julia
2013-01-01
The thesis investigates the nucleon structure probed by the electromagnetic interaction. Among the most basic observables reflecting the electromagnetic structure of the nucleon are the form factors, which have been studied by means of elastic electron-proton scattering with ever increasing precision for several decades. In the timelike region, corresponding to proton-antiproton annihilation into an electron-positron pair, the present experimental information is much less accurate. However, high-precision form factor measurements are planned for the near future. About 50 years after the first pioneering measurements of the electromagnetic form factors, polarization experiments stirred up the field, since their results were found to be in striking contradiction to the findings of previous form factor investigations from unpolarized measurements. Triggered by the conflicting results, a whole new field emerged studying the influence of two-photon exchange corrections to elastic electron-proton scattering, which appeared to be the most likely explanation of the discrepancy. The main part of this thesis deals with theoretical studies of two-photon exchange, which is investigated particularly with regard to form factor measurements in the spacelike as well as the timelike region. An extraction of the two-photon amplitudes in the spacelike region through a combined analysis using the results of unpolarized cross section measurements and polarization experiments is presented. Furthermore, predictions of the two-photon exchange effects on the e⁺p/e⁻p cross section ratio are given for several new experiments, which are currently ongoing. The two-photon exchange corrections are also investigated in the timelike region, in the process p p̄ → e⁺e⁻, by means of two factorization approaches. These corrections are found to be smaller than those obtained for the spacelike scattering process. The influence of the two-photon exchange corrections on cross section
On exclusive reactions in the time-like region
Kroll, P; Schürmann, M; Schweiger, W; Pilsner, Th.
1993-01-01
The electromagnetic form factors of the proton in the time-like region and two-photon annihilations into proton-antiproton are investigated. To calculate these processes at moderately large $s$ we use a variant of the Brodsky-Lepage hard-scattering formalism where diquarks are considered as quasi-elementary constituents of baryons. The proton wave function and the parameters controlling the diquark contributions are determined from fits to space-like data. We also comment on the decay $\eta_c \to p\bar{p}$.
Thermodynamic Activity-Based Progress Curve Analysis in Enzyme Kinetics.
Pleiss, Jürgen
2018-03-01
Macrokinetic Michaelis-Menten models based on thermodynamic activity provide insights into enzyme kinetics because they separate substrate-enzyme from substrate-solvent interactions. Kinetic parameters are estimated from experimental progress curves of enzyme-catalyzed reactions. Three pitfalls are discussed: deviations between thermodynamic and concentration-based models, product effects on the substrate activity coefficient, and product inhibition. Copyright © 2017 Elsevier Ltd. All rights reserved.
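The progress curves the abstract refers to can be sketched numerically. Below is a minimal concentration-based Michaelis-Menten integration; the activity-based models discussed in the paper would additionally multiply the substrate concentration by an activity coefficient, which is set to 1 here, so this is only the baseline the paper generalises.

```python
import numpy as np

# Hedged sketch: substrate depletion dS/dt = -Vmax*S/(Km + S) integrated with
# a simple explicit Euler scheme. Parameter values are illustrative.

def progress_curve(s0, vmax, km, t_end, dt=1e-3):
    """Return times and substrate concentrations along the progress curve."""
    n = int(t_end / dt)
    t = np.linspace(0.0, t_end, n + 1)
    s = np.empty(n + 1)
    s[0] = s0
    for i in range(n):
        s[i + 1] = max(s[i] - dt * vmax * s[i] / (km + s[i]), 0.0)
    return t, s

t, s = progress_curve(s0=1.0, vmax=0.5, km=0.2, t_end=5.0)
print(round(s[-1], 4))  # substrate remaining after 5 time units
```

Kinetic parameters (Vmax, Km) are then estimated by fitting such simulated curves to the measured progress data.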
Reliability Based Geometric Design of Horizontal Circular Curves
Rajbongshi, Pabitra; Kalita, Kuldeep
2018-06-01
Geometric design of a horizontal circular curve primarily involves the radius of the curve and the stopping sight distance at the curve section. The minimum radius is decided based on the lateral thrust exerted on the vehicles, and the minimum stopping sight distance is provided to maintain safety in the longitudinal direction of the vehicles. The available sight distance at a site can be regulated by changing the radius and the middle ordinate at the curve section. Both radius and sight distance depend on design speed. The speed of vehicles at any road section is a variable parameter, and therefore the 98th percentile speed is normally taken as the design speed. This work presents a probabilistic approach for evaluating stopping sight distance, considering the variability of all input parameters of sight distance. It is observed that the 98th percentile sight distance value is much lower than the sight distance corresponding to the 98th percentile speed. The distribution of the sight distance parameter is also studied and found to follow a lognormal distribution. Finally, reliability-based design charts are presented for both plain and hill regions, considering the effect of lateral thrust.
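The comparison the abstract draws can be reproduced with a toy Monte Carlo: treat all stopping-sight-distance (SSD) inputs as random and compare the 98th-percentile SSD against the deterministic SSD computed at the 98th-percentile speed with conservative values of the other inputs. The distributions below are illustrative assumptions, not the paper's data.

```python
import random

# Hypothetical Monte Carlo sketch of probabilistic stopping sight distance.
# SSD = reaction distance + braking distance, with speed v (m/s), reaction
# time t_r (s) and friction coefficient f all treated as random variables.

random.seed(42)
G = 9.81  # m/s^2

def ssd(v, t_reaction, f):
    """Stopping sight distance in metres."""
    return v * t_reaction + v * v / (2.0 * G * f)

samples, speeds = [], []
for _ in range(100_000):
    v = max(random.gauss(22.0, 3.0), 1.0)    # ~80 km/h mean speed
    t_r = max(random.gauss(2.5, 0.4), 0.5)   # perception-reaction time
    f = max(random.gauss(0.35, 0.05), 0.1)   # longitudinal friction
    speeds.append(v)
    samples.append(ssd(v, t_r, f))

samples.sort(); speeds.sort()
p98_ssd = samples[int(0.98 * len(samples))]   # 98th percentile of SSD itself
v98 = speeds[int(0.98 * len(speeds))]         # 98th percentile speed
det_ssd = ssd(v98, 2.5 + 2 * 0.4, 0.35 - 2 * 0.05)  # conservative inputs
print(round(p98_ssd, 1), round(det_ssd, 1))
```

With these assumed distributions the probabilistic 98th-percentile SSD comes out well below the deterministic value, which is the qualitative effect the study reports.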
Reliability Based Geometric Design of Horizontal Circular Curves
Rajbongshi, Pabitra; Kalita, Kuldeep
2018-03-01
Geometric design of a horizontal circular curve primarily involves the radius of the curve and the stopping sight distance at the curve section. The minimum radius is decided based on the lateral thrust exerted on the vehicles, and the minimum stopping sight distance is provided to maintain safety in the longitudinal direction of the vehicles. The available sight distance at a site can be regulated by changing the radius and the middle ordinate at the curve section. Both radius and sight distance depend on design speed. The speed of vehicles at any road section is a variable parameter, and therefore the 98th percentile speed is normally taken as the design speed. This work presents a probabilistic approach for evaluating stopping sight distance, considering the variability of all input parameters of sight distance. It is observed that the 98th percentile sight distance value is much lower than the sight distance corresponding to the 98th percentile speed. The distribution of the sight distance parameter is also studied and found to follow a lognormal distribution. Finally, reliability-based design charts are presented for both plain and hill regions, considering the effect of lateral thrust.
Investigation of the bases for use of the KIc curve
International Nuclear Information System (INIS)
McCabe, D.E.; Nanstad, R.K.; Rosenfield, A.R.; Marschall, C.W.; Irwin, G.R.
1991-01-01
Title 10 of the Code of Federal Regulations, Part 50 (10CFR50), Appendix G, establishes the bases for setting allowable pressure and temperature limits on reactors during heatup and cooldown operation. Both the K_Ic and K_Ia curves are utilized in prescribed ways to maintain reactor vessel structural integrity in the presence of an assumed or actual flaw and operating stresses. Currently, the code uses the K_Ia curve, normalized to the RT_NDT, to represent the fracture toughness trend for unirradiated and irradiated pressure vessel steels. Although this is clearly a conservative policy, it has been suggested that the K_Ic curve is the more appropriate for application to a non-accident operating condition. A number of uncertainties have been identified, however, that might convert normal operating transients into a dynamic loading situation. Those include the introduction of running cracks from local brittle zones, crack pop-ins, reduced toughness from arrested cleavage cracks, description of the K_Ic curve for irradiated materials, and other related unresolved issues relative to elastic-plastic fracture mechanics. Some observations and conclusions can be made regarding various aspects of those uncertainties, and they are discussed in this paper. A discussion of further work required and under way to address the remaining uncertainties is also presented.
Statistical data processing of mobility curves of univalent weak bases
Czech Academy of Sciences Publication Activity Database
Šlampová, Andrea; Boček, Petr
2008-01-01
Vol. 29, No. 2 (2008), pp. 538-541. ISSN 0173-0835. R&D Projects: GA AV ČR IAA400310609; GA ČR GA203/05/2106. Institutional research plan: CEZ:AV0Z40310501. Keywords: mobility curve; univalent weak bases; statistical evaluation. Subject RIV: CB - Analytical Chemistry, Separation. Impact factor: 3.509, year: 2008
THE CPA QUALIFICATION METHOD BASED ON THE GAUSSIAN CURVE FITTING
Directory of Open Access Journals (Sweden)
M.T. Adithia
2015-01-01
The Correlation Power Analysis (CPA) attack is an attack on cryptographic devices, especially smart cards. The results of the attack are correlation traces. Based on the correlation traces, an evaluation is done to observe whether significant peaks appear in the traces or not. The evaluation is done manually, by experts. If significant peaks appear, then the smart card is not considered secure, since it is assumed that the secret key is revealed. We develop a method that objectively detects peaks and decides which peaks are significant. We conclude that using the Gaussian curve fitting method, the subjective qualification of peak significance can be made objective, so that better decisions can be taken by security experts. We also conclude that the Gaussian curve fitting method is able to show the influence of peak size, especially width and height, on the significance of a particular peak.
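The core step of such a method can be sketched simply: fit a Gaussian a·exp(-(x-mu)²/(2σ²)) to a peak in a trace. This illustrative sketch (not the authors' implementation) uses the classical trick that the logarithm of a clean Gaussian is a parabola, so an ordinary polynomial fit recovers height, position and width.

```python
import numpy as np

# Illustrative Gaussian peak fit via the log-parabola (Caruana) method.
# log y = ln(a) - (x - mu)^2 / (2 sigma^2) is quadratic in x, so np.polyfit
# with degree 2 yields the Gaussian parameters in closed form.

def fit_gaussian(x, y):
    """Return (a, mu, sigma) of the best-fit Gaussian through positive y."""
    c2, c1, c0 = np.polyfit(x, np.log(y), 2)   # log y = c2 x^2 + c1 x + c0
    sigma = np.sqrt(-1.0 / (2.0 * c2))
    mu = c1 * sigma ** 2
    a = np.exp(c0 + mu ** 2 / (2.0 * sigma ** 2))
    return a, mu, sigma

# Synthetic noise-free correlation peak
x = np.linspace(-1.0, 1.0, 101)
y = 0.8 * np.exp(-(x - 0.2) ** 2 / (2 * 0.15 ** 2))
a, mu, sigma = fit_gaussian(x, y)
print(round(a, 3), round(mu, 3), round(sigma, 3))  # → 0.8 0.2 0.15
```

On real, noisy CPA traces a robust nonlinear fit would be preferred; the fitted width and height then feed a significance criterion as described in the abstract.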
Singularities of lightcone pedals of spacelike curves in Lorentz-Minkowski 3-space
Directory of Open Access Journals (Sweden)
Chen Liang
2016-01-01
In this paper, geometric properties of spacelike curves on a timelike surface in Lorentz-Minkowski 3-space are investigated by applying the singularity theory of smooth functions from the contact viewpoint.
Satellite altimetry based rating curves throughout the entire Amazon basin
Paris, A.; Calmant, S.; Paiva, R. C.; Collischonn, W.; Silva, J. S.; Bonnet, M.; Seyler, F.
2013-05-01
The Amazonian basin is the largest hydrological basin in the world. In recent years, the basin has experienced an unusual succession of extreme droughts and floods, whose origin is still a matter of debate. Yet the amount of data available is poor, over both time and space scales, due to factors such as the basin's size and difficulty of access. One of the major obstacles is obtaining discharge series distributed over the entire basin. Satellite altimetry can be used to improve our knowledge of the hydrological stream flow conditions in the basin, through rating curves. Rating curves are mathematical relationships between stage and discharge at a given place. The common way to determine the parameters of the relationship is to compute a non-linear regression between the discharge and stage series. In this study, the discharge data were obtained by simulation throughout the entire basin using the MGB-IPH model with TRMM Merge input rainfall data and assimilation of gage data, run from 1998 to 2010. The stage dataset is made of ~800 altimetry series at ENVISAT and JASON-2 virtual stations. Altimetry series span between 2002 and 2010. In the present work we present the benefits of using stochastic methods instead of probabilistic ones to determine a dataset of rating curve parameters which are consistent throughout the entire Amazon basin. The rating curve parameters have been computed using a parameter optimization technique based on a Markov Chain Monte Carlo sampler and a Bayesian inference scheme. This technique provides an estimate of the best parameters for the rating curve, but also their posterior probability distribution, allowing the determination of a credibility interval for the rating curve. The rating curve determination also includes the error in the discharge estimates from the MGB-IPH model. These MGB-IPH errors come from either errors in the discharge derived from the gage readings or errors in the satellite rainfall estimates. The present
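The stage-discharge relation being calibrated is typically a power law, Q = a·(h - h0)^b. As a hedged illustration (the study instead samples all parameters with an MCMC scheme), the sketch below assumes the datum offset h0 is known and fits (a, b) by ordinary least squares in log-log space on synthetic data.

```python
import numpy as np

# Hedged sketch of rating-curve fitting: Q = a * (h - h0)^b with h0 assumed
# known, so log Q is linear in log(h - h0). The data below are synthetic.

def fit_rating_curve(h, q, h0):
    """Return (a, b) from a log-log least-squares fit of q vs (h - h0)."""
    b, log_a = np.polyfit(np.log(h - h0), np.log(q), 1)
    return np.exp(log_a), b

rng = np.random.default_rng(0)
h = np.linspace(2.0, 10.0, 40)                     # stage (m), synthetic
q_true = 15.0 * (h - 1.5) ** 1.7                   # "true" discharge (m^3/s)
q_obs = q_true * np.exp(rng.normal(0.0, 0.02, h.size))  # 2% multiplicative noise
a, b = fit_rating_curve(h, q_obs, h0=1.5)
print(round(a, 1), round(b, 2))
```

An MCMC treatment, as in the study, would instead sample (a, b, h0) jointly and return posterior distributions rather than point estimates, which is what yields the credibility intervals mentioned above.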
Timelike geodesics around a charged spherically symmetric dilaton black hole
Directory of Open Access Journals (Sweden)
Blaga C.
2015-01-01
In this paper we study the timelike geodesics around a spherically symmetric charged dilaton black hole. The trajectories around the black hole are classified using the effective potential of a free test particle. This qualitative approach enables us to determine the type of orbit described by a test particle without solving the equations of motion, provided the parameters of the black hole and the particle are known. The connections between these parameters and the type of orbit described by the particle are obtained. To visualize the orbits, we solve numerically the equation of motion for different values of the parameters involved in our analysis. The effective potential of a free test particle looks different for a non-extremal and an extremal black hole; therefore we have examined these two types of black holes separately.
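The effective-potential classification can be illustrated in the simplest limit. As a hedged stand-in for the charged dilaton case (whose potential has extra charge-dependent terms), the sketch below uses the zero-charge Schwarzschild potential V_eff(r) = (1 - 2M/r)(1 + L²/r²) in units G = c = 1, and locates the unstable and stable circular orbits as its extrema.

```python
import numpy as np

# Illustrative sketch: orbits of a timelike test particle are classified by
# the extrema of the effective potential. For M = 1, L = 4 the analytic
# extrema are r = (L^2 ± sqrt(L^4 - 12 L^2 M^2)) / (2M) = 4 and 12.

def v_eff(r, m=1.0, ell=4.0):
    """Schwarzschild effective potential (per unit mass) for timelike geodesics."""
    return (1.0 - 2.0 * m / r) * (1.0 + ell ** 2 / r ** 2)

r = np.linspace(2.2, 60.0, 5000)
v = v_eff(r)
i_max = int(np.argmax(v[: len(v) // 2]))      # inner barrier: unstable circular orbit
i_min = i_max + int(np.argmin(v[i_max:]))     # potential well: stable circular orbit
print(round(r[i_max], 2), round(r[i_min], 2))
```

A particle with energy below the barrier height and above the well minimum describes a bound orbit; above the barrier it plunges into the black hole, which is exactly the kind of classification the abstract performs for the dilaton case.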
Asymptotically AdS spacetimes with a timelike Kasner singularity
Energy Technology Data Exchange (ETDEWEB)
Ren, Jie [Racah Institute of Physics, The Hebrew University of Jerusalem, Jerusalem 91904 (Israel)
2016-07-21
Exact solutions to Einstein’s equations for holographic models are presented and studied. The IR geometry has a timelike cousin of the Kasner singularity, which is the less generic case of the BKL (Belinski-Khalatnikov-Lifshitz) singularity, and the UV is asymptotically AdS. This solution describes a holographic RG flow between them. The solution’s appearance is an interpolation between the planar AdS black hole and the AdS soliton. The causality constraint is always satisfied. The entanglement entropy and Wilson loops are discussed. The boundary condition for the current-current correlation function and the Laplacian in the IR is examined. There is no infalling wave in the IR, but instead, there is a normalizable solution in the IR. In a special case, a hyperscaling-violating geometry is obtained after a dimensional reduction.
Modelling of acid-base titration curves of mineral assemblages
Directory of Open Access Journals (Sweden)
Stamberg Karel
2016-01-01
The modelling of acid-base titration curves of mineral assemblages was studied with the aim of obtaining the basic parameters of their surface sites. The known modelling approaches, component additivity (CA) and generalized composite (GC), and three types of different assemblages (fucoidic sandstones, sedimentary rock-clay and bentonite-magnetite samples) were used. In contrast to the GC approach, whose application posed no difficulties, the problem with the CA approach lay in the credibility and accessibility of the parameters characterizing the individual mineralogical components.
Covariant description of kinetic freeze-out through a finite time-like layer
International Nuclear Information System (INIS)
Molnar, E; Csernai, L P; Magas, V K; Lazar, Zs I; NyIri, A; Tamosiunas, K
2007-01-01
The freeze-out (FO) problem is addressed for a covariant FO probability and a finite FO layer with a time-like normal vector continuing the line of studies introduced in Molnar et al (2006 Phys. Rev. C 74 024907). The resulting post-FO momentum distribution functions are presented and discussed. We show that in general the post-FO distributions are non-thermal and asymmetric distributions even for time-like FO situations
Decline curve based models for predicting natural gas well performance
Directory of Open Access Journals (Sweden)
Arash Kamari
2017-06-01
The productivity of a gas well declines over its production life, eventually to the point where production can no longer satisfy economic requirements. To address this, the production performance of gas wells should be predicted by applying reliable methods to analyse the decline trend. Therefore, reliable models are developed in this study on the basis of powerful artificial intelligence techniques, viz. the artificial neural network (ANN) modelling strategy, the least square support vector machine (LSSVM) approach, the adaptive neuro-fuzzy inference system (ANFIS), and the decision tree (DT) method, for the prediction of cumulative gas production as well as initial decline rate multiplied by time, as functions of the Arps decline curve exponent and the ratio of initial gas flow rate over total gas flow rate. It is concluded that the results obtained from the models developed in the current study are in satisfactory agreement with actual gas well production data. Furthermore, the comparative study performed demonstrates that the LSSVM strategy is superior to the other models investigated for the prediction of both cumulative gas production and initial decline rate multiplied by time.
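The classical decline-curve relations that such models are benchmarked against are the Arps equations. A minimal sketch, with illustrative well parameters (not from the study): hyperbolic rate q(t) = qi / (1 + b·Di·t)^(1/b) and its analytic cumulative-production integral.

```python
# Minimal sketch of Arps hyperbolic decline (0 < b < 1), illustrative units.

def arps_rate(qi, di, b, t):
    """Production rate at time t for initial rate qi and initial decline di."""
    return qi / (1.0 + b * di * t) ** (1.0 / b)

def arps_cumulative(qi, di, b, t):
    """Cumulative production up to time t (closed-form integral of the rate)."""
    return qi / (di * (1.0 - b)) * (1.0 - (1.0 + b * di * t) ** (1.0 - 1.0 / b))

qi, di, b = 1000.0, 0.3, 0.5   # hypothetical well: initial rate, decline, exponent
print(round(arps_rate(qi, di, b, 5.0), 1))        # → 326.5
print(round(arps_cumulative(qi, di, b, 5.0), 1))  # → 2857.1
```

The machine-learning models in the abstract effectively learn this mapping (and its deviations in real wells) from the decline exponent and rate ratio, instead of assuming the closed form.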
Estimation of Curve Tracing Time in Supercapacitor based PV Characterization
Basu Pal, Sudipta; Das Bhattacharya, Konika; Mukherjee, Dipankar; Paul, Debkalyan
2017-08-01
Smooth and noise-free characterisation of photovoltaic (PV) generators has been revisited with renewed interest in view of large PV arrays making inroads into the urban sector of major developing countries. Such practice has recently been challenged by the choice of a suitable data acquisition system and by the lack of a supporting theoretical analysis to justify the accuracy of curve tracing. However, the use of a selected bank of supercapacitors can mitigate the said problems to a large extent. Assuming a piecewise-linear analysis of the V-I characteristics of a PV generator, an accurate analysis of curve plotting time has been possible. The analysis has been extended to consider the effect of the equivalent series resistance of the supercapacitor, leading to increased accuracy (90-95%) of curve plotting times.
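The piecewise-linear timing estimate can be sketched as follows: with a supercapacitor as load, each segment of the I-V curve charges the capacitor at a roughly constant current, so the segment time is dt = C·dV/I and the full trace time is the sum over segments. All numbers below are illustrative assumptions, not the paper's data, and the equivalent-series-resistance correction is omitted.

```python
# Hypothetical sketch of supercapacitor-based I-V curve tracing time,
# assuming a piecewise-linear V-I characteristic of the PV generator.

def trace_time(c_farad, v_points, i_points):
    """Total time to sweep the I-V curve, summing C*dV/I over linear segments."""
    total = 0.0
    for k in range(len(v_points) - 1):
        i_avg = 0.5 * (i_points[k] + i_points[k + 1])  # mean segment current
        total += c_farad * (v_points[k + 1] - v_points[k]) / i_avg
    return total

# Illustrative 60-cell module: ~8 A short-circuit, ~36 V open-circuit,
# traced with a 58 F supercapacitor bank
v = [0.0, 10.0, 20.0, 30.0, 34.0, 36.0]
i = [8.0, 7.9, 7.7, 6.5, 3.0, 0.2]
print(round(trace_time(58.0, v, i), 1), "seconds")
```

Note how the final, low-current segment near open circuit dominates the trace time, which is why sizing the capacitor bank matters.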
MOND rotation curves for spiral galaxies with Cepheid-based distances
Bottema, R; Pestana, JLG; Rothberg, B; Sanders, RH
2002-01-01
Rotation curves for four spiral galaxies with recently determined Cepheid-based distances are reconsidered in terms of modified Newtonian dynamics (MOND). For two of the objects, NGC 2403 and NGC 7331, the rotation curves predicted by MOND are compatible with the observed curves when these galaxies
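The MOND prediction behind such fits can be sketched in closed form. As a hedged illustration (not the paper's calculation, which uses the observed baryonic mass distributions), with the "simple" interpolating function mu(x) = x/(1+x) the MOND acceleration g solves g² - g_N·g - g_N·a0 = 0, where g_N is the Newtonian acceleration; for a point mass this gives the famous flat asymptotic speed v_inf = (G·M·a0)^(1/4).

```python
import numpy as np

# Hedged MOND sketch for a point-mass "galaxy" with the simple mu-function.

A0 = 1.2e-10        # MOND acceleration scale, m/s^2
G = 6.674e-11       # gravitational constant, SI
M_SUN = 1.989e30    # solar mass, kg

def mond_speed(r_m, mass_kg):
    """Circular speed (m/s) at radius r: positive root of g^2 - gN*g - gN*a0 = 0."""
    g_n = G * mass_kg / r_m ** 2
    g = 0.5 * (g_n + np.sqrt(g_n ** 2 + 4.0 * g_n * A0))
    return np.sqrt(g * r_m)

# A 5e10 solar-mass model from ~1 to ~100 kpc: the curve flattens at v_inf
r = np.logspace(19.5, 21.5, 5)            # radii in metres
v_kms = mond_speed(r, 5e10 * M_SUN) / 1e3
v_inf = (G * 5e10 * M_SUN * A0) ** 0.25 / 1e3
print(np.round(v_kms, 1), round(v_inf, 1))
```

The transition from the inner Newtonian fall-off to the flat outer curve happens where g_N drops below a0, with no dark-matter halo parameter to tune, which is what makes Cepheid-based distances a sharp test of the prediction.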
Fractal based curves in musical creativity: A critical annotation
Georgaki, Anastasia; Tsolakis, Christos
In this article we examine fractal curves and synthesis algorithms in musical composition and research. First, we trace the evolution of different approaches to the use of fractals in music since the 1980s through a literature review. Furthermore, we review representative fractal algorithms and the platforms that implement them. Properties such as self-similarity (pink noise), correlation, memory (related to the notion of Brownian motion) or non-correlation at multiple levels (white noise) can be used to develop a hierarchy of criteria for analyzing different layers of musical structure. L-systems can be applied to the modelling of melody in different musical cultures as well as to the investigation of musical perception principles. Finally, we propose a critical investigation approach for the use of artificial or natural fractal curves in systematic musicology.
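The L-system idea mentioned above can be sketched in a few lines: rewrite rules expand a seed string generation by generation, and the symbols are then read as scale degrees. The rules and pitch mapping below are invented for illustration, not taken from any of the surveyed platforms.

```python
# Toy L-system melody generator: self-similar structure emerges because every
# generation contains rewritten copies of the previous ones.

RULES = {"A": "AB", "B": "AC", "C": "A"}   # hypothetical rewriting rules
PITCH = {"A": 0, "B": 2, "C": 4}           # map symbols to scale steps

def expand(axiom, generations):
    """Apply the rewrite rules `generations` times to the axiom string."""
    s = axiom
    for _ in range(generations):
        s = "".join(RULES.get(ch, ch) for ch in s)
    return s

melody = [PITCH[ch] for ch in expand("A", 5)]
print(len(melody), melody[:8])  # → 24 [0, 2, 0, 4, 0, 2, 0, 0]
```

Swapping the rules or the symbol-to-pitch mapping changes the melodic "culture" of the output while keeping the hierarchical, self-similar construction, which is the property the article highlights.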
Towards three-loop QCD corrections to the time-like splitting functions
International Nuclear Information System (INIS)
Gituliar, O.; Moch, S.
2015-05-01
We report on the status of a direct computation of the time-like splitting functions at next-to-next-to-leading order in QCD. Time-like splitting functions govern the collinear kinematics of inclusive hadron production and the evolution of the parton fragmentation distributions. Current knowledge about them at three loops has been inferred by means of crossing symmetry from their related space-like counterparts, which has left certain parts of the off-diagonal quark-gluon splitting function undetermined. This motivates an independent calculation from first principles. We review the tools and methods which are applied to attack the problem.
Single-Spin Polarization Effects and the Determination of Timelike Proton Form Factors
Energy Technology Data Exchange (ETDEWEB)
Brodsky, S
2003-10-24
We show that measurements of the proton's polarization in e⁺e⁻ → p p̄ strongly discriminate between analytic forms of models which fit the proton form factors in the spacelike region. In particular, the single-spin asymmetry normal to the scattering plane measures the relative phase difference between the timelike G_E and G_M form factors. The expected proton polarization in the timelike region is large, of the order of several tens of percent.
Ghazavi, Reza; Moafi Rabori, Ali; Ahadnejad Reveshty, Mohsen
2016-01-01
Estimating the design storm from rainfall intensity-duration-frequency (IDF) curves is an important step in the hydrologic planning of urban areas. The main aim of this study was to estimate rainfall intensities of the Zanjan city watershed based on the overall relationship of rainfall IDF curves and an appropriate model of hourly rainfall estimation (Sherman method, Ghahreman and Abkhezr method). The hydrologic and hydraulic impacts of changes in rainfall IDF curves on flood properties were evaluated via Stormw...
A volume-based method for denoising on curved surfaces
Biddle, Harry; von Glehn, Ingrid; Macdonald, Colin B.; Marz, Thomas
2013-01-01
We demonstrate a method for removing noise from images or other data on curved surfaces. Our approach relies on in-surface diffusion: we formulate both the Gaussian diffusion and Perona-Malik edge-preserving diffusion equations in a surface-intrinsic way. Using the Closest Point Method, a recent technique for solving partial differential equations (PDEs) on general surfaces, we obtain a very simple algorithm where we merely alternate a time step of the usual Gaussian diffusion (and similarly Perona-Malik) in a small 3D volume containing the surface with an interpolation step. The method uses a closest point function to represent the underlying surface and can treat very general surfaces. Experimental results include image filtering on smooth surfaces, open surfaces, and general triangulated surfaces. © 2013 IEEE.
A volume-based method for denoising on curved surfaces
Biddle, Harry
2013-09-01
We demonstrate a method for removing noise from images or other data on curved surfaces. Our approach relies on in-surface diffusion: we formulate both the Gaussian diffusion and Perona-Malik edge-preserving diffusion equations in a surface-intrinsic way. Using the Closest Point Method, a recent technique for solving partial differential equations (PDEs) on general surfaces, we obtain a very simple algorithm where we merely alternate a time step of the usual Gaussian diffusion (and similarly Perona-Malik) in a small 3D volume containing the surface with an interpolation step. The method uses a closest point function to represent the underlying surface and can treat very general surfaces. Experimental results include image filtering on smooth surfaces, open surfaces, and general triangulated surfaces. © 2013 IEEE.
Local thermal equilibrium and KMS states in curved spacetime
International Nuclear Information System (INIS)
Solveen, Christoph
2012-01-01
Using the example of a free, massless, conformally coupled scalar field, it is argued that in quantum field theory in curved spacetimes with a time-like Killing field, the corresponding KMS states (generalized Gibbs ensembles) at parameter β > 0 need not possess a definite temperature in the sense of the zeroth law. In fact, these states, although passive in the sense of the second law, are not always in local thermal equilibrium (LTE). A criterion characterizing LTE states with a sharp local temperature is discussed. Moreover, a proposal is made for fixing the renormalization freedom of the composite fields which serve as 'thermal observables', and a new definition of the thermal energy of LTE states is introduced. Based on these results, a general relation between the local temperature and the parameter β is established for KMS states in (anti-)de Sitter spacetime. (paper)
Timelike symmetry of the quantum transition and Einstein-Podolsky-Rosen paradox
International Nuclear Information System (INIS)
Costa de Beauregard, Olivier
1976-01-01
The non-locality in the paradox is very close to that of Feynman's electron-positron system: the sum of two timelike vectors with fourth components of opposite signs may be spacelike. The intrinsic time symmetry of the quantum transition consists in the presence of both the delayed and the advanced wave inside the 'collapsed' wave.
Solving the Rational Polynomial Coefficients Based on L Curve
Zhou, G.; Li, X.; Yue, T.; Huang, W.; He, C.; Huang, Y.
2018-05-01
The rational polynomial coefficient (RPC) model is a generalized sensor model which can achieve high approximation accuracy, and it is widely used in photogrammetry and remote sensing. The least squares method is usually used to determine the optimal parameter solution of the rational function model. However, when the distribution of control points is not uniform or the model is over-parameterized, the coefficient matrix of the normal equation becomes singular and the normal equation becomes ill-conditioned; the obtained solutions are then extremely unstable and may even be wrong. Tikhonov regularization can effectively improve and solve ill-conditioned equations. In this paper, we solve the ill-conditioned equations by the regularization method and determine the regularization parameter by the L-curve. Experiments on aerial frame photos show that the first-order RPC with equal denominators has the highest accuracy. A high-order RPC model is not necessary when processing frame images, as the RPC model and the projective model are then almost the same. The results show that the first-order RPC model is basically consistent with the rigorous sensor model of photogrammetry. Orthorectification results of both the first-order RPC model and the camera model (ERDAS 9.2 platform) are similar to each other, and the maximum residuals in X and Y are 0.8174 feet and 0.9272 feet, respectively. This shows that the RPC model can be used as a replacement sensor model in aerial photogrammetric processing.
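As a rough sketch of the regularization step described above (not the authors' RPC code; the Hilbert test matrix and the curvature-based corner search are illustrative assumptions), Tikhonov solutions can be computed over a grid of parameters and the L-curve corner taken at maximal curvature of the (log residual norm, log solution norm) plot:

```python
import numpy as np

def tikhonov(A, b, lam):
    """Regularized solution of min ||Ax - b||^2 + lam^2 ||x||^2."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam ** 2 * np.eye(n), A.T @ b)

def l_curve_corner(A, b, lams):
    """Pick lam at the corner (maximum curvature) of the L-curve,
    i.e. the plot of log||x|| against log||Ax - b|| over lams."""
    pts = []
    for lam in lams:
        x = tikhonov(A, b, lam)
        pts.append([np.log(np.linalg.norm(A @ x - b)),
                    np.log(np.linalg.norm(x))])
    pts = np.array(pts)
    d1 = np.gradient(pts, axis=0)            # discrete curvature of the
    d2 = np.gradient(d1, axis=0)             # parametrized log-log curve
    curv = (d1[:, 0] * d2[:, 1] - d1[:, 1] * d2[:, 0]) / \
           (d1[:, 0] ** 2 + d1[:, 1] ** 2) ** 1.5
    return lams[int(np.argmax(curv))]

# Hilbert matrix: a classic stand-in for an ill-conditioned normal equation
n = 8
A = np.array([[1.0 / (i + j + 1) for j in range(n)] for i in range(n)])
x_true = np.ones(n)
b = A @ x_true + 1e-4 * np.random.default_rng(1).standard_normal(n)
lam = l_curve_corner(A, b, np.logspace(-8, 0, 50))
x_reg = tikhonov(A, b, lam)   # far closer to x_true than the naive solve
```

The naive solve amplifies the noise through the small singular values; any corner-chosen regularization parameter damps those modes at the cost of a small bias.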
Mechanism based evaluation of materials behavior and reference curves
International Nuclear Information System (INIS)
Toerroenen, K.; Saario, T.; Wallin, K.; Forsten, J.
1984-01-01
The safety assessment of nuclear pressure vessels and piping requires a quantitative estimation of defect growth, in both a stable and an unstable manner, during service. This estimation is essential for determining whether a defect detected during inspection should be repaired, or whether its size, even after the expected growth, is small enough to leave the integrity of the vessel unaffected. The most important stable defect growth mechanism is environmentally assisted cyclic crack growth. Recent results indicate that it is markedly affected by the sulfur content and/or the manganese sulfide morphology and distribution. This implies that an essential improvement in component safety has been gained by currently applied steelmaking practices, which result in extra-low sulfur content, generally below 0.01 wt%, and in round, small inclusions through, e.g., calcium treatment, hence considerably reducing the effect of the environment on the crack growth rate. This further implies that the ASME Section XI reference curves for environmentally accelerated cyclic crack growth are conservative for steels produced by current steelmaking practices. (orig./WL)
Directory of Open Access Journals (Sweden)
Wenting Luo
2016-04-01
A pavement horizontal curve is designed to serve as a transition between straight segments, and its presence may cause a series of driving-related safety issues for motorists. Since traditional methods for curve geometry investigation are recognized to be time consuming, labor intensive, and inaccurate, this study attempts to develop a method that can automatically conduct horizontal curve identification and measurement at the network level. The digital highway data vehicle (DHDV) was utilized for data collection, in which the three Euler angles, driving speed, and acceleration of the survey vehicle were measured with an inertial measurement unit (IMU). The 3D profiling data used for cross-slope calibration were obtained with PaveVision3D Ultra technology at 1 mm resolution. In this study, curve identification was based on the variation of the heading angle, and the curve radius was calculated with a kinematic method, a geometry method, and a lateral acceleration method. In order to verify the accuracy of the three methods, an analysis of variance (ANOVA) test was applied using, as the control variable, the curve radius measured by a field test. Based on the measured curve radius, a curve safety analysis model was used to predict the crash rates and safe driving speeds at horizontal curves. Finally, a case study on a 4.35 km road segment demonstrated that the proposed method can efficiently conduct network-level analysis.
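Two of the radius estimates named above rest on elementary relations: the kinematic method uses R = v/ψ̇ (speed over heading-angle rate) and the lateral-acceleration method uses a = v²/R. A minimal sketch with made-up numbers, not the authors' DHDV processing chain:

```python
def radius_kinematic(speed, yaw_rate):
    """Kinematic method: R = v / (d(heading)/dt), speed in m/s, rate in rad/s."""
    return speed / yaw_rate

def radius_lateral_acc(speed, a_lat):
    """Lateral-acceleration method: from the centripetal relation a = v^2 / R."""
    return speed ** 2 / a_lat

v = 20.0                          # hypothetical survey speed, m/s
R1 = radius_kinematic(v, 0.05)    # heading changing at 0.05 rad/s -> 400 m
R2 = radius_lateral_acc(v, 1.0)   # 1 m/s^2 lateral acceleration  -> 400 m
```

Consistent inputs give the same radius from both relations, which is the basis of the cross-check the ANOVA test formalizes.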
Meites, T; Meites, L
1970-06-01
This paper deals with isovalent ion-combination titrations based on reactions that can be represented by the equation M(n+) + X(n-) --> MX, where the activity of the product MX is invariant throughout a titration, and with the derivative titration curves obtained by plotting d[M(n+)]/df versus f for such titrations. It describes some of the ways in which such curves can be obtained; it compares and contrasts them both with potentiometric titration curves, which resemble them in shape, and with segmented titration curves, from which they are derived; and it discusses their properties in detail.
Testing the equality of nonparametric regression curves based on ...
African Journals Online (AJOL)
Abstract. In this work we propose a new methodology for the comparison of two regression functions f1 and f2 in the case of homoscedastic error structure and a fixed design. Our approach is based on the empirical Fourier coefficients of the regression functions f1 and f2 respectively. As our main results we obtain the ...
Going Beyond, Going Further: The Preparation of Acid-Base Titration Curves.
McClendon, Michael
1984-01-01
Background information, list of materials needed, and procedures used are provided for a simple technique for generating mechanically plotted acid-base titration curves. The method is suitable for second-year high school chemistry students. (JN)
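For the strong-acid/strong-base case such a curve can also be generated numerically from a simple mole balance. The sketch below is an idealized model (full dissociation, activity effects ignored, water autoionization neglected away from equivalence), not the classroom procedure described above:

```python
import math

def titration_ph(va, ca, vb, cb):
    """pH after adding vb mL of strong base (cb M) to va mL of strong acid (ca M)."""
    h, oh, vtot = va * ca, vb * cb, va + vb   # mmol acid, mmol base, total mL
    if h > oh:
        return -math.log10((h - oh) / vtot)   # excess strong acid
    if oh > h:
        return 14.0 + math.log10((oh - h) / vtot)  # excess strong base
    return 7.0  # equivalence point of a strong-strong titration at 25 C

# 25 mL of 0.1 M HCl titrated with 0.1 M NaOH, sampled every 5 mL
curve = [(vb, titration_ph(25.0, 0.1, vb, 0.1)) for vb in range(0, 51, 5)]
```

The characteristic steep jump appears around the 25 mL equivalence point, which is what students locate on the mechanically plotted curves.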
Directory of Open Access Journals (Sweden)
Janković Marko
2013-01-01
In this paper, we analyze the possibilities of diagnosing Parkinson's disease at an early stage, based on characteristics of the input-output curve. The input-output (IO) curve was analyzed in two ways: we analyzed the gain of the curve for low-level transcranial stimulation, and we analyzed the overall 'quality' of the IO curve. The calculation of the curve's 'quality' is based on basic concepts from quantum mechanics and the calculation of Tsallis entropy.
Isometries of half supersymmetric time-like solutions in five dimensions
International Nuclear Information System (INIS)
Gutowski, J B; Sabra, W A
2010-01-01
Spinorial geometry techniques have recently been used to classify all half supersymmetric solutions in gauged five-dimensional supergravity with vector multiplets. In this paper we consider solutions for which at least one of the Killing spinors generates a time-like Killing vector. We obtain coordinate transformations which considerably simplify the solutions, and in a number of cases, we obtain explicitly some additional Killing vectors which were hidden in the original analysis.
A New Curve Tracing Algorithm Based on Local Feature in the Vectorization of Paper Seismograms
Directory of Open Access Journals (Sweden)
Maofa Wang
2014-02-01
Historical paper seismograms are a very important source of information for earthquake monitoring and prediction, and their vectorization is an important problem to be resolved. Automatic tracing of waveform curves is a key technology for the vectorization of paper seismograms: it transforms an original scanned image into digital waveform data. Accurately tracing out all the key points of each curve in a seismogram is the foundation of this vectorization. In this paper, we present a new curve tracing algorithm based on local features and apply it to the automatic extraction of earthquake waveforms from paper seismograms.
Feasibility studies of time-like proton electromagnetic form factors at PANDA-FAIR
Energy Technology Data Exchange (ETDEWEB)
Dbeyssi, Alaa; Capozza, Luigi; Deiseroth, Malte; Froehlich, Bertold; Khaneft, Dmitry; Mora Espi, Maria Carmen; Noll, Oliver; Rodriguez Pineiro, David; Valente, Roserio; Zambrana, Manuel; Zimmermann, Iris [Helmholtz-Institut Mainz, Mainz (Germany); Maas, Frank [Helmholtz-Institut Mainz, Mainz (Germany); Institute of Nuclear Physics, Mainz (Germany); PRISMA Cluster of Excellence, Mainz (Germany); Marchand, Dominique; Tomasi-Gustafsson, Egle; Wang, Ying [Institut de Physique Nucleaire, Orsay (France); Collaboration: PANDA-Collaboration
2015-07-01
Electromagnetic form factors are fundamental quantities which describe the intrinsic electric and magnetic distributions of hadrons. Time-like proton form factors are experimentally accessible through the annihilation processes p̄ + p ↔ e⁺ + e⁻. Their measurement in the time-like region has been limited by the low statistics achieved by past experiments. This contribution reports on the results of Monte Carlo simulations for future measurements of electromagnetic proton form factors at PANDA (antiProton ANnihilation at DArmstadt). Within the framework of the PANDARoot software, the statistical precision at which the proton form factors will be determined is estimated. The signal (p̄ + p → e⁺ + e⁻) identification and the suppression of the main background process (p̄ + p → π⁺ + π⁻) are studied. Different methods have been used and/or developed to generate and analyse the processes of interest. The results show that time-like proton form factors will be measured at PANDA with unprecedented statistical accuracy.
Reference results for time-like evolution up to O(α_s³)
International Nuclear Information System (INIS)
Bertone, Valerio; Carrazza, Stefano; Nocera, Emanuele R.
2015-01-01
We present high-precision numerical results for time-like Dokshitzer-Gribov-Lipatov-Altarelli-Parisi evolution in the MS-bar factorisation scheme, for the first time up to next-to-next-to-leading order accuracy in quantum chromodynamics. First, we scrutinise the analytical expressions of the splitting functions available in the literature, in both x and N space, and check their mutual consistency. Second, we implement time-like evolution in two publicly available, entirely independent and conceptually different numerical codes, in x and N space respectively: the already existing APFEL code, which has been updated with time-like evolution, and the new MELA code, which has been specifically developed to perform the study in this work. Third, by means of a model for fragmentation functions, we provide results for the evolution in different factorisation schemes, for different ratios between renormalisation and factorisation scales and at different final scales. Our results are collected in the format of benchmark tables, which could be used as a reference for global determinations of fragmentation functions in the future.
Feature Extraction from 3D Point Cloud Data Based on Discrete Curves
Directory of Open Access Journals (Sweden)
Yi An
2013-01-01
Reliable feature extraction from 3D point cloud data is an important problem in many application domains, such as reverse engineering, object recognition, industrial inspection, and autonomous navigation. In this paper, a novel method is proposed for extracting geometric features from 3D point cloud data based on discrete curves. We extract discrete curves from the point cloud and study the behavior of chord lengths, angle variations, and principal curvatures at the geometric features of these curves. The corresponding similarity indicators are then defined. Based on the similarity indicators, the geometric features can be extracted from the discrete curves, and these are also the geometric features of the 3D point cloud. The threshold values of the similarity indicators are taken from [0,1], which characterizes the relative relationships and makes threshold setting easier and more reasonable. The experimental results demonstrate that the proposed method is efficient and reliable.
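Two of the quantities named above, chord lengths and angle variations, are straightforward to compute for a discrete curve. A minimal sketch (not the authors' implementation, and without the principal-curvature and similarity-indicator machinery):

```python
import numpy as np

def chord_lengths(pts):
    """Length of each chord (consecutive-point segment) of a discrete curve."""
    return np.linalg.norm(np.diff(pts, axis=0), axis=1)

def angle_variations(pts):
    """Turning angle in radians at each interior point of a discrete curve."""
    v = np.diff(pts, axis=0)
    u = v / np.linalg.norm(v, axis=1, keepdims=True)   # unit chord directions
    cos_a = np.clip(np.einsum("ij,ij->i", u[:-1], u[1:]), -1.0, 1.0)
    return np.arccos(cos_a)

# an L-shaped polyline: the sharp corner shows up as a pi/2 angle variation
pts = np.array([[0, 0], [1, 0], [2, 0], [2, 1], [2, 2]], dtype=float)
angles = angle_variations(pts)
```

Interior points on straight runs give angles near zero, while corners stand out, which is exactly the behavior the similarity indicators exploit.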
A graph-based method for fitting planar B-spline curves with intersections
Directory of Open Access Journals (Sweden)
Pengbo Bo
2016-01-01
The problem of fitting B-spline curves to planar point clouds is studied in this paper. A novel method is proposed to deal with the most challenging case, where multiple intersecting curves or curves with self-intersections are necessary for shape representation. A method based on the Delaunay triangulation of the data points is developed to identify connected components; it is also capable of removing outliers. A skeleton representation is utilized to capture the topological structure, which is further used to create a weighted graph for deciding the merging of curve segments. Unlike existing approaches, which use local shape information near intersections, our method considers the shape characteristics of curve segments in a larger scope and is thus capable of giving more satisfactory results. By fitting each group of data points with a B-spline curve, we solve the problem of curve structure reconstruction from point clouds, as well as the vectorization of simple line-drawing images by reconstructing the drawn lines.
Simulation-optimization model of reservoir operation based on target storage curves
Directory of Open Access Journals (Sweden)
Hong-bin Fang
2014-10-01
This paper proposes a new storage allocation rule based on target storage curves. Joint operating rules are also proposed to solve the operation problems of a multi-reservoir system with joint demands and water transfer-supply projects. The joint operating rules include a water diversion rule to determine the amount of water diverted in a period, a hedging rule based on an aggregated reservoir to determine the total release from the system, and a storage allocation rule to specify the release from each reservoir. A simulation-optimization model was established to optimize the key points of the water diversion curves, the hedging rule curves, and the target storage curves using the improved particle swarm optimization (IPSO) algorithm. A multi-reservoir water supply system located in Liaoning Province, China, including a water transfer-supply project, was employed as a case study to verify the effectiveness of the proposed joint operating rules and target storage curves. The results indicate that the proposed operating rules are suitable for this complex system, and that the storage allocation rule based on target storage curves improves the distribution of storage across the system.
Rodríguez-Álvarez, María Xosé; Roca-Pardiñas, Javier; Cadarso-Suárez, Carmen; Tahoces, Pablo G
2018-03-01
Prior to using a diagnostic test in a routine clinical setting, the rigorous evaluation of its diagnostic accuracy is essential. The receiver-operating characteristic curve is the measure of accuracy most widely used for continuous diagnostic tests. However, the possible impact of extra information about the patient (or even the environment) on diagnostic accuracy also needs to be assessed. In this paper, we focus on an estimator for the covariate-specific receiver-operating characteristic curve based on direct regression modelling and nonparametric smoothing techniques. This approach defines the class of generalised additive models for the receiver-operating characteristic curve. The main aim of the paper is to offer new inferential procedures for testing the effect of covariates on the conditional receiver-operating characteristic curve within the above-mentioned class. Specifically, two different bootstrap-based tests are suggested to check (a) the possible effect of continuous covariates on the receiver-operating characteristic curve and (b) the presence of factor-by-curve interaction terms. The validity of the proposed bootstrap-based procedures is supported by simulations. To facilitate the application of these new procedures in practice, an R-package, known as npROCRegression, is provided and briefly described. Finally, data derived from a computer-aided diagnostic system for the automatic detection of tumour masses in breast cancer is analysed.
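The object underlying the covariate-specific machinery above is the empirical receiver-operating characteristic curve. A minimal sketch (plain empirical ROC with trapezoidal AUC on synthetic test values, without covariates or the smoothing used by npROCRegression):

```python
import numpy as np

def empirical_roc(neg, pos):
    """Empirical ROC: (FPR, TPR) pairs as the decision threshold sweeps down."""
    thresholds = np.unique(np.concatenate([neg, pos]))[::-1]
    fpr = np.array([0.0] + [(neg >= t).mean() for t in thresholds])
    tpr = np.array([0.0] + [(pos >= t).mean() for t in thresholds])
    return fpr, tpr

def auc_trapezoid(fpr, tpr):
    """Area under the empirical ROC curve by the trapezoidal rule."""
    return float(np.sum((fpr[1:] - fpr[:-1]) * (tpr[1:] + tpr[:-1]) / 2.0))

rng = np.random.default_rng(0)
neg = rng.normal(0.0, 1.0, 500)   # non-diseased test values
pos = rng.normal(1.5, 1.0, 500)   # diseased test values (higher on average)
fpr, tpr = empirical_roc(neg, pos)
a = auc_trapezoid(fpr, tpr)       # binormal theory puts the AUC near 0.86 here
```

A covariate-specific ROC would repeat this construction conditionally on the covariate value, which is what the regression-based estimator in the paper smooths over.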
Analytical expression for initial magnetization curve of Fe-based soft magnetic composite material
Energy Technology Data Exchange (ETDEWEB)
Birčáková, Zuzana, E-mail: zuzana.bircakova@upjs.sk [Institute of Physics, Faculty of Science, Pavol Jozef Šafárik University, Park Angelinum 9, 04154 Košice (Slovakia); Kollár, Peter; Füzer, Ján [Institute of Physics, Faculty of Science, Pavol Jozef Šafárik University, Park Angelinum 9, 04154 Košice (Slovakia); Bureš, Radovan; Fáberová, Mária [Institute of Materials Research, Slovak Academy of Sciences, Watsonova 47, 04001 Košice (Slovakia)
2017-02-01
An analytical expression for the initial magnetization curve of an Fe/phenol-formaldehyde resin composite material is derived, based on the previously proposed ideas of the magnetization vector deviation function and the domain wall annihilation function, which characterize the reversible magnetization processes through the extent of deviation of the magnetization vectors from the magnetic field direction, and the irreversible processes through the effective number of mobile domain walls, respectively. Since composite materials exhibit specific dependences of these functions, the ideas were extended to account for the composites' special features, principally the much higher inner demagnetizing fields produced by magnetic poles on the surfaces of the ferromagnetic particles. The proposed analytical expression enables us to find the relative extent of each type of magnetization process when magnetizing a specimen along the initial curve. - Highlights: • Analytical expression of the initial curve derived for SMC. • Initial curve described by elementary magnetization processes. • Influence of inner demagnetizing fields on the magnetization process in SMC.
Effect of β on Seismic Vulnerability Curve for RC Bridge Based on Double Damage Criterion
International Nuclear Information System (INIS)
Feng Qinghai; Yuan Wancheng
2010-01-01
In the analysis of seismic vulnerability curves based on a double damage criterion, both the randomness of the structural parameters and the randomness of the seismic excitation should be considered. First, the distribution characteristics of structural capacity and seismic demand are obtained based on IDA and pushover analyses; second, the vulnerability of the bridge is obtained based on an artificial neural network (ANN) and Monte Carlo (MC) simulation, and a vulnerability curve for this bridge and seismic input is drawn. Finally, the analysis of a continuous bridge is presented as an example, together with a parametric analysis of the effect of β, which reflects the overall bridge vulnerability from the point of view of total probability; larger values of β are suggested in order to reduce the discreteness.
International Nuclear Information System (INIS)
Ros, F C; Sidek, L M; Desa, M N; Arifin, K; Tosaka, H
2013-01-01
The purposes of stage-discharge curves range from water quality and flood modelling studies to the projection of climate change scenarios. As the bed of a river often changes due to the annual monsoon seasons, and sometimes due to massive floods, the capacity of the river changes, causing shifting control to occur. This study proposes to use historical flood event data from 1960 to 2009 to calculate the stage-discharge curve at Guillemard Bridge on Sg. Kelantan. Regression analysis was carried out to check the quality of the data and examine the correlation between the two variables, Q and H. The mean values of the two variables were then adopted to find the difference 'a' between the zero gauge height and the level of zero flow, together with K and n, to fit the rating curve equation and finally plot the stage-discharge rating curve. Regression analysis of the historical flood data indicates that 91 percent of the original uncertainty is explained by the analysis, with a standard error of 0.085.
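The fitting step described above, finding K and n once the level-of-zero-flow offset 'a' is fixed, reduces to a linear regression in log space if the usual power-law rating curve Q = K(H − a)^n is assumed. A sketch on synthetic data (the true parameters and noise level are made up, not values from the Kelantan study):

```python
import numpy as np

def fit_rating_curve(h, q, a):
    """Fit Q = K (H - a)^n by least squares in log space,
    with the level-of-zero-flow offset 'a' given."""
    n, log_k = np.polyfit(np.log(h - a), np.log(q), 1)
    return np.exp(log_k), n

# synthetic stage-discharge pairs with known K = 5, n = 1.8, a = 0.3
rng = np.random.default_rng(2)
h = np.linspace(1.0, 6.0, 30)                          # stage readings
q = 5.0 * (h - 0.3) ** 1.8 * np.exp(rng.normal(0, 0.02, h.size))
K, n = fit_rating_curve(h, q, a=0.3)                   # recovers ~5 and ~1.8
```

When shifting control occurs, the same fit is simply repeated on data from each period of stable control.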
DEFF Research Database (Denmark)
Tatu, Aditya Jayant
This thesis deals with two unrelated issues: restricting curve evolution to subspaces, and computing image patches in the equivalence class of Histogram of Gradient orientation based features using nonlinear projection methods. Curve evolution is a well known method used in various applications like tracking interfaces, active contour based segmentation methods and others. It can also be used to study shape spaces, as deforming a shape can be thought of as evolving its boundary curve. During curve evolution a curve traces out a path in the infinite dimensional space of curves. Due to application-specific requirements like shape priors or a given data model, and due to limitations of the computer, the computed curve evolution forms a path in some finite dimensional subspace of the space of curves. We give methods to restrict the curve evolution to a finite dimensional linear or implicitly defined...
A standard curve based method for relative real time PCR data processing
Directory of Open Access Journals (Sweden)
Krause Andreas
2005-03-01
Background: Currently, real-time PCR is the most precise method by which to measure gene expression. The method generates a large amount of raw numerical data, and processing may notably influence the final results. Data processing is based either on standard curves or on PCR efficiency assessment. At the moment, the PCR efficiency approach is preferred in relative PCR, whilst the standard curve is often used for absolute PCR. However, there are no barriers to employing standard curves for relative PCR. This article provides an implementation of the standard curve method and discusses its advantages and limitations in relative real-time PCR. Results: We designed a procedure for data processing in relative real-time PCR. The procedure completely avoids PCR efficiency assessment, minimizes operator involvement and provides a statistical assessment of intra-assay variation. It includes the following steps. (I) Noise is filtered from the raw fluorescence readings by smoothing, baseline subtraction and amplitude normalization. (II) The optimal threshold is selected automatically from the regression parameters of the standard curve. (III) Crossing points (CPs) are derived directly from the coordinates of the points where the threshold line crosses the fluorescence plots obtained after noise filtering. (IV) The means and their variances are calculated for the CPs in PCR replicates. (V) The final results are derived from the CPs' means, with the CPs' variances propagated to the results by the law of error propagation. A detailed description and analysis of this data processing is provided. The limitations associated with the use of parametric statistical methods and amplitude normalization are specifically analyzed and found fit for routine laboratory practice. Different options are discussed for the aggregation of data obtained from multiple reference genes. Conclusion: A standard curve based procedure for PCR data processing has been compiled and validated. It illustrates that
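Steps (II)-(V) revolve around reading sample quantities off a fitted standard curve. A minimal sketch of that core calculation, with idealized crossing-point values for a ten-fold dilution series (this is the textbook calibration relation, not the authors' full noise-filtering pipeline):

```python
import numpy as np

def fit_standard_curve(log10_conc, cp):
    """Regress crossing point on log10(concentration): CP = slope*log10(c) + b."""
    slope, intercept = np.polyfit(log10_conc, cp, 1)
    efficiency = 10.0 ** (-1.0 / slope) - 1.0   # 1.0 means perfect doubling
    return slope, intercept, efficiency

def relative_quantity(cp, slope, intercept):
    """Read a sample's quantity (relative to the calibrator) off the curve."""
    return 10.0 ** ((cp - intercept) / slope)

# idealized CPs for a ten-fold dilution series (slope -3.32 ~ 100% efficiency)
log10_conc = np.array([0.0, -1.0, -2.0, -3.0, -4.0])
cp = np.array([15.00, 18.32, 21.64, 24.96, 28.28])
slope, intercept, eff = fit_standard_curve(log10_conc, cp)
q = relative_quantity(20.0, slope, intercept)   # ~0.031 of the calibrator
```

Because quantities are read off the regression line directly, no separate per-reaction efficiency estimate is needed, which is the point the abstract makes.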
Wells, Gary L; Yang, Yueran; Smalarz, Laura
2015-04-01
We provide a novel Bayesian treatment of the eyewitness identification problem as it relates to various system variables, such as instruction effects, lineup presentation format, lineup-filler similarity, lineup administrator influence, and show-ups versus lineups. We describe why eyewitness identification is a natural Bayesian problem and how numerous important observations require careful consideration of base rates. Moreover, we argue that the base rate in eyewitness identification should be construed as a system variable (under the control of the justice system). We then use prior-by-posterior curves and information-gain curves to examine data obtained from a large number of published experiments. Next, we show how information-gain curves are moderated by system variables and by witness confidence, and we note how information-gain curves reveal that lineups are consistently more proficient at incriminating the guilty than they are at exonerating the innocent. We then introduce a new type of analysis that we developed called base rate effect-equivalency (BREE) curves. BREE curves display how much change in the base rate is required to match the impact of any given system variable. The results indicate that even relatively modest changes to the base rate can have more impact on the reliability of eyewitness identification evidence than do the traditional system variables that have received so much attention in the literature. We note how this Bayesian analysis of eyewitness identification has implications for the question of whether there ought to be a reasonable-suspicion criterion for placing a person into the jeopardy of an identification procedure. (c) 2015 APA, all rights reserved.
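The prior-by-posterior curve described above is a direct application of Bayes' rule to identification outcomes. A minimal sketch (the hit and false-identification rates are illustrative placeholders, not values from the cited experiments):

```python
def posterior_guilty(base_rate, hit_rate, false_id_rate):
    """P(suspect guilty | witness identified the suspect), via Bayes' rule."""
    p_identify = base_rate * hit_rate + (1.0 - base_rate) * false_id_rate
    return base_rate * hit_rate / p_identify

# prior-by-posterior curve: sweep the base rate at fixed identification rates
hit, false_id = 0.50, 0.05          # illustrative rates only
curve = [(p / 100.0, posterior_guilty(p / 100.0, hit, false_id))
         for p in range(5, 100, 5)]
```

Plotting posterior against prior makes the paper's central point visible: moving the base rate shifts the posterior substantially even when the identification rates (the traditional system variables) are held fixed.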
Directory of Open Access Journals (Sweden)
Sylvie Troncale
MOTIVATION: Reverse phase protein array (RPPA) is a powerful dot-blot technology that allows studying protein expression levels as well as post-translational modifications in a large number of samples simultaneously. Yet, correct interpretation of RPPA data has remained a major challenge for its broad-scale application and its translation into clinical research. Satisfactory quantification tools are available to assess a relative protein expression level from a serial dilution curve. However, appropriate tools allowing the normalization of the data for external sources of variation are currently missing. RESULTS: Here we propose a new method, called NormaCurve, that allows simultaneous quantification and normalization of RPPA data. For this, we modified the quantification method SuperCurve in order to include normalization for (i) background fluorescence, (ii) variation in the total amount of spotted protein and (iii) spatial bias on the arrays. Using a spike-in design with a purified protein, we test the capacity of different models to properly estimate normalized relative expression levels. The best performing model, NormaCurve, takes into account a negative control array without primary antibody, an array stained with a total protein stain and spatial covariates. We show that this normalization is reproducible and we discuss the number of serial dilutions and the number of replicates that are required to obtain robust data. We thus provide a ready-to-use method for reliable and reproducible normalization of RPPA data, which should facilitate the interpretation and the development of this promising technology. AVAILABILITY: The raw data, the scripts and the normacurve package are available at the following web site: http://microarrays.curie.fr.
Jumadi, Nur Anida; Beng, Gan Kok; Ali, Mohd Alauddin Mohd; Zahedi, Edmond; Morsin, Marlia
2017-09-01
The implementation of a surface-based Monte Carlo simulation technique for estimating the oxygen saturation (SaO2) calibration curve is demonstrated in this paper. Generally, the calibration curve is estimated either empirically, using animals as experimental subjects, or derived from mathematical equations. However, determining the calibration curve using animals is time consuming and requires expertise to conduct the experiment. Alternatively, optical simulation techniques have been used widely in the biomedical optics field due to their capability to exhibit real tissue behavior. The mathematical relationship between optical density (OD) and optical density ratio (ODR) associated with SaO2 during systole and diastole is used as the basis for obtaining the theoretical calibration curve. Optical properties corresponding to systolic and diastolic behavior were applied to the tissue model to mimic the optical properties of the tissues. Based on the ray flux absorbed at the detectors, the OD and ODR were successfully calculated. The simulated optical density ratios at every 20 % interval of SaO2 are presented, with a maximum error of 2.17 % when compared with a previous numerical simulation technique (MC model). The findings reveal the potential of the proposed method for an extended calibration curve study using other wavelength pairs.
ONODA, Tomoaki; YAMAMOTO, Ryuta; SAWAMURA, Kyohei; MURASE, Harutaka; NAMBO, Yasuo; INOUE, Yoshinobu; MATSUI, Akira; MIYAKE, Takeshi; HIRAI, Nobuhiro
2014-01-01
We propose an approach for estimating individual growth curves based on the birthday information of Japanese Thoroughbred horses, with consideration of the seasonal compensatory growth that is a typical characteristic of seasonally breeding animals. Compensatory growth patterns appear only during the winter and spring seasons in the life of growing horses, and the meeting point between winter and spring depends on the birthday of each horse. We previously developed new growth curve equations for Japanese Thoroughbreds adjusting for compensatory growth. Based on these equations, a parameter denoting the birthday information was added to model the individual growth curve for each horse by shifting the meeting points in the compensatory growth periods. A total of 5,594 and 5,680 body weight and age measurements of Thoroughbred colts and fillies, respectively, and 3,770 withers height and age measurements of both sexes were used in the analyses. The results of the predicted error difference and the Akaike Information Criterion showed that the individual growth curves fit the body weight and withers height data better when using birthday information than when not using it. The individual growth curve for each horse would be a useful tool for the feeding management of young Japanese Thoroughbreds in compensatory growth periods. PMID:25013356
Refined tropical curve counts and canonical bases for quantum cluster algebras
DEFF Research Database (Denmark)
Mandel, Travis
We express the (quantizations of the) Gross-Hacking-Keel-Kontsevich canonical bases for cluster algebras in terms of certain (Block-Göttsche) weighted counts of tropical curves. In the process, we obtain via scattering diagram techniques a new invariance result for these Block-Göttsche counts.
A residual life prediction model based on the generalized σ-N curved surface
Zongwen AN; Xuezong BAI; Jianxiong GAO
2016-01-01
In order to investigate how the residual life of a structure changes under random repeated load, firstly, starting from the statistical meaning of random repeated load, the joint probability density function of maximum stress and minimum stress is derived based on the characteristics of order statistics (the maximum and minimum order statistics); then, based on the equation of the generalized σ-N curved surface, considering the influence of the number of load cycles on fatigue life, a relation...
Directory of Open Access Journals (Sweden)
Himanshu Sharma
2016-07-01
Due to its roots in a fundamental thermodynamic framework, the continuum damage approach is popular for modeling asphalt concrete behavior. Currently used continuum damage models use mixture-averaged values for model parameters and assume a deterministic damage process. On the other hand, significant scatter is found in fatigue data generated even under extremely controlled laboratory testing conditions. Thus, currently used continuum damage models fail to account for the scatter observed in fatigue data. This paper illustrates a novel approach for probabilistic fatigue life prediction based on the viscoelastic continuum damage approach. Several specimens were tested for their viscoelastic properties and damage properties under uniaxial loading. The data thus generated were analyzed using viscoelastic continuum damage mechanics principles to predict fatigue life. Weibull (2-parameter and 3-parameter) and lognormal distributions were fitted to the fatigue lives predicted by the viscoelastic continuum damage approach. It was observed that fatigue damage was better described by the Weibull distribution than by the lognormal distribution. Due to its flexibility, the 3-parameter Weibull distribution was found to fit better than the 2-parameter Weibull distribution. Further, significant differences were found between the probabilistic fatigue curves developed in this research and the traditional deterministic fatigue curve. The proposed methodology combines the advantages of continuum damage mechanics and probabilistic approaches. These probabilistic fatigue curves can be conveniently used for reliability-based pavement design. Keywords: Probabilistic fatigue curve, Continuum damage mechanics, Weibull distribution, Lognormal distribution
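To make the distribution-fitting step concrete, here is a minimal sketch of fitting a 2-parameter Weibull distribution to a set of fatigue lives by median-rank regression; the function name and the use of Bernard's approximation are illustrative choices, not the paper's actual procedure.

```python
import numpy as np

def weibull_fit_median_ranks(lives):
    """Fit a 2-parameter Weibull distribution to fatigue lives by
    median-rank regression: F(N) = 1 - exp(-(N/eta)^beta) linearizes to
    ln(-ln(1 - F)) = beta*ln(N) - beta*ln(eta)."""
    n = len(lives)
    x = np.log(np.sort(lives))
    f = (np.arange(1, n + 1) - 0.3) / (n + 0.4)   # Bernard's median ranks
    y = np.log(-np.log(1.0 - f))
    beta, intercept = np.polyfit(x, y, 1)         # slope = beta
    eta = np.exp(-intercept / beta)               # scale parameter
    return beta, eta
```

A maximum-likelihood fit (e.g. via `scipy.stats.weibull_min.fit`) would be the usual alternative when censored data or a 3-parameter form is needed.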
An explanation for the shape of nanoindentation unloading curves based on finite element simulation
International Nuclear Information System (INIS)
Bolshakov, A.; Pharr, G.M.
1995-01-01
Current methods for measuring hardness and modulus from nanoindentation load-displacement data are based on Sneddon's equations for the indentation of an elastic half-space by an axially symmetric rigid punch. Recent experiments have shown that nanoindentation unloading data are distinctly curved in a manner which is not consistent with either the flat punch or the conical indenter geometries frequently used in modeling, but are more closely approximated by a parabola of revolution. Finite element simulations for conical indentation of an elastic-plastic material are presented which corroborate the experimental observations, and from which a simple explanation for the shape of the unloading curve is derived. The explanation is based on the concept of an effective indenter shape whose geometry is determined by the shape of the plastic hardness impression formed during indentation
A residual life prediction model based on the generalized σ-N curved surface
Directory of Open Access Journals (Sweden)
Zongwen AN
2016-06-01
In order to investigate how the residual life of a structure changes under random repeated load, firstly, starting from the statistical meaning of random repeated load, the joint probability density function of maximum stress and minimum stress is derived based on the characteristics of order statistics (the maximum and minimum order statistics); then, based on the equation of the generalized σ-N curved surface, considering the influence of the number of load cycles on fatigue life, a relationship among minimum stress, maximum stress and residual life, that is, the σmin(n)-σmax(n)-Nr(n) curved surface model, is established; finally, the validity of the proposed model is demonstrated by a practical case. The result shows that the proposed model can reflect the influence of maximum stress and minimum stress on the residual life of a structure under random repeated load, which can provide a theoretical basis for life prediction and reliability assessment of structures.
WANG, J.
2017-12-01
In stream water quality control, the total maximum daily load (TMDL) program is very effective. However, the load duration curves (LDC) of a TMDL are difficult to establish in data-scarce watersheds, where no hydrological stations or consecutive long-term hydrological records are available to provide sufficient observed flow and pollutant data. Although point and non-point sources of pollutants can be distinguished easily with the aid of an LDC, where a pollutant comes from and where it will be transported within the watershed cannot be traced by the LDC. To identify the best management practices (BMPs) for pollutants in a watershed, and to overcome this limitation of the LDC, we propose developing LDCs based on the distributed hydrological model SWAT for water quality management in data-scarce river basins. In this study, the distributed hydrological model SWAT was first established with the scarce hydrological data. Then, long-term daily flows were generated with the established SWAT model and rainfall data from the adjacent weather station. A flow duration curve (FDC) was then developed from the daily flows generated by the SWAT model. Considering the goals of water quality management, LDCs for different pollutants can be obtained from the FDC. With the monitored water quality data and the LDCs, the water quality problems caused by point or non-point source pollutants in different seasons can be ascertained. Finally, the SWAT model was employed again to trace the spatial distribution and origin of the pollutants, i.e., the agricultural practices and/or other human activities from which they come. A case study was conducted in the Jian-jiang river, a tributary of the Yangtze river, in Duyun city, Guizhou province. The results indicate that this method can realize water quality management based on TMDL and identify suitable BMPs for reducing pollutants in a watershed.
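The FDC-to-LDC construction described above can be sketched in a few lines: sort the (here SWAT-generated) daily flows, attach exceedance probabilities, and scale each flow by the water quality standard to obtain the allowable load. The function names and the kg/day unit conversion are illustrative assumptions.

```python
import numpy as np

def flow_duration_curve(flows):
    """Sort flows in descending order and attach exceedance probabilities
    (Weibull plotting position p = m / (n + 1))."""
    q = np.sort(flows)[::-1]
    p = np.arange(1, len(q) + 1) / (len(q) + 1)
    return p, q

def load_duration_curve(flows, standard_mg_per_l):
    """Allowable load duration curve: flow (m3/s) times the water quality
    standard (mg/L); the factor 86.4 converts g/s to kg/day."""
    p, q = flow_duration_curve(flows)
    return p, q * standard_mg_per_l * 86.4
```

Plotting monitored loads against this curve then shows whether exceedances cluster at high flows (suggesting non-point sources) or low flows (suggesting point sources).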
Paris, Adrien; André Garambois, Pierre; Calmant, Stéphane; Paiva, Rodrigo; Walter, Collischonn; Santos da Silva, Joecila; Medeiros Moreira, Daniel; Bonnet, Marie-Paule; Seyler, Frédérique; Monnier, Jérôme
2016-04-01
Estimating river discharge for ungauged river reaches from satellite measurements is not straightforward given the nonlinearity of flow behavior with respect to measurable and non-measurable hydraulic parameters. As a matter of fact, current satellite datasets do not give access to key parameters such as river bed topography and roughness. A unique set of almost one thousand altimetry-based rating curves was built by fitting ENVISAT and Jason-2 water stages to discharges obtained from the MGB-IPH rainfall-runoff model in the Amazon basin. These rated discharges were successfully validated against simulated discharges (Ens = 0.70) and in-situ discharges (Ens = 0.71) and are not mission-dependent. The rating curve is written as Q = a(Z-Z0)^b*sqrt(S), with Z the water surface elevation and S its slope gained from satellite altimetry, a and b the power-law coefficient and exponent, and Z0 the river bed elevation such that Q(Z0) = 0. For several river reaches in the Amazon basin where ADCP measurements are available, the Z0 values are fairly well validated, with a relative error lower than 10%. The present contribution aims at relating the identifiability and the physical meaning of a, b and Z0 given various hydraulic and geomorphologic conditions. Synthetic river bathymetries sampling a wide range of rivers and inflow discharges are used to perform twin experiments. A shallow water model is run to generate synthetic satellite observations, and then rating curve parameters are determined for each river section thanks to an MCMC algorithm. Thanks to the twin experiments, it is shown that a rating curve formulation with water surface slope, i.e. closer to the Manning equation form, improves parameter identifiability. The compensation between parameters is limited, especially for reaches with little water surface variability. Rating curve parameters are analyzed for riffles and pools, for small to large rivers, and for different river slopes and cross-section shapes. It is shown that the river bed
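A minimal stand-in for the rating-curve parameter estimation might look as follows; it fits Q = a(Z-Z0)^b*sqrt(S) by a grid search on Z0 combined with log-linear regression for a and b, rather than the MCMC algorithm used in the study.

```python
import numpy as np

def fit_rating_curve(Z, S, Q):
    """Fit Q = a*(Z - Z0)^b * sqrt(S): for each candidate Z0, the model is
    linear in log space, ln(Q/sqrt(S)) = ln(a) + b*ln(Z - Z0), so a and b
    come from least squares and Z0 from a 1-D grid search."""
    best = None
    for z0 in np.linspace(Z.min() - 5.0, Z.min() - 0.01, 500):
        x = np.log(Z - z0)
        y = np.log(Q / np.sqrt(S))
        b, ln_a = np.polyfit(x, y, 1)
        resid = y - (b * x + ln_a)
        sse = float(resid @ resid)
        if best is None or sse < best[0]:
            best = (sse, np.exp(ln_a), b, z0)
    _, a, b, z0 = best
    return a, b, z0
```

An MCMC estimation additionally yields posterior distributions (and hence uncertainty) for a, b and Z0, which the simple least-squares search does not.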
International Nuclear Information System (INIS)
Krakover, Naftaly; Krylov, Slava; Ilic, B Robert
2016-01-01
The ability to control nonlinear interactions of suspended mechanical structures offers a unique opportunity to engineer rich dynamical behavior that extends the dynamic range and ultimate device sensitivity. We demonstrate a displacement sensing technique based on resonant frequency monitoring of curved, doubly clamped, bistable micromechanical beams interacting with a movable electrode. In this configuration, the electrode displacement influences the nonlinear electrostatic interactions, effective stiffness and frequency of the curved beam. Increased sensitivity is made possible by dynamically operating the beam near the snap-through bistability onset. Various in-plane device architectures were fabricated from single crystal silicon and measured under ambient conditions using laser Doppler vibrometry. In agreement with the reduced order Galerkin-based model predictions, our experimental results show a significant resonant frequency reduction near critical snap-through, followed by a frequency increase within the post-buckling configuration. Interactions with a stationary electrode yield a voltage sensitivity up to ≈560 Hz V −1 and results with a movable electrode allow motion sensitivity up to ≈1.5 Hz nm −1 . Our theoretical and experimental results collectively reveal the potential of displacement sensing using nonlinear interactions of geometrically curved beams near instabilities, with possible applications ranging from highly sensitive resonant inertial detectors to complex optomechanical platforms providing an interface between the classical and quantum domains. (paper)
Curve aligning approach for gait authentication based on a wearable accelerometer
International Nuclear Information System (INIS)
Sun, Hu; Yuao, Tao
2012-01-01
Gait authentication based on a wearable accelerometer is a novel biometric which can be used for identity identification, medical rehabilitation and early detection of neurological disorders. The method used for matching gait patterns bears heavily on authentication performance. In this paper, curve aligning is introduced as a new method for matching gait patterns, and it is compared with correlation and dynamic time warping (DTW). A support vector machine (SVM) is proposed to fuse pattern-matching methods at the decision level. Accelerations collected from the ankles of 22 walking subjects are processed for authentication in our experiments. The fusion of curve aligning with backward-forward accelerations and DTW with vertical accelerations improves authentication performance substantially and consistently. This fusion algorithm is tested repeatedly. The mean and standard deviation of its equal error rates are 0.794% and 0.696%, respectively, whereas the best of the presented non-fusion algorithms shows an EER of 3.03%. (paper)
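For reference, the DTW pattern-matching baseline mentioned above can be sketched with the classic dynamic-programming recursion (a generic implementation, not the authors' code):

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic-time-warping distance between two 1-D sequences: minimal
    cumulative |a_i - b_j| cost over all monotone alignments."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of the three predecessor alignments
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]
```

In a verification setting, a gait template and a probe sequence are accepted as the same walker when their DTW distance falls below a tuned threshold.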
2011-01-01
Background Simulation-based medical education has been widely used in medical skills training; however, the effectiveness and long-term outcome of simulation-based training in thoracentesis require further investigation. The purpose of this study was to assess the learning curve of simulation-based thoracentesis training, and to study skills retention and transfer of knowledge to a clinical setting following a simulation-based education intervention in thoracentesis procedures. Methods Fifty-two medical students were enrolled in this study. Each participant performed five supervised trials on the simulator. Participant performance was assessed by performance score (PS), procedure time (PT), and participant confidence (PC). Learning curves for each variable were generated. The long-term outcome of the training was measured by retesting and by clinical performance evaluation 6 months and 1 year, respectively, after the initial training on the simulator. Results Significant improvements in PS, PT, and PC were noted among the first 3 to 4 test trials (p < 0.05). Clinical competency in thoracentesis was improved in participants who received simulation training relative to that of first-year medical residents without such experience (p < 0.05). Simulation-based thoracentesis training can significantly improve an individual's performance. The saturation of learning from the simulator can be achieved after four practice sessions. Simulation-based training can assist in long-term retention of skills and can be partially transferred to clinical practice. PMID:21696584
Long-term hydrological simulation based on the Soil Conservation Service curve number
Mishra, Surendra Kumar; Singh, Vijay P.
2004-05-01
Presenting a critical review of daily flow simulation models based on the Soil Conservation Service curve number (SCS-CN), this paper introduces a more versatile model based on the modified SCS-CN method, which specializes into seven cases. The proposed model was applied to the Hemavati watershed (area = 600 km2) in India and was found to yield satisfactory results in both calibration and validation. The model conserved monthly and annual runoff volumes satisfactorily. A sensitivity analysis of the model parameters was performed, including the effect of variation in storm duration. Finally, to investigate the model components, all seven variants of the modified version were tested for their suitability.
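The standard SCS-CN runoff equation on which the reviewed models are built can be sketched as follows (metric units; the λ = 0.2 initial-abstraction ratio is the conventional default, not necessarily the value used in the modified method):

```python
def scs_cn_runoff(p_mm, cn, ia_ratio=0.2):
    """Direct runoff Q (mm) from rainfall P (mm) via the SCS-CN method:
    S = 25400/CN - 254 (mm), Ia = ia_ratio*S, and
    Q = (P - Ia)^2 / (P - Ia + S) when P > Ia, else 0."""
    s = 25400.0 / cn - 254.0      # potential maximum retention
    ia = ia_ratio * s             # initial abstraction
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)
```

For example, a 100 mm storm on a CN = 80 watershed yields roughly 50 mm of direct runoff, while CN = 100 (impervious) returns all rainfall as runoff.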
Energy Technology Data Exchange (ETDEWEB)
Jenkin, Thomas J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Larson, Andrew [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Ruth, Mark F [National Renewable Energy Laboratory (NREL), Golden, CO (United States); King, Ben [U.S. Department of Energy; Spitsen, Paul [U.S. Department of Energy
2018-03-27
In light of the changing electricity resource mixes across the United States, an important question in electricity modeling is how additions and retirements of generation, including additions of variable renewable energy (VRE) generation, could impact markets by changing hourly wholesale energy prices. Instead of using resource-intensive production cost models (PCMs) or building and using simple generator supply curves, this analysis uses a 'top-down' approach based on regression analysis of hourly historical price and load data to estimate the impact of supply changes on wholesale electricity prices, provided the changes are not so substantial that they fundamentally alter the market and the dispatch-order-driven behavior of non-retiring units. The rolling supply curve (RSC) method used in this report estimates the shape of the supply curve that fits historical hourly price and load data for a given time interval, such as two weeks, and then repeats this on a rolling basis through the year. These supply curves can then be modified on an hourly basis to reflect the impact of generation retirements or additions, including VRE, and then reapplied to the same load data to estimate the change in hourly electricity prices. The duration over which these RSCs are estimated has a significant impact on goodness of fit. For example, in PJM in 2015, moving from fitting one curve per year to 26 rolling two-week supply curves improves the standard error of the regression from $16/MWh to $6/MWh and the R-squared of the estimate from 0.48 to 0.76. We illustrate the potential use and value of the RSC method by estimating wholesale price effects under various generator retirement and addition scenarios, and we discuss potential limits of the technique, some of which are inherent. The ability to do this type of analysis is important to a wide range of market participants and other stakeholders, and it may have a role in complementing use of or providing
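The RSC idea of fitting one supply curve per time window and re-pricing shifted load can be sketched as below; the polynomial form, window handling, and the treatment of VRE additions as a load reduction are simplifying assumptions, not NREL's actual specification.

```python
import numpy as np

def rolling_supply_curves(load, price, window, degree=3):
    """Fit one polynomial 'supply curve' price = f(load) per window of
    hours, rolling through the dataset (sketch of the RSC idea)."""
    curves = []
    for start in range(0, len(load), window):
        l = load[start:start + window]
        p = price[start:start + window]
        curves.append(np.polyfit(l, p, degree))
    return curves

def price_after_shift(curves, load, window, shift_mw):
    """Re-price the same hours after a supply change: adding shift_mw of
    zero-marginal-cost generation is treated as serving shift_mw less load."""
    out = np.empty_like(load, dtype=float)
    for k, c in enumerate(curves):
        sl = slice(k * window, min((k + 1) * window, len(load)))
        out[sl] = np.polyval(c, load[sl] - shift_mw)
    return out
```

Shorter windows track seasonal fuel-price and outage patterns better (hence the reported jump in R-squared from 0.48 to 0.76), at the cost of fewer points per fit.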
Null to time-like infinity Green’s functions for asymptotic symmetries in Minkowski spacetime
International Nuclear Information System (INIS)
Campiglia, Miguel
2015-01-01
We elaborate on the Green’s functions that appeared in http://dx.doi.org/10.1007/JHEP07(2015)115 (http://arxiv.org/abs/1509.01406) when generalizing, from massless to massive particles, various equivalences between soft theorems and Ward identities of large gauge symmetries. We analyze these Green’s functions in considerable detail and show that they form a hierarchy of functions which describe ‘boundary to bulk’ propagators for large U(1) gauge parameters, supertranslations and sphere vector fields respectively. As a consistency check we verify that the Green’s functions associated to the large diffeomorphisms map the Poincare group at null infinity to the Poincare group at time-like infinity.
Boer, Marie
2017-09-01
Generalized Parton Distributions (GPDs) encode the correlation between the partons' longitudinal momentum and their transverse distribution. They are accessed through hard exclusive processes, such as Deeply Virtual Compton Scattering (DVCS). DVCS has already been measured in several experiments, and several models allow for extracting GPDs from these measurements. Timelike Compton Scattering (TCS) is, at leading order, the time-reversal equivalent of DVCS and accesses GPDs at the same kinematics. Comparing GPDs extracted from DVCS and TCS is a unique way of proving GPD universality. Combining fits from the two processes will also allow for better constraining the GPDs. We will present our method for extracting GPDs from DVCS and TCS pseudo-data. We will compare fit results from the two processes in similar conditions and present what can be expected in terms of constraints on GPDs from combined fits.
The pion-nucleon form factor in space- and time-like regions
International Nuclear Information System (INIS)
Speth, J.; Tegen, R.
1990-01-01
We investigate the pion-nucleon vertex function in space- and time-like regions; these vertex functions appear as internal vertices in nucleon-antinucleon annihilations into n pions (n = 2, 3, ..., 6). It is emphasised that only relativistic quark models can account for these vertices, where one of the baryons/antibaryons is far off-shell with total energy close to zero. Using a novel 4-momentum projection technique we obtain results which generalize the usual (Breit frame) calculation of G_πNN(k²) (space-like), thereby removing completely the discrepancy in the Goldberger-Treiman relation. Our relativistic quark model calculation also explains the empirical suppression of antibaryonic contributions to the vertex functions G_{πN B̄} and G_{πB N̄}, which enter in processes like N N̄ → ππ. (orig.)
Determination of critical nitrogen dilution curve based on stem dry matter in rice.
Directory of Open Access Journals (Sweden)
Syed Tahir Ata-Ul-Karim
Plant analysis is a very promising diagnostic tool for the assessment of crop nitrogen (N) requirements in the perspective of cost-effective and environment-friendly agriculture. Diagnosing the N nutritional status of the rice crop through plant analysis will give insights into optimizing the N requirements of future crops. The present study aimed to develop a new methodology for determining the critical nitrogen (Nc) dilution curve based on stem dry matter (SDM) and to assess its suitability for estimating the level of N nutrition of rice (Oryza sativa L.) in east China. Three field experiments with varied N rates (0-360 kg N ha^-1) using three Japonica rice hybrids, Lingxiangyou-18, Wuxiangjing-14 and Wuyunjing, were conducted in Jiangsu province of east China. SDM and stem N concentration (SNC) were determined during the vegetative stage for growth analysis. A Nc dilution curve based on SDM was described by the equation Nc = 2.17 W^-0.27, with W being SDM in t ha^-1, when SDM ranged from 0.88 to 7.94 t ha^-1. However, for SDM < 0.88 t ha^-1, the constant critical value Nc = 1.76% SDM was applied. The curve was dually validated for N-limiting and non-N-limiting growth conditions. The N nutrition index (NNI) and accumulated N deficit (Nand) of the stem ranged from 0.57 to 1.06 and from 51.1 to -7.07 kg N ha^-1, respectively, during key growth stages under varied N rates in 2010 and 2011. The values of ΔN derived from either NNI or Nand could be used as references for N dressing management during rice growth. Our results demonstrated that the present curve well differentiated the conditions of limiting and non-limiting N nutrition in the rice crop. The SDM-based Nc dilution curve can be adopted as an alternate and novel approach for evaluating plant N status to support N fertilization decisions during the vegetative growth of Japonica rice in east China.
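The reported dilution curve and the N nutrition index derived from it are easy to encode directly (a sketch using the coefficients exactly as stated in the abstract; function names are illustrative):

```python
def critical_n(sdm_t_ha):
    """Critical stem N concentration (%) from the reported curve:
    Nc = 2.17 * W**-0.27 for SDM W between 0.88 and 7.94 t/ha; below
    0.88 t/ha the constant value 1.76% is applied, as stated."""
    if sdm_t_ha < 0.88:
        return 1.76
    return 2.17 * sdm_t_ha ** -0.27

def n_nutrition_index(measured_n_pct, sdm_t_ha):
    """NNI = measured stem N concentration / critical concentration;
    NNI > 1 indicates non-limiting N nutrition, NNI < 1 a deficit."""
    return measured_n_pct / critical_n(sdm_t_ha)
```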
Uncertainty estimation with bias-correction for flow series based on rating curve
Shao, Quanxi; Lerat, Julien; Podger, Geoff; Dutta, Dushmanta
2014-03-01
Streamflow discharge constitutes one of the fundamental data required to perform water balance studies and develop hydrological models. A rating curve, designed on the basis of a series of concurrent stage and discharge measurements at a gauging location, provides a way to generate complete discharge time series of reasonable quality if sufficient measurement points are available. However, the associated uncertainty is frequently not available, even though it has a significant impact on hydrological modelling. In this paper, we identify the discrepancy of the hydrographers' rating curves used to derive the historical discharge data series and propose a modification by bias correction, which, like the traditional rating curve, also takes the form of a power function. In order to obtain the uncertainty estimation, we propose a further both-side Box-Cox transformation to stabilize the regression residuals as close to the normal distribution as possible, so that a proper uncertainty can be attached to the whole discharge series in the ensemble generation. We demonstrate the proposed method by applying it to the gauging stations in the Flinders and Gilbert rivers in north-west Queensland, Australia.
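The 'both-side' Box-Cox idea, transforming both observed and rated discharges with the same parameter before computing residuals, can be sketched as follows (function names and the residual definition are illustrative assumptions, not the paper's exact formulation):

```python
import numpy as np

def boxcox(x, lam):
    """One-parameter Box-Cox transform: (x^lam - 1)/lam, or log(x)
    in the lam -> 0 limit."""
    x = np.asarray(x, dtype=float)
    if abs(lam) < 1e-12:
        return np.log(x)
    return (x ** lam - 1.0) / lam

def two_sided_residuals(q_obs, q_rated, lam):
    """Residuals after transforming BOTH observed and rating-curve
    discharges, so that their spread is roughly constant across flows."""
    return boxcox(q_obs, lam) - boxcox(q_rated, lam)
```

In practice lam would be chosen (e.g. by maximum likelihood) so the residuals look homoscedastic and near-normal, after which Gaussian noise in the transformed space generates the discharge ensemble.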
Diagnostic tests’ decision-making rules based upon analysis of ROC-curves
Directory of Open Access Journals (Sweden)
Л. В. Батюк
2015-10-01
In this paper we propose a model which substantiates diagnostic decision making based on the analysis of Receiver Operating Characteristic curves (ROC-curves) and predicts optimal values of diagnostic indicators of biomedical information. To assess the quality of the test result prediction, the standard criteria of the sensitivity and specificity of the model were used. Values of these criteria were calculated for the cases when the sensitivity of the test was several times greater than its specificity, when the number of correct diagnoses was maximal, and when the sensitivity of the test was equal to its specificity. To assess the significance of the factor characteristics and to compare the prognostic characteristics of the models, we used mathematical modeling and plotting of the ROC-curves. The optimal value of the diagnostic indicator was found to be achieved when the sensitivity of the test is equal to its specificity. The model was adapted to solve the case when the sensitivity of the test is greater than the specificity of the test.
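The paper's optimum, the threshold at which sensitivity equals specificity, can be located by a simple scan over candidate thresholds (a generic sketch, not the authors' model):

```python
import numpy as np

def sens_spec(scores, labels, thr):
    """Sensitivity and specificity for the rule 'positive if score >= thr'."""
    pred = scores >= thr
    tp = np.sum(pred & (labels == 1))
    fn = np.sum(~pred & (labels == 1))
    tn = np.sum(~pred & (labels == 0))
    fp = np.sum(pred & (labels == 0))
    return tp / (tp + fn), tn / (tn + fp)

def equal_sens_spec_threshold(scores, labels):
    """Return the candidate threshold where sensitivity and specificity
    are closest to each other (the equal-error operating point)."""
    best_thr, best_gap = None, np.inf
    for thr in np.unique(scores):
        se, sp = sens_spec(scores, labels, thr)
        if abs(se - sp) < best_gap:
            best_gap, best_thr = abs(se - sp), thr
    return best_thr
```

Sweeping thr over all observed scores and plotting sensitivity against 1 - specificity yields the ROC curve itself; the equal-error point is where that curve crosses the anti-diagonal.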
Enhancement of global flood damage assessments using building material based vulnerability curves
Englhardt, Johanna; de Ruiter, Marleen; de Moel, Hans; Aerts, Jeroen
2017-04-01
This study discusses the development of an enhanced approach for flood damage and risk assessments using vulnerability curves that are based on building material information. The approach draws upon common practices in earthquake vulnerability assessments and is an alternative to the land-use or building-occupancy approach in flood risk assessment models. The approach is of particular importance for studies where there is a large variation in building material, such as large-scale studies or studies in developing countries. A case study of Ethiopia is used to demonstrate the impact of the different methodological approaches on direct damage assessments due to flooding. Generally, flood damage assessments use damage curves for different land-use or occupancy types (i.e. urban or residential and commercial classes). However, these categories do not necessarily relate directly to vulnerability to damage by flood waters. For this, the construction type and building material may be more important, as is used in earthquake risk assessments. For this study, we use building material classification data from the PAGER project to define new building-material-based vulnerability classes for flood damage. This approach is compared to the widely applied land-use-based vulnerability curves such as those used by De Moel et al. (2011). The case of Ethiopia demonstrates and compares the feasibility of this novel flood vulnerability method at a country level, which holds the potential to be scaled up to a global level. The study shows that flood vulnerability based on building material also allows for better differentiation between flood damage in urban and rural settings, opening doors to better links to poverty studies when such exposure data are available. Furthermore, this new approach paves the road to the enhancement of multi-risk assessments, as the method enables the comparison of vulnerability across different natural hazard types that also use material-based vulnerability curves.
Optical fiber sensors for process refractometry and temperature measuring based on curved fibers
International Nuclear Information System (INIS)
Willsch, R.; Schwotzer, G.; Haubenreisser, W.; Jahn, J.U.
1986-01-01
Based on U-shape curved multimode fibers with defined bending radii intensity-modulated optical sensors for the determination of refractive index changes in liquids and related measurands (solution concentration, mixing ratio and others) in process-refractometry and for temperature measuring under special environmental conditions have been developed. The optoelectronic transmitting and receiving units are performed in modular technique and can be used in multi-purpose applications. The principles, performance and characteristical properties of these sensors are described and their possibilities of application in process measuring and automation are discussed by some selected examples. (orig.) [de
Optical fiber sensors for process refractometry and temperature measuring based on curved fibers
Energy Technology Data Exchange (ETDEWEB)
Willsch, R; Schwotzer, G; Haubenreisser, W; Jahn, J U
1986-01-01
Based on U-shape curved multimode fibers with defined bending radii intensity-modulated optical sensors for the determination of refractive index changes in liquids and related measurands (solution concentration, mixing ratio and others) in process-refractometry and for temperature measuring under special environmental conditions have been developed. The optoelectronic transmitting and receiving units are performed in modular technique and can be used in multi-purpose applications. The principles, performance and characteristical properties of these sensors are described and their possibilities of application in process measuring and automation are discussed by some selected examples.
On-chip magnetic bead-based DNA melting curve analysis using a magnetoresistive sensor
DEFF Research Database (Denmark)
Rizzi, Giovanni; Østerberg, Frederik Westergaard; Henriksen, Anders Dahl
2014-01-01
We present real-time measurements of DNA melting curves in a chip-based system that detects the amount of surface-bound magnetic beads using magnetoresistive magnetic field sensors. The sensors detect the difference between the amount of beads bound to the top and bottom sensor branches. The beads are magnetized by the field arising from the bias current passed through the sensors. We demonstrate the first on-chip measurements of the melting of DNA hybrids upon a ramping of the temperature. This overcomes the limitation of using a single washing condition at constant temperature. Moreover...
Li, Y J; Wang, Y G; An, B; Xu, H; Liu, Y; Zhang, L C; Ma, H Y; Wang, W M
2016-01-01
A practical anodic and cathodic curve intersection model, consisting of an apparent anodic curve and an imaginary cathodic line, was proposed to explain the multiple corrosion potentials occurring in potentiodynamic polarization curves of Fe-based glassy alloys in alkaline solution. The apparent anodic curve was selected from the measured anodic curves. The imaginary cathodic line was obtained by linearly fitting the differences of the anodic curves, and it can be translated or rotated to predict the number and values of the corrosion potentials.
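The intersection idea can be sketched numerically: given a sampled apparent anodic curve and a cathodic line in (E, log|i|) space, each crossing is a candidate corrosion potential. The linear-interpolation root finder below is an illustrative stand-in for the paper's graphical construction.

```python
import numpy as np

def corrosion_potentials(E, log_i_anodic, slope, intercept):
    """Potentials where the sampled anodic curve crosses the cathodic
    line log|i_c| = slope*E + intercept; each sign change of the
    difference is bracketed and located by linear interpolation."""
    diff = log_i_anodic - (slope * E + intercept)
    roots = []
    for k in range(len(E) - 1):
        if diff[k] == 0.0:
            roots.append(E[k])                    # exact grid-point crossing
        elif diff[k] * diff[k + 1] < 0:
            t = diff[k] / (diff[k] - diff[k + 1]) # fraction along the segment
            roots.append(E[k] + t * (E[k + 1] - E[k]))
    return roots
```

Translating the line (changing intercept) or rotating it (changing slope) moves the crossings, which is how the model predicts how many corrosion potentials appear and where.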
Directory of Open Access Journals (Sweden)
Wei Chen
2017-01-01
Automated tool trajectory planning for spray painting robots is still a challenging problem, especially for large complex curved surfaces. This paper presents a new method of trajectory optimization for spray painting robots based on the exponential mean Bézier method. The definition and three theorems of exponential mean Bézier curves are discussed. Then a spatial painting path generation method based on exponential mean Bézier curves is developed. A new simple algorithm for trajectory optimization on complex curved surfaces is introduced, with a golden-section method adopted to compute the optimal values. The experimental results illustrate that the exponential mean Bézier curves enhance the flexibility of path planning and that the trajectory optimization algorithm achieves satisfactory performance. This method can also be extended to other applications.
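A golden-section search of the kind the authors adopt for the one-dimensional optimization step can be sketched generically as follows (a textbook implementation, independent of the Bézier machinery):

```python
import math

def golden_section_min(f, a, b, tol=1e-8):
    """Golden-section search for the minimum of a unimodal f on [a, b]:
    shrink the bracket by the golden ratio, reusing one interior point
    per iteration."""
    invphi = (math.sqrt(5.0) - 1.0) / 2.0          # 1/phi, about 0.618
    c = b - invphi * (b - a)
    d = a + invphi * (b - a)
    while abs(b - a) > tol:
        if f(c) < f(d):
            b, d = d, c                            # minimum lies in [a, d]
            c = b - invphi * (b - a)
        else:
            a, c = c, d                            # minimum lies in [c, b]
            d = a + invphi * (b - a)
    return 0.5 * (a + b)
```

Each iteration needs only one new function evaluation, which matters when f involves evaluating paint-thickness deviation along a candidate trajectory.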
Energy Technology Data Exchange (ETDEWEB)
Huang, Xianjun, E-mail: xianjun.huang@manchester.ac.uk [School of Electrical and Electronic Engineering, University of Manchester, Manchester M13 9PL (United Kingdom); College of Electronic Science and Engineering, National University of Defense Technology, Changsha 410073 (China); Hu, Zhirun [School of Electrical and Electronic Engineering, University of Manchester, Manchester M13 9PL (United Kingdom); Liu, Peiguo [College of Electronic Science and Engineering, National University of Defense Technology, Changsha 410073 (China)
2014-11-15
This paper proposes a new type of graphene-based tunable radar absorbing screen. The absorbing screen consists of a Hilbert curve metal strip array and a chemical vapour deposition (CVD) graphene sheet. The graphene-based screen is not only tunable when the chemical potential of the graphene changes, but also has broadband effective absorption. The absorption bandwidth is from 8.9 GHz to 18.1 GHz, i.e., a relative bandwidth of more than 68%, at a chemical potential of 0 eV, which is significantly wider than if the graphene sheet had not been employed. As the chemical potential varies from 0 to 0.4 eV, the central frequency of the screen can be tuned from 13.5 GHz to 19.0 GHz. In the proposed structure, the Hilbert curve metal strip array was designed to provide multiple narrow-band resonances, whereas the graphene sheet directly underneath the metal strip array provides tunability and the averagely required surface resistance, so as to significantly extend the screen operation bandwidth by providing broadband impedance matching and absorption. In addition, the thickness of the screen has been optimized to nearly achieve the minimum thickness limit for a nonmagnetic absorber. The working principle of this absorbing screen is studied in detail, and performance under various incident angles is presented. This work extends applications of graphene into tunable microwave radar cross section (RCS) reduction applications.
International Nuclear Information System (INIS)
Huang, Xianjun; Hu, Zhirun; Liu, Peiguo
2014-01-01
This paper proposes a new type of graphene-based tunable radar absorbing screen. The screen consists of a Hilbert curve metal strip array and a chemical vapour deposition (CVD) graphene sheet. The graphene-based screen is not only tunable through the chemical potential of the graphene, but also exhibits broadband effective absorption. At a chemical potential of 0 eV, the absorption bandwidth extends from 8.9 GHz to 18.1 GHz, i.e., a relative bandwidth of more than 68%, which is significantly wider than would be achieved without the graphene sheet. As the chemical potential varies from 0 to 0.4 eV, the central frequency of the screen can be tuned from 13.5 GHz to 19.0 GHz. In the proposed structure, the Hilbert curve metal strip array provides multiple narrowband resonances, whereas the graphene sheet directly underneath it provides tunability and the required average surface resistance, significantly extending the operating bandwidth through broadband impedance matching and absorption. In addition, the thickness of the screen has been optimized to approach the minimum thickness limit for a nonmagnetic absorber. The working principle of the absorbing screen is studied in detail, and its performance under various incident angles is presented. This work extends the applications of graphene to tunable microwave radar cross section (RCS) reduction.
DEFF Research Database (Denmark)
Bernstein, Daniel J.; Birkner, Peter; Lange, Tanja
2013-01-01
-arithmetic level are as follows: (1) use Edwards curves instead of Montgomery curves; (2) use extended Edwards coordinates; (3) use signed-sliding-window addition-subtraction chains; (4) batch primes to increase the window size; (5) choose curves with small parameters and base points; (6) choose curves with large...
Xu, Lili; Luo, Shuqian
2010-11-01
Microaneurysms (MAs) are the first manifestation of diabetic retinopathy (DR) as well as an indicator of its progression. Their automatic detection plays a key role in both mass screening and monitoring, and is therefore at the core of any system for computer-assisted diagnosis of DR. The algorithm comprises the following stages: candidate detection, which extracts patterns possibly corresponding to MAs using a mathematical morphology black top-hat transform; feature extraction, which characterizes these candidates; and classification based on a support vector machine (SVM), which validates the MAs. The selection of the feature vector and the SVM kernel function is crucial to the algorithm. We use the receiver operating characteristic (ROC) curve to evaluate the discriminating performance of different feature vectors and different SVM kernel functions. The ROC analysis indicates that the quadratic polynomial SVM with a combination of features as input shows the best discriminating performance.
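The ROC comparison described above reduces to computing the area under the ROC curve (AUC) for each candidate feature vector or kernel. A minimal sketch of that computation via the rank-sum (Mann-Whitney) identity, ignoring tied scores; the labels and scores below are illustrative, not the paper's retinal features:

```python
import numpy as np

def roc_auc(labels, scores):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity.

    labels: 0/1 ground truth; scores: classifier outputs.
    Ties in `scores` are ignored for brevity.
    """
    labels = np.asarray(labels)
    scores = np.asarray(scores)
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)  # 1-based ranks by score
    n_pos = labels.sum()
    n_neg = len(labels) - n_pos
    return (ranks[labels == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# Perfect separation of MA candidates from non-MAs gives AUC = 1.0.
print(roc_auc([0, 0, 1, 1], [0.1, 0.2, 0.8, 0.9]))  # 1.0
```

In practice each SVM variant's decision values on a held-out set would be fed to this function and the kernel with the highest AUC retained.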
A Method for Formulizing Disaster Evacuation Demand Curves Based on SI Model
Directory of Open Access Journals (Sweden)
Yulei Song
2016-10-01
The prediction of evacuation demand curves is a crucial step in disaster evacuation planning, which directly affects the performance of the evacuation. In this paper, we discuss the factors influencing individual evacuation decision making (whether and when to leave) and group them into four kinds: individual characteristics, social influence, geographic location, and warning degree. Viewing decision making as a social contagion, we propose a method based on the Susceptible-Infective (SI) model to formulize disaster evacuation demand curves, addressing both social influence and the effects of the other factors. The "Tianjin Explosions" disaster is used as a case study to illustrate how the modeling results are influenced by the four factors and to perform sensitivity analyses of the key parameters of the model. Some interesting phenomena are found and discussed, which is meaningful for authorities making specific evacuation plans. For example, due to the lower social influence in isolated communities, extra actions might be taken to accelerate the evacuation process in those communities.
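As a rough illustration of the SI-based formulation, the sketch below integrates a contagion-style decision process in which an undecided resident evacuates either through social influence (rate `beta`) or spontaneously in response to warnings (rate `alpha`). All parameter values are hypothetical, not the paper's fitted ones:

```python
import numpy as np

def evacuation_demand(N=10_000, I0=10, beta=0.9, alpha=0.01, T=48.0, dt=0.1):
    """Cumulative share of evacuees over time under SI-type social contagion.

    beta: social-influence contact rate; alpha: spontaneous (warning-driven)
    decision rate. Both are illustrative values, not fitted parameters.
    """
    steps = int(T / dt)
    t = np.arange(steps) * dt
    demand = np.empty(steps)
    I = float(I0)                    # residents who have decided to leave
    for k in range(steps):
        S = N - I                    # residents still undecided
        I += (beta * I / N + alpha) * S * dt  # forward-Euler SI update
        demand[k] = I / N
    return t, demand

t, d = evacuation_demand()
# The demand curve is S-shaped and saturates near 1 (everyone evacuates).
print(round(d[-1], 3), bool(np.all(np.diff(d) >= 0)))
```

Lowering `beta` (weaker social influence, as in the isolated communities mentioned above) flattens the curve and delays the bulk of the demand.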
Composite Field Multiplier based on Look-Up Table for Elliptic Curve Cryptography Implementation
Directory of Open Access Journals (Sweden)
Marisa W. Paryasto
2013-09-01
Implementing a secure cryptosystem requires operations involving hundreds of bits. One of the most recommended algorithms is Elliptic Curve Cryptography (ECC). The complexity of elliptic curve algorithms and parameters with hundreds of bits requires a specific design and implementation strategy. The architecture must be customized according to the security requirements, available resources, and parameter choices. In this work we propose the use of a composite field to implement finite field multiplication for ECC. We use a 299-bit key length represented in GF((2^13)^23) instead of GF(2^299). A composite field multiplier can be implemented using different multipliers for the ground field and the extension field. In this paper, a look-up table (LUT) is used for multiplication in the ground field and a classic multiplier is used for the extension field. A generic architecture for the multiplier is presented. Implementation is done in VHDL targeting the Altera DE2 device. This work uses the simplest algorithms to confirm the idea that splitting the field into a composite field, with different multipliers for the ground and extension fields, gives a better time-area trade-off. It is the starting point of our further research implementing composite fields using Mastrovito hybrid, KOA, and LUT multipliers.
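The ground-field LUT idea can be sketched with log/antilog tables. For brevity, this toy uses GF(2^4) with primitive polynomial x^4 + x + 1 rather than the paper's 13-bit ground field, but the table-driven multiplication is the same shape at any width:

```python
# LUT-based ground-field multiplication in GF(2^4), primitive polynomial
# x^4 + x + 1 (0b10011), generator g = x (i.e., the element 2).
POLY, M = 0b10011, 4
exp = [0] * 30   # exp[i] = g^i, doubled so exp[log x + log y] needs no mod
log = [0] * 16
a = 1
for i in range(15):
    exp[i] = exp[i + 15] = a
    log[a] = i
    a <<= 1                 # multiply by x ...
    if a & (1 << M):
        a ^= POLY           # ... and reduce modulo the primitive polynomial

def gf16_mul(x, y):
    """Multiply in GF(2^4) with one integer add and two table lookups."""
    if x == 0 or y == 0:
        return 0
    return exp[log[x] + log[y]]

print(gf16_mul(2, 8))  # x * x^3 = x^4 ≡ x + 1 = 3
```

An extension-field multiplier over GF((2^4)^k) would then call `gf16_mul` for its coefficient products, exactly as the paper's GF((2^13)^23) design calls its 13-bit LUT.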
Automatic Curve Fitting Based on Radial Basis Functions and a Hierarchical Genetic Algorithm
Directory of Open Access Journals (Sweden)
G. Trejo-Caballero
2015-01-01
Curve fitting is a challenging problem that arises in a wide variety of scientific and engineering applications. Given a set of possibly noisy data points, the goal is to build a compact representation of the curve that best estimates the unknown underlying relationship between two variables. Despite the large number of methods available, the problem remains challenging. In this paper, a new method using strictly a linear combination of radial basis functions (RBFs) is proposed. Specifically, we divide the parameter search space into linear and nonlinear subspaces. A hierarchical genetic algorithm (HGA) minimizes a model selection criterion, which allows us to determine the nonlinear parameters automatically and simultaneously; the linear parameters are then computed by least squares via singular value decomposition (SVD). The method is fully automatic and does not require subjective parameters such as a smoothing factor or centre locations. To validate the approach, we perform an experimental study with several tests on benchmark smooth functions. A comparative analysis with two successful RBF-network-based methods is included.
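The linear half of the split described above is a plain least-squares solve once the nonlinear parameters (centres and widths, found by the HGA in the paper) are fixed. A sketch with hypothetical fixed centres and width, on synthetic data:

```python
import numpy as np

def rbf_fit(x, y, centers, width):
    """Solve the linear RBF weights by SVD-based least squares.

    This is only the linear half of the paper's split; `centers` and
    `width` stand in for the nonlinear parameters the HGA would find.
    """
    # Design matrix: one Gaussian RBF column per centre.
    Phi = np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * width ** 2))
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)   # lstsq uses SVD internally
    return w, Phi

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 100)
y = np.sin(2 * np.pi * x) + 0.01 * rng.standard_normal(100)
w, Phi = rbf_fit(x, y, centers=np.linspace(0, 1, 10), width=0.1)
print(np.max(np.abs(Phi @ w - y)))  # residual is at the noise level
```

In the full method, the HGA proposes candidate (centres, width) sets, each is scored with this solve plus a model selection criterion, and the best survives.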
Composite Field Multiplier based on Look-Up Table for Elliptic Curve Cryptography Implementation
Directory of Open Access Journals (Sweden)
Marisa W. Paryasto
2012-04-01
Implementing a secure cryptosystem requires operations involving hundreds of bits. One of the most recommended algorithms is Elliptic Curve Cryptography (ECC). The complexity of elliptic curve algorithms and parameters with hundreds of bits requires a specific design and implementation strategy. The architecture must be customized according to the security requirements, available resources, and parameter choices. In this work we propose the use of a composite field to implement finite field multiplication for ECC. We use a 299-bit key length represented in GF((2^13)^23) instead of GF(2^299). A composite field multiplier can be implemented using different multipliers for the ground field and the extension field. In this paper, a look-up table (LUT) is used for multiplication in the ground field and a classic multiplier is used for the extension field. A generic architecture for the multiplier is presented. Implementation is done in VHDL targeting the Altera DE2 device. This work uses the simplest algorithms to confirm the idea that splitting the field into a composite field, with different multipliers for the ground and extension fields, gives a better time-area trade-off. It is the starting point of our further research implementing composite fields using Mastrovito hybrid, KOA, and LUT multipliers.
An Elliptic Curve Based Schnorr Cloud Security Model in Distributed Environment
Directory of Open Access Journals (Sweden)
Vinothkumar Muthurajan
2016-01-01
Cloud computing requires the security upgrade in data transmission approaches. In general, key-based encryption/decryption (symmetric and asymmetric) mechanisms ensure the secure data transfer between the devices. The symmetric key mechanisms (pseudorandom function) provide minimum protection level compared to asymmetric key (RSA, AES, and ECC) schemes. The presence of expired content and the irrelevant resources cause unauthorized data access adversely. This paper investigates how the integrity and secure data transfer are improved based on the Elliptic Curve based Schnorr scheme. This paper proposes a virtual machine based cloud model with Hybrid Cloud Security Algorithm (HCSA) to remove the expired content. The HCSA-based auditing improves the malicious activity prediction during the data transfer. The duplication in the cloud server degrades the performance of EC-Schnorr based encryption schemes. This paper utilizes the blooming filter concept to avoid the cloud server duplication. The combination of EC-Schnorr and blooming filter efficiently improves the security performance. The comparative analysis between proposed HCSA and the existing Distributed Hash Table (DHT) regarding execution time, computational overhead, and auditing time with auditing requests and servers confirms the effectiveness of HCSA in the cloud security model creation.
An Elliptic Curve Based Schnorr Cloud Security Model in Distributed Environment.
Muthurajan, Vinothkumar; Narayanasamy, Balaji
2016-01-01
Cloud computing requires the security upgrade in data transmission approaches. In general, key-based encryption/decryption (symmetric and asymmetric) mechanisms ensure the secure data transfer between the devices. The symmetric key mechanisms (pseudorandom function) provide minimum protection level compared to asymmetric key (RSA, AES, and ECC) schemes. The presence of expired content and the irrelevant resources cause unauthorized data access adversely. This paper investigates how the integrity and secure data transfer are improved based on the Elliptic Curve based Schnorr scheme. This paper proposes a virtual machine based cloud model with Hybrid Cloud Security Algorithm (HCSA) to remove the expired content. The HCSA-based auditing improves the malicious activity prediction during the data transfer. The duplication in the cloud server degrades the performance of EC-Schnorr based encryption schemes. This paper utilizes the blooming filter concept to avoid the cloud server duplication. The combination of EC-Schnorr and blooming filter efficiently improves the security performance. The comparative analysis between proposed HCSA and the existing Distributed Hash Table (DHT) regarding execution time, computational overhead, and auditing time with auditing requests and servers confirms the effectiveness of HCSA in the cloud security model creation.
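The duplicate-avoidance step rests on a standard Bloom filter (the paper's "blooming filter"): a bit array plus k hash functions that answers "definitely not seen" or "possibly seen". A minimal sketch with illustrative sizes, not anything tuned to a cloud workload:

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter, of the kind used to flag duplicate content
    before it reaches the cloud server. m and k are illustrative."""

    def __init__(self, m=1024, k=4):
        self.m, self.k = m, k
        self.bits = 0                     # m-bit array packed into an int

    def _positions(self, item):
        # Derive k positions from k salted SHA-256 digests.
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.m

    def add(self, item):
        for p in self._positions(item):
            self.bits |= 1 << p

    def might_contain(self, item):
        # False means "definitely absent"; True may be a false positive.
        return all(self.bits >> p & 1 for p in self._positions(item))

bf = BloomFilter()
bf.add("block-42")
print(bf.might_contain("block-42"))  # True: no false negatives
```

A server would consult `might_contain` before storing or re-encrypting a block, accepting a small, tunable false-positive rate in exchange for constant-time, constant-space duplicate checks.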
FLAT TIME-LIKE SUBMANIFOLDS IN ANTI-DE SITTER SPACE H_1^{2n-1}(-1)
Institute of Scientific and Technical Information of China (English)
ZUO DAFENG; CHEN QING; CHENG YI
2005-01-01
By using dressing actions of the G_{n-1,n-1}^{1,1}-system, the authors study geometric transformations for flat time-like n-submanifolds with flat, non-degenerate normal bundle in anti-de Sitter space H_1^{2n-1}(-1), where G_{n-1,n-1}^{1,1} = O(2n-2, 2)/O(n-1, 1) × O(n-1, 1).
Spectral optimization simulation of white light based on the photopic eye-sensitivity curve
Energy Technology Data Exchange (ETDEWEB)
Dai, Qi, E-mail: qidai@tongji.edu.cn [College of Architecture and Urban Planning, Tongji University, 1239 Siping Road, Shanghai 200092 (China); Institute for Advanced Study, Tongji University, 1239 Siping Road, Shanghai 200092 (China); Key Laboratory of Ecology and Energy-saving Study of Dense Habitat (Tongji University), Ministry of Education, 1239 Siping Road, Shanghai 200092 (China); Hao, Luoxi; Lin, Yi; Cui, Zhe [College of Architecture and Urban Planning, Tongji University, 1239 Siping Road, Shanghai 200092 (China); Key Laboratory of Ecology and Energy-saving Study of Dense Habitat (Tongji University), Ministry of Education, 1239 Siping Road, Shanghai 200092 (China)
2016-02-07
Spectral optimization of white light is simulated to determine the maximum attainable luminous efficacy of radiation at high color-rendering index (CRI) and various color temperatures. The photopic eye-sensitivity curve V(λ) is used as the dominant portion of the white light spectrum. Emission spectra of a blue InGaN light-emitting diode (LED) and a red AlInGaP LED are added to the V(λ) spectrum to match the white color coordinates. It is demonstrated that for color temperatures from 2500 K to 6500 K and CRI above 90, such white sources can achieve a spectral efficacy of 330–390 lm/W, higher than previously reported theoretical maxima. We show that this eye-sensitivity-based approach also has advantages in component energy conversion efficiency compared with previously reported optimization solutions.
Spectral optimization simulation of white light based on the photopic eye-sensitivity curve
International Nuclear Information System (INIS)
Dai, Qi; Hao, Luoxi; Lin, Yi; Cui, Zhe
2016-01-01
Spectral optimization of white light is simulated to determine the maximum attainable luminous efficacy of radiation at high color-rendering index (CRI) and various color temperatures. The photopic eye-sensitivity curve V(λ) is used as the dominant portion of the white light spectrum. Emission spectra of a blue InGaN light-emitting diode (LED) and a red AlInGaP LED are added to the V(λ) spectrum to match the white color coordinates. It is demonstrated that for color temperatures from 2500 K to 6500 K and CRI above 90, such white sources can achieve a spectral efficacy of 330–390 lm/W, higher than previously reported theoretical maxima. We show that this eye-sensitivity-based approach also has advantages in component energy conversion efficiency compared with previously reported optimization solutions.
A User Authentication Scheme Based on Elliptic Curves Cryptography for Wireless Ad Hoc Networks.
Chen, Huifang; Ge, Linlin; Xie, Lei
2015-07-14
Because a wireless ad hoc network (WANET) operates without infrastructure support, it is vulnerable to various attacks, and user authentication is the first safety barrier in such a network. Mutual trust is achieved by a protocol that enables communicating parties to authenticate each other and exchange session keys at the same time. For the resource-constrained WANET, an efficient and lightweight user authentication scheme is necessary. In this paper, we propose a user authentication scheme based on a self-certified public key system and elliptic curve cryptography for a WANET. Using the proposed scheme, efficient two-way user authentication and secure session key agreement can be achieved. Security analysis shows that the proposed scheme is resilient to common known attacks, and performance analysis shows that it performs similarly to or better than some existing user authentication schemes.
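The session-key agreement underlying such schemes can be illustrated with textbook elliptic-curve Diffie-Hellman. The sketch below uses the tiny teaching curve y² = x³ + 2x + 2 over F₁₇ (generator (5, 1) of order 19) purely for illustration; the paper's scheme, like any real deployment, would use a standardized curve of roughly 256 bits together with its self-certified public keys:

```python
# Toy elliptic-curve Diffie-Hellman over F_17 (textbook parameters; the
# private keys below are arbitrary illustrative values).
P_FIELD, A = 17, 2        # field prime and curve coefficient a in y^2 = x^3 + ax + b
G = (5, 1)                # generator point of order 19

def ec_add(p, q):
    """Affine point addition; None represents the point at infinity."""
    if p is None:
        return q
    if q is None:
        return p
    (x1, y1), (x2, y2) = p, q
    if x1 == x2 and (y1 + y2) % P_FIELD == 0:
        return None                                   # p + (-p) = infinity
    if p == q:
        s = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_FIELD)   # tangent slope
    else:
        s = (y2 - y1) * pow(x2 - x1, -1, P_FIELD)          # chord slope
    x3 = (s * s - x1 - x2) % P_FIELD
    return x3, (s * (x1 - x3) - y1) % P_FIELD

def ec_mul(k, p):
    """Scalar multiplication by double-and-add."""
    r = None
    while k:
        if k & 1:
            r = ec_add(r, p)
        p = ec_add(p, p)
        k >>= 1
    return r

a, b = 7, 11                                # the two parties' private keys
shared = ec_mul(a, ec_mul(b, G))            # a * (b * G)
print(shared == ec_mul(b, ec_mul(a, G)))    # True: both sides derive the same point
```

The shared point (or a hash of its x-coordinate) then serves as the session key; the authentication layer binds each public point `ec_mul(a, G)` to an identity.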
Renson, Ludovic; Barton, David A. W.; Neild, Simon A.
Control-based continuation (CBC) is a means of applying numerical continuation directly to a physical experiment for bifurcation analysis without the use of a mathematical model. CBC enables the detection and tracking of bifurcations directly, without the need for a post-processing stage as is often the case for more traditional experimental approaches. In this paper, we use CBC to directly locate limit-point bifurcations of a periodically forced oscillator and track them as forcing parameters are varied. Backbone curves, which capture the overall frequency-amplitude dependence of the system’s forced response, are also traced out directly. The proposed method is demonstrated on a single-degree-of-freedom mechanical system with a nonlinear stiffness characteristic. Results are presented for two configurations of the nonlinearity — one where it exhibits a hardening stiffness characteristic and one where it exhibits softening-hardening.
A study of swing-curve physics in diffraction-based overlay
Bhattacharyya, Kaustuve; den Boef, Arie; Storms, Greet; van Heijst, Joost; Noot, Marc; An, Kevin; Park, Noh-Kyoung; Jeon, Se-Ra; Oh, Nang-Lyeom; McNamara, Elliott; van de Mast, Frank; Oh, SeungHwa; Lee, Seung Yoon; Hwang, Chan; Lee, Kuntack
2016-03-01
With the increase of process complexity at advanced nodes, the requirements on process robustness in overlay metrology continue to tighten. Especially with the introduction of new materials in the film stack, along with typical stack variations (thickness, optical properties, profile asymmetry, etc.), the physics of signal formation in diffraction-based overlay (DBO) becomes an important consideration in overlay metrology target and recipe selection. To address this signal formation physics, the swing-curve phenomenon is studied across wavelengths and polarizations on production stacks, using both simulations and DBO measurements. The results provide a wealth of information for robust target and recipe selection. Details from simulations and measurements are reported in this publication.
Directory of Open Access Journals (Sweden)
Satar Rezaei
2016-06-01
Introduction: Inequality is prevalent in all sectors, particularly in the distribution of and access to resources in the health sector. The aim of the current study was to investigate the distribution of physicians and hospital beds in Iran in 2001, 2006, and 2011. Methods: This retrospective, cross-sectional study evaluated the distribution of physicians and hospital beds in 2001, 2006, and 2011 using the Gini coefficient and the Lorenz curve. The required data, including the number of physicians (general practitioners and specialists), the number of hospital beds, and the number of hospitalized patients, were obtained from the statistical yearbook of the Iranian Statistical Center (ISC). The data analysis was performed with DASP software. Results: The Gini coefficients for physicians and hospital beds in 2001 were 0.19 and 0.16 based on population, and 0.48 and 0.37 based on hospitalized patients, respectively. In 2006, these values were 0.18 and 0.15 based on population, and 0.21 and 0.21 based on hospitalized patients. In 2011, the Gini coefficients were 0.16 and 0.13 based on population, and 0.47 and 0.37 based on hospitalized patients. Although the distribution had improved in 2011 compared with 2001 in terms of both population and number of hospitalized patients, inequality was greater when measured against the number of hospitalized patients than against population. Conclusion: This study indicated that inequality in the distribution of physicians and hospital beds declined from 2001 to 2011. Because this distribution was based on population, it is suggested that in allocating resources health policymakers also consider need indices such as the pattern of diseases, illness-prone areas, the number of inpatients, and mortality.
High resolution melt curve analysis based on methylation status for human semen identification.
Fachet, Caitlyn; Quarino, Lawrence; Karnas, K Joy
2017-03-01
A high resolution melt curve assay to differentiate semen from blood, saliva, urine, and vaginal fluid, based on the methylation status of the Dapper Isoform 1 (DACT1) gene, was developed. Stains made from blood, saliva, urine, semen, and vaginal fluid were obtained from volunteers, and DNA was isolated using either organic extraction (saliva, urine, and vaginal fluid) or Chelex® 100 extraction (blood and semen). Extracts were then subjected to bisulfite modification to convert unmethylated cytosines to uracil, creating sequences whose amplicons have melt curves that vary with their initial methylation status. When primers designed to amplify the promoter region of the DACT1 gene were used, DNA from semen samples was distinguishable from the other fluids by a statistically significantly lower melting temperature. The assay was found to be sperm-specific, since semen from a vasectomized man produced a melting temperature similar to the non-semen body fluids. Blood and semen stains stored for up to 5 months and tested at various intervals showed little variation in melt temperature, indicating that the methylation status was stable over the course of the study. The assay is more viable for forensic practice than most molecular methods for body fluid identification, since it is time efficient and uses instrumentation common to forensic biology laboratories. In addition, unlike traditional presumptive chemical methods, the results are confirmatory, and the assay offers the possibility of multiplexing to test for multiple body fluids simultaneously.
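The melt-curve readout itself is simple to sketch: the melting temperature Tm is taken as the peak of the negative derivative of fluorescence with respect to temperature. The data below are a synthetic sigmoid, not the DACT1 assay's measurements:

```python
import numpy as np

def melting_temperature(T, F):
    """Tm = temperature at the peak of -dF/dT, the standard HRM readout."""
    return T[np.argmax(-np.gradient(F, T))]

# Synthetic melt curve: fluorescence drops sigmoidally around 78 C
# (an illustrative Tm, not the assay's value).
T = np.linspace(60.0, 95.0, 701)
F = 1.0 / (1.0 + np.exp((T - 78.0) / 0.8))
print(melting_temperature(T, F))  # ≈ 78.0
```

Genotyping then amounts to comparing the Tm values extracted this way for the WT and MT amplicons, since the bisulfite-converted sequences melt at measurably different temperatures.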
Determination of performance degradation of a marine diesel engine by using curve based approach
International Nuclear Information System (INIS)
Kökkülünk, Görkem; Parlak, Adnan; Erdem, Hasan Hüseyin
2016-01-01
Highlights: • A mathematical model was developed for a marine diesel engine. • Measurements were taken from the main engine of M/V Ince Inebolu. • The model was validated for the marine diesel engine. • The curve-based method was used to evaluate performance. • Degradation values of the marine diesel engine were found for power and SFC. - Abstract: Energy efficiency measures on ships are currently a top-priority topic for the maritime sector, and a key requirement is a useful tool for improving efficiency. There are two steps to improving energy efficiency on ships: measurement and performance evaluation of the main fuel consumers. Performance evaluation determines how much the performance changes owing to engine component degradation caused by wear, fouling, mechanical problems, etc. In this study, a zero-dimensional, two-zone combustion model is developed and validated for a two-stroke marine diesel engine (MITSUI MAN B&W 6S50MC). Measurements were taken from the ship M/V Ince Inebolu by the research team during normal operation of the main engine in the Marmara Sea. The "curve-based method" is used to calculate the total performance degradation, which is attributed, by means of the developed mathematical model, to compression pressure, injection timing, injection pressure, scavenge air temperature, and scavenge air pressure. In conclusion, the total degradation of the ship's engine is found to be 620 kW in power and 26.74 g/kWh in specific fuel consumption.
International Nuclear Information System (INIS)
Lindgren, Jonathan
2016-01-01
We study collisions of massive pointlike particles in three-dimensional anti-de Sitter space, generalizing the work on massless particles in http://dx.doi.org/10.1088/0264-9381/33/14/145009. We show how to construct exact solutions corresponding to the formation of either a black hole or a conical singularity from the collision of an arbitrary number of massive particles that fall in radially and collide at the origin of AdS. No restrictions are imposed on the masses or on the angular and radial positions from which the particles are released. We also consider the limit of an infinite number of particles, obtaining novel timelike thin shell spacetimes. These thin shells have an arbitrary mass distribution as well as a non-trivial embedding in which the radial location of the shell depends on the angular coordinate, and we analyze them using the junction formalism of general relativity. We also consider the massless limit, finding consistency with earlier results, and comment on the stress-energy tensor modes of the dual CFT.
Energy Technology Data Exchange (ETDEWEB)
Lindgren, Jonathan [Theoretische Natuurkunde, Vrije Universiteit Brussel, and the International Solvay Institutes,Pleinlaan 2, B-1050 Brussels (Belgium); Physique Théorique et Mathématique, Université Libre de Bruxelles,Campus Plaine C.P. 231, B-1050 Bruxelles (Belgium)
2016-12-13
We study collisions of massive pointlike particles in three-dimensional anti-de Sitter space, generalizing the work on massless particles in http://dx.doi.org/10.1088/0264-9381/33/14/145009. We show how to construct exact solutions corresponding to the formation of either a black hole or a conical singularity from the collision of an arbitrary number of massive particles that fall in radially and collide at the origin of AdS. No restrictions are imposed on the masses or on the angular and radial positions from which the particles are released. We also consider the limit of an infinite number of particles, obtaining novel timelike thin shell spacetimes. These thin shells have an arbitrary mass distribution as well as a non-trivial embedding in which the radial location of the shell depends on the angular coordinate, and we analyze them using the junction formalism of general relativity. We also consider the massless limit, finding consistency with earlier results, and comment on the stress-energy tensor modes of the dual CFT.
Rational quadratic trigonometric Bézier curve based on new basis with exponential functions
Directory of Open Access Journals (Sweden)
Wu Beibei
2017-06-01
We construct a rational quadratic trigonometric Bézier curve with four shape parameters by introducing two exponential functions into the trigonometric basis functions. It has properties similar to those of the rational quadratic Bézier curve. For given control points, the shape of the curve can be flexibly adjusted by changing the shape parameters and the weight. Some conics can be represented exactly when the control points, shape parameters, and weight are chosen appropriately. The C^0, C^1, and C^2 continuity conditions for joining two constructed curves are discussed. Some examples are given.
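For orientation, the rational quadratic form that the paper's trigonometric basis generalizes can be sketched as follows. This uses the classical Bernstein basis, not the paper's exponential-trigonometric one; with the middle weight w = cos(θ/2) it reproduces a circular arc exactly, the kind of exact conic representation the abstract refers to:

```python
import numpy as np

def rational_quadratic_bezier(P0, P1, P2, w, t):
    """Point on a standard rational quadratic Bezier curve at parameter t.

    The weight w on the middle control point pulls the curve toward P1;
    w = 1 recovers the ordinary (polynomial) quadratic Bezier curve.
    """
    b = np.array([(1 - t) ** 2, 2 * w * t * (1 - t), t ** 2])
    pts = np.array([P0, P1, P2], dtype=float)
    return (b @ pts) / b.sum()

# Quarter circle: P0=(1,0), P1=(1,1), P2=(0,1), w = cos(45 deg) = sqrt(2)/2.
p = rational_quadratic_bezier((1, 0), (1, 1), (0, 1), np.sqrt(2) / 2, 0.5)
print(np.hypot(*p))  # 1.0: the curve's midpoint lies on the unit circle
```

The paper's construction plays the same game with trigonometric basis functions modified by exponentials, gaining four shape parameters in addition to the weight.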
Stage-discharge rating curves based on satellite altimetry and modeled discharge in the Amazon basin
Paris, Adrien; Dias de Paiva, Rodrigo; Santos da Silva, Joecila; Medeiros Moreira, Daniel; Calmant, Stephane; Garambois, Pierre-André; Collischonn, Walter; Bonnet, Marie-Paule; Seyler, Frederique
2016-05-01
In this study, rating curves (RCs) were determined by applying satellite altimetry to a poorly gauged basin, demonstrating the synergistic application of remote sensing and watershed modeling to capture the dynamics and quantity of flow in the Amazon River Basin. Three major advancements for estimating basin-scale patterns in river discharge are described. The first is the preservation of the hydrological meaning of the parameters of Manning's equation, yielding a data set of river bed elevations throughout the basin. The second is the provision of parameter uncertainties and, therefore, uncertainties in the rated discharge. The third concerns estimating discharge in the presence of backwater effects. We analyzed the Amazon Basin using nearly one thousand altimetric series from ENVISAT and Jason-2 covering more than 100 tributaries. Discharge values and related uncertainties were obtained from the rainfall-discharge MGB-IPH model. We used a global optimization algorithm based on Markov chain Monte Carlo in a Bayesian framework to determine the rating curves. The data were randomly allocated into 80% calibration and 20% validation subsets. A comparison with the validation samples produced a Nash-Sutcliffe efficiency (Ens) of 0.68. When the MGB discharge uncertainties were less than 5%, the mean Ens value increased to 0.81. A comparison with the in situ discharge resulted in an Ens value of 0.71 for the validation samples (and 0.77 for calibration). The Ens values at the mouths of rivers experiencing backwater effects improved significantly when the mean monthly slope was included in the RC. Our RCs were not mission-dependent, and the Ens value was preserved when applying ENVISAT rating curves to Jason-2 altimetry at crossovers. The cease-to-flow parameter of our RCs provided a good proxy for river bed elevation. This proxy was validated
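The core of a rating curve is the power law Q = a(h - h0)^b relating discharge to stage. The paper estimates its parameters and their uncertainties with Bayesian MCMC; the parameterization itself can be sketched with a log-linear least-squares fit plus a grid search over the cease-to-flow height h0, on synthetic stage/discharge pairs:

```python
import numpy as np

# Synthetic stage (altimetric height) / discharge (model) pairs with
# illustrative true parameters a=40, h0=2, b=1.6 and 2% lognormal noise.
rng = np.random.default_rng(1)
h = np.linspace(3.0, 10.0, 60)                        # stage, m
Q = 40.0 * (h - 2.0) ** 1.6 * np.exp(0.02 * rng.standard_normal(60))

# For each candidate h0, log Q = log a + b * log(h - h0) is linear in
# (log a, b); keep the h0 with the smallest squared residual.
best = None
for h0_try in np.arange(0.0, 2.9, 0.05):
    X = np.column_stack([np.ones_like(h), np.log(h - h0_try)])
    coef, *_ = np.linalg.lstsq(X, np.log(Q), rcond=None)
    sse = np.sum((X @ coef - np.log(Q)) ** 2)
    if best is None or sse < best[0]:
        best = (sse, h0_try, np.exp(coef[0]), coef[1])

_, h0, a, b = best
print(round(h0, 2), round(b, 2))  # close to the true (2.0, 1.6)
```

The MCMC approach in the paper explores the same (a, h0, b) space but returns full posterior distributions, which is where the quoted parameter and discharge uncertainties come from.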
Detection of Time Lags between Quasar Continuum Emission Bands Based On Pan-STARRS Light Curves
Energy Technology Data Exchange (ETDEWEB)
Jiang, Yan-Fei [Kavli Institute for Theoretical Physics, University of California, Santa Barbara, CA 93106 (United States); Green, Paul J.; Pancoast, Anna; MacLeod, Chelsea L. [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Greene, Jenny E. [Department of Astrophysical Sciences, Princeton University, Princeton, NJ 08544 (United States); Morganson, Eric; Shen, Yue [Department of Astronomy, University of Illinois at Urbana-Champaign, Urbana, IL 61801 (United States); Anderson, Scott F.; Ruan, John J. [Department of Astronomy, University of Washington, Box 351580, Seattle, WA 98195 (United States); Brandt, W. N.; Grier, C. J. [Department of Astronomy and Astrophysics, The Pennsylvania State University, University Park, PA 16802 (United States); Rix, H.-W. [Max Planck Institute for Astronomy, Königstuhl 17, D-69117 Heidelberg (Germany); Protopapas, Pavlos [Institute for Applied Computational Science, John A. Paulson School of Engineering and Applied Sciences, Harvard University, Cambridge, MA 02138 (United States); Scott, Caroline [Astrophysics, Imperial College London, Blackett Laboratory, London SW7 2AZ (United Kingdom); Burgett, W. S.; Hodapp, K. W.; Huber, M. E.; Kaiser, N.; Kudritzki, R. P.; Magnier, E. A. [Institute for Astronomy, University of Hawaii, 2680 Woodlawn Drive, Honolulu HI 96822 (United States); and others
2017-02-20
We study the time lags between the continuum emission of quasars at different wavelengths, based on more than four years of multi-band (g, r, i, z) light curves in the Pan-STARRS Medium Deep Fields. As photons in different bands emerge from different radial ranges in the accretion disk, the lags constrain the sizes of the accretion disks. We select 240 quasars with redshifts of z ≈ 1 or z ≈ 0.3 that are relatively emission-line free. The light curves are sampled on day to month timescales, which makes it possible to detect lags on the scale of the light crossing time of the accretion disks. With the code JAVELIN, we detect typical lags of several days in the rest frame between the g band and the r, i, and z bands. The detected lags are ∼2–3 times larger than the light crossing time estimated from the standard thin disk model, consistent with the recently measured lag in NGC 5548 and with microlensing measurements of quasars. The lags in our sample are found to increase with increasing luminosity. Furthermore, the increase in lags going from g − r to g − i and then to g − z is slower than predicted by the thin disk model, particularly for high-luminosity quasars. The radial temperature profile in the disk must therefore differ from what is assumed. We also find evidence that the lags decrease with increasing line ratio between ultraviolet Fe II lines and Mg II, which may point to changes in the accretion disk structure at higher metallicity.
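A back-of-envelope version of inter-band lag detection is a cross-correlation over trial lags (an ICCF-style estimate; the paper itself uses the damped-random-walk code JAVELIN). Synthetic random-walk light curves stand in for the Pan-STARRS data:

```python
import numpy as np

# Build two bands of the same random-walk continuum, one lagging by 5 days.
rng = np.random.default_rng(2)
walk = np.cumsum(rng.standard_normal(420))   # random-walk continuum driver
t = np.arange(400.0)                         # observation epochs, days
g_band = walk[10:410]                        # driving (bluer) band
z_band = walk[5:405]                         # same signal, lagging g by 5 days

def ccf_lag(t, a, b, lags):
    """Trial lag (days) maximizing the correlation of a(t) with b(t - lag)."""
    r = [np.corrcoef(a, np.interp(t - lag, t, b))[0, 1] for lag in lags]
    return lags[int(np.argmax(r))]

print(ccf_lag(t, z_band, g_band, np.arange(-20, 21)))  # recovers the 5-day lag
```

Model-based methods like JAVELIN improve on this by interpolating irregularly sampled data under an explicit stochastic model, which is essential when, as here, the lags are comparable to the sampling cadence.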
On-chip magnetic bead-based DNA melting curve analysis using a magnetoresistive sensor
International Nuclear Information System (INIS)
Rizzi, Giovanni; Østerberg, Frederik W.; Henriksen, Anders D.; Dufva, Martin; Hansen, Mikkel F.
2015-01-01
We present real-time measurements of DNA melting curves in a chip-based system that detects the amount of surface-bound magnetic beads using magnetoresistive magnetic field sensors. The sensors detect the difference between the amount of beads bound to the top and bottom sensor branches of the differential sensor geometry. The sensor surfaces are functionalized with wild type (WT) and mutant type (MT) capture probes, differing by a single base insertion (a single nucleotide polymorphism, SNP). Complementary biotinylated targets in suspension couple streptavidin magnetic beads to the sensor surface. The beads are magnetized by the field arising from the bias current passed through the sensors. We demonstrate the first on-chip measurements of the melting of DNA hybrids as the temperature is ramped. This overcomes the limitation of using a single washing condition at constant temperature. Moreover, we demonstrate that a single sensor bridge can be used to genotype a SNP. - Highlights: • We apply magnetoresistive sensors to study solid-surface hybridization kinetics of DNA. • We measure DNA melting profiles for perfectly matching DNA duplexes and for a single base mismatch. • We present a procedure to correct for temperature dependencies of the sensor output. • We reliably extract melting temperatures for the DNA hybrids. • We demonstrate direct measurement of differential binding signal for two probes on a single sensor.
A Method of Timbre-Shape Synthesis Based On Summation of Spherical Curves
DEFF Research Database (Denmark)
Putnam, Lance Jonathan
2014-01-01
It is well-known that there is a rich correspondence between sound and visual curves, perhaps most widely explored through direct input of sound into an oscilloscope. However, there have been relatively few proposals on how to translate sound into three-dimensional curves. We present a novel meth...
A Literature-Based Analysis of the Learning Curves of Laparoscopic Radical Prostatectomy
Directory of Open Access Journals (Sweden)
Daniel W. Good
2014-05-01
Full Text Available There is a trend for the increased adoption of minimally invasive techniques of radical prostatectomy (RP) – laparoscopic (LRP) and robotic-assisted (RARP) – over the traditional open radical retropubic prostatectomy (ORP) popularised by Partin et al. Recently there has been a dramatic expansion in the rates of RARP being performed, and there have been many early reports postulating that the learning curve for RARP is shorter than for LRP. The aim of this study was to review the literature and analyse the length of the LRP learning curves for the various outcome measures: perioperative, oncologic, and functional outcomes. A broad search of the literature was performed in November 2013 using the PubMed database. Only studies of real patients and those from 2004 until 2013 were included; those on simulators were excluded. In total, 239 studies were identified, of which 13 were included. The learning curve is a heterogeneous entity, depending entirely on the criteria used to define it. There is evidence of multiple learning curves; however, the length of these is dependent on the definitions used by the authors. Few studies use the more rigorous definition of plateauing of the curve. The perioperative learning curve takes approximately 150-200 cases to plateau, the oncologic curve approximately 200 cases, and the functional learning curve up to 700 cases (700 for potency, 200 for continence). In this review, we have analysed the literature with respect to the learning curve for LRP. It is clear that the learning curve is long. This necessitates centralising LRP to high-volume centres so that surgeons, trainees, and patients are able to utilise the benefits of LRP.
Steenhuis, T. S.; Mendoza, G.; Lyon, S. W.; Gerard Marchant, P.; Walter, M. T.; Schneiderman, E.
2003-04-01
Because the traditional Soil Conservation Service Curve Number (SCS-CN) approach continues to be ubiquitously used in GIS-based water quality models, new application methods are needed that are consistent with variable source area (VSA) hydrological processes in the landscape. Within an integrated GIS modeling environment, we developed a distributed approach for applying the traditional SCS-CN equation to watersheds where VSA hydrology is a dominant process. Spatial representation of hydrologic processes is important for watershed planning because restricting potentially polluting activities from runoff source areas is fundamental to controlling non-point source pollution. The methodology presented here uses the traditional SCS-CN method to predict runoff volume and the spatial extent of saturated areas, and uses a topographic index to distribute runoff source areas through watersheds. The resulting distributed CN-VSA method was incorporated into an existing GWLF water quality model and applied to sub-watersheds of the Delaware basin in the Catskill Mountains region of New York State. We found that the distributed CN-VSA approach provides a physically based method that gives realistic results for watersheds with VSA hydrology.
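The traditional SCS-CN runoff equation at the core of the method above is compact enough to sketch directly; a minimal Python version, using the conventional US-customary form with initial abstraction Ia = 0.2S (the CN value and rainfall depth below are illustrative):

```python
# Minimal sketch of the traditional SCS-CN runoff equation (US customary
# units: P and Q in inches). S is the potential maximum retention derived
# from the curve number CN; Ia = 0.2*S is the conventional initial abstraction.

def scs_cn_runoff(p_in, cn):
    """Direct runoff depth Q (inches) for rainfall P (inches) and curve number CN."""
    s = 1000.0 / cn - 10.0          # potential maximum retention
    ia = 0.2 * s                    # initial abstraction
    if p_in <= ia:
        return 0.0                  # all rainfall absorbed before runoff begins
    return (p_in - ia) ** 2 / (p_in - ia + s)

# 4 inches of rain on a CN = 80 watershed
print(round(scs_cn_runoff(4.0, 80), 3))
```

The distributed CN-VSA idea is then to vary where this runoff is generated spatially (via a topographic index) rather than to change the equation itself.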
Histogram Curve Matching Approaches for Object-based Image Classification of Land Cover and Land Use
Toure, Sory I.; Stow, Douglas A.; Weeks, John R.; Kumar, Sunil
2013-01-01
The classification of image-objects is usually done using parametric statistical measures of central tendency and/or dispersion (e.g., mean or standard deviation). The objectives of this study were to analyze digital number histograms of image objects and evaluate classification measures that exploit the characteristic signatures of such histograms. Two histogram matching classifiers were evaluated and compared to the standard nearest-neighbor-to-mean classifier. An ADS40 airborne multispectral image of San Diego, California, was used for assessing the utility of curve matching classifiers in a geographic object-based image analysis (GEOBIA) approach. The classifications were performed with data sets having 0.5 m, 2.5 m, and 5 m spatial resolutions. Results show that histograms are reliable features for characterizing classes. Also, both histogram matching classifiers consistently performed better than the one based on the standard nearest-neighbor-to-mean rule. The highest classification accuracies were produced with images having 2.5 m spatial resolution. PMID:24403648
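A histogram matching classifier of the kind evaluated above can be sketched with a simple histogram-intersection similarity; the class signatures below are made up, and this is not the paper's exact classifier:

```python
# Sketch: classify an image object by matching its digital-number histogram
# against per-class reference histograms, using histogram intersection as the
# similarity. Class histograms below are invented 5-bin signatures.

def normalize(h):
    total = float(sum(h))
    return [v / total for v in h]

def hist_intersection(h1, h2):
    """Similarity in [0, 1] between two normalized histograms."""
    return sum(min(a, b) for a, b in zip(h1, h2))

def classify(obj_hist, class_hists):
    """Return the class whose reference histogram best matches the object's."""
    obj = normalize(obj_hist)
    return max(class_hists,
               key=lambda c: hist_intersection(obj, normalize(class_hists[c])))

classes = {"vegetation": [1, 5, 20, 10, 2], "roof": [15, 10, 3, 1, 1]}
print(classify([2, 6, 18, 9, 3], classes))   # matches the vegetation signature
```

Unlike a nearest-neighbor-to-mean rule, this uses the full shape of the histogram rather than collapsing each object to a single summary statistic.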
Probability- and curve-based fractal reconstruction on 2D DEM terrain profile
International Nuclear Information System (INIS)
Lai, F.-J.; Huang, Y.M.
2009-01-01
Data compression and reconstruction play important roles in information science and engineering. Among these, image compression and reconstruction, which mainly deal with reducing image data sets for storage or transmission and restoring them with the least loss, remain topics deserving a great deal of attention. In this paper we propose a new scheme, compared against the well-known Improved Douglas-Peucker (IDP) method, for extracting characteristic or feature points of a two-dimensional digital elevation model (2D DEM) terrain profile to compress the data set. For reconstruction using fractal interpolation, we propose a probability-based method that speeds up the fractal interpolation execution by a factor of three or even nine over the regular method. In addition, a curve-based method is proposed to determine the vertical scaling factor, which strongly affects the generation of the interpolated data points, and thereby significantly improve the reconstruction performance. Finally, an evaluation is made to show the advantage of employing the proposed method for extracting characteristic points in association with our novel fractal interpolation scheme.
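The feature-point extraction step can be illustrated with the plain (non-improved) Douglas-Peucker algorithm; this sketch omits the paper's IDP refinements and the fractal reconstruction stage, and the profile data are invented:

```python
# Sketch: plain Douglas-Peucker simplification of a 2D terrain profile,
# keeping points that deviate from the chord between the current endpoints
# by more than a tolerance eps. The profile below is an invented example.

def douglas_peucker(points, eps):
    """Return a simplified polyline over (x, y) points."""
    if len(points) < 3:
        return points[:]
    (x1, y1), (x2, y2) = points[0], points[-1]
    dx, dy = x2 - x1, y2 - y1
    norm = (dx * dx + dy * dy) ** 0.5 or 1.0
    # find the interior point farthest (perpendicular) from the chord
    dmax, idx = 0.0, 0
    for i in range(1, len(points) - 1):
        px, py = points[i]
        d = abs(dy * (px - x1) - dx * (py - y1)) / norm
        if d > dmax:
            dmax, idx = d, i
    if dmax <= eps:
        return [points[0], points[-1]]
    left = douglas_peucker(points[: idx + 1], eps)
    right = douglas_peucker(points[idx:], eps)
    return left[:-1] + right        # drop duplicated split point

profile = [(0, 0), (1, 0.1), (2, 3.0), (3, 0.1), (4, 0)]
print(douglas_peucker(profile, 1.0))   # keeps only the endpoints and the peak
```

The retained feature points are what a fractal interpolation scheme would then use to regenerate the dense profile.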
Proposal of fatigue crack growth rate curve in air for nickel-base alloys used in BWR
International Nuclear Information System (INIS)
Ogawa, Takuya; Itatani, Masao; Nagase, Hiroshi; Aoike, Satoru; Yoneda, Hideki
2013-01-01
When defects are detected in nuclear components in Japan, a structural integrity assessment should be performed for the technical judgment on continued service based on the Rules on Fitness-for-Service for Nuclear Power Plants of the Japan Society of Mechanical Engineers Code (JSME FFS Code). Fatigue crack growth analysis is required when cyclic loading is applied to the components. Recently, a fatigue crack growth rate curve in air environment for Nickel-base alloy weld metal used in BWR was proposed by the authors, and it was adopted as a code case of the JSME FFS Code to evaluate embedded flaws. In this study, fatigue crack growth behavior in air for the heat-affected zone (HAZ) of Nickel-base alloys was investigated, and a unified fatigue crack growth rate curve in air for HAZ and weld metal of Nickel-base alloys used in BWR was evaluated. As a result, it was found that the curve for weld metal can be applied as a curve for both HAZ and weld metal, since moderately conservative assessment of the fatigue crack growth rate of HAZ is possible with the weld metal curve in the Paris region. The threshold value of the stress intensity factor range (ΔK_th) is determined to be 3.0 MPa√m based on the fatigue crack growth rate of HAZ. (author)
Non-sky-averaged sensitivity curves for space-based gravitational-wave observatories
International Nuclear Information System (INIS)
Vallisneri, Michele; Galley, Chad R
2012-01-01
The signal-to-noise ratio (SNR) is used in gravitational-wave observations as the basic figure of merit for detection confidence and, together with the Fisher matrix, for the amount of physical information that can be extracted from a detected signal. SNRs are usually computed from a sensitivity curve, which describes the gravitational-wave amplitude needed by a monochromatic source of given frequency to achieve a threshold SNR. Although the term 'sensitivity' is used loosely to refer to the detector's noise spectral density, the two quantities are not the same: the sensitivity also includes the frequency- and orientation-dependent response of the detector to gravitational waves and takes into account the duration of observation. For interferometric space-based detectors similar to LISA, which are sensitive to long-lived signals and have constantly changing position and orientation, exact SNRs need to be computed on a source-by-source basis. For convenience, most authors prefer to work with sky-averaged sensitivities, accepting inaccurate SNRs for individual sources and giving up control over the statistical distribution of SNRs for source populations. In this paper, we describe a straightforward end-to-end recipe to compute the non-sky-averaged sensitivity of interferometric space-based detectors of any geometry. This recipe includes the effects of spacecraft motion and of seasonal variations in the partially subtracted confusion foreground from Galactic binaries, and it can be used to generate a sampling distribution of sensitivities for a given source population. In effect, we derive error bars for the sky-averaged sensitivity curve, which provide a stringent statistical interpretation for previously unqualified statements about sky-averaged SNRs. As a worked-out example, we consider isotropic and Galactic-disk populations of monochromatic sources, as observed with the 'classic LISA' configuration. We confirm that the (standard) inverse-rms average sensitivity
Directory of Open Access Journals (Sweden)
Noriyuki Oka
Full Text Available In the brain, the mechanisms of attention to the left and the right are known to be different. It is possible that brain activity when driving also differs with different horizontal road alignments (left or right curves), but little is known about this. We found driver brain activity to be different when driving on left and right curves, in an experiment using a large-scale driving simulator and functional near-infrared spectroscopy (fNIRS). The participants were fifteen healthy adults. We created a course simulating an expressway, comprising straight line driving and gentle left and right curves, and monitored the participants under driving conditions, in which they drove at a constant speed of 100 km/h, and under non-driving conditions, in which they simply watched the screen (visual task). Changes in hemoglobin concentrations were monitored at 48 channels including the prefrontal cortex, the premotor cortex, the primary motor cortex and the parietal cortex. From orthogonal vectors of changes in deoxyhemoglobin and changes in oxyhemoglobin, we calculated changes in cerebral oxygen exchange, reflecting neural activity, and statistically compared the resulting values from the right and left curve sections. Under driving conditions, there were no sites where cerebral oxygen exchange increased significantly more during right curves than during left curves (p > 0.05), but cerebral oxygen exchange increased significantly more during left curves (p < 0.05) in the right premotor cortex, the right frontal eye field and the bilateral prefrontal cortex. Under non-driving conditions, increases were significantly greater during left curves (p < 0.05) only in the right frontal eye field. Left curve driving was thus found to require more brain activity at multiple sites, suggesting that left curve driving may require more visual attention than right curve driving. The right frontal eye field was activated under both driving and non-driving conditions.
Oka, Noriyuki; Yoshino, Kayoko; Yamamoto, Kouji; Takahashi, Hideki; Li, Shuguang; Sugimachi, Toshiyuki; Nakano, Kimihiko; Suda, Yoshihiro; Kato, Toshinori
2015-01-01
Objectives In the brain, the mechanisms of attention to the left and the right are known to be different. It is possible that brain activity when driving also differs with different horizontal road alignments (left or right curves), but little is known about this. We found driver brain activity to be different when driving on left and right curves, in an experiment using a large-scale driving simulator and functional near-infrared spectroscopy (fNIRS). Research Design and Methods The participants were fifteen healthy adults. We created a course simulating an expressway, comprising straight line driving and gentle left and right curves, and monitored the participants under driving conditions, in which they drove at a constant speed of 100 km/h, and under non-driving conditions, in which they simply watched the screen (visual task). Changes in hemoglobin concentrations were monitored at 48 channels including the prefrontal cortex, the premotor cortex, the primary motor cortex and the parietal cortex. From orthogonal vectors of changes in deoxyhemoglobin and changes in oxyhemoglobin, we calculated changes in cerebral oxygen exchange, reflecting neural activity, and statistically compared the resulting values from the right and left curve sections. Results Under driving conditions, there were no sites where cerebral oxygen exchange increased significantly more during right curves than during left curves (p > 0.05), but cerebral oxygen exchange increased significantly more during left curves (p < 0.05) in the right premotor cortex, the right frontal eye field and the bilateral prefrontal cortex. Under non-driving conditions, increases were significantly greater during left curves (p < 0.05) only in the right frontal eye field. Conclusions Left curve driving was thus found to require more brain activity at multiple sites, suggesting that left curve driving may require more visual attention than right curve driving. The right frontal eye field was activated under both driving and non-driving conditions.
Multiaxial fatigue criterion based on parameters from torsion and axial S-N curve
Directory of Open Access Journals (Sweden)
M. Margetin
2016-07-01
Full Text Available Multiaxial high cycle fatigue is a topic that concerns nearly all industrial domains. In recent years, many recommendations on how to address problems with multiaxial fatigue life time estimation have been made, and huge progress in the field has been achieved. Until now, however, no universal criterion for multiaxial fatigue has been proposed. Addressing this situation, this paper offers a design of a new multiaxial criterion for high cycle fatigue. This criterion is based on a critical plane search. The damage parameter consists of a combination of normal and shear stresses on a critical plane (which is the plane with maximal shear stress amplitude). Material parameters used in the proposed criterion are obtained from torsion and axial S-N curves. The proposed criterion correctly calculates life time for the boundary loading conditions (pure torsion and pure axial loading). Application of the proposed model is demonstrated on biaxial loading, and the results are verified with a testing program using specimens made from S355 steel. Fatigue material parameters for the proposed criterion and multiple sets of data for different combinations of axial and torsional loading were obtained during the experiment.
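S-N curves of the kind used as inputs here are commonly written in the Basquin power-law form σ_a = A·N^b; a minimal sketch of inverting such a curve for life, with illustrative parameters rather than the S355 values from the paper:

```python
# Sketch: a Basquin-type S-N curve sigma_a = A * N**b for one loading
# channel (axial or torsion), inverted for the number of cycles to failure.
# The parameters A and b below are illustrative, not fitted S355 constants.

def cycles_to_failure(sigma_a, A, b):
    """Invert sigma_a = A * N**b for N (b is negative for a falling curve)."""
    return (sigma_a / A) ** (1.0 / b)

# hypothetical axial curve: A = 1000 MPa, b = -0.1
n = cycles_to_failure(250.0, 1000.0, -0.1)
print(round(n))   # 1048576 cycles
```

A critical-plane criterion would combine the normal and shear stress amplitudes on the maximal-shear plane into one damage parameter and compare it against curves of this form.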
Reliability assessment of a manual-based procedure towards learning curve modeling and fmea analysis
Directory of Open Access Journals (Sweden)
Gustavo Rech
2013-03-01
Full Text Available Separation procedures in drug Distribution Centers (DC) are manual-based activities prone to failures such as shipping exchanged, expired or broken drugs to the customer. Two interventions seem promising for improving the reliability of the separation procedure: (i) selection and allocation of appropriate operators to the procedure, and (ii) analysis of potential failure modes incurred by the selected operators. This article integrates Learning Curves (LC) and FMEA (Failure Mode and Effect Analysis) aimed at reducing the occurrence of failures in the manual separation of a drug DC. LC parameters enable generating an index to identify the operators recommended to perform the procedures. The FMEA is then applied to the separation procedure carried out by the selected operators in order to identify failure modes. It also deploys the traditional FMEA severity index into two sub-indexes, related to financial issues and to damage to the company's image, in order to characterize failure severity. When applied to a drug DC, the proposed method significantly reduced the frequency and severity of failures in the separation procedure.
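The LC parameters mentioned above are typically obtained by fitting a power-law learning curve, for example Wright's model t = a·x^b; a stdlib-only sketch fitting it by least squares in log-log space, on hypothetical separation times:

```python
# Sketch: fit Wright's learning-curve model t = a * x**b (time for the x-th
# repetition of a task) by ordinary least squares in log-log space.
# The timing data below are hypothetical, constructed with a 20% learning
# rate (each doubling of repetitions cuts the time to 80%).

import math

def fit_power_law(xs, ts):
    """Return (a, b) minimizing squared error of log t = log a + b * log x."""
    lx = [math.log(x) for x in xs]
    lt = [math.log(t) for t in ts]
    n = len(xs)
    mx, mt = sum(lx) / n, sum(lt) / n
    b = (sum((u - mx) * (v - mt) for u, v in zip(lx, lt))
         / sum((u - mx) ** 2 for u in lx))
    a = math.exp(mt - b * mx)
    return a, b

xs = [1, 2, 4, 8, 16]
ts = [100.0, 80.0, 64.0, 51.2, 40.96]
a, b = fit_power_law(xs, ts)
print(round(a, 2), round(b, 3))   # recovers a = 100, b = log2(0.8)
```

An operator-selection index can then be built from the fitted (a, b) pair: a small a and strongly negative b indicate an operator who starts fast and improves quickly.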
Interior Temperature Measurement Using Curved Mercury Capillary Sensor Based on X-ray Radiography
Chen, Shuyue; Jiang, Xing; Lu, Guirong
2017-07-01
A method is presented for measuring the interior temperature of objects using a curved mercury capillary sensor based on X-ray radiography. The sensor is composed of a mercury bubble, a capillary and a fixed support. X-ray digital radiography was employed to capture images of the mercury column in the capillary, and a temperature control system was designed for the sensor calibration. We adopted livewire algorithms and mathematical morphology to calculate the mercury length. A measurement model relating mercury length to temperature was established, and the measurement uncertainty associated with the mercury column length and the linear model fitted by the least-squares method were analyzed. To verify the system, the interior temperature of an autoclave, which is totally closed, was measured from 29.53°C to 67.34°C. The experimental results show that the response of the system is approximately linear, with a maximum uncertainty of 0.79°C. This technique provides a new approach to measuring the interior temperature of objects.
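The calibration step, fitting a linear length-temperature model by least squares and inverting it for measurement, can be sketched directly; the data points below are hypothetical, not the autoclave measurements:

```python
# Sketch: linear least-squares calibration of a mercury-column sensor,
# mapping measured column length (mm) to temperature (deg C), then using
# the fitted line to predict temperature. Calibration data are hypothetical.

def linfit(xs, ys):
    """Ordinary least squares for y = slope * x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

lengths = [10.0, 14.0, 18.0, 22.0]       # mm, from calibration images
temps = [30.0, 40.0, 50.0, 60.0]         # deg C, from the control system
k, c = linfit(lengths, temps)
print(k, c)                # temperature = k * length + c
print(k * 16.0 + c)        # predicted temperature for a 16 mm column: 45.0
```

In the actual system the column length would come from the livewire/morphology image processing rather than being measured directly.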
Wang, Fei; Gong, Haoran; Chen, Xi; Chen, C. Q.
2016-09-01
Origami structures enrich the field of mechanical metamaterials with the ability to convert morphologically and systematically between two-dimensional (2D) thin sheets and three-dimensional (3D) spatial structures. In this study, an in-plane design method is proposed to approximate curved surfaces of interest with generalized Miura-ori units. Using this method, two combination types of crease lines are unified in one reprogrammable procedure, generating multiple types of cylindrical structures. Structural completeness conditions of the finite-thickness counterparts to the two types are also proposed. As an example of the design method, the kinematics and elastic properties of an origami-based circular cylindrical shell are analysed. The concept of Poisson’s ratio is extended to the cylindrical structures, demonstrating their auxetic property. An analytical model of rigid plates linked by elastic hinges, consistent with numerical simulations, is employed to describe the mechanical response of the structures. Under particular load patterns, the circular shells display novel mechanical behaviour such as snap-through and limiting folding positions. By analysing the geometry and mechanics of the origami structures, we extend the design space of mechanical metamaterials and provide a basis for their practical applications in science and engineering.
HYSOGs250m, global gridded hydrologic soil groups for curve-number-based runoff modeling.
Ross, C Wade; Prihodko, Lara; Anchang, Julius; Kumar, Sanath; Ji, Wenjie; Hanan, Niall P
2018-05-15
Hydrologic soil groups (HSGs) are a fundamental component of the USDA curve-number (CN) method for estimation of rainfall runoff; yet these data are not readily available in a format or spatial resolution suitable for regional- and global-scale modeling applications. We developed a globally consistent, gridded dataset defining HSGs from soil texture, bedrock depth, and groundwater. The resulting data product, HYSOGs250m, represents runoff potential at 250 m spatial resolution. Our analysis indicates that the global distribution of soil is dominated by moderately high runoff potential, followed by moderately low, high, and low runoff potential. Low runoff potential, sandy soils are found primarily in parts of the Sahara and Arabian Deserts. High runoff potential soils occur predominantly within tropical and sub-tropical regions. No clear pattern could be discerned for moderately low runoff potential soils, as they occur in arid and humid environments and at both high and low elevations. Potential applications of these data include CN-based runoff modeling, flood risk assessment, and use as a covariate for biogeographical analysis of vegetation distributions.
Energy Technology Data Exchange (ETDEWEB)
Li, Ben; He, Feng; Ouyang, Jiting, E-mail: jtouyang@bit.edu.cn [School of Physics, Beijing Institute of Technology, Beijing 100081 (China); Duan, Xiaoxi [Research Center of Laser Fusion, CAEP, Mianyang 621900 (China)
2015-12-15
Simulation work is very important for understanding the formation of self-organized discharge patterns. Previous works have used various models adapted from other systems to simulate discharge patterns, but most of these models are complicated and time-consuming. In this paper, we introduce a convenient phenomenological dynamic model based on the basic dynamic process of glow discharge and the voltage transfer curve (VTC) to study the dielectric barrier glow discharge (DBGD) pattern. The VTC is an important characteristic of DBGD, which plots the change of wall voltage after a discharge as a function of the initial total gap voltage. In the modeling, the combined effect of the discharge conditions is included in the VTC, and the activation-inhibition effect is expressed by a spatial interaction term. Besides, the model reduces the dimensionality of the system by considering only the integrated effect of the current flow. All of this greatly facilitates the construction of the model. Numerical simulations turn out to be in good accordance with our previous fluid modeling and experimental results.
Elliptic Curve Cryptography-Based Authentication with Identity Protection for Smart Grids.
Directory of Open Access Journals (Sweden)
Liping Zhang
Full Text Available In a smart grid, the power service provider enables the expected power generation amount to be measured according to current power consumption, thus stabilizing the power system. However, the data transmitted over smart grids are not protected, and then suffer from several types of security threats and attacks. Thus, a robust and efficient authentication protocol should be provided to strengthen the security of smart grid networks. As the Supervisory Control and Data Acquisition system provides the security protection between the control center and substations in most smart grid environments, we focus on how to secure the communications between the substations and smart appliances. Existing security approaches fail to address the performance-security balance. In this study, we suggest a mitigation authentication protocol based on Elliptic Curve Cryptography with privacy protection by using a tamper-resistant device at the smart appliance side to achieve a delicate balance between performance and security of smart grids. The proposed protocol provides some attractive features such as identity protection, mutual authentication and key agreement. Finally, we demonstrate the completeness of the proposed protocol using the Gong-Needham-Yahalom logic.
Elliptic Curve Cryptography-Based Authentication with Identity Protection for Smart Grids.
Zhang, Liping; Tang, Shanyu; Luo, He
2016-01-01
In a smart grid, the power service provider enables the expected power generation amount to be measured according to current power consumption, thus stabilizing the power system. However, the data transmitted over smart grids are not protected, and then suffer from several types of security threats and attacks. Thus, a robust and efficient authentication protocol should be provided to strengthen the security of smart grid networks. As the Supervisory Control and Data Acquisition system provides the security protection between the control center and substations in most smart grid environments, we focus on how to secure the communications between the substations and smart appliances. Existing security approaches fail to address the performance-security balance. In this study, we suggest a mitigation authentication protocol based on Elliptic Curve Cryptography with privacy protection by using a tamper-resistant device at the smart appliance side to achieve a delicate balance between performance and security of smart grids. The proposed protocol provides some attractive features such as identity protection, mutual authentication and key agreement. Finally, we demonstrate the completeness of the proposed protocol using the Gong-Needham-Yahalom logic.
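The elliptic-curve arithmetic underlying such protocols can be sketched on a toy curve: the snippet below does point addition and scalar multiplication on y² = x³ + 2x + 3 over GF(97), then a Diffie-Hellman-style key agreement. The tiny curve and keys are purely illustrative; a real deployment uses standardized curves and adds the authentication and identity-protection layers described above:

```python
# Toy sketch of the group law behind ECC: point addition and scalar
# multiplication on y^2 = x^3 + 2x + 3 over GF(97), followed by a
# Diffie-Hellman-style shared-secret computation. Illustrative only.

P, A = 97, 2          # field prime and curve coefficient a
O = None              # point at infinity (group identity)

def add(p1, p2):
    if p1 is O: return p2
    if p2 is O: return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return O                                   # p2 == -p1
    if p1 == p2:                                   # tangent (doubling) slope
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P
    else:                                          # chord slope
        lam = (y2 - y1) * pow(x2 - x1, -1, P) % P
    x3 = (lam * lam - x1 - x2) % P
    return (x3, (lam * (x1 - x3) - y1) % P)

def mul(k, pt):
    """Double-and-add scalar multiplication k * pt."""
    r = O
    while k:
        if k & 1:
            r = add(r, pt)
        pt = add(pt, pt)
        k >>= 1
    return r

G = (3, 6)                      # a point on the curve (6^2 = 27 + 6 + 3 mod 97)
a_priv, b_priv = 2, 3           # toy private keys
shared_a = mul(a_priv, mul(b_priv, G))
shared_b = mul(b_priv, mul(a_priv, G))
print(shared_a, shared_a == shared_b)
```

Both parties arrive at the same shared point because scalar multiplication commutes: a·(b·G) = b·(a·G). The performance appeal of ECC for constrained smart appliances is that this security comes with far smaller keys than RSA-style systems.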
A novel and compact spectral imaging system based on two curved prisms
Nie, Yunfeng; Bin, Xiangli; Zhou, Jinsong; Li, Yang
2013-09-01
As a novel detection approach which simultaneously acquires a two-dimensional visual picture and one-dimensional spectral information, spectral imaging offers promising applications in biomedical imaging, conservation and identification of artworks, surveillance of food safety, and so forth. A novel moderate-resolution spectral imaging system consisting of merely two optical elements is illustrated in this paper. It can realize the function of a relay imaging system as well as a spectroscopy with 10 nm spectral resolution. Compared to conventional prismatic imaging spectrometers, this design is compact and concise, using only two special curved prisms with two reflective surfaces. In contrast to spectral imagers based on diffractive gratings, the use of a compound prism offers higher energy utilization and a wider free spectral range. The Seidel aberration theory and the dispersive principle of this special prism are analyzed first. According to the results, the optical system of this design is simulated, and a performance evaluation including spot diagram, MTF and distortion is presented. Finally, considering the difficulty and particularity of manufacture and alignment, an available method for fabrication and measurement is proposed.
International Nuclear Information System (INIS)
Ramazani, A.; Mukherjee, K.; Quade, H.; Prahl, U.; Bleck, W.
2013-01-01
A microstructure-based approach by means of representative volume elements (RVEs) is employed to evaluate the flow curve of DP steels using virtual tensile tests. Microstructures with different martensite fractions and morphologies are studied in two- and three-dimensional approaches. Micro sections of DP microstructures with various amounts of martensite have been converted to 2D RVEs, while 3D RVEs were constructed statistically with randomly distributed phases. A dislocation-based model is used to describe the flow curve of each ferrite and martensite phase separately as a function of carbon partitioning and microstructural features. Numerical tensile tests of RVE were carried out using the ABAQUS/Standard code to predict the flow behaviour of DP steels. It is observed that 2D plane strain modelling gives an underpredicted flow curve for DP steels, while the 3D modelling gives a quantitatively reasonable description of flow curve in comparison to the experimental data. In this work, a von Mises stress correlation factor σ_3D/σ_2D has been identified to compare the predicted flow curves of these two dimensionalities, showing a third order polynomial relation with respect to martensite fraction and a second order polynomial relation with respect to equivalent plastic strain, respectively. The quantification of this polynomial correlation factor is performed based on laboratory-annealed DP600 chemistry with varying martensite content and it is validated for industrially produced DP qualities with various chemistry, strength level and martensite fraction.
Model-based methodology to develop the isochronous stress-strain curves for modified 9Cr steels
International Nuclear Information System (INIS)
Kim, Woo Gon; Yin, Song Nan; Kim, Sung Ho; Lee, Chan Bock; Jung, Ik Hee
2008-01-01
Since high temperature materials are designed with a target life based on a specified amount of allowable strain and stress, their Isochronous Stress-Strain Curves (ISSC) are needed to avoid an excessive deformation during an intended service life. In this paper, a model-based methodology to develop the isochronous curves for a G91 steel is described. Creep strain-time curves were reviewed for typical high-temperature materials, and Garofalo's model, which conforms well to the primary and secondary creep stages, was found appropriate for the G91 steel. Procedures to obtain the instantaneous elastic-plastic strain ε_i are given in detail. Also, to accurately determine the P_1, P_2 and P_3 parameters in Garofalo's model, a Nonlinear Least Square Fitting (NLSF) method was adopted and found useful. The long-term creep curves for the G91 steel can be modeled by Garofalo's model, and the long-term ISSCs can be developed using the modeled creep curves.
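A Garofalo-type creep curve combines an instantaneous strain with a primary (saturating exponential) term and a secondary (linear) term; a sketch of evaluating it at fixed times, as one would when reading off isochronous points (the functional form follows the common Garofalo expression, and the parameter values are illustrative, not fitted G91 constants):

```python
# Sketch of a Garofalo-type creep curve,
#   eps(t) = eps_i + P1 * (1 - exp(-P2 * t)) + P3 * t,
# where eps_i is the instantaneous elastic-plastic strain, the P1/P2 term is
# primary creep, and P3 is the secondary (steady-state) rate. An isochronous
# point is the strain read off at a fixed service time for one stress level.
# All parameter values below are illustrative.

import math

def creep_strain(t_h, eps_i, p1, p2, p3):
    """Total strain after t_h hours."""
    return eps_i + p1 * (1.0 - math.exp(-p2 * t_h)) + p3 * t_h

# strain at several service times for one (hypothetical) stress level
for t in (10.0, 100.0, 1000.0):
    print(t, round(creep_strain(t, 0.001, 0.004, 0.05, 1e-6), 6))
```

Repeating this over a grid of stresses, each with its own fitted (ε_i, P_1, P_2, P_3), and collecting the strains at one fixed time yields a single isochronous stress-strain curve.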
Supergravity on an Atiyah-Hitchin base
International Nuclear Information System (INIS)
Stotyn, Sean; Mann, R.B.
2008-01-01
We construct solutions to five dimensional minimal supergravity using an Atiyah-Hitchin base space. In examining the structure of solutions we show that they generically contain a singularity either on the Atiyah-Hitchin bolt or at larger radius where there is a singular solitonic boundary. However for most points in parameter space the solution exhibits a velocity of light surface (analogous to what appears in a Goedel space-time) that shields the singularity. For these solutions, all closed time-like curves are causally disconnected from the rest of the space-time in that they exist within the velocity of light surface, which null geodesics are unable to cross. The singularities in these solutions are thus found to be hidden behind the velocity of light surface and so are not naked despite the lack of an event horizon. Outside of this surface the space-time is geodesically complete, asymptotically flat and can be arranged so as not to contain closed time-like curves at infinity. The rest of parameter space simply yields solutions with naked singularities.
Memristance controlling approach based on modification of linear M—q curve
International Nuclear Information System (INIS)
Liu Hai-Jun; Li Zhi-Wei; Yu Hong-Qi; Sun Zhao-Lin; Nie Hong-Shan
2014-01-01
The memristor has broad application prospects in many fields, but many of those fields require accurate memristance control. The nonlinear model is of great importance for realizing accurate memristance control, but the implementation complexity caused by iteration has limited the practical application of this model. Considering the approximately linear characteristics of the middle region of the memristance-charge (M—q) curve of the nonlinear model, this paper proposes a memristance controlling approach, which is achieved by linearizing the middle region of the M—q curve of the nonlinear memristor and establishing a linear relationship between the memristance M and the input excitation, so that the memristance can be controlled precisely simply by adjusting the input signal. First, the feasibility of linearizing the middle part of the M—q curve of a memristor with a nonlinear model is analyzed from a qualitative perspective. Then, the linearization equations for the middle region of the M—q curve are constructed using the shift method, and for a sinusoidal excitation the analytical relation between the memristance M and time t is derived through Taylor series expansions. Finally, the performance of the proposed approach is demonstrated, including the linearizing capability for the middle part of the M—q curve of the nonlinear model memristor, the controlling ability for the memristance M, and the influence of the input excitation on linearization errors. (interdisciplinary physics and related areas of science and technology)
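The core idea, that once the middle region is linearized, setting M reduces to inverting a linear equation for the input charge, can be sketched as follows (M0, k and the linear-region bounds are illustrative values, not taken from the paper):

```python
# Hypothetical linearized M-q relation in the middle region:
# M(q) = M0 - k * q   (M0 in ohms, k in ohms per coulomb; both invented).
M0, k = 16000.0, 1.0e7
Q_LIN = (2e-4, 1.2e-3)   # assumed charge window where linearity holds

def memristance(q):
    """Memristance from accumulated charge, valid only in the
    (approximately) linear middle region of the M-q curve."""
    if not (Q_LIN[0] <= q <= Q_LIN[1]):
        raise ValueError("outside the linear M-q region")
    return M0 - k * q

def charge_for_target(M_target):
    """Invert the linear law: charge the input must deliver to set M."""
    return (M0 - M_target) / k

q = charge_for_target(10000.0)
print(round(q, 6), round(memristance(q), 6))  # → 0.0006 10000.0
```

With the full nonlinear model this inversion would require iteration, which is exactly the complexity the paper's linearization avoids.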
Directory of Open Access Journals (Sweden)
M. Sivapalan
2012-11-01
Full Text Available Predictions of hydrological responses in ungauged catchments can benefit from a classification scheme that can organize and pool together catchments that exhibit a level of hydrologic similarity, especially similarity in some key variable or signature of interest. Since catchments are complex systems with a level of self-organization arising from co-evolution of climate and landscape properties, including vegetation, there is much to be gained from developing a classification system based on a comparative study of a population of catchments across climatic and landscape gradients. The focus of this paper is on climate seasonality and seasonal runoff regime, as characterized by the ensemble mean of within-year variation of climate and runoff. The work on regime behavior is part of an overall study of the physical controls on regional patterns of flow duration curves (FDCs), motivated by the fact that regime behavior leaves a major imprint upon the shape of FDCs, especially the slope of the FDCs. As an exercise in comparative hydrology, the paper seeks to assess the regime behavior of 428 catchments from the MOPEX database simultaneously, classifying and regionalizing them into homogeneous or hydrologically similar groups. A decision tree is developed on the basis of a metric chosen to characterize similarity of regime behavior, using a variant of the Iterative Dichotomiser 3 (ID3) algorithm to form a classification tree and associated catchment classes. In this way, several classes of catchments are distinguished, in which the connection between the catchments' regime behavior and climate and catchment properties becomes clearer. Only four similarity indices are entered into the algorithm, all of which are obtained from smoothed daily regime curves of climatic variables and runoff. Results demonstrate that climate seasonality plays the most significant role in the classification of US catchments, with rainfall timing and climatic aridity index
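As a side note on the signature involved, an empirical flow duration curve can be computed from daily runoff in a few lines (toy data; the Weibull plotting position used here is one common convention, not necessarily the paper's):

```python
def flow_duration_curve(daily_q):
    """Return (exceedance probability, discharge) pairs: discharge sorted
    descending, paired with the Weibull plotting position i/(n+1)."""
    q = sorted(daily_q, reverse=True)
    n = len(q)
    return [((i + 1) / (n + 1), qi) for i, qi in enumerate(q)]

flows = [5, 1, 3, 9, 7, 2, 4, 8, 6, 10, 12, 0.5]  # toy daily runoff
fdc = flow_duration_curve(flows)
# The slope of the FDC between, say, the 33% and 66% exceedance points
# is a common regime signature (steeper slope = flashier catchment).
print(fdc[0], fdc[-1])  # highest and lowest flows with their probabilities
```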
Quantitative analysis by laser-induced breakdown spectroscopy based on generalized curves of growth
Energy Technology Data Exchange (ETDEWEB)
Aragón, C., E-mail: carlos.aragon@unavarra.es; Aguilera, J.A.
2015-08-01
A method for quantitative elemental analysis by laser-induced breakdown spectroscopy (LIBS) is proposed. The method (Cσ-LIBS) is based on Cσ graphs, generalized curves of growth which allow including several lines of various elements at different concentrations. A so-called homogeneous double (HD) model of the laser-induced plasma is used, defined by an integration over a single region of the radiative transfer equation, combined with a separate treatment for neutral atoms (z = 0) and singly-charged ions (z = 1) in Cσ graphs and characteristic parameters. The procedure includes a criterion, based on a model limit, for eliminating data which, due to a high line intensity or concentration, are not well described by the HD model. An initial procedure provides a set of parameters (βA)^z, (ηNl)^z, T^z and N_e^z (z = 0, 1) which characterize the plasma and the LIBS system. After characterization, two different analytical procedures, resulting in relative and absolute concentrations, may be applied. To test the method, fused glass samples prepared from certified slags and pure compounds are analyzed. We determine concentrations of Ca, Mn, Mg, V, Ti, Si and Al relative to Fe in three samples prepared from slags, and absolute concentrations of Fe, Ca and Mn in three samples prepared from Fe₂O₃, CaCO₃ and Mn₂O₃. The accuracy obtained is 3.2% on average for relative concentrations and 9.2% for absolute concentrations. - Highlights: • Method for quantitative analysis by LIBS, based on Cσ graphs • Conventional calibration is replaced with characterization of the LIBS system. • All elements are determined from measurement of one or two Cσ graphs. • The method is tested with fused glass disks prepared from slags and pure compounds. • Accurate results for relative (3.2%) and absolute (9.2%) concentrations
International Nuclear Information System (INIS)
Paegert, Martin; Stassun, Keivan G.; Burger, Dan M.
2014-01-01
We describe a new neural-net-based light curve classifier and provide it with documentation as a ready-to-use tool for the community. While optimized for identification and classification of eclipsing binary stars, the classifier is general purpose, and has been developed for speed in the context of upcoming massive surveys such as the Large Synoptic Survey Telescope. A challenge for classifiers in the context of neural-net training and massive data sets is to minimize the number of parameters required to describe each light curve. We show that a simple and fast geometric representation that encodes the overall light curve shape, together with a chi-square parameter to capture higher-order morphology information, results in efficient yet robust light curve classification, especially for eclipsing binaries. Testing the classifier on the ASAS light curve database, we achieve a retrieval rate of 98% and a false-positive rate of 2% for eclipsing binaries. We achieve similarly high retrieval rates for most other periodic variable-star classes, including RR Lyrae, Mira, and delta Scuti. However, the classifier currently has difficulty discriminating between different sub-classes of eclipsing binaries, and suffers a relatively low (∼60%) retrieval rate for multi-mode delta Cepheid stars. We find that it is imperative to train the classifier's neural network with exemplars that cover the full range of light curve quality on which the classifier will be expected to perform; the classifier performs well on noisy light curves only when trained with noisy exemplars. The classifier source code, ancillary programs, a trained neural net, and a guide for use are provided.
International Nuclear Information System (INIS)
Zhao Yongxiang
1999-01-01
A probabilistic evaluation approach for the design S-N curve and a reliability assessment of the ASME code-based evaluation are presented on the basis of Langer S-N model-based P-S-N curves. The P-S-N curves are estimated by a so-called general maximum likelihood method. This method can be applied to virtual stress amplitude-crack initiation life data, which have the characteristic of double random variables. Investigation of a set of virtual stress amplitude-crack initiation life (S-N) data for a 1Cr18Ni9Ti austenitic stainless steel welded joint reveals that the P-S-N curves give a good prediction of the scatter regularity of the S-N data. The probabilistic evaluation of the design S-N curve with 0.9999 survival probability considers various uncertainties, besides the scatter of the S-N data, to an appropriate extent. The ASME code-based evaluation with a reduction factor of 20 on the mean life is much more conservative than that with a reduction factor of 2 on the stress amplitude. Evaluation of the latter at a virtual stress amplitude of 666.61 MPa is equivalent to a survival probability of 0.999522, and at 2092.18 MPa to a survival probability of 0.9999999995. This means that the evaluation at low loading levels may be non-conservative and, in contrast, too conservative at high loading levels. The cause is that the reduction factors are constants and cannot account for the general observation that the scatter of the life data increases as the loading level decreases. This indicates that it is necessary to apply the probabilistic approach to the evaluation of the design S-N curve.
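The two ASME-style reduction schemes contrasted above can be illustrated with a generic power-law S-N relation (the constant and exponent below are illustrative, not the Langer fit from the paper). With a life exponent of 4, a factor of 2 on stress corresponds to a factor of 2^4 = 16 on life, which is why it is the less conservative scheme in this sketch:

```python
# Generic power-law S-N relation N = C * S**(-m); C and m are
# illustrative values, not fitted parameters from the paper.
C, m = 1.0e15, 4.0

def mean_life(S):
    """Mean cycles to crack initiation at stress amplitude S (MPa)."""
    return C * S ** (-m)

def design_life_on_life(S, factor=20.0):
    # ASME-style design value: divide the mean life by 20.
    return mean_life(S) / factor

def design_life_on_stress(S, factor=2.0):
    # Alternative: evaluate the mean curve at twice the stress amplitude.
    return mean_life(S * factor)

S = 500.0  # MPa
print(round(design_life_on_life(S), 3), round(design_life_on_stress(S), 3))
# → 800.0 1000.0
```

Both factors are constants, so neither tracks the loading-level dependence of scatter that the abstract identifies as the underlying problem.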
Monitoring and Fault Detection in Photovoltaic Systems Based On Inverter Measured String I-V Curves
DEFF Research Database (Denmark)
Spataru, Sergiu; Sera, Dezso; Kerekes, Tamas
2015-01-01
Most photovoltaic (PV) string inverters have the hardware capability to measure at least part of the current-voltage (I-V) characteristic curve of the PV strings connected at the input. However, this intrinsic capability of the inverters is not used, since I-V curve measurement and monitoring functions are not implemented in the inverter control software. In this paper, we aim to show how such a functionality can be useful for PV system monitoring purposes, to detect the presence and cause of power loss in the PV strings, be it due to shading, degradation of the PV modules or balance-of-system components through increased series resistance losses, or shunting of the PV modules. To achieve this, we propose and experimentally demonstrate three complementary PV system monitoring methods that make use of the I-V curve measurement capability of a commercial string inverter. The first method is suitable...
Tauler, R.; Smilde, A. K.; HENSHAW, J. M.; BURGESS, L. W.; KOWALSKI, B. R.
1994-01-01
A new multivariate curve resolution method that can extract analytical information from UV/visible spectroscopic data collected from a reaction-based chemical sensor is proposed. The method is demonstrated with the determination of mixtures of chlorinated hydrocarbons by estimating the kinetic and
International Nuclear Information System (INIS)
Kuivalainen, Kalle; Peiponen, Kai-Erik; Myller, Kari
2009-01-01
An optical measurement device, which is a diffractive element-based sensor, is presented for the detection of latent fingerprints on curved objects such as a ballpoint pen. The device provides image and gloss information on the ridges of a fingerprint. The device is expected to have applications in forensic studies. (technical design note)
Hsu, Pi-Shan
2012-01-01
This study aims to develop the core mechanism for realizing the development of personalized adaptive e-learning platform, which is based on the previous learning effort curve research and takes into account the learner characteristics of learning style and self-efficacy. 125 university students from Taiwan are classified into 16 groups according…
Directory of Open Access Journals (Sweden)
Xiaoyun Huang
2015-09-01
Full Text Available To improve the real-time performance and detection rate of a Lane Detection and Reconstruction (LDR) system, an extended-search-based lane detection method and a Bézier curve-based lane reconstruction algorithm are proposed in this paper. The extended-search-based lane detection method is designed to search boundary blocks from the initial position, in an upwards direction and along the lane, with small search areas including continuous search, discontinuous search and bending search in order to detect different lane boundaries. The Bézier curve-based lane reconstruction algorithm is employed to describe a wide range of lane boundary forms with comparatively simple expressions. In addition, two Bézier curves are adopted to reconstruct the lanes' outer boundaries with large curvature variation. The lane detection and reconstruction algorithm — including determining the initial blocks, extended search, binarization processing and lane boundary fitting in different scenarios — is verified in road tests. The results show that this algorithm is robust against different shadows and illumination variations; the average processing time per frame is 13 ms. Significantly, it achieves a high detection rate of 88.6% on curved lanes with large or variable curvatures, where the accident rate is higher than on straight lanes.
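Reconstruction with Bézier curves rests on evaluating the curve from its control points; a minimal de Casteljau sketch (the control points are toy values, not lane data from the paper):

```python
def de_casteljau(points, t):
    """Evaluate a Bezier curve at parameter t in [0, 1] from its control
    points by repeated linear interpolation (de Casteljau's algorithm)."""
    pts = list(points)
    while len(pts) > 1:
        pts = [((1 - t) * x0 + t * x1, (1 - t) * y0 + t * y1)
               for (x0, y0), (x1, y1) in zip(pts, pts[1:])]
    return pts[0]

# A cubic Bezier spanning a gently curving boundary (toy points).
ctrl = [(0.0, 0.0), (1.0, 2.0), (3.0, 2.0), (4.0, 0.0)]
print(de_casteljau(ctrl, 0.0), de_casteljau(ctrl, 1.0))  # endpoints
print(de_casteljau(ctrl, 0.5))  # → (2.0, 1.5)
```

Joining two such curves end to end, as the paper does for boundaries with large curvature variation, only requires sharing the junction control point.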
Directory of Open Access Journals (Sweden)
Alavalapati Goutham Reddy
Full Text Available Biometric-based authentication protocols for multi-server architectures have gained momentum in recent times due to advancements in wireless technologies and associated constraints. Lu et al. recently proposed a robust biometric-based authentication with key agreement protocol for a multi-server environment using smart cards. They claimed that their protocol is efficient and resistant to prominent security attacks. The careful investigation of this paper proves that Lu et al.'s protocol does not provide user anonymity or perfect forward secrecy and is susceptible to server and user impersonation attacks, man-in-the-middle attacks and clock synchronization problems. In addition, this paper proposes an enhanced biometric-based authentication with key-agreement protocol for multi-server architectures based on elliptic curve cryptography using smartcards. We prove that the proposed protocol achieves mutual authentication using Burrows-Abadi-Needham (BAN) logic. The formal security of the proposed protocol is verified using the AVISPA (Automated Validation of Internet Security Protocols and Applications) tool to show that our protocol can withstand active and passive attacks. The formal and informal security analyses and performance analysis demonstrate that the proposed protocol is robust and efficient compared to Lu et al.'s protocol and existing similar protocols.
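The elliptic-curve machinery underlying such protocols reduces to point addition and scalar multiplication; a toy sketch over a small prime field (illustration only: real deployments use standardized curves such as P-256, and the curve, generator and scalars below are made up):

```python
# Toy curve y^2 = x^3 + 2x + 3 over GF(97); parameters are illustrative.
P, A = 97, 2

def ec_add(p1, p2):
    """Add two points on the curve; None represents the point at infinity."""
    if p1 is None:
        return p2
    if p2 is None:
        return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None  # inverse points sum to infinity
    if p1 == p2:
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P  # tangent slope
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, P) % P          # chord slope
    x3 = (lam * lam - x1 - x2) % P
    return (x3, (lam * (x1 - x3) - y1) % P)

def ec_mul(k, pt):
    """Double-and-add scalar multiplication k * pt."""
    acc = None
    while k:
        if k & 1:
            acc = ec_add(acc, pt)
        pt, k = ec_add(pt, pt), k >> 1
    return acc

G = (3, 6)  # on the curve: 6^2 = 36 = 27 + 6 + 3 mod 97
# ECDH-style exchange: both sides derive the same shared point.
a, b = 13, 29
print(ec_mul(a, ec_mul(b, G)) == ec_mul(b, ec_mul(a, G)))  # → True
```

The commutativity shown on the last line is exactly the property that lets two parties agree on a shared secret from exchanged public points.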
Optimization of ISOL targets based on Monte-Carlo simulations of ion release curves
International Nuclear Information System (INIS)
Mustapha, B.; Nolen, J.A.
2003-01-01
A detailed model for simulating release curves from ISOL targets has been developed. The full 3D geometry is implemented using Geant-4. Produced particles are followed individually from production to release. The delay time is computed event by event. All processes involved: diffusion, effusion and decay are included to obtain the overall release curve. By fitting to the experimental data, important parameters of the release process (diffusion coefficient, sticking time, ...) are extracted. They can be used to improve the efficiency of existing targets and design new ones more suitable to produce beams of rare isotopes
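The event-by-event delay model described here can be caricatured in a few lines (all time constants, the bounce count and the half-life are invented for illustration; the real simulation tracks particles through the full 3D Geant4 geometry):

```python
import math
import random

random.seed(1)  # reproducible toy run

# Illustrative per-particle delay model: a diffusion delay, plus a number
# of wall bounces each adding a sticking time, with radioactive decay
# deciding whether the particle survives to be released.
TAU_DIFF, N_BOUNCE, TAU_STICK, HALF_LIFE = 2.0, 50, 0.01, 5.0  # seconds
lam = math.log(2) / HALF_LIFE  # decay constant

def one_particle():
    delay = random.expovariate(1.0 / TAU_DIFF)             # diffusion
    delay += sum(random.expovariate(1.0 / TAU_STICK)
                 for _ in range(N_BOUNCE))                  # effusion
    survived = random.random() < math.exp(-lam * delay)     # decay
    return delay, survived

trials = [one_particle() for _ in range(10000)]
efficiency = sum(s for _, s in trials) / len(trials)
print(0.0 < efficiency < 1.0)  # release efficiency is a fraction
```

Fitting the histogram of released-particle delays to measured release curves is what extracts the diffusion coefficient and sticking time in the paper's procedure.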
Multi-binding site model-based curve-fitting program for the computation of RIA data
International Nuclear Information System (INIS)
Malan, P.G.; Ekins, R.P.; Cox, M.G.; Long, E.M.R.
1977-01-01
In this paper, a comparison is made of model-based and empirical curve-fitting procedures. The implementation of a multiple binding-site curve-fitting model, which successfully fits a wide range of assay data and can be run on a minicomputer, is described. This sophisticated model also provides estimates of binding-site concentrations and the values of the respective equilibrium constants; the latter have been used for refining assay conditions using computer optimisation techniques.
Lu, Jun; Xiao, Jun; Gao, Dong Jun; Zong, Shu Yu; Li, Zhu
2018-03-01
In the production of Association of American Railroads (AAR) locomotive wheel-sets, the press-fit curve is the most important basis for assessing the reliability of wheel-set assembly. In the past, most production enterprises relied mainly on manual inspection to judge assembly quality, which led to cases of misjudgment. For this reason, research on the standard is carried out, and the automatic judgment of the press-fit curve is analysed and designed, so as to provide guidance for locomotive wheel-set production based on the AAR standard.
Paleosecular Type Curves for South America Based on Holocene-Pleistocene Lake Sediments Studies
Gogorza, C. S.
2007-05-01
Most high-resolution paleomagnetic secular variation (PSV) results have been obtained from records of sediments from the Northern Hemisphere; experimental results from South America are scarce. The first results were obtained by Creer et al. (1983) and have since been continued by the author and collaborators. This review deals with studies of PSV records from bottom sediments of three lakes: Escondido, Moreno and El Trébol (south-western Argentina, 41° S, 71° 30' W). Measurements of directions (declination D and inclination I) and intensity of natural remanent magnetization (NRM), magnetic susceptibility at low and high frequency (specific, X, and volumetric, k), isothermal remanent magnetization (IRM), saturation isothermal remanent magnetization (SIRM), and back field were carried out. The stability of the NRM was investigated by alternating-field demagnetization. Rock magnetic studies suggest that the main carriers of magnetization are ferrimagnetic minerals, predominantly pseudo-single-domain magnetite. The correlation between cores was based on magnetic parameters such as X and NRM. The tephra layers were identified from the lithologic profiles and also from the magnetic susceptibility logs. Due to their different chronological meaning and their rather poor behavior as magnetic recorders, these layers were removed from the sequence and the gaps produced along the profiles by the removal were closed, obtaining a "shortened depth". Radiocarbon age estimates from these cores and from earlier studies allow us to construct paleosecular variation records for the past 22,000 years. Inclination and declination curves (Gogorza et al., 2000a; Gogorza et al., 2002; Irurzun et al., 2006) show trends that are similar to a paleomagnetic secular variation curve for the SW of Argentina (Gogorza et al., 2000b). References: Creer, K.M., Tucholka, P. and Barton, C.E. 1983. Paleomagnetism of lake sediments, in Geomagnetism of Baked Clays and Recent Sediments, edited
Yang, Zhichun; Zhou, Jian; Gu, Yingsong
2014-10-01
A flow field modified local piston theory, applied to the integrated analysis of static/dynamic aeroelastic behaviors of curved panels, is proposed in this paper. The local flow field parameters used in the modification are obtained by a CFD technique, which has the advantage of simulating the steady flow field accurately. This flow field modified local piston theory for aerodynamic loading is applied to the analysis of static aeroelastic deformation and flutter stability of curved panels in hypersonic flow. In addition, comparisons are made between results obtained using the present method and the curvature modified method. They show that when the curvature of the panel is relatively small, the static aeroelastic deformations and flutter stability boundaries obtained by the two methods differ little, while for curved panels with larger curvatures, the static aeroelastic deformation obtained by the present method is larger and the flutter stability boundary is smaller than those obtained by the curvature modified method, and the discrepancy increases with increasing panel curvature. Therefore, the existing curvature modified method is non-conservative compared to the proposed flow field modified method from the standpoint of hypersonic flight vehicle safety, and the proposed flow field modified local piston theory for curved panels enlarges the application range of piston theory.
Image Features Based on Characteristic Curves and Local Binary Patterns for Automated HER2 Scoring
Directory of Open Access Journals (Sweden)
Ramakrishnan Mukundan
2018-02-01
Full Text Available This paper presents novel feature descriptors and classification algorithms for the automated scoring of HER2 in Whole Slide Images (WSI of breast cancer histology slides. Since a large amount of processing is involved in analyzing WSI images, the primary design goal has been to keep the computational complexity to the minimum possible level and to use simple, yet robust feature descriptors that can provide accurate classification of the slides. We propose two types of feature descriptors that encode important information about staining patterns and the percentage of staining present in ImmunoHistoChemistry (IHC-stained slides. The first descriptor is called a characteristic curve, which is a smooth non-increasing curve that represents the variation of percentage of staining with saturation levels. The second new descriptor introduced in this paper is a local binary pattern (LBP feature curve, which is also a non-increasing smooth curve that represents the local texture of the staining patterns. Both descriptors show excellent interclass variance and intraclass correlation and are suitable for the design of automatic HER2 classification algorithms. This paper gives the detailed theoretical aspects of the feature descriptors and also provides experimental results and a comparative analysis.
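Of the two descriptors, the LBP part builds on a standard per-pixel code; a minimal 8-neighbour sketch (toy image; the real pipeline aggregates such codes over stained regions into the smoothed LBP feature curve described above):

```python
def lbp_code(img, r, c):
    """8-neighbour local binary pattern code of pixel (r, c): each
    neighbour at least as bright as the centre sets one bit."""
    center = img[r][c]
    nbrs = [img[r-1][c-1], img[r-1][c], img[r-1][c+1], img[r][c+1],
            img[r+1][c+1], img[r+1][c], img[r+1][c-1], img[r][c-1]]
    return sum(1 << i for i, v in enumerate(nbrs) if v >= center)

img = [[10, 20, 30],
       [40, 50, 60],
       [70, 80, 90]]   # toy 3x3 grey-level patch
print(lbp_code(img, 1, 1))  # → 120
```

Histogramming these codes over a region gives the local texture signature that, varied across saturation levels, would trace out a curve like the paper's LBP feature curve.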
Aspects of Pairing Based Cryptography on Jacobians of Genus Two Curves
DEFF Research Database (Denmark)
Ravnshøj, Christian Robenhagen
The thesis concerns properties of Jacobians of genus two curves defined over a finite field. Such Jacobians have a wide range of applications in data security; e.g. netbanking and digital signature. New properties of the Jacobians are proved; here, a description of the embedding of -torsion point...
International Nuclear Information System (INIS)
Wallin, K.; Rintamaa, R.
1999-01-01
Historically, the ASME reference curve concept assumes a constant relation between static fracture initiation toughness and crack arrest toughness. In reality, this is not the case. Experimental results show that the difference between K_IC and K_Ia is material specific: for some materials there is a big difference, while for others they nearly coincide. So far, however, no systematic study regarding a possible correlation between the two parameters has been performed. The recent Master curve method, developed for brittle fracture initiation estimation, has enabled a consistent analysis of fracture initiation toughness data. The Master curve method has been modified to also describe crack arrest toughness. Here, this modified 'crack arrest master curve' is further validated and used to develop a simple, yet (for safety assessment purposes) adequately accurate correlation between the two fracture toughness parameters. The correlation enables the estimation of crack arrest toughness from small Charpy-sized static fracture toughness tests. The correlation is valid for low-nickel steels (Ni ≤ 1.2%). If a more accurate description of the crack arrest toughness is required, it can either be measured experimentally or estimated from instrumented Charpy-V crack arrest load information.
Spacelike and timelike form factors for ω→πγ* and K*→Kγ* in the light-front quark model
International Nuclear Information System (INIS)
Choi, Ho-Meoyng
2008-01-01
We investigate spacelike and timelike form factors for ω→πγ* and K*→Kγ* decays using the light-front quark model constrained by the variational principle for the QCD-motivated effective Hamiltonian. The momentum-dependent spacelike form factors are obtained in the q⁺ = 0 frame and then analytically continued to the timelike region. Our prediction for the timelike form factor F_ωπ(q²) is in good agreement with the experimental data. We also find that the spacelike form factor F_K*±K±(Q²) for charged kaons encounters a zero because of the negative interference between the currents coupled to the quark and the antiquark.
Wave equation in curved spacetime
Energy Technology Data Exchange (ETDEWEB)
Choquet-Bruhat, Y [Paris-6 Univ., 75 (France). Dept. de Mecanique; Christodoulou, D [Max-Planck-Institut fuer Physik und Astrophysik, Muenchen (Germany, F.R.); Francaviglia, M [Istituto di Fisica Matematica, Turin (Italy)
1979-01-01
We study the case where g tends at timelike infinity to a stationary metric, we prove some existence theorems in spaces of solutions with finite energy for all times, and also with finite s-order energy.
Kumar, Gautam; Maji, Kuntal
2018-04-01
This article deals with the prediction of strain- and stress-based forming limit curves for advanced high strength steel DP590 sheet using the Marciniak-Kuczynski (M-K) method. Three yield criteria, namely von Mises, Hill's 48 and Yld2000-2d, and two hardening laws, i.e., the Hollomon power law and the Swift hardening law, were considered to predict the forming limit curves (FLCs) for DP590 steel sheet. The effects of the imperfection factor and the initial groove angle on the prediction of the FLC were also investigated. It was observed that the FLCs shifted upward with increasing imperfection factor. The initial groove angle was found to have significant effects on limit strains on the left side of the FLC, and an insignificant effect on the right side for a certain range of strain paths. The limit strains were calculated at zero groove angle for the right side of the FLC, and a critical groove angle was used for the left side. The numerically predicted FLCs considering the different combinations of yield criteria and hardening laws were compared with the published experimental FLCs for DP590 steel sheet. The FLC predicted using the combination of the Yld2000-2d yield criterion and the Swift hardening law was in better correlation with the experimental data. Stress-based forming limit curves (SFLCs) were also calculated from the limiting strain values obtained by the M-K model. The theoretically predicted SFLCs were compared with those obtained from the experimental forming limit strains. Stress-based forming limit curves were seen to represent the forming limits of DP590 steel sheet better than strain-based forming limit curves.
Mou, Yi; Athar, Muhammad Ammar; Wu, Yuzhen; Xu, Ye; Wu, Jianhua; Xu, Zhenxing; Hayder, Zulfiqar; Khan, Saeed; Idrees, Muhammad; Nasir, Muhammad Israr; Liao, Yiqun; Li, Qingge
2016-11-01
Detection of anti-hepatitis B virus (HBV) drug resistance mutations is critical for therapeutic decisions for chronic hepatitis B virus infection. We describe a real-time PCR-based assay using multicolor melting curve analysis (MMCA) that could accurately detect 24 HBV nucleotide mutations at 10 amino acid positions in the reverse transcriptase region of the HBV polymerase gene. The two-reaction assay had a limit of detection of 5 copies per reaction and could detect a minor mutant population (5% of the total population) with the reverse transcriptase M204V amino acid mutation in the presence of the major wild-type population when the overall concentration was 10⁴ copies/μl. The assay could be finished within 3 h, and the cost of materials for each sample was less than $10. Clinical validation studies using three groups of samples from both nucleos(t)ide analog-treated and -untreated patients showed that the results for 99.3% (840/846) of the samples and 99.9% (8,454/8,460) of the amino acids were concordant with those of Sanger sequencing of the PCR amplicon from the HBV reverse transcriptase region (PCR Sanger sequencing). HBV DNA in six samples with mixed infections consisting of minor mutant subpopulations was undetected by the PCR Sanger sequencing method but was detected by MMCA, and the results were confirmed by coamplification at a lower denaturation temperature-PCR Sanger sequencing. Among the treated patients, 48.6% (103/212) harbored viruses that displayed lamivudine monoresistance, adefovir monoresistance, entecavir resistance, or lamivudine and adefovir resistance. Among the untreated patients, the Chinese group had more mutation-containing samples than did the Pakistani group (3.3% versus 0.56%). Because of its accuracy, rapidness, wide-range coverage, and cost-effectiveness, the real-time PCR assay could be a robust tool for the detection of anti-HBV drug resistance mutations in resource-limited countries.
Evaluation of R-curves in ceramic materials based on bridging interactions
International Nuclear Information System (INIS)
Fett, T.; Munz, D.
1991-10-01
In coarse-grained alumina the crack growth resistance increases with increasing crack extension due to crack-border interactions. The crack shielding stress intensity factor can be calculated from the relation between the bridging stresses and the crack opening displacement. The parameters of this relation can be obtained from experimental results on stable or subcritical crack extension. Finally, the effect of the R-curve on the behaviour of components with small cracks is discussed. (orig.)
Stage-discharge rating curves based on satellite altimetry and modeled discharge in the Amazon basin
Paris, Adrien; Dias de Paiva, Rodrigo; Santos da Silva, Joecila; Medeiros Moreira, Daniel; Calmant, Stephane; Garambois, Pierre-André; Collischonn, Walter; Bonnet, Marie-Paule; Seyler, Frederique
2016-01-01
In this study, rating curves (RCs) were determined by applying satellite altimetry to a poorly gauged basin. This study demonstrates the synergistic application of remote sensing and watershed modeling to capture, respectively, the dynamics and quantity of flow in the Amazon River Basin. Three major advancements for estimating basin-scale patterns in river discharge are described. The first advancement is the preservation of the hydrological meanings of the parameters expressed by ...
The Predicting Model of E-commerce Site Based on the Ideas of Curve Fitting
Tao, Zhang; Li, Zhang; Dingjun, Chen
On the basis of second-order (quadratic) curve fitting, the number and scale of Chinese E-commerce sites are analyzed. A preventing-increase model is introduced in this paper, and the model parameters are solved using the Matlab software. The validity of the preventing-increase model is confirmed through a numerical experiment. The experimental results show that the precision of the preventing-increase model is ideal.
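A second-order least-squares curve fit of the kind the abstract describes can be sketched in a few lines; the yearly site counts below are invented placeholder data, not the paper's statistics.

```python
import numpy as np

# Illustrative quadratic (second-order) least-squares fit; the yearly
# site counts below are invented placeholder data, not the paper's.
years = np.array([0, 1, 2, 3, 4], dtype=float)
sites = np.array([1.0, 2.1, 4.9, 9.2, 15.8])   # e.g. sites in thousands

coeffs = np.polyfit(years, sites, deg=2)       # [a, b, c] of a*t^2 + b*t + c
predict = np.poly1d(coeffs)
residuals = sites - predict(years)             # fit quality check
```

The same least-squares machinery extends to the modified growth model once its functional form is fixed.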
A New Model of Stopping Sight Distance of Curve Braking Based on Vehicle Dynamics
Directory of Open Access Journals (Sweden)
Rong-xia Xia
2016-01-01
Compared with straight-line braking, cornering braking has a longer braking distance and poorer stability, so drivers are more prone to making mistakes. The braking process and the dynamics of vehicles in emergency situations on curves were analyzed. A biaxial four-wheel vehicle was simplified to a single-track model. Considering the braking process, dynamics, force distribution, and stability, a calculation model for the stopping sight distance of curve braking was built. Then a driver-vehicle-road simulation platform was built using multibody dynamics software, and a brake-in-turn vehicle test was realized on this platform. The comparison of experimental and calculated values verified the reliability of the computational model. Finally, the experimental and calculated values were compared with the stopping sight distance recommended by the Highway Route Design Specification (JTG D20-2006); the current specification of stopping sight distance does not apply to cornering braking sight distance requirements. In this paper, the general values and limits of the curve stopping sight distance are presented.
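As background, the straight-line stopping sight distance that the curve-braking model refines is the classic reaction-plus-braking sum. The sketch below uses illustrative textbook coefficients, not the paper's vehicle-dynamics model.

```python
# Classic straight-line stopping sight distance (reaction + braking) as a
# baseline; reaction time, friction and grade are illustrative textbook
# values, not parameters of the paper's curve-braking model.
G = 9.81  # gravitational acceleration, m/s^2

def stopping_sight_distance(speed_kmh, reaction_time_s=2.5,
                            friction=0.35, grade=0.0):
    v = speed_kmh / 3.6                     # convert km/h to m/s
    reaction = v * reaction_time_s          # distance covered while reacting
    braking = v * v / (2.0 * G * (friction + grade))
    return reaction + braking               # metres
```

The paper's contribution is precisely that, in a turn, part of the friction budget is consumed laterally, so the effective braking term is smaller than this straight-line estimate.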
Hsu, Shu-Hui; Kulasekere, Ravi; Roberson, Peter L
2010-08-05
Film calibration is time-consuming work when dose accuracy is essential while working in a range of photon scatter environments. This study uses the single-target single-hit model of film response to fit the calibration curves as a function of calibration method, processor condition, field size and depth. Kodak XV film was irradiated perpendicular to the beam axis in a solid water phantom. Standard calibration films (one dose point per film) were irradiated at 90 cm source-to-surface distance (SSD) for various doses (16-128 cGy), depths (0.2, 0.5, 1.5, 5, 10 cm) and field sizes (5 × 5, 10 × 10 and 20 × 20 cm²). The 8-field calibration method (eight dose points per film) was used as a reference for each experiment, taken at 95 cm SSD and 5 cm depth. The delivered doses were measured using an Attix parallel plate chamber for improved accuracy of dose estimation in the buildup region. Three fitting methods with one to three dose points per calibration curve were investigated for the field sizes of 5 × 5, 10 × 10 and 20 × 20 cm². The inter-day variation of the model parameters (background, saturation and slope) was 1.8%, 5.7%, and 7.7% (1σ) using the 8-field method. The saturation parameter ratio of standard to 8-field curves was 1.083 ± 0.005. The slope parameter ratio of standard to 8-field curves ranged from 0.99 to 1.05, depending on field size and depth. The slope parameter ratio decreases with increasing depth below 0.5 cm for the three field sizes, and increases with increasing depth above 0.5 cm. A calibration curve with one to three dose points fitted with the model is possible with 2% accuracy in film dosimetry for various irradiation conditions. The proposed fitting methods may reduce workload while providing energy dependence correction in radiographic film dosimetry. This study is limited to radiographic XV film with a Lumisys scanner.
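A calibration fit with background, saturation and slope parameters of the kind described can be sketched with a standard least-squares routine; the saturating-exponential functional form and the synthetic data below are illustrative assumptions, not the study's measured values.

```python
import numpy as np
from scipy.optimize import curve_fit

def single_hit(dose, background, saturation, slope):
    """Assumed single-target single-hit form: optical density rises
    exponentially from a background level toward saturation."""
    return background + saturation * (1.0 - np.exp(-slope * dose))

# Synthetic "measurements" on the dose grid used in the study (cGy);
# the parameter values are invented for illustration.
doses = np.array([0.0, 16.0, 32.0, 64.0, 128.0])
true_params = (0.2, 2.5, 0.01)
optical_density = single_hit(doses, *true_params)

fitted, _ = curve_fit(single_hit, doses, optical_density, p0=(0.1, 2.0, 0.02))
```

With only three parameters, a curve fitted through one to three dose points plus prior knowledge of the parameter ratios is exactly the workload reduction the study argues for.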
Interpolating Spline Curve-Based Perceptual Encryption for 3D Printing Models
Directory of Open Access Journals (Sweden)
Giao N. Pham
2018-02-01
With the development of 3D printing technology, 3D printing has recently been applied to many areas of life, including healthcare and the automotive industry. Because of their value, 3D printing models are often attacked by hackers and distributed without agreement from the original providers. Furthermore, certain special models and anti-weapon models in 3D printing must be protected against unauthorized users. Therefore, in order to prevent attacks and illegal copying and to ensure that all access is authorized, 3D printing models should be encrypted before being transmitted and stored. A novel perceptual encryption algorithm for 3D printing models for secure storage and transmission is presented in this paper. A facet of the 3D printing model is extracted to interpolate a spline curve of degree 2 in three-dimensional space that is determined by three control points, the curvature coefficients of degree 2, and an interpolating vector. The three control points, the curvature coefficients, and the interpolating vector of the degree-2 spline curve are encrypted by a secret key. The encrypted features of the spline curve are then used to obtain the encrypted 3D printing model by inverse interpolation and geometric distortion. The results of experiments and evaluations prove that the entire 3D triangle model is altered and deformed after the perceptual encryption process. The proposed algorithm is applicable to the various formats of 3D printing models. The results of the perceptual encryption process are superior to those of previous methods, and the proposed algorithm provides better security than previous methods.
Fracture resistance curves and toughening mechanisms in polymer based dental composites
DEFF Research Database (Denmark)
De Souza, J.A.; Goutianos, Stergios; Skovgaard, M.
2011-01-01
The fracture resistance (R-curve behaviour) of two commercial dental composites (Filtek Z350® and Concept Advanced®) was studied using Double Cantilever Beam sandwich specimens loaded with pure bending moments to obtain stable crack growth. The experiments were conducted in an environmental...... significantly higher fracture resistance than the composite with the coarser microstructure. The fracture properties were related to the flexural strength of the dental composites. The method, thus, can provide useful insight into how the microstructure enhances toughness, which is necessary for the future...
An industrial batch dryer simulation tool based on the concept of the characteristic drying curve
DEFF Research Database (Denmark)
Kærn, Martin Ryhl; Elmegaard, Brian; Schneider, P.
2013-01-01
content in the material to be invariant in the airflow direction. In the falling-rate period, the concept of the Characteristic Drying Curve (CDC) is used as proposed by Langrish et al. (1991), but modified to account for a possible end-drying rate. Using the CDC both hygroscopic and non....... However, the tool may be used to analyze overall effects of inlet temperature, volume flow rate, geometry, infiltration etc. on the performance in terms of drying time, heat consumption and blower power....
An Improved Minimum Error Interpolator of CNC for General Curves Based on FPGA
Directory of Open Access Journals (Sweden)
Jiye HUANG
2014-05-01
This paper presents an improved minimum-error interpolation algorithm for general curve generation in computer numerical control (CNC). Compared with conventional interpolation algorithms such as the By-Point Comparison method, the Minimum-Error method and the Digital Differential Analyzer (DDA) method, the proposed improved Minimum-Error interpolation algorithm strikes a balance between accuracy and efficiency. The new algorithm is applicable to linear, circular, elliptical and parabolic curves. The proposed algorithm is realized on a field programmable gate array (FPGA) in the Verilog HDL language, simulated with the ModelSim software, and finally verified on a two-axis CNC lathe. The algorithm has the following advantages: firstly, the maximum interpolation error is only half of the minimum step size; secondly, the computing time is only two clock cycles of the FPGA. Simulations and actual tests have proved the high accuracy and efficiency of the algorithm, showing that it is highly suited for real-time applications.
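The core idea of minimum-error interpolation, stepping one axis at a time so the deviation from the ideal curve stays smallest, can be sketched in software for the simplest case of a straight line; the FPGA version evaluates analogous error terms in hardware, and this sketch is not the paper's algorithm.

```python
def interpolate_line(x1, y1):
    """Minimum-error interpolation of the line from (0, 0) to (x1, y1) in
    the first quadrant: at each step, move one unit along the axis that
    leaves the smaller absolute deviation from the ideal line
    F(x, y) = x*y1 - y*x1 = 0."""
    x = y = 0
    path = [(0, 0)]
    while (x, y) != (x1, y1):
        err_x = abs((x + 1) * y1 - y * x1)   # error after a step in +x
        err_y = abs(x * y1 - (y + 1) * x1)   # error after a step in +y
        if x == x1:
            y += 1
        elif y == y1 or err_x <= err_y:
            x += 1
        else:
            y += 1
        path.append((x, y))
    return path
```

For circles, ellipses and parabolas the same selection rule applies with the corresponding implicit curve equation in place of F.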
He, G.; Zhu, H.; Xu, J.; Gao, K.; Zhu, D.
2017-09-01
The bionic research of shape is an important aspect of the research on bionic robot, and its implementation cannot be separated from the shape modeling and numerical simulation of the bionic object, which is tedious and time-consuming. In order to improve the efficiency of shape bionic design, the feet of animals living in soft soil and swamp environment are taken as bionic objects, and characteristic skeleton curve, section curve, joint rotation variable, position and other parameters are used to describe the shape and position information of bionic object’s sole, toes and flipper. The geometry modeling of the bionic object is established by using the parameterization of characteristic curves and variables. Based on this, the integration framework of parametric modeling and finite element modeling, dynamic analysis and post-processing of sinking process in soil is proposed in this paper. The examples of bionic ostrich foot and bionic duck foot are also given. The parametric modeling and integration technique can achieve rapid improved design based on bionic object, and it can also greatly improve the efficiency and quality of robot foot bionic design, and has important practical significance to improve the level of bionic design of robot foot’s shape and structure.
Properties of three-body decay functions derived with time-like jet calculus beyond leading order
International Nuclear Information System (INIS)
Sugiura, Tetsuya
2002-01-01
Three-body decay functions in time-like parton branching are calculated using the jet calculus to next-to-leading logarithmic (NLL) order in perturbative quantum chromodynamics (QCD). The phase space contributions from each of the ladder diagrams and interference diagrams are presented. We correct part of the results for the three-body decay functions calculated previously by two groups. Employing our new results, the properties of the three-body decay functions in the soft-parton regions are examined numerically. Furthermore, we examine the contribution of the three-body decay functions modified by the restriction resulting from the kinematical boundary of the phase space for two-body decay in the parton shower model. This restriction leads to some problems for the parton shower model; for this reason, we propose a new restriction introduced by the kinematical boundary of the phase space for two-body decay. (author)
Two-Factor User Authentication with Key Agreement Scheme Based on Elliptic Curve Cryptosystem
Directory of Open Access Journals (Sweden)
Juan Qu
2014-01-01
A password authentication scheme using a smart card is called a two-factor authentication scheme. Two-factor authentication is the most accepted and commonly used mechanism that provides authorized users a secure and efficient method for accessing resources over an insecure communication channel. Up to now, various two-factor user authentication schemes have been proposed. However, most of them are vulnerable to smart card loss attacks, offline password guessing attacks, impersonation attacks, and so on. In this paper, we design a remote password-based user authentication scheme with key agreement using an elliptic curve cryptosystem. Security analysis shows that the proposed scheme has a high level of security. Moreover, the proposed scheme is more practical and secure than some related schemes.
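For background, elliptic-curve schemes of this kind rest on the curve group operation and scalar multiplication. The toy sketch below uses a deliberately tiny curve to show only that arithmetic; real schemes use standardized curves and constant-time implementations, and none of the paper's protocol details are reproduced here.

```python
# Toy short-Weierstrass curve arithmetic over a tiny prime field, shown
# only to illustrate the group operation behind elliptic-curve schemes;
# the field size, curve and base point are deliberately unrealistic.
P = 97          # field prime
A, B = 2, 3     # curve: y^2 = x^3 + A*x + B (mod P)
G = (3, 6)      # a point on the curve (6^2 = 36 = 3^3 + 2*3 + 3 mod 97)

def add(p1, p2):
    """Group law; None represents the point at infinity."""
    if p1 is None:
        return p2
    if p2 is None:
        return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None
    if p1 == p2:
        m = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P   # tangent slope
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, P) % P          # chord slope
    x3 = (m * m - x1 - x2) % P
    return x3, (m * (x1 - x3) - y1) % P

def mul(k, point):
    """Double-and-add scalar multiplication, the core of key agreement."""
    result = None
    while k:
        if k & 1:
            result = add(result, point)
        point = add(point, point)
        k >>= 1
    return result
```

The commutativity of scalar multiplication, mul(a, mul(b, G)) == mul(b, mul(a, G)), is what lets two parties derive the same session key from their private scalars.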
Metal-mesh based transparent electrode on a 3-D curved surface by electrohydrodynamic jet printing
International Nuclear Information System (INIS)
Seong, Baekhoon; Yoo, Hyunwoong; Jang, Yonghee; Ryu, Changkook; Byun, Doyoung; Nguyen, Vu Dat
2014-01-01
Invisible Ag mesh transparent electrodes (TEs), with a line width of 7 μm, were prepared on a curved glass surface by electrohydrodynamic (EHD) jet printing. With a 100 μm pitch, the EHD-jet-printed Ag mesh on the convex glass had a sheet resistance of 1.49 Ω/□. The printing speed was 30 cm s⁻¹ using an Ag ink with a viscosity of 10,000 cPs and a 70 wt% Ag nanoparticle concentration. We further demonstrated the performance of a 3-D transparent heater using the Ag mesh transparent electrode. EHD jet printing produced an invisible Ag grid transparent electrode with good electrical and optical properties and promising applications in printed optoelectronic devices. (technical note)
Broadband giant-refractive-index material based on mesoscopic space-filling curves
Chang, Taeyong; Kim, Jong Uk; Kang, Seung Kyu; Kim, Hyowook; Kim, Do Kyung; Lee, Yong-Hee; Shin, Jonghwa
2016-08-01
The refractive index is the fundamental property of all optical materials and dictates Snell's law, propagation speed, wavelength, diffraction, energy density, and the absorption and emission of light in materials. Experimentally realized broadband refractive indices have remained modest; here, a broadband index greater than 1,800 is demonstrated, resulting from a mesoscopic crystal with a dielectric constant greater than three million. This gigantic enhancement effect originates from the space-filling curve concept from mathematics. The principle is inherently broadband, the enhancement being nearly constant from zero up to the frequency of interest. This broadband giant-refractive-index medium promises not only enhanced resolution in imaging and raised fundamental absorption limits in solar energy devices, but also compact, power-efficient components for optical communication and increased performance in many other applications.
Prior-knowledge-based feedforward network simulation of true boiling point curve of crude oil.
Chen, C W; Chen, D Z
2001-11-01
Theoretical results and practical experience indicate that feedforward networks can approximate a wide class of functional relationships very well. This property is exploited in modeling chemical processes. Given finite and noisy training data, it is important to encode prior knowledge in neural networks to improve the fit precision and the prediction ability of the model. In this paper, for three-layer feedforward networks under a monotonicity constraint, the unconstrained method, Joerding's penalty function method, the interpolation method, and the constrained optimization method are analyzed first. Then two novel methods, the exponential weight method and the adaptive method, are proposed. These methods are applied to simulating the true boiling point curve of a crude oil under the condition of increasing monotonicity. The simulation results show that the network models trained by the novel methods approximate the actual process well. Finally, all these methods are discussed and compared with each other.
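An exponential weight reparameterization presumably enforces monotonicity by keeping every weight positive; a minimal sketch of that idea (not the paper's network sizes or training procedure) is:

```python
import numpy as np

# Sketch of the exponential-weight idea: each weight is exp(u), hence
# positive, which makes a one-input network with increasing activations
# monotonically increasing. The paper's training procedure is not shown;
# the unconstrained parameters u are simply drawn at random here.
rng = np.random.default_rng(0)
u_hidden = rng.normal(size=(4, 1))   # unconstrained hidden parameters
u_output = rng.normal(size=(1, 4))   # unconstrained output parameters

def net(x):
    hidden = np.tanh(np.exp(u_hidden) @ x)        # positive hidden weights
    return (np.exp(u_output) @ hidden).item()     # positive output weights

inputs = np.linspace(-2.0, 2.0, 50)
outputs = [net(np.array([[v]])) for v in inputs]
```

Training then optimizes u freely with gradient descent while the realized weights exp(u) remain positive, so the fitted boiling point curve is monotone by construction.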
Papathoma-Köhle, Maria
2016-08-01
The assessment of the physical vulnerability of elements at risk as part of the risk analysis is an essential aspect for the development of strategies and structural measures for risk reduction. Understanding, analysing and, if possible, quantifying physical vulnerability is a prerequisite for designing strategies and adopting tools for its reduction. The most common methods for assessing physical vulnerability are vulnerability matrices, vulnerability curves and vulnerability indicators; however, in most of the cases, these methods are used in a conflicting way rather than in combination. The article focuses on two of these methods: vulnerability curves and vulnerability indicators. Vulnerability curves express physical vulnerability as a function of the intensity of the process and the degree of loss, considering, in individual cases only, some structural characteristics of the affected buildings. However, a considerable amount of studies argue that vulnerability assessment should focus on the identification of these variables that influence the vulnerability of an element at risk (vulnerability indicators). In this study, an indicator-based methodology (IBM) for mountain hazards including debris flow (Kappes et al., 2012) is applied to a case study for debris flows in South Tyrol, where in the past a vulnerability curve has been developed. The relatively "new" indicator-based method is being scrutinised and recommendations for its improvement are outlined. The comparison of the two methodological approaches and their results is challenging since both methodological approaches deal with vulnerability in a different way. However, it is still possible to highlight their weaknesses and strengths, show clearly that both methodologies are necessary for the assessment of physical vulnerability and provide a preliminary "holistic methodological framework" for physical vulnerability assessment showing how the two approaches may be used in combination in the future.
Komatsu, Shohei; Scatton, Olivier; Goumard, Claire; Sepulveda, Ailton; Brustia, Raffaele; Perdigao, Fabiano; Soubrane, Olivier
2017-05-01
Laparoscopic hepatectomy continues to be a challenging operation associated with a steep learning curve. This study aimed to evaluate the learning process during 15 years of experience with laparoscopic hepatectomy and to identify approaches to standardization of this procedure. Prospectively collected data of 317 consecutive laparoscopic hepatectomies performed from January 2000 to December 2014 were reviewed retrospectively. The operative procedures were classified into 4 categories (minor hepatectomy, left lateral sectionectomy [LLS], left hepatectomy, and right hepatectomy), and indications were classified into 5 categories (benign-borderline tumor, living donor, metastatic liver tumor, biliary malignancy, and hepatocellular carcinoma). During the first 10 years, the procedures were limited mainly to minor hepatectomy and LLS, and the indications were limited to benign-borderline tumor and living donor. Implementation of major hepatectomy rapidly increased the proportion of malignant tumors, especially hepatocellular carcinoma, starting from 2011. Conversion rates decreased with experience for LLS (13.3% vs 3.4%; p = 0.054) and left hepatectomy (50.0% vs 15.0%; p = 0.012), but not for right hepatectomy (41.4% vs 35.7%; p = 0.661). Our 15-year experience clearly demonstrates the stepwise procedural evolution from LLS through left hepatectomy to right hepatectomy, as well as the trend in indications from benign-borderline tumor/living donor to malignant tumors. In contrast to LLS and left hepatectomy, a learning curve was not observed for right hepatectomy. The ongoing development process can contribute to faster standardization necessary for future advances in laparoscopic hepatectomy. Copyright © 2017 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
Clarke, F H; Cahoon, N M
1987-08-01
A convenient procedure has been developed for the determination of partition and distribution coefficients. The method involves the potentiometric titration of the compound, first in water and then in a rapidly stirred mixture of water and octanol. An automatic titrator is used, and the data is collected and analyzed by curve fitting on a microcomputer with 64 K of memory. The method is rapid and accurate for compounds with pKa values between 4 and 10. Partition coefficients can be measured for monoprotic and diprotic acids and bases. The partition coefficients of the neutral compound and its ion(s) can be determined by varying the ratio of octanol to water. Distribution coefficients calculated over a wide range of pH values are presented graphically as "distribution profiles". It is shown that subtraction of the titration curve of solvent alone from that of the compound in the solvent offers advantages for pKa determination by curve fitting for compounds of low aqueous solubility.
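The "distribution profile" concept can be illustrated with the standard relation between the distribution and partition coefficients for a monoprotic acid, assuming only the neutral species partitions into octanol (a common simplification); the logP and pKa values below are invented.

```python
import math

# Distribution coefficient (log D) of a monoprotic acid versus pH,
# assuming only the neutral species partitions into octanol; the
# log P and pKa values below are invented for illustration.
def log_d_acid(ph, log_p, pka):
    return log_p - math.log10(1.0 + 10.0 ** (ph - pka))

# A "distribution profile": log D evaluated across the pH range.
profile = [(ph, log_d_acid(ph, log_p=3.5, pka=4.2)) for ph in range(1, 11)]
```

Well below the pKa the compound is fully neutral and logD approaches logP; above it, ionization drives logD down by roughly one unit per pH unit.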
International Nuclear Information System (INIS)
Perevertov, Oleksiy
2003-01-01
The classical Preisach model (PM) of magnetic hysteresis requires that any minor differential permeability curve lies under minor curves with larger field amplitude. Measurements of ferromagnetic materials show that very often this is not true. By applying the classical PM formalism to measured minor curves one can discover that it leads to an oval-shaped region on each half of the Preisach plane where the calculations produce negative values in the Preisach function. Introducing an effective field, which differs from the applied one by a mean-field term proportional to the magnetization, usually solves this problem. Complex techniques exist to estimate the minimum necessary proportionality constant (the moving parameter). In this paper we propose a simpler way to estimate the mean-field effects for use in nondestructive testing, which is based on experience from the measurements of industrial steels. A new parameter (parameter of shift) is introduced, which monitors the mean-field effects. The relation between the shift parameter and the moving one was studied for a number of steels. From preliminary experiments no correlation was found between the shift parameter and the classical magnetic ones such as the coercive field, maximum differential permeability and remanent magnetization
A Study on the Surface and Subsurface Water Interaction Based on the Groundwater Recession Curve
Wang, S. T.; Chen, Y. W.; Chang, L. C.; Chiang, C. J.; Wang, Y. S.
2017-12-01
The interaction between surface water and subsurface water is an important issue for groundwater resources assessment and management. Surface water influences groundwater mainly through rainfall recharge, river recharge and discharge, and other boundary sources. During a drought period, the interaction between a river and groundwater may be one of the main sources of groundwater level recession. Therefore, this study explores the interaction between surface water and groundwater via the groundwater recession. During drought periods, pumping and river interaction together are the main mechanisms causing the recession of the groundwater level. In principle, a larger gradient of the recession curve indicates more groundwater discharge, and it is an important characteristic of the groundwater system. In this study, to avoid time-consuming manual analysis, the Python programming language is used to develop a statistical analysis model for extracting the groundwater recession information. First, the slopes of the groundwater level hydrograph at every time step were computed for each well. Then, for each well, the representative slope for each groundwater level was defined as the slope with 90% exceedance probability. The relationship between the recession slope and the groundwater level can then be obtained. The developed model is applied to the Choushui River Alluvial Fan. In most wells, the results show strong positive correlations between the groundwater levels and the absolute values of the recession slopes.
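The slope statistics described above can be sketched directly; the function names and synthetic hydrograph are illustrative, but a slope with 90% exceedance probability corresponds to the 10th percentile of the observed slopes.

```python
import numpy as np

# Sketch of the recession-slope statistics; function names and the
# synthetic hydrograph are illustrative. The slope with 90% exceedance
# probability is the 10th percentile of the observed slopes.
def recession_slopes(levels):
    """Hydrograph slope at every time step (level change per record)."""
    return np.diff(np.asarray(levels, dtype=float))

def representative_slope(slopes):
    """Value that 90% of the observed slopes exceed."""
    return float(np.percentile(slopes, 10))
```

In the study this statistic is computed per well and per groundwater-level bin, yielding the recession-slope-versus-level relationship.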
Search procedure for models based on the evolution of experimental curves
International Nuclear Information System (INIS)
Delforge, J.
1975-01-01
The possibilities offered by numerical analysis for the identification of model parameters are outlined. The use of a large number of experimental measurements is made possible by the flexibility of the proposed method. It is shown that the errors of numerical identification over all parameters are proportional to the experimental errors and to a proportionality factor, called the conditioning of the identification problem, which is easily computed. Moreover, it is possible to define and calculate, for each parameter, a factor of sensitivity to experimental errors. The numerical values of the conditioning and sensitivity factors depend on all experimental conditions, that is, on the one hand, the specific definition of the experiments and, on the other hand, the number and quality of the measurements undertaken. The proposed identification procedure includes several phases. The preliminary phase consists in a first definition of the experimental conditions, in agreement with the experimenter. From the data thus obtained, it is generally possible to evaluate the minimum number of equivalence classes required for an interpretation compatible with the morphology of the experimental curves. Possibly, from this point, some additional measurements may prove useful or required. The numerical phase then determines a first approximate model by means of the methods previously described. The next phases again require close collaboration between experimenters and theoreticians; they consist mainly in refining the first model.
Energy Technology Data Exchange (ETDEWEB)
Soriguera Marti, F.; Martinez-Diaz, M.; Perez Perez, I.
2016-07-01
Travel time is probably the most important indicator of the level of service of a highway, and it is also the information most appreciated by its users. Administrations and private companies make increasing efforts to improve its real-time estimation. The appearance of new technologies makes the precise measurement of travel times easier than ever before. However, direct measurements of travel time are, by nature, outdated in real time and lack the desired forecasting capabilities. This paper introduces a new methodology to improve the real-time estimation of travel times by using the equipment usually present on most highways, i.e., loop detectors, in combination with automatic vehicle identification or tracking technologies. One of the most important features of the method is the use of cumulative counts at detectors as an input, avoiding the drawbacks of common spot-speed methodologies. Cumulative count curves have great potential for freeway travel time information systems, as they provide spatial measurements and thus allow the calculation of instantaneous travel times. In addition, they exhibit predictive capabilities. Nevertheless, they have not been used extensively, mainly because of the error introduced by the accumulation of detector drift. The proposed methodology solves this problem by correcting the deviations using direct travel time measurements. The method is highly beneficial for its accuracy as well as for its low implementation cost. (Author)
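The use of cumulative count curves for travel time can be sketched as reading the horizontal distance between the upstream and downstream curves, assuming FIFO traffic and drift-corrected counts; the counts below are invented.

```python
import numpy as np

# Invented cumulative counts at an upstream and a downstream detector;
# assuming FIFO traffic and drift-corrected counts, the travel time of
# the n-th vehicle is the horizontal gap between the two curves at n.
t = np.arange(0.0, 60.0, 1.0)                      # time stamps (min)
n_up = 20.0 * t                                    # upstream N(t)
n_down = np.where(t < 2.0, 0.0, 20.0 * (t - 2.0))  # downstream, 2 min later

def travel_time(n, t, n_up, n_down):
    t_in = np.interp(n, n_up, t)     # when vehicle n passed upstream
    t_out = np.interp(n, n_down, t)  # when it passed downstream
    return t_out - t_in
```

The accumulation of detector drift shows up here as a slowly growing offset between n_up and n_down, which is why the paper anchors the curves to direct travel time measurements.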
Wang, Nianfeng; Guo, Hao; Chen, Bicheng; Cui, Chaoyu; Zhang, Xianmin
2018-05-01
Dielectric elastomers (DE), known as electromechanical transducers, have been widely used in the fields of sensors, generators, actuators and energy harvesting for decades. A large number of DE actuators, including bending actuators, linear actuators and rotational actuators, have been designed using experience-based design methods. This paper proposes a new method for the design of DE actuators: a topology optimization method based on pairs of curves. First, theoretical modeling and optimization design are discussed, after which a rotary dielectric elastomer actuator is designed using this optimization method. Finally, experiments and comparisons between several DE actuators are made to verify the optimized result.
Analysis of diffusion paths for photovoltaic technology based on experience curves
International Nuclear Information System (INIS)
Poponi, Daniele
2003-04-01
This paper assesses the prospects for diffusion of photovoltaic (PV) technology for electricity generation in grid-connected systems. The analysis begins with the calculation of the break-even price of PV systems and modules, which is the price that can assure commercial viability without incentives or subsidies. The calculated average break-even price of PV systems for building-integrated applications is about US$3.2/Wp but can go up to about US$4.5/Wp in areas with very good solar irradiation and if a low real discount rate is applied. These are higher values than the break-even prices estimated in the literature to date. PV system break-even prices for intermediate load generation in utility-owned systems are also calculated, their average being about US$1/Wp. The methodology of experience curves is used to predict the different levels of cumulative world PV shipments required to reach the calculated break-even prices of PV systems, assuming different trends in the relationship between price and the increase in cumulative shipments. The years in which the break-even levels of cumulative shipments could theoretically be attained are then calculated by considering different market growth rates. Photovoltaics could enter the niche of building-integrated applications without incentives in the first years of the next decade, provided that the progress ratio (PR) is 80% and the average annual world market growth rate is at least 15%. The final part of the paper analyzes the niche markets and applications that seem promising for the diffusion of photovoltaics in the next few years. (Author)
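The experience-curve arithmetic behind such projections is compact: each doubling of cumulative shipments multiplies price by the progress ratio. The starting price and cumulative shipments below are illustrative assumptions, not the paper's data.

```python
import math

# Experience-curve arithmetic: each doubling of cumulative shipments
# multiplies price by the progress ratio (PR). The starting price p0 and
# cumulative shipments c0 below are illustrative, not the paper's data.
def cumulative_at_price(target_price, p0, c0, progress_ratio):
    b = -math.log2(progress_ratio)              # learning exponent
    return c0 * (target_price / p0) ** (-1.0 / b)

# Shipments needed to reach a US$3.2/Wp break-even from an assumed
# US$6/Wp at 2 GWp cumulative, with an 80% progress ratio.
needed = cumulative_at_price(3.2, p0=6.0, c0=2.0, progress_ratio=0.8)
```

Combining the required cumulative shipments with an assumed annual market growth rate then yields the calendar year in which break-even is reached.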
Path tracking control of mobile robots with techniques based on the use of curved abscissa
International Nuclear Information System (INIS)
Micaelli, A.
1992-01-01
The paper describes a particular method, developed by the CEA, for the control of mobile robot trajectories based on Cornu's spiral, i.e., trajectory sections with a constant rate of change of curvature. More convenient approaches are also discussed.
Wang, Ten-See
1993-07-01
Excessive base heating has been a problem for many launch vehicles. For certain designs, such as the direct dump of turbine exhaust in the nozzle section and at the nozzle lip of the Space Transportation Systems Engine (STME), the potential burning of the turbine exhaust in the base region has caused tremendous concern. Two conventional approaches have been considered for predicting the base environment: (1) the empirical approach, and (2) the experimental approach. The empirical approach uses a combination of data correlations and semi-theoretical calculations. It works best for linear problems with simple physics and geometry. However, it is highly suspect when complex geometry and flow physics are involved, especially when the subject is outside the historical database. The experimental approach is often used to establish databases for engineering analysis. However, it is qualitative at best for base flow problems. Other criticisms include the inability to simulate the forebody boundary layer correctly, the interference effect from tunnel walls, and the inability to scale all pertinent parameters. Furthermore, there is a contention that the information extrapolated from subscale tests with combustion is not conservative. One potential alternative to the conventional methods is computational fluid dynamics (CFD), which has none of the above restrictions and is becoming more feasible due to maturing algorithms and advancing computer technology. It provides more details of the flowfield and is limited only by computer resources. However, it has its share of criticisms as a predictive tool for the base environment. One major concern is that CFD has not been extensively tested for base flow problems. It is therefore imperative that CFD be assessed and benchmarked satisfactorily for base flows. In this study, the turbulent base flowfield of an experimental investigation of a four-engine clustered nozzle is numerically benchmarked using a pressure-based CFD method. Since the cold air was the
DEFF Research Database (Denmark)
Bureau, Emil; Schilder, Frank; Santos, Ilmar
2014-01-01
We show how to implement control-based continuation in a nonlinear experiment using existing and freely available software. We demonstrate that it is possible to track the complete frequency response, including the unstable branches, for a harmonically forced impact oscillator.
Directory of Open Access Journals (Sweden)
Ugur Ozturk
2016-07-01
Early-warning systems (EWSs) are crucial to reduce the risk of landslides, especially where structural measures are not fully capable of preventing the devastating impact of such an event. Furthermore, designing and successfully implementing a complete landslide EWS is a highly complex task. The main technical challenges are linked to the definition of heterogeneous material properties (geotechnical and geomechanical parameters) as well as the variety of triggering factors. In addition, real-time data processing creates significant complexity, since data collection and numerical models for risk assessment are time-consuming tasks. Therefore, uncertainties in the physical properties of a landslide, together with data management, represent the two crucial deficiencies in an efficient landslide EWS. Within this study, the application of the concept of fragility curves to landslides is explored; fragility curves are widely used to simulate system response to natural hazards, e.g. floods or earthquakes. The application of fragility curves to landslide risk assessment is believed to simplify emergency risk assessment, even though it cannot substitute detailed analysis during peace-time. A simplified risk assessment technique can remove some of the unclear features and decrease data processing time. The method is based on synthetic samples which are used to define approximate failure thresholds for landslides, taking into account the materials and the piezometric levels. The results are presented in charts. The method presented in this paper, called the failure index fragility curve (FIFC), allows assessment of the actual real-time risk in a case study based on the most appropriate FIFC. The application of an FIFC to a real case is presented as an example. This method of assessing landslide risk is another step towards a more integrated dynamic approach to a potential landslide prevention system. Even if it does not define
International Nuclear Information System (INIS)
Hong, Sungjun; Chung, Yanghon; Woo, Chungwon
2015-01-01
South Korea, as the 9th largest energy-consuming country in 2013 and the 7th largest greenhouse gas emitting country in 2011, established 'Low Carbon Green Growth' as its national vision in 2008, and is announcing various active energy policies that are attracting worldwide attention. In this paper, we estimated the decrease in photovoltaic power generation cost in Korea based on learning curve theory. Photovoltaic energy is one of the leading renewable energy sources, and countries all over the world are currently expanding R and D, demonstration and deployment of photovoltaic technology. In order to estimate the learning rate of photovoltaic energy in Korea, both the conventional 1FLC (one-factor learning curve), which considers only cumulative power generation, and the 2FLC, which also considers R and D investment, were applied. The 1FLC analysis showed that the cost of power generation decreased by 3.1% as cumulative power generation doubled. The 2FLC analysis showed that the cost decreases by 2.33% every time cumulative photovoltaic power generation doubles and by 5.13% every time R and D investment doubles. Moreover, the effect of R and D investment on photovoltaic technology appeared after around 3 years, and the depreciation rate of R and D investment was around 20%. - Highlights: • We analyze the learning effects of photovoltaic energy technology in Korea. • In order to calculate the learning rate, we use the 1FLC (one-factor learning curve) and 2FLC methods, respectively. • The 1FLC method considers only cumulative power generation. • The 2FLC method considers both cumulative power generation and knowledge stock. • We analyze a variety of scenarios by time lag and depreciation rate of R and D investment
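The one-factor learning curve above can be sketched in a few lines: unit cost is modeled as C = C0·Q^(−b), where Q is cumulative output, and the learning rate (fractional cost drop per doubling of Q) is 1 − 2^(−b). The sketch below fits b by log-log regression on synthetic data built to reproduce the abstract's 3.1% rate; the data and function names are illustrative, not the paper's.

```python
import numpy as np

def fit_one_factor_learning_curve(cum_output, unit_cost):
    """Fit C = C0 * Q**(-b) by least squares in log-log space.

    Returns (C0, b, learning_rate), where the learning rate is the
    fractional cost reduction per doubling of cumulative output:
    LR = 1 - 2**(-b).
    """
    logQ = np.log(cum_output)
    logC = np.log(unit_cost)
    slope, logC0 = np.polyfit(logQ, logC, 1)
    b = -slope                      # convention: cost falls as Q grows
    return np.exp(logC0), b, 1.0 - 2.0 ** (-b)

# Synthetic data with a known 3.1% learning rate (the abstract's 1FLC result)
Q = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
b_true = -np.log2(1 - 0.031)        # elasticity implied by LR = 3.1%
C = 100.0 * Q ** (-b_true)
C0, b, lr = fit_one_factor_learning_curve(Q, C)
print(round(lr, 3))  # → 0.031
```

A two-factor variant would add a log knowledge-stock term to the regression, giving separate learning-by-doing and learning-by-researching rates.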
Acid-base titration curves in an integrated computer learning environment
Heck, A.; Kędzierska, E.; Rodgers, L.; Chmurska, M.
2008-01-01
The topic of acid-base reactions is a regular component of many chemistry curricula that requires integrated understanding of various areas of introductory chemistry. Many students have considerable difficulties understanding the concepts and processes involved. It has been suggested and confirmed
Detector evaluation for improved situational awareness: Receiver operator characteristic curve based
Wuijckhuijse, A.L. van; Nieuwenhuizen, M.S.
2016-01-01
In military and civilian operations good situational awareness is a prerequisite to make proper decisions. The situational awareness is among others based upon intelligence, threat analysis and detection, altogether element of the so-called DIM (detection, identification, monitoring) system. In case
Ibourk, Aomar; Amaghouss, Jabrane
2012-01-01
Although the quantity of education is widely used to measure the economical and social performances of educative systems, only a few works have addressed the issue of equity in education. In this work, we have calculated two measures of inequality in education based on Barro and Lee's (2010) data: the Gini index of education and the standard…
Directory of Open Access Journals (Sweden)
Alaauldin Ibrahim
2017-01-01
Information in patients' medical histories is subject to various security and privacy concerns. Meanwhile, any modification or error in a patient's medical data may cause serious or even fatal harm. To protect and transfer this valuable and sensitive information in a secure manner, radio-frequency identification (RFID) technology has been widely adopted in healthcare systems and is being deployed in many hospitals. In this paper, we propose a mutual authentication protocol for RFID tags based on elliptic curve cryptography and the Advanced Encryption Standard. Unlike existing authentication protocols, which only send the tag ID securely, the proposed protocol can also send the valuable data stored in the tag in encrypted form. The proposed protocol is not simply a theoretical construct; it has been coded and tested on an experimental RFID tag. The proposed scheme achieves mutual authentication in just two steps and satisfies all the essential security requirements of RFID-based healthcare systems.
Estimation of natural-gas consumption in Poland based on the logistic-curve interpretation
International Nuclear Information System (INIS)
Siemek, J.; Nagy, S.; Rychlicki, S.
2003-01-01
This paper describes a possible scenario for the development of the gas sector in Poland. An adaptation of the Hubbert model, based on the Starzman modification, is applied to the Polish situation. The model describes hypothetical natural-gas demand based on the average trend of economic development during recent decades, and considers the natural production/demand maxima of energy carriers. The prognosis carries an error resulting from the use of average data on yearly increases of the gross national product. The adapted model shows good agreement with natural-gas demand for the period 1995-2000; however, the error of the prognosis may reach 20%. The simple structure of the model permits yearly updating and eventual correction of the natural-gas demand. In cases of atypical changes in the economic growth rate (long stagnation, or extremely long and accelerated development), the prognosis error may increase. (author)
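A logistic demand curve of the kind used above, Q(t) = Qmax / (1 + e^(−k(t − t0))), can be fitted by linearization once the saturation level Qmax is assumed: ln(Qmax/Q − 1) is linear in t. The sketch below is a minimal illustration on synthetic data; the saturation value and time series are hypothetical, not the paper's Polish figures.

```python
import numpy as np

def fit_logistic(t, q, q_max):
    """Fit q(t) = q_max / (1 + exp(-k*(t - t0))) with q_max assumed known,
    by linearizing: ln(q_max/q - 1) = -k*t + k*t0."""
    y = np.log(q_max / q - 1.0)
    slope, intercept = np.polyfit(t, y, 1)
    k = -slope
    t0 = intercept / k
    return k, t0

# Synthetic "annual demand" data from a known logistic curve
t = np.arange(1995, 2001, dtype=float)
k_true, t0_true, q_max = 0.25, 2005.0, 20.0   # hypothetical saturation level
q = q_max / (1.0 + np.exp(-k_true * (t - t0_true)))
k, t0 = fit_logistic(t, q, q_max)
print(round(k, 3), round(t0, 1))  # → 0.25 2005.0
```

In practice Qmax itself is uncertain, which is one reason yearly re-fitting (as the abstract suggests) is useful.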
Liu, Boshi; Huang, Renliang; Yu, Yanjun; Su, Rongxin; Qi, Wei; He, Zhimin
2018-01-01
Ochratoxin A (OTA) is a type of mycotoxin generated by the metabolism of Aspergillus and Penicillium, and is extremely toxic to humans, livestock, and poultry. However, traditional assays for the detection of OTA are expensive and complicated. In addition to the OTA aptamer, OTA itself at high concentration can adsorb on the surface of gold nanoparticles (AuNPs) and further inhibit AuNP salt aggregation. We herein report a new OTA assay applying the localized surface plasmon resonance effect of AuNPs and their aggregates. Because a result obtained from a single linear calibration curve is not reliable, we developed a "double calibration curve" method to address this issue and widen the OTA detection range. A number of other analytes were also examined, and the structural properties of analytes that bind to the AuNPs are further discussed. We found that various considerations must be taken into account when detecting these analytes with AuNP aggregation-based methods, owing to their different binding strengths.
Li, Kenli; Zou, Shuting; Xv, Jin
2008-01-01
Elliptic curve cryptographic algorithms convert input data into an unrecognizable encrypted form and back again into the original decrypted form. The security of this form of encryption hinges on the enormous difficulty of solving the elliptic curve discrete logarithm problem (ECDLP), especially over GF(2^n), n in Z+. This paper describes an effective method to find solutions to the ECDLP by means of a molecular computer. We propose that this research accomplishment would represent a breakthrough for applied biological computation, and this paper demonstrates that in principle this is possible. Three DNA-based algorithms are described: a parallel adder, a parallel multiplier, and a parallel inverse over GF(2^n). The biological operation time of all of these algorithms is polynomial with respect to n. In light of this analysis, public-key cryptography might be less secure than assumed. In this respect, a principal contribution of this paper is to provide enhanced evidence of the potential of molecular computing to tackle such ambitious computations.
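To make the ECDLP concrete: given points P and Q = kP on a curve, recover k. On a toy curve this is trivially brute-forced, which is exactly why real systems use curves with group orders near 2^256. The sketch below uses a small prime-field curve (y² = x³ + 2x + 2 over GF(17)) rather than the paper's GF(2^n) setting, purely to keep the arithmetic short; it is an illustration of the problem, not of the DNA algorithms.

```python
def ec_add(P, Q, a, p):
    """Add two points on y^2 = x^3 + a*x + b over GF(p); None = infinity."""
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                                   # point at infinity
    if P == Q:
        m = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (m * m - x1 - x2) % p
    return (x3, (m * (x1 - x3) - y1) % p)

def ec_mul(k, P, a, p):
    """Double-and-add scalar multiplication k*P."""
    R = None
    while k:
        if k & 1:
            R = ec_add(R, P, a, p)
        P = ec_add(P, P, a, p)
        k >>= 1
    return R

def ecdlp_bruteforce(P, Q, a, p):
    """Find k with Q = k*P by exhaustive search -- feasible only for toy
    curves, which is why large-curve ECDLP underpins security."""
    R, k = P, 1
    while R != Q:
        R = ec_add(R, P, a, p)
        k += 1
    return k

# y^2 = x^3 + 2x + 2 over GF(17); P = (5, 1) is a standard generator
a, p, P = 2, 17, (5, 1)
Q = ec_mul(7, P, a, p)
print(ecdlp_bruteforce(P, Q, a, p))  # → 7
```

The brute-force loop takes O(order) group operations; the point of the paper is that massive molecular parallelism could attack far larger instances.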
Persky, Adam M; Henry, Teague; Campbell, Ashley
2015-03-25
To examine factors that determine the interindividual variability of learning within a team-based learning environment. Students in a pharmacokinetics course were given 4 interim, low-stakes cumulative assessments throughout the semester and a cumulative final examination. Students' Myers-Briggs personality type was assessed, as well as their study skills, motivations, and attitudes towards team-learning. A latent curve model (LCM) was applied and various covariates were assessed to improve the regression model. A quadratic LCM was applied for the first 4 assessments to predict final examination performance. None of the covariates examined significantly impacted the regression model fit except metacognitive self-regulation, which explained some of the variability in the rate of learning. There were some correlations between personality type and attitudes towards team learning, with introverts having a lower opinion of team-learning than extroverts. The LCM could readily describe the learning curve. Extroverted and introverted personality types had the same learning performance even though preference for team-learning was lower in introverts. Other personality traits, study skills, or practice did not significantly contribute to the learning variability in this course.
Uno, Hajime; Tian, Lu; Claggett, Brian; Wei, L J
2015-12-10
With censored event time observations, the logrank test is the most popular tool for testing the equality of two underlying survival distributions. Although this test is asymptotically distribution free, it may not be powerful when the proportional hazards assumption is violated. Various other novel testing procedures have been proposed, which generally are derived by assuming a class of specific alternative hypotheses with respect to the hazard functions. The test considered by Pepe and Fleming (1989) is based on a linear combination of weighted differences of the two Kaplan-Meier curves over time and is a natural tool to assess the difference of two survival functions directly. In this article, we take a similar approach but choose weights that are proportional to the observed standardized difference of the estimated survival curves at each time point. The new proposal automatically makes weighting adjustments empirically. The new test statistic is aimed at a one-sided general alternative hypothesis and is distributed with a short right tail under the null hypothesis but with a heavy tail under the alternative. The results from extensive numerical studies demonstrate that the new procedure performs well under various general alternatives with a caution of a minor inflation of the type I error rate when the sample size is small or the number of observed events is small. The survival data from a recent cancer comparative study are utilized for illustrating the implementation of the process. Copyright © 2015 John Wiley & Sons, Ltd.
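The Kaplan-Meier curves that both the Pepe-Fleming statistic and the proposal above compare can be estimated in a few lines. The sketch below computes two KM curves from small, made-up samples and an unweighted summary of their difference on a common grid; it is the estimator underlying these tests, not the paper's weighted statistic itself.

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate. `events` is 1 for an observed event,
    0 for a censored observation. Returns (distinct event times, S(t))."""
    times, events = np.asarray(times, float), np.asarray(events, int)
    t_event = np.unique(times[events == 1])
    S, surv = 1.0, []
    for t in t_event:
        at_risk = np.sum(times >= t)
        d = np.sum((times == t) & (events == 1))
        S *= 1.0 - d / at_risk
        surv.append(S)
    return t_event, np.array(surv)

# Two small made-up samples; a Pepe-Fleming-style statistic integrates the
# difference of the two KM curves over time (here: a plain average).
tA, eA = [2, 3, 3, 5, 8], [1, 1, 0, 1, 1]
tB, eB = [1, 2, 2, 4, 6], [1, 1, 1, 1, 0]
gA, SA = kaplan_meier(tA, eA)
gB, SB = kaplan_meier(tB, eB)
grid = np.arange(1, 9)
SA_g = np.array([SA[gA <= t][-1] if np.any(gA <= t) else 1.0 for t in grid])
SB_g = np.array([SB[gB <= t][-1] if np.any(gB <= t) else 1.0 for t in grid])
print(round(float(np.mean(SA_g - SB_g)), 3))   # unweighted difference summary
```

The new test replaces the uniform averaging with weights proportional to the observed standardized difference at each time point.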
Preparing for TESS: Precision Ground-based Light-curves of Newly Discovered Transiting Exoplanets
Li, Yiting; Stefansson, Gudmundur; Mahadevan, Suvrath; Monson, Andy; Hebb, Leslie; Wisniewski, John; Huehnerhoff, Joseph
2018-01-01
NASA’s Transiting Exoplanet Survey Satellite (TESS), to be launched in early 2018, is expected to catalog a myriad of transiting exoplanet candidates ranging from Earth-sized to gas giants, orbiting a diverse range of stellar types in the solar neighborhood. In particular, TESS will find small planets orbiting the closest and brightest stars, and will enable detailed atmospheric characterizations of planets with current and future telescopes. In the TESS era, ground-based follow-up resources will play a critical role in validating and confirming the planetary nature of the candidates TESS will discover. Along with confirming the planetary nature of exoplanet transits, high precision ground-based transit observations allow us to put further constraints on exoplanet orbital parameters and transit timing variations. In this talk, we present new observations of transiting exoplanets recently discovered by the K2 mission, using the optical diffuser on the 3.5m ARC Telescope at Apache Point Observatory. These include observations of the mini-Neptunes K2-28b and K2-104b orbiting early-to-mid M-dwarfs. In addition, other recent transit observations performed using the robotic 30cm telescope at Las Campanas Observatory in Chile will be presented.
Genetic Algorithm-Based Optimization to Match Asteroid Energy Deposition Curves
Tarano, Ana; Mathias, Donovan; Wheeler, Lorien; Close, Sigrid
2018-01-01
An asteroid entering Earth's atmosphere deposits energy along its path due to thermal ablation and dissipative forces, which can be measured by ground-based and spaceborne instruments. Inference of pre-entry asteroid properties and characterization of the atmospheric breakup are facilitated by using an analytic fragment-cloud model (FCM) in conjunction with a genetic algorithm (GA). This optimization technique is used to inversely solve for the asteroid's entry properties, such as diameter, density, strength, velocity, entry angle, and strength scaling, from simulations using the FCM. Fitness evaluation for these parameters involves minimizing the error between the physics-based calculated energy deposition and the observed meteor curves. This steady-state GA produced sets of solutions agreeing with the literature for the meteors from Chelyabinsk, Russia in 2013 and Tagish Lake, Canada in 2000, which were used as case studies to validate the optimization routine. The assisted exploration and exploitation of this multi-dimensional search space enables inference and uncertainty analysis that can inform studies of near-Earth asteroids and consequently improve risk assessment.
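The GA-based inverse fit described above can be sketched generically: a population of candidate parameter vectors is evolved to minimize the misfit between a model curve and an observed curve. The sketch below uses a deliberately simple two-parameter stand-in for the fragment-cloud model (a power-law "deposition" curve), with made-up operator settings; it illustrates the optimization loop, not the actual FCM physics.

```python
import random

def genetic_minimize(fitness, bounds, pop_size=60, gens=120, seed=1):
    """Minimal GA: truncation selection, blend crossover, Gaussian mutation.
    `bounds` is a list of (lo, hi) per parameter; minimizes `fitness`."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(gens):
        elite = sorted(pop, key=fitness)[: pop_size // 2]   # keep best half
        children = []
        while len(children) < pop_size - len(elite):
            p1, p2 = rng.sample(elite, 2)
            w = rng.random()
            child = [w * a + (1 - w) * b for a, b in zip(p1, p2)]
            if rng.random() < 0.3:                           # mutation
                i = rng.randrange(len(child))
                lo, hi = bounds[i]
                child[i] = min(hi, max(lo, child[i] + rng.gauss(0, 0.1 * (hi - lo))))
            children.append(child)
        pop = elite + children
    return min(pop, key=fitness)

# Recover two hypothetical "entry parameters" whose model curve best matches
# an observed energy-deposition-like curve (stand-in for the FCM).
target = [3.0, 0.5]
observed = [target[0] * x ** target[1] for x in range(1, 20)]
def err(params):
    model = [params[0] * x ** params[1] for x in range(1, 20)]
    return sum((m - o) ** 2 for m, o in zip(model, observed))

best = genetic_minimize(err, [(0.1, 10.0), (0.1, 2.0)])
print([round(v, 1) for v in best])
```

Because elites survive each generation, the best fitness is monotonically non-increasing, the property that makes steady-state GAs convenient for this kind of curve matching.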
Lu, Yanrong; Li, Lixiang; Peng, Haipeng; Yang, Yixian
2015-03-01
Telecare medical information systems (TMISs) enable patients to conveniently enjoy telecare services at home. The protection of patients' privacy is a key issue due to the openness of the communication environment. Authentication is typically adopted to guarantee confidential and authorized interaction between the patient and the remote server. To achieve these goals, numerous remote authentication schemes based on cryptography have been presented. Recently, Arshad et al. (J Med Syst 38(12): 2014) presented a secure and efficient three-factor authenticated key exchange scheme to remedy the weaknesses of Tan et al.'s scheme (J Med Syst 38(3): 2014). In this paper, we found that a successful off-line password attack on Arshad et al.'s scheme allows an adversary to impersonate any user of the system. In order to thwart these security attacks, an enhanced biometric and smart-card-based remote authentication scheme for TMISs is proposed. In addition, BAN logic is applied to demonstrate the completeness of the enhanced scheme. Security and performance analyses show that our enhanced scheme satisfies more security properties and incurs less computational cost than previously proposed schemes.
Low-loss curved subwavelength grating waveguide based on index engineering
Wang, Zheng; Xu, Xiaochuan; Fan, D. L.; Wang, Yaoguo; Chen, Ray T.
2016-03-01
Subwavelength grating (SWG) waveguide is an intriguing alternative to conventional optical waveguides due to its freedom to tune a few important waveguide properties such as dispersion and refractive index. Devices based on SWG waveguide have demonstrated impressive performances compared to those of conventional waveguides. However, the large loss of SWG waveguide bends jeopardizes their applications in integrated photonics circuits. In this work, we propose that a predistorted refractive index distribution in SWG waveguide bends can effectively decrease the mode mismatch noise and radiation loss simultaneously, and thus significantly reduce the bend loss. Here, we achieved the pre-distortion refractive index distribution by using trapezoidal silicon pillars. This geometry tuning approach is numerically optimized and experimentally demonstrated. The average insertion loss of a 5 μm SWG waveguide bend can be reduced drastically from 5.58 dB to 1.37 dB per 90° bend for quasi-TE polarization. In the future, the proposed approach can be readily adopted to enhance performance of an array of SWG waveguide-based photonics devices.
Radial artery pulse waveform analysis based on curve fitting using discrete Fourier series.
Jiang, Zhixing; Zhang, David; Lu, Guangming
2018-04-19
Radial artery pulse diagnosis has long played an important role in traditional Chinese medicine (TCM). Being non-invasive and convenient, pulse diagnosis also has great significance for disease analysis in modern medicine. Practitioners sense the pulse waveforms at patients' wrists and make diagnoses based on subjective personal experience. With research on pulse acquisition platforms and computerized analysis methods, objective study of pulse diagnosis can help TCM keep up with the development of modern medicine. In this paper, we propose a new method to extract features from pulse waveforms based on the discrete Fourier series (DFS). It regards the waveform as a signal consisting of a series of sub-components represented by sine and cosine signals with different frequencies and amplitudes. After the pulse signals are collected and preprocessed, we fit the average waveform for each sample with a discrete Fourier series by least squares. The feature vector comprises the coefficients of the discrete Fourier series function. Compared with a fitting method using a Gaussian mixture function, the fitting errors of the proposed method are smaller, indicating that our method represents the original signal better. The classification performance of the proposed feature is superior to other features extracted from the waveform, such as the auto-regression model and the Gaussian mixture model. The coefficients of the optimized DFS function, which is used to fit the arterial pressure waveforms, achieve better performance in modeling the waveforms and hold more potential information for distinguishing different psychological states. Copyright © 2018 Elsevier B.V. All rights reserved.
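A least-squares truncated Fourier-series fit of the kind described above amounts to solving a linear system with a cosine/sine design matrix. The sketch below fits a synthetic one-period "pulse" waveform and returns the coefficient vector, which is exactly the kind of feature vector the abstract uses; the waveform and harmonic count are illustrative, not the paper's data.

```python
import numpy as np

def fit_fourier_series(t, y, n_harmonics, period):
    """Least-squares fit of y(t) by a truncated Fourier series:
    y ≈ a0 + sum_k [a_k cos(k w t) + b_k sin(k w t)],  w = 2*pi/period.
    Returns (coefficients [a0, a1, b1, ..., aK, bK], fitted curve)."""
    w = 2.0 * np.pi / period
    cols = [np.ones_like(t)]
    for k in range(1, n_harmonics + 1):
        cols += [np.cos(k * w * t), np.sin(k * w * t)]
    A = np.column_stack(cols)
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coeffs, A @ coeffs

# Synthetic one-period waveform: baseline plus two harmonics
t = np.linspace(0.0, 1.0, 200, endpoint=False)
y = 1.0 + 0.8 * np.sin(2 * np.pi * t) + 0.3 * np.cos(4 * np.pi * t)
coeffs, y_fit = fit_fourier_series(t, y, n_harmonics=3, period=1.0)
print(round(float(np.max(np.abs(y - y_fit))), 6))  # → 0.0 (signal lies in the basis)
```

For a real pulse waveform the residual is nonzero, and the number of harmonics trades off fit quality against feature-vector length.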
2002-01-01
The aim of this experiment is to measure with precision the electromagnetic form factors of the proton in the time-like region via the reaction p̄p → e⁺e⁻ with antiprotons of momenta between 0 and 2 GeV/c. Up to ≈ 800 MeV/c, a continuous energy scan in ≈ 2 MeV (√s) bins will be performed. The form factors |G(E)| and |G(M)| will be determined separately, since large statistics can be collected with LEAR antiproton beams, so that angular distributions can be obtained at many momenta. In addition, e⁺e⁻ pairs produced via the reaction p̄p → V⁰ + neutrals, V⁰ → e⁺e⁻, where the antiprotons are at rest, will be detected, allowing the vector meson mass spectrum between ≈ 1 GeV and ≈ 1.7 GeV to be obtained with high statistics and in one run. The proposed apparatus consists of a central detector, surrounded by a gas Cerenkov counter, wire chambers, hodoscopes, and an electromagnetic calorimeter. The central detector consists of several layers of proportional chambers around a liquid-h...
International Nuclear Information System (INIS)
Pollock, M.D.
1986-01-01
We consider the (4+N)-dimensional theory whose Lagrangian function is L_{4+N} = √(−ĝ) α R̂², where R̂ is the Ricci scalar and α is a positive constant. The metric is ĝ_AB = diag(g_ab, φ⁻¹ ḡ_mn). Dimensional reduction leads to an effective four-dimensional Lagrangian of induced-gravity type. The positive semi-definiteness of L avoids the difficulties, pointed out recently by Horowitz and by Rubakov, which can arise in quantum cosmology when the (Euclidean) action becomes negative. The compactification is onto a time-like internal space ḡ_mn, as suggested by Aref'eva and Volovich, giving a four-dimensional de Sitter space-time with φ = constant, which is however classically unstable on a time scale ≈ H⁻¹. Decrease of the radius φ^(−1/2) of the internal space is ultimately halted by quantum effects, via some V(φ), and L₄ then includes the usual Hilbert term and a cosmological constant. (author)
Titration Curves: Fact and Fiction.
Chamberlain, John
1997-01-01
Discusses ways in which datalogging equipment can enable titration curves to be measured accurately and how computing power can be used to predict the shape of curves. Highlights include sources of error, use of spreadsheets to generate titration curves, titration of a weak acid with a strong alkali, dibasic acids, weak acid and weak base, and…
Directory of Open Access Journals (Sweden)
Jiang Lin
2016-01-01
The overall efficiency of PV arrays is affected by hot spots, which should be detected and diagnosed by suitable monitoring techniques. The use of IR thermal images to detect hot spots has been studied as a direct, noncontact, nondestructive technique. However, IR thermal images suffer from relatively high stochastic noise and non-uniform clutter, so conventional image processing methods are not effective. This paper proposes a method to detect hot spots based on curve fitting of the gray histogram. MATLAB simulation results show that the proposed method effectively detects hot spots while suppressing the noise generated during image acquisition.
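The idea of fitting a curve to the gray histogram can be sketched as follows: fit a single Gaussian to the histogram of a thermogram (here by moment matching, a simplification of the paper's curve fit) and flag pixels beyond mean + 3σ as hot-spot candidates. The synthetic frame, threshold rule, and parameters are illustrative assumptions, not the paper's procedure.

```python
import numpy as np

def hotspot_threshold(image, n_bins=64):
    """Fit a Gaussian to the gray-level histogram by moment matching and
    return a mean + 3*sigma detection threshold; fitting the histogram
    rather than thresholding raw pixels smooths stochastic IR noise."""
    hist, edges = np.histogram(image, bins=n_bins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    w = hist / hist.sum()
    mu = np.sum(w * centers)
    sigma = np.sqrt(np.sum(w * (centers - mu) ** 2))
    return mu + 3.0 * sigma

rng = np.random.default_rng(0)
frame = rng.normal(90.0, 2.0, size=(64, 64))   # synthetic background panel
frame[10:13, 20:23] = 130.0                    # implanted 3x3 hot spot
mask = frame > hotspot_threshold(frame)
print(int(mask.sum()))
```

A closer rendering of the paper would fit the histogram curve itself (e.g. by least squares) and pick the threshold from the fitted curve's tail.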
International Nuclear Information System (INIS)
Sriyono; Ismu Wahyono, Puradwi; Mulyanto, Dwijo; Kusmono, Siamet
2001-01-01
The main components of the Multipurpose Reactor G.A. Siwabessy have been analyzed by means of failure rate curves. The components analyzed are the pumps of the Fuel Storage Pool Purification System (AK-AP), Primary Cooling System (JE01-AP), Primary Pool Purification System (KBE01-AP), Warm Layer System (KBE02-AP), Cooling Tower (PA/D-AH), Secondary Cooling System, and the Diesel (BRV). The failure rate curves were constructed from a component database taken from the operation log book of RSG-GAS. The total operation time covered by the curves is 2500 hours. From the curves it is concluded that the failure rate of the components follows a bathtub curve; maintenance activity causes anomalies in the curve.
Law, Tameeka L; Katikaneni, Lakshmi D; Taylor, Sarah N; Korte, Jeffrey E; Ebeling, Myla D; Wagner, Carol L; Newman, Roger B
2012-07-01
To compare customized versus population-based growth curves for the identification of small-for-gestational-age (SGA) infants and body fat percentage (BF%) among preterm infants, we conducted a prospective cohort study of 204 preterm infants classified as SGA or appropriate-for-gestational-age (AGA) by population-based and customized growth curves. BF% was determined by air-displacement plethysmography. Differences between groups were compared using bivariable and multivariable linear and logistic regression analyses. Customized curves reclassified 30% of the preterm infants as SGA. SGA infants identified by the customized method only had significantly lower BF% (13.8 ± 6.0) than the AGA infants (16.2 ± 6.3, p = 0.02) and similar BF% to the SGA infants classified by both methods (14.6 ± 6.7, p = 0.51). Customized growth curves were a significant predictor of BF% (p = 0.02), whereas population-based growth curves were not a significant independent predictor of BF% (p = 0.50) at term corrected gestational age. Customized growth potential improves the differentiation of SGA infants and low BF% compared with a standard population-based growth curve among a cohort of preterm infants.
Greco, Roberto; Gargano, Rudy
2016-04-01
The evaluation of suction stress in unsaturated soils has important implications for several practical applications. Suction stress affects soil aggregate stability and soil erosion. Furthermore, the equilibrium of shallow unsaturated soil deposits along steep slopes is often possible only thanks to the contribution of suction to soil effective stress. Experimental evidence, as well as theoretical arguments, shows that suction stress is a nonlinear function of matric suction. The relationship expressing the dependence of suction stress on soil matric suction is usually called the Soil Stress Characteristic Curve (SSCC). In this study, a novel equation for the evaluation of the suction stress of an unsaturated soil is proposed, assuming that the exchange of stress between soil water and solid particles occurs only through the part of the surface of the solid particles that is in direct contact with water. The proposed equation, based only upon geometric considerations related to the soil pore-size distribution, allows the SSCC to be easily derived from the soil water retention curve (SWRC) with the assignment of two additional parameters. The first parameter, representing the projection of the external surface area of the soil onto a generic plane surface, can be reasonably estimated from the residual water content of the soil. The second parameter, denoted H0, is the water potential below which adsorption significantly contributes to water retention. For the experimental verification of the proposed approach, this parameter is treated as a fitting parameter. The proposed equation is applied to the interpretation of suction stress experimental data, taken from the literature, spanning a wide range of soil textures. The results show that in all cases the proposed relationship closely reproduces the experimental data, performing better than other currently used expressions. The results also show that the adopted values of the parameter H0
Müller, M. F.; Thompson, S. E.
2016-02-01
The prediction of flow duration curves (FDCs) in ungauged basins remains an important task for hydrologists given the practical relevance of FDCs for water management and infrastructure design. Predicting FDCs in ungauged basins typically requires spatial interpolation of statistical or model parameters. This task is complicated if climate becomes non-stationary, as the prediction challenge now also requires extrapolation through time. In this context, process-based models for FDCs that mechanistically link the streamflow distribution to climate and landscape factors may have an advantage over purely statistical methods to predict FDCs. This study compares a stochastic (process-based) and statistical method for FDC prediction in both stationary and non-stationary contexts, using Nepal as a case study. Under contemporary conditions, both models perform well in predicting FDCs, with Nash-Sutcliffe coefficients above 0.80 in 75 % of the tested catchments. The main drivers of uncertainty differ between the models: parameter interpolation was the main source of error for the statistical model, while violations of the assumptions of the process-based model represented the main source of its error. The process-based approach performed better than the statistical approach in numerical simulations with non-stationary climate drivers. The predictions of the statistical method under non-stationary rainfall conditions were poor if (i) local runoff coefficients were not accurately determined from the gauge network, or (ii) streamflow variability was strongly affected by changes in rainfall. A Monte Carlo analysis shows that the streamflow regimes in catchments characterized by frequent wet-season runoff and a rapid, strongly non-linear hydrologic response are particularly sensitive to changes in rainfall statistics. In these cases, process-based prediction approaches are favored over statistical models.
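For reference, the empirical flow duration curve that both methods above aim to predict is simply the sorted streamflow record plotted against exceedance probability. The sketch below builds an FDC from synthetic daily flows using the Weibull plotting position and reads off exceedance quantiles; the flow series is made up, not Nepalese gauge data.

```python
import numpy as np

def flow_duration_curve(flows):
    """Empirical FDC: flows sorted in descending order vs exceedance
    probability, using the Weibull plotting position p_i = i / (n + 1)."""
    q = np.sort(np.asarray(flows, float))[::-1]
    p = np.arange(1, q.size + 1) / (q.size + 1.0)
    return p, q

def quantile_from_fdc(p, q, prob):
    """Flow exceeded `prob` fraction of the time (e.g. Q95 -> prob=0.95)."""
    return float(np.interp(prob, p, q))

# Synthetic daily streamflow: lognormal, a common idealization for FDC work
rng = np.random.default_rng(42)
flows = np.exp(rng.normal(2.0, 0.8, size=365))
p, q = flow_duration_curve(flows)
q95 = quantile_from_fdc(p, q, 0.95)
q50 = quantile_from_fdc(p, q, 0.50)
print(q95 < q50)  # → True (low flows are exceeded more often)
```

The statistical and process-based approaches compared in the abstract differ in how they predict this curve where no gauge record exists.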
International Nuclear Information System (INIS)
Rothman, Dale S.
1998-01-01
Recent research has examined the hypothesis of an environmental Kuznets curve (EKC): the notion that environmental impact increases in the early stages of development and declines in the later stages. These studies have focused on the relationship between per capita income and a variety of environmental indicators, and their results imply that EKCs may exist in a number of cases. However, the measures of environmental impact used generally focus on production processes and reflect environmental impacts that are local in nature and for which abatement is relatively inexpensive in terms of monetary costs and/or lifestyle changes. Significantly, more consumption-based measures, such as CO 2 emissions and municipal waste, for which impacts are relatively easy to externalize or costly to control, show no tendency to decline with increasing per capita income. By considering consumption and trade patterns, the author re-examines the concept of the EKC and proposes the use of alternative, consumption-based measures of environmental impact. The author speculates that what appear to be improvements in environmental quality may in reality be indicators of the increased ability of consumers in wealthy nations to distance themselves from the environmental degradation associated with their consumption
Jacobsen, Anna L; Pratt, R Brandon
2012-06-01
Vulnerability to cavitation curves are used to estimate xylem cavitation resistance and can be constructed using multiple techniques. It was recently suggested that a technique that relies on centrifugal force to generate negative xylem pressures may be susceptible to an open vessel artifact in long-vesselled species. Here, we used custom centrifuge rotors to measure different sample lengths of 1-yr-old stems of grapevine to examine the influence of open vessels on vulnerability curves, thus testing the hypothesized open vessel artifact. These curves were compared with a dehydration-based vulnerability curve. Although samples differed significantly in the number of open vessels, there was no difference in the vulnerability to cavitation measured on 0.14- and 0.271-m-long samples of Vitis vinifera. Dehydration and centrifuge-based curves showed a similar pattern of declining xylem-specific hydraulic conductivity (K(s)) with declining water potential. The percentage loss in hydraulic conductivity (PLC) differed between dehydration and centrifuge curves and it was determined that grapevine is susceptible to errors in estimating maximum K(s) during dehydration because of the development of vessel blockages. Our results from a long-vesselled liana do not support the open vessel artifact hypothesis. © 2012 The Authors. New Phytologist © 2012 New Phytologist Trust.
Somoskeöy, Szabolcs; Tunyogi-Csapó, Miklós; Bogyó, Csaba; Illés, Tamás
2012-10-01
For many decades, visualization and evaluation of three-dimensional (3D) spinal deformities have only been possible by two-dimensional (2D) radiodiagnostic methods, and as a result, characterization and classification were based on 2D terminologies. Recent developments in medical digital imaging and 3D visualization techniques, including surface 3D reconstructions, opened the way for a long-sought change in this field. Supported by a 3D Terminology on Spinal Deformities of the Scoliosis Research Society, an approach for 3D measurements and a new 3D classification of scoliosis yielded several compelling concepts on 3D visualization and new proposals for 3D classification in recent years. More recently, a new proposal for visualization and complete 3D evaluation of the spine by 3D vertebra vectors has been introduced by our workgroup, a concept based on EOS 2D/3D, a groundbreaking ultralow-radiation-dose integrated orthopedic imaging device with sterEOS 3D spine reconstruction software. The aim was to compare the accuracy, correlation of measurement values, and intraobserver and interrater reliability of conventional manual 2D and vertebra vector-based 3D measurements in a routine clinical setting. Retrospective, nonrandomized study of diagnostic X-ray images created as part of a routine clinical protocol of eligible patients examined at our clinic during a 30-month period between July 2007 and December 2009. In total, 201 individuals (170 females, 31 males; mean age, 19.88 years), including 10 healthy athletes with normal spine and patients with adolescent idiopathic scoliosis (175 cases), adult degenerative scoliosis (11 cases), and Scheuermann hyperkyphosis (5 cases). The overall range of coronal curves was between 2.4° and 117.5°. Analysis of accuracy and reliability of measurements was carried out on a group of all patients and in subgroups based on coronal plane deviation: 0 to 10° (Group 1; n=36), 10 to 25° (Group 2; n=25), 25 to 50° (Group 3; n=69), 50 to 75
International Nuclear Information System (INIS)
Rochedo, Pedro R.R.; Szklo, Alexandre
2013-01-01
Highlights: • This work defines the minimum work of separation (MWS) for a capture process. • Findings of the analysis indicated a MWS of 0.158 GJ/t for post-combustion. • A review of commercially available processes based on chemical absorption was made. • A review of learning models was conducted, with the addition of a novel model. • A learning curve for post-combustion carbon capture was successfully designed. - Abstract: Carbon capture is one of the most important alternatives for mitigating greenhouse gas emissions in energy facilities. The post-combustion route based on chemical absorption with amine solvents is the most feasible alternative for the short term. However, this route implies huge energy penalties, mainly related to solvent regeneration. By defining the minimum work of separation (MWS), this study estimated the minimum energy required to capture the CO2 emitted by coal-fired thermal power plants. Then, by evaluating solvents and processes and comparing them to the MWS, it proposes the learning model with the best fit for the post-combustion chemical absorption of CO2. Learning models are based on gains from experience, which can include the intensity of research and development. In this study, three models are tested: Wright, DeJong, and D and L. Findings of the thermochemical analysis indicated a MWS of 0.158 GJ/t for post-combustion. Conventional solvents currently present an energy penalty eight times the MWS. By using the MWS as a constraint, this study found that the D and L model provided the best fit to the available data on chemical solvents and absorption plants. The learning rate determined through this model is very similar to the ones found in the literature
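Of the learning models mentioned, Wright's is the simplest: unit cost falls as a power law of cumulative production, and the learning rate is the fractional cost reduction per doubling. A hedged sketch of fitting it by log-log regression; the data series here is synthetic, not the study's:

```python
import numpy as np

def fit_wright(cum_production, unit_cost):
    """Fit Wright's model  cost = C1 * x**(-b)  by a log-log linear
    regression; the implied learning rate is 1 - 2**(-b)."""
    slope, intercept = np.polyfit(np.log(cum_production),
                                  np.log(unit_cost), 1)
    b = -slope
    return np.exp(intercept), b, 1.0 - 2.0 ** (-b)

# Synthetic cost series generated with a 20% learning rate.
x = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
cost = 100.0 * x ** np.log2(0.8)   # exponent -b with b = -log2(0.8)
c1, b, lr = fit_wright(x, cost)
```

The DeJong and D and L models extend this form (e.g. with a floor on achievable cost); the fitting approach via least squares is analogous.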
Energy Technology Data Exchange (ETDEWEB)
Chhipa, Mayur Kumar, E-mail: mayurchhipa1@gmail.com [Deptt. of Electronics and Communication Engineering, Government Engineering College Ajmer Rajasthan INDIA (India); Dusad, Lalit Kumar [Rajasthan Technical University Kota, Rajasthan (India)
2016-05-06
In this paper a channel drop filter (CDF) is designed using a dual curved photonic crystal ring resonator (PCRR). The photonic band gap (PBG) is calculated by the plane wave expansion (PWE) method, and the photonic crystal (PhC), based on two-dimensional (2D) square-lattice periodic arrays of silicon (Si) rods in air, has been investigated using the finite difference time domain (FDTD) method. The number of rods in the Z and X directions is 21 and 20, respectively, with lattice constant 0.540 µm and rod radius r = 0.1 µm. The channel drop filter has been optimized for the telecommunication wavelength λ = 1.591 µm with refractive index 3.533. Further analysis of the designed structure shows that, by changing the refractive index of all rods, this filter may also be used to filter several other channels. The designed structure is useful for CWDM systems. This device may serve as a key component in photonic integrated circuits. The device is ultra-compact, with an overall size of around 123 µm².
Directory of Open Access Journals (Sweden)
Yuanyuan Zhang
2015-01-01
Since the concept of ubiquitous computing was first proposed by Mark Weiser, its connotation has been extended and expanded by many scholars. In pervasive computing application environments, many kinds of small devices containing smart cards are used to communicate with others. In 2013, Yang et al. proposed an enhanced authentication scheme using smart cards for digital rights management and demonstrated that their scheme is secure enough. However, Mishra et al. pointed out that Yang et al.'s scheme suffers from the password guessing attack and the denial of service attack. Moreover, they also demonstrated that Yang et al.'s scheme is not efficient enough when the user inputs an incorrect password. In this paper, we analyze Yang et al.'s scheme again and find that it is vulnerable to the session key attack; in addition, there are some mistakes in their scheme. To overcome the weaknesses of Yang et al.'s scheme, we propose a more efficient and provably secure digital rights management authentication scheme using smart cards based on elliptic curve cryptography.
International Nuclear Information System (INIS)
Neij, Lena
2008-01-01
Technology foresight studies have become an important tool in identifying realistic ways of reducing the impact of modern energy systems on the climate and the environment. Studies on the future cost development of advanced energy technologies are of special interest. One approach widely adopted for the analysis of future cost is the experience curve approach. The question is, however, how robust this approach is, and which experience curves should be used in energy foresight analysis. This paper presents an analytical framework for the analysis of future cost development of new energy technologies for electricity generation; the analytical framework is based on an assessment of available experience curves, complemented with bottom-up analysis of sources of cost reductions and, for some technologies, judgmental expert assessments of long-term development paths. The results of these three methods agree in most cases, i.e. the cost (price) reductions described by the experience curves match the incremental cost reduction described in the bottom-up analysis and the judgmental expert assessments. For some technologies, the bottom-up analysis confirms large uncertainties in future cost development not captured by the experience curves. Experience curves with a learning rate ranging from 0% to 20% are suggested for the analysis of future cost development
Local differential geometry of null curves in conformally flat space-time
International Nuclear Information System (INIS)
Urbantke, H.
1989-01-01
The conformally invariant differential geometry of null curves in conformally flat space-times is given, using the six-vector formalism, which has generalizations to higher dimensions. This is then paralleled by a twistor description, with a twofold merit: firstly, the description is sometimes easier in twistor terms and sometimes in six-vector terms, which leads to a mutual enlightenment of both; and secondly, the case of null curves in timelike pseudospheres or 2+1 Minkowski space could only be treated twistorially, making use of an invariant differential found by Fubini and Cech. The result is the expected one: apart from stated exceptional cases, there is a conformally invariant parameter and two conformally invariant curvatures which, when specified in terms of this parameter, serve to characterize the curve up to conformal transformations. 12 refs. (Author)
Measurement of time-like baryon electro-magnetic form factors in BESIII
Energy Technology Data Exchange (ETDEWEB)
Morales Morales, Cristina; Dbeyssi, Alaa [Helmholtz-Institut Mainz (Germany); Ahmed, Samer Ali Nasher; Lin, Dexu; Rosner, Christoph; Wang, Yadi [Helmholtz-Institut Mainz (Germany); Institut fuer Kernphysik, Johannes Gutenberg-Universitaet Mainz (Germany); Maas, Frank [Helmholtz-Institut Mainz (Germany); Institut fuer Kernphysik, Johannes Gutenberg-Universitaet Mainz (Germany); PRISMA Cluster of Excellence, Johannes Gutenberg-Universitaet Mainz (Germany); Collaboration: BESIII-Collaboration
2016-07-01
BEPCII is a symmetric electron-positron collider located in Beijing, running at center-of-mass energies between 2.0 and 4.6 GeV. This energy range allows the BESIII experiment to measure baryon form factors both from direct electron-positron annihilation and from initial state radiation processes. We present results on direct electron-positron annihilation into proton anti-proton and preliminary results on direct electron-positron annihilation into lambda anti-lambda, based on data collected by BESIII in 2011 and 2012. Finally, expectations for the measurement of nucleon and hyperon electro-magnetic form factors from the BESIII high-luminosity energy scan in 2015 and from initial state radiation processes at different center-of-mass energies are also shown.
IGMtransmission: Transmission curve computation
Harrison, Christopher M.; Meiksin, Avery; Stock, David
2015-04-01
IGMtransmission is a Java graphical user interface that implements Monte Carlo simulations to compute the corrections to colors of high-redshift galaxies due to intergalactic attenuation based on current models of the Intergalactic Medium. The effects of absorption due to neutral hydrogen are considered, with particular attention to the stochastic effects of Lyman Limit Systems. Attenuation curves are produced, as well as colors for a wide range of filter responses and model galaxy spectra. Photometric filters are included for the Hubble Space Telescope, the Keck telescope, the Mt. Palomar 200-inch, the SUBARU telescope and UKIRT; alternative filter response curves and spectra may be readily uploaded.
Directory of Open Access Journals (Sweden)
Ping Wan
2016-08-01
Driving anger, called "road rage", has become increasingly common nowadays, affecting road safety. A few studies have focused on how to identify driving anger; however, there is still a gap in driving anger grading, especially in real traffic environments, which would make it possible to take intervention measures matched to different anger intensities. This study proposes a method for discriminating driving anger states of different intensity based on electroencephalogram (EEG) spectral features. First, thirty drivers were recruited to conduct on-road experiments on a busy route in Wuhan, China, where anger could be induced by various road events, e.g., vehicles weaving/cutting in line, jaywalking/cyclist crossing, traffic congestion, and waiting at red lights; drivers received extra pay if they completed the experiments ahead of the basic time. Subsequently, significance analysis was used to select the relative energy spectrum of the β band (β%) and the relative energy spectrum of the θ band (θ%) for discriminating the different driving anger states. Finally, according to receiver operating characteristic (ROC) curve analysis, the optimal thresholds (best cut-off points) of β% and θ% for identifying the non-anger state (i.e., neutral) were determined to be 0.2183 ≤ θ% < 1, 0 < β% < 0.2586; the low anger state is 0.1539 ≤ θ% < 0.2183, 0.2586 ≤ β% < 0.3269; the moderate anger state is 0.1216 ≤ θ% < 0.1539, 0.3269 ≤ β% < 0.3674; the high anger state is 0 < θ% < 0.1216, 0.3674 ≤ β% < 1. Moreover, the discrimination performance on verification data indicates that the overall accuracy (Acc) of the optimal thresholds of β% for discriminating the four driving anger states is 80.21%, while that of θ% is 75.20%. The results can provide a theoretical foundation for developing driving anger detection or warning devices based on the relevant optimal thresholds.
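The reported β% cut-offs amount to a simple four-way threshold rule. A sketch of a classifier using exactly the cut-off values quoted in the abstract (the function name and interface are ours, not the paper's):

```python
def anger_state_from_beta(beta_rel):
    """Map relative beta-band energy (β%) to a driving anger state
    using the optimal ROC cut-offs quoted in the abstract:
    0.2586, 0.3269 and 0.3674."""
    if not 0.0 < beta_rel < 1.0:
        raise ValueError("β% must lie strictly between 0 and 1")
    if beta_rel < 0.2586:
        return "neutral"
    if beta_rel < 0.3269:
        return "low"
    if beta_rel < 0.3674:
        return "moderate"
    return "high"
```

A deployed detector would presumably combine the β% and θ% rules, since the abstract reports a separate (and less accurate) θ% threshold set.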
International Nuclear Information System (INIS)
Quenzer, A.
1977-01-01
The pion form factor is measured in the reaction e⁺e⁻ → π⁺π⁻ for center-of-mass energies in the range 480-900 MeV. The results are first analysed in terms of the conventional Vector Meson Dominance formalism, and then taking into account the ωπ inelastic channel. The result of this latter formalism is a pion form factor (F) which fits quite well all the existing data on F in both the timelike and spacelike regions, and a pion mean square radius [fr
International Nuclear Information System (INIS)
Célerier, Marie-Noëlle; Szekeres, Peter
2002-01-01
Extending the study of spherically symmetric metrics satisfying the dominant energy condition and exhibiting singularities of power-law type, initiated by Szekeres and Iyer, we identify two classes of particular interest: focusing timelike singularity solutions with the stress-energy tensor of a radiative perfect fluid (equation of state: p=(1/3)ρ) and a set of null singularity classes verifying identical properties. We consider two important applications of these results: to cosmology, as regards the possibility of solving the horizon problem with no need to resort to any inflationary scenario, and to the strong cosmic censorship hypothesis, to which we propose a class of physically consistent counterexamples
Mazza, Fabio
2017-08-01
The curved surface sliding (CSS) system is one of the most in-demand techniques for the seismic isolation of buildings; yet there are still important aspects of its behaviour that need further attention. The CSS system presents variation of friction coefficient, depending on the sliding velocity of the CSS bearings, while friction force and lateral stiffness during the sliding phase are proportional to the axial load. Lateral-torsional response needs to be better understood for base-isolated structures located in near-fault areas, where fling-step and forward-directivity effects can produce long-period (horizontal) velocity pulses. To analyse these aspects, a six-storey reinforced concrete (r.c.) office framed building, with an L-shaped plan and setbacks in elevation, is designed assuming three values of the radius of curvature for the CSS system. Seven in-plan distributions of dynamic-fast friction coefficient for the CSS bearings, ranging from a constant value for all isolators to a different value for each, are considered in the case of low- and medium-type friction properties. The seismic analysis of the test structures is carried out considering an elastic-linear behaviour of the superstructure, while a nonlinear force-displacement law of the CSS bearings is considered in the horizontal direction, depending on sliding velocity and axial load. Given the lack of knowledge of the horizontal direction at which near-fault ground motions occur, the maximum torsional effects and residual displacements are evaluated with reference to different incidence angles, while the orientation of the strongest observed pulses is considered to obtain average values.
Directory of Open Access Journals (Sweden)
Iman Eshraghi
2016-09-01
Imperfection sensitivity of large-amplitude vibration of curved single-walled carbon nanotubes (SWCNTs) is considered in this study. The SWCNT is modeled as a Timoshenko nano-beam and its curved shape is included as an initial geometric imperfection term in the displacement field. Geometric nonlinearities of von Kármán type and the nonlocal elasticity theory of Eringen are employed to derive the governing equations of motion. Spatial discretization of the governing equations and associated boundary conditions is performed using the differential quadrature (DQ) method, and the corresponding nonlinear eigenvalue problem is iteratively solved. Effects of the amplitude and location of the geometric imperfection, and of the nonlocal small-scale parameter, on the nonlinear frequency for various boundary conditions are investigated. The results show that geometric imperfection and non-locality play a significant role in the nonlinear vibration characteristics of curved SWCNTs.
Lavini, Cristina; Verhoeff, Joost J. C.; Majoie, Charles B.; Stalpers, Lukas J. A.; Richel, Dick J.; Maas, Mario
2011-01-01
To compare time intensity curve (TIC)-shape analysis of dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) data with model-based analysis and semiquantitative analysis in patients with high-grade glioma treated with the antiangiogenic drug bevacizumab. Fifteen patients had a pretreatment
Wafa Chouaib; Peter V. Caldwell; Younes Alila
2018-01-01
This paper advances the physical understanding of the flow duration curve (FDC) regional variation. It provides a process-based analysis of the interaction between climate and landscape properties to explain disparities in FDC shapes. We used (i) long term measured flow and precipitation data over 73 catchments from the eastern US. (ii) We calibrated the...
Knoope, M.M.J.; Meerman, J.C.; Ramirez, C.A.; Faaij, A.P.C.
2013-01-01
This study aims to investigate the technological and economic prospects of integrated gasification facilities for power (IGCC) and Fischer–Tropsch (FT) liquid production with and without CCS over time. For this purpose, a component based experience curve was constructed and applied to identify the
International Nuclear Information System (INIS)
Moon, Won Joo; Min, Oak Key; Kim, Yong Woo
1998-01-01
To improve the convergence and the accuracy of a finite element, the finite element has to describe not only displacement and stress distributions in a static analysis but also rigid body displacements. In this paper, we consider the in-plane-deformable curved beam element to understand the descriptive capability of rigid body displacements of a finite element. We derive the rigid body displacement fields of a single finite element under various essential boundary conditions when the nodal displacements are caused by the rigid body displacement. We also examine the rigid body displacement fields of a quadratic curved beam element by employing the reduced minimization theory
Lagrangian Curves on Spectral Curves of Monopoles
International Nuclear Information System (INIS)
Guilfoyle, Brendan; Khalid, Madeeha; Ramon Mari, Jose J.
2010-01-01
We study Lagrangian points on smooth holomorphic curves in TP^1 equipped with a natural neutral Kaehler structure, and prove that they must form real curves. By virtue of the identification of TP^1 with the space LE^3 of oriented affine lines in Euclidean 3-space, these Lagrangian curves give rise to ruled surfaces in E^3, which we prove have zero Gauss curvature. Each ruled surface is shown to be the tangent lines to a curve in E^3, called the edge of regression of the ruled surface. We give an alternative characterization of these curves as the points in E^3 where the number of oriented lines in the complex curve Σ that pass through the point is less than the degree of Σ. We then apply these results to the spectral curves of certain monopoles and construct the ruled surfaces and edges of regression generated by the Lagrangian curves.
Payande, Abolfazl; Tabesh, Hamed; Shakeri, Mohammad Taghi; Saki, Azadeh; Safarian, Mohammad
2013-01-14
Growth charts are widely used to assess children's growth status and can provide a trajectory of growth during the early important months of life. The objective of this study is to construct growth charts and normal values of weight-for-age for children aged 0 to 5 years using a powerful and applicable methodology, and to compare the results with the World Health Organization (WHO) references and the semi-parametric LMS method of Cole and Green. A total of 70737 apparently healthy boys and girls aged 0 to 5 years were recruited in July 2004 for 20 days from those attending community clinics for routine health checks as part of a national survey. Anthropometric measurements were done by trained health staff using WHO methodology. The nonparametric quantile regression method, based on local constant kernel estimation of conditional quantile curves, was used for estimation of the curves and normal values. The weight-for-age growth curves for boys and girls aged 0 to 5 years were derived from a population of children living in the northeast of Iran. The results were similar to those obtained by the semi-parametric LMS method on the same data. Among all age groups from 0 to 5 years, the median weights of children living in the northeast of Iran were lower than the corresponding values in the WHO reference data. The weight curves of boys were higher than those of girls in all age groups. The differences between the growth patterns of children living in the northeast of Iran and international references necessitate using local and regional growth charts; international normal values may not properly identify the populations at risk for growth problems among Iranian children. Quantile regression (QR), as a flexible method which does not require restrictive assumptions, is proposed for estimating reference curves and normal values.
Walker, Judy L
2000-01-01
When information is transmitted, errors are likely to occur. Coding theory examines efficient ways of packaging data so that these errors can be detected, or even corrected. The traditional tools of coding theory have come from combinatorics and group theory. Lately, however, coding theorists have added techniques from algebraic geometry to their toolboxes. In particular, by re-interpreting the Reed-Solomon codes, one can see how to define new codes based on divisors on algebraic curves. For instance, using modular curves over finite fields, Tsfasman, Vladut, and Zink showed that one can define a sequence of codes with asymptotically better parameters than any previously known codes. This monograph is based on a series of lectures the author gave as part of the IAS/PCMI program on arithmetic algebraic geometry. Here, the reader is introduced to the exciting field of algebraic geometric coding theory. Presenting the material in the same conversational tone of the lectures, the author covers linear codes, inclu...
International Nuclear Information System (INIS)
Kim, Woo Gon; Yin, Song Nan; Kim, Yong Wan
2008-01-01
Alloy 617 is a principal candidate alloy for high temperature gas-cooled reactor (HTGR) components because of its high creep rupture strength, coupled with its good corrosion behavior in simulated HTGR helium and its sufficient workability. To describe a creep strain-time curve well, various constitutive equations have been proposed by Kachanov-Rabotnov, Andrade, Garofalo, Evans, Maruyama, and others. Among them, the K-R model has been used frequently, because it adequately considers both a secondary creep, resulting from a balance between softening and hardening of the material, and a tertiary creep, resulting from the appearance and acceleration of internal or external damage processes. In the case of nickel-base alloys, it has been reported that a tertiary creep at a low strain range may be generated, and this tertiary stage may govern the total creep deformation. Therefore, a creep curve for nickel-based Alloy 617 can be predicted appropriately by using the K-R model, which can reflect a tertiary creep. In this paper, the long-term creep curves for Alloy 617 were predicted by using the nonlinear least squares fitting (NLSF) method with the K-R model. A modified K-R model was introduced to fit the full creep curves well. The values of the λ and K parameters in the modified K-R model were obtained with stresses
Modeling the CO2 emissions and energy saved from new energy vehicles based on the logistic-curve
International Nuclear Information System (INIS)
Tang, Bao-jun; Wu, Xiao-feng; Zhang, Xian
2013-01-01
The Chinese government has outlined plans for developing new energy vehicles (NEVs) to achieve energy conservation and emission reduction. This paper used a logistic curve to predict the market share of NEVs in the next decade, and then calculated the potential environmental benefits of each car and of the total fleet according to the IPCC (2006) report. The results indicated that NEVs are beneficial in achieving the above goals, particularly electric vehicles (EVs). However, they will have a limited impact in the short term. Finally, considering the empirical results and the Chinese reality, this paper proposed corresponding recommendations. - Highlights: ► This paper predicted the number of vehicles in China. ► This paper used a logistic curve to predict the market share of NEVs. ► The potential environmental benefits of each car and of the total fleet were calculated. ► China's NEVs would produce more CO2 than those of other countries
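A logistic market-share projection of the kind described can be fitted by a simple linearization when the saturation level K is taken as known. A sketch under that assumption; the saturation value and data below are illustrative, not the paper's:

```python
import numpy as np

def fit_logistic(t, share, saturation):
    """Fit s(t) = K / (1 + exp(-r*(t - t0))) for a known saturation K
    via the linearisation  ln(s / (K - s)) = r*t - r*t0."""
    y = np.log(share / (saturation - share))
    r, c = np.polyfit(t, y, 1)
    return r, -c / r   # growth rate r and midpoint time t0

# Illustrative series approaching a 30% market-share ceiling.
t = np.arange(10.0)
true_r, true_t0, k = 0.5, 5.0, 0.30
s = k / (1.0 + np.exp(-true_r * (t - true_t0)))
r, t0 = fit_logistic(t, s, k)
```

When K must also be estimated, the fit becomes a nonlinear least-squares problem, but the S-shaped adoption assumption is the same.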
Directory of Open Access Journals (Sweden)
Yu Xiu-Juan
2007-10-01
Background: The nucleotide compositional asymmetry between the leading and lagging strands in bacterial genomes has been the subject of intensive study in the past few years. It is interesting to mention that almost all bacterial genomes exhibit the same kind of base asymmetry. This work aims to investigate the strand biases in the Chlamydia muridarum genome and show the potential of the Z curve method for quantitatively differentiating genes on the leading and lagging strands. Results: The occurrence frequencies of bases of protein-coding genes in the C. muridarum genome were analyzed by the Z curve method. It was found that genes located on the two strands of replication have distinct base usages in the C. muridarum genome. According to their positions in the 9-D space spanned by the variables u1-u9 of the Z curve method, the K-means clustering algorithm can assign about 94% of genes to the correct strands, which is a few percent higher than the proportion correctly classified by K-means based on the RSCU. The base usage and codon usage analyses show that genes on the leading strand have more G than C and more T than A, particularly at the third codon position; for genes on the lagging strand the biases are reversed. The y component of the Z curves for the complete chromosome sequences shows that the excess of G over C and T over A is more remarkable in the C. muridarum genome than in other bacterial genomes without separating base and/or codon usages. Furthermore, for the genomes of Borrelia burgdorferi, Treponema pallidum, Chlamydia muridarum and Chlamydia trachomatis, in which distinct base and/or codon usages have been observed, closer phylogenetic distances are found compared with other bacterial genomes. Conclusion: The nature of the strand biases of base composition in C. muridarum is similar to that in most other bacterial genomes. However, the base composition asymmetry between the leading and lagging strands in C. muridarum is more significant than that in
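The Z curve transform underlying this analysis maps a DNA sequence to three cumulative components. A minimal sketch of the standard decomposition (purine/pyrimidine, amino/keto, weak/strong hydrogen bonding); this is the textbook transform, not code from the paper:

```python
def z_curve(seq):
    """Cumulative Z-curve components of a DNA sequence:
    x: purines (A,G) minus pyrimidines (C,T)
    y: amino bases (A,C) minus keto bases (G,T)
    z: weak (A,T) minus strong (G,C) hydrogen bonding."""
    a = c = g = t = 0
    xs, ys, zs = [], [], []
    for base in seq.upper():
        a += base == "A"
        c += base == "C"
        g += base == "G"
        t += base == "T"
        xs.append((a + g) - (c + t))
        ys.append((a + c) - (g + t))
        zs.append((a + t) - (g + c))
    return xs, ys, zs
```

The y component, singled out in the abstract, grows where A+C exceeds G+T, so a GC-skewed leading strand drives it downward.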
Czech Academy of Sciences Publication Activity Database
Nold, A.; Malijevský, Alexandr; Kalliadasis, S.
2011-01-01
Vol. 197, No. 1 (2011), pp. 185-191. ISSN 1951-6355. R&D Projects: GA AV ČR IAA400720710. Grants - others: EPSRC(GB) EP/E046029; FP7 ITN(XE) 214919; ERC(XE) 247301. Institutional research plan: CEZ:AV0Z40720504. Keywords: wetting phenomena * curved substrates * theory. Subject RIV: CF - Physical; Theoretical Chemistry. Impact factor: 1.562, year: 2011
Directory of Open Access Journals (Sweden)
Ismed Jauhar
2016-03-01
With the many changes occurring in the environment, disasters, whether natural or man-made, can occur. One of the efforts made to prevent disasters is to build a system able to provide information about the status of the surrounding environment. Developments in sensor systems make it possible to deploy a system that supplies real-time data on environmental conditions with a good security system. This study created a system that supplies status data on environmental conditions, especially on bridges, using a Ubiquitous Sensor Network. An accelerometer is used as the sensor to detect vibrations. Data exchange between sensors and servers uses the ZigBee communication protocol, in which data communication is secured using the Elliptic Curve Integrated Encryption Scheme (ECIES) and the Elliptic Curve Menezes-Qu-Vanstone (ECMQV) key agreement. Test results show a communication distance limit of 55 meters, with computation times for encryption and decryption of 97 and 42 seconds; the extra time for key exchange is incurred at the beginning of communication. Keywords: Ubiquitous Sensor Network, Accelerometer, ZigBee, Elliptic Curve Menezes-Qu-Vanstone
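The elliptic-curve key agreement at the heart of ECMQV reduces, in its simplest unauthenticated (plain Diffie-Hellman) form, to both parties computing the same scalar multiple of a shared base point. A toy sketch over a tiny textbook curve, y² = x³ + 2x + 2 (mod 17) with generator (5,1) of order 19; this is insecure and for illustration only, and ECMQV adds implicit authentication with static keys on top of this core:

```python
# Toy elliptic-curve key agreement over y^2 = x^3 + 2x + 2 (mod 17).
P_MOD, A = 17, 2
G = (5, 1)  # generator of order 19

def ec_add(p, q):
    """Add two points; None represents the point at infinity."""
    if p is None:
        return q
    if q is None:
        return p
    (x1, y1), (x2, y2) = p, q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None
    if p == q:
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD) % P_MOD
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD
    x3 = (lam * lam - x1 - x2) % P_MOD
    return (x3, (lam * (x1 - x3) - y1) % P_MOD)

def ec_mul(k, p):
    """Scalar multiplication by double-and-add."""
    acc = None
    while k:
        if k & 1:
            acc = ec_add(acc, p)
        p = ec_add(p, p)
        k >>= 1
    return acc

# Each party picks a secret scalar; both derive the same shared point.
alice_priv, bob_priv = 3, 7
shared_a = ec_mul(alice_priv, ec_mul(bob_priv, G))
shared_b = ec_mul(bob_priv, ec_mul(alice_priv, G))
assert shared_a == shared_b
```

Production deployments use standardized curves of roughly 256-bit order, and ECMQV combines static and ephemeral key pairs so that each party implicitly authenticates the other.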
Rasner, P I; Pushkar', D Iu; Kolontarev, K B; Kotenkov, D V
2014-01-01
The appearance of a new surgical technique always requires evaluation of its effectiveness and ease of acquisition. A comparative study was conducted of the results of the first three series of consecutive robot-assisted radical prostatectomies (RARP), performed at the time by three surgeons. Each series consisted of 40 procedures and was divided into 4 groups of 10 operations for the analysis. When comparing data, statistically significant improvement of intra- and postoperative performance was revealed in each series as the number of operations performed increased, and in each subsequent series compared with the preceding one. We recommend performing the planned conversion at the first operation. In our study, previous laparoscopic experience did not provide any significant advantages in the acquisition of robot-assisted technology. To characterize the individual learning curve, we recommend using the number of operations that the surgeon observed in the live-surgery regimen and/or in which he participated as an assistant before his own surgical activity, as well as the indicator "technical defect". In addition to the term "individual learning curve", we propose to introduce the terms "surgeon's individual training phase" and "clinic's learning curve".
International Nuclear Information System (INIS)
Shi, F; Tian, Z; Jia, X; Jiang, S; Zarepisheh, M; Cervino, L
2014-01-01
Purpose: In treatment plan optimization for Intensity Modulated Radiation Therapy (IMRT), after a plan is initially developed by a dosimetrist, the attending physician evaluates its quality and often would like to improve it. As opposed to having the dosimetrist implement the improvements, it is desirable to have the physician directly and efficiently modify the plan for a more streamlined and effective workflow. In this project, we developed an interactive optimization system for physicians to conveniently and efficiently fine-tune iso-dose curves. Methods: An interactive interface is developed under C++/Qt. The physician first examines iso-dose lines. S/he then picks an iso-dose curve to be improved and drags it to a more desired configuration using a computer mouse or touchpad. Once the mouse is released, a voxel-based optimization engine is launched. The weighting factors corresponding to voxels between the iso-dose lines before and after the dragging are modified. The underlying algorithm then takes these factors as input to re-optimize the plan in near real-time on a GPU platform, yielding a new plan best matching the physician's desire. The re-optimized DVHs and iso-dose curves are then updated for the next iteration of modifications. This process is repeated until a physician satisfactory plan is achieved. Results: We have tested this system for a series of IMRT plans. Results indicate that our system provides the physicians an intuitive and efficient tool to edit the iso-dose curves according to their preference. The input information is used to guide plan re-optimization, which is achieved in near real-time using our GPU-based optimization engine. Typically, a satisfactory plan can be developed by a physician in a few minutes using this tool. Conclusion: With our system, physicians are able to manipulate iso-dose curves according to their preferences. Preliminary results demonstrate the feasibility and effectiveness of this tool
Directory of Open Access Journals (Sweden)
Bo You
2015-01-01
In order to predict the pressing quality of precision press-fit assembly, press-fit curves and the maximum press-mounting force of press-fit assemblies were investigated by finite element analysis (FEA). The analysis was based on a 3D SolidWorks model using the real dimensions of the microparts and a subsequent FEA model built in ANSYS Workbench. The press-fit process could thus be simulated on the basis of static structural analysis. To verify the FEA results, experiments were carried out using a press-mounting apparatus. The results show that the press-fit curves obtained by FEA agree closely with the curves obtained experimentally. In addition, the maximum press-mounting force calculated by FEA agrees with that obtained by the experimental method, with the maximum deviation being 4.6%, a value that can be tolerated. The comparison shows that the press-fit curve and maximum press-mounting force calculated by FEA can be used to predict the pressing quality during precision press-fit assembly.
Directory of Open Access Journals (Sweden)
Constantinos A. Tsipis
2010-03-01
The NICSzz-scan curves of aromatic organic, inorganic and “all-metal” molecules, in conjunction with symmetry-based selection rules, provide efficient diagnostic tools of σ-, π- and/or double (σ + π)-aromaticity. The NICSzz-scan curves of σ-aromatic molecules are symmetric around the z-axis, having half-band widths of approximately less than 3 Å, with the induced diatropic ring current arising from Tx,y-allowed transitions involving exclusively σ-type molecular orbitals. Broad NICSzz-scan curves (half-band width approximately higher than 3 Å) characterize double (σ + π)-aromaticity, the chief contribution to the induced diatropic ring current arising from Tx,y-allowed transitions involving both σ- and π-type molecular orbitals. NICSzz-scan curves exhibiting two maxima at a certain distance above and below the molecular plane are typical of (σ + π)-aromatics where the π-diatropic ring current overwhelms the σ-type one. In the absence of any contribution from the σ-diatropic ring current, the NICSzz(0) value is close to zero and the molecule exhibits pure π-aromaticity.
Energy Technology Data Exchange (ETDEWEB)
Chiriac, Horia [National Institute of Research and Development for Technical Physics, 47 Mangeron Boulevard, 700050, Iasi (Romania); Lupu, Nicoleta [National Institute of Research and Development for Technical Physics, 47 Mangeron Boulevard, 700050, Iasi (Romania); Stoleriu, Laurentiu [Al. I. Cuza University, Department of Solid State and Theoretical Physics, Blvd. Carol I, 11, 700506, Iasi (Romania)]. E-mail: lstoler@uaic.ro; Postolache, Petronel [Al. I. Cuza University, Department of Solid State and Theoretical Physics, Blvd. Carol I, 11, 700506, Iasi (Romania); Stancu, Alexandru [Al. I. Cuza University, Department of Solid State and Theoretical Physics, Blvd. Carol I, 11, 700506, Iasi (Romania)
2007-09-15
In this paper we present the results of applying the first-order reversal curves (FORC) diagram experimental method to the analysis of the magnetization processes of NdFeB-based permanent magnets. The FORC diagrams for this kind of exchange spring magnets show the existence of two magnetic phases: a soft magnetic phase and a hard magnetic one. Micromagnetic modeling is used to validate the hypotheses regarding the origin of the different features of the experimental FORC diagrams.
International Nuclear Information System (INIS)
Jesenik, M.; Gorican, V.; Trlep, M.; Hamler, A.; Stumberger, B.
2006-01-01
Many magnetic materials are anisotropic. In the 3D finite element method calculation, the anisotropy of the material is taken into account. The anisotropic magnetic material is described by magnetization curves for different magnetization directions. A 3D transient calculation of the rotational magnetic field in the sample of a round rotational single sheet tester with a circular sample, considering eddy currents, is made and compared with measurement to verify the correctness of the method and to analyze the magnetic field in the sample
International Nuclear Information System (INIS)
Amendolia, S.R.; Badelek, B.; Bertolucci, E.; Bettoni, D.; Bizzeti, A.; Bosisio, L.; Bradaschia, C.; Dell'Orso, M.; Foa, L.; Focardi, E.; Giannetti, P.; Giazotto, A.; Giorgi, M.A.; Marrocchesi, P.S.; Menzione, A.; Ristori, L.; Scribano, A.; Tenchini, R.; Tonelli, G.; Triggiani, G.; Frank, S.G.F.; Harvey, J.; Menasce, D.; Meroni, E.; Moroni, L.; Milan Univ.
1984-01-01
The EM form factor of the pion has been studied in the time-like region by measuring σ(e⁺e⁻ → π⁺π⁻) normalized to σ(e⁺e⁻ → μ⁺μ⁻). Results have been obtained for q² down to the physical threshold. (orig.)
Directory of Open Access Journals (Sweden)
Farajollah Zare Jouneghani
2017-12-01
Due to some technical issues that can appear during the manufacturing process of Functionally Graded Materials (FGMs), it can be extremely difficult to produce perfect materials; indeed, one of the biggest problems is the presence of porosities. For this purpose, the vibrational behavior of doubly-curved shells made of FGM including porosities is investigated in this paper. With respect to previous research, the porosity has been added to the mechanical model that characterizes the through-the-thickness distribution of the graded constituents and applied to doubly-curved shell structures. Few papers have been published on this topic; in fact, it is easier to find works related to one-dimensional structures and beam models that take into account the effect of porosities. The First-order Shear Deformation Theory (FSDT) is considered as the theoretical framework. In addition, the mechanical properties of the constituents vary along the thickness direction. For this purpose, two power-law distributions are employed to characterize their volume fractions. Strain components are established in an orthogonal curvilinear coordinate system and the governing equations are derived according to Hamilton’s principle. Finally, Navier’s solution method is used and numerical results concerning three different types of shell structures are presented.
Energy Technology Data Exchange (ETDEWEB)
Zambelli, Monica de S.; Cicogna, Marcelo A.; Soares, Secundino [Universidade Estadual de Campinas, SP (Brazil). Faculdade de Engenharia Eletrica
2006-07-01
The proposal of this work is to present a long-term hydrothermal scheduling operating policy based on the concept of storage guide-curves. According to this policy, at each stage of the planning period the decision on the amount of water to be discharged by each hydrothermal unit must be such that it keeps its reservoir at levels pre-determined by curves obtained through an optimization method. The performance analysis for this operating policy is carried out by simulation with historical inflow data, considering a single hydrothermal system constituted by a single hydro plant, and a composite system constituted by hydro plants in cascade, adopting as the performance criterion the minimization of the expected operating cost. The results demonstrate that, although simple and clear, this operating policy presents a competitive performance in long-term hydrothermal scheduling. (author)
Curved Microneedle Array-Based sEMG Electrode for Robust Long-Term Measurements and High Selectivity
Directory of Open Access Journals (Sweden)
Minjae Kim
2015-07-01
Surface electromyography (sEMG) is widely used in many fields to infer human intention. However, conventional electrodes are not appropriate for long-term measurements and are easily influenced by the environment, so the range of applications of sEMG is limited. In this paper, we propose a flexible band-integrated, curved microneedle array electrode for robust long-term measurements, high selectivity, and easy applicability. Signal quality, in terms of long-term usability and sensitivity to perspiration, was investigated. Its motion-discriminating performance was also evaluated. The results show that the proposed electrode is robust to perspiration and can maintain a high-quality measuring ability for over 8 h. The proposed electrode also has high selectivity for motion compared with commercial wet and dry electrodes.
International Nuclear Information System (INIS)
Wise, M.E.
1978-01-01
Many hundreds of clearance curves for plasma and urine after a single injection of tracer are well fitted by y = Σ_{i=1..r} A_i exp(−B_i t), r = 2 or 3, based on models with homogeneous compartments. Re-analyzing such sums in a plot of log y versus log t shows that many of the original curves would fit y = A t^(−α) or y = A t^(−α) exp(−βt) over wide ranges of time and specific activity. Results of such re-analyses are given for a complete published series for ¹³¹I-labelled serum albumin, together with an outline of those for various compounds in the human body labelled with ³H. For radiocalcium, two such power laws can be fitted in one curve, with a transition between about 1 and 3 days, so that much of the log y versus log t plot consists of two straight lines. These lines are used to start a numerical analysis that splits the curve into two non-linear components, plus a third one that is negligible after 5 min from injection. An outline of the iteration method is given. The components are interpreted physiologically and used to predict total bone activities by (de)convolution, and these are compared with observed ankle activities and with excretion rates. The bone accretion rate is obtained mainly from the middle component and comes to 2 to 3 g Ca/day, while the return of ⁴⁷Ca from bone to plasma begins at about 1/2 day. These results seem incompatible with any based on compartments. The concept of biological half-life then needs to be reconsidered. (Auth.)
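The power-law-with-cutoff form y = A t^(−α) exp(−βt) discussed above is linear in the logarithm, ln y = ln A − α ln t − β t, so its parameters can be recovered by ordinary least squares. A minimal sketch on synthetic noiseless data (real tracer curves would need noise weighting, which the paper handles by iteration):

```python
import math

def fit_power_exp(ts, ys):
    """Least-squares fit of ln y = ln A - alpha*ln t - beta*t."""
    # Design matrix rows [1, -ln t, -t] give coefficients [ln A, alpha, beta].
    rows = [[1.0, -math.log(t), -t] for t in ts]
    b = [math.log(y) for y in ys]
    # Normal equations M c = v, solved by Gaussian elimination with pivoting.
    M = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    v = [sum(r[i] * bi for r, bi in zip(rows, b)) for i in range(3)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        v[col], v[piv] = v[piv], v[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 3):
                M[r][c] -= f * M[col][c]
            v[r] -= f * v[col]
    c = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):          # back substitution
        c[i] = (v[i] - sum(M[i][j] * c[j] for j in range(i + 1, 3))) / M[i][i]
    return math.exp(c[0]), c[1], c[2]     # A, alpha, beta

# Synthetic clearance data from known parameters A=100, alpha=0.5, beta=0.2.
ts = [0.1 * k for k in range(1, 101)]
ys = [100.0 * t ** -0.5 * math.exp(-0.2 * t) for t in ts]
A, alpha, beta = fit_power_exp(ts, ys)
print(round(A, 3), round(alpha, 3), round(beta, 3))    # 100.0 0.5 0.2
```

On a log-log plot the β = 0 case is exactly the straight line the abstract describes.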
Su, Chiu-Wen; Yen, Amy Ming-Fang; Lai, Hongmin; Chen, Hsiu-Hsi; Chen, Sam Li-Sheng
2017-12-01
The accuracy of a prediction model for periodontal disease using the community periodontal index (CPI) has been assessed by the area under a receiver operating characteristic (AUROC) curve. How the uncalibrated CPI, as measured by general dentists trained by periodontists in a large epidemiologic study, affects the performance of a prediction model has not yet been researched. A two-stage design was conducted: first, a validation study to calibrate the CPI between a senior periodontal specialist and the trained general dentists who measured CPIs in the main study of a nationwide survey. A Bayesian hierarchical logistic regression model was applied to estimate the non-updated and updated clinical weights used for building up risk scores. How the calibrated CPI affected the performance of the updated prediction model was quantified by comparing AUROC curves between the original and updated models. Estimates regarding the calibration of CPI obtained from the validation study were 66% and 85% for sensitivity and specificity, respectively. After updating, the clinical weights of each predictor were inflated, and the risk score for the highest risk category was elevated from 434 to 630. This update improved the AUROC performance from 62.6% (95% confidence interval [CI]: 61.7% to 63.6%) for the non-updated model to 68.9% (95% CI: 68.0% to 69.6%) for the updated one, a statistically significant difference. The improved performance of the updated prediction model was thus demonstrated for periodontal disease as measured by the calibrated CPI derived from a large epidemiologic survey.
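The AUROC figures quoted above have a simple probabilistic reading: the AUROC equals the probability that a randomly chosen diseased subject receives a higher risk score than a randomly chosen healthy one (the Mann-Whitney U statistic). A minimal sketch, with illustrative scores rather than study data:

```python
def auroc(pos_scores, neg_scores):
    """AUROC as the rank statistic: P(score_pos > score_neg), ties count 1/2."""
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

diseased = [630, 540, 480, 430]     # hypothetical risk scores
healthy  = [430, 400, 350, 300]
print(auroc(diseased, healthy))     # 0.96875
```

An AUROC of 0.626 vs 0.689, as in the study, means the updated model orders a random diseased/healthy pair correctly about 6 percentage points more often.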
Flow over riblet curved surfaces
Energy Technology Data Exchange (ETDEWEB)
Loureiro, J B R; Freire, A P Silva, E-mail: atila@mecanica.ufrj.br [Mechanical Engineering Program, Federal University of Rio de Janeiro (COPPE/UFRJ), C.P. 68503, 21.941-972, Rio de Janeiro, RJ (Brazil)
2011-12-22
The present work studies the mechanics of turbulent drag reduction over curved surfaces by riblets. The effects of surface modification on flow separation over steep and smooth curved surfaces are investigated. Four types of two-dimensional surfaces are studied based on the morphometric parameters that describe the body of a blue whale. Local measurements of mean velocity and turbulence profiles are obtained through laser Doppler anemometry (LDA) and particle image velocimetry (PIV).
Method for determining scan timing based on analysis of formation process of the time-density curve
International Nuclear Information System (INIS)
Yamaguchi, Isao; Ishida, Tomokazu; Kidoya, Eiji; Higashimura, Kyoji; Suzuki, Masayuki
2005-01-01
A strict determination of scan timing is needed for dynamic multi-phase scanning and 3D-CT angiography (3D-CTA) by multi-detector row CT (MDCT). In the present study, the contrast media arrival time (T_AR) was measured in the abdominal aorta at the bifurcation of the celiac artery to confirm circulatory differences between patients. In addition, we analyzed the process of formation of the time-density curve (TDC) and examined factors that affect the time to peak aortic enhancement (T_PA). Mean T_AR was 15.57 ± 3.75 s. TDCs were plotted for each duration of injection. The rising portions of the TDCs were superimposed on one another, and TDCs with longer injection durations were piled up upon one another. The rise angle was approximately constant for each flow rate. The rise time (T_R) showed a good correlation with the injection duration (T_ID): T_R = 1.01 T_ID (R² = 0.994) in the phantom study and T_R = 0.94 T_ID - 0.60 (R² = 0.988) in the clinical study. In conclusion, for the selection of optimal scan timing it is useful to determine T_R at a given point and to determine the time from T_AR. (author)
Directory of Open Access Journals (Sweden)
M. E. Shimpi
2012-01-01
Efforts have been directed at studying and analyzing the squeeze film performance between rotating, transversely rough, curved porous annular plates in the presence of a magnetic fluid lubricant, considering the effect of elastic deformation. A stochastic random variable with nonzero mean, variance, and skewness characterizes the random roughness of the bearing surfaces. With the aid of suitable boundary conditions, the associated stochastically averaged Reynolds equation is solved to obtain the pressure distribution, which in turn yields the load-carrying capacity. The graphical representations establish that the transverse roughness, in general, adversely affects the performance characteristics, while the magnetization registers a relatively improved performance. It is found that the deformation causes a reduced load-carrying capacity, which is further decreased by the porosity. This investigation indicates that the adverse effects of porosity, standard deviation and deformation can be compensated to a certain extent by the positive effect of the magnetic fluid lubricant in the case of negatively skewed roughness, by choosing the rotational inertia and the aspect ratio, especially for a suitable ratio of curvature parameters.
Directory of Open Access Journals (Sweden)
Janusz Charatonik
1991-11-01
Results concerning the contractibility of curves (equivalently: of dendroids) are collected and discussed in the paper. Interrelations between various conditions which are either sufficient or necessary for a curve to be contractible are studied.
Dias-Silva, Diogo; Pimentel-Nunes, Pedro; Magalhães, Joana; Magalhães, Ricardo; Veloso, Nuno; Ferreira, Carlos; Figueiredo, Pedro; Moutinho, Pedro; Dinis-Ribeiro, Mário
2014-06-01
A simplified narrow-band imaging (NBI) endoscopy classification of gastric precancerous and cancerous lesions was derived and validated in a multicenter study. This classification comes with the need for dissemination through adequate training. To address the learning curve of this classification by endoscopists with differing expertise and to assess the feasibility of a YouTube-based learning program to disseminate it. Prospective study. Five centers. Six gastroenterologists (3 trainees, 3 fully trained endoscopists [FTs]). Twenty tests provided through a Web-based program containing 10 randomly ordered NBI videos of gastric mucosa were taken. Feedback was sent 7 days after every test submission. Main outcome: measures of the accuracy of the NBI classification over time. From the first to the last 50 videos, a learning curve was observed, with a 10% increase in global accuracy for both trainees (from 64% to 74%) and FTs (from 56% to 65%). After 200 videos, sensitivity and specificity of 80% or higher for intestinal metaplasia were observed in half the participants, and a specificity for dysplasia greater than 95%, along with a relevant likelihood ratio for a positive result of 7 to 28 and a likelihood ratio for a negative result of 0.21 to 0.82, was achieved by all of the participants. No consistent learning curve was observed for the identification of Helicobacter pylori gastritis or for sensitivity to dysplasia. The trainees had better results on all of the parameters, except specificity for dysplasia, compared with the FTs. Globally, participants agreed that the program's structure was adequate, except for the feedback, which should have consisted of a more detailed explanation of each answer. Limitation: no formal sample size estimate. A Web-based learning program could be used to teach and disseminate classifications in the endoscopy field. In this study, an NBI classification of gastric mucosal features seems to be easily learned for the identification of gastric preneoplastic lesions.
International Nuclear Information System (INIS)
Haverkamp, U.; Wiezorek, C.; Poetter, R.
1990-01-01
Lyoluminescence dosimetry is based upon light emission during dissolution of previously irradiated dosimetric materials. The lyoluminescence signal is expressed in the dissolution glow curve. These curves begin, depending on the dissolution system, with a high peak followed by an exponentially decreasing intensity. System parameters that influence the shape of the dissolution glow curve are, for example, the injection speed, the temperature and pH value of the solution, and the design of the dissolution cell. The initial peak does not significantly correlate with the absorbed dose; it is mainly an effect of the injection. The decay of the curve consists of two exponential components, one fast and one slow, which depend on the absorbed dose and the dosimetric materials used. In particular, the slow component correlates with the absorbed dose. In contrast to the fast component, the argument of the exponential function of the slow component is independent of the dosimetric materials investigated: trehalose, glucose and mannitol. The maximum value following the peak of the curve and the integral light output are measures of the absorbed dose. The reason for the different light outputs of various dosimetric materials after irradiation with the same dose is their differing solubility. The character of the dissolution glow curves is the same following irradiation with photons, electrons or neutrons. (author)
Fuchs, Susanne I; Junge, Sibylle; Ellemunter, Helmut; Ballmann, Manfred; Gappa, Monika
2013-05-01
Volumetric capnography, reflecting the course of CO2 exhalation, is used to assess ventilation inhomogeneity. Calculation of the slope of expiratory phase 3 and the capnographic index (KPIv) from expirograms allows quantification of the extent and severity of small airway impairment. However, technical limitations have hampered more widespread use of this technique. Using expiratory molar mass-volume curves sampled with a handheld ultrasonic flow sensor during tidal breathing is a novel approach to extract similar information from expirograms in a simpler manner, possibly qualifying as a screening tool for clinical routine. The aim of the present study was to evaluate calculation of the KPIv based on molar mass-volume curves sampled with an ultrasonic flow sensor in patients with CF and controls, by assessing feasibility, reproducibility and comparability with the Lung Clearance Index (LCI) derived from multiple breath washout (MBW), used as the reference method. Measurements were performed in patients with CF and healthy controls during a single test occasion using the EasyOne Pro, MBW Module (ndd Medical Technologies, Switzerland). Capnography and MBW were performed in 87/96 patients with CF and 38/42 controls, with a success rate of 90.6% for capnography. Mean age (range) was 12.1 (4-25) years. Mean (SD) KPIv was 6.94 (3.08) in CF and 5.10 (2.06) in controls (p=0.001). Mean (SD) LCI was 8.0 (1.4) in CF and 6.2 (0.4) in controls, a statistically significant difference. Calculation of the KPIv from molar mass-volume curves is feasible. KPIv differs significantly between patients with CF and controls and correlates with the LCI. However, individual data revealed a relevant overlap between patients and controls, requiring further evaluation before this method can be recommended for clinical use. Copyright © 2012 European Cystic Fibrosis Society. Published by Elsevier B.V. All rights reserved.
International Nuclear Information System (INIS)
Dobrowolski, Tomasz
2012-01-01
The constant-curvature one- and quasi-one-dimensional Josephson junction is considered. On the basis of the Maxwell equations, the sine–Gordon equation that describes the influence of curvature on kink motion was obtained. It is shown that the method of geometrical reduction of the sine–Gordon model from a three-dimensional to a lower-dimensional manifold leads to an identical form of the sine–Gordon equation. - Highlights: ► The dynamics of the phase in a curved Josephson junction is investigated. ► The geometrical reduction is applied to the sine–Gordon model. ► The results of the geometrical reduction and the fundamental research are compared.
Directory of Open Access Journals (Sweden)
Hamed Bashirpour
2018-03-01
In wireless sensor networks (WSNs), users can use broadcast authentication mechanisms to connect to the target network and disseminate their messages within the network. Since data transfer in sensor networks is wireless, attackers can easily eavesdrop on deployed sensor nodes and the data sent between them, or modify the content of eavesdropped data and inject false data into the sensor network. Hence, the implementation of message authentication mechanisms (in order to prevent modification and injection of messages in the network) is essential for wireless sensor networks. In this paper, we present an improved protocol based on elliptic curve cryptography (ECC) to accelerate the authentication of multi-user message broadcasting. In comparison with previous ECC-based schemes, the complexity and computational overhead of the proposed scheme are significantly decreased. The proposed scheme also supports user anonymity, an important property in broadcast authentication schemes for WSNs to preserve user privacy and prevent user tracking.
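The elliptic-curve arithmetic underlying ECC schemes like the one above is compact enough to sketch: point addition plus double-and-add scalar multiplication. The curve here is a tiny textbook example, y² = x³ + 2x + 2 (mod 17) with generator G = (5, 1) and group order 19, far too small to be secure; the ECDH-style agreement at the end only illustrates why scalar multiplication is the workhorse operation (this is not the paper's protocol).

```python
P_MOD, A_COEF = 17, 2          # toy curve parameters (illustration only)
G = (5, 1)                     # generator of the order-19 group
O = None                       # point at infinity (group identity)

def point_add(P, Q):
    if P is O: return Q
    if Q is O: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return O                                   # P + (-P) = O
    if P == Q:                                     # tangent slope (doubling)
        lam = (3 * x1 * x1 + A_COEF) * pow(2 * y1, -1, P_MOD) % P_MOD
    else:                                          # chord slope
        lam = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD
    x3 = (lam * lam - x1 - x2) % P_MOD
    return (x3, (lam * (x1 - x3) - y1) % P_MOD)

def scalar_mult(k, P):
    """Double-and-add: computes k*P in O(log k) group operations."""
    R = O
    while k:
        if k & 1:
            R = point_add(R, P)
        P = point_add(P, P)
        k >>= 1
    return R

# Diffie-Hellman-style agreement: both sides derive the same shared point.
a, b = 3, 7                          # toy private keys
shared_a = scalar_mult(a, scalar_mult(b, G))
shared_b = scalar_mult(b, scalar_mult(a, G))
print(shared_a == shared_b)          # True
```

`pow(x, -1, m)` (Python 3.8+) computes the modular inverse needed for the slopes.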
Directory of Open Access Journals (Sweden)
A. M. Hashemi
2000-01-01
Regionalized and at-site flood frequency curves exhibit considerable variability in their shapes, but the factors controlling this variability (other than sampling effects) are not well understood. An application of the Monte Carlo simulation-based derived distribution approach is presented in this two-part paper to explore the influence of climate, described by simulated rainfall and evapotranspiration time series, and of basin factors on the flood frequency curve (ffc). The sensitivity analysis conducted in the paper should not be interpreted as reflecting possible climate changes, but the results can provide an indication of the changes to which the flood frequency curve might be sensitive. A single-site Neyman-Scott point process model of rainfall, with convective and stratiform cells (Cowpertwait, 1994; 1995), has been employed to generate synthetic rainfall inputs to a rainfall-runoff model. The time series of the potential evapotranspiration (ETp) demand has been represented through an AR(n) model with a seasonal component, while a simplified version of the ARNO rainfall-runoff model (Todini, 1996) has been employed to simulate the continuous discharge time series. All these models have been parameterised in a realistic manner using observed data and results from previous applications, to obtain ‘reference’ parameter sets for a synthetic case study. Subsequently, perturbations to the model parameters have been made one at a time, and the sensitivities of the generated annual maximum rainfall and flood frequency curves (unstandardised, and standardised by the mean) have been assessed. Overall, the sensitivity analysis described in this paper suggests that the soil moisture regime, and in particular the probability distribution of soil moisture content at the storm arrival time, can be considered as a unifying link between the perturbations to the several parameters and their effects on the standardised and unstandardised ffcs, thus revealing the ...
Directory of Open Access Journals (Sweden)
Anup Kumar Maurya
2017-10-01
To improve the quality of service and reduce the possibility of security attacks, a secure and efficient user authentication mechanism is required for Wireless Sensor Networks (WSNs) and the Internet of Things (IoT). Session key establishment between the sensor node and the user is also required for secure communication. In this paper, we perform a security analysis of A. K. Das's user authentication scheme (2015), Choi et al.'s scheme (2016), and Park et al.'s scheme (2016). The analysis shows that their schemes are vulnerable to various attacks, such as user impersonation, sensor node impersonation and attacks based on legitimate users. Based on the cryptanalysis of these existing protocols, we propose a secure and efficient authenticated session key establishment protocol which ensures various security features and overcomes the drawbacks of the existing protocols. The formal and informal security analysis indicates that the proposed protocol withstands the various security vulnerabilities involved in WSNs. Automated validation using the AVISPA and Scyther tools ensures the absence of security attacks in our scheme, and logical verification using Burrows-Abadi-Needham (BAN) logic confirms the correctness of the proposed protocol. Finally, a comparative analysis based on computational overhead and security features of other existing protocols indicates that the proposed user authentication system is secure and efficient. In future, we intend to implement the proposed protocol in real-world applications of WSNs and the IoT.
Directory of Open Access Journals (Sweden)
René Pellissier
2012-01-01
This paper explores the notion of 'jumping the curve', following from Handy's S-curve, onto a new curve with new rules, policies and procedures. It claims that the curve does not generally lie in wait but has to be invented by leadership. The focus of this paper is the identification (mathematically and inferentially) of that point in time, known as the cusp in catastrophe theory, when it is time to change - pro-actively, pre-actively or reactively. These three scenarios are addressed separately and discussed in terms of the relevance of each.
Method of constructing a spatial transition curve
Directory of Open Access Journals (Sweden)
S.V. Didanov
2013-04-01
Purpose. The movement of rail transport (speed of rolling stock, traffic safety, etc.) largely depends on the quality of the track. A special role here is played by the transition curve, which ensures a smooth transition from a linear to a circular section of road. The article deals with the modeling of a spatial transition curve based on a parabolic distribution of the curvature and torsion; this continues research conducted by the authors on the spatial modeling of curved contours. Methodology. The spatial transition curve is constructed by numerical methods for solving nonlinear integral equations, where the initial data are the coordinates of the starting and ending points of the future curve, together with the inclination of the tangent and the deviation of the curve from the tangent plane at these points. The solutions of the system for the numerical method are the partial derivatives of the equations with respect to the unknown parameters of the law of change of torsion and the length of the transition curve. Findings. The parametric equations of the spatial transition curve are calculated by finding the unknown coefficients of the parabolic distribution of the curvature and torsion, as well as the spatial length of the transition curve. Originality. A method for constructing the spatial transition curve is devised, and on this basis software for the geometric modeling of spatial transition curves of railway track with specified deviations of the curve from the tangent plane. Practical value. The resulting curve can be applied in any sector of the economy where it is necessary to ensure a smooth transition from a linear to a circular section of a curved spatial bypass. Examples are the transition curves in the construction of railway lines, roads, pipes, profiles, the flat sections of the working blades of turbines and compressors, ships, planes, cars, etc.
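The construction described above can be illustrated in a simplified planar form: prescribe a curvature law and integrate the Frenet equations to obtain the curve. The paper's full method is spatial (curvature and torsion, solved via nonlinear integral equations); here torsion is dropped and kappa(s) = k1*(s/L)**2 is an assumed parabolic law joining a straight section (kappa = 0) to a circular arc (kappa = k1), so this is a sketch of the idea, not the authors' algorithm.

```python
import math

def transition_curve(k1, L, n=10000):
    """Integrate x'=cos(theta), y'=sin(theta), theta'=kappa(s) by midpoint steps."""
    ds = L / n
    x = y = theta = 0.0
    pts = [(x, y)]
    for i in range(n):
        s = (i + 0.5) * ds                 # midpoint of the step
        theta += k1 * (s / L) ** 2 * ds    # d(theta)/ds = kappa(s)
        x += math.cos(theta) * ds
        y += math.sin(theta) * ds
        pts.append((x, y))
    return pts, theta

pts, end_heading = transition_curve(k1=0.01, L=100.0)
# Analytically the exit heading is the integral of kappa over [0, L] = k1*L/3.
print(abs(end_heading - 0.01 * 100.0 / 3) < 1e-6)   # True
```

The same integration scheme extends to 3D by carrying a full Frenet frame and a torsion law tau(s).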
Chen, Hui; Cai, Li-Xun
2018-04-01
Based on the power-law stress-strain relation and the equivalent energy principle, theoretical equations for converting between Brinell hardness (HB), Rockwell hardness (HR), and Vickers hardness (HV) were established. Combining these with the pre-existing relation between the tensile strength (σ_b) and the Hollomon parameters (K, N), theoretical conversions between hardness (HB/HR/HV) and tensile strength (σ_b) were obtained as well. In addition, to confirm the pre-existing σ_b-(K, N) relation, a large number of uniaxial tensile tests were conducted on various ductile materials. Finally, to verify the theoretical conversions, a wealth of statistical data listed in ASTM and ISO standards was adopted to test the robustness of the converting equations for various hardness and tensile strength values. The results show that both the hardness conversions and the hardness-strength conversions calculated from the theoretical equations accord well with the standard data.
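One classical σ_b-(K, N) link of the kind referred to above (an assumption here, since the abstract does not state the authors' exact relation) is the Considère result for Hollomon hardening σ_true = K·ε^N: necking begins at true strain ε = N, giving an engineering tensile strength σ_b = K·N^N·exp(−N). A minimal sketch with illustrative, not measured, material constants:

```python
import math

def tensile_strength(K, N):
    """Considere estimate of engineering UTS for Hollomon hardening sigma = K*eps**N."""
    # Necking (diffuse instability) starts at true strain eps = N, where the
    # true stress is K*N**N; dividing by exp(N) converts to engineering stress.
    return K * N ** N * math.exp(-N)

# e.g. a steel-like K = 500 MPa, N = 0.2 (illustrative values only)
print(round(tensile_strength(500.0, 0.2), 1))   # 296.7
```

Chaining such a relation with a hardness-(K, N) conversion is what yields the direct hardness-to-strength equations the paper validates against ASTM/ISO data.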
Chaudhry, Shehzad Ashraf; Mahmood, Khalid; Naqvi, Husnain; Khan, Muhammad Khurram
2015-11-01
Telecare medicine information systems (TMIS) offer patients convenient and expedited healthcare services remotely, anywhere. Patient security and privacy have emerged as key issues during remote access because of the underlying open architecture. An authentication scheme can verify the legitimacy of both the patient and the TMIS server during remote healthcare services. To achieve security and privacy, a number of authentication schemes have been proposed. Very recently, Lu et al. (J. Med. Syst. 39(3):1-8, 2015) proposed a biometric-based three-factor authentication scheme for TMIS to remedy the vulnerabilities of Arshad et al.'s (J. Med. Syst. 38(12):136, 2014) scheme, and emphasized the robustness of their scheme against several attacks. However, in this paper we establish that Lu et al.'s scheme is vulnerable to numerous attacks, including (1) patient anonymity violation, (2) patient impersonation, and (3) TMIS server impersonation. Furthermore, their scheme does not provide patient untraceability. We then propose an improvement of Lu et al.'s scheme and analyze its security using the popular automated tool ProVerif. The proposed scheme, while retaining the pluses of Lu et al.'s scheme, is also robust against the known attacks.
Tian, Dayong; Lin, Zhifen; Yin, Daqiang
2013-01-01
The present study proposes a QSAR model to predict joint effects at non-equitoxic ratios for binary mixtures containing reactive toxicants, cyanogenic compounds and aldehydes. The toxicity of single chemicals and binary mixtures was measured by quantifying the decrease in light emission from Photobacterium phosphoreum over 15 min, from which the joint effects of the binary mixtures (TU sum) were obtained. The results showed that the relationships between the toxic ratios of the individual chemicals and their joint effects can be described by a normal distribution function. Based on normal distribution equations, the joint effects of binary mixtures at non-equitoxic ratios ([Formula: see text]) can be predicted quantitatively from the joint effects at equitoxic ratios ([Formula: see text]). Combined with a QSAR model of [Formula: see text] from our previous work, a novel QSAR model can be proposed to predict the joint effects of mixtures at non-equitoxic ratios ([Formula: see text]). The proposed model has been validated using additional mixtures other than those used for the development of the model; predicted and observed results were similar (p > 0.05). This study provides an approach to the prediction of joint effects for binary mixtures at non-equitoxic ratios.
Directory of Open Access Journals (Sweden)
Chase A. Klingaman
2017-02-01
Full Text Available The data presented in this article are related to the research article, “HPLC-based enzyme kinetics assay for glucosinolate hydrolysis facilitate analysis of systems with both multiple reaction products and thermal enzyme denaturation” (C.K. Klingaman, M.J. Wagner, J.R. Brown, J.B. Klecker, E.H. Pauley, C.J. Noldner, J.R. Mays [1]). This data article describes (1) the synthesis and spectral characterization data of a non-natural glucosinolate analogue, 2,2-diphenylethyl glucosinolate, (2) HPLC standardization data for glucosinolate, isothiocyanate, nitrile, and amine analytes, (3) reaction progress curve data for enzymatic hydrolysis reactions with variable substrate concentration, enzyme concentration, buffer pH, and temperature, and (4) normalized initial velocities of hydrolysis/formation for analytes. These data provide a comprehensive description of the enzyme-catalyzed hydrolysis of 2,2-diphenylethyl glucosinolate (5) and glucotropaeolin (6) under widely varied conditions.
Martínez, Sol Sáez; de la Rosa, Félix Martínez; Rojas, Sergio
2017-01-01
In Advanced Calculus, our students wonder if it is possible to graphically represent a tornado by means of a three-dimensional curve. In this paper, we show it is possible by providing the parametric equations of such tornado-shaped curves.
Simulating Supernova Light Curves
International Nuclear Information System (INIS)
Even, Wesley Paul; Dolence, Joshua C.
2016-01-01
This report discusses supernova light simulations. A brief review of supernovae, basics of supernova light curves, simulation tools used at LANL, and supernova results are included. Further, it happens that many of the same methods used to generate simulated supernova light curves can also be used to model the emission from fireballs generated by explosions in the earth's atmosphere.
Image scaling curve generation
2012-01-01
The present invention relates to a method of generating an image scaling curve, where local saliency is detected in a received image. The detected local saliency is then accumulated in the first direction. A final scaling curve is derived from the detected local saliency and the image is then
Tempo curves considered harmful
Desain, P.; Honing, H.
1993-01-01
In the literature of musicology, computer music research and the psychology of music, timing or tempo measurements are mostly presented in the form of continuous curves. The notion of these tempo curves is dangerous, despite its widespread use, because it lulls its users into the false impression
Optimization on Spaces of Curves
DEFF Research Database (Denmark)
Møller-Andersen, Jakob
in Rd, and methods to solve the initial and boundary value problems for geodesics, allowing us to compute the Karcher mean and principal component analysis of curve data. We apply the methods to study shape variation in synthetic data, in the Kimia shape database, in HeLa cell nuclei and in cycles of cardiac deformations. Finally we investigate a new application of Riemannian shape analysis in shape optimization. We set up a simple elliptic model problem and describe how to apply shape calculus to obtain directional derivatives in the manifold of planar curves. We present an implementation based...
Chou, Kai-Seng
2001-01-01
Although research in curve shortening flow has been very active for nearly 20 years, the results of those efforts have remained scattered throughout the literature. For the first time, The Curve Shortening Problem collects and illuminates those results in a comprehensive, rigorous, and self-contained account of the fundamental results. The authors present a complete treatment of the Gage-Hamilton theorem, a clear, detailed exposition of Grayson's convexity theorem, a systematic discussion of invariant solutions, applications to the existence of simple closed geodesics on a surface, and a new, almost convexity theorem for the generalized curve shortening problem. Many questions regarding curve shortening remain outstanding. With its careful exposition and complete guide to the literature, The Curve Shortening Problem provides not only an outstanding starting point for graduate students and new investigators, but a superb reference that presents intriguing new results for those already active in the field.
A versatile curve-fit model for linear to deeply concave rank abundance curves
Neuteboom, J.H.; Struik, P.C.
2005-01-01
A new, flexible curve-fit model for linear to concave rank abundance curves was conceptualized and validated using observational data. The model links the geometric-series model and the log-series model and can also fit deeply concave rank abundance curves. The model is based, in an unconventional way,
DEFF Research Database (Denmark)
Brücker, Herbert; Jahn, Elke J.
Based on a wage curve approach, we examine the labor market effects of migration in Germany in a general equilibrium framework. The wage curve relies on the assumption that wages respond, albeit imperfectly, to a change in the unemployment rate. This allows one to derive the wage and employment effects of migration simultaneously. For the empirical analysis we employ the IABS, a two percent sample of the German labor force. We find that the elasticity of the wage curve is particularly high for young workers and workers with a university degree, while it is low for older workers and workers with a vocational degree. The wage and employment effects of migration are moderate: a 1 percent increase in the German labor force through immigration increases the aggregate unemployment rate by less than 0.1 percentage points and reduces average wages by less than 0.1 percent. While native workers benefit from...
Laffer Curves and Home Production
Directory of Open Access Journals (Sweden)
Kotamäki Mauri
2017-06-01
Full Text Available In the earlier related literature, the consumption tax rate Laffer curve is found to be strictly increasing (see Trabandt and Uhlig, 2011). In this paper, a general equilibrium macro model is augmented by introducing a substitute for private consumption in the form of home production. The introduction of home production brings about an additional margin of adjustment: an increase in the consumption tax rate not only decreases labor supply and reduces the consumption tax base, but also allows substitution of market goods with home-produced goods. The main objective of this paper is to show that, after the introduction of home production, the consumption tax Laffer curve exhibits an inverse U-shape. The income tax Laffer curves are also significantly altered. The results shown in this paper cast doubt on some of the earlier results in the literature.
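The mechanism described in this abstract can be illustrated with a deliberately simplified sketch (not the paper's calibrated general equilibrium model): let the market tax base shrink as the consumption tax rises, because households substitute toward untaxed home production. The functional form and the elasticity value below are invented for illustration only.

```python
# Illustrative sketch only (not the paper's calibrated model): with home
# production available, raising the consumption tax shifts activity from taxed
# market goods toward untaxed home goods, shrinking the tax base enough to give
# revenue an inverse-U (Laffer) shape.

def market_base(tau, elasticity=1.5):
    # Hypothetical tax base: market consumption falls as tau rises because
    # households substitute toward untaxed home production.
    return (1.0 - tau) ** elasticity

def revenue(tau):
    return tau * market_base(tau)

grid = [i / 1000.0 for i in range(1000)]
revs = [revenue(t) for t in grid]
tau_peak = grid[revs.index(max(revs))]   # interior peak -> inverse-U shape
```

With elasticity 1.5 the analytic peak sits at tau = 1/(1 + 1.5) = 0.4: revenue rises below it and falls above it, which is the inverse U-shape the abstract describes; with no home-production margin (elasticity 0) revenue would be strictly increasing in tau.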
Energy Technology Data Exchange (ETDEWEB)
Lippoldt, Stefan
2016-01-21
In this thesis we study a formulation of Dirac fermions in curved spacetime that respects general coordinate invariance as well as invariance under local spin base transformations. We emphasize the advantages of the spin base invariant formalism both from a conceptual as well as from a practical viewpoint. This suggests that local spin base invariance should be added to the list of (effective) properties of (quantum) gravity theories. We find support for this viewpoint by the explicit construction of a global realization of the Clifford algebra on a 2-sphere, which is impossible in the spin-base non-invariant vielbein formalism. The natural variables for this formulation are spacetime-dependent Dirac matrices subject to the Clifford-algebra constraint. In particular, a coframe, i.e. a vielbein field, is not required. We disclose the hidden spin base invariance of the vielbein formalism. Explicit formulas for the spin connection as a function of the Dirac matrices are found. This connection consists of a canonical part that is completely fixed in terms of the Dirac matrices and a free part that can be interpreted as spin torsion. The common Lorentz symmetric gauge for the vielbein is constructed for the Dirac matrices, even for metrics which are not linearly connected. Under certain criteria, it constitutes the simplest possible gauge, demonstrating why this gauge is so useful. Using the spin base formulation for building a field theory of quantized gravity and matter fields, we show that it suffices to quantize the metric and the matter fields. This observation is of particular relevance for field theory approaches to quantum gravity, as it can serve as a purely metric-based quantization scheme for gravity even in the presence of fermions. Hence, in the second part of this thesis we critically examine the gauge and field-parametrization dependence of renormalization group flows in the vicinity of non-Gaussian fixed points in quantum gravity. While physical
International Nuclear Information System (INIS)
Pradhan, S.M.; Sneha, C.; Adtani, M.M.
2010-01-01
The facility of glow curve storage and recall provided in the reader software is helpful for manual screening of the glow curves; however, no further analysis is possible in the absence of numerical TL data at the sampling intervals. In the present study, glow curves are digitized by modifying the reader software and then normalized to make them independent of dose. The normalized glow curves are then analyzed by dividing them into five equal parts on the time scale. This method of analysis is used to correlate the variation of the total TL counts of the three discs with the time elapsed post irradiation.
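The normalization-and-segmentation step described above can be sketched as follows. The glow curve values and the segment count are illustrative; the five-segment split follows the abstract, everything else is an assumption.

```python
# Hypothetical sketch: normalize a digitized TL glow curve by its total counts
# (making its shape dose-independent) and split it into five equal time
# segments, reporting each segment's share of the total counts.

def segment_fractions(counts, n_segments=5):
    """Normalize a glow curve by total counts and return per-segment fractions."""
    total = sum(counts)
    norm = [c / total for c in counts]          # dose-independent shape
    seg_len = len(norm) // n_segments           # assumes len divisible by n_segments
    return [sum(norm[i * seg_len:(i + 1) * seg_len]) for i in range(n_segments)]

# Synthetic glow curve sampled at 25 equal time intervals (arbitrary units).
glow = [1, 3, 8, 15, 25, 40, 55, 70, 80, 85, 82, 74, 60, 45, 32,
        22, 14, 9, 5, 3, 2, 1, 1, 1, 1]
fracs = segment_fractions(glow)
```

Comparing the per-segment fractions across discs and storage times is then a matter of tabulating `fracs` for each readout.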
Directory of Open Access Journals (Sweden)
Paulo Prochno
2004-07-01
Full Text Available Learning curves have been studied for a long time. These studies provided strong support to the hypothesis that, as organizations produce more of a product, unit costs of production decrease at a decreasing rate (see Argote, 1999, for a comprehensive review of learning curve studies). But the organizational mechanisms that lead to these results are still underexplored. We know some drivers of learning curves (ADLER; CLARK, 1991; LAPRE et al., 2000), but we still lack a more detailed view of the organizational processes behind those curves. Through an ethnographic study, I bring a comprehensive account of the first year of operations of a new automotive plant, describing what was taking place in the assembly area during the most relevant shifts of the learning curve. The emphasis is then on how learning occurs in that setting. My analysis suggests that the overall learning curve is in fact the result of an integration process that puts together several individual ongoing learning curves in different areas throughout the organization. In the end, I propose a model to understand the evolution of these learning processes and their supporting organizational mechanisms.
Consistent Valuation across Curves Using Pricing Kernels
Directory of Open Access Journals (Sweden)
Andrea Macrina
2018-03-01
Full Text Available The general problem of asset pricing when the discount rate differs from the rate at which an asset’s cash flows accrue is considered. A pricing kernel framework is used to model an economy that is segmented into distinct markets, each identified by a yield curve having its own market, credit and liquidity risk characteristics. The proposed framework precludes arbitrage within each market, while the definition of a curve-conversion factor process links all markets in a consistent arbitrage-free manner. A pricing formula is then derived, referred to as the across-curve pricing formula, which enables consistent valuation and hedging of financial instruments across curves (and markets. As a natural application, a consistent multi-curve framework is formulated for emerging and developed inter-bank swap markets, which highlights an important dual feature of the curve-conversion factor process. Given this multi-curve framework, existing multi-curve approaches based on HJM and rational pricing kernel models are recovered, reviewed and generalised and single-curve models extended. In another application, inflation-linked, currency-based and fixed-income hybrid securities are shown to be consistently valued using the across-curve valuation method.
Fach, S; Sitzenfrei, R; Rauch, W
2009-01-01
It is state of the art to evaluate and optimise sewer systems with urban drainage models. Since spill flow data is essential in the calibration process of conceptual models, it is important to enhance the quality of such data. A widespread approach is to calculate the spill flow volume by using standard weir equations together with measured water levels. However, these equations are only applicable to combined sewer overflow (CSO) structures whose weir constructions correspond to the standard weir layout. The objective of this work is to outline an alternative approach to obtain spill flow discharge data based on measurements with a sonic depth finder. The idea is to determine the relation between water level and rate of spill flow by running a detailed 3D computational fluid dynamics (CFD) model. Two real-world CSO structures were chosen due to their complex structure, especially with respect to the weir construction. In a first step the simulation results were analysed to identify flow conditions for discrete steady states. It is shown that the flow conditions in the CSO structure change once the spill flow pipe acts as a controlled outflow, and therefore the spill flow discharge cannot be described with a standard weir equation. In a second step the CFD results were used to derive rating curves which can easily be applied in everyday practice. The rating curves are therefore developed on the basis of the standard weir equation and the equation for orifice-type outlets; because the intersection of both equations is not known, the coefficients of discharge are regressed from the CFD simulation results. Furthermore, the regression of the CFD simulation results is compared with that of the standard weir equation by using historic water levels and hydrographs generated with a hydrodynamic model. The uncertainties resulting from the widespread use of the standard weir equation are demonstrated.
Khobragade, P.; Fan, Jiahua; Rupcich, Franco; Crotty, Dominic J.; Gilat Schmidt, Taly
2016-03-01
This study quantitatively evaluated the performance of the exponential transformation of the free-response operating characteristic curve (EFROC) metric, with the channelized Hotelling observer (CHO) as a reference. The CHO has been used for image quality assessment of reconstruction algorithms and imaging systems, and is often applied to signal-location-known cases; it also requires a large set of images to estimate the covariance matrix. For clinical applications, this assumption and requirement may be unrealistic. The newly developed location-unknown EFROC detectability metric is estimated from the confidence scores reported by a model observer. Unlike the CHO, EFROC does not require a channelization step and is a non-parametric detectability metric. Few quantitative studies are available on the application of the EFROC metric, and most are based on simulation data. This study investigated the EFROC metric using experimental CT data. A phantom with four low-contrast objects, 3 mm (14 HU), 5 mm (7 HU), 7 mm (5 HU) and 10 mm (3 HU), was scanned at dose levels ranging from 25 mAs to 270 mAs and reconstructed using filtered backprojection. The area under the curve values for the CHO (AUC) and EFROC (AFE) were plotted with respect to the different dose levels. The number of images required to estimate the non-parametric AFE metric was calculated for varying tasks and found to be less than the number of images required for parametric CHO estimation. The AFE metric was found to be more sensitive to changes in dose than the CHO metric. This increased sensitivity and the assumption of unknown signal location may be useful for investigating and optimizing CT imaging methods. Future work is required to validate the AFE metric against human observers.
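A minimal sketch of the kind of non-parametric area-under-curve estimate computed from observer confidence scores is given below. It uses the standard Mann-Whitney identity, AUC = P(signal score > noise score), with ties counted as one half; the scores are synthetic illustration values, and this is the plain location-known AUC, not the EFROC/AFE transformation itself.

```python
# Minimal sketch: non-parametric AUC from model-observer confidence scores via
# the Mann-Whitney statistic. Scores below are synthetic illustration values.

def auc_mann_whitney(signal_scores, noise_scores):
    """AUC = P(signal score > noise score), ties counted as 1/2."""
    wins = 0.0
    for s in signal_scores:
        for n in noise_scores:
            if s > n:
                wins += 1.0
            elif s == n:
                wins += 0.5
    return wins / (len(signal_scores) * len(noise_scores))

signal = [0.9, 0.8, 0.75, 0.6, 0.55]   # confidence scores, signal-present images
noise  = [0.7, 0.5, 0.4, 0.35, 0.2]    # confidence scores, signal-absent images
auc = auc_mann_whitney(signal, noise)  # 1.0 = perfect separation, 0.5 = chance
```

Because the estimate is rank-based, no parametric covariance estimation step (as in the CHO) is needed, which is one reason fewer images can suffice.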
Buonanno, Paolo; Fergusson, Leopoldo; Vargas, Juan Fernando
2014-01-01
We document the existence of a Crime Kuznets Curve in US states since the 1970s. As income levels have risen, crime has followed an inverted U-shaped pattern, first increasing and then dropping. The Crime Kuznets Curve is not explained by income inequality. In fact, we show that during the sample period inequality has risen monotonically with income, ruling out the traditional Kuznets Curve. Our finding is robust to adding a large set of controls that are used in the literature to explain the...
De Luca, Michele; Ioele, Giuseppina; Mas, Sílvia; Tauler, Romà; Ragno, Gaetano
2012-11-21
Amiloride photostability at different pH values was studied in depth by applying Multivariate Curve Resolution Alternating Least Squares (MCR-ALS) to UV spectrophotometric data from drug solutions exposed to stressing irradiation. Resolution of all degradation photoproducts was possible by simultaneous spectrophotometric analysis of kinetic photodegradation and acid-base titration experiments. Amiloride photodegradation was shown to be strongly pH dependent. Two hard modelling constraints were used sequentially in MCR-ALS for the unambiguous resolution of all the species involved in the photodegradation process: an amiloride acid-base system was defined using the equilibrium constraint, and the photodegradation pathway was modelled taking into account the kinetic constraint. The simultaneous analysis of photodegradation and titration experiments revealed the presence of eight different species, distributed differently according to pH and time. Concentration profiles of all the species as well as their pure spectra were resolved, and kinetic rate constants were estimated. The values of the rate constants changed with pH, and under alkaline conditions the degradation pathway and photoproducts also changed. These results were compared to those obtained by LC-MS analysis of drug photodegradation experiments. MS analysis allowed the identification of up to five species and showed the simultaneous presence of more than one acid-base equilibrium.
Johnson, L. E.; Kim, J.; Cifelli, R.; Chandra, C. V.
2016-12-01
Potential water retention, S, is one of the parameters commonly used in hydrologic modeling for soil moisture accounting. Physically, S indicates the total amount of water which can be stored in soil and is expressed in units of depth. S can be represented as a change of soil moisture content and in this context is commonly used to estimate direct runoff, especially in the Soil Conservation Service (SCS) curve number (CN) method. Generally, both lumped and distributed hydrologic models can easily use the SCS-CN method to estimate direct runoff. Changes in potential water retention have been used in previous SCS-CN studies; however, these studies have focused on long-term hydrologic simulations where S is allowed to vary at the daily time scale. While useful for hydrologic events that span multiple days, that resolution is too coarse for short-term applications such as flash flood events, where S may not recover its full potential. In this study, a new method for estimating a time-variable potential water retention at hourly time scales is presented. The methodology is applied to the Napa River basin, California. The streamflow gage at St Helena, located in the upper reaches of the basin, is used as the control gage site to evaluate model performance, as it has minimal influence from reservoirs and diversions. Rainfall events from 2011 to 2012 are used to estimate the event-based SCS CN, which is then converted to S. As a result, we have derived the potential water retention curve, which is classified into three sections depending on the relative change in S. The first is a negative-slope section arising from differences in the rate at which water moves through the soil column; the second is a zero-change section representing the initial recovery of the potential water retention; and the third is a positive-change section representing the full recovery of the potential water retention. Also, we found that soil water movement experiences a "traffic jam" within 24 hours after the first
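The standard SCS-CN relations referenced in this abstract can be sketched directly: S follows from the curve number, and direct runoff from event rainfall and S. The CN value and rainfall depth below are illustrative, not values from the Napa study.

```python
# Sketch of the standard SCS curve number relations (SI units, mm):
# S = 25400/CN - 254, Ia = 0.2*S, and Q = (P - Ia)^2 / (P - Ia + S) for P > Ia.

def retention_from_cn(cn):
    """Potential maximum retention S (mm) from a curve number."""
    return 25400.0 / cn - 254.0

def scs_runoff(p_mm, cn, ia_ratio=0.2):
    """Direct runoff depth (mm) for event rainfall p_mm via the SCS-CN method."""
    s = retention_from_cn(cn)
    ia = ia_ratio * s                      # initial abstraction
    if p_mm <= ia:
        return 0.0                         # all rainfall abstracted, no runoff
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# Example: CN = 80 gives S = 63.5 mm; 50 mm of rain yields ~13.8 mm of runoff.
s80 = retention_from_cn(80)
q = scs_runoff(50.0, 80)
```

An hourly time-variable S, as proposed in the study, would amount to re-evaluating `s` at each time step instead of holding it fixed at the daily scale.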
Directory of Open Access Journals (Sweden)
Kožul Nataša
2014-01-01
Full Text Available In the broadest sense, the yield curve indicates the market's view of the evolution of interest rates over time. However, given that the cost of borrowing is closely linked to creditworthiness (ability to repay), different yield curves will apply to different currencies, market sectors, or even individual issuers. As government borrowing is indicative of the interest rate levels available to other market players in a particular country, and considering that bond issuance still remains the dominant form of sovereign debt, this paper describes yield curve construction using bonds. The relationship between zero-coupon yield, par yield and yield to maturity is given, and their usage in determining curve discount factors is described. Their usage in deriving forward rates and pricing related derivative instruments is also discussed.
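The zero-yield-to-discount-factor-to-forward-rate chain described above can be sketched in a few lines. The yields below are illustrative, not market data, and annual compounding is assumed throughout.

```python
# Minimal sketch: annually compounded zero-coupon yields -> discount factors
# -> implied forward rates. Yields are illustrative, not market data.

def discount_factor(zero_rate, t_years):
    """Discount factor from an annually compounded zero-coupon yield."""
    return 1.0 / (1.0 + zero_rate) ** t_years

def forward_rate(df_t1, df_t2, t1, t2):
    """Implied annually compounded forward rate between t1 and t2 (years)."""
    return (df_t1 / df_t2) ** (1.0 / (t2 - t1)) - 1.0

zeros = {1: 0.02, 2: 0.025, 3: 0.03}          # maturity (years) -> zero yield
dfs = {t: discount_factor(r, t) for t, r in zeros.items()}
f12 = forward_rate(dfs[1], dfs[2], 1, 2)      # 1y rate, 1y forward
```

On an upward-sloping curve the implied forward rate exceeds the short zero yield (here f12 is about 3.0% against a 2% one-year zero), which is the no-arbitrage relationship the paper builds on.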
U.S. Environmental Protection Agency — a UV calibration curve for SRHA quantitation. This dataset is associated with the following publication: Chang, X., and D. Bouchard. Surfactant-Wrapped Multiwalled...
International Nuclear Information System (INIS)
Gruhn, C.R.
1981-05-01
An alternative use of the gaseous ionization chamber in the detection of energetic heavy ions is presented, called Bragg Curve Spectroscopy (BCS). Conceptually, BCS involves using the maximum data available from the Bragg curve of the stopping heavy ion (HI) for purposes of identifying the particle and measuring its energy. A detector has been designed that measures the Bragg curve with high precision. From the Bragg curve one determines the range from the length of the track, the total energy from the integral of the specific ionization over the track, dE/dx from the specific ionization at the beginning of the track, and the Bragg peak from the maximum of the specific ionization of the HI. This last signal measures the atomic number, Z, of the HI unambiguously.
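The four quantities extracted from a digitized Bragg curve, as listed above, reduce to simple operations on the sampled specific-ionization profile. The curve below is synthetic and the units arbitrary; this is a sketch of the extraction logic, not of the detector electronics.

```python
# Sketch: extract range (track length), total energy (integral of specific
# ionization), entry dE/dx, and Bragg peak height from a sampled Bragg curve.

def bragg_features(x, dedx):
    """x: positions along the track; dedx: specific ionization samples."""
    track_range = x[-1] - x[0]
    # Trapezoidal integral of dE/dx over the track -> total deposited energy.
    total_e = sum((dedx[i] + dedx[i + 1]) * (x[i + 1] - x[i]) / 2.0
                  for i in range(len(x) - 1))
    entry_dedx = dedx[0]            # specific ionization at track entry
    peak = max(dedx)                # Bragg peak height -> atomic number Z
    return track_range, total_e, entry_dedx, peak

xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ion = [1.0, 1.2, 1.6, 2.5, 4.0, 0.5]   # rises to a Bragg peak, then drops
rng, etot, entry, peak = bragg_features(xs, ion)
```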
Directory of Open Access Journals (Sweden)
Sutawanir Darwis
2012-05-01
Full Text Available Empirical decline curve analysis of oil production data gives reasonable answers in hyperbolic type-curve situations; however, the methodology has limitations in fitting real historical production data in the presence of unusual observations caused by treatments of the well intended to increase production capacity. The development of robust least squares offers new possibilities for better fitting of production data in decline curve analysis by down-weighting the unusual observations. This paper proposes a robust least squares fitting (lmRobMM) approach to estimate the decline rate of daily production data and compares the results with reservoir simulation results. For a case study, we use the oil production data from the TBA Field, West Java. The results demonstrate that the approach is suitable for decline curve fitting and offers new insight into decline curve analysis in the presence of unusual observations.
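The hyperbolic type curve underlying this kind of analysis is the Arps model, q(t) = qi / (1 + b·Di·t)^(1/b). The sketch below fits it by a crude grid search on synthetic data; it is a plain least-squares toy to show the model's shape, not the robust lmRobMM estimator the paper uses, and all parameter values are invented.

```python
# Illustrative sketch of hyperbolic (Arps) decline-curve fitting by grid search
# over (b, Di) with qi fixed at the first observation. Plain least squares,
# not the robust lmRobMM estimator of the paper; parameters are synthetic.

def arps_hyperbolic(qi, b, di, t):
    """Arps hyperbolic decline: rate at time t (same time units as 1/Di)."""
    return qi / (1.0 + b * di * t) ** (1.0 / b)

def fit_decline(times, rates):
    qi = rates[0]
    best = None
    for b_i in range(1, 100):               # b in (0, 1)
        b = b_i / 100.0
        for d_i in range(1, 200):           # Di in (0, 0.2) per day
            di = d_i / 1000.0
            sse = sum((arps_hyperbolic(qi, b, di, t) - q) ** 2
                      for t, q in zip(times, rates))
            if best is None or sse < best[0]:
                best = (sse, b, di)
    return best[1], best[2]

# Synthetic daily production generated with b = 0.5, Di = 0.05 per day.
ts = list(range(0, 200, 10))
qs = [arps_hyperbolic(1000.0, 0.5, 0.05, t) for t in ts]
b_hat, di_hat = fit_decline(ts, qs)
```

A robust variant would replace the squared residual with a bounded loss so that a few post-treatment spikes cannot drag the fitted decline rate, which is the paper's point.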
Chouaib, Wafa; Caldwell, Peter V.; Alila, Younes
2018-04-01
This paper advances the physical understanding of the regional variation of the flow duration curve (FDC). It provides a process-based analysis of the interaction between climate and landscape properties to explain disparities in FDC shapes. We used (i) long-term measured flow and precipitation data for 73 catchments from the eastern US, and (ii) the Sacramento model (SAC-SMA), calibrated to simulate soil moisture and the FDCs of flow components. The classification of catchments based on storm characteristics pointed to the effect of catchment landscape properties on precipitation variability and consequently on FDC shapes. The landscape effect was pronounced: low values of the slope of the FDC (SFDC), hinting at limited flow variability, occurred in regions of high precipitation variability, whereas in regions of low precipitation variability SFDC values were larger. The topographic index distribution at the catchment scale indicated that saturation excess overland flow mitigated flow variability under conditions of low elevation with large soil moisture storage capacity and high infiltration rates. SFDC increased where subsurface stormflow predominated, in catchments at high elevations with limited soil moisture storage capacity and low infiltration rates. Our analyses also highlighted the major role of soil infiltration rates on the FDC, despite the impact of the predominant runoff generation mechanism and catchment elevation. Under conditions of slow infiltration in soils of large moisture storage capacity (at low elevations) and predominant saturation excess, SFDC values were larger. On the other hand, SFDC decreased in catchments with prevalent subsurface stormflow and poorly drained soils of small moisture storage capacity. The analysis of the flow component FDCs demonstrated that the interflow contribution to the response was highest in catchments with a large slope of the FDC. The surface flow
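The FDC and its mid-section slope discussed above can be computed with a short sketch. A common definition is SFDC = [ln Q(33%) − ln Q(66%)] / (0.66 − 0.33); both that convention and the Weibull plotting positions below are assumptions of this sketch, not details taken from the paper, and the daily flows are synthetic.

```python
# Minimal sketch: flow duration curve (FDC) and its mid-section slope SFDC,
# a common flow-variability signature. Flows are synthetic illustration data.
import math

def fdc(flows):
    """Return (exceedance probability, flow) pairs, highest flow first."""
    q = sorted(flows, reverse=True)
    n = len(q)
    return [((i + 1) / (n + 1), q[i]) for i in range(n)]  # Weibull positions

def quantile_flow(curve, p):
    """Flow at the plotting position nearest to exceedance probability p."""
    return min(curve, key=lambda pair: abs(pair[0] - p))[1]

def slope_fdc(flows):
    curve = fdc(flows)
    q33 = quantile_flow(curve, 0.33)
    q66 = quantile_flow(curve, 0.66)
    return (math.log(q33) - math.log(q66)) / (0.66 - 0.33)

flashy = [1, 1, 2, 2, 3, 5, 8, 20, 60, 200]      # high flow variability
damped = [8, 9, 9, 10, 10, 11, 11, 12, 13, 14]   # low flow variability
```

The flashier record yields a much steeper mid-section slope than the damped one, which is how SFDC serves as the variability signature the paper analyzes.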
DEFF Research Database (Denmark)
Georgieva Yankova, Ginka; Federici, Paolo
This report describes power curve measurements carried out on a given turbine in a chosen period. The measurements are carried out in accordance with IEC 61400-12-1 Ed. 1 and FGW Teil 2.
Alexeev, Valery; Clemens, C Herbert; Beauville, Arnaud
2008-01-01
This book is devoted to recent progress in the study of curves and abelian varieties. It discusses both classical aspects of this deep and beautiful subject as well as two important new developments, tropical geometry and the theory of log schemes. In addition to original research articles, this book contains three surveys devoted to singularities of theta divisors, of compactified Jacobians of singular curves, and of "strange duality" among moduli spaces of vector bundles on algebraic varieties.
Park, Joonhong; Song, Minsik; Jang, Woori; Chae, Hyojin; Lee, Gun Dong; Kim, KyungTak; Park, Heekyung; Kim, Myungshin; Kim, Yonggoo
2017-02-01
We developed and evaluated the feasibility of peptide nucleic acid (PNA)-based fluorescence melting curve analysis (FMCA) to detect common mutations in myeloproliferative neoplasms (MPNs). We set up two separate PNA-based FMCA reactions: JAK2 V617F and CALR p.Leu367fs*46 (set A), and MPL W515L/K and CALR p.Lys385fs*47 (set B). Clinical usefulness was validated against allele-specific real-time PCR, fragment analysis and Sanger sequencing in 57 BCR-ABL1-negative MPNs. The limit of detection (LOD) of PNA-based FMCA was approximately 10% for each mutation; interference in reactions using mixtures of different mutations was not observed, and non-specific amplification was not observed in normal controls. PNA-based FMCA detected all JAK2 V617F (n=20), CALR p.Leu367fs*46 (n=10) and p.Lys385fs*47 (n=8) mutations. Three of six MPL mutations were detected; the remaining three samples had mutant concentrations below the LOD. JAK2 exon 12 mutations (n=7) were negative without influencing V617F results. Among six variant CALR exon 9 mutations, two were detected by this method, owing to invasion of the probe binding site. PNA-based FMCA for detecting common JAK2, MPL, and CALR mutations is a rapid, simple, and sensitive technique in BCR-ABL1-negative MPNs with >10% mutant allele at the time of initial diagnosis. Copyright © 2016 Elsevier B.V. All rights reserved.
Directory of Open Access Journals (Sweden)
Jimit R Patel
2014-12-01
Full Text Available Efforts have been made to analyze the Shliomis model based ferrofluid lubrication of a squeeze film between rotating rough curved circular plates where the upper plate has a porous facing. Different models of porosity are treated. The stochastic modeling of Christensen and Tonder has been employed to evaluate the effect of surface roughness. The related stochastically averaged Reynolds type equation is numerically solved to obtain the pressure distribution, leading to the calculation of load carrying capacity. The results presented in graphical form establish that the Kozeny-Carman model is more favorable as compared to the Irmay one from the design point of view. It is observed that the Shliomis model based ferrofluid lubrication performs relatively better than the Neuringer-Rosensweig one. Although the bearing suffers due to transverse surface roughness, with a suitable choice of curvature parameters and rotational ratio, the negative effect of porosity and standard deviation can be minimized by the ferrofluid lubrication at least in the case of negatively skewed roughness.
Inverse Diffusion Curves Using Shape Optimization.
Zhao, Shuang; Durand, Fredo; Zheng, Changxi
2018-07-01
The inverse diffusion curve problem focuses on automatic creation of diffusion curve images that resemble user provided color fields. This problem is challenging since the 1D curves have a nonlinear and global impact on resulting color fields via a partial differential equation (PDE). We introduce a new approach complementary to previous methods by optimizing curve geometry. In particular, we propose a novel iterative algorithm based on the theory of shape derivatives. The resulting diffusion curves are clean and well-shaped, and the final image closely approximates the input. Our method provides a user-controlled parameter to regularize curve complexity, and generalizes to handle input color fields represented in a variety of formats.
Hyper-and-elliptic-curve cryptography
Bernstein, D.J.; Lange, T.
2014-01-01
This paper introduces ‘hyper-and-elliptic-curve cryptography’, in which a single high-security group supports fast genus-2-hyperelliptic-curve formulas for variable-base-point single-scalar multiplication (for example, Diffie–Hellman shared-secret computation) and at the same time supports fast
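The core operation the paper optimizes, variable-base-point single-scalar multiplication as used in Diffie-Hellman shared-secret computation, can be illustrated on a toy curve. Everything below (the curve y² = x³ + 2x + 3 over GF(97), the base point, the secret scalars) is invented for illustration; real schemes use the high-security elliptic and genus-2 hyperelliptic constructions and parameter sizes described in the paper.

```python
# Toy illustration of Diffie-Hellman via double-and-add scalar multiplication
# on a tiny short-Weierstrass curve. Parameters are illustrative only.
P_MOD, A, B = 97, 2, 3          # tiny prime field and curve coefficients

def ec_add(p, q):
    """Group law on the curve; None represents the point at infinity."""
    if p is None:
        return q
    if q is None:
        return p
    (x1, y1), (x2, y2) = p, q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None                                   # inverse points
    if p == q:                                        # tangent (doubling) slope
        m = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD) % P_MOD
    else:                                             # chord slope
        m = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD
    x3 = (m * m - x1 - x2) % P_MOD
    return x3, (m * (x1 - x3) - y1) % P_MOD

def scalar_mul(k, p):
    """Double-and-add computation of k*p."""
    acc = None
    while k:
        if k & 1:
            acc = ec_add(acc, p)
        p = ec_add(p, p)
        k >>= 1
    return acc

G = (3, 6)                       # on the curve: 6^2 = 3^3 + 2*3 + 3 (mod 97)
alice, bob = 13, 29              # secret scalars
shared_a = scalar_mul(bob, scalar_mul(alice, G))   # Bob's view of the secret
shared_b = scalar_mul(alice, scalar_mul(bob, G))   # Alice's view of the secret
```

Both parties arrive at the same point because scalar multiplication commutes; the paper's contribution is making this operation fast in a single group that also supports fast fixed-base and double-scalar formulas.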
Approximation by planar elastic curves
DEFF Research Database (Denmark)
Brander, David; Gravesen, Jens; Nørbjerg, Toke Bjerge
2016-01-01
We give an algorithm for approximating a given plane curve segment by a planar elastic curve. The method depends on an analytic representation of the space of elastic curve segments, together with a geometric method for obtaining a good initial guess for the approximating curve. A gradient-driven optimization is then used to find the approximating elastic curve.
Li, Ying; Shi, Xiaohu; Liang, Yanchun; Xie, Juan; Zhang, Yu; Ma, Qin
2017-01-21
RNAs have been found to carry diverse functionalities in nature. Inferring the similarity between two given RNAs is a fundamental step in understanding and interpreting their functional relationship. The majority of functional RNAs show conserved secondary structures rather than sequence conservation, so algorithms relying on sequence-based features alone usually have limited prediction performance; integrating RNA structure features is therefore critical for RNA analysis. Existing algorithms mainly fall into two categories, alignment-based and alignment-free, with alignment-free RNA comparison usually having lower time complexity. An alignment-free RNA comparison algorithm is proposed, in which a novel numerical representation, RNA-TVcurve (triple vector curve representation), of the RNA sequence and its corresponding secondary structure features is provided. A multi-scale similarity score of two given RNAs is then designed based on wavelet decomposition of their numerical representations. In support of RNA mutation and phylogenetic analysis, a web server (RNA-TVcurve) was built on this alignment-free comparison algorithm. It provides three functional modules: 1) visualization of the numerical representation of RNA secondary structure; 2) detection of single-point mutations based on secondary structure; and 3) comparison of pairwise and multiple RNA secondary structures. The web server requires RNA primary sequences as input, while the corresponding secondary structures are optional; for primary sequences alone, it computes secondary structures with the free energy minimization algorithm of the RNAfold tool from the Vienna RNA package. RNA-TVcurve is the first integrated web server based on an alignment-free method to deliver a suite of RNA analysis functions, including visualization, mutation analysis and multiple RNA structure comparison. The comparison results with two popular RNA
Yamamoto, Yoshiaki; Tsunedomi, Ryouichi; Fujita, Yusuke; Otori, Toru; Ohba, Mitsuyoshi; Kawai, Yoshihisa; Hirata, Hiroshi; Matsumoto, Hiroaki; Haginaka, Jun; Suzuki, Shigeo; Dahiya, Rajvir; Hamamoto, Yoshihiko; Matsuyama, Kenji; Hazama, Shoichi; Nagano, Hiroaki; Matsuyama, Hideyasu
2018-03-30
We investigated the relationship between axitinib pharmacogenetics and clinical efficacy/adverse events in advanced renal cell carcinoma (RCC) and, in a phase II trial, established a model to predict clinical efficacy and adverse events using pharmacokinetics and gene polymorphisms related to drug metabolism and efflux. We prospectively evaluated the area under the plasma concentration-time curve (AUC) of axitinib, the objective response rate, and adverse events in 44 consecutive advanced RCC patients treated with axitinib. To establish a model for predicting clinical efficacy and adverse events, polymorphisms in genes including the ABC transporters ABCB1 and ABCG2, UGT1A, and OR2B11 were analyzed by whole-exome sequencing, Sanger sequencing, and DNA microarray. To validate this prediction model, the AUC calculated from 6 gene polymorphisms was prospectively compared with the actual AUC in 16 additional consecutive patients. Actual AUC significantly correlated with the objective response rate (P = 0.0002) and adverse events (hand-foot syndrome, P = 0.0055; and hypothyroidism, P = 0.0381). Calculated AUC significantly correlated with actual AUC and precisely predicted actual AUC after axitinib treatment (P = 0.0066). Our pharmacogenetics-based AUC prediction model may determine the optimal initial dose of axitinib, and thus facilitate better treatment of patients with advanced RCC.
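The AUC at the center of this model is, in standard non-compartmental analysis, the trapezoidal integral of the concentration-time samples. A minimal sketch with hypothetical sampling points (not the trial's data):

```python
import numpy as np

# Hypothetical plasma concentration-time samples (hours, ng/mL) after dosing
times = np.array([0.0, 1.0, 2.0, 4.0, 8.0, 12.0])
conc  = np.array([0.0, 20.0, 35.0, 28.0, 12.0, 5.0])

# AUC by the linear trapezoidal rule, the usual non-compartmental estimate
auc = float(np.sum((conc[1:] + conc[:-1]) / 2 * np.diff(times)))
```

The pharmacogenetic model in the paper predicts this quantity from genotype before dosing, rather than measuring it afterwards.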
International Nuclear Information System (INIS)
Zhou, J; Lasio, G; Chen, S; Zhang, B; Langen, K; Prado, K; D’Souza, W; Yi, B; Huang, J
2015-01-01
Purpose: To develop a CBCT HU correction method using a patient-specific HU to mass density conversion curve based on a novel image registration and organ mapping method for head-and-neck radiation therapy. Methods: There are three steps to generate a patient-specific CBCT HU to mass density conversion curve. First, we developed a novel robust image registration method based on sparseness analysis to register the planning CT (PCT) and the CBCT. Second, a novel organ mapping method was developed to transfer the organs-at-risk (OAR) contours from the PCT to the CBCT, and the corresponding mean HU values of each OAR were measured in both the PCT and CBCT volumes. Third, a set of PCT and CBCT HU to mass density conversion curves was created based on the mean HU values of the OARs and the corresponding mass densities of the OARs in the PCT. We then compared our proposed conversion curve with the traditional Catphan-phantom-based CBCT HU to mass density calibration curve. Both curves were input into the treatment planning system (TPS) for dose calculation. Last, the PTV and OAR doses, DVHs, and dose distributions of the CBCT plans were compared to the original treatment plan. Results: One head-and-neck case, containing a pair of PCT and CBCT volumes, was used. The dose differences between the PCT and CBCT plans using the proposed method are −1.33% for the mean PTV, 0.06% for PTV D95%, and −0.56% for the left neck. The dose differences between plans of the PCT and the CBCT corrected using the Catphan-based method are −4.39% for mean PTV, 4.07% for PTV D95%, and −2.01% for the left neck. Conclusion: The proposed CBCT HU correction method achieves better agreement with the original treatment plan than the traditional Catphan-based calibration method.
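Once the per-organ mean HU values are paired with known mass densities, the conversion curve itself is a piecewise-linear lookup. A minimal sketch with hypothetical calibration points (not the paper's measured OAR values):

```python
import numpy as np

# Hypothetical (HU, mass density g/cm^3) pairs, ordered by HU, standing in
# for the patient-specific conversion curve derived from OAR mean values
hu_points      = np.array([-1000.0, -100.0, 0.0, 60.0, 1000.0])
density_points = np.array([0.001, 0.93, 1.0, 1.06, 1.6])

def hu_to_density(hu):
    """Piecewise-linear lookup on the HU-to-mass-density calibration curve."""
    return np.interp(hu, hu_points, density_points)

rho_water = float(hu_to_density(0.0))    # HU = 0 should map to water density
```

The treatment planning system consumes exactly such a curve; the paper's contribution is generating the (HU, density) pairs per patient instead of from a phantom.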
Energy Technology Data Exchange (ETDEWEB)
Smith, A. M. S.; Anderson, D. R.; Hellier, C.; Maxted, P. F. L.; Smalley, B.; Southworth, J. [Astrophysics Group, Keele University, Staffordshire, ST5 5BG (United Kingdom); Collier Cameron, A. [SUPA, School of Physics and Astronomy, University of St Andrews, North Haugh, Fife, KY16 9SS (United Kingdom); Gillon, M.; Jehin, E. [Institut d' Astrophysique et de Geophysique, Universite de Liege, Allee du 6 Aout, 17 Bat. B5C, Liege 1 (Belgium); Lendl, M.; Queloz, D.; Triaud, A. H. M. J.; Pepe, F.; Segransan, D.; Udry, S. [Observatoire de Geneve, Universite de Geneve, 51 Chemin des Maillettes, 1290 Sauverny (Switzerland); West, R. G. [Department of Physics and Astronomy, University of Leicester, Leicester, LE1 7RH (United Kingdom); Barros, S. C. C.; Pollacco, D. [Astrophysics Research Centre, School of Mathematics and Physics, Queen' s University, University Road, Belfast, BT7 1NN (United Kingdom); Street, R. A., E-mail: amss@astro.keele.ac.uk [Las Cumbres Observatory, 6740 Cortona Drive Suite 102, Goleta, CA 93117 (United States)
2012-04-15
We report the discovery, from WASP and CORALIE, of a transiting exoplanet in a 1.54 day orbit. The host star, WASP-36, is a magnitude V = 12.7, metal-poor G2 dwarf (T_eff = 5959 ± 134 K), with [Fe/H] = −0.26 ± 0.10. We determine the planet to have mass and radius, respectively, 2.30 ± 0.07 and 1.28 ± 0.03 times that of Jupiter. We have eight partial or complete transit light curves, from four different observatories, which allow us to investigate the potential effects on the fitted system parameters of using only a single light curve. We find that the solutions obtained by analyzing each of these light curves independently are consistent with our global fit to all the data, despite the apparent presence of correlated noise in at least two of the light curves.
Tao, Laifa; Lu, Chen; Noktehdan, Azadeh
2015-10-01
Battery capacity estimation is a significant recent challenge, given the complex physical and chemical processes that occur within batteries and the restricted accessibility of capacity degradation data. In this study, we describe an approach called dynamic spatial time warping, which is used to determine the similarity of two arbitrary curves. Unlike classical dynamic time warping methods, this approach maintains the invariance of curve similarity under rotations and translations of the curves, which is vital in curve similarity search. Moreover, it utilizes online charging or discharging data that are easily collected and require no special assumptions. The accuracy of this approach is verified using NASA battery datasets. Results suggest that the proposed approach provides a highly accurate means of estimating battery capacity, at lower time cost than traditional dynamic time warping methods, for different individuals and under various operating conditions.
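For contrast with the proposed method, the classical dynamic time warping baseline it improves upon can be sketched in a few lines:

```python
import numpy as np

def dtw_distance(a, b):
    """Classical DTW distance between two 1-D series (O(n*m) table)."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Best of: insertion, deletion, match
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return float(D[n, m])

# A charging-style curve and a time-stretched copy align with zero cost
x = [0.0, 1.0, 2.0, 3.0, 2.0, 1.0, 0.0]
y = [0.0, 0.0, 1.0, 2.0, 3.0, 2.0, 1.0, 0.0]
d = dtw_distance(x, y)
```

Classical DTW is invariant to time stretching, as here, but not to rotations or translations of the curve, which is the gap the paper's spatial variant addresses.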
DEFF Research Database (Denmark)
Gómez Arranz, Paula; Vesth, Allan
This report describes the power curve measurements carried out on a given wind turbine in a chosen period. The measurements were carried out following the measurement procedure in the draft of IEC 61400-12-1 Ed. 2 [1], with some deviations, mostly regarding the uncertainty calculation. Here, the reference wind speed used in the power curve is the equivalent wind speed obtained from lidar measurements at several heights between the lower and upper blade tip, in combination with a hub-height meteorological mast. The measurements have been performed using DTU's measurement equipment, the analysis
Curved electromagnetic missiles
International Nuclear Information System (INIS)
Myers, J.M.; Shen, H.M.; Wu, T.T.
1989-01-01
Transient electromagnetic fields can exhibit interesting behavior in the limit of great distances from their sources. In situations of finite total radiated energy, the energy reaching a distant receiver can decrease with distance much more slowly than the usual r⁻². Cases of such slow decrease have been referred to as electromagnetic missiles. All of the wide variety of known missiles propagate in essentially straight lines. A sketch is presented here of a missile that can follow a path that is strongly curved. An example of a curved electromagnetic missile is explicitly constructed and some of its properties are discussed. References to details available elsewhere are given.
Algebraic curves and cryptography
Murty, V Kumar
2010-01-01
It is by now a well-known paradigm that public-key cryptosystems can be built using finite Abelian groups and that algebraic geometry provides a supply of such groups through Abelian varieties over finite fields. Of special interest are the Abelian varieties that are Jacobians of algebraic curves. All of the articles in this volume are centered on the theme of point counting and explicit arithmetic on the Jacobians of curves over finite fields. The topics covered include Schoof's ℓ-adic point counting algorithm, the p-adic algorithms of Kedlaya and Denef–Vercauteren, explicit arithmetic on
Learning from uncertain curves
DEFF Research Database (Denmark)
Mallasto, Anton; Feragen, Aasa
2017-01-01
We introduce a novel framework for statistical analysis of populations of nondegenerate Gaussian processes (GPs), which are natural representations of uncertain curves. This allows inherent variation or uncertainty in function-valued data to be properly incorporated in the population analysis. Us...
DEFF Research Database (Denmark)
Federici, Paolo; Kock, Carsten Weber
This report describes the power curve measurements performed with a nacelle LIDAR on a given wind turbine in a wind farm and during a chosen measurement period. The measurements and analysis are carried out in accordance to the guidelines in the procedure “DTU Wind Energy-E-0019” [1]. The reporting...
DEFF Research Database (Denmark)
Vesth, Allan; Kock, Carsten Weber
The report describes power curve measurements carried out on a given wind turbine. The measurements are carried out in accordance with Ref. [1]. A site calibration has been carried out, see Ref. [2], and the measured flow correction factors for different wind directions are used in the present analysis of power performance of the turbine.
DEFF Research Database (Denmark)
Federici, Paolo; Vesth, Allan
The report describes power curve measurements carried out on a given wind turbine. The measurements are carried out in accordance with Ref. [1]. A site calibration has been carried out; see Ref. [2], and the measured flow correction factors for different wind directions are used in the present analysis of power performance of the turbine.
DEFF Research Database (Denmark)
Villanueva, Héctor; Gómez Arranz, Paula
The report describes power curve measurements carried out on a given wind turbine. The measurements are carried out in accordance with Ref. [1]. A site calibration has been carried out; see Ref. [2], and the measured flow correction factors for different wind directions are used in the present analysis of power performance of the turbine.
Groot, L.F.M.|info:eu-repo/dai/nl/073642398
2008-01-01
The purpose of this paper is twofold. First, it exhibits that standard tools in the measurement of income inequality, such as the Lorenz curve and the Gini-index, can successfully be applied to the issues of inequality measurement of carbon emissions and the equity of abatement policies across
Hunter, Walter M.
This document contains detailed directions for constructing a device that mechanically produces the three-dimensional shape resulting from the rotation of any algebraic line or curve around either axis of the coordinate plane. The device was developed in response to student difficulty in visualizing, and thus grasping, the mathematical principles…
DEFF Research Database (Denmark)
Gómez Arranz, Paula; Wagner, Rozenn
This report describes the power curve measurements performed with a nacelle LIDAR on a given wind turbine in a wind farm and during a chosen measurement period. The measurements and analysis are carried out in accordance to the guidelines in the procedure “DTU Wind Energy-E-0019” [1]. The reporting...
DEFF Research Database (Denmark)
Vesth, Allan; Kock, Carsten Weber
The report describes power curve measurements carried out on a given wind turbine. The measurements are carried out in accordance with Ref. [1]. A site calibration has been carried out; see Ref. [2], and the measured flow correction factors for different wind directions are used in the present analysis of power performance of the turbine.
Textbook Factor Demand Curves.
Davis, Joe C.
1994-01-01
Maintains that teachers and textbook graphics follow the same basic pattern in illustrating changes in demand curves when product prices increase. Asserts that the use of computer graphics will enable teachers to be more precise in their graphic presentation of price elasticity. (CFR)
Bernstein, D.J.; Birkner, P.; Lange, T.; Peters, C.P.
2013-01-01
This paper introduces EECM-MPFQ, a fast implementation of the elliptic-curve method of factoring integers. EECM-MPFQ uses fewer modular multiplications than the well-known GMP-ECM software, takes less time than GMP-ECM, and finds more primes than GMP-ECM. The main improvements above the
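The elliptic-curve method itself can be sketched in a toy short-Weierstrass form (not the Edwards-curve arithmetic that makes EECM-MPFQ fast): compute bound!·P on a curve modulo n, and read a factor of n out of any modular inversion that fails modulo exactly one prime factor. All parameters below are illustrative.

```python
from math import gcd

def ecm_stage1(n, a=5, x0=1, y0=1, bound=20):
    """Toy stage-1 ECM on y^2 = x^3 + a*x + b (mod n), b implied by (x0, y0).

    Computes bound! * P by repeated addition; a group-law denominator that
    shares a factor with n exposes that factor.
    """
    def add(P, Q):
        # Group law mod n; None plays the role of the point at infinity.
        if P is None:
            return Q
        if Q is None:
            return P
        (x1, y1), (x2, y2) = P, Q
        if x1 == x2 and (y1 + y2) % n == 0:
            return None                      # P + (-P) = identity
        if P == Q:
            num, den = (3 * x1 * x1 + a) % n, (2 * y1) % n
        else:
            num, den = (y2 - y1) % n, (x2 - x1) % n
        g = gcd(den, n)
        if g == n:                           # degenerate case: give up quietly
            return None
        if g > 1:
            raise ValueError(g)              # non-trivial factor of n found
        lam = num * pow(den, -1, n) % n
        x3 = (lam * lam - x1 - x2) % n
        return (x3, (lam * (x1 - x3) - y1) % n)

    Q = (x0 % n, y0 % n)
    try:
        for k in range(2, bound + 1):        # after step k, Q = k! * P
            R = None
            for _ in range(k):
                R = add(R, Q)
            Q = R
    except ValueError as e:
        return e.args[0]
    return None                              # no factor found at this bound

factor = ecm_stage1(91)                      # 91 = 7 * 13
```

Real implementations replace repeated addition with fast scalar multiplication and, as in EECM-MPFQ, curve shapes with cheaper group laws.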
DEFF Research Database (Denmark)
Federici, Paolo; Kock, Carsten Weber
The report describes power curve measurements carried out on a given wind turbine. The measurements are carried out in accordance with Ref. [1]. A site calibration has been carried out; see Ref. [2], and the measured flow correction factors for different wind directions are used in the present analysis of power performance of the turbine.
LINS Curve in Romanian Economy
Directory of Open Access Journals (Sweden)
Emilian Dobrescu
2016-02-01
The paper presents theoretical considerations and empirical evidence to test the validity of the Laffer in Narrower Sense (LINS) curve as a parabola with a maximum. Attention is focused on the so-called legal-effective tax gap (letg). The econometric application is based on statistical data (1990-2013) for Romania as an emerging European economy. Three cointegrating regressions (fully modified least squares, canonical cointegrating regression, and dynamic least squares) and three algorithms based on instrumental variables (two-stage least squares, generalized method of moments, and limited information maximum likelihood) are involved.
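A LINS-type parabola with an interior maximum can be fit, and its peak located, with a simple quadratic regression. The data below are synthetic points on an inverted-U curve, not the Romanian series used in the paper:

```python
import numpy as np

# Synthetic (tax rate, revenue) points lying on an inverted-U parabola;
# illustrative only, not the legal-effective tax gap data
rate    = np.array([0.10, 0.20, 0.30, 0.40, 0.50, 0.60])
revenue = np.array([19.0, 31.0, 39.0, 43.0, 43.0, 39.0])

# Fit revenue = c2*r^2 + c1*r + c0; the maximum sits at r = -c1 / (2*c2)
c2, c1, c0 = np.polyfit(rate, revenue, 2)
peak_rate = float(-c1 / (2 * c2))
```

A negative fitted c2 confirms the parabola opens downward, which is the shape hypothesis the paper's cointegration machinery tests far more rigorously.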
Ait-Haddou, Rachid; Sakane, Yusuke; Nomura, Taishin
2013-01-01
We show that the generalized Bernstein bases in Müntz spaces defined by Hirschman and Widder (1949) and extended by Gelfond (1950) can be obtained as pointwise limits of the Chebyshev–Bernstein bases in Müntz spaces with respect to an interval [a,1] as the positive real number a converges to zero. Such a realization allows concepts of curve design such as the de Casteljau algorithm, blossoms, and dimension elevation to be transferred from the general theory of Chebyshev blossoms in Müntz spaces to these generalized Bernstein bases, which we term here Gelfond–Bernstein bases. The advantage of working with Gelfond–Bernstein bases lies in the simplicity of the obtained concepts and algorithms compared to their Chebyshev–Bernstein counterparts.
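The de Casteljau algorithm mentioned above, in its classical polynomial form (not the Müntz-space generalization the paper studies), is a short recursion of linear interpolations:

```python
def de_casteljau(points, t):
    """Evaluate a Bezier curve at parameter t by repeated interpolation."""
    pts = [(float(x), float(y)) for x, y in points]
    while len(pts) > 1:
        # Replace each adjacent pair with its (1 - t, t) blend
        pts = [((1 - t) * x0 + t * x1, (1 - t) * y0 + t * y1)
               for (x0, y0), (x1, y1) in zip(pts, pts[1:])]
    return pts[0]

# Midpoint of the quadratic Bezier with control points (0,0), (1,2), (2,0)
mid = de_casteljau([(0, 0), (1, 2), (2, 0)], 0.5)
```

The paper's contribution is that this same recursive structure carries over to the Gelfond–Bernstein bases in Müntz spaces.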
Directory of Open Access Journals (Sweden)
M. Franchini
2000-01-01
The sensitivity analysis described in Hashemi et al. (2000) is based on one-at-a-time perturbations to the model parameters. This type of analysis cannot highlight the presence of parameter interactions, which might indeed affect the characteristics of the flood frequency curve (ffc) even more than the individual parameters. For this reason, the effects of the parameters of the rainfall and rainfall-runoff models and of the potential evapotranspiration demand on the ffc are investigated here through an analysis of the results obtained from a factorial experimental design, where all the parameters are allowed to vary simultaneously. This latter, more complex, analysis confirms the results obtained in Hashemi et al. (2000), thus making the conclusions drawn there of wider validity and not strictly related to the reference set selected. However, it is shown that two-factor interactions are present not only between different pairs of parameters of an individual model, but also between pairs of parameters of different models, such as the rainfall and rainfall-runoff models, thus demonstrating the complex interaction between climate and basin characteristics affecting the ffc, and in particular its curvature. Furthermore, the wider range of climatic regime behaviour produced within the factorial experimental design shows that the probability distribution of soil moisture content at the storm arrival time is no longer sufficient to explain the link between the perturbations to the parameters and their effects on the ffc, as was suggested in Hashemi et al. (2000). Other factors have to be considered, such as the probability distribution of the soil moisture capacity, and the rainfall regime, expressed through the annual maximum rainfalls over different durations. Keywords: Monte Carlo simulation; factorial experimental design; analysis of variance (ANOVA)
Matsumoto, Atsushi; Tobias, Irwin; Olson, Wilma K
2005-01-01
Fine structural and energetic details embedded in the DNA base sequence, such as intrinsic curvature, are important to the packaging and processing of the genetic material. Here we investigate the internal dynamics of a 200 bp closed circular molecule with natural curvature using a newly developed normal-mode treatment of DNA in terms of neighboring base-pair "step" parameters. The intrinsic curvature of the DNA is described by a 10 bp repeating pattern of bending distortions at successive base-pair steps. We vary the degree of intrinsic curvature and the superhelical stress on the molecule and consider the normal-mode fluctuations of both the circle and the stable figure-8 configuration under conditions where the energies of the two states are similar. To extract the properties due solely to curvature, we ignore other important features of the double helix, such as the extensibility of the chain, the anisotropy of local bending, and the coupling of step parameters. We compare the computed normal modes of the curved DNA model with the corresponding dynamical features of a covalently closed duplex of the same chain length constructed from naturally straight DNA and with the theoretically predicted dynamical properties of a naturally circular, inextensible elastic rod, i.e., an O-ring. The cyclic molecules with intrinsic curvature are found to be more deformable under superhelical stress than rings formed from naturally straight DNA. As superhelical stress is accumulated in the DNA, the frequency, i.e., energy, of the dominant bending mode decreases in value, and if the imposed stress is sufficiently large, a global configurational rearrangement of the circle to the figure-8 form takes place. We combine energy minimization with normal-mode calculations of the two states to decipher the configurational pathway between the two states. We also describe and make use of a general analytical treatment of the thermal fluctuations of an elastic rod to characterize the
Directory of Open Access Journals (Sweden)
Yang Shuai
2016-01-01
Firstly, using the Monte Carlo method and simulation analysis, this paper builds models for the behaviour of electric vehicles: a conventional charging model and a fast charging model. Secondly, the paper studies the impact that the number of electric vehicles with access to the power grid has on the daily load curve. The paper then puts forward a dynamic electricity pricing mechanism and studies how this mechanism guides electric vehicles to charge in an orderly way. Finally, the paper presents a V2G mechanism under which electric vehicles can charge in an orderly way and take part in peak shaving. The research finds that massive electric vehicle access to the power grid will increase the peak-valley difference of the daily load curve, and that the dynamic pricing mechanism and the V2G mechanism can effectively lead electric vehicles to take part in peak shaving and optimize the daily load curve.
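The uncoordinated-charging step can be sketched as a small Monte Carlo simulation: draw charging start times from a behaviour model and accumulate them into a 24-hour load curve. The distribution and parameters below are illustrative assumptions, not the paper's models:

```python
import random

random.seed(0)                       # reproducible Monte Carlo draw
HOURS, N_EV, POWER_KW, DURATION_H = 24, 1000, 7.0, 3

# Uncoordinated charging: start hours clustered around the evening commute
load = [0.0] * HOURS
for _ in range(N_EV):
    start = int(random.gauss(18, 2)) % HOURS   # hypothetical behaviour model
    for h in range(start, start + DURATION_H):
        load[h % HOURS] += POWER_KW

peak_hour = max(range(HOURS), key=lambda h: load[h])
```

The evening clustering is what widens the peak-valley gap; dynamic pricing and V2G both work by shifting these start times (or reversing power flow) away from the peak hour.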
Remote sensing used for power curves
International Nuclear Information System (INIS)
Wagner, R; Joergensen, H E; Paulsen, U S; Larsen, T J; Antoniou, I; Thesbjerg, L
2008-01-01
Power curve measurement for large wind turbines requires taking into account more parameters than the wind speed at hub height alone. Based on results from aerodynamic simulations, an equivalent wind speed taking the wind shear into account was defined and found to reduce the power standard deviation in the power curve significantly. Two LiDARs and a SoDAR were used to measure the wind profile in front of a wind turbine, and these profiles were used to calculate the equivalent wind speed. The comparison of the power curves obtained with the three instruments to the traditional power curve, obtained using a cup anemometer measurement, confirms the results obtained from the simulations. Using LiDAR profiles reduces the error in power curve measurement when the LiDAR is used as a relative instrument together with a cup anemometer. Results from the SoDAR are not as promising, probably because of noisy measurements resulting in distorted profiles.
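A common form of the shear-aware equivalent wind speed is the cube root of a weighted mean of u³ over the measurement heights, since power scales with the cube of wind speed. A simplified sketch (uniform weights and a hypothetical lidar profile; in practice each height is weighted by the rotor-area fraction it represents):

```python
import numpy as np

def equivalent_wind_speed(speeds, weights=None):
    """Cube root of the weighted mean of u^3 across the profile heights."""
    u = np.asarray(speeds, dtype=float)
    w = np.full(len(u), 1.0 / len(u)) if weights is None else np.asarray(weights)
    return float(np.sum(w * u ** 3) ** (1.0 / 3.0))

# Hypothetical lidar profile between lower and upper blade tip (m/s)
u_eq = equivalent_wind_speed([7.0, 7.5, 8.0, 8.4, 8.7])
```

Binning power against u_eq instead of the hub-height speed is what removes much of the shear-induced scatter from the measured power curve.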
Energy Technology Data Exchange (ETDEWEB)
Groot, L. [Utrecht University, Utrecht School of Economics, Janskerkhof 12, 3512 BL Utrecht (Netherlands)
2008-11-15
The purpose of this paper is twofold. First, it exhibits that standard tools in the measurement of income inequality, such as the Lorenz curve and the Gini-index, can successfully be applied to the issues of inequality measurement of carbon emissions and the equity of abatement policies across countries. These tools allow policy-makers and the general public to grasp at a single glance the impact of conventional distribution rules such as equal caps or grandfathering, or more sophisticated ones, on the distribution of greenhouse gas emissions. Second, using the Samuelson rule for the optimal provision of a public good, the Pareto-optimal distribution of carbon emissions is compared with the distribution that follows if countries follow Nash-Cournot abatement strategies. It is shown that the Pareto-optimal distribution under the Samuelson rule can be approximated by the equal cap division, represented by the diagonal in the Lorenz curve diagram.
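The Lorenz/Gini machinery applied here to emissions is straightforward to compute; a minimal sketch over toy per-country emission vectors:

```python
import numpy as np

def gini(values):
    """Gini index via the area under the empirical Lorenz curve."""
    v = np.sort(np.asarray(values, dtype=float))
    n = len(v)
    # Lorenz curve ordinates: cumulative share held by the poorest k countries
    lorenz = np.concatenate(([0.0], np.cumsum(v) / v.sum()))
    area = np.sum((lorenz[1:] + lorenz[:-1]) / 2) / n   # trapezoidal rule
    return float(1.0 - 2.0 * area)

g_equal  = gini([1.0, 1.0, 1.0, 1.0])   # equal caps: perfect equality
g_skewed = gini([0.0, 0.0, 0.0, 4.0])   # one emitter holds everything
```

Equal caps give a Gini of 0 (the diagonal of the Lorenz diagram, which the paper identifies with the Pareto-optimal division under the Samuelson rule); concentrated emissions push the index toward 1.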
Pelce, Pierre
1989-01-01
In recent years, much progress has been made in the understanding of interface dynamics of various systems: hydrodynamics, crystal growth, chemical reactions, and combustion. Dynamics of Curved Fronts is an important contribution to this field and will be an indispensable reference work for researchers and graduate students in physics, applied mathematics, and chemical engineering. The book consists of a 100-page introduction by the editor and 33 seminal articles from various disciplines.
David G. Blanchflower; Andrew J. Oswald
1992-01-01
The paper provides evidence for the existence of a negatively sloped locus linking the level of pay to the rate of regional (or industry) unemployment. This "wage curve" is estimated using microeconomic data for Britain, the US, Canada, Korea, Austria, Italy, Holland, Switzerland, Norway, and Germany. The average unemployment elasticity of pay is approximately -0.1. The paper sets out a multi-region efficiency wage model and argues that its predictions are consistent with the data.
Anatomical curve identification
Bowman, Adrian W.; Katina, Stanislav; Smith, Joanna; Brown, Denise
2015-01-01
Methods for capturing images in three dimensions are now widely available, with stereo-photogrammetry and laser scanning being two common approaches. In anatomical studies, a number of landmarks are usually identified manually from each of these images and these form the basis of subsequent statistical analysis. However, landmarks express only a very small proportion of the information available from the images. Anatomically defined curves have the advantage of providing a much richer expression of shape. This is explored in the context of identifying the boundary of breasts from an image of the female torso and the boundary of the lips from a facial image. The curves of interest are characterised by ridges or valleys. Key issues in estimation are the ability to navigate across the anatomical surface in three-dimensions, the ability to recognise the relevant boundary and the need to assess the evidence for the presence of the surface feature of interest. The first issue is addressed by the use of principal curves, as an extension of principal components, the second by suitable assessment of curvature and the third by change-point detection. P-spline smoothing is used as an integral part of the methods but adaptations are made to the specific anatomical features of interest. After estimation of the boundary curves, the intermediate surfaces of the anatomical feature of interest can be characterised by surface interpolation. This allows shape variation to be explored using standard methods such as principal components. These tools are applied to a collection of images of women where one breast has been reconstructed after mastectomy and where interest lies in shape differences between the reconstructed and unreconstructed breasts. They are also applied to a collection of lip images where possible differences in shape between males and females are of interest. PMID:26041943
Estimating Corporate Yield Curves
Antionio Diaz; Frank Skinner
2001-01-01
This paper represents the first study of retail deposit spreads of UK financial institutions using stochastic interest rate modelling and the market comparable approach. By replicating quoted fixed deposit rates using the Black, Derman, and Toy (1990) stochastic interest rate model, we find that the spread between fixed and variable rates of interest can be modeled (and priced) using an interest rate swap analogy. We also find that we can estimate an individual bank deposit yield curve as a spr...
Vo, Martin
2017-08-01
Light Curves Classifier uses data mining and machine learning to obtain and classify desired objects. This task can be accomplished by attributes of light curves or any time series, including shapes, histograms, or variograms, or by other available information about the inspected objects, such as color indices, temperatures, and abundances. After specifying features which describe the objects to be searched, the software trains on a given training sample, and can then be used for unsupervised clustering for visualizing the natural separation of the sample. The package can be also used for automatic tuning parameters of used methods (for example, number of hidden neurons or binning ratio). Trained classifiers can be used for filtering outputs from astronomical databases or data stored locally. The Light Curve Classifier can also be used for simple downloading of light curves and all available information of queried stars. It natively can connect to OgleII, OgleIII, ASAS, CoRoT, Kepler, Catalina and MACHO, and new connectors or descriptors can be implemented. In addition to direct usage of the package and command line UI, the program can be used through a web interface. Users can create jobs for ”training” methods on given objects, querying databases and filtering outputs by trained filters. Preimplemented descriptors, classifier and connectors can be picked by simple clicks and their parameters can be tuned by giving ranges of these values. All combinations are then calculated and the best one is used for creating the filter. Natural separation of the data can be visualized by unsupervised clustering.
Potentiometric titration curves of aluminium salt solutions and its ...
African Journals Online (AJOL)
Potentiometric titration curves of aluminium salt solutions and its species conversion ... of aluminium salt solutions under the moderate slow rate of base injection. ... silicate radical, and organic acid radical on the titration curves and its critical ...
Uniformization of elliptic curves
Ülkem, Özge
2015-01-01
Every elliptic curve E defined over ℂ is analytically isomorphic to ℂ*/q^ℤ for some q ∊ ℂ*. Similarly, Tate has shown that if E is defined over a p-adic field K, then E is analytically isomorphic to K̄*/q^ℤ for some q ∊ K*. Further, the isomorphism E(K̄) ≅ K̄*/q^ℤ respects the action of the Galois group G(K̄/K), where K̄ is the algebraic closure of K. I will explain the construction of this isomorphism.
Modelling curves of manufacturing feasibilities and demand
Directory of Open Access Journals (Sweden)
Soloninko K.S.
2017-03-01
The authors research the functional properties of curves of manufacturing feasibilities and demand. Statement of the problem and its connection with important scientific and practical tasks: by its nature, the market economy is unstable and in constant movement, and the modelling of economic processes is an effective instrument for explaining changes in the economic environment. Such modelling depends first and foremost on building an economic model, which is the basis for the formalization of the economic process, that is, for building a mathematical model; an effective means of formalizing an economic process is to create a model of a hypothetical or imaginary economy. Building a demand model is significant for the market of goods and services. The problem consists in obtaining, as a result of the modelling, definite functional properties of the curves of manufacturing feasibilities and demand from which their mathematical model can be determined; a further problem lies in obtaining majorant properties of curves of joint demand on the market of goods and services. Analysis of the latest research and publications: many domestic and foreign scientists have dedicated their studies to researching and building models of the curves of manufacturing feasibilities and demand; in spite of this considerable work, questions such as the functional properties of the curves and their practical use in modelling remain open. The purpose of the article is to describe the functional properties of curves of manufacturing feasibilities and demand on the market of goods and services on the basis of modelling their construction. Scientific novelty and practical value: the theoretical results of the present research on the functional properties of these curves, such as convexity, give extra practical possibilities in a microeconomic
ROC curves for continuous data
Krzanowski, Wojtek J
2009-01-01
Since ROC curves have become ubiquitous in many application areas, the various advances have been scattered across disparate articles and texts. ROC Curves for Continuous Data is the first book solely devoted to the subject, bringing together all the relevant material to provide a clear understanding of how to analyze ROC curves. The fundamental theory of ROC curves: the book first discusses the relationship between the ROC curve and numerous performance measures and then extends the theory into practice by describing how ROC curves are estimated. Further building on the theory, the authors prese
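The basic construction the book formalizes can be sketched in a few lines. The following is an illustrative, dependency-free version (not taken from the book): it builds an empirical ROC curve by sweeping the score threshold from high to low and computes the area under it by the trapezoidal rule.

```python
def roc_curve(labels, scores):
    """Empirical ROC curve: (fpr, tpr) points obtained by lowering the
    score threshold one observation at a time. Assumes both classes
    are present in `labels` (1 = positive, 0 = negative)."""
    pairs = sorted(zip(scores, labels), key=lambda p: -p[0])
    n_pos = sum(labels)
    n_neg = len(labels) - n_pos
    fpr, tpr = [0.0], [0.0]
    tp = fp = 0
    for _, y in pairs:
        if y:
            tp += 1
        else:
            fp += 1
        fpr.append(fp / n_neg)
        tpr.append(tp / n_pos)
    return fpr, tpr

def auc(fpr, tpr):
    """Area under the ROC curve by the trapezoidal rule."""
    return sum((fpr[i + 1] - fpr[i]) * (tpr[i + 1] + tpr[i]) / 2
               for i in range(len(fpr) - 1))
```

Ties in the scores would need an extra grouping step in a production implementation; the sketch above processes one observation per threshold.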
Liu, Xiao-Hui; Wang, Wei-Liang; Lu, Shao-Yong; Wang, Yu-Fan; Ren, Zongming
2016-08-01
In Zaozhuang, economic development affects the discharge amount of industrial wastewater, chemical oxygen demand (COD), and ammonia nitrogen (NH3-N). To reveal the trend of water environmental quality related to the economy in Zaozhuang, this paper simulated the relationships between industrial wastewater discharge, COD and NH3-N load, and gross domestic product (GDP) per capita for Zaozhuang (2002-2012) using environmental Kuznets curve (EKC) models. The results showed that the added value of industrial GDP, the per capita GDP, and wastewater emission had average annual growth rates of 16.62, 16.19, and 17.89 %, respectively, from 2002 to 2012, while COD and NH3-N emission in 2012, compared with 2002, showed average annual decreases of 10.70 and 31.12 %, respectively. The EKC models revealed that industrial wastewater discharge had a typical inverted-U-shaped relationship with per capita GDP, whereas both COD and NH3-N followed the left (declining) side of a U-shaped curve. The economy in Zaozhuang had been at the "fast-growing" stage, with low environmental pollution according to the industrial pollution level. In recent years, Zaozhuang has abated its heavy-pollution industries emphatically, so pollutants have been greatly reduced. Thus, Zaozhuang industrial wastewater treatment has been quite effective, with water quality improved significantly. The EKC models provided scientific evidence for estimating industrial wastewater discharge, COD, and NH3-N load, as well as their trends, for Zaozhuang from an economic perspective.
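EKC-type relationships of this kind are commonly estimated by regressing a pollution indicator on income and its square; an inverted U corresponds to a negative quadratic coefficient. A minimal, dependency-free sketch of that fit (the regression form is the standard EKC specification, not Zaozhuang's specific model):

```python
def fit_quadratic(x, y):
    """Least-squares fit of y ≈ a + b*x + c*x^2 via the 3x3 normal
    equations, solved by Gauss-Jordan elimination with partial pivoting."""
    S = [sum(xi ** k for xi in x) for k in range(5)]            # moment sums
    T = [sum(yi * xi ** k for xi, yi in zip(x, y)) for k in range(3)]
    A = [[S[0], S[1], S[2], T[0]],
         [S[1], S[2], S[3], T[1]],
         [S[2], S[3], S[4], T[2]]]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(A[r][i]))        # pivot row
        A[i], A[p] = A[p], A[i]
        for r in range(3):
            if r != i:
                f = A[r][i] / A[i][i]
                A[r] = [a_r - f * a_i for a_r, a_i in zip(A[r], A[i])]
    a, b, c = (A[i][3] / A[i][i] for i in range(3))
    return a, b, c  # inverted U (EKC shape) when c < 0
```

With income in logs, a fitted c < 0 gives a turning point at x* = -b / (2c), the income level at which emissions would peak.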
International Nuclear Information System (INIS)
Mikhajlova, N.N.; Aristova, I.L.; Germanova, T.I.
2001-01-01
A large amount of digital seismic data from permanent and temporary seismic stations was acquired as a result of the detonation of large chemical explosions at the Semipalatinsk Test Site. All the records were collected, systematized and processed, and databases were created. Travel-time curves for regional Pn, Pg, Sn and Lg waves were constructed and compared with the ones used in routine earthquake processing practice. (author)
Directory of Open Access Journals (Sweden)
Je Hyun Baekt
2000-01-01
Full Text Available A numerical study is conducted on the fully-developed laminar flow of an incompressible viscous fluid in a square duct rotating about an axis perpendicular to the axial direction of the duct. In the straight duct, the rotation produces vortices due to the Coriolis force. Generally two vortex cells are formed, and the axial velocity distribution is distorted by the effect of this Coriolis force. When the convective force is weak, two counter-rotating vortices appear with a quasi-parabolic axial velocity profile at weak rotation rates. As the rotation rate increases, the axial velocity on the vertical centreline of the duct begins to flatten, and the location of the vorticity centre moves toward the wall under the effect of the Coriolis force. When the convective inertia force is strong, a double-vortex secondary flow appears in the transverse planes of the duct at weak rotation rates, but as the speed of rotation increases the secondary flow splits into an asymmetric configuration of four counter-rotating vortices. If the rotation rate is increased further, the secondary flow restabilizes to a slightly asymmetric double-vortex configuration. A numerical study is also conducted on the laminar flow of an incompressible viscous fluid in a 90°-bend square duct that rotates about an axis parallel to the axial direction of the inlet. In the 90°-bend square duct, the flow is shaped by both the centrifugal and the Coriolis force: the secondary flow is driven by the centrifugal force in the curved region and by the Coriolis force in the downstream region, where these forces respectively dominate.
Elliptic curves for applications (Tutorial)
Lange, T.; Bernstein, D.J.; Chatterjee, S.
2011-01-01
More than 25 years ago, elliptic curves over finite fields were suggested as a group in which the Discrete Logarithm Problem (DLP) can be hard. Since then many researchers have scrutinized the security of the DLP on elliptic curves, with the result that for suitably chosen curves only exponential attacks are known
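The group operation behind the elliptic-curve DLP is easy to state concretely. Below is a toy sketch of point addition and double-and-add scalar multiplication on the curve y² = x³ + 2x + 2 over F₁₇, a standard 19-element textbook example that is of course far too small for real security:

```python
P_MOD, A = 17, 2   # toy curve y^2 = x^3 + 2x + 2 over F_17

def ec_add(P, Q):
    """Group law; None represents the point at infinity."""
    if P is None:
        return Q
    if Q is None:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None                                   # P + (-P) = O
    if P == Q:                                        # tangent slope
        m = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD) % P_MOD
    else:                                             # chord slope
        m = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD
    x3 = (m * m - x1 - x2) % P_MOD
    return (x3, (m * (x1 - x3) - y1) % P_MOD)

def ec_mul(k, P):
    """Double-and-add scalar multiplication k*P."""
    R = None
    while k:
        if k & 1:
            R = ec_add(R, P)
        P = ec_add(P, P)
        k >>= 1
    return R
```

The three-argument `pow(x, -1, m)` modular inverse requires Python 3.8 or later. The DLP here is: given G and k*G, recover k — trivial on this 19-point group, but hard on curves with group orders of about 2^256.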
Global experience curves for wind farms
International Nuclear Information System (INIS)
Junginger, M.; Faaij, A.; Turkenburg, W.C.
2005-01-01
In order to forecast the technological development and cost of wind turbines and the production costs of wind electricity, frequent use is made of the so-called experience curve concept. Experience curves of wind turbines are generally based on data describing the development of national markets, which causes a number of problems when they are applied to global assessments. To analyze global wind energy price development more adequately, we compose a global experience curve. First, underlying factors for past and potential future price reductions of wind turbines are analyzed, and possible implications and pitfalls of applying the experience curve methodology are assessed. Second, we present and discuss a new approach to establishing a global experience curve and thus a global progress ratio for the investment cost of wind farms. Results show that global progress ratios for wind farms may lie between 77% and 85% (with an average of 81%), which is significantly more optimistic than the progress ratios applied in most current scenario studies and integrated assessment models. While the findings are based on a limited amount of data, they may indicate faster price reduction opportunities than so far assumed. With this global experience curve we aim to improve the reliability of describing the speed with which global costs of wind power may decline.
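The progress ratio behind such figures is the slope of a log-log regression of unit cost on cumulative capacity: cost = c0 * capacity^b, with PR = 2^b being the relative cost level after each doubling of cumulative capacity. A minimal sketch (the estimation detail is a generic least-squares fit, not the paper's exact procedure):

```python
import math

def progress_ratio(cum_capacity, unit_cost):
    """Fit cost = c0 * capacity^b by ordinary least squares in log-log
    space and return PR = 2**b, the cost level after each doubling."""
    lx = [math.log(c) for c in cum_capacity]
    ly = [math.log(c) for c in unit_cost]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(lx, ly))
         / sum((x - mx) ** 2 for x in lx))
    return 2 ** b
```

A PR of 0.81, the paper's average, means costs fall to 81% of their previous level with every doubling of cumulative installed capacity.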
Directory of Open Access Journals (Sweden)
Zeyu Liu
2018-01-01
Full Text Available A new fractional two-dimensional triangle function combination discrete chaotic map (2D-TFCDM) with the discrete fractional difference is proposed. We observe the bifurcation behaviors and draw the bifurcation diagrams, the largest Lyapunov exponent plot, and the phase portraits of the proposed map, respectively. On the application side, we apply the proposed discrete fractional map to image encryption, with the secret keys ciphered by the Menezes-Vanstone Elliptic Curve Cryptosystem (MVECC). Finally, the image encryption algorithm is analysed in four main aspects, which indicate that the proposed algorithm outperforms the alternatives.
Learning curves in health professions education.
Pusic, Martin V; Boutis, Kathy; Hatala, Rose; Cook, David A
2015-08-01
Learning curves, which graphically show the relationship between learning effort and achievement, are common in published education research but are not often used in day-to-day educational activities. The purpose of this article is to describe the generation and analysis of learning curves and their applicability to health professions education. The authors argue that the time is right for a closer look at using learning curves, given their desirable properties, to inform both self-directed instruction by individuals and education management by instructors. A typical learning curve is made up of a measure of learning (y-axis), a measure of effort (x-axis), and a mathematical linking function. At the individual level, learning curves make manifest a single person's progress towards competence, including his/her rate of learning, the inflection point where learning becomes more effortful, and the remaining distance to mastery attainment. At the group level, overlaid learning curves show the full variation of a group of learners' paths through a given learning domain. Specifically, they make overt the difference between time-based and competency-based approaches to instruction. Additionally, instructors can use learning curve information to more accurately target educational resources to those who most require them. The learning curve approach requires a fine-grained collection of data that will not be possible in all educational settings; however, the increased use of an assessment paradigm that explicitly includes effort and its link to individual achievement could result in increased learner engagement and more effective instructional design.
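The "mathematical linking function" can be as simple as a negative-exponential mastery model, score(x) = y_max * (1 - e^(-x/tau)), where tau summarizes an individual's learning rate. A crude, dependency-free sketch that recovers tau by grid search (both the model choice and the parameters are illustrative assumptions, not from the article):

```python
import math

def fit_exponential_learning(effort, score, y_max=100.0):
    """Grid-search tau for score ≈ y_max * (1 - exp(-effort/tau)),
    returning the tau (0.1..50.0 in steps of 0.1) that minimises the
    sum of squared errors. Crude but dependency-free."""
    best_tau, best_sse = None, None
    for t in range(1, 501):
        tau = t / 10
        sse = sum((y - y_max * (1 - math.exp(-x / tau))) ** 2
                  for x, y in zip(effort, score))
        if best_sse is None or sse < best_sse:
            best_tau, best_sse = tau, sse
    return best_tau
```

A smaller fitted tau means faster learning; comparing tau across learners is one way to target resources to those still far from mastery.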
Directory of Open Access Journals (Sweden)
Sergey A. Cherkis
2007-03-01
Full Text Available A typical solution of an integrable system is described in terms of a holomorphic curve and a line bundle over it. The curve provides the action variables while the time evolution is a linear flow on the curve's Jacobian. Even though the system of Nahm equations is closely related to the Hitchin system, the curves appearing in these two cases have very different nature. The former can be described in terms of some classical scattering problem while the latter provides a solution to some Seiberg-Witten gauge theory. This note identifies the setup in which one can formulate the question of relating the two curves.
DEFF Research Database (Denmark)
Akbari, Abolghasem; Samah, Azizan Abu; Daryabor, Farshid
2016-01-01
This study aims to develop a methodology for generating a flood runoff susceptibility (FRS) map using a revised curve number (CN) method. The study area is in the Kuantan watershed (KW), Malaysia, which was seriously affected by floods in December 2013 and December 2014. A revised runoff CN map.... Approximately 5% of the study area was identified as a very high-risk zone and 13% as a high-risk zone. However, the spatial extent of the high-risk zone in the downstream end and lowland areas of the KW could be considered the main cause of flood damage in recent years. From a practical point of view, the findings of this research provide a road map for government agencies to effectively implement flood mitigation projects in the study area.
Directory of Open Access Journals (Sweden)
Wei Zhao
2017-01-01
Full Text Available The ability to quantitatively evaluate the visual feedback of drivers has been considered primary research for reducing crashes in snow and ice (SI) environments. Chevron alignment signs of different colors cause diverse visual effects. However, the effect of Chevrons on visual feedback and on driving reactions while navigating curves in SI environments has not been adequately evaluated. The objective of this study is twofold: (1) an effective, long-term experiment was designed and developed to test the effect of colored Chevrons on drivers' vision and vehicle speed; (2) a new quantitative effect evaluation model is employed to measure the effect of different colors of Chevrons. Fixation duration and pupil size were used to describe the driver's visual response, and Cohen's d was used to evaluate the colors' psychological effect on drivers. The results showed the following: (1) after choosing the proper color for Chevrons, drivers reduced the speed of the vehicle while approaching the curves; (2) it was easier for drivers to identify the road alignment after setting the Chevrons; (3) Cohen's d values for different colors of Chevrons indicate different effect sizes. The conclusions provide useful references for freeway warning products and the design of intelligent vehicles.
Directory of Open Access Journals (Sweden)
Saad Sharawi
2013-10-01
Full Text Available Aim: Newcastle disease is still one of the major threats for the poultry industry all over the world. Therefore, an attempt was made in this study to use SYBR Green I real-time PCR with melting curve analysis for the detection and differentiation of NDV strains in suspected infected birds. Materials and Methods: Two sets of primers were used to amplify the matrix and fusion genes in samples collected from suspected infected birds (chickens and pigeons). Melting curve analysis in conjunction with real-time PCR was conducted to identify different pathotypes of the isolated NDVs. Clinical samples were propagated on specific pathogen free ECE and tested for MDT and ICPI. Results: The velogenic NDVs isolated from chickens and pigeons were distinguished with mean Tm values of 85.03±0.341 and 83.78±0.237, respectively, for M-gene amplification, and of 84.04±0.037 and 84.53±0.223 for F-gene amplification. On the other hand, the lentogenic NDV isolates, including the vaccinal strains (HB1 and LaSota), had higher mean Tm values (86.99±0.021 for M-gene amplification and 86.50±0.063 for F-gene amplification). The test showed no reaction with unrelated RNA samples. In addition, the results were in good agreement with both virus isolation and biological pathotyping (MDT and ICPI). The assay offers an attractive alternative method for the diagnosis of NDV that can be easily applied in laboratory diagnosis as a screening test for the detection and differentiation of NDV infections. Conclusion: As shown by the successful rapid detection and pathotyping of 15 NDV strains in clinical samples representing velogenic and lentogenic NDV strains, and by the agreement with the results of virus isolation, biological pathotyping and pathogenicity indices, the described SYBR Green I real-time RT-PCR assay in conjunction with melting curve analysis can be used as a rapid, specific and simple diagnostic tool for the detection and pathotyping of NDV.
International Nuclear Information System (INIS)
Szuta, M.; Dąbrowski, L.
2013-01-01
When the experimental critical fuel temperature, which depends on burn-up, is crossed, an onset of fission gas burst release is observed. This observed phenomenon can be explained by the assumption that the fission gas immobilization in uranium dioxide irradiated to a fluence greater than 10^19 fissions/cm^3 is mainly due to radiation induced chemical activity. Application of the "ab initio" method shows that the bond energies of xenon and krypton are equal to –1.23 eV and –3.42 eV, respectively. Assuming further that the chemically bound gas can be released mainly in the process of re-crystallization, and modifying Ainscough's differential equation of grain growth by including the burn-up dependence and the experimental data on limiting grain size as a function of fuel temperature for un-irradiated and irradiated fuel, we can re-construct the experimental curve of Vitanza. (authors)
Sadek, Mohammad
2010-01-01
In this thesis we give insight into the minimisation problem of genus one curves defined by equations other than Weierstrass equations. We are interested in genus one curves given as double covers of P1, plane cubics, or complete intersections of two quadrics in P3. By minimising such a curve we mean making the invariants associated to its defining equations as small as possible using a suitable change of coordinates. We study the non-uniqueness of minimisations of the genus one curves des...
International Nuclear Information System (INIS)
Yang, Xiaoli; Hofmann, Ralf; Dapp, Robin; Van de Kamp, Thomas; Rolo, Tomy dos Santos; Xiao, Xianghui; Moosmann, Julian; Kashef, Jubin; Stotzka, Rainer
2015-01-01
High-resolution, three-dimensional (3D) imaging of soft tissues requires the solution of two inverse problems: phase retrieval and the reconstruction of the 3D image from a tomographic stack of two-dimensional (2D) projections. The number of projections per stack should be small to accommodate fast tomography of rapid processes and to constrain X-ray radiation dose to optimal levels to either increase the duration of in vivo time-lapse series at a given goal for spatial resolution and/or the conservation of structure under X-ray irradiation. In pursuing the 3D reconstruction problem in the sense of compressive sampling theory, we propose to reduce the number of projections by applying an advanced algebraic technique subject to the minimisation of the total variation (TV) in the reconstructed slice. This problem is formulated in a Lagrangian multiplier fashion with the parameter value determined by appealing to a discrete L-curve in conjunction with a conjugate gradient method. The usefulness of this reconstruction modality is demonstrated for simulated and in vivo data, the latter acquired in parallel-beam imaging experiments using synchrotron radiation.
Zarzycki, Piotr; Thomas, Fabien
2006-10-15
The parallel shape of the potentiometric titration curves for montmorillonite suspension is explained using the surface complexation model and taking into account the surface heterogeneity. The homogeneous models give accurate predictions only if they assume unphysically large values of the equilibrium constants for the exchange process occurring on the basal plane. However, the assumption that the basal plane is energetically heterogeneous allows the experimental data (reported by Avena and De Pauli [M. Avena, C.P. De Pauli, J. Colloid Interface Sci. 202 (1998) 195-204]) to be fitted for a reasonable value of the exchange equilibrium constant equal to 1.26 (suggested by Fletcher and Sposito [P. Fletcher, G. Sposito, Clay Miner. 24 (1989) 375-391]). Moreover, we observed the typical behavior of the point of zero net proton charge (pznpc) as a function of the logarithm of the electrolyte concentration (log[C]). We showed that the slope of the linear dependence, pznpc=f(log[C]), is proportional to the number of isomorphic substitutions in the crystal phase, which has also been observed in experimental studies.
Remote sensing used for power curves
DEFF Research Database (Denmark)
Wagner, Rozenn; Ejsing Jørgensen, Hans; Schmidt Paulsen, Uwe
2008-01-01
Power curve measurement for large wind turbines requires taking into account more parameters than only the wind speed at hub height. Based on results from aerodynamic simulations, an equivalent wind speed taking the wind shear into account was defined and found to reduce the power standard deviation...
Numerical analysis of thermoluminescence glow curves
International Nuclear Information System (INIS)
Gomez Ros, J. M.; Delgado, A.
1989-01-01
This report presents a method for the numerical analysis of complex thermoluminescence glow curves, resolving the individual glow peak components. The method employs first-order kinetics analytical expressions and is based on a Marquardt-Levenberg minimization procedure. A simplified version of this method for thermoluminescence dosimetry (TLD), specifically developed to operate with lithium fluoride TLD-100, is also described. (Author). 36 refs
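The first-order peak shape such fits resolve is the Randall-Wilkins expression, I(T) = n0 s exp(-E/kT) exp(-(s/β) ∫ exp(-E/kT') dT'). The sketch below (parameter values are illustrative, not fitted TLD-100 values) evaluates one peak by direct numerical integration rather than by the report's analytical expressions:

```python
import math

K_B = 8.617e-5  # Boltzmann constant in eV/K

def glow_curve(E, s, beta=1.0, T0=300.0, T1=600.0, dT=0.1, n0=1.0):
    """Single first-order (Randall-Wilkins) glow peak for trap depth E (eV),
    frequency factor s (1/s) and heating rate beta (K/s), computed by
    accumulating the escape-rate integral step by step.
    Returns parallel lists of temperatures and intensities."""
    Ts, Is = [], []
    integral = 0.0
    T = T0
    while T <= T1:
        rate = s * math.exp(-E / (K_B * T))      # escape probability per second
        integral += rate / beta * dT             # running (s/beta)∫exp(-E/kT')dT'
        Ts.append(T)
        Is.append(n0 * rate * math.exp(-integral))
        T += dT
    return Ts, Is
```

A deconvolution routine of the kind the report describes would superpose several such peaks and adjust (E, s, n0) per peak with Marquardt-Levenberg iterations.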
Quantum fields in curved space
International Nuclear Information System (INIS)
Birrell, N.D.; Davies, P.C.W.
1982-01-01
The book presents a comprehensive review of the subject of gravitational effects in quantum field theory. Quantum field theory in Minkowski space, quantum field theory in curved spacetime, flat spacetime examples, curved spacetime examples, stress-tensor renormalization, applications of renormalization techniques, quantum black holes and interacting fields are all discussed in detail. (U.K.)
The Environmental Kuznets Curve. An empirical analysis for OECD countries
Energy Technology Data Exchange (ETDEWEB)
Georgiev, E.
2008-09-15
This paper tests the Environmental Kuznets Curve hypothesis for four local (SOx, NOx, CO, VOC) and two global (CO2, GHG) air pollutants. Using a new panel data set of thirty OECD countries, the paper finds that the postulated inverted U-shaped relationship between income and pollution does not hold for all gases. A meaningful Environmental Kuznets Curve exists only for CO, VOC and NOx, whereas for CO2 the curve is monotonically increasing. For GHG there is an indication of an inverted U-shaped relationship between income and pollution, but most countries are still on the increasing path of the curve, so the future development of the curve is uncertain. For SOx, emissions were found to follow a U-shaped curve. Based on the empirical results, the paper concludes that the Environmental Kuznets Curve does not hold for all gases; it is an empirical artefact rather than a regularity.
Experience Curves: A Tool for Energy Policy Assessment
Energy Technology Data Exchange (ETDEWEB)
Neij, Lena; Helby, Peter [Lund Univ. (Sweden). Environmental and Energy Systems Studies; Dannemand Andersen, Per; Morthorst, Poul Erik [Riso National Laboratory, Roskilde (Denmark); Durstewitz, Michael; Hoppe-Kilpper, Martin [Inst. fuer Solare Energieversorgungstechnik e.V., Kassel (DE); and others
2003-07-01
The objective of the project, Experience curves: a tool for energy policy assessment (EXTOOL), was to analyse the experience curve as a tool for the assessment of energy policy measures. This is of special interest, since using experience curves to assess energy policy measures requires developing the established experience curve methodology further; this development raises several questions, which have been addressed and analysed in this project. The analysis is based on case studies of wind power, an area with considerable experience in technology development, deployment and policy measures. A case study based on wind power therefore provides a good opportunity to study the usefulness of experience curves as a tool for the assessment of energy policy measures; however, the results are discussed in terms of using experience curves for the assessment of any energy technology. The project shows that experience curves can be used to assess the effect of combined policy measures in terms of cost reductions. Moreover, the results of the project show that experience curves can be used to analyse international 'learning systems', i.e. cost reductions brought about by the development of wind power and policy measures used in other countries. Nevertheless, the use of experience curves for the assessment of policy programmes has several limitations. First, the analysis and assessment of policy programmes cannot be achieved unless relevant experience curves based on good data can be developed. The authors are of the opinion that only studies that provide evidence of the validity, reliability and relevance of experience curves should be taken into account in policy making. Second, experience curves provide an aggregated picture of the situation, and more detailed analysis of the various sources of cost reduction, and of cost reductions resulting from individual policy measures, requires additional data and analysis tools. Third, we do not recommend the use of
Extended analysis of cooling curves
International Nuclear Information System (INIS)
Djurdjevic, M.B.; Kierkus, W.T.; Liliac, R.E.; Sokolowski, J.H.
2002-01-01
Thermal Analysis (TA) is the measurement of changes in a physical property of a material that is heated through a phase transformation temperature range. The temperature changes in the material are recorded as a function of the heating or cooling time in such a manner that allows for the detection of phase transformations. In order to increase accuracy, characteristic points on the cooling curve have been identified using the first derivative curve plotted versus time. In this paper, an alternative approach to the analysis of the cooling curve has been proposed. The first derivative curve has been plotted versus temperature and all characteristic points have been identified with the same accuracy achieved using the traditional method. The new cooling curve analysis also enables the Dendrite Coherency Point (DCP) to be detected using only one thermocouple. (author)
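The alternative the authors propose, the first derivative plotted against temperature rather than time, is straightforward to sketch numerically. The helper below is an illustrative central-difference implementation, not the authors' code:

```python
def derivative_vs_temperature(time, temp):
    """Central-difference cooling rate dT/dt paired with temperature:
    the (T, dT/dt) points one would plot to locate characteristic
    points (e.g. the dendrite coherency point) on a cooling curve."""
    return [(temp[i],
             (temp[i + 1] - temp[i - 1]) / (time[i + 1] - time[i - 1]))
            for i in range(1, len(temp) - 1)]
```

On real cooling-curve data, phase transformations show up as deviations of dT/dt from the smooth baseline; during steady cooling the derivative stays constant.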
Comparison of power curve monitoring methods
Directory of Open Access Journals (Sweden)
Cambron Philippe
2017-01-01
Full Text Available Performance monitoring is an important aspect of operating wind farms. This can be done through the power curve monitoring (PCM) of wind turbines (WT). In past years, important work has been conducted on PCM, and various methodologies have been proposed, each with interesting results. However, it is difficult to compare these methods because they have been developed on their respective data sets. The objective of the present work is to compare some of the proposed PCM methods using common data sets. The metric used to compare the PCM methods is the time needed to detect a change in the power curve. Two power curve models are covered to establish the effect the model type has on the monitoring outcomes; each model was tested with two control charts. Other methodologies and metrics proposed in the literature for power curve monitoring, such as areas under the power curve and the use of statistical copulas, have also been covered. Results demonstrate that model-based PCM methods are more reliable at detecting a performance change than other methodologies and that the effectiveness of the control chart depends on the type of shift observed.
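Every model-based PCM method starts from some estimate of the power curve itself; the simplest baseline is the method-of-bins average used in IEC-style power performance testing. A minimal sketch (the bin width is an illustrative choice, and this is not any of the paper's specific models):

```python
def binned_power_curve(wind_speed, power, bin_width=0.5):
    """Method-of-bins power curve: mean measured power per wind-speed bin,
    keyed by the bin centre. Residuals of new data against this curve are
    what a PCM control chart would track over time."""
    bins = {}
    for v, p in zip(wind_speed, power):
        bins.setdefault(int(v / bin_width), []).append(p)
    return {(b + 0.5) * bin_width: sum(ps) / len(ps)
            for b, ps in sorted(bins.items())}
```

A monitoring scheme would recompute the residual between observed power and the binned reference for each new period and feed it to a control chart to flag sustained shifts.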
Deep-learnt classification of light curves
DEFF Research Database (Denmark)
Mahabal, Ashish; Gieseke, Fabian; Pai, Akshay Sadananda Uppinakudru
2017-01-01
Astronomy light curves are sparse, gappy, and heteroscedastic. As a result, standard time series methods regularly used for financial and similar datasets are of little help, and astronomers are usually left to their own instruments and techniques to classify light curves. A common approach is to derive statistical features from the time series and to use machine learning methods, generally supervised, to separate objects into a few of the standard classes. In this work, we transform the time series to two-dimensional light curve representations in order to classify them using modern deep learning techniques. In particular, we show that convolutional neural network based classifiers work well for broad characterization and classification. We use labeled datasets of periodic variables from the CRTS survey and show how this opens doors for a quick classification of diverse classes with several...
Energy and GHG abatement cost curves
Energy Technology Data Exchange (ETDEWEB)
Alvarenga, Rafael [BHP Billiton Base Metals (Australia)
2010-07-01
Global warming, due especially to the emission of greenhouse gases (GHGs), has become a cause for serious concern. This paper discusses the steps taken by BHP Billiton to reduce energy consumption and GHG emissions using cost curves. According to forecasts, global warming is expected to impact Chile badly, with a rise in temperature of between 1 and more than 5 degrees Celsius. Mining in Chile consumes a lot of energy, particularly electricity; its share of total energy and electricity consumption in 2007 was 13% and 36%, respectively. BHP Base Metals developed a set of abatement cost curves for energy and GHG in Chile, and these are shown in figures. The methodology for the curves consisted of consultant visits to each mine operation. The study also includes mass-energy balances and feasibility maps. The paper concludes that it is important to evaluate the potential for reducing emissions and energy use and their associated costs.
According to Jim: The Flawed Normal Curve of Intelligence
Gallagher, James J.
2008-01-01
In this article, the author discusses the normal curve of intelligence, which he argues is flawed, and contends that wrong conclusions have been drawn from this spurious normal curve. An example is that of racial and ethnic differences, wherein some authors maintain that some ethnic and racial groups are clearly superior to others based on…
Computational aspects of algebraic curves
Shaska, Tanush
2005-01-01
The development of new computational techniques and better computing power has made it possible to attack some classical problems of algebraic geometry. The main goal of this book is to highlight such computational techniques related to algebraic curves. The area of research in algebraic curves is receiving more interest not only from the mathematics community, but also from engineers and computer scientists, because of the importance of algebraic curves in applications including cryptography, coding theory, error-correcting codes, digital imaging, computer vision, and many more.This book cove
Take, Makoto; Takeuchi, Tetsuya; Haresaku, Mitsuru; Matsumoto, Michiharu; Nagano, Kasuke; Yamamoto, Seigo; Takamura-Enya, Takeji; Fukushima, Shoji
2014-01-01
The present study investigated the time-course changes of the concentration of chloroform (CHCl3) in the blood during and after exposure of male rats to CHCl3 by inhalation. Increasing the dose of CHCl3 in the inhalation-exposed groups caused a commensurate increase in the concentration of CHCl3 in the blood and in the area under the blood concentration-time curve (AUC). There was good correlation (r = 0.988) between the inhalation dose and the AUC/kg body weight. Based on the AUC/kg body weight-inhalation dose curve and the AUC/kg body weight after oral administration, inhalation-equivalent doses of orally administered CHCl3 were calculated. Calculation of inhalation-equivalent doses allows the body burden due to CHCl3 from inhalation exposure and oral exposure to be directly compared; this type of comparison facilitates risk assessment in humans exposed to CHCl3 by different routes. Our results indicate that when calculating inhalation-equivalent doses of CHCl3, it is critical to include the AUC from the exposure period in addition to the AUC after the end of the exposure period. Thus, studies which measure the concentration of volatile organic compounds in the blood during the inhalation exposure period are crucial. The data reported here make an important contribution to the physiologically based pharmacokinetic (PBPK) database of CHCl3 in rodents.
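The AUC itself is typically computed by the trapezoidal rule over the sampled concentration-time points; the sketch below (with invented sample values) also shows the additivity that lets the exposure-period and post-exposure contributions be computed separately and summed:

```python
def auc_trapezoid(times, conc):
    """Area under the concentration-time curve by the trapezoidal rule.
    `times` and `conc` are parallel lists; times must be increasing."""
    return sum((times[i + 1] - times[i]) * (conc[i + 1] + conc[i]) / 2
               for i in range(len(times) - 1))
```

Because the rule is additive over segments, AUC(0..t_end_of_exposure) + AUC(t_end_of_exposure..t_final) equals the total AUC, which is the decomposition the abstract argues must not omit the first term.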
Energy Technology Data Exchange (ETDEWEB)
Calabri, L [CNR-INFM-National Research Center on nanoStructures and bioSystems at Surfaces (S3), Via Campi 213/a, 41100 Modena (Italy); Pugno, N [Department of Structural Engineering and Geotechnics, Politecnico di Torino, Turin (Italy); Ding, W [Department of Mechanical Engineering, Northwestern University, Evanston, IL 60208-3111 (United States); Ruoff, R S [Department of Mechanical Engineering, Northwestern University, Evanston, IL 60208-3111 (United States)
2006-08-23
The effects of non-ideal experimental configuration on the mechanical resonance of boron (B) nanowires (NWs) were studied to obtain the corrected value for the Young's modulus. The following effects have been theoretically considered: (i) the presence of intrinsic curvature, (ii) non-ideal clamps, (iii) spurious masses, (iv) a coating layer, and (v) large displacements. An energy-based analytical analysis was developed to treat such effects and their interactions. Here, we focus on treating the effect of the intrinsic curvature on the mechanical resonance. The analytical approach has been confirmed by numerical FEM analysis. A parallax method was used to obtain the three-dimensional geometry of the NW.
Precision-Recall-Gain Curves: PR Analysis Done Right
Flach, Peter; Kull, Meelis
2015-01-01
Precision-Recall analysis abounds in applications of binary classification where true negatives do not add value and hence should not affect assessment of the classifier's performance. Perhaps inspired by the many advantages of receiver operating characteristic (ROC) curves and the area under such curves for accuracy-based performance assessment, many researchers have taken to report Precision-Recall (PR) curves and associated areas as a performance metric. We demonstrate in this paper that thi...
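The Precision-Recall-Gain transformation in the title maps precision and recall onto a gain scale relative to the always-positive baseline; a sketch of the mapping as given by Flach and Kull, with illustrative numbers:

```python
def pr_gain(precision, recall, pi):
    """Precision-gain and recall-gain (Flach & Kull, 2015): rescale the
    attainable range [pi, 1] to [0, 1] on a harmonic (reciprocal) scale,
    where pi is the proportion of positives in the data."""
    prec_g = (precision - pi) / ((1 - pi) * precision)
    rec_g = (recall - pi) / ((1 - pi) * recall)
    return prec_g, rec_g

# Illustrative operating point: precision 0.8, recall 0.6, 25% positives.
pg, rg = pr_gain(0.8, 0.6, 0.25)
print(round(pg, 4), round(rg, 4))  # → 0.9167 0.7778
```

Plotting these transformed coordinates for every operating point yields the Precision-Recall-Gain curve, whose area has the linear-expectation properties PR curves lack.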
Path integrals on curved manifolds
International Nuclear Information System (INIS)
Grosche, C.; Steiner, F.
1987-01-01
A general framework for treating path integrals on curved manifolds is presented. We also show how to perform general coordinate and space-time transformations in path integrals. The main result is that one has to subtract a quantum correction ΔV ∝ ℏ² from the classical Lagrangian L, i.e. the correct effective Lagrangian to be used in the path integral is L_eff = L - ΔV. A general prescription for calculating the quantum correction ΔV is given. It is based on a canonical approach using Weyl-ordering and the Hamiltonian path integral defined by the midpoint prescription. The general framework is illustrated by several examples: the d-dimensional rotator, i.e. the motion on the sphere S^(d-1); the path integral in d-dimensional polar coordinates; the exact treatment of the hydrogen atom in R² and R³ by performing a Kustaanheimo-Stiefel transformation; the Langer transformation; and the path integral for the Morse potential. (orig.)
Designing the Alluvial Riverbeds in Curved Paths
Macura, Viliam; Škrinár, Andrej; Štefunková, Zuzana; Muchová, Zlatica; Majorošová, Martina
2017-10-01
The paper presents a method of determining the shape of the riverbed in curves of the watercourse, based on the method of Ikeda (1975) developed for a slightly curved path in a sandy riverbed. Regulated rivers have essentially slightly and smoothly curved paths; therefore, this methodology provides an appropriate basis for river restoration. Based on research in the experimental reach of the Holeška Brook and several alluvial mountain streams, the methodology was adjusted. The method also takes into account other important characteristics of the bottom material: the shape and orientation of the particles, settling velocity and drag coefficients. Thus, the method is mainly meant for natural sand-gravel material, which is heterogeneous and whose particle shape is very different from spherical. The calculation of the river channel in the curved path provides the basis for the design of an optimal habitat, but also for the design of the foundations of bankside armouring of the channel. The input data are adapted to the conditions of design practice.
51Cr - erythrocyte survival curves
International Nuclear Information System (INIS)
Paiva Costa, J. de.
1982-07-01
Sixteen subjects were studied: fifteen patients in a hemolytic state and one normal individual as a control. The aim was to obtain better techniques for the analysis of erythrocyte survival curves, according to the recommendations of the International Committee of Hematology. The radiochromium method was used as the tracer. A review of the international literature was first made on the aspects inherent to this work, making it possible to establish comparisons and to clarify phenomena observed in our investigation. Several parameters were considered in this study, affecting both the exponential and the linear curves. The analysis of the erythrocyte survival curves in the studied group revealed that the elution factor did not give a quantitatively homogeneous response in all cases; the results of the analysis of these curves were established through programs run on an electronic calculator. (Author)
Melting curves of gamma-irradiated DNA
International Nuclear Information System (INIS)
Hofer, H.; Altmann, H.; Kehrer, M.
1978-08-01
Melting curves of gamma-irradiated DNA, and data derived from them, are reported. The diminished stability is explained by base destruction. DNA denatures completely at room temperature if at least every fifth base pair is broken or weakened by irradiation. (author)
Management of the learning curve
DEFF Research Database (Denmark)
Pedersen, Peter-Christian; Slepniov, Dmitrij
2016-01-01
Purpose – This paper focuses on the management of the learning curve in overseas capacity expansions. The purpose of this paper is to unravel the direct as well as indirect influences on the learning curve and to advance the understanding of how these affect its management. Design… the dimensions of the learning process involved in a capacity expansion project and identified the direct and indirect labour influences on the production learning curve. On this basis, the study proposes solutions to managing learning curves in overseas capacity expansions. Furthermore, the paper concludes… with measures that have the potential to significantly reduce the non-value-added time when establishing new capacities overseas. Originality/value – The paper uses a longitudinal in-depth case study of a Danish wind turbine manufacturer and goes beyond a simplistic treatment of the lead time and learning…
Growth curves for Laron syndrome.
Laron, Z; Lilos, P; Klinger, B
1993-01-01
Growth curves for children with Laron syndrome were constructed on the basis of repeated measurements made throughout infancy, childhood, and puberty in 24 (10 boys, 14 girls) of the 41 patients with this syndrome investigated in our clinic. Growth retardation was already noted at birth, the birth length ranging from 42 to 46 cm in the 12/20 available measurements. The postnatal growth curves deviated sharply from the normal from infancy on. Both sexes showed no clear pubertal spurt. Girls co...
Intersection numbers of spectral curves
Eynard, B.
2011-01-01
We compute the symplectic invariants of an arbitrary spectral curve with only one branch point in terms of integrals of characteristic classes in the moduli space of curves. Our formula associates to any spectral curve a characteristic class, which is determined by the Laplace transform of the spectral curve. This is a hint to the key role of the Laplace transform in mirror symmetry. When the spectral curve is y=\\sqrt{x}, the formula gives the Kontsevich-Witten intersection numbers; when the spectral curve is chosen to be the Lambert function \\exp{x}=y\\exp{-y}, the formula gives the ELSV formula for Hurwitz numbers; and when one chooses the mirror of C^3 with framing f, i.e. \\exp{-x}=\\exp{-yf}(1-\\exp{-y}), the formula gives the Marino-Vafa formula, i.e. the generating function of Gromov-Witten invariants of C^3. In some sense this formula generalizes the ELSV, Marino-Vafa, and Mumford formulas.
Smith, Garon C.; Hossain, Md Mainul
2017-01-01
Species TOPOS is a free software package for generating three-dimensional (3-D) topographic surfaces ("topos") for acid-base equilibrium studies. This upgrade adds 3-D species distribution topos to earlier surfaces that showed pH and buffer capacity behavior during titration and dilution procedures. It constructs topos by plotting…
Modeling Patterns of Activities using Activity Curves.
Dawadi, Prafulla N; Cook, Diane J; Schmitter-Edgecombe, Maureen
2016-06-01
Pervasive computing offers an unprecedented opportunity to unobtrusively monitor behavior and use the large amount of collected data to perform analysis of activity-based behavioral patterns. In this paper, we introduce the notion of an activity curve, which represents an abstraction of an individual's normal daily routine based on automatically-recognized activities. We propose methods to detect changes in behavioral routines by comparing activity curves and use these changes to analyze the possibility of changes in cognitive or physical health. We demonstrate our model and evaluate our change detection approach using a longitudinal smart home sensor dataset collected from 18 smart homes with older adult residents. Finally, we demonstrate how big data-based pervasive analytics such as activity curve-based change detection can be used to perform functional health assessment. Our evaluation indicates that correlations do exist between behavior and health changes and that these changes can be automatically detected using smart homes, machine learning, and big data-based pervasive analytics.
Curve Boxplot: Generalization of Boxplot for Ensembles of Curves.
Mirzargar, Mahsa; Whitaker, Ross T; Kirby, Robert M
2014-12-01
In simulation science, computational scientists often study the behavior of their simulations by repeated solutions with variations in parameters and/or boundary values or initial conditions. Through such simulation ensembles, one can try to understand or quantify the variability or uncertainty in a solution as a function of the various inputs or model assumptions. In response to a growing interest in simulation ensembles, the visualization community has developed a suite of methods for allowing users to observe and understand the properties of these ensembles in an efficient and effective manner. An important aspect of visualizing simulations is the analysis of derived features, often represented as points, surfaces, or curves. In this paper, we present a novel, nonparametric method for summarizing ensembles of 2D and 3D curves. We propose an extension of a method from descriptive statistics, data depth, to curves. We also demonstrate a set of rendering and visualization strategies for showing rank statistics of an ensemble of curves, which is a generalization of traditional whisker plots or boxplots to multidimensional curves. Results are presented for applications in neuroimaging, hurricane forecasting and fluid dynamics.
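The data-depth idea behind the curve boxplot can be sketched with a simplified band depth (the J = 2 variant): a curve is deep if many pairs of ensemble curves enclose it pointwise. The ensemble below is a toy assumption, not data from the paper:

```python
import numpy as np
from itertools import combinations

def band_depth(curves):
    """Simplified band depth (J = 2): for each curve, the fraction of pairs of
    ensemble curves whose pointwise [min, max] band contains it everywhere."""
    n = len(curves)
    count = np.zeros(n)
    for i, j in combinations(range(n), 2):
        lo = np.minimum(curves[i], curves[j])
        hi = np.maximum(curves[i], curves[j])
        count += np.all((curves >= lo) & (curves <= hi), axis=1)
    return count / (n * (n - 1) / 2)

# Toy ensemble: three nested sine curves and one flat outlier.
x = np.linspace(0, 1, 50)
ensemble = np.array([np.sin(2 * np.pi * x),
                     0.5 * np.sin(2 * np.pi * x),
                     0.1 * np.sin(2 * np.pi * x),
                     3.0 + 0 * x])
d = band_depth(ensemble)
print(d.argmax())  # index of the deepest (most central) curve
```

Ranking curves by depth is what generalizes the median, box, and whiskers of a boxplot to an ensemble of curves.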
Statistical re-evaluation of the ASME KIC and KIR fracture toughness reference curves
International Nuclear Information System (INIS)
Wallin, K.; Rintamaa, R.
1998-01-01
Historically the ASME reference curves have been treated as representing absolute deterministic lower bound curves of fracture toughness. In reality, this is not the case. They represent only deterministic lower bound curves to a specific set of data, which represent a certain probability range. A recently developed statistical lower bound estimation method called the 'Master curve' has been proposed as a candidate for a new lower bound reference curve concept. From a regulatory point of view, the Master curve is somewhat problematic in that it does not claim to be an absolute deterministic lower bound, but corresponds to a specific theoretical failure probability that can be chosen freely based on application. In order to be able to substitute the old ASME reference curves with lower bound curves based on the Master curve concept, the inherent statistical nature (and confidence level) of the ASME reference curves must be revealed. In order to estimate the true inherent level of safety represented by the reference curves, the original data base was re-evaluated with statistical methods and compared to an analysis based on the Master curve concept. The analysis reveals that the 5% lower bound Master curve has the same inherent degree of safety as originally intended for the KIC reference curve. Similarly, the 1% lower bound Master curve corresponds to the KIR reference curve. (orig.)
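The Master curve concept discussed above is standardized in ASTM E1921; its fractile curves can be sketched as below. The reference temperature T0 is an assumed illustrative value, not one from the paper:

```python
import math

def master_curve_bound(T, T0, p):
    """Master-curve K_Jc fractile for 1T specimens (ASTM E1921 form), MPa*sqrt(m):
    K_Jc(p) = 20 + [ln(1/(1-p))]**(1/4) * (11 + 77*exp(0.019*(T - T0)))."""
    return 20 + math.log(1 / (1 - p)) ** 0.25 * (11 + 77 * math.exp(0.019 * (T - T0)))

T0 = -50.0                                  # assumed reference temperature, deg C
for T in (-100, -50, 0):
    k05 = master_curve_bound(T, T0, 0.05)   # 5% lower bound (KIC-like role)
    k01 = master_curve_bound(T, T0, 0.01)   # 1% lower bound (KIR-like role)
    print(T, round(k05, 1), round(k01, 1))
```

At T = T0 the p = 0.5 curve gives roughly 100 MPa·√m, consistent with the usual definition of T0 as the 100 MPa·√m median temperature.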
Designing an ASIP for cryptographic pairings over Barreto-Naehrig curves
Kammler, D.; Zhang, D.; Schwabe, P.; Scharwaechter, H.; Langenberg, M.; Auras, D.; Ascheid, G.; Mathar, R.; Clavier, C.; Gaj, K.
2009-01-01
This paper presents a design-space exploration of an application-specific instruction-set processor (ASIP) for the computation of various cryptographic pairings over Barreto-Naehrig curves (BN curves). Cryptographic pairings are based on elliptic curves over finite fields—in the case of BN curves a
Potekaev, A. I.; Kondratyuk, A. A.; Porobova, S. A.; Klopotov, A. A.; Markova, T. N.; Kakushkin, Yu A.; Klopotov, V. D.
2016-11-01
The paper presents an analysis of binary phase diagrams based on elements of groups VIIIA and IB of the periodic table and of the crystal-geometry parameters of solid solutions and intermetallic compounds. The analysis shows an explicit correlation between the type of phase-diagram evolution in Lebedev's classification and the nature of the deviations of the atomic volumes observed in solid solutions and intermetallic compounds from Zen's law.
Mengtong Jin; Haiquan Liu; Wenshuo Sun; Qin Li; Zhaohuan Zhang; Jibing Li; Yingjie Pan; Yong Zhao
2015-01-01
Vibrio parahaemolyticus is an important pathogen that causes foodborne illness associated with seafood. Therefore, rapid and reliable methods to detect and quantify the total viable V. parahaemolyticus in seafood are needed. In this assay, an RNA-based real-time reverse-transcriptase PCR (RT-qPCR) without an enrichment step has been developed for detection and quantification of the total viable V. parahaemolyticus in shrimp. RNA standards with the target segments were synthesized in vitro with T7 RNA p...
Speed Choice and Curve Radius on Rural Roads
DEFF Research Database (Denmark)
Rimme, Nicolai; Nielsen, Lea; Kjems, Erik
2016-01-01
…with informative speed-calming measures such as traffic signs, reflectors or surface painting. However, it has been the hypothesis that people reduce their speed insufficiently and drive too fast in most curved alignments – especially when they are driving there frequently. By knowing the speed near… and in the curved alignments compared to the geometry of the curved alignments, it can be clarified if, and which, speed-calming measures are required. Using GNSS-based floating car data (FCD) from driving cars, the speed near and in curved alignments is found. Single observations of FCD are connected to trips…
Considerations for reference pump curves
International Nuclear Information System (INIS)
Stockton, N.B.
1992-01-01
This paper examines problems associated with inservice testing (IST) of pumps to assess their hydraulic performance using reference pump curves to establish acceptance criteria. Safety-related pumps at nuclear power plants are tested under the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code (the Code), Section XI. The Code requires testing pumps at specific reference points of differential pressure or flow rate that can be readily duplicated during subsequent tests. There are many cases where test conditions cannot be duplicated. For some pumps, such as service water or component cooling pumps, the flow rate at any time depends on plant conditions and the arrangement of multiple independent and constantly changing loads. System conditions cannot be controlled to duplicate a specific reference value. In these cases, utilities frequently request to use pump curves for comparison of test data for acceptance. There is no prescribed method for developing a pump reference curve. The methods vary and may yield substantially different results. Some results are conservative when compared to the Code requirements; some are not. The errors associated with different curve testing techniques should be understood and controlled within reasonable bounds. Manufacturer's pump curves, in general, are not sufficiently accurate to use as reference pump curves for IST. Testing using reference curves generated with polynomial least squares fits over limited ranges of pump operation, cubic spline interpolation, or cubic spline least squares fits can provide a measure of pump hydraulic performance that is at least as accurate as the Code required method. Regardless of the test method, error can be reduced by using more accurate instruments, by correcting for systematic errors, by increasing the number of data points, and by taking repetitive measurements at each data point.
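A polynomial least-squares reference curve of the kind discussed above can be sketched as follows; the pump data and the acceptance comparison are hypothetical, not taken from any plant:

```python
import numpy as np

# Hypothetical IST data: flow rate (gpm) vs differential pressure (psid).
flow = np.array([100.0, 200.0, 300.0, 400.0, 500.0, 600.0])
dp = np.array([95.0, 92.0, 87.0, 80.0, 71.0, 60.0])

# Reference curve: least-squares quadratic over the tested operating range only.
ref_curve = np.poly1d(np.polyfit(flow, dp, deg=2))

# Acceptance check for a later test at a flow rate that cannot be duplicated:
test_flow, test_dp = 350.0, 84.6      # hypothetical new measurement
expected = ref_curve(test_flow)
deviation = (test_dp - expected) / expected
print(round(float(expected), 2), round(100 * float(deviation), 2))
```

The deviation would then be judged against the applicable acceptance band; extrapolating the fitted polynomial outside the tested flow range is exactly the kind of error source the paper warns about.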
PV experience curves for the Netherlands
International Nuclear Information System (INIS)
Gerwig, R.
2005-01-01
Experience curves are one of several tools used by policy makers to take a look at market development. Numerous curves have been constructed for PV but none specific to the Netherlands. The objective of this report is to take a look at the price development of grid-connected PV systems in the Netherlands using the experience curve theory. After a literature and internet search, and attempts to acquire information from PV companies, information on 51% of the total installed capacity was found. Curves for the period 1991-2001 were constructed based on system price, BOS (balance-of-system) price and inverter price. The progress ratio of the locally learning BOS was similar to that of the globally learning module market. This indicates that the pace of development of the Dutch PV market is similar to the globally followed pace. Improvement of the detail of the data might help to get a better idea of which BOS components have declined most in price. The similar progress ratio also shows the importance of investing both in module and system research, as is the case in the Netherlands.
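Experience-curve theory fits a power law to price versus cumulative capacity; the progress ratio is the price multiplier per doubling. A sketch with made-up numbers (an exact 80% learner), not the Dutch data from the report:

```python
import numpy as np

# Experience-curve model: price(x) = p0 * x**b, with x the cumulative installed
# capacity; the progress ratio PR = 2**b is the price multiplier per doubling.
capacity = np.array([1.0, 2.0, 4.0, 8.0, 16.0])   # cumulative MWp (illustrative)
price = np.array([10.0, 8.0, 6.4, 5.12, 4.096])   # EUR/Wp (illustrative)

# Fit the power law as a straight line in log-log space.
b, log_p0 = np.polyfit(np.log(capacity), np.log(price), 1)
progress_ratio = 2.0 ** b
print(round(progress_ratio, 3))  # → 0.8 (a 20% price decline per doubling)
```

Comparing such fitted progress ratios for system, BOS, and inverter prices is what the report's curve construction amounts to.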
Dual kinetic curves in reversible electrochemical systems.
Directory of Open Access Journals (Sweden)
Michael J Hankins
We introduce dual kinetic chronoamperometry, in which reciprocal relations are established between the kinetic curves of electrochemical reactions that start from symmetrical initial conditions. We have performed numerical and experimental studies in which the kinetic curves of the electron-transfer processes are analyzed for a reversible first order reaction. Experimental tests were done with the ferrocyanide/ferricyanide system in which the concentrations of each component could be measured separately using the platinum disk/gold ring electrode. It is shown that the proper ratio of the transient kinetic curves obtained from cathodic and anodic mass transfer limited regions gives thermodynamic time invariances related to the reaction quotient of the bulk concentrations. Therefore, thermodynamic time invariances can be observed at any time using the dual kinetic curves for reversible reactions. The technique provides a unique possibility to extract the non-steady state trajectory starting from one initial condition based only on the equilibrium constant and the trajectory which starts from the symmetrical initial condition. The results could impact battery technology by predicting the concentrations and currents of the underlying non-steady state processes in a wide domain from thermodynamic principles and limited kinetic information.
Curve Digitizer – A software for multiple curves digitizing
Directory of Open Access Journals (Sweden)
Florentin ŞPERLEA
2010-06-01
Curve Digitizer is software that extracts data from an image file representing a graphic and returns them as pairs of numbers which can then be used for further analysis and applications. The numbers can be read on a computer screen, stored in files or copied on paper. The final result is a data set that can be used with other tools such as MS Excel. Curve Digitizer provides a useful tool for any researcher or engineer interested in quantifying data displayed graphically. The image file can be obtained by scanning a document.
Whiley, David M; Jacob, Kevin; Nakos, Jennifer; Bletchly, Cheryl; Nimmo, Graeme R; Nissen, Michael D; Sloots, Theo P
2012-06-01
Numerous real-time PCR assays have been described for detection of the influenza A H275Y alteration. However, the performance of these methods can be undermined by sequence variation in the regions flanking the codon of interest. This is a problem encountered more broadly in microbial diagnostics. In this study, we developed a modification of hybridization probe-based melting curve analysis, whereby primers are used to mask proximal mutations in the sequence targets of hybridization probes, so as to limit the potential for sequence variation to interfere with typing. The approach was applied to the H275Y alteration of the influenza A (H1N1) 2009 strain, as well as a Neisseria gonorrhoeae mutation associated with antimicrobial resistance. Assay performances were assessed using influenza A and N. gonorrhoeae strains characterized by DNA sequencing. The modified hybridization probe-based approach proved successful in limiting the effects of proximal mutations, with the results of melting curve analyses being 100% consistent with the results of DNA sequencing for all influenza A and N. gonorrhoeae strains tested. Notably, these included influenza A and N. gonorrhoeae strains exhibiting additional mutations in hybridization probe targets. Of particular interest was that the H275Y assay correctly typed influenza A strains harbouring a T822C nucleotide substitution, previously shown to interfere with H275Y typing methods. Overall our modified hybridization probe-based approach provides a simple means of circumventing problems caused by sequence variation, and offers improved detection of the influenza A H275Y alteration and potentially other resistance mechanisms.
Statistical re-evaluation of the ASME KIC and KIR fracture toughness reference curves
International Nuclear Information System (INIS)
Wallin, K.
1999-01-01
Historically the ASME reference curves have been treated as representing absolute deterministic lower bound curves of fracture toughness. In reality, this is not the case. They represent only deterministic lower bound curves to a specific set of data, which represent a certain probability range. A recently developed statistical lower bound estimation method called the 'master curve' has been proposed as a candidate for a new lower bound reference curve concept. From a regulatory point of view, the master curve is somewhat problematic in that it does not claim to be an absolute deterministic lower bound, but corresponds to a specific theoretical failure probability that can be chosen freely based on application. In order to be able to substitute the old ASME reference curves with lower bound curves based on the master curve concept, the inherent statistical nature (and confidence level) of the ASME reference curves must be revealed. In order to estimate the true inherent level of safety represented by the reference curves, the original database was re-evaluated with statistical methods and compared to an analysis based on the master curve concept. The analysis reveals that the 5% lower bound master curve has the same inherent degree of safety as originally intended for the KIC reference curve. Similarly, the 1% lower bound master curve corresponds to the KIR reference curve. (orig.)
Growth curve models and statistical diagnostics
Pan, Jian-Xin
2002-01-01
Growth-curve models are generalized multivariate analysis-of-variance models. These models are especially useful for investigating growth problems over short time periods in economics, biology, medical research, and epidemiology. This book systematically introduces the theory of the GCM with particular emphasis on their multivariate statistical diagnostics, which are based mainly on recent developments made by the authors and their collaborators. The authors provide complete proofs of theorems as well as practical data sets and MATLAB code.
Bezier Curve Modeling for Neutrosophic Data Problem
Directory of Open Access Journals (Sweden)
Ferhat Tas
2017-02-01
Neutrosophic sets are defined by membership, non-membership and indeterminacy degrees, and they provide a way to represent problems in various fields. In this paper, a geometric model is introduced for the neutrosophic data problem for the first time. The model is based on neutrosophic sets and neutrosophic relations: neutrosophic control points are defined, and neutrosophic Bezier curves are obtained from them.
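The classical (crisp) Bezier evaluation underlying such a model can be sketched with de Casteljau's algorithm; attaching neutrosophic membership, non-membership and indeterminacy degrees to each control point is the paper's extension and is left out here:

```python
def de_casteljau(points, t):
    """Evaluate a Bezier curve at parameter t by repeated linear interpolation
    of the control polygon. `points` is a list of (x, y) control points."""
    pts = [tuple(p) for p in points]
    while len(pts) > 1:
        pts = [((1 - t) * x0 + t * x1, (1 - t) * y0 + t * y1)
               for (x0, y0), (x1, y1) in zip(pts, pts[1:])]
    return pts[0]

# A cubic Bezier curve through four crisp control points.
ctrl = [(0, 0), (1, 2), (3, 2), (4, 0)]
print(de_casteljau(ctrl, 0.5))  # → (2.0, 1.5)
```

In the neutrosophic setting the same interpolation would be applied degree-wise to the three truth components carried by each control point.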
2018-02-01
Horizontal curves are unavoidable in rural roads and are a serious crash risk to vehicle occupants. This study investigates the impact and effectiveness of three curve-based perceptual speed-calming countermeasures (advance curve warning signs, chevr...
Calibration curves for biological dosimetry
International Nuclear Information System (INIS)
Guerrero C., C.; Brena V., M. E-mail: cgc@nuclear.inin.mx
2004-01-01
The information generated by investigations in different laboratories around the world, including ININ, establishing that certain classes of chromosomal lesions increase as a function of dose and radiation type, has resulted in calibration curves that are applied in the technique known as biological dosimetry. This work presents a summary of the work done in the laboratory, which includes the calibration curves for 60Co gamma radiation and 250 kVp X-rays, examples of presumed exposure to ionizing radiation resolved by means of aberration analysis and the corresponding dose estimates through the equations of the respective curves, and finally a comparison between the dose calculations for the people affected by the Ciudad Juarez accident carried out by the Oak Ridge group (USA) and those obtained in this laboratory. (Author)
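Biological dosimetry calibration curves of this kind are commonly linear-quadratic in dose; a sketch of dose estimation by inverting such a curve. The coefficients are illustrative assumptions, not the laboratory's fitted values:

```python
import math

# Linear-quadratic calibration curve for aberration yield: Y = c + a*D + b*D**2,
# with D the absorbed dose in Gy. Coefficients are illustrative only.
c, a, b = 0.001, 0.03, 0.06      # background, Gy**-1, Gy**-2

def estimate_dose(y):
    """Invert Y = c + a*D + b*D**2 for the physically meaningful root D >= 0."""
    disc = a * a + 4 * b * (y - c)
    return (-a + math.sqrt(disc)) / (2 * b)

D = estimate_dose(0.25)          # observed yield: 0.25 aberrations per cell
print(round(D, 2))               # estimated dose in Gy
```

Each radiation quality (e.g. 60Co gamma rays vs 250 kVp X-rays) gets its own fitted (c, a, b), which is why separate calibration curves are maintained.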
Vertex algebras and algebraic curves
Frenkel, Edward
2004-01-01
Vertex algebras are algebraic objects that encapsulate the concept of operator product expansion from two-dimensional conformal field theory. Vertex algebras are fast becoming ubiquitous in many areas of modern mathematics, with applications to representation theory, algebraic geometry, the theory of finite groups, modular functions, topology, integrable systems, and combinatorics. This book is an introduction to the theory of vertex algebras with a particular emphasis on the relationship with the geometry of algebraic curves. The notion of a vertex algebra is introduced in a coordinate-independent way, so that vertex operators become well defined on arbitrary smooth algebraic curves, possibly equipped with additional data, such as a vector bundle. Vertex algebras then appear as the algebraic objects encoding the geometric structure of various moduli spaces associated with algebraic curves. Therefore they may be used to give a geometric interpretation of various questions of representation theory. The book co...
Curve collection, extension of databases
International Nuclear Information System (INIS)
Gillemot, F.
1992-01-01
Full text: Databases generally contain calculated data only, while the original measurements are diagrams, so information is lost between them. Research such as irradiation, aging and creep testing is expensive, and the original curves should therefore be stored for reanalysis. Format of the stored curves: (a) data as numbers in ASCII files; (b) other information as strings in a second file with the same name but a different extension, where the extension shows the type of the test and the type of the file. Examples: TEN is tensile information, TED is tensile data, CHN is Charpy information, CHD is Charpy data. Storing techniques: digitized measurements, and digitizing old curves stored on paper. Uses: making catalogues, reanalysis, comparison with new data. Tools: mathematical software packages such as Quattro, Genplot, Excel, MathCAD, QBasic, Pascal, Fortran, MATLAB, Grapher, etc. (author)
Rational points on elliptic curves
Silverman, Joseph H
2015-01-01
The theory of elliptic curves involves a pleasing blend of algebra, geometry, analysis, and number theory. This book stresses this interplay as it develops the basic theory, thereby providing an opportunity for advanced undergraduates to appreciate the unity of modern mathematics. At the same time, every effort has been made to use only methods and results commonly included in the undergraduate curriculum. This accessibility, the informal writing style, and a wealth of exercises make Rational Points on Elliptic Curves an ideal introduction for students at all levels who are interested in learning about Diophantine equations and arithmetic geometry. Most concretely, an elliptic curve is the set of zeroes of a cubic polynomial in two variables. If the polynomial has rational coefficients, then one can ask for a description of those zeroes whose coordinates are either integers or rational numbers. It is this number theoretic question that is the main subject of this book. Topics covered include the geometry and ...
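The chord-and-tangent group law on an elliptic curve, central to the book's subject, can be sketched over the rationals; exact arithmetic keeps the rational points rational:

```python
from fractions import Fraction

def ec_add(P, Q, a):
    """Chord-and-tangent addition on y**2 = x**3 + a*x + b; None represents
    the point at infinity (the group identity)."""
    if P is None:
        return Q
    if Q is None:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and y1 == -y2:
        return None                               # P + (-P) = O
    if P == Q:
        lam = (3 * x1 * x1 + a) / (2 * y1)        # tangent slope
    else:
        lam = (y2 - y1) / (x2 - x1)               # chord slope
    x3 = lam * lam - x1 - x2
    return (x3, lam * (x1 - x3) - y1)

# y**2 = x**3 - x + 1 has the rational point P = (1, 1); doubling gives (-1, 1).
P = (Fraction(1), Fraction(1))
print(ec_add(P, P, a=Fraction(-1)))
```

Iterating this addition generates further rational points, which is exactly the number-theoretic question the book pursues.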
Theoretical melting curve of caesium
International Nuclear Information System (INIS)
Simozar, S.; Girifalco, L.A.; Pennsylvania Univ., Philadelphia
1983-01-01
A statistical-mechanical model is developed to account for the complex melting curve of caesium. The model assumes the existence of three different species of caesium defined by three different electronic states. On the basis of this model, the free energy of melting and the melting curve are computed up to 60 kbar, using the solid-state data and the initial slope of the fusion curve as input parameters. The calculated phase diagram agrees with experiment to within the experimental error. Other thermodynamic properties including the entropy and volume of melting were also computed, and they agree with experiment. Since the theory requires only one adjustable constant, this is taken as strong evidence that the three-species model is satisfactory for caesium. (author)
Complexity of Curved Glass Structures
Kosić, T.; Svetel, I.; Cekić, Z.
2017-11-01
Despite the increasing amount of research on architectural structures of curvilinear form and the technological and practical improvements in glass production observed over recent years, there is still a lack of comprehensive codes and standards, recommendations and experience data linked to real-life curved glass structure applications regarding design, manufacture, use, performance and economy. However, more and more complex buildings and structures with large areas of geometrically complex glass envelopes are built every year. The aim of the presented research is to collect data on the existing design philosophy from curved glass structure cases. The investigation includes a survey about how architects and engineers deal with different design aspects of curved glass structures, with a special focus on the design and construction process, glass types and structural and fixing systems. The current paper gives a brief overview of the survey findings.
Zhu, Linqi; Zhang, Chong; Zhang, Chaomo; Wei, Yang; Zhou, Xueqing; Cheng, Yuan; Huang, Yuyang; Zhang, Le
2018-06-01
There is increasing interest in shale gas reservoirs due to their abundant reserves. As a key evaluation criterion, the total organic carbon content (TOC) of a reservoir reflects its hydrocarbon generation potential. Existing TOC calculation models are not very accurate, and there is still room for improvement. In this paper, an integrated hybrid neural network (IHNN) model is proposed for predicting the TOC. It addresses an inherent problem of existing algorithms: much of the TOC information comes from low-TOC reservoirs, where the TOC is easy to evaluate. A comparison of the prediction models established on 132 rock samples from the shale gas reservoir in the Jiaoshiba area shows that the accuracy of the proposed IHNN model is much higher than that of the other models: the mean square error on samples not used in establishing the models was reduced from 0.586 to 0.442. The results show that TOC prediction is easier after the logging-based prediction has been improved. Furthermore, this paper puts forward the next research direction for the prediction model. The IHNN algorithm can help evaluate the TOC of a shale gas reservoir.
Noor, M. J. Md; Ibrahim, A.; Rahman, A. S. A.
2018-04-01
Small-strain triaxial measurement is considered significantly more accurate than external strain measurement using the conventional method, which is prone to systematic errors. Three submersible miniature linear variable differential transducers (LVDTs) were mounted on yokes clamped directly onto the soil sample, spaced equally at 120° from one another. The setup, using a 0.4 N resolution load cell and a 16-bit AD converter, was capable of consistently resolving displacements of less than 1 µm and measuring axial strains ranging from less than 0.001% to 2.5%. Further analysis of the small-strain local measurement data was performed using the new Normalized Multiple Yield Surface Framework (NRMYSF) method and compared with the existing Rotational Multiple Yield Surface Framework (RMYSF) prediction method. The prediction of shear strength based on the combined intrinsic curvilinear shear strength envelope using small-strain triaxial test data confirmed the significant improvement and reliability of the measurement and analysis methods. Moreover, the NRMYSF method shows excellent data prediction and a significant improvement toward more reliable prediction of soil strength, which can reduce the cost and time of laboratory testing.
IDF-curves for precipitation in Belgium
International Nuclear Information System (INIS)
Mohymont, Bernard; Demarée, Gaston R.
2004-01-01
The Intensity-Duration-Frequency (IDF) curves for precipitation constitute a relationship between the intensity, the duration and the frequency of rainfall amounts. The intensity of precipitation is expressed in mm/h, the duration or aggregation time is the length of the interval considered, while the frequency stands for the probability of occurrence of the event. IDF curves constitute a classical and useful tool that is primarily used to dimension hydraulic structures in general, e.g. sewer systems, and which is consequently used to assess the risk of inundation. In this presentation, the IDF relation for precipitation is studied for different locations in Belgium. These locations correspond to two long-term, high-quality precipitation networks of the RMIB: (a) the daily precipitation depths of the climatological network (more than 200 stations, 1951-2001 baseline period); (b) the high-frequency 10-minute precipitation depths of the hydrometeorological network (more than 30 stations, 15 to 33 years baseline period). For the station of Uccle, an uninterrupted time series of more than one hundred years of 10-minute rainfall data is available. The proposed technique for assessing the curves is based on maximum annual values of precipitation. A new analytical formula for the IDF curves was developed such that these curves stay valid for aggregation times ranging from 10 minutes to 30 days (when fitted with appropriate data). Moreover, all parameters of this formula have physical dimensions. Finally, adequate spatial interpolation techniques are used to provide nationwide extreme precipitation depths for short- to long-term durations with a given return period. These values are estimated on the grid points of the Belgian ALADIN domain used in the operational weather forecasts at the RMIB. (Author)
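The general shape of such a relation can be illustrated with a toy Montana-type formula, i = a / (t + c)^b. This is an illustrative classical form with made-up coefficients, not the new analytical formula developed by the authors:

```python
def idf_intensity(duration_min, a, b, c):
    """Generic IDF relation i = a / (duration + c)**b (mm/h).

    Illustrative classical (Montana/Talbot-type) form only; the authors'
    formula, valid from 10 minutes to 30 days, is not reproduced here.
    The parameters a, b, c are hypothetical placeholders.
    """
    return a / (duration_min + c) ** b

# Intensity decreases with aggregation time, as any IDF curve must.
i10 = idf_intensity(10.0, a=1000.0, b=0.8, c=10.0)
i60 = idf_intensity(60.0, a=1000.0, b=0.8, c=10.0)
assert i10 > i60
```

In a full IDF analysis, a separate parameter set (or a frequency factor) is fitted per return period from the annual maxima.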
TELECOMMUNICATIONS INFRASTRUCTURE AND GDP /JIPP CURVE/
Directory of Open Access Journals (Sweden)
Mariana Kaneva
2016-07-01
The relationship between telecommunications infrastructure and economic activity is discussed in many scientific papers, most of which use the Jipp curve for research and analysis. Considerable doubt about the correctness of the Jipp curve arises when econometric models are applied. The aim of this study is a review of the Jipp curve, refining the possibility of its application under modern conditions. The methodology used in the study is based on dynamic econometric models, including tests for nonstationarity and tests for causality. The focus of this study is directed at methodological problems in measuring the local density of different types of telecommunication networks. The study offers a specific methodology for assessing Jipp's law through a VAR approach and Granger causality tests. It is shown that the mechanical substitution of momentary aggregated variables (such as the number of subscribers of a telecommunication network at the end of the year) and periodically aggregated variables (such as GDP per capita) into Jipp's curve is methodologically wrong. Researchers have to reconsider the relationship set out in Jipp's curve by including additional variables that characterize the telecommunications sector and the economic activity in a particular country within a specified time period. GDP per capita should not be regarded as the single factor determining the local density of telecommunications infrastructure. New econometric models studying the relationship between investments in telecommunications infrastructure and economic development may be not only linear regression models but also other econometric models, and new models should be proposed only after testing and validation against sound economic theory and econometric methodology.
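The naive reading of the Jipp curve that the abstract criticises, a log-log regression of teledensity on GDP per capita, can be sketched as follows. The data are synthetic; the paper's point is precisely that this simple regression is insufficient without stationarity and Granger-causality testing:

```python
import math

def loglog_slope(gdp_per_capita, teledensity):
    """OLS slope of log(teledensity) on log(GDP per capita):
    the elasticity that a naive Jipp-curve fit estimates.
    """
    xs = [math.log(x) for x in gdp_per_capita]
    ys = [math.log(y) for y in teledensity]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

# Synthetic data generated with elasticity 1.5 recovers that slope.
gdp = [1000.0, 2000.0, 4000.0, 8000.0]
tel = [g ** 1.5 for g in gdp]
assert abs(loglog_slope(gdp, tel) - 1.5) < 1e-9
```

A VAR approach with causality tests, as the study advocates, would instead model both series jointly over time rather than assume this one-way static relation.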
Deshmukh, Dhananjay Suresh; Chaube, Umesh Chandra; Ekube Hailu, Ambaye; Aberra Gudeta, Dida; Tegene Kassa, Melaku
2013-06-01
The curve number (CN), which represents runoff potential, is estimated using three different methods for three watersheds, namely the Barureva, Sher and Umar watersheds in the Narmada basin. Among the three, the Sher watershed has a gauging site for runoff measurements. The CN computed from observed rainfall-runoff events is termed CN(PQ), that from land use and land cover (LULC) is termed CN(LU), and the CN adjusted for land slope is termed SACN2. The estimated annual CN(PQ) varies from 69 to 87 over the 26-year data period, with a median of 74 and an average of 75. CN(PQ) values from 70 to 79 are the most significant and truly represent the AMC II condition for the Sher watershed. The annual CN(LU) was computed for all three watersheds using GIS for the years 1973, 1989 and 2000; satellite imagery from the MSS, TM and ETM+ sensors is available for these years from the Global Land Cover Facility Data Center of the University of Maryland, USA. The computed CN(LU) values show a rising trend with time, attributed to the expansion of agricultural area in all watersheds. The predicted values of CN(LU) over time (year) can be used to predict runoff potential under the effect of change in LULC. Comparison of CN(LU) and CN(PQ) values shows close agreement, which also validates the LULC classification. The estimation of the slope-adjusted SACN2 shows a significant difference from the conventional CN for hilly forest lands. For micro-watershed planning, the SCS-CN method should be modified to incorporate the effect of change in land use and land cover along with the effect of land slope.
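For orientation, the standard SCS-CN rainfall-runoff relation that underlies all three CN estimates can be written out directly (metric form, with the usual initial abstraction Ia = 0.2S; rainfall values below are illustrative):

```python
def scs_runoff(p_mm, cn):
    """SCS-CN direct runoff Q (mm) from storm rainfall P (mm), metric form:
    S = 25400/CN - 254, Ia = 0.2*S, Q = (P - Ia)**2 / (P - Ia + S).
    """
    s = 25400.0 / cn - 254.0   # potential maximum retention (mm)
    ia = 0.2 * s               # initial abstraction
    if p_mm <= ia:
        return 0.0             # all rainfall abstracted, no runoff
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# A higher CN (e.g. after agricultural expansion) yields more runoff
# from the same 100 mm storm.
assert scs_runoff(100.0, 80) > scs_runoff(100.0, 70)
```

Slope-adjusted variants such as SACN2 modify the CN itself before applying this relation.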
Tracing a planar algebraic curve
International Nuclear Information System (INIS)
Chen Falai; Kozak, J.
1994-09-01
In this paper, an algorithm that determines a real algebraic curve is outlined. Its basic step is to divide the plane into subdomains that include only simple branches of the algebraic curve without singular points. Each of the branches is then stably and efficiently traced in the particular subdomain. Except for the tracing, the algorithm requires only a couple of simple operations on polynomials that can be carried out exactly if the coefficients are rational, and the determination of zeros of several polynomials of one variable. (author). 5 refs, 4 figs
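A minimal predictor-corrector tracer conveys the flavour of the tracing step. This is a generic sketch, not the paper's algorithm: the subdivision machinery that isolates singular points is omitted, and the unit circle below is just a convenient test curve:

```python
import math

def trace_curve(f, fx, fy, x0, y0, step, n_steps):
    """Numerically trace a branch of f(x, y) = 0 from a point on it.

    Predictor: move along the tangent direction (fy, -fx);
    corrector: a few Newton steps back onto the curve.
    Assumes the traced branch is simple (no singular points), which is
    exactly what the paper's subdivision stage guarantees.
    """
    pts = [(x0, y0)]
    x, y = x0, y0
    for _ in range(n_steps):
        gx, gy = fx(x, y), fy(x, y)
        norm = math.hypot(gx, gy)
        x, y = x + step * gy / norm, y - step * gx / norm  # tangent predictor
        for _ in range(3):                                 # Newton corrector
            gx, gy = fx(x, y), fy(x, y)
            d = gx * gx + gy * gy
            x, y = x - f(x, y) * gx / d, y - f(x, y) * gy / d
        pts.append((x, y))
    return pts

# Trace the unit circle x^2 + y^2 - 1 = 0: every point stays on the curve.
f = lambda x, y: x * x + y * y - 1.0
pts = trace_curve(f, lambda x, y: 2 * x, lambda x, y: 2 * y, 1.0, 0.0, 0.05, 100)
assert all(abs(f(x, y)) < 1e-8 for x, y in pts)
```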
The New Keynesian Phillips Curve
DEFF Research Database (Denmark)
Ólafsson, Tjörvi
This paper provides a survey of the recent literature on the new Keynesian Phillips curve: the controversies surrounding its microfoundation and estimation, the approaches that have been tried to improve its empirical fit, and the challenges it faces in adapting to the open-economy framework. The new…, learning or state-dependent pricing. The introduction of open-economy factors into the new Keynesian Phillips curve complicates matters further, as it must capture the nexus between price setting, inflation and the exchange rate. This is nevertheless a crucial feature for any model to be used for inflation forecasting in a small open economy like Iceland.
Signature Curves Statistics of DNA Supercoils
Shakiban, Cheri; Lloyd, Peter
2004-01-01
In this paper we describe the Euclidean signature curves for two dimensional closed curves in the plane and their generalization to closed space curves. The focus will be on discrete numerical methods for approximating such curves. Further we will apply these numerical methods to plot the signature curves related to three-dimensional simulated DNA supercoils. Our primary focus will be on statistical analysis of the data generated for the signature curves of the supercoils. We will try to esta...
A semiparametric separation curve approach for comparing correlated ROC data from multiple markers
Tang, Liansheng Larry; Zhou, Xiao-Hua
2012-01-01
In this article we propose a separation curve method to identify the range of false positive rates for which two ROC curves differ or one ROC curve is superior to the other. Our method is based on a general multivariate ROC curve model, including interaction terms between discrete covariates and false positive rates. It is applicable with most existing ROC curve models. Furthermore, we introduce a semiparametric least squares ROC estimator and apply the estimator to the separation curve method. We derive a sandwich estimator for the covariance matrix of the semiparametric estimator. We illustrate the application of our separation curve method through two real life examples. PMID:23074360
A simple transformation for converting CW-OSL curves to LM-OSL curves
DEFF Research Database (Denmark)
Bulur, E.
2000-01-01
A simple mathematical transformation is introduced to convert OSL decay curves obtained in the conventional way to those obtained using a linear modulation technique, based on a linear increase of the stimulation light intensity during OSL measurement. The validity of the transformation was tested using the IR-stimulated luminescence curves from feldspars, recorded using both the conventional and the linear modulation techniques. The transformation was further applied to green-light-stimulated OSL from K and Na feldspars. (C) 2000 Elsevier Science Ltd. All rights reserved.
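The commonly quoted form of this CW-to-pseudo-LM transformation uses the substitution u = sqrt(2tP) with the intensity scaled by u/P, where P is the total stimulation time. Treat the exact normalisation as an assumption here and consult the paper for the derivation:

```python
import math

def cw_to_pseudo_lm(i_cw, total_time, n=1000):
    """Convert a CW-OSL decay curve i_cw(t) to a pseudo-LM-OSL curve.

    Uses u = sqrt(2*t*P) and I_LM(u) = (u/P) * I_cw(t), the form commonly
    quoted for this transformation (normalisation assumed, not verified
    against the paper). Returns a list of (u, I_LM) pairs.
    """
    p = total_time
    out = []
    for k in range(1, n + 1):
        t = p * k / n
        u = math.sqrt(2.0 * t * p)
        out.append((u, (u / p) * i_cw(t)))
    return out

# A first-order exponential CW decay transforms into a peaked curve,
# as an LM-OSL measurement with ramped stimulation would produce.
tau, p = 5.0, 100.0
curve = cw_to_pseudo_lm(lambda t: math.exp(-t / tau), p)
peak_u = max(curve, key=lambda q: q[1])[0]
# For this form, the analytic peak sits at u = sqrt(P * tau).
assert abs(peak_u - math.sqrt(p * tau)) < 2.0
```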
Energy Technology Data Exchange (ETDEWEB)
Yi, Boram; Kang, Doo Kyoung; Kim, Tae Hee [Ajou University School of Medicine, Department of Radiology, Suwon, Gyeonggi-do (Korea, Republic of); Yoon, Dukyong [Ajou University School of Medicine, Department of Biomedical Informatics, Suwon (Korea, Republic of); Jung, Yong Sik; Kim, Ku Sang [Ajou University School of Medicine, Department of Surgery, Suwon (Korea, Republic of); Yim, Hyunee [Ajou University School of Medicine, Department of Pathology, Suwon (Korea, Republic of)
2014-05-15
To find out any correlation between dynamic contrast-enhanced (DCE) model-based parameters and model-free parameters, and evaluate correlations between perfusion parameters with histologic prognostic factors. Model-based parameters (Ktrans, Kep and Ve) of 102 invasive ductal carcinomas were obtained using DCE-MRI and post-processing software. Correlations between model-based and model-free parameters and between perfusion parameters and histologic prognostic factors were analysed. Mean Kep was significantly higher in cancers showing initial rapid enhancement (P = 0.002) and a delayed washout pattern (P = 0.001). Ve was significantly lower in cancers showing a delayed washout pattern (P = 0.015). Kep significantly correlated with time to peak enhancement (TTP) (ρ = -0.33, P < 0.001) and washout slope (ρ = 0.39, P = 0.002). Ve was significantly correlated with TTP (ρ = 0.33, P = 0.002). Mean Kep was higher in tumours with high nuclear grade (P = 0.017). Mean Ve was lower in tumours with high histologic grade (P = 0.005) and in tumours with negative oestrogen receptor status (P = 0.047). TTP was shorter in tumours with negative oestrogen receptor status (P = 0.037). We could acquire general information about the tumour vascular physiology, interstitial space volume and pathologic prognostic factors by analyzing time-signal intensity curve without a complicated acquisition process for the model-based parameters. (orig.)
INVESTIGATION OF CURVES SET BY CUBIC DISTRIBUTION OF CURVATURE
Directory of Open Access Journals (Sweden)
S. A. Ustenko
2014-03-01
Purpose. Further development of the geometric modeling of curvilinear contours of different objects based on a specified cubic curvature distribution and set values of curvature at the boundary points. Methodology. We investigate a flat section of a curvilinear contour generated under the condition that a cubic curvature distribution is prescribed. The curve begins and ends at given points, where the angles of tangent slope and the curvatures are also specified. The curvature equation of this curve was obtained as a function of the section length and the coefficient c of the cubic curvature distribution, and the resulting equation was analysed. The conditions under which inflection points of the curve appear were also investigated: one should find an interval of parameter change (depending on the input data and the section length) that places the inflection point of the curvature graph outside the borders of the curve section. The dependence of the tangent slope angle at an arbitrary point of the curve was determined, and recommendations were given for solving the system of integral equations that yields the length of the curve section and the coefficient c of the cubic curvature distribution. Findings. As a result of this research, it was found that the absence of inflection points of the curvature on the observed section can serve as the selection criterion for such curves. Analysis of the influence of the parameter c on the graph of the tangent slope angle showed that, regardless of its value, the same rate of increase of the tangent slope angle is provided. Originality. The approach to geometric modeling of curves based on a cubic curvature distribution with given boundary values is improved by eliminating inflection points from the observed section of curvilinear contours. Practical value. Curves obtained using the proposed method can be used for geometric modeling of curvilinear…
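The reconstruction step behind such modelling, recovering a planar curve from a prescribed curvature distribution by integrating the intrinsic equations x' = cos θ, y' = sin θ, θ' = κ(s), can be sketched generically. This is not the authors' algorithm; κ may be any callable, e.g. a cubic in arc length:

```python
import math

def curve_from_curvature(kappa, length, theta0=0.0, n=2000):
    """Recover a planar curve from its curvature distribution kappa(s)
    by midpoint-rule integration of the intrinsic equations.
    Starts at the origin with initial tangent angle theta0.
    """
    ds = length / n
    x = y = 0.0
    theta = theta0
    pts = [(x, y)]
    for k in range(n):
        s = (k + 0.5) * ds                      # midpoint of the step
        theta_mid = theta + 0.5 * ds * kappa(s)
        x += ds * math.cos(theta_mid)
        y += ds * math.sin(theta_mid)
        theta += ds * kappa(s)
        pts.append((x, y))
    return pts

# Sanity check: constant curvature 1 over arc length pi is a half circle
# of radius 1, so the endpoint must land at (0, 2).
pts = curve_from_curvature(lambda s: 1.0, math.pi)
xe, ye = pts[-1]
assert abs(xe) < 1e-3 and abs(ye - 2.0) < 1e-3
```

With a cubic κ(s), the same integrator produces the contour sections studied in the paper; the boundary conditions then determine the section length and the coefficient c.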
Electro-Mechanical Resonance Curves
Greenslade, Thomas B., Jr.
2018-01-01
Recently I have been investigating the frequency response of galvanometers. These are direct-current devices used to measure small currents. By using a low-frequency function generator to supply the alternating-current signal and a stopwatch smartphone app to measure the period, I was able to take data to allow a resonance curve to be drawn. This…
2013-01-01
This software can be used to assist with the assessment of margin of safety for a horizontal curve. It is intended for use by engineers and technicians responsible for safety analysis or management of rural highway pavement or traffic control devices...
Principal Curves on Riemannian Manifolds.
Hauberg, Soren
2016-09-01
Euclidean statistics are often generalized to Riemannian manifolds by replacing straight-line interpolations with geodesic ones. While these Riemannian models are familiar-looking, they are restricted by the inflexibility of geodesics, and they rely on constructions which are optimal only in Euclidean domains. We consider extensions of Principal Component Analysis (PCA) to Riemannian manifolds. Classic Riemannian approaches seek a geodesic curve passing through the mean that optimizes a criterion of interest. The requirements that the solution both is geodesic and must pass through the mean tend to imply that the methods only work well when the manifold is mostly flat within the support of the generating distribution. We argue that instead of generalizing linear Euclidean models, it is more fruitful to generalize non-linear Euclidean models. Specifically, we extend the classic Principal Curves from Hastie & Stuetzle to data residing on a complete Riemannian manifold. We show that for elliptical distributions in the tangent space of spaces of constant curvature, the standard principal geodesic is a principal curve. The proposed model is simple to compute and avoids many of the pitfalls of traditional geodesic approaches. We empirically demonstrate the effectiveness of the Riemannian principal curves on several manifolds and datasets.
Elliptic curves and primality proving
Atkin, A. O. L.; Morain, F.
1993-07-01
The aim of this paper is to describe the theory and implementation of the Elliptic Curve Primality Proving algorithm. [The abstract continues with Gauss's epigraph from the Disquisitiones Arithmeticae:] "The problem of distinguishing prime numbers from composite numbers, and of resolving the latter into their prime factors, is known to be one of the most important and useful in arithmetic, and to have engaged the industry and sagacity of geometers both ancient and modern to such an extent that it would be superfluous to discuss it at length."
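For orientation, the modular elliptic-curve group law on which ECPP rests can be sketched as follows. This is a minimal illustration on a standard textbook curve, not the Atkin-Morain implementation; the complex-multiplication, point-counting and certificate machinery is omitted:

```python
def ec_add(p1, p2, a, n):
    """Group law on y^2 = x^3 + a*x + b (mod n); points are (x, y) tuples
    or None for the identity. Note that in ECPP a non-invertible
    denominator is not a failure: it exposes a nontrivial factor of n.
    Here n is taken prime so pow(..., -1, n) always succeeds.
    """
    if p1 is None:
        return p2
    if p2 is None:
        return p1
    x1, y1 = p1
    x2, y2 = p2
    if x1 == x2 and (y1 + y2) % n == 0:
        return None                                   # P + (-P) = O
    if p1 == p2:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, n)  # tangent slope
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, n)         # chord slope
    x3 = (lam * lam - x1 - x2) % n
    y3 = (lam * (x1 - x3) - y1) % n
    return (x3, y3)

# On y^2 = x^3 + x + 1 over F_23 (a standard textbook example):
assert ec_add((3, 10), (3, 10), 1, 23) == (7, 12)    # doubling
assert ec_add((3, 10), (9, 7), 1, 23) == (17, 20)    # chord addition
assert ec_add((3, 10), (3, 13), 1, 23) is None       # inverse points
```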
Indian Academy of Sciences (India)
from biology, feel that every pattern in the living world, ranging from the folding of … curves b and c have the same rate of increase but reach different asymptotes. If these … not at x = 0, but at x₀, which is the minimum size at birth that will permit …
Survival curves for irradiated cells
International Nuclear Information System (INIS)
Gibson, D.K.
1975-01-01
The subject of the lecture is the probability of survival of biological cells which have been subjected to ionising radiation. The basic mathematical theories of cell survival as a function of radiation dose are developed. A brief comparison with observed survival curves is made. (author)
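One standard parameterisation of such dose-survival curves is the linear-quadratic model, S = exp(-(αD + βD²)). The parameter values below are illustrative only, and the lecture's own mathematical models may differ:

```python
import math

def lq_survival(dose_gy, alpha, beta):
    """Linear-quadratic cell survival: S = exp(-(alpha*D + beta*D**2)).
    alpha (per Gy) and beta (per Gy^2) are tissue-dependent; the values
    used below are illustrative, not taken from the lecture.
    """
    return math.exp(-(alpha * dose_gy + beta * dose_gy ** 2))

# Survival falls with dose, and the quadratic term bends the semi-log
# curve downward (the characteristic "shoulder").
s2 = lq_survival(2.0, 0.3, 0.03)
s4 = lq_survival(4.0, 0.3, 0.03)
assert s2 > s4
assert math.log(s2) / 2 > math.log(s4) / 4  # steeper per-Gy kill at higher dose
```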
Shape optimization of self-avoiding curves
Walker, Shawn W.
2016-04-01
This paper presents a softened notion of proximity (or self-avoidance) for curves. We then derive a sensitivity result, based on shape differential calculus, for the proximity. This is combined with a gradient-based optimization approach to compute three-dimensional, parameterized curves that minimize the sum of an elastic (bending) energy and a proximity energy that maintains self-avoidance by a penalization technique. Minimizers are computed by a sequential-quadratic-programming (SQP) method where the bending energy and proximity energy are approximated by a finite element method. We then apply this method to two problems. First, we simulate adsorbed polymer strands that are constrained to be bound to a surface and be (locally) inextensible. This is a basic model of semi-flexible polymers adsorbed onto a surface (a current topic in material science). Several examples of minimizing curve shapes on a variety of surfaces are shown. An advantage of the method is that it can be much faster than using molecular dynamics for simulating polymer strands on surfaces. Second, we apply our proximity penalization to the computation of ideal knots. We present a heuristic scheme, utilizing the SQP method above, for minimizing rope-length and apply it in the case of the trefoil knot. Applications of this method could be for generating good initial guesses to a more accurate (but expensive) knot-tightening algorithm.
Mentorship, learning curves, and balance.
Cohen, Meryl S; Jacobs, Jeffrey P; Quintessenza, James A; Chai, Paul J; Lindberg, Harald L; Dickey, Jamie; Ungerleider, Ross M
2007-09-01
Professionals working in the arena of health care face a variety of challenges as their careers evolve and develop. In this review, we analyze the role of mentorship, learning curves, and balance in overcoming challenges that all such professionals are likely to encounter. These challenges can exist both in professional and personal life. As any professional involved in health care matures, complex professional skills must be mastered, and new professional skills must be acquired. These skills are both technical and judgmental. In most circumstances, these skills must be learned. In 2007, despite the continued need for obtaining new knowledge and learning new skills, the professional and public tolerance for a "learning curve" is much less than in previous decades. Mentorship is the key to success in these endeavours. The success of mentorship is two-sided, with responsibilities for both the mentor and the mentee. The benefits of this relationship must be bidirectional. It is the responsibility of both the student and the mentor to assure this bidirectional exchange of benefit. This relationship requires time, patience, dedication, and to some degree selflessness. This mentorship will ultimately be the best tool for mastering complex professional skills and maturing through various learning curves. Professional mentorship also requires that mentors identify and explicitly teach their mentees the relational skills and abilities inherent in learning the management of the triad of self, relationships with others, and professional responsibilities. Up to two decades ago, a learning curve was tolerated, and even expected, while professionals involved in healthcare developed the techniques that allowed for the treatment of previously untreatable diseases. Outcomes have now improved to the point that this type of learning curve is no longer acceptable to the public. Still, professionals must learn to perform and develop independence and confidence. The responsibility to
Graphical evaluation of complexometric titration curves.
Guinon, J L
1985-04-01
A graphical method, based on logarithmic concentration diagrams, for construction, without any calculations, of complexometric titration curves is examined. The titration curves obtained for different kinds of unidentate, bidentate and quadridentate ligands clearly show why only chelating ligands are usually used in titrimetric analysis. The method has also been applied to two practical cases where unidentate ligands are used: (a) the complexometric determination of mercury(II) with halides and (b) the determination of cyanide with silver, which involves both a complexation and a precipitation system; for this purpose construction of the diagrams for the HgCl(2)/HgCl(+)/Hg(2+) and Ag(CN)(2)(-)/AgCN/CN(-) systems is considered in detail.
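The underlying 1:1 complexation equilibrium can also be solved numerically, which makes the sharp pM jump at the equivalence point visible. This is a simplified sketch that neglects dilution, protonation and side reactions; the paper itself works graphically from logarithmic concentration diagrams rather than by solving equilibria:

```python
import math

def free_metal_pM(c_metal, c_ligand, k_f):
    """pM = -log10[M] for a 1:1 complex M + L = ML with formation
    constant k_f, at total concentrations c_metal and c_ligand
    (dilution neglected). Solves the mass-balance quadratic
    K_f*x^2 - (K_f*(Cm+Cl) + 1)*x + K_f*Cm*Cl = 0 for x = [ML].
    """
    b = k_f * (c_metal + c_ligand) + 1.0
    disc = b * b - 4.0 * k_f * k_f * c_metal * c_ligand
    x = (b - math.sqrt(disc)) / (2.0 * k_f)  # smaller root is physical
    return -math.log10(c_metal - x)

# For a strong complex, pM jumps sharply through the equivalence point:
before = free_metal_pM(0.01, 0.005, 1e10)  # half-titrated
after = free_metal_pM(0.01, 0.02, 1e10)    # 100% excess ligand
assert 2.2 < before < 2.4 and 9.9 < after < 10.1
```

A weak (e.g. unidentate) complexant flattens this jump, which is the quantitative reason, illustrated graphically in the paper, why chelating ligands dominate titrimetry.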
Ground reaction curve based upon block theory
International Nuclear Information System (INIS)
Yow, J.L. Jr.; Goodman, R.E.
1985-09-01
Discontinuities in a rock mass can intersect an excavation surface to form discrete blocks (keyblocks) which can be unstable. Once a potentially unstable block is identified, the forces affecting it can be calculated to assess its stability. The normal and shear stresses on each block face before displacement are calculated using elastic theory and are modified in a nonlinear way by discontinuity deformations as the keyblock displaces. The stresses are summed into resultant forces to evaluate block stability. Since the resultant forces change with displacement, successive increments of block movement are examined to see whether the block ultimately becomes stable or fails. Two-dimensional (2D) and three-dimensional (3D) analytic models for the stability of simple pyramidal keyblocks were evaluated. Calculated stability is greater for 3D analyses than for 2D analyses. Calculated keyblock stability increases with larger in situ stress magnitudes, larger lateral stress ratios, and larger shear strengths. Discontinuity stiffness controls block displacement more strongly than it does stability itself. Large keyblocks are less stable than small ones, and stability increases as blocks become more slender.
Rational Points on Curves of Genus 2: Experiments and Speculations
Stoll, Michael
2009-01-01
I will present results of computations providing statistics on rational points on (small) curves of genus 2 and use them to present several conjectures. Some of them are based on heuristic considerations, others are not.
Climbing the health learning curve together | IDRC - International ...
International Development Research Centre (IDRC) Digital Library (Canada)
2011-01-25
Many of the projects are creating master's programs at their host universities … Formerly based in the high Arctic, Atlantis is described by Dr Martin Forde of St George's University …
Rectification of light refraction in curved waveguide arrays.
Longhi, Stefano
2009-02-15
An "optical ratchet" for discretized light in photonic lattices, which enables observing rectification of light refraction at any input beam conditions, is theoretically presented, and a possible experimental implementation based on periodically curved zigzag waveguide arrays is proposed.
Rectification of light refraction in curved waveguide arrays
Longhi, S.
2010-01-01
An 'optical ratchet' for discretized light in photonic lattices, which enables observation of rectification of light refraction at any input beam conditions, is theoretically presented, and a possible experimental implementation based on periodically curved zigzag waveguide arrays is proposed.
Parameter Deduction and Accuracy Analysis of Track Beam Curves in Straddle-type Monorail Systems
Directory of Open Access Journals (Sweden)
Xiaobo Zhao
2015-12-01
The accuracy of the bottom curve of a PC track beam is strongly related to the production quality of the entire beam. Many factors may affect the parameters of the bottom curve, such as the superelevation of the curve and the deformation of the PC track beam. At present, no effective method has been developed to determine the bottom curve of a PC track beam; therefore, a new technique is presented in this paper to deduce the parameters of such a curve and to control the accuracy of the computation results. First, the domain of the bottom curve of a PC track beam is assumed to be a spindle plane. Then, the corresponding supposed top curve domain is determined based on a geometrical relationship that is the opposite of that identified by the conventional method. Second, several optimal points are selected from the supposed top curve domain according to the dichotomy algorithm, and the supposed top curve is generated by connecting these points. Finally, a rigorous criterion based on the fractal dimension is established to assess the accuracy of the supposed top curve deduced in the previous step. If this supposed curve coincides completely with the known top curve, then the assumed bottom curve corresponding to it is considered to be the real bottom curve. This technique of determining the bottom curve of a PC track beam is thus shown to be efficient and accurate.
A catalog of special plane curves
Lawrence, J Dennis
2014-01-01
Among the largest, finest collections available: illustrated not only once for each curve, but also for various values of any parameters present. Covers general properties of curves and types of derived curves. Curves illustrated by a CalComp digital incremental plotter. 12 illustrations.
Computation of undulator tuning curves
International Nuclear Information System (INIS)
Dejus, Roger J.
1997-01-01
Computer codes for fast computation of on-axis brilliance tuning curves and flux tuning curves have been developed. They are valid for an ideal device (regular planar device or a helical device) using the Bessel function formalism. The effects of the particle beam emittance and the beam energy spread on the spectrum are taken into account. The applicability of the codes and the importance of magnetic field errors of real insertion devices are addressed. The validity of the codes has been experimentally verified at the APS and observed discrepancies are in agreement with predicted reduction of intensities due to magnetic field errors. The codes are distributed as part of the graphical user interface XOP (X-ray OPtics utilities), which simplifies execution and viewing of the results
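The backbone of any undulator tuning-curve computation is the on-axis harmonic photon energy as a function of the deflection parameter K. In its usual practical form (the constant 0.95 from the standard engineering version of the undulator equation, zero emittance and zero energy spread assumed; the codes described above add the Bessel-function intensity factors and beam effects):

```python
def harmonic_energy_kev(n, e_gev, k, lambda_u_cm):
    """On-axis photon energy (keV) of undulator harmonic n:
    E_n ~ 0.95 * n * E^2 / (lambda_u * (1 + K^2/2)),
    with ring energy E in GeV and undulator period lambda_u in cm.
    Idealized planar device; field errors and emittance ignored.
    """
    return 0.95 * n * e_gev ** 2 / (lambda_u_cm * (1.0 + 0.5 * k * k))

# Opening the magnetic gap lowers K and shifts each harmonic to higher
# energy; sweeping K harmonic by harmonic generates the tuning curves.
assert harmonic_energy_kev(1, 7.0, 1.0, 3.3) < harmonic_energy_kev(1, 7.0, 0.5, 3.3)
```

The 7 GeV ring energy and 3.3 cm period above are merely representative APS-scale numbers, not parameters from the paper.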
Curved canals: Ancestral files revisited
Directory of Open Access Journals (Sweden)
Jain Nidhi
2008-01-01
The aim of this article is to provide an insight into different techniques of cleaning and shaping curved root canals with hand instruments. Although a plethora of root canal instruments such as ProFile, ProTaper and LightSpeed® dominate the current scenario, the inexpensive conventional root canal hand files such as K-files and flexible files can be used to obtain optimum results when handled meticulously. Special emphasis has been put on the modifications in biomechanical canal preparation in a variety of curved canal cases. This article compiles a series of clinical cases of root canals with curvatures in the middle and apical third and with S-shaped curvatures that were successfully completed by employing only conventional root canal hand instruments.
Invariance for Single Curved Manifold
Castro, Pedro Machado Manhaes de
2012-01-01
Recently, it has been shown that, for the Lambertian illumination model, only scenes composed of developable objects with a very particular albedo distribution produce a (2D) image with isolines that are (almost) invariant to changes in light direction. In this work, we provide and investigate a more general framework, and we show that, in general, the requirement for such invariances is quite strong and is related to the differential geometry of the objects. More precisely, it is proved that single curved manifolds, i.e., manifolds such that at each point there is at most one principal curvature direction, produce invariant iso-surfaces for a certain relevant family of energy functions. In the three-dimensional case, the associated energy function corresponds to the classical Lambertian illumination model with albedo. This result is also extended to finite-dimensional scenes composed of single curved objects. © 2012 IEEE.
Incorporating Experience Curves in Appliance Standards Analysis
Energy Technology Data Exchange (ETDEWEB)
Garbesi, Karina; Chan, Peter; Greenblatt, Jeffery; Kantner, Colleen; Lekov, Alex; Meyers, Stephen; Rosenquist, Gregory; Buskirk, Robert Van; Yang, Hung-Chia; Desroches, Louis-Benoit
2011-10-31
The technical analyses in support of U.S. energy conservation standards for residential appliances and commercial equipment have typically assumed that manufacturing costs and retail prices remain constant during the projected 30-year analysis period. There is, however, considerable evidence that this assumption does not reflect real market prices. Costs and prices generally fall in relation to cumulative production, a phenomenon known as experience and modeled by a fairly robust empirical experience curve. Using price data from the Bureau of Labor Statistics, and shipment data obtained as part of the standards analysis process, we present U.S. experience curves for room air conditioners, clothes dryers, central air conditioners, furnaces, and refrigerators and freezers. These allow us to develop more representative appliance price projections than the assumption-based approach of constant prices. These experience curves were incorporated into recent energy conservation standards for these products. The impact on the national modeling can be significant, often increasing the net present value of potential standard levels in the analysis. In some cases a previously cost-negative potential standard level demonstrates a benefit when incorporating experience. These results imply that past energy conservation standards analyses may have undervalued the economic benefits of potential standard levels.
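The experience-curve relation used in such analyses is conventionally written P = P0 (Q/Q0)^(-b), where Q is cumulative production, with a learning rate of 1 - 2^(-b) per doubling. The numbers below are illustrative, not the fitted appliance parameters from the standards analysis:

```python
import math

def experience_price(p0, q0, q, b):
    """Experience-curve price: P = P0 * (Q/Q0)**(-b), Q = cumulative
    production and b > 0 the experience exponent (illustrative values)."""
    return p0 * (q / q0) ** (-b)

def learning_rate(b):
    """Fractional price drop per doubling of cumulative production."""
    return 1.0 - 2.0 ** (-b)

# A 20% learning rate means each production doubling cuts price by 20%.
b = -math.log2(1.0 - 0.20)          # exponent implied by LR = 0.20
p1 = experience_price(100.0, 1.0, 2.0, b)
assert abs(p1 - 80.0) < 1e-9
assert abs(learning_rate(b) - 0.20) < 1e-12
```

Replacing a constant-price assumption with such a declining curve is what raises the net present value of candidate standard levels in the analysis.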
Microvascular Anastomosis: Proposition of a Learning Curve.
Mokhtari, Pooneh; Tayebi Meybodi, Ali; Benet, Arnau; Lawton, Michael T
2018-04-14
Learning to perform a microvascular anastomosis is one of the most difficult tasks in cerebrovascular surgery. Previous studies offer little regarding optimal protocols to maximize learning efficiency. This failure stems mainly from a lack of knowledge about the learning curve of this task. To delineate this learning curve and provide information about its various features, including acquisition, improvement, consistency, stability, and recall, five neurosurgeons with an average surgical experience of 5 yr and without any experience in bypass surgery performed microscopic anastomosis on progressively smaller-caliber silastic tubes (Biomet, Palm Beach Gardens, Florida) during 24 consecutive sessions. After 1-, 2-, and 8-wk retention intervals, they performed a recall test on 0.7-mm silastic tubes. The anastomoses were rated based on anastomosis patency and the presence of any leaks. The improvement rate was faster during initial sessions than during the final practice sessions. A performance decline was observed in the first session of working on a smaller-caliber tube; however, this rapidly improved during the following sessions of practice. Temporary plateaus were seen in certain segments of the curve. The retention interval between the acquisition and recall phases did not cause a regression to the prepractice performance level. Learning the fine motor task of microvascular anastomosis follows the basic rules of learning, such as the "power law of practice." Our results also support the improvement of performance during consecutive sessions of practice. The objective evidence provided may help in developing optimized learning protocols for microvascular anastomosis.
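The "power law of practice" cited in the conclusions models performance time as T(N) = a * N^(-b) over practice trials N, with fast early gains and a long tail, matching the faster improvement seen in initial sessions. A small sketch of recovering the exponent by least squares in log-log space; the session times are synthetic, not the study's measurements:

```python
import math

def fit_power_law(trials, times):
    """Least-squares fit of time = a * trial**(-b) in log-log coordinates."""
    xs = [math.log(n) for n in trials]
    ys = [math.log(t) for t in times]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return math.exp(my - slope * mx), -slope  # (a, b)

# Synthetic session times that follow a power law exactly.
trials = [1, 2, 4, 8, 16]
times = [10.0 * n ** -0.5 for n in trials]
a, b = fit_power_law(trials, times)  # recovers a = 10, b = 0.5
```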
Wheelset curving guidance using H∞ control
Qazizadeh, Alireza; Stichel, Sebastian; Feyzmahdavian, Hamid Reza
2018-03-01
This study shows how to design an active suspension system for guidance of a rail vehicle wheelset in curves. The main focus of the study is on designing the controller and afterwards studying its effect on wheel wear behaviour. The controller is designed based on the closed-loop transfer function shaping method and the H∞ control strategy. The study discusses the design of the controller for both nominal and uncertain plants and considers both stability and performance. The controllers designed in Simulink are then applied to the vehicle model in Simpack to study the wheel wear behaviour in curves. The vehicle type selected for this study is a two-axle rail vehicle, because this type of vehicle is known to have very poor curving performance and high wheel wear. On the other hand, the relatively simple structure of this type of vehicle compared to bogie vehicles makes it a more economic choice. Hence, equipping this type of vehicle with active wheelset steering is believed to show a high enough benefit-to-cost ratio to remain attractive to rail vehicle manufacturers and operators.
Modelling stochastic changes in curve shape, with an application to cancer diagnostics
DEFF Research Database (Denmark)
Hobolth, A; Jensen, Eva B. Vedel
2000-01-01
Often, the statistical analysis of the shape of a random planar curve is based on a model for a polygonal approximation to the curve. In the present paper, we instead describe the curve as a continuous stochastic deformation of a template curve. The advantage of this continuous approach is that the parameters in the model do not relate to a particular polygonal approximation. A somewhat similar approach has been used by Kent et al. (1996), who describe the limiting behaviour of a model with a first-order Markov property as the landmarks on the curve become closely spaced; see also Grenander (1993)...
Curved Folded Plate Timber Structures
Buri, Hans Ulrich; Stotz, Ivo; Weinand, Yves
2011-01-01
This work investigates the development of a Curved Origami Prototype made with timber panels. In the last fifteen years the timber industry has developed new, large-size timber panels. The composition and dimensions of these panels and the possibility of milling them with Computer Numerical Controlled machines show great potential for folded plate structures. To generate the form of these structures we were inspired by Origami, the Japanese art of paper folding. Common paper tessellations are c...
Growth curves for Laron syndrome.
Laron, Z; Lilos, P; Klinger, B
1993-01-01
Growth curves for children with Laron syndrome were constructed on the basis of repeated measurements made throughout infancy, childhood, and puberty in 24 (10 boys, 14 girls) of the 41 patients with this syndrome investigated in our clinic. Growth retardation was already noted at birth, the birth length ranging from 42 to 46 cm in the 12/20 available measurements. The postnatal growth curves deviated sharply from the normal from infancy on. Both sexes showed no clear pubertal spurt. Girls completed their growth between the ages of 16 and 19 years to a final mean (SD) height of 119 (8.5) cm, whereas the boys continued growing beyond the age of 20 years, achieving a final height of 124 (8.5) cm. At all ages the upper-to-lower body segment ratio was more than 2 SD above the normal mean. These growth curves constitute a model not only for primary, hereditary insulin-like growth factor-I (IGF-I) deficiency (Laron syndrome) but also for untreated secondary IGF-I deficiencies such as growth hormone gene deletion and idiopathic congenital isolated growth hormone deficiency. They should also be useful in the follow-up of children with Laron syndrome treated with biosynthetic recombinant IGF-I. PMID:8333769
Elementary particles in curved spaces
International Nuclear Information System (INIS)
Lazanu, I.
2004-01-01
The theories of particle physics are currently developed in Minkowski space-time, starting from the Poincaré group. A physical theory in flat space can be seen as the limit of a more general physical theory in a curved space. At the present time a theory of particles in curved space does not exist, and thus the only possibility is to extend the existing theories to these spaces. A formidable obstacle to the extension of physical models is the absence of groups of motion in more general Riemann spaces. A space of constant curvature has a group of motion that, although it differs from that of a flat space, has the same number of parameters and could permit some generalisations. In this contribution we try to investigate some physical implications of the presumable existence of elementary particles in curved space. In de Sitter (dS) space the invariant rest mass is a combination of the Poincaré rest mass and the generalised angular momentum of a particle, and it permits establishing a correlation with the vacuum energy and with the cosmological constant. The consequences are significant because in an experiment the local structure of space-time departs from Minkowski space and becomes a dS or AdS space-time. Discrete symmetry characteristics of the dS/AdS group suggest some arguments for the possible existence of 'mirror matter'. (author)
Projection of curves on B-spline surfaces using quadratic reparameterization
Yang, Yijun; Zeng, Wei; Zhang, Hui; Yong, Junhai; Paul, Jean Claude
2010-01-01
Curves on surfaces play an important role in computer aided geometric design. In this paper, we present a hyperbola approximation method based on the quadratic reparameterization of Bézier surfaces, which generates reasonable low-degree curves lying ...
Dual Smarandache Curves and Smarandache Ruled Surfaces
Tanju KAHRAMAN; Mehmet ÖNDER; H. Hüseyin UGURLU
2013-01-01
In this paper, by considering dual geodesic trihedron (dual Darboux frame) we define dual Smarandache curves lying fully on dual unit sphere S^2 and corresponding to ruled surfaces. We obtain the relationships between the elements of curvature of dual spherical curve (ruled surface) x(s) and its dual Smarandache curve (Smarandache ruled surface) x1(s) and we give an example for dual Smarandache curves of a dual spherical curve.
Schlösser, Tom P C; van Stralen, Marijn; Chu, Winnie C W; Lam, Tsz-Ping; Ng, Bobby K W; Vincken, Koen L; Cheng, Jack C Y; Castelein, René M
2016-01-01
Although much attention has been given to the global three-dimensional aspect of adolescent idiopathic scoliosis (AIS), the accurate three-dimensional morphology of the primary and compensatory curves, as well as the intervening junctional segments, in the scoliotic spine has not been described before. A unique series of 77 AIS patients with high-resolution CT scans of the spine, acquired for surgical planning purposes, were included and compared to 22 healthy controls. Non-idiopathic curves were excluded. Endplate segmentation and the local longitudinal axis in the endplate plane enabled semi-automatic geometric analysis of the complete three-dimensional morphology of the spine, taking inter-vertebral rotation, intra-vertebral torsion and coronal and sagittal tilt into account. Intraclass correlation coefficients for interobserver reliability were 0.98-1.00. Coronal deviation, axial rotation and the exact length discrepancies in the reconstructed sagittal plane, as defined per vertebra and disc, were analyzed for each primary and compensatory curve as well as for the junctional segments in between. The anterior-posterior difference of spinal length, based on "true" anterior and posterior points on endplates, was +3.8% for thoracic and +9.4% for (thoraco)lumbar curves, while the junctional segments were almost straight. This differed significantly from control group thoracic kyphosis (-4.1%; P<0.001) and lumbar lordosis (+7.8%; P<0.001). For all primary as well as compensatory curves, we observed linear correlations between the coronal Cobb angle, axial rotation and the anterior-posterior length difference (r≥0.729 for thoracic curves; r≥0.485 for (thoraco)lumbar curves). Excess anterior length of the spine in AIS has been described as a generalized growth disturbance, causing relative anterior spinal overgrowth. This study is the first to demonstrate that this anterior overgrowth is not a generalized phenomenon. It is confined to the primary as well as the ...
Learning curves for mutual information maximization
International Nuclear Information System (INIS)
Urbanczik, R.
2003-01-01
An unsupervised learning procedure based on maximizing the mutual information between the outputs of two networks receiving different but statistically dependent inputs is analyzed [S. Becker and G. Hinton, Nature (London) 355, 161 (1992)]. For a generic data model, I show that in the large sample limit the structure in the data is recognized by mutual information maximization. For a more restricted model, where the networks are similar to perceptrons, I calculate the learning curves for zero-temperature Gibbs learning. These show that convergence can be rather slow, and a way of regularizing the procedure is considered
Developing Novel Reservoir Rule Curves Using Seasonal Inflow Projections
Tseng, Hsin-yi; Tung, Ching-pin
2015-04-01
Due to significant seasonal rainfall variations, reservoirs and their flexible operational rules are indispensable to Taiwan. Furthermore, with the intensifying impacts of climate change on extreme climate, the frequency of droughts in Taiwan has been increasing in recent years. Drought is a creeping phenomenon; its slow onset makes it difficult to detect at an early stage and causes delays in making the best decisions about allocating water. For these reasons, novel reservoir rule curves using projected seasonal streamflow are proposed in this study, which can potentially reduce the adverse effects of drought. This study is dedicated to establishing new rule curves which consider both current available storage and anticipated monthly inflows with a lead time of two months, to reduce the risk of water shortage. The monthly inflows are projected based on the seasonal climate forecasts from the Central Weather Bureau (CWB), where a weather generation model is used to produce daily weather data for the hydrological component of the GWLF model. To incorporate future monthly inflow projections into rule curves, this study designs a decision flow index which is a linear combination of current available storage and inflow projections with a lead time of 2 months. By optimizing the linear-combination coefficients of the decision flow index, the shape of the rule curves, and the percentage of water supply in each zone, the best rule curves to decrease water shortage risk and impacts can be developed. The Shimen Reservoir in northern Taiwan is used as a case study to demonstrate the proposed method. The existing rule curves (M5 curves) of the Shimen Reservoir are compared with two cases of new rule curves, including hindcast simulations and historic seasonal forecasts. The results show the new rule curves can decrease the total water shortage ratio and, in addition, can allocate shortage amounts to preceding months to avoid extreme shortage events. Even though some uncertainties in ...
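A decision flow index of the kind described, a linear combination of current storage and the next two months of projected inflow that is then mapped onto rule-curve zones, can be sketched as follows. The weights, zone boundaries, and supply fractions are illustrative placeholders, not the optimized values from the study:

```python
def decision_flow_index(storage, inflow_next, inflow_second, w=(0.5, 0.3, 0.2)):
    """Linear combination of current available storage and projected
    inflows for the next two months (weights are illustrative)."""
    return w[0] * storage + w[1] * inflow_next + w[2] * inflow_second

def supply_fraction(index, lower_curve, upper_curve):
    """Map the index onto rule-curve zones: full supply above the upper
    curve, mild rationing between the curves, deep rationing below."""
    if index >= upper_curve:
        return 1.0
    if index >= lower_curve:
        return 0.8
    return 0.5

idx = decision_flow_index(120.0, 40.0, 30.0)  # = 78.0
frac = supply_fraction(idx, lower_curve=60.0, upper_curve=100.0)
```

Rationing mildly in the middle zone is what lets such rule curves spread a shortage over preceding months instead of concentrating it in one extreme event.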
Curve fitting for RHB Islamic Bank annual net profit
Nadarajan, Dineswary; Noor, Noor Fadiya Mohd
2015-05-01
The RHB Islamic Bank net profit data are obtained for 2004 to 2012. Curve fitting is done by assuming the data are exact or experimental due to the smoothing process. A higher-order Lagrange polynomial and a cubic spline with a curve fitting procedure are constructed using Maple software. A normality test is performed to check data adequacy. Regression analysis with curve estimation is conducted in the SPSS environment. All eleven models are found to be acceptable at the 10% significance level of ANOVA. Residual error and absolute relative true error are calculated and compared. The optimal model, based on the minimum average error, is proposed.
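Two ingredients named in the abstract, a higher-order Lagrange interpolating polynomial and the absolute relative true error, can be sketched in a few lines. The profit figures below are hypothetical placeholders, not the bank's actual data:

```python
def lagrange_interpolate(xs, ys, x):
    """Evaluate the Lagrange interpolating polynomial through (xs, ys) at x."""
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        term = yi
        for j, xj in enumerate(xs):
            if j != i:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

def abs_relative_true_error(true_value, approx_value):
    """|true - approx| / |true|, the error measure compared across models."""
    return abs((true_value - approx_value) / true_value)

# Hypothetical yearly net-profit figures (millions), not the bank's data.
years = [2004.0, 2006.0, 2008.0, 2010.0]
profits = [50.0, 80.0, 130.0, 210.0]
estimate = lagrange_interpolate(years, profits, 2007.0)
```

The polynomial reproduces every data point exactly, so model comparison rests on behaviour between the points, which is where the average-error criterion comes in.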
Formulae for Arithmetic on Genus 2 Hyperelliptic Curves
DEFF Research Database (Denmark)
Lange, Tanja
2005-01-01
The ideal class group of hyperelliptic curves can be used in cryptosystems based on the discrete logarithm problem. In this article we present explicit formulae to perform the group operations for genus 2 curves. The formulae are completely general, but to achieve the lowest number of operations we treat odd and even characteristic separately. We present 3 different coordinate systems which are suitable for different environments, e.g. on a smart card we should avoid inversions while in software a limited number is acceptable. The presented formulae render genus two hyperelliptic curves very...
Variation of curve number with storm depth
Banasik, K.; Hejduk, L.
2012-04-01
The NRCS Curve Number method (known also as SCS-CN) is well known as a tool for predicting flood runoff depth from small ungauged catchments. The traditional way of determining CNs, based on soil characteristics, land use and hydrological conditions, seemed to have a tendency to overpredict floods in some cases. Over 30 years of rainfall-runoff data, collected in two small (A = 23.4 & 82.4 km2), lowland, agricultural catchments in central Poland (Banasik & Woodward 2010), were used to determine the runoff Curve Number and to examine its tendency to change. The observed CN declines with increasing storm size, which according to recent views of Hawkins (1993) can be classified as a standard response of a watershed. The analysis concluded that, using the CN value determined according to the procedure described in the USDA-SCS Handbook, one receives a representative value for estimating storm runoff from high rainfall depths in the analyzed catchments. This has been confirmed by applying the "asymptotic approach" for estimating the watershed curve number from the rainfall-runoff data. Furthermore, the analysis indicated that the CN, estimated from the mean retention parameter S of recorded events with rainfall depth higher than the initial abstraction, also approaches the theoretical CN. The observed CN, ranging from 59.8 to 97.1 and from 52.3 to 95.5 in the smaller and the larger catchment respectively, declines with increasing storm size. The investigation also demonstrated the variability of the CN during the year, with much lower values during the vegetation season. Banasik K. & D.E. Woodward (2010). "Empirical determination of curve number for a small agricultural watershed in Poland". 2nd Joint Federal Interagency Conference, Las Vegas, NV, June 27 - July 1, 2010 (http://acwi.gov/sos/pubs/2ndJFIC/Contents/10E_Banasik_ 28_02_10. pdf). Hawkins R. H. (1993). "Asymptotic determination of curve numbers from data". Journal of Irrigation and Drainage
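The SCS-CN relationships behind this analysis are Q = (P - 0.2S)^2 / (P + 0.8S) for P > 0.2S, with S = 25400/CN - 254 in millimetres; solving for S from an observed (P, Q) pair gives the event-wise "observed CN" used in the asymptotic approach. A minimal sketch:

```python
def runoff_depth(P, CN):
    """SCS-CN storm runoff depth (mm) for rainfall depth P (mm)."""
    S = 25400.0 / CN - 254.0  # potential maximum retention (mm)
    Ia = 0.2 * S              # initial abstraction
    if P <= Ia:
        return 0.0
    return (P - Ia) ** 2 / (P + 0.8 * S)

def observed_cn(P, Q):
    """Back-calculate CN from an observed rainfall-runoff pair (mm)."""
    S = 5.0 * (P + 2.0 * Q - (4.0 * Q ** 2 + 5.0 * P * Q) ** 0.5)
    return 25400.0 / (S + 254.0)

q_mm = runoff_depth(80.0, 75.0)    # runoff from an 80 mm storm at CN 75
cn_back = observed_cn(80.0, q_mm)  # round trip recovers CN 75
```

Applying `observed_cn` to events of increasing rainfall depth is what reveals the declining-CN ("standard") response reported for both catchments.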
A note on families of fragility curves
International Nuclear Information System (INIS)
Kaplan, S.; Bier, V.M.; Bley, D.C.
1989-01-01
In the quantitative assessment of seismic risk, uncertainty in the fragility of a structural component is usually expressed by putting forth a family of fragility curves, with probability serving as the parameter of the family. Commonly, a lognormal shape is used both for the individual curves and for the expression of uncertainty over the family. A so-called composite single curve can also be drawn and used for purposes of approximation. This composite curve is often regarded as equivalent to the mean curve of the family. The equality seems intuitively reasonable but, according to the authors, has never been proven. This paper proves the equivalence hypothesis mathematically. Moreover, the authors show that this equivalence hypothesis between fragility curves is itself equivalent to an identity property of the standard normal probability curve. Thus, in the course of proving the fragility curve hypothesis, the authors have also proved a rather obscure, but interesting and perhaps previously unrecognized, property of the standard normal curve
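The equivalence proved in the note can be checked numerically: averaging the lognormal fragility family over a lognormal uncertainty in the median capacity reproduces the composite curve, whose log-standard deviation is sqrt(beta_r^2 + beta_u^2). A sketch with illustrative parameter values:

```python
import math

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def composite_fragility(a, a_m, beta_r, beta_u):
    """Composite single lognormal fragility curve at demand level a."""
    return phi(math.log(a / a_m) / math.hypot(beta_r, beta_u))

def mean_fragility(a, a_m, beta_r, beta_u, n=4000):
    """Mean of the fragility family: trapezoidal integration of the
    randomness curve over the normal uncertainty in log-median capacity."""
    mu0 = math.log(a_m)
    lo, hi = mu0 - 8.0 * beta_u, mu0 + 8.0 * beta_u
    h = (hi - lo) / n
    total = 0.0
    for i in range(n + 1):
        mu = lo + i * h
        density = (math.exp(-0.5 * ((mu - mu0) / beta_u) ** 2)
                   / (beta_u * math.sqrt(2.0 * math.pi)))
        val = phi((math.log(a) - mu) / beta_r) * density
        total += val * (0.5 if i in (0, n) else 1.0)
    return total * h

mean_val = mean_fragility(0.6, 0.9, 0.25, 0.35)
comp_val = composite_fragility(0.6, 0.9, 0.25, 0.35)  # agrees with mean_val
```

The agreement is exactly the standard-normal identity the authors point to: E[Phi((x - M)/b_r)] = Phi((x - m)/sqrt(b_r^2 + b_u^2)) when M is normal with mean m and standard deviation b_u.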
Observable Zitterbewegung in curved spacetimes
Kobakhidze, Archil; Manning, Adrian; Tureanu, Anca
2016-06-01
Zitterbewegung, as it was originally described by Schrödinger, is an unphysical, non-observable effect. We verify whether the effect can be observed in non-inertial reference frames/curved spacetimes, where the ambiguity in defining particle states results in a mixing of positive and negative frequency modes. We explicitly demonstrate that such a mixing is in fact necessary to obtain the correct classical value for a particle's velocity in a uniformly accelerated reference frame, whereas in cosmological spacetime a particle does indeed exhibit Zitterbewegung.
Observable Zitterbewegung in curved spacetimes
Energy Technology Data Exchange (ETDEWEB)
Kobakhidze, Archil, E-mail: archilk@physics.usyd.edu.au [ARC Centre of Excellence for Particle Physics at the Terascale, School of Physics, The University of Sydney, NSW 2006 (Australia); Manning, Adrian, E-mail: a.manning@physics.usyd.edu.au [ARC Centre of Excellence for Particle Physics at the Terascale, School of Physics, The University of Sydney, NSW 2006 (Australia); Tureanu, Anca, E-mail: anca.tureanu@helsinki.fi [Department of Physics, University of Helsinki, P.O. Box 64, 00014 Helsinki (Finland)
2016-06-10
Zitterbewegung, as it was originally described by Schrödinger, is an unphysical, non-observable effect. We verify whether the effect can be observed in non-inertial reference frames/curved spacetimes, where the ambiguity in defining particle states results in a mixing of positive and negative frequency modes. We explicitly demonstrate that such a mixing is in fact necessary to obtain the correct classical value for a particle's velocity in a uniformly accelerated reference frame, whereas in cosmological spacetime a particle does indeed exhibit Zitterbewegung.
Differential geometry curves, surfaces, manifolds
Kühnel, Wolfgang
2002-01-01
This carefully written book is an introduction to the beautiful ideas and results of differential geometry. The first half covers the geometry of curves and surfaces, which provide much of the motivation and intuition for the general theory. Special topics that are explored include Frenet frames, ruled surfaces, minimal surfaces and the Gauss-Bonnet theorem. The second part is an introduction to the geometry of general manifolds, with particular emphasis on connections and curvature. The final two chapters are insightful examinations of the special cases of spaces of constant curvature and Einstein manifolds. The text is illustrated with many figures and examples. The prerequisites are undergraduate analysis and linear algebra.
A NURBS approximation of experimental stress-strain curves
International Nuclear Information System (INIS)
Fedorov, Timofey V.; Morrev, Pavel G.
2016-01-01
A compact universal representation of monotonic experimental stress-strain curves of metals and alloys is proposed. It is based on nonuniform rational Bézier splines (NURBS) of second order and may be used in a computer library of materials. Only six parameters per curve are needed; this is equivalent to specifying only three points in the stress-strain plane. NURBS functions of higher order prove to be superfluous. Explicit expressions for both the yield stress and the hardening modulus are given. Two types of curves are considered: over a finite interval of strain and over an infinite one. A broad class of metals and alloys of various chemical compositions, subjected to various types of preliminary thermo-mechanical working, was selected from a comprehensive database in order to test the proposed methodology. The results demonstrate excellent correspondence with the experimental data. Keywords: work hardening, stress-strain curve, spline approximation, nonuniform rational B-spline, NURBS.
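A second-order rational Bézier (NURBS) segment of the kind the representation is built on can be evaluated in a few lines; with end weights normalized to 1, a segment is fixed by three control points plus a middle weight. The control points and weight below are illustrative, not fitted material parameters from the paper:

```python
def rational_bezier2(p0, p1, p2, w1, t):
    """Point on a degree-2 rational Bezier segment at parameter t in [0, 1].
    End weights are taken as 1; w1 is the middle control-point weight."""
    b0 = (1.0 - t) ** 2
    b1 = 2.0 * t * (1.0 - t) * w1
    b2 = t ** 2
    denom = b0 + b1 + b2
    return ((b0 * p0[0] + b1 * p1[0] + b2 * p2[0]) / denom,
            (b0 * p0[1] + b1 * p1[1] + b2 * p2[1]) / denom)

# Illustrative (strain, stress in MPa) control points: a yield point,
# a shaping control point, and a saturation point.
start, ctrl, end = (0.002, 200.0), (0.05, 350.0), (0.20, 400.0)
mid = rational_bezier2(start, ctrl, end, w1=2.0, t=0.5)
```

Raising w1 pulls the curve toward the middle control point, which is how a single weight can tune the hardening shape between the three anchor points.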
Projection of curves on B-spline surfaces using quadratic reparameterization
Yang, Yijun
2010-09-01
Curves on surfaces play an important role in computer aided geometric design. In this paper, we present a hyperbola approximation method based on the quadratic reparameterization of Bézier surfaces, which generates reasonable low-degree curves lying completely on the surfaces by using iso-parameter curves of the reparameterized surfaces. The Hausdorff distance between the projected curve and the original curve is controlled under the user-specified distance tolerance. The projected curve is T-G1 continuous, where T is the user-specified angle tolerance. Examples are given to show the performance of our algorithm. © 2010 Elsevier Inc. All rights reserved.
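The Hausdorff-distance control described here can be sketched for discretely sampled curves; in practice both curves would be sampled densely and the tolerance compared against this bound. The sample curves below are illustrative:

```python
import math

def hausdorff_distance(curve_a, curve_b):
    """Discrete symmetric Hausdorff distance between two sampled 2D curves."""
    def directed(ps, qs):
        return max(min(math.dist(p, q) for q in qs) for p in ps)
    return max(directed(curve_a, curve_b), directed(curve_b, curve_a))

# A densely sampled parabola vs. a coarse, slightly offset approximation.
original = [(t / 100.0, (t / 100.0) ** 2) for t in range(101)]
approx = [(t / 10.0, (t / 10.0) ** 2 + 0.01) for t in range(11)]
dist = hausdorff_distance(original, approx)
```

If `dist` exceeded the user-specified tolerance, the approximation would be refined (e.g. with more segments) and the distance re-checked.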
Differential geometry and topology of curves
Aminov, Yu
2001-01-01
Differential geometry is an actively developing area of modern mathematics. This volume presents a classical approach to the general topics of the geometry of curves, including the theory of curves in n-dimensional Euclidean space. The author investigates problems for special classes of curves and gives the working method used to obtain the conditions for closed polygonal curves. The proof of the Bakel-Werner theorem under conditions of boundedness for curves with periodic curvature and torsion is also presented. This volume also highlights the contributions made by great geometers, past and present, to differential geometry and the topology of curves.
Placement Design of Changeable Message Signs on Curved Roadways
Directory of Open Access Journals (Sweden)
Zhongren Wang, Ph.D. P.E. T.E.
2015-01-01
Full Text Available This paper presents a fundamental framework for Changeable Message Sign (CMS) placement design along roadways with horizontal curves. This analytical framework determines the distance available for motorists to read and react to CMS messages based on CMS character height, the driver's cone of vision, the CMS pixels' cone of legibility, the roadway horizontal curve radius, and the CMS lateral and vertical placement. Sample design charts were developed to illustrate how the analytical framework may facilitate CMS placement design.
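A much-simplified version of the geometry can be sketched as follows: legibility begins at a fixed number of feet per inch of character height, the horizontal curve adds a middle-ordinate term x^2/(2R) to the sign's effective lateral offset, and reading ends once the sign falls outside the driver's cone of vision. The legibility index, cone half-angle, and placement numbers are illustrative assumptions, not the paper's design values, and vertical placement is ignored here:

```python
import math

def reading_window(char_height_in, offset_ft, radius_ft, cone_half_angle_deg,
                   legibility_index=40.0):
    """Distance window (ft) over which a CMS is both legible and within the
    driver's cone of vision on a horizontal curve (simplified sketch)."""
    legibility = legibility_index * char_height_in
    t = math.tan(math.radians(cone_half_angle_deg))
    # The sign is inside the cone while offset + x**2/(2R) <= x * tan(half-angle),
    # a quadratic in x whose roots bracket the visible range.
    disc = (radius_ft * t) ** 2 - 2.0 * radius_ft * offset_ft
    if disc < 0.0:
        return 0.0  # the sign never enters the cone of vision
    x_near = radius_ft * t - math.sqrt(disc)  # lost from view closer than this
    x_far = radius_ft * t + math.sqrt(disc)   # enters the cone here
    return max(0.0, min(legibility, x_far) - x_near)

curved = reading_window(18.0, 20.0, 3000.0, 10.0)
```

Shrinking the radius shrinks (and eventually eliminates) the window, which is why placement design on curved roadways must account for the curve radius explicitly.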
Multiphoton absorption coefficients in solids: a universal curve
International Nuclear Information System (INIS)
Brandi, H.S.; Araujo, C.B. de
1983-04-01
A universal curve for the frequency dependence of the multiphoton absorption coefficient is proposed based on a 'non-perturbative' approach. Specific applications have been made to obtain two-, three-, four- and five-photon absorption coefficients in different materials. Proper scaling of the two-photon absorption coefficient and the use of the universal curve yield results for the higher-order absorption coefficients in good agreement with the experimental data. (Author) [pt
Flow characteristics of curved ducts
Directory of Open Access Journals (Sweden)
Rudolf P.
2007-10-01
Full Text Available Curved channels are very often present in real hydraulic systems, e.g. curved diffusers of hydraulic turbines, S-shaped bulb turbines, fittings, etc. Curvature brings a change of velocity profile, generation of vortices and production of hydraulic losses. Flow simulations using CFD techniques were performed to understand these phenomena. Cases ranging from a single elbow to coupled elbows in U shapes, S shapes and a spatial right-angle position, all with circular cross-section, were modeled for Re = 60000. The spatial development of the flow was studied, and consequently it was deduced that minor losses are connected with the transformation of pressure energy into kinetic energy and vice versa. This transformation is a dissipative process and is reflected in the amount of energy irreversibly lost. The smallest loss coefficient is connected with flow in U-shaped elbows, the largest with flow in S-shaped elbows. Finally, the extent of the flow domain influenced by the presence of curvature was examined. This is important for the proper placement of mano- and flowmeters during experimental tests. Simulations were verified with experimental results presented in the literature.
Improved capacitive melting curve measurements
International Nuclear Information System (INIS)
Sebedash, Alexander; Tuoriniemi, Juha; Pentti, Elias; Salmela, Anssi
2009-01-01
Sensitivity of the capacitive method for determining the melting pressure of helium can be enhanced by loading the empty side of the capacitor with helium at a pressure nearly equal to that to be measured and by using a relatively thin and flexible membrane in between. This way one can achieve nanobar resolution at the level of 30 bar, which is two orders of magnitude better than that of the best gauges with vacuum reference. This extends the applicability of melting curve thermometry to lower temperatures and would allow detecting tiny anomalies in the melting pressure, which must be associated with any phenomena contributing to the entropy of the liquid or solid phases. We demonstrated this principle in measurements of the crystallization pressure of isotopic helium mixtures at millikelvin temperatures by using partly solid pure 4He as the reference substance, providing the best possible universal reference pressure. The achieved sensitivity was good enough for melting curve thermometry on mixtures down to 100 μK. A similar system can be used on pure isotopes by virtue of a blocked capillary giving a stable reference condition with liquid slightly below the melting pressure in the reference volume. This was tested with pure 4He at temperatures of 0.08-0.3 K. To avoid spurious heating effects, one must carefully choose and arrange any dielectric materials close to the active capacitor. We observed some 100 pW loading at moderate excitation voltages.
Classical optics and curved spaces
International Nuclear Information System (INIS)
Bailyn, M.; Ragusa, S.
1976-01-01
In the eikonal approximation of classical optics, the unit polarization 3-vector of light satisfies an equation that depends only on the index of refraction, n. It is known that if the original 3-space line element is d sigma^2, then this polarization direction propagates parallelly in the fictitious space n^2 d sigma^2. Since the equation depends only on n, it is possible to invent a fictitious curved 4-space in which the light performs a null geodesic and the polarization 3-vector behaves as the 'shadow' of a parallelly propagated 4-vector. The inverse, namely the reduction of Maxwell's equations on a curved (dielectric-free) space to a classical space with dielectric constant n = (-g_00)^(-1/2), is well known, but in the latter the dielectric constant epsilon and permeability mu must also equal (-g_00)^(-1/2). The rotation of polarization as light bends around the sun is calculated by utilizing the reduction to the classical space. This (non-)rotation may then be interpreted as parallel transport in the 3-space n^2 d sigma^2 [pt
p-Curve and p-Hacking in Observational Research.
Bruns, Stephan B; Ioannidis, John P A
2016-01-01
The p-curve, the distribution of statistically significant p-values of published studies, has been used to make inferences on the proportion of true effects and on the presence of p-hacking in the published literature. We analyze the p-curve for observational research in the presence of p-hacking. We show by means of simulations that even with minimal omitted-variable bias (e.g., unaccounted confounding), p-curves based on true effects and p-curves based on null effects with p-hacking cannot be reliably distinguished. We also demonstrate this problem using as a practical example the evaluation of the effect of malaria prevalence on economic growth between 1960 and 1996. These findings call into question recent studies that use the p-curve to infer that most published research findings are based on true effects in the medical literature and in a wide range of disciplines. p-values in observational research may need to be empirically calibrated to be interpretable with respect to the commonly used significance threshold of 0.05. Violations of randomization in experimental studies may also result in situations where the use of p-curves is similarly unreliable.
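The kind of simulation argument used in the paper can be illustrated with a toy optional-stopping model: under a true null effect, re-testing after each added batch of observations inflates the share of significant results and reshapes the p-curve. A sketch using z-tests with known unit variance; the sample sizes and number of looks are arbitrary illustrative choices:

```python
import math
import random

def p_value_two_sided(z):
    """Two-sided p-value for a standard-normal test statistic."""
    return 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))

def hacked_study(rng, effect=0.0, n=20, max_looks=5):
    """One simulated study with optional stopping: test after every batch
    of n observations and stop as soon as p < .05."""
    total, count, p = 0.0, 0, 1.0
    for _ in range(max_looks):
        total += sum(rng.gauss(effect, 1.0) for _ in range(n))
        count += n
        z = (total / count) * math.sqrt(count)  # known unit variance
        p = p_value_two_sided(z)
        if p < 0.05:
            break
    return p

rng = random.Random(1)
pvals = [hacked_study(rng) for _ in range(2000)]  # all true nulls
sig = [p for p in pvals if p < 0.05]
sig_rate = len(sig) / len(pvals)  # inflated above the nominal 0.05
share_small = sum(p < 0.025 for p in sig) / len(sig)
```

The resulting p-curve of "significant" null results is the kind of distribution the paper shows can mimic a p-curve of true effects once even mild confounding enters.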
Atlas of Stress-Strain Curves
2002-01-01
The Atlas of Stress-Strain Curves, Second Edition is substantially bigger in page dimensions, number of pages, and total number of curves than the previous edition. It contains over 1,400 curves, almost three times as many as in the 1987 edition. The curves are normalized in appearance to aid making comparisons among materials. All diagrams include metric (SI) units, and many also include U.S. customary units. All curves are captioned in a consistent format with valuable information including (as available) standard designation, the primary source of the curve, mechanical properties (including hardening exponent and strength coefficient), condition of sample, strain rate, test temperature, and alloy composition. Curve types include monotonic and cyclic stress-strain, isochronous stress-strain, and tangent modulus. Curves are logically arranged and indexed for fast retrieval of information. The book also includes an introduction that provides background information on methods of stress-strain determination, on...
Transition curves for highway geometric design
Kobryń, Andrzej
2017-01-01
This book provides concise descriptions of the various solutions of transition curves, which can be used in the geometric design of roads and highways. It presents mathematical methods and curvature functions for defining transition curves.
Comparison and evaluation of mathematical lactation curve ...
African Journals Online (AJOL)
A mathematical model of the lactation curve provides summary information about culling and milking strategies ... statistics of the edited data for first-lactation Holstein cows ... application of different models to the lactation curves of ...
Strong laws for generalized absolute Lorenz curves when data are stationary and ergodic sequences
R. Helmers (Roelof); R. Zitikis
2004-01-01
We consider generalized absolute Lorenz curves that include, as special cases, classical and generalized L-statistics as well as absolute or, in other words, generalized Lorenz curves. The curves are based on strictly stationary and ergodic sequences of random variables. Most of the
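The empirical generalized (absolute) Lorenz curve underlying such results is simple to compute from a sample: its ordinate at p is the scaled partial sum of the smallest ⌊np⌋ order statistics. A minimal sketch (not the paper's notation) is:

```python
def generalized_lorenz(sample, p):
    """Empirical generalized Lorenz ordinate at p in [0, 1]:
    (1/n) * sum of the floor(n*p) smallest observations.
    At p = 1 this equals the sample mean."""
    xs = sorted(sample)
    n = len(xs)
    k = int(p * n)
    return sum(xs[:k]) / n

data = [1, 2, 3, 4]
print(generalized_lorenz(data, 0.5))  # (1 + 2) / 4 = 0.75
print(generalized_lorenz(data, 1.0))  # sample mean = 2.5
```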
Gabauer, Douglas J; Li, Xiaolong
2015-04-01
The purpose of this study was to investigate motorcycle-to-barrier crash frequency on horizontally curved roadway sections in Washington State using police-reported crash data linked with roadway data and augmented with barrier presence information. Data included 4915 horizontal curved roadway sections with 252 of these sections experiencing 329 motorcycle-to-barrier crashes between 2002 and 2011. Negative binomial regression was used to predict motorcycle-to-barrier crash frequency using horizontal curvature and other roadway characteristics. Based on the model results, the strongest predictor of crash frequency was found to be curve radius. This supports a motorcycle-to-barrier crash countermeasure placement criterion based, at the very least, on horizontal curve radius. With respect to the existing horizontal curve criterion of 820 feet or less, curves meeting this criterion were found to increase motorcycle-to-barrier crash frequency rate by a factor of 10 compared to curves not meeting this criterion. Other statistically significant predictors were curve length, traffic volume and the location of adjacent curves. Assuming curves of identical radius, the model results suggest that longer curves, those with higher traffic volume, and those that have no adjacent curved sections within 300 feet of either curve end would likely be better candidates for a motorcycle-to-barrier crash countermeasure. Copyright © 2015 Elsevier Ltd. All rights reserved.
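A negative binomial crash-frequency model of the kind the study fits uses a log link, so the expected count is an exponential of a linear predictor in (typically log-transformed) roadway variables. The sketch below shows only the functional form; the coefficients are hypothetical placeholders, not the paper's estimates (b1 < 0 encodes "smaller radius, more crashes"):

```python
import math

def expected_crashes(curve_radius_ft, curve_len_ft, aadt,
                     b=(-9.0, -0.6, 0.8, 0.9)):
    """Log-link crash-frequency form used with negative binomial
    regression:
        E[crashes] = exp(b0 + b1*ln(radius) + b2*ln(length) + b3*ln(AADT))
    All coefficient values here are invented for illustration."""
    b0, b1, b2, b3 = b
    return math.exp(b0
                    + b1 * math.log(curve_radius_ft)
                    + b2 * math.log(curve_len_ft)
                    + b3 * math.log(aadt))

# Sharper curve -> higher expected frequency, as in the study's findings:
print(expected_crashes(500.0, 1000.0, 5000.0))
print(expected_crashes(1500.0, 1000.0, 5000.0))
```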
Statistical re-evaluation of the ASME K_IC and K_IR fracture toughness reference curves
Energy Technology Data Exchange (ETDEWEB)
Wallin, K.; Rintamaa, R. [Valtion Teknillinen Tutkimuskeskus, Espoo (Finland)
1998-11-01
Historically the ASME reference curves have been treated as representing absolute deterministic lower-bound curves of fracture toughness. In reality, this is not the case. They represent only deterministic lower bounds to a specific set of data, which corresponds to a certain probability range. A recently developed statistical lower-bound estimation method, called the 'Master curve', has been proposed as a candidate for a new lower-bound reference curve concept. From a regulatory point of view, the Master curve is somewhat problematic in that it does not claim to be an absolute deterministic lower bound, but corresponds to a specific theoretical failure probability that can be chosen freely based on the application. In order to be able to substitute the old ASME reference curves with lower-bound curves based on the Master curve concept, the inherent statistical nature (and confidence level) of the ASME reference curves must be revealed. In order to estimate the true inherent level of safety represented by the reference curves, the original data base was re-evaluated with statistical methods and compared to an analysis based on the Master curve concept. The analysis reveals that the 5% lower-bound Master curve has the same inherent degree of safety as originally intended for the K_IC reference curve. Similarly, the 1% lower-bound Master curve corresponds to the K_IR reference curve. (orig.)
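The Master curve concept attaches an explicit failure probability to each toughness bound. In its standard form (as codified in ASTM E1921; the sketch below uses that published form rather than anything reproduced from this paper), the toughness quantile for a 1T specimen is K_Jc(p) = 20 + [ln(1/(1−p))]^(1/4) · (11 + 77·e^(0.019·(T−T0))):

```python
import math

def master_curve_kjc(T, T0, p=0.05):
    """Fracture toughness K_Jc [MPa*sqrt(m)] at cumulative failure
    probability p for a 1T specimen, standard Master curve form.
    T: test temperature [deg C], T0: reference temperature [deg C].
    p = 0.5 recovers the familiar median 30 + 70*exp(0.019*(T - T0))."""
    scale = (math.log(1.0 / (1.0 - p))) ** 0.25
    return 20.0 + scale * (11.0 + 77.0 * math.exp(0.019 * (T - T0)))

# Median vs. the 5% and 1% lower bounds discussed in the abstract, at T = T0:
for p in (0.5, 0.05, 0.01):
    print(f"p = {p}: K_Jc = {master_curve_kjc(0.0, 0.0, p):.1f} MPa*sqrt(m)")
```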
The estimation of I–V curves of PV panel using manufacturers’ I–V curves and evolutionary strategy
International Nuclear Information System (INIS)
Barukčić, M.; Hederić, Ž.; Špoljarić, Ž.
2014-01-01
Highlights: • The approximation of an I–V curve by two linear functions and a sigmoid function is proposed. • The sigmoid function is used to estimate the knee of the I–V curve. • The dependence of the sigmoid-function parameters on irradiance and temperature is proposed. • The sigmoid function is used to estimate the maximum power point (MPP). - Abstract: A method for estimating the I–V curves of a photovoltaic (PV) panel by an analytic expression is presented in the paper. The problem is defined in the form of an optimization problem, whose objective is based on manufacturers' I–V curve data or measured I–V curves. In order to estimate the PV panel parameters, the optimization problem is solved using an evolutionary strategy. The proposed method, which approximates the I–V curve with two linear functions and a sigmoid function, is tested for different PV panel technologies using data sheets. A method for estimating the knee of the I–V curve and the maximum power point at any irradiance and temperature is proposed.
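The "two linear functions blended by a sigmoid" idea can be illustrated directly: a near-constant current-source branch before the knee and a falling branch after it, joined smoothly by a logistic weight. This is only a sketch of the functional form; the panel parameters and the particular blending below are invented, not the paper's fitted expressions:

```python
import math

def iv_curve(V, Isc=8.0, Voc=36.0, k=0.5, Vknee=30.0):
    """Illustrative I-V approximation: two linear branches blended
    by a sigmoid centred at the knee voltage.
    Isc: short-circuit current [A], Voc: open-circuit voltage [V],
    k: sigmoid steepness, Vknee: knee voltage (all values made up)."""
    s = 1.0 / (1.0 + math.exp(-k * (V - Vknee)))  # ~0 before knee, ~1 after
    branch1 = Isc                                  # flat branch, I ~ Isc
    branch2 = Isc * (Voc - V) / (Voc - Vknee)      # falls to zero at Voc
    return (1.0 - s) * branch1 + s * branch2

# Current is ~Isc at low voltage and collapses past the knee:
for V in (0.0, 15.0, 30.0, 36.0):
    print(f"V = {V:4.1f} V -> I = {iv_curve(V):.2f} A")
```

The maximum power point can then be located by maximizing V · iv_curve(V) over the knee region, which is the role the abstract assigns to the sigmoid term.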
Bubble Collision in Curved Spacetime
International Nuclear Information System (INIS)
Hwang, Dong-il; Lee, Bum-Hoon; Lee, Wonwoo; Yeom, Dong-han
2014-01-01
We study vacuum bubble collisions in curved spacetime, in which vacuum bubbles were nucleated in the initial metastable vacuum state by quantum tunneling. The bubbles materialize randomly at different times and then start to grow. It is known that percolation by true vacuum bubbles is not possible due to the exponential expansion of the space among the bubbles. In this paper, we consider two bubbles of the same size with a preferred axis and assume that the two bubbles form very near each other and collide. The two bubbles have the same field value. When the bubbles collide, the collided region oscillates back and forth, then eventually decays and disappears. We discuss the radiation and gravitational waves resulting from the collision of the two bubbles.
Bacterial streamers in curved microchannels
Rusconi, Roberto; Lecuyer, Sigolene; Guglielmini, Laura; Stone, Howard
2009-11-01
Biofilms, generally identified as microbial communities embedded in a self-produced matrix of extracellular polymeric substances, are involved in a wide variety of health-related problems ranging from implant-associated infections to disease transmission and dental plaque. The usual picture of these bacterial films is that they grow and develop on surfaces. However, suspended biofilm structures, or streamers, have been found in natural environments (e.g., rivers, acid mines, hydrothermal hot springs) and have usually been attributed to turbulent flow. We report the formation of bacterial streamers in curved microfluidic channels. Using confocal laser microscopy, we are able to directly image and characterize the spatial and temporal evolution of these filamentous structures. Such streamers, which connect the inner corners of opposite sides of the channel, are always located in the middle plane. Numerical simulations of the flow provide evidence of an underlying hydrodynamic mechanism behind the formation of the streamers.
Sibling curves of quadratic polynomials | Wiggins | Quaestiones ...
African Journals Online (AJOL)
Sibling curves were demonstrated in [1, 2] as a novel way to visualize the zeroes of real-valued functions. In [3] it was shown that a polynomial of degree n has n sibling curves. This paper focuses on the algebraic and geometric properties of the sibling curves of real and complex quadratic polynomials. Key words: Quadratic ...
GLOBAL AND STRICT CURVE FITTING METHOD
Nakajima, Y.; Mori, S.
2004-01-01
To find a global and smooth curve fitting, the cubic B-spline method and the gathering-line method are investigated. When segmenting and recognizing a contour curve of a character shape, some global method is required. If we want to connect contour curves around a singular point, such as crossing points,
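A uniform cubic B-spline, the building block of such global smooth fits, evaluates each curve segment as a fixed cubic blend of four consecutive control points; a minimal sketch of one segment (standard basis-matrix form, independent of this paper's specific method) is:

```python
def cubic_bspline_point(p0, p1, p2, p3, t):
    """Point on a uniform cubic B-spline segment, t in [0, 1],
    blending four consecutive control points with the standard
    cubic B-spline basis (the weights sum to 1 for every t)."""
    b0 = (1 - t) ** 3 / 6.0
    b1 = (3 * t**3 - 6 * t**2 + 4) / 6.0
    b2 = (-3 * t**3 + 3 * t**2 + 3 * t + 1) / 6.0
    b3 = t ** 3 / 6.0
    return tuple(b0 * a + b1 * b + b2 * c + b3 * d
                 for a, b, c, d in zip(p0, p1, p2, p3))

# The spline smooths rather than interpolates: at t = 0 the point is
# (p0 + 4*p1 + p2) / 6, near but not on the control point p1.
print(cubic_bspline_point((0.0, 0.0), (1.0, 0.0), (2.0, 0.0), (3.0, 0.0), 0.0))
```

Chaining segments over a sliding window of control points yields the globally C2-continuous contour curve that the segmentation step requires.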
Trigonometric Characterization of Some Plane Curves
Indian Academy of Sciences (India)
IAS Admin
(Figure 1). A relation between tan θ and tan ψ gives the trigonometric equation of the family of curves. In this article, trigonometric equations of some known plane curves are deduced and it is shown that these equations reveal some geometric characteristics of the families of the curves under consideration. In Section 2,
M-curves and symmetric products
Indian Academy of Sciences (India)
Indranil Biswas
2017-08-03
is bounded above by g + 1, where g is the genus of X [11]. Curves which have exactly the maximum number (i.e., genus + 1) of components of the real part are called M-curves. Classifying real algebraic curves up to homeomorphism is straightforward; however, classifying even planar non-singular real ...
Holomorphic curves in exploded manifolds: Kuranishi structure
Parker, Brett
2013-01-01
This paper constructs a Kuranishi structure for the moduli stack of holomorphic curves in exploded manifolds. To avoid some technicalities of abstract Kuranishi structures, we embed our Kuranishi structure inside a moduli stack of curves. The construction also works for the moduli stack of holomorphic curves in any compact symplectic manifold.