DUAL TIMELIKE NORMAL AND DUAL TIMELIKE SPHERICAL CURVES IN DUAL MINKOWSKI SPACE
ÖNDER, Mehmet
2009-01-01
Abstract: In this paper, we give characterizations of dual timelike normal and dual timelike spherical curves in the dual Minkowski 3-space, and we show that every dual timelike normal curve is also a dual timelike spherical curve. Keywords: Normal curves, Dual Minkowski 3-Space, Dual Timelike curves. Mathematics Subject Classifications (2000): 53C50, 53C40.
Morse theory on timelike and causal curves
International Nuclear Information System (INIS)
Everson, J.; Talbot, C.J.
1976-01-01
It is shown that the set of timelike curves in a globally hyperbolic space-time manifold can be given the structure of a Hilbert manifold under a suitable definition of 'timelike.' The causal curves are the topological closure of this manifold. The Lorentzian energy (corresponding to Milnor's energy, except that the Lorentzian inner product is used) is shown to be a Morse function for the space of causal curves. A fixed end point index theorem is obtained in which a lower bound for the index of the Hessian of the Lorentzian energy is given in terms of the sum of the orders of the conjugate points between the end points. (author)
Experimental simulation of closed timelike curves.
Ringbauer, Martin; Broome, Matthew A; Myers, Casey R; White, Andrew G; Ralph, Timothy C
2014-06-19
Closed timelike curves are among the most controversial features of modern physics. As legitimate solutions to Einstein's field equations, they allow for time travel, which instinctively seems paradoxical. However, in the quantum regime these paradoxes can be resolved, leaving closed timelike curves consistent with relativity. The study of these systems therefore provides valuable insight into nonlinearities and the emergence of causal structures in quantum mechanics--essential for any formulation of a quantum theory of gravity. Here we experimentally simulate the nonlinear behaviour of a qubit interacting unitarily with an older version of itself, addressing some of the fascinating effects that arise in systems traversing a closed timelike curve. These include perfect discrimination of non-orthogonal states and, most intriguingly, the ability to distinguish nominally equivalent ways of preparing pure quantum states. Finally, we examine the dependence of these effects on the initial qubit state, the form of the unitary interaction and the influence of decoherence.
Open Timelike Curves Violate Heisenberg's Uncertainty Principle
Pienaar, J. L.; Ralph, T. C.; Myers, C. R.
2013-02-01
Toy models for quantum evolution in the presence of closed timelike curves have gained attention in the recent literature due to the strange effects they predict. The circuits that give rise to these effects appear quite abstract and contrived, as they require nontrivial interactions between the future and past that lead to infinitely recursive equations. We consider the special case in which there is no interaction inside the closed timelike curve, referred to as an open timelike curve (OTC), for which the only local effect is to increase the time elapsed by a clock carried by the system. Remarkably, circuits with access to OTCs are shown to violate Heisenberg’s uncertainty principle, allowing perfect state discrimination and perfect cloning of coherent states. The model is extended to wave packets and smoothly recovers standard quantum mechanics in an appropriate physical limit. The analogy with general relativistic time dilation suggests that OTCs provide a novel alternative to existing proposals for the behavior of quantum systems under gravity.
Dual Smarandache Curves of a Timelike Curve lying on Unit dual Lorentzian Sphere
Kahraman, Tanju; Hüseyin Ugurlu, Hasan
2016-01-01
In this paper, we give a Darboux approximation for the dual Smarandache curves of a timelike curve lying on the unit dual Lorentzian sphere. Firstly, we define the four types of dual Smarandache curves of a timelike curve lying on the dual Lorentzian sphere.
Closed timelike curves in asymmetrically warped brane universes
Päs, Heinrich; Pakvasa, Sandip; Dent, James; Weiler, Thomas J.
2009-08-01
In asymmetrically-warped spacetimes different warp factors are assigned to space and to time. We discuss causality properties of these warped brane universes and argue that scenarios with two extra dimensions may allow for timelike curves which can be closed via paths in the extra-dimensional bulk. In particular, necessary and sufficient conditions on the metric for the existence of closed timelike curves are presented. We find a six-dimensional warped metric which satisfies the CTC conditions, and where the null, weak and dominant energy conditions are satisfied on the brane (although only the former remains satisfied in the bulk). Such scenarios are interesting, since they open the possibility of experimentally testing the chronology protection conjecture by manipulating on our brane initial conditions of gravitons or hypothetical gauge-singlet fermions (“sterile neutrinos”) which then propagate in the extra dimensions.
Quantum State Cloning Using Deutschian Closed Timelike Curves
Brun, Todd A.; Wilde, Mark M.; Winter, Andreas
2013-11-01
We show that it is possible to clone quantum states to arbitrary accuracy in the presence of a Deutschian closed timelike curve (D-CTC), with a fidelity converging to one in the limit as the dimension of the CTC system becomes large—thus resolving an open conjecture [Brun et al., Phys. Rev. Lett. 102, 210402 (2009)]. This result follows from a D-CTC-assisted scheme for producing perfect clones of a quantum state prepared in a known eigenbasis, and the fact that one can reconstruct an approximation of a quantum state from empirical estimates of the probabilities of an informationally complete measurement. Our results imply more generally that every continuous, but otherwise arbitrarily nonlinear map from states to states, can be implemented to arbitrary accuracy with D-CTCs. Furthermore, our results show that Deutsch’s model for closed timelike curves is in fact a classical model, in the sense that two arbitrary, distinct density operators are perfectly distinguishable (in the limit of a large closed timelike curve system); hence, in this model quantum mechanics becomes a classical theory in which each density operator is a distinct point in a classical phase space.
Supertube domain walls and elimination of closed timelike curves in string theory
International Nuclear Information System (INIS)
Drukker, Nadav
2004-01-01
We show that some novel physics of supertubes removes closed timelike curves from many supersymmetric spaces which naively suffer from this problem. The main claim is that supertubes naturally form domain walls, so while analytical continuation of the metric would lead to closed timelike curves, across the domain wall the metric is nondifferentiable, and the closed timelike curves are eliminated. In the examples we study, the metric inside the domain wall is always of the Gödel type, while outside the shell it looks like a localized rotating object, often a rotating black hole. Thus this mechanism prevents the appearance of closed timelike curves behind the horizons of certain rotating black holes.
On Closed Timelike Curves and Warped Brane World Models
Directory of Open Access Journals (Sweden)
Slagter Reinoud Jan
2013-09-01
At first glance, it seems possible to construct causality-violating solutions in general relativity. The most striking one is the Gott spacetime: two cosmic strings, approaching each other with high velocity, could produce closed timelike curves. It was quickly recognized that this solution violates physical boundary conditions: the effective one-particle generator becomes hyperbolic, so the center of mass is tachyonic. On a 5-dimensional warped spacetime, it seems possible to get an elliptic generator, so no obstruction is encountered and the velocity of the center of mass of the effective particle has an overlap with the Gott region. So a CTC could, in principle, be constructed. However, from the effective 4D field equations on the brane, which are influenced by the projection of the bulk Weyl tensor onto the brane, it follows that no asymptotically conical spacetime is found, and hence no angle deficit as in the 4D counterpart model. This could also explain why we do not observe cosmic strings.
Exact string theory model of closed timelike curves and cosmological singularities
International Nuclear Information System (INIS)
Johnson, Clifford V.; Svendsen, Harald G.
2004-01-01
We study an exact model of string theory propagating in a space-time containing regions with closed timelike curves (CTCs) separated from a finite cosmological region bounded by a big bang and a big crunch. The model is a nontrivial embedding of the Taub-NUT geometry into heterotic string theory with a full conformal field theory (CFT) definition, discovered over a decade ago as a heterotic coset model. Having a CFT definition makes this an excellent laboratory for the study of the stringy fate of CTCs, the Taub cosmology, and the Milne/Misner-type chronology horizon which separates them. In an effort to uncover the role of stringy corrections to such geometries, we calculate the complete set of α′ corrections to the geometry. We observe that the key features of Taub-NUT persist in the exact theory, together with the emergence of a region of space with Euclidean signature bounded by timelike curvature singularities. Although such remarks are premature, their persistence in the exact geometry is suggestive that string theory is able to make physical sense of the Milne/Misner singularities and the CTCs, despite their pathological character in general relativity. This may also support the possibility that CTCs may be viable in some physical situations, and may be a natural ingredient in pre-big bang cosmological scenarios.
Quantum field theory in spaces with closed time-like curves
Energy Technology Data Exchange (ETDEWEB)
Boulware, D.G.
1992-12-31
Gott spacetime has closed timelike curves, but no locally anomalous stress-energy. A complete orthonormal set of eigenfunctions of the wave operator is found in the special case of a spacetime in which the total deficit angle is 2π. A scalar quantum field theory is constructed using these eigenfunctions. The resultant interacting quantum field theory is not unitary because the field operators can create real, on-shell particles in the acausal region. These particles propagate for finite proper time, accumulating an arbitrary phase, before being annihilated at the same spacetime point as that at which they were created. As a result, the effective potential within the acausal region is complex, and probability is not conserved. The stress tensor of the scalar field is evaluated in the neighborhood of the Cauchy horizon; in the case of a sufficiently small Compton wavelength of the field, the stress tensor is regular and cannot prevent the formation of the Cauchy horizon.
Projection-based curve clustering
International Nuclear Information System (INIS)
Auder, Benjamin; Fischer, Aurelie
2012-01-01
This paper focuses on unsupervised curve classification in the context of nuclear industry. At the Commissariat a l'Energie Atomique (CEA), Cadarache (France), the thermal-hydraulic computer code CATHARE is used to study the reliability of reactor vessels. The code inputs are physical parameters and the outputs are time evolution curves of a few other physical quantities. As the CATHARE code is quite complex and CPU time-consuming, it has to be approximated by a regression model. This regression process involves a clustering step. In the present paper, the CATHARE output curves are clustered using a k-means scheme, with a projection onto a lower dimensional space. We study the properties of the empirically optimal cluster centres found by the clustering method based on projections, compared with the 'true' ones. The choice of the projection basis is discussed, and an algorithm is implemented to select the best projection basis among a library of orthonormal bases. The approach is illustrated on a simulated example and then applied to the industrial problem. (authors)
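The abstract's clustering step, projecting each discretized output curve onto a low-dimensional orthonormal basis and running k-means on the resulting coefficients, can be sketched as follows. The basis choice (principal components from an SVD) and all function names are illustrative assumptions, not the CEA implementation:

```python
import numpy as np

def cluster_curves(curves, n_components=5, k=3, n_iter=50):
    """k-means clustering of discretized curves after projection onto a
    low-dimensional orthonormal basis (here: principal components)."""
    X = np.asarray(curves, dtype=float)            # (n_curves, n_timesteps)
    X = X - X.mean(axis=0)                         # center the data
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    B = Vt[:n_components]                          # orthonormal projection basis
    coeffs = X @ B.T                               # low-dimensional coordinates
    # deterministic, spread-out initialization along the leading component
    order = np.argsort(coeffs[:, 0])
    centers = coeffs[order[np.linspace(0, len(coeffs) - 1, k).astype(int)]].copy()
    for _ in range(n_iter):                        # Lloyd iterations
        d = np.linalg.norm(coeffs[:, None, :] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = coeffs[labels == j].mean(axis=0)
    return labels, centers @ B                     # centres mapped back to curve space
```

Selecting the best basis from a library of orthonormal bases, as the paper does, would simply wrap this routine in a loop over candidate bases and score the resulting clusterings.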
Deformation Based Curved Shape Representation.
Demisse, Girum G; Aouada, Djamila; Ottersten, Bjorn
2017-06-02
In this paper, we introduce a deformation-based representation space for curved shapes in R^n. Given an ordered set of points sampled from a curved shape, the proposed method represents the set as an element of a finite-dimensional matrix Lie group. Variation due to scale and location is filtered in a preprocessing stage, while shapes that vary only in rotation are identified by an equivalence relationship. The use of a finite-dimensional matrix Lie group leads to a similarity metric with an explicit geodesic solution. Subsequently, we discuss some of the properties of the metric and its relationship with a deformation by least action. Furthermore, invariance to reparametrization, or estimation of point correspondence between shapes, is formulated as the estimation of a sampling function. Thereafter, two possible approaches are presented to solve the point correspondence estimation problem. Finally, we propose an adaptation of k-means clustering for shape analysis in the proposed representation space. Experimental results show that the proposed representation is robust to uninformative cues, e.g. local shape perturbation and displacement. In comparison to state-of-the-art methods, it achieves high precision on the Swedish and Flavia leaf datasets and comparable results on the MPEG-7, Kimia99 and Kimia216 datasets.
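A toy planar version of the idea can illustrate the flavor of such a representation: filter location and scale, then encode the ordered points by the 2x2 similarity matrices (rotation plus scale, a matrix Lie group) carrying each edge to the next. This is a simplified stand-in with hypothetical names, not the paper's actual construction:

```python
import numpy as np

def shape_to_deformations(points):
    """Represent an ordered planar point set by the 2x2 similarity matrices
    (rotation + uniform scale) mapping each edge vector to the next one."""
    P = np.asarray(points, dtype=float)
    P = (P - P.mean(axis=0)) / np.linalg.norm(P - P.mean(axis=0))  # filter location/scale
    edges = np.diff(P, axis=0)
    mats = []
    for (a, b), (c, d) in zip(edges[:-1], edges[1:]):
        # solve M @ (a, b) = (c, d) with M = [[p, -q], [q, p]] (scaled rotation)
        denom = a * a + b * b
        p = (a * c + b * d) / denom
        q = (a * d - b * c) / denom
        mats.append(np.array([[p, -q], [q, p]]))
    return P[0], edges[0], mats

def deformations_to_shape(start, first_edge, mats):
    """Invert the representation: rebuild the normalized point sequence."""
    pts = [start, start + first_edge]
    e = first_edge
    for M in mats:
        e = M @ e
        pts.append(pts[-1] + e)
    return np.array(pts)
```

Because the encoding is exactly invertible on the normalized shape, comparing two shapes reduces to comparing their matrix sequences, which is what gives the metric its explicit geodesics in the paper's setting.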
Space- and time-like superselection rules in conformal quantum field theory
International Nuclear Information System (INIS)
Schroer, Bert
2000-11-01
In conformally invariant quantum field theories one encounters besides the standard DHR superselection theory based on spacelike (Einstein-causal) commutation relations and their Haag duality another timelike (Huygens) based superselection structure. Whereas the DHR theory based on spacelike causality of observables confirmed the Lagrangian internal symmetry picture on the level of the physical principles of local quantum physics, the attempts to understand the timelike based superselection charges associated with the center of the conformal covering group in terms of timelike localized charges lead to a more dynamical role of charges outside the DR theorem and even outside the Coleman-Mandula setting. The ensuing plektonic timelike structure of conformal theories explains the spectrum of the anomalous scale dimensions in terms of admissible braid group representations, similar to the explanation of the possible anomalous spin spectrum expected from the extension of the DHR theory to stringlike d=1+2 plektonic fields. (author)
A Type D Non-Vacuum Spacetime with Causality Violating Curves, and Its Physical Interpretation
Ahmed, Faizuddin
2017-12-01
We present a topologically trivial, non-vacuum solution of the Einstein’s field equations in four dimensions, which is regular everywhere. The metric admits circular closed timelike curves, which appear beyond the null curve, and these timelike curves are linearly stable under linear perturbations. Additionally, the spacetime admits null geodesics curve, which are not closed, and the metric is of type D in the Petrov classification scheme. The stress-energy tensor anisotropic fluid satisfy the different energy conditions and a generalization of Equation-of-State parameter of perfect fluid p=ω ρ . The metric admits a twisting, shearfree, nonexapnding timelike geodesic congruence. Finally, the physical interpretation of this solution, based on the study of the equation of the geodesics deviation, will be presented.
Timelike Completeness as an Obstruction to C⁰-Extensions
Galloway, Gregory J.; Ling, Eric; Sbierski, Jan
2017-11-01
The study of low regularity (in-)extendibility of Lorentzian manifolds is motivated by the question whether a given solution to the Einstein equations can be extended (or is maximal) as a weak solution. In this paper we show that a timelike complete and globally hyperbolic Lorentzian manifold is C⁰-inextendible. For the proof we make use of the result, recently established by Sämann (Ann Henri Poincaré 17(6):1429-1455, 2016), that even for continuous Lorentzian manifolds that are globally hyperbolic, there exists a length-maximizing causal curve between any two causally related points.
On the (1 + 3) threading of spacetime with respect to an arbitrary timelike vector field
Energy Technology Data Exchange (ETDEWEB)
Bejancu, Aurel [Kuwait University, Department of Mathematics, P.O.Box 5969, Safat (Kuwait); Calin, Constantin [Technical University ' ' Gh.Asachi' ' , Department of Mathematics, Iasi (Romania)
2015-04-15
We develop a new approach on the (1 + 3) threading of spacetime (M, g) with respect to a congruence of curves defined by an arbitrary timelike vector field. The study is based on spatial tensor fields and on the Riemannian spatial connection ∇*, which behave as 3D geometric objects. We obtain new formulas for local components of the Ricci tensor field of (M, g) with respect to the threading frame field, in terms of the Ricci tensor field of ∇* and of kinematic quantities. Also, new expressions for time covariant derivatives of kinematic quantities are stated. In particular, a new form of Raychaudhuri's equation enables us to prove Lemma 6.3, which completes a well-known lemma used in the proof of the Penrose-Hawking singularity theorems. Finally, we apply the new (1 + 3) formalism to the study of the dynamics of a Kerr-Newman black hole. (orig.)
MICA: Multiple interval-based curve alignment
Mann, Martin; Kahle, Hans-Peter; Beck, Matthias; Bender, Bela Johannes; Spiecker, Heinrich; Backofen, Rolf
2018-01-01
MICA enables the automatic synchronization of discrete data curves. To this end, characteristic points of the curves' shapes are identified. These landmarks are used within a heuristic curve registration approach to align profile pairs by mapping similar characteristics onto each other. In combination with a progressive alignment scheme, this enables the computation of multiple curve alignments. Multiple curve alignments are needed to derive meaningful representative consensus data of measured time or data series. MICA was already successfully applied to generate representative profiles of tree growth data based on intra-annual wood density profiles or cell formation data. The MICA package provides a command-line and graphical user interface. The R interface enables the direct embedding of multiple curve alignment computation into larger analyses pipelines. Source code, binaries and documentation are freely available at https://github.com/BackofenLab/MICA
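MICA's actual landmark detection and progressive multiple alignment are more elaborate than what fits here; the following is a minimal single-pair sketch (pure NumPy, hypothetical function names) of the core idea the abstract describes: detect characteristic points as local extrema, pair them in order, and piecewise-linearly warp one profile onto the other.

```python
import numpy as np

def landmarks(y):
    """Indices of interior local extrema plus the two endpoints."""
    y = np.asarray(y, dtype=float)
    idx = [0]
    for i in range(1, len(y) - 1):
        if (y[i] - y[i - 1]) * (y[i + 1] - y[i]) < 0:  # slope changes sign
            idx.append(i)
    idx.append(len(y) - 1)
    return idx

def align(reference, profile):
    """Warp `profile` onto `reference` by pairing landmarks in order and
    resampling piecewise-linearly between the paired positions."""
    lr, lp = landmarks(reference), landmarks(profile)
    n = min(len(lr), len(lp))
    # keep endpoints and an even spread of interior landmarks on each side
    pick = lambda l: [l[int(round(j * (len(l) - 1) / (n - 1)))] for j in range(n)]
    xr, xp = pick(lr), pick(lp)
    # warping function: reference index -> profile index, linear between pairs
    warp = np.interp(np.arange(len(reference)), xr, xp)
    return np.interp(warp, np.arange(len(profile)), profile)
```

A multiple alignment would apply this pairwise step progressively, aligning each new profile against the growing consensus, which is the scheme the abstract refers to.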
Timelike Constant Mean Curvature Surfaces with Singularities
DEFF Research Database (Denmark)
Brander, David; Svensson, Martin
2014-01-01
We use integrable systems techniques to study the singularities of timelike non-minimal constant mean curvature (CMC) surfaces in the Lorentz–Minkowski 3-space. The singularities arise at the boundary of the Birkhoff big cell of the loop group involved. We examine the behavior of the surfaces...
Timelike single-logarithm-resummed splitting functions
International Nuclear Information System (INIS)
Albino, S.; Bolzoni, P.; Kniehl, B.A.; Kotikov, A.V.; Joint Inst. of Nuclear Research, Moscow
2011-08-01
We calculate the single-logarithmic contributions to the quark singlet and gluon matrix of timelike splitting functions at all orders in the modified minimal-subtraction (MS-bar) scheme. We fix two of the degrees of freedom of this matrix from the analogous results in the massive-gluon regularization scheme by using the relation between that scheme and the MS-bar scheme. We determine this scheme transformation from the double-logarithmic contributions to the timelike splitting functions and the coefficient functions of inclusive particle production in e⁺e⁻ annihilation, now available in both schemes. The remaining two degrees of freedom are fixed by reasonable physical assumptions. The results agree with the fixed-order results at next-to-next-to-leading order in the literature. (orig.)
Timelike Killing Fields and Relativistic Statistical Mechanics
Klein, David; Collas, Peter
2008-01-01
For spacetimes with timelike Killing fields, we introduce a "Fermi-Walker-Killing" coordinate system and use it to prove a Liouville theorem for an appropriate volume element of phase space for a statistical mechanical system of particles. We derive an exact relativistic formula for the Helmholtz free energy of an ideal gas and compare it, for a class of spacetimes, to its Newtonian analog, derived both independently and as the Newtonian limit of our formula. We also find the relativistic...
Echeverria, Fernando
I study three different topics in general relativity. The first study investigates the accuracy with which the mass and angular momentum of a black hole can be determined by measurements of gravitational waves from the hole, using a gravitational-wave detector. The black hole is assumed to have been strongly perturbed and the detector measures the waves produced by its resulting vibration and ring-down. The uncertainties in the measured parameters arise from the noise present in the detector. It is found that the faster the hole rotates, the more accurate the measurements will be, with the uncertainty in the angular momentum decreasing rapidly with increasing rotation speed. The second study is an analysis of the gravitational collapse of an infinitely long, cylindrical dust shell, an idealization of more realistic, finite-length bodies. It is found that the collapse evolves into a naked singularity in finite time. Analytical expressions for the variables describing the collapse are found at late times, near the singularity. The collapse is also followed, with a numerical simulation, from the start until very close to the singularity. The singularity is found to be strong, in the sense that an observer riding on the shell will be infinitely stretched in one direction and infinitely compressed in another. The gravitational waves emitted from the collapse are also analyzed. The last study focuses on the consequences of the existence of closed timelike curves in a wormhole spacetime. One might expect that such curves might cause a system with apparently well-posed initial conditions to have no self-consistent evolution. We study the case of a classical particle with a hard-sphere potential, focusing attention on initial conditions for which the evolution, if followed naively, is self-inconsistent: the ball travels to the past through the wormhole, colliding with its younger self and preventing itself from entering the wormhole. We find, surprisingly, that for all
On timelike supersymmetric solutions of gauged minimal 5-dimensional supergravity
Chimento, Samuele; Ortín, Tomás
2017-04-01
We analyze the timelike supersymmetric solutions of minimal gauged 5-dimensional supergravity for the case in which the Kähler base manifold admits a holomorphic isometry and depends on two real functions satisfying a simple second-order differential equation. Using this general form of the base space, the equations satisfied by the building blocks of the solutions become of, at most, fourth degree and can be solved by simple polynomial ansätze. In this way we construct two 3-parameter families of solutions that contain almost all the timelike supersymmetric solutions of this theory with one angular momentum known so far, and a few more: the (singular) supersymmetric Reissner-Nordström-AdS solutions, the three exact supersymmetric solutions describing the three near-horizon geometries found by Gutowski and Reall, three 1-parameter asymptotically-AdS5 black-hole solutions with those three near-horizon geometries (Gutowski and Reall's black hole being one of them), three generalizations of the Gödel universe, and a few potentially homogeneous solutions. A key role in finding these solutions is played by our ability to write AdS5's Kähler base space (\overline{CP}^2, i.e. SU(1,2)/U(2)) in three different, yet simple, forms associated to three different isometries. Furthermore, our ansatz for the Kähler metric also allows us to study the dimensional compactification of the theory and its solutions in a systematic way.
Repulsive and attractive timelike singularities in vacuum cosmologies
International Nuclear Information System (INIS)
Miller, B.D.
1979-01-01
Spherically symmetric cosmologies whose big bang is partially spacelike and partially timelike are constrained to occur only in the presence of certain types of matter, and in such cosmologies the timelike part of the big bang is a negative-mass singularity. In this paper examples are given of cylindrically symmetric cosmologies whose big bang is partially spacelike and partially timelike. These cosmologies are vacuum. In some of them, the timelike part of the big bang is clearly a (generalized) negative-mass singularity, while in others it is a (generalized) positive-mass singularity
Physics in 2116: Physicists Create Closed Time-like Curves
Schnittman, Jeremy D.
2016-01-01
This is an entry for Physics Today's recent essay contest, written as a "Search and Discovery" news story, imagining what major breakthroughs might be shaking the physics world one hundred years from now.
Higher-Order Corrections to Timelike Jets
Energy Technology Data Exchange (ETDEWEB)
Giele, W.T.; /Fermilab; Kosower, D.A.; /Saclay, SPhT; Skands, P.Z.; /CERN
2011-02-01
We present a simple formalism for the evolution of timelike jets in which tree-level matrix element corrections can be systematically incorporated, up to arbitrary parton multiplicities and over all of phase space, in a way that exponentiates the matching corrections. The scheme is cast as a shower Markov chain which generates one single unweighted event sample that can be passed to standard hadronization models. Remaining perturbative uncertainties are estimated by providing several alternative weight sets for the same events, at a relatively modest additional overhead. As an explicit example, we consider Z → qq̄ evolution with unpolarized, massless quarks and include several formally subleading improvements as well as matching to tree-level matrix elements through α_s^4. The resulting algorithm is implemented in the publicly available VINCIA plugin to the PYTHIA8 event generator.
Analysis of velocity planning interpolation algorithm based on NURBS curve
Zhang, Wanjun; Gao, Shanping; Cheng, Xiyan; Zhang, Feng
2017-04-01
Velocity planning can cause long interpolation times and large maximum interpolation errors in NURBS (Non-Uniform Rational B-Spline) interpolation. To reduce both, this paper proposes a velocity-planning interpolation algorithm based on the NURBS curve. First, a second-order Taylor expansion is applied to the curve parameter of the NURBS representation. The velocity plan is then combined with the NURBS curve interpolation. Finally, simulation results show that the proposed NURBS interpolator meets the high-speed and high-accuracy interpolation requirements of CNC systems.
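The second-order Taylor parameter update that underlies interpolators of this kind can be sketched as follows, here on a quadratic Bézier segment (a special case of a NURBS curve) rather than a full NURBS evaluator; the function names and the constant-feedrate plan are illustrative assumptions:

```python
import numpy as np

def bezier(ctrl, u):
    """Point on a quadratic Bezier curve (a simple special case of NURBS)."""
    p0, p1, p2 = ctrl
    return (1 - u) ** 2 * p0 + 2 * (1 - u) * u * p1 + u ** 2 * p2

def bezier_d1(ctrl, u):
    """First derivative C'(u)."""
    p0, p1, p2 = ctrl
    return 2 * (1 - u) * (p1 - p0) + 2 * u * (p2 - p1)

def bezier_d2(ctrl, _u):
    """Second derivative C''(u) (constant for a quadratic)."""
    p0, p1, p2 = ctrl
    return 2 * (p2 - 2 * p1 + p0)

def taylor2_interpolate(ctrl, feedrate, period, u0=0.0):
    """March the curve parameter with a second-order Taylor expansion so the
    tool advances ~feedrate*period of arc length per sampling cycle:
    du = vT/|C'| - (vT)^2 (C'.C'') / (2|C'|^4)."""
    us, u = [u0], u0
    while u < 1.0:
        c1, c2 = bezier_d1(ctrl, u), bezier_d2(ctrl, u)
        n1 = np.linalg.norm(c1)
        du = feedrate * period / n1
        ddu = -(feedrate * period) ** 2 * np.dot(c1, c2) / (2 * n1 ** 4)
        u = u + du + ddu
        us.append(min(u, 1.0))
    return us
```

The second-order term corrects for the variation of |C'(u)| along the step, which is what keeps the chord lengths (and hence the feedrate) nearly constant.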
Point- and curve-based geometric conflation
López-Vázquez, C.
2013-01-01
Geometric conflation is the process undertaken to modify the coordinates of features in dataset A in order to match corresponding ones in dataset B. The overwhelming majority of the literature considers the use of points as features to define the transformation. In this article we present a procedure to consider one-dimensional curves also, which are commonly available as Global Navigation Satellite System (GNSS) tracks, routes, coastlines, and so on, in order to define the estimate of the displacements to be applied to each object in A. The procedure involves three steps, including the partial matching of corresponding curves, the computation of some analytical expression, and the addition of a correction term in order to satisfy basic cartographic rules. A numerical example is presented. © 2013 Copyright Taylor and Francis Group, LLC.
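A minimal sketch of the displacement-estimation step, assuming an inverse-distance-weighted interpolation of the displacements observed at matched control features (one common choice; the article's actual analytical expression and its cartographic correction term are not reproduced here):

```python
import numpy as np

def conflate(points_a, matches_a, matches_b, power=2.0, eps=1e-9):
    """Shift each vertex of dataset A by an inverse-distance-weighted average
    of the displacements observed at matched control features (A -> B)."""
    A = np.asarray(points_a, dtype=float)
    Ma = np.asarray(matches_a, dtype=float)
    disp = np.asarray(matches_b, dtype=float) - Ma   # observed displacements
    out = []
    for p in A:
        d = np.linalg.norm(Ma - p, axis=1)
        if d.min() < eps:              # vertex coincides with a control feature
            out.append(p + disp[d.argmin()])
            continue
        w = 1.0 / d ** power           # inverse-distance weights
        out.append(p + (w[:, None] * disp).sum(axis=0) / w.sum())
    return np.array(out)
```

Using curves (GNSS tracks, coastlines) rather than points, as the article proposes, amounts to generating the control pairs `matches_a`/`matches_b` from partially matched corresponding curves instead of isolated points.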
Compact Hilbert Curve Index Algorithm Based on Gray Code
Directory of Open Access Journals (Sweden)
CAO Xuefeng
2016-12-01
Among space-filling curves, the Hilbert curve has the best clustering properties and has become an important tool in discrete global grid spatial index design. However, the standard Hilbert curve index contains considerable redundancy when the dimensions of the data set differ greatly. In this paper, the construction of the Hilbert curve is analyzed in terms of Gray codes, and a compact Hilbert curve index algorithm is put forward which avoids the redundancy while preserving the Hilbert curve's clustering. Finally, experimental results show that the compact Hilbert curve index outperforms the standard Hilbert index: their computational complexity is nearly equivalent, but on real data sets the coding time and storage space decrease by 40%, and sorting is faster by a factor of about 4.3.
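The Gray-code connection can be seen in the classic 2-D index computation, where the quadrant visiting order at every level, 00 → 01 → 11 → 10, is exactly the 2-bit Gray code. This is a minimal 2-D sketch, not the paper's compact multidimensional algorithm:

```python
def hilbert_index(n, x, y):
    """Distance of cell (x, y) along the Hilbert curve filling an
    n x n grid (n a power of two). The quadrant order at each level,
    00 -> 01 -> 11 -> 10, is the 2-bit Gray code."""
    d = 0
    s = n // 2
    while s > 0:
        rx = 1 if x & s else 0
        ry = 1 if y & s else 0
        d += s * s * ((3 * rx) ^ ry)   # Gray-coded quadrant number
        if ry == 0:                    # rotate/flip into the sub-quadrant
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        s //= 2
    return d
```

Consecutive indices always map to edge-adjacent cells, which is the clustering property spatial indexes exploit.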
On Finsler spacetimes with a timelike Killing vector field
Caponio, Erasmo; Stancarone, Giuseppe
2018-04-01
We study Finsler spacetimes and Killing vector fields, taking care of the fact that the generalised metric tensor associated with the Lorentz–Finsler function L is in general well defined only on a subset of the slit tangent bundle. We then introduce a new class of Finsler spacetimes endowed with a timelike Killing vector field, which we call stationary splitting Finsler spacetimes. We characterize when a Finsler spacetime with a timelike Killing vector field is locally a stationary splitting. Finally, we show that the causal structure of a stationary splitting is the same as that of one of two Finslerian static spacetimes naturally associated with the stationary splitting.
Gravitational stability and screening effect from D extra timelike dimensions
International Nuclear Information System (INIS)
Matsuda, Satoshi; Seki, Shigenori
2001-01-01
We study a (3+1)+D-dimensional spacetime, where the D extra dimensions are timelike. Compactification of the D timelike dimensions leads to tachyonic Kaluza-Klein gravitons. We calculate the gravitational self-energies of massive spherical bodies due to the tachyonic exchange, discuss their stability, and find that the gravitational force is screened for a certain number of extra dimensions. We also derive the exact relationship between the Newton constants in the full (4+D)-dimensional spacetime with the D extra times and the ordinary Newton constant of our 4-dimensional world.
Radiative corrections in nucleon time-like form factors measurements
Energy Technology Data Exchange (ETDEWEB)
Van de Wiele, Jacques [Universite de Paris-Sud, Institut de Physique Nucleaire, Orsay Cedex (France); Ong, Saro [Universite de Paris-Sud, Institut de Physique Nucleaire, Orsay Cedex (France); Universite de Picardie Jules Verne, Amiens (France)
2013-02-15
The completely general radiative corrections to lowest order, including the final- and initial-state radiations, are studied in proton-antiproton annihilation into an electron-positron pair. Numerical estimates have been made in a realistic configuration of the PANDA detector at FAIR for the proton time-like form factors measurements. (orig.)
Kaon transverse charge density from space- and timelike data
Mecholsky, N. A.; Meija-Ott, J.; Carmignotto, M.; Horn, T.; Miller, G. A.; Pegg, I. L.
2017-12-01
We used the world data on the kaon form factor to extract the transverse kaon charge density using a dispersion integral of the imaginary part of the kaon form factor in the timelike region. Our analysis includes recent data from e+e- annihilation measurements extending the kinematic reach of the data into the region of high momentum transfers conjugate to the region of short transverse distances. To calculate the transverse density we created a superset of both timelike and spacelike data and developed an empirical parameterization of the kaon form factor. The spacelike set includes two new data points we extracted from existing cross section data. We estimate the uncertainty on the resulting transverse density to be 5% at b = 0.025 fm and significantly better at large distances. New kaon data planned at the 12 GeV Jefferson Lab may have a significant impact on the charge density at distances of b < 0.1 fm.
Form factors and QCD in spacelike and timelike regions
International Nuclear Information System (INIS)
Bakulev, A. P.; Radyushkin, A. V.; Stefanis, N. G.
2000-01-01
We analyze the basic hard exclusive processes, the πγ*γ transition and the pion and nucleon electromagnetic form factors, and discuss the analytic continuation of QCD formulas from the spacelike region q² < 0 to the timelike region q² > 0 of the relevant momentum transfers. We describe the construction of the timelike version of the coupling constant α_s. We show that due to the analytic continuation of the collinear logarithms, each eigenfunction of the evolution equation acquires a phase factor, and we investigate the resulting interference effects, which are shown to be very small. We find no sources for K-factor-type enhancements in the perturbative QCD contribution to the hadronic form factors. To study the soft part of the pion electromagnetic form factor, we use a QCD sum rule inspired model and show that there are noncancelling Sudakov double logarithms which result in a K-factor-type enhancement in the timelike region.
Development of a Charpy master curve-based embrittlement trend curve
International Nuclear Information System (INIS)
Erikson, M.
2011-01-01
Under the current U.S. surveillance programs, the Charpy V-notch energy (CVE), yield strength, and tensile strength are measured (all as a function of test temperature) at various times during the operational life of the reactor vessel. Conventionally, the CVE vs. temperature data are fit using a hyperbolic tangent (tanh) function to determine the temperature at which the mean CVE is equal to 30 ft-lb (41 J). This index temperature, designated T30 or T41J, is used to track irradiation damage. Recently an alternative strategy for fitting the CVE vs. temperature data was proposed in which a single CVE vs. temperature relationship appears to represent well the behavior of a very wide variety of ferritic steels at temperatures at and below the fracture mode transition. It was demonstrated that when upper-shelf data are excluded from a fit of CVE vs. temperature, a single exponential function is found that represents the transition temperature behavior of ferritic steels well. The findings suggest that a reanalysis of already tested Charpy surveillance specimens can provide the basis for an embrittlement trend curve that is less influenced by the biases arising from the tanh curve-fitting method. Recently, a program was initiated with the goal of using the Charpy master curve (MC) transition data fit to define a reference temperature, to be used instead of the traditionally defined tanh-based T30/T41J reference temperature, in the development of an embrittlement trend curve. The existing US LWR database was mined for datasets with sufficient data points within the transition temperature region for use in defining a TCVE reference temperature. These values were then used to define ΔTCVE data with irradiation. These data, along with chemistry, temperature, flux, and fluence information, were used to develop the embrittlement trend curve presented herein. Predictions of embrittlement behavior made using this ETC were then compared to predictions made
Using Spreadsheets to Produce Acid-Base Titration Curves.
Cawley, Martin James; Parkinson, John
1995-01-01
Describes two spreadsheets for producing acid-base titration curves: one uses relatively simple cell formulae that can be written into the spreadsheet by inexperienced students, and the second uses more complex formulae that are best written by the teacher. (JRH)
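The cell formulae behind such spreadsheets amount to one charge-balance equation per titrant volume. A sketch for a weak acid titrated with a strong base follows; the acid constants and volumes in the usage are illustrative, not taken from the article:

```python
def titration_ph(Va, Ca, Ka, Vb, Cb, Kw=1e-14):
    """pH of a weak acid (conc. Ca mol/L, constant Ka, volume Va) after
    adding Vb of strong base (conc. Cb), from the charge balance
    [H+] + [Na+] = [OH-] + [A-], solved by bisection on pH."""
    na = Cb * Vb / (Va + Vb)      # diluted Na+ concentration
    ca = Ca * Va / (Va + Vb)      # diluted total acid concentration

    def f(ph):                    # charge balance residual, decreasing in pH
        h = 10.0 ** (-ph)
        return h + na - Kw / h - ca * Ka / (Ka + h)

    lo, hi = 0.0, 14.0
    for _ in range(100):          # bisection: f > 0 means pH is too low
        mid = (lo + hi) / 2
        if f(mid) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# 50 mL of 0.1 M acetic acid (Ka = 1.8e-5) titrated with 0.1 M NaOH
curve = [(vb, titration_ph(50.0, 0.1, 1.8e-5, vb, 0.1)) for vb in range(0, 76, 5)]
```

At half-equivalence the computed pH reproduces pH = pKa ≈ 4.74, the classic buffer check for such a sheet.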
Qualitative Comparison of Contraction-Based Curve Skeletonization Methods
Sobiecki, André; Yasan, Haluk C.; Jalba, Andrei C.; Telea, Alexandru C.
2013-01-01
In recent years, many new methods have been proposed for extracting curve skeletons of 3D shapes, using a mesh-contraction principle. However, it is still unclear how these methods perform with respect to each other, and with respect to earlier voxel-based skeletonization methods, from the viewpoint
Prediction of flow boiling curves based on artificial neural network
International Nuclear Information System (INIS)
Wu Junmei; Xi'an Jiaotong Univ., Xi'an; Su Guanghui
2007-01-01
The effects of the main system parameters on flow boiling curves were analyzed using an artificial neural network (ANN) based on a database selected from the 1960s. The input parameters of the ANN are system pressure, mass flow rate, inlet subcooling, wall superheat, and steady/transition boiling; the output parameter is heat flux. The results obtained by the ANN show that the heat flux increases with increasing inlet subcooling for all heat transfer modes. Mass flow rate has no significant effect on nucleate boiling curves. The transition boiling and film boiling heat fluxes increase with an increase of mass flow rate. Pressure plays a predominant role and improves heat transfer in all boiling regions except film boiling. There are slight differences between the steady and transient boiling curves in all boiling regions except the nucleate one. (authors)
Thermodynamic Activity-Based Progress Curve Analysis in Enzyme Kinetics.
Pleiss, Jürgen
2018-03-01
Macrokinetic Michaelis-Menten models based on thermodynamic activity provide insights into enzyme kinetics because they separate substrate-enzyme from substrate-solvent interactions. Kinetic parameters are estimated from experimental progress curves of enzyme-catalyzed reactions. Three pitfalls are discussed: deviations between thermodynamic and concentration-based models, product effects on the substrate activity coefficient, and product inhibition. Copyright © 2017 Elsevier Ltd. All rights reserved.
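In the ideal-solution (concentration-based) limit, a progress-curve analysis can be sketched end to end: integrate dS/dt = −V_max·S/(K_m + S) to generate a progress curve, then recover the parameters from the integrated Michaelis-Menten relation (S0 − S)/t = V_max − K_m·ln(S0/S)/t by linear regression. All parameter values below are made up for illustration:

```python
import math

def simulate_progress(S0, Vmax, Km, t_end, n=2000):
    """Integrate dS/dt = -Vmax*S/(Km+S) with 4th-order Runge-Kutta,
    returning a list of (t, S) points (the progress curve)."""
    dt = t_end / n
    f = lambda s: -Vmax * s / (Km + s)
    S, t, out = S0, 0.0, []
    for _ in range(n):
        k1 = f(S)
        k2 = f(S + dt / 2 * k1)
        k3 = f(S + dt / 2 * k2)
        k4 = f(S + dt * k3)
        S += dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += dt
        out.append((t, S))
    return out

def fit_mm(S0, data):
    """Estimate (Vmax, Km) from the integrated MM relation
    (S0-S)/t = Vmax - Km * ln(S0/S)/t via least-squares line fit."""
    xs = [math.log(S0 / S) / t for t, S in data]
    ys = [(S0 - S) / t for t, S in data]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return my - slope * mx, -slope   # intercept = Vmax, -slope = Km
```

Fitting a curve simulated with known parameters recovers them, which is a useful sanity check before analysing real (activity-corrected) data.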
Reliability Based Geometric Design of Horizontal Circular Curves
Rajbongshi, Pabitra; Kalita, Kuldeep
2018-03-01
Geometric design of a horizontal circular curve primarily involves the radius of the curve and the stopping sight distance at the curve section. The minimum radius is decided based on the lateral thrust exerted on vehicles, and the minimum stopping sight distance is provided to maintain safety in the longitudinal direction of travel. The available sight distance at a site can be regulated by changing the radius and the middle ordinate at the curve section. Both radius and sight distance depend on design speed. The speed of vehicles at any road section is a variable parameter, and therefore the 98th percentile speed is normally taken as the design speed. This work presents a probabilistic approach for evaluating stopping sight distance, considering the variability of all its input parameters. It is observed that the 98th percentile sight distance value is much lower than the sight distance corresponding to the 98th percentile speed. The distribution of the sight distance parameter is also studied and found to follow a lognormal distribution. Finally, reliability-based design charts are presented for both plain and hill regions, considering the effect of lateral thrust.
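The gap between "SSD evaluated at the 98th-percentile speed with conservative design inputs" and "the 98th percentile of the SSD distribution itself" can be reproduced with a small Monte Carlo sketch. The distributions and design values below are illustrative assumptions, not the paper's calibrated inputs:

```python
import math, random

def ssd(v, t_r, f, g=9.81):
    """Stopping sight distance (m): reaction distance v*t_r plus
    braking distance v^2 / (2 g f), with v in m/s."""
    return v * t_r + v * v / (2 * g * f)

def percentile(xs, p):
    xs = sorted(xs)
    return xs[max(0, min(len(xs) - 1, round(p / 100 * (len(xs) - 1))))]

random.seed(1)
samples = []
for _ in range(20000):
    v = random.gauss(22.2, 2.8)                      # speed, m/s (~80 km/h mean)
    t = random.lognormvariate(math.log(1.5), 0.25)   # reaction time, s (median 1.5)
    f = random.gauss(0.45, 0.05)                     # longitudinal friction
    samples.append(ssd(v, t, f))

p98_mc = percentile(samples, 98)          # 98th percentile of the SSD distribution
v98 = 22.2 + 2.054 * 2.8                  # 98th percentile speed
design = ssd(v98, 2.5, 0.35)              # conservative design t_r and f
```

Because the design calculation stacks conservative values for every input simultaneously, the Monte Carlo 98th-percentile SSD comes out well below the design SSD, matching the qualitative observation in the abstract.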
Optimal Reliability-Based Planning of Experiments for POD Curves
DEFF Research Database (Denmark)
Sørensen, John Dalsgaard; Faber, Michael Havbro; Kroon, I. B.
Optimal planning of crack detection tests is considered. The tests are used to update information on the reliability of inspection techniques, modelled by probability of detection (POD) curves. It is shown how cost-optimal and reliability-based test plans can be obtained using First Order Reliability Methods in combination with life-cycle cost-optimal inspection and maintenance planning. The methodology is based on preposterior analysis from Bayesian decision theory. An illustrative example is shown.
Nucleon electromagnetic structure studies in the spacelike and timelike regions
Energy Technology Data Exchange (ETDEWEB)
Guttmann, Julia
2013-07-23
The thesis investigates the nucleon structure probed by the electromagnetic interaction. Among the most basic observables reflecting the electromagnetic structure of the nucleon are the form factors, which have been studied by means of elastic electron-proton scattering with ever increasing precision for several decades. In the timelike region, corresponding to proton-antiproton annihilation into an electron-positron pair, the present experimental information is much less accurate. However, high-precision form factor measurements are planned in the near future. About 50 years after the first pioneering measurements of the electromagnetic form factors, polarization experiments stirred up the field, since their results were found to be in striking contradiction to the findings of previous form factor investigations from unpolarized measurements. Triggered by the conflicting results, a whole new field emerged studying the influence of two-photon exchange corrections to elastic electron-proton scattering, which appeared to be the most likely explanation of the discrepancy. The main part of this thesis deals with theoretical studies of two-photon exchange, investigated particularly with regard to form factor measurements in the spacelike as well as the timelike region. An extraction of the two-photon amplitudes in the spacelike region through a combined analysis of unpolarized cross section measurements and polarization experiments is presented. Furthermore, predictions of the two-photon exchange effects on the e⁺p/e⁻p cross section ratio are given for several new experiments which are currently ongoing. The two-photon exchange corrections are also investigated in the timelike region, in the process pp̄ → e⁺e⁻, by means of two factorization approaches. These corrections are found to be smaller than those obtained for the spacelike scattering process. The influence of the two-photon exchange corrections on
Wide steering angle microscanner based on curved surface
Sabry, Yasser; Khalil, Diaa; Saadany, Bassam; Bourouina, Tarik
2013-03-01
Intensive industrial and academic research is oriented towards the design and fabrication of optical beam steering systems based on MEMS technology. In most of these systems, the scanning is achieved by rotating a flat micromirror around a central axis, where the main challenge is achieving a wide mirror rotation angle. In this work, a novel method of optical beam scanning based on reflection from a curved surface is presented. The scanning occurs when the optical axis of the curved surface is displaced with respect to the optical axis of the incident beam. To overcome the possible deformation of the spot with the scanning angle, the curved surface is designed with a specific aspherical profile. Moreover, the scanning exhibits a more linear scanning angle-displacement relation than the conventional spherical profile. The presented scanner is fabricated using DRIE technology on an SOI wafer. The curved surface (reflector) is metalized and attached to a comb-drive actuator fabricated in the same lithography step. A single-mode fiber, behaving as a Gaussian beam source, is positioned on the substrate facing the mirror. The reflected optical beam angle and spot size in the far field are recorded versus the relative shift between the fiber and the curved mirror. The spot size is plotted versus the scanning angle, and a scanning spot size uniformity of about ±10% is obtained for optical deflection angles up to 100 degrees. As the optical beam propagates parallel to the wafer substrate, a completely integrated laser scanner can be achieved with filters and actuators self-aligned on the same chip, allowing low-cost mass production of this important product.
Investigation of the bases for use of the KIc curve
International Nuclear Information System (INIS)
McCabe, D.E.; Nanstad, R.K.; Rosenfield, A.R.; Marschall, C.W.; Irwin, G.R.
1991-01-01
Title 10 of the Code of Federal Regulations, Part 50 (10CFR50), Appendix G, establishes the bases for setting allowable pressure and temperature limits on reactors during heatup and cooldown operation. Both the K_Ic and K_Ia curves are utilized in prescribed ways to maintain reactor vessel structural integrity in the presence of an assumed or actual flaw and operating stresses. Currently, the code uses the K_Ia curve, normalized to the RT_NDT, to represent the fracture toughness trend for unirradiated and irradiated pressure vessel steels. Although this is clearly a conservative policy, it has been suggested that the K_Ic curve is the more appropriate for application to a non-accident operating condition. A number of uncertainties have been identified, however, that might convert normal operating transients into a dynamic loading situation. These include the introduction of running cracks from local brittle zones, crack pop-ins, reduced toughness from arrested cleavage cracks, description of the K_Ic curve for irradiated materials, and other related unresolved issues in elastic-plastic fracture mechanics. Some observations and conclusions can be made regarding various aspects of those uncertainties, and they are discussed in this paper. A discussion of further work required and under way to address the remaining uncertainties is also presented.
Statistical data processing of mobility curves of univalent weak bases
Czech Academy of Sciences Publication Activity Database
Šlampová, Andrea; Boček, Petr
2008-01-01
Roč. 29, č. 2 (2008), s. 538-541 ISSN 0173-0835 R&D Projects: GA AV ČR IAA400310609; GA ČR GA203/05/2106 Institutional research plan: CEZ:AV0Z40310501 Keywords : mobility curve * univalent weak bases * statistical evaluation Subject RIV: CB - Analytical Chemistry, Separation Impact factor: 3.509, year: 2008
Pion transverse charge density from timelike form factor data
Energy Technology Data Exchange (ETDEWEB)
Gerald Miller, Mark Strikman, Christian Weiss
2011-01-01
The transverse charge density in the pion can be represented as a dispersion integral of the imaginary part of the pion form factor in the timelike region. This formulation incorporates information from e+e- annihilation experiments and allows one to reconstruct the transverse density much more accurately than from the spacelike pion form factor data alone. We calculate the transverse density using an empirical parametrization of the timelike pion form factor and estimate that it is determined to an accuracy of ~10% at a distance b ~ 0.1 fm, and significantly better at larger distances. The density is found to be close to that obtained from a zero-width rho meson pole over a wide range and shows a pronounced rise at small distances. The resulting two-dimensional image of the fast-moving pion can be interpreted in terms of its partonic structure in QCD. We argue that the singular behavior of the charge density at the center requires a substantial presence of pointlike configurations in the pion's partonic wave function, which can be probed in other high-momentum transfer processes.
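For the zero-width ρ-pole benchmark mentioned in the abstract, the dispersion integral collapses to a single pole term, giving ρ(b) = (m_ρ²/2π)·K₀(m_ρ·b); we take this normalization as our own reading and check numerically only the model-independent sum rule ∫ρ(b) d²b = F(0) = 1:

```python
import math

M_RHO = 0.775 / 0.1973  # rho mass in fm^-1 (hbar*c = 0.1973 GeV fm)

def k0(x, n=800, umax=20.0):
    """Modified Bessel function K0 via the integral representation
    K0(x) = int_0^inf exp(-x cosh u) du, computed with Simpson's rule."""
    h = umax / n
    total = 0.0
    for i in range(n + 1):
        w = 1 if i in (0, n) else (4 if i % 2 else 2)
        total += w * math.exp(-x * math.cosh(i * h))
    return total * h / 3.0

def rho(b):
    """Transverse density for a zero-width rho pole (normalization assumed)."""
    return M_RHO ** 2 / (2 * math.pi) * k0(M_RHO * b)

# charge sum rule: integral of rho over the transverse plane should be F(0) = 1
nb, bmax = 400, 3.0
db = bmax / nb
charge = sum(2 * math.pi * (i * db) * rho(i * db) * db for i in range(1, nb + 1))
```

The density diverges logarithmically as b → 0 (the K₀ singularity), which is the "pronounced rise at small distances" the abstract attributes to pointlike configurations.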
THE CPA QUALIFICATION METHOD BASED ON THE GAUSSIAN CURVE FITTING
Directory of Open Access Journals (Sweden)
M.T. Adithia
2015-01-01
The Correlation Power Analysis (CPA) attack is an attack on cryptographic devices, especially smart cards. The results of the attack are correlation traces. Based on the correlation traces, an evaluation is done to observe whether significant peaks appear in the traces or not. This evaluation is done manually, by experts. If significant peaks appear, the smart card is not considered secure, since it is assumed that the secret key is revealed. We develop a method that objectively detects peaks and decides which peak is significant. We conclude that using the Gaussian curve fitting method, the subjective qualification of peak significance can be made objective, so that better decisions can be taken by security experts. We also conclude that the Gaussian curve fitting method is able to show the influence of peak size, especially width and height, on the significance of a particular peak.
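A minimal way to objectify peak significance, in the spirit of (though not identical to) the authors' Gaussian curve fitting, is to estimate the peak's height, centre, and width by weighted moments and compare the height against a robust noise estimate:

```python
import math

def gaussian_moments(trace):
    """Estimate (height, center, width) of a single peak in a 1-D trace
    by weighted moments after subtracting the baseline (trace minimum)."""
    base = min(trace)
    y = [v - base for v in trace]
    total = sum(y)
    mu = sum(i * v for i, v in enumerate(y)) / total
    var = sum((i - mu) ** 2 * v for i, v in enumerate(y)) / total
    return max(y), mu, math.sqrt(var)

def is_significant(trace, k=5.0):
    """Flag a peak whose fitted height exceeds k times a crude noise
    estimate (median absolute deviation of the trace)."""
    m = sorted(trace)[len(trace) // 2]
    mad = sorted(abs(v - m) for v in trace)[len(trace) // 2]
    height, _, _ = gaussian_moments(trace)
    return height > k * max(mad, 1e-12)
```

On a synthetic correlation trace containing one Gaussian peak, the moment estimates recover the injected centre and width, and the significance test fires only when the peak clearly exceeds the noise floor.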
Modelling of acid-base titration curves of mineral assemblages
Directory of Open Access Journals (Sweden)
Stamberg Karel
2016-01-01
The modelling of acid-base titration curves of mineral assemblages was studied with the aim of obtaining the basic parameters of their surface sites. The known modelling approaches, component additivity (CA) and generalized composite (GC), and three different types of assemblages (fucoidic sandstones, sedimentary rock-clay, and bentonite-magnetite samples) were used. In contrast to the GC approach, which was applied without difficulty, the problem with the CA approach lay in the credibility and accessibility of the parameters characterizing the individual mineralogical components.
Timelike geodesics around a charged spherically symmetric dilaton black hole
Directory of Open Access Journals (Sweden)
Blaga C.
2015-01-01
In this paper we study the timelike geodesics around a spherically symmetric charged dilaton black hole. The trajectories around the black hole are classified using the effective potential of a free test particle. This qualitative approach enables us to determine the type of orbit described by a test particle, without solving the equations of motion, provided the parameters of the black hole and the particle are known. The connections between these parameters and the type of orbit described by the particle are obtained. To visualize the orbits we solve the equation of motion numerically for different values of the parameters involved in our analysis. The effective potential of a free test particle looks different for non-extremal and extremal black holes; therefore we examine these two types of black holes separately.
Asymptotically AdS spacetimes with a timelike Kasner singularity
Energy Technology Data Exchange (ETDEWEB)
Ren, Jie [Racah Institute of Physics, The Hebrew University of Jerusalem, Jerusalem 91904 (Israel)
2016-07-21
Exact solutions to Einstein's equations for holographic models are presented and studied. The IR geometry has a timelike cousin of the Kasner singularity, which is the less generic case of the BKL (Belinski-Khalatnikov-Lifshitz) singularity, and the UV is asymptotically AdS. This solution describes a holographic RG flow between them. The solution interpolates between the planar AdS black hole and the AdS soliton. The causality constraint is always satisfied. The entanglement entropy and Wilson loops are discussed. The boundary conditions for the current-current correlation function and the Laplacian in the IR are examined. There is no infalling wave in the IR; instead, there is a normalizable solution. In a special case, a hyperscaling-violating geometry is obtained after a dimensional reduction.
International Nuclear Information System (INIS)
Oliver, D.R. Jr.; Davis, W.R.
1977-01-01
This paper treats matter field space-times admitting timelike conformal motions and timelike members of the family of contracted Ricci collineations (FCRC). The physical properties of these timelike symmetries in relation to the time development of relativistic matter field space-times are developed in terms of a number of specific theorems. Insofar as possible, the similarities and differences of the timelike conformal motions and the FCRC are discussed in some detail. Special applications are given that illustrate the possible value of the present considerations and related conservation expressions in relation to the Cauchy problem of matter field space-times admitting timelike symmetry properties. (author)
A Method of Timbre-Shape Synthesis Based On Summation of Spherical Curves
DEFF Research Database (Denmark)
Putnam, Lance Jonathan
2014-01-01
for simultaneous production of sonic tones and graphical curves based on additive synthesis of spherical curves. The spherical curves are generated from a sequence of elemental 3D rotations, similar to a Euler rotation. We show that this method can produce many important two- and three-dimensional curves directly...
Decline curve based models for predicting natural gas well performance
Directory of Open Access Journals (Sweden)
Arash Kamari
2017-06-01
The productivity of a gas well declines over its production life, eventually to the point where it no longer satisfies economic criteria. To address this, the production performance of gas wells should be predicted by applying reliable methods to analyse the decline trend. Therefore, reliable models are developed in this study on the basis of powerful artificial intelligence techniques, viz. the artificial neural network (ANN) modelling strategy, the least squares support vector machine (LSSVM) approach, the adaptive neuro-fuzzy inference system (ANFIS), and the decision tree (DT) method, for the prediction of cumulative gas production as well as initial decline rate multiplied by time, as functions of the Arps decline-curve exponent and the ratio of initial gas flow rate to total gas flow rate. The results obtained from the models developed in the current study are in satisfactory agreement with actual gas well production data. Furthermore, the comparative study performed demonstrates that the LSSVM strategy is superior to the other models investigated for the prediction of both cumulative gas production and initial decline rate multiplied by time.
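The quantities these models predict derive from the classical Arps decline relations, which are compact enough to sketch directly (the hyperbolic family; the parameter values in the usage are illustrative):

```python
import math

def arps_rate(qi, Di, b, t):
    """Arps decline: production rate at time t.
    b = 0 is exponential decline, 0 < b < 1 hyperbolic, b = 1 harmonic."""
    if b == 0:
        return qi * math.exp(-Di * t)
    return qi / (1.0 + b * Di * t) ** (1.0 / b)

def arps_cum(qi, Di, b, t):
    """Cumulative production up to time t (closed forms per b regime)."""
    if b == 0:
        return qi / Di * (1 - math.exp(-Di * t))
    if b == 1:
        return qi / Di * math.log(1 + Di * t)
    return qi / (Di * (1 - b)) * (1 - (1 + b * Di * t) ** (1 - 1 / b))

# illustrative well: qi = 1000 units/day, Di = 0.1 /day, b = 0.5
q_now = arps_rate(1000.0, 0.1, 0.5, 5.0)
Q_now = arps_cum(1000.0, 0.1, 0.5, 5.0)
```

The closed-form cumulative is just the time integral of the rate, which gives a direct internal consistency check.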
Defining the learning curve for team-based laparoscopic pancreaticoduodenectomy.
Speicher, Paul J; Nussbaum, Daniel P; White, Rebekah R; Zani, Sabino; Mosca, Paul J; Blazer, Dan G; Clary, Bryan M; Pappas, Theodore N; Tyler, Douglas S; Perez, Alexander
2014-11-01
The purpose of this study was to define the learning curves for laparoscopic pancreaticoduodenectomy (LPD) with and without laparoscopic reconstruction, using paired surgical teams consisting of advanced laparoscopic-trained surgeons and advanced oncologic-trained surgeons. All patients undergoing PD without vein resection at a single institution were retrospectively analyzed. LPD was introduced by initially focusing on laparoscopic resection followed by open reconstruction (hybrid) for 18 months prior to attempting a totally LPD (TLPD) approach. Cases were compared with chi-square tests, Fisher's exact test, and Kruskal-Wallis analysis of variance (ANOVA). Between March 2010 and June 2013, 140 PDs were completed at our institution, of which 56 (40%) were attempted laparoscopically. In 31/56 procedures we planned to perform only the resection laparoscopically (hybrid), of which 7 (23%) required premature conversion before completion of resection. Following the first 23 of these hybrid cases, a total of 25 TLPDs have been performed, with no conversions to open. For all LPD, a significant reduction in operative times was identified following the first 10 patients (median 478.5 vs. 430.5 min; p = 0.01), approaching open PD levels. After approximately 50 cases, operative times and estimated blood loss were consistently lower than those for open PD. In our experience of building an LPD program, the initial ten cases represented the biggest hurdle with respect to operative times. For an experienced teaching center using a staged and team-based approach, LPD appears to offer meaningful reductions in operative time and blood loss within the first 50 cases.
Energy Technology Data Exchange (ETDEWEB)
Malament, D.B.
1977-07-01
The title assertion is proven, and two corollaries are established. First, the topology of every past and future distinguishing spacetime is determined by its causal structure. Second, in every spacetime the path topology of Hawking, King, and McCarthy codes topological, differential, and conformal structure.
Magnetic fluid based squeeze film between porous annular curved ...
Indian Academy of Sciences (India)
lower plate, considering a magnetic fluid lubricant in the presence of an external magnetic field oblique to the plates. Expressions were obtained for ... Keywords: magnetic fluid; lubrication; annular curved plates. PACS No. 81.40.Pq. ... is backed by a solid wall. The film thickness h is taken as h = h₀ exp(−Br²); ...
Fractal based curves in musical creativity: A critical annotation
Georgaki, Anastasia; Tsolakis, Christos
In this article we examine fractal curves and synthesis algorithms in musical composition and research. First we trace the evolution of different approaches to the use of fractals in music since the 1980s through a literature review. Furthermore, we review representative fractal algorithms and the platforms that implement them. Properties such as self-similarity (pink noise), correlation, memory (related to the notion of Brownian motion), or non-correlation at multiple levels (white noise) can be used to develop a hierarchy of criteria for analyzing different layers of musical structure. L-systems can be applied to the modelling of melody in different musical cultures as well as to the investigation of musical perception principles. Finally, we propose a critical investigation approach for the use of artificial or natural fractal curves in systematic musicology.
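The L-system idea mentioned above reduces to string rewriting plus a pitch interpretation. The rules and the step mapping below are our own toy example, not drawn from the article:

```python
def lsystem(axiom, rules, depth):
    """Expand an L-system axiom by applying production rules depth times."""
    s = axiom
    for _ in range(depth):
        s = "".join(rules.get(c, c) for c in s)
    return s

def to_pitches(s, start=60):
    """Interpret the string as pitch motion (a turtle-graphics reading
    transplanted to pitch space): 'F' emits the current MIDI note,
    '+' moves up a whole step, '-' moves down a whole step."""
    pitch, out = start, []
    for c in s:
        if c == 'F':
            out.append(pitch)
        elif c == '+':
            pitch += 2
        elif c == '-':
            pitch -= 2
    return out

melody = to_pitches(lsystem("F", {"F": "F+F-F"}, 2))
# the self-similar rewriting yields a motif nested inside itself
```

Deeper expansions reuse the same three-note contour at every scale, which is exactly the self-similarity the fractal-music literature exploits.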
Reference results for time-like evolution up to $\\mathcal{O}(\\alpha_s^3)$
Bertone, Valerio; Nocera, Emanuele R.
2015-01-01
We present high-precision numerical results for time-like Dokshitzer-Gribov-Lipatov-Altarelli-Parisi evolution in the $\\overline{\\rm MS}$ factorisation scheme, for the first time up to next-to-next-to-leading order accuracy in quantum chromodynamics. First, we scrutinise the analytical expressions of the splitting functions available in the literature, in both x and N space, and check their mutual consistency. Second, we implement time-like evolution in two publicly available, entirely independent and conceptually different numerical codes, in x and N space respectively: the already existing APFEL code, which has been updated with time-like evolution, and the new MELA code, which has been specifically developed to perform the study in this work. Third, by means of a model for fragmentation functions, we provide results for the evolution in different factorisation schemes, for different ratios between renormalisation and factorisation scales and at different final scales. Our results are collected in the forma...
Local thermal equilibrium and KMS states in curved spacetime
International Nuclear Information System (INIS)
Solveen, Christoph
2012-01-01
Using the example of a free, massless, conformally coupled scalar field, it is argued that in quantum field theory in curved spacetimes with a time-like Killing field, the corresponding KMS states (generalized Gibbs ensembles) at parameter β > 0 need not possess a definite temperature in the sense of the zeroth law. In fact, these states, although passive in the sense of the second law, are not always in local thermal equilibrium (LTE). A criterion characterizing LTE states with sharp local temperature is discussed. Moreover, a proposal is made for fixing the renormalization freedom of composite fields which serve as ‘thermal observables’, and a new definition of the thermal energy of LTE states is introduced. Based on these results, a general relation between the local temperature and the parameter β is established for KMS states in (anti) de Sitter spacetime. (paper)
Single-Spin Polarization Effects and the Determination of Timelike Proton Form Factors
Energy Technology Data Exchange (ETDEWEB)
Brodsky, S
2003-10-24
We show that measurements of the proton's polarization in e{sup +}e{sup -} {yields} p{bar p} strongly discriminate between analytic forms of models which fit the proton form factors in the spacelike region. In particular, the single-spin asymmetry normal to the scattering plane measures the relative phase difference between the timelike G{sub E} and G{sub M} form factors. The expected proton polarization in the timelike region is large, of order of several tens of percent.
A volume-based method for denoising on curved surfaces
Biddle, Harry
2013-09-01
We demonstrate a method for removing noise from images or other data on curved surfaces. Our approach relies on in-surface diffusion: we formulate both the Gaussian diffusion and Perona-Malik edge-preserving diffusion equations in a surface-intrinsic way. Using the Closest Point Method, a recent technique for solving partial differential equations (PDEs) on general surfaces, we obtain a very simple algorithm where we merely alternate a time step of the usual Gaussian diffusion (and similarly Perona-Malik) in a small 3D volume containing the surface with an interpolation step. The method uses a closest point function to represent the underlying surface and can treat very general surfaces. Experimental results include image filtering on smooth surfaces, open surfaces, and general triangulated surfaces. © 2013 IEEE.
Timelike symmetry of the quantum transition and Einstein-Podolsky-Rosen paradox
International Nuclear Information System (INIS)
Costa de Beauregard, Olivier
1976-01-01
The non-locality in the paradox is very close to that of Feynman's electron-positron system: the sum of two timelike vectors with 4th components of opposite signs may be spacelike. The intrinsic time symmetry of the quantum transition consists in the presence of both the delayed and the advanced wave inside the "collapsed" wave.
Directory of Open Access Journals (Sweden)
Wenting Luo
2016-04-01
A pavement horizontal curve is designed to serve as a transition between straight segments, and its presence may cause a series of driving-related safety issues for motorists. Since traditional methods for curve geometry investigation are recognized as time-consuming, labor-intensive, and inaccurate, this study attempts to develop a method that can automatically conduct horizontal curve identification and measurement at the network level. The digital highway data vehicle (DHDV) was utilized for data collection, in which the three Euler angles, driving speed, and acceleration of the survey vehicle were measured with an inertial measurement unit (IMU). The 3D profiling data used for cross slope calibration were obtained with PaveVision3D Ultra technology at 1 mm resolution. In this study, curve identification was based on the variation of the heading angle, and the curve radius was calculated with a kinematic method, a geometry method, and a lateral acceleration method. To verify the accuracy of the three methods, an analysis of variance (ANOVA) test was applied using the curve radius measured by field test as the control variable. Based on the measured curve radius, a curve safety analysis model was used to predict crash rates and safe driving speeds at horizontal curves. Finally, a case study on a 4.35 km road segment demonstrated that the proposed method can efficiently conduct network-level analysis.
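As a rough illustration of the kinematic and lateral-acceleration methods mentioned in the abstract, the curve radius follows from R = v/ω (speed over yaw rate) and R = v²/a_lat respectively. This is a minimal sketch with illustrative function names, not the study's implementation:

```python
def radius_kinematic(speed_mps, yaw_rate_rad_s):
    """Kinematic estimate: R = v / omega (speed over yaw rate)."""
    return speed_mps / yaw_rate_rad_s

def radius_lateral_accel(speed_mps, lat_accel_mps2):
    """Lateral-acceleration estimate: R = v**2 / a_lat."""
    return speed_mps ** 2 / lat_accel_mps2

# A vehicle at 20 m/s turning at 0.05 rad/s follows a 400 m radius curve;
# the same radius follows from a lateral acceleration of 1 m/s^2.
print(radius_kinematic(20.0, 0.05))     # 400.0
print(radius_lateral_accel(20.0, 1.0))  # 400.0
```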
Curved Waveguide Based Nuclear Fission for Small, Lightweight Reactors
Coker, Robert; Putnam, Gabriel
2012-01-01
The focus of the presented work is the creation of a system of grazing-incidence supermirror waveguides for the capture and reuse of fission-sourced neutrons. Within research reactors, neutron guides are a well-known tool for directing neutrons from the confined and hazardous central core to a more accessible testing or measurement location. Typical neutron guides have rectangular, hollow cross sections, crafted as thin, mirrored waveguides plated with metal (commonly nickel). Under glancing angles with incoming neutrons, these waveguides can achieve nearly lossless transport of neutrons to distant instruments. Furthermore, recent developments have created supermirror surfaces which can accommodate neutron grazing angles up to four times as steep as nickel. A completed system will form an enclosing ring or spherical resonator coupled to a neutron source for the purpose of capturing and reusing free neutrons to sustain and/or accelerate fission. While grazing-incidence mirrors are a known method of directing and safely using neutrons, no method has been disclosed for the capture and reuse of neutrons or the sustainment of fission using a circular waveguide structure. The presented work is in the process of fabricating a functional, highly curved neutron supermirror using known methods of Ni-Ti layering, capable of achieving incident reflection angles up to four times steeper than nickel alone. Parallel work is analytically investigating future geometries, mirror compositions, and sources for enabling sustained fission, with applicability to the propulsion and energy goals of NASA and other agencies. Should research into this concept prove feasible, it would lead to the development of a high-energy-density, low-mass power source potentially capable of sustaining fission with a fraction of the standard critical mass for a given material, and a broadening of feasible materials due to reduced rates of release, absorption, and non-fission for neutrons. This
Meites, T; Meites, L
1970-06-01
This paper deals with isovalent ion-combination titrations based on reactions that can be represented by the equation M(n+) + X(n-) --> MX, where the activity of the product MX is invariant throughout a titration, and with the derivative titration curves obtained by plotting d[M(n+)]/df versus f for such titrations. It describes some of the ways in which such curves can be obtained; it compares and contrasts them both with potentiometric titration curves, which resemble them in shape, and with segmented titration curves, from which they are derived; and it discusses their properties in detail.
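The derivative titration curve described above plots d[M(n+)]/df against the titration fraction f. A minimal numerical sketch, using central differences on sampled data with illustrative names, could look like this:

```python
def derivative_curve(f, m):
    """Central-difference estimate of d[M]/df at the interior points
    of sampled titration data (f: titration fraction, m: [M(n+)])."""
    return [(m[i + 1] - m[i - 1]) / (f[i + 1] - f[i - 1])
            for i in range(1, len(f) - 1)]

# Synthetic example: m = f**2 gives derivative 2f at the interior points.
print(derivative_curve([0.0, 1.0, 2.0, 3.0], [0.0, 1.0, 4.0, 9.0]))  # [2.0, 4.0]
```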
Testing the equality of nonparametric regression curves based on ...
African Journals Online (AJOL)
Abstract. In this work we propose a new methodology for the comparison of two regression functions f1 and f2 in the case of homoscedastic error structure and a fixed design. Our approach is based on the empirical Fourier coefficients of the regression functions f1 and f2 respectively. As our main results we obtain the ...
Going Beyond, Going Further: The Preparation of Acid-Base Titration Curves.
McClendon, Michael
1984-01-01
Background information, list of materials needed, and procedures used are provided for a simple technique for generating mechanically plotted acid-base titration curves. The method is suitable for second-year high school chemistry students. (JN)
Directory of Open Access Journals (Sweden)
Janković Marko
2013-01-01
In this paper, we analyze the possibilities of diagnosing Parkinson's disease at an early stage based on characteristics of the input-output curve. The input-output (IO) curve was analyzed in two ways: we analyzed the gain of the curve for low-level transcranial stimulation, and we analyzed the overall 'quality' of the IO curve. The calculation of the curve's 'quality' is based on basic concepts from quantum mechanics and the calculation of Tsallis entropy.
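The Tsallis entropy mentioned above has the standard form S_q = (1 − Σ p_i^q)/(q − 1) for q ≠ 1; the abstract does not give the authors' feature construction, so the following is only a minimal sketch of the entropy itself:

```python
def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum(p_i**q)) / (q - 1), for q != 1.
    Recovers the Shannon entropy in the limit q -> 1."""
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

# For a uniform two-outcome distribution and q = 2: (1 - 0.5) / 1 = 0.5.
print(tsallis_entropy([0.5, 0.5], 2.0))  # 0.5
```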
Feature Extraction from 3D Point Cloud Data Based on Discrete Curves
Directory of Open Access Journals (Sweden)
Yi An
2013-01-01
Reliable feature extraction from 3D point cloud data is an important problem in many application domains, such as reverse engineering, object recognition, industrial inspection, and autonomous navigation. In this paper, a novel method is proposed for extracting geometric features from 3D point cloud data based on discrete curves. We extract discrete curves from the 3D point cloud data and study the behavior of chord lengths, angle variations, and principal curvatures at the geometric features of the discrete curves. Then, the corresponding similarity indicators are defined. Based on these similarity indicators, the geometric features can be extracted from the discrete curves; these are also the geometric features of the 3D point cloud data. The threshold values of the similarity indicators are taken from [0,1], which characterizes the relative relationship and makes threshold setting easier and more reasonable. The experimental results demonstrate that the proposed method is efficient and reliable.
Feasibility studies of time-like proton electromagnetic form factors at PANDA-FAIR
Energy Technology Data Exchange (ETDEWEB)
Dbeyssi, Alaa; Capozza, Luigi; Deiseroth, Malte; Froehlich, Bertold; Khaneft, Dmitry; Mora Espi, Maria Carmen; Noll, Oliver; Rodriguez Pineiro, David; Valente, Roserio; Zambrana, Manuel; Zimmermann, Iris [Helmholtz-Institut Mainz, Mainz (Germany); Maas, Frank [Helmholtz-Institut Mainz, Mainz (Germany); Institute of Nuclear Physics, Mainz (Germany); PRISMA Cluster of Excellence, Mainz (Germany); Marchand, Dominique; Tomasi-Gustafsson, Egle; Wang, Ying [Institut de Physique Nucleaire, Orsay (France); Collaboration: PANDA-Collaboration
2015-07-01
Electromagnetic form factors are fundamental quantities which describe the intrinsic electric and magnetic distributions of hadrons. Time-like proton form factors are experimentally accessible through the annihilation processes anti p+p <-> e{sup +}+e{sup -}. Their measurement in the time-like region has been limited by the low statistics achieved by previous experiments. This contribution reports on the results of Monte Carlo simulations for future measurements of electromagnetic proton form factors at PANDA (antiProton ANnihilation at DArmstadt). Within the framework of the PANDARoot software, the statistical precision at which the proton form factors will be determined is estimated. The signal (anti p+p → e{sup +}+e{sup -}) identification and the suppression of the main background process (anti p+p → π{sup +}+π{sup -}) are studied. Different methods have been used and/or developed to generate and analyse the processes of interest. The results show that time-like proton form factors will be measured at PANDA with unprecedented statistical accuracy.
Neutron time-like electromagnetic form factor measurement with direct scan method at BESIII
Energy Technology Data Exchange (ETDEWEB)
Larin, Paul; Ahmed, Samer Ali Nasher; Lin, Dexu; Rosner, Christoph; Wang, Yadi [Helmholtz-Institut Mainz (Germany); Institut fuer Kernphysik, Johannes Gutenberg-Universitaet Mainz (Germany); Dbeyssi, Alaa; Morales, Cristina [Helmholtz-Institut Mainz (Germany); Maas, Frank [Helmholtz-Institut Mainz (Germany); Institut fuer Kernphysik, Johannes Gutenberg-Universitaet Mainz (Germany); PRISMA Cluster of Excellence, Johannes Gutenberg-Universitaet Mainz (Germany); Collaboration: BESIII-Collaboration
2016-07-01
The internal structure and dynamics of the neutron can be understood through the study of its electromagnetic (EM) form factors (FF). In comparison to proton FF measurements, less data on the neutron is available in the space-like as well as in the time-like region. None of the previous experiments were able to measure the ratio of the electric and the magnetic FF in the time-like region so far. The BESIII (Beijing Spectrometer III) experiment at BEPCII (Beijing Electron Positron Collider II) collected in 2014/15 a large sample of e{sup +}e{sup -} scan data in the region between 2.0 and 3.08 GeV with a total luminosity of 523.5 pb{sup -1}. With this poster we show our efforts to measure the effective FF of the neutron in a large energy region and the possibility to measure for the first time the ratio of the neutron form factors in the time-like region.
Rodríguez-Álvarez, María Xosé; Roca-Pardiñas, Javier; Cadarso-Suárez, Carmen; Tahoces, Pablo G
2018-03-01
Prior to using a diagnostic test in a routine clinical setting, the rigorous evaluation of its diagnostic accuracy is essential. The receiver-operating characteristic curve is the measure of accuracy most widely used for continuous diagnostic tests. However, the possible impact of extra information about the patient (or even the environment) on diagnostic accuracy also needs to be assessed. In this paper, we focus on an estimator for the covariate-specific receiver-operating characteristic curve based on direct regression modelling and nonparametric smoothing techniques. This approach defines the class of generalised additive models for the receiver-operating characteristic curve. The main aim of the paper is to offer new inferential procedures for testing the effect of covariates on the conditional receiver-operating characteristic curve within the above-mentioned class. Specifically, two different bootstrap-based tests are suggested to check (a) the possible effect of continuous covariates on the receiver-operating characteristic curve and (b) the presence of factor-by-curve interaction terms. The validity of the proposed bootstrap-based procedures is supported by simulations. To facilitate the application of these new procedures in practice, an R-package, known as npROCRegression, is provided and briefly described. Finally, data derived from a computer-aided diagnostic system for the automatic detection of tumour masses in breast cancer is analysed.
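An empirical ROC curve of the kind this covariate-specific estimator generalises can be sketched as follows; this is only the plain empirical version with illustrative names, not the paper's smoothed, regression-based estimator:

```python
def empirical_roc(neg_scores, pos_scores):
    """Return (FPR, TPR) points of the empirical ROC curve for a
    continuous diagnostic test, sweeping the decision threshold over
    all observed scores ('positive' means score >= threshold)."""
    thresholds = sorted(set(neg_scores) | set(pos_scores), reverse=True)
    pts = [(0.0, 0.0)]
    for t in thresholds:
        fpr = sum(s >= t for s in neg_scores) / len(neg_scores)
        tpr = sum(s >= t for s in pos_scores) / len(pos_scores)
        pts.append((fpr, tpr))
    return pts

# Perfectly separated scores trace the ideal curve through (0, 1).
print(empirical_roc([0.1, 0.2], [0.8, 0.9]))
```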
Analytical expression for initial magnetization curve of Fe-based soft magnetic composite material
International Nuclear Information System (INIS)
Birčáková, Zuzana; Kollár, Peter; Füzer, Ján; Bureš, Radovan; Fáberová, Mária
2017-01-01
The analytical expression for the initial magnetization curve of an Fe-phenol-formaldehyde resin composite material was derived based on the previously proposed ideas of the magnetization vector deviation function and the domain wall annihilation function, which characterize the reversible magnetization processes through the extent of deviation of the magnetization vectors from the magnetic field direction, and the irreversible processes through the effective number of movable domain walls, respectively. As specific dependences of these functions were observed for composite materials, the ideas were extended to meet the composites' special features, principally the much higher inner demagnetizing fields produced by magnetic poles on the ferromagnetic particle surfaces. The proposed analytical expression enables us to find the relative extent of each type of magnetization process when magnetizing a specimen along the initial curve. - Highlights: • Analytical expression of the initial curve derived for SMC. • Initial curve described by elementary magnetization processes. • Influence of inner demagnetizing fields on the magnetization process in SMC.
Analytical expression for initial magnetization curve of Fe-based soft magnetic composite material
Energy Technology Data Exchange (ETDEWEB)
Birčáková, Zuzana, E-mail: zuzana.bircakova@upjs.sk [Institute of Physics, Faculty of Science, Pavol Jozef Šafárik University, Park Angelinum 9, 04154 Košice (Slovakia); Kollár, Peter; Füzer, Ján [Institute of Physics, Faculty of Science, Pavol Jozef Šafárik University, Park Angelinum 9, 04154 Košice (Slovakia); Bureš, Radovan; Fáberová, Mária [Institute of Materials Research, Slovak Academy of Sciences, Watsonova 47, 04001 Košice (Slovakia)
2017-02-01
International Nuclear Information System (INIS)
Ros, F C; Sidek, L M; Desa, M N; Arifin, K; Tosaka, H
2013-01-01
Stage-discharge curves serve many purposes, from water quality and flood modelling studies to projecting climate change scenarios. As the bed of the river often changes due to the annual monsoon seasons, sometimes through massive floods, the capacity of the river changes, causing shifting control to occur. This study proposes to use historical flood event data from 1960 to 2009 to calculate the stage-discharge curve at Guillemard Bridge on Sg. Kelantan. Regression analysis was done to check the quality of the data and examine the correlation between the two variables, Q and H. The mean values of the two variables were then adopted to find the values of 'a' (the difference between zero gauge height and the level of zero flow), K and 'n' for the rating curve equation, and finally to plot the stage-discharge rating curve. Regression analysis of the historical flood data indicates that 91 percent of the original uncertainty has been explained by the analysis, with a standard error of 0.085.
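The rating-curve equation Q = K(H − a)^n referred to above is commonly fitted by linear regression in log space once the datum offset 'a' is fixed. A minimal sketch under that assumption (illustrative names, not the study's procedure):

```python
import math

def fit_rating_curve(stage, discharge, a):
    """Fit Q = K * (H - a)**n by least squares on log Q vs log(H - a),
    with the gauge-datum offset 'a' supplied externally."""
    x = [math.log(h - a) for h in stage]
    y = [math.log(q) for q in discharge]
    m = len(x)
    xbar, ybar = sum(x) / m, sum(y) / m
    n = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
         / sum((xi - xbar) ** 2 for xi in x))
    k = math.exp(ybar - n * xbar)
    return k, n

# Synthetic data generated from K = 5, n = 1.5, a = 0.2 is recovered exactly.
stage = [1.0, 2.0, 3.0]
discharge = [5.0 * (h - 0.2) ** 1.5 for h in stage]
print(fit_rating_curve(stage, discharge, 0.2))  # approximately (5.0, 1.5)
```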
A standard curve based method for relative real time PCR data processing
Directory of Open Access Journals (Sweden)
Krause Andreas
2005-03-01
Background: Currently real-time PCR is the most precise method by which to measure gene expression. The method generates a large amount of raw numerical data, and processing may notably influence the final results. Data processing is based either on standard curves or on PCR efficiency assessment. At the moment, the PCR efficiency approach is preferred for relative PCR whilst the standard curve is often used for absolute PCR. However, there are no barriers to employing standard curves for relative PCR. This article provides an implementation of the standard curve method and discusses its advantages and limitations in relative real-time PCR. Results: We designed a procedure for data processing in relative real-time PCR. The procedure completely avoids PCR efficiency assessment, minimizes operator involvement and provides a statistical assessment of intra-assay variation. The procedure includes the following steps. (I) Noise is filtered from raw fluorescence readings by smoothing, baseline subtraction and amplitude normalization. (II) The optimal threshold is selected automatically from regression parameters of the standard curve. (III) Crossing points (CPs) are derived directly from the coordinates of the points where the threshold line crosses the fluorescence plots obtained after noise filtering. (IV) The means and their variances are calculated for CPs in PCR replicas. (V) The final results are derived from the CPs' means, and the CPs' variances are traced to the results by the law of error propagation. A detailed description and analysis of this data processing is provided. The limitations associated with the use of parametric statistical methods and amplitude normalization are specifically analyzed and found fit for routine laboratory practice. Different options are discussed for the aggregation of data obtained from multiple reference genes. Conclusion: A standard curve based procedure for PCR data processing has been compiled and validated. It illustrates that
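Steps (II)-(V) rest on a linear standard curve relating crossing point to log concentration. A minimal sketch of fitting and inverting such a curve (illustrative names and data, not the authors' code):

```python
def fit_standard_curve(log10_conc, cp):
    """Least-squares fit of CP = slope * log10(conc) + intercept
    from a dilution series of standards."""
    m = len(cp)
    xbar, ybar = sum(log10_conc) / m, sum(cp) / m
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(log10_conc, cp))
             / sum((x - xbar) ** 2 for x in log10_conc))
    return slope, ybar - slope * xbar

def conc_from_cp(cp, slope, intercept):
    """Invert the standard curve: unknown concentration from its CP."""
    return 10 ** ((cp - intercept) / slope)

# A 10-fold dilution series with slope -3.32 (i.e. ~100% PCR efficiency).
slope, intercept = fit_standard_curve([0, 1, 2, 3], [30.0, 26.68, 23.36, 20.04])
print(round(conc_from_cp(23.36, slope, intercept), 3))  # 100.0
```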
On-chip magnetic bead-based DNA melting curve analysis using a magnetoresistive sensor
DEFF Research Database (Denmark)
Rizzi, Giovanni; Østerberg, Frederik Westergaard; Henriksen, Anders Dahl
2014-01-01
We present real-time measurements of DNA melting curves in a chip-based system that detects the amount of surface-bound magnetic beads using magnetoresistive magnetic field sensors. The sensors detect the difference between the amount of beads bound to the top and bottom sensor branches of the di…
Wells, Gary L; Yang, Yueran; Smalarz, Laura
2015-04-01
We provide a novel Bayesian treatment of the eyewitness identification problem as it relates to various system variables, such as instruction effects, lineup presentation format, lineup-filler similarity, lineup administrator influence, and show-ups versus lineups. We describe why eyewitness identification is a natural Bayesian problem and how numerous important observations require careful consideration of base rates. Moreover, we argue that the base rate in eyewitness identification should be construed as a system variable (under the control of the justice system). We then use prior-by-posterior curves and information-gain curves to examine data obtained from a large number of published experiments. Next, we show how information-gain curves are moderated by system variables and by witness confidence, and we note how information-gain curves reveal that lineups are consistently more proficient at incriminating the guilty than they are at exonerating the innocent. We then introduce a new type of analysis that we developed called base rate effect-equivalency (BREE) curves. BREE curves display how much change in the base rate is required to match the impact of any given system variable. The results indicate that even relatively modest changes to the base rate can have more impact on the reliability of eyewitness identification evidence than do the traditional system variables that have received so much attention in the literature. We note how this Bayesian analysis of eyewitness identification has implications for the question of whether there ought to be a reasonable-suspicion criterion for placing a person into the jeopardy of an identification procedure. (c) 2015 APA, all rights reserved.
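The Bayesian core of the prior-by-posterior and information-gain curves discussed above can be sketched as follows; the hit and false-identification rates here are illustrative placeholders, not values from the article:

```python
def posterior_guilty(base_rate, hit_rate, false_id_rate):
    """P(suspect is the culprit | suspect identified), via Bayes' rule:
    prior * P(ID | culprit) over the total probability of an ID."""
    num = base_rate * hit_rate
    return num / (num + (1.0 - base_rate) * false_id_rate)

def information_gain(base_rate, hit_rate, false_id_rate):
    """Shift from prior to posterior produced by an identification."""
    return posterior_guilty(base_rate, hit_rate, false_id_rate) - base_rate

# With a 50% base rate, a 60% hit rate and a 20% false-ID rate,
# an identification moves the probability of guilt from 0.50 to 0.75.
print(posterior_guilty(0.5, 0.6, 0.2))   # 0.75
print(information_gain(0.5, 0.6, 0.2))   # 0.25
```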
Directory of Open Access Journals (Sweden)
Sylvie Troncale
MOTIVATION: Reverse phase protein array (RPPA) is a powerful dot-blot technology that allows studying protein expression levels as well as post-translational modifications in a large number of samples simultaneously. Yet, correct interpretation of RPPA data has remained a major challenge for its broad-scale application and its translation into clinical research. Satisfying quantification tools are available to assess a relative protein expression level from a serial dilution curve. However, appropriate tools allowing the normalization of the data for external sources of variation are currently missing. RESULTS: Here we propose a new method, called NormaCurve, that allows simultaneous quantification and normalization of RPPA data. For this, we modified the quantification method SuperCurve in order to include normalization for (i) background fluorescence, (ii) variation in the total amount of spotted protein and (iii) spatial bias on the arrays. Using a spike-in design with a purified protein, we test the capacity of different models to properly estimate normalized relative expression levels. The best performing model, NormaCurve, takes into account a negative control array without primary antibody, an array stained with a total protein stain and spatial covariates. We show that this normalization is reproducible and we discuss the number of serial dilutions and the number of replicates that are required to obtain robust data. We thus provide a ready-to-use method for reliable and reproducible normalization of RPPA data, which should facilitate the interpretation and the development of this promising technology. AVAILABILITY: The raw data, the scripts and the normacurve package are available at the following web site: http://microarrays.curie.fr.
Jumadi, Nur Anida; Beng, Gan Kok; Ali, Mohd Alauddin Mohd; Zahedi, Edmond; Morsin, Marlia
2017-09-01
The implementation of a surface-based Monte Carlo simulation technique for oxygen saturation (SaO2) calibration curve estimation is demonstrated in this paper. Generally, the calibration curve is estimated either from empirical studies using animals as the subject of experiment or is derived from mathematical equations. However, determining the calibration curve using animals is time consuming and requires expertise to conduct the experiment. Alternatively, optical simulation techniques have been used widely in the biomedical optics field due to their capability to reproduce real tissue behavior. The mathematical relationship between optical density (OD) and optical density ratios (ODR) associated with SaO2 during systole and diastole is used as the basis for obtaining the theoretical calibration curve. The optical properties corresponding to systolic and diastolic behavior were applied to the tissue model to mimic the optical properties of the tissues. Based on the absorbed ray flux at the detectors, the OD and ODR were successfully calculated. The simulation results for the optical density ratio at every 20% interval of SaO2 are presented, with a maximum error of 2.17% when compared with a previous numerical simulation technique (MC model). The findings reveal the potential of the proposed method to be used for extended calibration curve studies with other wavelength pairs.
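The OD/ODR relationship underlying such a calibration curve can be sketched as follows; function names are illustrative and the paper's Monte Carlo machinery is not reproduced:

```python
import math

def optical_density(i_incident, i_detected):
    """Optical density OD = log10(I0 / I) from incident and detected flux."""
    return math.log10(i_incident / i_detected)

def od_ratio(od_red, od_infrared):
    """ODR: ratio of red to infrared optical densities, the quantity
    that a calibration curve maps to SaO2 in pulse oximetry."""
    return od_red / od_infrared

# One decade of attenuation corresponds to OD = 1.
print(optical_density(100.0, 10.0))  # 1.0
print(od_ratio(0.5, 1.0))            # 0.5
```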
ONODA, Tomoaki; YAMAMOTO, Ryuta; SAWAMURA, Kyohei; MURASE, Harutaka; NAMBO, Yasuo; INOUE, Yoshinobu; MATSUI, Akira; MIYAKE, Takeshi; HIRAI, Nobuhiro
2014-01-01
We propose an approach to estimating individual growth curves based on the birthday information of Japanese Thoroughbred horses, with consideration of the seasonal compensatory growth that is a typical characteristic of seasonal breeding animals. The compensatory growth patterns appear only during the winter and spring seasons in the life of growing horses, and the meeting point between winter and spring depends on the birthday of each horse. We previously developed new growth curve equations for Japanese Thoroughbreds adjusting for compensatory growth. Based on these equations, a parameter denoting the birthday information was added for the modeling of the individual growth curves for each horse by shifting the meeting points in the compensatory growth periods. A total of 5,594 and 5,680 body weight and age measurements of Thoroughbred colts and fillies, respectively, and 3,770 withers height and age measurements of both sexes were used in the analyses. The results of predicted error difference and Akaike Information Criterion showed that the individual growth curves using birthday information fit the body weight and withers height data better than those not using it. The individual growth curve for each horse would be a useful tool for the feeding management of young Japanese Thoroughbreds in compensatory growth periods. PMID:25013356
Probabilistic View-based 3D Curve Skeleton Computation on the GPU
Kustra, Jacek; Jalba, Andrei; Telea, Alexandru
2013-01-01
Computing curve skeletons of 3D shapes is a challenging task. Recently, a high-potential technique for this task was proposed, based on integrating medial information obtained from several 2D projections of a 3D shape. However effective, this technique is strongly influenced in terms of complexity
Refined tropical curve counts and canonical bases for quantum cluster algebras
DEFF Research Database (Denmark)
Mandel, Travis
We express the (quantizations of the) Gross-Hacking-Keel-Kontsevich canonical bases for cluster algebras in terms of certain (Block-Göttsche) weighted counts of tropical curves. In the process, we obtain via scattering diagram techniques a new invariance result for these Block-Göttsche counts....
Prototype base curve attachment for the topographer: what will replace the vanishing radiuscope?
Elder, Keshia S; Benjamin, William J
2009-03-01
With the disappearance of the radiuscope, alternative methods must be developed to accurately measure gas-permeable (GP) lenses. The purpose of this study was to design, manufacture, and test a novel attachment for the evaluation of spherical base curves of GP rigid contact lenses using a corneal topographer. A topographer attachment was devised with the intent of measuring the surface radii of GP lenses. A front-surface mirror was fixed at a 45-degree angle above a hemispherical GP lens mount taken from a radiuscope. The device was set in the chin rest of a Humphrey Atlas topographer (Carl Zeiss Meditec, Inc., Jena, Germany) such that the image from the back of a GP surface was focused. The base curve radii of 9 rigid polymer buttons nominally ranging from 5.00 mm to 9.00 mm in 0.5-mm steps were measured 3 times with a Neitz Auto CG (Neitz Instruments Co., Ltd., Tokyo, Japan), Reichert Radiuscope (Reichert Ophthalmic Instruments, Depew, New York), and Humphrey Atlas topographer using the base curve (BC) attachment. An analysis of variance with replication and without interaction, using the main effects of measurement method (n = 5) and nominal button radius (n = 9), found that there was no statistically significant effect of the base curve measurement method (F[4, 122] = 1.23; P = 0.303), although there was a statistically significant effect of nominal button radius (F[8, 122] = 4.70; P = 0.003). Although the optical system of the corneal topographer was designed for measurement of convex surfaces, measurement of base curve radii from concave surfaces was performed without a correction factor using the prototype attachment. The mean SimK and Axial radii were within the tolerance for GP base curve radius, 0.05 mm, cited in ANSI Z80.20-2004 and ISO 18369-2:2006. Thus, the clinical feasibility of this prototype BC attachment to support measurement of spherical base curve radii for GP lenses by a corneal topographer was demonstrated.
Estimating Soil Water Retention Curve Using The Particle Size Distribution Based on Fractal Approach
Directory of Open Access Journals (Sweden)
M.M. Chari
2016-02-01
showed that the fractal dimensions of particle size distributions obtained with both methods were not significantly different from each other. DSWRC was also obtained using the suction-moisture data. The results indicate that all three fractal dimensions are related to soil texture and increase with the clay content of the soil. Linear regression relationships between Dm1 and Dm2 with DSWRC were created using 48 soil samples, yielding coefficients of 0.902 and 0.871. Then DSWRC was estimated based on relationships obtained from four methods: (1) Dm1 = DSWRC, (2) the regression equation obtained for Dm1, (3) Dm2 = DSWRC, and (4) the regression equation obtained for Dm2. The various models for determining soil moisture at a given suction were evaluated according to the statistical indicators normalized root mean square error, mean error, relative error, and geometric mean modeling efficiency. The results of all four fractal approaches are close to each other, and in most soils they are consistent with the measured data. The fractal models predict well in sandy loam soils; where the estimated fractal dimension is less than its actual value, the predicted moisture is less than the measured value on the moisture curve. Conclusions: In this study, the approach of Skaggs et al. (24), as amended by Fooladmand and Sepaskhah (8), was used to develop the grading curve from the percentages of sand, silt and clay, and the fractal dimension of the particle size distribution was obtained, using the fractal dimensions of the particle sizes for the radii of sand, silt and clay particles, respectively. In general, the fractal approach proved effective for simulating the retention curve, and it was found that using data such as sand, silt and clay contents, the retention curve can be estimated with reasonable accuracy.
Study on Vehicle Track Model in Road Curved Section Based on Vehicle Dynamic Characteristics
Directory of Open Access Journals (Sweden)
Ren Yuan-Yuan
2012-01-01
Full Text Available Extensive experiments and data analysis of vehicle track types in road curved sections show that the deviation and crossing characteristics of vehicle track paths are directly related to driving stability and security. In this connection, the concept of driving trajectory in a curved section was proposed, six track types were classified and defined, and their characteristic features were determined. Most importantly, considering curve geometry and vehicle dynamic characteristics, a model was established for each trajectory, and the optimum driving trajectory models were determined based on vehicle yaw rate, the most important factor affecting a vehicle's handling stability. MATLAB was then used to simulate and verify the correctness of the models. Finally, this paper concludes that the normal trajectory and the cutting trajectory are the optimum driving trajectories.
DEFF Research Database (Denmark)
Tatu, Aditya Jayant
tracking interfaces, active contour based segmentation methods and others. It can also be used to study shape spaces, as deforming a shape can be thought of as evolving its boundary curve. During curve evolution a curve traces out a path in the infinite dimensional space of curves. Due to application...... defined subspace, the N-links bicycle chain space, i.e. the space of curves with equidistant neighboring landmark points. This in itself is a useful shape space for medical image analysis applications. The Histogram of Gradient orientation based features are many in number and are widely used...
Investigation of the bases for use of the K sub Ic curve
Energy Technology Data Exchange (ETDEWEB)
McCabe, D.E.; Nanstad, R.K. (Oak Ridge National Lab., TN (USA)); Rosenfield, A.R.; Marschall, C.W. (Battelle, Columbus, OH (USA)); Irwin, G.R. (Maryland Univ., College Park, MD (USA))
1991-01-01
Title 10 of the Code of Federal Regulations, Part 50 (10CFR50), Appendix G, establishes the bases for setting allowable pressure and temperature limits on reactors during heatup and cooldown operation. Both the K{sub Ic} and K{sub Ia} curves are utilized in prescribed ways to maintain reactor vessel structural integrity in the presence of an assumed or actual flaw and operating stresses. Currently, the code uses the K{sub Ia} curve, normalized to the RT{sub NDT}, to represent the fracture toughness trend for unirradiated and irradiated pressure vessel steels. Although this is clearly a conservative policy, it has been suggested that the K{sub Ic} curve is the more appropriate for application to a non-accident operating condition. A number of uncertainties have been identified, however, that might convert normal operating transients into a dynamic loading situation. Those include the introduction of running cracks from local brittle zones, crack pop-ins, reduced toughness from arrested cleavage cracks, description of the K{sub Ic} curve for irradiated materials, and other related unresolved issues relative to elastic-plastic fracture mechanics. Some observations and conclusions can be made regarding various aspects of those uncertainties and they are discussed in this paper. A discussion of further work required and under way to address the remaining uncertainties is also presented.
Seismic Failure Probability of a Curved Bridge Based on Analytical and Neural Network Approaches
Directory of Open Access Journals (Sweden)
K. Karimi-Moridani
2017-01-01
Full Text Available This study focuses on seismic fragility assessment of a horizontally curved bridge, derived by neural network prediction. The objective is the optimization of structural responses via metaheuristic solutions. A regression model for the responses of the horizontally curved bridge with variable coefficients is built in the neural network simulation environment based on the existing NTHA data. In order to achieve accurate results in a neural network, 1677 seismic analyses were performed in OpenSees. To achieve better neural network performance and reduce the dimensionality of the input data, dimensionality reduction techniques such as factor analysis were applied. Different neural network training algorithms were tried and the best algorithm was adopted. The developed ANN approach is then used to verify the fragility curves from NTHA. The results indicate that the neural network approach can be used for predicting the seismic behavior of bridge elements and fragility, given sufficient feature extraction of ground motion records and structural response. Fragility curves extracted from the two approaches generally show good agreement.
Halftone information hiding technology based on phase feature of space filling curves
Hu, Jianhua; Cao, Peng; Dong, Zhihong; Cao, Xiaohe
2017-08-01
To solve the problem of interference fringes (namely moiré in printing) and improve image quality in the halftone screening process for information hiding, a halftone screening security technique based on the phase feature of space filling curves is studied in this paper. This method effectively solves the moiré problem and optimizes the quality of the screening, so that the images presented after screening achieve a good visual effect. Pseudo-random scrambling encryption of the plaintext information and the halftone screening technique based on the phase feature of the space filling curves are carried out during screening, which not only eliminates the moiré common in screening but also improves image quality and the security of the information.
An explanation for the shape of nanoindentation unloading curves based on finite element simulation
International Nuclear Information System (INIS)
Bolshakov, A.; Pharr, G.M.
1995-01-01
Current methods for measuring hardness and modulus from nanoindentation load-displacement data are based on Sneddon's equations for the indentation of an elastic half-space by an axially symmetric rigid punch. Recent experiments have shown that nanoindentation unloading data are distinctly curved in a manner which is not consistent with either the flat punch or the conical indenter geometries frequently used in modeling, but are more closely approximated by a parabola of revolution. Finite element simulations for conical indentation of an elastic-plastic material are presented which corroborate the experimental observations, and from which a simple explanation for the shape of the unloading curve is derived. The explanation is based on the concept of an effective indenter shape whose geometry is determined by the shape of the plastic hardness impression formed during indentation.
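The unloading geometries discussed above are commonly compared through the power-law relation P = A(h - hf)^m, where m = 1 corresponds to a flat punch, m = 2 to a cone, and m = 1.5 to a parabola of revolution. A minimal sketch of extracting the exponent from unloading data is shown below; numpy, the synthetic data, and the log-space fit are illustrative assumptions, not taken from the paper's finite element work.

```python
import numpy as np

def fit_unloading_exponent(h, p, hf):
    """Fit the power-law unloading relation P = A * (h - hf)^m in log space.

    Flat punch gives m = 1, cone m = 2; a parabola of revolution gives
    m = 1.5, the value experiments reportedly approach."""
    x = np.log(h - hf)   # log of elastic displacement above final depth hf
    y = np.log(p)        # log of load
    m, log_a = np.polyfit(x, y, 1)   # slope = exponent, intercept = log(A)
    return m, np.exp(log_a)

# Synthetic unloading data generated with m = 1.5 (parabola of revolution)
h = np.linspace(0.21, 1.0, 50)   # displacement (hypothetical units), hf = 0.2
p = 3.0 * (h - 0.2) ** 1.5       # load following the assumed power law
m_fit, a_fit = fit_unloading_exponent(h, p, hf=0.2)
```

On noise-free synthetic data the fit recovers the generating exponent exactly; with real data the fitted m characterizes the effective indenter shape.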
Directory of Open Access Journals (Sweden)
Shuanghong Li
2016-01-01
Full Text Available Laminar cooling is a large-scale, nonlinear process, so temperature control of such a system is a difficult and complex problem. In this paper, a novel modeling method and a GPC-PID based control strategy for the laminar cooling process are proposed to control the global temperature curve to produce high quality steel. First, based on an analysis of the laminar flow cooling process, a new TS fuzzy model which possesses intelligence and self-learning ability is established to improve the temperature prediction accuracy. Second, the target temperature curve is divided into several subgoals and each subgoal is described by a CARIMA-type model. Then, by the decentralized predictive control method, a GPC-PID based control strategy is introduced to guarantee that the laminar cooling process achieves each subtarget, so that the steel plate temperature drops along the optimal temperature curve. Moreover, by employing the dSPACE control board in the process control system, matrix processing ability is added to the production line without large-scale reconstruction. Finally, the effectiveness and performance of the proposed modeling and control strategy are demonstrated by industrial data and metallography detection in a steel company.
Locally-Based Kernel PLS Smoothing to Non-Parametric Regression Curve Fitting
Rosipal, Roman; Trejo, Leonard J.; Wheeler, Kevin; Korsmeyer, David (Technical Monitor)
2002-01-01
We present a novel smoothing approach to non-parametric regression curve fitting. This is based on kernel partial least squares (PLS) regression in reproducing kernel Hilbert space. It is our concern to apply the methodology for smoothing experimental data where some level of knowledge about the approximate shape, local inhomogeneities or points where the desired function changes its curvature is known a priori or can be derived based on the observed noisy data. We propose locally-based kernel PLS regression that extends the previous kernel PLS methodology by incorporating this knowledge. We compare our approach with existing smoothing splines, hybrid adaptive splines and wavelet shrinkage techniques on two generated data sets.
WANG, J.
2017-12-01
In stream water quality control, the total maximum daily load (TMDL) program is very effective. However, the load duration curves (LDC) of TMDL are difficult to establish in data-scarce watersheds, where no hydrological stations or long-term consecutive hydrological records are available. Moreover, although point and non-point sources of pollutants can be distinguished easily with the aid of LDC, where a pollutant comes from and where it will be transported within the watershed cannot be traced by LDC. To identify the best management practices (BMPs) for pollutants in a watershed, and to overcome the limitations of LDC, we propose developing LDC based on the distributed hydrological model SWAT for water quality management in data-scarce river basins. In this study, the distributed hydrological model SWAT was first established with the scarce hydrological data. Then, long-term daily flows were generated with the established SWAT model and rainfall data from the adjacent weather station. A flow duration curve (FDC) was then developed from the daily flows generated by the SWAT model. Considering the goals of water quality management, LDC curves for different pollutants can be obtained from the FDC. With the monitored water quality data and the LDC curves, the water quality problems caused by point or non-point source pollutants in different seasons can be ascertained. Finally, the SWAT model was employed again to trace the spatial distribution of the pollutants and to determine the agricultural practices and/or other human activities from which they originate. A case study was conducted in the Jian-jiang river, a tributary of the Yangtze river, in Duyun city, Guizhou province. Results indicate that this method can realize water quality management based on TMDL and identify suitable BMPs for reducing pollutants in a watershed.
Paris, Adrien; André Garambois, Pierre; Calmant, Stéphane; Paiva, Rodrigo; Walter, Collischonn; Santos da Silva, Joecila; Medeiros Moreira, Daniel; Bonnet, Marie-Paule; Seyler, Frédérique; Monnier, Jérôme
2016-04-01
Estimating river discharge for ungauged river reaches from satellite measurements is not straightforward given the nonlinearity of flow behavior with respect to measurable and non-measurable hydraulic parameters. As a matter of fact, current satellite datasets do not give access to key parameters such as river bed topography and roughness. A unique set of almost one thousand altimetry-based rating curves was built by fitting ENVISAT and Jason-2 water stages to discharges obtained from the MGB-IPH rainfall-runoff model in the Amazon basin. These rated discharges were successfully validated against simulated discharges (Ens = 0.70) and in-situ discharges (Ens = 0.71) and are not mission-dependent. The rating curve reads Q = a(Z-Z0)^b*sqrt(S), with Z the water surface elevation and S its slope gained from satellite altimetry, a and b the power law coefficient and exponent, and Z0 the river bed elevation such that Q(Z0) = 0. For several river reaches in the Amazon basin where ADCP measurements are available, the Z0 values are fairly well validated with a relative error lower than 10%. The present contribution aims at relating the identifiability and the physical meaning of a, b and Z0 under various hydraulic and geomorphologic conditions. Synthetic river bathymetries sampling a wide range of rivers and inflow discharges are used to perform twin experiments. A shallow water model is run to generate synthetic satellite observations, and then rating curve parameters are determined for each river section with an MCMC algorithm. The twin experiments show that a rating curve formulation including water surface slope, i.e. closer to the Manning equation form, improves parameter identifiability. The compensation between parameters is limited, especially for reaches with little water surface variability. Rating curve parameters are analyzed for riffles and pools, for small to large rivers, and for different river slopes and cross section shapes. It is shown that the river bed
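The rating-curve form quoted in the abstract, Q = a(Z-Z0)^b*sqrt(S), can be sketched directly. The parameter values below are made up for illustration; they are not fitted values from the study.

```python
import math

def rated_discharge(z, z0, a, b, slope):
    """Altimetry-based rating curve: Q = a * (Z - Z0)^b * sqrt(S).

    z     : water surface elevation from satellite altimetry (m)
    z0    : fitted river bed elevation, so that Q(z0) = 0 (m)
    a, b  : fitted power-law coefficient and exponent
    slope : water surface slope (dimensionless)
    """
    if z <= z0:
        return 0.0  # no flow at or below the fitted bed elevation
    return a * (z - z0) ** b * math.sqrt(slope)

# Example with hypothetical parameters for one reach
q = rated_discharge(z=12.4, z0=8.0, a=25.0, b=1.6, slope=4e-5)
```

In the study each reach's (a, b, Z0) triplet is estimated by MCMC against model discharges; the function above only evaluates the curve once those parameters are known.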
An Improved MPPT Algorithm for PV Generation Applications Based on P-U Curve Reconstitution
Directory of Open Access Journals (Sweden)
Yaoqiang Wang
2016-01-01
Full Text Available The output power of a PV array changes with the variation of environmental factors, such as temperature and solar irradiation. Therefore, a maximum power point (MPP) tracking (MPPT) algorithm is essential for the photovoltaic generation system. However, the P-U curve changes dynamically with the variation of the environmental factors; here, misjudgment may occur if a simple perturb-and-observe (P&O) MPPT algorithm is used. In order to solve this problem, this paper takes MPPT as the main research object, and an improved MPPT algorithm for PV generation applications based on P-U curve reconstitution is proposed. Firstly, the mathematical model of the PV array is presented, and then the output dynamic characteristics are analyzed. Based on this, a P-U curve reconstitution strategy is introduced, and the improved MPPT algorithm is proposed. At last, simulation and comparative analysis are conducted. Results show that, with the proposed algorithm, the MPP is tracked accurately, and the misjudgment problem is solved effectively.
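As a point of reference for the misjudgment problem discussed above, a minimal plain P&O step, without the paper's P-U curve reconstitution, might look like the sketch below. The fixed step size and the toy quadratic P-U curve are assumptions for illustration.

```python
def perturb_and_observe(v, p, v_prev, p_prev, step=0.5):
    """One step of a basic perturb-and-observe MPPT loop (illustrative only).

    Returns the next reference voltage for the PV array."""
    dp = p - p_prev
    dv = v - v_prev
    if dv == 0:
        return v + step
    # If power rose with the last perturbation, keep going; otherwise reverse.
    if dp / dv > 0:
        return v + step   # operating point is left of the MPP
    return v - step       # operating point is right of the MPP

# Toy static P-U curve with a maximum of 90 W at 30 V (hypothetical)
pv_power = lambda v: -0.1 * (v - 30.0) ** 2 + 90.0

v_prev, v = 20.0, 20.5
p_prev = pv_power(v_prev)
for _ in range(100):
    p = pv_power(v)
    v_next = perturb_and_observe(v, p, v_prev, p_prev)
    v_prev, p_prev, v = v, p, v_next
```

On a static curve the loop settles into a small oscillation around the MPP; the misjudgment the paper addresses arises when the curve itself shifts between two perturbations, so that dp reflects the environment change rather than the perturbation.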
Collaborative filtering algorithm based on Forgetting Curve and Long Tail theory
Qi, Shen; Li, Shiwei; Zhou, Hao
2017-03-01
The traditional collaborative filtering algorithm only pays attention to user ratings. In reality, however, user and item information changes continually over time. Therefore, recommendation systems need to take time-varying changes into consideration. A collaborative filtering algorithm based on the Forgetting Curve and Long Tail theory (FCLT) is introduced for the above problems. Two points are discussed: first, the user-item rating matrix can be updated in real time via the forgetting curve; secondly, according to Long Tail theory and item popularity, a refined similarity calculation method is obtained. The experimental results demonstrate that the proposed algorithm can effectively improve recommendation accuracy and alleviate the Long Tail effect.
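The forgetting-curve idea of down-weighting old ratings can be sketched with a simple exponential decay. The half-life value and the weighted-mean formulation below are illustrative assumptions, not the paper's exact update rule.

```python
import math

def forgetting_weight(t_days, half_life=30.0):
    """Ebbinghaus-style exponential decay weight for a rating made t_days ago.

    half_life is an assumed tuning parameter, not a value from the paper."""
    return math.exp(-math.log(2) * t_days / half_life)

def time_weighted_rating(ratings):
    """ratings: list of (score, age_in_days); returns the decay-weighted mean."""
    num = sum(r * forgetting_weight(t) for r, t in ratings)
    den = sum(forgetting_weight(t) for _, t in ratings)
    return num / den

# A fresh 5-star rating dominates a half-year-old 1-star rating
recent_bias = time_weighted_rating([(5.0, 1), (1.0, 180)])
```

The same weights can multiply entries of the user-item rating matrix before similarity computation, so recent preferences dominate the recommendation.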
Curve aligning approach for gait authentication based on a wearable accelerometer
International Nuclear Information System (INIS)
Sun, Hu; Yuao, Tao
2012-01-01
Gait authentication based on a wearable accelerometer is a novel biometric which can be used for identity identification, medical rehabilitation and early detection of neurological disorders. The method used for matching gait patterns bears heavily on authentication performance. In this paper, curve aligning is introduced as a new method for matching gait patterns and is compared with correlation and dynamic time warping (DTW). A support vector machine (SVM) is proposed to fuse pattern-matching methods at the decision level. Accelerations collected from the ankles of 22 walking subjects are processed for authentication in our experiments. The fusion of curve aligning with backward-forward accelerations and DTW with vertical accelerations improves authentication performance substantially and consistently. This fusion algorithm was tested repeatedly; the mean and standard deviation of its equal error rates are 0.794% and 0.696%, respectively, whereas the best of the presented non-fusion algorithms shows an EER of 3.03%.
Directory of Open Access Journals (Sweden)
Cuo Guan
2017-01-01
Full Text Available This paper provides a method for evaluating the development status of old oilfields. The method mainly uses the oilfield's abundant coring-well data: the displacement efficiencies of the sampled wells in the study area over a similar period are ordered from smallest to largest to obtain the cumulative distribution curve of displacement efficiency. Based on this curve, combined with the reservoir's ineffective-circulation limit, data such as the cumulative water absorption ratio of reservoirs are used to study the reservoir producing degree, calculate the degree of oil recovery, evaluate the proportion of remaining movable oil after water flooding, and calculate the reservoir's ineffective circulation thickness and ineffective circulation water volume.
2011-01-01
Background Simulation-based medical education has been widely used in medical skills training; however, the effectiveness and long-term outcome of simulation-based training in thoracentesis require further investigation. The purpose of this study was to assess the learning curve of simulation-based thoracentesis training, and to study skills retention and transfer of knowledge to a clinical setting following a simulation-based education intervention in thoracentesis procedures. Methods Fifty-two medical students were enrolled in this study. Each participant performed five supervised trials on the simulator. Participants' performance was assessed by performance score (PS), procedure time (PT), and participant's confidence (PC). Learning curves for each variable were generated. Long-term outcome of the training was measured by retesting and clinical performance evaluation 6 months and 1 year, respectively, after initial training on the simulator. Results Significant improvements in PS, PT, and PC were noted among the first 3 to 4 test trials (p < 0.05). Clinical competency in thoracentesis was improved in participants who received simulation training relative to that of first year medical residents without such experience (p < 0.05). In conclusion, simulation-based thoracentesis training can significantly improve an individual's performance. Saturation of learning from the simulator can be achieved after four practice sessions. Simulation-based training can assist in long-term retention of skills and can be partially transferred to clinical practice. PMID:21696584
Energy Technology Data Exchange (ETDEWEB)
Jenkin, Thomas J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Larson, Andrew [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Ruth, Mark F [National Renewable Energy Laboratory (NREL), Golden, CO (United States); King, Ben [U.S. Department of Energy; Spitsen, Paul [U.S. Department of Energy
2018-03-27
In light of the changing electricity resource mixes across the United States, an important question in electricity modeling is how additions and retirements of generation, including additions in variable renewable energy (VRE) generation could impact markets by changing hourly wholesale energy prices. Instead of using resource-intensive production cost models (PCMs) or building and using simple generator supply curves, this analysis uses a 'top-down' approach based on regression analysis of hourly historical energy and load data to estimate the impact of supply changes on wholesale electricity prices, provided the changes are not so substantial that they fundamentally alter the market and dispatch-order driven behavior of non-retiring units. The rolling supply curve (RSC) method used in this report estimates the shape of the supply curve that fits historical hourly price and load data for given time intervals, such as two-weeks, and then repeats this on a rolling basis through the year. These supply curves can then be modified on an hourly basis to reflect the impact of generation retirements or additions, including VRE and then reapplied to the same load data to estimate the change in hourly electricity price. The choice of duration over which these RSCs are estimated has a significant impact on goodness of fit. For example, in PJM in 2015, moving from fitting one curve per year to 26 rolling two-week supply curves improves the standard error of the regression from 16 dollars/MWh to 6 dollars/MWh and the R-squared of the estimate from 0.48 to 0.76. We illustrate the potential use and value of the RSC method by estimating wholesale price effects under various generator retirement and addition scenarios, and we discuss potential limits of the technique, some of which are inherent. The ability to do this type of analysis is important to a wide range of market participants and other stakeholders, and it may have a role in complementing use of or providing
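The rolling supply curve idea above, fitting one price-vs-load curve per short window and reusing it against the same load data, can be sketched as follows. numpy, the cubic polynomial form, the 336-hour window, and the synthetic data are all assumptions used to illustrate the rolling fit; they are not NREL's exact specification.

```python
import numpy as np

def rolling_supply_curves(load, price, hours_per_window=336, degree=3):
    """Fit one polynomial price-vs-load 'supply curve' per rolling window
    (336 h = two weeks). A sketch of the top-down regression idea."""
    curves = []
    for start in range(0, len(load) - hours_per_window + 1, hours_per_window):
        window = slice(start, start + hours_per_window)
        coeffs = np.polyfit(load[window], price[window], degree)
        curves.append(np.poly1d(coeffs))  # callable curve: price = f(load)
    return curves

# Synthetic year of hourly data: price rises convexly with load, plus noise
rng = np.random.default_rng(0)
load = rng.uniform(50e3, 120e3, 8736)                     # MW
price = 5 + 2e-4 * load + 3e-9 * (load - 5e4) ** 2 \
        + rng.normal(0, 2, 8736)                          # $/MWh
curves = rolling_supply_curves(load, price, 336)          # 26 two-week curves
```

To estimate the price effect of a retirement or VRE addition, each hour's load would be shifted along the fitted curve for its window and the curve re-evaluated.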
Determination of critical nitrogen dilution curve based on stem dry matter in rice.
Ata-Ul-Karim, Syed Tahir; Yao, Xia; Liu, Xiaojun; Cao, Weixing; Zhu, Yan
2014-01-01
Plant analysis is a very promising diagnostic tool for assessment of crop nitrogen (N) requirements in perspectives of cost effective and environment friendly agriculture. Diagnosing N nutritional status of rice crop through plant analysis will give insights into optimizing N requirements of future crops. The present study was aimed to develop a new methodology for determining the critical nitrogen (Nc) dilution curve based on stem dry matter (SDM) and to assess its suitability to estimate the level of N nutrition for rice (Oryza sativa L.) in east China. Three field experiments with varied N rates (0-360 kg N ha(-1)) using three Japonica rice hybrids, Lingxiangyou-18, Wuxiangjing-14 and Wuyunjing were conducted in Jiangsu province of east China. SDM and stem N concentration (SNC) were determined during vegetative stage for growth analysis. A Nc dilution curve based on SDM was described by the equation Nc = 2.17W^(-0.27), with W being SDM in t ha(-1), when SDM ranged from 0.88 to 7.94 t ha(-1). However, for SDM < 0.88 t ha(-1), the constant critical value Nc = 1.76% SDM was applied. The curve was dually validated for N-limiting and non-N-limiting growth conditions. The N nutrition index (NNI) and accumulated N deficit (Nand) of stem ranged from 0.57 to 1.06 and 51.1 to -7.07 kg N ha(-1), respectively, during key growth stages under varied N rates in 2010 and 2011. The values of ΔN derived from either NNI or Nand could be used as references for N dressing management during rice growth. Our results demonstrated that the present curve well differentiated the conditions of limiting and non-limiting N nutrition in rice crop. The SDM based Nc dilution curve can be adopted as an alternate and novel approach for evaluating plant N status to support N fertilization decision during the vegetative growth of Japonica rice in east China.
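The dilution curve and the N nutrition index reported in the abstract translate directly into code. The example inputs below are hypothetical; only the curve constants (2.17, -0.27, the 0.88 t/ha plateau, 1.76%) come from the paper.

```python
def critical_n(sdm):
    """Critical N concentration (% of stem dry matter) from the paper's curve:
    Nc = 2.17 * W^-0.27 for W between 0.88 and 7.94 t/ha; below 0.88 t/ha the
    constant plateau value 1.76% applies."""
    if sdm < 0.88:
        return 1.76
    return 2.17 * sdm ** -0.27

def n_nutrition_index(sdm, measured_n_pct):
    """NNI = measured stem N concentration / critical N concentration.
    NNI < 1 indicates N-limiting growth; NNI > 1 indicates luxury uptake."""
    return measured_n_pct / critical_n(sdm)

# Hypothetical sample: 3.0 t/ha stem dry matter with 1.5% stem N
nni = n_nutrition_index(sdm=3.0, measured_n_pct=1.5)
```

An NNI slightly below 1, as in this example, would suggest mildly N-limiting conditions and could guide a top-dressing decision.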
Yang, Qian; Lew, Hwee Yeong; Peh, Raymond Hock Huat; Metz, Michael Patrick; Loh, Tze Ping
2016-10-01
Reference intervals are the most commonly used decision support tool when interpreting quantitative laboratory results. They may require partitioning to better describe subpopulations that display significantly different reference values. Partitioning by age is particularly important for the paediatric population since there are marked physiological changes associated with growth and maturation. However, most partitioning methods are either technically complex or require prior knowledge of the underlying physiology/biological variation of the population. There is growing interest in the use of continuous centile curves, which provide seamless laboratory reference values as a child grows, as an alternative to rigidly described fixed reference intervals. However, the mathematical functions that describe these curves can be complex and may not be easily implemented in laboratory information systems. Hence, the use of fixed reference intervals is expected to continue for the foreseeable future. We developed a method that objectively proposes optimised age partitions and reference intervals for quantitative laboratory data (http://research.sph.nus.edu.sg/pp/ppResult.aspx), based on the sum of gradient that best describes the underlying distribution of the continuous centile curves. It is hoped that this method may improve the selection of age intervals for partitioning, which is receiving increasing attention in paediatric laboratory medicine. Copyright © 2016 Royal College of Pathologists of Australasia. Published by Elsevier B.V. All rights reserved.
Fuller, Jason C.; Chassin, David P.; Pratt, Robert G.; Hauer, Matthew; Tuffner, Francis K.
2017-03-07
Disclosed herein are representative embodiments of methods, apparatus, and systems for distributing a resource (such as electricity) using a resource allocation system. One of the disclosed embodiments is a method for operating a transactive thermostatic controller configured to submit bids to a market-based resource allocation system. According to the exemplary method, a first bid curve is determined, the first bid curve indicating a first set of bid prices for corresponding temperatures and being associated with a cooling mode of operation for a heating and cooling system. A second bid curve is also determined, the second bid curve indicating a second set of bid prices for corresponding temperatures and being associated with a heating mode of operation for a heating and cooling system. In this embodiment, the first bid curve, the second bid curve, or both the first bid curve and the second bid curve are modified to prevent overlap of any portion of the first bid curve and the second bid curve.
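The patent abstract's key operation, modifying the heating and cooling bid curves so no portion of them overlaps, can be sketched as below. Representing curves as (temperature, price) point lists and shifting the cooling curve by a deadband are assumptions chosen for illustration; the patent does not prescribe this particular rule.

```python
def prevent_overlap(heating_curve, cooling_curve, deadband=1.0):
    """heating_curve / cooling_curve: lists of (temperature, bid_price) points.

    If the heating curve's top temperature reaches into the cooling curve's
    range, shift the cooling curve upward in temperature so that a deadband
    separates the two modes and no portion of the curves overlaps."""
    heat_max = max(t for t, _ in heating_curve)
    cool_min = min(t for t, _ in cooling_curve)
    shift = max(0.0, heat_max + deadband - cool_min)
    adjusted_cooling = [(t + shift, p) for t, p in cooling_curve]
    return heating_curve, adjusted_cooling

# Hypothetical bid curves (deg C, $/kWh): pay more to heat when colder,
# pay more to cool when hotter; note the 20-21 deg C overlap.
heat = [(18.0, 0.30), (21.0, 0.10)]
cool = [(20.0, 0.05), (26.0, 0.35)]
heat2, cool2 = prevent_overlap(heat, cool)
```

After adjustment the cooling curve starts one degree above the heating curve's end, so the thermostat can never bid to heat and cool at the same temperature.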
Null to time-like infinity Green’s functions for asymptotic symmetries in Minkowski spacetime
International Nuclear Information System (INIS)
Campiglia, Miguel
2015-01-01
We elaborate on the Green’s functions that appeared in http://dx.doi.org/10.1007/JHEP07(2015)115 (http://arxiv.org/abs/1509.01406) when generalizing, from massless to massive particles, various equivalences between soft theorems and Ward identities of large gauge symmetries. We analyze these Green’s functions in considerable detail and show that they form a hierarchy of functions which describe ‘boundary to bulk’ propagators for large U(1) gauge parameters, supertranslations and sphere vector fields respectively. As a consistency check we verify that the Green’s functions associated to the large diffeomorphisms map the Poincare group at null infinity to the Poincare group at time-like infinity.
Analysis and research on curved surface's prototyping error based on FDM process
Gong, Y. D.; Zhang, Y. C.; Yang, T. B.; Wang, W. S.
2008-12-01
Methods for analyzing the prototyping error of curved surfaces in the FDM (Fused Deposition Modeling) process are introduced in this paper. The experimental results on curved-surface prototyping error are then analyzed, and the integrity of the point cloud information and the fitting method for curved-surface prototyping are discussed, as well as the influence of different software on the prototyping error. Finally, qualitative and quantitative conclusions on curved-surface prototyping error are drawn.
Enhancement of global flood damage assessments using building material based vulnerability curves
Englhardt, Johanna; de Ruiter, Marleen; de Moel, Hans; Aerts, Jeroen
2017-04-01
This study discusses the development of an enhanced approach for flood damage and risk assessments using vulnerability curves that are based on building material information. The approach draws upon common practices in earthquake vulnerability assessments, and is an alternative to the land-use or building-occupancy approaches used in flood risk assessment models. The approach is of particular importance for studies where there is a large variation in building material, such as large scale studies or studies in developing countries. A case study of Ethiopia is used to demonstrate the impact of the different methodological approaches on direct damage assessments due to flooding. Generally, flood damage assessments use damage curves for different land-use or occupancy types (i.e. urban or residential and commercial classes). However, these categories do not necessarily relate directly to vulnerability to damage by flood waters. For this, the construction type and building material may be more important, as is used in earthquake risk assessments. For this study, we use building material classification data of the PAGER project to define new building-material-based vulnerability classes for flood damage. This approach is compared to the widely applied land-use based vulnerability curves such as those used by De Moel et al. (2011). The case of Ethiopia demonstrates and compares the feasibility of this novel flood vulnerability method on a country level, which holds the potential to be scaled up to a global level. The study shows that flood vulnerability based on building material also allows for better differentiation between flood damage in urban and rural settings, opening doors to better links to poverty studies when such exposure data is available. Furthermore, this new approach paves the road to the enhancement of multi-risk assessments, as the method enables the comparison of vulnerability across different natural hazard types that also use material-based vulnerability curves.
Directory of Open Access Journals (Sweden)
Lázcoz Paula
2008-02-01
Background: We present two melting curve analysis (MCA)-based semiquantitative real-time PCR techniques to detect the promoter methylation status of genes. The first, MCA-MSP, follows the same principle as standard MSP but is performed in a real-time thermal cycler, with results visualized in a melting curve. The second, MCA-Meth, uses a single pair of primers designed with no CpGs in its sequence. These primers amplify both unmethylated and methylated sequences. In clinical applications the MSP technique has revolutionized methylation detection by simplifying the analysis to a PCR-based protocol. MCA-based techniques may further improve and simplify methylation analyses by reducing starting DNA amounts, by introducing an all-in-one-tube reaction and by eliminating a final gel stage for visualization of the result. The current study aimed at investigating the feasibility of both MCA-MSP and MCA-Meth in the analysis of promoter methylation, and at defining potential advantages and shortcomings in comparison to currently implemented techniques, i.e. bisulfite sequencing and standard MSP. Methods: The promoters of the RASSF1A (3p21.3), BLU (3p21.3) and MGMT (10q26) genes were analyzed by MCA-MSP and MCA-Meth in 13 astrocytoma samples, 6 high-grade glioma cell lines and 4 neuroblastoma cell lines. The data were compared with standard MSP and validated by bisulfite sequencing. Results: Both MCA-MSP and MCA-Meth successfully determined promoter methylation. MCA-MSP provided information similar to standard MSP analyses, but the analysis was possible in a single tube and avoided the gel stage. MCA-Meth proved useful in samples with intermediate methylation status, reflected by a melting curve position shift depending on methylation extent. Conclusion: We propose MCA-MSP and MCA-Meth as alternative or supplementary techniques to MSP or bisulfite sequencing.
Agreement Dynamics of Memory-Based Naming Game with Forgetting Curve of Ebbinghaus
Shi, Xiao-Ming; Zhang, Jie-Fang
2009-04-01
We propose a memory-based naming game (MBNG) model with two-state variables on fully connected networks, similar to some previous opinion-propagation models. We find that this model is strongly affected by the memory decision parameter, and its dynamical behaviour can be partly analysed by numerical simulation and analytical argument. We also report a modified MBNG model that incorporates the Ebbinghaus forgetting curve into the memory. With one parameter of the MBNG model removed, the modified model converges to a success rate S(t) = 1, and the average sum E(t) is determined by the network size N.
Emulating the Ebbinghaus forgetting curve of the human brain with a NiO-based memristor
Hu, S. G.; Liu, Y.; Chen, T. P.; Liu, Z.; Yu, Q.; Deng, L. J.; Yin, Y.; Hosaka, Sumio
2013-09-01
The well-known Ebbinghaus forgetting curve, which describes how information is forgotten over time, can be emulated using a NiO-based memristor with conductance that decreases with time after the application of electrical pulses. Here, the conductance is analogous to the memory state, while each electrical pulse represents a memory stimulation or learning event. The decrease in the conductance with time depends on the stimulation parameters, including pulse height and width and the number of pulses, which emulates memory loss behavior well in that the time taken for the memory to be lost depends on how the information is learned.
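The decaying-conductance analogy described above can be sketched numerically. A minimal model, assuming the classic exponential form R(t) = exp(−t/S) of the Ebbinghaus curve and a hypothetical rule in which each repeated stimulation (analogous to each electrical pulse) multiplies the memory strength S:

```python
import math

def retention(t_hours: float, strength: float) -> float:
    """Ebbinghaus-style forgetting curve: fraction of memory retained
    after t_hours, for a given memory 'strength' (larger = slower decay)."""
    return math.exp(-t_hours / strength)

def strengthen(base_strength: float, repetitions: int, gain: float = 1.5) -> float:
    """Toy learning rule: each extra stimulation (cf. each electrical pulse
    applied to the memristor) multiplies the decay constant by `gain`."""
    return base_strength * gain ** (repetitions - 1)

# One learning event vs. four repeated events, checked 24 h later.
weak = retention(24, strengthen(10, 1))    # single stimulation
strong = retention(24, strengthen(10, 4))  # repeated stimulation
```

The numbers are illustrative; the actual dependence on pulse height, width and count would have to be fitted to the measured conductance decay.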
Optical fiber sensors for process refractometry and temperature measuring based on curved fibers
International Nuclear Information System (INIS)
Willsch, R.; Schwotzer, G.; Haubenreisser, W.; Jahn, J.U.
1986-01-01
Based on U-shaped curved multimode fibers with defined bending radii, intensity-modulated optical sensors have been developed for determining refractive index changes in liquids and related measurands (solution concentration, mixing ratio and others) in process refractometry, and for temperature measurement under special environmental conditions. The optoelectronic transmitting and receiving units are built in a modular fashion and can be used in multi-purpose applications. The principles, performance and characteristic properties of these sensors are described, and their possible applications in process measurement and automation are discussed using selected examples. (orig.)
Bektaş, Burcu; Dursun, Uğur
2015-01-01
In this work, we focus on a class of timelike rotational surfaces in Minkowski space E-1(4) with a 2-dimensional axis. There are three types of rotational surfaces with 2-dimensional axis, called rotational surfaces of elliptic, hyperbolic or parabolic type. We obtain all flat timelike rotational surfaces of elliptic and hyperbolic types with pointwise 1-type Gauss map of the first and second kind. We also prove that there exists no flat timelike rotational surface of parabolic type in E-1(4) wi...
Li, Y J; Wang, Y G; An, B; Xu, H; Liu, Y; Zhang, L C; Ma, H Y; Wang, W M
2016-01-01
A practical anodic and cathodic curve intersection model, consisting of an apparent anodic curve and an imaginary cathodic line, was proposed to explain the multiple corrosion potentials occurring in potentiodynamic polarization curves of Fe-based glassy alloys in alkaline solution. The apparent anodic curve was selected from the measured anodic curves. The imaginary cathodic line was obtained by linearly fitting the differences of the anodic curves, and can be translated or rotated to predict the number and values of the corrosion potentials.
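The intersection of an anodic curve with a cathodic line can be located numerically. A sketch with hypothetical stand-in curves (not the paper's measured data), using simple bisection on the current difference:

```python
import math

def find_corrosion_potential(i_anodic, i_cathodic, e_lo, e_hi, tol=1e-9):
    """Bisection on f(E) = i_anodic(E) - i_cathodic(E); returns the potential
    where the two current curves intersect (a corrosion potential)."""
    f = lambda e: i_anodic(e) - i_cathodic(e)
    assert f(e_lo) * f(e_hi) < 0, "bracket must straddle an intersection"
    while e_hi - e_lo > tol:
        mid = 0.5 * (e_lo + e_hi)
        if f(e_lo) * f(mid) <= 0:
            e_hi = mid
        else:
            e_lo = mid
    return 0.5 * (e_lo + e_hi)

# Hypothetical Tafel-like anodic branch and a linear 'imaginary cathodic line'.
anodic = lambda e: math.exp(5.0 * (e + 0.6)) * 1e-6
cathodic = lambda e: 1e-4 - 2e-4 * e
e_corr = find_corrosion_potential(anodic, cathodic, -1.0, 0.5)
```

Translating or rotating the cathodic line (changing its intercept or slope) and re-running the search over several brackets would reproduce the model's prediction of multiple corrosion potentials.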
Curved-array-based multispectral photoacoustic imaging of human finger joints.
Huang, Na; He, Ming; Shi, Haosheng; Zhao, Yuan; Lu, Man; Zou, Xianbing; Yao, Lei; Jiang, Huabei; Xi, Lei
2017-10-02
In this study, we present the design, fabrication, and evaluation of a curved-array-based photoacoustic imaging system for imaging the vasculature inside human finger joints with a multispectral strategy. The transducer array was fabricated from polyvinylidene fluoride (PVDF) film, with a size of 30 mm × 2.8 mm and a curvature radius of 82 mm. A detailed comparison between the PVDF transducer and commercial piezoelectric ceramic (PZT) transducers was performed. In addition, phantom and in vivo mouse experiments were carried out to evaluate the system performance. Furthermore, we recruited healthy volunteers and performed multispectral photoacoustic imaging of blood vessels in finger joints. The transducers have an average center frequency of 6.6 MHz and a mean bandwidth of 95%. The lateral and axial resolutions of the system are 110 μm and 800 μm, respectively, and the diameter of the active imaging region is larger than 50 mm. We successfully captured drug-induced cerebral bleeding spots in intact mouse brains, and recovered both the morphology and the oxygen saturation of the blood vessels in human finger joints. The PVDF transducer has better bandwidth performance than the commercial transducers, and the curved design offers better sensitivity and higher axial resolution than a flat design. Based on the phantom, animal and human experiments, the proposed system has the potential to be used in the clinical diagnosis of early-stage arthritis.
Xu, Lili; Luo, Shuqian
2010-01-01
Microaneurysms (MAs) are the first manifestations of diabetic retinopathy (DR) as well as an indicator of its progression. Their automatic detection plays a key role in both mass screening and monitoring, and is therefore at the core of any system for computer-assisted diagnosis of DR. The algorithm comprises the following stages: candidate detection, which extracts patterns possibly corresponding to MAs using a mathematical-morphology black top-hat transform; feature extraction, which characterizes these candidates; and classification based on a support vector machine (SVM), which validates the MAs. The choice of feature vector and SVM kernel function is critical to the algorithm, so we use the receiver operating characteristic (ROC) curve to evaluate the discriminating performance of different feature vectors and different SVM kernel functions. The ROC analysis indicates that a quadratic-polynomial SVM with a combination of features as input shows the best discriminating performance.
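The ROC evaluation step can be reproduced in a few lines. A sketch with toy SVM decision scores (the feature extraction and SVM training themselves are omitted):

```python
def roc_curve(scores, labels):
    """Compute ROC points (FPR, TPR) by sweeping a threshold over the
    classifier scores; labels are 1 for MA (positive), 0 otherwise."""
    pairs = sorted(zip(scores, labels), reverse=True)
    pos = sum(labels)
    neg = len(labels) - pos
    tp = fp = 0
    points = [(0.0, 0.0)]
    for score, label in pairs:
        if label:
            tp += 1
        else:
            fp += 1
        points.append((fp / neg, tp / pos))
    return points

def auc(points):
    """Trapezoidal area under the ROC curve."""
    return sum((x2 - x1) * (y1 + y2) / 2
               for (x1, y1), (x2, y2) in zip(points, points[1:]))

# Toy SVM decision values: higher score should mean 'more likely MA'.
scores = [0.9, 0.8, 0.35, 0.6, 0.2, 0.1]
labels = [1,   1,   1,    0,   0,   0]
points = roc_curve(scores, labels)
```

Comparing the AUC obtained with different feature vectors or kernels is exactly the kind of comparison the abstract describes.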
Jenkins Model Based Ferrofluid Lubrication of a Curved Rough Annular Squeeze Film with Slip Velocity
Directory of Open Access Journals (Sweden)
J.R. Patel
2015-06-01
This paper deals with the combined effect of roughness and slip velocity on the performance of a Jenkins model based ferrofluid squeeze film in curved annular plates. Beavers and Joseph's slip model has been adopted to incorporate the effect of slip velocity. The stochastic model of Christensen and Tonder has been deployed to evaluate the effect of surface roughness. The associated stochastically averaged Reynolds-type equation is solved to derive the pressure distribution, leading to the calculation of the load carrying capacity. The graphical representation makes it clear that although the effect of transverse surface roughness is adverse in general, Jenkins model based ferrofluid lubrication provides some measures for mitigating this adverse effect, and this becomes more manifest when the slip parameter is reduced and negatively skewed roughness occurs. Of course, a judicious choice of curvature parameters and variance (-ve) adds to this positive effect.
A Method for Formulizing Disaster Evacuation Demand Curves Based on SI Model
Directory of Open Access Journals (Sweden)
Yulei Song
2016-10-01
The prediction of evacuation demand curves is a crucial step in disaster evacuation planning, which directly affects the performance of the evacuation. In this paper, we discuss the factors influencing individual evacuation decision making (whether and when to leave) and summarize them into four kinds: individual characteristics, social influence, geographic location, and warning degree. Viewing decision making as a social contagion, a method based on the Susceptible-Infective (SI) model is proposed to formulize the disaster evacuation demand curves, addressing both social influence and the other factors' effects. The disaster event of the "Tianjin Explosions" is used as a case study to illustrate the modeling results influenced by the four factors and to perform sensitivity analyses of the key parameters of the model. Some interesting phenomena are found and discussed, which is meaningful for authorities making specific evacuation plans. For example, due to the lower social influence in isolated communities, extra actions might be taken to accelerate the evacuation process in those communities.
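The SI dynamics behind such a demand curve can be sketched as a logistic contagion of the evacuation decision. The parameter values below are hypothetical placeholders (the paper calibrates its model to the "Tianjin Explosions" case):

```python
def evacuation_demand(n_population, beta, n_seed=1, dt=0.1, t_end=72.0):
    """Forward-Euler integration of the logistic SI dynamics
    dI/dt = beta * I * (N - I) / N, where I(t) is the cumulative number
    of households that have decided to evacuate ('infected' by the decision)."""
    i = float(n_seed)
    curve = [(0.0, i)]
    t = 0.0
    while t < t_end:
        i += dt * beta * i * (n_population - i) / n_population
        t += dt
        curve.append((t, i))
    return curve

# Hypothetical community of 10,000 households; beta bundles social influence,
# warning degree, location and individual characteristics into one rate.
curve = evacuation_demand(10_000, beta=0.35)
```

Lowering beta (e.g. for an isolated community with weak social influence) visibly flattens and delays the S-shaped demand curve, which is the effect the abstract's example describes.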
Composite Field Multiplier based on Look-Up Table for Elliptic Curve Cryptography Implementation
Directory of Open Access Journals (Sweden)
Marisa W. Paryasto
2013-09-01
Implementing a secure cryptosystem requires operations involving hundreds of bits. One of the most recommended algorithms is Elliptic Curve Cryptography (ECC). The complexity of elliptic curve algorithms and parameters with hundreds of bits requires a specific design and implementation strategy. The design architecture must be customized according to the security requirements, available resources and parameter choices. In this work we propose the use of a composite field to implement finite field multiplication for ECC. We use a 299-bit keylength represented in GF((2^13)^23) instead of in GF(2^299). A composite field multiplier can be implemented using different multipliers for the ground field and for the extension field. In this paper, a LUT is used for multiplication in the ground field and a classic multiplier is used for the extension field multiplication. A generic architecture for the multiplier is presented. Implementation is done in VHDL targeting the Altera DE2 device. The work in this paper uses the simplest algorithm to confirm the idea that dividing the field into a composite field and using different multipliers for the ground and extension fields gives a better time-area trade-off. This work is the beginning of our more advanced further research implementing composite fields using Mastrovito Hybrid, KOA and LUT.
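The LUT-based ground-field multiplication can be illustrated on a small example. A sketch in GF(2^4) with the primitive polynomial x^4 + x + 1 (the paper's actual ground field is GF(2^13), which works the same way with larger tables):

```python
def build_tables(poly=0b10011, bits=4):
    """Exp/log look-up tables for GF(2^bits), generated by x (assumed to be a
    primitive root of `poly`); turns multiplication into table look-ups."""
    exp, log = [0] * (2 ** bits - 1), [0] * (2 ** bits)
    x = 1
    for i in range(2 ** bits - 1):
        exp[i] = x
        log[x] = i
        x <<= 1
        if x >> bits:          # degree overflow: reduce modulo poly
            x ^= poly
    return exp, log

def gf_mul(a, b, exp, log, order=15):
    """Multiply in GF(2^4) via log/antilog tables: a*b = g^(log a + log b)."""
    if a == 0 or b == 0:
        return 0
    return exp[(log[a] + log[b]) % order]

exp, log = build_tables()
```

In the composite-field scheme, an element of GF((2^13)^23) is a degree-22 polynomial whose coefficient arithmetic is exactly this kind of LUT ground-field multiplication, while the extension-field product uses a classic polynomial multiplier.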
Psychometric curve and behavioral strategies for whisker-based texture discrimination in rats.
Directory of Open Access Journals (Sweden)
Takeshi Morita
The rodent whisker system is a major model for understanding neural mechanisms of tactile sensation of surface texture (roughness). Rats discriminate surface texture using their whiskers, and several theories exist for how texture information is physically sensed by the long, moveable macrovibrissae and encoded in the spiking of neurons in somatosensory cortex. However, evaluating these theories requires a psychometric curve for texture discrimination, which has been lacking. Here we trained rats to discriminate rough vs. fine sandpapers and grooved vs. smooth surfaces. Rats intermixed trials at macrovibrissa contact distance (nose >2 mm from the surface) with trials at shorter distance (nose <2 mm from the surface). Macrovibrissae were required for distant-contact trials, while microvibrissae and non-whisker tactile cues were used for short-distance trials. A psychometric curve was measured for macrovibrissa-based sandpaper texture discrimination. Rats discriminated rough P150 from smoother P180, P280, and P400 sandpaper (100, 82, 52, and 35 µm mean grit size, respectively). Use of olfactory, visual, and auditory cues was ruled out. This is the highest reported resolution for rodent texture discrimination, and it constrains models of neural coding of texture information.
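A psychometric curve of this kind is commonly modeled with a logistic function anchored at the 50% guess rate of a two-alternative task. A sketch with hypothetical threshold and slope values (not the paper's fitted parameters):

```python
import math

def psychometric(grit_diff_um, threshold, slope, guess=0.5, lapse=0.02):
    """Logistic psychometric function for a two-alternative texture task:
    p(correct) rises from the guess rate (0.5) toward 1 - lapse as the
    physical difference between the two sandpapers grows."""
    p = 1.0 / (1.0 + math.exp(-(grit_diff_um - threshold) / slope))
    return guess + (1.0 - guess - lapse) * p

# Hypothetical fit: halfway up the curve at a 30 µm grit-size difference.
# Grit differences below match the P150 vs. P180/P280/P400 comparisons
# (100 - 82, 100 - 52, 100 - 35 µm).
probs = [psychometric(d, threshold=30.0, slope=10.0) for d in (18, 48, 65)]
```

Fitting `threshold` and `slope` to trial-by-trial choice data (e.g. by maximum likelihood) yields the discrimination threshold that constrains the coding models mentioned above.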
R-curve behavior and micromechanisms of fracture in resin based dental restorative composites.
Shah, M B; Ferracane, J L; Kruzic, J J
2009-10-01
The fracture properties and micromechanisms of fracture for two commercial dental composites, one microhybrid (FiltekZ250) and one nanofill (FiltekSupreme Plus), were studied by measuring fracture resistance curves (R-curves) using pre-cracked compact-tension specimens and by conducting both unnotched and double notched four point beam bending experiments. Four point bending experiments showed about 20% higher mean flexural strength of the microhybrid composite compared to the nanofill. Rising fracture resistance was observed over approximately 1 mm of crack extension for both composites, and higher overall fracture resistance was observed for the microhybrid composite. Such fracture behavior was attributed to crack deflection and crack bridging toughening mechanisms that developed with crack extension, causing the toughness to increase. Despite the lower strength and toughness of the present nanofill composite, based on micromechanics observations, large nanoparticle clusters appear to be as effective at deflecting cracks and imparting toughening as solid particles. Thus, with further microstructural refinement, it should be possible to achieve a superior combination of aesthetic and mechanical performance using the nanocluster approach for dental composites.
A PID Positioning Controller with a Curve Fitting Model Based on RFID Technology
Directory of Open Access Journals (Sweden)
Young-Long Chen
2013-04-01
The global positioning system (GPS) is an important research topic for solving outdoor positioning problems, but GPS is unable to locate objects accurately and precisely indoors. Some available systems apply ultrasound or optical tracking. This paper presents an efficient proportional-integral-derivative (PID) controller with a curve-fitting model for mobile robot localization and position estimation, which adopts passive radio frequency identification (RFID) tags placed in the environment. The scheme is based on a mobile robot carrying an RFID reader module, which reads low-cost passive tags installed under the floor in a grid-like pattern. The PID controller increases the efficiency of capturing RFID tags, and the curve-fitting model is used to systematically identify the revolutions per minute (RPM) of the motor. We control and monitor the position of the robot from a remote location through a mobile phone via Wi-Fi and Bluetooth networks. Experimental results show that our proposed scheme captures more RFID tags than the previous scheme.
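The PID loop at the heart of such a scheme can be sketched in discrete time. The gains and the first-order motor model below are hypothetical placeholders, not the paper's identified RPM model:

```python
class PID:
    """Discrete PID controller: u = Kp*e + Ki*sum(e)*dt + Kd*de/dt."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Hypothetical first-order motor model: RPM relaxes toward the control input.
pid = PID(kp=1.2, ki=0.8, kd=0.05, dt=0.1)
rpm, target = 0.0, 120.0
for _ in range(300):
    u = pid.update(target, rpm)
    rpm += 0.1 * (u - 0.5 * rpm)   # toy plant dynamics
```

The curve-fitting step of the paper would replace the toy plant with an RPM-vs-input curve fitted to the real motor, so the controller drives the wheel speed that maximizes tag capture.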
Kort-Butler, Lisa A; Hagewen, Kellie J
2011-05-01
Research on adolescent self-esteem indicates that adolescence is a time in which individuals experience important changes in their physical, cognitive, and social identities. Prior research suggests that there is a positive relationship between an adolescent's participation in structured extracurricular activities and well-being in a variety of domains, and some research indicates that these relationships may depend on the type of activities in which adolescents participate. Building on previous research, a growth-curve analysis was utilized to examine self-esteem trajectories from adolescence (age 14) to young adulthood (age 26). Using 3 waves of data from the National Longitudinal Study of Adolescent Health (n = 5,399; 47.8% male), the analysis estimated a hierarchical growth-curve model emphasizing the effects of age and type of school-based extracurricular activity portfolio, including sports and school clubs, on self-esteem. The results indicated that age had a linear relationship with self-esteem over time. Changes in both the initial level of self-esteem and the growth of self-esteem over time were significantly influenced by the type of extracurricular activity portfolio. The findings were consistent across race and sex. The results support the utility of examining the longitudinal impact of portfolio type on well-being outcomes.
An Elliptic Curve Based Schnorr Cloud Security Model in Distributed Environment.
Muthurajan, Vinothkumar; Narayanasamy, Balaji
2016-01-01
Cloud computing requires security upgrades in data transmission approaches. In general, key-based encryption/decryption (symmetric and asymmetric) mechanisms ensure secure data transfer between devices. Symmetric key mechanisms (pseudorandom functions) provide a lower protection level than asymmetric key (RSA, AES, and ECC) schemes. The presence of expired content and irrelevant resources enables unauthorized data access. This paper investigates how integrity and secure data transfer are improved based on the Elliptic Curve based Schnorr scheme. This paper proposes a virtual machine based cloud model with a Hybrid Cloud Security Algorithm (HCSA) to remove the expired content. The HCSA-based auditing improves the prediction of malicious activity during data transfer. Duplication in the cloud server degrades the performance of EC-Schnorr based encryption schemes, so this paper utilizes the Bloom filter concept to avoid cloud server duplication. The combination of EC-Schnorr and the Bloom filter efficiently improves the security performance. The comparative analysis between the proposed HCSA and the existing Distributed Hash Table (DHT) regarding execution time, computational overhead, and auditing time with auditing requests and servers confirms the effectiveness of HCSA in the cloud security model creation.
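The duplicate-avoidance idea rests on the standard Bloom filter property of no false negatives. A minimal sketch (the hash count and bit-array size are illustrative, not the paper's configuration):

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: k hash probes into an m-bit array.
    May yield false positives but never false negatives, which is why it can
    cheaply pre-screen for duplicated content on the cloud server."""
    def __init__(self, m_bits=1024, k_hashes=4):
        self.m, self.k = m_bits, k_hashes
        self.bits = bytearray(m_bits)

    def _probes(self, item: str):
        # Derive k independent probe positions by salting SHA-256.
        for i in range(self.k):
            digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.m

    def add(self, item: str):
        for p in self._probes(item):
            self.bits[p] = 1

    def __contains__(self, item: str):
        return all(self.bits[p] for p in self._probes(item))

bf = BloomFilter()
bf.add("block-42")
```

A "not present" answer is definitive, so only the (rare) positive answers need the expensive EC-Schnorr-level check against the stored content.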
A User Authentication Scheme Based on Elliptic Curves Cryptography for Wireless Ad Hoc Networks
Chen, Huifang; Ge, Linlin; Xie, Lei
2015-01-01
The feature of non-infrastructure support in a wireless ad hoc network (WANET) makes it suffer from various attacks. Moreover, user authentication is the first safety barrier in a network. A mutual trust is achieved by a protocol which enables communicating parties to authenticate each other and to exchange session keys at the same time. For a resource-constrained WANET, an efficient and lightweight user authentication scheme is necessary. In this paper, we propose a user authentication scheme based on the self-certified public key system and elliptic curves cryptography for a WANET. Using the proposed scheme, efficient two-way user authentication and secure session key agreement can be achieved. Security analysis shows that our proposed scheme is resilient to commonly known attacks. In addition, the performance analysis shows that our proposed scheme performs similarly to or better than some existing user authentication schemes. PMID:26184224
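The elliptic-curve primitives underlying such schemes can be illustrated with textbook affine arithmetic on a toy curve, y² = x³ + 2x + 2 over GF(17) (far too small for real security). The paper's self-certified key exchange is not reproduced here, only a plain Diffie-Hellman-style shared secret:

```python
def ec_add(p_pt, q_pt, a, p):
    """Add two points on y^2 = x^3 + a*x + b over GF(p) (affine; None = infinity)."""
    if p_pt is None: return q_pt
    if q_pt is None: return p_pt
    (x1, y1), (x2, y2) = p_pt, q_pt
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                                   # P + (-P) = infinity
    if p_pt == q_pt:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p   # tangent slope
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p          # chord slope
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def ec_mul(k, pt, a, p):
    """Double-and-add scalar multiplication."""
    acc = None
    while k:
        if k & 1:
            acc = ec_add(acc, pt, a, p)
        pt = ec_add(pt, pt, a, p)
        k >>= 1
    return acc

# Toy curve y^2 = x^3 + 2x + 2 over GF(17), base point G = (5, 1) of order 19.
a, p, g = 2, 17, (5, 1)
alice_priv, bob_priv = 7, 11
alice_pub = ec_mul(alice_priv, g, a, p)
bob_pub = ec_mul(bob_priv, g, a, p)
shared_a = ec_mul(alice_priv, bob_pub, a, p)   # 7 * (11 * G)
shared_b = ec_mul(bob_priv, alice_pub, a, p)   # 11 * (7 * G)
```

Both parties derive the same point because scalar multiplication commutes, which is the algebraic core that the authentication and session-key agreement build on.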
Spectral optimization simulation of white light based on the photopic eye-sensitivity curve
International Nuclear Information System (INIS)
Dai, Qi; Hao, Luoxi; Lin, Yi; Cui, Zhe
2016-01-01
Spectral optimization simulation of white light is studied to boost the maximum attainable luminous efficacy of radiation at high color-rendering index (CRI) and various color temperatures. The photopic eye-sensitivity curve V(λ) is utilized as the dominant portion of the white light spectra. Emission spectra of a blue InGaN light-emitting diode (LED) and a red AlInGaP LED are added to the spectrum of V(λ) to match white color coordinates. It is demonstrated that for color temperatures from 2500 K to 6500 K and CRI above 90, such white sources can achieve a spectral efficacy of 330–390 lm/W, which is higher than previously reported theoretical maximum values. We show that this eye-sensitivity-based approach also has advantages in component energy conversion efficiency compared with previously reported optimization solutions.
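The figure of merit being optimized is the luminous efficacy of radiation, LER = 683 · ∫V(λ)S(λ)dλ / ∫S(λ)dλ. A sketch using a Gaussian stand-in for V(λ) (the real computation would use the tabulated CIE data and enforce the CRI and chromaticity constraints):

```python
import math

def v_lambda(nm):
    """Gaussian approximation to the CIE photopic sensitivity curve V(lambda),
    peaking at 1 near 555 nm (a rough stand-in, not the tabulated CIE data)."""
    return math.exp(-0.5 * ((nm - 555.0) / 42.0) ** 2)

def luminous_efficacy(spectrum, wavelengths):
    """Luminous efficacy of radiation in lm/W:
    683 * sum(V * S) / sum(S) on an evenly spaced wavelength grid."""
    num = sum(v_lambda(w) * s for w, s in zip(wavelengths, spectrum))
    den = sum(spectrum)
    return 683.0 * num / den

wavelengths = list(range(400, 701, 5))
# A spectrum shaped like V(lambda) itself weights power toward the eye's peak.
ideal = luminous_efficacy([v_lambda(w) for w in wavelengths], wavelengths)
# Adding blue (450 nm) and red (630 nm) LED power to fix the white point
# lowers the efficacy, since that power lands where V(lambda) is small.
mixed = luminous_efficacy(
    [v_lambda(w) + 0.2 * math.exp(-0.5 * ((w - 450) / 10) ** 2)
                 + 0.2 * math.exp(-0.5 * ((w - 630) / 10) ** 2)
     for w in wavelengths], wavelengths)
```

The optimization in the paper is essentially the search for the blue/red admixture that meets the color coordinates and CRI targets while giving up as little of this efficacy as possible.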
Rational quadratic trigonometric Bézier curve based on new basis with exponential functions
Directory of Open Access Journals (Sweden)
Wu Beibei
2017-06-01
We construct a rational quadratic trigonometric Bézier curve with four shape parameters by introducing two exponential functions into the trigonometric basis functions. It has properties similar to the rational quadratic Bézier curve. For given control points, the shape of the curve can be flexibly adjusted by changing the shape parameters and the weight. Some conics can be represented exactly when the control points, the shape parameters and the weight are chosen appropriately. The C^0, C^1 and C^2 continuity conditions for joining two constructed curves are discussed. Some examples are given.
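For comparison, the classical rational quadratic Bézier curve that the new basis generalizes can be evaluated directly; with the middle weight set to cos(45°) it reproduces a quarter circle exactly. The parameterization below is the standard polynomial one, not the paper's trigonometric basis with exponential shape parameters:

```python
import math

def rational_quadratic_bezier(t, p0, p1, p2, w=1.0):
    """Point on a standard rational quadratic Bezier curve at t in [0, 1].
    The middle weight w plays the same shape-tuning role as the paper's
    parameters; w = cos(half-angle) yields an exact circular arc."""
    b = [(1 - t) ** 2, 2 * (1 - t) * t, t * t]      # Bernstein basis
    ws = [1.0, w, 1.0]
    den = sum(wi * bi for wi, bi in zip(ws, b))
    return tuple(
        sum(wi * bi * pc[i] for wi, bi, pc in zip(ws, b, (p0, p1, p2))) / den
        for i in range(2))

# Quarter circle from (1,0) to (0,1) with control point (1,1), w = cos(45 deg).
w = math.cos(math.pi / 4)
pt = rational_quadratic_bezier(0.5, (1, 0), (1, 1), (0, 1), w)
pt25 = rational_quadratic_bezier(0.25, (1, 0), (1, 1), (0, 1), w)
```

The paper's construction adds two exponential shape parameters on top of this, giving extra degrees of freedom to adjust the curve without moving the control points.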
Timelike duality, M'-theory and an exotic form of the Englert solution
Henneaux, Marc; Ranjbar, Arash
2017-08-01
Through timelike dualities, one can generate exotic versions of M-theory with different spacetime signatures. These are the M*-theory with signature (9, 2, −), the M'-theory with signature (6, 5, +), and the theories with reversed signatures (1, 10, −), (2, 9, +) and (5, 6, −). In (s, t, ±), s is the number of space directions, t the number of time directions, and ± refers to the sign of the kinetic term of the 3-form. The only irreducible pseudo-Riemannian manifolds admitting absolute parallelism are, besides Lie groups, the seven-sphere S^7 ≡ SO(8)/SO(7) and its pseudo-Riemannian version S^{3,4} ≡ SO(4,4)/SO(3,4). [There is also the complexification SO(8,C)/SO(7,C), but it is of dimension too high for our considerations.] The seven-sphere S^7 ≡ S^{7,0} has been found to play an important role in 11-dimensional supergravity, both through the Freund-Rubin solution and the Englert solution, which uses its remarkable parallelizability to turn on nontrivial internal fluxes. The spacetime manifold is in both cases AdS_4 × S^7. We show that S^{3,4} plays a similar role in M'-theory and construct the exotic form AdS_4 × S^{3,4} of the Englert solution, with nonzero internal fluxes turned on. There is no analogous solution in M*-theory.
Detection of Time Lags between Quasar Continuum Emission Bands Based On Pan-STARRS Light Curves
Energy Technology Data Exchange (ETDEWEB)
Jiang, Yan-Fei [Kavli Institute for Theoretical Physics, University of California, Santa Barbara, CA 93106 (United States); Green, Paul J.; Pancoast, Anna; MacLeod, Chelsea L. [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Greene, Jenny E. [Department of Astrophysical Sciences, Princeton University, Princeton, NJ 08544 (United States); Morganson, Eric; Shen, Yue [Department of Astronomy, University of Illinois at Urbana-Champaign, Urbana, IL 61801 (United States); Anderson, Scott F.; Ruan, John J. [Department of Astronomy, University of Washington, Box 351580, Seattle, WA 98195 (United States); Brandt, W. N.; Grier, C. J. [Department of Astronomy and Astrophysics, The Pennsylvania State University, University Park, PA 16802 (United States); Rix, H.-W. [Max Planck Institute for Astronomy, Königstuhl 17, D-69117 Heidelberg (Germany); Protopapas, Pavlos [Institute for Applied Computational Science, John A. Paulson School of Engineering and Applied Sciences, Harvard University, Cambridge, MA 02138 (United States); Scott, Caroline [Astrophysics, Imperial College London, Blackett Laboratory, London SW7 2AZ (United Kingdom); Burgett, W. S.; Hodapp, K. W.; Huber, M. E.; Kaiser, N.; Kudritzki, R. P.; Magnier, E. A. [Institute for Astronomy, University of Hawaii, 2680 Woodlawn Drive, Honolulu HI 96822 (United States); and others
2017-02-20
We study the time lags between the continuum emission of quasars at different wavelengths, based on more than four years of multi-band (g, r, i, z) light curves in the Pan-STARRS Medium Deep Fields. As photons from different bands emerge from different radial ranges in the accretion disk, the lags constrain the sizes of the accretion disks. We select 240 quasars with redshifts of z ≈ 1 or z ≈ 0.3 that are relatively emission-line free. The light curves are sampled from day to month timescales, which makes it possible to detect lags on the scale of the light crossing time of the accretion disks. With the code JAVELIN, we detect typical lags of several days in the rest frame between the g band and the r, i, z bands. The detected lags are ∼2–3 times larger than the light crossing time estimated from the standard thin disk model, consistent with the recently measured lag in NGC 5548 and microlensing measurements of quasars. The lags in our sample are found to increase with increasing luminosity. Furthermore, the increase in lags going from g − r to g − i and then to g − z is slower than predicted in the thin disk model, particularly for high-luminosity quasars. The radial temperature profile in the disk must therefore be different from what is assumed. We also find evidence that the lags decrease with increasing line ratios between ultraviolet Fe II lines and Mg II, which may point to changes in the accretion disk structure at higher metallicity.
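Although the paper uses JAVELIN's stochastic-process modeling, the core idea of an inter-band lag can be illustrated with a simple discrete cross-correlation on evenly sampled synthetic light curves (real Pan-STARRS sampling is irregular, which is one reason JAVELIN is needed instead):

```python
import math

def best_lag(series_a, series_b, max_lag):
    """Grid search for the shift (in samples) that maximizes the Pearson
    correlation between two evenly sampled light curves; a positive lag
    means series_b trails series_a."""
    def corr(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((u - mx) * (v - my) for u, v in zip(x, y))
        sx = sum((u - mx) ** 2 for u in x) ** 0.5
        sy = sum((v - my) ** 2 for v in y) ** 0.5
        return cov / (sx * sy)
    scores = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            x, y = series_a[:len(series_a) - lag], series_b[lag:]
        else:
            x, y = series_a[-lag:], series_b[:len(series_b) + lag]
        scores[lag] = corr(x, y)
    return max(scores, key=scores.get)

# Synthetic g-band curve and a copy delayed by 3 samples (zero-padded start),
# standing in for a redder band that echoes the bluer one.
g_band = [math.sin(0.3 * t) + 0.2 * math.sin(1.1 * t) for t in range(200)]
z_band = [g_band[t - 3] if t >= 3 else 0.0 for t in range(200)]
lag = best_lag(g_band, z_band, max_lag=10)
```

In the reverberation picture, this recovered shift is the extra light-travel time to the cooler, larger radii that dominate the redder band.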
Chamidah, Nur; Rifada, Marisa
2016-03-01
There is a significant correlation between the weight and height of children, so simultaneous estimation of both responses is better than a partial single-response approach. In this study we investigate the pattern of sex differences in the growth curves of children from birth up to two years of age in Surabaya, Indonesia, based on a biresponse model. The data were collected from a longitudinal representative sample of healthy children in the Surabaya population and consist of two response variables, weight (kg) and height (cm), with age (months) as the predictor variable. Based on the generalized cross-validation criterion, the biresponse model with a local linear estimator gives optimal bandwidths of 1.41 and 1.56 and determination coefficients (R2) of 99.99% and 99.98% for the boy and girl growth curves, respectively. Both curves satisfy the goodness-of-fit criterion, i.e., the determination coefficient tends to one. There is also a difference in the growth curve patterns of boys and girls: the boys' median growth curve is higher than the girls'.
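The local linear estimator named in this abstract can be sketched for a single response. This is a hedged toy version with a Gaussian kernel; the data and bandwidth are invented, and the paper's bandwidths (1.41, 1.56) come from generalized cross-validation, which is omitted here.

```python
import math

# Hedged sketch of a local linear estimator: a kernel-weighted least-squares line
# is fitted around each evaluation point. Data and bandwidth are illustrative only.

def local_linear(x0, xs, ys, h):
    """Kernel-weighted least-squares line around x0; returns the fitted value at x0."""
    w = [math.exp(-0.5 * ((x - x0) / h) ** 2) for x in xs]
    sw = sum(w)
    swx = sum(wi * x for wi, x in zip(w, xs))
    swy = sum(wi * y for wi, y in zip(w, ys))
    swxx = sum(wi * x * x for wi, x in zip(w, xs))
    swxy = sum(wi * x * y for wi, x, y in zip(w, xs, ys))
    det = sw * swxx - swx * swx
    slope = (sw * swxy - swx * swy) / det
    intercept = (swy - slope * swx) / sw
    return intercept + slope * x0

# toy growth data: age (months) vs weight (kg)
ages = [0, 3, 6, 9, 12, 18, 24]
weights = [3.3, 6.0, 7.6, 8.9, 9.6, 10.9, 12.0]
fit_at_12 = local_linear(12.0, ages, weights, h=1.5)
```

A biresponse version would apply the same smoother to weight and height jointly, sharing the bandwidth selection across the two correlated responses.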
Directory of Open Access Journals (Sweden)
Patel Jimit R.
2015-01-01
This paper analyzes the combined effect of slip velocity and transverse roughness on the performance of a Jenkins model based ferrofluid lubrication of a squeeze film in curved rough annular plates. The slip model of Beavers and Joseph is invoked to evaluate the effect of slip velocity, and the stochastic averaging model of Christensen and Tonder is used to treat the surface roughness. The pressure distribution is obtained by solving the associated stochastically averaged Reynolds-type equation, and the load-carrying capacity is then calculated. Graphical representations of the results indicate that the effect of transverse surface roughness is adverse in general; however, the situation is relatively better in the case of negatively skewed roughness. Further, Jenkins model based ferrofluid lubrication offers some means of reducing the adverse effect of roughness when the slip parameter is kept at a reduced level with a suitable ratio of curvature parameters. Lastly, the positive effect of magnetization gets a boost from the combined effect of variance (-ve) and negatively skewed roughness when the aspect ratio is chosen suitably.
Directory of Open Access Journals (Sweden)
Noriyuki Oka
In the brain, the mechanisms of attention to the left and the right are known to be different. It is possible that brain activity when driving also differs with different horizontal road alignments (left or right curves), but little is known about this. We found driver brain activity to be different when driving on left and right curves, in an experiment using a large-scale driving simulator and functional near-infrared spectroscopy (fNIRS). The participants were fifteen healthy adults. We created a course simulating an expressway, comprising straight-line driving and gentle left and right curves, and monitored the participants under driving conditions, in which they drove at a constant speed of 100 km/h, and under non-driving conditions, in which they simply watched the screen (visual task). Changes in hemoglobin concentrations were monitored at 48 channels including the prefrontal cortex, the premotor cortex, the primary motor cortex and the parietal cortex. From orthogonal vectors of changes in deoxyhemoglobin and changes in oxyhemoglobin, we calculated changes in cerebral oxygen exchange, reflecting neural activity, and statistically compared the resulting values from the right and left curve sections. Under driving conditions, there were no sites where cerebral oxygen exchange increased significantly more during right curves than during left curves (p > 0.05), but cerebral oxygen exchange increased significantly more during left curves (p < 0.05) in the right premotor cortex, the right frontal eye field and the bilateral prefrontal cortex. Under non-driving conditions, increases were significantly greater during left curves (p < 0.05) only in the right frontal eye field. Left-curve driving was thus found to require more brain activity at multiple sites, suggesting that left-curve driving may require more visual attention than right-curve driving. The right frontal eye field was activated under both driving and non-driving conditions.
Supergravity on an Atiyah-Hitchin base
International Nuclear Information System (INIS)
Stotyn, Sean; Mann, R.B.
2008-01-01
We construct solutions to five dimensional minimal supergravity using an Atiyah-Hitchin base space. In examining the structure of solutions we show that they generically contain a singularity either on the Atiyah-Hitchin bolt or at larger radius where there is a singular solitonic boundary. However for most points in parameter space the solution exhibits a velocity of light surface (analogous to what appears in a Goedel space-time) that shields the singularity. For these solutions, all closed time-like curves are causally disconnected from the rest of the space-time in that they exist within the velocity of light surface, which null geodesics are unable to cross. The singularities in these solutions are thus found to be hidden behind the velocity of light surface and so are not naked despite the lack of an event horizon. Outside of this surface the space-time is geodesically complete, asymptotically flat and can be arranged so as not to contain closed time-like curves at infinity. The rest of parameter space simply yields solutions with naked singularities.
Malik, M. S.; Cavuto, A.; Martarelli, M.; Pandarese, G.; Revel, G. M.
2014-05-01
High-speed train axles are integrated for a lifetime, and it is time- and resource-consuming to conduct in-service inspection with high accuracy. Laser ultrasonics is a proposed solution as a subset of non-contact measuring methods, effective also for hard-to-reach areas, and recently proved effective using a Laser Doppler Vibrometer (LDV) or air-coupled probes in reception. A reliability analysis of laser ultrasonics for this specific application is performed here. The research is mainly based on a numerical study of the effect of high-energy laser pulses on the surface of a steel axle and of the behavior of the ultrasonic waves in detecting possible defects. The Probability of Detection (POD) concept is used as an estimate of the reliability of the inspection method. In particular, Model Assisted Probability of Detection (MAPOD), a modified form of POD in which models are used to infer results for a decisive statistical approach to the POD curve, is adopted here. This paper implements this approach by taking the inputs from limited experiments conducted on a high-speed train axle using laser ultrasonics (source: pulsed Nd:YAG; reception: high-frequency LDV) to calibrate a multiphysics FE model, and by using the calibrated model to generate data samples statistically representative of damaged train axles. The simulated flaws are in accordance with the real defects present on the axle. A set of flaws of different depths has been modeled in order to assess the laser-ultrasonics POD for this specific application.
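POD curves of the kind a MAPOD study produces are commonly modeled in log-normal form, POD(a) = Φ((ln a − μ)/σ). A minimal sketch with placeholder parameters, not the values fitted from the calibrated FE model in the paper:

```python
import math

# Hedged sketch of a log-normal POD curve; mu and sigma are placeholders, not the
# paper's fitted parameters.

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def pod(depth_mm, mu=math.log(0.5), sigma=0.4):
    """Probability of detecting a flaw of the given depth (mm)."""
    return norm_cdf((math.log(depth_mm) - mu) / sigma)

def a90(mu=math.log(0.5), sigma=0.4):
    """Flaw depth detected with 90% probability, the usual a90 figure of merit."""
    z90 = 1.2815515655446004   # 90th percentile of the standard normal
    return math.exp(mu + sigma * z90)
```

In a model-assisted workflow, μ and σ would be estimated from the simulated hit/miss (or signal-vs-size) samples that the calibrated model generates for the set of flaw depths.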
Elliptic Curve Cryptography-Based Authentication with Identity Protection for Smart Grids.
Directory of Open Access Journals (Sweden)
Liping Zhang
In a smart grid, the power service provider enables the expected power generation amount to be measured according to current power consumption, thus stabilizing the power system. However, the data transmitted over smart grids are not protected, and therefore suffer from several types of security threats and attacks. Thus, a robust and efficient authentication protocol should be provided to strengthen the security of smart grid networks. As the Supervisory Control and Data Acquisition system provides the security protection between the control center and substations in most smart grid environments, we focus on how to secure the communications between the substations and smart appliances. Existing security approaches fail to address the performance-security balance. In this study, we suggest a mitigation authentication protocol based on Elliptic Curve Cryptography with privacy protection, using a tamper-resistant device at the smart appliance side to achieve a delicate balance between the performance and security of smart grids. The proposed protocol provides some attractive features such as identity protection, mutual authentication and key agreement. Finally, we demonstrate the completeness of the proposed protocol using the Gong-Needham-Yahalom logic.
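The elliptic-curve operations underlying such a protocol reduce to point addition and scalar multiplication. A toy sketch on a tiny textbook curve (y² = x³ + 2x + 2 over F₁₇, prime group order 19, generator (5, 1)); a real deployment would use a standard curve such as P-256:

```python
# Toy sketch of elliptic-curve arithmetic: point addition and double-and-add scalar
# multiplication on y^2 = x^3 + 2x + 2 over F_17. Demo parameters only, not a
# production curve.

P, A, B = 17, 2, 2
O = None  # point at infinity (group identity)

def add(p1, p2):
    """Group law on the curve; handles identity, inverses, doubling and generic addition."""
    if p1 is O:
        return p2
    if p2 is O:
        return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return O                                          # p2 is the inverse of p1
    if p1 == p2:
        m = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P    # tangent slope
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, P) % P           # chord slope
    x3 = (m * m - x1 - x2) % P
    return (x3, (m * (x1 - x3) - y1) % P)

def mul(k, pt):
    """Double-and-add scalar multiplication, the core operation of ECDH/ECDSA."""
    acc = O
    while k:
        if k & 1:
            acc = add(acc, pt)
        pt = add(pt, pt)
        k >>= 1
    return acc

G = (5, 1)  # generator of the order-19 group
```

Two parties computing mul(a, mul(b, G)) and mul(b, mul(a, G)) arrive at the same shared point; this Diffie-Hellman pattern is what ECC-based key agreement in such protocols rests on.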
Fan, Desheng; Meng, Xiangfeng; Wang, Yurong; Yang, Xiulun; Peng, Xiang; He, Wenqi; Dong, Guoyan; Chen, Hongyi
2013-08-10
An optical identity authentication scheme based on the elliptic curve digital signature algorithm (ECDSA) and a phase retrieval algorithm (PRA) is proposed. In this scheme, a user's certification image and the quick response code of the user identity's keyed-hash message authentication code (HMAC) with added noise, serving as the amplitude and phase restriction, respectively, are digitally encoded into two phase keys using a PRA in the Fresnel domain. During the authentication process, when the two phase keys are presented to the system and illuminated by a plane wave of the correct wavelength, an output image is generated in the output plane. By identifying whether there is a match between the amplitude of the output image and the certification images pre-stored in the database, the system accomplishes a first-level verification. After the confirmation of first-level verification, the ECDSA signature is decoded from the phase part of the output image and verified to determine whether the user's identity is legitimate. Moreover, the introduction of the HMAC makes it almost impossible to forge the signature, and hence the phase keys, thanks to the HMAC's irreversibility. Theoretical analysis and numerical simulations both validate the feasibility of our proposed scheme.
Interior Temperature Measurement Using Curved Mercury Capillary Sensor Based on X-ray Radiography
Chen, Shuyue; Jiang, Xing; Lu, Guirong
2017-07-01
A method was presented for measuring the interior temperature of objects using a curved mercury capillary sensor based on X-ray radiography. The sensor is composed of a mercury bubble, a capillary and a fixed support. X-ray digital radiography was employed to capture images of the mercury column in the capillary, and a temperature control system was designed for the sensor calibration. We adopted livewire algorithms and mathematical morphology to calculate the mercury length. A measurement model relating mercury length to temperature was established, and the measurement uncertainty associated with the mercury column length and the linear model fitted by the least-squares method were analyzed. To verify the system, the interior temperature of a totally closed autoclave was measured from 29.53°C to 67.34°C. The experimental results show that the response of the system is approximately linear, with a maximum uncertainty of 0.79°C. This technique provides a new approach to measuring the interior temperature of objects.
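The calibration described here amounts to a least-squares line relating column length to temperature. A minimal sketch with fabricated data pairs; the real system calibrates against reference temperatures from the temperature control setup:

```python
# Sketch of the calibration step: an ordinary least-squares line T ~ a*L + b.
# The (length, temperature) pairs below are invented for illustration.

def fit_line(xs, ys):
    """Least-squares fit ys ~ a*xs + b; returns (a, b)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    return a, my - a * mx

# hypothetical calibration run: column length (mm) vs reference temperature (deg C)
lengths = [10.0, 20.0, 30.0, 40.0, 50.0]
temps = [29.5, 39.1, 48.6, 58.2, 67.3]
a, b = fit_line(lengths, temps)

def predict_temp(length_mm):
    """Interior temperature inferred from a measured mercury-column length."""
    return a * length_mm + b
```

The residual scatter of such a fit is what drives the quoted measurement uncertainty; the paper's 0.79°C maximum would correspond to the worst-case deviation over the calibrated range.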
Energy Technology Data Exchange (ETDEWEB)
Li, Ben; He, Feng; Ouyang, Jiting, E-mail: jtouyang@bit.edu.cn [School of Physics, Beijing Institute of Technology, Beijing 100081 (China); Duan, Xiaoxi [Research Center of Laser Fusion, CAEP, Mianyang 621900 (China)
2015-12-15
Simulation work is very important for understanding the formation of self-organized discharge patterns. Previous works have used various models, derived from other systems, to simulate discharge patterns, but most of these models are complicated and time-consuming. In this paper, we introduce a convenient phenomenological dynamic model based on the basic dynamic processes of glow discharge and on the voltage transfer curve (VTC) to study the dielectric barrier glow discharge (DBGD) pattern. The VTC is an important characteristic of a DBGD, which plots the change of wall voltage after a discharge as a function of the initial total gap voltage. In the modeling, the combined effect of the discharge conditions is included in the VTC, and the activation-inhibition effect is expressed by a spatial interaction term. Besides, the model reduces the dimensionality of the system by considering only the integrated effect of current flow. All of this greatly facilitates the construction of the model. Numerical simulations turn out to be in good accordance with our previous fluid modeling and experimental results.
Multiaxial fatigue criterion based on parameters from torsion and axial S-N curve
Directory of Open Access Journals (Sweden)
M. Margetin
2016-07-01
Multiaxial high-cycle fatigue is a topic that concerns nearly all industrial domains. In recent years, many recommendations on how to address problems with multiaxial fatigue lifetime estimation have been made and huge progress in the field has been achieved. Until now, however, no universal criterion for multiaxial fatigue has been proposed. Addressing this situation, this paper offers a design of a new multiaxial criterion for high-cycle fatigue. This criterion is based on a critical plane search. The damage parameter consists of a combination of normal and shear stresses on a critical plane (which is a plane with maximal shear stress amplitude). Material parameters used in the proposed criterion are obtained from the torsion and axial S-N curves. The proposed criterion correctly calculates lifetime for the boundary loading conditions (pure torsion and pure axial loading). Application of the proposed model is demonstrated on biaxial loading and the results are verified in a testing program using specimens made from S355 steel. Fatigue material parameters for the proposed criterion and multiple sets of data for different combinations of axial and torsional loading have been obtained during the experiment.
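The S-N curves that supply such material parameters are commonly described in Basquin form, σ_a = A·N^b. An illustrative sketch of extracting and inverting the fit; the two data points are invented, not the measured S355 values:

```python
import math

# Hedged sketch: Basquin fits, sigma_a = A * N**b, of the kind used to extract
# material parameters from axial and torsion S-N curves. Data points are invented.

def basquin_fit(n1, s1, n2, s2):
    """Return (A, b) such that s = A * n**b passes through both (n, s) points."""
    b = math.log(s2 / s1) / math.log(n2 / n1)
    return s1 / n1 ** b, b

def cycles_to_failure(stress_amp, A, b):
    """Invert the Basquin curve: estimated life at a given stress amplitude."""
    return (stress_amp / A) ** (1.0 / b)

# hypothetical axial S-N curve points: (cycles, stress amplitude in MPa)
A_ax, b_ax = basquin_fit(1e4, 420.0, 1e6, 260.0)
```

A critical-plane criterion would then combine the axial and torsion constants into a damage parameter evaluated on the plane of maximal shear stress amplitude.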
Elliptic Curve Cryptography-Based Authentication with Identity Protection for Smart Grids.
Zhang, Liping; Tang, Shanyu; Luo, He
2016-01-01
In a smart grid, the power service provider enables the expected power generation amount to be measured according to current power consumption, thus stabilizing the power system. However, the data transmitted over smart grids are not protected, and therefore suffer from several types of security threats and attacks. Thus, a robust and efficient authentication protocol should be provided to strengthen the security of smart grid networks. As the Supervisory Control and Data Acquisition system provides the security protection between the control center and substations in most smart grid environments, we focus on how to secure the communications between the substations and smart appliances. Existing security approaches fail to address the performance-security balance. In this study, we suggest a mitigation authentication protocol based on Elliptic Curve Cryptography with privacy protection, using a tamper-resistant device at the smart appliance side to achieve a delicate balance between the performance and security of smart grids. The proposed protocol provides some attractive features such as identity protection, mutual authentication and key agreement. Finally, we demonstrate the completeness of the proposed protocol using the Gong-Needham-Yahalom logic.
Li, Yi; Chen, Yuren
2016-12-30
To make driving assistance systems more humanized, this study focused on the prediction and assistance of drivers' perception-response time on mountain highway curves. Field tests were conducted to collect real-time driving data and driver vision information. A driver-vision lane model quantified curve elements in drivers' vision. A multinomial log-linear model was established to predict perception-response time from traffic/road environment information, the driver-vision lane model, and the mechanical status (in the last second). A corresponding assistance model showed a positive impact on drivers' perception-response times on mountain highway curves. Model results revealed that the driver-vision lane model and visual elements did have an important influence on drivers' perception-response time. Compared with roadside passive road safety infrastructure, proper visual geometry design, timely visual guidance, and the visual information integrality of a curve are significant factors for drivers' perception-response time.
Li, Yi; Chen, Yuren
2016-01-01
To make driving assistance system more humanized, this study focused on the prediction and assistance of drivers’ perception-response time on mountain highway curves. Field tests were conducted to collect real-time driving data and driver vision information. A driver-vision lane model quantified curve elements in drivers’ vision. A multinomial log-linear model was established to predict perception-response time with traffic/road environment information, driver-vision lane model, and mechanica...
Identification of replication origins in archaeal genomes based on the Z-curve method
Directory of Open Access Journals (Sweden)
Ren Zhang
2005-01-01
The Z-curve is a three-dimensional curve that constitutes a unique representation of a DNA sequence, i.e., both the Z-curve and the given DNA sequence can be uniquely reconstructed from each other. We employed Z-curve analysis to identify one replication origin in the Methanocaldococcus jannaschii genome, two replication origins in the Halobacterium species NRC-1 genome and one replication origin in the Methanosarcina mazei genome. One of the predicted replication origins of Halobacterium species NRC-1 is the same as a replication origin later identified by in vivo experiments. The Z-curve analysis of the Sulfolobus solfataricus P2 genome suggested the existence of three replication origins, which is also consistent with later experimental results. This review aims to summarize applications of the Z-curve in identifying replication origins of archaeal genomes, and to provide clues about the locations of as yet unidentified replication origins of the Aeropyrum pernix K1, Methanococcus maripaludis S2, Picrophilus torridus DSM 9790 and Pyrobaculum aerophilum str. IM2 genomes.
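The Z-curve transform itself is simple to state. A minimal sketch of the cumulative coordinates; origin-finding then looks for extrema in components such as the purine-pyrimidine or AT-GC disparity curves:

```python
# Minimal sketch of the Z-curve transform: cumulative coordinates
#   x_n = (A+G) - (C+T),  y_n = (A+C) - (G+T),  z_n = (A+T) - (C+G)
# over the first n bases. The sequence is fully recoverable from the curve.

def z_curve(seq):
    """Return the (x, y, z) points of the Z-curve for a DNA sequence."""
    x = y = z = 0
    points = []
    for base in seq.upper():
        x += 1 if base in "AG" else -1   # purine vs pyrimidine
        y += 1 if base in "AC" else -1   # amino vs keto
        z += 1 if base in "AT" else -1   # weak vs strong hydrogen bonding
        points.append((x, y, z))
    return points
```

Around an archaeal replication origin, the disparity components typically show a sharp turning point, which is what the analyses summarized above exploit.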
Phenomenological analysis of near-threshold periodic modulations of the proton timelike form factor
Bianconi, A.; Tomasi-Gustafsson, E.
2016-03-01
We have recently highlighted the presence of a periodically oscillating 10% modulation in the BABAR Collaboration data on the proton timelike form factors, expressing the deviations from the pointlike behavior of the proton-antiproton electromagnetic current in the reaction e⁺ + e⁻ → p̄ + p. Here we deepen our previous data analysis and confirm that in the case of several standard parametrizations it is possible to write the form factor in the form F0 + Fosc, where F0 is a parametrization expressing the long-range trend of the form factor (for q² ranging from the p̄p threshold to 36 GeV²), and Fosc is a function of the form exp(−Bp)cos(Cp), where p is the relative momentum of the final p̄p pair. Error bars allow for a clean identification of the main features of this modulation for q² … annihilation of p̄p pairs into multimeson states. We interpret the flux-creating part of the potential as due to the creation of a 1/q-ranged state when the virtual photon decays into a set of current quarks and antiquarks. This short-lived compact state may be expressed as a sum of several hadronic states, including ones with large mass Qn ≫ q that may exist for a time t ∼ 1/(Qn − q). The decay of these large-mass states leads to an intermediate-stage regeneration of the p̄p channel.
Timelike and null equatorial geodesics in the Bonnor-Sackfield relativistic disk
Directory of Open Access Journals (Sweden)
Guillermo A. González
2011-06-01
A study of timelike and null equatorial geodesics in the Bonnor-Sackfield relativistic thin disk is presented. The motion of test particles in the equatorial plane is analyzed, both for the Newtonian thin disk model and for the corresponding relativistic disk. The nature of the possible orbits is studied by means of a qualitative analysis of the effective potential and by numerically solving the equation of motion for radial and non-radial equatorial trajectories. The existence of stable, unstable and marginally stable circular orbits is analyzed, in both the Newtonian and the relativistic case. Examples of the numerical results, obtained with some simple values of the parameters, are presented.
Directory of Open Access Journals (Sweden)
M. Sivapalan
2012-11-01
Predictions of hydrological responses in ungauged catchments can benefit from a classification scheme that can organize and pool together catchments that exhibit a level of hydrologic similarity, especially similarity in some key variable or signature of interest. Since catchments are complex systems with a level of self-organization arising from the co-evolution of climate and landscape properties, including vegetation, there is much to be gained from developing a classification system based on a comparative study of a population of catchments across climatic and landscape gradients. The focus of this paper is on climate seasonality and the seasonal runoff regime, as characterized by the ensemble mean of within-year variation of climate and runoff. The work on regime behavior is part of an overall study of the physical controls on regional patterns of flow duration curves (FDCs), motivated by the fact that regime behavior leaves a major imprint upon the shape of FDCs, especially their slope. As an exercise in comparative hydrology, the paper seeks to assess the regime behavior of 428 catchments from the MOPEX database simultaneously, classifying and regionalizing them into homogeneous or hydrologically similar groups. A decision tree is developed on the basis of a metric chosen to characterize similarity of regime behavior, using a variant of the Iterative Dichotomiser 3 (ID3) algorithm to form a classification tree and associated catchment classes. In this way, several classes of catchments are distinguished, in which the connection between the catchments' regime behavior and climate and catchment properties becomes clearer. Only four similarity indices are entered into the algorithm, all of which are obtained from smoothed daily regime curves of climatic variables and runoff. Results demonstrate that climate seasonality plays the most significant role in the classification of US catchments, with rainfall timing and climatic aridity index
Gonyon, Thomas; Carter, Phillip W; Phillips, Gerald; Owen, Heather; Patel, Dipa; Kotha, Priyanka; Green, John-Bruce D
2014-08-01
The information content of the calcium phosphate compatibility curves for adult parenteral nutrition (PN) solutions may benefit from a more sophisticated statistical treatment. Binary logistic regression analyses were evaluated as part of an alternate method for generating formulation compatibility curves. A commercial PN solution was challenged with a systematic array of calcium and phosphate concentrations. These formulations were then characterized for particulates by visual inspection, light obscuration, and filtration followed by optical microscopy. Logistic regression analyses of the data were compared with traditional treatments for generating compatibility curves. Assay-dependent differences were observed in the compatibility curves and associated probability contours; the microscopic method of precipitate detection generated the most robust results. Calcium and phosphate compatibility data generated from small-volume glass containers reasonably predicted the observed compatibility of clinically relevant flexible containers. The published methods for creating calcium and phosphate compatibility curves by connecting the highest passing or lowest failing calcium concentrations should be augmented or replaced by probability contours of the entire experimental design to determine zones of formulation incompatibilities. We recommend researchers evaluate their data with logistic regression analysis to help build a more comprehensive probabilistic database of compatibility information. © 2013 American Society for Parenteral and Enteral Nutrition.
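The logistic-regression treatment advocated here can be sketched as follows; the screening grid below is fabricated, and real compatibility decisions must of course rest on measured formulations:

```python
import math

# Hedged sketch: a binary logistic model for the probability of precipitation as a
# function of calcium and phosphate concentrations, fitted by stochastic gradient
# ascent. Data are fabricated for illustration.

def sigmoid(t):
    return 1.0 / (1.0 + math.exp(-t))

def fit_logistic(rows, labels, lr=0.05, epochs=2000):
    """rows: [(calcium, phosphate)]; labels: 1 = precipitate observed, 0 = clear."""
    w = [0.0, 0.0, 0.0]   # intercept, calcium weight, phosphate weight
    for _ in range(epochs):
        for (ca, po4), label in zip(rows, labels):
            p = sigmoid(w[0] + w[1] * ca + w[2] * po4)
            err = label - p
            w[0] += lr * err
            w[1] += lr * err * ca
            w[2] += lr * err * po4
    return w

def p_precip(w, ca, po4):
    """Estimated probability of precipitation for a candidate formulation."""
    return sigmoid(w[0] + w[1] * ca + w[2] * po4)

# fabricated screening grid (mmol/L): higher calcium-phosphate products precipitate
rows = [(2, 10), (4, 10), (6, 10), (2, 30), (4, 30), (6, 30), (2, 50), (4, 50), (6, 50)]
labels = [0, 0, 1, 0, 1, 1, 0, 1, 1]
w = fit_logistic(rows, labels)
```

A compatibility contour (say, the 5% probability curve) then comes from solving p_precip(w, ca, po4) = 0.05 over the design space, rather than from connecting individual passing or failing points.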
Mukkamala, R.; Cohen, R. J.; Mark, R. G.
2002-01-01
Guyton developed a popular approach for understanding the factors responsible for cardiac output (CO) regulation in which 1) the heart-lung unit and systemic circulation are independently characterized via CO and venous return (VR) curves, and 2) average CO and right atrial pressure (RAP) of the intact circulation are predicted by graphically intersecting the curves. However, this approach is virtually impossible to verify experimentally. We theoretically evaluated the approach with respect to a nonlinear, computational model of the pulsatile heart and circulation. We developed two sets of open circulation models to generate CO and VR curves, differing by the manner in which average RAP was varied. One set applied constant RAPs, while the other set applied pulsatile RAPs. Accurate prediction of intact, average CO and RAP was achieved only by intersecting the CO and VR curves generated with pulsatile RAPs because of the pulsatility and nonlinearity (e.g., systemic venous collapse) of the intact model. The CO and VR curves generated with pulsatile RAPs were also practically independent. This theoretical study therefore supports the validity of Guyton's graphical analysis.
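Guyton's graphical intersection can be reproduced numerically: take a rising cardiac-output curve and a falling venous-return curve and find the right atrial pressure where they cross. The curve shapes and constants below are illustrative only, not those of the pulsatile computational model in the study:

```python
import math

# Hedged sketch of Guyton's graphical analysis with invented curve shapes.

def cardiac_output(rap):
    """Frank-Starling-like CO curve (L/min), rising and saturating with RAP (mmHg)."""
    return 10.0 * (1.0 - math.exp(-0.5 * (rap + 4.0)))

def venous_return(rap, msfp=7.0, rvr=1.4):
    """Guyton-style VR curve: linear fall from mean systemic filling pressure msfp."""
    return max((msfp - rap) / rvr, 0.0)

def equilibrium_rap(lo=-4.0, hi=7.0, tol=1e-9):
    """Bisection on CO(RAP) - VR(RAP), which increases monotonically in RAP."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if cardiac_output(mid) > venous_return(mid):
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

rap_eq = equilibrium_rap()       # predicted RAP of the intact circulation
co_eq = cardiac_output(rap_eq)   # predicted cardiac output at the intersection
```

The study's point is that this intersection only predicts the intact circulation correctly when the open-circuit curves are generated with pulsatile rather than constant RAP; the sketch shows the graphical step, not that subtlety.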
Memristance controlling approach based on modification of linear M—q curve
International Nuclear Information System (INIS)
Liu Hai-Jun; Li Zhi-Wei; Yu Hong-Qi; Sun Zhao-Lin; Nie Hong-Shan
2014-01-01
The memristor has broad application prospects in many fields, while in many cases those fields require accurate impedance control. The nonlinear model is of great importance for realizing memristance control accurately, but the implementation complexity caused by iteration has limited the actual application of this model. Considering the approximately linear characteristics in the middle region of the memristance-charge (M-q) curve of the nonlinear model, this paper proposes a memristance controlling approach, which is achieved by linearizing the middle region of the M-q curve of the nonlinear memristor, and establishes a linear relationship between the memristance M and the input excitation so that the impedance can be controlled precisely simply by adjusting the input signals. First, the feasibility of linearizing the middle part of the M-q curve of a memristor with a nonlinear model is analyzed from a qualitative perspective. Then, the linearization equations of the middle region of the M-q curve are constructed by using the shift method, and for a sinusoidal excitation the analytical relation between the memristance M and the charging time t is derived through Taylor series expansions. Finally, the performance of the proposed approach is demonstrated, including the linearizing capability for the middle part of the M-q curve of the nonlinear-model memristor, the controlling ability for the memristance M, and the influence of the input excitation on linearization errors. (interdisciplinary physics and related areas of science and technology)
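In the linearized middle region the control problem becomes simple algebra: with M(q) = M₀ − K·q, the charge needed for a target memristance follows by inversion. The constants below are illustrative HP-memristor-like values, not the paper's:

```python
# Hedged sketch of the linearized control idea; all device constants are assumed,
# not taken from the paper.

RON, ROFF = 100.0, 16e3    # on/off resistances, ohms (assumed)
Q_MAX = 1e-4               # charge swing across the linear region, C (assumed)
K = (ROFF - RON) / Q_MAX   # slope of the linearized M-q curve, ohm/C
M0 = ROFF                  # memristance at q = 0 in the linear region

def memristance(q):
    """Linearized middle region of the M-q curve."""
    return M0 - K * q

def charge_for_target(m_target):
    """Charge that must flow through the device to reach the target memristance."""
    return (M0 - m_target) / K
```

Under a sinusoidal drive, the delivered charge is a closed-form function of time, so charge_for_target translates directly into a drive duration; that time relation is what the paper derives via Taylor expansion.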
International Nuclear Information System (INIS)
Paegert, Martin; Stassun, Keivan G.; Burger, Dan M.
2014-01-01
We describe a new neural-net-based light curve classifier and provide it with documentation as a ready-to-use tool for the community. While optimized for identification and classification of eclipsing binary stars, the classifier is general purpose, and has been developed for speed in the context of upcoming massive surveys such as the Large Synoptic Survey Telescope. A challenge for classifiers in the context of neural-net training and massive data sets is to minimize the number of parameters required to describe each light curve. We show that a simple and fast geometric representation that encodes the overall light curve shape, together with a chi-square parameter to capture higher-order morphology information results in efficient yet robust light curve classification, especially for eclipsing binaries. Testing the classifier on the ASAS light curve database, we achieve a retrieval rate of 98% and a false-positive rate of 2% for eclipsing binaries. We achieve similarly high retrieval rates for most other periodic variable-star classes, including RR Lyrae, Mira, and delta Scuti. However, the classifier currently has difficulty discriminating between different sub-classes of eclipsing binaries, and suffers a relatively low (∼60%) retrieval rate for multi-mode delta Cepheid stars. We find that it is imperative to train the classifier's neural network with exemplars that include the full range of light curve quality to which the classifier will be expected to perform; the classifier performs well on noisy light curves only when trained with noisy exemplars. The classifier source code, ancillary programs, a trained neural net, and a guide for use, are provided.
Monitoring and Fault Detection in Photovoltaic Systems Based On Inverter Measured String I-V Curves
DEFF Research Database (Denmark)
Spataru, Sergiu; Sera, Dezso; Kerekes, Tamas
2015-01-01
Most photovoltaic (PV) string inverters have the hardware capability to measure at least part of the current-voltage (I-V) characteristic curve of the PV strings connected at the input. However, this intrinsic capability of the inverters is not used, since I-V curve measurement and monitoring functions are not implemented in the inverter control software. In this paper, we aim to show how such a functionality can be useful for PV system monitoring purposes, to detect the presence and cause of power loss in the PV strings, be it due to shading, degradation of the PV modules or balance-of-system components through increased series resistance losses, or shunting of the PV modules. To achieve this, we propose and experimentally demonstrate three complementary PV system monitoring methods that make use of the I-V curve measurement capability of a commercial string inverter. The first method is suitable…
van Aert, Robbie C M; Wicherts, Jelte M.; van Assen, Marcel A L M
2016-01-01
Because of overwhelming evidence of publication bias in psychology, techniques to correct meta-analytic estimates for such bias are greatly needed. The methodology on which the p-uniform and p-curve methods are based has great promise for providing accurate meta-analytic estimates in the presence of
Directory of Open Access Journals (Sweden)
Xiaoyun Huang
2015-09-01
To improve the real-time performance and detection rate of a Lane Detection and Reconstruction (LDR) system, an extended-search-based lane detection method and a Bézier curve-based lane reconstruction algorithm are proposed in this paper. The extended-search-based lane detection method is designed to search boundary blocks from the initial position, in an upwards direction and along the lane, with small search areas including continuous search, discontinuous search and bending search in order to detect different lane boundaries. The Bézier curve-based lane reconstruction algorithm is employed to describe a wide range of lane boundary forms with comparatively simple expressions. In addition, two Bézier curves are adopted to reconstruct lane outer boundaries with large curvature variation. The lane detection and reconstruction algorithm — including initial-block determination, extended search, binarization processing and lane boundary fitting in different scenarios — is verified in road tests. The results show that this algorithm is robust against different shadows and illumination variations; the average processing time per frame is 13 ms. Significantly, it achieves a high detection rate of 88.6% on curved lanes with large or variable curvatures, where the accident rate is higher than on straight lanes.
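The reconstruction primitive here is the evaluation of a Bézier boundary, which De Casteljau's algorithm performs with repeated linear interpolation. A minimal sketch with invented control points; in the LDR system they would come from the detected boundary blocks:

```python
# Sketch of Bezier evaluation via De Casteljau's algorithm. Control points are
# invented for illustration.

def bezier(ctrl, t):
    """Evaluate a Bezier curve of any degree at t in [0, 1] (De Casteljau)."""
    pts = list(ctrl)
    while len(pts) > 1:
        pts = [((1 - t) * x0 + t * x1, (1 - t) * y0 + t * y1)
               for (x0, y0), (x1, y1) in zip(pts, pts[1:])]
    return pts[0]

# one cubic segment of a gently curving boundary (x along the road, y lateral, metres);
# a second segment sharing an end point would handle large curvature variation
ctrl = [(0.0, 0.0), (10.0, 0.5), (20.0, 2.0), (30.0, 5.0)]
mid = bezier(ctrl, 0.5)
```

Using two such segments joined at a shared point is one way to get the large, variable curvatures mentioned above while keeping each segment's expression simple.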
Hsu, Pi-Shan
2012-01-01
This study aims to develop the core mechanism for realizing the development of personalized adaptive e-learning platform, which is based on the previous learning effort curve research and takes into account the learner characteristics of learning style and self-efficacy. 125 university students from Taiwan are classified into 16 groups according…
International Nuclear Information System (INIS)
Kuivalainen, Kalle; Peiponen, Kai-Erik; Myller, Kari
2009-01-01
An optical measurement device, which is a diffractive element-based sensor, is presented for the detection of latent fingerprints on curved objects such as a ballpoint pen. The device provides image and gloss information on the ridges of a fingerprint. The device is expected to have applications in forensic studies. (technical design note)
Brinkman, W.M.; Luursema, J.M.; Kengen, B.; Schout, B.M.; Witjes, J.A.; Bekkers, R.L.M.
2013-01-01
OBJECTIVE: To answer 2 research questions: what are the learning curve patterns of novices on the da Vinci skills simulator parameters, and which parameters are appropriate for criterion-based robotic training? MATERIALS AND METHODS: A total of 17 novices completed 2 simulator sessions within 3 days.
Optimization of ISOL targets based on Monte-Carlo simulations of ion release curves
Mustapha, B
2003-01-01
A detailed model for simulating release curves from ISOL targets has been developed. The full 3D geometry is implemented using Geant-4. Produced particles are followed individually from production to release. The delay time is computed event by event. All processes involved: diffusion, effusion and decay are included to obtain the overall release curve. By fitting to the experimental data, important parameters of the release process (diffusion coefficient, sticking time, ...) are extracted. They can be used to improve the efficiency of existing targets and design new ones more suitable to produce beams of rare isotopes.
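The full Geant4 model is beyond the scope of an abstract, but the event-by-event idea (delay = diffusion time + effusion time, with decay losses during the delay) can be sketched with a toy Monte-Carlo; the time constants below are hypothetical, not fitted values:

```python
import math
import random

def simulate_release(n_ions, tau_diff, tau_eff, half_life, seed=1):
    """Toy Monte-Carlo release model: each ion's delay is the sum of an
    exponential diffusion time and an exponential effusion time; the ion
    counts as released only if it has not decayed during the delay.
    (Illustrative stand-in for the full Geant4-based simulation.)"""
    rng = random.Random(seed)
    lam = math.log(2) / half_life  # decay constant
    released = []
    for _ in range(n_ions):
        delay = rng.expovariate(1.0 / tau_diff) + rng.expovariate(1.0 / tau_eff)
        if rng.random() < math.exp(-lam * delay):  # survived decay in transit
            released.append(delay)
    return released

times = simulate_release(20000, tau_diff=0.5, tau_eff=0.2, half_life=1.0)
efficiency = len(times) / 20000  # fraction released before decaying
```

With these illustrative constants roughly two-thirds of the ions survive to release; fitting the histogram of such release times to measured curves is what yields the diffusion and sticking parameters mentioned in the abstract.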
Reddy, Alavalapati Goutham; Das, Ashok Kumar; Odelu, Vanga; Yoo, Kee-Young
2016-01-01
Biometric-based authentication protocols for multi-server architectures have gained momentum in recent times due to advancements in wireless technologies and associated constraints. Lu et al. recently proposed a robust biometric-based authentication with key agreement protocol for a multi-server environment using smart cards. They claimed that their protocol is efficient and resistant to prominent security attacks. The careful investigation of this paper proves that Lu et al.'s protocol does not provide user anonymity or perfect forward secrecy and is susceptible to server and user impersonation attacks, man-in-the-middle attacks and clock synchronization problems. In addition, this paper proposes an enhanced biometric-based authentication with key-agreement protocol for multi-server architecture based on elliptic curve cryptography using smartcards. We proved that the proposed protocol achieves mutual authentication using Burrows-Abadi-Needham (BAN) logic. The formal security of the proposed protocol is verified using the AVISPA (Automated Validation of Internet Security Protocols and Applications) tool to show that our protocol can withstand active and passive attacks. The formal and informal security analyses and the performance analysis demonstrate that the proposed protocol is robust and efficient compared to Lu et al.'s protocol and existing similar protocols. PMID:27163786
DEFF Research Database (Denmark)
Bernstein, Daniel J.; Birkner, Peter; Lange, Tanja
2013-01-01
This paper introduces EECM-MPFQ, a fast implementation of the elliptic-curve method of factoring integers. EECM-MPFQ uses fewer modular multiplications than the well-known GMP-ECM software, takes less time than GMP-ECM, and finds more primes than GMP-ECM. The main improvements above the modular......-arithmetic level are as follows: (1) use Edwards curves instead of Montgomery curves; (2) use extended Edwards coordinates; (3) use signed-sliding-window addition-subtraction chains; (4) batch primes to increase the window size; (5) choose curves with small parameters and base points; (6) choose curves with large...
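Improvement (1), switching to Edwards curves, relies on their simple unified addition law on x^2 + y^2 = 1 + d x^2 y^2. A sketch over a toy field (EECM-MPFQ works with cryptographically sized moduli; p = 13 and d = 2 below are purely illustrative):

```python
P_MOD = 13  # tiny prime field; d = 2 is a non-square mod 13, so the law is complete
D = 2

def ed_add(p1, p2, p=P_MOD, d=D):
    """Edwards addition law on x^2 + y^2 = 1 + d*x^2*y^2 over F_p."""
    x1, y1 = p1
    x2, y2 = p2
    t = d * x1 * x2 * y1 * y2 % p
    x3 = (x1 * y2 + y1 * x2) * pow(1 + t, -1, p) % p
    y3 = (y1 * y2 - x1 * x2) * pow(1 - t, -1, p) % p
    return x3, y3

def ed_mul(k, pt):
    """Scalar multiplication by double-and-add; (0, 1) is the neutral element."""
    acc = (0, 1)
    while k:
        if k & 1:
            acc = ed_add(acc, pt)
        pt = ed_add(pt, pt)
        k >>= 1
    return acc

def on_curve(pt, p=P_MOD, d=D):
    x, y = pt
    return (x * x + y * y) % p == (1 + d * x * x * y * y) % p

P = (4, 4)        # a point on the curve mod 13
Q = ed_mul(5, P)  # -> (9, 9)
```

The same addition formula handles doubling and generic addition, which is one reason Edwards coordinates save modular multiplications compared with short Weierstrass or Montgomery arithmetic.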
Image Features Based on Characteristic Curves and Local Binary Patterns for Automated HER2 Scoring
Directory of Open Access Journals (Sweden)
Ramakrishnan Mukundan
2018-02-01
This paper presents novel feature descriptors and classification algorithms for the automated scoring of HER2 in Whole Slide Images (WSI) of breast cancer histology slides. Since a large amount of processing is involved in analyzing WSI images, the primary design goal has been to keep the computational complexity to the minimum possible level and to use simple, yet robust feature descriptors that can provide accurate classification of the slides. We propose two types of feature descriptors that encode important information about staining patterns and the percentage of staining present in ImmunoHistoChemistry (IHC)-stained slides. The first descriptor is called a characteristic curve, which is a smooth non-increasing curve that represents the variation of the percentage of staining with saturation levels. The second new descriptor introduced in this paper is a local binary pattern (LBP) feature curve, which is also a non-increasing smooth curve that represents the local texture of the staining patterns. Both descriptors show excellent interclass variance and intraclass correlation and are suitable for the design of automatic HER2 classification algorithms. This paper gives the detailed theoretical aspects of the feature descriptors and also provides experimental results and a comparative analysis.
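A characteristic curve in the sense described (percentage of staining as a function of saturation level) can be sketched as a survival-style count, which is non-increasing by construction; the pixel values below are hypothetical:

```python
def characteristic_curve(saturation_values, levels):
    """Fraction of stained pixels whose saturation meets or exceeds each
    threshold level; the resulting curve is non-increasing by construction."""
    n = len(saturation_values)
    return [sum(1 for v in saturation_values if v >= s) / n for s in levels]

# Hypothetical per-pixel saturation values from an IHC-stained region.
pixels = [0.1, 0.2, 0.4, 0.4, 0.7, 0.9]
curve = characteristic_curve(pixels, levels=[0.0, 0.25, 0.5, 0.75])
# curve[0] == 1.0 and every later value is <= the previous one
```

Strongly stained slides keep the curve high at large saturation levels, while weakly stained slides drop quickly, which is what gives the descriptor its interclass separation.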
Aspects of Pairing Based Cryptography on Jacobians of Genus Two Curves
DEFF Research Database (Denmark)
Ravnshøj, Christian Robenhagen
The thesis concerns properties of Jacobians of genus two curves defined over a finite field. Such Jacobians have a wide range of applications in data security; e.g. netbanking and digital signature. New properties of the Jacobians are proved; here, a description of the embedding of -torsion point...
Zhang, Y. S.; Wang, H.; Chen, G. L.; Zhang, X. Q.
2007-03-01
Advanced high strength steels are being increasingly used in the automotive industry to reduce weight and improve fuel economy. However, due to the increased physical properties and different chemistry of high strength steels, it is difficult to directly substitute these materials into production processes currently designed for mild steels. New process parameters and process-related issues must be developed and understood for high strength steels. Among all issues, endurance of the electrode cap is the most important. In this paper, the electrode wear characteristics of hot-dipped galvanized dual-phase (DP600) steels and their effect on weld quality are first analysed. An electrode displacement curve, which can monitor electrode wear, was measured by an experimental system developed around a servo gun. A neuro-fuzzy inference system based on the electrode displacement curve is developed to minimize the effect of a worn electrode on weld quality by adaptively adjusting input variables according to the measured electrode displacement curve when electrode wear occurs. A modified current curve is implemented to reduce the effects of electrode wear on weld quality using the developed neuro-fuzzy system.
Fracture resistance curves and toughening mechanisms in polymer based dental composites
DEFF Research Database (Denmark)
De Souza, J.A.; Goutianos, Stergios; Skovgaard, M.
2011-01-01
The fracture resistance (R-curve behaviour) of two commercial dental composites (Filtek Z350® and Concept Advanced®) were studied using Double Cantilever Beam sandwich specimens loaded with pure bending moments to obtain stable crack growth. The experiments were conducted in an environmental...... displayed distinctly different R-curve behaviours. The difference was related to different toughening mechanisms as the two composites had markedly different microstructures. Contrary to common experience, the composite with the finer microstructure (smaller particles), the Concept Advanced®, showed...... significantly higher fracture resistance than the composite with the coarser microstructure. The fracture properties were related to the flexural strength of the dental composites. The method, thus, can provide useful insight into how the microstructure enhances toughness, which is necessary for the future...
Carrancho, Á.; Villalaín, J. J.; Pavón-Carrasco, F. J.; Osete, M. L.; Straus, L. G.; Vergès, J. M.; Carretero, J. M.; Angelucci, D. E.; González Morales, M. R.; Arsuaga, J. L.; Bermúdez de Castro, J. M.; Carbonell, E.
2013-10-01
Neolithic, Chalcolithic and Bronze Age anthropogenic cave sediments from three caves from northern Spain have been palaeomagnetically investigated. 662 oriented specimens corresponding to 39 burning events (ash-carbonaceous couplets) from the three sites with an average of 16 samples per fire were collected. 26 new archaeomagnetic directions have been obtained for the time period ranging from 5500 to 2000 yr cal. BC. These results represent the oldest archaeomagnetic directions obtained from burnt archaeological materials throughout all Western Europe. Magnetisation is carried by pseudo-single domain low-coercivity ferromagnetic minerals (magnetite, magnetite with no significant isomorphous substitution and/or maghaemite). Rock-magnetic experiments indicate a thermoremanent origin of the magnetisation although a thermochemical magnetisation cannot be excluded. Combination of the new data presented here and the recent updated Bulgarian database allows us to propose the first European palaeosecular variation (PSV) curve for the Neolithic. A bootstrap method was applied for the curve construction using penalised cubic B-splines in time. The new palaeosecular variation curve is well constrained from 6000 BC to 3700 BC, the period with the highest density of data, showing a declination maximum around 4700 BC and a minimum in inclination at 4300 BC, which are not recorded by the recent global CALS10K.1b and regional SCHA.DIF.8K models due to the use of lake sediment data. Dating resolution by using the proposed PSV curve oscillates from approximately ±30 yr to ±200 yr for the period 6000 to 1000 yr BC, reaching similar resolution as radiocarbon dating. Considering the good preservation, age-control and widespread occurrence of burnt archaeological materials across Southern Europe, they represent a new source of data for geomagnetic field modelling, as well as for archaeomagnetic dating.
An Improved MPPT Algorithm for PV Generation Applications Based on P-U Curve Reconstitution
Wang, Yaoqiang; Zhang, Meiling; Cheng, Xian
2016-01-01
The output power of PV array changes with the variation of environmental factors, such as temperature and solar irradiation. Therefore, a maximum power point (MPP) tracking (MPPT) algorithm is essential for the photovoltaic generation system. However, the P-U curve changes dynamically with the variation of the environmental factors; here, the misjudgment may occur if a simple perturb-and-observe (P&O) MPPT algorithm is used. In order to solve this problem, this paper takes MPPT as the main re...
The Predicting Model of E-commerce Site Based on the Ideas of Curve Fitting
Tao, Zhang; Li, Zhang; Dingjun, Chen
On the basis of second-order (quadratic) curve fitting, the number and scale of Chinese E-commerce sites are analyzed. A preventing-increase model is introduced in this paper, and the model parameters are solved with the Matlab software. The validity of the preventing-increase model is confirmed through a numerical experiment. The experimental results show that the precision of the preventing-increase model is good.
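Second-order curve fitting of the kind described reduces to solving the 3x3 normal equations; the sketch below is a plain least-squares parabola in pure Python, a stand-in for the paper's Matlab fit (the sample data are invented):

```python
def fit_quadratic(xs, ys):
    """Least-squares fit of y = a*x^2 + b*x + c via the normal equations,
    solved by Gaussian elimination with partial pivoting."""
    s = lambda f: sum(f(x, y) for x, y in zip(xs, ys))
    # Normal-equation system A * [a, b, c]^T = rhs.
    A = [[s(lambda x, y: x**4), s(lambda x, y: x**3), s(lambda x, y: x**2)],
         [s(lambda x, y: x**3), s(lambda x, y: x**2), s(lambda x, y: x)],
         [s(lambda x, y: x**2), s(lambda x, y: x),    float(len(xs))]]
    rhs = [s(lambda x, y: x * x * y), s(lambda x, y: x * y), s(lambda x, y: y)]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(A[r][i]))  # pivot row
        A[i], A[p] = A[p], A[i]
        rhs[i], rhs[p] = rhs[p], rhs[i]
        for r in range(i + 1, 3):
            f = A[r][i] / A[i][i]
            A[r] = [a - f * b for a, b in zip(A[r], A[i])]
            rhs[r] -= f * rhs[i]
    coeffs = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):  # back substitution
        coeffs[i] = (rhs[i] - sum(A[i][j] * coeffs[j]
                                  for j in range(i + 1, 3))) / A[i][i]
    return coeffs  # [a, b, c]

# Noise-free samples of y = 2x^2 - 3x + 1 are recovered exactly.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [2 * x * x - 3 * x + 1 for x in xs]
a, b, c = fit_quadratic(xs, ys)
```

With noisy observations the same solver returns the least-squares coefficients, from which extrapolated site counts can be read off.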
SAR raw data compression based on geometric characteristic of Gaussian curve
Liu, Juan-ni; Zhou, Quan
2015-07-01
Because of its simplicity and good performance, the block adaptive quantization (BAQ) algorithm has become a popular method for spaceborne synthetic aperture radar (SAR) raw data compression. As the distribution of SAR data can be accurately modeled as Gaussian, the algorithm adaptively quantizes the SAR data using the Lloyd-Max quantizer, which is optimal for a standard Gaussian signal. However, due to the complexity of the imaging target features, the probability distribution function of some SAR data deviates from the Gaussian distribution, and the BAQ compression performance declines. In view of this situation, this paper proposes a method to judge whether the data satisfy a Gaussian distribution: using the geometric relationship between the standard Gaussian curve and a triangle whose area equals that of the Gaussian curve, the coordinates of the intersections of the two curves are obtained, and the integral values within each node interval are compared to form three judgment conditions. Finally, the data satisfying these conditions are compressed by BAQ, and the remaining data are compressed by DPCM. Experimental results indicate that the proposed scheme improves the performance compared with the BAQ method.
A New Model of Stopping Sight Distance of Curve Braking Based on Vehicle Dynamics
Directory of Open Access Journals (Sweden)
Rong-xia Xia
2016-01-01
Compared with straight-line braking, braking on a curve involves a longer braking distance and poorer stability, so drivers are more prone to making mistakes. The braking process and the dynamics of vehicles in emergency situations on curves were analyzed. A biaxial four-wheel vehicle was simplified to a single model. Considering the braking process, dynamics, force distribution, and stability, a calculation model for the stopping sight distance under curve braking was built. Then a driver-vehicle-road simulation platform was built using multibody dynamics software, and a brake-in-turn vehicle test was realized on this platform. The comparison of experimental and calculated values verified the reliability of the computational model. Eventually, the experimental and calculated values were compared with the stopping sight distance recommended by the Highway Route Design Specification (JTG D20-2006); the current specification of stopping sight distance does not satisfy the sight distance requirements of curve braking. In this paper, the general values and limits of the curve stopping sight distance are presented.
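For comparison with the paper's curve-braking model, the classic straight-line stopping sight distance (reaction distance plus braking distance) is easy to state; the friction and reaction-time values below are conventional placeholders, not the paper's parameters:

```python
def stopping_sight_distance(speed_kmh, reaction_time_s=2.5,
                            friction=0.35, grade=0.0, g=9.81):
    """Baseline straight-line stopping sight distance in metres:
    reaction distance v*t plus braking distance v^2 / (2*g*(f + i)).
    The paper's model extends this with lateral force distribution and
    stability constraints on curves; this is only the baseline formula."""
    v = speed_kmh / 3.6  # convert km/h to m/s
    return v * reaction_time_s + v * v / (2.0 * g * (friction + grade))

d = stopping_sight_distance(80.0)  # roughly 127 m at 80 km/h
```

Because cornering consumes part of the tyre friction budget laterally, the effective longitudinal friction on a curve is lower, which is why the curve-braking distances in the paper exceed this straight-line baseline.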
Directory of Open Access Journals (Sweden)
A. Yu. Alikov
2017-01-01
Objectives. The aim of the present paper is to minimise the errors in the approximation of experimentally obtained acceleration curves. Methods. Based on the features and disadvantages of the well-known Simoyu method for calculating transfer functions on the basis of acceleration curves, a modified version of the method is developed using the Matlab and Mathcad software. This is based on minimising the sum of the squares of the experimental point deviations from the solution of the differential equation at the same points. Results. Methods for the implementation of parametric identification are analysed and the Simoyu method is chosen as the most effective. On the basis of the analysis of its advantages and disadvantages, a modified method is proposed that allows the structure and parameters of the transfer function to be identified according to the experimental acceleration curve, as well as the choice of optimal numerical values of those parameters for minimising errors in the approximation of the experimentally obtained acceleration curves. Conclusion. The problem of optimal control over a complex technical facility was solved. On the basis of the modified Simoyu method, an algorithm for the automated selection of the optimal shape and calculation of transfer function parameters of dynamic elements of complex technical objects according to the acceleration curves in the impact channels was developed. This has allowed the calculation efficiency of the dynamic characteristics of control objects to be increased by minimising the approximation errors. The efficiency of the proposed calculation method is shown. Its simplicity makes it possible to apply it to practical calculations, especially in the design of complex technical objects within the framework of a computer-aided design system. The proposed method makes it possible to increase the accuracy of the approximation by at least 20%, which is an important advantage for its practical
Tema, E.; Herrero-Bervera, E.; Lanos, Ph.
2017-11-01
Hawaii is an ideal place for reconstructing the past variations of the Earth's magnetic field in the Pacific Ocean thanks to the almost continuous volcanic activity during the last 10 000 yrs. We present here an updated compilation of palaeomagnetic data from historic and radiocarbon dated Hawaiian lava flows available for the last ten millennia. A total of 278 directional and 66 intensity reference data have been used for the calculation of the first full geomagnetic field reference secular variation (SV) curves for central Pacific covering the last ten millennia. The obtained SV curves are calculated following recent advances on curve building based on the Bayesian statistics and are well constrained for the last five millennia while for older periods their error envelopes are wide due to the scarce number of reference data. The new Bayesian SV curves show three clear intensity maxima during the last 3000 yrs that are accompanied by sharp directional changes. Such short-term variations of the geomagnetic field could be interpreted as archaeomagnetic jerks and could be an interesting feature of the geomagnetic field variation in the Pacific Ocean that should be further explored by new data.
Interpolating Spline Curve-Based Perceptual Encryption for 3D Printing Models
Directory of Open Access Journals (Sweden)
Giao N. Pham
2018-02-01
With the development of 3D printing technology, 3D printing has recently been applied to many areas of life including healthcare and the automotive industry. Due to the value of 3D printing, 3D printing models are often attacked by hackers and distributed without agreement from the original providers. Furthermore, certain special models and anti-weapon models in 3D printing must be protected against unauthorized users. Therefore, in order to prevent attacks and illegal copying and to ensure that all access is authorized, 3D printing models should be encrypted before being transmitted and stored. A novel perceptual encryption algorithm for 3D printing models for secure storage and transmission is presented in this paper. A facet of the 3D printing model is extracted to interpolate a spline curve of degree 2 in three-dimensional space that is determined by three control points, the curvature coefficients of degree 2, and an interpolating vector. The three control points, the curvature coefficients, and the interpolating vector of the spline curve of degree 2 are encrypted by a secret key. The encrypted features of the spline curve are then used to obtain the encrypted 3D printing model by inverse interpolation and geometric distortion. The results of experiments and evaluations prove that the entire 3D triangle model is altered and deformed after the perceptual encryption process. The proposed algorithm is applicable to the various formats of 3D printing models. The results of the perceptual encryption process are superior to those of previous methods. The proposed algorithm also provides a better method and more security than previous methods.
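The paper's scheme encrypts spline control points and curvature coefficients with a secret key; a toy sketch of the underlying idea (key-derived geometric distortion that only the key holder can invert; not the published algorithm, and all names and values are hypothetical) is:

```python
import hashlib

def _offsets(key, i):
    """Key-derived pseudo-random offsets in [0, 1) for vertex i (toy keystream)."""
    digest = hashlib.sha256(key + i.to_bytes(4, "big")).digest()
    return tuple(int.from_bytes(digest[j:j + 4], "big") / 2**32 for j in (0, 4, 8))

def encrypt_points(points, key):
    """Geometrically distort each 3D control point with key-derived offsets."""
    return [tuple(c + o for c, o in zip(p, _offsets(key, i)))
            for i, p in enumerate(points)]

def decrypt_points(points, key):
    """Subtract the same offsets to recover the original control points."""
    return [tuple(c - o for c, o in zip(p, _offsets(key, i)))
            for i, p in enumerate(points)]

control = [(1.0, 2.0, 3.0), (4.0, 5.0, 6.0), (7.0, 8.0, 9.0)]  # hypothetical
cipher = encrypt_points(control, b"secret-key")
plain = decrypt_points(cipher, b"secret-key")  # matches `control` up to rounding
```

Distorting only the compact curve features rather than every vertex is what keeps a perceptual scheme cheap while still deforming the whole printed model.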
An industrial batch dryer simulation tool based on the concept of the characteristic drying curve
DEFF Research Database (Denmark)
Kærn, Martin Ryhl; Elmegaard, Brian; Schneider, P.
2013-01-01
content in the material to be invariant in the airflow direction. In the falling-rate period, the concept of the Characteristic Drying Curve (CDC) is used as proposed by Langrish et al. (1991), but modified to account for a possible end-drying rate. Using the CDC both hygroscopic and non......-hygroscopic materials may be analyzed by the tool and guidelines for the determination of the CDC coefficients are provided. The comparison of the simulation tool with measurements shows that the assumption of invariant material properties along the flow direction is doubtful at least for the actual case of interest...
An Improved Minimum Error Interpolator of CNC for General Curves Based on FPGA
Directory of Open Access Journals (Sweden)
Jiye HUANG
2014-05-01
This paper presents an improved minimum-error interpolation algorithm for general curve generation in computer numerical control (CNC). Compared with conventional interpolation algorithms such as the By-Point Comparison method, the Minimum-Error method and the Digital Differential Analyzer (DDA) method, the proposed improved Minimum-Error interpolation algorithm strikes a balance between accuracy and efficiency. The new algorithm is applicable to linear, circular, elliptical and parabolic curves. The proposed algorithm is realized on a field programmable gate array (FPGA) in the Verilog HDL language, simulated with the ModelSim software, and finally verified on a two-axis CNC lathe. The algorithm has the following advantages: first, the maximum interpolation error is only half of the minimum step-size; second, the computing time is only two clock cycles of the FPGA. Simulations and actual tests have proved the high accuracy and efficiency of the algorithm, which make it highly suited for real-time applications.
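A minimum-error interpolator chooses, at each step, the axial or diagonal move that minimizes the deviation from the curve's implicit equation. A software sketch for a straight line y*xe - x*ye = 0 in the first quadrant (the hardware performs this comparison per interpolation step; the endpoint below is illustrative):

```python
def interpolate_line(xe, ye):
    """Minimum-error point-by-point interpolation of a first-quadrant line
    from the origin to (xe, ye): at every step move one unit in x, in y, or
    diagonally, picking the candidate whose deviation from the implicit
    equation y*xe - x*ye = 0 is smallest."""
    x = y = 0
    path = [(0, 0)]
    while (x, y) != (xe, ye):
        candidates = [(x + 1, y), (x, y + 1), (x + 1, y + 1)]
        candidates = [(cx, cy) for cx, cy in candidates if cx <= xe and cy <= ye]
        x, y = min(candidates, key=lambda c: abs(c[1] * xe - c[0] * ye))
        path.append((x, y))
    return path

pts = interpolate_line(5, 2)
# -> [(0, 0), (1, 0), (2, 1), (3, 1), (4, 2), (5, 2)]
```

The same decision rule carries over to circles, ellipses and parabolas by swapping in the corresponding implicit equation, which is what makes the scheme "general-curve" while keeping each step a small integer comparison.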
DQ thermal buckling analysis of embedded curved carbon nanotubes based on nonlocal elasticity theory
Directory of Open Access Journals (Sweden)
AliReza Setoodeh
To investigate the thermal buckling of curved carbon nanotubes (CCNTs) embedded in an elastic medium, nonlocal elasticity theory is employed in combination with the theory of thin curved beams. The differential quadrature (DQ) method is implemented to discretize the resulting governing equations. Solving these equations enables us to estimate the critical temperature and the critical axial buckling load for CCNTs surrounded by an elastic medium and under the effect of a uniform temperature change. The elastic interaction between the nanotube and its surrounding medium is modeled as a Winkler-Pasternak elastic foundation. The fast convergence of the DQ method is demonstrated and its accuracy is verified by comparing the results with available solutions in the literature. The effects of various parameters such as different boundary conditions, the nonlocal parameter, the Winkler and Pasternak elastic moduli, temperature and nanotube curvature on the critical buckling temperature and load are successfully studied. The results reveal that the critical buckling load depends significantly on the curvature of the CCNT.
He, G.; Zhu, H.; Xu, J.; Gao, K.; Zhu, D.
2017-09-01
The bionic research of shape is an important aspect of research on bionic robots, and its implementation cannot be separated from the shape modeling and numerical simulation of the bionic object, which are tedious and time-consuming. In order to improve the efficiency of shape bionic design, the feet of animals living in soft soil and swamp environments are taken as bionic objects, and characteristic skeleton curves, section curves, joint rotation variables, position and other parameters are used to describe the shape and position information of the bionic object's sole, toes and flippers. The geometric model of the bionic object is established by using the parameterization of characteristic curves and variables. Based on this, an integration framework of parametric modeling, finite element modeling, dynamic analysis and post-processing of the sinking process in soil is proposed in this paper. Examples of a bionic ostrich foot and a bionic duck foot are also given. The parametric modeling and integration technique enables rapid improved design based on the bionic object, can greatly improve the efficiency and quality of robot foot bionic design, and has important practical significance for improving the level of bionic design of robot foot shape and structure.
Prettenthaler, F.; Amrusch, P.; Habsburg-Lothringen, C.
2010-04-01
To date, in Austria no empirical assessment of absolute damage curves has been realized on the basis of detailed information on flooded buildings due to a dam breach, presumably because of the lack of data. This paper tries to fill this gap by estimating an absolute flood-damage curve, based on data of a recent flood event in Austria in 2006. First, a concise analysis of the case study area is conducted, i.e., the maximum damage potential is identified by using raster-based GIS. Thereafter, previous literature findings on existing flood-damage functions are considered in order to determine a volume-water damage function that can be used for further flood damage assessment. Finally, the flood damage function is cross validated and applied in prediction of damage potential in the study area. For future development of the estimated flood damage curve, and to aid more general use, we propose verification against field data on damage caused by natural waves in rivers.
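A fitted absolute damage curve is typically applied by piecewise-linear interpolation over sampled depth-damage points; the curve values below are invented for illustration and are not the Austrian estimates:

```python
def damage_from_depth(depth_m, curve):
    """Piecewise-linear absolute depth-damage curve: `curve` maps flood depth
    (m) to absolute damage at sampled points; intermediate depths are linearly
    interpolated and depths beyond the last sample are clamped."""
    pts = sorted(curve.items())
    if depth_m <= pts[0][0]:
        return pts[0][1]
    for (d0, c0), (d1, c1) in zip(pts, pts[1:]):
        if depth_m <= d1:
            return c0 + (c1 - c0) * (depth_m - d0) / (d1 - d0)
    return pts[-1][1]  # clamp above the deepest sampled point

# Hypothetical per-building curve: depth (m) -> absolute damage (EUR).
sample_curve = {0.0: 0.0, 0.5: 10000.0, 1.0: 25000.0, 2.0: 40000.0}
loss = damage_from_depth(0.75, sample_curve)  # -> 17500.0
```

Summing such per-building losses over all buildings in the GIS raster is what converts the fitted curve into a damage-potential map for the study area.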
A Provably Secure RFID Authentication Protocol Based on Elliptic Curve for Healthcare Environments.
Farash, Mohammad Sabzinejad; Nawaz, Omer; Mahmood, Khalid; Chaudhry, Shehzad Ashraf; Khan, Muhammad Khurram
2016-07-01
To enhance the quality of healthcare in the management of chronic disease, telecare medical information systems have increasingly been used. Very recently, Zhang and Qi (J. Med. Syst. 38(5):47, 32), and Zhao (J. Med. Syst. 38(5):46, 33) separately proposed two authentication schemes for telecare medical information systems using radio frequency identification (RFID) technology. They claimed that their protocols achieve all security requirements including forward secrecy. However, this paper demonstrates that both Zhang and Qi's scheme, and Zhao's scheme could not provide forward secrecy. To augment the security, we propose an efficient RFID authentication scheme using elliptic curves for healthcare environments. The proposed RFID scheme is secure under common random oracle model.
Two-Factor User Authentication with Key Agreement Scheme Based on Elliptic Curve Cryptosystem
Directory of Open Access Journals (Sweden)
Juan Qu
2014-01-01
A password authentication scheme using a smart card is called a two-factor authentication scheme. Two-factor authentication is the most accepted and commonly used mechanism that provides authorized users a secure and efficient method for accessing resources over an insecure communication channel. Up to now, various two-factor user authentication schemes have been proposed. However, most of them are vulnerable to smart card loss attacks, offline password guessing attacks, impersonation attacks, and so on. In this paper, we design a remote user password authentication with key agreement scheme using an elliptic curve cryptosystem. Security analysis shows that the proposed scheme has a high level of security. Moreover, the proposed scheme is more practical and secure in contrast to some related schemes.
Komatsu, Shohei; Scatton, Olivier; Goumard, Claire; Sepulveda, Ailton; Brustia, Raffaele; Perdigao, Fabiano; Soubrane, Olivier
2017-05-01
Laparoscopic hepatectomy continues to be a challenging operation associated with a steep learning curve. This study aimed to evaluate the learning process during 15 years of experience with laparoscopic hepatectomy and to identify approaches to standardization of this procedure. Prospectively collected data of 317 consecutive laparoscopic hepatectomies performed from January 2000 to December 2014 were reviewed retrospectively. The operative procedures were classified into 4 categories (minor hepatectomy, left lateral sectionectomy [LLS], left hepatectomy, and right hepatectomy), and indications were classified into 5 categories (benign-borderline tumor, living donor, metastatic liver tumor, biliary malignancy, and hepatocellular carcinoma). During the first 10 years, the procedures were limited mainly to minor hepatectomy and LLS, and the indications were limited to benign-borderline tumor and living donor. Implementation of major hepatectomy rapidly increased the proportion of malignant tumors, especially hepatocellular carcinoma, starting from 2011. Conversion rates decreased with experience for LLS (13.3% vs 3.4%; p = 0.054) and left hepatectomy (50.0% vs 15.0%; p = 0.012), but not for right hepatectomy (41.4% vs 35.7%; p = 0.661). Our 15-year experience clearly demonstrates the stepwise procedural evolution from LLS through left hepatectomy to right hepatectomy, as well as the trend in indications from benign-borderline tumor/living donor to malignant tumors. In contrast to LLS and left hepatectomy, a learning curve was not observed for right hepatectomy. The ongoing development process can contribute to faster standardization necessary for future advances in laparoscopic hepatectomy. Copyright © 2017 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
A new post-frac evaluation method for shale gas wells based on fracturing curves
Directory of Open Access Journals (Sweden)
Xiaobing Bian
2016-03-01
Full Text Available Post-fracturing evaluation by using limited data is of great significance to continuous improvement of the fracturing programs. In this paper, a fracturing curve was divided into two stages (i.e., prepad fluid injection and main fracturing so as to further understand the parameters of reservoirs and artificial fractures. The brittleness and plasticity of formations were qualitatively identified by use of the statistics of formation fracture frequency, and average pressure dropping range and rate during the prepad fluid injection. The composite brittleness index was quantitatively calculated by using the energy zones in the process of fracturing. It is shown from the large-scale true triaxial physical simulation results that the complexity of fractures is reflected by the pressure fluctuation frequency and amplitude in the main fracturing curve, and combined with the brittleness and plasticity of formations, the fracture morphology far away from the well can be diagnosed. Well P, a shale gas well in SE Chongqing, was taken as an example for post-fracturing evaluation. It is shown that the shale beds are of stronger heterogeneity along the extension directions of horizontal wells, and with GR 260 API as the dividing line between brittleness and plasticity in this area, complex fracture systems tend to form in brittleness-prone formations. In Well P, half of the fractures are single fractures, so it is necessary to carry out fine subsection and turnaround fracturing so as to improve development effects. This paper provides a theoretical basis for improving the fracturing well design and increasing the effective stimulated volume in this area.
Properties of three-body decay functions derived with time-like jet calculus beyond leading order
International Nuclear Information System (INIS)
Sugiura, Tetsuya
2002-01-01
Three-body decay functions in time-like parton branching are calculated using the jet calculus to the next-to-leading logarithmic (NLL) order in perturbative quantum chromodynamics (QCD). The phase space contributions from each of the ladder diagrams and interference diagrams are presented. We correct part of the results for the three-body decay functions calculated previously by two groups. Employing our new results, the properties of the three-body decay functions in the regions of soft partons are examined numerically. Furthermore, we examine the contribution of the three-body decay functions modified by the restriction resulting from the kinematical boundary of the phase space for two-body decay in the parton shower model. This restriction leads to some problems for the parton shower model. For this reason, we propose a new restriction introduced by the kinematical boundary of the phase space for two-body decay. (author)
Clarke, F H; Cahoon, N M
1987-08-01
A convenient procedure has been developed for the determination of partition and distribution coefficients. The method involves the potentiometric titration of the compound, first in water and then in a rapidly stirred mixture of water and octanol. An automatic titrator is used, and the data is collected and analyzed by curve fitting on a microcomputer with 64 K of memory. The method is rapid and accurate for compounds with pKa values between 4 and 10. Partition coefficients can be measured for monoprotic and diprotic acids and bases. The partition coefficients of the neutral compound and its ion(s) can be determined by varying the ratio of octanol to water. Distribution coefficients calculated over a wide range of pH values are presented graphically as "distribution profiles". It is shown that subtraction of the titration curve of solvent alone from that of the compound in the solvent offers advantages for pKa determination by curve fitting for compounds of low aqueous solubility.
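The "distribution profile" idea can be sketched for the simplest case, a monoprotic acid where only the neutral species partitions into octanol, so that log D = log P − log10(1 + 10^(pH − pKa)). The log P and pKa values below are illustrative, not measurements from the paper.

```python
import math

# Distribution coefficient of a monoprotic acid as a function of pH, assuming
# only the neutral form partitions into octanol. Illustrative parameters.

def log_d_acid(log_p, pka, ph):
    """log of the octanol/water distribution coefficient at a given pH."""
    return log_p - math.log10(1.0 + 10.0 ** (ph - pka))

# a "distribution profile" over pH 1..10 for log P = 3.5, pKa = 4.2
profile = {ph: round(log_d_acid(3.5, 4.2, ph), 2) for ph in range(1, 11)}
```

Below the pKa the compound is mostly neutral and log D plateaus at log P; above it, log D falls by roughly one unit per pH unit.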
Energy Technology Data Exchange (ETDEWEB)
Soriguera Marti, F.; Martinez-Diaz, M.; Perez Perez, I.
2016-07-01
Travel time is probably the most important indicator of the level of service of a highway, and it is also the information most appreciated by its users. Administrations and private companies make increasing efforts to improve its real-time estimation. The appearance of new technologies makes the precise measurement of travel times easier than ever before. However, direct measurements of travel time are, by nature, outdated in real time and lack the desired forecasting capabilities. This paper introduces a new methodology to improve the real-time estimation of travel times by using the equipment usually present on most highways, i.e., loop detectors, in combination with automatic vehicle identification or tracking technologies. One of the most important features of the method is the use of cumulative counts at detectors as an input, avoiding the drawbacks of common spot-speed methodologies. Cumulative count curves have great potential for freeway travel time information systems, as they provide spatial measurements and thus allow the calculation of instantaneous travel times. In addition, they exhibit predictive capabilities. Nevertheless, they have not been used extensively, mainly because of the error introduced by the accumulation of detector drift. The proposed methodology solves this problem by correcting the deviations using direct travel time measurements. The method is highly beneficial for its accuracy as well as for its low implementation cost. (Author)
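The cumulative count (N-curve) principle behind the method can be sketched as follows: the travel time of the n-th vehicle is the horizontal distance between the upstream and downstream cumulative curves at count n. This sketch assumes FIFO traffic and drift-free counts, which is precisely the idealization the paper's correction step addresses; the detector data below are synthetic.

```python
import bisect

# Instantaneous travel time from cumulative count (N-) curves: horizontal
# distance between upstream and downstream curves at a given count.

def crossing_time(times, counts, n):
    """Linearly interpolate the time at which the cumulative count reaches n."""
    i = bisect.bisect_left(counts, n)
    if i == 0:
        return times[0]
    t0, t1 = times[i - 1], times[i]
    c0, c1 = counts[i - 1], counts[i]
    return t0 + (n - c0) * (t1 - t0) / (c1 - c0)

# synthetic 5-minute detector data (seconds, cumulative vehicles)
t_up, n_up = [0, 300, 600, 900], [0, 120, 260, 400]
t_dn, n_dn = [0, 300, 600, 900], [0, 80, 210, 360]

def travel_time(n):
    """Travel time of the n-th vehicle between the two detectors."""
    return crossing_time(t_dn, n_dn, n) - crossing_time(t_up, n_up, n)
```

In practice the downstream curve would be re-anchored with direct travel-time measurements to cancel the accumulated detector drift the abstract mentions.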
A novel three-dimensional smile analysis based on dynamic evaluation of facial curve contour
Lin, Yi; Lin, Han; Lin, Qiuping; Zhang, Jinxin; Zhu, Ping; Lu, Yao; Zhao, Zhi; Lv, Jiahong; Lee, Min Kyeong; Xu, Yue
2016-02-01
The influence of the three-dimensional facial contour and of dynamic evaluation on the factors of smile esthetics is essential for facial beauty improvement. However, the kinematic features of the facial smile contour and the contributions from the soft tissue and the underlying skeleton are uncharted. Here, the cheekbone-maxilla contour and the nasolabial fold were combined into a “smile contour” delineating the overall facial topography that emerges prominently in smiling. We screened out the stable and unstable points on the smile contour using facial motion capture and curve fitting, before analyzing the correlation between the soft tissue coordinates and the hard tissue counterparts of the screened points. Our findings suggest that the mouth corner region was the most mobile area characterizing the smile expression, while the other areas remained relatively stable. Therefore, the perioral area should be evaluated dynamically, while the static assessment of the other parts of the smile contour contributes only partially to their dynamic esthetics. Moreover, unlike the end piece, the morphologies of the zygomatic area and the superior part of the nasolabial crease were determined largely by the skeleton at rest, implying that the latter can be altered by orthopedic or orthodontic correction and the former better improved by cosmetic procedures to enhance the beauty of the smile.
Guimarães, L B de M; Anzanello, M J; Renner, J S
2012-05-01
This paper presents a method for implementing multifunctional work teams in a footwear company that had followed the Taylor/Ford system for decades. The suggested framework first applies Learning Curve (LC) modeling to assess whether rotation between tasks of different complexities affects workers' learning rate and performance. Next, the Macroergonomic Work Analysis (MA) method (Guimarães, 1999, 2009) introduces multifunctional principles in work teams towards workers' training and resources improvement. When applied to a pilot line of 100 workers, the intervention reduced work-related accidents by 80% and absenteeism by 45.65%, and eliminated work-related musculoskeletal disorders (WMSD), medical consultations, and turnover. Further, the output rate of the multifunctional team increased by an average of 3% compared to the production rate of the regular lines following the Taylor/Ford system (with the same shoe model being manufactured), while the rework and spoilage rates were reduced by 85% and 69%, respectively. Copyright © 2011 Elsevier Ltd and The Ergonomics Society. All rights reserved.
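The Learning Curve modeling step can be illustrated with Wright's classic power law, T_n = T1 · n^b (time for the n-th repetition), fitted by least squares in log-log space. This is a generic LC form, not necessarily the exact model the authors used, and the task times below are synthetic.

```python
import math

# Fit Wright's learning curve T_n = T1 * n**b by log-log linear regression.
# The learning rate per doubling of output is 2**b (e.g. 0.8 = an 80% curve).

def fit_learning_curve(times):
    """Return (T1, b) for task times of repetitions 1..len(times)."""
    xs = [math.log(n) for n in range(1, len(times) + 1)]
    ys = [math.log(t) for t in times]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return math.exp(my - b * mx), b

# synthetic task times drawn from an exact 80% curve with T1 = 10
times = [10 * n ** math.log2(0.8) for n in range(1, 9)]
t1, b = fit_learning_curve(times)
```

Comparing fitted learning rates before and after introducing task rotation is one way to quantify whether rotation slows or speeds individual learning.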
A Study on the Surface and Subsurface Water Interaction Based on the Groundwater Recession Curve
Wang, S. T.; Chen, Y. W.; Chang, L. C.; Chiang, C. J.; Wang, Y. S.
2017-12-01
The interaction of surface and subsurface water is an important issue for groundwater resources assessment and management. Surface water influences groundwater mainly through rainfall recharge, river recharge and discharge, and other boundary sources. During a drought period, the interaction of river and groundwater may be one of the main sources of groundwater level recession. Therefore, this study explores the interaction of surface water and groundwater via the groundwater recession. During drought periods, pumping and river interaction together are the main mechanisms causing the recession of the groundwater level. In principle, a larger gradient of the recession curve indicates more groundwater discharge, and it is an important characteristic of the groundwater system. In this study, to avoid time-consuming manual analysis, the Python programming language was used to develop a statistical analysis model for extracting the groundwater recession information. First, the slopes of the groundwater level hydrograph at every time step were computed for each well. Then, for each well, the representative slope for each groundwater level was defined as the slope with 90% exceedance probability. The relationship between the recession slope and the groundwater level can then be obtained. The developed model was applied to the Choushui River Alluvial Fan. In most wells, the results show strong positive correlations between the groundwater levels and the absolute values of the recession slopes.
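One plausible reading of the 90%-exceedance statistic can be sketched in a few lines: compute the slope of the hydrograph at each time step, keep the recession (falling) steps, and take the slope magnitude exceeded by 90% of the observed recession slopes. The hydrograph below is synthetic, and the study additionally bins this statistic by groundwater level, which is omitted here.

```python
# Recession-slope statistic with 90% exceedance probability, i.e. the slope
# magnitude that 90% of the observed recession slopes exceed.

def exceedance_slope(levels, dt=1.0, prob=0.90):
    slopes = [(b - a) / dt for a, b in zip(levels, levels[1:])]
    rec = sorted(-s for s in slopes if s < 0)       # recession magnitudes, ascending
    if not rec:
        return 0.0
    idx = int(round((1 - prob) * (len(rec) - 1)))   # exceeded by `prob` of values
    return rec[idx]

levels = [10, 9.5, 9.1, 8.8, 8.6, 8.5, 8.45, 9.0, 8.9]   # daily levels, m
slope_90 = exceedance_slope(levels)
```

Using a high-exceedance quantile rather than the mean makes the statistic robust to occasional steep drops caused by short pumping events.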
Search procedure for models based on the evolution of experimental curves
International Nuclear Information System (INIS)
Delforge, J.
1975-01-01
The possibilities offered by numerical analysis for the identification of model parameters are outlined. The flexibility of the proposed method makes it possible to use a large number of experimental measurements. It is shown that the numerical identification errors over all parameters are proportional to the experimental errors, with a proportionality factor, called the conditioning of the identification problem, which is easily computed. Moreover, it is possible to define and calculate, for each parameter, a factor of sensitivity to experimental errors. The numerical values of the conditioning and sensitivity factors depend on all experimental conditions, that is, on the one hand, the specific definition of the experiments and, on the other hand, the number and quality of the measurements undertaken. The proposed identification procedure includes several phases. The preliminary phase consists in a first definition of the experimental conditions, in agreement with the experimenter. From the data thus obtained, it is generally possible to evaluate the minimum number of equivalence classes required for an interpretation compatible with the morphology of the experimental curves. At this point, some additional measurements may prove useful or necessary. The numerical phase then determines a first approximate model by means of the methods previously described. The next phases again require close collaboration between experimenters and theoreticians; they consist mainly in refining the first model.
Chang, Yeun-Chung; Huang, Yan-Hao; Huang, Chiun-Sheng; Chang, Pei-Kang; Chen, Jeon-Hor; Chang, Ruey-Feng
2012-04-01
The purpose of this study is to evaluate the diagnostic efficacy of the representative characteristic kinetic curve of dynamic contrast-enhanced (DCE) magnetic resonance imaging (MRI) extracted by fuzzy c-means (FCM) clustering for the discrimination of benign and malignant breast tumors using a novel computer-aided diagnosis (CAD) system. As the research data set, DCE-MRIs of 132 solid breast masses with definite histopathologic diagnosis (63 benign and 69 malignant) were used. First, the tumor region was automatically segmented using the region growing method based on the integrated color map formed by the combination of the kinetic and area-under-curve color maps. Then, FCM clustering was used to identify the time-signal curve with the larger initial enhancement inside the segmented region as the representative kinetic curve, and the parameters of the Tofts pharmacokinetic model for the representative kinetic curve were compared with conventional curve analysis (maximal enhancement, time to peak, uptake rate and washout rate) for each mass. The results were analyzed with a receiver operating characteristic curve and Student's t test to evaluate the classification performance. Accuracy, sensitivity, specificity, positive predictive value and negative predictive value of the combined model-based parameters of the kinetic curve extracted by FCM clustering were 86.36% (114/132), 85.51% (59/69), 87.30% (55/63), 88.06% (59/67) and 84.62% (55/65), better than those from conventional curve analysis. The A(z) value was 0.9154 for the Tofts model-based parametric features, better than that for conventional curve analysis (0.8673), for discriminating malignant and benign lesions. In conclusion, model-based analysis of the characteristic kinetic curve of a breast mass derived from FCM clustering provides effective lesion classification. This approach has potential in the development of a CAD system for DCE breast MRI. Copyright © 2012 Elsevier Inc.
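The "conventional curve analysis" descriptors named above can be sketched for a synthetic time-signal curve. Exact definitions vary between CAD systems; these are common textbook forms with illustrative units, not the paper's implementation.

```python
# Conventional DCE-MRI kinetic-curve descriptors: maximal enhancement,
# time to peak, uptake rate, and washout rate. Synthetic data, textbook forms.

def kinetic_features(t, s):
    s0 = s[0]
    peak = max(range(len(s)), key=lambda i: s[i])
    max_enh = (s[peak] - s0) / s0 * 100.0              # percent enhancement
    ttp = t[peak]                                      # time to peak
    uptake = (s[peak] - s0) / ttp if ttp else 0.0
    washout = (s[peak] - s[-1]) / (t[-1] - ttp) if t[-1] > ttp else 0.0
    return {"max_enhancement": max_enh, "time_to_peak": ttp,
            "uptake_rate": uptake, "washout_rate": washout}

t = [0, 60, 120, 180, 240, 300]        # seconds after contrast injection
s = [100, 180, 240, 230, 215, 200]     # signal intensity
features = kinetic_features(t, s)
```

A rapid uptake followed by a positive washout rate is the classic "washout" pattern that raises suspicion of malignancy, which is what the model-based Tofts parameters refine.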
Analysis of diffusion paths for photovoltaic technology based on experience curves
International Nuclear Information System (INIS)
Poponi, Daniele
2003-04-01
This paper assesses the prospects for diffusion of photovoltaic (PV) technology for electricity generation in grid-connected systems. The analysis begins with the calculation of the break-even price of PV systems and modules, which is the price that can assure commercial viability without incentives or subsidies. The calculated average break-even price of PV systems for building-integrated applications is about US$3.2/Wp, but can go up to about US$4.5/Wp in areas with very good solar irradiation and if a low real discount rate is applied. These values are higher than the break-even prices estimated in the literature to date. PV system break-even prices for intermediate load generation in utility-owned systems are also calculated, their average being about US$1/Wp. The methodology of experience curves is used to predict the different levels of cumulative world PV shipments required to reach the calculated break-even prices of PV systems, assuming different trends in the relationship between price and the increase in cumulative shipments. The years in which the break-even levels of cumulative shipments could theoretically be reached are then calculated by considering different market growth rates. Photovoltaics could enter the niche of building-integrated applications without incentives in the first years of the next decade, provided that the progress ratio (PR) is 80% and the average annual world market growth rate is at least 15%. The final part of the paper analyzes the niche markets and applications that seem promising for the diffusion of photovoltaics in the next few years. (Author)
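The experience-curve arithmetic behind this kind of projection is compact: with a progress ratio PR, price falls to PR times its value for each doubling of cumulative shipments, p(C) = p0 · (C/C0)^log2(PR), which can be inverted to find the cumulative shipments at which a break-even price is reached. The numeric inputs below are illustrative, not the paper's data.

```python
import math

# Experience-curve projection: shipments needed to reach a break-even price,
# and years to get there under constant market growth (a simplification).

def cumulative_at_price(p0, c0, pr, target_price):
    """Cumulative shipments at which the price falls to target_price."""
    b = math.log2(pr)                        # experience exponent (negative)
    return c0 * (target_price / p0) ** (1 / b)

def years_to_reach(c0, c_target, growth):
    """Years until cumulative shipments reach c_target at a constant
    annual growth rate."""
    return math.log(c_target / c0) / math.log(1 + growth)
```

For example, with PR = 0.5 a price halving takes exactly one doubling of shipments; with the paper's PR = 0.8 the same halving takes about 3.1 doublings, which is why the progress ratio dominates the timing of break-even.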
Directory of Open Access Journals (Sweden)
Ugur Ozturk
2016-07-01
Early-warning systems (EWSs) are crucial to reduce landslide risk, especially where structural measures are not fully capable of preventing the devastating impact of such an event. Furthermore, designing and successfully implementing a complete landslide EWS is a highly complex task. The main technical challenges are linked to the definition of heterogeneous material properties (geotechnical and geomechanical parameters) as well as the variety of triggering factors. In addition, real-time data processing creates significant complexity, since data collection and numerical models for risk assessment are time-consuming tasks. Therefore, uncertainties in the physical properties of a landslide, together with data management, represent the two crucial deficiencies in an efficient landslide EWS. This study explores the application of the concept of fragility curves to landslides; fragility curves are widely used to simulate system responses to natural hazards, i.e. floods or earthquakes. The application of fragility curves to landslide risk assessment is believed to simplify emergency risk assessment, even though it cannot substitute detailed analysis during peace-time. A simplified risk assessment technique can remove some of the unclear features and decrease data processing time. The method is based on synthetic samples which are used to define the approximate failure thresholds for landslides, taking into account the materials and the piezometric levels. The results are presented in charts. The method presented in this paper, called the failure index fragility curve (FIFC), allows assessment of the actual real-time risk in a case study based on the most appropriate FIFC. The application of an FIFC to a real case is presented as an example. This method of assessing landslide risk is another step towards a more integrated dynamic approach to a potential landslide prevention system.
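The standard fragility-curve form that such methods build on expresses the probability of failure given an intensity measure as a lognormal CDF, P(failure | im) = Φ(ln(im/θ)/β), with median capacity θ and dispersion β. The paper's FIFC is constructed differently; this only illustrates the generic shape, with made-up parameters.

```python
import math

# Generic lognormal fragility curve: probability of failure as a function of
# an intensity measure (e.g. a normalized piezometric level). Parameters are
# illustrative, not values from the study.

def fragility(im, theta, beta):
    """P(failure | intensity im) = Phi(ln(im / theta) / beta)."""
    z = math.log(im / theta) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
```

At the median capacity the failure probability is exactly 0.5, and a smaller β produces a steeper, more deterministic threshold, which is the knob synthetic samples are used to calibrate.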
International Nuclear Information System (INIS)
Hong, Sungjun; Chung, Yanghon; Woo, Chungwon
2015-01-01
South Korea, the 9th-largest energy-consuming country in 2013 and the 7th-largest greenhouse gas emitting country in 2011, established 'Low Carbon Green Growth' as its national vision in 2008 and has been announcing various active energy policies that are gaining the attention of the world. In this paper, we estimate the decrease of photovoltaic power generation cost in Korea based on learning curve theory. Photovoltaic energy is one of the leading renewable energy sources, and countries all over the world are currently expanding R&D, demonstration and deployment of photovoltaic technology. In order to estimate the learning rate of photovoltaic energy in Korea, both the conventional 1FLC (one-factor learning curve), which considers only cumulative power generation, and the 2FLC, which also considers R&D investment, were applied. The 1FLC analysis showed that the cost of power generation decreased by 3.1% as cumulative power generation doubled. The 2FLC analysis showed that the cost decreases by 2.33% every time cumulative photovoltaic power generation doubles and by 5.13% every time R&D investment doubles. Moreover, the effect of R&D investment on photovoltaic technology took effect after around 3 years, and the depreciation rate of R&D investment was around 20%. - Highlights: • We analyze the learning effects of photovoltaic energy technology in Korea. • To calculate the learning rate, we use the 1FLC (one-factor learning curve) and 2FLC methods, respectively. • The 1FLC method considers only cumulative power generation. • The 2FLC method considers both cumulative power generation and the knowledge stock. • We analyze a variety of scenarios by time lag and depreciation rate of R&D investment.
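The two-factor learning curve arithmetic behind those figures can be sketched directly: unit cost C = a · Q^(−α) · K^(−β), with Q the cumulative generation and K the R&D knowledge stock, and a learning rate LR per doubling corresponding to an elasticity of −log2(1 − LR). The 2.33% and 5.13% rates are taken from the abstract; everything else is illustrative, and the knowledge-stock lag and depreciation are omitted.

```python
import math

# Two-factor learning curve (2FLC): C = a * Q**(-alpha) * K**(-beta).
# Elasticities recovered from per-doubling learning rates.

def elasticity(learning_rate):
    """Elasticity implied by a per-doubling learning rate: LR = 1 - 2**-e."""
    return -math.log2(1 - learning_rate)

ALPHA = elasticity(0.0233)    # learning-by-doing (cumulative generation)
BETA = elasticity(0.0513)     # learning-by-searching (R&D investment)

def cost(c0, q_ratio, k_ratio):
    """Unit cost after Q and K grow by the given ratios from the baseline."""
    return c0 * q_ratio ** (-ALPHA) * k_ratio ** (-BETA)
```

Doubling only cumulative generation cuts cost by 2.33%, doubling only the knowledge stock by 5.13%, and the two effects compound multiplicatively.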
Acid-base titration curves in an integrated computer learning environment
Heck, A.; Kędzierska, E.; Rodgers, L.; Chmurska, M.
2008-01-01
The topic of acid-base reactions is a regular component of many chemistry curricula that requires integrated understanding of various areas of introductory chemistry. Many students have considerable difficulties understanding the concepts and processes involved. It has been suggested and confirmed
Directory of Open Access Journals (Sweden)
Alaauldin Ibrahim
2017-01-01
Information in patients' medical histories is subject to various security and privacy concerns. Meanwhile, any modification or error in a patient's medical data may cause serious or even fatal harm. To protect and transfer this valuable and sensitive information in a secure manner, radio-frequency identification (RFID) technology has been widely adopted in healthcare systems and is being deployed in many hospitals. In this paper, we propose a mutual authentication protocol for RFID tags based on elliptic curve cryptography and the Advanced Encryption Standard. Unlike existing authentication protocols, which only send the tag ID securely, the proposed protocol can also send the valuable data stored in the tag in an encrypted pattern. The proposed protocol is not simply a theoretical construct; it has been coded and tested on an experimental RFID tag. The proposed scheme achieves mutual authentication in just two steps and satisfies all the essential security requirements of RFID-based healthcare systems.
Directory of Open Access Journals (Sweden)
Vebil Yildirim
This work presents an accurate and detailed linearly elastic free vibration analysis of cylindrical helical springs under axial static load, based on the theory of spatially curved bars and the transfer matrix method. For a continuous system, the governing equations comprise coupled vibration modes, namely transverse vibrations in two orthogonal planes, torsional vibrations and axial vibrations. The axial and shear deformation effects, together with the rotatory inertia effects, are all considered based on the first-order shear deformation theory, and their effects on the frequencies are investigated. The effects of the initial stress resultants on the frequencies are also studied. After buckling, a forward-shifting phenomenon of the higher frequencies is clearly demonstrated. It is also revealed that a free/forced vibration analysis with an axial static load should not be performed without checking the buckling loads.
Liu, Si-jun; Huang, Zhao-sheng; Wu, Qing-guang; Huang, Zhang-jie; Wu, Li-rong; Yan, Wen-li; Wang, Qi; Wang, Zong-wei; Chang, David Lungpao; Yang, Zheng
2016-04-01
To establish diagnostic quantitative criteria for the fire-heat syndrome (FHS) of Chinese medicine (CM) based on the receiver operating characteristic (ROC) curve and principal component analysis (PCA), the symptoms and signs of FHS cases and healthy subjects from Guangzhou, Henan and Hunan in China were collected through a questionnaire, and diagnostic quantitative score tables were established for the three regions, respectively, with the method of maximum likelihood analysis. A homogeneity test was then performed on the diagnostic score tables for the three regions with the ROC curve, and the diagnostic efficiency of the three tables was compared with a prospective test and a retrospective test. PCA was adopted to obtain the analysis matrix for classifying the types of FHS. Twenty-seven elements of FHS were confirmed through a Chi-square test, and the diagnostic score tables for the three regions were established with the method of maximum likelihood analysis on the basis of the collected case data. According to the ROC curve test, the areas under the ROC curve of the Guangzhou diagnostic score table assessed with candidates in Guangzhou, Henan and Hunan were 0.998, 0.961 and 0.956, respectively, showing that the diagnostic efficiency of the Guangzhou table was the highest. In the prospective test, the area under the ROC curve of the Guangzhou diagnostic score table was 0.949, higher than that of any other table. By PCA, FHS was classified into excess fire and deficiency fire, and further, from the view of the viscera, into the syndrome of flaring up of Heart (Xin) fire, the syndrome of Lung (Fei)-Stomach (Wei) excess fire, the syndrome of deficiency of Liver (Gan)-yin and Kidney (Shen)-yin, and the syndrome of deficiency of Lung-yin. In the retrospective test, the consistency with clinicians' diagnosis was 69.4%, and in the prospective test it was 70.1%. The Guangzhou diagnostic score table could therefore be used as the quantitative diagnostic criterion for FHS.
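The area under the ROC curve used to compare the score tables can be computed directly via the rank (Mann-Whitney) formulation: AUC = P(score of a random case > score of a random control), with ties counted as 1/2. The scores below are synthetic, not the study's data.

```python
# ROC AUC via the Mann-Whitney (rank) formulation; ties count as 1/2.

def auc(cases, controls):
    wins = sum((c > d) + 0.5 * (c == d) for c in cases for d in controls)
    return wins / (len(cases) * len(controls))

cases = [9, 8, 7, 7, 5]        # e.g. score-table totals of FHS patients
controls = [6, 5, 4, 3, 2]     # healthy subjects
auc_value = auc(cases, controls)
```

An AUC of 0.5 corresponds to a non-discriminating table and 1.0 to perfect separation, which is why values like 0.998 versus 0.949 let the regional tables be ranked.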
Li, Kenli; Zou, Shuting; Xv, Jin
2008-01-01
Elliptic curve cryptographic algorithms convert input data to unrecognizable encryption and the unrecognizable data back again into its original decrypted form. The security of this form of encryption hinges on the enormous difficulty that is required to solve the elliptic curve discrete logarithm problem (ECDLP), especially over GF(2n), n ∈ Z+. This paper describes an effective method to find solutions to the ECDLP by means of a molecular computer. We propose that this research accomplishment would represent a breakthrough for applied biological computation and this paper demonstrates that in principle this is possible. Three DNA-based algorithms: a parallel adder, a parallel multiplier, and a parallel inverse over GF(2n) are described. The biological operation time of all of these algorithms is polynomial with respect to n. Considering this analysis, cryptography using a public key might be less secure. In this respect, a principal contribution of this paper is to provide enhanced evidence of the potential of molecular computing to tackle such ambitious computations. PMID:18431451
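The field arithmetic underlying the GF(2^n) algorithms above is easy to sketch: addition is bitwise XOR (which is why a "parallel adder" is trivial), and multiplication is carry-less polynomial multiplication reduced modulo an irreducible polynomial. GF(2^8) with the AES polynomial x^8 + x^4 + x^3 + x + 1 serves here as a convenient small instance; the paper's DNA-based parallel operations are not modeled.

```python
# Multiplication in GF(2^n) by shift-and-XOR with modular reduction.
# Shown for GF(2^8) with the AES polynomial 0x11B.

def gf_mul(a, b, mod=0x11B, n=8):
    """Multiply a and b in GF(2^n) modulo the given irreducible polynomial."""
    r = 0
    while b:
        if b & 1:
            r ^= a            # conditionally add (XOR) the shifted operand
        b >>= 1
        a <<= 1
        if a >> n:            # degree reached n: reduce by the modulus
            a ^= mod
    return r
```

The hardness of the ECDLP over GF(2^n) rests on this arithmetic: the group operations are cheap in either direction, but inverting repeated point multiplication is not.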
Liu, Boshi; Huang, Renliang; Yu, Yanjun; Su, Rongxin; Qi, Wei; He, Zhimin
2018-01-01
Ochratoxin A (OTA) is a type of mycotoxin generated from the metabolism of Aspergillus and Penicillium, and is extremely toxic to humans, livestock, and poultry. However, traditional assays for the detection of OTA are expensive and complicated. Besides the OTA aptamer, OTA itself at high concentrations can also adsorb onto the surface of gold nanoparticles (AuNPs) and further inhibit AuNP salt aggregation. We herein report a new OTA assay applying the localized surface plasmon resonance effect of AuNPs and their aggregates. The result obtained from a single linear calibration curve is not reliable, so we developed a "double calibration curve" method to address this issue and widen the OTA detection range. A number of other analytes were also examined, and the structural properties of analytes that bind to the AuNPs were further discussed. We found that various considerations must be taken into account in the detection of these analytes when applying AuNP aggregation-based methods, owing to their different binding strengths.
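The "double calibration curve" idea can be sketched generically: fit separate linear calibrations on the low- and high-concentration ranges and select the branch by a response threshold, widening the usable range. The calibration points and split threshold below are synthetic, not the paper's data.

```python
# Double calibration curve: two linear fits selected by a response threshold.

def linfit(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    m = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return m, my - m * mx                     # slope, intercept

def make_double_calibration(low_pts, high_pts, split_response):
    lo = linfit(*zip(*low_pts))
    hi = linfit(*zip(*high_pts))
    def concentration(response):
        m, b = lo if response < split_response else hi
        return (response - b) / m             # invert the chosen calibration
    return concentration

# (concentration, response) pairs for the low and high ranges
conc = make_double_calibration([(1, 2), (2, 4), (3, 6)],
                               [(10, 25), (20, 45), (30, 65)], 10.0)
```

Each branch stays within the range where its response is actually linear, which is what makes the combined result more reliable than one forced global fit.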
Persky, Adam M; Henry, Teague; Campbell, Ashley
2015-03-25
To examine factors that determine the interindividual variability of learning within a team-based learning environment. Students in a pharmacokinetics course were given 4 interim, low-stakes cumulative assessments throughout the semester and a cumulative final examination. Students' Myers-Briggs personality type was assessed, as well as their study skills, motivations, and attitudes towards team-learning. A latent curve model (LCM) was applied and various covariates were assessed to improve the regression model. A quadratic LCM was applied for the first 4 assessments to predict final examination performance. None of the covariates examined significantly impacted the regression model fit except metacognitive self-regulation, which explained some of the variability in the rate of learning. There were some correlations between personality type and attitudes towards team learning, with introverts having a lower opinion of team-learning than extroverts. The LCM could readily describe the learning curve. Extroverted and introverted personality types had the same learning performance even though preference for team-learning was lower in introverts. Other personality traits, study skills, or practice did not significantly contribute to the learning variability in this course.
Genetic Algorithm-Based Optimization to Match Asteroid Energy Deposition Curves
Tarano, Ana; Mathias, Donovan; Wheeler, Lorien; Close, Sigrid
2018-01-01
An asteroid entering Earth's atmosphere deposits energy along its path due to thermal ablation and dissipative forces that can be measured by ground-based and spaceborne instruments. Inference of pre-entry asteroid properties and characterization of the atmospheric breakup is facilitated by using an analytic fragment-cloud model (FCM) in conjunction with a Genetic Algorithm (GA). This optimization technique is used to inversely solve for the asteroid's entry properties, such as diameter, density, strength, velocity, entry angle, and strength scaling, from simulations using FCM. The previous parameters' fitness evaluation involves minimizing error to ascertain the best match between the physics-based calculated energy deposition and the observed meteors. This steady-state GA provided sets of solutions agreeing with literature, such as the meteor from Chelyabinsk, Russia in 2013 and Tagish Lake, Canada in 2000, which were used as case studies in order to validate the optimization routine. The assisted exploration and exploitation of this multi-dimensional search space enables inference and uncertainty analysis that can inform studies of near-Earth asteroids and consequently improve risk assessment.
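A minimal genetic algorithm in the spirit described above can be sketched with a toy curve model: evolve (amplitude, peak altitude, width) of a Gaussian-shaped energy-deposition curve to minimize squared residuals against an "observed" curve. The fragment-cloud physics is replaced by this toy model, and the target values and GA settings are illustrative, not those used in the study.

```python
import math
import random

# Toy GA matching a synthetic energy-deposition curve by residual minimization.
random.seed(0)

ALTS = [2.0 * h for h in range(26)]       # altitude grid, km
TRUE = (8.0, 30.0, 6.0)                   # amplitude, peak altitude, width

def curve(p):
    a, mu, w = p
    return [a * math.exp(-((h - mu) / w) ** 2) for h in ALTS]

OBS = curve(TRUE)                         # stand-in for an observed curve

def fitness(p):                           # sum of squared residuals, lower is better
    return sum((x - y) ** 2 for x, y in zip(curve(p), OBS))

def evolve(pop_size=40, gens=60):
    pop = [(random.uniform(1, 20), random.uniform(0, 50), random.uniform(1, 15))
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        pop = pop[:pop_size // 2]         # truncation selection keeps the elite
        while len(pop) < pop_size:
            a, b = random.sample(pop[:10], 2)
            pop.append(tuple((x + y) / 2 + random.gauss(0, 0.5)
                             for x, y in zip(a, b)))   # blend crossover + mutation
    return min(pop, key=fitness)

best = evolve()
```

Because the elite survive every generation, the best residual is monotonically non-increasing; in the real problem the spread of the final population also feeds the uncertainty analysis mentioned above.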
Lu, Yanrong; Li, Lixiang; Peng, Haipeng; Yang, Yixian
2015-03-01
Telecare medical information systems (TMISs) enable patients to conveniently enjoy telecare services at home. The protection of patients' privacy is a key issue due to the openness of the communication environment. Authentication is typically adopted to guarantee confidential and authorized interaction between the patient and the remote server. To achieve these goals, numerous remote authentication schemes based on cryptography have been presented. Recently, Arshad et al. (J Med Syst 38(12): 2014) presented a secure and efficient three-factor authenticated key exchange scheme to remedy the weaknesses of Tan et al.'s scheme (J Med Syst 38(3): 2014). In this paper, we found that a successful off-line password attack on Arshad et al.'s scheme allows an adversary to impersonate any user of the system. In order to thwart these security attacks, an enhanced biometric and smart card based remote authentication scheme for TMISs is proposed. In addition, BAN logic is applied to demonstrate the completeness of the enhanced scheme. Security and performance analyses show that our enhanced scheme satisfies more security properties and has lower computational cost compared with previously proposed schemes.
Genetic Algorithm-based Optimization to Match Asteroid Energy Deposition Curves
Tarano, Ana Maria; Mathias, Donovan; Wheeler, Lorien; Close, Sigrid
2017-10-01
An asteroid entering Earth’s atmosphere deposits energy along its path due to thermal ablation and dissipative forces that can be measured by ground-based and space-borne instruments. Inference of pre-entry asteroid properties and characterization of the atmospheric breakup is facilitated by using an analytic fragment-cloud model (FCM) in conjunction with a Genetic Algorithm (GA). This optimization technique is used to inversely solve for the asteroid’s entry properties, such as diameter, density, strength, velocity, entry angle, ablation coefficient, and strength scaling, from simulations using FCM. The previous parameters’ fitness evaluation involves minimizing residuals and comparing the incremental energy deposited to ascertain the best match between the physics-based calculated energy deposition and the observed meteors. This steady-state GA provided sets of solutions agreeing with literature, such as the meteor from Chelyabinsk, Russia in 2013 and Tagish Lake, Canada in 2000, which were used as case studies in order to validate the optimization routine. The assisted exploration and exploitation of this multi-dimensional search space enables inference and uncertainty analysis that can inform studies of near-Earth asteroids and consequently improve risk assessment.
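The inversion loop described above can be sketched as a steady-state GA minimizing the residual between a modeled and an observed energy deposition curve. The fragment-cloud model itself is not reproduced here; a smooth Gaussian-shaped deposition profile stands in for FCM, and all parameter names, bounds, and GA settings below are illustrative assumptions rather than the paper's configuration.

```python
import math
import random

def energy_deposition(alts, peak_alt, width, amplitude):
    # Stand-in for the fragment-cloud model (FCM), which is not reproduced
    # here: a smooth Gaussian-shaped deposition profile over altitude.
    return [amplitude * math.exp(-((h - peak_alt) / width) ** 2) for h in alts]

def fitness(params, alts, observed):
    # Sum of squared residuals between the modeled and observed curves;
    # the GA seeks parameter sets minimizing this error.
    model = energy_deposition(alts, *params)
    return sum((m - o) ** 2 for m, o in zip(model, observed))

def steady_state_ga(alts, observed, bounds, iterations=4000, pop_size=20, seed=1):
    # Steady-state GA: each step replaces the worst individual with a
    # mutated copy of a parent drawn from the fitter half of the population.
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(iterations):
        pop.sort(key=lambda p: fitness(p, alts, observed))
        parent = pop[rng.randrange(pop_size // 2)]
        child = [min(hi, max(lo, g + rng.gauss(0, 0.05 * (hi - lo))))
                 for g, (lo, hi) in zip(parent, bounds)]
        pop[-1] = child
    return min(pop, key=lambda p: fitness(p, alts, observed))
```

Fitting a synthetic curve generated with known parameters and checking that they are recovered is how such a routine can be validated before it is applied to observed meteors, mirroring the Chelyabinsk and Tagish Lake case studies.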
Preparing for TESS: Precision Ground-based Light-curves of Newly Discovered Transiting Exoplanets
Li, Yiting; Stefansson, Gudmundur; Mahadevan, Suvrath; Monson, Andy; Hebb, Leslie; Wisniewski, John; Huehnerhoff, Joseph
2018-01-01
NASA’s Transiting Exoplanet Survey Satellite (TESS), to be launched in early 2018, is expected to catalog a myriad of transiting exoplanet candidates ranging from Earth-sized to gas giants, orbiting a diverse range of stellar types in the solar neighborhood. In particular, TESS will find small planets orbiting the closest and brightest stars, and will enable detailed atmospheric characterizations of planets with current and future telescopes. In the TESS era, ground-based follow-up resources will play a critical role in validating and confirming the planetary nature of the candidates TESS will discover. Along with confirming the planetary nature of exoplanet transits, high precision ground-based transit observations allow us to put further constraints on exoplanet orbital parameters and transit timing variations. In this talk, we present new observations of transiting exoplanets recently discovered by the K2 mission, using the optical diffuser on the 3.5m ARC Telescope at Apache Point Observatory. These include observations of the mini-Neptunes K2-28b and K2-104b orbiting early-to-mid M-dwarfs. In addition, other recent transit observations performed using the robotic 30cm telescope at Las Campanas Observatory in Chile will be presented.
Radial artery pulse waveform analysis based on curve fitting using discrete Fourier series.
Jiang, Zhixing; Zhang, David; Lu, Guangming
2018-04-19
Radial artery pulse diagnosis has long played an important role in traditional Chinese medicine (TCM). Being non-invasive and convenient, pulse diagnosis has great significance for disease analysis in modern medicine. Practitioners sense the pulse waveforms at the patient's wrist and make diagnoses based on subjective personal experience. With research on pulse acquisition platforms and computerized analysis methods, objective study of pulse diagnosis can help TCM keep pace with the development of modern medicine. In this paper, we propose a new method to extract features from pulse waveforms based on discrete Fourier series (DFS). It regards the waveform as a signal consisting of a series of sub-components represented by sine and cosine (SC) signals with different frequencies and amplitudes. After the pulse signals are collected and preprocessed, we fit the average waveform of each sample with a discrete Fourier series by least squares. The feature vector comprises the coefficients of the discrete Fourier series function. Compared with fitting using a Gaussian mixture function, the fitting errors of the proposed method are smaller, indicating that our method represents the original signal better. The classification performance of the proposed feature is superior to other features extracted from the waveform, such as auto-regression and Gaussian mixture models. The coefficients of the optimized DFS function used to fit the arterial pressure waveforms thus model the waveforms better and hold more potential information for distinguishing different psychological states. Copyright © 2018 Elsevier B.V. All rights reserved.
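The least-squares DFS fit has a particularly simple form when one period of the waveform is uniformly sampled: the normal equations decouple, so each coefficient is a correlation sum. The sketch below (an assumption about the setup; the paper's preprocessing and classifier are not reproduced) returns the coefficient vector that would serve as the feature.

```python
import math

def dfs_fit(samples, order):
    # Least-squares fit of a truncated discrete Fourier series to one
    # uniformly sampled period. Returns [a0, a1, b1, ..., aK, bK],
    # the coefficients used as the feature vector.
    n = len(samples)
    coeffs = [sum(samples) / n]  # a0: mean level
    for k in range(1, order + 1):
        ak = 2.0 / n * sum(x * math.cos(2 * math.pi * k * i / n)
                           for i, x in enumerate(samples))
        bk = 2.0 / n * sum(x * math.sin(2 * math.pi * k * i / n)
                           for i, x in enumerate(samples))
        coeffs.extend([ak, bk])
    return coeffs

def dfs_eval(coeffs, t):
    # Evaluate the fitted series at normalized time t in [0, 1).
    y = coeffs[0]
    for k in range(1, (len(coeffs) - 1) // 2 + 1):
        ak, bk = coeffs[2 * k - 1], coeffs[2 * k]
        y += ak * math.cos(2 * math.pi * k * t) + bk * math.sin(2 * math.pi * k * t)
    return y
```

For a waveform that truly is a low-order SC sum, the fit recovers the generating coefficients exactly, which makes the representation easy to sanity-check.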
Titration Curves: Fact and Fiction.
Chamberlain, John
1997-01-01
Discusses ways in which datalogging equipment can enable titration curves to be measured accurately and how computing power can be used to predict the shape of curves. Highlights include sources of error, use of spreadsheets to generate titration curves, titration of a weak acid with a strong alkali, dibasic acids, weak acid and weak base, and…
Directory of Open Access Journals (Sweden)
Jiang Lin
2016-01-01
Full Text Available The overall efficiency of PV arrays is affected by hot spots, which should be detected and diagnosed by suitable monitoring techniques. Hot spot detection using IR thermal images has been studied as a direct, non-contact, non-destructive technique. However, IR thermal images suffer from relatively high stochastic noise and non-uniform clutter, so conventional image processing methods are not effective. This paper proposes a method to detect hot spots based on curve fitting of the gray histogram. MATLAB simulation results show that the proposed method detects hot spots effectively while suppressing the noise generated during image acquisition.
Huang, Shijie; Liu, Zanzan; Wen, Huixin; Li, Li; Li, Qingge; Huang, Jianwei
2015-02-01
To develop a high-throughput rapid method for Vibrio (V.) cholerae molecular typing based on Melting Curve-based Multilocus Melt Typing (McMLMT), seven housekeeping genes of V. cholerae were screened out; for each gene, specific primers were designed, as well as 4 probes covering polymorphic loci of the sequences. After optimizing all parameters, a melting-curve analysis following asymmetric PCR was established with dual fluorescent reporters in two reaction tubes for each gene. A set of 28 Tm values was obtained for each strain and then translated into a set of allelic gene codes standing for the strain's McMLMT type (MT). Meanwhile, sequences of the 7-locus polymorphisms were typed according to the MLST method. To evaluate the efficiency and reliability of McMLMT, the data were compared with those of sequence typing and PFGE using BioNumerics software. The McMLMT method was established and refined for rapid typing of V. cholerae: a dozen strains can be tested in a single 3-hour PCR run using 96-well plates. 108 strains were analyzed; their 28 Tm values could be grouped and encoded according to the 7 housekeeping genes to obtain the allelic gene code sets, which were classified into 18 types (D = 0.7233). Sequences of the 7 genes' polymorphic regions were directly clustered into the same 18 types by the MLST method. 46 of the strains, each representing a different PFGE type, could be classified into 13 types (D = 0.6145) with the McMLMT method and into A-K groups at 85% similarity (D = 0.8589) with the PFGE method. McMLMT is a rapid, high-throughput molecular typing method for batches of strains with a resolution equal to the MLST method and comparable to PFGE grouping.
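The D values quoted above are discriminatory indices for the typing methods. Assuming the standard Hunter-Gaston formulation (the abstract does not state which formula was used), the index is computed from the counts of strains per type:

```python
def hunter_gaston_di(type_counts):
    # Hunter-Gaston discriminatory index: the probability that two
    # randomly drawn strains fall into different types.
    # D = 1 - sum(n_j * (n_j - 1)) / (N * (N - 1))
    n = sum(type_counts)
    return 1.0 - sum(c * (c - 1) for c in type_counts) / (n * (n - 1))
```

A method that puts every strain in its own type scores D = 1; a method that lumps all strains into one type scores D = 0, so higher D means finer resolution.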
Energy Technology Data Exchange (ETDEWEB)
Lin, Dexu; Nasher Ahmed, Samer Ali; Rosner, Christoph [Helmholtz-Institut Mainz (Germany); Mainz Univ. (Germany); Dbeyssi, Alaa; Larin, Paul; Morales, Cristina; Wang, Yadi [Helmholtz-Institut Mainz (Germany); Maas, Frank [Helmholtz-Institut Mainz (Germany); Mainz Univ. (Germany); PRISMA Cluster of Excellence, Johannes Gutenberg Universitaet Mainz (Germany); Collaboration: BESIII-Collaboration
2016-07-01
The structure of the proton can be understood through the study of its electromagnetic (EM) form factors. Electron scattering experiments (space-like region) have explored the proton EM form factors with high accuracy. Only few data exist on the proton form factors in the time-like region, and only a very coarse determination of the individual electric and magnetic form factors (or their ratio) has been possible so far. BESIII (Beijing Spectrometer III) at BEPCII (Beijing Electron Positron Collider II) has collected large data samples from the J/ψ mass up to 4.60 GeV. These data can be used to measure proton EM form factors by means of Initial-State-Radiation (ISR) events with the process e⁺e⁻ → p p̄ γ_ISR. With 7.408 fb⁻¹ total luminosity of seven data samples from 3.773-4.600 GeV, the proton form factors and the cross section of p p̄ have been analyzed with the ISR-tagged method. In this talk, the status of this work is reported together with a discussion of the background analysis.
2002-01-01
The aim of this experiment is to measure with precision the electromagnetic form factors of the proton in the time-like region via the reaction p̄p → e⁺e⁻ with antiprotons of momenta between 0 and 2 GeV/c. Up to ≈ 800 MeV/c, a continuous energy scan in ≈ 2 MeV (√s) bins will be performed. The form factors |G_E| and |G_M| will be determined separately, since large statistics can be collected with LEAR antiproton beams, so that angular distributions can be obtained at many momenta. In addition, e⁺e⁻ pairs produced via the reaction p̄p → V⁰ + neutrals, V⁰ → e⁺e⁻, where the antiprotons are at rest, will be detected, allowing the vector meson mass spectrum between ≈ 1 GeV and ≈ 1.7 GeV to be obtained with high statistics and in one run. The proposed apparatus consists of a central detector, surrounded by a gas Cerenkov counter, wire chambers, hodoscopes, and an electromagnetic calorimeter. The central detector consists of several layers of proportional chambers around a liquid-h...
International Nuclear Information System (INIS)
Pollock, M.D.
1986-01-01
We consider the (4+N)-dimensional theory whose Lagrangian function is L_{4+N} = √(-ĝ) α R̂², where R̂ is the Ricci scalar and α is a positive constant. The metric is ĝ_AB = diag(g_ab, φ⁻¹ ḡ_mn). Dimensional reduction leads to an effective four-dimensional Lagrangian of induced-gravity type. The positive semi-definiteness of L avoids the difficulties, pointed out recently by Horowitz and by Rubakov, which can arise in quantum cosmology when the (Euclidean) action becomes negative. The compactification is onto a time-like internal space ḡ_mn, as suggested by Aref'eva and Volovich, giving a four-dimensional de Sitter space-time with φ = constant, which however is classically unstable on a time scale ≈ H⁻¹. Decrease of the radius φ^(-1/2) of the internal space is ultimately halted by quantum effects, via some V(φ), and L₄ then includes the usual Hilbert term and a cosmological constant. (author)
International Nuclear Information System (INIS)
Sriyono; Ismu Wahyono, Puradwi; Mulyanto, Dwijo; Kusmono, Siamet
2001-01-01
The failure rate curves of the main components of the Multipurpose Reactor G.A. Siwabessy have been analyzed, namely the pumps of the Fuel Storage Pool Purification System (AK-AP), Primary Cooling System (JE01-AP), Primary Pool Purification System (KBE01-AP), Warm Layer System (KBE02-AP), Cooling Tower (PA/D-AH), and Secondary Cooling System, and the Diesel (BRV). The failure rate curves were built from a component database taken from the RSG-GAS operation log book, covering 2500 hours of operation. From these curves it is concluded that the component failure rates follow a bathtub curve; maintenance processing causes the anomalies in the curve.
Local differential geometry of null curves in conformally flat space-time
International Nuclear Information System (INIS)
Urbantke, H.
1989-01-01
The conformally invariant differential geometry of null curves in conformally flat space-times is given, using the six-vector formalism which has generalizations to higher dimensions. This is then paralleled by a twistor description, with a twofold merit: firstly, sometimes the description is easier in twistor terms, sometimes in six-vector terms, which leads to a mutual enlightenment of both; and secondly, the case of null curves in timelike pseudospheres or 2+1 Minkowski space we were only able to treat twistorially, making use of an invariant differential found by Fubini and Cech. The result is the expected one: apart from stated exceptional cases there is a conformally invariant parameter and two conformally invariant curvatures which, when specified in terms of this parameter, serve to characterize the curve up to conformal transformations. 12 refs. (Author)
International Nuclear Information System (INIS)
Rothman, Dale S.
1998-01-01
Recent research has examined the hypothesis of an environmental Kuznets curve (EKC): the notion that environmental impact increases in the early stages of development followed by declines in the later stages. These studies have focused on the relationship between per capita income and a variety of environmental indicators. Results imply that EKCs may exist for a number of cases. However, the measures of environmental impact used generally focus on production processes and reflect environmental impacts that are local in nature and for which abatement is relatively inexpensive in terms of monetary costs and/or lifestyle changes. Significantly, more consumption-based measures, such as CO₂ emissions and municipal waste, for which impacts are relatively easy to externalize or costly to control, show no tendency to decline with increasing per capita income. By considering consumption and trade patterns, the author re-examines the concept of the EKC and proposes the use of alternative, consumption-based measures of environmental impact. The author speculates that what appear to be improvements in environmental quality may in reality be indicators of the increased ability of consumers in wealthy nations to distance themselves from the environmental degradation associated with their consumption
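The EKC hypothesis is conventionally tested with a quadratic-in-log-income regression, from which the inverted-U shape and its turning point follow directly. The functional form below is the standard one from the EKC literature, stated here as an assumption rather than taken from this abstract:

```python
import math

def ekc_impact(income, b1, b2, a=0.0):
    # Conventional EKC regression form (an assumption, not stated in the
    # abstract): ln(impact) = a + b1*ln(income) - b2*ln(income)^2
    x = math.log(income)
    return math.exp(a + b1 * x - b2 * x * x)

def ekc_turning_point(b1, b2):
    # Income at which impact peaks: d/dx (b1*x - b2*x^2) = 0  =>  x = b1/(2*b2)
    return math.exp(b1 / (2.0 * b2))
```

The author's point can be restated in these terms: for consumption-based measures such as CO₂, the estimated b2 is effectively zero, so no turning point exists within any observed income range.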
Brinkman, Willem M; Luursema, Jan-Maarten; Kengen, Bas; Schout, Barbara M A; Witjes, J Alfred; Bekkers, Ruud L
2013-03-01
To answer 2 research questions: what are the learning curve patterns of novices on the da Vinci Skills Simulator parameters, and which parameters are appropriate for criterion-based robotic training? A total of 17 novices completed 2 simulator sessions within 3 days. Each training session consisted of a warming-up exercise, followed by 5 repetitions of the "ring and rail II" task. Expert participants (n = 3) performed a warming-up exercise and 3 repetitions of the "ring and rail II" task on 1 day. We analyzed all 9 parameters of the simulator. Significant learning occurred on 5 parameters: overall score, time to complete, instrument collision, instruments out of view, and critical errors within 1-10 repetitions (P < .05); economy of motion and excessive instrument force only showed improvement within the first 5 repetitions. No significant learning on the parameters drops and master workspace range was found. Using the expert overall performance score (n = 3) as a criterion (overall score > 90%), 9 of 17 novice participants met the criterion within 10 repetitions. Most parameters showed that basic robotic skills are learned relatively quickly using the da Vinci Skills Simulator, but 10 repetitions were not sufficient for most novices to reach an expert level. Some parameters seemed inappropriate for expert-based criterion training because either no learning occurred or novice performance was equal to expert performance. Copyright © 2013 Elsevier Inc. All rights reserved.
Wimmers, Paul F; Lee, Ming
2015-05-01
To determine the direction and extent to which medical student scores (as observed by small-group tutors) on four problem-based-learning-related domains change over nine consecutive blocks during a two-year period (Domains: Problem Solving/Use of Information/Group Process/Professionalism). Latent growth curve modeling is used to analyze performance trajectories in each domain of two cohorts of 1st and 2nd year students (n = 296). Slopes of the growth trajectories show similar linear increments in the first three domains. Further analysis revealed relative strong individual variability in initial scores but not in their later increments. Professionalism, on the other hand, shows low variability and has very small, insignificant slope increments. In this study, we showed that the learning domains (Problem Solving, Use of Information, and Group Process) observed during PBL tutorials are not only related to each other but also develop cumulatively over time. Professionalism, in contrast to the other domains studied, is less affected by the curriculum suggesting that this represents a stable characteristic. The observation that the PBL tutorial has an equal benefit to all students is noteworthy and needs further investigation.
Directory of Open Access Journals (Sweden)
Yuanyuan Zhang
2015-01-01
Full Text Available Since the concept of ubiquitous computing was first proposed by Mark Weiser, its connotation has been extended and expanded by many scholars. In pervasive computing application environments, many kinds of small devices containing smart cards are used to communicate with others. In 2013, Yang et al. proposed an enhanced authentication scheme using smart cards for digital rights management and demonstrated that their scheme is secure. However, Mishra et al. pointed out that Yang et al.'s scheme suffers from the password guessing attack and the denial of service attack. Moreover, they also demonstrated that Yang et al.'s scheme is not efficient when the user inputs an incorrect password. In this paper, we analyze Yang et al.'s scheme again and find that it is vulnerable to the session key attack; in addition, there are some mistakes in their scheme. To surmount these weaknesses, we propose a more efficient and provably secure digital rights management authentication scheme using smart cards based on elliptic curve cryptography.
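The primitive underlying any ECC-based authentication scheme is scalar multiplication of curve points. The toy sketch below uses a tiny curve purely for illustration (real schemes use standardized curves such as P-256) and shows the Diffie-Hellman-style agreement property such schemes build on; it is not the paper's protocol.

```python
# Toy Weierstrass curve y^2 = x^3 + A*x + B over F_P, with tiny
# illustrative parameters only -- never use such a curve in practice.
P, A, B = 97, 2, 3
G = (3, 6)  # base point: 6^2 = 36 = 3^3 + 2*3 + 3 (mod 97)

def inv(x):
    # Modular inverse via Fermat's little theorem (P is prime).
    return pow(x, P - 2, P)

def add(p, q):
    # Point addition; None represents the point at infinity.
    if p is None: return q
    if q is None: return p
    if p[0] == q[0] and (p[1] + q[1]) % P == 0:
        return None
    if p == q:
        lam = (3 * p[0] * p[0] + A) * inv(2 * p[1]) % P
    else:
        lam = (q[1] - p[1]) * inv((q[0] - p[0]) % P) % P
    x = (lam * lam - p[0] - q[0]) % P
    return (x, (lam * (p[0] - x) - p[1]) % P)

def mul(k, p):
    # Double-and-add scalar multiplication k*p.
    r = None
    while k:
        if k & 1:
            r = add(r, p)
        p = add(p, p)
        k >>= 1
    return r
```

Two parties with secrets a and b can each compute a*(b*G) = b*(a*G), the commutativity that key-agreement and authentication protocols exploit; the hardness of recovering a from a*G is the elliptic curve discrete logarithm problem.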
Genetic Algorithm-Based Optimization Methodology of Bézier Curves to Generate a DCI Microscale-Model
Directory of Open Access Journals (Sweden)
Jesus A. Basurto-Hurtado
2017-11-01
Full Text Available The aim of this article is to develop a methodology capable of generating micro-scale models of ductile cast irons that preserve the smoothness of the graphite nodule contours, which is otherwise lost to discretization errors when the contours are extracted using image processing. The proposed methodology uses image processing to extract the graphite nodule contours and a genetic algorithm-based optimization strategy to select the optimal degree of the Bézier curve that best approximates each graphite nodule contour. To validate the proposed methodology, a Finite Element Analysis (FEA) was carried out using models obtained through three methods: (a) using a fixed Bézier degree for all of the graphite nodule contours, (b) the present methodology, and (c) using commercial software. The results were compared using the relative error of the equivalent stresses computed by the FEA, with the proposed methodology's results used as the reference. This paper does not aim to define which models are correct and which are not. However, it shows that the errors generated in the discretization process should not be ignored when developing geometric models, since they can produce relative errors of up to 35.9% when the mechanical behavior is estimated.
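A Bézier curve of arbitrary degree can be evaluated with de Casteljau's algorithm, and a contour-approximation error of the kind a GA could use as fitness then follows directly. This is a generic sketch; the parameterization of the contour points and the GA operators are assumptions, not the paper's implementation.

```python
def de_casteljau(control, t):
    # Evaluate a Bezier curve of arbitrary degree at parameter t by
    # repeated linear interpolation of the control points.
    pts = list(control)
    while len(pts) > 1:
        pts = [((1 - t) * x0 + t * x1, (1 - t) * y0 + t * y1)
               for (x0, y0), (x1, y1) in zip(pts, pts[1:])]
    return pts[0]

def contour_error(control, contour):
    # Sum of squared distances from sampled contour points to the Bezier
    # curve at uniformly spaced parameters -- the kind of fitness a GA
    # could minimize when selecting the optimal curve degree.
    n = len(contour)
    err = 0.0
    for i, (px, py) in enumerate(contour):
        bx, by = de_casteljau(control, i / (n - 1))
        err += (bx - px) ** 2 + (by - py) ** 2
    return err
```

The GA's degree selection then amounts to trading this fitting error against the number of control points, so that the smooth nodule outline is captured without overfitting the pixelated contour.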
Yang, Feifei; Hingerl, Ferdinand F; Xiao, Xianghui; Liu, Yijin; Wu, Ziyu; Benson, Sally M; Toney, Michael F
2015-06-03
The elevated level of atmospheric carbon dioxide (CO2) has caused serious concern about the progression of global warming. Geological sequestration is considered one of the most promising techniques for mitigating the damaging effects of global climate change. Investigations over a wide range of length scales are important for systematic evaluation of underground formations as prospective CO2 reservoirs. Understanding the relationship between micro-scale morphology and observed macro-scale phenomena is even more crucial. Here we present a synchrotron-based X-ray microtomographic study of the morphological buildup of sandstones. We present a numerical method to extract the pore size distribution of the porous structure directly, without approximation or complex calculation. We also demonstrate its capability in predicting the capillary pressure curve of a mercury intrusion porosimetry (MIP) measurement. The method presented in this work can be directly applied to morphological studies of heterogeneous systems in various research fields, ranging from carbon capture and storage and enhanced oil recovery to environmental remediation in the vadose zone.
Sun, Haitao; Zhang, Yansong; Lai, Xinmin; Chen, Guanlong
2008-12-01
In order to reduce destructive testing of car sub-assemblies, on-line inspection of weld quality has attracted increasing attention for both resistance spot welding (RSW) and weldbonding. Dynamic resistance directly determines the amount of heat generated by current flow and consequently reflects nugget formation and growth, making it one of the most effective bases for quality inspection. Using measurements of voltage and current at the secondary circuit of the welding transformer, this paper proposes a method for on-line inspection of weld quality based on two indicators from the dynamic resistance curve: time to nugget initiation and durable time to nugget expansion. First, the proper ranges of these two indicators for good welds are established during the welding of RSW and weldbonding joints. Then on-line inspection of weld quality on the basis of these ranges is carried out. The experimental results support the following conclusions: the method clearly separates accepted welds without expulsion from welds of unaccepted nugget size in both RSW and weldbonding, and the proper range for good welds, independent of electrode wear, is obtained only for a new electrode.
Energy Technology Data Exchange (ETDEWEB)
Chhipa, Mayur Kumar, E-mail: mayurchhipa1@gmail.com [Deptt. of Electronics and Communication Engineering, Government Engineering College Ajmer Rajasthan INDIA (India); Dusad, Lalit Kumar [Rajasthan Technical University Kota, Rajasthan (India)
2016-05-06
In this paper a channel drop filter (CDF) is designed using a dual curved photonic crystal ring resonator (PCRR). The photonic band gap (PBG) is calculated by the plane wave expansion (PWE) method, and the photonic crystal (PhC), based on a two-dimensional (2D) square lattice of periodic arrays of silicon (Si) rods in air, has been investigated using the finite difference time domain (FDTD) method. The number of rods in the Z and X directions is 21 and 20 respectively, with lattice constant 0.540 µm and rod radius r = 0.1 µm. The channel drop filter has been optimized for the telecommunication wavelength λ = 1.591 µm with refractive index 3.533. Further analysis of the designed structure shows that, by changing the refractive index of all rods, the filter may be used to drop several other channels as well. The designed structure is useful for CWDM systems and may serve as a key component in photonic integrated circuits. The device is ultra-compact, with an overall size of around 123 µm².
International Nuclear Information System (INIS)
Neij, Lena
2008-01-01
Technology foresight studies have become an important tool in identifying realistic ways of reducing the impact of modern energy systems on the climate and the environment. Studies on the future cost development of advanced energy technologies are of special interest. One approach widely adopted for the analysis of future cost is the experience curve approach. The question is, however, how robust this approach is, and which experience curves should be used in energy foresight analysis. This paper presents an analytical framework for the analysis of future cost development of new energy technologies for electricity generation; the analytical framework is based on an assessment of available experience curves, complemented with bottom-up analysis of sources of cost reductions and, for some technologies, judgmental expert assessments of long-term development paths. The results of these three methods agree in most cases, i.e. the cost (price) reductions described by the experience curves match the incremental cost reduction described in the bottom-up analysis and the judgmental expert assessments. For some technologies, the bottom-up analysis confirms large uncertainties in future cost development not captured by the experience curves. Experience curves with a learning rate ranging from 0% to 20% are suggested for the analysis of future cost development
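The experience curve itself is a simple power law: each doubling of cumulative production cuts unit cost by the learning rate. A minimal sketch of the extrapolation (symbols are generic and not tied to any specific technology in the abstract):

```python
import math

def experience_curve_cost(c0, cum0, cum, learning_rate):
    # Experience-curve extrapolation: cost = c0 * (cum / cum0) ** (-b),
    # where the exponent b is set so that each doubling of cumulative
    # production multiplies cost by (1 - learning_rate): 2**(-b) = 1 - LR.
    b = -math.log2(1.0 - learning_rate)
    return c0 * (cum / cum0) ** (-b)
```

The paper's point maps onto the parameters: the suggested learning rates of 0% to 20% bound `b` between 0 and about 0.32, and bottom-up analysis is needed precisely because a single historical `learning_rate` may not persist into the future.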
Kim, Jonggun; Engel, Bernard A; Park, Youn Shik; Theller, Larry; Chaubey, Indrajeet; Kong, Dong Soo; Lim, Kyoung Jae
2012-04-30
In many states of the US, the total maximum daily load program has been widely developed for watershed water quality restoration and management. However, the total maximum daily load is often represented as an average daily pollutant load based on average long-term flow conditions, and as such it does not adequately describe the problems it aims to address. Without an adequate characterization of water quality problems, appropriate solutions cannot be identified and implemented. The total maximum daily load approach should consider adequate water quality characterizations based on overall flow conditions rather than on a single flow statistic such as average daily flow. The Load Duration Curve, which provides opportunities for enhanced pollutant source and best management practice targeting both in total maximum daily load development and in water quality restoration efforts, has been used for the determination of appropriate total maximum daily load targets. However, in our experience, unskilled users need at least 30 minutes to an hour to generate the Load Duration Curve using a desktop spreadsheet program. Therefore, in this study, the Web-based Load Duration Curve system (https://engineering.purdue.edu/∼ldc/) was developed and applied to a study watershed for an analysis of the total maximum daily load and water quality characteristics in the watershed. This system provides diverse options for Flow Duration Curve and Load Duration Curve analysis of a watershed of interest in a short time. The Web-based Load Duration Curve system is useful for characterizing the problem according to flow regimes and for providing a visual representation that enables an easy understanding of the problem and the total maximum daily load targets. In addition, this system can help researchers identify appropriate best management practices within watersheds. Copyright © 2011 Elsevier Ltd. All rights reserved.
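The underlying construction is simple: rank the flows to obtain a Flow Duration Curve, then scale by the water quality target concentration to obtain the allowable-load curve. A minimal sketch under those assumptions (the Web system's unit conversions and interface are not reproduced; `unit_factor` is a placeholder for the concentration-times-flow unit conversion):

```python
def flow_duration_curve(flows):
    # Rank flows from highest to lowest and attach exceedance
    # probabilities using the Weibull plotting position i/(n+1).
    ranked = sorted(flows, reverse=True)
    n = len(ranked)
    return [((i + 1) / (n + 1), q) for i, q in enumerate(ranked)]

def load_duration_curve(flows, target_conc, unit_factor=1.0):
    # Allowable load at each exceedance level: flow times the water
    # quality target concentration (times a unit-conversion factor).
    return [(p, q * target_conc * unit_factor)
            for p, q in flow_duration_curve(flows)]
```

Plotting observed loads against this curve then shows in which flow regime (high-flow, mid-range, or low-flow) exceedances occur, which is what makes the method useful for targeting pollutant sources.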
Directory of Open Access Journals (Sweden)
Ping Wan
2016-08-01
Full Text Available Driving anger, called "road rage", has become increasingly common nowadays, affecting road safety. A few studies have focused on how to identify driving anger; however, there is still a gap in driving anger grading, especially in real traffic environments, which would make it possible to take intervention measures matched to the anger intensity. This study proposes a method for discriminating driving anger states of different intensity based on electroencephalogram (EEG) spectral features. First, thirty drivers were recruited to conduct on-road experiments on a busy route in Wuhan, China, where anger could be induced by various road events, e.g., vehicles weaving or cutting in line, jaywalking or cyclists crossing, traffic congestion, and waiting at red lights; drivers could earn extra pay by completing the experiments ahead of a baseline time. Subsequently, significance analysis was used to select the relative energy spectrum of the β band (β%) and the relative energy spectrum of the θ band (θ%) for discriminating the different driving anger states. Finally, according to receiver operating characteristic (ROC) curve analysis, the optimal thresholds (best cut-off points) of β% and θ% for identifying the non-anger state (i.e., neutral) were determined to be 0.2183 ≤ θ% < 1, 0 < β% < 0.2586; the low anger state is 0.1539 ≤ θ% < 0.2183, 0.2586 ≤ β% < 0.3269; the moderate anger state is 0.1216 ≤ θ% < 0.1539, 0.3269 ≤ β% < 0.3674; and the high anger state is 0 < θ% < 0.1216, 0.3674 ≤ β% < 1. Moreover, the discrimination performance on verification data indicates that the overall accuracy (Acc) of the optimal thresholds of β% for discriminating the four driving anger states is 80.21%, and 75.20% for those of θ%. The results can provide a theoretical foundation for developing driving anger detection or warning devices based on the relevant optimal thresholds.
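A common way to derive such "optimal thresholds" from a ROC analysis is to maximize Youden's J statistic over the candidate cut-offs. The sketch below shows that generic procedure for a single scalar feature; it is an assumption, since the abstract does not name the exact ROC criterion used:

```python
def youden_threshold(scores, labels):
    # Pick the ROC cut-off maximizing Youden's J = sensitivity +
    # specificity - 1 over all observed score values.
    # labels: 1 marks the positive class, 0 the negative class.
    pos = sum(labels)
    neg = len(labels) - pos
    best_t, best_j = None, -1.0
    for t in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if y == 1 and s >= t)
        fp = sum(1 for s, y in zip(scores, labels) if y == 0 and s >= t)
        j = tp / pos + (1 - fp / neg) - 1
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j
```

Running this once per adjacent pair of anger classes (neutral vs. low, low vs. moderate, and so on) yields a set of cut-offs on β% or θ% of the same shape as the interval boundaries reported above.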
Guo, Lianping; Tian, Shulin; Jiang, Jun
2015-03-01
This paper proposes an algorithm to estimate the channel mismatches in a time-interleaved analog-to-digital converter (TIADC) based on fractional delay (FD) and sine curve fitting. One channel is chosen as the reference channel, and FD is applied to its output samples to obtain the ideal, mismatch-free samples of the non-reference channels. Based on the least squares method, sine curves are fitted to the ideal and the actual samples of the non-reference channels, and the mismatch parameters are then estimated by comparing the ideal sine curves with the actual ones. The principle of this algorithm is simple and easily understood. Moreover, its implementation needs no extra circuits, lowering the hardware cost. Simulation results show that the estimation accuracy of this algorithm can be kept within 2%. Finally, the practicability of this algorithm is verified by measurements of the channel mismatch errors of a two-channel TIADC prototype.
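For a test tone spanning an integer number of cycles of the record, the least-squares sine fit has a closed form, and gain and phase mismatch follow by comparing the fitted parameters of a channel against those of the reference. A minimal sketch under those assumptions (the paper's fractional-delay step and exact parameterization are not reproduced):

```python
import math

def sine_params(samples, k):
    # Least-squares amplitude, phase, and offset of a sinusoid with k
    # full cycles across the record; for uniform sampling over integer
    # cycles the normal equations reduce to correlation sums.
    n = len(samples)
    a = 2.0 / n * sum(x * math.cos(2 * math.pi * k * i / n)
                      for i, x in enumerate(samples))
    b = 2.0 / n * sum(x * math.sin(2 * math.pi * k * i / n)
                      for i, x in enumerate(samples))
    return math.hypot(a, b), math.atan2(-b, a), sum(samples) / n

def channel_mismatch(ref, ch, k):
    # Gain and phase mismatch of a channel relative to the reference,
    # estimated by comparing the two sine fits.
    amp_r, ph_r, _ = sine_params(ref, k)
    amp_c, ph_c, _ = sine_params(ch, k)
    return amp_c / amp_r, ph_c - ph_r
```

In an actual TIADC calibration the phase term would mix timing skew with any intentional fractional delay, which is why the FD compensation precedes the fit in the paper's scheme.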
International Nuclear Information System (INIS)
Adamuscin, C.; Gakh, G. I.; Tomasi-Gustafsson, E.
2007-01-01
The electron-positron annihilation reaction into four-pion production has been studied through the channel e⁺ + e⁻ → ρ + ρ. The differential (and total) cross sections and various polarization observables for this reaction have been calculated in terms of the electromagnetic form factors of the corresponding γ*ρρ current. The elements of the spin-density matrix of the ρ meson were also calculated. Numerical estimates have been made with the help of phenomenological form factors obtained in the space-like region of the momentum transfer squared and analytically continued to the time-like region
Measurement of time-like baryon electro-magnetic form factors in BESIII
Energy Technology Data Exchange (ETDEWEB)
Morales Morales, Cristina; Dbeyssi, Alaa [Helmholtz-Institut Mainz (Germany); Ahmed, Samer Ali Nasher; Lin, Dexu; Rosner, Christoph; Wang, Yadi [Helmholtz-Institut Mainz (Germany); Institut fuer Kernphysik, Johannes Gutenberg-Universitaet Mainz (Germany); Maas, Frank [Helmholtz-Institut Mainz (Germany); Institut fuer Kernphysik, Johannes Gutenberg-Universitaet Mainz (Germany); PRISMA Cluster of Excellence, Johannes Gutenberg-Universitaet Mainz (Germany); Collaboration: BESIII-Collaboration
2016-07-01
BEPCII is a symmetric electron-positron collider located in Beijing, running at center-of-mass energies between 2.0 and 4.6 GeV. This energy range allows the BESIII experiment to measure baryon form factors both from direct electron-positron annihilation and from initial-state radiation processes. We present results on direct electron-positron annihilation into proton anti-proton and preliminary results on direct electron-positron annihilation into lambda anti-lambda, based on data collected by BESIII in 2011 and 2012. Finally, expectations for the measurement of nucleon and hyperon electromagnetic form factors from the BESIII high-luminosity energy scan in 2015 and from initial-state radiation processes at different center-of-mass energies are also shown.
Mazza, Fabio
2017-08-01
The curved surface sliding (CSS) system is one of the most in-demand techniques for the seismic isolation of buildings, yet important aspects of its behaviour still need further attention. The friction coefficient of the CSS system varies with the sliding velocity of the bearings, while the friction force and lateral stiffness during the sliding phase are proportional to the axial load. The lateral-torsional response needs to be better understood for base-isolated structures located in near-fault areas, where fling-step and forward-directivity effects can produce long-period (horizontal) velocity pulses. To analyse these aspects, a six-storey reinforced concrete (r.c.) office framed building, with an L-shaped plan and setbacks in elevation, is designed assuming three values of the radius of curvature for the CSS system. Seven in-plan distributions of the dynamic-fast friction coefficient of the CSS bearings, ranging from a constant value for all isolators to a different value for each, are considered for low- and medium-type friction properties. The seismic analysis of the test structures is carried out assuming elastic-linear behaviour of the superstructure, while a nonlinear force-displacement law, depending on sliding velocity and axial load, is adopted for the CSS bearings in the horizontal direction. Given the lack of knowledge of the horizontal direction at which near-fault ground motions occur, the maximum torsional effects and residual displacements are evaluated with reference to different incidence angles, while the orientation of the strongest observed pulses is considered to obtain average values.
Directory of Open Access Journals (Sweden)
Iman Eshraghi
2016-09-01
Imperfection sensitivity of the large-amplitude vibration of curved single-walled carbon nanotubes (SWCNTs) is considered in this study. The SWCNT is modeled as a Timoshenko nano-beam, and its curved shape is included as an initial geometric imperfection term in the displacement field. Geometric nonlinearities of von Kármán type and the nonlocal elasticity theory of Eringen are employed to derive the governing equations of motion. Spatial discretization of the governing equations and associated boundary conditions is performed using the differential quadrature (DQ) method, and the corresponding nonlinear eigenvalue problem is solved iteratively. Effects of the amplitude and location of the geometric imperfection and of the nonlocal small-scale parameter on the nonlinear frequency are investigated for various boundary conditions. The results show that the geometric imperfection and nonlocality play a significant role in the nonlinear vibration characteristics of curved SWCNTs.
Lavini, Cristina; Verhoeff, Joost J. C.; Majoie, Charles B.; Stalpers, Lukas J. A.; Richel, Dick J.; Maas, Mario
2011-01-01
To compare time intensity curve (TIC)-shape analysis of dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) data with model-based analysis and semiquantitative analysis in patients with high-grade glioma treated with the antiangiogenic drug bevacizumab. Fifteen patients had a pretreatment
Knoope, M.M.J.; Meerman, J.C.; Ramirez, C.A.; Faaij, A.P.C.
2013-01-01
This study aims to investigate the technological and economic prospects of integrated gasification facilities for power (IGCC) and Fischer–Tropsch (FT) liquid production with and without CCS over time. For this purpose, a component based experience curve was constructed and applied to identify the
Directory of Open Access Journals (Sweden)
A. Suppasri
2011-01-01
The 2004 Indian Ocean tsunami damaged and destroyed numerous buildings and houses in Thailand. Estimating the tsunami's impact on buildings and evaluating the potential risks are important tasks that are still in progress. A tsunami fragility curve is a function used to estimate structural fragility against tsunami hazards. This study develops fragility curves using visual inspection of high-resolution satellite images (IKONOS) taken before and after the event to classify, from the remaining roofs, whether buildings were destroyed. A tsunami inundation model is then created to reconstruct features of the event such as inundation depth, current velocity, and hydrodynamic force. The fragility curves are assumed to follow normal or lognormal distribution functions, and the median and log-standard deviation are estimated by least-squares fitting. The resulting fragility curves for different building materials (mixed type, reinforced concrete, and wood) show consistent performance in damage probability and compare well with existing curves for other locations.
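The lognormal fragility fit described above can be sketched as follows: under the probit transform the lognormal CDF becomes linear in ln(depth), so ordinary least squares recovers the log-median and log-standard deviation. The depth and damage-probability numbers below are made up for illustration, not data from the study.

```python
import numpy as np
from scipy.stats import norm

# Illustrative binned data: observed damage probability vs. inundation depth (m).
depth = np.array([0.5, 1.0, 2.0, 3.0, 4.0, 6.0])
p_obs = np.array([0.05, 0.20, 0.55, 0.75, 0.88, 0.97])

# Lognormal fragility: P(damage | x) = Phi((ln x - mu) / sigma).
# Probit transform: norm.ppf(p) = (1/sigma) * ln x - mu/sigma, linear in ln x.
slope, intercept = np.polyfit(np.log(depth), norm.ppf(p_obs), 1)
sigma = 1.0 / slope           # log-standard deviation
mu = -intercept * sigma       # log-median
median_depth = np.exp(mu)     # depth with 50% damage probability

fragility = lambda x: norm.cdf((np.log(x) - mu) / sigma)
```

The same two fitted parameters fully define the curve, which is what makes the lognormal form convenient for comparing building-material classes.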
Lagrangian Curves on Spectral Curves of Monopoles
International Nuclear Information System (INIS)
Guilfoyle, Brendan; Khalid, Madeeha; Ramon Mari, Jose J.
2010-01-01
We study Lagrangian points on smooth holomorphic curves in TP¹ equipped with a natural neutral Kaehler structure, and prove that they must form real curves. By virtue of the identification of TP¹ with the space LE³ of oriented affine lines in Euclidean 3-space, these Lagrangian curves give rise to ruled surfaces in E³, which we prove have zero Gauss curvature. Each ruled surface is shown to be the tangent lines to a curve in E³, called the edge of regression of the ruled surface. We give an alternative characterization of these curves as the points in E³ where the number of oriented lines in the complex curve Σ that pass through the point is less than the degree of Σ. We then apply these results to the spectral curves of certain monopoles and construct the ruled surfaces and edges of regression generated by the Lagrangian curves.
Curves from Motion, Motion from Curves
2000-01-01
...tautochrone and brachistochrone properties. To Descartes, however, the rectification of curves such as the spiral (3) and the cycloid (4) was suspect... This paper is part of the compilation report: International Conference on Curves and Surfaces [4th] (Defense Technical Information Center, ADP012017; approved for public release, distribution unlimited).
Optimization on Spaces of Curves
DEFF Research Database (Denmark)
Møller-Andersen, Jakob
This thesis is concerned with computational and theoretical aspects of Riemannian metrics on spaces of regular curves, and their applications. It was recently proved that second-order constant-coefficient Sobolev metrics on curves are geodesically complete. We extend this result to the case of Sobolev metrics with coefficient functions depending on the length of the curve. We show how to apply this result to analyse a wide range of metrics on the submanifold of unit- and constant-speed curves. We present a numerical discretization of second-order Sobolev metrics on the space of regular curves... of cardiac deformations. Finally, we investigate a new application of Riemannian shape analysis in shape optimization. We set up a simple elliptic model problem, and describe how to apply shape calculus to obtain directional derivatives in the manifold of planar curves. We present an implementation based...
Lai, Chia-Lin; Lee, Jhih-Shian; Chen, Jyh-Cheng
2015-02-01
Energy mapping, the conversion of linear attenuation coefficients (μ) calculated at the effective computed tomography (CT) energy to those corresponding to 511 keV, is an important step in CT-based attenuation correction (CTAC) for positron emission tomography (PET) quantification. The aim of this study was to implement the energy-mapping step using the curve-fitting ability of an artificial neural network (ANN). Eleven digital phantoms simulated by the Geant4 application for tomographic emission (GATE) and 12 physical phantoms composed of various volume concentrations of iodine contrast were used to generate energy-mapping curves, by acquiring the average CT values and linear attenuation coefficients at 511 keV of these phantoms. The curves were built with the ANN toolbox in MATLAB. To evaluate the effectiveness of the proposed method, another two digital phantoms (liver and spine-bone) and three physical phantoms (volume concentrations of 3%, 10%, and 20%) were used to compare the energy-mapping curves built by the ANN and by the bilinear transformation, and a semi-quantitative analysis was performed by injecting 0.5 mCi FDG into an SD rat for micro-PET scanning. The results showed that the percentage relative difference (PRD) values of the digital liver and spine-bone phantoms are 5.46% and 1.28% based on the ANN, and 19.21% and 1.87% based on the bilinear transformation. For the 3%, 10%, and 20% physical phantoms, the PRD values of the ANN curve are 0.91%, 0.70%, and 3.70%, and those of the bilinear transformation are 3.80%, 1.44%, and 4.30%, respectively. Both digital and physical phantoms indicated that the ANN curve achieves better performance than the bilinear transformation. The semi-quantitative analysis of the rat PET images showed that the ANN curve can reduce the inaccuracy caused by the attenuation effect from 13.75% to 4.43% in brain tissue, and from 23.26% to 9.41% in heart tissue. With the bilinear transformation, by contrast, the inaccuracy remained at 6.47% and 11.51% in brain and heart tissue, respectively.
Lei, Yuchuan; Chen, Zhenqian; Shi, Juan
2017-12-01
Numerical simulations of the condensation heat transfer of R134a in curved triangular microchannels with various curvatures are presented. The model is based on the volume-of-fluid (VOF) approach and user-defined routines that include mass transfer at the vapor-liquid interface and latent heat. A microgravity operating condition is assumed in order to highlight the role of surface tension. The predictive accuracy of the model is assessed by comparing the simulated results with available correlations in the literature. Both an increased mass flux and a decreased hydraulic diameter bring better heat transfer performance, while no obvious effect of the wall heat flux on the condensation heat transfer coefficient is observed. Changes in geometry and surface tension lead to a thinning of the condensate film at the sides of the channel and an accumulation of condensate at the corners. Better heat transfer performance is obtained in the curved triangular microchannels than in straight ones, and the performance can be further improved in curved triangular microchannels with larger curvatures. The minimum film thickness, where most of the heat transfer takes place, exists near the corners and moves toward the corners in curved triangular microchannels with larger curvatures.
International Nuclear Information System (INIS)
Kim, Woo Gon; Yin, Song Nan; Kim, Yong Wan
2008-01-01
Alloy 617 is a principal candidate alloy for high-temperature gas-cooled reactor (HTGR) components because of its high creep-rupture strength, its good corrosion behavior in simulated HTGR helium, and its sufficient workability. Various constitutive equations have been proposed to describe a creep strain-time curve, by Kachanov-Rabotnov, Andrade, Garofalo, Evans, Maruyama, and others. Among them, the Kachanov-Rabotnov (K-R) model has been used frequently, because it adequately accounts for both secondary creep, resulting from a balance between softening and hardening of the material, and tertiary creep, resulting from the appearance and acceleration of internal or external damage processes. For nickel-base alloys, it has been reported that tertiary creep may begin at a low strain and may govern the total creep deformation; the creep curve of nickel-base Alloy 617 should therefore be predicted well by the K-R model, which can reflect tertiary creep. In this paper, the long-term creep curves of Alloy 617 were predicted using the nonlinear least-squares fitting (NLSF) method with the K-R model. A modified K-R model was introduced to fit the full creep curves well, and the values of the λ and K parameters in the modified K-R model were obtained with stresses
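The NLSF step can be sketched as below, using one common closed form of a K-R-type creep curve (secondary rate, rupture time, and a shape parameter controlling the tertiary upswing). This particular closed form and every parameter value are assumptions for illustration; they are not the modified model or the Alloy 617 data from the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def kr_creep(t, eps_rate, t_r, lam):
    """One common closed form of a Kachanov-Rabotnov-type creep curve:
    eps_rate = secondary (minimum) creep rate, t_r = rupture time,
    lam = shape parameter controlling the tertiary upswing."""
    return eps_rate * t_r * lam * (1.0 - (1.0 - t / t_r) ** (1.0 / lam))

# Synthetic "measured" curve; parameters are illustrative, not Alloy 617 data.
t = np.linspace(0.0, 900.0, 60)                            # time, h
eps = kr_creep(t, 1e-4, 1000.0, 3.0)
eps += np.random.default_rng(0).normal(0.0, 2e-4, t.size)  # measurement noise

# Nonlinear least-squares fit; bounds keep t_r beyond the last data point
popt, _ = curve_fit(kr_creep, t, eps, p0=[5e-5, 1200.0, 2.0],
                    bounds=([0.0, 901.0, 1.0], [1e-2, 5000.0, 20.0]))
eps_rate_fit, t_r_fit, lam_fit = popt
```

Because the tertiary stage is encoded in the shape parameter, a fit over the primary and secondary portions of a curve extrapolates the long-term behavior, which is the practical appeal of this model class.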
Modeling the CO2 emissions and energy saved from new energy vehicles based on the logistic-curve
International Nuclear Information System (INIS)
Tang, Bao-jun; Wu, Xiao-feng; Zhang, Xian
2013-01-01
The Chinese government has outlined plans for developing new energy vehicles (NEVs) to achieve energy conservation and emission reduction. This paper used a logistic curve to predict the market share of NEVs in the next decade, and then calculated the potential environmental benefits, per vehicle and in total, according to the IPCC (2006) report. The results indicated that NEVs are beneficial for achieving the above goals, particularly electric vehicles (EVs); however, they will have a limited impact in the short term. Finally, considering the empirical results and the Chinese context, this paper proposed corresponding recommendations. - Highlights: ► This paper predicted the number of vehicles in China. ► This paper used a logistic curve to predict the market share of NEVs. ► The potential environmental benefits per vehicle and in total were calculated. ► China's NEVs would produce more CO₂ than those of other countries
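The logistic-curve forecasting step can be sketched as follows: a three-parameter logistic (saturation level, growth rate, inflection year) is fitted to market-share observations and then evaluated at future years. All numbers here are synthetic, generated from an assumed logistic; they are not the paper's NEV data.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    """Logistic diffusion curve: saturation K, growth rate r, midpoint t0."""
    return K / (1.0 + np.exp(-r * (t - t0)))

# Illustrative market-share series (percent), generated from an assumed
# logistic with K = 20, r = 0.6, t0 = 6 (years since the base year).
t = np.arange(0, 13, dtype=float)
share = logistic(t, 20.0, 0.6, 6.0)

popt, _ = curve_fit(logistic, t, share, p0=[10.0, 0.3, 4.0])
K, r, t0 = popt
share_in_10_years = logistic(10.0, *popt)
```

In practice K is only well identified once the series has passed its inflection point, which is one reason short-term forecasts from early adoption data carry wide uncertainty.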
Directory of Open Access Journals (Sweden)
Yu Xiu-Juan
2007-10-01
Abstract. Background: The nucleotide compositional asymmetry between the leading and lagging strands in bacterial genomes has been the subject of intensive study in the past few years. Interestingly, almost all bacterial genomes exhibit the same kind of base asymmetry. This work aims to investigate the strand biases in the Chlamydia muridarum genome and to show the potential of the Z curve method for quantitatively differentiating genes on the leading and lagging strands. Results: The occurrence frequencies of bases in protein-coding genes of the C. muridarum genome were analyzed by the Z curve method. It was found that genes located on the two strands of replication have distinct base usages. According to their positions in the 9-D space spanned by the variables u1-u9 of the Z curve method, the K-means clustering algorithm can assign about 94% of genes to the correct strands, a few percent more than are correctly classified by K-means based on the RSCU. The base-usage and codon-usage analyses show that genes on the leading strand have more G than C and more T than A, particularly at the third codon position; for genes on the lagging strand the bias is reversed. The y component of the Z curves for the complete chromosome sequence shows that the excesses of G over C and of T over A are more remarkable in the C. muridarum genome than in other bacterial genomes, even without separating base and/or codon usages. Furthermore, for the genomes of Borrelia burgdorferi, Treponema pallidum, Chlamydia muridarum and Chlamydia trachomatis, in which distinct base and/or codon usages have been observed, closer phylogenetic distances are found compared with other bacterial genomes. Conclusion: The nature of the strand bias of base composition in C. muridarum is similar to that in most other bacterial genomes. However, the base-composition asymmetry between the leading and lagging strands in C. muridarum is more significant than that in
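The strand-bias signal the abstract describes (excess of G over C and of T over A along the sequence) can be visualized with cumulative skew curves, a simplified stand-in for the full Z curve components. The toy sequence below is invented: a stretch with leading-strand-like bias followed by its complement-like counterpart.

```python
def skew_curves(seq):
    """Cumulative base-disparity curves along a sequence:
    at position n, gc[n] = (#G - #C) and ta[n] = (#T - #A) in seq[:n+1]."""
    gc, ta = [], []
    g = c = t = a = 0
    for base in seq.upper():
        g += base == "G"; c += base == "C"
        t += base == "T"; a += base == "A"
        gc.append(g - c)
        ta.append(t - a)
    return gc, ta

# A leading-strand-like stretch (G > C, T > A) followed by a lagging-like one;
# the curves rise over the first half and fall back over the second.
gc, ta = skew_curves("GGTGTTGGTAGT" + "CCACAACCATCA")
```

A change of sign in the slope of such curves is the classic signature of the replication origin/terminus, which is what makes this representation useful for separating leading- and lagging-strand genes.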
M. Papathoma-Köhle
2016-01-01
The assessment of the physical vulnerability of elements at risk as part of the risk analysis is an essential aspect for the development of strategies and structural measures for risk reduction. Understanding, analysing and, if possible, quantifying physical vulnerability is a prerequisite for designing strategies and adopting tools for its reduction. The most common methods for assessing physical vulnerability are vulnerability matrices, vulnerability curves and vulnerab...
Czech Academy of Sciences Publication Activity Database
Nold, A.; Malijevský, Alexandr; Kalliadasis, S.
2011-01-01
Vol. 197, No. 1 (2011), pp. 185-191. ISSN 1951-6355. R&D Projects: GA AV ČR IAA400720710. Other grants: EPSRC (GB) EP/E046029; FP7 ITN (XE) 214919; ERC (XE) 247301. Institutional research plan: CEZ:AV0Z40720504. Keywords: wetting phenomena * curved substrates * theory. Subject RIV: CF - Physical; Theoretical Chemistry. Impact factor: 1.562, year: 2011
Directory of Open Access Journals (Sweden)
Ismed Jauhar
2016-03-01
Environmental change can give rise to disasters, whether natural or man-made. One effort to prevent disasters is to build a system able to provide information about the status of the surrounding environment. Advances in sensor systems make it possible to build a system that supplies real-time status of environmental conditions with good security. This study created a system that supplies status data on environmental conditions, particularly on bridges, using a Ubiquitous Sensor Network. An accelerometer is used as the sensor to detect vibrations. Data transfer between the sensors and the server uses the ZigBee communication protocol, with the data communication secured by the Elliptic Curve Integrated Encryption Scheme and key agreement by Elliptic Curve Menezes-Qu-Vanstone. Test results show a communication range limit of 55 meters, computation times for encryption and decryption of 97 and 42 seconds, and extra time for the key exchange performed at the beginning of communication. Keywords: Ubiquitous Sensor Network, Accelerometer, ZigBee, Elliptic Curve Menezes-Qu-Vanstone
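Elliptic Curve Menezes-Qu-Vanstone is a key-agreement protocol built on elliptic-curve scalar multiplication. The sketch below shows the underlying Diffie-Hellman-style agreement on a textbook-sized curve; it is a toy stand-in for the primitive ECMQV refines, with deliberately tiny parameters that are in no way secure.

```python
# Toy elliptic-curve Diffie-Hellman over y^2 = x^3 + 2x + 2 (mod 17),
# generator (5, 1) -- textbook-sized numbers, NOT secure.
P, A = 17, 2

def ec_add(p1, p2):
    """Add two points on the curve; None is the point at infinity."""
    if p1 is None: return p2
    if p2 is None: return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None
    if p1 == p2:
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, P) % P
    x3 = (lam * lam - x1 - x2) % P
    return (x3, (lam * (x1 - x3) - y1) % P)

def ec_mul(k, pt):
    """Scalar multiplication by double-and-add."""
    acc = None
    while k:
        if k & 1:
            acc = ec_add(acc, pt)
        pt = ec_add(pt, pt)
        k >>= 1
    return acc

G = (5, 1)
a_priv, b_priv = 3, 7                   # the two parties' secret scalars
a_pub, b_pub = ec_mul(a_priv, G), ec_mul(b_priv, G)
shared_a = ec_mul(a_priv, b_pub)        # a * (b * G)
shared_b = ec_mul(b_priv, a_pub)        # b * (a * G)
# both parties derive the same shared point without transmitting a secret
```

MQV augments this exchange with implicit authentication (each party's static and ephemeral keys are mixed into the shared point), which is why it suits sensor nodes that cannot afford a separate signature step.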
International Nuclear Information System (INIS)
Shi, F; Tian, Z; Jia, X; Jiang, S; Zarepisheh, M; Cervino, L
2014-01-01
Purpose: In treatment-plan optimization for Intensity Modulated Radiation Therapy (IMRT), after a plan is initially developed by a dosimetrist, the attending physician evaluates its quality and often would like to improve it. Rather than having the dosimetrist implement the improvements, it is desirable for the physician to modify the plan directly and efficiently, for a more streamlined and effective workflow. In this project, we developed an interactive optimization system for physicians to conveniently and efficiently fine-tune iso-dose curves. Methods: An interactive interface is developed under C++/Qt. The physician first examines the iso-dose lines, then picks an iso-dose curve to be improved and drags it to a more desired configuration using a computer mouse or touchpad. Once the mouse is released, a voxel-based optimization engine is launched. The weighting factors corresponding to voxels between the iso-dose lines before and after the dragging are modified, and the underlying algorithm takes these factors as input to re-optimize the plan in near real time on a GPU platform, yielding a new plan best matching the physician's intent. The re-optimized DVHs and iso-dose curves are then updated for the next iteration of modifications. This process is repeated until a plan satisfactory to the physician is achieved. Results: We have tested this system on a series of IMRT plans. Results indicate that our system provides physicians with an intuitive and efficient tool to edit the iso-dose curves according to their preference. The input information is used to guide plan re-optimization, which is achieved in near real time using our GPU-based optimization engine. Typically, a satisfactory plan can be developed by a physician in a few minutes using this tool. Conclusion: With our system, physicians are able to manipulate iso-dose curves according to their preferences. Preliminary results demonstrate the feasibility and effectiveness of this tool.
Utrobin, V. P.; Wongwathanarat, A.; Janka, H.-Th.; Müller, E.
2017-09-01
Type II-plateau supernovae (SNe IIP) are the most numerous subclass of core-collapse SNe originating from massive stars. In the framework of the neutrino-driven explosion mechanism, we study the properties of the SN outburst for a red supergiant progenitor model and compare the corresponding light curves with observations of the ordinary Type IIP SN 1999em. Three-dimensional (3D) simulations of (parametrically triggered) neutrino-driven explosions are performed with the (explicit, finite-volume, Eulerian, multifluid hydrodynamics) code Prometheus, using a presupernova model of a 15 M⊙ star as initial data. On approaching homologous expansion, the hydrodynamic and composition variables of the 3D models are mapped to a spherically symmetric configuration, and the simulations are continued with the (implicit, Lagrangian, radiation hydrodynamics) code Crab to follow the evolution of the blast wave during the SN outburst. Our 3D neutrino-driven explosion model with an explosion energy of about 0.5×10^51 erg produces 56Ni in rough agreement with the amount deduced from fitting the radioactively powered light-curve tail of SN 1999em. The considered presupernova model, 3D explosion simulations, and light-curve calculations can explain the basic observational features of SN 1999em, except for those connected to the presupernova structure of the outer stellar layers. Our 3D simulations show that the distribution of 56Ni-rich matter in velocity space is asymmetric, with a strong dipole component that is consistent with the observations of SN 1999em. The monotonic decline in luminosity from the plateau to the radioactive tail in ordinary SNe IIP is a manifestation of the intense turbulent mixing at the He/H composition interface.
Barbetta, Silvia; Moramarco, Tommaso; Perumal, Muthiah
2017-11-01
Quite often the discharge at a site is estimated using the rating curve developed for that site, whose development requires river flow measurements, which are costly, tedious, and dangerous during severe floods. To circumvent the conventional rating-curve development approach, Perumal et al. in 2007 and 2010 applied the Variable Parameter Muskingum Stage-hydrograph (VPMS) routing method for developing stage-discharge relationships, especially at those ungauged river sites where stage measurements and details of section geometry are available but discharge measurements are not made. The VPMS method enables rating curves to be estimated at ungauged river sites with acceptable accuracy, but its application is subject to the limitation of negligible lateral flow within the routing reach. To overcome this limitation, this study proposes an extension of the VPMS method, henceforth known as the VPMS-Lin method, enabling streamflow assessment even when significant lateral inflow occurs along the routing reach. The lateral inflow is estimated through the continuity equation expressed in characteristic form, as advocated by Barbetta et al. in 2012. The VPMS-Lin method is tested on two rivers characterized by different geometric and hydraulic properties: 1) a 50 km reach of the Tiber River in central Italy and 2) a 73 km reach of the Godavari River in peninsular India. The study demonstrates that both the upstream and downstream discharge hydrographs are well reproduced, with a root-mean-square error equal on average to about 35 and 1700 m³ s⁻¹ for the Tiber River and the Godavari River case studies, respectively. Moreover, simulation studies carried out on a stretch of the Tiber River using the one-dimensional hydraulic model MIKE11 and the VPMS-Lin model demonstrate the accuracy of the VPMS-Lin model, which besides enabling the estimation of streamflow, also enables the estimation of reach-averaged
Energy Technology Data Exchange (ETDEWEB)
Chiriac, Horia [National Institute of Research and Development for Technical Physics, 47 Mangeron Boulevard, 700050, Iasi (Romania); Lupu, Nicoleta [National Institute of Research and Development for Technical Physics, 47 Mangeron Boulevard, 700050, Iasi (Romania); Stoleriu, Laurentiu [Al. I. Cuza University, Department of Solid State and Theoretical Physics, Blvd. Carol I, 11, 700506, Iasi (Romania)]. E-mail: lstoler@uaic.ro; Postolache, Petronel [Al. I. Cuza University, Department of Solid State and Theoretical Physics, Blvd. Carol I, 11, 700506, Iasi (Romania); Stancu, Alexandru [Al. I. Cuza University, Department of Solid State and Theoretical Physics, Blvd. Carol I, 11, 700506, Iasi (Romania)
2007-09-15
In this paper we present the results of applying the first-order reversal curves (FORC) diagram experimental method to the analysis of the magnetization processes of NdFeB-based permanent magnets. The FORC diagrams for this kind of exchange-spring magnet show the existence of two magnetic phases: a soft magnetic phase and a hard magnetic one. Micromagnetic modeling is used to validate the hypotheses regarding the origin of the different features of the experimental FORC diagrams.
International Nuclear Information System (INIS)
Jesenik, M.; Gorican, V.; Trlep, M.; Hamler, A.; Stumberger, B.
2006-01-01
Many magnetic materials are anisotropic. In the 3D finite-element-method calculation, the anisotropy of the material is taken into account: an anisotropic magnetic material is described by magnetization curves for different magnetization directions. A 3D transient calculation of the rotational magnetic field in the circular sample of a round rotational single-sheet tester, considering eddy currents, is made and compared with measurements to verify the correctness of the method and to analyze the magnetic field in the sample.
Multiphasic growth curve analysis.
Koops, W.J.
1986-01-01
Application of a multiphasic growth curve is demonstrated with 4 data sets adopted from the literature. The growth curve used is a summation of n logistic growth functions. Human height growth curves of this type are known as "double logistic" (n = 2) and "triple logistic" (n = 3) growth curves (Bock
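The summation-of-logistics model can be sketched directly: each phase contributes its own asymptote, rate, and inflection time, and the phases simply add. The "double logistic" parameters below are invented for illustration (a childhood phase plus an adolescent growth spurt), not fitted values from the cited data sets.

```python
import numpy as np

def multiphasic(t, phases):
    """Sum-of-logistics growth curve: each phase is a tuple
    (asymptote a_i, rate b_i, inflection time c_i), contributing
    a_i / (1 + exp(-b_i * (t - c_i))) to the total."""
    t = np.asarray(t, dtype=float)
    return sum(a / (1.0 + np.exp(-b * (t - c))) for a, b, c in phases)

# Illustrative "double logistic" (n = 2) height-style curve:
phases = [(150.0, 0.35, 2.0),   # early-childhood phase (cm, 1/yr, yr)
          (30.0, 0.90, 13.0)]   # adolescent growth-spurt phase
age = np.array([0.0, 5.0, 10.0, 15.0, 20.0])
height = multiphasic(age, phases)
# height rises monotonically toward the combined asymptote of 180 cm
```

Each extra phase adds three parameters, so n is usually kept small (2 or 3, as in the abstract) to keep the fit identifiable.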
International Nuclear Information System (INIS)
Amendolia, S.R.; Badelek, B.; Bertolucci, E.; Bettoni, D.; Bizzeti, A.; Bosisio, L.; Bradaschia, C.; Dell'Orso, M.; Foa, L.; Focardi, E.; Giannetti, P.; Giazotto, A.; Giorgi, M.A.; Marrocchesi, P.S.; Menzione, A.; Ristori, L.; Scribano, A.; Tenchini, R.; Tonelli, G.; Triggiani, G.; Frank, S.G.F.; Harvey, J.; Menasce, D.; Meroni, E.; Moroni, L.; Milan Univ.
1984-01-01
The electromagnetic form factor of the pion has been studied in the time-like region by measuring σ(e⁺e⁻ → π⁺π⁻) normalized to σ(e⁺e⁻ → μ⁺μ⁻). Results have been obtained for q² down to the physical threshold. (orig.)
The Cost Monitoring of the Construction Machinery with Using the Stochastic Progress-Based S-curves
Directory of Open Access Journals (Sweden)
Krajňák Marek
2015-06-01
The contribution presents a methodology for evaluating at-completion project performance status. Accurate cost and schedule forecasts are difficult to generate when considering the impact of events such as unforeseen cost changes, material delays, scope deviation, changes to the project execution plan, and poor subcontractor performance. In reality, the original estimate may be considered the first project forecast, and at the point of project completion the latest updated estimate (the last forecast) and the actual amount expended should be the same. Final project performance is determined by comparing the planned budget and project duration with the forecasted final budget and elapsed time. The stochastic S-curve methodology permits objective evaluation of project performance without the limitations inherent in a deterministic approach. This paper uses the stochastic S-curve to monitor cost and time consumption in the operation of construction machines. The contribution presents a partial outcome of the dissertation thesis "Interactive tools for resource optimization in construction".
Energy Technology Data Exchange (ETDEWEB)
Zambelli, Monica de S.; Cicogna, Marcelo A.; Soares, Secundino [Universidade Estadual de Campinas, SP (Brazil). Faculdade de Engenharia Eletrica
2006-07-01
This work presents a long-term hydrothermal scheduling operating policy based on the concept of storage guide curves. According to this policy, at each stage of the planning period the amount of water discharged by each hydrothermal unit must keep its reservoir at levels pre-determined by curves obtained from an optimization method. The performance of this operating policy is analyzed by simulation with historical inflow data, considering a system constituted by a single hydro plant and a composite system constituted by hydro plants in cascade, adopting as the performance criterion the minimization of the expected operating cost. The results demonstrate that, although simple and transparent, this operating policy presents competitive performance in long-term hydrothermal scheduling. (author)
Curved Microneedle Array-Based sEMG Electrode for Robust Long-Term Measurements and High Selectivity
Directory of Open Access Journals (Sweden)
Minjae Kim
2015-07-01
Surface electromyography (sEMG) is widely used in many fields to infer human intention. However, conventional electrodes are not appropriate for long-term measurements and are easily influenced by the environment, so the range of applications of sEMG is limited. In this paper, we propose a flexible band-integrated, curved microneedle array electrode for robust long-term measurements, high selectivity, and easy applicability. Signal quality, in terms of long-term usability and sensitivity to perspiration, was investigated, and motion-discriminating performance was evaluated. The results show that the proposed electrode is robust to perspiration and can maintain high-quality measurement for over 8 h. The proposed electrode also has higher selectivity for motion than a commercial wet electrode and a dry electrode.
Energy Technology Data Exchange (ETDEWEB)
Ryss, J.M.; Bakhtin, J.G.; Chamaev, V.N.; Panteleimonov, V.M.
1976-03-30
A device is described for geophysical prospecting of ore deposits, wherein the supply circuit is made up of a direct-current source provided with apparatus for changing current intensity, a main current-carrying electrode having electrical contact with an ore body, and an auxiliary current-carrying electrode electrically connected with the medium enclosing said ore body. Connected in said supply circuit at the main current carrying electrode is a current intensity detector connected whereto is a series circuit made up of a compensating voltage generator, a summing unit and a unit for measuring the potentials of electrochemical reactions on the surface of the ore body. A recording unit is connected to the unit for setting values of the potentials of electrochemical reactions and to record in the form of polarization curves the relationships between the set potentials of electrochemical reactions on the surface of the ore body and the currents flowing through the surface of that body. (DDA)
Su, Chiu-Wen; Yen, Amy Ming-Fang; Lai, Hongmin; Chen, Hsiu-Hsi; Chen, Sam Li-Sheng
2017-12-01
The accuracy of a prediction model for periodontal disease using the community periodontal index (CPI) has been assessed with the area under the receiver operating characteristic (AUROC) curve. How an uncalibrated CPI, as measured by general dentists trained by periodontists in a large epidemiologic study, affects the performance of such a prediction model has not yet been researched. A two-stage design was used: first, a validation study calibrated the CPI between a senior periodontal specialist and the trained general dentists who measured CPIs in the main study of a nationwide survey. A Bayesian hierarchical logistic regression model was then applied to estimate the non-updated and updated clinical weights used for building risk scores. How the calibrated CPI affected performance of the updated prediction model was quantified by comparing AUROC curves between the original and updated models. Estimates of CPI calibration obtained from the validation study were 66% sensitivity and 85% specificity. After updating, the clinical weights of each predictor were inflated, and the risk score for the highest risk category rose from 434 to 630. The update improved the AUROC from 62.6% (95% confidence interval [CI]: 61.7% to 63.6%) for the non-updated model to 68.9% (95% CI: 68.0% to 69.6%) for the updated one, a statistically significant difference. An improved prediction model was thus demonstrated for periodontal disease as measured by the calibrated CPI derived from a large epidemiologic survey.
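As an aside on the metric itself: the AUROC used above is equivalent to the Mann-Whitney U statistic, which a minimal sketch can compute directly (the risk scores below are hypothetical, not the study's data):

```python
def auroc(pos_scores, neg_scores):
    """Area under the ROC curve via the Mann-Whitney U statistic: the
    probability that a randomly chosen positive case gets a higher risk
    score than a randomly chosen negative case (ties count one half)."""
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical risk scores for diseased (positive) and healthy (negative) subjects
diseased = [630, 434, 520, 610]
healthy = [200, 434, 150, 300]
print(round(auroc(diseased, healthy), 3))   # prints 0.969
```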
Directory of Open Access Journals (Sweden)
Janusz Charatonik
1991-11-01
Results concerning contractibility of curves (equivalently, of dendroids) are collected and discussed in the paper. Interrelations between various conditions which are either sufficient or necessary for a curve to be contractible are studied.
Analysis of the Tax Burden in Romania based on the Laffer Curve in the Period 1991-2009
Directory of Open Access Journals (Sweden)
Gabriela DOBROTA
2012-04-01
Fiscal pressure is subject to limits of affordability for taxpayers. These limits are imposed by the reactions of taxpayers, who can resist increases in compulsory levies with evasion, fraud, reduced productive activity or even riots. While up to a certain point taxes are paid voluntarily by the honest taxpayer, once they exceed certain limits of endurance, behaviour emerges that seriously damages the state's ability to collect these revenues: the taxpayer constantly tries to avoid paying tax, hoping for a reduction in the tax burden. Increases and decreases in the size of the tax burden are the result of the economic and social role of the state. Analysis of state intervention in the economy led to a new school of liberal economic thinking, an approach associated with the American economist Arthur Laffer. The aim of this paper is an empirical analysis of the correlation between the tax pressure rate and tax revenues in Romania in the period 1991-2009, using the methodology of the Laffer curve.
Directory of Open Access Journals (Sweden)
M. E. Shimpi
2012-01-01
Efforts have been directed to study and analyze the squeeze film performance between rotating transversely rough curved porous annular plates in the presence of a magnetic fluid lubricant, considering the effect of elastic deformation. A stochastic random variable with nonzero mean, variance, and skewness characterizes the random roughness of the bearing surfaces. With the aid of suitable boundary conditions, the associated stochastically averaged Reynolds equation is solved to obtain the pressure distribution, which in turn yields the load-carrying capacity. The graphical representations establish that the transverse roughness, in general, adversely affects the performance characteristics, whereas the magnetization registers a relatively improved performance. It is found that deformation reduces the load-carrying capacity, which is decreased further by porosity. This investigation indicates that the adverse effects of porosity, standard deviation and deformation can be compensated to a certain extent by the positive effect of the magnetic fluid lubricant in the case of negatively skewed roughness, by choosing the rotational inertia and the aspect ratio, especially for a suitable ratio of curvature parameters.
Indian Academy of Sciences (India)
In this article some Peano curves are exhibited and some of their recent applications are discussed. A C++ program to draw the Hilbert curve approximately is given. 1. Introduction. A 'continuous curve' in the plane is usually defined as the path traced by a moving point (x(t), y(t)) as t runs over an interval of the real line, ...
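The article's C++ program is not reproduced here; as a hedged stand-alone sketch, the widely used iterative index-to-coordinate mapping for the Hilbert curve can be written as:

```python
def hilbert_d2xy(order, d):
    """Map index d along the Hilbert curve to an (x, y) cell on the
    2**order x 2**order grid (standard iterative algorithm)."""
    x = y = 0
    s = 1
    while s < 2 ** order:
        rx = 1 & (d // 2)
        ry = 1 & (d ^ rx)
        if ry == 0:                     # rotate the quadrant
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        x += s * rx
        y += s * ry
        d //= 4
        s *= 2
    return x, y

# Consecutive indices visit edge-adjacent cells and fill the whole grid
pts = [hilbert_d2xy(3, d) for d in range(4 ** 3)]
```

Joining the cells of `pts` in order draws the order-3 approximation of the space-filling curve.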
Indian Academy of Sciences (India)
Institute, Calcutta. Apart from mathematics, he likes painting and reading. Unlike most others he dislikes computers. Ritabrata Munshi. Introduction. In this two-part article we will consider one of the classical theorems of mathematics, the Jordan curve theorem. It states that a simple closed curve (i.e., a closed curve which.
Behets, Jonas; Declerck, Priscilla; Delaedt, Yasmine; Verelst, Lieve; Ollevier, Frans
2006-12-01
Real-time polymerase chain reaction melting curve analysis (MCA) allows differentiation of several free-living amoebae species. Distinctive characteristics were found for Naegleria fowleri, N. lovaniensis, N. australiensis, N. gruberi, Hartmannella vermiformis, and Willaertia magna. Species specificity of the amplicons was confirmed using agarose gel electrophoresis and sequence-based approaches. Amplification efficiency ranged from 91% to 98%, indicating the quantitative potential of the assay. This MCA approach can be used for quantitative detection of free-living amoebae after cultivation but also as a culture-independent detection method.
International Nuclear Information System (INIS)
Haverkamp, U.; Wiezorek, C.; Poetter, R.
1990-01-01
Lyoluminescence dosimetry is based upon light emission during dissolution of previously irradiated dosimetric materials. The lyoluminescence signal is expressed in the dissolution glow curve. These curves begin, depending on the dissolution system, with a high peak followed by an exponentially decreasing intensity. System parameters that influence the shape of the dissolution glow curve are, for example, injection speed, temperature and pH value of the solution, and the design of the dissolution cell. The initial peak does not significantly correlate with the absorbed dose; it is mainly an effect of the injection. The decay of the curve consists of two exponential components: one fast and one slow. The components depend on the absorbed dose and the dosimetric materials used. In particular, the slow component correlates with the absorbed dose. In contrast to the fast component, the argument of the exponential function of the slow component is independent of the dosimetric materials investigated: trehalose, glucose and mannitol. The maximum value following the peak of the curve, and the integral light output, are measures of the absorbed dose. The reason for the different light outputs of various dosimetric materials after irradiation with the same dose is their differing solubility. The character of the dissolution glow curves is the same following irradiation with photons, electrons or neutrons. (author)
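The post-peak decay described above, a fast plus a slow exponential component, can be written as a simple model (the amplitudes and rate constants below are illustrative assumptions, not fitted values from the paper):

```python
import math

def glow_curve(t, a_fast, k_fast, a_slow, k_slow):
    """Post-peak dissolution glow intensity: a fast plus a slow exponential
    decay; the slow component carries the dose information."""
    return a_fast * math.exp(-k_fast * t) + a_slow * math.exp(-k_slow * t)

# Illustrative parameters: by t = 10 the fast component has died away,
# leaving essentially only the dose-dependent slow component.
total = glow_curve(10.0, 100.0, 1.0, 5.0, 0.01)
```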
Magnetism in curved geometries
Streubel, Robert
Deterministically bending and twisting two-dimensional structures in the three-dimensional (3D) space provide means to modify conventional or to launch novel functionalities by tailoring curvature and 3D shape. The recent developments of 3D curved magnetic geometries, ranging from theoretical predictions over fabrication to characterization using integral means as well as advanced magnetic tomography, will be reviewed. Theoretical works predict a curvature-induced effective anisotropy and effective Dzyaloshinskii-Moriya interaction, resulting in a wealth of novel effects including magnetochiral effects (chirality symmetry breaking) and topologically induced magnetization patterning. The remarkable development of nanotechnology, e.g. preparation of high-quality extended thin films, nanowires and frameworks via chemical and physical deposition as well as 3D nano printing, has granted first insights into the fundamental properties of 3D shaped magnetic objects. Optimizing magnetic and structural properties of these novel 3D architectures demands new investigation methods, particularly those based on vector tomographic imaging. Magnetic neutron tomography and electron-based 3D imaging, such as electron holography and vector field electron tomography, are well-established techniques to investigate macroscopic and nanoscopic samples, respectively. At the mesoscale, the curved objects can be investigated using the novel method of magnetic X-ray tomography. In spite of experimental challenges to address the appealing theoretical predictions of curvature-induced effects, those 3D magnetic architectures have already proven their application potential for life sciences, targeted delivery, realization of 3D spin-wave filters, and magneto-encephalography devices, to name just a few. DOE BES MSED (DE-AC02-05-CH11231).
Fuchs, Susanne I; Junge, Sibylle; Ellemunter, Helmut; Ballmann, Manfred; Gappa, Monika
2013-05-01
Volumetric capnography, reflecting the course of CO2 exhalation, is used to assess ventilation inhomogeneity. Calculation of the slope of expiratory phase 3 and the capnographic index (KPIv) from expirograms allows quantification of the extent and severity of small airway impairment. However, technical limitations have hampered more widespread use of this technique. Using expiratory molar mass-volume-curves sampled with a handheld ultrasonic flow sensor during tidal breathing is a novel approach to extract similar information from expirograms in a simpler manner, possibly qualifying as a screening tool for clinical routine. The aim of the present study was to evaluate calculation of the KPIv based on molar mass-volume-curves sampled with an ultrasonic flow sensor in patients with CF and controls, by assessing feasibility, reproducibility and comparability with the Lung Clearance Index (LCI) derived from multiple breath washout (MBW) used as the reference method. Measurements were performed in patients with CF and healthy controls during a single test occasion using the EasyOne Pro, MBW Module (ndd Medical Technologies, Switzerland). Capnography and MBW were performed in 87/96 patients with CF and 38/42 controls, with a success rate of 90.6% for capnography. Mean age (range) was 12.1 (4-25) years. Mean (SD) KPIv was 6.94 (3.08) in CF and 5.10 (2.06) in controls (p=0.001). Mean LCI (SD) was 8.0 (1.4) in CF and 6.2 (0.4) in controls. Calculation of the KPIv from molar mass-volume-curves is feasible. KPIv is significantly different between patients with CF and controls and correlates with the LCI. However, individual data revealed a relevant overlap between patients and controls requiring further evaluation before this method can be recommended for clinical use. Copyright © 2012 European Cystic Fibrosis Society. Published by Elsevier B.V. All rights reserved.
Directory of Open Access Journals (Sweden)
Hamed Bashirpour
2018-03-01
In wireless sensor networks (WSNs), users can use broadcast authentication mechanisms to connect to the target network and disseminate their messages within the network. Since data transfer in sensor networks is wireless, attackers can easily eavesdrop on deployed sensor nodes and the data sent between them, or modify the content of eavesdropped data and inject false data into the sensor network. Hence, the implementation of message authentication mechanisms (to prevent modification and injection of messages) in wireless sensor networks is essential. In this paper, we present an improved protocol based on elliptic curve cryptography (ECC) to accelerate authentication of multi-user message broadcasting. In comparison with previous ECC-based schemes, the complexity and computational overhead of the proposed scheme are significantly decreased. The proposed scheme also supports user anonymity, an important property in broadcast authentication schemes for WSNs to preserve user privacy and untraceability.
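The group arithmetic underlying any such ECC-based scheme (a sketch of the primitive only, not the paper's protocol) can be illustrated on a small textbook curve, y² = x³ + 2x + 2 over F₁₇ with base point G = (5, 1) of order 19:

```python
def ec_add(P, Q, a, p):
    """Affine point addition on y^2 = x^3 + a*x + b over F_p (None = infinity)."""
    if P is None:
        return Q
    if Q is None:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                                        # P + (-P) = infinity
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p   # tangent slope
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p          # chord slope
    x3 = (lam * lam - x1 - x2) % p
    return x3, (lam * (x1 - x3) - y1) % p

def ec_mul(k, P, a, p):
    """Double-and-add scalar multiplication k*P."""
    R = None
    while k:
        if k & 1:
            R = ec_add(R, P, a, p)
        P = ec_add(P, P, a, p)
        k >>= 1
    return R

# Toy parameters (common textbook example): y^2 = x^3 + 2x + 2 over F_17
a, p, G = 2, 17, (5, 1)
```

Real deployments use standardized curves with ~256-bit fields; the toy field only makes the group law visible.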
Directory of Open Access Journals (Sweden)
A. M. Hashemi
2000-01-01
Regionalized and at-site flood frequency curves exhibit considerable variability in their shapes, but the factors controlling this variability (other than sampling effects) are not well understood. An application of the Monte Carlo simulation-based derived distribution approach is presented in this two-part paper to explore the influence of climate, described by simulated rainfall and evapotranspiration time series, and of basin factors on the flood frequency curve (ffc). The sensitivity analysis conducted in the paper should not be interpreted as reflecting possible climate changes, but the results can indicate the changes to which the flood frequency curve might be sensitive. A single-site Neyman-Scott point process model of rainfall, with convective and stratiform cells (Cowpertwait, 1994; 1995), has been employed to generate synthetic rainfall inputs to a rainfall-runoff model. The time series of the potential evapotranspiration (ETp) demand has been represented through an AR(n) model with a seasonal component, while a simplified version of the ARNO rainfall-runoff model (Todini, 1996) has been employed to simulate the continuous discharge time series. All these models have been parameterised in a realistic manner using observed data and results from previous applications, to obtain 'reference' parameter sets for a synthetic case study. Subsequently, perturbations to the model parameters have been made one at a time and the sensitivities of the generated annual maximum rainfall and flood frequency curves (unstandardised, and standardised by the mean) have been assessed. Overall, the sensitivity analysis described in this paper suggests that the soil moisture regime, and, in particular, the probability distribution of soil moisture content at the storm arrival time, can be considered as a unifying link between the perturbations to the several parameters and their effects on the standardised and unstandardised ffcs, thus revealing the
Directory of Open Access Journals (Sweden)
Anup Kumar Maurya
2017-10-01
To improve the quality of service and reduce the possibility of security attacks, a secure and efficient user authentication mechanism is required for Wireless Sensor Networks (WSNs) and the Internet of Things (IoT). Session key establishment between the sensor node and the user is also required for secure communication. In this paper, we perform a security analysis of A. K. Das's user authentication scheme (given in 2015), Choi et al.'s scheme (given in 2016), and Park et al.'s scheme (given in 2016). The analysis shows that their schemes are vulnerable to various attacks such as user impersonation, sensor node impersonation and attacks based on legitimate users. Based on the cryptanalysis of these existing protocols, we propose a secure and efficient authenticated session key establishment protocol which ensures various security features and overcomes the drawbacks of the existing protocols. The formal and informal security analysis indicates that the proposed protocol withstands the various security vulnerabilities involved in WSNs. Automated validation using the AVISPA and Scyther tools ensures the absence of security attacks in our scheme. Logical verification using the Burrows-Abadi-Needham (BAN) logic confirms the correctness of the proposed protocol. Finally, a comparative analysis of computational overhead and security features against other existing protocols indicates that the proposed user authentication system is secure and efficient. In future, we intend to implement the proposed protocol in real-world applications of WSNs and IoT.
Method of construction spatial transition curve
Directory of Open Access Journals (Sweden)
S.V. Didanov
2013-04-01
Purpose. The movement of rail transport (speed of rolling stock, traffic safety, etc.) is largely dependent on the quality of the track. A special role is played by the transition curve, which ensures a smooth transition from a linear to a circular section of road. The article deals with modeling of a spatial transition curve based on a parabolic distribution of curvature and torsion, continuing research conducted by the authors on the spatial modeling of curved contours. Methodology. The spatial transition curve is constructed by numerical methods for solving nonlinear integral equations, where the initial data are the coordinates of the starting and ending points of the future curve, together with the inclination of the tangent and the deviation of the curve from the tangent plane at these points. The system solved by the numerical method consists of the partial derivatives of the equations with respect to the unknown parameters of the law of change of torsion and the length of the transition curve. Findings. The parametric equations of the spatial transition curve are calculated by finding the unknown coefficients of the parabolic distribution of curvature and torsion, as well as the spatial length of the transition curve. Originality. A method for constructing the spatial transition curve is devised; based on it, software performs geometric modeling of spatial transition curves of railway track with specified deviations of the curve from the tangent plane. Practical value. The resulting curve can be applied in any sector of the economy where it is necessary to ensure a smooth transition from a linear to a circular section of a curved spatial bypass. Examples include transition curves in the construction of railway lines, roads, pipes, profiles, flat sections of the working blades of turbines and compressors, ships, planes, cars, etc.
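A minimal numerical sketch of the underlying idea (not the authors' method): prescribe curvature and torsion laws along the arc length, here assumed parabolic, and integrate the Frenet-Serret equations to trace the spatial curve:

```python
def transition_curve(kappa, tau, L, steps=20000):
    """Build a spatial curve of length L by Euler-integrating the
    Frenet-Serret equations: r' = T, T' = k N, N' = -k T + t B, B' = -t N."""
    ds = L / steps
    r, T = [0.0, 0.0, 0.0], [1.0, 0.0, 0.0]
    N, B = [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]
    for i in range(steps):
        k, t = kappa(i * ds), tau(i * ds)
        r = [r[j] + T[j] * ds for j in range(3)]
        T, N, B = (                       # frame updated from old T, N, B
            [T[j] + k * N[j] * ds for j in range(3)],
            [N[j] + (-k * T[j] + t * B[j]) * ds for j in range(3)],
            [B[j] - t * N[j] * ds for j in range(3)],
        )
    return r                              # end point of the curve

# Assumed parabolic laws rising from zero to end values k1, t1 (illustrative)
L, k1, t1 = 100.0, 1.0 / 500.0, 1.0 / 5000.0
end = transition_curve(lambda s: k1 * (s / L) ** 2, lambda s: t1 * (s / L) ** 2, L)
```

The paper instead solves for the parabolic coefficients and curve length from boundary data; the sketch only shows the forward construction.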
Chen, Hui; Cai, Li-Xun
2018-04-01
Based on the power-law stress-strain relation and the equivalent energy principle, theoretical equations for converting between Brinell hardness (HB), Rockwell hardness (HR), and Vickers hardness (HV) were established. Combining the pre-existing relation between the tensile strength (σb) and the Hollomon parameters (K, N), theoretical conversions between hardness (HB/HR/HV) and tensile strength (σb) were obtained as well. In addition, to confirm the pre-existing σb-(K, N) relation, a large number of uniaxial tensile tests were conducted on various ductile materials. Finally, to verify the theoretical conversions, statistical data listed in ASTM and ISO standards were used to test the robustness of the converting equations over a range of hardness and tensile strength values. The results show that both the hardness conversions and the hardness-strength conversions calculated from the theoretical equations accord well with the standard data.
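One textbook relation of this kind, consistent with the power-law model though not necessarily the authors' exact equations: for the Hollomon law σ = Kε^N, the Considère criterion places necking at true strain ε = N, giving an engineering tensile strength σb = K·N^N·e^(−N). A hedged sketch:

```python
import math

def tensile_strength(K, N):
    """Engineering UTS implied by the Hollomon law sigma = K * eps**N with
    necking at true strain eps = N (Considere criterion)."""
    return K * N ** N * math.exp(-N)

# Hypothetical Hollomon parameters: K = 500 MPa, N = 0.2
sigma_b = tensile_strength(500.0, 0.2)   # ~ 296.7 MPa
```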
Directory of Open Access Journals (Sweden)
M.E. Shimpi
2012-06-01
This investigation analyzes the behaviour of a magnetic fluid based squeeze film between two rotating, transversely rough, porous circular plates, taking bearing deformation into consideration. The results, presented in graphical form, indicate that transverse surface roughness has an adverse effect on the performance characteristics, while the magnetic fluid lubricant yields an improved performance. It is found that the combined effect of rotation and deformation significantly reduces the load carrying capacity. However, this investigation establishes that the adverse effects of porosity, deformation and standard deviation can be compensated to some extent by the positive effect of the magnetic fluid lubricant in the case of negatively skewed roughness, by choosing curvature parameters. To compensate, the rotational inertia needs to have smaller values.
Indian Academy of Sciences (India)
R. C. Mittal has been working in mathematics and computer applications for the last 20 years. He has been a National Science Talent awardee of NCERT in mathematics. Space-filling Curves. In this article some Peano curves are exhibited and some of their recent applications are discussed. A C++ program to draw the ...
Simulating Supernova Light Curves
Energy Technology Data Exchange (ETDEWEB)
Even, Wesley Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Dolence, Joshua C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2016-05-05
This report discusses supernova light-curve simulations. A brief review of supernovae, the basics of supernova light curves, simulation tools used at LANL, and supernova results are included. Further, many of the same methods used to generate simulated supernova light curves can also be used to model the emission from fireballs generated by explosions in the earth's atmosphere.
Tempo curves considered harmful
Desain, P.; Honing, H.
1993-01-01
In the literature of musicology, computer music research and the psychology of music, timing or tempo measurements are mostly presented in the form of continuous curves. The notion of these tempo curves is dangerous, despite its widespread use, because it lulls its users into the false impression
DEFF Research Database (Denmark)
Federici, Paolo; Georgieva Yankova, Ginka
The report describes power curve measurements carried out on a given wind turbine. The measurements were carried out in accordance with a draft of IEC 61400-12-1 Ed. 2.
The aeolian dust accumulation curve
Goossens, D.
2001-01-01
This article presents a simple physical concept of aeolian dust accumulation, based on the behaviour of the subprocesses of dust deposition and dust erosion. The concept is tested in an aeolian dust wind tunnel. The agreement between the accumulation curve predicted by the model and the accumulation
Characterization of Elliptic Curve Traces under FR-reduction
Miyaji, Atsuko; Nakabayashi, Masaki; Takano, Shunzo
2001-01-01
Elliptic curve cryptosystems ([19], [25]) are based on the elliptic curve discrete logarithm problem (ECDLP). If elliptic curve cryptosystems avoid FR-reduction ([11], [17]) and anomalous elliptic curves over F_q ([34], [3], [36]), then with current knowledge we can construct elliptic curve cryptosystems over a smaller definition field. ECDLP has an interesting property that the security deeply depends on elliptic curve traces rather than definition fields, which does not occur in the case of the dis...
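The trace in question is defined through #E(F_p) = p + 1 − t; for small prime fields it can be found by brute-force point counting (an illustrative sketch, not the paper's method):

```python
def curve_trace(a, b, p):
    """Trace of Frobenius t, where #E(F_p) = p + 1 - t for the curve
    y^2 = x^3 + a*x + b over F_p, counted by brute force."""
    roots = {}                      # quadratic residue -> number of square roots
    for y in range(p):
        roots[y * y % p] = roots.get(y * y % p, 0) + 1
    count = 1                       # the point at infinity
    for x in range(p):
        count += roots.get((x * x * x + a * x + b) % p, 0)
    return p + 1 - count

# Hasse's theorem bounds the trace: |t| <= 2*sqrt(p)
t = curve_trace(2, 2, 17)           # toy curve y^2 = x^3 + 2x + 2 over F_17
```

Cryptographic field sizes require polynomial-time point counting (e.g. Schoof's algorithm) instead of this O(p) enumeration.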
DEFF Research Database (Denmark)
Brücker, Herbert; Jahn, Elke J.
Based on a wage curve approach we examine the labor market effects of migration in Germany. The wage curve relies on the assumption that wages respond to a change in the unemployment rate, albeit imperfectly. This allows one to derive the wage and employment effects of migration simultaneously... with a vocational degree. The wage and employment effects of migration are moderate: a 1 percent increase in the German labor force through immigration increases the aggregate unemployment rate by less than 0.1 percentage points and reduces average wages by less than 0.1 percent. While native workers benefit from... increased wages and lower unemployment, foreign workers are adversely affected...
Tian, Dayong; Lin, Zhifen; Yin, Daqiang
2013-01-01
The present study proposes a QSAR model to predict joint effects at non-equitoxic ratios for binary mixtures containing reactive toxicants: cyanogenic compounds and aldehydes. Toxicity of single chemicals and binary mixtures was measured by quantifying the decrease in light emission from Photobacterium phosphoreum over 15 min. The joint effects of binary mixtures (TU sum) can thus be obtained. The results showed that the relationship between the toxic ratios of the individual chemicals and their joint effects can be described by a normal distribution function. Based on the normal distribution equations, the joint effects of binary mixtures at non-equitoxic ratios ([Formula: see text]) can be predicted quantitatively from the joint effects at equitoxic ratios ([Formula: see text]). Combined with a QSAR model of [Formula: see text] in our previous work, a novel QSAR model can be proposed to predict the joint effects of mixtures at non-equitoxic ratios ([Formula: see text]). The proposed model has been validated using additional mixtures other than those used for its development. Predicted and observed results were similar (p>0.05). This study provides an approach to the prediction of joint effects for binary mixtures at non-equitoxic ratios.
Cohen, Yarden; Schneidman, Elad
2013-01-08
Pattern classification learning tasks are commonly used to explore learning strategies in human subjects. The universal and individual traits of learning such tasks reflect our cognitive abilities and have been of interest both psychophysically and clinically. From a computational perspective, these tasks are hard, because the number of patterns and rules one could consider even in simple cases is exponentially large. Thus, when we learn to classify we must use simplifying assumptions and generalize. Studies of human behavior in probabilistic learning tasks have focused on rules in which pattern cues are independent, and also described individual behavior in terms of simple, single-cue, feature-based models. Here, we conducted psychophysical experiments in which people learned to classify binary sequences according to deterministic rules of different complexity, including high-order, multicue-dependent rules. We show that human performance on such tasks is very diverse, but that a class of reinforcement learning-like models that use a mixture of features captures individual learning behavior surprisingly well. These models reflect the important role of subjects' priors, and their reliance on high-order features even when learning a low-order rule. Further, we show that these models predict future individual answers to a high degree of accuracy. We then use these models to build personally optimized teaching sessions and boost learning.
Chaudhry, Shehzad Ashraf; Mahmood, Khalid; Naqvi, Husnain; Khan, Muhammad Khurram
2015-11-01
Telecare medicine information system (TMIS) offers patients convenient and expedited healthcare services remotely, anywhere. Patient security and privacy have emerged as key issues during remote access because of the underlying open architecture. An authentication scheme can verify the patient's as well as the TMIS server's legitimacy during remote healthcare services. To achieve security and privacy a number of authentication schemes have been proposed. Very recently Lu et al. (J. Med. Syst. 39(3):1-8, 2015) proposed a biometric based three factor authentication scheme for TMIS to remove the vulnerabilities of Arshad et al.'s (J. Med. Syst. 38(12):136, 2014) scheme, and emphasized the robustness of their scheme against several attacks. However, in this paper we establish that Lu et al.'s scheme is vulnerable to numerous attacks, including (1) patient anonymity violation, (2) patient impersonation, and (3) TMIS server impersonation. Furthermore, their scheme does not provide patient untraceability. We then propose an improvement of Lu et al.'s scheme. We have analyzed the security of the improved scheme using the popular automated tool ProVerif. The proposed scheme, while retaining the pluses of Lu et al.'s scheme, is also robust against known attacks.
Directory of Open Access Journals (Sweden)
Chase A. Klingaman
2017-02-01
The data presented in this article are related to the research article, "HPLC-based enzyme kinetics assay for glucosinolate hydrolysis facilitate analysis of systems with both multiple reaction products and thermal enzyme denaturation" (C.K. Klingaman, M.J. Wagner, J.R. Brown, J.B. Klecker, E.H. Pauley, C.J. Noldner, J.R. Mays, [1]). This data article describes (1) the synthesis and spectral characterization data of a non-natural glucosinolate analogue, 2,2-diphenylethyl glucosinolate, (2) HPLC standardization data for glucosinolate, isothiocyanate, nitrile, and amine analytes, (3) reaction progress curve data for enzymatic hydrolysis reactions with variable substrate concentration, enzyme concentration, buffer pH, and temperature, and (4) normalized initial velocities of hydrolysis/formation for analytes. These data provide a comprehensive description of the enzyme-catalyzed hydrolysis of 2,2-diphenylethyl glucosinolate (5) and glucotropaeolin (6) under widely varied conditions.
Laffer Curves and Home Production
Directory of Open Access Journals (Sweden)
Kotamäki Mauri
2017-06-01
In the earlier related literature, the consumption tax rate Laffer curve is found to be strictly increasing (see Trabandt and Uhlig, 2011). In this paper, a general equilibrium macro model is augmented by introducing a substitute for private consumption in the form of home production. The introduction of home production brings about an additional margin of adjustment: an increase in the consumption tax rate not only decreases labor supply and reduces the consumption tax base but also allows a substitution of market goods with home-produced goods. The main objective of this paper is to show that, after the introduction of home production, the consumption tax Laffer curve exhibits an inverse U-shape. The income tax Laffer curves are also significantly altered. The result shown in this paper casts doubt on some of the earlier results in the literature.
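The inverse U-shape can be illustrated with a deliberately toy revenue function (not the paper's general-equilibrium model): if the consumption tax base erodes as households substitute home production for market goods, say base(τ) = 1 − τ, then revenue R(τ) = τ(1 − τ) peaks at an interior rate:

```python
def laffer_revenue(rate, substitution=1.0):
    """Toy Laffer curve: revenue = rate * base, with the base eroding
    linearly as home production substitutes for taxed market goods."""
    base = max(0.0, 1.0 - substitution * rate)
    return rate * base

rates = [i / 100 for i in range(101)]
peak_rate = max(rates, key=laffer_revenue)   # interior maximum: 0.5
```

Revenue is zero at both a 0% and a 100% rate, which is the defining inverse-U property the paper derives in a much richer setting.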
GEOMETRIC PROGRESSIONS ON ELLIPTIC CURVES.
Ciss, Abdoul Aziz; Moody, Dustin
2017-01-01
In this paper, we look at long geometric progressions on different models of elliptic curves, namely Weierstrass curves, Edwards and twisted Edwards curves, Huff curves and general quartic curves. By a geometric progression on an elliptic curve, we mean the existence of rational points on the curve whose x-coordinates (or y-coordinates) are in geometric progression. We find infinite families of twisted Edwards curves and Huff curves with geometric progressions of length 5, an infinite family of Weierstrass curves with 8-term progressions, as well as infinite families of quartic curves containing 10-term geometric progressions.
On the Quaternionic Focal Curves
Directory of Open Access Journals (Sweden)
Nurten BAYRAK GÜRSES
2017-06-01
In this study, a brief summary of quaternions and quaternionic curves is first presented, and the definition of a focal curve is given. The focal curve of a smooth curve consists of the centers of its osculating hyperspheres. By using this definition and the quaternionic osculating hyperspheres of these curves, the quaternionic focal curves in the spaces $\\Q$ and $\\Q_\
Directory of Open Access Journals (Sweden)
Paulo Prochno
2004-07-01
Learning curves have been studied for a long time. These studies provided strong support for the hypothesis that, as organizations produce more of a product, unit costs of production decrease at a decreasing rate (see Argote, 1999, for a comprehensive review of learning curve studies). But the organizational mechanisms that lead to these results are still underexplored. We know some drivers of learning curves (ADLER; CLARK, 1991; LAPRE et al., 2000), but we still lack a more detailed view of the organizational processes behind those curves. Through an ethnographic study, I bring a comprehensive account of the first year of operations of a new automotive plant, describing what was taking place in the assembly area during the most relevant shifts of the learning curve. The emphasis is on how learning occurs in that setting. My analysis suggests that the overall learning curve is in fact the result of an integration process that puts together several ongoing individual learning curves in different areas throughout the organization. In the end, I propose a model to understand the evolution of these learning processes and their supporting organizational mechanisms.
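The "decrease at a decreasing rate" pattern is conventionally modelled by a power law (Wright's learning curve; the parameters below are illustrative):

```python
import math

def unit_cost(n, c1, b):
    """Wright's law: the n-th unit costs c1 * n**(-b); each doubling of
    cumulative output multiplies the unit cost by 2**(-b)."""
    return c1 * n ** (-b)

b80 = -math.log2(0.8)   # an "80% curve": each doubling keeps 80% of the cost
costs = [unit_cost(n, 100.0, b80) for n in (1, 2, 4, 8)]   # ~ [100, 80, 64, 51.2]
```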
De Luca, Michele; Ioele, Giuseppina; Mas, Sílvia; Tauler, Romà; Ragno, Gaetano
2012-11-21
Amiloride photostability at different pH values was studied in depth by applying Multivariate Curve Resolution Alternating Least Squares (MCR-ALS) to the UV spectrophotometric data from drug solutions exposed to stressing irradiation. Resolution of all degradation photoproducts was possible by simultaneous spectrophotometric analysis of kinetic photodegradation and acid-base titration experiments. Amiloride photodegradation was shown to be strongly dependent on pH. Two hard modelling constraints were sequentially used in MCR-ALS for the unambiguous resolution of all the species involved in the photodegradation process. An amiloride acid-base system was defined by using the equilibrium constraint, and the photodegradation pathway was modelled taking into account the kinetic constraint. The simultaneous analysis of photodegradation and titration experiments revealed the presence of eight different species, which were differently distributed according to pH and time. Concentration profiles of all the species as well as their pure spectra were resolved and kinetic rate constants were estimated. The values of rate constants changed with pH, and under alkaline conditions the degradation pathway and photoproducts also changed. These results were compared to those obtained by LC-MS analysis from drug photodegradation experiments. MS analysis allowed the identification of up to five species and showed the simultaneous presence of more than one acid-base equilibrium.
Rahmani, O.; Hosseini, S. A. H.; Ghoytasi, I.; Golmohammadi, H.
2017-01-01
In this study, the influence of a uniform thermomechanical loading on the buckling and free vibration of a curved FG microbeam has been investigated, based on strain gradient theory (SGT) and the Timoshenko beam model. The distribution of structural materials varies continuously in the thickness direction according to a power-law exponent. Unlike classical models, this novel model employs three length scale parameters which can capture the size effect. The governing equation of motion and associated boundary conditions have been developed from Hamilton's principle, which is a special case of the virtual work theorem. The final differential equations were then solved by Navier's solution method and the results are presented. Moreover, the influences of the dimensionless length-to-thickness ratio (aspect ratio), dimensionless length scale parameter, power-law exponent, temperature difference and arc angle, for various mode numbers, on the natural frequency and critical temperature have been investigated, considering temperature-dependent material properties. To validate the study, some of the results were compared with those of previous works. It is concluded that applying a thermomechanical loading to an FG microbeam makes the natural frequency more sensitive to variations of geometrical, physical and mechanical properties and characteristics.
Johnson, L. E.; Kim, J.; Cifelli, R.; Chandra, C. V.
2016-12-01
Potential water retention, S, is one of the parameters commonly used in hydrologic modeling for soil moisture accounting. Physically, S indicates the total amount of water that can be stored in the soil and is expressed in units of depth. S can be represented as a change of soil moisture content and in this context is commonly used to estimate direct runoff, especially in the Soil Conservation Service (SCS) curve number (CN) method. Generally, both lumped and distributed hydrologic models can easily use the SCS-CN method to estimate direct runoff. Changes in potential water retention have been used in previous SCS-CN studies; however, these studies have focused on long-term hydrologic simulations where S is allowed to vary at the daily time scale. While useful for hydrologic events that span multiple days, this resolution is too coarse for short-term applications such as flash flood events where S may not recover its full potential. In this study, a new method for estimating a time-variable potential water retention at hourly time scales is presented. The methodology is applied to the Napa River basin, California. The streamflow gage at St Helena, located in the upper reaches of the basin, is used as the control gage site to evaluate the model performance, as it has minimal influences from reservoirs and diversions. Rainfall events from 2011 to 2012 are used for estimating the event-based SCS CN to transfer to S. As a result, we have derived the potential water retention curve, classified into three sections depending on the relative change in S. The first is a negative-slope section arising from the difference in the rate of water movement through the soil column, the second is a zero-change section representing the initial recovery of the potential water retention, and the third is a positive-change section representing the full recovery of the potential water retention. Also, we found that soil water movement exhibits a traffic-jam effect within 24 hours after the finish of the first
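The standard SCS-CN relation the abstract builds on can be sketched directly (metric units). This is a standalone illustration of the textbook formula, not the paper's hourly time-variable S method:

```python
def scs_cn_runoff(p_mm, cn):
    """Direct runoff depth (mm) from the SCS curve number method.

    S = 25400/CN - 254 (mm) is the potential water retention;
    Q = (P - 0.2*S)^2 / (P + 0.8*S) when P exceeds the initial
    abstraction Ia = 0.2*S, else Q = 0.
    """
    s = 25400.0 / cn - 254.0   # potential water retention (mm)
    ia = 0.2 * s               # initial abstraction
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm + 0.8 * s)

# Example: 50 mm of rain on a soil with CN = 80 (S = 63.5 mm)
q = scs_cn_runoff(50.0, 80.0)  # about 13.8 mm of direct runoff
```

The paper's contribution is letting S vary hour by hour during an event; the formula itself is unchanged.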
U.S. Environmental Protection Agency — a UV calibration curve for SRHA quantitation. This dataset is associated with the following publication: Chang, X., and D. Bouchard. Surfactant-Wrapped Multiwalled...
Directory of Open Access Journals (Sweden)
Kožul Nataša
2014-01-01
Full Text Available In the broadest sense, a yield curve indicates the market's view of the evolution of interest rates over time. However, given that the cost of borrowing is closely linked to creditworthiness (ability to repay), different yield curves will apply to different currencies, market sectors, or even individual issuers. As government borrowing is indicative of interest rate levels available to other market players in a particular country, and considering that bond issuance still remains the dominant form of sovereign debt, this paper describes yield curve construction using bonds. The relationship between zero-coupon yield, par yield and yield to maturity is given and their usage in determining curve discount factors is described. Their usage in deriving forward rates and pricing related derivative instruments is also discussed.
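One standard way the par-yield/discount-factor relationship described above is used is to bootstrap discount factors. This is a minimal sketch assuming annual coupons and a par curve; the yields are hypothetical, not from the paper:

```python
def bootstrap_discount_factors(par_yields):
    """Bootstrap annual discount factors from par yields (annual coupons).

    An n-year bond priced at par satisfies
        1 = c_n * (df_1 + ... + df_n) + df_n,
    so  df_n = (1 - c_n * sum(df_1..df_{n-1})) / (1 + c_n).
    """
    dfs = []
    for c in par_yields:
        dfs.append((1.0 - c * sum(dfs)) / (1.0 + c))
    return dfs

# Hypothetical par curve: 2%, 2.5%, 3% for maturities 1, 2, 3 years
dfs = bootstrap_discount_factors([0.02, 0.025, 0.03])
zero_3y = dfs[2] ** (-1.0 / 3.0) - 1.0  # implied 3-year zero-coupon yield
```

From the same discount factors one can then derive forward rates, e.g. the 1y-forward 1y rate as dfs[0]/dfs[1] - 1, which is the usage the abstract mentions.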
Indian Academy of Sciences (India)
We had defined when an arc is said to cross a circle. We broaden the definition of crossing as follows: Definition: Suppose f is a piece-wise circular simple closed curve and, is a piece-wise circular arc. Suppose ..... curve formed by p'pp", q'qq", part of r between p' and q', and part of r between p" and q", as shown (Figures 6 ...
DEFF Research Database (Denmark)
Georgieva Yankova, Ginka; Federici, Paolo
This report describes power curve measurements carried out on a given turbine in a chosen period. The measurements are carried out in accordance with IEC 61400-12-1 Ed. 1 and FGW Teil 2.
Chouaib, Wafa; Caldwell, Peter V.; Alila, Younes
2018-04-01
This paper advances the physical understanding of the regional variation of the flow duration curve (FDC). It provides a process-based analysis of the interaction between climate and landscape properties to explain disparities in FDC shapes. We used (i) long-term measured flow and precipitation data from 73 catchments in the eastern US, and (ii) the Sacramento model (SAC-SMA), calibrated to simulate soil moisture and the FDCs of the flow components. A classification of the catchments based on storm characteristics pointed to the effect of catchment landscape properties on precipitation variability and consequently on FDC shapes. This effect was pronounced: low values of the slope of the FDC (SFDC), indicating limited flow variability, occurred in regions of high precipitation variability, whereas SFDC values were larger in regions with low precipitation variability. The distribution of the topographic index at the catchment scale indicated that saturation excess overland flow mitigated flow variability under conditions of low elevation, large soil moisture storage capacity and high infiltration rates. SFDC increased where subsurface stormflow predominated, in catchments at high elevations with limited soil moisture storage capacity and low infiltration rates. Our analyses also highlighted the major role of soil infiltration rates on the FDC, beyond the impact of the predominant runoff generation mechanism and catchment elevation. Under slow infiltration rates in soils of large moisture storage capacity (at low elevations) with predominant saturation excess, SFDC values were larger. Conversely, SFDC decreased in catchments with prevalent subsurface stormflow and poorly drained soils of small moisture storage capacity. The analysis of the flow component FDCs demonstrated that the interflow contribution to the response was highest in catchments with a large slope of the FDC. The surface flow
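The abstract does not state which SFDC formula it uses; a common definition of the slope of the flow duration curve takes the log-flow difference between the 33rd and 66th exceedance percentiles. A minimal sketch under that assumption:

```python
import math

def slope_fdc(flows):
    """Slope of the flow duration curve between the 33rd and 66th
    exceedance percentiles (one standard index of flow variability):
        SFDC = (ln Q33 - ln Q66) / (0.66 - 0.33).
    Flows must be positive; percentiles are taken on the descending
    (exceedance-ordered) series with nearest-rank indexing.
    """
    q = sorted(flows, reverse=True)
    n = len(q)
    q33 = q[int(0.33 * (n - 1))]
    q66 = q[int(0.66 * (n - 1))]
    return (math.log(q33) - math.log(q66)) / 0.33

# Illustrative daily-flow sample (not data from the paper)
sfdc = slope_fdc([10, 8, 6, 5, 4, 3, 2.5, 2, 1.5, 1])
```

A flashier regime (steeper mid-section of the FDC) yields a larger SFDC, which is the quantity the catchment comparisons above are phrased in.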
Directory of Open Access Journals (Sweden)
Jimit R Patel
2014-12-01
Full Text Available Efforts have been made to analyze the Shliomis model based ferrofluid lubrication of a squeeze film between rotating rough curved circular plates where the upper plate has a porous facing. Different models of porosity are treated. The stochastic modeling of Christensen and Tonder has been employed to evaluate the effect of surface roughness. The related stochastically averaged Reynolds type equation is numerically solved to obtain the pressure distribution, leading to the calculation of load carrying capacity. The results presented in graphical form establish that the Kozeny-Carman model is more favorable as compared to the Irmay one from the design point of view. It is observed that the Shliomis model based ferrofluid lubrication performs relatively better than the Neuringer-Rosensweig one. Although the bearing suffers due to transverse surface roughness, with a suitable choice of curvature parameters and rotational ratio, the negative effect of porosity and standard deviation can be minimized by the ferrofluid lubrication at least in the case of negatively skewed roughness.
Li, Ying; Shi, Xiaohu; Liang, Yanchun; Xie, Juan; Zhang, Yu; Ma, Qin
2017-01-21
RNAs have been found to carry diverse functionalities in nature. Inferring the similarity between two given RNAs is a fundamental step in understanding and interpreting their functional relationship. The majority of functional RNAs show conserved secondary structures rather than sequence conservation, so algorithms relying on sequence-based features alone usually have limited prediction performance; integrating RNA structure features is therefore critical for RNA analysis. Existing algorithms mainly fall into two categories: alignment-based and alignment-free. Alignment-free RNA comparison algorithms usually have lower time complexity than alignment-based ones. An alignment-free RNA comparison algorithm is proposed, in which a novel numerical representation, RNA-TVcurve (triple vector curve representation), of an RNA sequence and its corresponding secondary structure features is provided. A multi-scale similarity score of two given RNAs is then designed based on wavelet decomposition of their numerical representations. In support of RNA mutation and phylogenetic analysis, a web server (RNA-TVcurve) was designed based on this alignment-free RNA comparison algorithm. It provides three functional modules: 1) visualization of the numerical representation of RNA secondary structure; 2) detection of single-point mutations based on secondary structure; and 3) comparison of pairwise and multiple RNA secondary structures. The web server requires RNA primary sequences as input, while the corresponding secondary structures are optional. For primary sequences alone, the web server can compute the secondary structures using the free-energy minimization algorithm of the RNAfold tool from the Vienna RNA package. RNA-TVcurve is the first integrated web server, based on an alignment-free method, to deliver a suite of RNA analysis functions, including visualization, mutation analysis and multiple RNA structure comparison. The comparison results with two popular RNA
DEFF Research Database (Denmark)
Gómez Arranz, Paula; Vesth, Allan
This report describes the power curve measurements carried out on a given wind turbine in a chosen period. The measurements were carried out following the measurement procedure in the draft of IEC 61400-12-1 Ed.2 [1], with some deviations mostly regarding uncertainty calculation. Here, the reference wind speed used in the power curve is the equivalent wind speed obtained from lidar measurements at several heights between lower and upper blade tip, in combination with a hub height meteorological mast. The measurements have been performed using DTU’s measurement equipment, the analysis...
Energy Technology Data Exchange (ETDEWEB)
Smith, A. M. S.; Anderson, D. R.; Hellier, C.; Maxted, P. F. L.; Smalley, B.; Southworth, J. [Astrophysics Group, Keele University, Staffordshire, ST5 5BG (United Kingdom); Collier Cameron, A. [SUPA, School of Physics and Astronomy, University of St Andrews, North Haugh, Fife, KY16 9SS (United Kingdom); Gillon, M.; Jehin, E. [Institut d'Astrophysique et de Geophysique, Universite de Liege, Allee du 6 Aout, 17 Bat. B5C, Liege 1 (Belgium); Lendl, M.; Queloz, D.; Triaud, A. H. M. J.; Pepe, F.; Segransan, D.; Udry, S. [Observatoire de Geneve, Universite de Geneve, 51 Chemin des Maillettes, 1290 Sauverny (Switzerland); West, R. G. [Department of Physics and Astronomy, University of Leicester, Leicester, LE1 7RH (United Kingdom); Barros, S. C. C.; Pollacco, D. [Astrophysics Research Centre, School of Mathematics and Physics, Queen's University, University Road, Belfast, BT7 1NN (United Kingdom); Street, R. A., E-mail: amss@astro.keele.ac.uk [Las Cumbres Observatory, 6740 Cortona Drive Suite 102, Goleta, CA 93117 (United States)
2012-04-15
We report the discovery, from WASP and CORALIE, of a transiting exoplanet in a 1.54 day orbit. The host star, WASP-36, is a magnitude V = 12.7, metal-poor G2 dwarf (T_eff = 5959 ± 134 K), with [Fe/H] = -0.26 ± 0.10. We determine the planet to have mass and radius, respectively, 2.30 ± 0.07 and 1.28 ± 0.03 times that of Jupiter. We have eight partial or complete transit light curves, from four different observatories, which allow us to investigate the potential effects on the fitted system parameters of using only a single light curve. We find that the solutions obtained by analyzing each of these light curves independently are consistent with our global fit to all the data, despite the apparent presence of correlated noise in at least two of the light curves.
Tao, Laifa; Lu, Chen; Noktehdan, Azadeh
2015-10-01
Battery capacity estimation is a significant recent challenge, given the complex physical and chemical processes that occur within batteries and the restricted accessibility of capacity degradation data. In this study, we describe an approach called dynamic spatial time warping, which is used to determine the similarity of two arbitrary curves. Unlike classical dynamic time warping methods, this approach maintains the invariance of curve similarity to rotations and translations of curves, which is vital in curve similarity search. Moreover, it utilizes online charging or discharging data that are easily collected and require no special assumptions. The accuracy of this approach is verified using NASA battery datasets. Results suggest that the proposed approach provides a highly accurate means of estimating battery capacity, at lower time cost than traditional dynamic time warping methods, for different individuals and under various operating conditions.
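For reference, the classical baseline the abstract contrasts itself with can be sketched in a few lines. This is plain dynamic time warping; the paper's dynamic spatial time warping variant, which additionally makes the similarity invariant to rotations and translations of the curves, is not reproduced here:

```python
def dtw_distance(a, b):
    """Classical dynamic time warping distance between two 1-D sequences.

    Fills the (n+1) x (m+1) cumulative-cost table where each cell adds the
    local |a_i - b_j| cost to the cheapest of the three admissible moves
    (match, insertion, deletion).
    """
    inf = float("inf")
    n, m = len(a), len(b)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

# A curve matches itself (and time-stretched copies) at zero cost,
# but a vertically shifted copy does not - hence the paper's point
# about translation invariance.
base = [0.0, 1.0, 2.0, 1.0, 0.0]
```

Applied to charge/discharge curves, a capacity estimate would then be read off the historical curve most similar to the newly measured one.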
Algebraic curves and cryptography
Murty, V Kumar
2010-01-01
It is by now a well-known paradigm that public-key cryptosystems can be built using finite Abelian groups and that algebraic geometry provides a supply of such groups through Abelian varieties over finite fields. Of special interest are the Abelian varieties that are Jacobians of algebraic curves. All of the articles in this volume are centered on the theme of point counting and explicit arithmetic on the Jacobians of curves over finite fields. The topics covered include Schoof's ℓ-adic point counting algorithm, the p-adic algorithms of Kedlaya and Denef-Vercauteren, explicit arithmetic on
Complementary curves of descent
Mungan, Carl E.; Lipscombe, Trevor C.
2013-01-01
The shapes of two wires in a vertical plane with the same starting and ending points are described as complementary curves of descent if beads frictionlessly slide down both of them in the same time, starting from rest. Every analytic curve has a unique complement, except for a cycloid (solution of the brachistochrone problem), which is self-complementary. A striking example is a straight wire whose complement is a lemniscate of Bernoulli. Alternatively, the wires can be tracks down which round objects undergo a rolling race. The level of presentation is appropriate for an intermediate undergraduate course in classical mechanics.
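The equal-time property can be checked numerically: for a bead released from rest, the descent time along a wire y(x) is T = ∫ sqrt((1 + y'(x)²) / (2g(y(0) − y(x)))) dx. A midpoint-rule sketch (the function and its names are mine, not from the paper; it assumes y is strictly decreasing so the drop is positive at every midpoint):

```python
import math

def descent_time(y, a, n=20000, g=9.81):
    """Frictionless descent time along the wire y(x), x in [0, a],
    released from rest at x = 0, assuming y strictly decreasing.

    Midpoint rule handles the integrable 1/sqrt singularity at x = 0;
    y' is estimated by a central difference.
    """
    y0 = y(0.0)
    h = a / n
    t = 0.0
    for i in range(n):
        x = (i + 0.5) * h
        dy = (y(x + 1e-7) - y(x - 1e-7)) / 2e-7   # slope y'(x)
        drop = y0 - y(x)                           # height fallen so far
        t += math.sqrt((1.0 + dy * dy) / (2.0 * g * drop)) * h
    return t

# Straight wire from (0, 0) to (1, -1): analytic time is
# L * sqrt(2 / (g * h)) with L = sqrt(2), h = 1.
t_line = descent_time(lambda x: -x, 1.0)
```

Two wires are complementary in the paper's sense precisely when this integral agrees for both between the shared endpoints.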
Groot, L.F.M.|info:eu-repo/dai/nl/073642398
2008-01-01
The purpose of this paper is twofold. First, it exhibits that standard tools in the measurement of income inequality, such as the Lorenz curve and the Gini-index, can successfully be applied to the issues of inequality measurement of carbon emissions and the equity of abatement policies across countries.
Paulton, Richard J. L.
1991-01-01
A procedure that allows students to view an entire bacterial growth curve during a two- to three-hour student laboratory period is described. Observations of the lag phase, logarithmic phase, maximum stationary phase, and phase of decline are possible. A nonpathogenic, marine bacterium is used in the investigation. (KR)
African Journals Online (AJOL)
Adele
Introduction. Both the Unique™ LMA, and lately the Cobra™ PLA, is available in most of the larger state hospitals in South Africa. This study's objective is to evaluate and compare the learning curves for insertion of these two single-use airway devices. This is to ascertain which of these two devices is easier and safer to ...
DEFF Research Database (Denmark)
Kock, Carsten Weber; Vesth, Allan
The report describes power curve measurements carried out on a given wind turbine. The measurements are carried out in accordance with Ref. [1]. A site calibration has been carried out; see Ref. [2], and the measured flow correction factors for different wind directions are used in the present analysis of power performance of the turbine.
DEFF Research Database (Denmark)
Federici, Paolo; Kock, Carsten Weber
This report describes the power curve measurements performed with a nacelle LIDAR on a given wind turbine in a wind farm and during a chosen measurement period. The measurements and analysis are carried out in accordance with the guidelines in the procedure “DTU Wind Energy-E-0019” [1]. The reporting...
DEFF Research Database (Denmark)
Gómez Arranz, Paula; Villanueva, Héctor
The report describes power curve measurements carried out on a given wind turbine. The measurements are carried out in accordance with Ref. [1]. A site calibration has been carried out; see Ref. [2], and the measured flow correction factors for different wind directions are used in the present analysis of power performance of the turbine.
DEFF Research Database (Denmark)
Gómez Arranz, Paula; Kock, Carsten Weber
The report describes power curve measurements carried out on a given wind turbine. The measurements are carried out in accordance with Ref. [1]. A site calibration has been carried out; see Ref. [2], and the measured flow correction factors for different wind directions are used in the present analysis of power performance of the turbine.
DEFF Research Database (Denmark)
Georgieva Yankova, Ginka; Villanueva, Héctor
The report describes power curve measurements carried out on a given wind turbine. The measurements are carried out in accordance with Ref. [1]. A site calibration has been carried out; see Ref. [2], and the measured flow correction factors for different wind directions are used in the present analysis of power performance of the turbine.
DEFF Research Database (Denmark)
Villanueva, Héctor; Vesth, Allan
This report describes the power curve measurements carried out on a given wind turbine in a chosen period. The measurements were carried out following the measurement procedure in the draft of IEC 61400-12-1 Ed.2 [1], with some deviations mostly regarding uncertainty calculation. Here, the reference wind speed used in the power curve is the equivalent wind speed obtained from lidar measurements at several heights between lower and upper blade tip, in combination with a hub height meteorological mast.
DEFF Research Database (Denmark)
Gómez Arranz, Paula; Villanueva, Héctor
The report describes power curve measurements carried out on a given wind turbine. The measurements are carried out in accordance with Ref. [1]. A site calibration has been carried out; see Ref. [2], and the measured flow correction factors for different wind directions are used in the present analysis of power performance of the turbine.
Textbook Factor Demand Curves.
Davis, Joe C.
1994-01-01
Maintains that teachers and textbook graphics follow the same basic pattern in illustrating changes in demand curves when product prices increase. Asserts that the use of computer graphics will enable teachers to be more precise in their graphic presentation of price elasticity. (CFR)
DEFF Research Database (Denmark)
Vesth, Allan; Kock, Carsten Weber
The report describes power curve measurements carried out on a given wind turbine. The measurements are carried out in accordance with Ref. [1]. A site calibration has been carried out; see Ref. [2], and the measured flow correction factors for different wind directions are used in the present analysis of power performance of the turbine.
DEFF Research Database (Denmark)
Villanueva, Héctor; Gómez Arranz, Paula
Here, the reference wind speed used in the power curve is the equivalent wind speed obtained from lidar measurements at several heights between lower and upper blade tip, in combination with a hub height meteorological mast. The measurements have been performed using DTU’s measurement equipment, the analysis...
Power Curve Measurements, REWS
DEFF Research Database (Denmark)
Villanueva, Héctor; Gómez Arranz, Paula
Here, the reference wind speed used in the power curve is the equivalent wind speed obtained from lidar measurements at several heights between lower and upper blade tip, in combination with a hub height meteorological mast. The measurements have been performed using DTU’s measurement equipment, the analysis...
DEFF Research Database (Denmark)
Federici, Paolo; Kock, Carsten Weber
The report describes power curve measurements carried out on a given wind turbine. The measurements are carried out in accordance with Ref. [1]. A site calibration has been carried out; see Ref. [2], and the measured flow correction factors for different wind directions are used in the present analysis of power performance of the turbine.
DEFF Research Database (Denmark)
Gómez Arranz, Paula; Wagner, Rozenn
This report describes the power curve measurements performed with a nacelle LIDAR on a given wind turbine in a wind farm and during a chosen measurement period. The measurements and analysis are carried out in accordance with the guidelines in the procedure “DTU Wind Energy-E-0019” [1]. The reporting...
Heterozygote PCR product melting curve prediction.
Dwight, Zachary L; Palais, Robert; Kent, Jana; Wittwer, Carl T
2014-03-01
Melting curve prediction of PCR products is limited to perfectly complementary strands. Multiple domains are calculated by recursive nearest neighbor thermodynamics. However, the melting curve of an amplicon containing a heterozygous single-nucleotide variant (SNV) after PCR is the composite of four duplexes: two matched homoduplexes and two mismatched heteroduplexes. To better predict the shape of composite heterozygote melting curves, 52 experimental curves were compared with brute force in silico predictions varying two parameters simultaneously: the relative contribution of heteroduplex products and an ionic scaling factor for mismatched tetrads. Heteroduplex products contributed 25.7 ± 6.7% to the composite melting curve, varying from 23%-28% for different SNV classes. The effect of ions on mismatch tetrads scaled to 76%-96% of normal (depending on SNV class) and averaged 88 ± 16.4%. Based on uMelt (www.dna.utah.edu/umelt/umelt.html) with an expanded nearest neighbor thermodynamic set that includes mismatched base pairs, uMelt HETS calculates helicity as a function of temperature for homoduplex and heteroduplex products, as well as the composite curve expected from heterozygotes. It is an interactive Web tool for efficient genotyping design, heterozygote melting curve prediction, and quality control of melting curve experiments. The application was developed in Actionscript and can be found online at http://www.dna.utah.edu/hets/. © 2013 WILEY PERIODICALS, INC.
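The composite model the abstract describes, a weighted mixture with roughly 25.7% heteroduplex contribution, can be sketched as follows. The helicity-vs-temperature curves themselves would come from a nearest-neighbor thermodynamic model such as uMelt; they are not reimplemented here, and the two short input lists below are placeholders:

```python
def composite_melting_curve(homo_helicity, het_helicity, het_fraction=0.257):
    """Composite heterozygote melting curve as a weighted mixture of the
    homoduplex and heteroduplex helicity curves (same temperature grid).

    The default het_fraction = 0.257 is the mean heteroduplex contribution
    reported in the study (25.7 +/- 6.7%, varying 23%-28% by SNV class).
    """
    return [
        (1.0 - het_fraction) * h + het_fraction * e
        for h, e in zip(homo_helicity, het_helicity)
    ]

# Placeholder helicity values at two temperatures, homoduplex vs heteroduplex
c = composite_melting_curve([1.0, 0.5], [0.8, 0.1])
```

Since heteroduplexes melt earlier, the mixture shifts and broadens the low-temperature shoulder of the composite curve, which is the shape feature used to recognize heterozygotes.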
LINS Curve in Romanian Economy
Directory of Open Access Journals (Sweden)
Emilian Dobrescu
2016-02-01
Full Text Available The paper presents theoretical considerations and empirical evidence to test the validity of the Laffer in Narrower Sense (LINS) curve as a parabola with a maximum. Attention is focused on the so-called legal-effective tax gap (letg). The econometric application is based on statistical data (1990-2013) for Romania as an emerging European economy. Three cointegrating regressions (fully modified least squares, canonical cointegrating regression and dynamic least squares) and three algorithms, which are based on instrumental variables (two-stage least squares, generalized method of moments, and limited information maximum likelihood), are involved.
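A LINS-type parabola with a maximum can be fitted by ordinary least squares. The sketch below solves the degree-2 normal equations directly; the (letg, revenue) points are hypothetical illustrations, not the study's Romanian data, and the paper's actual estimation uses cointegrating regressions rather than plain OLS:

```python
def fit_parabola(xs, ys):
    """Least-squares fit of y = b0 + b1*x + b2*x^2 via the 3x3 normal
    equations, solved by Gaussian elimination with partial pivoting."""
    s = [sum(x ** k for x in xs) for k in range(5)]          # power sums
    t = [sum(y * x ** k for x, y in zip(xs, ys)) for k in range(3)]
    a = [[s[0], s[1], s[2], t[0]],
         [s[1], s[2], s[3], t[1]],
         [s[2], s[3], s[4], t[2]]]
    for i in range(3):                       # forward elimination
        p = max(range(i, 3), key=lambda r: abs(a[r][i]))
        a[i], a[p] = a[p], a[i]
        for r in range(i + 1, 3):
            f = a[r][i] / a[i][i]
            for c in range(i, 4):
                a[r][c] -= f * a[i][c]
    b = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):                      # back substitution
        b[i] = (a[i][3] - sum(a[i][c] * b[c] for c in range(i + 1, 3))) / a[i][i]
    return b  # [b0, b1, b2]

# Hypothetical (letg, revenue) observations shaped like a Laffer parabola
letg = [0.1, 0.2, 0.3, 0.4, 0.5]
rev = [1.9, 3.2, 3.9, 4.0, 3.5]
b0, b1, b2 = fit_parabola(letg, rev)
x_max = -b1 / (2.0 * b2)   # revenue-maximizing gap when b2 < 0
```

A negative b2 confirms the "parabola with a maximum" shape, and the vertex −b1/(2·b2) locates the revenue-maximizing tax gap.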
Ait-Haddou, Rachid
2013-02-01
We show that the generalized Bernstein bases in Müntz spaces defined by Hirschman and Widder (1949) and extended by Gelfond (1950) can be obtained as pointwise limits of the Chebyshev–Bernstein bases in Müntz spaces with respect to an interval [a,1] as the positive real number a converges to zero. Such a realization allows for concepts of curve design such as de Casteljau algorithm, blossom, dimension elevation to be transferred from the general theory of Chebyshev blossoms in Müntz spaces to these generalized Bernstein bases that we termed here as Gelfond–Bernstein bases. The advantage of working with Gelfond–Bernstein bases lies in the simplicity of the obtained concepts and algorithms as compared to their Chebyshev–Bernstein bases counterparts.
Directory of Open Access Journals (Sweden)
M. Franchini
2000-01-01
Full Text Available The sensitivity analysis described in Hashemi et al. (2000) is based on one-at-a-time perturbations to the model parameters. This type of analysis cannot highlight the presence of parameter interactions, which might indeed affect the characteristics of the flood frequency curve (ffc) even more than the individual parameters. For this reason, the effects of the parameters of the rainfall and rainfall-runoff models and of the potential evapotranspiration demand on the ffc are investigated here through an analysis of the results obtained from a factorial experimental design, where all the parameters are allowed to vary simultaneously. This latter, more complex, analysis confirms the results obtained in Hashemi et al. (2000), thus making the conclusions drawn there of wider validity and not related strictly to the reference set selected. However, it is shown that two-factor interactions are present not only between different pairs of parameters of an individual model, but also between pairs of parameters of different models, such as the rainfall and rainfall-runoff models, thus demonstrating the complex interaction between climate and basin characteristics affecting the ffc and in particular its curvature. Furthermore, the wider range of climatic regime behaviour produced within the factorial experimental design shows that the probability distribution of soil moisture content at the storm arrival time is no longer sufficient to explain the link between the perturbations to the parameters and their effects on the ffc, as was suggested in Hashemi et al. (2000). Other factors have to be considered, such as the probability distribution of the soil moisture capacity, and the rainfall regime, expressed through the annual maximum rainfalls over different durations. Keywords: Monte Carlo simulation; factorial experimental design; analysis of variance (ANOVA)
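The full factorial design contrasted above with one-at-a-time perturbation can be generated in a few lines: every combination of factor levels is run, so two-factor interactions become estimable. The factor names below are illustrative, not the parameters of the study's models:

```python
from itertools import product

def full_factorial(levels):
    """All runs of a full factorial design: one dict per combination of
    factor levels, with every parameter varying simultaneously."""
    names = sorted(levels)  # fixed factor order for reproducible runs
    return [dict(zip(names, combo))
            for combo in product(*(levels[n] for n in names))]

# Hypothetical two-level design for three model parameters
design = full_factorial({"a": [-1, 1], "b": [-1, 1], "c": [-1, 1]})  # 2^3 runs
```

With k factors at two levels this produces 2^k runs; an ANOVA over the responses then separates main effects from the two-factor interaction terms the abstract discusses.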
Approximation by planar elastic curves
DEFF Research Database (Denmark)
Brander, David; Gravesen, Jens; Nørbjerg, Toke Bjerge
2016-01-01
We give an algorithm for approximating a given plane curve segment by a planar elastic curve. The method depends on an analytic representation of the space of elastic curve segments, together with a geometric method for obtaining a good initial guess for the approximating curve. A gradient-driven...
Energy Technology Data Exchange (ETDEWEB)
Groot, L. [Utrecht University, Utrecht School of Economics, Janskerkhof 12, 3512 BL Utrecht (Netherlands)
2008-11-15
The purpose of this paper is twofold. First, it exhibits that standard tools in the measurement of income inequality, such as the Lorenz curve and the Gini-index, can successfully be applied to the issues of inequality measurement of carbon emissions and the equity of abatement policies across countries. These tools allow policy-makers and the general public to grasp at a single glance the impact of conventional distribution rules such as equal caps or grandfathering, or more sophisticated ones, on the distribution of greenhouse gas emissions. Second, using the Samuelson rule for the optimal provision of a public good, the Pareto-optimal distribution of carbon emissions is compared with the distribution that follows if countries follow Nash-Cournot abatement strategies. It is shown that the Pareto-optimal distribution under the Samuelson rule can be approximated by the equal cap division, represented by the diagonal in the Lorenz curve diagram.
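The Lorenz/Gini machinery applied above to emissions can be sketched in a few lines; the computation below uses the standard sorted-rank formula, and any emissions vector passed in is illustrative rather than the paper's data:

```python
def gini(values):
    """Gini index of a nonnegative distribution (e.g. per-country carbon
    emissions), computed from the sorted sample:
        G = 2 * sum(i * x_i) / (n * sum(x)) - (n + 1) / n,
    with i = 1..n over ascending-sorted values. 0 = perfect equality;
    (n-1)/n when one unit holds everything.
    """
    xs = sorted(values)
    n = len(xs)
    total = sum(xs)
    cum = sum((i + 1) * x for i, x in enumerate(xs))  # rank-weighted sum
    return (2.0 * cum) / (n * total) - (n + 1.0) / n
```

Under an "equal caps" rule every country's emissions are identical, so the Gini index is 0; grandfathering preserves whatever inequality the historical distribution had, which is exactly the single-glance comparison the Lorenz diagram supports.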
DEFF Research Database (Denmark)
Villanueva, Héctor; Gómez Arranz, Paula
This report describes the analysis carried out with data from a given turbine in a wind farm and a chosen period. The purpose of the analysis is to correlate the power output of the wind turbine to the wind speed measured by a nacelle-mounted anemometer. The measurements and analysis are not performed according to IEC 61400-12-1 [1]. Therefore, the results presented in this report cannot be considered a power curve according to the reference standard, and are referred to as a “power curve investigation” instead. The measurements have been performed by a customer and the data analysis has been...
Directory of Open Access Journals (Sweden)
Iram Ansari
2012-01-01
Full Text Available Dilaceration is the result of a developmental anomaly in which there has been an abrupt change in the axial inclination between the crown and the root of a tooth. Dilaceration can be seen in both the permanent and deciduous dentitions, and is more commonly found in posterior teeth and in maxilla. Periapical radiographs are the most appropriate way to diagnose the presence of root dilacerations. The controlled regularly tapered preparation of the curved canals is the ultimate challenge in endodontics. Careful and meticulous technique will yield a safe and sufficient enlargement of the curved canals. This article gives a review of the literature and three interesting case reports of root dilacerations.
Kronberg, Max; Soomro, Muhammad Afzal; Top, Jaap
2017-10-01
In this note we extend the theory of twists of elliptic curves as presented in various standard texts for characteristic not equal to two or three to the remaining characteristics. For this, we make explicit use of the correspondence between the twists and the Galois cohomology set H^1\\big({G}_{\\overline{K}/K}, \\operatorname{Aut}_{\\overline{K}}(E)\\big). The results are illustrated by examples.
Transvaginal cholecystectomy learning curve.
Wood, Stephanie G; Dai, Feng; Dabu-Bondoc, Susan; Mikhael, Hosni; Vadivelu, Nalini; Duffy, Andrew; Roberts, Kurt E
2015-07-01
There are few surgeons in the United States, within private practice and academic centers, currently performing transvaginal cholecystectomies (TVC). The lack of exposure to TVC during residency or fellowship training, coupled with a poorly defined learning curve, further limits interested surgeons who want to apply this technique to their practice. This study describes the learning curve encountered during the introduction of TVC to our academic facility, based on an analysis of consecutive TVCs performed between August 14, 2009 and August 3, 2012. The TVC patients were divided into sequential quartiles (n = 15/16). The learning curve outcome was measured as the operative time of TVC patients, compared to the operative time of female laparoscopic cholecystectomy (LC) patients treated during the same period. Sixty-one patients underwent a TVC, with a mean age of 38 ± 12 years and a mean BMI of 29 ± 6 kg/m(2); sixty-seven female patients underwent an LC, with a mean age of 41 ± 15 years and a mean BMI of 33 ± 12 kg/m(2). The average operative times of the LC and TVC patients were 48 ± 20 and 60 ± 17 min, respectively. A significant improvement in TVC operative times was seen between the first (n = 15 TVCs) and second quartiles (p = 0.04), and times stayed relatively constant for the third quartile, with no statistically significant difference in mean operative time between the second and third TVC quartiles. The learning curve of a fellowship-trained surgeon introducing TVC to their surgical repertoire, as measured by improved operative times, can be achieved with approximately 15 cases.
Pelce, Pierre
1989-01-01
In recent years, much progress has been made in the understanding of interface dynamics of various systems: hydrodynamics, crystal growth, chemical reactions, and combustion. Dynamics of Curved Fronts is an important contribution to this field and will be an indispensable reference work for researchers and graduate students in physics, applied mathematics, and chemical engineering. The book consists of a 100-page introduction by the editor and 33 seminal articles from various disciplines.
Vo, Martin
2017-08-01
Light Curves Classifier uses data mining and machine learning to obtain and classify desired objects. This task can be accomplished using attributes of light curves or any time series, including shapes, histograms, or variograms, or using other available information about the inspected objects, such as color indices, temperatures, and abundances. After the user specifies the features that describe the objects to be searched for, the software trains on a given training sample and can then be used for unsupervised clustering to visualize the natural separation of the sample. The package can also be used for automatic tuning of the parameters of the methods used (for example, the number of hidden neurons or the binning ratio). Trained classifiers can be used for filtering outputs from astronomical databases or from data stored locally. Light Curves Classifier can also be used for simple downloading of light curves and all available information on queried stars. It can natively connect to OgleII, OgleIII, ASAS, CoRoT, Kepler, Catalina and MACHO, and new connectors or descriptors can be implemented. In addition to direct usage of the package and a command-line UI, the program can be used through a web interface. Users can create jobs for training methods on given objects, querying databases, and filtering outputs with trained filters. Preimplemented descriptors, classifiers, and connectors can be picked with simple clicks and their parameters tuned by giving ranges of values; all combinations are then calculated, and the best one is used for creating the filter. Natural separation of the data can be visualized by unsupervised clustering.
Hammer, A
2017-11-01
It was 140 years ago that George von Meyer presented his anatomical diagrams of human bones to a meeting in Zurich. There he was told by Prof. Karl Culmann that the trabecular lines shown in the diagram of the upper femur closely resembled the lines of force which Culmann had determined, with Graphic Statics, to be passing through a curved, loaded Fairbairn crane. This drew the attention of Julius Wolff, who used it as the basis for his 'Trajectorial theory', which was widely accepted and, to date, has been the underlying basis for all biomechanical investigations of this region. Following Wolff and Culmann, the upper femur is considered to be a curved structure and is investigated as such. Unfortunately, this concept is wrong. The upper femur is not curved but angular: it is formed by the junction of two straight bones, the femoral neck and the femoral shaft, as can be seen simply from the neck/shaft angle constructed on the antero-posterior radiograph of any normal femur. The internal trabecular bone forms only part of the load-bearing structure of the femoral neck, and the configuration of the trabecular substance in this region suggests that it is related specifically to the forces present during flexion and extension movements of the hip joint. Given this, together with the delayed timing of the appearance of the trabecular columns, it must be questioned whether the remodelling of the upper femur is a response to one or to two distinct forces.
Molladavoodi, H.
2013-09-01
Analysis of stresses and displacements around underground openings is necessary in a wide variety of civil, petroleum, and mining engineering problems. In addition, an excavation damaged zone (EDZ) is generally formed around underground openings as a result of high stress magnitudes, even in the absence of blasting effects. The rock materials surrounding underground excavations typically demonstrate a nonlinear and irreversible mechanical response, particularly under high in situ stress states, and the dominant cause of irreversible deformations in brittle rocks is the damage process. One of the most widely used methods in tunnel design is the convergence-confinement method (CCM), owing to its practical applicability. Elastic-plastic models are usually used in the convergence-confinement method as constitutive models for rock behavior, but the plastic models used to simulate rock behavior do not consider important issues such as stiffness degradation and softening. Therefore, the use of damage constitutive models in the convergence-confinement method is essential in the design of rock structures. In this paper, the basic concepts of continuum damage mechanics are outlined. Then a numerical stepwise procedure for a circular tunnel under a hydrostatic stress field, incorporating a damage model for the rock mass, is implemented. The ground response curve and the radius of the excavation damaged zone were calculated based on an isotropic damage model. The convergence-confinement method based on a damage model can consider the effects of post-peak rock behavior on the ground response curve and the excavation damaged zone. The analysis of the results shows the important effect of the brittleness parameter on the tunnel wall convergence, the ground response curve, and the excavation damage radius.
Xie, Huiqiao; Cai, Weixing; Yang, Lily; Mao, Hui; Tang, Xiangyang
2016-11-01
Differential phase contrast CT has been recognized as an x-ray imaging method with the potential to greatly improve the differentiation of soft tissues. Talbot interferometry has been one of the promising solutions allowing implementation with commercially available x-ray tubes with a polychromatic spectrum. Mainly due to imperfections in grating fabrication and the polychromatic spectrum of the x-ray beam, a twin-peaks phenomenon may exist in phase-stepping curves (PSCs) and degrade the performance of phase retrieval. The authors have previously proposed a Fourier-analysis-based method for phase retrieval in the scenario wherein the twin-peaks phenomenon occurs in PSCs. In this work, the authors propose a 5-step algebraic method for phase retrieval and investigate the potential for reducing radiation dose while either the Fourier or the algebraic method is utilized for phase retrieval. The algebraic method for dealing with the twin-peaks phenomenon, in which a set of linear equations with five unknown variables is needed for phase retrieval, is an extension of the so-called 3-step method used in the scenario wherein only a single peak exists in the PSCs. In addition to a numerical phantom, two sets of experimental data (a phantom made of organic materials and a small animal) acquired by a prototype differential phase contrast CT system are employed to evaluate the performance of the Fourier and algebraic phase retrieval methods and their potential for radiation dose reduction. The evaluation with both the numerical phantom and the experimental data shows that the algebraic method works as well as the Fourier method in phase retrieval if the twin-peaks phenomenon in the PSCs is appropriately dealt with. In addition, while the radiation dose associated with data acquisition is reduced via fewer phase-stepping steps, the algebraic method maintains a better performance than the Fourier method. Along with the Fourier method, the proposed 5-step algebraic
Yang, Yi; Xie, Huiqiao; Cai, Weixing; Mao, Hui; Tang, Xiangyang
2016-06-01
X-ray differential phase contrast CT implemented with Talbot interferometry employs phase stepping to extract information on x-ray attenuation, phase shift, and small-angle scattering. Since inaccuracy may exist in the absorption grating G2 due to imperfect fabrication, the effective period of G2 can be as large as twice the nominal period, leading to a phenomenon of twin peaks that differ remarkably in height. In this work, the authors investigate how to retrieve and dewrap the phase signal from a phase-stepping curve (PSC) with the feature of twin peaks for x-ray phase contrast imaging. Based on the paraxial Fresnel-Kirchhoff theory, analytical formulae to characterize the phenomenon of twin peaks in the PSC are derived. Then an approach to dewrap the retrieved phase signal by jointly using the phases of the first- and second-order Fourier components is proposed. Through an experimental investigation using a prototype x-ray phase contrast imaging system implemented with Talbot interferometry, the authors evaluate and verify the derived analytic formulae and the proposed approach for phase retrieval and dewrapping. According to the theoretical analysis, the twin-peak phenomenon in the PSC is a consequence of combined effects, including the inaccuracy in the absorption grating G2, the mismatch between the phase grating and the x-ray source spectrum, and the finite size of the x-ray tube's focal spot. The proposed approach is experimentally evaluated by scanning a phantom consisting of organic materials and a lab mouse. The preliminary data show that, compared to scanning G2 over only a single nominal period and correcting the measured phase signal with an intuitive phase-dewrapping method in current use in the field, stepping G2 over twice its nominal period and dewrapping the measured phase signal with the proposed approach can significantly improve the quality of x-ray differential phase contrast imaging in both radiography and CT. Using the phase retrieval and dewrapping
Jones, Steven R.
2015-01-01
This study aims to broadly examine how commonly various conceptualizations of the definite integral are drawn on by students as they attempt to explain the meaning of integral expressions. Previous studies have shown that certain conceptualizations, such as the area under a curve or the values of an anti-derivative, may be less productive in…
Separation control on curved boundaries
Kamal Kumar, R.; Mathur, Manikandan
2017-11-01
Flow separation and its characteristics are an important consideration in the field of bluff body aerodynamics. Specifically, the location and slope of the separation, and the size of the recirculation bubble that forms downstream of the bluff body, significantly affect the resulting aerodynamic forces. Recent theories based on dynamical systems (Haller, 2004) have established criteria based on wall-based quantities that identify the location and slope of separation in unsteady flows. In this work, we adapt the closed-loop separation control algorithm proposed by Alam, Liu & Haller (2006) to curved boundaries, and demonstrate its effectiveness via numerical simulations of the flow past a cylinder in the vortex-shedding regime. Using appropriately placed wall-based actuators driven by inputs from shear stress sensors placed between the actuators, we demonstrate that the separation characteristics, including the recirculation bubble length, can be modified as desired.
Replication and Analysis of Ebbinghaus' Forgetting Curve.
Murre, Jaap M J; Dros, Joeri
2015-01-01
We present a successful replication of Ebbinghaus' classic forgetting curve from 1880 based on the method of savings. One subject spent 70 hours learning lists and relearning them after 20 min, 1 hour, 9 hours, 1 day, 2 days, or 31 days. The results are similar to Ebbinghaus' original data. We analyze the effects of serial position on forgetting and investigate what mathematical equations present a good fit to the Ebbinghaus forgetting curve and its replications. We conclude that the Ebbinghaus forgetting curve has indeed been replicated and that it is not completely smooth but most probably shows a jump upwards starting at the 24 hour data point.
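One family of candidate equations for such savings data is a power law of the retention interval. A sketch of the fitting step with SciPy; the time points and parameter values below are synthetic placeholders that merely echo the study design, not Ebbinghaus's or the authors' measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

def power_law_savings(t, a, b):
    """Candidate forgetting function: savings = a * t**(-b)."""
    return a * np.power(t, -b)

# retention intervals in hours (synthetic, not the experimental data)
t = np.array([0.33, 1.0, 9.0, 24.0, 48.0, 744.0])
savings = power_law_savings(t, 0.60, 0.12)   # generated, noise-free

# recover the parameters from the generated curve
(a_hat, b_hat), _ = curve_fit(power_law_savings, t, savings, p0=(1.0, 0.1))
```

Comparing the residuals of several such candidate functions, including ones allowing a discontinuity near the 24-hour point, is one way a "jump upwards" in the data can be detected.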
Sadek, Mohammad
2012-01-01
In this paper we consider genus one equations of degree $n$, namely a (generalised) binary quartic when $n=2$, a ternary cubic when $n=3$, and a pair of quaternary quadrics when $n=4$. A new definition for the minimality of genus one equations of degree $n$ over local fields is introduced. The advantage of this definition is that it does not depend on invariant theory of genus one curves. We prove that this definition coincides with the classical definition of minimality for all $n\\le4$. As a...
Learning from uncertain curves
DEFF Research Database (Denmark)
Mallasto, Anton; Feragen, Aasa
2017-01-01
We introduce a novel framework for statistical analysis of populations of nondegenerate Gaussian processes (GPs), which are natural representations of uncertain curves. This allows inherent variation or uncertainty in function-valued data to be properly incorporated in the population analysis. Using the 2-Wasserstein metric, we geometrize the space of GPs with L2 mean and covariance functions over compact index spaces. We prove uniqueness of the barycenter of a population of GPs, as well as convergence of the metric and the barycenter of their finite-dimensional counterparts. This justifies
Modelling curves of manufacturing feasibilities and demand
Directory of Open Access Journals (Sweden)
Soloninko K.S.
2017-03-01
The authors investigate the functional properties of curves of manufacturing feasibilities (production possibilities) and demand. Problem statement and its connection with important scientific and practical tasks: by its nature, a market economy is unstable and in constant movement, and the modelling of economic processes is an effective instrument for explaining changes in the economic environment. Such modelling depends first and foremost on building an economic model, which is the basis for the formalization of the economic process, that is, for building a mathematical model; an effective means of formalization is the model of a hypothetical or imaginary economy. Building a demand model is significant for the market of goods and services. The problem is to obtain, as a result of modelling, definite functional properties of the curves of manufacturing feasibilities and demand from which their mathematical models can be determined, and to obtain majorant properties of the curves of joint demand in the market of goods and services. Analysis of recent research and publications: many domestic and foreign scientists have dedicated their studies to researching and building models of the curves of manufacturing feasibilities and demand; in spite of this considerable work, the functional properties of these curves and their practical use in modelling remain insufficiently explored. The purpose of the article is to describe the functional properties of the curves of manufacturing feasibilities and demand in the market of goods and services on the basis of modelling their construction. Scientific novelty and practical value: the theoretical results of this research concerning the functional properties of the curves of manufacturing feasibilities and demand, namely their convexity, give extra practical possibilities in a microeconomic
Verified Indifferentiable Hashing into Elliptic Curves
Barthe, Gilles; Grégoire, Benjamin; Heraud, Sylvain; Olmedo, Federico; Zanella-Béguelin, Santiago
2012-01-01
Many cryptographic systems based on elliptic curves are proven secure in the Random Oracle Model, assuming there exist probabilistic functions that map elements in some domain (e.g. bitstrings) onto uniformly and independently distributed points on a curve. When implementing such systems, and in order for the proof to carry over to the implementation, those mappings must be instantiated with concrete constructions whose behavior does not deviate significantly from rand...
Liu, Xiao-Hui; Wang, Wei-Liang; Lu, Shao-Yong; Wang, Yu-Fan; Ren, Zongming
2016-08-01
In Zaozhuang, economic development affects the discharge amount of industrial wastewater, chemical oxygen demand (COD), and ammonia nitrogen (NH3-N). To reveal the trend of water environmental quality related to the economy in Zaozhuang, this paper simulated the relationships between industrial wastewater discharge, COD load, NH3-N load, and gross domestic product (GDP) per capita for Zaozhuang (2002-2012) using environmental Kuznets curve (EKC) models. The results showed that the added value of industrial GDP, per capita GDP, and wastewater emission had average annual growth rates of 16.62, 16.19, and 17.89 %, respectively, from 2002 to 2012, while COD and NH3-N emissions in 2012, compared with 2002, showed average annual decreases of 10.70 and 31.12 %, respectively. The EKC models revealed that industrial wastewater discharge had a typical inverted-U-shaped relationship with per capita GDP, whereas both COD and NH3-N followed the declining left-hand side of a U-shaped curve. The economy in Zaozhuang has been at a "fast-growing" stage with low environmental pollution, judging by the industrial pollution level. In recent years, Zaozhuang has placed particular emphasis on abating heavy-polluting industries, so pollutants have been greatly reduced; industrial wastewater treatment has thus been quite effective, and water quality has improved significantly. The EKC models provided scientific evidence for estimating industrial wastewater discharge, COD, and NH3-N loads, as well as their trends, for Zaozhuang from an economic perspective.
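An inverted-U EKC is commonly estimated as a quadratic in logarithms, with the turning point at exp(-c1/(2*c2)). A sketch under that standard specification (the function name and the synthetic data are illustrative, not the Zaozhuang series):

```python
import numpy as np

def ekc_turning_point(gdp_per_capita, emissions):
    """Fit ln(E) = c2*ln(y)**2 + c1*ln(y) + c0 and return the estimated
    turning point y* = exp(-c1 / (2*c2)) together with c2.
    c2 < 0 indicates an inverted-U (EKC-type) relationship."""
    ly = np.log(gdp_per_capita)
    le = np.log(emissions)
    c2, c1, c0 = np.polyfit(ly, le, 2)
    return np.exp(-c1 / (2.0 * c2)), c2

# synthetic series with a known peak at ln(y) = 3, i.e. y* = e**3
ly = np.linspace(1.0, 5.0, 30)
gdp = np.exp(ly)
emissions = np.exp(2.0 + 3.0 * ly - 0.5 * ly**2)
turning_point, c2 = ekc_turning_point(gdp, emissions)
```

A series still on the rising (or, as for COD here, the declining) branch yields a turning point outside the observed GDP range, which is how the "left side of the curve" diagnosis is made.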
Exnowitz, Franziska; Meyer, Bernd; Hackl, Thomas
2012-03-01
¹H NMR spectroscopy was used to follow the cleavage of sucrose by invertase. The parameters of the enzyme kinetics, Km and Vmax, were determined directly from progress curves at only one concentration of the substrate. For comparison with the classical Michaelis-Menten analysis, the reaction progress was also monitored at various initial concentrations from 3.5 to 41.8 mM. Using the Lambert W function, the parameters Km and Vmax were fitted to reproduce the experimental progress curve, resulting in Km = 28 mM and Vmax = 13 μM/s. The result is almost identical to an initial-rate analysis that, however, costs much more time and experimental effort. The effect of product inhibition was also investigated. Furthermore, we analyzed a much more complex reaction, the conversion of farnesyl diphosphate into (+)-germacrene D by the enzyme germacrene D synthase, yielding Km = 379 μM and kcat = 0.04 s⁻¹. The reaction involves an amphiphilic substrate forming micelles and a water-insoluble product; using proper controls, the conversion can be well analyzed by the progress-curve approach using the Lambert W function.
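The closed-form progress curve behind this approach follows from integrating the Michaelis-Menten rate law: S(t) = Km * W((S0/Km) * exp((S0 - Vmax*t)/Km)), where W is the Lambert W function. A sketch using SciPy with the reported sucrose constants (the function name is illustrative):

```python
import numpy as np
from scipy.special import lambertw

def substrate_progress(t, s0, km, vmax):
    """Integrated Michaelis-Menten solution via the Lambert W function:
    S(t) = Km * W((S0/Km) * exp((S0 - Vmax*t)/Km)).
    Concentrations in mM, vmax in mM/s, t in s."""
    arg = (s0 / km) * np.exp((s0 - vmax * t) / km)
    return km * np.real(lambertw(arg))

# sucrose example with the fitted constants: Km = 28 mM, Vmax = 13 uM/s
s = substrate_progress(np.array([0.0, 600.0, 1800.0]), 41.8, 28.0, 0.013)
```

Fitting Km and Vmax then reduces to least-squares matching of this expression against the measured NMR progress curve, which is why a single substrate concentration suffices.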
The Characteristic Curves of Water
Neumaier, Arnold; Deiters, Ulrich K.
2016-09-01
In 1960, E. H. Brown defined a set of characteristic curves (also known as ideal curves) of pure fluids, along which some thermodynamic properties match those of an ideal gas. These curves are used for testing the extrapolation behaviour of equations of state. This work is revisited, and an elegant representation of the first-order characteristic curves as level curves of a master function is proposed. It is shown that Brown's postulate—that these curves are unique and dome-shaped in a double-logarithmic p, T representation—may fail for fluids exhibiting a density anomaly. A careful study of the Amagat curve (Joule inversion curve) generated from the IAPWS-95 reference equation of state for water reveals the existence of an additional branch.
NLINEAR - NONLINEAR CURVE FITTING PROGRAM
Everhart, J. L.
1994-01-01
A common method for fitting data is a least-squares fit. In the least-squares method, a user-specified fitting function is utilized in such a way as to minimize the sum of the squares of the distances between the data points and the fitting curve. The Nonlinear Curve Fitting Program, NLINEAR, is an interactive curve-fitting routine based on a description of the quadratic expansion of the chi-squared statistic. NLINEAR utilizes a nonlinear optimization algorithm that calculates the best statistically weighted values of the parameters of the fitting function and the chi-square that is to be minimized. The inputs to the program are the mathematical form of the fitting function and the initial values of the parameters to be estimated. This approach provides the user with statistical information such as goodness of fit and estimated values of parameters that produce the highest degree of correlation between the experimental data and the mathematical model. In the mathematical formulation of the algorithm, the Taylor expansion of chi-square is first introduced, and justification for retaining only the first term is presented. From the expansion, a set of n simultaneous linear equations is derived and solved by matrix algebra. To achieve convergence, the algorithm requires meaningful initial estimates for the parameters of the fitting function. NLINEAR is written in Fortran 77 for execution on a CDC Cyber 750 under NOS 2.3. It has a central memory requirement of 5K 60-bit words. Optionally, graphical output of the fitting function can be plotted; Tektronix PLOT-10 routines are required for graphics. NLINEAR was developed in 1987.
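The scheme the abstract describes, expanding chi-square to first order in the parameters and solving the resulting simultaneous linear equations by matrix algebra, is the classic Gauss-Newton iteration. A compact sketch of that idea (generic code in Python, not NLINEAR's Fortran source):

```python
import numpy as np

def gauss_newton(f, jac, p0, x, y, sigma, n_iter=30):
    """Minimise chi^2 = sum(((y - f(x, p)) / sigma)**2) by repeatedly
    linearising f around p (first-order Taylor term only) and solving
    the resulting normal equations by matrix algebra."""
    p = np.asarray(p0, dtype=float)
    for _ in range(n_iter):
        r = (y - f(x, p)) / sigma            # statistically weighted residuals
        J = jac(x, p) / sigma[:, None]       # weighted Jacobian
        p = p + np.linalg.solve(J.T @ J, J.T @ r)
    return p

# illustrative fitting function: exponential decay y = a * exp(-b * x)
f = lambda x, p: p[0] * np.exp(-p[1] * x)
jac = lambda x, p: np.column_stack([np.exp(-p[1] * x),
                                    -p[0] * x * np.exp(-p[1] * x)])
```

As the abstract notes, convergence hinges on meaningful initial estimates; far from the optimum the dropped higher-order terms matter and the iteration can diverge.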
Directory of Open Access Journals (Sweden)
Je Hyun Baekt
2000-01-01
A numerical study is conducted of the fully developed laminar flow of an incompressible viscous fluid in a square duct rotating about an axis perpendicular to the axial direction of the duct. In a straight duct, the rotation produces vortices due to the Coriolis force: generally two vortex cells are formed, and the axial velocity distribution is distorted by the effect of this force. When the convective force is weak, two counter-rotating vortices with a quasi-parabolic axial velocity profile appear at weak rotation rates. As the rotation rate increases, the axial velocity on the vertical centreline of the duct begins to flatten, and the vorticity center moves toward the wall under the effect of the Coriolis force. When the convective inertia force is strong, a double-vortex secondary flow appears in the transverse planes of the duct at weak rotation rates, but as the speed of rotation increases the secondary flow splits into an asymmetric configuration of four counter-rotating vortices. If the rotation rate is increased further, the secondary flow restabilizes to a slightly asymmetric double-vortex configuration. A numerical study is also conducted of the laminar flow of an incompressible viscous fluid in a 90°-bend square duct that rotates about an axis parallel to the axial direction of the inlet. In the 90°-bend square duct, the flow is shaped by both the centrifugal and Coriolis forces: the secondary flow is driven by the centrifugal force in the curved region and by the Coriolis force in the downstream region, each force being dominant in its respective region.
Global experience curves for wind farms
International Nuclear Information System (INIS)
Junginger, M.; Faaij, A.; Turkenburg, W.C.
2005-01-01
In order to forecast the technological development and cost of wind turbines and the production costs of wind electricity, frequent use is made of the so-called experience curve concept. Experience curves of wind turbines are generally based on data describing the development of national markets, which causes a number of problems when they are applied to global assessments. To analyze global wind energy price development more adequately, we compose a global experience curve. First, underlying factors for past and potential future price reductions of wind turbines are analyzed, and possible implications and pitfalls of applying the experience curve methodology are assessed. Second, we present and discuss a new approach to establishing a global experience curve, and thus a global progress ratio, for the investment cost of wind farms. Results show that global progress ratios for wind farms may lie between 77% and 85% (with an average of 81%), which is significantly more optimistic than the progress ratios applied in most current scenario studies and integrated assessment models. While the findings are based on a limited amount of data, they may indicate faster price-reduction opportunities than assumed so far. With this global experience curve we aim to improve the reliability of describing the speed with which global costs of wind power may decline.
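A progress ratio maps directly onto a cost model: each doubling of cumulative installed capacity multiplies unit cost by the progress ratio, i.e. C = C0 * (cum/cum0)**log2(PR). A minimal sketch under that standard definition (the function name and the cost figures are illustrative; only the 81% average progress ratio comes from the abstract):

```python
import math

def experience_cost(c0, cum0, cum, progress_ratio):
    """Experience-curve cost: unit cost falls to `progress_ratio` of its
    previous value for every doubling of cumulative capacity."""
    b = math.log2(progress_ratio)      # learning exponent (negative for PR < 1)
    return c0 * (cum / cum0) ** b

# with the average global progress ratio of 81%, one capacity doubling
# takes an illustrative 1000 (currency units/kW) down to 810
cost_after_doubling = experience_cost(1000.0, 1.0, 2.0, 0.81)
```

A progress ratio of 77% versus 85% makes a large cumulative difference: after five doublings the cost multipliers are roughly 0.27 and 0.44, respectively, which is why the choice matters in scenario studies.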
Learning curves in health professions education.
Pusic, Martin V; Boutis, Kathy; Hatala, Rose; Cook, David A
2015-08-01
Learning curves, which graphically show the relationship between learning effort and achievement, are common in published education research but are not often used in day-to-day educational activities. The purpose of this article is to describe the generation and analysis of learning curves and their applicability to health professions education. The authors argue that the time is right for a closer look at using learning curves, given their desirable properties, to inform both self-directed instruction by individuals and education management by instructors. A typical learning curve is made up of a measure of learning (y-axis), a measure of effort (x-axis), and a mathematical linking function. At the individual level, learning curves make manifest a single person's progress towards competence, including his or her rate of learning, the inflection point where learning becomes more effortful, and the remaining distance to mastery attainment. At the group level, overlaid learning curves show the full variation of a group of learners' paths through a given learning domain; specifically, they make overt the difference between time-based and competency-based approaches to instruction. Additionally, instructors can use learning curve information to target educational resources more accurately to those who most require them. The learning curve approach requires a fine-grained collection of data that will not be possible in all educational settings; however, increased use of an assessment paradigm that explicitly includes effort and its link to individual achievement could result in increased learner engagement and more effective instructional design.
Directory of Open Access Journals (Sweden)
Sergey A. Cherkis
2007-03-01
A typical solution of an integrable system is described in terms of a holomorphic curve and a line bundle over it. The curve provides the action variables, while the time evolution is a linear flow on the curve's Jacobian. Even though the system of Nahm equations is closely related to the Hitchin system, the curves appearing in these two cases have very different natures: the former can be described in terms of a classical scattering problem, while the latter provides a solution to a Seiberg-Witten gauge theory. This note identifies the setup in which one can formulate the question of relating the two curves.
Directory of Open Access Journals (Sweden)
Wei Zhao
2017-01-01
The ability to quantitatively evaluate the visual feedback of drivers is a primary research need for reducing crashes in snow and ice (SI) environments. Differently colored Chevron alignment signs cause diverse visual effects, but the effect of Chevrons on visual feedback and on driving reactions while navigating curves in SI environments has not been adequately evaluated. The objective of this study is twofold: (1) an effective, long-term experiment was designed and developed to test the effect of colored Chevrons on drivers' vision and vehicle speed; and (2) a new quantitative effect-evaluation model is employed to measure the effect of different colors of Chevrons. Fixation duration and pupil size were used to describe the driver's visual response, and Cohen's d was used to evaluate the colors' psychological effect on drivers. The results showed that (1) after choosing the proper color for the Chevrons, drivers reduced vehicle speed while approaching curves; (2) it was easier for drivers to identify the road alignment after the Chevrons were installed; and (3) Cohen's d values associated with different Chevron colors correspond to different effect sizes. These conclusions provide practical references for freeway warning products and the design of intelligent vehicles.
Liquefaction Probability Curves for Surficial Geologic Units
Holzer, T. L.; Noce, T. E.; Bennett, M. J.
2009-12-01
Liquefaction probability curves that predict the probability of surface manifestations of earthquake-induced liquefaction are developed for 14 different surficial geologic deposits. The geologic units include alluvial fan, beach ridge, river delta, eolian dune, point bar, floodbasin, natural river levee, abandoned river channel, deep-water lake, lagoonal, sandy artificial fill, and valley train deposits. Probability is conditioned on earthquake magnitude and peak ground acceleration. Curves are developed for water table depths of 1.5 and 5.0 m. Probabilities were derived from complementary cumulative frequency distributions of the liquefaction potential index (LPI) that were computed from 935 cone penetration tests. Most of the curves can be fit with a 3-parameter logistic function, which facilitates computations of probability. For natural deposits with a water table at 1.5 m depth and subjected to an M7.5 earthquake with a PGA = 0.25 g, probabilities range from 0.5 for fluvial point bar, barrier island beach ridge, and deltaic deposits. Retrospective predictions of liquefaction during historical earthquakes based on the curves compare favorably to post-earthquake observations. We also have used the curves to assign ranges of liquefaction probabilities to the susceptibility categories proposed by Youd and Perkins (1978) for different geologic deposits. For the earthquake loading and conditions described above, probabilities range from 0-0.08 for low, 0.09-0.30 for moderate, 0.31-0.62 for high, to 0.63-1.00 for very high susceptibility. Liquefaction probability curves have two primary practical applications. First, the curves can be combined with seismic source characterizations to transform surficial geologic maps into probabilistic liquefaction hazard maps. Geographically specific curves are clearly desirable, but in the absence of such information, generic liquefaction probability curves provide a first approximation of liquefaction hazard. Such maps are useful both
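The 3-parameter logistic fit mentioned above can be sketched as follows; the parameter values in the usage line are hypothetical placeholders for illustration, not fitted values from the study.

```python
import math

def liquefaction_probability(pga, cap, slope, midpoint):
    """3-parameter logistic curve: probability of surface manifestations
    of liquefaction as a function of peak ground acceleration (in g)."""
    return cap / (1.0 + math.exp(-slope * (pga - midpoint)))

# Hypothetical parameters: ceiling 0.85, slope 12 per g, midpoint 0.25 g.
p = liquefaction_probability(0.25, 0.85, 12.0, 0.25)
```

At the midpoint acceleration the curve returns half its ceiling value, which makes the parameters easy to interpret when fitting LPI exceedance frequencies.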
International Nuclear Information System (INIS)
Szuta, M.; Dąbrowski, L.
2013-01-01
Crossing the experimental critical fuel temperature, which depends on burn-up, an onset of fission gas burst release is observed. This observed phenomenon can be explained by the assumption that fission gas immobilization in uranium dioxide irradiated to a fluence greater than 10^19 fissions/cm³ is mainly due to radiation-induced chemical activity. Application of the "ab initio" method shows that the bond energies of xenon and krypton are equal to -1.23 eV and -3.42 eV, respectively. Assuming further that the chemically bound gas can be released mainly in the process of re-crystallization, and modifying Ainscough's differential equation of grain growth by including the burn-up dependence and the experimental data of limiting grain size as a function of fuel temperature for un-irradiated and irradiated fuel, we can re-construct the experimental curve of Vitanza. (authors)
DEFF Research Database (Denmark)
Akbari, Abolghasem; Samah, Azizan Abu; Daryabor, Farshid
2016-01-01
This study aims to develop a methodology for generating a flood runoff susceptibility (FRS) map using a revised curve number (CN) method. The study area is in the Kuantan watershed (KW), Malaysia, which was seriously affected by floods in December 2013 and December 2014. A revised runoff CN map... Approximately 5% of the study area was identified as a very high-risk zone and 13% as a high-risk zone. However, the spatial extent of the high-risk zone in the downstream end and lowland areas of the KW could be considered to be the main cause of flood damage in recent years. From a practical point of view, the finding of this research provides a road map for government agencies to effectively implement flood mitigation projects in the study area.
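For background, the standard SCS curve number relation underlying any runoff CN map (a generic textbook sketch; the paper's revised CN values are not reproduced here) computes direct runoff from rainfall depth and CN:

```python
def scs_runoff(p_in, cn):
    """SCS curve number direct runoff (inches).
    S = 1000/CN - 10 is the potential maximum retention;
    Ia = 0.2*S is the standard initial-abstraction assumption."""
    s = 1000.0 / cn - 10.0
    ia = 0.2 * s
    if p_in <= ia:
        return 0.0  # all rainfall absorbed before runoff begins
    return (p_in - ia) ** 2 / (p_in - ia + s)
```

A CN of 100 (impervious surface) returns all rainfall as runoff, while lower CN values absorb an increasing share of the storm.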
Intersection numbers of spectral curves
Eynard, B
2011-01-01
We compute the symplectic invariants of an arbitrary spectral curve with only one branchpoint in terms of integrals of characteristic classes in the moduli space of curves. Our formula associates to any spectral curve a characteristic class, which is determined by the Laplace transform of the spectral curve. This is a hint to the key role of the Laplace transform in mirror symmetry. When the spectral curve is y=\sqrt{x}, the formula gives the Kontsevich-Witten intersection numbers; when the spectral curve is chosen to be the Lambert function \exp{x}=y\exp{-y}, the formula gives the ELSV formula for Hurwitz numbers; and when one chooses the mirror of C^3 with framing f, i.e. \exp{-x}=\exp{-yf}(1-\exp{-y}), the formula gives the topological vertex formula, i.e. the generating function of Gromov-Witten invariants of C^3. In some sense this formula generalizes the ELSV formula and Mumford's formula.
Research on an innovative modification algorithm of NURBS curve interpolation
Zhang, Wanjun; Gao, Shanping; Cheng, Xiyan; Zhang, Feng
2017-04-01
In order to solve the problems of existing modification algorithms for NURBS curve interpolation, such as long interpolation times and step and chord errors that are not easily adjusted, a novel modification algorithm for NURBS curve interpolation is proposed. The algorithm has merits such as higher interpolation position accuracy and short processing time. In the simulation, an open five-axis CNC platform based on the SIEMENS 840D CNC system is developed to verify the proposed modification algorithm of NURBS curve interpolation experimentally. The simulation results show that the algorithm is correct and consistent with NURBS curve interpolation requirements.
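The abstract does not give the algorithm itself; as background, a minimal NURBS curve evaluator (Cox-de Boor basis recursion plus rational weighting) can be sketched as:

```python
def bspline_basis(i, p, u, knots):
    """Cox-de Boor recursion for the B-spline basis function N_{i,p}(u)."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = 0.0
    d1 = knots[i + p] - knots[i]
    if d1 > 0:
        left = (u - knots[i]) / d1 * bspline_basis(i, p - 1, u, knots)
    right = 0.0
    d2 = knots[i + p + 1] - knots[i + 1]
    if d2 > 0:
        right = (knots[i + p + 1] - u) / d2 * bspline_basis(i + 1, p - 1, u, knots)
    return left + right

def nurbs_point(u, ctrl, weights, knots, p):
    """Evaluate a 2-D NURBS curve point at parameter u."""
    num = [0.0, 0.0]
    den = 0.0
    for i, (pt, w) in enumerate(zip(ctrl, weights)):
        b = bspline_basis(i, p, u, knots) * w
        num[0] += b * pt[0]
        num[1] += b * pt[1]
        den += b
    return (num[0] / den, num[1] / den)
```

With all weights equal to 1 and a clamped knot vector the curve reduces to a Bezier curve, which gives an easy sanity check. Note the half-open convention in the degree-0 basis means u must be strictly less than the last knot; interpolators typically special-case the curve endpoint.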
Reflection of curved shock waves
Mölder, S.
2017-09-01
Shock curvatures are related to pressure gradients, streamline curvatures and vorticity in flows with planar and axial symmetry. Explicit expressions, in an influence coefficient format, are used to relate post-shock pressure gradient, streamline curvature and vorticity to pre-shock gradients and shock curvature in steady flow. Using higher order, von Neumann-type, compatibility conditions, curved shock theory is applied to calculate the flow near singly and doubly curved shocks on curved surfaces, in regular shock reflection and in Mach reflection. Theoretical curved shock shapes are in good agreement with computational fluid dynamics calculations and experiment.
Some issues using the master curve concept
International Nuclear Information System (INIS)
Viehrig, H.W.; Boehmert, J.
2000-01-01
The state-of-the-art structural integrity assessment of reactor pressure vessels (RPV) is based on the Reference Temperature Concept, initially proposed by the American Society of Mechanical Engineers (ASME). An experimentally ensured fracture toughness curve was constructed as the lower boundary of the available fracture toughness, K_IC, of RPV steels. This conservative curve describing K_IC as a function of temperature is used as a universal curve. For different RPV steels the curve is placed on the temperature axis via a reference temperature. In the initial state the nil-ductility-transition temperature (RT_NDT) is applied as the reference temperature. The ductile-to-brittle transition temperature (DBTT) shift caused by neutron irradiation is determined by Charpy V impact tests. The Charpy V DBTT shift is one of the results of the RPV surveillance programmes. The concept based on the ASME curve has the following disadvantages: - it is not consistent, since it links fracture-mechanical and technological parameters, and - margins of safety and uncertainties cannot be quantified. (orig.)
Modelling stochastic changes in curve shape, with an application to cancer diagnostics
DEFF Research Database (Denmark)
Hobolth, A; Jensen, Eva B. Vedel
2000-01-01
Often, the statistical analysis of the shape of a random planar curve is based on a model for a polygonal approximation to the curve. In the present paper, we instead describe the curve as a continuous stochastic deformation of a template curve. The advantage of this continuous approach is that...
Zarzycki, Piotr; Thomas, Fabien
2006-10-15
The parallel shape of the potentiometric titration curves for montmorillonite suspension is explained using the surface complexation model and taking into account the surface heterogeneity. The homogeneous models give accurate predictions only if they assume unphysically large values of the equilibrium constants for the exchange process occurring on the basal plane. However, the assumption that the basal plane is energetically heterogeneous allows fitting of the experimental data (reported by Avena and De Pauli [M. Avena, C.P. De Pauli, J. Colloid Interface Sci. 202 (1998) 195-204]) for a reasonable value of the exchange equilibrium constant equal to 1.26 (suggested by Fletcher and Sposito [P. Fletcher, G. Sposito, Clay Miner. 24 (1989) 375-391]). Moreover, we observed the typical behavior of the point of zero net proton charge (pznpc) as a function of the logarithm of the electrolyte concentration (log[C]). We showed that the slope of the linear dependence, pznpc = f(log[C]), is proportional to the number of isomorphic substitutions in the crystal phase, which was also observed in the experimental studies.
International Nuclear Information System (INIS)
Moon, B. S.; Han, S. H.; Kim, Y. K.; Chung, C. E.
2001-01-01
The conversion efficiency of a cesium iodide coated micro-channel plate is studied. We use the EGS4 code to transport photons and generated electrons until their energies become less than 1 keV and 10 keV, respectively. Among the generated electrons, the emission from the secondary electrons located within the escape depth of 56 mm from the photo-converter boundary is estimated by integrating the product of the secondary electrons with a probability depending only on their geometric locations. The secondary electron emission from the generated electrons of energy higher than 100 eV is estimated by the 'universal yield curve'. The sum of these provides an estimate for the secondary electron yield, and we show that results of applying this algorithm agree with known experimental results. Using this algorithm, we computed secondary electron emissions from a micro-channel plate used in a gas electron multiplier detector that is currently being developed at the Korea Atomic Energy Research Institute.
Directory of Open Access Journals (Sweden)
Ronald ORDINOLA-ZAPATA
2014-12-01
Full Text Available Objective: To evaluate the shaping ability of Reciproc and Twisted-File Adaptive systems in rapid prototyping replicas. Material and Methods: Two mandibular molars showing S-shaped and 62-degree curvatures in the mesial root were scanned by using a microcomputed tomography (μCT) system. The data were exported in the stereolithographic format and 20 samples of each molar were printed at 16 µm resolution. The mesial canals of 10 replicas of each specimen were prepared with each system. Transportation was measured by overlapping radiographs taken before and after preparation, and resin thickness after instrumentation was measured by μCT. Results: Both systems maintained the original shape of the apical third in both anatomies (P>0.05). Overall, considering the resin thickness in the 62-degree replicas, no statistical difference was found between the systems (P>0.05). In the S-shaped curvature replica, Reciproc significantly decreased the thickness of the resin walls in comparison with TF Adaptive. Conclusions: The evaluated systems were able to maintain the original shape at the apical third of severely curved mesial canals of molar replicas.
International Nuclear Information System (INIS)
Yang, Xiaoli; Hofmann, Ralf; Dapp, Robin; Van de Kamp, Thomas; Rolo, Tomy dos Santos; Xiao, Xianghui; Moosmann, Julian; Kashef, Jubin; Stotzka, Rainer
2015-01-01
High-resolution, three-dimensional (3D) imaging of soft tissues requires the solution of two inverse problems: phase retrieval and the reconstruction of the 3D image from a tomographic stack of two-dimensional (2D) projections. The number of projections per stack should be small to accommodate fast tomography of rapid processes and to constrain X-ray radiation dose to optimal levels, either to increase the duration of in vivo time-lapse series at a given goal for spatial resolution and/or to conserve structure under X-ray irradiation. In pursuing the 3D reconstruction problem in the sense of compressive sampling theory, we propose to reduce the number of projections by applying an advanced algebraic technique subject to the minimisation of the total variation (TV) in the reconstructed slice. This problem is formulated in a Lagrangian multiplier fashion, with the parameter value determined by appealing to a discrete L-curve in conjunction with a conjugate gradient method. The usefulness of this reconstruction modality is demonstrated for simulated and in vivo data, the latter acquired in parallel-beam imaging experiments using synchrotron radiation.
Regular homotopy of Hurwitz curves
International Nuclear Information System (INIS)
Auroux, D; Kulikov, Vik S; Shevchishin, V V
2004-01-01
We prove that any two irreducible cuspidal Hurwitz curves C 0 adn C 1 (or, more generally, two curves with A-type singularities) in the Hirzebruch surface F N with the same homology classes and sets of singularities are regular homotopic. Moreover, they are symplectically regular homotopic if C 0 and C 1 are symplectic with respect to a compatible symplectic form
Space curves, anholonomy and nonlinearity
Indian Academy of Sciences (India)
... Fermi-Walker parallel transport [14] of any vector P moved ... of Fermi-Walker parallel transport to the case of a moving space curve. 4. General curve evolution equations ... The nonlinear term of the Lamb equation (eq. (34)) is just the time derivative of the total.
Replication and analysis of Ebbinghaus' forgetting curve
Murre, J.M.J.; Dros, J.
2015-01-01
We present a successful replication of Ebbinghaus’ classic forgetting curve from 1880 based on the method of savings. One subject spent 70 hours learning lists and relearning them after 20 min, 1 hour, 9 hours, 1 day, 2 days, or 31 days. The results are similar to Ebbinghaus' original data. We
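For context, Ebbinghaus' own descriptive formula for percent savings can be sketched as below; the constants are the values he reported in the 1885 monograph, not the replication's fit, and are included only as an illustration.

```python
import math

def ebbinghaus_savings(t_minutes, k=1.84, c=1.25):
    """Ebbinghaus' classical descriptive forgetting formula:
    percent savings b = 100*k / ((log10 t)^c + k), with t in minutes.
    Defaults are the constants reported by Ebbinghaus (1885)."""
    if t_minutes <= 1:
        return 100.0  # immediate relearning: full savings
    return 100.0 * k / (math.log10(t_minutes) ** c + k)
```

The curve drops steeply at first and then flattens, which is the qualitative shape the replication recovers with the savings method.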
Electronic properties of curved graphene sheets
Cortijo, Alberto; Vozmediano, Maria A. H.
2006-01-01
A model is proposed to study the electronic structure of slightly curved graphene sheets with an arbitrary number of pentagon-heptagon pairs and Stone-Wales defects based on a cosmological analogy. The disorder induced by curvature produces characteristic patterns in the local density of states that can be observed in scanning tunnel and transmission electron microscopy.
Numerical analysis of thermoluminescence glow curves
International Nuclear Information System (INIS)
Gomez Ros, J. M.; Delgado, A.
1989-01-01
This report presents a method for the numerical analysis of complex thermoluminescence glow curves, resolving the individual glow peak components. The method employs first-order kinetics analytical expressions and is based on a Marquardt-Levenberg minimization procedure. A simplified version of this method for thermoluminescence dosimetry (TLD) is also described, specifically developed to operate with lithium fluoride TLD-100. (Author). 36 refs
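A first-order (Randall-Wilkins) glow peak of the kind resolved by such fits can be generated numerically as below; the kinetic parameters used in the test are illustrative, not TLD-100 values, and the exponential integral is evaluated by simple quadrature rather than the analytical approximations a fitting code would use.

```python
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K

def glow_curve(e_act, s, beta, n0, t_start, t_end, dt=0.1):
    """First-order (Randall-Wilkins) thermoluminescence glow curve,
    I(T) = n0*s*exp(-E/kT)*exp(-(s/beta) * integral of exp(-E/kT') dT'),
    computed by direct numerical integration for a linear heating
    rate beta (K/s). Returns parallel lists of T and intensity."""
    temps, intensities = [], []
    integral = 0.0
    t = t_start
    while t <= t_end:
        rate = s * math.exp(-e_act / (K_B * t))
        integral += rate * dt / beta
        intensities.append(n0 * rate * math.exp(-integral))
        temps.append(t)
        t += dt
    return temps, intensities
```

A multi-peak glow curve is then just a sum of such components with different trap depths, which is what the minimization procedure deconvolves.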
Remote sensing used for power curves
DEFF Research Database (Denmark)
Wagner, Rozenn; Ejsing Jørgensen, Hans; Schmidt Paulsen, Uwe
2008-01-01
Power curve measurement for large wind turbines requires taking into account more parameters than only the wind speed at hub height. Based on results from aerodynamic simulations, an equivalent wind speed taking the wind shear into account was defined and found to reduce the power standard deviat...
New Explicit Conditions of Elliptic Curve Traces for FR-Reduction
MIYAJI, Atsuko; NAKABAYASHI, Masaki; TAKANO, Shunzou
2001-01-01
Elliptic curve cryptosystems are based on the elliptic curve discrete logarithm problem (ECDLP). If elliptic curve cryptosystems avoid FR-reduction and anomalous elliptic curve over F_q, then with current knowledge we can construct elliptic curve cryptosystems over a smaller definition field. ECDLP has an interesting property that the security deeply depends on elliptic curve traces rather than definition fields, which does not occur in the case of the discrete logarithm problem (DLP). Theref...
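As background on the group law underlying ECDLP (a generic textbook sketch, not the trace conditions studied in the paper), affine point addition on a short Weierstrass curve over a prime field looks like:

```python
def ec_add(P, Q, a, p):
    """Affine point addition on y^2 = x^3 + a*x + b over F_p.
    None represents the point at infinity (the group identity)."""
    if P is None:
        return Q
    if Q is None:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None  # P + (-P) = infinity
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p  # tangent slope
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p  # chord slope
    x3 = (lam * lam - x1 - x2) % p
    y3 = (lam * (x1 - x3) - y1) % p
    return (x3, y3)
```

The toy curve used in the test, y^2 = x^3 + 2x + 3 over F_97, is for illustration only; cryptographic curves use primes of roughly 256 bits, and ECDLP is the problem of inverting repeated application of this addition.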
Brachistochrone curve of a fluid filled cylinder
Sarma, Srikanth; Raja, Sharan; Mahapatra, Pallab Sinha; Panchangnula, Mahesh
2017-11-01
The brachistochrone curve for a non-dissipative particle tries to maximize the inertia of the particle, but for a fluid-filled cylinder, increasing inertia would amount to high dissipative losses. Hence the trade-off between inertia and dissipation plays a vital role in the dynamics of a fluid-filled cylinder. This trade-off manifests itself in the form of an integro-differential equation governing the angular acceleration of the cylinder. Here, we compute the brachistochrone curve using optimal control principles and investigate the effect of the aforementioned trade-off on the deviation of the brachistochrone curve from that of a non-dissipative particle. Also, we investigate the effects of the non-dimensional parameters of the problem on the shape of the brachistochrone curve. We analyze the dissipation rate during the cylinder's motion and show that energy-based arguments do not hold for a fluid-filled cylinder. We then analyze the stability of the time-varying fluid flow in the cylinder and find an admissible region for the terminal point which would ensure the stability of the fluid flow as the cylinder rolls over the brachistochrone curve.
Comparison of power curve monitoring methods
Directory of Open Access Journals (Sweden)
Cambron Philippe
2017-01-01
Full Text Available Performance monitoring is an important aspect of operating wind farms. This can be done through power curve monitoring (PCM) of wind turbines (WT). In past years, important work has been conducted on PCM. Various methodologies have been proposed, each one with interesting results. However, it is difficult to compare these methods because they have been developed using their respective data sets. The objective of the present work is to compare some of the proposed PCM methods using common data sets. The metric used to compare the PCM methods is the time needed to detect a change in the power curve. Two power curve models are covered to establish the effect the model type has on the monitoring outcomes. Each model was tested with two control charts. Other methodologies and metrics proposed in the literature for power curve monitoring, such as areas under the power curve and the use of statistical copulas, have also been covered. Results demonstrate that model-based PCM methods are more reliable at detecting a performance change than other methodologies and that the effectiveness of the control chart depends on the type of shift observed.
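The simplest power curve model against which such monitoring methods are usually benchmarked is the method of bins (in the spirit of IEC 61400-12-1); the sketch below is illustrative and not any of the paper's specific models:

```python
def binned_power_curve(wind_speeds, powers, bin_width=0.5):
    """Method-of-bins power curve: mean power output in fixed
    wind-speed bins, keyed by bin-center wind speed."""
    bins = {}
    for v, p in zip(wind_speeds, powers):
        key = int(v / bin_width)
        bins.setdefault(key, []).append(p)
    return {(k + 0.5) * bin_width: sum(ps) / len(ps)
            for k, ps in sorted(bins.items())}
```

A monitoring scheme can then compare freshly binned data against a reference curve and feed the deviations into a control chart.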
Computational aspects of algebraic curves
Shaska, Tanush
2005-01-01
The development of new computational techniques and better computing power has made it possible to attack some classical problems of algebraic geometry. The main goal of this book is to highlight such computational techniques related to algebraic curves. The area of research in algebraic curves is receiving more interest not only from the mathematics community, but also from engineers and computer scientists, because of the importance of algebraic curves in applications including cryptography, coding theory, error-correcting codes, digital imaging, computer vision, and many more.This book cove
CYCLING CURVES AND THEIR APPLICATIONS
Directory of Open Access Journals (Sweden)
RAICU Lucian
2015-06-01
Full Text Available This paper proposes an analysis of the cyclic curves that can be considered among the most important regarding their applications in science, technology, design, architecture and art. These curves include the following: the cycloid, epicycloid, hypocycloid, spherical cycloid and special cases thereof. In the first part of the paper the main curves of the cycloid family are presented with their methods of generation and their parametric equations. In the last part some cycloid applications are highlighted in different areas of science, technology and art.
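The parametric equations mentioned above take the standard forms below, for a generating circle of radius r rolling on a line (cycloid) or outside a fixed circle of radius R (epicycloid):

```python
import math

def cycloid(r, t):
    """Point on a cycloid traced by a circle of radius r rolling
    on a line: x = r(t - sin t), y = r(1 - cos t)."""
    return (r * (t - math.sin(t)), r * (1 - math.cos(t)))

def epicycloid(R, r, t):
    """Point on an epicycloid: a circle of radius r rolling on the
    outside of a fixed circle of radius R."""
    k = (R + r) / r
    return ((R + r) * math.cos(t) - r * math.cos(k * t),
            (R + r) * math.sin(t) - r * math.sin(k * t))
```

The hypocycloid is obtained from the epicycloid equations by replacing r with -r (rolling inside the fixed circle).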
Designing the Alluvial Riverbeds in Curved Paths
Macura, Viliam; Škrinár, Andrej; Štefunková, Zuzana; Muchová, Zlatica; Majorošová, Martina
2017-10-01
The paper presents the method of determining the shape of the riverbed in curves of the watercourse, which is based on the method of Ikeda (1975) developed for a slightly curved path in sandy riverbed. Regulated rivers have essentially slightly and smoothly curved paths; therefore, this methodology provides the appropriate basis for river restoration. Based on the research in the experimental reach of the Holeška Brook and several alluvial mountain streams the methodology was adjusted. The method also takes into account other important characteristics of bottom material - the shape and orientation of the particles, settling velocity and drag coefficients. Thus, the method is mainly meant for the natural sand-gravel material, which is heterogeneous and the particle shape of the bottom material is very different from spherical. The calculation of the river channel in the curved path provides the basis for the design of optimal habitat, but also for the design of foundations of armouring of the bankside of the channel. The input data is adapted to the conditions of design practice.
Elliptic Curves and Number Theory
Indian Academy of Sciences (India)
R. Sujatha, School of Mathematics, Tata Institute of Fundamental Research, Mumbai, INDIA
1. Aim: To explain the connection between a simple ancient problem in number theory and a deep sophisticated conjecture about Elliptic Curves. ('arithmetic Geometry'). Notation: N : set of natural numbers (1,2,3,...) ...
51Cr - erythrocyte survival curves
International Nuclear Information System (INIS)
Paiva Costa, J. de.
1982-07-01
Sixteen patients were studied: fifteen patients in a hemolytic state, and a normal individual as a control. The aim was to obtain better techniques for the analysis of erythrocyte survival curves, according to the recommendations of the International Committee of Hematology. The radioactive chromium (51Cr) method was used as a tracer. A review of the international literature was first made on the aspects relevant to this work, making it possible to establish comparisons and clarify phenomena observed in our investigation. Several parameters were considered in this study, covering both the exponential and the linear curves. The analysis of the survival curves of the erythrocytes in the studied group revealed that the elution factor did not present a quantitatively homogeneous answer for all, although the results of the analysis of these curves were established through programs run on an electronic calculator. (Author) [pt
Management of the learning curve
DEFF Research Database (Denmark)
Pedersen, Peter-Christian; Slepniov, Dmitrij
2016-01-01
Purpose – This paper focuses on the management of the learning curve in overseas capacity expansions. The purpose of this paper is to unravel the direct as well as indirect influences on the learning curve and to advance the understanding of how these affect its management. Design/methodology/approach – The study employs qualitative methodology and draws on a longitudinal, factory-level analysis of an in-depth case study of a Danish wind turbine manufacturer. Findings – This study goes beyond a simplistic treatment of the lead time and learning required to establish a new capacity. The authors examined the dimensions of the learning process involved in a capacity expansion project and identified the direct and indirect labour influences on the production learning curve. On this basis, the study proposes solutions to managing learning curves in overseas capacity expansions. Furthermore, the paper concludes...
Nonlinear mechanics of rigidifying curves.
Al Mosleh, Salem; Santangelo, Christian
2017-07-01
Thin shells are characterized by a high cost of stretching compared to bending. As a result isometries of the midsurface of a shell play a crucial role in their mechanics. In turn, curves on the midsurface with zero normal curvature play a critical role in determining the number and behavior of isometries. In this paper, we show how the presence of these curves results in a decrease in the number of linear isometries. Paradoxically, shells are also known to continuously fold more easily across these rigidifying curves than other curves on the surface. We show how including nonlinearities in the strain can explain these phenomena and demonstrate folding isometries with explicit solutions to the nonlinear isometry equations. In addition to explicit solutions, exact geometric arguments are given to validate and guide our analysis in a coordinate-free way.
Paner, T M; Riccelli, P V; Owczarzy, R; Benight, A S
1996-12-01
Optical melting curves of 22 DNA dumbbells with the 16-base-pair duplex sequence 5'-G-C-A-T-C-A-T-C-G-A-T-G-A-T-G-C-3' linked on both ends by single-strand loops of A_ℓ or C_ℓ sequences (ℓ = 2, 3, 4, 6, 8, 10, 14), T_ℓ sequences (ℓ = 2, 3, 4, 6, 8, 10), and G_ℓ sequences (ℓ = 2, 4) were measured in phosphate-buffered solvents containing 30, 70, and 120 mM Na+. For dumbbells with loops comprised of at least three nucleotides, stability is inversely proportional to end-loop size. Dumbbells with loops comprised of only two nucleotide bases generally have lower stabilities than dumbbells with three-base nucleotide loops. Experimental melting curves were analyzed in terms of the numerically exact (multistate) statistical thermodynamic model of DNA dumbbell melting previously described (T. M. Paner, M. Amaratunga & A. S. Benight (1992), Biopolymers 32, 881). Theoretically calculated melting curves were fitted to experimental curves by simultaneously adjusting model parameters representing statistical weights of intramolecular hairpin loop and single-strand circle states. The systematically determined empirical parameters provided evaluations of the energetics of hairpin loop formation as a function of loop size, sequence, and salt environment. Values of the free energies of hairpin loop formation, ΔG_loop(ℓ), and of single-strand circles, ΔG_cir(N), as a function of end-loop size ℓ = 2-14, circle size N = 32 + 2ℓ, and loop sequence were obtained. These quantities were found to depend on end-loop size but not loop sequence. Their empirically determined values also varied with solvent ionic strength. Analytical expressions for the partition function Q(T) of the dumbbells were evaluated using the empirically evaluated best-fit loop parameters. From Q(T), the melting transition enthalpy ΔH, entropy ΔS, and free energy ΔG were evaluated for the dumbbells as a function of end-loop size, sequence, and [Na+]. Since the multistate analysis
Matis, J.H.; Kiffe, T.R.; Werf, van der W.; Costamagna, A.C.; Matis, T.I.; Grant, W.E.
2009-01-01
Density dependent feedback, based on cumulative population size, has been advocated to explain and mathematically characterize “boom and bust” population dynamics. Such feedback results in a bell-shaped population trajectory of the population density. Here, we note that this trajectory is
Smith, Garon C.; Hossain, Md Mainul
2017-01-01
Species TOPOS is a free software package for generating three-dimensional (3-D) topographic surfaces ("topos") for acid-base equilibrium studies. This upgrade adds 3-D species distribution topos to earlier surfaces that showed pH and buffer capacity behavior during titration and dilution procedures. It constructs topos by plotting…
Identification and "reverse engineering" of Pythagorean-hodograph curves
Farouki, RT; Giannelli, C; Sestini, A
2015-01-01
Methods are developed to identify whether or not a given polynomial curve, specified by Bézier control points, is a Pythagorean-hodograph (PH) curve - and, if so, to reconstruct the internal algebraic structure that allows one to exploit the advantageous properties of PH curves. Two approaches to identification of PH curves are proposed. The first is based on the satisfaction of a system of algebraic constraints by the control-polygon legs, and the se...
Considerations for reference pump curves
International Nuclear Information System (INIS)
Stockton, N.B.
1992-01-01
This paper examines problems associated with inservice testing (IST) of pumps to assess their hydraulic performance using reference pump curves to establish acceptance criteria. Safety-related pumps at nuclear power plants are tested under the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code (the Code), Section XI. The Code requires testing pumps at specific reference points of differential pressure or flow rate that can be readily duplicated during subsequent tests. There are many cases where test conditions cannot be duplicated. For some pumps, such as service water or component cooling pumps, the flow rate at any time depends on plant conditions and the arrangement of multiple independent and constantly changing loads. System conditions cannot be controlled to duplicate a specific reference value. In these cases, utilities frequently request to use pump curves for comparison of test data for acceptance. There is no prescribed method for developing a pump reference curve. The methods vary and may yield substantially different results. Some results are conservative when compared to the Code requirements; some are not. The errors associated with different curve testing techniques should be understood and controlled within reasonable bounds. Manufacturer's pump curves, in general, are not sufficiently accurate to use as reference pump curves for IST. Testing using reference curves generated with polynomial least squares fits over limited ranges of pump operation, cubic spline interpolation, or cubic spline least squares fits can provide a measure of pump hydraulic performance that is at least as accurate as the Code required method. Regardless of the test method, error can be reduced by using more accurate instruments, by correcting for systematic errors, by increasing the number of data points, and by taking repetitive measurements at each data point
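One of the curve-generation methods discussed, a polynomial least-squares fit over a limited range of pump operation, can be sketched as follows; this is a generic normal-equations implementation, not plant-specific code:

```python
def polyfit_least_squares(xs, ys, degree):
    """Least-squares polynomial fit via the normal equations,
    solved by Gaussian elimination with partial pivoting;
    returns coefficients c0..c_degree (lowest power first)."""
    m = degree + 1
    # Build the normal-equation matrix A and right-hand side b.
    A = [[sum(x ** (i + j) for x in xs) for j in range(m)] for i in range(m)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(m)]
    # Forward elimination with partial pivoting.
    for col in range(m):
        piv = max(range(col, m), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, m):
            f = A[r][col] / A[col][col]
            for c in range(col, m):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # Back substitution.
    coef = [0.0] * m
    for r in range(m - 1, -1, -1):
        coef[r] = (b[r] - sum(A[r][c] * coef[c] for c in range(r + 1, m))) / A[r][r]
    return coef
```

Because the normal equations become ill-conditioned for wide numeric ranges, a practical fit of head versus flow should first scale flow into normalized units (as the test data below are).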
Potekaev, A. I.; Kondratyuk, A. A.; Porobova, S. A.; Klopotov, A. A.; Markova, T. N.; Kakushkin, Yu A.; Klopotov, V. D.
2016-11-01
The paper presents an analysis of binary phase diagrams based on elements of groups VIIIA and IB of the periodic table, together with crystal-geometry parameters of solid solutions and intermetallic compounds. The analysis shows an explicit correlation between the type of evolution of the phase diagrams, classified by Lebedev, and the nature of the deviations of the atomic volumes observed in solid solutions and intermetallic compounds from Zen's law.
Beam, Craig A.; Conant, Emily F.; Krupinski, Elizabeth A.; Kundel, Harold L.; Sickles, Edward A.
2005-04-01
We introduce an interesting interpretation of the ROC Curve that, subsequently, opens a new research paradigm. We define the "Diagnostician Operating Choice" (DOC) Curve to be the set of all (True Positive Probability/True Negative Probability) or ("skill in diseased population"/"skill in non-diseased population" when considered from the diagnostician's perspective) options made available to a particular radiologist when interpreting a particular diagnostic technology. The DOC Curve is, thus, the choice set presented to the diagnostician by their interaction with the technology. This new paradigm calls for tools that can measure the particular choice set of any particular individual radiologist interpreting a particular technology when applied in a particular clinical setting. Fundamental requirements for this paradigm are for the DOC Curve to be unique to individuals and constant across similar experimental conditions. To investigate constancy, we analyzed data from a reading study of 10 radiologists. Each radiologist interpreted the same set of 148 screening mammograms twice using a modified version of BI-RADS. ROC Curves for each radiologist were computed and compared between the two reading occasions with the CORROC2 program. None of the areas were statistically significantly different (p<0.05), providing confirmation (but not proof) of constancy across the two reading conditions. The DOC Curve paradigm suggests new areas of research focusing on the behavior in individuals interacting with technology. A clear need is for more efficient estimation of individual DOC Curves based on limited case sets. Paradoxically, the answer to this last problem might lie in using large population-based ("MRMC") studies to develop highly efficient and externally validated standardized testing tools for assessment of the individual.
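An empirical ROC (or, in the paper's reinterpretation, DOC) curve is simply the set of operating points traced by sweeping a decision threshold over the interpreter's scores; a minimal generic sketch:

```python
def roc_points(scores_pos, scores_neg):
    """Empirical ROC points (FPR, TPR) obtained by sweeping a
    threshold over all observed scores; higher score = 'diseased'."""
    thresholds = sorted(set(scores_pos) | set(scores_neg), reverse=True)
    pts = [(0.0, 0.0)]
    for th in thresholds:
        tpr = sum(s >= th for s in scores_pos) / len(scores_pos)
        fpr = sum(s >= th for s in scores_neg) / len(scores_neg)
        pts.append((fpr, tpr))
    return pts
```

Viewed as a DOC curve, each point is one (skill-in-diseased, skill-in-non-diseased) choice the technology makes available to the individual radiologist.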
Maximum likelihood decay curve fits by the simplex method
International Nuclear Information System (INIS)
Gregorich, K.E.
1991-01-01
A multicomponent decay curve analysis technique has been developed and incorporated into the decay curve fitting computer code, MLDS (maximum likelihood decay by the simplex method). The fitting criteria are based on the maximum likelihood technique for decay curves made up of time binned events. The probabilities used in the likelihood functions are based on the Poisson distribution, so decay curves constructed from a small number of events are treated correctly. A simple utility is included which allows the use of discrete event times, rather than time-binned data, to make maximum use of the decay information. The search for the maximum in the multidimensional likelihood surface for multi-component fits is performed by the simplex method, which makes the success of the iterative fits extremely insensitive to the initial values of the fit parameters and eliminates the problems of divergence. The simplex method also avoids the problem of programming the partial derivatives of the decay curves with respect to all the variable parameters, which makes the implementation of new types of decay curves straightforward. Any of the decay curve parameters can be fixed or allowed to vary. Asymmetric error limits for each of the free parameters, which do not consider the covariance of the other free parameters, are determined. A procedure is presented for determining the error limits which contain the associated covariances. The curve fitting procedure in MLDS can easily be adapted for fits to other curves with any functional form. (orig.)
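The fitting scheme described here, a Poisson likelihood on time-binned events maximized by the simplex method, can be sketched as follows. This is a minimal single-component illustration using SciPy's Nelder-Mead simplex; the parameter values and variable names are hypothetical, not taken from MLDS:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Simulate time-binned decay events: expected counts per bin are
# mu_i = A * lam * exp(-lam * t_i) * dt_i, with Poisson-distributed counts.
true_A, true_lam = 500.0, 0.3                 # total events, decay constant (1/s)
edges = np.arange(0.0, 20.0, 1.0)             # time-bin edges (s)
t = 0.5 * (edges[:-1] + edges[1:])            # bin centres
dt = np.diff(edges)
counts = rng.poisson(true_A * true_lam * np.exp(-true_lam * t) * dt)

def neg_log_likelihood(params):
    """Poisson negative log-likelihood (dropping the constant log(n!) term),
    so bins with few or zero events are treated correctly."""
    A, lam = params
    if A <= 0.0 or lam <= 0.0:
        return np.inf
    mu = A * lam * np.exp(-lam * t) * dt
    return float(np.sum(mu - counts * np.log(mu)))

# Nelder-Mead is a derivative-free simplex search: no partial derivatives of
# the decay function are needed, and it tolerates rough starting values.
res = minimize(neg_log_likelihood, x0=[100.0, 1.0], method="Nelder-Mead")
A_fit, lam_fit = res.x
```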
Curve Digitizer – A software for multiple curves digitizing
Directory of Open Access Journals (Sweden)
Florentin ŞPERLEA
2010-06-01
Full Text Available The Curve Digitizer is software that extracts data from an image file representing a graphic and returns them as pairs of numbers which can then be used for further analysis and applications. Numbers can be read on a computer screen, stored in files, or copied on paper. The final result is a data set that can be used with other tools such as MS EXCEL. Curve Digitizer provides a useful tool for any researcher or engineer interested in quantifying data displayed graphically. The image file can be obtained by scanning a document
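The core of any curve digitizer is the calibration step that maps pixel coordinates to data coordinates. A minimal sketch of that step, assuming linear (non-logarithmic) axes and no image rotation; the function name and reference points are hypothetical, not part of the Curve Digitizer software:

```python
def make_pixel_to_data(px_ref, data_ref):
    """Build a (px, py) -> (x, y) mapping from two calibration points that are
    known in both pixel space and data space (e.g. the axis origin and the
    point at the axis maxima).  Assumes linear axes and no rotation."""
    (px0, py0), (px1, py1) = px_ref
    (dx0, dy0), (dx1, dy1) = data_ref
    sx = (dx1 - dx0) / (px1 - px0)
    sy = (dy1 - dy0) / (py1 - py0)
    return lambda px, py: (dx0 + (px - px0) * sx, dy0 + (py - py0) * sy)

# Hypothetical calibration: pixel (50, 400) is data (0, 0); pixel (450, 50)
# is data (10, 100).  The y pixel axis runs downward, so sy comes out negative.
to_data = make_pixel_to_data([(50, 400), (450, 50)], [(0, 0), (10, 100)])
```

Every pixel clicked on the traced curve is then passed through `to_data` to produce one (x, y) pair of the exported data set.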
Energy Technology Data Exchange (ETDEWEB)
Zimmermann, Iris; Dbeyssi, Alaa; Khaneft, Dmitry; Maas, Frank; Zambrana, Manuel; Mora Espi, Maria Carmen; Morales Morales, Cristina; Lin, Dexu; Froehlich, Bertold; Capozza, Luigi; Noll, Oliver; Deiseroth, Malte; Ahmed, Samer; Ahmadi, Heybat; Valente, Roserio; Rodriguez Pineiro, David [Helmholtz-Institut Mainz (Germany); GSI Darmstadt (Germany); Collaboration: PANDA-Collaboration
2015-07-01
The measurement of the time-like electromagnetic form factors (TL em FF), G{sub E} and G{sub M}, using reactions of anti pp → l{sup +}l{sup -} (l=e,μ) gives access to the structure of the proton. It will be the first time measurement of TL em FF of the proton accessing the muons in the final state. One advantage of using this channel is that radiative corrections due to final state radiation are suppressed by the heavy mass of the muon. Measuring anti pp → μ{sup +}μ{sup -} will also serve as a consistency check of the TL em FF data from anti pp → e{sup +}e{sup -}. Feasibility studies for the individual extraction of G{sub E} and G{sub M} out of the measured angular distribution are in progress for the muonic channel using the software package PANDARoot. Due to the strong hadronic background, mainly reactions of anti pp → π{sup +}π{sup -}, a very good signal-to-background separation is needed. For the analysis of both signal and background channel different multivariate classification methods are used. The current status of the studies is presented.
An efficient modified Elliptic Curve Digital Signature Algorithm | Kiros ...
African Journals Online (AJOL)
Many digital signatures which are based on Elliptic Curves Cryptography (ECC) have been proposed. Among these digital signatures, the Elliptic Curve Digital Signature Algorithm (ECDSA) is the widely standardized one. However, the verification process of ECDSA is slower than the signature generation process. Hence ...
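The cost profile of ECDSA signing and verification comes from the underlying elliptic-curve point arithmetic, above all scalar multiplication. A toy sketch of that arithmetic over a tiny prime field (a textbook curve, not a standardized one, and not cryptographically secure):

```python
# Toy elliptic-curve arithmetic over a small prime field.  This is NOT
# cryptographically secure; real ECDSA uses standardized curves (P-256,
# secp256k1, ...) and hardened, constant-time implementations.
P_MOD, A, B = 17, 2, 2          # textbook curve y^2 = x^3 + 2x + 2 over F_17

def point_add(p1, p2):
    """Group law on the curve; None represents the point at infinity."""
    if p1 is None:
        return p2
    if p2 is None:
        return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None
    if p1 == p2:                 # tangent (doubling) slope
        s = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD) % P_MOD
    else:                        # chord slope
        s = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD
    x3 = (s * s - x1 - x2) % P_MOD
    return (x3, (s * (x1 - x3) - y1) % P_MOD)

def scalar_mult(k, point):
    """Double-and-add scalar multiplication, the dominant cost in both
    ECDSA signature generation and verification."""
    result, addend = None, point
    while k:
        if k & 1:
            result = point_add(result, addend)
        addend = point_add(addend, addend)
        k >>= 1
    return result
```

Verification is slower than signing largely because it needs two scalar multiplications (plus a modular inversion) where signing needs one, which is what the proposed modifications target.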
Energy Technology Data Exchange (ETDEWEB)
Miura, N.; Soneda, N. [Central Research Inst. of Electric Power Industry (Japan); Hiranuma, N. [Tokyo Electric Power Co. (Japan)
2004-07-01
The Master Curve method to determine fracture toughness in the brittle-to-ductile transition temperature range is provided in the ASTM standard E 1921. In this study, the method was applied to two types of typical Japanese reactor pressure vessel steels. As a result, it was confirmed that valid reference temperatures as well as master curves could be determined based on the ASTM standard. The ability of the statistical size scaling as well as the propriety of the assumption on the statistical distribution of fracture toughness was experimentally validated. The relative position between the master curve and the current K{sub IC} curves was then compared and discussed. (orig.)
International Nuclear Information System (INIS)
Miura, N.; Soneda, N.; Hiranuma, N.
2004-01-01
The Master Curve method to determine fracture toughness in the brittle-to-ductile transition temperature range is provided in the ASTM standard E 1921. In this study, the method was applied to two types of typical Japanese reactor pressure vessel steels. As a result, it was confirmed that valid reference temperatures as well as master curves could be determined based on the ASTM standard. The ability of the statistical size scaling as well as the propriety of the assumption on the statistical distribution of fracture toughness was experimentally validated. The relative position between the master curve and the current K IC curves was then compared and discussed. (orig.)
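The Master Curve fixes the temperature dependence of the median fracture toughness once the reference temperature T0 is known. A sketch assuming the standard median form for 1T-size specimens (T and T0 in °C, K in MPa·√m); the temperature values in the test are illustrative, not the steels measured in the study:

```python
import math

def master_curve_median(T, T0):
    """Median fracture toughness K_Jc (MPa*sqrt(m)) in the transition range,
    in the Master Curve form of ASTM E 1921:
    K_Jc(med) = 30 + 70 * exp[0.019 * (T - T0)], with T and T0 in deg C."""
    return 30.0 + 70.0 * math.exp(0.019 * (T - T0))
```

By construction, the median toughness at T = T0 is 100 MPa·√m, which is how the reference temperature is anchored.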
Whiley, David M; Jacob, Kevin; Nakos, Jennifer; Bletchly, Cheryl; Nimmo, Graeme R; Nissen, Michael D; Sloots, Theo P
2012-06-01
Numerous real-time PCR assays have been described for detection of the influenza A H275Y alteration. However, the performance of these methods can be undermined by sequence variation in the regions flanking the codon of interest. This is a problem encountered more broadly in microbial diagnostics. In this study, we developed a modification of hybridization probe-based melting curve analysis, whereby primers are used to mask proximal mutations in the sequence targets of hybridization probes, so as to limit the potential for sequence variation to interfere with typing. The approach was applied to the H275Y alteration of the influenza A (H1N1) 2009 strain, as well as a Neisseria gonorrhoeae mutation associated with antimicrobial resistance. Assay performances were assessed using influenza A and N. gonorrhoeae strains characterized by DNA sequencing. The modified hybridization probe-based approach proved successful in limiting the effects of proximal mutations, with the results of melting curve analyses being 100% consistent with the results of DNA sequencing for all influenza A and N. gonorrhoeae strains tested. Notably, these included influenza A and N. gonorrhoeae strains exhibiting additional mutations in hybridization probe targets. Of particular interest was that the H275Y assay correctly typed influenza A strains harbouring a T822C nucleotide substitution, previously shown to interfere with H275Y typing methods. Overall our modified hybridization probe-based approach provides a simple means of circumventing problems caused by sequence variation, and offers improved detection of the influenza A H275Y alteration and potentially other resistance mechanisms.
Growth curve models and statistical diagnostics
Pan, Jian-Xin
2002-01-01
Growth-curve models are generalized multivariate analysis-of-variance models. These models are especially useful for investigating growth problems over short time periods in economics, biology, medical research, and epidemiology. This book systematically introduces the theory of growth-curve models (GCMs) with particular emphasis on their multivariate statistical diagnostics, which are based mainly on recent developments made by the authors and their collaborators. The authors provide complete proofs of theorems as well as practical data sets and MATLAB code.
Bezier Curve Modeling for Neutrosophic Data Problem
Directory of Open Access Journals (Sweden)
Ferhat Tas
2017-02-01
Full Text Available The neutrosophic set concept is defined by membership, non-membership and indeterminacy degrees, and provides a representation for problems arising in various fields. In this paper, a geometric model is introduced for the neutrosophic data problem for the first time. The model is based on neutrosophic sets and neutrosophic relations: neutrosophic control points are defined, and the Bezier curves constructed from them yield neutrosophic Bezier curves.
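Classical Bezier curves, which the neutrosophic construction generalizes, are evaluated by repeated linear interpolation of the control points. A minimal sketch of de Casteljau's algorithm for ordinary (crisp) control points; the example points are arbitrary:

```python
def de_casteljau(control_points, t):
    """Evaluate a Bezier curve at parameter t in [0, 1] by repeatedly
    interpolating between consecutive control points (de Casteljau)."""
    pts = [tuple(map(float, p)) for p in control_points]
    while len(pts) > 1:
        # replace each adjacent pair (p, q) by (1-t)*p + t*q, componentwise
        pts = [tuple((1.0 - t) * a + t * b for a, b in zip(p, q))
               for p, q in zip(pts[:-1], pts[1:])]
    return pts[0]
```

In the neutrosophic setting, each control point additionally carries membership, non-membership and indeterminacy degrees, but the interpolation skeleton is the same.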
Calibration curves for biological dosimetry
International Nuclear Information System (INIS)
Guerrero C, C.; Brena V, M. E-mail: cgc@nuclear.inin.mx
2004-01-01
Investigations in different laboratories around the world, including the ININ, have established that certain classes of chromosomal lesions increase as a function of dose and radiation type, and have yielded the calibration curves applied in the technique known as biological dosimetry. This work summarizes the work done in the laboratory, including the calibration curves for 60-Co gamma radiation and 250 kVp X-rays, examples of presumed exposure to ionizing radiation resolved by means of aberration analysis and the corresponding dose estimates obtained from the equations of the respective curves, and, finally, a comparison between the dose estimates for the people affected by the Ciudad Juarez accident carried out by the Oak Ridge group (USA) and those obtained in this laboratory. (Author)
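A typical biological-dosimetry workflow inverts a linear-quadratic calibration curve, Y = c + αD + βD², to estimate the absorbed dose from an observed aberration yield. A minimal sketch under that assumption; the coefficient values are illustrative placeholders, not the laboratory's fitted values:

```python
import math

def dose_from_yield(y, c=0.001, alpha=0.02, beta=0.06):
    """Estimate absorbed dose D (Gy) from an aberration yield y (per cell) by
    inverting the linear-quadratic calibration curve y = c + alpha*D + beta*D^2.
    The default coefficients are illustrative placeholders only."""
    # positive root of beta*D^2 + alpha*D + (c - y) = 0
    disc = alpha * alpha - 4.0 * beta * (c - y)
    return (-alpha + math.sqrt(disc)) / (2.0 * beta)
```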
Rational points on elliptic curves
Silverman, Joseph H
2015-01-01
The theory of elliptic curves involves a pleasing blend of algebra, geometry, analysis, and number theory. This book stresses this interplay as it develops the basic theory, thereby providing an opportunity for advanced undergraduates to appreciate the unity of modern mathematics. At the same time, every effort has been made to use only methods and results commonly included in the undergraduate curriculum. This accessibility, the informal writing style, and a wealth of exercises make Rational Points on Elliptic Curves an ideal introduction for students at all levels who are interested in learning about Diophantine equations and arithmetic geometry. Most concretely, an elliptic curve is the set of zeroes of a cubic polynomial in two variables. If the polynomial has rational coefficients, then one can ask for a description of those zeroes whose coordinates are either integers or rational numbers. It is this number theoretic question that is the main subject of this book. Topics covered include the geometry and ...
The New Keynesian Phillips Curve
DEFF Research Database (Denmark)
Ólafsson, Tjörvi
This paper provides a survey of the recent literature on the new Keynesian Phillips curve: the controversies surrounding its microfoundation and estimation, the approaches that have been tried to improve its empirical fit, and the challenges it faces in adapting to the open-economy framework. The new Keynesian Phillips curve has been severely criticized for poor empirical dynamics. Suggested improvements involve making some adjustments to the standard sticky price framework, e.g. introducing backwardness and real rigidities, or abandoning the sticky price model and relying on models of inattentiveness, learning or state-dependent pricing. The introduction of open-economy factors into the new Keynesian Phillips curve complicates matters further, as it must capture the nexus between price setting, inflation and the exchange rate. This is nevertheless a crucial feature for any model to be used for inflation ...
TELECOMMUNICATIONS INFRASTRUCTURE AND GDP /JIPP CURVE/
Directory of Open Access Journals (Sweden)
Mariana Kaneva
2016-07-01
Full Text Available The relationship between telecommunications infrastructure and economic activity is under discussion in many scientific papers. Most authors use the Jipp curve for research and analysis. Many doubts about the correctness of the Jipp curve arise when econometric models are applied. The aim of this study is a review of the Jipp curve, refining the possibility of its application in modern conditions. The methodology used in the study is based on dynamic econometric models, including tests for nonstationarity and tests for causality. The focus of this study is directed to methodological problems in measuring the local density of different types of telecommunication networks. This study offers a specific methodology for assessing the Jipp law, through a VAR approach and Granger causality tests. It is proved that mechanical substitution of momentary aggregated variables (such as the number of subscribers of a telecommunication network at the end of the year) and periodically aggregated variables (such as GDP per capita) in the Jipp curve is methodologically wrong. Researchers have to reconsider the relationship set in the Jipp curve by including additional variables that characterize the telecommunications sector and the economic activity in a particular country within a specified time period. GDP per capita should not be regarded as the single factor driving the local density of telecommunications infrastructure. New econometric models studying the relationship between investments in telecommunications infrastructure and economic development need not be only linear regression models, but may also be other econometric models. New econometric models should be proposed after testing and validation against sound economic theory and econometric methodology.
Electro-Mechanical Resonance Curves
Greenslade, Thomas B.
2018-03-01
Recently I have been investigating the frequency response of galvanometers. These are direct-current devices used to measure small currents. By using a low-frequency function generator to supply the alternating-current signal and a stopwatch smartphone app to measure the period, I was able to take data to allow a resonance curve to be drawn. This is the sort of project that should provide a fascinating research experience for the introductory physics student. In this article I will discuss the galvanometers that I used in this work, and will show a resonance curve for one of them.
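The resonance curve of a galvanometer can be modeled as the steady-state amplitude of a sinusoidally driven, damped oscillator. A sketch under that assumption; the symbols f0, omega0 and gamma are generic model parameters, not values measured in the article:

```python
import math

def resonance_amplitude(omega, omega0, gamma, f0=1.0):
    """Steady-state amplitude of a sinusoidally driven, damped oscillator
    (a simple model for a galvanometer coil) at drive angular frequency omega:
    A(omega) = f0 / sqrt((omega0^2 - omega^2)^2 + (gamma*omega)^2)."""
    return f0 / math.sqrt((omega0**2 - omega**2)**2 + (gamma * omega)**2)
```

Plotting A against omega for small gamma reproduces the familiar peaked resonance curve, with the peak near omega0 sharpening as the damping decreases.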
Shock detachment from curved wedges
Mölder, S.
2017-09-01
Curved shock theory is used to show that the flow behind attached shocks on doubly curved wedges can have either positive or negative post-shock pressure gradients depending on the freestream Mach number, the wedge angle and the two wedge curvatures. Given enough wedge length, the flow near the leading edge can choke to force the shock to detach from the wedge. This local choking can preempt both the maximum deflection and the sonic criteria for shock detachment. Analytical predictions for detachment by local choking are supported by CFD results.
Afkhami, Abbas; Khajavi, Farzad; Khanmohammadi, Hamid
2009-08-11
The oxidation of the recently synthesized Schiff base 3,6-bis((2-aminoethyl-5-Br-salicyliden)thio)pyridazine (PABST) with hydrogen peroxide was investigated using spectrophotometric studies. The reaction rate order and observed rate constant of the oxidation reaction were obtained in a mixture of N,N-dimethylformamide (DMF):water (30:70, v/v) at pH 10 using the multivariate curve resolution-alternating least squares (MCR-ALS) method and rank annihilation factor analysis (RAFA). The parameters affecting the oxidation rate constant, such as the percentage of DMF, the effect of transition metals like Cu(2+), Zn(2+), Mn(2+) and Hg(2+), and the presence of surfactants, were investigated. The keto-enol equilibrium in DMF:water (30:70, v/v) solution at pH 7.6 was also investigated in the presence of surfactants. At concentrations above the critical micelle concentration (cmc) of the cationic surfactant cetyltrimethylammonium bromide (CTAB), the keto form was the predominant species, while at concentrations above the cmc of the anionic surfactant sodium dodecyl sulfate (SDS), the enol form was the predominant species. The kinetic reaction order and the rate constant of tautomerization in micellar medium were obtained using MCR-ALS and RAFA. The results obtained by the two methods were in good agreement with each other. The effect of different volume percentages of DMF on the rate constant of tautomerization was also investigated. The neutral surfactant (Triton X-100) had no effect on the tautomerization equilibrium.
Geometric nonlinear dynamic analysis of curved beams using curved beam element
Pan, Ke-Qi; Liu, Jin-Yang
2011-12-01
Instead of using the previous straight beam element to approximate the curved beam, in this paper a curvilinear coordinate is employed to describe the deformations, and a new curved beam element is proposed to model the curved beam. Based on the exact nonlinear strain-displacement relation, the virtual work principle is used to derive the dynamic equations for a rotating curved beam, with the effects of axial extensibility, shear deformation and rotary inertia taken into account. The constant matrices are solved numerically utilizing the Gauss quadrature integration method. Newmark and Newton-Raphson iteration methods are adopted to solve the differential equations of the rigid-flexible coupling system. The present results are compared with those obtained by commercial programs to validate the present finite element method. In order to further illustrate the convergence and efficiency characteristics of the present modeling and computation formulation, a comparison of the results of the present formulation with those of the ADAMS software is made. Furthermore, the present results obtained from the linear formulation are compared with those from the nonlinear formulation, and the special dynamic characteristics of the curved beam are identified by comparison with those of the straight beam.
Archaeomagnetic SV curve for Belgium
Ech-chakrouni, Souad; Hus, Jozef
2017-04-01
Archaeomagnetic secular variation curves have been established for different countries in Europe, especially where archaeological sites more or less uniformly distributed in time are available. The disadvantage in that case is that data have to be relocated to a single reference site. The proximity of the reference locality Paris to Belgium means that we used the French archaeomagnetic SV curve for the last three millennia up to the present for archaeomagnetic dating of undated baked structures. In total, 85 baked structures have been examined, unearthed in 24 archaeological sites on the territory of Belgium. The ChRM of each sample was obtained by principal component analysis for at least three demagnetisation steps (Kirschvink 1980). Except for some outliers, the ChRM directions are very coherent, with a high confidence factor (α95), making them suitable for archaeomagnetism. At present, only six baked structures have been dated radiometrically and may be considered as reference data for a limited area of about 30500 km2 in Western Europe. The ultimate aim is to construct an archaeomagnetic SV curve for Belgium with Uccle as reference locality, where the first measurement of the geomagnetic field was made in 1895. This curve would include all the available reference data in a radius of about 500 km around Uccle. Keywords: secular variation, archaeomagnetic dating, Belgium.
Indian Academy of Sciences (India)
The cause-effect relationship for a wide variety of biological processes from molecular to ecosystem levels can be described by a curvilinear function called the rectangular hyperbola. Although a simple algebraic equation adequately describes this curve, biological models have generated different equations incorporating ...
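The rectangular hyperbola referred to above has the saturating form y = v_max·x/(k + x). A minimal sketch; the parameter names follow the Michaelis-Menten convention, and the equation variants the excerpt alludes to are not reproduced here:

```python
def rectangular_hyperbola(x, v_max, k):
    """Rectangular hyperbola y = v_max * x / (k + x): y rises almost linearly
    for x << k and saturates toward the asymptote v_max for x >> k (the shape
    used, e.g., in Michaelis-Menten kinetics and light-response curves)."""
    return v_max * x / (k + x)
```

A characteristic property is that y reaches half of v_max exactly at x = k, which is how the half-saturation constant is usually read off experimental data.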
Survival curves for irradiated cells
International Nuclear Information System (INIS)
Gibson, D.K.
1975-01-01
The subject of the lecture is the probability of survival of biological cells which have been subjected to ionising radiation. The basic mathematical theories of cell survival as a function of radiation dose are developed. A brief comparison with observed survival curves is made. (author)
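A common mathematical form for such cell-survival curves is the linear-quadratic model, S(D) = exp(−αD − βD²). A sketch under that assumption; the α and β values are illustrative order-of-magnitude numbers, not values from the lecture:

```python
import math

def surviving_fraction(dose, alpha=0.3, beta=0.03):
    """Linear-quadratic model of cell survival after an acute dose D (Gy):
    S(D) = exp(-alpha*D - beta*D**2).  The default alpha and beta are
    illustrative placeholders only."""
    return math.exp(-alpha * dose - beta * dose * dose)
```

On a semi-log plot the linear term gives the initial slope and the quadratic term the downward curvature ("shoulder") seen in many observed survival curves.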
2013-01-01
This software can be used to assist with the assessment of margin of safety for a horizontal curve. It is intended for use by engineers and technicians responsible for safety analysis or management of rural highway pavement or traffic control devices...
Ultrasonic Fetal Cephalometry: Percentiles Curve
Flamme, P.
1972-01-01
Measurements by ultrasound of the biparietal diameter of the fetal head during pregnancy are a reliable guide to fetal growth. As a ready means of comparison with the normal we constructed from 4,170 measurements in 1,394 cases a curve showing the percentiles distribution of biparietal diameters for each week of gestation. PMID:5070162
Interpolation and Polynomial Curve Fitting
Yang, Yajun; Gordon, Sheldon P.
2014-01-01
Two points determine a line. Three noncollinear points determine a quadratic function. Four points that do not lie on a lower-degree polynomial curve determine a cubic function. In general, n + 1 points uniquely determine a polynomial of degree n, presuming that they do not fall onto a polynomial of lower degree. The process of finding such a…
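The uniqueness property described above is constructive: the degree-n polynomial through n + 1 points can be evaluated directly in Lagrange form, without solving for coefficients. A minimal sketch; the sample points are arbitrary:

```python
def lagrange_interpolate(points, x):
    """Evaluate at x the unique polynomial of degree <= n passing through the
    n+1 given (xi, yi) points with distinct xi, using the Lagrange form."""
    total = 0.0
    for i, (xi, yi) in enumerate(points):
        term = float(yi)
        for j, (xj, _) in enumerate(points):
            if j != i:
                # basis polynomial L_i(x): 1 at xi, 0 at every other xj
                term *= (x - xj) / (xi - xj)
        total += term
    return total
```

If the points happen to lie on a lower-degree polynomial, the formula still reproduces that polynomial; three points of y = x² return x² exactly.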
Energy Technology Data Exchange (ETDEWEB)
Yi, Boram; Kang, Doo Kyoung; Kim, Tae Hee [Ajou University School of Medicine, Department of Radiology, Suwon, Gyeonggi-do (Korea, Republic of); Yoon, Dukyong [Ajou University School of Medicine, Department of Biomedical Informatics, Suwon (Korea, Republic of); Jung, Yong Sik; Kim, Ku Sang [Ajou University School of Medicine, Department of Surgery, Suwon (Korea, Republic of); Yim, Hyunee [Ajou University School of Medicine, Department of Pathology, Suwon (Korea, Republic of)
2014-05-15
To find out any correlation between dynamic contrast-enhanced (DCE) model-based parameters and model-free parameters, and evaluate correlations between perfusion parameters with histologic prognostic factors. Model-based parameters (Ktrans, Kep and Ve) of 102 invasive ductal carcinomas were obtained using DCE-MRI and post-processing software. Correlations between model-based and model-free parameters and between perfusion parameters and histologic prognostic factors were analysed. Mean Kep was significantly higher in cancers showing initial rapid enhancement (P = 0.002) and a delayed washout pattern (P = 0.001). Ve was significantly lower in cancers showing a delayed washout pattern (P = 0.015). Kep significantly correlated with time to peak enhancement (TTP) (ρ = -0.33, P < 0.001) and washout slope (ρ = 0.39, P = 0.002). Ve was significantly correlated with TTP (ρ = 0.33, P = 0.002). Mean Kep was higher in tumours with high nuclear grade (P = 0.017). Mean Ve was lower in tumours with high histologic grade (P = 0.005) and in tumours with negative oestrogen receptor status (P = 0.047). TTP was shorter in tumours with negative oestrogen receptor status (P = 0.037). We could acquire general information about the tumour vascular physiology, interstitial space volume and pathologic prognostic factors by analyzing time-signal intensity curve without a complicated acquisition process for the model-based parameters. (orig.)
Cell Proliferation on Planar and Curved Substrates
Gaines, Michelle; Chang, Ya Wen; Cruz, Ricardo; Fragkopoulos, Alexandros; Garcia, Andres; Fernandez-Nieves, Alberto
Aberrant epithelial collective cell growth is one of the major challenges to be addressed in order to treat diseases such as cancer and organ fibrosis. The conditions of the extracellular microenvironment, the properties of the cells' cytoskeleton, and the interfacial properties of the substratum (the surface in contact with epithelial cells) have a significant influence on epithelial cell proliferation and migration. This work focuses on understanding the impact the substratum curvature has on cell behavior. We focus on cell proliferation first and study MDCK cells on both planar and curved hydrogel substrates. The curved hydrogels are based on polyacrylamide and have toroidal shape, with tube radius 200 um and an aspect ratio in the range between 2 and 9. Proliferation is measured using the Click-iT EdU assay (Invitrogen), which labels cells that are synthesizing DNA. Funding source: Children's Healthcare of Atlanta.
An interesting elliptic surface over an elliptic curve
Schütt, Matthias; Shioda, Tetsuji
2007-01-01
We study the elliptic modular surface attached to the commutator subgroup of the modular group. This has an elliptic curve as base and only one singular fibre. We employ an algebraic approach and then consider some arithmetic questions.
Ground reaction curve based upon block theory
International Nuclear Information System (INIS)
Yow, J.L. Jr.; Goodman, R.E.
1985-09-01
Discontinuities in a rock mass can intersect an excavation surface to form discrete blocks (keyblocks) which can be unstable. Once a potentially unstable block is identified, the forces affecting it can be calculated to assess its stability. The normal and shear stresses on each block face before displacement are calculated using elastic theory and are modified in a nonlinear way by discontinuity deformations as the keyblock displaces. The stresses are summed into resultant forces to evaluate block stability. Since the resultant forces change with displacement, successive increments of block movement are examined to see whether the block ultimately becomes stable or fails. Two-dimensional (2D) and three-dimensional (3D) analytic models for the stability of simple pyramidal keyblocks were evaluated. Calculated stability is greater for 3D analyses than for 2D analyses. Calculated keyblock stability increases with larger in situ stress magnitudes, larger lateral stress ratios, and larger shear strengths. Discontinuity stiffness controls block displacement more strongly than it does stability itself. Large keyblocks are less stable than small ones, and stability increases as blocks become more slender.
Parameter Deduction and Accuracy Analysis of Track Beam Curves in Straddle-type Monorail Systems
Directory of Open Access Journals (Sweden)
Xiaobo Zhao
2015-12-01
Full Text Available The accuracy of the bottom curve of a PC track beam is strongly related to the production quality of the entire beam. Many factors may affect the parameters of the bottom curve, such as the superelevation of the curve and the deformation of the PC track beam. At present, no effective method has been developed to determine the bottom curve of a PC track beam; therefore, a new technique is presented in this paper to deduce the parameters of such a curve and to control the accuracy of the computation results. First, the domain of the bottom curve of a PC track beam is assumed to be a spindle plane. Then, the corresponding supposed top curve domain is determined based on a geometrical relationship that is the opposite of that identified by the conventional method. Second, several optimal points are selected from the supposed top curve domain according to the dichotomy algorithm, and the supposed top curve is generated by connecting these points. Finally, a rigorous criterion based on the fractal dimension is established to assess the accuracy of the supposed top curve deduced in the previous step. If this supposed curve coincides completely with the known top curve, then the assumed bottom curve corresponding to it is considered to be the real bottom curve. This technique of determining the bottom curve of a PC track beam is thus shown to be efficient and accurate.
A catalog of special plane curves
Lawrence, J Dennis
2014-01-01
Among the largest, finest collections available-illustrated not only once for each curve, but also for various values of any parameters present. Covers general properties of curves and types of derived curves. Curves illustrated by a CalComp digital incremental plotter. 12 illustrations.
Computation of undulator tuning curves
International Nuclear Information System (INIS)
Dejus, Roger J.
1997-01-01
Computer codes for fast computation of on-axis brilliance tuning curves and flux tuning curves have been developed. They are valid for an ideal device (regular planar device or a helical device) using the Bessel function formalism. The effects of the particle beam emittance and the beam energy spread on the spectrum are taken into account. The applicability of the codes and the importance of magnetic field errors of real insertion devices are addressed. The validity of the codes has been experimentally verified at the APS and observed discrepancies are in agreement with predicted reduction of intensities due to magnetic field errors. The codes are distributed as part of the graphical user interface XOP (X-ray OPtics utilities), which simplifies execution and viewing of the results
Invariance for Single Curved Manifold
Castro, Pedro Machado Manhaes de
2012-08-01
Recently, it has been shown that, for the Lambert illumination model, only scenes composed of developable objects with a very particular albedo distribution produce a (2D) image with isolines that are (almost) invariant to changes in light direction. In this work, we provide and investigate a more general framework, and we show that, in general, the requirement for such invariances is quite strong and is related to the differential geometry of the objects. More precisely, it is proved that single curved manifolds, i.e., manifolds such that at each point there is at most one principal curvature direction, produce invariant isosurfaces for a certain relevant family of energy functions. In the three-dimensional case, the associated energy function corresponds to the classical Lambert illumination model with albedo. This result is also extended to finite-dimensional scenes composed of single curved objects. © 2012 IEEE.
Incorporating Experience Curves in Appliance Standards Analysis
Energy Technology Data Exchange (ETDEWEB)
Garbesi, Karina; Chan, Peter; Greenblatt, Jeffery; Kantner, Colleen; Lekov, Alex; Meyers, Stephen; Rosenquist, Gregory; Buskirk, Robert Van; Yang, Hung-Chia; Desroches, Louis-Benoit
2011-10-31
The technical analyses in support of U.S. energy conservation standards for residential appliances and commercial equipment have typically assumed that manufacturing costs and retail prices remain constant during the projected 30-year analysis period. There is, however, considerable evidence that this assumption does not reflect real market prices. Costs and prices generally fall in relation to cumulative production, a phenomenon known as experience and modeled by a fairly robust empirical experience curve. Using price data from the Bureau of Labor Statistics, and shipment data obtained as part of the standards analysis process, we present U.S. experience curves for room air conditioners, clothes dryers, central air conditioners, furnaces, and refrigerators and freezers. These allow us to develop more representative appliance price projections than the assumption-based approach of constant prices. These experience curves were incorporated into recent energy conservation standards for these products. The impact on the national modeling can be significant, often increasing the net present value of potential standard levels in the analysis. In some cases a previously cost-negative potential standard level demonstrates a benefit when incorporating experience. These results imply that past energy conservation standards analyses may have undervalued the economic benefits of potential standard levels.
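The empirical experience curve referred to above is commonly parameterized so that each doubling of cumulative production reduces price by a fixed fraction. A sketch of that parameterization; the numbers in the test are illustrative, not the report's fitted values for any appliance:

```python
import math

def experience_curve_price(q, p0, q0, learning_rate):
    """Experience-curve price projection: starting from price p0 at cumulative
    production q0, each doubling of cumulative production cuts the price by
    the fraction `learning_rate` (e.g. 0.2 = a 20% drop per doubling):
    P(q) = p0 * (q / q0) ** (-b), with b = -log2(1 - learning_rate)."""
    b = -math.log2(1.0 - learning_rate)
    return p0 * (q / q0) ** (-b)
```

Replacing a constant-price assumption with this declining projection is what raises the net present value of candidate standard levels in the national modeling.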
A curved resonant flexoelectric actuator
Zhang, Shuwen; Liu, Kaiyuan; Xu, Minglong; Shen, Shengping
2017-08-01
Flexoelectricity is an electro-mechanical coupling effect that exists in all dielectrics and has the potential to replace piezoelectric actuation on the microscale. In this letter, a curved flexoelectric actuator with non-polarized polyvinylidene fluoride is presented and shown to exhibit good electro-mechanical properties. This provides experimental support for a body of theoretical research into converse flexoelectricity in polymeric materials. In addition, this work demonstrates the feasibility of lead-free microscale actuation without piezoelectricity.
Active Particles on Curved Surfaces
Fily, Yaouen; Baskaran, Aparna; Hagan, Michael F.
2016-01-01
Recent studies have highlighted the sensitivity of active matter to boundaries and their geometries. Here we develop a general theory for the dynamics and statistics of active particles on curved surfaces and illustrate it on two examples. We first show that active particles moving on a surface with no ability to probe its curvature only exhibit steady-state inhomogeneities in the presence of orientational order. We then consider a strongly confined 3D ideal active gas and compute its steady-...
Growth curves for Laron syndrome.
Laron, Z; Lilos, P; Klinger, B
1993-06-01
Growth curves for children with Laron syndrome were constructed on the basis of repeated measurements made throughout infancy, childhood, and puberty in 24 (10 boys, 14 girls) of the 41 patients with this syndrome investigated in our clinic. Growth retardation was already noted at birth, the birth length ranging from 42 to 46 cm in the 12/20 available measurements. The postnatal growth curves deviated sharply from the normal from infancy on. Neither sex showed a clear pubertal spurt. Girls completed their growth between the ages of 16 and 19 years, reaching a final mean (SD) height of 119 (8.5) cm, whereas the boys continued growing beyond the age of 20 years, achieving a final height of 124 (8.5) cm. At all ages the upper to lower body segment ratio was more than 2 SD above the normal mean. These growth curves constitute a model not only for primary, hereditary insulin-like growth factor-I (IGF-I) deficiency (Laron syndrome) but also for untreated secondary IGF-I deficiencies such as growth hormone gene deletion and idiopathic congenital isolated growth hormone deficiency. They should also be useful in the follow-up of children with Laron syndrome treated with biosynthetic recombinant IGF-I.
Kruiver, Pauline P.; Dekkers, M.J.; Heslop, David
2001-01-01
A new method of analysing isothermal remanent magnetisation (IRM) acquisition curves based on cumulative log Gaussian analysis [Robertson and France, Phys. Earth Planet. Inter. 82 (1994) 223-234] is proposed. It is based on the curve fitting of the IRM acquisition curve versus the logarithm of the
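A single cumulative log-Gaussian component of the kind this analysis fits can be sketched as follows. The parameter names (SIRM, half-saturation field, dispersion) follow common usage in the IRM literature, and the two-component sample below is invented for illustration:

```python
import math

def irm_component(field_mT, sirm, log_b_half, dp):
    """Cumulative log-Gaussian IRM acquisition for one coercivity component:
    sirm = saturation remanence, log_b_half = log10 of the half-saturation field,
    dp = dispersion of the log10-coercivity distribution (names are assumptions)."""
    z = (math.log10(field_mT) - log_b_half) / (dp * math.sqrt(2.0))
    return sirm * 0.5 * (1.0 + math.erf(z))

# invented two-component sample: a soft and a hard coercivity fraction
fields = [10 ** (k / 10) for k in range(5, 35)]  # ~3 mT to ~2.5 T
curve = [irm_component(B, 1.0, math.log10(60), 0.30)
         + irm_component(B, 0.4, math.log10(600), 0.35) for B in fields]
half = irm_component(60.0, 1.0, math.log10(60), 0.30)
print(round(half, 3))  # each component reaches half its SIRM at its half-saturation field
```

Fitting then amounts to adjusting the three parameters of each component so that the summed components match a measured acquisition curve.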
Dual Smarandache Curves and Smarandache Ruled Surfaces
Tanju KAHRAMAN; Mehmet ÖNDER; H. Hüseyin UGURLU
2013-01-01
In this paper, by considering dual geodesic trihedron (dual Darboux frame) we define dual Smarandache curves lying fully on dual unit sphere S^2 and corresponding to ruled surfaces. We obtain the relationships between the elements of curvature of dual spherical curve (ruled surface) x(s) and its dual Smarandache curve (Smarandache ruled surface) x1(s) and we give an example for dual Smarandache curves of a dual spherical curve.
Parametrizations of elliptic curves by Shimura curves and by classical modular curves.
Ribet, K A; Takahashi, S
1997-10-14
Fix an isogeny class A of semistable elliptic curves over Q. The elements of A have a common conductor N, which is a square-free positive integer. Let D be a divisor of N which is the product of an even number of primes, i.e., the discriminant of an indefinite quaternion algebra over Q. To D we associate a certain Shimura curve X_0^D(N/D), whose Jacobian is isogenous to an abelian subvariety of J_0(N). There is a unique curve A in the class for which one has a nonconstant map pi_D : X_0^D(N/D) --> A whose pullback A --> Pic^0(X_0^D(N/D)) is injective. The degree of pi_D is an integer delta_D which depends only on D (and the fixed isogeny class A). We investigate the behavior of delta_D as D varies.
An Autocorrelation Term Method for Curve Fitting
Houston, Louis M.
2013-01-01
The least-squares method is the most popular method for fitting a polynomial curve to data. It is based on minimizing the total squared error between a polynomial model and the data. In this paper we develop a different approach that exploits the autocorrelation function. In particular, we use the nonzero lag autocorrelation terms to produce a system of quadratic equations that can be solved together with a linear equation derived from summing the data. There is a maximum of solutions when th...
Curve fitting for RHB Islamic Bank annual net profit
Nadarajan, Dineswary; Noor, Noor Fadiya Mohd
2015-05-01
The RHB Islamic Bank net profit data are obtained for 2004 to 2012. Curve fitting is done by assuming the data are exact or experimental due to the smoothing process. Higher-order Lagrange polynomials and cubic splines with a curve fitting procedure are constructed using Maple software. A normality test is performed to check data adequacy. Regression analysis with curve estimation is conducted in the SPSS environment. All eleven models are found to be acceptable at the 10% significance level of ANOVA. Residual error and absolute relative true error are calculated and compared. The optimal model based on the minimum average error is proposed.
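Lagrange interpolation of the kind used in the study can be sketched in Python rather than Maple; the profit figures below are invented for illustration, not the actual RHB data. The interpolant reproduces the data points exactly:

```python
def lagrange_interp(xs, ys, x):
    """Evaluate the Lagrange interpolating polynomial through (xs, ys) at x."""
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        term = yi
        for j, xj in enumerate(xs):
            if j != i:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

# hypothetical annual net-profit figures (illustrative only)
years = [2004, 2006, 2008, 2010, 2012]
profit = [120.0, 150.0, 135.0, 180.0, 210.0]
print(lagrange_interp(years, profit, 2008))  # reproduces the data point: 135.0
```

Between the nodes, high-degree Lagrange polynomials can oscillate, which is one reason the study compares them against cubic splines.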
A note on families of fragility curves
International Nuclear Information System (INIS)
Kaplan, S.; Bier, V.M.; Bley, D.C.
1989-01-01
In the quantitative assessment of seismic risk, uncertainty in the fragility of a structural component is usually expressed by putting forth a family of fragility curves, with probability serving as the parameter of the family. Commonly, a lognormal shape is used both for the individual curves and for the expression of uncertainty over the family. A so-called composite single curve can also be drawn and used for purposes of approximation. This composite curve is often regarded as equivalent to the mean curve of the family. The equality seems intuitively reasonable but, according to the authors, has never been proven. The present paper proves this equivalence hypothesis mathematically. Moreover, the authors show that this equivalence hypothesis between fragility curves is itself equivalent to an identity property of the standard normal probability curve. Thus, in the course of proving the fragility curve hypothesis, the authors have also proved a rather obscure, but interesting and perhaps previously unrecognized, property of the standard normal curve.
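The equivalence can be checked numerically: averaging a lognormal fragility family over lognormal uncertainty in the median capacity reproduces a single lognormal curve whose log-standard deviation combines the two. This is a sketch with arbitrarily chosen dispersion values, not the paper's proof:

```python
import math

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def mean_fragility(a, am_median, beta_r, beta_u, n=4000):
    """Average the lognormal fragility family over lognormal uncertainty in the
    median capacity (midpoint rule over the standard-normal uncertainty variable)."""
    total = 0.0
    for k in range(n):
        u = -6.0 + 12.0 * (k + 0.5) / n
        w = math.exp(-0.5 * u * u) / math.sqrt(2.0 * math.pi) * (12.0 / n)
        am = am_median * math.exp(beta_u * u)
        total += w * phi(math.log(a / am) / beta_r)
    return total

beta_r, beta_u = 0.3, 0.4   # arbitrary randomness / uncertainty dispersions
a = 0.8                     # ground-motion level, median capacity 1.0
composite = phi(math.log(a / 1.0) / math.sqrt(beta_r ** 2 + beta_u ** 2))
print(abs(mean_fragility(a, 1.0, beta_r, beta_u) - composite) < 1e-3)  # → True
```

The mean curve and the composite curve (built with the combined dispersion) agree to within the quadrature error, which is the paper's equivalence hypothesis in numerical form.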
Differential geometry curves, surfaces, manifolds
Kühnel, Wolfgang
2015-01-01
This carefully written book is an introduction to the beautiful ideas and results of differential geometry. The first half covers the geometry of curves and surfaces, which provide much of the motivation and intuition for the general theory. The second part studies the geometry of general manifolds, with particular emphasis on connections and curvature. The text is illustrated with many figures and examples. The prerequisites are undergraduate analysis and linear algebra. This new edition provides many advancements, including more figures and exercises, and-as a new feature-a good number of so
Principal Curves on Riemannian Manifolds
DEFF Research Database (Denmark)
Hauberg, Søren
2015-01-01
Euclidean statistics are often generalized to Riemannian manifolds by replacing straight-line interpolations with geodesic ones. While these Riemannian models are familiar-looking, they are restricted by the inflexibility of geodesics, and they rely on constructions which are optimal only in Euclidean domains. We consider extensions of Principal Component Analysis (PCA) to Riemannian manifolds. Classic Riemannian approaches seek a geodesic curve passing through the mean that optimizes a criterion of interest. The requirements that the solution both is geodesic and must pass through the mean tend...
Lightlike contractions in curved spacetime
International Nuclear Information System (INIS)
Aichelburg, P.C.; Embacher, F.
1984-01-01
The technique of lightlike contractions in flat and curved space is described. The method consists in boosting a classical field configuration to the velocity of light by an appropriate generalized Lorentz transformation. Within this framework the gravitational field of a massless neutral particle is a meaningful concept. For electrically charged particles, however, the field equations seem to prevent an analogous procedure. We thus conjecture that general relativity forbids the existence of charged point particles moving with the velocity of light. Further examples for lightlike contractions of a self-dual electromagnetic field and of a linearized Rarita-Schwinger (spin-3/2) field are given. (Author)
A NURBS approximation of experimental stress-strain curves
International Nuclear Information System (INIS)
Fedorov, Timofey V.; Morrev, Pavel G.
2016-01-01
A compact universal representation of monotonic experimental stress-strain curves of metals and alloys is proposed. It is based on nonuniform rational Bezier splines (NURBS) of second order and may be used in a computer library of materials. Only six parameters per curve are needed; this is equivalent to specifying only three points in the stress-strain plane. NURBS functions of higher order prove to be superfluous. Explicit expressions for both yield stress and hardening modulus are given. Two types of curves are considered: over a finite interval of strain and over an infinite one. A broad class of metals and alloys of various chemical compositions, subjected to various types of preliminary thermo-mechanical working, is selected from a comprehensive database in order to test the proposed methodology. The results demonstrate excellent correspondence with the experimental data. Keywords: work hardening, stress-strain curve, spline approximation, nonuniform rational B-spline, NURBS.
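A second-order rational Bezier segment, the building block of this representation, can be sketched as follows. The control points and weight below are invented for illustration, not taken from the paper's six-parameter fits:

```python
def rational_quadratic_bezier(p0, p1, p2, w, t):
    """Point at parameter t on a second-order rational Bezier segment with
    end points p0, p2, control point p1, and middle weight w."""
    b0, b1, b2 = (1 - t) ** 2, 2 * t * (1 - t), t ** 2
    denom = b0 + w * b1 + b2
    return tuple((b0 * u + w * b1 * v + b2 * s) / denom
                 for u, v, s in zip(p0, p1, p2))

# hypothetical stress-strain knee: elastic rise bending into a hardening branch
p_start, p_ctrl, p_end = (0.0, 0.0), (0.002, 400.0), (0.05, 500.0)
x, y = rational_quadratic_bezier(p_start, p_ctrl, p_end, 2.0, 0.5)
print(round(x, 4), round(y, 1))
```

The weight w controls how tightly the curve hugs the control point, which is how a single extra scalar captures the sharpness of the yield knee.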
Projection of curves on B-spline surfaces using quadratic reparameterization
Yang, Yijun
2010-09-01
Curves on surfaces play an important role in computer aided geometric design. In this paper, we present a hyperbola approximation method based on the quadratic reparameterization of Bézier surfaces, which generates reasonable low degree curves lying completely on the surfaces by using iso-parameter curves of the reparameterized surfaces. The Hausdorff distance between the projected curve and the original curve is controlled under the user-specified distance tolerance. The projected curve is T-G^1 continuous, where T is the user-specified angle tolerance. Examples are given to show the performance of our algorithm. © 2010 Elsevier Inc. All rights reserved.
Differential geometry and topology of curves
Animov, Yu
2001-01-01
Differential geometry is an actively developing area of modern mathematics. This volume presents a classical approach to the general topics of the geometry of curves, including the theory of curves in n-dimensional Euclidean space. The author investigates problems for special classes of curves and gives the working method used to obtain the conditions for closed polygonal curves. The proof of the Bakel-Werner theorem in conditions of boundedness for curves with periodic curvature and torsion is also presented. This volume also highlights the contributions made by great geometers, past and present, to differential geometry and the topology of curves.
Flow characteristics of curved ducts
Directory of Open Access Journals (Sweden)
Rudolf P.
2007-10-01
Curved channels are very often present in real hydraulic systems, e.g. curved diffusers of hydraulic turbines, S-shaped bulb turbines, fittings, etc. Curvature brings a change of velocity profile, generation of vortices and production of hydraulic losses. Flow simulations using CFD techniques were performed to understand these phenomena. Cases ranging from a single elbow to coupled elbows in U, S and spatial right-angle configurations with circular cross-section were modeled for Re = 60000. Spatial development of the flow was studied, and it was deduced that minor losses are connected with the transformation of pressure energy into kinetic energy and vice versa. This transformation is a dissipative process and is reflected in the amount of energy irreversibly lost. The smallest loss coefficient is associated with flow in U-shaped elbows, the largest with flow in S-shaped elbows. Finally, the extent of the flow domain influenced by the presence of curvature was examined. This is important for proper placement of manometers and flowmeters during experimental tests. Simulations were verified against experimental results presented in the literature.
Classical optics and curved spaces
International Nuclear Information System (INIS)
Bailyn, M.; Ragusa, S.
1976-01-01
In the eikonal approximation of classical optics, the unit polarization 3-vector of light satisfies an equation that depends only on the index of refraction, n. It is known that if the original 3-space line element is dσ², then this polarization direction propagates parallelly in the fictitious space n²dσ². Since the equation depends only on n, it is possible to invent a fictitious curved 4-space in which the light follows a null geodesic and the polarization 3-vector behaves as the 'shadow' of a parallelly propagated 4-vector. The inverse, namely the reduction of Maxwell's equations on a curved (dielectric-free) space to a classical space with dielectric constant n = (−g₀₀)^(−1/2), is well known, but in the latter the dielectric constant ε and permeability μ must also equal (−g₀₀)^(−1/2). The rotation of polarization as light bends around the sun is calculated by utilizing the reduction to the classical space. This (non-)rotation may then be interpreted as parallel transport in the 3-space n²dσ².
Placement Design of Changeable Message Signs on Curved Roadways
Directory of Open Access Journals (Sweden)
Zhongren Wang, Ph.D. P.E. T.E.
2015-01-01
This paper presents a fundamental framework for Changeable Message Sign (CMS) placement design along roadways with horizontal curves. This analytical framework determines the distance available for motorists to read and react to CMS messages based on CMS character height, the driver's cone of vision, the CMS pixel's cone of legibility, the roadway horizontal curve radius, and the CMS lateral and vertical placement. Sample design charts were developed to illustrate how the analytical framework may facilitate CMS placement design.
p-Curve and p-Hacking in Observational Research.
Bruns, Stephan B; Ioannidis, John P A
2016-01-01
The p-curve, the distribution of statistically significant p-values of published studies, has been used to make inferences on the proportion of true effects and on the presence of p-hacking in the published literature. We analyze the p-curve for observational research in the presence of p-hacking. We show by means of simulations that even with minimal omitted-variable bias (e.g., unaccounted confounding) p-curves based on true effects and p-curves based on null-effects with p-hacking cannot be reliably distinguished. We also demonstrate this problem using as a practical example the evaluation of the effect of malaria prevalence on economic growth between 1960 and 1996. These findings call into question recent studies that use the p-curve to infer that most published research findings in the medical literature and in a wide range of disciplines are based on true effects. p-values in observational research may need to be empirically calibrated to be interpretable with respect to the commonly used significance threshold of 0.05. Violations of randomization in experimental studies may also result in situations where the use of p-curves is similarly unreliable.
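The difficulty of telling p-hacking from true effects can be illustrated with a toy selective-reporting model (report the best of five analyses). This is a simplified stand-in for the omitted-variable mechanism the paper studies, not its simulation design:

```python
import random

def p_curve(pvals, alpha=0.05, bins=5):
    """Share of statistically significant p-values in equal-width bins below alpha."""
    sig = [p for p in pvals if p < alpha]
    counts = [0] * bins
    for p in sig:
        counts[min(int(p / (alpha / bins)), bins - 1)] += 1
    return [c / len(sig) for c in counts]

random.seed(1)
honest_null = [random.random() for _ in range(200000)]   # uniform p-values
hacked = [min(random.random() for _ in range(5))         # keep the best of 5 looks
          for _ in range(200000)]
print([round(x, 2) for x in p_curve(honest_null)])  # roughly flat: ~0.2 per bin
print([round(x, 2) for x in p_curve(hacked)])       # skewed toward small p, like a true effect
```

An honest null gives a flat p-curve; selective reporting piles mass onto the smallest p-values, mimicking the signature usually attributed to true effects.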
Directory of Open Access Journals (Sweden)
Yuegang Li
2016-01-01
An outstanding issue in the oil and gas industry is how to evaluate quantitatively the influences of water production on the production performance of gas wells. Based on gas–water flow theories, a new method was therefore proposed in this paper to evaluate quantitatively the production performance of water-producing gas wells by using gas and water relative permeability curves, after a thorough comparative study was conducted. In this way, quantitative evaluation was performed on the production capacity, gas production, ultimate cumulative gas production and recovery factor of water-producing gas wells. Then, a case study was carried out on the tight sandstone gas reservoirs with strong heterogeneity in the Sulige gas field, Ordos Basin. The method was verified in terms of practicability and reliability through a large amount of calculation based on the actual production performance data of various gas wells with different volumes of water produced. Finally, empirical formulas and charts were established for water-producing gas wells in this field to quantitatively evaluate their production capacity, gas production, ultimate cumulative gas production and recovery factor under different water–gas ratios. These formulas and charts provide technical support for the field application and dissemination of the method. Study results show that water production is serious in the west of this field, with the water–gas ratio varying over a large range. If the average water–gas ratio is 1.0 (or 2.0) m³/10⁴ m³, the production capacity, cumulative gas production and recovery factor of gas wells will be respectively 24.4% (or 40.2%), 24.4% (or 40.2%) and 17.4% (or 33.2%).
Smarandache Curves In Terms of Sabban Frame of Fixed Pole Curve
Directory of Open Access Journals (Sweden)
Süleyman Şenyurt
2016-06-01
In this paper, we study the special Smarandache curve in terms of the Sabban frame of the Fixed Pole curve and we give some characterizations of Smarandache curves. Besides, we illustrate examples of our results.
Transition curves for highway geometric design
Kobryń, Andrzej
2017-01-01
This book provides concise descriptions of the various solutions of transition curves, which can be used in the geometric design of roads and highways. It presents mathematical methods and curvature functions for defining transition curves.
2002-01-01
The Atlas of Stress-Strain Curves, Second Edition is substantially bigger in page dimensions, number of pages, and total number of curves than the previous edition. It contains over 1,400 curves, almost three times as many as in the 1987 edition. The curves are normalized in appearance to aid making comparisons among materials. All diagrams include metric (SI) units, and many also include U.S. customary units. All curves are captioned in a consistent format with valuable information including (as available) standard designation, the primary source of the curve, mechanical properties (including hardening exponent and strength coefficient), condition of sample, strain rate, test temperature, and alloy composition. Curve types include monotonic and cyclic stress-strain, isochronous stress-strain, and tangent modulus. Curves are logically arranged and indexed for fast retrieval of information. The book also includes an introduction that provides background information on methods of stress-strain determination, on...
Integration over Tropical Plane Curves and Ultradiscretization
Iwao, Shinsuke
2008-01-01
In this article we study holomorphic integrals on tropical plane curves in view of ultradiscretization. We prove that the lattice integrals over tropical curves can be obtained as a certain limit of complex integrals over Riemann surfaces.
Integrable System and Motion of Curves in Projective and Similarity Geometries
International Nuclear Information System (INIS)
Hou Yuqing
2006-01-01
Based on the natural frame in projective geometry, motions of curves in projective geometry are studied. It is shown that several integrable equations, including the Sawada-Kotera and KK equations, arise from motions of plane curves in projective geometries. Motion of space curves described by an acceleration field and governed by endowing an extra space variable in the similarity geometry P³ is also studied.
Strong laws for generalized absolute Lorenz curves when data are stationary and ergodic sequences
R. Helmers (Roelof); R. Zitikis
2004-01-01
We consider generalized absolute Lorenz curves that include, as special cases, classical and generalized L-statistics as well as absolute or, in other words, generalized Lorenz curves. The curves are based on strictly stationary and ergodic sequences of random variables. Most of the
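The classical Lorenz curve, the simplest special case mentioned above, can be sketched directly from sorted data (the income figures below are invented):

```python
def lorenz_points(values):
    """Cumulative population-share / value-share points of the classical Lorenz curve."""
    xs = sorted(values)
    total = float(sum(xs))
    cum, pts = 0.0, [(0.0, 0.0)]
    for i, v in enumerate(xs, 1):
        cum += v
        pts.append((i / len(xs), cum / total))
    return pts

incomes = [1, 1, 2, 4, 12]  # illustrative data
pts = lorenz_points(incomes)
print(pts[3], pts[-1])  # bottom 60% hold 20% of the total; curve ends at (1, 1)
```

The generalized and absolute variants studied in the paper modify the normalization of the vertical axis, but the cumulative-sum construction is the same.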
Estimation method of the fracture resistance curve
Energy Technology Data Exchange (ETDEWEB)
Cho, Sung Keun; Lee, Kwang Hyeon; Koo, Jae Mean; Seok, Chang Sung [Sungkyunkwan Univ., Suwon (Korea, Republic of); Park, Jae Sil [Samsung Electric Company, Suwon (Korea, Republic of)
2008-07-01
Fracture resistance curves for the materials concerned are required in order to perform elastic-plastic fracture mechanics analysis. A fracture resistance curve is built from J-integral values and crack extension values. The objective of this paper is to propose an estimation method for the fracture resistance curve. The estimation method for the pipe specimen was based on the load ratio method, using load–displacement data for the standard specimen.
M-curves and symmetric products
Indian Academy of Sciences (India)
Indranil Biswas
2017-08-03
Since M-curves play a special role in the topology of real algebraic varieties, it is useful to have a criterion for M-curves. It was proved earlier that a curve defined over R is an M-curve if and only if its Jacobian is an M-variety [5]. We use this result of [5] and the Picard bundle to prove that the n-th symmetric ...
Gabauer, Douglas J; Li, Xiaolong
2015-04-01
The purpose of this study was to investigate motorcycle-to-barrier crash frequency on horizontally curved roadway sections in Washington State using police-reported crash data linked with roadway data and augmented with barrier presence information. Data included 4915 horizontally curved roadway sections, with 252 of these sections experiencing 329 motorcycle-to-barrier crashes between 2002 and 2011. Negative binomial regression was used to predict motorcycle-to-barrier crash frequency using horizontal curvature and other roadway characteristics. Based on the model results, the strongest predictor of crash frequency was found to be curve radius. This supports a motorcycle-to-barrier crash countermeasure placement criterion based, at the very least, on horizontal curve radius. With respect to the existing horizontal curve criterion of 820 feet or less, curves meeting this criterion were found to increase motorcycle-to-barrier crash frequency by a factor of 10 compared to curves not meeting this criterion. Other statistically significant predictors were curve length, traffic volume and the location of adjacent curves. Assuming curves of identical radius, the model results suggest that longer curves, those with higher traffic volume, and those that have no adjacent curved sections within 300 feet of either curve end would likely be better candidates for a motorcycle-to-barrier crash countermeasure. Copyright © 2015 Elsevier Ltd. All rights reserved.
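The negative binomial (NB2) specification commonly used for such crash-frequency models can be sketched as follows. The mean-function coefficients are purely illustrative assumptions, not the fitted values from this study:

```python
import math

def nb_pmf(y, mu, alpha):
    """Negative binomial pmf, NB2 parameterization: Var = mu + alpha * mu**2."""
    r = 1.0 / alpha
    p = r / (r + mu)
    return math.exp(math.lgamma(y + r) - math.lgamma(r) - math.lgamma(y + 1)
                    + r * math.log(p) + y * math.log(1.0 - p))

def expected_crashes(radius_ft, b0=-4.0, b1=900.0):
    """Hypothetical log-link mean function in inverse radius (coefficients invented)."""
    return math.exp(b0 + b1 / radius_ft)

mu_sharp, mu_gentle = expected_crashes(300.0), expected_crashes(3000.0)
print(round(mu_sharp / mu_gentle, 2))  # → 14.88: sharper curve, higher expected count
```

The extra dispersion parameter alpha is what distinguishes the negative binomial from the Poisson and lets the model absorb the overdispersion typical of crash counts.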
Futa, Yuichi; Okazaki, Hiroyuki; Shidama, Yasunari
2013-01-01
In this paper, we introduce our formalization of the definitions and theorems related to an elliptic curve over a finite prime field. The elliptic curve is important in an elliptic curve cryptosystem whose security is based on the computational complexity of the elliptic curve discrete logarithm problem.
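The group law that underlies such formalizations can be sketched over a small prime field. This is a toy curve, far too small for cryptographic use:

```python
def ec_add(P, Q, a, p):
    """Add points on y^2 = x^3 + a*x + b over F_p; None is the point at infinity."""
    if P is None:
        return Q
    if Q is None:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                                        # inverse points
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p   # tangent slope
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p          # chord slope
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def ec_mul(k, P, a, p):
    """Scalar multiplication by double-and-add."""
    R = None
    while k:
        if k & 1:
            R = ec_add(R, P, a, p)
        P = ec_add(P, P, a, p)
        k >>= 1
    return R

# toy curve y^2 = x^3 + 2x + 3 over F_97 with point P = (3, 6)
P = (3, 6)
Q = ec_mul(2, P, 2, 97)
x, y = Q
print(Q, (y * y - (x ** 3 + 2 * x + 3)) % 97)  # → (80, 10) 0: doubling stays on the curve
```

The discrete logarithm problem mentioned in the abstract is recovering k from P and ec_mul(k, P, a, p), which is believed to be hard for well-chosen curves over large prime fields.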
Statistical re-evaluation of the ASME K_IC and K_IR fracture toughness reference curves
Energy Technology Data Exchange (ETDEWEB)
Wallin, K.; Rintamaa, R. [Valtion Teknillinen Tutkimuskeskus, Espoo (Finland)
1998-11-01
Historically the ASME reference curves have been treated as representing absolute deterministic lower bound curves of fracture toughness. In reality, this is not the case. They represent only deterministic lower bound curves to a specific set of data, which represents a certain probability range. A recently developed statistical lower bound estimation method, called the 'Master curve', has been proposed as a candidate for a new lower bound reference curve concept. From a regulatory point of view, the Master curve is somewhat problematic in that it does not claim to be an absolute deterministic lower bound, but corresponds to a specific theoretical failure probability that can be chosen freely based on the application. In order to be able to substitute the old ASME reference curves with lower bound curves based on the Master curve concept, the inherent statistical nature (and confidence level) of the ASME reference curves must be revealed. In order to estimate the true inherent level of safety represented by the reference curves, the original data base was re-evaluated with statistical methods and compared to an analysis based on the Master curve concept. The analysis reveals that the 5% lower bound Master curve has the same inherent degree of safety as originally intended for the K_IC reference curve. Similarly, the 1% lower bound Master curve corresponds to the K_IR reference curve. (orig.)
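The Master curve percentile bounds discussed here follow, in the commonly quoted ASTM E1921 form, from a three-parameter Weibull distribution with shape 4 and K_min = 20 MPa√m. A sketch (the T0 value is an arbitrary illustration, not from the re-evaluated data base):

```python
import math

def master_curve_k(T, T0, p=0.5):
    """Fracture toughness K_Jc (MPa*sqrt(m)) at cumulative failure probability p,
    from the Master curve with Weibull shape 4 and K_min = 20 (common E1921 form)."""
    scale = 11.0 + 77.0 * math.exp(0.019 * (T - T0))
    return 20.0 + (math.log(1.0 / (1.0 - p))) ** 0.25 * scale

T0 = -60.0  # illustrative reference temperature (deg C)
for prob in (0.01, 0.05, 0.5):
    print(prob, round(master_curve_k(T0, T0, prob), 1))
# at T = T0 the median is ~100 MPa*sqrt(m), matching the form 30 + 70*exp(0) = 100
```

The 5% and 1% curves printed here are the lower bounds the abstract matches against the K_IC and K_IR reference curves, respectively.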
Modeling fertility curves in Africa
Directory of Open Access Journals (Sweden)
Ezra Gayawan
2010-02-01
The modeling of fertility patterns is an essential method researchers use to understand world-wide population patterns. Various types of fertility models have been reported in the literature to capture the patterns specific to developed countries. While much effort has been put into reducing fertility rates in Africa, models that describe African fertility patterns have not been adequately developed. This article presents a flexible parametric model that can adequately capture the varying patterns of the age-specific fertility curves of African countries. The model has parameters that are interpretable in terms of demographic indices. The performance of this model was compared with that of other commonly used models, and Akaike's Information Criterion was used to select the model with the best fit. The presented model was able to reproduce the empirical fertility data of 11 out of 15 countries better than the other models considered.
Euler characteristics and elliptic curves.
Coates, J; Howson, S
1997-10-14
Let E be a modular elliptic curve over Q, without complex multiplication; let p be a prime number where E has good ordinary reduction; and let F∞ be the field obtained by adjoining to Q all p-power division points on E. Write G∞ for the Galois group of F∞ over Q. Assume that the complex L-series of E over Q does not vanish at s = 1. If p ≥ 5, we make a precise conjecture about the value of the G∞-Euler characteristic of the Selmer group of E over F∞. If one makes a standard conjecture about the behavior of this Selmer group as a module over the Iwasawa algebra, we are able to prove our conjecture. The crucial local calculations in the proof depend on recent joint work of the first author with R. Greenberg.
Bacterial streamers in curved microchannels
Rusconi, Roberto; Lecuyer, Sigolene; Guglielmini, Laura; Stone, Howard
2009-11-01
Biofilms, generally identified as microbial communities embedded in a self-produced matrix of extracellular polymeric substances, are involved in a wide variety of health-related problems ranging from implant-associated infections to disease transmission and dental plaque. The usual picture of these bacterial films is that they grow and develop on surfaces. However, suspended biofilm structures, or streamers, have been found in natural environments (e.g., rivers, acid mines, hydrothermal hot springs) and have generally been suggested to stem from turbulent flow. We report the formation of bacterial streamers in curved microfluidic channels. By using confocal laser microscopy we are able to directly image and characterize the spatial and temporal evolution of these filamentous structures. Such streamers, which always connect the inner corners of opposite sides of the channel, are always located in the middle plane. Numerical simulations of the flow provide evidence for an underlying hydrodynamic mechanism behind the formation of the streamers.
The estimation of I–V curves of PV panel using manufacturers’ I–V curves and evolutionary strategy
International Nuclear Information System (INIS)
Barukčić, M.; Hederić, Ž.; Špoljarić, Ž.
2014-01-01
Highlights: • The approximation of an I–V curve by two linear functions and a sigmoid function is proposed. • The sigmoid function is used to estimate the knee of the I–V curve. • The dependence of the sigmoid function parameters on irradiance and temperature is proposed. • The sigmoid function is used to estimate the maximum power point (MPP). - Abstract: A method for the estimation of I–V curves of a photovoltaic (PV) panel by an analytic expression is presented in the paper. The problem is defined in the form of an optimization problem, whose objective is based on data from manufacturers' I–V curves or measured I–V curves. In order to estimate the PV panel parameters, the optimization problem is solved by using an evolutionary strategy. The proposed method is tested for different PV panel technologies using data sheets. In this method, an approximation of the I–V curve by two linear functions and a sigmoid function is proposed, together with a method for estimating the knee of the I–V curve and the maximum power point at any irradiance and temperature.
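The sigmoid part of such an approximation can be sketched as follows. The functional form and all parameter values are illustrative assumptions rather than the paper's fitted expressions, but they show how a sigmoid locates the knee and the MPP:

```python
import math

def iv_sigmoid(v, isc, v_knee, k):
    """Illustrative I-V model: near-constant current dropping to zero through a
    sigmoid centred on the knee (form and parameter names are assumptions)."""
    return isc / (1.0 + math.exp(k * (v - v_knee)))

isc, voc, v_knee, k = 8.2, 37.0, 30.0, 0.9   # invented panel parameters
candidates = [(v, v * iv_sigmoid(v, isc, v_knee, k))
              for v in (voc * i / 1000.0 for i in range(1001))]
v_mpp, p_mpp = max(candidates, key=lambda t: t[1])
print(round(v_mpp, 1))  # the MPP sits just below the knee voltage
```

Scanning P = V·I over the curve is the brute-force counterpart of the analytic MPP estimate the paper derives from the sigmoid parameters.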
AKLSQF - LEAST SQUARES CURVE FITTING
Kantak, A. V.
1994-01-01
The Least Squares Curve Fitting program, AKLSQF, computes the polynomial which will least-squares fit uniformly spaced data easily and efficiently. The program allows the user to specify the tolerable least squares error in the fitting or to specify the polynomial degree. In both cases AKLSQF returns the polynomial and the actual least squares fit error incurred in the operation. The data may be supplied to the routine either by direct keyboard entry or via a file. AKLSQF produces the least squares polynomial in two steps. First, the data points are least squares fitted using the orthogonal factorial polynomials. The result is then reduced to a regular polynomial using Stirling numbers of the first kind. If an error tolerance is specified, the program starts with a polynomial of degree 1 and computes the least squares fit error. The degree of the polynomial used for fitting is then increased successively until the error criterion specified by the user is met. At every step the polynomial as well as the least squares fitting error is printed to the screen. In general, the program can produce a curve fitting up to a 100 degree polynomial. All computations in the program are carried out in double precision format for real numbers and in long integer format for integers to provide the maximum accuracy possible. AKLSQF was written for an IBM PC XT/AT or compatible using Microsoft's QuickBASIC compiler. It has been implemented under DOS 3.2.1 using 23K of RAM. AKLSQF was developed in 1989.
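AKLSQF's stopping rule (raise the polynomial degree until the least-squares error meets the user's tolerance) can be sketched in Python. This sketch uses plain normal equations rather than the orthogonal-polynomial route the program takes:

```python
def polyfit_ls(xs, ys, deg):
    """Least-squares polynomial fit via normal equations and Gaussian elimination."""
    n = deg + 1
    A = [[sum(x ** (i + j) for x in xs) for j in range(n)] for i in range(n)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(n)]
    for col in range(n):                      # partial pivoting, fine for low degree
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            b[r] -= f * b[col]
            A[r] = [u - f * v for u, v in zip(A[r], A[col])]
    coef = [0.0] * n
    for i in reversed(range(n)):              # back substitution
        coef[i] = (b[i] - sum(A[i][j] * coef[j] for j in range(i + 1, n))) / A[i][i]
    return coef

def fit_to_tolerance(xs, ys, tol, max_deg=10):
    """Raise the degree until the summed squared error meets the tolerance."""
    for deg in range(1, max_deg + 1):
        c = polyfit_ls(xs, ys, deg)
        err = sum((sum(ci * x ** i for i, ci in enumerate(c)) - y) ** 2
                  for x, y in zip(xs, ys))
        if err <= tol:
            return deg, c, err
    return max_deg, c, err

xs = [i * 0.5 for i in range(9)]              # uniformly spaced, as AKLSQF expects
ys = [2.0 - 3.0 * x + 0.5 * x ** 3 for x in xs]
deg, coef, err = fit_to_tolerance(xs, ys, 1e-6)
print(deg)  # the tolerance is first met at the true degree of the data, 3
```

Normal equations become ill-conditioned at high degree, which is precisely why AKLSQF fits in an orthogonal-polynomial basis first and only converts to a regular polynomial afterwards.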
Sibling curves of quadratic polynomials | Wiggins | Quaestiones ...
African Journals Online (AJOL)
Sibling curves were demonstrated in [1, 2] as a novel way to visualize the zeroes of real valued functions. In [3] it was shown that a polynomial of degree n has n sibling curves. This paper focuses on the algebraic and geometric properties of the sibling curves of real and complex quadratic polynomials. Key words: Quadratic ...
Legendre Elliptic Curves over Finite Fields
Auer, Roland; Top, Jakob
2002-01-01
We show that every elliptic curve over a finite field of odd characteristic whose number of rational points is divisible by 4 is isogenous to an elliptic curve in Legendre form, with the sole exception of a minimal respectively maximal elliptic curve. We also collect some results concerning the
Cubic spline functions for curve fitting
Young, J. D.
1972-01-01
FORTRAN cubic spline routine mathematically fits a curve through a given ordered set of points so that the fitted curve closely approximates the curve generated by passing an infinitely thin spline through the set of points. The generalized formulation includes trigonometric, hyperbolic, and damped cubic spline fits of third order.
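The kind of routine the abstract describes can be illustrated with a self-contained natural cubic spline in Python (an illustrative sketch, not the FORTRAN code: natural end conditions are assumed, and the trigonometric, hyperbolic, and damped variants are omitted):

```python
import numpy as np

def natural_cubic_spline(x, y):
    """Return a callable evaluating the natural cubic spline through
    the ordered points (x[i], y[i]): second derivatives M are found
    from the standard tridiagonal system with M[0] = M[n] = 0."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x) - 1
    h = np.diff(x)
    # Tridiagonal system for the interior second derivatives.
    A = np.zeros((n - 1, n - 1))
    rhs = 6 * (np.diff(y[1:]) / h[1:] - np.diff(y[:-1]) / h[:-1])
    for i in range(n - 1):
        A[i, i] = 2 * (h[i] + h[i + 1])
        if i > 0:
            A[i, i - 1] = h[i]
        if i < n - 2:
            A[i, i + 1] = h[i + 1]
    M = np.zeros(n + 1)
    M[1:n] = np.linalg.solve(A, rhs)

    def s(t):
        i = np.clip(np.searchsorted(x, t) - 1, 0, n - 1)
        dl, dr = t - x[i], x[i + 1] - t
        return (M[i] * dr**3 + M[i + 1] * dl**3) / (6 * h[i]) \
            + (y[i] / h[i] - M[i] * h[i] / 6) * dr \
            + (y[i + 1] / h[i] - M[i + 1] * h[i] / 6) * dl
    return s
```

The spline reproduces the knots exactly and reduces to a straight line when the data are linear.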
Trigonometric Characterization of Some Plane Curves
Indian Academy of Sciences (India)
IAS Admin
(Figure 1). A relation between tan θ and tan ψ gives the trigonometric equation of the family of curves. In this article, trigonometric equations of some known plane curves are deduced, and it is shown that these equations reveal some geometric characteristics of the families of the curves under consideration. In Section 2, …
Holomorphic curves in exploded manifolds: Kuranishi structure
Parker, Brett
2013-01-01
This paper constructs a Kuranishi structure for the moduli stack of holomorphic curves in exploded manifolds. To avoid some technicalities of abstract Kuranishi structures, we embed our Kuranishi structure inside a moduli stack of curves. The construction also works for the moduli stack of holomorphic curves in any compact symplectic manifold.
Automated Blazar Light Curves Using Machine Learning
Energy Technology Data Exchange (ETDEWEB)
Johnson, Spencer James [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2017-07-27
This presentation describes a problem and methodology pertaining to automated blazar light curves. Studying optical variability patterns of blazars requires the construction of light curves, and in order to generate the light curves, the data must be filtered before processing to ensure quality.
Hardware Accelerators for Elliptic Curve Cryptography
Directory of Open Access Journals (Sweden)
C. Puttmann
2008-05-01
In this paper we explore different hardware accelerators for cryptography based on elliptic curves. Furthermore, we present a hierarchical multiprocessor system-on-chip (MPSoC) platform that can be used for fast integration and evaluation of novel hardware accelerators. For two application scenarios, the hardware accelerators are coupled at different hierarchy levels of the MPSoC platform. The whole system is implemented in a state-of-the-art 65 nm standard cell technology. Moreover, an FPGA-based rapid prototyping system for fast system verification is presented. Finally, a metric to analyze resource efficiency in terms of chip area, execution time and energy consumption is introduced.
Magneto-electro-elastic buckling analysis of nonlocal curved nanobeams
Ebrahimi, Farzad; Reza Barati, Mohammad
2016-09-01
In this work, a size-dependent curved beam model is developed to take into account the effects of nonlocal stresses on the buckling behavior of curved magneto-electro-elastic FG nanobeams for the first time. The governing differential equations are derived based on the principle of virtual work and Euler-Bernoulli beam theory. The power-law function is employed to describe the spatially graded magneto-electro-elastic properties. By extending the radius of the curved nanobeam to infinity, the results for straight nonlocal FG beams can be recovered. The effects of magnetic potential, electric voltage, opening angle, nonlocal parameter, power-law index and slenderness ratio on the buckling loads of curved MEE-FG nanobeams are studied.
Symmetric digit sets for elliptic curve scalar multiplication without precomputation.
Heuberger, Clemens; Mazzoli, Michela
2014-08-28
We describe a method to perform scalar multiplication on two classes of ordinary elliptic curves, namely [Formula: see text] in prime characteristic [Formula: see text], and [Formula: see text] in prime characteristic [Formula: see text]. On these curves, the 4th and 6th roots of unity act as (computationally efficient) endomorphisms. In order to optimise the scalar multiplication, we consider a width-w-NAF (Non-Adjacent Form) digit expansion of positive integers to the complex base τ, where τ is a zero of the characteristic polynomial [Formula: see text] of the Frobenius endomorphism associated to the curve. We provide a precomputationless algorithm by means of a convenient factorisation of the unit group of residue classes modulo τ in the endomorphism ring, whereby we construct a digit set consisting of powers of subgroup generators, which are chosen as efficient endomorphisms of the curve.
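For ordinary integers, the width-w NAF expansion mentioned above works as in the following sketch (the paper's algorithm expands scalars to the complex base τ instead; this textbook integer version, with our own function name, only illustrates the digit-set idea):

```python
def wnaf(n, w):
    """Width-w NAF: signed digits d_i with n = sum d_i * 2**i, where each
    nonzero digit is odd with |d_i| < 2**(w-1), and among any w
    consecutive digits at most one is nonzero."""
    digits = []
    while n > 0:
        if n & 1:
            d = n % (1 << w)           # residue mod 2^w ...
            if d >= (1 << (w - 1)):    # ... mapped into the signed range
                d -= 1 << w
            n -= d                     # n - d is now divisible by 2^w
        else:
            d = 0
        digits.append(d)
        n >>= 1
    return digits  # least significant digit first
```

The sparsity guarantee (at most one nonzero digit per window of w) is what reduces the number of costly curve additions in scalar multiplication.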
Ice detection on wind turbines using observed power curve
DEFF Research Database (Denmark)
Davis, Neil; Byrkjedal, Øyvind; Hahmann, Andrea N.
2016-01-01
Icing on the blades of a wind turbine can lead to significant production losses during the winter months for wind parks in cold climate regions. However, there is no standard way of identifying ice-induced power loss. This paper describes three methods for creating power threshold curves that can be used to separate iced production periods from non-iced production periods. The first approach relies on a percentage deviation from the manufacturer's power curve. The other two approaches fit threshold curves based on the observed variance of non-iced production data. These approaches are applied to turbines in four wind parks and compared with each other and to observations of icing on the nacelle of one of the turbines in each park. It is found that setting an ice threshold curve using the 0.1 quantile of the observed power data during normal operation, with a 2-h minimum duration, is the best approach.
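The quantile-based threshold idea can be sketched as below (a hypothetical Python illustration: the bin count, the empty-bin handling, and the omission of the 2-h minimum-duration filter are our simplifications, not the paper's method):

```python
import numpy as np

def ice_threshold_curve(wind_speed, power, n_bins=20, q=0.1):
    """Per-wind-speed-bin power threshold from the q-quantile of observed
    power during normal (non-iced) operation. Production falling below
    the threshold for its wind-speed bin suggests icing."""
    edges = np.linspace(wind_speed.min(), wind_speed.max(), n_bins + 1)
    idx = np.clip(np.digitize(wind_speed, edges) - 1, 0, n_bins - 1)
    thresholds = np.full(n_bins, np.nan)
    for b in range(n_bins):
        in_bin = power[idx == b]
        if in_bin.size:                       # leave empty bins as NaN
            thresholds[b] = np.quantile(in_bin, q)
    return edges, thresholds
```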
Object-Image Correspondence for Algebraic Curves under Projections
Directory of Open Access Journals (Sweden)
Joseph M. Burdis
2013-03-01
We present a novel algorithm for deciding whether a given planar curve is an image of a given spatial curve, obtained by a central or a parallel projection with unknown parameters. The motivation comes from the problem of establishing a correspondence between an object and an image taken by a camera with unknown position and parameters. A straightforward approach to this problem consists of setting up a system of conditions on the projection parameters and then checking whether or not this system has a solution. The computational advantage of the algorithm presented here, in comparison to algorithms based on the straightforward approach, lies in a significant reduction of the number of real parameters that need to be eliminated in order to establish the existence or non-existence of a projection that maps a given spatial curve to a given planar curve. Our algorithm is based on projection criteria that reduce the projection problem to a certain modification of the equivalence problem of planar curves under affine and projective transformations. To solve the latter problem we make an algebraic adaptation of the signature construction that has been used to solve equivalence problems for smooth curves. We introduce a notion of a classifying set of rational differential invariants and produce explicit formulas for such invariants for the actions of the projective and the affine groups on the plane.
Power forward curves: a managerial perspective
International Nuclear Information System (INIS)
Nagarajan, Shankar
1999-01-01
This chapter concentrates on managerial applications of power forward curves and examines the determinants of electricity prices, such as transmission constraints, the inability of electricity to be stored in a conventional way, its seasonality and weather dependence, the generation stack, and swing risk. The electricity forward curve, classical arbitrage, construction of a forward curve, volatilities, and electricity forward curve models such as the jump-diffusion model, the mean-reverting heteroscedastic volatility model, and an econometric model of forward prices are examined. A managerial perspective on the applications of the forward curve is presented, covering plant valuation, capital budgeting, performance measurement, product pricing and structuring, asset optimisation, valuation of transmission options, and risk management.
Retrograde curves of solidus and solubility
International Nuclear Information System (INIS)
Vasil'ev, M.V.
1979-01-01
The investigation was concerned with constitutional diagrams of the eutectic type with a ''retrograde solidus'' and a ''retrograde solubility curve'', which must be considered as diagrams with a degenerate monotectic transformation. The solidus and solubility curves form a retrograde curve with a common retrograde point representing the solubility maximum. The two branches of the retrograde curve can be described with the aid of two similar equations. Corresponding equations are presented for the Cd-Zn system, and the possibility of predicting the run of the solubility curve is shown.
Handbook of elliptic and hyperelliptic curve cryptography
Cohen, Henri; Avanzi, Roberto; Doche, Christophe; Lange, Tanja; Nguyen, Kim; Vercauteren, Frederik
2005-01-01
… very comprehensive coverage of this vast subject area … a useful and essential treatise for anyone involved in elliptic curve algorithms … this book offers the opportunity to grasp the ECC technology with a diversified and comprehensive perspective. … This book will remain on my shelf for a long time and will land on my desk on many occasions, if only because the coverage of the issues common to factoring and discrete log cryptosystems is excellent. (IACR Book Reviews, June 2011) … the book is designed for people who are working in the area and want to learn more about a specific issue. The chapters are written to be relatively independent so that readers can focus on the part of interest to them. Such readers will be grateful for the excellent index and extensive bibliography. … the handbook covers a wide range of topics and will be a valuable reference for researchers in curve-based cryptography. (Steven D. Galbraith, Mathematical Reviews, Issue 2007f)
Modeling of alpha mass-efficiency curve
International Nuclear Information System (INIS)
Semkow, T.M.; Jeter, H.W.; Parsa, B.; Parekh, P.P.; Haines, D.K.; Bari, A.
2005-01-01
We present a model for the efficiency of a detector counting gross α radioactivity from both thin and thick samples, corresponding to low and high sample masses in the counting planchette. The model includes self-absorption of α particles in the sample, energy loss in the absorber, range straggling, as well as detector edge effects. The surface roughness of the sample is treated in terms of fractal geometry. The model reveals a linear dependence of the detector efficiency on the sample mass for low masses, as well as a power-law dependence for high masses. It is, therefore, named the linear-power-law (LPL) model. In addition, we consider an empirical power-law (EPL) curve and an exponential (EXP) curve. A comparison is made of the LPL, EPL, and EXP fits to the experimental α mass-efficiency data from gas-proportional detectors for selected radionuclides: ²³⁸U, ²³⁰Th, ²³⁹Pu, ²⁴¹Am, and ²⁴⁴Cm. Based on this comparison, we recommend working equations for fitting mass-efficiency data. Measurement of α radioactivity from a thick sample can determine the fractal dimension of its surface.
Defining a learning curve for laparoscopic cardiomyotomy.
Grotenhuis, Brechtje A; Wijnhoven, Bas P L; Jamieson, Glyn G; Devitt, Peter G; Bessell, Justin R; Watson, David I
2008-08-01
This study was designed to determine whether there is a learning curve for laparoscopic cardiomyotomy for the treatment of achalasia. All patients who underwent a primary laparoscopic cardiomyotomy for achalasia between 1992 and 2006 in our hospitals were identified from a prospective database. The institutional and the individual surgeon's learning experiences were assessed based on operative and clinical outcome parameters. The outcomes of cardiomyotomies performed by consultant surgeons versus supervised trainees also were compared. A total of 186 patients met the inclusion criteria; 144 procedures were undertaken by consultant surgeons and 42 by a surgical trainee. The length of operation decreased after the first ten cases in both the institutional and each individual experience. The rate of conversion to open surgery also was significantly higher in the first 20 cases performed. Intraoperative complications, overall satisfaction with the outcome, reoperation rate, and postoperative dysphagia were not associated with the institutional or the surgeon's operative experience. Although the length of the operation was greater for surgical trainees (93 versus 79 minutes), a learning curve for laparoscopic cardiomyotomy for achalasia can be defined. The clinical outcome for laparoscopic cardiomyotomy does not differ between supervised surgical trainees and consultant surgeons.
Flood damage curves for consistent global risk assessments
de Moel, Hans; Huizinga, Jan; Szewczyk, Wojtek
2016-04-01
Assessing potential damage of flood events is an important component in flood risk management. Determining direct flood damage is commonly done using depth-damage curves, which denote the flood damage that would occur at specific water depths per asset or land-use class. Many countries around the world have developed flood damage models using such curves, which are based on analysis of past flood events and/or on expert judgement. However, such damage curves are not available for all regions, which hampers damage assessments in those regions. Moreover, due to the different methodologies employed for damage models in different countries, damage assessments cannot be directly compared with each other, obstructing supra-national flood damage assessments as well. To address these problems, a globally consistent dataset of depth-damage curves has been developed. This dataset contains damage curves depicting percent of damage as a function of water depth, as well as maximum damage values for a variety of assets and land use classes (i.e. residential, commercial, agriculture). Based on an extensive literature survey, concave damage curves have been developed for each continent, while differentiation in flood damage between countries is established by determining maximum damage values at the country scale. These maximum damage values are based on construction cost surveys from multinational construction companies, which provide a coherent set of detailed building cost data across dozens of countries. A consistent set of maximum flood damage values for all countries was computed using statistical regressions with socio-economic World Development Indicators from the World Bank. Further, based on insights from the literature survey, guidance is also given on how the damage curves and maximum damage values can be adjusted for specific local circumstances, such as urban vs. rural locations, use of specific building material, etc. This dataset can be used for consistent supra-national flood damage assessments.
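The basic use of a depth-damage curve can be illustrated as follows (the curve coordinates and maximum damage value below are hypothetical, not taken from the dataset):

```python
import numpy as np

def flood_damage(depth_m, curve_depths, curve_fractions, max_damage):
    """Direct flood damage from a depth-damage curve: interpolate the
    damage fraction at the given water depth and scale it by the
    country-specific maximum damage value."""
    frac = np.interp(depth_m, curve_depths, curve_fractions)
    return frac * max_damage

# A hypothetical concave residential curve: damage fraction vs. depth (m).
depths = [0.0, 0.5, 1.0, 2.0, 4.0, 6.0]
fracs  = [0.0, 0.25, 0.4, 0.6, 0.85, 1.0]
```

Depths beyond the last curve point simply saturate at the maximum damage, mirroring how such curves are applied in practice.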
Page curves for tripartite systems
International Nuclear Information System (INIS)
Hwang, Junha; Lee, Deok Sang; Nho, Dongju; Oh, Jeonghun; Park, Hyosub; Zoe, Heeseung; Yeom, Dong-han
2017-01-01
We investigate information flow and Page curves for tripartite systems. We prepare a tripartite system (say, A, B, and C) of a given number of states and calculate information and entropy contents by assuming random states. Initially, every particle was in A (this means a black hole), and as time goes on, particles move to either B (this means Hawking radiation) or C (this means a broadly defined remnant, including a non-local transport of information, the last burst, an interior large volume, or a bubble universe, etc.). If the final number of states of the remnant is smaller than that of Hawking radiation, then information will be stored by both the radiation and the mutual information between the radiation and the remnant, while the remnant itself does not contain information. On the other hand, if the final number of states of the remnant is greater than that of Hawking radiation, then the radiation contains negligible information, while the remnant and the mutual information between the radiation and the remnant contain information. Unless the number of states of the remnant is large enough compared to the entropy of the black hole, Hawking radiation must contain information; and we meet the menace of black hole complementarity again. Therefore, this contrasts the tension between various assumptions and candidates for the resolution of the information loss problem. (paper)
Curved-Line Cutting Using a Flexible Circular Saw
Yamada, Yohei; Osumi, Nobuyuki; Takasugi, Akio; Sasahara, Hiroyuki
We propose a flexible circular saw for high-speed cutting of curved lines in carbon fiber-reinforced plastic (CFRP). A conventional circular saw is appropriate for straight line cutting, but it cannot be applied to curved line cutting because of the interference between the saw body and the machined surface. To eliminate this problem, the flexible circular saw is deflected into a bowl shape by circular forced displacement, so that the cross-section of the saw becomes a circular arc. A curved line can be cut by this bowl-like deflection. The deflection shape is very important for realizing curved-line cutting without interference. We investigated the deflection of the flexible circular saw by finite element method (FEM) analysis. Suitable slit shapes for the saw body are also proposed, based on the FEM results regarding stress in the saw body, the minimum radius of curvature, and the effects of cutting force, centrifugal force, and eigenvalue analysis. We also conducted a curved-line cutting test on a CFRP plate, and we found that the flexible circular saw can cut curved lines with high accuracy and high speed without interference between the saw body and the machined surface.
An analysis on the environmental Kuznets curve of Chengdu
Gao, Zijian; Peng, Yue; Zhao, Yue
2017-12-01
In this paper, based on the environmental and economic data of Chengdu from 2005 to 2014, measurement models were established to analyze 3 kinds of environmental flow indicators and 4 kinds of environmental stock indicators and to obtain their EKC evolution trajectories and characteristics. The results show that the relationship curve between the discharge of SO2 from industry and GDP per capita is a positive U shape, as is the curve between the discharge of COD from industry and GDP per capita. The relationship curve between dust discharge from industry and GDP per capita is an inverted N shape. In the central urban area, the relationship curve between the concentration of SO2 in the air and GDP per capita is a positive U shape. The relationship curves between the concentration of NO2 in the air and GDP per capita, between the concentration of particulate matter and GDP per capita, and between the concentration of fallen dust and GDP per capita are fluctuating. Thus the EKC curves of the 7 kinds of environmental indicators do not accord with the inverted U shape feature. In the development of this city, environmental problems cannot be resolved by economic growth alone. The discharge of industrial pollutants should be controlled to improve atmospheric environmental quality and reduce environmental risks.
D-branes in a big bang/big crunch universe: Nappi-Witten gauged WZW model
Energy Technology Data Exchange (ETDEWEB)
Hikida, Yasuaki [School of Physics and BK-21 Physics Division, Seoul National University, Seoul 151-747 (Korea, Republic of)]; Nayak, Rashmi R. [Dipartimento di Fisica and INFN, Sezione di Roma 2, 'Tor Vergata', Rome 00133 (Italy)]; Panigrahi, Kamal L. [Dipartimento di Fisica and INFN, Sezione di Roma 2, 'Tor Vergata', Rome 00133 (Italy)]
2005-05-01
We study D-branes in the Nappi-Witten model, which is a gauged WZW model based on (SL(2,R) x SU(2))/(U(1) x U(1)). The model describes a four dimensional space-time consisting of cosmological regions with big bang/big crunch singularities and static regions with closed time-like curves. The aim of this paper is to investigate by D-brane probes whether there are pathologies associated with the cosmological singularities and the closed time-like curves. We first classify D-branes in a group theoretical way, and then examine DBI actions for effective theories on the D-branes. In particular, we show that D-brane metric from the DBI action does not include singularities, and wave functions on the D-branes are well behaved even in the presence of closed time-like curves.
An appraisal of the learning curve in robotic general surgery.
Pernar, Luise I M; Robertson, Faith C; Tavakkoli, Ali; Sheu, Eric G; Brooks, David C; Smink, Douglas S
2017-11-01
Robotic-assisted surgery is used with increasing frequency in general surgery for a variety of applications. In spite of this increase in usage, the learning curve is not yet defined. This study reviews the literature on the learning curve in robotic general surgery to inform adopters of the technology. PubMed and EMBASE searches yielded 3690 abstracts published between July 1986 and March 2016. The abstracts were evaluated based on the following inclusion criteria: written in English, reporting original work, focus on general surgery operations, and with explicit statistical methods. Twenty-six full-length articles were included in final analysis. The articles described the learning curves in colorectal (9 articles, 35%), foregut/bariatric (8, 31%), biliary (5, 19%), and solid organ (4, 15%) surgery. Eighteen of 26 (69%) articles report single-surgeon experiences. Time was used as a measure of the learning curve in all studies (100%); outcomes were examined in 10 (38%). In 12 studies (46%), the authors identified three phases of the learning curve. Numbers of cases needed to achieve plateau performance were wide-ranging but overlapping for different kinds of operations: 19-128 cases for colorectal, 8-95 for foregut/bariatric, 20-48 for biliary, and 10-80 for solid organ surgery. Although robotic surgery is increasingly utilized in general surgery, the literature provides few guidelines on the learning curve for adoption. In this heterogeneous sample of reviewed articles, the number of cases needed to achieve plateau performance varies by case type and the learning curve may have multiple phases as surgeons add more complex cases to their case mix with growing experience. Time is the most common determinant for the learning curve. The literature lacks a uniform assessment of outcomes and complications, which would arguably reflect expertise in a more meaningful way than time to perform the operation alone.
Localized qubits in curved spacetimes
Palmer, Matthew C.; Takahashi, Maki; Westman, Hans F.
2012-04-01
We provide a systematic and self-contained exposition of the subject of localized qubits in curved spacetimes. This research was motivated by a simple experimental question: if we move a spatially localized qubit, initially in a state |ψ1>, along some spacetime path Γ from a spacetime point x1 to another point x2, what will the final quantum state |ψ2> be at point x2? This paper addresses this question for two physical realizations of the qubit: spin of a massive fermion and polarization of a photon. Our starting point is the Dirac and Maxwell equations that describe respectively the one-particle states of localized massive fermions and photons. In the WKB limit we show how one can isolate a two-dimensional quantum state which evolves unitarily along Γ. The quantum states for these two realizations are represented by a left-handed 2-spinor in the case of massive fermions and a four-component complex polarization vector in the case of photons. In addition we show how to obtain from this WKB approach a fully general relativistic description of gravitationally induced phases. We use this formalism to describe the gravitational shift in the Colella-Overhauser-Werner 1975 experiment. In the non-relativistic weak field limit our result reduces to the standard formula in the original paper. We provide a concrete physical model for a Stern-Gerlach measurement of spin and obtain a unique spin operator which can be determined given the orientation and velocity of the Stern-Gerlach device and velocity of the massive fermion. Finally, we consider multipartite states and generalize the formalism to incorporate basic elements from quantum information theory such as quantum entanglement, quantum teleportation, and identical particles. The resulting formalism provides a basis for exploring precision quantum measurements of the gravitational field using techniques from quantum information theory.
Directory of Open Access Journals (Sweden)
Soren Ventegodt
2003-01-01
In this paper we present a new research paradigm for alternative, complementary, and holistic medicine — a low-cost, effective, and scientifically valid design for evidence-based medicine. Our aim is to find the simplest, cheapest, and most practical way to collect data of sufficient quality and validity to determine: (1) which kinds of treatment give a clinically relevant improvement to quality of life, health, and/or functionality; (2) which groups of patients can be aided by alternative, complementary, or holistic medicine; and (3) which therapists have the competence to achieve the clinically relevant improvements. Our solution to the problem is that a positive change in quality of life must be immediate to be taken as caused by an intervention. We define "immediate" as within 1 month of the intervention. If we can demonstrate a positive result with a group of chronic patients (20 or more patients who have had their disease or state of suffering for 1 year or more), who can be significantly helped within 1 month, with the situation still improved 1 year after, we find it scientifically evidenced that this cure or intervention has helped the patients. We call this characteristic curve a "square curve". If a global, generic quality-of-life questionnaire like QOL5 or, even better, a QOL-Health-Ability questionnaire (a quality-of-life questionnaire combined with a self-evaluated health and ability-to-function questionnaire) is administered to the patients before and after the intervention, it is possible to document the effect of an intervention at a cost of only a few thousand Euros/USD. A general acceptance of this new research design will solve the problem that there is not enough money in alternative, complementary, and holistic medicine to pay the normal cost of a biomedical Cochrane study. As financial problems must not hinder the vital research in nonbiomedical medicine, we ask the scientific community to accept this new research design.
Construction of calibration curve for accountancy tank
International Nuclear Information System (INIS)
Kato, Takayuki; Goto, Yoshiki; Nidaira, Kazuo
2009-01-01
Tanks are equipped in a reprocessing plant for accounting of nuclear material solution. Careful measurement of the volume in tanks is very important for implementing rigorous accounting of nuclear material. A calibration curve relating the volume and level of solution needs to be constructed, where the level is determined from the differential pressure of dip tubes. Several calibration curves are usually employed, but it is not explicitly decided how many segments should be used, where the segment boundaries should be placed, or what the degree of the polynomial curve should be. These parameters, i.e., the segments and the degrees of the polynomial curves, are mutually interrelated in determining the performance of the calibration curve. Here we present a construction technique giving optimum calibration curves and describe their characteristics. (author)
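A segmented polynomial calibration curve of the kind discussed can be sketched as follows (an illustrative Python sketch: the breakpoints and degree are given as inputs rather than optimized, whereas choosing them optimally is the actual subject of the paper):

```python
import numpy as np

def fit_calibration(level, volume, breakpoints, degree=2):
    """Fit one least-squares polynomial per level segment.
    Returns a list of (lo, hi, coefficients) tuples."""
    segs = []
    bounds = [level.min(), *breakpoints, level.max()]
    for lo, hi in zip(bounds[:-1], bounds[1:]):
        m = (level >= lo) & (level <= hi)
        segs.append((lo, hi, np.polyfit(level[m], volume[m], degree)))
    return segs

def predict_volume(segs, h):
    """Evaluate the segmented calibration curve at level h."""
    for lo, hi, c in segs:
        if lo <= h <= hi:
            return np.polyval(c, h)
    raise ValueError("level outside calibrated range")
```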
Using the generalized Radon transform for detection of curves in noisy images
DEFF Research Database (Denmark)
Toft, Peter Aundal
1996-01-01
In this paper the discrete generalized Radon transform will be investigated as a tool for detection of curves in noisy digital images. The discrete generalized Radon transform maps an image into a parameter domain, where curves following a specific parameterized curve form will correspond to a peak in the parameter domain. A major advantage of the generalized Radon transform is that the curves are allowed to intersect. This enables a thresholding algorithm in the parameter domain for simultaneous detection of curve parameters. A threshold level based on the noise level in the image is derived. A numerical...
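A minimal discrete transform of this kind, specialized to straight lines, can be sketched as follows (an illustrative Python sketch with our own parameterization y = a·x + b; the paper treats general parameterized curve forms and derives a noise-based threshold, which is not shown here):

```python
import numpy as np

def detect_line(img, slopes, offsets):
    """Discrete (generalized) Radon transform specialized to lines
    y = a*x + b: sum image intensity along each candidate line and
    return the (a, b) giving the largest peak in the parameter domain."""
    h, w = img.shape
    xs = np.arange(w)
    best, best_ab = -np.inf, None
    for a in slopes:
        for b in offsets:
            ys = np.rint(a * xs + b).astype(int)
            m = (ys >= 0) & (ys < h)       # keep samples inside the image
            s = img[ys[m], xs[m]].sum()    # accumulate along the curve
            if s > best:
                best, best_ab = s, (a, b)
    return best_ab, best
```

Because only the accumulated sum matters, intersecting curves do not disturb each other's peaks, which is the property the abstract highlights.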
Regional Marginal Abatement Cost Curves for NOx
U.S. Environmental Protection Agency — Data underlying the figures included in the manuscript "Marginal abatement cost curve for NOx incorporating controls, renewable electricity, energy efficiency and...
Hacke, Uwe G; Venturas, Martin D; MacKinnon, Evan D; Jacobsen, Anna L; Sperry, John S; Pratt, R Brandon
2015-01-01
The standard centrifuge method has been frequently used to measure vulnerability to xylem cavitation. This method has recently been questioned. It was hypothesized that open vessels lead to exponential vulnerability curves, which were thought to be indicative of measurement artifact. We tested this hypothesis in stems of olive (Olea europaea) because its long vessels were recently claimed to produce a centrifuge artifact. We evaluated three predictions that followed from the open vessel artifact hypothesis: shorter stems, with more open vessels, would be more vulnerable than longer stems; standard centrifuge-based curves would be more vulnerable than dehydration-based curves; and open vessels would cause an exponential shape of centrifuge-based curves. Experimental evidence did not support these predictions. Centrifuge curves did not vary when the proportion of open vessels was altered. Centrifuge and dehydration curves were similar. At highly negative xylem pressure, centrifuge-based curves slightly overestimated vulnerability compared to the dehydration curve. This divergence was eliminated by centrifuging each stem only once. The standard centrifuge method produced accurate curves of samples containing open vessels, supporting the validity of this technique and confirming its utility in understanding plant hydraulics. Seven recommendations for avoiding artifacts and standardizing vulnerability curve methodology are provided. © 2014 The Authors. New Phytologist © 2014 New Phytologist Trust.
Guidelines for using the Delphi Technique to develop habitat suitability index curves
Crance, Johnie H.
1987-01-01
Habitat Suitability Index (SI) curves are one method of presenting species habitat suitability criteria. The curves are often used with the Habitat Evaluation Procedures (HEP) and are necessary components of the Instream Flow Incremental Methodology (IFIM) (Armour et al. 1984). Bovee (1986) described three categories of SI curves or habitat suitability criteria based on the procedures and data used to develop the criteria. Category I curves are based on professional judgment, with little or no empirical data. Both Category II (utilization criteria) and Category III (preference criteria) curves have as their source data collected at locations where target species are observed or collected. Having Category II and Category III curves for all species of concern would be ideal. In reality, no SI curves are available for many species, and SI curves that require intensive field sampling often cannot be developed under prevailing constraints on time and costs. One alternative under these circumstances is the development and interim use of SI curves based on expert opinion. The Delphi technique (Pill 1971; Delbecq et al. 1975; Linstone and Turoff 1975) is one method used for combining the knowledge and opinions of a group of experts. The purpose of this report is to describe how the Delphi technique may be used to develop expert-opinion-based SI curves.
Hong Shen
2011-01-01
The concepts of curve profile, curve intercept, curve intercept density, curve profile area density, intersection density in the containing intersection (or intersection density based on the intersection reference), curve profile intersection density in surface (or curve intercept intersection density based on the intersection of the containing curve), and curve profile area density in surface (AS) were defined. AS expresses the amount of curve profile area of the Y phase per unit containing surface area, S...
Polar representation of centrifugal pump homologous curves
International Nuclear Information System (INIS)
Veloso, Marcelo Antonio; Mattos, Joao Roberto Loureiro de
2008-01-01
Essential for any mathematical model designed to simulate flow transient events caused by pump operations is the pump performance data. The performance of a centrifugal pump is characterized by four basic parameters: the rotational speed, the volumetric flow rate, the dynamic head, and the hydraulic torque. Any one of these quantities can be expressed as a function of any two others. The curves showing the relationships between these four variables are called the pump characteristic curves, also referred to as four-quadrant curves. The characteristic curves are empirically developed by the pump manufacturer and uniquely describe head and torque as functions of volumetric flow rate and rotational speed. Because it comprises a large number of points, the four-quadrant configuration is not suitable for computational purposes. However, it can be converted to a simpler form by the development of the homologous curves, in which dynamic head and hydraulic torque ratios are expressed as functions of volumetric flow and rotational speed ratios. The numerical use of the complete set of homologous curves requires specification of sixteen partial curves, eight for the dynamic head and eight for the hydraulic torque. As a consequence, the handling of homologous curves is still somewhat complicated. In solving flow transient problems that require the pump characteristic data for all the operation zones, the polar form appears as the simplest way to represent the homologous curves. In the polar method, the complete characteristics of a pump can be described by only two closed curves, one for the dynamic head and the other for the hydraulic torque, both as functions of a single angular coordinate defined adequately in terms of the quotient between the volumetric flow ratio and the rotational speed ratio. The usefulness and advantages of this alternative method are demonstrated through a practical example in which the homologous curves for a pump of the type used in the main coolant loops of a
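A polar mapping of this kind is usually done with a Suter-type transformation, which the following minimal sketch illustrates; the variable names, the rated-point normalization, and the exact form of the angle and denominators are standard-practice assumptions, not details taken from the paper:

```python
import math

def polar_point(alpha, nu, h, beta):
    """Map one pump operating point to polar homologous form.

    alpha: speed ratio N/N_rated, nu: flow ratio Q/Q_rated,
    h: head ratio H/H_rated, beta: torque ratio T/T_rated.
    Returns (theta, WH, WB): the single angular coordinate built from
    the flow/speed quotient, and the normalized head and torque.
    This follows the common Suter transformation (an assumption here)."""
    theta = math.atan2(nu, alpha)       # one angle covers all four quadrants
    denom = alpha**2 + nu**2            # removes the scale of the operating point
    return theta, h / denom, beta / denom
```

Sweeping `theta` from 0 to 2π traces the two closed curves (one for head, one for torque) that replace the sixteen partial homologous curves.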
PLOTTAB, Curve and Point Plotting with Error Bars
International Nuclear Information System (INIS)
1999-01-01
1 - Description of program or function: PLOTTAB is designed to plot any combination of continuous curves and/or discrete points (with associated error bars) using user-supplied titles and X and Y axis labels and units. If curves are plotted, the first curve may be used as a standard; the data and the ratio of the data to the standard will be plotted. 2 - Method of solution: PLOTTAB: The program has no knowledge of what data are being plotted, and yet by supplying titles, X and Y axis labels and units the user can produce any number of plots, with each plot containing almost any combination of curves and points and each plot properly identified. In order to define a continuous curve between tabulated points, this program must know how to interpolate between points. By input the user may specify either the default option of linear x versus linear y interpolation or, alternatively, log x and/or log y interpolation. In all cases, regardless of the interpolation specified, the program will always interpolate the data to the plane of the plot (linear or log x and y plane) in order to present the true variation of the data between tabulated points, based on the user-specified interpolation law. Tabulated points should be given at a sufficient number of x values to ensure that the difference between the specified interpolation and the 'true' variation of a curve between tabulated values is relatively small. 3 - Restrictions on the complexity of the problem: A combination of up to 30 curves and sets of discrete points may appear on each plot. If the user wishes to use this program to compare different sets of data, all of the data must be in the same units
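The interpolation laws described above (linear or logarithmic in each axis independently) can be sketched as a single helper; this is an illustrative reconstruction of the idea, not PLOTTAB's actual code:

```python
import math

def interp(x, x0, y0, x1, y1, logx=False, logy=False):
    """Interpolate between two tabulated points (x0, y0) and (x1, y1)
    under a chosen law: linear or log in x, linear or log in y.
    Log options require strictly positive coordinates on that axis."""
    # Transform to the working plane (log axes become linear there).
    t0, t1, t = ((math.log(x0), math.log(x1), math.log(x))
                 if logx else (x0, x1, x))
    u0, u1 = (math.log(y0), math.log(y1)) if logy else (y0, y1)
    # Straight-line interpolation in the working plane.
    u = u0 + (u1 - u0) * (t - t0) / (t1 - t0)
    # Transform back if the y axis was logarithmic.
    return math.exp(u) if logy else u
```

For example, under log-log interpolation a pair of points lying on a power law is reproduced exactly at intermediate x, which is why the choice of law matters when data are sparsely tabulated.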
Comparison and evaluation of mathematical lactation curve ...
African Journals Online (AJOL)
p2492989
and on the log of 305-d divided by day in lactation (linear and quadratic) were better than the Gamma function. A study of lactation curves in dairy cattle on farms in central Mexico showed that the Dijkstra function was superior to the Wood, Wilmink and Rook functions for describing the lactation curve (Val-Arreola et al.
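The Gamma function referred to above is Wood's (1967) incomplete gamma curve, y(t) = a·t^b·e^(−c·t), against which the other lactation models are typically benchmarked. A minimal sketch, with illustrative parameter values that are assumptions rather than fitted estimates from any study cited here:

```python
import math

def wood_yield(t, a=15.0, b=0.25, c=0.003):
    """Daily milk yield on day t of lactation under Wood's incomplete
    gamma function y = a * t**b * exp(-c * t). The parameters control
    initial level (a), rise to peak (b), and decline rate (c); the
    default values are illustrative only. Peak occurs at t = b / c."""
    return a * t**b * math.exp(-c * t)
```

Competing models such as Dijkstra or Wilmink replace this functional form but are fitted and compared on the same daily-yield data.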
Spectral Curves of Operators with Elliptic Coefficients
Directory of Open Access Journals (Sweden)
J. Chris Eilbeck
2007-03-01
A computer-algebra-aided method is carried out for determining geometric objects associated to differential operators that satisfy the elliptic ansatz. This results in examples of Lamé curves with double reduction and in the explicit reduction of the theta function of a Halphen curve.
Inverse Problem for a Curved Quantum Guide
Directory of Open Access Journals (Sweden)
Laure Cardoulis
2012-01-01
We consider the Dirichlet Laplacian −Δ on a curved quantum guide in ℝⁿ (n = 2, 3) with an asymptotically straight reference curve. We give uniqueness results for the inverse problem associated with the reconstruction of the curvature by using either observations of spectral data or a bootstrapping method.